Latest Discussions
Integrating External Authentication with Symmetric Encryption in SQL Server
Hello everyone, I am exploring a method to enhance data security in SQL Server by combining symmetric encryption with external authentication (e.g., via a service like Facebook using OAuth). The idea is to condition access to symmetric keys, or their usage, on successful authentication through an external provider. After reviewing resources like the official SQL Server documentation (Always Encrypted, Extensible Key Management) and whitepapers, I couldn't find any examples or mentions of such an integration. Here are some questions I'm pondering:
- Is this approach feasible within SQL Server or a similar system?
- Are there any documented scenarios or implementations that explore similar concepts?
- What challenges or technical limitations might arise from such an architecture?
I'd love to hear your thoughts, experiences, or insights on this idea. Thank you in advance for sharing your expertise! Best regards, Fred
Tech_PolyTech · Dec 18, 2024 · Copper Contributor · 20 Views · 0 likes · 0 Comments
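For illustration only, here is a rough sketch of how the gating idea is sometimes approximated: the application tier performs the OAuth handshake, then flags the session with sp_set_session_context, and a stored procedure refuses to open the symmetric key unless that flag is present. All object names (SalesKey, SalesCert, dbo.Customers, dbo.GetDecryptedCard) are hypothetical, and this does not by itself integrate SQL Server with an external identity provider.

-- Rough sketch only: the OAuth/Facebook validation happens in the application tier;
-- SQL Server merely checks a session flag the app sets after a successful login.
-- Key, certificate, table, and procedure names below are hypothetical.
CREATE PROCEDURE dbo.GetDecryptedCard
    @CustomerId int
AS
BEGIN
    -- Refuse to touch the key unless the app marked this session as externally authenticated.
    IF ISNULL(CONVERT(bit, SESSION_CONTEXT(N'external_auth_ok')), 0) = 0
    BEGIN
        THROW 50001, N'External authentication has not been confirmed for this session.', 1;
    END;

    OPEN SYMMETRIC KEY SalesKey DECRYPTION BY CERTIFICATE SalesCert;

    SELECT CustomerId,
           CONVERT(varchar(25), DECRYPTBYKEY(CardNumberEnc)) AS CardNumber
    FROM dbo.Customers
    WHERE CustomerId = @CustomerId;

    CLOSE SYMMETRIC KEY SalesKey;
END;
GO
-- After a successful OAuth exchange, the application would call:
-- EXEC sys.sp_set_session_context @key = N'external_auth_ok', @value = 1;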
Capturing Latest Effective Date of Current Instance of Position
I am having issues trying to formulate a JOIN statement within my query that will provide the values I need for all employees, specifically in situations where an employee held the position multiple times. Here is my current statement segment:
JOIN (SELECT R_POSITION, WORK_ASSIGNMNT, EMPLOYEE, MIN(EFFECT_DATE) AS EFFECT_DATE
      FROM HR_EMP_POSITIONS
      GROUP BY R_POSITION, WORK_ASSIGNMNT, EMPLOYEE) AS EP
  ON EP.R_POSITION = W.POSITION
 AND EP.EMPLOYEE = W.EMPLOYEE
 AND EP.WORK_ASSIGNMNT = W.WORK_ASSIGNMNT
For my statement, I need to retrieve the latest EFFECT_DATE of the last time the employee held a specific position that matches their current position. Please find my sample document with the EP and W tables. My current statement works for employee A; however, for employee B it returns the 10/16/2023 date when I need it to populate the 8/26/2024 date. Any ideas on what I can do to get the data that I need? I don't have any other fields that I can use to help refine the criteria.
TCatron18 · Dec 18, 2024 · Copper Contributor · 53 Views · 0 likes · 3 Comments
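One direction worth trying, offered only as a sketch and assuming the goal is the most recent EFFECT_DATE per employee/position rather than the earliest: rank the rows by EFFECT_DATE descending and join only the top-ranked row. The table name WORK_ASSIGNMENTS stands in for whatever table W really is.

-- Sketch only: replace MIN() with a "latest row per employee/position" subquery.
-- WORK_ASSIGNMENTS is a placeholder for the real W table.
;WITH LatestPos AS (
    SELECT R_POSITION, WORK_ASSIGNMNT, EMPLOYEE, EFFECT_DATE,
           ROW_NUMBER() OVER (PARTITION BY R_POSITION, WORK_ASSIGNMNT, EMPLOYEE
                              ORDER BY EFFECT_DATE DESC) AS rn
    FROM HR_EMP_POSITIONS
)
SELECT W.EMPLOYEE, W.POSITION, W.WORK_ASSIGNMNT, EP.EFFECT_DATE
FROM WORK_ASSIGNMENTS AS W
JOIN LatestPos AS EP
  ON EP.R_POSITION = W.POSITION
 AND EP.EMPLOYEE = W.EMPLOYEE
 AND EP.WORK_ASSIGNMNT = W.WORK_ASSIGNMNT
 AND EP.rn = 1;        -- only the most recent EFFECT_DATE per employee/position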
Evaluation expired SQL Server
Hello, we are dealing with an issue where the evaluation version has expired. We will be upgrading it, but I don't have access to billing. In the interim, am I able to run it as Developer edition? What are the differences in running it as Developer when previously it was Evaluation? Thanks.
ShaunMc · Dec 13, 2024 · Copper Contributor · 25 Views · 0 likes · 1 Comment
Delete Statement slowness
Hello, we run a delete statement on the database every weekend that removes rows based on 16 WHERE conditions; each row has to pass all 16 conditions before being deleted, and more than a million rows are removed. Since a delete statement is itself a resource-consuming activity and takes time on that many rows, could this many WHERE conditions be causing the slowness? Our DB environment is as follows:
- SQL Server 2022 Enterprise Edition with the latest patch
- TempDB files added considering the number of vCPUs on the VM; TempDB is on a separate disk
- NDF files of secondary filegroups on a separate disk for better throughput
- MaxDOP is set to the default
- Data and log files are on separate disks
- A weekly DB maintenance plan is in place
Regards, Nitin
nitinshete1975 · Dec 12, 2024 · Copper Contributor · 43 Views · 0 likes · 2 Comments
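A common mitigation for large weekend purges, offered here only as a rough sketch, is to delete in small batches so each transaction stays short and log growth and lock escalation are contained. The table name dbo.BigTable and the sample predicates are placeholders for the real table and its 16 conditions.

-- Sketch only: batch the delete; dbo.BigTable and the WHERE clause are placeholders.
DECLARE @deleted int = 1;
WHILE @deleted > 0
BEGIN
    DELETE TOP (50000)
    FROM dbo.BigTable
    WHERE ArchiveFlag = 1                                  -- placeholder for the real
      AND CreatedDate < DATEADD(YEAR, -1, GETDATE());      -- 16 filter conditions
    SET @deleted = @@ROWCOUNT;
END;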
Import from Excel changes Header Names
I have a wide table in Excel with column names like Q1.1, Q1.2, etc. However, when I import this table using SSMS, the column names are renamed to Q1#1#, Q1#2#, and so on. This renaming is visible when you look at Column Mapping before actually importing. Is there a way to prevent this renaming? If not, can I write some code to rename the columns afterward? I have 96 columns in this dataset, which I need to upload monthly. This issue is becoming a nuisance because I want SQL to clean up this dataset before pivoting it and appending it to my final table. Currently, I manually clean up the data in Excel, pivot it, and then upload a tall file with only 4 columns. This method avoids the renaming issue because all the question numbers become values in a column rather than column names.
Solved · Martin Lucas · Dec 12, 2024 · Copper Contributor · 42 Views · 0 likes · 2 Comments
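One way the after-the-fact rename is sometimes scripted, purely as a sketch: read the mangled names from sys.columns, strip the trailing '#', turn the remaining '#' back into '.', and generate sp_rename calls. The staging table name dbo.SurveyImport is a placeholder.

-- Sketch only: generate sp_rename statements for columns such as Q1#1# -> Q1.1.
-- dbo.SurveyImport is a placeholder for the imported staging table.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + N'EXEC sp_rename ''dbo.SurveyImport.' + QUOTENAME(c.name) + N''', '''
    + REPLACE(CASE WHEN c.name LIKE N'%#'
                   THEN LEFT(c.name, LEN(c.name) - 1)   -- drop the trailing #
                   ELSE c.name END, N'#', N'.')         -- remaining # characters were dots
    + N''', ''COLUMN'';' + CHAR(13) + CHAR(10)
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID(N'dbo.SurveyImport')
  AND c.name LIKE N'%#%';

PRINT @sql;                     -- inspect the generated statements first
-- EXEC sys.sp_executesql @sql; -- then run them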
What is the best practice to create documentation for our SQL Server
I know that we have to document all our risk assessments and our backup and recovery strategy, but for a Database Administrator handing tasks over to another Database Admin, is there any best practice for what to document? I am also aware of third-party tools that help with documentation, but based on your experience, what are the important things to document? Thank you
elly_wati · Dec 11, 2024 · Copper Contributor · 37 Views · 0 likes · 1 Comment
SQL Server Collation
Hi Experts, I am doing a cross-database migration. The source database character set is ISO_1 (ISO 8859-1) with binary sorting. While checking for an equivalent collation in MS SQL, I found Latin1_General_BIN. Later, I found that all four of these collations are equivalent to the source database character set and sorting:
- Latin1_General_BIN
- Latin1_General_BIN2
- Latin1_General_100_BIN2
- Latin1_General_100_BIN2_UTF8
Could you share your expert advice on the difference between the above collations and which one is best?
sljesra27 · Dec 11, 2024 · Copper Contributor · 10 Views · 0 likes · 0 Comments
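When weighing candidates like these, one quick, low-stakes check (shown below only as a sketch) is to sort a handful of representative strings under each candidate collation and compare the order with what the source system produces; the sample values here are arbitrary.

-- Sketch only: see how a candidate binary collation orders sample values.
-- Swap the COLLATE clause for each candidate and compare against the source system.
SELECT v.s
FROM (VALUES (N'Strasse'), (N'Straße'), (N'eclair'), (N'Éclair'), (N'Zebra')) AS v(s)
ORDER BY v.s COLLATE Latin1_General_100_BIN2;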
What are the best practices for managing and optimizing a SQL table that contains 10 million records
I have a table that currently holds 10 million records, and it continues to grow daily. As the dataset expands, I'm encountering significant challenges in retrieving data efficiently and generating reports: the increasing volume is causing performance bottlenecks, slow query execution times, and delays in report generation. To address these issues, I need strategies for optimizing data retrieval and managing the growing dataset effectively, so that the system remains responsive and can handle the increasing data load.
Vallikannu · Dec 09, 2024 · Copper Contributor · 72 Views · 0 likes · 2 Comments
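Typical starting points are indexes that cover the report filters, page compression, and (for append-only history) partitioning by date. The snippet below is only a sketch with hypothetical names (dbo.Orders, OrderDate, CustomerId, Status, TotalAmount).

-- Sketch only: a covering index for a typical date-range report, with page compression.
-- Table and column names are hypothetical.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
ON dbo.Orders (OrderDate, CustomerId)
INCLUDE (Status, TotalAmount)
WITH (DATA_COMPRESSION = PAGE);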
Help in Query to calculate the % from last two timelines
I want to compare the FAmt from the last two lines, and if the difference is more than 80% then the row should be in the result. The group/index keys are A, B, C, D, Edate.
Create table #AR (A char(2), B char(10), C char(10), D char(10), Edate datetime, FAmt decimal (6,2), G char(2))
Insert into #AR values ('AA','CCC','1','E', '2022-01-01',12.60,'2')
Insert into #AR values ('AA','CCC','1','E', '2023-01-01',15.80,'2')
Insert into #AR values ('AA','CCC','1','E', '2024-01-01',18.60,'2')
Insert into #AR values ('BB','DCC','1','E', '2022-01-01',11.40,'2')
Insert into #AR values ('BB','DCC','1','E', '2024-01-01',15.60,'2')
Insert into #AR values ('CC','DCC','1','E', '2021-01-01',12.60,'2')
Insert into #AR values ('CC','GCC','2','E', '2022-01-01',15.60,'2')
Insert into #AR values ('CC','GCC','2','E', '2023-04-01',18.60,'2')
Insert into #AR values ('CC','GCC','2','E', '2024-04-01',34.80,'2') -- Note: this FAmt is more than 80% above the last timeline and should be in the expected result
Expected result:
A   B    C  D  Edate       FAmt   G  Comments
CC  GCC  2  E  2024-04-01  34.80  2  Current Amount is > 80% from last timeline
Solved · Kenny_Gua · Dec 06, 2024 · Copper Contributor · 59 Views · 0 likes · 6 Comments
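One way this kind of comparison is often tackled, offered only as a sketch: use LAG() to pull the previous FAmt within each A, B, C, D group ordered by Edate, and keep the latest row when it exceeds the prior value by more than 80% (that reading of "more than 80%" is assumed). Against the sample data above, this returns only the CC/GCC/2 row expected.

-- Sketch only: compare the most recent FAmt per group with the one before it.
;WITH x AS (
    SELECT A, B, C, D, Edate, FAmt, G,
           LAG(FAmt)    OVER (PARTITION BY A, B, C, D ORDER BY Edate)      AS PrevFAmt,
           ROW_NUMBER() OVER (PARTITION BY A, B, C, D ORDER BY Edate DESC) AS rn
    FROM #AR
)
SELECT A, B, C, D, Edate, FAmt, G,
       'Current Amount is > 80% from last timeline' AS Comments
FROM x
WHERE rn = 1                               -- only the latest line per group
  AND PrevFAmt IS NOT NULL                 -- needs at least two lines in the group
  AND (FAmt - PrevFAmt) / PrevFAmt > 0.80;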
Tags
- Data Warehouse (64 topics)
- Integration Services (58 topics)
- sql server (43 topics)
- Reporting Services (41 topics)
- SQL (36 topics)
- Business Intelligence (35 topics)
- Analysis Services (29 topics)
- Business Apps (22 topics)
- Analytics (18 topics)
- Big Data (11 topics)