Recent Discussions
Integrating External Authentication with Symmetric Encryption in SQL Server
Hello everyone, I am exploring a way to enhance data security in SQL Server by combining symmetric encryption with external authentication (for example, OAuth through a provider such as Facebook). The idea is to make access to symmetric keys, or their use, conditional on successful authentication through the external provider. After reviewing resources such as the official SQL Server documentation (Always Encrypted, Extensible Key Management) and whitepapers, I couldn't find any examples or mentions of such an integration. Here are some questions I'm pondering: Is this approach feasible within SQL Server or a similar system? Are there any documented scenarios or implementations that explore similar concepts? What challenges or technical limitations might arise from such an architecture? I'd love to hear your thoughts, experiences, or insights on this idea. Thank you in advance for sharing your expertise! Best regards, Fred
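One conceivable pattern for the question above — not something documented by Microsoft, and with hypothetical object names throughout (AppSymKey, AppCert, dbo.Secrets, the oauth_verified key) — is to validate the OAuth token in the application tier and then gate OPEN SYMMETRIC KEY on a session flag:

```sql
-- Hypothetical sketch: the app validates the OAuth token itself, then marks the
-- session as externally authenticated before calling the decrypt procedure.
-- SQL Server cannot validate a Facebook/OAuth token; that must happen in the app tier.
CREATE PROCEDURE dbo.usp_GetDecryptedSecret
AS
BEGIN
    -- Refuse to open the key unless the app flagged this session as authenticated.
    IF ISNULL(CAST(SESSION_CONTEXT(N'oauth_verified') AS bit), 0) = 0
        THROW 50001, 'External authentication required.', 1;

    OPEN SYMMETRIC KEY AppSymKey DECRYPTION BY CERTIFICATE AppCert;

    SELECT CONVERT(nvarchar(200), DecryptByKey(SecretValue)) AS SecretValue
    FROM dbo.Secrets;

    CLOSE SYMMETRIC KEY AppSymKey;
END;

-- App tier, after a successful OAuth flow:
-- EXEC sys.sp_set_session_context @key = N'oauth_verified', @value = 1, @read_only = 1;
```

Note the obvious limitation: anything that can set the session-context flag can bypass the gate, so this raises the bar rather than providing real key custody; true external key control would point toward EKM or an application-side key vault.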
Cleanup of auto created statistics
Hello everyone, during an annual database checkup I found a lot of auto-created statistics that weren't being used. It seems the DBMS doesn't clean up unused auto-created statistics on its own. I guess I can drop them with the following query: DROP STATISTICS Schema.TableName._WA_Sys_XXXXXXXXXXXXXXXXXXXX But I am still wondering whether that is safe, and what the impact on performance would be. Has anybody had any experience with that?
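For the statistics-cleanup question above, a sketch that generates (rather than executes) the DROP statements. With AUTO_CREATE_STATISTICS left ON, the optimizer simply recreates any statistic it still needs, so the risk is generally limited to a brief recreation cost on the next relevant query:

```sql
-- Sketch: generate DROP statements for auto-created (_WA_Sys_*) statistics.
-- Review the output before running it; the engine recreates needed statistics
-- automatically while AUTO_CREATE_STATISTICS is ON.
SELECT 'DROP STATISTICS '
       + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.'
       + QUOTENAME(t.name) + '.'
       + QUOTENAME(s.name) AS drop_stmt
FROM sys.stats AS s
JOIN sys.tables AS t
    ON t.object_id = s.object_id
WHERE s.auto_created = 1;   -- only auto-created statistics, never index statistics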
Evaluation expired SQL Server
Hello, we are dealing with an issue where the Evaluation edition has expired. We will be upgrading it, but I don't have access to billing; in the interim, am I able to run it as Developer edition? What are the differences in running it as Developer when it was previously Evaluation? Thanks.
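For the expired-Evaluation question above: the edition change itself goes through setup's Edition Upgrade path (Maintenance > Edition Upgrade on the installation media), and before and after it is worth confirming what the instance reports. A minimal check:

```sql
-- Minimal check: which edition/version does the instance currently report?
SELECT SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('ProductLevel')   AS product_level;
```

Functionally, Developer edition matches Enterprise; the key difference is licensing — it is restricted to development and test use, so running production workloads on it is a licensing question as much as a technical one.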
Capturing Latest Effective Date of Current Instance of Position
I am having issues formulating a JOIN within my query that returns the values I need for all employees, specifically in situations where an employee held the position multiple times. Here is my current statement segment:
JOIN (SELECT R_POSITION, WORK_ASSIGNMNT, EMPLOYEE, MIN(EFFECT_DATE) AS EFFECT_DATE
      FROM HR_EMP_POSITIONS
      GROUP BY R_POSITION, WORK_ASSIGNMNT, EMPLOYEE) AS EP
  ON EP.R_POSITION = W.POSITION AND EP.EMPLOYEE = W.EMPLOYEE AND EP.WORK_ASSIGNMNT = W.WORK_ASSIGNMNT
I need to retrieve the latest EFFECT_DATE of the last time the employee held the specific position that matches their current position. Please find my sample document with the EP and W tables. My current statement works for employee A; however, for employee B it returns the 10/16/2023 date when I need it to return the 8/26/2024 date. Any ideas on how to get the data I need? I don't have any other fields I can use to refine the criteria.
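For the effective-date question above, a hedged sketch (assuming, as in the sample, that each row's EFFECT_DATE marks a time the employee held the position): replacing the MIN() aggregate with a ROW_NUMBER() that keeps only the most recent row per employee/position would return the 8/26/2024 row for employee B:

```sql
-- Sketch: pick the most recent EFFECT_DATE per position/assignment/employee
-- (table and column names taken from the post).
JOIN (
    SELECT R_POSITION, WORK_ASSIGNMNT, EMPLOYEE, EFFECT_DATE,
           ROW_NUMBER() OVER (PARTITION BY R_POSITION, WORK_ASSIGNMNT, EMPLOYEE
                              ORDER BY EFFECT_DATE DESC) AS rn
    FROM HR_EMP_POSITIONS
) AS EP
  ON  EP.R_POSITION    = W.POSITION
  AND EP.EMPLOYEE      = W.EMPLOYEE
  AND EP.WORK_ASSIGNMNT = W.WORK_ASSIGNMNT
  AND EP.rn = 1   -- the latest time the employee held this position
```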
Need help SQL query performance issue
I am looking for help optimizing the SELECT statement below. I have created supporting indexes and updated statistics, yet the same query takes 2 seconds in one database and 95 seconds in another on the same server (same table structure and same data in both). I compared the execution plans of the 2-second and 95-second runs and they match exactly. Any clue why it runs fast in one database but takes so long in the other? The tables are not heavily fragmented, stats are up to date in both databases, and the tables contain 200 million rows. Another important note: indexes are not playing any role here; with or without indexes the query takes 95 seconds.
declare @v_vch_cut_off_date date = cast(getdate()-10 as date)
Select ztr.work_day, tlms.work_day,
       ztr.from_zone, zl.travel_zone_name,
       ztr.from_business_unit, tlms.business_unit,
       ztr.from_process, tlms.process,
       ztr.from_item_category, tlms.item_category
FROM dbo.t_zone_travel_report (nolock) ztr
LEFT JOIN dbo.t_lms_process_time (NOLOCK) tlms
  ON  ztr.wh_id = tlms.wh_id
  AND ztr.employee_id = tlms.employee_id
  AND ztr.start_tran_datetime >= tlms.start_tran_datetime
  AND ztr.start_tran_datetime < tlms.next_tran_startdatetime
  -- AND ztr.log_id >= tlms.log_id
  AND (ztr.log_id < tlms.next_log_id OR tlms.next_log_id = 999999999)
  AND tlms.work_day >= @v_vch_cut_off_date
LEFT JOIN dbo.t_travel_zones_locations (NOLOCK) zl
  ON ztr.wh_id = zl.wh_id AND ztr.from_location = zl.location_id
WHERE (ztr.work_day >= @v_vch_cut_off_date OR ISNULL(ztr.work_day,'') = '')
  AND ( ISNULL(ztr.from_zone,'') <> ISNULL(zl.travel_zone_name,'')
     OR ISNULL(ztr.from_business_unit,'') <> ISNULL(tlms.business_unit,'')
     OR ISNULL(ztr.from_process,'') <> ISNULL(tlms.process,'') );
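For the 2s-vs-95s question above: when the plans match, the difference usually shows up in what each execution actually did (reads, waits, CPU) rather than in the plan shape. A sketch of where to look (the session id is a placeholder):

```sql
-- Sketch: compare runtime behavior, not plan shape.
-- 1) Run the query in each database with statistics on and compare reads/CPU:
SET STATISTICS IO, TIME ON;

-- 2) While the slow copy runs, check what that session is waiting on:
SELECT r.session_id, r.status, r.wait_type, r.wait_time,
       r.logical_reads, r.cpu_time
FROM sys.dm_exec_requests AS r
WHERE r.session_id = <slow_session_id>;   -- placeholder: the slow query's SPID
```

Large gaps with identical plans often come down to waits (PAGEIOLATCH on cold data, blocking, memory-grant waits) or a different database compatibility level / scoped configuration, so comparing those two databases' settings is also worth a look.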
SQL Server Collation
Hi Experts, I am doing a cross-database migration. The source database character set is ISO_1 (ISO 8859-1) with binary sorting. While checking for an equivalent collation in MS SQL, I found Latin1_General_BIN. Later, I found that all four of the following collations are equivalent to the source character set and sorting: Latin1_General_BIN, Latin1_General_BIN2, Latin1_General_100_BIN2, Latin1_General_100_BIN2_UTF8. Could you share your expert advice on the differences between these collations and which one is best?
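On the collation question above: the BIN collations compare the first character by code point and the rest byte-by-byte, while BIN2 uses pure code-point order throughout and is the generally recommended binary choice; the _100 versions carry newer sorting/character data, and the _UTF8 variant (SQL Server 2019+) additionally stores varchar/char data as UTF-8. A quick way to see ordering differences between two candidates:

```sql
-- Sketch: compare how two binary collations order the same values.
SELECT v
FROM (VALUES (N'a'), (N'A'), (N'é'), (N'z')) AS t(v)
ORDER BY v COLLATE Latin1_General_BIN2;
-- Re-run with COLLATE Latin1_General_BIN and compare the output order.
```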
What is the best practice to create documentation for our SQL Server
I know that we have to document all our risk assessments and our backup and recovery strategy, but for a Database Administrator handing over to another DBA, is there a best practice for what to document? I am also aware of third-party tools that help with documentation, but based on your experience, what are the important things to document? Thank you
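For the handover-documentation question above, a sketch of the kind of instance inventory worth scripting into the handover pack (alongside backup/restore test results, job ownership, linked servers, and security notes):

```sql
-- Sketch: capture instance-level facts that are easy to forget in a handover.
SELECT SERVERPROPERTY('ServerName')     AS server_name,
       SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS version;

-- Per-database basics: recovery model, compat level, state.
SELECT name, recovery_model_desc, compatibility_level, state_desc
FROM sys.databases
ORDER BY name;
```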
Delete Statement slowness
Hello, we run a delete statement on the database every weekend that deletes rows based on 16 WHERE conditions; a row has to pass all 16 conditions before being deleted, and more than a million rows are removed. Since a delete is itself a resource-consuming activity and deleting over a million rows takes time, could this many WHERE conditions be causing the slowness? Our DB environment:
SQL Server 2022 Enterprise with the latest patch
TempDB files added according to the number of vCPUs on the VM; TempDB is on a separate disk
NDF files of secondary filegroups on separate disks for better throughput
MAXDOP set to default
Data and log files on separate disks
A weekly DB maintenance plan is in place
regards, Nitin
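For the delete-slowness question above, a commonly used sketch: the 16 conditions mostly affect how rows are located; the bigger cost is usually deleting a million rows in one transaction, so batching tends to help (dbo.BigTable and the predicate are placeholders for your table and conditions):

```sql
-- Sketch: delete in batches to keep log growth and lock escalation manageable.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000) FROM dbo.BigTable
    WHERE /* your 16 conditions here */ 1 = 1;

    SET @rows = @@ROWCOUNT;   -- loop until no qualifying rows remain
END;
```

An index that covers the most selective of the 16 conditions also matters, since each batch otherwise rescans the table to find its victims.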
CDC - change data capture
Hi, when trying to run ALTER TABLE on a CDC-enabled table I get the error below. I have tried disabling CDC and re-enabling it; still nothing works. Can anyone assist?
Cannot alter the 'Table' because it is being used for Change Data Capture.
Error: 8277, Severity: -1, State: 0. (Params:). The error is printed in terse mode because there was an error during formatting. Tracing, ETW, notifications etc are skipped.
Regards
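For the CDC question above, a sketch of fully disabling capture on the table (not just stopping the capture jobs) before the ALTER, then re-enabling it; the schema/table names are placeholders:

```sql
-- Sketch: drop every capture instance on the table, alter it, then re-enable CDC.
EXEC sys.sp_cdc_disable_table
     @source_schema    = N'dbo',
     @source_name      = N'MyTable',
     @capture_instance = N'all';   -- removes all capture instances for this table

-- Run the ALTER TABLE here, then re-enable capture:
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'MyTable',
     @role_name     = NULL;
```

Note that disabling a capture instance discards its accumulated change data, so drain any downstream consumers first.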
Catch Error in a SQL Server Logon Trigger
I have written a logon trigger in SQL Server to control logon authorization.
ALTER TRIGGER [LOG_TRG_01] ON ALL SERVER WITH EXECUTE AS 'SA' FOR LOGON
AS
BEGIN
    if ORIGINAL_LOGIN() = 'sa'
    begin
        return;
    end;
    if ORIGINAL_LOGIN() = 'OMEGACAEVDEV1'
    begin
        -- not allowed to logon
        rollback;
    end;
    -- Insert Trail
    EXEC [OmegaCoreAudit].[OMEGACA].[P_ACC_UNF_TRAIL] 0, 'trail_details', 'ip', 'server', 'db', 0;
END
GO
It does the following (as expected): OMEGACAEVDEV1 is not allowed to log on, OMEGACAEVDEV2 is allowed to log on, and an audit event is inserted by proc P_ACC_UNF_TRAIL into a table for both users. All three of these I want to stay this way! But I need error handling in it, so that in case of whatever error the login is allowed — while keeping a record in another table named SYS_ERROR_LOG (with error details). In this trigger I have intentionally provoked an error with "select 10/0". The new trigger is:
ALTER TRIGGER [LOG_TRG_02] ON ALL SERVER WITH EXECUTE AS 'SA' FOR LOGON
AS
BEGIN
    BEGIN TRY
        if ORIGINAL_LOGIN() = 'sa'
        begin
            return;
        end;
        --provoke error
        select 10/0;
        if ORIGINAL_LOGIN() = 'OMEGACAEVDEV1'
        begin
            -- not allowed to logon
            rollback;
        end;
        -- Insert Trail
        EXEC [OmegaCoreAudit].[OMEGACA].[P_ACC_UNF_TRAIL] 0, 'trail_details', 'ip', 'server', 'db', 0;
    END TRY
    BEGIN CATCH
        Insert into OmegaCoreAudit.OMEGACA.SYS_ERROR_LOG
            ([LOGIN_USER_NAME], [DB_USER_NAME], [USERHOST], [IP_ADDRESS], [OS_USER], [BG_JOB_ID], [ERROR_DATA])
        values
            ('LOGIN_NAME', 'USER_NAME', 'USER_HOST', 'IP', NULL, NULL, 'ERROR_MESSAGE');
    END CATCH
END
GO
In the above code, "if ORIGINAL_LOGIN() = 'OMEGACAEVDEV1'" represents a simplified version of a wider authorization process. Problem: the CATCH is not working. Both users are not allowed to log on ("Logon failed for login '[user]' due to trigger execution"), and no record is written to table SYS_ERROR_LOG — I was expecting one for each user. What can I do to fix this problem?
best regards, Altin
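A likely explanation for the logon-trigger question above: an error raised inside a trigger dooms the surrounding transaction (XACT_STATE() returns -1), so the INSERT in the CATCH block itself fails with error 3930, the trigger errors out, and the login is refused for everyone. A sketch of a CATCH block worth testing:

```sql
-- Sketch: clear the doomed transaction before logging, so the INSERT can commit
-- and the trigger can finish instead of failing on error 3930.
BEGIN CATCH
    IF XACT_STATE() = -1
        ROLLBACK;   -- the divide-by-zero doomed the implicit transaction

    INSERT INTO OmegaCoreAudit.OMEGACA.SYS_ERROR_LOG
        (LOGIN_USER_NAME, DB_USER_NAME, USERHOST, IP_ADDRESS, OS_USER, BG_JOB_ID, ERROR_DATA)
    VALUES
        (ORIGINAL_LOGIN(), USER_NAME(), HOST_NAME(), NULL, NULL, NULL, ERROR_MESSAGE());
END CATCH
```

One caveat to verify in your environment: rolling back inside a logon trigger is also the mechanism that denies a login, so after a ROLLBACK the logon event itself may still be refused. If it is, true "fail open on error" behavior may require keeping the risky work out of the trigger's transaction entirely.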
What are the best practices for managing and optimizing a SQL table that contains 10 million records
I have a table that currently holds 10 million records, and it continues to grow daily. As the dataset expands, I am encountering significant challenges retrieving data efficiently and generating reports: the growing volume is causing performance bottlenecks, slow query execution times, and delays in report generation. I need strategies for optimizing data retrieval and managing the growing dataset effectively, so the system remains responsive and can handle the increasing data load.
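For the 10-million-row question above, a sketch of two common levers — a covering index for the hot report query, and date-based partitioning for manageability. All object names here are placeholders:

```sql
-- Sketch 1: a covering index so the report query avoids key lookups.
CREATE NONCLUSTERED INDEX IX_Orders_ReportDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerId, TotalAmount);

-- Sketch 2: monthly partitioning keeps maintenance and purges per-slice.
CREATE PARTITION FUNCTION pf_Monthly (date)
    AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');
CREATE PARTITION SCHEME ps_Monthly
    AS PARTITION pf_Monthly ALL TO ([PRIMARY]);
```

Which lever matters more depends on the workload: indexing targets query speed directly, while partitioning mostly helps maintenance and sliding-window archiving rather than raw query performance.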
sql query to check status change of an item
I have a table dbo.tblPresentationStatus (SQL script at the bottom). I have to select rows where the status change is wrong. The flow should be: Review --> Approve --> Presentation --> Close. From 'Review' the status for a specific presentation ID can change to 'Approve', 'Presentation' or 'Close'. From 'Approve' it can change to 'Presentation' or 'Close'. From 'Presentation' it can change only to 'Close'. I want to write a query that returns presentations with a wrong status flow, so the expected output for the records in the script below is:
PrID | Status1 | Status2 | Status3 | Status4
103 | REVIEW | PRESENTATION | APPROVE | CLOSE
101 | APPROVE | REVIEW | NULL | NULL
-----------------------------------sql script-------------------------------------------
DROP TABLE IF EXISTS #tblPresentation
CREATE TABLE #tblPresentation
(
PrID int NOT NULL,
PrName nvarchar(100) NULL
)
INSERT INTO #tblPresentation VALUES
(100,'PrA'),
(101,'PrB'),
(102,'PrC'),
(103,'PrD')
DROP TABLE IF EXISTS #tblPresentationStatus
CREATE TABLE #tblPresentationStatus
(
StatusID int NOT NULL,
PrID int NOT NULL,
PrStatus nvarchar(100) NOT NULL,
StatusDate datetime NOT NULL
)
INSERT INTO #tblPresentationStatus VALUES
--PRESENTATION ID 100
(1,100,'REVIEW','2024-01-01 00:00:00.00'),
(2,100,'APPROVE','2024-01-02 00:00:00.00'),
(3,100,'PRESENTATION','2024-01-03 07:00:00.00'),
(4,100,'CLOSE','2024-01-03 10:00:00.00'),
--PRESENTATION ID 101
(5,101,'APPROVE','2024-01-01 00:00:00.00'),
(6,101,'REVIEW','2024-01-03 10:00:00.00'), --wrong status change: from 'APPROVE' back to 'REVIEW' is not allowed
--PRESENTATION ID 102
(7,102,'REVIEW','2024-01-01 00:00:00.00'),
(8,102,'PRESENTATION','2024-01-02 00:00:00.00'),
(9,102,'CLOSE','2024-01-03 10:00:00.00'),
--PRESENTATION ID 103
(10,103,'REVIEW','2024-01-01 00:00:00.00'),
(11,103,'PRESENTATION','2024-01-02 00:00:00.00'),
(12,103,'APPROVE','2024-01-03 00:00:00.00'), --wrong status change: from 'PRESENTATION' back to 'APPROVE' is not allowed
(13,103,'CLOSE','2024-01-04 00:00:00.00')
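For the status-flow question above, a sketch that maps each status to a rank and flags any presentation whose rank ever decreases over time:

```sql
-- Sketch: a backward transition is any step whose rank is lower than the
-- previous step's rank for the same presentation, ordered by StatusDate.
WITH ranked AS (
    SELECT PrID, PrStatus, StatusDate,
           CASE PrStatus WHEN 'REVIEW' THEN 1 WHEN 'APPROVE' THEN 2
                         WHEN 'PRESENTATION' THEN 3 WHEN 'CLOSE' THEN 4 END AS step,
           LAG(CASE PrStatus WHEN 'REVIEW' THEN 1 WHEN 'APPROVE' THEN 2
                             WHEN 'PRESENTATION' THEN 3 WHEN 'CLOSE' THEN 4 END)
               OVER (PARTITION BY PrID ORDER BY StatusDate) AS prev_step
    FROM #tblPresentationStatus
)
SELECT DISTINCT PrID   -- presentations containing at least one backward move
FROM ranked
WHERE step < prev_step;
```

Against the sample data this should flag PrIDs 101 and 103; pivoting their statuses into the Status1..Status4 columns of the expected output can then be done with conditional aggregation over ROW_NUMBER().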
Sticky Notes Account Login and Sync Issue Report
I have a habit of using Sticky Notes as a daily memo tool, and everything functioned normally until recently. On the morning of Friday, November 29th, I suddenly discovered that my sticky notes were unable to sync. After testing by logging out and back in, I encountered the message: "Your account does not have an email address, so your sticky notes cannot be saved to the cloud." Since then, I have been unable to log in. That day, I contacted Microsoft Support. An engineer conducted remote testing and found that it was not a system issue, as other accounts could log in successfully. However, the account I normally use, email address removed for privacy reasons, could not log in to Sticky Notes. They advised me to contact customer service for assistance from the account department. The next day, Saturday, I contacted the account department. They asked me to log in to Microsoft with my regular account, which I could do without issue; however, I still could not log in to Sticky Notes, leaving the account department uncertain about the cause of the problem. They then transferred me to the Office department, but they were also unable to resolve the issue.
Issue with importing json file
I am using the following script to read data from a JSON file and import it into SQL Server. The data import works without any issue. However, I noticed that when the JSON file contains special characters (e.g. è), the import replaces them with è. I have already saved the JSON file as UTF-8. Is there any way to import the actual character?
drop table if exists #Test3
CREATE TABLE #Test3
(
EMP_COMMON_FULL_NAME NVARCHAR(500),
EMP_COMMON_PRIMARY_JOB NVARCHAR(500),
PEOPLE_EMAIL NVARCHAR(500),
EMP_COMMON_PRIMARY_ORG NVARCHAR(500),
PEOPLE_BADGE_NUMBER NVARCHAR(500),
PEOPLE_EXPECTED_WEEKLY_HOURS NVARCHAR(500),
PEOPLE_EXPECTED_DAILY_HOURS NVARCHAR(500)
);
DECLARE @json NVARCHAR(MAX);
SELECT @json = BulkColumn
FROM OPENROWSET(BULK 'F:\PowerBI\json\file2.json', SINGLE_CLOB, CODEPAGE = '65001') AS jsonFile;
INSERT INTO #Test3
(
EMP_COMMON_FULL_NAME, EMP_COMMON_PRIMARY_JOB, PEOPLE_EMAIL, EMP_COMMON_PRIMARY_ORG,
PEOPLE_BADGE_NUMBER, PEOPLE_EXPECTED_WEEKLY_HOURS, PEOPLE_EXPECTED_DAILY_HOURS
)
SELECT
JSONData.EMP_COMMON_FULL_NAME,
JSONData.EMP_COMMON_PRIMARY_JOB,
JSONData.PEOPLE_EMAIL,
JSONData.EMP_COMMON_PRIMARY_ORG,
JSONData.PEOPLE_BADGE_NUMBER,
JSONData.PEOPLE_EXPECTED_WEEKLY_HOURS,
JSONData.PEOPLE_EXPECTED_DAILY_HOURS
FROM OPENJSON(@json, '$.data.children')
WITH (
EMP_COMMON_FULL_NAME NVARCHAR(500) '$.attributes[0].value',
EMP_COMMON_PRIMARY_JOB NVARCHAR(500) '$.attributes[1].value',
PEOPLE_EMAIL NVARCHAR(500) '$.attributes[2].value',
EMP_COMMON_PRIMARY_ORG NVARCHAR(500) '$.attributes[3].value',
PEOPLE_BADGE_NUMBER NVARCHAR(500) '$.attributes[4].value',
PEOPLE_EXPECTED_WEEKLY_HOURS NVARCHAR(500) '$.attributes[5].value',
PEOPLE_EXPECTED_DAILY_HOURS NVARCHAR(500) '$.attributes[6].value'
) AS JSONData;
SELECT * FROM #Test3;
{ "metadata": { "numNodes": "500", "metadataKey": "fewfewf-sdvrv-evfewsv", "cacheKey": "fewfewf-sdvrv-evfewsv", "currencyCode": "NONE", "totalNodes": "500", "totalElements": "500" }, "data": { "key": { "ROOT": "-1" }, "coreEntityKey": {}, "attributes": [], "children": [ { "key": {
"PEOPLE": "67648" }, "coreEntityKey": { "EMP": { "id": "11111" } }, "attributes": [ { "key": "EMP_COMMON_FULL_NAME", "alias": "Name", "rawValue": "Sam, Rogers", "value": "Sam, Rogers" }, { "key": "EMP_COMMON_PRIMARY_JOB", "alias": "Primary Job", "rawValue": "Accountant", "value": "Accountant" }, { "key": "PEOPLE_EMAIL", "alias": "Email Address", "rawValue": "email address removed for privacy reasons", "value": "email address removed for privacy reasons" }, { "key": "EMP_COMMON_PRIMARY_ORG", "alias": "Location", "rawValue": "1ère Inc/1ère Inc", "value": "1ère Inc/1ère Inc" }, { "key": "PEOPLE_BADGE_NUMBER", "alias": "Active Badge Number", "rawValue": "1234", "value": "1234" }, { "key": "PEOPLE_EXPECTED_WEEKLY_HOURS", "alias": "Weekly Hours", "rawValue": "35.75", "value": "35.75" }, { "key": "PEOPLE_EXPECTED_DAILY_HOURS", "alias": "Daily Hours", "rawValue": "7.8", "value": "7.80" } ], "children": [], "summaryListDisplay": [], "rootEntity": "PEOPLE", "customProperties": {} }, { "key": { "PEOPLE": "22222" }, "coreEntityKey": { "EMP": { "id": "22222" } }, "attributes": [ { "key": "EMP_COMMON_FULL_NAME", "alias": "Name", "rawValue": "F3irst1, Last1", "value": "F3irst1, Last1" }, { "key": "EMP_COMMON_PRIMARY_JOB", "alias": "Primary Job", "rawValue": "Cl3erk", "value": "Cl3erk" }, { "key": "PEOPLE_EMAIL", "alias": "Email Address", "rawValue": "email address removed for privacy reasons", "value": "email address removed for privacy reasons" }, { "key": "EMP_COMMON_PRIMARY_ORG", "alias": "Location", "rawValue": "1ère 3Inc/1ère Inc", "value": "1ère 3Inc/1ère Inc" }, { "key": "PEOPLE_BADGE_NUMBER", "alias": "Active Badge Number", "rawValue": "23222", "value": "23222" }, { "key": "PEOPLE_EXPECTED_WEEKLY_HOURS", "alias": "Weekly Hours", "rawValue": "30.0", "value": "30.00" }, { "key": "PEOPLE_EXPECTED_DAILY_HOURS", "alias": "Daily Hours", "rawValue": "30.0", "value": "30.00" } ], "children": [], "summaryListDisplay": [], "rootEntity": "PEOPLE", "customProperties": {} } ], 
"summaryListDisplay": [], "rootEntity": "ROOT", "customProperties": {} } }35Views0likes1CommentUpgrading from 2012 (SP4) to 2019 including SSRS
Upgrading from 2012 (SP4) to 2019 including SSRS
Hi, I have done a fair bit of googling/reading and found conflicting paths/advice, so I am looking for a definitive, ideally step-by-step, guide. One of our servers currently runs SQL Server 2012 and we need to upgrade to 2019. The instance also includes SSRS, and the report server is on the same server, with approximately 100+ reports in use. I've read that SSRS does not upgrade when you upgrade the SQL version, as it is now a standalone program (true?). So basically I need to upgrade SQL Server in place (easy), then do whatever is needed to get SSRS to 2019 and have the report definitions (RDLs) recognise their new environment and upgrade themselves in the process. I imagine the schema path at the top of each RDL would need changing at the very least. We also have scheduled subscriptions that we need to retain, and of course data sources that point to the relevant DBs (I am not sure whether these would be affected by an upgrade/install of SSRS). I would be grateful for any legitimate sources that deal with this scenario, and I thank you in advance.
Nested Query Slowing Performance
Hi, at the beginning of my SQL code I have a CTE called TARGETS, which contains three columns: TARGET_NAME, MODE and VALUE. Further down in the query I use a nested query to retrieve values from this table; however, it is really slowing down performance. I am wondering if there is a better way to do this. An example of the nested query:
SUM(CASE
      WHEN LEG = 'Empty' AND LOCATION = 'Perth' THEN (SELECT VALUE FROM TARGETS WHERE TARGET_NAME = 'Empty Perth' AND MODE = 'Yes')
      WHEN LEG = 'Empty' AND LOCATION = 'Melbourne' THEN (SELECT VALUE FROM TARGETS WHERE TARGET_NAME = 'Empty Melbourne' AND MODE = 'No')
      etc...
    END) / COUNT(DISTINCT(LEG)) AS Target
FROM LEG_DETAILS
Many Thanks,
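For the nested-query question above, a sketch that fetches the targets once and joins them, instead of running a correlated subquery per CASE branch per row (the LEG/LOCATION/TARGET_NAME pairings are taken from the example; extend the mapping for the remaining branches):

```sql
-- Sketch: materialize the leg/location-to-target mapping once, then join it.
WITH target_map AS (
    SELECT 'Empty' AS LEG, 'Perth' AS LOCATION, VALUE
    FROM TARGETS WHERE TARGET_NAME = 'Empty Perth' AND MODE = 'Yes'
    UNION ALL
    SELECT 'Empty', 'Melbourne', VALUE
    FROM TARGETS WHERE TARGET_NAME = 'Empty Melbourne' AND MODE = 'No'
    -- ...one branch per remaining target...
)
SELECT SUM(tm.VALUE) / COUNT(DISTINCT d.LEG) AS Target
FROM LEG_DETAILS AS d
LEFT JOIN target_map AS tm
  ON tm.LEG = d.LEG AND tm.LOCATION = d.LOCATION;
```

The join form lets the optimizer build the small lookup set once, rather than evaluating a scalar subquery for every qualifying row.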
one column refer to 2 columns
Hello there, one global table is dbo.Employee(EmployeeId int, sex char(10)). Another global table is dbo.GradeEmployee(PolicyEmployeeId int (FK to Employee.EmployeeId), FaultEmployeeId int (FK to Employee.EmployeeId), grade char(5)). There is also my temp table, #TmpEmployee(EmployeeId int, employeeName char(50), quantity). GradeEmployee.PolicyEmployeeId is derived from Employee.EmployeeId, and GradeEmployee.FaultEmployeeId is also derived from Employee.EmployeeId. Now, the task is: I want to get the data from the #TmpEmployee table and display it in the format PolicyEmployeeId, FaultEmployeeId, employeeName, Sum(quantity), where Sum(quantity) is the total quantity of a FaultEmployeeId/PolicyEmployeeId pair. Note that in the #TmpEmployee table, sometimes the quantity for FaultEmployeeId is not present, and sometimes the quantity for PolicyEmployeeId is not present, but the row still has to be displayed in the result. How do I write the SELECT query?
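For the question above, a hedged sketch (the requirements leave room for interpretation): join the temp table twice, once per role, using LEFT JOINs so a pair still appears when one side has no quantity rows:

```sql
-- Sketch: one LEFT JOIN per role, so missing quantities on either side
-- don't drop the pair from the result.
SELECT ge.PolicyEmployeeId,
       ge.FaultEmployeeId,
       COALESCE(p.employeeName, f.employeeName) AS employeeName,
       ISNULL(SUM(p.quantity), 0) + ISNULL(SUM(f.quantity), 0) AS TotalQuantity
FROM dbo.GradeEmployee AS ge
LEFT JOIN #TmpEmployee AS p ON p.EmployeeId = ge.PolicyEmployeeId
LEFT JOIN #TmpEmployee AS f ON f.EmployeeId = ge.FaultEmployeeId
GROUP BY ge.PolicyEmployeeId, ge.FaultEmployeeId;
```

Whether the two quantities should be summed, reported separately, or attributed to one of the two names depends on the business rule, so treat the aggregation line as the part to adapt.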
Data masking in SQL server 2019
We tried to mask a column in a SQL database. I first backed up the table tblContacts to tblContacts_backup, then issued the following SQL command to mask the column "Email":
ALTER TABLE tblContacts ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'default()');
Then I checked the result, but nothing happened (no mask applied to the column); I compared tblContacts and tblContacts_backup and saw no difference:
SELECT Email FROM [ABC].[dbo].[tblContacts] where Email is not null
SELECT Email FROM [ABC].[dbo].[tblContacts_backup] where Email is not null
I have taken the following steps, but the problem is still not solved: made sure the user is not in the db admin role; ran REVOKE UNMASK for the user; disconnected and reconnected; and checked that the configuration has been applied to the column:
select c.name AS column_name, t.name AS data_type, m.is_masked, m.masking_function
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
LEFT JOIN sys.masked_columns m ON c.object_id = m.object_id AND c.column_id = m.column_id
WHERE t.name = 'tblContacts_backup'
This found is_masked = 1 and masking_function = default(). What else can I check or try?
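For the masking question above: dynamic data masking only hides values from principals that lack the UNMASK permission — members of db_owner and sysadmin always see the real data, which looks exactly like "no mask applied". A sketch for testing as a genuinely low-privilege user (also note the catalog query in the post filters on tblContacts_backup rather than the masked tblContacts):

```sql
-- Sketch: verify the mask through an impersonated user with SELECT only.
CREATE USER MaskTestUser WITHOUT LOGIN;
GRANT SELECT ON dbo.tblContacts TO MaskTestUser;

EXECUTE AS USER = 'MaskTestUser';
SELECT Email FROM dbo.tblContacts;   -- should show the default() mask, e.g. 'xxxx'
REVERT;
```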
SQL Server 2022 DateDiff D not working
Just updated a SQL Server 2017 instance to 2022, and a very simple WHERE clause using DATEDIFF(d, ...) no longer works in a stored procedure. Here is the where clause:
...AND datediff(d, A.Custfirstvisit, getdate()) < 7
Simple, but no longer working. Tried casting as int, and loading the 7 into a declared int variable. Nope. But I can perform addition etc., e.g. subtract the 7 from the DATEDIFF(d) return value and make the threshold 7 lower. We are getting close to just walking away from MS tools altogether. Microsoft is now breaking stuff that used to work, and EASY stuff at that. Wow.
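On the DATEDIFF question above, a sketch of an equivalent, index-friendly (sargable) form of the same filter; comparing the results of both forms, and re-testing under the database's old compatibility level, would help isolate whether an upgrade-related behavior change is actually involved:

```sql
-- DATEDIFF(d, v, GETDATE()) < 7 is equivalent, at day granularity, to:
...AND A.Custfirstvisit >= DATEADD(day, -6, CAST(GETDATE() AS date))
-- Unlike the DATEDIFF form, this version can seek an index on Custfirstvisit.
```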
Recent Blogs
- The SQL Tools team is excited to announce the upcoming release of SQL Server Management Studio (SSMS) 21 Preview 1, and the upcoming Private Preview of Copilot in SSMS. (Dec 20, 2024)
- The SQL Tools team, in partnership with the Visual Studio team, is thrilled to announce the release of SQL Server Management Studio 21 Preview 1, which is based on Visual Studio 2022 (17.13.0 Previe... (Dec 19, 2024)