Latest Discussions
SQL query to check status change of an item
I have a table dbo.tblPresentationStatus (SQL script at the bottom). I have to select rows where the status change is wrong. The flow should be: Review --> Approve --> Presentation --> Close.

From 'Review' the status for a specific presentation ID can change to 'Approve', 'Presentation' or 'Close'. From 'Approve' it can change to 'Presentation' or 'Close'. From 'Presentation' it can only change to 'Close'.

I want to write a query that returns presentations with a wrong status flow. The expected output for the records given in the script below is:

PrID | Status1 | Status2      | Status3 | Status4
103  | REVIEW  | PRESENTATION | APPROVE | CLOSE
101  | APPROVE | REVIEW       | NULL    | NULL

```sql
DROP TABLE IF EXISTS #tblPresentation;
CREATE TABLE #tblPresentation
(
    PrID   int NOT NULL,
    PrName nvarchar(100) NULL
);

INSERT INTO #tblPresentation VALUES
(100, 'PrA'),
(101, 'PrB'),
(102, 'PrC'),
(103, 'PrD');

DROP TABLE IF EXISTS #tblPresentationStatus;
CREATE TABLE #tblPresentationStatus
(
    StatusID   int NOT NULL,
    PrID       int NOT NULL,
    PrStatus   nvarchar(100) NOT NULL,
    StatusDate datetime NOT NULL
);

INSERT INTO #tblPresentationStatus VALUES
-- PRESENTATION ID 100
(1, 100, 'REVIEW',       '2024-01-01 00:00:00.00'),
(2, 100, 'APPROVE',      '2024-01-02 00:00:00.00'),
(3, 100, 'PRESENTATION', '2024-01-03 07:00:00.00'),
(4, 100, 'CLOSE',        '2024-01-03 10:00:00.00'),
-- PRESENTATION ID 101
(5, 101, 'APPROVE',      '2024-01-01 00:00:00.00'),
(6, 101, 'REVIEW',       '2024-01-03 10:00:00.00'), -- wrong: moving backward from 'APPROVE' to 'REVIEW' is not allowed
-- PRESENTATION ID 102
(7, 102, 'REVIEW',       '2024-01-01 00:00:00.00'),
(8, 102, 'PRESENTATION', '2024-01-02 00:00:00.00'),
(9, 102, 'CLOSE',        '2024-01-03 10:00:00.00'),
-- PRESENTATION ID 103
(10, 103, 'REVIEW',       '2024-01-01 00:00:00.00'),
(11, 103, 'PRESENTATION', '2024-01-02 00:00:00.00'),
(12, 103, 'APPROVE',      '2024-01-03 00:00:00.00'), -- wrong: moving backward from 'PRESENTATION' to 'APPROVE' is not allowed
(13, 103, 'CLOSE',        '2024-01-04 00:00:00.00');
```

SqlPuzzle · Dec 03, 2024 · Occasional Reader · 14 Views · 0 likes · 1 Comment
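One way to approach this, as a minimal sketch (not from the original thread), is to rank each status, compare consecutive rows per PrID with LAG, flag any presentation that ever moves to an equal or lower rank, and pivot the flagged presentations' statuses into columns. The CTE names Ranked and BadPr are illustrative; the assumption is that only strictly forward moves are valid.

```sql
;WITH Ranked AS
(
    SELECT  s.PrID,
            s.PrStatus,
            s.StatusDate,
            CASE s.PrStatus
                 WHEN 'REVIEW'       THEN 1
                 WHEN 'APPROVE'      THEN 2
                 WHEN 'PRESENTATION' THEN 3
                 WHEN 'CLOSE'        THEN 4
            END AS StatusRank,
            ROW_NUMBER() OVER (PARTITION BY s.PrID ORDER BY s.StatusDate) AS rn
    FROM #tblPresentationStatus AS s
),
BadPr AS
(
    SELECT DISTINCT PrID
    FROM (
            SELECT PrID,
                   StatusRank,
                   LAG(StatusRank) OVER (PARTITION BY PrID ORDER BY StatusDate) AS PrevRank
            FROM Ranked
         ) AS x
    WHERE PrevRank IS NOT NULL
      AND StatusRank <= PrevRank            -- backward (or repeated) move
)
SELECT  r.PrID,
        MAX(CASE WHEN r.rn = 1 THEN r.PrStatus END) AS Status1,
        MAX(CASE WHEN r.rn = 2 THEN r.PrStatus END) AS Status2,
        MAX(CASE WHEN r.rn = 3 THEN r.PrStatus END) AS Status3,
        MAX(CASE WHEN r.rn = 4 THEN r.PrStatus END) AS Status4
FROM Ranked AS r
WHERE r.PrID IN (SELECT PrID FROM BadPr)
GROUP BY r.PrID;
```

For the sample data this flags PrID 101 (APPROVE followed by REVIEW) and PrID 103 (PRESENTATION followed by APPROVE), matching the expected output.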
Sticky Notes Account Login and Sync Issue Report

I have a habit of using Sticky Notes as a daily memo tool, and everything has been functioning normally until recently. On the morning of Friday, November 29th, I suddenly discovered that my sticky notes were unable to sync. After testing by logging out and logging back in, I encountered the message: "Your account does not have an email address, so your sticky notes cannot be saved to the cloud." Since then, I have been unable to log in.

That day, I contacted Microsoft Support. An engineer conducted remote testing and found that it was not a system issue, as other accounts could log in successfully. However, the account I normally use, email address removed for privacy reasons, could not log in to Sticky Notes. They advised me to contact customer service and get assistance from the account department.

The next day, on Saturday, I contacted the account department. They asked me to log in to Microsoft using my regular account, which I was able to do without issue. However, I still could not log in to Sticky Notes, leaving the account department uncertain about the cause of the problem. They then transferred me to the Office department, but they were also unable to resolve the issue.

Billy0821 · Dec 03, 2024 · Occasional Reader · 3 Views · 0 likes · 0 Comments
Help in Query to calculate the % from last two timelines

I want to compare the FAmt from the two most recent lines, and if the difference is more than 80% then that row should be in the result. The group/index keys are A, B, C, D, Edate.

```sql
CREATE TABLE #AR
(
    A     char(10),
    B     char(10),
    C     char(10),
    D     char(10),
    Edate datetime,
    FAmt  decimal(6,2),
    G     char(2)
);

INSERT INTO #AR VALUES ('AA', 'CCC', '1', 'E', '2022-01-01', 12.60, '2');
INSERT INTO #AR VALUES ('AA', 'CCC', '1', 'E', '2023-01-01', 15.80, '2');
INSERT INTO #AR VALUES ('AA', 'CCC', '1', 'E', '2024-01-01', 18.60, '2');
INSERT INTO #AR VALUES ('BB', 'DCC', '1', 'E', '2022-01-01', 11.40, '2');
INSERT INTO #AR VALUES ('BB', 'DCC', '1', 'E', '2024-01-01', 15.60, '2');
INSERT INTO #AR VALUES ('CC', 'DCC', '1', 'E', '2021-01-01', 12.60, '2');
INSERT INTO #AR VALUES ('CC', 'GCC', '2', 'E', '2022-01-01', 15.60, '2');
INSERT INTO #AR VALUES ('CC', 'GCC', '2', 'E', '2023-04-01', 18.60, '2');
INSERT INTO #AR VALUES ('CC', 'GCC', '2', 'E', '2024-04-01', 34.80, '2'); -- Note: this FAmt is more than 80% above the last timeline and should be in the expected result
```

Expected result:

A  | B   | C | D | Edate      | FAmt  | G | Comments
CC | GCC | 2 | E | 2024-04-01 | 34.80 | 2 | Current Amount is > 80% from last timeline

Kenny_Gua · Nov 29, 2024 · Copper Contributor · 43 Views · 0 likes · 5 Comments
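A possible sketch, assuming the intent is to partition by A, B, C, D, order by Edate, and compare each group's most recent FAmt with the one before it:

```sql
-- Minimal sketch: flag the latest row per (A, B, C, D) group when it is more than 80%
-- above the previous row's FAmt. "Previous" and "latest" are both defined by Edate.
;WITH Ordered AS
(
    SELECT  A, B, C, D, Edate, FAmt, G,
            LAG(FAmt)    OVER (PARTITION BY A, B, C, D ORDER BY Edate)      AS PrevFAmt,
            ROW_NUMBER() OVER (PARTITION BY A, B, C, D ORDER BY Edate DESC) AS rn
    FROM #AR
)
SELECT  A, B, C, D, Edate, FAmt, G,
        'Current Amount is > 80% from last timeline' AS Comments
FROM Ordered
WHERE rn = 1                                   -- most recent row in the group
  AND PrevFAmt IS NOT NULL                     -- groups with a single row have nothing to compare
  AND (FAmt - PrevFAmt) / PrevFAmt > 0.80;     -- more than 80% above the previous row
```

For the sample rows this returns only the CC / GCC / 2 / E row dated 2024-04-01, since (34.80 - 18.60) / 18.60 is roughly 87%.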
Issue with importing json file

I am using the following script to read data from a JSON file and import it into SQL Server. The data import works without any issue. However, I noticed that when the JSON file contains special characters (e.g. è) it replaces them with è. I already saved the JSON file with UTF-8. Is there any way to import the actual character?

```sql
DROP TABLE IF EXISTS #Test3;

CREATE TABLE #Test3
(
    EMP_COMMON_FULL_NAME         NVARCHAR(500),
    EMP_COMMON_PRIMARY_JOB       NVARCHAR(500),
    PEOPLE_EMAIL                 NVARCHAR(500),
    EMP_COMMON_PRIMARY_ORG       NVARCHAR(500),
    PEOPLE_BADGE_NUMBER          NVARCHAR(500),
    PEOPLE_EXPECTED_WEEKLY_HOURS NVARCHAR(500),
    PEOPLE_EXPECTED_DAILY_HOURS  NVARCHAR(500)
);

DECLARE @json NVARCHAR(MAX);

SELECT @json = BulkColumn
FROM OPENROWSET(BULK 'F:\PowerBI\json\file2.json', SINGLE_CLOB, CODEPAGE = '65001') AS jsonFile;

INSERT INTO #Test3
(
    EMP_COMMON_FULL_NAME,
    EMP_COMMON_PRIMARY_JOB,
    PEOPLE_EMAIL,
    EMP_COMMON_PRIMARY_ORG,
    PEOPLE_BADGE_NUMBER,
    PEOPLE_EXPECTED_WEEKLY_HOURS,
    PEOPLE_EXPECTED_DAILY_HOURS
)
SELECT
    JSONData.EMP_COMMON_FULL_NAME,
    JSONData.EMP_COMMON_PRIMARY_JOB,
    JSONData.PEOPLE_EMAIL,
    JSONData.EMP_COMMON_PRIMARY_ORG,
    JSONData.PEOPLE_BADGE_NUMBER,
    JSONData.PEOPLE_EXPECTED_WEEKLY_HOURS,
    JSONData.PEOPLE_EXPECTED_DAILY_HOURS
FROM OPENJSON(@json, '$.data.children')
WITH
(
    EMP_COMMON_FULL_NAME         NVARCHAR(500) '$.attributes[0].value',
    EMP_COMMON_PRIMARY_JOB       NVARCHAR(500) '$.attributes[1].value',
    PEOPLE_EMAIL                 NVARCHAR(500) '$.attributes[2].value',
    EMP_COMMON_PRIMARY_ORG       NVARCHAR(500) '$.attributes[3].value',
    PEOPLE_BADGE_NUMBER          NVARCHAR(500) '$.attributes[4].value',
    PEOPLE_EXPECTED_WEEKLY_HOURS NVARCHAR(500) '$.attributes[5].value',
    PEOPLE_EXPECTED_DAILY_HOURS  NVARCHAR(500) '$.attributes[6].value'
) AS JSONData;

SELECT * FROM #Test3;
```

The JSON file:

```json
{
  "metadata": {
    "numNodes": "500",
    "metadataKey": "fewfewf-sdvrv-evfewsv",
    "cacheKey": "fewfewf-sdvrv-evfewsv",
    "currencyCode": "NONE",
    "totalNodes": "500",
    "totalElements": "500"
  },
  "data": {
    "key": { "ROOT": "-1" },
    "coreEntityKey": {},
    "attributes": [],
    "children": [
      {
        "key": { "PEOPLE": "67648" },
        "coreEntityKey": { "EMP": { "id": "11111" } },
        "attributes": [
          { "key": "EMP_COMMON_FULL_NAME", "alias": "Name", "rawValue": "Sam, Rogers", "value": "Sam, Rogers" },
          { "key": "EMP_COMMON_PRIMARY_JOB", "alias": "Primary Job", "rawValue": "Accountant", "value": "Accountant" },
          { "key": "PEOPLE_EMAIL", "alias": "Email Address", "rawValue": "email address removed for privacy reasons", "value": "email address removed for privacy reasons" },
          { "key": "EMP_COMMON_PRIMARY_ORG", "alias": "Location", "rawValue": "1ère Inc/1ère Inc", "value": "1ère Inc/1ère Inc" },
          { "key": "PEOPLE_BADGE_NUMBER", "alias": "Active Badge Number", "rawValue": "1234", "value": "1234" },
          { "key": "PEOPLE_EXPECTED_WEEKLY_HOURS", "alias": "Weekly Hours", "rawValue": "35.75", "value": "35.75" },
          { "key": "PEOPLE_EXPECTED_DAILY_HOURS", "alias": "Daily Hours", "rawValue": "7.8", "value": "7.80" }
        ],
        "children": [],
        "summaryListDisplay": [],
        "rootEntity": "PEOPLE",
        "customProperties": {}
      },
      {
        "key": { "PEOPLE": "22222" },
        "coreEntityKey": { "EMP": { "id": "22222" } },
        "attributes": [
          { "key": "EMP_COMMON_FULL_NAME", "alias": "Name", "rawValue": "F3irst1, Last1", "value": "F3irst1, Last1" },
          { "key": "EMP_COMMON_PRIMARY_JOB", "alias": "Primary Job", "rawValue": "Cl3erk", "value": "Cl3erk" },
          { "key": "PEOPLE_EMAIL", "alias": "Email Address", "rawValue": "email address removed for privacy reasons", "value": "email address removed for privacy reasons" },
          { "key": "EMP_COMMON_PRIMARY_ORG", "alias": "Location", "rawValue": "1ère 3Inc/1ère Inc", "value": "1ère 3Inc/1ère Inc" },
          { "key": "PEOPLE_BADGE_NUMBER", "alias": "Active Badge Number", "rawValue": "23222", "value": "23222" },
          { "key": "PEOPLE_EXPECTED_WEEKLY_HOURS", "alias": "Weekly Hours", "rawValue": "30.0", "value": "30.00" },
          { "key": "PEOPLE_EXPECTED_DAILY_HOURS", "alias": "Daily Hours", "rawValue": "30.0", "value": "30.00" }
        ],
        "children": [],
        "summaryListDisplay": [],
        "rootEntity": "PEOPLE",
        "customProperties": {}
      }
    ],
    "summaryListDisplay": [],
    "rootEntity": "ROOT",
    "customProperties": {}
  }
}
```

dp3450 · Nov 28, 2024 · Copper Contributor · 33 Views · 0 likes · 1 Comment
Upgrading from 2012 (SP4) to 2019 including SSRS

Hi, I have done a fair bit of googling/reading and found conflicting paths/advice, so I am looking for a definitive guide. Step by step would be ideal if possible.

One of our servers is currently running SQL 2012 and we need to upgrade to 2019. The instance also includes SSRS, and the report server is on the same server, with approx. 100+ reports in use. I've read that SSRS does not upgrade when you upgrade the SQL version as it is now a stand-alone program (true?).

So basically I need to upgrade (in place) SQL Server (easy), but then do whatever I need to do to get SSRS to 2019 and obviously have the report definitions (RDLs) recognise their new environment and upgrade themselves in the process. I imagine the schema path at the top of the RDL would need changing at the very least. We also have scheduled subscriptions that we need to retain, and of course data sources that point to the relevant DBs (not sure if these would be affected as part of any upgrade/install of SSRS).

Would be grateful for any legitimate sources that deal with this scenario, and I thank you in advance.

Jaywork · Nov 27, 2024 · Copper Contributor · 5 Views · 0 likes · 0 Comments
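A general note, not a full answer to the thread: since SSRS 2017 the report server is indeed a separate installation, while the reports, subscriptions, schedules and shared data sources live in the ReportServer catalog databases. Whichever upgrade path applies, a safe first step is to back those databases up and export the SSRS encryption key (Report Server Configuration Manager > Encryption Keys), since the key is what lets stored credentials and subscriptions survive the move. A minimal sketch, assuming the default catalog database names and a placeholder backup path:

```sql
-- Pre-upgrade safety net (sketch): back up the SSRS catalog databases before touching the instance.
-- The file path below is a placeholder; the encryption key export is done separately in
-- Report Server Configuration Manager.
BACKUP DATABASE ReportServer
    TO DISK = N'F:\Backups\ReportServer_preUpgrade.bak' WITH COMPRESSION, INIT;

BACKUP DATABASE ReportServerTempDB
    TO DISK = N'F:\Backups\ReportServerTempDB_preUpgrade.bak' WITH COMPRESSION, INIT;
```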
Nested Query Slowing Performance

Hi, at the beginning of my SQL code I have a CTE target table called TARGETS, which contains 3 columns: TARGET_NAME, MODE and VALUE. Further down in my query I am trying to use a nested query to retrieve values from this table; however, it is really slowing down the performance. I am wondering if there is a better way to do this? An example of the nested query looks like this:

```sql
SUM(CASE WHEN LEG = 'Empty' AND LOCATION = 'Perth'
             THEN (SELECT VALUE FROM TARGETS WHERE TARGET_NAME = 'Empty Perth' AND MODE = 'Yes')
         WHEN LEG = 'Empty' AND LOCATION = 'Melbourne'
             THEN (SELECT VALUE FROM TARGETS WHERE TARGET_NAME = 'Empty Melbourne' AND MODE = 'No')
         -- etc...
    END) / COUNT(DISTINCT(LEG)) AS Target
FROM LEG_DETAILS
```

Many thanks,

leerjones85 · Nov 26, 2024 · Copper Contributor · 37 Views · 0 likes · 5 Comments
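A minimal sketch of an alternative shape: join TARGETS once instead of evaluating a correlated subquery inside every CASE branch. The join key (TARGET_NAME = LEG + ' ' + LOCATION) and the per-location MODE mapping are assumptions inferred from the two branches above, and dbo.tblTargets is a placeholder for the original CTE body.

```sql
-- Illustrative only: TARGETS is re-declared here with a placeholder source, and the join
-- conditions mirror the two CASE branches shown in the question.
;WITH TARGETS AS
(
    SELECT TARGET_NAME, MODE, VALUE
    FROM dbo.tblTargets            -- placeholder for the original CTE definition
)
SELECT SUM(t.VALUE) / COUNT(DISTINCT d.LEG) AS Target
FROM LEG_DETAILS AS d
LEFT JOIN TARGETS AS t
       ON  t.TARGET_NAME = d.LEG + ' ' + d.LOCATION          -- 'Empty Perth', 'Empty Melbourne', ...
       AND t.MODE = CASE d.LOCATION WHEN 'Perth'     THEN 'Yes'
                                    WHEN 'Melbourne' THEN 'No'  END;
```

If TARGETS is referenced many times in the larger query, materializing it into an indexed temp table instead of a CTE can also help, since a CTE's definition may be re-evaluated on each reference.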
One column refers to 2 columns

Hello there. A global table is dbo.Employee (EmployeeId int, sex char(10)). Another global table is dbo.GradeEmployee (PolicyEmployeeId int (FK of Employee.EmployeeId), FaultEmployeeId int (FK of Employee.EmployeeId), grade char(5)). Another is my temp table called #TmpEmployee (EmployeeId int, employeeName char(50), quantity).

GradeEmployee.PolicyEmployeeId is derived from Employee.EmployeeId, and GradeEmployee.FaultEmployeeId is also derived from Employee.EmployeeId.

Now, the task is that I want to get the data from the #TmpEmployee table and display it in the format below:

PolicyEmployeeId, FaultEmployeeId, employeeName, Sum(quantity)

Sum(quantity) is the total quantity of a (FaultEmployeeId, PolicyEmployeeId) pair. Note that in the #TmpEmployee table sometimes the quantity for FaultEmployeeId is not present, and sometimes the quantity for PolicyEmployeeId is not present, but the row still has to be displayed in the result. How do I write the SELECT query?

JohnMathew · Nov 25, 2024 · Copper Contributor · 28 Views · 0 likes · 1 Comment
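A rough sketch, assuming "total quantity of a pair" means the policy employee's quantity plus the fault employee's quantity, that #TmpEmployee has at most one row per EmployeeId, and that either employee's name is acceptable in the output (all of which would need confirming against the real requirement):

```sql
-- Sketch: LEFT JOIN #TmpEmployee twice so the pair still appears when one side has no quantity row.
SELECT  g.PolicyEmployeeId,
        g.FaultEmployeeId,
        COALESCE(p.employeeName, f.employeeName) AS employeeName,
        ISNULL(p.quantity, 0) + ISNULL(f.quantity, 0) AS TotalQuantity
FROM dbo.GradeEmployee AS g
LEFT JOIN #TmpEmployee AS p ON p.EmployeeId = g.PolicyEmployeeId
LEFT JOIN #TmpEmployee AS f ON f.EmployeeId = g.FaultEmployeeId;
```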
Data masking in SQL Server 2019

We tried to mask a column in a SQL database. I first backed up the table tblContacts to tblContacts_backup, then issued the following SQL command to mask the column "Email":

```sql
ALTER TABLE tblContacts
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'default()');
```

Then I checked the result, but nothing happened (no mask applied to the column). I compared tblContacts and tblContacts_backup but found no difference:

```sql
SELECT Email FROM [ABC].[dbo].[tblContacts] WHERE Email IS NOT NULL;
SELECT Email FROM [ABC].[dbo].[tblContacts_backup] WHERE Email IS NOT NULL;
```

I have done the following steps but the problem is still not solved:

- made sure the user is not in the db admin role
- REVOKE UNMASK TO the user
- disconnected and connected again
- checked that the configuration has been applied to the column:

```sql
SELECT c.name AS column_name,
       t.name AS data_type,
       m.is_masked,
       m.masking_function
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
LEFT JOIN sys.masked_columns m ON c.object_id = m.object_id AND c.column_id = m.column_id
WHERE t.name = 'tblContacts_backup';
```

This found is_masked = 1 and masking_function = 'default()'. What else can I check or try?

johnyhh · Nov 21, 2024 · Copper Contributor · 40 Views · 0 likes · 1 Comment
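One thing worth checking, as a sketch run in the ABC database: dynamic data masking is never applied for members of db_owner or sysadmin (they effectively hold UNMASK through CONTROL), so a quick way to confirm the mask itself works is to query the table as a low-privilege test user.

```sql
-- Sketch: create a throwaway user with only SELECT permission and no UNMASK, then impersonate it.
CREATE USER MaskTestUser WITHOUT LOGIN;
GRANT SELECT ON dbo.tblContacts TO MaskTestUser;

EXECUTE AS USER = 'MaskTestUser';
SELECT TOP (10) Email FROM dbo.tblContacts WHERE Email IS NOT NULL;  -- should now return masked values
REVERT;

DROP USER MaskTestUser;
```

If the values come back masked here but not for the account you were testing with, the mask is fine and that account simply has permissions (or role memberships) that bypass it.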
SQL Server 2022 DateDiff D not working

Just updated a SQL Server 2017 instance to 2022, and a very simple WHERE clause using DATEDIFF(d, ...) no longer works in a stored procedure. Here is the where clause:

```sql
... AND datediff(d, A.Custfirstvisit, getdate()) < 7
```

Simple, but no longer working. Tried casting as int, changing the 7 to being loaded into a declared int variable. Nope. But I can perform addition etc., e.g. subtract the 7 from the DATEDIFF(d) return value and make the value 7 lower. We are getting close to having to just walk away from MS tools altogether. Microsoft just keeps breaking stuff that used to work, and EASY stuff at that. Wow.

togethere · Nov 19, 2024 · Copper Contributor · 28 Views · 0 likes · 2 Comments
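As a diagnostic sketch (the table name dbo.Customers below is a placeholder, since the post does not name it): comparing the DATEDIFF form with an equivalent date-range form can show whether the function call or the underlying Custfirstvisit data is behaving differently after the upgrade, and the range form is also index-friendly.

```sql
-- Sketch: both predicates should count the same rows, because DATEDIFF(d, X, GETDATE()) < 7
-- is equivalent to X falling on or after midnight six days ago.
SELECT
    SUM(CASE WHEN DATEDIFF(d, A.Custfirstvisit, GETDATE()) < 7
             THEN 1 ELSE 0 END) AS DateDiffForm,
    SUM(CASE WHEN A.Custfirstvisit >= DATEADD(day, -6, CAST(GETDATE() AS date))
             THEN 1 ELSE 0 END) AS RangeForm
FROM dbo.Customers AS A;   -- placeholder table name
```

If the two counts differ, the data (or the column's type/collation after the upgrade) is the place to look rather than DATEDIFF itself.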
SSIS Debug Spawns 2 DtsDebugHost.exe and Freezes

Operating System: Windows 10 Enterprise
Visual Studio: 2022 Enterprise (x64)
Target SQL Server: SQL Server 2019

On many packages, running the package in debug mode (x64, ??) locks up the UI (Windows message "Visual Studio is Busy"). Task Manager shows 2 instances of DtsDebugHost started from C:\Program Files\Microsoft Visual Studio\2022\Enterprise\Common7\IDE\CommonExtensions\Microsoft\SSIS\150\Binn, each with a different -regas {guid} command line. For packages where this doesn't happen, only 1 is spawned. Killing the DtsDebugHost.exe processes frees up the Visual Studio GUI but ends the debug session. These packages do not use any 3rd party components.

Archivist · Nov 19, 2024 · Copper Contributor · 26 Views · 0 likes · 1 Comment
Tags
- Data Warehouse (64 topics)
- Integration Services (58 topics)
- sql server (42 topics)
- Reporting Services (41 topics)
- Business Intelligence (35 topics)
- SQL (35 topics)
- Analysis Services (29 topics)
- Business Apps (22 topics)
- analytics (18 topics)
- ssms (10 topics)