Latest Discussions
Synapse SQL - Index creation is extremely slow for a small table
I attempted to create a nonclustered index on a small table in Synapse. It had been running for 15 minutes and started to block sessions, so I cancelled it. The FACT_NON_MERCH_DISCOUNT table has only 25,597 rows, so an index creation on it would normally be very fast on a SQL Server instance. This is the T-SQL used:

CREATE NONCLUSTERED INDEX [NCI_BUSINESS_DATE]
ON [bnbi_retail].[FACT_NON_MERCH_DISCOUNT] ( [BUSINESS_DATE] ASC )
WITH (DROP_EXISTING = OFF)
GO

I did a second test and cancelled it after 20 minutes. Can you tell me what could be causing a small new index to run so long? Attached are some screen prints showing that resources are fine, along with the locks created. A larger table where this index is also wanted has over 61 million rows, so I need to know the root cause of the excessive runtime of an index creation before attempting anything that large. Also attached is the DDL for the existing table, with the existing index and the distribution settings that pre-existed for the [bnbi_retail].[FACT_NON_MERCH_DISCOUNT] table. Here is the reversed DDL for the pre-existing table I tried to add the nonclustered index to:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [bnbi_retail].[FACT_NON_MERCH_DISCOUNT]
(
    [TRAN_KEY_ID] [bigint] NOT NULL,
    [NON_MERCH_SEQ_NUM] [int] NOT NULL,
    [NON_MERCH_DISCOUNT_SEQ_NUM] [int] NOT NULL,
    [BUSINESS_DATE_INT] [int] NULL,
    [SK_MEMBERSHIP] [bigint] NULL,
    [DISCOUNT_ID] [varchar](20) NULL,
    [DISCOUNT_AMT] [decimal](9, 2) NULL,
    [DISCOUNT_PCT] [int] NULL,
    [DISCOUNT_TYPE] [int] NULL,
    [DISCOUNT_CALCULATION] [int] NULL,
    [DISCOUNT_SOURCE] [int] NULL,
    [PROMO_ID] [char](10) NULL,
    [DISCOUNT_TYPE_EXTENDED] [int] NULL,
    [ENTITLEMENT_TYPE] [char](10) NULL,
    [SK_SOURCE_FEED] [bigint] NULL,
    [CREATE_DATE] [datetime2](7) NULL,
    [UPDATE_DATE] [datetime2](7) NULL,
    [CREATE_USER_ID] [varchar](100) NULL,
    [UPDATE_USER_ID] [varchar](100) NULL,
    [FILE_NAME] [varchar](250) NULL,
    [ETL_LOAD_ID] [int] NULL,
    [BUSINESS_DATE] [date] NULL,
    CONSTRAINT [PK_] PRIMARY KEY NONCLUSTERED
    (
        [TRAN_KEY_ID] ASC,
        [NON_MERCH_SEQ_NUM] ASC,
        [NON_MERCH_DISCOUNT_SEQ_NUM] ASC
    ) NOT ENFORCED
)
WITH
(
    DISTRIBUTION = REPLICATE,
    CLUSTERED COLUMNSTORE INDEX
)
GO

DBATony · Mar 24, 2025 · Copper Contributor · 35 Views · 0 Likes · 2 Comments
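If this is an Azure Synapse dedicated SQL pool, one way to tell whether the CREATE INDEX is actually doing work or is queued behind locks or a concurrency slot is to query the PDW request and wait DMVs while it runs. A minimal sketch, assuming a dedicated pool; the request ID in the second query is a placeholder taken from the first:

-- Find the running request for the index build.
SELECT request_id, status, command, total_elapsed_time, resource_class
FROM   sys.dm_pdw_exec_requests
WHERE  command LIKE 'CREATE NONCLUSTERED INDEX%';

-- See what that request is waiting on (object locks, concurrency, etc.).
-- 'QID1234' is a placeholder request_id from the query above.
SELECT request_id, [type], object_type, object_name, state, request_time, acquire_time
FROM   sys.dm_pdw_waits
WHERE  request_id = 'QID1234';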
Strange behavior of MSSQL Listener name + port
Hello, everyone. We are running SQL Server 2022 with two listeners: LSNA on port 1434 and LSNB on port 1435. Connections work fine, but strangely we can also connect to the database with LSNA,1435 and LSNB,1434, combinations that should not exist. The connection appears to be routed by port alone, not by listener name plus port. Can anyone advise me on this? Thank you very much in advance.
chanrky1202 · Mar 21, 2025 · Copper Contributor · 69 Views · 0 Likes · 1 Comment
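SQL Server does not route by the name in the connection string; the listener name is only resolved to an IP address, so if both ports are listening on the same IP addresses, any name-plus-port combination that resolves will connect. A minimal sketch to confirm what the instance is listening on and which local endpoint a session actually came in through (nothing here is specific to this setup):

-- Every IP/port combination the instance is listening on.
SELECT ip_address, port, type_desc, state_desc
FROM   sys.dm_tcp_listener_states
WHERE  type_desc = 'TSQL';

-- The local address/port the current session connected through.
SELECT session_id, local_net_address, local_tcp_port, client_net_address
FROM   sys.dm_exec_connections
WHERE  session_id = @@SPID;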
Import does not display folder, so cannot reach file
I am trying to restore a database using SSMS under Windows 11. I don't recall which version of SSMS (it's on another computer), but I downloaded it less than a week ago, so it should be the current one. Everything is OK up to the point where I open the Locate Backup File dialog and try to select the backup file I want to restore. The file is in the Documents library, so its pathname begins C:\Users\<username>\Documents\... But when I get to the <username> folder, the dialog does not list any of its subfolders. I can't select the backup file if I can't reach the folder that contains it. I tried pasting the folder's pathname into the dialog's Backup File Location field. When I pressed Enter, SSMS displayed an error telling me, in essence, that I don't have permission to access the folder or it doesn't exist. Think about that... I don't have permission to access my own Documents folder, or it doesn't exist. Yes, the Documents folder does exist, and I do have permission to access it. File Explorer and every other application I've tried knows this. SSMS, for some reason, does not. What is going on here, and how can I straighten it out?
Orthoducks · Mar 21, 2025 · Copper Contributor · 28 Views · 0 Likes · 2 Comments
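The Locate Backup File dialog browses the file system as the SQL Server service account, not as the logged-in user, which is why user profile folders such as Documents often appear empty or inaccessible there. One workaround is to copy the .bak file to a folder the service account can read (for example the instance's default backup directory) and restore from there, either through the dialog or directly in T-SQL. A minimal sketch; the path and database name are placeholders:

-- Inspect the backup contents first (also shows logical file names for WITH MOVE).
RESTORE FILELISTONLY
FROM DISK = N'C:\SQLBackups\MyDatabase.bak';

-- Restore it; add WITH MOVE clauses if the original file paths do not exist on this machine.
RESTORE DATABASE [MyDatabase]
FROM DISK = N'C:\SQLBackups\MyDatabase.bak'
WITH RECOVERY;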
SQL Server Transactional Replication
Hi All - We are managing large tables using transactional replication. I noticed the table is vertically published (column filtered) after adding a new column at the source. From a performance perspective, will it make any difference for a large table to be published vertically versus horizontally (row filtered)? Thanks, Kranp.
kranp · Mar 19, 2025 · Copper Contributor · 7 Views · 0 Likes · 0 Comments
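To see exactly which columns the article currently replicates, and whether the newly added column made it into the vertical filter, the replication metadata can be checked at the Publisher. A minimal sketch; the publication and article names are placeholders:

-- Run in the publication database at the Publisher.
-- Rows returned with published = 1 are part of the article.
EXEC sp_helparticlecolumns
     @publication = N'MyPublication',
     @article     = N'MyLargeTable';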
DB change of account
Our database jobs, such as full database backups, transaction log backups, system database backups, the reorg job, update stats, and a few other DB-related jobs, as well as Task Scheduler jobs for housekeeping the trace files, are owned by one of our SQL DB accounts, which also has DB and SQL Server admin privileges. Without disrupting the above-mentioned job schedules, we would like to remove the DB admin and SQL admin privileges from that account, i.e. let this account be used only for scheduling SQL Agent jobs. Could you please help me obtain information from support on how to convert that account without impacting DB operations?
janice88 · Mar 18, 2025 · Copper Contributor · 25 Views · 0 Likes · 2 Comments
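Before changing anything, it helps to inventory which SQL Agent jobs the account actually owns, since T-SQL job steps owned by a non-sysadmin login run under that owner's security context and can start failing when its rights are removed. One common option (an assumption about this environment, not the only approach) is to reassign job ownership to another login first. A minimal sketch; the login and job names are placeholders:

-- List the jobs currently owned by the account.
SELECT j.name, SUSER_SNAME(j.owner_sid) AS owner_login
FROM   msdb.dbo.sysjobs AS j
WHERE  SUSER_SNAME(j.owner_sid) = N'DOMAIN\OldAdminAccount';

-- Reassign ownership (repeat per job, or loop over the result above).
EXEC msdb.dbo.sp_update_job
     @job_name         = N'Full database backup',
     @owner_login_name = N'sa';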
[SQL Server 2019 Standard] Order history now shows, but missing download link and product key
Hello, I'm experiencing an issue with a recent purchase from the Microsoft Store. I haven't received a confirmation email, and the order doesn't appear in my order history on the Microsoft Store website. However, the item is showing in my Microsoft Store library. When I click the download button, nothing happens, and the download doesn't start. Could anyone please help me understand what might be happening or suggest any troubleshooting steps? Thank you for your assistance.
kataoka · Mar 18, 2025 · Copper Contributor · 12 Views · 0 Likes · 0 Comments
Table
Hello everyone, I have a problem. Our products are dimension-dependent (width / height) and the prices are stored in a table [Costinformation]. The grid table is in the Pricetable column. Rows are separated with CRLF and the individual columns with Tab.

Table [Costinformation]:

Mandant  Code  Text             Category  Pricetable
1        4711  Product Table 1  Window
1        4712  Product Table 2  Door

Example Pricetable column:

W / H  100    200    300    400
100    10,00  20,00  30,00  40,00
200    20,00  30,00  40,00  50,00
300    30,00  40,00  50,00  60,00
400    40,00  50,00  60,00  70,00

Now the problem is that there are always price adjustments, and in 100% of cases they are percentage-based. This means the prices in the grid table have to be scaled up. I need a solution so that I no longer have to do this manually. Can it be done directly via an SQL statement? Example: Product Table 1: price + 5%; Product Table 2: price + 3%. I hope it is understandable. My English is not so good. I can also provide an example table. Thank you very much for your help. Greetings from Germany, CharlyStella
Solved · CharlyStella · Mar 18, 2025 · Copper Contributor · 141 Views · 0 Likes · 10 Comments
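The grid can be rewritten in place with a single UPDATE if the text is split into rows and cells, the price cells scaled, and the pieces reassembled. A minimal sketch, assuming SQL Server 2017 or later (STRING_AGG), database compatibility level 130 or higher (OPENJSON), that the first row and first column are dimensions that must stay unchanged, and that the table lives in dbo; the code value '4711' and the 5% factor are examples:

DECLARE @factor decimal(9,4) = 1.05;                    -- +5 %
DECLARE @crlf   nchar(2)     = NCHAR(13) + NCHAR(10);   -- row separator
DECLARE @tab    nchar(1)     = NCHAR(9);                -- column separator

UPDATE ci
SET    ci.Pricetable = x.NewPricetable
FROM   dbo.Costinformation AS ci
CROSS APPLY (
    -- Reassemble the scaled rows back into one CRLF/Tab separated grid.
    SELECT STRING_AGG(r.row_text, @crlf) WITHIN GROUP (ORDER BY r.row_idx) AS NewPricetable
    FROM (
        SELECT row_idx  = CAST(rows_.[key] AS int),
               row_text =
                  (SELECT STRING_AGG(
                              CASE
                                  WHEN CAST(rows_.[key] AS int) = 0      -- header row (widths)
                                    OR CAST(cols_.[key] AS int) = 0      -- header column (heights)
                                      THEN cols_.[value]
                                  ELSE REPLACE(CONVERT(varchar(20),
                                           CAST(CAST(REPLACE(cols_.[value], ',', '.') AS decimal(9,2))
                                                * @factor AS decimal(9,2))), '.', ',')   -- keep the German decimal comma
                              END, @tab) WITHIN GROUP (ORDER BY CAST(cols_.[key] AS int))
                   -- Split one row into cells at the tab characters.
                   FROM OPENJSON('["' + REPLACE(STRING_ESCAPE(rows_.[value], 'json'), '\t', '","') + '"]') AS cols_)
        -- Split the grid into rows at the CRLF characters.
        FROM OPENJSON('["' + REPLACE(STRING_ESCAPE(ci.Pricetable, 'json'), '\r\n', '","') + '"]') AS rows_
    ) AS r
) AS x
WHERE ci.Code = '4711';   -- Product Table 1; run again with '4712' and @factor = 1.03

Running the inner query as a plain SELECT first is an easy way to eyeball the rebuilt grid before changing any data.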
SQL Server does not reduce the size of MDF and LDF
I am a SQL Server 2017 user. I deleted a table that had too many records, but after the delete the size of the MDF and LDF files did not decrease; it actually increased. I have shrunk the files, but the file size remains the same. What should I do to change the file size?
riyanto · Mar 17, 2025 · Copper Contributor · 74 Views · 0 Likes · 1 Comment
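It can help to check how much free space each file actually contains, and whether anything is preventing the transaction log from truncating, before shrinking again. A minimal sketch; the database and file names are placeholders:

USE [MyDatabase];

-- Size and used space per file, in MB (size and SpaceUsed are reported in 8-KB pages).
SELECT name,
       size / 128                                         AS size_mb,
       CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128 AS used_mb
FROM   sys.database_files;

-- If this is not NOTHING (e.g. LOG_BACKUP, ACTIVE_TRANSACTION), something is holding the log.
SELECT name, log_reuse_wait_desc
FROM   sys.databases
WHERE  name = N'MyDatabase';

-- Shrink to a target size in MB once free space exists and the log can truncate.
-- DBCC SHRINKFILE (N'MyDatabase_log', 512);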
Parameterize OLE DB Destination?
I have a large number of tables to backfill and I was hoping to automate this process with a package parameter instead of creating a data flow for each table (see screenshot below).
1. Package parameter like 'tbl_a, tbl_b, tbl_c'.
2. Followed by an Execute Script task that splits this string into an array.
3. The array would then be enumerated in a Foreach Loop container.
3a. A SQL command built from the enumerated variable would be fed into the OLE DB Source SQL command: 'Select * from tbl_a', 'Select * from tbl_b', etc.
3b. The table-name-or-view-name variable in the OLE DB Destination would be given the different table names ('tbl_a', 'tbl_b', 'tbl_c') through the enumerated variable.
Where it fails is when tbl_a, tbl_b is enumerated into the OLE DB Destination:
Error: 0xC020201B at Data Flow Task, OLE DB Destination [2]: The number of input columns for OLE DB Destination.Inputs[OLE DB Destination Input] cannot be zero.
Error: 0xC004706B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" failed validation and returned validation status "VS_ISBROKEN".
Is there a way to parameterize the OLE DB Destination? Is there a way to avoid needing column mappings, or to somehow query all the columns from the OLE DB Source to feed into the OLE DB Destination? Each table is different, and mapping the columns for one backfill would completely blow up the next backfill in the batch. Is there another task I can use in place of the OLE DB Destination that can be preceded by an OLE DB Source? Thanks in advance.
jharper305 · Mar 12, 2025 · Copper Contributor · 28 Views · 0 Likes · 0 Comments
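If the source and destination tables are reachable from a single connection (an assumption; it may not match this environment), one way to sidestep column mappings entirely is to skip the data flow and run a dynamic INSERT ... SELECT from an Execute SQL Task inside the Foreach Loop. A minimal T-SQL sketch; the src/dest schema names are placeholders, and @TableName would be supplied by the loop variable:

DECLARE @TableName sysname = N'tbl_a';   -- placeholder; comes from the Foreach Loop variable

DECLARE @sql nvarchar(max) =
    N'INSERT INTO dest.' + QUOTENAME(@TableName) +
    N' SELECT * FROM src.' + QUOTENAME(@TableName) + N';';

EXEC sys.sp_executesql @sql;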
Tags
- Data Warehouse (67 topics)
- Integration Services (58 topics)
- sql server (51 topics)
- SQL (44 topics)
- Reporting Services (43 topics)
- Business Intelligence (35 topics)
- Analysis Services (31 topics)
- Business Apps (22 topics)
- analytics (21 topics)
- ssms (13 topics)