Latest Discussions
Using a connection string to connect to Blob storage
Hi, this might be a simple one, but I've searched around a lot for an answer and come up with nothing, so I'm hoping someone here can help. In Synapse I just want to create a linked service to Blob storage using its connection string stored in Key Vault. It lets me create it, and 'Test connection' succeeds, but when I move to the Data tab in Synapse and try to view it there, it shows an error. It seems to think the account name is blank; note there is nothing in the parentheses. I also get this error when I hover over the red X. This does work when I enter the storage details manually and use Key Vault to retrieve just the access key rather than the whole connection string, so it's definitely not a permissions issue on the Key Vault. I wondered whether the connection string needs to be in any particular format to work; I copied it directly from the storage account in the portal and it looks correct. I've tried this with different storage accounts on different tenants and it's the same. What am I missing? Thanks :-)
ian2x4b523p · Sep 03, 2025 · Copper Contributor · 15 Views · 0 Likes · 0 Comments
How to connect to Fabric Data Warehouse from Synapse notebook using PySpark
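Re: "Using a connection string to connect to Blob storage" — the blank account name suggests Synapse could not parse an AccountName out of the secret. As a minimal sketch (pure Python, no Azure SDK; the account name below is a placeholder), this is roughly the `key=value;...` format a portal-issued connection string follows, and how one might sanity-check the secret before storing it in Key Vault:

```python
def parse_blob_connection_string(conn_str: str) -> dict:
    """Split an Azure Storage connection string into its key/value parts.

    Portal-issued strings look like:
    DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net
    """
    parts = {}
    for segment in conn_str.strip().split(";"):
        if not segment:
            continue  # tolerate a trailing semicolon
        key, _, value = segment.partition("=")
        parts[key] = value  # AccountKey is base64 and may itself end in '='
    return parts


def validate(conn_str: str) -> list:
    """Return a list of problems that would leave the account name blank."""
    parts = parse_blob_connection_string(conn_str)
    return [f"missing or empty {req}"
            for req in ("AccountName", "AccountKey")
            if not parts.get(req)]
```

If the secret was stored with surrounding quotes, a stray newline, or only the account key, `AccountName` comes back empty, which would match the blank name in the error shown.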
Hi, I am trying to read and write data in a Fabric data warehouse from a Synapse notebook using PySpark, but I have not found any references. Could you please help me with this?
venkateshgoud · Aug 06, 2025 · Microsoft · 25 Views · 0 Likes · 0 Comments
Synapse Webhook Action with Private Logic App
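Re: "how to connect to fabric Datawarehouse from synapse notebook using pyspark" — one commonly suggested route is Spark's JDBC reader pointed at the warehouse's SQL connection string. This is an untested sketch, not a recipe: the endpoint hostname, database name, and token acquisition are all placeholders/assumptions; the code only assembles the JDBC options.

```python
def fabric_jdbc_options(sql_endpoint: str, database: str, access_token: str) -> dict:
    """Assemble Spark JDBC options for a Fabric warehouse SQL endpoint.

    sql_endpoint is the '<guid>.datawarehouse.fabric.microsoft.com'-style
    hostname shown on the warehouse (hypothetical placeholder here), and
    access_token is an Entra ID token for the SQL endpoint's audience.
    """
    return {
        "url": f"jdbc:sqlserver://{sql_endpoint}:1433;database={database};encrypt=true",
        "accessToken": access_token,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

# In a Synapse notebook one would then (untested sketch):
# df = (spark.read.format("jdbc")
#       .options(**fabric_jdbc_options(endpoint, "MyWarehouse", token),
#                dbtable="dbo.MyTable")
#       .load())
```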
Hi all, I have a Synapse workspace with public access disabled, using all private endpoints for both inbound access and outbound access from the managed VNet. I also have a Logic App with private endpoints. Synapse and the Logic App are in separate virtual networks but peered together at a central hub site. Each has access to private DNS zones with records to resolve to each resource. Since I disabled public network access on the Logic App, I can no longer use a Webhook activity with a callback URI from a Synapse pipeline. A Web activity works just fine, but with the Webhook activity I get a 403 Forbidden response from the Logic App. Ordinarily that looks like a permissions issue, but when public network access is enabled, the Logic App workflow works fine. When the Webhook activity fails to run, there is no activity run logged on the Logic App. There's something the Webhook activity is not getting back from the Logic App when public network access is disabled. I've been trying to find a solution (including sending a 202 response back to Synapse from the Logic App), but it continues to baffle me. Has anyone else successfully configured a Synapse Webhook activity to call a workflow in a Standard Logic App over private endpoints? Any ideas or suggestions to troubleshoot this?
Jeff Brown · Jul 14, 2025 · Copper Contributor · 31 Views · 0 Likes · 0 Comments
Partitioning in Azure Synapse
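Re: "Synapse Webhook Action with Private Logic App" — for context when troubleshooting, the Webhook activity passes a `callBackUri` in the request body and then waits for the callee to POST back to that URI to complete the activity, so both the outbound call and the callback path have to be reachable. A minimal sketch of the callback shape (field names follow the documented ADF/Synapse Webhook contract as I understand it; treat them as assumptions to verify):

```python
import json

def build_webhook_callback(callback_uri: str, status: str = "Succeeded",
                           output: dict = None) -> dict:
    """Shape the POST a called service sends back to the Webhook activity's
    callBackUri to complete the activity. Returns a request description,
    not an actual HTTP call."""
    body = {"status": status}
    if output is not None:
        body["output"] = output  # surfaces in the activity's output in Synapse
    return {
        "method": "POST",
        "url": callback_uri,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```

Returning 202 from the Logic App only acknowledges receipt; the activity still hangs (or times out) until something actually invokes the callback URI, which over private endpoints also requires the Logic App's outbound traffic to resolve and reach the Synapse endpoint.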
Hello, I'm currently working on an optimization project, which has led me down a rabbit hole of technical differences between regular MSSQL and the dedicated SQL pool that is Azure PDW. I noticed that when checking the distribution of partitions after creating a table, say splitting data by YEAR([datefield]) with ranges for each year ('20230101', '20240101', etc.), the sys.partitions view claims that all partitions hold an equal number of rows. Also, from the query plans I cannot see any impact on the way the query is executed, even though partition elimination should be the first move when querying with WHERE [datefield] = '20230505'. Any info and advice would be greatly appreciated.
AbuasRinroe · May 21, 2025 · Copper Contributor · 49 Views · 0 Likes · 0 Comments
Synapse workspace cost reduction
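Re: "Partitioning in Azure Synapse" — as an illustration only, here is how RANGE RIGHT boundaries like '20230101', '20240101' map a predicate value to a single partition (a pure-Python model of the boundary arithmetic, not SQL; with RANGE RIGHT the boundary value itself belongs to the partition on its right):

```python
from bisect import bisect_right

def partition_for(date_key: str, boundaries: list) -> int:
    """Model RANGE RIGHT partitioning over yyyymmdd string keys.

    N boundary values define N+1 partitions; a key equal to a boundary
    falls into the partition to the boundary's right. Returns a 1-based
    partition number, mirroring sys.partitions.partition_number.
    """
    return bisect_right(sorted(boundaries), date_key) + 1
```

Note this models only the boundary arithmetic. Whether the optimizer actually eliminates partitions can depend on details like the literal's data type matching the partition column, so a plan that ignores partitioning is worth checking for implicit conversions.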
I have a Cosmos DB with one container that holds different documents. One is a main document that has related event documents; the two are linked by a shared partition key. There is one main document and multiple event documents per partition key. The main document has fields such as date, country, and categories, which the event document does not have, while the event document has fields such as event type, event datetime, etc. To count how many events happened for a particular category on a particular day, we have to use the main document. Events can repeat within a single day.

My requirement is to build a Power BI report showing how many events happened on a particular day, and for which country, over the last 2 months (each event should count only once per category, per country, per day). I want to pull this data from Synapse and load it into Power BI for the last 2 months. I used Synapse views and implemented incremental dataset refresh in Power BI: one view loads the main documents, and another takes the partition keys from the main view and loads the event documents.

The main document has two dates, a created date and a change date. I cannot use the change date for incremental refresh because it creates duplicate records, so I use the created date and then detect changes over the last 30 days (the window in which a main document can change). This works well, but the query takes a long time to execute, which drives up data-processing cost in Synapse. Any suggestions to reduce the Synapse cost as well as the query execution time / dataset refresh time in Power BI?
SynLover · Mar 11, 2025 · Copper Contributor · 62 Views · 0 Likes · 0 Comments
how do you have any questions or hell
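Re: "Synapse workspace cost reduction" — the two windows described (report data for the last 2 months, change detection over the created date for the last 30 days) can be pinned down explicitly, which helps bound how much data each refresh actually scans. A hypothetical sketch of the window arithmetic (modelling "2 months" as 60 days is an assumption):

```python
from datetime import date, timedelta

def refresh_windows(today: date, report_days: int = 60, change_days: int = 30):
    """Compute the two cutoffs the refresh design implies.

    report_start: oldest created date the dataset keeps at all.
    change_start: oldest created date re-queried for changes; partitions
    older than this are served from cache, not re-executed in Synapse.
    Filtering partitions on the immutable created date keeps each record in
    exactly one refresh partition; partitioning on the mutable change date
    would move records between partitions and duplicate them.
    """
    report_start = today - timedelta(days=report_days)
    change_start = today - timedelta(days=change_days)
    return report_start, change_start
```

The practical lever is making sure the Synapse views can push a created-date filter down (e.g. a created-date path in the analytical store), so only the 30-day change window is re-scanned per refresh rather than the full 60 days.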
How to use the latest version of the pr ye sab ki aap karte Microsoft word Excel PowerPoint template for your time to explore new the latest news on the latest news agency of my life I have a good the world on fire of my life I have..
osude127w · Jan 12, 2025 · Copper Contributor · 23 Views · 0 Likes · 0 Comments
Data Flow textbox size increase
Hi Community, is there a way to increase the textbox size for the Data Flow Select transformation's mapping fields (or to request a change)? The destination field (Name As) is much smaller than the source field, so you can't see the complete text, even though there is plenty of space to the right to make it wider.
alexdarling · Nov 04, 2024 · Copper Contributor · 61 Views · 0 Likes · 0 Comments
Disable mouse over text / hover tooltips
Hi Community, how can I disable the annoying tooltips / hover text when working in Azure Synapse Studio? When navigating around the page, the tooltip appears repeatedly, getting in the way of actually clicking where I want to go. Example below.
alexdarling · Nov 04, 2024 · Copper Contributor · 65 Views · 0 Likes · 0 Comments
Query insights
Question: how can I identify unused data in a modern data platform built with Azure Synapse and the medallion architecture, using Log Analytics?

I'm working with a client who has built a modern data platform based on the medallion architecture, leveraging Azure Synapse and Azure Storage accounts. Users access the data in various ways within Synapse workspaces: some through Python scripts, others through serverless SQL endpoints, and others via dedicated SQL pools (using views and stored procedures). We log a significant amount of information via Log Analytics, which means that essentially all SELECT statements executed on the data are logged.

The client now wants to identify which data is not actively used, in order to reduce storage costs by removing unused datasets. In a traditional SQL data warehouse the Query Store could be used for this, but on this platform we only have the log data stored in Log Analytics.

My question is: based on the logs in Log Analytics, how can we determine which data (tables, views, etc.) is processed through the various layers of the medallion architecture but not actually used? The goal is to remove unused data to save costs. Some additional questions:

- Is there a way to analyze usage patterns of datasets based on raw logs in Log Analytics?
- Are there existing tools or KQL queries that could help identify which datasets have been inactive over a certain period?
- Could a metastore tool such as Azure Purview play a role in identifying unused datasets? If so, how can it be integrated with our existing platform?

Any suggestions or insights would be greatly appreciated!
HansG1610 · Oct 15, 2024 · Copper Contributor · 140 Views · 0 Likes · 0 Comments
SQL DB (serverless SQL pool) access grant to end user
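Re: "Query insights" — since every SELECT ends up in Log Analytics, one low-tech approach is to export the logged statements and diff the set of referenced objects against the catalog. A hypothetical sketch (the regex is deliberately simplistic, and the table names and log-export format are assumptions, not your schema):

```python
import re

# Matches the identifier after FROM or JOIN; ignores CTEs, subqueries, etc.
TABLE_REF = re.compile(r"(?:FROM|JOIN)\s+([\w.\[\]]+)", re.IGNORECASE)

def referenced_tables(logged_statements) -> set:
    """Extract object names mentioned after FROM/JOIN in logged SQL text."""
    refs = set()
    for stmt in logged_statements:
        for name in TABLE_REF.findall(stmt):
            refs.add(name.replace("[", "").replace("]", "").lower())
    return refs

def unused_tables(catalog, logged_statements) -> list:
    """Catalog objects that never appear in the captured query log."""
    used = referenced_tables(logged_statements)
    return sorted(t.lower() for t in catalog if t.lower() not in used)
```

This only catches direct references, so objects reached indirectly (views over tables, stored procedures) need the dependency chain expanded before deleting anything; a catalog tool like Purview could supply that lineage.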
What is the best way to grant users access to a SQL database (serverless SQL pool) built on top of a Synapse Azure lake database? And how should the SQL database objects (views or external tables) access the lake database's storage container: RBAC, managed identity, ACLs, or something else? Please tell me the approach commonly used in industry.
AbbasF · Oct 08, 2024 · Copper Contributor · 143 Views · 0 Likes · 0 Comments
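Re: "SQL DB(serverless sql pool) access grant to end user." — one common pattern is to create Entra ID users in the serverless database and grant SELECT on a schema of views, while the views reach storage through the workspace's managed identity (granted Storage Blob Data Reader on the container via RBAC). As a hedged sketch, this just assembles the usual T-SQL; the user and schema names are placeholders and the statements should be verified against your setup:

```python
def grant_reader_sql(user_principal: str, schema: str = "dbo") -> list:
    """Assemble the T-SQL typically run in a serverless SQL pool database to
    let an Entra ID user query the views in one schema. Statement shapes
    follow standard SQL Server / Synapse serverless syntax."""
    return [
        f"CREATE USER [{user_principal}] FROM EXTERNAL PROVIDER;",
        f"GRANT SELECT ON SCHEMA::{schema} TO [{user_principal}];",
    ]
```

Keeping end users on database-level grants like this, rather than giving them storage RBAC or ACLs directly, means the SQL layer stays the single access-control point for the lake data.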