Cost-effective alternatives to control table for processed files in Azure Synapse
Hello, good morning. In Azure Synapse Analytics, I want to have a control table for the files that have already been processed by the bronze or silver layers. For this, I wanted to create a dedicated SQL pool, but I see that at the minimum performance level it charges 1.51 USD per hour (as I show in the image), so I wanted to know what more economical alternatives I have, since I will need to do inserts and updates against this control table, and that is not possible with a serverless option.

Price reduction and upcoming features for Azure confidential ledger!
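One common, low-cost pattern for the scenario above is to put the control table in a small Azure SQL Database (the serverless tier auto-pauses when idle) and drive inserts/updates with a MERGE. The sketch below is only an illustration; the table, column, and parameter names are assumptions, not from the original post:

```sql
-- Hypothetical control table for tracking processed files (all names are assumptions)
CREATE TABLE dbo.FileProcessingControl (
    FilePath       NVARCHAR(400) NOT NULL PRIMARY KEY,
    LayerName      VARCHAR(20)   NOT NULL,   -- e.g. 'bronze' or 'silver'
    ProcessedAtUtc DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    RowsLoaded     BIGINT        NULL
);

-- Upsert: update the timestamp if the file was processed before, insert otherwise
MERGE dbo.FileProcessingControl AS tgt
USING (SELECT @FilePath AS FilePath, @LayerName AS LayerName) AS src
    ON tgt.FilePath = src.FilePath
WHEN MATCHED THEN
    UPDATE SET ProcessedAtUtc = SYSUTCDATETIME(), LayerName = src.LayerName
WHEN NOT MATCHED THEN
    INSERT (FilePath, LayerName) VALUES (src.FilePath, src.LayerName);
```

Another option worth evaluating is a Delta table in the data lake, which also supports updates from Spark pools without any dedicated SQL pool cost.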
Effective March 1, 2025, you can keep your records in Azure confidential ledger (ACL) at the reduced price of ~$3/day per instance! The reduced price covers the computation and the ledger use; the price of any additional storage used remains unchanged.

To tamper-protect your records: automatically create hashes (e.g. MD5 or SHA256) of your blob storage data and keep those in Azure confidential ledger. For forensics, you can verify the integrity of the data against the signature in ACL. Imagine doing this as you migrate data from one system to another, or when you restore archived records from cold storage. It is also valuable when there is a need to protect against insider/administrator risks and confidently report to authorities.

If you keep your data in Azure SQL Database, you can use its ledger feature to auto-generate record digests and store them in confidential ledger for integrity protection and safeguarding. You can use the SQL stored procedure to verify that no tampering or administrator modifications occurred to your SQL data!

In addition, we are announcing the preview of User Defined Functions for Azure confidential ledger. Imagine doing a schema validation before writing data to the ledger, or using pattern matching to identify sensitive information in log messages and masking it. Request access for this preview via the sign-up form. Get started by reading our documentation and trying out confidential ledger yourself!

_____________________________________________________________________________________________________

What is Azure confidential ledger and what is the change? It is a tamper-protected and auditable data store, backed by a Merkle tree blockchain structure, for sensitive records that require high levels of integrity protection and/or confidentiality.
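The blob-hashing workflow described above can be sketched as follows. The digest computation uses only the Python standard library; the ledger write (shown as a comment) would go through the Azure confidential ledger SDK and is an assumption here, not a confirmed API call:

```python
import hashlib

def blob_digest(blob_bytes: bytes) -> str:
    """Compute a SHA-256 digest of blob contents, to be stored in the ledger."""
    return hashlib.sha256(blob_bytes).hexdigest()

def verify_blob(blob_bytes: bytes, stored_digest: str) -> bool:
    """Forensics check: recompute the digest and compare to the ledger copy."""
    return blob_digest(blob_bytes) == stored_digest

# In a real pipeline you would then write the digest to ACL, e.g. (assumed API):
# client.create_ledger_entry({"contents": digest})

original = b"archived record v1"
digest = blob_digest(original)
assert verify_blob(original, digest)          # untouched data verifies
assert not verify_blob(b"tampered!", digest)  # any modification is detected
```

Because SHA-256 is deterministic, the digest recomputed at restore time matches the ledger copy only if the bytes are byte-for-byte identical to what was archived.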
While customers from AI, financial services, healthcare, and supply chain continue to use the ledger for archiving business transactions and for auditing the unique identifiers of confidential data, we are acting on their feedback by scaling ledgers to more of their workloads at a more competitive price!

How can I use Azure confidential ledger?
- Azure SQL Database ledger customers can enable confidential ledger as a trusted digest store to strengthen their integrity and security posture.
- Azure customers who use blob storage have found value in migrating their workloads to Azure with a tamper-protection check via the Azure confidential ledger Marketplace App.
- Azure customers who use data stores and databases (e.g. Kusto, Cosmos DB, and Log Analytics) may benefit from auditability and traceability of logs kept in the confidential ledger, with new compliance certifications in SOC 2 Type 2 and ISO 27001.

How much does Azure confidential ledger cost?
- Approximately $3/day/ledger

_____________________________________________________________________________________________________

Resources
- Explore the Azure confidential ledger documentation
- Read the blog post on: Integrity protect blob storage
- Read the blog post on: How to choose between ledger in Azure SQL Database and Azure Confidential Ledger
- Read the blog post on: Verify integrity of data transactions in Azure confidential ledger
- View our recent webinar in the Security Community
- Recent case studies: HB Antwerp & BeekeeperAI

How to execute SQL script via Azure Pipeline, please help
How can I execute a SQL script via an Azure Pipeline? I don't want to do it via PowerShell. Please suggest an approach. For example, I will be creating an azure-pipelines.yml where, under a stage/job, there will be steps to execute a SQL script file (containing multiple CRUD operations).

Confidential Data Clean Rooms – The evolution of sensitive data collaboration
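One non-PowerShell option is the built-in SqlAzureDacpacDeployment task with deployType set to SqlTask, which runs a .sql file directly against an Azure SQL database. The sketch below is illustrative only; the service connection name, server, database, variables, and file path are placeholders:

```yaml
# azure-pipelines.yml (sketch; names and values are placeholders)
stages:
- stage: DeploySql
  jobs:
  - job: RunScript
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: SqlAzureDacpacDeployment@1
      inputs:
        azureSubscription: 'my-service-connection'   # assumed service connection
        ServerName: 'myserver.database.windows.net'
        DatabaseName: 'mydatabase'
        SqlUsername: '$(sqlUser)'
        SqlPassword: '$(sqlPassword)'                # store as a secret variable
        deployType: 'SqlTask'
        SqlFile: '$(Build.SourcesDirectory)/scripts/crud-operations.sql'
```

The same task also accepts an inline script or a DACPAC, so the CRUD statements can live in source control and run as an ordinary pipeline step.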
Secure data collaboration between multiple parties has the potential to revolutionize societies, businesses, and industries for the better. Collaborating on sensitive data assets facilitates innovation to unlock new value for organizations.

Frictionless Collaborative Analytics and AI/ML on Confidential Data
Secure enclaves protect data from attack and unauthorized access, but confidential computing presents significant challenges and obstacles to performing analytics and machine learning at scale across teams and organizational boundaries. In this article, we'll explore the Opaque platform and describe how it can enable multiple parties to easily collaborate on and analyze shared data while keeping it fully confidential.

Trying to Set Up Advanced SQL Tracking
I work for a corporation, and we are trying to set up Application Insights for one of our products. We have the instrumentation key and everything integrated into the code where necessary, but we still can't see all the information we want. The goal is for our screen to look like this: https://docs.microsoft.com/en-us/azure/azure-monitor/app/asp-net-dependencies We've been using the above link to guide us. We have installed the Microsoft.Data.SqlClient NuGet package and added the XML line, but the advanced SQL data still does not show. We are running things locally on IIS Express; our project runs, and when we interact with the web app we see spikes in our performance graph, so we know it is collecting some metrics. We apparently need to be running SDK version 'rddp'. We are instead seeing versions 'rddf' and 'rdddsd', which indicate that dependencies are collected via DiagnosticSource or EventSource callbacks, and hence full SQL queries aren't being captured. We got the SDK version information by running this query in Logs in App Insights:

dependencies | where timestamp > ago(1d) | summarize count() by cloud_RoleInstance, sdkVersion

Hosted SQL advice please
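For comparison, full SQL command text collection is typically switched on via the dependency-tracking module in ApplicationInsights.config. Treat this fragment as a sketch to check against your own config rather than a drop-in fix:

```xml
<!-- ApplicationInsights.config (fragment) -->
<TelemetryModules>
  <Add Type="Microsoft.ApplicationInsights.DependencyCollector.DependencyTrackingTelemetryModule, Microsoft.AI.DependencyCollector">
    <EnableSqlCommandTextInstrumentation>true</EnableSqlCommandTextInstrumentation>
  </Add>
</TelemetryModules>
```

If the element is present but the SDK version still reports 'rddf'/'rdddsd', it usually means the Microsoft.ApplicationInsights.DependencyCollector package is missing or outdated, since that package supplies the richer instrumentation.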
I’m looking to migrate away from our on-prem DC and separate SQL server into Azure. My plan is to join our Windows 10 clients to our existing Azure domain (AAD), continue to use Exchange Online for email, and move files and folders to OneDrive. All of the above I’m happy with so far. The bit I’m unsure about is the SQL databases our client software uses. I know I can create a SQL instance in Azure and potentially migrate our data across to it, but I am concerned about latency between the client app and the Azure SQL instance. We have a 100Mb/100Mb leased line as our internet link. Any advice on this, please?

SQL Migration to Azure Cloud
I have implemented a basic C# application connected to an on-premises SQL Server. I am going to migrate the same database and its data to the Azure cloud using the Microsoft Data Migration Assistant. After the migration, without touching the code, I will debug the same application (changing only the connection string).

The on-premises database is hosted on SQL Server. I run a query against it through Management Studio and get my results; then I issue the same query through my C# application. First I test the connection with SQL authentication credentials:

string source = "Data Source=" + textBox1.Text + ";Initial Catalog=CheckPostingDb;User Id=" + textBox2.Text + ";Password=" + textBox3.Text;
SqlConnection con = new SqlConnection(source);
con.Open();
MessageBox.Show("Db Connected");

Once that succeeds, I run the same query through a SqlCommand in C# and get the same result in the text box:

string sqlSelectQuery = "SELECT COUNT(*) AS MREQUESTS FROM MREQUESTS WHERE REQSTATE=1";
SqlCommand cmd = new SqlCommand(sqlSelectQuery, con);
SqlDataReader dr = cmd.ExecuteReader();
if (dr.Read())
{
    textBox4.Text = Convert.ToString(dr["MREQUESTS"]);
}
con.Close();

Let’s migrate to the cloud. I have deployed a sample database in Azure with SQL authentication; it is just a blank database with no tables, so the same query there fails. In the migration tool I start a new project with type Migration. In this step I specify the source and target server details. In my scenario the source server is localhost and the target SQL server is in the Azure cloud: Source Server: localhost, Target Server: gohulan.database.windows.net. I select the correct database to migrate from the source server, and on the target side I select the correct database in Azure; my Azure cloud has only one database, named CheckPostingDb. Once it’s connected, I select the objects or tables from the source database that I would like to migrate. In my testing environment
I am selecting only one table; my table is MREQUESTS, since my C# application targets results only from this table. Once the table is ticked, I generate the SQL script, and once the script is generated, I deploy the schema:

/******** DMA Schema Migration Deployment Script  Script Date: 2/24/2020 12:50:55 PM ********/
/****** Object: Table [dbo].[MREQUESTS]  Script Date: 2/24/2020 12:50:55 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[MREQUESTS](
    [ID] [bigint] IDENTITY(1,1) NOT NULL,
    [RID] [uniqueidentifier] NOT NULL,
    [ReqTime] [datetime] NOT NULL,
    [ReqState] [tinyint] NOT NULL,
    [RecordType] [int] NOT NULL,
    [Data1] [bigint] NULL,
    [ServiceID] [int] NULL,
    [FirstRequestTime] [datetime] NULL,
    [OfflinePosting] [bit] NULL,
    [ServiceHostInfo] [nvarchar](80) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    CONSTRAINT [PK_MREQUESTS] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
END
GO
IF NOT EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND name = N'AK_MREQUESTS_RID')
CREATE UNIQUE NONCLUSTERED INDEX [AK_MREQUESTS_RID] ON [dbo].[MREQUESTS]
(
    [RID] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO
IF NOT EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND name = N'IX_MREQUESTS_2')
CREATE NONCLUSTERED INDEX [IX_MREQUESTS_2] ON [dbo].[MREQUESTS]
(
    [ReqTime] ASC,
    [ReqState] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

Deploying the generated script produced no errors or warnings; it executed successfully. Once the script is deployed, I can see my table was created in the Azure cloud. But my query returns 0 rows, meaning the MREQUESTS table has been deployed but not its data yet, so I migrate the data as the next step. Once the migration starts, it begins sending the data to the cloud; the time taken depends on the data volume and network speed. Since my table does not have much data, it finished in a short time without any warnings or errors. Running the same query in the Query Editor in Azure to check my data, it succeeds and returns the same value as the on-premises query did earlier. Returning to my C# application, with no changes made except the connection string (server name and SQL authentication credentials), I debug the application to confirm it functions properly and test the connection with the correct SQL credentials. The results are as expected, which means the migration succeeded.

Azure SQL Managed Instance - Changing Geo-Replication settings after deployment
Hello everyone, do you know if it is possible to change the "DNS Zone" and "DNS Zone Partner" settings after the deployment of a Managed Instance has completed? https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-failover-group-tutorial According to the documentation, there is an option for geo-replication settings at creation time. Can I change the settings after creating the Managed Instance so I can create a failover group? Unfortunately I did not create the second managed instance myself 😞 Wojtek
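For reference, once two compatible managed instances exist, a failover group can typically be created with the Azure CLI. The sketch below is illustrative; the group, instance, and resource group names are placeholders:

```
az sql instance-failover-group create \
    --name my-failover-group \
    --mi my-primary-instance \
    --partner-mi my-secondary-instance \
    --resource-group my-rg \
    --partner-resource-group my-partner-rg
```

Note that the secondary instance must have been created with the primary's DNS zone as its DNS zone partner, which is why this setting matters at creation time.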