Stop Azure SQL Database
We have seen several cases where customers are looking for a way to stop an Azure SQL Database. Common scenarios include decommissioning a database while making sure no one is still using it, or pausing the database outside business hours to save money. Stopping an Azure SQL Database is not currently supported, but there are several options that can help in these scenarios.
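One workaround that often comes up in these conversations (not a true stop, but a way to cut cost during idle hours) is to scale the database down to a cheaper service objective off-hours and scale it back up when needed, or to move it to the serverless tier with auto-pause. Below is a minimal PowerShell sketch of the scaling approach, assuming the Az.Sql module; the resource group, server, database, and service objective names are placeholders.

# Minimal sketch, assuming the Az.Sql module is installed and placeholder resource names.
Connect-AzAccount

# Scale down in the evening (for example, from S3 to S0)
Set-AzSqlDatabase -ResourceGroupName "my-rg" -ServerName "my-sqlserver" -DatabaseName "my-db" -RequestedServiceObjectiveName "S0"

# Scale back up in the morning
Set-AzSqlDatabase -ResourceGroupName "my-rg" -ServerName "my-sqlserver" -DatabaseName "my-db" -RequestedServiceObjectiveName "S3"

A pair of scheduled Azure Automation runbooks (or Logic Apps) can run these commands on a timetable.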
Export Azure SQL Database | Advanced Scenarios

Introduction: Exporting an Azure SQL Database is a common request from Azure SQL DB customers. In this article we list some advanced scenarios and show how an export can be achieved through various tools, including the Azure portal, Azure CLI, and PowerShell. The article also covers alternative methods for servers that use private endpoints or deny public access.

Scenarios: In this section we go through the scenarios and provide some insight into each one.

Note: Import/Export using Private Link is now in public preview; more information is in the blog article Import Export using Private Link now in Preview - Microsoft Tech Community.

Export via Azure Portal to Storage Account

This can be a seamless way to export the database when the SQL server allows public access, i.e. the Deny public access option is not toggled on in the Azure portal. Otherwise you might get an error like:

An unexpected error was returned by the SQL engine while preparing the operation inputs. Error: 47073, State: 1.

To overcome this error, you can temporarily set Deny public access to No during the export operation.

Note: You don't need to worry: setting "Deny public access" to "No" does not mean that everyone will be able to connect from outside; you can still restrict access using the database firewall. You can find more information at: Connectivity settings for Azure SQL Database and Azure Synapse Analytics - Azure SQL Database and Azure Synapse Analytics | Microsoft Docs

Export via REST API

You can use the Export REST API to export the database. This can be done programmatically or from tools like Postman, and you can also try it from the Azure documentation using the "Try it" button. More information: Databases - Export - REST API (Azure SQL Database) | Microsoft Docs

Here is an example using Postman. Request body:

{
  "storageKeyType": "StorageAccessKey",
  "storageKey": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==",
  "storageUri": "https://xxxxxxxxxxxxxxxxx.blob.core.windows.net/testc",
  "administratorLogin": "xxxxxxxxxxxx",
  "administratorLoginPassword": "xxxxxxxxxxxxxx",
  "authenticationType": "Sql",
  "networkIsolation": {
    "sqlServerResourceId": "/subscriptions/xxxxxxxxxxxxx/resourceGroups/customer/providers/Microsoft.Sql/servers/xxxxxxxxx",
    "storageAccountResourceId": "/subscriptions/xxxxxxxxxxx/resourceGroups/customer/providers/Microsoft.Storage/storageAccounts/xxxxxxxxx"
  }
}

The error below may occur if Deny public access is enabled; the solution is to enable public access temporarily:

{"error":{"code":"ResourceNotFound","message":"The specified resource 'https://management.northeurope.control.database.windows.net/modules/AzureResourceManager.dsts/subscriptions/<yoursubscriptionid>/resourceGroups/customer/providers/Microsoft.Sql/servers/<servername>/databases/<dbname>/export?api-version=2021-02-01-preview' was not found."}}

Note: The networkIsolation setting is currently under development and not ready for public consumption. More information can be found at:
New-AzSqlDatabaseExport with network isolation · Discussion #13937 · Azure/azure-powershell · GitHub
Error when calling New-AzSqlDatabaseExport with UseNetworkIsolation on $true · Issue #13964 · Azure/azure-powershell · GitHub

Export via SQLPackage

This can be the best option for many scenarios: it overcomes limitations on database size and can export a database exposed only through a private endpoint, by running from a VM in the same VNet.
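As an illustration of that approach, here is a hedged example of such an export run from a VM inside the same VNet as the SQL server's private endpoint; the server, database, credentials, and output path are all placeholders, and (as the note below explains) the target must be a local disk or a mounted file share rather than blob storage.

# Hypothetical example: run from a VM that resolves the server's private endpoint.
# SqlPackage is assumed to be on PATH (or call it by its full installation path).
SqlPackage /Action:Export `
    /SourceServerName:"myserver.database.windows.net" `
    /SourceDatabaseName:"mydatabase" `
    /SourceUser:"<login>" `
    /SourcePassword:"<password>" `
    /TargetFile:"D:\exports\mydatabase.bacpac"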
Note: You can export to a local disk or an Azure file share, but you cannot export directly to Azure Blob storage; details can be found at Lesson Learned #25: Export/Import Azure SQL Database using Azure File Service? - Microsoft Tech Community. You can therefore export the .bacpac to a local disk or file share on a VM in the same VNet as the SQL server's private endpoint using SqlPackage.exe or SSMS, and then copy the .bacpac to Azure Blob storage if required. For example: Using SQLPackage to import or export SQL Server and Azure SQL DB - Microsoft Tech Community

Export via SQL Server Management Studio

You can export using SSMS from a VM running in the same VNet as the SQL server's private endpoint, to blob storage or a file share. Use the SQL Server Management Studio "Export Data-tier Application" wizard to export the Azure SQL database to a .bacpac file; the .bacpac can be stored in Azure Blob storage or on a file share.

Right-click the database under the logical SQL server in SSMS --> Tasks --> select the 'Export Data-tier Application' wizard. Select the location to store the .bacpac file. You can select a subset of the tables from the export settings on the Advanced tab, then click Next to view the summary of the export. Once you click Finish and the process completes, you will find the .bacpac file in the specified destination. More information at the blog: Using data-tier applications (BACPAC) to migrate a database from Managed Instance to SQL Server - Microsoft Tech Community

Export via PowerShell / CLI

The New-AzSqlDatabaseExport cmdlet submits an export request to the Azure SQL Database service. Note that you have to enable public access to export the database via this method; with Deny public access set to Yes, you may encounter an error.

Command to export the database via PowerShell:

New-AzSqlDatabaseExport -ResourceGroupName "customer" -ServerName "<your server name>" -DatabaseName "<your db name>" -StorageKeyType "StorageAccessKey" -StorageKey "<your storage access key>" -StorageUri "https://xxxxxxxxxxxxxxxxx.blob.core.windows.net/testc/database01.bacpac" -AdministratorLogin "<your login name>"

To check the status of the export request, use the Get-AzSqlDatabaseImportExportStatus cmdlet:

Get-AzSqlDatabaseImportExportStatus -OperationStatusLink https://management.azure.com/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx/providers/Microsoft.Sql/locations/northeurope/importExportOperationResults/xxxxxxx-xxxxxxx?api-version=2021-02-01-preview

To cancel an export request, use the Database Operations - Cancel API or the Stop-AzSqlDatabaseActivity PowerShell cmdlet:

Stop-AzSqlDatabaseActivity -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -OperationId $Operation.OperationId

Please note some considerations when using the PowerShell method.
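To build on the commands above, here is a small illustrative sketch that submits an export and then polls its status until it finishes; the resource names, storage details, and simplistic password handling are placeholders rather than recommendations.

# Illustrative only - placeholder names; assumes the Az.Sql module and an existing sign-in.
$securePassword = ConvertTo-SecureString "<your password>" -AsPlainText -Force

$export = New-AzSqlDatabaseExport -ResourceGroupName "customer" `
    -ServerName "<your server name>" `
    -DatabaseName "<your db name>" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<your storage access key>" `
    -StorageUri "https://<storage account>.blob.core.windows.net/testc/database01.bacpac" `
    -AdministratorLogin "<your login name>" `
    -AdministratorLoginPassword $securePassword

# Poll the operation status link until the export completes
do {
    Start-Sleep -Seconds 30
    $status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink
    Write-Output "Export status: $($status.Status)"
} while ($status.Status -eq "InProgress")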
Also check the Azure SQL CLI reference at: az sql db | Microsoft Docs, and see: How to cancel Azure SQL Database Import or Export operation - Microsoft Tech Community

Database Copy

You can use the database copy option in the Azure portal to copy the database to a different server, perform the export to Azure Blob storage from the copy, and then clean up the copied database.

The database export can also be automated; more information can be found at:
Blog: How to automate Export Azure SQL DB to blob storage use Automation account - Microsoft Tech Community
Video: SQL Insider Series: Exporting Azure SQL DB BACPAC file to Azure with Azure Automation | Data Exposed - YouTube

Additional References:
Export a database to a BACPAC file - Azure SQL Database & Azure SQL Managed Instance | Microsoft Docs
Using Azure Import/Export to transfer data to and from Azure Storage | Microsoft Docs
Configure Azure Storage firewalls and virtual networks | Microsoft Docs
Connectivity settings for Azure SQL Database and Azure Synapse Analytics - Azure SQL Database and Azure Synapse Analytics | Microsoft Docs
Automate native database backup of Azure SQL Managed Instance to Azure blob storage - Microsoft Tech Community

Disclaimer: Please note that the products and options presented in this article are subject to change. This article reflects the database export options available for Azure SQL Database in February 2022.

Closing remarks: We hope you find this article helpful. If you have any feedback, please do not hesitate to provide it in the comment section below.

Abhishek Shaha (Author), Ahmed Mahmoud (Co-Author)
Custom RBAC to access QPI's query text with minimal permissions

Custom RBAC to access Query Performance Insight's query text with minimal permissions. The solution described in this article applies to Azure SQL Database, Azure Database for MySQL v5.7 and 8.0, and Azure Database for PostgreSQL v9.6, 10, and 11.
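The article walks through building such a role step by step. Purely as an illustration of the general shape, here is a hedged PowerShell sketch of defining a custom role with a narrow set of Microsoft.Sql actions; the action strings below are assumptions for illustration and should be replaced with the ones the article recommends.

# Hypothetical sketch - the action names are illustrative assumptions, not taken from the article.
$role = Get-AzRoleDefinition "Reader"      # use an existing definition as a template
$role.Id = $null
$role.Name = "QPI Query Text Reader (custom)"
$role.Description = "Read Query Performance Insight query text with minimal permissions"
$role.Actions.Clear()
$role.Actions.Add("Microsoft.Sql/servers/databases/read")
$role.Actions.Add("Microsoft.Sql/servers/databases/topQueries/queryText/action")
$role.AssignableScopes.Clear()
$role.AssignableScopes.Add("/subscriptions/<subscription-id>")
New-AzRoleDefinition -Role $role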
Technical Walkthrough: Deploying a SQL DB like it's Terraform

Introduction

This post is a union of multiple topics. It is part of the SQL CI/CD series and as such builds upon Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub and Managed SQL Deployments Like Terraform | Microsoft Community Hub, while also crossing over with the YAML Pipeline series. This is an advanced topic with regard to both Azure DevOps YAML and SQL CI/CD. If either of these concepts is new to you, please refer to the links above, as this is not designed to be a beginner's introduction to either domain.

Assumptions

To get the most out of this and follow along, we are going to assume that you are:
- On board with templating your Azure DevOps YAML pipelines. By doing this we gain quicker onboarding of new pipelines, standardized deployment steps, and improved security.
- On board with Managed SQL Deployments Like Terraform | Microsoft Community Hub for deploying your SQL projects. By adopting this we increase data security, confidence in source control, and speed to deployment.

For this post we will continue to use the example cicd-adventureWorks repository as the source of our SQL project and the home of the pipeline definition.

Road mapping the Templates

Just like my other YAML posts, let's outline the pieces required in each stage; we will then break down each job.

Build Stage
- Build .dacpac job
  - run `dotnet build` and pass in the appropriate arguments
  - execute a Deploy Report from the dacpac produced by the build against the target environment
  - copy the Deploy Report to the build output directory
  - publish the pipeline artifact

Deploy Stage
- Deploy .dacpac job
  - run a Deploy Report from the dacpac artifact (optional)
  - deploy the dacpac, including pre/post scripts

Build Stage

For the purposes of this stage, we should think of building our .dacpac much like a Terraform or single-page-application build: we produce an artifact per environment, all generated from the same codebase. Additionally, we run a 'plan', which is the proposed result of deploying our dacpac file.

Build Job

We will have one instance of the build job for each environment. Each instance produces a different artifact because it passes a different build configuration, which in turn results in a different .dacpac per environment. If you are familiar with YAML templating, feel free to jump ahead to the finished job template.

One of the key differences of this job structure, as opposed to the one outlined in Deploying .dacpacs to Multiple Environments via ADO Pipelines, is the need for a Deploy Report. This is the key to unlocking the CI/CD approach which aligns with Terraform: the Deploy Report detects our changes at build time, similar to running a terraform plan. Creating a Deploy Report is achieved by setting the DeploymentAction input of the SqlAzureDacpacDeployment@1 task to 'DeployReport'.

There is one minor "bug" in the Microsoft SqlAzureDacpacDeployment task, which I have raised with the ADO task team: the output paths for the Deploy Report and the Drift Report appear to be hardcoded to the same location. To get around this, I found where the Deploy Report was being published and added a task to copy it to the same location as the .dacpac, then published them both as a single folder.
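For context, the pipeline's DeployReport action corresponds to something you can reproduce locally with SqlPackage against a target database. A hedged sketch (SqlPackage assumed to be on PATH; server, database, credentials, and paths are placeholders):

# Illustrative local equivalent of the pipeline's DeployReport step.
# The XML report lists the operations a publish would perform - our "terraform plan".
SqlPackage /Action:DeployReport `
    /SourceFile:"bin/dev/sqlmoveme.dacpac" `
    /TargetServerName:"sql-adventureworksentra-dev-cus.database.windows.net" `
    /TargetDatabaseName:"sqlmoveme" `
    /TargetUser:"<login>" `
    /TargetPassword:"<password>" `
    /OutputPath:"DeployReport.xml"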
Here is the code for a single environment to build the associated .dacpac and produce the Deploy Report stage:

- stage: adventureworksentra_build
  variables:
    - name: solutionPath
      value: $(Build.SourcesDirectory)//
  jobs:
    - job: build_publish_sql_sqlmoveme_dev_dev
      steps:
        - task: UseDotNet@2
          displayName: Use .NET SDK vlatest
          inputs:
            packageType: 'sdk'
            version: ''
            includePreviewVersions: true
        - task: NuGetAuthenticate@1
          displayName: 'NuGet Authenticate'
        - task: DotNetCoreCLI@2
          displayName: dotnet build
          inputs:
            command: build
            projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
            arguments: --configuration dev /p:NetCoreBuild=true /p:DacVersion=1.0.1
        - task: SqlAzureDacpacDeployment@1
          displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
          inputs:
            DeploymentAction: DeployReport
            azureSubscription: AzureDevServiceConnection
            AuthenticationType: servicePrincipal
            ServerName: sql-adventureworksentra-dev-cus.database.windows.net
            DatabaseName: sqlmoveme
            deployType: DacpacTask
            DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/dev/sqlmoveme.dacpac
            AdditionalArguments: ''
            DeleteFirewallRule: True
        - task: CopyFiles@2
          inputs:
            SourceFolder: GeneratedOutputFiles
            Contents: '**'
            TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev/cus
        - task: PublishPipelineArtifact@1
          displayName: 'Publish Pipeline Artifact sqlmoveme_dev_dev'
          inputs:
            targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev
            artifact: sqlmoveme_dev_dev
            properties: ''

The end result will be similar to the screenshot below (I have two environments). You can see that I have configured this to run a Deploy Report against each regional instance of the SQL DB, hence the `cus` folder; I do this to identify and catch any potential schema and data issues. The Deploy Reports are the key to tying this back to the idea of deploying and managing SQL databases like Terraform. These reports execute when a pull request is created, as part of the build, and again at deployment, to catch any changes that may have occurred between the PR and the deployment. For the purposes of this blog, here is a deployment report indicating a schema change:

This is an important artifact for organizations whose auditing policy requires documentation around deployments. This information is also available in the ADO job logs.

This experience should feel similar to Terraform CI/CD...THAT'S A GOOD THING! It means we are developing and refining practices and principles across our tech stacks when it comes to the SDLC. If this feels new to you, please read Terraform, CI/CD, Azure DevOps, and YAML Templates - John Folberth.

Deploy Stage

We will have a deploy stage for each environment, and within that stage a job for each region and/or database we are deploying our dacpac to. This job can be a template because, in theory, our deployment process is identical across environments. We will run a deployment report and deploy the .dacpac that was built for the specific environment, including any and all associated pre/post scripts. Again, this process has already been walked through in Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub.

Deploy Job

The deploy job takes what we built in the deployment process in Deploying .dacpacs to Multiple Environments via ADO Pipelines | Microsoft Community Hub and adds a prerequisite job that creates a second Deploy Report.
This is done to ensure we are aware of any changes in the deployed SQL Database that may have occurred after the original dacpac and Deploy Report were created at the time of the pull request. By doing this we have a tight log identifying any changes made right before we deploy the code.

Next, we need to override a few of the default arguments of the .dacpac publish command in order to automatically deploy changes that may result in data loss. A complete list of the available properties is at SqlPackage Publish - SQL Server | Microsoft Learn. The ones we are most interested in are DropObjectsNotInSource and BlockOnPossibleDataLoss.

DropObjectsNotInSource is defined as:

Specifies whether objects that do not exist in the database snapshot (.dacpac) file will be dropped from the target database when you publish to a database. This value takes precedence over DropExtendedProperties.

This is important as it will drop and delete objects that are not defined in our source code. As I've written about previously, this will drop all those instances of "shadow data", or copies of tables we were storing. This value is set to false by default as a safeguard against a destructive data action. Our intention, though, is to ensure our deployed database objects match our definitions in source control, so we want to enable it.

BlockOnPossibleDataLoss is defined as:

Specifies that the operation will be terminated during the schema validation step if the resulting schema changes could incur a loss of data, including due to data precision reduction or a data type change that requires a cast operation. The default (True) value causes the operation to terminate regardless if the target database contains data. An execution with a False value for BlockOnPossibleDataLoss can still fail during deployment plan execution if data is present on the target that cannot be converted to the new column type.

This is another safeguard put in place to ensure data isn't lost during type conversions or schema changes such as dropping a column. We want this set to `false` so that our deployment can actually run in an automated fashion. If it is left at the default of `true` and we want to update schemas/columns, we would be forced into the anti-pattern of a manual deployment to accommodate the change. When possible we want to automate our deployments, and in this specific case we have already mitigated unintentional data loss through our implementation of a Deploy Report. Again, we should have confidence in our deployment, and if we have that confidence then we should be able to automate it.
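For reference, here is roughly what the pipeline's Publish step does with these two properties, expressed as a hedged local SqlPackage invocation (SqlPackage assumed to be on PATH; credentials and paths are placeholders):

# Illustrative local equivalent of the pipeline's Publish step.
# DropObjectsNotInSource=True keeps the target in sync with source control;
# BlockOnPossibleDataLoss=False lets schema changes that could lose data proceed,
# relying on the Deploy Report we reviewed beforehand.
SqlPackage /Action:Publish `
    /SourceFile:"bin/dev/sqlmoveme.dacpac" `
    /TargetServerName:"sql-adventureworksentra-dev-cus.database.windows.net" `
    /TargetDatabaseName:"sqlmoveme" `
    /TargetUser:"<login>" `
    /TargetPassword:"<password>" `
    /p:DropObjectsNotInSource=True `
    /p:BlockOnPossibleDataLoss=False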
Here is that same deployment process, now including the Deploy Report steps:

- stage: adventureworksentra_dev_cus_dacpac_deploy
  jobs:
    - deployment: adventureworksentra_app_dev_cus
      environment:
        name: dev
      dependsOn: []
      strategy:
        runOnce:
          deploy:
            steps:
              - task: SqlAzureDacpacDeployment@1
                displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
                inputs:
                  DeploymentAction: DeployReport
                  azureSubscription: AzureDevServiceConnection
                  AuthenticationType: servicePrincipal
                  ServerName: sql-adventureworksentra-dev-cus.database.windows.net
                  DatabaseName: sqlmoveme
                  deployType: DacpacTask
                  DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
                  AdditionalArguments: ''
                  DeleteFirewallRule: False
              - task: CopyFiles@2
                inputs:
                  SourceFolder: GeneratedOutputFiles
                  Contents: '**'
                  TargetFolder: postDeploy/sql-adventureworksentra-dev-cus.database.windows.net/sqlmoveme
              - task: SqlAzureDacpacDeployment@1
                displayName: Publish sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
                inputs:
                  DeploymentAction: Publish
                  azureSubscription: AzureDevServiceConnection
                  AuthenticationType: servicePrincipal
                  ServerName: sql-adventureworksentra-dev-cus.database.windows.net
                  DatabaseName: sqlmoveme
                  deployType: DacpacTask
                  DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
                  AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
                  DeleteFirewallRule: True

Putting it Together

Let's put all of these pieces together. This example shows an expanded pipeline with the following stages and jobs:

- Build stage
  - Build Dev job
  - Build Tst job
- Deploy Dev stage
  - Deploy Dev job
- Deploy Tst stage
  - Deploy Tst job

And here is the code:

resources:
  repositories:
    - repository: templates
      type: github
      name: JFolberth/TheYAMLPipelineOne
      endpoint: JFolberth

trigger:
  branches:
    include:
      - none

pool:
  vmImage: 'windows-latest'

parameters:
  - name: projectNamesConfigurations
    type: object
    default:
      - projectName: 'sqlmoveme'
        environmentName: 'dev'
        regionAbrvs:
          - 'cus'
        projectExtension: '.sqlproj'
        buildArguments: '/p:NetCoreBuild=true /p:DacVersion=1.0.1'
        sqlServerName: 'adventureworksentra'
        sqlDatabaseName: 'moveme'
        resourceGroupName: adventureworksentra
        ipDetectionMethod: 'AutoDetect'
        deployType: 'DacpacTask'
        authenticationType: 'servicePrincipal'
        buildConfiguration: 'dev'
        dacpacAdditionalArguments: '/p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false'
      - projectName: 'sqlmoveme'
        environmentName: 'tst'
        regionAbrvs:
          - 'cus'
        projectExtension: '.sqlproj'
        buildArguments: '/p:NetCoreBuild=true /p:DacVersion=1.0'
        sqlServerName: 'adventureworksentra'
        sqlDatabaseName: 'moveme'
        resourceGroupName: adventureworksentra
        ipDetectionMethod: 'AutoDetect'
        deployType: 'DacpacTask'
        authenticationType: 'servicePrincipal'
        buildConfiguration: 'tst'
        dacpacAdditionalArguments: '/p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false'
  - name: serviceName
    type: string
    default: 'adventureworksentra'

stages:
  - stage: adventureworksentra_build
    variables:
      - name: solutionPath
        value: $(Build.SourcesDirectory)//
    jobs:
      - job: build_publish_sql_sqlmoveme_dev_dev
        steps:
          - task: UseDotNet@2
            displayName: Use .NET SDK vlatest
            inputs:
              packageType: 'sdk'
              version: ''
              includePreviewVersions: true
          - task: NuGetAuthenticate@1
            displayName: 'NuGet Authenticate'
          - task: DotNetCoreCLI@2
            displayName: dotnet build
            inputs:
              command: build
              projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
              arguments: --configuration dev /p:NetCoreBuild=true /p:DacVersion=1.0.1
          - task: SqlAzureDacpacDeployment@1
            displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
            inputs:
              DeploymentAction: DeployReport
              azureSubscription: AzureDevServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-dev-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/dev/sqlmoveme.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: True
          - task: CopyFiles@2
            inputs:
              SourceFolder: GeneratedOutputFiles
              Contents: '**'
              TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev/cus
          - task: PublishPipelineArtifact@1
            displayName: 'Publish Pipeline Artifact sqlmoveme_dev_dev'
            inputs:
              targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/dev
              artifact: sqlmoveme_dev_dev
              properties: ''
      - job: build_publish_sql_sqlmoveme_tst_tst
        steps:
          - task: UseDotNet@2
            displayName: Use .NET SDK vlatest
            inputs:
              packageType: 'sdk'
              version: ''
              includePreviewVersions: true
          - task: NuGetAuthenticate@1
            displayName: 'NuGet Authenticate'
          - task: DotNetCoreCLI@2
            displayName: dotnet build
            inputs:
              command: build
              projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
              arguments: --configuration tst /p:NetCoreBuild=true /p:DacVersion=1.0
          - task: SqlAzureDacpacDeployment@1
            displayName: DeployReport sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
            inputs:
              DeploymentAction: DeployReport
              azureSubscription: AzureTstServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-tst-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\s/src/sqlmoveme/bin/tst/sqlmoveme.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: True
          - task: CopyFiles@2
            inputs:
              SourceFolder: GeneratedOutputFiles
              Contents: '**'
              TargetFolder: $(Build.SourcesDirectory)/src/sqlmoveme/bin/tst/cus
          - task: PublishPipelineArtifact@1
            displayName: 'Publish Pipeline Artifact sqlmoveme_tst_tst'
            inputs:
              targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/tst
              artifact: sqlmoveme_tst_tst
              properties: ''
  - stage: adventureworksentra_dev_cus_dacpac_deploy
    jobs:
      - deployment: adventureworksentra_app_dev_cus
        environment:
          name: dev
        dependsOn: []
        strategy:
          runOnce:
            deploy:
              steps:
                - task: SqlAzureDacpacDeployment@1
                  displayName: DeployReport sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
                  inputs:
                    DeploymentAction: DeployReport
                    azureSubscription: AzureDevServiceConnection
                    AuthenticationType: servicePrincipal
                    ServerName: sql-adventureworksentra-dev-cus.database.windows.net
                    DatabaseName: sqlmoveme
                    deployType: DacpacTask
                    DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
                    AdditionalArguments: ''
                    DeleteFirewallRule: False
                - task: CopyFiles@2
                  inputs:
                    SourceFolder: GeneratedOutputFiles
                    Contents: '**'
                    TargetFolder: postDeploy/sql-adventureworksentra-dev-cus.database.windows.net/sqlmoveme
                - task: SqlAzureDacpacDeployment@1
                  displayName: Publish sqlmoveme on sql-adventureworksentra-dev-cus.database.windows.net
                  inputs:
                    DeploymentAction: Publish
                    azureSubscription: AzureDevServiceConnection
                    AuthenticationType: servicePrincipal
                    ServerName: sql-adventureworksentra-dev-cus.database.windows.net
                    DatabaseName: sqlmoveme
                    deployType: DacpacTask
                    DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_dev_dev\**\*.dacpac
                    AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
                    DeleteFirewallRule: True
  - stage: adventureworksentra_tst_cus_dacpac_deploy
    jobs:
      - deployment: adventureworksentra_app_tst_cus
        environment:
          name: tst
        dependsOn: []
        strategy:
          runOnce:
            deploy:
              steps:
                - task: SqlAzureDacpacDeployment@1
                  displayName: DeployReport sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
                  inputs:
                    DeploymentAction: DeployReport
                    azureSubscription: AzureTstServiceConnection
                    AuthenticationType: servicePrincipal
                    ServerName: sql-adventureworksentra-tst-cus.database.windows.net
                    DatabaseName: sqlmoveme
                    deployType: DacpacTask
                    DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_tst_tst\**\*.dacpac
                    AdditionalArguments: ''
                    DeleteFirewallRule: False
                - task: CopyFiles@2
                  inputs:
                    SourceFolder: GeneratedOutputFiles
                    Contents: '**'
                    TargetFolder: postDeploy/sql-adventureworksentra-tst-cus.database.windows.net/sqlmoveme
                - task: SqlAzureDacpacDeployment@1
                  displayName: Publish sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
                  inputs:
                    DeploymentAction: Publish
                    azureSubscription: AzureTstServiceConnection
                    AuthenticationType: servicePrincipal
                    ServerName: sql-adventureworksentra-tst-cus.database.windows.net
                    DatabaseName: sqlmoveme
                    deployType: DacpacTask
                    DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_tst_tst\**\*.dacpac
                    AdditionalArguments: /p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false
                    DeleteFirewallRule: True

In ADO it will look like this: we can see the important Deploy Reports being created and can confirm that there is a Deploy Report for each environment/region combination.

Conclusion

With the inclusion of Deploy Reports we now have the ability to create Azure SQL deployments that adhere to modern DevOps approaches. We can ensure our environments stay in sync with how we have defined them in source control. By doing this we achieve a higher level of security, more confidence in our code, and a reduction in shadow data. To learn more about these approaches to SQL deployments, check out my other blog articles in the "SQL Database Series" in the "Healthcare and Life Sciences Blog" | Microsoft Community Hub, and be sure to follow me on LinkedIn.
Force world wide queries to single time zone

Hello, I hope somebody would be able to help me. We are using an Azure SQL database. When querying datetime fields, the result is adjusted for the time zone that you are in. As an example, I have a column called CreationDate in table Bronze. A record is inserted in California with a system datetime of 1 July 08h00 (Pacific time). When somebody in Ireland queries the data, the value in CreationDate reflects 1 July 15h00 (Irish time). We know that we can force the time zone at query time (see the query below), but that means that all reports need to include this logic. We would prefer an Azure SQL configuration setting that always returns the same value, regardless of time zone. Could anybody tell me how we do this? Thank you - Jaunine

SELECT [CreationDate]
      ,CONVERT(DATETIME2(0), [CreationDate], 126) AT TIME ZONE 'GMT Standard Time' AT TIME ZONE 'Pacific Standard Time' AS pst_date
FROM Bronze
WHERE TRIM([item]) = '1524200'
SQL Table refresh

Hello Community, I need your help on this (I am not able to attach files, so I am pasting a sample of the table). I have a table with the following structure:

Run Date  | Region  | Manager  | supplier | part  | spend
10-Mar-23 | Region3 | Manager3 | Suppr2   | Part3 | 18891
10-Mar-23 | Region1 | Manager1 | Suppr2   | Part3 | 10824
10-Mar-23 | Region3 | Manager2 | Suppr3   | Part2 | 14979
10-Mar-23 | Region3 | Manager2 | Suppr1   | Part3 | 15868
10-Mar-23 | Region1 | Manager2 | Suppr3   | Part3 | 15111
10-Mar-23 | Region1 | Manager2 | Suppr1   | Part2 | 19506

My organization does not grant create-table access on the server, so the workaround is to dump the above table into an Access DB, which I will use to create reports in Power BI. The ask here is that the owner of this table will refresh the spend values for the existing dataset without altering the table structure, and will append data only when a new region, a new manager, or a new supplier has sent data, before each fiscal month's run. The April run-month table now looks like this (the Region 2 and Region 4 rows are the new data received):

Run Date  | Region   | Manager  | supplier | part  | spend
10-Apr-23 | Region3  | Manager3 | Suppr2   | Part3 | 77036
10-Apr-23 | Region1  | Manager1 | Suppr2   | Part3 | 366771
10-Apr-23 | Region3  | Manager2 | Suppr3   | Part2 | 100775
10-Apr-23 | Region3  | Manager2 | Suppr1   | Part3 | 434291
10-Apr-23 | Region1  | Manager2 | Suppr3   | Part3 | 169688
10-Apr-23 | Region1  | Manager2 | Suppr1   | Part2 | 75593
10-Apr-23 | Region 2 | Manager4 | Suppr1   | Part2 | 340684
10-Apr-23 | Region 2 | Manager4 | Suppr4   | Part4 | 253959
10-Apr-23 | Region 2 | Manager1 | Suppr4   | part1 | 341101
10-Apr-23 | Region 4 | Manager2 | Suppr2   | Part5 | 28213
10-Apr-23 | Region 4 | Manager3 | Suppr3   | part1 | 42718

What I need is some way to keep the old data table in Access and have a new column/table added per run date, so that I can subtract the previous run's values from the current run and check which region > manager > supplier > part has shown a high variance.