Azure SQL Database
19 Topics

Deploying a Dacpac using SQLPackage
7K Views · 0 Likes · 1 Comment

Hi, I'm trying to use SQLPackage to deploy a Dacpac using AccessToken authentication. I'm using the following command:

.\sqlpackage.exe /Action:Publish /SourceFile:Database.dacpac /TargetServerName:xxxx.database.windows.net /TargetDatabaseName:DatabaseName /AccessToken:xxxx

The error message I'm getting is:

Unable to connect to target server 'xxxx.database.windows.net'. Please verify the connection information such as the server name, login credentials, and firewall rules for the target server. Login failed for user '<token-identified principal>'.

Any ideas what's wrong here?
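One common cause of "Login failed for user '<token-identified principal>'" is that the identity the access token was issued for has no user in the target database (or lacks the permissions a publish needs). A minimal T-SQL sketch of granting it access, assuming the token belongs to a service principal with the hypothetical name my-deployment-sp:

    -- Minimal sketch: run in the *target* database while connected as a
    -- Microsoft Entra admin of the logical server.
    -- 'my-deployment-sp' is a hypothetical name for the app/service principal
    -- that the /AccessToken was issued to.
    CREATE USER [my-deployment-sp] FROM EXTERNAL PROVIDER;

    -- Grant enough rights for a dacpac publish (db_owner is the simplest choice).
    ALTER ROLE db_owner ADD MEMBER [my-deployment-sp];

If the principal already has a user, the other usual suspects are the token audience/expiry and the server firewall rules.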
DATE_FORMAT is not supported for EXTERNAL FILE FORMAT.
Solved · 3.2K Views · 0 Likes · 1 Comment

I have been trying to create an external table from a CSV data set that does not use the default date format that Synapse understands. According to the documentation (https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-file-format-transact-sql?view=azure-sqldw-latest&tabs=delimited#examples) I should be able to specify a user-defined format string, as shown in the picture below. But when I do, I get the error "DATE_FORMAT is not supported for EXTERNAL FILE FORMAT." and I'm not sure what I'm missing in the documentation. I'm currently running in the serverless SQL pool, so could it be that this is only supported for dedicated pools? Thank you for your time; any help or insights would be greatly appreciated.
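DATE_FORMAT in CREATE EXTERNAL FILE FORMAT does appear to be a dedicated SQL pool option. In a serverless SQL pool, a common workaround is to read the date column as varchar with OPENROWSET and convert it explicitly. A minimal sketch, assuming a hypothetical storage URL, header column names, and a dd/MM/yyyy date column:

    -- Minimal sketch for a serverless SQL pool; storage path and column
    -- names are placeholders that must match the CSV header.
    SELECT
        TRY_CONVERT(date, r.OrderDateRaw, 103) AS OrderDate,  -- style 103 = dd/mm/yyyy
        r.Amount
    FROM OPENROWSET(
        BULK 'https://<storageaccount>.dfs.core.windows.net/<container>/data/*.csv',
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0',
        HEADER_ROW = TRUE
    ) WITH (
        OrderDateRaw varchar(20),
        Amount       decimal(18, 2)
    ) AS r;

The same pattern can sit behind a view if downstream queries expect a typed date column.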
Is general purpose serverless tier of Azure SQL DB good choice for data warehouse database?
3.1K Views · 0 Likes · 4 Comments

Hi, is the General Purpose serverless tier of Azure SQL Database a good choice for a data warehouse database? The ETL is implemented with Data Factory mapping data flows. We are frequently and randomly getting connection timeout (read failed) errors, and the available IO appears to be quite limited.
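One way to confirm whether IO (rather than CPU or memory) is the bottleneck during the data flow runs is to look at the database's recent resource consumption. A minimal sketch using the built-in DMV:

    -- Minimal sketch: recent resource consumption (roughly the last hour,
    -- 15-second granularity). Values near 100% for data IO or log write
    -- suggest the service objective is the limiting factor.
    SELECT TOP (20)
        end_time,
        avg_cpu_percent,
        avg_data_io_percent,
        avg_log_write_percent,
        avg_memory_usage_percent
    FROM sys.dm_db_resource_stats
    ORDER BY end_time DESC;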
Copy schema (with data) from one Azure SQL database to another Azure SQL database
2.7K Views · 0 Likes · 5 Comments

Hello, we need to copy three schemas (including tables, indexes, keys, and data) to another Azure SQL database. We decided to switch from two databases to one more powerful database. There is a lot of data; the smaller database holds about 1 TB. I tried doing it with SSMS or BCP on a client VM, but it does not work because of the amount of data and the transfer speed. I am looking for something I can do in the Azure portal. Both Azure databases are on the same server.

Current situation: source database A_DB with schemas c11, c12, c13; about 1 TB of data in 300 tables. Target database B_DB with schemas c21, c22, c23, c24, c25; Hyperscale, 3 TB of data.

Desired result: after copying the schemas, B_DB contains the schemas (with their data) c11, c12, c13, c21, c22, c23, c24, and A_DB can be deleted.

Please advise which Azure service or tool I can use for this. It's a simple task in Oracle, but it looks more complicated in the Azure SQL environment. So far I have found how to export/import or backup/restore an entire database, but nothing about how to import a schema with a lot of data in Azure. Thank you
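Because Azure SQL Database has no cross-database INSERT ... SELECT, one approach that stays inside the service is elastic query: define external tables in B_DB that point at A_DB, then copy the rows into local tables and recreate indexes and keys afterwards. A minimal sketch, assuming elastic query is available on your tier and using placeholder server, credential, table, and column names:

    -- Minimal sketch: run in the target database B_DB. All names and
    -- secrets below are placeholders.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword!1>';

    CREATE DATABASE SCOPED CREDENTIAL SourceCred
        WITH IDENTITY = '<sql_login_on_A_DB>', SECRET = '<password>';

    CREATE EXTERNAL DATA SOURCE SourceA
        WITH (TYPE = RDBMS,
              LOCATION = '<server>.database.windows.net',
              DATABASE_NAME = 'A_DB',
              CREDENTIAL = SourceCred);
    GO

    CREATE SCHEMA c11;
    GO

    -- One external table per source table; the column list must match A_DB.
    CREATE EXTERNAL TABLE c11.Orders_src (
        OrderId   int,
        OrderDate date,
        Amount    decimal(18, 2)
    ) WITH (DATA_SOURCE = SourceA, SCHEMA_NAME = 'c11', OBJECT_NAME = 'Orders');

    -- Copy the data locally; indexes and keys are then created on the copy.
    SELECT * INTO c11.Orders FROM c11.Orders_src;

For 1 TB and 300 tables, a Data Factory copy activity or a bacpac export/import of the source schemas may be faster in practice; the sketch above is just the pure-T-SQL route.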
Azure SQL Hyperscale Best Practices
Solved · 2K Views · 0 Likes · 1 Comment

I am hoping you can help answer this question. Previously, whenever we scaled a SQL database up or down, we always followed best practices, which included doing the following three things:

- Rebuild all indexes
- Update statistics
- Clear the procedure cache

We have now moved to Hyperscale and the databases will scale up and down as needed. Do we need to continue with these practices, or is this taken care of automatically in Azure? Thank you.
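For reference, these three maintenance steps in T-SQL, roughly as they would be run after a scale operation; the table name is a placeholder, and whether they are still needed on Hyperscale is exactly the question being asked:

    -- Minimal sketch of the three maintenance steps, placeholder table name.
    -- Rebuild all indexes on one table (repeat per table, or drive the loop
    -- from sys.tables).
    ALTER INDEX ALL ON dbo.MyTable REBUILD;

    -- Update statistics database-wide.
    EXEC sp_updatestats;

    -- Clear the plan cache for this database only.
    ALTER DATABASE SCOPED CONFIGURATION CLEAR PROCEDURE_CACHE;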
Parallelized Stored Procedure is executed synchronously
1.6K Views · 0 Likes · 0 Comments

Hi, we use Azure Data Factory with a ForEach loop to run calculations with a stored procedure. This used to work very well, but for no apparent reason the running time went up from a few minutes to several hours, and sometimes we get error messages like this:

Execution fail against sql server. Please contact SQL Server team if you need further support. Sql error number: 10928. Error Message: Resource ID : 1. The request limit for the database is 400 and has been reached. See 'https://docs.microsoft.com/azure/azure-sql/database/resource-limits-logical-server' for assistance.

After some investigation on the SQL server we noticed that the 50 queries that should run in parallel appear to run sequentially in Azure SQL: there is always only one query running, and all the others are suspended. What could be the reason for this error and behavior?

Thanks and regards, Holger
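A quick way to see what the suspended requests are actually waiting on (for example, blocking on the same object, or waiting for resources at the current service objective) is to query the DMVs while the ForEach loop is running. A minimal sketch:

    -- Minimal sketch: inspect running vs. suspended requests and their waits
    -- while the ADF ForEach loop is executing.
    SELECT
        r.session_id,
        r.status,
        r.command,
        r.wait_type,
        r.wait_time,
        r.blocking_session_id,
        t.text AS sql_text
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.session_id <> @@SPID
    ORDER BY r.status, r.wait_time DESC;

If blocking_session_id points at one session for all the suspended requests, the stored procedure calls are serializing on a shared resource rather than running in parallel.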
Create login from Entra ID Security Group rather than individual
Solved · 1.6K Views · 2 Likes · 2 Comments

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-azure-ad-logins-tutorial?view=azuresql says I can create a Login in Azure SQL Server from a Microsoft Entra ID security group. I can, and it works, and it appears in sys.server_principals with type_desc 'EXTERNAL_GROUP' and type 'X'. (I note that non-group Entra ID logins appear as type_desc 'EXTERNAL_LOGIN' and type 'E'.) But when I try the next step in the article, which is to create a User from the Login, I get the error '<EntraIDGroupName> is not a valid login or you do not have permission'. I have successfully created Users from non-group Logins, so I don't think it's a permission problem. Is it the case that, despite the article, you can't actually create a group user this way, and I have to create individual logins and users for each Entra ID user I want to access the database? Or am I missing a trick somewhere?
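For reference, the two steps from the tutorial in T-SQL, with a hypothetical group name; an alternative that often sidesteps the login-to-user mapping issue is a contained database user created directly from the group:

    -- Minimal sketch with a hypothetical group name 'SqlReadersGroup'.

    -- In master: server-level login for the Entra ID security group.
    CREATE LOGIN [SqlReadersGroup] FROM EXTERNAL PROVIDER;

    -- In the user database: user mapped to that login.
    CREATE USER [SqlReadersGroup] FROM LOGIN [SqlReadersGroup];

    -- Alternative: skip the server login entirely and create a contained
    -- database user for the group instead.
    -- CREATE USER [SqlReadersGroup] FROM EXTERNAL PROVIDER;

    -- Either way, grant the group a role; members inherit the access.
    ALTER ROLE db_datareader ADD MEMBER [SqlReadersGroup];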
Azure SQL Web Portal .dacpac Functionality
1.1K Views · 0 Likes · 1 Comment

So far I've had success creating SQL Server databases from a .bacpac through the Azure Portal web interface. What I need now is a way to update those databases' structures using a .dacpac through the Azure Portal web interface as well. I understand this might be possible using SSMS, but I find the over-the-network response time to be less than optimal. Is there such functionality in the Azure Portal web interface (not to create a new database from a .bacpac file, but to update its structure from a .dacpac file)? If not, is such a feature planned for the near future?
Zone redundancy
1.1K Views · 0 Likes · 0 Comments

Now generally available in select regions: new and existing Azure SQL databases and elastic pools in the General Purpose tier can enable the zone-redundant configuration. This configuration is offered for both serverless and provisioned compute. https://techcommunity.microsoft.com/t5/azure-sql-blog/zone-redundancy-for-azure-sql-database-general-purpose-tier/ba-p/3280376

Are you planning to move to this configuration in the future? What scenarios will it help you with?
New! Professional Azure SQL Database Administration Packt E-book
670 Views · 0 Likes · 0 Comments

Hi folks! Check out our recently launched free e-book on Azure SQL Database from Packt: https://azure.microsoft.com/en-us/resources/professional-azure-sql-database-administration/

This comprehensive guide provides an in-depth look at Azure SQL Database. It is written for database administrators, architects, big data engineers, and anyone else looking to migrate or modernize their on-premises data estate and who wants flexibility, efficiency, and high performance. Currently available in EN-US only.

In this e-book, you'll get detailed, easy-to-follow guidance and activity plans on how to:

- Migrate to Azure SQL Database or provision a new SQL database.
- Backup and restore Azure SQL databases.
- Implement high availability and disaster recovery.
- Monitor and optimize the performance of your cloud database.
- Automate common management tasks using PowerShell.

Here's the link to your free copy: https://azure.microsoft.com/en-us/resources/professional-azure-sql-database-administration/