Dipping My Toes Into SQL Azure
First published on MSDN on May 27, 2011

My high school English teacher once told me that I should always have a spell checker on my computer.  This might help explain why I never make it past the first couple of chapters in any technical publication before I have to 'try it out.'

Over the last month I have spent several of my evenings learning what SQL Azure really is and does.  I have read quite a few articles and publications, but as usual I have a need to 'try it out!'

Keith and I have worked on the RML Utility set for more than a decade.  As it is a very established codebase, I decided to 'port' it to use SQL Azure as the database storage for the ReadTrace analysis.  I now have the test suite running against SQL Azure, so let me share some of my learnings with you.

Easy
I was expecting a bunch of hurdles from such an established codebase, but overall the experience was REALLY easy.  There are some changes I had to make to the applications, but they turned out to be minor.  With all the documentation I read I was envisioning a significant amount of work, but the SQL Server development team has done a nice job of making the vast majority of capabilities just work.  The RML applications don't use full text or some of the other facilities that SQL Azure has not provided yet, which helped my effort.

Firewall Client
The first thing I had to get used to was the use of a firewall client.  Because the traditional, on-premises application does not cross security boundaries, I generally don't have to pay much attention to the firewall client.  When using SQL Azure I had to exit the Microsoft CorpNet and traverse the internet to connect to my SQL Azure database(s).  The SQL Azure database(s) are contained in entirely different domains and data centers from my normal corporate network.  It would take an act of Congress, if you will, for me to get direct access to the systems.  These are highly secured environments, protecting SQL Azure users' data.

I installed the Forefront TMG Client, enabled automatic detection, and was able to connect to my SQL Azure database(s).

OStress
The first thing I did was fire up OStress to see if I could connect to SQL Azure and run commands.  WOW - it worked the first time!  OStress is an ODBC-based application, like OSQL.exe, and I didn't have to make any changes.  The parts of the RML test suite that use OStress worked as-is to create and drop the database.
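
For example, a connectivity check along these lines worked unchanged (the server, login, and password here are placeholders, not my real values):

ostress -Smyserver.database.windows.net -Umylogin@myserver -P<<password>> -dmaster -Q"SELECT @@VERSION"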

ReadTrace
The next thing I did was try ReadTrace as-is, and it indeed failed.  ReadTrace is an OLE DB-based application; SQLOLEDB was not the problem, it was some of the T-SQL that ReadTrace was using.

Create Database
One of the first things ReadTrace does is check to see if the analysis database exists.  If not, it issues a CREATE DATABASE; if it does, it drops the objects and recreates them.  CREATE DATABASE must be the first and only command in a batch for SQL Azure, but ReadTrace was using a series of commands in the batch.  All I had to do was break the CREATE DATABASE into a separate batch.
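
A minimal sketch of the change (the database and table names are just examples):

-- SQL Azure: CREATE DATABASE must be the only statement in its batch
CREATE DATABASE PerfAnalysis
GO
-- Object creation continues in separate, subsequent batches
CREATE TABLE dbo.tblSample (RowID bigint NOT NULL)
GO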

Use Database
In order to insert data into the database or drop and create the objects, ReadTrace would change database context using the USE statement.  USE is not supported in SQL Azure, so I changed the connections to specify the default database instead.
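
In other words, instead of connecting and then issuing USE, the connection itself names the database.  A hypothetical ODBC connection string (server, login, and database names are placeholders):

Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;Database=PerfAnalysis;Uid=mylogin@myserver;Pwd=<<password>>;Encrypt=yes;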

Clustered Indexes
ReadTrace builds the objects and uses the SQLOLEDB Bulk Insert interfaces to load data into the tables.  Bulk Insert is supported by SQL Azure, but the table has to have a clustered index before any data can be inserted.  Each SQL Azure database is replicated to 2 additional locations so that in the event of a failure a failover can occur.  The data replication component of SQL Azure uses the clustered index to assist in the replication activity, so it must be present before any data can be inserted into the table.  ReadTrace did not add indexes to the database until after the data was inserted, so I moved the creation of indexes from after the load to before the load to resolve this issue.
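
The pattern looks like this (the table and index names are just examples):

-- Create the table and its clustered index BEFORE the bulk insert
CREATE TABLE dbo.tblStatements (RowID bigint NOT NULL, EventText nvarchar(max))
CREATE CLUSTERED INDEX cixStatements ON dbo.tblStatements (RowID)
-- ... bulk insert the trace data here ...
-- Nonclustered indexes can still wait until after the load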

Select Context_Info
When connecting to SQL Azure, the context information for the session is automatically set to a GUID, a tracking activity id, for the life of the connection.  This can be used by Microsoft support if you encounter problems.  I added logic to log the activity id when ReadTrace connects to a SQL Azure database:

select CONVERT(NVARCHAR(36), CONTEXT_INFO())

Select Into
There were a couple of places in ReadTrace and Reporter that used select into #TMP, and they failed.  I didn't expect this on a temporary table.  I can understand why select into a table in my database is a problem: the clustered index is not present and it would cause replication problems.  However, the information in TEMPDB is not replicated among the SQL Azure replicas, so I was not expecting this failure.  I altered the logic to use create table #TMP and insert into instead of select into.  I also filed a work item with the development team to relax this restriction for temporary tables.
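
The rewrite is mechanical (the column and table names are just examples):

-- Fails on SQL Azure:
--   SELECT EventID, COUNT(*) AS Cnt INTO #TMP FROM dbo.tblStatements GROUP BY EventID

-- Works: create the temporary table explicitly, then insert into it
CREATE TABLE #TMP (EventID int NOT NULL, Cnt int NOT NULL)
INSERT INTO #TMP (EventID, Cnt)
   SELECT EventID, COUNT(*) FROM dbo.tblStatements GROUP BY EventID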

Idle Connections
ReadTrace establishes up to 4 utility connections.  These are used to create the objects in the database and do maintenance activities during the load.  I found that from time to time these connections were dropped when doing a large test run.  After some investigation I found that if a connection is idle for some period of time, various firewall and gateway settings (from my ISP and others) impact the connection lifetime.  I had to add logic to detect when a connection was 'dead' and reconnect, because during the large test runs the Bulk Insert connections were active but my utility connections remained idle.

Trace Events
ReadTrace and Reporter store only event ids and use the trace event DMVs to provide event names.  Since tracing is not currently supported against SQL Azure, the DMVs are not present.  I loaded a table with the event details so I had the same information in the analysis database.
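
Something along these lines (the table name is an example; the ids and names can be harvested from sys.trace_events on an on-premises server):

CREATE TABLE dbo.tblTraceEvents (EventID int NOT NULL, EventName nvarchar(128) NOT NULL)
CREATE CLUSTERED INDEX cixTraceEvents ON dbo.tblTraceEvents (EventID)
INSERT INTO dbo.tblTraceEvents VALUES (10, N'RPC:Completed')
INSERT INTO dbo.tblTraceEvents VALUES (12, N'SQL:BatchCompleted')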

Reporter
Reporter is a Managed SQL Client application, and it ran against the SQL Azure databases without changes except for two locations using select into.

Connectivity
Because I am no longer using a local environment, I am no longer hitting the same router or two on the way to my SQL Server.  The data is still safe because of the encrypted communications; it just might traverse locations you might not expect.  The vast majority of the time I don't have any connection problems to my SQL Azure database(s).  However, once in a while I would look at my route and it would be going all over the internet before it got to the SQL Azure servers.  This led to a few connection time-outs.  To correct this I updated the connection logic in the utilities to retry the connection (unless the failure was security related), and that resolved the intermittent connection problems.  I could have extended the connection timeout as well.

BCP
The test suite for RML uses BCP.exe to export the loaded data and do a diff comparison with the expected results.  The test suite was using the QUERYOUT option, but remember SQL Azure does not allow a use database or cross-database query.  I had to update to the SQL Server 2008 R2 based BCP.exe with the new -d<<databasename>> command line parameter.  This allows the database to be used as the default during connection establishment and lets BCP run correctly.  As I mentioned, every so often a connection attempt would fail.  BCP.exe does not support a connection timeout parameter, so I added BCP.exe retries to the batch file that drives the copy out activity as well.
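
A hypothetical invocation (the server, login, table, and file names are placeholders):

bcp "SELECT RowID, EventText FROM dbo.tblStatements" queryout statements.out -c -S myserver.database.windows.net -U mylogin@myserver -P <<password>> -d PerfAnalysis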

Recap
I have a private tree for RML that works against SQL Azure, and it was pretty easy to get working.  Our utilities use the ODBC, OLE DB, and Managed SQL Client API sets, and they all worked seamlessly as well.  Our test suites already supported both Windows and SQL Authentication, so I didn't have to make any changes for SQL Azure; I just ran the suites with the SQL Authentication needed to connect to my SQL Azure databases.

Now that I have the test suite running against SQL Azure, my next goal is to look at performance tuning.  The test suite performance is quite good right now, but I would like to perform a more in-depth study.  Do I need to shard the database, and if so, what does that look like?

I hope your first experience with SQL Azure is as easy as mine!

Bob Dorr - Principal SQL Server Escalation Engineer
