First published on MSDN on Aug 09, 2016
Reviewed by Panagiotis Antonopoulos, Jakub Szymaszek, Raghav Kaushik
Always Encrypted is one of the compelling features in SQL Server 2016 and in Azure SQL DB. It provides a unique guarantee: data in the database cannot be viewed, accidentally or intentionally, by users who do not have the 'master key' required to decrypt that data. If you want to know more about this feature, please review the product documentation or watch the Channel 9 video on this topic.
In a recent case, we were working with a customer who was trying to use Table-Valued Parameters (TVPs) to do a 'batch import' of data into the database. The TVP was a parameter to a stored procedure, and the stored procedure was in turn joining the values from the TVP 'table' with some other tables and then performing the final insert into a table which had some columns encrypted with Always Encrypted.
Now, most of the 'magic' behind Always Encrypted actually lives in the client library being used. Unfortunately, none of the client libraries (.NET, JDBC or ODBC) support encrypted columns passed within TVPs. So we needed a viable workaround to unblock the customer. In this blog post, we explain that workaround using a simple example.
Walkthrough: Working with Bulk data in Always Encrypted
We first create a Column Master Key (CMK) and a Column Encryption Key (CEK). For the CMK, we used a certificate from the Current User store for simplicity. For more information on key management in Always Encrypted, please refer to the product documentation.
Create the CMK
Here’s how we created the CMK. You can either use the GUI:
Or you can use T-SQL syntax:
CREATE COLUMN MASTER KEY [TestCMK]
WITH
(
    KEY_STORE_PROVIDER_NAME = N'MSSQL_CERTIFICATE_STORE',
    KEY_PATH = N'CurrentUser/My/C17D4826FA1B6B68808951BF81734283388937EF'
);
Create the CEK
Here’s how we created the CEK using the GUI:
Alternatively you can use T-SQL to do this:
CREATE COLUMN ENCRYPTION KEY [TestCEK]
WITH VALUES
(
    COLUMN_MASTER_KEY = [TestCMK],
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x016E000001630075007200720065006E00740075007300650072002F006D0079002F0063003100370064003...5E48531480
);
Create the table
Then we create the final table, with the encrypted column defined. Note that in the real application, this table already exists, with data in it. We’ve obviously simplified the scenario here!
CREATE TABLE FinalTable
(
    idCol INT,
    somePII VARCHAR(500) ENCRYPTED WITH
    (
        COLUMN_ENCRYPTION_KEY = TestCEK,
        ENCRYPTION_TYPE = RANDOMIZED,
        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
    )
);
Reworking the application to use SqlBulkCopy instead of TVPs
With this setup on the database side of things, we proceed to develop our client application to work around the TVP limitation. The key to doing this is to use the SqlBulkCopy class in .NET Framework 4.6 or above. This class 'understands' Always Encrypted and should require minimal rework on the developer front, because it accepts a DataTable as input, which is exactly what the TVP was previously passed as. This is an important point, because it helps minimize the changes to the application.
Let’s get this working! The high-level steps are outlined below; there is a full code listing at the end of this blog post as well.
Populate the DataTable as before with the bulk data
As mentioned before, the creation and population of the DataTable does not change. In the sample below, this is done in the MakeTable() method.
Using client side ad-hoc SQL, create a staging table on the server side.
This could also be done using T-SQL inside a stored procedure, but we had to uniquely name the staging table per session, so we chose to create the table from ad-hoc T-SQL in the application. We did this using a SELECT … INTO with a dummy WHERE clause (in the code listing, see the condition '1 = 2', which lets us efficiently clone the table definition without having to hard-code it), so that the column encryption settings are retained on the staging table as well. In the sample below, this step is done in the first part of the DoBulkInsert method.
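As a sketch, the ad-hoc statement the application sends looks like the following (the staging table name shown is a hypothetical placeholder; the application generates a unique name per session):

```sql
-- Clones the schema of FinalTable, including the column encryption metadata,
-- without copying any rows: the predicate 1 = 2 is always false
SELECT * INTO [StagingTable_<unique-suffix>] FROM FinalTable WHERE 1 = 2;
```

Because SELECT … INTO copies the column definitions from FinalTable, the somePII column in the staging table carries the same ENCRYPTED WITH settings as the original.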
Use the SqlBulkCopy API to ‘bulk insert’ into staging table
This is the core of the process. The important thing to note here is that the connection string (at the top of the class in the code listing) has the Column Encryption Setting attribute set to Enabled. When this attribute is set to Enabled, the SqlBulkCopy class interrogates the destination table and determines that a set of columns (in our sample case, just one column) needs to be encrypted before being sent to the server. This step is in the second part of the DoBulkInsert method.
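In sketch form, assuming an open SqlConnection conn created with that connection string, a stagingTableName string, and the populated DataTable stagingData, the bulk copy step reduces to just a few lines:

```csharp
// The driver reads the destination table's metadata and transparently
// encrypts the values bound for encrypted columns before they leave the client
using (var bulkCopy = new SqlBulkCopy(conn))
{
    bulkCopy.DestinationTableName = "[" + stagingTableName + "]";
    bulkCopy.WriteToServer(stagingData);
}
```

No per-column encryption code is needed in the application; the encryption happens inside the driver during WriteToServer.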
Move data from staging table into final table
In the sample application, this is done by using an ad-hoc T-SQL statement to simply append the new data from the staging table into the final table. In the real application, this would typically be done through some T-SQL logic within a stored procedure or similar.
There is an important limitation here: encrypted column data cannot be transformed on the server side. This means that no expressions (columns being concatenated, calculated or transformed in any other way) are permitted on the encrypted columns server-side. This limitation applies regardless of whether you use TVPs, but it might become even more important in the case where TVPs are used.
In our sample application we just inserted the data from the staging table into the final table, and then dropped the staging table. This code is in the InsertStagingDataIntoMainTable method in the listing below.
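Sketched as T-SQL, this final step amounts to the following (the staging table name is a hypothetical placeholder):

```sql
BEGIN TRAN;
-- The ciphertext is moved verbatim; no transformation of the
-- encrypted column is attempted on the server side
INSERT FinalTable SELECT * FROM [StagingTable_<unique-suffix>];
DROP TABLE [StagingTable_<unique-suffix>];
COMMIT;
```

This works because the staging table was cloned from FinalTable, so the encrypted columns on both sides share the same key and encryption settings.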
While Always Encrypted offers a compelling way to protect sensitive data on the database side, it does impose some restrictions on the application. In this blog post we showed how you can work around the restriction on TVPs and bulk data. We hope this helps you move forward with adopting Always Encrypted! Please leave your comments and questions below; we are eager to hear from you!
Appendix: Client Application Code
Here is the client application code used.
using System;
using System.Data;
using System.Data.SqlClient;

class Program
{
    static private string TVPAEConnectionString = "Server=.;Initial Catalog=TVPAE;Integrated Security=true;Column Encryption Setting=enabled;";

    static void Main(string[] args)
    {
        var stgTableName = DoBulkInsert(MakeTable());
        InsertStagingDataIntoMainTable(stgTableName);
    }

    private static DataTable MakeTable()
    {
        DataTable newData = new DataTable();

        // create columns in the DataTable
        var idCol = new DataColumn() { DataType = System.Type.GetType("System.Int32"), ColumnName = "idCol", AutoIncrement = true };
        newData.Columns.Add(idCol);
        var somePII = new DataColumn() { DataType = System.Type.GetType("System.String"), ColumnName = "somePII" };
        newData.Columns.Add(somePII);

        // create and add some test data
        for (var loopCount = 0; loopCount < 10000; loopCount++)
        {
            var datarowSample = newData.NewRow();
            datarowSample["somePII"] = DateTime.Now.ToLongDateString();
            newData.Rows.Add(datarowSample);
        }
        return newData;
    }

    private static void InsertStagingDataIntoMainTable(string stgTableName)
    {
        using (var conn = new SqlConnection(TVPAEConnectionString))
        using (var cmd = new SqlCommand("BEGIN TRAN; INSERT FinalTable SELECT * FROM [" + stgTableName + "]; DROP TABLE [" + stgTableName + "]; COMMIT", conn))
        {
            conn.Open();
            Console.WriteLine("Inserted rowcount: " + cmd.ExecuteNonQuery().ToString());
        }
    }

    private static string DoBulkInsert(DataTable stagingData)
    {
        string stagingTableName = "StagingTable_" + Guid.NewGuid().ToString();
        using (var conn = new SqlConnection(TVPAEConnectionString))
        {
            conn.Open();

            // create the staging table - note the use of the dummy WHERE 1 = 2 predicate
            using (var cmd = new SqlCommand("SELECT * INTO [" + stagingTableName + "] FROM FinalTable WHERE 1 = 2;", conn))
            {
                cmd.ExecuteNonQuery();
            }

            // bulk copy the plaintext DataTable; the driver encrypts the somePII values client-side
            using (var bulkCopy = new SqlBulkCopy(conn))
            {
                bulkCopy.DestinationTableName = "[" + stagingTableName + "]";
                bulkCopy.WriteToServer(stagingData);
            }
        }
        return stagingTableName;
    }
}