Azure Feature Pack 1.15.0 released for SQL Server 2019
Published Oct 23 2019 11:02 PM
Microsoft

Dear SSIS Users,

 

Starting with version 1.15.0, the Azure Feature Pack supports SQL Server 2019. This release also includes the following new features.

  1. Delete operation in Flexible File Task.
    • Supports deleting folders and files from the local file system and Azure Data Lake Storage Gen2
    • Supports deleting files from Azure Blob Storage
  2. External and output data type conversion in Flexible File Source. You can convert data types by changing the DataType property of the output columns on the Input and Output Properties tab of the Advanced Editor for Flexible File Source dialog, as shown below (a programmatic sketch follows the list).
    [Figure: Advanced Editor for Flexible File Source, Input and Output Properties tab, showing the DataType property of an output column]
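For reference, a minimal sketch of the same conversion done through the SSIS pipeline object model; the Data Flow Task name, the component lookup, and the column name are hypothetical placeholders, and the Advanced Editor dialog above is the usual way to do this:

// Minimal sketch (C#): the programmatic equivalent of changing an output
// column's DataType in the Advanced Editor. "Data Flow Task" and "Amount"
// are placeholder names; adjust them to your own package.
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;   // MainPipe, IDTSComponentMetaData100
using Microsoft.SqlServer.Dts.Runtime;            // Package, TaskHost
using Microsoft.SqlServer.Dts.Runtime.Wrapper;    // DataType

class OutputTypeSketch
{
    static void ConvertOutputColumn(Package package)
    {
        TaskHost host = (TaskHost)package.Executables["Data Flow Task"];
        MainPipe pipeline = (MainPipe)host.InnerObject;
        IDTSComponentMetaData100 source =
            pipeline.ComponentMetaDataCollection["Flexible File Source"];
        IDTSOutput100 output = source.OutputCollection[0];
        IDTSOutputColumn100 column = output.OutputColumnCollection["Amount"];

        // Same effect as editing DataType on the Input and Output Properties tab:
        CManagedComponentWrapper design = source.Instantiate();
        design.SetOutputColumnDataTypeProperties(
            output.ID, column.ID, DataType.DT_NUMERIC, 0, 38, 18, 0);
    }
}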

You can download this new version of Azure Feature Pack from the following links:

7 Comments
Copper Contributor

Hi SSIS team

 

I found a little bug.

 

When you install the Feature Pack for SQL Server 2019 in VS 2019 with DT for 2019, the Azure components are greyed out. When you try to set the target to SQL Server 2019 in the configuration, which is the version you downloaded the components for, it says that the target version of the components must be 2017, and the only choice is OK, so no Azure components.

 

If that is not a bug, I don't know what is!

 

Cheers

Roger

 

Microsoft

@rvunen Did you see this error message? "The TargetServerVersion property of your project must be SqlServer2017 to switch Azure-Enabled features on."

If so, you need to set the AzureEnabled property value to False before you can switch the target server version to SQL Server 2019.
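For reference, a minimal sketch of making those same two changes by editing the .dtproj file directly; the element names AzureEnabled and TargetServerVersion, and the value spelling SqlServer2019, are assumptions based on the property names quoted above, and the Visual Studio project properties dialog is the normal place to change them:

// Sketch (C#): flip AzureEnabled off, then retarget the project to SQL Server 2019,
// by editing the .dtproj XML. Element names and the value spelling are assumptions;
// verify them against your own project file first. "MyProject.dtproj" is a
// hypothetical path.
using System.Linq;
using System.Xml.Linq;

class RetargetSketch
{
    static void Main()
    {
        XDocument proj = XDocument.Load("MyProject.dtproj");

        // Step 1: AzureEnabled must be False before the target version can change.
        foreach (XElement e in proj.Descendants()
                                   .Where(x => x.Name.LocalName == "AzureEnabled"))
            e.Value = "False";

        // Step 2: retarget to SQL Server 2019 (spelling follows the error message above).
        foreach (XElement e in proj.Descendants()
                                   .Where(x => x.Name.LocalName == "TargetServerVersion"))
            e.Value = "SqlServer2019";

        proj.Save("MyProject.dtproj");
    }
}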

Microsoft

I was using the Azure Data Lake Store Destination with the Azure Data Lake Analytics Connection Manager and was reading the connection properties, such as Application ID and Authentication Key, from a previous step in which SQL queries set values into variables. It didn't work, because the expressions used in the ADLS connection manager still evaluated to the default values of the variables instead of the updated values from the previous step.

 

- Another issue I encountered is that I cannot create a new folder using the ADLS destination; the destination folder must already exist in the data lake. My use case is to create a daily stream with folder-wise separation of month-wise data.

 

Please let me know if there are any workarounds or settings that enable this usage.

Microsoft

@mayank127 For 1), we need more info to troubleshoot. Could you describe your package in detail, so we can author a similar package and repro your issue? Or better yet, could you share your package with us for investigation? For 2), you can use Flexible File Destination. Any folders that do not exist on the full path will be created automatically.
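In practice that means the dated folder does not need to be pre-created. Here is a minimal Script Task sketch that computes a month/day folder path into a variable, which an expression on the destination's folder path can then consume; the variable name User::TargetFolder, the container name, and the path layout are assumptions:

// Script Task body sketch (C#): build a month/day-partitioned folder path.
// Put this in ScriptMain.Main and list User::TargetFolder under
// ReadWriteVariables; drive the Flexible File Destination's folder path
// from an expression on that variable. Folders on the full path are
// created automatically, so the daily folder need not exist beforehand.
public void Main()
{
    System.DateTime now = System.DateTime.UtcNow;

    // e.g. "mycontainer/daily-stream/2019-10/23" (layout is hypothetical)
    Dts.Variables["User::TargetFolder"].Value =
        string.Format("mycontainer/daily-stream/{0:yyyy-MM}/{0:dd}", now);

    Dts.TaskResult = (int)ScriptResults.Success;
}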

Microsoft

The package can be thought of as two steps:

1. A SQL query that reads the application ID and application secret from a table.

2. A Data Flow Task from SQL to the Azure Data Lake Store Destination.

  • It uses the ADLS destination connection manager, with WebHdfsPassword and HdfsUserName set from the variables read in the previous step.
  • The default values of those variables are empty strings, and when the task runs, the connection uses those default empty values instead of the new values from the previous step and fails (see the workaround sketch below).
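One possible workaround for this stale-expression behavior is to assign the connection manager's properties directly from a Script Task placed just before the Data Flow Task. A minimal sketch, assuming a connection manager named "ADLS Connection" and variables named User::HdfsUserName and User::WebHdfsPassword:

// Script Task body sketch (C#): copy freshly-read credentials onto the ADLS
// connection manager at run time, bypassing expressions that may still hold
// the variables' default values. Put this in ScriptMain.Main, list the two
// variables under ReadOnlyVariables, and adjust the connection manager name.
public void Main()
{
    Microsoft.SqlServer.Dts.Runtime.ConnectionManager cm =
        Dts.Connections["ADLS Connection"]; // name is an assumption

    cm.Properties["HdfsUserName"].SetValue(cm,
        Dts.Variables["User::HdfsUserName"].Value);
    cm.Properties["WebHdfsPassword"].SetValue(cm,
        Dts.Variables["User::WebHdfsPassword"].Value);

    Dts.TaskResult = (int)ScriptResults.Success;
}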

 

Copper Contributor

@Lingxi I'm experiencing a bug with the Flexible File Destination task. We're exporting from a SQL Server source to Parquet (I've also tested with CSV), and it appears to truncate large decimal(38,19) values and write incorrect values to the files. I have compared the same pipeline side by side writing directly to a SQL Server table with the same data, and the issue isn't there. This is causing erroneous results in our financial reports. I've also used a "normal" Flat File Destination task and didn't have any issues. I installed the most recent Azure Feature Pack for SQL Server 2019 hoping to fix the issue, but that didn't change anything.

Microsoft

@j500sut This is a limitation of the component: decimal is always written as (38,18). A value with scale 19, such as 0.1234567890123456789, can therefore only be written out to 18 decimal places (0.123456789012345678).
