Latest Discussions
Garbled characters after scanning a database for a Japanese customer using DMA
I have run into an issue where I scanned multiple databases for a Japanese customer and certain characters come out garbled (possibly text written in Japanese). Has anyone faced a similar situation, or is there any way to handle it?
himanshus2401 · Nov 21, 2023 · Copper Contributor

Validating record counts in SQL Server database tables against migrated Azure Data Lake Gen2
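(A note on the garbled-characters question above: this symptom is often mojibake, i.e. bytes stored in one code page but decoded with another. A minimal Python sketch, assuming the data is Shift-JIS/CP932 — a common encoding for Japanese SQL Server collations — shows how the garbling arises and how it can sometimes be reversed; the sample string is hypothetical.)

```python
# Mojibake demo: Japanese text encoded as CP932 (Shift-JIS) but
# decoded with a Western code page comes out as garbled characters.
original = "顧客"                      # "customer" in Japanese (sample value)
raw = original.encode("cp932")         # bytes as a Japanese-collation database stores them
garbled = raw.decode("latin-1")        # mis-decoded with a Western code page -> mojibake
# If no byte was lost in transit, re-encoding with the wrong code page
# and decoding with the right one recovers the original text.
restored = garbled.encode("latin-1").decode("cp932")
```

If the characters were already replaced with `?` during the scan or migration, the information is lost and the data must be re-exported with a Unicode (NVARCHAR/UTF-8) target.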
We are migrating our project from on-premises to Azure: the on-premises database is SQL Server, and Azure Data Lake Gen2 is the storage location where we currently keep the data. So far we validate the record count of each table manually in the SQL Server database; similarly, we write PySpark code in Databricks to write that data out as Parquet files and validate the record count from PySpark manually every time, which is time-consuming. Is it possible to automate this process to save time? Can this be done using PySpark code?
sai_sathya · Sep 29, 2023 · Copper Contributor

APS to Synapse Migration (Datetime datatype issue)
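(On the record-count question above: yes, the comparison step is straightforward to automate; only the two count-collection steps are Spark-specific. A minimal Python sketch — the function name and table names are hypothetical, and the commented `spark.read` calls are illustrative placeholders, not a tested configuration:)

```python
def compare_counts(source_counts, target_counts):
    """Compare per-table record counts from the source SQL Server database
    against counts read back from the migrated Parquet files.
    Returns a list of (table, source_count, target_count) mismatches."""
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt != src:
            mismatches.append((table, src, tgt))
    return mismatches

# In a Databricks notebook the two dicts could be built roughly like this
# (jdbc_url, props, and the abfss path are placeholders):
#   source_counts[t] = spark.read.jdbc(jdbc_url, t, properties=props).count()
#   target_counts[t] = spark.read.parquet(f"abfss://container@account.dfs.core.windows.net/{t}").count()
```

Scheduling the notebook as a Databricks job would then replace the manual check entirely, with a non-empty mismatch list failing the run.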
Hi everyone, I am seeking help to fix the issue below. I followed the steps from GitHub for migrating APS to Synapse.

Link: https://github.com/microsoft/AzureSynapseScriptsAndAccelerators

Source Date values:

Sample external table script:

CREATE EXTERNAL TABLE [DIM].[account]
(
    [ID] [bigint] NOT NULL,
    [EffectiveFrom] [datetime] NOT NULL,
    [EffectiveTo] [datetime2] NOT NULL,
    [ValidFrom] [datetime] NOT NULL,
    [ValidTo] [datetime2] NOT NULL
)
WITH (
    DATA_SOURCE = [datasource_abfss],
    LOCATION = N'/locxxxxxxx',
    FILE_FORMAT = [parquet_file_format],
    REJECT_TYPE = VALUE,
    REJECT_VALUE = 0
)
GO

SELECT TOP (100) [ID], [EffectiveFrom], [EffectiveTo], [ValidFrom], [ValidTo]
FROM [DIM].[account]
GO

We are facing challenges reading the datetime columns: the Parquet file could not be loaded from the ADLS Gen2 blob into the external table. Here is the error:

HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: Arithmetic overflow error converting timestamp to data type DATETIME.

Second try, reading from the Parquet file using the auto-generated scripts:

SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://xxxxx.dfs.core.windows.net/srcblob/DIM_account/**',
    FORMAT = 'PARQUET'
) AS [result]

Error:

Started executing query at Line 1
Error handling external file: 'Inserting value to batch for column type DATETIME2 failed. Invalid argument provided.'. File/External table name: 'https://xxxxx.dfs.core.windows.net/srcblob/DIM_account/account.snappy.parquet'.

Your assistance is greatly appreciated!
Santhosh0579 · Jun 07, 2023 · Copper Contributor

Microsoft Virtual Summit April 30: Speakers on Cosmos DB, Data Analytics, Modern Data Warehouse
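(On the datetime overflow above: SQL Server's DATETIME type only covers 1753-01-01 through 9999-12-31, while Parquet timestamps and DATETIME2 can hold earlier dates, so sentinel values such as 0001-01-01 overflow the conversion. A minimal Python sketch of a pre-migration range check on the source values; the helper name is hypothetical:)

```python
from datetime import datetime

# SQL Server DATETIME supports 1753-01-01 through 9999-12-31;
# anything earlier overflows when converted from a Parquet timestamp.
SQL_DATETIME_MIN = datetime(1753, 1, 1)

def fits_sql_datetime(ts: datetime) -> bool:
    """Return True if ts can be stored in a SQL Server DATETIME column."""
    return ts >= SQL_DATETIME_MIN
```

One common fix is declaring the affected columns as DATETIME2 in the external table, since DATETIME2 accepts the full 0001-9999 range; however, the second error above suggests the file may also contain timestamp values that Synapse cannot map at all, which is worth checking in the source data.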
Eight Microsoft experts discuss the latest developments around Cloud, Security, Teams/Collaboration, Apps/DevOps, GitHub/Open Source, AI/ML, Containers/Kubernetes, Data Analytics, DB Architecture, Modern Data Warehouse/Data Lake, and more. This free, virtual, and interactive summit, organized by Angelbeat on behalf of Microsoft, has highly relevant and technical content on all these top issues, plus offers CPE/CEU/CISSP credits. Here is the registration link, where you can sign up and see the full schedule: https://www.angelbeat.com/microsoft-april-30-virtual-summit/

Thursday, April 30, 9:30 AM - 5:30 PM, US Eastern Time
Angelbeat_Ron_Gerber · Apr 08, 2020 · Copper Contributor

Migrate SQL Server on-premises to Azure SQL Server
https://datamigration.microsoft.com/scenario/sql-to-azuresqldb?step=1
haitsong · Mar 04, 2020 · Former Employee