data lake
18 Topics

Gateway Timeout on Azure Data Factory Copy Task
I'm trying to set up a copy job that connects to a text file in Data Lake Storage (v1) and copies the data to somewhere... I've set up the Active Directory application, I've created a Data Factory (tried v1 and v2), I've created the copy task and connected to the Data Lake, and I've successfully picked a file on the lake. The file is a CSV file. On the file format settings screen I get a Gateway Timeout (Activity ID: 2f860074-7a71-470d-87b9-b5523a13d8a6) when setting up the file. I've tried everything from a simple file with 2 lines and 3 columns to a zipped file with lots of columns, and I get a similar error on the v1 factory. Any ideas on what I've done wrong?

Azure Data Lake Tools for VSCode supports Azure blob storage integration
We are pleased to announce the integration of the VSCode explorer with Azure blob storage. If you are a data scientist and want to explore the data in your Azure blob storage, please try the Data Lake Explorer blob storage integration. If you are a developer and want to access and manage your Azure blob storage files, please try the Data Lake Explorer blob storage integration. The Data Lake Explorer allows you to easily navigate your blob storage and to access and manage your blob containers, folders and files. Read about it in the Azure blog.

Get started with U-SQL: It’s easy!
Azure Data Lake Analytics combines declarative and imperative concepts in the form of a new language called U-SQL. The idea of learning a new language is daunting. Don’t worry! U-SQL is easy to learn. You can learn the vast majority of the language in a single day. If you are familiar with SQL or languages like C# or Java, you will find that learning U-SQL is natural and that you will be productive incredibly fast. A common question we get is “How can I get started with U-SQL?” This blog will show you all the core steps you need to get ramped up on U-SQL. Read about it in the Azure blog.

Control Azure Data Lake costs using Log Analytics to create service alerts
Azure Data Lake customers use the Data Lake Store and Data Lake Analytics to store and run complex analytics on massive amounts of data. However, it is challenging to manage costs, keep up to date with activity in the accounts, and know proactively when usage is nearing certain limits. Using Log Analytics and Azure Data Lake, we can address these challenges and know when costs are increasing or when certain activities take place. In this post, you will learn how to use Log Analytics with your Data Lake accounts to create alerts that can notify you of Data Lake activity events and when certain usage thresholds are reached. It is easy to get started! Read more about it in the Azure blog.

Azure Data Lake launches in the West Europe region
Azure Data Lake Store and Azure Data Lake Analytics are now generally available in the West Europe region, in addition to the previously announced regions of East US 2, Central US, and North Europe. Azure Data Lake Store is a hyperscale enterprise data lake in the cloud that is secure, massively scalable, and built to the open HDFS standard. Data from disparate data sources can be brought together into a single data lake so that all your analytics can run in one place. From first-class integration with AAD to fine-grained access control, built-in enterprise-grade security makes managing security easy for even the largest organizations. With no limits on the size of data and the ability to run massively parallel analytics, you can now unlock value from all your analytics data at ultra-fast speeds. Read about it in the Azure blog.

Microsoft Azure Data Lake Storage in Storage Explorer – public preview
Providing a rich GUI for managing Azure Data Lake Storage resources has been a top customer ask for a long time, and we are thrilled to announce the public preview of Azure Data Lake Storage (ADLS) support in Azure Storage Explorer (ASE). With the release of ADLS resources in ASE, you can freely navigate ADLS resources, upload and download folders and files, copy and paste files across folders or ADLS accounts, and easily perform CRUD operations on your folders and files. Azure Storage Explorer not only offers a traditional desktop explorer GUI for dragging, uploading, downloading, copying and moving your ADLS folders and files, but also provides a unified developer experience for displaying file properties, viewing folder statistics and adding quick access. With this extension you can now browse ADLS resources alongside the existing experiences for Azure Blobs, Tables, Files, Queues and Cosmos DB in ASE. Read about it in the Azure blog.
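Storage Explorer drives these upload, download and CRUD operations through the GUI. For readers who want to script the same folder and file management against a Data Lake Storage Gen1 account, a rough sketch with the Azure Data Lake Store Java SDK (azure-data-lake-store-sdk) might look like the following; the account name, AAD application values and paths are placeholders, and the exact class and method names should be verified against the SDK documentation.

```java
import java.io.OutputStream;
import java.io.PrintStream;
import java.util.List;

import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.DirectoryEntry;
import com.microsoft.azure.datalake.store.IfExists;
import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

public class AdlsFolderSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder service principal details; substitute your own AAD app values.
        String authTokenEndpoint = "https://login.microsoftonline.com/<tenant-id>/oauth2/token";
        String clientId = "<application-id>";
        String clientSecret = "<application-secret>";
        String accountFQDN = "<your-account>.azuredatalakestore.net";

        AccessTokenProvider provider =
                new ClientCredsTokenProvider(authTokenEndpoint, clientId, clientSecret);
        ADLStoreClient client = ADLStoreClient.createClient(accountFQDN, provider);

        // Create (or overwrite) a small file: the programmatic equivalent of an upload.
        OutputStream stream = client.createFile("/demo/folder/hello.txt", IfExists.OVERWRITE);
        try (PrintStream out = new PrintStream(stream)) {
            out.println("hello from the ADLS Gen1 Java SDK");
        }

        // List the folder contents, similar to browsing it in Storage Explorer.
        List<DirectoryEntry> entries = client.enumerateDirectory("/demo/folder");
        for (DirectoryEntry entry : entries) {
            System.out.println(entry.fullName + " (" + entry.length + " bytes)");
        }
    }
}
```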
Azure Data Lake tools integrates with VSCode Data Lake Explorer and Azure Account
If you are a data scientist and want to explore your data, understand what is being saved and see how the folders are organized, please try the Data Lake Explorer in the VSCode ADL Tools. If you are a developer looking for easier navigation inside ADLS, please use the Data Lake Explorer in the VSCode ADL Tools. The VSCode Data Lake Explorer enhances your Azure login experience, empowers you to manage your ADLA metadata in a tree-like hierarchy and enables easier file exploration for ADLS resources under your Azure subscriptions. You can also preview, delete, download, and upload files through the contextual menu. With the VSCode explorer integration, you can choose your preferred way to manage your U-SQL databases and your ADLS storage accounts in addition to the existing ADLA and ADLS commands. If you have difficulty signing in to Azure and are looking for a simpler sign-in process, the Azure Data Lake Tools integration with the VSCode Azure Account extension enables automatic sign-in and greatly enhances the integration with Azure. If you are an Azure multi-tenant user, the Azure Account integration unblocks you and empowers you to navigate your Azure subscription resources across tenants. Read about it in the Azure blog.

Not able to execute "az dls fs upload xxx" through java code
I am trying to execute "az dls fs upload --account XXX --source-path "/local/xyz.txt" --destination-path "/temp/folder/"" through Java code using Process p = Runtime.getRuntime().exec(command); but it does not copy the file to the Data Lake. Please help me figure this out, or is there another way to do this?
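A common cause of this kind of silent failure is that Runtime.exec splits the command string on whitespace without honoring the quotes, and that the process's output and error streams are never read, so any error from az is lost (on Windows, az is also a batch script and may need to be launched through cmd /c). Here is a minimal sketch, assuming the Azure CLI is on the PATH, that passes the arguments explicitly with ProcessBuilder and surfaces the CLI's output; the account name and paths are placeholders taken from the question.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class AdlsCliUpload {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Placeholder values from the question; replace with real ones.
        String account = "XXX";
        String sourcePath = "/local/xyz.txt";
        String destinationPath = "/temp/folder/";

        // Build the argument list explicitly so paths and quotes survive intact.
        List<String> command = new ArrayList<>();
        // On Windows the Azure CLI is a batch script, so launch it through cmd.
        if (System.getProperty("os.name").toLowerCase().contains("win")) {
            command.add("cmd");
            command.add("/c");
        }
        command.add("az");
        command.add("dls");
        command.add("fs");
        command.add("upload");
        command.add("--account");
        command.add(account);
        command.add("--source-path");
        command.add(sourcePath);
        command.add("--destination-path");
        command.add(destinationPath);

        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true); // merge stderr into stdout so errors are visible

        Process process = pb.start();
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // print the CLI output, including any error message
            }
        }

        int exitCode = process.waitFor();
        System.out.println("az exited with code " + exitCode);
    }
}
```

If the exit code is non-zero, the printed output should explain why the upload failed (for example, an authentication or path problem). Another option is to skip the CLI entirely and upload from within the JVM using the Azure Data Lake Store Java SDK, as sketched under the Storage Explorer topic above.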
Azure Data Lake Tools for Visual Studio Code (VSCode) October Updates
If you are a data scientist looking for a lightweight code editor for U-SQL, try ADL Tools for VSCode for rapid development. If you are a developer looking for a modern, simple U-SQL development tool, try ADL Tools for VSCode. If you prefer to use Mac or Linux for your development, install ADL Tools for VSCode and get started with U-SQL development. U-SQL is a programming framework built for scaling out big data queries and running them in a serverless cloud environment. In addition to being able to run SQL-like queries directly, the U-SQL framework makes it easy to plug in your R, Python or .NET algorithms and scale them out in the same easy, declarative style as SQL. Read about it in the Azure blog.