Azure ML Studio
4 Topics

Operationalize your Prompt Engineering Skills with Azure Prompt Flow
In today's AI-driven world, prompt engineering is a game-changing skill for developers and professionals alike. With Azure Prompt Flow, you can harness the power of open-source LLMs to solve real-world operational challenges. This article guides you through using Azure's tools to build, deploy, and refine your own LLM apps, from chatbots to data extraction tools and beyond. Whether you're just starting out or looking to sharpen your AI expertise, this guide has everything you need to unlock new possibilities with prompt engineering.

Drug Details in Doctor's Prescriptions: A Named Entity Recognition Approach using Prompt Flow
This article shows how to identify important information, such as drug details, in a doctor's prescription, a task known as Named Entity Recognition (NER). Entities are words that represent specific concepts or objects in a text. You can apply the same approach to find the important information in any text. Below is a sample doctor's prescription from which we need to identify the drugs, e.g. ibuprofen and pseudoephedrine.

Unable to load large delta table in Azure ML Studio
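The drug-identification task described in the NER article above can be mimicked with a toy dictionary lookup. This is only an illustrative sketch under my own assumptions (the DRUGS vocabulary and the sample sentence are hypothetical); the article itself uses an LLM via Prompt Flow rather than a lookup like this:

```python
import re

# Hypothetical drug vocabulary; a real system would recognize unseen drug
# names from context rather than from a fixed list.
DRUGS = {"ibuprofen", "pseudoephedrine"}

def find_drugs(text: str) -> list[str]:
    # Lower-case the alphabetic tokens, then keep those found in the vocabulary.
    tokens = re.findall(r"[a-zA-Z]+", text.lower())
    return [t for t in tokens if t in DRUGS]

prescription = "Take Ibuprofen 200mg twice daily and Pseudoephedrine 30mg as needed."
print(find_drugs(prescription))  # ['ibuprofen', 'pseudoephedrine']
```

The point of the LLM-based approach is precisely to avoid maintaining a lookup table like `DRUGS`: the model labels entities it has never been explicitly listed.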
I am writing to report an issue I am experiencing while trying to read a delta table from Azure ML. I have already created data assets to register the delta table, which is located in an ADLS location. However, when loading the data, I have noticed that large tables take an exceedingly long time to load. I have confirmed that small tables are returned within a few seconds, which leads me to believe there may be a scalability issue in the data loading process. I would greatly appreciate it if you could investigate this and provide any recommendations or solutions. I can provide additional details such as the size of the data, the steps I am taking to load it, and any error messages if required.

I'm following this document: https://learn.microsoft.com/en-us/python/api/mltable/mltable.mltable?view=azure-ml-py#mltable-mltable-from-delta-lake

I am using this command to read the delta table via the data asset URI:

from mltable import from_delta_lake

mltable_ts = from_delta_lake(
    delta_table_uri=<DATA ASSET URI>,
    timestamp_as_of="2999-08-26T00:00:00Z",
    include_path_column=True,
)

How to get data directory path from class DatasetConsumptionConfig?
I am trying to read my data files from an Azure ML dataset. My code is as follows:

from azureml.core import Dataset

dataset = Dataset.get_by_name(aml_workspace, "mydatasetname")
dataset_mount = dataset.as_named_input("mydatasetname").as_mount(path_on_compute="dataset")

The type of dataset_mount is the class DatasetConsumptionConfig. How do I get the actual directory path from that class? I can do it in a rather complicated way by passing dataset_mount into a script step:

PythonScriptStep(script_name="myscript.py", arguments=["--dataset_mount", dataset_mount], ...)

When that script step runs, "myscript.py" mysteriously receives the real directory path of the data in the "--dataset_mount" argument, instead of the DatasetConsumptionConfig object. However, that is an overcomplicated and indirect approach. Is there a direct way to get the data path from DatasetConsumptionConfig? Or have I misunderstood something here?
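To illustrate the workaround the question describes: inside the step script, the DatasetConsumptionConfig argument arrives as a plain mount-path string, so reading it is ordinary argument parsing. A minimal sketch of the script side (the path value here is simulated for demonstration; on the compute target Azure ML substitutes the real mount location):

```python
import argparse

# myscript.py: when the PythonScriptStep runs, Azure ML resolves the
# DatasetConsumptionConfig passed in `arguments` into the actual mount
# path string before the script sees it.
parser = argparse.ArgumentParser()
parser.add_argument("--dataset_mount", type=str)

# Simulated invocation; at runtime Azure ML supplies the real path via sys.argv.
args = parser.parse_args(["--dataset_mount", "/mnt/azureml/dataset"])

data_dir = args.dataset_mount  # an ordinary directory path inside the script
print(data_dir)
```

So the path only materializes in the run context of the step, which is why the config object itself does not expose it before submission.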