machine learning
Adding multi-language support for Azure AI applications quickly
There is a growing demand for applications that support speech, language identification, translation, or transliteration from one language to another. Common questions that customers encounter as they consider possible solutions are: How do we identify which language is being spoken or typed by the user? How do we translate one language to another for the user? Is there a way to transliterate text from one language to another? Can we determine the intent of a user utterance? How do we incorporate natural language understanding into our chatbot application for multiple languages? Complex problems such as these can now be solved with advanced APIs that are readily available, with no need to reinvent the wheel and no machine learning expertise required. This blog starts with a brief introduction to machine translation and then explores how to identify the language of spoken or typed text and how to translate or transliterate it using Microsoft’s Translator Text API. In addition, we discuss how translated or transliterated text can be integrated with LUIS (Language Understanding).
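The full post walks through these services in detail; as a minimal sketch of the language-identification and translation calls it refers to, the Translator Text REST API (v3) can be invoked roughly as follows, where the subscription key, resource region, and target language are placeholders you would supply:

```python
import requests

# Placeholders (assumptions): supply your own Translator resource key and region.
TRANSLATOR_KEY = "<your-translator-key>"
TRANSLATOR_REGION = "<your-resource-region>"
ENDPOINT = "https://api.cognitive.microsofttranslator.com"

HEADERS = {
    "Ocp-Apim-Subscription-Key": TRANSLATOR_KEY,
    "Ocp-Apim-Subscription-Region": TRANSLATOR_REGION,
    "Content-Type": "application/json",
}

def detect_language(text):
    """Identify the language of the input text (Translator v3 /detect)."""
    resp = requests.post(
        f"{ENDPOINT}/detect",
        params={"api-version": "3.0"},
        headers=HEADERS,
        json=[{"text": text}],
    )
    resp.raise_for_status()
    return resp.json()[0]["language"]

def translate(text, to_lang="en"):
    """Translate the input text into the target language (Translator v3 /translate)."""
    resp = requests.post(
        f"{ENDPOINT}/translate",
        params={"api-version": "3.0", "to": to_lang},
        headers=HEADERS,
        json=[{"text": text}],
    )
    resp.raise_for_status()
    return resp.json()[0]["translations"][0]["text"]

print(detect_language("Bonjour tout le monde"))  # e.g. "fr"
print(translate("Bonjour tout le monde", "en"))  # e.g. "Hello everyone"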
Pre-train and Fine-tune Language Model with Hugging Face and Gaudi HPU
In this blog, we provide a general guideline for pre-training and fine-tuning language models with Hugging Face. For illustration, we use pre-training a language model for question generation (answer-agnostic) in Korean as a running example.
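The post itself targets Habana Gaudi HPUs; as a rough, hardware-agnostic sketch of the Hugging Face fine-tuning workflow it builds on, a standard Trainer-based setup might look like the following. The checkpoint (google/mt5-small), the Korean QA dataset (squad_kor_v1), and the hyperparameters are illustrative assumptions, not the post's actual configuration:

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    DataCollatorForSeq2Seq,
)

# Placeholder multilingual seq2seq checkpoint and Korean QA dataset.
model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
raw = load_dataset("squad_kor_v1")

def preprocess(batch):
    # Answer-agnostic question generation: context in, question out.
    inputs = tokenizer(batch["context"], max_length=512, truncation=True)
    targets = tokenizer(text_target=batch["question"], max_length=64, truncation=True)
    inputs["labels"] = targets["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="qg-ko",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

Running on Gaudi HPUs would swap the standard trainer for Habana's Hugging Face integration, as described in the post.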
Reducing the distance to your Azure ML remote compute jobs
Under (hopefully) rare circumstances, a training script that has been developed and thoroughly tested locally can still fail when executed on a remote Azure ML compute target. Here we share some best practices for debugging remote workloads on Azure ML.
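As a baseline for the debugging tips in the post, a minimal sketch of submitting a script to a remote compute target and streaming its logs with the Azure ML Python SDK (v1) could look like this; the environment file, source folder, script, experiment, and compute target names are placeholders, and the post's techniques go well beyond this:

```python
from azureml.core import Workspace, Experiment, Environment, ScriptRunConfig

ws = Workspace.from_config()  # reads config.json for your workspace

# Placeholder conda environment specification for the training script.
env = Environment.from_conda_specification(name="train-env", file_path="environment.yml")

src = ScriptRunConfig(
    source_directory="./src",       # placeholder folder containing train.py
    script="train.py",
    compute_target="gpu-cluster",   # placeholder remote compute target name
    environment=env,
)

run = Experiment(workspace=ws, name="remote-debugging").submit(src)

# Stream driver logs to the local console while the remote job runs,
# and raise if the run ends in a failed state.
run.wait_for_completion(show_output=True, raise_on_error=True)
```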
Accelerating Azure Databricks Runtime for Machine Learning
Data scientists love working in the Azure Databricks environment when developing their machine learning and artificial intelligence models. By simply installing Intel-optimized versions of scikit-learn and TensorFlow as described in this blog post, they can see potentially large performance gains that will save them time and money.
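The linked post covers the installation steps; as a minimal sketch of the scikit-learn side, assuming the scikit-learn-intelex package is used (installed first, e.g. with `%pip install scikit-learn-intelex` in a Databricks notebook), training might look like this, with an illustrative dataset and model:

```python
from sklearnex import patch_sklearn
patch_sklearn()  # route supported scikit-learn estimators to Intel-optimized kernels

# Import scikit-learn estimators only after patching.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative synthetic dataset and model; no code changes are needed beyond the patch.
X, y = make_classification(n_samples=100_000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```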