Microsoft Ignite 2021
Database templates in Azure Synapse Analytics
Azure Synapse Analytics is a limitless analytics service that brings together data integration, enterprise data warehousing, and big data analytics. It gives you the freedom to query data on your terms, using either serverless or dedicated resources, at scale. Azure Synapse brings these worlds together with a unified experience to ingest, explore, prepare, manage, and serve data for immediate BI and machine learning needs.

One of the challenges that users in key industry areas face is how to describe and shape the mass of data they are gathering. Most of this data is currently stored in data lakes or in application-specific data silos. The challenge is to bring all this data together in a standardized format so that it can be more easily analyzed and understood, and so that ML and AI can be applied to it. Azure Synapse solves this problem by introducing industry-specific templates for your data, providing a standardized way to store and shape data. These templates provide schemas for predefined business areas, enabling data to be loaded into a database in a structured way.

Database templates in Azure Synapse are industry-specific schema definitions that provide a quick method of creating a database, known as a lake database. As with any database, it contains a well-defined schema for a business solution. This schema includes tables, columns, and relationships that represent transactional, operational, and business semantic data. Azure Synapse performs big data operations on a lake database via either SQL or Spark compute pools.

By using database templates, you are leveraging decades of industry-specific experience to form an extensible schema. This schema can be extended with, or adapted to, your own business terminology and taxonomies. Data processing pipelines in Azure Synapse provide hundreds of connectors (e.g., SAP ECC, Dynamics CRM, Magento), making it easy to connect to different source systems.
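As a concrete (and deliberately tiny) illustration of what such a schema contains, the sketch below models two tables and one relationship as plain Python data. All table and column names here are hypothetical illustrations, not taken from an actual database template:

```python
# Minimal sketch of a lake-database-style schema: tables, columns, and
# relationships. Names are hypothetical, not actual template definitions.
schema = {
    "tables": {
        "Customer": ["CustomerId", "CustomerName", "CustomerEmail"],
        "Transaction": ["TransactionId", "CustomerId", "TransactionDate", "Amount"],
    },
    "relationships": [
        # (from_table, from_column) -> (to_table, to_column)
        (("Transaction", "CustomerId"), ("Customer", "CustomerId")),
    ],
}

def validate(schema):
    """Check that every relationship endpoint refers to an existing table and column."""
    for (ft, fc), (tt, tc) in schema["relationships"]:
        assert fc in schema["tables"][ft], f"{ft}.{fc} missing"
        assert tc in schema["tables"][tt], f"{tt}.{tc} missing"
    return True

print(validate(schema))  # True
```

The real templates carry far richer metadata (data types, business definitions, industry taxonomies), but the core idea is the same: a relational shape agreed on up front, so downstream analytics and ML can rely on it.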
The data in the lake database is stored in the data lake (as delimited text or parquet) and forms the foundation of an enterprise data lake, where data from different sources is combined for analytics and reporting. Finally, database templates have been built with an ecosystem in mind: customers and partners can rapidly build analytics-infused industry use cases by customizing and extending the standard templates using the database editor in Azure Synapse.

Here’s what some of the early adopters have noted:

“Having an opportunity to test and use Azure Synapse database templates, our team at dunnhumby were impressed by the breadth of data domain coverage, along with Azure Synapse’s features and tools for development, engineering, and delivery of outstanding data science. As a global leader in retail data science, and knowing how hard navigating data fundamentals can be, we can see the benefits for Azure Synapse customers in helping them to rapidly unlock the value in their data assets, enabling them to evolve and scale their insight capabilities. Azure Synapse database templates can be a key enabler in breaking down data silos and unlocking potential value in enterprise data.” - David Jack, Chief Technology Officer, dunnhumby

“We have further deepened our Qlik Data Integration capabilities with Azure Synapse with the availability of database templates. Our retail customers now have an increased ability to access and transform SAP’s complex, application-specific data structures from any SAP source into formats optimized for analytics within Azure Synapse.” - Matt Hayes, VP SAP Business, Qlik

Currently, Azure Synapse includes database templates for Retail, Consumer Goods, Banking, Fund Management, and Property and Casualty Insurance, with more industry-specific templates to be added in the near future.

Getting started with database templates

Here’s how the database templates can be used from Azure Synapse Studio (the browser-based editor for Azure Synapse).

1. Select a database template from the Azure Synapse gallery. Let’s use Retail as an example.

2. You’ll see a set of eight tables pre-populated in a lightweight data model editor to get you started. From here you can add, remove, or edit tables, columns, and relationships. Click “Create Database” when done.

3. In the database designer, you can further refine your data model by making edits to the tables specified previously. You can also add tables from an existing data lake or create brand new ones; the former is ideal if you already have a data lake in use. The Properties tab lets you specify the name of your lake database, the data format (delimited text or parquet), and storage account settings. The default storage account is the one that comes with your Synapse workspace. Click “Publish All” to commit your changes.

That’s it! In a few minutes you’re up and running with a lake database, ready to load it with data. The data can be loaded via Synapse Pipelines, an ELT engine, with the lake database as the target and your operational data as the source. With hundreds of data connectors (including 3rd-party, ODBC, REST, OData, and HTTP), you can ensure that any source data can be loaded into your lake database.

After the data is loaded, you can take advantage of pre-built AI and ML models that understand your data based on the lake database template schema. An example is the Retail - Product Recommendation solution in the Gallery. Knowing the shape of the data allows us to provide pre-built industry AI solutions. The AI solution for Retail Product Recommendation provides a robust and scalable recommendation engine for out-of-the-box development in Synapse. No additional data shaping is needed; the solution works on the data out of the box. This accelerates the productivity of existing or emerging data scientists solving a specific problem in the Retail domain.
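To make the pipeline step concrete, here is a heavily simplified sketch of what a Copy activity definition might look like, expressed as Python data. The dataset names are hypothetical, real Synapse pipeline activity JSON carries many more required properties, and the source/sink types vary by connector:

```python
import json

# Hypothetical, heavily simplified sketch of a pipeline Copy activity that
# lands rows from an operational source into a lake database table.
# Real Synapse activity definitions include many more required properties.
copy_activity = {
    "name": "LoadRetailCustomer",
    "type": "Copy",
    "inputs": [{"referenceName": "SourceCrmCustomers", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "LakeDbCustomer", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "SqlSource"},
        # Lake databases store data as delimited text or parquet.
        "sink": {"type": "ParquetSink"},
    },
}

print(json.dumps(copy_activity, indent=2))
```

In practice you would author this visually in Synapse Studio rather than by hand; the point is that the lake database table is simply another dataset that any of the hundreds of connectors can target.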
Considering database templates as a core component in your next big data project will help you build a better-integrated and more scalable architecture.

Benefits of database templates

By leveraging Azure Synapse’s library of database templates, derived from decades of industry implementations, you can:

- Accelerate time to insights with standardized business area schemas for different industries
- Identify gaps and opportunities in your existing enterprise data model
- Consolidate data silos and query from a single pane of glass (Synapse Studio)
- Create a well-formed data lake ready for analytics at scale
- Enrich your data with Azure Cognitive Services and Azure Machine Learning
- Develop reports easily using Power BI

Conclusion

Azure Synapse Analytics gives you the freedom to query data on your terms, using either serverless or dedicated resources, at scale. It solves many of the productivity and scalability challenges that prevent you from maximizing the value of your data assets, with a service that is ready to meet your current and future business needs. To learn more, visit:

- Azure Synapse Analytics documentation
- Database templates in Azure Synapse

Introducing Azure Cognitive Service for Language
Azure Cognitive Services has historically had three distinct NLP services that solve different but related customer problems: Text Analytics, LUIS, and QnA Maker. As these services have matured and customers now depend on them for business-critical workloads, we wanted to take a step back and evaluate the most effective path forward over the next several years for delivering our roadmap of a world-class, state-of-the-art NLP platform-as-a-service.

Each service was initially focused on a set of capabilities supporting distinct customer scenarios: LUIS for custom language models, most often supporting bots; Text Analytics for general-purpose pre-built language services; and QnA Maker for knowledge-based question answering. As AI accuracy has improved, the cost of offering more sophisticated models has decreased, and customers have increased their adoption of NLP for business workloads, we are seeing more and more overlapping scenarios where the lines are blurred between the three distinct services. As such, the most effective path forward is a single unified NLP service in Azure Cognitive Services.

Today we are pleased to announce the availability of Azure Cognitive Service for Language. It unifies the capabilities of Text Analytics, LUIS, and the legacy QnA Maker service into a single service. The key benefits include:

- Easier discovery and adoption of features.
- Seamlessness between pre-built and custom-trained NLP.
- The ability to build NLP capabilities once and reuse them across application scenarios.
- Access to multilingual, state-of-the-art NLP models.
- A simpler start through consistency in APIs, documentation, and samples across all features.
- More billing predictability.

The unified Language service will not affect any existing applications. All existing capabilities of the three services will continue to be supported until we announce a deprecation timeline for the existing services (which would be no less than one year out).
However, new features and innovation will happen only in the unified Language service. For example, question answering and conversational language understanding (CLU) are available only in the unified service (more details on these features later). As such, customers are encouraged to start making plans to adopt the unified service. More details on migration, including links to resources, are provided below. Here is everything we are announcing today in the unified Language service:

Text Analytics is now the Language service: All existing features of Text Analytics are included in the Language service. Specifically, Sentiment Analysis and Opinion Mining, Named Entity Recognition (NER), Entity Linking, Key Phrase Extraction, Language Detection, Text Analytics for health, and Text Summarization are all part of the Language service as they exist today. Text Analytics customers don’t need to make any migrations or updates to their in-production or in-development apps. The unified service is backward compatible with all existing Text Analytics features. The key difference is that when creating a new resource in the Azure portal, you will now see the resource labeled “Language” rather than “Text Analytics”.

Introducing conversational language understanding (preview), the next generation of LUIS: Language Understanding (LUIS) has been one of our fastest-growing Cognitive Services, with customers deploying custom language models to production for scenarios ranging from command-and-control IoT devices and chat bots to contact center agent-assist scenarios. The next phase in the evolution of LUIS is conversational language understanding (CLU), which we are announcing today as a preview feature of the new Language service. CLU introduces multilingual transformer-based models as the underlying model architecture, resulting in significant accuracy improvements over LUIS.
Also new as part of CLU is the ability to create orchestration projects, which let you configure a project to route to multiple customizable language services, such as question answering knowledge bases, other CLU projects, and even classic LUIS applications. Visit here to learn more. If you are an existing LUIS customer, we are not requiring you to migrate your application to CLU today. However, as CLU represents the evolution of LUIS, we encourage you to start experimenting with CLU in preview and give us feedback on your experience. You can import a LUIS JSON application directly into CLU to get started.

GA of question answering: In May 2021, we launched the preview of custom question answering. Today we are announcing the General Availability (GA) of question answering as part of the new Language service. If you are just getting started with building knowledge bases that are queryable with natural language, visit here to get started. If you want to know more about migrating legacy QnA Maker knowledge bases to the Language service, see here. Your existing QnA Maker knowledge bases will continue to work. We are not requiring customers to migrate from QnA Maker at this time. However, question answering represents the evolution of QnA Maker, and new features will be developed only for the unified service. As such, we encourage you to plan for a migration from legacy QnA Maker if this applies to you.

Introducing custom named entity recognition (preview): Documents include an abundance of valuable information, and enterprises rely on pulling out that information to easily filter and search through those documents. Using the standard Text Analytics NER, they could extract known types such as person names, geographical locations, datetimes, and organizations. However, much of the information of interest is more specific than the standard types. To unlock these scenarios, we’re happy to announce custom NER as a preview capability of the new Language service.
The capability allows you to build your own custom entity extractors by providing labeled examples of text to train models. Securely upload your data in your own storage accounts, label it in Language Studio, then deploy and query the custom models to obtain entity predictions on new text. Visit here to learn more.

Introducing custom text classification (preview): While many pieces of information can exist in any given document, the text as a whole can belong to one or more categories. Organizing and categorizing documents is key for data-reliant enterprises. We’re excited to announce custom text classification, a preview feature of the Language service, where you can create custom classification models with your own defined classes. Securely upload your data in your own storage accounts and label it in Language Studio. Choose between single-label classification, where you label and predict one class for every document, and multi-label classification, which allows you to assign or predict several classes per document. This capability enables automation for incoming text such as support tickets, customer email complaints, or organizational reports. Visit here to learn more.

Language Studio: This is the single destination for experimentation, evaluation, and training of Language AI / NLP in Cognitive Services. With Language Studio you can now try any of our capabilities with a few button clicks. For example, you can upload medical documents and instantly get back all the extracted entities and relations, and you can easily integrate the API into your solution using the Language SDK. You can take it further by training your own custom NER model and deploying it through the easy-to-use interface. Try it out for yourself here.

Several customers are already using Azure Cognitive Service for Language to transform their businesses.
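For a sense of what calling the unified service looks like, here is a minimal sketch of an analyze-text request body for sentiment analysis. The endpoint URL, API version, and authentication headers are omitted, and the exact field names shown are assumptions to be checked against the current Language service REST reference:

```python
import json

# Minimal sketch of a Language service analyze-text request body for
# sentiment analysis. The surrounding HTTP call (endpoint URL, api-version
# query parameter, subscription-key header) is omitted and varies by resource;
# treat the field names as assumptions and verify against the API reference.
request_body = {
    "kind": "SentimentAnalysis",
    "analysisInput": {
        "documents": [
            {"id": "1", "language": "en", "text": "The new unified service is easy to use."}
        ]
    },
}

print(json.dumps(request_body, indent=2))
```

The same envelope shape is reused across features: swapping the "kind" value (and any per-feature parameters) selects a different capability against the same endpoint, which is part of what "consistency in APIs" means in practice.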
Here's what two of them had to say:

“We used Azure Cognitive Services and Bot Service to deliver an instantly responsive, personal expert into our customers’ pockets. Providing this constant access to help is key to our customer care strategy.” - Paul Jacobs, Group Head of Operations Transformation, Vodafone

“Sellers might have up to 100,000 documents associated with a deal, so the time savings can be absolutely massive. Now that we’ve added Azure Cognitive Service for Language to our tool, customers can potentially compress weeks of work into days.” - Thomas Fredell, Chief Product Officer, Datasite

To learn more directly from customers, see the following customer stories:

- Vodafone transforms its customer care strategy with digital assistant built on Azure Cognitive Services
- Progressive Insurance levels up its chatbot journey and boosts customer experience with Azure AI
- Kepro improves healthcare outcomes with fast and accurate insights from Text Analytics for health

On behalf of the entire Cognitive Services Language team at Microsoft, we can't wait to see how Azure Cognitive Service for Language benefits your business!

Managed Instance link – connecting SQL Server to Azure reimagined
The link feature for Managed Instance reimagines the connection between SQL Server hosted anywhere and Azure SQL Managed Instance, the fully managed PaaS service, providing unprecedented hybrid flexibility and database mobility. With an approach that uses near real-time data replication to Azure, you can offload workloads to read-only secondaries in Azure to take advantage of a fully managed database platform, performance, and scale. The link can be operated for as long as you need it, months and years at a time, empowering you to get all the modern benefits of Azure today without migrating to the cloud. On your modernization journey, when and if you are ready to migrate to the cloud, the link de-risks your migration by allowing you to validate your workloads in Azure before migrating, with a seamless experience, at your own pace. This article provides deeper insights into this new feature.

Hybrid Cloud meets Hybrid Work with Azure Virtual Desktop for Azure Stack HCI
Following the launch of Azure Virtual Desktop in 2019, business needs, and the requirements for remote work, have continued to evolve. Fortunately, our flexible virtual desktop infrastructure (VDI) platform gives customers the benefits of cloud scale, security, agility, and resilience, along with less complexity and management overhead than traditional VDI solutions.

Announcing General Availability for Virtual Machine Scale Sets – Flexible orchestration mode
Increase availability at scale with Virtual Machine Scale Sets - Flexible orchestration mode, now generally available. Virtual Machine Scale Sets are a great platform for running your mission-critical, dynamic, and scalable applications.