Azure AI Foundry Blog

Question answering feature is generally available

Disha_Agarwal
Microsoft
Nov 02, 2021

We are excited to announce that question answering is now generally available as a feature within Azure Cognitive Service for Language. As part of our AI at Scale initiative across Microsoft, we are making the latest breakthroughs in natural language processing available across our products and within our Azure Services portfolio. The question answering feature is powered by state-of-the-art transformer models and the Turing natural language model.

Question answering comprises two capabilities:

  1. Custom question answering: Using this capability, users can customize different aspects, such as editing question and answer pairs extracted from content sources, defining synonyms and metadata, and accepting question suggestions.
  2. Prebuilt question answering: This capability allows users to get a response by querying a text passage without needing to manage knowledge bases.

With question answering becoming generally available, we are introducing:

  • Access to question answering through the ‘Answer questions’ section in Language Studio, the recently announced customization portal for your Language Service.
  • Availability across 32 regions supported by the Language Service.  
  • Support for adding and querying unstructured content from public URLs with custom question answering in 53 languages.
  • Precise answering support for 53 languages.
  • Question answering REST APIs for the Language Service, now generally available.
  • The question answering runtime SDK for the Language Service, which will be generally available shortly.

Custom question answering


Customize with Language Studio: We are introducing a new guided experience to customize question answering with Language Studio. As shown below, users can customize a domain-specific knowledge base over their data by clicking Open custom question answering.

Users can create a custom question answering project, which is akin to a knowledge base. Users are guided to add data sources to the knowledge base and then test and deploy it. We have a distinct Review suggestions section to manage and review Active Learning suggestions generated by the system based on real-world user queries. These suggestions help users optimize the knowledge base based on real-world feedback.


In the Edit knowledge base section, we have added new editing and testing capabilities to improve the editing experience, including advanced filters, support for drag and drop of questions and prompts, and testing with metadata filters and score thresholds.
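
For readers who want to run the same kind of test outside the portal, here is a minimal sketch of a query against a deployed knowledge base over REST. The endpoint, project name, deployment name, metadata key/value and threshold are placeholders, and the field names (filters, metadataFilter, confidenceScoreThreshold) are assumptions based on the 2021-10-01 request schema, so verify them against the current API reference.

    # Query a deployed knowledge base with a metadata filter and a minimum confidence score (placeholder values).
    curl -X POST "https://{your-endpoint}.api.cognitive.microsoft.com/language/:query-knowledgebases?projectName={project-name}&deploymentName={deployment-name}&api-version=2021-10-01" \
      -H "Ocp-Apim-Subscription-Key: {your-key}" \
      -H "Content-Type: application/json" \
      -d '{
            "question": "How do I reset my password?",
            "top": 3,
            "confidenceScoreThreshold": 0.5,
            "filters": {
              "metadataFilter": {
                "metadata": [ { "key": "category", "value": "account" } ]
              }
            }
          }'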



Simplify service management: Question answering requires managing only two Azure services: the Language Service for authoring and computation, and Azure Cognitive Search for storage and ranking. This simplifies service management compared to QnA Maker, which requires users to manage five services.
As question answering is now a feature of the Language Service, users can access other NLP features with the same Language Service resource. This greatly reduces complexity for users who combine multiple NLP features for their use cases. For instance, custom question answering (GA) and custom language understanding (Preview) are now both features of the same Language Service.

Support for 32 regions: Question answering is available across the 32 regions supported by the Language Service. Users can now keep the entire stack, both authoring and prediction, within the same region.

Support for unstructured content from public URLs: We now support adding both files and URLs with unstructured content to your knowledge bases. We have also added the ability to auto-classify a document/URL as unstructured if applicable.



If the answer to a user query is not addressed by the question and answer pairs in the knowledge base, the service searches across the unstructured content (powered by Turing models) and returns the relevant response. Take, for instance, a request-response example after adding the following URL on the Surface Pen (with unstructured content): Best in class, Microsoft Classroom Pen 2 empowers every student to achieve more | Microsoft Devices Blog.
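
As an illustrative sketch of such a request (not the exact payload behind the example above), the query below sets includeUnstructuredSources, the parameter also discussed in the comments below, so the service can fall back to unstructured sources, and uses answerSpanRequest to ask for the precise/short answer. The question text and placeholders are made up, and answerSpanRequest is an assumed field name from the GA schema.

    # Ask a question that is covered only by the unstructured URL content, requesting a precise answer span as well.
    curl -X POST "https://{your-endpoint}.api.cognitive.microsoft.com/language/:query-knowledgebases?projectName={project-name}&deploymentName={deployment-name}&api-version=2021-10-01" \
      -H "Ocp-Apim-Subscription-Key: {your-key}" \
      -H "Content-Type: application/json" \
      -d '{
            "question": "Who is the Classroom Pen 2 designed for?",
            "top": 1,
            "includeUnstructuredSources": true,
            "answerSpanRequest": { "enable": true }
          }'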


State-of-the-art deep-learning Transformer models: Ranking of question and answer pairs is powered by the Turing multilingual language model (T-ULRv2), a transformer-based model, which improves the precision of the service for all supported languages. For any user query, the new ranker model understands the semantics of the query better and returns better-aligned results.

Custom question answering supports all the other capabilities that we introduced with QnA Maker managed in preview, such as precise/short answer extraction from answer passages and the ability to add knowledge bases in multiple languages to the same service.

Steps to enable custom question answering

  • Users can navigate to the Azure portal and create an Azure Language Service. They should enable the custom question answering feature when creating the Language Service.

  • Users will be asked to fill in all the details related to the feature.
  • A new Language Service with the custom question answering feature is then deployed.
  • If users don’t enable custom question answering at service creation, they can enable or disable the feature later through the Features tab of the Language Service blade.

  • Users can also update the Azure Cognitive Search resource linked to the custom question answering feature via the Features tab.

Migrate from QnA Maker to custom question answering

Prebuilt question answering

 

Prebuilt question answering gives users the capability to answer questions over a passage of text without having to create knowledge bases or manage additional storage. Given a user query and a block of text or a passage, the API returns an answer and a precise answer (if available). Prebuilt question answering is available by default with a newly created Language resource. Users can test the prebuilt capability by trying out Answer questions in the Language Studio interface. We currently support prebuilt question answering in 87 languages.
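
As a minimal sketch, the prebuilt capability can be called over REST without creating any project; the :query-text route, the records list and the field names below are assumptions based on the 2021-10-01 GA API surface, so check the current reference documentation before relying on them.

    # Answer a question directly over an ad-hoc text passage (no knowledge base required).
    curl -X POST "https://{your-endpoint}.api.cognitive.microsoft.com/language/:query-text?api-version=2021-10-01" \
      -H "Ocp-Apim-Subscription-Key: {your-key}" \
      -H "Content-Type: application/json" \
      -d '{
            "question": "How many languages are supported?",
            "language": "en",
            "records": [
              { "id": "1", "text": "Prebuilt question answering is currently supported in 87 languages." }
            ]
          }'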


Pricing

 

Customers will be charged starting December 3, 2021. The pricing details are available on the Language Service pricing page. Users are charged per usage of the question answering APIs. With custom question answering, users will also incur the cost of Azure Cognitive Search as per the selected tier.



17 Comments

  • PravinAmbekar
    Copper Contributor

    Hello SahithiKatakam,

     

    Thank you for clarifying. I was able to create a project in the Language resource and have tested the related features, which are just amazing compared to the old QnA Maker (I can see much improvement in terms of ranking, long and short answer options, and suggested questions).

     

    Just one thing is bothering me: the bot I created with custom question answering needs to be customized, so I downloaded the source code from the portal and tried running it locally first. But, as with the old (V4) QnA Maker, here too we have to configure the knowledge base ID, endpoint key and host name, and I am unable to find the Knowledgebase ID that is required to configure and run the application.

     

    {
    "MicrosoftAppId": "xxxx",
    "MicrosoftAppPassword": "xxxx",
    "QnAKnowledgebaseId": "where_should_i_find_this_?",
    "QnAEndpointKey": "",
    "QnAEndpointHostName": "",
    "QnAServiceType": "",
    "EnablePreciseAnswer": "true",
    "DisplayPreciseAnswerOnly": "false"
    }

     

    This ID is clearly not available in Language Studio.

     

    Thank you anyway.

  • Hi PravinAmbekar 

     

    Please download the bot code (of the one you created via Language Studio). It has sample usage of the Question Answering features. I would suggest picking the SDK code, bot code and configuration parameters from there and integrating them into your customized bot.

    Please look for the new configuration parameters

    "EnablePreciseAnswer": "true"

    "DisplayPreciseAnswerOnly": "false"

    "QnAServiceType": "language"

    which control the usage of new features in the Custom Question Answering service.

     

    By default, answers from unstructured sources will be included in responses. However, QnAMakerOptions can be configured to exclude them if required; look for the parameter "includeUnstructuredSources".

    You can find examples of the new representation of strictFilters as MetadataFilters in the Filters parameter in the sample code.

     

    Note that you will have to copy the Microsoft.BotBuilder.AI.QnA SDK from this sample code and integrate it into your existing bot. (If the version 4.xx.x NuGet package was used, remove the reference to it and add this library to your bot code.)

     

    The official release of the Bot Framework SDK with support for CQA is scheduled for April, and the information above will also be available in the corresponding Git repos.

     

    Let us know if you have any questions.

    Thanks.

  • PravinAmbekar
    Copper Contributor

    Hello Disha_Agarwal,

    Excited to see all these new features. I am working on bot and related development and finding it difficult to implement and customize the bot source code for:

    • Using Adaptive Cards
    • Hosting it on a Linux environment
    • Enabling DL ASE for network isolation.

    I have customized the existing bot for all the above-mentioned features but couldn't get much help in the case of custom question answering.

    To start with, I have created a custom project in Language Studio, created the related Language service in the portal, and also created a Web App bot, but I need help with how to customize the code to support multilingual communication, introduce an intro card, and enable DL communication (C# SDK).

     

    It would be great if you could provide references to repos, samples or articles that could help in any way.

    Thank you.

     

     

     

  • Peter_Guhr
    Copper Contributor

    Hi,

    I'm looking for details on how to connect to the REST API for the Question Answering endpoint, including the Prebuilt Answer feature.
    For example, to ask questions and get answers using Postman in the same way I can with the Test option for a project inside Language Studio.

    The information I found around this topic is related to the "classic" QnA Maker or the qnamaker v5.0-preview versions.
    But for the new Question Answering feature as part of the Language Service, I didn't find information that worked for me.

     

    I tried to build my Postman query based on the details I extracted from using the mentioned Language Studio Test option.

    https://{Unique-to-your-endpoint}.api.cognitive.microsoft.com/language/:query-knowledgebases?projectName={projectname}&api-version=2021-10-01&deploymentName={deployment-environment}
    Plus the API key as a header parameter (parameter name: Ocp-Apim-Subscription-Key)
    And the request details as a JSON document inside the request body

    But each time I get a 404 error message.
    On the other hand, using curl in a similar way works.

    Can you provide a Postman sample that works, too?


    In addition, I want to ask: when will the Language Service (including Question Answering) API documentation be available as part of the REST API documentation (https://docs.microsoft.com/en-us/rest/api/)?
    I would expect it under:
    https://docs.microsoft.com/en-us/rest/api/cognitiveservices/languageservice/...

     

    But so far this site contains only information for the separate qnamaker endpoint
    --> https://docs.microsoft.com/en-us/rest/api/cognitiveservices-qnamaker/

     

    Regards,
    Peter