Recent Discussions
Using Adaptive Cards to display carousels in Health Bot
Can we attach an array of Adaptive Cards using the Dynamic Card to display as a carousel? I am trying to do something like the snippet below in a Dynamic Card, but it just displays empty cards in the carousel.

```javascript
(function () {
    var data = ${dataList};
    var cards = data.map(function (item) {
        return {
            type: "AdaptiveCard",
            $schema: "http://adaptivecards.io/schemas/adaptive-card.json",
            version: "1.3",
            body: [
                {
                    type: "Container",
                    isVisible: true,
                    items: [
                        {
                            type: "Image",
                            style: "person",
                            url: item.imageUrl,
                            altText: item.imageAltText,
                            size: "medium"
                        },
                        {
                            type: "TextBlock",
                            text: item.title,
                            weight: "bolder",
                            size: "medium",
                            wrap: true
                        }
                    ]
                },
                {
                    type: "ActionSet",
                    actions: [
                        {
                            type: "Action.OpenUrl",
                            title: "View Profile",
                            url: item.buttonUrl,
                            id: item.buttonId
                        }
                    ]
                }
            ]
        };
    });
    return cards;
})();
```

${dataList} is an array of objects with the properties imageUrl, imageAltText, title, buttonUrl and buttonId. I came across documentation on how to implement this in Copilot Studio - https://learn.microsoft.com/en-us/microsoft-copilot-studio/guidance/adaptive-card-display-carousels Do we have any documentation available for Health Bot as well? Appreciate any guidance! Thanks, Abhilash
Understanding Health Bot's Logging Custom Dimensions
Microsoft Health Bot has the ability to emit custom logging into a customer-supplied Application Insights resource using the customer's instrumentation key. See here for more details. There is also the ability to emit events programmatically inside the scenario's "action" step, using a session.logCustomEvent call (a minimal sketch follows the property list below). When this code runs, it emits a custom event that can be viewed using the Application Insights viewer. The event contains the payload of the specific call inside its Custom Dimensions property.

Event name: As specified in the session.logCustomEvent call.

Custom Dimensions properties:

- callstack: The full stack of the scenarios, in case there are scenarios that call other scenarios.
- channelId: Chat channel in which this event happened, e.g. Web Chat, Teams, IVR, etc.
- conv_id: Conversation id of this conversation session. This is a hashed value, but it allows the user to track the history of a specific session. Note, however, that the SALT value creating this hash is replaced daily.
- correlation_id: An internal id that uniquely identifies this API call as it goes through our APIM (API Management) resource. The customer can give this id to a support engineer to assist the service team in debugging.
- dialogName: Scenario id that emitted this event.
- eventId: Unique identifier of this logging event.
- locale: The client's locale while emitting this event. The client can switch locales mid-conversation.
- offerid: azurehealthbot
- payload: The custom payload (JSON format) passed by this specific call.
- region: The deployment region of this Health Bot.
- speaker: The speaker of this event; can be either "bot" or "user".
- stepid: Internal unique id of the "action" step that emitted the event. If you have several such action steps in one scenario, it can be a bit difficult to tell which one made the call. To solve this, select the scenario and press the "export" button in the toolbar. This downloads the scenario in JSON format, where you can locate the "action" step and retrieve its id field.
- stepType: The step type that emitted this event. Any step with a JavaScript editor can be used to emit the custom event.
- tenantId/tenantName: Unique name and id of you, the customer.
- user_id: Hash value of the end user emitting the event. You can track the events of a specific user throughout the conversation using this id. Note, however, that the SALT value creating this hash is replaced daily.
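Here is a minimal sketch of such a call, assuming session.logCustomEvent takes an event name and a JSON-serializable payload as described above; the event name and payload fields are hypothetical examples.

```javascript
// Minimal sketch of emitting a custom event from an "action" step.
// "MedicationReminderSent" and the payload fields are hypothetical examples;
// only the session.logCustomEvent call itself is described in the post above.
session.logCustomEvent("MedicationReminderSent", {
    reminderChannel: "sms",  // appears under the event's Custom Dimensions "payload"
    medicationCount: 2
});
```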
Getting Started with Healthbot using Customer Generated Sources
I am trying to get our customer-generated data from AI Search integrated with the Health Bot. Here are the steps:
1. Using OpenAI Studio, I uploaded a file to a new storage container, specified the AI Search instance, and created a new index. I validated that the index could be searched based on the context of the single uploaded file.
2. Using OpenAI Studio, I tested the chat function; it returned the right results and referenced the file.
3. When entering the Azure AI Search parameters in the data connection, I entered the index name and key, then chose Vector as the search type. I also verified that my index fields use contentVector.
The built scenario always falls back to Medline and never returns the customer data. Are there any recommendations on where to check? This used to work when using the import data wizard in AI Search, but even that index fails to return results in Health Bot while working just fine in AI Search.
Configuring WhatsApp channel for Microsoft Health Bot
Microsoft Health Bot leverages Twilio’s WhatsApp API to enable chatting with WhatsApp users. To enable this feature, you first need to set up a Twilio account and a WhatsApp Business account (WABA). In this guide, we will use the WhatsApp Sandbox environment to configure the channel.
1. Under Integration/Channels, enable the WhatsApp channel.
2. Paste in the phone number you obtained from Twilio. This phone number should be registered as a WhatsApp Business number.
3. Paste your Twilio Account SID and your Auth Token, taken from Twilio's account page.
4. Save these settings.
5. Re-open the WhatsApp channel and notice that the "Service URL endpoint" now appears. Copy this URL; you will need to paste it into the WhatsApp configuration in Twilio.
6. Navigate to the Messaging/Send WhatsApp Message page and paste the URL you copied from Health Bot into the Sandbox Configuration "When a message comes in" URL field. The method should be "POST".
7. Test it by sending the specified message on this page to be added to the Sandbox participants list.
From this point on, every message you send to this number will be routed to the Health Bot via the service URL. Any response from the Health Bot will be sent to your WhatsApp client.
Upcoming Backup Encryption Upgrade for Microsoft Healthcare Bot Users
To improve security, we will be upgrading our backup encryption system on August 1, 2024. With this upgrade, any Microsoft Healthcare Bot backup made before September 28, 2023 will not work. Important: if you haven't made any backups after September 28, 2023, please create a new backup before August 1, 2024. After this date, backups older than a year will no longer be usable for restoring. If you have made backups in the last year, no action is needed. Feel free to reach out if you have any questions.
Azure AI Health Bot – now supports Microsoft Entra Access Management
We are excited to announce the introduction of Microsoft Entra Access Management support in the Azure AI Health Bot. This enhancement increases security by leveraging the robust and proven capabilities of Microsoft Entra. Customers interested in this feature can opt in by navigating to the User Management page and enabling the Microsoft Entra Access Management feature. This feature can only be enabled by users who have the Health Bot Admin role in the Azure Access control (IAM) pane. When Microsoft Entra Access Management is enabled, all users and roles should be managed through the Azure Access control (IAM) pane, which now contains the same Azure AI Health Bot roles in Azure, such as Health Bot Admin, Health Bot Editor and Health Bot Reader. When the feature is enabled, the User Management page becomes read-only, and all users in the Management Portal will need to be manually added with the right roles through the Azure Access control (IAM) page in the Azure portal. You can read more on the Microsoft Entra Access Management features on our public documentation page.
value changes sheet
Hi everyone, I'm not very experienced with Office Scripts. I wrote the code below, but I would like the script to start when a value changes in another sheet; for now I can only activate it via a button. Thank you

```typescript
function main(workbook: ExcelScript.Workbook) {
    // Refresh all data connections
    workbook.refreshAllDataConnections();

    // Row height
    workbook.getWorksheet("Matrice").getRange("A4").getFormat().setRowHeight(200 * 3 / 4);
    workbook.getWorksheet("Matrice")
        .getRange("F5:F1000")
        .getEntireColumn()
        .getFormat()
        .setHorizontalAlignment(ExcelScript.HorizontalAlignment.left);
}
```
Azure Health Bot using Azure Container Instance (ACI) for enhanced security.
We are pleased to announce the latest security improvement we have introduced in our platform. Customers who use production-ready (Standard plan) instances will now enjoy the use of their own dedicated Azure Container Instance (ACI) that executes their JavaScript code in its own isolated container. All ACIs operate within a separate subnet in our AKS cluster's virtual network (VNet), and a unified network security group (NSG) is applied to all ACIs to prevent any outgoing communication. As part of this security enhancement, JavaScript functions you create will no longer be directly shared among steps in the scenario. To enable sharing, we have created a Global Action step you can utilize to hold JavaScript code that is sharable with every other step in the scenario, as sketched below. For more information about the available objects and JS packages, please visit our documentation. To learn more about ACI, visit Azure Container Instances | Microsoft Azure
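As an illustration, a shared helper placed in the Global Action step might look like the sketch below. The function name and its use are hypothetical; the only assumption taken from the post is that code in the Global Action step is visible to every other step in the scenario.

```javascript
// Hypothetical helper placed in the scenario's Global Action step.
// Per the post above, code in this step is shared with every other step.
function formatPatientName(firstName, lastName) {
    return lastName.toUpperCase() + ", " + firstName;
}

// Any later "action" step in the same scenario can then call
// formatPatientName("Ada", "Lovelace") directly.
```
Securely passing data from the customer's backend to the Azure Health Bot server side.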
When integrating with Azure Health Bot (AHB) in production-grade applications, you'll often need to write your own backend and frontend components. These components link your backend systems with AHB and transmit customer-specific data such as Web Chat tokens, end-user details, and other sensitive information required during the conversation flow. This data is encrypted and signed so that only your legitimate bot instance can decode and use it; AHB provides exactly this kind of mechanism. You can check out this sample code on GitHub, which includes both the backend and frontend sample components. Once a conversation is initiated on the client side of AHB, it prompts your backend to prepare and sign the necessary data for the conversation. server.js contains the code that initiates a conversation session from your application's backend. For example, the snippet below from the server.js file passes an 'age' variable within an optionalAttributes object, but you can add any attributes you'd like:

```javascript
// Add any additional attributes
response['optionalAttributes'] = { age: 33 };
```

The backend then creates a payload that is signed into a JWT and sent back to the client side. The client forwards this JWT to the AHB session upon initiating the conversation. When AHB receives this token, it decodes it, verifies it, and populates a conversation-scope variable called "initConversationEvent". To access this variable, refer to the example provided in a statement step. Please note that this data is passed at the beginning of the conversation, and it's your responsibility to handle token expiration if relevant.
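A minimal sketch of reading that variable inside a scenario "action" step follows. It assumes conversation-scoped variables are addressable through the conversation object and that the decoded payload keeps the optionalAttributes shape from the backend snippet above; both are assumptions rather than confirmed API details.

```javascript
// Hedged sketch: reading the decoded init payload in an "action" step.
// Assumes the conversation scope is exposed via the `conversation` object.
var initData = conversation.initConversationEvent;
if (initData && initData.optionalAttributes) {
    // e.g. the `age: 33` set by the backend sample above
    conversation.userAge = initData.optionalAttributes.age;
}
```
Send metadata from Health Bot server to Web Chat client via backchannel.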
You can send metadata attached to a message activity from the server side of Health Bot to the client side, in order to trigger various client events and set client properties. For example, suppose we would like to turn the chat input prompt into a "password" type when we pass the {secret: true} object from the bot. You can attach the metadata object to be sent as part of the message activity in the Prompt/Statement steps. On the Web Chat client side, modify the code that handles the message activity to identify this metadata passed in the "entities" property and act accordingly. In this example, set the type of the input prompt to "password" as shown below.

```javascript
else if (action.type === 'DIRECT_LINE/INCOMING_ACTIVITY') {
    const inputType = (action.payload &&
        action.payload.activity &&
        action.payload.activity.entities &&
        action.payload.activity.entities.find(e => e.secret === true)) ? "password" : "text";
    const input = document.getElementsByClassName("webchat__send-box-text-box__input")[0];
    if (input && input.type !== inputType) {
        input.type = inputType;
    }
}
```

See this sample file
Using Azure's Face API to detect faces from a locally uploaded image
Health Bot supports uploading files from local file systems. The files are temporarily stored on the Bot Framework's infrastructure, and access to them is only available to the holder of an exceedingly long URL. This opens up various use cases: for example, the patient can upload an image of a medical condition via the bot, and the bot can then use a cognitive service to detect the condition and answer the patient based on the uploaded image. In this example, we will use the Face API to detect faces in an image uploaded from the local file system. On the left you can see the scenario flow, in which we first prompt for a file upload; in response, the canvas displays the prompt text, the "paper clip" icon is ready to be clicked, and an image can be selected from the file system (or Camera Roll on a mobile device). Prompt the user for an image. Once the image is selected, we proceed to call the Face API using the "Data Connection" step. We pass the URL of the uploaded image, calling the face detection REST API with the subscription key copied from the Azure Cognitive Services resource we created.
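For reference, the call the Data Connection step performs is roughly equivalent to the Face API detect request sketched below. The endpoint region, subscription key, and the uploadedImageUrl variable are placeholders to substitute with your own values.

```javascript
// Hedged sketch of the Face API "detect" REST call made by the Data Connection step.
// Endpoint region, subscription key and uploadedImageUrl are placeholders.
const response = await fetch(
    "https://<your-region>.api.cognitive.microsoft.com/face/v1.0/detect",
    {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": "<your-subscription-key>"
        },
        // The long temporary URL returned by the file-upload prompt
        body: JSON.stringify({ url: uploadedImageUrl })
    }
);
const faces = await response.json(); // array of detected faces (faceId, faceRectangle, ...)
```
Automate Health Bot Deployment using Bicep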
Health Bot's Azure resource provider now supports retrieval of secrets using the ARM API. This allows the automated deployment of Health Bot and any other resources that rely on the various secrets that the bot resource exposes. In this example, we show how to deploy Health Bot and a web application that uses the container sample, wire up the secrets the app needs to communicate with the bot, and even upload a bot template that resides on a publicly accessible storage account via the data plane API, using a PowerShell deployment script. For the complete Bicep code and the restoreBot.ps1 file, see the attached bicep.zip file.

```bicep
param serviceName string
param linuxFxVersion string = 'node|14-lts'
param farmSKU string = 'F1'
param botSKU string = 'F0'
param location string = 'eastus'
param hbsRestoreFile string = 'https://xxxxxxxxx.blob.core.windows.net/templates/botTemplate.hbs'

resource healthbot 'Microsoft.HealthBot/healthBots@2022-08-08' = {
  name: '${serviceName}-bot'
  location: location
  properties: {
    keyVaultProperties: {
      keyName: 'string'
      keyVaultUri: 'string'
      keyVersion: 'string'
      userIdentity: 'string'
    }
  }
  sku: {
    name: botSKU
  }
}

resource farm 'Microsoft.Web/serverfarms@2022-03-01' = {
  location: location
  name: '${serviceName}-farm'
  kind: 'linux'
  sku: {
    name: farmSKU
  }
  properties: {
    reserved: true
  }
}

resource webapp 'Microsoft.Web/sites@2022-03-01' = {
  name: '${serviceName}-webapp'
  location: location
  properties: {
    serverFarmId: farm.id
    siteConfig: {
      linuxFxVersion: linuxFxVersion
      appSettings: [
        {
          name: 'PORT'
          value: '80'
        }
        {
          name: 'APP_SECRET'
          value: healthbot.listSecrets().secrets[0].value
        }
        {
          name: 'WEBCHAT_SECRET'
          value: healthbot.listSecrets().secrets[1].value
        }
      ]
    }
  }
}

resource srcControls 'Microsoft.Web/sites/sourcecontrols@2021-01-01' = {
  name: '${webapp.name}/web'
  properties: {
    repoUrl: 'https://github.com/microsoft/HealthBotContainerSample'
    branch: 'master'
    isManualIntegration: true
  }
}

resource importBotScript 'Microsoft.Resources/deploymentScripts@2020-10-01' = {
  name: '${serviceName}-RestoreBotScript'
  location: location
  kind: 'AzurePowerShell'
  properties: {
    azPowerShellVersion: '3.0'
    scriptContent: loadTextContent('restoreBot.ps1')
    retentionInterval: 'P1D'
    arguments: '-secret ${healthbot.listSecrets().secrets[2].value} -baseUrl ${healthbot.properties.botManagementPortalLink} -hbsRestoreFile ${hbsRestoreFile}'
  }
}
```
Calling Power Automate flows from within Health Bot scenario
Health Bot can call any REST API endpoint from within the scenario. We can leverage this capability to access any Power Automate flow by using the Data Connection step. This opens up an entire range of possibilities for accessing Power Apps services, such as Dataverse and others. First, define a flow that is triggered by "Request", as shown below. Once you save the flow, the "HTTP POST URL" is revealed. This is the endpoint, and it also includes the access token used to invoke the flow. We recommend removing the access token from the URL and placing it in Health Bot's "Data Connections" configuration, so it is stored in a more secure manner. Author the flow according to your needs; for example, you can use the "Dataverse" connector to access Dataverse data sources and tables. Return the data by using the "Response" step. In the Health Bot scenario, use the "Data Connection" step to invoke the flow endpoint (a sketch of the resulting request follows). For more details, see Calling Microsoft Flow from your application | Blog Power Automate
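The Data Connection step's call to the flow is effectively an HTTP POST like the sketch below. The workflow URL, sig access token, and payload fields are placeholders, assuming a flow with an HTTP request trigger and a Response action as described above.

```javascript
// Hedged sketch of the HTTP call the Data Connection step issues to the flow.
// The workflow URL, sig access token and payload fields are placeholders.
const flowUrl = "https://prod-00.westus.logic.azure.com/workflows/<workflow-id>" +
    "/triggers/manual/paths/invoke?api-version=2016-06-01&sig=<access-token>";
const flowResponse = await fetch(flowUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ patientId: "12345" }) // example input consumed by the flow
});
const data = await flowResponse.json(); // whatever the flow's "Response" action returns
```
Final call to migrate any Marketplace Health Bots to Azure Health Bot Offering!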
As was communicated previously, on 1.1.2021 we moved the Microsoft Healthcare Bot service from the Marketplace to Azure. All existing Health Bot instances still in the Marketplace must be migrated to the Azure Health Bot offering by 15 September 2022 to prevent them from being disabled. Migrating your Marketplace Health Bots using the Health Bot migration wizard takes a few minutes and requires no downtime. Learn more about the Health Bot migration wizard
Data connection using Server-to-server authentication
When calling a data source from the Data Connection step, authentication of some sort is usually required; only limited use cases allow you to get away with anonymous access. Health Bot supports OAuth2 server-to-server authentication out of the box. When calling the actual resource with a RESTful API, Health Bot first checks whether a valid token already exists in the global bot storage. If not, it obtains a new token using the "Authentication Provider" configuration, places it in the "Authorization" header, and proceeds with the call. Upon success, it stores the new token for consecutive calls for improved performance. Navigate to "Integrations/Authentication Providers", click the "New" button, and fill out the required fields: a unique name, an optional description, the authentication method (which should be "server-to-server"), the client id, the client secret, and the URL for obtaining the token. Note: the URL should use the v2.0 endpoint, which uses "scope" (not "resource" as in v1.0). It's recommended that you verify the settings by clicking the "verify settings" link. This creates the token and shows it as a decrypted JWT; verify that all the claims in the JWT are correct. When calling the resource in the Data Connection step, click on the "Authentication Providers" dropdown and select the provider you'd like. When the step runs, it takes care of the authentication flow for you; there is no need to define an "Authorization" header, since it will be added implicitly just before the call.
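For context, the token request Health Bot issues behind the scenes is the standard Microsoft identity platform v2.0 client-credentials call, roughly as sketched below; the tenant id, client credentials, and scope are placeholders for your own values.

```javascript
// Hedged sketch of the v2.0 client-credentials token request made on your behalf.
// Tenant id, client id/secret and scope are placeholders.
const params = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: "<client-id>",
    client_secret: "<client-secret>",
    scope: "https://<your-resource>/.default" // v2.0 uses "scope", not "resource"
});
const tokenResponse = await fetch(
    "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
    {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: params
    }
);
const { access_token } = await tokenResponse.json(); // then sent as the Authorization header
```
Nuance Mix NLU support added to Azure Health Bot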
Now you can add a Nuance Mix NLU service language model to Azure Health Bot, just like you could add a LUIS language model. Follow the steps to create your Mix application and language model, then copy the client authentication details from Mix and paste them into the fields as shown below. You can test the setup by trying an utterance right inside the portal. Once it works, you can map each intent from Mix to an appropriate scenario. During runtime, utterances are evaluated against all other language models, and the highest-rated model is triggered, with any associated entities passed to the scenario in the "scenarioArgs" variable.
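A minimal sketch of consuming those entities inside the triggered scenario follows. The access path and property names are assumptions; the post only states that associated entities are passed through the "scenarioArgs" variable.

```javascript
// Hedged sketch: reading NLU entities inside the triggered scenario's action step.
// The access path and property names below are assumed; only the "scenarioArgs"
// variable itself is confirmed by the post above.
var args = scenario.scenarioArgs;
if (args && args.entities) {
    conversation.requestedMedication = args.entities["MEDICATION_NAME"]; // hypothetical entity
}
```
Unable to integrate the healthbot in the application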
I am trying to integrate the Health Bot in a web application, following the GitHub repository provided at https://github.com/Microsoft/HealthBotContainerSample/ Unfortunately I am unable to do it: when I make the changes in the web.config file for the rewrite rules and set <webSocket enabled="false" />, my home page stops loading and I get the chromewebdata error. Please see the changes I have made in web.config:

```xml
<webSocket enabled="false" />
<validation validateIntegratedModeConfiguration="false" />
<modules>
  <remove name="ApplicationInsightsWebTracking" />
  <add name="ApplicationInsightsWebTracking" type="Microsoft.ApplicationInsights.Web.ApplicationInsightsHttpModule, Microsoft.AI.Web" preCondition="managedHandler" />
  <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" preCondition="managedHandler" />
  <add name="ErrorMail" type="Elmah.ErrorMailModule, Elmah" preCondition="managedHandler" />
  <add name="ErrorFilter" type="Elmah.ErrorFilterModule, Elmah" preCondition="managedHandler" />
</modules>
<handlers>
  <remove name="BotDetectCaptchaHandler" />
  <add name="BotDetectCaptchaHandler" preCondition="integratedMode" verb="GET" path="BotDetectCaptcha.ashx" type="BotDetect.Web.CaptchaHandler, BotDetect" />
  <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
  <remove name="OPTIONSVerbHandler" />
  <remove name="TRACEVerbHandler" />
  <add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="*" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
  <add name="iisnode" path="server.js" verb="*" modules="iisnode" />
</handlers>
<rewrite>
  <rules>
    <!-- Do not interfere with requests for node-inspector debugging -->
    <rule name="NodeInspector" patternSyntax="ECMAScript" stopProcessing="true">
      <match url="^server.js\/debug[\/]?" />
    </rule>
    <!-- All other URLs are mapped to the node.js site entry point -->
    <rule name="DynamicContent">
      <conditions>
        <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
      </conditions>
      <action type="Rewrite" url="/Scripts/server.js" />
    </rule>
  </rules>
</rewrite>
```
Health Bot Media Question
Hello! I have two questions when using the Health Bot.
1. I tried to use the getUserMedia method to access the user's camera inside the Health Bot as an action, but it raised the error "navigator is not defined". Is this because the Azure Health Bot can't access the getUserMedia API? If that's the reason, are there any other methods I can use to achieve this requirement? Is there any previous chatbot that has successfully accessed the user's camera and audio? The core code I tried to use in the action:

```javascript
// Request access to a video and audio input device from the user
let constraints = { video: true, audio: true };
let promise = navigator.mediaDevices.getUserMedia(constraints);
promise.then(function (mediaStream) {
    // Get the video element and play the video
    var video = document.querySelector("video");
    video.srcObject = mediaStream;
    video.onloadedmetadata = function (e) {
        video.play();
    };
});
// If the request fails
promise.catch(function (err) {
    console.log(err);
});
```

2. I tried to show back the video that the user uploaded to the bot as an attachment, but failed. I tried to use a Hero card, but I don't really understand how the card action "playVideo" works. Can I show the video inside a Hero card or some other card? What should the value be?