Recent Discussions
Understanding Health Bot's Logging Custom Dimensions
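The discussion below covers the dimensions attached to events emitted with session.logCustomEvent. As a rough sketch of such a call from an action step (the event name and payload fields are made-up examples, and the session stub exists only so the snippet runs outside the Health Bot runtime, which normally provides it):

```javascript
// Stub: inside a Health Bot action step the runtime provides `session`;
// this minimal stand-in (an assumption) just echoes the call so the
// snippet can run anywhere.
const session = {
  logCustomEvent: (eventName, payload) => ({ eventName, payload })
};

// Emit a custom event; the name and payload fields are illustrative.
const event = session.logCustomEvent("AppointmentBooked", {
  department: "cardiology",
  slotsOffered: 3
});
```

The payload object then shows up under the Custom Dimensions "payload" property of the event in Application Insights.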
Microsoft Health Bot has the ability to emit custom logging into a customer-supplied Application Insights resource using the customer's instrumentation key; see here for more details. There is also the ability to emit events programmatically inside a scenario's "action" step. When such a call runs, it emits a custom event that can be viewed using the Application Insights viewer. It contains the payload of that specific call inside the Custom Dimensions property of the event.

Event name: As specified in the session.logCustomEvent call.

Custom Dimensions properties:

callstack: The full stack of scenarios, in case there are scenarios that call other scenarios.
channelId: The chat channel in which this event happened, e.g. Web Chat, Teams, IVR, etc.
conv_id: Conversation Id of this conversation session. This is a hashed value, but it allows the user to track the history of a specific session. Note, however, that the SALT value creating this hash is replaced daily.
correlation_id: An internal id that uniquely identifies this API call as it goes through our APIM (API Management) resource. The customer can give this id to a support engineer to assist the service team in debugging.
dialogName: Scenario Id that emitted this event.
eventId: Unique identifier of this logging event.
locale: The client's locale while emitting this event. The client can switch locales mid-conversation.
offerid: azurehealthbot.
payload: The custom payload (JSON format) passed by this specific call.
region: The deployment region of this Health Bot.
speaker: The speaker of this event. Can be either "bot" or "user".
stepid: Internal unique id of the "action" step that emitted the event. If you have several such action steps in one scenario, it can be difficult to tell which one made the call. To solve this, you can select the scenario and press the "export" button in the toolbar.
This will download the scenario in JSON format, where you can locate the "action" step and retrieve its id field.
stepType: The step type that emitted this event. Any step with a JavaScript editor can be used to emit the custom event.
tenantId/tenantName: The unique name and id of you, the customer.
user_id: Hash value of the end user emitting the event. You can track the events of a specific user throughout the conversation using this id. Note, however, that the SALT value creating this hash is replaced daily.

Getting Started with Healthbot using Customer Generated Sources
I am trying to get our customer-generated data from AI Search integrated with the Healthbot. Here are the steps:
1. Using OpenAI Studio, I uploaded a file to a new storage container, specified the AI Search instance, and created a new index. I validated that the index could be searched based on the context of the single uploaded file.
2. Using OpenAI Studio, I tested the chat function; it returned the right results and referenced the file.
3. When entering the Azure AI Search parameters in the data connection, I entered the index name and key, then chose Vector as the search type. I also verified my index fields are using contentVector.
The built scenario always falls back to medline and never returns the customer data. Are there any recommendations on where to check? This used to work when using the import data wizard in AI Search, but even that index fails to return results in Healthbot while working just fine in AI Search.

Configuring WhatsApp channel for Microsoft Health Bot
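The walkthrough below asks you to paste three values from Twilio into the channel configuration. As a small, hedged sanity check before saving them (the format rules here are general Twilio conventions, Account SIDs begin with "AC" and phone numbers use E.164 form; they are not documented Health Bot requirements):

```javascript
// Sanity-check the three Twilio values the WhatsApp channel form asks for.
// Returns a list of problems; an empty list means the values look plausible.
function checkWhatsAppChannelSettings({ phoneNumber, accountSid, authToken }) {
  const problems = [];
  if (!/^\+\d{7,15}$/.test(phoneNumber)) {
    problems.push("phone number should be E.164, e.g. +14155238886");
  }
  if (!/^AC[0-9a-fA-F]{32}$/.test(accountSid)) {
    problems.push("Twilio Account SIDs are 'AC' followed by 32 hex characters");
  }
  if (typeof authToken !== "string" || authToken.length < 16) {
    problems.push("auth token looks too short to be valid");
  }
  return problems;
}
```

This only catches copy/paste mistakes; Health Bot will still validate the credentials against Twilio when the channel is saved.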
Microsoft Health Bot leverages Twilio's WhatsApp API to enable chatting with WhatsApp users. To enable this feature, you first need to set up a Twilio account and a WhatsApp Business account (WABA). In this guide, we will use the WhatsApp Sandbox environment to configure the channel.
1. Under Integration/Channels, enable the WhatsApp channel.
2. Paste in the phone number you obtained from Twilio. This phone number should be registered as a WhatsApp Business number.
3. Paste your Twilio Account SID and your Auth Token, taken from Twilio's account page.
4. Save these settings.
5. Re-open the WhatsApp channel and notice that the "Service URL endpoint" now appears. Copy this URL; you will need to paste it into the WhatsApp configuration in Twilio.
6. Navigate to the Messaging/Send WhatsApp Message page and paste the URL you copied from Health Bot into the "Sandbox Configuration when a Message comes in" URL. The method should be "POST".
7. Test it by sending the specified message on this page to be added to the Sandbox participants list.
From this point on, every message you send to this number will be routed to the Health Bot via the service URL. Any response from the Health Bot will be sent to your WhatsApp client.

Upcoming Backup Encryption Upgrade for Microsoft Healthcare Bot Users
To improve security, we will be upgrading our backup encryption system on August 1, 2024. With this upgrade, any Microsoft Healthcare Bot backup made before September 28, 2023 will not work.
Important: If you haven't made any backups after September 28, 2023, please create a new backup before August 1, 2024. After this date, backups older than a year will no longer be usable for restoring. If you have made backups in the last year, no action is needed. Feel free to reach out if you have any questions.

Azure AI Health Bot – now supports Microsoft Entra Access Management
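The announcement below moves role assignment into the Azure Access Control (IAM) pane. As a hedged sketch of what that can look like from the Azure CLI (the role names come from the post; the assignee and resource id are placeholders, and this assumes the generic `az role assignment create` command applies to the Health Bot resource scope):

```shell
# Assign the Health Bot Editor role to a user via Azure RBAC.
# <health-bot-resource-id> is a placeholder for your Azure AI Health Bot
# resource id; the user principal is illustrative.
az role assignment create \
  --role "Health Bot Editor" \
  --assignee user@contoso.com \
  --scope <health-bot-resource-id>
```

The same assignment can be made interactively through the Access Control (IAM) page in the Azure Portal, as the post describes.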
We are excited to announce the introduction of Microsoft Entra Access Management support in the Azure AI Health Bot. This enhancement increases security by leveraging the robust and proven capabilities of Microsoft Entra. Customers interested in this feature can opt in by navigating to the User Management page and enabling the Microsoft Entra Access Management feature. This feature can only be enabled by users who have the Health Bot Admin role in the Azure Access control identity-access-management (IAM) pane.
When Microsoft Entra Access Management is enabled, all users and roles should be managed through the Azure Access control (IAM) pane, which now contains the same Azure AI Health Bot roles in Azure: Health Bot Admin, Health Bot Editor and Health Bot Reader. While the feature is enabled, the User Management page in the Management Portal is read-only, and all users will need to be added with the right roles through the Azure Access Control (IAM) page in the Azure Portal.
You can read more on the Microsoft Entra Access Management features on our public documentation page.

Use Video Cards in Health Bot using Web Chat
There are several ways to display video content inside a Health Bot conversation using the WebChat client. One way is to use Adaptive Cards. But if you don't need all the layout features of an adaptive card, you can also use a much simpler "VideoCard" attachment as part of a prompt or a statement in your Health Bot scenario flow. Based on the video source URL, it will host YouTube or Vimeo video clips inside an iframe in the WebChat widget. Create a prompt or a statement step, click on adding a "Dynamic Card", and paste a code snippet such as this:

```javascript
(function () {
    return [
        {
            contentType: "application/vnd.microsoft.card.video",
            content: {
                media: [{ url: "https://youtu.be/rwe2291YT8Rxw?si=t5m40pOGou5XdFqm" }]
            }
        }
    ];
})()
```

This will generate an attachment JSON that will be added to the prompt/statement activity. Obviously, you can use an expression instead of the hard-coded URL link. When running the bot scenario in the WebChat widget, you will see the prompt hosting the native player with all its features.

value changes sheet
Hi everyone, I'm not very expert in Office Scripts. I wrote this code, but I would like the script to start when a value changes in another sheet; for now I can only activate it via a button. Thank you.

```typescript
function main(workbook: ExcelScript.Workbook) {
    // Refresh all data connections
    workbook.refreshAllDataConnections();
    // Row height
    workbook.getWorksheet("Matrice").getRange("A4").getFormat().setRowHeight(200 * 3 / 4);
    workbook.getWorksheet("Matrice").getRange("F5:F1000").getEntireColumn().getFormat()
        .setHorizontalAlignment(ExcelScript.HorizontalAlignment.left);
}
```

Failed fetching CLU intents, please try again.
I have created a CLU language model with intents, variables, etc. I have trained the language model and deployed it, and I was able to successfully test the deployment as well. I am now trying to attach this CLU model in Healthbot. I have given all the details necessary to add the language model to the bot, including the subscription key, deployment name, project name, etc. When I click "fetch CLU intents", I get 'failed fetching CLU intents, please try again'. I double-checked the configuration and it looks good, but I am not sure why I get this error. Could someone please help resolve this issue?

Accessing "Conversation", "User" or "Scenario" entire objects is not allowed
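The post below explains why conversation, user, and scenario are variable scopes rather than real objects, and describes the "pluck" workaround using an assign step. A minimal sketch of that pattern (the variable names and the stubbed scope objects are assumptions so the snippet runs outside the Health Bot runtime, which provides the scopes for you):

```javascript
// Stub the two scopes; inside a Health Bot assign step these are provided
// by the runtime, and only member access like conversation.welcomeMessage
// is allowed, never the bare `conversation` object itself.
const conversation = { welcomeMessage: "Hello and welcome!" };
const scenario = {};

// "Pluck" individual scoped variables into one plain object that can
// later be used freely, for example dumped to custom telemetry.
scenario.dumpObject = {
  welcomeMessage: conversation.welcomeMessage
};
```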
Health Bot uses three scopes of variables:

Conversation - Variables that store data throughout the entire conversation. They are removed when the conversation ends.
User - Variables that are stored in the context of the User Id that interacts with the bot. They are stored until the user asks to remove them. They let us retrieve user-specific data, like a birthdate, without asking for it each time the same user interacts with the bot.
Scenario - Variables that are stored in the context of the scenario and kept until the scenario ends. By analogy to a normal programming language, they are like function variables allocated on the stack. If scenario A calls scenario B, each of the scenarios can have the same variable "foo" and each scenario will hold a different value. This allows us to write contained and reusable scenarios without fear of "side effects".

The syntax for accessing each variable scope is as follows:

"This is a welcome message " + conversation.welcomeMessage

"conversation" is not a real object; it's just the scope of the welcomeMessage variable. Therefore, we don't allow accessing "conversation" as a plain JavaScript object, since it's really a reserved word. To access specific variables within the conversation scope, you can use the assign step and "pluck" the variables within the conversation scope into a new object. For example, we define a new object on the scenario scope called "dumpObject" and then "cherry pick" variables from the scenario or conversation scopes into it. We can later use this object whenever we like, for example to dump it onto custom telemetry.

Getting Response Body
Hi, I am trying to use the contents of an HTTP response body in my conversation flow. In the data connection element of the Health Bot, the response variable is named 'res'. The data connection element connects to a logic app where the user's credentials are verified, and then a welcome message with the user's name is displayed ("welcome {Firstname}"). I can see the message in the logic app's response body, but how do I access this in my bot?

Azure Health Bot using Azure Container Instance (ACI) for enhanced security.
We are pleased to announce the latest security improvement we have introduced in our platform. Customers who use production-ready (Standard plan) instances will now enjoy their own dedicated Azure Container Instance (ACI) that executes their JavaScript code in its own isolated container. All ACIs operate within a separate subnet in our AKS cluster's virtual network (VNet), and a unified network security group (NSG) is applied to all ACIs to prevent any outgoing communication.
As part of this security enhancement, JavaScript functions you create will not be directly shared among steps in the scenario. To enable sharing, we have created a Global Action step you can utilize to hold JavaScript code that is shareable with every other step in the scenario. For more information about the available objects and JS packages, please visit our documentation. To learn more about ACI, visit Azure Container Instances | Microsoft Azure.

Securely passing data from the customer's backend to the Azure Health Bot server side.
When integrating with Azure Health Bot (AHB) in production-grade applications, you'll often need to write your own backend and frontend components. These components link your backend systems with AHB and transmit customer-specific data like Web Chat tokens, end-user details, and other sensitive information required during the conversation flow. This data is encrypted and signed so that only your legitimate bot instance can decode and use it. AHB provides exactly this kind of mechanism. You can check out this sample code on GitHub, which includes both the backend and frontend sample components.
Once a conversation is initiated on the client side of AHB, it prompts your backend to prepare and sign the necessary data for the conversation. server.js contains the code that initiates a conversation session from your application's backend. For example, the snippet below from the server.js file passes an 'age' variable within an optionalAttributes object, but you can add any attributes you'd like:

```javascript
// Add any additional attributes
response['optionalAttributes'] = { age: 33 };
```

The backend then creates a payload that is signed into a JWT and sent back to the client side. The client forwards this JWT to the AHB session upon initiating the conversation. When AHB receives this token, it decodes it, verifies it, and populates a conversation scope variable called "initConversationEvent". To access this variable, refer to the example provided in a statement step. Please note that this data is passed at the beginning of the conversation, and it's your responsibility to handle token expiration if relevant.

Send metadata from Health Bot server to Web Chat client via backchannel.
You can send metadata attached to a message activity from the server side of Health Bot to the client side, in order to trigger various client events and set client properties. For example, suppose we would like to turn the chat input prompt into a "password" type input when we pass the {secret: true} object from the bot. You can attach the metadata object that will be sent as part of the message activity in the Prompt/Statement steps. On the Web Chat client side, modify the code that handles the message activity to identify this metadata passed in the "entities" property and act accordingly. In this example, set the type of the input prompt to "password" as shown below.

```javascript
else if (action.type === 'DIRECT_LINE/INCOMING_ACTIVITY') {
    const inputType =
        (action.payload &&
            action.payload.activity &&
            action.payload.activity.entities &&
            action.payload.activity.entities.find(e => e.secret === true))
            ? "password"
            : "text";
    const input = document.getElementsByClassName("webchat__send-box-text-box__input")[0];
    if (input && input.type !== inputType) {
        input.type = inputType;
    }
}
```

See this sample file.

Using Azure's Face API to detect faces from a locally uploaded image
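The post below calls the Face API from a "Data Connection" step. As a hedged sketch of the request that step ends up making (the /face/v1.0/detect path and the Ocp-Apim-Subscription-Key header follow the public Face API documentation; the endpoint, key, and image URL values are placeholders):

```javascript
// Build the Face API "detect" request for an image that was uploaded to
// the bot and is reachable via its temporary URL.
function buildFaceDetectRequest(imageUrl, endpoint, subscriptionKey) {
  return {
    method: "POST",
    url: `${endpoint}/face/v1.0/detect`,
    headers: {
      "Ocp-Apim-Subscription-Key": subscriptionKey,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({ url: imageUrl })
  };
}

const request = buildFaceDetectRequest(
  "https://example.invalid/uploaded-image.jpg", // placeholder upload URL
  "https://eastus.api.cognitive.microsoft.com", // placeholder endpoint
  "<subscription-key>"
);
```

In the Data Connection step you fill in the same URL, header, and body fields through the configuration UI rather than in code.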
Health Bot supports uploading files from local file systems. The files are temporarily stored on the Bot Framework's infrastructure, and access to them is only available to the holder of an exceedingly long URL. This opens up various use cases. For example, the patient can upload an image of a medical condition via the bot, and the bot can then use a cognitive service to detect the condition and answer the patient based on the uploaded image.
In this example, we will use the Face API to detect faces in an image uploaded from the local file system. On the left you can see the scenario flow, in which we first prompt for a file upload. In response, the canvas displays the prompt text, and the "paper clip" icon is ready to be clicked so an image can be selected from the file system (or Camera Roll on a mobile device).
Prompt the user for an image. Once the image is selected, we proceed to call the Face API using the "Data Connection" step. We pass the URL of the uploaded image, call the REST API for face detection, and use the subscription key copied from the Azure Cognitive Services resource I have created.

Automate Health Bot Deployment using Bicep
The Health Bot Azure resource provider now supports retrieval of secrets using the ARM API. This allows the automated deployment of Health Bot together with any other resources that rely on the various secrets the bot resource exposes. In this example, we show how to deploy a Health Bot and a Web Application that uses the Container sample, wire up the secrets the app needs to communicate with the bot, and even upload a bot template that resides on a publicly accessible storage account via the Data plane API using a PowerShell deployment script. For the complete Bicep code and the restoreBot.ps1 file, see the attached bicep.zip file.

```bicep
param serviceName string
param linuxFxVersion string = 'node|14-lts'
param farmSKU string = 'F1'
param botSKU string = 'F0'
param location string = 'eastus'
param hbsRestoreFile string = 'https://xxxxxxxxx.blob.core.windows.net/templates/botTemplate.hbs'

resource healthbot 'Microsoft.HealthBot/healthBots@2022-08-08' = {
  name: '${serviceName}-bot'
  location: location
  properties: {
    keyVaultProperties: {
      keyName: 'string'
      keyVaultUri: 'string'
      keyVersion: 'string'
      userIdentity: 'string'
    }
  }
  sku: {
    name: botSKU
  }
}

resource farm 'Microsoft.Web/serverfarms@2022-03-01' = {
  location: location
  name: '${serviceName}-farm'
  kind: 'linux'
  sku: {
    name: farmSKU
  }
  properties: {
    reserved: true
  }
}

resource webapp 'Microsoft.Web/sites@2022-03-01' = {
  name: '${serviceName}-webapp'
  location: location
  properties: {
    serverFarmId: farm.id
    siteConfig: {
      linuxFxVersion: linuxFxVersion
      appSettings: [
        {
          name: 'PORT'
          value: '80'
        }
        {
          name: 'APP_SECRET'
          value: healthbot.listSecrets().secrets[0].value
        }
        {
          name: 'WEBCHAT_SECRET'
          value: healthbot.listSecrets().secrets[1].value
        }
      ]
    }
  }
}

resource srcControls 'Microsoft.Web/sites/sourcecontrols@2021-01-01' = {
  name: '${webapp.name}/web'
  properties: {
    repoUrl: 'https://github.com/microsoft/HealthBotContainerSample'
    branch: 'master'
    isManualIntegration: true
  }
}

resource importBotScript 'Microsoft.Resources/deploymentScripts@2020-10-01' = {
  name: '${serviceName}-RestoreBotScript'
  location: location
  kind: 'AzurePowerShell'
  properties: {
    azPowerShellVersion: '3.0'
    scriptContent: loadTextContent('restoreBot.ps1')
    retentionInterval: 'P1D'
    arguments: '-secret ${healthbot.listSecrets().secrets[2].value} -baseUrl ${healthbot.properties.botManagementPortalLink} -hbsRestoreFile ${hbsRestoreFile}'
  }
}
```

Calling Power Automate flows from within Health Bot scenario
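The post below recommends stripping the access token from the flow's HTTP POST URL and keeping it in the Data Connections configuration instead. A hedged sketch of re-attaching it at call time (the sig query parameter is how Power Automate request-trigger URLs carry their token; the URLs and helper name here are fabricated for illustration):

```javascript
// Re-attach the flow's access token, stored separately in the Data
// Connections configuration, to the base trigger URL before invoking it.
function buildFlowUrl(baseUrl, accessToken) {
  const url = new URL(baseUrl);
  url.searchParams.set("sig", accessToken);
  return url.toString();
}

const flowUrl = buildFlowUrl(
  "https://prod-00.westus.logic.azure.com/workflows/abc123/triggers/manual/paths/invoke?api-version=2016-06-01",
  "<access-token-from-data-connections>"
);
```

This way the token never appears in the scenario definition itself, only in the more protected Data Connections settings.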
Health Bot can call any REST API endpoint from within a scenario. We can leverage this capability to access any Power Automate flow by using the Data Connection step. This opens up an entire range of possibilities for accessing Power Apps services, such as Dataverse and others.
First, define the flow that is triggered by "Request" as shown below. Once you save the flow, the HTTP POST URL is revealed. This is the endpoint, and it also includes the access token needed to invoke the flow. We recommend removing the access token from the URL and placing it in Health Bot's "Data Connections" configuration so it is stored in a more secure manner.
Author the flow according to your needs. For example, you can use the "Dataverse" connector to access Dataverse data sources and tables. Return the data by using the "Response" step. In the Health Bot scenario, use the "Data Connection" step and invoke the flow endpoint.
For more details, see Calling Microsoft Flow from your application | Blog Power Automate.

Introducing the new look of the Health Bot management portal
In the next few weeks, a new version of the Health Bot management portal will be rolled out. The new version has an improved UI and enhanced accessibility. The transition will be seamless and does not require any action from your end. As always, we are happy to hear your feedback; please use the smiley icon at the top-right side of the frame to submit it.
What is new?
New stylish look and feel.
The scenario editor tool is now keyboard accessible: Users can smartly navigate the scenario flow using the Arrow and Tab keys. All the actions applicable to the scenario steps are keyboard accessible as well. Adding and deleting edges is also accessible using the Enter and Delete keys. Users can even move steps around the canvas using the Shift+Arrow keys.
Smooth navigation to and from the Scenario Editor tool: Open the editor by clicking the "Edit steps" button, and navigate back to scenario management using the "Back arrow". Using the navigation bar on the left, any page in the portal is one click away, even from within the editor.
Search & Filter: Every page containing a table or list also includes a Search & Filter component on the right side of the page.
And many more experience enhancements.
Useful tips to boost your experience:
To add a new element at a specific location on the canvas, right-click with the mouse and the Add menu will pop up; each of the 3 categories will expand to the list of elements.
To easily open a sub-scenario from within a parent scenario, click the "Open scenario" button in the actions menu, and the sub-scenario will open in a new tab.
To select multiple elements and edges, press the Shift key and select the desired elements with your mouse. You can move them around, and you can also delete them by using the Delete key.
To restart the WebChat conversation, use the new Start-over button next to the language picker.
Use your mouse roller to zoom the canvas in and out.

Final call to migrate any Marketplace Health Bots to Azure Health Bot Offering!
As was communicated previously, on January 1, 2021 we moved the Microsoft Healthcare Bot service from the Marketplace to Azure. All existing Health Bot instances still in the Marketplace must be migrated to the Azure Health Bot offering by 15 September 2022 to prevent them from being disabled. Migrating your Marketplace Health Bots using the Health Bot migration wizard takes a few minutes and requires no downtime. Learn more about the Health Bot migration wizard.