Recent Discussions
Introducing the new look of the Health Bot management portal
In the next few weeks, a new version of the Health Bot management portal will be rolled out. The new version has an improved UI and enhanced accessibility. The transition will be seamless and does not require any action on your end. As always, we are happy to hear your feedback; please use the smiley icon at the top-right of the frame to submit it.

What is new?

- New stylish look and feel.
- The scenario editor tool is now keyboard accessible: users can navigate the scenario flow using the Arrow and Tab keys, and all actions applicable to scenario steps are keyboard accessible as well. Edges can be added and deleted with the Enter and Delete keys, and steps can be moved around the canvas with the Shift+Arrow keys.
- Smooth navigation to and from the Scenario Editor tool: open the editor by clicking the "Edit steps" button, and navigate back to scenario management using the back arrow. Using the navigation bar on the left, any page in the portal is one click away, even from within the editor.
- Search & Filter: every page containing a table or list also includes a Search & Filter component on the right side of the page.
- And many more experience enhancements.

Useful tips to boost your experience:

- To add a new element at a specific location on the canvas, right-click and the Add menu will pop up; each of the three categories expands into its list of elements.
- To easily open a sub-scenario from within a parent scenario, click the "Open scenario" button in the actions menu, and the sub-scenario will open in a new tab.
- To select multiple elements and edges, hold the Shift key and select the desired elements with your mouse. You can move them around, and you can also delete them with the Delete key.
- To restart the WebChat conversation, use the new Start-over button next to the language picker.
- Use your mouse scroll wheel to zoom the canvas in and out.
Use Video Cards in Health Bot using Web Chat
There are several ways to display video content inside a Health Bot conversation using the WebChat client. One way is to use Adaptive Cards, but if you don't need all the layout features of an adaptive card, you can also use a much simpler "VideoCard" attachment as part of a prompt or a statement in your Health Bot scenario flow. Based on the video source URL, it will host YouTube or Vimeo video clips inside an iframe in the WebChat widget. Create a prompt or a statement step, click to add a "Dynamic Card", and paste a code snippet such as this:

    (function() {
        return [
            {
                contentType: "application/vnd.microsoft.card.video",
                content: {
                    media: [{ url: "https://youtu.be/rwe2291YT8Rxw?si=t5m40pOGou5XdFqm" }]
                }
            }
        ];
    })()

This will generate an attachment JSON that is added to the prompt/statement activity. You can, of course, use an expression instead of the hard-coded URL. When running the bot scenario, you will see the prompt in the WebChat widget hosting the native player with all its features.
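For example, if the video URL has already been stored in a scenario variable, the dynamic card can reference it instead of a literal string. A minimal sketch, assuming a hypothetical scenario.videoUrl variable set earlier in the flow:

    (function() {
        // scenario.videoUrl is a hypothetical scenario-scope variable assumed to
        // have been set by an earlier step, e.g. to a YouTube or Vimeo link.
        return [
            {
                contentType: "application/vnd.microsoft.card.video",
                content: {
                    media: [{ url: scenario.videoUrl }]
                }
            }
        ];
    })()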
Accessing "Conversation", "User" or "Scenario" entire objects is not allowed
Health Bot uses three scopes of variables:

- Conversation - variables that store data throughout the entire conversation. They are removed when the conversation ends.
- User - variables stored in the context of the User Id that interacts with the bot. They are kept until the user asks to remove them, so the bot can retrieve user-specific data, such as a birthdate, without asking for it each time the same user interacts with the bot.
- Scenario - variables stored in the context of the scenario and kept until the scenario ends. By analogy to a normal programming language, they are like function variables allocated on the stack. If scenario A calls scenario B, each scenario can have its own variable "foo", and each will hold a different value. This allows us to write contained and reusable scenarios without fear of "side effects".

The syntax for accessing each variable scope is as follows:

    "This is a welcome message " + conversation.welcomeMessage

"conversation" is not a real object; it is just the scope of the welcomeMessage variable. Therefore, accessing "conversation" as a plain JavaScript object is not allowed, since it is really a reserved word. To access specific variables within the conversation scope, you can use the assign step and "pluck" the variables within the conversation scope into a new object. For example, we define a new object on the scenario scope called "dumpObject" and then "cherry pick" variables from the scenario or conversation scopes. We can later use this object wherever we like, for example to dump it onto custom telemetry.
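A minimal sketch of what the value expression of such an assign step could look like, following the same dynamic-expression pattern used elsewhere in these posts; scenario.dumpObject, conversation.welcomeMessage and user.birthdate are hypothetical variable names assumed to have been set earlier:

    (function() {
        // Cherry-pick individual variables from the conversation and user scopes
        // into one plain object (assigned to the hypothetical scenario.dumpObject).
        return {
            welcomeMessage: conversation.welcomeMessage,
            birthdate: user.birthdate
        };
    })()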
Azure AI Health Bot – now supports Microsoft Entra Access Management
We are excited to announce the introduction of Microsoft Entra Access Management support in the Azure AI Health Bot. This enhancement increases security by leveraging the robust and proven capabilities of Microsoft Entra. Customers interested in this feature can opt in by navigating to the User Management page and enabling the Microsoft Entra Access Management feature. The feature can only be enabled by users who have the Health Bot Admin role in the Azure Access Control (IAM) pane. When Microsoft Entra Access Management is enabled, all users and roles should be managed through the Azure Access Control (IAM) pane, which now contains the same Azure AI Health Bot roles in Azure: Health Bot Admin, Health Bot Editor and Health Bot Reader. When the feature is enabled, the User Management page becomes read-only, and all users of the Management Portal will need to be added manually with the right roles through the Azure Access Control (IAM) page in the Azure Portal. You can read more about the Microsoft Entra Access Management features on our public documentation page.
Using Azure's Face API to detect faces from a locally uploaded image
Health Bot supports uploading files from local file systems. The files are temporarily stored on the Bot Framework's infrastructure, and access to them is only available to the holder of an exceedingly long URL. This opens up various use cases: for example, the patient can upload an image of a medical condition via the bot, and the bot can then use a cognitive service to detect the condition and answer the patient based on the uploaded image. In this example, we will use the Face API to detect faces in an image uploaded from the local file system. On the left you can see the scenario flow, in which we first prompt for a file upload; in response, the canvas displays the prompt text, the "paper clip" icon is ready to be clicked, and an image can be selected from the file system (or the Camera Roll on a mobile device). Prompt the user for an image. Once the image is selected, we proceed to call the Face API using the "Data Connection" step. We pass the URL of the uploaded image, call the REST API for face detection, and use the subscription key copied from the Azure Cognitive Services resource we created.
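For reference, a rough sketch of the REST call the Data Connection step is configured to perform; the endpoint host and key are placeholders for your own Cognitive Services resource, not values from this post:

    // Sketch of the Face API detect call, assuming a Cognitive Services Face
    // resource. FACE_ENDPOINT and faceKey are placeholders you supply.
    const FACE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com";

    async function detectFaces(imageUrl, faceKey) {
        const response = await fetch(`${FACE_ENDPOINT}/face/v1.0/detect`, {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                "Ocp-Apim-Subscription-Key": faceKey
            },
            // The uploaded image is passed by URL, just as the scenario passes
            // the temporary Bot Framework attachment URL.
            body: JSON.stringify({ url: imageUrl })
        });
        return response.json(); // array of detected faces
    }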
Calling Power Automate flows from within Health Bot scenario
Health Bot can call any REST API endpoint from within a scenario. We can leverage this capability to access any Power Automate flow by using the Data Connection step. This opens an entire range of possibilities for accessing Power Apps services, such as Dataverse and others.

1. Define a flow that is triggered by "Request", as shown below. Once you save the flow, the HTTP POST URL is revealed. This is the endpoint, and it also includes the access token used to invoke the flow. We recommend removing the access token from the URL and placing it in Health Bot's "Data Connections" configuration so it is stored in a more secure manner.
2. Author the flow according to your needs. For example, you can use the "Dataverse" connector to access Dataverse data sources and tables.
3. Return the data by using the "Response" step.
4. In the Health Bot scenario, use the "Data Connection" step and invoke the flow endpoint.

For more details, see Calling Microsoft Flow from your application | Power Automate Blog.
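To make the last step concrete, here is a rough sketch of the HTTP call the Data Connection step issues against the flow's "Request" trigger; flowUrl and the payload shape are placeholders for your own flow:

    // flowUrl is the HTTP POST URL copied from Power Automate (with the access
    // token kept in the Data Connections configuration rather than hard-coded,
    // as recommended above).
    async function invokeFlow(flowUrl, payload) {
        const response = await fetch(flowUrl, {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(payload) // must match the Request trigger's JSON schema
        });
        return response.json(); // body returned by the flow's "Response" step
    }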
Microsoft Healthcare Bot is now Generally Available!
Today we announced the general availability of the Microsoft Healthcare Bot in the Azure Marketplace. Read all about it in the blog post by Hadas Bitran! "The close collaboration with our preview partners, including Premera Blue Cross, Quest Diagnostics, and Advocate Aurora Health, helped identify diverse use cases that address the needs and expectations of healthcare organizations. We now have a better understanding of what's important to our partners, and how to evolve the product by focusing on key differentiating features. For example, we realized the importance of enabling a visual design environment that allows review of the flows by clinical personnel and domain experts who are non-developers. We also evolved our scenario templates catalog and provided a gallery of example use cases to start from, which allows our partners to develop their bots quickly and inexpensively."
Healthcare Bot Maintenance Notification - Service Resumed
The Healthcare Bot service maintenance work has been completed and normal service has now resumed. This maintenance work was part of our commitment to improved availability and performance. Thank you for your patience, and we apologize again for any inconvenience.
Entity recognition from LUIS.AI - Microsoft HealthBot preview
Hola Amigos, hope the MS suite is keeping us all really busy :) We need your help. We are exploring the new Microsoft HealthBot preview and trying to create custom scenarios. We are able to create a scenario and attach it to the NLU model we designed on LUIS.AI. But while executing the scenario from the chatbot interface, we are only able to trigger the intents from the NLU; we are not able to identify the entities. Unless we find which entity the user input is triggering, we can't follow up with the conversational flow. Could someone help us identify the entity names as well, along with the intent, from LUIS? Great thanks in advance. Best wishes, Bhanu Kandregula
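For context, the entity names the question refers to are returned by LUIS alongside the top-scoring intent. A rough sketch of the shape of a LUIS v2 prediction response, with hypothetical intent and entity values:

    // Hypothetical example of a LUIS v2 prediction response; the entity names
    // and values appear in the "entities" array next to the top-scoring intent.
    const luisResponse = {
        query: "book a flu shot for tomorrow",
        topScoringIntent: { intent: "BookAppointment", score: 0.97 },
        entities: [
            { entity: "flu shot", type: "AppointmentType", startIndex: 7, endIndex: 14, score: 0.85 },
            { entity: "tomorrow", type: "builtin.datetimeV2.date", startIndex: 20, endIndex: 27 }
        ]
    };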
Introducing the "Alexa" channel
Healthcare Bot now supports the "Alexa" channel. To configure it, navigate to the "Integration/Channels" tab and turn on the Alexa channel. You will be prompted to provide the Alexa skill ID, and when you click the "Save" button for the first time, you will be provided with the "Service Endpoint URL". You will use this URL in the Amazon Alexa Developer portal, as described here. Now, each time you use the Alexa skill invocation phrase, the Healthcare Bot will answer.
"Welcome Scenario" to enable automatic "call to action" in Healthcare Bot
Previously, only "Welcome messages" were supported, in which markdown could be used to convey a welcome message on the initial interaction with the Healthcare Bot. We received numerous requests from our customers to trigger more elaborate use cases, where a "call to action" could be expressed and the conversation could be started without the user typing "free text" utterances. So, the "Automatic Welcome scenario" was introduced. However, there is one major limitation to the "Welcome scenario": it does not support a multi-turn conversation. This means you cannot use prompts that ask the user for a value and then use that value in the same scenario; the bot simply returns to the root of the conversation after the welcome scenario has run. The workaround, if you need to prompt the user for an action, is to show a statement with a Hero Card or an Adaptive Card that displays options to click. Clicking an action button will "postBack" whatever value you have set in the card, which triggers another scenario that can be multi-turn. In the sample statement below, we show a Hero Card with two options; selecting one of them triggers another scenario, which can be multi-turn. To assign a "Welcome Scenario", navigate to the "Configuration/Conversation" tab and scroll down to "Automatic Welcome Scenario". Select the welcome scenario you just created from the dropdown. Note: the welcome scenario will override the "Automatic welcome message", if one exists. Now, each time the user opens the bot client, they will be greeted with the welcome scenario. Selecting one of the options will begin another scenario, as defined in the Hero Card shown below.
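A minimal sketch of such a Hero Card statement, following the same dynamic-card pattern as the video card example above; the card text, button titles and postBack values are hypothetical and should match the utterances or scenarios you have defined:

    (function() {
        return [
            {
                contentType: "application/vnd.microsoft.card.hero",
                content: {
                    text: "How can I help you today?",
                    buttons: [
                        // Each postBack value is a hypothetical utterance expected
                        // to trigger another (multi-turn) scenario.
                        { type: "postBack", title: "Check symptoms", value: "check symptoms" },
                        { type: "postBack", title: "Book an appointment", value: "book appointment" }
                    ]
                }
            }
        ];
    })()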