Analytics
Closing the loop: Interactive write-back from Power BI to Azure Databricks
This is a collaborative post from Microsoft and Databricks. We thank Toussaint Webb, Product Manager at Databricks, for his contributions.

We're excited to announce that the Azure Databricks connector for Power Platform is now Generally Available. With this integration, organizations can seamlessly build Power Apps, Power Automate flows, and Copilot Studio agents with secure, governed data and no data duplication. A key capability unlocked by this connector is the ability to write data back from Power BI to Azure Databricks.

Many organizations want to not only analyze data but also act on insights quickly and efficiently. Power BI users, in particular, have been seeking a straightforward way to "close the loop" by writing data back from Power BI into Azure Databricks. That capability is now here: real-time updates and streamlined operational workflows with the new Azure Databricks connector for Power Platform. With this connector, users can now read from and write to Azure Databricks data warehouses in real time, all from within familiar interfaces — no custom connectors, no data duplication, and no loss of governance.

How It Works: Write-backs from Power BI through Power Apps

Enabling write-backs from Power BI to Azure Databricks is seamless. Follow these steps:

1. Open Power Apps and create a connection to Azure Databricks (documentation).
2. In Power BI (Desktop or the service), add a Power Apps visual to your report (the purple Power Apps icon).
3. Add data to connect to your Power App via the visualization pane.
4. Create a new Power App directly from the Power BI interface, or choose an existing app to embed.
5. Start writing records to Azure Databricks!

With this integration, users can make real-time updates directly within Power BI using the embedded Power App, instantly writing changes back to Azure Databricks. Think of all the workflows this can unlock, such as warehouse managers monitoring performance and flagging issues on the spot, or store owners reviewing and adjusting inventory levels as needed. The seamless connection between Azure Databricks, Power Apps, and Power BI lets you close the loop on critical processes by uniting reporting and action in one place.

Try It Out: Get started with the Azure Databricks Power Platform Connector

The Power Platform Connector is now Generally Available for all Azure Databricks customers. Explore more in the deep dive blog here, and to get started, check out our technical documentation. Coming soon, we will add the ability to execute existing Azure Databricks Jobs via Power Automate. If your organization is looking for an even more customizable end-to-end solution, check out Databricks Apps in Azure Databricks! No extra services or licenses are required.

Supercharge Data Intelligence: Build Teams App with Azure Databricks Genie & Azure AI Agent Service
Introduction

Are you looking to unlock the full potential of your data investments in Azure Databricks while seamlessly harnessing the capabilities of Azure AI? At Microsoft BUILD 2025, we made the announcement of the Azure Databricks connector in Azure AI Foundry. This blog post is a follow-up showing how to take advantage of this feature within Microsoft Teams. We'll guide you through leveraging the integration between Azure Databricks and Azure AI Foundry to build a Python-based Teams app that consumes Genie APIs using the secure On-Behalf-Of (OBO) authentication flow. Whether you are a data engineer, AI developer, or a business user seeking actionable insights, this guide will help you accelerate your journey in data intelligence with modern, secure, and streamlined tools.

You can find the code samples here: AI-Foundry-Connections - Teams chat with Azure Databricks Genie | GitHub (git clone https://github.com/Azure-Samples/AI-Foundry-Connections.git)

Setting the Stage: Key Components

Before we dive into the sample app, let's quickly establish what each major component brings to the table.

What is AI/BI Genie?
AI/BI Genie is an intelligent agent feature in Azure Databricks, exposing natural language APIs to interact with your data. It empowers users to query, analyze, and visualize data using conversational AI, making advanced analytics accessible to everyone.

Azure AI Foundry
Azure AI Foundry is your control center for building, managing, and deploying AI solutions. It offers a unified platform to orchestrate AI agents, connect to various data sources like Azure Databricks, and manage models, workflows, and governance.

Azure AI Agent Service
Azure AI Agents are modular, reusable, and secure components that interact with data and AI services. They enable you to build multi-agent workflows, integrate with enterprise systems, and deliver contextual, actionable insights to your end users.

The Sample Solution: Python Teams App for Genie APIs

To help you get hands-on, this blog features a sample Teams app that demonstrates how to:

- Connect Teams to Azure Databricks Genie via Azure AI Foundry using OBO (On-Behalf-Of) authentication.
- Query and visualize data using Genie APIs and LLMs in a secure, governed manner.
- Build and run the app locally using DevTunnel and easily extend it for production deployments.

Purpose: The sample app is designed as a learning tool, enabling developers to explore the integration and build their own enterprise-grade solutions. It shows you how to wire up authentication, connect to Genie, and ask data-driven questions—all from within Teams.

Architecture Overview

Below is a logical architecture that illustrates the flow:

Key Steps from the Sample (posted on GitHub)

1. Prerequisites:
   a. Familiarity with Azure Databricks, DevTunnel, Azure Bot Service, and Teams app setup.
   b. An AI Foundry project with Databricks and LLM connection.
2. Configuration:
   a. Set up the DevTunnel for local development.
   b. Register and configure your bot and app in Azure.
   c. Create a connection in Azure AI Foundry to your Databricks Genie space.
   d. Provision storage with public blob access for dynamic chart/image rendering.
   e. Populate the `.env` file with all necessary IDs, secrets, and endpoints.
3. Teams App Manifest:
   a. Create and configure the Teams app manifest, specifying endpoints, domains, permissions, and SSO details.
4. Run and Test:
   a. Host the app locally, activate the virtual environment, and run the Python application.
   b. Upload your Teams app manifest and test conversational queries directly in the Teams client.

Sample Queries (try these in the app!):

- "Show the top sales reps by pipeline in a pie chart."
- "What is the average size of active opportunities?"
- "How many opportunities were won or lost in the current fiscal year?"

Get Started: Next Steps

To replicate this solution:

- Review the sample in AI-Foundry-Connections - Teams chat with Azure Databricks Genie | GitHub (git clone https://github.com/Azure-Samples/AI-Foundry-Connections.git) — you'll need to review the README for the step-by-step setup and configuration details!
- Explore homework opportunities such as deploying the app on Azure App Service or Kubernetes, and experimenting with additional languages or the M365 Agents Toolkit.

Final Thoughts

Thank you for reading! By integrating Azure Databricks, Azure AI Foundry, and Genie APIs, you can deliver powerful, secure, and collaborative data intelligence experiences right within Microsoft Teams. Ready to go further? Check out the M365 Agents SDK and M365 Agents Toolkit for building advanced AI applications. Experiment, extend, and share your feedback with the community! Happy building! 🚀

Cascading of Results
Hello! I'm not sure if this is the right forum for my question—please let me know if I should post elsewhere. I'm interested in learning how other Viva Glint customers handle sharing survey results within their organizations. Specifically, does anyone provide access to results directly to broader groups, rather than using a cascading process (from C-suite to VPs and then to Managers)? Currently, our process takes 4-6 weeks, as results are cascaded down each management level. I'd love to hear about other approaches—how do you share results? What are the benefits and drawbacks of bypassing the traditional cascading method?

How to exclude IPs & accounts from Analytic Rule, with Watchlist?
We are trying to filter out some false positives from an analytic rule called "Service accounts performing RemotePS". Using automation rules still generates a lot of unwanted mail notifications, so we would like to try using a watchlist with the service account and IP combinations we want to exclude. Does anyone know where, and with what syntax, we would need to exclude the items on that specific watchlist?

Query:

```kusto
let InteractiveTypes = pack_array(  // Declare Interactive logon type names
    'Interactive',
    'CachedInteractive',
    'Unlock',
    'RemoteInteractive',
    'CachedRemoteInteractive',
    'CachedUnlock'
);
let WhitelistedCmdlets = pack_array(  // List of whitelisted commands that don't provide a lot of value
    'prompt',
    'Out-Default',
    'out-lineoutput',
    'format-default',
    'Set-StrictMode',
    'TabExpansion2'
);
let WhitelistedAccounts = pack_array('FakeWhitelistedAccount');  // List of accounts that are known to perform this activity in the environment and can be ignored
DeviceLogonEvents  // Get all logon events...
| where AccountName !in~ (WhitelistedAccounts)  // ...where it is not a whitelisted account...
| where ActionType == "LogonSuccess"  // ...and the logon was successful...
| where AccountName !contains "$"  // ...and not a machine logon.
| where AccountName !has "winrm va_"  // WinRM will have pseudo account names that match this if there is an explicit permission for an admin to run the cmdlet, so assume it is good.
| extend IsInteractive=(LogonType in (InteractiveTypes))  // Determine if the logon is interactive (True=1,False=0)...
| summarize HasInteractiveLogon=max(IsInteractive)  // ...then bucket and get the maximum interactive value (0 or 1)...
    by AccountName  // ...by the AccountNames
| where HasInteractiveLogon == 0  // ...and filter out all accounts that had an interactive logon.
// At this point, we have a list of accounts that we believe to be service accounts
// Now we need to find RemotePS sessions that were spawned by those accounts
// Note that we look at all powershell cmdlets executed to form a 29-day baseline to evaluate the data on today
| join kind=rightsemi (  // Start by dropping the account name and only tracking the...
    DeviceEvents
    | where ActionType == 'PowerShellCommand'  // ...PowerShell commands seen...
    | where InitiatingProcessFileName =~ 'wsmprovhost.exe'  // ...whose parent was wsmprovhost.exe (RemotePS Server)...
    | extend AccountName = InitiatingProcessAccountName  // ...and add an AccountName field so the join is easier
) on AccountName
// At this point, we have all of the commands that were ran by service accounts
| extend Command = tostring(extractjson('$.Command', tostring(AdditionalFields)))  // Extract the actual PowerShell command that was executed
| where Command !in (WhitelistedCmdlets)  // Remove any values that match the whitelisted cmdlets
| summarize (Timestamp, ReportId)=arg_max(TimeGenerated, ReportId),  // Then group all of the cmdlets and calculate the min/max times of execution...
    make_set(Command, 100000), count(), min(TimeGenerated) by  // ...as well as creating a list of cmdlets ran and the count...
    AccountName, AccountDomain, DeviceName, DeviceId  // ...and have the commonality be the account, DeviceName and DeviceId
// At this point, we have machine-account pairs along with the list of commands run as well as the first/last time the commands were ran
| order by AccountName asc  // Order the final list by AccountName just to make it easier to go through
| extend HostName = iff(DeviceName has '.', substring(DeviceName, 0, indexof(DeviceName, '.')), DeviceName)
| extend DnsDomain = iff(DeviceName has '.', substring(DeviceName, indexof(DeviceName, '.') + 1), "")
```
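One common way to express this kind of exclusion is to load the watchlist with the _GetWatchlist() function and anti-join it against the events before the rest of the rule logic runs. The sketch below is a minimal illustration, not the finished rule: the watchlist alias RemotePS_Exclusions and its AccountName / IPAddress columns are hypothetical names to replace with your own, and it assumes the source IP you want to match is the RemoteIP field of DeviceLogonEvents.

```kusto
// Minimal sketch (assumptions noted above - adjust the alias, column, and field names to your environment).
let Exclusions = _GetWatchlist('RemotePS_Exclusions')      // hypothetical watchlist alias
    | project ExcludedAccount = tostring(AccountName),     // hypothetical watchlist columns
              ExcludedIP      = tostring(IPAddress);
DeviceLogonEvents
| where ActionType == "LogonSuccess"
// Drop every logon whose account + source IP pair appears in the watchlist...
| join kind=leftanti (Exclusions) on $left.AccountName == $right.ExcludedAccount, $left.RemoteIP == $right.ExcludedIP
// ...and continue with the rest of the original rule (interactive-logon filtering, the
// rightsemi join to DeviceEvents for the RemotePS commands, and so on) from this point.
```

Because leftanti keeps only the rows that have no match on the right side, the same pattern can be applied at whichever stage of the rule still exposes both the account and the IP you want to match on; in the query above, the summarize steps drop the IP columns, so the exclusion generally needs to happen before them.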
Issue while deploying Sentinel Rules

I know that when deleting a Sentinel rule, you need to wait a specific amount of time before it can be redeployed. However, in this tenant, we've been waiting for almost a month and are still getting the same deployment error ('was recently deleted. You need to allow some time before re-using the same ID. Please try again later. Click here for details'). I still want to use the same ID, etc. Does anyone have any idea why it's still not possible after waiting for about a month, or has anyone seen a similar issue?

Music Status change based on Spotify/YouTube Music etc.
Is there any way to change Microsoft Teams' status to "Listening to X" based on what we are currently listening to on Spotify or YouTube Music, etc.? Or is there any development on this currently? It would be cool, like back in the old MSN days 😄

Sentinel Datalake - How to query outside of the Defender portal?
I've been doing some testing on the Sentinel data lake, but I'm running into a major gap: how do we query the data lake outside of Jupyter notebooks or the Defender portal? Currently, this is done by connecting to the Log Analytics workspace, but I don't see any way to query the data lake from another system.

Scaling Smart with Azure: Architecture That Works
Hi Tech Community! I'm Zainab, currently based in Abu Dhabi and serving as Vice President of Finance & HR at Hoddz Trends LLC, a global tech solutions company headquartered in Arkansas, USA. While I lead on strategy, people, and financials, I also roll up my sleeves when it comes to tech innovation. In this discussion, I want to explore the real-world challenges of scaling systems with Microsoft Azure. From choosing the right architecture to optimizing performance and cost, I'll be sharing insights drawn from experience, and I'd love to hear yours too. Whether you're building from scratch, migrating legacy systems, or refining deployments, let's talk about what actually works.

Fewer users in "Users" table than in "Messages" — how to ensure full user data?
Hi, I'm analyzing Viva Engage exported data and noticed that the Users table contains fewer users than the number of unique user IDs appearing in the Messages table. Some user IDs in the Messages table (as message authors) do not have corresponding entries in the Users table. Is there a known reason why some users might be excluded from the Users export? Is there a way to ensure all users who post or interact are included in the Users table? Any insight into how to resolve or explain this discrepancy would be greatly appreciated. Thanks in advance!