Resource Graph Explorer
I'm looking to retrieve a list of Azure resources that were created within the last 24 hours. However, it appears that Azure does not consistently expose the timeCreated property across all resource types, which makes direct filtering challenging.

Request for clarification/support: Could you please confirm whether there is a reliable way to filter resources based on their creation time (for example, resources created in the last N days or within the last 6 hours)? If timeCreated is not uniformly available, what is the recommended approach (e.g., using Resource Graph, Activity Logs, or any other reliable method) to achieve this?
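A hedged starting point: Azure Resource Graph does surface a creation timestamp for many (but not all) resource types under `properties.timeCreated`, so one sketch is to filter on it where present and accept that types which omit it are silently dropped:

```kusto
// Azure Resource Graph query (Resource Graph Explorer or `az graph query`).
// properties.timeCreated is populated for many resource types but not all;
// the isnotnull() filter drops types that never report it.
resources
| extend createdTime = todatetime(properties.timeCreated)
| where isnotnull(createdTime) and createdTime > ago(1d)
| project name, type, resourceGroup, subscriptionId, createdTime
| order by createdTime desc
```

For resource types that never populate timeCreated, the usual fallback is the Activity Log: query for Administrative events whose operation name ends in `/write` with a Succeeded status, at the cost of the Activity Log's limited retention.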
Announcing the Azure Databricks connector in Power Platform

We are ecstatic to announce the public preview of the Azure Databricks Connector for Power Platform. This native connector is built specifically for Power Apps, Power Automate, and Copilot Studio within Power Platform and enables a seamless, single-click connection. With this connector, your organization can build data-driven, intelligent conversational experiences that leverage the full power of your data within Azure Databricks without any additional custom configuration or scripting – it's all fully built in!

The Azure Databricks connector in Power Platform enables you to:

- Maintain governance: all access controls for data you set up in Azure Databricks are maintained in Power Platform
- Prevent data copies: read and write your data without duplication
- Secure your connection: connect Azure Databricks to Power Platform using Microsoft Entra user-based OAuth or service principals
- Get real-time updates: read and write data and see updates in Azure Databricks in near real time
- Build agents with context: build agents with Azure Databricks as grounding knowledge, with all the context of your data

Instead of spending time copying or moving data and building custom connections that require additional manual maintenance, you can now seamlessly connect and focus on what matters – getting rich insights from your data – without worrying about security or governance. Let's see how this connector can be beneficial across Power Apps, Power Automate, and Copilot Studio:

Azure Databricks Connector for Power Apps – You can seamlessly connect to Azure Databricks from Power Apps to enable read/write access to your data directly within canvas apps, enabling your organization to build data-driven experiences in real time. For example, our retail customers are using this connector to visualize different placements of items within the store and how they impact revenue.
Azure Databricks Connector for Power Automate – You can execute SQL commands against your data within Azure Databricks with the rich context of your business use case. For example, one of our global retail customers is using automated workflows to track safety incidents, which plays a crucial role in keeping employees safe.

Azure Databricks as a Knowledge Source in Copilot Studio – You can add Azure Databricks as a primary knowledge source for your agents, enabling them to understand, reason over, and respond to user prompts based on data from Azure Databricks.

To get started, all you need to do in Power Apps or Power Automate is add a new connection – that's how simple it is! Check out our demo here and get started using our documentation today! This connector is available in all public cloud regions. You can also learn more about customer use cases in this blog.
Azure Form Recognizer Redaction Issue with Scanned PDFs and Page Size Variations

Hi all, I'm working on a PDF redaction process using Azure Form Recognizer and Azure Functions. The flow works well in most cases: I extract the text and bounding-box coordinates and apply redaction based on them. However, I'm facing an issue with scanned PDFs, or PDFs with slightly different page sizes. In these cases the redaction boxes don't align properly; they either miss the text or appear slightly off (above or below the intended area). It seems like the coordinate mapping doesn't match accurately when the document isn't a standard A4 size or has DPI inconsistencies. Has anyone else encountered this? Any suggestions on:

- Adjusting for page size or DPI dynamically?
- Mapping normalized coordinates correctly for scanned PDFs?

Appreciate any help or suggestions!
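Not an official fix, but a sketch of the unit conversion that usually causes this. Form Recognizer reports coordinates in inches for digital PDF pages and in pixels for scanned images (the page's unit, width, and height come back in the analysis result), while PDF drawing libraries work in points (72 per inch). Scaling by the page geometry reported in the result, rather than assuming A4, keeps the boxes aligned; the function name here is my own, not part of any SDK:

```python
# Convert a Form Recognizer bounding box to PDF points (72 per inch),
# scaling by the page size reported in the analysis result instead of
# assuming A4. For scanned pages the unit is "pixel", so dividing by
# the reported pixel size folds the scan DPI into the scale factor.

def fr_box_to_pdf_points(box, page_unit, page_width, page_height,
                         pdf_width_pts, pdf_height_pts):
    """box = (x0, y0, x1, y1) in the units Form Recognizer reported.

    page_width/page_height: page size from the FR result (inches or pixels).
    pdf_width_pts/pdf_height_pts: actual PDF page size in points.
    Returns (x0, y0, x1, y1) in points, top-left origin (PyMuPDF-style).
    """
    if page_unit == "inch":
        sx = pdf_width_pts / page_width    # usually 72, unless sizes differ
        sy = pdf_height_pts / page_height
    elif page_unit == "pixel":
        sx = pdf_width_pts / page_width    # scan DPI is folded in here
        sy = pdf_height_pts / page_height
    else:
        raise ValueError(f"unexpected unit: {page_unit}")
    x0, y0, x1, y1 = box
    return (x0 * sx, y0 * sy, x1 * sx, y1 * sy)


# An inch-unit box on a US Letter page (612x792 pts = 8.5x11 in):
print(fr_box_to_pdf_points((1, 1, 2, 2), "inch", 8.5, 11, 612, 792))
# The same box on a 300-DPI scan of that page (2550x3300 px):
print(fr_box_to_pdf_points((300, 300, 600, 600), "pixel", 2550, 3300, 612, 792))
# Both map to roughly (72, 72, 144, 144) points.
```

One more thing to check: Form Recognizer's origin is the top-left corner. PyMuPDF redaction rects also use a top-left origin, but libraries that use the raw PDF coordinate system (bottom-left origin, e.g. reportlab) additionally need the y values flipped against the page height.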
Insecure Protocol Workbook

Greetings, maybe most orgs have already eliminated insecure protocols and this workbook is no longer functional? I have it added and it appears to be collecting data, but when I go to open the template it is completely empty. Is the Insecure Protocol (aka IP) workbook still supported, and if so, is there any newer documentation than the blog from 2000 around it? I am hoping to identify NTLM by user and device, as the domain controllers are all logging this and the MDI agents on them are forwarding this data to Defender for Identity and Sentinel.
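While the workbook question stands, the underlying NTLM hunt can be done directly in KQL. A sketch, assuming the domain controllers forward Windows security events into Sentinel's SecurityEvent table (logon event 4624 carries the authentication package, and LmPackageName distinguishes NTLM V1 from V2):

```kusto
// NTLM logons by user and device over the last 7 days.
// Assumes the Security Events connector is collecting event 4624.
SecurityEvent
| where TimeGenerated > ago(7d)
| where EventID == 4624
| where AuthenticationPackageName == "NTLM"
| summarize LogonCount = count(), LastSeen = max(TimeGenerated)
    by Account, Computer, IpAddress, LmPackageName
| order by LogonCount desc
```

If the query returns data but the workbook stays empty, the workbook's own parameter filters (time range, workspace selection) are worth checking before assuming the collection itself is broken.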
How do you investigate network anomaly related alerts?

Hello everyone. Using some of the built-in analytics rules, such as "Anomaly was observed with IPv6-ICMP Traffic", when you go into the incident event details it's just some numbers for the expected baseline vs the actual value. What do you do with this? It's a similar case with the following rules:

- Anomaly found in Network Session Traffic (ASIM Network Session schema)
- Anomaly was observed with ESP Traffic
- Anomaly was observed with Outbound Traffic
- Anomaly was observed with Unassigned Traffic
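One common next step, sketched here with the ASIM unifying parser (assuming ASIM parsers are deployed in your workspace, and using the anomaly's time window and protocol from the incident details), is to pull the raw sessions behind the spike and see which hosts contributed to it:

```kusto
// Drill into the sessions behind an anomaly window; replace the time
// range and protocol with the values shown in the incident details.
let anomalyStart = datetime(2024-01-01 10:00);
let anomalyEnd   = datetime(2024-01-01 11:00);
_Im_NetworkSession(starttime=anomalyStart, endtime=anomalyEnd)
| where NetworkProtocol == "ESP"
| summarize Sessions = count(), Bytes = sum(SrcBytes)
    by SrcIpAddr, DstIpAddr, DstPortNumber
| order by Sessions desc
```

Running the same aggregation over the preceding days or weeks gives you a baseline of your own to compare against, which is usually more actionable than the single baseline number in the alert.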
Activity log missing in new Microsoft Teams on macOS

Hi, recently I switched from Microsoft Teams Classic to the new Teams version on my Mac. Before, I was able to find my log in ~/Library/Application Support/Microsoft/Teams/logs.txt, but now I cannot find any logs being written. According to https://learn.microsoft.com/en-us/microsoftteams/log-files#continuous-debug-logs my Mac should continuously write debug logs (Mac mini with Intel i7-8700B). I was doing a grep in the log to find my current status, to control a single lamp indicating to others whether I am in a meeting. How can I re-enable the logging?
IdentityInfo with analytics KQL query

Hi, I'm currently trying to create a KQL query for an alert rule in Sentinel. The log source upon which the alert rule is based only contains the SAMAccountName, which prevents me from mapping it to an Account entity in the alert. I'm therefore trying to use the IdentityInfo table to look up the AadUserId of the user via the SAMAccountName. The issue I'm running into is that I want my query to run every 10 minutes and look up data from the past 10 minutes, as this is most suitable given the nature of the alert and the log source. However, this causes the lookup in the IdentityInfo table to also only check data from the last 10 minutes, which doesn't work, as the data in that table may be much older, so the lookup of the user's AadUserId fails. According to the documentation, the IdentityInfo table is refreshed every 14 days, so for it to work I'd have to create a query that checks all logging, including that of the log source, from the past 14 days, which is not what I want. Hopefully some of you have suggestions or ideas on how to make this work. Thanks a lot!

Marek
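One pattern that usually resolves this: give the rule a long lookup period, but scope each table's time window explicitly inside the query, so the alert logic stays at 10 minutes while IdentityInfo is searched over its full refresh cycle and deduplicated to the latest record per user. A sketch (the source table name and its SAMAccountName column are placeholders for your actual log source):

```kusto
// Deduplicated identity lookup over the full 14-day IdentityInfo window.
let Identities = IdentityInfo
    | where TimeGenerated > ago(14d)
    | summarize arg_max(TimeGenerated, AccountObjectId, AccountUPN)
        by AccountName;            // AccountName holds the SAM account name
MySourceTable                      // placeholder: your actual log source
| where TimeGenerated > ago(10m)   // alert logic stays at 10 minutes
| extend AccountName = tolower(SAMAccountName)
| join kind=leftouter Identities on AccountName
| project TimeGenerated, SAMAccountName, AccountObjectId, AccountUPN
```

Set the rule's lookup period to cover the 14 days (so IdentityInfo is in scope) while keeping the run frequency at 10 minutes; the explicit ago(10m) filter keeps the source data window tight. AccountObjectId can then be mapped to the Account entity's AadUserId field in the entity mapping.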
Pet project on SQL Server 2022 platform

Hello, world! I would like to share my pet project on the SQL Server 2022 platform. I have created a DWH solution that includes many of MS's best practices and interesting features, such as:

- ETL process with data cleansing and MDM that is easy to expand
- Documentation
- CI/CD
- Functional ETL tests
- Ready analytical templates: time intelligence, new & returning customers, clustering customers based on spending volume, product ABC classification, basket analysis, events in progress

https://dev.azure.com/zinykov/NorthwindBI

Unfortunately, there will be no DQS & MDS in SQL Server 2025...
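For readers unfamiliar with one of the templates above: product ABC classification ranks products by their share of cumulative revenue, commonly bucketing the top 80% as A, the next 15% as B, and the rest as C. A small Python sketch of that logic (independent of the project's actual T-SQL implementation, and the 80/95 thresholds are the conventional assumption, not the project's):

```python
# ABC classification: sort products by revenue descending, walk the
# cumulative revenue share, and bucket at the 80% / 95% thresholds.

def abc_classify(revenue_by_product):
    """revenue_by_product: dict product -> revenue. Returns product -> 'A'|'B'|'C'."""
    total = sum(revenue_by_product.values())
    classes, cumulative = {}, 0.0
    for product, revenue in sorted(revenue_by_product.items(),
                                   key=lambda kv: kv[1], reverse=True):
        cumulative += revenue
        share = cumulative / total
        classes[product] = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
    return classes

print(abc_classify({"widget": 700, "gadget": 200, "gizmo": 60, "doohickey": 40}))
# → {'widget': 'A', 'gadget': 'B', 'gizmo': 'C', 'doohickey': 'C'}
```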
Announcing the availability of Azure Databricks connector in Azure AI Foundry

At Microsoft, the Databricks Data Intelligence Platform is available as a fully managed, native, first-party data and AI solution called Azure Databricks. This makes Azure the optimal cloud for running Databricks workloads. Because of our unique partnership, we can bring you seamless integrations that leverage the power of the entire Microsoft ecosystem to do more with your data. Azure AI Foundry is an integrated platform for developers and IT administrators to design, customize, and manage AI applications and agents. Today we are excited to announce the public preview of the Azure Databricks connector in Azure AI Foundry. With this launch you can build enterprise-grade AI agents that reason over real-time Azure Databricks data while being governed by Unity Catalog. These agents are also enriched by the responsible AI capabilities of Azure AI Foundry. Here are a few ways this can benefit you and your organization:

- Native integration: connect to Azure Databricks AI/BI Genie from Azure AI Foundry
- Contextual answers: Genie agents provide answers grounded in your unique data
- Support for various LLMs: secure, authenticated data access
- Streamlined process: real-time data insights within GenAI apps
- Seamless integration: simplifies AI agent management with data governance
- Multi-agent workflows: leverages Azure AI agents and Genie Spaces for faster insights
- Enhanced collaboration: boosts productivity between business and technical users

To further democratize the use of data for those in your organization who aren't directly interacting with Azure Databricks, you can also take it one step further with Microsoft Teams and AI/BI Genie. AI/BI Genie enables you to get deep insights from your data using natural language, without needing to access Azure Databricks.
Here you can see an example of what an agent built in AI Foundry, using data from Azure Databricks, looks like in Microsoft Teams.

We'd love to hear your feedback as you use the Azure Databricks connector in AI Foundry. Try it out today; to help you get started, we've put together some samples here. You can also read more on the Databricks blog.