tutorial
INDEX MATCH the easy way
I've created a short video showing a trick for building INDEX MATCH formulas more quickly. Also, my suggestion for a revised, simpler version of INDEX MATCH is now at number 6 (as at 22nd Sep 2016) among the most-voted-for improvements to Excel: https://excel.uservoice.com/forums/304921-excel-for-windows-desktop-application/suggestions/14049132-make-a-simple-safer-version-of-vlookup-and-index-m
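If you haven't used the pattern before, here is a minimal sketch of a classic INDEX MATCH lookup; the ranges and the lookup value are hypothetical and not taken from the video:

=INDEX(C2:C100, MATCH("Contoso", A2:A100, 0))

MATCH returns the position of "Contoso" within A2:A100 (the 0 forces an exact match), and INDEX then returns the value in the same position of C2:C100. Unlike VLOOKUP, the return column can sit to the left of the lookup column, and inserting columns does not break the formula.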
Smart Auditing: Leveraging Azure AI Agents to Transform Financial Oversight

In today's data-driven business environment, audit teams often spend weeks poring over logs and databases to verify spending and billing information. This time-consuming process is ripe for automation. But is there a way to implement AI solutions without getting lost in complex technical frameworks? While tools like LangChain, Semantic Kernel, and AutoGen offer powerful AI agent capabilities, sometimes you need a straightforward solution that just works. So, what's the answer for teams seeking simplicity without sacrificing effectiveness? This tutorial shows you how to use Azure AI Agent Service to build an AI agent that can directly access your Postgres database to streamline audit workflows. No complex chains or graphs required, just a practical solution to get your audit process automated quickly.

The Auditing Challenge:
It's month end, and your audit team is drowning in spreadsheets. As auditors reviewing financial data across multiple SaaS tenants, you're tasked with verifying billing accuracy by tracking usage metrics like API calls, storage consumption, and user sessions in Postgres databases. Each tenant generates thousands of transactions daily, and traditionally this verification process consumes weeks of your team's valuable time. Typically, teams spend weeks:
- Manually extracting data from multiple database tables.
- Cross-referencing usage with invoices.
- Investigating anomalies through tedious log analysis.
- Compiling findings into comprehensive reports.
With an AI-powered audit agent, you can automate these tasks and transform the process. Your AI assistant can:
- Pull relevant usage data directly from your database.
- Identify billing anomalies like unexpected usage spikes.
- Generate natural language explanations of findings.
- Create audit reports that highlight key concerns.
For example, when reviewing a tenant's invoice, your audit agent can query the database for relevant usage patterns, summarize anomalies, and offer explanations: "Tenant_456 experienced a 145% increase in API usage on April 30th, which explains the billing increase. This spike falls outside normal usage patterns and warrants further investigation." Let's build an AI agent that connects to your Postgres database and transforms your audit process from manual effort to automated intelligence.

Prerequisites:
Before we start building our audit agent, you'll need:
- An Azure subscription (Create one for free).
- The Azure AI Developer RBAC role assigned to your account.
- Python 3.11.x installed on your development machine.
Alternatively, you can use GitHub Codespaces, which will automatically install all dependencies for you. You'll need to create a GitHub account first if you don't already have one.

Setting Up Your Database:
For this tutorial, we'll use Neon Serverless Postgres as our database. It's a fully managed, cloud-native Postgres solution that's free to start, scales automatically, and works excellently for AI agents that need to query data on demand.

Creating a Neon Database on Azure:
1. Open the Neon Resource page on the Azure portal.
2. Fill out the form with the required fields and deploy your database.
3. After creation, navigate to the Neon Serverless Postgres Organization service.
4. Click on the Portal URL to access the Neon Console.
5. Click "New Project".
6. Choose an Azure region.
7. Name your project (e.g., "Audit Agent Database").
8. Click "Create Project".
Once your project is successfully created, copy the Neon connection string from the Connection Details widget on the Neon Dashboard. It will look like this:

postgresql://[user]:[password]@[neon_hostname]/[dbname]?sslmode=require

Note: Keep this connection string saved; we'll need it shortly.
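Before moving on, it can be worth confirming that the connection string works. Here is a minimal check with the psql client; the connection string below is the placeholder format from above, not a real credential:

psql "postgresql://[user]:[password]@[neon_hostname]/[dbname]?sslmode=require" -c "SELECT version();"

If the command prints a PostgreSQL version string, the database is reachable and you can continue.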
Creating an AI Foundry Project on Azure:
Next, we'll set up the AI infrastructure to power our audit agent:
- Create a new hub and project in the Azure AI Foundry portal by following the guide.
- Deploy a model like GPT-4o to use with your agent.
- Make note of your Project connection string and Model Deployment name. You can find your connection string in the overview section of your project in the Azure AI Foundry portal, under Project details > Project connection string.
Once you have all three values on hand (Neon connection string, Project connection string, and Model Deployment name), you are ready to set up the Python project to create an agent. All the code and sample data are available in this GitHub repository. You can clone or download the project.

Project Environment Setup:
Create a .env file with your credentials:

PROJECT_CONNECTION_STRING="<Your AI Foundry connection string>"
AZURE_OPENAI_DEPLOYMENT_NAME="gpt4o"
NEON_DB_CONNECTION_STRING="<Your Neon connection string>"

Create and activate a virtual environment:

python -m venv .venv
source .venv/bin/activate   # on macOS/Linux
.venv\Scripts\activate      # on Windows

Install the required Python libraries:

pip install -r requirements.txt

Example requirements.txt:

pandas
python-dotenv
sqlalchemy
psycopg2-binary
azure-ai-projects==1.0.0b7
azure-identity

Load Sample Billing Usage Data:
We will use a mock dataset of daily tenant usage, with API calls and storage usage in GB. A few sample rows:

tenant_id    date         api_calls   storage_gb
tenant_456   2025-04-01   1000        25.0
tenant_456   2025-03-31   950         24.8
tenant_456   2025-03-30   2200        26.0

Run the load_usage_data.py Python script (python load_usage_data.py) to create and populate the usage_data table in your Neon Serverless Postgres instance:

# load_usage_data.py file
import os
from dotenv import load_dotenv
from sqlalchemy import (
    create_engine,
    MetaData,
    Table,
    Column,
    String,
    Date,
    Integer,
    Numeric,
)

# Load environment variables from .env
load_dotenv()

# Load connection string from environment variable
NEON_DB_URL = os.getenv("NEON_DB_CONNECTION_STRING")
engine = create_engine(NEON_DB_URL)

# Define metadata and table schema
metadata = MetaData()

usage_data = Table(
    "usage_data",
    metadata,
    Column("tenant_id", String, primary_key=True),
    Column("date", Date, primary_key=True),
    Column("api_calls", Integer),
    Column("storage_gb", Numeric),
)

# Create table
with engine.begin() as conn:
    metadata.create_all(conn)

    # Insert mock data
    conn.execute(
        usage_data.insert(),
        [
            {"tenant_id": "tenant_456", "date": "2025-03-27", "api_calls": 870, "storage_gb": 23.9},
            {"tenant_id": "tenant_456", "date": "2025-03-28", "api_calls": 880, "storage_gb": 24.0},
            {"tenant_id": "tenant_456", "date": "2025-03-29", "api_calls": 900, "storage_gb": 24.5},
            {"tenant_id": "tenant_456", "date": "2025-03-30", "api_calls": 2200, "storage_gb": 26.0},
            {"tenant_id": "tenant_456", "date": "2025-03-31", "api_calls": 950, "storage_gb": 24.8},
            {"tenant_id": "tenant_456", "date": "2025-04-01", "api_calls": 1000, "storage_gb": 25.0},
        ],
    )

print("✅ usage_data table created and mock data inserted.")
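If you want to confirm the load worked, a quick query from the Neon SQL Editor (or any Postgres client) should return the six mock rows inserted by the script; the table and column names below come straight from load_usage_data.py:

SELECT tenant_id, date, api_calls, storage_gb
FROM usage_data
ORDER BY date;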
Create a Postgres Tool for the Agent:
Next, we configure an AI agent tool to retrieve data from Postgres. The Python script billing_agent_tools.py contains:
- The function billing_anomaly_summary(), which:
  - Pulls usage data from Neon.
  - Computes the percent change in api_calls.
  - Flags rows where the absolute percent change exceeds 1.5 (a swing of more than 150%).
- The user_functions list, exported for the Azure AI Agent to use.
You do not need to run this script separately.

# billing_agent_tools.py file
import os
import json
import pandas as pd
from sqlalchemy import create_engine
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Set up the database engine
NEON_DB_URL = os.getenv("NEON_DB_CONNECTION_STRING")
db_engine = create_engine(NEON_DB_URL)


# Define the billing anomaly detection function
def billing_anomaly_summary(
    tenant_id: str,
    start_date: str = "2025-03-27",
    end_date: str = "2025-04-01",
    limit: int = 10,
) -> str:
    """
    Fetches recent usage data for a SaaS tenant and detects potential billing anomalies.

    :param tenant_id: The tenant ID to analyze.
    :type tenant_id: str
    :param start_date: Start date for the usage window.
    :type start_date: str
    :param end_date: End date for the usage window.
    :type end_date: str
    :param limit: Maximum number of records to return.
    :type limit: int
    :return: A JSON string with usage records and anomaly flags.
    :rtype: str
    """
    query = """
        SELECT date, api_calls, storage_gb
        FROM usage_data
        WHERE tenant_id = %s AND date BETWEEN %s AND %s
        ORDER BY date DESC
        LIMIT %s;
    """
    df = pd.read_sql(query, db_engine, params=(tenant_id, start_date, end_date, limit))

    if df.empty:
        return json.dumps(
            {"message": "No usage data found for this tenant in the specified range."}
        )

    df.sort_values("date", inplace=True)
    df["pct_change_api"] = df["api_calls"].pct_change()
    df["anomaly"] = df["pct_change_api"].abs() > 1.5

    return df.to_json(orient="records")


# Register this in a list to be used by FunctionTool
user_functions = [billing_anomaly_summary]
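Since billing_anomaly_summary() is an ordinary Python function, you can sanity-check it on its own before wiring it into an agent. The snippet below is an optional, hypothetical helper (it is not part of the tutorial's repository) and assumes the .env file and mock data from the earlier steps:

# check_tool.py - optional local smoke test (hypothetical, not in the repo)
from billing_agent_tools import billing_anomaly_summary

# Prints a JSON string of usage records with pct_change_api and anomaly fields
print(billing_anomaly_summary(tenant_id="tenant_456"))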
Create and Configure the AI Agent:
Now we'll set up the AI agent and integrate it with our Neon Postgres tool using the Azure AI Agent Service SDK. The Python script does the following:
- Creates the agent: instantiates an AI agent using the selected model (gpt-4o, for example), adds tool access, and sets instructions that tell the agent how to behave (e.g., "You are a helpful SaaS assistant…").
- Creates a conversation thread: a thread is started to hold the conversation between the user and the agent.
- Posts a user message: sends a question like "Why did my billing spike for tenant_456 this week?" to the agent.
- Processes the request: the agent reads the message, determines that it should use the custom tool to retrieve usage data, and processes the query.
- Displays the response: prints the agent's response, a natural-language explanation based on the tool's output.

# billing_anomaly_agent.py
import os
from datetime import datetime
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.ai.projects.models import FunctionTool, ToolSet
from dotenv import load_dotenv
from pprint import pprint

from billing_agent_tools import user_functions  # Custom tool function module

# Load environment variables from .env file
load_dotenv()

# Create an Azure AI Project Client
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Initialize toolset with our user-defined functions
functions = FunctionTool(user_functions)
toolset = ToolSet()
toolset.add(functions)

# Create the agent
agent = project_client.agents.create_agent(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],
    name=f"billing-anomaly-agent-{datetime.now().strftime('%Y%m%d%H%M')}",
    description="Billing Anomaly Detection Agent",
    instructions=f"""
    You are a helpful SaaS financial assistant that retrieves and explains billing anomalies using usage data.
    The current date is {datetime.now().strftime("%Y-%m-%d")}.
    """,
    toolset=toolset,
)
print(f"Created agent, ID: {agent.id}")

# Create a communication thread
thread = project_client.agents.create_thread()
print(f"Created thread, ID: {thread.id}")

# Post a message to the agent thread
message = project_client.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="Why did my billing spike for tenant_456 this week?",
)
print(f"Created message, ID: {message.id}")

# Run the agent and process the query
run = project_client.agents.create_and_process_run(
    thread_id=thread.id, agent_id=agent.id
)
print(f"Run finished with status: {run.status}")

if run.status == "failed":
    print(f"Run failed: {run.last_error}")

# Fetch and display the messages
messages = project_client.agents.list_messages(thread_id=thread.id)
print("Messages:")
pprint(messages["data"][0]["content"][0]["text"]["value"])

# Optional cleanup:
# project_client.agents.delete_agent(agent.id)
# print("Deleted agent")

Run the agent:
To run the agent, run the following command:

python billing_anomaly_agent.py

Snippet of output from agent: (shown as a screenshot in the original post.)

Using the Azure AI Foundry Agent Playground:
After running your agent using the Azure AI Agent SDK, it is saved within your Azure AI Foundry project. You can now experiment with it using the Agent Playground. To try it out:
- Go to the Agents section in your Azure AI Foundry workspace.
- Find your billing anomaly agent in the list and click to open it.
- Use the playground interface to test different financial or billing-related questions, such as: "Did tenant_456 exceed their API usage quota this month?" or "Explain recent storage usage changes for tenant_456."
This is a great way to validate your agent's behavior without writing more code.

Summary:
You've now created a working AI agent that talks to your Postgres database, all using:
- A simple Python function
- Azure AI Agent Service
- A Neon Serverless Postgres backend
This approach is beginner-friendly, lightweight, and practical for real-world use.

Want to go further?
You can:
- Add more tools to the agent.
- Integrate with vector search (e.g., detect anomaly reasons from logs using embeddings).

Resources:
- Introduction to Azure AI Agent Service
- Develop an AI agent with Azure AI Agent Service
- Getting Started with Azure AI Agent Service
- Neon on Azure
- Build AI Agents with Azure AI Agent Service and Neon
- Multi-Agent AI Solution with Neon, Langchain, AutoGen and Azure OpenAI
- Azure AI Foundry GitHub Discussions

That's it, folks! But the best part? You can become part of a thriving community of learners and builders by joining the Microsoft Learn Student Ambassadors Community. Connect with like-minded individuals, explore hands-on projects, and stay updated with the latest in cloud and AI. 💬 Join the community on Discord here and explore more benefits on the Microsoft Learn Student Hub.

Microsoft Fabric for those who know nothing about Fabric
This is not your regular blog post. Don't click through if you don't want to be convinced; if you are curious, click and see. You will end up falling in love with Microsoft Fabric, because you will love it once you get to know what it is.

Take Azure Defender for IoT for a Spin
Intended audience: Security and OT engineering enthusiasts looking to secure unmanaged critical networks used by IoT/OT devices, such as building management systems, manufacturing, critical infrastructure, and more!

Introduction
You've read the product materials and would like to get started with securing your IoT/OT network. In this blog post, we will focus on setting up a sensor on your critical networks without impacting IoT/OT stability or performance. (If you missed it, you can read more about the capabilities of Azure Defender for IoT here.) The goal of this article is to guide you through setting up a sensor to demonstrate the value of the system, as well as to provide a quick start for securing unmanaged IoT/OT devices.

Try it now at no charge
Try Azure Defender for IoT - this version includes the agentless security provided via the integration of CyberX, a Microsoft company, plus the ability to connect to Azure Sentinel.

Preparing your environment
Azure Defender for IoT monitors unmanaged devices that are used in Operational Technology (OT) environments such as manufacturing, building management systems (BMS), life sciences, energy and water utilities, oil & gas, and logistics. In the most basic configuration, setting up your environment takes 4 easy steps:

1. Set up a sensor
The sensor software may be installed on a physical server or as a virtual machine. The sensor installation files can be downloaded from the Azure Defender for IoT portal, on the "Getting Started" -> "Network Sensor" tab.
- Log into your Azure account and download the ISO installer for the sensor.
- Install the ISO from USB on a VM or physical server (see the Hardware Guide and Installation Guide).
- Make a note of the administrative login credentials presented during the installation process.
If your setup includes multiple sensors, you can also download the optional "On-Premises Management Console", which allows you to manage and monitor large sensor deployments. More on this in the Installation Guide, Chapter 8.

2. Monitor a SPAN port
The sensor implements non-invasive passive monitoring with Network Traffic Analysis (NTA) and Layer 7 Deep Packet Inspection (DPI) to extract detailed IoT/OT information in real time, even across diverse automation equipment from all major OT suppliers such as Rockwell Automation, Schneider Electric, GE, Emerson, Siemens, Honeywell, ABB, Yokogawa, etc.
- Locate a managed LAN switch connected to IoT/OT devices. These switches can typically be set up with monitoring ports (also called SPAN or mirror ports). Using this technique, the sensor passively monitors the OT network without creating any traffic that might impact or risk devices on the network.
- Connect the monitoring port to the sensor's monitoring interface (typically the first available ethernet card).
For more information and configuration examples, see the Network Deployment Guide, Chapter 5 - "Traffic Monitoring."
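As an illustration only (not taken from the deployment guide), a basic SPAN session on a Cisco IOS switch typically looks something like the sketch below; the interface names are hypothetical, and the exact commands vary by switch vendor and model, so always follow the Network Deployment Guide for your hardware:

! mirror the ports that carry IoT/OT traffic
monitor session 1 source interface GigabitEthernet1/0/1 - 8
! send the mirrored traffic to the port connected to the sensor's monitoring interface
monitor session 1 destination interface GigabitEthernet1/0/24

Because the sensor only listens on the destination port, this configuration adds no traffic to the OT network itself.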
3. Register and activate the sensor
Once the sensor has been connected to the monitoring port, it will immediately begin to analyze the network traffic. The next step is to log into the sensor and activate it with an activation file available for your account in the Azure Defender for IoT portal.
- Log into your Azure account and select the "Onboard" sensor button.
- Fill in the sensor name and subscription details. The "cloud-connected" toggle optionally sends alert information into IoT Hub and Sentinel for further analysis. If you have an air-gapped or completely on-premises implementation with no connection to the cloud, disable the "cloud-connected" toggle before you generate your license.
- Download the activation file. This will be used in the next step to activate the sensor.
- Log into the sensor's IP address with the administrative credentials shown during the installation process.
- On the next screen, upload the activation file from the previous step.
For more information and detailed steps, see the Onboarding Guide.

4. Start exploring
You've now successfully installed your first sensor and can start using the system: view the asset inventory, zoom in on the network map, or generate a risk report.

Conclusion
Thank you for reading this blog post. There will be more blog posts to follow that will help you get the best out of your system, covering what to do when malware is detected, connecting to Azure Sentinel, and simulating attack vectors, so please check back with us soon.

Learn more with these educational resources:
- Watch our Ignite session showing how Azure Defender for IoT and Azure Sentinel are combined to investigate multistage attacks that cross IT/OT boundaries, using the TRITON attack on a petrochemical facility as an example.
- Watch our Tech Community webinar describing MITRE ATT&CK for ICS, an OT-focused version of the well-known MITRE ATT&CK framework originally developed for IT networks.
- Watch our SANS webinar, featuring the head of Microsoft's datacenter security program, about securing building automation systems using continuous OT security monitoring.
- Stay tuned for an upcoming webinar during which we'll do a technical walkthrough of how to deploy and use Azure Defender for IoT.

Troubleshooting
No traffic is monitored on the sensor:
- Check that the monitoring port is connected to the correct ethernet port.
- Make sure the port is indeed a SPAN port by monitoring bandwidth on the port.
- For more troubleshooting, see the Network Setup Guide, Appendix 1.
I cannot find a device in the Asset Inventory:
- Make sure the device is connected to the network.
- Search for its MAC address in the Asset Inventory; if it is active, it will appear in the list.

Displaying SharePoint 2010 multi-value choice columns in one query column
In my work environment SharePoint 2010 is still very much alive, and I run several Power Query scenarios with SharePoint 2010 data. Yesterday, I worked on a query that includes a multi-value choice column. In the Query Editor, the column shows as "Table", and when that column is expanded, it duplicates each row for each value in the multi-value SharePoint column. That's not what I wanted. I'd like to see the values from the choices in one row, separated with a comma.

Searching the web brought me to forum threads that discuss just that, but they were talking about "List" where I see "Table". I guess that is a difference between SharePoint 2010 and later versions, where apparently SharePoint choice columns are returned as "Lists" to Power Query. Anyway, the suggestion to use

Text.Combine([Choices], ", ")

did not work for the Table column. It threw an error instead: "Expression.Error: We cannot convert a value of type Table to type List." In Power Query, lists are one-column objects, whereas tables are multi-column objects, and even if a table contains only one column, that column must be explicitly referenced.

It took me a little more digging in the search engines, but then I found the solution that works with SharePoint 2010 multi-value choice columns, which show up as tables in Power Query. In the table preview shown in the lower part of the Power Query window, the column header for the data I want to extract is "Value". So all I needed was to reference the "Value" column in the formula. The correct syntax for that is

Text.Combine([Choices][Value], ", ")

And, hey presto! It worked as I expected. Hope that helps someone.
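For context, here is a minimal sketch of how that formula might be used as a custom column step; the previous step name, the column name "Choices", and the new column name "Combined" are assumptions for illustration, not taken from the original post:

= Table.AddColumn(PreviousStep, "Combined", each Text.Combine([Choices][Value], ", "), type text)

Added from Add Column > Custom Column (or directly in the Advanced Editor), this produces one comma-separated text value per row instead of expanding the nested table into multiple rows.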
Tutorial: How to run Legacy and Chromium-based Edge simultaneously side by side Without Group Policy

1. Uninstall Chromium Edge, either from Control Panel or from Windows 10 Settings => Apps.
2. Open Registry Editor by typing "regedit" in Windows 10 search.
3. Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft, or paste this in the address bar of Registry Editor: Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft
4. Right-click on "Microsoft", create a new key, and name it "EdgeUpdate" (without quotes).
5. Then right-click on "EdgeUpdate", go to New, create a new DWORD (32-bit) value, and call it "Allowsxs" (without quotes).
6. Set its value to 1.
7. Exit Registry Editor.
8. Now go to www.microsoft.com/Edge to download Edge stable again. Install it, and after that you will be able to access both Legacy and new Edge from the Start menu or Windows 10 search.

Source: https://docs.microsoft.com/en-us/deployedge/microsoft-edge-update-policies#allowsxs

This tutorial uses only the Registry Editor; there is also another method through Group Policy: https://docs.microsoft.com/en-au/deployedge/microsoft-edge-sysupdate-access-old-edge#how-to-enable-a-side-by-side-experience-with-both-versions-of-microsoft-edge. For that you will also need the policy files, which you can get here: https://www.microsoft.com/en-us/edge/business/download
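If you prefer not to click through Registry Editor, steps 3-6 can also be done from an elevated Command Prompt. This is a sketch based on the key and value named above (it is not part of the original tutorial), so double-check it before running:

reg add "HKLM\SOFTWARE\Policies\Microsoft\EdgeUpdate" /v Allowsxs /t REG_DWORD /d 1 /f

The /f switch overwrites the value if it already exists; everything else mirrors the manual steps.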
Intro to the Excel Data Model

Here is a great new Excel lab hosted on TechNet Labs. Learn about the data model, for free. This lab is intended to serve as an introduction to understanding the Excel Data Model. The data model first became available in Excel 2010, where it was introduced through the Power Pivot add-in; in the next version, Excel 2013, the model was absorbed into Excel and is now part of Excel itself. To really use the Excel Data Model you need to understand the DAX language, which is the formula language for the Excel Data Model. This lab will not walk through DAX. For further in-depth DAX learning, please check out the following resources:
- The DAX Resource Center Wiki (http://social.technet.microsoft.com/wiki/contents/articles/dax-resource-center.aspx)
- Data Analysis Expressions (DAX) Reference (https://support.office.com/en-us/article/Data-Analysis-Expressions-DAX-Reference-411c6891-614d-438c-bf45-c7e061dd9e08)
- The DAX in the BI Tabular Model whitepaper (http://go.microsoft.com/fwlink/?LinkID=237472&clcid=0x409)
The lab will walk through viewing the data model, setting up relationships, creating a date dimension, and building a set of calculations. Take the lab now!
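To give a flavor of what DAX looks like before you take the lab, here is a single hypothetical measure; the table and column names (Sales, Sales[Amount]) are made up for illustration and are not part of the lab:

Total Sales := SUM(Sales[Amount])

A measure like this is defined once in the data model and can then be reused in any PivotTable built on top of it.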
Get & Transform - Introduction Video

Get & Transform (Power Query) is amazing at connecting to and transforming data. You just build the connections, do the transformations once, and then simply refresh anytime you want to update the data. Power Query is built into Excel 2016 and is a free download for Excel 2010 and 2013. Check out this 4-minute video, which steps through the process of combining the data from every CSV file in a folder into one table, then impress your colleagues as you replicate this demo for them!
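If you want a peek at the kind of query the video builds, here is a minimal sketch in Power Query M; the folder path is hypothetical, and it assumes every CSV in the folder has the same layout with headers in the first row:

let
    Source = Folder.Files("C:\SalesData"),
    Parsed = Table.AddColumn(Source, "Data", each Table.PromoteHeaders(Csv.Document([Content]))),
    Combined = Table.Combine(Parsed[Data])
in
    Combined

Dropping a new CSV file into the folder and clicking Refresh is then all it takes to pull the extra data into the combined table.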
Intro to Get & Transform

We just released 3 great free hosted labs for Excel. Take a look at the Intro to Get & Transform lab. This lab is intended to serve as an introduction to the Get & Transform feature area in Excel 2016. The lab walks through editing and transforming the data you are importing, merging multiple queries together, and creating simple PivotTables to view the data in Excel. Hope you like it!
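As a taste of the "merging multiple queries" part of the lab, a merge in Power Query ultimately becomes a join step like the hypothetical one below; the query names (Orders, Customers) and the key column are assumptions, and in practice you would build this through the Merge Queries dialog rather than typing it:

= Table.NestedJoin(Orders, {"CustomerID"}, Customers, {"CustomerID"}, "CustomerDetails", JoinKind.LeftOuter)

Expanding the new "CustomerDetails" column then brings the matching customer fields into the Orders query.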