# Building a Restaurant Management System with Azure Database for MySQL
In this hands-on tutorial, we'll build a Restaurant Management System using Azure Database for MySQL. This project is perfect for beginners looking to understand cloud databases while creating something practical.

# Tutorial: Building AI Agents That Talk to Your Azure Database for MySQL
What if you could ask your database a question in plain English and get the answer instantly, without writing a single line of SQL? In this tutorial, you'll build a Python-based AI agent that connects to an Azure Database for MySQL server and uses OpenAI's function calling to translate natural language questions into SQL queries, execute them, and return human-readable answers. The agent can explore your schema, answer business questions, and even self-correct when it writes invalid SQL.

**What you'll build:**

- An Azure Database for MySQL server with sample data
- A Python AI agent with three tools: `list_tables`, `describe_table`, and `run_sql_query`
- An interactive chat interface where you ask questions and the agent auto-generates and runs SQL

In the context of AI agents, tools are functions the agent can call to interact with external systems, like querying a database, fetching a file, or calling an API. Here, our agent has three tools that let it explore and query your MySQL database.

## Prerequisites

Before you begin, make sure you have:

- An Azure account — Sign up for free (includes 12 months of free MySQL hosting)
- An OpenAI API key — Get one here (you'll need a few dollars of credit)
- Python 3.10+ — Download here (check "Add to PATH" during install)
- A code editor — VS Code recommended

Optional: You can download the complete project from this GitHub repository, or follow the step-by-step instructions below to build it from scratch.

## Step 1 — Create the Azure Database for MySQL server

1. Go to the Azure Portal.
2. Search for "Azure Database for MySQL server" and click **+ Create**.
3. Configure the following settings:

| Setting | Value |
| --- | --- |
| Resource group | `rg-mysql-ai-agent` (create new) |
| Server name | `mysql-ai-agent` (or any unique name) |
| Region | Your nearest region |
| MySQL version | 8.4 |
| Workload type | Dev/Test (Burstable B1ms — free for 12 months) |
| Admin username | `mysqladmin` |
| Password | A strong password — save it! |

4. ✅ Check "Add firewall rule for current IP address".

⚠️ Important: If you skip the firewall settings, you won't be able to connect from Cloud Shell or your local machine.

5. Click **Review + create** → **Create** and wait 3–5 minutes.

Once deployment finishes, navigate to your server and note the hostname from the Connection details: `mysql-ai-agent.mysql.database.azure.com`

## Step 2 — Load Sample Data

Open Azure Cloud Shell by clicking the `>_` icon in the portal's top toolbar. Select **Bash** if prompted.

Connect to your MySQL server. You can copy the exact connection command from the "Connect from browser or locally" section on your server's overview page in the Azure portal:

```shell
mysql -h mysql-ai-agent.mysql.database.azure.com -u mysqladmin -p
```

Enter your password when prompted (the cursor won't move — just type and press Enter).

Now paste the following SQL to create a sample sales database:

```sql
CREATE DATABASE demo_sales;
USE demo_sales;

CREATE TABLE customers (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100),
    email VARCHAR(100),
    city VARCHAR(50),
    signup_date DATE
);

CREATE TABLE orders (
    id INT AUTO_INCREMENT PRIMARY KEY,
    customer_id INT,
    product VARCHAR(100),
    amount DECIMAL(10,2),
    order_date DATE,
    FOREIGN KEY (customer_id) REFERENCES customers(id)
);

INSERT INTO customers (name, email, city, signup_date) VALUES
('Sara Ahmed', 'sara@example.com', 'Cairo', '2024-06-15'),
('John Smith', 'john@example.com', 'London', '2024-08-22'),
('Priya Patel', 'priya@example.com', 'Mumbai', '2025-01-10');

INSERT INTO orders (customer_id, product, amount, order_date) VALUES
(1, 'Azure Certification Voucher', 150.00, '2025-03-01'),
(2, 'MySQL Workbench Pro License', 99.00, '2025-03-10'),
(1, 'Power BI Dashboard Template', 45.00, '2025-04-05'),
(3, 'Data Analysis Course', 200.00, '2025-05-20');
```

Verify the data:

```sql
SELECT * FROM customers;
SELECT * FROM orders;
```

Type `exit` to leave MySQL.
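It can help to note what this sample data implies before wiring up the agent. The following snippet is an optional, illustrative sketch (not part of the tutorial's code); the tuples simply mirror the `INSERT` statements above, so you can pre-compute the answers the agent should later reproduce:

```python
# The order rows inserted above, mirrored as plain tuples:
# (customer_id, product, amount)
orders = [
    (1, "Azure Certification Voucher", 150.00),
    (2, "MySQL Workbench Pro License", 99.00),
    (1, "Power BI Dashboard Template", 45.00),
    (3, "Data Analysis Course", 200.00),
]

# Each product appears exactly once, so per-product revenue is just its amount.
revenue_by_product = {product: amount for _, product, amount in orders}
top_product = max(revenue_by_product, key=revenue_by_product.get)

# Total spend per customer, for cross-checking the agent's answers later.
spend_by_customer = {}
for customer_id, _, amount in orders:
    spend_by_customer[customer_id] = spend_by_customer.get(customer_id, 0) + amount
```

With this data, the top product by revenue is the Data Analysis Course (200.00, bought by customer 3, Priya Patel), and customer 1 (Sara Ahmed) has spent 195.00 in total, which is handy for checking the test prompt in Step 6.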
## Step 3 — Set Up the Python Project

Open a terminal on your local machine and create the project:

```shell
mkdir mysql-ai-agent
cd mysql-ai-agent
python -m venv venv
```

Activate the virtual environment:

Windows (PowerShell):

```shell
venv\Scripts\Activate.ps1
```

macOS/Linux:

```shell
source venv/bin/activate
```

Install the required packages:

```shell
pip install openai mysql-connector-python python-dotenv
```

## Step 4 — Configure Environment Variables

Create a file named `.env` in your project folder:

```
OPENAI_API_KEY=sk-proj-xxxxxxxxxxxxxxxxxxxxxxxx
MYSQL_HOST=mysql-ai-agent.mysql.database.azure.com
MYSQL_USER=mysqladmin
MYSQL_PASSWORD=YourPasswordHere
MYSQL_DATABASE=demo_sales
```

🔒 Security: Never commit this file to Git. Add `.env` to your `.gitignore`.

## Step 5 — Build the Agent

Open VS Code, create a new file called `mysql_agent.py` in your `mysql-ai-agent` folder, and paste the following code. Let's walk through each section.

### 5.1 — Imports and Database Connection

```python
import os
import json

import mysql.connector
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

def get_db_connection():
    return mysql.connector.connect(
        host=os.getenv("MYSQL_HOST"),
        user=os.getenv("MYSQL_USER"),
        password=os.getenv("MYSQL_PASSWORD"),
        database=os.getenv("MYSQL_DATABASE"),
        ssl_disabled=False
    )
```

This loads your secrets from `.env` and creates a reusable MySQL connection function with SSL encryption.
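A quick way to catch configuration mistakes early is to verify that every variable the agent reads is actually set. This small helper is an optional sketch (not part of the tutorial's script), assuming the same variable names as the `.env` file above:

```python
import os

# The settings the agent reads via os.getenv(); names match the .env file above.
REQUIRED_SETTINGS = [
    "OPENAI_API_KEY",
    "MYSQL_HOST",
    "MYSQL_USER",
    "MYSQL_PASSWORD",
    "MYSQL_DATABASE",
]

def missing_settings(env=None):
    """Return the names of required settings that are absent or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_SETTINGS if not env.get(name)]
```

Call `load_dotenv()` first, then `missing_settings()`; if it returns a non-empty list, fix your `.env` file before running the agent.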
### 5.2 — Define the Three Tools

These are the functions the AI agent can call:

```python
def list_tables():
    conn = get_db_connection()
    cursor = conn.cursor()
    cursor.execute("SHOW TABLES")
    tables = [row[0] for row in cursor.fetchall()]
    cursor.close()
    conn.close()
    return json.dumps({"tables": tables})

def describe_table(table_name):
    conn = get_db_connection()
    cursor = conn.cursor()
    cursor.execute(f"DESCRIBE `{table_name}`")
    columns = []
    for row in cursor.fetchall():
        columns.append({
            "field": row[0],
            "type": row[1],
            "null": row[2],
            "key": row[3]
        })
    cursor.close()
    conn.close()
    return json.dumps({"table": table_name, "columns": columns})

def run_sql_query(query):
    if not query.strip().upper().startswith("SELECT"):
        return json.dumps({"error": "Only SELECT queries are allowed."})
    conn = get_db_connection()
    cursor = conn.cursor()
    try:
        cursor.execute(query)
        columns = [desc[0] for desc in cursor.description]
        rows = cursor.fetchall()
        results = []
        for row in rows:
            results.append(dict(zip(columns, row)))
        return json.dumps({"columns": columns, "rows": results}, default=str)
    except mysql.connector.Error as e:
        return json.dumps({"error": str(e)})
    finally:
        cursor.close()
        conn.close()
```

A few things to note:

- `run_sql_query` only allows SELECT statements — this is a safety guardrail that prevents the AI from modifying data.
- The try/except block is critical — if the AI generates invalid SQL (e.g., a bad GROUP BY), the error message is returned to OpenAI, and the model automatically corrects its query and retries. Without this, the script would crash.
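The `startswith("SELECT")` check is intentionally simple. If you want a slightly more defensive guardrail, a sketch like the following (illustrative only, and deliberately conservative: it can reject a legitimate SELECT whose string literals merely mention a write keyword) also blocks stacked statements. The real backstop remains a SELECT-only database user, as covered under Security Best Practices:

```python
import re

# Write keywords to reject anywhere in the query. Deliberately conservative:
# a SELECT containing e.g. the word 'update' in a string literal is also rejected.
_WRITE_KEYWORDS = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|GRANT|TRUNCATE|REPLACE)\b",
    re.IGNORECASE,
)

def is_safe_select(query: str) -> bool:
    """Illustrative stricter variant of the tutorial's SELECT-only guard."""
    stripped = query.strip().rstrip(";").strip()
    if not stripped.upper().startswith("SELECT"):
        return False
    if ";" in stripped:  # reject stacked statements like "SELECT 1; DROP TABLE t"
        return False
    if _WRITE_KEYWORDS.search(stripped):
        return False
    return True
```

To use it, swap it in for the first `if` in `run_sql_query` and return the same `{"error": ...}` payload when it returns `False`.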
### 5.3 — Register Tools with OpenAI

This tells OpenAI what tools the agent has access to:

```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "list_tables",
            "description": "List all tables in the connected MySQL database.",
            "parameters": {"type": "object", "properties": {}, "required": []}
        }
    },
    {
        "type": "function",
        "function": {
            "name": "describe_table",
            "description": "Get the schema (columns and types) of a specific table.",
            "parameters": {
                "type": "object",
                "properties": {
                    "table_name": {"type": "string", "description": "Name of the table to describe"}
                },
                "required": ["table_name"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "run_sql_query",
            "description": "Execute a read-only SQL SELECT query and return results.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "The SQL SELECT query to execute"}
                },
                "required": ["query"]
            }
        }
    }
]

def call_tool(name, args):
    if name == "list_tables":
        return list_tables()
    elif name == "describe_table":
        return describe_table(args["table_name"])
    elif name == "run_sql_query":
        return run_sql_query(args["query"])
    else:
        return json.dumps({"error": f"Unknown tool: {name}"})
```

### 5.4 — The Agent Loop

This is the core logic.
It sends the user's message to OpenAI, processes any tool calls, and loops until the model produces a final text response:

```python
def chat(user_message, conversation_history):
    client = OpenAI()
    conversation_history.append({"role": "user", "content": user_message})

    print(f"\n{'='*60}")
    print(f"🧑 You: {user_message}")
    print(f"{'='*60}")

    while True:
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=conversation_history,
            tools=tools,
            tool_choice="auto"
        )
        assistant_message = response.choices[0].message

        if assistant_message.tool_calls:
            conversation_history.append(assistant_message)
            for tool_call in assistant_message.tool_calls:
                fn_name = tool_call.function.name
                fn_args = json.loads(tool_call.function.arguments)
                print(f"  🔧 Calling tool: {fn_name}({json.dumps(fn_args)})")
                result = call_tool(fn_name, fn_args)
                print(f"  ✅ Tool returned: {result[:200]}...")
                conversation_history.append({
                    "role": "tool",
                    "tool_call_id": tool_call.id,
                    "content": result
                })
        else:
            final_answer = assistant_message.content
            conversation_history.append({"role": "assistant", "content": final_answer})
            print(f"\n🤖 Agent:\n{final_answer}")
            return conversation_history
```

The `while True` loop is what makes self-correction possible. When a tool returns an error, the model sees it in the conversation and generates a corrected tool call in the next iteration.

### 5.5 — Main Entry Point

```python
if __name__ == "__main__":
    print("\n" + "=" * 60)
    print("  🤖 MySQL AI Agent")
    print("  Powered by OpenAI + Azure Database for MySQL")
    print("  Type 'quit' to exit")
    print("=" * 60)

    system_message = {
        "role": "system",
        "content": (
            "You are a helpful data analyst agent connected to an Azure Database for MySQL. "
            "You have 3 tools: list_tables, describe_table, and run_sql_query. "
            "ALWAYS start by listing tables and describing their schema before writing queries. "
            "Only generate SELECT statements. Never write INSERT, UPDATE, DELETE, or DROP. "
            "Present query results in clean, readable tables. "
            "If the user asks a question, figure out the right SQL to answer it."
        )
    }
    conversation_history = [system_message]

    while True:
        user_input = input("\n🧑 You: ").strip()
        if user_input.lower() in ("quit", "exit", "q"):
            print("\n👋 Goodbye!")
            break
        if not user_input:
            continue
        conversation_history = chat(user_input, conversation_history)
```

Your final project folder should look like this:

```
mysql-ai-agent/
├── venv/
├── .env
└── mysql_agent.py
```

## Step 6 — Run and Test the Agent

```shell
python mysql_agent.py
```

Test prompt: *Which product generated the most revenue and who bought it?*

## How Self-Correction Works

One of the most powerful aspects of this agent is its ability to recover from SQL errors automatically. Azure Database for MySQL has `sql_mode=only_full_group_by` enabled by default, which rejects queries where non-aggregated columns aren't in the GROUP BY clause. When the AI generates an invalid query, here's what happens:

1. The `run_sql_query` function catches the MySQL error.
2. It returns the error message as the tool result.
3. OpenAI sees the error in the conversation context.
4. The model generates a corrected query automatically.
5. The agent retries — and succeeds.

Without the try/except error handling, the entire script would crash. This is a key design pattern for production AI agents.

## Security Best Practices

When building AI agents that interact with databases, security is critical:

- **Read-only enforcement** — The `run_sql_query` function rejects anything that isn't a SELECT statement.
- **SSL encryption** — All connections use `ssl_disabled=False`, ensuring data in transit is encrypted.
- **Environment variables** — Credentials are stored in `.env`, never hardcoded.
- **Principle of least privilege** — For production, create a dedicated MySQL user with SELECT-only permissions:

```sql
CREATE USER 'ai_agent'@'%' IDENTIFIED BY 'AgentPass123!';
GRANT SELECT ON demo_sales.* TO 'ai_agent'@'%';
FLUSH PRIVILEGES;
```

- **Network isolation** — For production workloads, consider using Azure Private Link instead of public access.
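The retry pattern described above can be sketched in isolation, without OpenAI or a database. In this illustrative stand-in (none of these names come from the tutorial's script), `candidate_queries` plays the role of successive model generations, and each error result is what would be appended to the conversation for the model to see:

```python
import json

def retry_until_success(candidate_queries, run_query):
    """Sketch of the self-correction loop: try each candidate query in turn,
    treating an {"error": ...} tool result as feedback for the next attempt."""
    feedback = []  # stands in for error messages appended to the conversation
    for query in candidate_queries:
        result = json.loads(run_query(query))
        if "error" not in result:
            return result, feedback
        feedback.append(result["error"])
    return {"error": "all candidates failed"}, feedback

# A fake tool that rejects the first, only_full_group_by-violating query.
def fake_run_query(query):
    if "GROUP BY" not in query.upper():
        return json.dumps({"error": "1055: incompatible with sql_mode=only_full_group_by"})
    return json.dumps({"rows": [{"city": "Cairo", "total": 195.0}]})
```

In the real agent the "next candidate" is produced by the model after it reads the error; here the list of candidates simply makes that turn-taking visible and testable.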
## Conclusion

In this tutorial, you built a Python AI agent that connects to Azure Database for MySQL and answers natural language questions by auto-generating SQL, complete with self-correction and security guardrails. Clone the GitHub repo, spin up your own server, and start experimenting!

If you'd like to connect to Azure Database for MySQL using the Model Context Protocol (MCP), see Unlocking AI-Driven Data Access: Azure Database for MySQL Support via the Azure MCP Server.

If you have any feedback or questions about the information provided above, please leave a comment below. Thank you!

# New series of monthly Live Webinars on Azure Database for MySQL!
Today we are announcing a new series of monthly live webinars about Azure Database for MySQL! These sessions will showcase newly released features and capabilities, technical deep dives, and demos. The product group will also address your questions about the service in real time!

# Microsoft Azure innovation powers leading price-performance for MySQL database in the cloud
As part of our commitment to ensuring that Microsoft Azure is the best place to run MySQL workloads, we are excited to announce that Azure Database for MySQL - Flexible Server has achieved a new, faster performance benchmark.

# Deploying Moodle on Azure – things you should know
Moodle is one of the most popular open-source learning management platforms, empowering educators and researchers across the world to disseminate their work efficiently. It is also one of the most mature and robust OSS applications, which the community has developed and improved over the years. We have seen customers ranging from small, medium, and large enterprises to schools, the public sector, and government organizations deploying Moodle on Azure. In this blog post, I'll share some best practices and tips for deploying Moodle on Azure based on our experience working with several of our customers.

# Leverage Flexible Server's Business Critical service tier for mission-critical applications
The Business Critical service tier is ideal for mission-critical Tier 1 workloads, such as ecommerce, financial, or internet-scale applications, that rely heavily on Azure Database for MySQL - Flexible Server to always be available, operational, and resilient to failure. Organizations with mission-critical workloads that require low latency, high queries per second (QPS), high concurrency, fast failover, and high throughput should run or build their applications on servers in the Business Critical service tier.

# Flexible maintenance options for Azure Database for MySQL (Preview)
Flexible maintenance functionality is designed to give you unprecedented control over your maintenance operations. Today, we're pleased to announce preview support for new flexible maintenance options in Azure Database for MySQL!

# Unlocking AI-Driven Data Access: Azure Database for MySQL Support via the Azure MCP Server
Step into a new era of data-driven intelligence with the fusion of the Azure MCP Server and Azure Database for MySQL, where your MySQL data is no longer just stored but instantly conversational, intelligent, and action-ready. By harnessing the open-standard Model Context Protocol (MCP), your AI agents can now query, analyze, and automate in natural language, accessing tables, surfacing insights, and acting on your MySQL-driven business logic as easily as chatting with a colleague. It's like giving your data a voice and your applications a brain, all within Azure's trusted cloud platform.

We are excited to announce that we have added support for Azure Database for MySQL in the Azure MCP Server. The Azure MCP Server leverages the Model Context Protocol (MCP) to allow AI agents to seamlessly interact with various Azure services and perform context-aware operations such as querying databases and managing cloud resources. Building on this foundation, the Azure MCP Server now offers a set of tools that AI agents and apps can invoke to interact with Azure Database for MySQL, enabling them to list and query databases, retrieve schema details of tables, and access server configurations and parameters. These capabilities are delivered through the same standardized interface used for other Azure services, making it easier to adopt the MCP standard and leverage AI to work with your business data and operations across the Azure ecosystem.

Before we delve into these new tools and explore how to get started with them, let's take a moment to refresh our understanding of MCP and the Azure MCP Server: what they are, how they work, and why they matter.

## MCP architecture and key components

The Model Context Protocol (MCP) is an emerging open protocol designed to integrate AI models with external data sources and services in a scalable, standardized, and secure manner.
MCP dictates a client-server architecture with four key components: the MCP Host, the MCP Client, the MCP Server, and the external data sources, services, and APIs that provide the data context required to enhance AI models. Briefly, an MCP Host (an AI app or agent) includes an MCP Client component that connects to one or more MCP Servers. These servers are lightweight programs that securely interface with external data sources, services, and APIs and expose them to MCP Clients in the form of standardized capabilities called tools, resources, and prompts.

Learn more: MCP Documentation

## What is Azure MCP Server?

Azure offers a multitude of cloud services that help developers build robust applications and AI solutions to address business needs. The Azure MCP Server exposes these powerful services for agentic usage, allowing AI systems to perform operations that are context-aware of your Azure resources and the business data within them, while ensuring adherence to the Model Context Protocol. It supports a wide range of Azure services and tools, including Azure AI Search, Azure Cosmos DB, Azure Storage, Azure Monitor, the Azure CLI, and Developer CLI extensions. This means that you can empower AI agents, apps, and tools to:

- Explore your Azure resources, such as listing and retrieving details on your Azure subscriptions, resource groups, services, databases, and tables.
- Search, query, and analyze your data and logs.
- Execute Azure CLI and Azure Developer CLI commands directly, and more!

Learn more: Azure MCP Server GitHub Repository

## Introducing new Azure MCP Server tools to interact with Azure Database for MySQL

The Azure MCP Server now includes the following tools, which allow AI agents to interact with Azure Database for MySQL, and the valuable business data residing in these servers, in accordance with the MCP standard:

| Tool | Description | Example Prompts |
| --- | --- | --- |
| `azmcp_mysql_server_list` | List all MySQL servers in a subscription & resource group | "List MySQL servers in resource group 'prod-rg'." "Show MySQL servers in region 'eastus'." |
| `azmcp_mysql_server_config_get` | Retrieve the configuration of a MySQL server | "What is the backup retention period for server 'my-mysql-server'?" "Show storage allocation for server 'my-mysql-server'." |
| `azmcp_mysql_server_param_get` | Retrieve a specific parameter of a MySQL server | "Is slow_query_log enabled on server my-mysql-server?" "Get innodb_buffer_pool_size for server my-mysql-server." |
| `azmcp_mysql_server_param_set` | Set a specific parameter of a MySQL server to a specific value | "Set max_connections to 500 on server my-mysql-server." "Set wait_timeout to 300 on server my-mysql-server." |
| `azmcp_mysql_table_list` | List all tables in a MySQL database | "List tables starting with 'tmp_' in database 'appdb'." "How many tables are in database 'analytics'?" |
| `azmcp_mysql_table_schema_get` | Get the schema of a specific table in a MySQL database | "Show indexes for table 'transactions' in database 'billing'." "What is the primary key for table 'users' in database 'auth'?" |
| `azmcp_mysql_database_query` | Execute a SELECT query on a MySQL database. For security reasons, the query must start with SELECT and cannot contain any destructive SQL operations. | "How many orders were placed in the last 30 days in the salesdb.orders table?" "Show the number of new users signed up in the last week in appdb.users grouped by day." |

These interactions are secured using Microsoft Entra authentication, which enables seamless, identity-based access to Azure Database for MySQL, eliminating the need for password storage and enhancing overall security.

## How are these new tools different from the standalone MCP Server for Azure Database for MySQL?

We have integrated the key capabilities of the Azure Database for MySQL MCP Server into the Azure MCP Server, making it easier to connect your agentic apps not only to Azure Database for MySQL but also to other Azure services through one unified and secure interface!
## How to get started

Installing and running the Azure MCP Server is quick and easy! Use GitHub Copilot in Visual Studio Code to gain meaningful insights from your business data in Azure Database for MySQL.

**Prerequisites**

- Install Visual Studio Code.
- Install the GitHub Copilot and GitHub Copilot Chat extensions.
- An Azure Database for MySQL server with Microsoft Entra authentication enabled.
- Ensure that the MCP Server is installed on a system with network connectivity and credentials to connect to Azure Database for MySQL.

**Installation and Testing**

Please use this guide for installation: Azure MCP Server Installation Guide

Try the following prompts with your Azure Database for MySQL server: Azure Database for MySQL tools for Azure MCP Server

## Try it out and share your feedback!

Start using the Azure MCP Server with the MySQL tools today and let our cloud services become your AI agent's most powerful ally. We're counting on your feedback: every comment, suggestion, or bug report helps us build better tools together. Stay tuned: more features and capabilities are on the horizon! Feel free to comment below or write to us with your feedback and queries at AskAzureDBforMySQL@service.microsoft.com.

# Azure Database for MySQL triggers for Azure Functions (Public Preview)
Developers can now accelerate development and focus only on the core business logic of their applications when building event-driven applications with Azure Database for MySQL as the backend data store. We are excited to announce that you can now invoke an Azure Function based on changes to an Azure Database for MySQL table. This new capability is made possible through the Azure Database for MySQL triggers for Azure Functions, now available in public preview.

## Azure Database for MySQL triggers

The Azure Database for MySQL trigger uses change-tracking functionality to monitor a MySQL table for changes and trigger a function when a row is created or updated, enabling customers to build highly scalable event-driven applications. As with the Azure Database for MySQL input and output bindings for Azure Functions, a connection string for the MySQL database is stored in the application settings of the Azure Function, and the function is triggered when a change is detected on the tracked tables.

Note: In public preview, Azure Database for MySQL triggers for Azure Functions are available only for the Dedicated and Premium plans of Azure Functions.

To enable change tracking on an existing Azure Database for MySQL table so it can be used with trigger bindings, you must alter the table structure. For example, to enable change tracking on an employees table:

```sql
ALTER TABLE employees
ADD COLUMN az_func_updated_at TIMESTAMP
DEFAULT CURRENT_TIMESTAMP
ON UPDATE CURRENT_TIMESTAMP;
```

The Azure Database for MySQL trigger uses the `az_func_updated_at` column's data to monitor for changes on tables where change tracking is enabled. Changes are then processed in the order that they were made, with the oldest changes being processed first.

Important: If changes to multiple rows are made at once, then the exact order they're sent to the function is determined by the ascending order of the `az_func_updated_at` and primary key columns.
If multiple changes are made to a row between iterations, then only the latest change for that particular row is considered.

The following example demonstrates a C# function that is triggered when changes occur in the employees table. The MySQL trigger uses attributes for the table name and the connection string:

```csharp
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.MySql;
using Microsoft.Extensions.Logging;

namespace EmployeeSample.Function
{
    public static class EmployeesTrigger
    {
        [FunctionName(nameof(EmployeesTrigger))]
        public static void Run(
            [MySqlTrigger("Employees", "MySqlConnectionString")]
            IReadOnlyList<MySqlChange<Employee>> changes,
            ILogger logger)
        {
            foreach (MySqlChange<Employee> change in changes)
            {
                Employee employee = change.Item;
                logger.LogInformation($"Change operation: {change.Operation}");
                logger.LogInformation(
                    $"EmployeeId: {employee.employeeId}, FirstName: {employee.FirstName}, " +
                    $"LastName: {employee.LastName}, Company: {employee.Company}, " +
                    $"Department: {employee.Department}, Role: {employee.Role}");
            }
        }
    }
}
```

## Join the preview and share your feedback!

We are eager for you to try out the new Azure Database for MySQL triggers for Azure Functions and build highly scalable event-driven and serverless applications. For more information about using MySQL triggers with all the supported programming frameworks, including detailed step-by-step instructions, refer to https://aka.ms/mysqltriggers.

If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!
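The ordering and "latest change wins" rules described above can be illustrated with a small sketch. This is not the extension's actual implementation; the dict keys here (`id` for the primary key, `az_func_updated_at` for the tracking timestamp) are hypothetical stand-ins for a tracked row's fields:

```python
def order_changes(changes):
    """Keep only the latest change per primary key, then sort ascending by
    (az_func_updated_at, primary key), mirroring the processing order above."""
    latest = {}
    for change in changes:
        pk = change["id"]
        prev = latest.get(pk)
        # A later az_func_updated_at supersedes earlier changes to the same row.
        if prev is None or change["az_func_updated_at"] >= prev["az_func_updated_at"]:
            latest[pk] = change
    return sorted(latest.values(),
                  key=lambda c: (c["az_func_updated_at"], c["id"]))
```

Running this over a batch where row 2 was changed twice yields one entry per row (the latest change for row 2), ordered oldest first.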