web development
Troubleshooting Performance Problems related to Application Domains reloading
Throughout my time working on the IIS team, I've seen many cases where frequent unloading of Application Domains caused serious performance problems. While there are many blog posts covering the mechanism ASP.NET uses to detect changes in files, each of them covers only one aspect of the problem. Hopefully this article provides a more unified perspective and helps you understand how to troubleshoot or prevent these issues.
Fueling the Agentic Web Revolution with NLWeb and PostgreSQL

We're excited to announce that NLWeb (Natural Language Web), Microsoft's open project for natural language interfaces on websites, now supports PostgreSQL. With this enhancement, developers can use PostgreSQL and NLWeb to transform any website into an AI-powered application or Model Context Protocol (MCP) server. This integration lets organizations use a familiar, robust database as the foundation for conversational AI experiences, streamlining deployment while maximizing data security and scalability.

Soon, autonomous agents, not just human users, will consume and interpret website content, transforming how information is accessed and used online. At Microsoft Build 2025, Microsoft introduced the era of the open agentic web: a new paradigm in which autonomous agents seamlessly interact across individual, organizational, team, and end-to-end business contexts. To help realize that future, Microsoft announced the NLWeb project. NLWeb turns any website into an AI-powered application with just a few lines of code, by connecting it to an AI model and a knowledge base.

In this post, we'll cover:
- What NLWeb is and how it works with vector databases
- How pgvector enables vector similarity search in PostgreSQL for NLWeb
- How to get started using NLWeb with Postgres

Let's dive in and see how Postgres + NLWeb can redefine conversational web interfaces while keeping your data in a familiar, powerful database.

What is NLWeb? A Quick Overview of Conversational Web Interfaces

NLWeb is an open project developed by Microsoft to simplify adding conversational AI interfaces to websites. How NLWeb works under the hood:
- Processes existing website content in semi-structured formats such as Schema.org, RSS, and other data that websites already publish
- Embeds and indexes all the content in a vector store (e.g., PostgreSQL with pgvector)
- Routes user queries through several processes that handle natural language understanding, reranking, and retrieval
- Answers queries with an LLM

The result is a high-quality natural language interface on top of web data, giving developers the ability to let users "talk to" web data. By default, every NLWeb instance is also a Model Context Protocol (MCP) server, allowing websites to make their content discoverable and accessible to agents and other participants in the MCP ecosystem if they choose. Importantly, NLWeb is platform-agnostic: it supports all major operating systems and many AI models and vector stores, and the project is modular by design, so developers can bring their own retrieval system and model APIs and define their own extensions.

NLWeb with PostgreSQL

PostgreSQL is now embedded into the NLWeb reference stack as a native retriever, creating a scalable and flexible path for deploying NLWeb instances on open-source infrastructure.

Retrieval Powered by pgvector

NLWeb uses pgvector, a PostgreSQL extension for efficient vector similarity search, to handle natural language retrieval at scale. By integrating pgvector into the NLWeb stack, teams can eliminate the need for an external vector database: web data stored in PostgreSQL becomes immediately searchable and usable for NLWeb experiences, streamlining infrastructure and enhancing security. PostgreSQL's robust governance features and wide adoption align with NLWeb's mission to enable conversational AI for any website or content platform. With pgvector retrieval built in, developers can confidently launch NLWeb instances on their own databases, with no additional infrastructure required.
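To make the retrieval pattern concrete, here is a minimal, illustrative sketch of pgvector similarity search in Python. The table name, column names, and 1536-dimension embedding size are assumptions for illustration only, not NLWeb's actual schema, and the placeholder embed() function stands in for a real embedding model call.

```python
import os
import numpy as np
import psycopg
from pgvector.psycopg import register_vector

conn = psycopg.connect(os.environ["POSTGRES_CONNECTION_STRING"], autocommit=True)
conn.execute("CREATE EXTENSION IF NOT EXISTS vector")  # requires sufficient privileges
register_vector(conn)  # teach psycopg how to send/receive vector values

conn.execute("""
    CREATE TABLE IF NOT EXISTS demo_documents (
        id bigserial PRIMARY KEY,
        content text,
        embedding vector(1536)
    )
""")

def embed(text: str) -> np.ndarray:
    # Placeholder: in a real setup this would call your embedding model
    # (e.g. text-embedding-3-small). Random vectors are only for wiring checks.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(1536, dtype=np.float32)

# Index a couple of documents.
for doc in ["Postgres adds new indexing options...", "DiskANN speeds up vector search..."]:
    conn.execute(
        "INSERT INTO demo_documents (content, embedding) VALUES (%s, %s)",
        (doc, embed(doc)),
    )

# Retrieve the most similar documents for a user question (<=> is cosine distance).
question = "What's new in vector search?"
rows = conn.execute(
    "SELECT content, embedding <=> %s AS distance "
    "FROM demo_documents ORDER BY distance LIMIT 3",
    (embed(question),),
).fetchall()
for content, distance in rows:
    print(f"{distance:.3f}  {content}")
```

NLWeb's Postgres retriever handles this indexing and querying for you; the sketch simply shows the underlying mechanism that keeps web content searchable inside the same database that stores it.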
Implementation example

We are going to use NLWeb and Postgres to create a conversational AI app and MCP server that will let us chat with content from the Talking Postgres with Claire Giordano podcast!

Prerequisites
- An active Azure account.
- Enable and configure the pgvector extension.
- Create an Azure AI Foundry project.
- Deploy the gpt-4.1, gpt-4.1-mini, and text-embedding-3-small models.
- Install Visual Studio Code.
- Install the Python extension.
- Install Python 3.11.x.
- Install the Azure CLI (latest version).

Getting started

All the code and sample datasets are available in this GitHub repository.

Step 1: Set up the NLWeb server

1. Clone or download the code from the repo.

git clone https://github.com/microsoft/NLWeb
cd NLWeb

2. Open a terminal, create a virtual Python environment, and activate it.

python -m venv myenv
source myenv/bin/activate   # Or on Windows: myenv\Scripts\activate

3. Go to the 'code/python' folder in NLWeb to install the dependencies.

cd code/python
pip install -r requirements.txt

4. Go to the project root folder in NLWeb and copy the .env.template file to a new .env file.

cd ../../
cp .env.template .env

5. In the .env file, update the API key you will use for your LLM endpoint of choice and update the Postgres connection string. For example:

AZURE_OPENAI_ENDPOINT="https://TODO.openai.azure.com/"
AZURE_OPENAI_API_KEY="<TODO>"

# If using Postgres connection string
POSTGRES_CONNECTION_STRING="postgresql://<HOST>:<PORT>/<DATABASE>?user=<USERNAME>&sslmode=require"
POSTGRES_PASSWORD="<PASSWORD>"

6. Update your config files (located in the config folder) to make sure your preferred providers match your .env file. Three files may need changes:

- config_llm.yaml: Update the first line to the LLM provider you set in the .env file. By default it is Azure OpenAI. You can also adjust the models you call here; by default we assume gpt-4.1 and gpt-4.1-mini.
- config_embedding.yaml: Update the first line to your preferred embedding provider. By default it is Azure OpenAI, using text-embedding-3-small.
- config_retrieval.yaml: Update the first line to postgres. Set write_endpoint to postgres, and set the postgres retrieval endpoint to enabled: 'true' in the list of possible endpoints.

Step 2: Initialize the Postgres server

Go to the 'code/python/misc' folder in NLWeb to run the Postgres initializer. NOTE: If you are using Azure Postgres Flexible Server, make sure the `vector` extension is allow-listed and enabled on the database.

cd code/python/misc
python postgres_load.py

Step 3: Ingest data from the Talking Postgres podcast

Now we will load some data into our vector database to test with. Go to the 'code/python' folder in NLWeb and run the load command. The format of the command is as follows (make sure you are still in the 'python' folder when you run it):

python -m data_loading.db_load <RSS URL> <site-name>

For the Talking Postgres with Claire Giordano podcast:

python -m data_loading.db_load https://feeds.transistor.fm/talkingpostgres Talking-Postgres

(Optional) Check the documents table in your Postgres database to verify that all the data from the feed was uploaded.
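If you'd like to script that optional check, the following hedged sketch connects with the same connection values from your .env file and reports the row count. The documents table name comes from the article; the column names are not assumed.

```python
import os
import psycopg

conninfo = os.environ["POSTGRES_CONNECTION_STRING"]
password = os.environ.get("POSTGRES_PASSWORD")

with psycopg.connect(conninfo, password=password) as conn:
    # How many items did the loader ingest?
    count = conn.execute("SELECT count(*) FROM documents").fetchone()[0]
    print(f"documents rows: {count}")

    # Peek at a few rows without assuming specific column names.
    cur = conn.execute("SELECT * FROM documents LIMIT 3")
    print("columns:", [c.name for c in cur.description])
```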
Test the NLWeb server

Start your NLWeb server (again from the 'python' folder):

python app-file.py

Go to http://localhost:8000/ and start asking questions about the Talking Postgres with Claire Giordano podcast. You can try different modes.

Trying List mode — sample prompt: "I want to listen to something that talks about the advances in vector search such as DiskANN"

Trying Generate mode — sample prompt: "What did Shireesh Thota say about the future of Postgres?"

Running NLWeb with MCP

1. If you do not already have it, install MCP in your venv:

pip install mcp

2. Next, configure your Claude MCP server. If you don't have the config file already, you can create it at the following location:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

The default MCP JSON file needs to be modified as shown below.

macOS example configuration:

{
  "mcpServers": {
    "ask_nlw": {
      "command": "/Users/yourname/NLWeb/myenv/bin/python",
      "args": [
        "/Users/yourname/NLWeb/code/chatbot_interface.py",
        "--server",
        "http://localhost:8000",
        "--endpoint",
        "/mcp"
      ],
      "cwd": "/Users/yourname/NLWeb/code"
    }
  }
}

Windows example configuration:

{
  "mcpServers": {
    "ask_nlw": {
      "command": "C:\\Users\\yourusername\\NLWeb\\myenv\\Scripts\\python",
      "args": [
        "C:\\Users\\yourusername\\NLWeb\\code\\chatbot_interface.py",
        "--server",
        "http://localhost:8000",
        "--endpoint",
        "/mcp"
      ],
      "cwd": "C:\\Users\\yourusername\\NLWeb\\code"
    }
  }
}

Note: For Windows paths, you need to use double backslashes (\\) to escape the backslash character in JSON.

3. Go to the 'code/python' folder in NLWeb, enter your virtual environment, and start your local NLWeb server. Make sure it is configured to access the data you would like to ask about from Claude.

# On macOS
source ../myenv/bin/activate
python app-file.py

# On Windows
..\myenv\Scripts\activate
python app-file.py

4. Open Claude Desktop. It should ask you to trust the 'ask_nlw' external connection if it is configured correctly. After you click yes and the welcome page appears, you should see 'ask_nlw' in the bottom-right '+' options. Select it to start a query.

5. To query NLWeb, just type 'ask_nlw' in your prompt to Claude. You'll notice that you also get the full JSON script for your results. Remember, you must have your local NLWeb server running to use this option. (A small configuration sanity-check sketch follows the Learn More links below.)

Learn More

- Vector Store in Azure Postgres Flexible Server
- Generative AI in Azure Postgres Flexible Server
- NLWeb GitHub repo, which includes:
  - A reference server for handling natural language queries
  - pgvector integration
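Before opening Claude Desktop, you can optionally sanity-check the MCP configuration with a short script like the one below. It is a hedged sketch, not part of NLWeb: it assumes the config location and the 'ask_nlw' entry shown in the example configurations above, verifies the JSON parses (curly quotes or single backslashes will break it), and confirms the referenced paths exist.

```python
import json
import os
import platform
from pathlib import Path

# Config location follows the example above (assumed, not detected).
if platform.system() == "Darwin":
    config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
else:
    config_path = Path(os.environ["APPDATA"]) / "Claude" / "claude_desktop_config.json"

config = json.loads(config_path.read_text())  # fails loudly on malformed JSON
server = config["mcpServers"]["ask_nlw"]

for label, p in [("command", server["command"]),
                 ("script", server["args"][0]),
                 ("cwd", server["cwd"])]:
    status = "ok" if Path(p).exists() else "MISSING"
    print(f"{label:8} {status}  {p}")
```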
How to Create Your Own Portfolio Website in Minutes with GitHub Codespaces and Blazor

Creating a portfolio website is essential for showcasing your skills and accomplishments to potential employers or clients. However, setting up a website can be time-consuming and require technical expertise. Fortunately, with GitHub Codespaces and Blazor, you can create and customize your own portfolio website in just a few minutes, without installing any tools or worrying about lengthy environment setup. Our .NET Blazor Portfolio Site project template is perfect for beginners and experienced coders alike, and you can deploy your website with Azure Static Web Apps or GitHub Pages. This project is easily customizable and perfect for anyone looking to create a portfolio site, learn web development, or test out Codespaces. Follow our instructions to launch your Codespace, customize your website, and deploy it. No experience necessary – start today!
<a href="https://unsplash.com/@fakurian?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Milad Fakurian</a> on <a href="https://unsplash.com/?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a>Dedicated to our NTA non-profits with a focus on educating their community on how to develop new skills in tech by learning how to code.2.6KViews2likes0CommentsYour Guide to Azure Storage Options
Your Guide to Azure Storage Options

Storage Made Simple

Nonprofits often manage sensitive and crucial data, ranging from donor information to financial records and program details. Azure Storage offers a range of options that cater to the unique needs of nonprofits, ensuring that their data is stored securely and efficiently. By leveraging Azure's diverse storage solutions, nonprofits can benefit from scalable storage accounts, tier options that optimize cost, and advanced lifecycle management to streamline data retention and deletion. Azure's robust encryption and security measures further protect data integrity and help ensure compliance with regulations. Nonprofits can be confident that their valuable data is not only safe but also accessible and manageable, letting them focus on their mission-driven work without worrying about data-related issues.

Types of Storage

Azure Storage offers several types of storage solutions designed to support different use cases. Below is a brief description of each:

- Blob Storage: Ideal for storing unstructured data such as documents, images, and media files. It supports massive amounts of data with easy access.
- File Storage: Provides fully managed file shares in the cloud that can be accessed via the SMB protocol. Perfect for lift-and-shift applications.
- Queue Storage: Enables message queuing for large workloads and is useful for decoupling application components, ensuring smooth communication between services.
- Table Storage: Stores large amounts of structured data in a NoSQL key-value store. Suitable for flexible datasets and rapid development scenarios.
- Disk Storage: Offers high-performance block storage for virtual machines. Ensures data durability and supports a wide range of I/O demands.
- Azure Container Storage: Built for Kubernetes orchestration and management of containerized applications, with rapid scale-out of pods. Ideal for scalability and stateful applications.

To maximize cost efficiency, nonprofits can:

- Utilize tiered storage options to balance cost and performance based on data access patterns.
- Implement lifecycle management policies to automate data retention and deletion, reducing storage costs.
- Take advantage of scalable storage accounts to avoid over-provisioning and only pay for what they use.

A short example of tiered blob storage in practice follows the Encryption & Security section below.

Encryption & Security

Security and encryption are paramount for safeguarding sensitive information in Azure Storage. Azure provides robust encryption mechanisms to protect data across all storage types: Blob, File, Queue, Table, Disk, and Azure Container Storage all support encryption at rest, using Microsoft-managed keys or customer-managed keys for added control. Advanced features such as Azure Key Vault integration enhance security by allowing centralized management of encryption keys and secrets. Transport Layer Security (TLS) secures data in transit, ensuring that all communications are encrypted. Organizations must ensure that their data is not only accessible but also protected from unauthorized access and tampering; implementing encryption and security measures helps maintain compliance with regulatory standards and safeguards the integrity of critical information. When planning storage solutions, it's crucial to factor in these security aspects and develop a comprehensive strategy that includes encryption, secure access protocols, and regular audits to identify and mitigate potential vulnerabilities.
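Here is the tiered-storage example referenced in the cost-efficiency list above: a hedged sketch using the azure-storage-blob Python SDK to upload a rarely read file directly into the Cool access tier, so it is billed at a lower storage rate. The account URL, container name, and file names are placeholders, not values from this article.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, StandardBlobTier

service = BlobServiceClient(
    account_url="https://<your-storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),  # uses az login / managed identity
)
container = service.get_container_client("donor-reports")

with open("annual-report-2023.pdf", "rb") as data:
    container.upload_blob(
        name="archive/annual-report-2023.pdf",
        data=data,
        overwrite=True,
        standard_blob_tier=StandardBlobTier.COOL,  # e.g. Hot, Cool, or Archive
    )
print("Uploaded to the Cool access tier.")
```

Choosing the tier at upload time works well when you already know the data is archival; for data whose access pattern changes over time, lifecycle management policies (discussed next) move blobs between tiers automatically.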
Lifecycle Management

Lifecycle management is a crucial strategy for organizations seeking to optimize their data storage. It involves automatically transitioning data through storage tiers based on predefined policies tailored to the access patterns and retention requirements of the data. By implementing lifecycle management policies, organizations can keep frequently accessed data on high-performance, costlier storage while moving infrequently accessed data to more economical tiers. This approach not only reduces storage costs but also improves the efficiency of data retrieval. Lifecycle management also includes automated deletion policies for obsolete data, freeing up storage space and helping ensure compliance with data retention regulations. By adopting a comprehensive lifecycle management plan, organizations can manage their storage resources effectively, maintain data integrity, and ensure that their storage infrastructure scales with their evolving needs. (An illustrative policy sketch appears after the Hyperlinks section below.)

Conclusion

In conclusion, optimizing Azure Storage requires a balanced approach that integrates cost-effective strategies, robust security measures, and efficient lifecycle management. By leveraging tiered storage, automating data retention and deletion policies, and ensuring comprehensive encryption, organizations can achieve a secure, scalable, and economical storage environment. Through strategic planning and implementation, they can not only meet their current needs but also adapt to future demands, ensuring that their data remains accessible, protected, and efficiently managed.

Hyperlinks

- Azure Storage Documentation Hub | Microsoft Learn
- Introduction to Azure Storage - Cloud storage on Azure | Microsoft Learn
- Storage account overview - Azure Storage | Microsoft Learn
- Describe Azure storage services - Training | Microsoft Learn
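As promised above, here is an illustrative Azure Blob Storage lifecycle management policy. The rule name, blob prefix, and day thresholds are arbitrary examples, not recommendations from this article: blobs under the prefix move to Cool after 30 days without modification, to Archive after 180 days, and are deleted after roughly seven years.

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "archive-old-program-data",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["program-data/"]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 180 },
            "delete": { "daysAfterModificationGreaterThan": 2555 }
          }
        }
      }
    }
  ]
}
```

A policy like this can be saved to a file and applied with the Azure CLI, for example: az storage account management-policy create --account-name <account> --resource-group <rg> --policy @policy.json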
Festival Web

Microsoft is announcing a new initiative to help you boost your web development career: the Festival Web, a series of live talks running from October 31 through November 14. In these sessions you can learn directly from experts and get to know tools such as VSCode and GitHub. Register for the Festival Web live talks and start your journey into web programming with industry experts! It's also a free opportunity for anyone looking to get started in the field. Learn more about this Microsoft initiative in this blog post!
Getting Started with Azure App Service: A Beginner's Guide to Web App Deployment

Dive into the world of Azure App Service with our comprehensive beginner's guide! Learn the step-by-step process of deploying your first web app using Azure. From navigating the Azure portal to mastering deployment via the Azure CLI, unlock the secrets to seamless web app deployment. Get ready to elevate your skills and bring your projects to life with Azure App Service!