If you've been exploring local AI with Microsoft Foundry Local, you've learned that running a chatbot frontend on Windows Server or Windows Client that you can access over the network comes with a challenging set of dependencies.
FoundryLocalWebUI is a simple, self-contained web frontend for Foundry Local that runs on IIS, works on both Windows Server and Windows Client, and uses common Windows ecosystem components.
You can find the project on GitHub: https://github.com/itopstalk/FoundryWebUI
There's an explanatory walkthrough video here: https://youtu.be/IGWNhSQziZI
FoundryLocalWebUI is a lightweight web application designed to be hosted on IIS, which is already included with Windows Server and can be enabled on Windows Client with a few clicks. There's no need to install a separate web server, manage a package manager, or spin up a Windows Subsystem for Linux environment.
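If you prefer the command line to the "few clicks," the IIS role on Windows Client can also be enabled from an elevated PowerShell prompt. This is a sketch using the standard Windows optional-feature names; the exact feature set your deployment needs may differ:

```shell
# Enable the IIS web server role on Windows 10/11 (elevated PowerShell required)
Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServerRole -All

# Enable ASP.NET support under IIS (feature name may vary by Windows build)
Enable-WindowsOptionalFeature -Online -FeatureName IIS-ASPNET45 -All

# Confirm IIS is answering on the default site
Invoke-WebRequest -Uri http://localhost/ -UseBasicParsing | Select-Object StatusCode
```

On Windows Server, the equivalent is `Install-WindowsFeature Web-Server`.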
FoundryLocalWebUI is an experimental proof of concept. It doesn't support multiple users and just provides basic chatbot functionality. It's suitable if:
- You're evaluating Foundry Local and want a quick, no-fuss frontend to test models through a browser rather than the command line.
- You want to keep your deployment footprint small and your dependencies minimal.
- You're running Windows Client and want a local chat interface without the overhead of heavier solutions.
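Whichever scenario fits, the frontend assumes Foundry Local itself is already installed and serving models. A quick sanity check from a terminal might look like the following (the model alias is only an example; pick one from your own catalog, and note the service port is assigned dynamically, so read it from the status output):

```shell
# Confirm the Foundry Local service is running and note its endpoint/port
foundry service status

# See which models are available to run locally
foundry model list

# Load a model and chat with it in the terminal to confirm inference works
# (phi-3.5-mini is an example alias; substitute one from the list above)
foundry model run phi-3.5-mini
```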
The setup process is intentionally straightforward.
Make sure that Git is installed:
winget install --id Git.Git -e --accept-source-agreements --accept-package-agreements
Clone the repo and run the installer. (You may need to adjust the execution policy with Set-ExecutionPolicy so the PowerShell script is allowed to run.)
cd C:\Projects
git clone https://github.com/itopstalk/FoundryWebUI.git FoundryLocalWebUI
cd FoundryLocalWebUI
# Windows Server 2025:
.\Install-FoundryWebUI.ps1
# Windows 10/11:
.\Install-FoundryWebUI-Desktop.ps1
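If script execution is blocked, a process-scoped policy change is the least invasive fix: it applies only to the current PowerShell session and leaves the machine-wide setting untouched. A sketch:

```shell
# Allow scripts for this session only; nothing persists after the window closes
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass

# Alternatively, bypass the policy for a single script invocation
powershell -ExecutionPolicy Bypass -File .\Install-FoundryWebUI.ps1
```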
Full setup details are in the GitHub repo, and the walkthrough video covers the process end to end if you'd rather follow along visually.
This is still early days for the project, and I'd love to hear from the community. Local AI is becoming a real option for organizations that need to keep data on-premises and maintain control over their infrastructure.
Spin up a Windows Server 2025 evaluation VM and give it a go.