
Apps on Azure Blog

Introducing AI Playground on Azure App Service for Linux

TulikaC
Microsoft
Nov 13, 2025

If you’re running a Small Language Model (SLM) as a sidecar with your web app, there’s now a faster way to try prompts, measure latency, and copy working code into your app—all without leaving your site. AI Playground is a lightweight, built-in experience available from the Kudu endpoint for every Linux App Service. 

What is AI Playground? 

AI Playground is a simple UI that talks to the SLM you’ve attached to your App Service app (for example, Phi or BitNet via the Sidecar extension). It lets you: 

  • Send system and user prompts and view responses in-line
  • See performance metrics like Time to First Token (TTFT), total time, and tokens/sec
  • Grab ready-to-use code snippets for popular languages from the right sidebar (when you’re ready to integrate) 
  • Confirm whether a sidecar SLM is configured—and get clear guidance if it isn’t 

Sidecar SLMs were introduced earlier this year; they let you run models like Phi and BitNet alongside your app. Learn more: https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-dotnet 

Where to find the AI Playground

  1. In the Azure portal, go to your App Service (Linux). 
  2. Open Advanced Tools (Kudu), then select Go. 
  3. In the Kudu left navigation, select AI Playground. 

Note: The playground requires that an SLM sidecar is already set up with your application. Here's a tutorial to set one up: https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-dotnet

A quick tour 

Prompts panel 

  • Set a System Prompt (e.g., “You speak like a pirate.”) to steer behavior. 
  • Enter a User Prompt, then click Send to SLM (both prompts become part of the chat request, as sketched below). 
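
For a rough sense of how the two boxes relate, here's a Python sketch of the kind of chat payload they map onto, assuming the OpenAI-style chat format these sidecars commonly expose. The model name and other values are placeholders, not fixed requirements:

# Sketch of the chat payload the two prompt boxes correspond to.
payload = {
    "model": "phi",  # placeholder; use whatever your sidecar expects
    "messages": [
        {"role": "system", "content": "You speak like a pirate."},    # System Prompt
        {"role": "user", "content": "Explain App Service sidecars."}, # User Prompt
    ],
    "max_tokens": 256,  # keep requested output short for SLMs
    "stream": True,     # streaming is what makes time-to-first-token measurable
}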

Performance metrics displayed 

  • TTFT: how quickly the first token arrives—great for responsiveness checks. 
  • Total: overall response time. 
  • Tokens/sec: sustained throughput for the generation (a measurement sketch follows this list). 
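
If you want to reproduce these numbers from your own code, here's a rough Python sketch that streams a response and times it. It assumes an OpenAI-compatible streaming endpoint on localhost; the URL, port, and model name are placeholders, so copy the real values from the code snippet the playground shows for your sidecar:

import json
import time

import requests

# Placeholder endpoint and model: replace with the values from your sidecar's snippet.
URL = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "phi",  # placeholder
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "max_tokens": 64,
    "stream": True,
}

start = time.perf_counter()
first_token_at = None
chunks = 0  # each streamed delta is roughly one token, so this approximates token count

with requests.post(URL, json=payload, stream=True, timeout=120) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line or not line.startswith(b"data: "):
            continue
        data = line[len(b"data: "):]
        if data == b"[DONE]":
            break
        delta = json.loads(data)["choices"][0].get("delta", {})
        if delta.get("content"):
            if first_token_at is None:
                first_token_at = time.perf_counter()  # first token arrived
            chunks += 1

total = time.perf_counter() - start
ttft = (first_token_at - start) if first_token_at else float("nan")
print(f"TTFT: {ttft:.2f}s  Total: {total:.2f}s  ~Tokens/sec: {chunks / max(total, 1e-6):.1f}")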

Code integration examples 

  • On the right, you’ll find minimal snippets for C#, Python, and Node.js you can paste into your app later (no need to leave Kudu); an example of the general shape is sketched below. 
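
As an illustration of that general shape, here's a hedged Python example: a single call to an OpenAI-compatible chat endpoint exposed by the sidecar on localhost. The URL and model name are placeholders; use the values from the snippet the playground generates for your app.

import requests

# Placeholders: use the URL and model name from the playground's generated snippet.
SLM_URL = "http://localhost:11434/v1/chat/completions"


def ask_slm(user_prompt: str, system_prompt: str = "You are a helpful assistant.") -> str:
    """Send one chat turn to the sidecar SLM and return the reply text."""
    resp = requests.post(
        SLM_URL,
        json={
            "model": "phi",  # placeholder
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt},
            ],
            "max_tokens": 256,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


print(ask_slm("Give me one tip for writing compact prompts."))

Whichever language you pick, the pattern is typically the same: build the chat payload, POST it to the sidecar's local endpoint, and read the first choice's message.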

Tip: Keep prompts compact for SLMs. If output slows, shorten the prompt or reduce the requested output length. 

Don’t have a sidecar yet? 

If AI Playground can’t find an SLM, you’ll see an inline notice with setup steps. 

Full tutorial: https://learn.microsoft.com/en-us/azure/app-service/tutorial-ai-slm-dotnet

Troubleshooting  

No responses / timeouts 

  • Confirm the sidecar is Running in Deployment Center → Containers. 
  • Check that the sidecar’s port and endpoint match what your app expects (a quick reachability probe is sketched below). 
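
To verify the port yourself, a quick probe like this Python sketch (run from the Kudu SSH console or anywhere on the instance, assuming the requests package is available; the URL and port are placeholders) tells you whether anything is listening:

import requests

SIDECAR_URL = "http://localhost:11434/v1/chat/completions"  # placeholder port/path

try:
    # Any HTTP response (even a 400 for the empty body) means the sidecar is
    # up and listening; a connection error means it isn't reachable on that port.
    resp = requests.post(SIDECAR_URL, json={}, timeout=5)
    print(f"Sidecar reachable, HTTP {resp.status_code}")
except requests.exceptions.RequestException as exc:
    print(f"Sidecar not reachable: {exc}")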

Slow TTFT or Tokens/sec 

  • Warm up with a couple of short prompts. 
  • Consider scaling up to a Premium plan. 
  • Keep prompts and requested outputs short. 

Roadmap 

This is v1. We’re already working on: 

  • Bring-your-own LLMs (play with different models beyond SLMs) 
  • Richer evaluation (prompt presets, saved sessions, exportable traces) 
  • Better observability (per-call logs, quick links to Log Stream) 

Conclusion 

AI Playground makes building AI features on App Service feel immediate: type, run, measure, and ship. We’ll keep smoothing the experience and unlocking more model choices so you can go from idea to integrated AI faster than ever. 

 
