Building Adaptive Multilingual Apps Using TypeScript and Azure AI Translator API

Julia_Muiruri
Nov 06, 2025

Azure AI Translator adds multilingual intelligence to applications via a REST API. The new 2025-05-01-preview version introduces Large Language Model (LLM) translation options alongside traditional Neural Machine Translation (NMT). For TypeScript developers, this unlocks richer style control (tone, gender), adaptive customization, and hybrid translation flows that balance speed, cost, and nuance.

Consider a scenario: a support agent (English-speaking) chats with a French-speaking customer at Zava, a fictitious retail company that specializes in home improvement and DIY supplies. The solution should:

  • Translate the support agent's responses to French with the appropriate tone (formal for official communication, informal/neutral for casual conversations on the company's community forum).
  • Consider gender specificity when referring to roles, for clarity and cultural nuance.
  • Optionally generate both female and male variants for internal QA or regulatory phrasing checks.
  • Provide a fallback to NMT when LLM deployment capacity is limited or the language pairing is unsupported (a fallback sketch follows the code samples below).

What’s New in the 2025-05-01-preview Release

  1. LLM Choice: You can select GPT-4o or GPT-4o-mini deployments instead of the default NMT. This requires an Azure AI Foundry resource.
  2. Adaptive Custom Translation: Supply up to five reference translation pairs or a dataset for few-shot style adaptation. This is beneficial for businesses that require translations to adhere to specific brand phrasing, regulatory wording, or domain-specific jargon without fine-tuning.
  3. Tone & Gender Parameters: Specify tone (formal | informal | neutral) and gender (female | male | neutral) to tailor linguistic output in supported languages.
  4. Hybrid Translation: Combine NMT and LLM targets in a single request for differential quality/cost strategies.

Use Case | Recommended Mode
High-volume, latency-sensitive messaging and UI localization | NMT (character-based billing, mature, fast)
Nuanced support replies requiring tone/gender/politeness adaptation | LLM (token-based, richer semantics)
Style matching to internal domain glossary or brand voice | LLM with adaptive dataset or reference pairs
For a mixed strategy | Hybrid: include both NMT and selected LLM targets
Exploration / prototyping of expressive outputs | GPT-4o-mini (lower cost than full GPT-4o)
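
The request shape used in the rest of this post can be modeled with a small set of TypeScript types. The field names below (text, language, targets, deploymentName, tone, gender) are taken from the samples later in this post; the field for supplying adaptive reference pairs or a dataset is not shown in those samples, so it is omitted here rather than guessed at.

// Each target is translated with NMT by default; adding a deploymentName routes it to an LLM deployment
interface TranslationTarget {
    language: string;          // target language code, e.g. 'fr'
    deploymentName?: string;   // e.g. 'gpt-4o-mini' (omit for NMT)
    tone?: 'formal' | 'informal' | 'neutral';
    gender?: 'female' | 'male' | 'neutral';
}

// The request body is an array of these items, each with its own source text and targets
interface TranslationInput {
    text: string;       // source text to translate
    language: string;   // source language code, e.g. 'en'
    targets: TranslationTarget[];
}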

How to use it (TypeScript)

Prerequisites

  • An Azure AI Foundry project (for LLM translation)
  • A GPT-4o or GPT-4o-mini deployment
  • An Azure AI Translator resource (for NMT translation)

Set REGION, API_KEY (Azure AI Translator) and FOUNDRY_API_KEY (Azure AI Foundry) as environment variables; the snippets below read them from process.env. A link to the full sample repo with NMT, LLM + tone & gender, and combined-approach code is provided at the end of this blog, but here's a minimal explanation of the core logic.
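
One hedged way to load and validate these variables before making any calls, assuming you use the dotenv package (any configuration loader works):

import 'dotenv/config';

// Fail fast if a required variable is missing
for (const name of ['REGION', 'API_KEY', 'FOUNDRY_API_KEY']) {
    if (!process.env[name]) {
        throw new Error(`Missing required environment variable: ${name}`);
    }
}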

Neural Machine Translation (NMT)

import axios from 'axios';

// Global Translator endpoint; swap in a regional or geographical endpoint to confine processing geography
const globalEndpoint = 'https://api.cognitive.microsofttranslator.com';

async function translateWithNMT(text: string, from: string, to: string) {
    const headers = {
        'Ocp-Apim-Subscription-Key': process.env.API_KEY!,
        'Ocp-Apim-Subscription-Region': process.env.REGION!,
        'Content-Type': 'application/json'
    };

    const params = new URLSearchParams({ 'api-version': '2025-05-01-preview' });

    // New request shape: an array of input items, each with its text, source language and target list.
    // A target without a deploymentName is translated with NMT.
    const body = [
        {
            text,                        // e.g. "J'ai un problème avec ma commande."
            language: from,              // e.g. 'fr'
            targets: [{ language: to }]  // e.g. 'en'
        }
    ];

    const response = await axios.post(`${globalEndpoint}/translate`, body, { headers, params });
    return response.data;
}
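
Calling it with the sample sentence from the scenario (French to English):

// e.g. inside an async function or a module with top-level await
const nmtResult = await translateWithNMT("J'ai un problème avec ma commande.", 'fr', 'en');
console.log(JSON.stringify(nmtResult, null, 2));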

Expected output:

English translation: I have a problem with my order.

LLM translation specifying tone and gender

async function translateWithLLM(text: string, from: string, to: string, model: string, tone: string, gender: string) {
    const headers = {
        'Ocp-Apim-Subscription-Key': process.env.FOUNDRY_API_KEY!,
        'Ocp-Apim-Subscription-Region': process.env.REGION!,
        'Content-Type': 'application/json'
    };

    const params = new URLSearchParams({ 'api-version': '2025-05-01-preview' });

    // A single request can mix LLM targets (deploymentName + tone/gender) with a plain NMT target.
    const body = [
        {
            text,            // e.g. "Your case has been forwarded to the support supervisor, April Gittens. ..."
            language: from,  // e.g. 'en'
            targets: [
                { language: to, deploymentName: model, tone, gender },                                          // requested variant
                { language: to, deploymentName: model, tone, gender: gender === 'female' ? 'male' : 'female' }, // second gender variant for QA checks
                { language: to }                                                                                // plain NMT target in the same request (hybrid)
            ]
        }
    ];

    const response = await axios.post(`${globalEndpoint}/translate`, body, { headers, params });
    return response.data;
}
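
Calling it with the support reply from the scenario, a formal tone and a female referent (the deployment name must match a GPT-4o or GPT-4o-mini deployment in your Foundry project):

const llmResult = await translateWithLLM(
    'Your case has been forwarded to the support supervisor, April Gittens. She will contact you today to review the situation.',
    'en', 'fr', 'gpt-4o-mini', 'formal', 'female'
);
console.log(JSON.stringify(llmResult, null, 2));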

Expected output:

French translations with female and male variants, plus the plain NMT output from the third target
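
To cover the scenario's fallback requirement, a thin wrapper can try the LLM first and fall back to NMT when the request fails (for example, when deployment capacity is exhausted or the language pairing is unsupported). This is a minimal sketch rather than production retry logic:

async function translateWithFallback(text: string, from: string, to: string, model: string, tone: string, gender: string) {
    try {
        // Prefer the LLM for tone- and gender-aware output
        return await translateWithLLM(text, from, to, model, tone, gender);
    } catch (error) {
        // Capacity limits, unsupported pairings or transient errors: fall back to NMT
        console.warn('LLM translation failed, falling back to NMT:', error);
        return await translateWithNMT(text, from, to);
    }
}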

Migration from API version: v3 → 2025-05-01-preview

I've linked the migration guide in the resources section at the end of this blog, but here is a quick checklist of considerations when upgrading existing v3 implementations to the new preview API.

  • Refactor the request body from a single object with a to parameter to an array of objects with text, language, and targets (see the sketch after this list).
  • Remove reliance on deprecated features: the BreakSentence, Detect, and Dictionary operations.
  • Add an environment variable for the Foundry resource key if using LLM translation.
  • Validate supported language codes via the Language Support table.
  • Implement logging for token vs character usage metrics and re-test for response schema differences.
  • Update error handling.
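
To make the first checklist item concrete, here is the same NMT request in both shapes, reusing axios and globalEndpoint from the snippets above. This is a sketch of the body and parameter changes only, not a drop-in migration:

const headers = {
    'Ocp-Apim-Subscription-Key': process.env.API_KEY!,
    'Ocp-Apim-Subscription-Region': process.env.REGION!,
    'Content-Type': 'application/json'
};

// v3: languages ride in the query string, and the body is an array of { Text }
const v3Params = new URLSearchParams({ 'api-version': '3.0', from: 'fr', to: 'en' });
const v3Body = [{ Text: "J'ai un problème avec ma commande." }];
await axios.post(`${globalEndpoint}/translate`, v3Body, { headers, params: v3Params });

// 2025-05-01-preview: languages move into the body, and each item lists its own targets
const previewParams = new URLSearchParams({ 'api-version': '2025-05-01-preview' });
const previewBody = [
    { text: "J'ai un problème avec ma commande.", language: 'fr', targets: [{ language: 'en' }] }
];
await axios.post(`${globalEndpoint}/translate`, previewBody, { headers, params: previewParams });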

Data Privacy, Regional Processing & Compliance

  • Regional & Geographical Endpoints. Choose regional or geographical endpoints (Americas, Europe, Asia Pacific, Switzerland) to confine processing geography; the global endpoint may route to the nearest data center, with potential cross-geography fallback during outages (see the sketch after this list).
  • LLM Processing Configuration. The deployment type (global, data zone, or regional) selected at model deployment time determines where data is processed, so choose it based on your regulatory boundaries.
  • Virtual Network Support. For stricter network isolation, enable VNET + private endpoints and use the custom domain endpoint (the global endpoint and token authentication cannot be used).
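
As a small sketch of the first bullet, the samples above can be pointed at a geographical endpoint instead of the global one. The hostnames below are the documented Translator geographical endpoints for the Americas, Europe and Asia Pacific; TRANSLATOR_GEOGRAPHY is a hypothetical environment variable used only in this example, and you should confirm in the Translator endpoint documentation that your chosen endpoint applies to the preview API:

// Geographical endpoints keep processing within a geography; default to the global endpoint otherwise
const geographicalEndpoints: Record<string, string> = {
    americas: 'https://api-nam.cognitive.microsofttranslator.com',
    europe: 'https://api-eur.cognitive.microsofttranslator.com',
    asiaPacific: 'https://api-apc.cognitive.microsofttranslator.com'
};

// TRANSLATOR_GEOGRAPHY is a hypothetical variable used only in this sketch
const endpoint = geographicalEndpoints[process.env.TRANSLATOR_GEOGRAPHY ?? ''] ?? globalEndpoint;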

Resources
