Announcing Azure OpenAI Global Batch Offering: Efficient processing at scale with 50% less cost
Published Aug 05 2024

We are excited to announce the public preview of the Azure OpenAI Global Batch offering, designed to handle large-scale, high-volume processing tasks efficiently. It lets you process asynchronous groups of requests under a separate quota, with a target 24-hour turnaround time, at 50% less cost than Global Standard.
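Under the hood, a batch is simply a file of newline-delimited JSON (JSONL) requests that you upload and submit as a single job. As a rough sketch of the request format (the deployment name and prompts below are placeholders, and on Azure the "model" field is your deployment name), two lines of an input file might look like this:

```jsonl
{"custom_id": "task-1", "method": "POST", "url": "/chat/completions", "body": {"model": "my-gpt-4o-mini-deployment", "messages": [{"role": "user", "content": "Summarize this clinical note: ..."}]}}
{"custom_id": "task-2", "method": "POST", "url": "/chat/completions", "body": {"model": "my-gpt-4o-mini-deployment", "messages": [{"role": "user", "content": "Write a short product description for a stainless steel water bottle."}]}}
```

Each line carries its own custom_id, so you can match results in the output file back to the original requests.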

 

Here is what one of our customers has to say:


 

 "Ontada is at the unique position of serving providers, patients and life science partners with data-driven insights. We leverage the Azure OpenAI batch API to process tens of millions of unstructured documents efficiently, enhancing our ability to extract valuable clinical information. What would have taken months to process now takes just a week. This significantly improves evidence-based medicine practice and accelerates life science product R&D. Partnering with Microsoft, we are advancing AI-driven oncology research, aiming for breakthroughs in personalized cancer care and drug development."

 Sagran Moodley, Chief Innovation and Technology Officer, Ontada

 

Why Azure OpenAI Global Batch?

  • Cost Efficiency: 50% reduction in cost compared to global standard pricing.
  • Separate Quota: Manage batch requests with an independent enqueued token quota, so your online workloads remain unaffected. Batch quota is substantially higher than standard quota. Learn more.
  • 24-Hour Turnaround: Our aim is to process batch requests within 24 hours, ensuring timely results for your use cases.

Supported Models

The following models currently support Global Batch:

  • gpt-4o (version 2024-05-13)
  • gpt-4o-mini (version 2024-07-18)
  • gpt-4 (versions turbo-2024-04-09, 0613)
  • gpt-35-turbo (versions 0125, 1106, 0613)
 

For the most up-to-date information on regions and models, please refer to our models page.

 

Key Use Cases

The Azure OpenAI Batch API opens up new possibilities across various industries and applications:

  1. Large-Scale Data Processing: Quickly analyze extensive datasets in parallel, enabling faster decision-making and insights.
  2. Content Generation: Automate the creation of large volumes of text, such as product descriptions, articles, and more.
  3. Document Review and Summarization: Streamline the review and summarization of lengthy documents, saving valuable time and resources.
  4. Customer Support Automation: Enhance customer support by handling numerous queries simultaneously, ensuring faster and more efficient responses.
  5. Data Extraction and Analysis: Extract and analyze information from vast amounts of unstructured data, unlocking valuable insights.
  6. Natural Language Processing (NLP) Tasks: Perform sentiment analysis, translation, and other NLP tasks on large datasets effortlessly.
  7. Marketing and Personalization: Generate personalized content and recommendations at scale, improving engagement and customer satisfaction.

Getting Started

Ready to try the Azure OpenAI Batch API? Take it for a spin here.
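If you want a feel for the end-to-end flow first, here is a minimal sketch using the openai Python SDK against an Azure OpenAI resource. It assumes you already have a JSONL input file like the one shown above and a Global Batch deployment; the environment variables and API version are placeholders to replace with your own values.

```python
import os
import time

from openai import AzureOpenAI

# Placeholder endpoint, key, and API version -- substitute your resource's values.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-07-01-preview",
)

# 1. Upload the JSONL input file for batch processing.
batch_file = client.files.create(
    file=open("batch_input.jsonl", "rb"),
    purpose="batch",
)

# 2. Create the batch job; Global Batch targets a 24-hour completion window.
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/chat/completions",
    completion_window="24h",
)

# 3. Poll until the job reaches a terminal state (production code would
#    back off and handle error/expired states explicitly).
while batch.status not in ("completed", "failed", "expired", "cancelled"):
    time.sleep(60)
    batch = client.batches.retrieve(batch.id)

# 4. Download the results: one JSON line per request, matched by custom_id.
if batch.status == "completed" and batch.output_file_id:
    output = client.files.content(batch.output_file_id)
    print(output.text)
```

This is a sketch of the happy path only; see the documentation linked below for input validation, error files, and quota details.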

Learn more

  • Using images in your batch input
  • Default batch token quota allocation and requesting increase
  • Supported regions

 
