Forum Discussion
Integrating Azure OpenAI Services
Hi ai89,
Regarding model selection: I would recommend gpt-4o (or gpt-4o-mini for cost efficiency). You can deploy it as a serverless/on-demand (pay-as-you-go) version in Azure, which is the most cost-efficient option.
For integration: I assume you already have some kind of authentication for your web app. If you use Entra ID, you can use that token and make the request directly from your web app, as in the sketch below.
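Here is a minimal sketch (Python, using the openai and azure-identity packages) of calling an Azure OpenAI deployment with an Entra ID credential instead of an API key. The endpoint, API version, and deployment name are placeholders you would swap for your own:

```python
# Minimal sketch, assuming an Entra ID credential is available to the caller.
# Endpoint, api_version, and deployment name below are placeholders.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Exchanges the Entra ID credential for Azure OpenAI access tokens on demand,
# so no API key has to be stored or shipped anywhere.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="gpt-4o",  # name of your deployment, not the base model
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```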
If not, I have often used an Azure Function as a backend (usually the most cost-efficient option; an App Service also works), which checks that the caller has the correct permissions and only then calls the OpenAI deployment.
Just be careful not to expose your API key in the web client.
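As a rough illustration of that setup, here is a hypothetical HTTP-triggered Azure Function (Python v2 programming model) that keeps the key server-side: the permission check, environment variable names, and deployment name are placeholders for your own configuration.

```python
# Hypothetical backend function: authorize the caller, then forward the prompt
# to the Azure OpenAI deployment. OPENAI_ENDPOINT / OPENAI_API_KEY and the
# Authorization check are placeholders for your own setup.
import json
import os

import azure.functions as func
from openai import AzureOpenAI

app = func.FunctionApp()

client = AzureOpenAI(
    azure_endpoint=os.environ["OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["OPENAI_API_KEY"],          # key stays server-side, never in the web client
    api_version="2024-06-01",
)

@app.route(route="chat", auth_level=func.AuthLevel.FUNCTION)
def chat(req: func.HttpRequest) -> func.HttpResponse:
    # Replace with your real permission check (e.g. validate the user's token/claims).
    if not req.headers.get("Authorization"):
        return func.HttpResponse("Forbidden", status_code=403)

    prompt = req.get_json().get("prompt", "")
    completion = client.chat.completions.create(
        model="gpt-4o",  # your deployment name
        messages=[{"role": "user", "content": prompt}],
    )
    return func.HttpResponse(
        json.dumps({"reply": completion.choices[0].message.content}),
        mimetype="application/json",
    )
```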
For customization: I would use a combination of context/system messages and the temperature/top_p settings. Lowering temperature (e.g., 0.5) makes outputs more consistent, while a slightly higher setting (e.g., 0.8) tends to produce more creative results. Keep in mind that the context/system message also counts toward your input tokens on every call.
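A small sketch of combining a system message with those settings (reusing the `client` from above; "gpt-4o" again stands in for your deployment name):

```python
# System message steers tone/behavior; temperature controls variability.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        # The system message counts toward input tokens on every call.
        {"role": "system", "content": "You are a concise support assistant for our product."},
        {"role": "user", "content": "How do I reset my password?"},
    ],
    temperature=0.5,  # lower -> more consistent answers
    top_p=1.0,        # usually adjust either temperature or top_p, not both
)
print(response.choices[0].message.content)
```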
If that is not enough, there is also the option to fine-tune models on a serverless (pay-as-you-go) basis.
Hope that helps!
Best regards,
Moritz
Thanks, Moritz, for your response. I really appreciate it.