Forum Discussion
Evaluation
Hi there,
I tried out the evaluation feature and tested groundedness, relevance, and similarity.
My dataset has 94 questions. Both relevance and similarity checked all 94 questions and their respective responses and gave me either a pass or a fail.
However, groundedness completed the run with errors: almost 10 of the inputs came back as null. I tried going through the logs, but I'm not sure where to check what went wrong for those questions.
I'd appreciate it if someone could point me in the right direction.
1 Reply
- MisterEM
Microsoft
Check Evaluation Results in the Portal
Use the Azure AI Foundry portal to inspect individual evaluation runs:
- Go to the Evaluation page.
- Locate your evaluation run in the run list.
- Click on the run to open the run detail page.
- Review each data sample and its associated metrics, including any null or failed entries.
View Evaluation Results in the Azure AI Foundry Portal - Azure AI Foundry | Microsoft Learn
Check Evaluation Configuration and Data Mapping
Groundedness evaluation requires specific fields:
- Query: Required
- Response: Required
- Context: Required
If any of these are missing or malformed in your dataset, the evaluator may return null.
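For illustration, a JSONL row with all three fields populated might look like the following (the content here is made up; use whatever field names your column mapping points to):

{"query": "What is the refund policy?", "response": "Refunds are issued within 30 days of purchase.", "context": "Our refund policy allows customers to request a refund within 30 days of purchase."}

A row where the context is missing, empty, or mapped to the wrong column is a likely cause of a null groundedness result.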
Evaluate Generative AI Models and Apps with Azure AI Foundry - Azure AI Foundry | Microsoft Learn
Inspect Dataset Format
Ensure your dataset:
- Is in JSONL or CSV format
- Contains valid entries for query, response, and context
- Avoids empty strings or missing fields
You can preview your dataset during evaluation setup in the portal to catch formatting issues early.
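To track down the roughly 10 rows that came back as null, a small script like the sketch below can flag rows with missing or empty fields before you re-run the evaluation. It assumes a JSONL file named eval_dataset.jsonl and the default query/response/context field names; adjust both to match your dataset.

import json

REQUIRED_FIELDS = ["query", "response", "context"]  # adjust to your column mapping

# "eval_dataset.jsonl" is a placeholder; point this at your actual dataset file.
with open("eval_dataset.jsonl", encoding="utf-8") as f:
    for line_number, line in enumerate(f, start=1):
        line = line.strip()
        if not line:
            print(f"Row {line_number}: blank line")
            continue
        try:
            row = json.loads(line)
        except json.JSONDecodeError as err:
            print(f"Row {line_number}: invalid JSON ({err})")
            continue
        for field in REQUIRED_FIELDS:
            value = row.get(field)
            if value is None or (isinstance(value, str) and not value.strip()):
                print(f"Row {line_number}: missing or empty '{field}'")

Any row flagged here is a good candidate for one of the null groundedness results.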
Use the SDK for Local Debugging
If portal logs aren’t sufficient, use the Azure AI Evaluation SDK to run evaluations locally:
Python: pip install azure-ai-evaluation
This lets you isolate problematic entries and inspect detailed error messages.
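As a minimal sketch (the endpoint, key, and deployment values are placeholders, and parameter names can vary slightly between SDK versions, so check the SDK reference), you can run the groundedness evaluator directly on one of the suspect rows:

from azure.ai.evaluation import GroundednessEvaluator

# Placeholder Azure OpenAI configuration for the judge model; fill in your own values.
model_config = {
    "azure_endpoint": "https://<your-resource>.openai.azure.com",
    "api_key": "<your-api-key>",
    "azure_deployment": "<your-gpt-deployment>",
}

groundedness = GroundednessEvaluator(model_config)

# Evaluate a single row that returned null in the portal to surface the
# underlying error message or score locally.
result = groundedness(
    query="What is the refund policy?",
    response="Refunds are issued within 30 days of purchase.",
    context="Our refund policy allows customers to request a refund within 30 days of purchase.",
)
print(result)

Looping this over just the failed rows is usually enough to tell whether the problem is the data (for example, an empty context) or the evaluator call itself.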
Local Evaluation with the Azure AI Evaluation SDK - Azure AI Foundry | Microsoft Learn
Monitor Application Health
If the issue is related to runtime behavior or model deployment, use the Monitoring tab in Azure AI Foundry:
- Navigate to Monitoring > Application Analytics
- Use filters to inspect logs and metrics
- If needed, open the view in Azure Monitor Application Insights for deeper analysis
Monitor your Generative AI Applications - Azure AI Foundry | Microsoft Learn