
TestingSpot Blog

AI‑Powered Performance Test Analysis using GHCP

AnilKumarGolla
Feb 02, 2026

Problem Statement
Performance testing teams often face significant challenges in comparing JMeter test results across environments or test runs.
Manual comparison and analysis of multiple result files is time-consuming, error-prone, and lacks actionable insights.

Solution
An AI-powered solution leveraging GitHub Copilot (GHCP) has been developed to address this challenge. It is designed to:

  • Seamlessly compare JMeter performance results across environments (e.g., On-Prem vs. Azure) or test runs.
  • Provide AI-driven insights to highlight endpoints with significant performance changes (illustrated in the sketch after this list).
  • Deliver clear, prioritized recommendations for faster issue resolution.
  • Reduce analysis time by up to 80%, minimizing resource utilization and enabling cost savings.
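
To make "significant performance changes" concrete, the sketch below shows how a per-endpoint deviation and status could be derived from two average response times. This is only an illustrative Python sketch, not the tool's actual logic (the analysis in this solution is performed by GHCP following the attached instructions file), and the ±10% threshold is an assumed value.

# Illustrative sketch: classify an endpoint's change between two runs.
# NOTE: the +/-10% threshold is an assumed value, not taken from the tool.
def classify_change(baseline_ms: float, candidate_ms: float, threshold_pct: float = 10.0) -> tuple[float, str]:
    """Return (deviation %, status) for one endpoint's average response time."""
    deviation_pct = (candidate_ms - baseline_ms) / baseline_ms * 100.0
    if deviation_pct > threshold_pct:
        status = "Degraded"   # slower than the baseline beyond the threshold
    elif deviation_pct < -threshold_pct:
        status = "Improved"   # faster than the baseline beyond the threshold
    else:
        status = "Stable"
    return round(deviation_pct, 2), status

# Example: On-Prem (baseline) 120 ms vs. Azure 150 ms -> (25.0, 'Degraded')
print(classify_change(120.0, 150.0))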

Business Outcomes

  • Automated Performance Comparison: Seamlessly compare JMeter performance test results across two environments (e.g., On-Prem vs. Azure) or between different test runs, reducing manual effort and accelerating analysis.
  • AI-Driven Insights for Decision Making: Leverage AI to identify endpoints with the most significant performance improvements or degradations.
  • Actionable Observations and Recommendations: Generate clear, prioritized recommendations based on key performance trends, ensuring faster resolution of bottlenecks and improved application reliability.
  • Enhanced Efficiency and Cost Savings: Minimize analysis time and resource utilization through automation, contributing to measurable effort savings and improved operational efficiency. Manual comparison of multiple JMeter result files is time-consuming, and automating it can reduce analysis time by up to 80%.

Prerequisites

  • Visual Studio Code with GitHub Copilot (GHCP) enabled.

Usage Guidelines

  1. Start GitHub Copilot Chat from within Visual Studio Code.
  2. Attach the following files in the GHCP chat:
  • Azure_PerfTestResults.json
  • OnPrem_PerfTestResults.json
  • PerfResultsAnalysis_Instructions.md

Note: The *.json files are the statistics.json files generated as part of the JMeter HTML report.
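
For reference, statistics.json is a flat JSON object keyed by transaction (endpoint) label, with aggregate metrics for each one. The short Python sketch below loads one of the attached files and prints each endpoint's average response time; the meanResTime and sampleCount field names reflect recent JMeter versions and should be checked against your own report.

import json

# Load one JMeter dashboard statistics file (e.g. Azure_PerfTestResults.json,
# which is a renamed statistics.json from the JMeter HTML report).
with open("Azure_PerfTestResults.json") as f:
    stats = json.load(f)

# Each key is a transaction label ("Total" is the overall summary row).
for label, metrics in stats.items():
    if label == "Total":
        continue
    # meanResTime is the average response time in milliseconds.
    print(f"{label}: {metrics['meanResTime']:.1f} ms over {metrics['sampleCount']} samples")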

  3. Execute the user prompt below:

     User prompt: “Follow the steps in #file:PerfResultsAnalysis_Instructions.md and compare the two JMeter result files uploaded.”

  4. File structure validation is performed to ensure both files conform to the expected test-results format.
  5. Upon successful validation, select the performance metric for comparison, e.g., AverageResponseTime.
  6. Test results are analyzed, and a response time comparison between the Baseline (On-Prem) and Azure is presented, including deviation and performance status. Results are also exported to a CSV file for easy reference.
  7. AI-driven performance insights are generated to provide actionable recommendations.
  8. Use the prompts below to perform a more detailed analysis of your test results.

     User prompt: “Get me the Average Response time of GetProducts API between Azure and On-Prem”

 

     User prompt: “Expected SLA on Azure is 150 ms; get me the APIs whose Average Response Time is > 150 ms on Azure”
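
The same questions can also be answered programmatically once the two statistics files are on disk. The Python sketch below is a rough equivalent of the prompts above: it looks up one API's average response time in both environments, exports a side-by-side comparison to a CSV file, and lists the APIs whose Azure average exceeds a 150 ms SLA. The file names match the attachments above; the GetProducts label and the 150 ms SLA are simply the examples used in the prompts, and the meanResTime field name assumes a recent JMeter statistics.json.

import csv
import json

def load_stats(path: str) -> dict:
    """Load a JMeter statistics.json file and drop the overall 'Total' row."""
    with open(path) as f:
        stats = json.load(f)
    return {label: m for label, m in stats.items() if label != "Total"}

onprem = load_stats("OnPrem_PerfTestResults.json")
azure = load_stats("Azure_PerfTestResults.json")

# Prompt 1 equivalent: average response time of one API in both environments.
label = "GetProducts"   # example label from the prompt; use your own transaction name
if label in onprem and label in azure:
    print(f"{label}: On-Prem {onprem[label]['meanResTime']:.1f} ms, "
          f"Azure {azure[label]['meanResTime']:.1f} ms")

# Side-by-side comparison exported to CSV (mirrors the tool's CSV export).
with open("ResponseTimeComparison.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["API", "OnPrem_Avg_ms", "Azure_Avg_ms"])
    for api in sorted(set(onprem) & set(azure)):
        writer.writerow([api, onprem[api]["meanResTime"], azure[api]["meanResTime"]])

# Prompt 2 equivalent: APIs whose Azure average response time exceeds the SLA.
SLA_MS = 150.0
slow = {api: m["meanResTime"] for api, m in azure.items() if m["meanResTime"] > SLA_MS}
for api, avg in sorted(slow.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{api}: {avg:.1f} ms (> {SLA_MS:.0f} ms SLA)")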

The GitHub repository for this project is available at https://github.com/AnilKumarGolla/PerfAnalysisUsingGHCP.

 

Updated Jan 29, 2026
Version 1.0
