[integ-tests-framework] Add launch time and performance report generation for integration tests #7052
Merged
hanwen-cluster merged 2 commits into aws:develop on Jan 13, 2026
Conversation
Force-pushed from 27a3bf2 to ba39af8
himani2411 reviewed on Jan 8, 2026
```python
if last_evaluated_key:
    scan_params["ExclusiveStartKey"] = last_evaluated_key

response = dynamodb_client.scan(**scan_params)
```
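For context, this snippet is the standard DynamoDB scan pagination pattern: `Scan` returns at most 1 MB per call and sets `LastEvaluatedKey` when more data remains. A minimal sketch of the full loop around it, assuming a boto3 client; the function and table names here are illustrative, not the PR's actual code:

```python
import boto3

dynamodb_client = boto3.client("dynamodb")

def scan_full_table(table_name):
    """Scan an entire DynamoDB table, following pagination via LastEvaluatedKey."""
    items = []
    scan_params = {"TableName": table_name}
    last_evaluated_key = None
    while True:
        if last_evaluated_key:
            scan_params["ExclusiveStartKey"] = last_evaluated_key
        response = dynamodb_client.scan(**scan_params)
        items.extend(response.get("Items", []))
        # LastEvaluatedKey is absent once the table is exhausted.
        last_evaluated_key = response.get("LastEvaluatedKey")
        if not last_evaluated_key:
            break
    return items
```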
Contributor
Scanning the DynamoDB table to get a year's worth of data every day seems like overkill, especially when we are not analyzing it every day.
I would suggest running this section that exports the data to XLSX maybe once a week or once a month?
Contributor
Author
Great suggestion! I amended the code to generate the launch time Excel report every 5 days. I will still generate the performance data Excel report every day, because the source DynamoDB table for launch times is much larger than the DynamoDB table for performance. The performance data report only takes about 30 seconds, so we can keep it daily.
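A minimal sketch of one way such a cadence gate could look; the constant name, helper name, and day-of-year modulus are assumptions for illustration, not necessarily the PR's implementation:

```python
from datetime import datetime, timezone

# Assumed constant name; the modulus-based schedule is illustrative only.
LAUNCH_TIME_REPORT_INTERVAL_DAYS = 5

def should_generate_launch_time_report(now=None):
    """Gate the expensive launch-time export to roughly one day in five."""
    now = now or datetime.now(timezone.utc)
    return now.timetuple().tm_yday % LAUNCH_TIME_REPORT_INTERVAL_DAYS == 0
```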
Force-pushed from 8a3c919 to eea7e26
Force-pushed from eea7e26 to e6445ad
Force-pushed from e6445ad to 046b08b
himani2411 approved these changes on Jan 13, 2026
Description of changes
Add two new report generators to analyze integration test metrics from DynamoDB:

- generate_launch_time_report: Queries the ParallelCluster-IntegTest-Metadata table to analyze cluster creation time and compute node launch times. Generates statistics grouped by OS and test name, with time-windowed aggregation to produce consistent results across OS rotation. This report runs every 5 days to avoid scanning the whole test database every day.
- generate_performance_report: Queries the ParallelCluster-PerformanceTest-Metadata table to track performance data (OSU, StarCCM) by node count over time. This report runs every day because its database is separate and smaller; execution takes around 30 seconds.

Both generators output Excel reports using pandas/openpyxl for easy analysis and visualization, as sketched below. Reports are automatically generated when JSON reports are requested via test_runner.py.
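As a rough illustration of the groupby-and-export path described above; the function, column, and sheet names are assumptions, not the PR's actual code:

```python
import pandas as pd

def write_launch_time_report(records, output_path):
    """Summarize launch times per test name, one worksheet per OS."""
    df = pd.DataFrame(records)  # assumed columns: os, test_name, launch_time_seconds
    # pandas uses openpyxl as its engine for .xlsx output.
    with pd.ExcelWriter(output_path, engine="openpyxl") as writer:
        for os_name, group in df.groupby("os"):
            stats = group.groupby("test_name")["launch_time_seconds"].agg(
                ["count", "mean", "min", "max"]
            )
            stats.to_excel(writer, sheet_name=os_name[:31])  # Excel caps sheet names at 31 chars
```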
Tests
We are able to generate Excel files at the end of integration tests.
Checklist
If you're creating a patch for a branch other than develop, add the branch name as prefix in the PR title (e.g. [release-3.6]). Please review the guidelines for contributing and the Pull Request Instructions.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.