Fix simulated HRVO test #3621
Labels
Bug — Issues that describe a bug and possibly a suggested solution
Difficulty - 13 — Requires a good understanding of relevant systems and tools
Gameplay
Testing — Field testing, test infrastructure, robot tuning
Description of the task
The simulated HRVO test is highly inconsistent across systems and across runs. Running it with Thunderscope breaks it; running it without Thunderscope also breaks it.
One notable issue: running the test with Thunderscope causes the AI to continue running after validation is complete, which fails the test. Find out why this happens. (Can we disable the AI for this test? What is so different about this test compared to the other pytests that breaks our systems?)
The test may need to be refactored into a form similar to a playtest.
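One direction worth exploring for the "AI keeps running after validation" symptom: make the test own the AI's lifetime explicitly, so teardown always stops it. The sketch below is hypothetical and does not use the repo's actual test-runner API; `running_ai` and `_ai_loop` are stand-in names, with a dummy loop in place of the real AI process. It only illustrates the shape of the fix, assuming the AI runs as a separate process the test can signal.

```python
import contextlib
import multiprocessing
import time


def _ai_loop(stop_event):
    # Stand-in for the real AI process: spins until asked to stop.
    while not stop_event.is_set():
        time.sleep(0.01)


@contextlib.contextmanager
def running_ai():
    # Launch the (stand-in) AI in a background process and guarantee it is
    # torn down when the scope exits, even if validation fails partway through.
    stop_event = multiprocessing.Event()
    proc = multiprocessing.Process(target=_ai_loop, args=(stop_event,))
    proc.start()
    try:
        yield proc
    finally:
        stop_event.set()
        proc.join(timeout=5)
        if proc.is_alive():
            proc.terminate()


if __name__ == "__main__":
    with running_ai() as ai:
        assert ai.is_alive()   # AI runs only inside the test scope
    assert not ai.is_alive()   # and is stopped once validation is done
```

The same shape could back a pytest fixture (`yield` inside the fixture, cleanup after it), so that a hang in Thunderscope or in validation can no longer leave the AI running past the end of the test.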
Good luck
Acceptance criteria
Blocked By