Allow Metrics to be created outside of init context #3702
Comments
Hi @xresch, sorry for the slow reply 🙇 and thank you for bringing this up.

This has been the case since the very first version of k6, and we have carried it over. It is currently used for threshold validation and some other checks in a few places, but arguably it is not relied on throughout the codebase. From a code perspective, though, changing it would require quite a lot of refactoring. cc @grafana/k6-core, @sniku and @dgzlopes on the reasons why it is this way - do we definitely want to keep it?

The recommended practice here is to use tags for the different parts instead of creating entirely separate metrics.

About a separate API for a metric factory and co: arguably, if you can only create metrics in the init context and only emit them outside of it, that seems less useful. But I am not against a separate issue proposing an additional API for metric definition.
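For context, the tag-based approach looks roughly like this: a single `Trend` declared in the init context, with tags distinguishing the measured parts (a minimal sketch; the metric name, tag names, and URLs are illustrative):

```js
import http from 'k6/http';
import { Trend } from 'k6/metrics';

// One Trend declared in the init context; individual script parts are
// told apart via tags instead of separate metrics.
const stepDuration = new Trend('step_duration', true); // `true` marks it as a time metric

export default function () {
  let start = Date.now();
  http.get('https://test.k6.io/');
  stepDuration.add(Date.now() - start, { step: 'homepage' });

  start = Date.now();
  http.get('https://test.k6.io/contacts.php');
  stepDuration.add(Date.now() - start, { step: 'contacts' });
}
```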
Hi @mstoykov, thanks a lot for having a look into this issue.
Hi @xresch, isn't the group API an option for you? It uses the same tagging principle suggested by @mstoykov: it adds a `group` tag to the metrics emitted inside the group and reports a `group_duration` metric per group. If not, can you explain why, please? Do you expect to measure async code?
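For reference, a minimal sketch of the group API mentioned above (group names and URLs are illustrative):

```js
import { group } from 'k6';
import http from 'k6/http';

export default function () {
  // Metrics emitted inside a group get a `group` tag, and k6 reports a
  // built-in `group_duration` metric for each group.
  group('homepage', function () {
    http.get('https://test.k6.io/');
  });

  group('contacts', function () {
    http.get('https://test.k6.io/contacts.php');
  });
}
```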
Hi @codebien, the problem with groups is that if I have multiple requests in one group, their response times are accumulated into a single group duration. I want to see the response time of each request separately, so I can tell which request contributes what. I hope that clears it up!
Looking for this feature also. |
Feature Description
Situation:
I want to create trend metrics dynamically during script execution, but k6 rejects this with an error because metrics can only be declared in the init context.
Data:
In load and performance testing and monitoring, it is a common best practice to create custom duration metrics to measure specific parts of a script. What I do in most other tools would look roughly like this in K6 (the code does not work because of this issue):
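A minimal sketch of the intended pattern, assuming the `Trend` type from `k6/metrics` (the `new Trend(...)` call inside the default function is exactly what k6 rejects today; names and URLs are illustrative):

```js
import http from 'k6/http';
import { Trend } from 'k6/metrics';

export default function () {
  // Create the metric on the fly, right where it is needed --
  // this is what currently fails outside of the init context.
  const loginDuration = new Trend('login_duration', true);

  const start = Date.now();
  http.get('https://test.k6.io/');
  loginDuration.add(Date.now() - start);
}
```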
The problem with doing it in the init context is that it makes the code harder to write and maintain.
If I want to create 15 custom duration metrics, I have to initialize every single one of them and pass each of those constants to two methods (start/stop) to get proper duration metrics.
This is prone to copy-paste mistakes and unnecessarily bloats the code.
Here is an example:
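A sketch of the boilerplate this leads to, with illustrative metric names, URLs, and hand-written start/stop helpers:

```js
import http from 'k6/http';
import { Trend } from 'k6/metrics';

// Every custom duration metric has to be declared up front in the init context...
const homepageDuration = new Trend('homepage_duration', true);
const loginDuration    = new Trend('login_duration', true);
const searchDuration   = new Trend('search_duration', true);
// ...and so on for all 15 measured blocks.

// ...and every measured block has to pass the right constant to start/stop helpers.
const timers = {};

function start(name) {
  timers[name] = Date.now();
}

function stop(name, trend) {
  trend.add(Date.now() - timers[name]);
}

export default function () {
  start('homepage');
  http.get('https://test.k6.io/');
  stop('homepage', homepageDuration);

  start('login');
  http.get('https://test.k6.io/my_messages.php');
  stop('login', loginDuration);
}
```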
Suggested Solution (optional)
A solution would be to allow creating metrics anywhere, not only in the init context. Basically every tool I have used for testing so far (HP LoadRunner, Silk Performer, Playwright, Selenium, etc.) supports this, and creating custom measurement blocks is often considered a best practice.
I am not sure how this would fit into K6, but here are solutions I have seen in other tools:
a) Having a start(measureName) and stop(measureName) function pair that allows for easy measurement of elapsed time.
b) Having a factory method like "Metrics.getOrCreateMeasure(trendName)" that returns an instance to report metrics to (see the sketch after this list).
c) Having a reporting mechanism like "Measurements.reportData(measureName, value)" that allows you to add a value to a dataset which is later added to the output.
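A rough sketch of what option b) could look like from the script author's side. Note that `Metrics.getOrCreateMeasure` is a hypothetical API used only to illustrate the proposal; it does not exist in k6:

```js
import http from 'k6/http';
// Hypothetical import -- such a factory does not exist in k6 today.
import { Metrics } from 'k6/metrics';

export default function () {
  // getOrCreateMeasure(name) would lazily register the metric on first use
  // and return the same instance on later calls, from any context.
  const checkoutDuration = Metrics.getOrCreateMeasure('checkout_duration');

  const start = Date.now();
  http.get('https://test.k6.io/');
  checkoutDuration.add(Date.now() - start);
}
```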
I was actually quite surprised that I couldn't find anything like this in K6.
Maybe it is already there and I just haven't looked hard enough. 🤔
Already existing or connected issues / PRs (optional)
No response