Is your feature request related to a problem? Please describe.
Currently, the JSON frontend requires that we specify the path of a CSV file containing the test data. It would be good if we could instead supply a data collection script which would interface directly with the system under test to collect data. A basic solution would be to simply pass in a DataFrame object directly rather than a string filename. Then we could have a preprocessing script that executes before the tests to build the DataFrame. If anyone has any better ideas, I'm open to suggestions.
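As a rough sketch of what that could look like (purely illustrative; the JsonFrontend name and its data argument are placeholders, not the current interface), a small data-collection step builds a DataFrame and hands it straight to the frontend instead of a CSV path:

```python
import pandas as pd

def collect_data():
    """Placeholder data-collection script: in practice this would
    interface with the system under test and record its behaviour."""
    return pd.DataFrame({"input": [1, 2, 3], "output": [2, 4, 6]})

df = collect_data()
# Instead of something like JsonFrontend(data_path="observational_data.csv"),
# the frame would be passed directly:
# frontend = JsonFrontend(data=df)
```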
Makes sense, would be nice to be able to directly send it data objects rather than just paths.
I would love to keep the ability to give it a path to a directory of .csvs, as .csv is such a common file format and may be the most user-friendly way of interacting with the tool.
One potential way, which you could argue is 'pythonic', is to do some sort of EAFP (Easier to Ask for Forgiveness than Permission) style handling of the argument when giving data to the JSON class.
I.e. we attempt to handle the argument as a DataFrame, and if that throws a type error, catch it and try to treat it as a path. If that also fails, raise an exception.
This would allow one argument to handle both DataFrames and paths (rough sketch below). Not sure exactly where this sits in terms of good practice, though, despite sounding Pythonic.
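Something like this, as a sketch only (the function name and the exact exceptions caught are just illustrative, not the framework's actual code):

```python
import pandas as pd

def load_data(data):
    """EAFP-style loader: try the argument as DataFrame-like first,
    then fall back to treating it as a path to a CSV file."""
    try:
        # A DataFrame (or anything the constructor accepts) passes through here.
        return pd.DataFrame(data)
    except (TypeError, ValueError):
        # A plain string or Path lands here: the constructor rejects scalars.
        pass
    try:
        return pd.read_csv(data)
    except (OSError, ValueError, TypeError) as exc:
        raise TypeError(
            f"Expected a pandas DataFrame or a CSV path, got {type(data).__name__}"
        ) from exc
```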
I found a temporary workaround for now, which is to remove the requirement for a path to be specified. You can then just run your system and do json_class.data = my_data_frame. Not sure how Pythonic or best-practice this is, but it seems to work. We could easily add a validation step to make sure the data has been set before any tests are executed.
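For illustration, a minimal sketch of that workaround plus the suggested validation; the JsonFrontend class, .data attribute, and run_tests method are hypothetical stand-ins, not the actual framework API:

```python
import pandas as pd

class JsonFrontend:
    """Illustrative stand-in for the real JSON frontend class."""

    def __init__(self, data_path=None):
        # The path is now optional; data can be attached after construction.
        self.data = pd.read_csv(data_path) if data_path is not None else None

    def run_tests(self):
        # Suggested validation: refuse to run until data has been supplied.
        if self.data is None:
            raise ValueError("No data set: pass a CSV path or assign .data first.")
        ...  # run the test cases against self.data

# The workaround: run your system, build a frame, then attach it directly.
my_data_frame = pd.DataFrame({"x": [1, 2], "y": [3, 4]})
json_class = JsonFrontend()
json_class.data = my_data_frame
json_class.run_tests()
```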