
Implement shared hyperparameters #69

Open

csala opened this issue Oct 9, 2018 · 0 comments
csala commented Oct 9, 2018

Currently, the hyperparameters of each primitive are specified individually.
However, there are some scenarios where hyperparameters from two different blocks should match each other.

An example of this is the pair of keras.preprocessing.sequence.pad_sequences and keras.Sequential.LSTMTextClassifier primitives: the pad_sequences primitive has a maxlen hyperparameter whose value must be exactly the same as the input_length hyperparameter given to the LSTMTextClassifier.
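A minimal sketch of the coupling in plain Keras (the sequences, vocabulary size, and the value 100 are illustrative, not taken from the issue):

```python
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

MAXLEN = 100       # the single value that both steps must share
VOCAB_SIZE = 5000  # illustrative vocabulary size

# Step 1: the padding primitive cuts/pads every sequence to MAXLEN
sequences = [[1, 2, 3], [4, 5, 6, 7, 8]]  # toy tokenized texts
padded = pad_sequences(sequences, maxlen=MAXLEN)

# Step 2: the classifier must be built with the very same length,
# otherwise the input shapes no longer line up
model = Sequential([
    Embedding(input_dim=VOCAB_SIZE, output_dim=64, input_length=MAXLEN),
    LSTM(32),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

predictions = model.predict(padded)
```

Today both blocks have to be tuned with the same value by hand; a shared hyperparameter would let the pipeline declare that maxlen and input_length are one and the same setting.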

csala self-assigned this Oct 9, 2018
gsheni pushed a commit that referenced this issue Aug 29, 2022
…mlprimitives_jsons

Issue 3 use entry points for mlprimitives jsons