
Advice/protection against oddities in training point sets #268

Open
odunbar opened this issue Dec 20, 2023 · 1 comment
Comments

odunbar commented Dec 20, 2023

Arising in PR #265, for example:

We find that emulator training is sometimes problematic for a fixed data set, while a small modification of that set leads to massive improvements. More robust handling of the training dataset, e.g. by providing a cross-validation procedure or better construction of the train/validation splits over the provided points, may lead to more robust training.
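
As a sketch of the kind of cross-validation check meant above (Python, with scikit-learn's GP regressor as a stand-in for the emulator; the data, kernel, and fold count are purely hypothetical):

```python
# Illustrative sketch only: scikit-learn GP as a stand-in emulator,
# hypothetical data; a large spread in per-fold scores would flag a
# fragile training set or split.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(40, 1))                 # hypothetical inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)    # hypothetical noisy outputs

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X[train_idx], y[train_idx])
    scores.append(gp.score(X[val_idx], y[val_idx]))       # R^2 on the held-out fold

print("per-fold R^2:", np.round(scores, 3), "spread:", np.ptp(scores))
```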

odunbar commented Feb 12, 2024

Adding this here:
Another nice addition would be correlation/covariance transformations in the data processing (i.e. the choice of doing PCA on the correlation matrix vs. the covariance matrix): https://stats.stackexchange.com/questions/53/pca-on-correlation-or-covariance
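
For illustration of that correlation-vs-covariance choice (Python sketch with hypothetical data; not the package's actual data-processing API):

```python
# Illustrative sketch only: PCA on the covariance matrix vs. the correlation
# matrix (the latter is PCA after standardizing each feature to unit variance).
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(200, 3)) * np.array([1.0, 10.0, 100.0])  # hypothetical, very different scales

def explained_variance(X, use_correlation):
    Xc = X - X.mean(axis=0)
    if use_correlation:
        Xc = Xc / Xc.std(axis=0)        # standardizing yields the correlation matrix
    eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
    return eigvals / eigvals.sum()

print("covariance PCA :", np.round(explained_variance(data, False), 3))
print("correlation PCA:", np.round(explained_variance(data, True), 3))
# On covariance, the largest-scale feature dominates; on correlation,
# the explained variance is spread more evenly across components.
```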
