@noahho was so nice as to walk me through some of the prior code, but I have one more question.
In the model builder, part of the prior config is parsed here: `TabPFN/tabpfn/scripts/model_builder.py`, line 172 at commit `d76f4ac`. I'm trying to wrap my head around that line, which is used for both the MLP and the GP.
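Roughly, the line in question appears to be a dict comprehension of this shape (a sketch reconstructed from the behaviour described below, not the verbatim source line):

```python
# Sketch only (assumed shape, not the exact line from model_builder.py):
# every dict-valued config entry is collapsed to the first value stored in
# that dict; the key it was stored under is dropped.
config = {k: (list(v.values())[0] if isinstance(v, dict) else v)
          for k, v in config.items()}
```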
This takes the config entries that are dicts and replaces them by their first value. That makes some sense for dicts with one entry, like `num_features_used`, but for other dicts, like `differentiable_hyperparameters`, I don't understand the intention. It will pick whatever key was first inserted into `differentiable_hyperparameters` and assign its value, so the result is something like `{'distribution': 'uniform', 'min': 2.0, 'max': 10.0}` (this comes from `prior_bag_exp_weights_1`, which should not be related to either the GP or the MLP prior, and the information about what the key was is lost entirely).
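To make that concrete, here is a toy example (only the inner values for `prior_bag_exp_weights_1` are taken from above; the other keys and values are made up for illustration):

```python
# Hypothetical config excerpt; only the 'prior_bag_exp_weights_1' values
# come from the discussion above, the rest is illustrative.
config = {
    'num_features_used': {'sampler': 10},  # single-entry dict: collapsing it is harmless
    'differentiable_hyperparameters': {
        'prior_bag_exp_weights_1': {'distribution': 'uniform', 'min': 2.0, 'max': 10.0},
        'other_hyperparameter': {'distribution': 'meta_beta', 'min': 0.1, 'max': 1.0},
    },
}

flattened = {k: list(v.values())[0] if isinstance(v, dict) else v
             for k, v in config.items()}
print(flattened['differentiable_hyperparameters'])
# {'distribution': 'uniform', 'min': 2.0, 'max': 10.0}
# -> the key 'prior_bag_exp_weights_1' and every other entry are silently lost
```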
Is that line meant to only get out `num_features_used`, and we don't care what happens to any other dictionaries? In that case I might use something along the lines of the sketch below instead of the dictionary comprehension above.
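Purely as an illustration of that idea (a hypothetical replacement, not the snippet embedded in the original issue): unwrap `num_features_used` explicitly and leave every other dict untouched.

```python
# Hypothetical alternative: only unwrap the one entry we actually need,
# instead of flattening every dict-valued config entry.
if isinstance(config.get('num_features_used'), dict):
    config['num_features_used'] = list(config['num_features_used'].values())[0]
```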
Thanks, and merry Christmas & happy new year!