log.to_pandas() gives only NaNs #80

Open
RiccPicc opened this issue Sep 28, 2022 · 0 comments
Hi everyone!
Can you help me? I am new to using DeepSurv. I am following the steps given in the examples, but I run into this issue on the dataset I want to use this package on:
[screenshot: the training log from log.to_pandas() shows only NaN values]

Before running the net, I made sure that all the variables I am using are floats. I converted the categorical columns to numeric codes using .cat.codes and then cast them to float. I did so because otherwise model_.lr_finder(x_train, y_train, batch_size, tolerance=10) and model_.fit(x_train, y_train, batch_size, epochs, callbacks, verbose, val_data=val, val_batch_size=batch_size) would fail with a data-type error. A sketch of this preprocessing is just below.
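
This is roughly what the preprocessing looks like (a minimal sketch; the file name, the categorical column names, and the duration/event column names are placeholders for my actual data):

import pandas as pd

df = pd.read_csv("my_data.csv")                  # placeholder path for my dataset

cat_cols = ["sex", "treatment"]                  # placeholder categorical columns
for col in cat_cols:
    df[col] = df[col].astype("category").cat.codes

# everything passed to pycox/torchtuples is cast to float32
feature_cols = [c for c in df.columns if c not in ("duration", "event")]
x_train = df[feature_cols].values.astype("float32")

# CoxPH expects the target as a tuple of (durations, events)
y_train = (df["duration"].values.astype("float32"),
           df["event"].values.astype("float32"))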

These are the settings I am using for tt.practical.MLPVanilla(in_features, num_nodes, out_features, batch_norm, dropout, output_bias=output_bias):

n_nodes = 256
in_features = x_train.shape[1] # number of variables
num_nodes = [n_nodes, n_nodes, n_nodes, n_nodes]
out_features = 1
batch_norm = True 
dropout = 0.4 
output_bias = False

This is the model: model_ = CoxPH(net_ds, tt.optim.Adam)
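
Put together, the network and model construction look roughly like this (a minimal sketch of what I described above; net_ds is just the name I give the network):

import torchtuples as tt
from pycox.models import CoxPH

n_nodes = 256
in_features = x_train.shape[1]                   # number of variables
num_nodes = [n_nodes, n_nodes, n_nodes, n_nodes]
out_features = 1
batch_norm = True
dropout = 0.4
output_bias = False

net_ds = tt.practical.MLPVanilla(in_features, num_nodes, out_features,
                                 batch_norm, dropout, output_bias=output_bias)
model_ = CoxPH(net_ds, tt.optim.Adam)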

These are the fit settings for log = model_.fit(x_train, y_train, batch_size, epochs, callbacks, verbose, val_data=val, val_batch_size=batch_size):

batch_size = 128
best_lr = lrfinder.get_best_lr()
model_.optimizer.set_lr(best_lr)
epochs = 512
callbacks = [tt.callbacks.EarlyStopping()] # Stop training when a monitored metric has stopped improving.
verbose = True
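
And this is roughly the full training step where I see the problem (a minimal sketch; x_val and y_val are my validation data, prepared the same way as the training data):

batch_size = 128
epochs = 512
callbacks = [tt.callbacks.EarlyStopping()]       # stop training when the validation loss stops improving
verbose = True

# pick a learning rate with the lr finder, then set it on the optimizer
lrfinder = model_.lr_finder(x_train, y_train, batch_size, tolerance=10)
best_lr = lrfinder.get_best_lr()
model_.optimizer.set_lr(best_lr)

val = (x_val, y_val)                             # validation features and targets
log = model_.fit(x_train, y_train, batch_size, epochs, callbacks, verbose,
                 val_data=val, val_batch_size=batch_size)

print(log.to_pandas())                           # this is where I only get NaN values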

Thank you in advance!
Cheers
