Hello all,
This is not an issue, but I didn't see an option to start a discussion, so I'm posting it here.
I have seen examples in the DeepSurv and PyCox packages where the number of neurons is the same in every hidden layer. Is this how the network must be defined for DeepSurv, or can we vary the number of neurons in each layer?
Thanks!
I think I found what I was looking for: DeepSurv is a package that implements a deep-learning generalization of the Cox proportional hazards (CPH) model using the TensorFlow framework [21]. DeepSurv uses a multilayer perceptron to learn the effects of the covariates directly from the data. A priori selection of covariates and their interactions must be considered when designing a CPH model; DeepSurv has the advantage of not requiring this. The network described consists of an input layer (12 nodes, one per independent variable), 3 hidden layers with 6, 3, and 1 nodes and tanh activation, and an output layer. Training used the Adam optimizer with a learning rate of 0.4 and a learning-rate decay of 1.0, together with dropout, batch normalization, and L1 and L2 regularization. The authors additionally tested whether eliminating the covariate with the least important feature improved the C-index. All covariates were standardized before entering the DeepSurv model, and grid search was used for hyperparameter optimization.
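So to answer the original question: the hidden layers do not need the same width. As a minimal sketch (not the actual DeepSurv implementation, just an illustration of the 12 → 6 → 3 → 1 architecture quoted above, with made-up random weights), a forward pass with shrinking tanh layers looks like this:

```python
import numpy as np

# Illustrative forward pass for the DeepSurv-style MLP described above:
# 12 input covariates -> hidden layers of 6, 3, and 1 tanh nodes -> linear
# output (the log-risk score). Layer widths are deliberately unequal to show
# nothing forces every hidden layer to have the same number of neurons.
rng = np.random.default_rng(0)
layer_sizes = [12, 6, 3, 1, 1]  # input, three hidden layers, output

# Random weights/biases stand in for trained parameters (hypothetical values).
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def risk_score(x):
    """tanh hidden layers, linear final layer -> scalar log-risk h(x)."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.tanh(h @ W + b)
    return h @ weights[-1] + biases[-1]

x = rng.standard_normal(12)  # one standardized covariate vector
print(risk_score(x).shape)   # a single risk value per subject
```

In PyCox you get the same freedom by passing a list of per-layer widths (e.g. `num_nodes=[6, 3, 1]`) when building the network, so equal-width examples in the tutorials are just a convention, not a requirement.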