Implement a loss function for classification models #117
Comments
Here is our implementation of the RMSE: SossMLJ.jl/src/loss-functions.jl, lines 1 to 19 at commit f0b38b0.
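The embedded snippet does not render in this transcript. As a rough sketch of what an RMSE implementation along these lines might look like (the actual SossMLJ.jl code may differ; `rms` is an illustrative name, not necessarily the one used in the package):

```julia
# Illustrative sketch only; see SossMLJ.jl/src/loss-functions.jl for the real code.
# Root mean squared error between predictions ŷ and observations y.
function rms(ŷ::AbstractVector{<:Real}, y::AbstractVector{<:Real})
    length(ŷ) == length(y) || throw(DimensionMismatch("ŷ and y must have the same length"))
    return sqrt(sum(abs2, ŷ .- y) / length(y))
end
```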
@cscherrer Any thoughts on a good loss function for the multinomial classification problem? Some options include:
Any other options?
Either of those would be good, or an asymmetric loss could be interesting. I'd think this must come up a lot in medical applications, right?
Yeah, in binary classification problems (e.g. mortality prediction), we often want a loss function that penalizes underprediction more than overprediction. I think for the multinomial example we can just use something simple and symmetric. Later we can add a binary classification problem with class imbalance, and then think about some kind of asymmetric loss function for that problem.
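As an illustration of the kind of asymmetric loss being discussed, here is a hypothetical sketch of a weighted binary cross-entropy in which underpredicting a positive case costs more than overpredicting one; the function name and weights are invented for this example and are not part of SossMLJ.jl or the thread:

```julia
# Hypothetical sketch of an asymmetric binary loss; not SossMLJ.jl code.
# p̂ is the predicted probability of the positive class, y ∈ {0, 1} the label.
# w_fn > w_fp makes missing a positive case (underprediction) costlier
# than a false alarm (overprediction).
function asymmetric_cross_entropy(p̂::Real, y::Integer; w_fn = 5.0, w_fp = 1.0)
    ϵ = eps()  # guard against log(0)
    return -(w_fn * y * log(p̂ + ϵ) + w_fp * (1 - y) * log(1 - p̂ + ϵ))
end
```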
Let's go with the Brier score. For consistency with MLJ, we should implement it the same way they do (https://github.com/alan-turing-institute/MLJBase.jl/blob/5e5d1cda3b555510df1de4b125a5e320c11f6256/src/measures/finite.jl#L103-L131).
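The linked MLJBase snippet is not reproduced here. As a minimal sketch of the underlying quantity (using a plain vector-of-probabilities API rather than MLJ's UnivariateFinite distributions; `brier_score` and `mean_brier_score` are names invented for this illustration):

```julia
# Hypothetical sketch, not the MLJBase implementation.
# Per-observation multi-class Brier score:
#   sum over classes c of (p̂_c - 𝟙[y == c])^2
# where p̂ is the vector of predicted class probabilities and y the true class index.
function brier_score(p̂::AbstractVector{<:Real}, y::Integer)
    return sum((p̂[c] - (c == y))^2 for c in eachindex(p̂))
end

# Mean Brier score over a dataset, given a vector of probability
# vectors and a vector of integer class labels.
function mean_brier_score(P̂::AbstractVector{<:AbstractVector{<:Real}},
                          y::AbstractVector{<:Integer})
    return sum(brier_score(p̂, yᵢ) for (p̂, yᵢ) in zip(P̂, y)) / length(y)
end
```

Note that MLJ treats its Brier measure as a score (bigger is better) rather than a loss, so its sign convention differs from the per-observation quantity above; the linked source is authoritative for the exact definition.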
Sounds good.
We currently have an example of a loss function for regression models: specifically, we implement the root mean squared error.
However, we don't currently have an example of a loss function for classification models.
We need to implement a loss function for classification models.