
Use Brier score for classification #41

Open
ja-thomas opened this issue Jan 25, 2018 · 4 comments

Comments

@ja-thomas
Owner

Might be a better default than the misclassification rate, especially for tuning, since it makes the optimization landscape a bit smoother.
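To illustrate why the Brier score is smoother for tuning, here is a minimal sketch (the helper names and example numbers are hypothetical, not from autoxgboost): two models can have identical hard 0/1 predictions, so the misclassification rate cannot distinguish them, while the Brier score rewards the better-calibrated one.

```python
import numpy as np

def brier_score(y_true, y_prob):
    """Mean squared error between predicted probabilities and 0/1 labels."""
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.asarray(y_prob, dtype=float)
    return float(np.mean((y_prob - y_true) ** 2))

def misclassification_rate(y_true, y_prob, threshold=0.5):
    """Fraction of labels wrong after thresholding the probabilities."""
    preds = (np.asarray(y_prob) >= threshold).astype(int)
    return float(np.mean(preds != np.asarray(y_true)))

y_true = [1, 0, 1, 1, 0]
# Two models with the same hard predictions but different confidences:
probs_confident = [0.90, 0.10, 0.80, 0.70, 0.20]
probs_shaky     = [0.55, 0.45, 0.55, 0.55, 0.45]

# Both models have misclassification rate 0.0, so MMCE gives the tuner
# no gradient between them; the Brier score is lower (better) for the
# confident, well-calibrated model.
print(misclassification_rate(y_true, probs_confident))  # 0.0
print(misclassification_rate(y_true, probs_shaky))      # 0.0
print(brier_score(y_true, probs_confident))             # 0.038
print(brier_score(y_true, probs_shaky))                 # 0.2025
```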

@PhilippPro

In my benchmark it was not always better regarding the MMCE than, for example, tuning AUC. It had the lowest rank on average, but at least for random forests it sometimes gave worse results than tuning MMCE directly. It's an important parameter, in my opinion.

@ja-thomas
Owner Author

Thanks for the info. Looks interesting.
So, in essence, it still makes sense for the user to choose the measure that should be optimized.
The question is still what's the best default measure :)
Looking at the average ranks across the different measures, the Brier score seems to be the best overall, correct?

@PhilippPro

Yes, for the ranks this is true, but the difference to, for example, logarithmic loss is not big. It would probably be interesting to benchmark autoxgboost with different target measures. ;)
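For context on why the two measures rank so similarly: both the Brier score and the log loss are strictly proper scoring rules over predicted probabilities, differing mainly in how harshly they punish confident mistakes. A minimal sketch (helper names are hypothetical, not from any library discussed here):

```python
import numpy as np

def brier_score(y_true, y_prob):
    """Mean squared error between predicted probabilities and 0/1 labels."""
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.asarray(y_prob, dtype=float)
    return float(np.mean((y_prob - y_true) ** 2))

def log_loss(y_true, y_prob, eps=1e-15):
    """Negative mean log-likelihood of the true labels; probabilities are
    clipped away from 0 and 1 to avoid infinite penalties."""
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    return float(-np.mean(y_true * np.log(y_prob)
                          + (1 - y_true) * np.log(1 - y_prob)))

y_true = [1, 0, 1, 0]
y_prob = [0.8, 0.3, 0.6, 0.2]

# Both scores move in the same direction as calibration improves, which
# is consistent with their average ranks being close in a benchmark.
# Log loss differs mainly on confidently wrong predictions: a single
# prediction of p=0.01 for a true positive contributes ~0.98 to the
# Brier score but ~4.6 to the log loss.
print(brier_score(y_true, y_prob))
print(log_loss(y_true, y_prob))
```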

@ja-thomas
Owner Author

Sounds like a good idea.

@Coorsaa can you add that to the benchmark?
