Adding RMSE With Uncertainty loss to CatBoost #1044
Conversation
Codecov Report

@@            Coverage Diff             @@
##           master    #1044      +/-   ##
==========================================
+ Coverage   91.35%   91.39%    +0.04%
==========================================
  Files          78       78
  Lines        7931     7941       +10
==========================================
+ Hits         7245     7258       +13
+ Misses        686      683        -3

Continue to review full report at Codecov.
model_output = self.model.predict(x, **kwargs)
output_dim = len(model_output.shape)
chunk_len = self.output_chunk_length
I'd recommend you get rid of this variable and simply use self.output_chunk_length.
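For illustration, a minimal, self-contained sketch of the pattern with the suggestion applied (the _ToyWrapper and _DummyBooster classes are made up for this example and are not the Darts classes): the local chunk_len alias is dropped and self.output_chunk_length is used directly.

```python
import numpy as np


class _DummyBooster:
    """Stand-in for a fitted CatBoost model (illustrative only)."""

    def predict(self, x, **kwargs):
        # Pretend each sample yields a [mean, variance] pair.
        return np.column_stack([x.mean(axis=1), np.full(len(x), 0.1)])


class _ToyWrapper:
    """Toy wrapper mimicking the reviewed code path, not the real class."""

    def __init__(self, model, output_chunk_length):
        self.model = model
        self.output_chunk_length = output_chunk_length

    def _predict(self, x, **kwargs):
        model_output = self.model.predict(x, **kwargs)
        output_dim = len(model_output.shape)
        # Suggested change: reference self.output_chunk_length directly
        # instead of aliasing it to a local chunk_len variable.
        if output_dim > 1:
            return model_output.reshape(x.shape[0], self.output_chunk_length, -1)
        return model_output


wrapper = _ToyWrapper(_DummyBooster(), output_chunk_length=1)
print(wrapper._predict(np.random.rand(4, 3)).shape)  # (4, 1, 2)
```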
So in view of the tutorial you posted, what sort of "knowledge uncertainty" is this version capturing? It'd be nice to document this.
Looks good, thanks!
In this PR, CatBoostModel and RegressionModel were modified to support CatBoost's RMSE With Uncertainty loss function.
With this loss, CatBoost outputs not a single value but a [mean, variance] pair. Most of the work here went into wrangling the output shapes, especially in the multivariate case and with output chunk lengths greater than 1.
This PR depends on the original CatBoost PR and should be merged after it.
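For context, a minimal standalone sketch of what the underlying CatBoost output looks like with this loss, using the catboost package directly rather than the Darts wrapper (the exact column semantics should be double-checked against the CatBoost uncertainty tutorial):

```python
import numpy as np
from catboost import CatBoostRegressor

# Toy regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)

# Train with the RMSE With Uncertainty loss.
model = CatBoostRegressor(
    loss_function="RMSEWithUncertainty",
    iterations=100,
    verbose=False,
)
model.fit(X, y)

# With this loss, predictions come back as two values per sample:
# the predicted mean and the predicted (data) variance.
preds = model.predict(X, prediction_type="RMSEWithUncertainty")
print(preds.shape)  # (200, 2)
```

That extra column is what drives the reshaping logic in this PR: the wrapper has to carry a trailing dimension of size 2 through the multivariate and multi-step (output chunk length > 1) cases.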