Fix/ensemble historical forecasts #1616
Conversation
Codecov Report

```
@@            Coverage Diff             @@
##           master    #1616      +/-   ##
==========================================
- Coverage   94.11%   94.03%   -0.08%
==========================================
  Files         125      125
  Lines       11447    11454       +7
==========================================
- Hits        10773    10771       -2
- Misses        674      683       +9
```

... and 9 files with indirect coverage changes.
Hi @JanFidor and thanks for another PR 🚀 :)
Looks good! Only had a comment where I believe we're not getting the correct overall extreme lags.
```python
max_lag = None
for model in self.models:
    curr_lag = model.extreme_lags[lag_id]
    if max_lag is None or (
```
I'm not sure this respects the actual min/max across all extreme lags.
For example, `lags_future_covariates` for one regression model could be [-3, -2], and for a second one [-1, 1].
The min lag across both models should be -3, and the max lag +1.
However, the current implementation would give min/max lags of -3 and -2.
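As a sketch of the fix the reviewer is asking for (the function name is hypothetical, not the actual darts code): the overall extremes should come from taking the minimum over the per-model minima and the maximum over the per-model maxima, rather than picking one model's lag pair.

```python
def combine_extreme_lags(per_model_lags):
    """Combine per-model (min_lag, max_lag) pairs into overall extremes.

    per_model_lags: list of (min_lag, max_lag) tuples, one per ensemble member.
    """
    mins = [lags[0] for lags in per_model_lags]
    maxs = [lags[1] for lags in per_model_lags]
    return min(mins), max(maxs)

# The reviewer's example: lags_future_covariates of [-3, -2] and [-1, 1]
print(combine_extreme_lags([(-3, -2), (-1, 1)]))  # -> (-3, 1)
```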
Okay, I assumed that for a given lag all values must be either all positive or all negative, in which case abs() would work. I checked the docs and I think the new implementation fixes the problem.
Hi @dennisbader ! As always, I really appreciate both that you took the time to review the PR and the chance to help out with darts, even if it's just a little for now 🚀
Hi @JanFidor and thanks for the updates 🚀
I just had a suggestion about adding a unit test that checks that we get correct extreme lags based on some example models.
After that, it's ready to be merged :)
```python
    self._find_max_lag_or_none(i, agg) for i, agg in enumerate(lag_aggregators)
]

def _find_max_lag_or_none(self, lag_id, aggregator):
```
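A hedged sketch of how a per-lag aggregator helper like the one above might work (the standalone signature and the None-handling convention are assumptions, not the exact darts implementation): each slot of `extreme_lags` is reduced across models with its own aggregator (`min` for earliest-lag slots, `max` for latest-lag slots), skipping models that don't use that lag type.

```python
def find_extreme_lag_or_none(models_lags, lag_id, aggregator):
    """Aggregate one extreme_lags slot across ensemble members.

    models_lags: list of extreme_lags tuples, one per model; entries may be
    None when a model does not use that lag type.
    aggregator: min or max, depending on which extreme the slot represents.
    """
    extreme = None
    for lags in models_lags:
        curr = lags[lag_id]
        if curr is None:
            continue  # this model has no lag of this type
        extreme = curr if extreme is None else aggregator(extreme, curr)
    return extreme

lags = [(-3, -2), (-1, 1)]
print(find_extreme_lag_or_none(lags, 0, min))  # -> -3
print(find_extreme_lag_or_none(lags, 1, max))  # -> 1
```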
Could you add a unit test for this? It would be great to check that ensemble models now get the correct extreme lags (for example, for the regression model case I mentioned in a previous comment). No need to call backtest/historical_forecasts, just a simple check that we get the correct lags.
Looks good, thanks a lot @JanFidor 🚀
I took the chance to update some parts so we can go ahead with the release soon.
Thanks for the review and help with the merge conflicts @dennisbader ! I think this PR might be my favorite one, so I'm very happy it will be released soon. A little unrelated, but Uni picked up significantly when it comes to the workload, so I'll unfortunately have to cut down on new contributions for the time being, though I'll do my best to finish up an issue I picked up. I really enjoyed working with everyone here, had a blast writing the code, and I just wanted to let you all know so it doesn't seem like I disappeared out of the blue. I'll certainly be back with even more PRs, but it might only be after the term and exams end, so around the summer. Until then! cc @madtoinou
* add correct extreme_lags override and test
* add required extreme_lags override
* delete logging print
* change lag priorities
* add a test + use switch to tuple
* fix extreme lags from other PR
* make RegressionEnsembleModel work
* small unit test fix

Co-authored-by: madtoinou <[email protected]>
Co-authored-by: Dennis Bader <[email protected]>
Fixes #1595

Summary

Calling `historical_forecasts` or `backtest` on an ensemble model containing at least one model with a `lags` horizon > 1 or `input_chunk_length` > 1 throws an error. This was caused by using the ensemble's default `extreme_lags` property instead of calculating a new one from its models' `extreme_lags`.

Other Information

Added an `extreme_lags` property to the Theta models; they were previously throwing errors. There might be a way of deleting `extreme_lags` altogether and only using `min_train_series_length`, but I wanted to fix the bug ASAP and think about potential refactoring later.