Odd Behaviour with new dictionary format to define lags #2062
Comments
Hi @ETTAN93, can you provide a minimal reproducible example that we can run to reproduce the bug (i.e. actual code rather than a screenshot, including code to recreate the time series; you can use TimeSeries.from_times_and_values() to generate some example series)? Also, we might have fixed this bug in #2040, which resolved a similar issue.
@dennisbader, from the example in #2040, it does seem like a similar error to what I was experiencing. Do you know when the fix will be available?
Yes, we aim to release within the next 1-2 weeks.
@dennisbader Please see below a code example reproducing my error. It seems that future covariate lags at 0 are not taken into account when calculating the number of features that X provides.
I just checked on the master branch: #2040 indeed fixes this problem and will be part of the next release. Closing this issue; don't hesitate to reopen if it persists after updating darts to the latest version once it is released.
I am trying out the new dict format for defining lags in the latest version of darts (0.26.0).
However, there is some odd behaviour that I can't explain.
The way I define my covariates and the data:

The way the model and lags are initialized:

The way historical forecast is run:

This works completely fine when I define the lags in this way.
However, when I remove some features from both the dataset and the lags, e.g.


I get an error:
