feature/multivariate - step 4 #111
Conversation
…ade rnn and tcn model to inherit from it
…caused by merging
… frequency and inferred frequency don't match, but just a warning
@@ -28,10 +29,28 @@

# TODO parameterize the moving window

def _create_parameter_dicts(model, target_indices, component_index, use_full_output_length):
I believe this is what @guillaumeraille was asking about on the daily in the morning.
@pennfranc this looks good. Do you think we can also somehow make it automatic?
def backtest_forecasting(series: TimeSeries,
                         model: ForecastingModel,
                         start: pd.Timestamp,
                         fcast_horizon_n: int,
                         target_indices: Optional[List[int]] = None,
Quite a lot of args are passed to this function. As an optional improvement, consider moving them to the class level and wrapping the whole backtesting module in a class.
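The suggestion above could look something like the following. This is a hypothetical sketch, not darts code: the `BacktestConfig` and `Backtester` names are made up for illustration, and the field names simply mirror the parameters visible in the excerpt.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class BacktestConfig:
    """Hypothetical bundle for the parameters currently passed individually
    to backtest_forecasting (names mirror the excerpt above)."""
    fcast_horizon_n: int
    target_indices: Optional[List[int]] = None
    stride: int = 1
    retrain: bool = True
    use_full_output_length: bool = False


class Backtester:
    """Hypothetical class-level wrapper for the backtesting module."""

    def __init__(self, config: BacktestConfig):
        self.config = config

    def run(self, series, model, start):
        # The actual prediction loop would live here, reading settings
        # from self.config instead of a long argument list.
        return self.config


# Usage sketch: callers build one config object instead of passing
# five-plus keyword arguments to a free function.
cfg = BacktestConfig(fcast_horizon_n=3, stride=2)
runner = Backtester(cfg)
```

One benefit of this design is that adding a new backtesting option later only touches the config class, not every call site.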
use_full_output_length
    In case `model` is a subclass of `TorchForecastingModel`, this argument will be passed along
    as an argument to the predict method of `model`.
stride
Very nice. It would also be nice to have the option of configuring the training set length. When set, this would do a moving window, and when not set it would do an expanding window (this can wait for a future PR though :)
Yes for sure, @guillaumeraille also suggested this!
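To illustrate the distinction being discussed, here is a minimal sketch (not darts code; the function name and signature are invented) of how a configurable training set length separates the two behaviors: a fixed `train_length` gives a moving window, while leaving it unset gives an expanding window.

```python
def training_slices(n_points, start, stride, train_length=None):
    """Return (train_start, train_end) index pairs, one per prediction time.

    train_length=None -> expanding window: always train from index 0.
    train_length=k    -> moving window: train on at most the last k points.
    """
    slices = []
    for pred_time in range(start, n_points, stride):
        if train_length is None:
            # Expanding window: the training set grows with each step.
            slices.append((0, pred_time))
        else:
            # Moving window: the training set has a fixed maximum size.
            slices.append((max(0, pred_time - train_length), pred_time))
    return slices


# Expanding window: train start stays pinned at 0.
print(training_slices(10, 5, 2))                  # [(0, 5), (0, 7), (0, 9)]
# Moving window of length 4: train start advances with the prediction time.
print(training_slices(10, 5, 2, train_length=4))  # [(1, 5), (3, 7), (5, 9)]
```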
darts/backtesting/backtesting.py
Outdated
stride
    The number of time steps (the unit being the frequency of `series`) between two consecutive predictions.
retrain
    Whether to retrain the model for every prediction or not. Currently only TorchForecastingModel
You can add backticks around `TorchForecastingModel`.
done
darts/backtesting/backtesting.py
Outdated
last_pred_time = (
    series.time_index()[-fcast_horizon_n - stride] if trim_to_series else series.time_index()[-stride - 1]
)
raise_if_not(retrain or isinstance(model, TorchForecastingModel), "Only 'TorchForecastingModel' instances"
You can move this check up (before computing `last_pred_time`).
done
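The reordering agreed on above can be sketched as follows. This is a simplified stand-in, not the actual darts function: `TorchForecastingModel` is stubbed out, a plain list replaces the `TimeSeries`, and a plain `ValueError` replaces darts' `raise_if_not` helper. The point is only that argument validation happens before any computation, so invalid calls fail fast.

```python
class TorchForecastingModel:
    """Stub standing in for darts' TorchForecastingModel."""


def backtest_last_pred_time(series, model, fcast_horizon_n, stride,
                            retrain, trim_to_series):
    # Validate first (the check moved up from below), before computing
    # last_pred_time or doing any other work.
    if not (retrain or isinstance(model, TorchForecastingModel)):
        raise ValueError("Only 'TorchForecastingModel' instances support "
                         "retrain=False")

    # Only reached with valid arguments; mirrors the excerpt's indexing.
    last_pred_time = (
        series[-fcast_horizon_n - stride] if trim_to_series
        else series[-stride - 1]
    )
    return last_pred_time


# Usage sketch: retrain=False is only accepted for the torch model stub.
series = list(range(20))
print(backtest_last_pred_time(series, TorchForecastingModel(),
                              fcast_horizon_n=3, stride=2,
                              retrain=False, trim_to_series=True))  # 15
```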