Commit

Fix mlflow.pyfunc.save_model docstring (mlflow#13446)
Signed-off-by: harupy <[email protected]>
harupy authored and serena-ruan committed Oct 25, 2024
1 parent 0048883 commit c987025
Showing 1 changed file with 1 addition and 6 deletions.
7 changes: 1 addition & 6 deletions mlflow/pyfunc/__init__.py
@@ -2251,9 +2251,6 @@ def save_model(
     **kwargs,
 ):
     """
-    save_model(path, loader_module=None, data_path=None, code_path=None, conda_env=None,\
-               mlflow_model=Model(), python_model=None, artifacts=None)
-
     Save a Pyfunc model with custom inference logic and optional data dependencies to a path on the
     local filesystem.

@@ -2386,9 +2383,7 @@ def predict(self, context, model_input: List[str], params=None) -> List[str]:
     and :func:`PythonModel.predict() <mlflow.pyfunc.PythonModel.predict>`.

     For example, consider the following ``artifacts`` dictionary::

-        {
-            "my_file": "s3://my-bucket/path/to/my/file"
-        }
+        {"my_file": "s3://my-bucket/path/to/my/file"}

     In this case, the ``"my_file"`` artifact is downloaded from S3. The
     ``python_model`` can then refer to ``"my_file"`` as an absolute filesystem
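
For context, a minimal sketch of how the documented ``artifacts`` mechanism is typically used with ``mlflow.pyfunc.save_model`` (the model class, file handling, and output path below are illustrative, not part of this commit):

    import mlflow.pyfunc

    class MyModel(mlflow.pyfunc.PythonModel):
        def load_context(self, context):
            # context.artifacts["my_file"] resolves to the local path of the
            # artifact that was downloaded (e.g. from S3) when the model was saved.
            with open(context.artifacts["my_file"]) as f:
                self.contents = f.read()

        def predict(self, context, model_input, params=None):
            # Echo the artifact contents once per input row (illustrative only).
            return [self.contents] * len(model_input)

    mlflow.pyfunc.save_model(
        path="my_pyfunc_model",
        python_model=MyModel(),
        artifacts={"my_file": "s3://my-bucket/path/to/my/file"},
    )

Loading the saved model back with ``mlflow.pyfunc.load_model("my_pyfunc_model")`` then exposes the standard ``predict()`` interface.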
