This repository has been archived by the owner on Feb 20, 2023. It is now read-only.

Export the trained model #10

Merged: 2 commits merged into vmware-archive:main on Dec 15, 2021

Conversation


@tzstoyanov tzstoyanov commented Dec 8, 2021

Once the model is trained, it has to be exported so that an inference service can serve it. Two optional parameters are introduced:
-save NAME
-save_version VERSION
By default, the model is not exported. If "-save NAME" is specified, the model is saved under the given NAME. If "-save_version VERSION" is specified together with "-save NAME", the model is saved under the given NAME and VERSION. "-save_version" is ignored if "-save" is missing. By default, version "001" is used. Models are exported to the directory:
models/<NAME>-<outcome>/<VERSION>/
and compressed into the file:
models/<NAME>-<outcome>/<NAME>-<outcome>-<VERSION>.tar.gz
The exported models were tested with KServe; the directory layout and the archive file are structured the way the KServe TensorFlow predictor expects.

fixes #2

Signed-off-by: Tzvetomir Stoyanov (VMware) [email protected]
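To make the layout above concrete, here is a minimal sketch of how the export directory and archive could be produced. The function and argument names are illustrative assumptions, not the PR's actual code, and the model is assumed to expose a Keras-style save() method.

# Sketch only: builds the models/<NAME>-<outcome>/<VERSION>/ layout described above.
import os
import tarfile

def export_model(model, name, outcome, version="001"):
    # models/<NAME>-<outcome>/<VERSION>/
    base_dir = os.path.join("models", "{}-{}".format(name, outcome))
    export_dir = os.path.join(base_dir, version)
    model.save(export_dir)  # assumption: Keras-style model with .save()

    # models/<NAME>-<outcome>/<NAME>-<outcome>-<VERSION>.tar.gz
    # The directory is archived with <VERSION>/ as the top-level entry,
    # which is the layout the KServe TensorFlow predictor expects.
    archive = os.path.join(base_dir, "{}-{}-{}.tar.gz".format(name, outcome, version))
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(export_dir, arcname=version)
    return export_dir, archive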


annajung commented Dec 9, 2021

The model's version is hardcoded to 0001; managing different model versions is TBD.

Can we create an issue to track this?

Also, with the new functions added, I think the docs might have to be regenerated.

PYTHONPATH=./ml-conversational-analytic-tool pdoc --html --output-dir docs ml-conversational-analytic-tool --force


tzstoyanov commented Dec 10, 2021

Done, the documentation is updated in a second commit.
Thanks for pointing that out - I had no idea about it.

@@ -46,6 +66,7 @@ def run(annotated_filename, dataset_filename, outcome, encoding_type, model_type
parser.add_argument('dataset_filename', help='File location of extracted dataset')
parser.add_argument('model', help='Model type to use for training, supported CNN and LSTM')
parser.add_argument('outcome', help='Inclusive, Constructive, or Both')
parser.add_argument('-save', help='Save the trained model')

Can you clarify in the help message what type of argument -save expects? At first it was unclear to me whether -save is a boolean flag that takes no additional argument, or whether it expects a path or a file name.
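For example, a help message along these lines would make it explicit that -save takes a value. This is only a sketch of the suggestion, not the wording the PR ended up with, and the -save_version default shown here is an assumption based on the PR description.

# Sketch: metavar makes it obvious that -save expects a value rather than acting as a flag.
parser.add_argument('-save', metavar='NAME',
                    help='Save the trained model under the given NAME (exported to models/<NAME>-<outcome>/...)')
parser.add_argument('-save_version', metavar='VERSION', default='001',
                    help='Version string to use together with -save; ignored when -save is not given')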

tar = tarfile.open(name + "-" + model_version + ".tar.gz", "w:gz")
tar.add(model_version)
tar.close()
os.chdir("../../")

nit: what about printing out the directory where the model and the tar file are saved?
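One way to address the nit, sketched against the snippet above; the print wording is illustrative, not the change that was actually committed.

tar = tarfile.open(name + "-" + model_version + ".tar.gz", "w:gz")
tar.add(model_version)
tar.close()
# Report where the exported model and the archive were written, before leaving the directory.
print("Model saved in: " + os.path.join(os.getcwd(), model_version))
print("Archive saved as: " + os.path.join(os.getcwd(), name + "-" + model_version + ".tar.gz"))
os.chdir("../../")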

@annajung
Contributor

I think you have to update the docs one more time; I don't think they reflect the latest changes.
Once that's resolved, lgtm.

As the implementation was extended with new functions, the documentation
should be regenerated.

Signed-off-by: Tzvetomir Stoyanov (VMware) <[email protected]>
@tzstoyanov
Contributor Author

I think you have to update the docs one more time; I don't think they reflect the latest changes. Once that's resolved, lgtm.

done

@tzstoyanov tzstoyanov linked an issue Dec 14, 2021 that may be closed by this pull request

@annajung annajung left a comment


thank you! lgtm

@tzstoyanov tzstoyanov merged commit 8d916e5 into vmware-archive:main Dec 15, 2021
Development

Successfully merging this pull request may close these issues.

Add functionality to export the model
4 participants