Conversation
Can we create an issue to track this? Also, with the new functions added, I think the docs might have to be regenerated.
Force-pushed from f074077 to 6207b40
Done, the documentation is updated with a second commit.
@@ -46,6 +66,7 @@ def run(annotated_filename, dataset_filename, outcome, encoding_type, model_type
parser.add_argument('dataset_filename', help='File location of extracted dataset')
parser.add_argument('model', help='Model type to use for training, supported CNN and LSTM')
parser.add_argument('outcome', help='Inclusive, Constructive, or Both')
parser.add_argument('-save', help='Save the trained model')
Can you clarify in the help message what type of argument `-save` expects? At first it was unclear to me whether `-save` is a boolean flag that takes no additional argument, or whether it expects a path or a file name...
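One way to address this review comment is argparse's `metavar` plus more explicit help text. This is a minimal, hypothetical sketch of such a change, not the PR's actual wording:

```python
import argparse

# Illustrative sketch only: the metavar and help string below are assumed
# wording for making it clear that -save expects a model NAME argument,
# rather than acting as a boolean flag.
parser = argparse.ArgumentParser()
parser.add_argument('-save', metavar='NAME',
                    help='Export the trained model under the given NAME '
                         '(a model name, not a path); omit to skip exporting')

args = parser.parse_args(['-save', 'my_model'])
print(args.save)
```

With `metavar='NAME'`, the usage line renders as `[-save NAME]`, which already signals that the option consumes a value.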
tar = tarfile.open(name + "-" + model_version + ".tar.gz", "w:gz")
tar.add(model_version)
tar.close()
os.chdir("../../")
nit: what about printing out the directory where the model and the tar file are saved?
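The reviewer's nit could be addressed with something like the following sketch. The variable names mirror the snippet above, but their values here are placeholders:

```python
import os

# Illustrative values standing in for the script's actual name/version.
name, model_version = "demo", "001"
tar_name = name + "-" + model_version + ".tar.gz"

# Report where the exported model directory and the archive ended up,
# resolving to absolute paths so the message is unambiguous.
print("Model exported to directory:", os.path.abspath(model_version))
print("Archive written to:", os.path.abspath(tar_name))
```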
Force-pushed from 1ee9507 to f661315
I think you have to update the docs one more time; I don't think they reflect the latest changes.
As the implementation was extended with new functions, the documentation should be regenerated.
Signed-off-by: Tzvetomir Stoyanov (VMware) <[email protected]>
Force-pushed from f661315 to 3bce4db
done
thank you! lgtm
When the model is trained, it should be exported in order to run an
inference service that serves it. Two optional parameters are
introduced:
-save NAME
-save_version VERSION
By default, the model is not exported. If "-save NAME" is specified, the
model is saved using the given NAME. If "-save_version VERSION" is
specified together with "-save NAME", the model is saved using the given
NAME and VERSION. The "-save_version" option is ignored if "-save" is missing.
By default, version "001" is used. Models are exported in directory:
models/<NAME>-<outcome>/<VERSION>/
and are compressed in file:
models/<NAME>-<outcome>/<NAME>-<outcome>-<VERSION>.tar.gz
The exported models are tested with kserve; the layout of the directories and
the archive file is designed the way the kserve tensorflow predictor expects.
fixes #2
Signed-off-by: Tzvetomir Stoyanov (VMware) <[email protected]>
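The export layout described in the commit message can be sketched as follows. This is a self-contained illustration under assumed names (`demo`, `inclusive`, a placeholder `saved_model.pb`), not the PR's implementation:

```python
import os
import tarfile
import tempfile

# Illustrative NAME, outcome, and VERSION values.
name, outcome, version = "demo", "inclusive", "001"

# Build models/<NAME>-<outcome>/<VERSION>/ under a temporary root.
root = tempfile.mkdtemp()
model_dir = os.path.join(root, "models", f"{name}-{outcome}", version)
os.makedirs(model_dir)
# Placeholder file standing in for the exported SavedModel contents.
open(os.path.join(model_dir, "saved_model.pb"), "w").close()

# Compress into models/<NAME>-<outcome>/<NAME>-<outcome>-<VERSION>.tar.gz,
# archiving the VERSION directory itself so it is the tarball's top-level
# entry (the layout a kserve tensorflow predictor expects).
archive = os.path.join(root, "models", f"{name}-{outcome}",
                       f"{name}-{outcome}-{version}.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(model_dir, arcname=version)

with tarfile.open(archive, "r:gz") as tar:
    print(sorted(tar.getnames()))
```

Using `arcname=version` keeps host-specific path prefixes out of the archive, so extracting it yields just the version directory.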