Fix yolov8 pytorch notebook
lapid92 committed Apr 2, 2024
1 parent 8570ca7 commit c20bd42
Showing 1 changed file with 76 additions and 61 deletions.
137 changes: 76 additions & 61 deletions tutorials/notebooks/pytorch/ptq/pytorch_yolov8n_for_imx500.ipynb
@@ -2,6 +2,10 @@
"cells": [
{
"cell_type": "markdown",
"id": "fab9d9939dc74da4",
"metadata": {
"collapsed": false
},
"source": [
"# YOLOv8n Object Detection PyTorch Model - Quantization for IMX500\n",
"\n",
@@ -21,76 +25,78 @@
"1. Post-Training Quantization using MCT of PyTorch object detection model.\n",
"2. Data preparation - loading and preprocessing validation and representative datasets from COCO.\n",
"3. Accuracy evaluation of the floating-point and the quantized models."
],
"metadata": {
"collapsed": false
},
"id": "fab9d9939dc74da4"
]
},
{
"cell_type": "markdown",
"source": [
"## Setup\n",
"### Install the relevant packages"
],
"id": "d74f9c855ec54081",
"metadata": {
"collapsed": false
},
"id": "d74f9c855ec54081"
"source": [
"## Setup\n",
"### Install the relevant packages"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7c7fa04c9903736f",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"!pip install -q torch\n",
"!pip install onnx\n",
"!pip install -q pycocotools\n",
"!pip install huggingface-hub>=0.21\n",
"!pip install --pre sony-custom-layers-dev==0.2.0.dev1"
],
"metadata": {
"collapsed": false
},
"id": "7c7fa04c9903736f"
]
},
{
"cell_type": "markdown",
"source": [
" Clone a copy of the [MCT](https://github.com/sony/model_optimization) (Model Compression Toolkit) into your current directory. This step ensures that you have access to [MCT Models Library](https://github.com/sony/model_optimization/tree/main/tutorials/mct_model_garden) folder which contains all the necessary utility functions for this tutorial.\n",
" **It's important to note that we use the most up-to-date MCT code available.**"
],
"id": "57717bc8f59a0d85",
"metadata": {
"collapsed": false
},
"id": "57717bc8f59a0d85"
"source": [
" Clone a copy of the [MCT](https://github.com/sony/model_optimization) (Model Compression Toolkit) into your current directory. This step ensures that you have access to [MCT Models Library](https://github.com/sony/model_optimization/tree/main/tutorials/mct_model_garden) folder which contains all the necessary utility functions for this tutorial.\n",
" **It's important to note that we use the most up-to-date MCT code available.**"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9728247bc20d0600",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"!git clone https://github.com/sony/model_optimization.git local_mct\n",
"!pip install -r ./local_mct/requirements.txt\n",
"import sys\n",
"sys.path.insert(0,\"./local_mct\")"
],
"metadata": {
"collapsed": false
},
"id": "9728247bc20d0600"
]
},
{
"cell_type": "markdown",
"source": [
"### Download COCO evaluation set"
],
"id": "7a1038b9fd98bba2",
"metadata": {
"collapsed": false
},
"id": "7a1038b9fd98bba2"
"source": [
"### Download COCO evaluation set"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8bea492d71b4060f",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"!wget -nc http://images.cocodataset.org/annotations/annotations_trainval2017.zip\n",
@@ -99,11 +105,7 @@
"!wget -nc http://images.cocodataset.org/zips/val2017.zip\n",
"!unzip -q -o val2017.zip -d ./coco\n",
"!echo Done loading val2017 images"
],
"metadata": {
"collapsed": false
},
"id": "8bea492d71b4060f"
]
},
{
"cell_type": "markdown",
@@ -119,15 +121,28 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 2,
"id": "e8395b28-4732-4d18-b081-5d3bdf508691",
"metadata": {
"is_executing": true
},
"outputs": [],
"outputs": [
{
"ename": "FileNotFoundError",
"evalue": "[Errno 2] No such file or directory: 'tutorials/mct_model_garden/models_pytorch/yolov8/yolov8n.yaml'",
"output_type": "error",
"traceback": [
"\u001B[0;31m---------------------------------------------------------------------------\u001B[0m",
"\u001B[0;31mFileNotFoundError\u001B[0m Traceback (most recent call last)",
"Cell \u001B[0;32mIn[2], line 2\u001B[0m\n\u001B[1;32m 1\u001B[0m \u001B[38;5;28;01mfrom\u001B[39;00m \u001B[38;5;21;01mtutorials\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01mmct_model_garden\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01mmodels_pytorch\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01myolov8\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01myolov8\u001B[39;00m \u001B[38;5;28;01mimport\u001B[39;00m DetectionModelPyTorch, yaml_load\n\u001B[0;32m----> 2\u001B[0m cfg_dict \u001B[38;5;241m=\u001B[39m \u001B[43myaml_load\u001B[49m\u001B[43m(\u001B[49m\u001B[38;5;124;43m\"\u001B[39;49m\u001B[38;5;124;43mtutorials/mct_model_garden/models_pytorch/yolov8/yolov8n.yaml\u001B[39;49m\u001B[38;5;124;43m\"\u001B[39;49m\u001B[43m,\u001B[49m\u001B[43m \u001B[49m\u001B[43mappend_filename\u001B[49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[38;5;28;43;01mTrue\u001B[39;49;00m\u001B[43m)\u001B[49m \u001B[38;5;66;03m# model dict\u001B[39;00m\n\u001B[1;32m 3\u001B[0m model \u001B[38;5;241m=\u001B[39m DetectionModelPyTorch\u001B[38;5;241m.\u001B[39mfrom_pretrained(\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mSSI-DNN/pytorch_yolov8n_640x640_bb_decoding\u001B[39m\u001B[38;5;124m\"\u001B[39m, cfg\u001B[38;5;241m=\u001B[39mcfg_dict)\n",
"File \u001B[0;32m/data/projects/swat/users/ariell/repos/my_fork/yolov8_pytorch_tut/model_optimization/tutorials/mct_model_garden/models_pytorch/yolov8/yolov8.py:51\u001B[0m, in \u001B[0;36myaml_load\u001B[0;34m(file, append_filename)\u001B[0m\n\u001B[1;32m 40\u001B[0m \u001B[38;5;28;01mdef\u001B[39;00m \u001B[38;5;21myaml_load\u001B[39m(file: \u001B[38;5;28mstr\u001B[39m \u001B[38;5;241m=\u001B[39m \u001B[38;5;124m'\u001B[39m\u001B[38;5;124mdata.yaml\u001B[39m\u001B[38;5;124m'\u001B[39m, append_filename: \u001B[38;5;28mbool\u001B[39m \u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mFalse\u001B[39;00m) \u001B[38;5;241m-\u001B[39m\u001B[38;5;241m>\u001B[39m Dict[\u001B[38;5;28mstr\u001B[39m, \u001B[38;5;28many\u001B[39m]:\n\u001B[1;32m 41\u001B[0m \u001B[38;5;250m \u001B[39m\u001B[38;5;124;03m\"\"\"\u001B[39;00m\n\u001B[1;32m 42\u001B[0m \u001B[38;5;124;03m Load YAML data from a file.\u001B[39;00m\n\u001B[1;32m 43\u001B[0m \n\u001B[0;32m (...)\u001B[0m\n\u001B[1;32m 49\u001B[0m \u001B[38;5;124;03m dict: YAML data and file name.\u001B[39;00m\n\u001B[1;32m 50\u001B[0m \u001B[38;5;124;03m \"\"\"\u001B[39;00m\n\u001B[0;32m---> 51\u001B[0m \u001B[38;5;28;01mwith\u001B[39;00m \u001B[38;5;28;43mopen\u001B[39;49m\u001B[43m(\u001B[49m\u001B[43mfile\u001B[49m\u001B[43m,\u001B[49m\u001B[43m \u001B[49m\u001B[43merrors\u001B[49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[38;5;124;43m'\u001B[39;49m\u001B[38;5;124;43mignore\u001B[39;49m\u001B[38;5;124;43m'\u001B[39;49m\u001B[43m,\u001B[49m\u001B[43m \u001B[49m\u001B[43mencoding\u001B[49m\u001B[38;5;241;43m=\u001B[39;49m\u001B[38;5;124;43m'\u001B[39;49m\u001B[38;5;124;43mutf-8\u001B[39;49m\u001B[38;5;124;43m'\u001B[39;49m\u001B[43m)\u001B[49m \u001B[38;5;28;01mas\u001B[39;00m f:\n\u001B[1;32m 52\u001B[0m s \u001B[38;5;241m=\u001B[39m f\u001B[38;5;241m.\u001B[39mread() \u001B[38;5;66;03m# string\u001B[39;00m\n\u001B[1;32m 53\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;129;01mnot\u001B[39;00m s\u001B[38;5;241m.\u001B[39misprintable(): \u001B[38;5;66;03m# remove special characters\u001B[39;00m\n",
"\u001B[0;31mFileNotFoundError\u001B[0m: [Errno 2] No such file or directory: 'tutorials/mct_model_garden/models_pytorch/yolov8/yolov8n.yaml'"
]
}
],
"source": [
"from tutorials.mct_model_garden.models_pytorch.yolov8.yolov8 import DetectionModelPyTorch, yaml_load\n",
"cfg_dict = yaml_load(\"yolov8n.yaml\", append_filename=True) # model dict\n",
"cfg_dict = yaml_load(\"./local_mct/tutorials/mct_model_garden/models_pytorch/yolov8/yolov8n.yaml\", append_filename=True) # model dict\n",
"model = DetectionModelPyTorch.from_pretrained(\"SSI-DNN/pytorch_yolov8n_640x640_bb_decoding\", cfg=cfg_dict)"
]
},
@@ -146,6 +161,10 @@
{
"cell_type": "code",
"execution_count": null,
"id": "56393342-cecf-4f64-b9ca-2f515c765942",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import model_compression_toolkit as mct\n",
@@ -229,37 +248,33 @@
" score_threshold=score_threshold,\n",
" iou_threshold=iou_threshold,\n",
" max_detections=max_detections).to(device=device)"
],
"metadata": {
"collapsed": false
},
"id": "56393342-cecf-4f64-b9ca-2f515c765942"
]
},
{
"cell_type": "markdown",
"id": "3be2016acdc9da60",
"metadata": {
"collapsed": false
},
"source": [
"### Model Export\n",
"\n",
"Now, we can export the quantized model, ready for deployment, into a `.onnx` format file. Please ensure that the `save_model_path` has been set correctly. "
],
"metadata": {
"collapsed": false
},
"id": "3be2016acdc9da60"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "72dd885c7b92fa93",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"mct.exporter.pytorch_export_model(model=quant_model_pp,\n",
" save_model_path='./qmodel_pp.onnx',\n",
" repr_dataset=representative_dataset_gen)"
],
"metadata": {
"collapsed": false
},
"id": "72dd885c7b92fa93"
]
},
{
"cell_type": "markdown",
@@ -347,6 +362,10 @@
},
{
"cell_type": "markdown",
"id": "6d93352843a27433",
"metadata": {
"collapsed": false
},
"source": [
"\\\n",
"Copyright 2024 Sony Semiconductor Israel, Inc. All rights reserved.\n",
@@ -362,14 +381,13 @@
"WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
"See the License for the specific language governing permissions and\n",
"limitations under the License."
],
"metadata": {
"collapsed": false
},
"id": "6d93352843a27433"
]
}
],
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
@@ -385,10 +403,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.7"
},
"colab": {
"provenance": []
"version": "3.11.4"
}
},
"nbformat": 4,
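
The central fix in this diff replaces the bare `yaml_load("yolov8n.yaml", ...)` call with a path rooted in the `./local_mct` clone, resolving the `FileNotFoundError` captured in the committed cell output. A minimal sketch of that path handling (the `resolve_cfg` helper name is illustrative, not part of the diff; the actual notebook passes the joined path directly to `yaml_load`):

```python
from pathlib import Path

def resolve_cfg(clone_dir: str, rel_path: str) -> str:
    """Join the MCT clone directory with a repo-relative config path.

    Hypothetical helper mirroring the commit's fix: prefixing the
    repo-relative YAML path with the ./local_mct clone directory so the
    file is found regardless of the notebook's working directory.
    """
    return str(Path(clone_dir) / rel_path)

cfg_path = resolve_cfg(
    "./local_mct",
    "tutorials/mct_model_garden/models_pytorch/yolov8/yolov8n.yaml",
)
print(cfg_path)
```

The same reasoning explains the `sys.path.insert(0, "./local_mct")` line in the setup cell: imports such as `tutorials.mct_model_garden...` resolve against the clone, so file paths must do the same.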
