Updated documentation for compile_tool
ilya-lavrenov committed Mar 19, 2022
1 parent c3b0597 commit 5a2a49a
Showing 2 changed files with 24 additions and 20 deletions.
24 changes: 12 additions & 12 deletions docs/OV_Runtime_UG/supported_plugins/Device_Plugins.md
@@ -40,17 +40,17 @@ Devices similar to the ones we have used for benchmarking can be accessed using
## Features support matrix
The table below demonstrates support of key features by OpenVINO device plugins.

-| Capability | [CPU](CPU.md) | [GPU](GPU.md) | [GNA](GNA.md) | [VPU](VPU.md) | [Arm® CPU](ARM_CPU.md) |
-| ---------- | --- | --- | --- | --- | --- |
-| [Heterogeneous execution](../hetero_execution.md)| Yes | Yes | No | ? | Yes |
-| [Multi-device execution](../multi_device.md) | Yes | Yes | Partial | ? | Yes |
-| [Automatic batching](../automatic_batching.md) | No | Yes | No | ? | No |
-| [Multi-stream execution](@ref openvino_docs_optimization_guide_dldt_optimization_guide) | Yes | Yes | No | ? | Yes |
-| [Models caching](../Model_caching_overview.md) | Yes | Partial | Yes | ? | No |
-| [Dynamic shapes](../ov_dynamic_shapes.md) | Yes | Partial | No | ? | No |
-| Import/Export | Yes | No | Yes | ? | No |
-| [Preprocessing acceleration](../preprocessing_overview.md) | Yes | Yes | No | ? | Partial |
-| [Stateful models](../network_state_intro.md) | Yes | No | Yes | ? | No |
-| [Extensibility](@ref openvino_docs_Extensibility_UG_Intro) | Yes | Yes | No | ? | No |
+| Capability | [CPU](CPU.md) | [GPU](GPU.md) | [GNA](GNA.md) | [Arm® CPU](ARM_CPU.md) |
+| ---------- | --- | --- | --- | --- |
+| [Heterogeneous execution](../hetero_execution.md) | Yes | Yes | No | Yes |
+| [Multi-device execution](../multi_device.md) | Yes | Yes | Partial | Yes |
+| [Automatic batching](../automatic_batching.md) | No | Yes | No | No |
+| [Multi-stream execution](../../optimization_guide/dldt_deployment_optimization_tput.md) | Yes | Yes | No | Yes |
+| [Models caching](../Model_caching_overview.md) | Yes | Partial | Yes | No |
+| [Dynamic shapes](../ov_dynamic_shapes.md) | Yes | Partial | No | No |
+| [Import/Export](../../../tools/compile_tool/README.md) | Yes | No | Yes | No |
+| [Preprocessing acceleration](../preprocessing_overview.md) | Yes | Yes | No | Partial |
+| [Stateful models](../network_state_intro.md) | Yes | No | Yes | No |
+| [Extensibility](@ref openvino_docs_Extensibility_UG_Intro) | Yes | Yes | No | No |

For more details on plugin-specific feature limitations, see the corresponding plugin pages.
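
The Import/Export row of the matrix can also be cross-checked at runtime. Below is a minimal sketch, assuming the OpenVINO Runtime 2.0 API, where `ov::device::capabilities` reports an `EXPORT_IMPORT` entry for devices that can export and import compiled blobs:

```cpp
#include <algorithm>
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // Each available device reports its capability list at runtime,
    // including whether compiled blobs can be exported and imported.
    for (const auto& device : core.get_available_devices()) {
        const auto caps = core.get_property(device, ov::device::capabilities);
        const bool can_export = std::find(caps.begin(), caps.end(),
                                          ov::device::capability::EXPORT_IMPORT) != caps.end();
        std::cout << device << ": import/export "
                  << (can_export ? "supported" : "not supported") << "\n";
    }
    return 0;
}
```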
20 changes: 12 additions & 8 deletions tools/compile_tool/README.md
@@ -1,19 +1,23 @@
# Compile Tool {#openvino_inference_engine_tools_compile_tool_README}

-Compile tool is a C++ application that enables you to compile a network for inference on a specific device and export it to a binary file.
-With the Compile Tool, you can compile a network using supported Inference Engine plugins on a machine that doesn't have the physical device connected and then transfer a generated file to any machine with the target inference device available.
+Compile tool is a C++ application that enables you to compile a model for inference on a specific device and export the compiled representation to a binary file.
+With the Compile Tool, you can compile a model using supported OpenVINO Runtime devices on a machine that doesn't have the physical device connected, and then transfer the generated file to any machine with the target inference device available. See the [Features support matrix](../../docs/OV_Runtime_UG/supported_plugins/Device_Plugins.md) to find out which devices support the import/export functionality.

-The tool compiles networks for the following target devices using corresponding Inference Engine plugins:
+The tool compiles models for the following target devices using the corresponding OpenVINO Runtime plugins:
* Intel® Neural Compute Stick 2 (MYRIAD plugin)


The tool is delivered as an executable file that can be run on both Linux* and Windows*.
The tool is located in the `<INSTALLROOT>/tools/compile_tool` directory.

-The workflow of the Compile tool is as follows:
+## Workflow of the Compile tool

-1. First, the application reads command-line parameters and loads a network to the Inference Engine device.
-2. The application exports a blob with the compiled network and writes it to the output file.
+1. First, the application reads command-line parameters and loads a model to the OpenVINO Runtime device.
+2. Then the application exports a blob with the compiled model and writes it to the output file.

+The compile_tool also supports the following capabilities:
+- Embedding [layout](../../docs/OV_Runtime_UG/layout_overview.md) and precision conversions (see [Optimize Preprocessing](../../docs/OV_Runtime_UG/preprocessing_overview.md) and the sketch after this list). To compile a model with advanced preprocessing capabilities, refer to [Use Case - Integrate and Save Preprocessing Steps Into IR](../../docs/OV_Runtime_UG/preprocessing_usecase_save.md), which shows how to include all the preprocessing in the compiled blob.
+- Compiling blobs for the OpenVINO Runtime API 2.0 (the default) or for the Inference Engine API with the explicit `-ov_api_1_0` option.
+- Accepting device-specific options that customize the compilation process.
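
To illustrate the first capability, here is a rough sketch of what embedding preprocessing looks like on the API side, following the referenced preprocessing guide; the model path and the NHWC/NCHW layouts are placeholder assumptions:

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    auto model = core.read_model("model.xml");  // placeholder model path

    // Describe the tensor the application will pass at runtime (u8, NHWC)
    // and the layout the model expects (NCHW); the required conversions
    // become part of the model itself.
    ov::preprocess::PrePostProcessor ppp(model);
    ppp.input().tensor().set_element_type(ov::element::u8).set_layout("NHWC");
    ppp.input().model().set_layout("NCHW");
    model = ppp.build();

    // A model prepared this way can then be compiled (and exported) with
    // the preprocessing steps baked into the resulting blob.
    ov::CompiledModel compiled = core.compile_model(model, "CPU");
    return 0;
}
```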

## Run the Compile Tool

@@ -85,5 +89,5 @@ To import a blob with the network from a generated file into your application, use
```cpp
ov::Core ie;
std::ifstream file{"model_name.blob"};
-ov::CompiledModel compiled_model = ie.import_model(file, "MYRIAD", {});
+ov::CompiledModel compiled_model = ie.import_model(file, "MYRIAD");
```
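
For reference, a self-contained variant of the snippet above, with the includes it needs and the stream opened in binary mode (blobs are binary artifacts, which matters on Windows); `model_name.blob` is a placeholder:

```cpp
#include <fstream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    // Open the exported blob in binary mode; it is not a text file.
    std::ifstream blob_file{"model_name.blob", std::ios::binary};
    ov::CompiledModel compiled_model = core.import_model(blob_file, "MYRIAD");

    // No recompilation happens here: the device loads the precompiled blob
    // and is immediately ready to create inference requests.
    ov::InferRequest request = compiled_model.create_infer_request();
    return 0;
}
```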
