
Commit

Cherry-pick #3295 to releases/2021/2 (#3353)
* Feature/azaytsev/change layout (#3295)

* Updated openvino_docs.xml
andrew-zaytsev authored Nov 25, 2020
1 parent fd532c2 commit bff2fe0
Showing 6 changed files with 54 additions and 46 deletions.
2 changes: 1 addition & 1 deletion docs/IE_DG/inference_engine_intro.md
@@ -11,7 +11,7 @@ The open source version is available in the [OpenVINO™ toolkit GitHub reposito

To learn about how to use the Inference Engine API for your application, see the [Integrating Inference Engine in Your Application](Integrate_with_customer_application_new_API.md) documentation.

-For complete API Reference, see the [API Reference](usergroup29.html) section.
+For complete API Reference, see the [Inference Engine API References](./api_references.html) section.

Inference Engine uses a plugin architecture. An Inference Engine plugin is a software component that contains a complete implementation for inference on a certain Intel® hardware device: CPU, GPU, VPU, etc. Each plugin implements the unified API and provides additional hardware-specific APIs.
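As a generic illustration of this plugin pattern — not the actual Inference Engine API; the `Core` and plugin classes below are invented for the sketch — each device plugin implements the same interface and a core object dispatches to one by name:

```python
# Minimal sketch of plugin-style dispatch: every "device plugin" exposes the
# same interface, and a registry object selects one by device name.
# This is an illustration only, not the real Inference Engine API.

class CPUPlugin:
    def infer(self, data):
        # Stand-in for CPU inference: just sum the inputs.
        return sum(data)

class GPUPlugin:
    def infer(self, data):
        # A real GPU plugin would offload work; this stand-in mimics the result.
        return sum(data)

class Core:
    """Registry mapping device names to plugin implementations."""
    def __init__(self):
        self._plugins = {"CPU": CPUPlugin(), "GPU": GPUPlugin()}

    def infer(self, device, data):
        # Unified entry point: same call regardless of the target device.
        return self._plugins[device].infer(data)

core = Core()
print(core.infer("CPU", [1, 2, 3]))  # -> 6
print(core.infer("GPU", [1, 2, 3]))  # -> 6
```

The point of the pattern is that application code only talks to the unified interface; swapping the target device is a one-string change.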

2 changes: 1 addition & 1 deletion docs/IE_DG/protecting_model_guide.md
@@ -57,7 +57,7 @@ should be called with `weights` passed as an empty `Blob`.
- OpenVINO™ toolkit online documentation: [https://docs.openvinotoolkit.org](https://docs.openvinotoolkit.org)
- Model Optimizer Developer Guide: [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)
- Inference Engine Developer Guide: [Inference Engine Developer Guide](Deep_Learning_Inference_Engine_DevGuide.md)
-- For more information on Sample Applications, see the [Inference Engine Samples Overview](Samples_Overview.html)
+- For more information on Sample Applications, see the [Inference Engine Samples Overview](Samples_Overview.md)
- For information on a set of pre-trained models, see the [Overview of OpenVINO™ Toolkit Pre-Trained Models](@ref omz_models_intel_index)
- For information on Inference Engine Tutorials, see the [Inference Tutorials](https://github.com/intel-iot-devkit/inference-tutorials-generic)
- For IoT Libraries and Code Samples see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit).
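The core idea of the model-protection guide above — keep the model encrypted at rest and decode it only into an in-memory buffer that is then handed to the runtime — can be sketched with standard-library code. The XOR "cipher" and the `<net .../>` byte string below are placeholders for a real cipher and a real IR file, not a protection scheme to use in practice:

```python
# Sketch of "decrypt in memory": the model bytes on disk stay encrypted,
# and only the in-memory buffer is ever in plain form. XOR stands in for
# a real cipher (e.g. AES); the runtime hand-off is represented by a plain
# equality check rather than a real ReadNetwork-style call.

def xor_bytes(data: bytes, key: int) -> bytes:
    """Toy symmetric transform: XOR every byte with a one-byte key."""
    return bytes(b ^ key for b in data)

plain_model = b"<net .../>"                  # stand-in for IR model content
encrypted = xor_bytes(plain_model, 0x5A)     # what would be stored on disk
model_bytes = xor_bytes(encrypted, 0x5A)     # decrypted only in memory

assert encrypted != plain_model              # on-disk form is not readable
assert model_bytes == plain_model            # in-memory buffer is usable
```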
12 changes: 8 additions & 4 deletions docs/doxygen/openvino_docs.xml
@@ -6,7 +6,7 @@
<!-- GET STARTED category -->
<tab type="usergroup" title="GET STARTED" url="index.html">
<!-- Install Directly -->
-      <tab type="usergroup" title="Install Directly" url=""><!--automatically generated-->
+      <tab type="usergroup" title="Installation Guides" url=""><!--automatically generated-->
<tab type="usergroup" title="Linux" url="@ref openvino_docs_install_guides_installing_openvino_linux">
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Linux* OS" url="@ref openvino_docs_install_guides_installing_openvino_linux"/>
<tab type="user" title="[DEPRECATED] Install Intel® Distribution of OpenVINO™ toolkit for Linux with FPGA Support" url="@ref openvino_docs_install_guides_installing_openvino_linux_fpga"/>
@@ -17,13 +17,15 @@
</tab>
<tab type="user" title="macOS" url="@ref openvino_docs_install_guides_installing_openvino_macos"/>
<tab type="user" title="Raspbian OS" url="@ref openvino_docs_install_guides_installing_openvino_raspbian"/>
+        <tab type="user" title="DL Workbench Installation Guide" url="./workbench_docs_Workbench_DG_Install_Workbench.html"/><!-- Link to the original Workbench topic -->
</tab>
<!-- Install From Images and Repositories -->
-      <tab type="usergroup" title="Install From Images and Repositories" url=""><!--automatically generated-->
+      <tab type="usergroup" title="Install From Images and Repositories" url="@ref openvino_docs_install_guides_installing_openvino_images">
<tab type="usergroup" title="Docker" url="@ref openvino_docs_install_guides_installing_openvino_docker_linux">
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Linux* from a Docker* Image" url="@ref openvino_docs_install_guides_installing_openvino_docker_linux"/>
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Windows* from a Docker* Image" url="@ref openvino_docs_install_guides_installing_openvino_docker_windows"/>
</tab>
+        <tab type="user" title="Docker with DL Workbench" url="./workbench_docs_Workbench_DG_Install_from_Docker_Hub.html"/><!-- Link to the original Workbench topic -->
<tab type="user" title="APT" url="@ref openvino_docs_install_guides_installing_openvino_apt"/>
<tab type="user" title="YUM" url="@ref openvino_docs_install_guides_installing_openvino_yum"/>
<tab type="user" title="Anaconda Cloud" url="@ref openvino_docs_install_guides_installing_openvino_conda"/>
Expand All @@ -37,7 +39,9 @@
<tab type="user" title="Linux" url="@ref openvino_docs_get_started_get_started_linux"/>
<tab type="user" title="Windows" url="@ref openvino_docs_get_started_get_started_windows"/>
<tab type="user" title="macOS" url="@ref openvino_docs_get_started_get_started_macos"/>
+        <tab type="user" title="Get Started with OpenVINO via DL Workbench" url="@ref openvino_docs_get_started_get_started_dl_workbench"/>
        <tab type="user" title="Legal Information" url="@ref openvino_docs_Legal_Information"/>
+        <tab type="user" title="Introduction to DL Workbench" url="./openvino_docs_get_started_get_started_dl_workbench.html"/><!-- Link to the original Workbench topic -->
</tab>
<!-- Configuration for Hardware -->
<tab type="usergroup" title="Configuration for Hardware" url=""><!--automatically generated-->
@@ -126,8 +130,8 @@
<tab type="user" title="Automatic Speech Recognition C++ Sample" url="@ref openvino_inference_engine_samples_speech_sample_README"/>
<tab type="user" title="Neural Style Transfer C++ Sample" url="@ref openvino_inference_engine_samples_style_transfer_sample_README"/>
<tab type="user" title="Neural Style Transfer Python* Sample" url="@ref openvino_inference_engine_ie_bridges_python_sample_style_transfer_sample_README"/>
-        <tab type="user" title="Benchmark C++ App" url="@ref openvino_inference_engine_samples_benchmark_app_README"/>
-        <tab type="user" title="Benchmark Python* App" url="@ref openvino_inference_engine_tools_benchmark_tool_README"/>
+        <tab type="user" title="Benchmark C++ Tool" url="@ref openvino_inference_engine_samples_benchmark_app_README"/>
+        <tab type="user" title="Benchmark Python* Tool" url="@ref openvino_inference_engine_tools_benchmark_tool_README"/>
</tab>

<!-- DL Streamer Examples -->
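Since the navigation layout above is plain XML, its nested `<tab>` groups can be inspected with a few lines of standard-library code. A sketch, assuming a small well-formed fragment shaped like the diff (the real `openvino_docs.xml` has many more groups):

```python
import xml.etree.ElementTree as ET

# A tiny fragment in the same shape as openvino_docs.xml.
layout = """
<navindex>
  <tab type="usergroup" title="GET STARTED" url="index.html">
    <tab type="user" title="macOS" url="@ref openvino_docs_get_started_get_started_macos"/>
    <tab type="user" title="Legal Information" url="@ref openvino_docs_Legal_Information"/>
  </tab>
</navindex>
"""

root = ET.fromstring(layout)

def collect(tab, depth=0, out=None):
    """Flatten the nested <tab> tree into (depth, title) pairs."""
    if out is None:
        out = []
    for child in tab.findall("tab"):
        out.append((depth, child.get("title")))
        collect(child, depth + 1, out)
    return out

# Print the navigation tree, indented by nesting level.
for depth, title in collect(root):
    print("  " * depth + title)
# GET STARTED
#   macOS
#   Legal Information
```

A walk like this is a quick sanity check that an edit to the layout file kept every `usergroup` properly closed.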
4 changes: 2 additions & 2 deletions docs/get_started/get_started_linux.md
@@ -195,7 +195,7 @@ You will perform the following steps:

Each demo and code sample is a separate application, but they share the same behavior and components. The code samples and demo applications are:

-* [Code Samples](../IE_DG/Samples_Overview.html) - Small console applications that show how to utilize specific OpenVINO capabilities within an application and execute specific tasks such as loading a model, running inference, querying specific device capabilities, and more.
+* [Code Samples](../IE_DG/Samples_Overview.md) - Small console applications that show how to utilize specific OpenVINO capabilities within an application and execute specific tasks such as loading a model, running inference, querying specific device capabilities, and more.

* [Demo Applications](@ref omz_demos_README) - Console applications that provide robust application templates to support developers in implementing specific deep learning scenarios. They may also involve more complex processing pipelines that gather analysis from several models that run inference simultaneously. For example, concurrently detecting a person in a video stream and detecting attributes such as age, gender and/or emotions.
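A pipeline like the person-plus-attributes example can be sketched generically: a detector runs first, then each attribute model runs on every detection. The model functions below are stand-ins invented for the sketch, not real demo or OpenVINO calls:

```python
# Generic two-stage demo pipeline: a detector produces crops, then
# attribute models run on each crop. All models here are stubs.

def detect_people(frame):
    # Stand-in detector: pretend every frame yields exactly one person crop.
    return [frame]

def estimate_age(crop):
    # Stand-in attribute model with a fixed answer.
    return 30

def classify_emotion(crop):
    # Stand-in attribute model with a fixed answer.
    return "neutral"

def analyze(frame):
    """Run the detector, then each attribute model on every detection."""
    results = []
    for crop in detect_people(frame):
        results.append({
            "age": estimate_age(crop),
            "emotion": classify_emotion(crop),
        })
    return results

print(analyze("frame-0"))  # one dict per detected person
```

In a real demo the stubs would be replaced by inference requests against loaded models, but the control flow — detector output fanning out to attribute models — stays the same.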

@@ -370,7 +370,7 @@ As an alternative, the Intel® Distribution of OpenVINO™ toolkit includes two

### <a name="run-image-classification"></a>Step 4: Run the Image Classification Code Sample

-> **NOTE**: The Image Classification code sample is automatically compiled when you ran the Image Classification demo script. If you want to compile it manually, see the [Inference Engine Code Samples Overview](../IE_DG/Samples_Overview.html#build_samples_linux) section.
+> **NOTE**: The Image Classification code sample is automatically compiled when you ran the Image Classification demo script. If you want to compile it manually, see the *Build the Sample Applications on Linux* section in the [Inference Engine Code Samples Overview](../IE_DG/Samples_Overview.md).
To run the **Image Classification** code sample with an input image on the IR:

14 changes: 14 additions & 0 deletions docs/install_guides/installing-openvino-images.md
@@ -0,0 +1,14 @@
# Install From Images and Repositories {#openvino_docs_install_guides_installing_openvino_images}

You may install Intel® Distribution of OpenVINO™ toolkit from images and repositories using the **Install OpenVINO™** button above or directly from the [Get the Intel® Distribution of OpenVINO™ Toolkit](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit/download.html) page. Use the documentation below if you need additional support:

* [Docker](installing-openvino-docker-linux.md)
* [Docker with DL Workbench](@ref workbench_docs_Workbench_DG_Install_from_Docker_Hub)
* [APT](installing-openvino-apt.md)
* [YUM](installing-openvino-yum.md)
* [Anaconda Cloud](installing-openvino-conda.md)
* [Yocto](installing-openvino-yocto.md)
* [PyPI](installing-openvino-pip.md)

The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and you can build it for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode">Inference Engine Build Instructions</a>.
