
Feature/azaytsev/change layout #3295

2 changes: 1 addition & 1 deletion docs/IE_DG/Introduction.md
@@ -116,7 +116,7 @@ For Intel® Distribution of OpenVINO™ toolkit, the Inference Engine package co
[sample console applications](Samples_Overview.md) demonstrating how you can use
the Inference Engine in your applications.

The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and can be built for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md">Inference Engine Build Instructions</a>.
The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and can be built for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode">Inference Engine Build Instructions</a>.
## See Also
- [Inference Engine Samples](Samples_Overview.md)
- [Intel&reg; Deep Learning Deployment Toolkit Web Page](https://software.intel.com/en-us/computer-vision-sdk)
2 changes: 1 addition & 1 deletion docs/IE_DG/Samples_Overview.md
@@ -53,7 +53,7 @@ The officially supported Linux* build environment is the following:
* GCC* 7.5.0 (for Ubuntu* 18.04) or GCC* 4.8.5 (for CentOS* 7.6)
* CMake* version 3.10 or higher

> **NOTE**: For building samples from the open-source version of OpenVINO™ toolkit, see the [build instructions on GitHub](https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md).
> **NOTE**: For building samples from the open-source version of OpenVINO™ toolkit, see the [build instructions on GitHub](https://github.com/openvinotoolkit/openvino/wiki/BuildingCode).

To build the C or C++ sample applications for Linux, go to the `<INSTALL_DIR>/inference_engine/samples/c` or `<INSTALL_DIR>/inference_engine/samples/cpp` directory, respectively, and run the `build_samples.sh` script:
```sh
build_samples.sh
```
4 changes: 2 additions & 2 deletions docs/IE_DG/inference_engine_intro.md
@@ -7,11 +7,11 @@ Inference Engine is a set of C++ libraries providing a common API to deliver inf

For Intel® Distribution of OpenVINO™ toolkit, Inference Engine binaries are delivered within release packages.

The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and can be built for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md">Inference Engine Build Instructions</a>.
The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and can be built for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode">Inference Engine Build Instructions</a>.

To learn about how to use the Inference Engine API for your application, see the [Integrating Inference Engine in Your Application](Integrate_with_customer_application_new_API.md) documentation.

For complete API Reference, see the [API Reference](usergroup29.html) section.
For complete API Reference, see the [Inference Engine API References](./api_references.html) section.

Inference Engine uses a plugin architecture. An Inference Engine plugin is a software component that contains a complete implementation for inference on a particular Intel&reg; hardware device: CPU, GPU, VPU, etc. Each plugin implements the unified API and provides additional hardware-specific APIs.
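
A minimal sketch of how this plugin architecture looks from application code, assuming the Inference Engine C++ API described in this guide and illustrative model file names — the device string alone selects which plugin handles inference:

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Read an IR model (file names here are illustrative).
    InferenceEngine::CNNNetwork network =
        core.ReadNetwork("model.xml", "model.bin");

    // "CPU" selects the CPU plugin; "GPU" or "MYRIAD" would route the same
    // call through the corresponding hardware plugin via the unified API.
    InferenceEngine::ExecutableNetwork executable =
        core.LoadNetwork(network, "CPU");

    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}
```

Because every plugin implements the same unified API, the rest of the application stays unchanged when retargeting to a different device.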

2 changes: 1 addition & 1 deletion docs/IE_DG/protecting_model_guide.md
@@ -65,7 +65,7 @@ CNNNetwork network = core.ReadNetwork(strModel, make_shared_blob<uint8_t>({Preci
- OpenVINO™ toolkit online documentation: [https://docs.openvinotoolkit.org](https://docs.openvinotoolkit.org)
- Model Optimizer Developer Guide: [Model Optimizer Developer Guide](../MO_DG/Deep_Learning_Model_Optimizer_DevGuide.md)
- Inference Engine Developer Guide: [Inference Engine Developer Guide](Deep_Learning_Inference_Engine_DevGuide.md)
- For more information on Sample Applications, see the [Inference Engine Samples Overview](Samples_Overview.html)
- For more information on Sample Applications, see the [Inference Engine Samples Overview](Samples_Overview.md)
- For information on a set of pre-trained models, see the [Overview of OpenVINO™ Toolkit Pre-Trained Models](@ref omz_models_intel_index)
- For information on Inference Engine Tutorials, see the [Inference Tutorials](https://github.com/intel-iot-devkit/inference-tutorials-generic)
- For IoT Libraries and Code Samples see the [Intel® IoT Developer Kit](https://github.com/intel-iot-devkit).
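
The `ReadNetwork` call shown in the hunk context above loads a model from memory rather than from disk, which is the core of the model-protection flow. A hedged sketch of the surrounding code, with file names standing in for a real decryption step (both paths and the plain-file reads are illustrative placeholders):

```cpp
#include <inference_engine.hpp>
#include <fstream>
#include <sstream>
#include <vector>

using namespace InferenceEngine;

int main() {
    // In a real deployment this string would come from decrypting the
    // protected topology; reading a plain file stands in for that step.
    std::ifstream xmlFile("model.xml");
    std::stringstream xmlStream;
    xmlStream << xmlFile.rdbuf();
    std::string strModel = xmlStream.str();

    // Likewise, the weights would be decrypted into a memory buffer.
    std::ifstream binFile("model.bin", std::ios::binary);
    std::vector<uint8_t> weights((std::istreambuf_iterator<char>(binFile)),
                                 std::istreambuf_iterator<char>());

    // Wrap the weights in a U8 blob so no decrypted file touches disk.
    TBlob<uint8_t>::Ptr weightsBlob =
        make_shared_blob<uint8_t>({Precision::U8, {weights.size()}, Layout::C});
    weightsBlob->allocate();
    std::copy(weights.begin(), weights.end(),
              weightsBlob->buffer().as<uint8_t*>());

    Core core;
    CNNNetwork network = core.ReadNetwork(strModel, weightsBlob);
    return 0;
}
```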
9 changes: 6 additions & 3 deletions docs/doxygen/openvino_docs.xml
@@ -6,7 +6,7 @@
<!-- GET STARTED category -->
<tab type="usergroup" title="GET STARTED" url="index.html">
<!-- Install Directly -->
<tab type="usergroup" title="Install Directly" url=""><!--automatically generated-->
<tab type="usergroup" title="Installation Guides" url=""><!--automatically generated-->
<tab type="usergroup" title="Linux" url="@ref openvino_docs_install_guides_installing_openvino_linux">
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Linux* OS" url="@ref openvino_docs_install_guides_installing_openvino_linux"/>
<tab type="user" title="[DEPRECATED] Install Intel® Distribution of OpenVINO™ toolkit for Linux with FPGA Support" url="@ref openvino_docs_install_guides_installing_openvino_linux_fpga"/>
@@ -17,19 +17,21 @@
</tab>
<tab type="user" title="macOS" url="@ref openvino_docs_install_guides_installing_openvino_macos"/>
<tab type="user" title="Raspbian OS" url="@ref openvino_docs_install_guides_installing_openvino_raspbian"/>
<tab type="user" title="DL Workbench Installation Guide" url="./workbench_docs_Workbench_DG_Install_Workbench.html"/><!-- Link to the original Workbench topic -->
</tab>
<!-- Install From Images and Repositories -->
<tab type="usergroup" title="Install From Images and Repositories" url=""><!--automatically generated-->
<tab type="usergroup" title="Install From Images and Repositories" url="@ref openvino_docs_install_guides_installing_openvino_images">
<tab type="usergroup" title="Docker" url="@ref openvino_docs_install_guides_installing_openvino_docker_linux">
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Linux* from a Docker* Image" url="@ref openvino_docs_install_guides_installing_openvino_docker_linux"/>
<tab type="user" title="Install Intel® Distribution of OpenVINO™ toolkit for Windows* from a Docker* Image" url="@ref openvino_docs_install_guides_installing_openvino_docker_windows"/>
</tab>
<tab type="user" title="Docker with DL Workbench" url="./workbench_docs_Workbench_DG_Install_from_Docker_Hub.html"/><!-- Link to the original Workbench topic -->
<tab type="user" title="APT" url="@ref openvino_docs_install_guides_installing_openvino_apt"/>
<tab type="user" title="YUM" url="@ref openvino_docs_install_guides_installing_openvino_yum"/>
<tab type="user" title="Anaconda Cloud" url="@ref openvino_docs_install_guides_installing_openvino_conda"/>
<tab type="user" title="Yocto" url="@ref openvino_docs_install_guides_installing_openvino_yocto"/>
<tab type="user" title="PyPI" url="@ref openvino_docs_install_guides_installing_openvino_pip"/>
<tab type="user" title="Build from Source" url="https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md"/>
<tab type="user" title="Build from Source" url="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode"/>
</tab>
<!-- Get Started Guides-->
<tab type="usergroup" title="Get Started Guides" url=""><!--automatically generated-->
@@ -39,6 +41,7 @@
<tab type="user" title="macOS" url="@ref openvino_docs_get_started_get_started_macos"/>
<tab type="user" title="Get Started with OpenVINO via DL Workbench" url="@ref openvino_docs_get_started_get_started_dl_workbench"/>
<tab type="user" title="Legal Information" url="@ref openvino_docs_Legal_Information"/>
<tab type="user" title="Introduction to DL Workbench" url="./openvino_docs_get_started_get_started_dl_workbench.html"/><!-- Link to the original Workbench topic -->
</tab>
<!-- Configuration for Hardware -->
<tab type="usergroup" title="Configuration for Hardware" url=""><!--automatically generated-->
4 changes: 2 additions & 2 deletions docs/get_started/get_started_linux.md
@@ -195,7 +195,7 @@ You will perform the following steps:

Each demo and code sample is a separate application, but they share common behavior and components. The code samples and demo applications are:

* [Code Samples](../IE_DG/Samples_Overview.html) - Small console applications that show how to utilize specific OpenVINO capabilities within an application and execute specific tasks such as loading a model, running inference, querying specific device capabilities, and more.
* [Code Samples](../IE_DG/Samples_Overview.md) - Small console applications that show how to utilize specific OpenVINO capabilities within an application and execute specific tasks such as loading a model, running inference, querying specific device capabilities, and more.

* [Demo Applications](@ref omz_demos_README) - Console applications that provide robust application templates to support developers in implementing specific deep learning scenarios. They may also involve more complex processing pipelines that gather analysis from several models that run inference simultaneously, for example, concurrently detecting a person in a video stream and detecting attributes such as age, gender, and/or emotions.

@@ -370,7 +370,7 @@ As an alternative, the Intel® Distribution of OpenVINO™ toolkit includes two

### <a name="run-image-classification"></a>Step 4: Run the Image Classification Code Sample

> **NOTE**: The Image Classification code sample is automatically compiled when you ran the Image Classification demo script. If you want to compile it manually, see the [Inference Engine Code Samples Overview](../IE_DG/Samples_Overview.html#build_samples_linux) section.
> **NOTE**: The Image Classification code sample is automatically compiled when you run the Image Classification demo script. If you want to compile it manually, see the *Build the Sample Applications on Linux* section in the [Inference Engine Code Samples Overview](../IE_DG/Samples_Overview.md).

To run the **Image Classification** code sample with an input image on the IR:

2 changes: 1 addition & 1 deletion docs/index.md
@@ -99,4 +99,4 @@ Intel® Distribution of OpenVINO™ toolkit includes the following components:
- [OpenCV](https://docs.opencv.org/master/) - OpenCV* community version compiled for Intel® hardware
- [Intel® Media SDK](https://software.intel.com/en-us/media-sdk) (in Intel® Distribution of OpenVINO™ toolkit for Linux only)

OpenVINO™ Toolkit opensource version is available on [GitHub](https://github.com/openvinotoolkit/openvino). For building the Inference Engine from the source code, see the <a href="https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md">build instructions</a>.
The OpenVINO™ toolkit open source version is available on [GitHub](https://github.com/openvinotoolkit/openvino). For building the Inference Engine from the source code, see the <a href="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode">build instructions</a>.
14 changes: 14 additions & 0 deletions docs/install_guides/installing-openvino-images.md
@@ -0,0 +1,14 @@
# Install From Images and Repositories {#openvino_docs_install_guides_installing_openvino_images}

You may install Intel® Distribution of OpenVINO™ toolkit from images and repositories using the **Install OpenVINO™** button above or directly from the [Get the Intel® Distribution of OpenVINO™ Toolkit](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit/download.html) page. Use the documentation below if you need additional support:

* [Docker](installing-openvino-docker-linux.md)
* [Docker with DL Workbench](@ref workbench_docs_Workbench_DG_Install_from_Docker_Hub)
* [APT](installing-openvino-apt.md)
* [YUM](installing-openvino-yum.md)
* [Anaconda Cloud](installing-openvino-conda.md)
* [Yocto](installing-openvino-yocto.md)
* [PyPI](installing-openvino-pip.md)

The open source version is available in the [OpenVINO™ toolkit GitHub repository](https://github.com/openvinotoolkit/openvino) and you can build it for supported platforms using the <a href="https://github.com/openvinotoolkit/openvino/wiki/BuildingCode">Inference Engine Build Instructions</a>.
