[DOCS] Update references to 2025 port (#28718)
port: #28697

---------

Co-authored-by: sgolebiewski-intel <[email protected]>
Co-authored-by: Alina Kladieva <[email protected]>
3 people authored Jan 29, 2025
1 parent 6bec542 commit 2eb92ad
Showing 95 changed files with 281 additions and 281 deletions.
2 changes: 1 addition & 1 deletion CMakeLists.txt
@@ -48,7 +48,7 @@ endif()

project(OpenVINO
DESCRIPTION "OpenVINO toolkit"
HOMEPAGE_URL "https://docs.openvino.ai/2024/home.html"
HOMEPAGE_URL "https://docs.openvino.ai/2025/index.html"
LANGUAGES C CXX)

find_package(OpenVINODeveloperScripts REQUIRED
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -52,7 +52,7 @@ product better.
Since the market of computing devices is constantly evolving, OpenVINO is always open to extending
its support for new hardware. If you want to run inference on a device that is currently not supported,
you can see how to develop a new plugin for it in the
[Plugin Developer Guide](https://docs.openvino.ai/2024/documentation/openvino-extensibility/openvino-plugin-library.html).
[Plugin Developer Guide](https://docs.openvino.ai/2025/documentation/openvino-extensibility/openvino-plugin-library.html).


### Improve documentation
24 changes: 12 additions & 12 deletions README.md
@@ -6,7 +6,7 @@ Open-source software toolkit for optimizing and deploying deep learning models.
</h3>

<p align="center">
<a href="https://docs.openvino.ai/2024/index.html"><b>Documentation</b></a> • <a href="https://blog.openvino.ai"><b>Blog</b></a> • <a href="https://docs.openvino.ai/2024/about-openvino/key-features.html"><b>Key Features</b></a> • <a href="https://docs.openvino.ai/2024/learn-openvino.html"><b>Tutorials</b></a> • <a href="https://docs.openvino.ai/2024/documentation/openvino-ecosystem.html"><b>Integrations</b></a> • <a href="https://docs.openvino.ai/2024/about-openvino/performance-benchmarks.html"><b>Benchmarks</b></a> • <a href="https://github.com/openvinotoolkit/openvino.genai"><b>Generative AI</b></a>
<a href="https://docs.openvino.ai/2025/index.html"><b>Documentation</b></a> • <a href="https://blog.openvino.ai"><b>Blog</b></a> • <a href="https://docs.openvino.ai/2025/about-openvino/key-features.html"><b>Key Features</b></a> • <a href="https://docs.openvino.ai/2025/learn-openvino.html"><b>Tutorials</b></a> • <a href="https://docs.openvino.ai/2025/documentation/openvino-ecosystem.html"><b>Integrations</b></a> • <a href="https://docs.openvino.ai/2025/about-openvino/performance-benchmarks.html"><b>Benchmarks</b></a> • <a href="https://github.com/openvinotoolkit/openvino.genai"><b>Generative AI</b></a>
</p>

[![PyPI Status](https://badge.fury.io/py/openvino.svg)](https://badge.fury.io/py/openvino)
@@ -24,22 +24,22 @@ Open-source software toolkit for optimizing and deploying deep learning models.
- **Broad Platform Compatibility**: Reduce resource demands and efficiently deploy on a range of platforms from edge to cloud. OpenVINO™ supports inference on CPU (x86, ARM), GPU (OpenCL capable, integrated and discrete) and AI accelerators (Intel NPU).
- **Community and Ecosystem**: Join an active community contributing to the enhancement of deep learning performance across various domains.

Check out the [OpenVINO Cheat Sheet](https://docs.openvino.ai/2024/_static/download/OpenVINO_Quick_Start_Guide.pdf) and [Key Features](https://docs.openvino.ai/2024/about-openvino/key-features.html) for a quick reference.
Check out the [OpenVINO Cheat Sheet](https://docs.openvino.ai/2025/_static/download/OpenVINO_Quick_Start_Guide.pdf) and [Key Features](https://docs.openvino.ai/2025/about-openvino/key-features.html) for a quick reference.


## Installation

[Get your preferred distribution of OpenVINO](https://docs.openvino.ai/2024/get-started/install-openvino.html) or use this command for quick installation:
[Get your preferred distribution of OpenVINO](https://docs.openvino.ai/2025/get-started/install-openvino.html) or use this command for quick installation:

```sh
pip install -U openvino
```
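
To confirm the installation, a quick sanity check (assuming the wheel went into the active Python environment) is to print the version and the devices OpenVINO detects:

```python
import openvino as ov

print(ov.__version__)               # expect a 2025.x version string
print(ov.Core().available_devices)  # e.g. ['CPU', 'GPU']
```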

Check [system requirements](https://docs.openvino.ai/2024/about-openvino/system-requirements.html) and [supported devices](https://docs.openvino.ai/2024/about-openvino/compatibility-and-support/supported-devices.html) for detailed information.
Check [system requirements](https://docs.openvino.ai/2025/about-openvino/system-requirements.html) and [supported devices](https://docs.openvino.ai/2025/about-openvino/compatibility-and-support/supported-devices.html) for detailed information.

## Tutorials and Examples

[OpenVINO Quickstart example](https://docs.openvino.ai/2024/get-started.html) will walk you through the basics of deploying your first model.
[OpenVINO Quickstart example](https://docs.openvino.ai/2025/get-started.html) will walk you through the basics of deploying your first model.

Learn how to optimize and deploy popular models with the [OpenVINO Notebooks](https://github.com/openvinotoolkit/openvino_notebooks)📚:
- [Create an LLM-powered Chatbot using OpenVINO](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llm-chatbot/llm-chatbot-generate-api.ipynb)
@@ -48,7 +48,7 @@ Learn how to optimize and deploy popular models with the [OpenVINO Notebooks](ht
- [Multimodal assistant with LLaVa and OpenVINO](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llava-multimodal-chatbot/llava-multimodal-chatbot-genai.ipynb)
- [Automatic speech recognition using Whisper and OpenVINO](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/whisper-asr-genai/whisper-asr-genai.ipynb)

Discover more examples in the [OpenVINO Samples (Python & C++)](https://docs.openvino.ai/2024/learn-openvino/openvino-samples.html) and [Notebooks (Python)](https://docs.openvino.ai/2024/learn-openvino/interactive-tutorials-python.html).
Discover more examples in the [OpenVINO Samples (Python & C++)](https://docs.openvino.ai/2025/learn-openvino/openvino-samples.html) and [Notebooks (Python)](https://docs.openvino.ai/2025/learn-openvino/interactive-tutorials-python.html).

Here are easy-to-follow code examples demonstrating how to run PyTorch and TensorFlow model inference using OpenVINO:

@@ -96,11 +96,11 @@ data = np.random.rand(1, 224, 224, 3)
output = compiled_model({0: data})
```
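
For reference, a minimal sketch of the PyTorch path looks like the following; the ResNet-50 model, input shape, and CPU device here are illustrative choices, not part of the original example:

```python
import numpy as np
import openvino as ov
import torch
import torchvision

# Convert a PyTorch model to an OpenVINO model in memory, then compile it.
model = torchvision.models.resnet50(weights="DEFAULT").eval()
ov_model = ov.convert_model(model, example_input=torch.rand(1, 3, 224, 224))
compiled_model = ov.compile_model(ov_model, device_name="CPU")

data = np.random.rand(1, 3, 224, 224).astype(np.float32)
output = compiled_model({0: data})
```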

OpenVINO supports the CPU, GPU, and NPU [devices](https://docs.openvino.ai/2024/openvino-workflow/running-inference/inference-devices-and-modes.html) and works with models from PyTorch, TensorFlow, ONNX, TensorFlow Lite, PaddlePaddle, and JAX/Flax [frameworks](https://docs.openvino.ai/2024/openvino-workflow/model-preparation.html). It includes [APIs](https://docs.openvino.ai/2024/api/api_reference.html) in C++, Python, C, NodeJS, and offers the GenAI API for optimized model pipelines and performance.
OpenVINO supports the CPU, GPU, and NPU [devices](https://docs.openvino.ai/2025/openvino-workflow/running-inference/inference-devices-and-modes.html) and works with models from PyTorch, TensorFlow, ONNX, TensorFlow Lite, PaddlePaddle, and JAX/Flax [frameworks](https://docs.openvino.ai/2025/openvino-workflow/model-preparation.html). It includes [APIs](https://docs.openvino.ai/2025/api/api_reference.html) in C++, Python, C, NodeJS, and offers the GenAI API for optimized model pipelines and performance.
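
As an illustrative sketch of that device selection (the model file name is a placeholder), an ONNX model can be read once and compiled for whichever device is available:

```python
import openvino as ov

core = ov.Core()
model = core.read_model("model.onnx")   # placeholder path to any ONNX model
# "AUTO" lets OpenVINO pick the best available device (CPU, GPU, NPU).
compiled = core.compile_model(model, "AUTO")
```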

## Generative AI with OpenVINO

Get started with the OpenVINO GenAI [installation](https://docs.openvino.ai/2024/get-started/install-openvino/install-openvino-genai.html) and refer to the [detailed guide](https://docs.openvino.ai/2024/openvino-workflow-generative/generative-inference.html) to explore the capabilities of Generative AI using OpenVINO.
Get started with the OpenVINO GenAI [installation](https://docs.openvino.ai/2025/get-started/install-openvino/install-openvino-genai.html) and refer to the [detailed guide](https://docs.openvino.ai/2025/openvino-workflow-generative/generative-inference.html) to explore the capabilities of Generative AI using OpenVINO.

Learn how to run LLMs and GenAI with [Samples](https://github.com/openvinotoolkit/openvino.genai/tree/master/samples) in the [OpenVINO™ GenAI repo](https://github.com/openvinotoolkit/openvino.genai). See GenAI in action with Jupyter notebooks: [LLM-powered Chatbot](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llm-chatbot/README.md) and [LLM Instruction-following pipeline](https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llm-question-answering/README.md).
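
A minimal GenAI sketch (the model directory is a placeholder for any LLM already converted to OpenVINO IR, for example with optimum-cli) looks like:

```python
import openvino_genai as ov_genai

# Assumes an LLM already exported to OpenVINO IR in ./model_dir
pipe = ov_genai.LLMPipeline("./model_dir", "CPU")
print(pipe.generate("What is OpenVINO?", max_new_tokens=100))
```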

@@ -122,7 +122,7 @@ Learn how to run LLMs and GenAI with [Samples](https://github.com/openvinotoolki
### Integrations

- [🤗Optimum Intel](https://github.com/huggingface/optimum-intel) - grab and use models leveraging OpenVINO within the Hugging Face API.
- [Torch.compile](https://docs.openvino.ai/2024/openvino-workflow/torch-compile.html) - use OpenVINO for Python-native applications by JIT-compiling code into optimized kernels.
- [Torch.compile](https://docs.openvino.ai/2025/openvino-workflow/torch-compile.html) - use OpenVINO for Python-native applications by JIT-compiling code into optimized kernels.
- [OpenVINO LLMs inference and serving with vLLM​](https://docs.vllm.ai/en/stable/getting_started/openvino-installation.html) - enhance vLLM's fast and easy model serving with the OpenVINO backend.
- [OpenVINO Execution Provider for ONNX Runtime](https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html) - use OpenVINO as a backend with your existing ONNX Runtime code.
- [LlamaIndex](https://docs.llamaindex.ai/en/stable/examples/llm/openvino/) - build context-augmented GenAI applications with the LlamaIndex framework and enhance runtime performance with OpenVINO.
@@ -133,7 +133,7 @@ Check out the [Awesome OpenVINO](https://github.com/openvinotoolkit/awesome-open

## Performance

Explore [OpenVINO Performance Benchmarks](https://docs.openvino.ai/2024/about-openvino/performance-benchmarks.html) to discover the optimal hardware configurations and plan your AI deployment based on verified data.
Explore [OpenVINO Performance Benchmarks](https://docs.openvino.ai/2025/about-openvino/performance-benchmarks.html) to discover the optimal hardware configurations and plan your AI deployment based on verified data.

## Contribution and Support

@@ -149,7 +149,7 @@ You can ask questions and get support on:

## Resources

* [Release Notes](https://docs.openvino.ai/2024/about-openvino/release-notes-openvino.html)
* [Release Notes](https://docs.openvino.ai/2025/about-openvino/release-notes-openvino.html)
* [OpenVINO Blog](https://blog.openvino.ai/)
* [OpenVINO™ toolkit on Medium](https://medium.com/@openvino)

@@ -164,7 +164,7 @@ You can opt-out at any time by running the command:
opt_in_out --opt_out
```

More Information is available at [OpenVINO™ Telemetry](https://docs.openvino.ai/2024/about-openvino/additional-resources/telemetry.html).
More Information is available at [OpenVINO™ Telemetry](https://docs.openvino.ai/2025/about-openvino/additional-resources/telemetry.html).

## License

4 changes: 2 additions & 2 deletions docs/RELEASE.MD
@@ -13,7 +13,7 @@ This phase takes 2-4 weeks and involves scoping the backlog, prioritizing it, an

### Execution (development of new features)
- [OpenVINO Contributing Guide](https://github.com/openvinotoolkit/openvino/blob/master/CONTRIBUTING.md)
- [Code Contribution Guide](https://docs.openvino.ai/2024/about-openvino/contributing/code-contribution-guide.html)
- [Code Contribution Guide](https://docs.openvino.ai/2025/about-openvino/contributing/code-contribution-guide.html)
- [OpenVINO First Good Issue](https://github.com/openvinotoolkit/openvino/issues/17502)

### Stabilization (Feature Freeze, Code Freeze milestones)
@@ -25,5 +25,5 @@ This phase takes 2-4 weeks and involves scoping the backlog, prioritizing it, an
- After Code Freeze, the testing team can perform final regression testing to ensure that recent changes have not introduced new bugs and that the software meets the required quality standards.

### Distribution
- OpenVINO has different types of build distribution: Regular releases, Long-Term Support, Pre-release releases, Nightly builds. Read more here: [OpenVINO Release Policy](https://docs.openvino.ai/2024/about-openvino/release-notes-openvino/release-policy.html)
- OpenVINO has different types of build distribution: Regular releases, Long-Term Support, Pre-release releases, Nightly builds. Read more here: [OpenVINO Release Policy](https://docs.openvino.ai/2025/about-openvino/release-notes-openvino/release-policy.html)
- Different distribution channels are supported. Explore different options here: [OpenVINO Download](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html)
2 changes: 1 addition & 1 deletion docs/articles_en/about-openvino/contributing.rst
@@ -89,7 +89,7 @@ PR. This way, it will be easier for other developers to track changes.

If you want to run inference on a device that is currently not supported, you
can see how to develop a new plugin for it in the
`Plugin Developer Guide <https://docs.openvino.ai/nightly/documentation/openvino-extensibility/openvino-plugin-library.html>`__.
`Plugin Developer Guide <https://docs.openvino.ai/2025/documentation/openvino-extensibility/openvino-plugin-library.html>`__.


:fas:`file-alt` Improve documentation
2 changes: 1 addition & 1 deletion docs/articles_en/about-openvino/key-features.rst
@@ -17,7 +17,7 @@ Easy Integration
| With the OpenVINO GenAI, you can run generative models with just a few lines of code.
Check out the GenAI guide for instructions on how to do it.
| `Python / C++ / C / NodeJS APIs <https://docs.openvino.ai/2024/api/api_reference.html>`__
| `Python / C++ / C / NodeJS APIs <https://docs.openvino.ai/2025/api/api_reference.html>`__
| OpenVINO offers the C++ API as a complete set of available methods. For less resource-critical
solutions, the Python API provides almost full coverage, while C and NodeJS ones are limited
to the methods most basic for their typical environments. The NodeJS API, is still in its
@@ -56,7 +56,7 @@ The tables below list the key performance indicators for inference on built-in G

.. grid-item::

.. button-link:: https://docs.openvino.ai/2024/_static/download/benchmarking_genai_platform_list.pdf
.. button-link:: https://docs.openvino.ai/2025/_static/download/benchmarking_genai_platform_list.pdf
:color: primary
:outline:
:expand:
12 changes: 6 additions & 6 deletions docs/articles_en/documentation/openvino-ecosystem.rst
@@ -33,7 +33,7 @@ models. Check the LLM-powered Chatbot Jupyter notebook to see how GenAI works.

| **Neural Network Compression Framework**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/nncf>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2024/openvino-workflow/model-optimization.html>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2025/openvino-workflow/model-optimization.html>`
A suite of advanced algorithms for Neural Network inference optimization with minimal accuracy
drop. NNCF applies quantization, filter pruning, binarization, and sparsity algorithms to PyTorch
@@ -43,7 +43,7 @@ and TensorFlow models during training.

| **OpenVINO Model Server**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/model_server>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2024/ovms_what_is_openvino_model_server.html>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2025/ovms_what_is_openvino_model_server.html>`
A high-performance system that can be used to access the host models via request to the model
server.
@@ -52,7 +52,7 @@ server.

| **OpenVINO Notebooks**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/openvino_notebooks>`
:bdg-link-success:`Jupyter Notebook Collection <https://docs.openvino.ai/2024/learn-openvino/interactive-tutorials-python.html>`
:bdg-link-success:`Jupyter Notebook Collection <https://docs.openvino.ai/2025/learn-openvino/interactive-tutorials-python.html>`
A collection of Jupyter notebooks for learning and experimenting with the OpenVINO™ Toolkit.
|hr|
@@ -68,7 +68,7 @@ without the need to convert.

| **OpenVINO Training Extensions**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/training_extensions>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2024/documentation/openvino-ecosystem/openvino-training-extensions.html>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/openvino-training-extensions.html>`
A convenient environment to train Deep Learning models and convert them using the OpenVINO™
toolkit for optimized inference.
@@ -77,7 +77,7 @@ toolkit for optimized inference.

| **OpenVINO Security Addon**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/security_addon>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2024/documentation/openvino-ecosystem/openvino-security-add-on.html>`
:bdg-link-success:`User Guide <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/openvino-security-add-on.html>`
A solution for Model Developers and Independent Software Vendors to use secure packaging and
secure model execution.
@@ -86,7 +86,7 @@ secure model execution.

| **Datumaro**
| :bdg-link-dark:`Github <https://github.com/openvinotoolkit/datumaro>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2024/documentation/openvino-ecosystem/datumaro.html>`
:bdg-link-success:`Overview Page <https://docs.openvino.ai/2025/documentation/openvino-ecosystem/datumaro.html>`
A framework and a CLI tool for building, transforming, and analyzing datasets.
|hr|
@@ -94,6 +94,6 @@ Detailed Guides
API References
##############

* `OpenVINO Plugin API <https://docs.openvino.ai/2024/api/c_cpp_api/group__ov__dev__api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2024/api/c_cpp_api/group__ie__transformation__api.html>`__
* `OpenVINO Plugin API <https://docs.openvino.ai/2025/api/c_cpp_api/group__ov__dev__api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2025/api/c_cpp_api/group__ie__transformation__api.html>`__

@@ -15,6 +15,6 @@ Plugin API Reference

The guides below provides extra API references needed for OpenVINO plugin development:

* `OpenVINO Plugin API <https://docs.openvino.ai/2024/api/c_cpp_api/group__ov__dev__api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2024/api/c_cpp_api/group__ie__transformation__api.html>`__
* `OpenVINO Plugin API <https://docs.openvino.ai/2025/api/c_cpp_api/group__ov__dev__api.html>`__
* `OpenVINO Transformation API <https://docs.openvino.ai/2025/api/c_cpp_api/group__ie__transformation__api.html>`__

2 changes: 1 addition & 1 deletion docs/articles_en/get-started.rst
@@ -30,7 +30,7 @@ GET STARTED


For a quick reference, check out
`the Quick Start Guide [pdf] <https://docs.openvino.ai/2024/_static/download/OpenVINO_Quick_Start_Guide.pdf>`__
`the Quick Start Guide [pdf] <https://docs.openvino.ai/2025/_static/download/OpenVINO_Quick_Start_Guide.pdf>`__


.. _quick-start-example:
@@ -19,7 +19,7 @@ is used instead. Mixing different ABIs is not possible as doing so will result i

To try OpenVINO GenAI with different dependencies versions (which are **not** prebuilt packages
as archives or python wheels), build OpenVINO GenAI library from
`Source <https://github.com/openvinotoolkit/openvino.genai/blob/releases/2024/3/src/docs/BUILD.md#build-openvino-openvino-tokenizers-and-openvino-genai-from-source>`__.
`Source <https://github.com/openvinotoolkit/openvino.genai/blob/releases/2025/0/src/docs/BUILD.md#build-openvino-openvino-tokenizers-and-openvino-genai-from-source>`__.

Additional Resources
#######################
@@ -14,7 +14,7 @@ and
`LLM Instruction-following pipeline <https://github.com/openvinotoolkit/openvino_notebooks/blob/latest/notebooks/llm-question-answering/README.md>`__.

OpenVINO GenAI is available for installation via PyPI and Archive distributions.
A `detailed guide <https://github.com/openvinotoolkit/openvino.genai/blob/releases/2024/3/src/docs/BUILD.md>`__
A `detailed guide <https://github.com/openvinotoolkit/openvino.genai/blob/releases/2025/0/src/docs/BUILD.md>`__
on how to build OpenVINO GenAI is available in the OpenVINO GenAI repository.

PyPI Installation
@@ -32,7 +32,7 @@ Installing OpenVINO Node.js
.. note::

The *openvino-node* npm package runs in Node.js environment only and provides
a subset of `OpenVINO Runtime C++ API <https://docs.openvino.ai/2024/api/c_cpp_api/group__ov__cpp__api.html>`__.
a subset of `OpenVINO Runtime C++ API <https://docs.openvino.ai/2025/api/c_cpp_api/group__ov__cpp__api.html>`__.

What's Next?
####################