# Model Zoo Scripts

Training and inference scripts with Intel optimizations: the TensorFlow models use the Intel® oneAPI Deep Neural Network Library (Intel® oneDNN), and the PyTorch models use the Intel® Extension for PyTorch.

## Prerequisites

The model documentation linked in the tables below has information on the prerequisites for running each model. The model scripts run on Linux. Select models can also run on bare metal on Windows; for more information and a list of models supported on Windows, see the documentation here.

The oneContainer Portal column has links to workload containers and model packages for each model precision. The containers are built on images with Intel optimizations for TensorFlow or PyTorch and include all of the dependencies, scripts, and pretrained models needed to run the workload. The model packages contain the scripts and pretrained model files used for running on bare metal.

For information on running more advanced use cases with the workload containers, see the advanced options documentation.
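As a rough illustration of a bare-metal setup check, the sketch below is not one of the Model Zoo's own launch scripts; it assumes an Intel-optimized TensorFlow build is installed and uses a stand-in Keras ResNet50 instead of the pretrained files that ship with the model packages. The `TF_ENABLE_ONEDNN_OPTS` environment variable assumes a TensorFlow release that gates oneDNN optimizations behind it.

```python
# Illustrative bare-metal smoke test -- not a Model Zoo launch script.
# Assumes an Intel-optimized TensorFlow build is installed.
import os

# Recent TensorFlow releases gate oneDNN graph optimizations behind this
# variable; it must be set before TensorFlow is imported.
os.environ.setdefault("TF_ENABLE_ONEDNN_OPTS", "1")

import numpy as np
import tensorflow as tf

# Stand-in model; the Model Zoo packages ship their own pretrained graphs.
model = tf.keras.applications.ResNet50(weights=None)

# One FP32 inference step on random data confirms the environment works.
dummy = np.random.rand(1, 224, 224, 3).astype("float32")
print(model.predict(dummy).shape)  # (1, 1000)
```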

## TensorFlow Use Cases

| Use Case | Model | Mode | oneContainer Portal | Model Documentation |
| -------- | ----- | ---- | ------------------- | ------------------- |
| Image Recognition | DenseNet169 | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Image Recognition | Inception V3 | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Image Recognition | Inception V4 | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Image Recognition | MobileNet V1* | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
| Image Recognition | ResNet 101 | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Image Recognition | ResNet 50 | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Image Recognition | ResNet 50v1.5 | Inference | Model Containers: Int8 FP32 BFloat16**<br>Model Packages: Int8 FP32 BFloat16** | Int8 FP32 BFloat16** |
| Image Recognition | ResNet 50v1.5 | Training | Model Containers: FP32 BFloat16**<br>Model Packages: FP32 BFloat16** | FP32 BFloat16** |
| Image Segmentation | 3D U-Net | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Image Segmentation | 3D U-Net MLPerf | Inference | | FP32 BFloat16** Int8 |
| Image Segmentation | MaskRCNN | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Image Segmentation | UNet | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Language Modeling | BERT | Inference | Model Containers: FP32 BFloat16**<br>Model Packages: FP32 BFloat16** | FP32 BFloat16** |
| Language Modeling | BERT | Training | Model Containers: FP32 BFloat16**<br>Model Packages: FP32 BFloat16** | FP32 BFloat16** |
| Language Translation | BERT | Inference | | FP32 |
| Language Translation | GNMT* | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Language Translation | Transformer_LT_mlperf | Training | Model Containers: FP32 BFloat16**<br>Model Packages: FP32 BFloat16** | FP32 BFloat16** |
| Language Translation | Transformer_LT_mlperf | Inference | | FP32 BFloat16** Int8 |
| Language Translation | Transformer_LT_Official | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Object Detection | Faster R-CNN | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Object Detection | R-FCN | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Object Detection | SSD-MobileNet* | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
| Object Detection | SSD-ResNet34* | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
| Object Detection | SSD-ResNet34 | Training | Model Containers: FP32 BFloat16**<br>Model Packages: FP32 BFloat16** | FP32 BFloat16** |
| Recommendation | DIEN | Inference | | FP32 BFloat16** |
| Recommendation | DIEN | Training | | FP32 |
| Recommendation | NCF | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Recommendation | Wide & Deep | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Recommendation | Wide & Deep Large Dataset | Inference | Model Containers: Int8 FP32<br>Model Packages: Int8 FP32 | Int8 FP32 |
| Recommendation | Wide & Deep Large Dataset | Training | Model Containers: FP32<br>Model Packages: FP32 | FP32 |
| Reinforcement | MiniGo | Training | | FP32 |
| Text-to-Speech | WaveNet | Inference | Model Containers: FP32<br>Model Packages: FP32 | FP32 |

## TensorFlow Serving Use Cases

| Use Case | Model | Mode | Model Documentation |
| -------- | ----- | ---- | ------------------- |
| Image Recognition | Inception V3 | Inference | FP32 |
| Image Recognition | ResNet 50v1.5 | Inference | FP32 |
| Language Translation | Transformer_LT_Official | Inference | FP32 |
| Object Detection | SSD-MobileNet | Inference | FP32 |
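Once one of the serving use cases above is running under TensorFlow Serving, its REST API can be queried from any client. The sketch below is a minimal, hypothetical example: the host, the port (8501 is TensorFlow Serving's default REST port), and the model name `resnet50v1_5` are assumptions and should be replaced with the values used when the serving instance was started.

```python
# Minimal, hypothetical TensorFlow Serving REST client.
# Host, port, and model name are assumptions; adjust to your deployment.
import json

import numpy as np
import requests

SERVER_URL = "http://localhost:8501/v1/models/resnet50v1_5:predict"

# TensorFlow Serving's REST predict API expects a JSON body with "instances".
payload = {"instances": np.random.rand(1, 224, 224, 3).tolist()}

response = requests.post(SERVER_URL, data=json.dumps(payload))
response.raise_for_status()
predictions = response.json()["predictions"]
print(f"Received {len(predictions)} prediction(s)")
```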

## PyTorch Use Cases

| Use Case | Model | Mode | oneContainer Portal | Model Documentation |
| -------- | ----- | ---- | ------------------- | ------------------- |
| Image Recognition | GoogLeNet | Inference | | FP32 BFloat16** |
| Image Recognition | Inception v3 | Inference | | FP32 BFloat16** |
| Image Recognition | MNASNet 0.5 | Inference | | FP32 BFloat16** |
| Image Recognition | MNASNet 1.0 | Inference | | FP32 BFloat16** |
| Image Recognition | ResNet 50 | Inference | | FP32 Int8 BFloat16** |
| Image Recognition | ResNet 50 | Training | | FP32 BFloat16** |
| Image Recognition | ResNet 101 | Inference | | FP32 BFloat16** |
| Image Recognition | ResNet 152 | Inference | | FP32 BFloat16** |
| Image Recognition | ResNext 32x4d | Inference | | FP32 BFloat16** |
| Image Recognition | ResNext 32x16d | Inference | | FP32 Int8 BFloat16** |
| Image Recognition | VGG-11 | Inference | | FP32 BFloat16** |
| Image Recognition | VGG-11 with batch normalization | Inference | | FP32 BFloat16** |
| Image Recognition | Wide ResNet-50-2 | Inference | | FP32 BFloat16** |
| Image Recognition | Wide ResNet-101-2 | Inference | | FP32 BFloat16** |
| Language Modeling | BERT base | Inference | | FP32 BFloat16** |
| Language Modeling | BERT large | Inference | | FP32 Int8 BFloat16** |
| Language Modeling | BERT large | Training | | FP32 BFloat16** |
| Language Modeling | DistilBERT base | Inference | | FP32 BFloat16** |
| Language Modeling | RNN-T | Inference | | FP32 BFloat16** |
| Language Modeling | RNN-T | Training | | FP32 BFloat16** |
| Language Modeling | RoBERTa base | Inference | | FP32 BFloat16** |
| Object Detection | Faster R-CNN ResNet50 FPN | Inference | | FP32 |
| Object Detection | Mask R-CNN | Inference | | FP32 BFloat16** |
| Object Detection | Mask R-CNN | Training | | FP32 BFloat16** |
| Object Detection | Mask R-CNN ResNet50 FPN | Inference | | FP32 |
| Object Detection | RetinaNet ResNet-50 FPN | Inference | | FP32 |
| Object Detection | SSD-ResNet34 | Inference | | FP32 Int8 BFloat16** |
| Object Detection | SSD-ResNet34 | Training | | FP32 BFloat16** |
| Recommendation | DLRM | Inference | | FP32 Int8 BFloat16** |
| Recommendation | DLRM | Training | | FP32 BFloat16** |

*Means the model is part of the MLPerf suite of models and will be supported long-term.

**Means BFloat16 data type support for the model is experimental.
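As a rough illustration of how the Intel® Extension for PyTorch mentioned above is typically applied, the sketch below optimizes a torchvision ResNet-50 (a stand-in for the Model Zoo's pretrained weights) for BFloat16 inference. It is a minimal example under those assumptions, not one of the Model Zoo scripts; per the note above, BFloat16 support is experimental.

```python
# Minimal sketch: applying Intel Extension for PyTorch optimizations for
# BFloat16 inference. Uses a torchvision model as a stand-in for the
# Model Zoo's pretrained weights -- not one of the Model Zoo scripts.
import torch
import torchvision.models as models
import intel_extension_for_pytorch as ipex

model = models.resnet50().eval()

# ipex.optimize applies CPU operator/layout optimizations; dtype=torch.bfloat16
# prepares the model for the (experimental) BFloat16 path.
model = ipex.optimize(model, dtype=torch.bfloat16)

data = torch.rand(1, 3, 224, 224)
with torch.no_grad(), torch.cpu.amp.autocast():
    output = model(data)
print(output.shape)  # torch.Size([1, 1000])
```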