# ArmNN
A Bhat edited this page Aug 22, 2020
- An inference engine for Arm CPUs, GPUs and NPUs
- Bridges the gap between existing neural-network frameworks and the underlying Arm IP
- Open source and free
- Runs on top of the Arm Compute Library
| System | Stream Latency (ms) | SoC | Cores | Software |
|---|---|---|---|---|
| Firefly-RK3399 (firefly) | 391.02 | Rockchip RK3399 | 2×A72 + 4×A53 | ArmNN v19.08 (Neon) |
| Firefly-RK3399 (firefly) | 695.11 | Rockchip RK3399 | 2×A72 + 4×A53 | TFLite v1.15.0-rc2 |
| Raspberry Pi 4 (rpi4) | 448.31 | Broadcom BCM2711B0 | 4×A72 @ 1.5 GHz | ArmNN v19.08 (Neon) |
| Raspberry Pi 4 (rpi4) | 1,916.65 | Broadcom BCM2711B0 | 4×A72 @ 1.5 GHz | TFLite v1.15.0-rc2 |
| Linaro HiKey960 (hikey960) | 494.90 | HiSilicon Kirin960 | 4×A73 + 4×A53 | ArmNN v19.08 (Neon) |
| Linaro HiKey960 (hikey960) | 518.07 | HiSilicon Kirin960 | 4×A73 + 4×A53 | TFLite v1.15.0-rc2 |
| Huawei Mate 10 Pro (mate10pro) | 494.92 | HiSilicon Kirin970 | 4×A73 + 4×A53 | ArmNN v19.08 (Neon) |
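The per-device speedup of ArmNN (Neon) over TFLite can be read straight off the table as the ratio of the two stream latencies. A minimal sketch (latency values copied from the table above; only devices with both measurements are included):

```python
# Stream latency in ms, copied from the benchmark table above.
latency = {
    "firefly":  {"armnn": 391.02, "tflite": 695.11},
    "rpi4":     {"armnn": 448.31, "tflite": 1916.65},
    "hikey960": {"armnn": 494.90, "tflite": 518.07},
}

# Speedup of ArmNN over TFLite = TFLite latency / ArmNN latency.
speedup = {name: t["tflite"] / t["armnn"] for name, t in latency.items()}

for name, s in speedup.items():
    print(f"{name}: {s:.2f}x")
# firefly: 1.78x, rpi4: 4.28x, hikey960: 1.05x
```

The gain varies widely by platform: roughly 4× on the Raspberry Pi 4, but near parity on the HiKey960.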
- Frameworks: TensorFlow | PyTorch | MXNet | Keras | ONNX
- Inference engines: ArmNN | OpenVINO | TensorRT | TensorFlow Lite | Core ML | Tengine