@csarron
Last active March 15, 2019 08:39

# MobileNet Inference Time

## Movidius NCS Accelerator

Using NCSDK 1.12.00, latest as of April 2018

| Version | Average Inference Time |
| --- | --- |
| mobilenet_v1_1.0_224 | 39.49 ms |
| mobilenet_v2_1.0_224 | 38.91 ms |

## OnePlus 3 (Tested on April 25, 2018)

Snapdragon 820 CPU, Adreno 530 GPU, Hexagon 680

- mobilenet_v1_1.0_224:

  | Runtime | Average Inference Time |
  | --- | --- |
  | Tencent ncnn | 140.41 ms |
  | SNPE CPU | 419.15 ms |
  | SNPE GPU | 17.94 ms |
  | SNPE DSP | 10.58 ms |

- mobilenet_v2_1.0_224:

  | Runtime | Average Inference Time |
  | --- | --- |
  | Tencent ncnn | 94.96 ms |
  | SNPE CPU | 455.20 ms |
  | SNPE GPU | 12.68 ms |
  | SNPE DSP | 11.29 ms |
- Tencent ncnn is highly optimized for CNN inference on CPU (tested at commit 3a27cb7, April 2018)
- SNPE-1.14.1, latest as of April 2018

## Pixel 2 (Tested on Mar 21, 2018)

Snapdragon 835 CPU, Adreno 540 GPU, Hexagon 682

- mobilenet_v1_1.0_224:

  | Runtime | Average Inference Time |
  | --- | --- |
  | Tencent ncnn | 66.06 ms |
  | SNPE CPU | 419.15 ms |
  | SNPE GPU | 14.60 ms |

- ncnn tested at commit 6ea09eb
- SNPE-1.13.0
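
All numbers above are averages over repeated forward passes. A minimal timing harness for producing comparable averages might look like the following sketch; `run_inference` is a hypothetical callable wrapping a framework-specific forward pass (SNPE, ncnn, NCSDK, etc.), and the warm-up count is an assumption, not taken from the measurements above:

```python
import time


def average_inference_ms(run_inference, warmup=5, iters=50):
    """Return the mean latency of run_inference() in milliseconds.

    run_inference: hypothetical zero-arg callable that performs one
    forward pass on the target runtime.
    """
    # Warm-up runs: let caches, JIT, and DVFS governors settle
    # before timing, so the average reflects steady-state latency.
    for _ in range(warmup):
        run_inference()

    start = time.perf_counter()
    for _ in range(iters):
        run_inference()
    elapsed = time.perf_counter() - start

    return elapsed * 1000.0 / iters


# Example usage with a dummy workload standing in for a real model:
if __name__ == "__main__":
    ms = average_inference_ms(lambda: sum(range(100_000)))
    print(f"average inference time: {ms:.2f} ms")
```

A single timed run is noisy on mobile SoCs (thermal throttling, frequency scaling), which is why averaging over many iterations after a warm-up phase is the usual practice.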