The Google Coral USB Accelerator is a USB device that provides an edge Tensor Processing Unit (TPU), which greatly accelerates machine learning model inference when attached to a Linux host computer (including a Raspberry Pi).

In a previous tutorial, we already learnt how to integrate TensorFlow Lite with Qt/QML for the development of Raspberry Pi apps, together with an open-source example app for object detection: Raspberry Pi, TensorFlow Lite and Qt/QML: object detection example. Have a look at it to learn the basics.

This tutorial presents the same app, but in this case we make use of the Coral USB Accelerator for inference. The results are impressive, since the inference time is reduced from 1 – 2 seconds on the Raspberry Pi CPU to 55 – 80 milliseconds on the Edge TPU.
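As a rough illustration of how such figures can be measured inside a Qt app, the sketch below times a single TensorFlow Lite invocation with QElapsedTimer. It is a minimal sketch, not the app's actual measurement code, and it assumes an already configured tflite::Interpreter whose input tensor has been filled with a camera frame.

// Minimal sketch (assumption): "interpreter" is an already configured
// tflite::Interpreter with its input tensor filled.
#include <QDebug>
#include <QElapsedTimer>

#include "tensorflow/lite/interpreter.h"

void timedInvoke(tflite::Interpreter *interpreter)
{
    QElapsedTimer timer;
    timer.start();

    // Run a single inference (on the Edge TPU or the CPU,
    // depending on how the interpreter was built)
    if (interpreter->Invoke() != kTfLiteOk) {
        qWarning() << "TensorFlow Lite inference failed";
        return;
    }

    qDebug() << "Inference time:" << timer.elapsed() << "ms";
}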

Have a look at the following video and compare the inference speed with that of the app that performs inference on the Raspberry Pi CPU: Raspberry Pi, TensorFlow Lite and Qt/QML: object detection example.

If you test this example on a computer equipped with a USB 3.0 port, you will get even better performance (7 – 9 milliseconds on my laptop). This is because, up to the Raspberry Pi 3 Model B+ (the one considered in this tutorial), only USB 2.0 ports are available, which limits the data transmission bandwidth between the Edge TPU and the Raspberry Pi itself.

The good news is that Raspberry Pi 4 Model B is equipped with USB 3.0 ports.

Outline

  • Hardware used in this tutorial
  • Qt: download, cross-compile and install on Raspberry Pi
  • TensorFlow Lite: download and cross-compile for Raspberry Pi
  • Coral USB Accelerator: download and install the Edge TPU runtime
  • Edge TPU – Raspberry Pi object detection app
  • Summary

Hardware used in this tutorial

  • Coral USB Accelerator
  • Raspberry Pi 3 Model B+
  • Raspberry Pi Camera
  • Raspberry Pi 3 – 7″ Touchscreen Display
  • 2 x Flex Cable for Raspberry Pi (for camera and display)
  • 2 x Raspberry Pi 3 B+ Power Supply – 5V 2.5A (for Raspberry Pi and display)

These are affiliate links (except the Coral USB Accelerator link). This means that if you click on a link and purchase the promoted item, we receive a small affiliate commission at no extra cost to you; the price of the product is the same. If this is fine with you, we would really appreciate your support for our work and website.

Qt: download, cross-compile and install on Raspberry Pi

Have a look at Cross-compile and deploy Qt 5.12 for Raspberry Pi. It provides all the details to do this step. There, you can also find how to set up Qt Creator to deploy Qt apps to Raspberry Pi.

TensorFlow Lite: download and cross-compile for Raspberry Pi

The compilation of TensorFlow Lite for Raspberry Pi, as well as for the host Linux operating system, is already covered in a previous tutorial: Raspberry Pi, TensorFlow Lite and Qt/QML: object detection example.

Coral USB Accelerator: download and install the Edge TPU runtime

We need to install the Coral Edge TPU runtime to access the USB Accelerator. This runtime includes Python libraries, C++ API files and a shared library (libedgetpu.so) for our target platform. The required steps are shown below for Linux and for Raspberry Pi in particular. They are also described on the Coral website in the setup sections for Linux and Raspberry Pi.

Linux host (Debian/Ubuntu)

If you want to test the Coral USB Accelerator on your Linux host, you have to install the Edge TPU runtime for x86_64 platforms as follows.

wget https://dl.google.com/coral/edgetpu_api/edgetpu_api_latest.tar.gz \
   -O edgetpu_api.tar.gz --trust-server-names
tar xzf edgetpu_api.tar.gz
cd edgetpu_api
bash ./install.sh

Raspberry Pi

The same commands must be executed on the Raspberry Pi to install the Edge TPU runtime for ARM platforms.

wget https://dl.google.com/coral/edgetpu_api/edgetpu_api_latest.tar.gz \
   -O edgetpu_api.tar.gz --trust-server-names
tar xzf edgetpu_api.tar.gz
cd edgetpu_api
bash ./install.sh

Note: Do not forget to synchronize your Raspberry Pi sysroot folder on your Linux host after downloading and installing the Edge TPU runtime. For further information about what a sysroot is and why this is needed, check Cross-compile and deploy Qt 5.12 for Raspberry Pi, section 4 – Create and configure a sysroot.

We use rsync to synchronize the sysroot on our Linux host with the Raspberry Pi; raspberrypi_ip is the hostname or IP address of our Raspberry Pi. Run the following commands from the folder that contains the sysroot directory.

rsync -avz pi@raspberrypi_ip:/lib sysroot
rsync -avz pi@raspberrypi_ip:/usr/include sysroot/usr
rsync -avz pi@raspberrypi_ip:/usr/lib sysroot/usr
rsync -avz pi@raspberrypi_ip:/opt/vc sysroot/opt

Next, we need to make the symbolic links in sysroot relative, since absolute links copied from the Raspberry Pi would point to locations on our host filesystem instead of inside the sysroot.

wget https://raw.githubusercontent.com/riscv/riscv-poky/master/scripts/sysroot-relativelinks.py
chmod +x sysroot-relativelinks.py
./sysroot-relativelinks.py sysroot

Edge TPU – Raspberry Pi object detection app

This app is open source and hosted in a Git repository on GitHub. The app is mostly the same as the one developed in Raspberry Pi, TensorFlow Lite and Qt/QML: object detection example. The main differences are the following.

  • Use of the TensorFlow Lite C++ API for the Edge TPU (see the sketch after this list).
  • Use of an artificial neural network model compiled for the Edge TPU: MobileNet SSD v2 (COCO). However, this example works with any MobileNet SSD model.
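As a hedged illustration of the first point, the sketch below shows how an interpreter bound to the Edge TPU can be created with the Edge TPU C++ API (edgetpu.h) provided by the runtime installed above. It is a minimal sketch rather than the app's exact code: the model file name is only an example of a model compiled for the Edge TPU, and error handling is omitted for brevity.

// Minimal sketch: build a TensorFlow Lite interpreter that runs the
// Edge TPU custom op on the Coral USB Accelerator.
#include <memory>
#include <string>

#include "edgetpu.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main()
{
    // Example file name: any model compiled for the Edge TPU can be used here
    const std::string modelPath = "mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite";

    // Open the Edge TPU device (the Coral USB Accelerator)
    std::shared_ptr<edgetpu::EdgeTpuContext> context =
        edgetpu::EdgeTpuManager::GetSingleton()->OpenDevice();

    // Load the Edge TPU compiled model
    std::unique_ptr<tflite::FlatBufferModel> model =
        tflite::FlatBufferModel::BuildFromFile(modelPath.c_str());

    // Register the Edge TPU custom op alongside the built-in TensorFlow Lite ops
    tflite::ops::builtin::BuiltinOpResolver resolver;
    resolver.AddCustom(edgetpu::kCustomOp, edgetpu::RegisterCustomOp());

    // Build the interpreter and bind it to the Edge TPU context
    std::unique_ptr<tflite::Interpreter> interpreter;
    tflite::InterpreterBuilder(*model, resolver)(&interpreter);
    interpreter->SetExternalContext(kTfLiteEdgeTpuContext, context.get());
    interpreter->SetNumThreads(1);
    interpreter->AllocateTensors();

    // The interpreter is now ready: fill its input tensor with an image,
    // call interpreter->Invoke() and read the detection output tensors.
    return 0;
}

Apart from this setup, the rest of the pipeline (feeding camera frames and parsing the SSD output tensors) remains essentially the same as in the CPU-only example.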


Have a look at all the available Edge TPU neural network models provided by Coral. The list includes models for image classification and object detection, among others.

If you are interested in training and serving your own object detection model with edge TPU support, have a look at Coral documentation for retraining an object detection model.

Summary

This tutorial extended a previous tutorial, about how to use the TensorFlow Lite C++ API on Raspberry Pi for object detection, with fast Edge TPU inference thanks to a Coral USB Accelerator device. The app presented here works on desktop as well as on Raspberry Pi, and it is compatible with any MobileNet SSD neural network model.

I hope you liked the tutorial. Please consider rating it with the stars you can find below; this gives us feedback about our work. If you have any doubts, proposals, comments or issues, just write below, we are here to help :-).
