R2Inference - Coral from Google




Previous: Supported_backends/TensorRT Index Next: Supported_backends/ONNXRT




Introduction

The Edge TPU is an ASIC designed by Google to provide high-performance machine learning inference for applications running on embedded devices. The Edge TPU library extends the TensorFlow Lite framework. For more information about the Edge TPU hardware, see the Coral Dev Board page.

Installation

The R2Inference Edge TPU backend depends on the C/C++ TensorFlow Lite API and on the TensorFlow Lite backend. The installation process consists of downloading the source code, building it, and installing it.

Coral Dev Board

TensorFlow Lite

1. Download the TensorFlow source code and check out the latest release tag (2.4.1 at the time of writing).

git clone https://github.com/tensorflow/tensorflow
cd tensorflow
git checkout 85c8b2a817f95a3e979ecd1ed95bff1dc1335cff
cd tensorflow/lite/tools/make
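
Optionally, verify that the checkout matches the expected release. Run this from anywhere inside the cloned repository; it should report a tag at or near v2.4.1:

git describe --tags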

2. Download dependencies

./download_dependencies.sh
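
The script populates the downloads/ directory with third-party sources (including the Abseil copy used in step 4 below). A quick check that it completed:

ls downloads/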

3. Build the library.

Note: This build might take a while and consumes a large amount of RAM, so please avoid running other demanding tasks while the build is in progress.

./build_aarch64_lib.sh

Copy the static library to the system library path:

sudo cp gen/linux_aarch64/lib/libtensorflow-lite.a /usr/local/lib/
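
To confirm the archive was built and copied correctly, you can list it and inspect a few of its object files (exact contents vary with the TensorFlow version):

ls -lh /usr/local/lib/libtensorflow-lite.a
ar t /usr/local/lib/libtensorflow-lite.a | head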

4. Install the Abseil dependency.

cd downloads/absl/
mkdir build && cd build
cmake ..
make && sudo make install
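
If the install step succeeds, the Abseil headers are expected under the default /usr/local prefix; the exact layout may vary with the Abseil snapshot shipped in downloads/, so treat this only as a sanity check:

ls /usr/local/include/absl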

USB Accelerator

The Coral USB Accelerator is an external device that adds Edge TPU inference acceleration to a compatible host device. More information: https://coral.ai/docs/accelerator/get-started

The USB Accelerator also requires TensorFlow Lite, so first follow the instructions in the TensorFlow Lite section above.

Then continue with the following steps:

Install Edge TPU library

1. Add the Debian package repository to your system:

echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list

curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

sudo apt-get update
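
At this point the Coral repository should be visible to apt. Optionally confirm that the runtime package is available before installing:

apt-cache policy libedgetpu1-std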

2. Install the Edge TPU runtime:

sudo apt-get install libedgetpu1-std

3. Install dev headers:

sudo apt-get install libedgetpu-dev
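
After both packages are installed, a few quick checks can confirm the setup. The device names in the last command are the ones commonly reported for the USB Accelerator (Global Unichip Corp. before the first inference, Google Inc. afterwards); treat them as a reference rather than a requirement:

ls /usr/include/edgetpu.h
ldconfig -p | grep edgetpu
lsusb | grep -iE "Global Unichip|Google"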

Troubleshooting

With the TensorFlow commit d855adfc5a0195788bf5f92c3c7352e638aa1109, there is a bug in the Makefile: it does not include the sparsity source files, which can cause linker errors. To fix this issue, apply the following patch in your local TensorFlow repository: Fix a build error with Makefile.
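
Assuming you saved the linked patch locally (the file name below is only a placeholder), it can be applied from the root of your TensorFlow checkout:

cd ~/tensorflow                        # adjust to the path of your TensorFlow clone
git apply fix-makefile-sparsity.patch  # placeholder name for the downloaded patch file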

API

Since the Edge TPU backend is an extension of the TensorFlow Lite backend, it keeps the same parameter options presented in the R2Inference/Supported_backends/TensorFlow-Lite#API section.
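
For reference, a standalone C++ application that uses the TensorFlow Lite static library and the Edge TPU runtime installed above could be linked roughly as follows. This is only a sketch: the source file name, TENSORFLOW_DIR, and the flatbuffers include path are assumptions that depend on your local setup:

TENSORFLOW_DIR=$HOME/tensorflow        # adjust to the path of your TensorFlow clone
g++ -std=c++14 my_app.cc \
    -I"$TENSORFLOW_DIR" \
    -I"$TENSORFLOW_DIR/tensorflow/lite/tools/make/downloads/flatbuffers/include" \
    /usr/local/lib/libtensorflow-lite.a -ledgetpu -lpthread -ldl \
    -o my_app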


