November updates

New PyCoral library and extended libcoral library

The Coral Team
November 5, 2020

We're excited to bring you the following software updates for our Python and C++ libraries.

New PyCoral library

When we first launched Coral, we offered the Edge TPU Python library for inferencing on the Edge TPU. But since we released the TensorFlow Lite delegate for the Edge TPU, we've been migrating our APIs to be fully compatible with the TensorFlow Lite interpreter. That work is now complete, and the result is this new Python library.

Whether or not you use the PyCoral APIs, running an inference on the Edge TPU depends on the TensorFlow Lite interpreter API. What PyCoral offers are APIs that simplify your code for common inferencing tasks, such as setting up the interpreter and processing your model's output tensors, so your code is much shorter than when using the TensorFlow Lite API alone.
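
For example, here's a minimal sketch of an image classification inference using the PyCoral adapter APIs. The model, label, and image file names below are just placeholders; any classification model compiled for the Edge TPU works the same way:

from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

# Placeholder files: use any Edge TPU-compiled classification model and its labels.
interpreter = make_interpreter('mobilenet_v2_1.0_224_quant_edgetpu.tflite')
interpreter.allocate_tensors()

# The adapter APIs handle input sizing and output-tensor parsing for you.
image = Image.open('parrot.jpg').convert('RGB').resize(common.input_size(interpreter))
common.set_input(interpreter, image)
interpreter.invoke()

labels = read_label_file('imagenet_labels.txt')
for c in classify.get_classes(interpreter, top_k=3):
  print(labels.get(c.id, c.id), c.score)

The same pattern applies to detection and segmentation models, using the corresponding adapter modules.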

This library also introduces model pipelining for Python (previously released as beta only for C++). The new PipelinedModelRunner API allows you to run a segmented model across multiple Edge TPUs, increasing the throughput for some models, especially large models. And this library is where you'll find our latest APIs for on-device transfer learning, such as with backpropagation and weight imprinting.
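
If you have a segmented model and multiple Edge TPUs attached, setting up the pipeline looks roughly like this. This is only a sketch, not definitive API usage: it assumes a model already split into two segments by the Edge TPU Compiler, two available Edge TPUs, placeholder file names, and the dictionary-based push()/pop() format (check the pipelining example in the PyCoral repo for the exact signatures in your version):

import numpy as np
from PIL import Image
from pycoral.adapters import common
from pycoral.pipeline import pipelined_model_runner
from pycoral.utils.edgetpu import make_interpreter

# Placeholder segment files produced by the Edge TPU Compiler, one per Edge TPU.
segments = ['model_segment_0_of_2_edgetpu.tflite',
            'model_segment_1_of_2_edgetpu.tflite']

# Open each segment on its own Edge TPU (devices ':0', ':1', ...).
interpreters = [make_interpreter(path, device=':%d' % i)
                for i, path in enumerate(segments)]
for interpreter in interpreters:
  interpreter.allocate_tensors()
runner = pipelined_model_runner.PipelinedModelRunner(interpreters)

# Push inputs keyed by tensor name; an empty push signals the end of the stream.
name = interpreters[0].get_input_details()[0]['name']
size = common.input_size(interpreters[0])
image = np.asarray(Image.open('parrot.jpg').convert('RGB').resize(size))
runner.push({name: image})
runner.push({})

# Pop results, keyed by output tensor name, in the order the inputs were pushed.
result = runner.pop()
for tensor_name, value in result.items():
  print(tensor_name, value.shape)

In practice, you'd push from a producer thread and pop from a consumer thread so that all segments stay busy at once.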

To get all these APIs, install the PyCoral Python wheel and import the "pycoral" module. On Debian systems, including the Coral Dev Board, you can install it as follows:

sudo apt-get update

sudo apt-get install python3-pycoral

For more details, see our guide to running an inference on the Edge TPU with Python and explore the PyCoral API reference.

Edge TPU Python library is deprecated

As described above, PyCoral is our new library for Python development with the Edge TPU, and we will not release any more updates for the Edge TPU Python library.

In case your project depends on it, the Edge TPU Python library is still available (as a Python wheel or a Debian package). However, if you continue using it, you must also use the corresponding version of the Edge TPU Runtime (libedgetpu), which is now outdated. To enforce this compatibility on Debian systems, we've created a "legacy" version of the libedgetpu Debian package that's frozen at that older version. If you now install or upgrade python3-edgetpu, you'll see a message requiring that you also install the legacy package. Doing so leaves you with an outdated version of libedgetpu that's compatible with the Edge TPU Python library but not with our newer libraries, such as PyCoral.

To keep your code up to date with the latest Edge TPU Runtime, we recommend you migrate to the PyCoral library.

Expanded libcoral C++ library

Our Coral C++ library (libcoral) now offers essentially all the same features as our Python library, including the adapter APIs when using the TensorFlow Lite interpreter, on-device transfer learning APIs, and model pipelining.

For details, see our guide to running an inference on the Edge TPU with C++ and explore the libcoral API reference.

New profiling-based partitioner for pipelining

To further optimize performance when pipelining a model across multiple Edge TPUs, we've released a profiling-based partitioner. This tool segments an Edge TPU model by measuring the actual run-time latency of each segment and then re-segmenting the model to distribute that latency more evenly. (By comparison, the Edge TPU Compiler's --num_segments argument simply divides the model so that each segment has roughly the same amount of parameter data.)

For more information about how to build and use this tool, see the profiling-based partitioner README.

New pre-trained models

In this release, we've added a pre-trained version of MobileDet—a state-of-the-art object detection model for mobile systems—that's compatible with the Edge TPU.
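
As a rough illustration, the compiled MobileDet model can be run with PyCoral's detection adapter in much the same way as the classification sketch above (the file names here are placeholders):

from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

# Placeholder files: substitute the compiled MobileDet model from coral.ai/models/.
interpreter = make_interpreter('mobiledet_edgetpu.tflite')
interpreter.allocate_tensors()

# set_resized_input scales the image to the model's input size and returns the
# scale factor needed to map detected boxes back to the original image.
image = Image.open('street.jpg')
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size))
interpreter.invoke()

# Each detected object has a label id, a confidence score, and a bounding box.
for obj in detect.get_objects(interpreter, score_threshold=0.4, image_scale=scale):
  print(obj.id, obj.score, obj.bbox)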

We're also actively migrating our model-development workflow to TensorFlow 2, and in this release, we're sharing a handful of updated or new models based on the TF2 Keras framework: MobileNet v1 (ImageNet), MobileNet v2 (ImageNet), MobileNet v3 (ImageNet), ResNet50 v1 (ImageNet), and UNet MobileNet v2 (Oxford pets). You can find all our trained and compiled models at coral.ai/models/.

That's all for now. As always, please send us feedback at coral-support@google.com.