Examples

Code examples and project tutorials to build intelligent devices with Coral

Coral examples

Simple code examples showing how to run pre-trained models on your Coral device. More pre-trained models are on our Models page.

Note: These examples are not compatible with the Dev Board Micro; see the coralmicro examples instead.

MoveNet pose estimation

This example shows how to use the high-performance MoveNet model to detect human poses in images, using either the high-speed "Lightning" model or the high-accuracy "Thunder" model.
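
The full example lives in the Coral examples repository; the sketch below shows the core flow with the PyCoral API. The model and image filenames are placeholders, and the output layout (one pose of 17 keypoints, each a normalized (y, x, score) triple) is assumed from the standard single-pose MoveNet models.

```python
# Minimal sketch: single-image MoveNet pose estimation with the PyCoral API.
# Filenames are placeholders; use your own Edge TPU MoveNet model and image.
from PIL import Image
from pycoral.adapters import common
from pycoral.utils.edgetpu import make_interpreter

NUM_KEYPOINTS = 17  # Single-pose MoveNet predicts 17 body keypoints.

interpreter = make_interpreter('movenet_lightning_edgetpu.tflite')
interpreter.allocate_tensors()

# Resize the input image to the model's expected input size.
image = Image.open('person.jpg').convert('RGB').resize(
    common.input_size(interpreter), Image.LANCZOS)
common.set_input(interpreter, image)
interpreter.invoke()

# Assumed output: one pose of 17 keypoints, each (y, x, score) in [0, 1].
pose = common.output_tensor(interpreter, 0).copy().reshape(NUM_KEYPOINTS, 3)
for i, (y, x, score) in enumerate(pose):
    print(f'keypoint {i}: x={x:.2f} y={y:.2f} score={score:.2f}')
```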

Image recognition with video

Multiple examples showing how to stream images from a camera and run classification or detection models with the TensorFlow Lite API. Each example uses a different camera library, such as GStreamer, OpenCV, PyGame, and PiCamera.
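
As a rough sketch of that streaming loop, the snippet below reads frames with OpenCV and classifies each one with the PyCoral API. It is not one of the camera examples themselves, and the model and label filenames are placeholders for any Edge TPU classification model from the Models page.

```python
# Sketch: classify webcam frames with OpenCV and the PyCoral API.
# Filenames are placeholders; press 'q' to quit.
import cv2
from pycoral.adapters import classify, common
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter('mobilenet_v2_1.0_224_quant_edgetpu.tflite')
interpreter.allocate_tensors()
labels = read_label_file('imagenet_labels.txt')
width, height = common.input_size(interpreter)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # OpenCV gives BGR frames; the model expects RGB at its input size.
    rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
    common.set_input(interpreter, rgb)
    interpreter.invoke()
    for c in classify.get_classes(interpreter, top_k=1, score_threshold=0.1):
        cv2.putText(frame, f'{labels.get(c.id, c.id)} {c.score:.2f}',
                    (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow('classification', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```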

Object tracking with video

This example takes a camera feed and tracks each uniquely identified object, assigning each one a persistent ID. The example detection script lets you specify which tracker to use (the SORT tracker is included).
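
The sketch below illustrates the general idea: per-frame detections from PyCoral's detect adapter are handed to a SORT tracker, which returns each box tagged with a persistent ID. The `sort` module and its `Sort.update()` interface are assumptions based on the open-source SORT project (not part of PyCoral), and the model filename is a placeholder.

```python
# Sketch: feed per-frame detections into a SORT tracker for persistent IDs.
# Assumes sort.py from https://github.com/abewley/sort is on the path and that
# Sort.update() takes [x1, y1, x2, y2, score] rows and returns
# [x1, y1, x2, y2, track_id] rows, as in that reference implementation.
import cv2
import numpy as np
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter
from sort import Sort  # assumed third-party tracker

interpreter = make_interpreter('ssd_mobilenet_v2_coco_quant_edgetpu.tflite')
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)
tracker = Sort()

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
    common.set_input(interpreter, rgb)
    interpreter.invoke()
    objs = detect.get_objects(interpreter, score_threshold=0.5)
    dets = (np.array([[o.bbox.xmin, o.bbox.ymin, o.bbox.xmax, o.bbox.ymax, o.score]
                      for o in objs]) if objs else np.empty((0, 5)))
    # Each track row is (x1, y1, x2, y2, id); the ID stays stable across frames.
    for x1, y1, x2, y2, track_id in tracker.update(dets):
        print(f'track {int(track_id)}: ({x1:.0f}, {y1:.0f}) to ({x2:.0f}, {y2:.0f})')
cap.release()
```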

Person segmentation with video

This example takes in a camera feed and performs body-part segmentation using the BodyPix model (with both MobileNet v1 and ResNet50 backbones). In addition to identifying different body parts, it can anonymize people in images.

PoseNet pose estimation with video

Multiple examples showing how to use the PoseNet model to detect human poses from images and video, such as locating the position of someone’s elbow, shoulder or foot.

Semantic segmentation

This example performs semantic segmentation on an image. It takes an image as input and creates a new version of that image showing which pixels correspond to each recognized object.
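
A minimal sketch of that flow with the PyCoral API is shown below. The DeepLab model and image filenames are placeholders, and the code assumes the model emits either a per-pixel class map or per-pixel class scores (reduced here with an argmax).

```python
# Sketch: semantic segmentation on one image, saving a colorized class mask.
# Filenames are placeholders; the output-tensor layout is an assumption.
import numpy as np
from PIL import Image
from pycoral.adapters import common
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter('deeplabv3_mnv2_pascal_quant_edgetpu.tflite')
interpreter.allocate_tensors()
width, height = common.input_size(interpreter)

image = Image.open('street.jpg').convert('RGB').resize((width, height), Image.LANCZOS)
common.set_input(interpreter, image)
interpreter.invoke()

result = common.output_tensor(interpreter, 0).copy().squeeze()
# Some models emit per-class scores; reduce them to one label per pixel.
if result.ndim == 3:
    result = np.argmax(result, axis=-1)

# Map each class ID to a fixed pseudo-random color and save the mask.
palette = np.random.RandomState(0).randint(0, 255, (256, 3), dtype=np.uint8)
Image.fromarray(palette[result.astype(int)]).save('segmentation_mask.png')
```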

Keyphrase detector

A few examples using a keyphrase detection model that can detect over 140 short phrases such as "start game" and "next song." Includes a snake game and a YouTube player that respond to voice commands.

Pipelined image classification

An example showing how to pipeline a model across multiple Edge TPUs, allowing you to significantly increase throughput for large models such as Inception.
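
In rough outline, the flow looks like the sketch below, which uses PyCoral's PipelinedModelRunner with a model that has been split into per-TPU segments by the Edge TPU Compiler. The segment filenames, the empty-push end-of-input convention, and the single-output unpacking are assumptions to verify against the model pipelining documentation.

```python
# Sketch: run one model split across two Edge TPUs with PipelinedModelRunner.
# Segment filenames are placeholders (produced by the Edge TPU Compiler's
# segmentation option); details of the push/pop protocol are assumptions.
import numpy as np
from PIL import Image
from pycoral.adapters import common
from pycoral.pipeline.pipelined_model_runner import PipelinedModelRunner
from pycoral.utils.edgetpu import make_interpreter

segments = ['inception_v3_segment_0_edgetpu.tflite',
            'inception_v3_segment_1_edgetpu.tflite']

# Open each segment on its own Edge TPU (':0', ':1', ...).
interpreters = [make_interpreter(path, device=f':{i}')
                for i, path in enumerate(segments)]
for interpreter in interpreters:
    interpreter.allocate_tensors()
runner = PipelinedModelRunner(interpreters)

size = common.input_size(interpreters[0])
name = interpreters[0].get_input_details()[0]['name']
image = np.asarray(Image.open('parrot.jpg').convert('RGB').resize(size, Image.LANCZOS))

# Push inputs into the first segment; results come out of the last segment.
runner.push({name: image})
runner.push({})  # An empty push signals that no more inputs are coming.
scores, = runner.pop().values()
print('top class:', int(np.argmax(scores)))
```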

Basic object detection

An example that performs object detection on a photo and draws a bounding box around each detected object. Also works with face detection models.
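
In outline, the example follows the flow sketched below with the PyCoral API; the model, label, and image filenames are placeholders.

```python
# Sketch: single-image object detection, drawing a box around each detection.
# Filenames are placeholders; any Edge TPU detection model should work.
from PIL import Image, ImageDraw
from pycoral.adapters import common, detect
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter('ssd_mobilenet_v2_coco_quant_edgetpu.tflite')
interpreter.allocate_tensors()
labels = read_label_file('coco_labels.txt')

image = Image.open('photo.jpg').convert('RGB')
# Resize for the model, keeping the scale so boxes map back to the original image.
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))
interpreter.invoke()

draw = ImageDraw.Draw(image)
for obj in detect.get_objects(interpreter, score_threshold=0.4, image_scale=scale):
    box = obj.bbox
    draw.rectangle([(box.xmin, box.ymin), (box.xmax, box.ymax)], outline='red')
    draw.text((box.xmin, box.ymin),
              f'{labels.get(obj.id, obj.id)} {obj.score:.2f}', fill='red')
image.save('detections.jpg')
```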

Partner examples

More examples that use ML tools from our partners.

Sound categorization with balenaCloud

This tutorial teaches you how to deploy a Coral Dev Board with a pre-trained sound categorization model, and use a balenaCloud backend to manually review the classifications.

Fleet management with balenaCloud

This example uses balenaCloud to deploy an object detection model to a Dev Board and view live inference results from a web page.

Project tutorials

Instructions and source code to help you bring local AI into the real world.

Alto

An open source AI experiment that introduces the basics of machine learning by helping you build a teachable object using a Raspberry Pi Zero and the Coral USB Accelerator.

Teachable Sorter

A physical machine that you can teach to rapidly recognize and sort objects using your own custom machine learning models.

Smart Bird Feeder

A smart bird feeder that uses an image classification model to identify birds, record animal visits, and deter squirrels from stealing bird seed.

Banana Seeker Robot

A Raspberry Pi-based robot that chases bananas. This project uses some basic object tracking software and a pre-trained object detection model.

Embedded Teachable Machine

A machine that can quickly learn to recognize new objects by re-training a vision classification model directly on your device.

Minigo

An implementation of AlphaGo Zero called Minigo, which uses machine learning to play the strategy board game Go at expert levels.