Models

Built for the Edge TPU

In the lists below, each "Edge TPU model" link provides a .tflite file that is pre-compiled to run on the Edge TPU. You can run these models on your Coral device using our example code. (Remember to download the model's corresponding labels file.)
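Labels files for these models are plain text, typically one label per line, sometimes prefixed with a numeric class id. A minimal parser sketch under that assumption (the function name read_label_file is illustrative, not part of any Coral API):

```python
# Parse a labels file into {class_index: label}.
# Handles both "label" (index = line number) and "id label" formats.
def read_label_file(path):
    labels = {}
    with open(path) as f:
        for i, line in enumerate(f):
            text = line.strip()
            if not text:
                continue
            parts = text.split(maxsplit=1)
            if len(parts) == 2 and parts[0].isdigit():
                labels[int(parts[0])] = parts[1]
            else:
                labels[i] = text
    return labels
```

The returned dictionary maps the model's output class indices to human-readable names.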

For many of the models, we've also provided a link for "All model files," which is an archive file that includes the following:

  • Trained model checkpoints
  • Frozen graph for the trained model
  • Eval graph text protos (for easy viewing)
  • Info file containing input and output information
  • Quantized TensorFlow Lite model that runs on CPU (included with classification models only)

Download the "All model files" archive to get the checkpoint file you'll need if you want to use the model as the basis for transfer learning, as shown in the tutorials to retrain a classification model and retrain an object detection model.

If you'd like to download all models at once, you can clone our Git repo https://github.com/google-coral/edgetpu and then find the models in the test_data/ directory.

Notice: These are not production-quality models; they are for demonstration purposes only.

To build your own model for the Edge TPU, you must use the Edge TPU Compiler.
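Compiling is a single command once the Edge TPU Compiler is installed. A sketch, with a hypothetical file name (the compiler expects a fully quantized TensorFlow Lite model as input):

```shell
# Compile a quantized .tflite model for the Edge TPU.
# "your_model_quant.tflite" is a placeholder for your own model file.
edgetpu_compiler your_model_quant.tflite

# The compiler writes the compiled model alongside the input,
# with "_edgetpu" appended to the file name:
#   your_model_quant_edgetpu.tflite
```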

All models trained on ImageNet used the ILSVRC2012 dataset.

Image classification


Object detection


MobileNet SSD v2 (Faces)

Detects the location of human faces
Dataset: Open Images v4
Input size: 320x320
(Does not require a labels file)

Semantic segmentation


On-device retraining (classification)


MobileNet v1 embedding extractor

This model is compiled with the last fully-connected layer removed so that it can be used as an embedding extractor for on-device transfer learning. It does not perform classification on its own; it must be paired with the SoftmaxRegression API.
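To make the pairing concrete, here is an illustrative pure-Python sketch of what a softmax-regression head does: it takes the fixed embeddings produced by the extractor and trains only a final classification layer by gradient descent. This is a conceptual stand-in, not the Coral SoftmaxRegression API; all class and method names below are invented for illustration.

```python
import math

def softmax(z):
    # Numerically stable softmax over a list of logits.
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

class SoftmaxHead:
    """Trainable classification head over fixed embeddings (illustration only)."""

    def __init__(self, n_features, n_classes, lr=0.5):
        self.w = [[0.0] * n_features for _ in range(n_classes)]
        self.b = [0.0] * n_classes
        self.lr = lr

    def predict_proba(self, x):
        logits = [sum(wi * xi for wi, xi in zip(row, x)) + b
                  for row, b in zip(self.w, self.b)]
        return softmax(logits)

    def train(self, data, epochs=200):
        # data: list of (embedding, class_index) pairs.
        for _ in range(epochs):
            for x, y in data:
                p = self.predict_proba(x)
                for c in range(len(self.b)):
                    # Gradient of cross-entropy loss w.r.t. logit c.
                    g = p[c] - (1.0 if c == y else 0.0)
                    for j in range(len(x)):
                        self.w[c][j] -= self.lr * g * x[j]
                    self.b[c] -= self.lr * g

    def predict(self, x):
        p = self.predict_proba(x)
        return p.index(max(p))
```

In the real workflow, the embeddings fed to the head come from running the MobileNet v1 extractor on the Edge TPU; only the small head above is updated on-device, which is what makes backpropagation-based retraining cheap enough to run locally.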

For details, read Retrain a classification model on-device with backpropagation.