A USB accessory that brings machine learning inferencing to existing systems. Works with Raspberry Pi and other Linux systems.
This product is subject to shipping restrictions and can only be sent to the following countries and regions: Austria, Belgium, Bulgaria, Croatia, Cyprus, Denmark, Estonia, Finland, Germany, Greece, Hong Kong, Hungary, Iceland, Ireland, Italy, Japan, Korea, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, US
The Coral USB Accelerator brings powerful ML inferencing capabilities to existing Linux systems.
Featuring the Edge TPU, a small ASIC designed and built by Google, the USB Accelerator provides high-performance ML inferencing at low power over a USB 3.0 interface. For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at 100+ fps. This lets you add fast ML inferencing to your embedded AI devices in a power-efficient and privacy-preserving way.
Models are developed in TensorFlow Lite and then compiled to run on the USB Accelerator.
Edge TPU key benefits
Coral is a division of Google that helps you build intelligent ideas with our platform for local AI.
Edge TPU ML accelerator
Arm 32-bit Cortex-M0+ microcontroller (MCU)
Run on-device ML inferencing on the Edge TPU designed by Google.
Works with Debian Linux
Connect to any Linux-based system with an included USB Type-C cable.
Supports TensorFlow Lite
No need to build models from the ground up: TensorFlow Lite models can be compiled to run on the USB Accelerator.
An Open Software Ecosystem
We recognize that each project has different requirements and goals, so we have worked to give developers as much flexibility as possible. Our quad-core solution allows us to run numerous software solutions. From the selection of algorithms, NLP services, smart assistants, operating systems, and more, we wish to support our users. If you have suggestions for services you would like to see us officially support, let us know on our forums; or, if you do the work yourself, submit a merge request on our GitHub.
Shipment Date: Jun 24, 2019