![](https://www.aitechinsights.com/wp-content/uploads/2023/11/1699834582_Google-Coral-USB-Edge-TPU-ML-Accelerator-coprocessor-for-Raspberry.jpg)
![](https://www.aitechinsights.com/wp-content/uploads/2023/11/Google-Coral-USB-Edge-TPU-ML-Accelerator-coprocessor-for-Raspberry.jpg)
![](https://www.aitechinsights.com/wp-content/uploads/2023/11/1699834581_737_Google-Coral-USB-Edge-TPU-ML-Accelerator-coprocessor-for-Raspberry.jpg)
![](https://www.aitechinsights.com/wp-content/uploads/2023/11/1699834581_862_Google-Coral-USB-Edge-TPU-ML-Accelerator-coprocessor-for-Raspberry.jpg)
![](https://www.aitechinsights.com/wp-content/uploads/2023/11/Google-Coral-USB-Edge-TPU-ML-Accelerator-coprocessor-for-Raspberry.png)
- High-speed TensorFlow Lite inferencing
- Low power
- Small footprint

**Features**
- Google Edge TPU ML accelerator coprocessor
- USB 3.0 Type-C socket
- Supports Debian Linux on the host CPU
- Models are built using TensorFlow. Fully supports MobileNet and Inception architectures, though custom architectures are possible
- Compatible with Google Cloud

**Specifications**
- Arm 32-bit Cortex-M0+ microprocessor (MCU): up to 32 MHz
- 16 KB flash memory with ECC
- 2 KB RAM
- Connections: USB 3.1 (Gen 1) port and cable (SuperSpeed, 5 Gb/s transfer speed); included cable is USB Type-C to Type-A

Coral, a division of Google, helps build intelligent ideas with a platform for local AI.