Run TensorFlow Models on reTerminal with the TensorFlow Lite Runtime

TensorFlow Lite is a set of tools that enables on-device machine learning by helping developers run their models on mobile, embedded, and IoT devices. It is optimized for on-device workloads, with a focus on latency, privacy, connectivity, size, and power consumption. The framework supports multiple platforms, including Android and iOS devices, embedded Linux, and microcontrollers, and offers APIs in several languages, such as Java, Swift, Objective-C, C++, and Python, with high performance through hardware acceleration and model optimization. It also provides end-to-end examples for common machine learning tasks, such as image classification, object detection, pose estimation, question answering, and text classification, on multiple platforms.

Meet reTerminal, the next generation of Human-Machine Interface (HMI). This future-ready HMI device can easily and efficiently work with IoT and cloud systems to unlock endless scenarios at the edge. reTerminal is powered by a Raspberry Pi Compute Module 4 (CM4) with a quad-core Cortex-A72 CPU running at 1.5 GHz, and features a 5-inch IPS capacitive multi-touch screen with a resolution of 720 x 1280. Its 4 GB of RAM is sufficient for multitasking, and its 32 GB of eMMC storage holds the operating system, enabling fast boot-up times and a smooth overall experience. For wireless connectivity it offers dual-band 2.4 GHz/5 GHz Wi-Fi and Bluetooth 5.0 BLE.

TensorFlow Lite Runtime Package Installation

The tflite_runtime package is a smaller, simplified Python package that includes the bare minimum code required to run an inference with TensorFlow Lite. This package is ideal when all you want to do is execute .tflite models and avoid wasting disk space with the large TensorFlow library.
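On reTerminal (Raspberry Pi OS with Python 3), recent versions of the package can usually be installed straight from PyPI with pip; older releases were distributed through a separate Google package index. The snippet below is a minimal inference sketch, assuming a hypothetical converted model file named model.tflite and random dummy input instead of a real image:

    # Install the runtime on reTerminal / Raspberry Pi OS (recent releases are on PyPI):
    #   python3 -m pip install tflite-runtime numpy

    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    # Load a converted model; "model.tflite" is a placeholder path.
    interpreter = Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed dummy data shaped like the model's input (e.g. 1x224x224x3 for MobileNet).
    input_shape = input_details[0]["shape"]
    dummy_input = np.random.random_sample(input_shape).astype(input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy_input)

    # Run inference and read the first output tensor.
    interpreter.invoke()
    output = interpreter.get_tensor(output_details[0]["index"])
    print("Output shape:", output.shape)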

It is possible to use the TFLite Converter to convert any TensorFlow model into the .tflite format, provided it only consists of operations supported by the TFLite runtime (a minimal conversion sketch is shown below). The demos currently tested on reTerminal are listed after the sketch and will be expanded and completed in the future.
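As a reference, here is a minimal conversion sketch using the TensorFlow Python API on a development machine; the SavedModel directory saved_model_dir and the output filename are placeholders, and the default optimizations are shown as one common option rather than a requirement:

    import tensorflow as tf

    # Convert a SavedModel to the .tflite format; "saved_model_dir" is a placeholder path.
    converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")

    # Optional: apply the default size/latency optimizations (post-training quantization).
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    # Conversion fails if the graph contains operations the TFLite runtime does not support.
    tflite_model = converter.convert()

    with open("model.tflite", "wb") as f:
        f.write(tflite_model)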

Object Detection
Demo: Vehicle Detection (Jupyter Notebook, example scripts)
Performance: alpha 0.25 at 224×224: 66.7 FPS (15 ms); alpha 0.5 at 224×224: 40 FPS (25 ms); alpha 0.75 at 320×320: 14.9 FPS (67 ms); alpha 1.0 at 320×320: 10.4 FPS (96 ms)

Image Classification
Demo: Industrial Conveyor Rip Identification (Jupyter Notebook, example scripts)

Semantic Segmentation
Demo: Lung Segmentation (Jupyter Notebook, example scripts)

Face Age/Gender Recognition
Demo: Multi-stage inference, MobileNet YOLOv3 alpha 0.25 -> MobileFaceNet (GitHub repository, example scripts)
Performance: ~16-20 FPS (with ARM NN)

Face Expression Recognition
Demo: Multi-stage inference, MobileNet YOLOv3 alpha 0.25 -> MobileFaceNet (GitHub repository, example scripts)
Performance: ~11 FPS

Face Anti-Spoofing
Demo: Multi-stage inference, MobileNet YOLOv3 alpha 0.25 -> MobileNet v1 alpha 0.25 (Jupyter Notebook, example scripts)
Performance: ~23 FPS (with ARM NN)
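Latency and FPS figures like those above can be approximated by timing the interpreter's invoke() call in a loop; the sketch below is one way to do this on the CPU with a placeholder model file and dummy input (the ARM NN figures in the list come from the ARM NN delegate, which is not shown here), and num_threads=4 is an assumption matching the CM4's four cores:

    import time
    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    # "model.tflite" is a placeholder; num_threads=4 targets the CM4's four Cortex-A72 cores.
    interpreter = Interpreter(model_path="model.tflite", num_threads=4)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    dummy = np.random.random_sample(inp["shape"]).astype(inp["dtype"])

    # Warm-up run before timing.
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()

    # Average the per-inference latency over a number of runs.
    runs = 50
    start = time.perf_counter()
    for _ in range(runs):
        interpreter.set_tensor(inp["index"], dummy)
        interpreter.invoke()
    elapsed = (time.perf_counter() - start) / runs

    print(f"Average latency: {elapsed * 1000:.1f} ms ({1.0 / elapsed:.1f} FPS)")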

Please don't forget to check the reTerminal wiki page for machine learning applications to explore further and run the demos on your own reTerminal! We have also included a MediaPipe test. We will keep developing and extending on-device machine learning possibilities, so let us know which applications you are running or planning and we will add them to our development to-do list!
