[Video] Running TensorFlow Lite Models on Raspberry Pi

Many deep learning models created with TensorFlow require substantial processing power to perform inference. Fortunately, there is a lightweight version of TensorFlow called TensorFlow Lite (TFLite for short) that allows these models to run on devices with limited capabilities; for many models, inference completes in under a second.

This tutorial will go through how to prepare a Raspberry Pi (RPi) to run a TFLite model for classifying images. After that, the TFLite version of the MobileNet model will be downloaded and used for making predictions on-device.
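As a rough sketch of the classification step described above: the snippet below preprocesses an image for a float MobileNet model (224×224 input scaled to [-1, 1]) and runs it through a TFLite interpreter. The `classify` helper and the preprocessing constants are illustrative assumptions, not the tutorial's exact code; the interpreter object would come from `tflite_runtime.interpreter.Interpreter` (or `tf.lite.Interpreter`) loaded with a MobileNet `.tflite` file.

```python
import numpy as np

def preprocess(image):
    # Float MobileNet expects a [1, 224, 224, 3] float32 tensor in [-1, 1].
    # `image` is assumed to be a 224x224x3 uint8 array (e.g. from PIL).
    x = image.astype(np.float32) / 127.5 - 1.0
    return np.expand_dims(x, axis=0)

def classify(interpreter, image, top_k=3):
    # Works with tflite_runtime.interpreter.Interpreter or tf.lite.Interpreter.
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], preprocess(image))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    # Return the top-k (class_index, score) pairs, highest score first.
    top = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in top]
```

On the RPi, the interpreter would typically be created once at startup (`Interpreter(model_path="mobilenet.tflite")`, filename hypothetical) and reused across frames, since `allocate_tensors` is the expensive step.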

Tutorial video link:

Run the code on a free GPU:

