Many deep learning models built with TensorFlow demand more processing power than small devices can provide. Fortunately, TensorFlow Lite (TFLite for short), a lightweight version of TensorFlow, allows these models to run on devices with limited capabilities, often completing inference in under a second.
This tutorial walks through how to prepare a Raspberry Pi (RPi) to run a TFLite model for image classification. After that, the TFLite version of the MobileNet model is downloaded and used to make predictions on-device.
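As a rough sketch of what the tutorial covers, the snippet below shows the general shape of on-device classification with the TFLite interpreter (on the RPi, the interpreter is typically installed via the `tflite-runtime` PyPI package). The model path, the 224x224 input size, and the nearest-neighbour resize helper are assumptions for illustration, not the tutorial's exact code; MobileNet variants commonly expect float inputs scaled to [-1, 1].

```python
# Hedged sketch: image classification with a TFLite MobileNet model.
# The model path and input size below are illustrative assumptions.
import numpy as np

def preprocess(image, size=224):
    """Nearest-neighbour resize to (size, size) and scale pixels to [-1, 1],
    the input range MobileNet float models commonly expect."""
    h, w, _ = image.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = image[rows[:, None], cols[None, :], :]
    scaled = resized.astype(np.float32) / 127.5 - 1.0
    return scaled[None, ...]  # add batch dimension -> (1, size, size, 3)

def classify(model_path, image):
    """Run one image through a TFLite classifier; returns the top class index."""
    # tflite_runtime is the lightweight interpreter package for the RPi;
    # with a full TensorFlow install, tf.lite.Interpreter works the same way.
    from tflite_runtime.interpreter import Interpreter
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], preprocess(image))
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])[0]
    return int(np.argmax(probs))
```

A call like `classify("mobilenet_v1_1.0_224.tflite", frame)` (filename assumed) would then map a camera frame to an ImageNet class index, to be looked up in a labels file.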
Tutorial video link: https://youtu.be/FdfxizUUQJI
Run the code on a free GPU: https://console.paperspace.com/ml-showcase/notebook/rljtgo7aadmiq7q?file=Raspberry%20Pi%20TF%20Lite%20Models.ipynb