TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. TensorFlow has always run on many platforms, but as the adoption of ML models has grown exponentially over the last few years, so has the need to deploy them on mobile and embedded devices. TensorFlow Lite enables low-latency, on-device inference of machine learning models.
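To make the on-device inference flow concrete, here is a minimal sketch using the TensorFlow Lite Python interpreter API; it assumes a model has already been converted to a `.tflite` file (the path `model.tflite` and the zero-valued input are placeholders for illustration).

```python
import numpy as np
import tensorflow as tf

# Load a converted TensorFlow Lite model (placeholder path).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input with the shape and dtype the model expects.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

# Run inference and read back the result.
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```

On Android and iOS the same load / allocate / invoke pattern is exposed through the platform-specific interpreter bindings rather than Python.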