2. Google ML Kit
ML Kit (beta) brings Google's machine learning expertise to mobile developers in a
powerful and easy-to-use package.
It is a mobile SDK that brings Google's machine learning expertise to Android and
iOS apps.
Whether you're new to machine learning or experienced in it, you can easily implement
the functionality you need in just a few lines of code. There's no need to have deep
knowledge of neural networks or model optimization to get started.
It brings Google's ML technologies, such as the Google Cloud Vision API, Mobile
Vision, and TensorFlow Lite, together in a single SDK.
3. What Google ML Kit can build
Use the ML Kit Text Recognition API to detect text in images.
Use the ML Kit Face Contour API to identify facial features in images.
(Optional) Use the ML Kit Cloud Text Recognition API to expand text recognition
capabilities (such as non-Latin alphabets) when the device has internet connectivity.
Learn how to host a custom pre-trained TensorFlow Lite model using Firebase.
Use the ML Kit Custom Model API to download the pre-trained TensorFlow Lite
model to your app.
Use the downloaded model to run inference and label images.
5. What is Firebase?
Firebase is Google’s mobile application development platform that helps you build, improve, and
grow your app.
It covers a large portion of the services that developers would normally have to build
themselves, but don’t really want to build, because they’d rather be focusing on the app experience
itself. This includes things like analytics, authentication, databases, configuration, file storage, push
messaging, and more. The services are hosted in the cloud and scale with little to no
effort on the part of the developer.
When I say “hosted in the cloud”, I mean that the products have backend components that are fully
maintained and operated by Google. Client SDKs provided by Firebase interact with these backend
services directly, with no need to establish any middleware between your app and the service. So, if
you’re using one of the Firebase database options, you typically write code to query the database in
your client app.
This is different from traditional app development, which typically involves writing both frontend
and backend software. The frontend code just invokes API endpoints exposed by the backend, and
the backend code actually does the work. However, with Firebase products, the traditional backend
is bypassed, putting the work into the client. Administrative access to each of these products is
provided by the Firebase console.
7. Products in the “build” group are these:
Authentication — user login and identity
Realtime Database — realtime, cloud hosted, NoSQL database
Cloud Firestore — realtime, cloud hosted, NoSQL database
Cloud Storage — massively scalable file storage
Cloud Functions — “serverless”, event driven backend
Firebase Hosting — global web hosting
ML Kit — SDK for common ML tasks
8. ML Kit
It is a mobile SDK that enables Android and iOS app developers to integrate advanced machine learning capabilities
into their apps with ease.
9. Why ML Kit?
Machine learning has become an integral part of mobile development. Big
companies like Uber, Facebook, and Microsoft rely heavily on machine learning for
their businesses. It helps them know their users better and provide them with a
better experience in their apps.
So, as mobile developers, it is important for us to integrate some kind of
intelligence into our apps for a better user experience.
ML Kit comes with a set of ready-to-use APIs for common use cases, and it takes
just a few lines of code to implement these APIs in your apps.
If ML Kit doesn’t have an API that suits your use case, it also provides
convenient APIs that help you use your custom TensorFlow Lite models in your
mobile apps.
10. It works both on-device and in the cloud
ML Kit APIs work both on the device and in the cloud.
The on-device APIs are designed to work fast with no internet connection.
Cloud-based APIs use Google Cloud Platform’s machine learning technology,
which gives more accurate results but requires an internet connection.
12. ML Kit’s Text Recognition provides both on-device and cloud-based APIs.
You can choose which one to use depending on your use case.
13. Text Recognition
Text Recognition is the process of detecting and recognising textual information
in images, videos, documents, and other sources.
There are many apps, like Google Translate, Google Keep, and CamScanner, which
use the power of text recognition to provide some awesome and useful features.
With ML Kit’s Text Recognition API, you can recognise text in any Latin-based
language (and more, with cloud-based text recognition).
14. ML Kit’s Text Recognition
ML Kit’s Text Recogniser segments text into blocks, lines, and elements.
A Block is a contiguous set of text lines, such as a paragraph or column.
A Line is a contiguous set of words on the same vertical axis.
An Element is a contiguous set of alphanumeric characters on the same vertical axis.
17. Step 1
Add Firebase to your Android project.
You can connect your Android app to Firebase using one of the following options:
1. Option 1: (recommended) Use the Firebase console setup workflow.
2. Option 2: Use the Android Studio Firebase Assistant (requires additional
configuration).
18. Option 1: Add Firebase using the Firebase
console
Step 1: Create a Firebase project.
Step 2: Register your app with Firebase.
Step 3: Add a Firebase configuration file.
Step 4: Add Firebase SDKs to your app.
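In practice, Steps 3 and 4 boil down to placing the google-services.json file (downloaded from the console) into your app/ directory and wiring up the Google services Gradle plugin. A sketch of the Gradle changes; the plugin version shown here is illustrative, so use whatever the console setup workflow currently recommends:

```groovy
// Project-level build.gradle
buildscript {
    repositories {
        google()  // Google's Maven repository
    }
    dependencies {
        // Processes app/google-services.json at build time; version is illustrative
        classpath 'com.google.gms:google-services:4.3.3'
    }
}

// App-level build.gradle — apply at the bottom of the file
apply plugin: 'com.google.gms.google-services'
```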
19. Option 2: Add Firebase using the Firebase
Assistant
Open your Android project in Android Studio.
Select Tools > Firebase to open the Assistant window.
Expand one of the listed Firebase products (for example, Analytics), then click the provided tutorial link (for
example, Log an Analytics event).
Click Connect to Firebase to register your app with an existing or new Firebase project and to
automatically add the necessary files and code to your Android project.
Check that your plugin and library versions are up-to-date.
Sync your app to ensure that all dependencies have the necessary versions.
Configure your Analytics data sharing settings in the Firebase console Project settings.
Enabling the sharing of Analytics data with other Firebase products is required to use Firebase products
like Firebase Predictions or Firebase A/B Testing.
Run your app to send verification to Firebase that you've successfully integrated Firebase.
20. Step 2
You need to include the ML Kit dependency in your app-level build.gradle file.
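For the Text Recognition API, the dependency might look like the following (the firebase-ml-vision artifact is the one documented for the FirebaseVision APIs; the version number is illustrative):

```groovy
// app/build.gradle
dependencies {
    // Firebase ML Vision: on-device and cloud vision APIs, including text recognition
    implementation 'com.google.firebase:firebase-ml-vision:24.0.3'
}
```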
21. Step 3:
Here we have to specify the ML models. For on-device APIs, you can
configure your app to automatically download the ML models after it is
installed from the Play Store. Otherwise, the model will be downloaded
the first time you run the on-device detector.
To enable this feature, you need to specify your models in your app’s
AndroidManifest.xml file.
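For text recognition (OCR), the manifest entry looks like this (per the Firebase ML Kit setup docs):

```xml
<application ...>
    <!-- Download the OCR model at install time instead of on first use -->
    <meta-data
        android:name="com.google.firebase.ml.vision.DEPENDENCIES"
        android:value="ocr" />
</application>
```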
22. Step 3 : Get! — the Image
ML Kit provides an easy way to recognise text from a variety of image types, such as
Bitmap, media.Image, ByteBuffer, byte[], or a file on the device. You just need to
create a FirebaseVisionImage object from one of these image types and
pass it to the model.
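Creating the FirebaseVisionImage might look like this (a sketch; `bitmap`, `context`, and `uri` are assumed to come from your own app code):

```kotlin
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// From a Bitmap already decoded in memory
val image = FirebaseVisionImage.fromBitmap(bitmap)

// Or directly from a file on the device (may throw IOException)
val imageFromFile = FirebaseVisionImage.fromFilePath(context, uri)
```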
23. Step 4 : Set! — the Model
Now it’s time to prepare our Text Recognition model.
ML Kit provides both on-device and cloud-based models for Text Recognition:
On-Device Model
Cloud-Based Model
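A minimal sketch of obtaining each recogniser, assuming the firebase-ml-vision SDK (class names per the Firebase ML Kit API):

```kotlin
import com.google.firebase.ml.vision.FirebaseVision

// On-device recogniser: fast, works offline, Latin-based languages
val detector = FirebaseVision.getInstance().onDeviceTextRecognizer

// Cloud-based recogniser: more accurate, more languages, needs connectivity
val cloudDetector = FirebaseVision.getInstance().cloudTextRecognizer
```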
24. Step 5 : Gooo!
Finally, we can pass our image to the model for Text Recognition.
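Passing the image to the detector might look like this (a sketch; `detector` and `image` are the objects prepared in the previous steps):

```kotlin
detector.processImage(image)
    .addOnSuccessListener { result ->
        // result is a FirebaseVisionText containing the recognised text
    }
    .addOnFailureListener { e ->
        // Recognition failed; inspect e for details
    }
```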
25. Step 6 : Extract the information
Voilà! That’s it!
If the text recognition was successful, you’ll get a FirebaseVisionText object in the
success listener. This FirebaseVisionText object contains all the textual information
present in the image.
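Inside the success listener, you can read the whole string at once or walk the block → line → element hierarchy described earlier. A sketch, assuming `result` is the FirebaseVisionText delivered to the listener:

```kotlin
val fullText = result.text           // all recognised text as one string

for (block in result.textBlocks) {   // paragraphs / columns
    for (line in block.lines) {      // lines of words
        for (element in line.elements) {  // individual words
            val word = element.text
            val box = element.boundingBox  // where the word sits in the image
        }
    }
}
```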
26. Advanced concepts
ML Kit is part of the Firebase ecosystem, and it contains a set of machine learning
model APIs that offer out-of-the-box models for face detection, barcode scanning, text
recognition, image labeling, smart reply, and language identification.
ML Kit also supports custom model integration (TensorFlow Lite models).
Even though ML Kit comes with these pre-trained models, there may be some special
use cases that you want to implement, or maybe you’ve already trained a model on
TensorFlow that you want to deploy to a mobile device.
For example, let’s say you have a health app which collects personal health and
diagnostics data, which is stored on the device (as opposed to in the cloud).
We could build a machine learning model that could offer health-related suggestions to
the user based on the user’s activity, provide diagnostic information to practitioners and
clients, predict better diagnoses, or even automatically recommend precautions.
28. What is TensorFlow Lite?
TensorFlow Lite is an open-source deep learning framework provided by
TensorFlow to build lightweight models for mobile devices. It allows you to run
trained models on both iOS and Android.
TensorFlow Lite consists of two main components:
The TensorFlow Lite interpreter, which runs specially optimized models on many
different hardware types, including mobile phones, embedded Linux devices, and
microcontrollers.
The TensorFlow Lite converter, which converts TensorFlow models into an
efficient form for use by the interpreter, and can introduce optimizations to
improve binary size and performance.
29. Converting our model to TensorFlow Lite with
tflite_convert
Starting with TensorFlow 1.9, model conversion works through the
TFLiteConverter. Before that, it was called TOCO, or “TensorFlow Lite Optimizing
Converter”. This tool is used to optimize TensorFlow graphs to run on mobile
devices.
TensorFlow models are serialized as protocol buffers (protobufs), whereas TensorFlow Lite models use
FlatBuffers. This is why we need a conversion tool.
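With the standalone CLI, converting a SavedModel might look like this (the paths are illustrative):

```shell
# Convert a TensorFlow SavedModel into a TensorFlow Lite FlatBuffer
tflite_convert \
    --saved_model_dir=/tmp/my_saved_model \
    --output_file=/tmp/model.tflite
```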
30. Embedding the .tflite/.lite model into our app
Now that our custom TensorFlow Lite model is ready, let's integrate it into an
Android app and use it with ML Kit. To use ML Kit, we’ll need to create a Firebase
project (Firebase console).
Add the .tflite file and labels.txt to the assets directory.
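Loading the bundled model from assets might then look like this (a sketch using the firebase-ml-model-interpreter class names; the asset file name is illustrative):

```kotlin
// In app/build.gradle, keep the model uncompressed in the APK:
// android { aaptOptions { noCompress "tflite" } }

import com.google.firebase.ml.custom.FirebaseCustomLocalModel
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelInterpreterOptions

// Register the bundled model with ML Kit by its asset path
val localModel = FirebaseCustomLocalModel.Builder()
    .setAssetFilePath("model.tflite")
    .build()

// The interpreter is what actually runs inference on the model
val interpreter = FirebaseModelInterpreter.getInstance(
    FirebaseModelInterpreterOptions.Builder(localModel).build()
)
```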