
Using a TensorFlow Pretrained Model to Detect Everyday Objects

FTC Engineering edited this page Oct 29, 2020 · 21 revisions

Introduction

Teams have the option of using a custom TensorFlow object detection model to detect objects other than the current season's game elements. This tutorial demonstrates how to use a pretrained model to look for and track everyday objects. It uses the OnBot Java programming tool, but teams can also implement this example with Android Studio or the Blocks programming tool. This is an advanced topic; it assumes that the user has good programming knowledge and is familiar with Android devices and computer technology.

This tutorial also assumes that you have already completed the steps in a previous TensorFlow tutorial. If you have not, please do so before continuing with this tutorial.

Downloading the Pretrained Model and Label Map

The custom inference model must be in the form of a TensorFlow Lite (.tflite) file. For this example, we will use the same object detection model that is used in Google's example TensorFlow Object Detection Android app.

The model and its corresponding label map can be downloaded from this link.

The .zip archive contains a file called "detect.tflite". This TensorFlow Lite file is the inference model that TensorFlow will use to recognize the everyday objects. It is based on the MobileNet architecture, which was designed to provide low-latency recognitions while still maintaining reasonable recognition accuracy.

The .zip archive also contains a text file called "labelmap.txt". This text file contains a list of labels that correspond to the known objects in the "detect.tflite" model file.
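When you later wire the model into an op mode, you will need these labels in the same order the model outputs them. As a minimal sketch (plain Java, independent of the FTC SDK; the file path used in main is hypothetical), the label map can simply be read line by line:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class LabelMapReader {
    // Reads a TensorFlow label map: one label per line, in model output order.
    // Note: in some label maps the first entry is a "???" placeholder for the
    // background class.
    public static List<String> readLabels(Path labelFile) throws IOException {
        return Files.readAllLines(labelFile);
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical local path; on the Robot Controller the file lives in
        // /sdcard/FIRST/tflitemodels/labelmap.txt.
        List<String> labels = readLabels(Path.of("labelmap.txt"));
        System.out.println("Label map contains " + labels.size() + " entries");
    }
}
```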

Download the .zip archive to your laptop and uncompress its contents to a folder.

Transferring the files to the Robot Controller

For this example, we want to transfer the .tflite and labelmap files to a directory on your robot controller.

If you are an advanced user and are familiar with using adb, then you can use adb to push the files to the directory "/sdcard/FIRST/tflitemodels" on your robot controller.
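For reference, a typical adb session might look like the following (this assumes adb is on your PATH, the device is connected with USB debugging enabled, and the archive was unzipped into the current directory; the filenames come from the archive described above):

```shell
# Create the target directory on the Robot Controller (it may already exist)
adb shell mkdir -p /sdcard/FIRST/tflitemodels

# Push the model and its label map to the Robot Controller
adb push detect.tflite /sdcard/FIRST/tflitemodels/
adb push labelmap.txt /sdcard/FIRST/tflitemodels/

# Verify that both files arrived
adb shell ls /sdcard/FIRST/tflitemodels
```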

Transferring Using the Windows Explorer

If you do not know how to use the Android Debug Bridge tool, then you can use the File Explorer on a Windows laptop to copy and paste the files to this directory.

If you are using an Android phone as your robot controller, then connect the phone to your laptop using a USB cable. Swipe down from the top of the phone's screen to display the menu. Look for an item that indicates the USB mode that the phone is currently in. By default, most phones will be in a charging mode.

Tap on the Android System item in the menu to get details on the current USB mode.

/images/Using-a-TensorFlow-Pretrained-Model-to-Detect-Everyday-Objects/selectAndroidSystemItem.png
Tap on the Android System item.

Tap where it says "Tap for more options" to display an activity that will allow you to switch the phone into file transfer mode.

/images/Using-a-TensorFlow-Pretrained-Model-to-Detect-Everyday-Objects/tapForMoreOptions.png
Tap to display the USB mode options.

Select the "Transfer files" mode and the phone should now appear as a browsable storage device in your computer's Windows Explorer.

/images/Using-a-TensorFlow-Pretrained-Model-to-Detect-Everyday-Objects/selectTransferFiles.png
Select the "Transfer files" option.

If you are using a REV Robotics Control Hub, connect the Control Hub (powered on by a fully charged 12V battery) to your laptop using a USB Type C cable, and the Control Hub will automatically appear as a browsable storage device in your computer's Windows Explorer. You do not need to switch it to "Transfer files" mode, since it is always in this mode.

Use your Windows Explorer to locate and copy the "detect.tflite" and "labelmap.txt" files.

/images/Using-a-TensorFlow-Pretrained-Model-to-Detect-Everyday-Objects/copyFiles.png
Copy the .tflite and .txt files.

Use Windows Explorer to browse the internal shared storage of your Android device and navigate to the FIRST->tflitemodels directory. Paste the "detect.tflite" and "labelmap.txt" files to this directory.

/images/Using-a-TensorFlow-Pretrained-Model-to-Detect-Everyday-Objects/tfliteModelsFolder.png
Navigate to FIRST->tflitemodels and paste the two files in this directory.

Now the files are where we want them to be for this example.
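If you want to fail fast when the files are missing, a small sanity check can confirm that both files are in place before an op mode tries to load them. The sketch below is plain Java (no FTC SDK calls); the directory constant matches the path used in this tutorial:

```java
import java.io.File;

public class TFLiteModelCheck {
    // Directory used by this tutorial on the Robot Controller.
    static final String MODEL_DIR = "/sdcard/FIRST/tflitemodels";

    // Returns true only if both the model and its label map are present.
    public static boolean modelFilesPresent(File dir) {
        return new File(dir, "detect.tflite").isFile()
                && new File(dir, "labelmap.txt").isFile();
    }

    public static void main(String[] args) {
        File dir = new File(MODEL_DIR);
        if (modelFilesPresent(dir)) {
            System.out.println("Model files found in " + dir);
        } else {
            System.out.println("Missing detect.tflite or labelmap.txt in " + dir);
        }
    }
}
```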
