Hi everybody! New to ultralytics, but so far it’s been an amazing tool. Thank you to the team behind the YOLO models!
Some context: we are trying to improve object detection in our React Native app, which uses react-native-fast-tflite to load and run our model. My question isn’t about React Native specifically, but using a .tflite file is a requirement.
To convert a YOLO model to .tflite, I have a simple Python script that downloads yolov8s and converts it to the .tflite format, which we then just drop into our React Native app:
```python
from ultralytics import YOLO

model = YOLO("yolov8s")
model.export(format="tflite")
```
When running our model, we just get completely random detection boxes. I’ve reproduced what we’re seeing in a codepen here, using tfjs-tflite.
I believe we’re doing everything correctly… I’ve pored over every piece of documentation, every relevant GitHub issue, and the discussion forums. Our process of interpreting the output seems correct, but I must be missing something. The codepen is full of comments explaining what we’re doing and why, but at a high level, this is what we do in the codepen (which mirrors what we do in the React Native JavaScript logic):
- Load the model using `tflite.loadTFLiteModel`
- Prepare the input tensor data using `tf.browser.fromPixels(imageElement)`
- Normalize pixel values to [-1, 1] using `const input = tf.sub(tf.div(tf.expandDims(tensor), 127.5), 1)` (note that we have tried [0, 1] with similarly inaccurate results)
- Run model prediction using `tfliteModel.predict(input)`, and grab the result using `await outputTensor.data()`
- Parse the results and store them in a “boxes” array (this might be where we messed things up…?)
- Run NMS on the “boxes” array, and draw the results to a canvas
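As a sanity check on the normalization step above (this is just the arithmetic, not our actual codepen code), both mappings behave as expected at the endpoints; the real question is which one the exported model expects — as far as I can tell, Ultralytics’ own Python inference scales pixels to [0, 1]:

```python
# Endpoint check for the two pixel normalizations discussed above
# (illustrative only; not the codepen code).

def to_signed_unit(v):
    """[0, 255] -> [-1, 1], i.e. v / 127.5 - 1 (what we currently do)."""
    return v / 127.5 - 1.0

def to_unit(v):
    """[0, 255] -> [0, 1], i.e. v / 255 (what the Ultralytics Python pipeline appears to use)."""
    return v / 255.0
```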
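For reference, here is a minimal, dependency-free sketch of the decode + NMS logic our JavaScript mirrors, written in Python for readability. It assumes the stock yolov8s layout — output shape `[1, 84, 8400]` (84 = 4 box values cx/cy/w/h plus 80 class scores), flattened row-major, coordinates normalized to [0, 1] — and the thresholds and function names are illustrative, not copied from our codepen:

```python
def decode(flat, num_classes=80, num_anchors=8400, conf_thres=0.25):
    """Turn the flat output buffer into [x1, y1, x2, y2, score, class] boxes.

    Channel c for anchor a lives at flat[c * num_anchors + a], i.e. the
    [84, 8400] matrix flattened row-major.
    """
    stride = num_anchors
    boxes = []
    for a in range(num_anchors):
        cx, cy = flat[0 * stride + a], flat[1 * stride + a]
        w, h = flat[2 * stride + a], flat[3 * stride + a]
        # Best class score for this anchor.
        best_c, best_s = 0, 0.0
        for c in range(num_classes):
            s = flat[(4 + c) * stride + a]
            if s > best_s:
                best_c, best_s = c, s
        if best_s >= conf_thres:
            # Convert center/size to corner coordinates.
            boxes.append([cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2, best_s, best_c])
    return boxes


def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2, ...] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)


def nms(boxes, iou_thres=0.45):
    """Greedy per-class non-maximum suppression, highest score first."""
    boxes = sorted(boxes, key=lambda b: b[4], reverse=True)
    kept = []
    for b in boxes:
        if all(iou(b, k) < iou_thres for k in kept if k[5] == b[5]):
            kept.append(b)
    return kept
```

If the drawn boxes look random rather than merely shifted, my understanding is that the first things to double-check are this channel ordering (some exports emit `[1, 8400, 84]` instead) and whether the coordinates are normalized or in pixels.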
Any help would be hugely appreciated. Our team is building a tool to help tradespeople identify tools and materials on a job site, and find the nearest distributor that supplies these tools. Thank you so much!
Drew