False positives after converting YOLOv8 to .tflite

Hi everybody! New to Ultralytics, but so far it’s been an amazing tool. Thank you to the team behind the YOLO models!

Some context: we are trying to improve the object detection in our React Native app, which uses react-native-fast-tflite to load and run our model. My question isn’t about React Native specifically, but using a .tflite file is a requirement.

To convert a YOLO model to .tflite, I have a simple Python script that downloads YOLOv8s and exports it to the .tflite format, which we then drop into our React Native app:

from ultralytics import YOLO

model = YOLO("yolov8s.pt")  # downloads the pretrained checkpoint on first use
model.export(format="tflite")

When running our model, we just get completely random detection boxes. I’ve reproduced what we’re seeing in a codepen here, using tfjs-tflite.

I believe we’re doing everything correctly… I’ve pored over every piece of documentation, relevant GitHub issues, and discussion forums. Our process for interpreting the output seems correct, but I must be missing something. The codepen is full of comments explaining what we’re doing and why, but at a high level, this is what we do in the codepen (which mimics what we do in the React Native JavaScript logic):

  1. Load model using tflite.loadTFLiteModel
  2. Prepare input tensor data using tf.browser.fromPixels(imageElement)
  3. Normalize pixel values to [-1, 1] using const input = tf.sub(tf.div(tf.expandDims(tensor), 127.5), 1) (note that we have tried [0, 1] with similarly inaccurate results)
  4. Run model prediction using tfliteModel.predict(input), and grab result using await outputTensor.data()
  5. Parse the results and store them in a “boxes” array; a condensed sketch of our parsing follows this list (this might be where we messed things up…?)
  6. Run NMS on the “boxes” array, and draw the results to a canvas
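
To make step 5 concrete, here’s a condensed sketch of what the codepen does. It’s simplified, and the [1, 84, 8400] output shape, the 640×640 input size, and the 0.25 score threshold are assumptions based on our reading of the docs, so corrections welcome:

// Steps 2-5 condensed. Assumes a 640x640 export with 80 classes, so the
// output is [1, 84, 8400]: rows 0-3 are cx, cy, w, h; rows 4-83 are class scores.
const pixels = tf.browser.fromPixels(imageElement);
const resized = tf.image.resizeBilinear(pixels, [640, 640]);
const input = tf.sub(tf.div(tf.expandDims(resized), 127.5), 1); // [-1, 1]

const outputTensor = tfliteModel.predict(input) as tf.Tensor;
const data = await outputTensor.data(); // flat Float32Array, 84 * 8400 long

const numClasses = 80;
const numAnchors = 8400;
const boxes: { x: number; y: number; w: number; h: number; score: number; cls: number }[] = [];
for (let i = 0; i < numAnchors; i++) {
  // Attribute-major layout: attribute j of candidate i sits at data[j * numAnchors + i]
  const x = data[0 * numAnchors + i];
  const y = data[1 * numAnchors + i];
  const w = data[2 * numAnchors + i];
  const h = data[3 * numAnchors + i];
  let score = 0;
  let cls = -1;
  for (let c = 0; c < numClasses; c++) {
    const s = data[(4 + c) * numAnchors + i];
    if (s > score) { score = s; cls = c; }
  }
  if (score > 0.25) boxes.push({ x, y, w, h, score, cls }); // assumed confidence cutoff
}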

Any help would be hugely appreciated. Our team is building a tool to help tradespeople identify tools and materials on a job site, and find the nearest distributor that supplies these tools. Thank you so much!

Drew

Hi Drew! :blush:

It’s great to hear you’re finding Ultralytics helpful! Let’s see if we can resolve the issue with your .tflite model.

Firstly, ensure that the model conversion process is correctly configured. When exporting to .tflite, make sure the image size and input normalization you use at inference match what the model expects. Here’s a quick checklist:

  1. Image Size: Ensure the input image size during inference matches the size used during training. You can specify this during export with imgsz.

  2. Normalization: Ultralytics YOLO models expect pixel values scaled to [0, 1] (i.e. divided by 255). Normalizing to [-1, 1] shifts every input the network sees and can easily produce garbage boxes, so use const input = tf.div(tf.expandDims(tensor), 255) instead.

  3. Output Parsing: Double-check the parsing logic for the model’s output. A YOLOv8 detection export typically produces a [1, 84, 8400] tensor (4 box values plus 80 class scores for 8400 candidates), laid out attribute-major, so attribute j of candidate i lives at index j * 8400 + i in the flat buffer. Also note that TFLite exports generally emit box coordinates normalized to [0, 1], so scale them back up by your image size before drawing.

  4. Non-Maximum Suppression (NMS): Ensure your NMS implementation is correctly configured to filter out overlapping boxes of the same class; a minimal sketch follows this list.
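
As a reference for point 4, here’s a minimal greedy NMS sketch in the same JavaScript/TypeScript style as your codepen. The Box shape matches the “boxes” array from your step 5, and the 0.45 IoU threshold is just a common default, not something your setup requires:

interface Box { x: number; y: number; w: number; h: number; score: number; cls: number }

function iou(a: Box, b: Box): number {
  // Boxes are centre-based xywh; convert to corner coordinates first
  const ax1 = a.x - a.w / 2, ay1 = a.y - a.h / 2, ax2 = a.x + a.w / 2, ay2 = a.y + a.h / 2;
  const bx1 = b.x - b.w / 2, by1 = b.y - b.h / 2, bx2 = b.x + b.w / 2, by2 = b.y + b.h / 2;
  const iw = Math.max(0, Math.min(ax2, bx2) - Math.max(ax1, bx1));
  const ih = Math.max(0, Math.min(ay2, by2) - Math.max(ay1, by1));
  const inter = iw * ih;
  return inter / (a.w * a.h + b.w * b.h - inter);
}

function nms(boxes: Box[], iouThreshold = 0.45): Box[] {
  const sorted = [...boxes].sort((p, q) => q.score - p.score); // highest confidence first
  const kept: Box[] = [];
  for (const candidate of sorted) {
    // Keep a box only if no better box of the same class overlaps it too much
    const suppressed = kept.some(
      k => k.cls === candidate.cls && iou(k, candidate) >= iouThreshold
    );
    if (!suppressed) kept.push(candidate);
  }
  return kept;
}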

Here’s a refined export example:

from ultralytics import YOLO

model = YOLO("yolov8s.pt")
model.export(format="tflite", imgsz=640)  # 640 for the pretrained yolov8s; feed the same size at inference
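The export typically writes the model to yolov8s_saved_model/yolov8s_float32.tflite; that’s the file to drop into your React Native project.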

For more detailed guidance, you might find the YOLOv8 Export Documentation helpful.

If the issue persists, consider testing the .tflite model in a simple Python environment using TensorFlow Lite to isolate whether the problem is with the model or the integration with React Native.
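
If Python isn’t handy, a quick shape check with tfjs-tflite (the same library your codepen uses) can also confirm the export looks sane. The expected shapes below assume a standard 640×640, 80-class export, and the inputs/outputs properties come from the InferenceModel interface that tfjs-tflite models implement:

// Print the model's I/O signatures before debugging anything else.
const model = await tflite.loadTFLiteModel('yolov8s_float32.tflite');
console.log(model.inputs);  // expect shape [1, 640, 640, 3]
console.log(model.outputs); // expect shape [1, 84, 8400] for 80 classes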

Feel free to reach out if you have more questions. Best of luck with your project—it’s a fantastic initiative! :rocket: