I am integrating a YOLOv11n object detection model (exported to TFLite float16) into my React Native app using react-native-fast-tflite. The model loads successfully, but the predictions returned by runSync() all have extremely low confidence scores (near zero).
Model Details:
Base Model: YOLOv11n
Format: TFLite float16 (an int8 export gives the same result)
Input shape: [1, 320, 320, 3]
Output shape: [1, 8, 2100]
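For clarity, this is how I am interpreting the [1, 8, 2100] output: a flat, channels-first buffer where channels 0-3 are the box coordinates (cx, cy, w, h) and channels 4-7 are the raw scores for my 4 classes. This layout is my assumption based on the Ultralytics export, and the helper names below are just for illustration:

```typescript
// Assumed layout of the flat [1, 8, 2100] output (channels-first):
// value(channel, anchor) = data[channel * NUM_ANCHORS + anchor]
// channels 0-3: box cx, cy, w, h; channels 4-7: per-class scores.
const NUM_ANCHORS = 2100;
const NUM_CLASSES = 4;

function outputAt(data: Float32Array, channel: number, anchor: number): number {
  return data[channel * NUM_ANCHORS + anchor];
}

// Collect the 4 class scores for a single anchor.
function classScores(data: Float32Array, anchor: number): number[] {
  const scores: number[] = [];
  for (let c = 0; c < NUM_CLASSES; c++) {
    scores.push(outputAt(data, 4 + c, anchor));
  }
  return scores;
}
```

If this layout assumption is wrong (e.g. the tensor is actually anchors-first, [1, 2100, 8]), that alone could explain the garbage scores, so please correct me.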
const model = useTensorflowModel(
  require('@/assets/best_float16.tflite'),
  'android-gpu'
);
// unwrap the plugin's state object once loading has finished
const actualModel = model.state === 'loaded' ? model.model : undefined;
runAtTargetFps(8, () => {
  'worklet'
  const resized = resize(frame, {
    dataType: 'float32',
    pixelFormat: 'rgb',
    scale: { width: 320, height: 320 },
  });

  const outputs = actualModel.runSync([resized]);
  const detections = outputs[0]; // flat buffer, length = 8 * 2100

  // channels 4-7 should be the per-class scores for my 4 classes
  for (let i = 0; i < 2100; i++) {
    for (let c = 0; c < 4; c++) {
      const rawScore = detections[(4 + c) * 2100 + i];
      console.log('Raw score: ', rawScore);
    }
    console.log('End of anchor ', i);
  }
});
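To avoid flooding the log, I also summarized the single highest raw score over all anchors, using the same (4 + c) * 2100 + i indexing as the loop above (function name and defaults are mine, for illustration):

```typescript
// Scan every anchor/class pair and keep the single highest raw score.
// Uses the same channels-first indexing as the logging loop:
// score(class c, anchor i) = detections[(4 + c) * numAnchors + i]
function maxClassScore(
  detections: Float32Array,
  numAnchors: number = 2100,
  numClasses: number = 4
): { score: number; anchor: number; cls: number } {
  let best = { score: -Infinity, anchor: -1, cls: -1 };
  for (let i = 0; i < numAnchors; i++) {
    for (let c = 0; c < numClasses; c++) {
      const s = detections[(4 + c) * numAnchors + i];
      if (s > best.score) {
        best = { score: s, anchor: i, cls: c };
      }
    }
  }
  return best;
}
```

Even this per-frame maximum stays in the same near-zero range as the raw values logged below.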
The log is:
LOG Raw score: 0.000007644203833478969
LOG Raw score: 2.3472063048757263e-7
LOG Raw score: 3.0208294532485525e-8
LOG Raw score: 0.0000011134524129374768
Why are all the scores near zero, and how should I correctly preprocess the input and decode this output?