YOLOv8 classification problem

I am a beginner, and I am currently using YOLOv8 for classification tasks. I would like the results for each class in the model I trained to be independent of one another during inference, meaning that the probabilities are calculated separately and their sum does not have to equal one. Through searching, I have learned that I need to change the activation function from Softmax to Sigmoid, but I am unsure where to start. I hope someone can help and guide me on how to achieve this.

You need to change this line to x.sigmoid()
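For context, the line in question lives in the Classify head of the ultralytics package (ultralytics/nn/modules/head.py in recent versions; the exact path and layer details can differ between releases). A simplified sketch of the head with the return statement changed looks roughly like this:

```python
import torch
import torch.nn as nn

# Simplified sketch of the Classify head from ultralytics/nn/modules/head.py.
# Layer names and arguments are approximations; check your installed version.
class Classify(nn.Module):
    def __init__(self, c1, c2):
        super().__init__()
        c_ = 1280  # hidden channels used before the final linear layer
        self.conv = nn.Conv2d(c1, c_, 1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.drop = nn.Dropout(0.0)
        self.linear = nn.Linear(c_, c2)

    def forward(self, x):
        if isinstance(x, list):
            x = torch.cat(x, 1)
        x = self.linear(self.drop(self.pool(self.conv(x)).flatten(1)))
        # Before: return x if self.training else x.softmax(1)
        # After: sigmoid gives per-class probabilities that do not have to sum to 1
        return x if self.training else x.sigmoid()
```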

Thank you very much for your response

Glad that helped!

One small clarification that may be useful as you keep experimenting: changing that line in the classification head from Softmax to Sigmoid only affects the inference output, not how the model is trained. During training, YOLO’s classification head returns raw logits and the loss is still standard cross‑entropy (single‑label, so the classes compete with each other). The Softmax or Sigmoid is only applied at inference time, to make the raw outputs easier to interpret as probabilities.
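To see the difference concretely, here is a tiny standalone comparison in plain PyTorch (no YOLO code involved), using made‑up logits:

```python
import torch

logits = torch.tensor([2.0, 0.5, -1.0])  # raw outputs like the head produces during training

softmax_probs = logits.softmax(0)  # classes compete: ~[0.786, 0.175, 0.039], sums to 1
sigmoid_probs = logits.sigmoid()   # independent: ~[0.881, 0.622, 0.269], no sum constraint

print(softmax_probs, softmax_probs.sum())  # sum is exactly 1
print(sigmoid_probs, sigmoid_probs.sum())  # sum can be anything between 0 and num_classes
```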

If you truly need independent multi‑label probabilities (an image can belong to several classes at once), you would also need to:

  • change the loss to something like BCEWithLogitsLoss
  • prepare labels as multi‑hot vectors (multiple 1s per sample)

That is more involved and not fully plug‑and‑play in the current classification pipeline.
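If you do decide to go that route, here is a minimal standalone sketch of those two pieces with a made‑up model and made‑up labels (plain PyTorch, not the Ultralytics trainer), just to illustrate the loss and the multi‑hot label format:

```python
import torch
import torch.nn as nn

num_classes = 5
batch_size = 4

# Hypothetical stand-in model producing raw logits (no softmax/sigmoid inside).
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, num_classes))

images = torch.randn(batch_size, 3, 32, 32)

# Multi-hot targets: each sample may belong to several classes at once.
targets = torch.tensor([
    [1., 0., 1., 0., 0.],  # this sample belongs to classes 0 and 2
    [0., 1., 0., 0., 0.],
    [1., 1., 0., 1., 0.],
    [0., 0., 0., 0., 1.],
])

criterion = nn.BCEWithLogitsLoss()  # applies a sigmoid internally, per class

logits = model(images)              # shape (batch_size, num_classes), raw logits
loss = criterion(logits, targets)
loss.backward()

# At inference time, apply sigmoid yourself to get independent probabilities
# and threshold each class separately instead of taking a single argmax.
with torch.no_grad():
    probs = model(images).sigmoid()
    predicted = probs > 0.5
```

Wiring this into the actual YOLOv8 classification trainer and dataset is the part that is more involved, since its loss and label handling currently assume single‑label cross‑entropy.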

For a quick conceptual refresher on what you just changed, the articles on the Sigmoid function and the Softmax activation explain the difference in more detail.

Thank you very much. I have replaced the loss function.
