How can I set up training to display the results for all classes (precision, recall, mAP, loss, etc.) every epoch?
No guarantees that this will work or look nice (I haven't tried it personally, and even if it does work I'm not sure how well), but you can modify the file `ultralytics/models/yolo/{TASK}` for whichever task you're training on: there's a `print_results()` method you can change. For `detect`, remove the `and not self.training` condition from this line, which should make it report the per-class metrics every epoch.
This is what the full line change would look like to accomplish what you're after:

```diff
- if self.args.verbose and not self.training and self.nc > 1 and len(self.stats):
+ if self.args.verbose and self.nc > 1 and len(self.stats):
```
If I were looking to do this, I might instead try to write this information to a file, and probably only for one metric. The first thought that comes to mind is to generate a CSV with classes as columns and epoch numbers as rows (only for one metric), so you can easily view (or plot) the progress per class across epochs.
| n | Class-0 | Class-1 | Class-2 | … | Class-N |
|---|---------|---------|---------|---|---------|
| 0 |         |         |         |   |         |
| 1 |         |         |         |   |         |
| 2 |         |         |         |   |         |
| … |         |         |         |   |         |
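A minimal stdlib-only sketch of writing that layout, where `append_metric_row` is a hypothetical helper name (not part of `ultralytics`) that you'd call once per epoch with whichever metric you chose:

```python
import csv
from pathlib import Path

def append_metric_row(path, epoch, values, class_names):
    """Append one row of per-class values (e.g. mAP50-95) for this epoch.

    Writes the header row (n, Class-0, ..., Class-N) only when the file
    doesn't exist yet, so repeated calls build the table epoch by epoch.
    """
    path = Path(path)
    new_file = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["n", *class_names])
        writer.writerow([epoch, *values])

# Example usage with placeholder values, once per epoch:
# append_metric_row("per_class_map.csv", epoch, maps, names)
```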
Additionally, you could create a function for the `on_fit_epoch_end` callback of the `BaseTrainer` class. This might be a cleaner way to inject a step that writes a file with per-class metrics every epoch. The docs have a page about callbacks that includes an example of how to configure a custom one (for prediction, but it would operate in a similar way for training). Alternatively, you could use a custom callback for the `on_val_end` callback of the `BaseValidator` class, which could be useful if you always want to save the per-class metrics any time validation is called.
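A hedged sketch of what that callback could look like. The attribute paths here (`trainer.validator.metrics.box.maps` for per-class mAP50-95, `trainer.validator.names` for the class-name mapping) are assumptions based on how I recall the detect validator exposing its metrics; verify them against your installed `ultralytics` version before relying on this:

```python
import csv

def save_per_class_map(trainer):
    # ASSUMPTION: after validation, the trainer's validator holds a metrics
    # object with per-class mAP50-95 in metrics.box.maps, and a names dict
    # mapping class index -> class name. Check these against your version.
    maps = trainer.validator.metrics.box.maps
    names = trainer.validator.names
    with open("per_class_map.csv", "a", newline="") as f:
        writer = csv.writer(f)
        if trainer.epoch == 0:  # write the header once, on the first epoch
            writer.writerow(["epoch", *names.values()])
        writer.writerow([trainer.epoch, *maps])

# Registering it (commented out so this sketch stays self-contained):
# from ultralytics import YOLO
# model = YOLO("yolov8n.pt")
# model.add_callback("on_fit_epoch_end", save_per_class_map)
# model.train(data="coco128.yaml", epochs=10)
```

Swapping the registration string to `"on_val_end"` would give you the `BaseValidator` variant described above, so the CSV also grows whenever you run standalone validation.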