The example uses a JPG image, but how can I get the data feed from a camera that is already connected to my Raspberry Pi? I’ve read that you can’t do it directly and need to assign the camera an IP address, then have the model read the stream directly from that IP address.
But for some reason, I am not able to get my code working with the camera without my Hailo-8L module, so I think I compiled the Ultralytics model for the Hailo-8L module instead of for running it directly.
Sounds like your model was compiled for the Hailo‑8L, so it won’t run with the standard Ultralytics runtime. To use the Pi camera directly (without the module), run a regular YOLO11 .pt or export to NCNN and feed frames from picamera2.
Quick check: test the camera first with rpicam-hello (it shows a 5‑second preview). If that works, use picamera2 and pass frames to YOLO as in the Raspberry Pi camera guide:
import cv2
from picamera2 import Picamera2
from ultralytics import YOLO
picam2 = Picamera2()
picam2.preview_configuration.main.size = (1280, 720)
picam2.preview_configuration.main.format = "RGB888"
picam2.preview_configuration.align()
picam2.configure("preview")
picam2.start()
model = YOLO("yolo11n.pt") # use a standard .pt or your trained .pt
while True:
    frame = picam2.capture_array()
    res = model(frame)[0]
    cv2.imshow("YOLO", res.plot())
    if cv2.waitKey(1) == ord("q"):
        break
cv2.destroyAllWindows()
Alternatively, stream the camera with rpicam-vid and point the model at the TCP URL.
@pderrenger Sorry for the late reply.
When I call rpicam-vid -n -t 0 --inline --listen -o tcp://127.0.0.1:8888, it seems I am only able to view a single frame. I am viewing the stream with VLC.