Real-time YOLOv3 object detection fused with PID-controlled flight and a robust state machine — powering the intelligence behind every Crossm drone.
The drone operates as a finite state machine — click any state to explore it, or enable auto-cycle to watch a full mission unfold.
Four tightly-coupled subsystems working in concert to enable full autonomous flight.
YOLOv3-tiny runs real-time inference on every camera frame, outputting bounding boxes and confidence scores for 80 object classes.
A 3-axis PID controller converts pixel-space target error into NED-frame velocity commands with anti-windup protection.
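As a hedged sketch of that idea (the class, gains, and limits below are illustrative, not taken from the Crossm codebase), a single-axis PID with integral clamping for anti-windup might look like:

```python
class AxisPID:
    """Single-axis PID: pixel-space error in, velocity command out (m/s)."""

    def __init__(self, kp, ki, kd, out_limit, i_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit  # saturate the velocity command
        self.i_limit = i_limit      # anti-windup: clamp the integral term
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Clamp the accumulated integral so a long saturation phase
        # cannot wind it up far beyond what the output limit allows.
        self.integral = max(-self.i_limit,
                            min(self.i_limit, self.integral + error * dt))
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))
```

A 3-axis controller would run three such instances — one per NED velocity component — each fed the target's pixel offset from the frame centre (or an altitude error for the vertical axis).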
A state machine orchestrates the mission phases. Transitions are triggered by sensor data — altitude checks, detection presence, and timers.
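One common way to express such sensor-triggered transitions is a guarded transition table; the state names, guards, and sensor keys below are illustrative assumptions, not the Crossm implementation:

```python
from enum import Enum, auto

class State(Enum):
    TAKEOFF = auto()
    SEARCH = auto()
    TRACK = auto()
    LAND = auto()

# Transition table: (current state, guard on sensor data, next state).
TRANSITIONS = [
    (State.TAKEOFF, lambda s: s["altitude"] >= 0.95 * s["target_alt"], State.SEARCH),
    (State.SEARCH,  lambda s: s["detection"] is not None,              State.TRACK),
    (State.TRACK,   lambda s: s["detection"] is None,                  State.SEARCH),
    (State.TRACK,   lambda s: s["mission_time"] > s["time_limit"],     State.LAND),
]

def step(state, sensors):
    """Fire the first transition whose guard passes; otherwise stay put."""
    for src, guard, dst in TRANSITIONS:
        if src is state and guard(sensors):
            return dst
    return state
```

Keeping the guards as plain predicates over a sensor snapshot makes each transition easy to unit-test without flying anything.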
DroneKit wraps MAVLink to send velocity commands, yaw rotations, and arm/takeoff sequences to real or SITL-simulated vehicles.
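The velocity-command path can be illustrated without a vehicle attached. DroneKit scripts typically send a SET_POSITION_TARGET_LOCAL_NED message whose type_mask tells the autopilot to honour only the velocity fields; the bit constants below follow the MAVLink POSITION_TARGET_TYPEMASK definition, and the commented send sketch assumes a connected dronekit Vehicle:

```python
# POSITION_TARGET_TYPEMASK bits (MAVLink common message set):
# a set bit means "ignore this field".
IGNORE_PX, IGNORE_PY, IGNORE_PZ = 1 << 0, 1 << 1, 1 << 2
IGNORE_VX, IGNORE_VY, IGNORE_VZ = 1 << 3, 1 << 4, 1 << 5
IGNORE_AX, IGNORE_AY, IGNORE_AZ = 1 << 6, 1 << 7, 1 << 8
FORCE_SET = 1 << 9
IGNORE_YAW, IGNORE_YAW_RATE = 1 << 10, 1 << 11

# Honour only vx/vy/vz: ignore position, acceleration, yaw, and yaw rate.
VELOCITY_ONLY_MASK = (IGNORE_PX | IGNORE_PY | IGNORE_PZ |
                      IGNORE_AX | IGNORE_AY | IGNORE_AZ |
                      FORCE_SET | IGNORE_YAW | IGNORE_YAW_RATE)

# With a connected DroneKit Vehicle, the command would go out roughly as:
#   msg = vehicle.message_factory.set_position_target_local_ned_encode(
#       0, 0, 0, mavutil.mavlink.MAV_FRAME_LOCAL_NED, VELOCITY_ONLY_MASK,
#       0, 0, 0, vx, vy, vz, 0, 0, 0, 0, 0)
#   vehicle.send_mavlink(msg)
```

The resulting mask, 0b0000111111000111, is the velocity-only value widely used in DroneKit guided-mode examples; the same message works identically against a SITL-simulated vehicle.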
YOLOv3-tiny processes each camera frame in milliseconds. Non-max suppression filters redundant detections, leaving clean, confident bounding boxes.
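Non-max suppression itself is compact enough to sketch in pure Python — a dependency-free stand-in for OpenCV's cv2.dnn.NMSBoxes, with illustrative box format and thresholds:

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, score_thr=0.5, iou_thr=0.4):
    """Greedy NMS: keep the highest-scoring box, drop overlapping rivals."""
    order = sorted((i for i, s in enumerate(scores) if s >= score_thr),
                   key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thr for j in keep):
            keep.append(i)
    return keep
```

Two near-duplicate boxes around the same object collapse to the single highest-confidence one, while well-separated detections survive untouched.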
Runtime parameters are configured in settings.yaml, and the PID controller centres each detection in the frame. Inference runs through cv2.dnn — no GPU required: blob pre-processing at 416×416, then a forward pass through the two YOLO output layers. Open-source, battle-tested tools form the backbone of the autonomous pipeline.