VADR-TS-001 compliance, training strategy, submission pipeline • Grounded in the official spec • v1.0 — 2026-03-31
Every spec requirement mapped to our implementation. Fix all GAP items before submission.
| Spec Requirement | § | Status | Implementation / Fix |
|---|---|---|---|
| MAVLink v2 over UDP | 4.1-4.2 | DONE | mavsdk_bridge.py — SimBridge.connect() via udpin:// |
| HEARTBEAT reception | 4.3 | DONE | connection_state() async stream, waits for HEARTBEAT before race |
| ATTITUDE telemetry | 4.3, 4.5 | DONE | _subscribe_attitude() → Euler angles + angular rates |
| HIGHRES_IMU telemetry | 4.3, 4.5 | DONE | _subscribe_imu() → body-frame accel + gyro |
| ODOMETRY telemetry | 4.3 | DONE | _subscribe_odometry() → NED position + velocity |
| SET_POSITION_TARGET_LOCAL_NED | 4.3 | DONE | send_velocity_ned(), send_velocity_body(), send_position_ned() |
| SET_ATTITUDE_TARGET | 4.3 | DONE | send_attitude(), send_attitude_rate() |
| TIMESYNC handling | 4.3 | GAP | Mentioned in docstring but no subscription. Fix: add _subscribe_timesync() or verify MAVSDK handles transparently. |
| Python 3.14.2 runtime | 5.1 | PARTIAL | Code runs on 3.14 but uses asyncio.ensure_future() (deprecated). Fix: replace with asyncio.create_task(). |
| 120 Hz physics rate | 4.4 | DONE | race_config.py command_hz=120.0 |
| 50–120 Hz command rate | 4.4 | DONE | Rate-limited async loop in race_pipeline.py |
| 2 Hz minimum heartbeat | 4.4 | PARTIAL | MAVSDK auto-sends heartbeat. Verify: confirm with DCL sim that 2 Hz is maintained. |
| Local Cartesian only (no GPS) | 3.3 | DONE | NED frame throughout. No GPS references. |
| Forward-facing FPV camera only | 3.4 | DONE | camera_adapter.py single camera source |
| Vision → Perception → Planning → Control | 5.3 | GAP | Pipeline goes Perception → Control directly. Fix: integrate trajectory_optimizer.py as Planning layer, or document state machine as planning. |
| Max 8 minutes per run | 8.3 | DONE | max_time_s=480.0 enforced in race loop |
| No human interaction | 7 | DONE | race_main() is fully autonomous |
| Navigate start → intermediate → finish gates | 8.1-8.2 | DONE | State machine: SEEK → APPROACH → TRANSIT → FINISHED |
| Dependency manifest | 5.1 | GAP | No requirements.txt. Fix: generate with pinned versions. |
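The `ensure_future` GAP is a mechanical replacement. A minimal sketch of the pattern (the coroutine name is illustrative, not the bridge's actual call site):

```python
import asyncio

async def _subscribe_attitude():
    # stand-in for one of the four MAVSDK telemetry subscription loops
    await asyncio.sleep(0)
    return "attitude stream started"

async def start_streams():
    # Before (deprecated; flagged for the Python 3.14 runtime requirement):
    #   task = asyncio.ensure_future(_subscribe_attitude())
    # After — create_task requires a running event loop, which we have here:
    task = asyncio.create_task(_subscribe_attitude())
    return await task  # the real bridge stores the task and cancels it on shutdown

print(asyncio.run(start_streams()))
```

Unlike `ensure_future`, `create_task` only accepts coroutines and must be called from a running loop, so each of the four call sites in `mavsdk_bridge.py` should be checked for both conditions before swapping.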
Priority fixes: the asyncio.ensure_future() deprecation must be fixed before any submission; requirements.txt and heartbeat rate verification should be fixed during Week 1.

Train detection models offline using captured frames or synthetic data.
python gate_segmentation.py train --data dataset_gates_seg
| Parameter | Value |
|---|---|
| Architecture | GateSegNet (3→32→64→128→256 encoder) |
| Parameters | ~3.7M |
| Loss | Dice + BCE combined |
| Optimizer | Adam, lr=1e-3 |
| Epochs | 100 |
| Batch size | 8 |
| Input size | 480 × 640 |
| Export | gate_seg.onnx (opset 17) |
Dataset: dataset_gates_seg/{train,val}/{images,masks}/
Masks: binary (white = gate, black = background)
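The "Dice + BCE combined" loss in the table can be sketched in a few lines. This is a numpy illustration of the math, not the training code in `gate_segmentation.py` (which presumably implements the same idea on torch tensors):

```python
import numpy as np

def dice_bce_loss(pred, target, eps=1e-6):
    """Combined Dice + BCE on sigmoid probabilities (numpy sketch).

    pred, target: arrays in [0, 1] with the same shape, e.g. (H, W).
    BCE penalizes per-pixel errors; Dice penalizes poor mask overlap.
    """
    pred = np.clip(pred, eps, 1 - eps)
    bce = -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))
    inter = (pred * target).sum()
    dice = 1 - (2 * inter + eps) / (pred.sum() + target.sum() + eps)
    return bce + dice

t = np.array([[1.0, 0.0], [0.0, 1.0]])
print(dice_bce_loss(t, t))  # loss near 0 for a perfect prediction
```

Dice compensates for the class imbalance of thin gate frames against large backgrounds, which is why it is combined with plain BCE here.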
# Step 1: auto-label from VQ1 color detection
python yolo-auto-label.py
# Step 2: train YOLOv8
python yolo-train.py train
# Step 3: export TensorRT
python yolo-train.py export
| Parameter | Value |
|---|---|
| Base model | yolov8n (nano) |
| Epochs | 150 |
| Conf threshold | 0.5 |
| Export priority | .engine > .onnx > .pt |
Auto-labeler uses ColorGateDetector to generate ground-truth bounding boxes from VQ1 highlighted-gate footage.
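The conversion the auto-labeler has to perform is from pixel-space boxes to YOLO's normalized label format. A sketch of that step, assuming `ColorGateDetector` yields `(x, y, w, h)` boxes in pixels (the helper name is hypothetical):

```python
def to_yolo_line(bbox, img_w, img_h, cls=0):
    """Convert a pixel-space (x, y, w, h) box into a YOLO label line:
    'cls cx cy w h', all coordinates normalized to [0, 1]."""
    x, y, w, h = bbox
    cx = (x + w / 2) / img_w   # box center, not top-left corner
    cy = (y + h / 2) / img_h
    return f"{cls} {cx:.6f} {cy:.6f} {w / img_w:.6f} {h / img_h:.6f}"

print(to_yolo_line((100, 150, 200, 100), 640, 480))
# → "0 0.312500 0.416667 0.312500 0.208333"
```

One `.txt` file per image, one such line per gate, is what `yolo-train.py` expects from the ultralytics dataset layout.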
The classical controller is the safe baseline for Round 1: a proportional pursuit controller with aggressive tuning.
| Gain | Value | What it does | Tune if... |
|---|---|---|---|
| kp_yaw | 50.0 | Yaw snap to gate heading | Oscillates: lower. Sluggish: raise. |
| kp_pitch | 30.0 | Forward pitch authority | Nosedive: lower. Won't approach: raise. |
| kp_roll | 25.0 | Banking into turns | Roll oscillation: lower. |
| cruise_pitch | -25.0° | Forward tilt when gate far | Too fast/crash: reduce. Too slow: increase. |
| approach_pitch | -15.0° | Forward tilt near gate | Overshoot: reduce. Stalls: increase. |
| hover_thrust | 0.50 | Thrust to hold altitude | CALIBRATE FIRST. Descends: raise. Climbs: lower. |
| seek_yaw_rate | 180°/s | Spin speed when searching | Blur kills detection: lower to 90-120. |
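How the gains combine can be sketched as follows. This is an illustration of the proportional-pursuit idea using the table's values; the real controller in `race_pipeline.py` may differ in detail, and the offset-to-thrust coupling shown here is an assumption:

```python
from dataclasses import dataclass

@dataclass
class Gains:
    kp_yaw: float = 50.0          # yaw authority per unit horizontal offset
    kp_roll: float = 25.0         # banking into turns
    cruise_pitch: float = -25.0   # deg, gate far
    approach_pitch: float = -15.0 # deg, gate near
    hover_thrust: float = 0.50    # UNCALIBRATED — tune per sim first

def pursuit_command(gate_cx, gate_cy, gate_dist, g=Gains()):
    """Map a normalized gate-center offset (image coords in [-1, 1]) and
    range (m) to an attitude setpoint (yaw_rate, pitch, roll, thrust)."""
    yaw_rate = g.kp_yaw * gate_cx                       # snap toward gate heading
    pitch = g.approach_pitch if gate_dist < 5.0 else g.cruise_pitch
    roll = g.kp_roll * gate_cx * 0.5                    # bank into the turn
    thrust = g.hover_thrust - 0.1 * gate_cy             # nudge toward vertical center
    return yaw_rate, pitch, roll, thrust

print(pursuit_command(0.2, -0.1, 10.0))
```

The tuning advice in the table maps directly onto these terms: yaw oscillation means `kp_yaw` is too high, a nosedive means the pitch term dominates, and so on.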
# Test classical controller
python test_race_standalone.py
# With YAML config override
python race_pipeline.py my_config.yaml attitude
# Phase A: privileged training
python rl_train.py train --steps 5000000
# Export to ONNX
python rl_train.py export
# Deploy in race pipeline
# Set controller to "rl" mode (requires code change in race_pipeline.py)
DroneRaceEnv currently uses privileged=True (ground-truth gate positions). Phase B requires modifying _get_obs() to feed real VisionPipeline detections. This is the hardest integration step.

| Reward | Value | Purpose |
|---|---|---|
| Closer to gate | +1.0 per meter | Drive toward gate |
| Gate passage | +50.0 | Reward transit |
| Crash | -100.0 | Punish collision |
| Time penalty | -0.01 per step | Encourage speed |
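The reward table translates into a per-step function like the following. This is a sketch of the shaping described above, not the exact code inside DroneRaceEnv:

```python
def step_reward(prev_dist, dist, passed_gate, crashed):
    """Shaped per-step reward matching the table: +1.0 per meter of
    progress, +50 for a gate transit, -100 on crash, -0.01 per step."""
    r = (prev_dist - dist) * 1.0   # positive when closing on the gate
    if passed_gate:
        r += 50.0
    if crashed:
        r -= 100.0
    r -= 0.01                      # time penalty encourages speed
    return r

print(step_reward(10.0, 9.5, False, False))  # → 0.49
```

The dense progress term keeps the gradient alive early in training; the sparse +50/-100 terms dominate once the policy can reach gates at all.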
| Stage | Target | Method |
|---|---|---|
| Camera capture | <1 ms | USB/CSI/sim pipe |
| U-Net inference | <5 ms | GPU (RTX/Jetson) |
| RANSAC corners + PnP | <2 ms | CPU (OpenCV) |
| Gate tracker | <0.1 ms | EMA update |
| State machine | <0.1 ms | Branch logic |
| Controller compute | <0.5 ms | Proportional math / ONNX inference |
| MAVLink send | <1 ms | Async UDP |
| Total | <9.7 ms | Sum of stage targets; note 120 Hz allows only 8.3 ms per cycle, so vision must run off the control thread (or the loop settles near 100 Hz) |
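The stage targets are only useful if the pipeline actually measures them. A small hypothetical helper (not in the repo) that accumulates per-stage wall time so the table can be checked against reality:

```python
import time
from contextlib import contextmanager

timings = {}  # stage name -> last measured duration in ms

@contextmanager
def stage(name):
    """Time one pipeline stage with a monotonic high-resolution clock."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = (time.perf_counter() - t0) * 1e3  # ms

with stage("tracker"):
    sum(range(1000))  # stand-in for the EMA update

print({k: round(v, 3) for k, v in timings.items()})
```

Wrapping each stage of one frame (`with stage("unet"): ...`, etc.) and logging `timings` through `race_logger.py` would turn the "Target" column into a per-run pass/fail check.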
From submit_check.py — these are packaged into submission.zip:
| File | Purpose | Required |
|---|---|---|
| race_pipeline.py | Main orchestrator | Yes |
| vision_pipeline.py | Gate detection + PnP | Yes |
| mavsdk_bridge.py | MAVLink communication | Yes |
| race_config.py | Configuration | Yes |
| gate_segmentation.py | U-Net model + RANSAC | Yes (if mode=unet) |
| camera_adapter.py | Camera sources | Yes |
| race_logger.py | JSONL logging | Yes |
| drone_mpc_foundation.py | Schemas + MPC | Yes |
| sim_drone.py | Physics (standalone mode) | Optional |
| rl_controller.py | RL inference | Optional (if using RL) |
| trajectory_optimizer.py | Path planning | Optional |
| gate_seg_best.pt | U-Net weights | Yes (if mode=unet) |
| policy.onnx | RL policy weights | Optional (if using RL) |
| requirements.txt | Dependencies | Yes |
mavsdk>=2.0
opencv-python>=4.8
numpy>=1.26
torch>=2.2
onnxruntime>=1.17
ultralytics>=8.1
stable-baselines3>=2.2
gymnasium>=0.29
casadi>=3.6
scipy>=1.12
pyyaml>=6.0
# race_config.py ConnectionSettings
# Local simulation:
sim_url: "udpin://0.0.0.0:14540"
camera_source: "synthetic"
# DCL simulator:
sim_url: "udpin://0.0.0.0:14540" # verify with competition docs
camera_source: "gazebo_pipe" # or per DCL vision stream spec
python submit_check.py # run all checks
python submit_check.py package # create submission.zip
python test_race_standalone.py
Expected: Completes gates in <480s. Detection rate >50%.
python mavsdk_bridge.py
Expected: Connects, hovers at 5m for 30s, prints NED position and command rate.
In race loop output, confirm command_rate reads ≥50 Hz. Target: 120 Hz per race_config.py.
Confirm race_config.py max_time_s=480.0. Run a deliberate slow test — pipeline should terminate at 8:00.
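Both checks above exercise the same loop structure: a rate-limited command loop with a hard time stop. A self-contained sketch of that shape (the real loop in `race_pipeline.py` also runs vision and the controller; `max_steps` is a demo-only knob added here):

```python
import asyncio
import time

async def race_loop(command_hz=120.0, max_time_s=480.0, max_steps=None):
    """Send setpoints at command_hz until the 8-minute limit is hit."""
    period = 1.0 / command_hz
    start = time.monotonic()
    sent = 0
    while time.monotonic() - start < max_time_s:
        # ... compute and send one setpoint here ...
        sent += 1
        if max_steps is not None and sent >= max_steps:
            break                  # demo exit so the sketch finishes quickly
        await asyncio.sleep(period)
    return sent

print(asyncio.run(race_loop(max_steps=12)))  # → 12
```

A plain `sleep(period)` drifts slightly under load; if the measured `command_rate` sags below 120 Hz, sleeping until an absolute deadline (`start + sent * period`) is the usual fix.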
python vision_pipeline.py
Expected: Color mode >200 FPS, U-Net >100 FPS (GPU), YOLO >60 FPS (GPU).
python submit_check.py package
Expected: 0 failures, creates submission.zip. Verify zip contents include all required files + weights.
| Metric | Target | Current | Tuning Knob |
|---|---|---|---|
| Vision latency (U-Net) | <5 ms | ~5ms GPU | Reduce base_ch or use TensorRT |
| Vision latency (YOLO) | <12 ms | ~12ms GPU | Use .engine TensorRT format |
| Control loop rate | 120 Hz | 120 Hz (config) | command_hz in race_config.py |
| Gate detection rate | >80% of frames | varies | HSV range, U-Net training data, confidence threshold |
| Transit detection | <1.5m + 3 closing frames | 1.5m default | transit_distance |
| Max speed | 30 m/s | 30 m/s (config) | safety.max_speed |
| Max tilt | 70° | 70° (config) | safety.max_tilt |
| PnP distance accuracy | <10% error at 15m | depends on corners | U-Net RANSAC >> bbox corners |
| EMA tracker alpha | 0.65 | 0.65 | Higher = faster, noisier |
| hover_thrust | calibrated | 0.50 (uncalibrated!) | MUST tune per sim |
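The EMA tracker row is a one-line update rule; a sketch of it, assuming the gate estimate is a position tuple (the real tracker in `vision_pipeline.py` may smooth more state):

```python
def ema_update(prev, measurement, alpha=0.65):
    """One exponential-moving-average step for the gate tracker.
    Higher alpha tracks fast gate motion but passes more detection noise."""
    if prev is None:
        return measurement          # first detection seeds the estimate
    return tuple(alpha * m + (1 - alpha) * p for p, m in zip(prev, measurement))

est = None
for z in [(10.0, 0.0, 2.0), (9.5, 0.1, 2.0), (9.0, 0.2, 2.0)]:
    est = ema_update(est, z)
print(tuple(round(v, 3) for v in est))
```

With alpha=0.65 the estimate lags a steadily closing gate by roughly one frame's motion, which the "Higher = faster, noisier" note in the table summarizes.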
| Tier | Lap Time | Avg Speed | Strategy |
|---|---|---|---|
| Conservative (Round 1 pass) | 60–90s | 2–3 m/s | Slow, safe, detect every gate |
| Competitive (top 50%) | 30–45s | 4–6 m/s | Aggressive pursuit, late braking |
| Top tier (podium) | <25s | 7–10 m/s | Optimal trajectory + RL policy + racing line |
| Risk | L | I | Mitigation |
|---|---|---|---|
| hover_thrust miscalibrated | H | H | Run hover test first. Tune in 0.02 increments. Add auto-calibration that measures altitude error. |
| Sim-to-real gap (RL) | H | H | RL Phase B (vision-in-loop) + Phase C (domain rand). Classical controller as fallback. |
| asyncio.ensure_future() deprecated on 3.14 | H | H | Replace with asyncio.create_task() at 4 call sites in mavsdk_bridge.py. |
| Unknown gate color in VQ1 | M | H | Auto-detect dominant saturated color in first 10 frames. Multiple HSV presets available. |
| Timeout before finishing (8 min) | M | H | Monitor elapsed time. If >6 min and <50% gates, increase speed aggressiveness. |
| Camera stream not connected | M | H | preflight_check() verifies camera produces frames. Add retry logic. |
| TIMESYNC not handled | M | M | MAVSDK may handle transparently. If not, add _subscribe_timesync(). |
| Gate partially visible | M | M | U-Net handles partial gates. RANSAC works with incomplete contours. |
| False transit triggers | L | M | Predictive transit: distance_closing_count ≥ 3 + 0.3s cooldown. |
| No requirements.txt | H | M | Generate and include in submission. See Section 4. |
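The top risk row suggests auto-calibration of hover_thrust from altitude error. A sketch of the simplest version, stepping thrust in the table's 0.02 increments until measured climb rate crosses zero (`read_climb_rate` is a hypothetical callback onto telemetry):

```python
def calibrate_hover_thrust(read_climb_rate, thrust=0.50, step=0.02, iters=20):
    """Nudge thrust until the drone neither climbs nor descends.

    read_climb_rate(thrust) -> vertical speed in m/s (+ up) after the
    commanded thrust has settled; deadband of 0.05 m/s counts as hover.
    """
    for _ in range(iters):
        v = read_climb_rate(thrust)
        if abs(v) < 0.05:
            break                        # close enough to hover
        thrust += step if v < 0 else -step   # descending -> more thrust
    return round(thrust, 3)

# stand-in plant that hovers at thrust 0.58
print(calibrate_hover_thrust(lambda t: (t - 0.58) * 5.0))  # → 0.58
```

Running this once during `preflight_check()` would retire the highest-likelihood, highest-impact row in the table before the race clock starts.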
Final checklist:
- Replace asyncio.ensure_future() → create_task()
- Generate requirements.txt
- Calibrate hover_thrust
- Run submit_check.py (all checks)
- Run yolo-auto-label.py
- Run submit_check.py package → submission.zip