Submission · Updated 2026-04-19
Code submission & model validation.
How to validate, benchmark, and package your AI for the Virtual Qualifier. The major 2026-04-19 update is baked in: no GPS, no absolute positioning, no depth. Inputs are the FPV visual stream plus telemetry; outputs are Throttle / Roll / Pitch / Yaw. Windows only. Active internet is required during runs (anti-cheat). Multiple parallel sim instances are supported. Attempts are unlimited, with an 8-minute cap per run.
VQ1
Completion (pass all gates in order)
<10 gates · clear
VQ2
Fastest valid time
<20 gates · complex
OS
Windows only
Linux does not work
Attempts
Unlimited · 8-min cap per run
No rush within a run
§ 01 How submission works
Confirmed 2026-04-19. You submit a Python AI that consumes the FPV stream + telemetry and emits standard drone controls. VQ1 scores on gate-clearing completion. VQ2 scores on fastest valid time. Same course per round for all teams; gates change between VQ1 and VQ2. No human interaction during runs.
What the simulator provides
| Input | What it carries | Confirmed? |
| --- | --- | --- |
| FPV visual stream | Forward-facing camera frames | Yes |
| Telemetry | Attitude + body rates + accel (exact format TBD at credentials release) | Yes · payload TBD |
| GPS / absolute position | — | Not provided |
| Depth / LiDAR | — | Not provided |
| Sensor shortcuts | — | Not provided |
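Since the telemetry payload is TBD until credentials release, it pays to keep parsing behind one thin adapter type. A minimal sketch, assuming an attitude + body-rate + accel layout; the field names are placeholders, not the confirmed wire format:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    """Placeholder telemetry frame. Exact payload is TBD at credentials
    release; these fields are assumptions, not the confirmed format."""
    roll: float   # attitude, rad
    pitch: float
    yaw: float
    p: float      # body rates, rad/s
    q: float
    r: float
    ax: float     # body-frame accel, m/s^2
    ay: float
    az: float
```

When the real schema lands, only this adapter should need to change.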
What you send back
| Output | What it controls |
| --- | --- |
| Throttle | Collective thrust |
| Roll | Body roll axis |
| Pitch | Body pitch axis |
| Yaw | Body yaw axis |
Exact transport (MAVLink command name, rate, units) ships with the sim package.
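Until the exact transport ships, normalizing and clamping all four outputs behind one type avoids out-of-range commands. A hedged sketch; the [0, 1] throttle and [-1, 1] axis ranges are assumptions, not confirmed units:

```python
from dataclasses import dataclass

def clamp(x: float, lo: float = -1.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))

@dataclass(frozen=True)
class ControlCommand:
    throttle: float  # assumed normalized [0, 1]
    roll: float      # assumed normalized [-1, 1]
    pitch: float
    yaw: float

    @staticmethod
    def make(throttle, roll, pitch, yaw):
        # Ranges are assumptions until the sim package documents the
        # exact MAVLink command name, rate, and units.
        return ControlCommand(clamp(throttle, 0.0, 1.0),
                              clamp(roll), clamp(pitch), clamp(yaw))
```

Swapping to the real units later is then a one-file change.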
Scoring
| Round | Scoring | Gate count | Environment |
| --- | --- | --- | --- |
| VQ1 | Completion (pass all gates in order) | <10 | Short, simple, clear gates |
| VQ2 | Fastest valid time | <20 | Longer · lighting · 3D objects · obstacles |
| Physical | Real drones, controlled env | TBD | No audience · CA · September |
| Final | Real drones + audience | TBD | Ohio · November · environmental distractions |
Submission portal details (URL, upload format, deadline) will be provided by Anduril/DCL shortly before VQ1 launch. Watch dcl-project.com. Until then: validate and benchmark locally.
§ 02 Platform requirements
| Requirement | Notes |
| --- | --- |
| OS | Windows only. Linux is explicitly stated not to work. |
| Internet | Active connection required during runs (anti-cheat). |
| Parallel instances | Multiple sim instances can run on one box — good for RL env fan-out. |
| Max run time | 8 minutes per attempt |
| Attempts | Unlimited within the qualification window |
| Autonomy | No human interaction during runs. Any manipulation = DQ. |
| Code review | Code must be accessible for review if requested. Pin dep versions, include a one-command reproduction script. |
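Parallel instances make RL env fan-out cheap on one box. A sketch of spawning one sim per UDP port; the launch command and `--port` flag are placeholders until the real sim package documents its CLI:

```python
import subprocess

def sim_commands(n, sim_cmd=("python", "sim_drone.py"), base_port=5600):
    """Build one launch command per instance, each on its own UDP port.
    sim_cmd and --port are assumptions, not the real AIGP sim's CLI."""
    return [[*sim_cmd, "--port", str(base_port + i)] for i in range(n)]

def launch_sims(n, **kw):
    """Spawn n parallel sim instances and return their process handles."""
    return [subprocess.Popen(cmd) for cmd in sim_commands(n, **kw)]
```

Each RL worker then connects to its own port, so rollouts never share state.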
§ 03 Step 1 — Validate your code
Run the submission validator
python submit_check.py
Checks that all modules import, runs a vision smoke test (detect + PnP on a synthetic frame) and a controller smoke test, and validates the config.
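The import stage can be approximated like this; the module list mirrors the package table in Step 3, and the real submit_check.py may check more:

```python
import importlib

REQUIRED_MODULES = [
    "race_pipeline", "vision_pipeline", "rl_controller",
    "race_logger", "race_config", "camera_adapter",
]

def check_imports(modules=REQUIRED_MODULES):
    """Try to import every required module; return (failures, messages).
    Mirrors the validator's first stage (an approximation, not its code)."""
    failures, messages = 0, []
    for name in modules:
        try:
            importlib.import_module(name)
            messages.append(f"PASS import {name}")
        except Exception as exc:
            failures += 1
            messages.append(f"FAIL import {name}: {exc}")
    return failures, messages
```

A nonzero failure count here is the same signal that blocks the packager later.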
Run the standalone race test against SimDrone
python test_race_standalone.py
Runs the full pipeline against sim_drone.py (our 6DOF proxy), not the real AIGP sim. Useful for end-to-end smoke testing before the real sim is available. Don't use SimDrone as a substitute for real-sim validation once the sim drops — SimDrone's physics diverge (hover = 2mg, max pitch ~15°).
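The hover = 2mg note matters because a controller trimmed for SimDrone will hover near half throttle and be mis-trimmed on any sim with a different thrust-to-weight ratio. Reading "hover = 2mg" as max thrust equal to twice the weight, and assuming thrust scales linearly with throttle:

```python
def hover_throttle(mass_kg: float, max_thrust_n: float, g: float = 9.81) -> float:
    """Normalized throttle needed to hover, assuming thrust is linear
    in throttle (a SimDrone-style simplification)."""
    return mass_kg * g / max_thrust_n

# SimDrone-style: max thrust = 2*m*g, so hover sits at 0.5 throttle
m = 1.0
print(hover_throttle(m, 2 * m * 9.81))  # -> 0.5
```

On the real sim, re-measure this trim point before trusting any SimDrone-tuned gains.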
§ 04 Step 2 — Benchmark detector modes
python benchmark_models.py # full (2 laps, ~2 min)
python benchmark_models.py --quick # 1 lap, ~1 min
What it measures
| Metric | What it means | Good value |
| --- | --- | --- |
| Gates passed | How many gates the drone flew through (proxy course) | All |
| Total time | Total lap time | <60s competitive |
| Detection rate | % of frames where a gate was found | >80% |
| Vision latency | Per-frame inference time | <10ms (VQ1), <5ms (VQ2) |
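Detection rate and vision latency fall straight out of a per-frame log. A minimal sketch, assuming each frame is recorded as a (detected, latency_ms) pair; this is illustrative, not benchmark_models.py's actual log format:

```python
from statistics import mean

def vision_metrics(frames):
    """frames: list of (detected: bool, latency_ms: float), one per frame.
    Returns (detection_rate_pct, avg_latency_ms), the two vision numbers
    from the metrics table."""
    detection_rate = 100.0 * sum(d for d, _ in frames) / len(frames)
    avg_latency_ms = mean(t for _, t in frames)
    return detection_rate, avg_latency_ms

rate, lat = vision_metrics([(True, 4.2), (True, 5.1), (False, 4.8), (True, 4.5)])
# 3 of 4 frames detected -> 75.0%; mean latency ~4.65 ms
```

Against the table's bars, this example run fails the >80% detection target but comfortably beats the 10 ms latency budget.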
Detector comparison
| Detector | Best for | Corner accuracy | Speed | Training |
| --- | --- | --- | --- | --- |
| YOLO11n + YOLO11n-pose | VQ1 + VQ2 (ships first) | Good (4 corners) | ~5ms GPU | Done (APEX) |
| RF-DETR-Nano | VQ2 upgrade if YOLO is bottleneck | Good | ~2.3ms TRT | P2 retrain |
| U-Net + RANSAC | Sub-pixel corners alternative | Excellent | ~5ms GPU | Done |
| Color (HSV) | Fallback if trained fails | Good if gates highlighted | <1ms CPU | None |
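One way to turn benchmark output into a mode choice, using the >80% detection-rate bar from the metrics table; the function and result format are illustrative, not benchmark_models.py's actual API:

```python
def pick_detector(results):
    """results: {mode: {"detection_rate": pct, "latency_ms": float}}.
    Prefer the fastest mode that clears the >80% detection bar; if none
    does, fall back to the HSV color detector ("color_hsv" is a
    placeholder mode name)."""
    ok = {m: r for m, r in results.items() if r["detection_rate"] > 80.0}
    if not ok:
        return "color_hsv"
    return min(ok, key=lambda m: ok[m]["latency_ms"])
```

The latency tiebreak matters mostly for VQ2, where the <5 ms budget is tight.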
VQ1 recommendation: ship YOLO11n + YOLO11n-pose (already trained, APEX Phase 1+2) with the VQ1 completion pilot. Lock in a passing submission. Benchmark sim-frame detection rate once credentials are released.
§ 05 Step 3 — Package for submission
Run the packager
python submit_check.py package
Creates submission.zip. Only runs if validation passes (0 failures).
Typical package contents
| File | Purpose |
| --- | --- |
| vq1_completion_pilot.py | Zero-learning VQ1 completion stack |
| race_pipeline.py | Main orchestrator + entry point |
| vision_pipeline.py | Gate detection + PnP |
| rl_controller.py | Used by APEX PPO (VQ2) |
| gate_segmentation.py | U-Net alternative detector |
| race_logger.py | JSONL logging |
| drone_mpc_foundation.py | Schemas (MPC itself retired) |
| race_config.py | Configuration |
| camera_adapter.py | Camera input adapter |
| fpv_renderer.py | Dev-only renderer |
Add model weights (.onnx or .pt for the detector/keypoints, plus .onnx for the PPO policy if submitting for VQ2).
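Roughly what the packager does, sketched with stdlib zipfile. The file layout and `models/` weights directory are assumptions, and the real `submit_check.py package` additionally refuses to run unless validation reports 0 failures:

```python
import zipfile
from pathlib import Path

PACKAGE_FILES = [
    "vq1_completion_pilot.py", "race_pipeline.py", "vision_pipeline.py",
    "rl_controller.py", "gate_segmentation.py", "race_logger.py",
    "drone_mpc_foundation.py", "race_config.py", "camera_adapter.py",
]

def is_weight(name: str) -> bool:
    """Model weight files: .onnx or .pt, per the note above."""
    return Path(name).suffix in (".onnx", ".pt")

def build_package(out="submission.zip", files=PACKAGE_FILES, weights_dir="models"):
    """Zip source files plus any detector/policy weights found in
    weights_dir. A sketch, not the real packager's logic."""
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            if Path(f).exists():
                zf.write(f)
        for w in Path(weights_dir).glob("*"):
            if is_weight(w.name):
                zf.write(w)
    return out
```

Keeping weights filtered by suffix avoids shipping training checkpoints or dataset files by accident.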
Final config check
| Setting | Value | Why |
| --- | --- | --- |
| vision.mode | Your best mode (benchmark result) | Best detector for this course |
| connection.* | UDP MAVSDK + UDP:5600 JPEG stream | Per VADR-TS-002 §4.2 / §4.6 |
| race.max_time_s | 480.0 | 8-minute limit per rules |
| gate.width / height | 1.5 / 1.5 m (inner aperture) | VADR-TS-002 §3.7 · outer 2.7 m |
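The table above as a config fragment. Key names and the MAVSDK endpoint are illustrative, not a confirmed race_config.py schema; the video port (5600) and race/gate numbers come from the table:

```python
# Hypothetical race_config.py fragment mirroring the final-check table.
RACE_CONFIG = {
    "vision": {"mode": "yolo11n_pose"},           # your benchmark winner
    "connection": {"mavsdk": "udp://:14540",      # placeholder endpoint;
                   "video": "udp://:5600"},       # real ones per VADR-TS-002
    "race": {"max_time_s": 480.0},                # 8-minute cap per rules
    "gate": {"width_m": 1.5, "height_m": 1.5},    # inner aperture; outer 2.7 m
}
```

Asserting these values in the validator keeps an edited config from silently drifting below the rules.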
§ 06 Submission checklist
| # | Check | How | Expected |
| --- | --- | --- | --- |
| 1 | All modules import | python submit_check.py | 0 failures |
| 2 | Vision detects gates | python submit_check.py | Detection + PnP pass |
| 3 | Controller responds | python submit_check.py | Commands generated |
| 4 | Standalone race completes | python test_race_standalone.py | All gates cleared on proxy |
| 5 | Best detector identified | python benchmark_models.py | Recommendation printed |
| 6 | Package created | python submit_check.py package | submission.zip exists |
| 7 | No human interaction in code | Manual review | No input() / keyboard reads |
| 8 | Deps pinned + reproducible | Manual review | One-command reproduction script |
| 9 | Runs on Windows | Test on target box | Passes the same tests as on the dev box |
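Check #7 can be partially automated with a regex sweep for human-input hooks; the pattern list here is a heuristic of my own, not an official rule set, so treat hits as items for manual review:

```python
import re
from pathlib import Path

# Heuristic patterns for human interaction: input() calls, keyboard
# libraries, and console key reads. Not exhaustive, and comments that
# merely mention "keyboard" will also be flagged.
BANNED = re.compile(r"\binput\s*\(|keyboard|pynput|msvcrt\.getch")

def scan_text(text):
    """Return (line_no, stripped_line) for every suspicious line."""
    return [(i, line.strip())
            for i, line in enumerate(text.splitlines(), 1)
            if BANNED.search(line)]

def find_human_input(paths):
    """Sweep files and collect (path, line_no, line) hits for review."""
    return [(str(p), i, line)
            for p in paths
            for i, line in scan_text(Path(p).read_text())]
```

An empty result is necessary but not sufficient; the manual review in check #7 still applies.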
When all 9 checks pass, your submission is ready. Upload via the portal once Anduril/DCL publishes it (watch dcl-project.com).