AI Grand Prix — Complete Build & Deploy Guide • AIGP v4.0 • 2026-03-31
Virtual qualifier → 5" racer build → Neros Archer deploy
No hardware needed. You download a DCL-built simulator, write Python AI code, and submit timed runs remotely.
| Item | Detail |
|---|---|
| Platform | DCL-built sim (download after registration) |
| Language | Python (primary), C/Cython extensions allowed |
| Interface | MAVLink v2 via MAVSDK-Python over UDP |
| Camera | Forward-facing FPV stream (~12MP equivalent) |
| Sensors | IMU (gyro + accel), motor RPM readouts |
| GPS/position | Limited start position only. No coordinates during flight. |
| Track knowledge | Drone does NOT know the track. Detect gates with vision. |
| Physics rate | 120 Hz. Command rate: 50-120 Hz recommended. |
| Scoring | Time-based. Must pass all gates. Speed vs reliability tradeoff. |
| Rounds | R1: simpler course. R2 (end July): more complex environment. |
| Flight logs | Available for analysis and iteration. |
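The 50-120 Hz command-rate guidance implies a paced control loop that compensates for its own compute time rather than sleeping a fixed interval. A minimal asyncio sketch (the `run_loop` helper and `step` callback are illustrative, not part of any official API):

```python
import asyncio
import time


async def run_loop(step, hz=100, iterations=None):
    """Call step() at a fixed rate, absorbing step's own runtime into the sleep."""
    period = 1.0 / hz
    n = 0
    next_t = time.monotonic()
    while iterations is None or n < iterations:
        step()
        n += 1
        next_t += period
        # Sleep only the remainder of the period; never go negative
        await asyncio.sleep(max(0.0, next_t - time.monotonic()))
    return n


# Example: 20 ticks at 100 Hz should span roughly 0.2 s
ticks = []
asyncio.run(run_loop(lambda: ticks.append(time.monotonic()), hz=100, iterations=20))
```

Inside the real control loop, `step` would be the perceive → plan → command cycle; the fixed-period pacing keeps the setpoint stream steady even when inference time varies.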
# Install Python 3.12+ and create venv
python3 -m venv ~/aigp && source ~/aigp/bin/activate
# Install MAVSDK + dependencies
pip install mavsdk aioconsole numpy opencv-python torch torchvision
# Install PX4 SITL for local testing (before DCL sim releases)
git clone https://github.com/PX4/PX4-Autopilot.git --recursive
cd PX4-Autopilot && make px4_sitl jmavsim
# Sim opens. In another terminal:
python3 -c "
import asyncio
from mavsdk import System

async def t():
    d = System()
    await d.connect(system_address='udp://:14540')
    async for s in d.core.connection_state():
        if s.is_connected:
            print('Connected to SITL!')
            break

asyncio.run(t())
"
aigp/
├── config.py            # Platform switch: sim / diy / archer
├── main.py              # Entry: connect → arm → offboard → loop
├── perception/
│   ├── camera.py        # Camera abstraction (sim stream / GStreamer)
│   └── gate_detector.py # Your neural net (PyTorch → TensorRT)
├── planning/
│   ├── planner.py       # Pure pursuit / MPC / RL policy
│   └── state_est.py     # Fuse vision + IMU for gate-relative state
├── control/
│   └── commander.py     # MAVSDK wrapper: velocity/attitude cmds
└── utils/
    └── logger.py        # Save frames + detections for debugging
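The logger stub can start as nothing more than a JSON-lines writer. A sketch (the `RunLogger` name and record shape are illustrative) that records one line per control tick for post-run analysis:

```python
# utils/logger.py — minimal sketch: one JSON line per control tick
import json
import os
import time


class RunLogger:
    def __init__(self, out_dir="logs"):
        os.makedirs(out_dir, exist_ok=True)
        self.path = os.path.join(out_dir, f"run_{int(time.time())}.jsonl")
        self._f = open(self.path, "a")

    def log(self, t, det, cmd):
        """t: timestamp; det: (err_x, err_y, area) or None; cmd: (fwd, yaw, vrt)."""
        self._f.write(json.dumps({"t": t, "det": det, "cmd": cmd}) + "\n")
        self._f.flush()  # survive a crash mid-run

    def close(self):
        self._f.close()
```

Loading the resulting `.jsonl` into pandas and plotting `err_x` over time is a quick way to tune the yaw gain between runs.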
import os
PLATFORM = os.environ.get("AIGP_PLATFORM", "sim")
CONFIGS = {
    "sim": {"addr": "udp://:14540", "cam": "sim"},
    "diy": {"addr": "serial:///dev/ttyUSB0:921600", "cam": "gst_csi"},
    "archer": {"addr": "serial:///dev/ttyTHS1:921600", "cam": "gst_csi"},
}
CFG = CONFIGS[PLATFORM]
GST_CSI = ("nvarguscamerasrc sensor-id=0 ! "
           "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
           "nvvidconv ! video/x-raw,format=BGRx,width=640,height=480 ! "
           "videoconvert ! video/x-raw,format=BGR ! appsink drop=1")
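A defensive variant of the platform switch fails fast on a typo in AIGP_PLATFORM instead of surfacing a bare KeyError at connect time (the `load_cfg` helper is illustrative, not part of the guide's config.py):

```python
import os

# Mirrors the CONFIGS table in config.py above
CONFIGS = {
    "sim": {"addr": "udp://:14540", "cam": "sim"},
    "diy": {"addr": "serial:///dev/ttyUSB0:921600", "cam": "gst_csi"},
    "archer": {"addr": "serial:///dev/ttyTHS1:921600", "cam": "gst_csi"},
}


def load_cfg(platform=None):
    """Resolve the active platform config, defaulting to the sim."""
    platform = platform or os.environ.get("AIGP_PLATFORM", "sim")
    if platform not in CONFIGS:
        raise ValueError(
            f"unknown AIGP_PLATFORM {platform!r}; expected one of {sorted(CONFIGS)}")
    return CONFIGS[platform]
```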
import asyncio, cv2, numpy as np
from mavsdk import System
from mavsdk.offboard import VelocityBodyYawspeed
from config import CFG, GST_CSI, PLATFORM
# Camera setup per platform
if CFG["cam"] == "sim":
    cap = None  # sim provides frames via its own API
else:
    cap = cv2.VideoCapture(GST_CSI, cv2.CAP_GSTREAMER)
def detect_gate(frame):
    """Replace with your trained model. Returns (err_x, err_y, area) or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 100, 100), (15, 255, 255))
    cnts, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if cnts:
        c = max(cnts, key=cv2.contourArea)
        if cv2.contourArea(c) > 500:
            M = cv2.moments(c)
            cx, cy = int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])
            return (cx - 320) / 320, (cy - 240) / 240, cv2.contourArea(c)
    return None
async def main():
    drone = System()
    await drone.connect(system_address=CFG["addr"])
    async for s in drone.core.connection_state():
        if s.is_connected:
            break
    async for h in drone.telemetry.health():
        if h.is_local_position_ok:
            break
    await drone.action.arm()
    # PX4 requires a setpoint to be streaming before offboard mode starts
    await drone.offboard.set_velocity_body(VelocityBodyYawspeed(0, 0, 0, 0))
    await drone.offboard.start()
    try:
        while True:
            # get_sim_frame: frame source exposed by the DCL sim's own API
            ret, frame = cap.read() if cap else (True, get_sim_frame())
            if not ret:
                continue
            det = detect_gate(frame)
            if det:
                ex, ey, area = det
                fwd = float(np.clip(3.0 - area / 10000, 0.5, 4.0))  # slow near gate
                yaw = float(-ex * 40)  # deg/s, steer toward gate center
                vrt = float(ey * 2.0)  # m/s, body-frame down (NED)
            else:
                fwd, yaw, vrt = 0.0, 15.0, 0.0  # no gate: hold and yaw-search
            await drone.offboard.set_velocity_body(
                VelocityBodyYawspeed(fwd, 0.0, vrt, yaw))
            await asyncio.sleep(0.01)  # ~100 Hz command rate
    except KeyboardInterrupt:
        await drone.offboard.stop()
        await drone.action.land()

asyncio.run(main())
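The detection-to-command mapping buried inside the loop above is worth extracting into a pure function so it can be unit-tested off-vehicle (the `det_to_cmd` name is illustrative; the gains mirror the main loop):

```python
import numpy as np


def det_to_cmd(det):
    """Map a gate detection (err_x, err_y, area) to (fwd m/s, yaw deg/s, vert m/s)."""
    if det is None:
        return 0.0, 15.0, 0.0  # no gate visible: hold position, yaw-search
    ex, ey, area = det
    fwd = float(np.clip(3.0 - area / 10000, 0.5, 4.0))  # slow down as gate grows
    yaw = float(-ex * 40)   # proportional yaw toward gate center
    vrt = float(ey * 2.0)   # body-frame down velocity (NED: positive = descend)
    return fwd, yaw, vrt
```

Note the clipping behavior: a very large gate area (close approach) clamps forward speed to 0.5 m/s, which is exactly the speed-vs-reliability tradeoff the scoring rewards tuning.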
Based on the AI Grand Prix FAQ and tech spec:
# 1. Download DCL sim package (released at qualification start)
# Includes course, drone model, and submission instructions
# 2. Run your code against the sim locally
export AIGP_PLATFORM=sim
python3 main.py
# Iterate: watch flight logs, tune gate detector, improve planner
# 3. When ready, submit via DCL platform
# Teams can attempt runs any time within qualification window
# Once all gates are passed, your run time is locked
# You can re-submit for a faster time
# 4. Round 1 cutoff: ~end of June
# Round 2 (harder course): ~end of July
# Top teams advance to physical qualifier
Only if you advance past the virtual qualifier. This drone matches the Neros Archer's flight dynamics for sim-to-real testing.
| # | Part | Spec | $ |
|---|---|---|---|
| Frame + propulsion | |||
| 1 | 5" CF frame | 220mm, TBS Source One V5 or iFlight Nazgul F5 | 30 |
| 2 | Motors x4 | 2207 1950KV (iFlight XING2) | 70 |
| 3 | 4-in-1 ESC | 35A BLHeli_S 30x30 (SpeedyBee) | 35 |
| 4 | Props x8 | 5x4.3x3 triblade (Gemfan 51433) | 8 |
| Compute | |||
| 5 | Pixhawk 6C Mini | PX4, 20x20mm (holybro.com) | 90 |
| 6 | Jetson Orin Nano Super | 67T, 8GB (Amazon) | 249 |
| 7 | NVMe SSD 128GB | M.2 2230 | 18 |
| Camera | |||
| 8 | RPi Camera Module 3 Wide | 12MP IMX708, 120° FOV — exact Archer match | 35 |
| 9 | CSI FPC cable 15cm | 22-pin | 3 |
| Power + RC + wiring | |||
| 10 | 4S 1550mAh LiPo | 95C XT60 (Tattu R-Line) | 22 |
| 11 | PM02 V3 + 5V UBEC | Pixhawk power + Jetson power (separate!) | 28 |
| 12 | RadioMaster Pocket ELRS | + HappyModel EP2 receiver | 75 |
| 13 | USB-UART + standoffs + misc | CP2102, nylon standoffs, zip ties, charger | 59 |
| | TOTAL | | 722 |
Assemble CF frame: bottom plate → 4-in-1 ESC (30x30 stack) → standoffs → top plate. No props.
Mount 4x 2207 motors to arm tips. Solder motor wires to ESC pads. Verify motor order and spin direction against the PX4 Quad-X airframe reference.
Vibration pads → Pixhawk 6C Mini center stack. Arrow forward. PM02 → POWER1 port.
PM02 from battery → Pixhawk. UBEC from battery → Jetson barrel jack 5V. ESC power from XT60.
Install NVMe SSD → flash JetPack 6.x on host PC → mount on nylon standoffs above Pixhawk → UBEC barrel jack.
RPi Cam 3 Wide at front, 15° down-tilt. FPC → Jetson CSI-0 (lock tab!). USB-UART: Pixhawk TELEM2 TX→adapter RX, RX→TX (cross!). ELRS RX → Pixhawk RC IN.
See Phase 3 below. Props OFF until all tests pass.
Host Ubuntu PC → NVIDIA SDK Manager → JetPack 6.2+ to NVMe SSD via USB-C recovery.
sudo systemctl enable ssh && sudo systemctl start ssh
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3-pip python3-venv python3-dev git cmake \
build-essential libopencv-dev python3-opencv
sudo apt install -y gstreamer1.0-tools gstreamer1.0-plugins-base \
gstreamer1.0-plugins-good gstreamer1.0-plugins-bad
sudo usermod -aG dialout $USER && sudo reboot
python3 -m venv ~/aigp && source ~/aigp/bin/activate
pip install mavsdk aioconsole numpy pymavlink mavproxy
pip install torch torchvision # verify: python3 -c "import torch; print(torch.cuda.is_available())"
# Live test
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! \
nvvidconv ! 'video/x-raw,width=640,height=480' ! nvvidconv ! nveglglessink
# OpenCV in Python:
import cv2
GST = ("nvarguscamerasrc sensor-id=0 ! "
"video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
"nvvidconv ! video/x-raw,format=BGRx,width=640,height=480 ! "
"videoconvert ! video/x-raw,format=BGR ! appsink drop=1")
cap = cv2.VideoCapture(GST, cv2.CAP_GSTREAMER)
ret, frame = cap.read(); print(frame.shape) # (480, 640, 3)
# QGroundControl (USB to Pixhawk):
# Flash: Firmware → PX4 Pro Stable
# Airframe: Generic Quadrotor X → Reboot
# Calibrate: Sensors → Compass, Gyro, Accel, Level
# Parameters:
# MAV_1_CONFIG=TELEM2 MAV_1_MODE=Onboard SER_TEL2_BAUD=921600
# Reboot. Radio → Calibrate. Flight Modes:
# SW A = Arm SW B = Stabilized/Position/Offboard SW C = Kill
# MAVProxy quick test
mavproxy.py --master=/dev/ttyUSB0 --baudrate=921600 # HEARTBEAT = success
# MAVSDK test
python3 -c "
import asyncio
from mavsdk import System

async def t():
    d = System()
    await d.connect(system_address='serial:///dev/ttyUSB0:921600')
    async for s in d.core.connection_state():
        if s.is_connected:
            print('Connected!')
            break
    async for a in d.telemetry.attitude_euler():
        print(f'Roll:{a.roll_deg:.1f} Pitch:{a.pitch_deg:.1f}')
        break

asyncio.run(t())"
| Flight | Mode | Goal |
|---|---|---|
| 1 | Stabilized (manual) | Hover at 2m for 30s. Repeat 5x. Tune PIDs if oscillating. |
| 2 | Offboard (hover) | Take off manual → switch to Offboard → zero velocity cmd → holds position? |
| 3 | Offboard (slow gate) | One gate, 1 m/s cap. Verify vision pipeline. Increase speed gradually. |
Only config.py changes: connection string and camera device path. Gate detector, planner, control logic — all identical.
# 1. Connect to Archer's onboard computer via WiFi
ssh team@archer-drone.local # or IP from organizers
# 2. Upload your code and model weights
scp -r aigp/ team@archer-drone.local:~/
# 3. Set platform
export AIGP_PLATFORM=archer
# 4. Verify camera
python3 -c "import cv2; from config import GST_CSI; cap=cv2.VideoCapture(GST_CSI, cv2.CAP_GSTREAMER); print(cap.read()[0])"
# 5. Verify MAVSDK connection
python3 -c "from mavsdk import System; ..."
# 6. FLY
python3 main.py
# Drone takes off, races gates, lands. Fastest clean lap = your score.
| Resource | Link |
|---|---|
| AI Grand Prix | theaigrandprix.com |
| Official Rules | theaigrandprix.com/official-rules |
| PX4 User Guide | docs.px4.io/main/en/ |
| MAVSDK-Python | mavsdk-python.mavlink.io/main/ |
| MAVSDK Examples | github.com/mavlink/MAVSDK-Python/tree/main/examples |
| Offboard Mode | docs.px4.io/main/en/flight_modes/offboard.html |
| Jetson Setup | developer.nvidia.com/embedded/learn/get-started-jetson-orin-nano-devkit |
| MonoRace (A2RL Winner) | arxiv.org/abs/2601.15222 |
| QGroundControl | qgroundcontrol.com |
Before power-on: props OFF until all Phase 3 tests pass; double-check battery wiring and the separate Pixhawk/Jetson power paths.
Before takeoff: confirm the kill switch (SW C) works, RC link is solid, and the area is clear.
AIGP_PLATFORM=sim|diy|archer is the only switch. Good luck in the AI Grand Prix.