AI Grand Prix — Complete Build & Deploy Guide

Virtual qualifier → 5" racer build → Neros Archer deploy • AIGP v4.0 • 2026-03-31

Three phases, one codebase. This guide walks you through: (1) writing and submitting code for the virtual qualifier, (2) building a 5" practice drone matching the Neros Archer, and (3) deploying your code to the real Archer at competition. Your Python autonomy stack transfers across all three with a single config change.

Phase 1: Virtual Qualifier (May–July 2026)

No hardware needed. You download a DCL-built simulator, write Python AI code, and submit timed runs remotely.

1.1 How the sim works

Platform         DCL-built sim (download after registration)
Language         Python (primary), C/Cython extensions allowed
Interface        MAVLink v2 via MAVSDK-Python over UDP
Camera           Forward-facing FPV stream (~12MP equivalent)
Sensors          IMU (gyro + accel), motor RPM readouts
GPS/position     Start position only; no coordinates during flight
Track knowledge  Drone does NOT know the track. Detect gates with vision.
Physics rate     120 Hz. Command rate: 50-120 Hz recommended.
Scoring          Time-based. Must pass all gates. Speed vs reliability tradeoff.
Rounds           R1: simpler course. R2 (end July): more complex environment.
Flight logs      Available for analysis and iteration.
Sim connection: Your Python AI ── MAVSDK-Python (UDP) ──▶ DCL Sim (PX4 SITL);
the sim streams vision frames back to your gate detector + planner.
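The spec table above recommends a 50-120 Hz command rate against 120 Hz physics. A fixed-rate pacing helper for the command loop can be sketched as follows; `ticks` is my own name, and scheduling against absolute deadlines (rather than sleeping a fixed period) keeps the average rate honest even when one iteration runs long:

```python
import asyncio
import time

async def ticks(hz, n=None):
    """Yield at a fixed rate. Sleeps toward absolute deadlines so jitter
    in one iteration does not accumulate across the run."""
    period = 1.0 / hz
    deadline = time.monotonic()
    i = 0
    while n is None or i < n:
        yield i
        i += 1
        deadline += period
        await asyncio.sleep(max(0.0, deadline - time.monotonic()))

async def demo():
    # 10 iterations at 100 Hz should take roughly 0.1 s total
    t0 = time.monotonic()
    async for _ in ticks(100, n=10):
        pass
    return time.monotonic() - t0

elapsed = asyncio.run(demo())
print(f"10 ticks at 100 Hz took {elapsed:.3f}s")
```

In the main loop you would replace a bare `await asyncio.sleep(0.01)` with `async for _ in ticks(100): ...` to hold the rate regardless of how long perception takes.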

1.2 Set up your dev environment (your laptop/PC)

# Install Python 3.12+ and create venv
python3 -m venv ~/aigp && source ~/aigp/bin/activate

# Install MAVSDK + dependencies
pip install mavsdk aioconsole numpy opencv-python torch torchvision

# Install PX4 SITL for local testing (before DCL sim releases)
git clone https://github.com/PX4/PX4-Autopilot.git --recursive
cd PX4-Autopilot && make px4_sitl jmavsim
# Sim opens. In another terminal:
python3 -c "
import asyncio
from mavsdk import System
async def t():
    d=System(); await d.connect(system_address='udp://:14540')
    async for s in d.core.connection_state():
        if s.is_connected: print('Connected to SITL!'); break
asyncio.run(t())
"

1.3 Virtual qualifier code structure

aigp/
├── config.py            # Platform switch: sim / diy / archer
├── main.py              # Entry: connect → arm → offboard → loop
├── perception/
│   ├── camera.py        # Camera abstraction (sim stream / GStreamer)
│   └── gate_detector.py # Your neural net (PyTorch → TensorRT)
├── planning/
│   ├── planner.py       # Pure pursuit / MPC / RL policy
│   └── state_est.py     # Fuse vision + IMU for gate-relative state
├── control/
│   └── commander.py     # MAVSDK wrapper: velocity/attitude cmds
└── utils/
    └── logger.py        # Save frames + detections for debugging
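The tree above lists utils/logger.py for saving frames and detections. A minimal stdlib-only sketch (class name and file layout are my own choices, not a prescribed format): one JSONL record per frame, with raw frame dumps subsampled so a long run does not fill the SSD:

```python
import json
import time
from pathlib import Path

class RunLogger:
    """Debug logger sketch: detections.jsonl per run, plus a raw frame
    dump every `every` frames for offline replay of the detector."""
    def __init__(self, root="logs", every=10):
        self.dir = Path(root) / time.strftime("%Y%m%d-%H%M%S")
        self.dir.mkdir(parents=True, exist_ok=True)
        self.det_file = (self.dir / "detections.jsonl").open("w")
        self.every = every
        self.n = 0

    def log(self, frame_bytes, detection):
        """detection: (err_x, err_y, area) or None, as in detect_gate()."""
        rec = {"t": time.time(), "frame": self.n, "det": detection}
        self.det_file.write(json.dumps(rec) + "\n")
        if frame_bytes is not None and self.n % self.every == 0:
            (self.dir / f"frame_{self.n:06d}.raw").write_bytes(frame_bytes)
        self.n += 1

    def close(self):
        self.det_file.close()
```

Call `logger.log(frame.tobytes(), det)` inside the main loop; the JSONL stream lines up detections with frame indices when replaying a run.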

1.4 config.py — one file switches platforms

import os
PLATFORM = os.environ.get("AIGP_PLATFORM", "sim")

CONFIGS = {
    "sim":    {"addr": "udp://:14540",                    "cam": "sim"},
    "diy":    {"addr": "serial:///dev/ttyUSB0:921600",    "cam": "gst_csi"},
    "archer": {"addr": "serial:///dev/ttyTHS1:921600",    "cam": "gst_csi"},
}
CFG = CONFIGS[PLATFORM]

GST_CSI = ("nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "nvvidconv ! video/x-raw,format=BGRx,width=640,height=480 ! "
    "videoconvert ! video/x-raw,format=BGR ! appsink drop=1")

1.5 main.py — the core loop (works in sim and real)

import asyncio, cv2, numpy as np
from mavsdk import System
from mavsdk.offboard import VelocityBodyYawspeed
from config import CFG, GST_CSI, PLATFORM

# Camera setup per platform
if CFG["cam"] == "sim":
    cap = None  # sim provides frames via its own API
else:
    cap = cv2.VideoCapture(GST_CSI, cv2.CAP_GSTREAMER)

def detect_gate(frame):
    """Replace with your trained model. Returns (err_x, err_y, area) or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0,100,100), (15,255,255))
    cnts,_ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if cnts:
        c = max(cnts, key=cv2.contourArea)
        if cv2.contourArea(c) > 500:
            M = cv2.moments(c)
            cx,cy = int(M["m10"]/M["m00"]), int(M["m01"]/M["m00"])
            # normalize center error to roughly [-1,1], assuming 640x480 frames
            return (cx-320)/320, (cy-240)/240, cv2.contourArea(c)
    return None

async def main():
    drone = System()
    await drone.connect(system_address=CFG["addr"])
    async for s in drone.core.connection_state():
        if s.is_connected: break
    async for h in drone.telemetry.health():
        if h.is_local_position_ok: break
    await drone.action.arm()
    await drone.offboard.set_velocity_body(VelocityBodyYawspeed(0,0,0,0))
    await drone.offboard.start()
    try:
        while True:
            # cap is None in sim: get_sim_frame() stands in for the DCL
            # sim's own frame API (defined once the sim package is released)
            ret, frame = cap.read() if cap else (True, get_sim_frame())
            if not ret: continue
            det = detect_gate(frame)
            if det:
                ex,ey,area = det
                fwd=float(np.clip(3.0-area/10000,0.5,4.0))
                yaw=float(-ex*40); vrt=float(ey*2.0)
            else:
                fwd,yaw,vrt = 0.0,15.0,0.0
            await drone.offboard.set_velocity_body(
                VelocityBodyYawspeed(fwd,0.0,vrt,yaw))
            await asyncio.sleep(0.01)
    except (KeyboardInterrupt, asyncio.CancelledError):
        # Ctrl+C typically surfaces inside asyncio.run() as CancelledError
        await drone.offboard.stop(); await drone.action.land()

asyncio.run(main())
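The gate-to-velocity mapping inside the loop is worth extracting into a pure function so it can be unit-tested without the sim or any hardware. A sketch with the same constants as the loop above; `gate_to_cmd` is my own name:

```python
def gate_to_cmd(det):
    """Map a detection (err_x, err_y, area) or None to
    (forward m/s, yaw deg/s, vertical m/s), mirroring the main loop."""
    if det is None:
        return 0.0, 15.0, 0.0                         # no gate: spin to search
    ex, ey, area = det
    fwd = min(max(3.0 - area / 10000, 0.5), 4.0)      # slow down as gate grows
    return fwd, -ex * 40.0, ey * 2.0                  # steer toward gate center

# Sanity checks, no simulator needed:
print(gate_to_cmd(None))                # (0.0, 15.0, 0.0)
print(gate_to_cmd((0.5, 0.0, 5000)))    # (2.5, -20.0, 0.0)
```

Keeping the control law pure also makes it trivial to replay logged detections through candidate tunings between qualifier attempts.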

1.6 Submit to virtual qualifier


Based on the AI Grand Prix FAQ and tech spec:

# 1. Download DCL sim package (released at qualification start)
#    Includes course, drone model, and submission instructions

# 2. Run your code against the sim locally
export AIGP_PLATFORM=sim
python3 main.py
# Iterate: watch flight logs, tune gate detector, improve planner

# 3. When ready, submit via DCL platform
#    Teams can attempt runs any time within qualification window
#    Once all gates are passed, your run time is locked
#    You can re-submit for a faster time

# 4. Round 1 cutoff: ~end of June
#    Round 2 (harder course): ~end of July
#    Top teams advance to physical qualifier
Scoring: Time-based, but you must pass all gates. Faster control increases missed-gate risk. Balance speed and reliability. Flight data logs are available for analysis between attempts.

Phase 2: Build 5" Archer-Match Racer (August 2026)

Only if you advance past the virtual qualifier. This drone matches the Neros Archer's flight dynamics for sim-to-real testing.

2.1 What you're matching

Neros Archer (top view): forward-facing 12MP cam, DCL AI VECTOR stack on
Orin NX-class compute, IMU • ~200mm • 5.1" props • 160 km/h • 960-1400 g

DIY practice build (top view): RPi Cam 3 Wide forward, Pixhawk 6C Mini
(PX4/MAVSDK) + Jetson Orin NX or Nano, 4-in-1 ESC • 220mm • 5" props •
100-160 km/h • 950-1100 g • $694

2.2 Parts list — $694

#   Part                         Spec                                            $
Frame + propulsion
1   5" CF frame                  220mm, TBS Source One V5 or iFlight Nazgul F5   30
2   Motors x4                    2207 1950KV (iFlight XING2)                     70
3   4-in-1 ESC                   35A BLHeli_S 30x30 (SpeedyBee)                  35
4   Props x8                     5x4.3x3 triblade (Gemfan 51433)                  8
Compute
5   Pixhawk 6C Mini              PX4, 20x20mm (holybro.com)                      90
6   Jetson Orin Nano Super       67 TOPS, 8GB (Amazon)                          249
7   NVMe SSD 128GB               M.2 2230                                        18
Camera
8   RPi Camera Module 3 Wide     12MP IMX708, 120° FOV — exact Archer match      35
9   CSI FPC cable 15cm           22-pin                                           3
Power + RC + wiring
10  4S 1550mAh LiPo              95C, XT60 (Tattu R-Line)                        22
11  PM02 V3 + 5V UBEC            Pixhawk power + Jetson power (separate!)        28
12  RadioMaster Pocket ELRS      + HappyModel EP2 receiver                       75
13  USB-UART + standoffs + misc  CP2102, nylon standoffs, zip ties, charger      59
TOTAL                                                                         $694
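A quick endurance sanity check on the 4S 1550mAh pack. The 30 A average race draw and 80% usable capacity below are my assumptions for illustration, not measured numbers; substitute your own logged current draw:

```python
def flight_time_min(capacity_mah, avg_draw_a, usable=0.8):
    """Rough endurance estimate: usable battery capacity over average current."""
    return capacity_mah / 1000.0 * usable / avg_draw_a * 60.0

# 4S 1550mAh at an assumed 30 A average draw:
print(round(flight_time_min(1550, 30), 1))  # 2.5 (minutes)
```

A few minutes per pack is normal for 5" racing; plan on several charged packs per practice session.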

2.3 Build — 7 steps, ~2 hours

1 — FRAME 15min

Assemble CF frame: bottom plate → 4-in-1 ESC (30x30 stack) → standoffs → top plate. No props.

2 — MOTORS 20min

Mount 4x 2207 motors to arm tips. Solder motor wires to ESC pads. Verify PX4 Quad-X spin:

        FRONT
   M1 CCW    M2 CW
   M4 CW     M3 CCW
        REAR

Wrong spin = instant flip. Test with QGC motor test (no props!) first.
3 — PIXHAWK 10min

Vibration pads → Pixhawk 6C Mini center stack. Arrow forward. PM02 → POWER1 port.

4 — POWER 15min

PM02 from battery → Pixhawk. UBEC from battery → Jetson barrel jack 5V. ESC power from XT60.

Separate power domains. Pixhawk = PM02 only. Jetson = UBEC only. Never cross.
4S LiPo 14.8V ─┬─ PM02 → Pixhawk
               ├─ ESC  → Motors
               └─ UBEC → Jetson
5 — JETSON 15min

Install NVMe SSD → flash JetPack 6.x on host PC → mount on nylon standoffs above Pixhawk → UBEC barrel jack.

6 — CAMERA + WIRING 10min

RPi Cam 3 Wide at front, 15° down-tilt. FPC → Jetson CSI-0 (lock tab!). USB-UART: Pixhawk TELEM2 TX→adapter RX, RX→TX (cross!). ELRS RX → Pixhawk RC IN.

Pixhawk 6C (TELEM2, RC IN) ── UART 921600, cross TX↔RX ── Jetson Orin (USB-UART)
RPi Cam 3 Wide ── CSI FPC ── Jetson CSI-0
ELRS RX ── CRSF ── Pixhawk RC IN
7 — SOFTWARE + TEST 45min

See Phase 3 below. Props OFF until all tests pass.

Phase 3: Software Setup & First Flight

3.1 FLASH JETPACK

Host Ubuntu PC → NVIDIA SDK Manager → JetPack 6.2+ to NVMe SSD via USB-C recovery.

sudo systemctl enable ssh && sudo systemctl start ssh
3.2 INSTALL PACKAGES
sudo apt update && sudo apt upgrade -y
sudo apt install -y python3-pip python3-venv python3-dev git cmake \
    build-essential libopencv-dev python3-opencv
sudo apt install -y gstreamer1.0-tools gstreamer1.0-plugins-base \
    gstreamer1.0-plugins-good gstreamer1.0-plugins-bad
sudo usermod -aG dialout $USER && sudo reboot
3.3 PYTHON ENV
python3 -m venv ~/aigp && source ~/aigp/bin/activate
pip install mavsdk aioconsole numpy pymavlink mavproxy
pip install torch torchvision  # verify: python3 -c "import torch; print(torch.cuda.is_available())"
3.4 TEST CAMERA
# Live test
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
    'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! \
    nvvidconv ! 'video/x-raw,width=640,height=480' ! nvvidconv ! nveglglessink

# OpenCV in Python:
import cv2
GST = ("nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1 ! "
    "nvvidconv ! video/x-raw,format=BGRx,width=640,height=480 ! "
    "videoconvert ! video/x-raw,format=BGR ! appsink drop=1")
cap = cv2.VideoCapture(GST, cv2.CAP_GSTREAMER)
ret, frame = cap.read(); print(frame.shape)  # (480, 640, 3)
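Beyond grabbing a single frame, it is worth confirming the pipeline sustains its advertised 30 fps before relying on it in flight. A small helper sketch (my own; works with any cv2-style `read()` callable):

```python
import time

def measure_fps(read_fn, n_frames=120):
    """Average FPS over n_frames successful reads from a cv2-style read()."""
    t0 = time.monotonic()
    got = 0
    while got < n_frames:
        ok, _ = read_fn()
        if ok:
            got += 1
    elapsed = time.monotonic() - t0
    return n_frames / max(elapsed, 1e-9)

# On the Jetson with the capture above: print(measure_fps(cap.read))
```

If the measured rate falls well below the pipeline's 30/1 framerate cap, suspect the appsink backing up (`drop=1` should prevent this) or CPU contention from the inference workload.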
3.5 CONFIGURE PIXHAWK
# QGroundControl (USB to Pixhawk):
# Flash: Firmware → PX4 Pro Stable
# Airframe: Generic Quadrotor X → Reboot
# Calibrate: Sensors → Compass, Gyro, Accel, Level
# Parameters:
#   MAV_1_CONFIG=TELEM2  MAV_1_MODE=Onboard  SER_TEL2_BAUD=921600
# Reboot. Radio → Calibrate. Flight Modes:
#   SW A = Arm   SW B = Stabilized/Position/Offboard   SW C = Kill
3.6 TEST LINK
# MAVProxy quick test
mavproxy.py --master=/dev/ttyUSB0 --baudrate=921600  # HEARTBEAT = success

# MAVSDK test
python3 -c "
import asyncio; from mavsdk import System
async def t():
    d=System(); await d.connect(system_address='serial:///dev/ttyUSB0:921600')
    async for s in d.core.connection_state():
        if s.is_connected: print('Connected!'); break
    async for a in d.telemetry.attitude_euler():
        print(f'Roll:{a.roll_deg:.1f} Pitch:{a.pitch_deg:.1f}'); break
asyncio.run(t())"
If attitude prints, your full stack works. Same MAVSDK API as the sim — only the connection string changed.

3.7 First flight procedure

Safety is not optional. This drone hits 160 km/h. Fly outdoors, open area, spotter, kill switch tested.
Flight  Mode                  Goal
1       Stabilized (manual)   Hover at 2m for 30s. Repeat 5x. Tune PIDs if oscillating.
2       Offboard (hover)      Take off manually → switch to Offboard → zero-velocity cmd → verify it holds position.
3       Offboard (slow gate)  One gate, 1 m/s cap. Verify vision pipeline. Increase speed gradually.

Phase 4: Deploy to Neros Archer at Competition

At the physical qualifier (Sep, SoCal) and finals (Nov, Ohio), your code runs on the actual Neros Archer. Hardware is provided. No modifications. You upload software. It flies autonomously.

4.1 What changes

One codebase, three targets
YOUR CODE ──▶ SIM            udp://:14540            May–Jul
          ──▶ DIY DRONE      serial:///dev/ttyUSB0   Aug
          ──▶ NEROS ARCHER   TBD at event            Sep–Nov

Only config.py changes: connection string and camera device path. Gate detector, planner, control logic — all identical.

4.2 Expected deploy workflow at the event

# 1. Connect to Archer's onboard computer via WiFi
ssh team@archer-drone.local    # or IP from organizers

# 2. Upload your code and model weights
scp -r aigp/ team@archer-drone.local:~/

# 3. Set platform
export AIGP_PLATFORM=archer

# 4. Verify camera
python3 -c "import cv2; from config import GST_CSI; cap=cv2.VideoCapture(GST_CSI, cv2.CAP_GSTREAMER); print(cap.read()[0])"

# 5. Verify MAVSDK connection
python3 -c "from mavsdk import System; ..."

# 6. FLY
python3 main.py
# Drone takes off, races gates, lands. Fastest clean lap = your score.

4.3 Pre-event checklist

The Archer's connection string and camera device WILL differ from your DIY build. Organizers provide specs at the event. With config.py, you update two lines. Everything else stays the same.

Reference

Stays the same (sim → real)

Gate detector, planner, state estimator, control logic, and every MAVSDK call in the main loop.

Changes (sim → real)

Connection string (UDP → serial) and camera source (sim frames → GStreamer CSI pipeline); both isolated in config.py.

Key links

AI Grand Prix           theaigrandprix.com
Official Rules          theaigrandprix.com/official-rules
PX4 User Guide          docs.px4.io/main/en/
MAVSDK-Python           mavsdk-python.mavlink.io/main/
MAVSDK Examples         github.com/mavlink/MAVSDK-Python/tree/main/examples
Offboard Mode           docs.px4.io/main/en/flight_modes/offboard.html
Jetson Setup            developer.nvidia.com/embedded/learn/get-started-jetson-orin-nano-devkit
MonoRace (A2RL Winner)  arxiv.org/abs/2601.15222
QGroundControl          qgroundcontrol.com

Pre-flight checklist

Before power-on:

- Props OFF
- Power domains separate: PM02 → Pixhawk only, UBEC → Jetson only, never crossed
- CSI FPC seated, lock tab closed
- UART crossed: Pixhawk TX → adapter RX, RX → TX

Before takeoff:

- Kill switch tested
- Motor order and spin direction verified in QGC (props off)
- Open outdoor area, spotter present, battery secured

You're set. Virtual qualifier code → 5" practice drone → Neros Archer deploy, all from one codebase. AIGP_PLATFORM=sim|diy|archer is the only switch. Good luck in the AI Grand Prix.
