Agriculture Robot — RFGYC 2025-2026
Autonomous agricultural robot built for the Robotics for Good Youth Challenge 2025-2026. ROS2 Jazzy codebase with computer vision for crop detection, autonomous navigation, and precision manipulation. Designed for cultivation, irrigation, harvesting, and sorting missions on a competition field.
- Robotics
- Computer Vision
- AI
Overview
Project overview
A ROS2 Jazzy Jalisco codebase implementing an autonomous agricultural robot for the RFGYC 2025-2026 Agriculture Edition. The robot is designed to perform four mission types on a 2362 mm × 1143 mm competition field: cultivation, irrigation, harvesting, and sorting. The system uses computer vision powered by OpenCV and YOLO for crop and plot detection, with a state machine for match-level mission control. The codebase runs on an HP ZBook 15 G6 (64 GB RAM, Quadro T2000 GPU) and is architected for both Gazebo simulation and deployment on physical robotic hardware.
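The five package names come from the project; the directory layout below is a sketch assuming a standard colcon workspace, not the repository's actual tree.

```text
ros2_ws/
└── src/
    ├── sensing/          # computer vision: camera capture, OpenCV + YOLO detection
    ├── navigation/       # odometry and motion control
    ├── manipulation/     # seed planting and picking actuators
    ├── mission_control/  # match-level state machine
    └── simulation/       # Gazebo world and launch files
```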
Problem
What it solves
Agricultural robotics at the competition level requires a system that can operate reliably under time pressure with no manual intervention. Each mission phase — detecting where to plant, irrigating specific plots, identifying ripe fruit, and sorting harvested items — requires its own sensing and actuation pipeline, all orchestrated by a single state machine that executes the correct sequence without ambiguity. The architecture had to be modular enough that individual subsystems could be debugged in simulation independently of the full match flow.
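A match-level sequencer of this kind can be sketched as a small state machine. The four mission names are from the project, but the class, transition table, and completion hook below are illustrative assumptions, not the actual mission_control implementation.

```python
from enum import Enum, auto

class Mission(Enum):
    """Match phases, in the (assumed) fixed order the robot executes them."""
    IDLE = auto()
    CULTIVATION = auto()
    IRRIGATION = auto()
    HARVESTING = auto()
    SORTING = auto()
    DONE = auto()

# Linear transition table: each phase advances to the next once the
# responsible subsystem reports completion.
TRANSITIONS = {
    Mission.IDLE: Mission.CULTIVATION,
    Mission.CULTIVATION: Mission.IRRIGATION,
    Mission.IRRIGATION: Mission.HARVESTING,
    Mission.HARVESTING: Mission.SORTING,
    Mission.SORTING: Mission.DONE,
}

class MissionControl:
    def __init__(self):
        self.state = Mission.IDLE

    def phase_complete(self):
        """Advance to the next phase; a subsystem callback would call this."""
        self.state = TRANSITIONS.get(self.state, Mission.DONE)
        return self.state
```

Driving `phase_complete` only from subsystem completion signals is what keeps the match sequence unambiguous: no phase can start before the previous one has finished.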
Build
Implementation details
What I worked on
- Lead Engineer and Architect
- Designed the ROS2 package structure with clear separation between sensing, navigation, manipulation, mission control, and simulation
- Implemented the computer vision pipeline using OpenCV for fruit and plot detection with YOLO for inference on the Quadro T2000 GPU
- Built the mission control state machine for match-level sequencing of all four mission types
- Developed Gazebo simulation world for testing autonomous behaviors without physical hardware
- Authored hardware deployment guides for cameras, ultrasonic sensors, encoders, and actuator systems
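The actual detection pipeline runs OpenCV and YOLO on the GPU; the stdlib sketch below only illustrates the color-threshold idea behind a ripeness check. The hue bands and category names are invented for this example, not tuned competition values.

```python
import colorsys

def classify_ripeness(r, g, b):
    """Classify a fruit region's ripeness from its mean RGB color.

    Purely illustrative: the hue bands are invented for this sketch, and the
    real pipeline classifies full images with OpenCV + YOLO rather than a
    single averaged color.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    if s < 0.25 or v < 0.2:
        return "unknown"   # too washed out or dark to judge
    if hue_deg < 25 or hue_deg > 340:
        return "ripe"      # red band
    if 70 <= hue_deg <= 170:
        return "unripe"    # green band
    return "turning"       # orange/yellow in between
```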
Technical implementation
1. ROS2 Jazzy Jalisco as the middleware layer for node communication, message passing, and lifecycle management
2. Five ROS2 packages: sensing (computer vision), navigation (odometry and motion), manipulation (seed planting and picking), mission_control (state machine), simulation (Gazebo world)
3. OpenCV and YOLO for real-time fruit detection and plot identification, leveraging the Quadro T2000 GPU for inference
4. State machine in mission_control coordinating the full match sequence across cultivation, irrigation, harvesting, and sorting phases
5. Gazebo simulation environment for development and testing without physical hardware dependency
6. Designed for adaptability: real hardware deployment via cameras, ultrasonic sensors, encoders, and mechanical actuators
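The navigation package computes odometry from wheel encoders. A standard differential-drive dead-reckoning update can be sketched as below; the function and its parameters are a generic textbook model, not the project's navigation code.

```python
import math

def odometry_step(x, y, theta, d_left, d_right, track_width):
    """One dead-reckoning update for a differential-drive base.

    d_left / d_right are wheel travel distances (m) since the last encoder
    read; track_width is the wheel separation (m). Midpoint model: a
    generic sketch, not the project's actual implementation.
    """
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / track_width
    # Advance along the heading at the midpoint of the rotation.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    # Wrap heading to (-pi, pi].
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
    return x, y, theta
```

Integrating this at every encoder tick gives the pose estimate the motion controller consumes; in simulation the same math can be checked against Gazebo's ground-truth pose.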