Kristian Hans Onjala Full-Stack Engineer / Cofounder / STEM Mentor

Agriculture Robot — RFGYC 2025-2026

Autonomous agricultural robot built for the Robotics for Good Youth Challenge 2025-2026. ROS2 Jazzy codebase with computer vision for crop detection, autonomous navigation, and precision manipulation. Designed for cultivation, irrigation, harvesting, and sorting missions on a competition field.

  • Robotics
  • Computer Vision
  • AI

Tech Stack

ROS2 Jazzy · Python 3 · OpenCV · YOLO · Gazebo · Ubuntu 24.04 · C++

Build Highlights

  • ROS2 Jazzy Jalisco as the middleware layer for node communication, message passing, and lifecycle management
  • Five ROS2 packages: sensing (computer vision), navigation (odometry and motion), manipulation (seed planting and picking), mission_control (state machine), simulation (Gazebo world)
  • OpenCV and YOLO for real-time fruit detection and plot identification, leveraging the Quadro T2000 GPU for inference
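The five packages above suggest a workspace layout roughly like the following. This is an illustrative sketch only; the actual workspace, package, and directory names may differ.

```
agri_robot_ws/src/
├── sensing/          # OpenCV + YOLO detection nodes
├── navigation/       # odometry and motion control
├── manipulation/     # seed planting and picking
├── mission_control/  # match-level state machine
└── simulation/       # Gazebo world and launch files
```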

Overview

Project overview

A ROS2 Jazzy Jalisco codebase implementing an autonomous agricultural robot for the RFGYC 2025-2026 Agriculture Edition. The robot performs four mission types on a 2362 mm × 1143 mm competition field: cultivation, irrigation, harvesting, and sorting. Computer vision powered by OpenCV and YOLO handles crop and plot detection, and a state machine provides match-level mission control. The codebase runs on an HP ZBook 15 G6 (64 GB RAM, Quadro T2000 GPU) and is architected for both Gazebo simulation and deployment on physical robotic hardware.

Problem

What it solves

Agricultural robotics at the competition level requires a system that can operate reliably under time pressure with no manual intervention. Each mission phase — detecting where to plant, irrigating specific plots, identifying ripe fruit, and sorting harvested items — requires its own sensing and actuation pipeline, all orchestrated by a single state machine that executes the correct sequence without ambiguity. The architecture had to be modular enough that individual subsystems could be debugged in simulation independently of the full match flow.

Build

Implementation details

What I worked on

  • Lead Engineer and Architect
  • Designed the ROS2 package structure with clear separation between sensing, navigation, manipulation, mission control, and simulation
  • Implemented the computer vision pipeline using OpenCV for fruit and plot detection with YOLO for inference on the Quadro T2000 GPU
  • Built the mission control state machine for match-level sequencing of all four mission types
  • Developed Gazebo simulation world for testing autonomous behaviors without physical hardware
  • Authored hardware deployment guides for cameras, ultrasonic sensors, encoders, and actuator systems

Technical implementation

  1. ROS2 Jazzy Jalisco as the middleware layer for node communication, message passing, and lifecycle management
  2. Five ROS2 packages: sensing (computer vision), navigation (odometry and motion), manipulation (seed planting and picking), mission_control (state machine), simulation (Gazebo world)
  3. OpenCV and YOLO for real-time fruit detection and plot identification, leveraging the Quadro T2000 GPU for inference
  4. State machine in mission_control coordinating the full match sequence across cultivation, irrigation, harvesting, and sorting phases
  5. Gazebo simulation environment for development and testing without physical hardware dependency
  6. Designed for adaptability: real hardware deployment via cameras, ultrasonic sensors, encoders, and mechanical actuators
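The odometry side of the navigation package can be illustrated with a standard differential-drive update from wheel encoder ticks. This is a generic sketch, not the project's actual code; wheel radius, track width, and encoder resolution are placeholder values.

```python
import math

# Placeholder robot parameters (assumed, not from the project)
WHEEL_RADIUS = 0.03   # metres
TRACK_WIDTH = 0.18    # metres between wheel contact points
TICKS_PER_REV = 360   # encoder ticks per wheel revolution

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Integrate one pair of encoder readings into the pose (x, y, theta)."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / TRACK_WIDTH
    # Using the midpoint heading approximates the arc better than the start heading.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2 * math.pi)
    return x, y, theta
```

On a physical robot this update would run in an encoder-callback node and publish the pose as a standard odometry message.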
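Downstream of the YOLO detector, a simple colour check on each detection box can separate ripe from unripe fruit. The sketch below assumes the mean HSV colour of a crop region has already been computed; the hue thresholds are hypothetical placeholders on OpenCV's 0-179 hue scale, not the competition-tuned values.

```python
# Hypothetical hue bands (OpenCV hue scale 0-179)
RIPE_HUE_MAX = 15             # low hues ~ red (assumed threshold)
UNRIPE_HUE_RANGE = (35, 85)   # green band (assumed threshold)

def classify_ripeness(hue, saturation, min_sat=80):
    """Return 'ripe', 'unripe', or 'unknown' from the mean HSV of a detection."""
    if saturation < min_sat:
        return "unknown"      # washed-out colour: do not commit to a pick
    if hue <= RIPE_HUE_MAX or hue >= 170:
        return "ripe"         # red wraps around both ends of the hue circle
    if UNRIPE_HUE_RANGE[0] <= hue <= UNRIPE_HUE_RANGE[1]:
        return "unripe"
    return "unknown"
```

A sensing node would call this per detection and publish only confident classifications to the manipulation pipeline.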
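The match-level sequencing in mission_control can be sketched as a small finite state machine. The phase names come from the project; the strictly linear transition table here is an illustrative assumption (a real match controller may branch, retry, or abort).

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()
    CULTIVATION = auto()
    IRRIGATION = auto()
    HARVESTING = auto()
    SORTING = auto()
    DONE = auto()

# Hypothetical linear transition table for a full match
TRANSITIONS = {
    Phase.IDLE: Phase.CULTIVATION,
    Phase.CULTIVATION: Phase.IRRIGATION,
    Phase.IRRIGATION: Phase.HARVESTING,
    Phase.HARVESTING: Phase.SORTING,
    Phase.SORTING: Phase.DONE,
}

class MissionControl:
    """Advances the match through its phases as each one reports completion."""

    def __init__(self):
        self.phase = Phase.IDLE

    def on_phase_complete(self):
        # DONE is absorbing: further completions leave the match finished.
        self.phase = TRANSITIONS.get(self.phase, Phase.DONE)
        return self.phase
```

In the full system, each phase's completion signal would arrive as a topic or action result from the sensing, navigation, and manipulation nodes.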
