A sophisticated autonomous cleaning robot built with cutting-edge robotics technologies, featuring advanced 3D perception, autonomous navigation, and intelligent coverage planning.
For detailed technical implementation, system architecture, and operational details, see our comprehensive Technical Report.
The Sucky Robot represents a comprehensive robotics engineering achievement, integrating state-of-the-art technologies to create an intelligent autonomous cleaning platform. This project demonstrates expertise in:
- Advanced 3D Perception & SLAM using NVIDIA Isaac ROS
- Real-time Navigation with Nav2 and NVBLOX 3D mapping
- Custom Hardware Integration with Arduino-based peripheral control
- Multi-Modal Sensor Fusion combining LiDAR and Visual SLAM
- Intelligent Coverage Planning for systematic area cleaning
- ROS 2 Humble - Primary robotics middleware
- NVIDIA Isaac ROS - GPU-accelerated perception pipeline
- NVBLOX - Real-time 3D reconstruction and mapping
- Nav2 - Advanced autonomous navigation stack
- RoboClaw Motor Controllers - Precision drive control
- Differential Drive Base with RoboClaw motor controllers
- SICK TiM LiDAR (135° field of view) for obstacle detection
- Intel RealSense Camera for Visual SLAM and 3D mapping
- Arduino-based Peripheral Controller for cleaning subsystems
- Custom Cyclone Vacuum System with automated debris handling
- Real-time Visual SLAM using Isaac ROS Visual SLAM
- 3D Voxel Mapping with NVBLOX for dense reconstruction
- LiDAR-based Obstacle Detection with 270° coverage
- Dynamic Object Detection for moving obstacle avoidance
- Autonomous Navigation with Nav2 stack integration
- Coverage Path Planning for systematic area cleaning
- Real-time Costmap Generation from 3D perception data
- Multi-mode Operation: Teleop, Mapping, and Autonomous Navigation
- ROS 2 Control Framework for motor management
- Arduino Integration for vacuum and cleaning peripherals
- Real-time Serial Communication with custom protocols
- Safety Systems with timeout protection and emergency stops
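The serial link to the Arduino peripheral controller can be sketched as a framed protocol plus a command watchdog for the timeout protection mentioned above. This is a minimal illustration, not the actual firmware protocol: the start byte, frame layout, XOR checksum, and 0.5 s timeout are all assumptions for the example.

```python
import time

START = 0xAA  # hypothetical start-of-frame marker (not from the real firmware)

def encode_frame(cmd_id: int, payload: bytes) -> bytes:
    """Build a frame: start byte, command id, payload length, payload, XOR checksum."""
    body = bytes([cmd_id, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return bytes([START]) + body + bytes([checksum])

def decode_frame(frame: bytes):
    """Validate and unpack a frame; return (cmd_id, payload) or None if corrupt."""
    if len(frame) < 4 or frame[0] != START:
        return None
    body, checksum = frame[1:-1], frame[-1]
    x = 0
    for b in body:
        x ^= b
    if x != checksum or body[1] != len(body) - 2:
        return None
    return body[0], bytes(body[2:])

class CommandWatchdog:
    """Safety timeout: if no command is fed within `timeout` seconds,
    the caller should command an emergency stop."""
    def __init__(self, timeout: float = 0.5):
        self.timeout = timeout
        self.last = time.monotonic()

    def feed(self) -> None:
        self.last = time.monotonic()

    def expired(self) -> bool:
        return time.monotonic() - self.last > self.timeout
```

In a real node, `encode_frame` output would be written to the serial port each control cycle, and `expired()` checked to trigger the e-stop path.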
📦 Modular Launch System
├── 🎮 Teleop Mode - Manual control with joystick
├── 🗺️ Mapping Mode - SLAM and environment mapping
└── 🤖 Navigation Mode - Full autonomous operation
| Component | Specification |
|---|---|
| Navigation | Nav2 stack with AMCL localization |
| Mapping | SLAM Toolbox + NVBLOX 3D reconstruction |
| Sensors | SICK TiM LiDAR + Intel RealSense Camera |
| Control | ROS 2 Control with differential drive |
| Visualization | RViz2 + Foxglove Bridge integration |
| Coverage | Custom waypoint-based coverage planner |
- Manual control via joystick
- Real-time sensor monitoring
- Emergency stop capabilities
- Diagnostic visualization
- Simultaneous Localization and Mapping (SLAM)
- Real-time 3D reconstruction with NVBLOX
- Environment exploration and map building
- Visual and LiDAR sensor fusion
- Full autonomous operation
- Coverage path planning and execution
- Dynamic obstacle avoidance
- Mission-based operation control
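Mission-based operation can be pictured as a small state machine that walks the robot through a queue of coverage waypoints. The states and transitions below are illustrative only; the actual `sucky_mission` commander may be structured differently.

```python
from enum import Enum, auto

class MissionState(Enum):
    IDLE = auto()
    NAVIGATING = auto()
    CLEANING = auto()
    DONE = auto()

class MissionCommander:
    """Illustrative mission loop: navigate to each waypoint, clean, repeat.

    State names are assumptions for this sketch, not the real implementation.
    """
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)
        self.state = MissionState.IDLE

    def tick(self, goal_reached: bool = False) -> MissionState:
        if self.state is MissionState.IDLE:
            self.state = (MissionState.NAVIGATING if self.waypoints
                          else MissionState.DONE)
        elif self.state is MissionState.NAVIGATING and goal_reached:
            self.waypoints.pop(0)          # arrived at the current waypoint
            self.state = MissionState.CLEANING
        elif self.state is MissionState.CLEANING:
            # cleaning at a waypoint is modeled as a single tick here
            self.state = (MissionState.NAVIGATING if self.waypoints
                          else MissionState.DONE)
        return self.state
```

In practice each `NAVIGATING` step would correspond to sending a Nav2 goal and waiting for its result callback.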
- `sucky` - Main robot package with launch files and configurations
- `sucky_arduino` + `arduino_controller` - Firmware and ROS integration for cyclone, airlock, door, and shaker motor control
- `sucky_coverage` - ROS node for rectangle-to-coverage waypoint conversion
- `sucky_mission` - Mission commander for high-level control after deployment
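The rectangle-to-waypoint conversion performed by the coverage package can be sketched as a boustrophedon (lawnmower) sweep: parallel passes across the rectangle, alternating direction, spaced by the cleaning width. The function name and parameters below are assumptions for illustration, not the real node's interface.

```python
def rectangle_coverage(x_min: float, y_min: float,
                       x_max: float, y_max: float,
                       spacing: float):
    """Convert a rectangle into boustrophedon coverage waypoints.

    Sketch of what a rectangle-to-waypoint converter might do; `spacing`
    would typically be set to the effective cleaning width of the vacuum.
    """
    waypoints = []
    y = y_min
    left_to_right = True
    while y <= y_max + 1e-9:            # epsilon so the final row is included
        if left_to_right:
            waypoints += [(x_min, y), (x_max, y)]
        else:
            waypoints += [(x_max, y), (x_min, y)]
        left_to_right = not left_to_right
        y += spacing
    return waypoints
```

Alternating the sweep direction each row minimizes the turning distance between passes, which is why lawnmower patterns are a common default for coverage planners.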
- Real-time 3D Mapping with GPU acceleration
- Intelligent Coverage Planning for efficient cleaning patterns
- Multi-sensor Fusion combining visual and LiDAR data
- Modular Architecture supporting multiple operational modes
- Foxglove Studio for advanced visualization and debugging
- Custom Arduino Integration for peripheral device control
- Real-time Diagnostics and system health monitoring
- Configurable Parameters for different environments
- Real-time Operation at 30Hz perception processing
- Sub-meter Localization Accuracy with multi-sensor fusion
- Efficient Coverage Planning with optimized path generation
- Robust Obstacle Avoidance with 140° LiDAR coverage
# Launch Isaac ROS Docker container
./isaac_ros_common/scripts/run_dev.sh
# Install dependencies and build
source setup.sh
# Launch in autonomous navigation mode
ros2 launch sucky sucky_bringup.launch.py mode:=navigation
# Launch in mapping mode for environment exploration
ros2 launch sucky sucky_bringup.launch.py mode:=mapping
# Launch in teleoperation mode for manual control
ros2 launch sucky sucky_bringup.launch.py mode:=teleop

| Feature | Description |
|---|---|
| SLAM Mapping | Real-time SLAM mapping showing environment reconstruction |
| Transform Tree | Transform tree demonstrating sensor frame relationships |
| Foxglove Studio | Advanced visualization with Foxglove Studio integration |
- Advanced ROS 2 Development with custom packages and nodes
- GPU-Accelerated Perception using NVIDIA Isaac ROS
- Real-time Systems Programming with strict timing requirements
- Hardware-Software Integration across multiple platforms
- Computer Vision & SLAM implementation and optimization
- Autonomous Navigation with advanced path planning
- System Architecture Design for complex robotic systems
This project showcases comprehensive robotics engineering capabilities including:
- System Integration - Seamlessly combining diverse technologies
- Performance Optimization - Real-time operation with resource constraints
- Modular Design - Extensible architecture for future enhancements
- Safety Engineering - Robust fault handling and emergency protocols
- Documentation - Comprehensive system documentation and user guides
- Alexander Roller – Robotics Engineer – @AlexanderRoller
- Jason Koubi – Software Developer – @jkoubs
- Benjamin Cantero – Mechanical Lead
- Hampton Lumber – Project sponsor and host company (internship; primary stakeholder for whom the robot is built)
- George Fox University – Chassis development and fabrication support
- Eric Cox – RoboClaw hardware interface for ROS 2 Control (Apache 2.0)
- NVIDIA Isaac ROS Team – GPU-accelerated perception and SLAM capabilities
- Open Source Robotics Foundation โ ROS 2, Nav2, and core robotics frameworks



