The Sucky robot is an advanced autonomous cleaning platform with three main operational capabilities: teleoperation, mapping, and autonomous navigation. This technical report describes the system architecture, implementation, and operational modes of the robot.
The teleoperation mode enables manual control of the robot through a PS4 controller interface.
- Primary Launch File: `teleop.launch.py`
- Control Framework: ROS2 Control with differential drive configuration
- Hardware Interface: RoboClaw Hardware Interface (Apache 2.0 License)
- Motor Controllers: RoboClaw motor controller for precise differential drive control
The robot utilizes an Arduino microcontroller for additional functionality beyond basic locomotion:
- Vacuum System: Two independent cyclone units controlled by Electronic Speed Controllers (ESCs)
- Door Control: Two PWM servo motors for automated door operation
- Shaker Motor: Relay-controlled motor for debris agitation
- Airlock Feeder: Relay-controlled feeder mechanism
- Interface: USB serial communication between Arduino and Jetson Orin Nano
- Protocol: Custom bidirectional communication for state information and command execution
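The custom protocol itself is not documented here, but a line-based ASCII framing like the following sketch is one plausible shape. The command names (`VAC`, `DOOR`) and the `STATE key=value` reply format are assumptions for illustration, not the actual firmware protocol.

```python
# Illustrative line-based command protocol between the Jetson and the
# Arduino. Command names and the STATE reply format are assumptions.

def encode_command(name: str, value: int) -> bytes:
    """Frame a command as an ASCII line, e.g. b'VAC 1\\n'."""
    return f"{name} {value}\n".encode("ascii")

def parse_state_line(line: bytes) -> dict:
    """Parse a state report such as b'STATE vac=1 door=0 bin=1\\n'."""
    fields = line.decode("ascii").strip().split()
    if not fields or fields[0] != "STATE":
        raise ValueError(f"unexpected line: {line!r}")
    return {k: int(v) for k, v in (f.split("=") for f in fields[1:])}

if __name__ == "__main__":
    print(encode_command("VAC", 1))                     # b'VAC 1\n'
    print(parse_state_line(b"STATE vac=1 door=0\n"))    # {'vac': 1, 'door': 0}
```

In practice both ends would read complete lines from the USB serial link and dispatch on the first token, which keeps the firmware parser trivial.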
- Arduino interface node:
  - Bridges Arduino hardware interfaces to ROS topics
  - Provides high-level control abstraction for peripheral systems
  - Manages state information and component interfacing
- Joystick interpreter node:
  - Extends the standard ROS joy node
  - Interprets joystick button inputs for Arduino-controlled components
  - Provides intuitive control mapping for complex operations
- Joystick Configuration: `joystick.yaml`
- ROS2 Control Configuration: `sucky_controllers.yaml`
- `arduino_serial.py`: Standalone serial terminal for Arduino functionality testing and hardware debugging
- Location: `sucky_arduino` - PlatformIO project containing custom firmware for peripheral control
The mapping mode combines teleoperation capabilities with advanced SLAM (Simultaneous Localization and Mapping) functionality to create detailed environment maps.
- Primary Launch File: `mapping.launch.py`
The robot employs NVIDIA Isaac ROS Visual SLAM for optimized odometry and SLAM processing:
- Hardware Acceleration: Leverages Jetson Orin Nano GPU for optimal performance
- Sensor Integration: Intel RealSense camera providing stereo IR image pairs
- Transform Publishing: Generates the `base_footprint → odom` transform
- Map Conflict Resolution: The VSLAM `map → odom` transform is disabled to prevent conflicts with SLAM Toolbox
- Current Limitation: Single RealSense camera configuration
- Recommended Upgrade: Dual StereoLabs Zed cameras for enhanced VSLAM robustness
- Hardware Bottleneck: Jetson Orin Nano performance constraints identified
Map Generation: SLAM Toolbox provides 2D occupancy grid mapping:
- Primary Sensor: SICK TiM 781 LiDAR
- Driver Package: `sick_scan_xd`
- Field of View:
  - 140° with vacuum head installed
  - 180° with vacuum head removed
- Current Status: Functional but with identified improvement areas
- Primary Bottleneck: Jetson Orin Nano computational limitations
- Recommended Improvements:
- 360-degree LiDAR for enhanced localization
- Improved odometry accuracy
- Highly accurate URDF model refinement
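The reduced field of view with the vacuum head installed can be enforced in software by masking returns outside the usable arc before they reach SLAM Toolbox. The sketch below is illustrative; the function name and parameters are not taken from the actual driver or filter configuration.

```python
import math

def mask_scan(ranges, angle_min, angle_increment, fov_deg,
              invalid=float("inf")):
    """Replace returns outside a centered field of view with `invalid`.

    With the vacuum head installed the usable FOV is 140 degrees, so
    beams beyond +/-70 degrees from the forward axis are discarded.
    """
    half = math.radians(fov_deg) / 2.0
    out = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        out.append(r if -half <= angle <= half else invalid)
    return out

# Five beams spanning -90..+90 degrees; the outermost two are masked.
print(mask_scan([1.0] * 5, -math.pi / 2, math.pi / 4, 140.0))
```

In a ROS pipeline the same effect is usually achieved with a scan-filter node or the driver's own angle limits; this version just makes the geometry explicit.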
During development, several alternative mapping approaches were investigated:
- Visual mapping solution using Isaac ROS framework
- Development Status: Configuration challenges prevented successful integration with the custom platform
The autonomous navigation mode represents the most advanced operational capability, combining multiple perception and planning systems for fully autonomous operation.
- Primary Launch File: `navigation.launch.py`
Adaptive Monte Carlo Localization (AMCL) provides robot localization on pre-generated maps:
- Current Status: Functional implementation with identified optimization opportunities
- Computational Impact: High processing requirements
- Environmental Sensitivity:
- Susceptible to mapping errors
- Affected by dynamic environment changes
- Prone to dust obscuration of LiDAR sensors
- GPU-accelerated LiDAR localization solution:
  - Development Status: Testing revealed accuracy issues; requires further investigation
- cuVGL (CUDA Visual-Geometric Localization):
  - Hardware-accelerated visual localization solution
  - Dependency: Requires maps generated using the Isaac ROS Mapping visual mapping capability
ROS 2 Navigation Stack (Nav2) handles path planning and execution:
- Configuration: `sucky_nav2.yaml`
- Modification Level: Largely unmodified from standard Nav2 behavior
- Odometry Source: Isaac ROS Visual SLAM (currently the sole odometry source)
Extended Kalman Filter (EKF) integration attempted:
- Components: VSLAM odometry + IMU data + ros2_control
- Outcome: Resulted in noisy transform; reverted to VSLAM-only configuration
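Such a fusion attempt would typically use the `robot_localization` package. A minimal sketch of what the ekf_node configuration might have looked like is shown below; the topic names and fused axes are assumptions, not the robot's actual remappings.

```yaml
# Illustrative robot_localization ekf_node parameters fusing VSLAM
# odometry with IMU data. Topic names are assumed for this sketch.
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_footprint
    world_frame: odom
    odom0: /visual_slam/tracking/odometry     # assumed VSLAM topic
    odom0_config: [true,  true,  false,       # x, y, z
                   false, false, true,        # roll, pitch, yaw
                   false, false, false,       # vx, vy, vz
                   false, false, false,       # vroll, vpitch, vyaw
                   false, false, false]       # ax, ay, az
    imu0: /imu/data                           # assumed IMU topic
    imu0_config: [false, false, false,
                  false, false, true,         # fuse yaw
                  false, false, false,
                  false, false, true,         # fuse yaw rate
                  false, false, false]
```

Tuning the per-sensor covariances in a setup like this is usually what separates a clean fused transform from the noisy one observed here.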
NVIDIA Isaac ROS NVBLOX provides real-time 3D environment reconstruction:
- Input: RealSense stereo IR images
- Processing: GPU-accelerated 3D voxel mapping
- Output: Height-filtered 2D costmap projection for Nav2 integration
- Local Costmap: NVBLOX serves as primary obstacle detection layer
- Global Costmap: NVBLOX combined with:
- SLAM-generated static map
- Inflation layer
- Traditional obstacle layer
- Range: Local area mapping optimized for immediate navigation needs
- LiDAR Limitation: Excluded from obstacle avoidance due to dust sensitivity
- Filter Solution: Particle filter implementation to mitigate false positive detections
- Hardware: Enhanced camera configuration and increased computational power
- Software: Implementation of people segmentation and additional NVBLOX operational modes
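The height-filtered 2D projection step can be illustrated with a simplified sketch. NVBLOX performs this on the GPU over its voxel representation; the pure-Python version below, with made-up voxel data, only mirrors the idea.

```python
# Simplified illustration of projecting a 3D voxel map into a 2D
# costmap by height filtering: a cell is marked occupied only if it
# contains an occupied voxel inside the height band the robot could
# actually collide with, so floor and overhead structure drop out.

def project_to_costmap(occupied_voxels, min_z, max_z):
    """occupied_voxels: iterable of (x, y, z) voxel indices.

    Returns the set of (x, y) cells occupied within [min_z, max_z].
    """
    return {(x, y) for (x, y, z) in occupied_voxels if min_z <= z <= max_z}

voxels = [(0, 0, 0),   # floor voxel -- ignored
          (1, 2, 3),   # obstacle at robot height
          (1, 2, 9)]   # overhead beam -- ignored
print(project_to_costmap(voxels, 1, 5))   # {(1, 2)}
```

The resulting 2D cells are what Nav2 consumes as an obstacle layer in the local and global costmaps described above.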
The system enables dynamic waypoint navigation with comprehensive obstacle avoidance:
- Foxglove Studio: Visual waypoint selection and monitoring
- ROS Topic: Direct publishing to the `/goal_pose` topic for programmatic control
- 3D Obstacle Detection: Real-time static and dynamic obstacle avoidance
- Path Planning: Intelligent routing around detected obstacles
- Goal Management: Robust handling of unreachable waypoints
`sucky_coverage.py` - Intelligent coverage path planning system:
- Input: Four rectangular boundary coordinates
- Output: Array of waypoints implementing lawnmower coverage pattern
- Algorithm: Systematic area coverage with configurable parameters
- Robot Footprint: Accounts for physical robot dimensions
- Waypoint Density: Configurable spacing between waypoints
- Edge Margin: Safety buffer from boundaries
- Configuration File: `coverage_config.yaml`
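The lawnmower pattern described above can be sketched as follows. The function name and parameters are illustrative and do not reflect `sucky_coverage.py`'s actual interface.

```python
# Illustrative boustrophedon (lawnmower) waypoint generator: given an
# axis-aligned rectangle, a lane spacing, and an edge margin, emit an
# alternating-direction sequence of (x, y) waypoints.

def lawnmower_waypoints(x_min, y_min, x_max, y_max, spacing, margin):
    x0, x1 = x_min + margin, x_max - margin   # apply safety buffer
    y0, y1 = y_min + margin, y_max - margin
    waypoints, y, forward = [], y0, True
    while y <= y1 + 1e-9:
        # Alternate sweep direction each lane to avoid dead travel.
        row = [(x0, y), (x1, y)] if forward else [(x1, y), (x0, y)]
        waypoints.extend(row)
        y += spacing
        forward = not forward
    return waypoints

print(lawnmower_waypoints(0, 0, 10, 10, 4, 1))
# [(1, 1), (9, 1), (9, 5), (1, 5), (1, 9), (9, 9)]
```

In the real planner the spacing would be derived from the robot footprint (vacuum head width) so adjacent lanes overlap slightly.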
- Occupancy Integration:
  - Analyze global map occupancy values during waypoint generation
  - Eliminate waypoints in inaccessible locations
  - Reduce failed navigation attempts to unreachable goals
- Nav2 Integration:
  - Utilize the Nav2-published robot footprint instead of an independent configuration
  - Ensure consistency between navigation and coverage planning systems
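The planned occupancy integration could look like the following sketch, which drops waypoints whose cells in a `nav_msgs/OccupancyGrid`-style map are occupied or unknown (0 = free, 100 = occupied, -1 = unknown). The helper is hypothetical, not existing code.

```python
# Hypothetical waypoint filter against an occupancy grid: keep only
# waypoints whose grid cell is known and below a cost threshold.

def filter_reachable(waypoints, grid, origin, resolution, threshold=50):
    """grid: 2D list indexed [row][col]; origin: world (x, y) of cell (0, 0)."""
    kept = []
    for x, y in waypoints:
        col = int((x - origin[0]) / resolution)
        row = int((y - origin[1]) / resolution)
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            cell = grid[row][col]
            if 0 <= cell < threshold:   # known-free cell: keep waypoint
                kept.append((x, y))
    return kept

grid = [[0, 0, 100],
        [0, -1, 0]]
print(filter_reachable([(0.5, 0.5), (2.5, 0.5), (1.5, 1.5)],
                       grid, origin=(0.0, 0.0), resolution=1.0))
# [(0.5, 0.5)]
```

A production version would also inflate by the robot footprint rather than testing a single cell, which is exactly where the Nav2-published footprint would come in.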
The autonomous operation requires sophisticated mission management beyond basic navigation capabilities.
`sucky_missions.py` - Mission configuration and execution system:
- Cleaning Tasks: Comprehensive area cleaning operations
- Navigate to Point: Targeted navigation missions
- Future Extensions (planned):
- Dust ejection operations
- Charging dock navigation
- Terminal Interface: `mission_terminal.py`
- ROS Topic Interface: Direct topic-based mission commands
- Future State Machine: Planned autonomous decision-making logic
- Mission Definitions: `sucky_missions.yaml`
- Coordinate Collection: Manual teleoperation with `/amcl_pose` recording
- Future Enhancement: GUI-based mission definition with interactive map interface
`battery_monitor.py` - Power system monitoring:
- Function: Battery voltage monitoring and reporting
- Development Status: Requires additional work to prevent serial conflicts with RoboClaw and ros2_control
- Integration: Battery state information for charging mission triggers
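As an illustration of the charging-mission trigger, a voltage-threshold check might look like the sketch below. The voltage constants and threshold are assumptions (a notional 6S pack), not measured values for this robot.

```python
# Hypothetical charging-trigger logic for battery monitoring: estimate
# state of charge by linear interpolation between cutoff and full
# voltages, then flag when it falls below a mission threshold.
# V_EMPTY/V_FULL assume a 6S Li-ion pack and are illustrative only.

V_EMPTY, V_FULL = 19.2, 25.2

def state_of_charge(voltage: float) -> float:
    """Rough 0..1 charge estimate from pack voltage (linear model)."""
    soc = (voltage - V_EMPTY) / (V_FULL - V_EMPTY)
    return min(1.0, max(0.0, soc))

def needs_charging(voltage: float, threshold: float = 0.2) -> bool:
    """True when the estimated charge falls below the mission trigger."""
    return state_of_charge(voltage) < threshold

print(needs_charging(24.8))   # False -- pack nearly full
print(needs_charging(20.0))   # True  -- roughly 13% estimated charge
```

A real monitor would debounce this decision over time, since pack voltage sags under vacuum load and would otherwise trigger spurious charging missions.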
- Hardware Integration: Break beam sensor via Arduino controller
- Function: Empty/full state detection
- Output: Published ROS topic for system state awareness
- Integration: Enables automated dust ejection missions
- Design Philosophy: Modular and customizable for deployment flexibility
- Configuration: Manual coordinate collection via teleoperation
- Coordinate Source: Recording `/amcl_pose` during manual exploration
Planned Enhancement: Interactive mission definition system
- Map Display: Visual representation of global map
- Point-and-Click Interface: Intuitive rectangular area selection
- Automatic Generation: Streamlined mission definition workflow
Future Development: Industrial automation integration
- Library: pylogix for PLC communication
- Applications:
- Secure area access control
- Integration with existing automation systems
- Coordinated industrial cleaning operations
The Sucky robot system is organized into several key ROS2 packages:
- `sucky` - Main robot package containing launch files, configurations, and custom nodes
- `sucky_arduino` - Arduino firmware for peripheral control systems
- `roboclaw_hardware_interface` - Hardware interface for RoboClaw motor controllers
- `roboclaw_serial` - Serial communication package for RoboClaw integration
- Isaac ROS Common - NVIDIA Isaac ROS foundation packages
- Isaac ROS NVBLOX - 3D reconstruction and mapping
| File | Location | Purpose |
|---|---|---|
| `joystick.yaml` | Control Config | Joystick button mapping |
| `sucky_controllers.yaml` | Control Config | ROS2 Control configuration |
| `sucky_nav2.yaml` | Navigation Config | Nav2 stack parameters |
| `coverage_config.yaml` | Navigation Config | Coverage planning parameters |
| `sucky_missions.yaml` | Navigation Config | Mission definitions |
| Launch File | Purpose | Key Components |
|---|---|---|
| `teleop.launch.py` | Manual Control | Joy control, Arduino interface |
| `mapping.launch.py` | SLAM Mapping | VSLAM, SLAM Toolbox, sensors |
| `navigation.launch.py` | Autonomous Nav | AMCL, Nav2, NVBLOX |
| `sucky_bringup.launch.py` | Unified Launch | Mode selection interface |
- Enhanced Localization: Implement Isaac ROS Map Localization to replace AMCL
- Coverage Optimization: Integrate occupancy map data into waypoint generation
- Battery Integration: Complete battery monitoring system without serial conflicts
- GUI Development: Create interactive mission definition interface
- Hardware Enhancements:
  - Upgrade to dual StereoLabs Zed cameras
  - Implement a 360-degree LiDAR system
  - Enhanced computational platform (higher-end Jetson)
- Software Improvements:
  - Advanced sensor fusion with EKF integration
  - People segmentation in NVBLOX
  - State machine for autonomous decision making
- Industrial Integration:
  - Complete PLC communication implementation
  - Secure area access protocols
  - Industrial automation coordination
- Advanced Perception:
  - Multiple NVBLOX operational modes
  - Enhanced dust detection and avoidance
  - Improved robustness in dynamic environments
- ROS 2 Humble - Primary robotics middleware
- NVIDIA Isaac ROS - GPU-accelerated perception pipeline
- Nav2 Navigation Stack - Autonomous navigation framework
- SLAM Toolbox - 2D SLAM implementation
- RoboClaw Hardware Interface - Motor controller integration
- SICK Scan XD - LiDAR driver package
- Intel RealSense SDK - Camera integration
- Foxglove Studio - Advanced robotics visualization
- PlatformIO - Arduino development environment
- pylogix - PLC communication library