# SLAM and navigation implementation for kitbot
- Install the necessary packages for SLAM with LIDAR
- Use kitware to drive the robot to a pose set by `rviz2`
- Clone this repository into the `src` directory of your colcon workspace for ROS
- Make sure you have cloned kitware and kitware_interface in `src` as well
- `cd` into the `kitware_slam` directory and run `setup.sh`
- Run `colcon build` from the workspace folder
The following steps should already be done on the hardware we give students.
- Make sure you have TAMProxy-Firmware running on a Teensy
- Install TAMProxy-pyHost
- Change the transformation for `base_link_to_base_laser_tf_node` in `launch/common.launch.py` to reflect the location of the LIDAR's base relative to the center of your robot (see the launch sketch after this list). The argument format is `[x, y, z, roll, pitch, yaw, parent_frame, child_frame]` or `[x, y, z, qx, qy, qz, qw, parent_frame, child_frame]`
- Tune the PID and error tolerance constants in `kitware/differential_driver.py` for the desired performance/accuracy (a generic tuning sketch follows the note below)
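For orientation, the sketch below shows what a static transform entry of this kind commonly looks like in a ROS 2 Python launch file. It is not the actual contents of `launch/common.launch.py`: it assumes the transform is published with `tf2_ros`'s `static_transform_publisher`, the offsets are placeholder values you would replace with your own measurements, and the positional ordering simply mirrors the roll/pitch/yaw format described above. Keep whatever ordering, node name, and frame names the existing file already uses.

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Static transform from the robot's center (base_link) to the LIDAR base.
        # Placeholder offsets in meters/radians -- measure these on your own robot.
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            name='base_link_to_base_laser_tf_node',
            # [x, y, z, roll, pitch, yaw, parent_frame, child_frame]
            arguments=['0.10', '0.0', '0.15', '0.0', '0.0', '0.0',
                       'base_link', 'base_laser'],
        ),
    ])
```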
Important
Run `colcon build` from the workspace folder after modifying these files, otherwise the changes will not be applied.
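To make the PID tuning item above concrete, here is a generic proportional-control sketch (only P terms are shown for brevity) of how gains and tolerance constants of this kind typically interact. It is an illustration only: the constant names and structure are made up and are not taken from `kitware/differential_driver.py`, so check that file for the real names.

```python
import math

# Hypothetical gains and tolerances of the kind you would tune for your robot.
KP_LINEAR = 0.8        # gain on distance to the goal (m -> m/s)
KP_ANGULAR = 2.0       # gain on heading error (rad -> rad/s)
DIST_TOLERANCE = 0.05  # stop when within 5 cm of the goal
ANGLE_TOLERANCE = 0.10 # drive forward only when roughly facing the goal (~6 deg)


def drive_command(x, y, yaw, goal_x, goal_y):
    """Return (linear, angular) velocity commands toward the goal, or (0, 0) when close enough."""
    dx, dy = goal_x - x, goal_y - y
    distance = math.hypot(dx, dy)
    if distance < DIST_TOLERANCE:
        return 0.0, 0.0  # close enough -- stop

    # Bearing error to the goal, wrapped into [-pi, pi] so the robot turns the short way.
    heading_error = math.atan2(dy, dx) - yaw
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))

    # Rotate in place until roughly facing the goal, then drive while correcting heading.
    linear = KP_LINEAR * distance if abs(heading_error) < ANGLE_TOLERANCE else 0.0
    angular = KP_ANGULAR * heading_error
    return linear, angular
```

Larger gains make the robot respond more aggressively (and oscillate sooner); larger tolerances make it stop earlier but less precisely.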
- Make sure the robot is running off the battery and not attached to anything
- `ssh -X` into the robot
- `cd` into your ROS workspace
- Run `source install/setup.bash` to set up your environment to use the built packages
- Launch the SLAM stack:
  - If using LDLIDAR, run `ros2 launch kitware_slam ldlidar_slam.launch.py`
  - If using YDLIDAR, run `ros2 launch kitware_slam ydlidar_slam.launch.py`
- Open another terminal and `ssh -X` into the robot again
- Run `rviz2`
- Open the `rviz` configuration :TODO:
- Set a goal pose (see the goal-pose sketch at the end of this section)
- Cross your fingers that the robot will go
- Make sure the kitbot works with keyboard control. If it does not, troubleshoot that first (check that the pin settings in `kitbot.py` are correct and that the motor wiring is correct)
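For context on the goal-pose step: RViz2's "2D Goal Pose" tool publishes a `geometry_msgs/PoseStamped` on the `/goal_pose` topic by default, which is the pose the kitware driver is expected to drive toward. The standalone sketch below is not part of kitware (the node and file names are made up); it simply echoes that topic, which is a quick way to confirm RViz is publishing the goal you set.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class GoalPoseEcho(Node):
    """Print goal poses published by RViz2's '2D Goal Pose' tool."""

    def __init__(self):
        super().__init__('goal_pose_echo')
        # RViz2 publishes the goal on /goal_pose by default; adjust if remapped.
        self.create_subscription(PoseStamped, '/goal_pose', self.on_goal, 10)

    def on_goal(self, msg: PoseStamped):
        p = msg.pose.position
        self.get_logger().info(
            f'Goal in frame {msg.header.frame_id}: x={p.x:.2f}, y={p.y:.2f}')


def main():
    rclpy.init()
    node = GoalPoseEcho()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

Run it from a sourced terminal (e.g. `python3 goal_pose_echo.py`) while `rviz2` is up, then set a goal pose and watch the log output.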