- Introduction
- Project Goal
- Tech Stack
- Setup
- Key Challenges
- Hardware Development
- Control System Design
- ROS2-Teleoperation
- Localization and Odometry
- Motion Planning
- SLAM
- Future Development
Field exploration robots have gained significant attention in recent years due to their potential applications in various domains such as agriculture, environmental monitoring, and autonomous driving research. These robots are designed to operate autonomously in diverse environments, collecting data and performing tasks that would otherwise be labor-intensive or hazardous for humans. This project focuses on developing a ROS-powered robotic vehicle designed for field exploration. The vehicle is capable of traversing varied ground surfaces autonomously and can be used for tasks such as autonomous field mapping, agricultural plant monitoring, and testing autonomous driving algorithms.
The primary goal of this project is to develop a ROS2-enabled four-wheel-drive vehicle with Ackermann steering capable of waypoint navigation and obstacle avoidance. This vehicle will have various sensors and an onboard computer to perform Simultaneous Localization and Mapping (SLAM). The project encompasses hardware development, sensor integration, firmware and algorithm development, and extensive testing and validation to ensure the vehicle’s performance and reliability in real-world scenarios.
- ROS 2 Jazzy: Middleware for robot software development.
- Raspberry Pi 5: Onboard computer for processing and communication.
- ESP32: Microcontroller for low-level motor and servo control, programmed using the Arduino IDE.
- Nav2: Navigation stack for ROS 2.
- RViz2: Visualization tool for ROS 2.
- Operating System: Ubuntu 24.04 LTS (Noble) or later
- ROS 2 Distribution: Jazzy Jalisco
- Hardware: Raspberry Pi 5, BMX160 IMU, YDLidar Tmini-Pro, ESP32, 7V LiPo battery
- Software: Arduino IDE, Python 3.12
- Install ROS 2 Jazzy:

  After installing, run this once:

  ```bash
  source /opt/ros/jazzy/setup.bash
  echo "source /opt/ros/jazzy/setup.bash" >> ~/.bashrc
  source ~/.bashrc
  ```
- Create a workspace and clone the repository:

  ```bash
  mkdir -p ~/ros2_ws/src        # create the workspace
  cd ~/ros2_ws
  colcon build --symlink-install
  source install/local_setup.bash
  cd ~/ros2_ws/src              # clone the repo inside the src folder
  git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
  ```

  To avoid sourcing every time you open a new terminal instance, run this once:

  ```bash
  echo "source ~/ros2_ws/install/setup.bash" >> ~/.bashrc
  source ~/.bashrc   # reload bashrc since we changed it
  ```
- Flash SD Card
  - Download Ubuntu 24.04: Download the Ubuntu 24.04 image for Raspberry Pi from the official Ubuntu website.
  - Flash the SD Card: Use a tool like Balena Etcher or Raspberry Pi Imager to flash the Ubuntu image to a 32 GB or 64 GB SD card.
- Install ROS 2 Jazzy
- Follow the ROS 2 Installation Guide: Official ROS 2 Jazzy installation instructions can be found here.
- Workspace Setup
  - Create a ROS 2 workspace:

    ```bash
    mkdir -p ~/pi_ws/src
    cd ~/pi_ws
    colcon build --symlink-install
    source install/local_setup.bash
    echo "source ~/pi_ws/install/setup.bash" >> ~/.bashrc
    source ~/.bashrc   # reload bashrc since we changed it
    ```

  - Clone the repository:

    ```bash
    cd ~/pi_ws/src
    git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
    ```
- USB Serial Permissions: add your user to the dialout group:

  ```bash
  sudo usermod -aG dialout $USER
  ```

- Install smbus:

  ```bash
  sudo apt update
  sudo apt install python3-smbus
  ```
- Launch the IMU node:

  ```bash
  ros2 launch imu_pkg imu_rviz_robot.launch.py
  ```

- Check the IMU data:

  ```bash
  ros2 topic echo /imu_raw
  ```
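If the IMU node fails to come up, a quick I2C sanity check can confirm the wiring. This is a minimal sketch assuming the BMX160 sits at its default address 0x68 on I2C bus 1; it reads the chip-ID register, which should return 0xD8 on a BMX160.

```python
# Minimal I2C sanity check for the BMX160 (run before launching the IMU node).
# Assumes the default address 0x68 and bus 1; adjust if your wiring differs.
import smbus

BUS_ID = 1          # I2C bus on the Raspberry Pi
BMX160_ADDR = 0x68  # default BMX160 address (0x69 if SDO is pulled high)
CHIP_ID_REG = 0x00  # chip-ID register; a BMX160 reports 0xD8

bus = smbus.SMBus(BUS_ID)
chip_id = bus.read_byte_data(BMX160_ADDR, CHIP_ID_REG)
print(f"Chip ID: 0x{chip_id:02X}")  # expect 0xD8 for a BMX160
```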
- Ensure the YDLidar is connected to a USB port and powered on.

- Launch the YDLidar node:

  ```bash
  ros2 launch ydlidar_ros2_driver ydlidar_launch.py
  ```

- Visualize the lidar scan data:

  ```bash
  ros2 launch rcar_viz display.launch.py
  ```

  This will launch RViz2 with a robot model and the fixed frame set to "odom".

- Optional: Run IMU pose estimation to see the odom data:

  ```bash
  ros2 run imu_pose_estimation estimator
  ```

  Alternatively, set the fixed frame to "base_link" in RViz to see the laser data on the /scan topic.
- Low-level Control: The primary challenge we faced in low-level control was interfacing a custom servo with five pins for which no pin diagrams or references were available online. Achieving accurate control at this level required extensive experimentation and fine-tuning.
- Localization and Odometry: We relied solely on an IMU to estimate the robot’s pose for localization and odometry. The IMU’s orientation measurements exhibited significant drift over time. Additionally, the robot’s suspension system continuously affected the accelerometer's bias.
- Motion Planning: In motion planning, traditional control algorithms used for differential or omnidirectional drives were ineffective. This required the development of an approach tailored to the dynamics of an Ackermann steering system.
- Environment Mapping: For environment mapping, we had to adapt mapping techniques to work with Ackermann kinematics. This included ensuring accurate data collection from LIDAR and integrating these readings into a coherent map.

Figure 2: Vicon Odometry vs IMU Odometry
- The motor actuation and steering system features a five-pin servo controlled with PID for precise steering, and a DC motor for throttle.
- A custom PCB was developed to ensure an optimal power supply to the ESP32, motor drivers, and Raspberry Pi.
- The mechanical design involved creating laser-cut CAD models in acrylic for a two-layered chassis with mounts for all hardware components, including the LIDAR.
- The BMX160 IMU was interfaced and calibrated to publish accurate orientation and acceleration data on the ROS 2 network.
- The YDLidar Tmini-Pro was interfaced with the Raspberry Pi 5 on ROS 2 Jazzy to publish laser data on the /scan topic.
Figure 1: Prototype Vehicle
Figure 2: Control System Design
- ESP32:
  - Role: Acts as the intermediary between the motor drivers and the Raspberry Pi 5.
  - Communication: Receives angle and PWM values via UART serial from the Raspberry Pi 5.
  - Outputs: Sends control signals to the motor drivers.
- Motor Drivers:
  - Role: Control the steering and throttle motors based on signals received from the ESP32.
- Servo Motor (Steering):
  - Role: Adjusts the steering angle of the vehicle.
  - Control: Receives signals from the motor driver.
- DC Motor (Throttle):
  - Role: Controls the throttle of the vehicle.
  - Control: Receives signals from the motor driver.
- Operating System: Ubuntu 24.04 LTS
- Optimized Pure Pursuit Algorithm:
  - Role: Calculates the desired steering angle based on the current position and the goal.
  - Output: Angle and PWM values sent to the ESP32 for motor control.
- BMX160 IMU:
  - Role: Provides IMU data for odometry.
  - Connection: Connected via GPIO pins on the Raspberry Pi 5.
  - Data Published: /imu_data topic.
- YDLidar Tmini Pro:
  - Role: Provides LiDAR data for SLAM.
  - Connection: Connected via USB.
  - Data Published: /scan topic.
- Odometry (Modified Kalman Filter):
  - Role: Fuses IMU data with Vicon-derived car velocity (corresponding to PWM) to provide an accurate estimate of the vehicle's position and velocity.
  - Output: Publishes odometry data to the /odom topic.
- ROS 2 Distribution: Jazzy
- SLAM (Simultaneous Localization and Mapping):
  - Components: EKF (Extended Kalman Filter) and AMCL (Adaptive Monte Carlo Localization).
  - Role: Utilizes LiDAR and odometry data to build and update a map of the environment.
  - Inputs: Receives the /scan and /odom topics.
- Navigation Stack (Nav2):
  - Role: Handles path planning and navigation.
  - Inputs: Receives odometry data and processed map information.
  - Outputs: Publishes path data to the /plan topic.
- Goal Node:
  - Role: Defines the navigation goals for the vehicle.
  - Output: Publishes waypoints to the /goal_topic.
- The Low-Level Control segment, managed by the ESP32, translates commands from the Raspberry Pi into physical movements of the vehicle.
- The Onboard Raspberry Pi 5 is the central processing unit for sensor data collection and preliminary processing (IMU, LiDAR), and it runs the Pure Pursuit algorithm for immediate steering adjustments.
- The Remote PC is responsible for advanced processing tasks, including SLAM for mapping and localization, and the Nav2 stack for path planning and navigation, ensuring the vehicle reaches its defined goals efficiently.
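Since the Raspberry Pi sends angle and PWM values to the ESP32 over UART serial, the Pi-side sender can be very small. The sketch below is illustrative only: the `angle,pwm\n` framing, port name, and baud rate are assumptions and must match whatever the actual firmware expects.

```python
# Sketch of the Pi -> ESP32 serial link (pyserial). The "angle,pwm\n"
# framing here is an assumption; the real firmware may use another format.
import serial

PORT = '/dev/ttyUSB0'  # adjust to your ESP32's serial device
BAUD = 115200          # assumed baud rate; must match the firmware

def send_command(ser: serial.Serial, angle: int, pwm: int) -> None:
    """Send one steering-angle / throttle-PWM pair to the ESP32."""
    ser.write(f"{angle},{pwm}\n".encode())

ser = serial.Serial(PORT, BAUD, timeout=1)
send_command(ser, 90, 30)   # e.g. centred steering, gentle throttle
```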
Achieving complete low-level control of the car through keyboard commands using ROS2 for communication over a shared network is a significant milestone. This setup allows for real-time teleoperation, enabling users to control the car remotely using simple keyboard inputs. By implementing teleoperation nodes in ROS 2, commands can be sent to the robot, providing a responsive and reliable control system.
Implementation Steps:
- Install necessary ROS 2 packages for teleoperation.
- Write a ROS 2 node to capture keyboard inputs and publish velocity commands (a minimal sketch follows this list).
- Configure the robot to subscribe to these velocity commands and actuate the motors accordingly.
- Test the teleoperation setup to ensure reliable and responsive control.
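As a rough sketch of the publisher from step 2 above, the node below captures single keypresses and publishes velocity commands. It assumes a geometry_msgs/Twist message on /cmd_vel purely for illustration; the actual teleop_bot nodes may use different message types, topics, and key bindings.

```python
# Minimal keyboard-teleop publisher (illustrative; not the teleop_bot code).
import sys, termios, tty
import rclpy
from geometry_msgs.msg import Twist

KEYS = {  # key -> (linear.x, angular.z); 'h' halts
    'w': (0.3, 0.0), 's': (-0.3, 0.0),
    'a': (0.3, 0.5), 'd': (0.3, -0.5),
    'h': (0.0, 0.0),
}

def read_key() -> str:
    """Read a single raw keypress from stdin (no Enter needed)."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1).lower()
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

def main():
    rclpy.init()
    node = rclpy.create_node('teleop_pub')
    pub = node.create_publisher(Twist, '/cmd_vel', 10)
    while (key := read_key()) != 'q':  # 'q' quits
        if key in KEYS:
            msg = Twist()
            msg.linear.x, msg.angular.z = KEYS[key]
            pub.publish(msg)
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

Reading raw keypresses with termios avoids having to press Enter after every command, which keeps the control loop responsive.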
How to Run the Teleop Node:
- SSH into the Raspberry Pi:

  ```bash
  ssh arms@192.168.0.171   # change the IP address and Pi username accordingly
  ```

- Run the subscriber node on the Pi or remote PC:

  ```bash
  ros2 run teleop_bot sub
  ```

- Run the publisher node on the Pi:

  ```bash
  ros2 run teleop_bot pub
  ```

  If computation on the Pi is limited, run the publisher on the remote PC using the same command.

Now you can control the car using the WASD keys:
- W - Move Forward
- A - Turn Left
- S - Move Backward
- D - Turn Right
- H - Halt

Note: The throttle value has been kept constant for the time being; you can change it in the subscriber script in the teleop_bot package. Also, keep the car in the 25-50 PWM range for safe teleoperation.
- Modified Kalman Filter: Fused car velocity data (calculated from Vicon) with IMU accelerometer readings to obtain a more accurate state estimate (position and velocity) by reducing drift. The Kalman filter uses the Vicon data as a reference and corrects the IMU readings accordingly, giving accurate position data. For orientation, gyroscope data was used to calculate roll, pitch, and yaw after bias removal.
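For intuition, here is a one-dimensional sketch of that fusion: bias-corrected IMU acceleration drives the prediction step, and the Vicon-derived velocity acts as the correcting measurement. The noise matrices below are illustrative placeholders, not the values tuned for the vehicle.

```python
# 1-D sketch of the modified Kalman filter: IMU acceleration predicts,
# a velocity reference (e.g. Vicon-derived speed) corrects the drift.
import numpy as np

dt = 0.01                              # IMU sample period [s]
F = np.array([[1, dt], [0, 1]])        # state transition for [pos, vel]
B = np.array([[0.5 * dt**2], [dt]])    # acceleration input model
H = np.array([[0.0, 1.0]])             # we measure velocity only
Q = np.diag([1e-4, 1e-3])              # process noise (assumed values)
R = np.array([[1e-2]])                 # measurement noise (assumed value)

x = np.zeros((2, 1))                   # state: [position, velocity]
P = np.eye(2)                          # state covariance

def predict(accel: float):
    """Propagate the state using a bias-corrected IMU acceleration."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(vel_meas: float):
    """Correct accumulated drift with a velocity reference measurement."""
    global x, P
    y = np.array([[vel_meas]]) - H @ x      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
```

Calling predict() at the IMU rate and update() whenever a velocity reference arrives keeps the position estimate from drifting the way pure double integration of acceleration would.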
To run the odometry node:
- SSH into the Raspberry Pi:

  ```bash
  ssh arms@192.168.0.171   # change the IP address and Pi username accordingly
  ```

- Run the odom node on the remote PC:

  ```bash
  ros2 run localization imu_odometry
  ```

This will print the (x, y, yaw) data of the vehicle and also publish it on the /odom topic.
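If the yaw is needed programmatically rather than from the printed output, it can be recovered from the quaternion in the /odom messages with the standard conversion (a small helper, assuming the usual nav_msgs/Odometry layout):

```python
# Extract yaw (heading) from an /odom quaternion; standard ZYX conversion.
import math
from nav_msgs.msg import Odometry

def yaw_from_odom(msg: Odometry) -> float:
    """Return the heading (yaw, radians) encoded in an Odometry message."""
    q = msg.pose.pose.orientation
    return math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                      1.0 - 2.0 * (q.y * q.y + q.z * q.z))
```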
- Optimized Pure Pursuit Algorithm: Implemented an optimized version of this path-tracking algorithm to compute appropriate steering angles using the lookahead-distance concept. The waypoints are provided to the vehicle by the Goal_Node and fed into the pure-pursuit algorithm. The algorithm takes odometry data from the /odom topic, calculates the linear and angular errors, and steers towards the waypoints (note: the tuning parameter is lookahead_distance).

Waypoint navigation:
- SSH into the Raspberry Pi:

  ```bash
  ssh arms@192.168.0.171   # change the IP address and Pi username accordingly
  ```

- Run the goal node on the Pi:

  ```bash
  ros2 run navigation GoalNode
  ```

- Run the pure pursuit node on the Pi, then hit Enter in the goal node terminal once the pure-pursuit controller has started:

  ```bash
  ros2 run navigation pure_pursuit
  ```
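For reference, the core pure-pursuit geometry is compact. The sketch below shows how lookahead_distance enters the steering computation for an Ackermann vehicle; the wheelbase and lookahead values are placeholders, not the project's tuned parameters.

```python
# Core pure-pursuit law: aim at the waypoint one lookahead distance ahead
# and convert the heading error into an Ackermann steering angle.
import math

WHEELBASE = 0.3            # axle-to-axle distance [m] (assumed)
LOOKAHEAD_DISTANCE = 0.5   # the main tuning parameter [m]

def pure_pursuit_steering(x, y, yaw, goal_x, goal_y):
    """Steering angle (radians) that arcs the car through the goal point."""
    alpha = math.atan2(goal_y - y, goal_x - x) - yaw   # heading error
    return math.atan2(2.0 * WHEELBASE * math.sin(alpha),
                      LOOKAHEAD_DISTANCE)              # classic PP formula
```

Increasing LOOKAHEAD_DISTANCE smooths the tracking at the cost of cutting corners; decreasing it follows waypoints more tightly but can oscillate.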
- SLAM: Mapped the environment using LiDAR scan data and fused it with odometry data. This, along with the Adaptive Monte Carlo Localization (AMCL) algorithm, provided an accurate pose (position and orientation) estimate of the car within the map. Slam_toolbox is used for mapping the environment with proper loop closure, and the Nav2 stack is used to generate local and global cost maps and planned paths, which are finally fed in the form of waypoints to the motion planning algorithm.
- Map the environment using slam_toolbox (already done and saved in the maps folder).
- Launch the odometry node using the steps given in the previous sections.
- Launch the navigation and localization launch files.
- Run the GoalNode and pure pursuit scripts.
- Give a 2D goal pose in the Nav2 RViz window.
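The 2D goal pose can also be published programmatically instead of clicking in RViz. This sketch assumes Nav2's default /goal_pose topic and an example pose in the map frame:

```python
# Publish a navigation goal without RViz (Nav2's default /goal_pose topic).
import time
import rclpy
from geometry_msgs.msg import PoseStamped

def main():
    rclpy.init()
    node = rclpy.create_node('send_goal')
    pub = node.create_publisher(PoseStamped, '/goal_pose', 10)
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = node.get_clock().now().to_msg()
    goal.pose.position.x = 1.0     # example goal: 1 m ahead in the map frame
    goal.pose.orientation.w = 1.0  # facing along +x
    time.sleep(0.5)                # give the publisher time to match subscribers
    pub.publish(goal)
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```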
- Upgrading Hardware for Outdoor Use: Enhance the hardware to function robustly in outdoor environments and incorporate GPS for improved outdoor localization.
- Developing a Mission Planner Stack: Create a mission planner stack capable of generating and updating paths for the vehicle based on environmental variables, enabling more dynamic and adaptive route planning.
- Implementing State-of-the-Art Algorithms: Integrate deep learning and reinforcement learning-based algorithms to achieve more adaptive and robust navigation, enabling the vehicle to better handle diverse and unpredictable conditions. Upgrade the algorithms to better handle dynamic environments, both indoor and outdoor, as they are currently designed for static settings.
- Enhanced Sensor Integration: Incorporate additional sensors such as RGB-depth cameras and ultrasonic sensors to improve environmental perception, and implement advanced algorithms such as ORB-SLAM.