Autonomous Navigation Mobile Robot using ROS | Jetson Nano | RPLidar | Differential Drive Kinematics

Video Statistics and Information

Captions
In my previous videos I have shown the working of manipulators using ROS, MoveIt, and OpenCV, but this time I have a new chapter for you: an autonomous navigation mobile robot using the ROS navigation stack. Before getting into the content, I would like to thank my friend Riti for sending me the RPLidar for this project. In this video we will cover an overview of the ROS navigation stack, the kinematics of a differential drive robot, how to configure the ROS differential drive controller, and finally the mobile robot in action.

This is the architecture of the ROS navigation stack; let's talk about each component. For the robot to navigate autonomously, the first thing required is a map, which is provided by the map_server node. Once we have the map, the next thing is to know the location of the robot in the map. This job of localization is done by the amcl node, which uses sensor data and odometry information to localize the robot in the map. Now coming to the heart of this stack: move_base is responsible for generating a path to the goal position while avoiding obstacles, using global and local planners, and it uses data from all the other parts to achieve this. Finally, it computes the robot's linear and angular velocities required to follow the planned path and sends them to the base controller. The job of the base controller is to convert these robot velocities received from move_base into individual wheel velocity commands.

The white and grey components in this picture, which are map_server, amcl, and move_base, are already implemented; we just have to configure the parameters of these nodes according to our robot and the rest is taken care of. The blue components, which are the base controller, sensor transforms, odometry source, and sensor source, are robot-platform dependent, so these are the nodes that we have to write. The sensor transforms can be given in the URDF of the robot. For the sensor source I am using the rplidar_ros package, which is readily available for my RPLidar sensor. So the only parts to implement are the base controller and the odometry source. As
already said, the job of the base controller is to accept the cmd_vel topic from move_base, which gives the robot's linear and angular velocities at that instant, and convert them to individual wheel velocities. The odometry source node must take feedback from the wheels' motor encoders and provide the robot's current position on the odom topic to move_base.

So, to write the base controller and odometry source for a differential drive robot, we have to understand the kinematics of differential drive. A robot's motion can be defined with a linear velocity v and an angular velocity ω, but to work in the robot's joint space we have to convert these velocities to individual wheel velocities. Differential drive kinematics helps us understand how to convert the velocity of the robot into individual wheel velocities, and how each wheel's motion contributes to the position of the robot. To compute the wheel velocities we need to know two physical parameters of the robot: the wheel separation length L and the wheel radius R. Once we have v, ω, L, and R, we can compute the wheel velocities using these equations:

v_r = (2v + ωL) / (2R)
v_l = (2v − ωL) / (2R)

So this is the job of the base controller. Now let's look at the equations for the odometry calculation. Let's say the robot is moving along a curved trajectory; then the left and right wheels cover different distances. The distance covered by a wheel is given by d = 2πR · Δticks / N, where Δticks is the difference in the wheel encoder pulses and N is the total number of pulses per rotation. With this we can get the values for d_r and d_l. Now we take a point on the robot midway between the two wheels; the distance covered by the robot is d_c = (d_r + d_l) / 2. Odometry gives the position and orientation of the robot with respect to the odom frame in the plane, so the orientation and position of the robot are updated as:

Δθ = (d_r − d_l) / L
x ← x + d_c · cos θ
y ← y + d_c · sin θ
θ ← θ + Δθ

Using these equations we can write the base controller node and the odometry source node. But do not worry about writing these nodes from scratch, because ROS has a
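The wheel-velocity and odometry equations above can be sketched in plain Python (a minimal illustration under the stated equations; the function and variable names are my own, not from the video):

```python
import math

def wheel_velocities(v, omega, L, R):
    """Inverse kinematics: robot linear velocity v (m/s) and angular
    velocity omega (rad/s) -> (left, right) wheel angular velocities (rad/s)."""
    v_r = (2 * v + omega * L) / (2 * R)
    v_l = (2 * v - omega * L) / (2 * R)
    return v_l, v_r

def update_odometry(x, y, theta, d_ticks_l, d_ticks_r, N, L, R):
    """Dead-reckoning update from encoder tick differences.
    N is the number of encoder pulses per wheel rotation."""
    d_l = 2 * math.pi * R * d_ticks_l / N   # distance covered by left wheel
    d_r = 2 * math.pi * R * d_ticks_r / N   # distance covered by right wheel
    d_c = (d_l + d_r) / 2                   # distance covered by robot center
    d_theta = (d_r - d_l) / L               # change in heading
    x += d_c * math.cos(theta)
    y += d_c * math.sin(theta)
    theta += d_theta
    return x, y, theta
```

As a sanity check, driving straight (ω = 0) gives equal wheel velocities, and pure rotation (v = 0) gives equal and opposite ones.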
solution for this as well, thanks to the diff_drive_controller. The differential drive controller converts the cmd_vel topic to individual wheel velocities and publishes the odom data. So the only thing we have to do is configure the parameters of the differential drive controller and write a hardware interface node with a velocity joint interface for both wheels, taking position feedback from the encoders. Here is the complete example of the parameters: the type of the controller, the left and right wheel names, the wheel separation length L, the wheel radius R, and limits for the linear and angular velocities of the robot.

Okay, we have had enough theory; let's get into some practical stuff. Let me show you the 3D model of my robot first. This is the chassis of my robot: the top layer has slots to house the RPLidar and space for the Jetson Nano, pillars for supporting the top layer, motor casings, and a slot for the caster wheel. I 3D printed and assembled it. Now it's time to check the robot's differential drive controller. I have written this launch file to load the hardware interface node and start the differential drive controller. We will use the teleop_twist_keyboard package to publish the cmd_vel topic from the keyboard. Let's launch this file. This is the robot model in RViz. Let's start the teleop_twist_keyboard node and move the robot. Okay, the differential drive controller and hardware interface node are working fine.

Let's move to the next step: mapping. I have set up this layout to demonstrate the working of the autonomous navigation robot, so let's get the map of this layout first by launching the mapping launch file. This launch file also includes the teleop node, so we will move the robot around the layout using the keyboard to build the map. We got the map; let's save it using the map_saver node of the map_server package. These are the map-related files. Now it's time for the fun part and the real test: autonomous navigation. Let's launch the autonomous_nav.launch file. This launch file includes all the components of the ROS navigation stack: the map
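A diff_drive_controller parameter file along the lines described might look like this (a sketch only: the joint names, rates, and limit values are placeholders for illustration, not the exact values used in the video):

```yaml
mobile_base_controller:
  type: "diff_drive_controller/DiffDriveController"
  left_wheel: "left_wheel_joint"      # placeholder joint name
  right_wheel: "right_wheel_joint"    # placeholder joint name
  publish_rate: 50                    # odom publish rate in Hz (example)
  wheel_separation: 0.30              # L, in metres (example value)
  wheel_radius: 0.05                  # R, in metres (example value)
  linear:
    x:
      has_velocity_limits: true
      max_velocity: 0.5               # m/s (example)
  angular:
    z:
      has_velocity_limits: true
      max_velocity: 1.0               # rad/s (example)
```

The controller reads these parameters at load time and applies the velocity limits to incoming cmd_vel commands before computing the per-wheel targets.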
server that loads the saved map, the amcl node, the move_base node, the global and local planners, etc. The code is available on my GitHub; you can have a look at it for a better understanding. Okay, everything is loaded. Let's do a pose estimate to help the amcl node localize the robot. Now we are set to send the goal location. Let's first send a goal using the 2D Nav Goal tool of RViz. Everything is good so far. Now let's put an obstacle in the path of the robot and see how it avoids the obstacle and reaches the goal. It took a little long to find the new path, but we can optimize this by tuning the local planner parameters. Lastly, I will send a series of goal locations using a Python node. This node will send three goal locations to move_base: here, then here, and back to the home location. Kudos to my robot!

Finally, let me show you the hardware assembly of the robot. This is the Jetson Nano, the brain of this robot, and the RPLidar, the eyes of the robot. I am using the Jetson Nano's I2C interface, bus 0 and bus 1, to send the velocity commands to two Arduino Nanos, one for each wheel. Now let's remove the top layer and see the lower base. I had to wrap this extra wire of the RPLidar; it looks a bit messy here, but I feel proud of it. These are the two DC motors with encoders, each connected to one Arduino Nano. This is the H-bridge motor driver, which takes the PWM signal from the Nano and drives the motor. And this is the power source of the robot: a 12-volt lithium-ion battery.

So this completes the episode on the autonomous navigation mobile robot. I hope you liked this video; if so, don't forget to hit the like button and subscribe to my channel for more videos on robotics and ROS. For more stuff on robotics and ROS, visit my website www.rosroboticslearning.com. Thanks for watching.
Info
Channel: Robotics and ROS Learning
Views: 10,207
Rating: 5 out of 5
Keywords: ros, ros navigation, mobile robot, amcl, slam gmapping, obstacle avoidance, jetson nano, i2c, move_base, rplidar
Id: Uz_i_sjVhIM
Length: 13min 25sec (805 seconds)
Published: Fri Nov 27 2020