
Kiwi Drive Robot

Kevin Lee

Updated: Jan 15

This is part of a series of posts where I document my progress in building an autonomous mobile bot from scratch. The ultimate goal of this project is to build a robot that can accomplish the following tasks:

  • Autonomously map an indoor space using SLAM and save the generated map

  • Load the generated map file and localize the robot inside the map

  • Move the robot around by sending move commands from a GUI that shows the map


Background

About a year ago, I started working on an autonomous mobile robot that can accomplish the three tasks written above. I made the first iteration of the robot and documented the process in this post. However, the project was paused because the robot kept shutting down on its own. The cause was that the power supply (2 AAA batteries connected in parallel) could not deliver enough current to all the electrical components. Hence, I decided to completely rebuild the robot from scratch.


Implementation

Hardware Layout

Figure 1: Side view of my robot. It has three layers.

Figure 2: Bottom layer of the robot. It has three DC motors in kiwi drive configuration. The Teensy 4.0 and the motor drive board are used to control the motors.

Figure 3: Middle layer of the robot. It powers the electrical components on all layers.

Figure 4: Top layer of the robot. It has the main computer (Raspberry Pi 4) as well as a LiDAR and an IMU sensor.


The main difference between this iteration of the robot and the previous one is the drive configuration. This robot has a kiwi drive configuration: three motors spaced 120 degrees apart, each fitted with an omni wheel. This configuration allows translation along the x and y axes as well as rotation about the yaw axis, which together cover all three degrees of freedom available in a 2D plane. Hence, the kiwi drive is considered holonomic.


Motor Control

Figure 5: High-level diagram of how motor control and wheel odometry work. The Raspberry Pi sends velocity commands to the Teensy, which converts them into individual motor velocities. The angle measurements from the encoders are converted into robot odometry and sent back to the Raspberry Pi.


Figure 5 shows a high-level diagram of how motor control works. The Raspberry Pi sends velocity commands to the Teensy over serial communication. The Teensy then converts the commands into individual motor velocities. I referred to this video to find the relationship between robot velocity and wheel velocities.
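
The post does not show the wire format between the Raspberry Pi and the Teensy, so here is one plausible sketch of the Teensy side, assuming newline-terminated "vx vy wz" text commands (the format, baud rate, and names are assumptions):

    // Hypothetical Teensy-side command parser; the actual wire format used in
    // this build is not shown, so plain-text "vx vy wz\n" is just an assumption.
    float targetVx = 0.0f, targetVy = 0.0f, targetWz = 0.0f;  // body velocity targets

    void setup() {
      Serial.begin(115200);  // USB serial link to the Raspberry Pi (baud is illustrative)
    }

    void loop() {
      if (Serial.available()) {
        String line = Serial.readStringUntil('\n');
        // Parse three floats: vx (m/s), vy (m/s), wz (rad/s).
        sscanf(line.c_str(), "%f %f %f", &targetVx, &targetVy, &targetWz);
        // The targets are then converted into wheel velocities (see the
        // kinematics sketch below) and handed to the per-motor controllers.
      }
    }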

Figure 6: Screenshot of the video describing the kinematics of kiwi drive (r = radius of the wheels, d = distance between the wheels and the center of the robot).


The following code was written based on the equations above.
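
A minimal sketch of that conversion, assuming the wheels sit at 0, 120, and 240 degrees around the body (the placement angles, constants, and function names are illustrative, not the original code):

    #include <math.h>

    const float WHEEL_RADIUS = 0.03f;  // r: wheel radius in meters (illustrative value)
    const float BODY_RADIUS  = 0.10f;  // d: wheel-to-center distance in meters (illustrative value)

    // Assumed wheel placement angles around the robot body, in radians.
    const float WHEEL_ANGLE[3] = {0.0f, 2.0f * M_PI / 3.0f, 4.0f * M_PI / 3.0f};

    // Inverse kinematics: body velocity (vx, vy in m/s; wz in rad/s)
    // -> individual wheel angular velocities (rad/s).
    void bodyToWheelVelocities(float vx, float vy, float wz, float wheel[3]) {
      for (int i = 0; i < 3; ++i) {
        // Each omni wheel only picks up the velocity component along its
        // rolling direction, plus the contribution from body rotation.
        wheel[i] = (-sinf(WHEEL_ANGLE[i]) * vx
                    + cosf(WHEEL_ANGLE[i]) * vy
                    + BODY_RADIUS * wz) / WHEEL_RADIUS;
      }
    }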


To calculate the odometry from the wheel encoder readings, I needed to apply the equations in reverse, meaning I had to solve for the robot's x, y, and yaw motion given the measured wheel angles (angle1, angle2, and angle3).
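
Because the three wheel directions are evenly spaced, inverting the equations reduces to scaled sums over the wheels. A sketch under the same assumptions as above, where the wheel angular velocities come from differentiating the encoder angles over each control period:

    // Forward kinematics: measured wheel angular velocities (rad/s)
    // -> body velocity (vx, vy in m/s; wz in rad/s).
    void wheelToBodyVelocities(const float wheel[3], float &vx, float &vy, float &wz) {
      vx = 0.0f;
      vy = 0.0f;
      float sum = 0.0f;
      for (int i = 0; i < 3; ++i) {
        vx += -sinf(WHEEL_ANGLE[i]) * wheel[i];
        vy +=  cosf(WHEEL_ANGLE[i]) * wheel[i];
        sum += wheel[i];
      }
      // Pseudo-inverse of the 120-degree-spaced wheel matrix.
      vx *= 2.0f * WHEEL_RADIUS / 3.0f;
      vy *= 2.0f * WHEEL_RADIUS / 3.0f;
      wz  = WHEEL_RADIUS * sum / (3.0f * BODY_RADIUS);
    }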


As for making the motors spin at a desired angular velocity, I implemented a simple PID controller.
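
A minimal sketch of such a per-motor velocity PID (the structure and gains are illustrative, not the original implementation):

    // Simple PID velocity controller, one instance per motor.
    struct Pid {
      float kp, ki, kd;
      float integral = 0.0f;
      float prevError = 0.0f;

      // target and measured are wheel angular velocities (rad/s); dt in seconds.
      // Returns a control effort to be mapped onto the motor driver's PWM.
      float update(float target, float measured, float dt) {
        float error = target - measured;
        integral += error * dt;
        float derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
      }
    };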

Figure 7: Demonstration of my PID controller implemented on a Teensy 4.0.


With all these parts working together, I was able to make the robot move at a desired velocity while estimating its position and orientation.
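
For reference, the position and orientation estimate comes from dead-reckoning the body velocities into a world-frame pose, roughly like this (the pose variable names are illustrative, and the drift grows over time):

    // Integrate body-frame velocities into a world-frame pose (dead reckoning).
    float poseX = 0.0f, poseY = 0.0f, poseTheta = 0.0f;

    void integrateOdometry(float vx, float vy, float wz, float dt) {
      // Rotate the body-frame velocity into the world frame, then integrate.
      poseX     += (vx * cosf(poseTheta) - vy * sinf(poseTheta)) * dt;
      poseY     += (vx * sinf(poseTheta) + vy * cosf(poseTheta)) * dt;
      poseTheta += wz * dt;
    }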

Figure 8: Controlling a kiwi drive robot while estimating its position and orientation.


SLAM

With the motor control completed, I set up the LiDAR and ran SLAM from the Raspberry Pi. I used the slam_toolbox library.
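
The post does not show the exact configuration; assuming a ROS 2 install of slam_toolbox, launching its online asynchronous mapper typically looks like:

    ros2 launch slam_toolbox online_async_launch.py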

Figure 9: Running SLAM on my robot.


There is a tremendous build-up of error in the wheel odometry, most likely because my apartment floor is quite bumpy. I plan to alleviate this problem by fusing the IMU readings with the wheel odometry to get a more robust odometry estimate.
