I have been working on this project since early 2011 in my spare time, partly as a way to learn about feedback control theory, digital signal processing and robotics, and partly inspired by some of the early videos of similar machines posted on YouTube and the Web in general, for instance the nBot. At the time I was reading up on standard college textbooks on classical (i.e. analog) and digital feedback control, and it was difficult to keep myself motivated, so I figured that building a real system while working through a textbook was the best way to stay continuously motivated. This is an interesting project for applying the principles of mechanics, mathematical modeling, feedback control, digital signal processing and some analog electronic circuit design. Besides, it also spurred me to learn about high-level programming of robotic systems. I must say that over the years this project has taught me a lot about real-world design, rapid prototyping, design-for-test, and the reliability and maintenance of robotic systems.
--- VERSION 1 --- Electronics
The first version (Version 1.0) was completed in 2012. To reduce cost, I decided to make the machine small. This allows the use of a 2-cell 7.4V LiPo rechargeable battery as the power source and small (cheaper) DC geared motors. Version 1.0 contains two main PCBs: a single-board computer (SBC) that houses the dsPIC33FJ128GP804 16-bit micro-controller (MCU) with 40 MIPS of processing power and an Altera MAX3000A CPLD chip (for custom glue logic), and a motor driver circuit board. The motor driver circuit board contains two digital-to-analog converters (DACs) and two servo amplifiers driving two discrete bipolar junction transistor (BJT) H-bridge networks, so that it can drive two DC motors. Basically the SBC controls the DC motor torque by adjusting the voltage to the motors via the DACs. A simpler approach is to use PWM (pulse-width modulation) pulses to drive the H-bridge, adjusting the duty cycle of the PWM signals to produce an effective voltage across the DC motor. I prefer the former because (1) the DC motor torque may not be linearly related to the duty cycle of the PWM signals; (2) the output of the feedback control block is the control voltage to be applied to the DC motors, and this can be sent to the motor driver board without having to map it to a corresponding duty cycle value, which simplifies the mathematical model of the machine at the expense of more complex hardware; and (3) the delay caused by the motor driver is smaller (with a PWM control signal, because of the averaging effect, a few pulses are needed to change the motor torque output; this may be fine when standing still but could be an issue if the robot needs to change the motor torque rapidly, for instance when it is pushed).
Note [20 Feb 2020]: After scaling up my servo motor driver for larger motors, I realized that the linear (DAC-based) drive method may not be practical for larger robots with high-power motors, as the transistors in the top half of the H-bridge heat up rapidly due to the higher power dissipation. Thus I may have to go back to the PWM drive method. Older PWM-based motor drivers from the early 2000s may not work that well, but a modern (e.g. 2019) PWM-based motor driver can operate at more than 20 kHz pulse repetition frequency (PRF), which reduces the effect of the non-linearity and delay somewhat.
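For readers going the PWM route, below is a minimal sketch of how a signed control voltage from the feedback controller could be mapped to a PWM duty cycle and a direction signal. This is my illustration only: `set_pwm_duty()` and `set_direction()` are hypothetical wrappers around the MCU's PWM peripheral and H-bridge direction logic, and the 7.4V figure is simply the battery voltage of this version.

```c
/* Sketch: convert controller output voltage to PWM duty cycle.
   set_pwm_duty() and set_direction() are hypothetical wrappers around
   the MCU's PWM peripheral and the H-bridge direction input. */

#define V_SUPPLY    7.4f     /* H-bridge supply (2-cell LiPo) */
#define PWM_PERIOD  1000u    /* timer counts per PWM period */

extern void set_pwm_duty(unsigned int duty);   /* 0..PWM_PERIOD */
extern void set_direction(int forward);        /* H-bridge polarity */

void motor_set_voltage(float v_control)
{
    set_direction(v_control >= 0.0f);          /* sign selects direction */

    float mag = (v_control >= 0.0f) ? v_control : -v_control;
    if (mag > V_SUPPLY) mag = V_SUPPLY;        /* clamp to supply rail */

    /* Duty cycle proportional to the requested average voltage. */
    set_pwm_duty((unsigned int)((mag / V_SUPPLY) * PWM_PERIOD));
}
```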
Feedback Control and Modelling
I used a small digital scale to measure the weight of the components and the bi-filar suspension method to estimate the center of mass of the body and the moment of inertia of the wheel. From these a mathematical model of the robot can be obtained using Newton's laws and rotational dynamics principles (many articles describe the derivation of the equations). The equations can then be linearized at the position where the robot is to be balanced, and linear feedback control methods can be applied to the model. I use a PID controller, with the root-locus approach used to come up with the preliminary coefficients. The free mathematical software Scilab is used to carry out the computation and visualize the poles and zeros of the system. Using Scilab and the root-locus method, a suitable set of coefficients for the PID controller can be obtained. This initial coefficient set is applied to the actual robot and further optimized by trial and error to get the best upright stability.
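For concreteness, here is a minimal sketch of what the discrete PID update looks like inside the control loop. This is my illustration, not the actual firmware: the loop period `DT` and the use of tilt-angle error are assumptions, and the tuned gains would come from the root-locus and trial-and-error process described above.

```c
/* Sketch of a discrete PID update (illustrative, not the actual firmware). */

typedef struct {
    float kp, ki, kd;     /* gains from root-locus tuning, then trial and error */
    float integral;       /* accumulated error */
    float prev_error;     /* error from the previous sample */
} pid_ctrl_t;

#define DT 0.005f         /* control loop period in seconds (assumed) */

float pid_update(pid_ctrl_t *pid, float setpoint, float measured)
{
    float error = setpoint - measured;                   /* e.g. tilt angle error */

    pid->integral += error * DT;                         /* integral term */
    float derivative = (error - pid->prev_error) / DT;   /* derivative term */
    pid->prev_error = error;

    /* The output is the control voltage sent to the motor driver board. */
    return pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;
}
```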
Note [April 2016]: I have migrated to a state-space feedback control approach for better performance. Note [20 Feb 2020]: I have added another blog post to explain the modeling in detail. See here.
Sensors
The robot also contains other sensor PCBs such as an IMU (inertial measurement unit), infra-red proximity sensors and optical encoders for the wheels. Here an analog IMU is used, i.e. the accelerometer and gyroscope outputs are analog voltages, and I use the internal ADC module in the dsPIC33 MCU to digitize the signals. The IMU and infra-red proximity sensors can be easily obtained from many sources such as eBay, Pololu Robotics, Solarbotics and many more; one can even buy the components from electronic component suppliers and build a custom board. For the wheel encoder I use an opto-reflective IR sensor and fashion a 'code-wheel' out of a piece of blank PCB with a desktop CNC machine. The code-wheel has 20 teeth per rotation, so the resolution is pretty bad, as the code-wheel is mounted on the wheel (or gearbox output shaft), not on the motor shaft. Note that if one is using a PID controller, the wheel encoder is not needed to keep the robot balanced upright. The encoder is added when we also want to enable higher-order movements like moving forward or in reverse at constant velocity.
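As a rough illustration of how such a low-resolution encoder could be read (my sketch, not the original firmware): count sensor edges in an interrupt and convert the count per sample interval into wheel angular velocity. Note that a single-channel reflective sensor cannot sense direction, so the sign would have to be inferred from the commanded motor voltage.

```c
/* Sketch: wheel angular velocity from a 20-tooth single-channel code wheel.
   encoder_edge_isr() would be hooked to the IR sensor's change-notification
   or input-capture interrupt (hypothetical hook). */

#define TEETH_PER_REV  20
#define SAMPLE_PERIOD  0.02f     /* seconds between velocity updates (assumed) */
#define TWO_PI         6.2831853f

static volatile unsigned int edge_count = 0;

void encoder_edge_isr(void)      /* called on each rising edge */
{
    edge_count++;
}

float wheel_velocity_rad_s(void) /* called every SAMPLE_PERIOD */
{
    unsigned int n = edge_count;
    edge_count = 0;
    /* n teeth in SAMPLE_PERIOD -> wheel angular velocity in rad/s */
    return ((float)n / TEETH_PER_REV) * TWO_PI / SAMPLE_PERIOD;
}
```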
Note [April 2016]: Instead of using a code wheel mounted on the wheel, I have been using a rotary encoder mounted on the motor shaft. This is mechanically more compact, but is more affected by backlash if a gearbox is used.
Here's a video of Version 1.0 (actually Version 0.99), completed in July 2012.
--- VERSION 2 --- April 2016
An improved version, Version 2.0, has recently been completed. Here all the various PCBs are combined into a single 4-layer PCB, resulting in neater and more reliable hardware. I still use a dsPIC33 micro-controller (MCU), but for this version a more powerful part, the dsPIC33EP256MU806 with 70 MIPS, is used instead. I have also written a simple scheduler and C routines to systematically use all the peripherals and features of the dsPIC33EP micro-controller. This makes the firmware more systematic and easily scalable in the future.
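The scheduler source is not shown in this post, but a simple cooperative scheduler of the kind described might look like the sketch below. This is my illustration only; the task names, periods and the timer tick source are assumptions.

```c
/* Sketch of a cooperative scheduler: each task runs to completion once its
   period (in ticks) has elapsed. A hardware timer ISR is assumed to
   increment g_tick at a fixed rate, e.g. every 1 ms. */

typedef void (*task_fn)(void);

typedef struct {
    task_fn      run;        /* task body */
    unsigned int period;     /* ticks between activations */
    unsigned int next_due;   /* tick count of next activation */
} sched_task_t;

static volatile unsigned int g_tick = 0;   /* incremented by a timer ISR */

static void task_balance(void) { /* read IMU, run controller, drive motors */ }
static void task_comms(void)   { /* handle serial telemetry and commands  */ }

static sched_task_t tasks[] = {
    { task_balance,  5, 0 },   /* every 5 ms  (assumed) */
    { task_comms,   50, 0 },   /* every 50 ms (assumed) */
};

void scheduler_loop(void)
{
    for (;;) {
        for (unsigned int i = 0; i < sizeof(tasks) / sizeof(tasks[0]); i++) {
            if ((int)(g_tick - tasks[i].next_due) >= 0) {   /* wrap-safe compare */
                tasks[i].next_due = g_tick + tasks[i].period;
                tasks[i].run();
            }
        }
    }
}
```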
Also, instead of PID, the feedback control now uses a state-feedback approach, resulting in better stability when keeping the machine upright. I have also designed an on-board battery charger, so that the user can charge the battery using a standard 9-12V/0.5-1.0A output AC adapter. Moreover, most of the mechanical parts are now custom made with a low-cost 3D printer. The robot weighs around 330 grams.
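In state feedback the motor voltage is computed as a weighted sum of the measured states, u = -Kx. The sketch below is my illustration only: the choice of state vector is a common one for this type of robot, and the gain values are placeholders rather than the tuned firmware values (the real gains would come from pole placement or LQR on the linearized model).

```c
/* Sketch: full state feedback u = -K x for the balancing loop.
   Assumed state vector: tilt angle, tilt rate, wheel position, wheel rate. */

#define N_STATES 4

static const float K[N_STATES] = {
    -35.0f, -4.0f, -1.2f, -1.8f    /* placeholder gains, not the real values */
};

float state_feedback(const float x[N_STATES])
{
    float u = 0.0f;
    for (int i = 0; i < N_STATES; i++)
        u -= K[i] * x[i];          /* u = -K x, the motor control voltage */
    return u;
}
```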
A final improvement in Version 2 is the wheel encoder. Instead of an infrared sensor with a code wheel mounted on the output shaft of the geared DC motor, I now use a Hall-effect magnetic sensor with a disc magnet mounted on the motor shaft. This is the preferred approach as it allows for a more compact wheel encoder in addition to better resolution. Of course, one drawback of a motor-shaft-mounted encoder is that the backlash of the gearbox is not taken into account by the encoder. To limit the impact of backlash we can use a DC motor with a higher current rating and a smaller gear ratio.
Here's the video of Version 2.0 (Version 2 Test 1) as completed in April 2016.
Version 2 Test 1 (V2T1) still uses infrared sensors to avoid obstacles. In future I hope to incorporate a camera module in the head and use machine vision to complement the infrared sensors.
July 2016 - V2T1P
After evaluating V2T1 for a month, I set off to improve the machine, in terms of both mechanical reliability and software. Here I use a standard 42mm diameter rubber wheel which can be plugged directly onto the geared DC motor output shaft. The result is Version V2T1P ('P' indicates production). The mechanical design is now sleeker and the overall machine is lighter at around 260 grams. There are two versions of the machine; the difference is in the construction of the head. Due to the smaller wheel diameter, light weight and dynamics, this version can be balanced on the palm.
Aug 2017 - Version 2 Test 2 (V2T2)
From Dec 2016 to May 2017, I worked on a bigger version of Version 2 and also a simple machine vision module (MVM). I call this V2T2, since it still uses the same single-board computer module as V2T1. Only the mechanical aspects and the motor driver board have been redesigned, to accommodate larger DC motors (up to 15V, 5A rated motors) and an 11.1V LiPo battery. A small update was also made to the on-board computer circuit/PCB: it now has a switching DC-to-DC down-converter to regulate the battery voltage to 5V for powering the on-board electronics. This improves the power efficiency over a linear regulator and allows for up to 24V battery input. A comparison of V2T2 and V2T1 is shown below. In this picture a vision module has been mounted on the 'head' of both robots. Details of the MVM can be found in an earlier post.
19 March 2018 - Version 2 Test 2 (V2T2) with Arms (Updated on 16 July 2019)
I have added a pair of arms to the V2T2 robot. These are simple 3-degrees-of-freedom manipulators (not counting the motor in the gripper) with a claw-like gripper as the end effector. The front and rear views of the robot are shown below. Here I am using 85mm diameter rubber wheels found on 1:8 scale remote control racing cars and 25mm diameter geared DC motors (12V, 34:1, MP series) from Pololu Robotics. The Head Unit incorporates a motor to rotate the head up and down about the elevation axis.
One of the main considerations when adding arms to the two-wheel self-balancing platform is the effect of the arms on the robot state. Whenever we add extra motors and movable limbs to the platform we increase the degrees of freedom of the system. The pose of the manipulators affects the moment of inertia and also the location of the center of mass of the upper body. When the manipulators move, it is like a disturbance to the robot state. Controlling and anticipating the changes of such a multi-degree-of-freedom system is very difficult. So my strategy is to make the manipulator mass and moment of inertia small compared to the overall robot platform. This reduces the disturbance to the robot state and allows the robot to behave in a reasonably predictable manner.
The arm is designed in such a way that it is 'plug-and-play' on the robot body, i.e. we can switch to another arm by simply unplugging the old one and plugging in a new design.
As seen in the above photo, I am using Hitec RC servo motors to drive each joint, and a switch-mode buck DC-to-DC converter to down-convert an input supply voltage of 6-20V to 5V for the servo motors. A switch-mode buck converter is more power efficient, thus generating less heat and improving the running time on the robot's battery. Generally I find that Hitec servo motors (especially those with ball-bearing support at the output shaft) have better endurance than generic RC servo motors of similar size and output torque. This is important, as the RC servo motors have to endure quite severe impacts when the robot falls forward, pinning the arms to the ground. Initially I used generic nano/pico RC servos and damaged many in the process. At present the robot cannot push itself upright if it falls forward, as the servo motors do not have sufficient torque.
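Standard RC servos like these are commanded with a pulse of roughly 1 to 2 ms repeated every 20 ms, where the pulse width encodes the joint angle. Below is a minimal sketch of that mapping; `set_servo_pulse_us()` is a hypothetical wrapper around a timer/output-compare peripheral, and the 1000-2000 us range over +/-90 degrees is a typical figure, not a measured one.

```c
/* Sketch: map a joint angle command to an RC servo pulse width.
   A standard RC servo expects a ~1000-2000 us pulse repeated every 20 ms;
   set_servo_pulse_us() is a hypothetical timer/output-compare wrapper. */

#define PULSE_MIN_US 1000.0f   /* pulse width at -90 degrees (typical) */
#define PULSE_MAX_US 2000.0f   /* pulse width at +90 degrees (typical) */

extern void set_servo_pulse_us(int channel, unsigned int width_us);

void servo_set_angle(int channel, float angle_deg)
{
    /* Clamp the command to the servo's mechanical range. */
    if (angle_deg < -90.0f) angle_deg = -90.0f;
    if (angle_deg >  90.0f) angle_deg =  90.0f;

    /* Linear map from angle to pulse width. */
    float width = PULSE_MIN_US +
        (angle_deg + 90.0f) / 180.0f * (PULSE_MAX_US - PULSE_MIN_US);
    set_servo_pulse_us(channel, (unsigned int)width);
}
```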
A video of the gripper in action.
Finally, a short demo of the finished robot with a pair of arms moving around, taken in March 2018.
STL Files for the Arm
If you are interested in duplicating the mechanical design, I have shared the 3D design files (in SketchUp and STL formats) for the arm and body parts here. The main body is actually just a piece of 3 mm thick acrylic plastic.
Robotic Controller
The custom micro-controller board and motor driver board are more challenging, as these were actually designed for other, more demanding applications and I simply repurposed them to power my robots. In any case, if you are interested in the dsPIC33E single-board computer schematic and a basic description of the code using a scheduler, you can find them here. However, for beginners I would suggest replacing the DC geared motors, the custom motor driver board and the dsPIC33E single-board computer with stepper motors, A4988 or DRV8825 stepper motor drivers and an Arduino board. Controlling a stepper motor is easier than controlling a DC geared motor with an encoder, since the latter requires implementing a hardware or software quadrature decoder routine in your micro-controller (a sketch of such a routine is given below). There are many small self-balancing robot projects out there using stepper motors that can be adapted to this design. You will probably need two Arduino boards: one to control the stepper motors (and read the IMU outputs) to balance and steer the robot, and another to drive the RC servo motors on the arms and perform other high-level actions.
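For completeness, here is what a software quadrature decoder routine amounts to: a state-transition table indexed by the previous and current states of the two encoder channels. This is a minimal sketch; `read_channel_a()`/`read_channel_b()` are hypothetical GPIO reads, and `quadrature_update()` is assumed to be called from a pin-change interrupt on either channel.

```c
/* Sketch: software quadrature decoding with a state-transition table. */

extern int read_channel_a(void);   /* hypothetical GPIO read, 0 or 1 */
extern int read_channel_b(void);

static volatile long position = 0;      /* signed encoder count */
static unsigned char prev_state = 0;    /* previous (A<<1)|B state */

/* Index = (previous state << 2) | current state; value = count change.
   Invalid transitions (both channels changed at once) map to 0. */
static const signed char qdec_table[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0
};

void quadrature_update(void)            /* call from a pin-change interrupt */
{
    unsigned char state =
        (unsigned char)((read_channel_a() << 1) | read_channel_b());
    position += qdec_table[(prev_state << 2) | state];
    prev_state = state;
}
```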
18 May 2018 - Version 2 Test 2 (V2T2) with Arms (More Videos)
In this video I have added more capability (e.g. color object sensing) to the Machine Vision Module (MVM) attached to the robot's head, and coordinated this capability with the control of the arms. Details of the construction of the MVM can be found in an earlier post from Jan 2016.
12 Aug 2019 - Version 2 Production 2 (V2P2) Two-Wheels Robotic Platform
This is a more refined version of the previous machine, with a proper enclosure and more stable firmware. The robot is a platform, meaning it is "headless" and can be interfaced with a custom head module by the user. The head unit communicates with the on-board robotic controller via a serial port.
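The serial interface can be as simple as short framed command packets. The layout below is purely hypothetical, just to illustrate the idea of a head unit driving the platform over a UART; the start byte, command codes and checksum are my inventions, not the actual protocol.

```c
/* Sketch: a hypothetical framed command packet (head unit -> platform).
   Layout (illustrative only): start byte, command, 2-byte argument, checksum. */

#include <stdint.h>

#define FRAME_START 0xA5

enum { CMD_SET_VELOCITY = 0x01, CMD_TURN = 0x02 };  /* hypothetical commands */

int build_packet(uint8_t *buf, uint8_t cmd, int16_t arg)
{
    buf[0] = FRAME_START;
    buf[1] = cmd;
    buf[2] = (uint8_t)((uint16_t)arg >> 8);         /* argument, high byte */
    buf[3] = (uint8_t)(arg & 0xFF);                 /* argument, low byte  */
    buf[4] = (uint8_t)(buf[1] + buf[2] + buf[3]);   /* simple additive checksum */
    return 5;                                       /* bytes to send over UART */
}
```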
Here is a demonstration of the capability of V2P2, with a custom robotic head. The head contains a small camera with an on-board processor for real-time color image processing at 20 FPS (frames per second). This machine-vision module (MVM) is described in an earlier blog post from 2016: https://fkeng.blogspot.com/2016/01/machine-vision-module.html
In this demonstration the robot uses the camera to detect obstacles by observing color contrast, and also to find a yellow tennis ball in the environment.
5 Feb 2020 - Version 2 Test 2 (V2T2) with Improved Capability (More Videos)
This is the same version I built in March 2018, but over the past 1.5 years I have greatly improved the control software in the robot body and the machine vision module (MVM) on the head. The head has also been rebuilt. As mentioned above, the mass and size of the arms are much smaller than those of the robot platform; this minimizes the effect of the manipulator pose on the robot state. The predictable behavior allows the machine to perform some useful actions, like picking up a small load from the floor and moving it to another location, as shown in the demo video taken in Feb 2020. The robot can now coordinate its manipulators and center of gravity to lift and carry an object of up to 100 grams (in the video the load is 75 grams).
December 2021 - Version 2 Production 2 (V2P2) Update
I put development of this version on hold in 2020 while I focused my effort on building Version 3 of the robot. In the past few months I started to look into this version again. In particular, efforts have been diverted to exploring the streaming of video images from the robot camera over WiFi to a remote computer. I settled on the popular ESP-01 module, which uses the ESP8266EX chip. This could probably be done with an ESP32 board too, but for now the ESP-01 is slightly cheaper. The approach I use is a direct connection between the ESP-01 and a computer using TCP (Transmission Control Protocol). The ESP-01 is set up as a TCP server; image data from the machine vision module on the robot is streamed to the ESP-01, line by line, over a serial link, and this in turn is transmitted to the remote computer over TCP. A TCP client on the remote computer then displays the lines of image data in an app. At present I am using a baud rate of 345.6 kbps for the serial link between the machine vision module and the ESP-01. The frame rate achieved is between 1-2 frames per second, and only in grayscale. It is not ideal, but it is at least 3-4x faster than using Bluetooth (a rough throughput estimate is given below).

I also took the opportunity to improve the robot monitoring software that is used to observe real-time telemetry data from the robotic platform, such as the torque setting for the motors, the wheel angular velocity, distance traveled, tilt angle, control board temperature, etc. Below is a video showing the results of these efforts over the past few weeks.
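The serial link between the MVM and the ESP-01 is the bottleneck here. As a rough sanity check (my estimate, assuming 8N1 UART framing, i.e. 10 bits on the wire per byte, and a 160x120 8-bit grayscale frame, which is an assumption about the streamed resolution):

\[
\frac{345{,}600~\text{bit/s}}{10~\text{bit/byte}} = 34{,}560~\text{byte/s},
\qquad
\frac{34{,}560~\text{byte/s}}{160 \times 120~\text{byte/frame}} \approx 1.8~\text{frame/s},
\]

which is consistent with the observed 1-2 frames per second.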
July 2022 - Version 2 Production 2 (V2P2) Update
Some minor improvements in the firmware for V2P2. I incorporated routines for the robot to park itself against a wall or similar structure when it is idle. In this mode the robot turns off the motors driving the wheels, thus conserving energy. I also incorporated the reverse routine, i.e. for the robot to get up on its own. The robot reverses until it is 1-3 cm from a supporting structure, then turns off the balancing task and leans back. To get up, the robot first reverses both wheels; the reaction force on the wheels swings the robot forward. Once the tilt angle is within a certain threshold of the upright position, the balancing task is activated. https://www.youtube.com/shorts/UId41GEJenM
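The sequence described above maps naturally onto a small state machine. Below is a minimal sketch; the state names, thresholds and the hooks (`get_distance_cm()`, `get_tilt_deg()`, `set_wheel_voltage()`, `balancing_enable()`) are all assumptions for illustration, not the actual firmware.

```c
/* Sketch: park / get-up sequence as a state machine (illustrative only).
   Some other part of the firmware would set the state to REVERSING_TO_WALL
   (to park) or GETTING_UP (to recover); park_task() runs periodically. */

typedef enum { BALANCING, REVERSING_TO_WALL, PARKED, GETTING_UP } park_state_t;

#define PARK_DIST_CM  3.0f    /* lean back when this close to the wall */
#define UPRIGHT_DEG   5.0f    /* tilt error below which balancing resumes */

extern float get_distance_cm(void);       /* rear distance to the wall */
extern float get_tilt_deg(void);          /* |tilt| from the upright position */
extern void  set_wheel_voltage(float v);  /* both wheels, signed */
extern void  balancing_enable(int on);    /* start/stop the balancing task */

static park_state_t state = BALANCING;

void park_task(void)
{
    switch (state) {
    case REVERSING_TO_WALL:
        if (get_distance_cm() < PARK_DIST_CM) {
            balancing_enable(0);        /* stop balancing and lean back */
            set_wheel_voltage(0.0f);    /* motors off to conserve energy */
            state = PARKED;
        }
        break;
    case GETTING_UP:
        set_wheel_voltage(-2.0f);       /* reverse: reaction swings body forward */
        if (get_tilt_deg() < UPRIGHT_DEG) {
            balancing_enable(1);        /* near upright: resume balancing */
            set_wheel_voltage(0.0f);
            state = BALANCING;
        }
        break;
    default:
        break;
    }
}
```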
--- VERSION 3 --- September 2020 - Version 3 Prototype 1 [Basic Platform]
I started planning, designing and building Version 3 of the two-wheel mobile robot platform in Jan 2020. There were numerous delays and hiccups due to the global Covid-19 pandemic during this period, for instance delays in getting components and PCBs fabricated; in the initial phase of the pandemic there was also a strict lockdown in my locality, which prevented me from carrying out extensive hardware work. Subsequently I adjusted and set up a basic lab at home. Only in August 2020 was I able to get a partially completed system to work reliably. I call this Version 3 due to the new robot controller module being developed for this version. A slightly better micro-controller, the dsPIC33CK256MP506, is used on the robot controller. This is still a 16-bit micro-controller with DSP capability, as in Versions 1 & 2, but it runs at a higher clock frequency of 200 MHz, and I suspect it uses a more advanced fabrication process, thus reducing the power consumption of the chip. A photo of the robotic platform is shown below.
As with V2T2, this version has a pair of mechanical arms, although in this version the arms are more powerful, using 15 kg-cm torque smart servos as the actuators. I am still working on designing the gripper, or end effector, for the arms. Also in this version I am using larger geared DC motors of 37 mm diameter, which can produce larger torque (about 8 kg-cm), coupled to 120mm diameter wheels. The current robot platform is about 30 cm in height without the head unit and weighs 2.5 kg. The video below explains the main features of the platform in detail.
March 2021 - Version 3 Prototype 1 Update [Wrist Joint, Head Bracket and Improved Firmware]
After several months, some improvements have been made to V3T1; notably, a head structure has been added, along with wrists on both arms. For the wrist, a micro RC servo with a bearing is used as the actuator for better mechanical reliability. Moreover, improvements in the robot controller firmware now allow the machine to pick up and place a load of up to 400 g. The video below presents a short demonstration.
July 2021 - Version 3 Prototype 1 Update [Robot Head]
For the past couple of months I have been experimenting with using a Raspberry Pi (RPi) single-board computer (SBC) as the 'head' of the robot. I intend to use the RPi with the OpenCV libraries and a CMOS camera to carry out basic image processing tasks. The RPi is also linked to the on-board Robot Controller: the RPi carries out image processing tasks and high-level algorithms, while the Robot Controller executes low-level routines. Initially I tried the Raspberry Pi Zero W. However, the RPi Zero proved too slow with its single-core processor. Subsequently I settled on a Raspberry Pi 3A+ in the official enclosure. The quad-core processor in the RPi 3A+ is sufficient for the basic image processing tasks, provided I limit the image resolution to 320x240 pixels. The RPi 3A+ was chosen for its size, weight and power consumption. It is powered from the 5V supply bus of the Robot Controller, which can provide up to 2.5A of current. I manage to obtain a frame rate of 7-15 frames per second depending on the complexity of the computation, which seems reasonable for the time being. The video below provides further explanation.
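The RPi-to-Robot-Controller link mentioned above is just a UART. A minimal sketch of opening it from the RPi side with standard POSIX calls is shown below; the device path `/dev/serial0` and the 115200 baud rate are assumptions for illustration, not the actual configuration.

```c
/* Sketch: open the RPi UART to the Robot Controller using POSIX termios.
   Device path and baud rate are assumed, not the actual settings. */

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int open_robot_link(void)
{
    int fd = open("/dev/serial0", O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                  /* raw 8-bit binary link */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tio.c_cflag |= CLOCAL | CREAD;    /* ignore modem lines, enable receiver */
    tcsetattr(fd, TCSANOW, &tio);

    return fd;   /* use read()/write() for commands and telemetry */
}
```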
March 2023 - Version 3 Prototype 1 Update [Wrist, gripper, software and others]
I put this project on hold for most of last year due to work commitments, and also spent time learning classical robotics and mechanical engineering topics such as the kinematics and control of arm-type robots, machine and mechanism design, etc. Anyway, I managed to pick up the project again at the end of 2022, and built up the wrist joints and grippers for the mechanical arms. The gripper is modified from a kit sold by Pololu. I also tidied up the source code for the on-board robot controller and added some interesting capabilities. So far the robot control software seems to be more stable. Finally, I also learnt how to make my own rechargeable battery pack. All these are summarized in the video below. I plan to share the design details on GitHub in the near future.
--- PC Monitoring Software for Robot ---
April 2022
Though not explicitly mentioned, throughout the project I have been developing a program running on a PC which allows me to monitor the status of the robot's sensors, and also to issue commands to the robot in real time. Perhaps now is the right time to introduce the software, as it is stable and has all the required functionality. Below is a video that explains the features of this software, together with the GitHub link.