In keeping with the original aim of replicating the Rug Warrior, described in the first part of this series, in this final part we’ll use the MD_SmartCar library functions to implement a simple random roving robot with similar functionality to the vintage Rug Warrior bot.
The objectives for this robot are therefore fairly modest – the robot should cruise on its own while avoiding obstacles and escaping from inadvertent collisions. Additionally, it should be able to simply roam, track a light source or follow a wall as its primary behavior.
As the MD_SmartCar library takes care of the tedious details of managing the motors and vehicle motion, the application for this robot only needs to focus on the control of the robot’s behavior.
The control system for even a simple autonomous mobile bot must contend with information-processing tasks in real time. When using limited hardware, the traditional approach of separating the problem into a series of sequential functional components is too computationally intensive to be feasible.
An alternative approach is Behavioral Programming, a technique pioneered by Rodney Brooks at the MIT Artificial Intelligence Laboratory. Behavioral Programming has been successfully implemented in many commercial robots (eg, the Roomba) and has also been used in applications on NASA’s Mars rovers.
Behavioral Programming principles embody a control philosophy that makes the control program inherently parallel, allowing attention to hazards and opportunities as they arise.
Implementing Behavior Based Control
Behavior based robot programming is a software architecture that combines real-time control with sensor-triggered behaviors. Sensors are used to trigger behaviors rather than to make explicit judgements about the environment (eg, mapping or real-world modelling).
Conceptually, behaviors are parallel layers of control that are arbitrated by a prioritization scheme that decides which one will be dominant (behavior fusion). When higher priority behaviors are no longer triggered by a given sensor condition, lower priority behaviors cease to be suppressed and can resume control. A ‘default’ behavior is activated when there are no higher priorities.
The figure above shows a control hierarchy for a simple robot that moves about its environment. The yellow boxes are input sensors, the blue boxes are behaviors, the red circles are priority arbitration points and the green box is the output actuator.
The behaviors shown in the figure are the most basic needed for survival (Escape), avoiding obstacles (Avoid) and a default behavior (Cruise). Additional behaviors (such as light or dark seeking, wall following, etc) can be added as required between Avoid and Cruise.
Each behavior is relatively simple, but together they allow the roving bot to create a complex response to its environment whilst navigating around:
- Escape is triggered by either of two sensor conditions: the front bumper hits something and/or the sonar sensor detects that the robot is “too close” (ie, within a predefined close distance) to an obstacle. This behavior has the highest priority and causes the robot to escape from the situation (see below).
- Avoid tries to avoid obstacles by using the sonar to detect them at a distance and moving away from the object. Any suitable behavior can be implemented here, but the most obvious thing to do is veer away from the object, with a higher turning rate the closer the robot is to the obstacle.
- Cruise is the default behavior if the robot does not have to Avoid or Escape. This behavior, for example, could just be to drive forwards in a straight line.
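The Avoid behavior’s veer-away response – a higher turning rate the closer the obstacle – can be sketched as a simple linear ramp. The function name and the two distance thresholds below are illustrative, not taken from the library:

```cpp
// Sketch: map a sonar distance (mm) to a turn rate in [0.0, 1.0].
// DIST_AVOID and DIST_CLOSE are assumed values for illustration only.
const float DIST_AVOID = 600.0f;  // start veering inside this range (mm)
const float DIST_CLOSE = 200.0f;  // maximum turn rate at or below this range (mm)

float avoidTurnRate(float distance)
{
  if (distance >= DIST_AVOID) return 0.0f;   // nothing to avoid yet
  if (distance <= DIST_CLOSE) return 1.0f;   // turn as hard as allowed
  // linear ramp: the closer the obstacle, the higher the turn rate
  return (DIST_AVOID - distance) / (DIST_AVOID - DIST_CLOSE);
}
```

The returned rate would then scale whatever turning command the motor control layer accepts.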
When a behavior is prioritized it becomes the ‘active’ behavior and will take control of the vehicle during its response to the situation.
In the absence of a scheduler, it is helpful to think of each behavior as a Finite State Machine (FSM) triggered by some sensor condition. For example, an Escape behavior could be triggered by a front bumper switch to stop the robot, reverse, do an about-turn (or turn to the left or right side) and then give up its priority control.
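The Escape sequence just described maps naturally onto a small FSM. The sketch below models only the state transitions – the state names and tick durations are invented, and a real sketch would issue motor commands through the MD_SmartCar library at each transition:

```cpp
// Sketch of an Escape behavior as a finite state machine.
// States and durations are illustrative; motor commands are omitted.
enum EscapeState { ESC_STOP, ESC_REVERSE, ESC_SPIN, ESC_DONE };

struct Escape
{
  EscapeState state = ESC_STOP;
  int timer = 0;   // ticks remaining in the current state

  // Advance one tick; returns true while the behavior still holds control.
  bool run()
  {
    switch (state)
    {
    case ESC_STOP:    state = ESC_REVERSE; timer = 10; break;  // stop, then back up
    case ESC_REVERSE: if (--timer == 0) { state = ESC_SPIN; timer = 15; } break;
    case ESC_SPIN:    if (--timer == 0) state = ESC_DONE; break; // about-turn done
    case ESC_DONE:    state = ESC_STOP; return false;  // relinquish control, re-arm
    }
    return true;
  }
};
```

While `run()` keeps returning true the arbitration code keeps Escape as the active behavior; when it returns false, lower priority behaviors can resume.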
Each behavior is implemented as two functions:
- A prioritization function that looks at the sensors and determines whether the behavior should dominate. The order in which these functions are invoked defines the priority hierarchy – the first to determine that its trigger conditions are met gets control.
- An implementation function. Once the behavior has been prioritized, this function takes control. Behavior execution often maps onto a FSM that can be defined in a sequence table for the library to manage in the background. The robot remains in the active behavior until it relinquishes control, unless a higher priority behavior seizes control first.
The dispatcher code for this looks something like:
// Arbitrate the behaviors in priority order
if (activateEscape())
  doEscape();
else if (activateAvoid())
  doAvoid();
else if (activateSeek())
  doSeek();
else if (activateWallFollow())
  doWallFollow();
else
  doCruise();    // default
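A prioritization function such as `activateEscape()` only needs to test the current sensor snapshot. The sketch below assumes a global `sensors` object with the fields shown – the structure, field names and threshold value are invented for illustration:

```cpp
#include <cstdint>

// Illustrative sensor snapshot; not the library's actual class.
struct Sensors
{
  bool bumperL = false, bumperR = false;
  uint16_t sonarF = 9999;   // front sonar distance in mm
};

Sensors sensors;
const uint16_t DIST_ESCAPE = 150;  // "too close" threshold (assumed value)

// Prioritization function: should Escape take control this loop?
bool activateEscape()
{
  // trigger on either bumper hit, or the front sonar inside the threshold
  return sensors.bumperL || sensors.bumperR || (sensors.sonarF < DIST_ESCAPE);
}
```

Because the function is cheap and side-effect free, the dispatcher can call it every pass through `loop()` without penalty.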
As behaviors are driven by sensors, it makes sense that managing the sensors is also an important task for the application.
This rover bot carries three main sensor types, whose layout is shown below:
- Three Sonar sensors to measure distance to obstacles, pointing straight ahead, left and right.
- Two whisker type bumper switches at the front of the rover.
- Two light sensors measuring the light levels either side of the vehicle.
Each type of sensor should be managed differently:
- Sonar sensors send a ‘ping’ and wait to detect the return echo. A 50 ms pause is then needed to let all the echoes die down before the next sensor pings, so the three sensors must follow a sequential cycle.
- The bump switches are a simple digital input. They are a high priority because they signal a collision that must be dealt with, so they need to be read frequently!
- Light levels don’t generally change rapidly, so a less frequent polling regime is adequate for the light sensors.
To manage these disparate requirements, the sensors are packaged into a class to track the timers, current values, conversions, etc. Active sensor values are read from this class as needed. The implementation guarantees that they will remain unchanged during each cycle of the loop() function to allow a consistent basis for decision making during that iteration.
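One possible shape for such a class is sketched below, with the hardware reads stubbed out. Only the 50 ms sonar spacing comes from the text; the class name, light polling period and method names are assumptions:

```cpp
#include <cstdint>

// Sketch of a sensor manager that staggers readings per the text.
class SensorPack
{
public:
  static const uint32_t SONAR_GAP = 50;      // ms between sonar pings
  static const uint32_t LIGHT_PERIOD = 500;  // ms between light polls (assumed)

  void read(uint32_t now)   // call once per loop(); values then stay fixed
  {
    // bumpers: read on every pass, they are high priority
    // (digitalRead() calls would go here)

    if (now - lastSonar >= SONAR_GAP)
    {
      lastSonar = now;
      curSonar = (curSonar + 1) % 3;   // ping L, F, R in a fixed cycle
    }
    if (now - lastLight >= LIGHT_PERIOD)
      lastLight = now;                 // sample both light sensors here
  }

  uint8_t sonarInCycle() const { return curSonar; }

private:
  uint32_t lastSonar = 0, lastLight = 0;
  uint8_t curSonar = 0;
};
```

Calling `read()` exactly once at the top of `loop()` gives every behavior the same latched values for that iteration, which is the consistency guarantee described above.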
Control and Monitoring
I thought it would be interesting to control and monitor the rover using a wireless remote application.
Communications with the vehicle uses a Bluetooth (BT) interface implemented with an HC-05 module, as described in an earlier part of this series. The BT module is pre-initialized and paired to the Bluetooth master running the AI2 application.
The AI2 application is used to start and stop the rover, set the default behavior mode, and present sensor telemetry data and messages from the vehicle in the GUI display.
Commands to the rover are simple ASCII commands received and processed using the MD_cmdProcessor library communicating through the Bluetooth serial interface.
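For flavour, a hand-rolled stand-in for this kind of single-character ASCII command dispatch is shown below. MD_cmdProcessor itself uses a table-driven API that differs from this, and the command letters here are invented, not the rover’s actual command set:

```cpp
// Illustrative single-character command dispatch; NOT the MD_cmdProcessor API.
enum Cmd { CMD_RUN, CMD_STOP, CMD_MODE, CMD_UNKNOWN };

Cmd parseCommand(char c)
{
  switch (c)
  {
  case 'r': return CMD_RUN;    // start roving
  case 's': return CMD_STOP;   // stop the vehicle
  case 'm': return CMD_MODE;   // set the default behavior mode
  default:  return CMD_UNKNOWN;
  }
}
```

The real library adds per-command help text and argument handling on top of this basic idea.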
The information from the rover to the AI2 application is a continuous stream consisting of two types of data messages:
- Telemetry data containing information about the status of the sensors and the behavior of the vehicle. A telemetry packet is ASCII data that starts with a special start character (‘$’) and ends with an end character (‘~’), at a set time interval. The AI2 application unpacks this data and updates the GUI display.
- Text messages are anything outside of a telemetry packet. This data is simply displayed as text in the GUI. The rover typically sends messages about the conditions triggering behaviors, but this is also a useful mechanism for a limited amount of real-time debugging.
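Framing a telemetry packet between the ‘$’ and ‘~’ characters might look like the sketch below. Only the delimiters come from the text – the field layout (a behavior code followed by three sonar ranges) is invented for illustration:

```cpp
#include <cstdio>
#include <cstdint>

// Sketch: frame a telemetry packet between '$' and '~'.
// The field layout is illustrative, not the rover's actual format.
int packTelemetry(char *buf, size_t len,
                  uint16_t sonarL, uint16_t sonarF, uint16_t sonarR,
                  char behavior)
{
  // eg, "$C,0123,0088,0456~" -> behavior code, then the three sonar ranges
  return snprintf(buf, len, "$%c,%04u,%04u,%04u~", behavior,
                  (unsigned)sonarL, (unsigned)sonarF, (unsigned)sonarR);
}
```

On the receiving side, the AI2 application simply buffers characters until it sees ‘$’, collects up to ‘~’, and splits the fields; anything arriving outside that frame is treated as a text message.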
It is quite interesting to watch the telemetry and follow the progress of the bot as it travels around the room and the appropriate behaviors take control!
The core library, configuration, setup and application sketches referenced in this series of articles are available in my code repository as the MD_SmartCar library. The code for the rover bot described here is the MD_SmartCar_Rover example and the App Inventor (AI2) application is found in the AI2 subfolder of the example.