This demonstration is a simple robot-contest-style simulation: the screen shows a room in which a sweeper robot moves around and collects trash. The mission is to make the robot clean the room by building the robot's action logic as controller software that invokes simple hardware APIs.
The sweeper robot has several sensors. The controller software handles the events from them, decides the robot's actions, and sends commands to its driving system.
In the demonstration, the sensors and the controller software are represented as MixJuice modules, so different module combinations lead to different robot behavior. Users may write the controller part as a module and link it at execution time to make the robot act with their own action logic.
In the demonstration, the sweeper robot has the following virtual hardware specification:
From the viewpoint of the controller software, 5 commands can be sent to the driving system of the sweeper robot:
Once a command is sent, the wheels keep turning accordingly until the next command is sent. In other words, the `turn right' command keeps the robot turning right until `stop' or another command is sent.
For simplicity of implementation, the robot's movement unit matches one block of the screen, and the revolution unit is 90 degrees (i.e. the robot does not move diagonally like a bishop).
The vacuum hole of the sweeper robot is on its bottom, and the vacuum motor works continuously. Thus, the robot sweeps up any trash it passes over.
The sensors are virtual hardware mounted on the sweeper robot. They sense the environment around the robot and send events to the controller software. The objects to be sensed and the events to be sent vary with the kind of sensor.
Since the sensors only send events corresponding to their functions, how the robot reacts to those events depends on the implementation of the controller software; for example, if a sensor reports a wall in front of the robot, the robot may turn right, go backward, do nothing, etc.
The following 5 sensors are ready to use:
This sensor detects the direction of the robot. By specification, the robot can face 4 directions, and the sensor thus sends 4 kinds of events.
This sensor detects the state of movement, which is switched by the commands to the driving system mentioned above. There are therefore 5 states, one per driving system command, with a corresponding event for each: `going forward', `going backward', `turning right', `turning left' and `stopping'.
This sensor detects walls. It checks the blocks to the left and right of the robot and the block in front of it. Each block has 2 states, `wall found' and `no wall', with a corresponding event for each, so 6 kinds of events in total may be sent.
This sensor detects trash. It checks the blocks to the left and right of the robot and the block in front of it. Each block has 2 states, `trash found' and `no trash', with a corresponding event for each, so 6 kinds of events in total may be sent.
To enable manual control of the robot, it can carry a remote controller sensor. The remote controller used in the demonstration is mapped to keys on a keyboard. There are 5 kinds of key events to be handled: the up, down, left and right cursor keys and the space key.
The following image is a screenshot of the demonstration. Click the screen to start the demonstration.
This is the sweeper robot. The red part indicates the front of the robot.
This is a wall. The sweeper robot cannot pass through walls. Trying to go forward while ignoring a wall in front, or to go backward while ignoring a wall behind, results in wheelspin and the robot does not move.
This is the floor. It represents a block that has already been swept or contains no trash.
This is trash. Once the sweeper robot moves onto trash, it is swept up and the block turns into `floor'. The simulation ends when all trash has been swept.
This is the time elapsed since the beginning of the simulation. In one unit of time, the robot can make a single move: `go forward', `go backward', `turn left', `turn right' or `stop'.
The number of trash blocks is shown as (number of remaining trash blocks) / (number of initial trash blocks).
The sensors mounted on the sweeper robot are shown. They vary with the combination of sensor modules selected when the demonstration starts.
The behavior of the demonstration can be changed by the combination of sensor and controller software modules. Here are 9 applets with different combinations (clicking a link starts the applet in another window). Details of each module and of the combinations are given later.
To read only the source code, download it from the link below.
To get an archive of runnable source code including image files, download it from the link below.
To compile and run the demonstration from source code, the following environment is required.
The following are the steps to compile and execute the demonstration. The Java and MixJuice execution environments are assumed to be installed.
% mjc AutoSweeperDemo.java
% mj -s sensor.key control.manual
Before the detailed description of the program, this section comments on the behavior of the demonstration with some examples.
The sensor modules are named sensor.* and the controller software modules are named control.*. The combination of sensors and controller software in each of the following examples can be read off from the modules specified to the mj command.
First, we control the sweeper robot by combining the remote controller sensor and the manual controller software.
The controller software sends a command to the driving system directly, triggered by a remote controller sensor event. That is, keyboard operation controls the sweeper robot as follows:
event sent from remote controller sensor | command sent to the driving system |
---|---|
up cursor key pushed | go forward |
down cursor key pushed | go backward |
right cursor key pushed | turn right |
left cursor key pushed | turn left |
space key pushed | stop |
To run this example, do the following:
% mj -s sensor.key control.manual
Anyone who has tried `manual 1' may find it hard to operate. The difficulty comes from the direct mapping in the controller software control.manual between the events from the remote controller sensor (key input) and the commands to the driving system of the sweeper robot: pushing the up cursor key does not make the robot move upward on the screen, which is not intuitive. The next controller software therefore maps the up, down, left and right cursor keys to up, down, left and right movement of the robot on the screen.
To implement this behavior with the driving system commands of the sweeper robot, the movement must be changed according to the robot's direction. For example, to make the robot move upward with the up cursor key, directions and commands should be combined as follows:
direction of the robot | commands needed for upward movement |
---|---|
upward | go forward |
rightward | turn left + go forward |
leftward | turn right + go forward |
downward | go backward |
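The mapping above can be sketched in plain Java (this is an illustrative sketch, not the demonstration's MixJuice code; the direction labels and command strings are taken from the table):

```java
import java.util.List;
import java.util.Map;

class UpwardMapping {
    // Command sequences that move the robot upward on screen,
    // keyed by its current direction (from the table above).
    static final Map<String, List<String>> UPWARD = Map.of(
        "upward",    List.of("go forward"),
        "rightward", List.of("turn left", "go forward"),
        "leftward",  List.of("turn right", "go forward"),
        "downward",  List.of("go backward"));

    public static void main(String[] args) {
        System.out.println(UPWARD.get("rightward"));  // [turn left, go forward]
    }
}
```

Looking up the current direction (from the direction sensor) yields the command sequence to send, which is exactly what control.manual2 has to do for each cursor key.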
The controller software control.manual2 introduced here controls the sweeper robot as stated above, using the events from the remote controller and direction sensors. To run the example, do the following:
% mj -s sensor.key -s sensor.direction control.manual2
Having finished the explanation of manual control with the remote controller sensor, we move on to automatic controller software.
Several controller software modules will be implemented below, but first we explain a prepared controller software module that eases those implementations. The driving system commands that change the robot's direction are `turn right' and `turn left'; the movements ``move to the right'' and ``move to the left'' result from combining them with `go forward'. The prepared controller software sends the `go forward' command after catching the `turning right' or `turning left' event from the motor sensor. This module is named control.turnAndForward, and all the following controller software modules extend it. With this module the implementations become easier: ``move to the right'' or ``move to the left'' can be achieved simply by sending the `turn right' or `turn left' command.
As a first example, we implement a controller software module control.random, which randomly issues `turn right' and `turn left' commands with a certain probability. As stated above, since the module extends control.turnAndForward, it must be combined with the motor sensor (sensor.motor) to behave correctly.
As is obvious to anyone who runs the module, its efficiency is far too low to clean up the whole room.
% mj -s sensor.motor control.random
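The decision logic of a random controller like this can be sketched in plain Java (an illustrative sketch, not the demonstration's code; the probability parameter and command strings are assumptions):

```java
import java.util.Random;

class RandomTurn {
    // With probability p per unit time, issue a turn command;
    // the turn direction is chosen uniformly at random.
    static String decide(Random rng, double p) {
        if (rng.nextDouble() >= p) return "no command";
        return rng.nextBoolean() ? "turn right" : "turn left";
    }

    public static void main(String[] args) {
        Random rng = new Random();
        for (int t = 0; t < 5; t++)
            System.out.println(decide(rng, 0.3));  // one decision per unit time
    }
}
```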
The next controller software avoids walls. A wall sensor can detect walls in 3 blocks: in front of the robot and on each side. The controller software module control.avoidWall implemented here takes a simple wall-avoiding action as follows, assuming the robot goes forward by default:
detection of wall sensor (o: detected, x: not detected, -: irrelevant):

front | right | left | driving system command |
---|---|---|---|
x | - | - | (no command) |
o | x | - | turn right |
o | o | - | turn left |
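The decision table can be sketched in plain Java (an illustrative sketch, not the demonstration's MixJuice code; the command strings are taken from the table):

```java
class AvoidWall {
    // Decision table from the text, with the robot going forward by default:
    // no wall ahead -> no command; wall ahead, right side clear -> turn right;
    // walls ahead and to the right -> turn left. The left column is irrelevant.
    static String decide(boolean wallFront, boolean wallRight) {
        if (!wallFront) return "(no command)";
        return wallRight ? "turn left" : "turn right";
    }

    public static void main(String[] args) {
        System.out.println(decide(true, false));  // turn right
        System.out.println(decide(true, true));   // turn left
    }
}
```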
To run the demonstration with this controller software, do the following:
% mj -s sensor.wall -s sensor.motor control.avoidWall
As is obvious to anyone who runs the module, the sweeper robot soon reaches a dead end.
The next controller software seeks trash with the trash sensor. A trash sensor can detect trash in 3 blocks, like the wall sensor: in front of the robot and on each side. The controller software module control.seekTrash implemented here moves the robot simply by trash detection:
detection of trash sensor (o: detected, x: not detected, -: irrelevant):

front | right | left | driving system command |
---|---|---|---|
o | - | - | go forward |
x | o | - | turn right |
x | x | o | turn left |
x | x | x | (no command) |
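This decision table, too, can be sketched in plain Java (an illustrative sketch; the command strings are taken from the table):

```java
class SeekTrash {
    // Decision table from the text: trash ahead wins, then right, then left;
    // with no trash in sight, no command is issued.
    static String decide(boolean trashFront, boolean trashRight, boolean trashLeft) {
        if (trashFront) return "go forward";
        if (trashRight) return "turn right";
        if (trashLeft)  return "turn left";
        return "(no command)";
    }

    public static void main(String[] args) {
        System.out.println(decide(false, true, false));  // turn right
    }
}
```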
To run the demonstration with this controller software, do the following:
% mj -s sensor.trash -s sensor.motor control.seekTrash
As is again obvious to anyone who runs the demonstration, the sweeper robot gets stuck at a wall.
Each of the automatic controllers above soon reaches a state in which the robot cannot continue to work. The next demonstration combines 2 controller software modules: the wall-avoiding and the trash-seeking controllers. The important point is that there is no need to implement new modules; already-implemented modules merely need to be combined. The combination takes place at execution time:
% mj -s sensor.wall -s sensor.trash -s sensor.motor \
      -s control.avoidWall control.seekTrash
This combination seems much better at cleaning the room. It is, however, difficult to finish cleaning the room completely. Success depends on the arrangement of walls and trash: the fewer trash blocks remain, the higher the probability of reaching a dead end, as with the wall-avoiding automatic controller.
The next example adds the random driving controller software module to the above combination.
% mj -s sensor.wall -s sensor.trash -s sensor.motor \
      -s control.avoidWall -s control.seekTrash control.random
It can clean up the room, though it takes a long time.
The examples above show the necessity of combining controller software with the proper sensors, i.e. sensors that send the events the controller software requires.
However, it is possible to run a demonstration with a mismatched combination of sensors and controller software, as the following examples show.
As an example of the first case, we remove the direction sensor from manual 2:
% mj -s sensor.key control.manual2
As a result, the robot responds very strangely to cursor key input. The lack of the direction sensor causes the controller software to assume that the robot always faces one direction (to the right, in this case), so the events from the remote controller sensor drive the robot's behavior with wrong information.
The next example illustrates the second case. Now we mount an unneeded wall sensor on manual 2:
% mj -s sensor.key -s sensor.direction -s sensor.wall control.manual2
Although the sensor icon shows the existence of a wall sensor, the controller software has no implementation to handle the events from that sensor, so the sweeper robot behaves just as before.
Note that there is a way to automatically link the sensors used by the controller software and avoid specifying sensor modules explicitly. In this demonstration, however, we deliberately avoid such an implementation in order to show the mismatched examples above.
From here on, we explain the implementation of the demonstration programs.
First, we show an overview of all the MixJuice modules of the demonstration program. Note that in the images below the modules are drawn as UML packages for convenience. Moreover, the `extends' and `uses' relations among modules are drawn as UML dependency relations, distinguished by stereotype notation (for simplicity, some stereotype notations are omitted).
The modules of the demonstration programs are classified into 3 major categories:
The demonstration programs assume that controller software is implemented by users. The controller software must therefore be prevented from directly touching the simulation rules and/or hardware. In other words, the third group of modules and user-implemented controller software are prohibited from adding differences or referring to the classes in the first group of modules. The controller software must add differences or refer only to the second group of modules.
In some places the relation between modules is not `extends' but `uses'. The `uses' relation exists, by nature, to avoid imposing a vertical-ordering restriction at module linearization time, so that modules with cyclic relations can be used. The modules of this program, however, have no cyclic relations; `uses' is used here only to indicate that ``the module adds no differences to the classes of the target module and purely uses them.''
This section describes the details of each module implementation.
The base module is the foundation of the demonstration programs. The following is the class composition of the base module; the pale yellow classes in the class diagram represent the classes defined in the module:
BaseApplet is an applet class extending java.awt.applet.Applet.
DrawCanvas is a class managing the drawing region of the simulation. It extends java.awt.Canvas and implements update() and paint() to draw the simulation screen. Since it is convenient for various classes to be able to refer to the drawing region, it is implemented as a (pseudo) Singleton class.
Round is a class controlling the simulation sequence. It implements java.lang.Runnable so that it can be run as a thread. It manages the start, end and elapsed time of the simulation. In every unit time of the simulation it calls the doAction() method, which in this module only draws the elapsed time and the trash blocks.
Room is a class representing the `room' of the simulation. It manages the layout of the room, with its walls and trash. In this implementation the layout of the room is hard-coded in the class. It instantiates Floor, Trash and Wall when it draws the room (in the method draw()).
Note that the main() method of the class SS is implemented in the base module. Thus the base module can be run on its own. Since no sweeper robot is defined in the base module, the simulation only offers the `field' with time passing and no robot.
% mj base
The control module defines the interfaces between the hardware and the controller software.
Sweeper is the interface to the sweeper robot hardware for the controller software. The controller software commands the sweeper robot hardware by calling the methods of Sweeper.
On the other hand, Controller is the interface to the controller software for the simulation/hardware. init() is called at the start of the simulation, and control() is called by the simulation every unit time. Both methods have empty implementations in the control module. Implementers of controller software manage the sweeper robot by implementing the class Controller and calling the methods of Sweeper.
Note that the class Controller has a method getSweeper() with which controller software implementers can obtain a reference to Sweeper. Moreover, the class Controller is instantiated internally by the simulation, so there is no need for the implementer to call the constructor.
The hardware module implements the hardware of the sweeper robot.
SweeperImpl is the implementation class of the sweeper robot. It implements the interface Sweeper of the control module. Each method of the interface Sweeper is implemented so as to instantiate the corresponding concrete class of ActionCommand, described later, and keep it.
Direction is the class representing the direction of the sweeper robot. The class is capable of representing all 360 degrees in 1-degree steps, but the demonstration uses only 4 instances, 90 degrees apart.
ActionCommand is the interface representing the actions of the sweeper robot. When the robot does something in a unit time of the simulation, SweeperImpl calls the exec() method of the ActionCommand it keeps. ForwardCommand, BackCommand, RightCommand, LeftCommand and StopCommand are the concrete classes implementing the interface; they represent the actions `go forward', `go backward', `turn right', `turn left' and `stop' of the sweeper robot respectively, and each implements the method exec().
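This arrangement is the classic Command pattern. A minimal plain-Java sketch (not the demonstration's code; only ActionCommand, ForwardCommand, StopCommand and exec() come from the text, and here exec() returns a label purely for illustration):

```java
class CommandDemo {
    interface ActionCommand { String exec(); }   // exec() returns a label here for illustration

    static class ForwardCommand implements ActionCommand {
        public String exec() { return "go forward"; }
    }
    static class StopCommand implements ActionCommand {
        public String exec() { return "stop"; }
    }

    // Stand-in for SweeperImpl: it keeps the last command and replays it
    // every unit time, matching "the wheels keep turning until the next
    // command is sent".
    static class SweeperSketch {
        private ActionCommand current = new StopCommand();
        void set(ActionCommand c) { current = c; }
        String tick() { return current.exec(); }  // called once per unit time
    }

    public static void main(String[] args) {
        SweeperSketch s = new SweeperSketch();
        s.set(new ForwardCommand());
        System.out.println(s.tick());  // go forward
        System.out.println(s.tick());  // go forward (persists until the next command)
    }
}
```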
The following layered class diagram shows the relationship with the base module.
A call to move the sweeper robot is added to the method doAction(), which is called every unit time of the simulation.
DrawCanvas additionally manages the instance of SweeperImpl added in this module. The method draw(), which draws the screen, gets a new routine to draw the sweeper robot.
The sensor module is the super-module from which each sensor module is implemented.
Sensor is the superclass of each sensor; it declares the abstract method sense() for the sensor function. Since fulfilling a sensor's function requires references to Room, SweeperImpl and Controller, the methods getRoom(), getSweeper() and getController() are provided for implementing sensors.
SensorSet is a container class for sensors; calling its sense() results in the sense() of each sensor being called.
The class provides a template method includeSensors(). Since the method is called in the constructor of SensorSet, the registration process of each sensor can be described as a difference to the method in that sensor's module. By this means, sensor-specific registration is described in each sensor module and code localization is achieved. To register a sensor, the method add() is used.
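The template-method arrangement can be sketched in plain Java (an illustrative sketch; in MixJuice each sensor module adds a difference to includeSensors() and calls original(), which plain Java can only approximate with a subclass override and super call; the class KeySensorSet is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

class TemplateMethodDemo {
    static abstract class Sensor { abstract void sense(); }

    static class SensorSet {
        private final List<Sensor> sensors = new ArrayList<>();
        SensorSet() { includeSensors(); }  // template method, called from the constructor
        void includeSensors() { /* empty here; each sensor module adds its registration */ }
        final void add(Sensor s) { sensors.add(s); }
        final int size() { return sensors.size(); }
        final void sense() { for (Sensor s : sensors) s.sense(); }
    }

    // Plain-Java stand-in for a sensor module's difference: the override
    // corresponds to the module's addition and super corresponds to original().
    static class KeySensorSet extends SensorSet {
        @Override void includeSensors() {
            super.includeSensors();
            add(new Sensor() { void sense() { /* poll key state */ } });
        }
    }

    public static void main(String[] args) {
        System.out.println(new KeySensorSet().size());  // 1
    }
}
```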
The following layered class diagram shows the relationship with the base module.
A call for sensor detection is added to the method doAction(), which is called every unit time of the simulation.
DrawCanvas additionally manages the instance of SensorSet added in this module. The method draw() gets a new routine to draw the sensor icons.
The module group sensor.* contains the implementation modules of each sensor. Each sensor class is defined/implemented by extending the class Sensor of the sensor module.
In addition, the module group event.* defines the event handler methods of the sensors. These modules extend the control module and add to the Controller class event handler methods corresponding to the events sent by each sensor. Note that the implementations of these methods in the event.* modules are all empty.
As stated in the modules overview, sensor.* modules and event.* modules are paired. In other words, each sensor class in a sensor.* module transmits events to the controller software by calling the methods of the Controller class defined in its paired event.* module.
The module group control.* contains the reference implementations of controller software in the demonstration. These modules extend the event.* modules so that their events are handled in the controller software. The class of the controller software is Controller, which describes the control logic by implementing the event handler methods, the init() method and the control() method.
See the explanation of Description of Behavior for the implementation of each module.
The module mechanism offers several merits for implementing the demonstration. This section explains them, though some were already mentioned above: how the module mechanism of MixJuice is used, and how it contributes to simplifying the implementation and localizing the code.
As explained from the beginning, by employing the module mechanism of MixJuice, the demonstration can choose the following combinations at execution time.
What should be considered here is the ``combination of multiple controller software modules''. In the example combining the wall-avoiding and trash-seeking automatic controllers, the ``wall-avoiding controller software'' module control.avoidWall and the ``trash-seeking controller software'' module control.seekTrash are combined and executed. In actual execution they appear to control the sweeper robot cooperatively, as we wished. Strictly speaking, however, in some cases there are conflicts between the controllers. The following is an example:
When the sweeper robot approaches the block shown in the illustration above, control.avoidWall (based on the event from the wall sensor) and control.seekTrash (based on the event from the trash sensor) each decide to command a turn in the opposite direction. These commands are sent from each module's Controller#control().
module control.avoidWall extends event.wall, control.turnAndForward {
    class Controller {
        void control() {
            original();
            ...
            getSweeper().turnRight();  // This is the command.
            ...
        }
    }
}

module control.seekTrash extends event.trash, control.turnAndForward {
    class Controller {
        void control() {
            original();
            ...
            getSweeper().turnLeft();   // This is the command.
            ...
        }
    }
}
Then, in which direction does the sweeper robot actually move? The answer is upward, i.e. the command from control.seekTrash takes effect. The reason is as follows:
The first reason is merely the specification of this sample and is not problematic. The second reason, on the other hand, can be problematic, since MixJuice prohibits writing code that relies on this behavior, and a user combining the modules might wish that ``in this situation, control.seekTrash should have priority''. In the first place, controller software that sends different commands to Sweeper simultaneously is contradicting itself. To restrict the controller to always sending a single command in such cases, it should be built from scratch as a single controller software handling the events from both the wall sensor and the trash sensor; it should not be built as a combination of multiple controllers each handling one event.
But if a user accepts the ambiguity that ``when commands conflict, only one of them takes effect'', it is admissible to combine multiple controller software modules as in the example above. Although it depends on the processing system, it is possible to combine method implementations to some degree without writing new code. This is one of the merits of MixJuice.
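The last-command-wins behavior can be sketched in plain Java, assuming (as described for the hardware module) that the sweeper implementation keeps a single pending command (an illustrative sketch; the field and method names are assumptions):

```java
class LastCommandWins {
    private static String pending = "stop";  // SweeperImpl-style single pending command

    static void turnRight() { pending = "turn right"; }  // issued by control.avoidWall
    static void turnLeft()  { pending = "turn left"; }   // issued by control.seekTrash

    static String unitTime() {
        // Within one unit time, both controllers' control() methods run in
        // linearization order; each overwrites the single pending command.
        turnRight();  // control.avoidWall issues its command first
        turnLeft();   // control.seekTrash runs later, so its command survives
        return pending;
    }

    public static void main(String[] args) {
        System.out.println(unitTime());  // turn left
    }
}
```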
The hardware module and the sensor module add differences to the implementations of Round and DrawCanvas defined in the base module. The added differences concern the classes defined in the module where the addition takes place; i.e. localization of the implementation is accomplished.
The sensor-specific registration process can be localized to the module where the sensor is implemented. As a concrete example, here is the remote controller sensor module sensor.key.
module sensor.key extends sensor uses event.key {
    define class KeySensor extends Sensor implements KeyListener {
        define KeySensor() { ... }
        void sense() {}
        void keyPressed(KeyEvent e) { ... }
        void keyReleased(KeyEvent e) {}
        void keyTyped(KeyEvent e) {}
    }
    class SensorSet {
        void includeSensors() {
            original();
            KeySensor s = new KeySensor();
            add(s);
            // This is needed only for the remote controller sensor.
            DrawCanvas.instance.addKeyListener(s);
        }
    }
}
Since the remote controller sensor class KeySensor needs to catch key events, it implements java.awt.event.KeyListener. It therefore needs to be registered with an AWT component as a key listener, but this process is specific to the remote controller sensor; no other sensor needs it. Using the module mechanism, special processes like this can be described in the module that needs them, achieving code localization.
The module mechanism also makes the implementation of the motor sensor very concise and localized.
The hardware module defines the interface ActionCommand, which represents the sweeper robot's movements, and its concrete classes. Using the module mechanism, the sensor.motor module adds to these classes and the interface a method notify() that notifies an event. With this addition, the implementation of the method MotorSensor#sense() is completed simply by calling the added notify() method via SweeperImpl; it is very concise. Moreover, all the implementation related to the motor sensor is aggregated in the sensor.motor module.
The demonstration expects users to implement additional controller software. To clean up the room more reliably, new controller software can be implemented. For example:
This kind of controller may be implemented with the motor sensor, direction sensor and wall sensor. The room will certainly be cleaned up, but the sweeper robot repeatedly returns to its starting point and may move around too redundantly.
A robot with this kind of controller may clean up the room more optimally. The implementation is rather difficult, however, because an optimal path-seeking algorithm, an algorithm deciding which blocks are unreachable because they are surrounded by walls, and so on are hard to implement.
The provided implementation is limited to minimal functions, so it does not make a very good game. If, for example, the following were changed, it would be a better game to play.