RoboCupRescue - Robot League Team
CASualty (Australia)

Mohammed Waleed Kadous(1), Sarath Kodagoda(2), Jonathan Paxman(3), Claude Sammut(5), Raymond Sheh(6), Jaime Valls Miro(7), John Zaitseff(8)

(2,3,7) ARC Centre of Excellence in Autonomous Systems
Faculty of Engineering, University of Technology, Sydney
Sydney NSW 2007
s.kodagoda@cas.edu.au, jpaxman@cas.edu.au, javalls@cas.edu.au

(1,4,5,8) ARC Centre of Excellence in Autonomous Systems
School of Computer Science and Engineering
University of New South Wales
Sydney NSW 2052
waleed@cse.unsw.edu.au, malcolmr@cse.unsw.edu.au, claude@cse.unsw.edu.au, zaitseff@cse.unsw.edu.au

(6) School of Computer Science and Engineering and National ICT Australia
University of New South Wales
Sydney NSW 2052
rsheh@cse.unsw.edu.au
Abstract. This document describes the “CASualty” entry in the 2006 RoboCup Rescue competition. The entry consists of two primary vehicles: the tracked articulated rescue vehicle CASTER and the wheeled differential-drive robot HOMER. The vehicles are equipped with a range of state-of-the-art sensors for mapping, localization and victim identification, and are capable of autonomously navigating and mapping unknown environments, detecting and locating human victims, and identifying their states.
Introduction
CASualty is a team representing the ARC Centre of Excellence in Autonomous Systems (CAS), which is a collaboration between the Australian Centre for Field Robotics at the University of Sydney, the Artificial Intelligence Research Group in the
School of Computer Science and Engineering at the University of New South Wales
and the Mechatronics and Intelligent Systems Group at the University of Technology,
Sydney.
The RoboCup Rescue team aims to bring together several strands of Autonomous
Systems research from within CAS and NICTA in a single highly specialized application.
The robot team consists of two primary vehicles: CASTER, a Yujin RobHaz DT-3 articulated tracked vehicle (see figure 1), and the custom-built HOMER (High-speed Obstacle Mapping and Exploration Robot; see figure 2). CASTER will be tele-operated and is primarily intended for the more difficult Orange and Red arenas, while HOMER will be run autonomously in the Yellow arena.
Although tele-operated, CASTER will use autonomous modules for localization and mapping, as well as for victim searching, as aids to the operator. It is fitted with a number of sensors for 2D and 3D localization and mapping, including Hokuyo URG laser rangefinders and a CSEM Swissranger 3D range sensor mounted on a pan-tilt unit. Victim identification is further aided by numerous cameras, a thermal camera and a stereo microphone.
HOMER will operate primarily in autonomous mode. Localization and mapping will
be carried out via a Hokuyo URG laser range finder. Victim identification will be
carried out using vision-based techniques, aided by an IR camera, and an audio microphone.
Fig. 1. CASTER: a RobHaz DT-3 tracked vehicle, with remote control station.
Fig. 2. The autonomous High-speed Obstacle Mapping and Exploration Robot (an earlier version using stereo vision instead of a laser).
1. Team Members and Their Contributions
The following are the principal CASualty team members. Asterisks (*) denote team
members who will be part of the on-site team in Bremen.
Team Leaders: Mohammed Waleed Kadous*, Jonathan Paxman*
Technical design and support: John Zaitseff, Weizhen Zhou
Algorithms: Sarath Kodagoda*, Raymond Sheh*, Jaime Valls Miro*, Tarek Taha, Oliver Thane, Hue Tuan Thi
Team Mentors: Prof Claude Sammut*, Prof Gamini Dissanayake
In addition, we would like to thank the following people for advice at various stages
of the project: Maurice Pagnucco, Charles Willock, Xianhang Zhang, Michael Trieu,
Mario Cheng, Dinesh Gurram, Matthew Rozyn.
2. Operator Station Set-up and Break-Down (10 minutes)
The team of robots will be centrally operated via a wireless-enabled laptop. The operator station can be fully set up within ten minutes.
3. Communications
The robots are currently configured for use with 802.11a wireless networking.
4. Control Method and Human-Robot Interface
The CASualty plan for the 2006 RoboCup Rescue competition involves a dual approach of tele-operation (with partially autonomous modules) and full autonomy. We
plan to use the tele-operated CASTER and RedBack robots in the Orange and Red elements, and the fully autonomous HOMER in the Yellow elements and the autonomous area.
An integrated interface application has been developed, which allows a single operator to maintain control over a number of autonomous and tele-operated robots. The
application also integrates mapping information from the vehicles and generates a
single global map.
5. Map generation/printing
Representation of 2D maps:
It is proposed to represent the 2D map using an occupancy grid (OG). Victims and landmarks are located within the map by associating them with the range data. Inertial measurement aids mapping by correcting for tilt (when 3D scans are available), or by increasing the uncertainty of 2D range data produced from non-horizontal poses.
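As a sketch of the occupancy-grid idea, the update below fuses one 2D scan into a log-odds grid. The cell size and log-odds increments are illustrative assumptions, not the values used on the robots.

```python
import numpy as np

# Minimal log-odds occupancy grid update (illustrative parameters, not the
# team's actual values). Each beam raises the log-odds of its endpoint cell
# and lowers the cells it passes through.

RES = 0.1                    # metres per cell (assumed)
L_HIT, L_MISS = 0.9, -0.4    # log-odds increments (assumed)

def cell(w):
    """World coordinate (metres) to grid index."""
    return int(round(w / RES))

def update_grid(grid, pose, ranges, angles):
    """Fuse one 2D scan taken at pose (x, y, theta) into the grid in place."""
    x, y, th = pose
    for r, a in zip(ranges, angles):
        c, s = np.cos(th + a), np.sin(th + a)
        # Cells traversed by the beam are observed free.
        for step in np.arange(0.0, r, RES):
            grid[cell(x + step * c), cell(y + step * s)] += L_MISS
        # The endpoint cell is observed occupied.
        grid[cell(x + r * c), cell(y + r * s)] += L_HIT
    return grid

grid = update_grid(np.zeros((100, 100)), (2.0, 2.0, 0.0), [1.0], [0.0])
prob = 1.0 - 1.0 / (1.0 + np.exp(grid))   # log-odds back to probability
```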
Scan matching:
The RoboCup Rescue arena is a generally static environment. In this type of situation, laser-based scan matching [3] can provide good-quality localization information, which can be used to compensate for errors in the robot odometry. Matched scans can provide the basis for an accurate occupancy grid map.
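The scan-matching step can be illustrated with a point-to-point ICP loop. This is a generic stand-in for the method of [3]; the iteration count and the synthetic data are assumptions for illustration only.

```python
import numpy as np

# Illustrative 2D point-to-point ICP: align scan `cur` onto scan `ref` by
# alternating nearest-neighbour matching with a closed-form (Kabsch/SVD)
# rigid alignment. A generic stand-in for the scan matcher of [3].

def icp_2d(ref, cur, iters=20):
    """ref, cur: Nx2 point sets. Returns (R, t) with cur @ R.T + t ~ ref."""
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        moved = cur @ R.T + t
        # Brute-force nearest neighbours in the reference scan.
        d = np.linalg.norm(moved[:, None, :] - ref[None, :, :], axis=2)
        matched = ref[d.argmin(axis=1)]
        # Closed-form rigid alignment of moved -> matched.
        mc, mm = moved.mean(0), matched.mean(0)
        U, _, Vt = np.linalg.svd((moved - mc).T @ (matched - mm))
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:          # guard against reflections
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        R, t = dR @ R, dR @ (t - mc) + mm  # compose the incremental update
    return R, t

# Usage: recover a small synthetic rotation and shift between two "scans".
ref = np.random.RandomState(0).rand(50, 2)
ang = 0.1
Rtrue = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
cur = (ref - 0.05) @ Rtrue
R, t = icp_2d(ref, cur)
```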
Integration:
The mapping component of the operator interface performs statistical optimisations in
order to align map elements from each robot.
6. Sensors for Navigation and Localization
CSEM Swissranger: a 2.5D/3D time-of-flight LADAR camera [3] mounted on a
pan-tilt unit onboard CASTER. This camera provides a 160x124 depth image at
30fps, to a distance of 7.5 metres and a resolution of 5mm. It also provides a near
infrared reflectance image from the same sensor. We intend to take advantage of the
high frame rate afforded by this camera in order to rapidly generate dense 3D maps
and to provide the primary means of robot localisation and 3D map building. We do
not plan to use odometry from the tracked robots except to determine if the robot is
likely to be stationary.
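As an example of how such a depth image becomes a fragment of a 3D map, the sketch below back-projects a 160x124 depth frame through a pinhole model. The focal length and principal point are placeholders, not the camera's calibrated intrinsics.

```python
import numpy as np

# Sketch: turning a Swissranger-style 160x124 depth image into 3D points
# with a pinhole model. The intrinsics below are assumed placeholders.

W, H = 160, 124
FX = FY = 150.0        # assumed focal length, pixels
CX, CY = W / 2, H / 2  # assumed principal point

def depth_to_cloud(depth):
    """depth: HxW array of distances along the optical axis, metres."""
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    x = (u - CX) / FX * depth
    y = (v - CY) / FY * depth
    return np.stack([x, y, depth], axis=-1)   # HxWx3 points, camera frame

cloud = depth_to_cloud(np.full((H, W), 2.0))  # a flat wall 2 m away
```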
Hokuyo URG Laser Rangefinder: The laser rangefinder returns accurate distance measurements over a 240° horizontal field of view at an angular resolution of up to 0.35°. The
maximum scan rate is 10Hz. A rotated URG may also be used to generate 3D scan
data.
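A scan with these specifications converts to Cartesian points in the robot frame in the obvious way; the sketch below uses the stated 240° field of view and 0.35° step, with synthetic range data.

```python
import numpy as np

# Sketch: converting a URG-style polar scan into Cartesian (x, y) points.
# Field of view and step size are from the stated specs; the ranges are
# synthetic.

FOV = np.deg2rad(240.0)
STEP = np.deg2rad(0.35)
angles = np.arange(-FOV / 2, FOV / 2 + 1e-9, STEP)

def scan_to_points(ranges):
    """ranges: one distance per beam, metres. Returns Nx2 points."""
    return np.column_stack([ranges * np.cos(angles),
                            ranges * np.sin(angles)])

pts = scan_to_points(np.ones_like(angles))   # a unit-radius arc
```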
7. Sensors for Victim Identification
ThermoVision Micron IR thermal camera: This lightweight, compact 160x128
pixel thermal infrared (7.5 – 13.5 micron) camera is a key component of the victim
identification system. When calibrated with respect to the Videre stereo camera, it is
possible to localize heat sources very precisely within the Rescue arenas. Thermal
imaging is expected to be one of the primary methods of victim detection. In autonomous mode, identified heat sources are filtered for temperature and size before being
labelled as potential victims. In tele-operated mode, the thermal image is relayed to
the operator and may be viewed in conjunction with the optical image. Potential victims (as identified by the autonomous system) are flagged to the operator. Thermal
imaging has the advantage of enabling victim identification in dark areas of the arena
where the optical image is very poor, and of locating partially or completely occluded
victims.
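The temperature-and-size filtering described above can be sketched as follows, on a synthetic 160x128 frame. The temperature band and blob-size bounds are illustrative assumptions, not the team's calibrated values.

```python
import numpy as np

# Sketch of filtering heat sources by temperature and size. Thresholds are
# assumed for illustration, not calibrated values.

T_LO, T_HI = 30.0, 40.0       # plausible skin temperatures, deg C (assumed)
MIN_PIX, MAX_PIX = 20, 2000   # plausible victim blob sizes, pixels (assumed)

def candidate_victims(frame):
    """Return (row, col) centroids of hot blobs in the accepted size range."""
    mask = (frame >= T_LO) & (frame <= T_HI)
    seen = np.zeros_like(mask)
    H, W = frame.shape
    hits = []
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        # Flood-fill one 4-connected component of hot pixels.
        stack, blob = [(sy, sx)], []
        seen[sy, sx] = True
        while stack:
            y, x = stack.pop()
            blob.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        if MIN_PIX <= len(blob) <= MAX_PIX:
            ys, xs = zip(*blob)
            hits.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return hits

frame = np.full((128, 160), 22.0)   # ambient background
frame[40:50, 60:70] = 36.0          # warm, hand-sized blob: kept
frame[5, 5] = 36.0                  # single hot pixel: rejected as too small
hits = candidate_victims(frame)
```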
Videre cameras: The optical cameras are an important component of the victim identification system. In autonomous mode, the optical image is processed by a skin detection algorithm (which can detect a wide range of skin types in a range of lighting
conditions), and a shape detection algorithm that can detect some instances of hands,
feet and heads within the optical image. In tele-operated mode, the optical image is
fed back to the operator (who may switch between the left and right camera view).
The operator may choose to run the autonomous victim identification algorithms as
an aid to victim identification.
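A minimal sketch of the skin-detection idea, using a simple chromaticity threshold; the bounds here are assumptions, and the actual classifier handles a far wider range of skin tones and lighting conditions than this.

```python
import numpy as np

# Toy skin detector in normalised-RGB (chromaticity) space, which discards
# overall brightness. The numeric bounds are illustrative assumptions only.

def skin_mask(rgb):
    """rgb: HxWx3 float image in [0, 1]. Returns a boolean mask of likely skin."""
    s = rgb.sum(axis=2) + 1e-6
    r, g = rgb[..., 0] / s, rgb[..., 1] / s   # brightness-invariant chromaticity
    return (r > 0.36) & (r < 0.47) & (g > 0.28) & (g < 0.36)

img = np.zeros((4, 4, 3))
img[0, 0] = (0.8, 0.55, 0.45)   # skin-like pixel
img[1, 1] = (0.1, 0.8, 0.1)     # green: not skin
mask = skin_mask(img)
```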
Microphone: A stereo microphone mounted with the cameras will be used to assist in
the identification of the state of victims. In tele-operated mode, the sound from the
microphones will be fed back to stereo headphones worn by the operator. In autonomous mode, very simple processing (thresholding and filtering) will be used to estimate whether a sound of the appropriate frequency range is collocated with the victim.
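The "thresholding and filtering" check can be sketched as a band-energy test on a window of samples; the sample rate, band edges and threshold here are assumptions for illustration.

```python
import numpy as np

# Sketch: does a window of microphone samples carry enough of its spectral
# energy in a speech-like band? All constants are illustrative assumptions.

FS = 8000                  # samples per second (assumed)
BAND = (300.0, 3000.0)     # rough voice band, Hz (assumed)
THRESH = 0.5               # required fraction of total energy (assumed)

def voice_like(samples):
    spec = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return spec[in_band].sum() > THRESH * spec.sum()

t = np.arange(FS) / FS                 # one second of samples
tone = np.sin(2 * np.pi * 440 * t)     # 440 Hz: inside the band
rumble = np.sin(2 * np.pi * 50 * t)    # 50 Hz: machinery-like, out of band
```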
8. Robot Locomotion
CASTER: uses a tank-like arrangement of rubber tracks that are driven differentially for steering. The tracks are ribbed to improve traction over rough terrain. However,
unlike most tanks, the DT-3 consists of two articulated sections. The rear section is
similar to a conventional tank. The front section consists of a triangular track path.
This allows the DT-3 to traverse tall obstacles such as stairs. The locomotion of the
DT-3 will be controlled via direct manual control (tele-operation).
HOMER: is a differentially driven wheeled robot. The front wheels are driven by
Maxon RE-35 motors through GP-32C gearboxes. The robot is capable of speeds of up to 2 m/s, but will be limited to very low-speed operation for RoboCup. The
rear omni-directional wheels result in relatively stable, predictable behaviour (compared to caster wheels and other options). HOMER is not equipped to handle rough
terrain, and will not be suitable for the orange and red arenas.
9. Other Mechanisms
Secondary robot:
The team is considering the possibility of deploying a secondary robot, dubbed
"RedBack", based on the MGA "Tarantula" RC vehicle. This vehicle would be used
in conjunction with CASTER in the orange and red arenas. As a small, lightweight,
low-cost, highly mobile robot, it may be used to explore unstructured areas inaccessible to CASTER due to dangerous drops or other physical constraints. Situational
awareness is also improved by the third-person view that a secondary robot affords. Featuring two pairs of driven, tracked flippers (see figures), this fully equipped robot can climb stairs with slopes in excess of 45° and obstacles up to 40cm in height and 20cm in width, self-right, run inverted and rise to provide a ground
clearance of 20cm. It is also ruggedised to withstand rolls and falls and is extremely
lightweight at around 6kg. Equipped with a small PC, wireless LAN and a colour
camera, this robot may be tele-operated and has the potential for semi and fully
autonomous operation. Planned additions for situational awareness, mapping and
victim identification include additional cameras, rangefinders, microphones and accelerometers.
10. Team Training for Operation (Human Factors)
When operating in autonomous mode, HOMER does not require human intervention.
Nevertheless, the operator should be trained in the tele-operation of HOMER so that
intervention can take place if required. Operator training for tele-operation of
HOMER or CASTER requires roughly a day of familiarisation with the controls. In
addition, it is planned that the operator will be practising in a rescue arena with earlier
versions of the tele-operation software. Training includes instruction on the mechanisms of the various sensor and actuator modules, and the relative value of the data
provided by each module.
11. Possibility for Practical Application to Real Disaster Site
Two RobHaz DT-3 robots have been deployed as a military tool in Iraq. It is expected
that the DT-3 may soon see use in real disaster situations, as a tele-operated vehicle.
HOMER is not designed for a real disaster scenario, as mechanically it is only suitable for flat terrain. The sensing and intelligence technology on HOMER however is
expected to be fully transportable to more robust platforms capable of performing in
real disaster scenarios.
12. System Cost
All items listed in US dollars (conversion rate used 1AUD=0.77USD).
TOTAL SYSTEM COST: US$96,000
CASTER

KEY PART NAME: RobHaz DT-3
MANUFACTURER: Yujin Robotics
COST: $60,000
WEBSITE: http://www.tribotix.com/Products/Yujin/RobHaz/RobHaz.htm
DESCRIPTION/TIPS: DT-3 remote control robot, including remote control station.

KEY PART NAME: Hokuyo URG Laser
PART NUMBER: URG
MANUFACTURER: Hokuyo
COST: $1500
KEY PART NAME: Swissranger
MANUFACTURER: CSEM
COST:
$7,700
WEBSITE:
http://www.csem.ch/detailed/p_531_3d_cam.htm
DESCRIPTION/TIPS: Swissranger is a pulsed infrared time of flight sensor. It
provides a 160x124 depth image which may be used to map the 3D environment. As
an active sensor, it is not dependent on external lighting conditions.
KEY PART NAME: ThermoVision Micron IR Camera
MANUFACTURER: FLIR Systems
COST:
$13,000
WEBSITE:
http://www.indigosystems.com/product/micron.html
DESCRIPTION/TIPS: Excellent detection of heat sources.
HOMER:
HOMER is a custom-built robot, constructed at the University of Technology, Sydney. Prices for some components are listed below; the value of in-house fabrication and technical work is difficult to estimate.
KEY PART NAME: Motors and Gearboxes
PART NUMBER: RE-35, GP-32C
MANUFACTURER: Maxon
COST: $450x2

KEY PART NAME: Libretto Laptop
PART NUMBER: U100
MANUFACTURER: Toshiba
COST: $3000

KEY PART NAME: Hokuyo URG Laser
PART NUMBER: URG
MANUFACTURER: Hokuyo
COST: $1500
Other items: 2x Hewlett-Packard HEDS-5540 two-channel quadrature encoders, 2x Magnevation 3A DC servo motor H-bridge board (R121-MTR-DRV-KIT from Acroname) [$65], 2x SensComp 600 Series SmartSensor sonar [2x $55], 6x Nickel Metal Hydride (NiMH) 3000mAh 7.2V 350g batteries [6x $30].
References
[1] D. G. Lowe, Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision, 2004.
[2] M. W. M. G. Dissanayake, P. Newman, S. Clark, H. F. Durrant-Whyte and M. Csorba, A solution to the simultaneous localization and map building (SLAM) problem, IEEE Transactions on Robotics and Automation, Vol. 17, No. 3, June 2001.
[3] F. Lu and E. Milios, Globally consistent range scan alignment for environment mapping, Autonomous Robots, Vol. 4, pp. 333-349, 1997.
[4] P. J. Besl and N. D. McKay, A method for registration of 3-D shapes, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 14, No. 2, February 1992.
[5] H. Surmann, A. Nüchter, K. Lingemann and J. Hertzberg, 6D SLAM - preliminary report on closing the loop in six dimensions, Proceedings of the 5th Symposium on Intelligent Autonomous Vehicles, Lisbon, Portugal, 2004.
[6] R. A. Russell and J. A. Wijaya, Object location and recognition using whisker sensors, Australian Conference on Robotics and Automation, CD-ROM Proceedings, ISBN 0-9587583-5-2, 2003.