RoboCup 2009 - RoboCup Rescue Team
Authors
Johannes Pellenz et al.
Abstract
This paper describes the approach of the team resko@UniKoblenz for the RoboCup Rescue competition 2009. Our mobile system Robbie is based on a MobileRobots Pioneer 3 AT. It is equipped with a four-wheel drive and sonar sensors in the front and in the back. On this platform, an aluminum rack is installed where additional sensors are attached: three color cameras (including two high-resolution Sony FireWire cameras), an actively controlled Hokuyo URG-04LX laser range finder (LRF), a thermal camera and an inclination sensor. The robot can be operated in autonomous and in teleoperated mode. The map building is done automatically by merging the collected LRF data with the odometry information using a Hyper Particle Filter (a particle filter of particle filters). The automatic victim detection is based on the thermal and color camera images. Robbie was developed and improved at the University of Koblenz-Landau (Germany) during the last four years as a PhD project and in six practical courses. The robot was used by the team resko@UniKoblenz at the RoboCup German Open 2007, 2008 and 2009 in Hannover and at the RoboCup World Championship 2007 in Atlanta (GA, USA) and 2008 in Suzhou (China). The team achieved the "Best in Class Autonomy Award" in all five competitions.

Introduction

The team resko@UniKoblenz is a group of researchers and students from the University of Koblenz-Landau, Germany. In 2009, the robot Robbie competes in two RoboCup leagues: in the RoboCup Rescue and in the RoboCup@Home league.

1 Team Members and their Contributions

Our team consists of two groups: team resko@UniKoblenz and team homer@UniKoblenz:
– The team resko@UniKoblenz (with focus on the RoboCup Rescue league):
  • Johannes Pellenz: team leader, scientific advisor
  • Kevin Read: technical design, programming
  • Bernhard Reinert: head of project team rescue, programming
  • Christian Fuchs: quality assurance, programming
  • David Beckmann: infrastructure, programming
  • Susanne Maur: scan matching, SLAM
  • Denis Dillenberger: 3D terrain classification
– The team homer@UniKoblenz (with focus on the RoboCup@Home league) supports the team resko@UniKoblenz:
  • David Gossow: team leader
  • Peter Decker: scientific advisor
  • Marc Ahrends: head of project team @home, programming
  • Susanne Thierfelder: public relations, programming
  • Sönke Greve: hardware assistant, programming
  • Christian Winkens: social events, programming
  • Viktor Seib: multi-robot coordination

2 Control method and Human-Robot Interface (HRI)

Our robot navigates through the yellow part of the RoboCup Rescue Arena autonomously. The robot is started by the operator from the operator station; afterwards, the operator can monitor the robot and the sensor data, which is sent to the operator station via WLAN. In case of an emergency, the operator can stop the autonomous mode and teleoperate the robot. If the connection to the operator station is lost, the robot continues its mission and the victim search. As soon as the connection is re-established, the victim verification on the operator station is triggered. The graphical user interface (GUI) of the operator laptop is shown in Fig. 1(b). The human-robot interface is implemented in Qt4, with OpenGL widgets visualizing easy-to-understand 3D views of the sensor data. The operator sees all sensor measurements, the color images and the last thermal image on a single screen. The current laser and sonar range readings are merged into the already learned map in the lower left part of the window. In the upper part of the window, the images of the color cameras and the thermal image are shown.
The operator can mark the position of victims in the map manually. In the autonomous mode, a dialog pops up when the robot has found a victim. This (potential) victim can then be verified or declined; the position is automatically marked in the map. Additionally, the robot has a blue flashing light that indicates the detection of a victim. This shows the audience that the victim search was successful.

3 Map generation/printing

While exploring the environment, the robot automatically generates a map of the building. The map is based on the laser scans and the odometry information. Fig. 3 shows the result of the mapping process.

Fig. 1. The mobile system Robbie and the user interface: (a) our Pioneer 3 AT-based robot Robbie, (b) the user interface of the operator station.

Fig. 2. Thermal camera mounting and image: (a) thermal camera and mirror, (b) ThermoVision Micron/A10, (c) thermal image.

Fig. 3. Map of the Yellow Arena.

The data structure to store the map is a single occupancy grid map. In the context of RoboCup Rescue, the grid usually has a size of 800 × 800 cells, which represents an area of 40 × 40 meters with a grid cell size of 50 × 50 mm. The grid is stored in two planes: one plane counts how often a cell was "seen" by a laser beam; this value is increased if the laser beam measured the cell either as free or as occupied. A second plane stores how often a cell was seen as occupied. By dividing these two planes, the occupancy probability of a cell c_i is calculated as the ratio

    p_occ(c_i) = count_occ(c_i) / count_seen(c_i)    (1)

To solve the SLAM problem, we use a particle filter [IB98] with about 1,000 particles [Pel08]. Each particle represents a hypothesis for the pose of the robot in 2D space: (x, y, Θ). Fig. 4 illustrates the robot pose hypotheses, represented by particles.
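The two-plane counting scheme behind equation (1) can be sketched as follows. This is a minimal illustration, not Robbie's actual code: the class and method names and the 0.5 default for never-observed cells are assumptions.

```python
# Sketch of the two-plane occupancy grid: one plane counts how often a
# cell was observed at all, the other how often it was observed occupied.
# The real grid is 800 x 800 cells with 50 x 50 mm per cell.
from collections import defaultdict

class OccupancyGrid:
    def __init__(self):
        self.count_seen = defaultdict(int)  # beam passed through or ended in cell
        self.count_occ = defaultdict(int)   # beam ended in cell (obstacle hit)

    def observe_free(self, cell):
        self.count_seen[cell] += 1

    def observe_occupied(self, cell):
        self.count_seen[cell] += 1
        self.count_occ[cell] += 1

    def p_occ(self, cell):
        # Equation (1): p_occ(c_i) = count_occ(c_i) / count_seen(c_i)
        seen = self.count_seen[cell]
        return self.count_occ[cell] / seen if seen else 0.5  # 0.5 = unknown

grid = OccupancyGrid()
grid.observe_occupied((10, 12))  # a beam endpoint falls into the cell
grid.observe_free((10, 12))      # a later beam passes through the cell
print(grid.p_occ((10, 12)))      # 0.5 (seen twice, occupied once)
```

Counting "seen" and "occupied" separately keeps the map incremental: each new scan only increments counters, and the probability is recomputed on demand.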
In Fig. 3, the blue line shows a planned path back to an unexplored room in the upper right corner (generated at the RoboCup German Open 2008 during the Mapping Mission).

Fig. 4. Robot poses, represented by particles: (a) particles (in the center of the image), (b) detailed view of the particles, (c) weights of the particles.

The poses are visualized in real time in the user interface. The algorithm of the particle filter includes the following steps: resample, drift, measure, and normalize. The result is the most likely pose of the robot at the time the laser scan was taken. This pose is then used for the map update. NEW in 2009: an ICP (Iterative Closest Point) scan matcher with different metrics improves the odometry-based pose estimate that is the input for the particle filter.

Resample: Depending on the weights of the particles (generated by the last measurement step), the probability distribution is adjusted. The resampling step is only done if the robot has moved a certain distance (at least 20 mm) or turned (at least 5°) since the last mapping step.

Drift: During the drift step, the particles are moved depending on the odometry data and the result of the scan matching. This step also models noise due to sensor inaccuracies and wheel slippage (see [HBFT03]).

Measure: During the measurement step, a new weight for each particle is calculated using the current laser scan. To weight a particular particle, the endpoints of the laser beams are calculated using the robot pose stored in this particle; the assigned weight of the particle is the sum of the occupancy probability values at these endpoints. NEW in 2009: we use a subsampling of the laser data: only laser beam endpoints that are at least 10 cm away from the last considered endpoint are used for the weighting function. This speeds up the measurement function significantly.
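The measurement step with the 10 cm endpoint subsampling can be sketched as below. The scan format (range/bearing pairs), function names, and the `p_occ` lookup callback are assumptions for illustration, not the team's implementation.

```python
import math

def subsample_endpoints(endpoints, min_dist=0.10):
    """Keep only endpoints at least min_dist (here: 10 cm = 0.10 m) away
    from the last endpoint that was kept -- the 2009 speed-up described
    above."""
    kept = []
    for p in endpoints:
        if not kept or math.dist(p, kept[-1]) >= min_dist:
            kept.append(p)
    return kept

def weight_particle(pose, scan, p_occ):
    """Weight one particle: transform each (range, bearing) beam into a
    map endpoint using the particle's pose (x, y, theta), then sum the
    occupancy probabilities at the subsampled endpoints."""
    x, y, theta = pose
    endpoints = [(x + r * math.cos(theta + b), y + r * math.sin(theta + b))
                 for r, b in scan]
    return sum(p_occ(e) for e in subsample_endpoints(endpoints))

# Endpoints closer than 10 cm to the previously kept one are skipped:
print(subsample_endpoints([(0.0, 0.0), (0.05, 0.0), (0.20, 0.0)]))
# [(0.0, 0.0), (0.2, 0.0)]
```

Since neighboring beam endpoints usually fall into the same or adjacent grid cells, skipping near-duplicates loses little weighting information while cutting the per-particle cost substantially.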
Normalize: During the normalization step, the weight of each particle is divided by the sum of all particle weights.

Map update: The average pose of the top 5% of particles with the highest weights is assumed to be the location of the robot. Using this pose, the current laser range scan is added to the global occupancy map by constructing a local map and "stamping" it into the global map, incrementing the counts of seen and occupied cells (cf. equation (1)).

NEW in 2009: We use a Hyper Particle Filter (HPF) [PP09] – a particle filter of particle filters – for solving the SLAM problem. Each particle of the HPF contains a standard localization particle filter (with a map and a set of particles that model the belief of the robot pose in this particular map). To measure the weight of a particle in the HPF, we developed two map quality measures that can be calculated automatically and do not rely on a ground truth map. The first measure determines the contrast of the occupancy map: if the map has a high contrast, it is likely that the pose of the robot was always determined correctly before the map was updated, which finally leads to an overall consistent map. The second measure determines the distribution of the orientations of wall pixels, calculated with the Sobel operator: using the model of a rectangular overall structure, slight but systematic errors in the map can be detected. With these two measures, broken maps can be detected automatically; the corresponding particle is then more likely to be replaced by a particle with a better map within the HPF. Using this technique, broken maps are very unlikely, and the problem of loop closing is solved (see Fig. 6).

The resulting map is displayed in real time on the operator station. After the mission, it can be printed out and handed to the responders. As a step towards standardization, we can also export our map as a GeoTIFF file, which can, e.g., be copied onto a USB stick.
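The idea of the contrast-based map quality measure can be illustrated with a small sketch. The scoring formula below is a simplified stand-in, not the published definition from [PP09]: it exploits the same intuition that a consistent map has cells confidently free (probability near 0) or occupied (near 1), while a broken map has many ambiguous cells near 0.5.

```python
def map_contrast(probs):
    """Illustrative contrast score in [0, 1] for the occupancy
    probabilities of the observed cells of a map: the mean distance from
    the ambiguous value 0.5, rescaled so that a perfectly crisp map
    (all cells 0.0 or 1.0) scores 1.0."""
    if not probs:
        return 0.0
    return 2.0 * sum(abs(p - 0.5) for p in probs) / len(probs)

crisp = map_contrast([0.0, 1.0, 0.05, 0.95])   # confident free/occupied cells
blurry = map_contrast([0.40, 0.50, 0.60, 0.55])  # ambiguous, likely broken map
print(crisp > blurry)  # True
```

In an HPF, such a score can serve directly as the hyper-particle weight: resampling then preferentially keeps hypotheses whose maps stayed crisp, which is how broken maps get squeezed out.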