Diploma Thesis DIP-1700

Bibliography: Graf, Birgit: Robot Soccer.
University of Stuttgart, Faculty of Computer Science, Diploma Thesis No. 1700 (1999).
104 pages, English.
CR-Schema: G.1.1 (Numerical Analysis Interpolation)
I.2.9 (Robotics)
I.2.10 (Vision and Scene Understanding)
I.2.11 (Distributed Artificial Intelligence)
I.4.8 (Image Processing and Computer Vision Scene Analysis)
I.5.4 (Pattern Recognition Applications)
Keywords: Robotics; Soccer; mobile robots; image processing; EyeBot
Abstract

This thesis describes the hardware and software developments necessary to prepare a group of mobile robots called EyeBot vehicles [Bräunl98/2] to participate in RoboCup - the robot soccer World Cup.

The vehicles used were designed around the EyeBot platform - a system developed especially for use with small mobile robot systems. All processing is done on a Motorola M68332 microcontroller located directly on the platform; no processing is done off-board.

The CIIPS Glory robot soccer team consists of five players, one of which is the goalkeeper. The four field players have different roles linked to specific areas of competence on the soccer field.

Each robot is equipped with several infrared proximity sensors to measure distances to obstacles. In addition, a virtual bumper system is used to detect collisions. A 24-bit colour camera is attached directly to the EyeBot platform and is the main sensor used to navigate the robot. Since all the objects on the soccer field at RoboCup, e.g. the ball, the walls and the goals, have different colours, the overall approach in my program is to detect objects by analysing colours and to react accordingly.
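As an illustration of this colour-based approach (not the actual thesis code), the following C sketch locates a single orange blob such as the ball in a small RGB frame by thresholding pixel colours and computing the centroid of the matching pixels. The image size, the threshold values and the helper names is_ball_colour and find_ball are assumptions made for this example.

/*
 * Minimal sketch of colour-based ball detection, assuming an RGB image
 * buffer of WIDTH x HEIGHT pixels. Threshold values and helper names
 * are hypothetical and not taken from the thesis.
 */
#define WIDTH  80
#define HEIGHT 60

typedef struct { unsigned char r, g, b; } Pixel;

/* Crude test for "orange": strong red, moderate green, little blue. */
static int is_ball_colour(Pixel p)
{
    return p.r > 150 && p.g > 60 && p.g < 140 && p.b < 80;
}

/* Returns 1 and writes the centroid of all ball-coloured pixels,
 * or 0 if too few matching pixels were found. */
static int find_ball(const Pixel img[HEIGHT][WIDTH], int *cx, int *cy)
{
    long sx = 0, sy = 0, n = 0;

    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            if (is_ball_colour(img[y][x])) {
                sx += x;
                sy += y;
                n++;
            }

    if (n < 5)          /* reject isolated noise pixels */
        return 0;

    *cx = (int)(sx / n);
    *cy = (int)(sy / n);
    return 1;
}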

Several algorithms for ball detection and path planning have been developed and tested. Due to the relatively low processing speed, my approach was biased towards simple but effective methods. Analysing its environment quickly while still obtaining precise results is the most important task for each player. The robots must always act fast and correctly in view of the continuous changes in their environment.

Specific colours for the ball (orange) and both goals (blue, yellow) are taught to the robot before a game starts. In addition, a specific starting position is assigned to each player.
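One simple way such a colour could be taught (sketched here under the same assumptions as above, reusing the Pixel type and image dimensions of the previous example) is to average the pixels of a small sample window known to show the object and store the result as a reference colour; the function name teach_colour is hypothetical.

/* Hypothetical colour-teaching step: average the pixels of a small
 * sample window known to show the object (e.g. the ball placed in
 * front of the camera) and store the result as a reference colour.
 * Reuses the Pixel type and image dimensions of the previous sketch. */
static Pixel teach_colour(const Pixel img[HEIGHT][WIDTH],
                          int x0, int y0, int w, int h)
{
    long r = 0, g = 0, b = 0, n = 0;

    for (int y = y0; y < y0 + h; y++)
        for (int x = x0; x < x0 + w; x++) {
            r += img[y][x].r;
            g += img[y][x].g;
            b += img[y][x].b;
            n++;
        }

    Pixel ref = { (unsigned char)(r / n),
                  (unsigned char)(g / n),
                  (unsigned char)(b / n) };
    return ref;
}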

After a game has started, images are continuously read from the colour camera and analysed. If the ball is detected, its global coordinates are calculated from its position in the image. In additional threads, the updated position of the robot as well as the infrared sensor readings are acquired.
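The conversion from an image position to global coordinates can be illustrated with simple camera geometry. The following sketch assumes a camera mounted at a known height and downward tilt, a small-angle pinhole model and a robot pose (x, y, heading) maintained by odometry; all constants and names are illustrative assumptions, not values from the thesis.

#include <math.h>

/* Hypothetical camera and image parameters (illustrative values only). */
#define CAM_HEIGHT   0.15   /* camera height above the floor in metres   */
#define CAM_TILT     0.50   /* downward tilt of the optical axis in rad  */
#define V_FOV        0.60   /* vertical field of view in rad             */
#define H_FOV        0.80   /* horizontal field of view in rad           */
#define IMG_W        80
#define IMG_H        60

typedef struct { double x, y, phi; } Pose;   /* robot pose on the field */

/* Project the ball's pixel coordinates onto the floor plane and
 * transform the result into global field coordinates. */
static void ball_world_pos(Pose robot, int px, int py,
                           double *bx, double *by)
{
    /* Angles of the pixel relative to the optical axis. */
    double a_vert = CAM_TILT + ((double)py / IMG_H - 0.5) * V_FOV;
    double a_horz = ((double)px / IMG_W - 0.5) * H_FOV;

    /* Distance along the floor from the camera to the ball. */
    double dist = CAM_HEIGHT / tan(a_vert);

    /* Ball position in the robot's local frame, then in the field frame. */
    double lx = dist * cos(a_horz);
    double ly = -dist * sin(a_horz);

    *bx = robot.x + lx * cos(robot.phi) - ly * sin(robot.phi);
    *by = robot.y + lx * sin(robot.phi) + ly * cos(robot.phi);
}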

A behavioural system has been developed to allow each robot to react to the sensor readings in an appropriate way. Depending on the positions of the ball and of the other robots on the field, a player selects a specific driving operation and executes it. If possible, it drives directly to the ball and kicks it towards the opponents' half. If it is facing the wrong direction, it must calculate a path to get behind the ball in order not to shoot towards its own goal. All driving operations include obstacle avoidance: if an obstacle is detected in a robot's way, the robot does not proceed but backs up in order to get around the obstacle on a different path.
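One way to picture this behavioural selection is as a simple priority-ordered decision function; the sketch below is only a schematic illustration with placeholder inputs and actions, not the control code of the CIIPS Glory team.

/* Hypothetical reactive behaviour selection for a field player.
 * The inputs and the action names are assumed placeholders,
 * not the EyeBot/RoBIOS API or the thesis implementation. */
typedef enum { BACK_UP, GO_BEHIND_BALL, DRIVE_AND_KICK, SEARCH_BALL } Action;

static Action select_action(int obstacle_ahead,
                            int ball_visible,
                            int facing_own_goal)
{
    if (obstacle_ahead)
        return BACK_UP;            /* avoid collision, try another path    */
    if (!ball_visible)
        return SEARCH_BALL;        /* turn on the spot until ball is found */
    if (facing_own_goal)
        return GO_BEHIND_BALL;     /* drive around to shoot the right way  */
    return DRIVE_AND_KICK;         /* drive to the ball and kick it        */
}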

A first test in a real contest situation was the RoboCup competition in November 1998 in Singapore. Comparing our performance to that of other teams, as well as analysing other teams' algorithms, gave me knowledge about different playing techniques and team strategies. Some of them might be applied to improve my team's performance in future competitions.

Full text and other links: PostScript (29978803 Bytes)
Access to students' publications restricted to the faculty due to current privacy regulations
Department(s): University of Stuttgart, Institute of Parallel and Distributed High-Performance Systems, Image Understanding
Entry date: March 31, 1999