Year

2012

Degree Name

Doctor of Philosophy

Department

School of Computer Science and Software Engineering

Abstract

In this research, we proposed to tackle mobile robot navigation in a realistic environment by modelling it on the case of a tourist in an unfamiliar village. When lost, tourists use a variety of strategies to reacquire their path. Initially, we aimed to emulate these strategies to navigate a mobile robot. We sought to develop an intelligent controller able to cope with imprecise inputs. This controller was to use fuzzy logic to make decisions based on data stored in a fuzzy map, represented as sets of rules that it could use to localise and navigate towards a target.
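
As an illustration of the kind of rule evaluation such a controller might perform, the sketch below evaluates one toy fuzzy-map rule against imprecise range readings. It is a minimal sketch under assumed inputs: the membership function, the rule and its thresholds are hypothetical and are not taken from the thesis.

# Illustrative sketch only: evaluating a single toy fuzzy-map rule against
# imprecise range readings. Membership shape, rule and thresholds are hypothetical.

def near(distance_m, full_at=0.5, zero_at=2.0):
    """Membership for 'near': 1 when closer than full_at, 0 beyond zero_at."""
    if distance_m <= full_at:
        return 1.0
    if distance_m >= zero_at:
        return 0.0
    return (zero_at - distance_m) / (zero_at - full_at)

def rule_strength(antecedents):
    """Combine antecedent memberships with the usual fuzzy AND (minimum)."""
    return min(antecedents)

# Hypothetical rule: IF wall_left is near AND doorway_ahead is near
#                    THEN location is corridor_junction.
readings = {"wall_left": 0.8, "doorway_ahead": 1.4}   # metres, imprecise
belief = rule_strength([near(readings["wall_left"]),
                        near(readings["doorway_ahead"])])
print(f"belief(location = corridor_junction) = {belief:.2f}")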

An intelligent agent using the proposed controller needs to perceive its environment with sufficient resolution to detect its features and, by referring to the fuzzy map, ascertain its location and plan its next move along the path to its destination. We published this theoretical approach (Antoun and McKerrow, 2006 and 2007). Core to our approach was sensing for navigation. We theorised that the key to successful autonomous navigation is good sensing, which involves reliable feature extraction from sensor data.

We observed that, when deprived of sight, humans adapt and use other senses for navigation. Most rely on touch (a long cane), but some use auditory perception. We learnt of a blind teenager who echolocates using clicks made with his mouth. Other vision-impaired people use an ultrasonic sensor as a navigational aid to explore their path and environment. They interpret the echoes they perceive to form an auditory scene in which clear paths and obstacles are identified. With this information, they explore the space ahead and thread their way safely through it.

To verify this, we sought to mimic a blind person using a sonar navigational aid to traverse a long corridor. We used a commercially available ultrasonic mobility aid to ensonify a dynamic, uncontrolled workspace and capture echoes from it. We then attempted to correlate components of those echoes with geometric features in the workspace. Our aim was to develop a perception system capable of interpreting the echoes in real time to discern these geometric features, so that this data could then be used to navigate a robot along a corridor.

Our investigation of how blind people use ultrasonic aids revealed that they use directed sensing. This posed the problem of sensing in specific directions in synchronisation with robot motion while avoiding collisions with objects in other directions. We rebuilt a mobile robot with the goal of mimicking a blind person navigating with echolocation and directed sensing. To achieve this synchronisation, we experimented with state-machine-based software.
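
The sketch below shows, under assumed states and events that are not drawn from the thesis, how a simple transition table can keep directed sensing in step with motion while still reacting to obstacles.

# Minimal transition table keeping directed sensing in step with motion.
# The states, events and their wiring are illustrative assumptions.

TRANSITIONS = {
    ("DRIVING", "waypoint_reached"): "AIMING_SENSOR",
    ("DRIVING", "obstacle_detected"): "STOPPED",
    ("AIMING_SENSOR", "sensor_on_target"): "SENSING",
    ("SENSING", "echo_processed"): "DRIVING",
    ("STOPPED", "path_clear"): "DRIVING",
}

def step(state, event):
    """Return the next state; ignore events a state does not handle."""
    return TRANSITIONS.get((state, event), state)

state = "DRIVING"
for event in ("waypoint_reached", "sensor_on_target", "echo_processed",
              "obstacle_detected", "path_clear"):
    state = step(state, event)
    print(f"{event} -> {state}")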

Based on our understanding of ultrasonic echo characteristics, we set out to develop an advanced wall-following algorithm, where the robot follows a wall using a single, directed, continuous-transmission frequency-modulated (CTFM) ultrasonic sensor. The sensor is mechanically panned to track the wall and check for navigable space. We present results from our wall-following experiments; the experimental work described demonstrates the effectiveness of the algorithms we developed.
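
As a hedged sketch of this idea, the code below converts a CTFM beat frequency to range using the standard linear-sweep relation R = c·f_b·T/(2·B) and applies a toy proportional correction towards a desired stand-off from the wall. The sensor parameters, gain, pan geometry and function names are illustrative assumptions, not the thesis algorithm.

# Sketch of a wall-following step with a single panned CTFM sensor.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ctfm_range(beat_hz, sweep_bandwidth_hz, sweep_period_s):
    """Range from beat frequency for a linear CTFM sweep: R = c * f_b * T / (2 * B)."""
    return SPEED_OF_SOUND * beat_hz * sweep_period_s / (2.0 * sweep_bandwidth_hz)

def wall_follow_correction(beat_hz, pan_from_wall_normal_rad,
                           desired_standoff_m=0.6, gain=1.2):
    """Proportional steering correction from the perpendicular wall distance."""
    slant = ctfm_range(beat_hz, sweep_bandwidth_hz=50000.0, sweep_period_s=0.1)
    perpendicular = slant * math.cos(pan_from_wall_normal_rad)  # project onto wall normal
    # Positive when the robot sits further from the wall than desired.
    return gain * (perpendicular - desired_standoff_m)

print(wall_follow_correction(beat_hz=2000.0,
                             pan_from_wall_normal_rad=math.radians(30)))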

Another research question of interest is whether blind people track the wall of the corridor, the free space, or a combination of the two. To study this question, we set up a wall-tracking experiment to collect echo data as a mono-aural sensor was moved parallel to a wall. This involved an investigation of the components of the echo and the geometry that produced them. We developed feature extractors to detect multiple objects from echoes. The results indicate that more useful information is contained in echo components than was previously thought. They also substantiate our hypothesis by demonstrating that accurate wall-following with a mono-aural ultrasonic sensor is possible. The goal of this thesis changed to: “how to extract and track multiple components of echoes from a continuous landmark as an ultrasonic sensor moves along that landmark”.
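
A minimal sketch of one way such a feature extractor could work is shown below: multiple echo components are taken as local maxima above a noise threshold in a range profile. The profile, bin width and threshold are invented for illustration and do not reproduce the extractors developed in the thesis.

# Toy feature extractor: echo components as local maxima above a noise threshold.

def extract_components(profile, bin_width_m, threshold):
    """Return (range_m, amplitude) for every local maximum above the threshold."""
    components = []
    for i in range(1, len(profile) - 1):
        is_peak = profile[i] >= profile[i - 1] and profile[i] > profile[i + 1]
        if is_peak and profile[i] > threshold:
            components.append((i * bin_width_m, profile[i]))
    return components

# Synthetic profile: a strong wall return followed by a weaker edge return.
profile = [0.10, 0.20, 0.90, 0.40, 0.20, 0.15, 0.50, 0.25, 0.10]
print(extract_components(profile, bin_width_m=0.05, threshold=0.30))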

Our experimental work decoupled control from sensing by moving away from the autonomous robot to a precision positioner that tracked parallel to a wall. We then analysed the backscattered echoes with the feature extractors we had developed. The data we collected and present in this thesis shows that it is possible to track a corridor wall as a continuous landmark by extracting multiple components from echoes backscattered from the landmark, where each component represents a geometric property of the landmark. The contribution this work offers to the field of navigation and localisation is an alternative to the pervasive sensing techniques that rely on rich information from high-quality lasers and high-resolution vision systems. The limitation of these is that they quickly become ineffective in dark or dusty environments. Ultrasonic sensing is robust in these harsh environments and yields data-rich acoustic echo signals suitable for real-time analysis. Our work in modelling the ultrasonic beam characteristics and the backscattered signal can be adapted to a myriad of applications in autonomous robot navigation.
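
To suggest how components extracted at successive sensor positions might be tracked along a continuous landmark, the sketch below pairs components by nearest-neighbour range within a gate. The gate value and the example ranges are hypothetical and are not data from the thesis.

# Nearest-neighbour association of echo components between two successive sensor
# positions along the wall. The gate and the example ranges are hypothetical.

def associate(prev_ranges, curr_ranges, gate_m=0.10):
    """Pair each previous component with the closest current one inside the gate."""
    pairs, unused = [], list(curr_ranges)
    for r_prev in prev_ranges:
        if not unused:
            break
        r_best = min(unused, key=lambda r: abs(r - r_prev))
        if abs(r_best - r_prev) <= gate_m:
            pairs.append((r_prev, r_best))
            unused.remove(r_best)
    return pairs

# Component ranges (m), e.g. wall face, skirting board and a door frame.
print(associate([0.62, 0.75, 1.40], [0.60, 0.78, 1.95]))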

FoR codes (2008)

0801 ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING, 0803 COMPUTER SOFTWARE, 0899 OTHER INFORMATION AND COMPUTING SCIENCES

