When deprived of sight, humans adapt and use other senses for navigation. Most rely on touch (the long cane), but some use auditory perception; we have observed a blind teenager echolocating using clicks he makes with his mouth. More commonly, an ultrasonic sensor is used as a navigational aid to scan the path and environment. Each individual interprets the echoes to form an auditory scene in which clear paths and obstacles are identified, and with this information the blind user threads his or her way safely through the scanned space. The work described here seeks to mimic a blind person using a sonar navigational aid to traverse a path or corridor. We use a commercially available ultrasonic mobility aid to insonify a corridor and capture the echoes, and we then attempt to correlate these echoes with the geometric features of the corridor as we perceive them. Our aim is to develop a perception system capable of interpreting the echoes in real time to discern the geometric features of the environment, so that this information can be used to navigate a robot through it.
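The core physical relation underlying any such sonar aid is the mapping from echo time-of-flight to range. As a minimal illustrative sketch (not the system described in this work), the following converts round-trip echo times to distances and applies a hypothetical threshold to label each return as a clear path or an obstacle; the speed of sound, threshold, and function names are all assumptions for illustration only.

```python
SPEED_OF_SOUND = 343.0  # m/s in dry air at 20 degrees C (assumed)

def echo_distance(time_of_flight_s: float, speed: float = SPEED_OF_SOUND) -> float:
    """Range to a reflector: the pulse travels out and back, so
    distance = speed * time / 2."""
    return speed * time_of_flight_s / 2.0

def classify_path(echo_times, clear_threshold_m: float = 2.0):
    """Label each echo as 'clear' or 'obstacle' using a simple
    range threshold (a hypothetical rule, not the authors' method)."""
    return ["clear" if echo_distance(t) >= clear_threshold_m else "obstacle"
            for t in echo_times]

# Example: echoes returning after 5 ms and 20 ms
print(echo_distance(0.005))            # roughly 0.86 m
print(classify_path([0.005, 0.020]))
```

A real perception system would of course go far beyond a single threshold, correlating the full echo train with corridor geometry, but the time-of-flight conversion above is the first step in any such interpretation.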