Next: Modelling and Simulation Up: Mobile Robotics Previous: Task-Driven Behavioral Navigation

A Graphical User Interface for an Autonomous Robot Navigation System

This work involves the development of a graphical user interface (GUI) that encompasses the entire control, sensing and input requirements of an autonomous mobile robot. The robot comprises a wide variety of hardware and software components, including sensors and their respective control programs, navigational and control strategies, object recognition programs, and tools for inter-process communication. We wish to develop an intuitive, hierarchical interface that allows non-technical users to control all aspects of the robotic system. As such, the details of the individual programs used to control robot operation are hidden from the user. Instead, the user specifies a desired task, and it becomes the job of the interface to coordinate the actions of the necessary programs in order to perform this task.

A further objective of this work is the development of standards that allow future components to be seamlessly integrated into the existing system. This calls for interface standards that provide a consistent look and feel for the user interface, as well as communication and device driver protocols that allow new devices to be incorporated into existing control programs.

The user interface is organised hierarchically. At the highest level, the user describes the desired task using an intuitive, natural language interface. This task is accomplished through repeated application of three processes, which constitute the three main subsystems of the interface hierarchy: perception, reasoning and action. The first subsystem, perception, allows the user to control the acquisition of sensor data and related hardware, such as the pan/tilt units. The sensor data are input to the programs controlled by the second subsystem, reasoning. This subsystem gives the user control over all of the components related to the interpretation of the sensor data and the determination of appropriate actions based on these data.
The final subsystem, action, provides the user with graphical feedback on the system's progress towards the accomplishment of the task. The underlying programs that actually perform the robot control do not fall neatly into this hierarchy; the user interface must therefore organise the individual aspects of each program into these categories. Organising the interface in this way allows a user, at the highest levels of the hierarchy, to instruct the robot to perform complex operations with no concern for how these actions are accomplished. At the same time, more detailed control options remain accessible, organised in a natural and intuitive way.
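The repeated perception-reasoning-action cycle described above can be sketched as follows. This is a minimal illustration only; every class, method and data name here is an assumption made for the sketch, not the actual interface of the system.

```python
# Hypothetical sketch of the interface's perception-reasoning-action cycle.
# The names below (Perception, Reasoning, Action, TaskInterface, ...) are
# illustrative assumptions, not the system's real components.

class Perception:
    """Acquires sensor data (stubbed; a real system would query sonar, cameras, etc.)."""
    def sense(self):
        return {"obstacle_ahead": False}

class Reasoning:
    """Interprets sensor data and determines an appropriate action."""
    def decide(self, percept):
        return "stop" if percept["obstacle_ahead"] else "move_forward"

class Action:
    """Issues the chosen command and reports progress back to the user."""
    def execute(self, command):
        return f"executing: {command}"

class TaskInterface:
    """Coordinates the three subsystems to carry out a user-specified task."""
    def __init__(self):
        self.perception = Perception()
        self.reasoning = Reasoning()
        self.action = Action()

    def run_task(self, task, steps=3):
        # Repeatedly apply the three processes; a real system would loop
        # until the task is accomplished rather than for a fixed step count.
        progress = []
        for _ in range(steps):
            percept = self.perception.sense()              # perception
            command = self.reasoning.decide(percept)       # reasoning
            progress.append(self.action.execute(command))  # action
        return progress

gui = TaskInterface()
print(gui.run_task("go to the doorway"))
```

The point of the sketch is the coordination pattern: the user names a task, and the interface alone decides which subsystem programs run and in what order, matching the abstract's claim that program details are hidden from the user.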
G. Sela, M.D. Levine






Thierry Baron
Mon Nov 13 10:43:02 EST 1995