Cooperative Localization and Multi-Robot Exploration

Abstract

This thesis makes two main contributions. The first is the use of cooperative localization to decouple the positional error of a moving robot from its environment. The second is the development of efficient multi-robot exploration strategies for unknown environments. The proposed method is designed to be robust to arbitrarily large odometry errors and to objects with poor reflectance characteristics.

Central to the exploration strategy is a sensor (the robot tracker) mounted on one robot that can track a second mobile robot and accurately report its relative position. Our exploration strategies use the robot tracker to sweep areas of free space between stationary and moving robots and to generate a graph-based description of the environment, which then guides the exploration process. Depending on the size of the environment relative to the range of the robot tracker, different spatial decompositions of the free space are used: a triangulation or a trapezoidal decomposition. The guidance provided by the dual graph of the decomposition guarantees complete exploration without overlap.

The uncertainty in absolute robot positions, and the resulting uncertainty in the map, is reduced through a probabilistic framework based on particle filtering (a Monte Carlo simulation technique). Particle filtering is a probabilistic sampling technique that efficiently models complex probability distributions which cannot be described effectively by classical methods such as Kalman filters. We present experimental results from two different implementations of the robot tracker sensor, in both simulated and real environments. The accuracy of the resulting map increases with the use of cooperative localization.
Furthermore, deteriorating floor conditions did not affect the quality of the map, verifying the decoupling of the positioning error from the environment.
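To make the particle-filtering idea concrete, the following is a minimal 1D sketch of the predict-weight-resample cycle. It is an illustration only, not the thesis implementation: the corridor scenario, the beacon at a known position, and all noise parameters are assumptions introduced here.

```python
import math
import random

def particle_filter_step(particles, odom, z, beacon,
                         odom_sigma=0.2, meas_sigma=0.25):
    """One predict-weight-resample cycle for a robot moving along a line.

    particles: list of hypothesized 1D positions
    odom:      commanded forward motion this step
    z:         measured range to a beacon at known position `beacon`
    """
    # Predict: propagate each particle through the noisy motion model.
    moved = [p + odom + random.gauss(0.0, odom_sigma) for p in particles]
    # Weight: Gaussian likelihood of the range reading for each particle.
    weights = [math.exp(-(((beacon - p) - z) ** 2) / (2 * meas_sigma ** 2))
               for p in moved]
    total = sum(weights)
    if total == 0.0:
        # All likelihoods underflowed: keep the predicted set unchanged.
        return moved
    weights = [w / total for w in weights]
    # Resample: draw a new set of particles proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))

def estimate(particles):
    """Point estimate of the robot position: the particle mean."""
    return sum(particles) / len(particles)

# Hypothetical usage: a robot advances 1.0 per step toward a beacon at 10.0.
random.seed(0)
particles = [random.uniform(-1.0, 1.0) for _ in range(200)]
true_pos = 0.0
for _ in range(5):
    true_pos += 1.0
    z = 10.0 - true_pos  # idealized (noise-free) range reading
    particles = particle_filter_step(particles, 1.0, z, 10.0)
est = estimate(particles)
```

Because the particle set is a sample-based representation of the posterior, it can capture the multi-modal position distributions that arise during cooperative localization, which a Kalman filter's single Gaussian cannot.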