Introspective perception for mobile robots

Date

2023-02-28

Authors

Rabiee, Sadegh

Abstract

Perception algorithms that provide estimates of their uncertainty are crucial to the development of autonomous robots that can operate in challenging and uncontrolled environments. Such algorithms enable risk-aware robots that reason about the probability of successfully completing a task when planning. Some perception algorithms do come with models of their uncertainty; however, these models are often built on assumptions, such as perfect data association, that do not hold in the real world. Hence, the resulting uncertainty estimate is only a weak lower bound on the true uncertainty.

To tackle this problem, we present introspective perception, a novel approach for producing accurate estimates of the uncertainty of perception algorithms deployed on mobile robots. By exploiting the sensing redundancy and consistency constraints naturally present in the data collected by a mobile robot, introspective perception learns an empirical model of the error distribution of the perception algorithms in the deployment environment, in an autonomously supervised manner.

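To make the autonomously supervised idea concrete, the following is a minimal, hypothetical sketch rather than the implementation described in the thesis: it assumes a stereo depth estimator, uses the discrepancy between a depth estimate and a later re-observation of the same point as a free supervisory label, and fits a simple regressor that predicts expected error from image features. All function names (`consistency_error`, `extract_patch_features`, `train_error_model`) and the choice of regressor are invented for illustration.

```python
# Hypothetical sketch: autonomously supervised training of an empirical
# error model for a perception module, using consistency between redundant
# observations as the label source. Names and structure are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def consistency_error(depth_t0, depth_t1_reprojected):
    """Discrepancy between a depth estimate and the same scene point
    re-observed later and reprojected into the first frame."""
    return np.abs(depth_t0 - depth_t1_reprojected)

def extract_patch_features(image_patch):
    """Cheap image statistics used as input features (illustrative)."""
    return np.array([image_patch.mean(), image_patch.std(),
                     np.percentile(image_patch, 90)])

def train_error_model(samples):
    """samples: list of (image_patch, depth_t0, depth_t1_reprojected) tuples
    gathered automatically during deployment, with no human labels."""
    X = np.array([extract_patch_features(p) for p, _, _ in samples])
    y = np.array([consistency_error(d0, d1) for _, d0, d1 in samples])
    model = GradientBoostingRegressor().fit(X, y)
    return model  # predicts expected depth error for new image patches
```
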
In this thesis, we present the general theory of introspective perception and demonstrate successful implementations for two different perception tasks: introspective stereo depth estimation and introspective visual simultaneous localization and mapping (SLAM). Empirical results on challenging real-robot data show that both learn to predict their uncertainty with high accuracy. We also present a framework for integrating introspective perception with robot path planning algorithms. This framework enables the robot to leverage accurate estimates of perception uncertainty to reason about the probability of successfully completing a plan in novel deployment environments, thereby reducing task execution failures.
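As a toy illustration of how predicted perception uncertainty can feed into planning (not the framework from the thesis), the sketch below scores candidate paths by their probability of perception success, assuming independent failures along edges; the graph structure and the `predict_edge_failure` callable are invented for this example.

```python
# Toy sketch: using predicted perception-failure probabilities in planning.
# Assumes failures on different edges are independent (an assumption of this
# sketch, not a claim about the thesis's framework).
import math

def path_success_probability(edge_failure_probs):
    """Probability that perception succeeds on every edge of a path."""
    return math.prod(1.0 - p for p in edge_failure_probs)

def select_path(candidate_paths, predict_edge_failure):
    """candidate_paths: list of paths, each a list of edges.
    predict_edge_failure: callable returning the introspection model's
    predicted failure probability for an edge."""
    scored = [(path_success_probability(
                  [predict_edge_failure(e) for e in path]), path)
              for path in candidate_paths]
    return max(scored, key=lambda s: s[0])  # (success probability, path)
```
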
