On-demand coordination of multiple service robots

dc.contributor.advisor: Stone, Peter, 1971-
dc.contributor.committeeMember: Grauman, Kristen
dc.contributor.committeeMember: Niekum, Scott
dc.contributor.committeeMember: Thomaz, Andrea
dc.contributor.committeeMember: Veloso, Manuela
dc.creator: Khandelwal, Piyush
dc.date.accessioned: 2017-07-31T19:05:10Z
dc.date.available: 2017-07-31T19:05:10Z
dc.date.issued: 2017-05
dc.date.submitted: May 2017
dc.date.updated: 2017-07-31T19:05:10Z
dc.description.abstract: Research in recent years has made it increasingly plausible to deploy a large number of service robots in home and office environments. Given that multiple mobile robots may be available in the environment performing routine duties such as cleaning, building maintenance, or patrolling, and that each robot may have a set of basic interfaces and manipulation tools to interact with one another as well as with humans in the environment, is it possible to coordinate multiple robots for a previously unplanned on-demand task? The research presented in this dissertation aims to begin answering this question. This dissertation makes three main contributions.

The first contribution of this work is a formal framework for coordinating multiple robots to perform an on-demand task while balancing two objectives: (i) complete the on-demand task as quickly as possible, and (ii) minimize the total amount of time each robot is diverted from its routine duties. We formalize this stochastic sequential decision-making problem, termed on-demand multi-robot coordination, as a Markov decision process (MDP). Furthermore, we study this problem in the context of a specific on-demand task called multi-robot human guidance, where multiple robots need to coordinate to efficiently guide a visitor to their destination.

Second, we develop and analyze stochastic planning algorithms to efficiently solve the on-demand multi-robot coordination problem in real time. Monte Carlo Tree Search (MCTS) planning algorithms have demonstrated excellent results solving MDPs with large state spaces and high action branching factors. We propose variants of the MCTS algorithm that use biased backpropagation techniques for value estimation, which can help MCTS converge quickly to reasonable, if suboptimal, policies compared to standard unbiased Monte Carlo backpropagation. In addition to using these planning algorithms to solve the on-demand multi-robot coordination problem in real time, we also analyze their performance using benchmark domains from the International Planning Competition (IPC).

The third and final contribution of this work is the development of a multi-robot system built on top of the Segway RMP platform at the Learning Agents Research Group, UT Austin, and the implementation and evaluation of the on-demand multi-robot coordination framework and two different planning algorithms on this platform. We also perform two studies using simulated environments, in which real humans control a simulated avatar, to test the implementation of the MDP formalization and planning algorithms presented in this dissertation. (Illustrative sketches of the reward trade-off and the biased MCTS backup follow this record.)
dc.description.department: Computer Science
dc.format.mimetype: application/pdf
dc.identifier: doi:10.15781/T2V40KF21
dc.identifier.uri: http://hdl.handle.net/2152/61382
dc.language.iso: en
dc.subject: Multi-robot coordination
dc.subject: Monte Carlo tree search
dc.subject: Markov decision processes
dc.subject: Probabilistic planning
dc.subject: Multi-robot systems
dc.title: On-demand coordination of multiple service robots
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Computer Sciences
thesis.degree.discipline: Computer Science
thesis.degree.grantor: The University of Texas at Austin
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy
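
The abstract states that the on-demand coordination MDP balances (i) completing the on-demand task quickly against (ii) minimizing how long robots are diverted from their routine duties, but the record does not reproduce the formalization. The following Python sketch only illustrates that trade-off under assumed names (State, step_reward, and diversion_weight are hypothetical); it is not the dissertation's actual MDP.

from dataclasses import dataclass
from typing import Tuple


@dataclass(frozen=True)
class State:
    visitor_location: int                # graph node where the visitor currently is
    visitor_destination: int             # graph node the visitor needs to reach
    robot_locations: Tuple[int, ...]     # current graph node of each robot
    robots_diverted: Tuple[bool, ...]    # robots currently pulled off routine duty


def step_reward(state: State, step_duration: float, diversion_weight: float = 0.5) -> float:
    """Negative cost per decision step: elapsed task time (objective i) plus a
    weighted charge for each robot diverted from routine duty (objective ii)."""
    num_diverted = sum(state.robots_diverted)
    return -step_duration - diversion_weight * step_duration * num_diverted


# Example: one robot diverted during a 10-second step costs -10 - 0.5 * 10 = -15.
s = State(visitor_location=3, visitor_destination=9,
          robot_locations=(1, 4, 7), robots_diverted=(False, True, False))
print(step_reward(s, step_duration=10.0))

Here diversion_weight is the knob that trades one objective against the other; its value above is arbitrary and chosen only for illustration.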
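
The abstract also mentions MCTS variants that use biased backpropagation for value estimation. The sketch below shows one generic way such a bias can be introduced: a lambda-weighted blend of the sampled Monte Carlo return with the best child value at each node during the backup. Node, backpropagate, and lam are illustrative assumptions, not the dissertation's exact algorithm.

class Node:
    """Search-tree node; reward is the immediate reward received on the
    transition into this node from its parent."""
    def __init__(self, parent=None, reward=0.0):
        self.parent = parent
        self.reward = reward
        self.children = {}     # action -> Node
        self.visits = 0
        self.value = 0.0       # running value estimate


def backpropagate(leaf, rollout_return, gamma=1.0, lam=0.7):
    """Propagate a simulated return from leaf back to the root.

    lam = 1.0 reduces to the standard unbiased Monte Carlo backup (an average
    of sampled returns); smaller lam biases each target toward the best child
    value seen so far, trading bias for lower variance.
    """
    target = rollout_return
    node = leaf
    while node is not None:
        node.visits += 1
        node.value += (target - node.value) / node.visits   # incremental mean
        best_child = max((c.value for c in node.children.values()), default=target)
        # Blend the sampled return with the greedy child value, then discount
        # and add the immediate reward before handing the target to the parent.
        target = node.reward + gamma * (lam * target + (1.0 - lam) * best_child)
        node = node.parent

With lam close to 1 this backup behaves like plain Monte Carlo averaging; pushing lam toward 0 propagates greedier, lower-variance targets, which matches the abstract's intuition of converging quickly to reasonable, if suboptimal, policies.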

Access full-text files

Original bundle

Name: KHANDELWAL-DISSERTATION-2017.pdf
Size: 7.74 MB
Format: Adobe Portable Document Format

License bundle

Name: PROQUEST_LICENSE.txt
Size: 4.46 KB
Format: Plain Text

Name: LICENSE.txt
Size: 1.85 KB
Format: Plain Text