Modelling visually guided natural locomotion

dc.contributor.advisor: Hayhoe, Mary
dc.contributor.committeeMember: Cormack, Lawrence K
dc.contributor.committeeMember: Huk, Alexander C
dc.contributor.committeeMember: Huth, Alexander
dc.contributor.committeeMember: Geisler, Wilson S
dc.creator: Muller, Karl Sungmin
dc.date.accessioned: 2022-08-02T23:28:59Z
dc.date.available: 2022-08-02T23:28:59Z
dc.date.created: 2021-12
dc.date.issued: 2021-11-29
dc.date.submitted: December 2021
dc.date.updated: 2022-08-02T23:29:00Z
dc.description.abstract: Vision is an active process in which an organism must seek out and acquire the information needed to support different behavioral goals. Understanding these behaviors is therefore important for understanding how visual processes unfold in the brain, which is adapted to perform the computations they require. Bipedal locomotion is one such behavior, of particular importance in evolutionary history. In this work I examine locomotion over complex terrain using a mobile eye tracker and a motion capture system, which together provide an integrated record of eye and body movements as well as an approximation of the retinal input image. Computer vision methods were applied to extract visual motion and to reconstruct environment geometry, affording an unprecedented opportunity to examine the visuo-motor decision processes controlling locomotion in natural terrain. Our results reveal statistical regularities in motion signals that depend on gaze angle and terrain. Gaze angle shapes the spatial distribution of both the speeds and directions of visual motion, which has implications for how the visual system might account for this relationship. Terrain differences also manifest in motion signals as deviations from flat-ground motion, whose magnitude is correlated with how close to the walker gaze is allocated. We also find that gaze is partly predictable from body orientation and image features. Finally, we find that foot placement reflects the avoidance of height changes, with the degree of this influence modulated by the subject's leg length. Walkers appear to factor this information into their decision making across multiple spatial scales. Thus foot placement reflects a complex interplay between energetic costs and the need for stable footholds, all taking place as walkers maintain their forward momentum.
The conclusions drawn from this new dataset, as well as the novelty of the dataset itself, are important contributions toward a deeper understanding of how vision is used to guide locomotion in the natural world.
dc.description.department: Neuroscience
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/2152/115133
dc.identifier.uri: http://dx.doi.org/10.26153/tsw/42034
dc.language.iso: en
dc.subject: Vision
dc.subject: Locomotion
dc.subject: Computer vision
dc.subject: Eye movements
dc.subject: Motion perception
dc.subject: Motion capture
dc.title: Modelling visually guided natural locomotion
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Neuroscience
thesis.degree.discipline: Neuroscience
thesis.degree.grantor: The University of Texas at Austin
thesis.degree.level: Doctoral
thesis.degree.name: Doctor of Philosophy

Original bundle:
MULLER-DISSERTATION-2021.pdf (10.07 MB, Adobe Portable Document Format)

License bundle:
PROQUEST_LICENSE.txt (4.45 KB, Plain Text)
LICENSE.txt (1.84 KB, Plain Text)