Towards accurate object tracking using acoustic signal
We are living in the era of mobile computing, where people are surrounded by smart devices such as smartphones, smart watches, smart TVs, and Virtual Reality (VR) headsets. Providing an intuitive user interface is crucial to satisfying users' needs, but the limited form factor of these devices makes it difficult to support a natural user interface. Object tracking opens a new opportunity to design intuitive, gesture-based user interfaces. First, it can provide a more convenient user interface (UI) than traditional controllers and can be used to control a wide variety of Internet of Things devices. For example, a button-based controller is hard to use with a VR headset because the user may not be able to see the controller; by tracking the position of the controller or the hand, VR applications can be controlled more intuitively. Second, it can support motion-based gaming, which is becoming increasingly popular.

In this dissertation, we present accurate object tracking methods that are useful for designing intuitive user interfaces for mobile systems. In particular, we focus on exploiting acoustic signals to track the movement of an object. While vision-based and RF-signal-based object tracking have been investigated extensively, acoustic-signal-based tracking remains under-explored. Its advantage is that it can be enabled by widely available speakers and microphones and processed entirely in software, without any extra hardware. Using acoustic signals, we provide two different ways of tracking: 1) tracking of mobile devices such as smartphones and smart watches, and 2) device-free tracking, which follows a hand without any worn device by exploiting the signal reflected from the moving hand. In both scenarios, the tracking is accurate enough that the mobile device or the hand can be used as a mouse in the air.

First, we develop a system that accurately tracks the movement of a mobile device using acoustic signals. The device to be controlled (e.g., a smart TV) sends acoustic signals through its speakers, and the mobile device tracks its own movement. More specifically, the tracker sends inaudible sound pulses at a few selected frequencies and uses the observed frequency shifts to estimate the speed and the distance traveled (a sketch of this Doppler computation appears below). We then develop techniques to quickly calibrate the distance between the speakers and to narrow down the device's initial position using its movement trajectory. Based on this information, we continuously track the device's new position in real time. This is feasible because many devices, such as smart TVs, PCs, and laptops, already have multiple speakers. Our evaluation and user study demonstrate that our system achieves high tracking accuracy (e.g., a median error of around 1.4 cm) and ease of use.

Next, we present a device-free motion tracking system. It tracks the movement of the hand relying only on the reflected acoustic signal, which is more challenging. To realize this, we propose a novel approach that estimates both the distance and the velocity from a single chirp signal and combines the two to accurately locate the moving hand. Through micro-benchmarks and user studies, we show that our system achieves a median tracking error of 1.94 cm using two speakers and two microphones on the same computer. Finally, we improve the accuracy of the device-free tracking so that it becomes applicable to gesture-based user interfaces for smart watches and VR headsets.
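The following Python sketch illustrates the kind of Doppler computation described for the first system: estimating radial velocity from the frequency shift of an inaudible pilot tone and integrating it into distance traveled. The tone frequency, sample rate, window length, and the helper names `doppler_velocity` and `distance_traveled` are all illustrative assumptions; the dissertation's actual system uses multiple frequencies and speaker-distance calibration, which this sketch omits.

```python
import numpy as np

SOUND_SPEED = 343.0   # speed of sound in air (m/s), approximate
TONE_FREQ = 18_000.0  # assumed inaudible pilot tone (Hz); illustrative choice
FS = 48_000           # assumed sample rate (Hz)

def doppler_velocity(window: np.ndarray) -> float:
    """Estimate radial velocity from one window of microphone samples.

    Finds the dominant frequency near the transmitted tone via an FFT
    peak, then converts the Doppler shift into a velocity estimate:
        v = c * (f_rx - f_tx) / f_tx
    Positive v means the device moves toward the speaker.
    """
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    # Search only a narrow band around the pilot tone.
    band = (freqs > TONE_FREQ - 500) & (freqs < TONE_FREQ + 500)
    f_rx = freqs[band][np.argmax(spectrum[band])]
    return SOUND_SPEED * (f_rx - TONE_FREQ) / TONE_FREQ

def distance_traveled(windows, hop_seconds: float) -> float:
    """Integrate per-window velocity estimates into distance traveled."""
    return sum(doppler_velocity(w) * hop_seconds for w in windows)

# Example: a 20 ms window of a tone shifted by +50 Hz (~0.95 m/s toward the speaker).
t = np.arange(int(0.02 * FS)) / FS
window = np.sin(2 * np.pi * (TONE_FREQ + 50) * t)
print(f"estimated velocity: {doppler_velocity(window):.2f} m/s")
```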
We design a single-carrier data communication system over the acoustic channel and observe the channel impulse responses of the reflected signal. By observing the phase change of the reflected path, we can track the movement of the finger accurately. In our experiments using the speaker and microphones of a commercial mobile device, we show that the finger movement can be tracked very accurately, with a tracking error of less than 1 cm.
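To make the phase-based tracking concrete, here is a minimal sketch of how a phase change in the reflection's channel tap maps to finger displacement. The 20 kHz carrier, the assumption that the finger's tap has already been isolated from the channel impulse response, and the function name `finger_displacement` are illustrative choices, not the dissertation's implementation.

```python
import numpy as np

SOUND_SPEED = 343.0      # speed of sound in air (m/s), approximate
CARRIER_FREQ = 20_000.0  # assumed acoustic carrier (Hz); illustrative choice
WAVELENGTH = SOUND_SPEED / CARRIER_FREQ  # ~1.7 cm

def finger_displacement(taps: np.ndarray) -> float:
    """Estimate finger displacement from the reflection's channel tap.

    `taps` holds the complex channel-impulse-response tap attributed to the
    finger's reflection, one value per estimation interval. The tap behaves
    like exp(-j * 2*pi * d_roundtrip / WAVELENGTH), so its phase grows as
    the round-trip path shortens; the finger itself moves half the path
    change. Returns cumulative displacement in meters (positive = toward
    the sensor).
    """
    phase = np.unwrap(np.angle(taps))
    return (phase[-1] - phase[0]) * WAVELENGTH / (4 * np.pi)

# Example: a finger approaching by 1 cm, simulated over 100 intervals.
one_way = 0.30 - np.linspace(0.0, 0.01, 100)                 # meters
taps = np.exp(-1j * 2 * np.pi * (2 * one_way) / WAVELENGTH)  # ideal tap phases
print(f"estimated displacement: {finger_displacement(taps):.4f} m")  # ~0.0100
```

A displacement of half a wavelength (under 1 cm at this carrier) already produces a full 2*pi phase rotation, which is why phase tracking can resolve finger motion well below the 1 cm error reported above, provided the phase is unwrapped between consecutive estimates.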