# Browsing by Subject "Particle filter"

Now showing 1 - 9 of 9


## A real-time throughput-model-based particle filter program generator on GPU: a real-time analysis (2022-04-11)
Zhang, Lixun, Ph.D.; Mok, Aloysius Ka-Lau; Beaman, Joseph; Novak, Gordon; Rossbach, Christopher

State estimation plays an important role in cyber-physical systems. The controller requires an accurate state of the physical plant to compute the optimal control signals that drive the physical system toward the target state. In most cases, however, states cannot be obtained from sensors directly, and for complicated physical systems with high-dimensional non-linear dynamics, particle filters are required for state estimation because of their superior accuracy compared to linear estimators such as Kalman filters. A major drawback of particle filters is their computational cost, since a large number of particles is required to produce accurate estimates. Fortunately, the computation of a particle filter can be parallelized, and so it can be accelerated by graphics processing units (GPUs). One hindrance to using GPUs as the computing engine in cyber-physical systems is the lack of real-time performance information: because of concurrency and synchronization between processors, real-time performance analysis for parallel architectures is challenging. This dissertation focuses on the real-time analysis of state estimators using particle filters implemented on GPUs. The goal is to compute an accurate prediction of the state estimator's execution time from static information about the implementation, namely the source code of the state estimator and the hardware specifications. To that end, this dissertation presents an analytical performance model that takes as input the source code of the state estimator, the number of particles, and the specifications of the hardware.
The analytical performance model outputs a prediction of the execution time of the state estimator. The model is tested on a synthetic benchmark and three real-world applications. The benchmark contains synthetic GPU programs with varying arithmetic intensity and parallelism; the real-world applications, Vacuum Arc Remelting, Early Kick Detection, and Monte Carlo Localization, apply particle filters to perform state estimation. This dissertation also demonstrates the application of the analytical performance model in a particle filter program generator system.

## Analysis of circular data in the dynamic model and mixture of von Mises distributions (2013-05)
Lan, Tian; Carvalho, Carlos Marinho

Analysis of circular data is becoming increasingly popular in many fields of study. This report presents two statistical analyses of circular data using von Mises distributions. First, the expectation-maximization algorithm is reviewed and used to classify and estimate circular data from a mixture of von Mises distributions. Second, the Forward Filtering Backward Smoothing method via particle filtering is reviewed and implemented for circular data arising in dynamic state-space models.

## Autonomous search for gas source in an oil & gas facility (2021-08-12)
Nambiar, Zahin; Pryor, Mitchell Wayne; Chen, Dongmei, Ph.D.

A common problem in oil and gas facilities is the leaking of fugitive emissions such as methane or other hazardous gases. These hazardous, polluting, and/or wasteful gases seep through flaws in the seals of pressurized systems in refineries and other downstream sites, and operators expend great effort to detect, locate, and correct such leaks. Detection lends itself to automation via a mobile robot equipped with gas and wind sensors. Prior work has developed such robotic methods, which we apply to demonstrations at UT Austin.
We present an integrated system with autonomous navigation, remote vision, and gas-sensing capabilities, along with task-specification and data-visualization methods. The system locates and characterizes a single gas source and reports its location to an operator. Preliminary results based on the developed experimental road map validate the approach for single gas source localization, but additional data must be collected to fully understand the achievable localization accuracy under particular environmental conditions.

## Bayesian inference methods for next generation DNA sequencing (2014-08)
Shen, Xiaohu; Vikalo, Haris

Recently developed next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, enabling routine sequencing tasks and taking us one step closer to personalized medicine. To provide a blueprint of a target genome, next-generation sequencing systems typically employ the so-called shotgun sequencing strategy and oversample the genome with a library of relatively short overlapping reads. The order of nucleotides in the short reads is determined by processing noisy signals generated by the sequencing platform, and the overlaps between reads are exploited to assemble the long target genome. Next-generation sequencing uses massively parallel array-based technology to speed up sequencing and reduce cost; however, the accuracy and lengths of the short reads have yet to surpass those of the slower and costlier conventional Sanger method. In this thesis, we first focus on Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal with a hidden Markov model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base-calling scheme called ParticleCall.
ParticleCall is tested on an experimental data set obtained by sequencing the phiX174 bacteriophage with Illumina's Genome Analyzer II. The results show that ParticleCall is significantly more computationally efficient than the best-performing unsupervised base-calling method currently available, while achieving the same accuracy. Having addressed base calling of short reads, we turn our attention to genome assembly. Assembling a genome from short reads is computationally daunting even when a reference genome exists; errors and gaps in the reference, and perfect repeat regions in the target, further complicate the assembly and cause inaccuracies. We formulate reference-guided assembly as an inference problem on a bipartite graph and solve it with a message-passing algorithm. The proposed algorithm can be interpreted as classical belief propagation under a certain prior. Unlike existing state-of-the-art methods, it combines the information provided by the reads without needing to know their reliability (the so-called quality scores). The relation of the message-passing algorithm to a provably convergent power-iteration scheme is discussed. Results on both simulated and experimental data demonstrate that the proposed message-passing algorithm outperforms commonly used state-of-the-art tools and nearly achieves the performance of a genie-aided maximum a posteriori (MAP) scheme. We then consider the reference-free, i.e., de novo, genome assembly problem. Various methods for de novo assembly have been proposed in the literature, all of which are very sensitive to errors in the short reads. We develop a novel error-correction method that improves the performance of de novo assembly. The new method relies on a suffix-array structure built over the short-read data.
It incorporates a hypothesis-testing procedure, with the sum of quality information as the test statistic, to improve the accuracy of overlap detection. Finally, we consider an inference problem in gene regulatory networks. Gene regulatory networks are highly complex dynamical systems comprising biomolecular components that interact with one another and, through those interactions, determine gene expression levels, i.e., the rate of gene transcription. In this thesis, a particle filter with a Markov chain Monte Carlo move step is employed to estimate reaction rate constants in gene regulatory networks modeled by chemical Langevin equations. Simulation studies demonstrate that the proposed technique outperforms previously considered methods while being computationally more efficient. The dynamic behavior of gene regulatory networks averaged over a large number of cells can be modeled by ordinary differential equations; for this scenario, we compute an approximation to the Cramér-Rao lower bound on the mean-square error of the reaction-rate estimates and demonstrate that, when the number of unknown parameters is small, the proposed particle filter can be nearly optimal. In summary, this thesis presents a set of Bayesian inference methods for base calling and sequence assembly in next-generation DNA sequencing. Experimental studies show the advantage of the proposed algorithms over traditional methods.

## Cyber-enabled manufacturing systems (CeMS): model-based estimation and control of a solidification process (2014-12)
Lopez, Luis Felipe; Beaman, Joseph J.; Williamson, Rodney L.

Vacuum arc remelting (VAR) is a secondary melting process used to produce a variety of segregation-sensitive and reactive metal alloys. Present-day VAR practice for superalloys typically involves melting 17'' electrodes into ingots 20'' in diameter. Even larger-diameter forging stock is desirable.
However, beyond 20'', superalloy ingots are increasingly prone to segregation defects if solidification is not adequately controlled. In recent years, a new generation of model-based controllers was developed to prevent segregation in VAR by controlling melt rate, or the total amount of power flowing into the liquid pool. These controllers were significant improvements for the remelting industry, but they still focused on the melting sub-process and ignored ingot solidification. Accurate control of the liquid pool profile is expected to yield segregation-free ingots, but a controller capable of stabilizing the solidification front in VAR is not currently available. The goal of the proposed research is to develop a cyber-enabled controller for VAR pool-depth control that enhances the capabilities of current technologies. More specifically, the objectives of this research are threefold. First, a control-friendly model is proposed, based on a high-fidelity ingot solidification model and coupled to a thermal model of electrode melting. Second, sequential Monte Carlo estimators are proposed to replace the traditional Kalman filter used in previous VAR controllers. Finally, a model predictive controller (MPC) is designed based on the proposed reduced-order model. The time-critical characteristics of these methods are studied, and the feasibility of their real-time implementation is reported.

## Energy storage-aware prediction/control for mobile systems with unstructured loads (2013-08)
LeSage, Jonathan Robert; Longoria, Raul G.

Mobile systems, such as ground robots and electric vehicles, inherently operate in stochastic environments where load demands are largely unknown. Onboard energy storage, most commonly an electrochemical battery system, can significantly constrain operation.
As such, mission planning and control of mobile systems benefit from a priori knowledge of battery dynamics and constraints, especially the rate-capacity and recovery effects. To overcome the overly conservative predictions common to most existing battery remaining-run-time algorithms, a prediction scheme is proposed. To characterize a priori unknown power loads, an unsupervised Gaussian mixture routine identifies and clusters the measured power loads, and a jump-Markov chain characterizes the load transients. With the jump-Markov load forecasts, a model-based particle filter scheme predicts battery remaining run-time. Monte Carlo simulation studies demonstrate the marked improvement of the proposed technique; the added computational complexity of the particle filter was found to be justified for power-load transient jumps greater than 13.4% of total system power. A multivariable reliability method was developed to assess the feasibility of a planned mission: the probability of mission completion is computed as the reliability integral of mission time exceeding battery run-time. Because these random variables are inherently dependent, a bivariate characterization is necessary, and a method is presented for online estimation of the process correlation via Bayesian updating. Finally, to prevent transient shutdown of mobile systems, a model predictive control scheme is proposed that enforces battery terminal-voltage constraints under stochastic loading conditions; a Monte Carlo simulation study of a small ground vehicle showed significant improvement in both run time and distance traveled as a result. To evaluate the proposed methodologies, a laboratory terrain environment was designed and constructed for repeated mobile-system discharge studies. The test environment consists of three distinct terrains.
In each discharge study, a small unmanned ground vehicle traversed the stochastic terrain environment until battery exhaustion; results from field tests with a Packbot ground vehicle in generic desert terrain were also used. Evaluation of the proposed prediction algorithms on these experimental studies, via relative-accuracy and [alpha]-[lambda] prognostic metrics, indicated significant gains over existing methods.

## Mathematical modeling of epidemic surveillance (2019-09-16)
Chen, Xi, Ph.D.; Meyers, Lauren Ancel; Hasenbein, John; Sarkar, Purnamrita; Mueller, Peter

My thesis focuses on three aspects of epidemic surveillance: estimating the probability, with corresponding uncertainty analysis, that a disease is imported into multiple geographic regions (Chapter 1); estimating the transmission of a disease after local transmission is established (Chapter 2); and estimating prevalence, with corresponding confidence intervals, while incorporating individual-level test sensitivity and specificity (Chapter 3). The maximum entropy model, a commonly used species distribution model (SDM), combines observations of species occurrence with environmental information to predict the geographic distributions of animal or plant species, but it produces only point estimates of the probability of species presence. To quantify the uncertainty of these point estimates, in Chapter 1 we analytically derive the variance of the maximum entropy model's outputs from the variance of its inputs. We apply the analytic method to obtain the standard deviation of dengue importation probability and Aedes aegypti suitability, using dengue occurrence data and Aedes aegypti mosquito abundance data combined with demographic and environmental data to obtain the point estimates and the corresponding variances.
Because the true distributions are unavailable for comparison, we compare and contrast the analytical expression against the bootstrap method and against a Poisson point process model, which is equivalent to the maximum entropy model under the assumption of independent point locations. Both the dengue importation probability and Aedes aegypti suitability examples show that the methods generate essentially the same results, and that the analytic method is dramatically faster than the bootstrap method while applying directly to the maximum entropy model. Infectious diseases such as influenza progress quickly, potentially reaching large parts of a population. Accurately estimating the parameters of an infectious-disease progression model can help health organizations determine the progression and severity of the disease and respond properly and quickly. In Chapter 2, we study the application of two widely used basic particle filter methods, the bootstrap filter and the auxiliary particle filter, to estimating the parameters of infectious-disease progression models, which are non-linear in nature. We propose a posterior particle filter algorithm and two single-statistic posterior particle filter algorithms to better handle outliers in the data; these are shown to outperform the traditional bootstrap and auxiliary particle filters in accurately and consistently estimating the parameters of compartmental SIR models. In addition, we propose a re-sampling algorithm and compare it with the currently popular re-sampling algorithm to show the importance of re-sampling in improving the consistency of particle filters.
Dengue is currently diagnosed using a test algorithm determined by the number of days after illness onset, which complicates prevalence estimation because test sensitivity and specificity vary across patients with differing RNA and antibody levels. In Chapter 3, we address the challenge of adjusting the estimated prevalence and propose a way to estimate the corresponding confidence interval while incorporating individual-level sensitivity and specificity. We compare sensitivity and specificity (for individual-level benefit) and average estimation error and precision (for surveillance purposes) of single tests and of possible combinations of multiple tests. Prevalence-estimation adjustment can correct all test combinations. The combination of an immunoassay targeting DENV nonstructural protein 1 (NS1) with an IgM-capture immunoassay (ELISA), and the combination of NS1 with real-time reverse transcription polymerase chain reaction (RT-PCR), yield statistically significant improvements in test sensitivity without sacrificing specificity, and narrow the confidence interval of the prevalence estimate.

## Sequential Monte Carlo filtering with Gaussian mixture models for highly nonlinear systems (2021-05-05)
Yun, Sehyun; Zanetti, Renato; Akella, Maruthi R.; Jah, Moriba K.; Jones, Brandon A.; D'Souza, Christopher N.

This dissertation presents two Bayesian approaches for highly nonlinear systems, together with a theoretical study on combining the benefits of the Gaussian sum filter and the particle filter: the posterior particles of a particle filter are drawn from a Gaussian mixture model approximation of the posterior distribution.
The first approach introduces methods that turn each particle of a particle filter into a Gaussian mixture component, either using properties of the Dirac delta function or using kernel density estimation; the former treats each particle of the prior distribution as a Gaussian component with a collapsed (zero) covariance matrix, while the latter estimates the covariance matrix of each component with a kernel density estimation algorithm. The Gaussian sum filter is then used to calculate the posterior distribution. The second approach uses clustering algorithms to recover a Gaussian mixture model representation of the prior probability density function from the propagated particles; the expectation-maximization clustering algorithm and modified fuzzy C-means clustering algorithms are applied here. Under the scenarios considered in this study, numerical simulations show that the proposed algorithms perform better than existing algorithms such as Gaussian sum filters and particle filters.

## Suitability of FPGA-based computing for cyber-physical systems (2009-12)
Lauzon, Thomas Charles; Chiou, Derek; Mok, Aloysius

Cyber-physical systems theory is a new concept poised to revolutionize the way computers interact with the physical world by integrating physical knowledge into computing systems and tailoring those systems to be more compatible with the way processes unfold in the physical world. In this master's thesis, field-programmable gate arrays (FPGAs) are studied as a potential technological asset that may contribute to enabling the cyber-physical paradigm.
As an example application that may benefit from cyber-physical system support, the electro-slag remelting process, a process for remelting metals into better alloys, was chosen because of the maturity of its physical models and controller designs. In particular, the particle filter that estimates the state of the process is studied as a candidate for FPGA-based computing enhancements. Through the designs and experiments carried out in this study, the FPGA reveals itself as a serious contender among the computing means for cyber-physical systems, owing to its capacity to mimic the ubiquitous parallelism of physical processes.
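The common thread through all nine items is the bootstrap (sequential importance resampling) particle filter: propagate particles through the state dynamics, weight them by the measurement likelihood, and resample. As a minimal, self-contained sketch of that generic algorithm, assuming a toy one-dimensional nonlinear state-space model chosen purely for illustration (this is not the implementation from any of the theses listed above):

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000, seed=0):
    """Bootstrap particle filter for a toy 1-D nonlinear model:
        x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1 + x_{t-1}^2) + v_t,  v_t ~ N(0, 10)
        y_t = x_t^2 / 20 + w_t,                                w_t ~ N(0, 1)
    Returns the posterior-mean state estimate at each time step."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 2.0, n_particles)  # samples from the prior
    estimates = []
    for y in observations:
        # 1. Predict: push every particle through the nonlinear dynamics.
        particles = (0.5 * particles
                     + 25.0 * particles / (1.0 + particles**2)
                     + rng.normal(0.0, np.sqrt(10.0), n_particles))
        # 2. Update: weight each particle by the Gaussian measurement
        #    likelihood, working in the log domain for numerical safety.
        log_w = -0.5 * (y - particles**2 / 20.0) ** 2
        weights = np.exp(log_w - log_w.max())
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # 3. Resample: duplicate high-weight particles (multinomial).
        particles = rng.choice(particles, size=n_particles, p=weights)
    return np.array(estimates)
```

The resampling step is what distinguishes the bootstrap filter from plain importance sampling: without it, after a few steps nearly all weight concentrates on a handful of particles, which is the weight-degeneracy problem that the Gaussian-mixture and posterior-particle-filter variants above are designed to mitigate.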