TACCSTER 2020 Proceedings

Permanent URI for this collection: https://hdl.handle.net/2152/83912


Recent Submissions

  • Item
    A High-Performance Inversion Framework for Brain Tumor Growth Models in Personalized Medicine
    (2020) Subramanian, Shashank; Scheufele, Klaudius; Himthani, Naveen; Biros, George
    The precise characterization of aggressive brain tumors remains a challenging problem due to their highly heterogeneous radiographic and molecular presentation. The integration of mathematical models with clinical imaging data holds enormous promise for developing robust, predictive, and explainable models that quantify cancer growth, with the potential to assist in diagnosis and treatment. In general, such models are parameterized by many unknown parameters, and their estimation can be formally posed as an inverse problem. However, this calibration problem is a formidable task for aggressive brain tumors due to the absence of longitudinal data, resulting in a strongly ill-posed inverse problem. This is further exacerbated by the inherent non-linearity of tumor growth models. Overcoming these difficulties requires sophisticated regularization strategies along with computationally efficient algorithms and software. Towards this end, we introduce a fully automatic inversion framework which provides an entirely new capability to analyze complex brain tumors from a single pretreatment magnetic resonance imaging (MRI) scan. Our framework employs fast algorithms and optimized implementations which exploit distributed-memory parallelism and GPU acceleration to enable reasonable solution times – an important factor for clinical applications. We validate our solver on clinical data and demonstrate its utility in characterizing important biophysics of brain cancer, along with its ability to complement other radiographic information in downstream machine learning tasks.
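    The abstract does not state the governing equations. As a hedged illustration only, a single-species reaction-diffusion (Fisher-Kolmogorov) growth model is common in this literature, and the calibration can then be written as a PDE-constrained inverse problem; the symbols below are illustrative assumptions, not the authors' notation:

        \min_{\kappa,\,\rho,\,c_0}\ \frac{1}{2}\,\| c(T) - d \|_{L^2(\Omega)}^2 + \beta\,\mathcal{R}(c_0)
        \quad \text{subject to} \quad
        \partial_t c - \nabla\cdot(\kappa\nabla c) - \rho\,c\,(1-c) = 0 \ \ \text{in}\ \Omega\times(0,T],

    where c is the tumor concentration, d the tumor observed in the single pretreatment MRI scan, \kappa and \rho diffusion and proliferation coefficients, and \mathcal{R} a regularization term addressing the ill-posedness discussed above.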
  • Item
Negative Grüneisen Parameters at High Pressure in FeTi from Density Functional Theory
    (2020) Munoz, Jorge; Khamala, Bethuel
    Density functional theory (DFT) calculations are a pillar of modern materials physics and quantum chemistry research, and the execution of DFT codes represents a significant fraction of the overall utilization of Texas Advanced Computing Center (TACC) resources. We present results of DFT and DFT-based calculations on FeTi, a brittle intermetallic material that crystallizes in the bcc-based CsCl structure and is stable until it melts at 1600 K. We investigated its electronic band structure and phonon dispersion relations using DFT at different specific volumes and uncovered a volume range in which the majority of the phonon modes decrease in energy or remain unchanged with decreasing volume. This behavior is usually observed in invar materials, but unlike them, FeTi is nonmagnetic and there is negligible change in the Fermi surface with pressure. The behavior occurs more generally in materials that show negative thermal expansion, but unlike most of those materials, the crystal structure of FeTi is not particularly open, and it is stable at high pressure. In this talk we will show ancillary measurements of the phonon density-of-states curves performed via nuclear-resonant inelastic x-ray scattering in a diamond-anvil cell (DAC) at pressures up to 55 GPa, and x-ray diffraction, also in a DAC, at pressures up to 25 GPa, which confirm that the Grüneisen parameters are indeed negative at the predicted specific volumes. We also show an analysis of the calculated force constants, charge densities, and band structures that preliminarily points towards orbital hybridization as the origin of the observed negative Grüneisen parameters.
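    For readers unfamiliar with the quantity in the title, the standard definition of the mode Grüneisen parameter makes the reported behavior precise:

        \gamma_i = -\frac{\partial \ln \omega_i}{\partial \ln V} = -\frac{V}{\omega_i}\,\frac{\partial \omega_i}{\partial V},

    so a phonon mode whose frequency \omega_i decreases (softens) as the volume V decreases has \partial\omega_i/\partial V > 0 and hence \gamma_i < 0, which is exactly the anomaly described above.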
  • Item
    Stochastic SIR-based Examination of the Policy Effects on the COVID-19 Spread in the U.S. States
    (2020) Song, Mina; Belle, Macy; Mendlovitz, Aaron; Han, David
    Since the global outbreak of the novel COVID-19, many research groups have studied the epidemiology of the virus to produce short-term forecasts and to formulate effective disease containment and mitigation strategies. The major challenge lies in the proper assessment of epidemiological parameters over time and of how they are modulated by the effect of any publicly announced interventions. Here we attempt to examine and quantify the effects of various (legal) policies/orders in place to mandate social distancing and to flatten the curve in each of the U.S. states. Through Bayesian inference on stochastic SIR models of the virus spread, the effectiveness of each policy in reducing the magnitude of the growth rate of new infections is investigated statistically. This will inform the public and policymakers, and help them understand the most effective actions to fight against the current and future pandemics. It will aid policymakers in responding more rapidly (selecting, tightening, and/or loosening appropriate measures) to stop or mitigate a pandemic early on.
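    As context, the deterministic skeleton of the SIR model underlying the stochastic formulation is standard; a hedged reading of the policy analysis above is that an intervention shifts the transmission rate \beta when it takes effect (the authors' exact parameterization is not given in the abstract):

        \frac{dS}{dt} = -\frac{\beta S I}{N}, \qquad
        \frac{dI}{dt} = \frac{\beta S I}{N} - \gamma I, \qquad
        \frac{dR}{dt} = \gamma I,

    with N = S + I + R and basic reproduction number R_0 = \beta/\gamma; a policy that flattens the curve appears as a drop in \beta, and hence in the growth rate of new infections.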
  • Item
Optimal Dynamic Treatment Regime by Reinforcement Learning in Clinical Medicine
    (2020) Song, Mina; Han, David
    Precision medicine allows a personalized treatment regime for patients with distinct clinical histories and characteristics. A dynamic treatment regime implements a reinforcement learning algorithm to produce the optimal personalized treatment regime in clinical medicine. The reinforcement learning method is applicable when an agent takes actions in response to a changing environment over time. Q-learning is one of the popular methods to develop the optimal dynamic treatment regime, fitting linear outcome models in a recursive fashion. Despite its ease of implementation and interpretation for domain experts, Q-learning has a certain limitation due to the risk of misspecification of the linear outcome model. Recently, algorithms more robust to model misspecification have been developed. For example, the inverse probability weighted estimator overcomes the aforementioned problem by using a nonparametric model with different weights assigned to the observed outcomes for estimating the mean outcome. On the other hand, the augmented inverse probability weighted estimator combines information from both the propensity model and the mean outcome model. The current statistical methods for producing the optimal dynamic treatment regime, however, allow only a binary action space. In clinical practice, combinations of treatments are sometimes required, giving rise to a multidimensional action space. This study develops and demonstrates a practical way to accommodate a multi-level action space, utilizing currently available computational methods for the practice of precision medicine.
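    A minimal sketch of recursive Q-learning with linear outcome models for a two-stage, binary-action regime may make the procedure concrete. All data, variable names, and the data-generating model below are hypothetical; extending the final maximization over a wider action grid is the multi-level case the abstract targets.

        import numpy as np

        # Hypothetical two-stage data: covariates X1, X2, binary actions A1, A2,
        # and a final outcome Y (illustrative data-generating model only).
        rng = np.random.default_rng(0)
        n = 500
        X1 = rng.normal(size=n)
        A1 = rng.integers(0, 2, n)
        X2 = X1 + rng.normal(size=n)
        A2 = rng.integers(0, 2, n)
        Y = X2 + A2 * (1 - X2) + 0.5 * A1 + rng.normal(size=n)

        def ols(F, y):
            """Least-squares fit of a linear (Q) outcome model."""
            beta, *_ = np.linalg.lstsq(F, y, rcond=None)
            return beta

        # Stage 2: fit Q2(x, a) = b0 + b1*x + b2*a + b3*a*x, then take the best action.
        b = ols(np.column_stack([np.ones(n), X2, A2, A2 * X2]), Y)
        q2 = lambda x, a: b[0] + b[1] * x + b[2] * a + b[3] * a * x
        V2 = np.maximum(q2(X2, 0), q2(X2, 1))  # pseudo-outcome; a maximum over
                                               # more action levels handles the
                                               # multi-level case
        # Stage 1: recurse, regressing the stage-2 pseudo-outcome on stage-1 history.
        b1 = ols(np.column_stack([np.ones(n), X1, A1, A1 * X1]), V2)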
  • Item
    Statistical Perspectives in Teaching Deep Learning from Fundamentals to Applications
    (2020) Kim, Nathan; Han, David
    The use of artificial intelligence, machine learning, and deep learning has gained a lot of attention and become increasingly popular in many areas of application. Historically, machine learning and its theory had strong connections to statistics; however, the current deep learning literature is framed mostly from computer science perspectives and lacks statistical perspectives. In this work, we address this gap and discuss how to teach deep learning to the next generation of statisticians. We first describe some background and how to get students motivated. We discuss the differing terminologies of computer science and statistics, and how deep learning procedures work without getting into the mathematics. In response to the question of what to teach, we address how to organize deep learning content from the statistician's view: from a basic statistical understanding of neural networks to the latest topics in uncertainty quantification for deep learning prediction, which has been studied in Bayesian frameworks. Further, we discuss how to choose computational environments and help students develop programming skills. We also discuss how to design homework that incorporates ideas from experimental design. Finally, we discuss how to expose students to domain knowledge and help build multidisciplinary collaborations.
  • Item
    Using Ancestral Reconstruction of Chromosome Expression States (ARChES) to Understand the Evolution of Dosage Compensation
    (2020) Ramesh, Balan; Demuth, Jeff
    Ohno (1967) originally proposed that the sex difference in X-linked gene dose caused by the decay of Y-linked genes may impose a “peril of hemizygosity” and that regulatory mechanisms must compensate to make X=XX=AA at the level of expression. Recent evidence suggests that Ohno's paradigm is not universal, but our understanding remains unclear because estimating the ancestral expression of X-linked genes is difficult or impossible in many systems. Many studies assess dosage compensation (DC) by comparing X:Autosome expression ratios, thereby implicitly assuming that current average autosomal gene expression (AA) is a good proxy for the average ancestral expression of X-linked genes. A more appropriate test is whether X=XX=Ancestral expression, where “Ancestral” is the inferred expression level of each X-linked gene before it became X-linked. The few studies that have attempted to compare X (or Z) linked gene expression to corresponding ancestral levels have relied on distantly related taxa that include changes in chromosome number and sex-determination system. Here, we study the evolution of dosage compensation by comparing expression of neo-X chromosome genes in Tribolium confusum to their inferred ancestral, autosomal expression state. The ancestral expression is estimated by analyzing RNA-Seq data across a time-calibrated phylogeny that includes four additional closely related species, all of which share an ancestral karyotype in which the neo-X genes of T. confusum remain autosomal. We find that the neo-X in T. confusum is dosage balanced (X=XX) and dosage compensated (X=Ancestral), suggesting a chromosome-wide dosage compensation mechanism as envisioned by Ohno. Further, we observe that DC in T. castaneum, which was previously contentious, is fully balanced and compensated (X=XX=Ancestral). The computational approach to analyzing DC evolution via Ancestral Reconstruction of Chromosome Expression States (ARChES) was developed using TACC and is publicly available. The ARChES workflow is computationally scalable and can be extended to analyze DC in any species.
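    The X=XX=Ancestral test above reduces, per gene, to comparing expression ratios. A hedged sketch of the comparison (synthetic expression values; not the ARChES code, which additionally performs the phylogenetic ancestral-state inference):

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical per-gene expression (e.g., TPM) for 200 neo-X genes:
        male_x    = rng.lognormal(3.0, 0.5, 200)  # hemizygous X (males)
        female_x  = rng.lognormal(3.0, 0.5, 200)  # two-copy X (females)
        ancestral = rng.lognormal(3.0, 0.5, 200)  # inferred ancestral autosomal level

        # Dosage balance tests X vs XX; dosage compensation tests X vs Ancestral.
        balance      = np.median(np.log2(male_x / female_x))
        compensation = np.median(np.log2(male_x / ancestral))
        # Medians near 0 support X = XX (balanced) and X = Ancestral (compensated).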
  • Item
    Quantum Computation, Quantum Algorithms & Implications on Data Science
    (2020) Kim, Nathan; Garcia, Jeremy; Han, David
    Quantum computing is a revolutionary new computing paradigm, first theorized in 1981. It is based on quantum physics and quantum mechanics, which are fundamentally stochastic in nature, with inherent randomness and uncertainty. The power of quantum computing relies on three properties of a quantum bit: superposition, entanglement, and interference. Quantum algorithms are described by quantum circuits, and they are expected to solve decision problems, functional problems, oracular problems, sampling tasks, and optimization problems much faster than classical silicon-based computers. They are expected to have a tremendous impact on current Big Data technology, machine learning, and artificial intelligence. Despite the theoretical and physical advancements, several technological barriers remain for successful applications of quantum computation. In this work, we review the current state of quantum computation and quantum algorithms, and discuss their implications for the practice of data science in the near future. There is no doubt that quantum computing will accelerate the process of scientific discovery and industrial advancement, having a transformative impact on our society.
  • Item
    Andromeda: A Few-body Plane Wave Calculator
    (2020) Jerke, Jonathan; Wu, Jackson; Poirier, Bill; Karwowski, Jacek
    At last year's TACCSTER, we presented a novel method of ours for solving the 3-body lithium problem. That computation did not run to completion: it plateaued at -7.3 (of -7.4) Hartree on an L = 67^9 grid running on a single TACC Lonestar5 node for three months. We have now released a new version of the Andromeda code capable of embarrassingly parallel operation. This improvement followed from a significant speedup of half the process, namely the free and exact creation of the Hamiltonian quantum operators and their operation in Sums of Products form. Even though this does not speed up the vector decomposition process, which is still the rate-limiting step, we can now distribute processing per term-state combination across numerous computational resources to overcome this problem. In particular, any 2-body interaction quantum operator is now a summation of processes defined by separate 1-body matrices for the 2-body diagonal, 1-body diagonal, and off-diagonal aspects of the quantum operation. Thus, every core in a parallel process can individually initialize the Coulombic quantum operator, which allows embarrassingly parallel operations across several state vectors. The current release integrates the TACC Launcher as a vehicle for handling parallel operations. Digitize your wave function with the most local representation of the plane-wave basis. Tackle strongly correlated problems with a Sums-of-Products representation that is separated in its spatial components yet fully multi-body. Compute 3-body quantum physics with a powerful scripting interface. Discover something.
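    The algebraic fact behind the embarrassingly parallel step described above is that a Sums-of-Products operator acts term-by-term on a product state: (A_k ⊗ B_k)(a ⊗ b) = (A_k a) ⊗ (B_k b). A minimal numpy sketch with toy sizes and random matrices (not the Andromeda code) illustrates why term-state combinations can be distributed independently:

        import numpy as np

        n = 8  # grid points per coordinate (illustrative; real grids are far larger)
        rng = np.random.default_rng(2)

        # A rank-1 (product) state on a 2-coordinate grid: psi = a (x) b.
        a, b = rng.normal(size=n), rng.normal(size=n)

        # A 2-body operator in Sums-of-Products form: H = sum_k A_k (x) B_k.
        terms = [(rng.normal(size=(n, n)), rng.normal(size=(n, n))) for _ in range(3)]

        # Each term acts on the product state independently -> embarrassingly
        # parallel: distribute the term-state pairs across cores or nodes.
        result_terms = [(A @ a, B @ b) for A, B in terms]

        # Full result as an explicit vector (feasible only at this toy size).
        psi_out = sum(np.kron(Aa, Bb) for Aa, Bb in result_terms)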
  • Item
    Synthesizing Dense and Colored 3D Point Clouds for Training Deep Neural Networks
    (2020) Arshad, Mohammad Samiul; Beksi, William
    3D point clouds are a compact, homogeneous representation with the ability to capture intricate details of the environment, and they are useful for a wide variety of applications. For example, point clouds can be sampled from the meshes of manually designed objects to use as synthetic data for training deep learning networks. However, the geometry and texture of these point clouds are bounded by the resolution of the modeled objects. To facilitate learning with synthetic 3D point clouds, we present a novel conditional generative adversarial network that creates dense point clouds, with color, in an unsupervised manner. The difficulty of capturing intricate details at high resolutions is handled by a point transformer that progressively grows the network through the use of graph convolutions. Every training iteration evolves a point vector into a point cloud. Experimental results show that our network is capable of learning a 3D data distribution and produces colored point clouds with fine details at multiple resolutions.
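    As a hedged illustration of a graph convolution acting on point features, here is a generic k-nearest-neighbor aggregation layer in PyTorch. This is a sketch of the general technique, not the authors' point transformer architecture; all layer names and sizes are assumptions.

        import torch
        import torch.nn as nn

        class SimpleGraphConv(nn.Module):
            """Aggregate k-nearest-neighbor features, then apply a shared linear map."""
            def __init__(self, in_dim, out_dim, k=8):
                super().__init__()
                self.k = k
                self.lin = nn.Linear(2 * in_dim, out_dim)

            def forward(self, x):              # x: (N, in_dim) point features
                d = torch.cdist(x, x)          # pairwise distances, (N, N)
                idx = d.topk(self.k, largest=False).indices  # kNN (self included)
                neigh = x[idx].mean(dim=1)     # mean neighbor feature, (N, in_dim)
                return torch.relu(self.lin(torch.cat([x, neigh], dim=-1)))

        # Progressive growing, schematically: start from one latent point vector
        # and expand to a denser cloud at each stage (here: replication + noise).
        z = torch.randn(1, 32)                       # a single point vector
        pts = z.repeat(64, 1) + 0.1 * torch.randn(64, 32)
        feats = SimpleGraphConv(32, 6)(pts)          # 6 dims: e.g., xyz + rgb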
  • Item
    Quantitative Study of Unsaturated Transport of Glycerol through Aquaglyceroporin that has High Affinity for Glycerol
    (2020) Rodriguez, Roberto A; Chan, Ruth; Liang, Huiyun; Chen, Liao Y
    The structures of several aquaglyceroporins have been resolved to atomic resolution, showing two or more glycerols bound inside a channel and confirming a glycerol facilitator's affinity for its substrate glycerol. However, the kinetics data of glycerol transport experiments all point to unsaturated transport, which is characteristic of low substrate affinity in terms of Michaelis-Menten kinetics. In this article, we present an in silico-in vitro study focused on AQP3, one of the human aquaglyceroporins, which is natively expressed in the abundantly available erythrocytes. We conducted 2.1 μs in silico simulations of AQP3 embedded in a model erythrocyte membrane with intracellular-extracellular asymmetries in leaflet lipid compositions and compartment salt ions. From the equilibrium molecular dynamics (MD), we elucidated the mechanism of glycerol transport at high substrate concentrations. From the steered MD simulations, we computed the Gibbs free-energy profile throughout the AQP3 channel. From the free-energy profile, we quantified the kinetics of glycerol transport, which is unsaturated due to glycerol-glycerol interaction mediated by AQP3, resulting in the concerted movement of two glycerol molecules for the transport of one glycerol molecule across the cell membrane. We conducted in vitro experiments on glycerol uptake into human erythrocytes for a wide range of substrate concentrations and various temperatures. The experimental data quantitatively validated our theoretical-computational conclusions on the unsaturated glycerol transport through AQP3, which has high affinity for glycerol.
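    Two standard relations frame the apparent paradox above. Michaelis-Menten kinetics predicts a saturable flux, while transition-state theory connects a computed free-energy profile to a transport rate (the prefactor k_0 is model-dependent; this is a schematic pairing, not the paper's exact expressions):

        J([S]) = \frac{J_{\max}\,[S]}{K_M + [S]}, \qquad
        k \approx k_0 \, e^{-\Delta G^{\ddagger}/k_B T},

    so high binding affinity (small K_M) would normally imply saturation already at low substrate concentration [S]; the observed absence of saturation at high [S] is the anomaly that the concerted two-glycerol mechanism explains.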
  • Item
    Loop Current Transport and Dispersal Dynamics in the Gulf of Mexico
    (2020) Stevens, Jessica; Harrison, Cheryl S; Rossi, Vincent; Ser-Giacomi, Enrico; Liu, Yonggang; Weisberg, Robert; Garza, Adrian; Garza, Victoria
    Transport, connectivity, and dispersal both within and beyond the Gulf of Mexico affect important processes such as the spread of organisms and pollutants. The Loop Current is a key flow feature within the Gulf of Mexico that affects transport. This research pairs network theory with Lagrangian oceanographic modeling (Lagrangian Flow Networks, LFN) to study connectivity within the Gulf as a function of the Loop Current state. Surface-following particles are used to simulate Lagrangian transport over the observational record, using a HYCOM ocean model reanalysis coupled with the LFN particle tracking model computed on TACC supercomputers. The particle simulations are used to determine regions of connectivity, or hydrodynamic provinces, as a function of the Loop Current state, using machine learning. These provinces inform us about variability in biological and pollutant transport, such as larval connectivity and harmful algal blooms.
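    At the core of a Lagrangian Flow Network is a transition matrix built from particle start and end positions. A minimal sketch under assumed inputs (random bin indices stand in for binned particle trajectories; not the authors' pipeline):

        import numpy as np

        # Hypothetical binned trajectories: start/end grid-cell index per particle.
        nbins = 100
        rng = np.random.default_rng(3)
        start = rng.integers(0, nbins, 10_000)
        end = rng.integers(0, nbins, 10_000)

        # Row-normalized transition (connectivity) matrix P[i, j]: probability
        # that a particle released in cell i arrives in cell j.
        P = np.zeros((nbins, nbins))
        np.add.at(P, (start, end), 1.0)
        rowsum = P.sum(axis=1, keepdims=True)
        P = np.divide(P, rowsum, out=np.zeros_like(P), where=rowsum > 0)
        # Hydrodynamic provinces can then be identified by clustering or
        # community detection on the network defined by P.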
  • Item
    uPredict: A User-Level Profiler-Based Predictive Framework in Multi-Tenant Clouds
    (2020) Moradi, Hamidreza; Wang, Wei; Fernandez, Amanda; Zhu, Dakai
    Clouds have been widely adopted by many organizations for their support of flexible resource demands at low cost, which is normally achieved by sharing the underlying hardware among multiple cloud tenants. However, such sharing, with changing resource contention in virtual machines (VMs), can result in large variations in the performance of cloud applications, which makes it difficult for ordinary cloud users to estimate the run-time performance of their applications. We propose online learning methodologies for performance modeling and prediction of applications that run repetitively on multi-tenant clouds (such as online data analytic tasks). Here, a few micro-benchmarks are utilized to probe the in-situ perceivable performance of the CPU, memory, and I/O components of the target VM. Then, based on such profiling information and in-place measurements of the application's performance, predictive models can be derived with either regression or neural-network techniques. In particular, to address changes over time in the intensity of a VM's resource contention and their effects on the target application, we propose periodic model retraining, where a sliding-window technique is exploited to control the frequency of retraining and the historical data used for it. Moreover, a progressive modeling approach has been devised where the regression and neural-network models are gradually updated to better adapt to recent changes in resource contention. Considering 17 representative applications from the PARSEC, NAS Parallel, and CloudSuite benchmarks, we extensively evaluated the proposed online schemes for the prediction accuracy of the resulting models and the associated overheads on both private and public clouds. The evaluation results show that, with the neural-network progressive models, the average prediction error on public clouds is less than 4% under our proposed online schemes.
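    A minimal sketch of the sliding-window retraining idea, assuming a synthetic stream of micro-benchmark probe scores and measured runtimes (variable names and the linear data model are illustrative, not uPredict's implementation):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical stream: probe scores (CPU, memory, I/O) and the measured
        # application runtime for each repetition on the multi-tenant VM.
        rng = np.random.default_rng(4)
        probes = rng.uniform(0.5, 1.5, size=(200, 3))
        runtime = probes @ np.array([2.0, 1.0, 3.0]) + rng.normal(0, 0.1, 200)

        window = 50  # sliding window: retrain only on the most recent measurements
        model = LinearRegression()
        preds = []
        for t in range(window, 200):
            model.fit(probes[t - window:t], runtime[t - window:t])
            preds.append(model.predict(probes[t:t + 1])[0])

        err = np.abs(np.array(preds) - runtime[window:]) / runtime[window:]
        print(f"mean relative prediction error: {err.mean():.3%}")

    A neural-network variant would swap in, e.g., sklearn's MLPRegressor; the window length trades adaptation speed against training-set size.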
  • Item
    Climate Change Impacts on Hurricane Storm Surge Inundation in the Coastal United States
    (2020) Camelo, Jeane; Mayo, Talea L; Gutmann, Ethan D
    The properties of hurricanes directly influence storm surges; however, the implications of climate change for surge are unclear. Here, we simulate the storm surges of historical storms under present-day and end-of-century climate scenarios to assess the impact of climate change on storm surge inundation. We simulate 21 storms that impacted the Gulf of Mexico and Atlantic coasts of the continental United States from 2000-2013. We find that the volume of inundation increases, with an average change across all storms of +36%. The extent of inundation increases for 13 storms, with an average change across all storms of +25%. Notable increases in inundation occur near Texas, Mississippi, the Gulf Coast of Florida, the Carolinas, Virginia, and New York. Our calculations of inundation volume and extent suggest that at the end of the century, we can expect hurricanes to produce larger storm surge magnitudes in concentrated areas, as opposed to widespread surges of lower magnitude. We examine changes in maximum wind speed, minimum central pressure, translation speed, and the radius of the 33 m/s wind to assess the impacts of individual storm characteristics on storm surge. We find that no single storm characteristic directly relates to storm surge inundation or its climate-induced changes. Even when all the parameters are considered together, the resulting influences are difficult to anticipate. This is likely due to the complexity of the hydrodynamics and interactions with local geography. This illustrates that even as climate change research advances and more is known about projected impacts on hurricanes, implications for storm surge will be difficult to predict without explicit numerical simulation.
  • Item
    Modeling Breast Cancer Patient, Communication and Treatment Factors in Discontinuation of Daily Adjuvant Treatment Over Time
    (2020) Lyman, C; Shinn, E; Busch, B; Toole, T; Richman, S; Broderick, G
    Introduction. Despite the substantial survival benefit of daily oral endocrine therapy (ET) for estrogen receptor positive breast cancer, 50% of patients discontinue ET before completing 5 years. In order to identify patients at risk for early discontinuation, models are needed that account for the varying effects of factors over time. Methods. A decisional logic model representing possible causal interactions between 19 psychological, demographic, and treatment factors and adherence was assembled based on expert knowledge and tested against data from 83 patients assessed annually for over 3 years. Competing sets of logical rules supporting compliance with observed behaviors, as well as regulatory parsimony, were identified by solving a Constraint Satisfaction Problem (CSP). Repeated simulations were then conducted to identify risk factors and interventional strategies for early discontinuation of therapy in the presence of biological noise. Results. CSP identified 96 decisional models explaining up to 75% of discontinuation trajectories. These competing models collectively supported 15 stable attractor profiles, characterized by low perceived risk of cancer recurrence, a low level of cancer worry, a high level of generalized anxiety, and poor quality of life (QoL) ratings. Answer Set Programming (ASP) applied to each of the 15 non-adherent profiles identified 6 interventional strategies restoring adherence in 13 of these profiles. These involved reducing general anxiety and reinforcing pill-taking strategies and trust in the medical system. Neither education, age, nor household income consistently characterized these 13 steady states. Concurrently increasing health literacy in patients with lower education levels was predicted to improve the efficacy of the aforementioned strategies. Conclusions. These analyses suggest that discontinuation trajectories over time can be predicted from behavioral constructs. Improved understanding of the mechanisms mediating perceived risk of recurrence and low QoL ratings specifically could further improve model fidelity. Potential interventional targets include general anxiety, trust in the medical system, and strengthened routines.
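    A toy sketch of the constraint-satisfaction idea: enumerate candidate Boolean rule sets and keep those consistent with observed trajectories. This brute-force illustration uses three hypothetical factors and hand-made observations; the study's actual 19-factor model is solved with dedicated CSP/ASP machinery.

        from itertools import product

        # Each candidate rule maps three Boolean factors (anxiety, worry, trust)
        # to an adherence decision; observations are hypothetical.
        observations = [
            # (anxiety, worry, trust) -> adhered?
            ((True,  False, False), False),
            ((False, True,  True),  True),
            ((False, False, True),  True),
        ]

        consistent = []
        for table in product([False, True], repeat=8):  # all 2^8 Boolean functions
            rule = lambda a, w, t, tab=table: tab[4 * a + 2 * w + t]
            if all(rule(*x) == y for x, y in observations):
                consistent.append(table)
        print(f"{len(consistent)} rule sets satisfy the constraints")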
  • Item
    Finding Expressed Mutations in Multiple Myeloma Cell Lines
    (2020) Richardson, Jensen; Pritha, Jafrin; Jiang, Wenxuan; Prasad, Rohit; Arasappan, Dhivya; Kowalski-Muegge, Jeanne
    Neoantigens are newly formed peptides, created from somatic mutations, that are capable of inducing tumor-specific T-cell recognition. Prediction of these neoantigens can lead to personalized immunotherapies for the treatment of cancers. Identification of expressed somatic mutations using next-generation sequencing data is a crucial first step in neoantigen prediction. Because of the expansion of next-generation sequencing data, there exists a plethora of tools designed to sift through these data and return high-quality Single Nucleotide Variants (SNVs) and small insertions and deletions (indels); however, it is essential to select tools that are flexible, efficient, and, above all, accurate at detecting these mutations. Using RNA sequencing combined with whole exome sequencing data from 71 Human Multiple Myeloma cell lines (HMCLs), we compared different variant calling tools to develop a workflow for identifying expressed mutations. The use of well-characterized HMCLs with known SNVs and indels enables us to compare the accuracy of each variant calling tool. Thus far, we have compared the accuracy and efficiency of VarScan's simple variant calling pipeline to GATK's fully encompassing pipelines for exome and RNA-Seq data, and we have incorporated post-filtering, annotation, and visualization of found variants into our workflow. Because of the large number of HMCLs and the several steps required by and specific to each pipeline, we used Lonestar5 to parallelize our processing of the data. Our completed workflow will provide a standardized means of identifying expressed mutations in tumors.
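    Accuracy comparisons of this kind typically reduce to set operations between called variants and a truth set. A minimal sketch with hypothetical variants keyed by (chromosome, position, ref, alt):

        # Hypothetical truth-set and called variants for one cell line.
        truth  = {("chr1", 10042, "A", "G"), ("chr2", 887, "C", "T"), ("chr3", 55, "G", "A")}
        called = {("chr1", 10042, "A", "G"), ("chr2", 887, "C", "T"), ("chr9", 12, "T", "C")}

        tp = len(called & truth)               # true positives
        precision = tp / len(called)
        recall = tp / len(truth)
        f1 = 2 * precision * recall / (precision + recall)
        print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")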
  • Item
    Detecting Structural Variants in Multiple Myeloma Cell Lines using Whole Exome Sequencing
    (2020) Nanduri, Rahul; Pugalenthi, Lokesh; Hong, Raymond; Prasad, Rohit K; Arasappan, Dhivya; Kowalski-Muegge, Jeanne
    Whole exome sequencing (WES) is a targeted sequencing technique that sequences only the protein-coding regions of the genome. As WES has superior cost-effectiveness compared to whole genome sequencing (WGS), it has become a respected tool for identifying small genetic variants underlying diseases. However, it is less commonly used to identify large-scale structural variants (SVs), which, because of their size and complexity, are more difficult to detect using short-read sequencing data. SVs are genome alterations spanning 50 or more base pairs and have been linked to the onset or progression of certain diseases, such as Multiple Myeloma (MM). Multiple bioinformatics tools are available for the identification of structural variants from genomic data; however, it is important to benchmark their accuracy and efficiency, particularly in the context of WES data. Using WES data from 71 Human Multiple Myeloma Cell Lines (HMCLs), we benchmarked three established SV identification tools (Delly, Pindel, and Smoove) by comparing their results to the known structural variants in each cell line. We used an SV visualization tool, svviz, and developed our own visualization scripts to examine output features, such as the distribution of base-pair lengths and the types of structural variants detected, and performance metrics, such as run time. We utilized the Texas Advanced Computing Center (TACC) to run our workflow on all HMCLs in parallel. These SV identification tools each possess unique strengths and weaknesses, so they will be combined (along with filtering and visualization of SVs) to create a robust workflow for identifying novel structural variants in HMCLs, which can then be extended to patient tumors.
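    Unlike SNVs, SV calls rarely match a truth set exactly, so benchmarks commonly count two calls as concordant when their intervals reciprocally overlap. A hedged sketch of that matching rule (the 50% threshold and the example coordinates are illustrative assumptions):

        def reciprocal_overlap(a, b, threshold=0.5):
            """True if two SV intervals (start, end) on the same chromosome overlap
            by at least `threshold` of each interval's length."""
            start, end = max(a[0], b[0]), min(a[1], b[1])
            ov = max(0, end - start)
            return ov >= threshold * (a[1] - a[0]) and ov >= threshold * (b[1] - b[0])

        # Hypothetical deletion calls from two tools on the same chromosome:
        delly_call  = (1_000_000, 1_050_000)
        pindel_call = (1_010_000, 1_048_000)
        print(reciprocal_overlap(delly_call, pindel_call))  # True: the calls agree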
  • Item
    TIP3P and TIP4P in Molecular Dynamics Simulations of Erythrocyte Aquaporins
    (2020) Falato, Michael; Chen, Liao Y; Chan, Ruth; Liang, Huiyun
    Membrane protein simulation is vital because of its broad implications for the study of biological processes. Because of this, curating highly accurate simulations that agree with in vitro experiments is of paramount importance. In membrane protein simulations concerning the permeability of a protein to water, implementing the optimal water model and lipid bilayer constituents is a naturally arising concern. While the TIP3P water model is conventionally used in molecular dynamics because its parameters are optimized for protein-protein interactions, many doubt the accuracy of simulations using TIP3P, given that the TIP4P water model outperforms TIP3P in certain regards. For instance, TIP4P yields considerably more accurate predictions of bulk water diffusion constants. In addition to the choice of water model, membrane composition plays an important role. For instance, is it necessary to model the bilayer in accordance with experimentally validated compositions, or is it acceptable to use a bilayer consisting of one lipid type, as is done in the present literature? In our investigation, we sought to determine which of these two water models is better suited to permeability predictions for erythrocyte aquaporins, particularly AQP1 and AQP3. Concurrently, since TIP3P is refined for water-protein interactions and TIP4P for water-water interactions, the model that provides the most accurate predictions could indicate the type of interactions on which permeability depends the most. We also sought to determine the necessity of implementing more accurate membrane models. We conducted molecular dynamics simulations of AQP1 and AQP3 systems with these variant water models and lipid compositions, using transition state theory to calculate the permeability, along with cell swelling assays to validate our results. Our results show that employing TIP3P with a more accurate lipid composition is preferable to using TIP4P or a single lipid type in the membrane.
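    One concrete benchmark mentioned above is the bulk water diffusion constant, on which TIP4P outperforms TIP3P. A minimal sketch of how such a constant is extracted from a trajectory via the Einstein relation, D = MSD(lag) / (6 · lag · dt) in 3-D; synthetic random-walk data stand in for real MD output, and the units are illustrative:

        import numpy as np

        rng = np.random.default_rng(5)
        dt = 1e-3                                           # time per frame
        steps = rng.normal(0.0, 0.05, size=(5000, 100, 3))  # frames x atoms x xyz
        pos = np.cumsum(steps, axis=0)                      # toy trajectory

        lag = 1000
        msd = np.mean(np.sum((pos[lag:] - pos[:-lag]) ** 2, axis=-1))
        D = msd / (6 * lag * dt)
        print(f"estimated D = {D:.3f} (model units)")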
  • Item
    Unconditionally Stable Space-Time Finite Element Method for the Shallow Water Equations
    (2020) Valseth, Eirik; Dawson, Clint
    We introduce the automatic variationally stable finite element (AVS-FE) method [1, 3] for the shallow water equations (SWE). The AVS-FE method uses a first-order system integral formulation of the underlying partial differential equations (PDEs) and, in the spirit of the discontinuous Petrov-Galerkin (DPG) method of Demkowicz and Gopalakrishnan [2], employs the concept of optimal test functions to ensure discrete stability. The AVS-FE method distinguishes itself by using globally conforming FE trial spaces, e.g., H1(Ω) and H(div,Ω), and their broken counterparts for the test spaces. The broken topology of the test spaces allows us to compute numerical approximations of the local restrictions of the optimal test functions in a completely decoupled fashion, i.e., element-by-element. The test functions can be computed with sufficient numerical accuracy by using the same local p-level as applied to the trial space. The unconditional discrete stability of the method allows straightforward implementation of transient problems in existing FE solvers such as FEniCS. Furthermore, the AVS-FE method comes with a built-in a posteriori error estimate as well as element-wise error indicators, allowing us to perform mesh-adaptive refinements in both space and time. The application of this method to complex physical domains requires large FE meshes, leading to significant computational costs. However, since the computations of the optimal test functions and the element-wise error indicators are all local, the method is an excellent candidate for parallel processing. We show numerical verifications for the SWE, utilizing the built-in error indicators to drive mesh-adaptive refinements.
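    In the notation common to the DPG literature (the specific symbols here are ours, not the abstract's), each optimal test function v_\varphi is the Riesz representative of the bilinear form b(\cdot,\cdot) acting on a trial basis function \varphi:

        ( v_\varphi, \delta v )_{V(\mathcal{P}_h)} = b(\varphi, \delta v)
        \qquad \forall\, \delta v \in V(\mathcal{P}_h),

    and because the test space V(\mathcal{P}_h) is broken across the mesh \mathcal{P}_h, this auxiliary Riesz problem decouples and can be solved element-by-element, which is precisely the locality that the parallelization argument above relies on.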