Chasing sparsity: from model, to algorithm, to science

Date

2023-03-02

Authors

Chen, Tianlong

Abstract

Sparsity is commonly produced by model compression (i.e., pruning), which eliminates unnecessary parameters. Beyond improved resource efficiency, sparsity also serves as an important tool for modeling the underlying low dimensionality of neural networks and for understanding their generalization, optimization dynamics, implicit regularization, expressivity, and robustness. Meanwhile, appropriate sparsity-aware priors help deep neural networks achieve significantly enhanced performance in algorithms and systems. This dissertation studies sparsity from two intertwined perspectives: (i) efficient and reliable sparsity and (ii) sparsity for science.

In the first part of this thesis (chapters 2 and 3), we present efforts devoted to improving the resource efficiency and few-shot generalization of machine learning (ML) algorithms and systems. In particular, we introduce a form of high-quality sparsity that transfers universally across diverse downstream tasks, amortizing the massive cost of sparsity finding and providing an efficient alternative to the dense counterpart without performance degradation. Moreover, we show the implicit regularization effects of the structured prior encoded in sparse neural networks, as demonstrated by substantial data-efficiency improvements in image generation.

In the second part of this thesis (chapters 4 and 5), we exploit adequate forms of sparsity in two challenging interdisciplinary scientific problems. (1) Protein engineering: we develop sparsity-regularized deep learning pipelines that tackle data scarcity and the distribution discrepancy among diverse proteins, in order to better model protein thermostability and suggest positive mutations. (2) Quantum computing: to reduce the negative effects of noise arising from redundant quantum gates and poor circuit designs, we leverage dynamic sparse exploration to produce lightweight, noise-resistant quantum circuits, closing the performance gap between quantum simulation and real quantum computing (QC) hardware.
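The pruning mentioned at the start of the abstract is, in its simplest form, unstructured magnitude pruning: zero out the fraction of weights with the smallest absolute values. The sketch below is a minimal NumPy illustration of that standard baseline, not the dissertation's own method; the function name `magnitude_prune` and all parameters are ours.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude `sparsity` fraction of entries.

    Returns the pruned weights and the binary mask (1 = kept, 0 = pruned).
    Illustrative only: a standard magnitude-pruning baseline.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of entries to prune
    if k == 0:
        mask = np.ones_like(weights)
    else:
        # k-th smallest magnitude; everything at or below it is pruned
        threshold = np.partition(flat, k - 1)[k - 1]
        mask = (np.abs(weights) > threshold).astype(weights.dtype)
    return weights * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
W_pruned, mask = magnitude_prune(W, 0.9)  # keep ~10% of the weights
```

In practice such a mask is applied per layer or globally across a network, and the surviving weights are then fine-tuned or retrained; the "sparsity finding" cost the abstract refers to comes from repeating this prune-retrain cycle.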
