A prototype-oriented framework for deep transfer learning applications




Tanwisuth, Korawat




Deep learning models achieve state-of-the-art performance in many applications but often require large-scale data. Deep transfer learning studies the ability of deep learning models to transfer knowledge from source tasks to related target tasks, enabling data-efficient learning. This dissertation develops novel methodologies that tackle three transfer learning applications for deep learning models: unsupervised domain adaptation, unsupervised fine-tuning, and source-private clustering. The key idea behind the proposed methods is to minimize the distributional discrepancy between class prototypes and target data under an optimal transport framework. For each scenario, we design our algorithm to suit different data and model requirements. In unsupervised domain adaptation, we leverage the source-domain data to construct class prototypes and minimize the transport cost between the prototypes and the target data. In unsupervised fine-tuning, we apply our framework to prompt-based zero-shot learning, adapting large pre-trained models directly on the target data and bypassing the source-data requirement. In source-private clustering, we combine a knowledge-distillation framework with our prototype-oriented clustering to address data and model privacy. All three approaches show consistent performance gains over the baselines.
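The central step shared by the three methods — matching class prototypes to target data by minimizing a transport cost — can be illustrated with a minimal sketch. The code below is not the dissertation's implementation; it is a toy entropic optimal-transport (Sinkhorn) computation between K prototype vectors and N target feature vectors, with uniform marginals assumed on both sides for simplicity. The function name `sinkhorn_transport` and all hyperparameters are illustrative choices.

```python
import numpy as np

def sinkhorn_transport(prototypes, targets, eps=0.1, n_iters=200):
    """Entropic optimal transport between K class prototypes and N target features.

    Returns the transport plan P (K x N) and the expected transport cost
    sum_ij P_ij * C_ij. Uniform marginals are assumed (an illustrative choice).
    """
    K, N = len(prototypes), len(targets)
    # Cost matrix: squared Euclidean distance between each prototype and target.
    C = ((prototypes[:, None, :] - targets[None, :, :]) ** 2).sum(-1)
    Kmat = np.exp(-C / eps)                          # Gibbs kernel
    a = np.full(K, 1.0 / K)                          # prototype marginal
    b = np.full(N, 1.0 / N)                          # target marginal
    u, v = np.ones(K), np.ones(N)
    for _ in range(n_iters):                         # Sinkhorn fixed-point updates
        u = a / (Kmat @ v)
        v = b / (Kmat.T @ u)
    P = u[:, None] * Kmat * v[None, :]               # transport plan
    return P, float((P * C).sum())

# Toy demo: two tight clusters of target features and two matching prototypes.
rng = np.random.default_rng(0)
targets = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
                     rng.normal(5.0, 0.1, (20, 2))])
prototypes = np.array([[0.0, 0.0], [5.0, 5.0]])
P, cost = sinkhorn_transport(prototypes, targets)
```

In a full method, the transport cost would serve as a differentiable training objective (e.g., backpropagated through the feature extractor or the prototypes), whereas this sketch only evaluates it on fixed vectors.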


