Full metadata
Title
Making the Best of What We Have: Novel Strategies for Training Neural Networks under Restricted Labeling Information
Description
Recent advancements in computer vision models have largely been driven by supervised training on labeled data. However, labeling datasets remains both costly and time-intensive. This dissertation focuses on improving the performance of deep neural networks when labeling information is limited or entirely absent. I address this challenge through four primary methodologies: domain adaptation, self-supervision, input regularization, and label regularization. When labels are unavailable for the target dataset but a similar labeled dataset exists, domain adaptation is a valuable strategy for transferring knowledge from the labeled source dataset to the unlabeled target dataset. This dissertation introduces three novel domain adaptation methods that operate at the pixel, feature, and output levels. Another approach to the absence of labels is a novel self-supervision technique tailored to training Vision Transformers to extract rich features.
The third and fourth approaches focus on scenarios where only a limited amount of labeled data is available. In such cases, I present novel regularization techniques designed to mitigate overfitting by modifying the input data and the target labels, respectively.
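For readers unfamiliar with input- and label-level regularization, the following is a minimal illustrative sketch of the general idea only, using the well-known mixup and label-smoothing formulations as stand-ins; these are not the novel techniques proposed in the dissertation, and all names and parameters below are assumptions for illustration.

```python
# Generic sketch of input- and label-level regularization (mixup and label smoothing),
# shown only to illustrate the concepts; NOT the dissertation's proposed methods.
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=np.random.default_rng()):
    """Input regularization: train on convex combinations of two examples."""
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1 - lam) * x2   # blended input image
    y = lam * y1 + (1 - lam) * y2   # correspondingly blended one-hot label
    return x, y

def label_smoothing(y_onehot, eps=0.1):
    """Label regularization: soften hard targets to discourage over-confident fits."""
    num_classes = y_onehot.shape[-1]
    return (1 - eps) * y_onehot + eps / num_classes

# Tiny usage example with random data (2 classes, 4x4 "images").
x1, x2 = np.random.rand(4, 4), np.random.rand(4, 4)
y1, y2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x_mix, y_mix = mixup(x1, y1, x2, y2)
y_soft = label_smoothing(y1)
```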
Date Created
2024
Contributors
- Chhabra, Sachin (Author)
- Li, Baoxin (Thesis advisor)
- Venkateswara, Hemanth (Committee member)
- Yang, Yezhou (Committee member)
- Wu, Teresa (Committee member)
- Yang, Yingzhen (Committee member)
- Arizona State University (Publisher)
Extent
170 pages
Language
eng
Copyright Statement
In Copyright
Peer-reviewed
No
Open Access
No
Handle
https://hdl.handle.net/2286/R.2.N.193841
Level of coding
minimal
Note
Partial requirement for: Ph.D., Arizona State University, 2024
Field of study: Computer Science
System Created
- 2024-05-07 05:33:32
System Modified
- 2024-05-07 05:33:38