Browsing Theses by Subject "model compression"
- Compression and Analysis of Pre-trained Language Model using Neural Slimming
  (University of Waterloo, 2022-08-19) In recent years, neural networks have become powerful tools for decision making and for solving complex problems. In the domain of natural language processing, BERT and its variants significantly outperform other network ...
- Fair Compression of Machine Learning Vision Systems
  (University of Waterloo, 2023-09-01) Model pruning is a simple and effective method for compressing neural networks. By identifying and removing the least influential parameters of a model, pruning is able to transform networks into smaller, faster networks ...
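The entry above describes pruning as identifying and removing a model's least influential parameters. As an illustration only (a generic magnitude-pruning sketch, not the fairness-aware compression method studied in that thesis), the PyTorch snippet below zeroes the 50% of weights with the smallest absolute value in a toy model; the architecture and the sparsity level are arbitrary assumptions.

```python
# Minimal magnitude-pruning sketch: weights with the smallest absolute values
# are treated as "least influential" and zeroed out.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model standing in for a vision network (assumed for illustration).
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero the 50% of weights with the smallest magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the zeroed weights permanent

# Report the fraction of parameters that are now exactly zero.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.2%}")
```

In practice, pruned models are usually fine-tuned afterwards to recover accuracy, and structured variants remove whole neurons or channels so the network actually runs faster rather than just being sparser.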
- Towards Effective Utilization of Pretrained Language Models — Knowledge Distillation from BERT
  (University of Waterloo, 2020-09-02) In the natural language processing (NLP) literature, neural networks are becoming increasingly deep and complex. Recent advances in neural NLP include large pretrained language models (e.g., BERT), which lead to ...
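The thesis above centers on knowledge distillation from BERT. As a rough, generic sketch (not the specific procedure used in that work), the loss below combines a temperature-scaled KL-divergence term against a teacher's softened predictions with ordinary cross-entropy on the labels; the temperature, the weighting, and the toy tensors are assumptions made for illustration.

```python
# Generic knowledge-distillation loss: the student is trained to match the
# teacher's softened output distribution in addition to fitting hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-scaled distributions,
    # scaled by T^2 so its gradient magnitude is comparable to the hard term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random tensors standing in for the teacher and student outputs.
student_logits = torch.randn(8, 3)
teacher_logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
print(distillation_loss(student_logits, teacher_logits, labels).item())
```

In a BERT setting, the teacher logits would come from a fine-tuned BERT model and the student would be a much smaller network trained on the same inputs.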