Browsing Theses by Subject "pre-trained language model"
Now showing items 1-2 of 2
- AfriBERTa: Towards Viable Multilingual Language Models for Low-resource Languages
  (University of Waterloo, 2022-08-29) There are over 7000 languages spoken on earth, but many of these languages suffer from a dearth of natural language processing (NLP) tools. Multilingual pretrained language models have been introduced to help alleviate ...
- Compression and Analysis of Pre-trained Language Model using Neural Slimming
  (University of Waterloo, 2022-08-19) In recent years, neural networks have become powerful tools for decision making and for solving complex problems. In the domain of natural language processing, BERT and its variants significantly outperform other network ...