MSc Thesis Proposal: Optimizing pretrained language models for sentiment analysis by Prithvi Rao Muthineni

Monday, December 11, 2023 - 13:00

The School of Computer Science is pleased to present…

Optimizing pretrained language models for sentiment analysis

MSc Thesis Proposal by:

Prithvi Rao Muthineni


Date: Monday December 11, 2023

Time: 1:00 pm – 2:00 pm

Location: Essex Hall, Room 122



The field of Natural Language Processing (NLP) has witnessed rapid advancements, resulting in sophisticated models such as BERT, GPT, and RoBERTa. Despite their exceptional language-understanding capabilities, the computational demands of training and deploying these models pose challenges in GPU utilization and training time. RoBERTa, with its robust optimization strategies, stands out as an efficient and adaptable candidate for addressing these computational challenges. This research pioneers an approach involving weight pruning and selective training to optimize RoBERTa. By freezing essential weights and selectively training redundant parameters, the study aims to enhance GPU efficiency and reduce training time, thereby contributing to the practicality of deploying RoBERTa models in real-world applications. The outcomes hold promise for advancing natural language processing, although certain limitations related to dataset specificity and task variation are acknowledged.
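The proposal itself includes no code, but the core idea of weight pruning, keeping only the largest-magnitude parameters and masking out the rest, can be sketched in a few lines. The function name `magnitude_prune_mask`, the `sparsity` parameter, and the choice of magnitude-based pruning are illustrative assumptions, not the author's actual method:

```python
import numpy as np

def magnitude_prune_mask(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a boolean mask that keeps the largest-magnitude weights.

    sparsity is the fraction of weights to prune (set to zero).
    This is a generic magnitude-pruning sketch, not the specific
    scheme studied in the thesis.
    """
    k = int(weights.size * sparsity)  # number of weights to prune
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

# Example: prune half of a small weight matrix
w = np.array([[0.10, -2.00],
              [0.50,  0.01]])
mask = magnitude_prune_mask(w, sparsity=0.5)
pruned_w = w * mask  # small-magnitude entries zeroed out
```

In a training loop, the same mask could also be applied to the gradients (`grad *= mask`) so that pruned positions stay frozen at zero while the remaining parameters continue to train.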


Keywords: Natural Language Processing (NLP); Sentiment Analysis; Model Optimization


Thesis Committee:

Internal Reader: Dr. Olena Syrotkina

External Reader: Dr. Mohammad Haasanzadeh

Advisor: Dr. Robin Gras
