THIS IS AN ONLINE TEAMS PRESENTATION
Abstract:
Dynamic networks arise in many real-world domains such as communication systems, citation networks, and social media platforms. Predicting how relationships evolve in these networks is known as Dynamic Link Prediction (DLP). While recent graph neural networks and transformer-based models capture structural and temporal patterns, most approaches rely primarily on network topology and overlook the semantic information associated with nodes.
This research investigates methods for integrating semantic information from node-associated text with structural and temporal graph learning to improve dynamic link prediction. The work includes a survey of existing DLP architectures, a benchmarking study comparing neural DLP models, and the proposed Text2Edge framework, a language-aware temporal graph transformer that combines semantic embeddings from pretrained language models with dynamic graph representations. The goal is to develop more robust and generalizable models for evolving networks.
Keywords:
Dynamic Link Prediction, Graph Transformers, Temporal Graph Learning, Large Language Models
Thesis Committee:
Internal Reader: Dr Dan Wu
Internal Reader: Dr Jianguo Lu
External Reader: Dr Narayan Kar
Advisor: Dr Ziad Kobti