Understanding long range dependency in temporal graphs - BC-889
Project type: Research
Desired discipline(s): Engineering - computer / electrical, Engineering, Computer science, Mathematical Sciences, Mathematics
Company: Mastercard AI Garage
Project Length: 6 months to 1 year
Preferred start date: 05/01/2024
Language requirement: English
Location(s): BC, Canada
No. of positions: 1
Desired education level: College, Undergraduate/Bachelor, Master's, PhD, Postdoctoral fellow, Recent graduate
Open to applicants registered at an institution outside of Canada: Yes
About the company:
We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation and delivers better business results. At AI Garage, we use state-of-the-art AI techniques to solve some of the most important problems in the financial world.
Describe the project:
This project aims to address the challenges associated with long-range dependencies in temporal graphs, building upon the foundation of Graph Neural Networks (GNNs). GNNs have demonstrated significant potential in graph representation learning by employing a message-passing mechanism between neighboring nodes within a graph, allowing richer graph representations to accumulate across multiple layers. However, this conventional approach suffers from two primary limitations: excessive compression of information into fixed-size node representations (often termed over-squashing) and an inability to capture long-distance connections or dependencies within the graph.
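The message-passing scheme described above, and the hop-per-layer limitation it implies, can be illustrated with a minimal NumPy sketch (the function name, graph, and dimensions here are illustrative, not part of the project):

```python
import numpy as np

def message_passing_layer(H, A, W):
    # One round of message passing: each node takes the mean of its
    # neighbors' features (rows of H selected via adjacency A), then
    # applies a shared linear map W followed by a ReLU nonlinearity.
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ H) / np.maximum(deg, 1.0)
    return np.maximum(agg @ W, 0.0)

# Toy path graph 0-1-2-3: information travels one hop per layer,
# so k layers can never connect nodes more than k hops apart.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.5, 0.5]])
W = np.eye(2)
out = message_passing_layer(H, A, W)
```

After a single layer, perturbing node 3's features changes the representation of its neighbor (node 2) but leaves node 0 untouched; capturing that distant dependency would require stacking as many layers as there are hops, which is exactly the long-range limitation at issue.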
To overcome these limitations, some researchers have introduced the concept of "global attention," which enables models to consider information from distant parts of the graph. However, this comes at the cost of increased computational complexity, particularly as graph size grows. Furthermore, long-range dependencies in temporal graphs introduce an additional layer of complexity, as these dependencies can be both spatial, based on the number of hops, and temporal, reflecting changes over time.
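The computational trade-off of global attention can be made concrete with a small sketch (a generic dense-attention layer under illustrative names and dimensions, not a specific method from the references): the score matrix is n x n in the number of nodes, so cost grows quadratically with graph size.

```python
import numpy as np

def global_attention(H, Wq, Wk, Wv):
    # Every node attends to every other node: the score matrix is
    # n x n, so time and memory grow quadratically with graph size.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over all nodes
    return weights @ V

rng = np.random.default_rng(0)
n, d = 5, 3
H = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = global_attention(H, Wq, Wk, Wv)
```

Unlike the local message-passing update, a single such layer lets distant nodes exchange information directly, at the price of the quadratic all-pairs computation noted above.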
This research project aims to fill this crucial gap by investigating methods to address long-range dependencies in temporal graphs. Such solutions could have far-reaching implications, particularly in applications such as recommendation systems, where the underlying graphs exhibit inherent temporal dynamics with long-range dependencies, and in financial fraud prediction, where fraud patterns often manifest long-term trends and seasonality. While several existing works [1,2,3,4] have attempted to tackle long-range dependency issues in static graphs, their applicability to temporal graphs remains unexplored. This project intends to bridge this gap by considering the temporal aspect of dependencies within the context of temporal GNNs, contributing to advancements in graph-based modeling for dynamic data.
[1] A Generalization of ViT/MLP-Mixer to Graphs
[2] DRew: Dynamically Rewired Message Passing with Delay
[3] Implicit Graph Neural Networks: A Monotone Operator Viewpoint
[4] Ewald-based Long-Range Message Passing for Molecular Graphs
[5] Are More Layers Beneficial to Graph Transformers?
[6] Deep Ensembles for Graphs with Higher-order Dependencies
Required expertise/skills:
- Good theoretical and practical familiarity with deep learning models
- Solid understanding of Graph Neural Network formulations
- Good understanding of Machine Learning theory
- Decent understanding of probability theory and statistics
- Experience in Temporal Graphs is a plus
- Good experience with packages such as PyTorch and TensorFlow