Contextualized Messages Boost Graph Representations

Our paper Contextualized Messages Boost Graph Representations is now available on arXiv (April 2025 update: published in TMLR). While the main theoretical results are presented in the paper, this post aims to provide an intuitive understanding of the motivation and findings. In summary, the paper theoretically justifies the need for message functions in graph neural networks (GNNs) that are anisotropic (i.e., a function of the features of both the center and neighboring nodes) and dynamic (i.e., a universal function approximator). It then proposes the soft-isomorphic relational graph convolution network (SIR-GCN), a simple and computationally efficient model that satisfies this requirement and empirically outperforms comparable models.
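To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of an anisotropic and dynamic message function: an MLP applied to the concatenated center and neighbor features, with messages sum-aggregated onto the center node. The class name, dimensions, and MLP architecture are illustrative assumptions for this post and do not reproduce the exact SIR-GCN formulation in the paper.

```python
import torch
import torch.nn as nn


class AnisotropicDynamicMessage(nn.Module):
    """Toy message-passing layer with an anisotropic, dynamic message function.

    Each message depends on BOTH the center node and the neighboring node
    (anisotropic) and is computed by an MLP, a universal function approximator
    (dynamic). This is an illustrative sketch, not the SIR-GCN model itself.
    """

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.message_mlp = nn.Sequential(
            nn.Linear(2 * in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, in_dim] node features
        # edge_index: [2, num_edges], rows are (source = neighbor, target = center)
        src, dst = edge_index
        # Anisotropic message: a function of the center (dst) and neighbor (src) features.
        msg = self.message_mlp(torch.cat([x[dst], x[src]], dim=-1))
        # Sum-aggregate messages onto the center nodes.
        out = torch.zeros(x.size(0), msg.size(-1), device=x.device, dtype=msg.dtype)
        out.index_add_(0, dst, msg)
        return out
```

Contrast this with an isotropic, static message function (e.g., a fixed linear map of the neighbor features alone), which the paper argues limits the expressive power of the resulting GNN.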

How to Approach Math

On April 23, 2022, I had the honor of being invited by the Ateneo Mathematics Society (AMS) to give a talk on “How to Approach Math” for their annual project, Mathventure. This event was aimed at senior high school students who were interested in pursuing degrees in Science, Technology, Engineering, and Mathematics (STEM). Initially, I found the task daunting. I wasn’t sure if I could inspire the participants and make a meaningful impact. Thankfully, with the support and guidance of some friends, I managed to gather enough content and deliver the talk in an engaging manner, even incorporating memes to connect better with the younger audience.
