ESE Ph.D. Thesis Defense: "Manifold Filters and Neural Networks: Geometric Graph Signal Processing in the Limit"
April 15, 2025 at 3:30 PM - 5:30 PM
Graph Neural Networks (GNNs) are the tool of choice for scalable and stable learning in graph-structured data applications involving geometric information. My research addresses the fundamental questions of how GNNs can generalize across different graph scales and how they can remain stable on large-scale graphs. I do so by considering manifolds as graph limit models. In this talk, I will explain how to build manifold convolutional filters and manifold neural networks (MNNs) as the limit objects of graph convolutional filters and GNNs when the graphs are sampled from manifolds. Using Laplace-Beltrami operator exponentials to define manifold convolutions, I demonstrate their algebraic equivalence to both graph convolutions and standard time convolutions in the nodal and spectral domains. This equivalence provides a unifying framework for analyzing key theoretical properties of GNNs: i) the convergence of GNNs to MNNs establishes the scalability of GNNs across graph scales, and ii) the stability of MNNs to deformations implies the stability of large-scale GNNs. These findings offer practical guidelines for designing GNN architectures, particularly by imposing constraints on the spectral properties of filter functions. The theoretical results are verified in real-world scenarios, including point cloud analysis, wireless resource allocation, and wind field studies on vector fields.
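To make the filtering construction concrete, here is a minimal sketch of a graph convolutional filter built from Laplacian exponentials, in the spirit of the manifold filters described above: h(L)x = Σ_k h_k e^{-kL} x, where L is a graph Laplacian standing in for the Laplace-Beltrami operator of the sampled manifold. The function names, the combinatorial Laplacian, and the discrete exponential taps are illustrative assumptions, not the thesis's exact construction.

```python
import numpy as np
from scipy.linalg import expm

def graph_laplacian(adj: np.ndarray) -> np.ndarray:
    """Combinatorial graph Laplacian L = D - A (illustrative choice)."""
    return np.diag(adj.sum(axis=1)) - adj

def exponential_filter(L: np.ndarray, x: np.ndarray, h: np.ndarray) -> np.ndarray:
    """Apply the exponential filter h(L) x = sum_k h[k] * expm(-k L) @ x.

    Each term diffuses the signal x over the graph for "time" k,
    mirroring how Laplace-Beltrami exponentials define manifold convolutions.
    """
    y = np.zeros_like(x, dtype=float)
    for k, hk in enumerate(h):
        y += hk * expm(-k * L) @ x
    return y

# Example: filter an impulse signal on a 4-node cycle graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = graph_laplacian(A)
x = np.array([1.0, 0.0, 0.0, 0.0])  # impulse at node 0
h = np.array([0.5, 0.3, 0.2])       # hypothetical filter taps
y = exponential_filter(L, x, h)
```

Because e^{-kL} conserves the total signal mass (L annihilates the constant vector), the filtered output sums to (Σ_k h_k) · Σ_i x_i, which is one convenient sanity check on the implementation.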

