Events
Tutorials, workshops, and more.
NeurIPS 2025 NEGEL Workshop (2025.12)
The Non-Euclidean Foundation Models and Geometric Learning Workshop will take place at NeurIPS 2025 in San Diego, CA, USA, from December 2–7, 2025. We invite you to join discussions on non-Euclidean representation learning, geometric deep learning, and large foundation models!
KDD 2025 Hyperbolic FM Tutorial (2025.08)
This tutorial on Hyperbolic Deep Learning for Foundation Models will take place at KDD 2025. We invite you to explore recent advances in hyperbolic deep learning and its applications to foundation models.
This tutorial at KDD 2025 aims to provide a comprehensive understanding of hyperbolic deep learning methods and their application to foundation models. It will cover the theoretical foundations, practical implementations, and future research directions in this exciting field. The tutorial is designed for a broad audience, including both newcomers and experts in machine learning, and will feature interactive components to engage participants.
WWW 2025 NEGEL Workshop
The Workshop on Non-Euclidean Foundation Models and Geometric Learning (NEGEL) will take place at TheWebConf 2025 in Sydney, Australia, from April 28–May 2, 2025. We invite you to join discussions on non-Euclidean representation learning, geometric deep learning, and large foundation models, alongside web-related applications!
KDD 2023 Tutorial: Hyperbolic GNNs
This tutorial aims to give a systematic review of the methods, applications, and challenges in this fast-growing and vibrant area, with the express purpose of being accessible to all audiences. More specifically, we will first give a brief introduction to graph neural networks as well as preliminaries on Riemannian manifolds and hyperbolic geometry. We will then revisit the technical details of existing hyperbolic graph neural networks (HGNNs), unifying them into a general framework and summarizing the variants of each component. We will also introduce applications deployed across a variety of fields. Finally, we will discuss several open challenges and potential solutions, including some initial attempts of our own, which we hope will pave the way for further growth of the research community. A companion GitHub repository for this tutorial is available at Awesome-Hyperbolic-Graph-Representation-Learning.
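For readers who want a concrete feel for the building blocks covered in the tutorial, below is a minimal NumPy sketch of the standard Poincaré-ball operations (Möbius addition, geodesic distance, exp/log maps at the origin) and the tangent-space aggregation pattern that many HGNN layers follow. The function names, the single-curvature setting, and the aggregation step are illustrative assumptions, not the tutorial's reference implementation.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Möbius addition on the Poincaré ball with curvature -c."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + (c ** 2) * x2 * y2
    return num / den

def poincare_distance(x, y, c=1.0):
    """Geodesic distance between two points on the Poincaré ball."""
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))

def exp0(v, c=1.0):
    """Exponential map at the origin: tangent vector -> point on the ball."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros_like(v)
    return np.tanh(np.sqrt(c) * n) * v / (np.sqrt(c) * n)

def log0(x, c=1.0):
    """Logarithmic map at the origin: point on the ball -> tangent vector."""
    n = np.linalg.norm(x)
    if n < 1e-12:
        return np.zeros_like(x)
    return np.arctanh(np.sqrt(c) * n) * x / (np.sqrt(c) * n)

def hyperbolic_mean(points, c=1.0):
    """Tangent-space aggregation: lift neighbors to the tangent space at the
    origin, average there, and map the result back to the ball."""
    tangent = np.mean([log0(p, c) for p in points], axis=0)
    return exp0(tangent, c)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = [exp0(rng.normal(size=2) * 0.3) for _ in range(4)]
    print("pairwise distance:", poincare_distance(pts[0], pts[1]))
    print("hyperbolic mean:", hyperbolic_mean(pts))
```

In a full HGNN layer, the Euclidean averaging step would typically be replaced by a learned linear transformation plus neighborhood aggregation in the tangent space before mapping back to the manifold.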
WWW 2024 TAG Tutorial
Text documents are often connected in a graph structure, giving rise to an important class of data known as text-attributed graphs, e.g., paper citation graphs and web page hyperlink graphs. On the one hand, Graph Neural Networks (GNNs) treat the text in each document as a generic vertex attribute and do not model the text itself. On the other hand, Pre-trained Language Models (PLMs) and Topic Models (TMs) learn effective document embeddings, but most focus on the text of each individual document and ignore the link structure across documents. These two gaps motivate text-attributed graph representation learning, which combines GNNs with PLMs and TMs in a unified model to learn document embeddings that preserve both modalities, supporting applications such as text classification, citation recommendation, and question answering. A rough illustration of this fusion is sketched below.
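The following sketch illustrates the general idea only, not the tutorial's method: each document's text is encoded independently by a stand-in encoder (a hashed bag-of-words here; in practice a PLM or topic model), and the embeddings are then propagated over the link structure with one normalized-adjacency step standing in for a trained GNN. All names and the toy data are hypothetical.

```python
import numpy as np

# Toy text-attributed graph: nodes are documents, edges are citations/hyperlinks.
docs = [
    "graph neural networks for citation graphs",
    "pretrained language models learn document embeddings",
    "topic models for web page text",
    "question answering over hyperlinked pages",
]
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

def hashed_text_embedding(text, dim=16):
    """Stand-in for a PLM/TM encoder: hash tokens into a fixed-size vector."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# 1) Encode each document's text independently (the PLM/TM side).
X = np.stack([hashed_text_embedding(d) for d in docs])

# 2) Propagate over the link structure (the GNN side): one step of
#    symmetrically normalized neighborhood averaging with self-loops.
n = len(docs)
A = np.eye(n)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
deg_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
A_hat = A * deg_inv_sqrt[:, None] * deg_inv_sqrt[None, :]

# Fused embeddings reflect both text content and link adjacency.
Z = A_hat @ X
print(Z.shape)  # (4, 16)
```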
ECML-PKDD 2022 Tutorial: Hyperbolic Graph Representation Learning
This tutorial provides an accessible introduction to hyperbolic graph representation learning, covering graph representation learning fundamentals, Riemannian and hyperbolic geometry, hyperbolic embedding techniques, hyperbolic neural networks, and a variety of real-world applications. The tutorial also discusses advanced topics and future directions for non-Euclidean graph learning.