Deep Learning

Benchmarking Graph Neural Networks

This blog post is based on the paper Benchmarking Graph Neural Networks, joint work with Chaitanya K. Joshi, Thomas Laurent, Yoshua Bengio and Xavier Bresson. Graph Neural Networks (GNNs) are widely used today in diverse applications spanning social sciences, knowledge graphs, chemistry, physics, neuroscience, and more, and accordingly there has been a surge of interest and a growing number of papers in the literature. However, in the absence of a standard, widely adopted benchmark, it has become increasingly difficult to gauge the effectiveness of new models and to validate new ideas that generalize to larger, more complex datasets.

Learning TSP Requires Rethinking Generalization

End-to-end training of neural network solvers for combinatorial problems such as the Travelling Salesman Problem is intractable and inefficient beyond a few hundred nodes. While state-of-the-art Machine Learning approaches perform closely to …

Benchmarking Graph Neural Networks

Identify universal building blocks for robust and scalable GNNs.

Benchmarking Graph Neural Networks

Graph neural networks (GNNs) have become the standard toolkit for analyzing and learning from data on graphs. As the field grows, it becomes critical to identify key architectures and validate new ideas that generalize to larger, more complex …

Transformers are Graph Neural Networks

Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it being deployed in practical applications? Besides the obvious ones (recommendation systems at Pinterest, Alibaba and Twitter), a slightly nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.
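
To make the GNN-Transformer connection concrete, here is a minimal sketch (not code from the post) of single-head self-attention written as one round of message passing on the complete graph over tokens: every token (node) aggregates the values of all other tokens (its neighbours), weighted by softmax-normalized attention scores (edge weights). The function name and shapes are illustrative assumptions.

```python
# Self-attention as message passing on a fully connected token graph.
# Illustrative sketch; names and shapes are assumptions, not the post's code.
import numpy as np

def self_attention_as_message_passing(H, Wq, Wk, Wv):
    """One round of message passing on the complete graph over tokens.

    H          : (n_tokens, d) node/token features
    Wq, Wk, Wv : (d, d) learned projection matrices
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv          # per-node queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # edge weights for all token pairs
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)  # softmax over each node's neighbours
    return attn @ V                           # weighted sum = neighbourhood aggregation

rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H_new = self_attention_as_message_passing(H, Wq, Wk, Wv)
print(H_new.shape)  # (5, 8): updated features for every token/node
```

Seen this way, a Transformer layer is a GNN layer where the graph is fully connected and the aggregation weights are learned per pair via attention, rather than fixed by a sparse adjacency structure.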

Free-hand Sketches

Representation learning for drawings via graphs with geometric and temporal information.

Multi-Graph Transformer for Free-Hand Sketch Recognition

Learning meaningful representations of free-hand sketches remains challenging, given their signal sparsity and high-level abstraction. Existing techniques have focused on exploiting either the static nature of sketches with …

A Two-Step Graph Convolutional Decoder for Molecule Generation

We propose a simple auto-encoder framework for molecule generation. The molecular graph is first encoded into a continuous latent representation, which is then decoded back to a molecule. The encoding process is easy, but the decoding process …
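
As a rough illustration of the two-step decode (latent vector to atoms, then atoms to bonds), here is a minimal sketch. All shapes, weights and names (e.g. `decode_atoms`, `decode_bonds`, `n_atom_types`) are assumptions for illustration, not the paper's exact architecture.

```python
# Illustrative two-step molecule decoder: latent z -> atom types -> bond types.
# Random weights stand in for learned parameters; this is a shape-level sketch.
import numpy as np

rng = np.random.default_rng(0)
d_z, d_h, n_atom_types, n_bond_types, max_atoms = 16, 32, 4, 3, 9

W1 = rng.normal(size=(d_z, max_atoms * n_atom_types))   # atom-decoder weights
W2 = rng.normal(size=(2 * d_h + d_z, n_bond_types))     # bond-decoder weights
E  = rng.normal(size=(n_atom_types, d_h))               # atom-type embeddings

def decode_atoms(z):
    """Step 1: predict an atom type for each of `max_atoms` slots."""
    logits = (z @ W1).reshape(max_atoms, n_atom_types)
    return logits.argmax(axis=-1)            # (max_atoms,) atom types

def decode_bonds(z, atom_types):
    """Step 2: score every atom pair, conditioned on z, to predict bonds."""
    h = E[atom_types]                        # (max_atoms, d_h) node features
    bonds = np.zeros((max_atoms, max_atoms), dtype=int)
    for i in range(max_atoms):
        for j in range(i + 1, max_atoms):
            pair = np.concatenate([h[i], h[j], z])
            bonds[i, j] = bonds[j, i] = (pair @ W2).argmax()
    return bonds

z = rng.normal(size=d_z)                     # latent code from the encoder
atoms = decode_atoms(z)
print(atoms, decode_bonds(z, atoms).shape)   # atom types and (9, 9) bond matrix
```

The point of the split is that predicting which atoms are present is a comparatively easy classification step, while predicting the bond structure among them is where the graph-structured reasoning lives.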

On Learning Paradigms for the Travelling Salesman Problem

We explore the impact of learning paradigms on training deep neural networks for the Travelling Salesman Problem. We design controlled experiments to train supervised learning (SL) and reinforcement learning (RL) models on fixed graph sizes up to 100 …

Graph Neural Networks for the Travelling Salesman Problem

The most famous NP-hard combinatorial problem today, the Travelling Salesman Problem, is intractable to solve optimally at large scale. In practice, existing techniques such as Concorde can efficiently solve TSP instances with up to thousands of nodes. This talk …