Xavier Bresson | The Transformer Network for the Traveling Salesman Problem


Details
Virtual London Machine Learning Meetup - 14.07.2021 @ 18:30
We would like to invite you to our next Virtual Machine Learning Meetup.
Agenda:
- 18:25: Virtual doors open
- 18:30: Talk
- 19:10: Q&A session
- 19:30: Close
Sponsors
https://evolution.ai/ : Machines that Read - Intelligent data extraction from corporate and financial documents.
- Title: The Transformer Network for the Traveling Salesman Problem (Xavier Bresson, Associate Professor, Department of Computer Science, National University of Singapore (NUS))
Abstract: The Traveling Salesman Problem (TSP) is the most popular and most studied combinatorial problem, with work going back to von Neumann in 1951. It has driven the discovery of several optimization techniques such as cutting planes, branch-and-bound, local search, Lagrangian relaxation, and simulated annealing. The last five years have seen the emergence of promising techniques in which (graph) neural networks have been capable of learning new combinatorial algorithms. The main question is whether deep learning can learn better heuristics from data, i.e. can it replace human-engineered heuristics? This is appealing because developing algorithms to tackle NP-hard problems may require years of research, and many industrial problems are combinatorial by nature. In this project, we propose to adapt the recent successful Transformer architecture, originally developed for natural language processing, to the combinatorial TSP. Training is done by reinforcement learning, hence without TSP training solutions, and decoding uses beam search. We report improved performance over recent learned heuristics.
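To make the idea concrete, here is a minimal, hypothetical sketch (not the speaker's actual model or code) of the general recipe the abstract describes: a Transformer encoder over 2D city coordinates, followed by autoregressive decoding of a tour with a mask that forbids revisiting cities. The class name, layer sizes, and the greedy argmax decoder are illustrative assumptions; the talk's approach trains with reinforcement learning and decodes with beam search rather than greedily.

```python
# Hypothetical sketch only: Transformer encoder over city coordinates + masked
# autoregressive decoding of a TSP tour. Names and hyperparameters are assumptions.
import torch
import torch.nn as nn

class TSPTransformerSketch(nn.Module):
    def __init__(self, d_model=128, n_heads=8, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(2, d_model)                 # embed (x, y) coordinates of each city
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=512, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.query_proj = nn.Linear(d_model, d_model)      # turn the current city's embedding into a query

    def forward(self, coords):
        """coords: (batch, n_cities, 2) -> tour of city indices, shape (batch, n_cities)."""
        h = self.encoder(self.embed(coords))               # (batch, n, d) city embeddings
        batch, n, d = h.shape
        visited = torch.zeros(batch, n, dtype=torch.bool)
        current = torch.zeros(batch, dtype=torch.long)     # start every tour at city 0
        visited[torch.arange(batch), current] = True
        tour = [current]
        for _ in range(n - 1):
            q = self.query_proj(h[torch.arange(batch), current])   # (batch, d) decoding query
            scores = torch.einsum('bd,bnd->bn', q, h) / d ** 0.5   # attention-style compatibility scores
            scores = scores.masked_fill(visited, float('-inf'))    # never revisit a city
            current = scores.argmax(dim=-1)                        # greedy step (beam search / sampling in practice)
            visited[torch.arange(batch), current] = True
            tour.append(current)
        return torch.stack(tour, dim=1)

coords = torch.rand(4, 20, 2)                              # 4 random instances with 20 cities each
print(TSPTransformerSketch()(coords).shape)                # torch.Size([4, 20])
```

In a reinforcement-learning setup, the argmax step would be replaced by sampling from a softmax over the scores, with the negative tour length used as the reward; at inference time, beam search keeps the top-k partial tours at each step instead of a single greedy choice.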
Bio: Xavier Bresson is an Associate Professor in the Department of Computer Science at the National University of Singapore (NUS). His research focuses on Graph Deep Learning, a new framework that combines graph theory and neural network techniques to tackle complex data domains. In 2016, he received the US$2.5M NRF Fellowship, the largest individual grant in Singapore, to develop this new framework. He was also awarded several research grants in the U.S. and Hong Kong. He co-authored one of the most cited works in this field, and he recently introduced, together with Yoshua Bengio, a benchmark that evaluates graph neural network architectures. He has organized several workshops and tutorials on graph deep learning, such as the recent IPAM'21 workshop on "Deep Learning and Combinatorial Optimization", the MLSys'21 workshop on "Graph Neural Networks and Systems", the IPAM'19 and IPAM'18 workshops on "New Deep Learning Techniques", and the NeurIPS'17, CVPR'17 and SIAM'18 tutorials on "Geometric Deep Learning on Graphs and Manifolds". He is a regular invited speaker at universities and companies, and has spoken at the KDD'21, AAAI'21 and ICML'20 workshops on "Graph Representation Learning" and the ICLR'20 workshop on "Deep Neural Models and Differential Equations". He has taught graduate courses on Graph Neural Networks at NTU and has been a guest lecturer for Yann LeCun's course at NYU.
