The advent of modern machine learning has ushered in rapid advances in the classification and interpretation of large data sets, sparking a revolution in areas such as image and natural language processing. Much of our current understanding of the techniques that underlie this revolution owes a great debt to insights first gleaned from condensed matter and statistical physics. This raises the important question of what further insights remain to be found at the intersection of machine learning and fields such as statistical physics, condensed matter, and quantum information. In response to this question, this workshop aims to bring together experts from a variety of backgrounds who are interested in connections between many-body physics, quantum computing and machine learning. The scope of the conference will include:
- The use of techniques from machine learning, such as neural networks or statistical learning, to tackle quantum many-body problems, such as discriminating phases of matter, analyzing phase transitions, and addressing the inverse Hamiltonian problem.
- Physics-inspired algorithms for machine learning and neural networks, such as extensions of Boltzmann machines (classical statistical mechanical learning) and connections between deep learning, the renormalization group, and tensor networks/MERA.
- Opportunities for machine learning that quantum computing will enable. This includes algorithmic advances for fault tolerant computers, as well as currently-available hardware systems such as quantum annealers.
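As a concrete flavour of the first theme — using statistical learning to discriminate phases of matter — here is a minimal, self-contained sketch. All data and parameters are invented for illustration: a logistic classifier separates caricatured low-temperature (ordered) from high-temperature (disordered) Ising configurations using the magnetization as a feature.

```python
import numpy as np

rng = np.random.default_rng(42)
n_spins, n_samples = 36, 200

def ordered_config():
    """Caricature of a low-temperature configuration: mostly-aligned spins."""
    sign = rng.choice([-1, 1])
    flips = rng.random(n_spins) < 0.1
    return sign * np.where(flips, -1, 1)

def disordered_config():
    """Caricature of a high-temperature configuration: random spins."""
    return rng.choice([-1, 1], n_spins)

X = np.array([ordered_config() for _ in range(n_samples)]
             + [disordered_config() for _ in range(n_samples)])
y = np.array([1] * n_samples + [0] * n_samples)

# Logistic regression on a single physics-motivated feature: |magnetization|
m = np.abs(X.mean(1))
w, b = 0.0, 0.0
for _ in range(500):  # plain gradient descent on the cross-entropy loss
    p = 1 / (1 + np.exp(-(m * w + b)))
    w -= 5.0 * ((p - y) * m).mean()
    b -= 5.0 * (p - y).mean()

acc = (((1 / (1 + np.exp(-(m * w + b)))) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

The ordered samples cluster near |m| ≈ 0.8 and the disordered ones near |m| ≈ 0.1, so even this one-feature classifier separates the two "phases" almost perfectly; the neural-network versions discussed at the workshop learn such features from raw configurations.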
Speakers:
- Mohammad Amin, D Wave Systems
- Peter Broecker, University of Cologne
- Kieron Burke, University of California, Irvine
- Matthew Fisher, Kavli Institute for Theoretical Physics
- Christopher Granade, University of Sydney
- Sergei Isakov, Google
- Ashish Kapoor, Microsoft Research
- Rosemary Ke, University of Montreal
- Seth Lloyd, Massachusetts Institute of Technology
- Andrew Millis, Simons Foundation
- Alejandro Perdomo-Ortiz, NASA Ames Research Center
- Barry Sanders, University of Calgary
- Maria Schuld, University of KwaZulu-Natal
- David Schwab, Northwestern University
- Cyril Stark, Massachusetts Institute of Technology
- James Steck, Wichita State University
- Damian Steiger, ETH Zurich & Google
- Miles Stoudenmire, University of California, Irvine
- Giacomo Torlai, University of Waterloo
Participants:
- Mohammad Amin, D Wave Systems
- Louis-Francois Arsenault, Columbia University
- Jacob Barnett, Perimeter Institute
- Matt Beach, University of British Columbia
- Stefanie Beale, Institute for Quantum Computing
- Oleg Boulanov, Université Laval
- Daniel Brod, Perimeter Institute
- Peter Broecker, University of Cologne
- Kieron Burke, University of California, Irvine
- Juan Carrasquilla, Perimeter Institute
- Chen-Fu Chiang, SUNY
- Joshua Combes, Perimeter Institute
- Alexandre Day, Boston University
- Matthew Fisher, Kavli Institute for Theoretical Physics
- Wenbo Fu, Harvard University
- Martin Ganahl, Perimeter Institute
- Sevag Gharibian, Virginia Commonwealth University
- Victor Godet, Google
- Christopher Granade, University of Sydney
- Zhengcheng Gu, Perimeter Institute
- Gian Giacomo Guerreschi, Intel
- Guiyang Han, University of Waterloo
- Lauren Hayward-Sierens, Perimeter Institute
- Yejin Huh, University of Toronto
- Sergei Isakov, Google
- Bryan Jacobs, IARPA
- Ying-Jer Kao, National Taiwan University
- Ashish Kapoor, Microsoft Research
- Hemant Katiyar, Institute for Quantum Computing
- Rosemary Ke, University of Montreal
- Adrian Kent, Cambridge University
- Ehsan Khatami, San Jose State University
- Aaram Kim, Goethe-Universität Frankfurt am Main
- Alexandre Krajenbrink, Cambridge Quantum Computing
- Bohdan Kulchytskyy, University of Waterloo
- Joel Lamy-Poirier, Perimeter Institute
- Jaehoon Lee, University of British Columbia
- Junhyun Lee, Harvard University
- Ipsita Mandal, Perimeter Institute
- Roger Melko, Perimeter Institute & University of Waterloo
- Andrew Millis, Simons Foundation
- Ryan Mishmash, California Institute of Technology
- Robert Myers, Perimeter Institute
- Apurva Narayan, University of Waterloo
- Nam Nguyen, Wichita State University
- Chan Y. Park, Rutgers University
- Alejandro Perdomo-Ortiz, NASA Ames Research Center
- Anthony Polloreno, Rigetti Computing
- Pedro Ponte, Perimeter Institute
- Andrew Reeves, Grand River Regional Cancer Center
- Trevor Rempel, Perimeter Institute
- Julian Rincon, Perimeter Institute
- Nicholas Rubin, Rigetti Computing
- Wojciech Rzadkowski, University of Warsaw
- Subir Sachdev, Harvard University
- Barry Sanders, University of Calgary
- Norbert Schuch, Max-Planck-Institute of Quantum Optics
- Maria Schuld, University of KwaZulu-Natal
- David Schwab, Northwestern University
- Ivan Sergienko, Scotiabank
- Todd Sierens, Perimeter Institute
- Rajiv Singh, University of California, Davis
- Cyril Stark, Massachusetts Institute of Technology
- James Steck, Wichita State University
- Damian Steiger, ETH Zurich & Google
- Miles Stoudenmire, University of California, Irvine
- Yongchao Tang, University of Waterloo
- Giacomo Torlai, University of Waterloo
- Jordan Venderley, Cornell University
- Guillaume Verdon-Akzam, University of Waterloo
- Guifre Vidal, Perimeter Institute
- Yuan Wan, Perimeter Institute
- Chenjie Wang, Perimeter Institute
- Ching-Hao Wang, Boston University
- Shuo Yang, Perimeter Institute
- Chuck-Hou Yee, Rutgers University
Monday, August 8, 2016
| Time | Event | Location |
| --- | --- | --- |
| 9:00 – 9:30am | Registration | Reception |
| 9:30 – 9:35am | Welcome and Opening Remarks | Theatre |
| 9:35 – 10:15am | Ashish Kapoor, Microsoft Research | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Maria Schuld, University of KwaZulu-Natal | Theatre |
| 11:45 – 12:30pm | Christopher Granade, University of Sydney | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 3:15pm | Barry Sanders, University of Calgary | Theatre |
Tuesday, August 9, 2016
| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Cyril Stark, Massachusetts Institute of Technology | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | David Schwab, Northwestern University | Theatre |
| 11:45 – 12:30pm | Miles Stoudenmire, University of California, Irvine | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 3:00pm | James Steck, Wichita State University | Theatre |
| 3:00 – 3:30pm | Rosemary Ke, MILA, University of Montreal | Theatre |
Wednesday, August 10, 2016
| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Sergei Isakov, Google | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Mohammad Amin, D Wave Systems | Theatre |
| 11:45 – 12:30pm | Alejandro Perdomo-Ortiz, NASA Ames Research Center | Theatre |
| 12:00 – 2:00pm | Lunch | Bistro – 2nd Floor |
| 2:00 – 3:30pm | Colloquium | Theatre |
Thursday, August 11, 2016
| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Kieron Burke, University of California, Irvine | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Andrew Millis, Columbia University | Theatre |
| 11:45 – 12:30pm | Juan Carrasquilla, Perimeter Institute | Theatre |
| 12:30 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 2:45pm | Conference Photo | TBA |
| 2:45 – 3:15pm | Giacomo Torlai, University of Waterloo | Theatre |
| 3:15 – 3:45pm | Peter Broecker, University of Cologne | Theatre |
| 5:30pm | Pub Night | Bistro – 2nd Floor |
Friday, August 12, 2016
| Time | Event | Location |
| --- | --- | --- |
| 9:30 – 10:15am | Damian Steiger, ETH Zurich & Google | Theatre |
| 10:15 – 11:00am | Coffee Break | Bistro – 1st Floor |
| 11:00 – 11:45am | Seth Lloyd, Massachusetts Institute of Technology | Theatre |
| 12:00 – 2:30pm | Lunch | Bistro – 2nd Floor |
| 2:30 – 5:00pm | Collaboration | Theatre |
Abstracts:

Mohammad Amin, D Wave Systems
Quantum Boltzmann Machine using a Quantum Annealer
Machine learning is a rapidly growing field in computer science, with applications in computer vision, voice recognition, medical diagnosis, spam filtering, search engines, and more. In this presentation, I will introduce a new machine learning approach based on the quantum Boltzmann distribution of a transverse-field Ising model. Due to the non-commutative nature of quantum mechanics, the training process of the Quantum Boltzmann Machine (QBM) can become nontrivial. I will show how to circumvent this problem by introducing bounds on the quantum probabilities, which allow the QBM to be trained efficiently by sampling. I will then show examples of QBM training with and without the bound, using exact diagonalization, and compare the results with classical Boltzmann training. Finally, after a brief introduction to D-Wave quantum annealing processors, I will discuss the possibility of using such processors for QBM training and applications.
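As a toy illustration of the distribution underlying this talk (not the speaker's training method, and nothing to do with D-Wave hardware), the sketch below constructs a small transverse-field Ising Hamiltonian explicitly and reads off the quantum Boltzmann probabilities of the computational basis states from the diagonal of the thermal density matrix. The chain length, couplings, and temperature are arbitrary choices.

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit operators
I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def op_on(site, op, n):
    """Tensor product placing `op` on `site` of an n-qubit register."""
    out = np.array([[1.0]])
    for k in range(n):
        out = np.kron(out, op if k == site else I2)
    return out

def tfi_hamiltonian(n, J=1.0, gamma=0.5):
    """Transverse-field Ising chain: H = -J sum Z_i Z_{i+1} - gamma sum X_i."""
    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n - 1):
        H -= J * op_on(i, Z, n) @ op_on(i + 1, Z, n)
    for i in range(n):
        H -= gamma * op_on(i, X, n)
    return H

def quantum_boltzmann_probs(H, beta=1.0):
    """Diagonal of rho = exp(-beta H)/Z: basis-state probabilities of the QBM distribution."""
    rho = expm(-beta * H)
    rho /= np.trace(rho)
    return np.real(np.diag(rho))

n = 3
p = quantum_boltzmann_probs(tfi_hamiltonian(n))
print(p.sum())  # normalized distribution
```

Because the Hamiltonian commutes with a global spin flip, the resulting probabilities are symmetric under complementing all bits — a quick sanity check on the construction.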
Peter Broecker, University of Cologne
Machine learning quantum phases of matter beyond the fermion sign problem
Kieron Burke, University of California, Irvine
Finding density functionals with machine-learning
Density functional theory (DFT) is an extremely popular approach to electronic structure problems in materials science, chemistry, and many other fields. Over the past several years, often in collaboration with Klaus Mueller at TU Berlin, we have explored using machine learning to find the density functionals that must be approximated in DFT calculations. I will summarize our results so far and report on two new works.
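A minimal sketch in the spirit of this program — not the actual DFT setup; the target functional, toy densities, and hyperparameters below are invented stand-ins. Kernel ridge regression with a Gaussian kernel learns a simple functional (the energy of a 1D density in a linear external potential) from sampled densities on a grid:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
dx = grid[1] - grid[0]

def density(center, width=0.1):
    """Toy normalized 1D 'electron density' on the grid."""
    n = np.exp(-0.5 * ((grid - center) / width) ** 2)
    return n / (n.sum() * dx)

def target_functional(n):
    """Stand-in functional to learn: energy in an external potential v(x) = x."""
    return (grid * n).sum() * dx

# Training and test densities, represented as grid vectors
centers = rng.uniform(0.2, 0.8, size=40)
Xtr = np.array([density(c) for c in centers[:30]])
Xte = np.array([density(c) for c in centers[30:]])
ytr = np.array([target_functional(n) for n in Xtr])
yte = np.array([target_functional(n) for n in Xte])

def gaussian_kernel(A, B, sigma=5.0):  # sigma hand-tuned for this toy
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Kernel ridge regression: alpha = (K + lam I)^{-1} y
lam = 1e-6
K = gaussian_kernel(Xtr, Xtr)
alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
pred = gaussian_kernel(Xte, Xtr) @ alpha

mae = np.abs(pred - yte).mean()
print(f"test MAE: {mae:.2e}")
```

The same regression machinery, applied to real kinetic-energy functionals of electron densities, is the core of the program the abstract describes.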
Juan Carrasquilla, Perimeter Institute
Machine Learning Phases of Matter
Matthew Fisher, Kavli Institute for Theoretical Physics
Quantum Crystals, Quantum Computing and Quantum Cognition
Quantum mechanics is down to earth - quite literally - since the electrons within the tiny crystals found in a handful of dirt manifest a dizzying world of quantum motion. Each crystal has its own unique choreography, with the electrons entangled in a myriad of quantum dances. Quantum entanglement
David Schwab, Northwestern University
Physical approaches to the extraction of relevant information
In the first part of this talk, I will focus on the physics of deep learning, a popular subfield of machine learning in which recent systems rival humans on tasks such as visual object recognition. I present work relating greedy training of deep belief networks to a form of variational real-space renormalization. This connection may help explain how deep networks automatically learn relevant features from data and extract independent factors of variation. Next, I turn to the information bottleneck (IB), an information-theoretic approach to clustering and compression of relevant information that has been suggested as a framework for deep learning. I present a new variant of IB called the Deterministic Information Bottleneck, arguing that it better captures the notion of compression while retaining relevant information.
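The deterministic information bottleneck admits a compact implementation. The sketch below is a toy version on an invented joint distribution, with the hard-assignment update f(x) = argmax_t [log q(t) − β KL(p(y|x) ‖ q(y|t))]; the sizes, β, and initialization are arbitrary choices for illustration.

```python
import numpy as np

def kl(p, q):
    """Discrete KL divergence D(p||q), with clipping to avoid log(0)."""
    p = np.clip(p, 1e-12, None)
    q = np.clip(q, 1e-12, None)
    return float((p * np.log(p / q)).sum())

def deterministic_ib(pxy, n_clusters, beta=5.0, iters=50):
    """Toy deterministic information bottleneck: hard-cluster x, keeping info about y."""
    nx, ny = pxy.shape
    px = pxy.sum(axis=1)
    py_x = pxy / px[:, None]                 # p(y|x)
    f = np.arange(nx) % n_clusters           # simple deterministic initial assignment
    for _ in range(iters):
        qt = np.array([px[f == t].sum() for t in range(n_clusters)])
        qy_t = np.zeros((n_clusters, ny))
        for t in range(n_clusters):
            if qt[t] > 0:
                qy_t[t] = pxy[f == t].sum(axis=0) / qt[t]    # q(y|t)
        # DIB update: f(x) = argmax_t [ log q(t) - beta * KL(p(y|x) || q(y|t)) ]
        scores = np.array([[np.log(max(qt[t], 1e-12)) - beta * kl(py_x[x], qy_t[t])
                            for t in range(n_clusters)] for x in range(nx)])
        new_f = scores.argmax(axis=1)
        if np.array_equal(new_f, f):
            break
        f = new_f
    return f

# Invented joint distribution: the first three x values predict y=0, the last three y=1
pxy = np.array([[0.15, 0.02]] * 3 + [[0.02, 0.15]] * 3)
pxy /= pxy.sum()
assignment = deterministic_ib(pxy, n_clusters=2)
print(assignment)
```

On this toy input, the iteration groups the x values that share a conditional p(y|x), which is exactly the "compress while retaining relevant information" behaviour the abstract refers to.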
Cyril Stark, Massachusetts Institute of Technology
Seth Lloyd, Massachusetts Institute of Technology
Quantum algorithm for topological analysis of data
This talk presents a quantum algorithm for performing persistent homology, the identification of topological features of data sets such as connected components, holes and voids. Finding the full persistent homology of a data set over n points using classical algorithms takes time O(2^{2n}), while the quantum algorithm takes time O(n^2), an exponential improvement. The quantum algorithm does not require a quantum random access memory and is suitable for implementation on small quantum computers with a few hundred qubits.
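For classical context (this is not the quantum algorithm), the simplest piece of persistent homology — tracking the number of connected components (Betti-0) as the distance scale grows — can be sketched with union-find; the point cloud and scales below are invented:

```python
import itertools

def find(parent, i):
    """Union-find root lookup with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def betti0(points, scale):
    """Number of connected components when points within `scale` of each other are joined."""
    parent = list(range(len(points)))
    for i, j in itertools.combinations(range(len(points)), 2):
        dist = sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5
        if dist <= scale:
            ri, rj = find(parent, i), find(parent, j)
            if ri != rj:
                parent[ri] = rj
    return len({find(parent, i) for i in range(len(points))})

# Two well-separated clusters of three points each
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
for s in (0.05, 0.2, 10.0):
    print(s, betti0(pts, s))  # 6 components, then 2, then 1
```

Sweeping the scale and recording when components merge is the Betti-0 barcode; higher Betti numbers (holes, voids) are where the classical cost explodes and the quantum algorithm claims its advantage.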
Damian Steiger, ETH Zurich & Google
Racing in parallel: Quantum versus Classical
In a fair comparison of the performance of a quantum algorithm to a classical one, it is important to treat them on an equal footing regarding both resource usage and parallelism. We show how one may otherwise mistakenly attribute speedup due to parallelism as quantum speedup. As an illustration, we will go through a few quantum machine learning algorithms, e.g. Quantum PageRank, and show how a classical parallel computer can solve these problems faster with the same amount of resources.
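The accounting argument can be caricatured with a toy cost model — idealized query counts, not the talk's actual benchmarks: an unstructured search in which Grover's ~√N oracle queries are compared against N/p queries per processor on a p-processor classical machine.

```python
import math

def grover_queries(n_items):
    """Idealized Grover search: about (pi/4) * sqrt(N) oracle queries."""
    return (math.pi / 4) * math.sqrt(n_items)

def parallel_classical_queries(n_items, n_processors):
    """Classical exhaustive search split evenly across processors: N / p queries each."""
    return n_items / n_processors

# Counting hardware 'resources' on both sides (p processors vs. a comparable quantum
# device), the classical parallel machine narrows the apparent quantum advantage.
N = 10 ** 6
for p in (1, 1000, 10 ** 6):
    print(p, grover_queries(N), parallel_classical_queries(N, p))
```

At p = 1000 the classical wall-clock query count already matches Grover's; whether that counts as a fair comparison is precisely the resource-accounting question the talk raises.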
Giacomo Torlai, University of Waterloo
Learning Thermodynamics with Boltzmann Machines
The introduction of neural networks with deep architecture has led to a revolution, giving rise to a new wave of technologies empowering our modern society. Although data science has been the main focus, the idea of generic algorithms which automatically extract features and representations from raw data is quite general and applicable in multiple scenarios. Motivated by the effectiveness of deep learning algorithms in revealing complex patterns and structures underlying data, we are interested in exploiting such tools in the context of many-body physics.
Alejandro Perdomo-Ortiz, NASA Ames Research Center
A quantum-assisted algorithm for sampling applications in machine learning
An increase in the efficiency of sampling from Boltzmann distributions would have a significant impact in deep learning and other machine learning applications. Recently, quantum annealers have been proposed as a potential candidate to speed up this task, but several limitations still bar these state-of-the-art technologies from being used effectively.
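As a classical baseline for the sampling task the abstract describes (a stand-in, not the quantum-assisted algorithm), here is a single-spin-flip Gibbs sampler for a small Ising Boltzmann distribution; the couplings, fields, inverse temperature, and sweep counts are arbitrary:

```python
import math
import random

random.seed(0)

def gibbs_ising_sample(J, h, beta=1.0, sweeps=200):
    """Single-spin-flip Gibbs sampler for an Ising Boltzmann distribution.

    J: dict mapping (i, j) -> coupling; h: list of local fields.
    Energy convention: E(s) = -sum_ij J_ij s_i s_j - sum_i h_i s_i, with s_i in {-1, +1}.
    """
    n = len(h)
    s = [random.choice([-1, 1]) for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            # effective local field on spin i from couplings and bias
            field = h[i] + sum(c * s[j if a == i else a]
                               for (a, j), c in J.items() if i in (a, j))
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
            s[i] = 1 if random.random() < p_up else -1
    return s

# Ferromagnetic 4-spin chain with a small symmetry-breaking field
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0}
h = [0.1, 0.0, 0.0, 0.0]
samples = [gibbs_ising_sample(J, h, beta=2.0) for _ in range(20)]
mean_mag = sum(sum(s) for s in samples) / (20 * 4)
print(mean_mag)
```

Mixing time is the catch: at low temperature the sampler gets stuck in one magnetization sector for long stretches, which is exactly the inefficiency that quantum annealers are proposed to help with.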
Scientific Organizers:
- Roger Melko, Perimeter Institute & University of Waterloo
- Miles Stoudenmire, University of California, Irvine
- Guifre Vidal, Perimeter Institute
- Nathan Wiebe, Microsoft Research