**Stephen Bartlett**, University of Sydney

*Symmetry, topology, and thermal stability*

The interplay of symmetry and topology in quantum many-body systems can lead to novel phases of matter, with applications in quantum memories and resources for quantum computing. While we understand the range of phenomena quite well in 2-d systems, many open questions remain in the 3-d case, in particular what kinds of symmetry and topology can allow for thermal stability in 3-d models. I'll present some of the results and open questions in this direction, using the 3-d toric code and the RBH models as examples.

**Earl Campbell**, University of Sheffield

*Magic resource theories and classical simulation*

I will review the stabiliser rank and the associated pure-state magic monotone, the extent [Bravyi et al., 2019]. Then I will discuss several new magic monotones that can be regarded as generalisations of the extent monotone to mixed states [Campbell et al., in preparation]. My talk will outline several theorems on how these monotones relate to each other and how they are related to the runtime of new classical simulation algorithms.

**Bartek Czech**, Tsinghua University

*How Wilson lines in AdS redundantly compute CFT correlation functions*

In the AdS/CFT correspondence, global symmetries of the CFT are realized as local symmetries of AdS; this feature underlies the error-correcting property of AdS. I will explain how this allows AdS3 to realize multiple redundant computations of any CFT2 correlation function in the form of networks of Wilson lines. The main motivation is to rigorously define the CFT at a cutoff and study it as a model of computational complexity; in that regard we will find agreement with the holographic "Complexity = Volume" proposal. But the framework might be useful more generally.

**David Gosset**, University of Waterloo

*Classical algorithms for quantum mean values*

Consider the task of estimating the expectation value of an n-qubit tensor product observable in the output state of a shallow quantum circuit. This task is a cornerstone of variational quantum algorithms for optimization, machine learning, and the simulation of quantum many-body systems. In this talk I will describe three special cases of this problem which are "easy" for classical computers. This is joint work with Sergey Bravyi and Ramis Movassagh.
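The task itself is easy to state concretely. As an illustration of the problem (and not of the talk's algorithms, which avoid this exponential-cost approach), here is a brute-force statevector evaluation of a tensor-product mean value for a tiny two-qubit circuit of my own choosing:

```python
import numpy as np

# Brute-force evaluation of <psi| Z (x) Z |psi> for a small circuit
# (H on qubit 0, then CNOT). This is the "quantum mean value" task;
# classical algorithms for shallow circuits aim to beat this scaling.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
Z = np.diag([1.0, -1.0])

state = np.zeros(4)
state[0] = 1.0                   # |00>
state = np.kron(H, I2) @ state   # H on qubit 0
state = CNOT @ state             # Bell state (|00> + |11>)/sqrt(2)

observable = np.kron(Z, Z)       # n-qubit tensor-product observable
mean_value = state @ observable @ state
print(mean_value)                # Bell state: <Z(x)Z> = +1
```

The statevector has dimension 2^n, which is exactly why nontrivial classical algorithms for this task are interesting.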

**Daniel Gottesman**, Perimeter Institute

*Stabilizer codes for prime power qudits*

There is a standard generalization of stabilizer codes to work with qudits which have prime dimension, and a slightly less standard generalization for qudits whose dimension is a prime power. However, for prime power dimensions, the usual generalization effectively treats the qudit as multiple prime-dimensional qudits instead of one larger object. For every prime power q there is a finite field GF(q), and it makes sense to label the qudit basis states with elements of the finite field, but the usual stabilizer codes do not make use of the structure of the finite field. I introduce the true GF(q) stabilizer codes, a subset of the usual prime power stabilizer codes which do make full use of the finite field structure. The true GF(q) stabilizer codes have nicer properties than the usual stabilizer codes over prime power qudits and work with a lifted Pauli group, which has some interesting mathematical aspects.
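As a small illustration of the finite-field structure these codes exploit (my own sketch, not from the talk), here is arithmetic in GF(4) = GF(2)[x]/(x² + x + 1), the smallest prime-power field that is not a prime field:

```python
# Arithmetic in GF(4) = GF(2)[x]/(x^2 + x + 1). Elements 0..3 encode
# polynomials over GF(2) by their bits: 0, 1, x (=2), x+1 (=3).
MOD = 0b111  # the irreducible polynomial x^2 + x + 1

def gf4_add(a, b):
    return a ^ b  # addition is bitwise XOR in characteristic 2

def gf4_mul(a, b):
    prod = 0
    for i in range(2):       # carry-less (polynomial) multiplication
        if (b >> i) & 1:
            prod ^= a << i
    if prod & 0b100:         # reduce modulo x^2 + x + 1
        prod ^= MOD
    return prod

# Every nonzero element has a multiplicative inverse -- structure that
# a plain product of prime-dimensional qudits does not provide.
inverses = {a: next(b for b in range(1, 4) if gf4_mul(a, b) == 1)
            for a in range(1, 4)}
print(inverses)  # {1: 1, 2: 3, 3: 2}
```

Treating a dimension-4 qudit as two qubits keeps only the additive structure; the multiplicative structure above is what the true GF(q) codes additionally use.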

**David Gross**, University of Cologne

*The representation theory of the Clifford group, with applications to resource theories*

I will report on an ongoing project to work out and exploit an analogue of Schur-Weyl duality for the Clifford group. Schur-Weyl duality establishes a one-to-one correspondence between irreps of the unitary group and those of the symmetric group. A similar program can be carried out for Cliffords.

The permutations are then replaced by certain discrete orthogonal maps.

As is the case for Schur-Weyl, this duality has many applications for problems in quantum information. It can be used, e.g., to derive quantum property tests for stabilizerness and Cliffordness, a new direct interpretation of the sum-negativity of Wigner functions, bounds on stabilizer rank, the construction of designs using few non-Clifford resources, etc.

[arXiv:1609.08172, arXiv:1712.08628, arXiv:1906.07230, arXiv:out.soon].

**Aleksander Kubica**, Perimeter Institute

*Error correction with the color code*

The color code is a topological quantum code with many valuable fault-tolerant logical gates. Its two-dimensional version may soon be realized with currently available superconducting hardware despite constrained qubit connectivity. In the talk, I will focus on how to perform error correction with the color code in d ≥ 2 dimensions. I will describe an efficient color code decoder, the Restriction Decoder, which uses as a subroutine any toric code decoder. I will also present numerical estimates of the storage threshold of the Restriction Decoder for the triangular color code against circuit-level depolarizing noise.

Based on arXiv:1905.07393 and arXiv:1911.00355.

**Anthony Leverrier**, Inria

*Towards local testability for quantum coding*

We introduce the hemicubic codes, a family of quantum codes obtained by associating qubits with the p-faces of the n-cube (for n > p) and stabilizer constraints with faces of dimension p ± 1. The quantum code obtained by identifying antipodal faces of the resulting complex encodes one logical qubit into N = 2^(n−p−1) (n choose p) physical qubits and displays local testability with a soundness of Ω(1/log² N), beating the current state of the art of 1/log³ N due to Hastings. We exploit this local testability to devise an efficient decoding algorithm that corrects arbitrary errors of size less than the minimum distance, up to polylog factors.
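For concreteness, the stated length formula can be checked numerically (a quick sketch based on the standard face count of the hypercube; the function name is mine):

```python
from math import comb

def hemicubic_length(n, p):
    """Physical-qubit count N = 2^(n-p-1) * C(n, p).

    The n-cube has 2^(n-p) * C(n, p) faces of dimension p, and
    identifying antipodal faces halves that count."""
    assert n > p >= 0
    return 2 ** (n - p - 1) * comb(n, p)

print(hemicubic_length(3, 1))   # 6: edges of the 3-cube after
                                # antipodal identification
print(hemicubic_length(10, 4))  # 6720
```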

We then extend this code family by considering the quotient of the n-cube by arbitrary linear classical codes of length n. We establish the parameters of these generalized hemicubic codes. Interestingly, if the soundness of the hemicubic code could be shown to be 1/log(N), similarly to the ordinary n-cube, then the generalized hemicubic codes could yield quantum locally testable codes of length not exceeding an exponential or even polynomial function of the code dimension.

(joint work with Vivien Londe and Gilles Zémor)

**Zi-Wen Liu**, Perimeter Institute

*No-go theorems for quantum resource purification*

The manipulation of quantum "resources" such as entanglement and coherence lies at the heart of quantum advantages and technologies. In practice, a particularly important kind of manipulation is to "purify" the quantum resources, since they are inevitably contaminated by noise and thus often lose their power or become unreliable for direct use. Here we derive fundamental limitations, enforced by the laws of quantum mechanics, on how effectively generic noisy resources can be purified; these limitations apply universally to any reasonable kind of quantum resource. Remarkably, it is impossible to achieve perfect resource purification, even probabilistically. Our theorems indicate strong limits on the efficiency of distillation, a widely used type of resource purification routine that underpins many key applications of quantum information science. In particular, we present explicit lower bounds on the resource cost of magic state distillation, a leading scheme for realizing scalable fault-tolerant quantum computation.

**Peter Love**, Tufts University

*Variational Quantum Eigensolvers and contextuality*

The variational quantum eigensolver (VQE) is the leading candidate for practical applications of Noisy Intermediate Scale Quantum (NISQ) devices. The method has been widely implemented on small NISQ machines in both superconducting and ion trap implementations. I will review progress to date and discuss two questions. First, how quantum mechanical are small VQE demonstrations? We will analyze this question using strong measurement contextuality. Second, can VQE be implemented at the scale of devices capable of exhibiting quantum supremacy, around 50 qubits? I will discuss some recent techniques to reduce the number of measurements required, which again use the concept of contextuality.
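At its smallest scale, the variational loop is easy to sketch. The following toy example is classically simulated and entirely my own (the Hamiltonian and ansatz are arbitrary illustrative choices, not from the talk): minimize the energy of a one-parameter ansatz over a grid and compare to exact diagonalization.

```python
import numpy as np

# Toy single-qubit VQE, classically simulated: minimize
# <psi(theta)| H |psi(theta)> over a one-parameter ansatz.
# H = Z + 0.5 X is an arbitrary illustrative Hamiltonian.
Hmat = np.array([[1.0, 0.5],
                 [0.5, -1.0]])

def ansatz(theta):
    # Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

thetas = np.linspace(0.0, 2 * np.pi, 2001)
energies = [ansatz(t) @ Hmat @ ansatz(t) for t in thetas]
vqe_energy = min(energies)
exact = np.linalg.eigvalsh(Hmat)[0]  # exact ground-state energy
print(vqe_energy, exact)             # close agreement
```

On a real device the expectation value is estimated from measurement statistics rather than computed exactly, which is where the measurement-reduction techniques in the talk come in.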

**Akimasa Miyake**, University of New Mexico

*Symmetry-protected topologically ordered phases for measurement-based quantum computation*

Measurement-based quantum computation (MBQC) is a computational scheme that simulates spacetime dynamics on a network of entanglement using local measurements and classical communication. The pursuit of a broad class of useful entanglement has led to the concept of symmetry-protected topologically ordered (SPTO) phases from condensed matter physics. A natural question is: what kinds of SPTO ground states can be used for universal MBQC, in a similar fashion to the 2D cluster state? 2D SPTO states are classified not only by global on-site symmetries but also by subsystem symmetries, which are fine-grained symmetries dependent on the lattice geometry. Recently, all ground states within SPTO cluster phases on the square and hexagonal lattices have been shown to be universal, based on the presence of subsystem symmetries and associated structures of quantum cellular automata. Motivated by this observation, we analyze the computational capability of SPTO cluster phases on all vertex-translative 2D Archimedean lattices. We show that there are four "fundamental" types of subsystem symmetry for cluster phases, called here ribbon, cone, fractal, and 1-form symmetries, and that the first three correspond one-to-one to three classes of Clifford quantum cellular automata. We conclude that nine of the eleven Archimedean lattices support universal cluster phases protected by one of the first three symmetries, while the remaining lattices, with 1-form symmetry, have a different capability related to error correction.

**Tomoyuki Morimae**, Kyoto University

*Fine-grained quantum supremacy and stabilizer rank*

It is known that several sub-universal quantum computing models cannot be classically simulated unless the polynomial-time hierarchy collapses. However, these results exclude only polynomial-time classical simulations. In this talk, based on fine-grained complexity conjectures, I show more "fine-grained" quantum supremacy results that rule out certain exponential-time classical simulations. I also show that the stabilizer rank conjecture holds under fine-grained complexity conjectures.

**Naomi Nickerson**, Psi Quantum

*Topological error correction in linear optical quantum computing*

In linear optical quantum computing the qubits do not fundamentally interact, and yet, via measurement, complex entanglement can be built up to implement quantum error-corrected computation via topological codes.

As a hardware platform for quantum computation, linear optics offers unique flexibility in the options for building up topological error-correcting schemes. Interesting examples include the long-range connectivity that is straightforward in a photonic architecture, and the ability to move qubits in temporal as well as spatial dimensions. I will give an overview of quantum computing with silicon photonics and demonstrate how these physical features of the photonic approach can inspire novel schemes for fault tolerance.

**Robert Raussendorf**, University of British Columbia

*A computationally universal phase of quantum matter*

We provide the first example of a symmetry-protected quantum phase that has universal computational power. Throughout this phase, which lives in two spatial dimensions, the ground state is a universal resource for measurement-based quantum computation.

Joint work with Cihan Okay, Dong-Sheng Wang, David T. Stephen, Hendrik Poulsen Nautrup; J-ref: Phys. Rev. Lett. 122, 090501

**Sam Roberts**, Psi Quantum

*Self-correction from symmetry*

A self-correcting quantum memory can store and protect quantum information for a time that increases without bound with the system size, without the need for active error correction. Unfortunately, the landscape of Hamiltonians based on stabilizer (subspace) codes is heavily constrained by numerous no-go results, and it is not known whether they can exist in three dimensions or fewer. In this talk, we will discuss the role of symmetry in self-correcting memories. Firstly, we will demonstrate that codes given by 2D symmetry-enriched topological (SET) phases that appear naturally on the boundary of 3D symmetry-protected topological (SPT) phases can be self-correcting -- provided that they are protected by an appropriate subsystem symmetry. Secondly, we discuss the feasibility of self-correction in Hamiltonians based on subsystem codes, guided by the concept of emergent symmetries. We present ongoing work on a new exactly solvable candidate model in this direction based on the 3D gauge color code. The model is a non-commuting, frustrated lattice model which we prove has an energy barrier to all bulk errors. Finding boundary conditions that encode logical qubits and retain the bulk energy barrier remains an open question.

**Robert Spekkens**, Perimeter Institute

*A resource theory of nonclassicality in Bell scenarios*

We take a resource-theoretic approach to the problem of quantifying nonclassicality in Bell scenarios. The resources are conceptualized as probabilistic processes from the setting variables to the outcome variables which have a particular causal structure, namely, one wherein the wings are only connected by a common cause. The distinction between classical and nonclassical is then defined in terms of whether or not a classical causal model can explain the correlations. The relative nonclassicality of such resources is quantified by considering their interconvertibility relative to the set of operations that can be implemented using a classical common cause (which correspond to local operations and shared randomness). Among other results, we show that the information contained in the degrees of violation of facet-defining Bell inequalities is not sufficient for quantifying nonclassicality, even though it is sufficient for witnessing nonclassicality. In addition to providing new insights on Bell nonclassicality, our work sets the stage for quantifying nonclassicality in more general causal networks and thus also for a resource-theoretic account of nonclassicality in computational settings. (Joint work with Elie Wolfe, David Schmid, Ana Belen Sainz, and Ravi Kunjwal)

**Tzu-Chieh Wei**, State University of New York

*Two-dimensional AKLT states as ground states of gapped Hamiltonians and resource for universal quantum computation*

Affleck, Kennedy, Lieb, and Tasaki (AKLT) constructed one- and two-dimensional spin models invariant under spin rotation. These are recognized as paradigmatic examples of symmetry-protected topological phases, including the spin-1 AKLT chain, whose provable nonzero spectral gap strongly supports Haldane's conjecture on the spectral gap of integer-spin chains. These states were shown to provide a universal resource for quantum computation in the framework of the measurement-based approach, including the spin-3/2 AKLT state on the honeycomb lattice and the spin-2 state on the square lattice, both of which display exponential decay of correlation functions. However, the nonzero spectral gap in these 2D models had not been proved analytically for over 30 years, until very recently. I will briefly review our understanding of quantum computational universality in the AKLT family. Then I will focus on demonstrating the nonzero spectral gap for several 2D AKLT models, including the decorated honeycomb and decorated square lattices, and the undecorated degree-3 Archimedean lattices. In brief, we now have universal resource states that are ground states of provably gapped local Hamiltonians. Such a feature may be useful in creating the resource states by cooling the system and might further help the exploration of quantum computational phases in generalized AKLT-Haldane phases.

**Gabriel Wong**, Fudan University

*Entanglement and extended conformal field theory (or how to get a tensor network from a CFT path integral)*

In a continuum field theory the Hilbert space does not factorize into local tensor products. How then can we define entanglement and the basic protocols of quantum information theory? In this talk we will show how the factorization problem can be solved in a class of 2D conformal field theories by directly appealing to the fusion rules. The solution suggests a tensor network description of a CFT path integral using the OPE data.