Characterizing Quantum Devices Using the Principles of Quantum Information


The title of this thesis, “Characterizing Quantum Devices Using the Principles of Quantum Information”, is taken from the NSTGRO fellowship of the same name. As with any work, the focus has become clearer after the fact. The structure is roughly as follows:

(1) Quantum computation.

(2) Quantum metrology.

(3) Quantum characterization.

This thesis outlines work pertaining to the development and characterization of quantum devices. The suggested methodologies borrow ideas and rely on techniques from quantum information theory. Importantly, this is meant to be a work of theoretical physics, rather than mathematics or computer science, and so we hope that the reader is able to glean physical intuition from it.

Arguably the most important question in recent years is “Is this a quantum computer?” Answering this question, in all of its various realizations, is the prerogative and impetus of the field of quantum characterization, verification, and validation (QCVV). We will generalize this question to “Is this a quantum device?”, provide theoretical motivation for a number of interesting experiments, and develop theories that we hope provide new insights into the field and have the potential to lead to further advancements in quantum technology.

Specifically, seven projects are discussed: developing single-site rotations in a Penning trap as a scalable mode of quantum computation, enabling large-scale quantum optimization in a neutral atom trap, benchmarking quantum computers with a quantum error-detecting code, bounding the integrated quantum Fisher information for metrological protocols, developing the theory of reservoir computing with stochastic reservoirs, rigorously deriving a theory of direct randomized benchmarking, and understanding the impact of Markovian errors in random circuits.

The first two projects are experimental collaborations working towards quantum computation with hundreds of qubits. In recent years, quantum devices have appeared poised to break the hundred-qubit mark, with varying levels of control and fidelity. Much work remains to be done in developing these technologies if we want to perform genuine quantum computation, but the demonstration of control over a large number of quantum degrees of freedom is, in itself, a marvel of engineering and a demonstration of our growing mastery of physics. In these two projects, we propose a new method of single-qubit control in an ion trap that routinely holds hundreds of qubits, and investigate how the measurement rate of neutral atom traps impacts our ability to execute quantum algorithms.

The next three projects work towards developing techniques to demonstrate genuine quantum information theoretic phenomena. The first work is simple, but was done at a time when many results claimed to demonstrate fault tolerance while instead implementing fault-tolerant circuits above threshold. We propose a simple Bacon-Shor code, over a family of circuits, that should be easy to implement on existing hardware with current error rates, and that should give a clear demonstration of actual fault tolerance: break-even between the logical error rate and the physical error rate. While quantum computation has stolen the scene, quantum metrology has already been demonstrated to provide significant improvement over classical methods, if only in the context of scientific experiments. In particular, the probing of quantum gravitational effects and of dark matter is only enabled by quantum devices. We first explore an application of sensing that relates to both dark matter detection and quantum computation, and then undertake a theoretical investigation of a computational technique that may be able to leverage these metrological advantages.
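As background for the metrological advantage invoked above (a textbook illustration, not a result from these projects), recall that the quantum Fisher information of a pure state under a generator $G$ is $F_Q = 4(\langle G^2\rangle - \langle G\rangle^2)$. For $N$ qubits probed by the collective generator $G = \tfrac{1}{2}\sum_i Z_i$, a GHZ state attains $F_Q = N^2$ (Heisenberg scaling) while a product state only attains $F_Q = N$ (the standard quantum limit). A minimal numerical sketch:

```python
import numpy as np

def collective_z(n):
    """Diagonal of G = (1/2) * sum_i Z_i in the computational basis."""
    diag = np.zeros(2**n)
    for idx in range(2**n):
        ones = bin(idx).count("1")            # qubits in |1>
        diag[idx] = 0.5 * ((n - ones) - ones)
    return diag

def qfi_pure(state, gdiag):
    """QFI of a pure state under a diagonal generator: 4 * Var(G)."""
    probs = np.abs(state)**2
    mean = np.sum(probs * gdiag)
    mean_sq = np.sum(probs * gdiag**2)
    return 4.0 * (mean_sq - mean**2)

n = 4
ghz = np.zeros(2**n)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)             # (|0...0> + |1...1>)/sqrt(2)
product = np.full(2**n, (1 / np.sqrt(2))**n)  # |+>^{x n}
print(qfi_pure(ghz, collective_z(n)))         # 16.0 = N^2
print(qfi_pure(product, collective_z(n)))     # 4.0 = N
```

The quadratic-versus-linear gap in these two outputs is the metrological advantage that quantum probes can offer over classical strategies.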

Finally, the last two chapters work towards the more general goal of characterizing quantum circuits. Despite significant research into the characterization of quantum states and gates, it is relatively poorly understood how errors impact the performance of structured circuits such as those used in error correction. By using techniques from the theory of randomized benchmarking we first establish an extremely general result about the decay rate of a large class of random circuits, and then use this result to understand how certain physical error rates impact the success probabilities of those circuits.
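For readers unfamiliar with the decay rates discussed here, the standard randomized-benchmarking fit model (general background, not the specific theory derived in these chapters) takes the average survival probability at circuit depth $m$ to be $p(m) = A f^m + B$, with the decay constant $f$ encoding the error rate. A minimal sketch of extracting $f$ from synthetic data under that model:

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_model(m, A, f, B):
    """Standard RB decay model: average survival probability at depth m."""
    return A * f**m + B

# Synthetic survival data generated from known parameters plus shot noise.
rng = np.random.default_rng(0)
depths = np.arange(1, 101, 5)
probs = rb_model(depths, 0.5, 0.97, 0.5) + rng.normal(0.0, 0.005, depths.size)

# Fit the model; p0 seeds the optimizer near a generic slow decay.
(A_fit, f_fit, B_fit), _ = curve_fit(rb_model, depths, probs, p0=[0.5, 0.9, 0.5])
r = (1 - f_fit) / 2   # average error rate per step (single-qubit convention)
print(f"decay f = {f_fit:.4f}, error rate r = {r:.4f}")
```

The later chapters generalize this picture: the decay rate of a much larger class of random circuits can be established and then related to physical (e.g., Markovian) error rates.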

This work was done in collaboration with others. With the exception of the last chapter, each chapter includes a reference to the corresponding published article or preprint. Sections of these papers that were done primarily by another author have been elided from this document; the interested reader should refer to the referenced works for the elided results.

Finally, I would like to acknowledge the agencies that funded this work: a NASA Space Technology Graduate Research Opportunity award; the U.S. Department of Energy, Office of Science, National Quantum Information Science Research Centers, Quantum Systems Accelerator (QSA); the Defense Advanced Research Projects Agency (DARPA) under Contract No. HR001120C0068; the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Quantum Testbed Pathfinder; the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA); the NSF JILA PFC grant 1734006; and NSF award DMR-1747426.

Department of Physics, University of Colorado