High-rate entanglement between a semiconductor spin and indistinguishable photons
Cluster states of entangled photons are the main building block for developing error-corrected, large-scale photonic quantum computers. In this publication, we demonstrate the generation of three-photon cluster states using a single quantum-dot-based device, achieving a generation rate two orders of magnitude higher than the previous state of the art across quantum technology platforms.
Photon-number entanglement generated by sequential excitation of a two-level atom
Exploiting the atom-like behaviour of a semiconductor quantum dot in a cavity, we develop a novel protocol for the generation of photon-number entangled states in the time basis: from a photon-number Bell state up to a series of multi-temporal-mode entangled states.
The results demonstrate the possibility of using single-photon sources to encode quantum information in new ways.
Bright Polarized Single-Photon Source Based on a Linear Dipole
Bright emission of polarized single photons from quantum dots is demonstrated by taking advantage of phonon-assisted relaxation and the intrinsic linear dipole structure of exciton states. By optically pumping the exciton state along one of its dipoles, we achieve highly efficient emission of indistinguishable photons with a linear polarization degree of up to 99%, without the need for complex cavity engineering.
Reproducibility of high-performance quantum dot single-photon sources
Solid-state quantum light emitters are core quantum technology devices required for a wide range of applications. Their integration in complex industrial systems is limited not only by improvements in efficiency but also by reproducible, high-fidelity fabrication at scale. By leveraging the full potential of semiconductor processing and Quandela’s technology, we demonstrate the scalable and reproducible fabrication of a large set of devices with top performance.
Generation of non-classical light in a photon-number superposition
Quantum information can be encoded in several degrees of freedom of single photons; we demonstrate the possibility to generate pulses of light containing a superposition of Fock states of different photon number with high quantum purity. By varying the excitation regime, we deterministically tune the proportions between the Fock states |0>, |1> and |2>.
These results thus demonstrate a new degree of freedom for qubit encoding in quantum computing protocols.
Near-optimal single-photon sources in the solid state
The first demonstration of bright solid-state quantum light emitter devices, engineered in a deterministic microcavity system, with record photon indistinguishability and single-photon purity: the first demonstration of a game-changing technology.
Energy-efficient quantum non-demolition measurement with a spin-photon interface
Spin-photon interfaces aim to coherently transfer quantum information between spin qubits and propagating pulses of light. In this collaborative work, we explore the potential for spin-photon interfaces to perform energy-efficient operations by exploiting quantum resources. We show that this advantage is robust against realistic imperfections in state-of-the-art implementations with quantum dots.
High-fidelity generation of four-photon GHZ states on-chip
High-fidelity on-chip generation and characterization of a four-photon Greenberger-Horne-Zeilinger (GHZ) state, using a bright single-photon source based on a single quantum dot in combination with a reconfigurable low-loss laser-written photonic chip.
Quantum Advantage in Information Retrieval
Random access codes have provided many examples of quantum advantage in communication, but concern only one kind of information retrieval task. We introduce a related task—the Torpedo Game—and show that it admits greater quantum advantage than the comparable random access code.
Perceval: A Software Platform for Discrete Variable Photonic Quantum Computing
We introduce Perceval, an evolutive open-source software platform for simulating and interfacing with discrete variable photonic quantum computers, and describe its main features and components.
Quantifying n-photon indistinguishability with a cyclic integrated interferometer
Indistinguishable single photons are key resources in photonic implementations of quantum information algorithms. However, fully general techniques to quantify indistinguishability are missing. Here we report such a technique, with new theoretical developments leading to an experimental demonstration.
Interfacing scalable photonic platforms: solid-state based multi-photon interference in a reconfigurable glass chip
An efficient realization of a modular quantum photonic platform interconnecting quantum-dot single-photon emitters, active demultiplexing, and integrated waveguides on silica. Thanks to the high brightness and device efficiency, we demonstrate a speed-up compared to similar experiments performed with probabilistic SPDC and four-wave-mixing sources.
One nine availability of a Photonic Quantum Computer on the Cloud toward HPC integration
In November 2022, we introduced the first cloud-accessible general-purpose quantum computer based on single photons. One of the key objectives was to keep the platform’s availability as high as possible while anticipating seamless compatibility with HPC hosting environments. In this article, we describe the design and implementation of our cloud-accessible quantum computing platform, and demonstrate one-nine availability (92%) for external users over a six-month period, higher than most online services. This work lays the foundation for advancing quantum computing accessibility and usability in hybrid HPC-QC infrastructures.
Simulating time-integrated photon counting using a zero-photon generator
We introduce a novel method to numerically simulate measurements of light produced by quantum emitters. The method circumvents a longstanding computational scaling problem by exploiting information hidden in zero-photon measurements. We demonstrate the method by computing photon detection probabilities from physical source models exponentially faster than the current state of the art.
A general-purpose single-photon-based quantum computing platform
Read the paper on Quandela’s 6-photon fully-programmable quantum processor, Ascella. This highly versatile device is the first of its kind based on manipulating single photons. The modular platform (based on semiconductor quantum light emitters and opto-electronic modules) presents record 6-qubit efficiency; it includes a software stack enhanced by machine learning that controls the platform, corrects hardware imperfections, and provides remote control for general quantum computing use. We program the platform both via gate-based and purely photonic encoding, looking to the near future with VQE and a supervised learning classification task, but also towards error correction protocols by demonstrating the very first heralded generation of a 3-photon GHZ state. We benchmark 1-qubit, 2-qubit and 3-qubit gates exactly as they run on the platform, with no tricks to inflate the performance for the advertising leaflet.
Solving graph problems with single-photons and linear optics
An important challenge for current and near-term quantum devices is finding useful tasks that can be performed on them. We first show how to efficiently encode a bounded n×n matrix A into a linear optical circuit with 2n modes. We then apply this encoding to the case where A is a matrix containing information about a graph G. We show that a photonic quantum processor consisting of single-photon sources, a linear optical circuit encoding A, and single-photon detectors can solve a range of graph problems, including finding the number of perfect matchings of bipartite graphs, computing permanental polynomials, determining whether two graphs are isomorphic, and the k-densest subgraph problem. We also propose pre-processing methods to boost the probabilities of observing the relevant detection events and thus improve performance. Finally, we present various numerical simulations which validate our findings.
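The matchings-to-permanents connection underlying this result can be illustrated classically: the number of perfect matchings of a bipartite graph equals the permanent of its biadjacency matrix. A minimal sketch (the function names below are our own, not from the paper):

```python
from itertools import permutations
from math import prod

def permanent(M):
    """Permanent of a square matrix, by brute force over permutations (O(n!))."""
    n = len(M)
    return sum(prod(M[i][p[i]] for i in range(n)) for p in permutations(range(n)))

# Biadjacency matrix of the complete bipartite graph K_{3,3}:
# every left vertex is connected to every right vertex.
k33 = [[1, 1, 1],
       [1, 1, 1],
       [1, 1, 1]]

# K_{3,3} has 3! = 6 perfect matchings.
print(permanent(k33))  # 6
```

Computing the permanent is #P-hard in general, which is precisely why sampling photon detection events from a circuit encoding A is an interesting alternative route.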
Certified randomness in tight space
Reliable randomness is a core ingredient in algorithms and applications ranging from numerical simulations to sampling and cryptography. The violation of a Bell inequality can certify that intrinsic randomness is being generated, but this certification typically requires spacelike separated devices. In this work, we provide new theoretical tools to certify randomness generation on a small-scale device and perform a first-of-its-kind integrated photonic demonstration combining a quantum dot based single-photon source and a reconfigurable glass chip.
Photonic Quantum Computing For Polymer Classification
We present a hybrid classical-quantum approach to the binary classification of polymer structures. Two polymer classes visual (VIS) and near-infrared (NIR) are defined based on the size of the polymer gaps. The hybrid approach combines one of the three methods, Gaussian Kernel Method, Quantum-Enhanced Random Kitchen Sinks or Variational Quantum Classifier, implemented by linear quantum photonic circuits (LQPCs), with a classical deep neural network (DNN) feature extractor.
Contextuality and Wigner Negativity Are Equivalent for Continuous-Variable Quantum Measurements
Quantum computers promise considerable speedups with respect to their classical counterparts. However, the identification of the innately quantum features that enable these speedups is challenging. In full generality, contextuality and Wigner negativity have been perceived as two such distinct resources. Here we show that they are in fact equivalent for the standard models of continuous-variable quantum computing. While our results provide a unifying picture of continuous-variable resources for quantum speedup, they also pave the way toward practical demonstrations of continuous-variable contextuality and shed light on the significance of negative probabilities in phase-space descriptions of quantum mechanics.
A Framework for Verifiable Blind Quantum Computation
While it is possible to benchmark devices or use certification techniques under various assumptions, the most stringent proof is given by verification protocols: they provide unconditional assurance that the client will either receive the correct outcome or abort the computation, even against a service provider that actively tries to corrupt the result. We provide the first framework for designing such protocols in a way that both encompasses most known protocols and makes it much simpler to create new ones. This streamlines the creation process and already allows us to improve on the previous state-of-the-art protocols for the verification of delegated quantum computation, by introducing two new constructions.
Strong Simulation of Linear Optical Processes
This paper provides an algorithm and general framework for the simulation of photons passing through linear optical interferometers.
A Complete Equational Theory for Quantum Circuits
This note introduces the first complete equational theory for quantum circuits. More precisely, it presents a set of circuit equations proved to be sound and complete: two circuits represent the same quantum evolution if and only if they can be transformed one into the other using the equations.
A Graphical Language for Linear Optical Quantum Circuits
We introduce the LOv-calculus, a graphical language for reasoning about linear optical quantum circuits with so-called vacuum-state auxiliary inputs.
Mitigating errors by quantum verification and post-selection
This paper proposes a technique for mitigating time-dependent errors in quantum circuits, and tests this technique on currently available quantum hardware.
Sequential generation of linear cluster states from a single photon emitter
Assessing the quality of near-term photonic quantum devices
The Photonic Quality Factor (PQF) is a scalable single-number metric for assessing the performance of current and near-term photonic quantum computing devices. We propose a series of benchmark tests targeting two main sources of noise, namely photon loss and distinguishability. The PQF is the largest number of input photons for which the output statistics pass all tests. We provide strong guarantees that passing the tests precludes efficient classical simulability.