Quantum 2025 – scientific programme
THU: Thursday Contributed Sessions
THU 6: Quantum Computing and Communication: Contributed Session II (Concepts)
THU 6.5: Talk
Thursday, September 11, 2025, 15:15–15:30, ZHG007
Expressivity Limits of Quantum Reservoir Computing — •Nils-Erik Schütte1,2, Niclas Götting2, Hauke Müntinga1, Meike List1,3, Daniel Brunner4, and Christopher Gies2 — 1German Aerospace Center, Institute for Satellite Geodesy and Inertial Sensing, Bremen, Germany — 2Institut für Physik, Fakultät V, Carl von Ossietzky Universität Oldenburg — 3University of Bremen — 4Institut FEMTO-ST, Université Franche-Comté CNRS UMR, Besançon, France
Quantum machine learning (QML) merges quantum computing and artificial intelligence, two transformative technologies for data processing. While gate-based quantum computing employs precise unitary operations on qubits via parameterized quantum circuits (PQCs), quantum reservoir computing (QRC) leverages physical systems as quantum neural networks, relying on Hamiltonian dynamics rather than controlled gate operations, with learning performed only at the output layer. Despite their differing foundations, the two approaches are closely connected and can be formally mapped onto each other.
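To make the contrast concrete, the following sketch sets up a minimal quantum reservoir in plain NumPy: a scalar input enters through a single-qubit RX rotation, a fixed random Hamiltonian provides the untrained reservoir dynamics, and only a linear readout on Pauli-Z expectation values is fitted. The reservoir size, encoding, and toy memory task are illustrative assumptions, not the setup of Ref. [1].

```python
# Minimal quantum reservoir computing sketch (illustrative toy model).
# Inputs are injected via single-qubit RX rotations, the reservoir evolves
# under a fixed random Hamiltonian, and only a linear output layer is trained.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 4
dim = 2 ** n_qubits

# Fixed random Hermitian reservoir Hamiltonian and its one-step unitary e^{-iH dt}
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (A + A.conj().T) / 2
evals, evecs = np.linalg.eigh(H)
dt = 1.0
U_res = evecs @ np.diag(np.exp(-1j * evals * dt)) @ evecs.conj().T

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_chain(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def rx_on_first_qubit(x):
    """RX(x) = cos(x/2) I - i sin(x/2) X acting on qubit 0."""
    rx = np.cos(x / 2) * I2 - 1j * np.sin(x / 2) * X
    return kron_chain([rx] + [I2] * (n_qubits - 1))

# Pauli-Z expectation values on each qubit serve as the reservoir readouts
Z_ops = [kron_chain([I2] * k + [Z] + [I2] * (n_qubits - 1 - k))
         for k in range(n_qubits)]

def reservoir_features(x_sequence):
    """Inject a scalar sequence step by step and record Z expectations."""
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0  # start in |0...0>
    feats = []
    for x in x_sequence:
        state = U_res @ (rx_on_first_qubit(x) @ state)
        feats.append([np.real(state.conj() @ (Zk @ state)) for Zk in Z_ops])
    return np.array(feats)

# Toy task: predict y_t = sin(x_{t-1}); only the linear readout W is trained
xs = rng.uniform(-np.pi, np.pi, size=200)
ys = np.sin(np.roll(xs, 1))
F = np.hstack([reservoir_features(xs), np.ones((len(xs), 1))])  # bias column
W = np.linalg.lstsq(F, ys, rcond=None)[0]
print("training MSE:", np.mean((F @ W - ys) ** 2))
```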
We formulate the QRC approach in the language of gate-based circuits and apply recently developed methods for PQCs to QRC. Contrary to expectations, we find that the effective computational dimensionality of quantum reservoirs does not scale with the reservoir dimension but is mainly determined by the input encoding [1]. For commonly used single-qubit rotations, we show that exponential scaling, one of the main promises of QRC over classical RC, cannot be reached.
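The role of the encoding can be checked numerically with a small sketch under the same toy assumptions as above (this is an illustration, not the analysis of Ref. [1]): after a single RX(x) injection, every measured expectation value is a trigonometric polynomial in x spanned by {1, cos x, sin x}, so the numerical rank of the feature matrix cannot exceed three, however many qubits the reservoir contains.

```python
# Illustrative check: the feature-matrix rank of a single-RX(x)-encoded
# reservoir is capped at 3, independent of the reservoir (qubit) dimension.
import numpy as np

def feature_rank(n_qubits, n_samples=200, seed=1):
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits

    # Random Hermitian reservoir Hamiltonian and its unitary e^{-iH}
    A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    H = (A + A.conj().T) / 2
    evals, evecs = np.linalg.eigh(H)
    U = evecs @ np.diag(np.exp(-1j * evals)) @ evecs.conj().T

    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def kron_chain(ops):
        out = np.array([[1.0 + 0j]])
        for op in ops:
            out = np.kron(out, op)
        return out

    Z_ops = [kron_chain([I2] * k + [Z] + [I2] * (n_qubits - 1 - k))
             for k in range(n_qubits)]
    psi0 = np.zeros(dim, dtype=complex)
    psi0[0] = 1.0

    # One RX(x) injection per sample, then read out all Z expectations
    xs = rng.uniform(-np.pi, np.pi, size=n_samples)
    feats = []
    for x in xs:
        rx = np.cos(x / 2) * I2 - 1j * np.sin(x / 2) * X
        enc = kron_chain([rx] + [I2] * (n_qubits - 1))
        psi = U @ (enc @ psi0)
        feats.append([np.real(psi.conj() @ (Zk @ psi)) for Zk in Z_ops])
    return np.linalg.matrix_rank(np.array(feats), tol=1e-6)

for n in range(2, 7):
    print(n, "qubits -> feature-matrix rank", feature_rank(n))
```

In this toy setting, enlarging the reservoir adds more readout observables but no new input frequencies, so the accessible function space, and with it the rank, stays fixed; this mirrors the encoding-limited scaling discussed above.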
[1] N.-E. Schütte et al., arXiv:2501.15528
Keywords: Quantum Reservoir Computing; Quantum Machine Learning