Hi all,
Tomorrow we will have two talks:
Eliott Mamon will tell us about 'How to model quantum bomb-testing? A framework for “interaction-free” experiments'.
Alexander Schmidhuber will talk about 'Complexity-Theoretic Limitations on Quantum Algorithms for Topological Data Analysis'.
See below for the abstracts. The talks will take place at 2pm in HIT E 41.1 or on Zoom: https://ethz.zoom.us/j/362994444.
Best, Ladina
%%%%%%
Title: How to model quantum bomb-testing? A framework for “interaction-free” experiments
Abstract: Taking inspiration from the "counterfactual computation" framework of [1], we propose an axiomatic framework for studying quantum experiments that can lead to counterfactual outcomes and, more generally, interaction-free outcomes. It is based on a separation of the Hilbert space into "off" and "on" subspaces. To motivate the framework and to clarify which devices one should choose in it to model a given experimental setup, we first introduce a purely classical "interaction-free" framework, in which counterfactual outcomes are impossible by construction. There, one makes a conceptually simpler ansatz of a classical framework instance and then applies a quantum extension scheme, in which several "choices of coherence" remain to be made: first within off-states and on-states, and then between the two. We highlight a resulting feature of the framework: if, in the extension scheme for a particular device, one chooses a lack of coherence between off-states and on-states, then the resulting quantum setup cannot yield counterfactual outcomes for the other devices. We apply our framework to the modeling of quantum "bomb-testing" experiments, illustrating how it organizes the different candidate models. We also discuss how the framework handles "quantum Zeno effect" bomb-testing protocols such as the Kwiat et al. protocol of [2]; notably, we show how the usual notion of "interaction-free" for these protocols is recovered within our framework. Lastly, we formalize an "existence of a noncontextual model" question for a given protocol in our framework. Using the unit separability criterion of [3], we numerically answer this question in the negative for the Kwiat et al. N-insertion protocol for N = 4, ..., 8.
[1] Graeme Mitchison and Richard Jozsa. "Counterfactual computation". Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences 457.2009 (2001), pp. 1175–1193.
[2] Paul Kwiat et al. "Interaction-free measurement". Physical Review Letters 74.24 (1995), pp. 4763–4766.
[3] Victor Gitton and Mischa P. Woods. "Solvable criterion for the contextuality of any prepare-and-measure scenario". Quantum 6 (2022), p. 732.
%%%%%%
Title: Complexity-Theoretic Limitations on Quantum Algorithms for Topological Data Analysis (https://arxiv.org/abs/2209.14286)
Abstract: Quantum algorithms for topological data analysis (TDA) have received a surge of attention recently. They seem to provide an exponential advantage over the best classical approach while remaining immune to dequantization procedures and the data-loading problem. In this talk, I will argue that quantum algorithms for TDA run in exponential time for almost all inputs by showing that the central problem of TDA - estimating Betti numbers - is intractable even for quantum computers. Specifically, we prove that the problem of computing Betti numbers exactly is #P-hard, while the problem of approximating Betti numbers up to multiplicative error is NP-hard. Because quantum computers are not expected to solve #P-hard or NP-hard problems in sub-exponential time, our results imply that quantum algorithms for TDA only provide a polynomial advantage. We verify our claim by showing that the seminal quantum algorithm for TDA developed by Lloyd et al. achieves a quadratic speedup over the best classical approach in almost all cases. Finally, we argue that an exponential quantum advantage can be recovered if problems beyond TDA are considered.
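For intuition ahead of the talk (this sketch is not from the paper): the Betti number beta_k counts independent k-dimensional holes, and for a small simplicial complex it can be computed by brute force from boundary-matrix ranks, beta_k = dim ker(d_k) - rank(d_{k+1}). The matrices and the `betti` helper below are illustrative assumptions, worked out here for a hollow triangle:

```python
import numpy as np

# Example complex: a hollow triangle (vertices a, b, c; edges ab, bc, ca).
# d1 is the boundary map from edges (columns ab, bc, ca) to vertices (rows a, b, c).
d1 = np.array([
    [-1,  0,  1],   # vertex a: boundary contributions -ab + ca
    [ 1, -1,  0],   # vertex b: +ab - bc
    [ 0,  1, -1],   # vertex c: +bc - ca
])

def betti(dk, dk1):
    """beta_k = (#k-simplices) - rank(d_k) - rank(d_{k+1}).

    Pass None for a boundary map that does not exist (d_0 or the map
    above the top dimension)."""
    n_k = dk.shape[1] if dk is not None else dk1.shape[0]
    rank_k = np.linalg.matrix_rank(dk) if dk is not None else 0
    rank_k1 = np.linalg.matrix_rank(dk1) if dk1 is not None else 0
    return n_k - rank_k - rank_k1

beta0 = betti(None, d1)  # connected components: 3 - 0 - rank(d1)
beta1 = betti(d1, None)  # independent loops:    3 - rank(d1) - 0
print(beta0, beta1)      # 1 1: one component, one loop
```

The classical cost here is dominated by rank computations on matrices whose size grows with the number of simplices, which can be exponential in the number of data points; that blow-up is what the quantum algorithms of Lloyd et al. target, and what the talk's hardness results constrain.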