Optimal Success Bounds for Single Query Quantum Algorithms Computing the General SUM Problem

Abstract

This thesis treats the problem of computing the sum of a string of arbitrary finite length drawn from an arbitrary finite alphabet, where the sum is defined as the total of the string's digits modulo the alphabet size. The resource considered is the number of queries to an oracle which hides the string and gives access to its individual digits.
Classically, this problem is straightforward: any number of queries fewer than the string length reveals no useful information about the sum. In the quantum setting, however, things are not so clear. When the alphabet size is two, the problem becomes finding the parity of a bit string. A seminal result in quantum computation shows that the parity can be found with certainty using only half the queries that are classically needed. As the alphabet size increases beyond two, however, less is known. An algorithm by Meyer and Pommersheim computes the sum in this general setting with success probability $\min\left\lbrace\frac{\left\lfloor\frac{n}{n-q}\right\rfloor}{k}, 1\right\rbrace$, where $n$ is the string length, $k$ is the alphabet size, and $q$ is the number of queries made. This algorithm succeeds with probability only slightly above guessing when the number of queries is half the string length, and with certainty when $n-1$ queries are made. The question dealt with in this thesis is whether this algorithm is optimal for the general sum problem.
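The success probability above can be sketched numerically. The following is a minimal illustration of the stated formula only, not of the algorithm itself; the function name and parameter checks are this sketch's own.

```python
def success_probability(n: int, k: int, q: int) -> float:
    """Success probability of the Meyer-Pommersheim SUM algorithm,
    as stated in the abstract: min(floor(n/(n-q))/k, 1).

    n: string length, k: alphabet size, q: number of queries (0 < q < n).
    """
    if not (0 < q < n):
        raise ValueError("expected 0 < q < n")
    # Integer division computes floor(n / (n - q)) exactly.
    return min((n // (n - q)) / k, 1.0)

# Parity case (k = 2): half the queries already give certainty.
print(success_probability(4, 2, 2))  # 1.0
# General alphabet, q = n/2: probability 2/k, only slightly above guessing (1/k).
print(success_probability(6, 5, 3))  # 0.4
# With n - 1 queries the sum is found with certainty for any k.
print(success_probability(6, 5, 5))  # 1.0
```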
The problem is expressed as a semidefinite program, stated for all instances. The instance of strings of length two with algorithms making a single query is solved, and a proof of optimality is given. Significant insight into the multi-query case is also provided.