We consider the problem of approximately integrating a Lipschitz function
f (with a known Lipschitz constant) over an interval. The goal is to
achieve an error of at most ε using as few samples of f as
possible. We work in the adaptive framework: on every problem instance, an
adaptive algorithm should perform almost as well as the best possible
algorithm tuned to that particular instance. We write DOPT and ROPT for the
numbers of samples used by the best possible deterministic and randomized
algorithms, respectively. We give a deterministic algorithm that uses
O(DOPT(f, ε) ⋅ log (ε^{-1} / DOPT(f, ε)))
samples and show that an asymptotically better algorithm is impossible.
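To illustrate how deterministic adaptivity concentrates samples where f is hard, the following is a minimal sketch in the spirit of (but not identical to) the algorithm above. On each subinterval, the Lipschitz condition brackets the integral between upper and lower envelopes, and the algorithm greedily splits the subinterval with the widest bracket.

```python
import heapq

def adaptive_integrate(f, a, b, L, eps):
    """Greedy adaptive quadrature for an L-Lipschitz f on [a, b].

    On a subinterval of length h with endpoint values flo, fhi and
    d = fhi - flo, the Lipschitz envelopes bracket the integral in a
    band of width (L*h**2 - d**2/L) / 2, and the trapezoid estimate sits
    at the center of that band.  We split the widest band until the total
    width is at most 2*eps, so the trapezoid sum is within eps.
    """
    def width(flo, fhi, h):
        d = fhi - flo
        return (L * h * h - d * d / L) / 2.0

    fa, fb = f(a), f(b)
    total = width(fa, fb, b - a)
    # Max-heap via negated keys: (-width, lo, hi, f(lo), f(hi)).
    heap = [(-total, a, b, fa, fb)]
    while total > 2.0 * eps:
        neg_w, lo, hi, flo, fhi = heapq.heappop(heap)
        total += neg_w  # neg_w is negative: drop the split interval's band
        mid = (lo + hi) / 2.0
        fmid = f(mid)
        for l, r, fl, fr in ((lo, mid, flo, fmid), (mid, hi, fmid, fhi)):
            w = width(fl, fr, r - l)
            total += w
            heapq.heappush(heap, (-w, l, r, fl, fr))
    # Trapezoid sum over the final partition.
    return sum(0.5 * (fl + fr) * (r - l) for _, l, r, fl, fr in heap)
```

On an instance such as f(x) = |x − 0.3| with L = 1, subintervals where f is linear get zero bracket width after one evaluation, so the budget is spent almost entirely near the kink; this is the gap between the worst case and the instance-optimal cost that the adaptive framework measures.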
However, any deterministic algorithm requires
Ω(ROPT(f, ε)^2) samples on some problem
instance. By combining a deterministic adaptive algorithm with variance-reduced
Monte Carlo sampling, we give an algorithm that uses at most
O(ROPT(f, ε)^{4/3} +
ROPT(f, ε) ⋅ log (1/ε))
samples. We also show that any algorithm requires
Ω(ROPT(f, ε)^{4/3} +
ROPT(f, ε) ⋅ log (1/ε))
samples in expectation on some problem instance (f, ε),
which proves that our algorithm is optimal.
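The variance-reduction ingredient can be illustrated by a classical control-variate construction (a sketch under our own choices of grid size and sample count, not the algorithm analyzed above): deterministic samples define a piecewise-linear surrogate g whose integral is known exactly, and only the residual f − g, which is small because f is Lipschitz, is estimated by random sampling.

```python
import random

def mc_integrate_cv(f, a, b, n_grid, n_mc, seed=0):
    """Monte Carlo integration with a piecewise-linear control variate.

    A grid of n_grid + 1 deterministic samples defines a piecewise-linear
    interpolant g.  Its integral is the exact trapezoid sum, and the
    remainder integral(f - g) is estimated by n_mc uniform random samples;
    since |f - g| is small for Lipschitz f, the estimator's variance is
    far below that of plain Monte Carlo on f.
    """
    rng = random.Random(seed)
    h = (b - a) / n_grid
    xs = [a + i * h for i in range(n_grid + 1)]
    ys = [f(x) for x in xs]

    def g(x):
        # Piecewise-linear interpolant through the grid samples.
        i = min(int((x - a) / h), n_grid - 1)
        t = (x - xs[i]) / h
        return (1 - t) * ys[i] + t * ys[i + 1]

    exact_g = sum(0.5 * (ys[i] + ys[i + 1]) * h for i in range(n_grid))
    points = (a + (b - a) * rng.random() for _ in range(n_mc))
    residual = sum(f(x) - g(x) for x in points) / n_mc
    return exact_g + (b - a) * residual
```

For an L-Lipschitz f the residual on each grid cell is at most L·h²/8 in absolute value, so each random sample is cheap in variance; balancing the deterministic and random sample counts is what yields exponents like 4/3 in the bounds above.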