Abstract

Reproducing kernel Hilbert spaces are elucidated without assuming prior familiarity with Hilbert spaces. Compared with extant pedagogic material,
greater care is taken in motivating the definition of reproducing kernel Hilbert spaces and in explaining when and why these spaces are efficacious.
The novel viewpoint is that reproducing kernel Hilbert space theory studies extrinsic geometry, associating with each geometric configuration a canonical
overdetermined coordinate system. This coordinate system varies continuously with changing geometric configurations, making it well-suited for studying
problems whose solutions also vary continuously with changing geometry. This primer can also serve as an introduction to infinite-dimensional linear algebra
because reproducing kernel Hilbert spaces have more properties in common with Euclidean spaces than do more general Hilbert spaces.

This primer gives a gentle and novel introduction to RKHS theory. It also presents several classical applications. It concludes by focusing on recent developments
in the machine learning literature concerning embeddings of random variables. Parenthetical remarks provide greater technical detail, which some readers
may welcome, but they can be skipped without compromising the cohesion of the primer. Proofs are included for those wishing to gain experience working with RKHSs;
simple proofs are preferred to short, clever, but otherwise uninformative proofs. Italicised comments appearing in proofs provide intuition or orientation or both.

A Primer on Reproducing Kernel Hilbert Spaces aims to empower readers to recognize when and how RKHS theory can benefit them in their own work.