This paper describes a new framework for processing images by example, called
"image analogies." The framework involves two stages: a design phase, in which a
pair of images, with one image purported to be a "filtered" version of the other,
is presented as "training data"; and an application phase, in which the learned
filter is applied to some new target image in order to create an "analogous"
filtered result. Image analogies are based on a simple multiscale autoregression,
inspired primarily by recent results in texture synthesis. By choosing different
types of source image pairs as input, the framework supports a wide variety of
"image filter" effects, including traditional image filters, such as blurring or
embossing; super-resolution, in which a higher-resolution image is inferred from
a low-resolution source; improved texture synthesis, in which some textures are
synthesized with better coherence than previous approaches; texture transfer, in
which images are "texturized" with some arbitrary source texture; artistic
filters, in which various drawing and painting styles are synthesized based on
scanned real-world examples; and texture-by-numbers, in which realistic scenes,
composed of a variety of textures, are created using a simple painting
interface.
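The core of the framework described above can be sketched in code. The sketch below is a deliberately minimal, single-scale variant under several simplifying assumptions: it uses brute-force L2 neighborhood matching only (omitting the multiscale pyramid, approximate nearest-neighbor search, and coherence term that the full method relies on), works on grayscale arrays, and the function name `image_analogy` is invented for illustration. Given a training pair (A, A') and a new target B, it copies, for each pixel of B, the A' pixel whose A-neighborhood best matches B's neighborhood.

```python
import numpy as np

def image_analogy(A, Ap, B, radius=1):
    """Single-scale image-analogy sketch: A : A' :: B : B'.

    For each pixel of B, find the pixel of A whose square
    (2*radius+1)^2 neighborhood best matches B's neighborhood in L2,
    then copy the corresponding pixel of A' into the output B'.
    """
    k = 2 * radius + 1
    # Replicate-pad so every pixel has a full neighborhood.
    A_pad = np.pad(A.astype(float), radius, mode="edge")
    B_pad = np.pad(B.astype(float), radius, mode="edge")

    # Flatten every neighborhood of A into a feature vector.
    feats, coords = [], []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            feats.append(A_pad[i:i + k, j:j + k].ravel())
            coords.append((i, j))
    feats = np.asarray(feats)

    # Synthesize B' pixel by pixel via nearest-neighborhood lookup.
    Bp = np.empty_like(B)
    for i in range(B.shape[0]):
        for j in range(B.shape[1]):
            q = B_pad[i:i + k, j:j + k].ravel()
            best = np.argmin(((feats - q) ** 2).sum(axis=1))
            bi, bj = coords[best]
            Bp[i, j] = Ap[bi, bj]
    return Bp
```

As a toy check of the "learned filter" idea: if A is a gradient image and A' its photographic negative, applying the analogy to a new image B made of the same gray levels reproduces the inversion filter on B. The full method differs chiefly in searching a Gaussian pyramid coarse-to-fine and trading off best-match against coherence with already-synthesized neighbors.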