Multiple Representations in KeLP

KeLP natively supports a multiple-representation formalism, which is useful when the same data can be described by different observable properties. For example, in NLP one can derive features of a sentence at different syntactic levels (e.g., part-of-speech, chunk, dependency) and let a learning algorithm treat each of them with a different kernel function.
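To make the idea concrete, here is a minimal plain-Java sketch (not KeLP's own API) of a sentence viewed through two named representations; the representation names and features are illustrative only. In KeLP, each Example similarly stores several named representations of the same object.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RepresentationsDemo {
    // Two views of the same sentence, hand-built here for illustration:
    // a lexical level ("words") and a part-of-speech level ("pos").
    static Map<String, List<String>> buildReps() {
        Map<String, List<String>> reps = new LinkedHashMap<>();
        reps.put("words", List.of("kelp", "supports", "kernels"));
        reps.put("pos",   List.of("NNP", "VBZ", "NNS"));
        return reps;
    }

    public static void main(String[] args) {
        // Each named representation can later be paired with its own kernel.
        buildReps().forEach((name, feats) ->
            System.out.println(name + " -> " + feats));
    }
}
```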

The kernel function is the only component that knows which representation it operates on. To use multiple representations, each with a specific kernel function, we must specify for each kernel which representation it should use. Note that, to obtain comparable scores from different kernels, we normalize each kernel by wrapping it in a NormalizationKernel.
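The normalization applied here corresponds to the standard kernel normalization K'(x,y) = K(x,y) / sqrt(K(x,x) * K(y,y)), which projects every implicit feature vector onto the unit sphere so that scores from different kernels live on a comparable scale. A minimal plain-Java sketch of this computation (a simplified stand-in for what NormalizationKernel does, not KeLP's implementation):

```java
import java.util.function.BiFunction;

public class NormalizationDemo {
    // Wrap a base kernel so that K'(x,y) = K(x,y) / sqrt(K(x,x) * K(y,y)).
    static BiFunction<double[], double[], Double> normalize(
            BiFunction<double[], double[], Double> k) {
        return (x, y) -> k.apply(x, y)
                / Math.sqrt(k.apply(x, x) * k.apply(y, y));
    }

    // A simple linear kernel: the dot product of the two vectors.
    static double linear(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    public static void main(String[] args) {
        double[] a = {3, 0};
        double[] b = {0, 4};
        double[] c = {6, 0};
        BiFunction<double[], double[], Double> norm =
                normalize(NormalizationDemo::linear);
        System.out.println(norm.apply(a, c)); // same direction: 1.0, scale removed
        System.out.println(norm.apply(a, b)); // orthogonal vectors: 0.0
    }
}
```

After normalization every self-similarity K'(x,x) equals 1, so each kernel in a combination contributes values in [-1, 1] regardless of the magnitude of its raw scores.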

A weighted linear combination of kernel contributions is obtained by instantiating a LinearKernelCombination and adding each weighted kernel to it via its add method. Finally, we set the resulting kernel on the passive-aggressive algorithm.
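Concretely, the combined kernel computes K(x,y) = sum_i w_i * K_i(x,y), where each K_i is evaluated only on its assigned representation. The following plain-Java sketch mimics that mechanism under simplified assumptions (the representation names rep1/rep2 and the add/combined helpers are illustrative, not KeLP's classes):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

public class CombinationDemo {
    // An example holds several named representations of the same object.
    record Example(Map<String, double[]> reps) {}

    // One kernel bound to a specific representation, with its weight.
    record WeightedKernel(double weight, String repName,
                          BiFunction<double[], double[], Double> kernel) {}

    static final List<WeightedKernel> combination = new ArrayList<>();

    static void add(double weight, String repName,
                    BiFunction<double[], double[], Double> kernel) {
        combination.add(new WeightedKernel(weight, repName, kernel));
    }

    // K(x,y) = sum_i w_i * K_i applied to representation i of x and y.
    static double combined(Example x, Example y) {
        double s = 0;
        for (WeightedKernel wk : combination)
            s += wk.weight() * wk.kernel().apply(x.reps().get(wk.repName()),
                                                 y.reps().get(wk.repName()));
        return s;
    }

    // A simple linear kernel: the dot product of the two vectors.
    static double linear(double[] x, double[] y) {
        double s = 0;
        for (int i = 0; i < x.length; i++) s += x[i] * y[i];
        return s;
    }

    public static void main(String[] args) {
        Example x = new Example(Map.of("rep1", new double[]{1, 0},
                                       "rep2", new double[]{2, 2}));
        Example y = new Example(Map.of("rep1", new double[]{1, 1},
                                       "rep2", new double[]{0, 1}));
        add(0.7, "rep1", CombinationDemo::linear);
        add(0.3, "rep2", CombinationDemo::linear);
        System.out.println(combined(x, y)); // 0.7*1 + 0.3*2, i.e. about 1.3
    }
}
```

The weights let you tune how much each syntactic level contributes to the final score; since a linear combination of valid kernels is itself a valid kernel, the result can be handed to any kernel-based learner, such as the passive-aggressive algorithm mentioned above.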