File
code/H_I_D_A_C_K/meta_estimators/KJS_DJS_estimation.m

+function [K] = KJS_DJS_estimation(Y1,Y2,co)

+%Estimates the Jensen-Shannon kernel of two distributions from which we have samples (Y1 and Y2) using the relation:

+%K_JS(f_1,f_2) = log(2) - D_JS(f_1,f_2), where D_JS is the Jensen-Shannon divergence.

+%

+%%Note:

+% 1) We use the naming convention 'K<name>_estimation' to make it easy to embed new kernels on distributions.

+% 2) This is a meta method: the Jensen-Shannon divergence estimator applied inside can be arbitrary.

+%

+%INPUT:

+% Y1: Y1(:,t) is the t^th sample from the first distribution.

+% Y2: Y2(:,t) is the t^th sample from the second distribution. Note: the number of samples in Y1 [=size(Y1,2)] and Y2 [=size(Y2,2)] can be different.
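The meta step itself is just the relation quoted above: estimate D_JS by any available method, then set K_JS = log(2) - D_JS. A minimal sketch of that scheme in Python, for 1-D samples (the MATLAB file handles d-dimensional column samples); the histogram-based plug-in divergence estimator `d_js_hist` is a hypothetical stand-in for whichever D_JS estimator the `co` object would select:

```python
import numpy as np

def d_js_hist(y1, y2, bins=20):
    """Plug-in Jensen-Shannon divergence estimate (in nats) from 1-D samples,
    using histograms on a shared support. Illustrative only: in the meta
    scheme any consistent D_JS estimator could take its place."""
    lo = min(y1.min(), y2.min())
    hi = max(y1.max(), y2.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(y1, bins=edges)
    q, _ = np.histogram(y2, bins=edges)
    p = p / p.sum()          # empirical pmf of the first sample
    q = q / q.sum()          # empirical pmf of the second sample
    m = 0.5 * (p + q)        # mixture distribution

    def kl(a, b):
        # Kullback-Leibler divergence; terms with a=0 contribute 0,
        # and b=m is positive wherever a is.
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def k_js_estimation(y1, y2):
    # Meta step mirroring KJS_DJS_estimation.m: K_JS = log(2) - D_JS.
    return np.log(2) - d_js_hist(y1, y2)
```

Since 0 <= D_JS <= log(2), the resulting kernel value lies in [0, log(2)]: near log(2) when the two samples come from the same distribution, near 0 when their supports are disjoint.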