eps

p

probConds

protected double[][] probConds

Conditional probabilities.

zlambda

protected double[] zlambda

Normalization factors, one for each x. (CDM questions, 2008: Are these
needed only at training time? Can we avoid allocating this at test time,
unlike what LambdaSolveTagger now does? Is the place where it is set from
ySize wrong?)

LambdaSolve

public LambdaSolve()

Method Detail

setNonBinary

public void setNonBinary()

setBinary

public void setBinary()

transformValues

public void transformValues()

This is a specialized procedure that changes the values
of parses for semantic ranking.
The highest value is changed to 2/3,
values of 1 are changed to 1/(3*numones), and 0 is unchanged.
This is used to rank the ordering for the best parse higher.
The values are in p.data.values.
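The documented transformation can be sketched as a standalone helper. The real method mutates p.data.values in place; the class and method names below are illustrative only.

```java
public class TransformValuesSketch {
    /**
     * Returns a transformed copy of the input: the maximum entry becomes 2/3,
     * each entry equal to 1 becomes 1/(3*numOnes), and entries of 0 are
     * left unchanged, as described in the documentation above.
     */
    static double[] transform(double[] values) {
        int numOnes = 0;
        double max = Double.NEGATIVE_INFINITY;
        int maxIdx = -1;
        for (int i = 0; i < values.length; i++) {
            if (values[i] == 1.0) numOnes++;
            if (values[i] > max) { max = values[i]; maxIdx = i; }
        }
        double[] out = values.clone();
        for (int i = 0; i < out.length; i++) {
            if (i == maxIdx) out[i] = 2.0 / 3.0;
            else if (out[i] == 1.0) out[i] = 1.0 / (3.0 * numOnes);
            // entries of 0.0 stay 0.0
        }
        return out;
    }
}
```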

improvedIterative

public void improvedIterative()

Iterate until convergence. I usually use the other method that
does a fixed number of iterations.

improvedIterative

public void improvedIterative(int iters)

Does a fixed number of IIS iterations.

Parameters:

iters - Number of iterations to run
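The difference between the two overloads is the stopping rule. The actual IIS update is internal to LambdaSolve, so the sketch below uses a toy step interface to stand in for it; all names are illustrative.

```java
public class IterationModes {
    /** Stand-in for one IIS update; returns the size of the update it made. */
    interface Step { double run(); }

    // improvedIterative(): loop until the update size drops below a tolerance.
    static int untilConvergence(Step step, double tol, int maxIters) {
        int done = 0;
        while (done < maxIters && step.run() > tol) done++;
        return done;
    }

    // improvedIterative(int iters): simply run a fixed number of steps,
    // the mode the documentation above recommends.
    static void fixedIterations(Step step, int iters) {
        for (int i = 0; i < iters; i++) step.run();
    }
}
```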

pcond

public double pcond(int y,
int x)

fnum

protected double fnum(int x,
int y)

checkCorrectness

public boolean checkCorrectness()

Check whether the constraints are satisfied, the probabilities sum to one, etc. Prints out a message
if there is something wrong.
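One of the documented checks, that the conditional probabilities sum to one for every x, can be sketched as follows. This is a hypothetical standalone helper over a probConds-shaped array, not the actual implementation.

```java
public class SumToOneCheck {
    /**
     * Returns true iff, for every context x, the row probConds[x] sums to 1
     * within the given tolerance; prints a message for the first bad row.
     */
    static boolean rowsSumToOne(double[][] probConds, double tol) {
        for (int x = 0; x < probConds.length; x++) {
            double sum = 0.0;
            for (int y = 0; y < probConds[x].length; y++) sum += probConds[x][y];
            if (Math.abs(sum - 1.0) > tol) {
                System.err.println("Probabilities for x=" + x + " sum to " + sum);
                return false;
            }
        }
        return true;
    }
}
```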

save_lambdas

readL

Read the lambdas from the file.
The file contains the number of lambda weights (an int) followed by
the weights themselves.
Historical note: unlike the format used by the method read(String),
the file does not contain xSize and ySize.

Parameters:

filename - The file to read from

read_lambdas

public static double[] read_lambdas(java.io.DataInputStream rf)

Read the lambdas from the stream.

Parameters:

rf - Stream to read from.

Returns:

An array of lambda values read from the stream.
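The documented stream format (an int count followed by that many doubles) can be sketched as a round trip. readBack mirrors what read_lambdas is documented to do; writeOut is a hypothetical counterpart added only for the demonstration.

```java
import java.io.*;

public class LambdaIO {
    /** Hypothetical writer: an int count, then each weight as a double. */
    static void writeOut(DataOutputStream out, double[] lambdas) throws IOException {
        out.writeInt(lambdas.length);
        for (double l : lambdas) out.writeDouble(l);
    }

    /** Mirrors the documented read: count first, then that many doubles. */
    static double[] readBack(DataInputStream in) throws IOException {
        int n = in.readInt();                 // number of lambda weights
        double[] lambdas = new double[n];
        for (int i = 0; i < n; i++) lambdas[i] = in.readDouble();
        return lambdas;
    }
}
```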

logLikelihood

public double logLikelihood()

Returns:

The loglikelihood of the empirical distribution as predicted by the model p.

divide

public static double divide(double first,
double second)

Given a numerator and denominator in log form, this calculates
the conditional model probability.

Returns:

Math.exp(first)/Math.exp(second);
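A sketch of the documented behavior. Mathematically Math.exp(first)/Math.exp(second) equals Math.exp(first - second), and the subtraction form avoids overflow and underflow when the log values are large in magnitude; whether the actual implementation uses this form is an assumption here.

```java
public class LogDivide {
    /**
     * Given log-space numerator and denominator, returns the ratio of
     * their exponentials, computed as exp(first - second) for stability.
     */
    static double divide(double first, double second) {
        return Math.exp(first - second);  // == Math.exp(first) / Math.exp(second)
    }
}
```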

main

public static void main(java.lang.String[] args)

With arguments, this will print out the lambda parameters of a
bunch of .lam files (which are assumed to all be the same size).
(Without arguments, it does some creaky old self-test.)

Parameters:

args - command line arguments

logLikelihoodNeg

public double logLikelihoodNeg()

Calculate the log-likelihood from scratch, hashing the conditional
probabilities in pcond, which we will use later. This is for
a different model, in which all features effectively get negative weights;
this model is easier to use for heuristic search.
p(ti|s) = exp(sum_j -(e^lambda_j) * f_j(ti))

Returns:

The negative log likelihood of the data
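The formula above can be sketched for a single outcome ti. Each effective weight is -(e^lambda_j), so the score is always in (0, 1] for non-negative feature values. The names below are illustrative, not the class's actual API.

```java
public class NegParamScore {
    /**
     * Computes exp(sum_j -(e^lambda_j) * f_j) for one outcome, where f
     * holds the feature values f_j(ti) and lambda the raw parameters.
     */
    static double score(double[] lambda, double[] f) {
        double sum = 0.0;
        for (int j = 0; j < lambda.length; j++) {
            sum += -Math.exp(lambda[j]) * f[j];  // effective weight: -(e^lambda_j)
        }
        return Math.exp(sum);
    }
}
```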

logLikelihoodScratch

public double logLikelihoodScratch()

Calculate the log-likelihood from scratch, hashing the conditional
probabilities in pcond, which we will use for the derivative later.

Returns:

The log likelihood of the data

getDerivatives

public double[] getDerivatives()

Computes the derivatives, assuming the lambdas are already in the array.

getDerivativesNeg

public double[] getDerivativesNeg()

Computes the derivatives, assuming the lambdas are already in the array.
This is for the case where the model is parameterized such that all
weights are negative; see also logLikelihoodNeg.