John Reid wrote:
> I have a problem for which I do not know any algorithm to solve it. I'm
> wondering if there is something in scipy that will help me. Apologies if
> there isn't and this is off-topic.
>
> I have a set of N data points, which I want to place on a k x m grid. I
> have a cost function D(x1,x2) that measures how different 2 data points
> are. I want to minimise the sum of D over all neighbouring pairs of
> points on the grid (i.e. I want similar points to be neighbours on the
> grid).
>
> A) Is there some established technique to do this?
Yes. It is called (un)constrained optimization, or optimal control, or, in
physics, inverse problems. You can see Lions et al. (he received a Fields
medal for that) for some bibliography.
To make an optimal choice you will have to minimize or maximize the cost
function, with or without constraints; that means setting the derivative of
the cost function with respect to each parameter to zero.
Basically, you have 2 choices:
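A minimal sketch of that recipe with scipy.optimize.minimize (the cost
function and target here are made up just to show the mechanics; the `jac`
argument supplies the analytic derivative that the optimizer drives to zero):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quadratic cost with a known minimizer.
target = np.array([1.0, -2.0, 0.5])

def cost(x):
    return float(np.sum((x - target) ** 2))

def grad(x):
    # Analytic derivative of the cost w.r.t. each parameter.
    return 2.0 * (x - target)

# BFGS stops when this gradient is (numerically) zero.
res = minimize(cost, x0=np.zeros(3), jac=grad, method="BFGS")
```

Supplying the analytic gradient is exactly the "manual (good!)" option: the
optimizer never has to estimate derivatives by finite differences.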
The sensitivity method, where you calculate the Fréchet derivative of the
cost function by hand (good!) or numerically (bouuu, very bad!), i.e. the
derivative with respect to each parameter, taken against every other
parameter, for every single evaluation of your cost function. For a k*m
grid this means a (k*m)**2 matrix representing the derivative of the cost
function evaluated at one point.
The adjoint method (very good, very modern, less calculus, numerically
stable), where you evaluate the derivative of the cost function by solving
the adjoint operator (calculated analytically (better) or numerically (less
accurate)). It is the best and most stable method, because it replaces an
ill-posed problem by a sequence of well-posed problems, in the Hadamard
sense.
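To make the contrast concrete, here is a toy adjoint computation (the
forward model A u = p and the quadratic misfit are my own invented example,
not anything from the thread): instead of n sensitivity solves, one extra
solve with the transpose (adjoint) operator yields the whole gradient.

```python
import numpy as np

# Invented forward model: u(p) solves A u = p,
# cost J(p) = 0.5 * ||u(p) - d||^2.
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned
d = rng.standard_normal(n)

def cost(p):
    u = np.linalg.solve(A, p)
    r = u - d
    return 0.5 * r @ r

def grad_adjoint(p):
    # dJ/dp = A^{-T} (u - d): one adjoint solve gives every component,
    # versus n forward solves for the sensitivity (finite-difference) route.
    u = np.linalg.solve(A, p)
    return np.linalg.solve(A.T, u - d)

# Sanity check against central finite differences.
p0 = rng.standard_normal(n)
g = grad_adjoint(p0)
eps = 1e-6
g_fd = np.array([
    (cost(p0 + eps * e) - cost(p0 - eps * e)) / (2 * eps)
    for e in np.eye(n)
])
```

The finite-difference column here is only the check; the adjoint solve is
the part that scales to large parameter counts.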
> B) Is it in scipy?
This one, I don't know :)
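For the grid-placement question as originally posed, one simple heuristic
(my own sketch, not something from scipy or the thread, and it can get
stuck in local minima) is a greedy pairwise-swap local search over the
assignment of points to grid cells; the data and distance D below are made
up for illustration:

```python
import numpy as np

# Hypothetical data: 12 random 2-D points to place on a 3 x 4 grid.
rng = np.random.default_rng(0)
k, m = 3, 4
N = k * m
pts = rng.standard_normal((N, 2))

def D(a, b):
    # Difference measure between data points a and b (squared distance here).
    return float(np.sum((pts[a] - pts[b]) ** 2))

def grid_cost(perm):
    # Sum D over all horizontally/vertically neighbouring pairs on the grid.
    g = perm.reshape(k, m)
    c = 0.0
    for i in range(k):
        for j in range(m):
            if i + 1 < k:
                c += D(g[i, j], g[i + 1, j])
            if j + 1 < m:
                c += D(g[i, j], g[i, j + 1])
    return c

perm = np.arange(N)                 # initial placement: row-major order
initial = best = grid_cost(perm)
improved = True
while improved:                     # greedy descent: try every pair swap
    improved = False
    for a in range(N):
        for b in range(a + 1, N):
            perm[[a, b]] = perm[[b, a]]
            c = grid_cost(perm)
            if c < best - 1e-12:
                best, improved = c, True
            else:
                perm[[a, b]] = perm[[b, a]]   # undo a non-improving swap
```

Each accepted swap strictly lowers the total neighbour cost, so the loop
terminates; restarts from random initial permutations help escape poor
local optima.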
>
> BTW, N = k.m
>
> Thanks,
> John.
>
cheers
Xavier