Hi, I have a problem. Below is a link to a plot describing eye movements, recorded with small cameras that track the horizontal and vertical movements of the pupils. However, due to effects called "cross-talk" and "deviant linearity", the coordinate system gets skewed. The data should form a '+' pattern with nine focal points (the person has been looking at nine points arranged in a '+' pattern centered on the origin). I'd like to figure out exactly (or approximately) how much each sample has been skewed, so that I can write a program to adjust the data (rotate each sample into its correct place).

I've tried using k-means clustering to find the nine focal points and then computing the angle of each pair of clusters (e.g. the pair of upper clusters) relative to its respective axis. But I want to adjust each individual sample, not a collective of them. Basically, what I'm looking for is a way to describe the rotation angle for each sample as a function of its position in the plane: v = f(x, y).
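To make the attempt concrete, here is a rough sketch of the k-means step on simulated data, since I can't share the real recordings. The '+' grid spacing, the skew matrix, and the noise level are all made-up stand-ins for the real setup:

```python
import numpy as np

def kmeans(points, init_centers, iters=50):
    """Plain Lloyd's algorithm: assign each sample to its nearest
    center, then move each center to the mean of its members."""
    centers = init_centers.astype(float).copy()
    for _ in range(iters):
        # Pairwise distances, shape (n_samples, n_centers).
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centers)):
            members = points[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return centers, labels

# The nine ideal fixation targets of the '+' pattern, centered on the
# origin (unit spacing is an assumption, not the real calibration grid).
ideal = np.array([(0, 0), (1, 0), (2, 0), (-1, 0), (-2, 0),
                  (0, 1), (0, 2), (0, -1), (0, -2)], dtype=float)

# Fake "recorded" samples: each target repeated 40 times, pushed through
# a small skew matrix (standing in for cross-talk), plus noise.
rng = np.random.default_rng(0)
skew = np.array([[1.05, 0.12],
                 [0.08, 0.95]])
samples = np.repeat(ideal, 40, axis=0) @ skew.T + rng.normal(0, 0.05, (360, 2))

centers, labels = kmeans(samples, init_centers=ideal)

# Angle of the rightmost cluster center relative to the x-axis -- the
# kind of per-pair angle I mentioned computing.
right = centers[np.argmax(centers[:, 0])]
angle = np.degrees(np.arctan2(right[1], right[0]))
print(angle)
```

This recovers one angle per cluster pair, but that's exactly the limitation: it gives nine local estimates, not a single smooth f(x, y) I can apply to every raw sample.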

Basically, cross-talk is causing the skew and translation, and deviant linearity is causing the scaling. A modern system corrects these issues before outputting data, but the lab I'm helping cannot afford one and has to rely on their older equipment. To get my degree, my final task is to help them get by anyway.
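Since skew, translation, and scaling together make up an affine map, one idea I've been toying with is fitting an affine correction from the measured cluster centers to the known targets by least squares, which would then apply per sample. A minimal sketch, with made-up numbers standing in for the real measured centers:

```python
import numpy as np

# The nine ideal targets of the '+' pattern, centered on the origin
# (unit spacing is an assumption).
ideal = np.array([(0, 0), (1, 0), (2, 0), (-1, 0), (-2, 0),
                  (0, 1), (0, 2), (0, -1), (0, -2)], dtype=float)

# Stand-in for the measured cluster centers: the targets distorted by an
# affine map -- off-diagonal terms for cross-talk, diagonal terms for
# the scaling, plus a translation. These numbers are invented.
A_true = np.array([[1.08, 0.10],
                   [0.07, 0.93]])
t_true = np.array([0.05, -0.02])
measured = ideal @ A_true.T + t_true

# Least-squares fit of a 2x3 correction [A | t] mapping measured -> ideal.
X = np.hstack([measured, np.ones((len(measured), 1))])  # (9, 3)
M, *_ = np.linalg.lstsq(X, ideal, rcond=None)           # (3, 2)

# The same M then corrects every raw sample, not just the centers.
corrected = X @ M
err = np.abs(corrected - ideal).max()
print(err)  # ~0 here only because the simulated distortion is exactly affine
```

If the real residuals after an affine fit are still large (i.e. the "deviant linearity" is genuinely nonlinear), I imagine the same least-squares machinery could fit a low-order polynomial in x and y instead, which would be a position-dependent correction of the v = f(x, y) kind, but I'm not sure this is the right approach.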
Any ideas as to how to solve this one?