Checking if a point is on an arc, defined by two vectors

So, I have a point in the coordinate system, and an arc defined by two vectors of equal length starting at the same point. The arc is always smaller than 180°. How would I check if the point is on the arc? (The point is always on the circle; I just need to check whether it is in between these two vectors.)

http://img835.imageshack.us/img835/9043/59064651.png [Broken]
By deluksic at 2012-05-29

Using trig, try to figure out the angle the point makes with the x axis (usually where 0 degrees is located), then check that that angle is between the two angles that each vector makes with the x axis.
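A minimal sketch of that trig approach in Python (the function name and the tuple representation are my own, not from the thread; it assumes the arc spans less than 180°, as stated in the question):

```python
import math

def on_arc_angles(p, v1, v2):
    """Return True if point p (on the circle) lies on the minor arc
    between v1 and v2.  All arguments are (x, y) tuples measured from
    the circle's centre; the arc is assumed to span less than pi."""
    a1 = math.atan2(v1[1], v1[0])
    a2 = math.atan2(v2[1], v2[0])
    ap = math.atan2(p[1], p[0])
    # Counter-clockwise sweep from v1 to v2, in [0, 2*pi).
    sweep = (a2 - a1) % (2 * math.pi)
    if sweep > math.pi:
        # The minor arc actually runs from v2 to v1; start there instead.
        a1, sweep = a2, 2 * math.pi - sweep
    # p is on the arc iff its offset from the start is within the sweep.
    return (ap - a1) % (2 * math.pi) <= sweep
```

Note that the modulo arithmetic is there to handle the wrap-around at ±180°, which is the usual pitfall with naive angle comparisons.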

OK, trig would do, but can't you just test the x and y values of the vectors and the point? I tried drawing lines horizontally and vertically through the tips of both vectors. That gives me something to start with; now I need to test which lines outline the region I need...

Oh, and I need this for a computer program, so I need it to be fast :)

EDIT: Now I see I could just draw a line through the tips of these vectors and test whether the point is above or below the line.

You could avoid any trigonometry by finding the equation of the line that goes through the tips of v1 and v2. Then check to see whether the origin is on the same side of the line as the point of interest. If the origin lies on the opposite side of the line from the point, then the point lies in the arc between v1 and v2.

Edit: I missed your edit above which is basically this, but note that you can't just check whether the point lies above or below the line. For a different configuration of v1 and v2, points in the arc will lie above the line, not below it. You should check whether the point lies on the same side of the line as the origin.
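A sketch of that chord-side test in Python (function names are mine; the side of the chord is found with the 2D cross product rather than an explicit line equation, which avoids special-casing vertical chords):

```python
def on_arc_lineside(p, v1, v2):
    """Return True if p is on the minor arc (< 180 deg) between v1 and v2,
    i.e. if p and the origin lie on opposite sides of the chord through
    the tips of v1 and v2.  All points are (x, y) tuples measured from
    the circle's centre."""
    def side(q):
        # Sign of the 2D cross product of (v2 - v1) and (q - v1):
        # positive on one side of the chord, negative on the other.
        return (v2[0] - v1[0]) * (q[1] - v1[1]) - (v2[1] - v1[1]) * (q[0] - v1[0])
    # Opposite signs (or p exactly on the chord) => p is on the arc.
    return side(p) * side((0.0, 0.0)) <= 0
```

This needs no trig or square roots, just a few multiplications, so it should satisfy the speed requirement.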

Thank you very much, yes, this is what I was looking for :D (never would've thought of the origin...)

Then the angle between v_1 and v_2 is 180 deg, and to know which way you have to go (which arc is the "good" one) you need to know which vector is the beginning and which is the end. Instead of just taking the sum (which gives the same result if you switch v_1 and v_2!), you need to build a vector that bisects the angle. If you do that, the method will work for any angle, as the angle with the bisecting vector will always be less than 180 deg (if it is exactly 180 deg, you are accepting the whole circle).
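A sketch of the bisector idea in Python (my own naming and details, not from the thread): the bisector is built by rotating v_1 halfway along the counter-clockwise sweep toward v_2, which respects the orientation and avoids the zero-sum problem when the vectors point in opposite directions. Since all three vectors have equal length, comparing dot products with the bisector replaces the angle comparison:

```python
import math

def on_arc_bisector(p, v1, v2):
    """Return True if p is on the counter-clockwise arc from v1 to v2
    (any sweep below 360 deg).  p, v1, v2 are (x, y) tuples from the
    circle's centre and have equal length."""
    # Counter-clockwise sweep from v1 to v2, in [0, 2*pi).
    sweep = (math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])) % (2 * math.pi)
    half = sweep / 2
    # Rotate v1 by half the sweep to get the orientation-aware bisector.
    c, s = math.cos(half), math.sin(half)
    b = (c * v1[0] - s * v1[1], s * v1[0] + c * v1[1])
    # p is on the arc iff its angle to the bisector is at most `half`,
    # i.e. its projection onto b is at least v1's (equal lengths).
    return p[0] * b[0] + p[1] * b[1] >= v1[0] * b[0] + v1[1] * b[1] - 1e-9
```

The small epsilon guards against floating-point round-off when p coincides with one of the endpoint vectors.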

Nice and elegant, but it has the same problem: if A and B are exactly opposite each other, the angle AXB is always 90 deg, i.e. the dot product is zero. At least you don't get a division-by-zero error, as you don't have to normalize the vectors.