Movement vector for a subsection of the surface of a sphere - spherical polar coords

What I am trying to do seemed so simple at first - assign a movement vector to a large subsection of the surface area of a sphere.

The project I'm working on is a (very) simple simulation of tectonic plates.

I have a sphere defined in spherical polar coordinates (R, theta (longitude, 0 to 360), phi (latitude, 180 to 0)).
I also have large portions of the surface of the sphere randomly broken up into several 'plates'.

Initially, my thought was to simply assign a movement vector in terms of theta and phi to each plate.
However, it quickly becomes obvious that this will not work. For example, consider a plate that covers the 'north pole' area: if it is assigned a positive phi movement, the whole plate spreads outward from the pole rather than moving as a unit. Clearly not what I want.
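The pole problem can be seen numerically. Here is a small sketch (assuming phi is measured from the pole, per the convention above): two points on opposite sides of the north pole are given the same "movement" of phi + 10 degrees, and they end up moving in opposite Cartesian directions.

```python
import math

def to_cartesian(theta_deg, phi_deg, r=1.0):
    """theta = longitude, phi = angle measured down from the north pole."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    return (r * math.sin(p) * math.cos(t),
            r * math.sin(p) * math.sin(t),
            r * math.cos(p))

# Two points on opposite sides of the north pole.
a0 = to_cartesian(0, 5)
b0 = to_cartesian(180, 5)

# Apply the same naive "movement vector": phi += 10 for both.
a1 = to_cartesian(0, 15)
b1 = to_cartesian(180, 15)

# a moves in the +x direction, b moves in -x:
# the plate tears apart around the pole instead of translating.
```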

So - any ideas how I can do it?
Note that the solution needs to be fairly simple to transform into phi/theta values for a given point in order to apply the motion.

Re: Movement vector for a subsection of the surface of a sphere - spherical polar coords

I think I have an answer:

The movement needs to be expressed as a 3D rotation matrix about the center point of the sphere.
This actually makes sense, as I was hoping to use the GPU for the plate simulation anyway, and applying rotation matrices is its bread and butter. I'll just do that in the vertex shader before moving on to the rest of the simulation.
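The rotation-about-an-axis idea can be sketched without a full matrix by using Rodrigues' rotation formula, which is equivalent to multiplying by the corresponding 3x3 rotation matrix. This is only an illustrative CPU-side sketch; the axis and angle here stand in for whatever per-plate motion you assign.

```python
import math

def rotate_about_axis(p, axis, angle):
    """Rotate point p about a unit-length axis through the sphere's
    center by `angle` radians (Rodrigues' rotation formula)."""
    ux, uy, uz = axis
    px, py, pz = p
    c, s = math.cos(angle), math.sin(angle)
    dot = ux * px + uy * py + uz * pz
    # cross = axis x p
    cx = uy * pz - uz * py
    cy = uz * px - ux * pz
    cz = ux * py - uy * px
    return (px * c + cx * s + ux * dot * (1.0 - c),
            py * c + cy * s + uy * dot * (1.0 - c),
            pz * c + cz * s + uz * dot * (1.0 - c))

# A point on the equator, rotated 90 degrees about the polar axis,
# slides along the equator -- every point of the plate moves rigidly.
q = rotate_about_axis((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2)
```

Because the motion is a rigid rotation of the whole sphere, a plate straddling a pole moves as one piece, which is exactly what the per-point phi/theta offsets failed to do.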

The only potential problem I can see is that after the simulation step the points will be in Cartesian rather than spherical coordinates. However, that would be the case whenever I run the simulation on the GPU, and I think the speedup over running it in regular code will more than make up for it.
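Converting back is cheap if it's ever needed. A minimal sketch, assuming theta is longitude in [0, 360) degrees and phi is measured in degrees from the pole (adjust the phi line if your latitude runs the other way):

```python
import math

def cartesian_to_spherical(x, y, z):
    """Recover (r, theta, phi) from a Cartesian point.
    theta: longitude in [0, 360) degrees; phi: degrees from the +z pole."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.degrees(math.atan2(y, x)) % 360.0
    phi = math.degrees(math.acos(z / r))
    return r, theta, phi

# The north pole maps back to phi = 0; an equatorial point to phi = 90.
pole = cartesian_to_spherical(0.0, 0.0, 1.0)
equator = cartesian_to_spherical(0.0, 1.0, 0.0)
```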