I have two shafts rotating from about 200 RPM to 1500 RPM. They normally rotate at exactly the same speed. I need to detect when one shaft rotates at a different rate than the other shaft. I don't need to know the actual speed of the shafts, just the difference.

Waaaaay back, this was done with Selsyns attached to the shafts which worked great. I don't have the luxury of attaching anything to the shafts except maybe some striped tape for an opto sensor to see, or maybe some magnets for a reluctor to sense.

I could build two F/V converters and compare their outputs, but the slightest drift in either one would throw my readings out the window.

I think a digital approach is the best bet but I can't wrap my head around comparing two inputs at such low frequencies (~4 Hz).

Multiple pieces of reflective tape spaced evenly around the periphery (or as radial lines on the end of the shaft) would increase the frequency of the input pulses. This would effectively allow the shaft speeds to be sampled at a faster rate. The pulses could be fed into two counter/timer inputs on a microcontroller (for example, Timer 0 and Timer 1 on a Microchip PIC), the counts compared, and an output generated appropriately.

If you could attach a small, thin ring, one effective solution would be a gear-tooth sensor (a Hall-effect sensor or gear-tooth encoder) detecting a very finely pitched ring with small teeth. Such a ring might have tens or hundreds of accurately spaced teeth around its periphery, increasing your pulse frequency 10- or 100-fold. Once again, I'm thinking of a microcontroller solution fed by the input pulses.

Yeah, a gear split in half that clamps onto the shafts, plus two sensors, like the ABS sensors at a car's wheels.
If the shafts are out of whack at low speed, they'll be even worse at high speed. It's kinda like driving with one tire low on air: it doesn't run at the same speed as the other three, and that never fixes itself regardless of speed.
Somebody makes clamp-on split gears, maybe Airpax, if they're still around.