Need help with diffraction problem by tonight

Here's the problem:
Two light sources can be adjusted to emit monochromatic light of any visible wavelength. The two sources are coherent, 1.72e-6 m apart, and in line with an observer, so that one source is 1.72e-6 m farther from the observer than the other. What is the longest of the visible wavelengths (400 to 700 nm) at which the observer will see the brightest light, owing to constructive interference?

I read the section in the book that the problem came from, but the only equation I can find is r2 - r1 = m*lambda (constructive interference, sources in phase). How do I apply this equation to this problem?

This is probably my last physics question, since our final is Monday. I just want to thank everyone on this board. You have been so helpful to me...I don't know how I would have gotten my homework done without your help!!

Constructive interference occurs where the difference in path length (the difference between the distances that light from the two sources travels) is an integer multiple of the wavelength of the light: 1 x the wavelength, or 2 x the wavelength, or 3 x ... etc.

In your example, that difference is given as 1.72 x 10^(-6) m, which is equal to 1720 nm. Looking at your range of wavelengths (400 to 700 nm), you can see that 1720 nm isn't 1 times any of them (that would need a 1720 nm wavelength) or 2 times any of them (that would need 860 nm — both outside the visible range). But it is 3 times one of them: 1720/3 ≈ 573.3, so at that path difference there will be constructive interference for light of 573 nm wavelength.

If you insist on using the equation: r2 - r1 is the path-length difference, 1720 nm, and λ is the (unknown) wavelength. You plug in values for m, starting with m = 1, then m = 2, etc., and solve for λ each time, until you find an "m" that gives a wavelength inside the visible range. Note that m = 4 works too, for light of wavelength 430 nm — but the problem asks for the longest visible wavelength, so the m = 3 answer (573 nm) is the one you want.
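If you want to double-check the plugging-in by hand, here's a quick sketch of the same search in Python (the variable names are just mine, not from the book): it tries m = 1, 2, 3, ... and keeps the wavelengths λ = (r2 - r1)/m that fall in the 400-700 nm window.

```python
# Path-length difference r2 - r1, expressed in nanometers.
d_nm = 1720.0

# Constructive interference: r2 - r1 = m * lambda  =>  lambda = d / m.
# Collect every order m whose wavelength lands in the visible range.
visible = []
for m in range(1, 10):
    wavelength = d_nm / m
    if 400.0 <= wavelength <= 700.0:
        visible.append((m, wavelength))

print(visible)  # only m = 3 (~573.3 nm) and m = 4 (430 nm) qualify

# The problem asks for the longest such wavelength.
longest = max(w for _, w in visible)
print(round(longest, 1))
```

Since λ = d/m shrinks as m grows, the smallest valid m (here m = 3) always gives the longest visible wavelength.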