When I try to draw a line (glBegin(GL_LINES)) that is approx. 75% of the screen away from the origin, it is drawn at 1 pixel less than the specified coordinate. What makes it really strange is that calls to glBegin(GL_QUADS), for filling the window, and glDrawPixels get the correct coordinate.

I’m also using OpenGL as a 2D rasterizer and I had the same problem. I ended up just using GL_QUADS to draw my lines because GL_LINES seemed inexact. It wasn’t always one pixel off, otherwise I would have just added a constant; instead it was off by one pixel depending on where in the window it was drawing.
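Something along these lines is what I mean (just a rough sketch, not my exact code; it assumes a 2D orthographic projection where one unit maps to one pixel, e.g. gluOrtho2D(0, width, 0, height), and draw_line_as_quad is only a name for this example):

#include <GL/gl.h>
#include <math.h>

/* Draw a 1-pixel-wide line segment as a thin quad instead of GL_LINES. */
void draw_line_as_quad(float x0, float y0, float x1, float y1)
{
    float dx = x1 - x0, dy = y1 - y0;
    float len = sqrtf(dx * dx + dy * dy);
    if (len == 0.0f)
        return;

    /* Perpendicular to the line direction, half a pixel to each side,
       so the quad is one pixel thick. */
    float px = -dy / len * 0.5f;
    float py =  dx / len * 0.5f;

    glBegin(GL_QUADS);
    glVertex2f(x0 - px, y0 - py);
    glVertex2f(x0 + px, y0 + py);
    glVertex2f(x1 + px, y1 + py);
    glVertex2f(x1 - px, y1 - py);
    glEnd();
}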

Have a look at the “diamond exit rule” used in OpenGL to decide which pixels are touched while rendering primitives. Pictures are in the OpenGL specs.
Also remember that pixels are not infinitely small points but have an area. If you want to hit a pixel exactly in its center, you have to shift all 2D screen coordinates by 0.5.
That is, don’t draw from (0, 0) to (x, y) but from (0.0 + 0.5, 0.0 + 0.5) to (x + 0.5, y + 0.5). This should fulfill the diamond exit rule conditions for both start and end points.
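In immediate-mode code the shift looks something like this (a minimal sketch, assuming a 2D orthographic projection where one unit corresponds to one pixel, e.g. set up with gluOrtho2D(0, width, 0, height)):

#include <GL/gl.h>

/* The + 0.5 offsets move each endpoint from the pixel corner to the
   pixel center, as described above. */
void draw_pixel_line(int x0, int y0, int x1, int y1)
{
    glBegin(GL_LINES);
    glVertex2f(x0 + 0.5f, y0 + 0.5f);
    glVertex2f(x1 + 0.5f, y1 + 0.5f);
    glEnd();
}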