TEARING - dreaming of tear-free video on LVDS with Intel GMA950

For those who've not seen tearing / don't know what it is, perhaps you'd best stop reading, lest you find out your system suffers from it as well.

As with some other problems, once you notice this one, you always will, and it will really annoy you where perhaps it didn't before.

First question: if you have a laptop with Intel GMA950 graphics and you know tearing when you see it, do you see it on your laptop also?

In short: I've had this laptop for over 2 years and have messed with Linux since 10, so I know more about modelines than all the youngsters who shout "you should switch to Ubuntu" (I'm using Mandriva, thanks for asking) - but apparently not enough...

The problem:
video playback shows tearing which, as usual, is only clearly visible during sideways panning camera shots or when large objects move sideways.

This happens with both mplayer and xine, with both opengl and xv output. Naturally the problem is alleviated in slow-motion playback, and at pause it is absent.

The things I've tried:
- changed display refresh rates
- played with the relatively new LVDSFixedMode option - see below for the reasoning and manpage excerpt
- tried to select only the external VGA connection (didn't manage)

First indication of LVDSFixedMode in the man page (man intel):

Code:

Option "LVDSFixedMode" "boolean"
Use a fixed set of timings for the LVDS output, independent of
normal xorg specified timings. The default value if left
unspecified is true, which is what you want for a normal LVDS-
connected LCD type of panel. If you are not sure about this,
leave it at its default, which allows the driver to automati‐
cally figure out the correct fixed panel timings. See further
in the section about LVDS fixed timing for more information.

Seems like that's what I may need; later there is more info on this:

Code:

HARDWARE LVDS FIXED TIMINGS AND SCALING
Following here is a discussion that should shed some light on the nature and reasoning behind the LVDSFixedMode option.

Unlike a CRT display, an LCD has a "native" resolution corresponding to the actual pixel geometry. A graphics controller under all normal circumstances should always output that resolution (and timings) to the display. Anything else and the image might not fill the display, it might not be centered, or it might have information missing - any manner of strange effects can happen if an LCD panel is not fed with the expected resolution and timings.

However there are cases where one might want to run an LCD panel at an effective resolution other than the native one. And for this reason, GPUs which drive LCD panels typically include a hardware scaler to match the user-configured frame buffer size to the actual size of the panel. Thus when one "sets" his/her 1280x1024 panel to only 1024x768, the GPU happily configures a 1024x768 frame buffer, but it scans the buffer out in such a way that the image is scaled to 1280x1024 and in fact sends 1280x1024 to the panel. This is normally invisible to the user; when a "fuzzy" LCD image is seen, scaling like this is why this happens.

In order to make this magic work, this driver logically has to be configured with two sets of monitor timings - the set specified (or otherwise determined) as the normal xorg "mode", and the "fixed" timings that are actually sent to the monitor. But with xorg, it's only possible to specify the first user-driven set, and not the second fixed set.

So how does the driver figure out the correct fixed panel timings? Normally it will attempt to detect the fixed timings, and it uses a number of strategies to figure this out. First it attempts to read EDID data from whatever is connected to the LVDS port. Failing that, it will check if the LVDS output is already configured (perhaps previously by the video BIOS) and will adopt those settings if found. Failing that, it will scan the video BIOS ROM, looking for an embedded mode table from which it can infer the proper timings. If even that fails, then the driver gives up, prints the message "Couldn't detect panel mode. Disabling panel" to the X server log, and shuts down the LVDS output.

Under most circumstances, the detection scheme works. However there are cases when it can go awry. For example, if you have a panel without EDID support and it isn't integral to the motherboard (i.e. not a laptop), then odds are the driver is either not going to find something suitable to use or it is going to find something flat-out wrong, leaving a messed up display. Remember that this is about the fixed timings being discussed here and not the user-specified timings which can always be set in xorg.conf in the worst case. So when this process goes awry there seems to be little recourse. This sort of scenario can happen in some embedded applications.

The LVDSFixedMode option is present to deal with this. This option normally enables the above-described detection strategy. And since it defaults to true, this is in fact what normally happens. However if the detection fails to do the right thing, the LVDSFixedMode option can instead be set to false, which disables all the magic. With LVDSFixedMode set to false, the detection steps are skipped and the driver proceeds without a specified fixed mode timing. This then causes the hardware scaler to be disabled, and the actual timings then used fall back to those normally configured via the usual xorg mechanisms.

Having LVDSFixedMode set to false means that whatever is used for the monitor's mode (e.g. a modeline setting) is precisely what is sent to the device connected to the LVDS port. This also means that the user now has to determine the correct mode to use - but it's really no different than the work for correctly configuring an old-school CRT anyway, and the alternative if detection fails will be a useless display.

In short, leave LVDSFixedMode alone (thus set to true) and normal fixed mode detection will take place, which in most cases is exactly what is needed. Set LVDSFixedMode to false and then the user has full control over the resolution and timings sent to the LVDS-connected device, through the usual means in xorg.

Well, seems like that could explain what goes wrong for me.
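If I understand it correctly, turning the detection off and supplying my own timings would look something like this (just a sketch - the Identifier is whatever your existing xorg.conf uses):

Code:

Section "Device"
    Identifier "IntelGMA950"           # match the Identifier in your xorg.conf
    Driver     "intel"
    Option     "LVDSFixedMode" "false" # skip fixed panel timing detection
EndSection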

Now the problem is: I can't seem to get any modeline accepted by the driver when I turn off LVDSFixedMode. The Xorg log just reports whatever modelines DDC found and uses those, and the resulting X still shows tearing - or, without DDC, Xorg runs into the 'no screens found' situation...

Comment

One of the goals for version 2.5 of the driver was to resolve the tearing problems, but unfortunately that didn't happen.

Jesse Barnes had this to say about it:

Item (4) was the main thing we missed this time (Nanhai has been working
flat out to get MPEG2 offload working so we didn't get a chance to try
different approaches for vblank sync'ing). I think the real fix here will
be DRI2, which should allow us to run compositing managers in DRI mode,
with real vblank sync'ing for all their drawing.

Comment

I would like to add that it seems (from the information at the beginning of this thread) that LVDSFixedMode could be used to alleviate video stuttering (not tearing) by setting the output frequency to the display to exactly a multiple of 23.976 Hz when watching movies and such. I've previously used an nvidia card to do this, because it accepts modelines in xorg.conf as overrides directly.
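For reference, the idea would be something along these lines in the Monitor section (a sketch only - 1280x800 is an assumed native resolution, and you'd generate the actual modeline yourself, e.g. with cvt):

Code:

Section "Monitor"
    Identifier  "LVDSPanel"            # hypothetical name; match your xorg.conf
    # Target 3 x 23.976 Hz = 71.928 Hz at the panel's native resolution
    # (1280x800 here is only an assumption). Generate a modeline with e.g.:
    #   cvt 1280 800 71.928
    # and paste the resulting Modeline line below:
    #Modeline "1280x800_71.93"  <paste cvt output here>
EndSection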

Comment

1) As usual after posting about my issues - in this case, getting modelines fed to the system and accepted - I managed to fix that part myself: I simply had to give a wide range for the vertical refresh instead of just putting/keeping 60 (see the sketch after point 3).
As long as DDC is used, Xorg doesn't care too much, but as soon as DDC was off, Xorg realised that the modelines were not really exactly 60 Hz, so it found no modes...

2) I tried the direct method of playing straight to the Xv port (in my case 97); it played with the same tearing as usual... :-(

3) I tried 3 x 23.976 Hz (= 71.928 Hz) as the refresh rate, and also 50 Hz (to see if it looks different); nothing worked differently from before...
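To make point 1 concrete, the Monitor section now looks roughly like this (a sketch - the sync ranges are examples, check your panel's limits before copying them):

Code:

Section "Monitor"
    Identifier  "LVDSPanel"       # hypothetical name; match your xorg.conf
    HorizSync   30.0 - 90.0       # example range; check your panel's limits
    VertRefresh 50.0 - 76.0       # wide enough that 71.928 Hz isn't rejected
    # custom Modelines (e.g. the 3 x 23.976 Hz one from point 3) go here
EndSection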

BTW I also tried XShm and such, not just xv.

So does this really mean Linux + Intel + Xorg sucks so much that even proper video playback is out of reach?
BTW on my Gigabyte 780G I had big problems fighting tearing as well; in the end I only got it working OK with gl, not xv.

After years of Nvidia, I just didn't realise the others were so far behind - and I really do want a fully Free system.

Comment

> Are there any options that I can enable in my xorg.conf to help
> reduce/eliminate this tearing? Or is this simply a hardware limitation?
> Can XvMC somehow help me here?
There aren't any options at this point, but I'm wondering -- is this
full-screen? If we made full-screen Xv operations block until vblank
(which would lock up the X server), would that be an acceptable option?
It's actually very easy to do, just stick a 'wait for vblank' command
into the ring right before the 'copy the new picture' command in the Xv
extension code. It's just annoying when you're watching a tiny movie and
your whole session stops responding.

I also noticed something else: there was actually a wait-for-vblank patch committed to xf86-video-intel a while ago, but it was promptly reverted as it didn't support multiple clients. If that doesn't bother you, you might want to try the patch and see if it works.