Laaaaaag!

I've just moved from using my PS3 on an Asus MW221U 22" monitor (HDMI-DVI Adapter, PS3@720P) to a Sony KDL40HX800
40-inch LCD (HDMI, PS3@1080P). On connecting up the shiny new panel something is just not the same when it comes
to game play, mostly in MW2. I think
what I am trying to overcome is input lag on the Sony.

After trying a few of the standard things - enabling "game mode", which disables a selection of the unnecessary
processing options, and connecting the PS3 directly to the panel rather than through the AV receiver - I noticed
a difference. Playing on both panels, you still get the feeling you're behind the events on screen, though,
and the results at the end of the match speak for themselves.

I have done an input lag test by running the 22-inch and 40-inch as clones from my PC from each output of a
Radeon 4870, DVI direct to the 22" and with a DVI to HDMI adapter to the 40". Based on the timers, there is 40-60ms
delay to the Sony when running in its default game mode, and up to 100ms on other settings. I know this is a small
difference, but being accustomed to how quickly things react on the smaller screen is probably a factor.
As every display will have some sort of input latency to process the picture and chit-chat over content protection,
I now wonder what the latency is on the Asus, which seems great in hindsight, although I'm over the small
screen and it was 16:10.

With such a range of TVs out there and a lot of people using the big screens for HD gaming, is big-screen choice
contributing to the potential of some gamers and the frustration of others? Trying to find input-lag information on
panels prior to purchase is a nightmare, with most people confusing input lag with pixel
response time. Apart from trying the VGA input on the TV from
the PS3 (not sure how that'd work out), and maybe exploring the service menu, is there something else I can do
to reduce the time it takes to get the picture on the screen?

Andrew

I'm sure we've all wondered how far your monitor would have to be from your couch
for light-speed to create perceptible lag.
(200,000-kilometre cables taking the signal to the monitor in the first
place would more than double these numbers.)

Answer:
First: Congratulations on actually doing a controlled test! It's easy to "feel" lag that isn't really there, just
as it's easy to decide sugar pills are a medical breakthrough or that
magic pebbles make loudspeakers sound better, if you've an inclination
to believe those things. (Note that people have a tendency to "see" not only things that they want to be true, but
also things that they fear are true.)

My quick answer to your actual question, though: No, there's nothing else you can do. There are a few things I
could suggest, but you've done them all already!

The actual input lag of a modern HDTV with "game mode" turned on and every other image-enhancement feature you
can find turned off is, usually, too small to notice.

The problem is that there's also an inescapable baseline lag for any 3D software - on a console, or a PC.

The graphics card can't "rasterise" an image for display
until it's rendered it, and it can't render it until the CPU has figured out what needs rendering, and there's a little
more delay on top of that because the CPU can't deal with new input if it's still busy with previous input. If a given
hardware-and-game combination quantises logic, rendering and rasterising all at the same 50Hz speed starting at the
vertical-blank moment, then it's impossible
for any input to produce output less than three frames later. That gives a baseline of 3/50ths of a second of lag,
or 60 milliseconds.
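That baseline arithmetic can be sketched in a few lines. This just formalises the worked example above - the three-stage pipeline and the 50Hz figure are the example's numbers, not any particular game's:

```python
# Baseline input-to-output lag for a frame-locked 3D pipeline.
# Assumes each stage (logic, rendering, rasterising) takes exactly
# one frame and starts at the vertical-blank moment, as in the
# worked example above.

def baseline_lag_ms(refresh_hz, pipeline_stages=3):
    """Minimum input-to-photons delay, in milliseconds."""
    frame_time_ms = 1000.0 / refresh_hz
    return pipeline_stages * frame_time_ms

print(baseline_lag_ms(50))  # 3 frames at 50Hz -> 60.0ms
print(baseline_lag_ms(60))  # 3 frames at 60Hz -> ~50ms
```

Note that raising the refresh rate shrinks the baseline even if the pipeline depth stays the same, which is one reason high-refresh monitors feel snappier.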

(Things get more complicated when simulation, rendering and output threads are running at different speeds. Many
modern games poll input very fast, and have fast simulation threads, but never output more than 30 distinct frames
per second. It's also possible to hand data to the
GPU before a whole frame's worth of it has been calculated, and monkey with vblank timing.)

And then there are games that look laggy, because the rendering thread lags behind the simulation one. So
you see your gun fire six frames late, but the bullet actually killed the target only two frames after you
pressed the button. (This is the opposite of unhelpful client-side prediction in networked games, where you see yourself
shoot someone but a moment later the server informs your game client that
actually, he shot you first.)

For almost all 3D games, you can bet on at least three frames of genuine lag. Some games are much worse.

The user generally can't do much about this baseline lag, but it makes display-device lag more important. 60ms
of lag is only noticeable if you're a pretty twitchy gamer playing a pretty twitchy game, but 60ms plus another
60ms of output-device lag will be noticeable to even quite dozy Guitar Hero players. (Multiplayer network lag adds
its own special flavour to the stew.)

It is indeed difficult, if not impossible, to get HDTV manufacturers to cough up input-lag numbers. This is partly
because the numbers vary depending on what the HDTV's displaying, but mainly because lag numbers aren't getting any
better, as the manufacturers add fancier and fancier image-enhancement features.

Most HDTV buyers care about a punchy, colourful picture, but seem to be unable to detect
sound that's a quarter of a freakin' second out of sync with the video. So HDTV makers have incentive to fiddle more
and more with the image, and no incentive to reduce the lag that this fiddling creates.

Once you've got the HDTV in your house you can measure lag by, as you did, setting up side-by-side screens (an
old CRT TV or monitor has zero inherent lag), or by pointing a video camera at the HDTV and your controller and counting
frames between button-press and on-screen action. (Infinity Ward got Ben Heckendorn his own bad self to make them
an LED-equipped button-press
indicator!)
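Converting a camera frame-count into a lag figure is just division by the camera's frame rate. A minimal sketch - the 25fps figure is an assumption for illustration, so use whatever your camera actually records at, and remember your measurement resolution is one camera frame:

```python
# Convert counted video frames between button-press and on-screen
# action into milliseconds of lag. The camera_fps default of 25 is
# just an example; substitute your camera's real frame rate.

def lag_from_frames(frames_counted, camera_fps=25):
    return 1000.0 * frames_counted / camera_fps

print(lag_from_frames(3))      # 3 frames at 25fps -> 120.0ms
print(lag_from_frames(5, 60))  # 5 frames at 60fps -> ~83ms
```

A faster-shooting camera gives you finer resolution: at 25fps you can only resolve lag in 40ms steps, while a 60fps camera resolves it in roughly 17ms steps.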

The video-camera method lets you figure out whether a given setting change really made a difference, but if it
tells you your HDTV is lagtacular no matter what you do, your only options are to put up with the lag, or make friends
at the electronics shop by returning the TV.

If you haven't bought your HDTV yet, you can measure lag by taking your PC or console to the shop, which
will probably, conveniently, also have a video camera you can use. The chance of this being doable in any randomly-chosen
shop is low, but a bit of phoning around will probably find you somewhere with agreeable gamers on staff.

It's also possible to short-cut the whole palaver if someone's already tested a TV you're considering, and posted
the results online. A few years ago the Web-forum pickings in this department were thin, but the situation's getting
better as HDTVs get cheaper and cheaper, and more and more people learn about the problem. (If only by wondering why
Guitar Hero and Rock Band have those calibration options now.)

Measure with micrometer, mark with chalk, cut with axe

How is it possible for several system monitoring programs to report different sets of CPU clocks and temperatures
(pic attached)?

I've disabled SpeedStep in the BIOS but Core Temp still says that the BIOS is underclocking my CPU - why is
this?

Answer:
A temperature sensor - actually, the hardware-monitoring chip to which the analogue sensors are connected - does not
actually report a temperature. It reports a number, which different programs (including the BIOS setup program)
interpret in different ways.

So even if a given program reckons it recognises whatever hardware-monitoring chip your motherboard (and/or graphics
card) uses, it may give wildly wrong numbers if your mobo/CPU/probably-BIOS-version-too combination has changed the
relationship between the reported number and the actual temperature.

(You can also get subtly wrong numbers if your temperature sensors read a bit high or low. This latter problem
- more common at lower temperatures than higher, I think - is seldom worth worrying about. Just remember that one
computer that seems to consistently run three degrees warmer than another may actually not be different at all. An
idle-temperature difference may be entirely illusory, even if it's quite large.)

It's sort of like car speedometers, which don't actually directly monitor speed - they monitor how rapidly the
wheels are rotating. Put smaller wheels on your car and your
speedo will read high, put bigger wheels on and it'll
read low. (And your rollin' on 20s will probably soon be interrupted by red and blue lights. For this reason, by the
way, most speedos are deliberately set up to read a little high with the car's standard wheels, so just putting
higher-profile tyres on won't make the speedo read hazardously low.)
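The speedo arithmetic is simple proportion. Here's a sketch of it, with made-up circumference numbers purely for illustration:

```python
# How wheel size skews a speedometer that really measures wheel RPM.
# The speedo is calibrated for a "standard" rolling circumference;
# fit wheels with a different circumference and the reading scales
# accordingly. (Circumference figures below are made up.)

def indicated_speed(actual_kmh, standard_circ_m, fitted_circ_m):
    # Wheel revolutions per hour at the true speed, on the fitted wheels:
    revs_per_hour = actual_kmh * 1000.0 / fitted_circ_m
    # The speedo assumes each revolution covers the standard circumference:
    return revs_per_hour * standard_circ_m / 1000.0

print(indicated_speed(100, 1.9, 1.8))  # smaller wheels: reads high
print(indicated_speed(100, 1.9, 2.0))  # bigger wheels: reads low
```

The temperature-sensor situation is the same shape: the raw measurement is fine, but the conversion assumes a relationship that may no longer hold.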

The multiplier works the same way. There's no single standard way for a CPU and motherboard to tell a monitoring
program the CPU bus speed, multiplier and/or core speed, so once again the monitoring software has to explicitly recognise
the hardware it's running on.

I think Real Temp gets the numbers right for most current systems,
including Core i7s. The docs page talks about calibrating
the software to get really accurate numbers.
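As a sketch of why programs disagree: Intel's on-die sensors, for instance, report a distance-below-maximum number rather than a temperature, and the software has to know (or guess) the right maximum - "TjMax" - for your particular CPU. The TjMax values below are illustrative assumptions, not a lookup table for your chip:

```python
# An Intel-style digital thermal sensor reports how far the core is
# BELOW its maximum junction temperature (TjMax), not a temperature.
# Two programs that assume different TjMax values for the same chip
# will report different temperatures from the same raw reading.
# (TjMax figures here are illustrative, not your CPU's real value.)

def core_temp_c(dts_delta, assumed_tjmax_c):
    return assumed_tjmax_c - dts_delta

raw = 45  # the same raw reading: "45 degrees below TjMax"
print(core_temp_c(raw, 100))  # program A assumes TjMax=100 -> 55C
print(core_temp_c(raw, 95))   # program B assumes TjMax=95  -> 50C
```

Same chip, same instant, five degrees of disagreement - which is exactly the sort of spread you see between monitoring programs.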

Plugging electron-leaks

Normally I'd research this myself, but I'm rather busy at the moment and thought I'd hopefully do both of us a favour
(I get the answer with little effort and you get something to write about).

I remember how happy I was with the first set of NiMH AA batteries that I bought. My digital camera worked
for more than a few lousy shots. I soon discovered their limitation, in the form of
self-discharge: I couldn't
keep them charged and sitting on my shelf, waiting for use a few months later when I needed them. Around 2008 I came
across Duracell "Active-Charge"
low-self-discharge
NiMH and bought a set; I thought I'd give them a try and buy another set if they went well after six to 12 months.
Surprise surprise, they worked very well, but when I went back to buy some, none of the batteries carried the
"Active-Charge" branding.

Do these types of batteries still exist (from Duracell or
other common battery manufacturers available
in retail outlets)? Or are they just the standard NiMH that are available everywhere today, not carrying the extra
branding?

A.

Answer:
If the package doesn't specifically say "pre-charged" or "ready-to-use" or something of that sort (you probably won't
see "low self-discharge" actually written on the pack), the batteries will be the standard high-self-discharge type.
I can identify no clear relationship between battery brand and discharge rate; as a general rule, NiMH cells with
higher capacity will have higher self-discharge, but that's not a guaranteed relationship, and most, if not all, battery
brands inflate the capacity numbers anyway.

Standard NiMH cells are fine for high-discharge, short-standby applications like radio-controlled toys and powering
frequently-used photo flashes. They're acceptable for cordless phones and even cordless PC input devices, too, if
you're able to put the device on charge when you're not using it.

For applications with significant standby time, though, more expensive "LSD" cells with lower nominal capacity
will work much better.
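You can see why with some rough shelf-life arithmetic. This uses a simple constant-percentage-per-month decay model, and the loss rates and capacities are rule-of-thumb assumptions for illustration, not any particular cell's datasheet figures:

```python
# Rough charge-left-on-the-shelf comparison between a standard NiMH
# cell and a low-self-discharge (LSD) cell, using a simple
# constant-fraction-per-month decay model. All numbers here are
# illustrative assumptions, not datasheet values.

def charge_remaining(capacity_mah, monthly_loss, months):
    return capacity_mah * (1.0 - monthly_loss) ** months

# A 2700mAh standard cell losing ~20% a month, versus a 2000mAh
# LSD cell losing ~2% a month:
for months in (1, 3, 6):
    std = charge_remaining(2700, 0.20, months)
    lsd = charge_remaining(2000, 0.02, months)
    print(months, round(std), round(lsd))
```

Under these assumptions the lower-capacity LSD cell overtakes the bigger standard cell within a couple of months on the shelf, which is the whole point of paying extra for them.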

There are rather more brands of LSD NiMH cell than there are actual different kinds of cell under the shrinkwrap.
The current "Duracell Pre-Charged" LSDs, for instance, are
actually Sanyo cells
- just different packaging for the same cells sold under Sanyo's "Eneloop" brand - and I think they
still are Sanyos under the skin.

You can save a few bucks if you buy LSD cells
from eBay dealers. Lesser-known brands like
Uniross's "Hybrio" or Gold Peak's
"ReCyko" cost a bit less, but work the same. You'll probably not get
scammed!

The price difference isn't large any more, though. So unless you're really pinching pennies or buying a lot of
batteries, you might as well buy LSD cells locally.