
An anonymous reader writes "Intel's Open-Source Technology Center was given source-code access to Valve's Left 4 Dead 2 game in order to help them fix Linux bugs and to better optimize their graphics driver for this forthcoming Linux-native game on the Source Engine. Intel has talked about their Valve Linux development experiences, and now they have managed to get Left 4 Dead 2 running on their open-source graphics driver. Valve has also grown fond of open-source hardware drivers: 'Valve Linux developers have also been happy looking at an open-source graphics driver. Valve Linux developers found it equally thrilling that now when hitting a bottleneck in their game or looking for areas for performance optimizations, they are simply able to look into Intel's open-source Linux graphics driver to understand how an operation is handled by the hardware, tossing some extra debugging statements into the Intel driver to see what's happening, and making other driver tweaks.'"

Of the GPUs available, Intel has by far the best open source driver. They don't even bother supplying a proprietary one. However, Intel GPUs suck, and gamers will have either an nVidia card or an AMD card. There are open source drivers for both of these, but they both suck far worse than the Intel driver.

I really hope Valve can talk either AMD or nVidia into doing something about the quality of their open source drivers. But I'm not holding my breath. Chances are they'll just release a Steam box with Intel hardware instead.

If memory serves, they haven't been OSS friendly in a while and the cards for which unofficial drivers exist are mostly antiques at this point.

(Probably more fundamentally, they fell badly behind in the performance wars, and digital video interfaces made their reputation for quality high-resolution analog output less relevant, and retreated into specialist multiheaded/2d workstation/display wall/etc. gear. I don't know how well regarded they are in that market; but it just isn't a very big one compared to c

Their main business is not consumer graphics cards. I believe their focus is on building specialized imaging hardware for industrial systems and providing the associated image processing software (ML if I recall correctly). I imagine the margins are far greater than what they were getting building consumer GPUs.

There's still a bit of their stuff on server motherboards, plus external boxes designed to add a lot of screens to laptops. This year I tried running TurboVNC using 3D acceleration on a Matrox chip in a new server, and it performed far worse than sending it via X and OpenGL to a GeForce 6* from 2006 or so that I had in an old desktop machine. It's purely the fault of the chipset and not TurboVNC, since I hooked a monitor up to it and it was just as slow.

I was a loyal Matrox customer back in the 90s, because they supported OS/2 fairly well with cards like the Millennium and Mystique (called the "Mistake" because it failed to live up to expectations). I don't believe they're much more than a niche provider in the market today.

Matrox also does a good amount of business providing barebones 2D chips to OEMs for servers, namely IBM, and some to SuperMicro. They stick around in that market because their ancient G200 drivers are still beyond reproach for providing .99999+ uptime.

Also, you can't really fault them for going purely into video editing and multi-headed systems as they couldn't keep up on 3D performance. nVidia barely figured out how to run more than 2 displays in the last year, and AMD can barely manage 3+. Matrox has b

I've still got one of their expensive 3-head cards in something. Now when I want something with more than two screens, I just put another card in another slot instead of getting an expensive Matrox card.

A gaming box with *current* Intel hardware would suck. But that's primarily because the current Intel "GPUs" are integrated onto the CPU die, and are only "good enough".

I wonder how well Intel's performance would scale up. If they took their basic design, and used 600-1600 render cores instead of 6-16. I mean, a top-of-the-line card from nVidia or AMD has *thousands* of cores spread between two dies, while Intel is cramming a dozen cores into whatever space is left on the CPU die. Let them put out a full-size card, put a few gigs of dedicated memory and cache on it, and see what happens. We won't know for sure until it's tried, but rendering tends to be a pretty scalable problem.

If Intel *does* do that, they would be a likely candidate for the hypothesized SteamBox console, since they seem to be working *very* closely with Valve.

One minor dispute from me: If their main target is a SteamBox console, why make a full-size card? I'll take onboard graphics if the chip on the main board is as powerful as a contemporary daughterboard.

It'd have to be "discrete" anyways, even if it is integrated into the board. There isn't enough room or thermal overhead to put the necessary power on the same die as the CPU, which is what modern Intel graphics does.

By "full-size card" I meant "full-size die". I have a tendency to use "video card" for things that aren't actually cards - it's easier to say than "GPU", and makes it more clear that I'm referring to the actual processor plus any attached memory.

Logically, though, it would be either on the motherboard, or worst-case attached as an MXM card.

However, I would like to see Intel try to crack into the consumer graphics card market again. And once they have the chip die designed, it's not particularly difficult to

I think the main reason Intel would never release a discrete graphics card at this stage is because that'd make it obvious how silly it is to buy both an integrated and a discrete GPU, as everyone building an Intel gaming PC does today. To not look really stupid they'd have to release a GPU-less CPU to pair with their discrete GPU (apart from the overpriced LGA2011 CPUs to go with the overpriced X79 motherboards), and that'd let AMD and nVidia back into a market that Intel is making a killing off now - using thei

Except gamers are already used to having an integrated GPU that goes to waste.

Myself, I have an Intel HD in my CPU, which is currently never used because I've got a whopping GeForce 660M next to it*. Several of my other computers, even desktops, have integrated graphics that are completely wasted.

What would be useful is if you could SLI/CF (or whatever Intel wants to call it) the integrated GPU with the discrete. I've been told the AMD Fusion CPU/GPU chips can CrossFire with a discrete Radeon, although I've

In the consumer laptop market (which is larger than the consumer desktop market by a wide margin, and has been for years) having an integrated GPU in your laptop CPU makes Intel laptops an easy sell over AMD.

I don't know how well it would scale up. One thing is for sure, though: they'd need to scale other things than just the execution units (which is all they do for now).
Oh, and your scaling numbers are a bit off. Intel has only 6-16 EUs, but these are 8-wide. So if they wanted a chip comparable to a high-end nVidia or AMD card, they'd only need somewhere under 200 EUs (they also run at a somewhat higher frequency), not 1600 (which would be insane). Likewise, for a good performance card, ~100 EUs would be enough.
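To put rough numbers on that, here is a back-of-envelope sketch of the EU-vs-cores arithmetic (the 1536-core figure is an assumption standing in for a top-end discrete card of the era, not a vendor spec):

```python
def alu_lanes(execution_units, lanes_per_eu=8):
    """Total SIMD ALU lanes for a given number of 8-wide EUs."""
    return execution_units * lanes_per_eu

# Intel HD graphics of the era: 6-16 EUs, each 8 lanes wide.
low_end = alu_lanes(16)       # 128 lanes on the biggest integrated part

# A hypothetical discrete part matching ~1536 advertised "cores"
# (each roughly one scalar lane) would need 1536 / 8 = 192 EUs.
target_cores = 1536
eus_needed = target_cores // 8

print(low_end, eus_needed)    # 128 192
```

So "somewhere under 200 EUs" follows directly from the 8-wide lane count; a higher clock would shave that number further.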

Chances are they'll just release a Steam box with Intel hardware instead.

I don't see that happening. Instead, I see Valve partnering with one of the "real" GPU companies (AMD or NVidia) and co-operating with them in the same manner. In NVidia's case, I see them signing enough NDAs to get access to the closed-source driver code.

Unlike the speed of a graphics card, caring about whether a car can do 130mph on a public highway (what's that, about 210 km/h?) is something only for arseholes who don't give a shit about the risk of killing other people with their selfish indulgences.

Practically, you're never going to drive on a highway at more than about 100-120 km/h (~62-~75 mph; typical legal maximums in many countries), or maybe 120-140 km/h (~75-~87 mph) for short stretches of straight road where you guess there aren't any cops or s
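For the unit conversions being thrown around here, a quick sketch using the exact one-mile = 1.609344 km definition (the speeds are just the figures from the comments above):

```python
KM_PER_MILE = 1.609344  # exact by definition

def mph_to_kmh(mph):
    return mph * KM_PER_MILE

def kmh_to_mph(kmh):
    return kmh / KM_PER_MILE

print(round(mph_to_kmh(130)))  # 209 -- "about 210 km/h" checks out
print(round(kmh_to_mph(100)), round(kmh_to_mph(120)))  # 62 75
```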

Intel's HD4000 is an impressive piece of silicon. It runs BF3 (barely playable, but not competitive), Skyrim and a host of other popular, modern games. I'm excited to see where the HD4500 or HD5000 heads. The HD4000 proved that Intel has what it takes to compete on the low end with Nvidia and AMD/ATI. The groundwork Intel is laying now shows room for impressive improvements in the next 18-24 months. The laptop graphics market is going to be very interesting to watch in 2014.

I have an i5 Sandy Bridge on Xubuntu LTS with xorg-edgers (latest graphics drivers from git). After reading this and following articles pointing to tests using this GPU, I found http://www.xonotic.org/ [xonotic.org], which is quite an impressive OSS game. I played it at 1920x1080 with Normal effects and it looked stunning with no apparent stutter. Though I'm sure there are plenty of recent games that would bring this GPU to its knees, it seems to be up to the task for moderate gaming.

"Suck" is relative. I've got a pile of desktop machines at work set up with nVidia cards just to get an easy dual monitor setup, and they don't run anything more demanding in 3D than Google Earth. Intel graphics is now capable of handling that and more. The new Diablo looks like it doesn't need any more GPU grunt than Intel can provide, and they are probably approaching the point where you could get Skyrim to run decently on Intel. Performance may suck in comparison to even the low range AMD and nVidia card

The thing is that even though for desktop compositing the FOSS drivers are absolutely brilliant they don't do well at all in anything 3D related. And when I say don't do well I mean it in a 95% decrease way (if the blob will do 100fps foss will do 5).
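The "95% decrease" arithmetic, spelled out as a tiny sketch (the 100 fps vs 5 fps figures are the comment's own illustrative numbers):

```python
def percent_drop(blob_fps, foss_fps):
    """Relative FPS loss going from the binary blob to the FOSS driver."""
    return 100.0 * (blob_fps - foss_fps) / blob_fps

print(percent_drop(100, 5))  # 95.0
```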

I have this feeling that the Linux community (or the larger free software community - ESR fans may simply not care), ever since the announcements of the Steam and L4D ports went public, thinks more highly of Valve than the company deserves. At the same time as they criticise the Windows 8 walled garden, Valve is pushing a new TOS on its Steam service users which, most importantly, dropped the notion of owning a digital "product" in favor of a "subscription". This is yet another step on the path towards taking our legally purchased software away from us.

As Linux serves to give its users total control over their computers, I think at least part of the community should rethink their enthusiasm over Valve coming to the Linux platform. In my opinion, some of the practices it brings are totally at odds with free software values.

One of the core issues here, though, is that we have a company that is trying to cater to their customers with great games, an easy and intuitive way to install/manage them, easy ways to keep them up to date, solid support for mods and modders on practically all their own games, good prices, and DRM which doesn't get in the way of almost anyone.

I agree that what you say about Steam's TOS is a step back (if it's really as you describe; I hadn't heard about it before, but I'll take your word for it). But in

On the other hand if you ask for less than $10,000 in arbitration they'll pay for your lawyer fees win or lose.

On the other hand if you ask for less than $10,000 in arbitration they'll pay for your lawyer fees win or lose.

If you are going to dispute for a small amount of money you are always better off using Small Claims Court. It is a real court and you can expect to get a real fair verdict. Most of the small claims courts even forbid lawyers.

On the other side arbitration in USA is known to be so biased that it is literally a farce (in 99.9% of the cases). The arbitration is done by private entities under little to no oversight, you are going to face corporate lawyers and the arbitration is binding, meaning you can't appe

If you are going to dispute for a small amount of money you are always better off using Small Claims Court. It is a real court and you can expect to get a real fair verdict. Most of the small claims courts even forbid lawyers.

In my state, either party to a small claims suit can request that the case be moved to a regular court. If requested, it shall be moved. So small claims is effectively neutered here. If a big company wants to bury someone in legal fees, all they have to do is ask the judge to let th

I greatly respect Valve's involvement in the whole open source driver issue, but I still won't buy anything from them because their products are very DRM-infested. Respecting a single action from a company and willingness to buy their products are very different things.

While what you say is true, Steam coming to Linux is infinitely better than it staying on Windows, also for people's freedom.

15 years ago I heard similar arguments that ports of free software to DOS/Windows should be discouraged because of the intermixing. Guess what? That intermixing gave me and a whole bunch of other people a hint at what free software really means and eventually brought us over.

So get off your high horse, mr. Anonymous Coward, and let people cheer.

This is good news, because a company like Valve might actually have the clout to get AMD and/or nVidia to release good open-source drivers. After all, if it wasn't for the games released by companies like Valve, a heck of a lot fewer PC owners would need/want discrete video cards. And neither AMD nor nVidia wants a popular game to run worse on their card than on their competitors.

but if a steambox is coming, and they buy 10 billion graphics cards from AMD or nVidia, then there's no reason why they would bother with an OSS driver - the hardware will be fixed in stone, so a single binary custom built for the steambox will be all that's needed.

No, you'll never get these 2 to provide OSS drivers for their high-end products simply because this is part how they compete with each other. Until someone understands this, nothing is going to change.

Usually consoles are un-upgradeable so developers have a common standard to develop against. When hardware gets better, they release v2 of the device.

You might get away with varying hardware abilities on a PC, but a console is a consumer device; it's supposed to be guaranteed that stuff you buy for it will run perfectly every time. You only get that if you disallow people from fiddling with its internals.

You'd think this would be obvious... but it's good to see someone stand up and take notice. Of course having the source is extremely beneficial, especially if you have the inclination and skills to interact with it (or can pay someone who does possess these qualities). I hope this gets lots of coverage. Maybe with more eyes and more review, people can spend more of their time creating and trying new things and less time recreating the wheel. Open source is an excellent way to help achieve that goal.

Valve is sitting in closed rooms patting itself and Intel on the back. Intel GPU performance and drivers have, in every encounter I have suffered them, blown. Yes, they will do basic workloads gfx-wise. They will run office. They run basic apps. The times I take complex apps and have problems are legion. It's great that Intel and Valve are debugging the worst hardware in the PC gaming arena. Great. Even the current HD4000 leaves much to be desired. Might I suggest this is the last place Valve should be knobbing

Maybe developing with open source graphics drivers is great, but that's a different story than the state of graphics on open source.

Graphics on Ubuntu are terrible in my anecdotal experience. On my last laptop, installing Ubuntu 9.04 failed during install and dumped me at a command prompt because it didn't support the correct drivers to display the graphical install. That was the first and last time I attempted to run Ubuntu on that laptop. Or on my newer Envy 14 with dual ATi and Intel graphics. 10.10 i

Sorry, I won't be even considering running games on my Linux boxes/laptops. I'm running Windows 8 on my gaming laptop and it handles graphics, HDMI out, dual cards, dual monitors, Steam, all games (not just Source games) just fine. Why would I ever subject myself to the mess that is graphics on Linux?

I can't run the newest 12.xx releases with Unity, since it says I need graphics acceleration and my machine can't handle it; it's probably looking at my Intel card and concluding it's not good enough, while ignoring my ATi card.

More likely it's detecting the ATi card and using the free driver. Try installing fglrx or whatever the non-free AMD driver is these days and see what that does. Low performing they may be, but I can't say I've seen an Intel vid driver perform below expectations in a very long time.

Valve also thinks denying (coercing) clients their fundamental right to pursue cooperative collective class-action lawsuits against the company, even when such suits would be ethically warranted, is great. In that context, as a Valve client who wishes he could get his damned money back for the games he can now no longer access or play, even in single-player modes, for having resisted the aforementioned coercion, I couldn't care less what Valve or Gabe Newell thinks about open source drivers or anything els

So to be competitive, hardware manufacturers may have to provide their driver source? Perhaps at least to the developers. But that could be anyone really, and the next Minecraft may run better on Intel graphics hardware than any other because some amateur developer was able to wring performance out of it that much more easily.

But at the level that AMD/ATI and nVidia are competing with each other, perhaps the one to take the edge will be the one that provides open source drivers.

System RAM is slower than video RAM, and with cards having 1-2 GB of RAM these days, that is a BIG CHUNK of system RAM to use. Sharing it means you can't really just block off 1 GB of RAM for video use.

Linux IS a good platform for games. As said, you can see what's happening at every level, which means no need to work around weird unexpected behaviors and stuff.
Linux isn't a good platform for some game developers, because of the small user base. But for Valve, aside from the initial work of porting their Source engine, it only means more reach. Having the engine already working on Macs probably helped a lot. And if great games start to be available on Linux (and I mean more than one AAA game per year, at most), it might also leverage the Linux presence.
Giving the user the choice is the only sensible choice for people working with their brains, and Valve's pretty good at it.

You're misunderstanding Valve's position. They're not tweaking the drivers so much as using the source to understand which operations in THEIR software behave poorly. You're also ignorant to how much tweaking is already done in video games to make them work under Windows. Look at the furor Rage's release last year caused because AMD's drivers were broken and id Software didn't jump through hoops to make it work on that platform like so many other companies do.

> Carmack was talking about the financial viability of targeting games to run on desktop Linux.
Remember, there is an enormous difference in targeting games to run on Linux *as well as other platforms*, compared to targeting games for Linux only (which would be a financial mistake). I'm currently using Java+JoGL(OpenGL+GLSL)+JOAL(OpenAL)+JInput to write a combat flight simulator. I develop on Mac and test on Linux and Windows. Because I have chosen these technologies the only cross-platform issues I

Ok, let me pose a simple question for you. Let's say you have some big objects in your program and you want to put them on the heap. Now, to get good performance out of your program you are going to make it thoroughly multi-threaded. Now, because the market is getting more and more diverse you want your program to run on Mac, Linux, Windows and Android. You don't have the resources of a global mega-corp to achieve this but still want to get the project done. Should I use C++ with its problems between compi

> You're right, I have never written a huge application in C++
Ok, when you scale an application you get different concerns. The complexity does not increase linearly with size and with C++ it simply gets very very hard to write a large, correct, and reliable program - especially when multi-threading heavily and going cross-platform.

> it uses more memory than C++ with its garbage collection paradigm. Java enthusiasts always excuse this saying that you should get more memory or that newer computers

> That's exactly the excuse I was talking about. You have no business using loads of RAM for your application. You should use as little as possible to run on as many computers as possible and allow as many other programs to run as possible (instead of just today's). You're not the only program on the computer, after all, and I'm not going to buy a new computer to run your program.

Wrong. Memory is there to be used. In the case of a game it would be negligent not to efficiently use all the resources tha

> I don't have to believe anything you say.
No you don't. But what you could do is check the facts instead. Either build a modern program (eg. game) in Java yourself (doesn't have to be big, just a little one - and profile with JVisualVM) or look around on the web for *current* statements that prove or disprove my assertions. I don't expect you to believe my statements based solely on my assertions - I'm confident that if you actually took the time to check you would come to the same conclusions I have: the

>Funny how Valve just *loves* Linux now that Microsoft threatens their primary business model. Meanwhile, John Carmack, who supported Linux before it was trendy and cool and has no financial incentive to shit all over Microsoft claims that Linux is not a good platform for games. Gee, I wonder who I should believe?!?!

John Carmack did not say that Linux is not a good platform for games. He said that the games that id Software ported to Linux did not earn back the cost of porting. This is a hard fact. But it's no wonder that this is the case. Most gamers that use Linux also have a Windows partition for gaming. And when the Windows version of a game comes out months before the Linux version, you have already "lost" a big part of the potential Linux market to the Windows version.

Now, Valve shit their pants because of the Windows market, and try to change it. And they have the power. Valve can solve all the distro and patch problems for the developers. If they deliver an easy way for game developers to reach the Linux audience, Linux gaming will hopefully be a worthwhile market.

While we're at John Carmack, he also said that he found the Intel open source drivers really interesting to look at, and recommended it to anyone else writing a graphics engine. He then proceeded to say that if he could clone himself so he would have more time, he would love to work on optimizing them.

So it's not just the Valve developers who see the benefit of open source drivers.

No matter how you twist it, if Linux gets graphics drivers on par with Windows, it is much better for games since it wastes much less resources.

Case in point: My Linux installation at work, which is an 8 core, 16 GB RAM computational workstation, uses 231 MB of RAM after I've logged in. Two days after last reboot, with five terminal windows, Firefox with a dozen tabs, Citrix (to run Outlook, restrictive company Exchange policy...), Gimp, Blender, two additional CAD programs, and two instances of a PDF viewer, I'm still only using 1.7 GB RAM.

On the same system, Windows 7 uses 1.5 GB after I've logged in, no programs running. And yes, I'm using both preload and readahead on the Linux system, so don't give me the "Windows uses RAM to store things it will need in the future" because my Linux does as well.

>And yes, I'm using both preload and readahead on the Linux system, so don't give me the "Windows uses RAM to store things it will need in the future" because my Linux does as well.
If you're *only* using 231MB of 16 gigabytes, you're not caching nearly as much as the system could/should be. The only point you make is that Linux is terrible at putting your system's resources to good use.

On UNIX machines, some filesystems actually use the free memory as a cache, but leave it marked as free in case an application needs them. UFS for instance does this, while ZFS for instance does not (it uses an explicit cache).

You gotta be trolling. I'm running dual-boot too, and just about everything I do goes so much smoother in Linux than in Windows. From my usability point of view, it feels like windows is just squandering resources. GP's numbers do seem about right to me.

That said, I've always felt uneasy about "comparing the numbers" between Linux and Windows. The way Windows' Task Manager reports memory usage is different from the default "top" view, and they're both somewhat nontransparent to the uninitiated because virtual memory management is complicated business. To make an apples-to-apples comparison, one has to precisely analyze how much memory is cached, buffered, swapped, committed and allocated. To make matters more difficult, Linux distros and users have a strong inclination to customize how the kernel manages memory and what software is being loaded, so there will be huge differences between different Linux measurements. And even Windows can be leaned out or fattened up to a great extent by users and OEMs.
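One way to get closer to an apples-to-apples Linux number is to subtract buffers and page cache from the "used" figure. A sketch that parses /proc/meminfo-style text (the field names are the real kernel ones; the sample values are made up):

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style 'Key: value kB' lines into a dict of kB."""
    fields = {}
    for line in text.splitlines():
        key, _, rest = line.partition(':')
        fields[key.strip()] = int(rest.split()[0])
    return fields

def used_excluding_cache_kb(info):
    # Memory genuinely held by applications, not reclaimable cache.
    return info['MemTotal'] - info['MemFree'] - info['Buffers'] - info['Cached']

sample = """MemTotal:       16384000 kB
MemFree:         9000000 kB
Buffers:          500000 kB
Cached:          5000000 kB"""

info = parse_meminfo(sample)
print(used_excluding_cache_kb(info))  # 1884000 (kB actually in use)
```

This is the same subtraction behind the "-/+ buffers/cache" row that `free` prints, which is the number worth putting next to a Windows figure.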

so don't give me the "Windows uses RAM to store things it will need in the future" because my Linux does as well.

Apples to oranges. The two systems use different memory management schemes. Case in point: 8 GB laptop, with some explorer addons and several background services (including Forefront antivirus), and I use ~900MB of RAM. I understand that under memory pressure, Win7 can work quite well with as little as 512MB, which is about the minimum I have seen your average modern Linux desktop work well with.

Of course, if you want to compare a stock Win7 desktop with something like XFCE with no addons or icons or an

The fact that the open source drivers are on Linux isn't really important to the story at all, aside from the background (i.e. that is the reason Valve is working with open source drivers to max out performance in the first place). The interesting thing is how the OSS allows Valve to tweak or examine the driver code on the fly to find out how to optimize performance.

Reading the summary is great, but understanding the point is even better.

The interesting thing is how the OSS allows Valve to tweak or examine the driver code on the fly to find out how to optimize performance.

Anyone who *actually* games wants to know who the fuck cares about underpowered Intel video card drivers. Oh, it will be able to play 5 year old Valve games? WHOOPTY-FUCKING-DOO.

Perhaps you forgot about the time, years ago, when the FOSS crowd courted ATI, saying "Release your specs! The FOSS community will do the rest!" What did ATI do? They released the specs. An opensource driver was born, and it's an unstable, slow piece of shit. When these FOSS folks realized they weren't technically competent enough to actually create a driver for a modern GPU architecture, they went back to demonizing ATI for not releasing their proprietary driver under a free license.

What's the moral of the story here? Just because something is open source doesn't mean "the community" is going to be able to do shit about it. Intel wants to point and say, "Look! Intel GPU can play 5 year old valve games!" Valve wants to say, "Look, Linux is a viable gaming platform!" At the end of the day, it's totally irrelevant to people who want to play new games on modern GPU's.

You are clearly not a big picture person. What this means is that a multi-million dollar company is saving time by using open source. Time saved is money saved, and, using political algebra, every dollar saved is 30 jobs. What did Intel lose? Nothing. Meanwhile, the economy as a whole gains GDP and everyone wins.

But, absolutely, you're right, and the other guy is wrong: this is all useless because you don't like Valve's game line-up.

Anyone who *actually* games wants to know who the fuck cares about underpowered Intel video card drivers

Starcraft 2 is 5 years old? Torchlight 1 / 2 are 5 years old? League of Legends is 5 years old? I suppose you could label TF2 and WoW as 5 years old, but that kind of ignores the whole "still actively developed" thing.

All of those work just fine on an underpowered Core i3 2310m, using HD3000 graphics. Current gen Ivy Bridge processors are expected to deliver ~10-15% better performance, and IIRC the HD4000 line offers ~50% more performance than the 3000.
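Compounding those claimed gains looks like this (the percentages are the poster's estimates, not benchmark results):

```python
def compound(base, gains):
    """Apply successive fractional performance gains to a baseline."""
    perf = base
    for g in gains:
        perf *= (1 + g)
    return perf

hd3000 = 1.0
hd4000 = compound(hd3000, [0.50])      # "~50% more than the 3000"
print(hd4000)                          # 1.5
print(compound(hd3000, [0.50, 0.50]))  # 2.25, if a next gen repeated it
```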

Anyone who *actually* games wants to know who the fuck cares about underpowered Intel video card drivers. Oh, it will be able to play 5 year old Valve games? WHOOPTY-FUCKING-DOO.

Again, NOT THE POINT. The point is: open source drivers are easier to work with. Creating one for a graphics card yourself? Hard. Writing drivers is always a bitch; that's why they often don't work right (even the closed-source ones created by the people who made the hardware in the first place). That's why the ATI open source driver kind of sucks. Graphics cards have a ton of out-of-spec tweaks and gimmicks to improve performance, and always have, sometimes even tweaks intended to make a single engine run well. That makes creating your own driver a monumental task, even if you ostensibly have the specs, because those specs are never quite valid. Hell, ATI/nVidia can't even get their drivers to work right all the time, and they made the damned cards.

All of that is a reason why the ability to work with an existing driver (assuming it is well-made) is a huge bonus. Because otherwise you are working with a black box that doesn't ever work exactly as advertised and as it properly should. If you can look at the source, you can try to figure out why. Ideally, the hardware itself would be open too so you could see how far it deviates from the specs (they all do), but we don't live in an ideal world. That's why I use a closed-source driver and probably always will. But it'd be cool if I didn't have to. And that's the point of the story.

An opensource driver was born, and it's an unstable, slow piece of shit. When these FOSS folks realized they weren't technically competent enough to actually create a driver for a modern GPU architecture, they went back to demonizing ATI for not releasing their proprietary driver under a free license.

Not quite what I've heard. AMD didn't just release the specs, they also released a driver... and assigned several developers [wikipedia.org] to keep working on it. My own opinion is that the open driver keeps slowly improving, and has been usable for a bit. (With the caveat that I don't play games very often.)

Also, you don't have to provide patches. You can also provide good bug reports. These are talented, experienced graphics programmers who like to push hardware to its limits. They are likely to "break things" and kind of know what's going on. They can share this information with the people who need to know.

Back in the day (as it happened, in that day the soon-to-be founders of Matrox, which was mentioned in an earlier post, worked in the same building as I did, at Tektronix in Wilsonville), we used various early computer games as useful hardware test cases for Tek's graphics terminals and workstations. Just as now, games pushed the envelope of everything - software, hardware, and thermal. Running Asteroids was a good way to find out how to blow up the terminal's embedded OS, or smoke the power supply.

Valve as a company is built to experiment. They were experimenting before they had metric fucktons of money (a metric fuckton is 1.7 imperial fucktons). Turning TF2 into "My Pretty Mercenary" (accessorize! explodize!) was an experiment. Steam itself was an experiment. Their experiments have frequently paid off, and now they've got the ability to do even more radical experiments.

Sadly, it has nothing to do with Flash (though Flash may well tear once everything else is fixed, but I don't care about Flash). Vsync is broken in various ways on Sandy Bridge and Ivy Bridge, both using a driver compiled with SNA enabled and not. Basically, any non-fullscreen video (or GL) will likely tear - and on a multihead display, fullscreen never happens (since "fullscreen" on one head is just a portion of the combined framebuffer). There's some attempt at vsync, since the tearing isn't random like on
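The multihead point can be sketched with simple rectangle math: the driver scans out one combined framebuffer, so a window filling a single monitor never covers it. (The side-by-side dual-1080p layout below is a hypothetical example.)

```python
def covers(window, framebuffer):
    """True if window (x, y, w, h) exactly covers the framebuffer rect."""
    return window == framebuffer

heads = [(0, 0, 1920, 1080), (1920, 0, 1920, 1080)]  # two monitors
combined = (0, 0, 3840, 1080)                        # what the driver sees

game = (0, 0, 1920, 1080)      # "fullscreen" on the left head
print(covers(game, heads[0]))  # True: fills its own monitor
print(covers(game, combined))  # False: not fullscreen to the driver
```

Which is why per-head vsync needs driver support beyond a simple "flip the whole framebuffer" path.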