
RemyBR writes "Softpedia points to an Nvidia Developer Zone forum post revealing that the company has removed a specific Linux feature as of the v310 drivers in order to match the Windows platform. A BaseMosaic user on Ubuntu 12.04 noticed a change in the number of displays that can be used simultaneously after upgrading from the v295 drivers to v310. Another user, apparently working for Nvidia, gave a very troubling answer: 'For feature parity between Windows and Linux we set BaseMosaic to 3 screens.'"
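
For context, BaseMosaic spanning is configured in xorg.conf. The sketch below is hypothetical (the device identifier and MetaModes layout are invented for illustration, and option spellings can vary by driver version); it shows the general shape of the kind of multi-screen BaseMosaic setup users reported losing:

```
Section "Device"
    Identifier  "nvidia0"
    Driver      "nvidia"
    Option      "BaseMosaic" "on"
    # Hypothetical four-panel row spread across two GPUs; per the forum
    # reports, v310 caps layouts like this at three active screens.
    Option      "MetaModes" "GPU-0.DFP-0: 1920x1080+0+0, GPU-0.DFP-1: 1920x1080+1920+0, GPU-1.DFP-0: 1920x1080+3840+0, GPU-1.DFP-1: 1920x1080+5760+0"
EndSection
```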

No, it would be communism if the nouveau project decreed that you couldn't run competing software (like nvidia's binary) and had the authority to enforce it. After all, it was developed by 'The People', and you will use and like it. Thankfully this is not the case. It's just too bad nvidia is removing functionality for idiotic reasons.

What deal? The only deal would be between Nvidia and Microsoft, who I'm sure paid a princely sum to hide one of Windows' various deficiencies.

Why do that, when you control the Windows logo rules?

Windows must have a preferred status; features available to Windows users must surpass the list of features available to users of competing platforms; that is, as a condition of applying Windows logo certification to a qualifying hardware product, Hardware must have an experience or supported featureset on Windows that exceeds the user experience on any competing operating system.

This is not a quote. It is an implication: that behind closed doors, between vendors, there is an "arrangement" MS requires, and if a vendor refuses to comply, MS has the stick of refusing logo certification for their product. If nVidia doesn't get the Windows logo, then neither do any of the hardware builders or OEMs using nVidia components; therefore, they are likely to ship someone else's hardware instead, so they can get the logo.

In other words, you made up a phony quote that looks like it's copied straight from a real policy and got modded to +5 Informative, when instead it's just speculation on your part.

It's well known that for 18 years (at least since 1996), Microsoft hasn't wanted OpenGL to offer features that DirectX doesn't have. Some companies were reportedly required to put their "brightest graduates" on DirectX projects rather than OpenGL ones; that comes from interviews and direct employment.

Well, from experience, Microsoft has secret contract terms like this that repeatedly come out years after competing interests get shitcanned on support from hardware vendors or OEMs. So while this exact term may be made up, something like it most certainly EXISTS in their contracts... because that's Microsoft's MO.

I'm certainly no fan of Microsoft or Windows -- you can check my posting history to see that -- but it strikes me that the WHQL requirements could have a good reason behind them, and that such an exception shouldn't be granted. In particular, it makes sense to me that the OS should be able to see multiple GPUs separately so that it can manage them itself.

I may not entirely know what I'm talking about, but I get the general impression that the real issue is that X delegates too much responsibility to the driver, and that the

But doesn't Wayland get rid of all the remote desktop capabilities and whatnot too? I suppose I understand the need to break compatibility (in order to move to a more flexible driver model) but I don't understand the need to break compatibility gratuitously (by changing the API more than it absolutely has to) or the need to remove features.

Seriously, does Microsoft need 3rd-party vendors to make their "user experience" better than their own operating system can? I'm sorry, let me rephrase that: do hardware drivers make the user experience so much better? That must be one crappy operating system then....

I think WHQL certification is mostly MS's way of pressuring component vendors to implement new features they want.
For example: UEFI Secure Boot / OS signing, preventing OSes such as Linux from booting, or providing TCPA / TPM (Trusted Platform Module) support.

But that's not all; there are plenty of features. Whatever MS requires or forbids will be extremely influential, because computer manufacturers want to be able to advertise their product using the Windows logo, and they are required to use only Windows-logo-certified components in their systems. A hardware vendor not getting the stamp of approval on their product can therefore be harmful to their business.

MS will use their leverage to do whatever they think will increase the number of people buying their product; that includes improving their own user experience, or diminishing the user experience of older operating systems or of competing vendors' OSes.

For example: making new hardware no longer compatible with XP or Windows 7 would be a win for them, because it encourages more sales of Windows 8.

The WHQL [microsoft.com] requirements are full of lists of features that must be supported and features that must not be supported by hardware.

In the former category, they list supposed business justifications, and it's all about user experience.

In the latter category, these are limitations of Windows, and the hardware is not allowed to support features outside of Windows' limitations.

In the middle category, there are features hardware vendors must ask for permission to implement; that is probably the safest category for MS to use to pressure vendors: just withhold permission until they agree to 'off the record' conditions.

No, I know where the quote comes from; it just wasn't applicable. Vader made a deal; Nvidia made no deal with Linux. Their only deals are with Microsoft, who are probably the Vader here (Darth Ballmer?).

That's kind of the point: most of the time the open source world does not get to make deals with corporations. You take what they dole out and then you thank them for it.

I don't like it, and I'm hoping it doesn't last, but it does seem to be the reality.

In the ideal world our esteemed colleagues at Redmond will continue to screw themselves over and the world will turn slowly to Linux and all the hardware vendors will start playing ball, and there is some indication that has been the gradual trend over the pa

I've looked, and found no reference for this. On their SteamOS [steampowered.com] page, they hint at it but it's nebulous:

"Cooperating system

Steam is not a one-way content broadcast channel, it's a collaborative many-to-many entertainment platform, in which each participant is a multiplier of the experience for everyone else. With SteamOS, "openness" means that the hardware industry can iterate in the living room at a much faster pace than they've been able to. Content creators can connect directly to their customers. Users c

Considering it's a console that plugs into a TV, I don't think this matters much at all. Valve's definitely been throwing their weight around demanding better performance and fewer bugs, though. nVidia Linux drivers have been performing better with every release.

That or the fact that, of the four eighth-generation consoles available now, three run AMD graphics. The PS4 and Xbox One have essentially the same AMD APU, and the Wii U is reportedly built on a Radeon HD 5000 [wiiudaily.com]. Only the OUYA console has NVIDIA graphics, and that's the same Tegra 3 that's in the first-generation Nexus 7. Perhaps this is NV's attempt to redeem itself to gamers who say OUYA doesn't count.

Meanwhile as of 3.11 the kernel "radeon" driver is already fully functional, complete with power management and KMS support.

Frankly I'd wager nvidia has already lost on Linux, even though it may currently appear they are still the preferred platform with their higher quality binary driver. But binary drivers have a very limited future on Linux, especially such a critical one as the graphics driver. AMD may have a shitty binary driver, but the "radeon" driver is miles ahead of "nouveau", and once they start

once they start seeing the signs on the wall it will be a simple matter for them to put in a little effort and make "radeon" the best graphics driver for gaming on Linux

Or a lot more work, from what I gather one of the key differences between the radeon driver and the catalyst driver is that they've created a ton of behavior profiles to fit different workloads and they're continuously working to update them and providing even more specific ones tuned to the individual game. That takes a lot of manpower and a rather complicated driver infrastructure, while the open source driver has gone for a much simpler "jack of all trades" acceleration. Last I was really paying attentio

Anyway, if I'm right, Optimus support under Linux is not on par with Windows. Are you at Nvidia going to fix Optimus on Linux, or "for feature parity" are you going to make the Optimus support worse on Windows too?

Directly quoting someone from that thread because this was exactly what I was thinking of.

All this time I've been pissed at the nouveau [freedesktop.org] drivers that came as default with my linux distribution. "NVIDIA's drivers are working perfectly" I thought. "Why the hell are you building something not as good, just to make it open source?"

RMS was always insightful. We were just trying to cut corners by being "pragmatists". We fully deserved what we got out of this consumerist, passive stance. Oh, not all of us (Theo, are you reading this?), but most of us deserve to be hit by a clue bat every now and then.

It still lacks critical features like proper power management, which means cards using Nouveau tend to have reduced lifespans compared to the binary drivers which actually control the fans and voltages properly.

Nouveau is a good idea and should be encouraged, but there's no fucking way I'll touch it even if NVIDIA treats Linux like a second-class citizen. And honestly, if you aren't used to being treated like a second-class (or, after OS X, a third-class) citizen in the desktop computing world, you haven't used Linux.

Yes, it's not as good as it should be. Fortunately, NVidia has opened up a lot more specs, so the last missing bits for many features and "irky bugs" in Nouveau can finally be dealt with. It's still not enough to build a fully featured, just-as-fast driver, but in the last few months significant changes were made in how NVidia treats the open source community, and so far Nouveau driver developers have been happy with what they got. It's not finished, but at least it's heading in the right direction.

It still lacks critical features like proper power management, which means cards using Nouveau tend to have reduced lifespans compared to the binary drivers which actually control the fans and voltages properly.

Can you back up the statement that it actually reduces lifespan, or is that just wild conjecture? Indeed Nouveau lacks support for voltage and fan control by default, but that leads to it using the lowest frequency for the GPU, which doesn't require much fan management. I wouldn't expect any degradation of lifespan.
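
A side note on what "lowest frequency" means in practice: on recent kernels, nouveau exposes the card's performance levels through a debugfs file (typically /sys/kernel/debug/dri/0/pstate, though the exact path and format vary by kernel version). Here's a small sketch that parses output in that general shape; the sample text is made up for illustration, not captured from real hardware:

```python
# Parse nouveau-style pstate output into a list of performance levels.
# Assumed format (hypothetical sample below): each line looks like
# "07: core 389 MHz memory 810 MHz", and the currently active entry
# carries a trailing "*".

def parse_pstate(text):
    levels = []
    for line in text.strip().splitlines():
        line = line.strip()
        active = line.endswith("*")
        if active:
            line = line.rstrip("*").strip()
        label, _, clocks = line.partition(":")
        levels.append({
            "level": label.strip(),
            "clocks": clocks.strip(),
            "active": active,
        })
    return levels

# Made-up sample in the general shape of a pstate file:
sample = """\
03: core 202 MHz memory 324 MHz
07: core 389 MHz memory 810 MHz
0f: core 772 MHz memory 1620 MHz
AC: core 202 MHz memory 324 MHz *
"""

if __name__ == "__main__":
    for lvl in parse_pstate(sample):
        marker = " (current)" if lvl["active"] else ""
        print(lvl["level"], "->", lvl["clocks"] + marker)
```

Which illustrates the point above: without reclocking support, the card simply stays parked on the lowest listed level.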

Check what I wrote. I didn't blame the people working on Nouveau at all, it's a tough job and I even said we should encourage their work. It's just that the current results are shit for those who have high performance and functionality standards. It's stupid to ignore the reality of the situation just because it's not ideal and I'm not going to sugar-coat the current situation.

As for NVIDIA hardware, it's still got the best support in Linux when you want power. At work we use CUDA because OpenCL still isn't

I would hardly say OS X is third-class... OS X is far more open than Windows; key parts of the system are open source, and most parts of the system do a good job of supporting open standards (e.g. the calendar app supports CalDAV, whereas Outlook is tied to Exchange via proprietary protocols)...

I used that feature on a GeForce2 MX to try it out, a good while ago. No idea what you mean by "hacking" 3D support; you only had to press a hotkey to enable stereo 3D in any game or app (with or without great results, but at least it was working or trying to). Five years later I tried shutter glasses on a GeForce 6/7 (too bad FSAA wasn't working, as I had to run at 640x480 or 800x600 on the old CRT to play in stereo).

Anaglyph was really shit though; it fucks with your color vision. After using it for an hour your eyes or brain compensate: if you look away from the screen and close one eye, one eye sees in red and the other in blue! To this day my right eye seems to see in a warm tint and the left eye in a cold one.

That's normal. I remember as a child (long before I ever encountered anaglyph glasses of any kind) amusing myself by switching from one eye closed to the other, which would slightly change the tint of what I saw.

Anaglyph was really shit though; it fucks with your color vision. After using it for an hour your eyes or brain compensate: if you look away from the screen and close one eye, one eye sees in red and the other in blue! To this day my right eye seems to see in a warm tint and the left eye in a cold one.

That's not the Anaglyph driver, that's physics and biology and is an artifact of the red-blue method of "3d". You'll get the same effect watching an old 3d movie from the '50s with the red-blue (more like bluegreen

Indeed, the complaint was more about anaglyph than the driver per se. You don't even need a driver, you can look at stuff right from image search results. The comments about blood flow in the eyes from other posters are damn interesting too, this means there are probably no long term effects.

About shutter glasses, they don't deal with colors and theoretically would not affect them. They darken the picture and give an unwanted blue tint, but that's because the LCDs are far from perfect. With the old style PC

My eyes do that naturally and I've never used any of those red/blue glasses. It's normal for your eyes not to see the same colors.

My big beef with cheap, TN panel LCDs is that the viewing angle is so narrow, my left eye sees different colors than my right eye. I have to use a good IPS display just for everyday work or I get a stereographic-induced headache. Forget resolution -- I'm pissed it took so long for affordable IPS displays to hit the market.

In this case, I think that Nvidia is using driver capabilities to sell new chips. What happens is that the chips are designed with a certain feature set, but the driver does not enable all of them. Later, "new" chips are released, but the only real change is a change to the drivers, which now unlock features already in the prior chips.

Nvidia still doesn't get it. Reminds me of the now-famous Torvalds quote from the video where he sends his regards to Nvidia.

What doesn't it get that the quote from Linus reminds you of? That Linux developers are unprofessional? I think it gets that very well.

Nvidia still doesn't get that removing a feature from the Linux driver, to level it with the one for an inferior product, is a big "f_ck you" to their Linux customers.

If that is being professional, then I think calling Linux developers "unprofessional" is praising them.

Well, they're certainly being encouraged to work with Linux by being told "fuck you" by its most important developer. I can see why they didn't sweat it by reducing a feature in the Linux driver to spare Microsoft's blushes. I expect they've never been told to fuck off by Microsoft.

I think you are being unfair to assume nvidia product managers are such petulant tantrum-throwing prima donnas that they'll retaliate against their users because a project's leader prefixed constructive criticism with a rant nvidia pm's might find tactless. I see no evidence of that.

If that kind of behaviour is the norm in your company, or considered "an understandable human reaction" or something, I suggest it might be practical to demand that your coworkers grow the fuck up if they expect to be taken seriously.

I'm sorry, I thought the sarcasm was plainly evident in my post. I guess not.

If being "professional" means presenting a false, colorless, expressionless version of yourself rather than behaving like you would with friends and family without the need to present an artificial image I think the Linux developers are better off.

People really need to get over their silly aversion to so called "obscene" words and gestures. They are just words and gestures. They aren't going to hurt anyone. We would all be better off if the entire world were bombarded with and thereby desensitized to this stuff.

I see you missed the sarcasm, but such as it is.

There is middle ground between a false, colourless, expressionless version of yourself and telling someone to fuck off. That's the target ground you (and the business you represent) aim for. Your strawman argument suggests that the alternative to telling a company to fuck off is to be an expressionless drone.

For any people with free time: how about starting a PAC to get a new law passed that would require hardware manufacturers to provide full specifications of their products to consumers in a standardized format? It could be used not only by open source developers (the right of the consumer to use purchased gear as he or she sees fit) but also to guarantee and verify all provided functions, and that there aren't any additional spyware functions included. Conceivably it could be used in a software/firmware binary verification program too.

Besides, you can't get the government to sign any laws that tie their hands on spying. And spyware, whether corporate or governmental, is not going to be outlawed; that would interfere with, well, business and government! At least with the current bad behavior of those two.

The time when government stood up for consumers' rights was 20 or more years ago. The last few decades, well, n

get a new law passed that would require hardware manufacturers to provide full specifications

This is analogous to patching bad code with more bad code instead of fixing the bug. The reason there is so little competition among video card manufacturers can be found in the patent system and corporate liability law.

How about you start a Kickstarter project to create a 3D graphics card as good as Nvidia's that will have its hardware fully documented and its driver source always available. If you build it, Steam will come.

Come on, people, think! Why would they do that? I'll bet you anything that it makes development easier not to have a special feature just for the Linux market, tested only on Linux. Companies do not spend any more than is necessary, especially if the feature in question is not driving sales.

Let's see: you use an overblown proprietary binary blob that contains who-knows-what in times of overall NSA spying, and you dare complain that this binary blob has lost one tiny bit of functionality w.r.t. Windows' binary blob? Don't worry, the main functionality of this nVidia blob (NSA backdoor?) is still fully functional.

Nvidia has never been a perfect partner to the open source world; however, your tinfoil hat is too tight, son. It's clearly cutting off the blood flow to your brain.

Well, I was (partly) joking, but what makes you think those binary blobs are backdoor-free? That's just a belief, isn't it? Point is: there's no way for nVidia to restore confidence other than to provide the full source. As a matter of fact, I do work in IT security and I'm seeing more and more companies here in Europe avoiding those binary blobs like the plague. Even more so since all this Snowden publicity. Now, does nVidia's driver contain a backdoor? If your corporate secrets are important to you, it is

I've been running NVIDIA hardware with 4 monitors for over a decade. So maybe there is an issue with Win7/8 and multiple GPUs? In the past I would even mix and match GPUs, because Windows multi-monitor support is (was?) part of the OS. I remember packing multiple PCI (not PCIe) boards into the same machine. Lots of combinations worked, but not all of them.

So, as another user on the nvidia forums pointed out it sounds like BS.

That said, running single-screen configurations with Linux/Xinerama has been problematic

Just thought I'd post that our Kickstarter goes live on 10/9 for an LGPL graphics core. It is a complete 2D/3D Verilog implementation. The current version is PCI-based and runs on Altera/Xilinx FPGAs or as an ASIC. 100% clean, synthesizable Verilog. We have a number of stretch goals that bring new features and generic interfaces, so you could run on a PCIe FPGA board or an SoC part. The ultimate stretch goal would be a unified shader design.

We have pictures, and will have video from the FPGA board, on the Kickstarter site.

Not really so. We ran a comparison before our last server purchase for a larger client, and AMD won the performance-per-dollar ratio for virtualization with the DL385 G7. I'm also about to make a large desktop refresh purchase for a cost-conscious company, and the AMD offerings from various suppliers offer more bang for the buck. They are mostly using standard office applications, and in a couple of cases light Adobe work (Photoshop, InDesign, etc.), and for the price, even on the more heavily utilized computers, we can add a dedicated graphics card and more RAM for the same price or less than buying an Intel-based box. Given that the RAM is more expandable on many of the AMD chipsets, and that raw CPU power just isn't that important any more for the 90% use case, it makes sense to have a homogeneous environment, so Intel is likely out of the picture completely.

In a car analogy: if you are a racecar driver you need a racecar, but as a car manufacturer, don't rest on your laurels and think you can charge more just because you have a really fast Ferrari. Most people are happy with a slower but reliable Toyota with power windows and cruise control at a fraction of the cost.

I've been running accelerated 3D graphics on Linux thanks to nVidia since 2000. And thanks to Linus' pragmatism

In an ideal world, nVidia would provide their drivers as Open Source for the FOSS crowd, and one day they might if they can get the IP issues sorted out.

I am a supporter of Linux and the FSF and I admire and support the efforts and ideals of both. However, these are ideals not physical reality. I choose FOSS wherever possible. I don't run Windows and abandoned MS when Win95 came out. I've done just fine without them (thanks for Slackware, Pat).

I dare say that there are millions of (not very clever) people in the world who would have dismissed Linux and Free Software in general as "rubbish" had they not been able to see it do fast, hardware-accelerated 3D graphics like the commercial OSs. You know what people are like...

Then we had support for Linux from ATi, not to be left out, and later Intel, who have very generously provided much data and open source code.

Without nVidia's contribution and pioneering support of Linux, we'd be in a much darker place today and Linux would be not nearly as popular with the average user.

I've been using nVidia graphics cards on my own PCs (all Linux) since 1999 and I've never been disappointed. I'm on my 5th or 6th card now (lost count). And I've never had trouble integrating their driver with Slackware or anything else...

I was going to mod it up until I saw the Hot Grits/GNAA crap, which I guess will come back around to being funny again some day... Most likely after Slashdot implements their new design that will finally stop me from coming here.