
EconolineCrush writes "Sound card giant Creative caught plenty of flak for its recent driver debacle, and has long been criticized for bullying competitors and stifling innovation. But few have been willing to compete with Creative head-on, allowing the company to milk its X-Fi audio processor for more than two and a half years. Now the SoundBlaster has a new challenger in the form of Asus' $90 Xonar DX, which delivers much better sound quality than the X-Fi, PCI Express connectivity, and support for real-time Dolby Digital Live encoding. The Xonar can even emulate the latest EAX positional audio effects, providing the most complete competition to the X-Fi available on the market."

Well, onboard sound is getting better, for what that's worth. And surround can be physically a pain to set up, assuming it's supported in the games you want to play.

But I think the real problem here is that just about every sound you're going to be listening to is already a compressed MP3, range-compressed to hell. It's kind of like suggesting upgrading your monitor or video card if you're only going to be watching YouTube. Hopefully at least a few developers are using high-quality sounds in their games...

I completely agree. I don't understand why anyone would spend exorbitant amounts of money on a "gamer" sound card (that's pretty much exclusively what Creative markets) when you can buy a decent recording card for the same or less.

I have an M-Audio Delta 44 and I love it. Sound quality is excellent, and the 1/4" analogue ins and outs work great for me (I have a pro-audio amp for my computer speakers). If I wanted something more basic for another computer build, I'd buy the Revolution 5.1 card. It supports Sensaura, EAX, DirectSound and A3D, and I'd bet that if you did measurements it would have lower noise than a Creative card.

Creative is nothing more than a brand. They leverage their name to sell cheap crap to consumers at inflated prices. Any educated buyer would NOT buy a Creative product.

I'll go one further. It's not just that they don't provide value for money; Creative actually makes the worst sound cards I have ever, ever used. They aren't as good as the onboard Realtek chips that come with your mobo, and of course can't hold a candle to a proper M-Audio (I used to use a Delta 1010). Both of those options sound better, install with less fuss, and operate with less trouble.

There are better devices available for recording. They typically include a high-quality preamp, which is not something you'll find on a sound card. I use the Konnekt 8 from TC Electronic. It's less than 300 bucks, and it provides multichannel recording, XLR inputs with phantom power, and a monitor out.

If you're really into recording useful things on a regular basis, you're probably using something like an external FireWire (or USB, eek) audio interface... Because even with a good card, there's just too much electronic noise roaming around inside the average computer case, much of it caused by shitty power supplies -- so the noise is conveniently often right in the audible range -- and most internal sound cards are not very well insulated. It's not such a big deal for Skype or VoIP or most anything else the average Joe does with audio in, because those ranges often get compressed out, and given the nature of the use, it's not a big deal in the first place. The external boxes also usually have the added bonus of microphone phantom power and preamps, and make it pretty easy to use a quality mic or other pro-quality recording gear, at relatively little expense.

And surround can be physically a pain to setup, assuming it's supported in the games you want to play.

The cool thing about Dolby Digital Live encoding is the game doesn't have to support Dolby Digital. The sound card and drivers magically remix positional DirectSound events into a Dolby Digital bitstream.

In other words, I plug my computer into my AV receiver with 1 audio cable and surround sound Just Works in all my games.

But I think the real problem here is that just about every sound you're going to be listening to is already compressed mp3, range-compressed to hell.

Even if the sound quality was terrible I'd want to know if there was a level 3 sentry behind me. Surround sound makes games more enjoyable.

At university I had a friend who was an audiophile. One day he came in looking very pissed off. I asked him why, and he ranted that one of his female flatmate's female friends had told him he was "the sort of man who stops having sex so he can turn over the record".

All those complex arguments I tried to use to convince him he was wrong and that music really does sound better on a decent quality CD player than a vastly more expensive record player were so utterly outmatched by those few words.

Because for really good audio, sound cards inside a computer case are not a good idea, so you might as well go with a crappy one for testing, or just use the bundled one.
In general, D/A conversion needs to be performed outside the computer case, in a specialized box. That's why people spend tons of money on a computer and then spend a lot more on a USB or Ethernet digital audio platform.

I'd wager that most folks, these days, never do any serious recording of audio. It's just not something that there are very many practical applications for in a modern world. And even when they do want to bring analog audio into a computer, it's probably only as a part of a video capture or VOIP rig, and they're just not paying much attention to the fidelity. And even when they do have a need to do serious recording and are paying attention, only the most glaring amounts of audible noise and distortion are likely to be noticed. People are generally pretty tolerant of relatively bad-sounding audio.

If the need were more common or they were paying more attention, cheap sound cards would commonly have the same huge number of reasonably good inputs as they currently do outputs, because that's what the market would demand.

Myself, I've been looking for a decent, cheap 4-channel sound input for the PC for years -- I've got a few old quad recordings of various rock music on 1/4" reels which I really want to listen to, but I will only do so with something on hand to archive them (the tapes are so old that it's not unlikely that playing them even once will destroy them).

Lately, the additional need for 4 or 8 (though preferably 12 or 16) inputs has risen as I'd like to begin making some live recordings of a band that I've been working with.

It's not hard to find sound card or external Firewire/USB box which can do these things -- it's just hard to justify the expense.

But it's not the expense which is keeping people away from recording on a PC, but rather just the fact that these sorts of tasks are esoteric enough that most people will never do them. Therefore, the market is, and is likely to remain, very thin.

Like RAID storage, backup devices, SAS drives, DVI-connected LCD monitors configured with 1:1 pixel ratios (instead of BlurryVision and/or FatPersonVision), most folks just don't have any reason to care about this aspect of computing.

USB or Ethernet? Yikes. USB is frequently very unreliable for audio. The only place where Ethernet audio makes sense is if you're wiring up an arena or something. When you have to run 16 channels of audio to dozens of amplifiers and speakers all across an area that's a quarter mile wide, Ethernet is the perfect solution. For most recording purposes, though, the much higher cost of Ethernet-based gear just doesn't make much sense if you only need to run signals to the next room over.

Digital outputs (optical or coax S/PDIF) are the answer, and most motherboards have one or both of these built in these days.

Never output an analogue signal from a PC, if you've got a choice. Internal D/A sucks, so do it externally. Either use decent powered speakers or an inexpensive integrated receiver, and the PC is removed from the sound quality equation completely.

re: "Because its primary functions are gaming and programming, and neither of those would be seriously enhanced with a better sound card."

Gaming is absolutely enhanced with a better (read: real) sound card. Onboard audio steals system RAM for its buffers rather than having its own memory, which can lead to sound dropouts with multiple simultaneous voices, and even cause stuttering and FPS loss. Not that I haven't also seen these effects with Creative's "real" sound card products, especially the Live family. Creative's quality seems to have taken a nosedive since the SB16 days.

Except that not all onboard audio steals RAM, not all onboard audio catches all the surrounding noise (you didn't say that, but everyone else does), and not all onboard audio causes stuttering. Most does come with a slight FPS loss (OH NOES! Crappy non-optimised games like Hellgate: London run at 97 fps on my machine, so they could do 100! Big freakin' woohoo).

Seems like getting a decent motherboard may matter in this case. Investing in better speakers is probably more important than the sound card... unless you have a top-notch 5.1/7.1 system, the sound card will not be the bottleneck.

Gaming is absolutely enhanced with a better (read: real) sound card. Onboard audio steals system RAM for its buffers rather than having its own memory

You are correct most of the time; however, there are a few onboard sound chipsets that provide their own buffers and hardware and interface to the mainboard via a PCI interface just like a real sound card, because they are real sound cards.

The usual implementation of the AC97 specification would be an example of what you are talking about, where older onboard

I agree that a real sound card is needed for gaming. But most people are surprisingly deaf and can't hear the difference, so most game companies don't want to spend the time and money on proper audio. One of the best games to demonstrate the difference between onboard and hardware-accelerated audio is BioShock. Using my onboard Realtek HD with 5.1 speakers, I get a muddy mess of sounds. I couldn't stand it and decided to get an X-Fi, and the difference is amazing. I can hear the difference between sounds

Many people have asked why I have a moderately expensive sound system, and yet my TV is old enough to drink (seriously, it's a 15" 1986 Panasonic with fake wood sides, pre-coax antenna connectors, and a slot underneath to store the remote). The answer I usually give is, "I wear glasses, not hearing aids. I'm going to favor the senses that still work properly."

So yeah, there are people like me out there that would upgrade the sound card before upgrading the CPU (or video card).

Ouch. I'm not an audiophile by any means, but that would drive me to drink. I have a nice set of Aura Aspect 20/40 speakers (with under-the-desk subwoofer). They sound better than most home theaters I've been around, but only cost about $100 or so when I bought them. I like to code to music - for some reason, The Crystal Method's "Vegas" just makes the LOC flow - and they're the difference between hearing a symphony live versus over the phone.

Why? The ONLY sound my computer is set up for is "beep" -- I don't want any more than that, just enough to get my attention. I don't game on it or listen to music -- I surf, and I write software. Silence is golden.

As a matter of fact, I ordered one computer without any sound card at all, and in most games I play I turn the sound off. And I never use the computer to play music. So what's the point in spending money on a sound card?

Turtle Beach = YES. I don't know why they bother, but this tiny company makes great little sound cards. Simple, but clean. Their sound quality puts many "pro" cards to shame.

Bose = GOD NO! I mean, if you like the Bose sound, that's your preference and that's fine, but the term "playback quality" refers to reproducing the original sound as accurately as possible, something Bose speakers don't even try to accomplish.

The thing with sound is there are two main schools of thought: those who seek accurate reproduction, and those who seek "pleasant" reproduction. Studio monitors, high-end headphones and some brands of tower speakers shoot for accurate sound, which many people find cold and dry. Bose speakers typically produce "happy" sound, by using a gazillion drivers and psychoacoustic sound processing (think SRS).

Creative's X-Fi also specializes in this "happy" sound through the use of the so-called Crystallizer. It takes normal, clean audio, and adds the sonic equivalent of glitter dust to appeal to the aural magpies of this world. A few people dislike it (like me), but many people enjoy the effect it has on popular recordings.

So then, what do non-Bose, non-Creative users lack? Happy sound. I personally don't miss any of that stuff, and I have zero issues with my featureless onboard 8-channel sound and my cold-sounding high-end speakers. Even the Asus sound card doesn't tempt me one bit, because I don't want the features it offers. It would be nice if a sound card could be just that: a sound processing accelerator. But in 2008 the CPU is more than capable of handling the cheap bandpass filters and flanging effects Creative calls "environmental audio". The fact that even Creative uses software EAX emulation for its cheaper products is proof of this, and the only reason it doesn't work on other cards is licensing/IP issues.
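The claim that these effects are cheap on a modern CPU is easy to sanity-check: the core building block of a flanger or simple "environmental" echo is a comb filter, just a few adds and multiplies per sample. A toy sketch of the textbook technique (not Creative's actual EAX algorithm):

```python
def comb_filter(samples, delay, gain=0.5):
    """Feedforward comb filter: mix each sample with a delayed copy.
    A flanger is essentially this with the delay slowly modulated."""
    out = []
    for i, x in enumerate(samples):
        delayed = samples[i - delay] if i >= delay else 0.0
        out.append(x + gain * delayed)
    return out

# Feeding in an impulse shows the effect directly: the delayed copy
# appears `delay` samples later at half amplitude.
print(comb_filter([1.0, 0.0, 0.0, 0.0], delay=2))  # [1.0, 0.0, 0.5, 0.0]
```

At 48kHz stereo this is on the order of a few hundred thousand multiply-adds per second per effect, a rounding error for any 2008-era CPU.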

I hear this from audiophiles a lot, but everything I have studied about psychoacoustics totally supports everything I hear from Bose. I think audiophiles get caught up by the fact that Bose doesn't really care about the sound waves coming from their gear, they care about the perception of the sound waves coming from their gear. (Examples of why this difference matters: the inability of humans to distinguish between a sine wave and a saw wave at almost all frequencies and the inability to correctly echoloc

Bose's genius lies in making their speakers sound spectacular and impressive to the untrained ear. Their 'indirect sound' trickery gives you "stereo" in the entire room, at the expense of a muddled sound. I haven't heard their surround systems, but the problem's bound to be even worse there. Similarly, the frequency response of their speakers makes them stand out when you compare speakers, but pay a bit more attention and you'll notice the frequency response is as flat as a mountain range. IOW, they don't care about what sounds good, they care about marketing to the unwashed masses.

Their 'indirect sound' trickery gives you "stereo" in the entire room, at the expense of a muddled sound.

I would never describe what I hear from a Bose system as muddled. One thing I know they do, though, is put the same sound through every indirect speaker, but louder through the direct speaker as a cue for echolocation. If it is not set up properly, or if your perception of sound varies significantly from most of the population's, then this could present a big problem. You shouldn't hear muddled sound though,

Does it work in Linux? X-Fi support on Linux is terrible at best and nonexistent the rest of the time. Can someone give some insight as to whether this card works in Linux or not?

I was just checking it myself, and it seems like ALSA supports the card all right [alsa-project.org]. I've been interested in a high-quality, cheap sound card because of my main gripe with onboard audio: noise levels. I can hear hiss through my nVidia onboard audio adapter (which otherwise sounds damn fine), and even faint pops and crackles when the HDD is doing heavy work.

There is a beta driver [alsa-project.org] for the D2X. Since, according to TFA:

The DX employs what's marked as an Asus AV100 audio processor while the D2X uses an AV200. Don't pay too much attention to the names silk-screened onto the chips, though; they're the very same C-Media Oxygen HD audio processor under the hood. Asus says the chips go through a "quality sorting" process to separate the AV100s from the AV200s.

So, since the chipsets are the same, I would guess that the D2X driver might work for the DX, perhaps with little or no modifications.


Personally, I think most of the audio improvements have been a load of wank.

I haven't been able to tell the difference between my old Live and my brand-new supposed "HD" sound card. Maybe on some seriously expensive speakers and a full THX system I could, but who needs to spend $300 on one of these cards Creative puts out?

"who needs to spend $300 on one of these cards Creative puts out?"
Hopefully nobody. One may, however, need to spend money on a good sound card in order to output to a decent audio system.
For me, the absolute deal-breaker is Creative's insistence on resampling all 44.1kHz content to 48kHz. I don't rely on my sound card to do any of the work -- I just want it to take the data and faithfully stream it via S/PDIF into my external DAC.
That's why for many years now, I've been enjoying the services of the M-Audio Revolution.

Not exactly. I have a Creative USB sound device, picked up several years ago, hooked up to my Myth box. It has a switch on the side to disable the analog outputs. If the analog plugs are enabled, everything gets the 48kHz resample. Kill the analog outputs and it will send a proper optical output to my amp at either 44.1 or 48. I haven't tried 32k or 96k; the amp supports them, but I didn't have anything handy to test with. Turned out not to really matter in my case, since the PVR-350 only captures audio at 4

Barring HRS-type features and EAX, sound cards are generally sound cards. Most any discrete sound card sounds better than onboard sound simply by virtue of getting it away from the electrical cacophony on the motherboard surface.

At a 120dB signal-to-noise ratio, to hear the difference you need hi-fi components starting from $600, loudspeakers starting at $400 apiece, and cables for $300. And even then you (like most others) probably wouldn't be able to tell the difference.

But there are some people (especially musicians) who can tell the difference, appreciate the better quality, and are actually willing to pay for it. (And note that the price is generally high not be

At a 120dB signal-to-noise ratio, to hear the difference you need hi-fi components starting from $600, loudspeakers starting at $400 apiece, and cables for $300. And even then you (like most others) probably wouldn't be able to tell the difference.

There is no reason you should ever spend this much on cables. Ever. In fact, go ahead and do a blind test between Monster Cable and a coat hanger, and I defy you to tell which is which. It's even extra funny when people pay these kinds of prices for digital cabling.

Well, actually with digital cables you do need to get a proper one, if you are doing a sufficient length. Now I say proper, not good, because what matters is impedance. Digital audio is pretty high frequency (as much as 25MHz for 192kHz stuff) and as such the cable acts like a wave guide as it does for video. Well, like with video it is a 75-ohm coax cable that you need. So while you don't need anything pricey, you do need to make sure you don't just use any old cable for digital audio.
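The "as much as 25MHz for 192kHz stuff" figure checks out with back-of-envelope arithmetic: an S/PDIF frame carries two 32-bit subframes per sample period, and biphase-mark coding can toggle the line twice per bit, so the cable sees up to 128 signal cells per sample. A quick sketch:

```python
def spdif_biphase_rate(sample_rate_hz):
    """Approximate maximum S/PDIF line rate in Hz.

    One frame = 2 subframes x 32 time slots = 64 bits per sample;
    biphase-mark coding clocks at twice the bit rate, so the cable
    carries 128 cells per sample period.
    """
    bits_per_frame = 2 * 32
    return sample_rate_hz * bits_per_frame * 2

for rate in (44_100, 96_000, 192_000):
    print(f"{rate} Hz -> {spdif_biphase_rate(rate) / 1e6:.3f} MHz")
# 192 kHz works out to 24.576 MHz, i.e. the "almost 25MHz" above.
```

At those frequencies the impedance-matching point stands: a mismatched cable of any length reflects part of the signal back, which is why 75-ohm coax is specified.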

You're missing the main point, which was central to Shannon's theorem from way back in 1948: an optimal digital encoding process will achieve 100% quality up to, but not exceeding, channel capacity as dictated by the noise model. Corrupted bits are easy to detect on the receiving side of any digital channel with a relatively trivial modicum of error correction.

If the receiving end of the digital channel sucks so bad it doesn't have a way to report that bits are being dropped or corrupted due to a substandard
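The Shannon result referred to above is concrete: the Shannon-Hartley theorem gives the capacity of a noisy channel as C = B * log2(1 + S/N). A sketch with illustrative numbers (not measurements of any real cable):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity in bits/second.
    snr_db is the signal-to-noise ratio in dB (a power ratio)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A noisier cable lowers capacity, but any data rate below capacity
# can still in principle be delivered with arbitrarily few errors --
# which is why "audiophile" digital cables buy you nothing.
print(shannon_capacity(3_000, 30))  # ~29,902 bits/s for a 3kHz, 30dB channel
```

The practical takeaway matches the parent: below capacity, error correction gets you bit-perfect delivery; above it, no cable upgrade helps.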

I once plugged my studio monitors (aka really nice headphones) into my computer while gaming. You can tell the difference between good and bad audio if you have good speakers or monitors. Unfortunately, the audio became so clear that it was obvious the sound was synthesized or using looped samples. It actually detracted from my enjoyment of the game, so I went back to the $20 no-name speakers.

It's kinda like how the switch from CRTs to LCDs made text razor-sharp, but exaggerated the "jaggies" in graphics.

Same thing with a monitor. If you're using an old CRT, OK, sure, the integrated video is probably fine. However, if you have a new professional LCD, maybe it's worth the money to buy a graphics card that properly supports it (for example, one with enough RAM to run at native resolution and a DVI port).

I get your audio argument, but that doesn't really fly with graphics. Integrated graphics don't have any problems driving large LCDs, and some [amd.com] even have HDMI on top of DVI outputs. That particular chipset easily beats a

It's a PCI device that requires a bridge chip to work on PCI Express... but there's no PCI version to be found, unlike its more expensive D2X cousin (which lacks front-panel connectors). Bah. One of the reasons for me to buy a nicer soundcard is to take the load off my aging CPU.

The X-Fi cards are nice, but not worth the price. I'm looking for some Linux support. I'm looking to hook it up to a digital 5.1 set of speakers and a headset on the front-audio of my case. I play 1st-person shooter games, so good DirectX support is required.

The article's author has posted a short follow-up piece [techreport.com] after someone pointed out that some of the RightMark Audio Analyzer results don't make any sense. The X-Fi's frequency response is all over the place in the loopback (and only the loopback) tests, which causes most of the RMAA results to come in far lower than they should, or indeed than they did when the card was initially reviewed a couple of years ago. The Xonar still does well regardless, but the RMAA results are effectively useless right now. I suspect the issue is that they used Vista; RMAA is a very peculiar program and has not been certified for use on Vista in all cases because of UAA screwing with things.

Also, for the sake of being pedantic, the X-Fi they used isn't Creative's best (hence the submission title is wrong); the Xtreme Music was the low-end model and was discontinued last year, to be replaced by the Xtreme Gamer. The Elite Pro is still Creative's highest-end X-Fi.

Works 99.999% awesome with ALSA (then again, I haven't experienced the rare problem the cs46xx driver had in a very long time, so maybe I should say 100%), has hardware mixing (though I am using PulseAudio now; perhaps in the future sound card manufacturers will be so nice as to offer per-mixer-input volumes in hardware -- not that it really matters), and generally Just Works. Of course, maybe if sound starts to recover from the crap Creative has done to it (maintaining OpenAL is the only halfway-decent t

I like good sound and I haven't bought a sound card in 6 years or so (Nforce came out with very good integrated sound). Since then I run a single optical cable from my motherboard to my AV receiver; PERFECT sound. Even the HP at work driving my headphones from analog sounds great.

If you have a good quality digital pre-amp that's all fine - but it's cheaper to buy a good analog sound card and quality powered speakers or an analog pre-amp.

I wanted good quality stereo sound, so I bought the Behringer B2031A speakers for around £200 and the M-Audio Revolution 5.1 for about £40, which together is cheaper than just a digital preamp capable of this kind of quality.

I have a Denon AVR 1802 and Paradigm Monitor 3 speakers, nothing terribly expensive. It's not just quality but versatility that an AV receiver gives you. Not only that, but I have a guaranteed clean path to my receiver: the music stays digital over optical right up to it. I don't ever want to go back to analog sound coming out of a computer. Interference is a thing of the past.

As Mac desktops don't come with PCI slots, it is very difficult to find a better sound card than the onboard crap for a Mac; the only ones available are USB and FireWire, and I haven't found an external one with sufficient sound quality.

Since this is PCI Express, does anyone know if Asus will be releasing Mac drivers?

I'm familiar with Cirrus and Burr-Brown (Texas Instruments) converter chips as being among the best in professional audio devices, in fact the best Protools interfaces (HD192) use Cirrus chips. But having an S/N ratio of 123dB is moot when the analog circuitry is unshielded and housed inside a computer, which is EMI and RFI hell.

The noise floor is going to be at least -66dB, so 57dB of dynamic range is lost to noise. That means the noise level is at least 724 times higher than the lowest discernible sound the card can process. If you're going to spend a penny to improve your computer's sound, it should go towards an external USB or FireWire device.
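The arithmetic here is the standard 20*log10 convention for amplitude ratios: 123dB of converter range minus a -66dB noise floor leaves 57dB buried in noise, which works out to roughly a 708x amplitude ratio (the 724x quoted above corresponds to about 57.2dB). A quick check:

```python
import math

def db_to_amplitude_ratio(db):
    """Convert decibels to a linear amplitude (voltage) ratio."""
    return 10 ** (db / 20)

def amplitude_ratio_to_db(ratio):
    """Convert a linear amplitude ratio to decibels."""
    return 20 * math.log10(ratio)

# 123 dB converter range, -66 dB noise floor -> 57 dB lost to noise.
print(round(db_to_amplitude_ratio(123 - 66)))   # ~708x amplitude ratio
print(round(amplitude_ratio_to_db(724), 1))     # 724x is ~57.2 dB
```

Either way, the point stands: two orders of magnitude of the converter's theoretical range are swamped by in-case noise.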

And don't get me started on "computer speakers". Try this: knock on the sides of your speakers. That resonance is added to every sound emitted from your speakers. Think a better sound card is gonna help?

How much CPU does it use up, say on an old Athlon 64 X2 4600+ socket 939 system with Windows XP Pro SP2 (IE6 SP2; all updates)? The reason I bought an Audigy 2 ZS card was games that use EAX, especially v4.0.

I've followed Creative Labs and the evolution of PC sound cards since the early '80s, before there was an AdLib, when I was trying to get my PC speaker to produce music. My first sound card was a Sound Blaster, like a lot of people at the time. The card worked great, replaced the AdLib (which I never owned) as the de facto standard, and brought PC games into realistic sound reproduction.

Fast forward 5 years: Creative still dominates the market with their Sound Blaster offering, and now there are a few competitors claiming to be 'Sound Blaster compatible' to work with existing games -- still DOS games, mind you. Most of these cards were fine replacements for the Creative offering at the time, an ISA-slot Sound Blaster 16 (which was stereo!); some were garbage, but most worked just like the Creative card.

Along comes Windows 95 and the DirectX API to unify sound programming in games for Windows! Yay, no more need for 'Sound Blaster compatible'; any card with a functioning Windows driver will work for any game. In over a decade of existence, Creative had thus far done nothing to improve their sound card beyond offering 'stereo' and a 16-bit ISA adapter to replace their original 8-bit adapter. At this point the only 16-bit card you've got in your system is the stupid Creative SB Live!, or another competitor's card that might be PCI but is otherwise the same.

Everything is about to change, though; a new company enters the scene: Aureal. Right off the bat the Aureal sound card is obviously superior to every sound card on the market. They only have PCI cards, and they boast something no other card has had thus far: a real-time effects processor! Now you can have reverb and parametric EQs and time delays and any sort of crazy effect you can dream up -- AND IT'S REAL TIME! All the processing is done on the card, so no extra CPU overhead; multichannel in/multichannel out, multichannel S/PDIF out, the friggin' works. And this is going up against the Sound Blaster Live!, which boasts... STEREO, minor multi-out functionality and a 16-bit slot.

This is where the story gets juicy, and I'm sure quite a few people recall it. Creative reverse engineered, or maybe just ripped off, the processor design of the Aureal card, got sued for doing so, bought Aureal, stuck an almost EXACT copy of the same chip in their emuX-series (Audigy) cards, and hasn't done a damn thing since -- and that was almost 10 years ago! All they seem able to do is make continuous copies of the chip Aureal designed almost a decade ago and sit on their asses while another company surpasses them in whatever the next PC sound evolution will be; then I guess they'll buy them out and stop the innovation!

Seriously. I'm tired of sound cards basically being an all-Creative market. While this newspost is basically a slashadvertisement, I'll buy one as soon as I dig up another review or two that echo the results of this one.

I fought Linux sound problems with my integrated optical audio for the longest time. Nobody would help on the Ubuntu forums, but I eventually studied enough about .asoundrc to configure ALSA to work correctly. It was a pain, and in the end everything was working aside from the Rhapsody plug-in for Flash (Flash uses an odd wrapper for audio in certain cases; YouTube worked, Rhapsody didn't). It'll take a little learning and trickery, but if you want to fix this, you should be able to by scanning a few example
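For anyone facing the same fight, a minimal .asoundrc along these lines is a common starting point for routing output to a digital port. This is a sketch, not a tested fix for the Rhapsody problem; the card and device numbers are placeholders that you'd look up with `aplay -l`:

```
# Hypothetical minimal ~/.asoundrc: send the default PCM straight to
# the card's digital (S/PDIF) output.
pcm.!default {
    type hw        # raw hardware access: no dmix, no resampling
    card 0         # placeholder: your sound card's index from `aplay -l`
    device 1       # placeholder: the S/PDIF device on that card
}
ctl.!default {
    type hw
    card 0
}
```

Note that `type hw` bypasses ALSA's software mixer, so only one application can open the device at a time; swapping in a `plug`/`dmix` chain trades bit-perfect output for mixing.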

Yeah, PulseAudio's working pretty well for me on Hardy. Flash even works well with PulseAudio via the libflashsupport package. There are some apps, such as Wine and some SDL apps, that still have trouble, though.