Posted
by
timothy
on Thursday July 10, 2014 @05:17PM
from the won't-fit-in-my-phone dept.

MojoKid (1002251) writes "Back in the day (which is a scientific measurement for anyone who used to walk to school during snowstorms, uphill, both ways), integrated audio solutions had trouble earning respect. Many enthusiasts considered a sound card an essential piece of the PC-building puzzle. It's been 25 years since the first Sound Blaster card was introduced, a pretty remarkable feat considering the diminished reliance on discrete audio in PCs in general. These days, the Sound Blaster ZxR is Creative's flagship audio solution for PC power users. It boasts a signal-to-noise ratio (SNR) of 124dB that Creative claims is 89.1 times better than your motherboard's integrated audio solution. It also features a built-in headphone amplifier, beamforming microphone, a multi-core Sound Core3D audio processor, and various proprietary audio technologies. While gaming, there is no significant performance impact or benefit when going from onboard audio to the Sound Blaster ZxR. However, the Sound Blaster ZxR produced higher-quality in-game sound effects, and it also produces noticeably superior audio in music and movies, provided your speakers can keep up."
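If the "89.1 times better" figure is read as a voltage ratio (an assumption on my part; Creative doesn't say how it counts), the arithmetic it implies can be sketched:

```python
import math

# SNR expressed as a voltage ratio in decibels: dB = 20 * log10(ratio)
zxr_snr_db = 124.0      # Creative's claimed SNR for the ZxR
claimed_ratio = 89.1    # "89.1 times better" from the marketing copy

# dB difference implied by an 89.1x voltage ratio
diff_db = 20 * math.log10(claimed_ratio)          # about 39 dB
implied_onboard_snr_db = zxr_snr_db - diff_db     # about 85 dB

print(f"implied onboard SNR: {implied_onboard_snr_db:.0f} dB")
```

An implied ~85 dB onboard SNR is plausible for older codecs, which suggests the comparison point is not a modern motherboard.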

The /. writeup sounds like audiophile wank to me. I would be surprised if this Sound Blaster could justify its price in a proper double-blind study on real-world data (music, games, movies, etc.) vs. the built-in audio on your mobo.

I stopped buying high-end discrete sound cards a long time ago. I still buy and use discrete cards when I build a system, though, and sometimes when troubleshooting one.

It might be more out of habit, since I started buying and building computers when sound was almost always an add-on; onboard sound probably hadn't even been invented then. One thing that always annoyed me was onboard devices going south with not enough expansion slots to add a card. This used to be common with onboard sound and network devices. It's also much easier to pull a card when troubleshooting hardware issues than to turn one off in the BIOS and hope it actually disabled the chip. I've seen some plug-and-play mishaps turn the devices back on once the OS booted.

I cannot tell a big difference in sound quality or CPU overhead anymore, either. But I guess habits are hard to break.

The problem, as of the last time I messed with it, was that USB power would start causing issues as you chained more devices together. It was important to use powered hubs when connecting anything (or several devices) with more power consumption than a mouse or keyboard.

A USB audio interface also lies outside the electrically noisy interior of a PC chassis.

A strong caution with USB audio: there is a metric buttload of cheap USB adapters out there. While they technically work, they typically lack the analog filtering that removes higher harmonics. If you look at the output on an oscilloscope, instead of a smooth wave you see the actual quantization steps. Better audio hardware has filters to smooth this stuff out.
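A minimal sketch of where those scope steps come from, using a hypothetical 1 kHz tone quantized to 8 bits (the numbers are illustrative, not measurements of any real adapter):

```python
import math

# A DAC without an analog reconstruction (anti-imaging) filter holds
# each sample flat, producing the "staircase" you see on a scope.
# Quantization also adds error, bounded by half a step.
sample_rate = 48_000   # Hz (assumed)
tone_hz = 1_000        # test tone (assumed)
levels = 256           # 8-bit quantization, to exaggerate the steps

# One full cycle of the tone
samples = [math.sin(2 * math.pi * tone_hz * n / sample_rate)
           for n in range(48)]

# Quantize each sample to the nearest of 256 levels, then reconstruct
quantized = [round((s + 1) / 2 * (levels - 1)) for s in samples]
recon = [q / (levels - 1) * 2 - 1 for q in quantized]

# Worst-case quantization error is half a step: 1/255 of full scale
max_err = max(abs(a - b) for a, b in zip(samples, recon))
print(f"max quantization error: {max_err:.5f} (half-step = {1/255:.5f})")
```

At 16 or 24 bits the steps are far smaller, but the staircase shape is still there until an analog filter smooths it.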

Another MAJOR issue is noise induced into the output. This applies not just to USB cards but to all audio solutions. You need some pretty good filtering between the digital and analog power domains -- yet another area where cheap sound hardware can skimp. Hey, let's shave $0.05 off by dropping this capacitor and inductor!

SoundBlaster (and other gaming-oriented cards) typically do both. However, do you really NEED both? The audio-processing stuff is supposed to provide an API that games can use to make things sound more realistic, or to offload audio processing from software to hardware, or both. It can typically decode various Dolby flavors and do some other fancy DSP-ish stuff. Do you really NEED all of that? If so, then maybe a gaming card is for you.

However, what if you want the best sound possible, the lowest noise possible, and don't really game or use the various audio enhancements? You just want a plain-vanilla sound card, but with the highest-quality audio. Where to go? Skip the computer store and head to your local MUSIC store (not the ones that sell CDs, the ones that sell GUITARS). Those cards skip all of the DSP bells and whistles but have the best-quality DACs and filtering you can find. You can find some really good USB solutions that will blow onboard audio out of the water for about $100 or so. Of course, you can go crazy and spend $500 or more if you want. If it is good enough for a music producer to use in a studio (someone who makes his or her living off of the sound), it is probably good enough for YOUR music and movies.

There's no need to spend that much. A lot of motherboards have S/PDIF outputs, and with a good coax/TOSLINK DAC (like the ~$40 FiiO D3), pristine noise-free stereo sound is both easier and cheaper than buying an expensive sound card.

Or even with a cheap shitty coax cable like the one you got for free in your Wheaties 20 years ago to connect your VCR to your TV. It's a digital signal, after all -- communication either works or it doesn't.

Many of the earlier SB cards were known for running at a fixed clock, regardless of what the software was set to. This limited clock rate was behind many of the complaints from those looking for the full 20 Hz-20 kHz range without artifacts. Once that reputation was cast, the line was considered consumer-grade and nothing better. The same applied to bit depth: the driver would accept many settings beyond the 16-bit DAC. Other cards had higher clocks and bit depths, and testing showed each card's true limits.

Link below shows some real testing on this card, beyond just golden ears. Look at the frequency output of noise and note what is NOT reproduced. Then scroll down and look at the extended frequency response of the cards in the test. The SB hit a wall well before the competition.

Creative Labs has that reputation, and they were dicks in general, but funnily enough I got some real "audiophile" sound out of Sound Blaster Live! and Audigy 1 cards.

Creative's drivers were shit, and I was even once stranded -- I needed to download a CD image from an unofficial source to get sound under Windows, whereas finding and using the DOS driver took me minutes (!). But a Russian guy made a great driver that always worked and is perfect if you only care about getting output (so no EAX gaming shit) and even the la

People who really care about audio quality don't buy Creative hardware anyway. That's for gamers. If you want sound quality, there are many cards with cheap but excellent chipsets. VIA Envy24 chips and Wolfson DACs are the preferred combination, and cards with them cost under a tenner.

Much better to spend the money on better speakers or a headphone amp. If you really want high end sound get an external DAC.

I am not sure even gamers need sound cards anymore... at least not those who don't use headphones. I have a 7.1 movie surround system hooked up to a PC, and Windows itself magically mixes sound into the HDMI stream coming from my Nvidia GPU. In games, I get as many discrete sound channels as the game software supports, plus I can push most any kind of bitstream (including DTS-HD and Dolby TrueHD) from media files.

With a complete digital path, what does a sound card have to offer me? I guess AMD is ma

Why do you lump Klipsch in with Monster Cables? The founder of Klipsch is renowned for debunking many crap claims made by other speaker makers, similar to the nonsense claims Monster makes. Perhaps you mean "No highs. No lows. It's Bose"? K-horns, for example, have always been solid speakers.

And those noise problems don't matter if you're using digital audio connections, say over HDMI or TOSLINK or S/PDIF. In fact, if you're doing digital audio over HDMI, you're not even using your onboard sound, you're using your videocard's sound output.

Even then, the signal-to-noise ratios of onboard has been good enough for years now. Sure, you might notice a slight difference with a good pair of headphones, but in practice, not so much.

No, onboard analog outputs suck. By using a digital connection such as USB (or S/PDIF, Firewire, Thunderbolt, HDMI, DisplayPort etc.), you're passing a digital bitstream and moving the digital to analog conversion to an external device that usually has a much better signal/noise ratio.

Even the cheapest onboard sound chipsets can pass a perfect digital bitstream along via S/PDIF, even if the analog components are shit.

Onboard sound is fine, but a lot of motherboards don't support creating Dolby Digital Live output. In fact, I am currently in the market for a low-priced card that does just this. For once, I could simply move my card to the next computer, no matter which motherboard it has.

Is there a sensibly priced ($30, perhaps?) sound card that only does optical and coaxial output, with Dolby Digital Live support? We have a very good surround receiver; I see no reason not to use its DACs and power amplifiers with our nice speakers to get the sound out.

No, I don't want to use HDMI; the video feed causes problems, and my monitors are too high-res for HDMI anyway (not HDMI 2.0, and they don't support it either).

For true studio work, I've not checked recently, but I think M Audio has a PCI interface card for a few C-Notes. I think things have shifted to AI (audio interface) cards anyway, as opposed to discrete sound cards like SoundBlaster successors.

However, I wouldn't say SBs are pointless... for retro gaming, some games have better-sounding music coming from the "primitive" FM synthesis of that era.

I was once horrifically stung (I realize it was a very long time ago) by an Abit "AudioMax" sound card that came with my motherboard. Quite horrific interference was among its many problems. In a fit of pique I bought an Asus Xonar, which solved all my problems immediately.

Since then, I've been through a few motherboards, but I've plugged that Xonar into each one, and it's definitely 'better'.

Now, if I didn't have that Xonar, I'd be as happy as the proverbial Larry with the onboard sound I can get today. Onboard sound is quite definitely 'good enough' now, but it seems a shame for people not to realize (if they care) that they can make it a great deal better for a pretty low price.

And, I've carried this card with me for quite a while as my GPUs have come and gone. The price I've paid for my slightly better sound is now practically nothing per year.

I think people still care about sound, but it's just another check-box on your slightly more pimped mobo -- in much the same way as I got a deluxe board with an Intel network adapter in addition to the Realtek.

It doesn't really matter that much, and I don't expect most people to care, but to say that onboard is good enough for everyone simply isn't true.

My current on-board is wired to my desk speakers for the day to day stuff I want to listen to, and the Xonar is connected to my silly-number-of-speakers gaming headset.

Some onboard sound is noisy through the analog outputs, although I guess it's only really noticeable in headphones. For normal PC speaker e-mail notifications and whatnot, it doesn't matter.

But luckily, most motherboards have S/PDIF outputs via coax and/or TOSLINK that allow you to connect to an external DAC (like the ~$40 FiiO D3) or a receiver with digital inputs. Some all-in-one PCs and laptops (all MacBooks, IIRC) have a combined headphone output and mini-TOSLINK jack, but even if they don't, you can do

My thoughts exactly. A discussion of the merits of add-on vs built-in sound hardware is worthwhile on its own terms; but basing the discussion on a specific add-on card, with the flimsy excuse of one company's 25th anniversary, strikes me as blatant shilling.

Yes, a discrete card might have *better* specs (especially analog components, which was a problem on older integrated soundcards), but I haven't felt the need to use a discrete card since my nForce 2 board (Soundstorm).

This is what comes to my mind whenever I hear of Creative. Nice enough hardware, but shockingly bad software, 80% of which no-one ever had any need for. And it would invariably all be set up to load at boot-time, sucking up resources and RAM.

I love the fact that discrete sound cards exist.
It makes it a lot easier to not order one, so that my PC doesn't assail my ears every time some obnoxious video starts auto-playing after I open up a window.

For most of us, no. Onboard sound is great and getting better all the time. If you're an audiophile or using your system to do professional mixing or music then it is worth it.

Even then, you're not going to be using a PCI Soundblaster card, but rather a purpose-built audio interface device. And you sure as hell won't be buying it from Creative. At least, not if you care about your sound.

If you're an audiophile, you're probably using USB audio or S/PDIF, which don't need a discrete sound card, paired with an external DAC worth many times the price of a Creative soundcard and without the extraneous bells and whistles. If you're a gamer, you're on a headset, often again USB. If you're an average user, your speakers are too crappy to notice the difference.

As far as I can tell, the only use case that truly benefits from a discrete card is 5.1+ surround systems which support the latest Dolby/D

People who know and value quality audio are willing to buy discrete audio cards even though it costs them more money.

However, they don't realize that the improvement they see is because they are also willing to pay more money for quality cables. It's the solid-gold Monster Cables they buy, because the salesperson at Fry's recommends them, that are really the source of the improved audio quality.

The cables do not make a difference. Considering the level of thermal noise and the difference between, say, 30 AWG wire and 16 AWG "Monster cable" (we're talking about low-level shielded cable, right?), the Monster-cable "difference" is below the thermal noise floor.

If you are "hearing" a difference with better cables, you are most likely hearing the money and not the electrons. Not to say that there's no such thing as sub-par cables, but Monster cable vs. OEM PC cable, at consumer line level? Please.
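A back-of-envelope Johnson-Nyquist calculation supports the thermal-noise point; the ohms-per-meter values below are standard copper figures, and the 1 m cable length is an assumption for illustration:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0             # room temperature, K
bandwidth = 20_000.0  # audio bandwidth, Hz

# Approximate series resistance of 1 m of copper wire
r_30awg = 0.34    # ohms per meter, 30 AWG
r_16awg = 0.013   # ohms per meter, 16 AWG

def johnson_noise_vrms(r_ohms):
    # Johnson-Nyquist thermal noise: V_rms = sqrt(4 k T R B)
    return math.sqrt(4 * k_B * T * r_ohms * bandwidth)

for label, r in [("30 AWG", r_30awg), ("16 AWG", r_16awg)]:
    print(f"{label}: {johnson_noise_vrms(r) * 1e9:.1f} nV rms")
```

Both come out in single-digit to low-double-digit nanovolts, well over 100 dB below a ~1 V line-level signal, so the resistance difference between cable gauges is inaudible by any measure.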

I know this is a humorous post, but for really high-bandwidth applications, cables do actually matter. For example, driving WUXGA (1920x1200) at 60Hz, 24-bit/px, is roughly 3.3 Gb/s, or a little north of a gigabit per second for each color (RGB). Since each color runs over a single wire (I think), this is comparable to the requirements of gigabit Ethernet -- except (I think) gigabit Ethernet over twisted pair uses all 4 pairs of wires, as opposed to just a single wire for VGA. And, given that VGA is analog, nois
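The bandwidth arithmetic above checks out; a quick sketch:

```python
# Raw pixel-data rate for WUXGA at 60 Hz, 24 bits per pixel
width, height, refresh_hz, bits_per_px = 1920, 1200, 60, 24

total_bps = width * height * refresh_hz * bits_per_px
per_color_bps = total_bps / 3  # split across the three RGB channels

print(f"total:     {total_bps / 1e9:.2f} Gb/s")     # ~3.32 Gb/s
print(f"per color: {per_color_bps / 1e9:.2f} Gb/s") # ~1.11 Gb/s
```

That is the raw pixel rate only; real links add blanking intervals and line coding on top of it.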

Any money spent on a sound card is better off spent on speakers and a good DAC, which often come together.

High-end sound systems and speaker systems these days have digital inputs, and thus an onboard DAC. If you're using a digital output on your motherboard to connect to a digital input on the speaker, the onboard sound card has ZERO effect on the quality of the audio. The bits travel directly, unmolested, from the application generating them to the amplifiers in the speakers.

Now, if you have audiophile-type equipment that uses analog inputs, then YES, the analog sound you feed into those inputs needs to come from a high quality DAC. High end sound cards tend to have good DACs, but you can get the same effect by using an outboard DAC, which has a digital input and analog outputs, and is also AWAY from your PC, so your analog audio is less likely to be affected by interference from the motherboard or power supply.

You can get DACs with USB inputs, but USB adds latency so is best avoided for gaming. For music, go to town with a USB DAC; it won't matter there.

The gist of it is, the most important component is the DAC. The DAC completely determines the quality. Everything else is just hype. :)

As somebody who has been using an external DAC since the late 1990s, I'm getting a kick out of this response.

I'm actually surprised that inexpensive modern motherboards still include a DAC. You'd think it would all be coaxial SPDIF and HDMI output at this point. The freebie headsets I get when enrolling in online classes are all USB these days. Less and less seems to rely on analog outputs.

After many problems with sound cards, sound-card drivers, and video drivers, I removed the sound hardware from my PC. I use the HDMI output of my video card, connected to an audio/video amplifier, and that's all: 5.1 when needed, in games or VLC.

Agreed. It's just too bad that, AFAIK, there isn't great CEC support on desktop/laptop computers -- though this could be an outdated observation. Of course, the $35 Raspberry Pi supports HDMI CEC very well.

...but discrete soundcards, especially external ones, are still alive and well if you record. The noise floor of internal sound cards hasn't gotten that much better (a PC is very noisy RF environment), and if you need mic preamps, quarter inch jacks, optical in, etc, they generally don't fit on a PCI card or laptop.

But for general gaming or home theater use? Nope. Send the audio out over the HDMI out, or SPDIF for DVI/VGA rigs, and let the amp sort it out.

This right here is a key point. Many people now don't rely on their PC to actually do any audio, just to send the data somewhere else. Many hi-fi rigs are hooked up via digital inputs, and many TVs and computer displays support HDMI audio and do the conversion in the device. In some cases, like mine, people even opt for external streaming devices like a Roku to get music, though that doesn't work for generic sound.

It is easy to make good DACs these days. Basically any DAC, barring a messed-up implementation, is likely to be sonically transparent compared to any other in a normal system. When you look at the other limiting factors (amp, noise in the room, speaker response, room reflections, etc.), you find that the DAC's noise and distortion are just way below audibility. Yeah, maybe if you have a really nice setup with a quiet treated room and good amps, and have it set for reference (105dB peak) levels, you start to need something be

I use the motherboard audio to plug my headphones into. However, the volume for headphones is never high enough even with the volume control maxed out in Windows. Would a separate audio card fix this problem?

Maybe.

Higher quality headphones, specifically ones that have their own amp, would probably work better, though.

Onboard D/A for WAV, MP3, movies, etc. is generally good enough if the noise level is low enough. The biggest difference is in the onboard synth. Games that play music via MIDI rely on the sound card to produce the sounds, and there are two approaches: hardware and software.

Hardware cards had an onboard synth. It could be as simple as an 8-bit video game or as complex as full wavetable-sampled sounds. An onboard hardware synth will sound the same on Linux or Windows. If the wavetable synth is XG-compatible or similar, the soun

"back in the day" the main selling point of a "good" soundcard, was compatibility. Under Dr, each and every game had to reinvent the wheel and communicate directly with the soundcard. Unless you had one of major 'good' cards (Soundblaster, Gravis ultrasound, and one or two others) old games wouldn't have sound at all.
When Windows became the norm, the hardware communication was abstracted hough the windows driver - as long as Windows support the card, a game could use it. Combined with dirt-cheap integrated cards in most motherboards, there's very little need for discrete audio for non-professional use anymore.
We've reached "good enough" 15+ years ago.

I had the optical output on my motherboard running to my home theater receiver in the living room (where the computer was, too). After 3 years of the PC always being on and the optical LED always being lit, the LED's brightness had diminished (yes, this happens) to the point where it could not signal reliably over the cheap 30-foot optical cable I was using (I did a lot of troubleshooting). To remedy the problem, I bought the cheapest sound card I could find with an optical output. That solved it.

My Sound Blaster Audigy 2 ZS Platinum Pro (7.1 surround, 24-bit/192 kHz, with an external breakout box -- 1/4" mic, optical, etc.) has now been in 3 systems and is still going strong. I'm running Windows 8.1 using the DanielK drivers. It's PCI, so as long as I can buy a modern motherboard with a single PCI slot, I'm golden. In my opinion, it is one of the last great Creative Labs discrete sound cards.

I tried switching to the on-board sound in my latest build but I prefer the sound from the Audigy. My current m

Let's use Realtek as an example, because they're a very common one. They have a variety of chips, ranging from the ALC231 to the ALC1150.

The ALC231 is rubbish. Four output channels (two stereo outputs), four input channels, and a 97dB SNR on output. But even that is probably enough for most users.

A good "middle-end" chip is the ALC861. That brings you up to 7.1 audio out, and a pile of sound-processing features (EAX, A3D, all that - including Creative's own standards). You still only have a 90dB SNR, but on a clean line that's tolerable. And it's cheap enough to be seen on sub-$150 motherboards.

Their top-end ALC1150 is basically the same, adding a few more output channels for some reason, a second ADC, and a 115dB SNR. That puts you above the low-end SoundBlasters, and within spitting distance of the high-end ones. On an integrated chipset. For anyone not doing professional audio work, that's probably enough. And you can find it on motherboards that cost less than this discrete card alone - sometimes even with advanced features like swappable op-amps.
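One way to put those SNR figures in perspective is the ideal-quantizer rule of thumb, SNR ≈ 6.02·N + 1.76 dB; a quick sketch taking the quoted dB numbers at face value (the chip labels just restate the figures above):

```python
def effective_bits(snr_db):
    # Invert the ideal-quantizer relation SNR = 6.02*N + 1.76 dB
    # to get an approximate effective number of bits (ENOB)
    return (snr_db - 1.76) / 6.02

for name, snr_db in [("ALC231  (97 dB)", 97),
                     ("ALC861  (90 dB)", 90),
                     ("ALC1150 (115 dB)", 115),
                     ("SB ZxR  (124 dB)", 124)]:
    print(f"{name}: ~{effective_bits(snr_db):.1f} effective bits")
```

Roughly 15-19 effective bits for the integrated chips vs. ~20 for the ZxR: a real difference on paper, but one that sits below the noise floor of most rooms and speakers.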

It gets worse, because the main advantage of a discrete card is the SNR. Problem is, S/PDIF over TOSLINK is becoming a more common feature. And that means your computer's DAC doesn't matter - it's done on the sound system itself. Line noise isn't an issue, because it's fiber-optic. Every single Realtek chip I looked at supported this - probably not every implementation does, but it's something that doesn't cost the manufacturer any more than the cost of the connectors. That's another blow against them.

This isn't like video cards, where integrated can handle light users but any remotely intensive task requires at least a low-end discrete card. Probably not even one in a thousand users will need a discrete sound card - the ones who need more than the low-end integrated chips, like gamers, will be buying mobos that already have a higher-end audio chip.

a) Most people's computers are making so much noise (fans, etc.) that the only way you'd have a chance to hear the difference is with $1000 totally closed-cup headphones -- do a lot of people have those on their computer?
b) Otherwise, even if their PC is silent, their speakers are usually craptastic 3" Logitechs, *maybe* with a cheapo sub buried in the shag carpet (i.e., a somewhat sub-optimal listening environment).
c) Finally, last time I checked, *most* people are listening to relatively crappy lossy MP3s ripped from YouTube videos. It really, truly doesn't matter how lovely a board you're sending crap sound data through: GIGO.

So I guess these boards are still relevant to the micro-niche of audiophile enthusiasts who have a nearly silent PC, floor-standing speakers (or 4-digit-dollar headphones) connected to their system, and who listen to audiophile-caliber source material... meaning nearly nobody.

As someone who's been on the audiophile ride from the early days of strange uses of the PC speaker and the first FM synthesis boards, I can honestly say a few things happened that made discrete audio hardware obsolete:

1. A basic DSP became widely available for audio processing.
2. Storage became vast enough, combined with audio compression, that it made more sense to just pre-record all your audio effects and music and play them back through a basic DSP. I've seen this shift in games over the years, from old-school methods of creating sound effects and music with code to just playing audio files included with the game.
3. The general-purpose CPU became powerful enough to do any complex signal processing and simply use the basic DSP to output the results.

Basically, in my opinion, specialized hardware is useless in the face of vast storage and general-purpose CPU processing. So a basic DSP is all ya gonna need, and that's what basically every PC comes with standard now.
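The "pre-record and play back" approach above boils down to summing sample buffers on the CPU; a minimal, illustrative 16-bit software mixer (all names and values hypothetical):

```python
# Minimal software mixer: sum pre-recorded 16-bit PCM buffers on the
# CPU and clip -- the job that once needed dedicated hardware channels.
def mix_16bit(*tracks):
    length = max(len(t) for t in tracks)
    out = []
    for i in range(length):
        # Sum whichever tracks are still playing at this sample index
        s = sum(t[i] for t in tracks if i < len(t))
        # Clamp to the signed 16-bit range to avoid wraparound
        out.append(max(-32768, min(32767, s)))
    return out

music = [1000, -2000, 30000, 5000]   # hypothetical sample data
effect = [500, 500, 5000]
print(mix_16bit(music, effect))      # [1500, -1500, 32767, 5000]
```

Real mixers resample and apply per-track gain first, but the core operation is exactly this sum-and-clamp loop, which any modern CPU handles trivially.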

Integrated audio isn't good enough, isn't great, and isn't for me. I have a pro-level sound studio, and there's no way you're going to tell me that the noisy environment of the system motherboard is going to give me results I can be proud of. Not even for gaming, thanks.

Discrete card? OK, maybe, but generally you need to jump up to RME or some such before you can really call it good. I have an RME RayDAT. This means that all my A/D and D/A happens somewhere else, not in the computer. It all goes digitally over ADAT to my mixer (a Yamaha DM2000), where the conversion happens. Or it goes digitally over Ethernet (Audinate Dante) to an X32, again where the conversion happens.

There are a ton of good external boxes to handle sound, some quite reasonably priced. Stay away from the onboard audio and the cheap USB sound dongles. If you have the speakers to handle it, why put up with bad sound?

Quality wise, I think there are minor gains. The biggest gains come from being able to drive nice/high quality headphones at the correct power levels so they sound as they should. Some motherboards can't supply enough power and the headphones sound... gross... because of it.

Also, you can gain a few FPS in some games by offloading the audio magic onto a card rather than do it on the CPU.

If you look down to the industrial-espionage part, I heard the story a little differently. Instead of a worker stealing the recipe and copying it wrong, I heard there was a hacking incident, and sabotaged files were purposely placed in the areas the hackers were looking at. The faulty electrolyte recipe was supposedly one of them, and they used it to pinpoint which manufacturer was trying to steal information. But that could have just been rumor.

The simple answer is that the electrolyte that failed was simply cheaper to produce. Most of the product failed out of warranty, so it was never an issue for the capacitor producer. The good stuff, tantalum, is actually a conflict mineral (meaning a mine's production is used by non-state entities to fund nearly endless war, often over control of the mine itself) and is super expensive compared to the dirt-cheap (fails in 6 months to 3 years) stuff they used. Don't attribute this

Meanwhile, other game developers have stated that discrete sound cards just don't matter in terms of performance. A lot of game developers need to do special processing on the audio in the CPU before handing it off to the sound system to be played, because the Windows API doesn't allow them to do that special processing on the card (and nobody wants to go back to the days of supporting a dozen different cards).

The "advanced functionality" of the add-in cards is mostly mythical these days, har

I think "performance" might be referring to framerate (i.e., a measure of how CPU-intensive it is to drive the onboard vs. dedicated card), whereas audio quality is considered separately. Not the best writing, I'll agree...

A quick glance over at Newegg would throw into question your statement that "most Intel/AMD sound chips don't support high range, 5.1 or 7.1 surround". Support for 5.1 and 7.1 surround is de rigueur even on low-end motherboards. As for "high range", if you don't define it, I guess we don't know whether they support it.