This article makes AnandTech look like a bunch of Apple-swilling morons.

Hello AnandTech, did you know that 95% of the world's computers still use Windows? Probably not, which is why you put an article like this, using a Mac, on a website 90% dedicated to PC hardware. And what is there to know about Macs anyway? Rumors, rumors, rumors. Being a Mac "enthusiast" is all about how much trash you can talk and how many rumors you can start.

NOW.

Flash is a fabulous technology that makes ten million different things possible on the web that would NOT happen without it. And interestingly, this article advocates turning on an ad blocker for its own website!!! If I were a sponsor of this website, I would demand my money back.

LOL, now you newbie losers want to complain because your computers are slow? Go talk to your grandma about how long it takes her to have a bowel movement; THAT is slow. Reply

PS: Keep in mind you must use a supported video card to get acceleration, which means only GPUs introduced in roughly the past two years. Virtually nothing from Intel is supported except the new Core i CPUs with the integrated GPU; if you have a netbook you're screwed unless you have an Ion chipset. Reply

You should get some information before starting to write bull***t. "Flash video playback pausing frequently isn't caused by the Flash player. It's caused by the server being overloaded with too many requests." Do you think they are so stupid that they can't tell the difference between buffer loading and frame drops?!

"GPU acceleration is for playing back Flash _CONTENT_, not Flash video (FLV)." The main point of GPU-accelerated Flash is H.264 decoding, and all the HD Flash videos are now H.264 encoded (on YouTube, non-HD too).

"The reason being some apps created using Flash place an unusual amount of load on the CPU." And what amount of load do you think an HD video places on the CPU? I'll help you a bit: a lot.

"If you think GPU accelerated Flash has anything to do with video playback, I think you might be seriously confused." You are seriously confused. Clear the load of irrelevant information from your head and get some that is relevant. I can only advise this whenever you want to comment on anything. Reply

Exactly. You have a high end system that can do all the video decoding in software on the CPU without problems. Not everyone has that. (And OS X is a different beast, apparently, at least as far as Flash is concerned.) Reply

I see; my bad, I guess I didn't really consider my CPU to be high end. The system he mentions at the beginning sounded pretty powerful, though I have never run anything but Windows, so I may be wrong... Reply

I should clarify: it may not be high-end by today's standards (what with Core i7), but a moderate ~2.0GHz Core 2 Duo can handle 1080p video decoding in software (albeit at high CPU utilization). It's really more of a question of laptops and even then more netbooks and nettops. And Flash optimizations are of course also important - I've seen Flash choke other laptops with Intel IGPs on older Flash revisions, but 10.0 does much better. Reply

You guys misunderstand what I was saying. I can play 1080p video just fine - Hulu, WMV, MP4, MOV, whatever. I am unable to play 1080p smoothly only when I have the Folding@home GPU client running; that's the only time I have trouble with it (and with standard-def video too). It was working fine on Vista, though, before I installed Win7.

Most of the time when I am playing HD video files CPU usage is about 28-48% on the most loaded core.

BTW, since I updated the Nvidia drivers to 195.55, Firefox is no longer crashing on YouTube videos with Flash 10.1 installed. Seems to me that Nvidia was not ready for Win7 with their drivers; they have a lot going on right now. Reply

I stopped messing with Folding when I started doing the calculations for how much it was costing me in electricity (and a few pieces of failed hardware). Plus, the GPU client in particular always seemed to slow down system responsiveness. If you want to multitask GPU intensive applications, I think we're still deep in the driver update stages (whether ATI or NVIDIA). Give it another year... LOL. Reply

I have been using Flash 10.1 for the last few days and it seems to crash Firefox in an Nvidia DLL, always while using YouTube. I downgraded to the stable release and all is well again. I am using Win7 driver 190.38, because newer drivers cause Flash to freeze video for half a second for every 10 seconds of video.

My testing on the ION LE was with Win7 and I didn't have any problems. Can you list details of exactly what hardware you're running on? Also, I believe the 195.55 drivers from NVIDIA are part of the requirements for this to work optimally (though if it's just DXVA that shouldn't be true). Reply

I have not yet tried the 195.55 drivers, those are still beta but I will give it a try. I was also having problems with the new Nvidia drivers not load balancing gpu folding@home while playing videos. The drivers in Vista would allow me to run gpu folding@home and playback a 1080 video without any frames skipping. None of the Win7 drivers allow me to do this so far. Reply

I would assume you're probably overclocking as well? Most people with something like an E6300 do that. Anyway, you might need to try several combinations, and with this beta software (and beta drivers) I wouldn't count on load balancing of multiple GPU applications. Reply

The CPU is overclocked from 1.86 to 2.8GHz, yes; the GPU is stock at 650MHz. All works OK in Vista, though. I did just install the 195.55 drivers, and the problem with Folding and 1080p video is not as severe, but there is still too much frame dropping to make it watchable. So far YouTube has not caused Firefox to crash, though that usually takes some time; it doesn't happen right away. These drivers need some more work, and Flash needs to reach a final version so that Nvidia can fix Adobe's screwups lol Reply

Anand I've got to be honest, I'm not liking the new trend of reposting an old article with a small update. It is difficult to find since you have to go through the article to find the updated information, and the comments section becomes jumbled up with old posts and new posts.

Please go back to the old way of posting a small updated blog post with a link to the original article for those that didn't read it originally, or would like to read it again.

Actually, the update was by me. I also tried to make it very clear, seeing that the page is labeled with "AMD and Intel Update". I could have done it as a blog, true, and perhaps next time I will. Reply

At work, our systems with discrete graphics use Radeon 2400 class GPUs. A few workstations use NV40 or NV41-based Quadro cards.

Is there any chance Adobe will go a little further back in their support? We only update every 3-5 years, and it seems like there are a lot more GPUs that could be capable of this (e.g., Geforce 6/7 series, Radeon X16 and up, etc.). Reply

Just because us HD3870 owners only have UVD (1) doesn't mean we should be left out of the GPU acceleration. I hope Adobe adds support for those cards, as they have H.264 acceleration as well and should be more than powerful enough. Reply

The Linux version of the latest Flash player plugin (10.1) still does not use the GPU at all, even though Adobe said it works. From reading their blogs about implementing OpenGL for acceleration, it still seems that Adobe does not integrate extras correctly, or they fail to understand how things should work. OpenGL is what Adobe tried and failed to handle correctly.

I even tried using other web browsers and it still does not do what you said it should. I also included the following lines in /etc/adobe/mms.cfg:

WindowlessDisable = 1
OverrideGPUValidation = 1

Adobe needs to get their act together and understand that they need to make a more efficient plug-in that does better checks. Using OpenGL and VA-API does not by itself provide efficiency; it just puts on a band-aid. How come Xara can make a (much) more efficient vector graphics program than Adobe?

Flash player plug-in version 9 was better, while 10 has gotten horrible. If I watched any videos on Hulu with version 10.0, frames skipped all the time, even without any other programs or windows open. If I use 10.1 with Hulu, it is the same. I know it is a beta version, but it is still poor. Sure, Linux is the only OS that has a 64-bit version of the Adobe Flash player plug-in, but from what I have heard and read, it is still poor and shows no noticeable increase in performance.

My computer contains a T7300 and a GeForce 8400M GS. It decodes H.264 fine with the help of VDPAU (Nvidia's take on VA-API), which gives a huge CPU usage reduction: around 10% instead of the 100% it takes for the processor to decode alone. Flash player plug-in 10.0 or 10.1 sits at a crazy CPU usage of around 120%. Does anybody think that Adobe is doing things right? Because I do not. Reply

Okay, I must admit that as a longtime Mac user, I was skeptical of the claimed performance improvements on the Mac; Macromedia/Adobe never optimized Flash on the Mac in a satisfying way, and if you combine that with the fact that, up until recently, I was using the original MacBook, so the fan would kick in at the first YouTube video… I was not going to suddenly start trusting Adobe. So I measured.

Contrary to Anand, I'm not going to concentrate on video, as to me Flash is a superfluous middleman in the web video equation, and I can't wait to get rid of it (there's also the fact I don't have access to Hulu). However, I have a healthy respect of Flash as a way to create and deliver (interactive) content. So here are my tests.

CPU is the processor usage percentage of the WebKitPluginHost process (Flash under 64-bit Safari) as reported by top (you'll remember that on Mac/Unix, 100% means one core fully used; my machine maxes out at 400%, so figures above 100% are normal), either read interactively or (for the fullscreen and Magenta Kong cases) logged to a file. Tests performed on an early 2009 Mac Pro (aka Nehalem), single processor (4 cores), with Mac OS X 10.6.2, Safari, Shockwave Flash 10.0 r32 ("10.0"), and Shockwave Flash 10.1 d51 ("10.1"). Scientific accuracy not guaranteed. Less CPU used is obviously better.
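For anyone who wants to reproduce the sampling, here is a rough sketch of how the logged output can be reduced to mean and peak figures. The column layout (`pid, command, cpu`) is my assumption; adjust for your version of top.

```python
# Sketch for reducing captured `top` output to CPU stats for the Flash
# plugin process. Assumes samples were logged with something like
# `top -l 0 -stats pid,command,cpu`, i.e. columns PID, COMMAND, %CPU.
# Process names containing spaces would need smarter parsing.

def plugin_cpu_samples(top_output, process="WebKitPluginHost"):
    """Collect %CPU readings for `process` from captured top output."""
    samples = []
    for line in top_output.splitlines():
        fields = line.split()
        if len(fields) >= 3 and fields[1] == process:
            samples.append(float(fields[2]))
    return samples

def summarize(samples):
    """Mean and peak; values above 100 mean more than one core in use."""
    return {"mean": sum(samples) / len(samples), "peak": max(samples)}
```

Feed it one log file per test case and you get comparable numbers without babysitting the terminal.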

A: with 10.0, 200% CPU; with 10.1, 230-250% CPU (yep, a regression)
B: with 10.0, 15-30%; same with 10.1
C: with 10.0, 50-60% with one peak to 84%; with 10.1, 30-40% with one peak to 47%
D: with 10.0, 36% with one peak to 62%; with 10.1, 25-33% with one peak to 43%
E: with 10.0 max 60%; with 10.1 max 40%
F: with 10.0 mean 10%, max usually 15%, peaked to 17%, with 10.1 barely touched 10% at worst
G: with 10.0 around 30% but a busy time peaked 66%; with 10.1 20-25% but the busy time peaked 57%
H: with 10.0 max 100%; with 10.1 max 82%
I: with 10.0 max 50%; with 10.1 max 32%
J: with 10.0 max 163%; with 10.1 max 92%
K: Due to different playthroughs, could not compare CPU usage; however, the game felt more fluid with 10.1; I'd need to do a double blind to be sure.
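To put the peak readings above in relative terms, here is a throwaway reduction script; the pairs are copied from the list, and cases whose ranges aren't directly comparable (B, F, G, K) are omitted.

```python
# Quick reduction of the peak figures listed above into percent deltas
# (negative = 10.1 used less CPU than 10.0).

def percent_change(old, new):
    return round((new - old) / old * 100, 1)

peaks = {  # case: (10.0 peak %CPU, 10.1 peak %CPU)
    "A": (200, 250), "C": (84, 47), "D": (62, 43), "E": (60, 40),
    "H": (100, 82), "I": (50, 32), "J": (163, 92),
}

deltas = {case: percent_change(old, new) for case, (old, new) in peaks.items()}
# deltas["A"] is +25.0 (the one regression); every other case improved,
# e.g. deltas["J"] is -43.6
```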

So yeah. Overall, it's a definite improvement. I think it could be made better still, but an effort has definitely been made. Not bad, Adobe. Not bad.

Note: This "benchmark" needs more Flash games, but I'm no specialist. Suggestions? Reply

Hulu improved dramatically! Sadly, I do not have an application monitoring FPS, but the same SD video clip that played with 90 to 95 percent CPU on my Dell GX280 with XP Media Center Edition and Flash 10.0 plays with CPU use of between 35 and 40 percent with Flash 10.1.

I was running this out to my Samsung LCD TV over HDMI. I also tried on a different GX280 driving a 1440x900 monitor on an old ATI 1950 card; it made no difference.

One odd thing I noted - video was much smoother with 10.1, but it was like someone turned up the contrast a bit too high, and the SD video was very grainy. Also, the color washed out a bit on the particular clip I was watching. But it was now watchable. I will play with the Catalyst driver settings tonight to see if that makes any difference, and also see if I can test HD. I may have missed it, but what software can I use to actually measure FPS in Hulu? Is there an option I can turn on in Hulu itself?

It's sad how poorly Flash performs. But if you look at the whole Adobe stack, it's no wonder. Everything seems bloated, especially Acrobat - like a 1 GB install for creating and playing around with PDFs...
(Was it already that bad before it was Adobe's? I have the feeling it wasn't, but I'm probably mistaken.)

I'm glad Flash is now usable at least on some weaker hardware; ideally normal netbooks should also be able to run it... But here we have the opposite trend to the car industry: instead of getting more efficient, we just add a second engine to satisfy the customers.
I mean, cool, I could run HD Flash on an Ion netbook. But battery life will probably be something like halved... nothing is free. Reply

Anyone think the ATI Radeon HD 3xxx standalone cards will work with some kind of driver hack?

There is support for integrated Radeon HD 3xxx series however there is no mention of Radeon HD 3xxx standalone cards. I'm thinking this must be to encourage people to upgrade to the HD 4xxx series cards. ATI however supports their HD 3xxx series integrated video, so I suppose the capability must be there for a standalone HD 3xxx card. Reply

The 9.11 drivers do NOT list HD 3200/3300 anymore, oddly enough. Remember that the RS780 chipset did get a bit of the advanced video decoding features even though it didn't have the shader and gaming performance. I think IGP 3000 will get support, but I wouldn't expect the same level of acceleration on the normal discrete 3000 series.

Also interesting is how HD 3200 (RS780) became HD 4200 (RS785). They're practically the same, I think, but one got the 4000 series moniker just to make it sound newer. Just like NVIDIA with the GTS 250, etc. cards.

At present, the HD 3200 on a laptop definitely isn't accelerating Flash to any noticeable extent. With a dual-core 2.0GHz processor, it almost doesn't need help from the GPU, but it would be nice to go from 60% CPU and some occasional frame dropping to 20% CPU and no dropping. Atom went from playing only 10% (or less) of frames and 90%+ CPU to less than 70% CPU and playing 100% of frames, with the ION chipset. That shows what is possible, once all the various aspects are dealt with. Reply

Interesting article, although I think it is too soon, given the beta drivers and versions.

Did you guys also happen to test the influence on total power consumption? By utilizing the GPU more to reduce CPU load, I wonder if power consumption actually went up, since it is known that a GPU (well, at least a mid-to-high-end one) can consume far more than the CPU alone. Reply
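The tradeoff asked about above comes down to simple arithmetic once you have real measurements. A back-of-the-envelope sketch follows; every wattage figure in it is a made-up assumption for illustration, not a measurement.

```python
# Back-of-the-envelope answer to the power question. Substitute readings
# from a wall power meter for the assumed wattage figures below to get
# a real comparison.

def net_power_delta(cpu_pct_saved, cpu_watts_per_10pct, gpu_watts_added):
    """Positive result = total system power went UP despite the offload."""
    cpu_watts_saved = cpu_pct_saved / 10.0 * cpu_watts_per_10pct
    return gpu_watts_added - cpu_watts_saved

# Example: offload drops CPU load by 40 points, the CPU scales ~3 W per
# 10 points of load, and waking the GPU's decode block adds ~8 W:
delta = net_power_delta(40, 3.0, 8.0)  # -4.0, i.e. a 4 W net saving
```

On a high-end discrete card the `gpu_watts_added` term could easily dominate, which is exactly the commenter's worry; on an Ion netbook it likely won't.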

Although offloading some video decoding to the GPU sounds nice, I'm surprised Adobe would bother with it while their CPU decoder leaves so much room for improvement.

I just set up a test: I encoded a test H.264 video (1280x532) in MP4 format and created a test webpage with the video embedded using both Windows Media Player (with the CoreAVC codec) and Flash video (JW Media Player). I then played the video in IE7 on my GF's laptop, a Core 2 Duo underclocked to 1163MHz, using each player. IE's CPU usage playing the video with WMP was less than 20%; embed the same video with Flash and CPU usage goes up to 49-50 percent - near max CPU usage, as the player is not multithreaded.
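Normalizing those readings by clock speed and frame rate gives a rough cycles-per-frame comparison. This is my own arithmetic; the 24fps frame rate is an assumption, since the clip's real rate wasn't stated.

```python
# Rough normalization of the WMP-vs-Flash numbers above: CPU cycles
# burned per frame at a given clock and utilization.

def cycles_per_frame(clock_hz, utilization, fps):
    return clock_hz * utilization / fps

CLOCK = 1_163_000_000  # the underclocked Core 2 Duo, per core
flash = cycles_per_frame(CLOCK, 0.50, 24)
wmp = cycles_per_frame(CLOCK, 0.20, 24)
ratio = flash / wmp  # Flash burns about 2.5x the cycles for the same video
```

The ratio is independent of the assumed frame rate, which is why the comparison holds even without knowing the clip's real fps.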

Why doesn't Adobe focus on improving their dismal software decoder? A decent CPU decoder would also do away with all the silly GPU and platform requirements.

Hardware scaling would certainly be nice to prevent performance drops when going full screen but wasting resources developing GPU video decoding while their CPU decoders are in such a sad state is a clear misappropriation of resources.
Reply

Playing video through DXVA (GPU-enabled) decoders, I average about 2% CPU utilization for 720p and 5% for 1080p content, including other background tasks. This is using a C2D E6600 overclocked to 3.0GHz and a Radeon 4870.

If you're looking at a CPU utilization of 20-50%, even for a CPU clocked just over a third of what mine is, for lower resolution content you're not getting any GPU offloading. Reply

I saw all this coming long ago, when Adobe acquired Flash to begin with.

Adobe used to just make Acrobat Reader; it sucked then and it sucks now. It's just so embedded in every corporate high-wire act it's stupid. Not to mention all the memory it wants on startup and leaves resident - sloppy from day one.

Macromedia was the company that created Flash (at least to my memory). When Macromedia owned it, it wasn't bloated crapware. And then again, we weren't streaming whole shows, and 720p/1080i were not the buzzwords of the day.

I realize Homestar Runner and Illwillpress are not fully transmitted/encoded video; they are created in Flash, for Flash.
But I don't see how this is enough to require GPU acceleration. Isn't there a way to streamline this? Why doesn't other video kill everything else with such efficiency? Are we sure they're not just accelerating how fast my computer can be exploited? This is a net application, after all.

I'm not a coder or some software guru, just a dude that works on computers. Could someone explain, or link me to something that explains, whether this is an encoding issue or a NEEDZ M0r3 PoWA issue? Adobe on my GPU - sounds like "Sure, I need some Nike cross-trainers for my ears." Reply

Video decoding is quite CPU intensive, but nowhere near as heavy as video encoding with decent quality settings. Also, the CPU will be able to handle all current HD video formats within a few years, once hexa- and octo-core or higher CPUs are mainstream.

The situation we are in currently regarding HD video playback of MPEG4 AVC type video is rather like the mid-late 1990's with DVD MPEG2 video, where hardware assistance was required for the CPUs of the day (typically around 200-400MHz) and you could even buy dedicated MPEG2 decoder cards. Within a few years, the CPU was doing all of the important decoding work with the only assistance being from graphics-cards for some later steps (and even that was not necessary as the CPU could do it easily if required). The same will apply with HD video in due course, especially as the boundary between a CPU and GPU narrows. Reply

I'm still rocking my trusty 8800GTX card. My heart sank a little when I read that G80 cards are not supported. This is the first time since I bought the ol' girl years ago that she has not been able to perform.

However, I also have an 8600GT that runs two extra monitors in my workstation, and I always do my Hulu watching on one of those monitors anyway, so things may still work out between us for a while longer. Reply

I not only don't see any difference, but I think something was wrong with your Mac Pro. Hulu 480P and YouTube 720P videos have been fully watchable on my system, in full screen on a 1080p monitor, all along.

When playing your same Hulu video (The Office - Murder, 480P, full screen) with both versions of Flash, I get a nice stable full frame rate (I don't know how to measure frame rate on OS X, but it looks the same as when I watch it on broadcast TV,) with 150% CPU usage. (Average; varies from 130% to 160%; but seems to hover in the 148-152 range the vast majority of the time.)

And Legend of the Seeker, episode 1 in HD skips a few frames, but is perfectly watchable. Reply

Under Snow Leopard, the video was obviously jittery and video info indicated many dropped frames. CPU usage: 127% AVERAGE. Even the buffer froze at one point, stopping the video - I get this often under MacOS for some reason. The fan started up in seconds.

Under Windows 7, I experienced a handful of dropped packets on starting the video, but never observed anything but pitch-perfect playback, and the buffer raced far ahead of the playback time with no slowdown. CPU usage: 55% AVERAGE

It is an issue specific to the Mac, however the ball is in Adobe's court to fix it. It's their code that sucks ass under OS X. Apple had HD video content playable just fine full screen prior to the switchover to Intel CPUs.

This is why Apple pushes for open standards and wants Flash to die. Apple can't improve the closed Flash platform on their own, but they can build their software to support standards well. It looks bad for the Mac when the platform has problems playing keyboard cat due to closed proprietary crap.

The fact that Adobe "magically" brought CPU usage down from 450% to 130% is clearly a sign they can improve it if they try. Now they just need to stop acting like children and use the OpenCL standard on OS X 10.6 to accelerate it via the GPU. Reply

Sure, Apple likes to control their platforms, but that doesn't mean the platform is built on closed technology.

WebKit (the foundation of Safari and tons of mobile browsers, including the ones on Android and the Pre) started as KHTML. Apple helped extend it and turn it into a mobile browsing powerhouse. It's also one of the most HTML-compliant browsing cores out there. If HTML5 ever sorts out this video codec mess, it could possibly replace Flash, a technology controlled only by Adobe.

Quicktime is completely MPEG 4 compatible, due to the fact that most of MPEG 4 is based on Quicktime technology. H.264 is everywhere now, streaming into the crappy Flash players, being used to encode movies on BluRay, and so on. MPEG 4 audio is also widespread mostly due to the iTunes Store.

PDF is a core part of OS X.

Bonjour/DNS-SD is an open protocol widely adopted by many printers and other devices, and even Microsoft with Link-local Multicast Name Resolution borrowed heavily from it.

OpenCL is a unified GPU compute language that helps to get GPGPU acceleration out of the "Glide/3dfx" realm and on its way to wider adoption.

Grand Central Dispatch is a great technology for developing programs that run well on multicore CPUs, and is being adopted by FreeBSD as well.

Would you like to bring up any counterpoints, or just mindlessly try to bash comments without anything to back them up? Reply

I call BS on Adobe in particular because the ENTIRE Snow Leopard release was to provide access to hardware features through Xcode. Snow Leopard is an OS for developers!

Xcode provides access to the compute power of the video cards; just instantiate the object - almost like COM, but easier ;). Let me translate for Adobe.

"We are lazy, like really, really lazy and don't really care about platform support. We're more closed than any other company but we'll blame others for our fallacies! Let's sign an NDA-no wait, that would be proprietary and companies like Apple, Linux/BSD (company?? I need coffee) want Adobe to standardize an API."

If Adobe were to create an API, they would handle the back-end and could even "open up" the front-end to Adobe. I should go to their offices in Ottawa and slap them on their wrists now.
Reply

Flash should just use the platform's libraries to decode video - any video format, not just limited to their crap. Why the hell not?

There isn't a DirectShow equivalent on Linux, but ffmpeg is pretty standard there and can be used directly instead. ffmpeg has had VDPAU support on Linux since the start of this year, which is pretty much the equivalent of DXVA, at least for Nvidia cards. If it does not support the AMD alternative yet, eventually it will, and then all programs using it will automagically get that too.

I don't know why things on Windows are such a mess that official ffmpeg won't support DXVA there - but there are versions that do, and there are other DirectShow filters that do, so using DirectShow should be a fine solution there.
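The "use the platform's decode path" idea above can be summarized in a small illustrative sketch. The mapping below is my own reading of the comment, not how any shipping player actually selects its backend.

```python
# Illustrative sketch: map each OS/GPU combination to the hardware H.264
# decode API a plugin could lean on, per the comment above.

def pick_decode_api(os_name, gpu_vendor):
    if os_name == "windows":
        return "DXVA"        # reachable through DirectShow filters
    if os_name == "linux":
        if gpu_vendor == "nvidia":
            return "VDPAU"   # the API ffmpeg gained support for
        return "VA-API"      # the Intel-backed alternative
    # Mac OS X exposed no public hardware decode API at the time,
    # which is a large part of the problem discussed in the article.
    return None
```

The Mac branch returning nothing is the point: without a public decode API, there is no platform library for Flash to delegate to there.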
Reply

DirectShow and ffmpeg aren't the same thing, and FFmpeg is illegal homebrew software anyway. You can compare DirectShow with GStreamer, and DXVA with VDPAU. FFmpeg is just the codecs and container demuxers, not the multimedia framework that puts the image on the screen.

When these drivers get better, I would love to see a battery life comparison between GPU-accelerated and non-GPU-accelerated Flash. I think it would be useful, since long battery life for netbooks and ultraportables is all the rage at the moment. Reply

Two bugs so far:
1. I noticed bilinear filtering is missing on YouTube videos when you don't play them in HD. It was exactly the same when you disabled hardware acceleration in the Flash options, but now with this 10.1 beta I can clearly see the blocky effect everywhere, especially when I make a video play fullscreen. Good job, Adobe!

My system is a Core i7 920, 6GB RAM, and a Radeon 4870 1GB video card... on a 6Mb DSL line, and I'm at the end of the line - way out in the sticks.

Currently I am downloading a large file, have ESPN (Mike & Mike) audio streaming in the background (muted right now), and am playing "V" at the HD setting and full screen.

I notice no blockiness, no artifacts, nothing but perfect visuals from Hulu. It does stutter for a second once every few minutes - I guess dropping a frame or whatever - but I'm attributing that to a lot of my available bandwidth being taken up by ESPN and the file I'm downloading while watching the video.

Otherwise, a simply smooth video, looking just as good as the OTA broadcast of the original.

Don't know what the issues are for you, but I've just never noticed any problems with Hulu's streaming, except in videos that weren't filmed in HD to begin with. Reply

Hi guys, nice news, but I would like to ask if it is possible to do a second test on the AMD systems, because of the 9.10 vs 9.11 issue. Even a quick check would suffice to let you (and us) know whether it changes anything. Thanks. Reply

Could one of you do a quick test with the 9.11 drivers and give at least a single data point to show how well this works on the ATI parts? I know you updated to say 9.11 was required and that could be a problem, but these tests appear pretty hands-off (FRAPS), so a single episode of the Office or the HD content on YouTube shouldn't take more than an hour to run?

I just don't want to see the comments section turning into another ATI/NVIDIA fanboy girl-fight, and claims of NVIDIA favoritism...

Very interested in the final version of this FLASH update. It has been WAYYYY too long in coming. Reply

I am wondering if piping the video through the DXVA hardware decode path does anything for image quality? Do the standard Purevideo/Avivo enhancements apply? At the very least, I imagine it might "soften" resized video a bit more than the standard pixelated crap from Flash. Reply

So I just checked it out, and on my system the video quality for Hulu 480p videos at fullscreen is horrible. There are noticeable blocks, most evident in people's faces. I downgraded back to 10.0. I don't know if the problem was my video card's rendering or Flash, but on an 8800 GTS 512MB it was unacceptable. I didn't check performance closely because on an i7, Flash doesn't stress the CPU enough for me to care. I just checked Task Manager to make sure it wasn't doing something weird and pushing the CPU hard. Reply

Just tested it on my Nvidia 9600GT, running YouTube HD content: CPU utilization dropped to 25-27%, compared to more than 55% before, using the latest Flash player. However, the video quality dropped and the whole scene seems washed out. Some blocky effect as mentioned, as well. Reply

I tried 10.1 on my system, which has a Pentium E6500, a 9500GT, and the latest Nvidia drivers. The CPU utilisation went way down whilst watching an HD stream on the BBC iPlayer, but the image quality had also dropped considerably. There were noticeable block artefacts - it looks like the AA which was previously applied was no longer happening. I had a quick play around with the PureVideo settings in the Nvidia control panel, but nothing seemed to make a difference.
I've reverted back to Flash 10 now. Reply

AMD/ATI
Hardware video decoding of H.264 content in Flash Player 10.1 is supported on AMD/ATI products with
UVD2 with the ATI Catalyst™ Software Suite, starting with version 9.11 for the ATI Radeon™ family of
products, and driver release 8.68 for the ATI FirePro™ family of products. Reply

Well, that would explain things, though I *swear* it said Catalyst 9.10 earlier today/tonight. I think Adobe fixed a typo, because I even followed a link at one point to download the Mobile 9.10 drivers. Reply

The 9.11 RC you mention through AMD's developer site does not support Flash 10.1 GPU acceleration, I just confirmed. Waiting for a driver that does from AMD, also trying to see when AMD will make it public.

Yes, it seems that AMD released the 9.11 drivers at about the same time as I made that comment.
The final 9.11 release should have the GPU acceleration for Flash... However, it didn't seem like they left the OpenCL support in the final release.
So the 9.11 RC drivers and the 9.11 final release seem to be very different :) Reply

1. So I'm assuming Flash now takes advantage of DXVA2 EVR rendering, meaning the GPU is now responsible for decoding quality? Should I now be able to adjust my AVIVO settings for Flash? I'm not too sure how EVR/DXVA and the video card are related.

2. Too bad Linux isn't yet supported. Flash on Linux is notoriously bad. Nvidia is pushing their accelerated VDPAU, and many software players now include support for it. ATI and Intel are doing something different, but it seems bindings are available to translate. So hopefully in the near future Linux gets a modern bitstream-accelerated video decoding framework.

3. Does it work with H.264 only? Or does it also work with the Sorenson and VP6 codecs? So is YouTube HQ or better always MP4? Reply

I know exactly what you mean about preaching to the choir. I have a decent midrange system (E7600, 8 GB RAM, 4870 1 GB, W7 x64), and even having Flash ads in an open browser window will choke my framerate on Dragon Age: Origins.

So I did the sensible thing and installed Flashblock (previously I only used it on my laptop for battery life and performance).

Better than Flashblock, just use NoScript (assuming you use Firefox/Mozilla).

If I ran a website, I think I'd avoid all flash ads (or at least highly recommend my advertisers avoid it).
Although I know many people block all ads, I have no problem with ads, so long as they don't talk and don't eat up CPU cycles... oh, and I block the keyword ads, because I move my mouse while reading, and those inevitably block the text that I'm reading.

Someone said that the problem is poor coding and that may be true, but if you're on a message board and you open up a bunch of threads in different tabs, those flash ads will eventually kill your processor. On one board, I open up every single thread that I've participated in as soon as I get on (so that they don't get marked as read before I read them) and until I blocked flash, that killed my system.

2) Flash is great; it's the best thing out there for delivering so many things. It's also some of the most fun and creative software I use. The problem is how advertisers use Flash, and what stupid webmasters decide to do with it (dump Flash ads all over the place). This is NOT the fault of Flash. It simply happens to be the best tool for these things. If there were anything that could compete, that would be used instead, and then people would just call that annoying.

3) Nothing is wrong with Flash performance considering what it does. It normally uses vector-based graphics, and these happen to be very demanding on CPUs; Adobe could not possibly get vector graphics magically running as well as pixel-based graphics no matter what they did. The advantage of vector-based graphics, though, is things like infinite zoom with no pixelation, and adaptive resolution. It's nice to see GPU acceleration for video, though; that was needed.

It's sad that even Anand does not seem to understand this stuff. Reply

I agree Flash is a good thing that is often used poorly. My concern is that the vector benefits you mention simply become irrelevant when pixel-based video is converted to Flash. It is a mammoth waste of electricity and CPU/GPU cycles. I hope they come up with a better alternative for video, since it seems to me the core of Flash-based video (vector-based video) will never change.

Vector graphics have highly variable CPU requirements, whereas raster video has CPU requirements directly proportional to the compression and resolution, which at this point are very high. The Flash player is extremely efficient; it has no problem reaching 60 FPS on high-resolution content. The problem comes when you overload the content with silly effects that Adobe made just a little too easy to use (e.g. shadows). Your frame-rate dispute likely stems from the default of 24 FPS, which ironically is what film runs at, unless the content runs at 29.97 or 30 FPS... either way, much lower than 60.
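The frame-rate figures in that paragraph translate directly into per-frame time budgets, which is easy to sanity-check; this sketch just does the arithmetic for the rates mentioned above.

```python
# Per-frame time budgets at common playback rates. Content exported at
# the Flash default of 24 fps gets nearly 42 ms to render each frame,
# while a 60 fps target leaves only ~16.7 ms -- less than half.
for fps in (24, 29.97, 30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps:>6} fps -> {budget_ms:5.1f} ms per frame")
```

The same math explains why piling shadow filters onto a 24 fps movie still stutters: the effect cost blows past a budget that was generous to begin with.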

HD video just cannot play back without dropping frames unless a GPU helps. Most codecs use the GPU at this point, so you rarely see high CPU usage during video playback.

I agree with you that it would be sad if Anand did not understand this stuff, but I think he understands it more than you think he does, and more than you do yourself. What's even sadder is how many people at Microsoft, Apple, Adobe, NVIDIA, AMD, etc. don't understand it. It's a nightmare even for competent users, let alone the computer illiterate.

What makes me sad is that Anand doesn't seem to understand (or maybe ignores) how bad the entire codec and GPU-acceleration industry is. I see screenshots of absolutely horrid control panels and video players without comments like "look at this complete trash they shipped us." There is no reason to have 1) non-native-looking anything or 2) a control panel for graphics or codecs. This kind of bleeds over into the sound-card realm as well.

Anand: I have a fairly similar Mac and I fully identify with you. The situation is complete garbage.

I disagree, there is a point for control panels for video/audio codecs. See FFdshow.

Control panel for graphics:
How else can we force things like AA, AA type, CrossFire, etc. on or off? By editing the registry?
Audio: Should we edit the registry to change the number of speakers, the subwoofer cutoff frequency (which depends on the size of the mains vs. the sub), and so on?

"HD video just cannot play back without dropping frames without the help of a GPU. Most codecs use the GPU at this point so you rarely see high CPU usage with video playback."

Not really true; a good Core 2 Quad should manage it just fine.

That said, I must say that Flash wasn't really meant to be used the way it is used today.

Most control panels are crutches for poor operating-system and application developers. I have never touched ~95% of the settings in such control panels, and AA settings belong in the application, if anywhere. These sorts of things should be non-issues. Same with sub cutoffs; that belongs on the amp/receiver. Even if I'm completely off base, there's no reason the travesty that is any current control panel should look the way it does. Honestly, skins in Catalyst? What the fuck.

The ONLY three things I ever do in a graphics card control panel are 1) adjust flat-panel scaling, which should never be set to anything but maintain aspect ratio; 2) adjust black levels, which I should never have to do with a properly calibrated monitor; and 3) adjust multi-monitor options, because for whatever reason NVIDIA and ATI insist on using their own ultra-shitty implementations instead of the only somewhat-shitty Windows implementation.

You got me on the C2Q playing back video, although if you start doing anything else CPU-intensive, that won't last. Imagine playing back TWO videos?!

Sorry, but no. Having settings in the application means that if the application was not designed to use those features, you can't turn them on. Any features implemented just by the driver and hardware should be controlled entirely by the driver and hardware. The user should then have a front-end to control those settings.

Games should have the minimal number of settings for anything not programmed into the game; the driver should have everything else. AA especially should not be controlled by the game itself. AA controlled by the game means the game can turn it off, which should never happen. I would return a game that did that in a heartbeat (I've had AA always on since Quake 2 and Tribes on a GF2 GTS, which would not even be possible if it were a game setting, and am not turning it off until pixel size becomes at least 1/16 of what it is now on LCDs).

The game should never get to decide what settings can or cannot be used. Likewise, audio applications should not have to know about my speaker setup; they should just send data to the driver API. Video should be the same way: define the stream, send it to a demuxer, and have that send the video portion to a black-box decoder API. It doesn't always work out perfectly, but that's why default settings tend to be the most widely compatible ones.

Not all such control panels are well thought out, but they have their place. In some cases they are OS/app crutches (multi-monitor and color tuning, for instance), but they are quite useful beyond that, and would not disappear even if all the crutch-like settings could be done away with.

The problem is that AA doesn't work properly using some methods unless it is built into the game engine. Never mind that with pixel and compute shaders, it is now possible to do AA within the game code and not have it impact performance as much (i.e. DX 10.1 enabled games). I don't think you can make a case for either direction: i.e. it shouldn't ALWAYS be controlled by the game, and likewise it shouldn't NEVER be controlled by the game. Similarly, trying to control that in the driver won't always work (but it sometimes will).

IMO, the only reason we have the setting in drivers is because games are often not forward thinking, limiting what setting they will or won't support. Assassin's Creed for example decided that any resolution above 1680x1050 shouldn't be allowed to run AA. Stupid. Older games were made before AA was even a consideration. All new titles should look at implementing AA internally, in an optimal manner, with the ability for the user to turn it on or off. Thankfully, most games are doing exactly that.

Another case for why AA should be in the game/application and not the driver: say game X runs perfectly well at 2560x1600 4xAA, but game Y can't do more than 1920x1200 0xAA, and another game can run 2560x1600 0xAA, and yet another 2560x1600 2xAA.... You get the point. If you control the setting within each game, you set it once and forget about it. If it's a driver only setting, every time you decide to play a different game, you have to enter the control panel for the drivers and change the setting.

Saying AA should be only in the control panel is just one step up from saying games shouldn't even be able to specify what resolution to run at. I think we can all see how ludicrous that would be, and by extension forcing the driver to tell games what AA (and HDR, etc.) to use is equally limiting. The application knows what to do best, and the drivers are just an interface that talks to the hardware and interprets common function calls.
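The "set it once per game and forget it" argument above boils down to a per-title settings store with a safe fallback. Here is a minimal sketch of that idea; the game names and values are hypothetical examples, not real titles or tested settings.

```python
# Per-game graphics profiles: each title remembers its own resolution
# and AA level, instead of one global driver-wide setting that must be
# changed every time you switch games. All names/values are made up.
profiles = {
    "game_x": {"resolution": (2560, 1600), "aa_samples": 4},
    "game_y": {"resolution": (1920, 1200), "aa_samples": 0},
}

def settings_for(title):
    """Look up a title's saved settings, falling back to a safe default."""
    default = {"resolution": (1280, 720), "aa_samples": 0}
    return profiles.get(title, default)

print(settings_for("game_x"))   # saved once, reused on every launch
print(settings_for("unknown"))  # unrecognized titles get the default
```

The dictionary lookup with a fallback is the whole point: the application keeps per-title knowledge the driver can never have, which is the commenter's case for in-game settings.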

A reasonable Core 2 Duo with GMA 4500 or better graphics should be able to handle Flash HD at up to 1680x1050 without trouble, even with Flash 10.0. The problem is that a lot of netbooks, nettops, and entry-level laptops don't have even that.

Flash sucks big time, and is largely unwanted on the internet!
Instead of displaying movies in Flash, websites would have done better streaming DivX/XviD or OGG/MP3/WMA or the like.

Flash is as much an internet hog as RealPlayer was back in the RealNetworks days.
Both are low quality, require a lot of CPU, and display few FPS.

I'd prefer the internet to become like the mobile internet: without ads, and with just the minimal info necessary to do the most basic internet tasks (like this article, a few JPEGs, and a non-Java-based forum or thread underneath where we can type).

Many websites wouldn't exist if it weren't for the ads, you may say, but I'll reply:
"So where is all that money going that they get or pay to keep their website online? Who's at the head of the chain? The government?"
The internet was supposed to be a free thing; the only ones who should charge are the companies that lay the cables which carry the signals, and whoever pays for the renewal of the servers.

We're living in 2009; back in the '80s, it could cost you quite a few bucks to have 20 MB of online server space!
Nowadays, they still charge quite a lot while giving you 50 MB of web space! I mean, what is that?
It costs a company today $90 to get a 1 TB hard drive!
And if they got rid of Flash altogether, internet pages wouldn't take up more than a few hundred kilobytes.
A 1 TB hard drive could be enough to give 10,000 customers 100 MB of web space at $10 per person, which is almost free.
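The arithmetic in that last line checks out on its own terms; this sketch just runs the numbers. The $90 drive cost and $10-per-account price are the comment's figures, not real hosting economics (which the reply below points out).

```python
# Checking the hosting arithmetic: split a 1 TB drive into 100 MB
# accounts. Figures ($90 drive, $10/account) come from the comment
# itself and ignore RAID, backups, power, staff, and bandwidth.
drive_mb = 1_000_000          # 1 TB, decimal megabytes
account_mb = 100
drive_cost = 90
price_per_account = 10

accounts = drive_mb // account_mb
revenue = accounts * price_per_account
print(accounts, revenue)      # 10,000 accounts, $100,000 gross
```

Gross revenue dwarfing the raw drive cost is exactly the commenter's point; the counterargument is that the drive is the cheapest line item in running a server.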

But no, if you have a big website, they charge you hundreds of dollars per year, sticking a big fat bonus in their paychecks, because a server does not cost a company $100,000 anymore; nowadays one can easily be built for a tenth of that price, yet they still charge too much.
That's why the world must be terrorized by Flash ads.
I'm glad someone had the sense to create an ad blocker for my browser, because it not only eases my reading of the page, it also reduces my overall network traffic, lowers CPU usage, and therefore increases battery life on my notebook, and it keeps me safer from hackers trying to break into people's computers through annoying popup ads using the weaknesses of Flash.

The world would be a better place if Flash had never been invented.
Even those little tasks you do in Flash, like playing FarmVille on Facebook, would perform much better as a downloadable executable instead of a Flash game!

People provide services. In exchange for those services, they get paid. Their pay allows them to buy your services. People who provide services people actually want get paid, while services people don't want fail. Government takes 50% of your pay to build roads, police crime, and run perpetually failing social programs.

A 1 TB hard drive might cost YOU $90, but for a server company it costs quite a bit more. Or do you and the 10,000 other customers not care whether that 1 TB is RAIDed and backed up? I'm not sure what else goes into it, since I don't do that, but I would guess they also have to pay for space, electricity, AC, people, internet, etc.

Oh, I'll add that I hate Flash too, especially the idiot websites that think their front page has to be Flash based.

I'd also like to add that the bad Flash performance in many things like Flash-based ads is nearly always down to the web developer of the ad itself. So many of them could be made to use less CPU power, or even get the file size way down.
It's nearly always down to web design amateurs who don't know the following:
which image file types are best suited to what they're doing,
when to use vector graphics instead of JPEGs,
and what quality settings and Flash publishing settings to use.

Even with a 9500GT that plays back 1080p MKV beautifully, Flash Player chugs. It also only uses one core! What the hell; at least you'd think it would be multithreaded. The Atom D510 may be pathetically weak, but it goes to show how far Adobe's heads are up their asses when, even with a supported CPU/GPU on a supported OS (Win XP/Vista x64), Flash is still such a piece of garbage that it can only grab one thread of the CPU.
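The single-core complaint above is about work distribution. As a toy illustration only, here is how fanning per-frame work across a thread pool looks; `decode_frame` is a dummy stand-in, not a real codec (real decoders usually parallelize by slices or GOPs because frames depend on each other).

```python
# Toy sketch of spreading per-frame work across threads, the way the
# commenter wishes the Flash player would. decode_frame is a fake,
# dependency-free stand-in for CPU-heavy codec work.
from concurrent.futures import ThreadPoolExecutor

def decode_frame(frame_id):
    # Pretend-decode: a real player would do expensive bitstream work here.
    return f"frame-{frame_id}"

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves input order, so frames come back in sequence
    # even though they were processed concurrently.
    frames = list(pool.map(decode_frame, range(8)))

print(frames[0], frames[-1])
```

In CPython, threads only help here if the heavy lifting releases the GIL (as native codec libraries do); the sketch shows the dispatch pattern, not a performance claim.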