Posted
by
CmdrTaco
on Wednesday April 28, 2010 @10:45AM
from the only-a-matter-of-time dept.

Stoobalou sounds another death knell for Flash video. He says "Another heavy user of Adobe's video streaming software Flash is now pandering to the all-powerful iPad. Everybody's favourite waste of time, social notworking monster Facebook, is now streaming user videos to Apple's second coming of the portable computer with no sign of Flash in sight."

The Flash video used on Facebook is already H.264 video and AAC audio, just in an FLV container. All they really need to do is remux everything; I'm assuming they'll just remux into an MP4 or MOV container.
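The "just a different wrapper" point is easy to see in code: a remux leaves the compressed H.264/AAC payload untouched and only changes the container's framing, which starts with the file's magic bytes. Here is a minimal Python sketch (with fabricated example headers, not taken from any real Facebook file) that tells the two wrappers apart:

```python
def sniff_container(data: bytes) -> str:
    """Guess the container from its magic bytes.

    FLV files start with the signature "FLV"; MP4/MOV files carry an
    "ftyp" box whose 4-byte size field comes first, so the box marker
    shows up at offset 4.
    """
    if data[:3] == b"FLV":
        return "flv"
    if data[4:8] == b"ftyp":
        return "mp4"
    return "unknown"

# Fabricated headers for illustration. The codec payload that follows
# could be the identical H.264/AAC bitstream in either case; only the
# wrapper differs, which is why remuxing needs no transcoding.
flv_header = b"FLV\x01\x05\x00\x00\x00\x09"
mp4_header = b"\x00\x00\x00\x20ftypisom"

print(sniff_container(flv_header))  # flv
print(sniff_container(mp4_header))  # mp4
```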

You're correct. Too many people seem to think that Flash is a specific codec, when in fact it is no such thing. It's about time that this barbaric development platform sees its overdue end so we can move on to better platforms.

Facebook may very well already be encoding its videos in H.264 (which is supported by Flash). In this case, all they need to do is to wrap the files into an MP4 container, with no transcoding necessary.

YouTube already supports this and, I imagine, will begin to do it by default in the near future.

Um..... please explain how Apple is responsible for the progression from floppies to hard drives, or from parallel ports to USB ports. The former seems a natural event, since programs/OSes could no longer fit on floppies. The latter is a result of the USB Consortium. To give Apple credit for this seems disingenuous (especially since Apple would have preferred to kill USB in favor of FireWire).

Simple. The iMac shipped with USB everything. No floppy disk. No legacy ports (ADB, RS232, etc). Hell, I don't think the original ones came with a CD burner!

Back then, yes, you had USB. But you had two measly ports that pretty much sat empty, because all the peripherals you could get were cheaper and easier to find in other connection formats. A keyboard and mouse were PS/2 because you could get both cheaply (a cheapass USB one would run you $50, a PS/2 version of the same $20 or less). Printers used the parallel port. Modems either plugged into a serial port or went inside your PC. And hard drives you had to install yourself. You could get external Zip and Jaz drives, but unless you used SCSI, you put up with parallel ports. You transferred data via sneakernet.

And hell, USB had been around for 3+ years and peripherals were hard to come by. They were expensive and no one wanted them. OS support was iffy, too. Windows 95 OSR2 had basic keyboard/mouse support. Windows 98 the same, but you could get drivers for mass storage. Broad support was basically non-existent until Windows 2000.

Then Apple released the iMac, and USB was all you got. All of a sudden, a flood of peripherals started coming out for USB, and prices plunged. USB floppies, USB printers, keyboards and mice under $10. USB didn't mean overpriced anymore. And I had scoffed at USB devices precisely because they were overpriced; the USB versions were always much more expensive.

And Apple did like FireWire, because, well, you could stick a hard disk on it and not have to wait all day to transfer files like with USB. (Remember, the iPod used FireWire purely because USB 1.1 was pathetically slow, and USB 2.0 was on the horizon but would take a few more years to become popular and standard on every PC.)

My understanding is that some containers bring features such as multiple audio tracks and multiple subtitle tracks. The sound and video are stored separately inside the container (this is why sound can get out of sync sometimes; they are two separate streams of data playing simultaneously). Some containers like MKV can provide different audio streams for things like different languages, as well as subtitles and many other kinds of metadata. The container is almost like a zip archive, with all the different parts living inside it alongside additional data.
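The zip-archive analogy can be made concrete. Below is a toy, entirely hypothetical container format in Python: each stream's data is written as tagged, length-prefixed chunks, so video and multiple audio tracks interleave in one file yet remain separate streams. Real containers (MKV, MP4) work on the same principle, plus timestamps, indexes, and codec metadata.

```python
import struct

# Toy container: each chunk is (stream_id: 1 byte, length: 4 bytes, payload).
# This is a made-up format for illustration only.

def mux(streams: dict[int, list[bytes]]) -> bytes:
    """Interleave chunks round-robin, roughly like a real muxer does.
    Assumes, for simplicity, that every stream has the same chunk count."""
    out = bytearray()
    for chunks in zip(*streams.values()):
        for sid, payload in zip(streams.keys(), chunks):
            out += struct.pack(">BI", sid, len(payload)) + payload
    return bytes(out)

def demux(blob: bytes) -> dict[int, list[bytes]]:
    """Walk the file and sort chunks back into per-stream lists."""
    streams: dict[int, list[bytes]] = {}
    pos = 0
    while pos < len(blob):
        sid, length = struct.unpack_from(">BI", blob, pos)
        pos += 5
        streams.setdefault(sid, []).append(blob[pos:pos + length])
        pos += length
    return streams

# One video stream plus two language audio tracks in a single blob.
VIDEO, AUDIO_EN, AUDIO_FR = 0, 1, 2
blob = mux({VIDEO: [b"frame1", b"frame2"],
            AUDIO_EN: [b"en1", b"en2"],
            AUDIO_FR: [b"fr1", b"fr2"]})
print(demux(blob)[AUDIO_FR])  # [b'fr1', b'fr2']
```

A player picking the French track simply reads the chunks tagged with that stream ID and ignores the rest, which is how one file can carry several languages.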

There was this computer you might have heard of, it was called the iMac.

When it came out, USB was around but there were very few peripherals and people were still using floppies rather than CDs for everything.

The use of floppies for software distribution was already on the decline (though in most cases you still needed a rescue floppy for a Windows machine), but the iMac certainly helped speed that up and showed that a computer could be successful without a floppy drive (many laptops still came with one at the time).

As for USB, though, the iMac caused a huge increase in the number of USB peripherals and had a significant impact on the market. You may hate Apple, but that's no reason to ignore the impact they had on the industry.

What a bunch of laughable nerd rage. And thankfully, utterly inconsequential to the millions of people who enjoy modern technology without the help -- or interference -- of nerds (a term that never should have lost its pejorative definitions). *This* is the real accomplishment of moving out of the '80s and '90s and into a more modern relationship with our tools.

Basically, a container serves to package up multiple streams of data (H.264 video, AAC audio, etc.) into one file with an index (for jumping around and maybe indicating chapters), subtitles, etc. As for the “what's what” of containers, Wikipedia has a nice comparison table [wikipedia.org] available.
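To make the index part concrete: with predictive video you can't start decoding at an arbitrary frame, so players consult an index of keyframe timestamps and jump to the nearest one at or before the seek target. A small sketch (the index values here are made up for illustration):

```python
import bisect

# Hypothetical seek index: presentation timestamps (in seconds) of the
# keyframes, as a container's index might record them.
keyframe_index = [0.0, 2.0, 4.0, 6.0, 8.0]

def seek_target(t: float) -> float:
    """Return the keyframe at or before time t. Decoding must start there,
    because the frames in between only encode changes from earlier frames."""
    i = bisect.bisect_right(keyframe_index, t) - 1
    return keyframe_index[max(i, 0)]

print(seek_target(5.3))  # 4.0 (jump back to the preceding keyframe)
```

This is why jumping around in a video sometimes lands you slightly before the point you clicked: the player snapped to a keyframe.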

Yes. According to the article, "All new videos are encoded in h264 format, so we're playing videos natively in the iPad since it supports h264-encoded videos."

Flash can already play H.264 files in the MP4 container. So all Facebook has to do is bypass the Flash plugin and just link you directly to the very same MP4 file the Flash player uses.

I notice quite often that Flash-based videos are already encoded in a format supported by the iPad (I say iPad because some are too high-resolution for the iPhone to play, even though the format is correct). Using the Activity window in desktop Safari, most of the time I see an mp4 file being streamed into the Flash player. Simply copy the URL from the Activity window, paste it directly into the browser, and it will play natively. Take the URL to the iPad and it will also play just fine. So in cases like this, all that needs to be done is for the site to link you directly to the media file, and the browser will do the rest. Why do we need Flash for video again? (I know, for those not encoded in H.264....)

You certainly have a whitewashed view of history... Windows 98 had USB support for any class of device built in, out of the box. The first USB header (not even a port) I ever saw was on an ASUS motherboard in the mid-'90s, long before Apple was using them.

Most PC companies are about gradual change: put both options on a board, then drop to just the one option once the new parts are widely available. That is what Apple did until the iMac G3. One could easily argue what they did was a bit premature.

Interesting that you mention floppies. I recall a lot of Mac users being rather upset (this was long before CD-RW or USB thumb drives were all that common). Many third-party companies made a lot of money selling aftermarket USB floppy drives.

Apple did force the issue, but like I said, the iMac came out in 1998 (their first all-USB machine, no ADB), and by then Windows 98 had full USB support built into the OS. Microsoft's famous bluescreen error [youtube.com] while plugging in a USB scanner happened during a Windows 98 demo, and yes, that feature worked when it shipped. 95 OSR2 had the same USB support via a patch, and no, it wasn't just keyboards and mice.

In other words, by 1998 USB was here, probably because both Microsoft and Apple promoted it actively. But you have to remember Apple derided USB (even when 2.0 came out) as being too primitive for anything HDD/camera/scanner related (yes, there were FireWire scanners made for the Mac).

You are correct. Containers provide metadata and important information about the streams, provide the ability to multiplex multiple streams into a single file, and generally also provide guidance for playback (preferred information about how the file should be played back, though sometimes the streams themselves also encode this).

Speaking on audio/video desync, for those interested...

You end up with multiple streams, each frame carrying a DTS and a PTS value, the PTS (presentation time stamp) being the one that really matters. Frames can sometimes be marginally out of order or otherwise screwed up, and the streams can play at different rates. Generally the audio is bound to the OS through a callback interface, because it's interrupt driven, so you end up playing it as fast as the OS will take it (and the OS locks to the bitrate and sampling rate of the audio so it plays back correctly). Then you sync the video to the audio: before you display a frame, you check its PTS and make sure it is within a certain distance of the PTS of the currently playing audio. If it isn't, you speed up the video by an amount depending on how far behind you are, or you just jump ahead, depending on your goals. You can see this when disk activity gets weird: sometimes the audio keeps playing, the video freezes, then the video jumps ahead to catch up.

This becomes more complex with codecs like H.264 that have predicted (P) and bidirectional (B) frames, because you have to account for the fact that such frames only encode changes relative to other frames (they don't contain a complete picture), so you end up seeking to the nearest key-frame (an I-frame, which does contain the complete picture).

If a program isn't properly using PTS values, the audio will sometimes appear slightly out of sync. From experience, this generally indicates the PTS sync threshold (the threshold at which the program starts to modify the speed of the video display) is too large. If the value is too small, you get a stuttering effect.
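The sync loop described above boils down to one decision per video frame. A hypothetical sketch (the function and threshold names are mine, and a real player would read the audio clock from the OS callback described earlier):

```python
def sync_action(video_pts: float, audio_clock: float,
                threshold: float = 0.040) -> str:
    """Decide what to do with the next video frame, given the audio clock.

    threshold is the sync window discussed above: too large and the audio
    drifts visibly out of sync before correction kicks in; too small and
    the constant corrections make the video stutter. 40 ms is an arbitrary
    illustrative value, roughly one frame at 25 fps.
    """
    diff = video_pts - audio_clock
    if diff > threshold:
        return "wait"   # frame is early: hold it until the audio catches up
    if diff < -threshold:
        return "drop"   # frame is late: skip ahead to catch up
    return "show"       # within the sync window: display it now

print(sync_action(10.00, 10.01))  # show
print(sync_action(10.50, 10.01))  # wait
print(sync_action(9.50, 10.01))   # drop
```

The freeze-then-jump behavior mentioned above is just the "drop" branch firing repeatedly after the video falls far behind the audio clock.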

So...there's more than you ever wanted to know about video and audio decoding.