
An anonymous reader writes "It looks like Microsoft might finally be realizing that Silverlight can't cover every platform, according to this conversation with Bob Muglia: '... when it comes to touting Silverlight as Microsoft’s vehicle for delivering a cross-platform runtime, "our strategy has shifted," Muglia told [ZDNet]. Silverlight will continue to be a cross-platform solution, working on a variety of operating system/browser platforms, going forward, he said. "But HTML is the only true cross platform solution for everything, including (Apple's) iOS platform," Muglia said.'"

The fact that most of them suck, and you would have to install multiple plugins just to be able to browse the internet regularly. And because of that, you end up with lots of security holes all over the place due to hastily written plugins.
Back in the day you had Real Audio, and you hated it, because all it did was BUFFER all the time. A lot of sites had WMV, which couldn't be played on Macs. A lot of sites had MOV which required QuickTime, which behaves horrendously on Windows. Now, the majority of

Yes on part of your post, but nobody has yet explained to me why supporting HTML V5 with H.264 is BETTER than supporting flash. It seems nobody is willing to talk about the elephant in the room: H.264, which is the biggest patent minefield in the history of bad patents. If we were talking WebM then yes, 100% right there behind you. But FOSS browsers like FF can't support H.264, since MPEG-LA has made it clear you WILL be cutting them a check, whereas Adobe doesn't give a shit who or what packages flash. Start advertising native H.264 support in a distro and watch MPEG-LA drop the troll hammer upon thee, whereas Adobe don't care, package away. So far they haven't even said boo about alternative render projects like Gnash.

So unless we can get the two Steves (Ballmer and Jobs) to get on board WebM I think we have a serious problem here. H.264 simply trades one master for another, and while I personally don't mind proprietary software as long as there is competition, switching over to HTML V5 would pretty much hand the keys over to MPEG-LA, which has proven it just isn't nice. I only hope the web developers of the world will unite and demand that HTML V5 have a FOSS codec for video, be it Theora or WebM, rather than simply trade one lock-in for another.

FYI, most Flash video is already streamed using h264. The options are plain old HTML + Flash + h264, or HTML5 + h264. On Linux, there are already dozens of programs able to decode h264, none of which has gotten into any legal trouble. Adobe, on the other hand, has actively been sending DMCA requests to projects (such as rtmpdump) working on decoding the proprietary RTMP protocol which is integrated into Flash streaming.

I think it'll be a game of who blinks first. If Google puts WebM as the primary codec on YouTube, many (most?) device manufacturers will feel compelled to support it.

I think it's also possible Google could get its posse of Android partners (and maybe Nokia) to also use WebM. With both Nokia and Samsung/Motorola/HTC/LG/Sony etc., that's the majority of phones out there.

Flash can be played on Macs. You just have to install the plugin if you want it now, rather than having a (potentially out of date, with security holes) one preinstalled. You know, just like the way it is done with Windows - if you want it, go and download it. Apple doesn't want to be responsible for shipping a plugin that you're going to have to update when you first use it anyway.

Not shipping it by default (to come into line with other OS vendors) is not the same as "can't be played [...] if Apple had the

I would think that HTML 5 being more cross platform is pretty obvious. Along the gradient of machine code -> interpreted/jit code -> scripting -> markup/declarative language, the more to the right you get, the more portable you inherently become.

Only in the parallel universe where web applications are an appreciable percentage of the total software in use.

Web apps are certainly more visible than other apps, but for much the same reason that TV shows are more visible than other forms of art: everyone (or nearly everyone) has a web browser and a TV. But just as all the television shows ever made represent fewer works than are in the average large chain bookstore this evening, web applications represent a negligible proportion of the software in use.

It all depends on who has access to reference environments, money, access to the internals, and motivation to make the changes to make it work.

Microsoft, being buried in cash and having access to just about anything it wants to play with, and the only access to Silverlight, could easily set a goal of making it better propagated than similar functions in HTML5.

I think what it really wanted was for, somehow, people to adopt Silverli

the more to the right you get, the more portable you inherently become.

No, you don't. That is only the case if the language(s) you're dealing with are transportable due to having a virtual machine/runtime compilation design - and those languages have a multitude of platform-specific interpreters.

Examples: Perl, Python, Java, JavaScript, .NET.

Silverlight is a very 'high level' language - but it only has runtimes for Firefox and Safari on OSX, and (essentially) Windows. There are no mobile implementations (except for possibly Windows Mobile 6.x, couldn't find any info on it.) Flash is much more portable and cross-platform.

Even javascript isn't all that cross-platform/portable due to the use of different browsers/javascript implementations.

Well, Mono has been cross-compiled to a number of platforms, and node.js (V8 engine) is available on many OSes as well... not to mention that the language implementation (aside from E4X support) is about the same everywhere (at the core ES3 level); it's the browser DOM that's the most different. Since the majority of smartphone sales are heading into iPhone & Android territory (both using WebKit), even that gets to be a pretty moot point.

Naww, I think the translation is "Sure, you can use HTML5 for videos of your cat and basic apps, but if you're a commercial video publisher you'll love our built-in DRM, robust playback controls, dynamic quality change based on bandwidth/congestion, and other features. HTML5 isn't a threat because we're more focused on Hulu and Netflix, not your grandma's blog about baking." Adobe has the same attitude along with "oh btw, here's a script that provides fallback to Flash if your browser doesn't support 5 o
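The fallback pattern that kind of script implements can be sketched with simple feature detection. This is an illustrative sketch of the era's common approach, not Adobe's actual script; the function names here are made up:

```javascript
// Hedged sketch: prefer native HTML5 <video> when the browser can decode
// H.264, otherwise embed the Flash player. Names are illustrative.
function canPlayH264(doc) {
  var video = doc.createElement("video");
  // canPlayType returns "", "maybe", or "probably"
  return !!(video.canPlayType &&
            video.canPlayType('video/mp4; codecs="avc1.42E01E"') !== "");
}

function choosePlayer(doc) {
  // Fall back to the Flash player on browsers without <video>/H.264.
  return canPlayH264(doc) ? "html5" : "flash";
}
```

In a real page you'd pass the global `document` and swap in the appropriate `<video>` or `<object>` markup based on the result.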

Honestly, I happen to like a lot of the concepts behind the DLR; it's really like a more polished JVM. The biggest resistance seems to stem from the fact that MS came up with the thing, but they also came up with XHR and SMB/CIFS, which are widely used. The other point is the patent issues, but MS released the Community Promise, which is more than Sun/Oracle have done. Beyond this, they've helped Mono/Moonlight as projects, and released the DLR and other code (ASP.NET MVC) as able to run under Mono.

Nobody -- not even MS -- thinks SMB was a good file-sharing protocol. It became popular only because it was the only option available for MS clients. And even MS ditched it more or less as soon as they figured out that network file systems would be the normal way of doing business rather than some transient storage used to replace sneakernet.

CIFS is a better choice, though still not ideal. It's acceptable for many user-oriented filesystem mounts, but it has several limitations (some of which can be avoided

Moonlight seems to be a solution in search of a problem. It works great with aspects of Silverlight nobody uses. And the only thing lacking in it is the ability to play the DRM video of the few Silverlight-using sites anyone actually cares about.

The only thing I ever needed Silverlight for was to watch Netflix streaming, and Moonlight didn't help any there. It's like Mono to run .NET, or Wine to run Win32; you'll get a little ways with it, just not enough to be very useful. Microsoft simply does not do cross-platform (not even to the point of releasing and then following their own standards so others can make compatible implementations). If they say they are going to, it's a ploy. Sorry to have to repeat slashdot dogma, but it happens to be true in this case.

Moonlight just can't do what the Windows version can, unless there was some huge upgrade to it in the last 6 months that I missed. If I am incorrect, then I will be downloading it tonight, but last I checked, Moonlight is not very robust compared to the Windows version. I am talking SL 1.0 support only (maybe a version that is before 2.0, but I am unsure). SL 3 is awesome, and I have not touched SL 4 yet, but the leap from a media player to something you can use as a platform similar to Flash is a big one.

Unless you let the gospel of RMS into your heart, you will burn in the fires of Hades!

He who hath heard the Good News and let it fill his soul will have taken their first steps to redemption. Every time you say "GNU/Linux", you take another step upon that path (but, watch out... if you say Linux, without the "GNU", you will fall off the path, into the waiting hands of the Ballmer Devil!)

If that Hell is a world where the middle-men have even greater control over distribution than they do now, where the first sale doctrine is an anachronism, where cultural history can be rewritten or censored as easily as deleting a file, then yeah, you are merrily skipping down that path.

Netlix never "banned" Linux. If you can get it to work with the site, great, they'd be happy for you. The problem comes in with the studios, who demand that Netflix use DRM when a user streams a video on their site. So they use Silverlight's built-in DRM API, which the studios are okay with. The only problem is that Moonlight does not implement Silverlight's DRM scheme. The details are proprietary, and although Novell has asked Microsoft for permission to use their DRM scheme in Moonlight, Microsoft has said "no." They don't want to share it, they definitely don't want it open-sourced (what's the point of an open-source DRM implementation?). This all makes sense from both parties' perspective; the only one really making a stupid mistake is Netflix, for using Silverlight in the first place. (Although I don't know whether their licensing terms played a part in that or not---in any case Flash nowadays has lots of DRM support, and would of course be a viable solution should Netflix decide to switch.)

HTML5--another in a long line of standards forcefully popularized by Apple that Apple won't get credit for when everyone takes it for granted. See also: 3.5-inch floppies, USB hardware, the "File Edit View Window Help" menu layout, and more...

I didn't say Apple invented USB. I'm saying that it wasn't until the original iMac that hardware manufacturers fully embraced the standard in order to support the new Mac, which used USB ports. At the time, the standard with PCs was still a PS/2 mouse and keyboard, a parallel port for printers, and so on, so the iMac's design was very forward-thinking.

Actually, USB is an Intel designed standard and came with the ATX board design and the BX430 chipset, also from Intel.

designing != popularizing

The iMac popularized USB because PCs at the time were still using a variety of connectors (PS/2, parallel, serial, etc.), and the situation was similar with previous Macs. Including USB as the only* external hardware connector on the first iMac is presumed to have spurred the industry to create appropriate peripherals faster. For the record, we can say the same thing about the floppy drive, which, as you may remember, the iMac also omitted.

Actually, Apple *invented* Firewire; Sony made it popular outside of the Mac world by taking the generic specification (IEEE1394) and using it on their DV cameras. But Apple did indeed popularize USB by making it the only peripheral port on the iMac, encouraging more peripheral manufacturers to support it (the iMac was pretty wildly successful when it first came out), and it was largely because of this that *every* PC manufacturer started making USB a priority over the old serial ports.

The first iMac was controversial at the time because it eschewed all previous peripheral connector types for USB ports. At the time, USB was a new standard that wasn't as widely adopted as it is today.

I'm talking about the 3.5-inch floppies that Apple was first to include in its Lisa and Macs. They were removed in the late 90s when nobody was using floppies anymore. If you're seriously arguing that 1.44MB floppies were still widely used by 2000, I don't know what to say.

Firewire was started in the mid-80s to replace parallel SCSI, nearly a decade before USB's existence. It is still the standard for data transfer between devices such as A/V equipment. Apple's been phasing it out over the years, but has always been a supporter of USB, adopting it in the original iMac to the exclusion of older keyboard and mouse connectors, forcing hardware manufacturers to support the new standard.

Firewire was started in the mid-80s to replace parallel SCSI, nearly a decade before USB's existence. It is still the standard for data transfer between devices such as A/V equipment.

Yep, most notably all DV cams and even HDV cams. But the modern cams based on AVCHD don't anymore; things get stored to an HDD/memory card, so no need for firewire's perfect realtime capture. It's just transferred as any other file...

Firewire was started in the mid-80s to replace parallel SCSI, nearly a decade before USB's existence. It is still the standard for data transfer between devices such as A/V equipment.

Firewire is great for quickly creating a *very fast* network link between two computers side by side, if you have the cable of course. In Linux just load firewire-net and you should see a firewire0 net device popping up. Gigabit ethernet is getting more common in laptops though. However I still find a lot of laptops with firewire.

Further, how is Firewire their preferred standard when every iPod and iPhone comes with a USB connector? Apple has always been the biggest supporter of USB. They even put extra USB ports on their keyboards and cinema displays, for crying out loud.

They only included the USB2 protocol and connector on the iPod when they made it Windows compatible. It was originally Firewire only, then had both for a while (it was always better with firewire on the Mac, and charged much faster too), then they dropped firewire and went USB only to make them cheaper to manufacture - only one controller to support, and you can cut out some hardware and make it smaller.

Every iPod came with a Firewire connector until 2005, and every Macintosh produced between 1995 and 2008 included a firewire port.

Which is because we're talking about USB 1.1 in those days, when the 400 Mbps that Firewire provided exceeded both USB and Ethernet. Today, in a world of USB 2.0 and gigabit ethernet, Firewire has mostly outlived its uniqueness.

WRT the floppies, you must either be joking or a kid. Long before Apple was the first to abandon 3.5" floppies, they were among the first mass market computer makers to adopt their use. When the original Mac came out, nearly every other system came by default with 5.25" floppy drives. 3.5" drives were available as options for those other systems, but the Mac was, if not the first, one of the first to have 3.5" as the built-in standard.

WRT FireWire vs. USB, I'm pretty certain (although I could stand corrected) that Apple's stance has always been that there are some things for which FW is better, and other things for which USB is better. I'm pretty sure that every Mac that has shipped with a FW port has also shipped with at least one USB port. Apple never, ever, ever tried to push anyone towards FW keyboards and mice, for example.

What's interesting is that with USB2.0--while it's still not as fast as FW400 due to its half-duplex connection--Apple has accepted that FW's benefits aren't really all that tangible outside of the professional realm. Running a music studio and need to do 32-track digital audio? Get a Mac Pro with FW800. Recording your neighborhood jam sessions with Garage Band? The USB interface on your MacBook is good enough.

I wouldn't be surprised if, once USB3.0 ships, Apple even moves away from FW800 on pro devices and just puts USB3 on everything. My understanding is that USB3 goes full duplex *and* increases to 800Mbps (though I could be wrong). If that is indeed the case, then unless there's something I'm not aware of, the benefits of FW400/800 are essentially nil.

Firewire had serious security implications which weren't particularly well advertised, and I'm not sure that I'd really know how to handle them. Part of why Firewire was faster than USB was that it had direct access to the computer's memory. I remember connecting two computers via Firewire and then connecting a peripheral to one only to have it appear on the other computer.

I assume that my memory is a bit glitchy, but that's pretty close. Firewire was great for debugging computers, but you had to be sure t

Firewire has DMA. So does eSATA. I don't see anyone whining about DMA there. In fact, they'd whine if eSATA didn't support DMA. And there are methods available on both busses to require devices to be authorized before DMA requests work.

Firewire is also a master-less system. USB can only connect one master to multiple slaves. This is why you can't connect your camera to your phone or vice versa -- both devices are set up as slaves and can only connect to a host. This also means you can't connect two computers together via USB, as they are both masters. Firewire works like SCSI or Ethernet, where all devices are peers -- any FW device can talk to any other FW device on the same bus. You can even interconnect electrical busses with relatively intelligent routing to give you multiple collision domains while maintaining connectivity among a large number of devices. This again is a feature -- if your computer, phone, and camera all had FW instead of USB you could connect them in arbitrary combinations and still have them work. You can also use FW for IP networking and other Ethernet-like functions (and in fact modern FW provides support for cat-5 connectors that automatically switch between FW and Ethernet).

Though 3.5" floppy drives had been around since 1982, they did not meet with success until the 3.5" floppy drive was chosen for the original 1984 Macintosh (quickly followed by the Atari ST and Amiga the following year). Apple was not too far ahead of their time when they killed the floppy in 1998, but they saw where things were going and made the right call-- Mac users who still really needed a floppy drive were able to buy an external one. Windows users questioned it because they weren't (really, still aren't) accustomed to being able to boot from any device with an OS on it that's connected to their computer, so floppies were their lifeline.

Though USB had been on PC motherboards beginning in 1996, nobody did anything with it until Apple put it in the iMac in 1998 and excluded all other port types. Lots of people will argue that Microsoft finally adding USB support to Windows (in Win95 OSR2) was the tipping point, but that's bull. Windows users had the option of clinging to their peripherals that used the ancient parallel and serial ports, and cling they did. iMac users had no such option, and the popularity of the iMac meant that if hardware makers wanted iMac owners' money, they had to start churning out USB-based peripherals for them.

As an aside, Firewire did not appear in a Mac until the Blue & White G3, in January of 1999. It did not appear in an iMac until the 6th revision, in October of 1999. Apple's view was that USB and Firewire were complementary... USB for low-bandwidth stuff like keyboards and mice, and Firewire for hard drives, video cameras, and other high-bandwidth devices. Intel was the one that had the apparent inferiority complex and started working on USB2, to compete. Based on my experience using both, Firewire 800 is superior to USB2, and if I have the choice between those two I'll always pick Firewire. (As for the future, Firewire 1600 and 3200 have been approved by the IEEE but aren't in any shipping product, I haven't seen a USB3 device in the wild yet, and Light Peak is a wildcard at this point.)

To sum up, Apple is the tech company that is not afraid to chop off legacy stuff at the knees, and by doing so indeed often drags the rest of the industry kicking and screaming with it.

I'm not a Mac zealot or anything (writing this from a self-assembled Linux box) but I think you're missing the point.

This is not about having USB, it's about having USB while not having serial and parallel. irDA is really small compared to the giants that are serial, parallel and USB - it matters about as much as PCMCIA.

I built my first "own" computer in 1999 and it had all the old ports. I used all kinds of parallel and serial devices and no USB at the time - had I had an iMac, I would have bought USB devices. I had a printer which ate parallel, and it's pretty obvious that I used the existing parallel port instead of buying a new one just because USB was there. Yet with an iMac I would have been forced to buy a brand new USB device.

See how this works? Hell, when I started out with that computer I used an ISA sound card I had left over from before which perceptibly slowed the entire system down with its ancient hardware communications. Good luck using such shit with the iMac even back then - it's not about having the new standard, it's about forcing it by not having what everyone used to have.

Yet again, we all benefit from the fact that Steve Jobs is an asshole. His refusal to adopt WMA or license FairPlay killed DRM in the music industry, and now his refusal to allow Flash/Silverlight is pushing Internet standards forward.

What's next? Video? Can we get a real TVoIP system to kill cable? DRM-free movie/TV purchases?

His refusal to adopt WMA or license FairPlay killed DRM in the music industry

I'm sure it had NOTHING to do with the fact that WMA and FairPlay sucked, nor a little out-of-bottle genie called Napster.

It definitely had nothing to do with FairPlay sucking. FairPlay does suck a little bit, but all other implementations of DRM suck a lot more. What Apple did was 1) create the #1 best selling portable digital music player of all time, and 2) refuse to allow music purchased from any online store but theirs to play on it. This had the effect of motivating everyone else who wanted to compete with the iTunes Store to convince the record labels to allow THEM (not Apple) to sell DRM-free music, since there was no other way for them to meet customers' demands of something that's compatible with an iPod. Once this happened, it wasn't too much of a stretch for the record labels to allow Apple to sell DRM-free music too (although Apple did have to compromise in the negotiations, and allow the record companies to set different prices for some songs).

Your out-of-bottle genie is part of the reason the record labels insisted on DRM in the first place.

Apple was a supporter of DRM-free music and music sharing/fair use long before the iTunes store or iPod revolution - remember "Rip, Mix, Burn"? From the very start they never wanted DRM, but if they wanted content to sell, they had to include it.

So, they made it as weak as possible - they included the ability *in iTunes itself* to strip off the DRM from your tracks, and encouraged you to do so every time you downloaded music. It wasn't ideal (since it required making an Audio CD, so had a transcoding loss i

I know QuickTime doesn't have the best history (though neither does any other commercial A/V system), but it's now *the* standard for MP4 A/V wrappers and supports the same sorts of features that the built-in video systems on all other platforms do, including a wide variety of easy-to-install open-source codecs and an extensible codec/wrapper framework. While it would be nice if there was one universal A/V system, such a thing does not exist, and on supported platforms QuickTime is not such a terrible choice.

It was only after Amazon took away about 10% of the MP3 market that Apple removed DRM and that was only done in order to remain competitive.

Your premise is somewhat based on the assumption that Apple has the only say in the matter. It assumes that the music companies would gleefully allow Apple to sell music with no DRM. It was always the music companies that insisted on DRM.

In fact, almost a full year before Amazon offered DRM-free music in January 2008, Steve Jobs publicly stated [apple.com] in February 2007 that Apple would sell DRM-free music if allowed. And EMI has allowed Apple to sell DRM-free tracks since May 29, 2007.

> Steve Jobs publicly stated in February 2007 that Apple would sell DRM free music if allowed.

You actually believed that?!?

Consider: If Apple wanted to sell DRM-free music, it would mean that that they wanted people to be able to play music they bought from the iTunes store on any MP3-capable device they owned. If that's the case, why has Apple spent years updating iTunes, sending cease and desist letters and filing lawsuits to prevent people from being able to do so?

Apple said in February 2007 that they would offer DRM free music if allowed. EMI allowed them in May 2007. Actions speak louder than words I guess. Amazon didn't offer it until January 2008. So technically Apple was the first to offer DRM-free music. That dispels your theory that Amazon was the leader.

Amazon was the music companies' attempt to break their dependence on Apple. The music companies created the dependence problem in the first place by insisting on DRM. They didn't think that Apple was a

If that's the case, why has Apple spent years updating iTunes, sending cease and desist letters and filing lawsuits to prevent people from being able to do so?

They were contractually obligated by record labels to make their best effort to maintain their DRM system. If they hadn't tried to keep it intact, record labels would pull their content from the store.

You're creating some revisionist history here. Jobs had been outspoken about the problems of DRM for years, and it's known that Apple created their DRM scheme, above Jobs's objections, because record labels insisted. Record labels also had Apple remove the ability to copy music off of your iPod, which was

Perhaps realizing that even longtime Windows users like myself refuse to click the "must install Silverlight" link on the few websites that have it.

The only place I have this problem is on a few streaming radio sites. In almost all cases, they have another link for the "basic player" which gives me what I wanted: audio from their station without having to install more crap.

With the exception of Netflix, I can think of no site I've been to which uses Silverlight.

I've got a friend who, despite his worthwhile attributes, really likes Microsoft software (always has). He's mentioned that "does not work with silverlight" is a big game killer for him: apparently there are a number of sites and appliances which require silverlight plugins to use, which are important to his clients and their management. IMO, that's a huge fail.

Nice. For those of you complaining about how HTML doesn't or can't do everything that Flash/Silverlight/Java can do, realize that most of that stuff is not really necessary for basic information display purposes.

Now I'm waiting to see how Silverlight+WP7 and Adobe AIR+PlayBook will pan out. If the responsiveness and capabilities can't parallel native, these interpreted OS layers will be at a significant disadvantage. However, Palm did deliver something quite great with WebOS, which was based on HTML/CSS/JS, so maybe this is the next step and most natural fit for technologies like Silverlight and AIR...

Silverlight does not go away - it will simply take the place of ActiveX as the platform of choice for "kinda Web but not really" apps in MS-centric shops. A few places like that I know are all either already using Silverlight in that role, or are seriously considering it. On the other hand, I know of few sites on the Net which serve Silverlight content to end users.

If you look at the feature set changes in recent versions (especially Silverlight 4), it seems that this is also the direction in which it is being pushed. It now has a fairly complete widget library, and not one but two (WCF Data Services client library, and WCF RIA Services) data manipulation frameworks which integrate seamlessly with ORM on the backend, support integrated Windows authentication, etc. Immensely useful for business apps, but not so much so for typical consumer stuff.

The 5 primary Desktop computers in my home run Linux. I purchase services (annual subscriptions in Microsoft speak) from the NFL/MLB/HBO and several others. They all work with Linux. They all work with my Windows Netbook, Wii, MacBook, and Linux Laptop. The producers know the product they produce is viewable with Linux and several other OS's. They get my subscription fees while Microsoft doesn't. Check it out, I'm not tied to any platform.

Thing is, I believe that some quarters (particularly those who deal with desktop software) within Microsoft honestly think that "cross platform" means "works with more than one version of Windows". Were you to walk into one of their meetings and suggest supporting a non-Windows based platform, you'd get everything from funny looks to comments along the lines of "But nobody's used DOS for years!". As far as they're concerned, you might just as well propose video streaming to a paper pad, it'd be equally ab

... of Microsoft's XML-based / GUI / animation-friendly / .NET-based vector interface technology. The beast underlying Silverlight will continue to find its widest audience in WPF on the desktop, and possibly a decent-sized user base in Windows Phone 7 -- if MS can get traction on the latter. Displacing Flash on the web has always been a pipe dream, and based on the dictates of iOS, not even a pipe dream worth so very much effort anymore.