Whew, what a week. In a week of threats, patents, and doom and gloom, we are joined by Tess again to discuss desktop computing. Is it going the way of the dodo (what with all this mobile-computing hype), or will it turn into something else? Also discussed is the ongoing H.264 minefield that THREATENS TO DESTROY US ALL *ahem* before we meander off to something more positive: HP buys Palm, assuring us that there is good in this world.

The H.264 fearmongering in this episode was a bit too much. Neither Microsoft nor Apple is forcing websites to use H.264; video sharing sites are choosing to do so. Streaming video (from servers you are paying for) is expensive. So unless the MPEG-LA goes after people who embed YouTube videos, most users have nothing to fear.

Look at the public relations nightmare the RIAA is going through over illegal music downloads. I don't think the MPEG-LA wants that type of publicity. Companies and users will have to decide whether they want to use MPEG-1 or Theora in order to avoid paying 2c/subscriber/month ... if they have more than 100,000 subscribers.
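Taking the 2c/subscriber/month figure above at face value, a rough back-of-the-envelope sketch of what such a fee could add up to (the 100,000-subscriber free threshold and the example subscriber counts are illustrative assumptions drawn from this comment, not actual MPEG-LA license terms):

```javascript
// Back-of-the-envelope sketch using the figures quoted in the comment.
// Assumption: the 2c/subscriber/month fee applies only to subscribers
// beyond a 100,000 free threshold. Fees are kept in integer cents to
// avoid floating-point rounding.
const FREE_THRESHOLD = 100000;
const FEE_CENTS_PER_SUB_PER_MONTH = 2;

function annualFeeDollars(subscribers) {
  const billable = Math.max(0, subscribers - FREE_THRESHOLD);
  return (billable * FEE_CENTS_PER_SUB_PER_MONTH * 12) / 100;
}

console.log(annualFeeDollars(500000)); // 400,000 billable subs -> 96000 ($96k/year)
console.log(annualFeeDollars(50000));  // under the threshold -> 0
```

Under those assumptions, a half-million-subscriber service would owe on the order of $96k a year, while anyone under the threshold owes nothing, which is roughly why only the big players would feel it.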

Streaming video on the internet is not free, no matter how you look at it.

It doesn't sound like you've been paying much attention. The problem is that the MPEG-LA says they can charge/sue/threaten ANYONE in the U.S. who CREATES or VIEWS any video ENCODED or PROCESSED using ANY CODEC because they claim to have patents that cover ALL modern CODECs.

The license states that if you are using the video for commercial purposes, they (MPEG-LA) will want a cut. Yes, it is draconian, but it is the (US) law. It will only stand if the majority consents to it.

On a side note: now I understand why my digital camera uses MJPEG. A few months ago I was complaining that it didn't use MPEG-4, but now I know the reason. Just as some people decided to make PNG when GIF's patent lawyers came knocking, I believe the same will happen when the MPEG-LA's lawyers do the same.

The people who should really be concerned are those who are making money from video sharing, i.e. YouTube, Dailymotion, etc. And we aren't hearing them complain. When YouTube made streaming video popular, did they complain that they had to pay licensing fees?

Video sharing has been an expensive venture from its inception. That's why venture capitalists pump millions of dollars into such websites. Just because you can easily upload a video to your favourite video sharing site for free doesn't mean that there isn't a cost.

Very few, if any, end users will pay to upload videos to the web. Video sharing sites would quickly find an alternative format or converter plug-in to install on the end-users' machine before asking their users to pay-to-share.


Microsoft has said only h.264 will be supported in IE9. Apple supports only h.264 on its mobile devices.

"Pragmatists" will go for this codec because it offers the path of least resistance: most browsers will have it out of the box.

Microsoft did not hold a gun to web developers' heads forcing them to code for IE, and we all know how that turned out.

h.264 may win because of the support it has from the "big boys", but that doesn't mean we should just keep quiet and not mention its shortcomings.

There is a reason why all these "big boys" keep mentioning "open" when talking about this codec. This codec has issues, and people should know about them so they can at least make an informed decision when choosing to use it.

Having "normal" friends and not getting along with "geeks" really reflects my experience.

The geeks tend to be extremely passionate about one thing or another, e.g. their platform of choice or how cool they are for knowing such-and-such programming language. I even sometimes hear them say incorrect things (whether they're misinformed or simply lying) to make their points, and this bothers me a lot.

But it's their aura of eliteness that bothers me the most. Why should knowing a lot about computers make you any better than someone who knows a lot about sports or cars or literature? It's just one thing. You can be proud of it but it shouldn't make you better.

To avoid looking like these people, I don't let most people know just how tech-savvy I am. Then, when I catch them giving me that elitist misinformation crap, I tear their arguments to shreds.

I'm at a residential high school and on a student computing support team. We all have school-picked tablets loaded with problems (the tablet functionality is not worth the extra hundreds of dollars; it's not even good!), and worst of all, the school gives them the most ridiculously bloated Windows XP images. When a tablet is broken beyond repair, students usually get a netbook or a big-brand Dell or HP. Aside from uninstalling their trialware, these computers are SOOOO much more pleasant to service.

No one wants to learn how a filesystem is organized or why something doesn't work. They simply want to turn in their next paper and watch videos on YouTube. Yet they keep coming back with broken computers. I would have to watch them use a computer 24/7 to really know what they're doing wrong.

While you can teach someone how to operate a particular system, like the theater system you set up for your parents, it's extremely difficult to teach someone the sense of how not to break a computer if they don't understand it already.

I've only really met three or four other tech-savvy people who are socially acceptable human beings and understand they're not better than anyone. I'm going into college for a computer engineering degree, but I'm really worried I won't be able to get along with my peers.

Largely true, but I would say that being good with computers is more like being good at math (actually, if you get down to it, it often is being good at math) than it is like knowing a lot about sports. That is to say, if you lack computing skills, it will actually inhibit your intellectual progress in a lot of fields (though, of course, there are a huge number of exceptions to this). So I don't really buy the "just make it work" attitude. There are real technical reasons why a system may require well-thought-out user input in order to function, but most people seem to ignore this because it is convenient. If someone treats their car poorly and it breaks, would you call an experienced mechanic elitist for chiding the customer (ignoring the fact that mechanics make more money when cars break down more)?

That has been my experience, too; I have some real trouble getting along with engineers, coders/programmers, "people who work with things" vs. "people persons", etc., even though very often I have a very good idea of what they are talking about. For similar reasons.

(Within the same issue, there's another article about Radio Shack moving away from the business of tinkering. Times have changed, a lot.)

I figure, though, that being somewhat in-between will give me an advantage, someday. I joked with a tech that maybe one day I'd be an I.T. Admin or some other tech manager. (And to my surprise, he actually agreed.) I seem to understand both sides, somehow, however disparate they may seem. And I'd say there's a spectrum between the extremes of hobbyist/hacker computing and "It Just Works". Sometimes it seems like a comparison of how the Norse and the Romans regarded their weapons, but y'know. Not a perfect analogy.

I've heard this point before. When you say Web-based computing can't work because the Internet isn't readily available everywhere, you forget that web apps can be cached and used offline. Chrome OS can work. I don't know about many other web apps, but most of Google's web apps cache themselves for offline use. Many of them require Google Gears right now, but Google is trying to move to HTML5. If the web moves in this direction, then web-based computing can work.
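The offline caching described above can be sketched with an HTML5 cache manifest (the AppCache mechanism the HTML5 drafts of the time defined as the successor to Google Gears; the file names below are hypothetical):

```
CACHE MANIFEST
# v1 -- resources listed here are stored locally and served even without a connection
index.html
app.js
style.css

NETWORK:
# everything not listed above still requires a connection
*
```

A page opts in with something like <html manifest="offline.appcache">; on later visits the browser serves the listed files from its local cache first, so the web app keeps working even when the internet isn't available.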

Just imagine if someone came out with a comprehensive web-based programming environment. Even programmers wouldn't need anything more than a netbook, especially if they're targeting web apps.

Web computing makes sense for in-house apps where security concerns and support costs are much higher. Most in-house software is DB related which lends itself much better to a web transition.

Google Docs may have caching, but for something like Photoshop there is only so much caching can do when the software engine is on the server. Adobe could provide a limited-functionality mini-Photoshop that resides in the browser for offline use, or they could exercise another option: keep making millions from their existing build, which offers 100% functionality when the internet is off.

An often overlooked benefit of local apps is that they make use of local hardware. I can build an application and leave it on a rinky-dink server for 100k people to download. With a web app, however, I would have to provide servers to handle all the processing that those 100k computers would normally handle.

Google has billions, so providing servers isn't an issue for them, but ISVs on a budget are not going to be excited about porting their software to the web due to the increased costs, especially if, like Adobe, they already have a local codebase that cost millions to develop.

In the business world, business people have laptops. Yes, that's right. Office and phone simply don't belong together in one sentence. Yes, it can be done, but so what? If people were to do any even remotely substantial office work on a phone, employers would get sued for damages for not supplying a proper work environment. Gosh, at my last workplace, the employer got his ass sued for not supplying an ergonomic chair!! Get a laptop if you need office work, and get on with it...