Whenever I write about Web applications, I get responses that reflect a variety of myths and misunderstandings about Web applications and the cloud. Given that it looks like we've reached the point of no return on Web applications, it's time to clear up some of the most persistent ones.

1: "Web" and "cloud" have to mean "third-party host"

People see phrases like "Web application" and "cloud-based" and immediately assume the service must be hosted outside the firewall, with the usual concerns about offline access, security, and latency. This is simply a matter of architecture. Yes, many Web applications are available only on the software-as-a-service (SaaS) model, requiring you to pay a third party to host the application. But many Web applications, especially enterprise-class applications, can be deployed internally to your own Web servers. More and more vendors even provide pre-configured virtual machines that you can drop into your data center and run. And if you are building your own application, you can of course deploy it to your own servers.

The same is true for the cloud. When folks hear "cloud," they assume third party. But the power of the cloud is not that it is third party; the power is that the resources are abstracted and virtualized to the point where no one knows or cares how many physical or virtual servers are involved, and that those resources can expand and contract dynamically as the load requires. There are many private cloud options, both third-party hosted and in house, that give you the benefits of the cloud without requiring that your data sit anywhere near anyone else's or that traffic from outside your organization can even reach it.

2: No Internet access = broken application

Another common argument I hear against Web applications is that when the network connection is down, so is the application. If you are talking about an application that requires constant calls to the server (AJAX callbacks or full postbacks), this is true; but there are ways to write applications that do not require constant Internet access. For example, you can write a Web-delivered application in which the initial page load brings down all of the resources (HTML, CSS, JavaScript, images, etc.) and all of the processing is handled by client-side JavaScript. HTML5 steps up the game further with offline caching, so changes can be stored locally and synced when the server is available again. As for applications that genuinely require always-on access to a server (say, to pull data from a database), they would be dead in the water without a connection even as locally installed desktop applications.
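The offline pattern described above can be sketched as a simple write queue (the class and names here are illustrative, not from any particular library): edits accumulate locally while the server is unreachable and are flushed in order once connectivity returns, much like an HTML5 application would do with localStorage.

```javascript
// Sketch of an offline-first write queue. Pending changes pile up locally
// while the connection is down and are synced in order when it returns.
class OfflineQueue {
  constructor(send) {
    this.send = send;      // function that delivers one change to the server
    this.pending = [];     // stands in for localStorage in this sketch
    this.online = false;
  }
  save(change) {
    if (this.online) {
      this.send(change);          // connected: write through immediately
    } else {
      this.pending.push(change);  // offline: queue the change locally
    }
  }
  setOnline(isOnline) {
    this.online = isOnline;
    if (isOnline) this.flush();   // connectivity restored: sync the backlog
  }
  flush() {
    while (this.pending.length > 0) {
      this.send(this.pending.shift()); // FIFO order preserves edit sequence
    }
  }
}

// Usage: collect edits while offline, then sync when the connection returns.
const delivered = [];
const q = new OfflineQueue(function (c) { delivered.push(c); });
q.save({ id: 1, text: "edited offline" });
q.save({ id: 2, text: "another edit" });
console.log(q.pending.length); // 2 changes queued while offline
q.setOnline(true);             // simulates the connection coming back
console.log(delivered.length); // 2 changes synced, queue drained
```

In a real application the queue would live in localStorage so the backlog survives a page reload, and the `send` function would be an AJAX call.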

3: Web UIs are clunky

It is historically true that HTML was not so hot at UIs. HTML5 has helped by introducing elements that have many more capabilities and are data-type aware. This lets developers do things that were previously possible only with plugins (which may not be universally available) or with a ton of effort, and it lets browsers present input controls appropriate to the data type. Add on top of this the components from companies like Telerik and frameworks such as jQuery, and the Web UI model does not look so shabby.
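As a small illustration (a sketch, not an exhaustive list), HTML5 defines data-type-aware input elements; a supporting browser can supply its own date picker, numeric spinner, or email validation, while older browsers gracefully fall back to a plain text box:

```html
<!-- HTML5 input types: supporting browsers render type-appropriate
     controls; unsupported types degrade to type="text". -->
<form>
  <input type="email"  name="contact"  placeholder="you@example.com">
  <input type="date"   name="shipDate">
  <input type="number" name="quantity" min="1" max="99" step="1">
  <input type="range"  name="rating"   min="0" max="10">
  <input type="search" name="query">
</form>
```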

4: JavaScript

JavaScript myths could be its own article, but I'll just run down some of the more common ones I encounter:

JavaScript is based on Java. "JavaScript" was a marketing term; when Netscape put the language (originally called LiveScript) into its browser, Java was a hot buzzword, so it was renamed. JavaScript and Java both have a C-style syntax, and that is about the only similarity.

JavaScript requires a Java Runtime Environment (JRE) on the client. JavaScript is a dynamic, interpreted language that does not require a Java bytecode interpreter; it requires a JavaScript interpreter, which comes baked into every major browser.

JavaScript is slow. Yes, JavaScript engines are slower than native code, even with the just-in-time compilation that modern engines perform, but not so slow that users notice in common use cases. Doing common JavaScript tasks does not crush system resources. Most benchmarks that suggest otherwise are incredibly unrealistic.
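To make that concrete, here is a rough sketch, illustrative only and subject to the same caveat that micro-benchmarks rarely reflect real applications, timing a common client-side task: sorting a large list of records.

```javascript
// Illustrative timing of a common UI task: sorting 100,000 records by name.
// Numbers vary by engine and machine; this is a sketch, not a benchmark.
const items = [];
for (let i = 0; i < 100000; i++) {
  // Scramble the names so the sort has real work to do.
  items.push({ id: i, name: "row " + ((i * 7919) % 100000) });
}

const start = Date.now();
items.sort(function (a, b) {
  return a.name < b.name ? -1 : a.name > b.name ? 1 : 0;
});
const elapsed = Date.now() - start;

// Even in an interpreted language, this completes in a fraction of a second.
console.log("sorted " + items.length + " rows in " + elapsed + " ms");
```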

JavaScript is hard to manage/test/etc. This is as true for JavaScript as it is for Perl, PHP, Python, Ruby, and other dynamic languages. Much of the difficulty is caused by developers who are not rigorous about sound programming process and technique, though it is true that some things make JavaScript development more of a challenge.

5: "Standards" and browser differences

Without a doubt, the big eyesore in Web development is Internet Explorer, and there is a dark place in most developers' hearts for IE6. Fortunately, IE6 is ancient history, except at a small number of big companies that are hung up on it and among users running copies of Windows XP or Windows 2000 so old that Windows Update is turned off.

More recent versions of IE have made great strides toward playing nicely with standards, and the upcoming IE10 looks extremely promising on this front. When IE's compatibility mode was introduced, it gave Microsoft the much-needed freedom to stop carrying IE's worst habits forward.
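For reference, a page can ask IE8 and later to skip compatibility mode entirely and render with the most recent engine available, using a single meta tag (shown here as a minimal sketch):

```html
<!-- Ask IE to render with its newest engine instead of compatibility mode. -->
<meta http-equiv="X-UA-Compatible" content="IE=edge">
```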

Most importantly, HTML5 will be the first version of HTML that is truly testable: the spec is specific enough that you can verify whether a browser is standards compliant. With these tests, and a renewed effort from vendors (particularly Microsoft) to get their browsers into shape, the days of the standards wars are drawing to a close.