What is a tier, anyway? What can we learn from games? In this article, software development expert Tyson Gill discusses various strategies for architecting applications.

The content in this article is a mix of original material and material adapted from Visual Basic 6: Error Coding and Layering (Prentice Hall, 2000, ISBN 0-13-017227-8) and the upcoming book Planning Smarter: Creating Blueprint-Quality Software Specifications (Prentice Hall, 2001).

Back in the "olden days," when relational databases first started
gaining widespread use, we wrote standalone front-end applications. Although
we didn't know it at the time, we were writing single-tier applications.
What this term means is that all the code is in a single application, including
the user interface logic, the business rules, and the database management.
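To make the idea concrete, here is a minimal sketch of a single-tier program in Python (standing in for the VB front ends of the era); the orders table, discount rule, and figures are hypothetical, invented only to show all three concerns living in one executable:

```python
import sqlite3

# A single-tier program: user-interface logic, business rules, and database
# management are all compiled into one application on each user's machine.

def apply_discount(total):
    # Business rule baked into the client: changing it means redeploying
    # the program to every user.
    return total * 0.9 if total > 100 else total

def main():
    db = sqlite3.connect(":memory:")                    # database management
    db.execute("CREATE TABLE orders (id INTEGER, total REAL)")
    db.execute("INSERT INTO orders VALUES (1, 150.0)")
    (total,) = db.execute("SELECT total FROM orders WHERE id = 1").fetchone()
    print(f"Amount due: {apply_discount(total):.2f}")   # user-interface logic

main()
```

Everything here ships together, which is exactly what made these applications painful to maintain.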

As these applications gained in popularity, it became evident that they were
too difficult to maintain. Large companies might have dozens or even hundreds
of users running the applications. Every time a business rule changed, the program
had to be modified, recompiled, and redistributed to all those people. At that
time, that meant some pitiful slob walking around the building running installs,
reconfiguring, and troubleshooting, while all the time the users' work was interrupted.
It got to be intolerable.

To mitigate this problem, efforts shifted to two-tier applications. By expanding
the capabilities of stored procedures in the database, business logic could
be moved there. That meant that stored procedures could be stored and maintained
in one place. If they needed to be changed, they could be modified without having
to recompile and re-deploy new program versions.

This architecture helped, but soon proved too limited:

As systems started to support larger numbers of users and the complexity
of stored procedures grew, the load on the database became a limiting factor.

SQL is not a general-purpose programming language. As stored procedures were pushed to
do procedural things normally handled by powerful programming languages,
the stored procedures became difficult to write, difficult to debug, and
difficult to maintain.

Portability became an issue. Moving to a new database required rewriting all
the stored SQL code, which was perhaps the majority of the complex code in the
application.

To deal with these issues, three-tier applications became the accepted standard.
In a three-tier application, you backed away from putting all the
business logic in stored procedures. Instead, you moved the business logic into
code written in a programming environment such as Visual Basic, and deployed
that code as middle-tier components (DLLs). These business objects could reside on the database server
or on entirely separate servers. This architecture allowed the developers to
distribute services and load on three separate machines. It provided the benefit
of a thin front-end program, a dedicated database, and business code, written
in a powerful language and maintainable in one place on the server side.
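The separation can be sketched in a few lines. This is an illustrative Python sketch (the order data and discount rule are hypothetical), not the design of any particular product; each layer talks only to the one below it, and the business rule lives in exactly one place on the server side:

```python
# A three-tier split: presentation, business, and data concerns each live
# in their own layer, deployable to separate machines.

class DataTier:                      # runs on the database server
    def __init__(self):
        self.orders = {1: 150.0}     # stand-in for real tables

    def get_total(self, order_id):
        return self.orders[order_id]

class BusinessTier:                  # middle-tier component (a DLL, in VB terms)
    def __init__(self, data):
        self.data = data

    def amount_due(self, order_id):
        total = self.data.get_total(order_id)
        return total * 0.9 if total > 100 else total   # hypothetical rule

class PresentationTier:              # thin front-end program on the client
    def __init__(self, business):
        self.business = business

    def show(self, order_id):
        print(f"Amount due: {self.business.amount_due(order_id):.2f}")

PresentationTier(BusinessTier(DataTier())).show(1)
```

Changing the discount rule now means updating one middle-tier component, not reinstalling every client.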

It would seem as though this strategy should have met all our needs. However,
as Internet front-end programs became more common, the number of potential users
jumped once more. When that happened, even these three-tier architectures were
not enough. The middle tier couldn't handle the load.

The solution was to move to n-tier architectures. An n-tier architecture
is basically a three-tier architecture with any number (n) of middle-tier servers.
Those servers can reside at different locations. Now the workload can be distributed
in various ways across those servers; for example, by business function. Also,
fault tolerance can be increased by having redundant servers in the middle tier.
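One way to picture both benefits is round-robin dispatch across redundant middle-tier servers: distributing requests spreads the load, and skipping servers that are down adds fault tolerance. This is a minimal sketch with made-up server names, not how any real dispatcher of the period was implemented:

```python
import itertools

# Round-robin dispatch across redundant middle-tier servers.

class MiddleTierServer:
    def __init__(self, name):
        self.name = name
        self.up = True

    def handle(self, request):
        return f"{self.name} handled {request}"

class Dispatcher:
    def __init__(self, servers):
        self.servers = servers
        self._cycle = itertools.cycle(servers)

    def dispatch(self, request):
        # Try each server at most once per request, skipping failed ones.
        for _ in range(len(self.servers)):
            server = next(self._cycle)
            if server.up:
                return server.handle(request)
        raise RuntimeError("no middle-tier server available")

servers = [MiddleTierServer("mt1"), MiddleTierServer("mt2"), MiddleTierServer("mt3")]
dispatcher = Dispatcher(servers)
servers[1].up = False                     # simulate a failed middle-tier server
print(dispatcher.dispatch("order #1"))    # mt1 handles it
print(dispatcher.dispatch("order #2"))    # mt2 is down, so mt3 handles it
```

The same idea generalizes to partitioning by business function instead of rotating blindly.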

Of course, there is a lot more to the story, but that's the bare bones. Now
we've extended it further by not only scaling out our middle tier, but also
our database tier, as well as providing intelligent load balancing and other
services.

So far, we've ignored the Internet, and the Internet cannot be ignored. It
has had and continues to have profound impact on application architectures.

So let's look back at the influence of the Internet. One of the many reasons
the Internet took off so rapidly was that browser-based applications further
reduced the maintenance requirements of front-end programs, and also solved
connectivity issues through the World Wide Web. Because the front end was
browser-based rather than VB-based, there was no need to do any installation
or configuration. Installation and configuration have always been a major headache,
even for a thin VB client. Therefore, when you had potentially tens of thousands
of users, the browser was the only practical solution.

For a while, it looked like VB applications would soon be extinct and every
application would be browser-based in a very short time. Although browsers didn't
offer the kind of power and flexibility that VB applications provide, the benefits
far outweighed the limitations for most customers. Browser developers scrambled
to add more capability, and the entire world basically accepted the limitations
of browser-based applications as a temporary condition.

Some developers and some marketers still think that all applications today
need to be browser-based. However, the technical landscape has continued to
shift. One of the main motivations for browser-based applications was ease of
deployment. However, in the last few years Windows has gotten better about installation
and configuration. It's going to get better yet. Terminal Services and Citrix
offer a very efficient alternative for some applications. Finally, Active Update
programs have dramatically lowered the maintenance barriers. Communication across
the Web is now becoming transparently simple for standalone applications.

One great example of using these alternate approaches to architect sophisticated
Internet-enabled applications is online gaming. The game guys are always way
out at the leading edge of the wave. Everquest is one example of such a game.
It's a heavy-duty client/server application running across the Internet. In
Everquest's 3D world, thousands of players interact in real time. This application
has an extremely heavy front-end application, consisting of over half a gigabyte
of data and code on the local machine alone. Yet all of the critical information
is stored and processed on the server.

This game simply could not have been developed within a browser. There are
more than 300,000 Everquest subscribers; every time they run the program, it's
updated automatically over the Web. Modifications are made almost daily, and
all 300,000 people routinely get updated. If the designers of Everquest can
accomplish this level of maintainability and performance, virtually any
intensive application can follow this same model.

This illustrates the reality that you needn't think only of browser-based applications
when you think of the Internet. There are other viable options, including Terminal
Services–based applications, running inside or outside of a browser; installed
applications running across the Web; even Internet-installed and Internet-updated
applications.

The reasons that got us thinking only about browser-based solutions have already
gone away. In the near future, the Microsoft .NET platform will make it even
easier to copy heavy applications to client machines over the Web, without
having to perform installations, and will further simplify Web connectivity.

Which Architecture Should You Choose Today?

Evaluate your application needs prior to deciding on an architecture. The first
question to ask is whether the application needs to be Internet-enabled. If
the answer is yes, ask whether it's a casual or intensive application. If it's
casual, think about a browser-based solution. If it's intensive, consider a
shipped or Web-deployed Internet-aware application or a Terminal Services solution.
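Those first questions can be sketched as a tiny decision helper. The function and its return labels are hypothetical, added only to summarize the flow above, not a prescription:

```python
def recommend_architecture(internet_enabled, intensive=False):
    # Coarse decision guide: Internet-enabled? If so, casual or intensive?
    if not internet_enabled:
        return "standalone application"
    if intensive:
        return "installed Internet-aware app or Terminal Services"
    return "browser-based application"

print(recommend_architecture(internet_enabled=True, intensive=True))
```

In practice the answer is rarely this binary, as the gray area discussed below makes clear.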

The distinction between casual and intensive is important. It's not meant to
suggest a value judgment, but rather usage patterns and expectations. Casual
applications are used only occasionally; for example, Web commerce applications
such as online booksellers or trading sites. Users don't use these applications
intensively or as part of the job. They don't want these sites to download and
install custom components on their systems, and likewise don't expect functionality
beyond what's offered by the browser.

An intensive application is a business application with heavy requirements,
used regularly by employees or clients to perform demanding operations such
as data entry, analysis, and reporting. Clients expect sophisticated
custom interfaces to help them perform tasks efficiently. Managers expect to
provide tools that optimize their workers' productivity, not merely tools that
make the task feasible in a browser.

While Internet Explorer is powerful, for example, it's unlikely ever to offer
the power and flexibility of a standalone application. It's tremendous for casual
applications, but often requires developers and users to make significant compromises
in the design of more intensive business applications. Therefore, why spend
a lot of time and effort and realize marginal results trying to force intensive
functionality into a browser? There are alternatives.

On the other hand, why incur the overhead of supporting installed applications
when the user would rather not leave the convenience of the browser for casual
operations? The browser is clearly the best choice for casual applications.
Other techniques may be better for more intensive applications. Of course, there
is a wide gray area within which the choice is not clear. Time to market and
the likelihood of expanding requirements are two possible deciding factors.

When we think of hosted applications, we tend to think of browser-based or Terminal
Services-based applications. However, hosted applications can be designed using any of these
architectures. Everquest is essentially a hosted application for subscribers,
but it uses an entirely different model based on an Internet-enabled standalone
application.

The bottom line is that there are a lot of good choices for developing Internet-enabled
applications. The range of options keeps growing rather than narrowing to a single approach.
Just when it looks like one approach will dominate, different technologies emerge
to compete. The trick is picking the right technology for the right application
at any given time.