This excerpt is from Chapter 1, The Security Frontier, of Defending the Digital Frontier, written by Ernst & Young LLP, Mark W. Doll, Sajay Rai and Jose Granado; and published by John Wiley & Sons, Inc.


In the introduction to Part One, the digital frontier was described as virtual, borderless and highly dynamic. By implication, its environment is fluid rather than concrete, and transitory rather than fixed. Although such terms lend understanding in the abstract, they are less helpful when trying to quantify the frontier. Therefore, we offer the following operational definition of the digital frontier: It is the forward edge of technological impact with respect to organizations' usage of technology and their reliance upon it for day-to-day operations to achieve marketable productivity improvements (see Figure 1.1).

It is important to understand the difference between the "bleeding edge" of technology and the digital frontier. Although both sit at the forefront of innovation relative to the majority of business organizations, there are several significant distinctions between them. Companies investing in so-called bleeding-edge technologies are driven in part by the desire to adopt the latest technology for experimental purposes. Companies investing to the edge of the digital frontier, by contrast, carefully adopt the latest and best technology available based on its utility and performance, because usage and adoption are critical to productivity gain.

Just as settlers pushing the boundaries of the American West redrew the maps to show later explorers where the old frontiers had ended and which areas were still open for development, Figure 1.2 shows the four clearly distinguishable eras on the continuum of digital technology. These eras are defined by their architecture, and all were pushed forward by companies whose executive management understood that there must be a direct correlation between digital investment and operational productivity. Those executives knew that a high degree of both usage and reliance is what puts an organization squarely in the digital frontier, and their companies are the ones that have traditionally held, and still hold, the competitive advantage in the marketplace. The eras shown in Figure 1.2 can be described as follows:

Mainframe: This era, characterized by highly centralized systems and closed architecture, marked the advent of the digital age, beginning with the development and use of the Electronic Numerical Integrator and Computer (ENIAC), unveiled in 1946. Mainframe systems evolved but remained the platforms of choice until the mid-1980s.

Client/server: This second major shift along the digital frontier ushered in the concept of distributed information, private users and decentralized systems. This has proved to be an enduring structure and is still in widespread use today, with adaptations for more advanced technology.

Internet: The concept of a highly decentralized, open-architecture system that connected widely distributed users had been in use for a decade or more in the form of the original Advanced Research Projects Agency Network (ARPANet) of the U.S. Department of Defense, which connected academic and military research institutions. However, in the 1990s, this network was opened for public access and its usage increased exponentially. By the end of that decade, reliance upon it had become ubiquitous for business and non-business-related usage. Companies that had previously lagged behind in terms of technological innovation went online in the 1990s.

Mobile: The fourth wave of innovation, whose effects began to be felt in the business world at the start of the twenty-first century, is the era of wireless communication via highly decentralized, open-architecture systems. This technology is reshaping the digital frontier as wireless moves from the bleeding edge of technological innovation to the leading edge of the digital frontier. Organizations have begun to study its utility and to gauge how reliant upon it they may become. The key factor in how ubiquitous this technology becomes will be its ability to significantly increase productivity without increasing risks to the organization's security framework.
