Network Tools Curtail Federal Data Chaos

Network management as a stand-alone discipline is starting to play a critical role in federal data processing initiatives.

"Systems integration -- and by extension network management -- is a big problem with all agencies with systems at more than one site," says Bob Deller, principal analyst with Vienna, Va.-based Global Systems and Strategies.

Along the way, government network administrators are being asked to maintain security and eliminate redundancy while simultaneously extending internetwork access to a growing population of power users.

For systems integrators, solving this problem represents a multibillion-dollar opportunity, especially for high-end projects, such as the Joint Computer-Aided Acquisition and Logistics Support program, which requires integration of multimedia applications (voice, data and graphics), as well as secure multivendor systems and interoperability between agencies and private sector contractors.

Service agencies with many offices throughout the country, such as the Internal Revenue Service and the Social Security Administration, are under excruciating pressure to do a better job of sharing data and maintaining up-to-date information on distributed databases on new client/server architectures. And in the Pentagon, efforts to harmonize disparate networks within the armed services and between purchasing units and trading partners are creating similar requirements. (See related article on page 24).

Consequently, Deller expects a larger percentage of the federal government's systems integration budget, which will grow from $3.6 billion in 1996 to $4.6 billion in 2001, to be allocated to network management.

"Every agency appears to have some emphasis toward modernization, [which] really means responding to the mandate from the National Performance Review to perform better at lower cost. That means developing common ways of processing. That means doing something once instead of 40 or 50 times. That means integrating all of your networks," he says.

"As government agencies move to client/server architectures, they [will] be exposed to distributed environments. When they start dealing with distributed data and replication, and being able to keep things in sync with accurate data, network management becomes key," he says.

Thus, larger professional services companies, including CSC, expect to get a significant amount of work over the next few years building centralized network management systems that will allow government administrators to monitor highly heterogeneous and geographically dispersed networks from a single location.

The absence of such a system creates a complicated and labor-intensive dilemma. The average federal computing environment supports a minimum of six platforms, including Sun workstations, NT and HP UNIX servers, Novell LANs and legacy mainframes. Without a centralized management system, network administrators must allocate specialists for each platform. Further, each platform requires subspecialists in the various network management functions, such as security, disaster recovery and data backup.

"If you don't centralize and facilitate network management, you [will go] from machine to machine checking the same things over and over again," says Paul Swanson, security architect for the Defense Information Services Agency, which operates a network that supports as many as 23 different e-mail systems on various hardware and software platforms.

But in these post-Cold War days, even the growing federal information technology budget has its limits, and government agencies are finding that customized engineering solutions -- commonly implemented 10 years ago -- are too expensive today. As a result, commercial network management solutions are finding a new, vibrant market in the federal government.

At DISA, for instance, Swanson is monitoring security using a commercial, off-the-shelf product called Unicenter from Islandia, N.Y.-based Computer Associates. The product bundles an integrated suite of network management functions onto one system. It provides network administrators with a common approach to managing security, access control, disaster recovery, storage management and other network management functions across different platforms.

"Before we installed Unicenter last year, we were using the standard utilities of stand-alone UNIX boxes to enforce our security policies. We were literally going from box to box to box," says Swanson.

"With Unicenter, I have one screen on one console where I have all the information that I deem important piped from all of the other machines. I don't have to watch every single machine physically. It makes it easy for me to have a policy-based security system instead of relying on the file-based security that native UNIX has," he says.
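The shift Swanson describes, from walking box to box to watching one policy-driven console, can be illustrated with a minimal sketch. The hostnames, check names and results below are invented for illustration; this is not Unicenter's API or DISA's actual configuration.

```python
# Illustrative sketch only: rolling per-machine security checks up into a
# single report, in the spirit of a centralized management console.
# Hostnames and check names are hypothetical placeholders.

def collect_status(hosts):
    """Gather each host's security findings into one report keyed by host."""
    report = {}
    for host, checks in hosts.items():
        # Flag any host whose checks violate the central policy.
        failures = [name for name, ok in checks.items() if not ok]
        report[host] = "OK" if not failures else "ALERT: " + ", ".join(failures)
    return report

# Simulated results that would be piped in from each machine,
# instead of an administrator logging in to each one by hand.
hosts = {
    "unix-box-1": {"password_aging": True, "world_writable_files": True},
    "unix-box-2": {"password_aging": False, "world_writable_files": True},
}

for host, status in sorted(collect_status(hosts).items()):
    print(f"{host}: {status}")
```

The design point is the one Swanson makes: the policy (which checks matter, what counts as a failure) lives in one place, and the machines only feed it data, rather than each machine enforcing its own file-based rules.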

Besides DISA, CA has installed the Unicenter product line at several defense sites at home and abroad. The distributed network management product line is CA's No. 1 seller in its $200 million federal operation.

"We currently have dozens of Unicenter implementations in the DoD, including in Bosnia, the European Command, the Warriors Prep Center and the Space Command," says Mike Miller, senior vice president of CA's federal operations.

But despite the availability of tools and expertise to the federal IT community, a serious question remains: Is the leadership wherewithal available to adopt a comprehensive, governmentwide approach to network management that can tame the chaos created by the proliferation of client/server environments and the need to interoperate?

"The main problem with the government [in terms of systems integration and network management] is that there is no single engine driving the government train. Each agency appears to be pursuing its own strategy," says GSS' Deller.

"You see this reflected with the IRS' tax modernization program today. If you pick up any recent [General Accounting Office] report on the tax system modernization, it will tell you that the IRS has still not corrected management and technical weaknesses that revolve around systems integration. The Veterans Administration's Veterans Benefits Program is suffering the same sorts of problems," he says.

That Was Then...This Is Now

It used to be so simple, a systems administrator at one of the largest federal bureaucracies nostalgically explains. In the not-so-old days, before the client/server revolution changed everything, data-processing jockeys at the major government agencies gathered, stored and doled out mission-critical information in an orderly manner from huge, imposing mainframes.

In those days, users were mostly gray-collar technicians rather than white-collar knowledge workers. Their modest data acquisition and processing requests -- from "dumb terminals" no less -- were consistent with their clerical status. Managing the limited number of users was a straightforward process. Application requirements rarely changed. And securing these closed proprietary systems was a cinch.

But that was then and this is now.

Today, spurred by many modernization initiatives, the entire federal government is moving mission-critical applications away from legacy systems toward a client/server environment that has been embraced by the commercial marketplace for years. The orderly hierarchical management schemes that prevailed in the mainframe age are collapsing into chaos, as local area networks manufactured by a variety of vendors pop up like mushrooms in even the most sensitive government agencies.
