NEWS FEEDS

The RCJ provides RSS
feeds from well-respected news organizations, giving
our readers a convenient
portal through which to stay abreast of world
events and issues. Use the links provided. The
following are on the RCJ Front Page Report homepage
(scroll both columns to the right).

CCJ Publisher Rick Alan Rice dissects
the building of America in a trilogy of novels
collectively called ATWOOD. Book One explores
the development of the American West through the
lens of public policy, land planning, municipal
development, and governance as it played out in one
of the new counties of Kansas in the latter half of
the 19th Century. The novel focuses on the religious
and cultural traditions that imbued the American
Midwest with a special character that continues to
have a profound effect on American politics to this
day. Book One builds an understanding of
America's cultural foundations that is deepened
in Books Two and Three, which trace
the historical-cultural-spiritual development of one
isolated county on the Great Plains that stands as
an icon in the development of a certain brand of
American character. That's the serious stuff viewed
from high altitude. The story itself gets down and
dirty with the supernatural, which in ATWOOD
- A Toiler's Weird Odyssey of Deliverance is the
outfall of misfires in human interactions, from the
monumental to the sublime. The
book features the epic poem "The
Toiler" as
well as artwork by New Mexico artist Richard
Padilla.

Elmore Leonard Meets Larry McMurtry

Western Crime Novel

I am
offering another novel through Amazon's Kindle
Direct Publishing service. Cooksin is the story of a criminal
syndicate that sets its sights on a ranching and farming
community in Weld County, Colorado, in 1950. The
perpetrators of the criminal enterprise steal farm
equipment, slaughter cattle, and make off with the personal
property of individuals whose assets have been
inventoried in advance and distributed through a
vast system of illegal commerce.

It is a ripping good
yarn, filled with suspense and intrigue, designed
intentionally to pay homage to the type of
creative works being produced in 1950, when the
story is set. Richard
Padilla has done his usual brilliant
work in capturing the look and feel of a certain
type of crime fiction produced in that era.
The whole thing has the feel of those black & white
films you see on Turner Classic Movies, and the
writing will remind you a little of Elmore Leonard,
whose earliest works were westerns.
Use this link.

EXPLORE THE KINDLE
BOOK LIBRARY

If you have not explored the books
available from Amazon.com's Kindle publishing
division, you would do yourself a favor to browse them. You
will find classic literature there, as well as tons
of privately published books of every kind. A lot of
it is awful, just as a lot of traditionally published
books are awful, but some are true classics. You
can get the entire collection of Shakespeare's works
for two bucks.

Amazon is the largest,
but far from the only, digital publisher. You can
find similar treasure troves at NOOK Press (the
Barnes & Noble site), Lulu,
and others.

TECHNOLOGY

Musical Flying Robots

Playing the James Bond theme... It is
sort of disorienting to think how clever we are becoming.

The Very Onion of
Internet Anonymity

Grows as a Tool
of Revolution

Tor,
the U.S. Government-funded software for concealing the point of
origin and destination of an Internet data packet, and thus the
identity of its sender and receiver, was downloaded 36 million
times last year. You can download it, too, for free, just by
going to the "Tor Project" website.

Tor routes encrypted
packets of data through a series of computers, each of which
peels off a layer of encryption as the data is received and
passed on to the next computer in the chain until finally it
reaches its intended destination fully unencrypted. This is
referred to as onion encryption for obvious reasons.
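The peeling process can be sketched in a few lines of Python. This is purely illustrative: Tor actually uses real ciphers inside TLS-protected circuits, and the relay names and XOR "cipher" here are toy assumptions, not Tor's protocol.

```python
import hashlib

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "cipher": XOR against a keystream derived from the key.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

def wrap(message: bytes, relay_keys: list) -> bytes:
    # The sender adds one layer per relay, innermost (exit) layer first.
    for key in reversed(relay_keys):
        message = xor_layer(message, key)
    return message

def unwrap(packet: bytes, relay_keys: list) -> bytes:
    # Each relay in turn peels off exactly one layer.
    for key in relay_keys:
        packet = xor_layer(packet, key)
    return packet

keys = [b"entry-relay", b"middle-relay", b"exit-relay"]  # hypothetical keys
onion = wrap(b"hello world", keys)
assert onion != b"hello world"                 # the layered packet is opaque
assert unwrap(onion, keys) == b"hello world"   # fully peeled at the end
```

The point of the layering is that no single relay sees both who sent the packet and where it is ultimately going.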

The Tor
Project Website describes the software this way: "Tor
is a network of virtual tunnels that allows people and groups to
improve their privacy and security on the Internet. It also
enables software developers to create new communication tools
with built-in privacy features. Tor provides the foundation for
a range of applications that allow organizations and individuals
to share information over public networks without compromising
their privacy... Using Tor protects you against a common form of
Internet surveillance known as 'traffic analysis.' Traffic
analysis can be used to infer who is talking to whom over a
public network. Knowing the source and destination of your
Internet traffic allows others to track your behavior and
interests."

Initially, in 1995,
it was the Navy and U.S. spy agencies who were interested in
developing the Tor software. While there is still U.S.
government funding for its ongoing development, Google and Human
Rights Watch are now its biggest backers.

Besides Department
of Defense agencies, users of this software include journalism
groups and freedom-of-speech advocates such as WikiLeaks.
The Tor Project argues that there is a need to protect our
Internet identities and activities, and it offers a range of
related projects.

Among the many
interesting aspects of the Tor Project is the extent to which a
DoD-developed product has become an indispensable tool in wars
against government intrusion and oppression. One suspects that
we are entering an age where perhaps we should all be a little
more careful with our information, not that anybody is
watching... - RAR

Quick Hits

Anonymous Shuts Down Sites
Over SOPA

The hacktivist group
Anonymous launched its "largest attack ever" Thursday,
claiming credit for a coordinated takedown of websites
managed by the Department of Justice and organizations
supporting controversial antipiracy legislation. The attack,
dubbed “Operation Payback,” came in response to Thursday's
news that the Justice Department had shut down massive
file-sharing site Megaupload. The attack also temporarily
brought down the websites of the Recording Industry Association of
America, the Motion Picture Association of America and
Universal Music, among others, in retaliation for their
support of antipiracy legislation in Congress, known as SOPA
and PIPA.

The takedown of Megaupload,
and the arrests of its CEO and several execs, sent
shockwaves through the online community Thursday. An
indictment accused the company, which is one of the world's
most popular file-sharing sites, of costing copyright
holders at least $500 million in lost revenue.

"The raid on Megaupload
Thursday proved that the feds don’t need SOPA or its sister
legislation, PIPA, in order to pose a blow to the Web,"
Anonymous said in a statement posted to its website.

"In a world where govts [sic]
just keep on pushing their malicious agendas, we're no
longer ready to play nice. We do not forgive!" said a post
from one of Anonymous' Twitter handles.

The statement also said that
Anonymous was planning another attack - this time on the
White House's website, whitehouse.gov. One Anonymous
operative, Barrett Brown, told the Russian news service RT
on Thursday that more attacks were coming and the group
plans to “damage campaign-raising abilities of remaining
Democrats who support SOPA.”

According to other reports,
Anonymous’ attack also included the websites of the US
Copyright Office and the site for BMI, or Broadcast Music,
Inc., which collects license fees from businesses that use
music and distributes them as royalties to songwriters.

Top 10 Online Backup Services
- An online backup service
allows you to easily and automatically back
up important documents and files in the
background of your computer without
monopolizing system resources.

"Cloud computing is a model for
enabling convenient, on-demand network access to a shared pool of
configurable computing resources (e.g., networks, servers, storage,
applications, and services) that can be rapidly provisioned and released
with minimal management effort or service provider interaction." - National Institute of Standards and
Technology (NIST)

By RAR

So now we have all watched the
Microsoft "To the Cloud" commercial, in which a
couple stranded in an airport kills time by accessing recorded videos from their
home computer via Windows Live. For them
"the cloud" is a thing that transports them from their present reality to some
place conveniently distracting, and like most convenient distractions in modern
life this one is online. The "cloud", for all practical purposes, is "the
Internet", but more specifically certain types of
application programs offered from Web sites on the Internet. Calling
the use of the Internet "going to the cloud" sounds a lot more romantic, and while
the branding is a marketing confection, the reference is to a specific type of
Web-based software that resides online, usually independent of the user's
computer. (Some cloud computing software does require some installation on the
user's C drive.) The graphic explanation above came from Wikipedia.

Microsoft has been at the
forefront of "cloud computing", which encompasses any
subscription-based or pay-per-use service that is delivered in real time over
the Internet. If you and your kids have been playing
Xbox LIVE to participate in multiplayer gaming, you have been
visiting "the cloud" for some time now thanks to Microsoft's innovations, though
they have tons of company on this frontier of Web-based enterprise.
Facebook comes to mind, for the social network is
all in "the cloud". Sony is there with their PlayStation. Apple
is building a $1 billion data center in Maiden, North Carolina for their iCloud
products, which gives one some idea of the anticipated server capacity
requirement for services as robust as those imagined for the Cloud, which comes
with huge retrieval and storage demands.
Google, Amazon, even stuffy old Hewlett
Packard, have committed to this much-hyped vision for how computing
power will be utilized in our daily lives: beyond the firewalls of our personal
computers and local area networks.

Software
as a Service:
At the heart of cloud computing is the "Software as a
Service" (SaaS) model, under which a single application, designed as
a multitenant architecture, provides some combination of packaged services. The
management is largely server side, meaning that IT groups and/or Managed Service
Providers (MSPs) handle all of the maintenance and operations details, while the
users, who pay fee-for-service or transaction charges, benefit from the
functions provided without having to own the application.
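The multitenant idea can be sketched minimally as a single shared service whose every query is scoped to a tenant. The service, tenant names, and fields below are hypothetical, not any vendor's API:

```python
# Toy multitenant SaaS service: one shared application instance,
# with per-tenant data isolation. Tenant and field names are hypothetical.
class TimesheetService:
    def __init__(self):
        self._rows = []  # (tenant_id, employee, hours) in one shared store

    def record(self, tenant_id, employee, hours):
        self._rows.append((tenant_id, employee, hours))

    def report(self, tenant_id):
        # Every query is scoped by tenant_id -- the core of multitenancy.
        return [(emp, hrs) for t, emp, hrs in self._rows if t == tenant_id]

svc = TimesheetService()        # a single instance serves all customers
svc.record("acme", "ann", 8)
svc.record("globex", "bob", 6)
assert svc.report("acme") == [("ann", 8)]  # acme never sees globex rows
```

One application, many customers: the provider maintains the single code base while each tenant sees only its own data.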

SaaS has made early inroads
into the corporate world with products that handle basic non-critical human
resources functions, such as time-sheet recording and management of employee
files. Customer Relationship Management (CRM), the concept used by sales
forces to capture data about their clients and their purchase histories, has
been another early offering of the SaaS era. Backup and storage services have
been early entrants as well.

The table below provides an
overview of cloud computing services with examples of providers currently
available.

Cloud Computing Services

Software as a Service (SaaS): A single application
delivered through the browser to many customers through the use of a
multitenant architecture.

The list above was compiled
by the SaaS Chronicles Web site,
which is an excellent source for information on the SaaS industry.

Cloud Makers

The scientists who made the Cloud possible

A brief
time out to respect the history of our subject. The use of "(World Wide) Web
browsers" to access information through the Internet is still in its
infancy, as human endeavors go.

The
"Internet" - a system of interconnected computer networks - has existed
since the 1960s, when the first wide-area packet-switched networks
were developed. Early U.S. networking research began
around 1962 at the Advanced Research Projects
Agency (ARPA), a research organization for the
Department of Defense, under the direction of
J.C.R. Licklider.
Licklider was quite a visionary, imagining a "Galactic Network" linking
computer users around the globe.

Al Gore
began his first term as a Congressman from Tennessee in 1977 and began his
political career in the House of Representatives by introducing a bill
calling for the construction of a "data highway." Ten years later, as Sen.
Al Gore (D-TN), he sponsored the
Supercomputer Network Study Act, a
project which called for a mapping of the information needs of the general
public onto the existing networks run by various universities, corporate
research facilities, and military technology centers.

It
was only in 1989 that British engineer and computer scientist Sir Tim Berners-Lee
proposed a World Wide Web
system of interlinked hypertext documents
that would be available through the Internet. Only one year later, at CERN
in Geneva, Switzerland, Berners-Lee and
Belgian computer scientist Robert Cailliau
proposed the development of "nodes", i.e., Web sites, that
would be linked using "HyperText" so that an Internet user could "browse at
will".

In
1990, the National Science Foundation,
in collaboration with private commercial interests, moved to expand the "U.S.
backbone" of Internet infrastructure through the development and
implementation of the standard "Internet
Protocol Suite" (TCP/IP). That allowed an array
of government and commercial sector computer networks to be connected in a
robust and fault-tolerant network for the efficient sharing of vital
information.

In 1991,
Gore wrote an article for Scientific American,
"An Infrastructure For The Global Village," that further promoted the vision
of the Internet as a mass-media communication tool. Gore was either reading
the gestalt or showing scientific judgment and intuitive skills, because
commercial operations quickly adapted their businesses to the new
communication model. The Internet, and particularly the World Wide Web, grew
exponentially, until only 20 years later it feels like something that must
always have been a part of our lives.

It was not
that long ago that we "surfed" the Web, which implied a first time visit, a
brief skimming of what all is there. Now we "browse", which is a reflection
of what the Internet, and particularly the Web, has become: a resource, very
close to what its developers had envisioned.

"The World-Wide Web was
developed to be a pool of human knowledge, and human culture, which would
allow collaborators in remote sites to share their ideas and all aspects of
a common project." -
Wardrip-Fruin, Noah and Nick Montfort, ed (2003). The New Media Reader.
Section 54. The MIT Press.

Cloud computing is a "next wave" concept in which the vision is
a bit out ahead of its current capabilities.

Where Linklider, Berners-Lee
and Cailliau had envisioned a network infrastructure and a means for accessing
documents, the promise of cloud computing is tied to the promise of integrating
free standing cloud computing applications. This is the new frontier that has
not yet been civilized. At present, cloud computing is provided in a range of
stand alone service offerings, but aggregator and integrator software is coming
into the marketplace and it won't be long before going to the cloud may mean
functionalities heretofore unimagined.

Cloud computing, at its
ultimate, is the dream of many powerful applications working efficiently in
unison in a virtual environment to produce deliverables we may not have even
thought of yet. So here we are in the realm of the visionary entrepreneur, for
the combined power of Internet technology, human knowledge, and entrepreneurial
imagination is about to reveal a remarkable new level of information sharing and
management.

For energy
efficiency, the cloud means that
data center facility managers may be able to reduce the number of servers that
they must operate, along with the amount of energy used to cool their server
environments. While that may not sound like a big deal, it has been projected
that by 2020, given current practices, 15 percent of total global emissions of
greenhouse gas (GHG) will come from Information and Communication Technology (ICT)
solutions. In 2011 alone, US data centers consumed twice the amount of power that
they consumed the previous year.

Large companies adopting the cloud can reduce
energy consumption and carbon emissions by 30 percent, and small businesses by up to
90 percent. For example, a shift in IT management that moves applications off
in-house servers and onto 1,000 cloud servers would cut carbon emissions in that
sampling by 50 percent. That is equal to the carbon emissions produced by
261 homes or 444 cars traveling down the highway, and it would represent a carbon
offset equivalent to planting 5,810 trees.

For
consumers, the cloud means that
you can use free Apps from Google to develop and share documents, rather than
buying the Microsoft Office Suite. That is only the tip of the iceberg in terms
of describing the alternative universe being opened to consumers, and the
options this provides through the delivery of Apps.

On the service side, cloud computing offers
businesses customer relationship management capabilities that have the
potential to deliver more efficient and effective services.

Among the consumer savings and benefits are those
derived from the "Smart Meter" technology that allows utility customers to
monitor the energy usage in their homes to reduce overall consumption and to
help them schedule usage at the most affordable times of the day.

For
businesses, cloud computing means lower overheads and potentially
greater efficiencies. This is already happening: companies operate without
having to own servers; they use Google for their email, Salesforce.com for
sales, marketing, support and billing; hosted accounting software; Skype for
audio and visual communications; and Google spreadsheets and docs for resource
scheduling and business planning.

As important as any of that
is that cloud computing allows companies to have consultants
strategically placed throughout their service areas and to manage their
activities using the same means used to manage personnel and resources in the
home office. That is huge!

Cloud computing is redefining
the role of the Chief Information Officer (CIO),
the IT professional who once managed only the IT infrastructure but who now
becomes a manager of multi-platform internal and external services. A recent
Society for Information Management (SIM) study revealed that the average
tenure of a CIO in a major corporation is only 4.4 years, and it offers this
amusingly pitiful explanation of the usual employment arc:

“The first year as CIO
is the honeymoon. The second year is about strategy and planning, and the
third year is about implementing. In the fourth year they (the higher-ups)
figure out that the execution isn’t going that well, and in the fifth year,
you start looking for your next job.”

Another study conducted by IDG Research Services
on behalf of CA Technologies quizzed 200 IT managers in the U.S. and Europe to
learn that most see their value to their organizations being derived in the
immediate future by their management of the IT supply chain. Services that
were once managed in house by other department heads will become the province of
the CIO as those services are moved to cloud-based delivery.

Security has been and will continue to be a big
problem for cloud-based services, which has prompted some businesses to try
in-house cloud environments.

For
software developers, the advent of
cloud computing offers an entirely new market that has found an immediate niche
for enterprise-class applications. NetSuite Inc., for instance, is providing a
SuiteCloud development platform, which has been embraced by many of the top
enterprise-class companies (e.g., Ariba, Citrix Systems, Concur, Yammer,
Callidus Software) and independent software developers (ISVs) (e.g., SuiteApps).

NetSuite's SuiteCloud is a comprehensive offering
of cloud-based products, development tools, and services designed to help
customers and commercial software developers take advantage of the significant
economic benefits of cloud computing. The complete SuiteCloud offering includes
NetSuite's multi-tenant, always-on SaaS infrastructure; the NetSuite Business
Suite of applications for Accounting/ERP, CRM and Ecommerce; and comprehensive
development tools to create cloud-based business applications on top of NetSuite.

For casual
users, the Apple iCloud, announced
only in the past week (June 6, 2011), allows users to store music, photos,
applications, documents, iBooks and contacts. It also hosts Apple's email
and calendar services and provides each account holder with 5GB of free storage.
Purchased music, apps and books and the Photo Stream service do not reduce
this free space. Any music files purchased via iTunes are automatically
downloaded to all registered devices, e.g. iPods, iPhones and computers, and when a
user registers a new device all iTunes content can be automatically downloaded.
Apple is also offering a subscription service called iTunes Match, which
allows customers to scan and match tracks in their iTunes music library,
including tracks copied from CDs or other sources, with tracks in the iTunes
Store. Apple will let customers download up to 25,000 tracks in 256 kbps AAC
file format that match tracks in any audio file format in the customers'
collection. Any music that is not available in the iTunes Store must be uploaded
to iCloud, though which formats are allowed to be uploaded is unclear.

Apple's iCloud commitment has some kinks to work
through, particularly around the Apple Lossless (ALAC) audio format, which
current users of iTunes software may already have chosen to use for all or some
of their collections for quality reasons. The issue is the controversial
"Digital Rights Management" (DRM) coding put in place to protect copyright
material from file sharing. In announcing the iCloud, Apple said that 128kbps
AAC files from the iTunes music store with DRM will automatically be converted
to the 256 kbps non-DRM versions for free, but iCloud support for the ALAC files
is "up in the air" as of this date.

Music is an enormous driver of cloud computing
services, and while Apple iCloud is about to swamp the market there are
currently alternatives available that offer impressive benefits.

The hottest of these is the free music streaming
service Spotify. The service offers a large music library of mostly new
artists and some unique benefits, such as the ability to play all night - handy
for a party where requests for certain tracks might otherwise exceed the range
of your personal library - and for musicians to upload their tunes.

CONCLUSION: Depending upon how it is
managed, the cloud may be nothing more than an advanced business model for goods
and services promoted, sold and distributed online, where the operation can be
tightly controlled by the provider. Or perhaps it will be more: a spigot that
can be opened to receive a better quality of life through integrated control of
our calendars, our resources, perhaps even our environment. (5-12-11)

______________________

Computing Basics

Under the Hood

In computing, data is defined in various ways
depending upon its use and purpose, but it is always the most primitive form of
some qualitative or quantitative variable or set of variables.

Data is brought
together, through processing, to be formed into usable
Information.
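That distinction can be shown in a few lines of Python; the readings here are made-up sample data:

```python
# Raw data (primitive values) processed into information (a usable summary).
readings = [21.5, 22.0, 21.8, 22.4]   # made-up sensor data

average = sum(readings) / len(readings)
information = f"{len(readings)} readings, mean {average:.1f}"
assert information == "4 readings, mean 21.9"
```

The list of numbers by itself is data; the summarized statement a person can act on is information.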

The table below discusses aspects of computing in
terms of fine- and coarse-grain characteristics. Fine-grain activity takes place
behind the scenes, within the operations of a computer system. Coarse-grain
activity is that with which we interact directly: program applications,
hardware, infrastructure. If you are having a hard time understanding how cloud
computing works, this table provides definitions and context for the
discussion above.

Level

Function

Description

Examples and Usage

Fine Grain

Storage

Electro-magnetic: Hard
disk and tape systems

Backup

Database

Information stored for access, management and update

Organizational approach:

§Relational database, tabular, can be reorganized and accessed in a
number of different ways.

§Distributed database can be dispersed or replicated among different
points in a network.

§Object-oriented programming
database is congruent with the data defined in object classes and
subclasses.

Information

Data

The term data refers to
qualitative or quantitative attributes of a variable or
set of variables.

Data is considered
"quantitative" if it is in numerical form and "qualitative" otherwise;
qualitative data includes text as well as photographs, videos, sound
recordings, etc.

Levels of Abstraction:

§Level 1: Data

§Level 2: Information

§Level 3: Knowledge

In programming
languages, data is classified into "types" – e.g., floating-point,
integer, or Boolean – which define the possible values for each, the
operations that can be done on values of that type, and the way values
of that type can be stored.
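A quick Python illustration of how a type fixes both the legal values and the legal operations (using nothing beyond the language's built-in types):

```python
# Each value carries a type that fixes its legal values and operations.
price = 9.99        # floating-point
count = 3           # integer
in_stock = True     # Boolean

total = price * count             # arithmetic is defined for numbers
assert isinstance(total, float)   # float * int yields a float
assert in_stock and count > 0     # Boolean operations and comparisons

# The type also rules operations out: strings do not support subtraction.
try:
    "abc" - "c"
    unsupported = False
except TypeError:
    unsupported = True
assert unsupported
```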

Programmers categorize
Data Types as:

§Algebraic

§Function

§Machine

§Object

§Pointer and reference

§Primitive

Process

An
instance of a program running in a computer; associates sets of data

Example:
Printing is a Process that compiles and formats Data into output
that is then Information.

§Processes may share special knowledge of code (such as a lower-level
program interface)

Security

Protection of
information assets through technology, processes, and training.

Anti-virus software,
procedure policy

Management/ governance

Management of the
availability, usability, integrity, and security of the data employed in
an enterprise

§Data governance (DG)
program includes a governing body or council, a defined set of
procedures, and a plan to execute those procedures.

§GRC (governance, risk management and compliance)
software allows publicly-held companies to integrate and manage IT
operations that are subject to regulation. Such software typically
combines applications that manage the core functions of GRC into a
single integrated package.

Testing

Verifying functionality in meeting objectives

Software development:

§Unit Level

§Module Level

§Component Level

§System Level

Coarse Grain

Application

Program designed to
perform a specific function directly for the user or another application
program

§Word

§Excel

§PowerPoint

Platform

Operating system on which program applications are run

On
personal computers, Windows 2000, Mac OS X

On
enterprise servers or mainframes, IBM's S/390.

Infrastructure

Physical hardware
connecting computers and users

Infrastructure includes:

§Transmission media, including telephone lines,
cable television lines, and satellites and antennas

§Routers, aggregators, repeaters, and other
devices that control transmission paths

§Software used to send, receive, and manage the
signals that are transmitted

______________________

BACKGROUND:

Definition of:
enterprise networking from the Encyclopedia at
www.pcmag.com

The networking infrastructure in a large enterprise with multiple computer
systems and networks of different types is extraordinarily complex. Due to the
myriad of interfaces that are required, much of what goes on has little to do
with the real data processing of payroll and orders. An enormous amount of
effort goes into planning the integration of disparate
networks and systems, managing them, and planning again for yet more
interfaces as marketing pressures force vendors to develop new techniques that
routinely change the ground rules.

Application
Development/Configuration Management
There are a large number of programming languages and development tools for
writing today's applications. Each development system has its own visual
programming interface for building GUI front ends and its own fourth-generation
language (4GL) for doing the business logic. Programmers are always learning
new languages to keep up with the next generation of tools.

Traditional programming has given way to programming for graphical user
interfaces and object-oriented methods, two technologies with steep learning
curves for the traditional programmer.

Programming managers are responsible for maintaining legacy systems in
traditional languages while developing new systems in newer languages. They must
also find ways to keep track of all the program modules and ancillary files that
make up an application when several programmers work on a project. Stand-alone
version control and configuration management programs handle this, and parts of
these systems are increasingly being built into the development systems
themselves (see
configuration management).

Database Management
Like all software, a database management system
(DBMS) must support the hardware platform and operating system it runs on. In
order to move a DBMS to another platform, a version must be available for the
new hardware and operating system. The common database language between client
and server is SQL, but each DBMS vendor implements its own rendition of SQL,
requiring a special SQL interface to almost every DBMS.
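The client-side view can be sketched with Python's built-in sqlite3 module. The table and data are hypothetical, and the comments note one of the dialect differences the passage describes:

```python
import sqlite3

# In-memory database standing in for a server-side DBMS (hypothetical table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                 [(19.5,), (42.0,), (7.25,)])

# Common SQL is portable across vendors...
(total,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
assert total == 68.75

# ...but details diverge by dialect: SQLite, MySQL and PostgreSQL paginate
# with LIMIT, while SQL Server expresses the same query with TOP.
(top,) = conn.execute(
    "SELECT amount FROM orders ORDER BY amount DESC LIMIT 1").fetchone()
assert top == 42.0
```

The core statements travel between engines, but edge syntax like pagination is exactly where the "special SQL interface" per vendor comes in.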

Database administrators must select the DBMS or DBMSs that efficiently process
the daily transactions and also provide sufficient horsepower for decision
support. They must decide when and how to split the operation into different
databases, one for daily work, the other for ad hoc queries. They must also
create the structure of the database by designing the record layouts and their
relationships to each other.

Operating Systems
Operating systems are the master control programs that run the computer system.
Single-user operating systems, such as Windows and Mac, are used in the clients,
and multiuser network operating systems, such as Windows NT/2000, Unix and
NetWare, are used in the servers. Windows is the clear winner on the desktop,
but Windows and Unix compete with each other for the server side.

The operating system sets the standard for the programs that run under it. The
choice of operating system combined with the hardware platform determines which
ready-made applications can be purchased to work on it.

Systems programmers and IT managers must determine when
newer versions of operating systems make sense and plan how to integrate them
into existing environments.

Communications
Protocols
Communications protocols determine the format and rules for how the transmitted
data are framed and managed from the sending station to the receiving station.
Exchanging data and messages between PCs, Macs, mainframes and Unix servers used
to mean designing networks for a multiprotocol environment. Today, most
enterprises have migrated their proprietary protocols (IBM's SNA, Apple's AppleTalk, Novell's IPX/SPX,
Microsoft's NetBEUI) to the Unix-based TCP/IP protocol, which is the transport
of the Internet.
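A minimal TCP/IP exchange over the loopback interface can be sketched with Python's standard socket module. This is a toy echo server, not any particular enterprise configuration:

```python
import socket
import threading

# Toy TCP echo over loopback: the TCP/IP stack handles framing,
# sequencing, and reliable delivery between the two endpoints.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _addr = server.accept()
    conn.sendall(conn.recv(1024).upper())  # echo the request, uppercased
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
server.close()
assert reply == b"HELLO"
```

Both endpoints speak TCP/IP here; the migration the passage describes meant replacing each proprietary protocol stack with this one common transport.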

LANs
Transmission from station to station within a LAN is performed by the LAN access
method, or data link protocol, which is typically Ethernet. As traffic expands
within an organization, higher bandwidth is required, causing organizations to
plan for faster Ethernet connections (from 100 Mbps to 1,000 Mbps to 10,000
Mbps).

Repeaters, bridges, routers, gateways, hubs and switches are the devices used to
extend, convert, route and manage traffic in an enterprise network.
Increasingly, one device takes on the job of another (a router does bridging, a
hub does routing). Over the years, vendor offerings have been dizzying.

Network traffic is becoming as jammed as the Los Angeles freeways. Network
administrators have to analyze current network traffic in light of future
business plans and increasing use of Web pages, images, sound and video files.
They have to determine when to increase network bandwidth while maintaining
existing networks, which today have become the technical lifeblood of an
enterprise.

WANs
Transmitting data to remote locations requires the use of private lines or
public switched services offered by local and long distance carriers and
Internet providers. Connections can be as simple as dialing up via modem or by
leasing private lines, such as T1 and T3. Switched 56, frame relay, ISDN, SMDS
and ATM offer a variety of switched services in which you pay for the digital
traffic you use. With Internet access, you typically pay a fixed amount per
month based on the total bandwidth of the connection.

Laptop use has created a tremendous need for remote access to LANs. Network
administrators have to design LANs with a combination of remote access and
remote control capability to allow mobile workers access to their databases and
processing functions.

Network Management
Network management is the monitoring and control of LANs and WANs from a central
management console. It requires network management software, such as IBM's
NetView and HP's OpenView. The Internet's SNMP has become the de facto standard
management protocol, but there are many network management programs and options.
For example, there are more than 30 third-party add-ons for HP's popular
OpenView software.

Systems and Storage
Management
Systems management includes a variety of functions for managing computers in a
networked environment, including software distribution, version control, backup
& recovery, printer spooling, job scheduling, virus protection and performance
and capacity planning. Network management may also fall under the systems
management umbrella.

Storage management has become critical for two reasons. First, there is an
ever-increasing demand for storage due to the Internet, document management and
data warehousing as well as increasing daily transaction volume in growing
companies. Secondly, finding the time window in a 7x24 operation to copy huge
databases for backup, archiving and disaster recovery has become more difficult.

Electronic Mail
Most earlier proprietary mail systems have given way to Internet protocol-based
e-mail; however, some still remain within the enterprise. No matter which mail
system is used, keeping the network safe from virus-laden attachments and
preventing it from overloading because of spam is an ongoing challenge.

The Internet and
Intranets
As if everything above is not enough to keep the technical staff busy, the World
Wide Web came along in the mid-1990s with the force of a tornado, and nothing in
the IT world would ever be the same. Today, the Internet sets many of the
standards, and the browser has become an interface for accessing just about
everything. Every component of system software from operating system to database
management system, as well as every application on the market, was revamped in
some manner to be Internet compliant. Today, almost every new application deals
with the Internet in some manner.