In a conventional highway design model, the data consist of points and lines that define the outline of the planned works. Other information critical to successful completion, such as design data and specifications, is held separately. In a typical engineering application, data is not only stored in different locations; it is often linked via a human or paper intermediary. This segregation occurs because, in the common file-based data management environment, there is no direct link between a drawing and a document: changes in one may not be reflected in the other. This state of affairs has been accepted historically because established design processes evolved over the years to produce a final paper output, with documents and drawings as separate entities.

In a typical large highways project under the Early Contractor Involvement model, systems such as Business Collaborator (BC) share information among the parties. However, in a typical application the data set held on BC is incomplete: BC is often used to issue information to team members rather than as an information or knowledge repository. The effect of this approach is to create duplication between data stored in the collaboration system and data held internally on file servers. This duplication may increase further inside the design organisation as different teams or groups maintain their own file systems. Multiple asynchronous copies of information may be held (and relied upon) that are not subject to a consistent update policy.

This lack of compatibility means there is no “single source of truth” for project information: in many cases it is necessary to know first what is being looked for and where it may be stored. This increases the workload to locate information and introduces a risk that it may not be found, or that when found it is (dangerously) out of date. A user cannot be fully confident of having discovered the true and complete information needed to complete a task. A secondary effect is that it is often easier to go direct to the source of the information, and the result is uncontrolled information. Additional workload is created within teams by making and responding to these requests, and multiple requests may be made over time for essentially the same information.

Should information need updating, it is not possible to be confident that all versions on the system have been updated, and team members consequently have low confidence that information on the system is current. The output may therefore be subject to additional and unnecessary validation and control stages. Authors may hold back information in which they have little confidence but which is still useful, creating further inefficiency.

Where out-of-date information is held on the system, rework may result when new information comes to light or the error is corrected. A culture of making do emerges in which tasks are commenced even though faith in the information is reduced, or it is known that input data will be subject to change. To allow work to advance, assumptions are made that may themselves require correction at a later date. Hidden contingency is built in to cover these assumptions, increasing base costs and extending project time.

Within many organisations the culture of “design iteration” remains embedded as it is implicitly accepted that information will change through the design cycle and that the design will be changed several times in response to this changing information. Supported by better information and better management of that information it should be possible to deliver better designs, earlier and cheaper, by eliminating wasteful iteration, risk and rework.

To adopt new ways of working and managing information, we must first recognise that deliverable- and document-focused systems are based on a paper model, in which information is collated into paper-based documents such as reports and drawings. Information-based methodologies eclipse paper and focus on delivering the right information to the right person at the right time. How information is delivered and consumed can vary with the receiver, so it is not necessary to maintain information in different forms just because its consumers have different needs. Indeed, information kept in different forms risks omissions when it is updated, if one of the forms is overlooked.

Much current technology for managing and indexing information is based on the assumption that information arrives in paper form or a paper analogue (e.g. email). In fact, what is essential is that the information can be found by those needing to know it. Such information is better found, not from the paper mindset of looking through all likely documents, but through search.

Traditional filing systems are based on a paper methodology, with information collated into folders (files) and browsed as a paper analogue. The division into files can be arbitrary and at odds with how information is sought by the user. Within such a system, vertical navigation is straightforward, but horizontal navigation between folders is difficult: the analogy would be a 40-floor building in which moving from the east wing to the west wing required first taking a lift down to the lobby. If the constraint of paper is broken and the appropriate technology adopted, it becomes possible to obtain information by whatever method, for example full-text search, is best and most effective.
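The contrast between folder browsing and full-text search can be sketched with a minimal inverted index in Python. The file names and contents below are invented for illustration; the point is that a search crosses folder boundaries directly, with no trip "via the lobby".

```python
from collections import defaultdict

# Hypothetical project documents; in a folder-based system each sits in a
# separate directory and the user must know where to look for it.
documents = {
    "drawings/D-001.txt": "carriageway alignment and kerb line for junction 4",
    "reports/R-017.txt": "pavement design specification for the carriageway",
    "specs/S-203.txt": "drainage specification for junction 4 earthworks",
}

def build_index(docs):
    """Map each word to the set of documents containing it."""
    index = defaultdict(set)
    for name, text in docs.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

def search(index, *terms):
    """Return documents containing every term, regardless of folder."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return sorted(set.intersection(*sets)) if sets else []

index = build_index(documents)
print(search(index, "junction", "specification"))
```

A query for "junction" and "specification" finds the relevant document in `specs/` without the user knowing, or caring, which folder it was filed in.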

It is clear, therefore, that paper-based approaches to information management, in a world in which almost all project information is either electronic or convertible to electronic form, create an unnecessary overhead. Search-based approaches will locate information more thoroughly and more efficiently. We are now used to the internet and Google; would we gladly give up Google to use a library instead? By continuing to use deliverable- and document-focused systems, as opposed to information-focused ones, this is in effect what we force teams to do.

An effective information system must provide assurance as to the quality of the information being delivered. The current role of document controller on a project will evolve into that of information manager, so that the system is properly managed and controlled.

There are many great websites that provide generic best practice information security tips for the workplace. However, employers need to be aware of two major risks of asking employees to rely on them for their security awareness.

The first risk is ensuring that your employees visit one of the good websites, rather than falling foul of one of the ‘lesser’ sources. This is simple enough to solve – send your staff an email listing the information security websites that you approve of. Job done!

The second risk isn’t so simple to address. Your organisation is unique, with its own specific processes, procedures and information types. It may even draw unique cyber threats that other industries and organisations don’t have to contend with. Unfortunately, any best practice that your employees draw from generic security websites is unlikely to be fully applicable to these unique aspects of your organisation.

For example, generic websites can talk about the dangers of phishing, but they can’t talk about the specific dangers of spear phishing attacks that are unique to your industry or organisation. Generic sites can talk about how ‘sensitive information’ should be encrypted when copied onto storage media or transported on laptops, but they can’t define what ‘sensitive information’ means in the context of your organisation.

Benefits of the specific source

Many organisations are addressing this second risk by bringing the source of security best practice in-house. This ensures that employees have fast access to a comprehensive portal that covers the breadth of required information security awareness. In most cases this is achieved by way of a distinct information security micro-site held within their existing intranet framework.

This delivers the immediate benefit of allowing you to tailor all information security best practice to your organisation, making it fit for purpose for the work your employees do and the way that they do it. The types of information can be discussed within the context of the organisation’s own information classification system. All handling procedures can refer specifically to organisation processes. The unique risks of the industry or organisation can also be addressed, with relevant real life case studies providing additional weight.

Compiling an in-house resource also provides many other advantages. The content can be re-tasked for your employee information security awareness training sessions. It can also become the central information hub from which organisation-wide information security communications campaigns are run. No matter how campaign messages are conveyed to employees – whether by posters, presentations, plasma screen animations or quick-guides – the information security micro-site is always cited as the first port of call for further information.

Building an information security portal

Naturally there are many factors that contribute to a successful information security portal. Two key priorities are to plan a clear information hierarchy and aim for maximum build flexibility.

Getting the information hierarchy right plays a huge role in the success of the project. If users have trouble finding what they want to know, you run the risk that they’ll try to find it through a web search, which takes them outside your control. Information security is a complex topic, and a clear information hierarchy not only makes topics easy to find, it also helps employees see how the various topics inter-relate. This makes the entire subject seem much more accessible and therefore easier to put into practice.

Build flexibility gives your site the longest possible shelf-life and makes it a highly versatile communications tool. As with any website, users will return if they feel it is a dynamic source of valuable information. Home page flexibility in particular allows you to tailor the site to specific information security awareness campaigns. You should also ensure that the information hierarchy allows for growth over time, for example as new threats emerge or as new processes are introduced to the organisation.

Before embarking on a portal project, it’s a good idea to ask a cross-section of your employees what they would like to see and what would help them most. Although many will almost certainly provide generic answers, look closely at the way they are responding. This is an excellent opportunity to test the temperature of your organisation’s attitude to information security. If a large proportion of your staff members have no opinion, it could indicate that they aren’t that interested in handling their work securely – something that certainly needs to be addressed.

Anyone who is regularly online will have seen it more than once, if they’re really interested in Social Media Networking, they’ll have seen it tens of times over the last few years: Big Brother. Stories, articles, essays and a whole mess of scaremongering about what happens each and every time a person logs onto the Internet. Someone, somewhere is watching over them, peeking over their shoulder and following each and every move whilst they are surfing. They know what has been purchased on Amazon, what is searched for on Google, each status update on Facebook and Twitter. The curtains may have been drawn and the door locked, but no one is ever alone on the Internet.

In Europe and the United States there is a great deal of legal pressure on politicians – not so much from the public as from civil rights organizations and the like – to limit the ability of some web sites to gather information. Much has been written about Facebook and Google gathering information, and opinions diverge: the information is entered voluntarily, so be it! It is, however, much more than that.

The Internet is the biggest potential marketplace ever. The discussions might be about markets such as China and the United States, about emerging markets and First and Third World markets, but none of them compares to the potential of the Internet, because the Internet brings every single country together, almost into one melting pot, and puts all the possibilities for exploitation at anyone’s fingertips. Not necessarily in a bad way – not all exploitation is bad – but in a way which could define how the market evolves, what offers are made and how web sites and online stores are designed and geared up for the customer of the future.

In short, someone out there is gathering information on you and your habits.

Most of the information being gathered is harmless. It is information individuals have entered themselves – such as on Facebook – and information on what is needed, desired or enjoyed – such as by Google, Yahoo, Bing and any other search engine one might care to mention. It is information about what has been bought – where else can Amazon get its recommendations from, other than from individual buying habits?

And the rest of the information?

The rest is a gathering of individual surfing habits. Which web sites have been visited and how long has the visitor stayed there? Where did they come from and where did they go? Which page did they land on and which search words did they use to get there?

What would happen if a single person or a company could use all this technology at their fingertips to see what each person does on other sites? What if they could set up a little bit of spying software on another site and see whether someone visits when that site has no other connection to them?

This has happened here from the moment a link was made to this site. Not in a bad way, but everyone visiting this page has been checked by others. They’ve been checked by Google (Google Analytics), by Alexa, by Facebook. Even if the visitor doesn’t have a Facebook account, they’ve been checked and the visit logged.

Why and how?

Why? Facebook is a site which gathers all manner of information to advance its own advertising strategy. A person doesn’t need to be registered for Facebook to be of interest to it: Facebook wants to know what interests them, to build up a global picture of what is popular and what is on the way out. Each time there is a Facebook symbol on a web site, even if no one presses Like, the visitor has been seen and the visit noted. The page has loaded in a browser, and the Like button has been loaded direct from Facebook.
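The mechanics can be sketched in a few lines. When a page embeds a third-party widget, the browser's request for that widget carries a Referer header (the page being read) and any cookie the third party has previously set. The header names below are real HTTP headers; the values and the logging function are invented for illustration, a sketch of what a tracker could record per widget load.

```python
from datetime import datetime, timezone

def record_widget_load(headers, log):
    """Sketch of what a third-party widget server can log per request.

    The visitor never clicks anything; merely loading the page that
    embeds the widget triggers this request to the third party.
    """
    log.append({
        "page_visited": headers.get("Referer", "unknown"),   # the embedding page
        "visitor_id": headers.get("Cookie", "no cookie"),    # returning visitor?
        "when": datetime.now(timezone.utc).isoformat(),
    })

tracker_log = []
# Invented example of the headers a browser would send when loading a button:
record_widget_load(
    {"Referer": "https://example.com/article-about-gardening",
     "Cookie": "uid=abc123"},
    tracker_log,
)
print(tracker_log[0]["page_visited"])
```

Repeated over thousands of embedding pages, such a log links one cookie value to an entire browsing history, which is exactly the behaviour the tracking-blocker counts described below are measuring.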

How do webmasters know when others are hot linking to their photographs and images? The visit, on another web site, has been logged and, eventually, evaluated.

And when a person thinks that they’ve only been surfing safe sites? Think again.

A few days ago I installed a new tracking checker on my personal system. It tells me how many other companies are watching my every move, how many spies there are out there. I went through my normal surfing routine, a little bit of Twitter, a touch of Facebook, some StumbleUpon, a hint of Google+ and a few sites with adult content. The result after only two days, that is perhaps seven or eight hours of actual surfing from one web site to another, was seven hundred and sixty-eight hits by Facebook alone.

Let’s get one thing right out of the way: in the majority of cases Facebook, and all the others tracking, do not know who an individual is. They can’t put a name to their activities, or a face. That is, unless they happen to be logged in to Facebook while surfing elsewhere. Unless they happen to still have the Facebook cookie saved in their computer cache. Facebook and others can see where a person is on the Internet, where they’ve been, which country they are in and, probably, also which area from the IP address, but they don’t know who an individual is.

Is this a bad thing, this gathering of information for marketing purposes?

Perhaps there will indeed come a time when Minority Report – the film with Tom Cruise – is not just a threat but a reality; a time when a person’s features can be recognized from afar and advertising adapted to their needs and interests. At the moment it is all limited to offers made when someone logs into the web sites of their choice, based upon the information they’ve given up voluntarily. But some of that information is already being used to influence other people in their buying choices.

Who hasn’t seen the little addition on Amazon: people who bought this book also bought…

This is the thin edge of the wedge; this is just the beginning. This is the information other people have put into a web site being used to influence you, the visitor. It’s one thing to say that an item might interest you based on what you’ve purchased before, but quite another to have recommendations based on what other people have looked at or bought.

And it is also a simple fact of life which cannot be avoided. I may well have been able to block over three thousand tracking attempts during my few hours of surfing, but did the blocker catch all of them? More to the point, aside from Facebook, who is tracking me? The Big Bad Wolf is not an advertising company checking on who has been looking at its banners or pop-ups. The Big Bad Wolf is the tracking companies who gather information, press it all together and then sell it to others – the anonymous, faceless people we have nothing to do with. Are they just marketing companies, or is the government, any government, hiding behind them? Has the CIA found me, or you, and decided to track our movements because a web site we visited published a photo of someone? Or MI6, because of a comment posted about Kate Middleton’s figure?

Enough of the scaremongering. To be honest, there is not a great deal about this gathering of information that’s all that bad. Information has always been gathered, evaluated and passed on, and it always will be. Every single time someone goes shopping in the Real World their purchases are recorded: by the credit or debit card company, the store, the wholesaler, the manufacturer. No names in most instances, but the information has been gathered. A tin of peas has been purchased; restock the shelves and order a new tin.

Are there any benefits to this mass gathering of information?

If a product isn’t popular it gets removed from sale. If a whole range of products suddenly go viral, more are produced. If a web site suddenly falls in the ratings, it gets improved or it vanishes. If an advert gets no clicks at all, it needs to be re-evaluated and a new marketing strategy pounded out.

The people surfing the Internet are changing its features with each click of their mouse. Their surfing activity is the basis for what follows. A visitor to a web site doesn’t have to press Like to show appreciation; it is enough that the records show they stayed on a site for five minutes and read through an article, even if they didn’t comment or purchase. The visit alone shows the manufacturers, the advertisers and the service industry where interests lie, with the result that they are going to have to tailor what they have on offer to meet our (silent) demands. We, the Internet users, are shaping the future just by being here. And that can only be a good thing.

Even so, nearly eight hundred blocks on Facebook alone in so few hours?

I have written so far about the marketing strategies of various Internet web sites, of advertising and the collection of data from individual visits to web sites. Now I wish to take it one step further, following an announcement by the German telecommunications company O2, a subsidiary of the Spanish telecommunications company Telefónica.

The collection of information through Internet sites, as illustrated above, is simple, cheap and effective. An Internet user surfs to a web site of interest and his or her movements through the web are logged, collected and evaluated by a whole range of different tracking devices, from spy software through cookies, links to social media networks and search engines or analytical tools. But what about the general movements of a person during their daily lives? Is it possible to follow a specific person, or a group of people, as they move through a city? Is it possible to collate the information gained from these movements and come up with an overall picture which might be useful to marketing companies, to advertisers, to the marketplace in general?

It is a well-known and accepted fact that people who use mobile phones, smart or otherwise, can be tracked. The mobile telephone needs to be in constant contact with a transmission device – a node or similar – so that it is available should the user wish to make or receive calls. As long as the mobile device is switched on it sends and receives a signal which places it within a certain area, within reach of a communications point, to retain this connectivity. A person moving with a mobile device through the streets of Berlin, New York, London, Paris or any other city – or through smaller towns, villages and the countryside – is constantly followed by these connection signals for as long as the device is switched on. Information on their position may, with the right technology, quickly be collected and, in an emergency for example, directed to the appropriate authorities, even without the use of a Global Positioning System (GPS).
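The positioning described needs no GPS: knowing which towers a handset can reach, and roughly how strongly, is enough for a coarse fix. One common textbook approach is a weighted centroid of the reachable towers; the sketch below uses that method with invented tower coordinates and signal weights.

```python
def estimate_position(towers):
    """Coarse, GPS-free position estimate: weighted centroid of the
    towers a handset is in contact with.

    towers: list of (latitude, longitude, signal_weight) tuples.
    A stronger signal pulls the estimate towards that tower.
    """
    total = sum(w for _, _, w in towers)
    lat = sum(la * w for la, _, w in towers) / total
    lon = sum(lo * w for _, lo, w in towers) / total
    return lat, lon

# Three hypothetical towers around a city-centre shopping street;
# the first has the strongest signal, so the handset is nearest to it.
towers = [
    (52.5200, 13.4050, 3.0),
    (52.5230, 13.4100, 1.0),
    (52.5180, 13.4000, 1.0),
]
lat, lon = estimate_position(towers)
print(round(lat, 4), round(lon, 4))
```

Real networks refine this with timing and signal-propagation models, but even this crude centroid places a switched-on handset within a neighbourhood, which is all a footfall or movement analysis needs.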

The German telecommunications company O2 is investigating the possibilities of using this information on the movement of individuals for marketing purposes. Being able to watch the movements of an individual or a group as they travel from one shop to another within a major city, or from one position to another on longer journeys, can give information about where the most interest in a town lies, where the shops and stores have the best pull and even, with finer tuning, how long a person remains in one position, in one shop or store.

Not, in and of itself, too much of a problem until you take it to the next step in the process.

Couple the information on a person’s movements with further information, such as age and gender, and it is possible to build up a very accurate picture of the movements and interests of a group of people within a certain age group – such as young women aged between 18 and 24. The necessary information is already there, voluntarily given by the customer during the process of buying or renting a mobile telephone. Date of birth, address, gender and, in some cases, income and educational levels are all included in the basic application process for a contract between telecommunications company and customer.

Here, because of the sudden lack of anonymity, we come into a gray area as far as data protection is concerned, and a potential earner for the telecommunications industry. Combine the information with actual sales, with positioning in an entertainment area of a city or the main shopping street, and it is possible to build up an individual picture of each and every person using a mobile device at any time of the day or night. Here we are verging on the private sphere, the gathering of information which can be narrowed down to a specific person.

What is the difference between an individual person using the Internet and being tracked and an individual using a mobile device?

With Internet tracking there may well be several hundred people using a connection point into the Internet, an IP address linked to an Internet Service Provider, at any one time. With mobile device tracking the link is direct to a specific mobile phone, to a specific person who has purchased or rented this device. It is possible to link directly to a name and an address without needing to go any further along the chain, without needing to find out who was using a specific IP at a certain time and then checking their communications protocol or whereabouts at the time of connection. It is possible to track movements without the person being tracked actually being active, without them having logged into the Internet or even making a telephone call.

With further innovations in the smart phone market – such as video devices and payment for services through a smart chip – it is possible to trace a person’s every movement right down to the items they may purchase in any given store, even a parking ticket bought through an application on their mobile phone. It is possible to see how long they remain in one area, where they move to and how much they have spent.

For the gathering of information with marketing potential, this is an absolute goldmine. For the individual, the mobile device owner, it is an incursion into their private sphere, into their daily lives.

This form of market information gathering is not a distant prospect; the first steps have already been taken by O2 in Germany. The information is already available and is constantly added to each time a person switches their mobile device on. It is only a matter of time before the true potential of this information source is recognized and, data protection laws allowing, its exploitation becomes common practice.

This form of gathering, of tracking is, according to many professional and civil rights organizations, one step too far. As long as the information gathered comes from a large group and cannot be traced back to an individual it is relatively harmless. With the mobile device potential, the move towards a Minority Report style society is far closer than anyone would wish to believe and, in all probability, far closer than anyone is prepared to accept.

The United States is facing the largest shortage of healthcare practitioners in the country’s history, compounded by an ever-increasing geriatric population. In 2005 there was one geriatrician for every 5,000 US residents over 65, and only nine of the 145 medical schools trained geriatricians. By 2020 the industry is estimated to be short 200,000 physicians and over a million nurses. Never in the history of US healthcare has so much been demanded of so few personnel. Because of this shortage, combined with the growth of the geriatric population, the medical community has to find a way to provide timely, accurate information to those who need it in a uniform fashion. Imagine if flight controllers spoke the native language of their country instead of the current international flight language, English. This example captures the urgency and critical nature of the need for standardized communication in healthcare. A healthy information exchange can help improve safety, reduce the length of hospital stays, cut down on medication errors, reduce redundancies in lab testing and procedures, and make the health system faster, leaner and more productive. The aging US population, along with those affected by chronic diseases such as diabetes, cardiovascular disease and asthma, will need to see more specialists, who will have to find a way to communicate with primary care providers effectively and efficiently.

This efficiency can only be attained by standardizing the manner in which the communication takes place. HealthBridge, a Cincinnati-based HIE and one of the largest community-based networks, was able to reduce the time to identify potential disease outbreaks from 5–8 days down to 48 hours with a regional health information exchange. Regarding standardization, one author noted, “Interoperability without standards is like language without grammar. In both cases communication can be achieved but the process is cumbersome and often ineffective.”

United States retailers made this transition over twenty years ago, automating inventory, sales and accounting controls to improve efficiency and effectiveness. While it is uncomfortable to think of patients as inventory, perhaps this discomfort has been part of the reason for the slow transition in the primary care setting to automated patient records and data. Imagine a Mom & Pop hardware store on any square in mid-America, packed with inventory on shelves, ordering duplicate widgets for lack of information about current stock. Visualize any Home Depot or Lowes and you get a glimpse of how automation has changed the retail sector in terms of scalability and efficiency. Perhaps the “art of medicine” is a barrier to more productive, efficient and smarter medicine. Standards in information exchange have existed since 1989, but recent interfaces have evolved more rapidly thanks to increases in the standardization of regional and state health information exchanges.

History of Health Information Exchanges

Major urban centers in Canada and Australia were the first to successfully implement HIEs. The success of these early networks was linked to their integration with primary care EHR systems already in place. Health Level 7 (HL7) represents the first health language standardization system in the United States, beginning with a meeting at the University of Pennsylvania in 1987. HL7 has been successful in replacing antiquated interactions like faxing, mail and direct provider communication, which often involve duplication and inefficiency. Process interoperability increases human understanding across networks, allowing health systems to integrate and communicate, and standardization ultimately determines how effective that communication is, in the same way that grammar standards foster better communication. The United States National Health Information Network (NHIN) sets the standards that govern this delivery of communication between health networks. HL7 is now on its third version, which was published in 2004. The goals of HL7 are to increase interoperability, develop coherent standards, educate the industry on standardization and collaborate with other sanctioning bodies, such as ANSI and ISO, that are also concerned with process improvement.
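HL7 version 2, still the most widely deployed wire format, is pipe-delimited: segments separated by carriage returns, fields by vertical bars. A minimal parsing sketch follows; the message content is invented, and the simple dictionary layout (keeping only the last segment of each type, indexing fields positionally) is a deliberate simplification of the full standard.

```python
def parse_hl7_v2(message):
    """Split an HL7 v2 message into {segment_id: [fields]} entries.

    Segments are separated by carriage returns, fields by '|'.
    Repeated segments would need a list per id; this sketch keeps the last.
    """
    parsed = {}
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        parsed[fields[0]] = fields[1:]
    return parsed

# Invented admission message: MSH header segment, PID patient segment.
msg = ("MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
       "202401011200||ADT^A01|MSG00001|P|2.3\r"
       "PID|1||123456^^^HOSP||DOE^JANE||19800101|F")

parsed = parse_hl7_v2(msg)
print(parsed["PID"][4])  # patient name field in this invented message
```

Every HL7-compliant system on a network splits messages against the same delimiters, which is precisely what lets an intermediary translate data between otherwise incompatible hospital systems.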

One of the earliest HIEs in the United States started in Portland, Maine. HealthInfoNet is a public-private partnership and is believed to be the largest statewide HIE. The goals of the network are to improve patient safety, enhance the quality of clinical care, increase efficiency, reduce service duplication, identify public threats more quickly and expand patient record access. The four founding groups – the Maine Health Access Foundation, Maine CDC, the Maine Quality Forum and the Maine Health Information Center (Onpoint Health Data) – began their efforts in 2004.

In Tennessee, Regional Health Information Organizations (RHIOs) were initiated in Memphis and the Tri-Cities region. CareSpark, a 501(c)(3) in the Tri-Cities region, was considered a direct project in which clinicians interact directly with each other, using CareSpark’s HL7-compliant system as an intermediary to translate the data bi-directionally. Veterans Affairs (VA) clinics also played a crucial role in the early stages of building this network. In the Delta, the MidSouth eHealth Alliance is a RHIO connecting Memphis hospitals such as Baptist Memorial (5 sites), Methodist Systems, Le Bonheur Healthcare, Memphis Children’s Clinic, St. Francis Health System, St. Jude, The Regional Medical Center and UT Medical. These regional networks allow practitioners to share medical records, lab values, medicines and other reports in a more efficient manner.

Seventeen US communities have been designated as Beacon Communities based on their development of HIEs. Each community’s health focus varies with its patient population and the prevalence of chronic disease states such as cardiovascular disease, diabetes and asthma. The communities focus on specific and measurable improvements in quality, safety and efficiency arising from health information exchange improvements. The closest Beacon community to Tennessee geographically, in Byhalia, Mississippi, just south of Memphis, was awarded a $100,000 grant by the Department of Health and Human Services in September 2011.

A healthcare model for Nashville to emulate is located in Indianapolis, Indiana, chosen on the basis of geographic proximity, city size and population demographics. Four Beacon awards have been granted to communities in and around Indianapolis: the Health and Hospital Corporation of Marion County, Indiana Health Centers Inc, Raphael Health Center and Shalom Health Care Center Inc. In addition, Indiana Health Information Technology Inc has received over $23 million in grants through the federal State HIE Cooperative Agreement and 2011 HIE Challenge Grant Supplement programs. These awards were based on the following criteria: 1) achieving health goals through health information exchange; 2) improving long-term and post-acute care transitions; 3) consumer-mediated information exchange; 4) enabling enhanced query for patient care; 5) fostering distributed population-level analytics.

The Department of Health and Human Services (HHS) is the regulatory agency that oversees health concerns for all Americans. HHS is divided into ten regions, and Tennessee is part of Region IV, headquartered in Atlanta. The Regional Director, Anton J. Gunn, is the first African American to serve in that role and brings a wealth of experience from his public service, specifically regarding underserved healthcare patients and health information exchanges. This experience will serve him well as he encounters the societal and demographic challenges facing underserved and chronically ill patients throughout the southeast.

The Nationwide Health Information Network (NHIN) is a division of HHS that guides the standards of exchange and governs regulatory aspects of health reform. The NHIN collaboration includes bodies such as the Centers for Disease Control and Prevention (CDC), the Social Security Administration, Beacon communities and state HIEs (ONC).11 The Office of the National Coordinator for Health Information Technology (ONC) has awarded $16 million in additional grants to encourage innovation at the state level. Innovation at the state level will ultimately lead to better patient care through reductions in replicated tests, bridge-to-care programs that give chronic patients continuity, and timely public health alerts through agencies like the CDC based on this information.12 The Health Information Technology for Economic and Clinical Health (HITECH) Act is funded by dollars from the American Recovery and Reinvestment Act of 2009 (ARRA). HITECH's goals are to invest in community, regional and state health information exchanges to build effective networks that are connected nationally. Beacon communities and the Statewide Health Information Exchange Cooperative Agreement were initiated through HITECH and ARRA. To date, 56 states and territories have received grant awards through these programs, totaling $548 million.

History of Health Information Partnership TN (HIP TN)

In Tennessee the health information exchange has been slower to progress than in places like Maine and Indiana, owing in part to the diversity of the state. The Delta has a vastly different patient population and health network from middle Tennessee, which in turn differs from eastern Tennessee's Appalachian region. In August 2009 the first steps were taken to build a statewide HIE in the form of a non-profit named HIP TN. A board was established at that time, with an operations council formed in December. HIP TN's first initiatives involved connecting the work of Carespark in northeast Tennessee's Tri-Cities region to the Midsouth eHealth Alliance in Memphis. State officials estimated a cost of over $200 million from 2010 to 2015. The venture involves stakeholders from medical, technical, legal and business backgrounds. The governor in 2010, Phil Bredesen, provided $15 million to match federal funds, in addition to issuing an Executive Order establishing the Office of eHealth Initiatives, with oversight by the Office of Administration and Finance and sixteen board members. By March 2010 four workgroups had been established to focus on technology, clinical matters, privacy and security, and sustainability.

By May 2010 data sharing agreements were in place, and a Request for Proposal (RFP) was sent out to over forty vendors; a production pilot for the statewide HIE was initiated in June 2011. In July 2010 a fifth workgroup, the consumer advisory group, was added, and in September 2010 Tennessee was notified that it was one of the first states to have its plans approved after the release of a Program Information Notice (PIN). Over fifty stakeholders came together to evaluate the vendor demonstrations, and a contract was signed with the chosen vendor, Axolotl, on September 30th, 2010. At that time a production goal of July 15th, 2011 was agreed upon, and in January 2011 Keith Cox was hired as HIP TN's CEO. Keith brings twenty-six years of tenure in healthcare IT to the collaborative; his previous endeavors include Microsoft, BellSouth and several entrepreneurial efforts. HIP TN's mission is to improve access to health information through a statewide collaborative process and to provide the infrastructure for security in that exchange. The vision for HIP TN is to be recognized as a state and national leader that supports measurable improvements in clinical quality and efficiency for patients, providers and payors through secure HIE. Robert S. Gordon, the board chair for HIP TN, states the vision well: "We share the view that while technology is a critical tool, the primary focus is not technology itself, but improving health". HIP TN is a non-profit, 501(c)(3), that is solely reliant on state government funding. It is a combination of centralized and decentralized architecture. The key vendors are Axolotl, which acts as the umbrella network, and ICA for Memphis and Nashville, with CGI as the vendor in northeast Tennessee.15 Future HIP TN goals include a gateway to the Nationwide Health Information Network planned for late 2011 and a clinician index in early 2012.
Carespark, one of the original regional health exchange networks, voted to cease operations on July 11, 2011, based on a lack of financial support for its new infrastructure. Its data sharing agreements included 38 health organizations, nine communities and 250 volunteers.16 Carespark's closure underscores the need to build a network that is not solely reliant on public grants to fund its efforts, which we will discuss in the final section of this paper.

Current Status of Healthcare Information Exchange and HIP TN

Ten grants were awarded in 2011 through the HIE Challenge Grant Supplement. These included initiatives in eight states that can serve as communities to look to for guidance as HIP TN evolves. As previously mentioned, one of the most-awarded communities lies less than five hours away in Indianapolis, Indiana. Based on the similarities in health communities, patient populations and demographics, Indianapolis would provide an excellent mentor for Nashville and the hospital systems that serve patients in Tennessee. The Indiana Health Information Exchange has been recognized nationally for its Docs4Docs program and the manner in which collaboration has taken place since its inception in 2004. Kathleen Sebelius, Secretary of HHS, commented, "The Central Indiana Beacon Community has a level of collaboration and the ability to organize quality efforts in an effective manner from its history of building long standing relationships. We are thrilled to be working with a community that is far ahead in the use of health information to bring positive change to patient care." Beacon communities that could act as guides for our community include the Health and Hospital Corporation of Marion County and the Indiana Health Centers, based on their recent awards of $100,000 each by HHS.

A local model of excellence in practice-level EMR conversion is Old Harding Pediatric Associates (OHPA), which has two clinics and fourteen physicians who handle a patient population of 23,000 and over 72,000 patient encounters per year. OHPA's conversion to electronic records in early 2000 occurred as a result of its pursuit of excellence in patient care and its desire to use technology in a way that benefited its patient population. OHPA established a cross-functional work team to improve its practices in the areas of facilities, personnel, communication, technology and external influences. Noteworthy was chosen as the EMR vendor based on user-friendliness and its similarity to a standard patient chart with tabs for files. The software was customized to the pediatric environment, complete with patient growth charts, and Windows was used as the operating system based on provider familiarity. Within four days OHPA had 100% compliance and use of its EMR system.

The Future of HIP TN and HIE in Tennessee

Tennessee has received close to $12 million in grant money from the State Health Information Exchange Cooperative Agreement Program.20 Regional Health Information Organizations (RHIOs) need to be fully scalable so that hospitals can grow their systems without compromising integrity,21 and the systems located in Nashville will play an integral role in this nationwide scaling, with companies like HCA, CHS, Iasis, Lifepoint and Vanguard. The HIE will act as a data repository for all patients' information that can be accessed from anywhere and contains a full history of the patient's medical record, lab tests, physician network and medicine list. To entice providers to enroll in the statewide HIE, tangible value to their practice has to be demonstrated through better, safer care. In a 2011 HIMSS editor's report, Richard Lang states that instead of a top-down approach, "A more practical idea may be for states to support local community HIE development first. Once established, these local networks can feed regional HIE's and then connect to a central HIE/data repository backbone. States should use a portion of the stimulus funds to support local HIE development."22 Mr. Lang also believes the primary care physician has to be the foundation for the entire system, since they are the main point of contact for the patient.

In the past few decades there has been a revolution in computing and communications, and all indications are that technological progress and the use of information technology will continue at a rapid pace. Accompanying and supporting the dramatic increases in the power and use of new information technologies has been the declining cost of communications, a result of both technological improvements and increased competition. According to Moore's law, the processing power of microchips doubles every 18 months. These advances present many significant opportunities but also pose major challenges. Today, innovations in information technology are having wide-ranging effects across numerous domains of society, and policy makers are acting on issues involving economic productivity, intellectual property rights, privacy protection, and affordability of and access to information. Choices made now will have long-lasting consequences, and attention must be paid to their social and economic impacts.
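The Moore's law figure quoted above translates into a simple back-of-the-envelope calculation. The sketch below (illustrative only; the 18-month doubling period from the text is its sole input, and the function name is our own) computes the growth factor implied over a given span of years:

```python
def growth_factor(years: float, doubling_months: float = 18.0) -> float:
    """Multiplicative increase in processing power after `years`,
    given one doubling every `doubling_months` months (Moore's law)."""
    doublings = (years * 12.0) / doubling_months
    return 2.0 ** doublings

# An 18-month doubling period implies roughly a 100-fold
# increase in processing power over a single decade.
print(growth_factor(10))
```

This compounding is why choices about information infrastructure made now have such long-lasting consequences: capabilities two orders of magnitude greater arrive within ten years.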

One of the most significant outcomes of the progress of information technology is probably electronic commerce over the Internet, a new way of conducting business. Though only a few years old, it may radically alter economic activities and the social environment. Already, it affects such large sectors as communications, finance and retail trade and might expand to areas such as education and health services. It implies the seamless application of information and communication technology along the entire value chain of a business that is conducted electronically.

The following sections consider the impacts of information technology and electronic commerce on business models, commerce, market structure, the workplace, the labour market, education, private life and society as a whole.

1. Business Models, Commerce and Market Structure

One important way in which information technology is affecting work is by reducing the importance of distance. In many industries, the geographic distribution of work is changing significantly. For instance, some software firms have found that they can overcome the tight local market for software engineers by sending projects to India or other nations where the wages are much lower. Furthermore, such arrangements can take advantage of the time differences so that critical projects can be worked on nearly around the clock. Firms can outsource their manufacturing to other nations and rely on telecommunications to keep marketing, R&D, and distribution teams in close contact with the manufacturing groups. Thus the technology can enable a finer division of labour among countries, which in turn affects the relative demand for various skills in each nation. The technology enables various types of work and employment to be decoupled from one another. Firms have greater freedom to locate their economic activities, creating greater competition among regions in infrastructure, labour, capital, and other resource markets. It also opens the door for regulatory arbitrage: firms can increasingly choose which tax authority and other regulations apply.

Computers and communication technologies also promote more market-like forms of production and distribution. An infrastructure of computing and communication technology, providing 24-hour access at low cost to almost any kind of price and product information desired by buyers, will reduce the informational barriers to efficient market operation. This infrastructure might also provide the means for effecting real-time transactions and make intermediaries such as sales clerks, stock brokers and travel agents, whose function is to provide an essential information link between buyers and sellers, redundant. Removal of intermediaries would reduce the costs in the production and distribution value chain. The information technologies have facilitated the evolution of enhanced mail order retailing, in which goods can be ordered quickly by using telephones or computer networks and then dispatched by suppliers through integrated transport companies that rely extensively on computers and communication technologies to control their operations. Nonphysical goods, such as software, can be shipped electronically, eliminating the entire transport channel. Payments can be done in new ways. The result is disintermediation throughout the distribution channel, with cost reduction, lower end-consumer prices, and higher profit margins.

The impact of information technology on firms' cost structure is best illustrated by the example of electronic commerce. The key areas of cost reduction when carrying out a sale via electronic commerce rather than in a traditional store involve physical establishment, order placement and execution, customer support, storage, inventory carrying, and distribution. Although setting up and maintaining an e-commerce web site might be expensive, it is certainly less expensive to maintain such a storefront than a physical one because it is always open, can be accessed by millions around the globe, and has few variable costs, so that it can scale up to meet demand. By maintaining one 'store' instead of several, duplicate inventory costs are eliminated. In addition, e-commerce is very effective at reducing the costs of attracting new customers, because its advertising is typically cheaper and more targeted than in other media. Moreover, the electronic interface allows e-commerce merchants to check that an order is internally consistent and that the order, receipt, and invoice match. Through e-commerce, firms are able to move much of their customer support online so that customers can access databases or manuals directly. This significantly cuts costs while generally improving the quality of service. E-commerce shops require far fewer, but higher-skilled, employees. E-commerce also permits savings in inventory carrying costs: the faster an input can be ordered and delivered, the less the need for a large inventory. The impact on costs associated with decreased inventories is most pronounced in industries where the product has a limited shelf life (e.g. bananas), is subject to fast technological obsolescence or price declines (e.g. computers), or where there is a rapid flow of new products (e.g. books, music).
Although shipping costs can increase the cost of many products purchased via electronic commerce and add substantially to the final price, distribution costs are significantly reduced for digital products such as financial services, software, and travel, which are important e-commerce segments.

Although electronic commerce causes the disintermediation of some intermediaries, it creates greater dependency on others and also some entirely new intermediary functions. Among the intermediary services that could add costs to e-commerce transactions are advertising, secure online payment, and delivery. The relative ease of becoming an e-commerce merchant and setting up stores results in such a huge number of offerings that consumers can easily be overwhelmed. This increases the importance of using advertising to establish a brand name and thus generate consumer familiarity and trust. For new e-commerce start-ups, this process can be expensive and represents a significant transaction cost. The openness, global reach, and lack of physical cues that are inherent characteristics of e-commerce also make it vulnerable to fraud and thus increase certain costs for e-commerce merchants as compared to traditional stores. New techniques are being developed to protect the use of credit cards in e-commerce transactions, but the need for greater security and user verification leads to increased costs. A key feature of e-commerce is the convenience of having purchases delivered directly. In the case of tangibles, such as books, this incurs delivery costs, which cause prices to rise in most cases, thereby negating many of the savings associated with e-commerce and substantially adding to transaction costs.

With the Internet, e-commerce is rapidly expanding into a fast-moving, open global market with an ever-increasing number of participants. The open and global nature of e-commerce is likely to increase market size and change market structure, both in terms of the number and size of players and the way in which players compete on international markets. Digitized products can cross the border in real time, consumers can shop 24 hours a day, seven days a week, and firms are increasingly faced with international online competition. The Internet is helping to enlarge existing markets by cutting through many of the distribution and marketing barriers that can prevent firms from gaining access to foreign markets. E-commerce lowers information and transaction costs for operating on overseas markets and provides a cheap and efficient way to strengthen customer-supplier relations. It also encourages companies to develop innovative ways of advertising, delivering and supporting their product and services. While e-commerce on the Internet offers the potential for global markets, certain factors, such as language, transport costs, local reputation, as well as differences in the cost and ease of access to networks, attenuate this potential to a greater or lesser extent.

2. Workplace and Labour Market

Computers and communication technologies allow individuals to communicate with one another in ways complementary to traditional face-to-face, telephonic, and written modes. They enable collaborative work involving distributed communities of actors who seldom, if ever, meet physically. These technologies utilize communication infrastructures that are both global and always up, thus enabling 24-hour activity and asynchronous as well as synchronous interactions among individuals, groups, and organizations. Social interaction in organizations will be affected by the use of computers and communication technologies. Peer-to-peer relations across department lines will be enhanced through the sharing of information and coordination of activities. Interaction between superiors and subordinates may become more tense because of the social control issues raised by computerized monitoring systems; on the other hand, the use of e-mail will lower the barriers to communication across different status levels, resulting in more uninhibited communication between supervisors and subordinates.

Because computers and communication technology reduce the importance of distance, they also favour telecommuting and thus have implications for the residence patterns of citizens. As workers find that they can do most of their work at home rather than in a centralized workplace, the demand for homes in climatically and physically attractive regions will increase. The consequences of such a shift in employment from the suburbs to more remote areas would be profound. Property values would rise in the favoured destinations and fall in the suburbs. The rural, historical, or charming aspects of life and the environment in the newly attractive areas would be threatened. Since most telecommuters would be among the better educated and higher paid, the demand in these areas for high-income and high-status services like gourmet restaurants and clothing boutiques would increase. There would also be an expansion of services of all types, creating and expanding job opportunities for the local population.

By reducing the fixed cost of employment, widespread telecommuting should make it easier for individuals to work on flexible schedules, to work part time, to share jobs, or to hold two or more jobs simultaneously. Since changing employers would not necessarily require changing one’s place of residence, telecommuting should increase job mobility and speed career advancement. This increased flexibility might also reduce job stress and increase job satisfaction. Since job stress is a major factor governing health there may be additional benefits in the form of reduced health costs and mortality rates. On the other hand one might also argue that technologies, by expanding the number of different tasks that are expected of workers and the array of skills needed to perform these tasks, might speed up work and increase the level of stress and time pressure on workers.

A more difficult question concerns the impact that computers and communications might have on employment. The ability of computers and communications to perform routine tasks such as bookkeeping more rapidly than humans leads to concern that people will be replaced. The response to this argument is that even if computers and communications lead to the elimination of some jobs, other jobs will be created, particularly for computer professionals, and that growth in output will increase overall employment. It is more likely that computers and communications will lead to changes in the types of workers needed for different occupations rather than to changes in total employment.

A number of industries are affected by electronic commerce. The distribution sector is directly affected, as e-commerce is a way of supplying and delivering goods and services. Other industries, indirectly affected, are those related to information and communication technology (the infrastructure that enables e-commerce), content-related industries (entertainment, software) and transaction-related industries (the financial sector, advertising, travel, transport). E-commerce might also create new markets or extend market reach beyond traditional borders. Enlarging the market will have a positive effect on jobs. Another important issue relates to interlinkages among activities affected by e-commerce: expenditure on e-commerce-related intermediate goods and services will create jobs indirectly, on the basis of the volume of electronic transactions and their effect on prices, costs and productivity. The convergence of media, telecommunication and computing technologies is creating a new integrated supply chain for the production and delivery of multimedia and information content. Most of the employment related to e-commerce arises around the content industries and communication infrastructure such as the Internet.

Jobs are both created and destroyed by technology, trade, and organizational change. These processes also underlie changes in the skill composition of employment. Beyond the net employment gains or losses brought about by these factors, it is apparent that workers with different skill levels will be affected differently. E-commerce is certainly driving the demand for IT professionals, but it also requires IT expertise to be coupled with strong business application skills, thereby generating demand for a flexible, multi-skilled work force. There is a growing need for increased integration of Internet front-end applications with enterprise operations, applications and back-end databases. Many of the IT skill requirements needed for Internet support can be met by low-paid IT workers who can deal with the organizational services needed for basic web page programming. However, wide area networks, competitive web sites, and complex network applications require much more skill than a platform-specific IT job. Since the skills required for e-commerce are rare and in high demand, e-commerce might accelerate the up-skilling trend in many countries by requiring high-skilled computer scientists to replace low-skilled information clerks, cashiers and market salespersons.

3. Education

Advances in information technology will affect the craft of teaching by complementing rather than eliminating traditional classroom instruction. Indeed the effective instructor acts in a mixture of roles. In one role the instructor is a supplier of services to the students, who might be regarded as its customers. But the effective instructor occupies another role as well, as a supervisor of students, and plays a role in motivating, encouraging, evaluating, and developing students. For any topic there will always be a small percentage of students with the necessary background, motivation, and self-discipline to learn from self-paced workbooks or computer assisted instruction. For the majority of students, however, the presence of a live instructor will continue to be far more effective than a computer assisted counterpart in facilitating positive educational outcomes. The greatest potential for new information technology lies in improving the productivity of time spent outside the classroom. Making solutions to problem sets and assigned reading materials available on the Internet offers a lot of convenience. E-mail vastly simplifies communication between students and faculty and among students who may be engaged in group projects.

Although distance learning has existed for some time, the Internet makes possible a large expansion in coverage and better delivery of instruction. Text can be combined with audio/video, and students can interact in real time via e-mail and discussion groups. Such technical improvements coincide with a general demand for retraining by those who, due to work and family demands, cannot attend traditional courses. Distance learning via the Internet is likely to complement existing schools for children and university students, but it could have more of a substitution effect for continuing education programmes. For some degree programmes, high-prestige institutions could use their reputation to attract students who would otherwise attend a local facility. Owing to the Internet's ease of access and convenience for distance learning, overall demand for such programmes will probably expand, leading to growth in this segment of e-commerce.

As shown in the previous section, high level skills are vital in a technology-based and knowledge intensive economy. Changes associated with rapid technological advances in industry have made continual upgrading of professional skills an economic necessity. The goal of lifelong learning can only be accomplished by reinforcing and adapting existing systems of learning, both in public and private sectors. The demand for education and training concerns the full range of modern technology. Information technologies are uniquely capable of providing ways to meet this demand. Online training via the Internet ranges from accessing self-study courses to complete electronic classrooms. These computer-based training programmes provide flexibility in skills acquisition and are more affordable and relevant than more traditional seminars and courses.

4. Private Life and Society

Increasing representation of a wide variety of content in digital form results in easier and cheaper duplication and distribution of information. This has a mixed effect on the provision of content. On the one hand, content can be distributed at a lower unit cost. On the other hand, distribution of content outside of channels that respect intellectual property rights can reduce the incentives of creators and distributors to produce and make content available in the first place. Information technology raises a host of questions about intellectual property protection and new tools and regulations have to be developed in order to solve this problem.

Many issues also surround free speech and regulation of content on the Internet, and there continue to be calls for mechanisms to control objectionable content. However it is very difficult to find a sensible solution. Dealing with indecent material involves understanding not only the views on such topics but also their evolution over time. Furthermore, the same technology that allows for content altering with respect to decency can be used to filter political speech and to restrict access to political material. Thus, if censorship does not appear to be an option, a possible solution might be labelling. The idea is that consumers will be better informed in their decisions to avoid objectionable content.

The rapid increase in computing and communications power has raised considerable concern about privacy in both the public and private sectors. Decreases in the cost of data storage and information processing make it likely that it will become practicable for both government and private data-mining enterprises to collect detailed dossiers on all citizens. Nobody knows who currently collects data about individuals, how this data is used and shared or how this data might be misused. These concerns lower consumers' trust in online institutions and communication and thus inhibit the development of electronic commerce. A technological approach to protecting privacy might be cryptography, although it might be claimed that cryptography presents a serious barrier to criminal investigations.

It is popular wisdom that people today suffer from information overload. Much of the information available on the Internet is incomplete or even incorrect. People spend more and more of their time absorbing irrelevant information simply because it is available and they think they should know about it. Therefore, how people assign credibility to the information they collect must be studied, in order to invent and develop new credibility systems that help consumers manage the information overload.

Technological progress inevitably creates dependence on technology. Indeed the creation of vital infrastructure ensures dependence on that infrastructure. As surely as the world is now dependent on its transport, telephone, and other infrastructures, it will be dependent on the emerging information infrastructure. Dependence on technology can bring risks. Failures in the technological infrastructure can cause the collapse of economic and social functionality. Blackouts of long-distance telephone service, credit data systems, electronic funds transfer systems, and other such vital communications and information processing services would undoubtedly cause widespread economic disruption. However, it is probably impossible to avoid technological dependence. Therefore, what must be considered is the exposure that results from dependence on technologies with a recognizable probability of failure, no workable substitute at hand, and high costs of failure.

The ongoing computing and communications revolution has numerous economic and social impacts on modern society and requires serious social science investigation in order to manage its risks and dangers. Such work would be valuable for both social policy and technology design. Decisions have to be taken carefully. Many choices being made now will be costly or difficult to modify in the future.