Assignment 6 (Due: before August 19, 2009, 13:00hrs)

If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc.) in order for the internet connectivity to be improved? (3000 words)

If I were hired by the university president as an IT consultant, I would suggest innovations to improve internet connectivity, because innovation is what the university needs before we consider technology, infrastructure, steps, and processes. There should be a "new way of doing something" in the university.

Why do I choose innovation?

According to my research, the term innovation refers to a new way of doing something. It may refer to incremental, radical, or revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between invention, an idea made manifest, and innovation, ideas applied successfully (McKeown 2008). In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g., in the arts, economics, business, and government policy. In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.

Innovation is an important topic in the study of economics, business, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation, and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to increasing productivity, the factors that lead to innovation are also considered critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth. Those who are directly responsible for applying an innovation are often called pioneers in their field, whether they are individuals or organizations.

In the organizational context, innovation may be linked to performance and growth through improvements in efficiency, productivity, quality, competitive positioning, market share, etc. All organizations can innovate, including for example hospitals, universities, and local governments. While innovation typically adds value, innovation may also have a negative or destructive effect as new developments clear away or change old organizational forms and practices. Organizations that do not innovate effectively may be destroyed by those that do. Hence innovation typically involves risk.

A key challenge in innovation is maintaining a balance between process and product innovations, where process innovations tend to involve a business model which may develop shareholder satisfaction through improved efficiencies, while product innovations develop customer support, however at the risk of costly R&D that can erode shareholder return. In summary, innovation can be described as the result of some amount of time and effort into researching (R) an idea, plus some larger amount of time and effort into developing (D) this idea, plus some very large amount of time and effort into commercializing (C) this idea into a market place with customers.

Conceptualizing innovation

Innovation has been studied in a variety of contexts, including in relation to technology, commerce, social systems, economic development, and policy construction. There are, therefore, naturally a wide range of approaches to conceptualizing innovation in the scholarly literature. See, e.g., Fagerberg et al. (2004). Fortunately, however, a consistent theme may be identified: innovation is typically understood as the successful introduction of something new and useful, for example introducing new methods, techniques, or practices or new or altered products and services.

Distinguishing from Invention and other concepts

"An important distinction is normally made between invention and innovation. Invention is the first occurrence of an idea for a new product or process, while innovation is the first attempt to carry it out into practice" (Fagerberg, 2004: 4).

It is useful, when conceptualizing innovation, to consider whether other words suffice. Invention – the creation of new forms, compositions of matter, or processes – is often confused with innovation. An improvement on an existing form, composition, or process might be an invention, an innovation, both, or neither if it is not substantial enough. It can be difficult to differentiate change from innovation. According to the business literature, an idea, a change, or an improvement is only an innovation when it is put to use and effectively causes a social or commercial reorganization.

Innovation occurs when someone uses an invention or an idea to change how the world works, how people organize themselves, or how they conduct their lives. In this view innovation occurs whether or not the act of innovating succeeds in generating value for its champions. Innovation is distinct from improvement in that it permeates society and can cause reorganization. It is distinct from problem solving and may cause problems. Thus, in this view, innovation occurs whether it has positive or negative results.

So far there is no evidence of innovation having been measured scientifically. Scientists around the world are still working on methods to accurately measure innovation in terms of cost, effort, or resource savings. Some innovations have become successful because of the way people look at things and the need for change from the old ways of doing things.

Innovation in organizations

A convenient definition of innovation from an organizational perspective is given by Luecke and Katz (2003), who wrote: "Innovation . . . is generally understood as the successful introduction of a new thing or method . . . Innovation is the embodiment, combination, or synthesis of knowledge in original, relevant, valued new products, processes, or services."

Innovation typically involves creativity, but is not identical to it: innovation involves acting on creative ideas to make some specific and tangible difference in the domain in which the innovation occurs. For example, Amabile et al. (1996) propose: "All innovation begins with creative ideas . . . We define innovation as the successful implementation of creative ideas within an organization. In this view, creativity by individuals and teams is a starting point for innovation; the first is necessary but not sufficient condition for the second."

For innovation to occur, something more than the generation of a creative idea or insight is required: the insight must be put into action to make a genuine difference, resulting for example in new or altered business processes within the organization, or changes in the products and services provided.

A further characterization of innovation is as an organizational or management process. For example, Davila et al. (2006) write: "Innovation, like many business functions, is a management process that requires specific tools, rules, and discipline." From this point of view the emphasis moves from the introduction of specific novel and useful ideas to the general organizational processes and procedures for generating, considering, and acting on such insights, leading to significant organizational improvements in terms of improved or new business products, services, or internal processes.

Through these varieties of viewpoints, creativity is typically seen as the basis for innovation, and innovation as the successful implementation of creative ideas within an organization (cf. Amabile et al. 1996, p. 1155). From this point of view, creativity may be displayed by individuals, but innovation occurs only in the organizational context. It should be noted, however, that the term 'innovation' is used by many authors rather interchangeably with the term 'creativity' when discussing individual and organizational creative activity. As Davila et al. (2006) comment: "Often, in common parlance, the words creativity and innovation are used interchangeably. They shouldn't be, because while creativity implies coming up with ideas, it's the 'bringing ideas to life' . . . that makes innovation the distinct undertaking it is." The distinctions between creativity and innovation discussed above are by no means fixed or universal in the innovation literature. They are, however, observed by a considerable number of scholars in innovation studies.

Innovation as a behavior

Some in-depth work on innovation in organizations, teams, and individuals has been carried out by J. L. Byrd, PhD, co-author of "The Innovation Equation." Dr. Jacqueline Byrd is the brain behind the Creatrix Inventory, which can be used to look at innovation and what lies behind it. The Innovation Equation she developed is:

Innovation = (Creativity * Risk Taking)
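The multiplicative form of this equation can be illustrated with a short numeric sketch. This is a toy model only: the scoring scale and function name below are my own assumptions for illustration, not part of Byrd's Creatrix Inventory.

```python
# Toy sketch of Byrd's Innovation Equation: innovation as the product
# of creativity and risk taking. The 0-7 scale is assumed here purely
# for illustration; the Creatrix Inventory uses its own instrument.
def innovation_score(creativity: float, risk_taking: float) -> float:
    """Multiplicative model: if either factor is near zero, so is innovation."""
    return creativity * risk_taking

# A highly creative but risk-averse team still scores low overall,
# which is the point of a multiplicative (rather than additive) form.
print(innovation_score(7, 1))  # 7
print(innovation_score(5, 5))  # 25
print(innovation_score(7, 0))  # 0
```

The product form captures the intuition that creativity without any willingness to take risks produces no innovation at all, whereas an additive model would still award a high score.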

Economic conceptions of innovation

Joseph Schumpeter defined economic innovation in The Theory of Economic Development (1934, Harvard University Press, Boston) as covering the following cases:

1. The introduction of a new good — that is, one with which consumers are not yet familiar — or of a new quality of a good.
2. The introduction of a new method of production, which need by no means be founded upon a discovery scientifically new, and can also exist in a new way of handling a commodity commercially.
3. The opening of a new market, that is, a market into which the particular branch of manufacture of the country in question has not previously entered, whether or not this market has existed before.
4. The conquest of a new source of supply of raw materials or half-manufactured goods, again irrespective of whether this source already exists or whether it has first to be created.
5. The carrying out of the new organization of any industry, like the creation of a monopoly position (for example through trustification) or the breaking up of a monopoly position.

Schumpeter's focus on innovation is reflected in Neo-Schumpeterian economics, developed by such scholars as Christopher Freeman and Giovanni Dosi. Innovation is also studied by economists in a variety of other contexts, for example in theories of entrepreneurship or in Paul Romer's New Growth Theory.

Transaction cost and network theory perspectives

According to Regis Cabral (1998, 2003): "Innovation is a new element introduced in the network which changes, even if momentarily, the costs of transactions between at least two actors, elements or nodes, in the network."

Innovation and market outcome

Market outcomes from innovation can be studied through different lenses. The industrial organization approach of characterizing markets according to the degree of competitive pressure, and the consequent modeling of firm behavior, often using sophisticated game-theoretic tools, while permitting mathematical modeling, has shifted the ground away from an intuitive understanding of markets. The earlier visual framework in economics, of market demand and supply along price and quantity dimensions, has given way to powerful mathematical models which, though intellectually satisfying, have left policy makers and managers groping for more intuitive and less theoretical analyses to which they can relate at a practical level. Non-quantifiable variables find little place in these models, and when they do, mathematical gymnastics (such as the use of different demand elasticities for differentiated products) embrace many of these qualitative variables, but in an intuitively unsatisfactory way.

In the management (strategy) literature, on the other hand, there is a vast array of relatively simple and intuitive models for both managers and consultants to choose from. Most of these models provide insights that help the manager craft a strategic plan consistent with the desired aims. Indeed, most strategy models are generally simple, wherein lies their virtue. In the process, however, these models often fail to offer insights into situations beyond those for which they were designed, often because their frameworks are seldom analytical or rigorous. The situational analyses of these models tend to be descriptive, are seldom robust, and rarely present behavioral relationships between the variables under study.

From an academic point of view, there is often a divorce between industrial organization theory and strategic management models. While many economists view management models as too simplistic, strategic management consultants perceive academic economists as too theoretical, and the analytical tools they devise as too complex for managers to understand. The innovation literature, while rich in typologies and descriptions of innovation dynamics, is mostly technology focused. Most research on innovation has been devoted to the (technological) process of innovation, or has otherwise taken a how-to (innovate) approach; an example is the integrated innovation model of Soumodip Sarkar (Sarkar 2007). These 'integrated' approaches draw on the industrial organization, management, and innovation literatures.

There are several sources of innovation. In the linear model of innovation the traditionally recognized source is manufacturer innovation, where an agent (person or business) innovates in order to sell the innovation. Another source, only now becoming widely recognized, is end-user innovation, where an agent (person or company) develops an innovation for their own (personal or in-house) use because existing products do not meet their needs. In his classic book on the subject, Sources of Innovation, Eric von Hippel identified end-user innovation as by far the most important and critical source.

Innovation by businesses is achieved in many ways, with much attention now given to formal research and development for "breakthrough innovations." But innovations may also be developed by less formal on-the-job modifications of practice, through exchange and combination of professional experience, and by many other routes. The more radical and revolutionary innovations tend to emerge from R&D, while more incremental innovations may emerge from practice – but there are many exceptions to each of these trends.

Regarding user innovation, a great deal of innovation is done by those actually implementing and using technologies and products as part of their normal activities. Sometimes user-innovators become entrepreneurs selling their product; they may choose to trade their innovation in exchange for other innovations; or their innovations may be adopted by their suppliers. Nowadays, they may also choose to freely reveal their innovations, using methods like open source. In such networks of innovation the users or communities of users can further develop technologies and reinvent their social meaning.

Whether innovation is mainly supply-pushed (based on new technological possibilities) or demand-led (based on social needs and market requirements) has been a hotly debated topic. Similarly, what exactly drives innovation in organizations and economies remains an open question. More recent theoretical work moves beyond this simple dualism and shows empirically that innovation does not happen just within the industrial supply side, or as a result of the articulation of user demand, but through a complex set of processes linking many different players together – not only developers and users, but a wide variety of intermediary organizations such as consultancies and standards bodies. Work on social networks suggests that much of the most successful innovation occurs at the boundaries of organizations and industries, where the problems and needs of users and the potential of technologies can be linked together in a creative process that challenges both.

Value of experimentation in innovation

When an innovative idea requires a new business model, or radically redesigns the delivery of value to focus on the customer, a real-world experimentation approach increases the chances of market success. New business models and customer experiences cannot be tested through traditional market research methods, and pilot programs for new innovations set the path in stone too early, increasing the costs of failure. On the other hand, recent years have seen considerable progress in identifying the key factors, principles, and variables that affect the probability of success in innovation. Of course, building successful businesses is such a complicated process, involving subtle interdependencies among so many variables in dynamic systems, that it is unlikely ever to be made perfectly predictable. But the more businesses can master the variables and experiment, the more they will be able to create new companies, products, processes, and services that achieve what they hope to achieve.

Stefan Thomke of Harvard Business School has written a definitive book on the importance of experimentation. Experimentation Matters argues that every company's ability to innovate depends on a series of experiments (successful or not) that help create new products and services or improve old ones. The period between the earliest point in the design cycle and the final release should be filled with experimentation, failure, analysis, and yet another round of experimentation. "Lather, rinse, repeat," Thomke says. Unfortunately, uncertainty often causes even the most able innovators to bypass the experimental stage. In his book, Thomke outlines six principles companies can follow to unlock their innovative potential:

1. Anticipate and exploit early information through 'front-loaded' innovation processes.
2. Experiment frequently but do not overload your organization.
3. Integrate new and traditional technologies to unlock performance.
4. Organize for rapid experimentation.
5. Fail early and often but avoid 'mistakes'.
6. Manage projects as experiments.

Thomke further explores what would happen if these principles were used beyond the confines of the individual organization. For instance, in the state of Rhode Island, innovators are collaboratively leveraging the state's compact geography, economic and demographic diversity, and close-knit networks to quickly and cost-effectively test new business models through a real-world experimentation lab.

Diffusion of innovations

Once innovation occurs, it may spread from the innovator to other individuals and groups. It has been proposed that the life cycle of innovations can be described using the 's-curve' or diffusion curve. The s-curve maps growth of revenue or productivity against time. In the early stage of a particular innovation, growth is relatively slow as the new product establishes itself. At some point customers begin to demand the product and its growth increases more rapidly. New incremental innovations or changes to the product allow growth to continue. Towards the end of its life cycle growth slows and may even begin to decline; in the later stages, no amount of new investment in that product will yield a normal rate of return.

The s-curve derives from the assumption that new products are likely to have a "product life", i.e. a start-up phase, a rapid increase in revenue, and eventual decline. In fact the great majority of innovations never get off the bottom of the curve and never produce normal returns. Innovative companies will typically be working on new innovations that will eventually replace older ones: successive s-curves come along to replace older ones and continue to drive growth upwards. In the standard two-curve illustration, the first curve shows a current technology, while the second shows an emerging technology that currently yields lower growth but will eventually overtake the current technology and lead to even greater levels of growth. The length of life will depend on many factors.
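The shape of the s-curve can be sketched with the logistic function, a common functional form for diffusion curves. This is an illustrative sketch under assumed parameter values, not a model taken from the text:

```python
import math

# Minimal sketch of a diffusion s-curve using the logistic function.
# saturation, rate, and midpoint are illustrative parameters: peak
# adoption/revenue, steepness of growth, and time of fastest growth.
def s_curve(t: float, saturation: float = 100.0,
            rate: float = 1.0, midpoint: float = 5.0) -> float:
    """Adoption or revenue at time t: slow start, rapid middle, plateau."""
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

for t in (0, 3, 5, 7, 10):
    print(t, round(s_curve(t), 1))
# Growth is slow at first, fastest near the midpoint, and flattens
# out as the product approaches saturation.
```

Evaluating the function at a few points shows the three phases described above: near-zero growth at the start, the steepest growth at the midpoint (where the value is exactly half of saturation), and a plateau at the end of the life cycle.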

Goals of innovation

Programs of organizational innovation are typically tightly linked to organizational goals and objectives, to the business plan, and to market competitive positioning. For example, one driver for innovation programs in corporations is to achieve growth objectives. As Davila et al. (2006) note, "Companies cannot grow through cost reduction and reengineering alone . . . Innovation is the key element in providing aggressive top-line growth, and for increasing bottom-line results" (p. 6).

In general, business organizations spend a significant amount of their turnover on innovation, i.e. making changes to their established products, processes, and services. The amount of investment can vary from as low as half a percent of turnover for organizations with a low rate of change to over twenty percent of turnover for organizations with a high rate of change. The average investment across all types of organizations is four percent. For an organization with a turnover of, say, one billion currency units, this represents an investment of forty million units. This budget will typically be spread across various functions including marketing, product design, information systems, manufacturing systems, and quality assurance, and the investment may vary by industry and by market positioning.

One survey across a large number of manufacturing and services organizations found that systematic programs of organizational innovation are most frequently driven by the following goals, ranked in decreasing order of popularity:

1. Improved quality
2. Creation of new markets
3. Extension of the product range
4. Reduced labour costs
5. Improved production processes
6. Reduced materials
7. Reduced environmental damage
8. Replacement of products/services
9. Reduced energy consumption
10. Conformance to regulations

These goals vary between improvements to products, processes, and services, and dispel the popular myth that innovation deals mainly with new product development. Most of the goals could apply to any organization, be it a manufacturing facility, marketing firm, hospital, or local government.
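The budget arithmetic above (four percent of a one-billion-unit turnover equals forty million units) can be checked with a few lines. The function name and the half-percent/twenty-percent bounds used below come straight from the figures in the text:

```python
# Innovation budget as a share of turnover, using the figures quoted
# in the text: 0.5% (low rate of change), 4% (average), 20% (high).
def innovation_budget(turnover: float, rate: float = 0.04) -> float:
    """Return the innovation investment implied by a turnover and a rate."""
    return turnover * rate

turnover = 1_000_000_000  # one billion currency units
print(f"{innovation_budget(turnover):,.0f}")        # 40,000,000 (average, 4%)
print(f"{innovation_budget(turnover, 0.005):,.0f}") # 5,000,000 (low, 0.5%)
print(f"{innovation_budget(turnover, 0.20):,.0f}")  # 200,000,000 (high, 20%)
```

The spread between the low and high ends, a factor of forty, shows how widely the appropriate budget varies with an organization's rate of change.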

Failure of innovation

Research findings vary, ranging from fifty to ninety percent of innovation projects judged to have made little or no contribution to organizational goals. One survey of product innovation found that out of three thousand ideas for new products, only one becomes a success in the marketplace. Failure is an inevitable part of the innovation process, and most successful organizations factor in an appropriate level of risk. Perhaps it is because all organizations experience failure that many choose not to monitor the level of failure very closely. The impact of failure goes beyond the simple loss of investment: failure can also lead to loss of morale among employees, an increase in cynicism, and even higher resistance to change in the future.

Innovations that fail are often potentially good ideas that were rejected or postponed due to budgetary constraints, lack of skills, or poor fit with current goals. Failures should be identified and screened out as early in the process as possible. Early screening avoids unsuitable ideas devouring the scarce resources needed to progress more beneficial ones. Organizations can learn how to avoid failure when it is openly discussed and debated. The lessons learned from failure often reside longer in the organizational consciousness than lessons learned from success. While learning is important, high failure rates throughout the innovation process are wasteful and a threat to the organization's future.

The causes of failure have been widely researched and can vary considerably. Some causes are external to the organization and outside its control; others are internal and ultimately within the control of the organization. Internal causes of failure can be divided into causes associated with the cultural infrastructure and causes associated with the innovation process itself. Failure in the cultural infrastructure varies between organizations, but the following are common across all organizations at some stage in their life cycle (O'Sullivan, 2002):

1. Poor leadership
2. Poor organization
3. Poor communication
4. Poor empowerment
5. Poor knowledge management

Common causes of failure within the innovation process in most organizations can be distilled into five types:

1. Poor goal definition
2. Poor alignment of actions to goals
3. Poor participation in teams
4. Poor monitoring of results
5. Poor communication and access to information

Effective goal definition requires that organizations state explicitly what their goals are in terms understandable to everyone involved in the innovation process; this often involves stating goals in a number of ways. Effective alignment of actions to goals should link explicit actions such as ideas and projects to specific goals, and implies effective management of action portfolios. Participation in teams refers to the behavior of individuals in and of teams; each individual should have an explicitly allocated responsibility regarding their role in goals and actions, and the payment and reward systems that link them to goal attainment. Finally, effective monitoring of results requires the monitoring of all goals, actions, and teams involved in the innovation process.

Innovation can fail if it is seen as an organizational process whose success stems from a mechanistic approach, i.e. 'pull lever, obtain result'. While 'driving' change emphasizes control, enforcement, and structures, this is only a partial truth in achieving innovation. Organizational gatekeepers frame the environment that "enables" innovation; however, innovation is "enacted" – recognized, developed, applied, and adopted – through individuals. Individuals are the 'atoms' of the organization, close to the minutiae of daily activities. Within individuals, a gritty appreciation of small details combines with a sense of desired organizational objectives to deliver (and innovate for) a product or service offer. From this perspective, innovation succeeds through strategic structures that engage the individual to the organization's benefit. Innovation pivots on intrinsically motivated individuals, within a supportive culture, informed by a broad sense of the future. Innovation implies change, and can run counter to an organization's orthodoxy; space for a fair hearing of innovative ideas is required to balance the autoimmune exclusion that can quell an infant innovative culture.

Measures of innovation

There are two fundamentally different types of measures for innovation: the organizational level and the political level. The measure of innovation at the organizational level relates to individuals, team-level assessments, and private companies from the smallest to the largest. Measurement of innovation in organizations can be conducted through surveys, workshops, consultants, or internal benchmarking. There is today no established general way to measure organizational innovation. Corporate measurements are generally structured around balanced scorecards covering several aspects of innovation, such as business measures related to finances, innovation-process efficiency, employees' contribution and motivation, as well as benefits for customers. Measured values vary widely between businesses, covering for example new product revenue, spending on R&D, time to market, customer and employee perception and satisfaction, number of patents, and additional sales resulting from past innovations.

At the political level, measures of innovation focus more on a country's or region's competitive advantage through innovation. In this context, organizational capabilities can be evaluated through various evaluation frameworks, such as those of the European Foundation for Quality Management. The OECD Oslo Manual (1995) suggests standard guidelines for measuring technological product and process innovation. Some people consider the Oslo Manual complementary to the Frascati Manual from 1963. The new Oslo Manual from 2005 takes a wider perspective on innovation and includes marketing and organizational innovation. These standards are used, for example, in the European Community Innovation Surveys.

Innovation has also traditionally been measured through expenditure, for example investment in R&D (research and development) as a percentage of GNP (gross national product). Whether this is a good measurement of innovation has been widely discussed, and the Oslo Manual has incorporated some of the critique of earlier methods of measuring. That said, the traditional methods of measuring still inform many policy decisions. The EU Lisbon Strategy has set a goal that average expenditure on R&D should be 3% of GNP. The Oslo Manual is focused on North America, Europe, and other rich economies; in 2001 the Bogotá Manual was created for Latin America and the Caribbean countries.

Many scholars claim that there is a great bias towards the "science and technology mode" (S&T-mode or STI-mode), while the "learning by doing, using and interacting mode" (DUI-mode) is widely ignored. For example, an organization may have the better high-tech product or software, yet there are also crucial learning tasks important for innovation; such measurements and research are rarely done.

A common industry view (unsupported by empirical evidence) is that comparative cost-effectiveness research (CER) is a form of price control which, by reducing returns to industry, limits R&D expenditure, stifles future innovation, and compromises new products' access to markets. Some academics claim that CER is a valuable value-based measure of innovation which accords truly significant advances in therapy (those that provide 'health gain') higher prices than free market mechanisms would. Such value-based pricing has been viewed as a means of indicating to industry the type of innovation that should be rewarded from the public purse. The Australian academic Thomas Alured Faunce has developed the case that national comparative cost-effectiveness assessment systems should be viewed as measuring 'health innovation' as an evidence-based concept distinct from valuing innovation through the operation of competitive markets (a method which requires strong anti-trust laws to be effective), on the basis that both methods of assessing innovation in pharmaceuticals are mentioned in annex 2C.1 of the AUSFTA.

As the IT consultant, I would suggest to the president that the university improve its IT infrastructure as well as its technology, so that not only internet connectivity is enhanced but also the different tasks of the different offices of the University of Southeastern Philippines.

While researching on the internet, I came across the so-called Advanced Console Server (ACS), which secures the management of remote IT infrastructure. What is IT infrastructure management? IT infrastructure management is a set of concepts and policies for managing information technology (IT) infrastructure, development, and operations, as codified in the Information Technology Infrastructure Library (ITIL).

Over the past decade, corporate information technology (IT) departments have replaced large mainframes and minicomputers with smaller, less costly, and more scalable servers. This transition offers substantial benefits. In the past, increasing an organization's computing power meant replacing a large computer with an even larger one – a process that was both expensive and time-consuming. The switch to clustered computing (also called server farms) reduced both the expense and the disruption of adding more computing resources. Mainframes offered high availability and reliability, but at a premium price. Servers offer equally reliable and available computing resources using less expensive hardware – and because servers are incrementally scalable, adding more computing power leverages previous IT investments.

As IT departments adopted cluster computing practices, managing the widely dispersed servers became a significant issue. Monitoring and managing remotely located servers usually relies on an on-site IT staff member, a third-party service contract, or a willing but untrained employee. The first two are costly, and the third compromises IT access and security policies. This white paper explores the issues facing the IT staff as it attempts to manage the dispersed and growing IT infrastructure. As more servers and more support equipment connecting these servers enter the corporate computing environment, the demand for high-quality, platform-independent infrastructure management tools also increases. Effectively managing centralized or remote servers, networking equipment, and other IT assets will remain a critical aspect of IT infrastructure management.

Identifying and Meeting Infrastructure Management Challenges
Managing today's IT infrastructure requires an approach that maintains virtually continuous business operation, provides high levels of security, and reduces operating cost and complexity while increasing IT staff productivity.

Each of these challenges presents IT administrators with unique issues, many of which involve ensuring secure and immediate access to the IT infrastructure.

An effective way to achieve this access is through a console server, which connects the serial console ports of many managed devices to a single appliance. An IT administrator can access any managed device’s console from any location at any time, even when the production network is unavailable. State-of-the-art console servers offer the following features:

Scalability — The ability to manage several servers in high-density racks is beneficial.
Port density — A console server should use a minimum amount of space in a rack to manage all the equipment in that rack.
Reliability — All connectors are located on the same side of the console server; the unit must be rack-mountable; and the unit should require minimum cabling and offer a high level of integration.
Power supply — The console server's power supply needs to be integrated into the device.
Compatibility — The console server needs to be compatible with all of the IT organization's servers and network equipment.
Security — Comprehensive support for IT security policies, including multilevel user access control and logging capability, is critical.
Audit capability — The console server needs to log all its activities in order to maintain security and regulatory auditing compliance.
Hardware flexibility — Support for out-of-band management, the ability to connect to more than one LAN, and integration with service processors and intelligent power distribution units (IPDUs) is useful.
Software flexibility — Upgradability helps to take advantage of emerging technologies.
Cost and service — The console server vendor should be committed to a product roadmap in IT infrastructure services.

Maintaining High Availability
Widely dispersed computing resources create an environment that relies on component peak performance for a maximum amount of time. Infrastructure problems, including environmental factors, hardware and operating system errors, power failures, and natural disasters, comprise 20 percent of all unplanned data center downtime. When the network is operating properly, local or remote access is available through the network (in-band) and standard programs such as SSH and encrypted Web browser sessions. However, if a server or a network router has failed, IT administrators need access to the failed device through an out-of-band mechanism that connects to the device's serial port and provides low-level control such as hardware self-tests or power cycling.

Maintaining Network Security
Widely dispersed computing resources often create a serious challenge to maintaining network security. Established access policies become more troublesome to enforce (e.g., when a non-IT employee reboots a server). Likewise, talking an employee through the steps to change BIOS settings involves employee access to administrative passwords, which violates established authentication, authorization, and auditing policies. In the event of an IT audit, these practices increase company vulnerability to charges of security policy violation and non-compliance with regulatory requirements. Out-of-band access strengthens IT security policies by supporting features such as encryption of console traffic, authentication protocols including token-based authentication, and IP packet filtering, among others. Role-based access limits access to only those administrators with responsibility for maintaining specific servers. A console server also needs to support session management and maintain local and remote event logs, access logs, and data logs. Effective physical security (e.g., keeping servers in a locked room) is rarely possible at remote locations that do not employ full-time IT staff. Access to a server's serial console from anywhere strengthens server security and enhances IT policies governing the confidentiality of corporate data.
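The role-based access idea described above can be sketched in a few lines of Python. This is a minimal illustration only; the function and role names are hypothetical and do not reflect any real console server's API.

```python
# Hypothetical sketch of role-based console access: an administrator
# may open a device's console only if one of their roles appears in
# that device's access-control list.

def can_access(admin_roles, device_acl):
    """Return True if any of the administrator's roles is authorized
    for the device."""
    return any(role in device_acl for role in admin_roles)

# Example policy: the web server console is limited to web admins.
acl = {"web-admin", "senior-admin"}
print(can_access({"web-admin"}, acl))  # True
print(can_access({"db-admin"}, acl))   # False
```

In practice a console server would evaluate a policy like this against groups fetched from an authentication server rather than hard-coded sets.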

Centralizing Data Center Management
Today's data center environment includes both a heterogeneous mix of servers and geographically dispersed servers and other devices such as uninterruptible power supplies and PBX phone systems. The IT staff needs to be able to manage this diverse environment from any location at any time of day or night, without regard to hardware type, operating system, or network status. A console server needs to support all popular server operating systems and hardware features, such as a service processor and its Intelligent Platform Management Interface (IPMI). In addition, because many network-connected devices include only a serial console interface, a console server needs to be able to aggregate operating information from these devices. Uninterruptible power supplies, network routers and switches, telephony systems, and environmental control systems are examples of non-computing devices that can be controlled through a console server.

Controlling Costs
Without remote access to a device's serial port, IT administrators are unable to securely communicate with an inoperative server unless they travel to the site. IT administrator travel incurs costs. A far larger cost is incurred if server availability is critical to the company's business. Productivity losses throughout the company further increase the negative effects of unplanned downtime. Space requirements also increase without remote access to the device's serial port. When servers and other network equipment are functioning properly, in-band access to the devices and systems management applications normally suffices to monitor and manage the IT infrastructure. A serial console server directly addresses infrastructure hardware failures using secure out-of-band access to the failed device, even when the network is not functioning. The out-of-band capability enables IT administrators to communicate with a failed device without having to be physically present at the site where the device is located. Quicker access to failed devices reduces unplanned downtime on the production network and enhances a company's ability to maintain or even improve its revenue stream.
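The cost argument above can be made concrete with a back-of-envelope comparison. All figures below are hypothetical placeholders, not measured values.

```python
# Back-of-envelope comparison of outage recovery with and without
# out-of-band access. Figures are illustrative only.

def downtime_cost(hours_down, revenue_per_hour, travel_cost=0.0):
    """Total cost of an outage: lost revenue plus any travel cost."""
    return hours_down * revenue_per_hour + travel_cost

# On-site visit: 4 hours to reach and fix the site, plus travel.
onsite = downtime_cost(4.0, 1000.0, travel_cost=300.0)
# Out-of-band: the same failure power-cycled remotely in 30 minutes.
remote = downtime_cost(0.5, 1000.0)

print(onsite, remote)  # 4300.0 500.0
```

Even with made-up numbers, the gap shows why avoiding the site visit dominates the price of the console server itself.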

Reducing Complexity
The variety of servers and other devices in a typical corporate environment complicates detection of hardware failures and initiation of correct recovery features. Each device may support a different serial port connector, and there is no standard for the pin assignments on the commonly used RJ-45 serial connector. Add to this the different types and lengths of serial cables, and merely gaining access to serial ports often requires significant investments of IT staff time and budget. Simpler cabling and connectors increase the infrastructure management value of a console server. Standard CAT5 cables and RJ-45 connectors, coupled with configurable cabling pin-outs, add flexibility and eliminate the need for specialized adapters to connect to the console server.

Increasing Staff Productivity
Automating as many routine, repetitive administrative tasks as possible contributes significantly to a more productive IT staff. Locating and creating an inventory of all IT assets, particularly at remote locations, consumes many hours of staff time that could be more productively used on other, more strategic tasks. Staff hiring and training also becomes more difficult, time-consuming, and expensive in a heterogeneous and dispersed environment in which each server and device relies on a different user interface. A serial console that automates discovery of any serially connected device saves configuration and installation time, and reduces the chance of human error. In the same vein, a consistent user interface simplifies configuration of a large number of servers and other devices that could be dispersed among many locations. And a consistent, simple, Web-based interface reduces hiring and training costs.

The Advanced Console Server Solution
ACS advanced console servers provide IT and network operations center staff with the ability to perform secure, remote and out-of-band data center management of IT infrastructure from anywhere in the world. It also offers an Enhanced Security Framework that provides current security profiles and enough flexibility for IT administrators to create custom security profiles that comply with existing network security policies.

Maximizing Network Availability
Perhaps the single most important objective of today's IT staff is to ensure that data is available to suppliers and company employees without interruption. Unplanned server or network downtime undermines that objective and causes productivity losses and reduced revenue to every one of the company's partners. To ensure that an organization's data and its network are always available, the console server provides both in-band and out-of-band remote access to servers and other serially connected networked devices.
IT access to the console server is available from any location at any time, providing the IT administrator with low-level control of network attached hardware. This control includes hardware self-test, BIOS access, power cycling, and remote rebooting.

Protecting Network Security
The console server integrates with a company's existing security structure and supports enterprise security policies. It supports strong user authentication using two-factor authentication with RSA SecurID and device authentication using certificates and a host key. The console server is compatible with virtually all authentication servers, including RADIUS, LDAP, Active Directory, TACACS+, Kerberos, and NIS protocols. Supported authorization methods include local access control lists or server-based group authorization through Active Directory, LDAP, TACACS+, or RADIUS. The console server also supports role-based authorization, and maintains both remote and local data and event logs and audit trails. It supports data encryption and secure out-of-band dial-up access through an ISDN modem.

Centralizing Data Center Management
The console server provides both in-band and out-of-band remote access to connected serial devices. In-band access is available through single or dual Ethernet ports, which support up to 1 Gbit/second transmission speeds and secure Telnet and SSH access to serial devices. A PC card slot (16- or 32-bit) supplies wireless remote access. Out-of-band access is available through either a built-in modem or a customer-supplied V.92 or ISDN modem. The ACS console server easily configures and manages large data centers using a browser-based interface. Tight integration with Avocent DSView 3 software provides an effective method to configure and manage servers using a consistent, simple interface. The ACS console server also integrates power management from a single interface for any third-party power supply vendor.

Controlling Operational Costs
Remote access to all devices connected to the advanced console server virtually eliminates the need for IT staff to travel to remote sites in order to manage and maintain servers and other network equipment. Not only does this save travel costs, but remote access also reduces recovery time for unplanned downtime, which helps a company meet its revenue goals.

Providing Easier IT Management
By using a simple, secure Web-based interface, the advanced console server enables an IT administrator to configure and manage any networked device with a serial port. This includes servers, routers, switches, and some non-computing devices such as power supplies, HVAC controls, and building alarms.

The advanced console server also simplifies cabling requirements. Standard CAT5 cabling terminated with inexpensive RJ-45 connectors supplies the needed connections to the console server. Because RJ-45 pin-outs differ from one manufacturer to another, the advanced console server provides a software-configurable pin-out feature to simplify serial connections between a device and the console server.

Enhancing IT Staff Productivity
The auto-discovery mechanism of the advanced console server saves significant amounts of IT time during initial installation and configuration. Auto-discovery detects the names of connected devices and updates the network configuration automatically, reducing the possibility of data entry errors and further helping to maximize uptime. The auto-discovery feature also detects servers that have been relocated, which allows the IT staff to avoid time-consuming and error-prone re-configuration. The advanced console server's consistent Web-based interface also simplifies hiring and training requirements, and enables the IT staff to configure and manage large numbers of servers and other devices.

Therefore, the advanced console server provides secure remote access to serial consoles for servers and other devices, including power supplies, telephony equipment, and network routers and switches. Out-of-band capability enables secure console access from anywhere at any time regardless of network availability, reducing downtime and virtually eliminating travel to remote sites. The advanced console server solution includes integrated power management and centralized management to support network security, administration, maintenance, and upgrades. The console server reduces operational costs, automates device discovery, and simplifies cabling and pin-out requirements; beyond that, it also improves internet connectivity by easing network traffic.

Technology
I would suggest the 2.5 STABLE version of the Lamit 2Pro Advanced and Lamit 2Pro Mini Power servers to the University President. Despite the very small dimensions of the Mini Power server, it maintains all the features and characteristics of the Lamit 2Pro server platform. The new generation of servers includes wireless connectivity (N type, offering an extended coverage area and maximum transfer rates), providing easier user connection. The new server version also offers new encryption facilities and better, more professional management of the users' LAN.

The Lamit 2Pro servers were tested in difficult conditions, alongside the high-throughput bidirectional satellite connections offered by Lamit, in different regions of the world and at different levels of use, from services for small or mid-sized users up to the broadband satellite communications provided to Romanian and foreign military bases in Iraq, Afghanistan and Kuwait. The servers/routers have advanced functions for accelerating and prioritizing data transfer between the user's network and the internet, helping VoIP and VPN communications. The Lamit Company also offers broadcast services via satellite, occasional or permanent, as well as special encrypted governmental connections, dedicated networks, and point-to-point and point-to-multipoint connections of the SCPC/SCPC or SCPC/DVB-S2 types.

Military bases, governmental agencies, universities, internet cafés, drilling, oil, gas and petroleum companies, as well as various corporations and individuals from all over the world (USA, Asia, Middle East, Europe and Africa) use the high-speed satellite connections offered by the Lamit Company daily for worldwide communication and safe data transfer. The Lamit 2Pro server platform has been adapted to improve communication in each of the previously mentioned environments.

Last but not least, it is important to mention that in 2009 the Lamit Company won two international awards, the "International Trophy for Technology and Quality" and the "Golden Award for Quality and Business Prestige", the analysts' conclusions being:

The awards' selection committee said that after a deep analysis of the company's activity, they concluded that thanks to its innovative capabilities, Lamit Co. has succeeded in staying at the top of the companies that operate in the field of satellite transmissions. The Lamit Company made itself known through its innovative solutions, the improvements it made to the quality of fixed and mobile satellite transmission services, and its value-added networking solutions, which is why it has received multiple commendations and international awards in recent years.

Steps and Processes:
I would suggest the following tips and improvements to the university network administrator, network technicians, university computer technicians, and university employees, as well as to students:

#1 Ditch the modem. The first tip is to get rid of the dial-up modem and move to ADSL. Broadband is available at low cost in most areas. The university network technician should check with local providers to see whether broadband is supported by the local exchange. If the university is already on broadband, it can probably skip the rest of these tips, because everyone will be enjoying life rather than fretting about connection speed.

#2 Update drivers. Perhaps the university must keep its modem because funds are limited or ADSL is not available in the area; if so, read on. Driver files are updated regularly by most modem manufacturers. For some modems, the technician can also "flash upgrade" the software in the modem to provide the latest (and fastest) communications software. Even so, he should be sure the driver is right for his operating system. To find the latest drivers, he can enter the modem details into a search engine such as www.yahoo.com or www.google.com.ph together with the word "driver". So, to find drivers for a USR Sportster modem, he would enter "USR Sportster driver" and follow any instructions on how to install it. He can check the current modem drivers from Control Panel. With Windows XP, select Start | Control Panel | Phone and Modem Options | Modems | Properties | Drivers.

#3 Tweak settings. Host computers have some settings that may improve modem throughput. All data sent over the internet travels in packets, and the size of these packets is the Maximum Transmission Unit, or MTU. The aim is to send packets that are as large as possible without them needing to be broken down into smaller packets, which would slow down the connection. A modem user should set the MTU to 1500, the RWIN multiplier to 10 and Time to Live to 35. Tools such as Tweak-Me or Tweak-XP can apply these settings automatically.
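The arithmetic behind the MTU tip can be sketched as follows. This assumes standard header sizes (20-byte IP header, 20-byte TCP header, 8-byte ICMP header) and is only a rough model of why 1500 is the usual target.

```python
# Why MTU matters: each packet carries headers, so the usable TCP
# payload (the Maximum Segment Size) is the MTU minus 40 bytes.
# A ping used to probe for fragmentation carries IP + ICMP headers.

IP_HEADER = 20
TCP_HEADER = 20
ICMP_HEADER = 8

def tcp_mss(mtu):
    """Largest TCP payload that fits in one packet of size `mtu`."""
    return mtu - IP_HEADER - TCP_HEADER

def max_ping_payload(mtu):
    """Largest ping payload that fits unfragmented
    (on Windows: ping -f -l <size> <host>)."""
    return mtu - IP_HEADER - ICMP_HEADER

print(tcp_mss(1500))           # 1460
print(max_ping_payload(1500))  # 1472
```

If pings with a 1472-byte payload go through without the "needs to be fragmented" error, an MTU of 1500 is safe for that link.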

#4 Use FTP download wherever possible. If students want to download files, they can often choose between FTP and HTTP download. FTP (File Transfer Protocol) is usually faster for file transfers, so they should choose it when they can.

#5 Use a high speed port. This'll only apply to readers with really old computers. The serial port may use an old, slow chip called a UART. The answer is to fit a high speed serial port with a 16550 UART chip or to fit an internal modem which includes one of these beasties.

#6 Use a download tool. There's nothing more frustrating than losing a connection near the end of a one-hour download. The good news is that most downloads are resumable, which means they can be restarted from where the connection failed. The university computer technician needs the right tool to manage the reconnection; one of the best is the shareware GetRight. GetRight also searches for the fastest download sites and splits the download between several sites, with the downloads running in parallel for the maximum possible download speed.
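The splitting trick such tools use can be sketched in a few lines. This is an illustrative model only, not GetRight's actual algorithm: the file is divided into contiguous byte ranges that can each be fetched in parallel (for example with HTTP Range requests) and resumed independently.

```python
# Divide a file of total_size bytes into `parts` contiguous
# (start, end) byte ranges, inclusive, covering the whole file.

def split_ranges(total_size, parts):
    base, extra = divmod(total_size, parts)
    ranges, start = [], 0
    for i in range(parts):
        # The first `extra` parts get one extra byte each.
        length = base + (1 if i < extra else 0)
        ranges.append((start, start + length - 1))
        start += length
    return ranges

print(split_ranges(1000, 3))  # [(0, 333), (334, 666), (667, 999)]
```

Each range maps directly to a request header like `Range: bytes=334-666`, so a failed part restarts from its own offset instead of from zero.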

#7 Use a faster browser. Once computer users have connected, they can start any browser and run it in a second window. Opera is one of the fastest, so why not download a free copy and give it a test run?

#8 Manage the computer cache. Every time university offices use the internet, images and other files are downloaded onto the machine's hard disk. If a particular image or other file is needed in a subsequent session, it can be pulled from the disk faster than it could be downloaded again. These files are kept in a "Temporary Internet Files" folder, often called the cache. When the folder is full, Windows deletes the oldest files. University employees can vary the size of this folder by visiting Control Panel and selecting Internet Options. If they increase it, more files can be stored on the hard disk, but if they go too far, a slow PC may spend too long searching the cache. They'll need to experiment to find the right level for their computers and internet connection speed.
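The "delete the oldest files when the folder is full" behaviour described above is essentially a least-recently-used cache, which can be sketched like this (a toy model, not how Windows actually stores Temporary Internet Files):

```python
# Toy model of a browser cache with a fixed capacity that evicts
# the least-recently-used file when full.
from collections import OrderedDict

class BrowserCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.files = OrderedDict()

    def fetch(self, url):
        """Return a cached file, or 'download' it, evicting the
        oldest entry when the cache is full."""
        if url in self.files:
            self.files.move_to_end(url)     # recently used again
            return "cache hit"
        if len(self.files) >= self.capacity:
            self.files.popitem(last=False)  # drop the oldest file
        self.files[url] = "contents"
        return "downloaded"

cache = BrowserCache(capacity=2)
print(cache.fetch("a.gif"))  # downloaded
print(cache.fetch("b.gif"))  # downloaded
print(cache.fetch("a.gif"))  # cache hit
print(cache.fetch("c.gif"))  # downloaded (evicts b.gif)
print(cache.fetch("b.gif"))  # downloaded again
```

The trade-off in the tip falls out of the model: a bigger capacity means more hits, but every lookup has a larger structure to search.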

#9 Define a blank homepage. Each time students start a browser, it will go to the defined homepage. If this is slow, they should change the home page to a fast web site; a real speed nut can set it to blank. To do this, go to Internet Options as above and select Use Blank. Now the browser is up and running in record time.

#10 Don't display images. Text-only pages are much faster to download, and computer technicians can easily restore images when required. Here's how to set whether to display images: from Internet Options (see above) select the Advanced tab, scroll down to the multimedia section, then select or deselect "Show pictures." Select Apply, then OK, to save the change. Make these changes and they'll be cruising in the fast lane!

Infrastructure:
I would suggest the following computer hardware and software infrastructure. Computers should be running at least a Pentium 2.0 GHz with a 48.8K modem. The computer technician should try different internet software (pick and choose what works better). The network technician could also use different operating systems (Windows XP, Windows Vista or Linux). Host computers should be upgraded to faster hardware (Pentium 4, Dual Core, Core 2 Duo, AMD Sempron, AMD Athlon) and to a faster connection (DSL or full T1/T2).

The university internet server needs a 24-hour dedicated connection to the internet; in short, internet server = a "unix machine" + firewall + router + phone lines to the ISP. Another suggestion is to switch to a better/faster/larger ISP (not the slow commercial online services).

Innovations:
I would suggest innovations in cable modem/router tweaking to improve the university's internet connection speed. All modems make it possible for established communications channels to support a wide variety of data communication. Like other modems, a cable modem receives and sends data by modulating and demodulating signals. However, cable modems differ from other modems because they also function like routers. Broadband internet data is delivered into the home or office over a coaxial cable line that also carries television signals; the information travels like a TV channel through the coaxial line. The cable modem separates the data from the television signals and directs the data to the PC and the video to the television.

Local traffic is the biggest speed cap that plagues cable modems and a common source of a slow cable internet connection. Cable modems work on a network/grid that connects to a T3 router running at 45 megabits per second. Depending on where the computers are placed, they could be on a busy grid or a not-so-busy one. Then there is internet traffic in general. The network administrator's best bet, if he really needs the speed, is to pick a time when not as many people are online; there is a noticeable difference in traffic between 2 AM and 6 PM.

There are a number of ways to improve the performance of the university's cable modem/router. Unlike old-fashioned dial-up modems, there isn't a whole lot one can do to increase the raw speed of a cable connection, but the administrator can tweak the way the broadband connection sends data packets back and forth. For anyone new to this, there are a number of programs that will automatically set the best values for the connection; some of the most popular are EasyMTU, I-Speed, Intelli Dial-Up, Smartalec and Smartplay.

There's also a Web browser called Voyager 5000, made by Smartalec, that's much faster than regular Internet Explorer. Updating the drivers for the computer's Network Interface Card (NIC) can give the most noticeable speed boost above everything else. Some good places to check for drivers are www.drivershq.com, www.download.com, and even www.altavista.com.

Improving university cable internet speeds with connection teaming is also an option. Midpoint has a feature called connection teaming, which combines multiple connections to the internet for increased bandwidth. Along with connection teaming, the software splits large files being downloaded into multiple smaller parts and downloads each part at the same time along each connection. Cable modem companies sometimes allow the network technician to purchase additional IP addresses for a monthly fee; @Home does this for $5 per IP address. @Home caps the bandwidth per account, not per IP address, but at the very least this might increase the efficiency of the internet connection and allow throughput to come closer to the capped maximum speed.

On a very basic level, the host computer's performance affects internet performance as well. If a university computer isn't running at its best, neither will its broadband internet connection be.

Maintenance
I would recommend the following tips on university Wi-Fi network and internet connection maintenance:

1. Upgrade and Add Equipment
Many have heard of basic Wi-Fi equipment like network routers (or access points) and wireless adapter cards. While good routers and adapters can last for many years, in general the university network technician should periodically consider replacing old equipment. Newer network gear can be faster, more reliable, or offer better compatibility with the university offices' electronic gadgets.

Some network technicians fail to consider the benefits that more advanced gear like wireless print servers, game adapters and video cameras can bring to the offices and computer laboratories. Before settling for a bare-bones network setup with only a router and a few PCs, they should also research these other types of products, which can be acquired for very reasonable prices.

2. Move the Router to a Better Location
Some computer technicians quickly assemble their wireless network only to find that it won't function in certain areas of the building. Others enjoy a working setup at first but find later that their network crashes when a microwave oven or cordless phone is turned on. Perhaps even more common, PCs in a basement, attic or corner room may suffer from chronically poor network performance, yet the office employees fear attempting to fix it. One easy way to address these common Wi-Fi networking issues is to move the wireless router to a better location.

3. Change the Wi-Fi Channel Number
In most countries, Wi-Fi equipment can transmit signals on any of several different channels, similar to televisions. Most wireless routers ship with the same default channel numbers, and most technicians never think about changing this. However, if they experience radio interference from a nearby office's router or some other piece of electronic equipment, changing the Wi-Fi channel is often the best way to avoid it.
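A simple channel-picking policy can be sketched as follows. This assumes the 2.4 GHz band, where only channels 1, 6 and 11 do not overlap; the neighbour data in the example is made up for illustration.

```python
# Sketch of choosing a less congested Wi-Fi channel: count nearby
# networks operating on or near each non-overlapping channel and
# pick the quietest one.

NON_OVERLAPPING = (1, 6, 11)

def quietest_channel(neighbour_channels):
    """Return the channel from 1/6/11 with the fewest neighbouring
    networks within +/-2 channels of it."""
    def congestion(ch):
        return sum(1 for n in neighbour_channels if abs(n - ch) <= 2)
    return min(NON_OVERLAPPING, key=congestion)

# Nearby routers seen on channels 1, 1, 6 and 2:
print(quietest_channel([1, 1, 6, 2]))  # 11
```

In practice a technician would get the neighbour channel list from a Wi-Fi scanning utility and re-check it occasionally, since nearby networks change over time.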

4. Upgrade Router Firmware
Wireless routers contain built-in programmable logic called firmware. A version of firmware is installed on the router by the manufacturer, and this logic is essential to the operation of the device. However, many routers also offer a firmware upgrade capability that allows newer versions to be installed. Updating the firmware can provide performance improvements, security enhancements or better reliability. University network personnel should watch for firmware updates from the router manufacturer and upgrade as needed.

5. Increase Signal Strength and Range of the Router
No matter where in a building a wireless router is installed, sometimes the Wi-Fi signal will simply not be strong enough to maintain a good connection. The likelihood of this problem increases with longer distances and with severe obstructions, such as brick walls, between the router and the wireless client. One way to solve this problem is to upgrade the Wi-Fi antenna installed on the router. Some routers do not support antenna upgrades, but many do. The alternative involves installing an additional device called a repeater.

6. Increase Signal Strength and Range of the Clients
As with wireless routers, technicians can also improve the signal strength of wireless clients. They should consider this when dealing with a single Wi-Fi device that suffers from a very short signal range compared to the rest of the devices. This technique can improve the ability of laptop computers to connect to Wi-Fi hotspots, for example.

7. Increase Wireless Network Security
Many authorized network personnel consider their wireless network setup a success when basic file and internet connection sharing are functional. However, if proper security features are not in place, the job remains unfinished. They should follow a checklist of essential steps for establishing and maintaining good Wi-Fi security on an office and laboratory network.

----------------------------------------------------------------------------------------------------------------------------------------------
Question: If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000 words)

Response: The Internet has made possible entirely new forms of social interaction, activities and organizing, thanks to its basic features such as widespread usability and access.

Doubtless, it is my great honor and privilege to be hired by the University President as an IT Consultant. But before giving my suggestions and recommendations, I prefer to define some terminologies.

What is the Internet?

The Internet is a global system of interconnected computer networks that use the standardized Internet Protocol Suite (TCP/IP). It is a network of networks that consists of millions of private and public, academic, business, and government networks of local to global scope that are linked by copper wires, fiber-optic cables, wireless connections, and other technologies. The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail, in addition to popular services such as online chat, file transfer and file sharing, online gaming, and Voice over Internet Protocol (VoIP) person-to-person communication via voice and video.

The complex communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. While the hardware can often be used to support other software systems, it is the design and the rigorous standardization process of the software architecture that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been delegated to the Internet Engineering Task Force (IETF). The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. Resulting discussions and final standards are published in a series of publications, each of which is called a Request for Comments (RFC), freely available on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet standards.

*Workplace

The Internet is allowing greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections and web applications.

*Remote access

The Internet allows computer users to connect to other computers and information stores easily, wherever they may be across the world. They may do this with or without the use of security, authentication and encryption technologies, depending on the requirements. This is encouraging new ways of working from home, collaboration and information sharing in many industries.

a. The Domain Name System (DNS) is a hierarchical naming system for computers, services, or any resource connected to the Internet or a private network. It associates various information with domain names assigned to each of the participants. Most importantly, it translates domain names meaningful to humans into the numerical (binary) identifiers associated with networking equipment for the purpose of locating and addressing these devices worldwide. An often used analogy to explain the Domain Name System is that it serves as the "phone book" for the Internet by translating human-friendly computer host names into IP addresses. For example, www.example.com translates to 208.77.188.166. In general, the Domain Name System also stores other types of information, such as the list of mail servers that accept email for a given Internet domain. By providing a worldwide, distributed keyword-based redirection service, the Domain Name System is an essential component of the functionality of the Internet.
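The phone-book analogy above can be sketched as a toy lookup table. This is only an illustration of the idea, not a real resolver: actual DNS is a distributed, hierarchical database, and in Python a real lookup would go through something like `socket.gethostbyname`.

```python
# Toy "phone book" for the Internet: host names map to IP addresses.
# Illustrates the analogy only; real DNS is distributed and hierarchical.
phone_book = {
    "www.example.com": "208.77.188.166",  # the example used in the text
}

def resolve(hostname):
    """Return the IP address recorded for hostname, mimicking a DNS lookup."""
    return phone_book.get(hostname)

print(resolve("www.example.com"))  # 208.77.188.166
```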

b. An Internet exchange point (IX or IXP) is a physical infrastructure that allows different Internet service providers (ISPs) to exchange Internet traffic between their networks (autonomous systems) by means of mutual peering agreements, which allow traffic to be exchanged without cost. IXPs reduce the portion of an ISP's traffic which must be delivered via their upstream transit providers, thereby reducing the average per-bit delivery cost of their service. Furthermore, the increased number of paths learned through the IXP improves routing efficiency and fault-tolerance. The primary purpose of an IXP is to allow networks to interconnect directly, via the exchange, rather than through one or more third-party networks. The advantages of the direct interconnection are numerous, but the primary reasons are cost, latency, and bandwidth. Traffic passing through an exchange is typically not billed by any party, whereas traffic to an ISP's upstream provider is.

c. Internet Protocol version 6 (IPv6) is the next-generation Internet protocol version designated as the successor to version 4 (IPv4), the first implementation used in the Internet and still in dominant use. It is an Internet-layer protocol for packet-switched internetworks. The main driving force for the redesign of the Internet Protocol was the foreseeable exhaustion of IPv4 addresses. IPv6 was defined in December 1998 by the Internet Engineering Task Force (IETF) with the publication of an Internet Standard specification, RFC 2460.
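As a small illustration of why the redesign was needed, Python's standard `ipaddress` module can show the two address formats side by side, along with the size of each address space. The IPv6 address below is a made-up example from the documentation prefix, not one from the original text.

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
v4 = ipaddress.ip_address("208.77.188.166")  # the IPv4 example from the text
v6 = ipaddress.ip_address("2001:db8::1")     # illustrative IPv6 address

print(v4.version, v6.version)  # 4 6
print(2**32)    # roughly 4.3 billion possible IPv4 addresses
print(2**128)   # the vastly larger IPv6 address space
```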

-In my personal view, having a fast, high-tech internet connection really matters in this institution, because this university demands excellence and expects its products to be superb and outstanding, not just here in our own country but all over the world. That is definitely the pride and reputation of the university. Of course, having an improved, developed and enhanced internet connection requires money. This is a public school, so I guess we will have difficulty replacing the old computer units with modern, high-tech ones. But let's set that aside. The more important matter is the wires and cabling (LAN, WAN, etc.), because according to my reliable source, the cabling, wires and media used in interconnection and networking really affect the speed of the internet connection. I would suggest that the university fund prioritize investment in this matter. If we want a technically advanced internet reputation, we must spend our money wisely before spending it on anything else. After putting first things first, the university can then replace all the old-school monitors, CPUs, keyboards, etc. (perhaps we can have flat-screen or even touch-screen monitors), if and only if we have enough funds for that.

Last edited by Marlie E. Sisneros on Tue Aug 18, 2009 4:57 pm; edited 5 times in total

All of us know how useful the internet is. By using the internet, whatever you want to search for, it will give you the information you need. According to the definition given by the wiki, the Internet is a global system of interconnected computer networks that use the standardized Internet Protocol Suite (TCP/IP). It is a network of networks that consists of millions of private and public, academic, business and government networks of local to global scope that are linked by copper wires, fiber-optic cables, wireless connections and other technologies. The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail, in addition to popular services such as online chat, file transfer and file sharing, online gaming, and Voice over Internet Protocol (VoIP) person-to-person communication via voice and video. As you can see, in all the activities of everyday life, the internet is part of us all the time, even in the business world. Not only in the business world but in all aspects of this whole universe.

Ref: http://en.wikipedia.org/wiki/Internet

Students nowadays are very lucky to have this kind of source of information, for it helps them (us) in all situations. It is really a big help in researching all the necessary information needed by all of us. It is also one way to communicate with other places, simply by chatting with your loved ones. All of the people are really influenced now by this kind of medium.

Now, all universities have computers and internet connections, for the internet is also introduced to all students so they will know how to use it. Even kids now know how to use some products of technology.

In the case of our university, the USEP, one of the problems is the internet connection. That problem occurred years ago, but until now it hasn't gotten a solution.

If I were hired as an IT consultant, that would be amazing...hehe! Well, in that situation, to help my university, after analyzing the problem I would think about the possible solutions to the addressed problem. And then I would tell them what is on my mind (he he...char). If you have observed, every time we enter the university, there is what we call an "orientation". Orientation is the act of orienting a person. At orientation you have the opportunity to receive assistance in selecting your classes, tour the campus, and get to know other new students. Orientation is also a time to address any questions you have for university staff, faculty, and administration. The universities do that every year, and not only universities but also the business world. Its purpose is to introduce a certain thing so it can be learned and understood. Back to the problem of the internet connection: I would initiate an orientation. In that orientation, I would be able to express my feelings, opinions and suggestions to all the students regarding the topic, which is the slow connection. I would also be able to introduce to them new policies for using the computers. I would suggest that these be strictly enforced. And because the rules are tight, there should be a person or persons assigned to look over the computers, and I hope he/she will do his/her part so we have an organized way of using the computers. Some policies that I hope will be followed are: a.) when entering the computer lab or virtual library, students should log in their names and leave their IDs, either school ID or library ID; b.) one student per computer (the person assigned should always check that); c.) when inserting a flash drive, students should first scan it (students should cooperate...I hope so! haizt).

Boost optimization. One way to improve internet speed is Click Boost, which presents itself as one software package for all your PC and Internet connectivity speed-boosting needs. It is marketed as a solution for great power and web speed, as it is capable of installing tweaks that lead to improved connection speed regardless of the connection's nature - dial-up, LAN or DSL. It also monitors your computer silently in the background for processes that have ended improperly, offering methods to free up unused virtual memory.

Boosting can be very helpful in speeding up an internet connection. Take Active Speed, which claims to dramatically improve any home or office Internet connection, including dial-up modems of any speed and high-speed connections such as cable modems, DSL, ISDN, T-1, LAN, etc. It works with all Internet services, including AOL and local ISPs. Active Speed's patent-pending Intelligent Optimization Engine will continually monitor and optimize your connection, so the more you use Active Speed the faster it gets. Another one is WebRocket, an Internet optimizer: a powerful, easy-to-use program for Windows 95, 98, NT, Me, 2000, and XP which claims to boost your Internet connection speed by up to 200%. Dr. Speed claims to boost your Internet connection by up to 250%! It promises to: 1.) optimize your Internet connection; 2.) increase surfing speeds by up to 250%; 3.) optimize your computer to reclaim lost memory; 4.) shorten download times for all files; 5.) speed up file sharing; 6.) increase computer performance; 7.) decrease wait times while applications work; 8.) run programs simultaneously without slowing your PC; 9.) display complete information about your computer; and much more.

Maybe the university can pick just one of those mentioned above; to prove which helps, it should test them first in order to determine which of them is most effective.

Another way to speed up the internet connection is to update the antivirus every day. I think this is also one cause of a slow internet connection: the computers are prone to viruses. Malfunctions will possibly occur, and we have all known and experienced them every time we use the computer. It always hangs. Updating the antivirus is not that difficult to do, because you just need to find the update and then click. There should be one person assigned to update the software every day; it may be the person assigned to look over the computers. By that, we will have an updated antivirus.

Increase the bandwidth. Bandwidth is a measure of available or consumed data communication resources, expressed in bit/s or multiples of it (kbit/s, Mbit/s, etc.). I think an increase in bandwidth and proper allocation can help improve the internet connection.

Ref: http://en.wikipedia.org/wiki/Bandwidth_(computing)
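A quick back-of-the-envelope calculation shows why the bandwidth figure matters. This sketch deliberately ignores protocol overhead, latency and congestion, so real transfers would be somewhat slower than these ideal numbers.

```python
# Idealized time to move a file at a given link speed (no overhead modeled).
def download_time_seconds(file_size_bytes, bandwidth_bits_per_sec):
    return (file_size_bytes * 8) / bandwidth_bits_per_sec

# A 10 MB file on a 2 Mbit/s link versus an upgraded 10 Mbit/s link:
print(download_time_seconds(10_000_000, 2_000_000))   # 40.0 (seconds)
print(download_time_seconds(10_000_000, 10_000_000))  # 8.0 (seconds)
```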

Maintenance of computers. In all situations, in order to have a well-managed and organized environment, the word "maintenance" should not be forgotten. It is always part of daily activities. Maintenance is the act of maintaining or the state of being maintained; also, the work of keeping something in proper condition. From the meaning itself, you can see that this word is very important in all activities that have something to do with keeping things in working order. That can be applied to computers. Like other things, computers also need some basic maintenance: run Disk Defragmenter, a scan disk, a virus scan, a malware scan, and clear the recycle bin. An unusually slow Internet connection is often the only sign that a computer is infected with viruses or other malware. Delete old files and temporary files. Never allow the free space on the C: drive to be less than 10% of the total size or twice the installed RAM (whichever is larger). A well-maintained PC will operate much better than a PC that has never had any maintenance. You can Google it, or your local computer repair store should be able to help you with this if you don't know how.

Ref: http://www.answers.com/topic/maintenance
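The free-space rule quoted above (at least 10% of the disk, or twice the installed RAM, whichever is larger) can be written out as a small check. The disk and RAM sizes below are made-up examples.

```python
# Minimum free space to keep on the system drive, per the rule in the text:
# max(10% of the disk size, 2 x installed RAM).
def minimum_free_bytes(disk_size_bytes, ram_bytes):
    return max(disk_size_bytes * 0.10, 2 * ram_bytes)

# Example: a 500 GB disk with 4 GB of RAM needs 50 GB kept free,
# because 10% of the disk exceeds twice the RAM here.
print(minimum_free_bytes(500 * 10**9, 4 * 10**9))  # 50000000000.0
```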

Reset the network. Sometimes restarting the home network, if you have one, will drastically increase the speed of the connection.

Optimize the cache of temporary Internet files. These files improve Internet connection performance by not downloading the same file over and over. When a web site puts its logo graphic on every page, the computer only downloads it when it changes. If you delete the temporary files, the logo must be downloaded again. If you disable the cache, it must be downloaded every time you view a page that uses it. This can be done by opening Internet Explorer, clicking on "Tools" at the top and choosing "Internet Options". On the General tab, click the "Settings" button next to Temporary Internet Files. Set "Check for newer versions" to "Automatically". Set the amount of disk space to use to 2% of your total disk size or 512 MB, whichever is smaller. In Firefox, click "Tools", then "Options", go to the Privacy tab, and then click on the Cache tab within it.
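The cache behavior described above can be sketched in a few lines. This is only an illustration of the idea (download once, reuse afterwards), not how a browser is actually implemented.

```python
# Minimal sketch of a browser cache: an asset is downloaded on the first
# visit and served from the cache on every later page that uses it.
cache = {}
downloads = 0

def fetch(url):
    """Return the asset for url, downloading only on a cache miss."""
    global downloads
    if url not in cache:
        downloads += 1                         # simulate the slow network trip
        cache[url] = "<bytes of " + url + ">"
    return cache[url]

fetch("http://www.example.com/logo.gif")  # first page: downloaded
fetch("http://www.example.com/logo.gif")  # next page: served from cache
print(downloads)  # 1
```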

Never bypass the router. A router is a networking device whose software and hardware are usually tailored to the tasks of routing and forwarding information. Most routers include a firewall that is very difficult for hackers to defeat. Routers connect two or more logical subnets, which do not necessarily map one-to-one to the physical interfaces of the router. If you don't need to use wireless, then hook your computer directly to your router. A router will only slow down your connection by a few milliseconds; you won't notice the difference, but the hackers will.

Ref: http://en.wikipedia.org/wiki/Router
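The idea that a router joins logical subnets can be illustrated with Python's standard `ipaddress` module. The subnet numbers below are invented for the example; deciding which subnet an address belongs to is, in very simplified form, where a forwarding decision starts.

```python
import ipaddress

# Two logical subnets a router might connect.
lan = ipaddress.ip_network("192.168.1.0/24")
lab = ipaddress.ip_network("192.168.2.0/24")

# Membership tests tell us which network a host sits on.
host = ipaddress.ip_address("192.168.1.42")
print(host in lan)  # True
print(host in lab)  # False
```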

Call the Internet service provider (ISP). Sometimes you just have bad service. They can usually tell whether your connection is substandard without having a technician come to your home. Just be nice and ask. That way, it can lessen the expense.

Upgrade computers. The term upgrade refers to the replacement of a product with a newer version of that same product, generally meaning a replacement of hardware, software or firmware with a newer or better version, in order to bring the system up to date or to improve its characteristics. Common hardware upgrades are installing additional memory (RAM), adding larger hard disks, replacing microprocessor cards or graphics cards, and installing new versions of software, although many other upgrades are often possible as well. Common software upgrades include changing the version of an operating system, office suite, anti-virus program, or various other tools. Although upgrades are for the purpose of improving a product, there are risks involved, including the possibility that the upgrade will worsen the product. When hardware is upgraded, there is a risk that it will not be compatible with other pieces of hardware in the system. For example, an upgrade of RAM may not be compatible with existing RAM in the computer. Other hardware components may not be compatible after either an upgrade or downgrade, due to the non-availability of compatible drivers for the hardware with a specific operating system. Conversely, there is the same risk of non-compatibility when software is upgraded or downgraded, causing previously functioning hardware to no longer function. When software is upgraded, there is a chance that the new version (or patch) will contain a bug, causing the program to malfunction in some way or not function at all. Upgrades can also worsen a product subjectively: a user may prefer an older version even if a newer one functions perfectly. If the computer is slow, it doesn't matter how fast your Internet connection is; the whole thing will just seem slow. You can only access the Internet as fast as your PC will allow you to.

Ref: http://en.wikipedia.org/wiki/Upgrade

Replace an old cable modem. Any solid-state electronics will degrade over time due to accumulated heat damage. A modem will have a harder and harder time 'concentrating' on maintaining a good connection as it gets older (signal-to-noise ratios will go down, and the number of resend requests for the same packet will go up). An after-market cable modem, as opposed to a cable-company modem, will frequently offer a better connection.

Another is blocking some sites that are not supposed to be opened and browsed, opening the internet connection only to sites that are useful to the students. Downloading media, videos and other entertainment files can also slow down the internet connection, so it is much better to allow a limited set of sites, just for school purposes. I know that with this suggestion lots of students will protest, murmuring against this kind of policy, but knowing that it will help speed up the internet connection, it can be seen as a positive.

Buying new computers is one of the options that needs to be considered in the situation of USEP. Although this university is a government school, if it has enough budget, then I suggest purchasing new computers with high specifications. That will help ensure a fast internet connection.

An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organization's IT systems. IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development.

They may also be responsible for user training and feedback. In many companies, these tasks will be carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties.

Career development

As an IT consultant, your immediate prospects depend on the size and type of the organization you work for. Movement between employers is common. Larger consultancies have an established career structure for their staff, with frequent appraisals and an emphasis on individuals managing their own career. A typical consultant moves from the daily responsibility of a project to a more strategic role with team leadership and responsibility.

The IT industry is so diverse and IT consultants perform such a variety of tasks that your career may develop into a number of different industries and sectors. Once you gain generalist experience you may want to specialize in a sector or a program, e.g. SAP, Oracle, or work as a senior consultant. IT consultants may take on greater responsibilities in another part of an organization (e.g. training and recruitment, project management, sales and account management roles). Other possible progression includes the development of specific technical expertise, possibly leading to contributing at national and international technical conferences. Some consultants go on to become IT specialists at partner level or IT architects.

If I were hired by the university president as an IT consultant, what I would suggest (technology, infrastructure, innovations, steps, processes, etc.) in order for the internet connectivity to be improved is technology.

There’s no doubt that the hard market has brought a scarcity of coverage, an expansion of exclusions and a significant increase in premiums for public entities. As a result, these organizations are struggling to find new ways to protect their assets.

Today’s managers are being asked to do more with less. They not only handle risk financing, but also must focus on overall risk management strategies, such as safety programs, educating staff members on new policy and procedures, developing proper loss control strategies in key areas and determining the effectiveness of each initiative.

To improve performance, managers need a more rapid and effective means to identify risks before they result in significant losses, to monitor key performance indicators, and to improve communications and share information among all stakeholders involved in the risk management process. To meet these needs, risk managers would significantly benefit from the connectivity, transparency and real-time benefits that Internet technology delivers. Thus, an IT consultant performs a variety of tasks that can help the improvement of this university.

Overview

Internet services are expanding rapidly. The demands they have placed upon the public network, especially the access network, are great. However, technological advances promise big increases in access speed, enabling public networks to play a major role in delivering new and improved telecommunication services and applications to consumers.

Technology as a Tool; the Internet as a Vehicle

Organizations are now looking to the Internet as a way to more directly and immediately facilitate the accessing and sharing of information with key decision-makers. After all, one bad claim can result in thousands, even millions of dollars in losses. To identify and control these losses, public entities must utilize Internet systems to achieve connectivity, transparency and real-time notification:

Connectivity

Connectivity, in the context of computer science, refers to the use of computer networks to link computers to one another, and provide information resources between computer systems and their final users. On a collective scale, connectivity may refer to the Internet bandwidth coming into and going out of a country, and the quality of the infrastructure within the country for linking to the Internet.

Public entities have long been in search of convenient, cost-effective connectivity with the individuals that need to communicate, including claims adjusters, risk managers, nurse case managers, supervisors and excess carriers.

In this way, they can share information as it develops. Traditionally, decisions to connect these individuals were based on the costs of building a connectivity infrastructure, typically utilizing dial-up or dedicated lines.

Today, the cost savings achieved by true Internet connectivity vs. building a "private" network can be measured in the thousands of dollars saved per location each year. Even for entities with a private network already in place, utilizing an Internet-based solution for risk management and claims administration still pays for itself in a few months. These systems are also flexible, and easily adapt to new offices or remote users that may be connecting from a PC, Mac, PDA or even a mobile phone.

Transparency

In our consumer-driven society, convenience is of utmost importance. Everyone wants easy access to the information they need, without being bothered with where the information came from or how it was compiled.

Internet systems are vital to delivering critical information where and when it’s needed most. At the same time, the transfer and management of this information remain transparent. Instead of having to access multiple applications with different system requirements, user IDs and passwords, users simply retrieve what they need from one site. This saves an incredible amount of time, money and hassle.

For instance by using an Internet system, public entities have lowered both their training and installation costs for software, hardware and networking needs and achieved increased information transfer as users are more likely to use the application when using only one site.

A good example of transparency is a claims administration system that automatically accesses the latest labor codes, compensation rates, medical bill repricing schedules and PPO contracts. The user doesn't have to "perform" an information update from their workstation, nor does the IT department have to manage multiple batch data interfaces.

One public insurance pool now accesses information from several data sources at a single site, streamlining communication between the pool, various members and their third-party administrators (TPAs). At one time, the pool had utilized administrative staff members to fax information back and forth, create spreadsheets to track and create reports, and obtain information from a variety of Web sites. After deploying the Internet system, the pool was able to re-deploy one FTE per member to more productive roles, and significantly improved its claims outcomes. The savings in improved outcomes is more difficult to measure, but far exceeds the administrative savings.

Real-time notification

Keeping up with the latest, most important losses and developments within an organization is a critical issue. By using Internet-based business rules, a public entity can instantly inform its risk manager of an urgent claim or loss through native Internet tools like email or via wireless notification to a pager or cell phone. For instance, if a significant loss occurred, such as a fire at a school, the system would instantaneously notify key decision makers and managers about the incident, so they could initiate appropriate response measures.
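A business rule like the one described (notify decision makers the moment a significant loss is recorded) can be sketched as follows. The threshold, claim IDs and amounts are invented for illustration, and a real system would push the messages out by email or SMS rather than simply return them.

```python
# Hedged sketch of a real-time notification rule for urgent claims.
URGENT_THRESHOLD = 100_000  # dollars; an assumed cutoff for this example

def notifications_for(claims, threshold=URGENT_THRESHOLD):
    """Return the alert messages a real system would push out immediately."""
    return [
        "URGENT: claim %s for $%s" % (c["id"], format(c["amount"], ","))
        for c in claims
        if c["amount"] >= threshold
    ]

claims = [
    {"id": "C-101", "amount": 2_500},    # routine claim: no alert
    {"id": "C-102", "amount": 350_000},  # e.g. a fire at a school
]
print(notifications_for(claims))  # ['URGENT: claim C-102 for $350,000']
```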

Real-time notification allows risk managers and executives to be better informed, prepared for meetings, and compliant with statutory timelines. For instance, many tasks and functions in workers’ compensation run on mandated timelines. If these deadlines are not met, a public entity can be liable for penalties. With real-time notification, public entities can improve communication, and thereby reduce penalties on late regulatory reporting, late payments or late reporting on excess claims. As a result, a lot of savings have been achieved and penalties averted through real-time notification.

The Lay of the Land: Knowing What Internet Systems Are Available

Many of today’s people, feeling comfortable with the Internet in other areas of their lives, are more inclined to use the latest Internet tools for risk management. However, it’s important to understand the various types of systems that are currently available and the benefits and disadvantages of each:

Internet-enabled

Many public entities have Windows-based or mainframe-based applications that utilize the Internet for remote access and to drive one or two functions. For the most part, however, public entities cannot fully leverage the Internet's real-time capabilities for their risk management functions using this model.

Application service providers (ASPs)

An ASP allows companies to access their system for a “subscription” cost. The ASP handles the installation, housing, maintenance and upgrades to the system. Many of these applications are client-server or mainframe applications that have been modified to run over the Internet, utilizing middleware technology, which not only creates additional technology expenses, but also creates vulnerabilities and/or problems in an organization’s firewall security.

Truly Internet-based/Browser-based

The latest applications have been specifically designed to run over the Internet. A browser-based system, as its name suggests, only requires a browser, which is a standard option on most PCs. Since browser-based applications do not require middleware, they are more cost-effective, more secure and offer an improved Internet model. They allow for immediate access to information at any time, from anywhere. More companies are looking to browser-based technology to solve their risk management needs.

Owning & Accessing Claims Data

IT consultants not only need access to claims information in order to analyze losses and key performance indicators, but they also need to monitor the claims process as a risk factor in and of itself. Claims process inefficiencies add significantly to claims costs. Many public entities are now purchasing their own browser-based claims administration system to ensure that the claims process is managed efficiently, and also to have ultimate control over their claims data, whether they are self-administered or TPA-administered.

Later, if a public entity decides to use another or an additional TPA, no data conversion is required; the new TPA simply logs on over the Internet to the entity's browser-based system. In addition, public risk managers would not have to rely on the TPA to provide risk management reports; instead, risk managers would have direct access to the information they need, when they need it. Internet-based business rules can also be set up to immediately notify individuals when certain types of claims or activities have occurred, so a proper response can be initiated.

Tracking Key Performance Measures

Until recently, few public entities had systems in place to capture the full range of data to analyze losses and to understand the effectiveness of their risk management initiatives. Today’s integrated claims administration and risk management systems available via browser-based technology now allow public entities to have a centralized data repository from which they can monitor key performance measures.

By leveraging the Internet's connectivity, these measures are more readily accessible and can be utilized to identify potential opportunities for improvement and to formulate new initiatives that further reduce losses. If a key performance indicator is at a critical risk level, real-time notification can be leveraged to automatically inform the appropriate decision maker or manager.

The Future of Internet Technology

Over the next quarter century, though, the Internet will help to transform companies, although the transformation may be too subtle for people to notice much while it happens. It will be most striking, at least in the medium term, in companies providing services: financial, travel, medical, educational, consultancy. Many manufacturing businesses will grow more like service industries: They will cater to the individual customer's tastes, for instance, and create a continuing relationship to ensure they get repeat purchases.

The connectivity, transparency and real-time nature of browser-based systems can help public entities institute effective claims administration and risk management strategies. The quality of any initiative depends almost entirely on individuals receiving the "right" information to do their jobs effectively. The Internet has become a powerful tool that enables the access and sharing of information among all stakeholders. Public risk managers who use browser-based technology can drive their risk management initiatives with greater business intelligence, simplicity, and control.

Subject: If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? Tue Aug 18, 2009 11:42 am

If I were hired by the university president as an IT consultant, I would suggest technology and innovations in order for the internet connectivity to be improved.

TECHNOLOGY

A computer network is a group of interconnected computers. Networks may be classified according to a wide variety of characteristics. This section provides a general overview of some types and categories and also presents the basic components of a network.

Internetwork

An internetwork is the connection of two or more distinct computer networks or network segments via a common routing technology. The result is called an internetwork (often shortened to internet): two or more networks or network segments connected using devices that operate at layer 3 (the 'network' layer) of the OSI Basic Reference Model, such as a router. Any interconnection among or between public, private, commercial, industrial, or governmental networks may also be defined as an internetwork. In modern practice, interconnected networks use the Internet Protocol. There are at least three variants of internetworks, depending on who administers and who participates in them: the intranet, the extranet, and the Internet. Intranets and extranets may or may not have connections to the Internet. If connected to the Internet, the intranet or extranet is normally protected from being accessed from the Internet without proper authorization. The Internet is not considered to be a part of the intranet or extranet, although it may serve as a portal for access to portions of an extranet.

Connection method

Computer networks can also be classified according to the hardware and software technology that is used to interconnect the individual devices in the network, such as optical fiber, Ethernet, wireless LAN, HomePNA, power line communication or G.hn. Ethernet uses physical wiring to connect devices; frequently deployed devices include hubs, switches, bridges and/or routers. Wireless LAN technology is designed to connect devices without wiring; these devices use radio waves or infrared signals as a transmission medium. ITU-T G.hn technology uses existing home wiring (coaxial cable, phone lines and power lines) to create a high-speed (up to 1 Gigabit/s) local area network.

-AMONG THESE CONNECTION METHODS, I WOULD SUGGEST USING BOTH WIRED ETHERNET AND WIRELESS LAN TO MAKE THE INTERNET CONNECTION FASTER.-

Wired Technologies

Twisted-Pair Wire – This is the most widely used medium for telecommunication. Twisted-pair wires are ordinary telephone wires, consisting of two insulated copper wires twisted into pairs and used for both voice and data transmission. Twisting the two wires together helps reduce crosstalk and electromagnetic induction. Transmission speeds range from 2 million to 100 million bits per second.

Coaxial Cable – These cables are widely used for cable television systems, office buildings, and other worksites for local area networks. They consist of copper or aluminum wire wrapped with an insulating layer, typically of a flexible material with a high dielectric constant, all of which is surrounded by a conductive layer. The layers of insulation help minimize interference and distortion. Transmission speeds range from 200 million to more than 500 million bits per second.

Fiber Optics – These cables consist of one or more thin filaments of glass fiber wrapped in a protective layer. They transmit light, which can travel over long distances at higher bandwidths, and are not affected by electromagnetic radiation. Transmission speeds can reach trillions of bits per second; fiber optics is hundreds of times faster than coaxial cable and thousands of times faster than twisted-pair wire.

-IN WIRED TECHNOLOGIES I WOULD SUGGEST USING OPTICAL FIBER TO MAKE THE INTERNET CONNECTION FASTER.-
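To put those quoted speeds in perspective, here is a quick back-of-envelope sketch in Python of how long a 1 GB file transfer would take at the top speed of each wired medium. The figures are illustrative only, taken from the ranges above.

```python
# A rough comparison: time to move a 1 GB file at the top speeds quoted above.

FILE_SIZE_BITS = 1 * 10 ** 9 * 8  # 1 GB expressed in bits

# medium -> approximate top transmission speed in bits per second
media = {
    "twisted-pair": 100e6,  # 100 million bit/s
    "coaxial": 500e6,       # 500 million bit/s
    "fiber-optic": 1e12,    # on the order of a trillion bit/s
}

for name, bits_per_second in media.items():
    seconds = FILE_SIZE_BITS / bits_per_second
    print(f"{name:>12}: {seconds:10.3f} s")
```

Even at these ideal rates (real links carry protocol overhead), the gap between twisted pair and fiber is four orders of magnitude.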

Wireless Technologies

Terrestrial Microwave – Terrestrial microwave uses Earth-based transmitters and receivers, with equipment that looks similar to satellite dishes. It uses the low-gigahertz range, which limits all communications to line of sight, with relay stations spaced approximately 30 miles apart. Microwave antennas are usually placed on top of buildings, towers, hills, and mountain peaks.

Communications Satellites – Satellites use microwave radio as their telecommunications medium, which is not deflected by the Earth's atmosphere. The satellites are stationed in space, typically 22,000 miles above the equator. These Earth-orbiting systems are capable of receiving and relaying voice, data, and TV signals.

Cellular and PCS Systems – These use several radio communications technologies. The systems are divided into different geographic areas, each with a low-power transmitter or radio relay antenna to relay calls from one area to the next.

Wireless LANs – Wireless local area networks use a high-frequency radio technology similar to digital cellular, along with a low-frequency radio technology. Wireless LANs use spread-spectrum technology to enable communication between multiple devices in a limited area. An example of an open-standard wireless radio-wave technology is IEEE 802.11b.

Bluetooth – A short-range wireless technology operating at approximately 1 Mbit/s with a range from 10 to 100 meters. Bluetooth is an open wireless protocol for data exchange over short distances.

The Wireless Web – The wireless web refers to the use of the World Wide Web through equipment like cellular phones, pagers, PDAs, and other portable communications devices, offering an anytime/anywhere connection.

-IN WIRELESS TECHNOLOGIES I WOULD SUGGEST USING THE WIRELESS WEB TO MAKE THE INTERNET CONNECTION FASTER.-

Scale

Networks are often classified as Local Area Network (LAN), Wide Area Network (WAN), Metropolitan Area Network (MAN), Personal Area Network (PAN), Virtual Private Network (VPN), Campus Area Network (CAN), Storage Area Network (SAN), etc., depending on their scale, scope and purpose. Usage, trust levels and access rights often differ between these types of network. For example, LANs tend to be designed for internal use by an organization's internal systems and employees in individual physical locations (such as a building), while WANs may connect physically separate parts of an organization to each other and may include connections to third parties.

Local area network

A Local Area Network (LAN) is a computer network covering a small physical area, like a home, office, or small group of buildings, such as a school or an airport. Current wired LANs are most likely to be based on Ethernet technology, although new standards like ITU-T G.hn also provide a way to create a wired LAN using existing home wiring (coaxial cables, phone lines and power lines). For example, a library may have a wired or wireless LAN for users to interconnect local devices (e.g., printers and servers) and to connect to the internet. On a wired LAN, PCs in the library are typically connected by category 5 (Cat5) cable, running the IEEE 802.3 protocol through a system of interconnected devices that eventually connect to the Internet. The cables to the servers are typically enhanced Cat5e cable, which supports IEEE 802.3 at 1 Gbit/s. A wireless LAN may exist using a different IEEE protocol: 802.11b, 802.11g or possibly 802.11n. The staff computers can reach the color printer, checkout records, the academic network and the Internet. All user computers can reach the Internet and the card catalog, and each workgroup can reach its local printer. Note that the printers are not accessible from outside their workgroup.
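The direct-versus-gateway decision that every host on such a LAN makes can be sketched with Python's standard `ipaddress` module. The `192.168.1.0/24` subnet below is a hypothetical example, not the university's actual addressing plan.

```python
# Sketch: is a destination on the local LAN (delivered directly) or does it
# have to go through the default gateway? Subnet here is purely illustrative.
import ipaddress

lan = ipaddress.ip_network("192.168.1.0/24")

def next_hop(destination: str) -> str:
    """Return where a host on this LAN would send a packet for destination."""
    if ipaddress.ip_address(destination) in lan:
        return "deliver directly on the LAN"
    return "forward to default gateway"

print(next_hop("192.168.1.42"))  # another host on the same LAN
print(next_hop("203.0.113.7"))   # an Internet address
```

The same membership test is what separates "my workgroup printer" traffic from traffic that leaves the building.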

-THE UNIVERSITY USES A LAN TO PROVIDE ITS CONNECTION.-

Basic hardware components

All networks are made up of basic hardware building blocks that interconnect network nodes, such as Network Interface Cards (NICs), bridges, hubs, switches, and routers. In addition, some method of connecting these building blocks is required, usually in the form of galvanic cable (most commonly Category 5 cable). Less common are microwave links (as in IEEE 802.12) or optical cable ("optical fiber"). An Ethernet card may also be required.

Network interface cards

A network card, network adapter, or NIC (network interface card) is a piece of computer hardware designed to allow computers to communicate over a computer network. It provides physical access to a networking medium and often provides a low-level addressing system through the use of MAC addresses.

Repeaters

A repeater is an electronic device that receives a signal and retransmits it at a higher power level, or to the other side of an obstruction, so that the signal can cover longer distances without degradation. In most twisted-pair Ethernet configurations, repeaters are required for cable that runs longer than 100 meters.

Hubs

A network hub contains multiple ports. When a packet arrives at one port, it is copied unmodified to all ports of the hub for transmission. The destination address in the frame is not changed to a broadcast address.

Bridges

A network bridge connects multiple network segments at the data link layer (layer 2) of the OSI model. Bridges do not promiscuously copy traffic to all ports, as hubs do, but learn which MAC addresses are reachable through specific ports. Once the bridge associates a port and an address, it will send traffic for that address only to that port. Bridges do send broadcasts to all ports except the one on which the broadcast was received. Bridges learn the association of ports and addresses by examining the source address of frames they see on various ports. Once a frame arrives through a port, its source address is stored and the bridge assumes that MAC address is associated with that port. The first time a previously unknown destination address is seen, the bridge forwards the frame to all ports other than the one on which the frame arrived.

Bridges come in three basic types:

1. Local bridges: directly connect local area networks (LANs).
2. Remote bridges: can be used to create a wide area network (WAN) link between LANs. Remote bridges, where the connecting link is slower than the end networks, have largely been replaced with routers.
3. Wireless bridges: can be used to join LANs or connect remote stations to LANs.

Switches

A network switch is a device that forwards and filters OSI layer 2 datagrams (chunks of data communication) between ports (connected cables) based on the MAC addresses in the packets. This is distinct from a hub in that it forwards packets only to the ports involved in the communication rather than to all connected ports. Strictly speaking, a switch is not capable of routing traffic based on IP address (OSI layer 3), which is necessary for communicating between network segments or within a large or complex LAN. Some switches are capable of routing based on IP addresses but are still called switches as a marketing term. A switch normally has numerous ports, the intention being that most or all of the network is connected directly to the switch, or to another switch that is in turn connected to a switch. "Switch" is a marketing term that encompasses routers and bridges, as well as devices that may distribute traffic by load or by application content (e.g., a Web URL identifier). Switches may operate at one or more OSI model layers, including physical, data link, network, or transport (i.e., end-to-end). A device that operates simultaneously at more than one of these layers is called a multilayer switch. Overemphasizing the ill-defined term "switch" often leads to confusion when first trying to understand networking. Many experienced network designers and operators recommend starting with the logic of devices dealing with only one protocol level, not all of which are covered by OSI. Multilayer device selection is an advanced topic that may lead to selecting particular implementations, but multilayer switching is simply not a real-world design concept.

Routers

A router is a networking device that forwards packets between networks using information in protocol headers and forwarding tables to determine the best next router for each packet. Routers work at the network layer of the OSI model and the Internet layer of TCP/IP.
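The learn-then-forward behaviour described above for bridges (and layer-2 switches) can be shown in a few lines. This is a minimal sketch with invented names, assuming a three-port device: learn the source MAC of every frame, forward out one port when the destination is known, and flood all other ports when it is not.

```python
# Minimal simulation of the bridge behaviour described above (illustrative).

class LearningBridge:
    def __init__(self, ports):
        self.ports = set(ports)
        self.table = {}  # MAC address -> port it was last seen on

    def handle_frame(self, in_port, src_mac, dst_mac):
        # Learning: associate the frame's source address with its arrival port.
        self.table[src_mac] = in_port
        # Forwarding: a known destination goes out exactly one port;
        # an unknown one is flooded everywhere except the arrival port.
        if dst_mac in self.table:
            return {self.table[dst_mac]}
        return self.ports - {in_port}

bridge = LearningBridge(ports=[1, 2, 3])
print(bridge.handle_frame(1, "aa:aa", "bb:bb"))  # unknown dst: flood ports 2, 3
print(bridge.handle_frame(2, "bb:bb", "aa:aa"))  # aa:aa already learned: port 1
```

A real bridge also ages out stale table entries and handles broadcasts specially; this sketch shows only the core idea that distinguishes it from a hub.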

Network topology

Computer networks may be classified according to the network topology upon which the network is based, such as bus network, star network, ring network, mesh network, star-bus network, or tree (hierarchical) topology network. Network topology signifies the way in which devices in the network see their logical relations to one another. The use of the term "logical" here is significant: network topology is independent of the "physical" layout of the network. Even if networked computers are physically placed in a linear arrangement, if they are connected via a hub, the network has a star topology rather than a bus topology. In this regard the visual and operational characteristics of a network are distinct; the logical network topology is not necessarily the same as the physical layout. Networks may also be classified based on the method used to convey the data; these include digital and analog networks.
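The bus-versus-star distinction above is just a question of who is logically adjacent to whom, regardless of physical placement. A small sketch, with hypothetical hosts A through E:

```python
# The same five computers, wired as a bus vs. as a star (illustrative).

bus = {  # each host sees only its neighbours along the shared cable
    "A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C", "E"], "E": ["D"],
}
star = {  # every host connects only to the central hub/switch
    "hub": ["A", "B", "C", "D", "E"],
    "A": ["hub"], "B": ["hub"], "C": ["hub"], "D": ["hub"], "E": ["hub"],
}

def max_degree(topology):
    """Largest number of neighbours any node has."""
    return max(len(neighbours) for neighbours in topology.values())

print("bus max degree:", max_degree(bus))    # at most two neighbours per host
print("star max degree:", max_degree(star))  # the hub touches every host
```

The star's central node is both its convenience (one place to manage) and its single point of failure, which is worth remembering when recommending it.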

-IN NETWORK TOPOLOGY I RECOMMEND A STAR TOPOLOGY.-

INNOVATION

The concept of the innovation system stresses that the flow of technology and information among people, enterprises and institutions is key to an innovative process. It covers the interaction between the actors needed to turn an idea into a process, product or service on the market.

The Technological Innovation System is a concept developed within the scientific field of innovation studies which serves to explain the nature and rate of technological change. A Technological Innovation System can be defined as 'a dynamic network of agents interacting in a specific economic/industrial area under a particular institutional infrastructure and involved in the generation, diffusion, and utilization of technology'. The approach may be applied to at least three levels of analysis: to a technology in the sense of a knowledge field, to a product or an artifact, or to a set of related products and artifacts aimed at satisfying a particular societal function. With respect to the latter, the approach has especially proven itself in explaining why and how sustainable (energy) technologies have developed and diffused into a society, or have failed to do so.

Structures

The system components of a Technological Innovation System are called structures. These represent the static aspect of the system, as they are relatively stable over time. Three basic categories are distinguished:

* Actors: Actors involve organizations contributing to a technology as a developer or adopter, or indirectly as a regulator, financer, etc. It is the actors of a Technological Innovation System that, through choices and actions, actually generate, diffuse and utilize technologies. The potential variety of relevant actors is enormous, ranging from private to public actors, and from technology developers to technology adopters. The development of a Technological Innovation System will depend on the interrelations between all these actors. For example, entrepreneurs are unlikely to start investing in their businesses if governments are unwilling to support them financially; vice versa, governments have no clue where financial support is necessary if entrepreneurs do not provide them with the information and arguments they need to legitimate policy support. -In our case, the actors are the students and the members of the university who use the Internet connection.-

* Institutions: Institutional structures are at the core of the innovation system concept. It is common to consider institutions as 'the rules of the game in a society, the humanly devised constraints that shape human interaction'. A distinction can be made between formal and informal institutions, with formal institutions being the rules that are codified and enforced by some authority, and informal institutions being more tacit and organically shaped by the collective interaction of actors. Informal institutions can be normative or cognitive. Normative rules are social norms and values with moral significance, whereas cognitive rules can be regarded as collective mind frames, or social paradigms. Examples of formal institutions are government laws and policy decisions; firm directives or contracts also belong to this category. An example of a normative rule is the responsibility felt by a company to prevent or clean up waste. Examples of cognitive rules are search heuristics or problem-solving routines. They also involve dominant visions and expectations held by the actors. -In our case, the institution is the university; the rules, structure and management of implementation are within the university.-

* Technological factors: Technological structures consist of artifacts and the technological infrastructures in which they are integrated. They also involve the techno-economic workings of such artifacts, including costs, safety, and reliability. These features are crucial for understanding the feedback mechanisms between technological change and institutional change. For example, if R&D subsidy schemes supporting technology development result in improvements in the safety and reliability of applications, this paves the way for more elaborate support schemes, including practical demonstrations, which may in turn benefit technological improvements even more. It should, however, be noted that the importance of technological features has often been neglected by scholars. -In our case, the factors include the Internet connectivity and its accessibility.-

The structural factors are merely the elements that make up the system. In an actual system, these factors are all linked to each other; if they form dense configurations they are called networks. An example would be a coalition of firms jointly working on the application of a fuel cell, guided by a set of problem-solving routines and supported by a subsidy programme. Likewise, industry associations, research communities, policy networks, user-supplier relations, etc. are all examples of networks. An analysis of structures typically yields insight into systemic features (complementarities and conflicts) that constitute drivers and barriers for technology diffusion at a certain moment or within a given period in time.

Dynamics of Technological Innovation Systems

Structures involve elements that are relatively stable over time. Nevertheless, for many technologies, especially newly emerging ones, these structures are not yet (fully) in place. For this reason, scholars have recently enriched the literature on Technological Innovation Systems with studies that focus on the build-up of structures over time. The central idea of this approach is to consider all activities that contribute to the development, diffusion, and use of innovations as system functions. These system functions are to be understood as types of activities that influence the build-up of a Technological Innovation System. Each system function may be 'fulfilled' in a variety of ways, and the premise is that, in order to develop properly, the system should positively fulfill all system functions. Various lists of system functions have been constructed; they show much overlap, and differences reside mostly in the particular way of clustering activities. Note that it is also possible for activities to contribute negatively to a system function; such negative contributions imply a (partial) breakdown of the system.

If I were hired by the university president as an IT consultant, I would suggest evaluating every piece of equipment and technology that the university has, or can afford to buy, against the nine criteria below, in order for the internet connectivity to be improved:

* Scalability
* Reliability
* Power supply
* Compatibility
* Security
* Audit capability
* Hardware flexibility
* Software flexibility
* Cost and service

REFERENCES:
http://en.wikipedia.org/wiki/Technological_innovation_system
http://en.wikipedia.org/wiki/Innovation_system
http://en.wikipedia.org/w/index.php?title=Special%3ASearch&search=IT+technology&go=Go

Last edited by rosemie nunez on Thu Oct 01, 2009 5:11 am; edited 1 time in total

If ever I were hired by the university president as an IT consultant, I would suggest renewing all of the technology, infrastructure, and innovations, as well as the processes. I know it's really difficult, and I am not saying it will happen in the blink of an eye; it will go through many processes and steps. What I'm trying to say is, "Dahan-dahanin lang" (just take it slowly).

Based on the news I've heard, we are now in a new age of technology, so in order to be competitive, the institution must improve its technology in such a way that the students as well as the faculty and staff can benefit from it. My point is that it should have a kind of system that unifies all processes, making them easier and less time-consuming. But the question is HOW? Well, I can't decide that right away; I would have to think it over. I would also tell him/her to reconstruct the network, because loading is very slow.

My suggestion in terms of infrastructure is to give them choices, which are the following: the institution should provide proper equipment for its students' needs, it should have funds to repair such materials when they malfunction, or else it must allow students to use their own laptops and the electricity. Perhaps it's hard for the institute to support this, but as the head of such an institution he/she must find ways to solve that particular problem. As an IT consultant, I would also suggest requiring a password in the logger, because as I observed, the account of one student can easily be used by another student just by typing the ID number without any permission, and I know that is not good for the student. All of us paid for it, so we must be equal and fair in using accounts.

I will first explain what it really implies: the term innovation refers to a new way of doing something. It may refer to incremental, radical, and revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between invention, an idea made manifest, and innovation, ideas applied successfully. For advancement in terms of innovation, I would concentrate on using particular high-tech devices to compete with the others. I know they will say that aiming for a goal with high-tech devices is useless and more expensive, but what I want to clarify is that having good equipment is one of the aspects that affects reaching that goal: to be one of the competitive universities, not only here in the country but in the whole world. Honestly, it's hard to achieve, but if the institution can provide this well, I am quite sure it will have a big impact. Next, the institution should also have skilled teachers so that students will learn much from them.

To process all transactions here, for instance assignments, projects, and presentations or reporting, there should be better rules allowing students to use the electricity. I also suggest that the university maintain a particular website where all announcements are posted and always kept up to date.

If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)

I am not sure if I got the question right. haha. What I understand from the question is to select areas in which I would suggest developments, particularly for the internet connectivity of the university, without considering the university's budget. It's not easy for me to suggest ways and opinions about the internet connectivity (like anybody does, haha).

I am new to the Obrero campus (as are my classmates from the Tagum campus), so I have not seen all the malfunctions and disadvantages of the internet connection on the Obrero campus. It's also my first time using the internet services inside the university (just this day, August 18, 2009, 3:45pm), and I noticed some factors which could be considered "not yet developed" (this word may be exaggerated, but I can't find any other word to describe it; anyway, that is just my opinion).

TECHNOLOGY

Technological development is the process of research and development of technology. Many emerging technologies are expected to become generally applied in the near future.

The new technology development process leans on the new product development process. It starts with a new technological idea and proceeds via research and development through to the use of the technology (e.g. by introducing products based on the new technology to the market).

In general, products are not equal to technologies. A product is based on several technologies, and each technology is the basis for several products. The life cycles of products and technologies also differ. However, it is often difficult to see the difference between products and technologies, and people frequently confound the two terms [1]. In that case, the new product development process equals the new technology development process.

In terms of Technological development here are my suggestions:

*New Internet-based technologies*

Wireless Internet Access Service Provider

WISP is an acronym which stands for Wireless Internet Access Service Provider. These can be Wi-Fi hotspots or an operator with a Wi-Fi based network infrastructure. Often they offer additional services, like location-based content, Virtual Private Networking and Voice over IP.

WISPs are found predominantly in rural environments where cable and digital subscriber lines are not available. WiMAX was expected to become mainstream in 2006, and it was anticipated that it would dramatically change the marketplace by increasing the amount of interoperable equipment on the market and making mobile data transmission feasible. Unfortunately this has not happened, which has hampered the expected increase in adoption rates of WISP services.

Typically, the way a WISP operates is to pull a large and usually expensive point-to-point connection to the center of the area it wishes to service. From there, it needs to find some sort of elevated point in the region, such as a radio or water tower, on which to mount its equipment. On the consumer's side, a small dish is mounted to the roof of the home and pointed back at the WISP's dish.

Since it is difficult for a single service provider to build an infrastructure that offers global access to its subscribers, roaming between service providers is encouraged by the Wi-Fi Alliance with the WISPr protocol. WISPr is a set of recommendations approved by the alliance which facilitate inter-network and inter-operator roaming of Wi-Fi users.

Many wireless broadband services provide average download speeds of over 100 Mbit/s and are estimated to have a range of 50 km (30 miles). Technologies used include LMDS and MMDS, and one particular access technology is being standardized by IEEE 802.16, also known as WiMAX.

At first, Wireless Internet Service Providers (WISPs) were only found in rural areas not covered by cable or DSL. These early WISPs would receive a large connection, such as a T1 or DS3 connection, and then broadcast the signal from a high elevation, such as the top of a water tower. To receive this type of internet, consumers would mount a small dish to the roof of their home or office and point it at the transmitter. Line of sight was usually necessary for this type of technology, although technologies from Motorola have not adhered to this general rule.
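For a line-of-sight hop like the ones described above, the standard first planning check is free-space path loss. A back-of-envelope sketch using the common formula FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44 (distances and frequency here are illustrative, not a real link budget):

```python
# Free-space path loss for a line-of-sight point-to-point hop (illustrative).
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# A 2.4 GHz link: every doubling of distance adds about 6 dB of loss.
for d in (1, 2, 5, 10):
    print(f"{d:2d} km at 2400 MHz: {fspl_db(d, 2400):.1f} dB")
```

Free space is the best case; real links also need antenna gains, obstructions and Fresnel-zone clearance factored in, which is why WISPs chase water towers.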

Benefits of Using a Wireless Internet Access Service Provider

* Rapid Deployment – Allows networks to be deployed without cabling for client devices, typically reducing the costs of network deployment and expansion. Spaces where cables cannot be run, such as outdoor areas and historical buildings, can host wireless Internet access networks.
* Fastest Internet Access in the World – Wi-Fi is the fastest Internet access network in the world. Wi-Fi currently runs at 54 Mbit/s, and soon 802.11n will offer 248 Mbit/s. These access speeds blow away 3G, cable and DSL providers; only fiber to the curb can really compete with Wi-Fi.
* Massive Availability – Wireless Internet access (Wi-Fi) antennas are built into 99% of all modern laptops; a laptop without a built-in Wi-Fi antenna would be extremely rare.
* Affordable Pricing for the Masses – Wi-Fi chipset pricing continues to come down, making Wi-Fi a very economical networking option and driving inclusion of Wi-Fi in an ever-widening array of devices.
* Interoperability Among All Vendors – Wi-Fi products are widely available in the market. Different competitive brands of access points and client network interfaces are interoperable at a basic level of service. Products designated as Wi-Fi Certified by the Wi-Fi Alliance are backwards compatible.
* Worldwide Wi-Fi Standard – Wi-Fi is a global set of standards. Unlike 3G cellular carriers, which are slow and fragmented, the same Wi-Fi Internet access device works in different countries around the world.
* Ubiquitous Coverage – Widely available in more than 250,000 public hot spots and tens of millions of homes and corporate and university campuses worldwide.
* Secure Network Connections – As of 2007, Wi-Fi Protected Access (WPA) is not easily cracked if strong passwords are used, and WPA2 encryption has no known weaknesses.
* Quality of Service – New protocols for quality of service (WMM) and power-saving mechanisms (WMM Power Save) make Wi-Fi even more suitable for latency-sensitive applications (such as voice over IP and video on demand), offered in a very small form factor.

*TableTop Computers*

It's quite ambitious to suggest this hardware, but it would give students better technological awareness to engage with this kind of advancement. Hand-held devices and laptops have long allowed customers to use their fingers to control what is on the screen; now Microsoft has taken touch-screen devices to a new level by creating a table-top touch screen.

Microsoft has created "Surface", a 30-inch table top touch screen device that allows users to control what is on the screen with their hands. Surface is a shiny black coffee table with 5 cameras mounted beneath the screen. The cameras can sense nearby objects and allow you to control the entire screen with your fingers. You can touch or drag your finger across the screen to interact with objects like paintbrushes or photographs.

What makes Surface so unique is its ability to respond to more than one touch at a time. Because Surface is the only device of its kind on the market, the price tag is still pretty high, between $5,000 and $10,000. If you wait about 3-5 years, you can probably get Surface for half the cost. Microsoft has stated they expect the price to drop in order to be feasible for consumers to purchase.

The current version of Surface is being used by hotels and other companies. Some hotel lobbies allow guests to use Surface to play music and buy songs with their credit card. You can also use it to order food and drinks and simply scan your door key barcode over the screen to pay the bill. Other companies are using Touch for their clients to learn about nearby attractions or book show tickets to movies or plays.

Digital photography is one of the biggest revenue-producing ventures right now. Microsoft is working on digital photography sharing software to communicate with Surface. Right now, Microsoft's goal is to place a card onto the screen of Surface and allow all your digital photographs to appear on the screen. From here, you are able to bend, flip, stretch, or do literally anything to your photographs.

At the current time, only 6 software development firms have been given permission by Microsoft to work on Surface. Most everyone has been impressed by Surface and it has opened a new door for technology savvy companies to expand upon.

So who is in competition with Microsoft and their new product? Researchers from the National Taiwan University have created a dual-resolution table top touch screen device called the "i-m-Top". Basically, it can take a desk and turn it into a 120x80cm touch screen device.

The i-m-Top has a number of infrared sensors that trace the motions of the user's hands to control the screen. There are 2 rear projectors installed in the device. One projector covers the entire table top and the other provides high-resolution detail. The i-m Top provides a cheaper touch screen alternative to make use of optical imaging technology.

The current cost of the i-m-Top is $3,085.00. Volume production of the i-m-Top is still in the works, and it is currently only available to professional workshops and high-level executives. In a few years, the i-m-Top price is expected to drop to the price of a regular personal computer.

Table top touch screen devices are pretty unique and exciting to play around with. If you haven't had a chance to view a table top touch screen device, you should connect to Microsoft's web site as soon as you have a free minute. Table top touch screen devices are fun to watch in action and if you ever get the chance to play around with one in person, you won't regret it! The onward progression of technology can only make one wonder what other touch-screen devices will be created next!

INFRASTRUCTURE

For an organization's information technology, infrastructure management (IM) is the management of essential operation components, such as policies, processes, equipment, data, human resources, and external contacts, for overall effectiveness. Infrastructure management is sometimes divided into categories of systems management, network management, and storage management. Infrastructure management products are available from a number of vendors including Hewlett-Packard, IBM, and Microsoft.

Although all business activities depend upon the infrastructure, planning and projects to ensure its effective management are typically undervalued to the detriment of the organization. According to IDC, a prominent research firm (cited in an article in DMReview), investments in infrastructure management have the largest single impact on an organization's revenue.

Internet telecommunications technology has been hot and will continue to dominate the interest of both consumers and business professionals alike. Bandwidth is still the major bottleneck for many of us accessing the Internet, so new transmission methods must be devised. These transmission methods must provide us with inexpensive bandwidth that is easy to use and easy to set up.

Internet Technology Development

Several problems are associated with today's Internet technology. One, there are too many bottlenecks in the current system: many users have slow dial-up connections and cannot use streaming multimedia applications. We are also running out of Internet (IP) addresses, so protocol enhancements are needed.

Internet2, introduced by then Vice President Albert Gore in 1998, was a first step in experimenting with the next-generation Internet. It is a cooperative effort among the federal government, colleges, and universities. Private companies such as Lucent, Cisco, and Nortel are contributors to the Internet2 project.

Today's Internet uses IPv4, which is more than twenty years old. IPv4 has supported us for a long time, but it is starting to show its age. A major problem is the limited number of IPv4 addresses available to support the Internet now and in the future.


Last edited by Jevelyn Labor on Wed Aug 19, 2009 6:37 am; edited 1 time in total

Since 1996, Internet2 has provided extremely fast connections for colleges and universities only. The purpose of Internet2 is to deploy advanced network applications and technologies, accelerating the creation of tomorrow's Internet. Internet2 is looking to recreate the partnership among educational institutions, industry, and government that fostered today's Internet back in the late 1960s.

The Internet2 network is not available to the general public. The primary goals of Internet2 are to do the following:

* Create a leading-edge network capability for research and development.
* Enable testing of new fiber products and routers.
* Create new network services and applications for the standard Internet.

The network uses high bandwidth backbone fiber similar to the standard Internet. The basic difference between the standard Internet and Internet2 is the number of users. Today's Internet2 network supports 3 million users. In contrast, the standard Internet supports hundreds of millions of users.

The minimum connection speed for Internet2 is 155 Mbps. The network is designed with a minimum number of "hops" between routers, creating even faster connections.

Some experimental testing with Internet2 technology is as follows:

* Medical—3D Brain Mapping: Allows real-time visualization of brain activity during visual and memory tasks. Internet2 allows computers to link together at a high rate of speed to exchange information.
* Medical—Remote Medical Specialist Evaluation: Most medical evaluations can be completed after review of a patient's image by a non-physician at a reading center. Some evaluations, however, require further review by a supervising specialist. Internet transmission with a guaranteed quality of service and high bandwidth supports real-time interaction among non-physician experts and supervising physicians. Both groups share large image data sets and finalize the evaluation by using Internet2.
* Entertainment—Advanced Digital Video: Internet2 allows for high-quality presentation of video that is not currently available on the standard Internet. Internet2 also provides interactive 2-way digital video, which is not possible with the traditional first-generation Internet services.
* Educational—Remote Instruction and Teaching: Internet2 can link multiple multimedia classroom sessions, supporting the development of educational materials between two geographically separate institutions. Real-time interaction between students and teachers can be arranged with Internet2.
* Astronomy—Remote Telescope Manipulation: With Internet2, an astronomer in Amsterdam can remotely control a telescope in Hawaii and then participate in an international, full-speed, full-motion videoconference to discuss his or her findings.

Remote interactivity is one of the key features of Internet2. As previously mentioned, doctors can remotely supervise a medical procedure, astronomers can remotely view the contents of a telescope, and teachers can remotely instruct a classroom. It is a very powerful scientific and educational medium (Figure 10.1).

Figure 10.1: Internet2 video conference.

Internet2 may one day become the Internet of the future. In fact, there is already some talk about Internet3 development, with even higher processing speeds than presently available with Internet2.

Internet Protocol Version 6

Internet Protocol Version 6 (IPv6) was developed to fix problems associated with today's Internet protocol, IPv4. IPv6 also adds enhancements such as automatic routing and network reconfiguration. This new protocol will eventually replace IPv4, and the two will coexist for several years until a complete transition is made. In fact, IPv6 includes a transition mechanism that provides direct interoperability between IPv4 and IPv6 hosts. A full transition to IPv6 is expected to take at least a decade.

In the next few years, the Internet is projected to serve a much larger group of people. And that doesn't even include the projected convergence of the telecommunications, entertainment, home appliance, and computer industries, to name a few! (See the section "Convergence" in this chapter for more on convergence.)

There are several major differences between IPv4 and IPv6. The first noticeable difference is the address size: an IPv6 address is 16 bytes (128 bits) long, providing a virtually unlimited supply of addresses. A second improvement is the size of the header, which now contains 7 fields (13 in IPv4). The smaller header allows for faster packet processing than ever before.

Another improvement with IPv6 is in the area of security. Privacy and security features are always of utmost concern.

IPv6 solves network addressing limitations by replacing IPv4's 32-bit address with a 128-bit address. Despite major advances, IPv6 has been slow to catch on, and few commercial products supporting the new protocol are available.
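To make the size difference concrete, Python's standard `ipaddress` module can show both address formats side by side. The `192.0.2.1` and `2001:db8::1` addresses below are documentation-range examples, not real university addresses:

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits (16 bytes).
ipv4_space = 2 ** 32    # about 4.3 billion addresses
ipv6_space = 2 ** 128   # about 3.4 * 10**38 addresses

v4 = ipaddress.ip_address("192.0.2.1")    # documentation-range IPv4 example
v6 = ipaddress.ip_address("2001:db8::1")  # documentation-range IPv6 example

print(ipv4_space)   # 4294967296
print(v6.exploded)  # 2001:0db8:0000:0000:0000:0000:0000:0001
```

The `.exploded` form shows the full eight 16-bit groups of an IPv6 address that the familiar `::` shorthand compresses away.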

As small pockets of IPv6 networks grow, they will merge into larger and larger pockets. Eventually, all of the IPv6 pockets will become one, and a new Internet will be fully deployed.

Having an Internet connection in a school is advantageous to the students who study there. Students can use it in their research, because with the Internet you can gain access to vast amounts of information. And if you're a student in a computer-related course like Information Technology or Computer Science, having Internet access at school is especially useful. When you have Internet at school, whether in the library or in the computer lab, you also save money because of the free surfing hours included in your account. And if you have a laptop, so much the better, if the school provides Wi-Fi access.

But here in the school (I don't think you'll argue with me), we have a slow Internet connection. No offense to those in charge of the matter, I know they are doing what they can, but we really do have a slow Internet connection. Since my 1st year in the University of Southeastern Philippines, this is what I have observed. I always hear people complain, "Hinaya sa net ui!" ("The internet is so slow!"). That's what I always hear from people who have surfed the net from inside the school. There has to be something wrong, because if there were not, then I would be proud to say "Ok man ang net sa skul" ("The internet at school is fine"), and I would not always go outside to Internet cafés to do my research and to download files and other stuff. (We just recently got a desktop computer in the house, given by my uncle, but we still have no Internet connection.)

To the question “If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved?”, these will be my answers:

But first off, let’s take a closer look at some technical stuff, like the definition of the Internet, just to clarify a few things.

“The Internet, sometimes called simply "the Net," is a worldwide system of computer networks - a network of networks in which users at any one computer can, if they have permission, get information from any other computer (and sometimes talk directly to users at other computers). It was conceived by the Advanced Research Projects Agency (ARPA) of the U.S. government in 1969 and was first known as the ARPANET. The original aim was to create a network that would allow users of a research computer at one university to be able to "talk to" research computers at other universities. A side benefit of ARPANet's design was that, because messages could be routed or rerouted in more than one direction, the network could continue to function even if parts of it were destroyed in the event of a military attack or other disaster.

Today, the Internet is a public, cooperative, and self-sustaining facility accessible to hundreds of millions of people worldwide. Physically, the Internet uses a portion of the total resources of the currently existing public telecommunication networks. Technically, what distinguishes the Internet is its use of a set of protocols called TCP/IP (for Transmission Control Protocol/Internet Protocol). Two recent adaptations of Internet technology, the intranet and the extranet, also make use of the TCP/IP protocol.”

Now, let’s proceed to my ideas and solutions as the new IT Consultant of the University of Southeastern Philippines. Hwow!

Planning and Orientation

First is proper planning and orientation. There must be a proper meeting about the best way to improve the internet connectivity in the school, especially because this plan will really involve money (of course). An agreement must be made between the management and those who will implement the changes, and the implementers must explain the whole plan to the management so there will be no issues afterward. With a good working relationship, there will surely be an improvement with whatever plan you have in mind, as long as it is for the better and not for the worse.

Money/Budget

A proper budget has to be allocated to fund all the changes: the materials needed, the supplies, and the upgrades the University has to make. I think that with the tuition we pay (even though our school is a public one), bit by bit we can buy everything we need to improve the internet connectivity of the school, whether it be new cables, routers, switches/hubs, computers, or an upgrade to a better deal with the ISP (Internet Service Provider). The computer laboratory fees of the school should be enough for what I'm talking about here. But if some generous organization, politician, or foundation were willing to donate computers and other computer-related materials and supplies, then good for us. We just have to handle their gifts with care and not take things for granted.

Increase Bandwidth

We all know that bandwidth plays a big role in the speed of the internet. Bandwidth is the rate at which data can be transferred over the connection, not the amount of data you download. If the bandwidth were allocated properly, the internet connectivity would improve, even if just a little. Blocking sites that allow downloads and file-sharing can help keep students from downloading again and again. But from what I've heard, the University has already increased the bandwidth, and still the internet connectivity is slow. I don't really know why, but there has to be something wrong.
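As a rough sketch of why a shared link feels slow, here is a back-of-the-envelope calculation. The function name and the sample figures are my own illustrations, not actual university numbers:

```python
def transfer_time_seconds(file_size_megabytes, link_megabits_per_second, users_sharing=1):
    """Rough transfer time, ignoring protocol overhead: the link's
    bandwidth is split evenly among everyone using it at once."""
    size_in_megabits = file_size_megabytes * 8
    per_user_bandwidth = link_megabits_per_second / users_sharing
    return size_in_megabits / per_user_bandwidth

# A 10 MB download on a 2 Mbps link takes 40 seconds when you are alone,
# but 20 minutes if 30 users share the same link at the same time.
print(transfer_time_seconds(10, 2))      # 40.0
print(transfer_time_seconds(10, 2, 30))  # 1200.0
```

This is why either raising the total bandwidth or reducing the number of simultaneous heavy downloaders (the blocking idea above) improves everyone's experience.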

Hardware

We shouldn't forget our PCs when we talk about improving the internet connectivity of the school. It's true that if we subscribe to a higher internet speed from an Internet Service Provider (ISP), we get the satisfaction of surfing the Net faster, but only if your computer also has the capacity to do so. Take upgrading the computers' RAM, for example: it will not only improve regular computer use but also affect the speed of your Internet experience, because your computer works faster. Recently the Nodal Center received a donation (I think) of computer units with Dual Core processors, although I don't know how much RAM they have (after I post this, I'm going to find out). All I know is that when I used these computers, the Internet connection seemed better than before. I think the problem now is with the Internet connection, not the units. If your computer is slow, it doesn't matter how fast your Internet connection is; the whole thing will just seem slow. You can only access the Internet as fast as your PC will allow. As for cables and other networking matters, I think we use fiber-optic cables now in the University, although I'm not really sure whether we're using repeaters. And I don't think we have problems with UTP cables, because I think we have enough of them.

Viruses and Malware

Viruses and malware can often use up your bandwidth and slow down your Internet connection. So if I were the IT consultant, I would make sure that all computers in the school have up-to-date virus and malware scanners, and I would have those in charge of maintenance monitor whether each computer's anti-virus and anti-malware software is up to date. In the Nodal Center, I think some units have viruses or malware in them, although fewer now than before. It would be good to buy and invest in a reliable commercial anti-virus for the computers here in the school, and not just rely on free anti-virus software. The question then is which one is the most reliable and the best match for the University's needs. But there are now plenty of studies and reviews of anti-virus and anti-malware software, so choosing shouldn't be a problem. Experience in using this software also counts.

Software

We shouldn't consider only hardware; we should also consider the software we use. Right now there are several free downloadable programs that claim to improve your Internet connection to an extent. It's fine to use these kinds of software, but we have to be careful: some downloadable programs have hidden programs attached to them, and we never know what those so-called "hidden programs" might do. Here is some information, gathered from the Internet, on programs that are said to make browsing faster.

* Loband.org is a browser inside of a browser that loads web pages without the images.

* Firefox and Opera both have options to disable images.
* In Firefox, you can also use extensions such as NoScript that let you block scripts and plug-ins that would otherwise slow things down a lot.
* If you are using Internet Explorer or Firefox, try downloading Google Web Accelerator. It is meant to speed up broadband connections, but it can also slow your Internet connection. Try enabling it and disabling it and see when your Internet connection runs faster.
* If you are using Firefox, download the Fasterfox extension and FireTune.
* Reduce the number of running programs that use your Internet connection (instant messengers, RSS feeders, and MS applications set to send Internet data).
* Google Accessible is designed to search pages in order of how clean they are of junk. This will bring up pages that are usually not only easy to read, but quick to load.

There are still loads of other free software out there to download, some even from Microsoft and other reliable sources. Let's just be careful downloading this software. Other programs also clean and monitor your temporary files and cache, or you could just do it yourself. These files improve your Internet connection performance by not downloading the same file over and over: when a web site puts its logo graphic on every page, your computer only downloads it when it changes. If you delete the temporary files, it must be downloaded again; if you disable the cache, it must be downloaded every time you view a page that uses it. So if you optimize this, it can greatly help your surfing pleasure, and the Internet connection speed will improve. This just has to be done on each computer, so proper maintenance must be observed.

Proper Maintenance

All of these ideas and conceptualizations (and other words that mean the same) will be useless without proper maintenance, whether of the software, the hardware, or even the people using them. Students have to take care of the units they use and be responsible in using the Internet in the school. If they are prohibited from going to sites with audio and video streaming, they have to follow those rules and restrictions. The technicians should likewise be responsible for the maintenance and care of the hardware. Faulty hardware should be inspected properly and, if need be, replaced, provided there is a proper budget for it. If there are problems with the fiber-optic cables, a solution has to be found immediately. Just recently there was an Internet connection problem because of a cable from PLDT, and the effect was a very slow connection speed. We should avoid problems like that, even though in the school it's on a smaller scale; with proper maintenance, we should be able to. The maintenance of the programs and their regular updates should also be handled by the right people. With all of these in hand, there would surely be some improvements and enhancements to our Internet usage.

Network Topology

During our Data Communication and Computer Networks I class, we discussed different network topologies. These topologies or frameworks are ways to properly distribute data and information in a network. With so many topologies, we can choose which one would be best for our school to implement to improve the internet connection between networks, although I am not sure which topology to use. There also has to be a simulation first before letting this take effect, because we can't afford to carelessly roll something out without trying it first.

Consultation and Study

It's not harmful for the University to ask for advice or consult those who excel in this field. If I were the IT Consultant, I would also ask for tips or advice from those with more experience than me, then apply all the information I gathered for the school. We should also take into account universities in other countries and study what they did to the infrastructure of their schools that improved their internet connectivity.

Careful and Proper Selection

If we are to buy hardware for our computers, we have to pick reputable and reliable brands. Tried-and-tested brands are better than experimenting with new but unproven ones. Remember that the money used comes from the students, so we have to choose carefully for their sake. And for the software, we have to use programs and applications that are sure to work properly with the hardware, and that are sure to do what they are designed to do.

All of these can be implemented through proper planning and management. The school should cooperate with the IT Consultant (that’s me) and offer the best they can to help out with the Internet connection, because it’s all for the sake of the students. And the students should also do their part and cooperate with the school.

With fast Internet connection speeds, the students’ problems with schoolwork will be lessened, and that’s about it with improving the Internet connection of the school.

If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)

Assume that I am hired by the university president as an IT consultant, and that the task given to me is to improve the internet connection. In any organization, knowing that we have limited resources, it is advisable to maximize what we have and limit expenditures, which also means we have limited ways to improve the internet connection in our beloved organization. But let's just assume that we have everything we need to implement this project. The following are my suggestions on how we can improve internet connectivity in our university, based on some articles I have read on the internet and on my own experience and understanding.

To start, let us first examine the internet connection we have right now, the condition of our computers, and the network. As a member of this university, I have used the computer laboratory, the so-called "nodal," only a minimal number of times, for the following reasons:

1) Slow, or should I say very slow, internet connection.
2) The number of students falling in line to use the laboratory.
3) The number of computers available.
4) Restricted access to some sites.
5) The vast number of viruses residing in our computer laboratory.
6) Old and faulty hardware.

What I have mentioned above is based on what I have observed; those are the problems I found that must be considered in developing recommendations that lead to improved university connectivity.

To discuss the problems I have observed: we currently have a very slow internet connection, which may be because bandwidth usage is not well managed; we are aware that the bandwidth currently allocated is not enough for the number of computers in the university. To be specific about who the users are: the students who use the laboratory and the university virtual library, the students who bring their laptops with them, and the faculty and school staff. Because we have a limited number of computers, it is expected that not all students can avail of the service, which is why many of us choose to go to the nearest internet café with faster internet speed and no site restrictions. I have also observed that our computer laboratory is like a lair of computer viruses, which also contributes to the slowing down of connections. And there is old, faulty hardware in our computer networks. This is another great factor that must be given attention, for it is one of the main reasons why we have a slow connection; to elaborate, internet performance depends on the capability of the computers, and we can't have a fast connection with old, outdated hardware. The UTP cables must also be checked, and replaced if necessary, because interference there can slow down the connection as well; repeaters should be used if needed.

I suggest the following to speed up university’s internet connection:

1. Perform some computer maintenance. Run Disk Defragmenter, a disk scan, a virus scan, and a malware scan, and clear your recycle bin. An unusually slow Internet connection is often the only sign that your computer is infected with viruses or other malware. Delete old files and temporary files. Never allow the free space on your C: drive to be less than 10% of the total size or twice the installed RAM (whichever is larger). A well-maintained PC will operate much better than a PC that has never had any maintenance. Google or your local computer repair store should be able to help you with this if you don't know how.
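The free-space rule of thumb in step 1 can be written as a one-line calculation; `min_free_space_bytes` is just an illustrative name for the rule, not a real utility:

```python
def min_free_space_bytes(disk_size_bytes, installed_ram_bytes):
    """The rule of thumb above: keep free space at no less than
    10% of the disk or twice the installed RAM, whichever is larger."""
    return max(disk_size_bytes // 10, 2 * installed_ram_bytes)

GB = 1024 ** 3
# On a 160 GB disk with 2 GB of RAM, 10% of the disk (16 GB)
# is larger than twice the RAM (4 GB), so 16 GB must stay free.
print(min_free_space_bytes(160 * GB, 2 * GB) // GB)  # 16
```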

2. Reset your home network. Sometimes restarting your home network, if you have one, will drastically increase the speed of your connection.

3. Optimize your cache or temporary Internet files. These files improve your Internet connection performance by not downloading the same file over and over: when a web site puts its logo graphic on every page, your computer only downloads it when it changes. If you delete the temporary files, it must be downloaded again; if you disable the cache, it must be downloaded every time you view a page that uses it. This can be done by opening Internet Explorer, clicking "Tools" at the top, and choosing "Internet Options". On the General tab, click the "Settings" button next to Temporary Internet Files. Set "Check for newer versions" to "Automatically". Set the amount of disk space to use to 2% of your total disk size or 512 MB, whichever is smaller. In Firefox, click "Tools" then "Options," go to the Privacy tab, and then click on the Cache tab within it.
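The cache-size setting in step 3 (2% of the disk or 512 MB, whichever is smaller) is likewise easy to sketch; `cache_size_mb` is a hypothetical helper name:

```python
def cache_size_mb(total_disk_mb):
    """The setting above: use 2% of the total disk size
    or 512 MB for the browser cache, whichever is smaller."""
    return min(total_disk_mb * 2 // 100, 512)

print(cache_size_mb(10_000))   # 200  (2% of a 10 GB disk)
print(cache_size_mb(160_000))  # 512  (capped on a 160 GB disk)
```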

4. Never bypass your router. Most routers include a firewall that is very difficult for hackers to defeat. If you don't need wireless, hook your computer directly to your router. A router will only slow down your connection by a few milliseconds. You won't notice the difference, but the hackers will.

5. If you are using a wireless router, make sure it doesn't conflict with a cordless phone or wireless camera. Wireless routers come in two varieties: 802.11b/g (2.4 GHz) and 802.11a (5.8 GHz). If you are using a 2.4 GHz cordless phone and a 2.4 GHz wireless router, your Internet connection speed will slow while you use the cordless phone. The same is true of wireless security cameras. Check your phone and camera: if it's 900 MHz, then it's fine; if it says 2.4 GHz or 5.8 GHz, it could be the cause of your slow connection speed while they're in use.

6. Call your Internet service provider (ISP). Sometimes you just have bad service. They can usually tell if your connection is substandard without having a technician come to your home. Just be nice and ask.

7. Upgrade your computer. If your computer is slow, it doesn't matter how fast your Internet connection is, the whole thing will just seem slow. You can only access the Internet as fast as your PC will allow you to.

8. Replace your old cable modem. Any solid-state electronics will degrade over time due to accumulated heat damage. Your broadband modem will have a harder and harder time 'concentrating' on maintaining a good connection as it gets older (signal to noise ratios will go down, and the number of resend requests for the same packet will go up). An after-market cable modem as opposed to a cable-company modem will frequently offer a better connection.

9. Often your connection is slow because other programs are using it. To test whether other programs are accessing the Internet without your knowledge, click Start, then Run, and type "cmd" (without quotes). Then type "netstat -b 5 > activity.txt". After a minute or so, hold down Ctrl and press C. This creates a file with a list of all programs using your Internet connection. Type "activity.txt" to open the file and view the program list. Then press Ctrl+Alt+Delete, open the Task Manager, go to the Processes tab, and end the processes that are stealing your valuable bandwidth. (NOTE: Ending processes may cause certain programs to not function properly.)
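Once that output is saved to a file, a short script can pull out the program names for you. This is a sketch that assumes the simplified `[program.exe]` line format shown in the sample below; real `netstat -b` output varies by Windows version:

```python
def programs_from_netstat(netstat_output):
    """Collect program names from netstat -b style output,
    where each owning program appears on its own [bracketed] line."""
    programs = set()
    for line in netstat_output.splitlines():
        line = line.strip()
        if line.startswith("[") and line.endswith("]"):
            programs.add(line[1:-1])
    return sorted(programs)

# Hypothetical sample output, shortened for illustration.
sample = """\
  TCP    10.0.0.5:49312   93.184.216.34:80   ESTABLISHED
 [firefox.exe]
  TCP    10.0.0.5:49313   10.0.0.9:5222      ESTABLISHED
 [messenger.exe]
"""
print(programs_from_netstat(sample))  # ['firefox.exe', 'messenger.exe']
```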

10. After you have tried all this, try your connection again and see if it's running any faster.

11. Caching proxy server. A proxy server can reply to service requests without contacting the specified server by retrieving content saved from a previous request made by the same client or even other clients. This is called caching. Caching proxies keep local copies of frequently requested resources, allowing large organizations to significantly reduce their upstream bandwidth usage and cost while significantly increasing performance.

12. Proxy servers. A proxy server is a kind of buffer between your computer and the Internet resources you are accessing. It accumulates and saves files that are most often requested by thousands of Internet users in a special database called a "cache". Therefore, proxy servers can increase the speed of your connection to the Internet: the cache of a proxy server may already contain the information you need by the time of your request, making it possible for the proxy to deliver it immediately. The overall increase in performance may be very high. Proxy servers can also help when owners of Internet resources impose restrictions on users from certain countries or geographical regions. In addition, among proxy servers there are so-called anonymous proxy servers that hide your IP address, thereby protecting you from vulnerabilities connected with it.
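A minimal sketch of the caching idea behind points 11 and 12: `fetch_upstream` below is a hypothetical stand-in for a real HTTP fetch, not a real API, and the time-to-live value is an arbitrary illustration.

```python
import time

class CachingProxy:
    """Serve repeated requests from a local cache instead of
    re-fetching from upstream, saving upstream bandwidth."""
    def __init__(self, fetch_upstream, ttl_seconds=300):
        self.fetch_upstream = fetch_upstream
        self.ttl = ttl_seconds
        self._cache = {}          # url -> (expires_at, body)
        self.upstream_requests = 0

    def get(self, url):
        entry = self._cache.get(url)
        if entry is not None and entry[0] > time.time():
            return entry[1]       # cache hit: no upstream bandwidth used
        self.upstream_requests += 1
        body = self.fetch_upstream(url)
        self._cache[url] = (time.time() + self.ttl, body)
        return body

# Two requests for the same page cost only one upstream fetch.
proxy = CachingProxy(lambda url: f"<page for {url}>")
proxy.get("http://example.com/")
proxy.get("http://example.com/")   # served from cache
print(proxy.upstream_requests)     # 1
```

A real campus deployment would of course use production software (e.g. a dedicated caching proxy) rather than anything hand-rolled; the sketch only shows why the cache reduces upstream traffic.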

Tips

• Call your ISP and have them verify all of your TCP/IP settings if you are concerned. Ask them to verify that your proxy settings are correct.
• Don't expect dial-up or high-speed lite service to be fast. The Internet is primarily geared towards broadband connections. Sometimes you have to wait a little.
• Download programs that make browsing faster:

o Loband.org is a browser inside of a browser that loads web pages without the images.
o Firefox and Opera both have options to disable images.
o In Firefox, you can also use extensions such as NoScript that let you block scripts and plug-ins that would otherwise slow things down a lot.
o If you are using Internet Explorer or Firefox, try downloading Google Web Accelerator. It is meant to speed up broadband connections, but it can also slow your Internet connection. Try enabling it and disabling it and see when your Internet connection runs faster.
o If you are using Firefox, download the Fasterfox extension and FireTune.
o Reduce the number of running programs that use your Internet connection (instant messengers, RSS feeders, and MS applications set to send Internet data).
o Google Accessible is designed to search pages in order of how clean they are of junk. This will bring up pages that are usually not only easy to read, but quick to load.

• Upgrade your RAM. This will not only improve your regular computer use, but it will also affect the speed of your Internet connection, because your computer works faster.
• Use the Stop button to stop loading pages once you've gotten what you want.
• Sometimes malware on your computer can eat up your bandwidth. Make sure you have an up-to-date malware protection program.
• Most Internet providers have flaky DNS servers (no citation necessary, it's a given); so, instead of using those provided by your ISP, switch your DNS servers to those of OpenDNS. OpenDNS is far faster and more reliable; simply using 208.67.222.222 and 208.67.220.220 as your domain name servers will fix most flaky DNS problems (and may even speed up your networking, since OpenDNS has large caches).
• Look into running your own local DNS server on your network. Some newer routers may include their own nameserver; otherwise, check into AnalogX.com's DNSCache program. It works great at holding commonly accessed domain names in its cache so that the IP addresses do not have to be looked up every time you navigate to a new page.
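The local DNS caching idea in the last tip can be sketched in a few lines. `DnsCache` is an illustrative name, and the stand-in resolver in the usage example avoids touching the network (the real default, `socket.gethostbyname`, uses the system resolver):

```python
import socket

class DnsCache:
    """Resolve a hostname once, then answer repeat lookups from memory,
    so the IP address is not looked up again for every page visit."""
    def __init__(self, resolver=socket.gethostbyname):
        self._resolver = resolver
        self._cache = {}

    def lookup(self, hostname):
        if hostname not in self._cache:
            self._cache[hostname] = self._resolver(hostname)
        return self._cache[hostname]

# With a stand-in resolver, repeat lookups never hit the "network" again.
calls = []
cache = DnsCache(resolver=lambda name: calls.append(name) or "203.0.113.7")
cache.lookup("www.example.edu")
cache.lookup("www.example.edu")
print(len(calls))  # 1
```

(A production nameserver would also honor each record's TTL and expire entries; this sketch leaves that out for brevity.)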

Warnings

• Viruses and malware can often use up your bandwidth and slow down your Internet connection. Make sure you have protection against this. Many ISPs will provide software for it. Make sure your anti-virus and malware scanners are up to date.
• Bypassing the router will leave you more vulnerable to attacks, because you no longer have your router's built-in firewall protecting you.
• Watch out for scams that claim to make your Internet go a lot faster for free. They may tell you to download their program, which usually has a lot of other hidden programs attached that might steal your identity.

Those are my cost-effective, and I daresay brilliant and efficient, ideas on how to improve the internet connection of the university. However, it does not end there: we must orient the users of this project and remind them of the proper usage of the computers and the connection. Everything will be useless and all the efforts futile if the users are not responsible enough to fully utilize, maintain, and manage this internet connection.

The Internet has been perhaps the most outstanding innovation in the field of communication in the history of mankind. As with every innovation, the Internet has its own advantages and disadvantages, but usually the advantages far outweigh the disadvantages. Today the Internet has brought the globe into a single room: from news across the corners of the world and a wealth of knowledge to shopping and purchasing tickets for your favorite movie, everything is at your fingertips. The Internet has great potential and a lot to offer, and it is really part of our daily lives. The foremost purpose of the Internet has always been communication, and it has excelled beyond expectations. Still, innovations are going on to make it faster and more reliable. With the advent of the Internet, our Earth has shrunk to the form of a global village. By just clicking the mouse, we can now communicate in a fraction of a second with a person sitting in another part of the world, and with e-mail and chat we can get in touch with our loved ones. Another great advantage of the Internet is that it provides the information we need in our daily lives. The Internet is a virtual treasure trove of information: any kind of information on any topic under the sun is available, and search engines like Google and Yahoo are at your service. You can find almost any type of data on almost any subject you are looking for. Students and children are among the top users who surf the Internet for research. Today it is almost required that students use the Internet to gather resources, and teachers have started giving assignments that require research on the Internet.
It is really an advantage to us students because it is easy for us to find the information we need in our studies. On the other hand, entertainment is another popular reason why people love to surf the Internet. Games, chat and social networking sites are among the most popular demands of users. The online gaming industry has received dramatic and phenomenal attention from game lovers. Chat rooms are popular because users can meet new and interesting people; in fact, the Internet has been successfully used by people to find lifelong partners. When people surf the Web, there are numerous things to be found: music, hobbies, news and more can be found and shared on the Internet.

Slow internet connection problems are becoming more common. The speed of an internet connection is based on several factors; if those factors are hampered, the connection will slow down. One of the major causes of slowdown on the net is spyware, so before you begin tweaking your settings, make sure that your computer is free of spyware and other nasties.

If I am hired as an IT consultant of the university, I would suggest building a good infrastructure in order to improve the internet connection. We are all aware that the slow internet connection is one of the problems our university has; students keep complaining about it. As an IT consultant, I have lots of suggestions to improve the internet connectivity. These are the following:

1. Broadband Router Settings

As the centerpiece of a network, a broadband router can be responsible for slow Internet connections if configured improperly. For example, the MTU setting of your router will lead to performance issues if set too high or too low. Ensure your router's settings are all consistent with the manufacturer's and your Internet Service Provider's (ISP's) recommendations. Carefully record any changes you make to your router's configuration so that you can undo them later if necessary.
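To see why the MTU matters, here is a minimal sketch (my own illustration, not from any router's documentation) of how much actual TCP payload fits in one packet for a given MTU, assuming standard 20-byte IPv4 and 20-byte TCP headers with no options:

```python
# Illustration only: payload per packet for a given link MTU, assuming
# a 20-byte IPv4 header and a 20-byte TCP header (no IP/TCP options).
IPV4_HEADER = 20
TCP_HEADER = 20

def mss_for_mtu(mtu: int) -> int:
    """TCP maximum segment size (payload bytes) for a given link MTU."""
    return mtu - IPV4_HEADER - TCP_HEADER

print(mss_for_mtu(1500))  # typical Ethernet MTU -> 1460
print(mss_for_mtu(1492))  # common PPPoE DSL MTU -> 1452
```

If the router's MTU is set higher than the path can carry, packets get fragmented or dropped; set too low, every packet wastes a larger fraction on headers.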

3. Internet Worms

An Internet worm is a malicious software program that spreads through computer networks. If any of your computers are infected by an Internet worm, they may begin spontaneously generating network traffic without your knowledge, causing your Internet connection to appear slow. Run antivirus software regularly to diagnose and remove these worms from your computers.

4. Running Background Applications

Some software applications you install on a computer run in the background, quietly consuming network resources. Unlike worms, these are programs designed to do useful work. Peer topeer (P2P) programs in particular can heavily utilize your network and cause connections to appear slow. It's easy to forget these applications are running. Always check computers for any programs running in the background when troubleshooting a slow network.

5. Faulty Network Equipment

When routers, modems or cables fail, they typically won't support connections. Certain technical glitches in network equipment, however, adversely affect performance even though connections are maintained. To troubleshoot potentially faulty equipment, temporarily re-arrange and re-configure your gear while experimenting with different configurations. Try bypassing the router, swapping cables and changing network adapters to isolate the slow performance to a specificcomponent of the system.

6. Service Provider Issues

Internet speed ultimately depends on the service provider. Your ISP may change its network's configuration, or suffer technical difficulties, that inadvertently cause your Internet connection to run slow. ISPs may also intentionally install filters or controls on the network that lower your performance. Don't hesitate to contact your service provider if you suspect they are responsible for a slow Internet connection.
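Before blaming the ISP, it helps to measure what you are actually getting. This is a small, hypothetical helper (the example numbers are made up for illustration) for converting a measured download into megabits per second, which you can compare against the speed you are paying for:

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Average throughput in megabits per second (decimal megabits)."""
    return bytes_transferred * 8 / seconds / 1_000_000

# Hypothetical measurement: a 25 MB download that took 20 seconds
print(round(throughput_mbps(25_000_000, 20.0), 1))  # -> 10.0 Mbit/s
```

If the measured figure is consistently far below the provisioned rate, that is concrete evidence to bring to the service provider.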

7. Reduce the Web Cache

The task here is to make your web cache as small as possible. The less disk space you reserve for your temporary files, the less data your computer needs to search through. This action can easily reward you with a faster connection on the net.

8. Clear Temp Files

Deleting the web cache is a good way to speed up a slow internet connection. These files take up valuable resources that could be used for a better net-surfing experience. When we are online, the web cache is active: cookies are loaded in your browser and constantly update themselves. Here is how to delete your temporary internet files.

9. Reduce Animations and Scripts

When you log on to a web page, your computer is assaulted with data that it needs to organize and present in an orderly manner. This information takes time to load, so the more information there is, the longer the page takes to load. However, you can tweak a number of settings to decrease the time it takes to load a web page by reducing the amount of animation and scripts.

10. Control Your AntiVirus

When your antivirus performs a scan, it produces a log, and each log can be up to 100 megabytes of information. The registry holds all of this information. The addition of the information is not the major problem; it is that the information is placed in many places in your registry, which causes fragmentation of files. Your computer's operating system must constantly search through these massive files in the registry. If the registry is error-filled, fragmented, or bloated with unnecessary and duplicate entries, you will see a major slowdown. Keep the registry organized and maintained, and the problem will most likely be solved.

Those suggestions are a great help not just to the students but also to the faculty and staff of the university, because they can make use of the internet properly and without hassles. However, it would be a lot better if both students and staff cooperated in making the internet connection faster. It is not just for the good of oneself but also for the good of the institution.

“Innovation is the ability to see change as an opportunity - not a threat”

Let us not be afraid of doing new things! After all, if we are to define innovation, it refers to a new way of doing something.

But let us take it this way: the goal of innovation is positive change, to make someone or something better.

As the facilitator of this course calls us, I am one of the Tagumers; I am a new subscriber to the campus internet services. But since I am hired to be consulted for the improvement of the internet, here is my recommendation.

The university must follow steps and strategies to implement changes. According to my research, these are the STRATEGIES OF THE INNOVATION PROCESS,

because once you have made the decision that your institution must innovate to remain or to become competitive, you should establish a rational, logical process to guide you during the entire course of action.

First: You need to pick a type of innovation process that is correct for your institution. The result of this process could include a new product, a new service, a breakthrough in R&D, or the reinvention of a current strategic concept, yours or a competitor's. Consider these two questions:
○ What strengths can this process take advantage of?
○ What weaknesses should you stay away from?

Second: Once you have decided your general goals, you should look at your organization to see where and how this process should be initiated. Does your organizational structure support the innovation process you have selected? What are the organization's strengths and weaknesses? Will the culture within the company support the innovation development procedures? What is the tolerance for risk within the company? Will the management team allow highly speculative research, or is it more conservative in its approach? The important point here is that the level of risk tolerance must be compatible with the overall philosophy of the company, or the dissonance could doom the project.

Third: What is the time scale for such a development process? This will depend on a number of factors. First among these is the size and complexity of the project.

Fourth: Decide on what measures should be appropriate to spur creativity.

Ask your customers (students): Where better to learn what you might do that your customers might buy than to ask them? Be careful that you don't just get the problem of the day. Your questions need to be sufficiently penetrating that you get to the real problems your customers are facing, and thus to the real needs they have. Most of the time, the first things you hear are not the real problems they have. You need to gently probe for their real pain, and what the possible solutions might be, in order to make this effective.

Brainstorming: While this is an obvious tactic, to be done well, this exercise needs to be structured and controlled.

Next you need a good process to capture ideas so they may be evaluated and filtered. One excellent tool to do this is our Worksheet Page 4.4 Perceived Opportunities. This page provides the format to capture all ideas generated by the team, and a process to quickly evaluate at a base level each opportunity to see if it meets the first level needs of the company. The process is very straightforward to use. First, the team goes into a brainstorming session as discussed above. An open atmosphere is requisite for the best generation of ideas, and immediate judgment tends to intimidate those who might contribute in the absence of such judgment.

Once the team has generated all the ideas they can come up with, take a break for 20 minutes or so to allow everyone's minds to clear. On reconvening, ask if anyone else has had any new thoughts.

(by baldwin)

"The number one benefit of information technology is that it empowers people to do what they want to do. It lets people be creative. It lets people be productive. It lets people learn things they didn't think they could learn before, and so in a sense it is all about potential." - Steve Ballmer

Anyway, we in the university have our very own faculty who are teaching us this knowledge about the technologies related to our course and future career, and who I know, like me (assuming we are professionals), are capable enough of initiating innovation in the university. Let us give them credit in our university as well as in our department.

Obviously everyone wants to be successful, but we want to be looked back on as being very innovative, very trusted and ethical and ultimately making a big difference among other campuses.

As we go forward, I hope we're going to continue to use technology to make really big differences in how people live and work.

While the University compares favorably with other major universities, our increasing dependence on information technology has strained our existing infrastructure, and we are no longer able to provide acceptable quality of service to the University community. This affects our ability to recruit and retain faculty and students, as well as the productivity and effectiveness of all members of the campus community.

If we are to do more than merely catch up, we need to go beyond one-time, piecemeal upgrades to a new way of thinking about information and communication technology as central to everything that takes place in a university. If we act quickly and boldly, we can see great improvements within a year; within three to five years the university can regain national prominence in information and communication technology infrastructure. Changes in attitude and organization, as well as basic infrastructure, will be necessary. The ideal is a distributed but coordinated infrastructure support program throughout the University, with the appropriate balance between local provision of services and support and central guidance on standards and policies.

Improvements in physical infrastructure are needed right away. The goal is to guarantee a high level of end-to-end performance, giving faculty and students the ability to move very large amounts of data rapidly from their desktops and labs to other points on or off campus, seamlessly, securely, and reliably. To meet this goal, we must rebuild our networks: in buildings, across campus, and to the outside world through upgraded Internet connectivity and Internet2 capability. Ultimately, no problem should be intractable because of insufficient computing power or network capacity.

In addition, because many of today’s high-end technologies will become tomorrow’s basic infrastructure, the University must continually invest in pilot projects involving emerging technologies—such as wireless networking—so that it is well positioned for large-scale implementation across campus of those technologies that prove successful and desirable.

As an IT consultant, I recommend that the University take these actions:

Human capital

• Through recruitment and in-house training, upgrade faculty, staff, and student skills and maintain those skills at a high level of competence.
• Invest significant additional resources in "side-by-side" IT training and consulting assistance for faculty, graduate students, and staff who use the technology in teaching and research.
• Make a long-term commitment to recruiting, training, developing, and retaining qualified IT staff in the current, highly competitive market.

Policy, governance and management

• Establish, streamline, or otherwise optimize management and governance mechanisms for information and communication technology to ensure that timely, informed, strategic decisions are made for the campus as a whole.
• Provide central guidance in the areas of financial control, legal issues such as intellectual property, security and risk management, adherence to the academic missions of the University, and coordination of overall services.
• Allow for local information technology services to be provided whenever possible through stronger capability and leadership within schools.
• Foster more effective planning and use of resources through continued unbundling of unrelated information and communication technology services and establishment of clear pricing guidelines, overseen by a "Campus IT Pricing and Services Commission."

Security and quality assurance

• Create a campus-wide security and information assurance office, with appropriate representation from the various information and communication technology administrative units, to rigorously and continuously review and assess security policies and practices.
• Establish a crisis response team, with representation from campus service providers, to coordinate University-wide response to potential technology-based attacks or mishaps.

Physical infrastructure

• Beginning soon and continuing until completed, upgrade in-building network infrastructure to support up to 1 Gbit/s connectivity to end-user locations, with minimum desktop performance of 100 Mbit/s.
• Immediately begin upgrading campus backbone infrastructure to provide bandwidth capacity supporting a target 10 Gbit/s aggregate service.
• Upgrade external Internet connectivity and Internet2 connectivity.
• Equip many more classrooms than at present with network access, and a significant number of classrooms with multimedia access.
• Maintain and enhance the Media Union, an important resource for advanced experimentation in the use of new media, by keeping its technology at the leading edge and retaining professional experts who can use, demonstrate and teach the skills to apply that technology.
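As a rough sanity check on these targets, this sketch (my own back-of-the-envelope arithmetic, using decimal units and ignoring protocol overhead) shows how long moving a large dataset would take at the proposed desktop and backbone speeds:

```python
def transfer_seconds(size_gigabytes: float, link_mbps: float) -> float:
    """Ideal time to move a dataset across a link, ignoring overhead.

    Uses decimal units: 1 GB = 8000 megabits.
    """
    return size_gigabytes * 8000 / link_mbps

print(transfer_seconds(10, 100))   # 10 GB at 100 Mbit/s desktop -> 800.0 s
print(transfer_seconds(10, 1000))  # 10 GB at 1 Gbit/s           -> 80.0 s
```

The tenfold difference is exactly why researchers moving very large data sets feel the gap between a 100 Mbit/s desktop port and gigabit connectivity.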

Advanced technologies and services for emerging technologies

Invest right away in the following pilot projects and initiatives:
• A pilot initiative in wireless networking, with the eventual goal of campus-wide installation.
• Several advanced prototype demonstrations of services underlying research and instructional applications, such as high-performance computing, very large data sets, management of massive real-time data, collaborative technologies, and visualization and virtual reality technologies.
• A pilot "Voice over IP" initiative, taking advantage of technology that allows voice communication (what we now call telephone service) to be carried across data networks based on the Internet Protocol (IP).
• A high-performance end-to-end computing and networking initiative to take advantage of the full potential of emerging infrastructure and computing capabilities.
• Continued investment in initiatives that involve limited, early use of new and emerging technologies, with an eye to eventual campus-wide deployment.

How to speed up an internet connection?

Well, first we need to decide where you sit in the grand scheme of things. You will likely make your connection through one, or a combination, of a few options in internet service: Dial Up, Satellite, DSL, Wireless LAN, Cable Modem, FTTH/FTTP, or a dedicated service line such as T1/T3/DS3+. If you're on a T1 or better, you're likely not complaining: paying for such a service means you should be getting individual monitoring by your ISP to make sure you are getting what you pay for, and if you're not, switch providers. Fiber to the Premises/Home (FTTP/FTTH) services are relatively new and, in my experience, well maintained. This brings us to the most likely candidates for connection improvement.

Dial Up - Your max: 56 kbps (actually about 53 kbps of true data after overhead). If you are on dial up, you are going to see 'slow speeds' by today's standards. The internet as most people know it has evolved around content, and content isn't small. One of two factors will most influence your connection quality through dial up: the subscription rate in your area and the phone line quality from your home to your provider. The sad news is that there is little you can do about either. If your area is over-subscribed (i.e., more subscribers than your ISP can support connections to), you are likely going to see long waits connecting and/or frequent disconnects. Bug your ISP and hope they listen, or use secondary connection numbers (this may result in long distance tolls unless you are offered more than a single 800 number to dial into). The quality of your local phone lines, similarly, is going to be largely out of your control. If you live away from urban areas, there is a good chance you are still behind very old telephone lines. One rough yet handy test is to pick up a -corded- handset and listen for background noise: ANY clicking, humming, buzzing, or other noise besides a pure dial tone is going to negatively influence the transmission of data over the lines.
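To put the dial-up numbers above into perspective, here is a small sketch using the roughly 53 kbps effective rate mentioned in the text (the 5 MB file size is just an illustrative assumption):

```python
MODEM_RATE_KBPS = 56      # nominal V.90 downstream rate
EFFECTIVE_KBPS = 53       # approximate usable rate after overhead

def dialup_download_minutes(megabytes: float, kbps: float = EFFECTIVE_KBPS) -> float:
    """Minutes to download a file at a given dial-up rate (1 MB = 8000 kbit)."""
    return megabytes * 8000 / kbps / 60

# A hypothetical 5 MB file takes about 12.6 minutes at 53 kbps
print(round(dialup_download_minutes(5), 1))
```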

The good news is that this gives you a defined disturbance that you can file a complaint with your TELEPHONE provider over. Potential culprits: poorly grounded electric fences (learned from experience), copper phone lines with cracked insulators, and poor house wiring.

DSL - Ahhhh... DSL. Your max: 256 kbps to 24 Mbps (varies by area). DSL is, generally speaking, either good or bad in an area. Assuming exceptional line quality and a relatively short distance to your area's gateway (DSLAM), the most common and significant factor influencing DSL speeds is the level of subscription in an area, though for a different reason than with dial up. DSL does not provide a set data rate to each individual (even though it is essentially sold as such). Instead, the provider offers a 'pipe' of a set rate to an area, and that rate is shared among all subscribers serviced by that pipe. This is not a bad thing, unless the area is oversubscribed, and oversubscription can only be fixed by your ISP. Line quality and distance from your area's DSLAM determine the number of usable channels through which you can communicate. As with dial up, this is somewhat fixed unless you can prove a case of poor line quality AND convince the owners of your local phone lines to improve it (which may or may not be your DSL provider). With DSL you might still listen for clicks and humming, but because DSL uses higher frequencies, inaudible interference may also (and more commonly) cause you a problem.
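The shared 'pipe' idea can be illustrated with a little arithmetic. This is a simplified model (the pipe size, subscriber count, and active fraction are assumptions for illustration; real DSLAMs do not split bandwidth this evenly):

```python
def per_subscriber_mbps(pipe_mbps: float, subscribers: int,
                        active_fraction: float = 1.0) -> float:
    """Bandwidth each active user sees if a shared pipe is split evenly."""
    active = max(1, round(subscribers * active_fraction))
    return pipe_mbps / active

# Hypothetical 100 Mbit/s pipe shared by 50 subscribers
print(per_subscriber_mbps(100, 50))        # everyone active -> 2.0 Mbit/s each
print(per_subscriber_mbps(100, 50, 0.2))   # 20% active      -> 10.0 Mbit/s each
```

This is why a connection that is fast at 3 a.m. can crawl in the evening: the pipe has not changed, only the number of active subscribers sharing it.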

Satellite - Satellite internet is great if you are very remote and can handle the latency. There are several implementations, but if you are using satellite, it is probably your only option (unless you have chosen it over dial up for, hopefully, higher bandwidth). Once you have been installed, about the only influences on your connection speed are weather and alignment. Alignment is a quick fix, while weather is entirely out of anyone's control.

Wireless LAN - Wireless service, if offered in your area, will vary depending on several factors, most notably: distance to your access point, broadcast frequency of the access point, line-of-sight obstructions (largely the composition of the obstructions) between you and the access point, and the number of subscribers. Increasing your connection quality centers around optimizing these factors: get closer, remove obstructions (or move to a clear line of sight), or attempt to connect through a less populated node.

DOCSIS Cable Modem - DOCSIS cable modems can theoretically support up to about a 54 Mbps connection speed. Current plant and DOCSIS standards, however, make those speeds less than reliably attainable. You should be aware of the speed to which your modem is provisioned (you will pay varying prices for different speed 'packages' through most ISPs). Most cable modem connections that have a problem see it somewhere between the node and the house. The node is the point in your neighborhood where the fiber network (fiber optics passing light) ends and is converted to an analog signal passed over a coaxial network (the copper cable you plug into your TV/DVR/modem) to your home. While there are countless variables that can impact the plant, you should be concerned about four signal levels: Transmit, Transmit SNR (Upstream Signal to Noise Ratio), Receive, and Receive SNR (Downstream Signal to Noise Ratio). You can view three of these levels by browsing to your modem's diagnostic page. To do this, point your browser to 192.168.100.1, the internal IP of your cable modem, which gives you access to a web interface displaying many useful statistics. The ones you want will be under the 'Signal' tab. Ideally look for:

These are ideal signal levels. While your connection might still be OK slightly outside of this range, you will want to get the levels balanced as close to the medians of the ranges as possible if you are suffering slow speeds. A good service technician should know how to properly balance these levels. A second place to look while on the modem diagnostic page is your event log: T3 and T4 timeouts are bad, and again should be addressed by a service tech. Some things you could consider if the appointment is a while out: keep your modem on a power purifier or UPS for clean, uninterrupted power; if your receive power is low, try removing unnecessary cable splitters; and look for cracks, cuts, or exposed coaxial cable.
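As a sketch of the kind of check a technician performs, the function below validates modem readings against commonly cited DOCSIS operating ranges. The ranges and the sample readings are assumed typical values for illustration, not taken from this text or from any particular modem's documentation:

```python
# Assumed typical DOCSIS operating ranges (illustrative, not authoritative).
RANGES = {
    "downstream_power_dbmv": (-10, 10),
    "downstream_snr_db": (33, 100),
    "upstream_power_dbmv": (35, 52),
}

def check_levels(levels: dict) -> list:
    """Return the names of any signal levels outside their assumed range."""
    problems = []
    for name, value in levels.items():
        low, high = RANGES[name]
        if not (low <= value <= high):
            problems.append(name)
    return problems

# Hypothetical readings from a modem's 'Signal' page
sample = {"downstream_power_dbmv": -2,
          "downstream_snr_db": 38,
          "upstream_power_dbmv": 58}
print(check_levels(sample))  # -> ['upstream_power_dbmv']
```

A high upstream transmit power like the one flagged above often points to excessive splitters or damaged cable between the modem and the tap.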

This is all geared towards directly improving your internet connection quality. Increasing your connection quality is the only way to increase your available bandwidth, and ONLY if you suffer from the service degradations mentioned above. Keep in mind that there are also a number of PC tips that can keep your PC running at top speed, so as not to slow down your perceived connection speed.

[see continuation on page 3]

Last edited by Ida Karla Duguran on Wed Aug 26, 2009 12:42 am; edited 2 times in total

An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organisation's IT systems. IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development. They may also be responsible for user training and feedback. In many companies, these tasks will be carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties.

with IT at the heart of the organization, not only for transfer of the billions of purchases made through its reach, but also for the decision making prospects of its owner/customers, namely the banks that both use and own the service. All in all, these lively discussions show how radical decentralization works with technology and suggests that other organizations of the “here and now” can benefit from revisiting their information sharing services.

Common Impact seeks an IT Consultant to design technology solutions to meet our nonprofit clients' needs. For the past eight years, Common Impact has built the leading model that connects skilled professionals from global companies to high-potential local nonprofits; the IT Consultant will be responsible for facilitating these connections while helping Common Impact grow its own systems and operations to new geographies. This is an exciting opportunity, reporting to our Managing Director, for an experienced architect to design technical solutions for nonprofits' greatest challenges, to join Common Impact's talented team, and to grow a proven model of social impact.

INTERNET

The Internet is a global system of interconnected computer networks that use the standardized Internet Protocol Suite (TCP/IP). It is a network of networks that consists of millions of private and public, academic, business, and government networks of local to global scope, linked by copper wires, fiber-optic cables, wireless connections, and other technologies. The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail, in addition to popular services such as online chat, file transfer and file sharing, online gaming, and Voice over Internet Protocol (VoIP) person-to-person communication via voice and video.

TECHNOLOGY

Our schools thrive on information. In an ever-changing world filled with new technology, our teachers and students require the right information, from the right sources, today. Having direct access to industry information gives the competitive edge needed to succeed. Student performance can be improved when the enhancement of teaching and learning using technology is adopted as the norm. Technology is the need of the day. Technological advancements have made society take a leap towards success, and every technological reform is a small step towards advancement. Every new invention in technology is a step towards the progress of mankind. Centuries ago, hardly anyone would have even dreamed of working on a computer, and generations of yesteryear would hardly have imagined being able to communicate with people on the other side of the globe. But there were some intelligent minds who dared to dream of such revolutionary discoveries, and they made the 'impossibles' possible. Ours as well as our future generations are lucky to be able to witness these technological reforms. We are fortunate enough to lead a life of luxury and comfort.
Since ours are the times of technology, why not let the technological reform spread far and wide? Why not make the masses aware of the new technology? Why not equip the entire society with the knowledge of the new inventions in technology? Computers can offer livelier explanations of various subjects, and the Internet is an ocean of information, which can be harnessed for the rendition of information in school.

Innovation refers to a new way of doing something. It may refer to incremental, radical, and revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between invention, an idea made manifest, and innovation, ideas applied successfully.

Infrastructure

Infrastructure is the basic, underlying framework or features of a system or organization. It can be defined as the basic physical and organizational structures needed for the operation of a society or enterprise, or the services and facilities necessary for an economy to function. The term typically refers to the technical structures that support a society, such as roads, water supply, sewers, power grids, telecommunications, and so forth. Viewed functionally, infrastructure facilitates the production of goods and services; for example, roads enable the transport of raw materials to a factory, and also the distribution of finished products to markets. In some contexts, the term may also include basic social services such as schools and hospitals.

In order for the internet connectivity to be improved, and as an IT CONSULTANT, I would suggest the following.

DSL TECHNOLOGY

DSL (Digital Subscriber Loop) is a family of broadband Internet access technologies. The standard data reception speed ranges from 128 kbps up to 24,000 kbps, depending on the applied DSL technology and its level. The commercial success of DSL and similar technologies reflects the fact that in recent decades, as electronics has become faster and cheaper, the cost of digging trenches for new wiring has remained stubbornly high. All of the DSL technologies use highly complex algorithms for converting the digital signal, overcoming the inherent limitations of copper wire strands. Not long ago the cost of such an installation would have been incredibly high, but thanks to VLSI technology, installing DSL on already existing local loops, with a DSLAM multiplexer on one end and a DSL modem on the other, requires far less expenditure than installing new optic fiber over the same route and distance. DSL technology is used in the majority of homes and small offices; proper filters enable the simultaneous operation of voice services and DSL. A DSL modem can use the same subscriber's line as communication devices based on POTS technology, such as faxes and analogue modems, but only one DSL modem can use the subscriber's line at a time. The standard method of making DSL available to many computers on the same premises is using a router, which connects the DSL modem to the local Ethernet or Wi-Fi network. Separate upstream and downstream channels carry the connection between the subscriber and the Internet service provider.

DVB-S TECHNOLOGY

DVB-S is currently one of the most popular methods of satellite access to the Internet. Like every access technology, DVB-S has drawbacks and advantages that follow directly from its physical construction and the laws that govern it. Major advantages include:

- Global reach: given a suitable arrangement of satellites in their orbits (rosette or polar), the technology provides practically global coverage, including areas at high latitudes (the poles). Commercial solutions, however, currently use geostationary orbits, which cover areas at moderate latitudes (Europe and similar regions).

- Flexibility: since the technique relies on wireless radio transmission, the client is independent of any cables connecting him to the operator. This allows the use of portable terminals (operated while stationary but easily relocated) and mobile terminals (operated while in motion), so the system can be adapted to clients' needs.

There are a few problems connected with this kind of access. First of all, the operator must guarantee constant access to his network (continuous availability of services) and proper quality of service (QoS), with constant monitoring of that parameter. Failure to fulfill these conditions may drive today's demanding clients (e.g. a bank) to a different operator. The next constraint is the maximum radiated power of the transmit-receive system (EIRP), which is strictly regulated by the telecommunication regulatory office in each country. The remaining important aspects include adequate system capacity and provision of the necessary bandwidth in the proper frequency ranges. A crucial feature of any teleinformatic system is low transmission delay. As mentioned before, the construction of the link itself (specifically the distance between the satellite and Earth) imposes limits set by the propagation speed of electromagnetic waves in free space. Of the possible orbits on which satellites can be located, the shortest transmission delays logically occur for units placed closer to Earth. Theoretical transmission delays for the individual orbits are:

- LEO (Low Earth Orbit): 2 to 50 ms
- MEO (Medium Earth Orbit): 30 to 70 ms
- GSO (Geostationary Orbit): approx. 120 ms
- HEO (Highly Elliptical Orbit): no fixed value, because the distance from Earth varies too much over time
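The delay figures above follow directly from the distance light must travel. A small sketch (the altitudes are representative assumed values, and only free-space propagation to a point directly below the satellite is counted) reproduces the order of magnitude:

```python
# Sketch: minimum one-way propagation delay for the orbit types above.
# Altitudes are assumed representative values, not exact orbital parameters.

C = 299_792_458  # speed of light in vacuum, m/s

def one_way_delay_ms(altitude_km: float) -> float:
    """Minimum one-way ground-to-satellite delay in milliseconds."""
    return altitude_km * 1000 / C * 1000

for name, alt_km in [("LEO", 1_000), ("MEO", 10_000), ("GSO", 35_786)]:
    print(f"{name}: ~{one_way_delay_ms(alt_km):.0f} ms")
```

The geostationary figure of roughly 119 ms matches the approx. 120 ms quoted in the list, which is why GSO-based Internet access always carries a noticeable latency penalty.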

Satellite access systems usually operate in two frequency bands: Ku (10-18 GHz) and Ka (18-31 GHz). The Ka band is divided into two sub-bands: 19.7-21.2 GHz for the downlink (satellite to terminal) and 29.5-31 GHz for the uplink (terminal to satellite). Use of the so-called V band (40-75 GHz) is also planned for satellite communication. Multiple access is realized with MF-TDMA (Multi Frequency Time Division Multiple Access), which assigns each user particular time slots within one of many frequency bands (three transmission dimensions: time, frequency and signal level).

OPTICAL-FIBER TECHNOLOGY

FITL (Fiber In The Loop) systems are the latest offer of the world's telecommunication concerns; they enable the combined delivery of various telecommunication and TV services. FITL makes it possible to build a common network serving telephone and TV subscribers while limiting or completely eliminating the copper wiring used until now. FITL systems strive to keep up with the ever-growing requirements placed on public telephone network operators. Apart from basic telecommunication services such as telephone connectivity, more and more is said about services beyond the basic range, such as multimedia services: video on demand, videoconferencing, video telephony, fast Internet access, electronic shopping and so on. FITL makes wide use of fiber techniques already proven in cable transmission lines. It assumes the use of optical carriers, mainly in the feeder or distribution part of a classic access network. Depending on the location of the optical network unit (ONU), three types of network architecture can be distinguished: FTTB (Fiber To The Building), FTTC (Fiber To The Curb) and FTTH (Fiber To The Home); the name of each architecture describes where the ONU is placed. All of these solutions can use the transport part of the telecommunication network, itself also based on optical-fiber cable, to reach the full group of services.

Reference models for FITL architecture

The OLT (optical line termination) is the access network's point of contact with the telecommunication services (depending on the network configuration, either an access node or directly the service node, e.g. the telephone exchange). Physically, the connection between the OLT and one or more optical network units is realized via the ODN over one or two optical fibers, depending on the transmission method applied (duplex, diplex or simplex). The ONU delivers the payload of a given telecommunication service between the ODN and the subscriber, and vice versa. The user's point of contact with the access network is the subscriber's socket.

FTTC, FTTB, FTTH Architectures

Depending on the location of the ONU module in the access network, three FITL network architectures are distinguished:
a) FTTC - Fiber To The Curb,
b) FTTB - Fiber To The Building,
c) FTTH - Fiber To The Home.

It must be noted, however, that the above designations describe how deeply the optical-fiber medium penetrates the access network. Transmission from the ONU to the subscriber still relies on the classical access network, i.e. a traditional copper pair using one of the available xDSL transmission techniques. All of the above-mentioned solutions provide sufficient bandwidth for present and future interactive, distribution, narrow-band and broadband applications.

WIRELESS LOCAL AREA NETWORK TECHNOLOGY

WLAN (Wireless Local Area Network) is at present probably the most popular and most frequently used wireless means of accessing the Internet. It is not a solution that guarantees top quality of service (QoS), but the modest price of the equipment fully makes up for this drawback. The hardware, especially for 2.4 GHz, is relatively cheap and still getting cheaper, which is why small Internet providers willingly use it to solve the last-mile problem. The IEEE 802.11 standard is promoted by the Local and Metropolitan Area Networks Standards Committee of the IEEE Computer Society. Before it was approved in June 1997, it went through six draft versions; only its final shape was recognized by both IEEE and ISO/IEC. It enabled a large group of manufacturers and vendors to develop a wide range of generally available devices for the ISM (Industrial, Scientific and Medical) and U-NII (Unlicensed National Information Infrastructure) bands.

At present a few WLAN standards are dominant, the most popular being 802.11a, 802.11b and 802.11g. Several physical layers are thus defined, giving the designer the possibility to choose among them depending on system requirements and future users' needs. The original IEEE 802.11 standard defines the two lowest layers of the wireless network model, operating with a throughput in the radio link of up to 2 Mbit/s. It provides for two types of radio interface: one operating in the 2.4 GHz band and one using infrared. The original IEEE 802.11 specification was characterized by low bandwidth and interoperability problems.

The 802.11b specification, published in 1999, defined a new PHY layer that guarantees a higher bit rate using DSSS (Direct Sequence Spread Spectrum) in the 2.4 GHz range; devices operating in this standard can send data at up to 11 Mbit/s. The 802.11a specification, presented in 2001, defines a PHY layer operating in the 5 GHz band; the maximum rate was increased to 54 Mbit/s thanks to, among other things, a new modulation method, OFDM (Orthogonal Frequency Division Multiplexing). 802.11g is the latest PHY specification, operating in the 2.4 GHz range and using the OFDM technique.
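The three PHY variants can be summarised as a small lookup table. The sketch below (the rate and band figures come from the text above; the data structure and selection helper are assumed purely for illustration) picks the fastest standard available in a given band:

```python
# The three PHY specifications above, summarised as data (figures as
# quoted in the text; real-world throughput is considerably lower).
WLAN_STANDARDS = {
    "802.11b": {"band_ghz": 2.4, "max_mbps": 11, "modulation": "DSSS"},
    "802.11a": {"band_ghz": 5.0, "max_mbps": 54, "modulation": "OFDM"},
    "802.11g": {"band_ghz": 2.4, "max_mbps": 54, "modulation": "OFDM"},
}

def fastest_in_band(band_ghz: float) -> str:
    """Pick the highest-rate standard available in a given band."""
    candidates = {k: v for k, v in WLAN_STANDARDS.items()
                  if v["band_ghz"] == band_ghz}
    return max(candidates, key=lambda k: candidates[k]["max_mbps"])

print(fastest_in_band(2.4))  # -> 802.11g
```

For a campus deployment this kind of table makes the trade-off explicit: 802.11g matches 802.11a's nominal rate while remaining backward-compatible with cheap 2.4 GHz 802.11b equipment.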

WLAN TOPOLOGY

a) An ad-hoc network in the 802.11x family is called an IBSS (Independent Basic Service Set). Creating an IBSS requires at least two devices (e.g. computers) equipped with wireless network adapter cards. Such a network is not connected to a wired infrastructure, so data exchange with the backbone network, and thus access to resources such as the Internet, is impossible. An ad-hoc network does not require access points.
b) A dependent network (BSS - Basic Service Set) uses devices called access points (APs). Their task is to amplify and regenerate received signals, manage traffic and provide access to the wired part of the infrastructure. The reach of a dependent network is limited to one access point, within whose range a mobile station may move without breaking the connection.
c) A compound network (ESS - Extended Service Set) is created when at least two BSS sub-networks are connected to a LAN; it is the most developed form of combined network and can be used successfully to build vast local computer networks.

WiMAX TECHNOLOGY

WiMAX is a wireless Internet access technology whose aim is to provide broadband network access to end users, mainly in cities. It is based on the American IEEE 802.16 standard and the European ETSI HiperMAN standard. WiMAX is an alternative to wired networks, especially in areas where the telecommunication infrastructure is poorly developed, and offers an optimal solution to the so-called last-mile problem, i.e. leading the link to its end user. Both standards (European and American) allow many base-station configurations, which may lead to situations in which devices from different manufacturers cannot co-operate; WiMAX's task is therefore to standardize device configuration and solve this problem. The solutions discussed are supposed to work both with and without direct optical visibility between the aerials, and they also let operators extend their Internet access offering with mobility, creating solutions (such as VoIP) that compete with mobile networks. The solutions described here refer to the 802.16 standard, so it is worth beginning with a presentation of the standard itself. Its development started in 1999 and, as one might expect, many versions have been created since. At the moment two of them are in force:

- 802.16-2004, finished in 2004, offers network access for stationary terminals.
- 802.16e, finished in 2005, offers network access for terminals that are either stationary or on the move. Certification of devices complying with this standard was expected to begin in 2007.

As already mentioned, the technology does not require direct visibility between the aerials, so it uses the NLOS (Non Line of Sight) model of radio-wave propagation. Adopting this model created certain problems for the standard's creators, the most important being the extension of the system's reach. Without optical visibility, multipath propagation occurs: many different signals reach the terminal, with various delay times, distortions and polarities different from the direct signal. These phenomena considerably decrease the received power. To improve the system's parameters, a number of solutions such as those presented below were applied:

OFDM (Orthogonal Frequency Division Multiplexing) is a multiplexing technique in which the available spectrum is divided into subcarriers and each portion of data is sent on a separate subcarrier; in other words, the data stream is divided into many smaller parallel sub-streams. The subcarriers, each carrying one sub-stream, are mutually orthogonal: they do not disturb each other despite overlapping. This makes the solution resistant to multipath propagation, and if one subcarrier degrades, only the part of the stream carried on it is lost, not the whole stream. On the basis of OFDM, the OFDMA method can be built, used optionally in the uplink, in which every user is allocated a specific number of subcarriers out of the total pool for transmission.

Division into sub-channels is optional in the uplink. Without it, building cost-effective terminals would be difficult. For example, if the terminal were to transmit in the same way as the base station, it would need large amounts of power and complex transmitting systems, which would raise costs. Reducing the terminal transmitter's power to, say, 25% of the base station transmitter's power can be achieved in two ways: either by using all the subcarriers and reducing the power of each by 75%, or by dividing the spectrum into sub-channels and transmitting on every fourth subcarrier with the same per-carrier power as the base station. Both solutions have the terminal transmit the same total power. The second is clearly better, its only drawback being a fourfold reduction of the uplink speed; in exchange, the terminal may transmit with far lower power than the base station, so its cost decreases.
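The sub-channelisation arithmetic above can be checked directly. In this sketch the subcarrier count (256) and the normalised power figures are assumed purely for illustration:

```python
# Worked check of the sub-channelisation argument above: a terminal may
# either use all subcarriers at 25% per-carrier power, or concentrate on
# every fourth subcarrier at full per-carrier power. Total radiated power
# is identical; only the uplink rate differs. (Idealised numbers.)

subcarriers = 256                # assumed subcarrier count
base_station_per_carrier = 1.0   # normalised per-carrier power

# Option 1: all subcarriers at 25% per-carrier power
p1 = subcarriers * 0.25 * base_station_per_carrier

# Option 2: every fourth subcarrier at full per-carrier power
p2 = (subcarriers // 4) * base_station_per_carrier

print(p1 == p2)            # same total transmit power
print(subcarriers // 4)    # only 64 of 256 carriers -> 1/4 the rate
```

The equality of the two totals is exactly why sub-channelisation lets the terminal use a cheaper amplifier without radiating any less usefully per carrier.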

Application of aerial techniques:
- AAS: such aerials make it possible to steer the aerial beam. During transmission the beam may be directed at a given user; during reception, AAS can direct beams only towards the direction from which the transmission comes.
- MIMO: the use of multiple aerials for signal transmission at both ends of the radio link (i.e. in the terminal and in the base station), which contributes to a considerable increase in speed.
- Adaptive modulation: this technique consists in choosing the right modulation depending on the distance between the terminal and the base station. The greater the distance, the more susceptible the signal is to degradation, so modulations more resistant to disturbance are used at the expense of lower transmission speed.

Other techniques:
- separation of transmission and reception
- proper error detection and correction techniques
- power control

VOICE OVER IP - INTERNET TELEPHONY

VoIP (Voice over Internet Protocol) is a technology for sending sound over networks based on the IP protocol, such as the Internet. By treating sound as an ordinary stream of data, it allows telephony and data transmission to be integrated into one network. Thanks to its numerous advantages, this technology is more and more often seen as an alternative to conventional telephone networks. VoIP also enables calls to subscribers who do not have Internet access and use the traditional telephone network or a mobile phone.

The basic principle of the technology's operation is relatively simple: the speech signal is converted to a digital representation, compressed with a suitable codec, divided into packets and sent via the IP protocol. Many supporting technologies are involved, including logical elements and special protocols. The logical elements are needed for, among other things, managing calls and storing information about them, routing packets, and so on. The new protocols are mainly signalling protocols, used for establishing connections and multimedia sessions, determining a user's location, translating addresses, negotiating the parameters of the call link, tearing down and managing calls, billing, and implementing security mechanisms. At present two main protocols are used for VoIP: H.323 and SIP.
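As a concrete illustration of the packetisation step, the sketch below estimates the bandwidth of a single call assuming the common G.711 codec with 20 ms of audio per packet and standard IPv4/UDP/RTP headers. These are textbook figures I am supplying for illustration, not values taken from the text above:

```python
# Back-of-the-envelope bandwidth for one VoIP call, assuming G.711
# (64 kbit/s payload), 20 ms of audio per packet, and 40 bytes of
# IPv4 + UDP + RTP headers per packet.

payload_bps = 64_000          # G.711: 8000 samples/s x 8 bits
frame_ms = 20                 # audio carried in each packet
header_bytes = 40             # IPv4 (20) + UDP (8) + RTP (12)

packets_per_second = 1000 / frame_ms                 # 50 packets/s
payload_bytes = payload_bps / 8 * frame_ms / 1000    # 160 bytes/packet
total_bps = packets_per_second * (payload_bytes + header_bytes) * 8

print(f"{total_bps / 1000:.0f} kbit/s per call")     # -> 80 kbit/s
```

Note how the per-packet headers inflate the nominal 64 kbit/s codec rate by a quarter, which is why codec choice and packet interval matter when sizing a campus network for telephony.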

H.323 Protocol

The first version of the protocol was adopted over ten years ago, in 1996, and the second in 1998. It belongs to the H.32x family of protocols, which describe multimedia connections over various networks:
H.320 - narrow-band digital ISDN networks
H.321 - broad-band digital ISDN and ATM networks
H.322 - packet networks with guaranteed bandwidth
H.323 - packet networks with no guaranteed bandwidth
H.324 - analogue POTS networks

All of these protocols support various sets of audio and video codecs, depending on the bandwidth available in the network. They can operate with transmission control (TCP) or without it (UDP); in VoIP connections no control protocols are used, to avoid additional delays. H.323 supervises the process of sending multimedia data in packet networks in real time. Its components precisely define how the particular parts of a system operating according to the protocol initiate multimedia sessions and how workstations exchange compressed audio and video data. The architecture of an IP telephony network based on the H.323 standard consists of four basic elements: terminals, gatekeepers, gateways and MCUs (Multipoint Control Units). Terminals are clients that can initiate and receive calls; they are also used for sending and receiving a two-way stream of data. A terminal can be either software running on a PC or a special-purpose device. All terminals should support carrying out a telephone conversation, while data or video service is optional. Gatekeepers manage the so-called zone, which is a collection of terminals, gateways and MCUs; the H.323 standard divides the network into such zones. Calls inside a zone are managed by its gatekeeper, and inter-zone calls can engage many gatekeepers. A gatekeeper, when present in the network, supervises the course of all telephone conversations carried out in its zone. Its basic tasks are control of the available bandwidth, routing calls, admitting or refusing calls in the zone, address translation and user authorization. The gatekeeper is also an interface to other H.323 networks. A gatekeeper is an optional element of the network, but if one is present in a given subnetwork, the terminals can use it.

A gateway is responsible for connecting the IP telephony network to other types of network; for example, a gateway can connect an H.323 network with SIP, PSTN (Public Switched Telephone Network) or ISDN networks. The gateway must provide a real-time interface between the various transmission formats and communication procedures, and it is responsible for establishing and tearing down connections in both connected networks. It must therefore have mechanisms for converting the various formats and must handle networks based on different technologies. An MCU runs conferences in which at least three end points participate. The MCU manages conference resources, runs negotiations between end points (agreeing, for instance, on the method of encoding audio and video data) and can steer the streams of packets containing multimedia data. An MCU consists of two basic elements: a Multipoint Controller (MC) and, optionally, several Multipoint Processors (MP). The MC is responsible for the exchange of information and the negotiation of communication parameters between the end points, and it runs H.225.0 signalling; the MP is responsible for, among other things, mixing the various multimedia streams, translating formats and, where necessary, redistributing streams to users.

Very often it is possible to integrate several network elements in one physical device. For example, the functionality of a gatekeeper can be combined with that of a gateway and an MCU, or an MCU can be built into the terminal so that conference calls are possible without additional devices.

LINUX ROUTER FOR A SMALL ISP

Everything takes place in a Linux environment (the distribution practically does not matter, but personally I recommend Slackware). I presume that readers have basic knowledge of compiling, patching and installing software on Linux systems, and that they are familiar with the basic principles of the iptables packet filter and the tc queuing tool. The following elements influence the system so that it manages the bandwidth properly:

I suggest installing the software in the above-stated order. Kernel 2.6.x is much more stable than the 2.4.x series. I have tested the above configuration on a large group of routers (10+) and it has been working perfectly well. Needless to say, in Linux tc (Traffic Control) can only limit outbound traffic on a given interface. What we need, then, is a virtual intermediate interface (the so-called IMQ) to which inbound traffic is redirected; thanks to this, inbound traffic can be limited as outbound traffic leaving the virtual IMQ device. Queuing can of course be arranged without IMQ, but I do not recommend that approach, because it relies on dropping packets, which makes later classification and prioritization impossible.

A few more words on the interfaces: no network card based on Realtek's chipset should be used (especially the RTL 8319), as these are devices designed for workstations and lose stability when there are many connections, which shows up as increased ping times and many other awkward phenomena. Instead I recommend cards based on Intel Pro or 3Com chipsets.

Another aspect is the ESFQ (Enhanced Stochastic Fair Queuing) mechanism for the WAN router. In comparison with the traditionally used SFQ mechanism, it enables fair division of the link by source and destination address (hash src, hash dst - upload and download to the WAN). Thanks to this, WAN interfaces can be made completely independent of LAN interfaces; briefly speaking, no queuing is necessary for the LAN, and the whole traffic will be fairly divided. All of the above solutions obviously build on the HTB (Hierarchical Token Bucket) mechanism.
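The HTB class hierarchy described above might be generated by a small helper like the following sketch. The interface name, rates and class IDs are assumed examples, and the tc commands are only printed here, not executed:

```python
# A minimal sketch of the HTB setup the text describes: one root qdisc,
# one parent class carrying the total WAN rate, and one rate-limited
# child class per client that may borrow up to the total ("ceil").

def htb_commands(iface: str, total_kbit: int, clients: dict) -> list[str]:
    """Build tc commands: an HTB root plus one rate-limited class per client."""
    cmds = [
        f"tc qdisc add dev {iface} root handle 1: htb default 99",
        f"tc class add dev {iface} parent 1: classid 1:1 htb rate {total_kbit}kbit",
    ]
    for classid, rate in clients.items():
        cmds.append(
            f"tc class add dev {iface} parent 1:1 classid 1:{classid} "
            f"htb rate {rate}kbit ceil {total_kbit}kbit"
        )
    return cmds

for cmd in htb_commands("eth0", 8192, {10: 1024, 20: 2048}):
    print(cmd)
```

Setting each client's ceil to the total link rate is what lets idle bandwidth be borrowed, which is the main advantage of HTB over a flat per-client cap.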

Information technology in education concerns the effects of continuing developments in information technology (IT) on education. The pace of change brought about by new technologies has had a significant effect on the way people live, work, and play worldwide. New and emerging technologies challenge the traditional process of teaching and learning, and the way education is managed. Information technology, while an important area of study in its own right, is having a major impact across all curriculum areas. Easy worldwide communication provides instant access to a vast array of data, challenging assimilation and assessment skills. Rapid communication, plus increased access to IT in the home, at work, and in educational establishments, could mean that learning becomes a truly lifelong activity, one in which the pace of technological change forces constant evaluation of the learning process itself.

The faculty and staff should become aware of the problems and opportunities presented by the use of IT and learn how to confront such challenges effectively. Today's internet plays a vital role in the educational advancement of the university.

An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organization's IT systems.

IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development.

They may also be responsible for user training and feedback. In many companies, these tasks will be carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties.

The explosive growth of the internet is the revolutionary phenomenon in computing and telecommunications. The internet has become the largest and most important of all networks today, and has evolved into a global information superhighway.

The internet has also become a key platform for a rapidly expanding list of information and entertainment services.


First, I will define what an IT consultant is.

• The Information Technology Consultant works with user groups to solve business problems with available technology, including hardware, software, databases, and peripherals. Services may include studying and analyzing systems needs; systems development; systems process analysis, design, and re-engineering; feasibility studies; developing requests for proposals; and evaluating vendor products and making recommendations on selection. Enterprise support may require knowledge of business management, systems engineering, operations research, and management engineering. Duties are performed at various levels within the defined category.

• IT consultants tend to focus their technical skills on predictable technologies: Microsoft server and desktop OSs, Microsoft Office software, e-mail platforms, and TCP/IP. Particularly ambitious IT consultants might add VoIP, Cisco, SonicWALL, and other network technologies to their plate, too. But a consultant may be missing a cash cow by overlooking Intuit products. The company recently reported a 9 percent increase in quarterly revenue versus the third quarter of 2008. With third-quarter 2009 revenue of $1.4 billion, someone is buying Intuit products, and many of those people could be a consultant's clients.

• A consultant is a professional who provides expert advice in a particular domain or area of expertise such as accountancy, information technology, the law, human resources, marketing, medicine, finance, or more esoteric areas of knowledge, for example engineering and scientific specialties such as materials science, instrumentation, avionics, and stress analysis.

• An expert in a specialized field brought in to provide independent professional advice to an organization on some aspect of its activities. A consultant may advise on the overall management of an organization or on a specific project, such as the introduction of a new computer system. Consultants are usually retained by a client for a set period of time, during which they will investigate the matter in hand and produce a report detailing their recommendations. Consultants may be established in business independently or be employed by a large consulting firm. Specific types of consultants include management consultants and internal consultants.

==I would suggest the following to improve the internet connectivity==

Technology

- Technology is helping to connect low-income residents with information, services, and people in their communities, and it can contribute to these efforts in significant ways. Universities tend to play three general roles in the marriage of technology applications to community development goals:

• Consultant
• Application Service Provider
• Catalyst

Although these roles are not mutually exclusive, each offers a different set of benefits and challenges for the specific technology project, and for the university-community partnership as a whole. Technology can open the door to meaningful partnerships. Although the field of community development has yet to take full advantage of IT, innovations in this area demonstrate that technology can play an important role in community development efforts. It is most effective when utilized as a means to advance community priorities, rather than as an end in itself. Using IT to achieve broad community goals can lead to sustainable, long-term collaboration between "town and gown."

Infrastructure

- Infrastructure can be defined as the basic physical and organizational structures needed for the operation of a society or enterprise, or the services and facilities necessary for an economy to function. The term typically refers to the technical structures that support a society, such as roads, water supply, sewers, power grids, telecommunications, and so forth. Viewed functionally, infrastructure facilitates the production of goods and services; for example, roads enable the transport of raw materials to a factory and the distribution of finished products to markets. In some contexts, the term may also include basic social services such as schools and hospitals. In military parlance, the term refers to the buildings and permanent installations necessary for the support, redeployment, and operation of military forces.

- Infrastructure consulting, integration services and solutions help build a responsive IT infrastructure, including networks, data centers and messaging systems, to meet the critical needs of today's businesses. Consulting firms have extensive methodologies for architecting, integrating and testing infrastructure solutions, and their long-standing strategic alliances with technology companies give customers access to those vendors' worldwide professional services during the architecture, design and implementation phases of projects. They also offer IT infrastructure assessment and audit services, which help realign the IT infrastructure to meet growing and changing business requirements. IT infrastructure availability is critical to a business: whenever a new application is scheduled to go online, or availability issues arise whose cause cannot be determined, infrastructure assessment services help align the IT infrastructure with business objectives.

- The basic physical systems of a country's or community's population, including roads, utilities, water, sewage, etc. These systems are considered essential for enabling productivity in the economy. Developing infrastructure often requires large initial investment, but the economies of scale tend to be significant.

- Internet infrastructure is the physical hardware used to interconnect computers and users. It includes the transmission media, such as telephone lines, cable television lines, satellites and antennas, as well as the routers, aggregators, repeaters, and other devices that control transmission paths. Infrastructure also includes the software used to send, receive, and manage the signals that are transmitted.

In some usages, infrastructure refers to interconnecting hardware and software and not to computers and other devices that are interconnected. However, to some information technology users, infrastructure is viewed as everything that supports the flow and processing of information.

Infrastructure companies play a significant part in evolving the Internet, both in terms of where the interconnections are placed and made accessible and in terms of how much information can be carried how quickly.

Innovation refers to a new way of doing something. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between invention, an idea made manifest, and innovation, ideas applied successfully (McKeown 2008). In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g. in the arts, economics, business and government policy. In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.

Innovation is an important topic in the study of economics, business, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation, and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to increasing productivity, the factors that lead to innovation are also considered critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth. Those who are directly responsible for the application of an innovation are often called pioneers in their field, whether they are individuals or organizations.

Goals of innovation: Programs of organizational innovation are typically tightly linked to organizational goals and objectives, to the business plan, and to market competitive positioning. For example, one driver for innovation programs in corporations is to achieve growth objectives. As Davila et al. (2006) note, "Companies cannot grow through cost reduction and reengineering alone . . . Innovation is the key element in providing aggressive top-line growth, and for increasing bottom-line results." In general, business organizations spend a significant amount of their turnover on innovation, i.e. making changes to their established products, processes and services. The amount of investment can vary from as low as half a percent of turnover for organizations with a low rate of change to over twenty percent of turnover for organizations with a high rate of change. The average investment across all types of organizations is four percent. For an organization with a turnover of, say, one billion currency units, this represents an investment of forty million units. This budget will typically be spread across various functions including marketing, product design, information systems, manufacturing systems and quality assurance.

Failure of innovation: Research findings vary, ranging from fifty to ninety percent of innovation projects judged to have made little or no contribution to organizational goals. One survey regarding product innovation reports that out of three thousand ideas for new products, only one becomes a success in the marketplace. Failure is an inevitable part of the innovation process, and most successful organizations factor in an appropriate level of risk. Perhaps it is because all organizations experience failure that many choose not to monitor the level of failure very closely. The impact of failure goes beyond the simple loss of investment. Failure can also lead to loss of morale among employees, an increase in cynicism and even higher resistance to change in the future. Innovations that fail are often potentially good ideas that were rejected or postponed due to budgetary constraints, lack of skills or poor fit with current goals. Failures should be identified and screened out as early in the process as possible.

Early screening avoids unsuitable ideas devouring scarce resources that are needed to progress more beneficial ones. Organizations can learn how to avoid failure when it is openly discussed and debated. The lessons learned from failure often reside longer in the organizational consciousness than lessons learned from success. While learning is important, high failure rates throughout the innovation process are wasteful and a threat to the organization's future. The causes of failure have been widely researched and can vary considerably. Some causes are external to the organization and outside its influence or control; others are internal and ultimately within the control of the organization. Internal causes of failure can be divided into causes associated with the cultural infrastructure and causes associated with the innovation process itself. Failure in the cultural infrastructure varies between organizations, but the following are common across all organizations at some stage in their life cycle (O'Sullivan, 2002):

1. Poor leadership
2. Poor organization
3. Poor communication
4. Poor empowerment
5. Poor knowledge management

Common causes of failure within the innovation process in most organizations can be distilled into five types:

1. Poor goal definition
2. Poor alignment of actions to goals
3. Poor participation in teams
4. Poor monitoring of results
5. Poor communication and access to information

Effective goal definition requires that organizations state explicitly what their goals are, in terms understandable to everyone involved in the innovation process. This often involves stating goals in a number of ways. Effective alignment of actions to goals should link explicit actions such as ideas and projects to specific goals. It also implies effective management of action portfolios.

Participation in teams refers to the behavior of individuals in and of teams; each individual should have an explicitly allocated responsibility regarding their role in goals and actions, and the payment and reward systems that link them to goal attainment. Finally, effective monitoring of results requires the monitoring of all goals, actions and teams involved in the innovation process. Innovation can fail if seen as an organizational process whose success stems from a mechanistic approach, i.e. "pull lever, obtain result". While "driving" change has an emphasis on control, enforcement and structures, it is only a partial truth in achieving innovation. Organizational gatekeepers frame the organizational environment that enables innovation; however, innovation is enacted (recognized, developed, applied and adopted) through individuals. Individuals are the "atom" of the organization, close to the minutiae of daily activities. Within individuals, a gritty appreciation of the small detail combines with a sense of desired organizational objectives to deliver (and innovate for) a product/service offer. From this perspective, innovation succeeds through strategic structures that engage the individual to the organization's benefit. Innovation pivots on intrinsically motivated individuals, within a supportive culture, informed by a broad sense of the future. Innovation implies change, and can run counter to an organization's orthodoxy. Space for a fair hearing of innovative ideas is required to balance the potential autoimmune exclusion that quells an infant innovative culture.

If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc.) in order for the internet connectivity to be improved?

If I were hired by the university as an IT consultant, I would suggest the following:

Computer Specification Requirements: First, I suggest that the university use high-end computer hardware. Computers that perform quickly and accurately also make browsing the internet faster. Computer applications keep being upgraded as users' tastes in software change, and the internet today is not like the internet of the past. Before, when you browsed the internet you saw mostly text and still pictures, which required minimal computer specifications; browsing the internet today, you will see many websites with heavy animation, video streaming that demands more video card memory, online games that place a heavy load on the computer's memory, and so on. The internet is used for traditional forms of entertainment as well: we listen to recording artists, preview or view motion pictures, read entire books and download material for future offline access. Live sporting events and concerts can be experienced as they happen, or recorded and viewed on demand. So I suggest that the university use computers built with today's technologies, fast and compatible enough for these demands, especially for the server.

Internet Connection Type: The Internet is a worldwide network comprising government, academic, commercial, military and corporate networks. The Internet was originally used by the US military, before becoming widely used for academic and commercial research. Users accessing the Internet can read and download data from almost anywhere in the world, and you can communicate across the Internet using e-mail.

In order to connect to the Internet, use e-mail and access the World Wide Web, you must obtain and set up a modem. This allows a PC to access the Internet over a telephone line; alternatively, a LAN modem provides WAN access to several people simultaneously on your LAN, allowing several PCs to share a single connection to the Internet. Obtain an Internet account from an Internet Service Provider (ISP). An ISP is a company that can provide access to the Internet and give you an Internet e-mail address. You then access the Internet by using your modem to dial into the ISP's server. Finally, obtain and install a web browser. This allows you to view web pages as well as send and receive e-mail.

For a high-speed internet connection, the university must subscribe to a broadband connection, such as DSL, with the highest available speed. Broadband is transmission capacity with sufficient bandwidth to permit the combined provision of voice, data and video. According to an ITU report, it refers to DSL and cable modem services with bandwidth greater than 128 kbps in at least one direction. In determining connection speed, we should consider the bandwidth of the internet service the university subscribes to. Bandwidth is the range of frequencies available to be occupied by signals; in analogue systems it is measured in Hertz (Hz) and in digital systems in bits per second (bit/s). The higher the bandwidth, the greater the amount of information that can be transmitted in a given time. High-bandwidth channels are referred to as broadband, which typically means 1.5/2.0 Mbit/s or higher. So if the number of connections in the laboratories increases, more bandwidth is needed.
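To make the "more connections need more bandwidth" point concrete, here is a minimal sizing sketch. All figures (number of lab PCs, 128 kbps per active user, 60% peak concurrency) are illustrative assumptions, not university measurements:

```python
# Rough subscription sizing: aggregate peak demand if only a fraction
# of the PCs are actively using the link at any one moment.
def required_bandwidth_kbps(num_pcs, per_pc_kbps, concurrency=0.6):
    """Estimated peak demand in kbps (all parameters are assumptions)."""
    return num_pcs * per_pc_kbps * concurrency

# e.g. three laboratories of 40 PCs each, 128 kbps per active PC
demand = required_bandwidth_kbps(3 * 40, 128)
print(f"{demand / 1000:.1f} Mbps")  # prints "9.2 Mbps"
```

Doubling the number of laboratory PCs doubles the estimate, which is the essential reason the subscription must grow with the number of connections.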

Network Architecture: Networks can be classified by their topology. A topology refers to the specific physical configuration of a network or a portion of a network that identifies how the various locations within the network are interconnected and how the hardware functions to form an operable network. There are many kinds of topology, but the one I suggest is the star topology.

Star Topology: Why star topology? In a star topology, all the stations are wired into a central wiring concentrator or network device that routes messages to the appropriate computer. Each computer is linked by a separate point-to-point circuit through the central connection point, so if one of the circuits comes loose or breaks, only one computer is disconnected from the network and the other computers connected to the internet are not affected. In a hub-based star topology, packets sent from one station to another are repeated to all ports. This allows all stations to see each packet sent on the network, but only the station a packet is addressed to pays attention to it.

Because the central computer or network device receives all messages in the network, it must have sufficient capacity to handle traffic peaks; otherwise it may become overloaded. A break in a cable affects only the one computer on that circuit, but since all computers on the network connect to the central point, if the central point fails, the entire network fails.
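The failure behaviour just described can be sketched as a toy model (the node names are hypothetical; this is a plain Python illustration, not a network tool):

```python
# Star topology model: every node connects only through the central
# device, so a single cable fault isolates one node, while a failure
# of the central device isolates every node.
def reachable_nodes(nodes, center_up, failed_links):
    """Return the set of nodes still connected through the center."""
    if not center_up:
        return set()  # central point down: entire network fails
    return {n for n in nodes if n not in failed_links}

lab = {"pc1", "pc2", "pc3", "pc4"}
print(reachable_nodes(lab, center_up=True, failed_links={"pc3"}))  # three PCs stay up
print(reachable_nodes(lab, center_up=False, failed_links=set()))   # set() - all down
```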

Structured Cabling System: Structured cabling skills are crucial for any networking professional. Structured cabling creates a physical topology in which telecommunications cabling is organized into hierarchical termination and interconnection structures according to standards. The word telecommunications is used to express the necessity of dealing with electrical power wires, telephone wires, and cable television coaxial cable in addition to copper and optical networking media. Structured cabling is an OSI Layer 1 issue. Without Layer 1 connectivity, the Layer 2 switching and Layer 3 routing processes that make data transfer across large networks possible cannot occur. Especially for people new to the networking workforce, many day-to-day jobs deal with structured cabling. Many different standards are used to define the rules of structured cabling, and these standards vary around the world. Three standards of central importance in structured cabling are ANSI TIA/EIA-568-B, ISO/IEC 11801, and IEEE 802.x.

Why practice structured cabling? It is a way of planning the physical arrangement of the cables and the placement of device ports or outlets according to purpose. One of its subsystems is backbone cabling. Backbone cabling provides interconnection between telecommunication closets, equipment rooms, and entrance facilities. It consists of the backbone cables, intermediate and main cross-connects, mechanical terminations, and the patch cords or jumpers used for backbone-to-backbone cross-connection. It also includes cable between buildings.

When connecting buildings, I suggest that the university use FDDI (Fiber Distributed Data Interface) technology. This is a network standard for high-speed transmission over fiber-optic cable. FDDI uses two rings of fiber-optic cabling (providing greater resilience) and transmits at 100 Mbps at distances of up to 2 km (1.24 miles) between nodes. FDDI is typically used as a backbone technology providing connectivity between Ethernet and Token Ring networks, and it is used in critical applications, for example in airports. A fiber-optic cable is a cable containing one or more optical fibers used for transmitting data in the form of light. Fiber-optic cable is more expensive than copper but is not susceptible to electromagnetic interference and is capable of higher data transfer speeds over greater distances. Fiber optics is commonly used by organizations where connections between two or more buildings are concerned.

Network Devices: The first versions of Ethernet used coaxial cable to connect computers in a bus topology. Each computer was directly connected to the backbone. These early versions of Ethernet were known as Thicknet (10BASE5) and Thinnet (10BASE2).

10BASE5, or Thicknet, used a thick coaxial cable that allowed for cabling distances of up to 500 meters before the signal required a repeater. 10BASE2, or Thinnet, used a thin coaxial cable that was smaller in diameter and more flexible than Thicknet and allowed for cabling distances of 185 meters.
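The two legacy media above can be captured in a small reference table, with a helper that flags runs exceeding the segment limit (the figures are the 500 m and 185 m limits quoted above; the function name is my own):

```python
# Segment limits of the legacy coaxial Ethernet media described above.
ETHERNET_COAX = {
    "10BASE5 (Thicknet)": {"max_segment_m": 500, "speed_mbps": 10},
    "10BASE2 (Thinnet)":  {"max_segment_m": 185, "speed_mbps": 10},
}

def needs_repeater(medium, run_length_m):
    """True when a cable run exceeds the medium's segment limit."""
    return run_length_m > ETHERNET_COAX[medium]["max_segment_m"]

print(needs_repeater("10BASE2 (Thinnet)", 200))   # True: 200 m > 185 m
print(needs_repeater("10BASE5 (Thicknet)", 400))  # False: within 500 m
```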

The ability to migrate the original implementation of Ethernet to current and future Ethernet implementations is based on the practically unchanged structure of the Layer 2 frame. Physical media, media access, and media control have all evolved and continue to do so. But the Ethernet frame header and trailer have essentially remained constant.

The early implementations of Ethernet were deployed in a low-bandwidth LAN environment where access to the shared media was managed by CSMA, and later CSMA/CD. In addition to being a logical bus topology at the Data Link layer, Ethernet also used a physical bus topology. This topology became more problematic as LANs grew larger and LAN services made increasing demands on the infrastructure.

The original thick coaxial and thin coaxial physical media were replaced by early categories of UTP cables. Compared to the coaxial cables, the UTP cables were easier to work with, lightweight, and less expensive.

The physical topology was also changed to a star topology using hubs. Hubs concentrate connections. In other words, they take a group of nodes and allow the network to see them as a single unit. When a frame arrives at one port, it is copied to the other ports so that all the segments on the LAN receive the frame. Using the hub in this bus topology increased network reliability by allowing any single cable to fail without disrupting the entire network. However, repeating the frame to all other ports did not solve the issue of collisions; collisions in Ethernet networks were eventually managed with the introduction of switches into the network.
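The contrast between a hub repeating frames and a switch forwarding them selectively can be sketched as follows (port numbers and the MAC table are illustrative; this is a toy model, not real switch logic):

```python
# A hub copies every incoming frame to all other ports; a learning
# switch forwards only to the port where the destination MAC was
# learned, and floods like a hub only when the destination is unknown.
def hub_forward(ports, in_port):
    return [p for p in ports if p != in_port]

def switch_forward(ports, in_port, mac_table, dst_mac):
    if dst_mac in mac_table:
        return [mac_table[dst_mac]]        # known destination: one port
    return hub_forward(ports, in_port)     # unknown destination: flood

ports = [1, 2, 3, 4]
print(hub_forward(ports, in_port=1))                     # [2, 3, 4]
print(switch_forward(ports, 1, {"aa:bb": 3}, "aa:bb"))   # [3]
```

The switch's selective forwarding is what shrinks collision domains and removes the congestion that hubs create on a busy laboratory LAN.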

By the way, Ethernet was invented by Xerox Corporation and developed jointly by Xerox, Intel, and Digital Equipment Corporation (DEC) and is a widely used LAN technology.

Ethernet networks use the CSMA/CD protocol and run over various cables at a rate of 10 Mbps; they are used by, for example, the TCP/IP and XNS protocols. Ethernet corresponds to a series of standards produced by the IEEE referred to as IEEE 802.3.

Cable Medium: Twisted pair (TP) is a pair of thin wires commonly used for telephone wiring and computer networks. The wires are twisted around each other to minimize interference from other cables. The two major types of twisted pair are shielded twisted pair (STP) and unshielded twisted pair (UTP). UTP is popular because it is thinner and doesn't take up much room, but STP has added protection against electromagnetic interference.

Twisted pair cable has easy-to-use connectors that simply slot into the ports on the devices and network equipment.

If one of the twisted pair cables is damaged or disconnected, only that specific connection is broken and the rest of the network remains operational. Making changes to the network, such as adding PCs, is easy and can be done without affecting other devices on the network.

For Ethernet networks, you can use either Category 3 or Category 5 cable. Category 5 is one of five grades of twisted pair cabling defined by the EIA/TIA-568 standard. Category 5 is used in 100BASE-T (Fast Ethernet) networks and can transmit data at speeds up to 100 Mbps. Category 5 cabling is better for network cabling than Category 3 because it supports both Ethernet (10 Mbps) and Fast Ethernet (100 Mbps) speeds. If Category 5 cable is used, you can upgrade your network from Ethernet to Fast Ethernet in the future, whereas Category 3 cable cannot be used for Fast Ethernet networks.

To adhere to Ethernet and Fast Ethernet standards and avoid poor network performance, the distance between a PC or laptop and a hub or switch should never exceed 100 m (328 ft). When connecting two Fast Ethernet devices, it is recommended that the cable used to connect them be no longer than 5 m (16.4 ft). This is so that PCs and laptops can be connected to those devices with 100 m (328 ft) cables without exceeding the Fast Ethernet maximum total cabling length of 205 m (672 ft) between any two PCs or laptops.
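The 205 m figure above is just the sum of the individual limits, which a quick check confirms:

```python
# Fast Ethernet end-to-end cabling budget: two 100 m PC-to-device runs
# plus one 5 m device-to-device inter-switch link.
MAX_PC_TO_DEVICE_M = 100
MAX_DEVICE_TO_DEVICE_M = 5

total_m = 2 * MAX_PC_TO_DEVICE_M + MAX_DEVICE_TO_DEVICE_M
print(total_m)  # prints 205
```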

According to my friend who works at AMA Computer Learning Center Davao, the newest network cable in use today is UTP Category 6, which offers greater network capability, meaning faster and longer-reach network connections.

to be continued...

Last edited by Anthony Rigor Aguilar on Wed Aug 19, 2009 9:57 am; edited 1 time in total