OS/2 is a series of computer operating systems, initially created by Microsoft and IBM under the leadership of IBM software designer Ed Iacobucci.[2] As a result of a feud between the two companies over how to position OS/2 relative to Microsoft's new Windows 3.1 operating environment,[3] the two companies severed the relationship in 1992 and OS/2 development fell to IBM exclusively.[4] The name stands for "Operating System/2", because it was introduced as part of the same generation change release as IBM's "Personal System/2 (PS/2)" line of second-generation personal computers. The first version of OS/2 was released in December 1987 and newer versions were released until December 2001.

OS/2 was intended as a protected mode successor of PC DOS. Notably, basic system calls were modelled after MS-DOS calls; their names even started with "Dos" and it was possible to create "Family Mode" applications: text mode applications that could work on both systems.[5] Because of this heritage, OS/2 shares similarities with Unix, Xenix, and Windows NT.

IBM discontinued its support for OS/2 on 31 December 2006.[6] Since then, it has been updated, maintained and marketed under the name eComStation. In 2015 it was announced[7] that a new OEM distribution of OS/2 would be released that was to be called ArcaOS.[8] ArcaOS is available for purchase.[9]

The development of OS/2 began when IBM and Microsoft signed the "Joint Development Agreement" in August 1985.[10][11] It was code-named "CP/DOS" and it took two years for the first product to be delivered.

OS/2 1.0 was announced in April 1987 and released in December. The original release was text-mode only; a GUI was introduced with OS/2 1.1 about a year later. OS/2 features an API for controlling the video display (VIO) and handling keyboard and mouse events so that programmers writing for protected mode need not call the BIOS or access hardware directly. In addition, the development tools include a subset of the video and keyboard APIs as linkable libraries so that family mode programs are able to run under MS-DOS. A task switcher named Program Selector is available through the Ctrl-Esc hotkey combination, allowing the user to select among multitasked text-mode sessions (or screen groups; each can run multiple programs).[12]

The promised user interface, Presentation Manager, was introduced with OS/2 1.1 in October 1988.[13] It had a similar user interface to Windows 2.1, which was released in May of that year. (The interface was replaced in versions 1.2 and 1.3 by a look closer in appearance to Windows 3.1).

The Extended Edition of 1.1, sold only through IBM sales channels, introduced distributed database support for IBM database systems and SNA communications support for IBM mainframe networks.

The collaboration between IBM and Microsoft unravelled in 1990, between the releases of Windows 3.0 and OS/2 1.3. During this time, Windows 3.0 became a tremendous success, selling millions of copies in its first year.[17] Much of its success was because Windows 3.0 (along with MS-DOS) was bundled with most new computers.[18] OS/2, on the other hand, was available only as an additional stand-alone software package. In addition, OS/2 lacked device drivers for many common devices such as printers, particularly non-IBM hardware.[19] Windows, on the other hand, supported a much larger variety of hardware. The increasing popularity of Windows prompted Microsoft to shift its development focus from cooperating on OS/2 with IBM to building its own business based on Windows.[20]

Several technical and practical reasons contributed to this breakup.

The two companies had significant differences in culture and vision. Microsoft favored the open hardware system approach that contributed to its success on the PC; IBM sought to use OS/2 to drive sales of its own hardware, including systems that could not support the features Microsoft wanted. Microsoft programmers also became frustrated with IBM's bureaucracy and its use of lines of code to measure programmer productivity.[21] IBM developers complained about the terseness and lack of comments in Microsoft's code, while Microsoft developers complained that IBM's code was bloated.[22]

The two products had significant differences in their APIs. OS/2 was announced when Windows 2.0 was near completion, and the Windows API was already defined. However, IBM requested that this API be significantly changed for OS/2.[23] Application compatibility issues therefore appeared immediately. OS/2's designers hoped for source code conversion tools that would eventually allow complete migration of Windows application source code to OS/2. However, OS/2 1.x did not gain enough momentum to allow vendors to avoid developing for both OS/2 and Windows in parallel.

OS/2 1.x targets the Intel 80286 processor, something DOS fundamentally does not. IBM insisted on supporting the 80286 processor, with its 16-bit segmented memory mode, because of commitments made to customers who had purchased many 80286-based PS/2s as a result of IBM's promises surrounding OS/2.[24] Until release 2.0 in April 1992, OS/2 ran in 16-bit protected mode and therefore could not benefit from the Intel 80386's much simpler 32-bit flat memory model and virtual 8086 mode features. This was especially painful in providing support for DOS applications. While, in 1988, Windows/386 2.1 could run several cooperatively multitasked DOS applications, including expanded memory (EMS) emulation, OS/2 1.3, released in 1991, was still limited to one 640 kB "DOS box".

Given these issues, Microsoft started to work in parallel on a version of Windows which was more future-oriented and more portable. The hiring of Dave Cutler, former VMS architect, in 1988 created an immediate competition with the OS/2 team, as Cutler did not think much of the OS/2 technology and wanted to build on his work at Digital rather than creating a "DOS plus". His "NT OS/2" was a completely new architecture.[25]


IBM grew concerned about the delays in development of OS/2 2.0. Initially, the companies agreed that IBM would take over maintenance of OS/2 1.0 and development of OS/2 2.0, while Microsoft would continue development of OS/2 3.0. In the end, Microsoft decided to recast NT OS/2 3.0 as Windows NT, leaving all future OS/2 development to IBM. From a business perspective, it was logical to concentrate on a consumer line of operating systems based on DOS and Windows, and to prepare a new high-end system in such a way as to keep good compatibility with existing Windows applications. While waiting for this new high-end system to develop, Microsoft would still receive licensing money from Xenix and OS/2 sales. Windows NT's OS/2 heritage can be seen in its initial support for the HPFS filesystem, text mode OS/2 1.x applications, and OS/2 LAN Manager network support. Some early NT materials even included OS/2 copyright notices embedded in the software.[citation needed]
One example of NT's OS/2 1.x support is in the Windows 2000 resource kit. Windows NT could also support OS/2 1.x Presentation Manager and AVIO applications with the addition of the Windows NT Add-On Subsystem for Presentation Manager.[26]

OS/2 2.0 was released in April 1992. It provided a 32-bit API for native programs, though the OS itself still contained some 16-bit code and drivers. It also included a new OOUI (object-oriented user interface) called the Workplace Shell. This was a fully object-oriented interface that was a significant departure from the previous GUI. Rather than merely providing an environment for program windows (such as the Program Manager), the Workplace Shell provided an environment in which a user could manage programs, files and devices by manipulating objects on the screen. With the Workplace Shell, everything in the system is an "object" to be manipulated.

OS/2 2.0 was touted by IBM as "a better DOS than DOS and a better Windows than Windows".[27] It managed this by including a fully licensed MS-DOS 5.0, which had been patched and improved upon. For the first time, OS/2 was able to run more than one DOS application at a time. This was so effective that it allowed OS/2 to run a modified copy of Windows 3.0, itself a DOS extender, including Windows 3.0 applications.

Because of the limitations of the Intel 80286 processor, OS/2 1.x could run only one DOS program at a time, and did this in a way that allowed the DOS program to have total control over the computer. A problem in DOS mode could crash the entire computer. In contrast, OS/2 2.0 could benefit from the virtual 8086 mode of the Intel 80386 processor to create a much safer virtual machine in which to run DOS programs. This included an extensive set of configuration options to optimize the performance and capabilities given to each DOS program. Any real mode operating system (such as 8086 Xenix) could also be made to run using OS/2's virtual machine capabilities, subject to certain direct hardware access limitations.

Like most 32-bit environments, OS/2 could not run protected-mode DOS programs using the older VCPI interface, unlike the Standard mode of Windows 3.1; it only supported programs written according to DPMI. (Microsoft discouraged the use of VCPI under Windows 3.1, however, due to performance degradation.[28])

Unlike Windows NT, OS/2 also always gave DOS programs the possibility of masking real hardware interrupts, so any DOS program could deadlock the machine this way. OS/2 could however use a hardware watchdog on selected machines (notably IBM machines) to break out of such a deadlock. Later, release 3.0 leveraged the enhancements of newer Intel 486 and Intel Pentium processors—the Virtual Interrupt Flag (VIF), which was part of the Virtual Mode Extensions (VME)—to solve this problem.

Compatibility with Windows 3.0 (and later Windows 3.1) was achieved by adapting Windows user-mode code components to run inside a virtual DOS machine (VDM). Originally, a nearly complete version of Windows code was included with OS/2 itself: Windows 3.0 in OS/2 2.0, and Windows 3.1 in OS/2 2.1. Later, IBM developed versions of OS/2 that would use whatever Windows version the user had installed previously, patching it on the fly, and sparing the cost of an additional Windows license.[29] It could either run full-screen, using its own set of video drivers, or "seamlessly," where Windows programs would appear directly on the OS/2 desktop. The process containing Windows was given fairly extensive access to hardware, especially video, and the result was that switching between a full-screen WinOS/2 session and the Workplace Shell could occasionally cause issues.[30]

Because OS/2 only runs the user-mode system components of Windows, it is not compatible with Windows device drivers (VxDs) and applications needing them.

Multiple Windows applications run by default in a single Windows session - multitasking cooperatively and without memory protection - just as they would under native Windows 3.x. However, to achieve true isolation between Windows 3.x programs, OS/2 also can run multiple copies of Windows in parallel, with each copy residing in a separate VDM. The user can then optionally place each program either in its own Windows session - with preemptive multitasking and full memory protection between sessions, though not within them - or allow some applications to run together cooperatively in a shared Windows session while isolating other applications in one or more separate Windows sessions. At the cost of additional hardware resources, this approach can protect each program in any given Windows session (and each instance of Windows itself) from every other program running in any separate Windows session (though not from other programs running in the same Windows session).

Whether Windows applications are running in full-screen or windowed mode, and in one Windows session or several, it is possible to use DDE between OS/2 and Windows applications, and OLE between Windows applications only.[31]

Released in 1994, OS/2 version 3.0 was labelled as OS/2 Warp to highlight the new performance benefits, and generally to freshen the product image. "Warp" had originally been the internal IBM name for the release: IBM claimed that it had used Star Trek terms as internal names for prior OS/2 releases, and that this one seemed appropriate for external use as well. At the launch of OS/2 Warp in 1994, Patrick Stewart was to be the Master of Ceremonies; however Kate Mulgrew[32] of the then-upcoming series Star Trek: Voyager was substituted at the last minute.[33][34]:p. 108

OS/2 Warp offers a host of benefits over OS/2 2.1, notably broader hardware support, greater multimedia capabilities, Internet-compatible networking, and a basic office application suite known as IBM Works. It was released in two versions: the less expensive "Red Spine" and the more expensive "Blue Spine" (named for the color of their boxes). "Red Spine" was designed to support Microsoft Windows applications by utilizing any existing installation of Windows on the computer's hard drive. "Blue Spine" includes Windows support in its own installation, and so can support Windows applications without a Windows installation. As most computers were sold with Microsoft Windows pre-installed, and as it was the cheaper of the two, "Red Spine" was the more popular product.[citation needed] OS/2 Warp Connect—which has full LAN client support built-in—followed in mid-1995. Warp Connect was nicknamed "Grape".[13]

In OS/2 2.0, most performance-sensitive subsystems, including the graphics (Gre) and multimedia (MMPM/2) systems, were updated to 32-bit code in a fixpack, and included as part of OS/2 2.1. Warp 3 brought about a fully 32-bit windowing system, while Warp 4 introduced the object-oriented 32-bit GRADD display driver model.


In 1996, Warp 4 added Java and speech recognition software. IBM also released server editions of Warp 3 and Warp 4 which bundled IBM's LAN Server product directly into the operating system installation. A personal version of Lotus Notes was also included, with a number of template databases for contact management, brainstorming, and so forth. The UK-distributed free demo CD-ROM of OS/2 Warp essentially contained the entire OS and was easily, even accidentally, cracked[clarification needed], meaning that even people who liked it did not have to buy it. This was seen as a backdoor tactic to increase the number of OS/2 users, in the belief that this would increase sales and demand for third-party applications, and thus strengthen OS/2's desktop numbers.[citation needed] This suggestion was bolstered by the fact that this demo version had replaced another which was not so easily cracked, but which had been released with trial versions of various applications.[citation needed] In 2000, the July edition of Australian Personal Computer magazine's bundled software CD-ROMs included a full version of Warp 4 that required no activation, essentially a free release. Special versions of OS/2 2.11 and Warp 4 also included symmetric multiprocessing (SMP) support.

OS/2 sales were largely concentrated in networked computing used by corporate professionals; however, by the early 1990s, it was overtaken by Microsoft Windows NT. While OS/2 was arguably technically superior to Microsoft Windows 95, OS/2 failed to develop much penetration in the consumer and stand-alone desktop PC segments; there were reports that it could not be installed properly on IBM's own Aptiva series of home PCs.[35] Microsoft made an offer in 1994 where IBM would receive the same terms as Compaq (the largest PC manufacturer at the time) for a license of Windows 95, if IBM ended development of OS/2 completely. IBM refused and instead went with an "IBM First" strategy of promoting OS/2 Warp and disparaging Windows, as IBM aimed to drive sales of its own software as well as hardware. By 1995, Windows 95 negotiations between IBM and Microsoft, which were already difficult, stalled when IBM purchased Lotus SmartSuite, which would have directly competed with Microsoft Office. As a result of the dispute, IBM signed the license agreement 15 minutes before Microsoft's Windows 95 launch event, later than its competitors, which badly hurt sales of IBM PCs. IBM officials later conceded that OS/2 would not have been a viable operating system to keep them in the PC business.[36][37]

In 1991 IBM started development on an intended replacement for OS/2 called Workplace OS. This was an entirely new product, brand new code, that borrowed only a few sections of code from both the existing OS/2 and AIX products. It used an entirely new microkernel code base, intended (eventually) to host several of IBM's operating systems (including OS/2) as microkernel "personalities". It also included major new architectural features including a system registry, JFS, support for UNIX graphics libraries, and a new driver model.[38]

Workplace OS was developed solely for POWER platforms, and IBM intended to market a full line of PowerPCs in an effort to take over the market from Intel. A team was formed to create prototypes of these machines, which were disclosed to several corporate customers, all of whom raised issues with the idea of dropping Intel.

Advanced plans for the new code base would eventually include replacement of the OS/400 operating system by Workplace OS, as well as a microkernel product that would have been used in industries such as telecommunications and set-top television receivers.

A partial pre-alpha version of Workplace OS was demonstrated at Comdex, where a bemused Bill Gates stopped by the booth. The second and last time it was shown in public was at an OS/2 user group in Phoenix, Arizona, where the pre-alpha code refused to boot.

OS/2 Warp, PowerPC Edition was released in 1995. But with $990 million being spent per year on development of OS/2 as well as Workplace OS, and no prospect of profit or widespread adoption, the end of the entire Workplace OS and OS/2 product line was near.

A project was launched internally by IBM to evaluate the looming competitive situation with Microsoft Windows 95. Primary concerns included the major code quality issues in the existing OS/2 product (resulting in over 20 service packs, each requiring more diskettes than the original installation), and the ineffective and heavily matrixed development organization in Boca Raton (where the consultants reported that "basically, everybody reports to everybody") and Austin.

That study, tightly classified as "Registered Confidential" and printed only in numbered copies, identified untenable weaknesses and failures across the board in the Personal Systems Division as well as across IBM as a whole. This resulted in a decision being made at a level above the Division to cut over 95% of the overall budget for the entire product line, end all new development (including Workplace OS), eliminate the Boca Raton development lab, end all sales and marketing efforts of the product, and lay off over 1,300 development individuals (as well as sales and support personnel). $990 million had been spent in the last full year. Warp 4 became the last distributed version of OS/2.

A small and dedicated community remained faithful to OS/2 for many years after its final mainstream release,[39] but overall, OS/2 failed to catch on in the mass market and is little used outside certain niches where IBM traditionally had a stronghold. For example, many bank installations, especially automated teller machines, run OS/2 with a customized user interface; French SNCF national railways used OS/2 1.x in thousands of ticket selling machines.[citation needed] Telecom companies such as Nortel use OS/2 in some voicemail systems. Also, OS/2 was used for the host PC used to control the Satellite Operations Support System equipment installed at NPR member stations from 1994 to 2007, and used to receive the network's programming via satellite.[citation needed]

Although IBM began indicating shortly after the release of Warp 4 that OS/2 would eventually be withdrawn, the company did not end support until December 31, 2006.[40] Sales of OS/2 stopped on December 23, 2005. The latest IBM version is 4.52, which was released for both desktop and server systems in December 2001. Serenity Systems has been reselling OS/2 since 2001, calling it eComStation. Version 1.2 was released in 2004. After a series of preliminary "release candidates," version 2.0 GA (General Availability) was released on 15 May 2010.[41] eComStation version 2.1 GA was released on May 20, 2011.[42]

IBM is still delivering defect support for a fee.[40][43] IBM urges customers to migrate their often highly complex applications to e-business technologies such as Java in a platform-neutral manner. Once application migration is completed, IBM recommends migration to a different operating system, suggesting Linux as an alternative.[44][45][46]


As of 2008, support for running OS/2 under virtualization appears to be improving in several third-party products. OS/2 has historically been more difficult to run in a virtual machine than most other legacy x86 operating systems because of its extensive reliance on the full set of features of the x86 CPU; in particular, OS/2's use of ring 2 prevented it from running in VMware.[47] Emulators such as QEMU and Bochs don't suffer from this problem and can run OS/2.[citation needed]
A beta of VMware Workstation 2.0, released in January 2000, was the first hypervisor that could run OS/2 at all. The company later dropped official OS/2 support.

Virtual PC from Microsoft (originally Connectix) has been able to run OS/2 without hardware virtualization support for many years. It also provided "additions" code which greatly improves host-guest OS interactions in OS/2. The additions are not provided with the current version of Virtual PC, but those last included with an earlier release may still be used with current releases. At one point, OS/2 was a supported host for Virtual PC in addition to a guest. Note that OS/2 runs as a guest only on those versions of Virtual PC that use virtualization (x86-based hosts), not on those doing full emulation (Virtual PC for Mac).

VirtualBox from Oracle Corporation (originally InnoTek, later Sun) supports OS/2 Warp 3, 4 and 4.5 as well as eComStation as guests. However, running OS/2 or eComStation can still be difficult, if not impossible, because of the strict requirement for VT-x/AMD-V hardware-enabled virtualization, and only ACP2/MCP2 is reported to work reliably.[48]

The difficulties in efficiently running OS/2 have, at least once, created an opportunity for a new virtualization company. A large bank in Moscow needed a way to use OS/2 on newer hardware that OS/2 did not support. As virtualization software is an easy way around this, the company desired to run OS/2 under a hypervisor. Once it was determined that VMware was not a possibility, it hired a group of Russian software developers to write a host-based hypervisor that would officially support OS/2. Thus, the Parallels, Inc. company and their Parallels Workstation was born.[49]

OS/2 has few native computer viruses;[50] while it is not invulnerable by design, its reduced market share appears to have discouraged virus writers. There are, however, OS/2-based antivirus programs, dealing with DOS viruses and Windows viruses that could pass through an OS/2 server.[51]

Many people hoped that IBM would release OS/2 or a significant part of it as open source. Petitions were raised in 2005 and 2007, but IBM refused them, citing legal and technical reasons.[52] It is unlikely that the entire OS will be opened at any point in the future, because it contains third-party code to which IBM does not hold the copyright, much of it from Microsoft. IBM also once engaged in a technology transfer with Commodore, licensing Amiga technology for OS/2 2.0 and above in exchange for the REXX scripting language.[53] This means that OS/2 may contain some code that was not written by IBM, which could prevent the OS from being released as open source in the future.[54][55] On the other hand, IBM donated Object REXX for Windows and OS/2 to the Open Object REXX project maintained by the REXX Language Association on SourceForge.[56]

There was a petition, arranged by OS2World, to open parts of the OS. Open source operating systems such as Linux have already profited from OS/2 indirectly through IBM's release of the improved JFS file system, which was ported from the OS/2 code base. As IBM did not release the source of the OS/2 JFS driver, developers ported the Linux driver back to eComStation and added the functionality to boot from a JFS partition. This new JFS driver has been integrated into eComStation v2.0, the successor of OS/2.

The graphic system has a layer named Presentation Manager that manages windows, fonts, and icons. This is similar in functionality to a non-networked version of X11 or the Windows GDI. On top of this lies the Workplace Shell (WPS), introduced in OS/2 2.0. WPS is an object-oriented shell allowing the user to perform traditional computing tasks such as accessing files, printers, and launching legacy programs, as well as advanced object-oriented tasks using built-in and third-party application objects that extended the shell in an integrated fashion not available on any other mainstream operating system. WPS follows IBM's Common User Access user interface standards.

Hardware vendors were reluctant to support device drivers for alternative operating systems, including OS/2 and Linux, leaving users with few choices from a select few vendors. To relieve this issue for video cards, IBM licensed a reduced version of the SciTech display drivers, allowing users to choose from a wide selection of cards supported through SciTech's modular driver design.[59]

WPS represents objects such as disks, folders, files, program objects, and printers using the System Object Model (SOM), which allows code to be shared among applications, possibly written in different programming languages. A distributed version called DSOM allowed objects on different computers to communicate; DSOM is based on CORBA. The object-oriented aspect of SOM is similar to, and a direct competitor of, Microsoft's Component Object Model, though it is implemented in a radically different manner; for instance, one of the most notable differences between SOM and COM is SOM's support for inheritance, one of the most fundamental concepts of object-oriented programming, which COM lacks. SOM and DSOM are no longer being developed.

OS/2 also includes a radical advancement in application development with compound document technology called OpenDoc, which was developed with Apple. OpenDoc proved interesting as a technology, but was not widely used or accepted by users or developers. OpenDoc is also no longer being developed.

The multimedia capabilities of OS/2 are accessible through Media Control Interface commands.
The last update (bundled with the IBM version of Netscape Navigator plugins) added support for MPEG files. Support for newer formats such as PNG, progressive JPEG, DivX, Ogg, and MP3 comes from third parties; it is sometimes integrated with the multimedia system, but in other offerings it comes as standalone applications.

Some problems were classic subjects of comparison with other operating systems:

Synchronous input queue (SIQ): if a GUI application was not servicing its window messages, the entire GUI system could get stuck and a reboot was required. This problem was considerably reduced with later Warp 3 fixpacks and refined by Warp 4, by taking control over the application after it had not responded for several seconds.[60]

No unified object handles (OS/2 v2.11 and earlier): The availability of threads probably led system designers to overlook mechanisms which allow a single thread to wait for different types of asynchronous events at the same time, for example the keyboard and the mouse in a "console" program. Even though select was added later, it only worked on network sockets. In the case of a console program, dedicating a separate thread to waiting on each source of events made it difficult to properly release all the input devices before starting other programs in the same "session". As a result, console programs usually polled the keyboard and the mouse alternately, which wasted CPU time and produced a characteristic "jerky" reactivity to user input. In OS/2 3.0 IBM introduced a new call for this specific problem.[61]

OS/2 has been widely used in Bank Saderat Iran (Iran Export Bank) in its teller machines, ATMs and local servers (over 30,000 workstations). As of 2011, the bank was moving to virtualize and renew its infrastructure by moving OS/2 to virtual machines running on Windows.

OS/2 has been used in the banking industry. Suncorp bank in Australia still ran its ATM network on OS/2 as late as 2002. ATMs at Perisher Blue used OS/2 as late as 2009, even past the turn of the decade.[63]

OS/2 was widely adopted by accounting professionals and auditing companies. By the mid-1990s, native 32-bit accounting software was well developed and serving corporate markets.

OS/2 ran the faulty baggage handling system at Denver International Airport. The software written for the system, not the OS itself, was at fault; it led to massive delays in the opening of the new airport, and the baggage handling system was eventually removed.

OS/2 was used by radio personality Howard Stern. He once had a 10-minute on-air rant about OS/2 versus Windows 95 and recommended OS/2. He also used OS/2 on his IBM 760CD laptop.

OS/2 was used as part of the Satellite Operations Support System (SOSS) for NPR's Public Radio Satellite System. SOSS was a computer-controlled system using OS/2 that NPR member stations used to receive programming feeds via satellite. SOSS was introduced in 1994 using OS/2 3.0, and was retired in 2007, when NPR switched over to its successor, the ContentDepot.

OS/2 was used to control the SkyTrain automated light rail system in Vancouver, British Columbia, Canada until the late 2000s when it was replaced by Windows XP.

OS/2 was used in the London Underground Jubilee Line Extension Signals Control System (JLESCS) in London, UK. This control system, delivered by Alcatel, was in use from 1999 to 2011, between the abandonment (before the line opened) of the line's unimplemented original automatic train control system and the introduction of the present SelTrac system. JLESCS did not provide automatic train operation, only manual train supervision. Six OS/2 local site computers were distributed along the railway between Stratford and Westminster and at the shunting tower at Stratford depot, and several formed the central equipment located at Neasden. It was once intended to cover the rest of the line between Green Park and Stanmore, but this was never introduced.

OS/2 has been used by The Co-operative Bank in the UK for its domestic call centre staff, using a bespoke program created to access customer accounts which cannot easily be migrated to Windows.

OS/2 has been used by the Stop & Shop supermarket chain (and has been installed in new stores as recently as March 2010).

OS/2 has been used on ticket machines for Croydon Tramlink in outer-London (UK).

OS/2 was used by Trenitalia, both for the desktops at ticket counters and for the automatic ticket machines, up to 2011. Incidentally, the automatic ticket machines running OS/2 were reportedly more reliable than the current ones running a flavor of Windows.[citation needed]

OS/2 was used as the main operating system for Abbey National General Insurance motor and home direct call centre products, using the PMSC Series III insurance platform on DB2/2, from 1996 to 2001.

BYTE in 1989 listed OS/2 as among the "Excellence" winners of the BYTE Awards, stating that it "is today where the Macintosh was in 1984: It's a development platform in search of developers". The magazine predicted that "When it's complete and bug-free, when it can really use the 80386, and when more desktops sport OS/2-capable PCs, OS/2 will—deservedly—supersede DOS. But even as it stands, OS/2 is a milestone product".[65]

The 3890/XP1 was announced on November 12, 1988. It initially used OS/2 1.1 Extended Edition[70] on a PS/2 Model 80 to emulate the stacker control software that had previously run on a System/360. IBM later switched to OS/2 Warp.[71]

IBM 473x (ATM): Used in a range of automatic teller machines manufactured by IBM. Also used in later 478x ATMs manufactured with Diebold.

IBM 9672 (mainframe): Used as the operating system for the Support Element (SE).[72] Also used in later mainframe models such as the IBM 2064 and 2074.[73]

^ Iacobucci, Ed; foreword by Bill Gates (1988). "Foreword". OS/2 Programmer's Guide. McGraw-Hill Osborne Media. ISBN 0-07-881300-X. I believe OS/2 is destined to be the most important operating system, and possibly program, of all time. As the successor to DOS, which has over 10,000,000 systems in use, it creates incredible opportunities for everyone involved with PCs.

^ Gates, Bill. "Bill Gates Interview". Computer History Collection (transcript of a Video History interview). Interviewed by David Allison. National Museum of American History, Smithsonian Institution. Retrieved April 10, 2013.

^ "Biography for Kate Mulgrew". Internet Movie Database. In 1996, was contracted by IBM to help promote the latest release of OS/2 Warp, version 4 (previously codenamed Merlin), due to associations with Star Trek.

1.
CP/M
–
Initially confined to single-tasking on 8-bit processors and no more than 64 kilobytes of memory, later versions of CP/M added multi-user variations and were migrated to 16-bit processors. The combination of CP/M and S-100 bus computers was loosely patterned on the MITS Altair and this computer platform was widely used in business through the late 1970s and into the mid-1980s. CP/M increased the size for both hardware and software by greatly reducing the amount of programming required to install an application on a new manufacturers computer. An important driver of innovation was the advent of low-cost microcomputers running CP/M, as independent programmers and hackers bought them. CP/M was displaced by MS-DOS soon after the 1981 introduction of the IBM PC, manufacturers of CP/M-compatible systems customized portions of the operating system for their own combination of installed memory, disk drives, and console devices. CP/M would also run on based on the Zilog Z80 processor since the Z80 was compatible with 8080 code. CP/M used the 7-bit ASCII set, the other 128 characters made possible by the 8-bit byte were not standardized. For example, one Kaypro used them for Greek characters, WordStar used the 8th bit as an end-of-word marker. The BIOS and BDOS were memory-resident, while the CCP was memory-resident unless overwritten by an application, a number of transient commands for standard utilities were also provided. The transient commands resided in files with the extension. COM on disk, the BIOS directly controlled hardware components other than the CPU and main memory. It contained functions such as input and output and the reading and writing of disk sectors. The BDOS implemented the CP/M file system and some input/output abstractions on top of the BIOS, the CCP took user commands and either executed them directly or loaded and started an executable file of the given name. 
Third-party applications for CP/M were also essentially transient commands, the BDOS, CCP and standard transient commands were the same in all installations of a particular revision of CP/M, but the BIOS portion was always adapted to the particular hardware. Adding memory to a computer, for example, meant that the CP/M system had to be reinstalled with an updated BIOS capable of addressing the additional memory, a utility was provided to patch the supplied BIOS, BDOS and CCP to allow them to be run from higher memory. Once installed, the system was stored in reserved areas at the beginning of any disk which would be used to boot the system. On start-up, the bootloader would load the system from the disk in drive A. By modern standards CP/M was primitive, owing to the constraints on program size. With version 1.0 there was no provision for detecting a changed disk, if a user changed disks without manually rereading the disk directory the system would write on the new disk using the old disks directory information, ruining the data stored on the disk

2.
Software developer
–
A software developer is a person concerned with facets of the software development process, including the research, design, programming, and testing of computer software. Other job titles which are used with similar meanings are programmer, software analyst. According to developer Eric Sink, the differences between system design, software development, and programming are more apparent, even more so that developers become systems architects, those who design the multi-leveled architecture or component interactions of a large software system. In a large company, there may be employees whose sole responsibility consists of one of the phases above. In smaller development environments, a few people or even an individual might handle the complete process. The word software was coined as a prank as early as 1953, before this time, computers were programmed either by customers, or the few commercial computer vendors of the time, such as UNIVAC and IBM. The first company founded to provide products and services was Computer Usage Company in 1955. The software industry expanded in the early 1960s, almost immediately after computers were first sold in mass-produced quantities, universities, government, and business customers created a demand for software. Many of these programs were written in-house by full-time staff programmers, some were distributed freely between users of a particular machine for no charge. Others were done on a basis, and other firms such as Computer Sciences Corporation started to grow. The computer/hardware makers started bundling operating systems, systems software and programming environments with their machines, new software was built for microcomputers, so other manufacturers including IBM, followed DECs example quickly, resulting in the IBM AS/400 amongst others. The industry expanded greatly with the rise of the computer in the mid-1970s. In the following years, it created a growing market for games, applications. 
DOS, Microsofts first operating system product, was the dominant operating system at the time, by 2014 the role of cloud developer had been defined, in this context, one definition of a developer in general was published, Developers make software for the world to use. The job of a developer is to crank out code -- fresh code for new products, code fixes for maintenance, code for business logic, bus factor Software Developer description from the US Department of Labor

3.
IBM
–
International Business Machines Corporation is an American multinational technology company headquartered in Armonk, New York, United States, with operations in over 170 countries. The company originated in 1911 as the Computing-Tabulating-Recording Company and was renamed International Business Machines in 1924, IBM manufactures and markets computer hardware, middleware and software, and offers hosting and consulting services in areas ranging from mainframe computers to nanotechnology. IBM is also a research organization, holding the record for most patents generated by a business for 24 consecutive years. IBM has continually shifted its business mix by exiting commoditizing markets and focusing on higher-value, also in 2014, IBM announced that it would go fabless, continuing to design semiconductors, but offloading manufacturing to GlobalFoundries. Nicknamed Big Blue, IBM is one of 30 companies included in the Dow Jones Industrial Average and one of the worlds largest employers, with nearly 380,000 employees. Known as IBMers, IBM employees have been awarded five Nobel Prizes, six Turing Awards, ten National Medals of Technology, in the 1880s, technologies emerged that would ultimately form the core of what would become International Business Machines. On June 16,1911, their four companies were amalgamated in New York State by Charles Ranlett Flint forming a fifth company, the Computing-Tabulating-Recording Company based in Endicott, New York. The five companies had 1,300 employees and offices and plants in Endicott and Binghamton, New York, Dayton, Ohio, Detroit, Michigan, Washington, D. C. and Toronto. They manufactured machinery for sale and lease, ranging from commercial scales and industrial time recorders, meat and cheese slicers, to tabulators and punched cards. Thomas J. Watson, Sr. 
fired from the National Cash Register Company by John Henry Patterson, called on Flint and, Watson joined CTR as General Manager then,11 months later, was made President when court cases relating to his time at NCR were resolved. Having learned Pattersons pioneering business practices, Watson proceeded to put the stamp of NCR onto CTRs companies and his favorite slogan, THINK, became a mantra for each companys employees. During Watsons first four years, revenues more than doubled to $9 million, Watson had never liked the clumsy hyphenated title of the CTR and in 1924 chose to replace it with the more expansive title International Business Machines. By 1933 most of the subsidiaries had been merged into one company, in 1937, IBMs tabulating equipment enabled organizations to process unprecedented amounts of data, its clients including the U. S. During the Second World War the company produced small arms for the American war effort, in 1949, Thomas Watson, Sr. created IBM World Trade Corporation, a subsidiary of IBM focused on foreign operations. In 1952, he stepped down after almost 40 years at the company helm, in 1957, the FORTRAN scientific programming language was developed. In 1961, IBM developed the SABRE reservation system for American Airlines, in 1963, IBM employees and computers helped NASA track the orbital flight of the Mercury astronauts. A year later it moved its headquarters from New York City to Armonk. The latter half of the 1960s saw IBM continue its support of space exploration, on April 7,1964, IBM announced the first computer system family, the IBM System/360

4.
Microsoft
–
Its best known software products are the Microsoft Windows line of operating systems, Microsoft Office office suite, and Internet Explorer and Edge web browsers. Its flagship hardware products are the Xbox video game consoles and the Microsoft Surface tablet lineup, as of 2016, it was the worlds largest software maker by revenue, and one of the worlds most valuable companies. Microsoft was founded by Paul Allen and Bill Gates on April 4,1975, to develop and it rose to dominate the personal computer operating system market with MS-DOS in the mid-1980s, followed by Microsoft Windows. The companys 1986 initial public offering, and subsequent rise in its share price, since the 1990s, it has increasingly diversified from the operating system market and has made a number of corporate acquisitions. In May 2011, Microsoft acquired Skype Technologies for $8.5 billion, in June 2012, Microsoft entered the personal computer production market for the first time, with the launch of the Microsoft Surface, a line of tablet computers. The word Microsoft is a portmanteau of microcomputer and software, Paul Allen and Bill Gates, childhood friends with a passion for computer programming, sought to make a successful business utilizing their shared skills. In 1972 they founded their first company, named Traf-O-Data, which offered a computer that tracked and analyzed automobile traffic data. Allen went on to pursue a degree in science at Washington State University. The January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systemss Altair 8800 microcomputer, Allen suggested that they could program a BASIC interpreter for the device, after a call from Gates claiming to have a working interpreter, MITS requested a demonstration. Since they didnt actually have one, Allen worked on a simulator for the Altair while Gates developed the interpreter and they officially established Microsoft on April 4,1975, with Gates as the CEO. 
Allen came up with the name of Micro-Soft, as recounted in a 1995 Fortune magazine article. In August 1977 the company formed an agreement with ASCII Magazine in Japan, resulting in its first international office, the company moved to a new home in Bellevue, Washington in January 1979. Microsoft entered the OS business in 1980 with its own version of Unix, however, it was MS-DOS that solidified the companys dominance. For this deal, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, branding it as MS-DOS, following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS. Since IBM copyrighted the IBM PC BIOS, other companies had to engineer it in order for non-IBM hardware to run as IBM PC compatibles. Due to various factors, such as MS-DOSs available software selection, the company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as with a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in 1983 after developing Hodgkins disease, while jointly developing a new OS with IBM in 1984, OS/2, Microsoft released Microsoft Windows, a graphical extension for MS-DOS, on November 20,1985. Once Microsoft informed IBM of NT, the OS/2 partnership deteriorated, in 1990, Microsoft introduced its office suite, Microsoft Office

5.
Programming language
–
A programming language is a formal computer language designed to communicate instructions to a machine, particularly a computer. Programming languages can be used to programs to control the behavior of a machine or to express algorithms. From the early 1800s, programs were used to direct the behavior of such as Jacquard looms. Thousands of different programming languages have created, mainly in the computer field. Many programming languages require computation to be specified in an imperative form while other languages use forms of program specification such as the declarative form. The description of a language is usually split into the two components of syntax and semantics. Some languages are defined by a document while other languages have a dominant implementation that is treated as a reference. Some languages have both, with the language defined by a standard and extensions taken from the dominant implementation being common. A programming language is a notation for writing programs, which are specifications of a computation or algorithm, some, but not all, authors restrict the term programming language to those languages that can express all possible algorithms. For example, PostScript programs are created by another program to control a computer printer or display. More generally, a language may describe computation on some, possibly abstract. It is generally accepted that a specification for a programming language includes a description, possibly idealized. In most practical contexts, a programming language involves a computer, consequently, abstractions Programming languages usually contain abstractions for defining and manipulating data structures or controlling the flow of execution. Expressive power The theory of computation classifies languages by the computations they are capable of expressing, all Turing complete languages can implement the same set of algorithms. 
ANSI/ISO SQL-92 and Charity are examples of languages that are not Turing complete, markup languages like XML, HTML, or troff, which define structured data, are not usually considered programming languages. Programming languages may, however, share the syntax with markup languages if a computational semantics is defined, XSLT, for example, is a Turing complete XML dialect. Moreover, LaTeX, which is used for structuring documents. The term computer language is used interchangeably with programming language

6.
C (programming language)
–
C was originally developed by Dennis Ritchie between 1969 and 1973 at Bell Labs, and used to re-implement the Unix operating system. C has been standardized by the American National Standards Institute since 1989, C is an imperative procedural language. Therefore, C was useful for applications that had formerly been coded in assembly language. Despite its low-level capabilities, the language was designed to encourage cross-platform programming, a standards-compliant and portably written C program can be compiled for a very wide variety of computer platforms and operating systems with few changes to its source code. The language has become available on a wide range of platforms. In C, all code is contained within subroutines, which are called functions. Function parameters are passed by value. Pass-by-reference is simulated in C by explicitly passing pointer values, C program source text is free-format, using the semicolon as a statement terminator and curly braces for grouping blocks of statements. The C language also exhibits the characteristics, There is a small, fixed number of keywords, including a full set of flow of control primitives, for, if/else, while, switch. User-defined names are not distinguished from keywords by any kind of sigil, There are a large number of arithmetical and logical operators, such as +, +=, ++, &, ~, etc. More than one assignment may be performed in a single statement, function return values can be ignored when not needed. Typing is static, but weakly enforced, all data has a type, C has no define keyword, instead, a statement beginning with the name of a type is taken as a declaration. There is no function keyword, instead, a function is indicated by the parentheses of an argument list, user-defined and compound types are possible. Heterogeneous aggregate data types allow related data elements to be accessed and assigned as a unit, array indexing is a secondary notation, defined in terms of pointer arithmetic. 
Unlike structs, arrays are not first-class objects, they cannot be assigned or compared using single built-in operators, There is no array keyword, in use or definition, instead, square brackets indicate arrays syntactically, for example month. Enumerated types are possible with the enum keyword and they are not tagged, and are freely interconvertible with integers. Strings are not a data type, but are conventionally implemented as null-terminated arrays of characters. Low-level access to memory is possible by converting machine addresses to typed pointers

7.
C++
–
C++ is a general-purpose programming language. It has imperative, object-oriented and generic programming features, while also providing facilities for low-level memory manipulation and it was designed with a bias toward system programming and embedded, resource-constrained and large systems, with performance, efficiency and flexibility of use as its design highlights. C++ is a language, with implementations of it available on many platforms and provided by various organizations, including the Free Software Foundation, LLVM, Microsoft, Intel. C++ is standardized by the International Organization for Standardization, with the latest standard version ratified and published by ISO in December 2014 as ISO/IEC14882,2014. The C++ programming language was standardized in 1998 as ISO/IEC14882,1998. The current C++14 standard supersedes these and C++11, with new features, the C++17 standard is due in 2017, with the draft largely implemented by some compilers already, and C++20 is the next planned standard thereafter. Many other programming languages have influenced by C++, including C#, D, Java. In 1979, Bjarne Stroustrup, a Danish computer scientist, began work on C with Classes, the motivation for creating a new language originated from Stroustrups experience in programming for his Ph. D. thesis. When Stroustrup started working in AT&T Bell Labs, he had the problem of analyzing the UNIX kernel with respect to distributed computing, remembering his Ph. D. experience, Stroustrup set out to enhance the C language with Simula-like features. C was chosen because it was general-purpose, fast, portable, as well as C and Simulas influences, other languages also influenced C++, including ALGOL68, Ada, CLU and ML. Initially, Stroustrups C with Classes added features to the C compiler, Cpre, including classes, derived classes, strong typing, inlining, furthermore, it included the development of a standalone compiler for C++, Cfront. 
In 1985, the first edition of The C++ Programming Language was released, the first commercial implementation of C++ was released in October of the same year. In 1989, C++2.0 was released, followed by the second edition of The C++ Programming Language in 1991. New features in 2.0 included multiple inheritance, abstract classes, static functions, const member functions. In 1990, The Annotated C++ Reference Manual was published and this work became the basis for the future standard. Later feature additions included templates, exceptions, namespaces, new casts, after a minor C++14 update released in December 2014, various new additions are planned for 2017 and 2020. According to Stroustrup, the name signifies the nature of the changes from C. This name is credited to Rick Mascitti and was first used in December 1983, when Mascitti was questioned informally in 1992 about the naming, he indicated that it was given in a tongue-in-cheek spirit

8.
Assembly language
–
Each assembly language is specific to a particular computer architecture. In contrast, most high-level programming languages are generally portable across multiple architectures, Assembly language may also be called symbolic machine code. Assembly language is converted into machine code by a utility program referred to as an assembler. The conversion process is referred to as assembly, or assembling the source code, Assembly time is the computational step where an assembler is run. Assembly language uses a mnemonic to represent each low-level machine instruction or opcode, typically also each architectural register, flag, depending on the architecture, these elements may also be combined for specific instructions or addressing modes using offsets or other data as well as fixed addresses. Many assemblers offer additional mechanisms to facilitate development, to control the assembly process. A macro assembler includes a facility so that assembly language text can be represented by a name. A cross assembler is an assembler that is run on a computer or operating system of a different type from the system on which the code is to run. Cross-assembling facilitates the development of programs for systems that do not have the resources to support software development, a microassembler is a program that helps prepare a microprogram, called firmware, to control the low level operation of a computer. A meta-assembler is a used in some circles for a program that accepts the syntactic and semantic description of an assembly language. An assembler program creates object code by translating combinations of mnemonics and syntax for operations and this representation typically includes an operation code as well as other control bits and data. 
The assembler also calculates constant expressions and resolves symbolic names for memory locations, the use of symbolic references is a key feature of assemblers, saving tedious calculations and manual address updates after program modifications. Most assemblers also include facilities for performing textual substitution – e. g. to generate common short sequences of instructions as inline. Some assemblers may also be able to some simple types of instruction set-specific optimizations. One concrete example of this may be the ubiquitous x86 assemblers from various vendors, most of them are able to perform jump-instruction replacements in any number of passes, on request. Like early programming languages such as Fortran, Algol, Cobol and Lisp, assemblers have been available since the 1950s, however, assemblers came first as they are far simpler to write than compilers for high-level languages. There may be several assemblers with different syntax for a particular CPU or instruction set architecture, despite different appearances, different syntactic forms generally generate the same numeric machine code, see further below. A single assembler may also have different modes in order to support variations in syntactic forms as well as their exact semantic interpretations, there are two types of assemblers based on how many passes through the source are needed to produce the executable program

9.
Software release life cycle
–
Usage of the alpha/beta test terminology originated at IBM. As long ago as the 1950s, IBM used similar terminology for their hardware development, a test was the verification of a new product before public announcement. B test was the verification before releasing the product to be manufactured, C test was the final test before general availability of the product. Martin Belsky, a manager on some of IBMs earlier software projects claimed to have invented the terminology, IBM dropped the alpha/beta terminology during the 1960s, but by then it had received fairly wide notice. The usage of beta test to refer to testing done by customers was not done in IBM, rather, IBM used the term field test. Pre-alpha refers to all activities performed during the project before formal testing. These activities can include requirements analysis, software design, software development, in typical open source development, there are several types of pre-alpha versions. Milestone versions include specific sets of functions and are released as soon as the functionality is complete, the alpha phase of the release life cycle is the first phase to begin software testing. In this phase, developers generally test the software using white-box techniques, additional validation is then performed using black-box or gray-box techniques, by another testing team. Moving to black-box testing inside the organization is known as alpha release, alpha software can be unstable and could cause crashes or data loss. Alpha software may not contain all of the features that are planned for the final version, in general, external availability of alpha software is uncommon in proprietary software, while open source software often has publicly available alpha versions. The alpha phase usually ends with a freeze, indicating that no more features will be added to the software. 
At this time, the software is said to be feature complete, Beta, named after the second letter of the Greek alphabet, is the software development phase following alpha. Software in the stage is also known as betaware. Beta phase generally begins when the software is complete but likely to contain a number of known or unknown bugs. Software in the phase will generally have many more bugs in it than completed software, as well as speed/performance issues. The focus of beta testing is reducing impacts to users, often incorporating usability testing, the process of delivering a beta version to the users is called beta release and this is typically the first time that the software is available outside of the organization that developed it. Beta version software is useful for demonstrations and previews within an organization

10.
French language
–
French is a Romance language of the Indo-European family. It descended from the Vulgar Latin of the Roman Empire, as did all Romance languages, French has evolved from Gallo-Romance, the spoken Latin in Gaul, and more specifically in Northern Gaul. Its closest relatives are the other langues doïl—languages historically spoken in northern France and in southern Belgium, French was also influenced by native Celtic languages of Northern Roman Gaul like Gallia Belgica and by the Frankish language of the post-Roman Frankish invaders. Today, owing to Frances past overseas expansion, there are numerous French-based creole languages, a French-speaking person or nation may be referred to as Francophone in both English and French. French is a language in 29 countries, most of which are members of la francophonie. As of 2015, 40% of the population is in Europe, 35% in sub-Saharan Africa, 15% in North Africa and the Middle East, 8% in the Americas. French is the fourth-most widely spoken mother tongue in the European Union, 1/5 of Europeans who do not have French as a mother tongue speak French as a second language. As a result of French and Belgian colonialism from the 17th and 18th century onward, French was introduced to new territories in the Americas, Africa, most second-language speakers reside in Francophone Africa, in particular Gabon, Algeria, Mauritius, Senegal and Ivory Coast. In 2015, French was estimated to have 77 to 110 million native speakers, approximately 274 million people are able to speak the language. The Organisation internationale de la Francophonie estimates 700 million by 2050, in 2011, Bloomberg Businessweek ranked French the third most useful language for business, after English and Standard Mandarin Chinese. Under the Constitution of France, French has been the language of the Republic since 1992. 
France mandates the use of French in official government publications, public education except in specific cases, French is one of the four official languages of Switzerland and is spoken in the western part of Switzerland called Romandie, of which Geneva is the largest city. French is the language of about 23% of the Swiss population. French is also a language of Luxembourg, Monaco, and Aosta Valley, while French dialects remain spoken by minorities on the Channel Islands. A plurality of the worlds French-speaking population lives in Africa and this number does not include the people living in non-Francophone African countries who have learned French as a foreign language. Due to the rise of French in Africa, the total French-speaking population worldwide is expected to reach 700 million people in 2050, French is the fastest growing language on the continent. French is mostly a language in Africa, but it has become a first language in some urban areas, such as the region of Abidjan, Ivory Coast and in Libreville. There is not a single African French, but multiple forms that diverged through contact with various indigenous African languages, sub-Saharan Africa is the region where the French language is most likely to expand, because of the expansion of education and rapid population growth

11.
German language
–
German is a West Germanic language that is mainly spoken in Central Europe. It is the most widely spoken and official language in Germany, Austria, Switzerland, South Tyrol, the German-speaking Community of Belgium and it is also one of the three official languages of Luxembourg. Major languages which are most similar to German include other members of the West Germanic language branch, such as Afrikaans, Dutch, English, Luxembourgish and it is the second most widely spoken Germanic language, after English. One of the languages of the world, German is the first language of about 95 million people worldwide. The German speaking countries are ranked fifth in terms of publication of new books. German derives most of its vocabulary from the Germanic branch of the Indo-European language family, a portion of German words are derived from Latin and Greek, and fewer are borrowed from French and English. With slightly different standardized variants, German is a pluricentric language, like English, German is also notable for its broad spectrum of dialects, with many unique varieties existing in Europe and also other parts of the world. The history of the German language begins with the High German consonant shift during the migration period, when Martin Luther translated the Bible, he based his translation primarily on the standard bureaucratic language used in Saxony, also known as Meißner Deutsch. Copies of Luthers Bible featured a long list of glosses for each region that translated words which were unknown in the region into the regional dialect. Roman Catholics initially rejected Luthers translation, and tried to create their own Catholic standard of the German language – the difference in relation to Protestant German was minimal. It was not until the middle of the 18th century that a widely accepted standard was created, until about 1800, standard German was mainly a written language, in urban northern Germany, the local Low German dialects were spoken. 
Standard German, which was quite different, was often learned as a foreign language with uncertain pronunciation. Northern German pronunciation was nevertheless considered the standard in prescriptive pronunciation guides. German was also the language of commerce and government in the Habsburg Empire, which encompassed a large area of Central and Eastern Europe. Until the mid-19th century, it was essentially the language of townspeople throughout most of the Empire, and its use indicated that the speaker was a merchant or someone from an urban area, regardless of nationality. Some cities, such as Prague and Budapest, were gradually Germanized in the years after their incorporation into the Habsburg domain; others, such as Pozsony, were originally settled during the Habsburg period and were primarily German at that time. Prague, Budapest and Bratislava, as well as cities like Zagreb, long retained significant German-speaking minorities. The most comprehensive guide to the vocabulary of the German language is found within the Deutsches Wörterbuch. This dictionary was created by the Brothers Grimm and is composed of 16 parts which were issued between 1852 and 1860. In 1872, grammatical and orthographic rules first appeared in the Duden Handbook, and in 1901 the 2nd Orthographical Conference ended with a standardization of the German language in its written form.

12.
Italian language
–
By most measures, Italian, together with Sardinian, is the closest of the Romance languages to Latin. Italian is an official language in Italy, Switzerland, San Marino and Vatican City. It is spoken by minorities in places such as France, Montenegro, Bosnia and Herzegovina, Crimea and Tunisia, and by large expatriate communities in the Americas. Many speakers are native bilinguals of both standardized Italian and other regional languages, and Italian is the fourth most studied language in the world. Italian is a major European language, being one of the official languages of the Organisation for Security and Cooperation in Europe. It is the third most widely spoken first language in the European Union, with 65 million native speakers; including Italian speakers in non-EU European countries and on other continents, the total number of speakers is around 85 million. Italian is the working language of the Holy See, serving as the lingua franca in the Roman Catholic hierarchy as well as the official language of the Sovereign Military Order of Malta. Italian is known as the language of music because of its use in musical terminology, and its influence is also widespread in the arts and in the luxury goods market. Italian has been reported as the fourth or fifth most frequently taught foreign language in the world. Italian was adopted by the state after the Unification of Italy, having previously been a literary language based on Tuscan as spoken mostly by the upper class of Florentine society. Its development was also influenced by other Italian languages and, to a minor extent, by the languages of the post-Roman invaders. Its vowels are the second-closest to Latin after Sardinian's, and, unlike most other Romance languages, Italian retains Latin's contrast between short and long consonants. As in most Romance languages, stress is distinctive. Italian as a language used in Italy and some surrounding regions, however, has a longer history. 
What would come to be thought of as Italian was first formalized in the early 14th century through the works of the Tuscan writer Dante Alighieri, written in his native Florentine. Dante is still credited with standardizing the Italian language, and thus the dialect of Florence became the basis for what would become the language of Italy. Italian was also one of the recognised languages in the Austro-Hungarian Empire. Italy has always had a distinctive dialect for each city, because the cities were until recently thought of as city-states. Those dialects now have considerable variety; as Tuscan-derived Italian came to be used throughout Italy, features of local speech were naturally adopted, producing various versions of Regional Italian. Even in the case of Northern Italian languages, however, scholars are careful not to overstate the effects of outsiders on the natural indigenous developments of the languages.

13.
Spanish language
–
Spanish, also called Castilian, is a Romance language that originated in the Castile region of Spain, with hundreds of millions of native speakers around the world. It is usually considered the world's second-most spoken native language, after Mandarin Chinese, and it is one of the few languages to use inverted question and exclamation marks. Spanish is a part of the Ibero-Romance group of languages, which evolved from several dialects of Vulgar Latin in Iberia after the collapse of the Western Roman Empire in the 5th century. Beginning in the early 16th century, Spanish was taken to the colonies of the Spanish Empire, most notably to the Americas, as well as territories in Africa and Oceania. Around 75% of modern Spanish is derived from Latin, and Greek has also contributed substantially to Spanish vocabulary, especially through Latin. Spanish vocabulary has been in contact from an early date with Arabic, having developed during the Al-Andalus era in the Iberian Peninsula; with around 8% of its vocabulary being Arabic in origin, this language is the second most important influence after Latin. Spanish has also been influenced by Basque as well as by neighboring Ibero-Romance languages, and it adopted words from other languages, such as Gothic from the Visigoths, from which many Spanish names and surnames have a Visigothic origin. Spanish is one of the six official languages of the United Nations. It is the second language in the world by the number of people who speak it as a mother tongue, after Mandarin Chinese, and it is estimated that more than 437 million people speak Spanish as a native language. Spanish is the official or national language in Spain, Equatorial Guinea and much of the Americas; speakers in the Americas total some 418 million. In the European Union, Spanish is the mother tongue of 8% of the population. 
Spanish is the most popular second language learned in the United States; in 2011 the American Community Survey estimated that of the 55 million Hispanic United States residents who are five years of age and over, 38 million speak Spanish at home. The Spanish Constitution of 1978 uses the term castellano to define the official language of the whole Spanish State, in contrast to las demás lenguas españolas (the other Spanish languages). Article III reads as follows: "El castellano es la lengua española oficial del Estado, las demás lenguas españolas serán también oficiales en las respectivas Comunidades Autónomas" (Castilian is the official Spanish language of the State; the other Spanish languages shall also be official in their respective Autonomous Communities). The Spanish Royal Academy, on the other hand, currently uses the term español in its publications. Two etymologies for español have been suggested: the Spanish Royal Academy Dictionary derives the term from the Provençal word espaignol, and that in turn from the Medieval Latin word Hispaniolus, meaning from, or pertaining to, Hispania.

14.
Portuguese language
–
Portuguese is a Romance language and the sole official language of Portugal, Brazil, Cape Verde, Guinea-Bissau, Mozambique, Angola, and São Tomé and Príncipe. It also has co-official language status in East Timor, Equatorial Guinea and Macau. Portuguese is part of the Ibero-Romance group that evolved from several dialects of Vulgar Latin in the medieval Kingdom of Galicia, and has kept some Celtic phonology. Portuguese is also termed the language of Camões, after Luís Vaz de Camões, one of the greatest literary figures in the Portuguese language and author of the Portuguese epic poem Os Lusíadas. The Museum of the Portuguese Language, in São Paulo, is the first museum of its kind in the world; in 2015 the museum was destroyed in a fire, but there are plans to reconstruct it. When the Romans arrived in the Iberian Peninsula in 216 BCE, they brought the Latin language with them, from which all Romance languages descend. Between 409 CE and 711 CE, as the Roman Empire collapsed in Western Europe, Portuguese evolved from the medieval language, known today by linguists as Galician-Portuguese, Old Portuguese or Old Galician, of the northwestern medieval Kingdom of Galicia. It is in Latin administrative documents of the 9th century that written Galician-Portuguese words and phrases are first recorded; this phase is known as Proto-Portuguese, which lasted from the 9th century until the 12th-century independence of the County of Portugal from the Kingdom of León, by then reigning over Galicia. In the first part of the Galician-Portuguese period, the language was increasingly used for documents, and for some time it was the language of preference for poetry in Christian Hispania. Portugal became an independent kingdom in 1139, under King Afonso I of Portugal. In the second period of Old Portuguese, in the 15th and 16th centuries, with the Portuguese discoveries, the language was taken to many regions of Africa, Asia and the Americas. 
The language continued to be popular in parts of Asia until the 19th century, and some Portuguese-speaking Christian communities in India, Sri Lanka, Malaysia and Indonesia preserved their language even after they were isolated from Portugal. The end of the Old Portuguese period was marked by the publication of the Cancioneiro Geral by Garcia de Resende. Most literate Portuguese speakers were also literate in Latin, and thus they easily adopted Latin words into their writing, and eventually speech, in Portuguese. Portuguese is the language of the majority of people in Brazil and Portugal; perhaps 75% of the population of Angola speaks Portuguese natively, and 85% are fluent. Just over 40% of the population of Mozambique are native speakers of Portuguese, and Portuguese is also spoken natively by 30% of the population in Guinea-Bissau, where a Portuguese-based creole is understood by all. No data is available for Cape Verde, but almost all the population is bilingual. There are also significant Portuguese-speaking immigrant communities in many countries, including Andorra, Bermuda, Canada, France, Japan, Jersey, Namibia, Paraguay, Macau, Switzerland and Venezuela. In some parts of former Portuguese India, namely Goa and Daman and Diu, the language is still spoken; in 2014, an estimated 1,500 students were learning Portuguese in Goa. Equatorial Guinea made an application for full membership of the CPLP in June 2010. In 2011, Portuguese became one of its official languages and, in July 2014, the country was accepted as a member of the CPLP. Portuguese is also a subject in the school curriculum in Uruguay.

15.
Russian language
–
Russian is an East Slavic language and an official language in Russia, Belarus, Kazakhstan, Kyrgyzstan and many minor or unrecognised territories. Russian belongs to the family of Indo-European languages and is one of the four living members of the East Slavic languages; written examples of Old East Slavonic are attested from the 10th century and beyond. It is the most geographically widespread language of Eurasia and the most widely spoken of the Slavic languages, and it is also the largest native language in Europe, with 144 million native speakers in Russia, Ukraine and Belarus. Russian is the eighth most spoken language in the world by number of native speakers, and the language is one of the six official languages of the United Nations. Russian is also the second most widespread language on the Internet, after English. Russian distinguishes between consonant phonemes with palatal secondary articulation and those without, the so-called soft and hard sounds. This distinction is found between pairs of almost all consonants and is one of the most distinguishing features of the language; another important aspect is the reduction of unstressed vowels. Russian is a Slavic language of the Indo-European family and is a lineal descendant of the language used in Kievan Rus'. From the point of view of spoken language, its closest relatives are Ukrainian, Belarusian and Rusyn. The East Slavic Old Novgorod dialect, although it vanished during the 15th or 16th century, is considered to have played a significant role in the formation of modern Russian. In the 19th century, the language was often called Great Russian to distinguish it from Belarusian, then called White Russian, and Ukrainian, then called Little Russian. However, the East Slavic forms have tended to be used exclusively in the various dialects that are experiencing a rapid decline. In some cases, both the East Slavic and the Church Slavonic forms are in use, with different meanings. 
For details, see Russian phonology and History of the Russian language. Russian is also regarded by the United States Intelligence Community as a hard target language, due to both its difficulty to master for English speakers and its critical role in American world policy. The standard form of Russian is generally regarded as the modern Russian literary language. Mikhail Lomonosov compiled the first normalizing grammar book in 1755, and in 1783 the Russian Academy's first explanatory Russian dictionary appeared. By the mid-20th century, such dialects were forced out with the introduction of the education system that was established by the Soviet government. Despite this formalization of Standard Russian, some nonstandard dialectal features are still observed in colloquial speech. The Russian language is the sixth largest in the world by number of speakers, after English, Mandarin, Hindi/Urdu, Spanish and Arabic, and Russian is one of the six official languages of the United Nations. Education in Russian is still a choice for both Russian as a second language and native speakers in Russia, as well as in many of the former Soviet republics. Russian is still seen as an important language for children to learn in most of the former Soviet republics. Samuel P. Huntington wrote in The Clash of Civilizations: "During the heyday of the Soviet Union, Russian was the lingua franca from Prague to Hanoi."

16.
X86
–
x86 is a family of backward-compatible instruction set architectures based on the Intel 8086 CPU and its Intel 8088 variant. The term x86 came into being because the names of several successors to Intel's 8086 processor end in "86". Many additions and extensions have been added to the x86 instruction set over the years, almost consistently with full backward compatibility. The architecture has been implemented in processors from Intel, Cyrix, AMD, VIA and many other companies; there are also open implementations. In the 1980s and early 1990s, when the 8088 and 80286 were still in common use, the term x86 usually represented any 8086-compatible CPU. Today, however, x86 usually implies binary compatibility with the 32-bit instruction set of the 80386. An 8086 system, including such coprocessors as the 8087 and 8089, was described as an iAPX 86 system. There were also the terms iRMX, iSBC and iSBX, all together under the heading Microsystem 80. However, this naming scheme was quite temporary, lasting for a few years during the early 1980s. Today, x86 is ubiquitous in both stationary and portable computers, and is also used in midrange computers, workstations and servers. A large amount of software, including operating systems such as DOS, Windows, Linux, BSD, Solaris and macOS, functions with x86-based hardware. There have been attempts, including by Intel itself, to end the market dominance of the "inelegant" x86 architecture, which was designed directly from the first simple 8-bit microprocessors. Examples of this are the iAPX 432, the Intel 960 and the Intel 860; however, the continuous refinement of x86 microarchitectures, circuitry and semiconductor manufacturing has made it hard to replace x86 in many segments. The table below lists processor models and model series implementing variations of the x86 instruction set; each line item is characterized by significantly improved or commercially successful processor microarchitecture designs. 
Such x86 implementations are seldom simple copies but often employ different internal microarchitectures as well as different solutions at the electronic and physical levels. Quite naturally, early compatible microprocessors were 16-bit, while 32-bit designs were developed much later. For the personal computer market, real quantities started to appear around 1990 with i386 and i486 compatible processors. Other companies which designed or manufactured x86 or x87 processors include ITT Corporation, National Semiconductor, ULSI System Technology, and Weitek. Some early versions of these microprocessors had heat dissipation problems. AMD later managed to establish itself as a serious contender with the K6 set of processors, which gave way to the very successful Athlon and Opteron. There were also other contenders, such as Centaur Technology and Rise Technology. VIA Technologies' energy-efficient C3 and C7 processors, which were designed by Centaur, have been sold for many years. Centaur's newest design, the VIA Nano, is their first processor with superscalar and speculative execution; it was, perhaps interestingly, introduced at about the same time as Intel's first in-order processor since the P5 Pentium, the Intel Atom. The instruction set architecture has twice been extended to a larger word size. In 1999–2003, AMD extended this 32-bit architecture to 64 bits and referred to it as x86-64 in early documents; Intel soon adopted AMD's architectural extensions under the name IA-32e, later using the name EM64T and finally using Intel 64.
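The 16-, 32- and 64-bit word sizes described above are visible from a running program. As a minimal sketch (assuming a CPython interpreter on an x86 machine; the reported architecture string varies by operating system):

```python
import platform
import struct

# platform.machine() reports the CPU architecture string, e.g.
# "x86_64" or "AMD64" on a 64-bit x86 machine, "i686" on a 32-bit one.
arch = platform.machine()

# struct.calcsize("P") is the size in bytes of a native pointer:
# 8 on a 64-bit interpreter build, 4 on a 32-bit build.
pointer_bytes = struct.calcsize("P")

print(arch, pointer_bytes * 8, "bit")
```

On a 32-bit interpreter the same code reports a 4-byte pointer, which is one practical consequence of the word-size extensions discussed above.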

17.
Kernel (operating system)
–
The kernel is a computer program that is the core of a computer's operating system, with complete control over everything in the system. It is the first program loaded on start-up, and it handles the rest of start-up as well as input/output requests from software, translating them into data-processing instructions for the central processing unit. It also handles memory and peripherals like keyboards, monitors and printers. The critical code of the kernel is usually loaded into a protected area of memory, which prevents it from being overwritten by applications or other, more minor parts of the operating system. The kernel performs its tasks, such as running processes and handling interrupts, in kernel space. In contrast, everything a user does is in user space: writing text in a text editor, running programs in a GUI, etc. This separation prevents user data and kernel data from interfering with each other and causing instability. The kernel's interface is a low-level abstraction layer; when a process makes a request of the kernel, the request is called a system call. Kernel designs differ in how they manage these system calls and resources. A monolithic kernel runs all the operating system instructions in the same address space, for speed. A microkernel runs most processes in user space, for modularity. The kernel takes responsibility for deciding at any time which of the running programs should be allocated to the processor or processors. Random-access memory (RAM) is used to store both program instructions and data. Typically, both need to be present in memory in order for a program to execute. Often multiple programs will want access to memory, frequently demanding more memory than the computer has available; the kernel is responsible for deciding which memory each process can use. I/O devices include such peripherals as keyboards, mice, disk drives, printers, network adapters, and display devices. 
The kernel allocates requests from applications to perform I/O to an appropriate device. Key aspects necessary in resource management are the definition of an execution domain (address space) and the protection mechanism used to mediate access to the resources within a domain. Kernels also usually provide methods for synchronization and communication between processes, called inter-process communication (IPC). Finally, a kernel must provide running programs with a method to make requests to access these facilities. The kernel has full access to the system's memory and must allow processes to safely access this memory as they require it. Often the first step in doing this is virtual addressing, usually achieved by paging and/or segmentation. Virtual addressing allows the kernel to make a given physical address appear to be another address, the virtual address. This allows every program to behave as if it is the only one (apart from the kernel) running.
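The system-call interface described above can be sketched from user space. In Python, the low-level functions of the `os` module are thin wrappers over such calls; the pipe below stands in for any kernel-managed I/O resource (a sketch, not how any particular kernel implements it):

```python
import os

# os.pipe() asks the kernel for a pair of connected file descriptors.
read_fd, write_fd = os.pipe()

# os.write() and os.read() wrap the write/read system calls, so the
# bytes below cross the user-space/kernel-space boundary twice.
os.write(write_fd, b"hello from user space")
data = os.read(read_fd, 1024)

os.close(read_fd)
os.close(write_fd)
print(data)  # b'hello from user space'
```

Each call traps into the kernel, which validates the file descriptor and performs the I/O on the process's behalf; the process itself never touches the underlying device or kernel data structures.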

18.
Hybrid kernel
–
A hybrid kernel is an operating system kernel architecture that attempts to combine aspects and benefits of the microkernel and monolithic kernel architectures used in computer operating systems. The traditional kernel categories are monolithic kernels and microkernels. The hybrid category is controversial, due to the similarity of hybrid kernels and ordinary monolithic kernels; the term has been dismissed by Linus Torvalds as simple marketing. The idea behind a hybrid kernel is to have a kernel structure similar to that of a microkernel. In contrast to a microkernel, however, all operating system services in a hybrid kernel are still in kernel space, so there are none of the reliability benefits of having services in user space, as with a microkernel. A prominent example is the Windows NT kernel, which powers the operating systems of the Windows NT family, including Windows Phone 8.1 and the Xbox One. The reason NT is not considered a microkernel system is that most of the system components run in the same address space as the kernel. The subsystems are not written to a particular OS personality; the primary operating system personality on Windows is the Windows API, which is always present. The emulation subsystem which implements the Windows personality is called the Client/Server Runtime Subsystem. On versions of NT prior to 4.0, this subsystem process also contained the window manager, graphics device interface and graphics device drivers; for performance reasons, in version 4.0 and later, these run in kernel mode. As of 2007, one other operating system personality, UNIX, is offered as an optionally installed system component on certain versions of Windows Vista and Windows Server 2003 R2. The associated subsystem process is the Subsystem for UNIX-Based Applications, which was part of a Windows add-on called Windows Services for UNIX. An OS/2 subsystem was supported in earlier versions of Windows NT. The POSIX subsystem was supplanted by the UNIX subsystem, hence the identical executable name. In August 2016, Microsoft unveiled the latest Windows subsystem, the Windows Subsystem for Linux. 
This subsystem, available only on 64-bit Windows 10 version 1607 and later, was intended so that developers could run their Linux tools on Windows without having to emulate them, and thus requires developer mode to be enabled in Windows Settings. It is designed only to run command-line applications, although a Reddit user has discovered a way to run GUI applications or even an entire desktop environment with it. Certain applications that rely on the Linux kernel itself will not be able to run, because the subsystem does not include the Linux kernel. Applications that run on NT are written to one of the OS personalities. An OS personality is implemented via a set of user-mode DLLs, which are mapped into application processes' address spaces as required, together with an emulation subsystem server process. XNU is the kernel that Apple Inc. acquired and developed for use in the OS X and iOS operating systems and released as free and open-source software; XNU is an acronym for X is Not Unix. XNU runs on ARM as part of iOS, and on IA-32 and x86-64 as part of OS X.

19.
User interface
–
The user interface, in the industrial design field of human–computer interaction, is the space where interactions between humans and machines occur. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools and heavy machinery operator controls. The design considerations applicable when creating user interfaces are related to, or involve, such disciplines as ergonomics and psychology. Generally, the goal of user interface design is to produce a user interface which makes it easy and efficient to operate a machine in the way which produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output. Other terms for user interface are man–machine interface (MMI) and, when the machine in question is a computer, human–computer interface. The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the part of the Human Machine Interface which we can see. In complex systems, the human–machine interface is typically computerized; the term human–computer interface refers to this kind of system. In the context of computing, the term typically extends as well to the software dedicated to control the physical elements used for human–computer interaction. The engineering of human–machine interfaces is enhanced by considering ergonomics. The corresponding disciplines are human factors engineering and usability engineering, which is part of systems engineering. Tools used for incorporating human factors in the interface design are developed based on knowledge of computer science, such as computer graphics, operating systems and programming languages. Nowadays, the graphical user interface is the usual form of human–machine interface on computers. There is a difference between a user interface and an operator interface or a human–machine interface. 
A human–machine interface is typically local to one machine or piece of equipment; an operator interface is the interface method by which multiple items of equipment, linked by a host control system, are accessed or controlled. The system may expose several user interfaces to serve different kinds of users. For example, a computerized library database might provide two user interfaces, one for library patrons and the other for library personnel. The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI). HMI is a modification of the original term MMI. In practice, the abbreviation MMI is still frequently used, although some may claim that MMI stands for something different now. Another abbreviation is HCI, but it is more commonly used for human–computer interaction.

20.
Workplace Shell
–
The Workplace Shell (WPS) is an object-oriented desktop shell produced by IBM's Boca Raton development lab for OS/2 2.0. The Workplace Shell was also used in OS/2 Warp 3 and Warp 4. IBM originally intended to deliver the Workplace Shell as part of the OfficeVision/2 LAN product, but in 1991 announced plans to release it as part of OS/2 2.0 instead. Although mostly written in C, under the covers the Workplace Shell is implemented as an object-oriented class library. The WPS classes are glued together with an interface definition language (IDL); SOM and its IDL were developed by IBM in their Austin, Texas lab. The classes can easily be manipulated by sending simple settings strings to them, via both a C and a Rexx API. When implementing a new WPS class, it is derived from an existing class within the WPS class hierarchy, modifying, extending or removing certain functionality of the parent class; the resulting object class is shipped in DLL form. Part of the WPS design allows the developer of a class Y, which extends or modifies a class X, to execute an additional API call on installation which will let the WPS replace class X by class Y. This will make even all existing instances of class X behave as instances of the modified class Y, i.e. almost a retroactive inheritance. This allows for many useful third-party desktop utilities that add or modify functionality to or of existing objects without access to IBM's source code. Where the IDL and class headers, also of derived classes, are published, these classes can in turn be extended in the same way. osFree is a project which seeks to entirely reimplement OS/2 as free software. For OS/2 and eComStation developers, the Workplace Shell Toolkit eases common programming tasks when creating WPS classes, as well as plain Presentation Manager programming. DFM is a Linux file manager for the X Window System, and Workplace Shell for Windows is a freeware clone of the WPS made for Windows unofficially by IBM employees. 
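The "retroactive inheritance" described above, where registering a replacement class Y for a class X makes even existing instances of X behave as Y, can be illustrated by analogy in Python. This is only an illustration of the idea, not the actual SOM/WPS API, and the `Folder`/`SecureFolder` names are invented for the example:

```python
class Folder:
    """Stands in for an existing WPS-style class X."""
    def open(self):
        return "opening folder"

class SecureFolder(Folder):
    """Stands in for a replacement class Y derived from X."""
    def open(self):
        return "password check, then " + super().open()

obj = Folder()      # an instance created before the replacement
print(obj.open())   # opening folder

# Analogue of registering Y as the replacement for X: the existing
# instance is rebound so it now behaves as the modified class.
obj.__class__ = SecureFolder
print(obj.open())   # password check, then opening folder
```

In the real WPS the rebinding is done by the SOM runtime for every instance of the replaced class at once, which is what lets third-party utilities alter the behavior of stock desktop objects without IBM's source code.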

21.
Graphical user interface
–
GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard. The actions in a GUI are usually performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, smartphones and smaller household, office and industrial controls. Designing the visual composition and temporal behavior of a GUI is an important part of software application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use for the underlying logical design of a stored program. Methods of user-centered design are used to ensure that the visual language introduced in the design is well-tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI. Typically, users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold. The widgets of a well-designed interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller allows a flexible structure in which the interface is independent from, and indirectly linked to, application functions. This allows users to select or design a different skin at will. Good user interface design relates more to users, and less to system architecture. Large widgets, such as windows, usually provide a frame or container for the main presentation content such as a web page; smaller ones usually act as a user-input tool. A GUI may be designed for the requirements of a vertical market as application-specific graphical user interfaces. By the 1990s, cell phones and handheld game systems also employed application-specific touchscreen GUIs. Newer automobiles use GUIs in their navigation systems and multimedia centers, or navigation multimedia center combinations. 
A GUI uses a combination of technologies and devices to provide a platform that users can interact with. A series of elements conforming to a visual language have evolved to represent information stored in computers. This makes it easier for people with few computer skills to work with and use computer software. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most often a mouse. Available commands are compiled together in menus, and actions are performed making gestures with the pointing device. A window manager facilitates the interactions between windows, applications, and the windowing system. The windowing system handles hardware devices such as pointing devices and graphics hardware; window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants (PDAs) and smartphones typically use the WIMP elements with different unifying metaphors, due to constraints in space and available input devices.
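The model–view–controller separation mentioned above, in which the interface is independent from and only indirectly linked to application functions, can be sketched in a few lines. The toy counter below is illustrative only; all names are invented for the example:

```python
class Model:
    """Application state; knows nothing about any particular view."""
    def __init__(self):
        self.count = 0
        self.observers = []   # callbacks registered by views

    def increment(self):
        self.count += 1
        for notify in self.observers:
            notify(self.count)

class View:
    """Presentation only; a different View ('skin') could be
    swapped in without changing the Model at all."""
    def __init__(self, model):
        self.rendered = []
        model.observers.append(self.render)

    def render(self, count):
        self.rendered.append(f"count = {count}")

# Controller role: translate a user action into a model update.
model = Model()
view = View(model)
model.increment()           # e.g. in response to a button click
print(view.rendered[-1])    # count = 1
```

Because the view only subscribes to model notifications, replacing it with another view, say, a graphical widget instead of a text log, leaves the application logic untouched, which is exactly the "skin" flexibility described above.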

22.
Software license
–
A software license is a legal instrument governing the use or redistribution of software. Under United States copyright law, all software is copyright protected, in both source code and object code form. The only exception is software in the public domain. Most distributed software can be categorized according to its license type. Two common categories for software under copyright law, and therefore with licenses which grant the licensee specific rights, are proprietary software and free and open-source software. Unlicensed software outside of copyright protection is either public domain software or software which is non-distributed, non-licensed and handled as an internal business trade secret. Contrary to popular belief, distributed unlicensed software is copyright protected. Examples of this are unauthorized software leaks or software projects which are placed on public software repositories like GitHub without a specified license. As voluntarily handing software into the public domain is problematic in some international law domains, there are also licenses granting PD-like rights. Under 17 U.S.C. § 117, the owner of a copy of software is legally entitled to use that copy of software. Hence, if the end-user of software is the owner of the respective copy, many proprietary licenses only enumerate the rights that the user already has under 17 U.S.C. § 117, and yet proclaim to take rights away from the user. Proprietary software licenses often proclaim to give software publishers more control over the way their software is used by keeping ownership of each copy of software with the software publisher. The form of the relationship determines if it is a lease or a purchase, for example UMG v. Augusto or Vernor v. Autodesk. The ownership of digital goods, like software applications and video games, is challenged by licensed, rather than sold, distribution models. The Swiss-based company UsedSoft innovated the resale of business software. This feature of proprietary software licenses means that certain rights regarding the software are reserved by the software publisher. 
It is therefore typical of EULAs to include terms which define the permitted uses of the software. The most significant effect of this form of licensing is that, since ownership of the software remains with the software publisher, the end-user must accept the software license; in other words, without acceptance of the license, the end-user may not use the software at all. One example of such a proprietary software license is the license for Microsoft Windows. The most common licensing models are per single user or per user in the appropriate volume discount level. Licensing per concurrent (floating) user also occurs, where all users in a network have access to the program but only a specific number may run it at the same time. Another license model is licensing per dongle, which allows the owner of the dongle to use the program on any computer. Licensing per server, CPU or points, regardless of the number of users, is common practice, as are site and company licenses.
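The concurrent (floating) model above can be sketched as a counter that admits at most N simultaneous users. This is an illustrative sketch only; the class and method names are invented for the example and do not belong to any real license manager's API.

```python
class FloatingLicense:
    """Illustrative floating-license pool: at most `seats` concurrent users."""

    def __init__(self, seats):
        self.seats = seats
        self.in_use = set()

    def check_out(self, user):
        """Grant a seat if one is free; otherwise the user must wait."""
        if len(self.in_use) >= self.seats:
            return False
        self.in_use.add(user)
        return True

    def check_in(self, user):
        """Return a seat to the pool."""
        self.in_use.discard(user)

pool = FloatingLicense(seats=2)
print(pool.check_out("alice"))  # True
print(pool.check_out("bob"))    # True
print(pool.check_out("carol"))  # False: both seats taken
pool.check_in("alice")
print(pool.check_out("carol"))  # True: a seat was freed
```

Real license servers add network protocols, timeouts and persistence on top of exactly this admission logic.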

23.
Operating system
–
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require an operating system to function. Operating systems are found on many devices that contain a computer, from cellular phones to supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 83.3%; macOS by Apple Inc. is in second place, and the varieties of Linux are in third position. Linux distributions are dominant in the server and supercomputing sectors. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can run only one program at a time. Multi-tasking may be characterized in preemptive and cooperative types: in preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs, as in Unix-like operating systems such as Solaris and Linux. Cooperative multitasking is achieved by relying on each process to yield time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking; 32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked and communicate with each other gave rise to distributed computing; distributed computations are carried out on more than one machine, and when computers in a group work in cooperation, they form a distributed system. The technique is used both in virtualization and cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems.
They are designed to operate on small machines like PDAs with less autonomy, and they are able to operate with a limited number of resources. They are very compact and extremely efficient by design; Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could run different programs in succession to speed up processing.

24.
Windows 3.1
–
Windows 3.1x is a series of 16-bit operating environments produced by Microsoft for use on personal computers. The series began with Windows 3.1, which was first sold during April 1992 as a successor to Windows 3.0; subsequent versions were released between 1992 and 1994 until the series was superseded by Windows 95. Windows 3.1, released on April 6, 1992, introduced a TrueType font system; similar functionality was available for Windows 3.0 through the Adobe Type Manager font system from Adobe. Windows 3.1 was designed to have backward compatibility with older Windows platforms. As with Windows 3.0, version 3.1 had File Manager and Program Manager, and it included Minesweeper as a replacement for Reversi. Windows 3.1 Multimedia PC Version included a media viewer; it was targeted at the new Multimedia PC standard and included sound support. Windows 3.1 dropped real mode support and required a minimum of a 286 PC with 1 MB of RAM to run. The effect of this was to increase system stability over the crash-prone Windows 3.0. Some older features were removed, like CGA graphics support and compatibility with real mode Windows 2.x applications. TrueType font support was added, providing scalable fonts to Windows applications; Windows 3.1 included the following fonts: Arial, Courier New, Times New Roman, and Symbol, in regular, bold, italic, and bold-italic versions. TrueType fonts could be scaled to any size and rotated, depending on the calling application. A few DOS applications, such as late releases of Microsoft Word, could access the Windows Clipboard. Windows' own drivers couldn't work directly with DOS applications; hardware such as mice required a DOS driver to be loaded before starting Windows. Icons could be dragged and dropped for the first time, in addition to having a more detailed appearance. A file could be dragged onto the Print Manager icon and the file would be printed by the current printer, assuming it was associated with an application capable of printing.
Alternatively, the file could be dragged out of File Manager onto the Print Manager icon. While Windows 3.0 was limited to 16 MB maximum memory, Windows 3.1 can access a theoretical 4 GB in 386 Enhanced Mode; however, no single process can use more than 16 MB. File Manager was significantly improved over Windows 3.0. Multimedia support was enhanced over what was available in Windows 3.0 with Multimedia Extensions, and was available to all Windows 3.1 users. Windows 3.1 was available via 720 KB, 1.2 MB, and 1.44 MB floppy distributions. It was also the first version of Windows to be distributed on CD-ROM, although this was more common for Windows for Workgroups 3.11. Installed size on the disk was between 10 MB and 15 MB.

25.
IBM Personal System/2
–
The Personal System/2 or PS/2 was IBM's third generation of personal computers. Released in 1987, it replaced the IBM PC, XT and AT lines. The PS/2 line was created by IBM in an attempt to regain control of the PC market by introducing an advanced yet proprietary architecture. IBM's considerable market presence plus the reliability of the PS/2 ensured that the systems would sell in large numbers, but the evolving Wintel architecture was seeing a period of dramatic reductions in price. The OS/2 operating system was announced at the same time as the PS/2 line and was intended to be the primary operating system for models with Intel 286 or later processors. However, at the time of the first shipments only PC DOS was available; OS/2 1.0 and Microsoft's Windows 2.0 became available several months later. IBM also released AIX PS/2, a UNIX operating system, for PS/2 models with Intel 386 or later processors. For years before IBM released the PS/2, rumors spread about IBM's plans for successors to its IBM PC, XT, and AT personal computers. Among the rumors that did not come true: that the company would use proprietary technology to lock out competitors, that it would release a version of its VM mainframe operating system for them, and that it would design the new computers to make third-party communications products more difficult to design. IBM's PS/2 was designed to remain software compatible with the PC/XT/AT line of computers upon which the large PC clone market was built, but the hardware was quite different. The CBIOS was so compatible that it even included Cassette BASIC; while IBM did not publish the BIOS source code, it did promise to publish BIOS entry points. With the IBM PS/2 line, Micro Channel Architecture (MCA) was also introduced. MCA was conceptually similar to the channel architecture of the IBM System/360 mainframes. MCA was technically superior to ISA and allowed for higher-speed communications within the system; MCA featured many advances not seen in other standards until several years later.
Transfer speeds were on par with the much later PCI standard. MCA allowed one-to-one, card-to-card, and multi-card-to-processor simultaneous transaction management, a feature later seen in the PCI-X bus format. Bus mastering capability, bus arbitration, and a form of plug-and-play BIOS management of hardware were all benefits of MCA. (One book from the year 2000 writes that MCA used a version of what we now know as "Plug and Play", requiring a special setup disk for each machine.) MCA never gained wide acceptance outside of the PS/2 line, due to IBM's anti-clone practices and incompatibilities with ISA. IBM offered to sell an MCA license to anyone who could afford the royalty; however, royalties were required for every MCA-compatible machine sold, plus a payment for every IBM-compatible machine the particular maker had made in the past. There was nothing unique in IBM insisting on payment of royalties on the use of its patents applied to Micro Channel based machines; however, up until that time, some companies had failed to pay IBM for the use of its patents on the first generation of Personal Computer.

26.
Personal computer
–
A personal computer is a multi-purpose electronic computer whose size, capabilities, and price make it feasible for individual use. PCs are intended to be operated directly by an end-user, rather than by a computer expert or technician. In the 2010s, PCs are typically connected to the Internet, allowing access to the World Wide Web; personal computers may also be connected to a local area network, either by a cable or a wireless connection. In the 2010s, a PC may be a multi-component desktop computer designed for use in a fixed location, a laptop computer designed for easy portability, or a tablet computer. In the 2010s, PCs run an operating system such as Microsoft Windows or Linux. The very earliest microcomputers, equipped with a front panel, required hand-loading of a bootstrap program to load programs from external storage. Before long, automatic booting from permanent read-only memory became universal. In the 2010s, users have access to a wide range of commercial software, free software, and free and open-source software, which are provided in ready-to-run or ready-to-compile form. Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include Apple's OS X and free open-source Unix-like operating systems such as Linux and the Berkeley Software Distribution. Advanced Micro Devices provides the main alternative to Intel's processors. PC is an initialism for personal computer; some PCs, including the OLPC XOs, are equipped with x86 or x64 processors but not designed to run Microsoft Windows. PC is also used in contrast with Mac, an Apple Macintosh computer; this sense of the word is used in the Get a Mac advertisement campaign that ran between 2006 and 2009, as well as its rival, the I'm a PC campaign, that appeared in 2008.
Since Apple's transition to Intel processors starting in 2005, all Macintosh computers are now PCs in the hardware sense. One early commentator predicted that the "brain" may one day come down to our level and help with our income-tax and book-keeping calculations, adding, "but this is speculation and there is no sign of it so far." In the history of computing there were many examples of computers designed to be used by one person, as opposed to terminals connected to mainframe computers. Using the narrow definition of "operated by one person", the first personal computer was the ENIAC, which became operational in 1946, though it did not meet further definitions of affordable or easy to use. An example of an early single-user computer was the LGP-30, created in 1956 by Stan Frankel and used for science; it came with a retail price of $47,000, equivalent to about $414,000 today. Introduced at the 1965 New York World's Fair, the Programma 101 was a programmable calculator described in advertisements as a desktop computer. It was manufactured by the Italian company Olivetti and invented by the Italian engineer Pier Giorgio Perotto. The Soviet MIR series of computers was developed from 1965 to 1969 in a group headed by Victor Glushkov.

27.
Protected mode
–
In computing, protected mode, also called protected virtual address mode, is an operational mode of x86-compatible central processing units. It allows system software to use features such as virtual memory and paging. When a processor that supports x86 protected mode is powered on, it begins executing instructions in real mode; protected mode may only be entered after the system software sets up several descriptor tables and enables the Protection Enable (PE) bit in control register 0 (CR0). Protected mode was first added to the x86 architecture in 1982, with the release of Intel's 80286 processor. The Intel 8086, the predecessor to the 286, was originally designed with a 20-bit address bus for its memory. This allowed the processor to access 2^20 bytes of memory, equivalent to 1 megabyte. As the cost of memory decreased and memory use increased, the 1 MB limitation became a significant problem. Intel intended to solve this limitation, along with others, with the release of the 286. The initial protected mode, released with the 286, was not widely used; for example, it was used by Microsoft Xenix, Coherent and Minix. Several shortcomings, such as the inability to access the BIOS or DOS calls due to the inability to switch back to real mode without resetting the processor, prevented widespread usage. The 286 maintained backwards compatibility with its precursor the 8086 by initially entering real mode on power-up. Real mode functioned virtually identically to the 8086, allowing the vast majority of existing 8086 software to run unmodified on the newer 286. Real mode also served as a more basic mode in which protected mode could be set up. The 286's protected mode enabled 24-bit addressing, which allowed the processor to access 2^24 bytes of memory, equivalent to 16 megabytes. With the release of the 386 in 1985, many of the issues preventing widespread adoption of protected mode were addressed.
The 386 was released with an address bus size of 32 bits, allowing 2^32 bytes (4 gigabytes) of memory to be addressed. The segment sizes were also increased to 32 bits, meaning that the full address space of 4 gigabytes could be accessed without the need to switch between multiple segments. In addition to the increased size of the address bus and segment registers, many other new features were added. Protected mode is now used in all modern operating systems which run on the x86 architecture, such as Microsoft Windows and Linux. Hardware support required for virtualizing protected mode itself, however, had to wait for another 20 years. For returning the 286 to real mode, IBM devised a workaround which involved resetting the CPU via the keyboard controller while saving the system registers and stack pointer. This allowed the BIOS to restore the CPU to a similar state after the reset. Later, a triple fault was used to reset the 286 CPU, which was a lot faster and cleaner than the keyboard controller method. To enter protected mode, the Global Descriptor Table (GDT) must first be created with a minimum of three entries: a null descriptor, a code segment descriptor and a data segment descriptor. In an IBM-compatible machine, the A20 line also must be enabled to allow the use of all the address lines so that the CPU can access beyond 1 megabyte of memory.
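The address-space jumps described above follow directly from the number of address bits; a quick arithmetic check (no CPU specifics assumed):

```python
# Addressable memory = 2 ** (number of address bits)
MB = 2 ** 20

real_8086  = 2 ** 20   # 8086: 20-bit address bus
prot_80286 = 2 ** 24   # 286 protected mode: 24-bit addressing
prot_80386 = 2 ** 32   # 386: 32-bit address bus and segments

print(real_8086 // MB)    # 1     (1 MB real-mode limit)
print(prot_80286 // MB)   # 16    (16 MB on the 286)
print(prot_80386 // MB)   # 4096  (4 GB flat space on the 386)
```

Each extra address bit doubles the reachable memory, which is why the step from 20 to 24 bits gives a 16-fold increase and the step to 32 bits a further 256-fold one.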

28.
IBM PC DOS
–
IBM PC DOS is a discontinued operating system for the IBM Personal Computer, manufactured and sold by IBM from 1981 into the 2000s. Before version 6.1, PC DOS was an IBM-branded version of MS-DOS; from version 6.1 on, PC DOS became IBM's independent product. The IBM task force assembled to develop the PC decided that critical components of the machine, including the operating system, would come from outside vendors. This radical break from company tradition of in-house development was one of the key decisions that made the IBM PC an industry standard. The private company Microsoft, founded five years before by Bill Gates, was selected for the operating system. IBM wanted Microsoft to retain ownership of whatever software it developed. According to task force member Jack Sams: "The reasons were internal. We had a problem being sued by people claiming we had stolen their stuff. It could be expensive for us to have our programmers look at code that belonged to someone else because they would then come back and say we stole it. We had lost a series of suits on this, and so we didn't want to have a product which was clearly someone else's product worked on by IBM people. We went to Microsoft on the proposition that we wanted this to be their product." IBM first contacted Microsoft to look the company over in July 1980; negotiations continued over the next months, and the paperwork was officially signed in early November. Although IBM expected that most customers would use PC DOS, the IBM PC also supported CP/M-86, which became available six months after PC DOS, and the UCSD p-System operating system. IBM's expectation proved correct: one survey found that 96.3% of PCs were ordered with the $40 PC DOS, compared to 3.4% for the $240 CP/M-86. Microsoft first licensed, then purchased, 86-DOS from Seattle Computer Products; O'Rear got 86-DOS to run on the prototype PC in February 1981. 86-DOS had to be converted from 8-inch to 5.25-inch floppy disks and integrated with the BIOS. IBM had more people writing requirements for the computer than Microsoft had writing code, and O'Rear often felt overwhelmed by the number of people he had to deal with at the ESD facility in Boca Raton. 86-DOS was rebranded IBM PC DOS 1.0 for its August 1981 release with the IBM PC. The initial version of DOS was largely based on CP/M-80 1.x and shared most of its architecture and function calls; the most significant difference was that it introduced a different file system, FAT12. Unlike all later DOS versions, the DATE and TIME commands were separate programs rather than part of COMMAND.COM. Single-sided 160-kilobyte 5.25-inch floppies were the only disk format supported. In late 1981 Paterson, now at Microsoft, began writing PC DOS 1.10. It debuted in May 1982 along with the Revision B IBM PC; support for the new double-sided drives was added, allowing 320 kB per disk.

29.
System call
–
In computing, a system call is the programmatic way in which a computer program requests a service from the kernel of the operating system it is executed on. This may include hardware-related services, creation and execution of new processes, and communication with integral kernel services; system calls provide an essential interface between a process and the operating system. In most systems, system calls can only be made from userspace processes, while in some systems, OS/360 and successors for example, privileged system code also issues system calls. The architecture of most modern processors, with the exception of some embedded systems, involves a security model with multiple privilege levels. The operating system executes at the highest level of privilege and allows applications to request services via system calls. Generally, systems provide a library or API that sits between normal programs and the operating system. The library's wrapper functions expose an ordinary function calling convention for using the system call; in this way the library, which exists between the OS and the application, increases portability. The call to the wrapper function itself does not cause a switch to kernel mode and is usually a normal subroutine call; the actual system call does transfer control to the kernel. For example, in Unix-like systems, fork and execve are C library functions that in turn execute instructions that invoke the fork and exec system calls. On exokernel-based systems, the library is especially important as an intermediary: on exokernels, libraries shield user applications from the very low-level kernel API and provide abstractions and resource management. IBM operating systems descended from OS/360 and DOS/360, including z/OS and z/VSE, implement system calls through a library of assembly language macros; this reflects their origin at a time when programming in assembly language was more common than high-level language usage. IBM system calls are therefore not directly executable by high-level language programs.
On Unix, Unix-like and other POSIX-compliant operating systems, popular system calls are open, read, write, close, wait, exec, fork and exit. Many modern operating systems have hundreds of system calls: for example, Linux and OpenBSD each have over 300 different calls, NetBSD has close to 500, FreeBSD has over 500, Windows 7 has close to 700, while Plan 9 has 51. Tools such as strace allow a process to be run from start while reporting all system calls it invokes; this special ability of the tracing program is itself implemented with a system call. Implementing system calls requires a transfer of control from user space to kernel space. A typical way to implement this is to use a software interrupt or trap: interrupts transfer control to the operating system kernel, so software simply needs to set up some register with the system call number needed and execute the software interrupt. This is the only technique provided for many RISC processors, but CISC architectures such as x86 support additional mechanisms. For example, the x86 instruction set contains the instructions SYSCALL/SYSRET and SYSENTER/SYSEXIT; these are fast control-transfer instructions designed to quickly transfer control to the kernel for a system call without the overhead of an interrupt. Linux 2.5 began using this on the x86, where available; formerly it used the INT instruction, with the system call number placed in the EAX register before interrupt 0x80 was executed. An older x86 mechanism is the call gate.

30.
MS-DOS
–
MS-DOS is a discontinued operating system for x86-based personal computers, mostly developed by Microsoft. MS-DOS resulted from a request in 1981 by IBM for an operating system to use in its IBM PC range of personal computers. Microsoft quickly bought the rights to 86-DOS from Seattle Computer Products, and IBM licensed and released it in August 1981 as PC DOS 1.0 for use in their PCs. During its lifetime, several competing products were released for the x86 platform, and it was also the underlying basic operating system on which early versions of Windows ran as a GUI. It is a simple operating system and consumes negligible installation space. MS-DOS was a renamed form of 86-DOS, owned by Seattle Computer Products; the first version of 86-DOS was shipped in August 1980. Microsoft, which needed an operating system for the IBM Personal Computer, hired Tim Paterson in May 1981 and bought 86-DOS 1.10 for $75,000 in July of the same year. Microsoft kept the version number but renamed it MS-DOS, and also licensed MS-DOS 1.10/1.14 to IBM; within a year Microsoft licensed MS-DOS to over 70 other companies. It was designed to be an OS that could run on any 8086-family computer; thus, there were many different versions of MS-DOS for different hardware, and there is a major distinction between an IBM-compatible machine and an MS-DOS-compatible machine. This design would have worked well for compatibility if application programs had only used MS-DOS services to perform device I/O. Microsoft omitted multi-user support from MS-DOS because Microsoft's Unix-based operating system, Xenix, was fully multi-user. After the breakup of the Bell System, however, AT&T Computer Systems started selling UNIX System V; believing that it could not compete with AT&T in the Unix market, Microsoft abandoned Xenix, and in 1987 transferred ownership of Xenix to the Santa Cruz Operation.
On 25 March 2014, Microsoft made the source code to SCP MS-DOS 1.25 publicly available. As an April Fools' joke in 2015, Microsoft Mobile launched a Windows Phone application called MS-DOS Mobile, which was presented as a new mobile operating system and worked similarly to MS-DOS. Version 3.1 added support for Microsoft Networks. Version 3.2 was the first version to support 3.5-inch, 720 kB floppy drives and diskettes, and was followed by versions 3.21, 3.22 and 3.25. Version 3.3 was the first version to support 3.5-inch, 1.44 MB floppy drives and diskettes; it was followed by version 3.3a and by version 3.31, which supports FAT16B and larger drives. MS-DOS 4.0 and MS-DOS 4.1 were a separate branch of development with additional multitasking features, unrelated to any later versions, including the versions 4.00 and 4.01 listed below. The later MS-DOS 4.x line includes a graphical/mouse interface, but had many bugs and compatibility issues. Version 4.00 was the first version to support a hard disk partition greater than 32 MiB. Version 4.01 was Microsoft's rewritten version 4.00, released under the MS-DOS label, and the first version to introduce a volume serial number when formatting hard disks and floppy disks.
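The 32 MiB partition ceiling that version 4.00 lifted follows from earlier on-disk fields: with 512-byte sectors and a 16-bit total-sector count, a volume tops out at 65,536 sectors. This is a back-of-the-envelope check of that limit, not a full FAT layout.

```python
SECTOR = 512                  # bytes per sector on PC media
max_sectors_16bit = 2 ** 16   # a 16-bit total-sector field

limit = SECTOR * max_sectors_16bit
print(limit)                  # 33554432 bytes
print(limit // 2 ** 20)       # 32 MiB
```

Widening the sector count to 32 bits (as in FAT16B's layout) removes this particular ceiling, leaving other limits such as cluster count to dominate.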

31.
Text mode
–
Text mode is a computer display mode in which content is internally represented on a computer screen in terms of characters rather than individual pixels. Typically, the screen consists of a rectangular grid of character cells. Text mode is contrasted to all points addressable mode or other kinds of computer graphics modes; text mode applications communicate with the user via command-line interfaces and text user interfaces. A typical example of a text-mode character set is IBM code page 437, in which every character has the same width; this was an analogy of early mechanical printers, which had fixed pitch. This way, the output seen on the screen could be sent directly to the printer maintaining exactly the same format. Depending on the environment, the screen buffer can be directly addressable. Programs that display output on remote video terminals must issue special control sequences to manipulate the screen buffer; the most popular standards for such control sequences are ANSI and VT100. Text mode video rendering came to prominence in the early 1970s. The advantages of text modes as compared to graphics modes include lower memory consumption and faster screen manipulation. Early framebuffers were standalone devices which cost thousands of dollars, in addition to the expense of the advanced high-resolution displays to which they were connected. Applications that required simple line graphics could sometimes justify that expense, but there were many computer applications for which all that was required was the ability to render ordinary text in a quick and cost-effective fashion to a cathode ray tube. Text mode avoids the problem of expensive framebuffer memory by having dedicated display hardware re-render each line of text from characters into pixels with each scan of the screen by the cathode ray. In turn, the hardware needs only enough memory to store the characters, plus the pixels equivalent to one line of text at a time.
For example, a screen buffer sufficient to hold a standard grid of 80 by 25 characters requires at least 2,000 bytes, far less than the framebuffers that cost thousands of U.S. dollars. Another advantage of text mode is that it has relatively low bandwidth requirements in remote terminal use. Text mode rendering with user-defined characters has also been useful for 2D computer and video games. A video controller implementing a text mode usually uses two distinct areas of memory. Character memory, or a glyph table, contains the raster font in use; the display matrix tracks which character is in each cell. In the simplest case the display matrix can be just a matrix of code points, but it usually stores, for each character position, not only a code but also attributes. The video controller has two registers, a scan line counter and a dot counter, serving as coordinates in the screen's dot matrix. Each of them must be divided by the corresponding glyph size to obtain an index in the display matrix. The character memory resides in read-only memory in some systems; other systems allow the use of RAM for this purpose, making it possible to redefine the typeface. In some historical graphics chips, including the TMS9918, the MOS Technology VIC, and the Game Boy graphics hardware, this was actually the canonical way of doing pixel graphics.
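The sizes above are easy to verify. In a typical PC-style colour text mode each cell also carries an attribute byte, so the buffer doubles, and the bytes for a given row and column sit at a fixed offset. The layout below assumes the interleaved char+attribute scheme for illustration:

```python
COLS, ROWS = 80, 25

chars_only = COLS * ROWS        # one byte per character cell
with_attrs = COLS * ROWS * 2    # character byte + attribute byte per cell

def cell_offset(row, col):
    """Byte offset of a cell in a row-major char+attribute buffer."""
    return (row * COLS + col) * 2

print(chars_only)          # 2000 bytes for the bare 80x25 grid
print(with_attrs)          # 4000 bytes with per-cell attributes
print(cell_offset(1, 0))   # 160: one full row of 80 two-byte cells
```

Writing two bytes at such an offset changes one on-screen character instantly, which is why text-mode screen manipulation is so much faster than repainting pixels.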

32.
Unix
–
Among modern Unix systems is Apple's macOS, which is the Unix version with the largest installed base as of 2014. Many Unix-like operating systems have arisen over the years, of which Linux is the most popular. Unix was originally meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmer users. The system grew larger as it started spreading in academic circles and as users added their own tools to it. Unix was designed to be portable, multi-tasking and multi-user in a time-sharing configuration; these concepts, along with others, are collectively known as the Unix philosophy. By the early 1980s users began seeing Unix as a potential universal operating system. Under Unix, the operating system consists of many utilities along with the master control program, the kernel. To mediate access to the hardware, the kernel has special rights, reflected in the division between user space and kernel space; the microkernel concept was introduced in an effort to reverse the trend towards larger kernels and return to a system in which most tasks were completed by smaller utilities. In an era when a standard computer consisted of a disk for storage and a data terminal for input and output, the Unix file model worked quite well. However, modern systems include networking and other new devices; as graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores. In microkernel implementations, functions such as network protocols could be moved out of the kernel. Multics introduced many innovations, but had many problems. Frustrated by the size and complexity of Multics, but not by its aims, Bell Labs' last researchers to leave the Multics project, Ken Thompson, Dennis Ritchie, M. D. McIlroy, and J. F. Ossanna, decided to redo the work on a much smaller scale.
The name Unics, a pun on Multics, was suggested for the project in 1970; Peter H. Salus credits Peter Neumann with the pun, while Brian Kernighan claims the coining for himself. In 1972, Unix was rewritten in the C programming language. Bell Labs produced several versions of Unix that are collectively referred to as Research Unix. In 1975, the first source license for UNIX was sold to faculty at the University of Illinois Department of Computer Science; UIUC graduate student Greg Chesson was instrumental in negotiating the terms of this license. During the late 1970s and early 1980s, the influence of Unix in academic circles led to adoption of Unix by commercial vendors, producing variants such as Sequent's system, HP-UX, Solaris and AIX. In the late 1980s, AT&T Unix System Laboratories and Sun Microsystems developed System V Release 4. In the 1990s, Unix-like systems grew in popularity as Linux and BSD distributions were developed through collaboration by a worldwide network of programmers.
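Of the 1980s IPC additions listed above, Unix domain sockets are the easiest to demonstrate from a high-level language. A minimal sketch: socket.socketpair() returns two connected AF_UNIX endpoints in one process, so bytes flow through the kernel without touching the network stack.

```python
import socket

# A connected pair of Unix domain sockets, one of the IPC mechanisms
# added to Unix in the 1980s alongside shared memory, message queues
# and semaphores.
parent, child = socket.socketpair()

parent.sendall(b"ping")   # data is copied through the kernel
msg = child.recv(16)
child.sendall(b"pong")
reply = parent.recv(16)

parent.close()
child.close()
print(msg, reply)         # b'ping' b'pong'
```

In real use the two endpoints usually end up in different processes (e.g. passed across a fork), which is exactly the asynchronous, non-file-shaped communication the plain file model handled poorly.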

33.
Xenix
–
Xenix is a discontinued version of the Unix operating system for various microcomputer platforms, licensed by Microsoft from AT&T Corporation in the late 1970s. The Santa Cruz Operation (SCO) later acquired exclusive rights to the software. In the mid-to-late 1980s, Xenix was the most common Unix variant; Microsoft chairman Bill Gates said in 1996 that for a long time the company had the highest-volume AT&T Unix license. Bell Labs, the developer of Unix, was part of the regulated Bell System and could not sell the software directly; it instead licensed it to others. Because Microsoft was not able to license the UNIX name itself, it called Xenix "a universal operating environment". The first version of Xenix was very close to the original UNIX Version 7 source on the PDP-11, Microsoft said in 1981, and later versions were to incorporate its own fixes and improvements. The first port was for the Z8001 16-bit processor; the first customer ship was January 1981, for Central Data Corporation of Illinois. The first 8086 port was for the Altos Computer Systems non-PC-compatible 8600-series computers. Intel sold complete computers with Xenix under their Intel System 86 brand, including processor boards like the iSBC 86/12 and MMU boards such as the iSBC 309; the first Intel Xenix systems shipped in July 1982. Seattle Computer Products also made 8086 computers bundled with Xenix, like their Gazelle II, which used the S-100 bus and was available in late 1983 or early 1984. There was also a port for the IBM System 9000. SCO had initially worked on its own PDP-11 port of V7, called Dynix, but then struck an agreement with Microsoft for joint development and technology exchange on Xenix in 1982. In 1984, a port to the 68000-based Apple Lisa was jointly developed by SCO and Microsoft. The difficulty in porting to the various 8086 and Z8000-based machines, said Microsoft in its 1983 OEM directory, had been the lack of a standardized memory management unit and protection facilities.
A generally available port to the unmapped Intel 8086/8088 architecture was done by The Santa Cruz Operation around 1983. SCO Xenix for the PC XT shipped sometime in 1984; it contained some enhancements from 4.2BSD and also supported the Micnet local area networking. The later 286 version of Xenix leveraged the integrated MMU present on that chip, and 286 Xenix was accompanied by new hardware from Xenix OEMs. For example, the Sperry PC/IT, an IBM PC AT clone, was advertised as capable of supporting eight simultaneous dumb-terminal users under this version. It was followed by a System V.2 codebase in Xenix 5.0. Microsoft hoped that Xenix would become the choice for software production and exchange, and referred to its own MS-DOS as its single-user, single-tasking operating system; Microsoft's Chris Larson described MS-DOS 2.0's Xenix compatibility as its second most important feature. After the breakup of the Bell System, however, AT&T started selling System V, and Microsoft, believing that it could not compete with Unix's developer, decided to abandon Xenix. The decision was not immediately made transparent, which led to the term vaporware. Microsoft agreed with IBM to develop OS/2, and the Xenix team was assigned to that project.

34.
Windows NT
–
Windows NT is a family of operating systems produced by Microsoft, the first version of which was released in July 1993. It is a processor-independent, multiprocessing, multi-user operating system. The first version, Windows NT 3.1, was produced for workstations and server computers and was intended to complement consumer versions of Windows that were based on MS-DOS. Gradually, the Windows NT family was expanded into Microsoft's general-purpose operating system product line for all personal computers, deprecating the Windows 9x family. "NT" formerly expanded to "New Technology" but no longer carries any specific meaning; starting with Windows 2000, "NT" was removed from the product name and is only included in the product version string. NT was the first purely 32-bit version of Windows, whereas its consumer-oriented counterparts, Windows 3.1x and Windows 9x, mixed 16-bit and 32-bit code. It is a multi-architecture operating system: initially it supported several CPU architectures, including IA-32, MIPS, DEC Alpha, and PowerPC, while the latest versions support x86 and ARM. It has been suggested that Dave Cutler intended the initialism "WNT" as a play on VMS, incrementing each letter by one, a lineage made clear in Cutler's foreword to Inside Windows NT by Helen Custer. However, the project was intended as a follow-on to OS/2 and was referred to as NT OS/2 before receiving the Windows brand. One of the original NT developers, Mark Lucovsky, states that the name was taken from the original target processor, the Intel i860. The letters were dropped from the names of releases from Windows 2000 onwards, though Microsoft described that product as being "Built on NT Technology". A main design goal of NT was hardware and software portability; the idea was to have a common code base with a custom Hardware Abstraction Layer (HAL) for each platform.
However, support for MIPS, Alpha, and PowerPC was later dropped in Windows 2000. Broad software compatibility was achieved with support for several API "personalities", including the Windows API, POSIX, and OS/2 APIs; the latter two were phased out starting with Windows XP. Partial MS-DOS compatibility was achieved via an integrated DOS virtual machine, although this feature is being phased out on the x86-64 architecture. NT supported per-object access control lists, allowing a rich set of security permissions to be applied to systems and services, and it supported Windows network protocols, inheriting the previous OS/2 LAN Manager networking. Windows NT 3.1 was the first version of Windows to use 32-bit flat virtual memory addressing on 32-bit processors; its companion product, Windows 3.1, used segmented addressing. Notably, in Windows NT 3.x several I/O driver subsystems, such as video and printing, were user-mode subsystems; in Windows NT 4.0 the video, server, and printer spooler subsystems were moved into kernel mode. NTFS, a journaled, secure file system, was created for NT, and starting with version 3.1 Windows NT also allows for other installable file systems. Windows NT introduced its own driver model, the Windows NT driver model. With Windows 2000, the Windows NT driver model was enhanced to become the Windows Driver Model, which was first introduced with Windows 98 but was based on the NT driver model.
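The suggested play on VMS — incrementing each letter by one to get WNT — is easy to check with a few lines of code. This is an illustrative sketch, not anything from NT itself:

```python
# Shift each letter of a string up by n places in the alphabet (no wrap-around
# needed for this example). Applying a shift of 1 to "VMS" yields "WNT".
def shift_letters(s, n=1):
    return "".join(chr(ord(c) + n) for c in s)

print(shift_letters("VMS"))  # WNT
```

The same one-letter shift is famously claimed (in the other direction) for HAL and IBM in 2001: A Space Odyssey.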

35.
eComStation
–
eComStation or eCS is a PC operating system based on OS/2, published by Serenity Systems and Mensys BV and currently owned and developed by XEU.com. It includes several additions and accompanying software not present in the IBM version of the system. eComStation is a 32-bit operating system which runs exclusively on the x86 processor architecture and is still used as of 2017. Version 1 of eComStation, released in 2001, was based around the integrated OS/2 version 4.5 client, the Convenience Package for OS/2 Warp version 4; key among its features were the JFS file system and the logical volume manager. eComStation provided a channel for end users to obtain operating system features and enhancements that had been made available as updates, including IBM-supplied updates that had only been offered to customers with maintenance contracts, such as UDF support. It also included utilities and drivers licensed from third parties (among them scanner support and drivers for multiple serial cards), open-source utilities from the Unix world, and a number of utilities and drivers developed by various third parties. As IBM began to wind down OS/2 development, Serenity and its partners began to take up the slack in keeping the operating system usable on current hardware. The results of many of these efforts are included in version 2 of eComStation, among them ACPI support, a new generic graphics card driver called Panorama, a universal sound card driver based on ALSA, on-the-fly resizing of hard drive partitions, a new client to access CIFS/SMB LAN resources based upon Samba, ports of current Mozilla Firefox and Mozilla Thunderbird for browsing and email, and a port of the OpenOffice.org office suite. When it became clear that IBM would not release any new retail version of the OS/2 Warp client operating system after version 4 in 1996, IBM released a final version of its server edition, IBM OS/2 Warp Server for e-Business (WSeB), internally called version 4.5.
The OS/2 software vendor Stardock made such a proposal to IBM in 1999. Notwithstanding Cheung's fairly simple initial concept, community input was actively solicited from the beginning, and feature requests quickly began coming in. The final GA release of eComStation 1.0 did not appear until July 2001. Most obviously, the IBM OS/2 install routine was no longer used; instead, a rapid-deployment system based on Cheung's WiseManager product was utilized to install the operating system components. eComStation 1.0 was built on the 2000 release of IBM's Convenience Package for OS/2 Warp version 4. Additionally, several commercial applications were bundled with the operating system package, most notably Lotus SmartSuite for OS/2 and IBM Desktop On-Call. Once the English edition was released, efforts turned to making other language editions available; ultimately, no further non-English NLVs were released for eComStation 1.0, and other languages would not become available until eComStation 1.1 or 1.2. eComStation 1.1 included several new features compared to version 1.0.

36.
Display device
–
A display device is an output device for the presentation of information in visual or tactile form. When the input information is supplied as an electrical signal, the display is called an electronic display. Common applications for visual displays are televisions and computer monitors. Some displays can show only digits or alphanumeric characters; they are called segment displays, because they are composed of several segments that switch on and off to give the appearance of the desired glyph. The segments are usually single LEDs or liquid crystals, and they are mostly used in digital watches and pocket calculators. There are several types: the seven-segment display, the fourteen-segment display, the sixteen-segment display, and displays driven by the HD44780 LCD controller, a widely accepted protocol for LCDs. Tactile displays use electro-mechanical parts to dynamically update a tactile image so that the image may be felt by the fingers; the Optacon, for example, used metal rods instead of light in order to convey images to blind people by tactile sensation. In the history of display technology, a variety of display devices and technologies have been used.
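The segment principle described above can be illustrated with a short sketch: each digit maps to a subset of the seven segments, conventionally labelled a (top), b (top-right), c (bottom-right), d (bottom), e (bottom-left), f (top-left), and g (middle). The rendering below is purely illustrative:

```python
# Which of the seven segments (a-g) light up for each decimal digit.
SEGMENTS = {
    "0": "abcdef", "1": "bc",     "2": "abdeg",   "3": "abcdg", "4": "bcfg",
    "5": "acdfg",  "6": "acdefg", "7": "abc",     "8": "abcdefg", "9": "abcdfg",
}

def render(digit):
    """Draw a digit as three rows of ASCII art, one segment per position."""
    on = set(SEGMENTS[digit])
    top = " _ " if "a" in on else "   "
    mid = ("|" if "f" in on else " ") + ("_" if "g" in on else " ") + ("|" if "b" in on else " ")
    bot = ("|" if "e" in on else " ") + ("_" if "d" in on else " ") + ("|" if "c" in on else " ")
    return "\n".join([top, mid, bot])

print(render("2"))
```

Switching a segment "on" here just prints a character; in hardware it means driving one LED or liquid-crystal element.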

37.
Computer keyboard
–
In computing, a computer keyboard is a typewriter-style device which uses an arrangement of buttons or keys to act as mechanical levers or electronic switches. Following the decline of punch cards and paper tape, interaction via teleprinter-style keyboards became the main input method for computers. A keyboard typically has characters engraved or printed on the keys; however, producing some symbols requires pressing and holding several keys simultaneously or in sequence. While most keyboard keys produce letters, numbers or signs, other keys or simultaneous key presses can produce actions or execute computer commands. In normal usage, the keyboard is used as a text entry interface to type text and numbers into a word processor or other program. In a modern computer, the interpretation of key presses is generally left to the software: a computer keyboard distinguishes each physical key from every other and reports all key presses to the controlling software. Keyboards are also used for computer gaming, either with regular keyboards or with keyboards that have special gaming features, which can expedite frequently used keystroke combinations. A keyboard is also used to give commands to the operating system of a computer, such as the Windows Control-Alt-Delete combination. A command-line interface is a type of user interface operated entirely through a keyboard, and it was through such devices that modern computer keyboards inherited their layouts. Earlier models were developed separately by individuals such as Royal Earl House; earlier still, Herman Hollerith developed the first keypunch devices, which soon evolved to include keys for text and number entry akin to normal typewriters by the 1930s. From the 1940s until the late 1960s, typewriters were the main means of data entry. The keyboard remained the primary, most integrated computer peripheral well into the era of personal computing, until the introduction of the mouse as a pointing device in 1984.
By this time, text-only user interfaces with sparse graphics gave way to comparatively graphics-rich icons on screen. One factor determining the size of a keyboard is the presence of duplicate keys, such as a separate numeric keypad, for convenience; a keyboard with few keys is called a keypad. Another factor determining the size of a keyboard is the size and spacing of the keys. Reduction is limited by the consideration that the keys must be large enough to be easily pressed by fingers; alternatively, a tool is used for pressing small keys. Standard alphanumeric keyboards have keys that are on three-quarter-inch centers and have a key travel of at least 0.150 inches. Desktop computer keyboards, such as the 101-key US traditional keyboards or the 104-key Windows keyboards, include characters, punctuation symbols, and numbers. The internationally common 102/104-key keyboards have a smaller left shift key, and the enter key is usually shaped differently. Computer keyboards are similar to electric-typewriter keyboards but contain additional keys, such as the command or Windows keys.
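The point that interpretation is left to software can be sketched in a few lines: the keyboard reports opaque scan codes for key presses and releases, and the driver maps them to characters while tracking modifier state. The scan codes below follow PC scan code set 1 conventions for a few keys, but treat the whole mapping as illustrative:

```python
# Minimal sketch of software-side key interpretation. 0x1E/0x30 are the set-1
# "make" codes for A and B; 0x2A is left-shift press, 0xAA left-shift release.
SCANCODE_TO_KEY = {0x1E: "a", 0x30: "b", 0x2A: "SHIFT", 0xAA: "SHIFT_UP"}

def decode(events):
    """Turn a stream of raw scan codes into text, honoring the shift state."""
    shift, out = False, []
    for code in events:
        key = SCANCODE_TO_KEY.get(code)
        if key == "SHIFT":
            shift = True
        elif key == "SHIFT_UP":
            shift = False
        elif key:
            out.append(key.upper() if shift else key)
    return "".join(out)

print(decode([0x1E, 0x2A, 0x30, 0xAA, 0x1E]))  # aBa
```

A real driver also handles autorepeat, multi-byte extended codes, and layout tables, but the division of labor is the same: the hardware reports positions, the software decides meaning.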

38.
BIOS
–
The BIOS is a type of firmware used to perform hardware initialization during the booting process on IBM PC compatible computers, and to provide runtime services for operating systems and programs. The BIOS firmware is built into personal computers and is the first software they run when powered on; the name itself originates from the Basic Input/Output System used in the CP/M operating system in 1975. Originally proprietary to the IBM PC, the BIOS has been reverse-engineered by companies looking to create compatible systems. The fundamental purposes of the BIOS in modern PCs are to initialize and test the hardware components and to load an operating system from a mass storage device. Variations in the hardware are hidden by the BIOS from programs that use BIOS services instead of directly accessing the hardware. MS-DOS, which was the dominant PC operating system from the early 1980s until the mid-1990s, relied on BIOS services for disk, keyboard, and text display functions. Most BIOS implementations are specifically designed to work with a particular computer or motherboard model. In modern systems the BIOS contents are stored on flash memory, which allows easy updates to the BIOS firmware so new features can be added or bugs can be fixed. Unified Extensible Firmware Interface (UEFI) was designed as a successor to the BIOS, aiming to address its technical shortcomings; as of 2014, new PC hardware predominantly ships with UEFI firmware. Together with the underlying hardware-specific, but operating-system-independent, System BIOS that resides in ROM, it represents an analogue to the CP/M BIOS. With the introduction of PS/2 machines, IBM divided the System BIOS into real-mode and protected-mode portions. The BIOS of the original IBM PC XT had no interactive user interface; options on the IBM PC and XT were set by switches and jumpers on the main board. Starting around the mid-1990s, it became typical for the BIOS ROM to include a BIOS configuration utility (or BIOS setup utility), accessed at system power-up by a particular key sequence. This program allowed the user to set configuration options of the type formerly set using DIP switches.
On some machines the configuration program was instead on a disk supplied with the computer, and if that disk was lost the settings could not be changed. Instead of battery-backed RAM, the modern Wintel machine may store the BIOS configuration settings in flash ROM. Early Intel processors started at physical address 000FFFF0h. When a modern x86 microprocessor is reset, it starts in a pseudo 16-bit real mode: the code segment register is initialized with selector F000h, base FFFF0000h, and limit FFFFh, so that execution starts at 4 GB minus 16 bytes. The platform logic maps this address into the system ROM, mirroring address 000FFFF0h. If the system has just been powered up or the reset button was pressed, the full power-on self-test is run; otherwise an abbreviated test is performed, which saves the time otherwise used to detect and test all memory. If a boot program was downloaded and the download was apparently successful, the BIOS would verify a checksum on it and then run it.
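The reset-vector arithmetic above is easy to verify: the hidden segment base FFFF0000h plus the instruction pointer FFF0h lands exactly 16 bytes below the 4 GB boundary, and the classic real-mode form of the same vector, F000:FFF0, lands just under 1 MB:

```python
# Reset state of a modern x86 CPU: CS selector F000h with hidden base
# FFFF0000h, instruction pointer FFF0h. The first instruction fetch is at:
base, ip = 0xFFFF0000, 0xFFF0
reset_vector = base + ip
print(hex(reset_vector))             # 0xfffffff0
print(2**32 - reset_vector)          # 16  -> "4 GB minus 16 bytes"

# Classic real-mode addressing (segment * 16 + offset) for the same vector:
legacy_vector = (0xF000 << 4) + 0xFFF0
print(hex(legacy_vector))            # 0xffff0 -> the 000FFFF0h mirror
```

The platform chipset aliases both addresses onto the same ROM, which is why the text speaks of "mirroring" 000FFFF0h.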

39.
Keyboard shortcut
–
In computing, a keyboard shortcut is a series of one or several keys that invokes a software or operating system operation when triggered by the user. The meaning of the term keyboard shortcut can vary depending on the software manufacturer. Keyboard shortcuts are generally used to expedite common operations by reducing input sequences to a few keystrokes, hence the term "shortcut". To differentiate them from general keyboard input, most keyboard shortcuts require the user to press and hold one or more modifier keys together with a regular key. Unmodified key presses are sometimes accepted when the keyboard is not used for general input, such as with graphics packages, e.g. Adobe Photoshop or IBM Lotus Freelance Graphics. Other keyboard shortcuts use function keys that are dedicated for use in shortcuts. For simultaneous keyboard shortcuts, one usually first holds down the modifier key, then quickly presses and releases the regular key, and finally releases the modifier key. This distinction is important, as trying to press all the keys simultaneously will frequently miss some of the modifier keys. Sequential shortcuts usually involve pressing and releasing a dedicated key, such as the Esc key. Mnemonics are distinguishable from keyboard shortcuts. In most GUIs, a program's keyboard shortcuts are discoverable by browsing the program's menus; the shortcut is indicated next to the menu choice. There are keyboards that have the shortcuts for a particular application already marked on them; these keyboards are often used for editing video, audio, or graphics. There are also stickers with shortcuts printed on them that can be applied to a regular keyboard, and reference cards intended to be propped up in the user's workspace exist for many applications. This highlights a difference in philosophy regarding shortcuts: some systems, typically end-user-oriented systems such as Mac OS or Windows, consider standardized shortcuts essential to the environment's ease of use.
These systems usually limit a user's ability to change shortcuts, possibly even requiring a separate or third-party utility to perform the task. Other systems, typically Unix and related, consider shortcuts to be a user's prerogative and hold that they should be changeable to suit individual preference. For Microsoft Windows, multiple software programs exist, like Hotkeycontrol, which allow deeper customization of keyboard shortcuts for performing advanced tasks. The motivations for customizing key bindings vary. Users new to a program or software environment may customize the new environment's keybindings to be similar to another environment with which they are more familiar. More advanced users may customize key bindings to better suit their workflow, adding shortcuts for their commonly used actions, and hardcore gamers often customize their key bindings in order to increase performance via faster reaction times. The original Macintosh User Interface Guidelines defined a set of keyboard shortcuts that would remain consistent across application programs. This provides a better user experience than the situation then prevalent of applications using the same keys for different functions, which could result in user errors if one program used ⌘ Command+D to mean Delete while another used it to Duplicate an item. Later environments such as Microsoft Windows retain some of these bindings, while adding their own from alternate standards like Common User Access. The simplest keyboard shortcuts consist of only one key.
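The modifier-plus-key structure described above is how most toolkits store bindings internally: a set of modifiers and a single regular key. The sketch below parses a human-readable description like "Ctrl+Shift+S" into that form; the function name and the modifier list are illustrative, not any particular toolkit's API:

```python
# Normalize a shortcut description into (modifiers, key), the form in which
# many GUI toolkits store and compare key bindings.
MODIFIERS = {"ctrl", "alt", "shift", "meta"}

def parse_shortcut(text):
    parts = [p.strip().lower() for p in text.split("+")]
    mods = frozenset(p for p in parts if p in MODIFIERS)
    keys = [p for p in parts if p not in MODIFIERS]
    if len(keys) != 1:
        raise ValueError("expected exactly one non-modifier key")
    return mods, keys[0]

mods, key = parse_shortcut("Ctrl+Shift+S")
print(sorted(mods), key)  # ['ctrl', 'shift'] s
```

Storing the modifiers as a set is what makes "Shift+Ctrl+S" and "Ctrl+Shift+S" compare equal, matching the user's experience that modifier order does not matter.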

40.
X.25
–
X.25 is an ITU-T standard protocol suite for packet-switched wide area network communication. An X.25 WAN consists of packet-switching exchange nodes as the networking hardware. X.25 is a family of protocols that was popular during the 1980s with telecommunications companies and in financial transaction systems such as automated teller machines. X.25 was originally defined by the International Telegraph and Telephone Consultative Committee in a series of drafts and finalized in a publication known as The Orange Book in 1976. X.25 has, to an extent, been replaced by less complex protocols, especially the Internet Protocol, though it remains in use in niche applications. X.25 is one of the oldest packet-switched services available; it was developed before the OSI Reference Model. The protocol suite is designed as three conceptual layers, which correspond closely to the lower three layers of the seven-layer OSI model, though it also supports functionality not found in the OSI network layer. X.25 was developed in the ITU-T Study Group VII based upon a number of emerging data network projects. Various updates and additions were worked into the standard, eventually recorded in the ITU series of books describing the telecommunication systems; these books were published periodically with different-colored covers. The X.25 specification is only part of the larger set of X-Series specifications on public data networks. The public data network was the name given to the international collection of X.25 providers, whose combined network had large global coverage during the 1980s and into the 1990s. Publicly accessible X.25 networks were set up in most countries during the 1970s and 1980s to lower the cost of accessing various online services. Beginning in the early 1990s in North America, use of X.25 networks started to be replaced by Frame Relay. Most systems that required X.25 now use TCP/IP; however, it is possible to transport X.25 over TCP/IP when necessary.
X.25 networks are still in use throughout the world, and a variant called AX.25 is also used widely by amateur packet radio. Racal Paknet, now known as Widanet, is still in operation in many regions of the world, running on an X.25 protocol base. Additionally, X.25 is still in heavy use in some businesses, even though a transition to modern protocols like X.400 is becoming unavoidable as X.25 hardware grows increasingly rare. As recently as March 2006, the United States National Airspace Data Interchange Network used X.25 to interconnect remote airfields with Air Route Traffic Control Centers. France was one of the last remaining countries where a commercial end-user service based on X.25 operated: known as Minitel, it was based on Videotex, itself running on X.25. As planned, the service was terminated on 30 June 2012.
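The three-layer design mentioned above is visible in the packet-layer header, which occupies three octets: the first carries the General Format Identifier (upper four bits) and the logical channel group number (lower four bits), the second the logical channel number, and the third the packet type identifier. The sketch below decodes such a header; treat the field layout as a simplified reading of the standard, not a full implementation:

```python
# Hedged sketch of the three-octet X.25 layer-3 (packet layer) header.
def parse_x25_header(b):
    gfi = b[0] >> 4              # General Format Identifier (Q/D bits, modulo)
    lcgn = b[0] & 0x0F           # logical channel group number
    lcn = b[1]                   # logical channel number
    pti = b[2]                   # packet type identifier
    # Group number and channel number together form a 12-bit channel id.
    return {"gfi": gfi, "channel": (lcgn << 8) | lcn, "type": pti}

# A Call Request packet (PTI 0x0B) on logical channel 1, modulo-8 GFI:
print(parse_x25_header(bytes([0x10, 0x01, 0x0B])))
```

Real data packets pack sequence numbers P(S) and P(R) into the third octet instead of a fixed type code, which is where the modulo field in the GFI comes into play.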

The Personal System/2 or PS/2 was IBM's third generation of personal computers. Released in 1987, it officially …

The original IBM PS/2 mouse.

The PS/2 connection ports (later colored purple for keyboard and green for mouse, according to PC 97) were once commonly used for connecting input devices.

MCA IBM XGA-2 Graphics Card

Some PS/2 models used a quick-attachment socket on the back of the floppy drive. This connector is incompatible with a standard 5.25" floppy connector because IBM also supplied power to the drive via the connector, which was not done for the 5.25" drive. Also shown on the right is the special IBM-only hard drive which incorporates power and data into a single connector. Power supplies of certain PS/2 models did not have the additional spare 4-pin power connectors for use with internal storage.

Older Ethernet equipment. Clockwise from top-left: An Ethernet transceiver with an in-line 10BASE2 adapter, a similar model transceiver with a 10BASE5 adapter, an AUI cable, a different style of transceiver with 10BASE2 BNC T-connector, two 10BASE5 end fittings (N connectors), an orange "vampire tap" installation tool (which includes a specialized drill bit at one end and a socket wrench at the other), and an early model 10BASE5 transceiver (h4000) manufactured by DEC. The short length of yellow 10BASE5 cable has one end fitted with a N connector and the other end prepared to have a N connector shell installed; the half-black, half-grey rectangular object through which the cable passes is an installed vampire tap.

Two Internet hosts connected via two routers and the corresponding layers used at each hop. The application on each host executes read and write operations as if the processes were directly connected to each other by some kind of data pipe. Every other detail of the communication is hidden from each process. The underlying mechanisms that transmit data between the host computers are located in the lower protocol layers.

Encapsulation of application data descending through the layers described in RFC 1122

In the microkernel approach, the kernel itself only provides basic functionality that allows the execution of servers, separate programs that assume former kernel functions, such as device drivers, GUI servers, etc.

The hybrid kernel approach combines the speed and simpler design of a monolithic kernel with the modularity and execution safety of a microkernel.