Although it was originally released for the Amiga, Vim has since been developed to be cross-platform and supports many other platforms. In 2006, it was voted the most popular editor amongst Linux Journal readers;[8] in 2015 the Stack Overflow developer survey found it to be the third most popular text editor,[9] and in 2018 the fifth most popular development environment.[10]


Bram Moolenaar began working on Vim for the Amiga computer in 1988. Moolenaar first publicly released Vim (v1.14) in 1991.[11] Vim was based on an earlier editor, Stevie, for the Atari ST,[1] created by Tim Thompson, Tony Andrews, and G.R. (Fred) Walter.[12][13]

The name "Vim" is an acronym for "Vi IMproved"[14] because Vim is an extended version of the vi editor, with many additional features designed to be helpful in editing program source code. Originally, the acronym stood for "Vi IMitation", but that was changed with the release of Vim 2.0 in December 1993.[15] Moolenaar later commented that the name was changed because Vim's feature set had surpassed that of vi.[16]

Like vi, Vim's interface is not based on menus or icons but on commands given in a text user interface; its GUI mode, gVim, adds menus and toolbars for commonly used commands but the full functionality is still expressed through its command line mode. Vi (and by extension Vim) tends to allow a typist to keep their fingers on the home row, which can be an advantage for a touch typist.[29]

Vim has a built-in tutorial for beginners (accessible through the "vimtutor" command). There is also the Vim Users' Manual that details Vim's features. This manual can be read from within Vim, or found online.[30][31]

Vim also has a built-in help facility (using the :help command) that allows users to query and navigate through commands and features.
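As a sketch, a new user might explore the help system with commands along these lines (exact help topics can vary by Vim version):

```vim
" Open the main help window
:help
" Jump straight to the user manual's table of contents
:help usr_toc
" Look up a specific command, e.g. the 'd' (delete) operator
:help d
" Search all help files for a pattern and browse the matches
:helpgrep autocmd
" Inside a help buffer: Ctrl-] follows the tag under the cursor,
" and Ctrl-T or Ctrl-O jumps back
```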

Part of Vim's power is that it can be extensively customized. The basic interface can be controlled by the many options available, and the user can define personalized key mappings—often called macros—or abbreviations to automate sequences of keystrokes, or even call internal or user-defined functions.
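As an illustrative sketch of this kind of customization, a user's vimrc file might set options, define a key mapping, an abbreviation, and a small user-defined function (the names here are examples, not part of any standard configuration):

```vim
" ~/.vimrc (illustrative example)
set number          " show line numbers
set expandtab       " insert spaces instead of tab characters
set shiftwidth=4    " indent by four spaces

" A key mapping: strip trailing whitespace with <F5> in normal mode
nnoremap <F5> :%s/\s\+$//e<CR>

" An abbreviation: typing 'teh' in insert mode expands to 'the'
iabbrev teh the

" A user-defined function, callable with :call ToggleSpell()
function! ToggleSpell()
  setlocal spell! spelllang=en_us
endfunction
```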

There are many plugins available that will extend or add new functionality to Vim. These complex scripts are usually written in Vim's internal scripting language, vimscript (also known as VimL).[32] Vim also supports scripting using Lua (as of Vim 7.3), Perl, Python, Racket[33] (formerly PLT Scheme), Ruby, and Tcl.
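Whether a given interface language is available can be tested from within Vim itself: a build compiled with a given interpreter reports it via the has() function, and a corresponding Ex command executes code in that language. A sketch, assuming a Vim binary built with Python 3 support:

```vim
" Prints 1 if this Vim binary was compiled with the +python3 feature
:echo has('python3')

" Run a line of Python through the embedded interpreter
:python3 import sys; print(sys.version)

" The same pattern applies to the other interfaces
:echo has('lua') has('perl') has('ruby')
```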

There are projects that bundle together complex scripts and customizations, aimed at turning Vim into a tool for a specific task or at giving its behaviour a distinctive flavour. Examples include Cream, which makes Vim behave like a click-and-type editor, and VimOutliner, which provides a comfortable outliner for users of Unix-like systems.

Vim has a vi compatibility mode, but, when not in this mode, Vim has many enhancements over vi.[34] However, even in compatibility mode, Vim is not entirely compatible with vi as defined in the Single Unix Specification[35] and POSIX (e.g., Vim does not support vi's open mode, only visual mode). Vim has nevertheless been described as "very much compatible with Vi".[36]
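The difference is visible in the 'compatible' option: when it is set, many Vim enhancements are disabled to mimic vi's behaviour more closely. A sketch of the relevant commands:

```vim
" Query the current setting ('compatible' or 'nocompatible')
:set compatible?

" Enable vi-compatible behaviour (also implied by invoking Vim as 'vi'
" or starting it with 'vim -C')
:set compatible

" Restore the full Vim feature set ('vim -N' has the same effect, and
" simply having a vimrc file makes Vim start in nocompatible mode)
:set nocompatible
```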

Vim macros can contain a sequence of normal-mode commands, but can also invoke ex commands or functions written in Vim script for more complex tasks. Almost all extensions (called plugins or more commonly scripts) of the core Vim functionality are written in Vim script, but plugins can also utilize other interpreted languages like Perl, Python, Lua, or Ruby (if support for them is compiled into the Vim binary).

Vim script files are stored in plain text format and the file name extension is .vim. There are libraries for Vim script available on www.vim.org as Vim plugins.
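A minimal plugin illustrates the format: a .vim file dropped into the user's plugin directory is sourced at startup. Everything below is a hypothetical sketch, not an existing plugin:

```vim
" ~/.vim/plugin/datestamp.vim (hypothetical example)
" Defines :DateStamp, which inserts the current date after the cursor.

" Standard load guard so the script is only sourced once
if exists('g:loaded_datestamp')
  finish
endif
let g:loaded_datestamp = 1

function! s:InsertDate()
  " strftime() is a built-in Vim script function
  execute 'normal! a' . strftime('%Y-%m-%d')
endfunction

" Expose the script-local function as an Ex command
command! DateStamp call s:InsertDate()
```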

Neovim[43] is a fork of Vim that strives to improve the extensibility and maintainability of Vim.[44] Neovim shares the same configuration syntax with Vim; as a result, the same config file can be used with both editors.[45] As of version 0.1, released in December 2015, Neovim is compatible with almost all of Vim's features.[46]

The Neovim project was started in 2014, with some Vim community members offering early support of the high-level refactoring effort to provide better scripting, plugins, and integration with modern GUIs.[47][48] The project is open source and the full code is available on GitHub.[49] Neovim had a successful fundraiser on March 23, 2014,[50] supporting at least one full-time developer. Several frontends are under development, making use of Neovim's capabilities.[51][52]

The Neovim editor is available in Ubuntu's PPAs,[53] and several other package managers,[54] making it possible to install on a variety of Linux-based operating systems.

^Vim documentation: intro: "Vim is pronounced as one word, like Jim, not vi-ai-em. It's written with a capital, since it's a name, again like Jim."

^Zapletal, Lukáš (April 18, 2005), "Interview: Bram Moolenaar", LinuxEXPRES: 21–22, retrieved February 5, 2015: "Is Vim a derivative of another vi clone, or did you start from scratch? I started with Stevie. This was a vi clone for the Atari ST computer, ported to the Amiga. It had quite a lot of problems and could not do everything that Vi could, but since the source code was available I could fix that myself." (English translation)

^"vim(1)". die.net. Vim. 11 April 2006. Archived from the original on 9 July 2016. Retrieved 9 July 2016. Vim is based on Stevie, worked on by: Tim Thompson, Tony Andrews and G.R. (Fred) Walter. Although hardly any of the original code remains.

1.
Unix
–
Among these is Apple's macOS, which as of 2014 is the Unix version with the largest installed base. Many Unix-like operating systems have arisen over the years, of which Linux is the most popular. Unix was originally meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmer users. The system grew larger as it spread in academic circles and users added their own tools. Unix was designed to be portable, multi-tasking and multi-user in a time-sharing configuration; these concepts are collectively known as the Unix philosophy. By the early 1980s, users began seeing Unix as a potentially universal operating system. Under Unix, the system consists of many utilities along with the master control program, the kernel. To mediate access to the hardware, the kernel has special rights, reflected in the division between user space and kernel space; the microkernel concept was introduced in an effort to reverse the trend towards larger kernels and return to a system in which most tasks were completed by smaller utilities. Unix was developed in an era when a standard computer consisted of a disk for storage and a data terminal for input and output; modern systems, however, include networking and other new devices. As graphical user interfaces developed, the file model proved inadequate for handling asynchronous events such as those generated by a mouse. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores. In microkernel implementations, functions such as network protocols could be moved out of the kernel. Unix's predecessor, Multics, introduced many innovations but had many problems. Frustrated by the size and complexity of Multics, but not by its aims, the last researchers to leave the Multics project, Ken Thompson, Dennis Ritchie, M. D. McIlroy, and J. F. Ossanna, decided to redo the work on a much smaller scale.
The name Unics, a pun on Multics, was suggested for the project in 1970; Peter H. Salus credits Peter Neumann with the pun, while Brian Kernighan claims the coining for himself. In 1972, Unix was rewritten in the C programming language. Bell Labs produced several versions of Unix that are collectively referred to as Research Unix. In 1975, the first source license for UNIX was sold to faculty at the University of Illinois Department of Computer Science; UIUC graduate student Greg Chesson was instrumental in negotiating the terms of this license. During the late 1970s and early 1980s, the influence of Unix in academic circles led to its adoption by commercial startups such as Sequent, and to commercial variants including HP-UX, Solaris, and AIX. In the late 1980s, AT&T Unix System Laboratories and Sun Microsystems developed System V Release 4. In the 1990s, Unix-like systems grew in popularity as Linux and BSD distributions were developed through collaboration by a worldwide network of programmers.

2.
Linux
–
Linux is a Unix-like computer operating system assembled under the model of free and open-source software development and distribution. The defining component of Linux is the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. The Free Software Foundation uses the name GNU/Linux to describe the operating system, which has led to some controversy. Linux was originally developed for personal computers based on the Intel x86 architecture. Because of the dominance of Android on smartphones, Linux has the largest installed base of all general-purpose operating systems. Linux is also the leading operating system on servers and other big iron systems such as mainframe computers. It is used by around 2.3% of desktop computers; the Chromebook, which runs the Linux-kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20% of sub-$300 notebook sales in the US. Linux also runs on embedded systems, devices whose operating system is built into the firmware and is highly tailored to the system; this includes TiVo and similar DVR devices, network routers, facility automation controls, and televisions, and many smartphones and tablet computers run Android and other Linux derivatives. The development of Linux is one of the most prominent examples of free and open-source software collaboration: the underlying source code may be used, modified and distributed, commercially or non-commercially, by anyone under the terms of its respective licenses, such as the GNU General Public License. Typically, Linux is packaged in a form known as a Linux distribution for both desktop and server use. Distributions intended to run on servers may omit all graphical environments from the standard install, and because Linux is freely redistributable, anyone may create a distribution for any intended use.
The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy, and others. First released in 1971, Unix was written entirely in assembly language, as was common practice at the time. Later, in a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T was required to license the operating system's source code to anyone who asked; as a result, Unix grew quickly and became widely adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs; freed of the legal obligation requiring free licensing, Bell Labs began selling Unix as a proprietary product. The GNU Project, started in 1983 by Richard Stallman, has the goal of creating a complete Unix-compatible software system composed entirely of free software; later, in 1985, Stallman started the Free Software Foundation. By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers, daemons, and the kernel were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he probably would not have written his own. Although not released until 1992 due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux; Torvalds has also stated that if 386BSD had been available at the time, he probably would not have created Linux. Although the complete source code of MINIX was freely available, its licensing terms prevented it from being free software until the licensing changed in April 2000.

3.
Android (operating system)
–
Android is a mobile operating system developed by Google, based on the Linux kernel and designed primarily for touchscreen mobile devices such as smartphones and tablets. In addition to touchscreen devices, Google has further developed Android TV for televisions and Android Auto for cars. Variants of Android are also used on notebooks, game consoles, and digital cameras. Beginning with the first commercial Android device in September 2008, the operating system has gone through multiple major releases, with the current version being 7.0 "Nougat", released in August 2016. Android applications can be downloaded from the Google Play store, which features over 2.7 million apps as of February 2017. Android has been the best-selling OS on tablets since 2013, and runs on the vast majority of smartphones. In September 2015, Android had 1.4 billion monthly active users. Android is popular with technology companies that require a ready-made, low-cost and customizable operating system for high-tech devices; its success has made it a target for patent litigation. Android Inc. was founded in Palo Alto, California in October 2003 by Andy Rubin, Rich Miner, Nick Sears, and Chris White. Rubin described the Android project as having "tremendous potential in developing smarter mobile devices that are more aware of its owner's location and preferences". The early intentions of the company were to develop an advanced operating system for digital cameras. Despite the past accomplishments of the founders and early employees, Android Inc. operated secretly, and that same year Rubin ran out of money; Steve Perlman, a close friend of Rubin, brought him $10,000 in cash in an envelope. In July 2005, Google acquired Android Inc. for at least $50 million, and its key employees, including Rubin, Miner and White, joined Google as part of the acquisition.
Not much was known about Android at the time, with Rubin having stated only that they were making software for mobile phones. At Google, the team led by Rubin developed a mobile device platform powered by the Linux kernel. Google marketed the platform to handset makers and carriers on the promise of providing a flexible, upgradeable system; it had lined up a series of hardware components and software partners and signaled to carriers that it was open to various degrees of cooperation. Speculation about Google's intention to enter the mobile communications market continued to build through December 2006. In September 2007, InformationWeek covered an Evalueserve study reporting that Google had filed several patent applications in the area of mobile telephony. The first commercially available smartphone running Android was the HTC Dream, also known as the T-Mobile G1, announced on September 23, 2008. Since 2008, Android has seen numerous updates which have incrementally improved the operating system, adding new features. Each major release is named in alphabetical order after a dessert or sugary treat, with the first few Android versions being called "Cupcake", "Donut", and "Eclair". In 2010, Google launched its Nexus series of devices, a lineup in which Google partnered with different device manufacturers to produce new devices and introduce new Android versions.

4.
Amiga
–
The Amiga is a family of personal computers sold by Commodore in the 1980s and 1990s. The Amiga provided a significant upgrade from earlier 8-bit home computers. The first model, the Amiga 1000, was officially released in July 1985, but a series of production problems meant it did not become widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 and became one of the leading home computers of the late 1980s and early 1990s. The A3000, introduced in 1990, started the second generation of Amiga systems, followed by the A500+ and the A600; finally, as the third generation, the A1200 and the A4000 were released in late 1992. The platform became particularly popular for gaming and programming demos, and it also found a prominent role in the desktop video, video production, and show control business, leading to video editing systems such as the Video Toaster. The Amiga's native ability to play back multiple digital sound samples made it a popular platform for early tracker music software. It was also a less expensive alternative to the Apple Macintosh. Initially, the Amiga was developed alongside various Commodore PC clones; Commodore ultimately went bankrupt in April 1994 after the Amiga CD32 model failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, and ACube Systems Srl; likewise, AmigaOS has influenced replacements, clones and compatible systems such as MorphOS, AmigaOS 4 and AROS. As one retrospective put it: "The Amiga was so far ahead of its time that almost nobody, including Commodore's marketing department, could fully articulate what it was all about. Today, it's obvious the Amiga was the first multimedia computer, but in those days it was derided as a game machine because few people grasped the importance of advanced graphics and sound.
Nine years later, vendors are still struggling to make systems that work like 1985 Amigas." Jay Miner joined Atari in the 1970s to develop custom integrated circuits, and led development of the Atari 2600's TIA. Almost as soon as its development was complete, the team began developing a much more sophisticated set of chips: CTIA, ANTIC and POKEY. With the 8-bit line's launch in 1979, Miner again started looking at a next-generation chipset. Miner wanted to start work with the new Motorola 68000, but management was only interested in another MOS 6502 based system; Miner left the company, and the industry, shortly thereafter. In 1982, Larry Kaplan was approached by a number of investors who wanted to develop a new game platform, and Kaplan hired Miner to run the hardware side of the newly formed company. The system was code-named "Lorraine" in keeping with Miner's policy of giving projects female names, in this case the company president's wife. When Kaplan left the company late in 1982 to rejoin Atari, Miner was promoted to head engineer.

5.
Uganda
–
Uganda, officially the Republic of Uganda, is a landlocked country in East Africa. It is bordered to the east by Kenya, to the north by South Sudan, to the west by the Democratic Republic of the Congo, to the south-west by Rwanda, and to the south by Tanzania. Uganda is the world's second most populous landlocked country after Ethiopia. The southern part of the country includes a substantial portion of Lake Victoria, shared with Kenya and Tanzania. Uganda is in the African Great Lakes region, lies within the Nile basin, and has a varied but generally modified equatorial climate. Uganda takes its name from the Buganda kingdom, which encompasses a large portion of the south of the country. The people of Uganda were hunter-gatherers until 1,700 to 2,300 years ago, when Bantu-speaking populations, who were probably from central Africa, migrated to the southern parts of the country. Beginning in 1894, the area was ruled as a protectorate by the British, who established administrative law across the territory. Uganda gained independence from Britain on 9 October 1962. Luganda, a central language, is widely spoken across the country, and several other languages are also spoken, including Runyoro, Runyankole, Rukiga, and Luo. The president of Uganda is Yoweri Museveni, who came to power in January 1986 after a protracted guerrilla war. According to oral tradition, the Empire of Kitara covered an important part of the great lakes area, from the northern lakes Albert and Kyoga to Lake Victoria in the south. Bunyoro-Kitara is claimed as the antecedent of the Buganda, Toro, and Ankole kingdoms. Some Luo invaded the area of Bunyoro and assimilated with the Bantu there, establishing the Babiito dynasty of the current Omukama of Bunyoro-Kitara. Arab traders moved inland from the Indian Ocean coast of East Africa in the 1830s, and they were followed in the 1860s by British explorers searching for the source of the Nile.
British Anglican missionaries arrived in the kingdom of Buganda in 1877 and were followed by French Catholic missionaries in 1879. The British government chartered the Imperial British East Africa Company (IBEAC) to negotiate trade agreements in the region beginning in 1888. From 1886, there were a series of religious wars in Buganda. Because of civil unrest and financial burdens, IBEAC claimed that it was unable to maintain its occupation in the region. In the 1890s, 32,000 labourers from British India were recruited to East Africa under indentured labour contracts to construct the Uganda Railway. Most of the surviving Indians returned home, but 6,724 decided to remain in East Africa after the line's completion; subsequently, some became traders and took control of cotton ginning and sartorial retail. British naval ships unknowingly carried rats infected with bubonic plague, and these rats spread the disease throughout Uganda. From 1900 to 1920, a sleeping sickness epidemic struck the southern part of Uganda, along the north shores of Lake Victoria.

6.
Atari ST
–
The Atari ST is a line of home computers from Atari Corporation and the successor to the Atari 8-bit family. The initial ST model, the 520ST, saw limited release in April–June 1985 and was widely available in July. The Atari ST is the first personal computer to come with a bitmapped color GUI. The 1040ST, released in 1986, is the first personal computer to ship with a megabyte of RAM in the base configuration, and also the first with a cost-per-kilobyte of less than US$1. The Atari ST is part of a generation of home computers that have 16- or 32-bit processors and 256 KiB or more of RAM; this generation includes the Macintosh, Commodore Amiga, Apple IIGS and, in certain markets, the Acorn Archimedes. "ST" officially stands for "Sixteen/Thirty-two", which refers to the Motorola 68000's 16-bit external bus and 32-bit internals. The ST was sold with either Atari's color or monochrome monitor; the system's color graphics modes were available only on the color monitor, while the highest-resolution mode required the monochrome monitor. In some markets, particularly Germany, the machine gained a strong foothold as a small business machine for CAD. Thanks to its built-in MIDI ports, the ST enjoyed success running music-sequencer software. The ST was superseded by the Atari STE, Atari TT, Atari MEGA STE, and Falcon computers. The Atari ST was born from the rivalry between home-computer makers Atari, Inc. and Commodore International. When Jay Miner's proposal for a new 68000-based chipset was rejected, he left Atari to form a small think tank called Hi-Toro in 1982 and began designing the new "Lorraine" chipset. The company, which was later renamed Amiga Corporation, pretended to sell video game controllers to deceive competition while it developed a Lorraine-based computer. Amiga ran out of capital to complete Lorraine's development, and Atari, owned by Warner Communications, paid Amiga to continue the work. In return, Atari received exclusive use of the Lorraine design for one year as a game console.
After one year, Atari would have the right to add a keyboard and market the complete computer. As Atari was heavily involved with Disney at the time, the project was later code-named "Mickey", and the 256K memory expansion board was code-named "Minnie". After leaving Commodore International in January 1984, Jack Tramiel formed Tramel Technology with his sons and other former Commodore employees and, in April, began work on a new computer. The company initially considered the National Semiconductor NS320xx microprocessor, but was disappointed with its performance; this started the move to the 68000. Tramiel learned that Warner wanted to sell Atari, which in mid-1984 was losing about a million dollars per day. Interested in Atari's overseas manufacturing and worldwide distribution network for his new computer, Tramiel negotiated with Warner in May and June 1984. He secured funding and bought Atari's Consumer Division in July. As executives and engineers left Commodore to join Tramiel's new Atari Corporation, Commodore responded by filing lawsuits against four former engineers for theft of trade secrets. Tramiel's company, originally called TTL, was later renamed Atari Corp. At the time of the purchase of Atari Inc.'s assets, there were roughly 900 employees remaining from a high point of 10,000; after interviews, approximately 100 employees were hired to work at Atari Corp. Amiga Corp. had sought more monetary support from investors in spring 1984.

7.
Operating system
–
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require an operating system to function. Operating systems are found on many devices that contain a computer, from cellular phones to supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 83.3%; macOS by Apple Inc. is in second place, and the varieties of Linux are in third position. Linux distributions are dominant in the server and supercomputing sectors. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can run only one program at a time. Multi-tasking may be characterized in preemptive and co-operative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs; Unix-like operating systems, e.g. Solaris and Linux, support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner; 16-bit versions of Microsoft Windows used cooperative multi-tasking, while 32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked and communicate with each other gave rise to distributed computing; distributed computations are carried out on more than one machine, and when computers in a group work in cooperation, they form a distributed system. Templating, the creation of a single virtual-machine image for use as a guest operating system, is used both in virtualization and cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems.
They are designed to operate on small machines like PDAs with less autonomy, and they are able to operate with a limited number of resources. They are very compact and extremely efficient by design; Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing.

8.
MacOS
–
macOS is a series of graphical operating systems developed and marketed by Apple Inc. Within the market of desktop, laptop and home computers, and by web usage, it is the second most widely used desktop OS after Microsoft Windows. Launched in 2001 as Mac OS X, the series is the latest in the family of Macintosh operating systems. Mac OS X succeeded the classic Mac OS, which was introduced in 1984 and whose final release was Mac OS 9 in 1999. An initial, early version of the system, Mac OS X Server 1.0, was released in 1999; the first desktop version, Mac OS X 10.0, followed in March 2001. In 2012, Apple rebranded Mac OS X to OS X. Releases were code-named after big cats from the original release up until OS X 10.8 Mountain Lion; beginning in 2013 with OS X 10.9 Mavericks, releases have been named after landmarks in California. In 2016, Apple rebranded OS X to macOS, adopting the nomenclature that it uses for its other operating systems, iOS, watchOS, and tvOS. The latest version of macOS is macOS 10.12 Sierra. macOS is based on technologies developed at NeXT between 1985 and 1997, when Apple acquired the company. The "X" in Mac OS X and OS X is pronounced "ten". macOS shares its Unix-based core, named Darwin, and many of its frameworks with iOS, tvOS and watchOS. A heavily modified version of Mac OS X 10.4 Tiger was used for the first-generation Apple TV. Apple also used to have a separate line of releases of Mac OS X designed for servers; beginning with Mac OS X 10.7 Lion, the server functions were made available as a separate package on the Mac App Store. Releases of Mac OS X from 1999 to 2005 can run only on the PowerPC-based Macs from that time period. Mac OS X 10.5 Leopard was released as a Universal binary, meaning the installer disc supported both Intel and PowerPC processors. In 2009, Apple released Mac OS X 10.6 Snow Leopard; in 2011, Apple released Mac OS X 10.7 Lion, which no longer supported 32-bit Intel processors and also did not include Rosetta.
All versions of the system released since then run exclusively on 64-bit Intel CPUs. The heritage of what would become macOS originated at NeXT, a company founded by Steve Jobs following his departure from Apple in 1985. There, the Unix-like NeXTSTEP operating system was developed and then launched in 1989; its graphical user interface was built on top of an object-oriented GUI toolkit using the Objective-C programming language. This technology led Apple to purchase NeXT in 1996, allowing NeXTSTEP, then called OPENSTEP, to serve as the basis for Apple's next operating system. Previous Macintosh operating systems were named using Arabic numerals, e.g. Mac OS 8 and Mac OS 9. The letter "X" in Mac OS X's name refers to the number 10, and it is therefore correctly pronounced "ten" /ˈtɛn/ in this context; however, a common mispronunciation is "X" /ˈɛks/. Consumer releases of Mac OS X included more backward compatibility: Mac OS applications could be rewritten to run natively via the Carbon API. The consumer version of Mac OS X was launched in 2001 with Mac OS X 10.0. Reviews were variable, with praise for its sophisticated, glossy Aqua interface.

9.
Graphical user interface
–
GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard. The actions in a GUI are usually performed through direct manipulation of the graphical elements, beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, smartphones and smaller household, office and industrial controls. Designing the visual composition and temporal behavior of a GUI is an important part of application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use for the logical design of a stored program. Methods of user-centered design are used to ensure that the language introduced in the design is well-tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI, typically, users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold. The widgets of an interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller allows a structure in which the interface is independent from and indirectly linked to application functions. This allows users to select or design a different skin at will, good user interface design relates to users more, and to system architecture less. Large widgets, such as windows, usually provide a frame or container for the main presentation content such as a web page, smaller ones usually act as a user-input tool. A GUI may be designed for the requirements of a market as application-specific graphical user interfaces. By the 1990s, cell phones and handheld game systems also employed application specific touchscreen GUIs, newer automobiles use GUIs in their navigation systems and multimedia centers, or navigation multimedia center combinations. 
Sample graphical desktop environments

A GUI uses a combination of technologies and devices to provide a platform that users can interact with; a series of elements conforming to a visual language have evolved to represent information stored in computers. This makes it easier for people with few computer skills to work with a computer. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most often a mouse. Available commands are compiled together in menus, and actions are performed by making gestures with the pointing device. A window manager facilitates the interactions between windows, applications, and the windowing system; the windowing system handles hardware devices such as pointing devices and graphics hardware. Window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants and smartphones typically use the WIMP elements with different unifying metaphors, due to constraints in space.

10.
OS/2
–
OS/2 is a series of computer operating systems, initially created by Microsoft and IBM, then later developed by IBM exclusively. The name stands for "Operating System/2", because it was introduced as part of the same generation-change release as IBM's Personal System/2 line of second-generation personal computers. The first version of OS/2 was released in December 1987, and newer versions were released until December 2001. OS/2 was intended as a protected-mode successor of PC DOS; because of this heritage, OS/2 shares similarities with Unix and Xenix. IBM discontinued its support for OS/2 on 31 December 2006. Since then, it has been updated, maintained and marketed under the name eComStation, and in 2015 it was announced that a new OEM distribution of OS/2, to be called ArcaOS, would be released. The development of OS/2 began when IBM and Microsoft signed the Joint Development Agreement in August 1985. It was code-named CP/DOS, and it took two years for the first product to be delivered. OS/2 1.0 was announced in April 1987 and released in December; the original release is text-mode only, and a GUI was introduced with OS/2 1.1 about a year later. OS/2 features an API for controlling the display and handling the keyboard, and development tools include a subset of the video API. A task-switcher named Program Selector is available through the Ctrl-Esc hotkey combination, allowing the user to select among multitasked text-mode sessions. Communications and database-oriented extensions were delivered in 1988 as part of OS/2 1.0 Extended Edition: SNA, X.25/APPC/LU 6.2, LAN Manager, Query Manager, SQL. The promised graphical user interface, Presentation Manager, was introduced with OS/2 1.1 in October 1988; it had a similar user interface to Windows 2.1, which was released in May of that year.
The Extended Edition of 1.1, sold only through IBM sales channels, introduced distributed database support to IBM database systems. In 1989, version 1.2 introduced installable filesystems and, notably, the HPFS filesystem. HPFS provided a number of improvements over the older FAT file system, including long filenames; in addition, extended attributes were also added to the FAT file system. The Extended Edition of 1.2 introduced TCP/IP and Ethernet support. OS/2- and Windows-related books of the late 1980s acknowledged the existence of both systems and promoted OS/2 as the system for the future. The collaboration between IBM and Microsoft unravelled in 1990, between the releases of Windows 3.0 and OS/2 1.3. During this time, Windows 3.0 became a tremendous success, selling millions of copies in its first year. Much of its success came because Windows 3.0 was bundled with most new computers, while OS/2 was available only as an expensive stand-alone software package. In addition, OS/2 lacked device drivers for many devices such as printers; Windows, on the other hand, supported a much larger variety of hardware

The GNU General Public License (GNU GPL or GPL) is a widely used free software license, which guarantees end users the …

Richard Stallman at the launch of the first draft of the GNU GPLv3 at MIT, Cambridge, Massachusetts, USA. To his right is Columbia Law Professor Eben Moglen, chairman of the Software Freedom Law Center.

Usenet is a worldwide distributed discussion system available on computers. It was developed from the …

Image: Usenet total storage

A visual example of the many complex steps required to prepare data to be uploaded to Usenet newsgroups. These steps must be done again in reverse to download data from Usenet.

A diagram of Usenet servers and clients. The blue, green, and red dots on the servers represent the groups they carry. Arrows between servers indicate newsgroup group exchanges (feeds). Arrows between clients and servers indicate that a user is subscribed to a certain group and reads or submits articles.

A software bug is an error, flaw, failure or fault in a computer program or system that causes it to produce an …

A page from the Harvard Mark II electromechanical computer's log, featuring a dead moth that was removed from the device

The typical bug history (GNU Classpath project data). A new bug submitted by the user is unconfirmed. Once it has been reproduced by a developer, it is a confirmed bug. The confirmed bugs are later fixed. Bugs belonging to other categories (unreproducible, will not be fixed, etc.) are usually in the minority.

Code completion in Qt Creator 5.0: the programmer types some code and, when the software detects a recognizable string such as a variable identifier or class name, it presents a menu containing the complete name of the identified variable or the methods applicable to the detected class. The programmer makes a choice with the mouse or with the keyboard arrow keys; if the programmer continues typing without making a choice, the menu disappears.