In 1967, Ritchie began working at the Bell Labs Computing Sciences Research Center, and in 1968, he defended his PhD thesis on "Program Structure and Computational Complexity" at Harvard under the supervision of Patrick C. Fischer. However, Ritchie never officially received his PhD degree.[8]

During the 1960s, Ritchie and Ken Thompson worked on the Multics operating system at Bell Labs. However, Bell Labs pulled out of the project in 1969. Thompson then found an old PDP-7 machine and developed his own application programs and operating system from scratch, aided by Ritchie and others. In 1970, Brian Kernighan suggested the name "Unix", a pun on the name "Multics".[9] Wanting a system-level programming language to supplement assembly, Thompson created B. B was later superseded by Ritchie's C, and Ritchie continued to contribute to the development of Unix and C for many years.[10]

During the 1970s, Ritchie collaborated with James Reeds and Robert Morris on a ciphertext-only attack on the M-209 US cipher machine that could solve messages of at least 2000–2500 letters.[11] Ritchie relates that, after discussions with the NSA, the authors decided not to publish it, as they were told that the principle was applicable to machines still in use by foreign governments.[11]

Ritchie was also involved with the development of the Plan 9 and Inferno operating systems, and the programming language Limbo.

As part of an AT&T restructuring in the mid-1990s, Ritchie was transferred to Lucent Technologies, where he retired in 2007 as head of the System Software Research Department.[12]

Ritchie is best known as the creator of the C programming language, a key developer of the Unix operating system, and co-author of the book The C Programming Language; he was the 'R' in K&R (a common reference to the book's authors Kernighan and Ritchie). Ritchie worked together with Ken Thompson, who is credited with writing the original version of Unix; one of Ritchie's most important contributions to Unix was its porting to different machines and platforms.[13] They were so influential on Research Unix that Doug McIlroy later wrote, "The names of Ritchie and Thompson may safely be assumed to be attached to almost everything not otherwise attributed."[14]

Ritchie liked to emphasize that he was just one member of a group. He suggested that many of the improvements he introduced simply "looked like a good thing to do," and that anyone else in the same place at the same time might have done the same. But Bjarne Stroustrup, the designer of C++, said, "If Dennis had decided to spend that decade on esoteric math, Unix would have been stillborn."[15]

The C language remains widely used today in application, operating system, and embedded system development, and its influence is seen in most modern programming languages. Unix has also been influential, establishing computing concepts and principles that have been widely adopted.
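A sense of that influence can be had from the most famous program in The C Programming Language, the "hello, world" example that opens the book (given here in modern standard C rather than the book's original pre-ANSI form):

    #include <stdio.h>   /* standard I/O library, declares printf */

    int main(void)
    {
        printf("hello, world\n");   /* write the string, with a trailing newline */
        return 0;                   /* report success to the operating system */
    }

The program's basic shape, a program composed of functions, braces delimiting blocks, and a call into a standard library for output, recurs nearly unchanged in C++, Java, Go, and many of C's other descendants.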

In an interview from 1999, Ritchie clarified that he saw Linux and BSD operating systems as a continuation of the basis of the Unix operating system, and as derivatives of Unix:[16]

I think the Linux phenomenon is quite delightful, because it draws so strongly on the basis that Unix provided. Linux seems to be among the healthiest of the direct Unix derivatives, though there are also the various BSD systems as well as the more official offerings from the workstation and mainframe manufacturers.

In the same interview, he stated that he viewed both Unix and Linux as "the continuation of ideas that were started by Ken and me and many others, many years ago."[16]

In 1983, Ritchie and Thompson received the Turing Award for their development of generic operating systems theory and specifically for the implementation of the UNIX operating system. Ritchie's Turing Award lecture was titled "Reflections on Software Research".[17] In 1990, both Ritchie and Thompson received the IEEE Richard W. Hamming Medal from the Institute of Electrical and Electronics Engineers (IEEE), "for the origination of the UNIX operating system and the C programming language".[18]

In 1997, both Ritchie and Thompson were made Fellows of the Computer History Museum, "for co-creation of the UNIX operating system, and for development of the C programming language."[19]

On April 21, 1999, Thompson and Ritchie jointly received the National Medal of Technology of 1998 from President Bill Clinton for co-inventing the UNIX operating system and the C programming language which, according to the citation for the medal, "led to enormous advances in computer hardware, software, and networking systems and stimulated growth of an entire industry, thereby enhancing American leadership in the Information Age".[20][21]

Computer historian Paul E. Ceruzzi said after Ritchie's death: "Ritchie was under the radar. His name was not a household name at all, but... if you had a microscope and could look in a computer, you'd see his work everywhere inside."

In an interview shortly after Ritchie's death, longtime colleague Brian Kernighan said Ritchie never expected C to be so significant.[29] Kernighan told The New York Times, "The tools that Dennis built—and their direct descendants—run pretty much everything today."[30] Kernighan reminded readers of how important a role C and Unix had played in the development of later high-profile projects, such as the iPhone.[31][32] Other testimonials to his influence followed.[33][34][35][36]

At his death, a commentator compared the relative importance of Steve Jobs and Ritchie, concluding that "[Ritchie's] work played a key role in spawning the technological revolution of the last forty years—including technology on which Apple went on to build its fortune."[37] Another commentator said, "Ritchie, on the other hand, invented and co-invented two key software technologies which make up the DNA of effectively every single computer software product we use directly or even indirectly in the modern age. It sounds like a wild claim, but it really is true."[38] Another said, "many in computer science and related fields knew of Ritchie’s importance to the growth and development of, well, everything to do with computing,..."[39]

Lohr, Steve (October 12, 2011). "Dennis Ritchie, Programming Trailblazer, Dies at 70". The New York Times. Retrieved October 13, 2011. "Dennis M. Ritchie, who helped shape the modern digital era by creating software tools that power things as diverse as search engines like Google and smartphones, was found dead on Wednesday at his home in Berkeley Heights, N.J. He was 70. Mr. Ritchie, who lived alone, was in frail health in recent years after treatment for prostate cancer and heart disease, said his brother Bill."

^ abc"Unix creator Dennis Ritchie dies aged 70". BBC News. October 13, 2011. Retrieved October 14, 2011. Pioneering computer scientist Dennis Ritchie has died after a long illness. ... The first news of Dr Ritchie's death came via Rob Pike, a former colleague who worked with him at Bell Labs. Mr Ritchie's passing was then confirmed in a statement from Alcatel-Lucent which now owns Bell Labs.

Pike, Rob (October 12, 2011). Untitled post to Google+. Retrieved October 14, 2011. "I just heard that, after a long illness, Dennis Ritchie (dmr) died at home this weekend. I have no more information."

Prasad, Shishir (November 4, 2011). "No one thought 'C' would become so big: Brian Kernighan". Forbes India. Retrieved November 28, 2011. "Q: Did Dennis Ritchie or you ever think C would become so popular? [Kernighan] I don't think that at the time Dennis worked on Unix and C anyone thought these would become as big as they did. Unix, at that time, was a research project inside Bell Labs."

^"Myths of Steve Jobs". Deccan Herald. November 28, 2011. Retrieved November 28, 2011. Dennis Ritchie, the inventor of the C language and co-inventor of the Unix operating system, died a few days after Steve Jobs. He was far more influential than Jobs.

Cardinal, David (November 2, 2011). "Dennis Ritchie, creator of C, bids 'goodbye, world'". ExtremeTech. Retrieved November 28, 2011. "The book came off the shelf in service of teaching another generation a simple, elegant way to program that allows the developer to be directly in touch with the innards of the computer. The lowly integer variable—int—has grown in size over the years as computers have grown, but the C language and its sparse, clean, coding style live on. For that we all owe a lot to Dennis Ritchie."

^"The Strange Birth and Long Life of Unix". Newswise. November 23, 2011. Retrieved November 28, 2011. Four decades ago, Ken Thompson, the late Dennis Ritchie, and others at AT&T's Bell Laboratories developed Unix, which turned out to be one of the most influential pieces of software ever written. Their work on this operating system had to be done on the sly, though, because their employer had recently backed away from operating-systems research.

1.
Bronxville, New York
–
Bronxville /ˈbrɒŋksvɪl/ is a suburban village in Westchester County, New York, located about 15 miles north of midtown Manhattan. It is part of the town of Eastchester, the village comprises 1 square mile of land in its entirety, approximately 20% of the town of Eastchester. As of the 2010 U. S. census, Bronxville had a population of 6,323, as of 2014, it was ranked 18th in the state in median income. The area, once known as Underhills Crossing, became Bronxville when the village was formally established, the population grew in the second half of the 19th century when railroads allowed commuters from Westchester County to work in New York City. Lawrences influence can be throughout the community, including the historic Lawrence Park neighborhood, the Houlihan Lawrence Real Estate Corporation. John F Kennedy, the president of the United States, also resided here for a time, the village was home to an arts colony in the early 20th century during which time many noteworthy houses by prominent and casual architects were built. After the Bronx River Parkway was completed in 1925, the Village expanded rapidly with the construction of apartment buildings. As of 1959, they continued to own or manage 97% of the rental market, in both rentals and ownership, the village discouraged and effectively prohibited Jewish residency, earning the name The Holy Square Mile. The Gramatan Hotel on Sunset Hill was a hotel in the late 19th century. Gramatan was the name of the chief of the local Siwanoy Indian tribe that was centered in the Gramatan Rock area above Bronxville Station, chief Gramatan sold the land to the settlers. The hotel was demolished in 1970, and a complex of townhouses was built on the site in 1980, elizabeth Clift Bacon, General George Armstrong Custers widow, lived in Bronxville, and her house still stands to this day. St. Josephs Catholic Church, located in the area, was attended by the Kennedys when they were residents from 1929 to about 1936. In 1958 future-senator Ted Kennedy married Joan Bennett in St. Josephs Church, in 1960, the Village voted 5,1 for Nixon over Kennedy. The US Post Office–Bronxville was listed on the National Register of Historic Places in 1988, other sites on the National Register are the Bronxville Womens Club, Lawrence Park Historic District, and Masterton-Dusenberry House. As of the 2000 census, there were 6,543 people,2,312 households and 1,660 families residing in the village, the population density was 6,869.3 per square mile. There were 2,387 housing units at a density of 2,506.0 per square mile. The racial makeup of the village was 91. 88% White,1. 15% African American,0. 05% Native American,4. 83% Asian,0. 06% Pacific Islander,0. 73% from other races, and 1. 30% from two or more races. Hispanic or Latino of any race were 2. 93% of the population,24. 3% of all households were made up of individuals and 11. 4% had someone living alone who was 65 years of age or older

2.
Berkeley Heights, New Jersey
–
Berkeley Heights is a township in Union County, New Jersey, United States. New Providence Township became part of the newly formed Union County at its creation on March 19,1857, portions of the township were taken on March 23,1869, to create Summit, and on March 14,1899, to form the borough of New Providence. On November 6,1951, the name of the township was changed to Berkeley Heights, the township was named for John Berkeley, 1st Baron Berkeley of Stratton, one of the founders of the Province of New Jersey. In Money magazines 2013 Best Places to Live rankings, Berkeley Heights was ranked 6th in the nation, the magazines 2007 list had the township ranked 45th out of a potential 2,800 places in the United States with populations above 7,500 and under 50,000. In its 2010 rankings of the Best Places to Live, New Jersey Monthly magazine ranked Berkeley Heights as the 19th best place to live in New Jersey. In its 2008 rankings of the Best Places To Live New Jersey Monthly magazine ranked Berkeley Heights as the 59th best place to live in New Jersey. The earliest construction in Berkeley Heights began in an area that is now part of the 1,960 acres Watchung Reservation, the first European settler was Peter Willcox, who received a 424 acres land grant in 1720 from the Elizabethtown Associates. This group bought much of northern New Jersey from the Lenape in the late 17th century, Willcox built a grist and lumber mill across Green Brook. In 1793, a government was formed. It encompassed the area from present-day Springfield Township, Summit, New Providence, and Berkeley Heights, growth continued in the area, and by 1809, Springfield Township divided into Springfield Township and New Providence Township, which included present day Summit, New Providence, and Berkeley Heights. In 1845, Willcoxs heirs sold the mill to David Felt, Felt built a small village around the mill aptly named Feltville. It included homes for workers and their families, dormitories, orchards, a post office, in 1860, Feltville was sold to sarsaparilla makers. Other manufacturing operations continued until Feltville went into bankruptcy in 1882, when residents moved away, the area became known as Deserted Village. Village remains consist of seven houses, a store, the mill, Deserted Village is listed on the National Register of Historic Places and is undergoing restoration by the Union County Parks Department. Restoration grants of almost $2 million were received from state agencies. Deserted Village, in the Watchung Reservation, is open daily for unguided walking tours during daylight hours, on March 23,1869, Summit Township seceded from New Providence Township. On March 14,1899, the Borough of New Providence seceded from New Providence Township, present day Berkeley Heights remained as New Providence Township. Among the exhibits are a Victorian master bedroom and a Victorian childrens room, the childrens room also has reproductions of antique toys, which visitors can play with

3.
Harvard University
–
Although never formally affiliated with any denomination, the early College primarily trained Congregationalist and Unitarian clergy. Its curriculum and student body were gradually secularized during the 18th century, james Bryant Conant led the university through the Great Depression and World War II and began to reform the curriculum and liberalize admissions after the war. The undergraduate college became coeducational after its 1977 merger with Radcliffe College, Harvards $34.5 billion financial endowment is the largest of any academic institution. Harvard is a large, highly residential research university, the nominal cost of attendance is high, but the Universitys large endowment allows it to offer generous financial aid packages. Harvards alumni include eight U. S. presidents, several heads of state,62 living billionaires,359 Rhodes Scholars. To date, some 130 Nobel laureates,18 Fields Medalists, Harvard was formed in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. In 1638, it obtained British North Americas first known printing press, in 1639 it was named Harvard College after deceased clergyman John Harvard an alumnus of the University of Cambridge who had left the school £779 and his scholars library of some 400 volumes. The charter creating the Harvard Corporation was granted in 1650 and it offered a classic curriculum on the English university model‍—‌many leaders in the colony had attended the University of Cambridge‍—‌but conformed to the tenets of Puritanism. It was never affiliated with any denomination, but many of its earliest graduates went on to become clergymen in Congregational. The leading Boston divine Increase Mather served as president from 1685 to 1701, in 1708, John Leverett became the first president who was not also a clergyman, which marked a turning of the college toward intellectual independence from Puritanism. When the Hollis Professor of Divinity David Tappan died in 1803 and the president of Harvard Joseph Willard died a year later, in 1804, in 1846, the natural history lectures of Louis Agassiz were acclaimed both in New York and on the campus at Harvard College. Agassizs approach was distinctly idealist and posited Americans participation in the Divine Nature, agassizs perspective on science combined observation with intuition and the assumption that a person can grasp the divine plan in all phenomena. When it came to explaining life-forms, Agassiz resorted to matters of shape based on an archetype for his evidence. Charles W. Eliot, president 1869–1909, eliminated the position of Christianity from the curriculum while opening it to student self-direction. While Eliot was the most crucial figure in the secularization of American higher education, he was motivated not by a desire to secularize education, during the 20th century, Harvards international reputation grew as a burgeoning endowment and prominent professors expanded the universitys scope. Rapid enrollment growth continued as new schools were begun and the undergraduate College expanded. Radcliffe College, established in 1879 as sister school of Harvard College, Harvard became a founding member of the Association of American Universities in 1900. In the early 20th century, the student body was predominately old-stock, high-status Protestants, especially Episcopalians, Congregationalists, by the 1970s it was much more diversified

5.
C (programming language)
–
C was originally developed by Dennis Ritchie between 1969 and 1973 at Bell Labs, and used to re-implement the Unix operating system. C has been standardized by the American National Standards Institute since 1989, C is an imperative procedural language. Therefore, C was useful for applications that had formerly been coded in assembly language. Despite its low-level capabilities, the language was designed to encourage cross-platform programming, a standards-compliant and portably written C program can be compiled for a very wide variety of computer platforms and operating systems with few changes to its source code. The language has become available on a wide range of platforms. In C, all code is contained within subroutines, which are called functions. Function parameters are passed by value. Pass-by-reference is simulated in C by explicitly passing pointer values, C program source text is free-format, using the semicolon as a statement terminator and curly braces for grouping blocks of statements. The C language also exhibits the characteristics, There is a small, fixed number of keywords, including a full set of flow of control primitives, for, if/else, while, switch. User-defined names are not distinguished from keywords by any kind of sigil, There are a large number of arithmetical and logical operators, such as +, +=, ++, &, ~, etc. More than one assignment may be performed in a single statement, function return values can be ignored when not needed. Typing is static, but weakly enforced, all data has a type, C has no define keyword, instead, a statement beginning with the name of a type is taken as a declaration. There is no function keyword, instead, a function is indicated by the parentheses of an argument list, user-defined and compound types are possible. Heterogeneous aggregate data types allow related data elements to be accessed and assigned as a unit, array indexing is a secondary notation, defined in terms of pointer arithmetic. Unlike structs, arrays are not first-class objects, they cannot be assigned or compared using single built-in operators, There is no array keyword, in use or definition, instead, square brackets indicate arrays syntactically, for example month. Enumerated types are possible with the enum keyword and they are not tagged, and are freely interconvertible with integers. Strings are not a data type, but are conventionally implemented as null-terminated arrays of characters. Low-level access to memory is possible by converting machine addresses to typed pointers

6.
Unix
–
Among these is Apples macOS, which is the Unix version with the largest installed base as of 2014. Many Unix-like operating systems have arisen over the years, of which Linux is the most popular, Unix was originally meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmer users. The system grew larger as the system started spreading in academic circles, as users added their own tools to the system. Unix was designed to be portable, multi-tasking and multi-user in a time-sharing configuration and these concepts are collectively known as the Unix philosophy. By the early 1980s users began seeing Unix as a universal operating system. Under Unix, the system consists of many utilities along with the master control program. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, the microkernel concept was introduced in an effort to reverse the trend towards larger kernels and return to a system in which most tasks were completed by smaller utilities. In an era when a standard computer consisted of a disk for storage and a data terminal for input and output. However, modern systems include networking and other new devices, as graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores. In microkernel implementations, functions such as network protocols could be moved out of the kernel, Multics introduced many innovations, but had many problems. Frustrated by the size and complexity of Multics but not by the aims and their last researchers to leave Multics, Ken Thompson, Dennis Ritchie, M. D. McIlroy, and J. F. Ossanna, decided to redo the work on a much smaller scale. The name Unics, a pun on Multics, was suggested for the project in 1970. Peter H. Salus credits Peter Neumann with the pun, while Brian Kernighan claims the coining for himself, in 1972, Unix was rewritten in the C programming language. Bell Labs produced several versions of Unix that are referred to as Research Unix. In 1975, the first source license for UNIX was sold to faculty at the University of Illinois Department of Computer Science, UIUC graduate student Greg Chesson was instrumental in negotiating the terms of this license. During the late 1970s and early 1980s, the influence of Unix in academic circles led to adoption of Unix by commercial startups, including Sequent, HP-UX, Solaris, AIX. In the late 1980s, AT&T Unix System Laboratories and Sun Microsystems developed System V Release 4, in the 1990s, Unix-like systems grew in popularity as Linux and BSD distributions were developed through collaboration by a worldwide network of programmers

7.
National Medal of Technology and Innovation
–
The award may be granted to a specific person, to a group of people or to an entire organization or corporation. It is the highest honor the United States can confer to a US citizen for achievements related to technological progress, the National Medal of Technology was created in 1980 by the United States Congress under the Stevenson-Wydler Technology Innovation Act. It was an effort to foster technological innovation and the technological competitiveness of the United States in the international arena. The first National Medals of Technology were issued in 1985 by then-U. S, President Ronald Reagan to 12 individuals and one company. Among the first recipients were Steve Jobs and Stephen Wozniak, founders of Apple Computer, the medal has since been awarded annually. On August 9,2007, President George Bush signed the America COMPETES Act of 2007, the Act amended Section 16 of the Stevenson-Wydler Technology Innovation Act of 1980, changing the name of the Medal to the National Medal of Technology and Innovation. Each year the Technology Administration under the U. S. Department of Commerce calls for the nomination of new candidates for the National Medal of Technology, candidates are nominated by their peers who have direct, first-hand knowledge of the candidates achievements. Candidates may be individuals, teams of individuals, organizations or corporations, individuals and all members of teams nominated must be U. S. citizens and organizations and corporations must be U. S. -owned. All nominations are referred to the National Medal of Technology Evaluation Committee which issues recommendations to the U. S. Secretary of Commerce, all nominees selected as finalists through the merit review process will be subject to an FBI security check. Information collected through the security check may be considered in the selection of winners. The Secretary of Commerce is then able to advise the President of the United States as to which candidates ought to receive the National Medal of Technology. The new National Medal of Technology laureates are announced by the U. S. President once the final selections have been made. As of 2005, there have more than 135 individuals and 12 companies recognized. Summarized here is a list of laureates and a summary of their accomplishments

8.
Computer History Museum
–
The Computer History Museum is a museum established in 1996 in Mountain View, California, US. The Museum is dedicated to preserving and presenting the stories and artifacts of the information age, the museums origins date to 1968 when Gordon Bell began a quest for a historical collection and, at that same time, others were looking to preserve the Whirlwind computer. The resulting Museum Project had its first exhibit in 1975, located in a coat closet in a DEC lobby. In 1978, the museum, now The Digital Computer Museum, moved to a larger DEC lobby in Marlborough, maurice Wilkes presented the first lecture at TDCM in 1979 – the presentation of such lectures has continued to the present time. TDCM incorporated as The Computer Museum in 1982, in 1984, TCM moved to Boston, locating on Museum Wharf. In 1996/1997, The TCM History Center in Silicon Valley was established, a site at Moffett Field was provided by NASA, in 1999, TCMHC incorporated and TCM ceased operation, shipping its remaining artifacts to TCMHC in 2000. The name TCM had been retained by the Boston Museum of Science so, in 2000, in 2003, CHM opened its new building, at 1401 N. Shoreline Blvd in Mountain View, California, to the public. The facility was heavily renovated and underwent a two-year $19 million makeover before reopening on January 2011. The Computer History Museum claims to house the largest and most significant collection of computing artifacts in the world, the collection comprises nearly 90,000 objects, photographs and films, as well as 4,000 feet of cataloged documentation and several hundred gigabytes of software. The CHM oral history program conducts video interviews around the history of computing and networking, the museums 25, 000-square-foot exhibit Revolution, The First 2000 Years of Computing, opened to the public on January 13,2011. It covers the history of computing in 20 galleries, from the abacus to the Internet, the entire exhibition is also available online. The museum features a Liquid Galaxy in the “Going Places, A History of Silicon Valley” exhibit, the exhibit features 20 preselected locations that visitors can fly to on the Liquid Galaxy. An operating Difference Engine designed by Charles Babbage in the 1840s and it had been on loan since 2008 from its owner, Nathan Myhrvold, a former Microsoft executive. Former media executive John Hollar was appointed CEO of The Computer History Museum in July 2008, in 2012 the APL programming language followed. In February 2013 Adobe Systems, Inc. donated the Photoshop 1.0.1 source code to the collection, on October 21,2014, Xerox Altos source code and other resources followed. The CHM Fellows are exceptional men and women whose ideas have changed the world, the first fellow was Rear Admiral Grace Hopper in 1987. The fellows program has grown to 70 members as of 2015, vintage Computer Festival held annually at The Computer History Museum Computer museums History of computing History of computer science Bell, Gordon. Out of a Closet, The Early Years of the Computer * Museum, official website Computer History Museums channel on YouTube The Computer Museum Archive

9.
Japan Prize
–
The Prize is presented by the Japan Prize Foundation. Since its inception in 1985, the Foundation has awarded 81 people from 13 countries, the Japan Prize consists of a certificate, a commemorative medal and a cash award of ¥50 million. No discrimination is made as to nationality, occupation, race, only living persons may be named. Every November, the Japan Prize Foundation selects two fields for the award according to current trends in science and technology, the nomination and selection process takes about one year. The laureates, one from each field, are announced in January, the prestigious prize presentation ceremony is held in the presence of the Emperor of Japan and the Empress. The 2014 Japan Prize Presentation Ceremony was held on April 23 at the National Theatre in Tokyo, at present the international prize is often considered one of the most prestigious awards in science and technology fields after the Nobel Prize. The creation of the Japan Prize was motivated by the desire to express Japans gratitude to international society and this plan was supported with the funds donated by Konosuke Matsushita, the founder of Panasonic Corporation. He was the first chairman of the Japan Prize Preparatory Foundation, peace and prosperity for mankind have been my lifelong desires. I am extremely pleased, therefore, that the Japan Prize has been established with the goal of making some contribution on behalf of Japan to the development of international society. The progress of science and technology has been phenomenal. It is not overstating its role to say that we owe the civilization we enjoy today to this very progress. On the other hand, however, there are many global problems which remain to be solved. It is my hope that the Japan Prize achieves the recognition it deserves. In 1982 the Japan Prize Preparatory Foundation is established and then the establishment of the Japan Prize is endorsed by the Cabinet, in 1985 The 1st Japan Prize Presentation Ceremony is held in Tokyo

10.
Computer science
–
Computer science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. An alternate, more succinct definition of science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems and its fields can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory, are highly abstract, other fields still focus on challenges in implementing computation. Human–computer interaction considers the challenges in making computers and computations useful, usable, the earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623, in 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. He may be considered the first computer scientist and information theorist, for, among other reasons and he started developing this machine in 1834, and in less than two years, he had sketched out many of the salient features of the modern computer. A crucial step was the adoption of a card system derived from the Jacquard loom making it infinitely programmable. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information, when the machine was finished, some hailed it as Babbages dream come true. During the 1940s, as new and more powerful computing machines were developed, as it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as an academic discipline in the 1950s. The worlds first computer science program, the Cambridge Diploma in Computer Science. The first computer science program in the United States was formed at Purdue University in 1962. Since practical computers became available, many applications of computing have become distinct areas of study in their own rights and it is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM704 and later the IBM709 computers, still, working with the IBM was frustrating if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again. During the late 1950s, the science discipline was very much in its developmental stages. Time has seen significant improvements in the usability and effectiveness of computing technology, modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals, to a near-ubiquitous user base

11.
Lucent
–
Lucent Technologies, Inc. was an American multinational telecommunications equipment company headquartered in Murray Hill, New Jersey, in the United States. It was established on September 30,1996, through the divestiture of the former AT&T Technologies business unit of AT&T Corporation, Lucent was merged with Alcatel SA of France in a merger of equals on December 1,2006, forming Alcatel-Lucent. Alcatel-Lucent was absorbed by Nokia in January 2016, the name was applied in 1996 at the time of the split from AT&T. The name was criticised, as the logo was to be. This same linguistic root also gives Lucifer, the light bearer, shortly after the Lucent renaming in 1996, Lucents Plan 9 project released a development of their work as the Inferno OS in 1997. This extended the Lucifer and Dante references as a series of punning names for the components of Inferno - Dis, Limbo, Charon, when the rights to Inferno were sold in 2000, the company Vita Nuova Holdings was formed to represent them. This continues the Dante theme, although moving away from his Divine Comedy to the poem La Vita Nuova, the Lucent logo, the Innovation Ring, was designed by Landor Associates, a prominent San Francisco-based branding consultancy. One source inside Lucent says that the logo is actually a Zen Buddhist symbol for eternal truth, another source says it represents the mythic ouroboros, a snake holding its tail in its mouth. Lucents logo also has said to represent constant re-creating and re-thinking. Carly Fiorina picked the logo because her mother was a painter, a telecommunication commentator referred to the logo as a big red zero and predicted financial losses. Bell Labs brought prestige to the new company, as well as the revenue from thousands of patents, Lucent also operated from 666 Fifth Avenue in Manhattan, New York City. At the time of its spinoff, Lucent was placed under the leadership of Henry Schacht, richard McGinn, who was serving as President and COO, succeeded Schacht as CEO in 1997 while Schacht remained chairman of the board. Lucent became a stock of the investment community in the late 1990s. Its market capitalization reached a high of $258 billion, and it was at the time the most widely held company with 5.3 million shareholders. In 1997, Lucent acquired Milpitas-based voice mail market leader Octel Communications Corporation for $2.1 billion, by 1999 Lucent stock continued to soar and in that year Lucent acquired Ascend Communications, an Alameda, California–based manufacturer of communications equipment for US$24 billion. Lucent held discussions to acquire Juniper Networks but decided instead to build its own routers internally, in 1995, Carly Fiorina led corporate operations. In that capacity, she reported to Lucent chief executive Henry B and she played a key role in planning and implementing the 1996 initial public offering of a successful stock and company launch strategy. Under her guidance, the spin-off became one of the most successful IPOs in U. S. history, later in 1996, Fiorina was appointed president of Lucents consumer products sector, reporting to president and chief operating officer Rich McGinn

12.
Bell Labs
–
Nokia Bell Labs is an American research and scientific development company, owned by Finnish company Nokia. Its headquarters are located in Murray Hill, New Jersey, in addition to laboratories around the rest of the United States. The historic laboratory originated in the late 19th century as the Volta Laboratory, Bell Labs was also at one time a division of the American Telephone & Telegraph Company, half-owned through its Western Electric manufacturing subsidiary. Eight Nobel Prizes have been awarded for work completed at Bell Laboratories, in 1880, the French government awarded Alexander Graham Bell the Volta Prize of 50,000 francs, approximately US$10,000 at that time for the invention of the telephone. Bell used the award to fund the Volta Laboratory in Washington, D. C. in collaboration with Sumner Tainter, the laboratory is also variously known as the Volta Bureau, the Bell Carriage House, the Bell Laboratory and the Volta Laboratory. The laboratory focused on the analysis, recording, and transmission of sound, Bell used his considerable profits from the laboratory for further research and education to permit the diffusion of knowledge relating to the deaf. This resulted in the founding of the Volta Bureau c,1887, located at Bells fathers house at 1527 35th Street in Washington, D. C. where its carriage house became their headquarters in 1889. In 1893, Bell constructed a new building, close by at 1537 35th St. specifically to house the lab, the building was declared a National Historic Landmark in 1972. In 1884, the American Bell Telephone Company created the Mechanical Department from the Electrical, the first president of research was Frank B. Jewett, who stayed there until 1940, ownership of Bell Laboratories was evenly split between AT&T and the Western Electric Company. Its principal work was to plan, design, and support the equipment that Western Electric built for Bell System operating companies and this included everything from telephones, telephone exchange switches, and transmission equipment. Bell Labs also carried out consulting work for the Bell Telephone Company, a few workers were assigned to basic research, and this attracted much attention, especially since they produced several Nobel Prize winners. Until the 1940s, the principal locations were in and around the Bell Labs Building in New York City. Of these, Murray Hill and Crawford Hill remain in existence, the largest grouping of people in the company was in Illinois, at Naperville-Lisle, in the Chicago area, which had the largest concentration of employees prior to 2001. Since 2001, many of the locations have been scaled down or closed. The Holmdel site, a 1.9 million square foot structure set on 473 acres, was closed in 2007, the mirrored-glass building was designed by Eero Saarinen. In August 2013, Somerset Development bought the building, intending to redevelop it into a commercial and residential project. The prospects of success are clouded by the difficulty of readapting Saarinens design and by the current glut of aging, eight Nobel Prizes have been awarded for work completed at Bell Laboratories

13.
Ken Thompson
–
Kenneth Lane Ken Thompson, commonly referred to as ken in hacker circles, is an American pioneer of computer science. Having worked at Bell Labs for most of his career, Thompson designed and implemented the original Unix operating system. He also invented the B programming language, the predecessor to the C programming language. Since 2006, Thompson has worked at Google, where he co-invented the Go programming language, Thompson was born in New Orleans. When asked how he learned to program, Thompson stated, I was always fascinated with logic and even in grade school Id work on problems in binary. Thompson was hired by Bell Labs in 1966, in the 1960s at Bell Labs, Thompson and Dennis Ritchie worked on the Multics operating system. While writing Multics, Thompson created the Bon programming language, and he also created a video game called Space Travel. Later on Bell Labs withdrew from the MULTICS project, in order to go on playing the game, Thompson found an old PDP-7 machine and rewrote Space Travel on it. In 1970, Brian Kernighan suggested the name Unix, in a somewhat treacherous pun on the name Multics, after initial work on Unix, Thompson decided that Unix needed a system programming language and created B, a precursor to Ritchies C. In the 1960s, Thompson also began work on regular expressions, Thompson had developed the CTSS version of the editor QED, which included regular expressions for searching text. QED and Thompsons later editor ed contributed greatly to the popularity of regular expressions. Almost all programs that work with regular expressions today use some variant of Thompsons notation and he also invented Thompsons construction algorithm used for converting regular expression into nondeterministic finite automaton in order to make expression matching faster. Then there was a rewrite in a language that would come to be called C. He worked mostly on the language and on the I/O system and that was for the PDP-11, which was serendipitous, because that was the computer that took over the academic community. Feedback from Thompsons Unix development was instrumental in the development of the C programming language. Thompson would later say that the C language grew up one of the rewritings of the system and, as such. In 1975, Thompson took a sabbatical from Bell Labs and went to his alma mater, there, he helped to install Version 6 Unix on a PDP-11/70. Unix at Berkeley would later become maintained as its own system, along with Joseph Condon, Thompson created the hardware and software for Belle, a world champion chess computer

14.
Operating system
–
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require a system to function. Operating systems are found on many devices that contain a computer – from cellular phones, the dominant desktop operating system is Microsoft Windows with a market share of around 83. 3%. MacOS by Apple Inc. is in place, and the varieties of Linux is in third position. Linux distributions are dominant in the server and supercomputing sectors, other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can run one program at a time. Multi-tasking may be characterized in preemptive and co-operative types, in preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, e. g. Solaris, Linux, cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking, 32-bit versions of both Windows NT and Win9x, used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem, a distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked and communicate with each other gave rise to distributed computing, distributed computations are carried out on more than one machine. When computers in a work in cooperation, they form a distributed system. The technique is used both in virtualization and cloud computing management, and is common in large server warehouses, embedded operating systems are designed to be used in embedded computer systems. They are designed to operate on small machines like PDAs with less autonomy and they are able to operate with a limited number of resources. They are very compact and extremely efficient by design, Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is a system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could run different programs in succession to speed up processing

15.
Association for Computing Machinery
–
The Association for Computing Machinery is an international learned society for computing. It was founded in 1947 and is the worlds largest scientific and it is a not-for-profit professional membership group. Its membership is more than 100,000 as of 2011 and its headquarters are in New York City. The ACM is an organization for academic and scholarly interests in computer science. Its motto is Advancing Computing as a Science & Profession, ACM is organized into over 171 local chapters and 37 Special Interest Groups, through which it conducts most of its activities. Additionally, there are over 500 college and university chapters, the first student chapter was founded in 1961 at the University of Louisiana at Lafayette. Many of the SIGs, such as SIGGRAPH, SIGPLAN, SIGCSE and SIGCOMM, sponsor regular conferences, the groups also publish a large number of specialized journals, magazines, and newsletters. ACM publishes over 50 journals including the prestigious Journal of the ACM, other publications of the ACM include, ACM XRDS, formerly Crossroads, was redesigned in 2010 and is the most popular student computing magazine in the US. ACM Interactions, an interdisciplinary HCI publication focused on the connections between experiences, people and technology, and the third largest ACM publication, ACM Computing Surveys ACM Computers in Entertainment A number of journals, specific to subfields of computer science, titled ACM Transactions. ACM has made almost all of its publications available to subscribers online at its Digital Library. Individual members additionally have access to Safari Books Online and Books24x7, ACM also offers insurance, online courses, and other services to its members. In 1997, ACM Press published Wizards and Their Wonders, Portraits in Computing, written by Christopher Morgan, the book is a collection of historic and current portrait photographs of figures from the computer industry. The ACM Portal is a service of the ACM. Its core are two sections, ACM Digital Library and the ACM Guide to Computing Literature. The ACM Digital Library is the collection of all articles published by the ACM in its articles, magazines. The Guide is a bibliography in computing with over one million entries, the ACM Digital Library contains a comprehensive archive starting in the 1950s of the organizations journals, magazines, newsletters and conference proceedings. Online services include a forum called Ubiquity and Tech News digest, there is an extensive underlying bibliographic database containing key works of all genres from all major publishers of computing literature. This secondary database is a rich discovery service known as The ACM Guide to Computing Literature, ACM adopted a hybrid Open Access publishing model in 2013

16.
Institute of Electrical and Electronics Engineers
–
The Institute of Electrical and Electronics Engineers is a professional association with its corporate office in New York City and its operations center in Piscataway, New Jersey. It was formed in 1963 from the amalgamation of the American Institute of Electrical Engineers, today, it is the worlds largest association of technical professionals with more than 400,000 members in chapters around the world. Its objectives are the educational and technical advancement of electrical and electronic engineering, telecommunications, computer engineering, IEEE stands for the Institute of Electrical and Electronics Engineers. The association is chartered under this full legal name, IEEEs membership has long been composed of engineers and scientists. For this reason the organization no longer goes by the name, except on legal business documents. The IEEE is dedicated to advancing technological innovation and excellence and it has about 430,000 members in about 160 countries, slightly less than half of whom reside in the United States. The major interests of the AIEE were wire communications and light, the IRE concerned mostly radio engineering, and was formed from two smaller organizations, the Society of Wireless and Telegraph Engineers and the Wireless Institute. After World War II, the two became increasingly competitive, and in 1961, the leadership of both the IRE and the AIEE resolved to consolidate the two organizations. The two organizations merged as the IEEE on January 1,1963. The IEEE is incorporated under the Not-for-Profit Corporation Law of the state of New York and it was formed in 1963 by the merger of the Institute of Radio Engineers and the American Institute of Electrical Engineers. The IEEE serves as a publisher of scientific journals and organizer of conferences, workshops. IEEE develops and participates in activities such as accreditation of electrical engineering programs in institutes of higher learning. The IEEE logo is a design which illustrates the right hand grip rule embedded in Benjamin Franklins kite. IEEE has a dual complementary regional and technical structure – with organizational units based on geography and it manages a separate organizational unit which recommends policies and implements programs specifically intended to benefit the members, the profession and the public in the United States. The IEEE includes 39 technical Societies, organized around specialized technical fields, the IEEE Standards Association is in charge of the standardization activities of the IEEE. The IEEE History Center became an organization to the Engineering. The new ETHW is an effort by various engineering societies as a formal repository of topic articles, oral histories, first-hand histories, Landmarks + Milestones. The IEEE History Center is annexed to Stevens University Hoboken, NJ, in 2016, the IEEE acquired GlobalSpec, adding the provision of engineering data for a profit to its organizational portfolio

17.
Bill Clinton
–
William Jefferson Clinton is an American politician who served as the 42nd President of the United States from 1993 to 2001. Prior to the Presidency he was the 40th Governor of Arkansas from 1979 to 1981, before that, he served as Arkansas Attorney General from 1977 to 1979. A member of the Democratic Party, Clinton was ideogically a New Democrat, Clinton is married to Hillary Clinton, who served as United States Secretary of State from 2009 to 2013 and U. S. Senator from New York from 2001 to 2009, and served the Democratic nominee for President in 2016, Bill Clinton and Hillary Rodham both earned degrees from Yale Law School, where they met and began dating. As Governor of Arkansas, Clinton overhauled the states education system, Clinton was elected President of the United States in 1992, defeating incumbent George H. W. Bush. At age 46, he was the third-youngest president and the first from the Baby Boomer generation, Clinton presided over the longest period of peacetime economic expansion in American history and signed into law the North American Free Trade Agreement. After failing to pass health care reform, the Democratic House was ousted when the Republican Party won control of the Congress in 1994. Two years later, in 1996, Clinton became the first Democrat since Franklin D. Roosevelt to be elected to a second term, Clinton passed welfare reform and the State Childrens Health Insurance Program, providing health coverage for millions of children. Clinton was acquitted by the U. S. Senate in 1999, the Congressional Budget Office reported a budget surplus between the years 1998 and 2000, the last three years of Clintons presidency. In foreign policy, Clinton ordered U. S. Clinton left office with the highest end-of-office approval rating of any U. S. President since World War II, since then, Clinton has been involved in public speaking and humanitarian work. He created the William J. Clinton Foundation to address international causes, such as the prevention of AIDS, in 2004, Clinton published his autobiography, My Life. In 2009, Clinton was named the United Nations Special Envoy to Haiti, since leaving office, Clinton has been rated highly in public opinion polls of U. S. Presidents. Clinton was born on August 19,1946, at Julia Chester Hospital in Hope, Arkansas and he was the son of William Jefferson Blythe Jr. a traveling salesman who had died in an automobile accident three months before his birth, and Virginia Dell Cassidy. His parents had married on September 4,1943, but this later proved to be bigamous. Soon after their son was born, his mother traveled to New Orleans to study nursing, leaving her son in Hope with her parents Eldridge and Edith Cassidy, who owned and ran a small grocery store. At a time when the Southern United States was segregated racially, in 1950, Bills mother returned from nursing school and married Roger Clinton Sr. who owned an automobile dealership in Hot Springs, Arkansas, with his brother and Earl T. Ricks. The family moved to Hot Springs in 1950, although he immediately assumed use of his stepfathers surname, it was not until Clinton turned fifteen that he formally adopted the surname Clinton as a gesture toward his stepfather. In Hot Springs, Clinton attended St. Johns Catholic Elementary School, Ramble Elementary School, and Hot Springs High School—where he was a student leader, avid reader

18.
User (computing)
–
A user is a person who uses a computer or network service. Users generally use a system or a product without the technical expertise required to fully understand it. Power users use advanced features of programs, though they are not necessarily capable of computer programming, a user often has a user account and is identified to the system by a username. Other terms for username include login name, screenname, nickname and handle, some software products provide services to other systems and have no direct end users. End users are the ultimate users of a software product. The term is used to abstract and distinguish those who use the software from the developers of the system. This abstraction is primarily useful in designing the user interface, in user-centered design, personas are created to represent the types of users. It is sometimes specified for each persona which types of user interfaces it is comfortable with, in this context, graphical user interfaces are usually preferred to command-line interfaces for the sake of usability. The end-user development discipline blurs the distinction between users and developers. It designates activities or techniques in which people who are not professional developers create automated behavior, systems whose actor is another system or a software agent have no direct end users. To log in to an account, a user is required to authenticate oneself with a password or other credentials for the purposes of accounting, security, logging. Once the user has logged on, the system will often use an identifier such as an integer to refer to them, rather than their username. In Unix systems, the username is correlated with an identifier or user id. Computer systems operate in one of two based on what kind of users they have, Single-user systems do not have a concept of several user accounts. Multi-user systems have such a concept, and require users to identify themselves using the system. Each user account on a system typically has a home directory, in which to store files pertaining exclusively to that users activities. User accounts often contain a public profile, which contains basic information provided by the accounts owner. Various computer operating-systems and applications expect/enforce different rules for the formats of user names, in some cases, a user may be better known by their username than by their real name, such as CmdrTaco, founder of the website Slashdot

19.
Summit, New Jersey
–
Summit is a city in Union County, New Jersey, United States. Summit had the 16th-highest per capita income in the state as of the 2000 Census. The region in which Summit is located was purchased from Native Americans on October 28, 1664, and Summit's earliest European settlers came to the area around the year 1710. The original name of Summit was Turkey Hill, to distinguish it from the area then known as Turkey. Summit was called the Heights over Springfield during the late 18th century and most of the 19th century; during this period, Summit was part of Springfield Township, which eventually broke up into separate municipalities, until only Summit and New Providence remained joined. Today, the lodge is part of a large mansion at 50 Kent Place Boulevard, opposite Kent Place School. The railroad allowed Summit to outgrow neighboring New Providence, which didn't have a train station. In 1868, a hotel named The Summit House burned beside the railroad. In 1869, Summit and New Providence separated and the Summit area was incorporated as the Township of Summit. In the late 19th century, the area began shifting from farmland to wealthy estates. The present-day incarnation of Summit, known formally as the City of Summit, was incorporated on April 11, 1899. During this time, Summit was the home of America's antivice crusader, Anthony Comstock, who moved there about 1880 and built a house in 1892 at 35 Beekman Road, where he died in 1915. In the 19th century, Summit served as a getaway spot for wealthy residents of New York City in search of fresh air; weekenders or summer vacationers would reach Summit by train and relax at large hotels and smaller inns. Calvary Episcopal Church was built in 1894-95; the New York Times called it "a handsome new house of worship". Silk weaving thrived as an industry in the late 19th century. A new railway, the Rahway Valley Railroad, was constructed from what was then called New Orange and connected Summit with the Delaware, Lackawanna and Western. A trolley line called the Morris County Traction Company once ran a passenger trolley through Summit to and from Newark and Morris County in the early part of the 20th century. Broad Street in Summit was designed and built for the trolley; portions of the rails could still be seen on it as late as the 1980s. There were disputes between Summit's commuters and the Lackawanna railroad about walkways. In one incident in 1905, a number of passengers seeking to board the 6:35 train found their way barred by a door; they made a rush, and when the dust cleared away the door was gone. It is said the company put the door back, and the commuters said they would remove it as often as it was replaced. Following World War II, the city experienced a great building boom, as living outside New York City and commuting to work became more common and the population of New Jersey grew.

20.
Summit High School (New Jersey)
–
The school has been accredited by the Middle States Association of Colleges and Schools Commission on Secondary Schools since 1934. The school was opened in 1888 due to an increased need for a publicly operated secondary school within the City of Summit. The school's athletic teams are referred to as the Hilltoppers, though the school's actual mascot is a mountain goat wearing a Summit High School athletic jersey. The school's colors are maroon, white and gold, although for most of its history they were maroon and white. As of the 2014-15 school year, the school had an enrollment of 1,231 students and 104.1 classroom teachers; there were 153 students eligible for free lunch and 64 eligible for reduced-cost lunch. The school was once located in a building constructed in the 1920s on Morris Avenue between Maple and Elm Streets near downtown Summit, which the junior high school shared with the high school until 1936. However, a number of parents living outside of east Summit (then called Deantown) objected to their children traveling to this section of the city. By 1943, the new high school had closed and the high school again shared the Morris Avenue building with the junior high school. In 1962, the school relocated to a larger, more modern facility located at 125 Kent Place Boulevard, slightly outside of downtown Summit. In the late 1990s, a push was made to renovate the high school, and from 2000 until 2003 the school building underwent a series of renovations, including the construction of a new media center, cafeteria and gymnasium. Special attention was paid to upgrading the building's existing facilities. In the early 2000s the field was converted to FieldTurf due to frequent problems with field conditions; a grant was made by the Metro Homes Corporation, and the stadium has been renamed Metro Homes Field. Tatlock also includes a field house with locker room facilities and practice fields adjacent to Washington Elementary School. Junior varsity and middle school tennis teams practice at the four tennis courts adjacent to the track complex. Memorial Field is located a short drive from the high school on Larned Road near Brayton School; this large public field is used for soccer and cross country, and the complex is also used extensively by Summit's youth sports programs. Varsity and junior varsity tennis matches are played at the new eight-court complex next to Brayton School. It is also the only field in the town's history to have been sold out, when the varsity played on it while the high school's Upper Field was being rebuilt.

21.
Academic degree
–
An academic degree is a qualification awarded on successful completion of a course of study in higher education, normally at a college or university. These institutions commonly offer degrees at various levels, typically including bachelor's, master's and doctoral degrees, often alongside other academic certificates and professional degrees. The most common degree is the bachelor's degree, although in some countries lower qualifications are titled degrees, while in others a higher-level first degree is more usual. The degrees awarded by European universities (the bachelor's degree, the licentiate and the doctorate) trace back to the medieval university. The doctorate appeared in medieval Europe as a license to teach at a medieval university, and its roots can be traced to the early church, when the term doctor referred to the Apostles, church fathers and other Christian authorities who taught and interpreted the Bible. The right to grant a licentia docendi was originally reserved to the church, which required the applicant to pass a test and take an oath of allegiance. The Third Council of the Lateran of 1179 guaranteed access (now largely free of charge) to all able applicants. At the university, doctoral training was a form of apprenticeship to a guild, and a traditional term of study was required before new teachers were admitted to the guild of Masters of Arts. Originally the terms master and doctor were synonymous, but over time the doctorate came to be regarded as a higher qualification than the master's degree. The earliest doctoral degrees reflected the historical separation of all higher university study into the three fields of theology, law and medicine. Over time, the D.D. has gradually become less common outside theology. Studies outside theology, law and medicine were then called philosophy, due to the Renaissance conviction that real knowledge could be derived from empirical observation; the degree title of Doctor of Philosophy dates from a later time, and studies in what once was called philosophy are now classified as sciences and humanities. Masters of Arts were eligible to enter study under the higher faculties of Law, Medicine or Theology, and earn first a bachelor's and then a master's or doctor's degree in these subjects. Thus a degree was only a step on the way to becoming a qualified master, hence the English word graduate. The naming of degrees eventually became linked with the subjects studied: scholars in the faculties of arts or grammar became known as masters, but those in theology, medicine and law were known as doctors. Since study in the arts or in grammar was a prerequisite to study in subjects such as theology, medicine and law, the doctorate came to rank as the higher qualification. The practice of using the term doctor for PhDs developed within German universities. The French terminology is tied closely to the original meanings of the terms: the baccalauréat is conferred upon French students who have completed their secondary education.

22.
Physics
–
Physics is the natural science that involves the study of matter and its motion and behavior through space and time, along with related concepts such as energy and force. One of the most fundamental disciplines, the main goal of physics is to understand how the universe behaves. Physics is one of the oldest academic disciplines, perhaps the oldest through its inclusion of astronomy. Physics intersects with many interdisciplinary areas of research, such as biophysics and quantum chemistry, and the boundaries of physics are not rigidly defined. New ideas in physics often explain the mechanisms of other sciences while opening new avenues of research in areas such as mathematics. Physics also makes significant contributions through advances in new technologies that arise from theoretical breakthroughs; the United Nations named 2005 the World Year of Physics. Astronomy is the oldest of the natural sciences: the stars and planets were often a target of worship, believed to represent gods, and while the explanations for these phenomena were often unscientific and lacking in evidence, these early observations laid the foundation for later astronomy. According to Asger Aaboe, the origins of Western astronomy can be found in Mesopotamia, and all Western efforts in the exact sciences are descended from late Babylonian astronomy. The most notable innovations were in the field of optics and vision, which came from the works of many scientists like Ibn Sahl, Al-Kindi, Ibn al-Haytham, Al-Farisi and Avicenna. The most notable work was The Book of Optics, written by Ibn al-Haytham, in which he was not only the first to disprove the ancient Greek idea about vision but also came up with a new theory. In the book, he was also the first to study the phenomenon of the pinhole camera. Many later European scholars and fellow polymaths, from Robert Grosseteste and Leonardo da Vinci to René Descartes, Johannes Kepler and Isaac Newton, were in his debt; indeed, the influence of Ibn al-Haytham's Optics ranks alongside that of Newton's work of the same title. The translation of The Book of Optics had a huge impact on Europe: from it, later European scholars were able to build devices like those Ibn al-Haytham had built, and from this came such important things as eyeglasses, magnifying glasses and telescopes. Physics became a separate science when early modern Europeans used experimental and quantitative methods to discover what are now considered to be the laws of physics. Newton also developed calculus, the study of change, which provided new mathematical methods for solving physical problems. The discovery of new laws in thermodynamics, chemistry and electromagnetics resulted from greater research efforts during the Industrial Revolution as energy needs increased. However, inaccuracies in classical mechanics for very small objects and very high velocities led to the development of modern physics in the 20th century. Modern physics began in the early 20th century with the work of Max Planck in quantum theory and Albert Einstein's theory of relativity; both of these theories came about due to inaccuracies in classical mechanics in certain situations. Quantum mechanics would come to be pioneered by Werner Heisenberg, Erwin Schrödinger and others; from this early work, and work in related fields, the Standard Model of particle physics was derived. Areas of mathematics in general are important to this field, such as the study of probabilities. In many ways, physics stems from ancient Greek philosophy.

23.
Applied mathematics
–
Applied mathematics is a branch of mathematics that deals with mathematical methods that find use in science, engineering, business, computer science and industry; thus, applied mathematics is a combination of mathematical science and specialized knowledge. The term applied mathematics also describes the professional specialty in which mathematicians work on practical problems by formulating and studying mathematical models. The activity of applied mathematics is thus connected with research in pure mathematics. Historically, applied mathematics consisted principally of applied analysis, most notably differential equations and approximation theory. Quantitative finance is now taught in mathematics departments across universities, and mathematical finance is considered a full branch of applied mathematics. Engineering and computer science departments have also made use of applied mathematics. Today, the term applied mathematics is used in a broader sense: it includes the areas noted above as well as other areas that have become increasingly important in applications. Even fields such as number theory that are part of pure mathematics are now important in applications. There is no consensus as to what the various branches of applied mathematics are; such categorizations are made difficult by the way mathematics and science change over time, and also by the way universities organize departments, courses and degrees. Many mathematicians distinguish between applied mathematics, which is concerned with mathematical methods, and the applications of mathematics within science. Mathematicians such as Poincaré and Arnold deny the existence of applied mathematics; similarly, non-mathematicians blend applied mathematics and applications of mathematics. The use and development of mathematics to solve industrial problems is also called industrial mathematics. Historically, mathematics was most important in the sciences and engineering. Academic institutions are not consistent in the way they group and label courses and programs: at some schools there is a single mathematics department, whereas others have separate departments for Applied Mathematics and Mathematics. It is very common for statistics departments to be separated at schools with graduate programs. Many applied mathematics programs consist primarily of cross-listed courses and jointly appointed faculty in departments representing applications. Some Ph.D. programs in applied mathematics require little or no coursework outside of mathematics; in some respects this difference reflects the distinction between application of mathematics and applied mathematics. Research universities dividing their mathematics department into pure and applied sections include MIT. Brigham Young University also has an Applied and Computational Emphasis, a program that allows students to graduate with a mathematics degree with an emphasis in applied math.

24.
Version 7 Unix
–
Seventh Edition Unix, also called Version 7 Unix, Version 7 or just V7, was an important early release of the Unix operating system. V7, released in 1979, was the last Bell Laboratories release to see widespread distribution before the commercialization of Unix by AT&T Corporation in the early 1980s. V7 was originally developed for Digital Equipment Corporation's PDP-11 minicomputers and was later ported to other platforms. Unix versions from Bell Labs were designated by the edition of the manual with which they were accompanied; released in 1979, the Seventh Edition was preceded by the Sixth Edition. V7 was the first readily portable version of Unix. The first Sun workstations ran a V7 port by UniSoft, and the first version of Xenix for the Intel 8086 was derived from V7; the VAX port of V7, called UNIX/32V, was the direct ancestor of the popular 4BSD family of Unix systems. The group at the University of Wollongong that had ported V6 to the Interdata 7/32 ported V7 to that machine as well, and Interdata sold the port as Edition VII, making it the first commercial UNIX offering. DEC distributed their own PDP-11 version of V7, called V7M, developed by its Unix Engineering Group (UEG); UEG evolved into the group that later developed Ultrix. At the time of its release, though, V7's greatly extended feature set came at the expense of a decrease in performance compared to V6. The exact number of system calls varies depending on the operating system version, and more recent systems have seen growth in the number of supported system calls: Linux 3.2.0 has 380 system calls and FreeBSD 8.0 has over 450. In 2002, Caldera International released V7 as FOSS under a permissive BSD-like software license. Bootable images for V7 can still be downloaded today and can be run on modern hosts using PDP-11 emulators such as SIMH; an x86 port has been developed by Nordier & Associates. Paul Allen maintains several publicly accessible computer systems, including a PDP-11/70 running Unix Version 7, and a login can be requested from Living Computers: Museum + Labs to try running Version 7 Unix on the original equipment. Many new features were introduced in Version 7, including the programming tools lex, lint and make; the Portable C Compiler was provided along with the earlier, PDP-11-specific, C compiler by Ritchie. Mpx files first appeared in the Research Unix lineage in Version 7; they were considered experimental, were not enabled in the default kernel, and disappeared from later versions, which offered sockets or CB UNIX's IPC facilities instead.
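The V7 system call interface was small by later standards, as the figures above suggest. As a hedged illustration (a sketch, not code from V7 itself), the following C program copies a file using only primitives that were already present in Seventh Edition Unix: open, creat, read, write and close, shown here with modern POSIX headers. The 512-byte buffer matches the V7-era disk block size.

    /* Sketch of a minimal file copy using only system calls that were
     * already present in Seventh Edition Unix. */
    #include <fcntl.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        char buf[512];              /* V7-era disk block size */
        ssize_t n;
        int in, out;

        if (argc != 3)
            return 1;
        in = open(argv[1], O_RDONLY);
        out = creat(argv[2], 0644); /* create/truncate the destination */
        if (in < 0 || out < 0)
            return 1;
        while ((n = read(in, buf, sizeof buf)) > 0)
            if (write(out, buf, n) != n)
                return 1;           /* short write: give up */
        close(in);
        close(out);
        return 0;
    }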

25.
PDP-11
–
The PDP-11 is a series of 16-bit minicomputers sold by Digital Equipment Corporation from 1970 into the 1990s, one of a succession of products in the PDP series. The PDP-11 had several innovative features, and was easier to program than its predecessors thanks to its additional general-purpose registers. The PDP-11 replaced the PDP-8 in many applications, although both product lines lived in parallel for more than 10 years. In total, around 600,000 PDP-11s of all models were sold, and its successor in the mid-range minicomputer niche was the 32-bit VAX-11, named as a nod to the PDP-11's popularity. The PDP-11 is considered by some experts to be the most popular minicomputer ever. Design features of the PDP-11 influenced the design of most late-1970s computer systems, including the Intel x86, and design features of PDP-11 operating systems, as well as other operating systems from Digital Equipment, influenced the design of operating systems such as CP/M and hence also MS-DOS. For a decade, the PDP-11 was the smallest system that could run Unix, and it is commonly stated that the C programming language took advantage of several low-level PDP-11-dependent programming features, albeit not originally by design. In 1967-68, DEC engineers designed a 16-bit, word-addressed machine, but management cancelled the project, and some of the engineers later left DEC and produced it as the Data General Nova. A subsequent effort, code-named Desk Calculator, looked at a variety of options before choosing what became the 16-bit PDP-11; DEC's previous PDP-8 and PDP-9 had 12- and 18-bit words. The PDP-11 family was announced in January 1970 and shipments began early that year. DEC sold over 170,000 PDP-11s in the 1970s. Initially manufactured from small-scale transistor-transistor logic, a single-board large-scale-integration version of the processor was developed in 1975, and a two- or three-chip processor, the J-11, was developed in 1979. The last models of the PDP-11 line were the PDP-11/94 and -11/93, introduced in 1990. The PDP-11 processor architecture has an orthogonal instruction set: for example, instead of dedicated instructions such as load and store, the PDP-11 has a general move instruction, and more complex instructions such as add can likewise take memory or register operands. Most operands can apply any of eight addressing modes to eight registers. The addressing modes provide register, immediate, absolute, relative, deferred and indexed addressing, and the use of relative addressing lets a machine-language program be position-independent. Early models of the PDP-11 had no dedicated bus for input/output; an input/output device determined the memory addresses to which it would respond, and specified its own interrupt vector and interrupt priority. DEC openly published the basic Unibus specifications, even offering prototyping bus interface circuit boards, and the Unibus made the PDP-11 suitable for custom peripherals. Higher-performance members of the PDP-11 family, starting with the PDP-11/45 Unibus and 11/83 Q-bus systems, moved away from a single shared bus: instead, memory was interfaced by dedicated circuitry and space in the CPU cabinet, while the Unibus continued to be used for I/O only.
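A frequently cited example of the PDP-11 connection mentioned above (offered here only as an illustrative sketch, not as a claim from the original article) is C's pointer autoincrement idiom, which maps naturally onto the PDP-11's autoincrement addressing mode:

    /* Sketch: the C idiom below compiles naturally to the PDP-11's
     * autoincrement addressing mode -- each *p++ can become a single
     * (Rn)+ operand. The correspondence is often cited, although the
     * idiom was not originally designed for the PDP-11. */
    #include <stdio.h>

    static void copy(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')   /* copy until the NUL byte */
            ;
    }

    int main(void)
    {
        char out[16];
        copy(out, "pdp-11");
        puts(out);
        return 0;
    }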

26.
PDP-7
–
The DEC PDP-7 was a minicomputer produced by Digital Equipment Corporation as part of the PDP series. Introduced in 1965, it was the first to use their Flip-Chip technology; with a cost of only US$72,000, it was cheap but powerful by the standards of the time. The PDP-7 was the third of Digital's 18-bit machines, with essentially the same instruction set architecture as the PDP-4, and it was the first wire-wrapped PDP. The computer had a cycle time of 1.75 µs. I/O included a keyboard, printer, paper tape and dual transport DECtape drives. The standard memory capacity was 4K words but was expandable up to 64K words. A PDP-7 was also the development system used during the development of MUMPS at MGH in Boston a few years earlier. There are few remaining PDP-7 systems still in operable condition; one machine is known to be in the collection of Max Burnet near Sydney, Australia. Information about the PDP-7 and PDP-7/A survives, including some manuals and a customer list covering 99 of the possible 120 systems shipped.

27.
Brian Kernighan
–
Brian Wilson Kernighan is a Canadian computer scientist who worked at Bell Labs alongside Unix creators Ken Thompson and Dennis Ritchie and contributed to the development of Unix. He is also coauthor of the AWK and AMPL programming languages; the K of K&R C and the K in AWK both stand for Kernighan. Since 2000, Kernighan has been a professor in the Computer Science Department of Princeton University, where each fall he teaches a course called Computers in Our World. Born in Toronto, Kernighan attended the University of Toronto between 1960 and 1964, earning his bachelor's degree in engineering physics, and received his PhD in electrical engineering from Princeton University in 1969 for research supervised by Peter Weiner. Kernighan's name became widely known through co-authorship of the first book on the C programming language with Dennis Ritchie, although Kernighan affirmed that he had no part in the design of the C language itself. He authored many Unix programs, including ditroff. In collaboration with Shen Lin he devised well-known heuristics for two NP-complete optimization problems, graph partitioning and the travelling salesman problem; in a display of authorial equity, the former is usually called the Kernighan-Lin algorithm, while the latter is known as the Lin-Kernighan heuristic. Kernighan was also an editor for Prentice Hall International. His Software Tools series spread the essence of C/Unix thinking with makeovers for BASIC, FORTRAN and Pascal, and he has said that if stranded on an island with only one programming language, it would have to be C. Kernighan coined the term Unix and helped popularize Thompson's Unix philosophy. Kernighan is also known as a coiner of the expression "What You See Is All You Get", a sarcastic variant of the original "What You See Is What You Get"; Kernighan's term is used to indicate that WYSIWYG systems might throw away information in a document that could be useful in other contexts. Kernighan's original 1978 implementation of "Hello, World" was sold at The Algorithm Auction, the world's first auction of computer algorithms. In 1996, Kernighan taught CS50, the Harvard University introductory course in computer science; his students on CS50 include David J. Malan, who now runs the course.
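For reference, the program mentioned above is tiny. A rendering in modern standard C is shown below; Kernighan's 1978 original (from the first edition of K&R) used pre-ANSI syntax but printed the same message.

    /* "hello, world" in modern standard C; the 1978 original used
     * pre-ANSI syntax but was functionally the same program. */
    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }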

28.
M-209
–
The M-209 was designed by Swedish cryptographer Boris Hagelin in response to a request for such a portable cipher machine, and was an improvement of an earlier machine, the C-36. The M-209 is about the size of a lunchbox, in its final form measuring 3.25 by 5.5 by 7 inches, and it represented a brilliant achievement for pre-electronic technology. It used a scheme similar to that of a telecipher machine, such as the Lorenz cipher. Basic operation of the M-209 is relatively straightforward: six adjustable key wheels on top of the box each display a letter of the alphabet. These six wheels comprise the key for the machine, providing an initial state. To encipher a message, the operator sets the key wheels to a random sequence of letters, and an enciphering-deciphering knob on the side of the machine is set to encipher. A dial known as the indicator disk, also on the left side, is turned to the first letter in the message. To indicate spaces between words in the message, the letter Z is enciphered. Repeating the process for the remainder of the message gives a complete ciphertext, which can then be transmitted using Morse code or another method. Since the initial key wheel setting is random, it is also necessary to send those settings to the receiving party. Printed ciphertext is automatically spaced into groups of five by the M-209 for ease of readability, and a letter counter on top of the machine indicates the total number of encoded letters, which can be used as a point of reference if a mistake is made in enciphering or deciphering. To decipher, the first letter of the ciphertext is entered via the indicator disk; when the letter Z is encountered, a cam causes a blank space to appear in the message, thus reconstituting the original message with spaces. Absent Zs can typically be interpreted by the operator based on context. An experienced M-209 operator might spend two to four seconds enciphering or deciphering each letter. Inside the casing of the M-209, a more complicated picture emerges. The six key wheels each have a movable pin aligned with each letter on the wheel. These pins may each be positioned to the left or right; the left position is ineffective, while the right position is effective. Each key wheel contains a different number of letters, and a different number of pins. Each key wheel is associated with a slanted metal guide arm that is activated by any pins in the effective position. The positions of the pins on each key wheel comprise the first part of the internal keying mechanism of the M-209.
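The message framing described above (spaces enciphered as the letter Z, output spaced into five-letter groups) can be sketched separately from the pin-and-lug cipher itself. The following C fragment illustrates only that framing step, not the M-209's actual enciphering; the sample message is hypothetical, and where the real machine would cipher each letter, this sketch simply prints it.

    /* Sketch of the M-209 message framing described above (not the
     * wheel cipher itself): spaces become Z before enciphering, and
     * output is printed in groups of five letters. */
    #include <ctype.h>
    #include <stdio.h>

    int main(void)
    {
        const char *msg = "attack at dawn";   /* hypothetical message */
        int count = 0;

        for (const char *p = msg; *p != '\0'; p++) {
            int c;
            if (*p == ' ')
                c = 'Z';                /* space is enciphered as Z  */
            else if (isalpha((unsigned char)*p))
                c = toupper((unsigned char)*p);
            else
                continue;               /* the M-209 handles letters only */
            putchar(c);                 /* real use: cipher c first  */
            if (++count % 5 == 0)
                putchar(' ');           /* five-letter groups        */
        }
        putchar('\n');
        return 0;
    }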

29.
National Security Agency
–
The National Security Agency (NSA) is an intelligence agency of the United States Department of Defense; it is concurrently charged with protection of U.S. government communications and information systems against penetration and network warfare. The NSA also maintains a physical presence in a number of countries across the globe, where its Special Collection Service (SCS) allegedly employs collection tactics encompassing close surveillance, burglary, wiretapping, and breaking and entering. Additionally, the NSA Director simultaneously serves as the Commander of the United States Cyber Command and as Chief of the Central Security Service. Originating as a unit to decipher coded communications in World War II, the NSA was officially formed in 1952. NSA surveillance has been a matter of political controversy on several occasions, such as its spying on anti-Vietnam-war leaders or economic espionage. In 2013, the extent of some of the NSA's secret surveillance programs was revealed to the public by Edward Snowden. Internationally, research has pointed to the NSA's ability to surveil the domestic Internet traffic of foreign countries through boomerang routing. The origins of the National Security Agency can be traced back to April 28, 1917, when a code and cipher decryption unit was established as the Cable and Telegraph Section, also known as the Cipher Bureau. It was headquartered in Washington, D.C. and was part of the war effort under the executive branch without direct Congressional authorization; during the course of the war it was relocated in the army's organizational chart several times. On July 5, 1917, Herbert O. Yardley was assigned to head the unit; at that point, the unit consisted of Yardley and two civilian clerks. It absorbed the navy's cryptanalysis functions in July 1918. World War I ended on November 11, 1918, and the unit, MI-8, moved to New York City on May 20, 1919, where it continued intelligence activities as the Code Compilation Company under the direction of Yardley. MI-8 also operated the so-called Black Chamber, located on East 37th Street in Manhattan, whose purpose was to crack the codes of foreign governments; other Black Chambers were also found in Europe. During World War II, the Signal Security Agency (SSA) was created to intercept and decipher the communications of the Axis powers. When the war ended, the SSA was reorganized as the Army Security Agency, and on May 20, 1949, all cryptologic activities were centralized under a national organization called the Armed Forces Security Agency (AFSA). This organization was established within the U.S. Department of Defense under the command of the Joint Chiefs of Staff, and was tasked to direct Department of Defense communications and electronic intelligence activities. In December 1951, President Harry S. Truman ordered a panel to investigate how AFSA had failed to achieve its goals; the results of the investigation led to improvements and its redesignation as the National Security Agency. The agency was formally established by Truman in a memorandum of October 24, 1952. Since President Truman's memo was a classified document, the existence of the NSA was not known to the public at that time.

30.
Plan 9 from Bell Labs
–
Plan 9 from Bell Labs is a distributed operating system, originally developed by the Computing Sciences Research Center at Bell Labs between the mid-1980s and 2002. It takes some of the principles of Unix, developed in the same research group, and extends them further. In Plan 9, virtually all computing resources, including files and network connections, are represented as files, and a unified network protocol called 9P ties a network of computers running Plan 9 together, allowing them to share all resources so represented. The name Plan 9 from Bell Labs is a reference to the Ed Wood 1959 cult science fiction Z-movie Plan 9 from Outer Space; also, Glenda, the Plan 9 Bunny, is presumably a reference to Wood's film Glen or Glenda. The system continues to be used and developed by operating system researchers. Plan 9 was originally developed, starting in the mid-1980s, by members of the Computing Science Research Center at Bell Labs, the same group that originally developed Unix and C. The Plan 9 team was led by Rob Pike, Ken Thompson, Dave Presotto and Phil Winterbottom. Over the years, many developers have contributed to the project, including Brian Kernighan, Tom Duff, Doug McIlroy and Bjarne Stroustrup. Plan 9 replaced Unix as Bell Labs' primary platform for operating systems research, and it explored several changes to the original Unix model that facilitate the use and programming of the system, notably in distributed multi-user environments. After several years of development and internal use, Bell Labs shipped the system to universities in 1992. Three years later, in 1995, Plan 9 was made available for commercial parties by AT&T via the book publisher Harcourt Brace. By early 1996, the Plan 9 project had been put on the back burner by AT&T in favor of Inferno. In the late 1990s, Bell Labs' new owner Lucent Technologies dropped commercial support for the project and, in 2000, released it under an open-source license; a fourth release under a new free software license occurred in 2002. A user and development community, including current and former Bell Labs personnel, continues to work on the system. The development source tree is accessible over the 9P and HTTP protocols and is used to update existing installations. In addition to the components of the OS included in the ISOs, Bell Labs also hosts a repository of externally developed applications. Plan 9 is a distributed operating system, designed to make a network of heterogeneous computers function as a single system. In a typical Plan 9 installation, users work at terminals running the window system rio, while permanent data storage is provided by additional network hosts acting as file servers. Its designers state that "the foundations of the system are built on two ideas: a per-process name space and a simple message-oriented file system protocol." The potential complexity of this setup is controlled by a set of conventional locations for common resources.
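The per-process name space idea can be made concrete with a short sketch in Plan 9's C dialect. This is an assumption-laden illustration, not code from the Plan 9 distribution: it presumes a Plan 9 environment, and the paths are illustrative. The bind() call splices one part of the name space over another, visibly only to the calling process and its children.

    /* Sketch (Plan 9 C): adjust this process's private name space
     * with bind(), then run a program that sees the new view. */
    #include <u.h>
    #include <libc.h>

    void
    main(void)
    {
        char *argv[] = {"date", nil};

        /* Make the binaries for this machine's architecture appear
         * at the conventional location /bin, for this process only. */
        if (bind("/386/bin", "/bin", MREPL) < 0)
            sysfatal("bind failed: %r");
        exec("/bin/date", argv);
        sysfatal("exec failed: %r");    /* only reached on error */
    }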

31.
Inferno (operating system)
–
Inferno is a distributed operating system started at Bell Labs, but now developed and maintained by Vita Nuova Holdings as free software. The name of the system, and of many of its associated programs, comes from Dante's Divine Comedy. Inferno programs are portable across a broad mix of hardware, networks and environments. A communications protocol called Styx is applied uniformly to access local and remote resources, which programs use by calling standard file operations: open, read, write and close. As of the fourth edition of Inferno, Styx is identical to Plan 9's newer version of its hallmark 9P protocol. Inferno was created in 1995 by members of the Bell Labs Computer Science Research division to bring the ideas of Plan 9 from Bell Labs to a wider range of devices; in this it paralleled the conclusions of the Oak project that became Java. The Dis virtual machine is a register machine intended to closely match the architecture it runs on; an advantage of this approach is the simplicity of creating a just-in-time compiler for new architectures. The virtual machine provides memory management designed to be efficient on devices with as little as 1 MiB of memory, and its garbage collector is a hybrid of reference counting and a real-time coloring collector that gathers cyclic data. The kernel also includes some built-in modules that provide interfaces of the operating system, such as system calls, graphics and security. Inferno's design emphasizes: portability across environments, as it runs as a stand-alone operating system on small terminals and also as a user application under Plan 9 from Bell Labs, MS Windows NT and Windows 95, and in all of these environments Inferno programs see an identical interface; distributed design, as the identical environment is established at the user's terminal and at the server, and each may import the resources of the other, while, aided by the facilities of the run-time system, programs may be split easily between client and server; minimal hardware requirements, as it runs useful applications stand-alone on machines with as little as 1 MiB of memory; portable programs, as Inferno programs are written in the type-safe language Limbo and compiled to Dis bytecode, which can be run without modification on all Inferno platforms; and dynamic adaptability, as programs may load different modules depending on the hardware or other resources available; for example, a video player might use any of several different decoder modules. Inferno is a descendant of Plan 9 and shares many concepts and even source code in the kernel, particularly around devices. Inferno also shares with Plan 9 the Unix heritage from Bell Labs, and many of the command line tools in Inferno were Plan 9 tools that were translated to Limbo.

32.
Research Unix
–
The term Research Unix first appeared in the Bell System Technical Journal to distinguish it from other versions internal to Bell Labs whose code base had diverged from the primary CSRC version. However, the term was little used until Version 8 Unix; it has since been applied to earlier versions as well. Prior to V8, the system was most commonly called simply UNIX or the UNIX Time-Sharing System. AT&T licensed Version 5 to educational institutions, and Version 6 also to commercial sites; schools paid $200 and others $20,000, discouraging most commercial use, but Version 6 was nonetheless the most widely used version into the 1980s. By this reckoning, the first Research Unix would be the First Edition; another common way of referring to the releases is Version x Unix, where x is the manual edition. All modern editions of Unix, excepting Unix-like implementations such as Coherent and Minix, descend from Research Unix. Starting with the 8th Edition, versions of Research Unix had a close relationship to BSD; this began by using 4.1cBSD as the basis for the 8th Edition and continued with the 9th and 10th editions. The ordinary user command-set was "I guess, a bit more BSD-flavored than SysVish". Version 3, Version 4 and Version 5 should not be confused with the UNIX 3.0, UNIX 4.0 and UNIX 5.0 releases by the AT&T UNIX Support Group. After Version 10, Unix development at Bell Labs was stopped in favor of a successor system, Plan 9 from Bell Labs. In 2002, Caldera International released V7 Unix as FOSS under a permissive BSD-like software license, and in 2017 the Unix Heritage Society and Alcatel-Lucent USA Inc. made later Research Unix editions available as well.

33.
Douglas McIlroy
–
Malcolm Douglas McIlroy is a mathematician, engineer, and programmer; as of 2007 he is an Adjunct Professor of Computer Science at Dartmouth College. McIlroy is best known for having originally developed Unix pipelines, software componentry and several Unix tools, such as spell, diff, sort, join, graph, speak, and tr. His seminal work on software componentization makes him a pioneer of component-based software engineering. He taught at MIT from 1954 to 1958 and joined Bell Laboratories in 1958; from 1965 to 1986 he was head of its Computing Techniques Research Department, and from 1967 to 1968 he also served as a visiting lecturer at Oxford University. In 1997, McIlroy retired from Bell Labs and took a position as an Adjunct Professor in the Dartmouth College Computer Science Department. McIlroy is a member of the National Academy of Engineering, has won both the USENIX Lifetime Achievement Award and its Software Tools award, and also served on the executive committee of CSNET. He is known for pithy remarks on programming: "Those types are not abstract; they are as real as int." "As a programmer, it is your job to put yourself out of business; what you do today can be automated tomorrow." "Keep it simple, make it general, and make it intelligible." "The real hero of programming is the one who writes negative code."
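The pipeline mechanism McIlroy is credited with can be illustrated with a hedged C sketch using standard POSIX calls (an illustration only, not McIlroy's implementation): pipe() plus fork() connects the output of one process to the input of another, here hand-wiring the equivalent of the shell pipeline "echo hello | tr a-z A-Z".

    /* Sketch of the mechanism behind Unix pipelines: a pipe plus
     * fork connects one process's stdout to another's stdin. */
    #include <unistd.h>

    int main(void)
    {
        int fd[2];
        if (pipe(fd) < 0)
            return 1;

        if (fork() == 0) {              /* child: the writing end     */
            dup2(fd[1], STDOUT_FILENO); /* stdout -> pipe             */
            close(fd[0]);
            close(fd[1]);
            execlp("echo", "echo", "hello", (char *)NULL);
            _exit(1);                   /* only reached if exec fails */
        }
        dup2(fd[0], STDIN_FILENO);      /* parent: stdin <- pipe      */
        close(fd[0]);
        close(fd[1]);
        execlp("tr", "tr", "a-z", "A-Z", (char *)NULL);
        return 1;
    }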

34.
Bjarne Stroustrup
–
Bjarne Stroustrup is a Danish computer scientist, most notable for the creation and development of the widely used C++ programming language. He is a professor at Columbia University and works at Morgan Stanley as a Managing Director in New York. Stroustrup has a master's degree in mathematics and computer science from Aarhus University, Denmark, and a PhD in computer science from the University of Cambridge; his thesis advisor at Cambridge was David Wheeler. Stroustrup began developing C++ in 1978 and, in his own words, "invented C++, wrote its early definitions, and produced its first implementation... chose and formulated the design criteria for C++, designed all its major facilities". Stroustrup also wrote a textbook for the language, The C++ Programming Language. Stroustrup was the head of AT&T Bell Labs' Large-scale Programming Research department. Stroustrup was elected a member of the National Academy of Engineering in 2004, and he is a Fellow of the ACM and an IEEE Fellow. From 2002 to 2014, Stroustrup was the College of Engineering Chair in Computer Science Professor at Texas A&M University, and as of January 2014 he is a Managing Director in the technology division of Morgan Stanley in New York City. He has been an honorary doctor of ITMO University since 2013, and in 2015 he was made a Fellow of the Computer History Museum for his invention of the C++ programming language. Together with his wife and children, Stroustrup is a resident of Watchung, New Jersey. Stroustrup has written or co-written a number of publications, including the books The C++ Programming Language and Programming: Principles and Practice Using C++.

35.
C++
–
C++ is a general-purpose programming language. It has imperative, object-oriented and generic programming features, while also providing facilities for low-level memory manipulation, and it was designed with a bias toward system programming and embedded, resource-constrained and large systems, with performance, efficiency and flexibility of use as its design highlights. C++ is a compiled language, with implementations available on many platforms and provided by various organizations, including the Free Software Foundation, LLVM, Microsoft and Intel. C++ is standardized by the International Organization for Standardization, with the latest standard version ratified and published by ISO in December 2014 as ISO/IEC 14882:2014. The C++ programming language was initially standardized in 1998 as ISO/IEC 14882:1998. The current C++14 standard supersedes these and C++11, with new features and an enlarged standard library. The C++17 standard is due in 2017, with the draft largely implemented by some compilers already, and C++20 is the next planned standard thereafter. Many other programming languages have been influenced by C++, including C#, D and Java. In 1979, Bjarne Stroustrup, a Danish computer scientist, began work on "C with Classes"; the motivation for creating a new language originated from Stroustrup's experience in programming for his Ph.D. thesis. When Stroustrup started working at AT&T Bell Labs, he had the problem of analyzing the UNIX kernel with respect to distributed computing; remembering his Ph.D. experience, Stroustrup set out to enhance the C language with Simula-like features. C was chosen because it was general-purpose, fast and portable. As well as C and Simula's influences, other languages also influenced C++, including ALGOL 68, Ada, CLU and ML. Initially, Stroustrup's C with Classes added features to the C compiler, Cpre, including classes, derived classes, strong typing and inlining; furthermore, the work included the development of a standalone compiler for C++, Cfront. In 1985, the first edition of The C++ Programming Language was released, and the first commercial implementation of C++ was released in October of the same year. In 1989, C++ 2.0 was released, followed by the second edition of The C++ Programming Language in 1991. New features in 2.0 included multiple inheritance, abstract classes, static member functions and const member functions. In 1990, The Annotated C++ Reference Manual was published; this work became the basis for the future standard. Later feature additions included templates, exceptions, namespaces and new casts. After a minor C++14 update released in December 2014, various new additions are planned for 2017 and 2020. According to Stroustrup, the name signifies the evolutionary nature of the changes from C. The name is credited to Rick Mascitti and was first used in December 1983; when Mascitti was questioned informally in 1992 about the naming, he indicated that it was given in a tongue-in-cheek spirit.

36.
Embedded system
–
An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many devices in common use today; ninety-eight percent of all microprocessors are manufactured as components of embedded systems. Examples of properties of typical embedded computers, when compared with general-purpose counterparts, are low power consumption, small size, rugged operating ranges, and low per-unit cost. This comes at the price of limited processing resources, which make them more difficult to program; for example, intelligent techniques can be designed to manage the power consumption of embedded systems. Modern embedded systems are often based on microcontrollers, but ordinary microprocessors are also common. In either case, the processor used may range from general-purpose to one specialised in a certain class of computations; a common standard class of dedicated processors is the digital signal processor. Since the embedded system is dedicated to specific tasks, design engineers can optimize it to reduce the size and cost of the product and increase its reliability. Some embedded systems are mass-produced, benefiting from economies of scale. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals and networks mounted inside a large chassis or enclosure. One of the very first recognizably modern embedded systems was the Apollo Guidance Computer; an early mass-produced embedded system was the Autonetics D-17 guidance computer for the Minuteman missile, released in 1961. When the Minuteman II went into production in 1966, the D-17 was replaced with a new computer that was the first high-volume use of integrated circuits. Since these early applications in the 1960s, embedded systems have come down in price and there has been a rise in processing power. An early microprocessor, the Intel 4004, was designed for calculators and other small systems but still required external memory and support chips. By the early 1980s, memory, input and output system components had been integrated into the same chip as the processor, forming a microcontroller. Microcontrollers find applications where a general-purpose computer would be too costly. A comparatively low-cost microcontroller may be programmed to fulfill the same role as a large number of separate components.
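Typical microcontroller firmware is a single dedicated loop driving memory-mapped hardware registers, in line with the description above. The C sketch below is illustrative only: the register address and pin bit are hypothetical placeholders, since real values come from a specific chip's datasheet.

    /* Sketch of the kind of code a microcontroller runs: toggling an
     * output pin through a memory-mapped register. The address and
     * bit below are hypothetical, not from any real chip. */
    #include <stdint.h>

    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u) /* hypothetical */
    #define LED_PIN  (1u << 5)                           /* hypothetical */

    static void delay(volatile uint32_t n)
    {
        while (n--)
            ;                           /* crude busy-wait */
    }

    int main(void)
    {
        for (;;) {                      /* embedded firmware loops forever */
            GPIO_OUT ^= LED_PIN;        /* toggle the LED pin */
            delay(100000);
        }
    }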
