Units started to ship to Kickstarter backers in March 2013, and the console was released to the general public in June 2013. It features an exclusive Ouya store for applications and games designed specifically for the Ouya platform, the majority of which are casual games aimed at a mass audience.[8] Out of the box, Ouya supports media apps such as Twitch.tv and the XBMC media player.[8] It runs a modified version of Android Jelly Bean, and rooting is officially encouraged.[8] The console is designed to be opened with only a standard screwdriver, making modding and hardware add-ons straightforward.[12]

All systems can be used as development kits, allowing any Ouya owner to also be a developer, with no licensing fees. All games were initially required to have some free-to-play aspect,[8] whether completely free, free to try, or offering purchasable upgrades, levels, or other in-game items. This requirement was later removed.[13]
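
As an illustrative sketch of what that requirement meant for a game's structure, the following Java fragment gates levels behind a single "full game" purchase. StoreFacade and its two methods are hypothetical stand-ins for the console's purchasing API, not the actual Ouya development kit interface:

// Illustrative sketch of the original free-to-play rule: the game launches
// for free, and locked content is unlocked through an in-game purchase.
// StoreFacade is a hypothetical stand-in; the real Ouya development kit
// exposed comparable receipt/purchase calls under tv.ouya.console.api,
// with different names and signatures.
interface StoreFacade {
    boolean ownsProduct(String productId);   // has the player bought this item?
    void requestPurchase(String productId);  // open the platform purchase flow
}

public class LevelGate {
    private static final int FREE_LEVELS = 3;       // playable without purchase
    private static final String FULL_GAME = "full_game";

    private final StoreFacade store;

    public LevelGate(StoreFacade store) {
        this.store = store;
    }

    /** A level is playable if it is part of the free trial or the full game was bought. */
    public boolean isPlayable(int levelIndex) {
        return levelIndex < FREE_LEVELS || store.ownsProduct(FULL_GAME);
    }

    /** Called when the player selects a level that is still locked. */
    public void onLockedLevelSelected() {
        store.requestPurchase(FULL_GAME);
    }
}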

Despite the successful Kickstarter campaign, sales of the Ouya were lackluster,[14] causing financial problems for Ouya Inc. and forcing the company to wind down the business. Its software assets were sold to Razer Inc., which announced the discontinuation of the Ouya console in July 2015.[1] The Ouya has since been considered a commercial failure.[15][16]


Ouya was announced on July 3, 2012, as a new home video game console; the project was led by Julie Uhrman, chief executive officer of Santa Monica, California-based Boxer8, Inc. (rebranded Ouya, Inc. on August 13, 2012). On July 10, Ouya launched its Kickstarter campaign to gauge interest in the project.[8] Boxer8 confirmed that it had a working prototype[17] with in-progress software and user interface, featuring an Nvidia Tegra 3 chip and a price tag of $99 ($95 for the 1,000 "early bird" backers of the Kickstarter campaign).

The Kickstarter fundraising goal was reached within eight hours. Funding continued to grow as more models were made available at various pledge levels. According to Kickstarter, Ouya set the record for the best first-day performance of any project hosted on the site to date. Within the first 24 hours, the project attracted one backer every 5.59 seconds. Ouya became the eighth project in Kickstarter history to raise more than a million dollars, and was the quickest ever to do so.[18][19] The campaign finished on August 9 with $8,596,475 raised, 904% of its goal, making it the fifth-highest-earning Kickstarter in the website's history at the time.

Ouya units for Kickstarter funders started to ship on March 28, 2013.[20] On June 25, 2013, the Ouya was released to the public for $99.

Ouya announced the "Free the Games Fund" in July 2013 to support developers making games exclusively for its system: Ouya would match a Kickstarter campaign's pledges dollar-for-dollar if the campaign raised a minimum of $50,000, but only if the game remained an Ouya exclusive for six months.[21]

In October 2013, Uhrman stated that the company planned to release a new iteration of the Ouya console sometime in 2014,[22] with an improved controller, double the storage space, and better Wi-Fi.[23] On November 23, 2013, a limited edition white Ouya, with double the storage of the original and a new controller design, became available for pre-order at $129.[24]

On January 1, 2014, the limited edition white Ouya was withdrawn from sale and could no longer be found on the official store or from any official resellers. On January 31, 2014, a new black version of the Ouya was released with double the storage and the new controller design.[25]

In January 2015, Ouya received an investment of US$10 million from Alibaba, with the possibility of some of Ouya's technologies being incorporated into Alibaba's set-top boxes.[26]

In April 2015, it was revealed that Ouya was seeking a buyer after failing to renegotiate its debt.[27][28] On July 27, 2015, it was announced that Razer Inc. had acquired Ouya's employees and content library, and that the Ouya hardware was discontinued; the deal did not include Ouya's hardware assets. Owners were encouraged to migrate to Razer's own Forge microconsole: Ouya's content library would be integrated into the Forge ecosystem, and "[the] Ouya brand name will live on as a standalone gaming publisher for Android TV and Android-based TV consoles."[29][30] On the same day, Uhrman stepped down as Ouya's CEO.[31]

The technical team and developer relations personnel behind Ouya joined Razer's software team, which developed its own game platform, the Forge TV. The Forge TV was discontinued in 2016.[32][33][34][35]

The Ouya is a 75-millimetre (2.95-inch) cube designed to be used with a TV as its display via an HDMI connection. It ships with a single wireless controller but supports multiple controllers. Games are available via digital distribution or can be side-loaded.
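
As a hedged illustration of side-loading, the sketch below shells out to the stock Android Debug Bridge; "adb connect" and "adb install" are standard adb commands, while the console's network address and the APK file name are placeholders:

import java.io.IOException;

// Minimal sketch of side-loading a game onto the console over the local
// network using the stock Android Debug Bridge. The IP address and APK
// file name are placeholders, and adb must already be on the PATH.
public class Sideload {
    public static void main(String[] args) throws IOException, InterruptedException {
        run("adb", "connect", "192.168.1.50:5555"); // pair with the console
        run("adb", "install", "mygame.apk");        // push and install the game
    }

    private static void run(String... command) throws IOException, InterruptedException {
        Process process = new ProcessBuilder(command).inheritIO().start();
        if (process.waitFor() != 0) {
            throw new IOException("command failed: " + String.join(" ", command));
        }
    }
}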

The Ouya controller is a typical gamepad with dual analogue sticks, a directional pad, four face buttons (labeled O, U, Y, and A), and pairs of bumpers and triggers. It also includes a single-touch touchpad in the center of the controller.[42] The controller has magnetically attached faceplates that enclose its two AA batteries, one under each removable plate.
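
Because the console runs Android, a game could read these inputs through the standard Android input callbacks (the Ouya development kit also shipped its own controller helper class). The sketch below uses stock Android APIs; the face-button-to-keycode mapping in the comments is an assumption for illustration, not a quotation of the official documentation:

import android.app.Activity;
import android.view.KeyEvent;
import android.view.MotionEvent;

// Sketch of handling the gamepad with stock Android APIs. The O/U/Y/A
// face buttons arrive as ordinary gamepad key codes; the positional
// mapping shown here (O = bottom, A = right) is an assumption for
// illustration only.
public class GameActivity extends Activity {
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_BUTTON_A:   // Ouya "O" (bottom face button)
                confirm();
                return true;
            case KeyEvent.KEYCODE_BUTTON_B:   // Ouya "A" (right face button)
                cancel();
                return true;
            default:
                return super.onKeyDown(keyCode, event);
        }
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        // Left analogue stick arrives on the standard gamepad axes.
        float x = event.getAxisValue(MotionEvent.AXIS_X);
        float y = event.getAxisValue(MotionEvent.AXIS_Y);
        moveCursor(x, y);
        return true;
    }

    private void confirm() { /* game-specific */ }
    private void cancel() { /* game-specific */ }
    private void moveCursor(float x, float y) { /* game-specific */ }
}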

While initial reception of the Ouya was positive, raising $3.7 million on Kickstarter in the first two days, there were a number of vocal critics who were skeptical of the ability of the fledgling company to deliver a product at all. On July 12, 2012, PC Magazine's Sascha Segan ran an op-ed entitled "Why Kickstarter's Ouya Looks Like a Scam"[45] which was critical not only of the Ouya but of all Kickstarter-funded hardware projects. Unreality Magazine defended the Ouya, stating "A scam implies some sort of intentionally illegal deceit." ... "Tapping multiple investors from multiple sources isn’t a scam, it’s not even illegal, it’s business."[46]

Engadget reviewed the Kickstarter pre-release version of the Ouya on April 3, 2013. While praising the low cost and ease of hacking the console, it reported issues with controller buttons becoming stuck beneath the controller plating and the right analog stick snagging on the plating. It also reported a slight lag between the controller and the console and went on to say the controller was "usable, but it's far from great."[47]

The Verge reported similar issues with the controller and questioned its build quality. While the review praised the hackability and openness of the console, calling it "a device with lots of potential and few true limitations", it was mostly negative, criticizing the interface and the selection of launch games and stating that "Ouya isn't a viable gaming platform, or a good console, or even a nice TV interface."[48]

Engadget reviewed the retail version of the Ouya, and noted a largely improved experience as compared to what they found with their pre-release unit. Improvements to the gamepad were "huge", and they found "that the UI has been cleaned up and sped up". Engadget concluded that their "latest experience with the Android-based gaming device [left them] feeling optimistic" and that the company was "taking customer feedback seriously".[50]

Digital Trends called the final retail console "a device with a lot of potential built with love", and called the design a "sleek and cool-looking cube filled with gamingy goodness". The mostly positive review cited a lot of potential for the future, but was tempered by noting deficiencies in performance ("as powerful as many current smartphones"), and pointing out that the Ouya won't be able to compete with the "big three" console makers on performance, but must rely on carving out a niche in the market.[51]

ExtremeTech found that Ouya "has a number of serious faults". They mentioned the sub-par controller, the connectivity issues, and games which worked flawlessly on smartphones but stuttered on the console. Also, they remarked that "there just aren’t enough worthwhile games to play".[52]

Market analyst NPD Group described Ouya sales in its first month as "relatively light",[53] while several outlets noted low sales of games on the service in initial reports from developers.[54][55] In April 2014, developer Matt Thorson stated that his title TowerFall, the Ouya's most popular game at the time, had only sold 7,000 copies for the console.[56]

In July 2013, Ouya announced the "Free the Games Fund", a scheme to help fund developers: Ouya would match the pledges of any Kickstarter campaign that reached a minimum of $50,000, provided the game remained an Ouya exclusive for six months.[21] Suspicions were raised about the first two games to reach the target. Commentators noticed the small number of backers each pledging a large amount, the high proportion of backers who had never backed a project before, and the use of duplicate names and avatars, including those of celebrities.[57] This led some to suggest that the projects were artificially inflating their own backing in order to receive extra money from Ouya. In addition, one project had a backer whose identity appeared to be taken from a missing person's case.[57]

Nevertheless, Ouya rejected any suspicion regarding the backing of the projects and planned to continue providing funding.[57][58] In September 2013, Kickstarter suspended funding for one of the games that had reached its target, Elementary, My Dear Holmes.[59] The developers of the other funded game, Gridiron Thunder, threatened litigation against a commenter on the Kickstarter page,[59] and dismissed concerns that they would have no rights to official NFL branding, a license then held by Electronic Arts.[60] In the same month, another project, Dungeons: The Eye of Draconus, caused controversy by openly stating that a relative of one developer had provided substantial additional backing in order to have the project qualify for money from the Free the Games Fund.[61] Ouya removed the project from the Free the Games Fund, and the developers in turn removed the project from Kickstarter.[61]

Many developers criticized the fund's rules. Sophie Houlden removed her game, Rose and Time, from the Ouya marketplace in protest.[62] Matt Gilgenbach, who was trying to finance his game Neverending Nightmares with help from the fund, said, "It would kill me if due to other projects abusing the Free the Games Fund, people lost confidence in our project and what we are trying to do...While I believe in the idea of the Free the Games Fund, I think it definitely could use some reform in light of the potential avenues for abuse."[63] Eventually, Uhrman accepted this criticism. "Developers were telling us over and over, 'You’re being too idealistic, and you’re being too naive'...That was the part that personally took me a while to understand." Ouya changed the fund rules, including adding a dollar-per-backer limit. Houlden put her game back on the store, and Neverending Nightmares qualified for funding under the new rules.[62]

On September 18, 2013, Ouya modified the exclusivity clause of the fund. Developers would still not be able to release their software on mobile devices, video game consoles, and set-top boxes during the six-month exclusivity period, but they would be allowed to release on other personal computer systems, such as Windows, Mac OS X, and Linux, during that time.[64]

1.
Microconsole
–
A microconsole is a type of video game console. In late 2010, cloud gaming startup OnLive released MicroConsole, a television adapter, the MicroConsole TV adapter was produced at a loss. OnLives MicroConsole made the company a leader in the nascent microconsole field. The Ouya was a success and raised $8.5 million. Significant interest in low-cost Android console gaming followed Ouyas success, spurred by the games industry growth. The industry began to refer to the consoles as alternative consoles. Forbess Daniel Nye Griffiths referred to Ouya and GameSticks close release dates as the fields first showdown. The GamePop and MOJO announcements in the summer referred to the devices as microconsoles. The PlayStation TV is a microconsole announced in September 2013 at a Sony Computer Entertainment Japan presentation and it was released in Japan on November 14,2013 and in North America on October 14,2014. Gamasutra called Ouya, GameStick, and GamePop console alternatives that represent a new market space for developers. Microconsole promises of a less restrictive platform are expected to empower independent game developers, kelly referred to the deliberately small microconsoles as the netbooks of the console world, not intended to compete with big video game consoles. Other reviewers called the microconsoles competitors, though not a threat, kelly added that Ouya is heavily focused on the early adopter audience and its interests, and that Ouyas natural advantage of price has not been communicated effectively. Edge questioned possibilities of success due to competition within the field as well as from Nintendo, Sony. The pre-release Ouya was panned by early reviewers, the Verge called it unfinished, and in a later review, Eurogamer questioned why consumers would purchase a console that duplicated the functionality of smartphones they already had. The video game saw the digital media receiver on Apples Apple TV as potential microconsole competition due to the companys experience in the mobile games market. List of microconsoles Mobile game Cloud gaming Handheld TV game Video game clones

2.
Eighth generation of video game consoles
–
In the history of video games, the seventh generation includes consoles released since late 2005 by Nintendo, Microsoft, and Sony Computer Entertainment. The eighth generation began in November 2012, each new console introduced a new type of breakthrough in technology. Some of the Wii controllers could be moved about to control in-game actions, Video game consoles had become an important part of the global IT infrastructure. It is estimated that video game consoles represented 25% of the worlds general-purpose computational power in the year 2007, joining Nintendo in the motion market, Sony Computer Entertainment released the PlayStation Move in September 2010. The PlayStation Move features motion sensing gaming, similar to that of the Wii, Microsoft joined the scene in November 2010, with its Kinect. Unlike the other two systems, Kinect does not use controllers of any sort and makes the users the controller, having sold 8 million units in its first 60 days on the market, Kinect has claimed the Guinness World Record of being the fastest selling consumer electronics device. While the Xbox 360 offers wired as well as wireless controllers as a product, all PlayStation 3 controllers can be used in wired. The Nintendo DS features a screen and built-in microphone. Additionally, the version of the NDS, the Nintendo DSi. The PlayStation Portable released later the year on December 12,2004. It became the first handheld game console to use an optical disc format, Universal Media Disc. Sony also gave the PSP robust multi-media capability, connectivity with the PlayStation 3, PlayStation 2, other PSPs, as well as Internet connectivity. The Nintendo DS likewise had connectivity to the internet through the Nintendo Wi-Fi Connection and Nintendo DS Browser, as well as wireless connectivity to other DS systems, a crowdfunded console, the Ouya, received $8.5 million in pre-orders, launching in 2013. Post-launch sales were poor, and the device was a commercial failure, the business was wound down due to financial problems and sold to Razer Inc. Razer discontinued the Ouya in July 2015, the first discontinued seventh generation console was the Wii, which Nintendo announced it would be discontinuing production of on October 2013. Microsoft also announced in 2016 that they would discontinue the Xbox 360 at the end of April that year, once this happens, the seventh generation of video game consoles will end. Nintendo entered this generation with a new approach embodied by its Wii and this approach was previously implemented in the portable market with the Nintendo DS. This strategy paid off, with demand for the Wii outstripping supply throughout 2007. Since Nintendo profited on each console right from the start unlike its competitors, as in previous generations, Nintendo provided strong support for its new console with popular first-party franchises like Mario, The Legend of Zelda, Metroid, and Pokémon, among others

3.
United States dollar
–
The United States dollar is the official currency of the United States and its insular territories per the United States Constitution. It is divided into 100 smaller cent units, the circulating paper money consists of Federal Reserve Notes that are denominated in United States dollars. The U. S. dollar was originally commodity money of silver as enacted by the Coinage Act of 1792 which determined the dollar to be 371 4/16 grain pure or 416 grain standard silver, the currency most used in international transactions, it is the worlds primary reserve currency. Several countries use it as their currency, and in many others it is the de facto currency. Besides the United States, it is used as the sole currency in two British Overseas Territories in the Caribbean, the British Virgin Islands and Turks and Caicos Islands. A few countries use the Federal Reserve Notes for paper money, while the country mints its own coins, or also accepts U. S. coins that can be used as payment in U. S. dollars. After Nixon shock of 1971, USD became fiat currency, Article I, Section 8 of the U. S. Constitution provides that the Congress has the power To coin money, laws implementing this power are currently codified at 31 U. S. C. Section 5112 prescribes the forms in which the United States dollars should be issued and these coins are both designated in Section 5112 as legal tender in payment of debts. The Sacagawea dollar is one example of the copper alloy dollar, the pure silver dollar is known as the American Silver Eagle. Section 5112 also provides for the minting and issuance of other coins and these other coins are more fully described in Coins of the United States dollar. The Constitution provides that a regular Statement and Account of the Receipts and that provision of the Constitution is made specific by Section 331 of Title 31 of the United States Code. The sums of money reported in the Statements are currently being expressed in U. S. dollars, the U. S. dollar may therefore be described as the unit of account of the United States. The word dollar is one of the words in the first paragraph of Section 9 of Article I of the Constitution, there, dollars is a reference to the Spanish milled dollar, a coin that had a monetary value of 8 Spanish units of currency, or reales. In 1792 the U. S. Congress passed a Coinage Act, Section 20 of the act provided, That the money of account of the United States shall be expressed in dollars, or units. And that all accounts in the offices and all proceedings in the courts of the United States shall be kept and had in conformity to this regulation. In other words, this act designated the United States dollar as the unit of currency of the United States, unlike the Spanish milled dollar the U. S. dollar is based upon a decimal system of values. Both one-dollar coins and notes are produced today, although the form is significantly more common

4.
Pound sterling
–
It is subdivided into 100 pence. A number of nations that do not use sterling also have called the pound. At various times, the sterling was commodity money or bank notes backed by silver or gold. The pound sterling is the worlds oldest currency still in use, the British Crown dependencies of Guernsey and Jersey produce their own local issues of sterling, the Guernsey pound and the Jersey pound. The pound sterling is also used in the Isle of Man, Gibraltar, the Bank of England is the central bank for the pound sterling, issuing its own coins and banknotes, and regulating issuance of banknotes by private banks in Scotland and Northern Ireland. Sterling is the fourth most-traded currency in the exchange market, after the United States dollar, the euro. Together with those three currencies it forms the basket of currencies which calculate the value of IMF special drawing rights, Sterling is also the third most-held reserve currency in global reserves. The full, official name, pound sterling, is used mainly in formal contexts, otherwise the term pound is normally used. The abbreviations ster. or stg. are sometimes used, the term British pound is commonly used in less formal contexts, although it is not an official name of the currency. The pound sterling is also referred to as cable amongst forex traders, the origins of this term are attributed to the fact that in the 1800s, the dollar/pound sterling exchange rate was transmitted via transatlantic cable. Forex brokers are sometimes referred to as cable dealers, as another established source notes, the compound expression was then derived, silver coins known as sterlings were issued in the Saxon kingdoms,240 of them being minted from a pound of silver. Hence, large payments came to be reckoned in pounds of sterlings, in 1260, Henry III granted them a charter of protection. And because the Leagues money was not frequently debased like that of England, English traders stipulated to be paid in pounds of the Easterlings, and land for their Kontor, the Steelyard of London, which by the 1340s was also called Easterlings Hall, or Esterlingeshalle. For further discussion of the etymology of sterling, see sterling silver, the currency sign for the pound sign is £, which is usually written with a single cross-bar, though a version with a double cross-bar is also sometimes seen. The ISO4217 currency code is GBP, occasionally, the abbreviation UKP is used but this is non-standard because the ISO3166 country code for the United Kingdom is GB. The Crown dependencies use their own codes, GGP, JEP, stocks are often traded in pence, so traders may refer to pence sterling, GBX, when listing stock prices. A common slang term for the pound sterling or pound is quid, since decimalisation in 1971, the pound has been divided into 100 pence. The symbol for the penny is p, hence an amount such as 50p properly pronounced fifty pence is more colloquially, quite often, pronounced fifty pee /fɪfti, pi and this also helped to distinguish between new and old pence amounts during the changeover to the decimal system

5.
Razer Inc.
–
Razer is dedicated to the creation and development of products mainly focused on PC gaming such as laptops, tablet computer, various PC peripherals, wearables, and accessories. The Razer brand is currently being marketed under Razer USA Ltd, Razer was founded in 1998 by a team of marketers and engineers to develop and market a high-end computer gaming mouse, the Boomslang, targeted to computer gamers. At Consumer Electronics Show 2011, Razer unveiled the Razer Switchblade, at CES2013, Razer unveiled its Razer Edge gaming tablet computer, which was previously known as Project Fiona. The tablet uses the Windows 8 operating system and is designed with gaming in mind, in May 2013, Razer unveiled the 14-inch Razer Blade and 17-inch Razer Blade Pro gaming laptops with fourth-generation Intel Haswell processors. At CES2014, Razer unveiled Project Christine, a modular gaming PC, each of the branches on the PC is a discrete component—a CPU, a GPU, a hard drive, memory—that simply plug into the central backbone. Once slotted in, Project Christine automatically syncs the newly added modules through PCI-Express, in July 2015, Razer announced it was purchasing the software division of video-game company Ouya. At Consumer Electronics Show 2016, Razer has been selected for Peoples Choice Winner for Razer Blade Stealth Ultrabook. The company won the year before for the Razer Forge TV, and this year, it took home the prize for the Razer Blade Stealth Ultrabook, in October 2016, Razer purchased THX according to THX CEO Ty Ahmad-Taylor. At CES2017, Razer revealed Project Valerie, a triple display laptop, and Project Ariana, in January 2017, Razer bought manufacturer Nextbit, the startup behind the Robin smartphone. Razers products are targeted at gamers, and include gaming laptops, gaming tablets, and PC peripherals such as mice, audio devices, keyboards, mouse mats. Razer has also released a VOIP software called Razer Comms, the Razer DeathAdder gaming mouse is the companys most popular product by sales numbers. The Razer Blade series is a series of gaming laptops developed by Razer and include the 12. 5-inch Razer Blade Stealth, the 14-inch Razer Blade, the Blade stealth was announced alongside the Razer Core. At DreamHack 2015, Razer and Lenovo announced a partnership to make a gaming desktop called the Lenovo Y900 Razer Edition gaming desktop. The Razer Edge is a gaming tablet PC developed by Razer specifically for games which run on Windows OS, in late 2014, Razer released their Chroma series of products. All Chroma series products have customizable RGB lighting, the first item in the series was the Blackwidow Chroma. The Blackwidow Chroma is a keyboard that has new features added onto the original Blackwidow keyboard. After the first Chroma product was released, Razer has continued to add products to the series and they have also added their RGB lighting onto another mouse they recently released called the Razer Mamba Chroma, which has RGB on the sides of the mouse with 14 different zones. A new version of the Razer Diamondback gaming mouse was also released, the Razer Nabu is a smart band developed by Razer with features such as mobile app notifications, fitness tracking, and more which was first released in December,2014

6.
Operating system
–
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require a system to function. Operating systems are found on many devices that contain a computer – from cellular phones, the dominant desktop operating system is Microsoft Windows with a market share of around 83. 3%. MacOS by Apple Inc. is in place, and the varieties of Linux is in third position. Linux distributions are dominant in the server and supercomputing sectors, other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can run one program at a time. Multi-tasking may be characterized in preemptive and co-operative types, in preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, e. g. Solaris, Linux, cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking, 32-bit versions of both Windows NT and Win9x, used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem, a distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked and communicate with each other gave rise to distributed computing, distributed computations are carried out on more than one machine. When computers in a work in cooperation, they form a distributed system. The technique is used both in virtualization and cloud computing management, and is common in large server warehouses, embedded operating systems are designed to be used in embedded computer systems. They are designed to operate on small machines like PDAs with less autonomy and they are able to operate with a limited number of resources. They are very compact and extremely efficient by design, Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is a system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could run different programs in succession to speed up processing

7.
Android (operating system)
–
Android is a mobile operating system developed by Google, based on the Linux kernel and designed primarily for touchscreen mobile devices such as smartphones and tablets. In addition to devices, Google has further developed Android TV for televisions, Android Auto for cars. Variants of Android are also used on notebooks, game consoles, digital cameras, beginning with the first commercial Android device in September 2008, the operating system has gone through multiple major releases, with the current version being 7.0 Nougat, released in August 2016. Android applications can be downloaded from the Google Play store, which features over 2.7 million apps as of February 2017, Android has been the best-selling OS on tablets since 2013, and runs on the vast majority of smartphones. In September 2015, Android had 1.4 billion monthly active users, Android is popular with technology companies that require a ready-made, low-cost and customizable operating system for high-tech devices. The success of Android has made it a target for patent, Android Inc. was founded in Palo Alto, California in October 2003 by Andy Rubin, Rich Miner, Nick Sears, and Chris White. Rubin described the Android project as tremendous potential in developing smarter mobile devices that are aware of its owners location. The early intentions of the company were to develop an operating system for digital cameras. Despite the past accomplishments of the founders and early employees, Android Inc. operated secretly and that same year, Rubin ran out of money. Steve Perlman, a friend of Rubin, brought him $10,000 in cash in an envelope. In July 2005, Google acquired Android Inc. for at least $50 million and its key employees, including Rubin, Miner and White, joined Google as part of the acquisition. Not much was known about Android at the time, with Rubin having only stated that they were making software for mobile phones, at Google, the team led by Rubin developed a mobile device platform powered by the Linux kernel. Google marketed the platform to handset makers and carriers on the promise of providing a flexible, upgradeable system, Google had lined up a series of hardware components and software partners and signaled to carriers that it was open to various degrees of cooperation. Speculation about Googles intention to enter the communications market continued to build through December 2006. In September 2007, InformationWeek covered an Evalueserve study reporting that Google had filed several patent applications in the area of mobile telephony, the first commercially available smartphone running Android was the HTC Dream, also known as T-Mobile G1, announced on September 23,2008. Since 2008, Android has seen numerous updates which have improved the operating system, adding new features. Each major release is named in order after a dessert or sugary treat, with the first few Android versions being called Cupcake, Donut, Eclair. In 2010, Google launched its Nexus series of devices, a lineup in which Google partnered with different device manufacturers to produce new devices and introduce new Android versions

8.
Android version history
–
The version history of the Android mobile operating system began with the release of the Android alpha in November 5,2007. The first commercial version, Android 1.0, was released in September 2008, Android is continually developed by Google and the Open Handset Alliance, and it has seen a number of updates to its base operating system since the initial release. Versions 1.0 and 1.1 were not released under specific code names, each is in alphabetical order, with the most recent major version being Android 7.0 Nougat, released in August 2016. A version of Android KitKat exclusive to Android Wear devices was released on June 25,2014, the development of Android started in 2003 by Android, Inc. which was purchased by Google in 2005. There were at least two internal releases of the software inside Google and the OHA before the version was released. The code names Astro Boy and Bender were used internally for some pre-1.0 milestones, dan Morrill created some of the first mascot logos, but the current Android logo was designed by Irina Blok. The project manager, Ryan Gibson, conceived the confectionery-themed naming scheme that has used for the majority of the public releases. The beta was released on November 5,2007, while the development kit was released on November 12,2007. The main hardware platform for Android is the ARM architecture, with x86, unofficial Android-x86 project used to provide support for the x86 and MIPS architectures ahead of the official support. Since 2012, Android devices with Intel processors began to appear, including phones, while gaining support for 64-bit platforms, Android was first made to run on 64-bit x86 and then on ARM64. Since Android 5.0 Lollipop, 64-bit variants of all platforms are supported in addition to the 32-bit variants. Requirements for the amount of RAM for devices running Android 5.1 range from 512 MB of RAM for normal-density screens. Android 4.4 requires a 32-bit ARMv7, MIPS or x86 architecture processor, Android supports OpenGL ES1.1,2.0,3.0,3.2 and as of latest major version Vulkan. Some applications may require a certain version of the OpenGL ES

9.
System on a chip
–
A system on a chip or system on chip is an integrated circuit that integrates all components of a computer or other electronic systems. It may contain digital, analog, mixed-signal, and often radio-frequency functions—all on a single substrate, SoCs are very common in the mobile computing market because of their low power-consumption. A typical application is in the area of embedded systems, the contrast with a microcontroller, SoC integrates microcontroller with advanced peripherals like graphics processing unit, Wi-Fi module, or coprocessor. As long as we remember that the SoC does not necessarily contain built-in memory, in general, we can distinguish three types of SoC. SoC built around a microcontroller, SoC built around a microprocessor, a separate category may be Programmable SoC, part of elements is not permanently defined and can be programmable in a manner analogous to the FPGA or CPLD. When it is not feasible to construct a SoC for a particular application, in large volumes, SoC is believed to be more cost-effective than SiP since it increases the yield of the fabrication and because its packaging is simpler. Another option, as seen for example in cell phones, is package on package stacking during board assembly. The SoC includes processors and numerous digital peripherals, and comes in a ball grid package with lower and upper connections. The lower balls connect to the board and various peripherals, with the balls in a ring holding the memory buses used to access NAND flash. Memory packages could come from multiple vendors, DMA controllers route data directly between external interfaces and memory, bypassing the processor core and thereby increasing the data throughput of the SoC. A SoC consists of both the hardware, described above, and the controlling the microcontroller, microprocessor or DSP cores, peripherals. The design flow for a SoC aims to develop hardware and software in parallel. Most SoCs are developed from pre-qualified hardware blocks for the elements described above. Of particular importance are the protocol stacks that drive industry-standard interfaces like USB, the hardware blocks are put together using CAD tools, the software modules are integrated using a software-development environment. Once the architecture of the SoC has been defined, any new elements are written in an abstract language termed RTL which defines the circuit behaviour. These elements are connected together in the same RTL language to create the full SoC design, chips are verified for logical correctness before being sent to foundry. This process is called functional verification and it accounts for a significant portion of the time, with the growing complexity of chips, hardware verification languages like SystemVerilog, SystemC, e, and OpenVera are being used. Bugs found in the stage are reported to the designer

10.
Nvidia
–
Nvidia Corporation is an American technology company based in Santa Clara, California. It designs graphics processing units for the market, as well as system on a chip units for the mobile computing. Its primary GPU product line, labeled GeForce, is in competition with Advanced Micro Devices Radeon products. Nvidia expanded its presence in the industry with its handheld SHIELD Portable, SHIELD Tablet. Since 2014, Nvidia has shifted to become a company focused on four markets – gaming, professional visualization, data centers. In addition to GPU manufacturing, Nvidia provides parallel processing capabilities to researchers and they are deployed in supercomputing sites around the world. More recently, It has moved into the computing market. In addition to AMD, its competitors include Intel, Qualcomm, Nvidia is now focused on artificial intelligence. The name of the company comes from Invidia in Roman mythology who corresponds to Nemesis, RIVA TNT in 1998 solidified Nvidias reputation for capable hardware. Autumn 1999 saw the release of the GeForce, most notably introducing on-board transformation, running at 120 MHz and featuring four pixel pipelines, it implemented advanced video acceleration, motion compensation and hardware sub-picture alpha blending. The GeForce outperformed existing products by a wide margin, due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsofts Xbox game console, which earned Nvidia a $200 million advance. However, the project drew the time of many of its best engineers away from other projects, in the short term this did not matter, and the GeForce2 GTS shipped in the summer of 2000. In December 2000, Nvidia reached an agreement to acquire the assets of its one-time rival 3dfx. The acquisition process was finalized in April 2002, in July 2002, Nvidia acquired Exluna for an undisclosed sum. Exluna made software rendering tools and the personnel were merged into the Cg project, in August 2003, Nvidia acquired MediaQ for approximately US$70 million. On April 22,2004, Nvidia acquired iReady, also a provider of high performance TCP/IP, in December 2004, it was announced that Nvidia would assist Sony with the design of the graphics processor in the PlayStation 3 game console. In March 2006, it emerged that Nvidia would deliver RSX to Sony as an IP core, under the agreement, Nvidia would provide ongoing support to port the RSX to Sonys fabs of choice, as well as die shrinks to 65 nm. This practice contrasted with its business arrangement with Microsoft, in which Nvidia managed production, meanwhile, in May 2005 Microsoft chose to license a design by ATI and to make its own manufacturing arrangements for the Xbox 360 graphics hardware, as had Nintendo for the Wii console

11.
Tegra
–
Tegra is a system on a chip series developed by Nvidia for mobile devices such as smartphones, personal digital assistants, and mobile Internet devices. The Tegra integrates an ARM architecture central processing unit, graphics processing unit, northbridge, southbridge, early Tegra SoCs are designed as efficient multimedia processors, while more recent models emphasize gaming performance without sacrificing power efficiency. The Tegra APX2500 was announced on February 12,2008, the Tegra 6xx product line was revealed on June 2,2008, and the APX2600 was announced in February 2009. The APX chips were designed for smartphones, while the Tegra 600 and 650 chips were intended for smartbooks, the first product to use the Tegra was Microsofts Zune HD media player in September 2009, followed by the Samsung M1. Microsofts KIN was the first cellular phone to use the Tegra, however, in September 2008, Nvidia and Opera Software announced that they would produce a version of the Opera 9.5 browser optimised for the Tegra on Windows Mobile and Windows CE. At Mobile World Congress 2009, Nvidia introduced its port of Googles Android to the Tegra, on January 7,2010, Nvidia officially announced and demonstrated its next generation Tegra system-on-a-chip, the Nvidia Tegra 250, at Consumer Electronics Show 2010. Nvidia primarily supports Android on Tegra 2, but booting other ARM-supporting operating systems is possible on devices where the bootloader is accessible, Tegra 2 support for the Ubuntu GNU/Linux distribution was also announced on the Nvidia developer forum. Nvidia announced the first quad-core SoC at the February 2011 Mobile World Congress event in Barcelona, though the chip was codenamed Kal-El, it is now branded as Tegra 3. Early benchmark results show impressive gains over Tegra 2, and the chip was used in many of the released in the second half of 2011. In January 2012, Nvidia announced that Audi had selected the Tegra 3 processor for its in-vehicle infotainment systems, the processor will be integrated into Audis entire line of vehicles worldwide, beginning in 2013. In summer of 2012 Tesla Motors began shipping the Model S all electric, high performance sedan, one VCM powers the 17-inch touchscreen infotainment system, and one drives the 12. 3-inch all digital instrument cluster. In March 2015, Nvidia announced the Tegra X1, the first SoC to have a performance of 1 teraflop. At the announcement event, Nvidia showed off Epic Games Unreal Engine 4 Elemental demo, on October 20,2016, Nvidia announced that Nintendos upcoming Switch hybrid home/portable game console will be powered by Tegra hardware. There is a version of the Tegra 2 SoC supporting 3D displays, the Tegra 2 video decoder is largely unchanged from the original Tegra and has limited support for HD formats. The lack of support for high-profile H.264 is particularly troublesome when using online streaming services. While all cores are Cortex-A9s, the core is manufactured with a low-power silicon process. This core operates transparently to applications and is used to power consumption when processing load is minimal. The main quad-core portion of the CPU powers off in these situations, Tegra 3 is the first Tegra release to support ARMs SIMD extension, NEON

12.
Central processing unit
–
The computer industry has used the term central processing unit at least since the early 1960s. The form, design and implementation of CPUs have changed over the course of their history, most modern CPUs are microprocessors, meaning they are contained on a single integrated circuit chip. An IC that contains a CPU may also contain memory, peripheral interfaces, some computers employ a multi-core processor, which is a single chip containing two or more CPUs called cores, in that context, one can speak of such single chips as sockets. Array processors or vector processors have multiple processors that operate in parallel, there also exists the concept of virtual CPUs which are an abstraction of dynamical aggregated computational resources. Early computers such as the ENIAC had to be rewired to perform different tasks. Since the term CPU is generally defined as a device for software execution, the idea of a stored-program computer was already present in the design of J. Presper Eckert and John William Mauchlys ENIAC, but was initially omitted so that it could be finished sooner. On June 30,1945, before ENIAC was made, mathematician John von Neumann distributed the paper entitled First Draft of a Report on the EDVAC and it was the outline of a stored-program computer that would eventually be completed in August 1949. EDVAC was designed to perform a number of instructions of various types. Significantly, the programs written for EDVAC were to be stored in high-speed computer memory rather than specified by the wiring of the computer. This overcame a severe limitation of ENIAC, which was the considerable time, with von Neumanns design, the program that EDVAC ran could be changed simply by changing the contents of the memory. Early CPUs were custom designs used as part of a larger, however, this method of designing custom CPUs for a particular application has largely given way to the development of multi-purpose processors produced in large quantities. This standardization began in the era of discrete transistor mainframes and minicomputers and has accelerated with the popularization of the integrated circuit. The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers, both the miniaturization and standardization of CPUs have increased the presence of digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in electronic devices ranging from automobiles to cellphones, the so-called Harvard architecture of the Harvard Mark I, which was completed before EDVAC, also utilized a stored-program design using punched paper tape rather than electronic memory. Relays and vacuum tubes were used as switching elements, a useful computer requires thousands or tens of thousands of switching devices. The overall speed of a system is dependent on the speed of the switches, tube computers like EDVAC tended to average eight hours between failures, whereas relay computers like the Harvard Mark I failed very rarely. In the end, tube-based CPUs became dominant because the significant speed advantages afforded generally outweighed the reliability problems, most of these early synchronous CPUs ran at low clock rates compared to modern microelectronic designs. Clock signal frequencies ranging from 100 kHz to 4 MHz were very common at this time, the design complexity of CPUs increased as various technologies facilitated building smaller and more reliable electronic devices

13.
Gigabyte
–
The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 109 in the International System of Units, the unit symbol for the gigabyte is GB. However, the term is used in some fields of computer science and information technology to denote 1073741824 bytes. The use of gigabyte may thus be ambiguous, to address this ambiguity, the International System of Quantities standardizes the binary prefixes which denote a series of integer powers of 1024. With these prefixes, a module that is labeled as having the size 1GB has one gibibyte of storage capacity. The term gigabyte is commonly used to mean either 10003 bytes or 10243 bytes, the latter binary usage originated as compromise technical jargon for byte multiples that needed to be expressed in a power of 2, but lacked a convenient name. As 1024 is approximately 1000, roughly corresponding to SI multiples, in 1998 the International Electrotechnical Commission published standards for binary prefixes, requiring that the gigabyte strictly denote 10003 bytes and gibibyte denote 10243 bytes. By the end of 2007, the IEC Standard had been adopted by the IEEE, EU, and NIST and this is the recommended definition by the International Electrotechnical Commission. The file manager of Mac OS X version 10.6 and later versions are an example of this usage in software. The binary definition uses powers of the base 2, as is the principle of binary computers. This usage is widely promulgated by some operating systems, such as Microsoft Windows in reference to computer memory and this definition is synonymous with the unambiguous unit gibibyte. Since the first disk drive, the IBM350, disk drive manufacturers expressed hard drive capacities using decimal prefixes, with the advent of gigabyte-range drive capacities, manufacturers based most consumer hard drive capacities in certain size classes expressed in decimal gigabytes, such as 500 GB. The exact capacity of a given model is usually slightly larger than the class designation. Practically all manufacturers of disk drives and flash-memory disk devices continue to define one gigabyte as 1000000000bytes. Some operating systems such as OS X express hard drive capacity or file size using decimal multipliers and this discrepancy causes confusion, as a disk with an advertised capacity of, for example,400 GB might be reported by the operating system as 372 GB, meaning 372 GiB. The JEDEC memory standards use IEEE100 nomenclature which quote the gigabyte as 1073741824bytes and this means that a 300 GB hard disk might be indicated variously as 300 GB,279 GB or 279 GiB, depending on the operating system. As storage sizes increase and larger units are used, these differences even more pronounced. Some legal challenges have been waged over this confusion such as a lawsuit against drive manufacturer Western Digital, Western Digital settled the challenge and added explicit disclaimers to products that the usable capacity may differ from the advertised capacity

14.
Random-access memory
–
Random-access memory is a form of computer data storage which stores frequently used program instructions to increase the general speed of a system. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the location of data inside the memory. RAM contains multiplexing and demultiplexing circuitry, to connect the lines to the addressed storage for reading or writing the entry. Usually more than one bit of storage is accessed by the same address, in todays technology, random-access memory takes the form of integrated circuits. RAM is normally associated with types of memory, where stored information is lost if power is removed. Other types of non-volatile memories exist that allow access for read operations. These include most types of ROM and a type of memory called NOR-Flash. Integrated-circuit RAM chips came into the market in the early 1970s, with the first commercially available DRAM chip, early computers used relays, mechanical counters or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order it was written, drum memory could be expanded at relatively low cost but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later, out of transistors, were used for smaller and faster memories such as registers. Such registers were relatively large and too costly to use for large amounts of data, the first practical form of random-access memory was the Williams tube starting in 1947. It stored data as electrically charged spots on the face of a cathode ray tube, since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was smaller, faster. In fact, rather than the Williams tube memory being designed for the SSEM, magnetic-core memory was invented in 1947 and developed up until the mid-1970s. It became a form of random-access memory, relying on an array of magnetized rings. By changing the sense of each rings magnetization, data could be stored with one bit stored per ring, since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible. Magnetic core memory was the form of memory system until displaced by solid-state memory in integrated circuits. Data was stored in the capacitance of each transistor, and had to be periodically refreshed every few milliseconds before the charge could leak away

15.
Flash memory
–
Flash memory is electronic non-volatile computer storage medium that can be electrically erased and reprogrammed. Toshiba developed flash memory from EEPROM in the early 1980s and introduced it to the market in 1984, the two main types of flash memory are named after the NAND and NOR logic gates. The individual flash memory cells exhibit internal characteristics similar to those of the corresponding gates, where EPROMs had to be completely erased before being rewritten, NAND-type flash memory may be written and read in blocks which are generally much smaller than the entire device. NOR-type flash allows a machine word to be written‍—‌to an erased location‍—‌or read independently. The NAND type operates primarily in memory cards, USB flash drives, solid-state drives, NAND or NOR flash memory is also often used to store configuration data in numerous digital products, a task previously made possible by EEPROM or battery-powered static RAM. One key disadvantage of flash memory is that it can endure a relatively small number of write cycles in a specific block. In addition to being non-volatile, flash memory offers fast read access times, although flash memory is technically a type of EEPROM, the term EEPROM is generally used to refer specifically to non-flash EEPROM which is erasable in small blocks, typically bytes. Because erase cycles are slow, the block sizes used in flash memory erasing give it a significant speed advantage over non-flash EEPROM when writing large amounts of data. As of 2013, flash memory costs much less than byte-programmable EEPROM and had become the dominant memory type wherever a system required a significant amount of non-volatile solid-state storage, Flash memory was invented by Fujio Masuoka while working for Toshiba circa 1980. According to Toshiba, the flash was suggested by Masuokas colleague, Shōji Ariizumi. Masuoka and colleagues presented the invention at the IEEE1984 International Electron Devices Meeting held in San Francisco, Intel Corporation saw the massive potential of the invention and introduced the first commercial NOR type flash chip in 1988. NOR-based flash has long erase and write times, but provides full address and data buses, allowing random access to any memory location. This makes it a replacement for older read-only memory chips. Its endurance may be from as little as 100 erase cycles for a flash memory, to a more typical 10,000 or 100,000 erase cycles. NOR-based flash was the basis of early flash-based removable media, CompactFlash was originally based on it, however, the I/O interface of NAND flash does not provide a random-access external address bus. Rather, data must be read on a basis, with typical block sizes of hundreds to thousands of bits. This makes NAND flash unsuitable as a replacement for program ROM. In this regard, NAND flash is similar to other data storage devices, such as hard disks and optical media

16.
HDMI
–
HDMI is a digital replacement for analog video standards. HDMI implements the EIA/CEA-861 standards, which video formats and waveforms, transport of compressed, uncompressed, and LPCM audio, auxiliary data. CEA-861 signals carried by HDMI are electrically compatible with the CEA-861 signals used by the visual interface. No signal conversion is necessary, nor is there a loss of quality when a DVI-to-HDMI adapter is used. The CEC capability allows HDMI devices to each other when necessary. Several versions of HDMI have been developed and deployed since initial release of the technology but all use the same cable and connector. Other than improved audio and video capacity, performance, resolution and color spaces, newer versions have optional advanced features such as 3D, Ethernet data connection, production of consumer HDMI products started in late 2003. In Europe either DVI-HDCP or HDMI is included in the HD ready in-store labeling specification for TV sets for HDTV, HDMI began to appear on consumer HDTV camcorders and digital still cameras in 2006. As of January 6,2015, over 4 billion HDMI devices have been sold, the HDMI founders are Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson, RCA and Toshiba. Digital Content Protection, LLC provides HDCP for HDMI, HDMI has the support of motion picture producers Fox, Universal, Warner Bros. and Disney, along with system operators DirecTV, EchoStar and CableLabs. The HDMI founders began development on HDMI1.0 on April 16,2002, at the time, DVI-HDCP and DVI-HDTV were being used on HDTVs. HDMI1.0 was designed to improve on DVI-HDTV by using a connector and adding audio capability and enhanced YCbCr capability. The first Authorized Testing Center, which tests HDMI products, was opened by Silicon Image on June 23,2003, in California, the first ATC in Japan was opened by Panasonic on May 1,2004, in Osaka. The first ATC in Europe was opened by Philips on May 25,2005, in Caen, the first ATC in China was opened by Silicon Image on November 21,2005, in Shenzhen. The first ATC in India was opened by Philips on June 12,2008, the HDMI website contains a list of all the ATCs. According to In-Stat, the number of HDMI devices sold was 5 million in 2004,17.4 million in 2005,63 million in 2006, and 143 million in 2007. HDMI has become the de facto standard for HDTVs, and according to In-Stat, In-Stat has estimated that 229 million HDMI devices were sold in 2008. On April 8,2008 there were over 850 consumer electronics, on January 7,2009, HDMI Licensing, LLC announced that HDMI had reached an installed base of over 600 million HDMI devices

17.
720p
–
720p is a progressive HDTV signal format with 720 horizontal lines and an aspect ratio of 16:9, normally known as widescreen HDTV. The number 720 stands for the 720 horizontal scan lines of display resolution, while the p stands for progressive scan, i.e. non-interlaced. When broadcast at 60 frames per second, 720p features the highest temporal resolution possible under the ATSC and DVB standards. The term assumes a widescreen aspect ratio of 16:9.

720i is a term found in numerous sources and publications. Typically, it is an error in which the author is referring to the 720p HDTV format; however, in some cases it is incorrectly presented as an actual alternative format to 720p. No proposed or existing broadcast standard permits 720 interlaced lines in a frame at any frame rate.

Progressive scanning reduces the need to prevent flicker by anti-aliasing single high-contrast horizontal lines, and it is also easier to perform high-quality 50↔60 Hz conversion and slow-motion clips with progressive video. A 720p60 video has an advantage over 480i and 1080i60 in that it reduces the number of 3:2 pulldown artifacts introduced during transfer from 24 frame/s film (see the sketch below). However, 576i and 1080i50, which are common in Europe, generally do not suffer from pulldown artifacts, as film frames are simply played at 25 frames per second and the audio pitch-corrected by 25/24ths. As a result, 720p60 is used for U.S. broadcasts, while European HD broadcasts often use 1080i50. Arte, a dual-language French-German channel produced in collaboration by ARD, ZDF and France Télévisions, broadcasts in German at 720p50 but in French at 1080i50.
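To make the 3:2 pulldown point concrete, the toy Python sketch below counts how 24 film frames are stretched into 60 interlaced fields per second; the cadence function is illustrative, not broadcast-grade code.

```python
# 3:2 pulldown: to fit 24 film frames into 60 fields of interlaced video,
# successive film frames are held for 3 fields, then 2 fields, and so on.
# A progressive 60 Hz signal avoids this uneven cadence entirely.

def pulldown_32(n_film_frames):
    """Return the field sequence (by source frame index) for 3:2 pulldown."""
    fields = []
    for i in range(n_film_frames):
        fields += [i] * (3 if i % 2 == 0 else 2)  # alternate 3, 2, 3, 2, ...
    return fields

print(pulldown_32(4))        # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
print(len(pulldown_32(24)))  # 60 fields for one second of 24 fps film
```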

18.
1080p
–
1080p is a set of HDTV high-definition video modes characterized by 1080 horizontal lines of vertical resolution; the p stands for progressive scan, i.e. non-interlaced. The term usually assumes a widescreen aspect ratio of 16:9. It is often marketed as Full HD, to contrast 1080p with 720p-resolution screens. 1080p video signals are supported by ATSC standards in the United States and DVB standards in Europe. Small camcorders, smartphones and digital cameras can capture still and moving images in 1080p resolution. Carrying uncompressed 1080p50/60 over serial digital interfaces requires raising the data rate from 1.485 Gbit/s to nominally 3 Gbit/s, and more still when using uncompressed RGB encoding (a back-of-the-envelope check follows below). Most current revisions of SMPTE 372M, SMPTE 424M and EBU Tech 3299 require the YCbCr color space and 4:2:2 chroma subsampling for transmitting 1080p50 and 1080p60 signals.

In the United States, the original ATSC standards for HDTV supported 1080p video; in July 2008, the ATSC standards were amended to include H.264/MPEG-4 AVC compression and 1080p at 50, 59.94 and 60 frames per second. Such frame rates require H.264/AVC High Profile Level 4.2. In Europe, 1080p25 signals have been supported by the DVB suite of broadcasting standards. The 1080p50 format is considered to be a production format and, eventually, a future broadcasting format. The EBU requires that legacy MPEG-4 AVC decoders should avoid crashing in the presence of SVC and/or 1080p50 packets. The ITU-R BT.2100 standard, which includes advanced 1080p video, was subsequently published in July 2016.

There is no word on when any of the networks will consider airing at 1080p in the foreseeable future. However, satellite services utilize the 1080p/24-30 format with MPEG-4 AVC/H.264 encoding for pay-per-view movies that are downloaded in advance via satellite or on-demand via broadband; at this time, no pay service (such as USA or HDNet) nor premium movie channel such as HBO broadcasts in 1080p. For material that originates from a progressively scanned 24 frame/s source, MPEG-2 lets the video be coded as 1080p24, and these progressively coded frames are tagged with metadata instructing a decoder how to perform a 3:2 pulldown to interlace them. In June 2016, Germany commenced terrestrial broadcasts of eight 1080p50 high-definition channels, using the DVB-T2 protocol with HEVC encoding; a total of 40 channels will be available by March 2017.

Blu-ray Discs are able to hold 1080p HD content, and most movies released on Blu-ray Disc produce a full 1080p HD picture when the player is connected to a 1080p HDTV via an HDMI cable. The Blu-ray Disc video specification allows encoding of 1080p23.976, 1080p24 and 1080i50, among other modes. Generally this type of video runs at 30 to 40 megabits per second, compared to the 3.5 megabits per second of conventional standard-definition broadcasts. Smartphones with 1080p Full HD displays have been available on the market since 2012; as of the end of 2014, it is the standard for mid-range to high-end smartphones, and many of the flagship devices of 2014 used even higher resolutions. Several websites, including YouTube, allow videos to be uploaded in the 1080p format; YouTube streams 1080p content at approximately 4 megabits per second, compared to Blu-ray's 30 to 40 megabits per second.
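As a rough sanity check on the "nominally 3 Gbit/s" figure above, the snippet below computes the raw bit rate of an uncompressed 8-bit RGB 1080p60 stream; blanking intervals and protocol overhead are ignored, so this is only an approximation.

```python
# Raw bandwidth of uncompressed 1080p60 with 8-bit RGB (3 samples/pixel):
# roughly 3 gigabits per second before any blanking or protocol overhead.

width, height, fps = 1920, 1080, 60
bits_per_pixel = 3 * 8            # RGB, 8 bits per channel

raw_bps = width * height * fps * bits_per_pixel
print(f"{raw_bps / 1e9:.3f} Gbit/s")   # ~2.986 Gbit/s
```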

19.
Stereophonic sound
–
Stereophonic sound or, more commonly, stereo, is a method of sound reproduction that creates an illusion of multi-directional audible perspective. The term applies to so-called quadraphonic and surround-sound systems as well as to the more common two-channel (2.0) systems. It is often contrasted with monophonic, or mono, sound, where audio is heard as coming from one position. In the 2000s, stereo sound is common in entertainment systems such as broadcast radio and TV, recorded music and the cinema. The word stereophonic derives from the Greek στερεός (firm, solid) + φωνή (sound, tone, voice), and it was coined in 1927 by Western Electric.

Stereo sound takes two basic forms. In the first, true or natural stereo, a live sound is captured by an array of microphones; the signal is then reproduced over multiple loudspeakers to recreate, as closely as possible, the live sound. The second is artificial or pan-pot stereo, in which a single-channel (mono) sound is reproduced over multiple loudspeakers. By varying the relative amplitude of the signal sent to each speaker, an artificial direction can be suggested; the control used to vary this relative amplitude is known as a pan-pot. By combining multiple pan-potted mono signals together, a complete, yet entirely artificial, sound field can be created (a minimal sketch follows below).

In technical usage, true stereo means sound recording and sound reproduction that uses stereographic projection to encode the relative positions of the objects and events recorded. During two-channel stereo recording, two microphones are placed in strategically chosen locations relative to the sound source, with both recording simultaneously. The two recorded channels will be similar, but each will have distinct time-of-arrival and sound-pressure-level information. During playback, the listener's brain uses those subtle differences in timing and sound level to triangulate the positions of the recorded objects. Stereo recordings often cannot be played on mono systems without a significant loss of fidelity; this phenomenon is known as phase cancellation.

An early two-channel telephonic process was commercialized in France from 1890 to 1932 as the Théâtrophone, and in England from 1895 to 1925 as the Electrophone; both were services available through coin-operated receivers at hotels and cafés. Modern stereophonic technology was invented in the 1930s by British engineer Alan Blumlein at EMI, who patented stereo records, stereo films, and also surround sound. In early 1931, Blumlein and his wife were at a local cinema, and Blumlein declared to his wife that he had found a way to make the sound follow the actor across the screen. The genesis of these ideas is uncertain, but he explained them to Isaac Shoenberg in the late summer of 1931. His earliest notes on the subject are dated 25 September 1931; the patent application was dated 14 December 1931 and was accepted on 14 June 1933 as UK patent number 394,325. The patent covered many ideas in stereo, some of which are still used today. Blumlein's discs used the two walls of the groove at right angles in order to carry the two channels. Much of the development work on this system for cinematic use did not reach completion until 1935; in Blumlein's short test films, his original intent of having the sound follow the actor was fully realised.
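A minimal sketch of pan-pot (amplitude) panning follows, using the common constant-power sine/cosine law. Real mixers may use other pan laws, so treat this as illustrative rather than definitive.

```python
# Pan-pot panning: a mono signal is split to two channels with
# complementary gains so the apparent source position moves between the
# speakers. The sin/cos "constant power" law keeps loudness roughly steady.

import math

def pan(mono_sample, position):
    """position: 0.0 = hard left, 0.5 = centre, 1.0 = hard right."""
    angle = position * math.pi / 2
    left = mono_sample * math.cos(angle)
    right = mono_sample * math.sin(angle)
    return left, right

print(pan(1.0, 0.0))   # (1.0, 0.0) -> fully left
print(pan(1.0, 0.5))   # (~0.707, ~0.707) -> centred, -3 dB per channel
```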

20.
USB
–
USB, the Universal Serial Bus, is an industry standard for cables, connectors and communications protocols used for connection, communication and power supply between computers and devices; it is currently developed by the USB Implementers Forum. USB was designed to standardize the connection of peripherals to personal computers, and it has become commonplace on other devices, such as smartphones and PDAs. USB has effectively replaced a variety of earlier interfaces, such as serial ports and parallel ports, as well as separate power chargers for portable devices.

There are five modes of USB data transfer; in order of increasing bandwidth they are Low Speed (1.5 Mbit/s), Full Speed (12 Mbit/s), High Speed (480 Mbit/s), SuperSpeed (5 Gbit/s) and SuperSpeed+ (10 Gbit/s); a comparison appears below. USB devices have some choice of implemented modes, and the USB version is not a reliable statement of implemented modes. Modes are identified by their names and icons, and the specification suggests that plugs and receptacles be colour-coded. Unlike other data buses, USB connections are directed, with both upstream and downstream ports emanating from a single host; this applies to power as well, with only downstream-facing ports providing power. Thus, USB cables have different ends, A and B, and therefore, in general, each different format requires four different connectors: a plug and receptacle for each of the A and B ends. USB cables have the plugs, and the corresponding receptacles are on the computers or electronic devices. In common practice, the A end is usually the standard format, and the B side varies over standard, mini, and micro. The mini and micro formats also provide for USB On-The-Go with a hermaphroditic AB receptacle. The micro format is the most durable from the point of view of designed insertion lifetime: the standard and mini connectors were designed for fewer insertion-removal cycles (on the order of 1,500 for the standard connector), while the micro format was designed for many more. Likewise, the leaf-spring components of the retention mechanism, the parts that provide the required gripping force, were moved into the plugs on the cable side.

A group of seven companies began the development of USB in 1994: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel. A team including Ajay Bhatt worked on the standard at Intel, and the first integrated circuits supporting USB were produced by Intel in 1995. The original USB 1.0 specification was introduced in January 1996, and Microsoft Windows 95 OSR 2.1 provided OEM support for USB devices. The first widely used version of USB was 1.1; the 12 Mbit/s data rate was intended for higher-speed devices such as disk drives, and the lower 1.5 Mbit/s rate for low-data-rate devices such as joysticks. Apple Inc.'s iMac was the first mainstream product with USB, and following Apple's design decision to remove all legacy ports from the iMac, many PC manufacturers began building legacy-free PCs, which led to the broader PC market using USB as a standard.

The USB 2.0 specification was released in April 2000 and was ratified by the USB Implementers Forum at the end of 2001. The USB 3.0 specification was published on 12 November 2008; its main goals were to increase the transfer rate, decrease power consumption and increase power output. USB 3.0 includes a new, higher-speed bus called SuperSpeed that operates in parallel with the USB 2.0 bus; for this reason, the new version is also called SuperSpeed.
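The snippet below tabulates the five transfer modes listed above and estimates the best-case time to move a 1 GB file at each nominal signalling rate; real throughput is lower because of protocol overhead, so treat the numbers as upper bounds.

```python
# Nominal signalling rates of the five USB transfer modes, and the
# theoretical minimum time to move a 1 GB file at each rate.

RATES_MBIT_S = {
    "Low Speed": 1.5,
    "Full Speed": 12,
    "High Speed": 480,
    "SuperSpeed": 5_000,
    "SuperSpeed+": 10_000,
}

FILE_BITS = 8e9  # 1 GB expressed in bits

for mode, mbit in RATES_MBIT_S.items():
    seconds = FILE_BITS / (mbit * 1e6)
    print(f"{mode:>12}: {seconds:10.1f} s")
```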

21.
Wi-Fi
–
Wi-Fi or WiFi is a technology for wireless local area networking with devices based on the IEEE 802.11 standards. Wi-Fi is a trademark of the Wi-Fi Alliance, which restricts the use of the term Wi-Fi Certified to products that successfully complete interoperability certification testing. Devices that can use Wi-Fi technology include personal computers, video-game consoles, smartphones, digital cameras, tablet computers and digital audio players. Wi-Fi compatible devices can connect to the Internet via a WLAN and a wireless access point. Such an access point has a range of about 20 meters indoors; hotspot coverage can be as small as a single room with walls that block radio waves, or as large as many square kilometres, achieved by using multiple overlapping access points. Wi-Fi most commonly uses the 2.4 gigahertz UHF and 5 gigahertz SHF ISM radio bands. Having no physical connections, it is more vulnerable to attack than wired connections such as Ethernet.

In 1971, ALOHAnet connected the Hawaiian Islands with a UHF wireless packet network; ALOHAnet and the ALOHA protocol were early forerunners to Ethernet and, later, the IEEE 802.11 protocols, respectively. A 1985 ruling by the U.S. Federal Communications Commission released the ISM band for unlicensed use; these frequency bands are the same ones used by equipment such as microwave ovens and are subject to interference. In 1991, NCR Corporation with AT&T Corporation invented the precursor to 802.11; the first wireless products were sold under the name WaveLAN, and they are credited with inventing Wi-Fi. In 1992 and 1996, CSIRO obtained patents for a method later used in Wi-Fi to "unsmear" the signal. The first version of the 802.11 protocol was released in 1997; this was updated in 1999 with 802.11b to permit 11 Mbit/s link speeds, which proved to be popular. In 1999, the Wi-Fi Alliance was formed as a trade association to hold the Wi-Fi trademark under which most products are sold.

Wi-Fi uses a number of patents held by many different organizations. In April 2009, 14 technology companies agreed to pay CSIRO $1 billion for infringements of CSIRO patents; this led to Australia labeling Wi-Fi as an Australian invention, though this has been the subject of some controversy. In 2016, the wireless local area network Test Bed was chosen as Australia's contribution to the exhibition A History of the World in 100 Objects, held in the National Museum of Australia. The name Wi-Fi, commercially used at least as early as August 1999, was coined by the brand-consulting firm Interbrand; the Wi-Fi Alliance had hired Interbrand to create a name that was "a little catchier than IEEE 802.11b Direct Sequence". Phil Belanger, a member of the Wi-Fi Alliance who presided over the selection of the name, has stated that Interbrand invented Wi-Fi as a pun upon the word hi-fi. Interbrand also created the Wi-Fi logo; the yin-yang Wi-Fi logo indicates the certification of a product for interoperability.

22.
Bluetooth
–
Bluetooth is a wireless technology standard for exchanging data over short distances between fixed and mobile devices and for building personal area networks. Invented by telecom vendor Ericsson in 1994, it was conceived as a wireless alternative to RS-232 data cables. It can connect up to seven devices, overcoming problems that older technologies had when attempting to connect devices to each other. Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 30,000 member companies in the areas of telecommunication, computing and networking. The IEEE standardized Bluetooth as IEEE 802.15.1, but no longer maintains the standard. The Bluetooth SIG oversees development of the specification and manages the qualification program; a manufacturer must meet Bluetooth SIG standards to market a product as a Bluetooth device. A network of patents applies to the technology, which are licensed to individual qualifying devices.

The development of the short-link radio technology, later named Bluetooth, was initiated in 1989 by Nils Rydbeck, CTO at Ericsson Mobile in Lund, Sweden, and by Johan Ullman. The purpose was to develop wireless headsets, according to two inventions by Johan Ullman, SE 8902098-6, issued 1989-06-12, and SE 9202239, issued 1992-07-24. Nils Rydbeck tasked Tord Wingren with specifying and Jaap Haartsen and Sven Mattisson with developing; both were working for Ericsson in Lund. The specification is based on frequency-hopping spread spectrum technology. The name was proposed in 1997 by Jim Kardach, who had developed a system that would allow mobile phones to communicate with computers; at the time of this proposal he was reading Frans G. Bengtsson's historical novel The Long Ships, about Vikings and King Harald Bluetooth. The implication is that Bluetooth unites communication protocols much as the king united his lands. The Bluetooth logo is a bind rune merging the Younger Futhark runes Hagall (ᚼ) and Bjarkan (ᛒ), Harald's initials.

Bluetooth operates at frequencies between 2402 and 2480 MHz, or 2400 and 2483.5 MHz including guard bands 2 MHz wide at the bottom end and 3.5 MHz wide at the top. This is in the globally unlicensed Industrial, Scientific and Medical (ISM) 2.4 GHz short-range radio frequency band. Bluetooth uses a radio technology called frequency-hopping spread spectrum: it divides transmitted data into packets and transmits each packet on one of 79 designated Bluetooth channels, each with a bandwidth of 1 MHz (see the sketch below). It usually performs 800 hops per second, with Adaptive Frequency-Hopping enabled. Bluetooth Low Energy uses 2 MHz spacing, which accommodates 40 channels. Originally, Gaussian frequency-shift keying (GFSK) was the only modulation scheme available. Since the introduction of Bluetooth 2.0+EDR, π/4-DQPSK and 8DPSK modulation may also be used between compatible devices. Devices functioning with GFSK are said to be operating in basic rate (BR) mode, where an instantaneous data rate of 1 Mbit/s is possible; the term Enhanced Data Rate (EDR) is used to describe the π/4-DQPSK and 8DPSK schemes. The combination of these modes in Bluetooth radio technology is classified as a BR/EDR radio. Bluetooth is a packet-based protocol with a master-slave structure.
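The sketch below lists the 79 BR/EDR channels described above and picks hops at random purely for illustration; the real hop sequence is a pseudo-random function derived from the master device's clock and address, which is not reproduced here.

```python
# Bluetooth BR/EDR channel plan: 79 channels of 1 MHz starting at 2402 MHz.
# random.choice is only a stand-in for the real clock/address-derived
# hop-selection algorithm.

import random

CHANNELS_MHZ = [2402 + k for k in range(79)]   # 2402 ... 2480 MHz

def next_hop():
    return random.choice(CHANNELS_MHZ)

# Print a short burst of hop frequencies (the text above cites 800 hops
# per second with Adaptive Frequency-Hopping enabled).
print([next_hop() for _ in range(8)])
```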

23.
Ethernet
–
Ethernet /ˈiːθərnɛt/ is a family of computer networking technologies commonly used in local area networks, metropolitan area networks and wide area networks. It was commercially introduced in 1980 and first standardized in 1983 as IEEE 802.3. Over time, Ethernet has largely replaced competing wired LAN technologies such as token ring, FDDI and ARCNET. The original 10BASE5 Ethernet uses coaxial cable as a shared medium, while the newer Ethernet variants use twisted pair and fiber-optic links. Over the course of its history, Ethernet data transfer rates have increased from the original 2.94 megabits per second to the latest 100 gigabits per second. The Ethernet standards comprise several wiring and signaling variants of the OSI physical layer. Systems communicating over Ethernet divide a stream of data into shorter pieces called frames (a minimal sketch of the frame layout follows below); as per the OSI model, Ethernet provides services up to and including the data link layer. Since its commercial release, Ethernet has retained a good degree of backward compatibility. Features such as the 48-bit MAC address and the Ethernet frame format have influenced other networking protocols. The primary alternative for some uses of contemporary LANs is Wi-Fi, a wireless protocol standardized as IEEE 802.11.

Ethernet was developed at Xerox PARC between 1973 and 1974. It was inspired by ALOHAnet, which Robert Metcalfe had studied as part of his PhD dissertation. In 1975, Xerox filed a patent application listing Metcalfe, David Boggs, Chuck Thacker and Butler Lampson as inventors. In 1976, after the system was deployed at PARC, Metcalfe and Boggs published a seminal paper. Metcalfe left Xerox in June 1979 to form 3Com, and he convinced Digital Equipment Corporation, Intel, and Xerox to work together to promote Ethernet as a standard. The so-called DIX standard, for Digital/Intel/Xerox, specified 10 Mbit/s Ethernet with 48-bit destination and source addresses; it was published on September 30, 1980 as "The Ethernet, A Local Area Network. Data Link Layer and Physical Layer Specifications". Version 2 was published in November 1982 and defines what has become known as Ethernet II. Formal standardization efforts proceeded at the same time and resulted in the publication of IEEE 802.3 on June 23, 1983.

Ethernet initially competed with two largely proprietary systems, Token Ring and Token Bus; in the process, 3Com became a major company. 3Com shipped its first 10 Mbit/s Ethernet 3C100 NIC in March 1981; an Ethernet adapter card for the IBM PC was released in 1982, and by 1985, 3Com had sold 100,000. Parallel-port-based Ethernet adapters were produced for a time, with drivers for DOS. By the early 1990s, Ethernet had become so prevalent that it was a must-have feature for modern computers, and Ethernet ports began to appear on some PCs and most workstations. This process was sped up with the introduction of 10BASE-T and its relatively small modular connector. Since then, Ethernet technology has evolved to meet new bandwidth and market requirements. In addition to computers, Ethernet is now used to interconnect appliances and other personal devices.
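A minimal sketch of the Ethernet II frame header mentioned above follows: a 6-byte destination MAC, a 6-byte source MAC and a 2-byte EtherType, followed by the payload. It is built by hand purely for illustration; a real NIC would also append the 4-byte frame check sequence.

```python
# Hand-built Ethernet II frame: dst MAC (6) + src MAC (6) + EtherType (2)
# + payload. The trailing CRC (FCS) is normally added by the hardware.

import struct

def ethernet_frame(dst_mac: bytes, src_mac: bytes, ethertype: int, payload: bytes) -> bytes:
    assert len(dst_mac) == 6 and len(src_mac) == 6
    header = struct.pack("!6s6sH", dst_mac, src_mac, ethertype)
    return header + payload

frame = ethernet_frame(
    dst_mac=bytes.fromhex("ffffffffffff"),   # broadcast address
    src_mac=bytes.fromhex("02003a4b5c6d"),   # example locally administered MAC
    ethertype=0x0800,                        # IPv4
    payload=b"hello",
)
print(frame.hex())
```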

24.
Volt
–
The volt is the derived unit for electric potential, electric potential difference, and electromotive force. One volt is defined as the difference in potential between two points of a conducting wire when an electric current of one ampere dissipates one watt of power between those points. It is also equal to the potential difference between two parallel, infinite planes spaced 1 meter apart that create an electric field of 1 newton per coulomb, and to the potential difference between two points that will impart one joule of energy per coulomb of charge that passes through it. It can also be expressed as amperes times ohms, watts per ampere, or joules per coulomb (a worked check appears below). For the Josephson constant, KJ = 2e/h, the conventional value KJ-90 = 0.4835979 GHz/μV is used. This standard is typically realized using an array of several thousand or tens of thousands of junctions; empirically, several experiments have shown that the method is independent of device design, material, measurement setup, and so on.

In the water-flow analogy sometimes used to explain electric circuits by comparing them with water-filled pipes, voltage is likened to a difference in water pressure, while current corresponds to the amount of water flowing at that pressure through a pipe of given diameter. A resistor would be a reduced diameter somewhere in the piping. The relationship between voltage and current is defined by Ohm's law, which is analogous to the Hagen–Poiseuille equation, as both are linear models relating flux and potential in their respective systems. The voltage produced by each electrochemical cell in a battery is determined by the chemistry of that cell, and cells can be combined in series for multiples of that voltage. Mechanical generators can usually be constructed to any voltage in a range of feasibility. Cited examples include high-voltage electric power lines (110 kV and up) and lightning, whose voltage varies greatly.

Volta had determined that the most effective pair of metals to produce electricity was zinc and silver. In 1861, Latimer Clark and Sir Charles Bright coined the name volt for the unit of resistance. By 1873, the British Association for the Advancement of Science had defined the volt, ohm, and farad. In 1881, the International Electrical Congress, now the International Electrotechnical Commission, approved the volt as a unit; they made the volt equal to 10^8 cgs units of voltage, the cgs system at the time being the customary system of units in science. At that time, the volt was defined as the potential difference across a conductor when a current of one ampere dissipates one watt of power. The international volt was defined in 1893 as 1/1.434 of the emf of a Clark cell; this definition was abandoned in 1908 in favor of a definition based on the international ohm and international ampere, until the entire set of reproducible units was abandoned in 1948. Prior to the development of the Josephson junction voltage standard, the volt was maintained in laboratories using specially constructed batteries called standard cells.
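The short check below works through the equivalent unit relations quoted above: volts as watts per ampere, volts as joules per coulomb, and Ohm's law tying volts, amperes and ohms together.

```python
# One volt is one watt per ampere and also one joule per coulomb.

power_w = 1.0      # watts dissipated
current_a = 1.0    # amperes flowing
volts_from_power = power_w / current_a      # V = W / A

energy_j = 1.0     # joules delivered
charge_c = 1.0     # coulombs passed
volts_from_charge = energy_j / charge_c     # V = J / C

assert volts_from_power == volts_from_charge == 1.0

# Ohm's law: a 2 A current through a 3 ohm resistor drops 6 V.
print(2.0 * 3.0)   # V = I * R -> 6.0
```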

25.
Direct current
–
Direct current (DC) is a flow of electrical charge carriers that always takes place in the same direction. The current need not always have the same magnitude, but if it is to be defined as DC, the direction of flow must never reverse. This contrasts with alternating current (AC), which periodically varies the direction of flow. Sources of direct current include power supplies, electrochemical cells and batteries, and photovoltaic cells and panels. The intensity, or amplitude, of a direct current might fluctuate with time, and in some such cases the DC has an AC component superimposed on it; an example of this is the output of a photovoltaic cell that receives a modulated-light communications signal. A source of DC is sometimes called a DC generator. Batteries and various other sources of DC produce a constant voltage; this is called pure DC and can be represented by a straight, horizontal line on a graph of voltage against time. For pure DC the peak and effective values are the same, and the peak-to-peak value is zero because the instantaneous amplitude never changes (a numeric check appears below). In some instances the value of a DC voltage pulsates or oscillates rapidly with time, in a manner similar to the changes in an AC wave; the unfiltered output of a half-wave or full-wave rectifier is an example.

In 1820, Hans Christian Ørsted discovered that electrical current creates a magnetic field, a discovery that led scientists to relate magnetism to electric phenomena. In 1879, Thomas Edison invented the light bulb, improving a 50-year-old idea by using lower-current electricity, a vacuum inside the globe and a small carbonized filament. At that time, the idea of electric lighting was not new. Edison not only invented an incandescent electric light, but also an electric lighting system that contained all the necessary elements to make the incandescent light safe and economical. Prior to 1879, direct current electricity had been used in lighting for the outdoors, and it was in the 1880s that the modern electric utility industry began, evolving from street lighting systems and from gas lighting. Edison's first central generating station was located in Lower Manhattan, on Pearl Street. This station provided light and electricity to customers in a one-square-mile range and was called Thomas Edison's Pearl Street Electricity Generating Station. It introduced four key elements of an electric utility system, including efficient distribution, a competitive price and reliable central generation.
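A quick numeric check of the peak-versus-effective-value claim above, comparing a steady DC level with a sinusoidal AC waveform of the same peak:

```python
# For pure DC the peak and RMS (effective) values coincide; for a sine
# wave the RMS value is peak / sqrt(2).

import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

dc = [5.0] * 1000                                                    # steady 5 V DC
ac = [5.0 * math.sin(2 * math.pi * i / 1000) for i in range(1000)]   # 5 V peak AC

print(max(dc), rms(dc))   # 5.0 and 5.0 -> identical for pure DC
print(max(ac), rms(ac))   # ~5.0 and ~3.54 -> RMS = peak / sqrt(2)
```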

26.
Ampere
–
The ampere, often shortened to amp, is a unit of electric current. In the International System of Units (SI), the ampere is one of the seven SI base units. It is named after André-Marie Ampère, the French mathematician and physicist considered the father of electrodynamics. SI defines the ampere in terms of other base units by measuring the electromagnetic force between electrical conductors carrying electric current; an older approach instead defined the ampere as one coulomb of charge per second. In SI, the unit of charge, the coulomb, is defined as the charge carried by one ampere during one second. In the future, the SI definition may shift back to charge as the base unit.

Ampère's force law states that there is an attractive or repulsive force between two parallel wires carrying an electric current; this force is used in the formal definition of the ampere. The SI unit of charge, the coulomb, is the quantity of electricity carried in 1 second by a current of 1 ampere; conversely, a current of one ampere is one coulomb of charge going past a given point per second: 1 A = 1 C/s. In general, charge Q is determined by a steady current I flowing for a time t as Q = It (a worked example follows below). Constant, instantaneous and average current are expressed in amperes, and the charge accumulated or passed through a circuit over a period of time is expressed in coulombs. The relation of the ampere to the coulomb is the same as that of the watt to the joule.

The ampere was originally defined as one tenth of the unit of electric current in the centimetre–gram–second (cgs) system of units. That unit, now known as the abampere, was defined as the amount of current that generates a force of two dynes per centimetre of length between two wires one centimetre apart. The size of the unit was chosen so that the units derived from it in the MKSA system would be conveniently sized. The international ampere was an early realization of the ampere, defined as the current that would deposit 0.001118 grams of silver per second from a silver nitrate solution; later, more accurate measurements revealed that this current is 0.99985 A. At present, techniques to establish the realization of an ampere have a relative uncertainty of approximately a few parts in 10^7, and involve realizations of the watt, the ohm and the volt.

Rather than a definition in terms of the force between two current-carrying wires, it has been proposed that the ampere should be defined in terms of the rate of flow of elementary charges. Since a coulomb is approximately equal to 6.2415093×10^18 elementary charges, the proposed change would define 1 A as the current in the direction of flow of that number of elementary charges per second. In 2005, the International Committee for Weights and Measures agreed to study the proposed change; the new definition was discussed at the 25th General Conference on Weights and Measures in 2014 but for the time being was not adopted. The current drawn by typical constant-voltage energy distribution systems is usually dictated by the power consumed by the system, which is why example currents are usually grouped by voltage level.
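A worked example of the Q = It relation and the elementary-charge equivalence quoted above:

```python
# Charge transferred by a steady current: Q = I * t, and one coulomb
# corresponds to about 6.2415093e18 elementary charges.

ELEMENTARY_CHARGES_PER_COULOMB = 6.2415093e18

current_a = 2.0    # a steady 2 A current
time_s = 3.0       # flowing for 3 seconds

charge_c = current_a * time_s                            # Q = I * t = 6 C
electrons = charge_c * ELEMENTARY_CHARGES_PER_COULOMB

print(charge_c)            # 6.0 coulombs
print(f"{electrons:.3e}")  # ~3.745e+19 elementary charges
```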

27.
Coaxial power connector
–
A coaxial power connector is an electrical power connector used for attaching extra-low-voltage devices such as consumer electronics to external electricity. Also known as barrel connectors, concentric barrel connectors or tip connectors, barrel plug connectors are commonly used to interface the secondary side of a power supply with the device. Some of these connectors contain a normally closed switch; the switch can disconnect internal batteries whenever the external power supply is connected. The connector pairs for barrel connectors are defined in terms of plugs and receptacles; receptacles may be panel-mounted or circuit-board-mounted, and some in-line receptacles are also cable-mounted. Type N connectors and all IEC 60320 appliance coupler plugs are examples of this.

There is no firm convention as to which component is male and which female, and as a result there are varying opinions in this regard. Many industrial suppliers avoid gender terminology, but many do not. Similarly, some people view the corded plug as female and some perceive it as male; some, after consideration and surveys, found that user perception of which component is male varies. Power is generally supplied by a plug to a receptacle. Cables are available with one in-line receptacle fanning out to a number of plugs, and cables with a plug at each end are available for some uses, although cables or adapters with two receptacles are not widely available.

On the female plug, the outer body is metallic and cylindrical in shape, and the second, inside contact is a metallic cylinder constructed to accept insertion of the pin of the corresponding male connector. The inner and outer barrels are separated by an insulating layer. The outer contact is generally called the barrel, sleeve or ring, and the inner contact is called the tip. There is typically a single spring-loaded contact at the side of the male connector. There are many different sizes of coaxial power connectors. Contact ratings commonly vary from unspecified up to 5 amperes; voltage is often unspecified, but may be up to 48 V, with 12 V typical. The smaller types usually have lower current and voltage ratings, and it is quite possible that new sizes will continue to appear and disappear. The sizes and shapes of connectors do not consistently correspond to the power specifications across manufacturers: two connectors from different manufacturers with different sizes could potentially be attached to power supplies with the same voltage, while connectors of the same size can be part of power supplies with different voltages and currents. Use of the wrong power supply may cause severe equipment damage.

28.
Kickstarter
–
Kickstarter is an American public-benefit corporation based in Brooklyn, New York, that maintains a global crowdfunding platform focused on creativity. The company's stated mission is to "bring creative projects to life." People who back Kickstarter projects are offered tangible rewards and/or experiences in exchange for their pledges; this model traces its roots to the subscription model of arts patronage, where artists would go directly to their audiences to fund their work.

Kickstarter launched on April 28, 2009, founded by Perry Chen, Yancey Strickler and Charles Adler. The New York Times called Kickstarter "the people's NEA," and Time named it one of the Best Inventions of 2010 and Best Websites of 2011. Kickstarter reportedly raised $10 million in funding from backers including NYC-based venture firm Union Square Ventures and angel investors such as Jack Dorsey, Zach Klein and Caterina Fake. The company is based in the Greenpoint section of Brooklyn. Andy Baio served as the site's CTO until November 2010, when he joined Expert Labs, and Lance Ivy has been Lead Developer since the website launched. On February 14, 2013, Kickstarter released an iOS app called Kickstarter for the iPhone; the app is aimed at users who create and back projects and marked the first time Kickstarter had an official mobile presence. The platform later expanded internationally, launching in Spain on May 19, 2015.

Kickstarter is one of a number of crowdfunding platforms for gathering money from the public. Project creators choose a deadline and a minimum funding goal; if the goal is not met by the deadline, no funds are collected. Kickstarter applies a 5% fee on the total amount of funds raised, and its payments processor applies an additional 3–5% fee (a worked example follows below). Unlike many forums for fundraising or investment, Kickstarter claims no ownership over the projects and the work they produce. The web pages of projects launched on the site are permanently archived; after funding is completed, projects and uploaded media cannot be edited or removed from the site. Kickstarter advises backers to use their own judgment when supporting a project, and it also warns project leaders that they could be liable for legal damages from backers for failure to deliver on promises. Projects might also fail even after a successful fundraising campaign, when creators underestimate the total costs required or the technical difficulties to be overcome.

Asked what made Kickstarter different from other crowdfunding platforms, co-founder Perry Chen questioned whether there is even "an agreed upon definition of what it is," saying: "We haven't actively supported the use of the term because it can provoke more confusion. In our case, we focus on a middle ground between patronage and commerce. People are offering cool stuff and experiences in exchange for the support of their ideas. People are creating these mini-economies around their project ideas. So, you aren't coming to the site to get something for nothing." On June 21, 2012, Kickstarter began publishing statistics on its projects.
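To make the fee structure concrete, the sketch below computes a creator's payout under the 5% platform fee and the 3–5% payment-processing fee; the pledge total is an arbitrary example, and real processing fees vary per pledge.

```python
# Creator payout after Kickstarter's 5% fee and the payment processor's
# additional 3-5% fee, both applied to the total raised.

def creator_payout(pledged, processing_rate):
    kickstarter_fee = 0.05 * pledged
    processing_fee = processing_rate * pledged
    return pledged - kickstarter_fee - processing_fee

raised = 50_000  # example pledge total

print(creator_payout(raised, 0.03))  # 46000.0 with a 3% processing fee
print(creator_payout(raised, 0.05))  # 45000.0 with a 5% processing fee
```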

29.
Computing platform
–
A computing platform is, in the most general sense, whatever environment a piece of software is executed in. It may be the hardware or the operating system, or even a web browser or other application. The term can refer to different abstraction levels, including a hardware architecture or an operating system; in total, it can be said to be the stage on which programs can run. For example, an OS may be a platform that abstracts the underlying differences in hardware (a small sketch of querying these layers follows below).

Platforms may also include: hardware alone, in the case of small embedded systems, which can access hardware directly without an OS (referred to as running on "bare metal"); a browser, in the case of web-based software (the browser itself runs on a platform, but this is not relevant to software running within the browser); an application, such as a spreadsheet or word processor, which hosts software written in a scripting language (this can be extended to writing fully-fledged applications with the Microsoft Office suite as a platform); software frameworks that provide ready-made functionality; cloud computing and Platform as a Service (the social networking sites Twitter and Facebook are also considered development platforms); a virtual machine such as the Java virtual machine, where applications are compiled into a format similar to machine code, known as bytecode, which is then executed by the VM; and a virtualized version of a system, including virtualized hardware, OS and software, which allows, for instance, a typical Windows program to run on what is physically a Mac.

Some architectures have multiple layers, with each layer acting as a platform for the one above it. In general, a component only has to be adapted to the layer immediately beneath it; however, the JVM, the layer beneath the application, does have to be built separately for each OS.
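As a small illustration of these layers from a running program's point of view, Python's standard library can report the runtime, OS and hardware layers beneath it:

```python
# Each print statement queries a different platform layer beneath this
# program: the language runtime, the operating system, and the hardware
# architecture.

import platform
import sys

print(platform.python_implementation(), platform.python_version())  # runtime layer
print(platform.system(), platform.release())                        # OS layer
print(platform.machine())                                           # hardware architecture
print(sys.platform)                                                 # coarse OS identifier
```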

30.
Twitch.tv
–
Twitch is a live streaming video platform owned by Twitch Interactive, a subsidiary of Amazon.com. Content on the site can either be viewed live or via video on demand. Twitch grew out of Justin.tv; when the gaming-focused service was spun off, Justin.tv's parent company was re-branded as Twitch Interactive to represent the shift in focus, and Justin.tv was shut down in August 2014. The site has also branched out into music-related streams and content. In 2015, Twitch announced it had more than 1.5 million broadcasters and 100 million visitors per month.

Justin.tv was launched in 2007 by Justin Kan and Emmett Shear. Its gaming category grew especially fast and became the most popular content on the site, and in June 2011 the company decided to spin off the gaming content as Twitch.tv, which launched officially in public beta on June 6, 2011. Since then, Twitch has attracted more than 35 million unique visitors a month. Twitch had about 80 employees in June 2013, which increased to 100 by December 2013; the company was headquartered in San Francisco's Financial District. Twitch has been supported by significant investments of venture capital, including US$15 million raised in 2012. Investors during three rounds of fundraising leading up to the end of 2013 included Draper Associates and Bessemer Venture Partners. In addition to the influx of venture funding, it was believed in 2013 that the company had become profitable. Competing video services, such as YouTube and Dailymotion, began to increase the prominence of their gaming content to compete.

As of mid-2013, there were over 43 million viewers on Twitch monthly, with the average viewer watching over an hour a day. As of February 2014, Twitch was the fourth-largest source of Internet traffic during peak times in the United States, behind Netflix, Google and Apple; Twitch makes up 1.8% of total US Internet traffic during peak periods. On March 24, 2015, Twitch was reportedly hacked and users' details compromised. Users' accounts were reset, and it does not appear that any credit card or other financial information was exposed; however, passwords do appear to have been leaked, and the company recommended that users reset their details on any site where they use the same password. Twitch has also highlighted broadcasts that merge a video game, live video and a participatory experience, calling one such stream "a wonderful proof of concept that we hope to see more of in the future." Beginning with its 2014 edition, Twitch was made the official live streaming platform of the Electronic Entertainment Expo. On May 18, 2014, Variety first reported that Google had reached a deal to acquire Twitch through its YouTube subsidiary for approximately US$1 billion. On August 5, 2014, the original Justin.tv site was shut down.

31.
Kodi (software)
–
Kodi is a free and open-source media player software application developed by the XBMC Foundation, a non-profit technology consortium. Kodi is available for multiple operating systems and hardware platforms, with a software 10-foot user interface for use with televisions. It allows users to play and view most streaming media, such as videos, music and podcasts, and it is a multi-platform home-theater PC application. The later versions also have a personal video recorder graphical front end for receiving live television with an electronic program guide. Derivative applications such as MediaPortal and Plex have been spun off from XBMC or Kodi, as have "just enough" operating systems like OpenELEC and LibreELEC. Kodi has also been shipped on third-party set-top boxes bundled with add-ons that facilitate copyright infringement; these devices, and the add-ons that facilitate this infringement, are not affiliated with the Kodi project, and the XBMC Foundation has not endorsed any of these products and has threatened legal action against those using its trademarks to promote them.

Kodi supports most common audio, video and image formats, plus playlists, audio visualizations, slideshows and weather forecast reporting. Kodi also functions as a game launcher on any operating system. The ending of Xbox support by the project was also the reason it was renamed XBMC from the old Xbox Media Center name; the Xbox version of XBMC had the ability to launch console games. The XBMC for Xbox version was never distributed, endorsed, or supported by Microsoft. Kodi has greater basic hardware requirements than traditional 2D-style software applications: it needs a 3D-capable graphics hardware controller for all rendering. By taking advantage of hardware-accelerated video decoding, Kodi can play back most videos on many inexpensive, low-performance systems. The latest version of XBMC supports over 74 languages.

Kodi developers encourage users to make and submit their own add-ons to expand media content (a minimal add-on sketch follows below). Many of these online content sources are over-the-top high-definition services and use video streaming sites as sources for the media content that is offered; not all content sources on add-ons are available in every country. Kodi features an integrated Python script interpreter for add-on extensions, and a WindowXML application framework, in a similar fashion to Apple macOS Dashboard Widgets and Microsoft Gadgets. Python widget scripts allow normal users to add new functionality to Kodi themselves. Kodi also includes metadata "scrapers", which are used as importers to obtain detailed information from various Internet resources about movies and television shows; they can get synopses, reviews, movie posters, titles, genre classifications and fanart. The GUI then provides a rich display for the audio and video files that the scrapers have identified.

Kodi's interface is skinnable. Skins have included Touch, introduced with XBMC version 11.0 and designed for small devices with touchscreen displays, while Project Mayhem had been the default before XBMC version 9.1. Users can also create their own skin and share it with others via public websites that are used for Kodi skin trading and development. In addition to skins and themes, users can create a package called a "build"; within this package, homebrew developers are able to distribute a skin together with a selection of add-ons and settings. The delivery mechanism used within the Kodi scene is called a "wizard", with the Replicant Wizard being the most prominent.
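As an illustration of the Python add-on mechanism described above, here is a minimal sketch. It assumes Kodi's bundled xbmcaddon and xbmcgui modules, which are available only inside Kodi's embedded interpreter (not as standalone packages), and omits the addon.xml manifest an add-on also requires.

```python
# Minimal Kodi add-on sketch: runs only inside Kodi, which provides the
# xbmcaddon and xbmcgui modules; the required addon.xml manifest is not
# shown here.

import xbmcaddon
import xbmcgui

addon = xbmcaddon.Addon()                 # handle to this add-on's metadata
name = addon.getAddonInfo("name")         # reads the name from addon.xml

# Pop up a simple dialog; real add-ons would list media, resolve streams, etc.
xbmcgui.Dialog().ok(name, "Hello from a minimal Kodi add-on!")
```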

32.
Android Jelly Bean
–
Jelly Bean is the name given to three releases of the Android operating system: versions 4.1, 4.2 and 4.3. The first of these three, 4.1, focused on performance improvements; these changes allow the system to run at a full 60 frames per second on capable hardware. Alongside Android 4.1, Google also began to decouple APIs for its services on Android into a new system-level component known as Google Play Services, serviced through the Google Play Store. Attendees of the Google I/O conference were given Nexus 7 tablets pre-loaded with Android 4.1, and Google announced an intent to release 4.1 updates for existing Nexus devices and the Motorola Xoom tablet by mid-July. The Android 4.1 upgrade was released to the public for GSM Galaxy Nexus models on July 10, 2012. On October 29, 2012, Google unveiled Android 4.2, dubbed "a sweeter tasting Jelly Bean"; alongside its accompanying launch devices, firmware updates for the Nexus 7 and Galaxy Nexus were released in November 2012. Android 4.3 was subsequently released on July 24, 2013 via firmware updates to the Galaxy Nexus, 2012 Nexus 7 and Nexus 4, and a minor update, 4.3.1, was released in October 2013 for the new Nexus 7 to address device-specific issues.

Visually, Jelly Bean's interface reflects a refinement of the "Holo" appearance introduced by Android 4.0. The default home screen of Jelly Bean received new features, such as the ability for other shortcuts and widgets on a home page to re-arrange themselves to fit an item being moved or resized, and notifications can be disabled individually per app. Android 4.2 added additional features to the user interface: the lock screen can be swiped to the left to display widget pages, and swiped to the right to go to the camera. The previous Browser application was officially deprecated on 4.2 in favor of Google Chrome for Android. 4.2 also added gesture typing on the keyboard, a redesigned clock app, and a new screensaver system known as Daydream. On tablets, Android 4.2 also supports multiple users. Jelly Bean also reworked the tablet interface; these changes took effect for small tablets on 4.1, and for larger tablets on 4.2. Small tablets on Android are optimized primarily for use in a portrait orientation; when used in a landscape orientation, apps adjust themselves into the widescreen-oriented layouts seen on larger tablets. On large tablets, navigation buttons were placed in the bottom-left of a bar along the bottom of the screen, with the clock and status area in the bottom-right.

Android Beam can now also be used to initiate Bluetooth file transfers through near-field communication. Android 4.2 added a rewritten Bluetooth stack, changing from the previous BlueZ stack to a rewritten Broadcom open-source stack called BlueDroid, and a new NFC stack was added at the same time. Android 4.3 also included a hidden privacy feature known as "App ops", which allowed users to individually deny permissions to apps.

33.
Santa Monica, California
–
Santa Monica is a beachfront city in western Los Angeles County, California, United States. The Census Bureau population for Santa Monica in 2010 was 89,736. Due in part to an agreeable climate, Santa Monica had become a famed resort town by the early 20th century. The city has experienced a boom since the late 1980s through the revitalization of its downtown core and significant job growth, and the Santa Monica Pier remains a popular and iconic destination.

Santa Monica was long inhabited by the Tongva people, who called the area Kecheek in the Tongva language. The first non-indigenous group to set foot in the area was the party of explorer Gaspar de Portolà, who camped near the present-day intersection of Barrington and Ohio Avenues on August 3, 1769. The city is named after the Christian saint Monica, and there are two different accounts of how the name came to be: one says it was named in honor of the feast day of Saint Monica, while another says it was named by Juan Crespí on account of a pair of springs, the Kuruvungna Springs, that were reminiscent of the tears Saint Monica shed over her son's early impiety. Several battles were fought by the Californios in the Los Angeles area; following the Mexican–American War, Mexico signed the Treaty of Guadalupe Hidalgo, which gave Mexicans and Californios living in the state certain unalienable rights. US government sovereignty in California began on February 2, 1848.

In the 1870s, the Los Angeles and Independence Railroad connected Santa Monica with Los Angeles, with a wharf out into the bay. The first town hall was a modest 1873 brick building, later a beer hall; it is Santa Monica's oldest extant structure. By 1885, the town's first hotel, the Santa Monica Hotel, had opened. Around the start of the 20th century, a growing population of Asian Americans lived in and around Santa Monica and Venice: a Japanese fishing village was near the Long Wharf, while small numbers of Chinese lived or worked in Santa Monica. The two ethnic minorities were often viewed differently by White Americans, who were often well-disposed towards the Japanese but condescending towards the Chinese; the Japanese village fishermen were an economic part of the Santa Monica Bay community.

Donald Wills Douglas, Sr. built a plant in 1922 at Clover Field for the Douglas Aircraft Company. In 1924, four Douglas-built planes took off from Clover Field to attempt the first aerial circumnavigation of the world; two planes returned after covering 27,553 miles in 175 days. The Douglas Company kept facilities in the city until the 1960s. The Great Depression hit Santa Monica deeply; one report gives citywide employment in 1933 of just 1,000, and hotel and office building owners went bankrupt. In the 1930s, corruption infected Santa Monica. The federal Works Progress Administration helped build several buildings, most notably City Hall; the main Post Office and Barnum Hall were also among other WPA projects.

34.
Alibaba Group
–
Alibaba Group Holding Limited is a Chinese e-commerce company that provides consumer-to-consumer, business-to-consumer and business-to-business sales services via web portals; it also provides electronic payment services, a shopping search engine and cloud computing services. The group began in 1999 when Jack Ma founded the website Alibaba.com. In 2012, two of Alibaba's portals handled 1.1 trillion yuan in sales. Suppliers from other countries are supported, but the company mainly operates in the People's Republic of China. The stock traded down after its 19 September 2014 initial public offering, and the company's market cap was about $212 billion at the end of December 2015. It is the world's largest retailer as of April 2016, surpassing Walmart, with operations in over 190 countries, as well as one of the largest Internet companies. Alibaba has been the most dominant retailer in the world, generating more revenue than Amazon.com, and its online sales and profits surpassed all US retailers combined in 2015. It has been expanding into the media and entertainment industry, with revenues rising by triple-digit percentages year on year.

In September 2013, the company sought an IPO in the United States after a deal could not be reached with Hong Kong regulators; planning occurred over 12 months before the market debut in September 2014. The pricing of the IPO initially raised US$21.8 billion. Buyers were actually purchasing shares in a Cayman Islands shell corporation, not in the Alibaba group itself, as China forbids foreign ownership of its companies. Alibaba's consumer-to-consumer portal Taobao, similar to eBay.com, features nearly a billion products and is one of the 20 most-visited websites globally. The Group's websites accounted for over 60% of the parcels delivered in China by March 2013, and Alipay, an online payment escrow service, accounts for roughly half of all online payment transactions within China. As of February 2017, the company has been engaged in dealing with counterfeiting issues.

The name came from the character Ali Baba from the Arabian literature One Thousand and One Nights, because of its universal appeal. As Ma explained: "One day I was in Malaysia in a coffee shop, and then a waitress came, and I said, Do you know about Alibaba? I said, What do you know about?, and I said, Yes, this is the name. Then I went on to the street and found 30 people and asked them. People from India, people from Germany, people from Tokyo and China … they all knew about Alibaba. Alibaba is a kind, smart business person, and he helped the village. So … easy to spell, and globally known. Alibaba opens sesame for small- to medium-sized companies. We also registered the name Alimama, in case someone wants to marry us."

35.
Android TV
–
Android TV is a smart TV platform developed by Google. Based on the Android operating system, it creates an interactive television experience through a 10-foot user interface. It was initially announced on June 25, 2014, at Google I/O 2014, as a successor to Google's earlier attempt at a smart TV platform, Google TV. Android TV can be built both into TVs and into stand-alone digital media players. Users have access to the Google Play Store to download Android apps, including media streaming services Netflix and Hulu, as well as games. The platform emphasizes voice search to find content or to answer queries. The TV interface is divided vertically into three sections: recommendations on top, media apps in the middle, and games on the bottom. The interface can be navigated using a game controller, a remote control, or the Android TV mobile app. Android TV also supports Google Cast, the technology behind Google's media player Chromecast, which allows a mobile device to be used to select and control media playback. Google has partnered with Sony, Sharp, and Philips to offer the platform in TVs.

Google and Asus co-developed the first device to employ Android TV, the Nexus Player, while Razer released a media player with a focus on gaming. Android TV allows consumers to use an HDTV set to listen to music and watch video originating from Internet services or a local network. Android TV can be paired with Bluetooth gaming controllers to interact with the system interface and applications, and it also includes all the features and streaming capabilities of the Chromecast device.

Version 2 of the Shield Android TV was announced at CES 2017, with Google Assistant coming in a future update. The Shield branding was changed because NVIDIA did not want to appear to be competing with eighth-generation consoles. Unlike the Nexus Player and the Forge TV, the Shield Android TV has a higher price point of US$200. A primary selling point of the device is the Tegra X1 chipset, which is far more powerful than that of any previous Android TV device. The set-top box also has 3 GB of RAM, 16 GB of internal storage, USB 3.0 ports and gigabit Ethernet, and the device ships with a Wi-Fi Direct NVIDIA-branded game controller. Other features include integration with NVIDIA GameStream and GeForce NOW; as with previous NVIDIA Shield-branded devices, a small selection of NVIDIA-exclusive Android-ported AAA video games is optimised for the Tegra X1 chipset.

The ADT-1 Developer Kit was released by Google before any commercial Android TV devices; the hardware was given to some Google I/O 2014 attendees and later mailed to other developers. The device uses a Tegra 4 chipset and has 16 GB of flash memory. The Google Nexus Player was the first consumer Android TV device, released first in the US on November 3, 2014; it supports 1080p, but not 4K. The Freebox Player Mini, offered by French ISP Free, is a 4K-capable Android TV set-top box, and the Forge TV, by Razer, was announced at CES on January 6, 2015.

36.
Television
–
Television or TV is a telecommunication medium used for transmitting moving images, in monochrome or in color, in two or three dimensions, and with sound. The term can refer to a television set or to a television program. Television is a mass medium for entertainment, education, news, politics and gossip. Television became available in experimental forms in the late 1920s. After World War II, an improved form of black-and-white TV broadcasting became popular in the United States and Britain, and television sets became commonplace in homes and businesses. During the 1950s, television was the primary medium for influencing public opinion. In the mid-1960s, color broadcasting was introduced in the US. For many reasons, the storage of television and video programming now also occurs on the cloud.

At the end of the first decade of the 2000s, digital television transmissions greatly increased in popularity. Another development was the move from standard-definition television to high-definition television, which provides a resolution that is substantially higher; HDTV may be transmitted in various formats, including 1080p and 1080i. In 2013, 79% of the world's households owned a television set. Most TV sets sold in the 2000s were flat-panel, mainly LEDs; major manufacturers announced the discontinuation of CRT, DLP, plasma, and even fluorescent-backlit LCDs by the mid-2010s. In the near future, LEDs are gradually expected to be replaced by OLEDs. Also, major manufacturers have announced that they will increasingly produce smart TVs in the mid-2010s; smart TVs with integrated Internet and Web 2.0 functions became the dominant form of television by the late 2010s.

Television signals were initially distributed only as terrestrial television, using high-powered radio-frequency transmitters to broadcast the signal to individual television receivers. Alternatively, television signals are distributed by cable or optical fiber and by satellite systems. Until the early 2000s, these were transmitted as analog signals. A standard television set is composed of multiple internal electronic circuits, including a tuner for receiving and decoding broadcast signals; a visual display device which lacks a tuner is correctly called a video monitor rather than a television.

The word television comes from Ancient Greek τῆλε, meaning "far", and Latin visio, meaning "sight". The Anglicised version of the term is first attested in 1907; the word was either formed in English or borrowed from French télévision. In the 19th century and early 20th century, other proposals for the name of a technology for sending pictures over distance included telephote. The abbreviation TV is from 1948, while the use of the term to mean a television set dates from 1941.

37.
ARM architecture
–
ARM, originally Acorn RISC Machine, later Advanced RISC Machine, is a family of reduced instruction set computing (RISC) architectures for computer processors, configured for various environments. ARM Holdings also designs cores that implement this instruction set and licenses these designs to a number of companies that incorporate those core designs into their own products. A RISC-based computer design approach means processors require fewer transistors than the typical complex instruction set computing (CISC) x86 processors in most personal computers. This approach reduces costs, heat and power use; these characteristics are desirable for light, portable, battery-powered devices, including smartphones, laptops and tablet computers, and other embedded systems. For supercomputers, which consume large amounts of electricity, ARM could also be a power-efficient solution. ARM Holdings periodically releases updates to its architectures and core designs, and some older cores can also provide hardware execution of Java bytecodes. The ARMv8-A architecture, announced in October 2011, adds support for a 64-bit address space. With over 100 billion ARM processors produced as of 2017, ARM is the most widely used instruction set architecture in terms of quantity produced; current offerings include the widely used Cortex cores and older classic cores.

The British computer manufacturer Acorn Computers first developed the Acorn RISC Machine architecture in the 1980s to use in its personal computers, and its first ARM-based products were coprocessor modules for the BBC Micro series of computers. According to Sophie Wilson, all the tested processors at that time performed about the same, with about a 4 Mbit/second bandwidth. After testing all available processors and finding them lacking, Acorn decided it needed a new architecture. Inspired by white papers on the Berkeley RISC project, Acorn considered designing its own processor. Wilson developed the instruction set, writing a simulation of the processor in BBC BASIC that ran on a BBC Micro with a 6502 second processor. This convinced Acorn engineers they were on the right track. Wilson approached Acorn's CEO, Hermann Hauser, and requested more resources; Hauser gave his approval and assembled a team to implement Wilson's model in hardware.

The official Acorn RISC Machine project started in October 1983. Acorn chose VLSI Technology as the silicon partner, as they were a source of ROMs and custom chips for Acorn. Wilson and Furber led the design, implementing it with a similar efficiency ethos as the 6502. A key design goal was achieving low-latency input/output handling like the 6502; the 6502's memory access architecture had let developers produce fast machines without costly direct memory access hardware. The first samples of ARM silicon worked properly when first received and tested on 26 April 1985. Wilson subsequently rewrote BBC BASIC in ARM assembly language; the in-depth knowledge gained from designing the instruction set enabled the code to be very dense. The original aim of a principally ARM-based computer was achieved in 1987 with the release of the Acorn Archimedes, and in 1992 Acorn once more won the Queen's Award for Technology for the ARM. The ARM2 featured a 32-bit data bus, a 26-bit address space and 27 32-bit registers.

38.
Graphics processing unit
–
GPUs are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. In a personal computer, a GPU can be present on a video card or embedded on the motherboard. The term GPU was popularized by Nvidia in 1999, which marketed the GeForce 256 as "the world's first GPU". It was presented as a single-chip processor with integrated transform, lighting, and triangle setup/clipping engines. Rival ATI Technologies coined the term "visual processing unit" (VPU) with the release of the Radeon 9700 in 2002.

Arcade system boards have been using specialized graphics chips since the 1970s. In early video game hardware, the RAM for frame buffers was expensive, so video chips composited data together as the display was being scanned out on the monitor. Fujitsu's MB14241 video shifter was used to accelerate the drawing of sprite graphics for various 1970s arcade games from Taito and Midway, such as Gun Fight and Sea Wolf. The Namco Galaxian arcade system in 1979 used specialized graphics hardware supporting RGB color, multi-colored sprites, and tilemap backgrounds. The Galaxian hardware was widely used during the golden age of arcade video games by game companies such as Namco, Centuri, Gremlin, Irem, Konami, Midway, Nichibutsu, and Sega.

In the home market, the Atari 2600 in 1977 used a video shifter called the Television Interface Adaptor. The Atari 8-bit computers included ANTIC, a video processor that interpreted a "display list" describing each scan line; 6502 machine code subroutines could be triggered on scan lines by setting a bit on a display list instruction. ANTIC also supported smooth vertical and horizontal scrolling independent of the CPU, and it became one of the best known of what were called graphics processing units in the 1980s. The Williams Electronics arcade games Robotron: 2084, Joust, and Sinistar, all released in 1982, contained custom blitter chips. In 1985, the Commodore Amiga featured a custom graphics chip with a blitter unit accelerating bitmap manipulation, line draw, and area fill functions; it also included a coprocessor with its own instruction set, capable of manipulating graphics hardware registers in sync with the video beam.

In 1986, Texas Instruments released the TMS34010, the first microprocessor with on-chip graphics capabilities. It could run general-purpose code, but it had a very graphics-oriented instruction set; in 1990-1992, this chip became the basis of the Texas Instruments Graphics Architecture (TIGA) Windows accelerator cards. In 1987, the IBM 8514 graphics system was released as one of the first video cards for IBM PC compatibles to implement fixed-function 2D primitives in electronic hardware. Fujitsu later competed with the FM Towns computer, released in 1989 with support for a full 16,777,216-color palette. In 1988, the first dedicated polygonal 3D graphics boards were introduced in arcades with the Namco System 21 and Taito Air System.

In 1991, S3 Graphics introduced the S3 86C911, which its designers named after the Porsche 911 as an indication of the performance increase it promised. The 86C911 spawned a host of imitators: by 1995, all major PC graphics chip makers had added 2D acceleration support to their chips. By this time, fixed-function Windows accelerators had surpassed expensive general-purpose graphics coprocessors in Windows performance. Throughout the 1990s, 2D GUI acceleration continued to evolve, and as manufacturing capabilities improved, so did the level of integration of graphics chips. Arcade systems such as the Sega Model 2 and Namco Magic Edge Hornet Simulator in 1993 were capable of hardware T&L (transform and lighting) years before it appeared in consumer graphics cards.
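The "no frame buffer" approach described above can be sketched in a few lines: rather than holding a whole frame in expensive RAM, the hardware composites sprites over the background one scan line at a time, as the beam sweeps the screen. The screen size and sprite format below are invented for illustration.

    # Scanline compositing without a frame buffer: only one line of
    # pixels exists in memory at any moment.
    WIDTH, HEIGHT = 16, 8
    sprites = [
        # (x, y, rows of pixels) -- color 0 is transparent
        (3, 2, [[1, 1], [1, 1]]),
        (10, 5, [[2, 0], [0, 2]]),
    ]

    for y in range(HEIGHT):                      # one scan line at a time
        line = [0] * WIDTH                       # background color 0
        for sx, sy, rows in sprites:
            if sy <= y < sy + len(rows):         # sprite crosses this line?
                for dx, color in enumerate(rows[y - sy]):
                    if color:
                        line[sx + dx] = color
        print("".join(".#@"[c] for c in line))   # "emit" the line to the display

The constraint this illustrates is timing: each line must be fully composited before the beam reaches it, which is why hardware sprite counts per scan line were so limited on early systems.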

39.
GeForce
–
GeForce is a brand of graphics processing units designed by Nvidia. As of 2016, there have been thirteen iterations of the design. Most recently, GeForce technology has been introduced into Nvidia's line of embedded application processors, designed for electronic handhelds and mobile handsets. With respect to discrete GPUs found on add-in graphics boards, Nvidia's GeForce and its nearest competitor, the AMD Radeon, are the main remaining players, and the GeForce architecture is moving toward general-purpose graphics processing unit (GPGPU) computing.

The GeForce name originated from a contest held by Nvidia in early 1999 called "Name That Chip", in which the company called on the public to name the successor to the RIVA TNT2 line of graphics boards. Over 12,000 entries were received, and seven winners received a RIVA TNT2 Ultra graphics card as a reward.

The driver license has common terms against reverse engineering, copying, and sub-licensing, and it disclaims warranties and liability. Starting in 2016, the GeForce license says Nvidia "collects personally identifiable information about Customer and CUSTOMER SYSTEM as well as configures CUSTOMER SYSTEM" in order to … The privacy notice goes on to say, "We are not able to respond to 'Do Not Track' signals set by a browser at this time. We also permit third party online advertising networks and social media companies to collect information … We may combine the personal information that we collect about you with the browsing and tracking information collected by these technologies."

Initial GeForce 256 boards shipped with SDR SDRAM memory, and later boards shipped with faster DDR SDRAM memory. Launched in April 2000, the first GeForce2 was another high-performance graphics chip: Nvidia moved to a twin texture processor per pipeline design, doubling the texture fillrate per clock compared to the GeForce 256. Later, Nvidia released the GeForce2 MX, which offered performance similar to the GeForce 256; the MX was a strong value in the low/mid-range market segments and was popular with OEM PC manufacturers and users alike. The GeForce 2 Ultra was the high-end model in this series.

Launched in February 2001, the GeForce3 introduced programmable vertex and pixel shaders to the GeForce family. It had good overall performance and shader support, making it popular with enthusiasts, although it never hit the midrange price point. The NV2A developed for the Microsoft Xbox game console is a derivative of the GeForce 3. Launched in February 2002, the then-high-end GeForce4 Ti was mostly a refinement of the GeForce3. Another member of the GeForce 4 family, the budget GeForce4 MX, was based on the GeForce2; it targeted the value segment of the market and lacked pixel shaders. Most of these models used the AGP 4× interface, but a few began the transition to AGP 8×.

Launched in 2003, the GeForce FX was a major change in architecture compared to its predecessors. The GPU was designed to support the new Shader Model 2 specification; however, initial models like the GeForce FX 5800 Ultra suffered from weak floating-point shader performance and excessive heat, which required infamously noisy two-slot cooling solutions.
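The "doubling texture fillrate per clock" claim is simple arithmetic: texel fillrate is pipelines × texture units per pipeline × core clock. The sketch below uses the commonly cited pipeline and TMU counts for these chips, with reference clock speeds; treat the exact figures as approximate.

    # Back-of-the-envelope texel fillrate comparison.
    def texel_fillrate(pipelines, tmus_per_pipe, clock_mhz):
        return pipelines * tmus_per_pipe * clock_mhz * 1e6   # texels/second

    geforce256 = texel_fillrate(4, 1, 120)   # 4 pipelines, 1 TMU each
    geforce2   = texel_fillrate(4, 2, 200)   # twin TMUs per pipeline

    print(f"GeForce 256: {geforce256 / 1e9:.2f} Gtexels/s")  # ~0.48
    print(f"GeForce2:    {geforce2 / 1e9:.2f} Gtexels/s")    # ~1.60

The second TMU per pipeline accounts for a clean 2x per clock; the higher core clock then stretches the overall gap to roughly 3.3x.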

Older Ethernet equipment. Clockwise from top-left: An Ethernet transceiver with an in-line 10BASE2 adapter, a similar model transceiver with a 10BASE5 adapter, an AUI cable, a different style of transceiver with 10BASE2 BNC T-connector, two 10BASE5 end fittings (N connectors), an orange "vampire tap" installation tool (which includes a specialized drill bit at one end and a socket wrench at the other), and an early model 10BASE5 transceiver (H4000) manufactured by DEC. The short length of yellow 10BASE5 cable has one end fitted with an N connector and the other end prepared to have an N connector shell installed; the half-black, half-grey rectangular object through which the cable passes is an installed vampire tap.

The version history of the Android mobile operating system began with the public release of the Android beta in …

Image: Android Cupcake home screen

Image: Android 4.1 on the Galaxy Nexus

Image: Nexus 5 (Android 4.4.2) Screenshot

Global Android version distribution as of August 2017. As of November 2017, Android Marshmallow is the most widely used version of Android, running on 30.9% of all Android devices accessing Google Play, while Android Lollipop runs on 27.2% of devices; 79.0% of devices run Lollipop or newer.