Colin Pillinger, one of Britain's most famous space scientists, has died at the age of 70 from a brain hemorrhage. Professor Pillinger was the very definition of a plucky Brit, whose force of personality and optimism enabled him to oversee construction of the Beagle 2 Mars probe. Against overwhelming odds, the academic convinced the European Space Agency to carry the probe to Mars aboard its Mars Express spacecraft. Unfortunately, contact was never made with the probe, and it is believed to have crashed trying to land on the Red Planet. Despite this, Pillinger raised the profile of the British space program and brought together the nation's various industrial and technical communities.

Hiroshi Yamauchi was Nintendo's third and arguably most important president. When he took the reins from his grandfather in 1949, the Japanese company specialized in the manufacture of playing cards for its home market -- first Japanese-style cards and then, under Yamauchi's guidance, Western-style ones too. By the time he handed over control to Satoru Iwata 53 years later, he'd overseen the creation of all Nintendo's game consoles up to the GameCube and become one of Japan's richest men -- in other words, not a bad innings for a man who passed away today at the ripe old age of 85.

The next time you hinge open that notebook PC and smile at a feature that makes it easier to use, give a thought to Bill Moggridge, who passed away Saturday from cancer at the age of 69. The pioneering designer invented the clamshell form factor seen in virtually every modern laptop, and is also viewed as a father of interaction design.

The Compass Computer he designed for Grid Systems with the screen folded over the keyboard appeared in 1981, flew on the space shuttle, and inspired virtually every notebook design since. Perhaps more importantly, when he tried to use the machine himself, Moggridge was exasperated with the difficulty and decided to take the human factor into account in software design. To that end, he engaged experts from fields like graphic design and psychology, and tried to "build empathy for the consumer into the product," according to former partner David Kelley. The pair merged their design firms to form IDEO in 1991, and worked with clients like Apple, Microsoft and Procter & Gamble, designing products like the first Macintosh mouse and the Palm V handheld along the way.

In 2010, Moggridge became the director of the Smithsonian's Cooper-Hewitt Museum in New York, and was a recipient of that institution's lifetime achievement award. He also won the Prince Philip Designers Prize, the longest-running award of its type in the UK, given for "a design career which has upheld the highest standards and broken new ground." See why that's true by watching Cooper-Hewitt's tribute video, right after the break.

It's a story that we hoped we'd never have to report. Neil Armstrong, the first man to set foot on Earth's Moon, has died at the age of 82, following complications from heart surgery performed three weeks earlier. His greatest accomplishment very nearly speaks for itself -- along with help from fellow NASA astronauts Buzz Aldrin and Michael Collins, he changed the landscape of space exploration through a set of footprints. It's still important to stress his accomplishments both before and after the historic Apollo 11 flight, though. He was instrumental in the Gemini and X-series test programs in the years before Apollo, and followed his moonshot with roles in teaching aerospace engineering as well as investigating the Apollo 13 and Space Shuttle Challenger incidents. What more can we say? Although he only spent a very small portion of his life beyond Earth's atmosphere, he's still widely considered the greatest space hero in the US, if not the world, and inspired a whole generation of astronauts. We'll miss him.

It's a sad day for science fiction fans everywhere, as Ray Bradbury has passed on at the age of 91. We'll always know him best as the author of Fahrenheit 451, but it's really his massive legacy of short stories that defined his role in technology. Collections like The Illustrated Man and The Martian Chronicles made it a point to illustrate technology's impact and to never let our gadgetry trample human nature. Appropriately, for all of his ability to envision the future, he was actually rather cautious about embracing it: he only reluctantly allowed e-books and was worried the world was rushing too quickly toward devices. The irony of paying tribute on a technology website to this trepidation isn't lost on us, but we sincerely appreciate Bradbury's literary legacy -- he kept us honest (and entertained) in an industry that sometimes needs a reality check. He'll be missed.

We knew it was coming, but alas, the loss of Google Wave hits us anew now that the execution date has finally come. To say we fully grokked this platform would be untrue, but as we dug through its history to gather our thoughts, we realized what a misunderstood creature Wave really was. Released in 2009 with great fanfare and no shortage of Firefly references, the program meant well with its collaboration-friendly interface, emphasis on multimedia sharing and raft of third-party extensions such as real-time Swedish Chef translation. But while its heart was in the right place, the service sacrificed accessibility for intrigue, a distinct online identity for an early adopter sensibility. Thus, after the invite-only mystique wore off and talk of a Wave app store began to sound downright foolish, the program's future looked anything but rosy. But even a product this short-lived can have a legacy: in Wave's case, it could be making Google Plus seem downright approachable by comparison. And though this may be little consolation to those hardcore wavers -- few and far between as they may be -- the project's spirit will live on in the equally perplexing Apache Wave. RIP, Google Wave, we really hardly knew you.

Believe it or not, pinball (that most beloved of nerd pastimes) hasn't always looked this way -- a familiar field of bumpers with a pair of forward-facing flippers at the bottom. That particular design originated with the 1948 title Triple Action, the work of Steve Kordek, who died this week at the age of 100. Kordek is credited with a number of innovations to the analog arcade games, including multi-ball mode and drop targets. All told, the pioneer designed well over 100 different machines for Genco, Bally and Williams -- some of the biggest names in the pinball pantheon -- over the course of his roughly 60-year career. So, it is with a heavy heart that we bid farewell to a man who provided us with hours of entertainment and cost us plenty of quarters.

Jacob Goldman, the man who helped found the Palo Alto Research Center (PARC) as Xerox's chief scientist in 1970, has passed away at age 90. PARC holds a special place in gadget lore, as it was responsible for creating the Alto, the first modern computer with a GUI and a mouse, as well as the first WYSIWYG text editor and Ethernet, among many other innovations. Prior to his time at Xerox, Dr. Goldman was the head of R&D at Ford Motor Company, and after retiring, he served on the boards of several companies, including Xerox. The New York Times reported that Goldman created PARC to research "the architecture of information" -- and the fruits of PARC's labor listed above show that he made an immeasurable and lasting contribution to the computing world, and consequently, life as we know it. Godspeed, Dr. Goldman, and thanks for everything.

We have some somber news to bring you today: Charles Walton, the man who pioneered the rise of RFID technology, has died at the age of 89. The Cornell-educated entrepreneur garnered more than 50 patents over the course of his career, but it only took one to cement his legacy -- a 1973 patent for a "Portable radio frequency emitting identifier." It may not have been the first RFID-related invention, but Walton's breakthrough would prove to be foundational, spawning many similar patents, including ten from the creator himself. It all began at the Army Signal Corps, where Walton worked after studying electrical engineering at Cornell and earning Master's degrees in electrical engineering and economics of engineering from the Stevens Institute of Technology. In 1960, he accepted a position at IBM, where he conducted research on disc drives before founding his own company, Proximity Devices, in 1970.

It was at Proximity where many of Walton's patents came to life, including his initial design, which he developed alongside the Schlage lock company and eventually licensed to other firms as well. He would go on to earn millions from his technology, though as VentureBeat points out, he may have been a bit too far ahead of the curve. Many of Walton's patents had expired by the time RFID devices caught on with big spenders like the Department of Defense and Wal-Mart, thereby excluding him from any subsequent windfall. But that didn't seem to bother him too much, as evidenced in a 2004 interview with VentureBeat: "I feel good about it and gratified I could make a contribution."

John Roberts Opel, the former IBM CEO who helped usher in the PC era, died last week at the age of 86. A native of Kansas City, MO, Opel received his MBA from the University of Chicago in 1949, after fighting in the Philippines and Okinawa during World War II. Upon graduating, he was presented with two job offers -- he could either rewrite economics textbooks, or assume control of his father's hardware business in Missouri. Not particularly enthralled with either opportunity, Opel decided to think things over during a fishing trip with his father and a family friend. As fate would have it, that friend turned out to be Harry Strait, an IBM sales manager. Strait offered Opel a sales position at the company, fortuitously setting the young grad on a career path that would span 36 years. Opel's career, in fact, began and peaked at two inflection points that would come to define not only IBM, but the computing industry as a whole. When he came aboard, IBM was still producing typewriters and other accounting devices; but that would soon change, with the dawn of the computing era.

In 1959, he became assistant to then-chief executive Thomas J. Watson Jr. Just five years later, he oversaw the introduction of IBM's System/360 mainframe computer. He was appointed vice president in 1966, president in 1974 and, on January 1st, 1981, took over as IBM's fifth CEO, replacing Frank T. Cary. During his four-year tenure, Opel led IBM's push into the burgeoning PC market, overseeing the launch of IBM's first PC, the 5150, just seven months after taking the reins. He was also at the helm in 1982, when the Department of Justice dropped its 13-year antitrust suit against IBM, allowing the firm to expand its operations. Opel took full advantage. Under his stewardship, IBM's revenue nearly doubled and its corporate stature grew accordingly. In 1983, Opel made the cover of Time magazine, under a headline that read, "The Colossus That Works." He stepped down as CEO in 1985, served as chairman until 1986 and would remain on IBM's board until 1993. On Thursday, he passed away in Ft. Myers, FL, due to undisclosed causes. John Roberts Opel is survived by his wife of 56 years, five children, 15 grandchildren and a legacy that extends far beyond these 400 words.

It might be a stretch to suggest that there'd be no AI without John McCarthy, but at the very least, we'd likely be discussing the concept much differently. The computer scientist, who died on Sunday at 84, is credited with coining the term "Artificial Intelligence" as part of a proposal for a Dartmouth conference on the subject. The event, held in 1956, is regarded as a watershed moment for the field. Early the following decade, McCarthy pioneered LISP, a programming language that became highly popular within the AI development community. In 1971, he won the Turing Award from the Association for Computing Machinery, and 20 years later he was awarded the National Medal of Science. A more complete obituary for McCarthy can be found at the source link below.

We're very sorry to report that Robert W. Galvin, former chairman and CEO of Motorola, died this week in Chicago at the age of 89. Over the course of his nearly three-decade tenure at the helm, Galvin oversaw Motorola's transformation from a mid-level radio and walkie-talkie manufacturer into one of the world's leading electronics makers. In the process, he cemented his legacy as one of the industry's most forward-looking executives. The Marshfield, Wisconsin native first joined the company as a stockroom apprentice in 1940, and would go on to spend his entire career there (save for a tour of duty in World War II). He was named chairman and chief executive in 1959, following the death of his father and company founder Paul Galvin. Under the younger Galvin's stewardship, Motorola expanded the depth and breadth of its operations, moving into emerging markets and focusing much of its efforts on the burgeoning cellular industry. Galvin spearheaded this transition, which saw Motorola demonstrate the first handheld cellphone in 1973, and launch commercial cellphone networks in the early 1980s. When he first took control, Motorola's annual sales stood at around $290 million. By the time he retired as chairman in 1990, however, that figure had ballooned to $10.8 billion. Galvin went on to serve on the company's board of directors until 2001 and, though he may have departed, his impact certainly won't be forgotten anytime soon. "We will continue to honor Bob Galvin's legacy here at Motorola Mobility," said current chairman and CEO Sanjay Jha. "He was committed to innovation, and was responsible for guiding Motorola through the creation of the global cellular telephone industry."
Robert "Bob" Galvin is survived by his wife of 67 years, four children, 13 grandchildren and ten great-grandchildren.

We're getting reports today that Dennis Ritchie, the man who created the C programming language and spearheaded the development of Unix, has died at the age of 70. The sad news was first reported by Rob Pike, a Google engineer and former colleague of Ritchie's, who confirmed via Google+ that the computer scientist passed away over the weekend, after a long battle with an unspecified illness. Ritchie's illustrious career began in 1967, when he joined Bell Labs just one year before receiving a PhD in physics from Harvard University. It didn't take long, however, for the Bronxville, NY native to have a major impact upon computer science. In 1969, he helped develop the Unix operating system alongside Ken Thompson, Brian Kernighan and other Bell colleagues. At around the same time, he began laying the groundwork for what would become the C programming language -- a framework he and co-author Kernighan would later explain in their seminal 1978 book, The C Programming Language. Ritchie went on to earn several awards on the strength of these accomplishments, including the Turing Award in 1983, election to the National Academy of Engineering in 1988, and the National Medal of Technology in 1999. The precise circumstances surrounding his death are unclear at the moment, though news of his passing has already elicited an outpouring of tributes and remembrance for the man known to many as dmr (his e-mail address at Bell Labs). "He was a quiet and mostly private man," Pike wrote in his brief post, "but he was also my friend, colleague, and collaborator, and the world has lost a truly great mind."

Arthur C. Nielsen Jr., the man who turned the A.C. Nielsen Company into a global leader in market research and television ratings, has died at the age of 92. Nielsen's father founded the company in 1923 and was known for spearheading much of the innovation behind it, but it was the younger Nielsen who led the firm to prominence, after joining in 1945 and taking over as president in 1957. In 1948, he convinced the firm to devote $150,000 to the development of the UNIVAC, the first general-purpose commercial computer. Building off of his father's revolutionary TV audience measurement system, he later expanded A.C. Nielsen's reach to new areas, including the development of a coupon clearinghouse and data-tracking services for magazines and even oil wells. Perhaps his most impressive achievement, however, was his ability to maintain A.C. Nielsen's position as the nation's pre-eminent TV and media ratings firm, even amidst the proliferation of cable networks.

Arthur "Art" Nielsen stepped down from his role as chairman of the company in 1983, a year before orchestrating its sale to the Dun & Bradstreet Corporation for $1.3 billion in stock. Throughout the course of his illustrious career, he served on the board of more than 20 companies, including Motorola and Walgreen, and advised three US presidents. But his life's work and lasting legacy could just as well be summarized by a simple proverb he learned from his father: "If you can put a number on it, then you know something." Arthur C. Nielsen passed away on October 4th in Winnetka, Illinois. He is survived by three children and seven grandchildren.

Somber news coming out of Palo Alto today, where Julius Blank, the man who helped found the groundbreaking chipmaker Fairchild Semiconductor Corporation, has passed away at the age of 86. The Manhattan-born Blank (pictured third from left, above) began his engineering career in 1952, when he joined AT&T's Western Electric plant in New Jersey. As a member of the engineering group at the plant, Blank helped create phone technology that allowed users to dial long-distance numbers without going through an operator. It was also at Western Electric where he met fellow engineer Eugene Kleiner. In 1956, Blank and Kleiner left AT&T to work at the lab of Nobel Prize-winning physicist William B. Shockley, but departed just one year later, amid discontent with Shockley's management, to start Fairchild alongside a group of six other scientists and engineers that included future Intel Corporation founders Robert Noyce and Gordon Moore. At their new labs, Blank and his peers developed an inexpensive method for manufacturing silicon chips, earning them $1.5 million in capital from a single investor. As the only two with any manufacturing experience, Blank and Kleiner were charged with bringing the dream to fruition -- a task that required them to build the chips from scratch, beginning with the machinery for growing silicon crystals. They succeeded, of course, and in 1969, Blank left Fairchild to start Xicor, a tech firm that Intersil would later buy for $529 million, in 2004. But his legacy will forever be linked to those early days at Fairchild, where, as Blank described in a 2008 interview, he and his colleagues were able to experience the unique thrill of "building something from nothing." Julius Blank is survived by his two sons, Jeffrey and David, and two grandsons.

There's some sad news coming out of Illinois today, where Michael S. Hart, the e-book inventor who founded Project Gutenberg, has died at the age of 64. Hart's literary journey began in 1971, when he digitized and distributed his first text, after being inspired by a free printed copy of the Declaration of Independence he found at the University of Illinois at Urbana-Champaign. That same year, the Tacoma, Washington native founded Project Gutenberg -- an online library that aims to "encourage the creation and distribution of eBooks" and to "break down the bars of ignorance and illiteracy." By 1987, he'd already digitized a total of 313 books, including works from Homer, Shakespeare and the Bible, before recruiting more volunteers to help out. As of this June, Hart's pioneering library housed about 36,000 works in its collection (most of which are in the public domain), with an average of 50 new books added each week. Described by Project Gutenberg as an "ardent technologist and futurist," Hart leaves a literary legacy perhaps best summed up in his own words. "One thing about eBooks that most people haven't thought much is that eBooks are the very first thing that we're all able to have as much as we want other than air," he wrote in July. "Think about that for a moment and you realize we are in the right job." Michael S. Hart is survived by his mother and brother.

He may not be a household name like Henry Ford, but it's arguable that George Devol's (above, right) work was even more influential in shaping the modern manufacturing landscape. In 1961, roughly seven years after he first applied for the patent, his Unimate was put into service in a General Motors automobile plant. The world's first programmable robotic arm was used to lift hot cast metal components out of a mold and stack them -- the assembly line has never been the same. Other companies soon followed suit, replacing expensive and fragile humans with mechanical labor. Devol died Thursday night in his home at the age of 99. If you're interested in getting a peek at his game-changing invention, you can find one at the Smithsonian's National Museum of American History.

We have some somber news to bring you this morning: Robert Morris, the cryptographer who helped create Unix, has died at the age of 78. Morris began his work on the groundbreaking OS back in 1970 at AT&T's Bell Laboratories, where he played a major role in developing Unix's math library, password structure and encryption functions. His cryptographic exploration continued into the late 1970s, when he began writing a paper on an early encryption tool from Germany. But the paper would never see the light of day, thanks to a request from the NSA, which was concerned about potential security ramifications. Instead, the agency brought Morris on board as a computer security expert in 1986. Much of what he did for Uncle Sam remains classified, though he was involved in internet surveillance projects and cyber warfare -- including what might have been America's first cyberattack in 1991, when the US crippled Saddam Hussein's control capabilities during the first Gulf War. Morris stayed with the NSA until 1994, when he retired to New Hampshire. He's survived by his wife, three children and one massive digital footprint.

A man whose impact on the world is nearly unfathomable died Sunday. Alan L. Haberman, supermarket-executive-turned-barcode-champion, died in Newton, Massachusetts, from complications of heart and lung disease at the age of 81. While he did not invent those ubiquitous black and white stripes -- that honor belongs to Norman Joseph Woodland and Bernard Silver -- Haberman did lead the campaign to make barcodes the universal standard for electronic product encoding. He chaired the committee responsible for the designation of the zebra-like markings, which in 1973 adopted a barcode designed by George J. Laurer of IBM. In his work at the Uniform Code Council (now known as GS1 US), he pushed for acceptance of multiple standards, including RFID. His obituary can be read in full at the source link below.

At the turn of the millennium, three men formed Danger Incorporated, which went on to create a smartphone perfectly positioned for its time. Those men eventually wound up at Google... after one of them founded Android. But what became of the T-Mobile Sidekick, their stylish swiveling phone? After an illustrious life filled with fame, fortune and failure, the Hiptop met its end today, as Microsoft and T-Mobile shut down the Danger servers for good, leaving existing handsets without the push email and cloud services that once made them indispensable to the teens, tweens and businesspeople who used them day in and day out -- and leaving the Android-powered Sidekick 4G to fan the remaining embers of the brand. Join us after the break for a video celebration of Danger's pop culture phenomenon, and head on over to Geekwire for a brief history of the iconic device. Now, if you'll excuse us, we've got a little water in our eye.

We have some sad news to share with you today: Willard Boyle, the man who created the imaging technology behind everything from digital cameras to barcode scanners, has died at the age of 86. In 2009, Boyle shared a Nobel Prize in physics for inventing the CCD, which allowed people to capture images in digital format for the first time. It all began way back in 1969, when Boyle and his future co-Laureate, George E. Smith, started laying the groundwork for the CCD while working at Bell Laboratories. Building on Einstein's photoelectric effect, the two eventually came up with a way to locate and quantify the electrons that are knocked out of orbit every time light strikes silicon. Boyle and Smith used this technology to create their own digital camera in 1970, as well as a TV camera in 1975. Prior to his groundbreaking invention, Boyle spent two years working for NASA's Apollo program and helped develop both the ruby laser and the semiconductor injection laser. The last three decades of Boyle's life were spent in Wallace, Canada, where he grew up and, on May 7th, passed away after battling kidney disease. He's survived by his wife, three children and an indelible legacy.

There's more sad news out of Japan this morning, we're afraid -- Sony is reporting that former chairman Norio Ohga passed away in Tokyo yesterday from multiple organ failure. He was 81. You may not personally remember a Sony under his reign -- Ohga directly helmed the company from 1982 to 1995 after decades of service in product planning -- but Norio Ohga was arguably the man responsible for turning Sony from a high-profile analog electronics manufacturer into a digital multimedia conglomerate. He helmed the deals that formed Sony Music, paved the way for Sony Pictures and established the very same Sony Computer Entertainment that would birth the PlayStation, and it was he who pushed the optical compact disc standard that all but replaced the magnetic cassettes and diskettes that held portable media. Without him, DVDs and Blu-rays might have fallen by the wayside, and that's another thought that brings tears to our eyes. You'll find Ohga's official obituary after the break.

Sat, 23 Apr 2011
Hard as it may be to believe, the internet as we know it didn't really exist a mere 20 years ago. Paul Baran, one of the engineers behind the ARPANET (an early attempt at a networked information superhighway), has passed away today at the age of 84. As the father of packet switching -- the basis of all online information exchanges -- he was initially scoffed at by major communications players like AT&T, who thought the tech was too advanced to be realized at the time. However, after the US Department of Defense saw the need for a resilient large-scale information network during the Cold War, the ARPANET was eventually -- and successfully -- built on these packet-switching concepts and evolved into the current interweb. We've lost a true visionary in the field of networking, and here's hoping the next generation of like-minded innovators has the same perseverance and success.

Mon, 28 Mar 2011
Otto Bock's mind-controlled bionic arms let Austria's Christian Kandlbauer work, play and even drive, but it seems the latter passion may have led to the 22-year-old's untimely demise. Two days after a road accident in which the young man's specially modified Subaru crashed into a tree, Kandlbauer was pronounced brain-dead and taken off life support late last week. It's not known whether the prosthetic arms themselves had anything to do with the crash -- one was found ripped from his body at the scene -- but both he and his vehicle were cleared to drive by local authorities after passing a number of tests. Honestly, it's a tragedy for science and humanity either way.

Sat, 23 Oct 2010
You have to figure that major news outlets keep obituaries on hand for all kinds of public figures and celebrities -- still, you can't help feeling a bit of a chill upon learning that notice of Steve Jobs' death mistakenly hit the wires yesterday afternoon. A slip-up at news outlet Bloomberg caused the lengthy obituary to roll across a number of screens before being pulled -- but not before a Gawker tipster was able to send off a copy to the gossip site. Under normal circumstances, this would probably come off as a random gaffe with minimal impact, but given recent reactions and overreactions concerning Jobs' health (thanks in no small part to his appearance at WWDC, pictured above), it comes off as a badly timed bit of journalistic and technical butterfingers. We can only hope this didn't send too many investors into a tailspin -- we'd hate to see any War of the Worlds moments caused by something so silly.