Engadget RSS Feed (http://www.engadget.com) -- Copyright 2015 AOL Inc. The contents of this feed are available for non-commercial use only.

For as often as TSMC has extolled the virtues of FinFET chip designs, we've been wondering exactly when we'd find them sitting in our devices. Thanks to competition from rival semiconductor firms, we'll get them relatively soon: the company now expects to produce its first wave of FinFET-based, 16-nanometer chips toward the end of 2013. While they won't be as nice as the 14nm-XM chips in the pipeline, the 16nm parts should still offer battery life and speed improvements over the 28nm chips we know today. These improvements also won't be the end of the road -- TSMC anticipates 10nm designs built on extreme ultraviolet lithography in late 2015, and CEO Morris Chang believes there are seven or more years of advancements in manufacturing before Moore's Law starts breaking down. We'll just be happy if we see FinFET reach our phones and tablets in the near term.

Researchers at MIT's Microsystems Technology Laboratories may be giving Moore's Law a new lease on life with the development of the smallest indium gallium arsenide transistor ever made, measuring just 22 nanometers. Such transistors could produce more current when shrunken down than those based on silicon, which means chips may continue to pack in more transistors while providing a bigger punch. "We have shown that you can make extremely small indium gallium arsenide MOSFETs (metal-oxide semiconductor field-effect transistors) with excellent logic characteristics, which promises to take Moore's Law beyond the reach of silicon," says Jesús del Alamo, co-developer of the tech. The development is an encouraging step in the right direction, but the MIT team still has a long road ahead of it before the tech shows up in your gadgets. Next on the docket for the scientists is improving the transistor's electrical performance and downsizing it to below 10 nanometers. For the nitty-gritty on how the transistor was built, hit the source link.

ARM CEO Warren East already has a tendency to be more than a bit outspoken on the future of computing, and he just escalated the war of words with an assault on the industry's sacred cow: Moore's Law. After some prompting by MIT Technology Review during a chat, East argued that power efficiency is "actually what matters," whether it's a phone or a server farm. Making ever more complex and power-hungry processors to obey Moore's Law just limits how many chips you can fit in a given space, he said. Not that the executive is about to accept Intel's position that ARM isn't meant for performance, as he saw the architecture scaling to high speeds whenever there was a large enough power supply to back it up. East's talk is a bit long on theory and short on practice as of today -- a Samsung Chromebook isn't going to make Gordon Moore have second thoughts -- but it's food for thought in an era where ARM is growing fast, and even Microsoft isn't convinced that speed rules everything.

Samsung's round of cash-flashing continues with a $629 million purchase of a three-percent stake in ASML. It's joining Intel and TSMC in pumping money into the Dutch business, which is developing tooling for chip-making machines with extreme ultraviolet lithography (EUV) designed to "extend Moore's Law." It'll also help reduce the cost of future silicon, since it'll enable the companies to use wider silicon wafers along the manufacturing line. Given that Samsung's investment caps off a project to raise nearly $5 billion in cash and that ASML's home is just five miles west of PSV Eindhoven's stadium, we just hope they threw in a few home tickets for their trouble.

3D silicon is all the rage, and now nanowire transistors have further potential to keep Moore's Law on life support. Researchers at A*STAR have found a way to double the number of transistors on a chip by placing the atomic-scale wires vertically, rather than in the run-of-the-mill planar mode, creating two "wrap-around gates" that put a pair of transistors on a single nanowire. In the future, the tech could be merged with tunnel field effect transistors -- which use dissimilar semiconductor materials -- to create a markedly denser design. That combo would also burn a minuscule percentage of the power required conventionally, according to the scientists, making it useful for low-powered processors, logic boards and non-volatile memory, for starters. So, a certain Intel founder might keep being right after all, at least for a few years more.

Primed goes in-depth on the technobabble you hear on Engadget every day -- we dig deep into each topic's history and how it benefits our lives. You can follow the series here. Looking to suggest a piece of technology for us to break down? Drop us a line at primed *at* engadget *dawt* com.

Welcome to one of the most unnecessarily complicated questions in the world of silicon-controlled gadgets: should a savvy customer care about the underlying nature of the processor in their next purchase? Theoretically at least, the answer is obvious. Whether it's a CPU, graphics card, smartphone or tricorder, it'll always receive the Holy Grail combo of greater performance and reduced power consumption if it's built around a chip with a smaller fabrication process. That's because, as transistors get tinier and more tightly packed, electrons don't have to travel so far when moving between them -- saving both time and energy. In other words, a phone with a 28-nanometer (nm) processor ought to be fundamentally superior to one with a 45nm chip, and a PC running on silicon with features etched at 22nm should deliver more performance-per-watt than a 32nm rival.
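The square-law intuition behind that last sentence can be sketched in a few lines of Python. This is an idealized model of our own -- real process nodes never shrink this cleanly, and marketing node names don't map exactly to feature sizes -- but it shows where the "smaller node, more transistors" expectation comes from:

```python
# Back-of-the-envelope: ideal area scaling between process nodes.
# Feature area scales with the square of the linear feature size,
# so transistor density scales with (old / new) ** 2.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density ratio when shrinking old_nm -> new_nm."""
    return (old_nm / new_nm) ** 2

# The node pairs mentioned above:
print(f"45nm -> 28nm: ~{density_gain(45, 28):.1f}x the transistors per area")
print(f"32nm -> 22nm: ~{density_gain(32, 22):.1f}x the transistors per area")
```

Under this ideal model, the 45nm-to-28nm jump packs roughly 2.6x the transistors into the same area; real-world gains are smaller once wiring, yield and design rules get involved.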

But if that's true, isn't it equally sensible to focus on the end results? Instead of getting bogged down in semiconductor theory, we may as well let Moore's Law churn away in the background while we judge products based on their overall user experience. Wouldn't that make for an easier life? Well, maybe, but whichever way you look at it, it's hard to stop this subject descending into pure philosophy, on a par with other yawnsome puzzles like whether meat-eaters should visit an abattoir at least once, or whether it's better to medicate the ailment or the person. Bearing that in mind, we're going to look at how some key players in the silicon industry treat this topic, and we'll try to deliver some practical, offal-free information in the process.

Unless you've been hiding under a rock lately, we're pretty sure you've heard about the Raspberry Pi by now -- a $25 credit-card-sized PC that brings ARM/Linux to the Arduino form factor. As a refresher, the system features a 700MHz Broadcom BCM2835 SoC with an ARM11 CPU, a Videocore 4 GPU (which handles HD H.264 video and OpenGL ES 2.0) and 256MB RAM. The board includes an SD card slot, HDMI output, composite video jack, 3.5mm audio socket, micro-USB power connector and GPIO header. Model A ($25) comes with one USB port, while Model B ($35) provides two USB ports and a 100BaseT Ethernet socket. Debian is recommended, but Raspberry Pi can run most ARM-compatible 32-bit OSes.

This past weekend at Maker Faire Bay Area 2012 we ran into Eben Upton, Executive Director of the Raspberry Pi Foundation, and took the opportunity to spend some quality time with a production board and to discuss this incredible PC. We touched upon the origins of the system (inspired by the BBC Micro, one of the ARM founders' projects), Moore's law, the wonders of simple computers and upcoming products / ideas -- including Adafruit's Pi Plate and Raspberry Pi's prototype camera add-on. On the subject of availability, the company expects that "there will be approximately 200,000 units in the field by the end of June". Take a look at our hands-on gallery below and our video interview after the break.

Transistors -- the basic building blocks of the complex electronic devices around you. Literally billions of them make up that Core i7 in your gaming rig, and Moore's law says that number will double every 18 months as they get smaller and smaller. Researchers at the University of New South Wales may have found the limit of this basic computational rule, however, by creating the world's first single-atom transistor. A single phosphorus atom was placed into a silicon lattice and read with a pair of extremely tiny silicon leads that allowed them to observe both its transistor behavior and its quantum state. Presumably this spells the end of the road for Moore's Law, as it would seem all but impossible to shrink transistors any further. But it could also point to a future featuring miniaturized solid-state quantum computers.
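The "doubles every 18 months" arithmetic quoted above is easy to sketch. The starting count below is our own ballpark assumption for a Core i7-class die of the era, not an official figure:

```python
# Illustrative: project a transistor count forward under the fixed
# doubling period quoted in the article (18 months).

def projected_transistors(start: float, months: int,
                          doubling_months: float = 18) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return start * 2 ** (months / doubling_months)

start = 1.2e9  # assumed ballpark for a Core i7-era desktop CPU
print(f"After 3 years: {projected_transistors(start, 36):.2e}")  # two doublings
print(f"After 6 years: {projected_transistors(start, 72):.2e}")  # four doublings
```

Six years of that cadence is a 16x increase, which is exactly why a hard physical floor -- like a single atom -- matters so much to the projection.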

It's not the smallest transistor out there, but the boffins at IBM have constructed the tiniest carbon nanotube transistor to date. It's nine nanometers in size, making it one nanometer smaller than the presumed physical limit of silicon transistors. Plus, it consumes less power and is able to carry more current than present-day technology. The researchers accomplished the trick by laying a nanotube on a thin layer of insulation, and using a two-step process -- involving some sort of black magic, no doubt -- to add the electrical gates inside. The catch? (There's always a catch) Manufacturing pure batches of semiconducting nanotubes is difficult, as is aligning them in such a way that the transistors can function. So, it'll be some time before the technology can compete with Intel's 3D silicon, but at least we're one step closer to carbon-based computing.

The combination of microchips, electricity and fluids is generally considered an incredibly bad thing. IBM, however, thinks it can combine those three to make super small and super powerful computers in the future. The idea is to stack hundreds of silicon wafers and utilize dual fluidic networks between them to create 3D processors. In such a setup, one network carries in charged fluid to power the chip, while the second carries away the same fluid after it has picked up heat from the active transistors. Of course, 3D chips are already on the way, and liquid-cooled components are nothing new, but powering a PC by fluids instead of wires has never been done before. Bruno Michel, who's leading Big Blue's research team, has high hopes for the technology, because future processors will need the extra cooling and reduced power consumption it can provide. Michel says he and his colleagues have demonstrated that it's possible to use a liquid to transfer power via a network of fluidic channels, and they plan to build a working prototype chip by 2014. If successful, your smartphone could eventually contain the power of the Watson supercomputer. Chop, chop, fellas, those futuristic fluidic networks aren't going to build themselves.

Around the same time most years (2007, 2009, 2010), someone heralds the death of Moore's law. This time it's Stanford University's Dr. Jonathan Koomey, who has found that energy efficiency roughly doubles every two years. With the rise of mobile devices, we care less whether our phones and tablets can outpace a desktop and more about whether a full charge will last the duration of our commute -- reducing the importance of Moore's law. Historically, efficiency has been a secondary concern as manufacturers built ever faster CPUs, but Koomey believes there is enormous room for improvement. In 1985, Dr. Richard Feynman calculated an upper limit on efficiency a factor of 100 billion above the technology of the day -- since then we've only managed a factor of 40,000. Let's just hope quantum computing goes mainstream before next autumn so we can get on with more important things.
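As a rough sanity check on those figures -- assuming, per the article, one efficiency doubling every two years -- here's how much runway the Feynman limit would leave:

```python
import math

# Article's numbers: efficiency has improved by a factor of 40,000,
# against a theoretical ceiling a factor of 100 billion above the
# 1985 baseline. How many Koomey-style doublings remain, and how long
# would they take at one doubling per two years?

achieved = 40_000
ceiling = 100e9

remaining_factor = ceiling / achieved
doublings_left = math.log2(remaining_factor)
years_left = doublings_left * 2  # one doubling every two years, per the article

print(f"Headroom factor: {remaining_factor:.2e}")  # 2.50e+06
print(f"Doublings left:  {doublings_left:.1f}")    # ~21.3
print(f"Years at one doubling per two years: ~{years_left:.1f}")
```

So by this crude arithmetic, Koomey's law could keep running for roughly four more decades before hitting Feynman's ceiling -- comfortably longer than most Moore's-law obituaries give silicon.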

Sure, Fujitsu has a right to be proud of its K supercomputer -- performing over 8 petaflops with just under 70,000 Venus CPUs is nothing to sneeze at. Intel isn't giving up its status as the supercomputing CPU king, however, as it plans to bring exascale computing to the world by the end of this decade. Such a machine could do one million trillion calculations per second, and Intel plans to make it happen with its Many Integrated Core Architecture (MIC). The first CPUs designed with MIC, codenamed Knights Corner, are built on a 22nm process that utilizes the company's 3D Tri-Gate transistors and packs over 50 cores per chip. These CPUs are designed for parallel processing applications, similar to the NVIDIA GPUs that will be used in a DARPA-funded supercomputer we learned about last year. Here we thought the war between these two was over -- looks like a new one's just getting started. PR's after the break.
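For scale, the jump from the K computer's quoted 8 petaflops to an exaflop works out like this (a simple unit sketch, not Intel's roadmap math):

```python
# Scale check for the figures in the article, in FLOPS.
# "Exascale" = 1e18 operations per second ("one million trillion");
# Fujitsu's K machine is quoted at just over 8 petaflops (8e15).

PETA = 1e15
EXA = 1e18

k_computer = 8 * PETA
exascale = 1 * EXA

print(f"Exascale target vs. K computer: {exascale / k_computer:.0f}x")  # 125x
print(f"1 exaflop = {EXA:.0e} calculations per second")
```

In other words, Intel's target is about 125 times the throughput of the machine that currently tops the charts.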

Intel has been talking up its x86-powered smartphones and battery-sipping Atoms for tablets quite a bit recently, but the company hasn't forgotten its roots in traditional PC form factors. At an investor event in San Francisco, CEO Paul Otellini announced a significant change to its line of notebook CPUs -- ultra low voltage will be the new norm, not just a niche chip for high-end ultraportables. The baseline TDP for future CPUs will be in the 10 to 15 watt range, a huge drop from the 35W design of the mainstream Core line and lower than even current-gen ULV chips (which bottom out at 17W). The company also plans to make NVIDIA eat its words by putting the pedal to the metal on die shrinks -- releasing a 22nm Atom next year followed by a 14nm version in 2013. That could mean our fantasy of true all-day battery life in a sleek and sexy laptop will finally come true. Don't crush our dreams, Intel!

Looks like 3D isn't just a fad, folks, so long as we're talking about silicon -- Intel just announced that it has invented a 3D "Tri-Gate" transistor that will allow the company to keep shrinking chips, Moore's Law naysayers be darned. Intel says the transistors will use 50 percent less power, conduct more current and provide 37 percent more speed than their 2D counterparts thanks to vertical fins of silicon substrate that stick up through the other layers, and that those fancy fins could make for cheaper chips too -- currently, though, the tri-gate tech adds an estimated 2 to 3 percent cost to existing silicon wafers. Intel says we'll see the new technology first in its 22nm Ivy Bridge CPUs, going into mass production in the second half of the year, and it's planning 14nm chips in 2013 and 10nm chips in 2015. Also, 3D transistors won't be limited to the cutting edge -- Intel reps told journalists that they "will extend across the entire range of our product line," including mobile devices. Three videos and a press release await you after the break.

There's little question that the last 50 years have represented the most innovative half-century in human history, and today marks the anniversary of the invention that started it all: the silicon-based integrated circuit. Robert Noyce received the landmark US patent on April 25, 1961, going on to found Intel Corporation with Gordon E. Moore (of Moore's Law fame) in 1968. He wasn't the first to invent the integrated circuit -- Jack Kilby, also the inventor of the pocket calculator, had patented a similar technology on a germanium wafer for Texas Instruments a few months prior. Noyce's silicon version stuck, however, and is responsible for Moore's estimated $3.7 billion net worth, not to mention the success of the entire computing industry. Holding 16 other patents and credited as a mentor of Steve Jobs, Noyce was awarded the National Medal of Technology in 1987, and continued to shape the computing industry until his death in 1990. If Moore's Law continues to hold true, as we anticipate it will, we expect the next 50 years to be even more exciting than the last. Let's meet back here in 2061.

How time flies! In the year 2000, I was just finishing high school, listening to Bush, and becoming acquainted with Windows 2000. Back then, I knew very little about Apple, and I'd certainly not heard of the Bondi Blue iMac (the first iMac was released in 1998). In 2010, well... how things have changed for me!

It's incredible to think that the iPhone has taken center stage at Apple over the last three years. As noted by some of our commentators, there has been a real lack of Mac-centric news recently. Sure, there was the update to the iMac a few months ago, but it's glaringly obvious that the Mac has taken a back seat to the iPhone -- certainly in the limelight department. In fact, I'm reveling in the fact that I'm writing about the iPhone and the iMac at the same time!

Today, the Mac is the center of our digital hub, but it's no longer the center of our digital world. When we leave the house / office / room where the Mac lives, it's the iPhone (or iPad / iPod touch) that is constantly in our hands, and Apple knows it!

Of course, we have to come back to our Macs eventually (in my case, repeatedly, every day) because the iPhone can't do everything that we want it to, or even do some of the things that we want done well, yet. But just looking at this picture shows how far things have come, and how the direction taken by personal computing is becoming even more personal.

The only feature of the iPhone 4 that doesn't beat the iMac of yesteryear is screen real estate. The iPhone's processor and RAM have double the capacity of the iMac's, its storage is 2 gigabytes larger, and that storage is flash-based. And of course, it's tiny in comparison. As noted by Obama Pacman, it's Moore's law in effect.

But when will it end? In 10 years' time, will we have an iPhone that's five times smaller than the current one, but more powerful than the personal computers of today? Who knows? That might be a weird phone, but anything could happen. For now, I'm still stuck with my iPhone 3G, and I think it might still have some Bush on it. In the meantime, I'm just looking forward to getting the iPhone 4!

Entelligence is a column by technology strategist and author Michael Gartenberg, a man whose desire for a delicious cup of coffee and a quality New York bagel is dwarfed only by his passion for tech. In these articles, he'll explore where our industry is and where it's going -- on both micro and macro levels -- with the unique wit and insight only he can provide.

We are all familiar with Moore's law: the observation made by Intel co-founder Gordon Moore that the density of semiconductors doubles roughly every eighteen months. The net result? It's always going to be better, faster and cheaper. Certainly that's been true of the phone space, with large screens, fast processors and lots of storage.

In the last few weeks alone I've looked at new phones with 1GHz processors, the latest and greatest software platforms from Google and RIM... but it's been one little gadget that's caught my attention, and it totally bucks the trend. What device? It's the Sony Ericsson Xperia X10 Mini Pro -- which is a lot of name for a small phone -- and it shows some very different thinking about what a smartphone is. In theory, this isn't a phone that I should like. Instead of a large 4.3-inch screen, it's running a 2.55-inch screen at 240 x 320 resolution. Don't look for a 1GHz processor here. It's got an ARMv6 revision 5 processor at 600MHz. Finally, forget Froyo or even Eclair. This thing's got Android 1.6 on it and may never get updated to the latest and greatest. Despite all that, I think Sony Ericsson has a potential hit on its hands if it decides to bring this to the US later this year, as it said it plans to. Why am I so enamored?

NVIDIA and Intel haven't been shy about their differing respective visions of the future of computing in the past year or so, but it looks like Team GPU just upped the rhetoric a little -- a Forbes column by NVIDIA VP Bill Dally argues that "Moore's law is dead." Given that Moore's law is arguably the foundation of Intel's entire business, such a statement is a huge shot across the bow; though other companies like AMD are guided by the doctrine, Intel's relentless pursuit of Gordon Moore's vision has become a focal point and rallying cry for the world's largest chipmaker.

So what's Dally's solution to the death of Moore's law? For everyone to buy into parallel computing, where -- surprise, surprise -- NVIDIA's GPUs thrive. Dally says that dual-, quad- and hex-core solutions are inefficient -- he likens multi-core chips to "trying to build an airplane by putting wings on a train," and says that only ground-up parallel solutions designed for energy efficiency will bring back the golden age of doubling performance every two years. That sounds fantastic, but as far as power consumption is concerned, well, perhaps NVIDIA had best lead by example.

As circuitry gets smaller and approaches the effective limit of silicon's computing power, and Moore's Law begins to look like it has an expiration date, we get closer and closer to needing an alternative. Graphene is held to be the answer: sheets of carbon a single atom thick that could be stacked and composited to create processors. Two professors at the University of South Florida, Matthias Batzill and Ivan Oleynik, have found a new way to turn those sheets into circuits by creating nanoscale defects. These strips of broken atomic rings wind up having metallic properties, thus making them act like microscopic wires. IBM is already teasing us with the possibilities of graphene and now, with a more practical way to make graphene-based electronics, we'd say Moore's Law still has at least another couple of decades left.

If you're looking for pundits with an end date for Moore's Law, you don't have to look far. You also don't have to look far to find a gaggle of loonies who just knew the world was ending in Y2K, so make of that what you will. The latest duo looking to call the demise of the processor mantra that has held true for two score years comes from Boston University, with physicists Lev Levitin and Tommaso Toffoli asserting that a quantum limit would be reached in around 75 to 80 years. Scott Aaronson, an attention-getter at MIT, expects that very same limit to be hit in just 20 years. Of course, there's plenty of technobabble to explain the whats and hows behind all this, but considering that the brainiacs of the world can't even agree with Gordon Moore's own doomsday date, we're choosing to plug our ears and keep on believin' for now. Bonus video after the break.

My first writings here at Massively were a look back at the last ten years of MMO gaming, much of which I'd taken some small part in, and a comparison of how early MMOs had been then against how they seem to have shaped up today. I expect if I was going to grow out of these things it would have already happened by now, so I'm fully expecting to be playing an MMO of some description in 2019.

Much of the year 2019 is already known to us, detailed extensively in the documentaries 'Blade Runner', 'The Running Man' and 'Akira', but what will MMOs be like a decade from now? Join me as I charge up the flux capacitors, spin the big brass-and-crystal whirley thing with no obvious purpose and hop in my little blue box in a bid to divine... the future!

Tags: 2019, billing, featured, future, Moore's Law | Sat, 01 Aug 2009 10:00:00 -0400
http://www.engadget.com/2008/12/19/ibm-claims-title-of-worlds-fastest-graphene-transistor/
As we've seen, plenty of researchers and companies are betting on graphene as the big thing that will revolutionize transistors and, hence, all manner of electronics, and IBM is now claiming one of the biggest breakthroughs to date, not to mention the desirable title of "world's fastest graphene transistor." More specifically, IBM researchers have apparently been the first to demonstrate graphene field-effect transistors operating at gigahertz frequencies. Even more importantly, they've also established the scaling behavior of the graphene transistors, which they say could eventually lead to terahertz graphene transistors -- or, in other words, keep Moore's Law around for quite a bit longer than many expected.
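Why does scaling behavior matter so much? In the simplest velocity-saturation picture, a FET's cutoff frequency goes roughly as f_T ≈ v / (2πL), so shrinking the gate directly speeds up the device. A crude sketch under that assumption -- the carrier velocity below is a made-up illustrative figure, not one of IBM's measured values:

```python
import math


def cutoff_frequency_ghz(gate_length_nm: float,
                         carrier_velocity_m_s: float = 1e5) -> float:
    """Idealized FET cutoff frequency, f_T ~ v / (2*pi*L), in GHz.

    Toy model only: ignores parasitics and uses an assumed carrier velocity.
    """
    gate_length_m = gate_length_nm * 1e-9
    return carrier_velocity_m_s / (2 * math.pi * gate_length_m) / 1e9


# Halving the gate length doubles this idealized cutoff frequency -- the
# sort of scaling trend that points toward terahertz parts.
for length_nm in (240, 120, 60):
    print(length_nm, round(cutoff_frequency_ghz(length_nm), 1))
```

Real devices fall well short of this ideal curve, which is exactly why demonstrating the scaling trend experimentally was the newsworthy part.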

Tags: graphene, IBM, Moore's Law, transistor | Fri, 19 Dec 2008 20:51:00 -0500
http://www.engadget.com/2008/10/22/researchers-say-new-state-of-matter-could-extend-moores-law/
There's certainly been no shortage of folks trying to pin down an end date for Moore's Law, but there's also thankfully plenty of researchers doing their best to keep it going, and a team of physicists from McGill University in Montreal now say they've made a discovery that could keep the law alive even further into the future. Their big breakthrough is a new state of matter known as a quasi-three-dimensional electron crystal, which they discovered by cooling a semiconductor material to temperatures "roughly 100 times colder than intergalactic space" and then exposing it to the "most powerful continuous magnetic fields generated on Earth." Unlike two-dimensional electron crystals, which lead researcher Dr. Guillaume Gervais equates to a ham sandwich, the quasi-three-dimensional crystals sit in an in-between state between 2D and 3D, which could let transistors keep improving as they run up against the limits imposed by the laws of physics.

Tags: McGill, McGill University, Moore's Law, transistor | Wed, 22 Oct 2008 13:28:00 -0400
http://www.engadget.com/2008/07/11/microchip-breakthrough-could-keep-moores-law-intact-again/
We're pretty certain we'll be hearing this same story each year, every year, for the rest of eternity, but hey, it's not like we're kvetching over that or anything. Once again, we're hearing that mad scientists have developed a breakthrough that makes Mr. Moore look remarkably prescient, as a new approach to chipmaking can carve features in silicon "that are many times smaller than the wavelength of the light used to make them." Reportedly, the new method produces grids of parallel lines just 25 nanometers wide using light with a wavelength of 351 nanometers, although the grids aren't functional circuits just yet. If you're interested in more technobabble on the matter, head on down to the read link, but we'd recommend against it if you're easily frightened by terms like "photolithographic" and "nanotechnology."
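To put those numbers in context: classical projection lithography can only resolve features down to about k1 * λ / NA (the Rayleigh criterion). Even with a generous resolution factor and lens, 351 nm light shouldn't get anywhere near 25 nm features, which is what makes the result notable. A quick sketch -- the k1 and numerical-aperture values here are illustrative assumptions, not the lab's actual setup:

```python
def rayleigh_half_pitch_nm(wavelength_nm: float,
                           numerical_aperture: float = 1.0,
                           k1: float = 0.25) -> float:
    """Classical resolution limit of projection lithography: k1 * lambda / NA."""
    return k1 * wavelength_nm / numerical_aperture


classical_limit = rayleigh_half_pitch_nm(351)  # ~88 nm even on generous assumptions
feature_fraction = 25 / 351                    # reported lines are ~1/14 of the wavelength
print(round(classical_limit), round(feature_fraction, 3))
```

The reported 25 nm lines sit several times below even that optimistic classical limit, hence the excitement.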