Posted by Unknown Lamer on Monday October 22, 2012 @10:22PM from the beowulf-cluster-of-course dept.

First time accepted submitter icepick3000 writes "There are probably a lot of digital photo frames lying around unused these days. Mine is from the first generation, meaning you can only insert a CompactFlash card and display photos. Newer models can display weather, news, and stocks. Anyone have good ideas for giving these old frames a second life? I have been thinking about CompactFlash cards that support Wi-Fi... maybe someone has a better idea?"

Posted by Soulskill on Monday October 22, 2012 @05:10PM from the dysfunctional-courtship dept.

Hugh Pickens writes "Jean-Louis Gassée says Apple and Samsung are engaged in a knives-out smartphone war. But when it comes to chips, the two companies must pretend to be civil because Samsung is the sole supplier of ARM-based processors for the iPhone. So why hasn't Intel jumped at the chance to become Apple's ARM source? 'The first explanation is architectural disdain,' writes Gassée. 'Intel sees "no future for ARM"; it's a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors.' Next is pride. Intel would have to accept Apple's design and 'pour' it into silicon — it would become a lowly 'merchant foundry.' Intel knows how to design and manufacture standard parts, but it has little experience manufacturing other people's custom designs or pricing them. But the most likely answer to the Why-Not-Intel question is money. Intel meticulously tunes the price points for its processors to generate the revenue that will fund development. Intel's published prices range from a 'low' $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Compare this to iSuppli's estimate for the cost of the A6 processor: $17.50. Even if more A6 chips could be produced per wafer — an unproven assumption — Intel's revenue per A6 wafer start would be much lower than with its x86 microprocessors. In Intel's perception of reality, this would destroy the business model. 'For all of Intel's semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they're inherently more complicated than legacy-free ARM devices; they require more transistors, more silicon. Intel will argue, rightly, that they'll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?'"
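The revenue-per-wafer point is easy to make concrete. In the back-of-the-envelope sketch below, only the two selling prices come from the summary ($117 for the cheapest quoted Core i3, $17.50 for the A6); the dies-per-wafer counts are invented round numbers for illustration only, with the A6 generously assumed to yield twice as many dies per wafer. Even under that assumption, the A6 wafer brings in roughly a third of the revenue of the x86 wafer.

```cpp
// Illustrative arithmetic only: the prices are the ones quoted in the summary,
// the dies-per-wafer counts are hypothetical round numbers (real yields depend
// on die size and process), not data from Intel or Apple.
#include <cstdio>

int main() {
    const double core_i3_price = 117.00;     // cheapest quoted x86 part
    const double a6_price      = 17.50;      // iSuppli's estimated A6 cost
    const int    i3_dies_per_wafer = 200;    // hypothetical
    const int    a6_dies_per_wafer = 400;    // hypothetical: twice as many A6 dies fit

    std::printf("x86 wafer revenue: $%.0f\n", core_i3_price * i3_dies_per_wafer);  // $23400
    std::printf("A6  wafer revenue: $%.0f\n", a6_price * a6_dies_per_wafer);       // $7000
    return 0;
}
```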

Posted by Soulskill on Monday October 22, 2012 @03:45PM from the not-as-dangerous-as-my-toaster-oven dept.

An anonymous reader sends this quote from Wired:
"Affordable 3-D printers and CNC mills are popping up everywhere, opening up new worlds of production to wide ranges of designers. However, one major tool still hasn’t received a DIY overhaul: the laser cutter. Maybe people are sensitive because Goldfinger tried to cut James Bond in half with one, but all that changes now with Patrick Hood-Daniel’s new Kickstarter, 'Build Your Own Laser Cutter.' ... A 40-watt laser tube and power supply means it can cut a variety of materials: wood, plastic, fabric, and paper. ... There is one major red flag, however. The machine’s frame is built from of Medium Density Overlay (MDO) — a type of plywood. Hood-Daniels says this is a feature, making the blackTooth less sensitive to thermal distortion and inaccuracy than a metal frame, but it also creates a serious, fire-breathing concern. ... When asked for comment, Hood-Daniel says 'Initially, I had the same thoughts as to the precarious use of wood for the structure, but even with long burns to the structure which were made on accident when starting a run, there was no ignition.'"

Posted by timothy on Monday October 22, 2012 @01:30PM from the not-after-what-you-said dept.

An anonymous reader writes "Samsung has decided to terminate an ongoing contract with Apple to supply LCD panels for use in Apple's growing range of devices. That means, come next year, there will be no Samsung panels used across the iPad, iPod, iPhone, and Mac lines. The reason seems to be twofold. On the one hand, Apple has been working hard to secure supplies from other manufacturers and thereby decrease its reliance on Samsung. On the other, Apple is well-known for demanding and pushing for lower pricing, meaning it just doesn't make business sense anymore for Samsung to keep supplying Apple with displays."

Posted by timothy on Monday October 22, 2012 @10:08AM from the layers-of-hypotheticals dept.

Nerval's Lobster writes "If the Apple rumor mill proves correct, the unveiling of the iPad Mini this week could mean sayonara for the iPad 2. At least, that's the prediction of Evercore Partners analyst Rob Cihra, who wrote in a recent note to investors that he believes Apple will remove the iPad 2 from its lineup to make room for a smaller tablet. AppleInsider excerpted parts of Cihra's note on Oct. 19. Of course, that's just one analyst speculating about the future plans of a company known for playing things close to the proverbial vest: Apple's Oct. 23 event in California could feature all sorts of surprises. So what do we know about the iPad Mini? First, that it might not be called the iPad Mini — that's a moniker dreamed up by the press. Second, a cheaper and smaller iPad could impact the market for e-readers and 'price-sensitive users,' according to J.P. Morgan analyst Mark Moskowitz, which in turn could mean a challenging future for Amazon, Google, and other IT vendors marketing cheaper tablets. Third, the media—driven by unnamed sources and blurry spy photos—seems to have collectively settled on a 7.85-inch screen without a high-resolution Retina Display."

Posted by timothy on Monday October 22, 2012 @09:52AM from the remarkably-cheap-from-historical-perspective dept.

alphadogg writes "Motorola Solutions has unveiled a head-mounted, voice-controlled computer that's targeted at the military and other industries where workers need hands-free access to information. Called the HC1, the device runs on an ARM processor and has an optional camera to send back real-time video over a wireless network. Unlike Google's Project Glass, though, the HC1 is aimed at the enterprise market, with a price tag of $4,000-$5,000 per unit. Areas the company has been experimenting with include 'high-end repair markets,' such as aircraft engines, said Paul Steinberg, CTO of Motorola Solutions (which is the part of Motorola that Google did not acquire). 'Emergency medical personnel at trauma centers might be looking at this too.' The HC1 will augment what users see by providing additional data, he said. Multiple units could be networked together and share information. Video here."

Posted by timothy on Sunday October 21, 2012 @06:40PM from the very-cute dept.

Google's new ARM-powered Chromebook isn't a lot of things: it isn't a full-fledged laptop, it's not a tablet (it doesn't even have a touch screen), and by design it's not very good as a stand-alone device. Eric Lai at ZDNet, though, thinks Chromebooks are (with the price drop that accompanies the newest version) a good fit for business customers, at least "for white-collar employees and other workers who rarely stray away from their corporate campus and its Wi-Fi network." Lai lists some interesting large-scale Chromebook rollouts, including 19,000 of them in a South Carolina school district. Schools probably especially like the control that ChromeOS gives them over the laptops they administer. For those who'd like a more conventional but still lightweight ARM laptop, I wonder how quickly the ARM variant of Ubuntu will land on the new version. (Looks like I'm not the only one to leap to that thought.)

Posted by timothy on Sunday October 21, 2012 @01:58PM from the indifference-mostly dept.

acer123 writes "Lately I have replaced several home wireless routers because their signal strength had degraded. These devices, when new (2+ years ago), would cover an entire house. Over the years, the strength seems to decrease to the point where it might only cover one or two rooms. Of the three that I have replaced for friends, I have not found a common brand, age, etc. It just seems that after time, the signal strength decreases. I know that routers are cheap and easy to replace, but I'm curious what actually causes this. I would have assumed that the components would either work or not work; we would either have a full signal or no signal. I am not an electrical engineer and I can't find the answer online, so I'm reaching out to you. Can someone explain how a transmitter can slowly go bad?"

Posted by timothy on Sunday October 21, 2012 @01:00PM from the put-it-next-to-the-pi dept.

mikejuk writes "After six years in the making, the Arduino Due is finally becoming available and, with a price tag of $49, is bound to give a boost to the platform. The Due, which means 'two' in Italian and is pronounced 'doo-eh,' replaces the 8-bit, 16MHz Uno with a 32-bit, 84MHz processor board that also has a range of new features — more memory, a USB port that allows it to pretend to be a mouse or a keyboard, say, 54 I/O pins, and so on — but what lets you do more with it is its speed and power. The heart of the new Arduino Due is the Atmel SAM3X8E, an ARM Cortex-M3-based processor, which gives it a huge boost in ADC performance, opening up possibilities for designers. The theoretical sampling rate has gone from the 15 ksps (kilosamples per second) of the existing boards (the Arduino Uno, Leonardo, and Mega 2560) to a whopping 1,000 ksps. What this all means is that the Due can be used for much more sophisticated applications. It can even play back WAV files without any help. Look out for the Due in projects that once would have needed something more like a desktop machine."
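As a rough illustration of what the faster ADC means in practice, here is a minimal, hypothetical Due sketch (the pin choice and sample count are arbitrary) that measures analogRead() throughput. Note that the stock analogRead() path does not hit the headline 1,000 ksps — that requires programming the SAM3X ADC registers directly — but a sketch like this gives a quick feel for the throughput you actually get.

```cpp
// Hypothetical benchmark sketch: time a burst of analogRead() calls and
// report the effective sample rate over the serial port.
const int SENSOR_PIN = A0;       // any free analog input
const int N_SAMPLES  = 10000;

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);      // Due-specific: the SAM3X ADC is 12-bit
}

void loop() {
  unsigned long start = micros();
  long sum = 0;
  for (int i = 0; i < N_SAMPLES; i++) {
    sum += analogRead(SENSOR_PIN);
  }
  unsigned long elapsed = micros() - start;

  Serial.print("effective samples/sec: ");
  Serial.println(N_SAMPLES * 1000000.0 / elapsed);
  Serial.print("mean raw reading: ");
  Serial.println(sum / (float)N_SAMPLES);
  delay(1000);
}
```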

Posted by timothy on Saturday October 20, 2012 @04:55PM from the champing-at-the-bit dept.

colinneagle writes "It's a darned shame, but the writing is on the wall for AMD. The ATI graphics business is the only thing keeping it afloat right now as sales shrivel up and the company faces yet another round of staffing cuts. You can only cut so many times before there's no one left to innovate you out of the mess you're in. Qualcomm, on the other hand, dominates the mobile processor space, and it has the chips to back it up. The Snapdragon line of ARM-based processors alone is found in a ridiculous number of prominent devices, including the Samsung Galaxy S II and S III, the Nokia Lumia 900 and 920, the Asus Transformer Pad Infinity, and the Samsung Galaxy Note. Mind you, Samsung is also in the ARM processor business, yet it is licensing Qualcomm's parts. That's quite a statement."

Posted by timothy on Saturday October 20, 2012 @12:31PM from the lots-of-misters dept.

1sockchuck writes "As Google showed the world its data centers this week, it disclosed one of its best-kept secrets: how it cools its custom servers in high-density racks. All the magic happens in enclosed hot aisles, including supercomputer-style steel tubing that transports water — sometimes within inches of the servers. How many of those servers are there? Google has deployed at least 1 million servers, according to Wired, which got a look inside the company's North Carolina data center. The disclosures accompany a gallery of striking photos by architecture photographer Connie Zhou, who discusses the experience and her approach to the unique assignment."

Posted by timothy on Saturday October 20, 2012 @09:41AM from the whole-shebang dept.

An anonymous reader writes "ACM Queue interviews Cambridge researcher (and FreeBSD developer) Robert Watson on why processor designs need to change in order to better support security features like Capsicum — and how they change all the time (RISC, GPUs, etc.). He also talks about the challenge of building a research team at Cambridge that could actually work with all levels of the stack: CPU design, operating systems, compilers, applications, and formal methods. The DARPA-sponsored SRI and Cambridge CTSRD project is building a new open source processor that can support orders of magnitude greater sandboxing than current designs."
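For readers who haven't run into Capsicum, here is a minimal sketch of what its userspace side looks like on FreeBSD; it's an illustration under stated assumptions, not CTSRD code (the header is <sys/capsicum.h> on recent FreeBSD releases, <sys/capability.h> on the versions current when this interview ran, and the file path is just an example). The process opens a descriptor, restricts it to reading, then enters capability mode, after which it can no longer acquire new global resources; the CTSRD question is what hardware support would make this kind of compartmentalization cheap enough to apply far more finely.

```cpp
// Minimal Capsicum illustration (FreeBSD).  Open a file, limit the descriptor
// to CAP_READ, then enter capability mode; from that point the process can use
// the descriptors it already holds but cannot open new files or sockets.
#include <sys/capsicum.h>   // <sys/capability.h> on older FreeBSD releases
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fd = open("/etc/motd", O_RDONLY);            // example input file
    if (fd < 0) { perror("open"); return 1; }

    cap_rights_t rights;
    cap_rights_init(&rights, CAP_READ);              // this descriptor: read only
    if (cap_rights_limit(fd, &rights) < 0) { perror("cap_rights_limit"); return 1; }

    if (cap_enter() < 0) { perror("cap_enter"); return 1; }   // enter the sandbox

    char buf[128];
    ssize_t n = read(fd, buf, sizeof buf);           // still permitted
    std::printf("read %zd bytes inside the sandbox\n", n);
    return 0;
}
```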

Posted by timothy on Saturday October 20, 2012 @01:50AM from the sainthood-nomination-time dept.

another random user writes "Stanford Ovshinsky, a self-taught American physicist who designed the battery now used in hybrid cars, has died from prostate cancer at age 89. The electronics field of ovonics was named after Mr. Ovshinsky, who owned over 200 patents and has been described as a '[Thomas] Edison of our age.' He introduced the idea of 'glass transistors' in 1968, which paved the way for modern flat-screen monitors."

Posted by Soulskill on Friday October 19, 2012 @10:30AM from the which-automakers-will-promptly-ignore dept.

SchrodingerZ writes "The Society of Automotive Engineers (SAE), an international standards organization, has unveiled what is to become the standard for electric car charging. In today's market there are hundreds of different methods and plugs for charging a variety of different cars; now a single multi-use plug has been announced as the world standard. Called the J1772, it 'has two charging plugs incorporated into a single design and is said to reduce charging times from as long as eight hours to as little as 20 minutes.' The cumulative work of over 190 'global experts,' the plug can handle both AC and DC current for charging. The plug also sets a new standard for safety, including 'its ability to be safely used in all weather conditions, and the fact that its connections are never live unless commanded by the car during charging.' The J1772 beat out its Japanese competitor, CHAdeMO, which is used as an option on the Nissan Leaf."


Slashdot Poll

Maximum Items You've Powered From a Single Outlet

1-2: Better safe than sorry
3-4: Power strips are OK, right?
5-8: Make that two power strips
9-16: Only a little smoke coming out
>16: Waiting for the big bang
All my stuff runs on batteries, you insensitive clod