Maybe someone can clear this up for me. Apple drops out of EPEAT because some of their more recent devices can't be recycled. They try to deflect attention by claiming that it doesn't matter if they can't be recycled because they made them using green manufacturing processes. This strikes me as someone saying "it doesn't matter if the brakes don't work because we have a hybrid engine". Secondly, several people have pointed out that you can give them back to Apple to "recycle", but no one ever defines what that means. Every recycling professional who has weighed in has stated that you can't recycle aluminum that is bonded to glass. Period. Apple does not have, nor have they claimed to have, some magic process that allows them to do what no one else can do in this respect. So what does giving the devices to Apple to "recycle" really mean? Does it mean that Apple just dumps them along the roadside when no one is looking, in a cynical "out of sight, out of mind" way? Do they "recycle" them by shipping them to poor third-world countries and letting them throw them away? I've yet to see this explained. There is also no economical way to recycle batteries and plastic that have been bonded together. I'd like to hear Apple address this clearly and without all of the fancy dancing and deflection. To say "we can't meet the recycling requirements set by EPEAT, but that doesn't matter because we did other green things" is like saying "filling the landfills with polluting junk is OK as long as we were careful when we made the junk in the first place". It's avoiding the issue in much the same way the Wizard screams at Dorothy not to look behind the curtain. Nothing to see here. Move along. You're HOLDING IT WRONG!

Learning is a collaboration. The student pays to be taught and is repaid with a job when they graduate. The teacher is paid to teach. When either side fails, the student fails. It's not one or the other. There are way too many students who think that tuition is payment for a passing grade whether they do the work or not. Life does not work that way.

I completely agree with this. Attrition is not an unfortunate side effect. It is a planned result. When I started in the Engineering curriculum at the University of Illinois, one of my first Engineering classes was a huge auditorium full of hundreds of new students. The Dean of the school stood before the class and announced that 20 out of every 100 people there would fail. He also made it clear that they would not fail because they were not up to the task; they would fail because the grade curve would be adjusted to make sure that 20% of the students failed. The added pressure was supposed to make us work harder, I suppose. What it achieved was to convince a lot of people to go straight to their advisers and change majors. I had a similar experience in math courses. Professors we never saw taught classes by videotape, monitored by TAs who didn't give a shit about whether or not you learned the material. It had a lot less to do with learning and a lot more to do with "paying your dues". First-year classes were more of a hazing experience than a learning experience.

Actually, while 60s-era mainframes did require significant maintenance, by the time the late 70s came around up-time was much better. I still have a late-70s mini-computer that I keep around for laughs; it routinely gets about a year and a half between reboots while running 11 users and multi-tasking for each user. As for features that come and go, the IBM 7030 had instruction pipelining and look-ahead (the ancestors of the speculative, out-of-order execution in modern CPUs; Intel's hyper-threading is a related but distinct trick) way back in the 60s. In fact it could have as many as 11 instructions in the pipeline at any time (though 4 was typical). That went away in the era of the microprocessor, not because it was a bad idea, but because it wasn't possible to implement in the primitive early Intel processors. Only after technology caught up did those techniques reappear.
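The payoff from pipelining is easy to put rough numbers on. Here's a back-of-the-envelope sketch in Python (my own illustration, assuming an ideal pipeline with no stalls or hazards; the instruction count is made up, and the 11-deep figure comes from the 7030 spec above):

```python
def sequential_cycles(n_instructions, n_stages):
    # Without pipelining, each instruction occupies the whole
    # datapath from fetch to completion before the next one starts.
    return n_instructions * n_stages

def pipelined_cycles(n_instructions, n_stages):
    # With pipelining, the first instruction takes n_stages cycles
    # to flow through; after that, one instruction completes per cycle.
    return n_stages + (n_instructions - 1)

# With an 11-deep pipeline and 100 instructions:
#   sequential: 100 * 11 = 1100 cycles
#   pipelined:  11 + 99  =  110 cycles, roughly a 10x throughput win
print(sequential_cycles(100, 11), pipelined_cycles(100, 11))
```

Real-world speedups are smaller, because branches, data dependencies, and memory stalls keep the pipeline from staying full — which is exactly why the look-ahead hardware mattered.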

Didn't Star Trek have an episode similar to this? A centuries-old war fought by computers and proxy? It only ended when war was made ugly again. The future is here and parts of it scare the crap out of me.

I just received a notice from State Farm Insurance that if I allow them to collect OnStar data I "MIGHT" get a discount on my insurance. Uhhh... yeah... I'll be sure to do that. (NOT) I'm fairly certain that this is only the tip of the iceberg. How long before the car automatically calls the police when you exceed the speed limit?

dwreid writes: "This is an older article but serves to point out (yet again) how companies are granted a patent for something for which there is ample prior art. This time it's Google getting a patent for the "doodle", or anything that changes the logo for the site based on the season or holiday. I have to call BS on that one." Link to Original Source

It's been my experience that accountants are the people with the most distance between them and reality. A number of years ago I worked for a computer company that was experiencing 30%+ annual growth. The president and co-founder decided to retire, and the company moved the CFO into the president's position. Mr. Brilliant Bean Counter decided to make the company more profitable by terminating the entire sales force. Those hefty salaries and bonuses were a huge cost center. Now the balance sheet looked all profitable. Six months later the company was GONE, and he never knew why sales stopped. He actually said, "But we still have the marketing department. I don't understand." As the years have passed I've seen this kind of tunnel vision from accounting departments over and over and over. The stories are legion.

It's actually more complex than you know. The problem has many layers. First is the problem of decoding the raw bits and figuring out how they are encoded. It's not just "a pit is a one and no pit is a zero". There's more to it than that. Is it a run-length-limited code? Is it a non-return-to-zero code? What parts of the data are information and what parts are housekeeping for the file structure? Then add the error-correction algorithm. Now you have raw data, but what is that data? Is it text? Is it photos? Is it software? Let's say it was video. Now you have to figure out which of the many video formats, encodings, and compression codecs was used. Since none of them would still be in use, that won't be easy. Now you have some decoded data, but how do you display it? What is the color representation? Is it RGB? Is it CMYK? Is it something else? Is it NTSC? Is it raster or some form of vector? Is it interlaced or progressive? What is the frame size? What is the frame rate? In the end, it could take years and many units of currency to figure out what's on the disc and display it.
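To make just the first of those layers concrete: suppose our future archaeologist guesses the channel uses an NRZI-style convention, where a transition between successive samples means 1 and no transition means 0. This toy decoder is my own sketch of that one convention, not the actual CD channel code (a real CD layers EFM and CIRC error correction on top):

```python
def nrzi_decode(channel_samples):
    # NRZI-style convention: a transition between successive samples
    # decodes to 1; no transition decodes to 0.
    # The reference level before the first sample is assumed to be 0.
    decoded = []
    previous = 0
    for level in channel_samples:
        decoded.append(1 if level != previous else 0)
        previous = level
    return decoded

print(nrzi_decode([1, 1, 0, 0, 1]))  # [1, 0, 1, 0, 1]
```

Guess the convention wrong — transitions mean 0, or the reference level starts at 1 — and every layer downstream (framing, error correction, codec) decodes to garbage, which is exactly the point.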
When NASA found the lost lunar tapes they had no way to read them anymore, and that wasn't that long ago. They had to find some old tape drives in some guy's garage and rebuild them before the data could be recovered. At one time data was stored on 7-track magnetic tape. It was the most widely used standard of the time. Now, unless you can find one of those really rare old drives, the information on those tapes is just lost. Then there is that once widely used standard called the 8-inch floppy disk. And on and on...
Marks on paper last for hundreds or thousands of years. Information on electronic media lasts until Apple decides to discontinue it with no notice. When it's all said and done, this century will be the most undocumented century since the stone age.

Of course the reason for using cloud-based platforms is that they are cheap. Now you tell me that if I want them to be reliable, they're not cheap. Soooo.... why use the cloud again? Oh, that's right... it's the new buzzword.

You can warm up your car in the winter before going out to sit in an unbearably cold car. Orrrrr.... you can waste your cheating wife by putting her drugged self in the car in the garage and then taking the train downtown. Start up the car once your alibi is established and voila... suicide.
Just saying...