
I live on the planet where I actually USE an Apple product daily. My wife's MacBook Pro is running fine; my new one is about 95%. Sure, I get the spinning beach ball, but I know it's because I put a third-party hard disk in it, and I'll deal with my own decision because everything else is PHENOMENAL! :)

Since when does Apple make hard disks? The hard disks in Apple laptops and Apple desktops that I have seen are made by the same people that Dell, HP, and everyone else use (I have seen Toshiba, WD, Seagate). The only difference is the hard drive has a little apple printed on the label. The hard drive specs are the same as the non-Apple-labeled ones. Apple has to have some way to see where the bottleneck is that is causing the beach ball. Saying it is the hard disk is jumping the gun a bit. Unless you put i

In the very first edition, if you had a guest user, and you used it, you would return to the administrator account and find all the data gone. Bad. Fixed in 10.6.1. We're now in 10.6.2. But some people just can't move on.

Leaving aside the jokes, has anyone else noticed that "the noughties" as the supposed name for this decade only seems to have cropped up -- or at least been "standardised" on -- in the past year or so?

For most of it, there didn't seem to be any strong name, though IIRC "2000s" was possibly the most common. Of course, while that name may have been fine when we were within the first ten years of the millennium (*), it's possibly less precise once it has two potential meanings. Though it didn't stop the "1900s" be

Brit here. Parallel with "Cool Britannia" (remember that?) I saw a lot of use in the press of the "Naughty Nineties" (modelled on the "Swinging Sixties"). So if the astonishingly imaginative trend continues, I imagine the next decade will be christened the Naughteens.

Actually, he has it right. Our modern, Western notion of a calendar is marred by the fact that the Romans had no concept of zero until the conquest of Spain and the ensuing interaction with the Moorish people who lived there. Thus, we start counting dates with 1, not zero. Therefore, the '60s is the decade beginning immediately after the end of year xx60, but a person "in their 60s" has completed 59 years of life and not 10 more.

In our Christian era calendars you do not find a year zero. To our modern, mathematically educated minds that would have been the year before Jesus of Nazareth was 12 months old.

Of course, our calendars, while allegedly based on the birth date of this man Jesus, are flawed by many other issues. Among these are:

1) We don't actually have agreement about the precise year of Jesus' birth.
2) The 25-December customary date is a fabrication. Jesus was most likely born in the spring, based on accounts of what was happening at the time.
3) Our calendar system has been changed a few times over the past two millennia.

Don't get too hung up on all that. Most of the people I know get confused when they celebrate their Xth birthday and I tell them that I hope their newly begun (X+1)th year is as successful as the last. I literally went back and forth with one person for over an hour on her birthday this year, with her repeatedly insisting she had just turned X and not understanding why that means the year she's in now is her (X+1)th.
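The birthday arithmetic is just counting from the other end; a trivial sketch:

```python
def current_year_of_life(age_just_turned):
    """If you have just celebrated your Nth birthday, you have completed
    N full years and are now living through year N + 1 of your life."""
    return age_just_turned + 1

print(current_year_of_life(30))  # 31
```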

I've all but conceded defeat on the millennium issue. I'll never, of course, admit to having been w

You aren't a coder, are you? If you are, I envision many off-by-one errors in your work.

Surely a coder would have to be more mindful of the correct definitions of things like a decade. Otherwise, moving between languages with 0-based and 1-based arrays would lead to constant errors.

Um, no, there was no year zero. The year before 1 AD was 1 BC. The first decade ran from 1 AD to 10 AD, the first century ran from 1 AD to 100 AD. The 20th century ran from 1901 to 2000.
The "90's" and the last decade of the 20th century are two different things. "The 90's" is 1990–1999; the last decade of the 20th century is 1991–2000.
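The boundary arithmetic is easy to state in code (a minimal sketch; `century_of` assumes the no-year-zero convention described above):

```python
def century_of(year):
    """Century containing an AD year: with no year zero,
    century N spans years 100*(N-1)+1 through 100*N."""
    return (year + 99) // 100

def in_the_nineties(year):
    """'The 90s' in the colloquial sense: tens digit is 9."""
    return 1990 <= year <= 1999

# 2000 is in the 20th century but not in "the 90s";
# 1991-2000 is that century's last decade.
print(century_of(2000), century_of(2001))  # 20 21
print(in_the_nineties(2000))               # False
```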

The 360, for its inexcusable failure rate. Then, even as Microsoft's competitors were constantly revising their models and offering updates, Microsoft declared it would not create a version two or revise its hardware.

Then, while Xbox 360s were new and failing in droves, Microsoft not only decided the old model would no longer be supported with new products, they also recalled as much existing stock of the old model as they could and did their best to make it go away. Sort of like they wanted to do with XP when Vista came out.

Let's examine the data behind this, shall we? They surveyed 5000 people, which is just 0.0178% of the total units sold, so statistically it's a worthless sample size. They also don't give any clue about how they selected these 5000 people; for all we know, they picked people with rabid MS hate.

I'd say the only thing lame here is your claim of a 54% failure rate; it just doesn't pass a sanity check.

"They surveyed 5000 people, which is just 0.0178% of the total units sold, so statistically it's a worthless sample size"

The statistical power [wikipedia.org] of a survey does not depend on population size; a sample of 5000 is more than sufficient to get a very good estimate of the real failure rate (assuming the real failure rate is not extremely small).

However, a failure rate is meaningless without considering the length of time and the conditions involved. And, as you imply, the sample must be random; self-selecting readers
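For what it's worth, the grandparent's percentage-of-population argument can be checked against the standard margin-of-error formula for a sampled proportion (a sketch; it assumes a simple random sample and uses p = 0.5, the worst case for variance):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated
    from n random samples. Note the population size never appears."""
    return z * math.sqrt(p * (1 - p) / n)

# 5000 respondents pin the rate down to within about 1.4 points,
# no matter how many total units were sold.
print(round(margin_of_error(5000), 4))  # 0.0139
```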

The Xbox had a failure rate of between 3% and 5%, in line with industry norms (MS's claim). While that's not a stellar performance, it's nothing special. Typically, when you dig into claims of 50% failure rates, they are either online polls or of limited sample size (in other words, fucking worthless).

Video gaming in general could have used more prominence in TFA. After all, it's undoubtedly a part of the tech sector. Thinking of 10 examples off the top of my head, in no particular order...

- The Red Ring of Death: as you say, it should absolutely have been in there. Cost-cutting decisions led to major customer frustration. The issue was then compounded by lies, obfuscation and, once the problem was acknowledged, a painfully slow response.

- The Gamecube: everything about it. A nasty, tacky piece of junk with

I'm a little surprised that didn't make the list. It still irritates me, every time I look down at the PS2 I still have hooked up next to my PS3. Looking at the eBay listings for a $400 used 60 Gig console next to a $300 shiny new 80 Gig console just reinforces it.

When clocks struck midnight on January 1st and the dreaded Y2K bug turned out to be nothing but a mild irritant, it proved once again that the experts often don’t know what the heck they’re talking about.

No. The experts were the ones working many, many hours in the preceding years, fixing and updating things so that when the clock did turn, the problems were -- in the main -- no longer present. A job damned well done; the people who fixed it should be praised, not ridiculed.

The people who don't know what the heck they're talking about are the media types like this guy, who are quick to jump on catastrophic failures but rarely (if ever) give due praise when things are planned and done right. "Everything's fine" doesn't make good headlines for these people.

Mod parent up to 6.
#88: Also note that Google's multiple outages this year (and last?) don't get a mention.
#89: No mention of Windows Mobile 6.5 and how MS threw away its last chance of ever competing with the droid/iphone.
#90: TFA

I should've explained better -- the "improvements" in Windows Mobile 6.5 were hardly improvements at all. Someone put it well when they said that it was "as useful as a poke in the eye."

Yes, Android feels like a beta, but then so do all Google products.

Even though WinMo is for power users (I agree), the problem with the iPhone's popularity is that power users already want it in their enterprise environment and fervently believe that it *IS* an enterprise phone.

To be honest, I also prefer WM6.1 to WM6.5 (except for the Bluetooth stack, which is somewhat better). But still, even WM6.5 is a perfectly usable multitasking operating system for portable computers (and I have used my Windows Mobile devices as portable PCs since 2003).

Also, the iPhone is not that popular in Europe, so you often see the following messages on the smartphone boards: "I am fed up with Windows Mobile, I have bought an iPhone now" and, a couple of months later: "I am fed up with iPhone, I am going back to Window

To this (and the other replies here): he's not referring to people like you. Don't think that for a minute. I was a Y2K fear debunker myself, and I assure you, I NEVER attacked people like you who WERE working around the clock to ensure that the transition was smooth. What I attacked -- and what he's clearly referring to -- were the outright fearmongers. "We CAN'T fix it all in time, buy beans, bullets and head for the hills!"... and... "embedded systems are the great unknown, we're all going to die, so bu

I think I heard of one embedded system that broke due to Y2K, but I've seen many more over the years that got confused over leap years. The year 2000 was especially good for that because that wasn't a leap year even though the common, oversimplified, every-4-year rule says it should have been.

Actually, it was a leap year, even though the common, not-quite-as-oversimplified-but-still-too-simple every-4-year-except-for-every-100-year rule says it shouldn't have been. The really dumb systems with the every-4-year rule lucked out.
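The full Gregorian rule the thread is circling around, as a quick sketch:

```python
def is_leap(year):
    """Gregorian calendar: every 4th year is a leap year, except
    century years, except century years divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 was a leap year via the 400-year exception; 1900 and 2100 are not.
print(is_leap(2000), is_leap(1900), is_leap(2100))  # True False False
```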

Now we can all wait for the end of civilization in February 2100 as every single embedded system crashes....

I was working in a bank at that time. If we hadn't fixed our systems then come 1/1/2000 every customer in our business area would have found all their transactions failed as the system would have thought they'd expired 100 years ago!

Could you please point out a single example where a catastrophe was avoided due to fixing the code handling year changes?

How would we know? It is not as if people are going to publicise the bugs that they fix. "Hey everyone, we almost nuked Poland!"

Anyway, the worst of the hype that went around did not come from the experts. Nobody who knew what they were talking about would have said that there would be starvation in the streets. That said, there were definitely some people who tried to cash in on the paranoia. We had some consultant come in and try to sell us software to fix our systems because they were not Y2K ready. Sure enough, when the year changed the computers wrapped back to 1981. However, resetting them to the correct year worked fine.

But just because some unscrupulous people jumped on the bandwagon doesn't mean to say that there were not real bugs to fix. The main software that we wrote had a Y2K bug in it, but we fixed it back in 1997 without fanfare. Just because you never heard of it being fixed doesn't mean to say that it was a made up bug.

I had an uncle working for MetLife in the early 90s; had he and his team not put in many months of work back then, the Y2K bug would have left millions of insurance claims unpayable, from business insurance to health plans. On any given day, thousands upon thousands of insurance transactions are processed automatically by a computer, which would have rejected everything as invalid because the claim dates would have appeared to be before the policies were opened.

People who think Y2K was not a big deal were either children when the problem was solved or never really understood the problem to begin with. Y2K38 is the next big date/time bug to deal with; many people here on /. will probably wind up working on fixing the problem long before it causes catastrophes, and we will be pretty old on January 19, 2038. I am also pretty sure that people like you will be saying, "There was never really a problem" for many years afterward, and someone like me will probably have to reply with a story about a relative who solved the problem before the press started running stories about the end of the world.
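The 2038 problem is the signed 32-bit `time_t` running out of seconds: 2^31 - 1 seconds after the Unix epoch lands at 03:14:07 UTC on January 19, 2038, after which the counter wraps negative. A quick illustration:

```python
import datetime

EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
INT32_MAX = 2**31 - 1  # largest second count a signed 32-bit time_t holds

# The last representable moment for 32-bit Unix time:
last = EPOCH + datetime.timedelta(seconds=INT32_MAX)
print(last)  # 2038-01-19 03:14:07+00:00

# One second later a 32-bit counter wraps to -2**31: back to 1901.
wrapped = EPOCH + datetime.timedelta(seconds=-2**31)
print(wrapped)  # 1901-12-13 20:45:52+00:00
```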

Taken individually and in isolation, it is true that the problem with many such systems is trivial. However, many of these trivial systems feed into or from other trivial systems, and this makes the system, viewed as a whole, rather complex. It is extremely difficult to predict the outcome of even a simple-looking system (see Conway's Game of Life, for example), so there was no telling what would or

What this chapter completely misses is that the emergency spending on Y2K fixes was just the tail end of a long effort by companies to update their software. Journalists were not talking about the Y2K bug in 1991, but plenty of programmers were already aware of it and already working on solutions. By the time the story got interesting -- as the year 2000 approached -- the problem had been mostly solved, without any prodding from the media or pushing from the president. The situation was compounded by the fact that most people, including the journalists covering the story, had no understanding of how computers stored dates, and by the fact that companies whose products had nothing to do with the Y2K bug were advertising their software as "Y2K compliant," so everyone wound up thinking there was impending doom.

For the record, the Y2K bug did actually threaten critical computer systems, many of which were mainframes installed decades earlier, but those systems were fixed long before the story ever ran on the news.

You know, if that number was smaller, I might actually click through & read the article. But 87? Really? A number that large makes me think that you just wrote down every single lame thing you could think of & didn't edit at all.

Personally, I'd prefer a much shorter list which someone made some effort to pare down to the moments that were genuinely the lamest.

No doubt. I just have a pet peeve with huge lists like this. A lot of the value in coming up with a list is how & why you decide to either include a particular entry or leave it off. The longer the list gets, the more it appears that the author didn't put any hard thought into it, and the less value it has (for me anyway).

From TFA: "When clocks struck midnight on January 1st and the dreaded Y2K bug turned out to be nothing but a mild irritant, it proved once again that the experts often don't know what the heck they're talking about."

Well, that kinda hurts.

I was responsible for a newspaper ordering system that definitely would have stopped processing orders in 2000. Cost quite a number of man-hours. The majority of the Y2K issues my team had to solve weren't for the year 2000 but for passing into the year 1999, because many ordering systems had stupid (year+1) counters internally. It was a very stressful period, and I'm very happy it went the way it did, without major disasters.

The experts that didn't (and don't) know what they are talking about are the ones thinking you can upper-limit a year counter at 1999 (or 2039).
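A toy version of the (year+1) counter bug described above (the field layout is hypothetical, not from any real ordering system):

```python
def next_year_buggy(yy):
    """Two-digit year field: 98 -> 99 is fine, but 99 + 1 wraps to 00,
    so 'next year' suddenly sorts before every existing date."""
    return (yy + 1) % 100

def next_year_fixed(year):
    """The fix: carry the full four-digit year."""
    return year + 1

print(next_year_buggy(98), next_year_buggy(99))  # 99 0
print(next_year_fixed(1999))                     # 2000
```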

> All these arsehats who go on about the Y2K being a load of scare mongering paranoia are the ones who don't have a clue about just how much work went on in 1999 trying to sort the issues out!

Hear hear!

I worked at a large manufacturer during 1999 and was tasked with the Y2K stuff. This basically included six months' worth of work fixing the stuff that would have an issue, followed by six months of sending replies to customers who were told they had to be concerned by the media and the industry that rose u

Newspaper? What is that? Wait a sec, I think I remember those. Newspapers were those paper things that had funnies in them. Yeah. Good thing you saved that newspaper thing so that it could go bankrupt a year or two later... that's practically the same thing as all the hysteria that centered on Y2K.:eyes rolling:

...among the /. comments. Despite Apple's blunders in this list being few and not really noteworthy, that naturally does not discourage the "grannies of /." from leaping out from under their stones with their tag-sticks.

Too bad they did not mention the MacBook Air...
I don't have any problems with the MBA, but saying you don't need a DVD drive to watch DVDs because you can just rent them from iTunes -- wow, that was good. Especially with those exorbitant prices.

If you see a lot of Apple hate among these comments then why didn't you post your message as a reply to one of them? Oh, maybe because there isn't a lot of Apple hate here. This just goes to prove what we have all been saying about you: you're paranoid!

KDE was flying high with its well-regarded 3.x version; then its developers disappeared with lustrous promises of how great KDE 4 would be, and emerged to ship a completely unfinished product. Things are better with later KDE 4 releases, but KDE 4.0, wow, you were rough. Meanwhile, KDevelop 4 still doesn't work, and has been eclipsed by, well, Eclipse.

83. And Taco Bell was never a taco company.
In an interview with the New York Times conducted in the wake of Yahoo’s decision to outsource its search features to Microsoft, Yahoo boss Carol Bartz says that Yahoo has “never been a search company.”

Carol Bartz is correct -- Yahoo started out as a link collection, then a hierarchical directory (basically like http://www.dmoz.org/ [dmoz.org]), then added a lot of portal services (including email, stock quotes, etc.).

The thing they never had, until 2004, was a search engine; Yahoo put other companies' searches on their site (including Inktomi for a while, and then Google up until 2004). Doing that with Bing is just returning to what they've done historically.

I don't know much about TFA's author, but I definitely got the impression from reading his damn multi-page blog post (thanks for posting another one of those to the front page, kdawson) that he didn't actually know much about the technologies he was ridiculing. Yahoo was the most obvious example - anyone who started using the web when commercializing it was a novel idea will remember finding web pages not with an indexing search engine but with Yahoo's topical hierarchy of links. It was well-organized and

I'm surprised that Google sending a C&D letter to CyanogenMod didn't make the list. Google had been trying to market its cell phone OS, Android, as an open platform that welcomed innovation and contributions. Then they decided to threaten an immensely popular third-party ROM that did wonders for Android's performance.

The official distribution at the time had many issues. Performance degraded the longer you went without a reboot. You couldn't install apps on SD cards, only the tiny internal storage space

Meanwhile, Microsoft actually has a good reputation for turning a blind eye to people making ROMs for Windows Mobile.

Turning a blind eye to piracy and other stuff you'd expect them to fight against is a standard Microsoft tactic in markets they want to take over. In their mind, as long as you're using a Microsoft product, even if you stole it, that's better than you using a competitor's product.

Once they are the de facto standard in a given market, that's when they begin finishing off their weakened competitors and turning the thumbscrews on their users. That's why you could pass around Windows install keys for years with impunity, and then XP got activation. Once the activation-free corporate XP keys got out, they had to turn the screws some more, and now even corporate copies of Vista and, I presume, 7 require activation of a sort. People might find ways around that, but the point is Microsoft is making it more and more difficult to avoid paying them for Windows now that they've sewn up the OS market.

Of course, I could have made this post a lot shorter by comparing them to drug dealers: "First one's free," then once you're hooked, up goes the price.

42. They should have stuck with the "Windows Vista Inadequate" ones.
In the face of Vista's delays, Microsoft encourages PC manufacturers to slap "Windows Vista Capable" stickers on XP machines. The stickers turn out only to mean that the computer can run the lower-end versions of Vista, and don't guarantee it can use the new OS's signature Aero interface. Legal hijinks ensue, and internal Microsoft documents suggest the company knew it had a problem on its hands even as it was egging on consumers.

The issue was that MS had told hardware makers, long in advance, what the minimum spec for Vista would be, so they designed their PCs around it. Meeting that minimum was going to be more costly if they didn't use Intel's GPU. However, at the last minute MS changed its mind under coaxing from Intel, which would otherwise have been left with a large inventory that wasn't compatible. This infuriated hardware makers, as their plans were suddenly changed. Internally, some MS employees knew this was a huge mistake, but no one with any authority did anything about it.

I for one want to celebrate the anniversary of the Y2K bug's passing by thanking all the people whose hard work kept it from being far, far worse than the few mild annoyances we experienced. The word I saw was that some gas pumps were locked up, and it could have been far worse if a whole lot of coders and analysts hadn't spent a ton of time poring over reams of old code and fixing problems. Double thanks to all the Grampa Geeks who came out of retirement to show the kids how COBOL was done and why it's still so important even ten years later. A nod goes even to the suits at the top who looked beyond next quarter's numbers and funded the stitch in time that saved nine.

from the article:
39. Unpronounceable but catchy.
At the Consumer Electronics Show, Intel gets Tom Hanks, Morgan Freeman, and Danny DeVito to help it roll out Viiv, a new platform for media-savvy home PCs. Consumers have trouble figuring out what it is (and how to say it); PC vendors don’t jump on the bandwagon with great abandon. By 2007, the press is referring to it in the past tense.
I've long suspected that unpronounceable names (Merkur, from Ford) are really bad for a product.