Blog Archives

No wonder the porn industry is slowly dying. It has to compete with Hollywood.

First, there was New York Magazine running a pictorial last February of rising-star-turned-trainwreck Lindsay Lohan recreating Marilyn Monroe's famous 1962 nude pictorial. It didn't hurt that the pictures were linked on The Drudge Report, which gets 15 million visitors a day.

This resulted in more than 40 million page views for the site, a 2,000 percent increase over the same period last year, and crashed its servers. Even non-Lohan content received between 2 million and 3 million
page views, considerably more than the site would normally get.

At least Lohan is legal. The latest Internet tidal wave involves a girl who most certainly is not. Vanity Fair is running a pictorial of Disney star and not-yet-old-enough-to-drive Miley Cyrus posing topless but covered, so only her back is exposed. While the usual sanctimonious types acted outraged and photographer Annie Leibovitz was thrown under the bus, VF made out like a bandit.

After one day of availability, the pictures drew 1.8 million unique visitors to VF.com, which normally
gets between 20,000 and 40,000 per day, and 17 million page views. Maybe I'm getting cynical, but I don't think anyone was surprised at the "controversy"; they were planning on it.

Microsoft has issued a rare security alert, something it usually avoids because an alert tells the bad guys where to go look for an exploit. When one does appear, it usually means the problem is pretty severe.

The problem is a vulnerability that could give an unauthorized user access to LocalSystem, an account not normally accessible to Windows users because it has extensive privileges within the operating system and access to pretty much the entire machine.

This affects Windows XP Professional Service Pack 2 and all
supported versions and editions of Windows Server 2003, Windows Vista,
and Windows Server 2008. Customers who allow user-provided code to run
in an authenticated context, such as within Internet Information
Services (IIS) and SQL Server, should review this advisory, since it contains workarounds.

Microsoft may issue an out-of-band patch if the problem is serious enough, or it will hold off until the next Patch Tuesday, which would be May 13.

Basic economics tells you that when there is an oversupply
of something, you cut back on production.

It took memory manufacturers a little while to figure that
out, but at least they did figure it out.

Gartner estimates that worldwide capital spending on
semiconductor equipment in 2008 will be $47.5 billion, a 19.8 percent decline
from 2007. Memory-related capital equipment spending will see a 29 percent reduction,
and DRAM in particular will drop by 47 percent.

Gartner cited a weakening in the U.S. economy and oversupply
as the reasons for the cut in spending. "The expected bursting of the DRAM
capital spending bubble has finally happened, as rampant overcapacity in that
sector drove unit prices well below cash costs for most manufacturers,"
said Klaus Rinnen, managing vice president for Gartner's semiconductor
manufacturing group.

Rinnen noted that the memory market spent more than 57
percent of total revenue on capital expansion, "a level which cannot be
supported by the anticipated lackluster revenue growth."

Much as I love seeing 2GB of memory at Fry's for $50,
they really needed to smarten up on pricing.

It's amazing how quickly things change these days, thanks to the Internet, and more important, just how dogged some blogging sites have become.

A Florida company called Psystar created a stir on the Internet with its PC white box that runs Mac OS X. It was selling the $399 mini tower under the names Open Computer and OpenPro. I had figured the company would be obliterated by Apple, but now it seems Apple won't have to lift a finger, thanks to some dogged bloggers.

The ensuing flood of customers willing to fork over credit card information to an unknown company skirting the law is alarming, but apparently the company was so inundated that its site was briefly knocked offline. Its payment processor then cut it off, citing the company's questionable legal standing.

First, the company's address seemed to change daily, as noted by the UK newspaper The Guardian. Then some people from Gizmodo actually went to an address for Psystar and found that it wasn't there, and the business at the address had no idea what Psystar was.

Seriously, if you want a Hackintosh PC, it's very easy to do on your own. Build a computer and get a copy of Leopard and a few utilities off the Internet and you're golden. You can find plenty of information with a basic Google search.

Intel's executives are remarkably disciplined and not given
to making foolish statements. Usually. But one comment coming out of the Intel
Developer Forum in Shanghai is a head-scratcher.

TGDaily reports that during a demo, Ron Fosner, an Intel
Graphics and Gaming Technologist and former video game programmer, said that
multi-core CPUs will put an end to the need for a graphics processing unit (GPU)
and that people "probably" will not need discrete graphics cards in
the future.

Fosner went on to say that computers didn't have discrete graphics
in the '80s and that CPUs are becoming powerful enough to take over that role.

You can imagine Nvidia's reaction.

"It's funny that he says discrete GPUs are dead when
they are going to build one themselves, in Larrabee," said Derek Perez, a
spokesman for Nvidia. "All the indications are that there is more need for
a GPU than a CPU. They're four cores, we're 128 cores."

The 1980s analogy doesn't quite work, he pointed out,
because back then people weren't watching HDTV video or using a graphical UI
like Vista's Aero, which failed miserably under Intel's weakest of integrated graphics
chipsets, the 915.

And Intel has yet to produce DirectX 10 video drivers for
its graphics chips more than a year after the release of Windows Vista. Intel
does a lot of things very well but it's not the first name that comes to mind
when you talk graphics.

Perez pointed out that Vista and Mac OS X both require a
GPU, a first for operating systems. People are playing HDTV video, 3D games and
3D apps like Google Earth, all of which need a GPU. "There's this global
movement toward visual computing, not basic enterprise computing. That's why GPUs
are starting to sell," he said.

Even with Havendale, the rumored answer to AMD's Fusion,
it's doubtful a CPU will ever fully be able to handle Aero, much less a
graphical beast like Crysis. So I have no idea what Intel has in mind, or
thinks it has, but it better have some big surprises up its sleeve.

(For the unfamiliar, Crysis is a first-person shooter that
has set new levels of performance pain, making it hard to play on anything but
the absolute newest, fastest graphics cards. These days, a common joke on
boards in discussing PC hardware is "But can it run Crysis?" in a nod
to its high resource demands.)

Radar Online, one of the few gossip sites worth reading, has an interesting story on a partnership between former Vanity Fair and New Yorker editor Tina Brown and media mogul Barry Diller to launch a news aggregator website. The site will have "no ideological stance" and will be edited by a former deputy managing editor of the Wall Street Journal.

Brown declined to offer specifics to Radar Online on either a launch date or what she hopes to do with the site, saying only that it would be "a new take on an aggregator website."

Well, let's hope so, because let's face it, she is way, way late to market, as they say on Madison Avenue. The Drudge Report was one of the first, if not the first, aggregators out there. Most people think of Drudge as a political creature, but back in 1995, when he first began haunting Usenet newsgroups and set up his site, he was all about entertainment news and gossip.

Rob Malda set up Slashdot in 1997, and in 1999 Drew Curtis set up FARK. Malda ran his operation out of Michigan while Curtis was based in Lexington, Ky., proving you don't need to be in the media nerve center of the world (i.e., New York) to be a success. What you need to be is useful, cool, funny or, preferably, all three.

A San Diego, Texas, attorney has dropped an ultimatum on Microsoft: settle a lawsuit for $25 million or he will call chairman Bill Gates to the witness stand.

The suit stems from a fire in the Lazo family home that was blamed on faulty wiring in the family's Xbox. Back in 2005, Microsoft had to recall more than 14 million Xbox power cords due to the risk of overheating. Apparently that wasn't enough in this case: the Lazo family home went up in flames in June 2006, causing severe burns to then-13-year-old Kayla Lazo, who eventually lost a leg.

According to the San Antonio Express-News, fire marshals were unable to determine the cause of the fire. William Tinning, an attorney hired by the family, placed the blame on the console, citing the power cord recall. Tinning has now told Microsoft that if his demand for a settlement is not met by April 18, he will try to force Bill Gates to testify in the case.

Also named in the suit are Gamestop, where the console was purchased, and Chinese power cord specialist Ji Haw Industrial.

This had better not be an April Fools' joke. DigiTimes reports that Hynix Semiconductor has decided to cut production of NAND flash in an attempt to reduce the insane glut of memory that has eroded prices, and therefore profits.

Hynix is a big player in the memory sector, a $9.1 billion company in 2007 that grew by 13.7 percent, according to Gartner. Its output reduction will account for about 5 percent of worldwide capacity, and the effects should take about two months to be felt by the industry, according to the report.

NAND flash is used in a variety of products, from iPods to solid state drives. It has been a high-growth market, but even as demand for flash products skyrockets, vendors still can't make any money because supply has outstripped even that incredible demand.

One of the reasons for the high supply has been that many of the memory vendors in Asia receive government subsidies to open new fabrication lines, and who turns down a government handout? So more fabs were coming online even as average selling prices (ASPs) were collapsing and one vendor after another warned about it.

The reduction in production is not seen as an indicator of softness in the U.S. economy, since the product is sold around the world in markets not feeling the U.S.'s pain.

You could see this one coming a mile off. Survey after survey in 2006 and 2007 showed consumers were on the fence about a high definition DVD format, waiting until there was one choice instead of two. Well, now that there is one choice, they are getting off the fence.

Home Media Magazine is reporting that sales of Blu-ray titles have gone from accounting for 2%-3% of sales to as high as 12.6% in the case of Hitman (which was a flop in theaters). The Oscar-winning No Country for Old Men
saw 9.8% of its total sales come from Blu-ray Disc its first five days
in stores, according to an analysis of Nielsen VideoScan First Alert
numbers by HMM.

Now compare that to Q4 2007, the gift-giving season. The Simpsons Movie generated just 2.8% of its total sales from Blu-ray, and Pirates of the Caribbean: At World's End saw just 3.7% of total sales come from Blu-ray. Is Hitman a better movie than POTC: AWE? Of course not (I much prefer Shoot 'Em Up). It was an ideal movie for Blu-ray's demo, however.

It's based on a videogame series, its target demo is young men with disposable income for a high-end home theater system, and the top Blu-ray player is the PlayStation 3. Action movies have traditionally done well on DVD. Back in 1998, the top-selling title on the then-fledgling DVD format was Godzilla, and we all know how badly that movie stunk.

Meanwhile, HD DVD is rapidly disappearing from Best Buy stores all over the country. For a few days, the ill-fated discs were on sale for 70% off; then they were gone. Best Buy's high-def section is now entirely Blu. Once again, a narrow window of opportunity missed. But I'll get a high-def version of The Ultimate Matrix Collection one way or another.