Sunday, December 28, 2008

The “voice of the Dodgers” across six decades, broadcaster Vin Scully will be inducted into the Broadcasting Hall of Fame at the next NAB Show.

Vin Scully, one of the most celebrated sportscasters in history, will be inducted into the NAB Broadcasting Hall of Fame during the NAB Show Radio Luncheon sponsored by ASCAP. Scheduled for Tuesday, April 21, the luncheon will also feature the annual presentation of the prestigious NAB Crystal Radio Awards.

"For nearly 60 years, Vin Scully has entertained fans with his vivid play-by-plays and passion for baseball," said John David, NAB executive VP, Radio. "We look forward to honoring his significant contributions to radio broadcasting during this year's Radio Luncheon."

Scully joined the Brooklyn Dodgers broadcast team in 1950 and followed the team when it moved to Los Angeles in 1958. As the "voice of the Dodgers" on the team’s flagship station, KABC Radio, Scully holds the distinction of being the only current broadcaster to serve 59 consecutive years with one team. In addition to his Dodger broadcasts, Scully has called play-by-play for 25 World Series and 12 All-Star Games.

Hailed as the poet laureate of baseball by “USA Today,” Scully has captivated listeners for years with his lively delivery. In 1976, Dodger fans named him the Most Memorable Personality in L.A. Dodger history.

He was also named baseball's all-time best broadcaster in Curt Smith's "Voices of Summer," and was voted the top sportscaster of the 20th century by more than 500 national members of the American Sportscasters Association (ASA). In 1982, Scully was inducted into the broadcasters’ wing of the National Baseball Hall of Fame as the Ford C. Frick Award recipient.

After years of suing thousands of people for allegedly stealing music via the Internet, the recording industry is set to drop its legal assault as it searches for more effective ways to combat online music piracy.

The decision represents an abrupt shift of strategy for the industry, which has opened legal proceedings against about 35,000 people since 2003. Critics say the legal offensive ultimately did little to stem the tide of illegally downloaded music. And it created a public-relations disaster for the industry, whose lawsuits targeted, among others, several single mothers, a dead person and a 13-year-old girl.

Sunday, December 14, 2008

Scientists are discovering new physical and mental benefits to listening to music

JULIET CHUNG (Wall Street Journal)

Researchers have found that music can affect people, animals and even plants in many ways. Now, several small-scale studies suggest some surprising benefits of listening to music, from the brain down to the blood vessels.

A team at Stanford University's School of Medicine found that listening to music might hold an adaptive evolutionary purpose. The researchers used functional magnetic resonance imaging to gauge activity in 18 people's brains as they listened to obscure 18th-century symphonies. The team found that activity in the regions of the brain associated with paying attention, making predictions and updating events peaked during the short periods of silence between movements.

Published last year in the journal Neuron, the study provides a glimpse of how the brain organizes events, says lead author Vinod Menon, and suggests that listening to music can help sharpen the ability to anticipate events and sustain focus.

Finnish researchers have found that music could help aid cognitive recovery soon after a stroke. The study, which followed 54 patients and was published in February in the journal Brain, found that verbal memory and focused attention improved significantly more in stroke patients who listened to their favorite music several hours daily than in those patients who listened to audio books or to nothing at all. Patients were randomly assigned to the music group and listened to the music for at least an hour daily, for two months, during their acute recovery phase.

Listening to your favorite music can also promote the functioning of blood vessels, according to a new study out of the University of Maryland School of Medicine. Researchers found that the diameter of the average upper-arm blood vessel expanded by 26% when subjects listened to music they had previously selected for making them feel joyful. The diameter constricted by 6% when subjects listened to music that made them feel anxious. Blood-vessel expansion indicates nitric oxide is being released, which can reduce the formation of blood clots and LDL, the so-called bad cholesterol, according to Michael Miller, the study's principal investigator and director of preventive cardiology at the medical center. The results were presented in November before the American Heart Association.

Of the 10 participants, several chose country music as their joyful listening selection and several said heavy metal made them feel anxious. But that says more about the participants than about any inherent vascular benefits of the genres themselves, says Dr. Miller. "I was listening to Hootie & the Blowfish last night and I had, I'm sure, a lot of endorphins being released," he says.

A study published in January by the Cochrane Collaboration, a London-based nonprofit that publishes reviews of health-care interventions, suggests that listening to or making music with trained therapists can help in treating depression. The group found five randomized studies that examined music therapy; four reported that depression symptoms lessened more among those who were randomly assigned to music therapy than those who received treatment that did not involve music. The fifth study reported no significant change. Further research needs to be done given the small number of credible studies in the area, the study says.

Other new studies confirm old hunches. A team at Brunel University in England found that certain music deemed motivational can enhance a recreational athlete's endurance and increase pleasure while exercising. In blind experiments on 30 participants, tracks from artists like Queen, Madonna and the Red Hot Chili Peppers increased endurance on a treadmill by up to 15%, says Costas Karageorghis, a reader in sports psychology at Brunel.

Recreational athletes might be served well by picking workout music that is up-tempo, has "bright, major harmonies" and is studded with encouraging phrases, says Mr. Karageorghis. "There's a reason Olivia Newton-John's 'Let's Get Physical' was a huge hit" for workouts, he says.

The former engineering arm of Bell Canada appears to be meeting the same fate as its American cousin did years ago. Nortel, maker of the Meridian phone systems, was one of the premier telephone suppliers in North America ten years ago.

Monday, December 8, 2008

Gee, I wish I could go down to the bank, tell them I'm an idiot, and have them hand me cash without my even beginning to make payments for two years...

GM says it "disappointed" and "betrayed" consumers
Mon Dec 8, 2008 8:40am EST

DETROIT (Reuters) - General Motors Corp on Monday unveiled an unusually frank advertisement acknowledging it had "disappointed" and sometimes even "betrayed" American consumers as it lobbies to clinch the federal aid it needs to stay afloat into next month.

The print advertisement marked a sharp break from GM's public stance of just several weeks ago, when it sought to justify its bid for U.S. government aid on the grounds that the credit crisis had undermined its business in ways executives could never have foreseen.

It also came as Chief Executive Rick Wagoner, who has led the automaker since 2000, faces new pressure to step aside as GM seeks up to $18 billion in federal funding.

"While we're still the U.S. sales leader, we acknowledge we have disappointed you," the ad said.

The unsigned open letter, entitled "GM's Commitment to the American People," ran in the trade journal Automotive News, which is widely read by industry executives, lobbyists and other insiders.

In the ad, GM admits to other strategic missteps analysts and critics have said hastened its recent decline.

"We have proliferated our brands and dealer network to the point where we lost adequate focus on the core U.S. market," the ad said. "We also biased our product mix toward pick-up trucks and SUVs."

But GM also says in the ad that it was hit by forces beyond its control as it tried to complete a restructuring earlier this year.

"Despite moving quickly to reduce our planned spending by over $20 billion, GM finds itself precariously and frighteningly close to running out of cash," the ad says.

A failure of GM would deepen the current recession and put "millions of jobs at risk," according to the ad, which also highlights the automaker's pledged restructuring and intention to begin repaying taxpayers in 2011.

GM spokesman Greg Martin said the ad was an attempt by the automaker to present "a pledge directly to the public."

"We believe we need to deliver this commitment unfiltered since quite a bit of media commentary has not kept pace with our actual progress to transform the company," Martin said.

Senate Banking Committee Chairman Christopher Dodd, a Democrat from Connecticut who is central to the effort to craft an auto bailout bill, on Sunday said GM should replace Wagoner.

Monday, November 24, 2008

With sports fans still getting used to their high-definition television sets, the National Football League is already thinking ahead to the next potential upgrade: 3-D.

Next week, a game between the San Diego Chargers and the Oakland Raiders will be broadcast live in 3-D to theaters in Los Angeles, New York and Boston. It is a preliminary step on what is likely a long road to any regular 3-D broadcasts of football games.

The idea is a "proof of concept," says Howard Katz, NFL senior vice president of broadcasting and media operations. "We want to demonstrate this and let people get excited about it and see what the future holds."

The several hundred guests at the three participating theaters Dec. 4 will include representatives from the NFL's broadcasting partners and from consumer-electronics companies. Burbank, Calif.-based 3ality Digital LLC will shoot the game with special cameras and transmit it to a satellite. Thomson SA's Technicolor Digital Cinema is providing the satellite services and digital downlink to each theater, and Real D 3D Inc. will power the display in the theaters.

This isn't the first time the NFL has participated in a 3-D experiment. In 2004, a predecessor company to 3ality filmed the Super Bowl between the New England Patriots and the Carolina Panthers. When Sandy Climan, 3ality's chief executive officer, shows the footage, "people crouch down to catch the ball," he says. "It's as if the ball is coming into your arms."

Real D, which has rolled out 3-D systems in 1,500 theaters around the world, has long advocated the transmission of live events to theaters in 3-D. "We look forward to giving fans of live events the opportunity to feel like they're in the front row," says Michael Lewis, Real D's CEO.

Some live events, including opera broadcasts and circus performances, already pop up on screens at theaters across the country.

Next week's demonstration will also include television displays, to show what might one day be available in homes. While 3-D television sets are already available in stores, mainly for the handful of DVDs available in 3-D, the industry is still working on technical standards for 3-D.

That process raises the possibility that 3-D TV sets purchased today might not be compatible with programs aired in a few years' time. Just as in theaters, home viewers must wear special 3-D glasses.

Sunday, November 23, 2008

Architect Bradford Perkins has endured three recessions in his 39-year career, so when business started slowing earlier this year, he acted quickly to bolster revenues. The chairman of Perkins Eastman, the city's largest architectural firm, opened two more international offices and hired two renowned architects to help win more commissions.

While the firm was searching for more business overseas, activity was tanking at home. Twenty projects—roughly 10% of the firm's total in New York—were suspended or canceled in the past five months. That forced Perkins Eastman to lay off about 40 workers, or 10% of the staff—an action unprecedented in the company's 24-year history.

“We always knew the business ran in cycles,” says Mr. Perkins. “But what surprised me is how the effects of this downturn came on so fast.”

As both office and residential development in the city grinds to a halt, architectural firms are scrambling to find more work. They are lowering their fees, chasing smaller projects, seeking more international assignments and bidding on more institutional contracts to generate revenues—all tried-and-true methods employed during past economic slowdowns.

But architects fear their traditional coping strategies will fall short as the economy craters. For example, they note that as the recession spreads globally, work is evaporating in former construction hot spots like Dubai and China.

Architects also worry that clients that have long provided lifelines, such as municipalities, universities and hospitals, will retreat as donations and taxes shrivel.

In October, the Architecture Billings Index plummeted to 36.2, its lowest level since the survey began in 1995.

Any score below 50 indicates a decline in billings. The index, calculated by the American Institute of Architects, is considered a leading indicator of construction activity. One area that architects traditionally count on to carry them through recessions—government construction—is in danger of being curtailed because municipalities are having difficulties getting bonds approved to finance projects, the institute says.

“This is unprecedented,” says Kenneth Drucker, senior principal at architecture firm HOK New York. “Usually when one business dries up, another takes its place.”

New York City's gloomy financial outlook unnerves architect Paul Eagle because his firm was tapped to design the new police academy in Queens.

“I check my e-mail every day to make sure it is still on,” says Mr. Eagle, the principal of Perkins+Will's New York office. “We are moving ahead, but you hear the news every day and you get nervous.”

Architects note that many factors that influence their business, such as clients' ability to get financing, are beyond their control. So they concentrate on other aspects during tough times.

Mr. Perkins' response to tough times has been to rev up marketing, but he says that even with his increased sales efforts, the company's revenues could fall 10% next year. The decrease will be larger if the economy further curdles, he says.

“What I learned over the years is that you shoot your way out of the recession,” he says. “You've got to put a lot more emphasis on selling.”

Mr. Perkins' new marketing tools include offices in Ecuador and India. He's also hoping clients will hire his two recent additions: Steve Rosenstein, who specializes in designing science and research facilities, and Thomas Fridstein, who is known for his international expertise.

Stanton Eckstut, principal of Ehrenkrantz Eckstut & Kuhn Architects, is taking to the road to drum up more business. Last week, he flew to Los Angeles to discuss with local colleagues how they can capitalize on the city's plans to build more schools.

Recently, the firm formed a joint venture with two Washington, D.C.-based engineering firms and just won a bid to build a school there.

“We are out there. We are canvassing,” says Mr. Eckstut.

Despite his strenuous marketing efforts, the firm laid off 10 people—about 10% of the New York staff—in the past six weeks because work is slow.

Robin Klehr Avia, a managing partner of Gensler, is in a similar situation.

“The problem is 20 firms respond to RFPs,” she says.

Like other firms, Gensler has cut its fees but has still lost business. Ms. Klehr Avia says that in the last six weeks, 10 projects have either been scaled back, canceled or suspended.

Thursday, November 20, 2008

Irving Gertz, a film and television composer who contributed music to 1950s science-fiction films such as "It Came From Outer Space" and "The Incredible Shrinking Man" and to 1960s TV series such as "Voyage to the Bottom of the Sea," has died. He was 93.

Gertz died Friday at his home in West Los Angeles, said David Schecter, a record producer and film-music historian who was a close friend. No specific cause of death was given.

From the late 1940s to the late '60s, Gertz wrote music for about 200 movies and television episodes. Among his film credits are "Abbott and Costello Meet the Mummy," "Francis Joins the WACS," "The Alligator People," "The Monolith Monsters," "The Creature Walks Among Us," "Overland Pacific," "To Hell and Back," "The Thing That Couldn't Die" and "Flaming Star."

Among his TV credits were "Daniel Boone," "The Invaders," "Land of the Giants," "Peyton Place" and "Voyage to the Bottom of the Sea."

Wednesday, November 19, 2008

New documents made available under a Freedom of Information Act request have brought additional information about the use of triggerfish technology to light. Triggerfish devices are also known as cell-site simulators or digital analyzers. By posing as a cell tower, a triggerfish can trick nearby phones into transmitting their serial numbers and other data, which can be used to triangulate the location of mobile phones. While earlier understanding of this technology assumed the cooperation of mobile phone operators, one of the uncovered documents explicitly noted that it can be deployed without having to "involve the cell phone provider."

The USB Promoter Group finalized the USB 3.0 specification on Monday this week--almost eight years after the launch of USB 2.0 at WinHEC.

Also known as "SuperSpeed USB," the standard should appear in discrete controllers in the second half of next year, with consumer products poised to follow in 2010. At about 10 times the speed of USB 2.0, transferring a 25GB HD movie file will take just 70 seconds, compared to 13.9 minutes at USB 2.0's 480Mbps data transfer rate.
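The quoted figures imply effective throughput well below each bus's raw signaling rate, since protocol overhead roughly halves USB 2.0's 480Mbps in practice. A quick back-of-the-envelope sketch, using assumed effective rates of 240Mbps for USB 2.0 and roughly 2,860Mbps for USB 3.0 (the rate the quoted 70 seconds implies), reproduces the article's numbers:

```python
def transfer_time_s(size_gb, throughput_mbps):
    """Seconds to move size_gb (decimal gigabytes) at throughput_mbps (megabits/s)."""
    return size_gb * 8000 / throughput_mbps  # 1 GB = 8,000 megabits

# A 25GB HD movie at an assumed ~240Mbps effective USB 2.0 rate:
usb2_minutes = transfer_time_s(25, 240) / 60
print(round(usb2_minutes, 1))   # 13.9 minutes, matching the article

# The same file at the ~2,860Mbps effective rate implied by the quoted 70 seconds:
usb3_seconds = transfer_time_s(25, 2860)
print(round(usb3_seconds))      # about 70 seconds
```

The effective rates here are assumptions chosen to match the article's figures, not numbers from the specification itself.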

On the other hand, the delays in ratifying the USB 3.0 standard mean that Microsoft will not support USB 3.0 in Windows 7 at RTM, according to Lars Giusti of Microsoft. At the moment, Microsoft is still deciding whether it will even incorporate USB 3.0 support into Windows Vista.

Monday, November 17, 2008

A supercomputer at Los Alamos National Laboratory remained the world's fastest, narrowly edging out another massive machine at Oak Ridge National Laboratory, according to a twice-yearly ranking of the 500 largest scientific systems.

International Business Machines Corp.'s 188 systems accounted for the most computing power on the so-called Top500 list, and it supplied machines rated first, fourth and fifth. Hewlett-Packard Co. moved past IBM in terms of total systems on the list, with 209 machines. Cray Inc. supplied the No. 2 system and three others in the top 10.

Intel Corp. chips were used in 379 of the top 500 systems. Rival Advanced Micro Devices Inc. supplied chips in 59 machines, including seven of the 10 fastest.

The No. 1 Roadrunner machine at Los Alamos uses both AMD and IBM microprocessors, while the Oak Ridge Jaguar system is powered only by AMD's Opteron chip.

The Top500 list is compiled by researchers at the University of Mannheim, Germany, along with the University of Tennessee in Knoxville, and the Department of Energy's National Energy Research Scientific Computing Center in Berkeley, Calif.

Thursday, November 13, 2008

LG Display, Sharp, and Chunghwa Picture Tubes agreed to plead guilty to criminal charges for participating in a liquid crystal display price-fixing conspiracy and pay US$585 million in fines, the U.S. Department of Justice announced Wednesday.

The three companies worked in concert to set prices on thin-film transistor LCDs, which are used in computer monitors, notebooks, televisions, mobile phones, and various electronics, according to the antitrust unit of the Justice Department.

Apple, Dell, and Motorola were among the companies affected by the price fixing, antitrust regulators said.

"The price-fixing conspiracies affected millions of American consumers who use computers, cell phones, and numerous other household electronics every day," Thomas Barnett, assistant attorney general for the Justice Department's antitrust division, said in a statement.

The three companies, which were charged with violating the Sherman Antitrust Act, allegedly held "crystal" meetings and engaged in communications about setting prices on the TFT-LCD displays. They agreed to charge predetermined prices for the displays, issued price quotes based on those agreements, and exchanged sales information on the display panels, in order to monitor and enforce the agreement, the Justice Department said.

LG Display agreed to pay a US$400 million fine, marking the second-highest antitrust fine ever imposed. The company pleaded guilty to setting prices with other unnamed suppliers for the TFT-LCD panels worldwide from September 2001 to June 2006, when the company operated under the name L.G. Philips LCD, a joint venture between LG Electronics and Philips Electronics. LG Display America was known as L.G. Philips LCD America.

Sharp, meanwhile, agreed to pay a US$120 million fine and participated in the conspiracy between April 2001 and December 2006 with other unnamed suppliers. The conspiracy involved setting prices in three separate agreements for TFT-LCD panels sold to Dell, which used them in computer monitors and laptops.

And during the period ranging from the fall of 2005 to mid-2006, similar price-fixing schemes were employed in sales to Motorola, which used the panels in its popular Razr mobile phones.

Sharp's conspiracy also touched Apple from September 2005 to December 2006; during that period, Apple used the displays for its popular iPod music players.

Chunghwa agreed to pay a US$65 million fine, for its participation in the price-fixing scheme from September 2001 through December 2006.

The Justice Department began its investigation in 2006 and notes that it is still ongoing.

"Dell is aware of the announcement and will review its impact, but we have no comment at this time and probably will not in the near term as it's an ongoing investigation," a Dell representative said Wednesday, in an e-mail response.

Sony, a major LCD panel producer, also declined to comment.

For the LCD industry, problems began in the late 1990s when a surge in demand for notebooks and handheld devices drove up the need for LCD glass. As a result, the TFT-LCD makers built glass plants in Korea and Taiwan during 1998 through 1999.

But as those factories came online and began to pump out LCD glass, a glut took hold. And by the fall of 2000, prices on 15-inch flat panels plummeted to a point that in some cases manufacturers were having to sell their panels at US$5 to US$10 below cost.

From October 2000 through August 2001, LCD makers were feeling the pain of an oversupply of panels. But after August 2001, prices began to rise.

And apparently, it was no coincidence. Five months prior, Sharp had begun fixing prices on TFT-LCD panels sold to PC giant Dell and in September 2001, LG and Chunghwa also began to engage in price fixing, as well.

Analysts, at the time, predicted LCD shortages, especially in the 15-inch panel, would continue through 2002.

IDC analyst Bob O'Donnell noted at the time that while PCs tend to only go down in price over time, flat panel prices have occasionally risen. Said O'Donnell at the time: "LCD is one of the few [markets] where things have actually gone up in price."

Although Sharp admits to engaging in price fixing with Apple's iPod screens in the 2005 to 2006 period, it remains unclear whether other vendors may have engaged in a similar behavior with Apple back in 2002.

That is when Apple was hit with a component shortage of 15-inch LCD panels for its newly introduced all-in-one flat-panel iMacs. As a result, Apple suffered a shortage of the machines just after introducing and touting the sleek design.

Home Theater News | Industry-Trade News
Written by AVRev.com, Wednesday, 12 November 2008

This economy is taking its toll on home theater specialty retailers. First, Tower Records went out of business. Then Tweeter filed for bankruptcy, and its eventual new owners pulled the plug. Circuit City seems to be in trouble, recently announcing the closing of several of its stores.

Now Sound Advice, a Florida-based home theater retailer, has called it quits. It will be closing all 22 of its stores by year's end. The liquidation sale started last Wednesday and will continue until after the holidays. The stores are currently offering 10 percent off TVs, 20 percent off speakers and 40 percent off cables and accessories.

Sound Advice employs about 50 workers, about 30 fewer than a year ago. The stores have stopped accepting checks and have stopped selling gift cards, warranties and the like.

Sound Advice was acquired for $61 million in 2001 by Tweeter, so it was probably not a stretch of the imagination that Sound Advice would eventually close.

This breakthrough product combines Avocent’s field-proven MPX1500 wireless video distribution technology with the latest in WiFi advancements, IEEE802.11n MIMO-based radios, thereby raising the bar for wireless HD video distribution systems in terms of visual acuity, transmission distance, and noise immunity. With the MPX1550 system, deployment of a single stream of media to many displays can be accomplished in minutes, even under the most challenging of conditions, with a degree of reliability and quality that rivals a dedicated source device at each and every display.

Unlike multiple source devices, however, video and audio remain in lockstep across all displays, failure-prone moving parts are kept to a minimum, and software licensing costs are greatly reduced. Equally important, all devices in the media network are remotely manageable using the MPX1550 extender’s onboard Web interface, which is accessible via a dedicated control LAN interface.

“As the number of public displays continues to mushroom, the ability to deploy these displays in a rapid yet cost-effective manner has become a key differentiator for signage network providers,” said Mitch Friend, senior vice president and general manager of Avocent. “It has also become clear to us that content requirements differ among various deployments such as retail, institutional and way-finding signage applications. Characteristics such as visual acuity, resolution of motion and still images tend to vary. To handle the most demanding applications, we’ve added the MPX1550 system to our product offerings. By adding the 802.11n support to our product line, customers are now able to choose the extender that is best suited for their application.”

Emerge MPX1550 extenders are widely deployed for a variety of professional video applications, such as digital signage – providing panels with live content, entertainment and advertising in retail outlets, theaters, restaurants, airports, gas stations, and other venues. The Emerge MPX1550 extenders offer both wired (over IP) and wireless operation, including distribution of both HD and SD video signals from a single transmitter to a cluster of receivers. The MPX1550 system is uniquely optimized for both full motion video and still images.

The extenders support digital and analog video signals, providing support for a wide range of DVI, HDMI, VGA and component source and display devices. MPX1550 extenders also perform analog to digital conversion as needed to match dissimilar source and display devices. For additional control over source and display devices, the MPX1550 extenders also forward serial and IR signals. Support for HDCP ensures that even protected content can be distributed through signage networks in a secure and compliant manner.

The Emerge MPX1550T wireless transmitter and Emerge MPX1550R wireless receiver are available this month at an MSRP of $1,145 each.

The plan allows high-tech firms such as Google and Microsoft to develop a new generation of devices that will use the 'white spaces' between channels to go online.

By Jim Puzzanghera November 5, 2008

Reporting from Washington -- Federal regulators on Tuesday approved the largest ever expansion of wireless Internet access, unanimously backing a controversial plan to allow a new generation of devices to use the empty airwaves between television channels to go online.

Dubbed "Wi-Fi on steroids" by its supporters in the high-tech industry, the plan promises to offer wireless Internet service across America -- most likely for free -- and spur new systems for transmitting video and other data between devices in homes.

It overcame staunch opposition from the entertainment industry, which is worried that the Web-surfing devices will interfere with TV broadcasts and wireless microphones.

Although expected to be slower and possibly less secure than commercial broadband services from cable and phone companies, the new Internet connections will ride on the highest-quality broadcast airwaves, which are able to carry signals long distances and easily penetrate trees and walls.

For decades, those government-owned airwaves have been reserved for TV stations.

But the Federal Communications Commission, in a 5-0 vote intended to increase the reach of high-speed Internet access, approved a plan advocated by public interest groups and technology companies, including Google Inc. and Microsoft Corp., to allow the use of the spectrum by new laptops, mobile phones and other gadgets with built-in equipment that are expected to hit the market in about two years.

"Consumers across the country will have access to devices and services they may have only dreamed about before," FCC Chairman Kevin J. Martin said.

The high-tech firms say the so-called white spaces of the airwaves that lie between the broadcast TV channels have the potential to provide revolutionary new wireless services that people could use for free -- unlike the spectrum leased by the government to cellphone companies, which then charge customers to access it.

Google Chief Executive Eric Schmidt and Microsoft co-founder Bill Gates personally lobbied FCC commissioners to open up access to the vacant channels, which range from about a third of the TV airwaves in major cities such as Los Angeles to three-quarters of the airwaves in rural areas.

These companies will have to build the infrastructure to connect the airwaves to the Internet, such as installing transmitters on existing cellular towers. Although they could charge users for those connections -- in the same way that some coffee shops charge for access to their Wi-Fi hot spots -- Google and others are expected to offer them for free, recouping the cost through sales of white-space-enabled devices and online advertising.

"This is a clear victory for Internet users and anyone who wants good wireless communications," Google co-founder Larry Page said.

Broadcasters fiercely fought it, warning that the new devices could cause some viewers to lose their TV signals because of interference. The issue is of particular concern because broadcasters must switch to all-digital signals in February. With traditional analog TV stations, interference causes static or fuzziness. But broadcasters say digital pictures can freeze or be lost entirely if another signal is broadcast on or near the same channel.

"The commission chose a path that imperils America's television reception in order to satisfy the 'free' spectrum demands of Google and Microsoft," said David Donovan, president of the Assn. for Maximum Service Television, an engineering trade group of TV broadcasters.

Representatives of sports leagues, musicians and large churches have also complained about potential interference from the new Internet devices and lobbied against the changes. They worry, for example, that one of these devices in a concert-goer's pocket would interfere with the performer's wireless microphone.

The FCC's field tests of early prototypes provided by Microsoft and other companies produced mixed results, with some of the devices failing to sense and avoid broadcast signals. Broadcasters said those results showed that the technology wasn't ready.

But FCC officials said the tests showed that it was possible for devices to use the airwaves without interference.

The devices will operate at low power and will only be able to use channels 21 to 51, where there are fewer TV stations. The FCC will give preference to devices that use technology to determine a user's location and then avoid TV channels operating there based on a special database, rather than devices that try to sense and avoid TV signals. Devices that use sensing technology will have to go through more rigorous field testing before being certified.
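The geolocation approach the FCC favors can be sketched in a few lines. Everything below is hypothetical (the channel database, the locations, the function name are all illustrative assumptions, not part of any real FCC system); it only shows the look-up-then-avoid logic the rules describe.

```python
# Hypothetical sketch of the FCC-preferred geolocation approach: the
# device determines its location, looks up which TV channels are in
# use there in a database, and restricts itself to free channels 21-51.

# Hypothetical database mapping a location to its occupied TV channels.
OCCUPIED_CHANNELS = {
    "los_angeles": {22, 24, 28, 31, 34, 36, 41},
    "rural_iowa": {25, 39},
}

def available_white_space(location):
    """Return the channels in 21-51 with no TV station at this location."""
    occupied = OCCUPIED_CHANNELS.get(location, set())
    return sorted(ch for ch in range(21, 52) if ch not in occupied)

# A device in a rural area sees far more open channels than one in L.A.,
# matching the roughly one-third vs. three-quarters split described above.
print(len(available_white_space("rural_iowa")))    # 29 open channels
print(len(available_white_space("los_angeles")))   # 24 open channels
```
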

The FCC also will create a safe zone around large sporting and performance venues, such as the Los Angeles Coliseum and New York's Broadway theater district. The new mobile devices in those areas won't have access to channels used by wireless microphones.

Tuesday, November 4, 2008

Who can argue with the current trend of smaller carbon footprints, green packaging, and energy efficiency?

Me.

What a colossal waste of time, effort, and money. “Going Green” is the last thing we should be doing right now.

Something bizarre is going on that we're culturally loath to admit: a trend driven by guilt, despair, depression, and paranoia. It's a lemming-like rush to the acceptance of mediocrity.

Mediocrity as a virtue?

It’s almost as if the Borg were attacking and we’ve decided to capitulate rather than fight. I don’t believe for a moment that resistance is futile against this mistakenly woeful Green Revolution.

Yet right now we're stuck in the middle of a conundrum: an unhappy intersection of marketing prowess, Me Generation greed, economic leverage from overseas competitors, and an unhelpful dose of our own stupidity. We must compete our way out of it, not dig our holes even deeper.

I saw my first green audio product the other day. Yeech. Instead of touting its technical merits, the press release rambled on about how the packaging was 43% smaller and could be more easily recycled. A boring little spit of a product was in the box, but who cares if its carbon footprint is smaller? Are we now playing to the crowd or to the customer?

And why in the world should Americans cut down on packaging? Trash is America's single largest export! It's bigger than corn or coal right now; look it up.

In the old days it was the biggest house, fastest car, loudest sound system, or the most sparkling jewel. These days it's degenerated into something far more sinister. Remember the two buses on the Sex Pistols disc going to Nowhere and Boredom? That's where the Green crowd wants to take us, and on mass transportation no less.

With apologies to the Renaissance Faire crowd, your world is boring and is not the solution to this malaise. As fascinating as it is to see unshaven Luddites prance about in the dirt with pointed shoes whilst strumming a lyre, let’s just say it’s a fork in the road I’m glad you went down and not me. Yet you have the microphone right now, as they say. And I mean to take it away from you.

The First Lord of Green Boredom in my book is Al Gore, a man who practices something entirely different from what he preaches. His message of apocalyptic environmentalism may have delivered him a Nobel Prize, but if I may, I'd like to whisper an opposing message into your ear as well: "Our industry does not sell boring very well. Never has, and never will. We sell excitement, movement, and energy. We sell new and different- spectacular events if it all comes together properly. I doubt the Rolled Stones would have quite the attraction of the Rolling Stones, for instance…."

Do Toyota hybrids sell well in Abu Dhabi? Hell no. Ferraris do. I suppose that's why they're building Ferrariland there instead of here. Can you imagine the conniption fits our Greenies would convulse in had anyone proposed putting a Ferrariland in Southern California? The best we can hope for is a Prius-based ride in Legoland.

The Toyota Prius is sold in over 40 countries, yet over sixty percent of its sales have been in the U.S. Ever wonder why? Is it high fuel costs here compared to Europe? Stylish design? High performance (30 HP less than Toyota's own Yaris)? Low maintenance or insurance costs?

No, it's because we're supposed to feel better driving a boring car that's acceptable to the Green crowd, even if we're not quite sure where those 600,000 nickel-metal hydride batteries are going to get dumped. Perhaps they should try Yucca Mountain; it's not being used for anything right now anyway. Suffice it to say, the same crowd that predicts Armageddon from cow flatulence gives the battery disposal issue a complete pass.

They even gave the first 85,000 hybrid owners access to the car pool lane in California without any passengers. How that helped unclog the freeways I’m not quite sure. It seems the OPEC boys found a much more efficient technique in my opinion.

The only way out of this mess is with bigger and better technology, not going backwards and accepting lower performance as we pine for the good old days. Our best and brightest engineers should be racing ahead to build the latest and greatest, not wasting their time bragging about the various merits of cardboard packaging and telling us to make do with less performance. They should be building nuclear and fusion reactors, high capacity energy storage, more efficient transportation, and better communication systems.

We got it. Digital consoles, integrated wild tracks, DSP processing, switched power supplies, and line arrays have been very good to this industry. So what’s next, and when do we get it? I want our manufacturers to rigorously go through the entire system concept from start to finish and make the whole thing much better! That is what we need to prosper in the long term. The dreamers, technologists, engineers, and builders that will shape our new reality must be given their chance.

Try an experiment for me. Go up to your head salesperson and tell them you want them to sell only systems that don't perform well but are made from eco-friendly, low-carbon-footprint materials. The audience won't hear anything, but who really cares anyway? I have a pretty good idea where your salesperson's footprint will be planted on you after that request. It does sound kind of stupid when it hits home, doesn't it? It's no different for any other industry.

Going Green may be a very quick way to make your accounting go red and your future black. Paraphrasing Churchill, this may not be the beginning of the end of the Green Revolution. But perhaps it is the end of the beginning. I hope we’re at a point where the platitudes and positioning end and meaningful innovation begins.

Thank goodness for that. Now flip this chlorinated and Kraft pulped (a fascinating industrial process dependent on gas turbine engines, by the way) page on to the next article.

The public reputation of Windows Vista is in shambles, as Microsoft itself tacitly acknowledged in its Mojave ad campaign.

IT departments are largely ignoring Vista. In June (18 months after Vista’s launch), Forrester Research reported that just 8.8% of enterprise PCs worldwide were running Vista. Meanwhile, Microsoft appears to have put Windows 7 on an accelerated schedule that could see it released in 2010. That will provide IT departments with all the justification they need to simply skip Vista and wait to eventually standardize on Windows 7 as the next OS for business.

So how did Vista get left holding the bag? Let’s look at the five most important reasons why Vista failed.

5. Apple successfully demonized Vista

Apple's clever I'm a Mac ads have successfully driven home the perception that Windows Vista is buggy, boring, and difficult to use. After taking two years of merciless pummeling from Apple, Microsoft recently responded with its I'm a PC campaign in order to defend the honor of Windows. This will likely restore some mojo to the PC and Windows brands overall, but it's too late to save Vista from its reputation as a dud.

4. Windows XP is too entrenched

In 2001, when Windows XP was released, there were about 600 million computers in use worldwide. Over 80% of them were running Windows, but that share was split between two code bases: Windows 95/98 (65%) and Windows NT/2000 (26%), according to IDC. One of the big goals of Windows XP was to unite the Windows 9x and Windows NT code bases, and it eventually accomplished that.

In 2008, there are now over 1.1 billion PCs in use worldwide and over 70% of them are running Windows XP. That means almost 800 million computers are running XP, which makes it the most widely installed operating system of all time. That’s a lot of inertia to overcome, especially for IT departments that have consolidated their deployments and applications around Windows XP.

And, believe it or not, Windows XP could actually increase its market share over the next couple years. How? Low-cost netbooks and nettops are going to be flooding the market. While these inexpensive machines are powerful enough to provide a solid Internet experience for most users, they don’t have enough resources to run Windows Vista, so they all run either Windows XP or Linux. Intel expects this market to explode in the years ahead. (For more on netbooks and nettops, see this fact sheet and this presentation — both are PDFs from Intel.)

3. Vista is too slow

For years Microsoft has been criticized by developers and IT professionals for “software bloat” — adding so many changes and features to its programs that the code gets huge and unwieldy. However, this never seemed to have enough of an effect to impact software sales. With Windows Vista, software bloat appears to have finally caught up with Microsoft.

Vista has over 50 million lines of code. XP had 35 million when it was released, and since then it has grown to about 40 million. This software bloat has had the effect of slowing down Windows Vista, especially when it's running on anything but the latest and fastest hardware. Even then, the latest version of Windows XP soundly outperforms the latest version of Windows Vista. No one wants to use a new computer that is slower than their old one.

2. There wasn’t supposed to be a Vista

It’s easy to forget that when Microsoft launched Windows XP it was actually trying to change its OS business model to move away from shrink-wrapped software and convert customers to software subscribers. That’s why it abandoned the naming convention of Windows 95, Windows 98, and Windows 2000, and instead chose Windows XP.

The XP stood for “experience” and was part of Microsoft’s .NET Web services strategy at the time. The master plan was to get users and businesses to pay a yearly subscription fee for the Windows experience — XP would essentially be the on-going product name but would include all software upgrades and updates, as long as you paid for your subscription. Of course, it would disable Windows on your PC if you didn’t pay. That’s why product activation was coupled with Windows XP.

Microsoft released Windows XP and Office XP simultaneously in 2001 and both included product activation and the plan to eventually migrate to subscription products. However, by the end of 2001 Microsoft had already abandoned the subscription concept with Office, and quickly returned to the shrink-wrapped business model and the old product development model with both products.

The idea of doing incremental releases and upgrades of its software — rather than a major shrink-wrapped release every 3-5 years — was a good concept. Microsoft just couldn't make the business model work, and instead of getting it right, it took the easy route and went back to an old model that was simply not well suited to the economic and technical realities of today's IT world.

1. It broke too much stuff

One of the big reasons that Windows XP caught on was that it had the hardware, software, and driver compatibility of the Windows 9x line plus the stability and industrial strength of the Windows NT line. The compatibility issue was huge. Having a single, highly compatible Windows platform simplified the computing experience for users, IT departments, and software and hardware vendors.

Microsoft either forgot or disregarded that fact when it released Windows Vista, because, despite a long beta period, a lot of existing software and hardware were not compatible with Vista when it was released in January 2007. Since many important programs and peripherals were unusable in Vista, that made it impossible for a lot of IT departments to adopt it. Many of the incompatibilities were the result of tighter security.

After Windows was targeted by a nasty string of viruses, worms, and malware in the early 2000s, Microsoft embarked on the Trustworthy Computing initiative to make its products more secure. One of the results was Windows XP Service Pack 2 (SP2), which won over IT and paved the way for XP to become the world's most widely deployed OS.

The other big piece of Trustworthy Computing was the even-further-locked-down version of Windows that Microsoft released in Vista. This was definitely the most secure OS that Microsoft had ever released but the price was user-hostile features such as UAC, a far more complicated set of security prompts that accompanied many basic tasks, and a host of software incompatibility issues. In other words, Vista broke a lot of the things that users were used to doing in XP.

Bottom line

There are some who argue that Vista is actually more widely adopted than XP was at this stage after its release, and that it’s highly likely that Vista will eventually replace XP in the enterprise. I don’t agree. With XP, there were clear motivations to migrate: bring Windows 9x machines to a more stable and secure OS and bring Windows NT/2000 machines to an OS with much better hardware and software compatibility. And, you also had the advantage of consolidating all of those machines on a single OS in order to simplify support.

With Vista, there are simply no major incentives for IT to use it over XP. Security isn’t even that big of an issue because XP SP2 (and above) are solid and most IT departments have it locked down quite well. As I wrote in the article Prediction: Microsoft will leapfrog Vista, release Windows 7 early, and change its OS business, Microsoft needs to abandon the strategy of releasing a new OS every 3-5 years and simply stick with a single version of Windows and release updates, patches, and new features on a regular basis. Most IT departments are essentially already on a subscription model with Microsoft so the business strategy is already in place for them.

As far as the subscription model goes for small businesses and consumers, instead of disabling Windows on a user's PC if they don't renew their subscription, just don't allow that machine to get any more updates if they don't renew. Microsoft could also work with OEMs to sell something like a three-year subscription to Windows with every new PC. Then users would have the choice of renewing on their own after that.

Wednesday, October 29, 2008

In recent weeks, two federal judges have criticized the record industry's attempts to extract exorbitant sums from alleged file-sharers, who might have uploaded/downloaded tracks on peer-to-peer services, but only for personal use as opposed to profit.

In one case, judge Michael Davis in Duluth, Minn. pleaded with Congress to revise the copyright law so that individuals like Jammie Thomas, who a jury found liable for uploading 24 songs to Kazaa, wouldn't face astronomical fines. The jury in the case had ordered Thomas to pay $220,000, but Davis last month set aside the verdict and ordered a new trial for reasons unrelated to the size of the award.

Also, Judge Xavier Rodriguez in San Antonio, Texas recently fined 20-year-old Whitney Harper $200 a track for file-sharing -- significantly lower than the $750 a track set out in the statute. Rodriguez departed from the minimum on the theory that Harper, a high school student at the time she shared files, was an "innocent infringer."

Now, Harvard Law professor Charles Nesson is asking a court to declare the statute the RIAA is relying on unconstitutional.

Nesson, who is representing Joel Tenenbaum, another teenager at the time of the alleged file-sharing, writes: "The plaintiffs and the RIAA are seeking to punish [Joel Tenenbaum] beyond any rational measure of the damage he allegedly caused. They do this, not for the purpose of recovering compensation for actual damage caused by Joel's individual action, nor for the primary purpose of deterring him from further copyright infringement, but for the ulterior purpose of creating an urban legend so frightening to children using computers, and so frightening to parents and teachers of students using computers, that they will somehow reverse the tide of the digital future."

There's no real question the record industry has seen revenues fall because of file-sharing. At the same time, the RIAA's campaign against individual users appears grossly random. The record labels have targeted around 30,000 unlucky individuals who have allegedly used a peer-to-peer service to share tracks. But that's out of millions of file-sharers.

And while CD sales have plunged, no individual user is responsible for the billions in lost revenue. Obviously, the record industry needs to figure out new ways to bring in revenue, whether by ad deals, selling concert tickets or some other business plan. But suing ordinary music-listeners into bankruptcy is no way to save an industry.

Tuesday, October 28, 2008

Dingell asks commission for white space answers
Oct 28, 2008, 10:35 AM

Congressman John Dingell, D-MI, chairman of the House Committee on Energy and Commerce, has asked the FCC to explain how it is going about making rules aimed at allowing unlicensed devices into unused portions of the TV band known as white spaces.

In a letter to all five FCC commissioners Oct. 24, Dingell centered his questions on two areas: peer review of the “Evaluation of the Performance of Prototype TV-Band White Space Devices Phase II” report from the FCC’s Office of Engineering and Technology (OET) released Oct. 15 and accountability for taking corrective steps if white space devices cause harmful interference.

Dingell asked for written responses from the commission by Oct. 31.

On the same day the OET released the report, FCC Chairman Kevin Martin said he favored allowing unlicensed devices to operate in TV band white spaces with certain conditions. The FCC is tentatively scheduled to move on the issue at its Nov. 4 meeting.

Opponents, such as various broadcast trade associations, broadcast networks, affiliate groups and others, have asserted that allowing white space devices that rely on spectrum-sensing technology to identify unused spectrum for operation into the TV band threatens the billions of dollars both broadcasters and viewers have invested in DTV technology. While Martin said he favored opening the band to devices that incorporate geolocation technology to access a database of available frequencies in a given locale, the OET report held open the possibility of authorizing devices that only use spectrum sensing in the future.

Repeated tests by the FCC have shown that prototype white space devices have failed to accurately and consistently detect the presence of DTV transmissions and those of wireless mics, which share the TV band.

Among Dingell’s questions:

Was the OET report released Oct. 15 peer reviewed? If so, when and by whom? What changes, if any, were made based on the peer review?

If the commission believes regulations do not require a peer review, why did the FCC subject its first report detailing the results of phase one white space prototype testing to peer review?

How would the commission address reports of harmful interference to over-the-air TV signals?

If white space devices are sold, and interference problems surface, how will the commission remove them from the market?

Sunday, October 26, 2008

FCC Chairman Kevin Martin has announced the agency’s plan to approve the use of fixed-location white space devices.

On Oct. 15, FCC Chairman Kevin Martin proposed opening up unused portions of the TV airwaves known as white spaces for unlicensed devices to deliver wireless broadband service. The proposal, made in the wake of field tests, is a victory for the Wireless Innovation Alliance, a group of technology companies including Google, Philips and Microsoft that have been pressing for unfettered access to the spectrum being vacated after the switch to digital television Feb. 17.

The new frequency usage rules are expected to be issued on Election Day, Nov. 4, with an Oct. 27 deadline for interested parties to submit formal comments to the FCC. This is an unusually short comment period, especially considering how long this issue has been under consideration. On Oct. 17, the NAB filed an emergency request asking the agency to allow a 70-day public comment period, stating, “The report’s conclusions are not supported and in fact contradicted by the underlying data.”

The filing also noted, “The FCC seems satisfied that white space devices will not significantly interfere with broadcast or cable TV signals in the home, a finding seemingly not well documented in its published test reports. At issue is the report’s contention that ‘proof of concept’ for the safe use of WSDs has been adequately proven in testing.”

While promising to consider the NAB request, the FCC seems satisfied that properly designed white space devices will not pose a significant interference threat. The FCC summary report states: “We are satisfied that spectrum sensing in combination with geolocation and database access techniques can be used to authorize equipment today under appropriate technical standards and that issues regarding future development and approval of any additional devices, including devices relying on sensing alone, can be addressed.”

That summary specifies that only white space devices operating from a fixed location will be allowed into the spectrum with the transition to digital television. Portable white space devices and products that rely on spectrum sensing alone remain under consideration, though it seems clear that the agency expects to approve them at a later date.

Google welcomed Commissioner Martin’s comments on the proposed ruling on the company's public policy blog. "This news should be greatly encouraging for American consumers," it said. "The FCC now has more than enough information to develop appropriate rules that protect TV stations and wireless microphone users from harmful interference while at the same time allowing innovators and entrepreneurs to develop technology that productively uses these airwaves."

Wireless microphone manufacturer Shure joined with the NAB in filing a request for a longer comment period (see Shure white space filing echoes broadcasters' call for public comment). Shure is a long-time proponent of using science to determine the best course for white space devices while still accommodating RF microphones as incumbent spectrum users, and the company was not prepared to comment formally because the reports issued to date do not specifically mention the fate of its products. “We are continuing to discuss matters with the commission, learning all we can about the planned ruling. Our goal is to make sure that wireless microphone users are adequately protected, now and after Feb. 17,” said Christopher Lyons, Shure’s manager of technical and educational communications.

There does appear to be a disparity between the positive test outcome noted in the FCC summary and the actual test data found in the full Office of Engineering and Technology report on white space device field testing. The executive summary’s statement that the “proof of concept” had been met was particularly interesting insofar as it is the first time the FCC has used this language in referring to the purpose of white space device testing.

In addition, it appears that much of the unfavorable test data found in the OET's field test summary report was downplayed or ignored. For instance, false-positive results from a Philips prototype are credited as accurate scans, dramatically inflating the rated accuracy of that device. Even less favorable data from RF microphone testing is not published in full, but only summarized. Still, the report notes that, during testing at FedEx Field, one device found all channels occupied whether ESPN’s wireless mics were on or not, and another prototype “indicated several channels as available even when the microphones were on.”

As a result, it seems clear that the FCC’s pending white space rulemaking will be controversial as the agency attempts to balance the desire to expand economic development in the technology sector and bring broadband wireless access to rural areas against the acknowledged need to ensure the continued viability of broadcast TV, cable TV and wireless microphone systems.

Thursday, October 23, 2008

PHILADELPHIA (AP) - Comcast Corp. (CMCSA) on Wednesday said it will begin rolling out faster Internet speeds over the next few weeks in selected markets to homes and businesses.

The nation's largest cable operator and residential Internet service provider will offer speeds up to 50 megabits per second, which enables users to download a high-definition movie in 16 minutes and a standard definition movie in 5 minutes.
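The quoted download times follow directly from the line speed once you assume a file size; the sizes below (roughly 6 GB for an HD movie, just under 2 GB for standard definition) are assumptions chosen to match the article's figures, and protocol overhead is ignored.

```python
# Sanity-check Comcast's quoted download times on a 50 Mbps line.
# Movie sizes are assumptions, not Comcast's numbers; overhead ignored.

LINE_MBPS = 50  # advertised downstream speed

def minutes_to_download(size_gb, mbps=LINE_MBPS):
    """Minutes to transfer size_gb decimal gigabytes over an mbps link."""
    bits = size_gb * 8 * 1000 ** 3        # gigabytes to bits
    return bits / (mbps * 1000 ** 2) / 60  # bits / (bits per second) / 60

print(minutes_to_download(6.0))    # 16.0 -- an HD movie, as quoted
print(minutes_to_download(1.875))  # 5.0  -- a standard-def movie
```
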

Most Comcast customers will double their speeds for free.

The service will be available in parts of New England, including the Boston area and southern New Hampshire, as well as in portions of Philadelphia, New Jersey and the Twin Cities in Minnesota. Over the next few months, Comcast expects to roll it out to over 10 major markets.

For residential users, Comcast's new 'Extreme 50' tier, including up to 10 Mbps upstream service, will cost $139.95 a month. For businesses, it will cost $189.95 monthly, including extra features and support.

The 'Ultra' plan for individuals will offer speeds up to 22 Mbps for downloading and up to 5 Mbps upstream for $62.95 a month. The business 'premium' tier will offer the same speeds for $99.95 a month.

To get the new Internet plans, individuals must also subscribe to Comcast's cable TV service.

Friday, October 10, 2008

(Cedar Rapids, Iowa)— ARCHI-TECH named the winners in its annual Readers' Choice Awards. Each year, visitors to architechmag.com vote for their favorite technology products. The Readers' Choice Awards feature the most innovative technology products in the commercial buildings market today. Product entries, submitted by manufacturers, are applicable to commercial facilities in the areas of audio/video, building controls, glass and glazing, HVAC, lighting, security, and sustainable technology.

The award winners will be featured in the December 2008 issue of ARCHI-TECH magazine and will remain on architechmag.com until June 2009. Voting for the 2009 Readers' Choice awards will take place June 2009 through August 2009.

"Specifying technology products in commercial buildings is the job of ARCHI-TECH's audience. The winners of our Readers' Choice program should be particularly gratified that their products won the attention of this influential community," said ARCHI-TECH publisher Jim Forthofer.

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted. "Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

Those were welcome words for the editors and publishers who have been watching the internet draw more and more ad spending every year. Mr. Schmidt took aim, however, at the Association of National Advertisers for opposing Google's planned ad deal with Yahoo. The association has said the deal will diminish competition and help Google and Yahoo increase ad prices. "If you're going to criticize us, criticize us correctly," Mr. Schmidt said. "We're guilty of many things, but that's not one of them."

In a talk that he structured mostly as an invitation for questions and ideas, Mr. Schmidt declined to advise magazines on looking more popular to Google's page-ranking programs. "We don't actually want you to be successful," he said. The company's algorithms are trying to find the most relevant search results, after all, not the sites that best game the system. "The fundamental way to increase your rank is to increase your relevance," he added.

On the subject of print, especially newspapers as we have known them, Mr. Schmidt was decidedly gloomy. "The evidence is not good," he said, guessing that the print business will eventually comprise a smaller piece of publishers' much larger online businesses.


Thursday, October 2, 2008

Pearlman Microphones, a boutique manufacturer of handmade tube microphones, launched a meticulous remake of the Church mic, a revered microphone originally created by Stanley Church for Metro-Goldwyn-Mayer (MGM) in the mid-1950s. The Pearlman replica adheres to the original schematics, including the use of the authentic Triad transformer that has been out of production for nearly 50 years.

Church served as the studio's chief sound engineer during that era, producing no more than 200 of his custom vacuum tube condenser microphones strictly for in-house use at MGM. Over the years, however, the mic's reputation and resale value skyrocketed, with originals in decent condition going for up to $20,000 on the vintage market, says the company.

"The Stanley Church MGM mic is widely considered an engineer's Holy Grail, delivering a vibe that is reminiscent of the best U47s and C12s," said Dave Pearlman in a statement. "But the original mic's transformer had been unavailable until just recently, so we're now finally able to replicate the critical combination of the original capsule, tube, and transformer. I'm also making each amplifier by hand, using all point-to-point, old-style wiring. So, essentially, I'm not recreating this classic legacy microphone--I'm simply continuing it."

"We put this mic up next to a couple of U47s the other day and people were absolutely freaking out about how good it sounded," Pearlman said. "They couldn't believe their ears. It's not exactly the same, of course--the mic has a different sonic flavor--but it's every bit on par with the sound of those amazing mics of yesteryear, yet available at a fraction of the price."

Encased in the same housing as Pearlman's TM-1, the new Pearlman Church mic is equipped with a handmade power supply, custom Mogami/Neutrik tube microphone cable, heavy-duty shock mount, and aluminum shipping case. Available to order, the Pearlman Church mic lists for $4,500.

Monday, September 22, 2008

The record industry has unsuccessfully attempted to stamp out piracy by litigating against individuals for five years now. In that time, the RIAA has threatened more than 30,000 people with litigation, racking up millions in legal fees in the process, but without appearing to make any dent in copyright infringement.

In casting a wide net for non-commercial file-sharers, the RIAA has also disrupted the lives of innocent Web users and is now itself facing a class-action lawsuit brought by an exonerated defendant.

But none of that is slowing down the RIAA. On the contrary, the group is growing even more aggressive in its litigation efforts.

The latest news is that the group has rejected a judge's suggestion that the organization allow Whitney Harper to pay $7,400, or $200 a song, to settle allegations that she shared 37 tracks on Kazaa four years ago, when she was just 16. The judge previously ruled that Harper was an "innocent infringer" because she didn't realize she was doing anything illegal, and because Kazaa didn't warn users that music available on its network was pirated. While those facts might not be enough to exonerate her, they can reduce damages to something less than the usual $750 minimum.

But the RIAA is determined to extract at least $750 per track from her and has requested a trial on the issue of damages.

And that's not the extent of the RIAA's militancy. The group is now going after the defense attorney Ray Beckerman, asking that a federal district court judge impose sanctions for his "vexatious" conduct.

The RIAA appears especially aggrieved by Beckerman's blog, The Recording Industry vs. The People, where he posts publicly available motions in lawsuits involving the organization.

"Defendant's counsel has maintained an anti-recording industry blog during the course of this case and has consistently posted virtually every one of his baseless motions on his blog seeking to bolster his public relations campaign and embarrass plaintiffs," the group wrote in its motion for sanctions.

That motion, like others filed by the RIAA, remains available on Beckerman's blog.

Sunday, September 14, 2008

Metallica's new release, "Death Magnetic," went on sale Sept. 12, but fans who want the vinyl version may have to wait. The two-LP set was recently one of the fastest-selling music items on Amazon.com while available for pre-order, and it is now temporarily out of stock.

Friday, September 12, 2008

Northrop Grumman Corp. has received good news in the wake of the Pentagon’s decision to delay a lucrative Air Force tanker contract.


The U.S. Navy said it awarded the Los Angeles defense contractor a $5.1 billion contract to build its first next-generation aircraft carrier.

The USS Gerald R. Ford, named after the late president, will be the first in a new class of aircraft carriers in more than 40 years and will replace the nuclear-powered Nimitz-class carriers that have been in service since the early 1970s. The new carrier is expected to launch in 2015.

The Navy said Ford-class carriers will eventually replace the 11 carriers currently in service. The final cost of the first new carrier is expected to be about $8.3 billion, including non-recurring costs associated with design and start-up.


Thursday, September 11, 2008

Ticketmaster shares plunged 18 percent on Thursday after Live Nation Inc. said it signed a seven-year deal to sell tickets at North American venues managed by SMG, one of the nation’s largest operators of arenas, stadiums and theaters.

SMG, which is owned by private equity fund American Capital LTD, is considered to be West Hollywood-based Ticketmaster's second largest customer. It manages such facilities as San Francisco’s Bill Graham Civic Auditorium, Chicago's Soldier Field, and New Orleans' Superdome.

Live Nation is launching its own ticketing service to compete with its current vendor, Ticketmaster, once their contract expires at the end of the year. The agreement with the Philadelphia-based SMG helps move the Los Angeles concert promoter into the ticketing business beyond its own venues.

The deal, which will start in late 2009, should result in a 25 percent annual increase in the 13 million tickets the company expects to sell over the next seven years, Live Nation said.

Ticketmaster shares closed down $3.32 to $15.45 on the Nasdaq. Live Nation shares closed up $1.03, or 6.5 percent, to $16.90 on the New York Stock Exchange.

On Sunday, prime contractor Boeing and its partner Northrop Grumman, which designed and built the megawatt laser for the airborne laser aircraft, successfully fired the laser during ground testing at Edwards Air Force Base in California -- proving that the laser will be capable of destroying a missile in flight.

Both companies, along with Bethesda, Md.-based Lockheed Martin Corp., are expected to provide further details on the ground testing during a conference call Tuesday.

The Missile Defense Agency's airborne laser aircraft is a modified 747-400F freighter, whose back half carries the high-energy laser.

In an indication of just how seriously North American engineers are taking integrated safety in applications where life and limb are on the line, Walt Disney Imagineering and Siemens Energy & Automation have been working together on a PLC-based safety system for busbar-powered rides.

Wednesday, September 3, 2008

The FCC has swarmed Wilmington to prepare it for next week's roll-out, but February's nationwide changeover looms as a much larger task.

By Jim Puzzanghera, Los Angeles Times Staff Writer September 3, 2008

WILMINGTON, N.C. -- The future of broadcast television is set to premiere in this quaint seaside city next week. And the federal government is working hard -- too hard, some say -- to make sure it's a hit here.

At noon on Monday, Wilmington's five commercial broadcast stations are scheduled to become the nation's first to permanently switch to all-digital signals, serving as a test of the government-mandated transition that other stations across the country will make in February.

"It's like landing on the moon," said Constance Henley Knox, general manager of CBS affiliate WILM. "We're making history."

The change is the biggest for over-the-air television since the advent of color 50 years ago. The more efficient signals, which many stations already are transmitting, provide a much clearer picture and allow broadcasters to offer four or more programs at the same time on new sub-channels.

But the end of analog broadcasts could leave many viewers who depend on rabbit ears and other antennas seeing nothing but static unless they upgrade their equipment. That's because older sets can't pick up the digital signals.

Although most people who get TV from cable, satellite or phone companies will be unaffected, viewers who rely on antennas need a digital TV or a special converter box.

So for the last four months, the Federal Communications Commission has lavished disproportionate attention on Wilmington, the nation's 135th-largest media market with 180,000 TV-watching households, to eliminate any chance the test run will flop.

A dozen FCC staffers have spent the summer crisscrossing the region like tourists to raise public awareness. They've visited the Poplar Grove Plantation farmers market and the Pender County Blueberry Festival. They've been to the 30th anniversary party for the public library in Elizabethtown and made friends at the Mae Coffee Shop in Whiteville. FCC Chairman Kevin J. Martin has visited five times to spread the word.

By all accounts the region is ready after the unprecedented FCC effort, which supplemented an aggressive publicity campaign by broadcasters. In a recent survey of Wilmington-area residents by the National Assn. of Broadcasters, 77% of respondents knew when the switch was occurring.

"I don't think I've run into anybody who doesn't know about it," Louis Pillarella, a 68-year-old engineer from Wilmington, said last week during a digital TV expo where Martin and three FCC staffers answered questions.

But the all-out federal effort is a major reason a successful test of what one Wilmington station has dubbed "the big switch" could turn out to be a big illusion.

"It's great Wilmington has come forward and offered to be the canary in the coal mine," said Joel Kelsey, a policy analyst with Consumers Union, the nonprofit publisher of Consumer Reports magazine. "But we have several concerns about just how good a canary Wilmington is going to be."

One is that no other place will get the type of personal oversight that the FCC has showered on Wilmington. Other media markets will be visited by only a single FCC commissioner, accompanied by a few staffers, for a couple of days.

Another reason a successful test in Wilmington may not be indicative of success in the rest of the country: Only about 8% of the area's homes rely on antennas, compared with 12% nationwide, according to Nielsen Co.

What's more, its flat topography eliminates the problems some viewers in Los Angeles and other hilly areas could face trying to tune in to digital signals. Poor reception leads to frozen pictures or blank screens.

FCC Commissioner Michael J. Copps said he hoped a smooth transition in Wilmington wouldn't trigger complacency in Washington.

"The worst thing would be if we all get in the airplane Sept. 8 and come home and say, 'That's that,'" said Copps, who proposed the test-market idea. "It's still such a huge leap that we're making. Even though we have this one little test, it still boggles my mind we're going to pull the lever on everyone else in February."

To free up more airwaves for public safety communications and wireless devices, the federal government mandated that all full-power TV stations permanently turn off their analog transmitters and broadcast only in digital by the end of the day Feb. 17. The millions of people with older TV sets who receive signals via antennas will need converter boxes, which typically cost $40 to $70. The government is subsidizing them through $40 coupons.

Last year, a coalition made up of broadcasters and consumer and civil rights groups launched a nationwide public awareness effort. But some groups and members of Congress have criticized the federal government for not doing enough to assure an easy transition. The FCC sought a test market, and Wilmington's broadcasters volunteered.

Martin said Wilmington needed extra resources because it was making the transition early, and he declared that the test was already paying dividends. After seeing the benefits of having staffers on the ground there, Martin announced last month that FCC commissioners would fan out to the 80 markets with the most over-the-air-only households, including Los Angeles, between now and February. They plan to hold town hall meetings and other events to raise awareness and answer questions about the transition.