The objective of the NEM is for Malaysia to join the ranks of the high-income economies, but not at all costs. The growth process needs to be both inclusive and sustainable. Inclusive growth enables the benefits to be broadly shared across all communities. Sustainable growth augments the wealth of current generations in a way that does not come at the expense of future generations.

A number of strategic reform initiatives have been proposed. These are aimed at greater private initiative, better skills, more competition, a leaner public sector, pro-growth affirmative action, a better knowledge base and infrastructure, the selective promotion of sectors, and environmental as well as fiscal sustainability.

The next step of the process will be a public consultation to gather feedback on the key principles; afterwards, the key recommendations will be translated into actionable policies. The NEM represents a shift of emphasis in several dimensions:

Refocusing from quantity to quality-driven growth. Mere accumulation of capital and labor quantities is insufficient for sustained long-term growth. To boost productivity, Malaysia needs to refocus on quality investment in physical and human capital.

Relying more on private sector initiative. This involves rolling back the government’s presence in some areas, promoting competition and exposing all commercial activities (including those of GLCs) to the same rules of the game.

Making decisions bottom-up rather than top-down. Bottom-up approaches involve decentralized and participative processes that rest on local autonomy and accountability - often a source of healthy competition at the subnational level, as China's case illustrates.

Allowing for unbalanced regional growth. Growth accelerates if economic activity is geographically concentrated rather than spread out. Malaysia needs to promote clustered growth, but also ensure good connectivity between where people live and work.

Providing selective, smart incentives. Transformation of industrial policies into smart innovation and technology policies will enable Malaysia to concentrate scarce public resources on activities that are most likely to catalyze value.

Reorienting horizons towards emerging markets. Malaysia can take advantage of emerging market growth by leveraging on its diverse workforce and by strengthening linkages with Asia and the Middle East.

Welcoming foreign talent including the diaspora. As Malaysia improves the pool of talent domestically, foreign skilled labor can fill the gap in the meantime. Foreign talent does not subtract from local opportunities - on the contrary, it generates positive spill-over effects to the benefit of everyone.

Overall, the New Economic Model demonstrates the clear recognition that Malaysia needs to introduce deep-reaching structural reforms to boost growth. The proposed measures represent a significant and welcome step in this direction. What will matter most now is the translation of proposed principles into actionable policies and the strong and multi-year commitment to implement them.

Source: http://blogs.worldbank.org/eastasiapacific/node/2887

Malaysia's 'New Economic Model'

KUALA LUMPUR, March 30 — Malaysian Prime Minister Datuk Seri Najib Razak today unveiled a raft of economic measures that he said would propel this Southeast Asian country to developed nation status by 2020. Following are some of the highlights of what he announced:

• State investor Khazanah to sell 32 per cent stake in Pos Malaysia.
• To list stakes in two Petronas units.
• Facilitate foreign direct and domestic direct investments in emerging industries/sectors.
• Remove distortions in regulation and licensing, including replacement of the Approved Permit system with a negative list of imports.
• Reduce direct state participation in the economy.
• Divest GLCs in industries where the private sector is operating effectively.
• Strengthen the competitive environment by introducing fair trade legislation.
• Set up an Equal Opportunity Commission to cover discriminatory and unfair practices.
• Review remaining entry restrictions in products and services sectors.
• Phase out price controls and subsidies that distort markets for goods and services.
• Apply government savings to a wider social safety net for the bottom 40 per cent of households, prior to subsidy removal.
• Have zero tolerance for corruption.
• Create a transformation fund to assist distressed firms during the reform period.
• Ease entry and exit of firms as well as high-skilled workers.
• Simplify bankruptcy laws pertaining to companies and individuals to promote vibrant entrepreneurship.
• Improve access to specialised skills.
• Use appropriate pricing, regulatory and strategic policies to manage non-renewable resources sustainably.
• Develop a comprehensive energy policy.
• Develop banking capacity to assess credit approvals for green investment using non-collateral-based criteria.
• Liberalise entry of foreign experts specialising in financial analysis of the viability of green technology projects.
• Reduce wastage and avoid cost overruns by better controlling expenditure.
• Establish an open, efficient and transparent government procurement process.
• Adopt international best practices on fiscal transparency. — Reuters

With the public release of the Apple iPad looming, Elan Microelectronics, a Taiwanese chipmaker, is suing Apple, claiming many Apple products infringe on its multitouch patents.

Elan has asked the International Trade Commission (ITC) to ban imports of the iPhone, iPod Touch, MacBook, Magic Mouse and even the yet-to-be-released iPad.

"We have taken the step of filing the ITC complaint as a continuation of our efforts to enforce our patent rights against Apple's ongoing infringement. A proceeding in the ITC offers a quick and effective way for Elan to enforce its patent," the company said in a statement.

Elan says it owns patents covering "touch-sensitive input devices with the ability to detect the simultaneous presence of two or more fingers," which is exactly what these Apple products do. Apple has not released a formal response to the lawsuit yet.

This isn't the first time Elan has sued over its multitouch patent. Two years ago it sued Synaptics in a similar case. Synaptics ended up entering a licensing deal with Elan, but it's not a foregone conclusion that Apple will do the same thing since Apple is no stranger to prolonged legal battles.

There is also an element of irony in Apple being sued for multitouch patent infringement because the company recently brought a similar suit against smartphone maker HTC. Apple said HTC phones with the Android operating system infringed on over 20 Apple patents, including some that had to do with multitouch interfaces. The lawsuit won't affect sales of pre-ordered iPads slated to go on sale this Saturday, many of which have already shipped.

BEIJING - Strengthened media cooperation between India and China will help improve understanding and promote more beneficial bilateral ties between the two countries, officials from both sides proposed on Tuesday.

"China and India are enjoying a relationship which is deepening and broadening," S. Jaishankar, the Indian ambassador to China, said at the 2010 India-China Development Forum in Beijing. Jaishankar noted in his speech that both nations had witnessed some controversial and negative media coverage about each other last year, but said it was "no use blaming each other".

Jaishankar suggested that China shift its focus away from the various media debates in India and towards evaluating the results those voices actually bring about.

"Our media coverage will be more positive if we promote our relationship, and of course, a more efficient interpretation and dialogue is needed for such progress."

Wang Chen, minister of the State Council Information Office, also noted the importance of the media, as direct communication between the two peoples was limited.

"China and India together account for almost half of the world's population. More intensified media coverage by both countries about our progress and efforts is much needed," he said. Wang proposed that both countries report in a more positive and all-round manner, as well as cover mutual achievements.

"We hope the media will become the window of understanding for both sides," Wang said.

"Although both Chinese and Indian media have made great strides in recent years, the Western media still has the upper hand. China and India get to know each other through Western media outlets such as CNN and BBC, which somehow leads to misunderstanding. Media cooperation should be enhanced between the two countries."

A media cooperation committee was also proposed during the forum.

Zeng Jianhua, executive director of the Department of Asian, African and Latin American Affairs at the Chinese People's Institute of Foreign Affairs, said such a panel would help China and India put aside differences arising from their different political and cultural backgrounds, and seek common ground for mutual development.

As cloud computing-fueled devices like the iPad grow in popularity, so will associated greenhouse gas emissions, according to Greenpeace's "Make IT Green" report. The report, which dubs 2010 the Year of the Cloud, offers up a disturbing statistic: Cloud computing greenhouse gas emissions will triple by 2020.

The increase in emissions makes sense. As we increasingly rely on the cloud to store our movies, music, and documents, cloud providers will continue to build more data centers--many of which are powered by coal. Facebook, for example, recently announced that it is building a data center in Oregon that will be powered mostly by coal-fired power stations, much to the chagrin of groups like Greenpeace.

The solution to the cloud computing problem is fairly obvious. Greenpeace explains in its report, "Companies like Facebook, Google, and other large players in the cloud computing market must advocate for policy change at the local, national, and international levels to ensure that, as their appetite for energy increases, so does the supply of renewable energy." As we've noted before, companies like IBM, Google, and HP have already begun to make strides in cutting data center energy use. But there is still plenty of work to be done--as it stands, the cloud will use 1,963.74 billion kilowatt hours of electricity by 2020.

Intel's switch to the Nehalem architecture was finally completed Tuesday with the launch of the Nehalem-EX Xeon 6500 and 7500 processors, the last of the Core, Xeon, and Itanium chips to get the Quick Path Interconnect and a slew of features that make Intel chips compete head-to-head with alternatives from Advanced Micro Devices. The price war at the midrange and high-end of the x64 market can now get underway, while the all-out, total price war awaits the debut of AMD's Opteron 6100 processors in the second quarter.

Since the summer of 2008, Intel has been previewing its top-end, eight-core Nehalem-EX beast, which we now know as the Xeon X7560. As it has done with prior generations of Xeons, the Nehalem-EX line comprises not one or two chips, but a mix of chips with different features (clock speed, cache memory, HyperThreading, and Turbo Boost) dialed up and down to give customers chips tuned for specific workloads.

While last year's Nehalem-EP Xeon 5500 and this year's Westmere-EP Xeon 5600 processors are aimed at workstations or servers with two sockets, with the Nehalem-EX lineup, Intel has broadened the definition of its Expandable Server (this is apparently what EX is short for, with EP supposedly an abbreviation for Efficient Performance) to include two-socket machines as well as the four-socket and larger machines that prior generations of Xeon MP processors were designed for.

Intel, no doubt, would have preferred to keep the Xeon DP and Xeon MP product lines more distinct, and charged a hefty premium for machines that needed expanded processor sockets or memory capability. But server makers and their customers were having none of that. With the rapid adoption of server virtualization and the need for larger memory footprints even for two-socket boxes, the Nehalem-EX processors have been tweaked so they can be used to support very fat memory configurations on even two-socket workhorse servers. This will eat into the volume Xeon 5500 and 5600 market, to be sure, but it is better to sell a Xeon 6500 or 7500 server in a two-socket box than have a customer dump Intel for AMD.

The Xeon 6500 and 7500 processors will also blur some lines between Xeon processors and the former "flagship" Itanium processors, which were supposed to take over the desktop and server arena starting a decade ago, but have been relegated mostly to high-end servers from HP running HP-UX, NonStop, and OpenVMS at this point in their history. The Itaniums were distinct in many ways from the Xeons, but the main distinction they held was better reliability, availability, and serviceability (RAS) features than Xeons had, on par with mainframe, RISC, and other proprietary architectures from days gone by.

The eight-core Nehalem-EX Xeon 7500 beast

But at the launch event today in San Francisco, Kirk Skaugen, vice president of the Intel Architecture Group and general manager of its Data Center Group, made no bones about the fact that the Nehalem-EX processors share common RAS features with the Itanium 9300 processors launched in early February, by way of the Boxboro chipset that the two lines have in common.

The new chip, explained Skaugen, has 20 new RAS features, including extended page tables and virtual I/O capabilities, as well as a function found in mainframes, RISC iron, and Itaniums called machine check architecture recovery, which allows a server to cope with a double-bit error in main memory without halting the system. With Windows, Solaris, and Linux supporting these RAS features, as well as VMware's ESX Server hypervisor, this makes servers based on the Xeon 7500s just as suitable a replacement for proprietary midrange and mainframe platforms and RISC/Unix servers as the formerly beloved Itaniums.

Skaugen said that the Nehalem-EX chips would allow server makers to create two-socket servers that support up to 512GB of main memory, nearly three times as much as AMD can do using 8GB DIMMs with the Magny-Cours Opteron 6100s announced yesterday. Intel will be able to support 1TB of main memory in a four-socket configuration, while the controller inside the Opteron 6100 only allows a four-socket machine using these chips to address a maximum of 512GB.

Skaugen rubbed it in a little that Intel's Nehalem-EX partners had over 50 new products in rack, tower, and blade form factors, and that it had 75 per cent more four-socket designs than with any prior server chip launch in its history. A dozen OEM partners have 15 different servers in the works that will span eight or more processor sockets, and apparently some are pushing their designs up to 16, 32, or 64 sockets.

The big bad box at the Nehalem-EX launch, of course, was the Altix UV massively parallel supercomputer, which El Reg told you all about last November. The Altix UV machines allow for up to 2,048 cores (that's 256 sockets and 128 two-socket blades) to be lashed together in a shared memory system suitable for running HPC codes. The shared global memory is not the same as a more tightly coupled symmetric multiprocessing (SMP) or non-uniform memory access (NUMA) cluster used in general purpose servers for running applications and databases. But that said, the Altix UVs are very powerful machines indeed and are intended to scale to petaflops of performance.

The Boxboro chipset that Intel is shipping as a companion to the Nehalem-EX chips supports configurations with two, four, or eight sockets gluelessly. If you want more sockets than that, you have to create your own chipsets, as HP, IBM, Silicon Graphics, and Bull have done for sure and others will no doubt follow.

But you can't just plug any old Nehalem-EX chip into any old configuration. That would be too simple, and Intel likes to charge premiums for features, like most capitalists. Take a gander at the feeds and speeds of the Nehalem-EX lineup:

The Intel Nehalem-EX Xeon 7500 and 6500 processors

The first thing you will notice is that there are two different families of Nehalem-EX processors. The Xeon 7500s are aimed at general-purpose workloads and offer the most socket expandability. All of these chips can be used in two-socket or four-socket boxes, and some of them can be used in eight-socket or larger machines, too. The Xeon 6500s are cut-down versions of the chips that only work in two-socket boxes and that are specially tuned for the HPC market. These chips, explained Skaugen, were optimized to have the highest bytes per floating point operation ratio while minimizing the amount of node-to-node communication among the processors in the complex.

The top-end X7560 part has eight cores spinning at 2.26GHz, has 24MB of L3 cache on the chip, and is rated at 130 watts using Intel's thermal design point (TDP) scale. The chip supports Turbo Boost, which allows a core to have its cycle time jacked up if other cores are shut down when they're not being used, and it also supports Intel's HyperThreading simultaneous multithreading, which virtualizes the physical pipeline in the chip so it looks like two virtual pipelines to a system's operating system and its applications. In best-case scenarios, HT can boost performance of applications by around 30 per cent. In 1,000-unit trays, the per-chip price for the X7560 is a whopping $3,692. That is exactly what Intel charged for a dual-core Montvale Itanium 2 with 24MB of L3 cache.

The X7550 drops the clocks down to 2GHz, chops the L3 cache down to 18MB, and the price comes down to $2,729, which is exactly what Intel was charging for its top-bin six-core Dunnington Xeon X7460 processor running at 2.66GHz with 16MB of L3 cache. The next part down, the X7542, jacks the clocks up to 2.66GHz, keeps the 18MB cache, cuts out HyperThreading, and reduces the core count from eight to six; the price drops to $1,980.

For that same $1,980 you can get a standard 105 watt part, the E7540, running at 2GHz with six cores and that same 18MB cache. If you are willing to take lower clock speeds, you can get even cheaper standard parts, the E7530 and E7520, which cost $1,391 and $856, respectively. Intel has also cooked up two low-voltage parts, the L7555 and L7545, running at 1.86GHz and rated at 95 watts, which have eight and six cores, respectively. These are reasonably pricey chips that will no doubt be used inside Nehalem-EX blade servers where a premium is expected in exchange for extra density.

Generally speaking, the Xeon 6500 processors are cheaper than their Xeon 7500 counterparts because they have some features and functions turned off, as El Reg predicted they would last fall. This is in keeping with the general philosophy that HPC shops are super-stingy and will not pay one extra penny for a feature they don't want and will never use.

The Nehalem-EX processors are implemented in 45 nanometer processes and have 2.3 billion transistors. ®

When Steve Jobs met Google boss Eric Schmidt for coffee late last week, they may or may not have reached some common ground on certain hot-button subjects. But odds are, they didn't see eye-to-eye on Adobe Flash. As Jobs prepares to ship his much ballyhooed Apple iPad without even the possibility of running Flash - which he calls "buggy," littered with security holes, and a "CPU hog" - Google is actually integrating the beleaguered plug-in with its Chrome browser.

With a blog post on Tuesday, Mountain View announced that Flash has been integrated with Chrome's developer build and that it plans to offer similar integration with its shipping browser as quickly as possible.

Google has been known to say that HTML5 is the way forward for internet applications. But clearly, it believes in the plug-in as well, and it has no intention of pushing all development into the browser proper.

"Just when we thought that Google was the champion of HTML5 they turn around and partner with Adobe on Flash to ensure that the web remains a mess of proprietary brain damage," one netizen said in response to Google's post.

Last summer, Google proposed a new browser plug-in API, and with today's blog post, it also said that Adobe and Mozilla have joined this effort. "Improving the traditional browser plug-in model will make it possible for plug-ins to be just as fast, stable, and secure as the browser’s HTML and JavaScript engines," the company said. "Over time this will enable HTML, Flash, and other plug-ins to be used together more seamlessly in rendering and scripting.

"These improvements will encourage innovation in both the HTML and plug-in landscapes, improving the web experience for users and developers alike."

What's more, Mountain View is developing a native code browser platform of its own, dubbed Native Client. This is already rolled into Chrome, and it will be an "important part" of the company's browser-based Chrome operating system, set for launch in the fall.

By integrating Flash with Chrome, Google said that it will ensure users always receive the latest version of the plug-in and that it will automatically update the plug-in as needed via Chrome's existing update mechanism. And in the future, the company added, it will include Flash content in Chrome's "sandbox," which restricts the system privileges of Chrome's rendering engine in an effort to ward off attacks.

In July, with a post to the Mozilla wiki, Google proposed an update to the Netscape Plug-in Application Programming Interface (NPAPI), the API still in use with browsers like Chrome and Firefox, and both Adobe and Mozilla are now working to help define the update.

"The traditional browser plug-in model has enabled tremendous innovation on the web, but it also presents challenges for both plug-ins and browsers. The browser plug-in interface is loosely specified, limited in capability and varies across browsers and operating systems. This can lead to incompatibilities, reduction in performance and some security headaches," Google said today.

"This new API aims to address the shortcomings of the current browser plug-in model."

The new setup was developed in part to make it easier for developers to use NPAPI in tandem with Native Client. "This will allow pages to use Native Client modules for a number of the purposes that browser plugins are currently used for, while significantly increasing their safety," Google said when the new API was first announced.
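For a concrete sense of how pages interact with the traditional NPAPI model discussed above: installed plug-ins are exposed to page scripts through the browser's `navigator.plugins` array, which sites have long used to detect Flash. Below is a minimal, illustrative sketch; the `hasFlash` helper and the mock plug-in list are our own stand-ins so the logic can run outside a browser, where a real page would pass `navigator.plugins` directly.

```javascript
// Sketch: how a 2010-era page might detect the Flash plug-in.
// In a browser, the NPAPI-registered plug-ins appear in
// navigator.plugins; the mock list below stands in for it here.
function hasFlash(pluginList) {
  for (var i = 0; i < pluginList.length; i++) {
    // "Shockwave Flash" is the name Adobe's plug-in reports.
    if (/Shockwave Flash/.test(pluginList[i].name)) {
      return true;
    }
  }
  return false;
}

// In a real page: hasFlash(navigator.plugins)
var mockPlugins = [{ name: "Shockwave Flash 10.0 r45" }];
console.log(hasFlash(mockPlugins)); // → true
console.log(hasFlash([]));          // → false
```

Because the plug-in interface is so loosely specified, even this simple detection dance historically behaved differently across browsers - part of what Google's proposed API update aims to clean up.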

Native Client and NPAPI have been brewing for months upon months, but today's Chrome announcement would seem to be a conscious answer to Steve Jobs' hard-and-fast stance on Flash. Presumably, the company sees this as a way to ingratiate itself with existing Flash shops who've been shunned by the Apple cult leader.

One of the many questions that remain is whether Chrome will give users the option of not installing Flash. With the new developer build - available here - you must enable integrated Flash with a command line flag. ®

Tuesday, 30 March 2010

In its continuing quest to be more than just the world’s preferred search engine, Google recently added new features to its free website analysis program aimed at enterprises.

“Web Analytics is essentially a sophisticated website monitoring system,” said head of Web Analytics at Google South-East Asia Vinoaj Vijeyakumaar.

“Beyond just noting how many people visit your site, you can see what they do there and how much time they spend doing it.

“You can set and manage sales goals and receive automatic business reports based on those goals. This kind of intelligence can greatly improve productivity in any industry,” he said.
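The goal tracking described above was driven, at the time, by a small JavaScript snippet on each page: Google's asynchronous tracker queued commands onto a plain `_gaq` array until the ga.js script loaded and processed them. A hedged sketch follows; the property ID `UA-XXXXXX-X`, the `recordSaleCompleted` helper, and the virtual pageview path are placeholders of ours, not values from the article.

```javascript
// The async Analytics queue: commands accumulate in a plain array
// until Google's ga.js script loads and replaces it with a real
// command processor.
var _gaq = _gaq || [];

// "UA-XXXXXX-X" is a placeholder property ID, not a real account.
_gaq.push(['_setAccount', 'UA-XXXXXX-X']);
_gaq.push(['_trackPageview']);

// Goals were typically matched against URLs, so a site could record
// a virtual pageview (placeholder path) when a sale completed:
function recordSaleCompleted() {
  _gaq.push(['_trackPageview', '/goals/sale-complete']);
}

recordSaleCompleted();
console.log(_gaq.length); // → 3 commands queued for ga.js
```

Because `_gaq` starts life as an ordinary array, the snippet never blocks page rendering waiting for Google's script - the design choice that made the asynchronous tracker the recommended setup.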

With the new enhancements, Google added about 20 preset goals to the Web Analytics repertoire. In-depth intelligence reports have also been enhanced. However, the company acknowledged that the algorithms used for those reports will not be made publicly available.

To help enterprises get the most out of Web Analytics, Google has appointed “authorised consultants” who are certified by the company to train staff members in how to use the program.

“We have three authorised consultants based in Singapore and we hope to open one in Malaysia very soon,” said head of communications for Google South-East Asia Dickson Seow.

“Knowing how to use all the features in the most effective manner can help online traders stay ahead of the game.” For more information, surf to www.google.com/analytics.

Larry Ellison likes to buzz rotten fruit off some corporate type's head. Over the years Microsoft, PeopleSoft, BEA Systems, SAP, and Red Hat have lined up to be duly pelted during calls with Wall St or during Ellison's company's mega OpenWorld customer and partner conference.

It's all good theater in the crucible of Silicon Valley, but it's theater nonetheless, and a form of performance that will always have a shallow veneer. When there's money involved, you can say what you want about your rivals during a conference call - it's just words.

For example: almost two-thirds of SAP implementations run on Oracle's database, which means SAP - a company regularly pilloried by Ellison - actually translates into big money and helps keep Oracle's chief executive in yachts.

Turning to Oracle's acquisition of Sun Microsystems, then, it's with some justification that those people involved in technologies that were spun up by Sun during its era of a thousand blooming flowers and that have little visible business return on investment should now feel worried.

The OpenSolaris community started screaming that it was being ignored by Oracle. The giant responded that it wasn't ignoring them; it was just overworked getting its arms around the whole Sun thing.

To the ranks of the concerned, you can now add those working to put Solaris and OpenSolaris on IBM's Z-series mainframe. One Solaris on Z-series supporter contacted The Reg to say:

The SystemZ port of Solaris is dead. Oracle pulled all the plugs and refused to further help the authors. Critical parts of libc.so.1, the core userland library, are closed source. Oracle now refuses to give precompiled binaries of newer versions of the closed parts to the SystemZ port community, effectively ending this port, because the missing bits cannot be replicated or bypassed.

Also concerned is David Boyes, president and chief technologist of Sine Nomine Associates - the engineering firm that helped put OpenSolaris on IBM's System Z mainframe in 2008. OpenSolaris was to become part of the main Solaris product.

Boyes told The Reg that the Sun employee working on the port has gone - chopped as the result of Ellison's Sun employee cull - and hasn't been replaced. Boyes is certain Oracle is not going to replace that person.

Oracle was unable to comment for this article.

On paper, the future is not too bright for Solaris or OpenSolaris on IBM's mainframe platform. In the two years of the project's life, it's been downloaded just 1,000 times - sometimes repeatedly by the same organizations. Otherwise, we're told there are "plenty" of proofs of concept.

Boyes told us it's wrong to say Oracle has "killed" OpenSolaris on IBM's mainframe, but he noted the future is up for grabs as Oracle is combing through the old Sun's software and project assets and deciding what to do with them. The party line from Oracle here and during the recent EclipseCon and the Open Source Business Conference is that it's still working through projects and deciding what to do.

"This is all about politics and has nothing to do with technology," Boyes said, angry that so much of his own company's time - 20,000 to 30,000 hours - dedicated to the project could have been for nothing. "Guys who worked on the Power and Intel work outside of Sun are pretty damn pissed," he said.

He added that while source code for OpenSolaris is still available and can still be enhanced, unless Oracle commits to putting Sun's operating system on IBM's Z mainframe he'll have to put it on the back burner. "It will no longer have the priority if they make it clear this is going nowhere, and we will have to reconsider what we are doing," Boyes said.

Boyes is right. This is political. Solaris has a future inside Oracle, on Exadata servers running Oracle's database. Where OpenSolaris fits into that is unclear.

As for Solaris on the platform of a competitor that Ellison has taken enormous pleasure in pelting since the Sun acquisition, well - if Ellison does kill it, it won't be for theatrical reasons. It'll be because he's decided he can't make any money by having his own software run on IBM hardware.

If you want a sign of how much things have changed under the new management even at this early stage, consider this lesson from another corner of the OpenSolaris and Solaris camp.

InfoWorld has reported that Oracle has tweaked the Solaris download license, so that you can no longer download Solaris for free. You can now only use Solaris free as part of a 90-day trial; beyond that, you must purchase a service contract. Under that nice Sun - but slightly stoopid Sun - all you had to do was jump through the hoops of some online survey and make sure you were smart enough to give a working email address for the download.

Yes, the flowers are wilting and anything that survives under Oracle will only bloom if it can deliver a return on Sun's investment. ®

Google’s recent action of redirecting google.cn traffic to servers in Hong Kong has raised much comment. Google’s claim is that it is doing this so it doesn’t have to follow Chinese government demands to censor what it has online.

The Chinese government, in its turn, has stated that it has the right to set the rules by which a corporation functions in its country.

In the process of this dispute, the diverse views of Chinese Internet users, its netizens, appear to be missing from Google's considerations.

Some netizens in China posted an open letter to Google and to the Chinese government ministries, asking that each side in the dispute present, in an open way, what their views are so that the netizens can be part of the discussion and decision-making process.(1)

Google has ignored this request. It has done what it decided to do. It claims that this is its good deed for the world. But is it? Is Google acting with concern for netizens in China? The authors of the letter objected to the secret manner Google used to make its decision.

This situation is reminiscent of an experience that users of what is known as "Usenet" had with Google almost 10 years ago. In 2001, Google acquired from another company, Deja.com, an archive of posts put on Usenet by its users. Deja was going out of business and allegedly sold the Usenet posts it had archived to Google.

At the time a number of users of Usenet were surprised that the posts they had contributed to Usenet discussion groups had been sold from one company to another. Also at the time there were concerns about what Google might be planning to do with the posts.

An effort was made to ask Google to recognize that Usenet itself had grown up as part of a cooperative online community of users who contributed their efforts and articles to help to enrich this online community.

There was a concern among users on Usenet. Would the forms of participatory decision-making in this online community, which had been developed to involve users, be lost when a corporation like Google got involved in owning and controlling the archives of Usenet posts? Google was asked to contribute the archive, or at least a copy of the archive, to a public entity that could protect it.

At the time, Google ignored these requests. Instead Google even began putting a copyright symbol on the articles in the archive, claiming that Google owned the copyright to the many contributed posts. This was contrary to the Berne Convention, the law regarding such posts. The Berne Convention, which the US agreed to respect as its copyright law as of March 1, 1989, states that the posts were the property of the users who had created them, not of Google.

Eventually Google stopped putting its copyright symbol on Usenet posts. This took quite a while, however, even though the illegal nature of such a claim had been pointed out to Google soon after it started placing its copyright symbol on Usenet users' posts.

The significant point of the experience that I and other Usenet users had with Google, however, is that we found that Google acted according to its own interests and its own directives. Management at Google refused to respond to users' concerns. In the course of this struggle I wrote an article titled “Culture Clash,” which appeared on February 26, 2001 in the online magazine Telepolis, describing what was happening with Google. (2)

In response to the article, I was invited to give a talk at Stanford University in California, where Sergey Brin and Larry Page, the creators of Google, had done their research on the search engine algorithm that was the basis for the Google search engine. I was told I would have a chance to debate what I had written in my article with Brin and Page at a program at Stanford. Once I arrived at Stanford, however, I was told that they would not be part of the program. Instead I could give the talk without them at Stanford and then go and speak at the corporate headquarters of Google.

I gave a talk at Stanford and then went to Google’s Mountain View headquarters and gave the talk a second time. While I appreciated having the chance to speak to, and afterwards have a discussion with, some of those working at Google at the time, neither Brin nor Page was available to participate in the program or to talk with me. Instead, the person I was told I could speak to offered no means for Usenet users to have input into Google's decision-making process.

Based on my experience with Google, I wrote the article “Commodifying Usenet and the Usenet Archive or Continuing the Online Cooperative Usenet Culture?” (3) The article was published in the scholarly journal “Science Studies” in January 2002.

In the article, I described how Usenet had been created by a cooperative online process. An example I gave was when one of the pioneers involved in early Usenet development wanted to change the name of Usenet. He proposed this change to the users of Usenet. After an extended discussion it became clear that many users disagreed. The plan to change the name of Usenet was dropped. The name remained as Usenet. There were a number of other similar examples in the early days of Usenet development where users were involved in the discussion of problems and in contributing to the decisions that were made.(4)

This was, however, no longer the case once Google became involved with Usenet. Some aspects of Usenet have survived, especially the discussion groups dealing with technical issues. A number of other discussion groups that existed on Usenet, however, were negatively affected by the decisions Google and other companies began to make, not only with respect to how Usenet was archived or searched, but also affecting other aspects of Usenet.

In trying to understand what has happened as the corporate world represented by Google and other online services began to affect the online world and the experience of netizens in this online world, it is helpful to also keep in mind Google's own origins.

When Brin and Page were students at Stanford University working on their search engine project, they wrote a paper criticizing the commercialization of search engine research. In the paper, they proposed the need for an open laboratory approach to working on search engine design. Such an approach would allow the best results to be developed and built by the research community. Brin and Page criticized the commercial decision-making processes, particularly the secrecy, the lack of community input, and the focus on advertisements. They argued that this had caused "search engine technology to remain largely a black art and to be advertising oriented." (5)

The project that Brin and Page were part of had National Science Foundation (NSF) funding. US government funding during this period of the late 1990s took a turn toward promoting commercialization as opposed to supporting basic research in science and technology. The Director of the NSF, Dr. Rita Colwell, explained to the US Congress that the "transfer to the private sector of 'people' - first supported by the NSF at universities - should be viewed as the ultimate success" of the US government technology policy.(6)

The significance of this change was that Brin and Page became connected with the same "black art" they had critiqued as graduate students. The objective of the Google corporate structure is not to facilitate the sharing of ideas and the communication that enables the best design of search engine technology, which were the objectives Brin and Page advocated as researchers at Stanford.

More seriously, the vision of the Internet as a place where netizens strive to understand the problems that develop, and work together to find the solutions that will continue to foster an environment facilitating communication, is a vision the corporate entities do not share. Hence the culture clash that developed between Google and the Usenet community. Keeping this perspective in mind, it is helpful to look at what Google is doing with respect to netizens in China.

The situation with regard to China’s online world is one in which there are many important discussions online among netizens. Many of China’s netizens contribute to serious discussions on issues concerning the problems in China and the world.(7) This is an important development with respect to the Internet, a development that other netizens around the world can learn from.

Instead of Google learning from what is happening in China and trying to hear what China’s netizens are saying about Google’s concerns and plans, Google acts in ways that have an effect on China’s netizens without involving them in its decision-making process.

Unfortunately, many users around the world have become dependent on Google for many of their Internet activities and are thus at the mercy of yet another corporate entity that does not care for the development of the kind of cooperative communication that the Internet and netizens have nourished and endeavored to spread more broadly and widely.

What is happening in the struggle between Google and China therefore is important, as Google claims it cares for the Chinese users, but there is no evidence that Google has seen any reason to consider the views and concerns of China's netizens. Google's decision to redirect its google.cn traffic to servers in Hong Kong is but the decision of another corporation acting on the claim that the corporation knows best. And so the culture clash between netizens and Google continues.

Notes

(1) Chinese netizens' open letter to the Chinese Government and Google, Draft for Discussion, Version 0.99, March 2010

GEORGE TOWN: Cradle Fund Sdn Bhd, an agency under the Finance Ministry, is targeting to approve 12 to 24 applications for its pre-seed funds and five to 10 applications for its seed funds from Penang’s technopreneurs this year.

Chief executive officer Nazrin Hassan said Cradle would give RM150,000 in pre-seed funding to a team of two or more technopreneurs from Penang to kick start the development of their ideas.

“Subsequently, the technopreneurs can form their own company to commercialise their intellectual property or sell it to a third party,” he said after signing a Memorandum of Understanding (MoU) with Software Consortium of Penang (Scope) chairman Jeffrey Lim.

The MoU allows Scope to assist Cradle in screening and approving funds from the Cradle Investment Programme (CIP), which is Malaysia’s first development and commercialisation programme that enables budding innovators and aspiring entrepreneurs to transform their raw technology-based ideas into commercially-viable ventures.

Also present was Penang Skills and Development Centre chief executive officer Datuk Boonler Somchit, who witnessed the signing.

Nazrin said for the seed grant, Cradle would be giving up to RM500,000 to a company for commercialising its products.

“To date, we have given a total of RM35mil in pre-seed grants to 387 ideas from all over Malaysia. About 50% of the ideas come from Selangor, and the rest from Penang and other parts of the country.

“Over 50% of the 387 ideas have been successfully commercialised. Some 70% of the ideas we funded were from the information and communication technology sector, while the remainder were ideas from the life and material sciences,” he said.

Meanwhile, Lim said the MoU would allow more technopreneurs from Penang to gain access to funding from the CIP.

“It serves as a catalyst for the creation and growth of a total eco-system for the development and commercialisation of high-technology business in the northern region,” he said.

Monday, 29 March 2010

The following is a transcript of Notes on the News, "Paying Down Rich Nations' Debt," first videocast on March 26, 2010.

Though it doesn't always feel like it, at least not in rich economies like the U.S. and Europe, recovery from the Great Recession is well underway. The world economy as a whole will grow 4% this year, and faster next. As the 2008/2009 crisis recedes, governments everywhere have to drain off the stimulus support that got them through it.

This poses three particular challenges. Get them wrong and we're back in trouble. First, get the timing for stimulus exit right. It will be different for different countries. Governments shouldn't move until they are sure recovery is firmly entrenched. China has made a start. The U.S., as Fed Chairman Ben Bernanke has recently indicated, will have to wait a while.

The second is a global rebalancing of savings to help make recovery sustainable over the long run.

The third--a particular challenge in rich nations--is to cut public debt ratios back to prudent levels. The Great Recession created a great decline in government revenues. Stimulus spending widened the resulting deficits further. The IMF reckons that in the rich countries government debt will have risen from 75% of GDP pre-crisis to 110% of GDP by 2014. Even if stimulus spending is cut back, it only accounts for 10% of the forecast increase in debt--far outweighed by the growing demographic pressure that will raise health and pension spending.

Many of those obligations are in the public sector and thus politically difficult to cut back. Over the medium term, large public debts could lead to high real interest rates and slower growth. Without containment of health and pension spending--perhaps through better targeting of social benefits--the recovery in the rich countries from the Great Recession is going to be a long slow haul.

It turns out that doing nothing with your retirement portfolio was a pretty good way to ride out the financial crisis.

To channel Mark Twain, it appears that rumors of the death of buy-and-hold investing have been greatly exaggerated. Had you simply stayed put in a low-stress, completely pedestrian collection of equity and bond funds, including index funds, over the past few years and continued to contribute to them you would have actually outperformed the market. In other words, a little inertia can be a very good thing.

The obvious argument for buying and holding is that if you try to time the markets it's well nigh impossible that you will sell at the very top or buy back into the market at the very bottom. Blowing this timing can deliver more than a small, smarting tweak to your portfolio in our bubble-prone economy where highs can be dizzying and the lows crushing. A good chunk of the recovery will happen in its very first few days, and the crashes can be swift, too. Good luck trying to call these moves perfectly. You’ll need it.

Javier Estrada, a professor at the IESE Business School in Barcelona, has shown just how much damage can be wrought by pulling out of the markets at the wrong time. Estrada analyzed the Dow Jones industrial average from 1900 through the end of 2007. He found that $100 invested in 1900 would have grown to $25,746 by the end of his study. Yet if you missed just the 10 best days of that entire ride, your total pot at the end would be $9,008, almost two-thirds less than maintaining constant market exposure. Had you missed the best 100 days you would have earned just $87 in that century-plus stretch.
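Estrada's calculation is simple to reproduce in outline: compound a series of daily returns with and without its largest values. The sketch below uses synthetic returns, not the actual Dow data, so the numbers are purely illustrative.

```python
import random

# Synthetic daily returns standing in for a century of market data.
random.seed(42)
daily_returns = [random.gauss(0.0003, 0.01) for _ in range(25_000)]

def terminal_value(returns, start=100.0):
    """Compound a starting stake through a sequence of daily returns."""
    value = start
    for r in returns:
        value *= 1 + r
    return value

# Remove the 10 best days to simulate being out of the market on them.
best_10 = sorted(daily_returns, reverse=True)[:10]
without_best = list(daily_returns)
for r in best_10:
    without_best.remove(r)

full = terminal_value(daily_returns)
missed = terminal_value(without_best)
print(f"full exposure: {full:,.0f}  missing 10 best days: {missed:,.0f}")
```

Even on made-up data the pattern Estrada found shows up: stripping out a handful of the strongest days takes a large bite out of the compounded result.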

The flip side, of course, is that if you avoid the worst days you also outperform. Had you missed the worst 10 days of Dow losses you would have seen your total rise to $78,781. The worst 100 would have earned you $11,198,734.

What this shows is that while timing the markets right can earn you a lot of money, you can also lose a lot. If you are investing for your retirement this is an awfully dangerous gamble. For many investors it may be a better idea to invest in a range of index funds, set it up so you dollar-cost average on a monthly basis, and basically forget about trying to micro-manage it. Morningstar recently tabulated a decade's worth of data and compared how the average investor did vs. the average fund. The answer was that over the past 10 years investors did considerably worse, seeing returns of 1.68%, against 3.2% for all funds, meaning investors were generally off with how they timed their purchases and sales.
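Dollar-cost averaging, as mentioned above, is just a fixed contribution at regular intervals, which mechanically buys more shares when prices are low. A minimal sketch, with hypothetical prices chosen to mimic a crash and recovery:

```python
# Hypothetical monthly fund prices through a downturn and rebound.
monthly_prices = [100, 90, 70, 50, 40, 55, 70, 85, 95, 105]
contribution = 500.0  # fixed dollars invested each month

shares = 0.0
invested = 0.0
for price in monthly_prices:
    shares += contribution / price  # cheap months buy more shares
    invested += contribution

final_value = shares * monthly_prices[-1]
avg_cost = invested / shares
print(f"invested {invested:,.0f}, now worth {final_value:,.0f}, "
      f"average cost per share {avg_cost:.2f}")
```

Because the fixed contribution buys the most shares near the bottom, the average cost per share ends up below the average of the prices themselves, which is the mechanical advantage the article's "stay put and keep contributing" finding relies on.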

In addition Vanguard recently documented the benefits of standing pat in its study "Resilience in volatile markets: 401(k) participation behavior September 2007-December 2009." What the study found was that during a time of exceptional volatility the average defined contribution investor at Vanguard barely altered their saving and investment behavior. On the face of it this sounds like heresy: We are being told more than ever to keep a close eye on our money, and to, in essence, become our own investment managers. But this consistency yielded surprisingly good returns for Vanguard's retirement-centered investors, mitigating the downside of 2008 considerably.

In 2008 during the height of the credit crisis and the stock market meltdown, Vanguard found that traders shifted just 4% of their assets from equities to fixed income. In 2009 that number was 1%. Between September 2007 and December 2009 only 3% of participants abandoned equities.

This might sound like a recipe for financial suicide, but it was the opposite. Since most 401(k) investors at Vanguard kept up their contributions, the median participant account balance actually grew by 10%, vs. a 25% decline for the markets during that period. The beauty of all this was that during the worst financial crisis in several decades those who stayed put, whether by design or simple inertia, ended up buying a lot of securities at the bottom and making money.

What this means is that despite a hyperventilating financial media and daily reports of doom and gloom, most people who stuck with a plan of investing in a diversified portfolio of equity, balanced and bond funds, including index funds, experienced far less volatility. They were able to build on their portfolios during a crisis and buy at the bottom. This paid off in 2009 when median account balances grew by 33% at Vanguard against a 26.5% rise for the S&P 500.

Vanguard is the leading name in indexing, a passive way of owning securities that mirror the investment performance of the world's financial markets. Many, including Vanguard founder John Bogle, argue that indexing remains the easiest, most cost-efficient way for the average investor to invest.

Of course there are arguments to be made against buying and holding indexes. The leading argument is that there are skilled investors that have consistently beaten the markets. This may well be true, but there are caveats to putting your money with a great money manager. The biggest caveat, again, is timing.

Fund managers Ken Heebner, manager of the CGM Fund, and Bill Miller, of Legg Mason's Value Trust, have both earned well-deserved praise for their abilities to beat the Standard & Poor's 500 during their respective tenures at the heads of their funds. No one can, or would try to, take this away from them. Yet both of these guys lagged the markets for years, even decades. Rare is the investor with the fortitude to sit through almost 20 years of underperformance relative to the market in order to reap later gains.

Let's start with Heebner: he became manager of his fund on the first day of 1981. From that day until March 22, 2010, he beat the S&P 500, but it was closer than you might think, according to data supplied by Morningstar. In that time Heebner earned annualized returns of 10.93% against the market's 10.67%. Again, all credit to Heebner. It should also be noted that he badly lagged the markets through the end of 1999, returning 15% against the market's 17.2%.

This means that $10,000 invested with Heebner, if you stayed for the whole ride, would now be worth $207,420, against $193,389 for the S&P 500. Yet what if you stuck with him through the end of 1999 and, entering a new decade, simply got fed up? You would have had returns of $143,228 from Heebner against $202,586 for the S&P.
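These terminal values follow directly from compound growth, value = principal × (1 + r)^years. A quick sketch, assuming roughly 29.2 years between January 1, 1981 and March 22, 2010, lands close to the article's figures:

```python
# Compound growth: value = principal * (1 + r) ** years.
# Approximate span of Heebner's tenure discussed in the article.
years = 29.2

def terminal(principal, annual_return, years):
    """Terminal value of a lump sum compounded at a fixed annual rate."""
    return principal * (1 + annual_return) ** years

heebner = terminal(10_000, 0.1093, years)  # 10.93% annualized
sp500 = terminal(10_000, 0.1067, years)    # 10.67% annualized
print(f"Heebner: ${heebner:,.0f}  S&P 500: ${sp500:,.0f}")
```

The sketch also shows why a gap of only 0.26 percentage points a year matters: compounded over nearly three decades it widens into a five-figure difference on a $10,000 stake.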

It's possible you would have bailed at that point. CGM couldn't provide data as to how many investors, if any, fled the fund, but assets certainly dwindled after 1999. At the end of that year Heebner's fund had $909 million under management, which shrank to $654 million by the end of 2000 and bottomed at $376 million by the end of 2002. For those who hung tight, though, the story had a happy ending, as Heebner ended up beating the markets over the past decade by a nice margin. As of the end of 2009 the fund had $549 million in assets under management. Yet the ride was not smooth.

Bill Miller has managed the Value Trust since April 17, 1982. If you had stayed with him for the whole ride you would have, again, beaten the markets. The value of $10,000 invested at the start of his tenure would be worth $245,579 as of March 22, against $219,441 for the markets. Impressive. The annualized returns were closer, with Miller's fund earning 12.1% against 11.7% for the markets. Miller's fund has seen a reversal of fortune over the past decade, though, earning annualized total returns of -2.7% against the market's -0.7%.

(One irony: Despite making more money during his time managing the Value Trust, Miller's fund currently has one star from Morningstar against five for Heebner's.)

Miller saw his assets shrivel during a tough period, in his case 2006 through 2008, only to have the fund roar back to life in 2009. Once again nervous investors missed the ride. As Miller himself noted in a 2008 letter to shareholders, "We (and everyone else) get the most inflows and the most interest AFTER we've done well, and the most redemptions and client terminations AFTER we've done poorly. It will always be so, because that is the way people behave."

Some people, that is. So while it may be better to be lucky than good, sometimes it's better to be lazy than smart.

Australian Consul-General Tom Connor speaks outside the People's Intermediate Court in Shanghai after the trial of a Chinese-Australian executive of Rio Tinto

Rio Tinto, the Anglo-Australian mining giant, has sacked four iron ore executives after a Chinese court sentenced them to jail terms ranging from seven to 14 years for commercial espionage and taking bribes.

Within hours of Australian citizen Stern Hu and his three Chinese colleagues being convicted, Rio Tinto moved to limit damage to its business interests in China.

The company announced it had sacked the four executives and said it hoped the case would not affect its trade with the world’s largest steel producer.

Mr Hu, Wang Yong, Ge Minqiang and Liu Caikui all pleaded guilty to accepting bribes during negotiations over iron ore prices, but disputed the amounts and aspects of the accusations. One of the four had admitted to commercial espionage.

Sam Walsh, the company’s iron ore chief executive, described the behaviour of the four workers as “deplorable”.

He said: “We have been informed of the clear evidence presented in court that showed beyond doubt that the four convicted employees had accepted bribes.”

Mr Walsh declined to comment on the charges of stealing commercial secrets which were heard in a closed court last week, because the company “has not had the opportunity to consider the evidence”.

Mr Hu was sentenced to seven years for taking bribes and to five years for stealing business secrets, the Shanghai Number One Intermediate People’s Court ruled.

The court said Mr Hu would serve parts of the sentences concurrently, reducing his jail term to 10 years. Mr Wang, accused of taking 75 million yuan (£7.5 million) in bribes, received the longest sentence, of 14 years. The two other Rio staff, Mr Ge and Mr Liu, were sentenced to eight years and seven years respectively.

All four stood impassively while the sentences were read out. Mr Hu’s usually dyed black hair was now white. Tao Wuping, a lawyer for Mr Liu, said: “I think all of them were already mentally prepared to appeal both the bribery and secrets convictions.”

Jin Chunqing, a lawyer on Mr Hu’s team, said the defence team was gathering to decide its next step. He said: “We haven’t decided yet if we would appeal.” Appeals in China have about a one per cent chance of success.

Australian Foreign Minister Stephen Smith described the sentences as “very tough”. He said: "It is a tough sentence by Australian standards. As far as Chinese sentencing practice is concerned, it is within the ambit or within the range. According to Australian officials there was evidence indeed, if not substantial evidence, that bribery acts had occurred." Announcing its verdict, the court said it had shown leniency because the defendants had made admissions of guilt.

However, it said the sentences were in line with the seriousness of a crime that had caused major losses to the Chinese steel industry.

The court found that the four had helped to obtain information from confidential strategy meetings of the China Iron and Steel Association, which was representing the Chinese steel industry in last year’s negotiations with the world's three top iron ore suppliers, Rio, BHP Billiton and Vale.

It is unclear how the actions of the four Rio executives differed from usual practices by businesses seeking to learn details of the position of an opposite number in any business negotiation.

The four Rio employees were arrested last July during contentious iron-ore contract talks between top mining companies and the steel industry in China, the world's largest consumer of the raw material. The talks collapsed. This year’s negotiations are still under way.

Tom Albanese, the chief executive of Rio Tinto, said: “I am determined that the unacceptable conduct of these four employees will not prevent Rio Tinto from continuing to build its important relationship with China.” Mr Smith said the outcome of the trial would not impact relations between China and Australia. He said: “I don’t believe that the decision that has been made will have any substantial or indeed any adverse implications for Australia’s bilateral relationship with China.

“We did go through some tensions or some difficulties last year, but whilst this has been a sensitive, very important and very difficult consular case, I don’t believe that what has occurred today will have an adverse impact on our own relationship.

“We continue to have a very strong economic and broader relationship with China,” he said. At the three-day trial, the court heard evidence that millions of yuan in bribes had been stuffed into bags and boxes for the accused.

Mr Hu took money from small private steel companies which, before the global financial crisis, were locked out of buying iron ore from Rio Tinto because the mining giant gave priority to large state-run steel companies.

Mr Walsh said today: "Shortly after the four employees were detained we appointed independent forensic accountants and lawyers to assist us in carrying out an internal investigation into the claims. This was done to the fullest extent possible. It did not uncover any evidence to substantiate the allegations of wrongdoing. Rio Tinto has concluded that the illegal activities were conducted wholly outside our systems."

He added: "We have already implemented a number of improvements to our procedures, and we have now ordered a further far-reaching independent review of our processes and controls. We will introduce any necessary additional measures and safeguards the review recommends and will spare no effort in doing everything we can to prevent any similar activity."

Dong Zhengwei of the Beijing Zhongyin law firm said: “Based on the crime, Mr Hu’s sentence is not harsh for China. He faced up to 15 years. This sends a real signal to foreign companies that they must act in accordance with business ethics. They face a risk if they engage in illegal activities.”

The men will likely serve their sentences at Shanghai’s Qingpu prison, where American Jude Shao served 10 years of a 16-year sentence for tax evasion and fraud. Mr Shao was released in 2008.

Yields on US Treasuries advanced this week as demand for the $118 billion of two-, five- and seven-year notes was weak. Demand from indirect bidders, the group that contains foreign central banks, and from direct bidders, which includes domestic money managers, both slipped. The yield on the benchmark 10-year bond increased to 3.90% before retreating to 3.86%. This yield is far less than the 6.3% the Greeks had to pay on their 10-year notes, but a continuation of this trend in the US may hinder recovery in the US housing market. Many home loans are priced in relation to the 10-year paper, and the recovery is endangered by the biggest rate jump since December.
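The concern about rising rates rests on the inverse relationship between bond prices and yields: when the market yield rises above a note's coupon, its price falls below par. A simplified sketch with annual coupons and hypothetical numbers, not actual Treasury pricing conventions:

```python
def bond_price(face, coupon_rate, yield_rate, years):
    """Present value of a bond's coupons plus principal at a given yield."""
    coupon = face * coupon_rate
    price = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    price += face / (1 + yield_rate) ** years
    return price

# A 10-year note with a 3.86% coupon is worth par when the market
# yield equals the coupon, and slightly less if yields rise to 3.90%.
at_issue = bond_price(1000, 0.0386, 0.0386, 10)
after_rise = bond_price(1000, 0.0386, 0.0390, 10)
print(f"price at 3.86% yield: {at_issue:.2f}, at 3.90%: {after_rise:.2f}")
```

The move from 3.86% to 3.90% is small, but the same mechanism scaled up is why holders of existing bonds lose when governments flood the market with new, higher-yielding paper.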

Some cite the Greek sovereign debt crisis as a catalyst for the higher rates. A MarketWatch bond review this morning said:

"What's changed is that investor outlooks on the fiscal side have turned decidedly more downbeat since Greece's debt woes were first splashed onto the front pages of the main papers," RBS Securities' Bill O'Donnell and Aaron Kohli said.

"The spotlight on Greece only helped to reveal that the U.S.'s kitchen (federal and state budget balances) was itself full of cockroaches," the bond strategists wrote in a note.

Fed Chairman Bernanke, during the past year, expanded the balance sheet of the central bank by purchasing agency paper from Fannie and Freddie. If these loans were priced mark-to-market, as the IRS demands, what would the new Fed balance sheet look like?

With the current massive US budget deficit heading for a record $1.6 trillion, big biweekly Treasury auctions will be the norm. We wonder whether the current auction is a fluke or the beginning of an upward spiral in rates, as global governments compete for money to fund their deficits. Bill Gross, the world's largest bond fund manager, expressed his view when he told CNBC he prefers stocks over bonds. According to Moneynews.com he said:

Gross also cited "the healthcare situation and the $40 trillion worth of present value in terms of entitlements we have in the United States."

"We just added in my opinion another $500 billion in terms of healthcare and the markets are beginning to look at that suspiciously."

The dollar got a boost this past week, benefiting from the chaos caused by the European bankers' response to the Greek crisis. If the current agreement, which assigns two-thirds of the bailout to the Europeans and one-third to the Washington-based IMF, holds, what will be the next problem to concern currency traders? Higher yields in the US may attract some investor interest from yield seekers, but we all know which direction bond prices go if rates work higher. We are very cautious about the short side of the euro versus the dollar. Not all of the debt problems are in Greece.

Sunday, 28 March 2010

It appears the day when we'll be paying to read general interest news stories on the Web is coming sooner, rather than later--perhaps as early as June for readers of the U.K.-based Times publications. News International, the British division of Rupert Murdoch's News Corp., announced on Friday that two of its newspapers, The Times and The Sunday Times of London, are set to begin charging readers to use their sites in June.

The two papers have been offering their content in a combined news Web site called Times Online. Under the new plan, however, News International would introduce new, separate sites for each publication in May, according to several news accounts citing a company statement.

The sites will reportedly be offered for 1 pound ($1.48) for a day's access, or 2 pounds ($2.96) for a week's subscription. Those fees will cover access to both sites, which will be available for free during a trial period. As newspapers struggle to stay alive amid declining print circulations and weak advertising revenues--only made worse by recessionary times--there's been much talk about charging users for online stories.

"At a defining moment for journalism, this is a crucial step towards making the business of news an economically exciting proposition," News International CEO Rebekah Brooks said in a broadly reported statement. She added, "This is just the start," but did not offer details on plans for the company's two other U.K. publications. http://newscri.be/link/1055863

But the move by the British Times publications would make them among the first mass-market, consumer newspapers to start charging for content. (Newsday and Le Monde in France are two others that we know of.) Meanwhile, in another move to save his business, Murdoch continues to point the finger at Google for depriving the industry of revenue by making news articles searchable for free. He plans to press legal action against the search giant if talks over its indexing of news content fail.