Ray Kurzweil will keynote the H+ Summit, to be held June 12-13 at Harvard University, with a talk on “The Democratization of Disruptive Change.”

The talk will focus on understanding the brain: Where are we on the roadmap to this goal? What are the most effective routes to progress: detailed modeling, theoretical effort, or improvement of imaging and computational technologies? What predictions can we make? And what are the social and ethical consequences if those predictions materialize?

“According to my models, we are only two decades from fully modeling and simulating the human brain,” said Kurzweil. “By the time we finish this reverse-engineering project, we will have computers that are thousands of times more powerful than the human brain. These computers will be further amplified by being networked into a vast worldwide cloud of computing. The algorithms of intelligence will begin to self-iterate towards ever smarter algorithms.

“This is how we will address the grand challenges of humanity, such as maintaining a healthy environment, providing resources for a growing population, including energy, food, and water, overcoming disease, vastly extending human longevity, and overcoming poverty. It is only by extending our intelligence with our intelligent technology that we can handle the scale of complexity to address these challenges.”

This link goes to an article titled “Cloud Computing’s Three Revolutions: Part 2,” and the whole piece is worth checking out (along with the first part) if you’re interested in cloud computing. I have some serious concerns about cloud computing, particularly around privacy and the current state of legal precedent regarding the public/private status of data in the cloud (hint: right now people computing in the cloud are “not truly acting in private space at all,” per U.S. District Court Judge Michael Mosman).

Those concerns aside, this point from the first link details one place where a secure, private cloud can really help push innovation by removing a traditional roadblock to IT experimentation:

Low Cost Fosters Experimentation

An aspect of cloud computing that isn’t emphasized enough in most discussions about it is the fact that it is ideally suited for application experimentation. Just as the high-cost, capital-intensive IT of the past caused investment to focus on the safest, lowest-risk applications, the low-cost, capital-lite IT of cloud computing will motivate business organizations to experiment with new business initiatives. Business initiatives that, in the past, couldn’t have gotten enough support to justify sharing precious capital to take a flyer on them, will find a far friendlier environment in cloud computing.

A good example of this is the NASDAQ Market Replay application that leverages Amazon Web Services. Trying to buy enough equipment for this application would have been prohibitive, even though the application’s value seemed intuitive. Using AWS, the application could be developed for much less, which made launching it much lower risk. New applications can be tried out at a cost of hundreds or thousands of dollars, rather than the hundreds of thousands of dollars required heretofore. If you are a line of business executive with innovative ideas, cloud computing is going to make your prospects much brighter.

In the “low cost fosters experimentation” perspective, cloud computing is much like open source. In his book Here Comes Everybody, Clay Shirky noted that open source’s low cost encourages experimentation and making mistakes. When the stakes are low, trials that don’t work out are much more acceptable—and increasing the numbers of trials increases the odds for success.
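Shirky's point about trial volume is just arithmetic: if each experiment independently succeeds with probability p, running many cheap trials beats one expensive bet. A minimal sketch (the 10% per-trial success rate is an illustrative assumption, not a figure from the text):

```python
def p_at_least_one_success(p, n):
    """Probability that at least one of n independent trials succeeds,
    where each trial succeeds with probability p."""
    return 1 - (1 - p) ** n

# One big bet at 10% vs. ten cheap experiments at 10% each:
one_big_bet = p_at_least_one_success(0.10, 1)    # 0.10
ten_cheap_bets = p_at_least_one_success(0.10, 10)  # ~0.65
```

The math is why lowering the cost per trial matters more than improving any single trial: the odds of at least one success climb quickly with n.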

The first print advertisement for the Xerox 914 Copier, the first automated copy machine, introduced to offices in 1959. In introducing the new machines, the president of Xerox said in a news release: “Our girls love to use the 914 and have discovered many new copying jobs for it to do.”

Have a look at just three technologies that have the ability to completely revolutionize IT from the ground up: memristors, nanowires and OLEDs.

Memristors are transistor-like devices made out of titanium dioxide that can remember voltage state information. They hold the potential to completely revolutionize storage and processing technologies because they erase the distinction between processing and storage (you can do both on the same chip). More prosaically, they make it possible to create storage devices that retain data without power. How will that affect your data center?
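The "remembers voltage state" behavior can be caricatured with the linear ion-drift model often used to describe titanium dioxide memristors: resistance becomes a function of the total charge that has flowed through the device. A toy sketch; every parameter value below is invented for illustration, not taken from any real device datasheet:

```python
def memristance(q, r_on=100.0, r_off=16e3, d=10e-9, mu_v=1e-14):
    """Linear ion-drift caricature of a memristor: resistance depends on
    the cumulative charge q (coulombs) that has passed through the device,
    so the device 'remembers' its history even with power removed.
    All parameters are illustrative placeholders."""
    m = r_off * (1 - (mu_v * r_on / d ** 2) * q)
    # Clamp to the physically meaningful range [r_on, r_off].
    return max(r_on, min(r_off, m))

memristance(0.0)    # fresh device: full r_off resistance
memristance(5e-5)   # after charge has flowed: resistance has dropped
```

The key property the paragraph gestures at is visible in the model: state (resistance) persists as a function of past current, with no power needed to hold it.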

Then there are nanowires: tiny wires no more than a single nanometer in width that can be conductors, insulators or semiconductors (albeit with weird quantum properties). These can form the basis for embedded intelligent networks — sensor and control networks that are actually built into the materials and devices they control. (Take that, smart grids!)

Finally, there are organic LEDs, which have the interesting property that they can be printed onto things such as wallpaper at relatively low cost. Sony has developed OLED monitors, and GE is looking into OLED wallpaper. So in a couple of years, your new office (or home office) may come equipped with wallpaper that, at the touch of a button, can turn into a floor-to-ceiling high-resolution display. (Think of the bandwidth requirements).
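That closing parenthetical is worth quantifying: uncompressed, a display's bandwidth is just pixels times bit depth times refresh rate. A rough calculator (the 7680x4320 "wall panel" resolution is an assumed example, not anything Sony or GE has announced):

```python
def display_bandwidth_gbps(width_px, height_px, bits_per_px=24, fps=60):
    """Uncompressed video bandwidth for a display, in gigabits per second."""
    return width_px * height_px * bits_per_px * fps / 1e9

# A hypothetical floor-to-ceiling 8K-class wallpaper display:
display_bandwidth_gbps(7680, 4320)  # ~47.8 Gbps uncompressed
```

Compression brings the real figure down by an order of magnitude or more, but even so, wall-sized displays would put serious pressure on office networks.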

Each of these technologies holds the possibility of completely reshaping IT within the next few years. And the conjunction of all three could make the conjunction of the transistor and fiber optics look like a warm-up act.

No link here; this came through the inbox with the title “The Case against Literary (and Software) Patents” and this descriptor: “Issue #125, August 28, 2009; by Timothy B. Lee.”

If you really feel the need to do some clicking here’s the footer, “Timothy B. Lee is an adjunct scholar at the Cato Institute in Washington, DC. To subscribe or see a list of all previous TechKnowledge articles, visit www.cato.org/tech/tk-index.html.”

I’m going to pull from this excellent piece by Lee, but I’m betting you can find the whole thing by following that last link up there. It’s worth the read.

Software patents have been a drain on the IT world for a while, and things have become simply out of hand. If you’ve never looked too deeply into the topic, or have never even heard of it, it’s shocking how willingly the US Patent and Trademark Office has let itself be misused by IT firms. Usually very big IT firms.

From the essay:

Patent protection was first extended to software in the 1980s, and the practice accelerated in the 1990s. As a result, it is now difficult to create any significant software without infringing a patent. With tens of thousands of software patents granted every year, and no effective indexing method for software patents, there is no cost-effective way to determine which patents cover any piece of software.

Stanford law professor Mark Lemley has documented the unsurprising result: most firms in the IT industry have simply given up trying to avoid patent infringement. Instead, larger firms stockpile patents to use as ammunition when they are inevitably sued for infringement. They also sign broad cross-licensing agreements with other large firms promising not to sue one another. This has prevented patents from bringing the software industry to a standstill, but it’s hard to see how the practice promotes innovation.

Even worse, software patents tilt the playing field against smaller and more innovative software firms. Most small firms develop their technology independently of their larger competitors, but this isn’t enough to prevent liability; incumbents have so many broad software patents that it’s impossible to enter many software markets without infringing some of them. Small firms don’t have the large patent arsenals they need to negotiate for cross-licensing agreements with their rivals. As a consequence, incumbents can use their patent portfolios to drive smaller competitors out of business. Other small firms are forced to pay stiff licensing fees as a cost of entering the software industry. The result is to limit competition rather than promote innovation.

The Supreme Court has been taking steps to rein in the patent bar in recent decisions such as KSR v. Teleflex. But the Court hasn’t directly addressed the patentability of software since 1981, when it ruled (as it had on two previous occasions) that software is ineligible for patent protection. In the intervening years, the United States Court of Appeals for the Federal Circuit, which hears all patent appeals, has seemed to stray far from that principle. But the Supremes have not reviewed any of its key decisions.

The patent at issue in Bilski is not a software patent; it is a “business method” patent that claims a strategy for hedging against financial risk. But the case is being closely watched for its effects on the software patent issue. Patented business methods are often implemented in software; for example, a key decision on the patentability of software, State Street Bank v. Signature Financial Group, involved a software-implemented business method. And the standard articulated by the Federal Circuit in Bilski, known as the “machine-or-transformation test,” has been used by the Patent Office in recent months to invalidate several software patents. The Supreme Court could ratify the Federal Circuit’s mildly restrictive standard, or it could articulate its own standard that is either more or less restrictive of patents on software.

June 4, 2009

The global recession hasn’t lowered prices in every category, but software looks like one sector where prices are being pressed lower. In a time of limited spending for many firms, I’d say pushing dollars toward the IT budget might be a good idea.

From the link:

The worldwide recession has hammered IT budgets but has also prompted vendors to make their software pricing and licensing models more customer-friendly, according to a new Forrester Research report. Forrester’s report looked at how 12 enterprise software vendors’ pricing and licensing strategies changed in the fourth quarter of 2008 and the first quarter of this year.

Easily the most dramatic change was SAP’s recent, well-publicized agreement with user groups around KPIs (key performance indicators) to prove the value of its fuller-featured but more expensive Enterprise Support service.

January 16, 2009

Here are nine of the hottest IT skills for 2009, according to CIO.com. I’ll have to admit the list is a little mundane, in that pretty much anyone could build it in their sleep. With “hot” lists I always like to see at least one or two wildcards in there.

From the link, here’s number five:

5. Business Intelligence

Now more than ever, corporate executives want to be able to analyze customer and sales data in order to make informed decisions about business strategies. That’s driving demand for business intelligence specialists across the board, including people with data mining, data warehousing and data management skills.

At Aspen Skiing Co., which operates four ski resorts in western Colorado, company officials will be making year-over-year comparisons on customer spending, including analyses of spending habits during the previous recession, says CIO Paul Major. “We’re going to have to get very granular with our analytics,” he says.

Meanwhile, there’s steady demand for IT professionals with experience using vendor-specific BI tools from companies such as Business Objects and Cognos, says Spencer Lee. But the toughest people to find in this area are those who can help business managers understand the type of data they’re trying to analyze and how to interpret the results, she says. “What’s difficult,” she adds, “is to find someone who’s the full-meal deal.”

1. Use open-source and free software: When you’re trying to keep your business afloat, plunking down lots of cash for off-the-shelf software can really hurt. Thankfully, freeware and low-cost software can be a pleasant surprise in terms of robustness and functionality. While not as polished as Microsoft’s Office suite (but not as much of a memory or resource hog), OpenOffice.org is a free, open-source alternative with a full suite of applications for word processing, spreadsheets, presentations, and databases that are compatible with Microsoft Office formats. Google Docs (docs.google.com) is another viable and free alternative to Office. It’s Web-based, meaning you have no software to download or install.

Though it isn’t nearly as full-featured as either Office or OpenOffice, the basic functionality and streamlined interface of Google Docs may be all you’ll ever need. Creating PDF files may be crucial for business, but spending $450 on Adobe’s Acrobat Professional is not. CutePDF is a free program that simply exports files to PDF. Just download and install it; from the target file, choose File > Print, and select CutePDF from the printer menu. (If you’re using OpenOffice or Google Docs, you won’t even need to install CutePDF: both let you export to PDF directly.)

I’ll have to admit I’ve messed around with OpenOffice and came away not wildly impressed. It is decent, though, and it is also free. I’ve downloaded CutePDF for a trial run, but not installed the software just yet. I’ll post a review if it knocks my socks off once I get around to the installation.

Professor Luftman and SIM release full results of annual IT industry survey at SIMposium 2008

Businesses making broad use of IT to reduce costs and increase productivity

HOBOKEN, N.J. — The current economic crisis is forcing companies to quickly evaluate and modify their business models, but unlike in previous economic downturns, information technology organizations are not feeling the cuts as quickly, according to a newly released survey.

“This time, businesses are using IT to reduce costs and to increase productivity across the company,” said Jerry Luftman, a Distinguished Professor and Associate Dean at Stevens Institute of Technology, who also serves as the Vice President for Academic Affairs for the Society for Information Management.

More than 300 respondents from 231 companies responded to Dr. Luftman’s annual survey (conducted in June), which has become an industry barometer. The findings were released earlier this month at SIMposium 2008 in Orlando. Overall, the survey indicated that the IT industry remains strong, but Luftman noted that some data shows 2009 will likely be a difficult year.

Among the findings in the report:

Chief Information Officers are spending about 80 percent of their time on non-technical issues.

Forty-five percent of the respondents said their CIOs have been in place for more than three years, with the average CIO spending 4.3 years in the current position, up from 4.1 in 2007.

Of those surveyed, 22.2 percent said their IT departments are organized as federations, up 4.2 percent from a year earlier. Federated organization structures tend to have higher IT business alignment maturity scores, as uncovered in important benchmarking research underway by Luftman. Additionally, organizations where the CIO reports to the CEO also tend to have higher alignment maturity ratings. More than 40 percent of CIOs report to their CEO. Another important insight in Luftman’s maturity research is that organizations with higher maturity scores have higher overall company performance.

Luftman noted that companies are looking to hire IT employees who have more than just technical savvy saying they want “people who can demonstrate interpersonal, management, and industry skills.”

In fact, the survey found that companies want entry-level and mid-level employees to demonstrate ethics and morals, oral communication skills, and the ability to collaborate more effectively with their business partners.

“Technical skills are important, but it’s not everything to most companies,” said Luftman.

Looking toward 2009, the survey found that companies would likely increase their IT headcount or keep it flat. But it also noted that the budget allocation for offshore outsourcing is projected to increase to 5.6 percent in 2009, up from 3.3 percent this year.

Some companies will have to do more with less: while IT budgets, which average 3.82 percent of revenues, are still expected to grow in 2009, respondents said those increases will likely be smaller than in the past.

###

About Professor Luftman

Jerry Luftman (jluftman@stevens.edu) is Associate Dean and a Distinguished Professor for the graduate information systems programs at Stevens Institute of Technology, where he also earned his doctoral degree in information management. His 23-year career with IBM included strategic positions in IT management, management consulting, information systems, marketing, and executive education. He played a leading role in defining and introducing IBM’s Consulting Group. As a practitioner, he has held several positions in IT, including CIO. Luftman’s research papers have appeared in dozens of professional journals and books. His book, Competing in the Information Age: Align in the Sand, was published by Oxford University Press. He has been a presenter at many executive and professional conferences and is regularly called on by many of the largest companies in the world. He has served on the SIM Executive Board for over 10 years and was president of the New Jersey Chapter of SIM.

About Stevens Institute of Technology

Founded in 1870, Stevens Institute of Technology is one of the leading technological universities in the world dedicated to learning and research. Through its broad-based curricula, nurturing of creative inventiveness, and cross disciplinary research, the Institute is at the forefront of global challenges in engineering, science, and technology management. Partnerships and collaboration between, and among, business, industry, government and other universities contribute to the enriched environment of the Institute. A new model for technology commercialization in academe, known as Technogenesis®, involves external partners in launching business enterprises to create broad opportunities and shared value. Stevens offers baccalaureates, master’s and doctoral degrees in engineering, science, computer science and management, in addition to a baccalaureate degree in the humanities and liberal arts, and in business and technology. The university has a total enrollment of 2,150 undergraduate and 3,500 graduate students with about 250 full-time faculty. Stevens’ graduate programs have attracted international participation from China, India, Southeast Asia, Europe and Latin America. Additional information may be obtained from its web page at www.stevens.edu.

Microsoft is now taking the threat from Google quite seriously: In July 2008 COO Kevin Turner was dispatched to consumer-products giant Procter & Gamble to dissuade P&G from moving to Google Apps—and ditching Microsoft.

Back in February 2007, Google launched the Google Apps edition for businesses. Executives told media outlets that initial customers included a unit of Procter & Gamble. At some point in 2008, hundreds of P&G employees were testing Google’s e-mail, word-processing and spreadsheet applications as potential replacements for Microsoft products, according to a recent Bloomberg article about the P&G incident.

P&G, of course, is a massive consumer products company, with $84 billion in annual revenue this past year. To lose that kind of customer—especially to Google—would have been catastrophic for Microsoft.

GLENDALE, Calif., Sept. 4 /PRNewswire/ — Panda Security, a leading provider of IT security solutions, today announced that PandaLabs, Panda Security’s laboratory for detecting and analyzing malware, has noted an increase in cyber-crooks’ use of malware under the guise of fake antivirus products to defraud users. These applications, classified as adware by PandaLabs, pass themselves off as antivirus utilities and often appear on the Internet as free downloads. Alternatively, they can be concealed in other files downloaded by users, including music or video files.

Once on a system, they often operate as follows: they tell the user (who is often unaware that the application is on their system) that a virus has been detected. They then invite them to buy the full version of the antivirus to disinfect the computer (you can see an example of these fake antivirus programs here: http://www.flickr.com/photos/panda_security/2678703471/).

If users don’t purchase the antivirus, it continues displaying non-existent infections and pop-ups inviting users to purchase the security software, which in reality does not detect or delete anything. If they buy it, they will have paid for a useless program. This is how cyber-crooks reach the main objective: to profit financially through malware. Additionally, to prevent users from checking whether they are genuinely infected or not, these programs usually try to block the web pages of real online antiviruses, security companies, etc.

“Initially, these fake antivirus programs were quite elementary. They are however, becoming more sophisticated to prevent detection by real security solutions. Many have become polymorphic (they change their form every time they are installed on a computer),” explains Luis Corrons, Technical Director of PandaLabs. “This investment proves cyber-crooks are obtaining significant financial benefits, and consequently, many users have fallen victim to this fraud.”

How to avoid falling victim to these fake antivirus products

— Be careful with what you install: These programs are often bundled with other downloads; for example, a user can download a legitimate program with one of these programs included in the package. Usually there is an option not to install the extra software. PandaLabs recommends that users carefully check the programs entering the computer during a download.

— Ignore emails with eye-catching news or subjects: Many of these programs have been distributed in recent weeks using social engineering techniques — sending emails with eye-catching subjects (you can see an example here: http://www.flickr.com/photos/panda_security/tags/fakeantivirus/). These emails invite users to click a link to watch a video or images of the false news. If they do, they will be allowing some kind of malware to enter their computer, e.g. fake antiviruses.

— Be wary at the slightest indication of trouble: If a program you don’t remember installing begins to display false infections or pop-ups inviting you to buy some type of antivirus, watch out. Most likely one of these malicious programs has been installed (example of a pop-up from a fake antivirus: http://www.flickr.com/photos/panda_security/2679524216/)

— Keep all the programs up-to-date: An outdated program can be a vulnerable program. Consequently, you should keep all applications installed on the computer up-to-date, since many malicious codes use existing computer vulnerabilities to enter and infect them.

— Scan your computer with a reliable security solution: You are advised to periodically scan your computer with a trusted security solution. This way, if one of these samples is resident on the computer, it can be detected and eliminated. Panda Security provides free online scan tools for home-users and companies at Infected or Not, http://www.infectedornot.com/.

About PandaLabs

Since 1990, PandaLabs’ mission has been to detect and eliminate new threats as rapidly as possible to offer its clients maximum security. To do so, PandaLabs has an innovative automated system that analyzes and classifies thousands of new samples a day and returns automatic verdicts (malware or goodware). This system is the basis of collective intelligence, Panda Security’s new security model, which can even detect malware that has evaded other security solutions.

Currently, 94% of malware detected by PandaLabs is analyzed through this system of collective intelligence. This is complemented through the work of several teams, each specialized in a specific type of malware (viruses, worms, Trojans, spyware, phishing, spam, etc), working 24/7 to provide global coverage. This translates into more secure, simpler and more resource-friendly solutions for clients.

Panda Security is one of the world’s leading IT security providers, with millions of clients across more than 200 countries and products available in 23 languages. Its mission is to develop and provide global solutions to keep clients’ IT resources free from the damage inflicted by viruses and other computer threats, at the lowest possible total cost of ownership.

Panda Security proposes a new security model, designed to offer a robust solution to the latest cyber-crime techniques. This is manifest in the performance of the company’s technology and products, with detection ratios well above average market standards and most importantly, providing greater security for its clients. For more information and evaluation versions of all Panda Security solutions, visit our website at: http://www.pandasecurity.com/.

(Total aside, if you’re reading much of the IT media world right now, SaaS comes up almost as often as cloud computing.)

An excerpt from the first link:

Here’s an interesting strategy for a new software company: create applications that place you squarely in the competitive sights of Google and Microsoft, bypass venture capital funding, and rebuff an acquisition offer from Salesforce.com, the surging software as a service (SaaS) company that delivers its products over the Web.

That’s been the exact path of Zoho, a SaaS company launched in 2005 that offers a wide range of online software, including e-mail, a word processor, spreadsheets, wikis, and even a customer relationship management application that it sells to sales and marketing departments. In all, Zoho sells 17 productivity and collaboration apps, all for prices that, by traditional software standards, are dirt cheap.

For the whole lot of Zoho’s business applications, it costs a mere $50 per user per year (the same price that Google asks large enterprises for its Google Apps software). By contrast, the Professional Version of Microsoft Office, the popular software found on workstations throughout most of the corporate world, retails for as high as $499, the same price as some personal computers on the shelf at Wal-Mart.
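The price gap is easy to put in perspective: at those figures, roughly a decade of the SaaS subscription costs as much as one perpetual license. A back-of-envelope helper using the article's numbers (deliberately ignoring upgrades, support contracts, and volume discounts, which cut both ways):

```python
def years_to_breakeven(license_price, saas_per_user_year):
    """Years of a per-user SaaS subscription that equal the cost of one
    perpetual license at the quoted retail price."""
    return license_price / saas_per_user_year

# Article figures: Zoho (or Google Apps) at $50/user/year vs. Office Pro at $499.
years_to_breakeven(499, 50)  # ~10 years
```

Real total cost of ownership is messier, but the order of magnitude explains why "dirt cheap" is not hyperbole here.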

CIO.com has a fairly comprehensive article titled “Demystifying Cloud Computing.” It’s a great place to start to learn more about the topic.

From the link:

Welcome Cloud Computing

Staten describes the concept as “a pool of abstracted, highly-scalable, and managed compute infrastructure capable of hosting end-customer applications and billed by consumption.”

Simply put, cloud computing is the next generation model of computing services. It combines the concepts of software being provided as a service, working on the utility computing business model, and running on grid or cluster computing technologies. Cloud computing aims to leverage supercomputing power, which can be measured in tens of trillions of computations per second, to deliver various IT services to users through the Web.

In his report, Staten refers to cloud computing as a service delivery platform, which is built on the same basic fundamentals of traditional hosting or SaaS. The building blocks of cloud computing, he says, that take the concept beyond conventional forms of IT service delivery models are:

— A prescripted and abstracted infrastructure. Fundamental to the cloud computing model is standardization of infrastructure and abstraction layers that allow the fluid placement and movement of services. It starts with a flat implementation of scale-out server hardware that, for some clouds, serves as both compute and storage infrastructure (others are leveraging SAN storage). Their infrastructure enables the cloud and is decided upon solely by the cloud vendor; customers don’t get to specify the infrastructure they want — a major shift from traditional hosting.

— Fully virtualized. Nearly every cloud computing vendor abstracts the hardware with some sort of server virtualization. The majority employ a hypervisor to keep costs low. Some have solutions that span virtual and physical servers via another middleware element, such as a grid engine.

— Equipped with dynamic infrastructure software. Most clouds employ infrastructure software that can easily add, move, or change an application with very little, if any, intervention by cloud provider personnel.

— Pay by use. Most clouds charge by actual use of resources: CPU hours, gigabytes (GB) consumed, and gigabits per second (Gbps) transferred, rather than by a server or with a monthly fee. Their pricing is compelling.

— Free of long-term contracts. Most cloud vendors let you come and go as you please. The minimum order through XCalibre’s FlexiScale cloud, for example, is one hour, with no sign-up fee. This makes clouds an ideal place to prototype a new service, conduct test and development, or run a limited-time campaign without IT commitments.

— Application and OS independent. In most cases, the architectures of clouds support nearly any type of app a customer may want to host as long as it does not need direct access to hardware or specialized hardware elements. Cloud vendors say there’s nothing about their infrastructures that would prevent them from supporting any x86-compatible OS.

— Free of software or hardware installation. You tap into a cloud just as you would any remote server. All you need is a log-in. There’s no software or hardware requirement at the customer end nor the need for specialized tools.
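The pay-by-use building block above reduces to a simple metered bill: you are charged for CPU hours, storage, and transfer actually consumed, never for an idle server. A toy sketch; the rates are made-up placeholders, not any vendor's real price list:

```python
def cloud_bill(cpu_hours, gb_stored, gb_transferred,
               cpu_rate=0.10, storage_rate=0.15, transfer_rate=0.17):
    """Toy pay-by-use cloud bill in dollars: meter each resource dimension
    and sum, with no fixed monthly fee. All rates are illustrative."""
    return (cpu_hours * cpu_rate
            + gb_stored * storage_rate
            + gb_transferred * transfer_rate)

# A one-hour prototype run costs cents, not a capital purchase:
cloud_bill(cpu_hours=1, gb_stored=1, gb_transferred=0.5)  # $0.335
```

This billing shape is exactly what makes the "prototype a new service for an hour" scenario in the no-long-term-contracts point economically sensible.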

Google has launched Google Insights for Search, an extension of Google Trends, designed to be used by advertisers, small business owners, academics and others.

Like Google Trends, the Insights software lets users type in search terms and then see search volume patterns over time and the top related and rising searches. But users can also now compare volume trends across multiple search terms, vertical industry categories, geographic regions and time ranges.

The CIA is undergoing a major transformation, and IT is playing a leading role. In Part 2 of our inside look at the agency, CIA employees describe the environment pre- and post-9/11, and the massive changes that resulted from that day’s tragic events. Like other government agencies, the CIA and its IT department were unprepared for the intense change that was to come. (See “Inside the CIA’s Extreme Technology Makeover, Part 1” to read the first part in our series.)

Technology Review, July 8, 2008

Vista Therapeutics is developing sensitive devices for continuous bedside monitoring of blood biomarkers for detecting organ failure and other problems in seriously injured or ill patients, such as those in the ICU after suffering a heart attack or traumatic injuries from a car accident. The devices use silicon nanowires developed by Harvard University chemist Charles Lieber. When a single protein binds to an antibody along the wire, the current flowing through the wire changes. Arrays of hundreds of nanowires, each designed to detect a different molecule in the same sample, can be arranged on tiny, inexpensive chips. The changes can be monitored continuously as molecules bind and unbind, making it possible to detect subtle trends over time without requiring multiple blood draws.

Because nanowires are so sensitive and inexpensive, they could also find their way into home tests for cancer.
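The "subtle trends over time" advantage is essentially a signal-processing one: with a continuous stream of readings instead of isolated blood draws, even a small sustained drift becomes detectable. A minimal sketch of that idea; the window size, threshold, and sample data are arbitrary illustrative choices, not anything from Vista Therapeutics:

```python
def rising_trend(readings, window=5, threshold=0.0):
    """Flag a sustained rise in a continuously sampled signal by comparing
    the mean of the most recent window of readings to the mean of the
    window just before it. A toy stand-in for the trend detection that
    continuous biomarker monitoring enables."""
    if len(readings) < 2 * window:
        return False  # not enough history yet
    recent = sum(readings[-window:]) / window
    previous = sum(readings[-2 * window:-window]) / window
    return (recent - previous) > threshold

# A slowly climbing biomarker level trips the flag; a flat one does not:
rising_trend([1.0, 1.0, 1.1, 1.0, 1.1, 1.3, 1.4, 1.4, 1.5, 1.6], threshold=0.2)
```

A single blood draw gives one point; no amount of assay sensitivity recovers the slope that a continuous monitor sees directly.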

Technology Review, July 8, 2008

Researchers at the University of Michigan have made a processor (the Phoenix) that measures just one millimeter square with a power consumption so low (2.8 picojoules of energy per computing cycle) that emerging thin-film batteries of the same size could power it for 10 years or more. At this scale, it could be feasible to build the chip into a thick contact lens and use it to monitor pressure in the eye, which would be useful for glaucoma detection. It could also be implanted under the skin to sense glucose levels in subcutaneous fluid. And it could be used in environmental sensors that monitor pollution, or in structural health sensors, for instance.
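The 10-year claim is easy to sanity-check: divide the battery's stored energy by the chip's compute draw. A back-of-envelope calculation; the 1-joule battery capacity and the 1 kHz average (heavily duty-cycled) clock rate are assumptions for illustration, and only the 2.8 pJ/cycle figure comes from the article:

```python
def battery_lifetime_years(battery_joules, energy_per_cycle_j, cycles_per_second):
    """Back-of-envelope sensor lifetime: battery energy divided by average
    compute power. Ignores leakage and battery self-discharge, which
    dominate in real ultra-low-power designs."""
    seconds = battery_joules / (energy_per_cycle_j * cycles_per_second)
    return seconds / (365.25 * 24 * 3600)

# Assumed 1 J thin-film cell, 2.8 pJ/cycle, 1 kHz average clock:
battery_lifetime_years(1.0, 2.8e-12, 1000)  # ~11 years
```

Under those assumptions the arithmetic lands right in the "10 years or more" range, which is what makes set-and-forget implants and structural sensors plausible.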