"We've been watching the unstructured space for a while, just seeing the trends everyone else is seeing with more textual data being amassed," said Anne Milley, director of product marketing for SAS. "Effectively, we were looking at how can we have more natural language processing capabilities that we can leverage across our whole platform."

While SAS has had text-mining technologies for a while, the company grew to see the "need for leveraging natural language processing virtually at every step," she said.

SAS looked at other companies before purchasing Teragram. "We did do an assessment of several out there. There are some really good technologies, but in terms of a broad NLP workbench [with support for] multiple languages, Teragram is really a best fit," Milley said.

Other players in the space include SPSS and Language and Computing.

For now, SAS plans to retain the Teragram brand, according to Milley.

According to SAS, Teragram's customers include CNN, Washingtonpost.com, and Yahoo.

Guy Creese, an analyst with Burton Group focusing on search and content management, said he was not overly familiar with Teragram but characterized SAS's move as one that fits into an overall trend: "At a high level, it points to the increasing consolidation between analyzing structured data and unstructured data together."

For example, he said, a traditional business-intelligence application might derive findings from data collected with a customer-satisfaction survey of a store chain. But by adding text analysis, a company can look at customer service call logs and spot telling words or phrases, such as "customer was disappointed," or "customer was mad," Creese said.

The analysis could also consider the tone and context, he added. "You're not only looking for a thumbs-up or thumbs-down, but were they mildly irritated or really angry. You can get that from looking at the words."
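The kind of phrase spotting Creese describes can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration only; the phrase list and intensity labels are invented for the example and are not part of any SAS or Teragram product.

```python
# Hypothetical sketch: scan call-log entries for telling phrases
# and map each to a rough intensity label, illustrating how text
# analysis can go beyond a simple thumbs-up/thumbs-down signal.

PHRASE_INTENSITY = {
    "customer was disappointed": "mild",
    "customer was mildly irritated": "mild",
    "customer was mad": "strong",
    "customer was really angry": "strong",
}

def spot_phrases(log_text):
    """Return (phrase, intensity) pairs found in one call-log entry."""
    text = log_text.lower()
    return [(phrase, level)
            for phrase, level in PHRASE_INTENSITY.items()
            if phrase in text]

logs = [
    "Called about billing. Customer was disappointed with the late fee.",
    "Escalated to supervisor. Customer was mad about the outage.",
]
for entry in logs:
    print(spot_phrases(entry))
```

Real text-analytics tools handle negation, synonyms, and context rather than literal string matching, but the basic idea is the same: known words and phrases in unstructured text become structured signals that can sit alongside survey scores in a business-intelligence application.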

Also Monday, SAS launched the 9.2 version of its data-analysis software. Improvements in SAS 9.2 include additional algorithms and guidelines for forecasting and predictive modeling, as well as role-based interfaces for helping a wider variety of users work with the tools.