Upgraded SAS® Data Management delivers big data payoff

Today's big data mania reflects a harsh reality: companies have huge amounts of data, but struggle to turn it into value. SAS, a leader in data quality and data integration, is offering new technologies and upgrades to its SAS® Data Management family of software to help organizations meet this big data challenge.

"With SAS Data Management, companies, government agencies and universities can more easily consolidate and transform silos of raw big data into integrated, quality information. This can help reduce customer response times, prevent fraud, streamline supply chains, improve product quality, and achieve other business goals," said Jim Davis, SAS Senior Vice President and CMO.

The new release helps customers address high-priority items first, with one platform that provides modules for data access, data quality, event stream processing, data integration and more. This modular approach helps companies expand and adapt their use of SAS as their needs evolve.

"At Macys.com, the challenge is to gather data and turn it into insights in a timely fashion, and to respond to any kind of consumer demand changes or marketplace changes," said Kerem Tomak, Vice President of Analytics at Macys.com. "We have been using SAS Data Management to pull data from Hadoop to generate hundreds of thousands of models at the product or SKU level. That tells us the probability of selling a product at a particular time and location so we can optimize our assortment."

Enhancements to SAS Data Management include new in-database data quality capabilities, improved event-stream processing, and new access engines to more easily use data from Hadoop and other sources.

New in-database data quality

This update introduces in-database data quality to the SAS Data Management family. Cleansing data within the database requires less data movement, which dramatically reduces the time needed to transform raw data into usable information. Ultimately that means better, faster business decisions.

The new SAS Data Quality Accelerator for Teradata is the only in-database data-cleansing technology for the Teradata platform. By executing data quality functions directly in the Teradata database, companies can improve processing speeds, reduce IT workload, and most importantly, provide high-quality data for a host of downstream business uses.
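The benefit of in-database cleansing can be illustrated with a minimal, generic sketch (this is ordinary SQL against SQLite, not the SAS Data Quality Accelerator's API or Teradata syntax): standardization logic runs inside the database engine, so raw rows are never extracted, fixed client-side, and reloaded.

```python
import sqlite3

# Generic illustration of in-database cleansing: the standardization
# runs where the data lives, so only small results cross the wire.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, state TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "N.C."), (2, "NC"), (3, "North Carolina"), (4, "TX")],
)

# One in-place UPDATE instead of a round trip for every dirty row.
conn.execute("""
    UPDATE customers
    SET state = 'NC'
    WHERE state IN ('N.C.', 'North Carolina')
""")

# Only the cleaned, aggregated result leaves the database.
rows = conn.execute(
    "SELECT state, COUNT(*) FROM customers GROUP BY state ORDER BY state"
).fetchall()
print(rows)  # [('NC', 3), ('TX', 1)]
```

The same principle scales up: pushing the cleansing function to the database avoids moving millions of raw rows to a separate processing tier.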

"Quality data is fundamental for accurate business decisions, and the new SAS Data Quality Accelerator for Teradata enables customers to efficiently deal with data integrity issues. The solution quickly and thoroughly transforms raw data into valid, dependable information that's ready when organizations need it," said Rob Berman, Vice President of Teradata Partnerships and Alliances. SAS and Teradata are longtime partners.

Improved event stream processing

Now with an improved interface, SAS Event Stream Processing lets companies access and analyze data as it happens instead of when it is hours or even days old. Companies like electric utilities, which continually collect high volumes of data on customer power use from smart meters, can use real-time data to anticipate and react to sudden spikes in demand. And financial services firms can use event-stream processing to evaluate trades in real time and identify risky or fraudulent activity.
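The smart-meter scenario can be sketched in a few lines of generic Python (this is an illustration of the event-stream idea, not SAS Event Stream Processing's API): each reading is evaluated the moment it arrives, against a sliding window of recent values, rather than in a batch job hours later.

```python
from collections import deque

def detect_spikes(readings, window=4, factor=2.0):
    """Flag a reading as it arrives if it exceeds `factor` times the
    average of the last `window` readings -- acting on data in flight."""
    recent = deque(maxlen=window)
    spikes = []
    for timestamp, kilowatts in readings:
        if len(recent) == window and kilowatts > factor * (sum(recent) / window):
            spikes.append((timestamp, kilowatts))
        recent.append(kilowatts)
    return spikes

# Simulated smart-meter stream: (timestamp, kilowatts)
stream = [(0, 1.0), (1, 1.2), (2, 0.9), (3, 1.1), (4, 5.0), (5, 1.0)]
print(detect_spikes(stream))  # [(4, 5.0)]
```

A real deployment would consume a live feed and trigger an action (load balancing, a fraud alert) inside the loop, but the core pattern is the same: state is small and rolling, and decisions happen per event.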

New data sources

Data comes not only in many sizes, but from many sources, including the cloud, Hadoop clusters, streaming transactional data, on-premises databases, mainframes, in-memory databases and more. Regardless of format or source, SAS Data Management organizes and cleanses data to create a consistent, valid data source for operational systems and for business intelligence and analytics applications.

With this upgrade, SAS makes it even easier to manage data from HP Vertica, Hortonworks, and PostgreSQL. With data connectivity and integration between SAS and these data sources, customers can edit, move, cleanse and integrate data regardless of source, platform, size or variety.