Finding efficient ways to compress and decompress data is more important than ever. Compressed data takes up less space and requires less time and network bandwidth to transfer. In this article, we’ll discuss the data compression functions in the Intel® Integrated Performance Primitives (Intel® IPP) library and the latest improvements to them.
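
To ground the discussion, here is a minimal sketch of the lossless compress/decompress round trip these functions accelerate. It uses Python’s standard zlib module rather than the IPP APIs themselves (Intel distributes IPP-based optimization patches for zlib); the payload and compression level are arbitrary illustrative choices.

```python
import zlib

# Arbitrary, repetitive sample payload: DEFLATE compresses it well.
data = b"insideBIGDATA " * 1024

# Level 6 is zlib's default speed/ratio trade-off; optimized builds
# such as the IPP-patched zlib accelerate exactly this hot path.
compressed = zlib.compress(data, level=6)
restored = zlib.decompress(compressed)

assert restored == data
print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes "
      f"({len(compressed) / len(data):.1%} of original)")
```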

From car to cloud, and the connectivity in between, there is a need for automated driving solutions that include high-performance platforms, software development tools, and robust technologies for the data center. With Intel GO automotive driving solutions, Intel brings its deep expertise in computing, connectivity, and the cloud to the automotive industry.

Vectorization offers potential speedups in code with significant array-based computations, speedups that amplify the improved performance already obtained through higher-level parallelism such as threading and distributed execution on clusters. Key enablers of vectorization include tunable array sizes, to match varying processor cache and instruction capabilities, and stride-1 accesses within inner loops.
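
As a minimal illustration of the payoff, the NumPy sketch below (the array size and timing harness are illustrative choices, not from the article) compares an element-at-a-time Python loop against the equivalent vectorized expression over contiguous, stride-1 arrays.

```python
import time

import numpy as np

n = 1_000_000                      # tunable array size
a = np.random.rand(n)              # contiguous, stride-1 arrays
b = np.random.rand(n)

# Scalar path: one multiply-add per loop iteration.
t0 = time.perf_counter()
out_loop = np.empty(n)
for i in range(n):
    out_loop[i] = 2.0 * a[i] + b[i]
t1 = time.perf_counter()

# Vectorized path: the same computation as one bulk array operation.
out_vec = 2.0 * a + b
t2 = time.perf_counter()

assert np.allclose(out_loop, out_vec)
print(f"loop: {t1 - t0:.3f}s  vectorized: {t2 - t1:.3f}s")
```

On typical hardware the vectorized form wins by one to two orders of magnitude, the same effect compilers exploit when inner loops access arrays with unit stride.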

Novosibirsk State University is one of the major research and educational centers in Russia and one of the largest universities in Siberia. When researchers at the University were looking to develop and optimize a software tool for numerical simulation of magnetohydrodynamics (MHD) problems with hydrogen ionization, part of an astrophysical objects simulation (AstroPhi) project, they needed to optimize the tool’s performance on Intel® Xeon Phi™ processor-based hardware.

Big data projects don’t typically fail for a single reason, and certainly not for technology alone. A combination of factors serves to derail big data deployments: business strategy, people, culture, and inattention to analytics details or the nuances of the implemented tools, all intensified by the rapid advancement of digital transformation.

While there are multiple architectures targeting big data analytics, we have traditionally focused on maximizing dedicated hardware and raw processing power. In the past few years, however, there has been a fast-growing convergence of big data and the cloud, particularly where the data sets are unstructured and the data models simple, an area of specific focus for the Apache Hadoop technology.

Data scientists generally subscribe to the “machine learning process,” which serves as a roadmap to follow when working on a data science project. The infographic at the end of this article provides a detailed workflow that is general enough to encompass virtually any data science project.
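
As one deliberately small instance of that workflow (the dataset, model, and split ratio below are illustrative assumptions, not taken from the infographic), here is the acquire/split/train/evaluate skeleton in scikit-learn.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Acquire the data, then hold out a test set for honest evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Train a model, then evaluate it on data it has never seen.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```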

In the world of big data, a major goal for most enterprises is to maximize the value of their customer data in order to gain a 360-degree view of the customer. Most customer data, however, is housed in separate data silos. While each silo contains important pieces of information about your customers, if you don’t connect the data into a single view, you’re only seeing portions of the customer equation.
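
A small, hypothetical pandas sketch of that single-view idea (the table names, columns, and customer IDs are invented for illustration): two silos that share a customer key are merged into one wider record per customer.

```python
import pandas as pd

# Hypothetical silo 1: CRM profile records.
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "name": ["Ana", "Bo", "Cy"],
})

# Hypothetical silo 2: order history.
orders = pd.DataFrame({
    "customer_id": [101, 101, 103],
    "order_total": [25.0, 40.0, 15.0],
})

# Summarize one silo per customer, then left-join on the shared key
# so customers without orders still appear in the unified view.
spend = orders.groupby("customer_id", as_index=False)["order_total"].sum()
view_360 = crm.merge(spend, on="customer_id", how="left")
print(view_360)
```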

Artificial Intelligence (AI) may seem like a vision of the distant future, but in truth AI is all around us, as machines increasingly learn to sense, learn, reason, act, and adapt in the real world. It is transforming industries and changing our lives in amazing new ways: amplifying human capabilities, automating tedious or dangerous tasks, and helping to solve some of our most challenging societal problems. In this article, we’ll discuss the path to AI with Intel technologies.

In this technology case study, we’ll examine MeritData, Inc., a leading big data analysis technology and service provider in China. The company’s product, Tempo, is a big data platform widely used by well-known power, manufacturing, financial, and global enterprises, as well as by cloud service providers. MeritData helps its customers explore and exploit the value of their data through data processing, data mining, and data visualization solutions, built on a fusion of high-performance computing (HPC) technology, leading data analysis algorithms, high-dimensional visualization, and a creative data visualization language.

Industry Perspectives

In this special guest feature, Brian D’alessandro, Director of Data Science at SparkBeyond, discusses how AI is a learning curve: exploring opportunities within the technology further extends its potential to enable transformation and generate impact. AI can shape some workflows to drive efficiency and growth, automate others, and create new business models. And while AI empowers us to predict the future, it also gives us the opportunity to change it.

White Papers

This insideBIGDATA technology guide explores how current implementations of AI and deep learning (DL) applications can be deployed using new storage architectures and protocols specifically designed to deliver data with high throughput, low latency, and maximum concurrency.