Going Parallel With Star-P™ and MATLAB®

Parallel computing can often speed up common mathematical operations used in science and engineering, and enable larger data sets to be processed. Let's briefly take a look at the kinds of algorithms and code structures that perform well when parallelized, using a combination of two scientific computing tools: Star-P from Interactive Supercomputing, and MATLAB® from The MathWorks.

First of all, it is important to note that not all algorithms and code structures lend themselves equally well to parallelization. There are two programming approaches that leverage parallel computing power: task-parallel and data-parallel computations. Figure 1 illustrates a strategy for porting a custom serial algorithm to a parallel implementation: some computations make sense to leave on the desktop, others lend themselves well to task parallelism, and others to data parallelism:

Execute in serial: some computations are so trivial that parallelization (and the associated overhead) is unnecessary. For the same reason you wouldn't take a plane to the neighborhood convenience store, trivial operations that take fractions of a second and/or operate on small data sets or text strings are best left on the desktop for serial execution.

Task-parallel computations: Task parallelism (sometimes called "coarse-grained" or "embarrassingly parallel") is a powerful method for carrying out many independent calculations in parallel, such as Monte Carlo simulations, or "unrolling" serial FOR loops. For example, in a medical application involving image processing on multiple brain slices, Star-P can distribute the images across several processors and process them simultaneously.

Data-parallel computations: Data parallelism (sometimes called "global array syntax") is used for high-level matrix and vector operations on large data sets, for example, a two-dimensional Fast Fourier Transform of a high-resolution image.

As Figure 1 shows, the two modes can be combined, and together they make for a very powerful approach to writing custom parallel applications.

Let's take a look at how to implement the two modes of parallelism with a couple of simple examples.

Data Parallel

To take advantage of Star-P's global array syntax, just add the *p construct to the dimension of the variable. Adding the *p makes the variable parallel, and through propagation, related variables also become parallel. The following MATLAB script is a simple example: simulating tossing a coin two million times and adding up the number of heads using a random vector, first in serial MATLAB (in the green frame), then taking advantage of Star-P (red frame) to create the random vector on the parallel server.
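The framed listings are not reproduced here; a minimal sketch of what they might contain follows (the variable names are illustrative assumptions, not the article's originals):

```matlab
% Serial MATLAB (green frame): toss a coin two million times
a = rand(2000000, 1);      % uniform random numbers in [0,1)
heads = sum(a > 0.5);      % count the tosses that came up heads

% Star-P version (red frame): tag the dimension with *p so the
% random vector is created on the parallel server
ap = rand(2000000*p, 1);   % ap is a distributed (parallel) array
headsp = sum(ap > 0.5);    % propagation keeps the sum on the server
```

Note that the only change is the *p tag on the array dimension; the rest of the script is unchanged.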

Task Parallel

To illustrate task parallelism, let's consider the example of a three-dimensional array (100x100x500 elements). Think of it as 500 planes of 100x100 elements each. Our goal is to compute the inverse of each plane. The green frame below illustrates a "for" loop in MATLAB executed 500 times.
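A sketch of what the serial loop might look like (variable names are illustrative assumptions):

```matlab
% Serial MATLAB ("green frame"): invert each 100x100 plane in turn
a = rand(100, 100, 500);
b = zeros(100, 100, 500);
for i = 1:500
    b(:,:,i) = inv(a(:,:,i));   % one independent inverse per plane
end
```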

To do exactly the same thing with Star-P (red frame), we would tag the last dimension of the matrix with a *p, and then use the ppeval command (analogous to MATLAB’s “feval,” or “evaluate function” command).The arguments we pass it are the function we are performing, ‘inv’ in this case, and the variable we are performing it on – the matrix a.Star-P takes care of distributing and computing this, transparently, independent of the number of parallel processors.
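A sketch of the Star-P equivalent, assuming the same illustrative variable names as above:

```matlab
% Star-P ("red frame"): tag the last dimension with *p, then let
% ppeval apply 'inv' to each 100x100 plane in parallel
a = rand(100, 100, 500*p);   % distributed across the last dimension
b = ppeval('inv', a);        % inverses computed across the processors
```

Because the 500 inverses are independent, Star-P can farm them out to however many processors are available without any change to this code.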

You need this trick if you would like to execute your MATLAB code in parallel, to solve bigger problems, and for faster execution.

Got a cool software trick? Send us details, including any documentation and supporting code, to kfield@reedbusiness. If we publish your trick, we’ll send you a super cool Design News t-shirt.
