SILC is founded on the belief that spatial skills are malleable, and that education or life experience can therefore enhance spatial thinking. However, prior research on this issue has produced mixed results. Let’s look at two examples. On the one hand, Sorby and Baartmans (1996, 2000) have demonstrated that engineering students who receive extensive practice in spatially relevant skills, such as mentally rotating figures or disembedding, do better in their courses and are more likely to graduate. On the other hand, Sims and Mayer (2002) offer evidence that training of spatial skills has only narrow and fleeting effects.

We undertook a systematic meta-analysis of the literature on spatial skills to address the malleability of spatial thinking. A meta-analysis is a quantitative review of the literature, in which prior findings are organized and systematized. Meta-analyses are important because their conclusions are much more reliable than the results of any one study. A meta-analysis helps shed light on what the studies say when taken together. Put simply, meta-analyses represent our best assessment of what is known about a topic.
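To give a flavor of how a meta-analysis combines results, here is a minimal sketch of one common approach, a fixed-effect inverse-variance average, in which more precise studies get more weight. The numbers are made up for illustration, and the actual weighting scheme used in the meta-analysis may differ.

```python
# Fixed-effect meta-analytic average (illustrative numbers only):
# each study's effect size is weighted by the inverse of its
# sampling variance, so more precise studies count for more.
effects = [0.6, 0.4, 0.8, 0.3]        # effect sizes from four hypothetical studies
variances = [0.04, 0.02, 0.09, 0.05]  # sampling variance of each estimate

weights = [1 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
print(round(pooled, 3))
```

Note how the pooled estimate sits closest to the effects reported by the low-variance (most precise) studies.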

We began the meta-analysis by searching for relevant articles and research studies. We searched several computer databases of publications in psychology, education, and related fields. In addition, we took pains to obtain as much unpublished work as possible. If we had limited our results to published studies, we might have reached a biased conclusion based on only those articles that showed strong enough results to merit publication. We therefore wrote to researchers in the field, asking them to send us their ongoing or otherwise unpublished work. We also searched Dissertation Abstracts International (ProQuest) and the listings of papers presented at recent conferences. We included over 200 studies in our meta-analysis, of which approximately 50 were unpublished.

Different researchers measure the outcomes of spatial training in different ways. For example, some researchers examined whether training led to increases in the speed of mental rotation, whereas others looked at whether training led to an increase in scores on standard tests of spatial skills. To compare the different measures, we converted each to an effect size, which allowed us to place all of the effects of training along a common scale. Effect sizes express the effect of an intervention in terms of standard deviation units. An effect size of 1.0 would indicate, for example, that the intervention had led to an improvement of one standard deviation, relative to where the group had been before training, or as compared to a control group.
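The standard-deviation-units idea can be made concrete with a short sketch of Cohen's d, one common effect-size measure of this kind. The scores below are invented for illustration.

```python
import statistics

def cohens_d(treatment_scores, control_scores):
    """Standardized mean difference (Cohen's d) between two groups.

    Effect size = (mean difference) / (pooled standard deviation),
    so a value of 1.0 means the treatment group scored one standard
    deviation higher than the comparison group.
    """
    n1, n2 = len(treatment_scores), len(control_scores)
    m1, m2 = statistics.mean(treatment_scores), statistics.mean(control_scores)
    v1, v2 = statistics.variance(treatment_scores), statistics.variance(control_scores)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled_sd

# Hypothetical post-training scores on a mental-rotation test:
trained = [24, 27, 25, 29, 26, 28]
control = [22, 23, 21, 25, 24, 22]
print(round(cohens_d(trained, control), 2))
```

Because the result is expressed in standard deviation units rather than raw test points, effects measured on very different instruments (rotation speed, standardized test scores) can be placed on the same scale and averaged.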

The results clearly indicate that spatial skills are malleable and respond well to training. The average effect size was 0.53, indicating that training led to an improvement of more than half a standard deviation. To provide a sense of scale, an effect of this size would be equivalent to an intervention improving SAT scores by more than 50 points or IQ scores by more than 7.5 points. In addition, some of the studies assessed the effects of training after delays, some as long as a month or more. The effects of training endured, indicating that on average, the effects of training are not fleeting. This finding is important because it suggests that training could lead to knowledge that students retain as they take new classes or acquire new skills.
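The SAT and IQ comparisons follow directly from the standard deviations of those scales. The only assumptions in the sketch below are the scale parameters themselves: roughly 100 points for an SAT section score and 15 points for standard IQ scores.

```python
effect_size = 0.53   # average effect of spatial training, in SD units

sat_sd = 100         # approximate SD of an SAT section score
iq_sd = 15           # SD of standard IQ scores

# Converting the effect size back into each scale's raw points:
print(effect_size * sat_sd)  # 53.0  -> "more than 50 SAT points"
print(effect_size * iq_sd)   # 7.95  -> "more than 7.5 IQ points"
```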

As shown in the Figure below, training was even more effective for young children, although it remained effective for older participants as well. Thus, training may be more effective if started early, but that is no reason to give up on training later in life.

Figure 1: Effect Size of Training as a Function of Age of Participants

The meta-analysis also shed light on why different studies have obtained disparate results and reached disparate conclusions. It turns out that much of the variability in results was due to (a) whether the study included a control group, and (b) what the control group did instead of training. In some cases, the control groups performed tasks that were not designed explicitly to be a form of training but turned out to be helpful nevertheless. For example, taking several different kinds of spatial tests turned out to be a form of training in itself. While not as effective as formal training, this informal training did lead to some improvements. If an intervention was compared to a control group that improved because of informal training, then the intervention might not look very effective, even though both the intervention and control group improved. The meta-analysis shows that it is important to look at improvement in both intervention and comparison groups.
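The point about improving control groups can be seen in a small gain-score sketch. All numbers below are made up; the calculation simply contrasts the improvement a training group shows on its own with its improvement relative to a control group that also got better through informal training.

```python
# Hypothetical pre/post test means (raw points) and pretest SD.
pre_sd = 4.0

treatment_gain = 28.0 - 24.0   # trained group improved 4 points
control_gain = 26.0 - 24.0     # control improved 2 points via "informal training"

# Measured against the group's own pretest, the training looks large:
effect_vs_pretest = treatment_gain / pre_sd            # 1.0 SD

# Measured against a control group that also improved, it looks smaller,
# even though the training worked just as well in both calculations:
effect_vs_control = (treatment_gain - control_gain) / pre_sd  # 0.5 SD

print(effect_vs_pretest, effect_vs_control)
```

This is why the meta-analysis had to track improvement in both the intervention and the comparison groups: the same training can yield very different-looking effect sizes depending on what the control group was doing.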

In summary, the meta-analysis has shown that spatial training can be highly effective. Efforts to improve spatial training can be valuable and long-lasting. SILC is investigating many different ways to enhance spatial learning in both children and adults.

♦ Sorby, S. A., & Baartmans, B. J. (2000). The development and assessment of a course for enhancing the 3-D spatial visualization skills of first year engineering students. Journal of Engineering Education, 89(3), 301–307.