HeartMath learning tools and techniques are devices and procedures intended to enhance student performance. The Institute of HeartMath has marketed a TestEdge® classroom program that includes lessons about a “Freeze-Frame®” technique and test-taking strategies. For students in high school, the program also includes lessons about “heart-brain synchronization” and the use of background music. Typically, older students practice using a computerized “emWave® PC Stress Relief System” that provides biofeedback.

CLAIMS ABOUT TEST SCORES

According to the folks in Boulder Creek (the Institute of HeartMath):

“Students in more than 1,000 schools around the world have benefited from IHM research by using HeartMath learning programs to increase their confidence at school, raise test scores and overall academic performance . . .”1
“The TestEdge methods of reducing test anxiety and actually raising test scores . . .”2
“. . . there was a significant increase in test performance in the experimental group over the control group, ranging on average from 10 to 25 points.”3

They appear to be claiming that their techniques cause students to achieve higher test scores.
What evidence do they have to support this claim?

SHOWING CAUSE AND EFFECT

The purpose of an experiment is to show that Treatment X causes changes in Behavior Y. Because experiments provide the clearest evidence of cause-and-effect relationships, psychologists typically give great weight to experimental research. Experimenting is not the only method we use, but, in general, it is the most valuable.

For a discussion of how experiments show cause-and-effect, click here.

When people test ideas about the causes of behavior, the three most common mistakes are:
(1) lack of a control condition,
(2) lack of random assignment,
(3) lack of “blind” observers.
Studies that have these flaws do not show that a treatment causes changes in behavior.
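The logic of a properly controlled experiment can be sketched with a toy simulation (all numbers here are invented for illustration, not drawn from any HeartMath study): when subjects are randomly assigned to treatment and control conditions, pre-existing differences such as motivation average out across the groups, so a difference in outcomes can be attributed to the treatment itself.

```python
import random

random.seed(42)

# Hypothetical: each student has an unobserved "motivation" level that
# affects test scores regardless of any treatment.
students = [random.gauss(0, 1) for _ in range(200)]

# Random assignment: motivation ends up spread evenly across both groups.
random.shuffle(students)
treatment, control = students[:100], students[100:]

def score(motivation, treatment_effect=0.0):
    # Score = baseline + motivation + treatment effect + noise.
    return 50 + 10 * motivation + treatment_effect + random.gauss(0, 5)

# Simulate a treatment with a true effect of +5 points.
t_scores = [score(m, treatment_effect=5.0) for m in treatment]
c_scores = [score(m) for m in control]

mean = lambda xs: sum(xs) / len(xs)

# With random assignment, the group difference estimates the true effect.
diff = mean(t_scores) - mean(c_scores)
print(round(diff, 1))
```

Without the random assignment step, any motivation gap between the groups would be folded into `diff`, and the comparison would no longer isolate the treatment.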

THE EVIDENCE

What sorts of studies have been done by people promoting HeartMath learning tools and techniques that are supposed to help students achieve higher scores?

The subjects were 20 high school seniors who needed to retake State tests in Reading or Math that are required for graduation. They enrolled in a “Spring Training Camp.” The students’ previous, failing scores on the State test served as a baseline measure of their school performance. The three-week training camp included 25 hours of instruction. Two-thirds of this time was spent working on “the same standardized curriculum that is used throughout the school district for state test preparation.” The other one-third of the time was spent practicing HeartMath techniques, including the computerized “emWave® Stress Relief System” that provides biofeedback. Finally, the students took the State exam again, and their old scores were compared with their new ones.
The results showed that the students’ test scores went up. The researchers then compared this increase with the increases shown by other students in the school district who did not learn HeartMath techniques. Because the increase for the 20 HeartMath students was larger than the average increase for the other students, the researchers claimed the HeartMath program caused superior test scores.
This study had all three of the major problems described earlier:
(1) lack of a control condition,
(2) lack of random assignment,
(3) lack of “blind” observers.
The other students in the school district do not constitute a valid control group for two reasons: students were not assigned randomly to one group or the other, and the Spring Training Camp group contained only 20 students while the number of other students (unreported) was surely much larger (200? 400?). A treatment group this small yields weak statistical tests of significance, and in any case the researchers report no such tests—they simply claim that the results are important.
Because of these flaws, this study does not provide evidence that the HeartMath program caused greater increases in test scores.
There are at least two simpler explanations for the results:
(a) Perhaps students who volunteered for the Spring Training Camp were more highly motivated to do well, compared to the average student in the school district. A higher need for achievement would make those students work harder and benefit more from the standardized curriculum. Because students were not assigned randomly to the Spring Training Camp, this explanation cannot be ruled out.
(b) Perhaps the instructors working with students in the Spring Training Camp gave them more attention and individual feedback, compared to instructors working with other students in the school district. Because the instructors were not “blind” to the fact that these students received HeartMath training, the instructors’ expectations of high performance may have produced a self-fulfilling prophecy.
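Explanation (a) is easy to demonstrate with a toy simulation (all numbers invented): if the camp volunteers are, on average, more motivated than the rest of the district, their retest gains will be larger even when the program itself contributes nothing.

```python
import random

random.seed(0)

def retest_gain(motivation):
    # Gain from retaking a test depends on effort, not on any treatment.
    return 5 + 8 * motivation + random.gauss(0, 3)

# Hypothetical: the 20 volunteers are drawn from the motivated end of the
# distribution; the hundreds of other students are about average.
volunteers = [retest_gain(random.gauss(1.0, 0.3)) for _ in range(20)]
others = [retest_gain(random.gauss(0.0, 0.3)) for _ in range(400)]

mean = lambda xs: sum(xs) / len(xs)

# The volunteers show a larger average gain with zero treatment effect.
gap = mean(volunteers) - mean(others)
print(round(gap, 1))
```

The simulated `gap` is entirely a self-selection artifact, which is exactly why random assignment is required before attributing such a difference to the program.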

These researchers describe four studies involving four groups of students:
In 2002, “Instructors met with seven students, three hours per day, five days per week, for three weeks. Each day alternated between 45 minutes HeartMath instruction and 45 minutes math tutoring.”
On the first day, the students took a pre-test to determine their level of math proficiency. The HeartMath instruction included practicing the Freeze-Frame technique and trying to “re-experience” positive emotions. The students did so while using the computerized “emWave® PC Stress Relief System.” At the end of the three weeks, the students took another test of math performance.
In 2003, instructors worked with 15 students four days per week for three weeks. The students took a pre-test to determine their math proficiency. Then, each day, the students spent two hours discussing emotions and practicing HeartMath techniques (including use of the computer system), and one hour practicing solving math problems. On the last day, the students took another test of math performance.
In 2004, instructors worked with an unreported number of students four days per week for three weeks. The students took a pre-test to determine their math proficiency. Then they used the same techniques as in 2003 but blended together the HeartMath and school math activities. On the last day, the students took another test of math performance.
In 2005, instructors worked with 16 students for seven weeks. The students took a pre-test to determine their math proficiency. Then they did the same activities in the combined format used in 2004. On the last day, the students took another test of math performance.
In each of these four studies, the researchers compared the students’ pre-test math scores with their final math scores. Each group’s scores went up, and the researchers said this result showed that the program caused increased scores.
All four studies had the three major problems described earlier:
(1) lack of a control condition,
(2) lack of random assignment,
(3) lack of “blind” observers.
Because of these flaws, these studies do not provide evidence that the HeartMath program is what caused the increases in test scores.
There is at least one simpler explanation for the results:
Perhaps simply spending more time working on math problems is what caused higher scores. Because there was no control condition—in which students worked on math problems without doing the HeartMath activities—there is no evidence that HeartMath had an effect.
Note: simply comparing the performance of the students in the treatment group with other remedial math students would not provide evidence of a HeartMath effect. If the students have not been assigned randomly to either a HeartMath group or a non-HeartMath group, their pre-existing individual characteristics (such as IQ) could explain any differences in the performance of the groups.
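The point about the missing control condition can be sketched as a simulation (numbers invented): if practice alone raises scores, a pre/post gain appears even when the add-on technique contributes nothing, and only a practice-only control group could reveal that.

```python
import random

random.seed(1)

PRACTICE_EFFECT = 12.0   # hypothetical gain from hours of math practice
TECHNIQUE_EFFECT = 0.0   # suppose the add-on technique does nothing

def post_score(pre, technique=False):
    gain = PRACTICE_EFFECT + (TECHNIQUE_EFFECT if technique else 0.0)
    return pre + gain + random.gauss(0, 4)

# Fifteen students take a pre-test, practice for three weeks, then retest.
pre = [random.gauss(55, 8) for _ in range(15)]
post = [post_score(p, technique=True) for p in pre]

mean = lambda xs: sum(xs) / len(xs)

# Scores go up even though the technique itself had zero effect; without a
# practice-only control group, the gain is uninterpretable.
gain = mean(post) - mean(pre)
print(round(gain, 1))
```

A pre/post gain like this is consistent with the technique helping, hurting, or doing nothing at all, which is why the four studies cannot support a causal claim.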

In this study, the subjects were tenth-grade students at two high schools in Northern California. At one school, 602 students received HeartMath training. At the other school, described as the “control group,” 332 students were not trained. The HeartMath TestEdge® program included practicing the Freeze-Frame technique and trying to “re-experience” positive emotions. The students did so while using the computerized “emWave® PC Stress Relief System” that provides biofeedback.

For details about the procedure—and flaws in the procedure—click here.

This study had all three of the major problems described earlier:
(1) lack of a(n equivalent) control condition,
(2) lack of random assignment (of individual subjects),
(3) lack of “blind” observers.
In addition, the researchers used some dubious statistical procedures when analyzing the results of this study.
Because of these flaws, this study does not provide evidence that the HeartMath program caused increases in test scores.

SUMMARY

Despite obvious flaws in research procedures, promoters of HeartMath continue to claim these studies show that their tools and techniques cause increases in students’ test scores. They also publish many articles that merely repeat these claims without presenting any new scientific evidence. The claim that HeartMath has been “scientifically validated” as a way to increase test scores appears to be without merit.