-"four out of five men who participated in a personal-growth weekend called Journey Into Manhood reported a decrease in unwanted homosexual attractions when surveyed between six months and six years later."

-"More than half reported an increase in heterosexual attractions since they participated in the weekend program"

-"73 percent reported a decrease in homosexual behaviors."

-"93 percent reported that Journey Into Manhood had a positive impact on their efforts to diminish same-sex attractions"

And of course, since these "facts" are presented in press release format, the group obviously wants media outlets to swallow and regurgitate the data. Many "pro-family" and Christian outlets will surely take the bait.

But we've never been ones to swallow "pro-family" spin, so we decided to look at the survey data (pdf). And in just the first paragraphs of the data's first page, we found a MAJOR flaw that completely invalidates all of the above data:

People Can Change says that they surveyed the men who had participated in their so-called "Journey Into Manhood" weekend between its first outing in January 2002 and December 2006. And they reveal that in this span of time, 615 men had taken part. However, they say they only had emails for 497 of those men, so those are the only ones they polled. So right there you already have 118 men who they were unable to poll due to lack of email addresses, or because the men asked to be removed from mailings.

Okay, but then of that already reduced number of 497, the organization reveals that only 224 responded to the survey. So that takes the number of non-participants to a whopping 391 out of 615! THAT'S 63.6% WHO DID NOT PARTICIPATE (for whatever reason)!
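For anyone who wants to check our math, here's the arithmetic laid out as a quick sketch, using only the figures People Can Change's own survey document reports (615 participants, 497 email addresses, 224 responses):

```python
# Response-rate arithmetic, using the figures from the survey PDF.
participants = 615   # men who took the weekend, Jan 2002 - Dec 2006
emailed = 497        # participants the group had email addresses for
responded = 224      # participants who actually answered the survey

no_email = participants - emailed       # 118 men they couldn't even reach
no_response = participants - responded  # 391 men whose outcomes are unknown

non_response_rate = no_response / participants
print(f"Unreachable: {no_email}")
print(f"Non-respondents: {no_response} of {participants} "
      f"({non_response_rate:.1%})")
```

Run it and you get 118 unreachable men and 391 non-respondents, or 63.6% of all participants.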

Yet you see not one mention of this in the organization's press release. Instead, they lead with "4 Out of 5 Program Participants Report Reduction in Unwanted Homosexual Attractions." Considering that 3 out of 5 participants didn't even take part in the survey, their headline and their press release are a COMPLETE AND UTTER LIE! In fact, the majority drop-off in response would seem to be the most revelatory fact to come from their research!! If those who found "success" are significant, shouldn't those who couldn't or wouldn't talk about their own growth (again, a majority) also be given at least a passing thought?!?!

This is an ABSOLUTELY egregious bit of intellectual dishonesty, and we call on People Can Change to correct the record ASAP. Until then, we will certainly do our part to correct it for them.

**Oh, and of course the above is but one skeptical point about this survey. The wording of the questions and the data itself -- check out the "heterosexual behaviors" and "continuing to work on further change" fields -- are also quite suspect.

**Note: See the below comments section for why we think this situation is much different from others in which a sample set is used to represent a larger population.

Your thoughts

I don't find the percentage of non-respondents alone all that alarming, to be honest with you. Sample sets that are considerably smaller than an overall population are often used to generate statistics that are then applied to the larger population they represent. It's actually what makes surveys and statistical studies useful in the first place.

The real problem is that it's questionable just how representative of the larger population the sample set really is. And on those grounds, I do agree with you. Given that the survey method involved a great deal of self-selection, and that those who had a successful experience are more likely to respond than those who didn't, I suspect the results are skewed in favor of the desired outcome. Of course, the fact that certain results in particular were preferred by those conducting the survey also means that the study merits careful scrutiny.

Jarred: I think we're essentially saying the same thing. However, I think you're being too forgiving in terms of comparing the sample sets that were used in this particular survey to those that are used in other, more generalized polls. It's not like we're studying a random portion of the population and using that small set to represent society at large. We're talking about one particular program here. The non-respondents are of much statistical importance!

People Can Change is saying that "4 out of 5" of their participants reported reduction in same-sex attractions. Not 4 out of 5 of those surveyed -- 4 out of 5 of the participants as a whole. You simply cannot make this or any of the other definitive claims they make without acknowledging that the majority of the participants chose not to respond (especially considering what non-response is more likely to imply).

To me this would be like taking a poll on how many people are afraid of answering polls, and then using the data you're able to collect to say "Majority of population not afraid of taking polls." When you are studying something in which the non-participation is of key importance, you cannot overlook this aspect!
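To put some numbers on why the "participants as a whole" framing falls apart: here's a rough bounds check, assuming (as the press release implies) that the "4 out of 5" figure applies to the 224 men who actually responded. Nothing at all is known about the 391 non-respondents, so the true participant-wide figure could fall anywhere in a huge range:

```python
# A rough bounds check on the "4 out of 5 participants" claim.
# Assumes the 80% figure applies to the 224 respondents, per the press release.
participants = 615
respondents = 224
reported_reduction = round(0.8 * respondents)  # ~179 men

# Lower bound: no non-respondent experienced a reduction.
# Upper bound: every non-respondent did.
lower = reported_reduction / participants
upper = (reported_reduction + (participants - respondents)) / participants

print(f"Only {reported_reduction} of {participants} participants are actually "
      f"known to have reported a reduction ({lower:.0%} to {upper:.0%} possible)")
```

In other words, all the data can honestly support is that somewhere between roughly 29% and 93% of participants reported a reduction -- a range so wide it says almost nothing, and certainly not "4 out of 5."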

Also, this bit from the data strikes me as very odd. They say of the 497 for whom they had emails:

"The number 497 excludes the small number of past participants (less than 10 people?) who, for various
reasons, had asked to be excluded from People Can Change mailings. Presumably, most (but not all) of these
men held negative attitudes toward People Can Change or the possibility of change at the time."

What I find odd is the question mark that they put after the "less than 10 people" part. Why are they asking us?! Shouldn't THEY be the ones to know how many of the 118 have asked to be removed from their email list?