Google searches on names that "sound black" result in ads suggesting that the person may be a criminal, according to a new Harvard study.

Professor Latanya Sweeney says she found "statistically significant discrimination" when comparing ads served with results from online searches made using names associated with blacks and those associated with whites. Sweeney, who is African-American, began her study after learning that a Google search for her own name produced an ad for a background check service, hinting that she'd been arrested.

Her hypothesis: Names that are associated with African-Americans, such as Latanya, are more likely to trigger negative ad associations than names such as Jill, which aren't.

I haven’t spoken to Sweeney, but I don’t believe she is accusing Google of deliberate racism, and neither am I. It would be easy for some to dismiss her work as tortured political correctness, but that’s wrong too. What is important about her work, I believe, is that it gives some insight into the experience that African-Americans may have on the Internet.

After all, doing a search on your own name (I bet you've done that) and being served an ad for a service called Instant Checkmate that implies you are a criminal has got to feel demeaning. When Sweeney searched on her name, Instant Checkmate ads saying “Latanya Sweeney, Arrested?” and “Check Latanya Sweeney’s Arrests” appeared in the paid results part of the search page.

Suppose her daughter had seen that?

When Sweeney then clicked on those ads and paid the fee, it turned out, not surprisingly, that there was no record of her being arrested. By way of contrast, she also searched for the names "Kristen Haring," "Kristen Sparrow," and "Kristen Lindquist." Ads came up, but not from Instant Checkmate or similar services. Yet when she looked up those names in Instant Checkmate's own database, there were arrest records for two of the three women.

On Reuters.com, which uses Google AdSense to serve ads, a "black-identifying name was 25 percent more likely to get an ad suggestive of an arrest record," Sweeney found. On Google, 92 percent of ads appearing next to black-identifying names suggested a criminal record, compared with 80 percent for white-identifying names, she wrote.
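To give a feel for what "statistically significant" means for a gap like 92 percent versus 80 percent, here is a minimal sketch of a standard two-proportion z-test in Python. The sample sizes below are hypothetical, chosen only for illustration; they are not the counts from Sweeney's study.

```python
from math import sqrt

def two_prop_ztest(x1, n1, x2, n2):
    """z statistic for H0: the two underlying proportions are equal,
    using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    return (p1 - p2) / se

# Hypothetical: 92% of 500 ads vs. 80% of 500 ads
z = two_prop_ztest(460, 500, 400, 500)
print(round(z, 2))
```

With samples anywhere near that size, |z| far exceeds the 1.96 cutoff for significance at the 5 percent level, which is why a 12-point gap across many searches is hard to write off as chance.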

It's not clear what's at the bottom of this. Google, of course, denies that it engages in what you'd have to call racial profiling. It may be that Instant Checkmate, which had the most online ads of any company tracked in the study, chose to link black-identifying names with ad templates suggesting a criminal record, though the company told Sweeney that it doesn't do that.

"There is discrimination in delivery of these ads," Sweeney writes in her report. "Notice that racism can result, even if not intentional, and that online activity may be so ubiquitous and intimately entwined with technology design that technologists may now have to think about societal consequences like structural racism in the technology they design."

I suspect that what may be going on here has to do with the type of searches millions of people make every day. Google's algorithms track searches and ad clicks and use that information to make results more relevant. If enough people search on black-sounding names alongside terms like "crime," or click on arrest-record ads when those names come up, that might explain this.
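The mechanism that paragraph describes can be sketched as a toy simulation: an ad server with no racial intent that simply favors whichever ad gets clicked more will, over time, learn to pair arrest-record ads with whatever searches draw those clicks. This is an illustrative sketch, not Google's actual system; the click probabilities are invented.

```python
import random

random.seed(42)

ADS = ["arrest-record ad", "neutral ad"]

def serve_ads(arrest_click_rate, rounds=10000):
    """Greedy click-feedback ad server with a little exploration.
    arrest_click_rate: how often users click the arrest-record ad
    when shown (the neutral ad's click rate is fixed at 0.05)."""
    clicks = {ad: 1 for ad in ADS}   # smoothed starting counts
    shows = {ad: 2 for ad in ADS}
    served = {ad: 0 for ad in ADS}
    for _ in range(rounds):
        if random.random() < 0.1:    # 10% random exploration
            ad = random.choice(ADS)
        else:                        # otherwise pick best click rate
            ad = max(ADS, key=lambda a: clicks[a] / shows[a])
        shows[ad] += 1
        served[ad] += 1
        rate = arrest_click_rate if ad == "arrest-record ad" else 0.05
        if random.random() < rate:
            clicks[ad] += 1
    return served

# Searches where users click arrest ads a bit more often vs. a bit less
biased_audience = serve_ads(arrest_click_rate=0.08)
neutral_audience = serve_ads(arrest_click_rate=0.02)
print(biased_audience)
print(neutral_audience)
```

The server's rule is identical in both runs; only user behavior differs, yet the first audience ends up seeing mostly arrest-record ads. That is the sense in which, as Sweeney puts it, racism can result even if not intentional.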

Ultimately, I bet we'll never find out, but it's worth thinking about the ways the Web affects all of us in some very unexpected ways.

San Francisco journalist Bill Snyder writes frequently about business and technology. He writes the Tech's Bottom Line blog for InfoWorld, and his work appears regularly in CIO.com and the publications of Stanford's Graduate School of Business and the Haas School of Business at the University of California at Berkeley.