Following Facebook's November promises to take action, the company unveiled a suite of rules and machine-learning tactics on Wednesday, all in the name of curtailing discriminatory ad-targeting practices.

The most notable of these is Facebook's new automated toolset, built to identify advertisements for "housing, employment or credit opportunities" and flag any that employ the site's "multicultural affinity targeting" system. In other words, if such an ad is built with a request that Facebook not deliver it to African-American, Hispanic, or Asian-American viewers, the site will attempt to automatically block the ad and display a relevant notice. Advertisers can then either remove the cultural limitations from the ad or request a manual review for its approval.

Should the automated toolset recognize this type of ad but not detect apparent cultural targeting, advertisers will instead be directed to a three-paragraph "certification" notice, which they must sign. Among other things, the notice requires advertisers to pledge that they "will not use Facebook advertising to improperly discriminate." It coincides with Facebook updating the language about discriminatory practices in its advertiser-policy pages. In an October report, ProPublica exposed previous issues in the social network's advertising platform by buying and running discriminatory ads that flew in the face of the Fair Housing Act.

The announcement in no way clarifies what terms and cues Facebook will track to recognize an ad as one for housing, employment, or credit, nor does it suggest that an advertiser who receives a notice and then immediately rewrites an ad to remove trackable terms will face any further scrutiny. And the announcement didn't mention any further changes to the multicultural targeting system beyond its renaming from "ethnic affinity targeting."

The American Civil Liberties Union took the opportunity to commend Facebook for its efforts—and asked other tech companies with self-serve advertising platforms to follow suit. "All ad platforms should make it impossible to target ads in these categories by any protected class status, including race, gender, and religion. And we need to keep educating platforms and advertisers about the danger of discrimination that targeting presents, even when ads are targeted by zip code or based on what music you listen to."