Facebook Advertising Filters Allow Advertisers to Be Hugely Racist, Which Is Horrible and Also Super-Illegal
There are times when Facebook seems like it’s really trying to be better (unconscious bias training! equal pay for women!), and then there are times when it’s just like what are you even doing, Facebook? Case in point: the fact that advertisers on Facebook can segment their ads based on race—not to target certain groups, but to exclude them.
As reported by ProPublica, Facebook advertisers can exclude whatever “ethnic affinities” they’d like from their ads. So, have a housing ad and don’t want black, Latinx, or Asian folks renting your apartment? They never have to see your ad! Which is totally great, if you’re racist, but also super-illegal.
ProPublica enlisted a civil rights attorney named John Relman to parse this, and Relman immediately said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.” What’s the Fair Housing Act? Well, it’s this little law that came into effect in 1968 that makes it illegal for an advertisement for the sale or rental of a home or apartment to state “any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.”
Meanwhile, Steve Satterfield, Facebook’s Privacy and Public Policy Manager, insists that their practices and filters are all above board and in accordance with the law. He says, “We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law. We take prompt enforcement action when we determine that ads violate our policies.”
He then went on to talk about the importance of advertisers being able to segment and target their advertising to see what works and what doesn’t in the market. As an example, he said that one “might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry.”
Now, I used to work in marketing, and I understand the need to target demographics depending on what you’d like to sell. What he’s describing is indeed common practice, and rightfully so. However, the way Facebook has set it up, you’re keeping people out of an ad, rather than targeting a specific market. You can choose to “Exclude people who match” a certain “ethnic affinity,” and it’s that exclusion that’s the problem.
See, if someone wants to run the above test re: the ad in English vs. the ad in Spanish, one could just as easily run the ad in English targeting the Latinx market, then run the same ad in Spanish targeting the Latinx market. Boom. Done. Having the ability to run an ad to everyone but leave out certain groups is basically just setting Facebook up for abuse by people who would use the function to discriminate. Satterfield mentioned “prompt enforcement” of any violations, but he doesn’t specify what that means exactly. And if their enforcement of these guidelines is anything like their arbitrary “enforcement” of their general community guidelines, I don’t have high hopes for this.
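To make the contrast concrete, here’s a minimal sketch of the two approaches as campaign configs. These field names (`creative_language`, `include_affinities`, `exclude_affinities`) are purely illustrative—they are not Facebook’s actual Marketing API schema—but they capture the difference between Satterfield’s exclusion-based test and the inclusion-based version that accomplishes the same thing:

```python
# Hypothetical campaign specs -- field names are illustrative,
# NOT Facebook's real Marketing API.

# The exclusion approach Facebook offers: show the ad to everyone
# EXCEPT a given "ethnic affinity" group. This is the mechanism
# that a discriminatory housing advertiser could abuse.
exclusion_campaign = {
    "creative_language": "en",
    "audience": {"exclude_affinities": ["hispanic"]},
}

# The inclusion-based A/B test that achieves the same marketing goal:
# run both language versions TO the same affinity group and compare
# performance. No one is shut out of seeing either ad.
ab_test_campaigns = [
    {"creative_language": "en", "audience": {"include_affinities": ["hispanic"]}},
    {"creative_language": "es", "audience": {"include_affinities": ["hispanic"]}},
]

# The A/B test never needs an exclusion at all.
assert all(
    "exclude_affinities" not in c["audience"] for c in ab_test_campaigns
)
```

The point of the sketch: every legitimate language-comparison use case Satterfield describes can be expressed with inclusion targeting alone, so the exclusion knob only adds risk.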
There’s no harm in someone seeing an ad that’s not necessarily meant for them. If they weren’t going to be interested in buying your product anyway, then them seeing the ad means nothing. If you want to target, you need to pay attention to the demographics of who is responding to which ads.
However, the only reason to keep someone out of an ad is if you’re afraid it will work on them, but you don’t want them responding to your offering. That’s textbook discrimination, and as it’s set up now, Facebook is enabling that.