Facebook’s ad delivery could be inherently discriminatory, researchers say

A new study says Facebook's ad delivery algorithm discriminates by race and gender, even when advertisers try to reach a broad audience. The research supports a similar claim that the US Department of Housing and Urban Development (HUD) made last week when it sued Facebook for violating housing discrimination laws.

Numerous reports have analyzed how advertisers can target ads to exclude certain groups, but this study examines how ads are delivered once they leave advertisers' hands. Even if an ad is broadly targeted, Facebook will serve it to the audience most likely to click on it, generalizing from users' profile information and previous behavior. The system draws correlations to find this ideal audience: if techno fans are more likely to click on a specific hearing aid ad, that ad could be shown more often to other techno fans in the future, even if the advertiser never set that as an explicit targeting parameter.
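To make that mechanism concrete, here is a minimal, purely illustrative sketch in Python (not Facebook's actual system): a toy delivery loop that ranks a broadly targeted audience by a learned "relevance" score, using made-up group-level click rates, and serves the ad to the slice a budget can cover. Even with neutral targeting, delivery skews toward the group the model expects to click.

```python
# Purely illustrative sketch, not Facebook's actual system. The group
# names, click rates, and budget below are hypothetical.
import random

random.seed(0)

# Click-through rates the platform has (hypothetically) learned for ads
# similar to the new one.
learned_ctr = {"group_a": 0.030, "group_b": 0.010}

# A broadly targeted audience: equal numbers of users from each group.
audience = [(f"user_{i}", "group_a" if i % 2 == 0 else "group_b")
            for i in range(10_000)]

# The delivery system ranks users by predicted relevance (here just the
# group-level CTR plus a little noise) and shows the ad to the slice
# that the budget can cover.
budget_impressions = 2_000
ranked = sorted(audience,
                key=lambda user: learned_ctr[user[1]] + random.gauss(0, 0.005),
                reverse=True)
shown = ranked[:budget_impressions]

share_a = sum(1 for _, group in shown if group == "group_a") / len(shown)
print(f"Share of impressions delivered to group_a: {share_a:.0%}")
# Targeting was neutral, yet delivery skews heavily toward group_a.
```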

The paper (which has not yet been peer-reviewed) is a collaboration between Northeastern University, the University of Southern California, and the nonprofit Upturn. Its authors tested whether job listings or housing ads with certain keywords or images would automatically be delivered more often to certain groups, exposing what they call "previously unknown mechanisms" that could violate anti-discrimination rules. The researchers spent more than $8,500 on ads that they say reached millions of people, linking to real job listings or real estate sites, among other categories. They ran the same campaigns with different ad copy or photos, or with different budgets, and reviewed the demographic breakdowns Facebook provided for each campaign.
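The comparison itself is simple in principle. The short sketch below shows the kind of arithmetic involved, using hypothetical breakdown numbers rather than anything from the study: two campaigns that differ in a single creative element, with the platform-reported gender split computed for each.

```python
# Hypothetical numbers for illustration only; the study compared the
# demographic breakdowns Facebook reports for paired campaigns that
# differ in a single element, such as the photo.
campaign_reports = {
    "housing_ad_photo_A": {"male": 1800, "female": 200},
    "housing_ad_photo_B": {"male": 300, "female": 1700},
}

for name, breakdown in campaign_reports.items():
    total = sum(breakdown.values())
    male_share = breakdown["male"] / total
    print(f"{name}: {male_share:.0%} male / {1 - male_share:.0%} female")
```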

Some simple changes revealed dramatic splits. Housing ads with a photo of a white family, for example, were apparently shown to more white users than the same ad with a black family. (Facebook does not offer breakdowns based directly on race, so the researchers targeted the ads to locations with different racial makeups as a proxy.) An ad for jobs in the lumber industry was shown to an audience that was 90 percent male, while ads for supermarket cashier jobs reached an audience that was 85 percent female. And unlike the ads in a well-known ProPublica exposé, these were not explicitly targeted at men or women. The only difference was in the text and the photos.

Spending levels also apparently affected who saw the ads. Facebook ads are placed through a bidding process, so a campaign backed by more money can reach more "valuable" users. In this case, an ad with a very cheap campaign reached an audience that was 55 percent male, compared with a high-budget campaign, whose audience was more than 55 percent female.
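Here is another illustrative toy, not a description of Facebook's real auction: if competing advertisers were (hypothetically) to bid more on average for one demographic, a low bid would mostly win the cheaper users while a high bid could win both, which is one way budget alone could shift an ad's audience.

```python
# Illustrative toy auction, not Facebook's real one. Assumes, hypothetically,
# that competitors bid more on average for female users, so a low bid mostly
# wins the cheaper (male) impressions while a high bid wins both.
import random

random.seed(1)

def audience_mix(bid, n_users=10_000):
    won = []
    for _ in range(n_users):
        group = random.choice(["male", "female"])
        # Hypothetical competing price, higher on average for female users.
        if group == "female":
            competing_price = random.uniform(0.5, 1.0)
        else:
            competing_price = random.uniform(0.2, 0.7)
        if bid > competing_price:
            won.append(group)
    return won.count("female") / len(won)

for label, bid in [("cheap campaign", 0.4), ("high-budget campaign", 1.1)]:
    print(f"{label}: {audience_mix(bid):.0%} of won impressions go to women")
```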

The recent HUD complaint argued that by serving ads based on "relevance," Facebook is likely to reinforce social inequalities: if most home buyers in an area are white, for example, Facebook might show housing ads only to white users. That was presented as an unproven theory, but this research offers significant support for the idea.

The researchers emphasize that they still do not know why Facebook's algorithm is making any of these decisions. "We can say with confidence from this study that the content of the ad itself matters a great deal to the kind of people who see it. But we can't say exactly how those calculations are made," says Aaron Rieke of Upturn.

When asked for comment, Facebook stressed that it was trying to root out bias. "We stand against discrimination in any form. We have made important changes to our ad targeting tools and we know that this is only a first step. We have been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic, and we are exploring more changes," said spokesman Joe Osborne.

Osborne said that Facebook was actively studying its algorithms and noted that Facebook had supported a US House of Representatives resolution on the ethical development of artificial intelligence. He also pointed to previous changes to Facebook's ad targeting, including removing categories that ad buyers could use to discriminate and building a tool that lets users look up every housing ad in the system, regardless of which ones appear in their news feeds.

This study suggests that changing ad targeting options might not make ad delivery meaningfully neutral, and Rieke says that a separate ad database would not go far enough. "Without a doubt, it's good that people can finally go and look at all the housing ads," he says. "Even so, I think it matters whom Facebook actually chooses to boost those opportunities to."

Facebook has argued that Section 230 of the Communications Decency Act protects it from liability for advertising content. But one of the researchers' main arguments is that Facebook is defining these audiences on its own, and advertisers may have little say in how that is done. "We did not say that men love lumberjack jobs," says Rieke. "We strived to be very clear and neutral in the language of our test ads, and yet we saw these results. This is not a problem where advertisers just need to be more careful with the content of their ads."

So how could Facebook build a system that would avoid legal scrutiny? It could change its targeting system to actively counteract bias, or it could divert these listings into a separate system, such as the housing ad database Facebook has promised to build.

For now, we do not know whether this paper will affect HUD's lawsuit against Facebook. The agency declined to comment, citing restrictions on discussing active litigation, but if the case goes to trial, HUD could seek internal data that would support the paper's conclusions.

If a court determines that Facebook's ad delivery algorithm is discriminatory, ad networks across the web may have to change their practices. The researchers say that Facebook's "walled garden" made it particularly suitable for this experiment, but it is plausible that Google or any other ad platform could show the same biases. "We have not yet measured other platforms," says co-author Piotr Sapiezynski. "But we do suspect that platforms that try to reach what they define as 'relevant' audiences could find themselves in this situation."
