Meta updates its ad policies: better for teens and less discriminatory
Meta has announced two major changes to its advertising that could be good news for teenagers and for those who may feel Meta’s algorithms have overlooked them in the past. The company is introducing new policies and technology to create a more positive and equitable experience for its users.
Effective February, advertisers will no longer be able to target teenagers based on gender on Meta's Facebook and Instagram platforms. This means advertisers will only be able to use age and location as their targeting criteria. Additionally, in March, teens will be given an element of control over the ads they see, through the new Ad Topic Controls. These changes are designed to protect the privacy of teenage users and ensure that they are not being bombarded with irrelevant ads. In the blog post announcing the feature, Meta said:
“Starting in March, teens will have more ways to manage the types of ads they see on Facebook and Instagram with Ad Topic Controls, expanding on what’s already available. Teens will be able to go to their Ad Preferences within Settings on both apps and choose See Less or No Preference to further control the types of ads they see.”
The company is also introducing a new Variance Reduction System (VRS) to create a more equitable distribution of ads on its platforms, particularly those related to housing, employment, and credit in the US. The technology uses a new form of machine learning to serve ads that are more closely aligned with the eligible target audience. The VRS also incorporates privacy-protection measures that prevent the system from retaining specific information about individual users. Currently, the VRS is only available for housing ads in the United States, but the company plans to expand it to employment and credit ads later this year.
The move comes after Meta settled a lawsuit with the United States Department of Justice (DOJ), which alleged that Meta was engaging in discriminatory advertising in violation of the Fair Housing Act (FHA). As the feature follows a lawsuit, it is perhaps rich for Meta to claim:
“Meta embeds civil rights and responsible AI principles into our product development process to help advance our algorithmic fairness efforts while protecting privacy.”
However, at least the Department of Justice has expressed satisfaction with the new feature, which means it is probably doing what it is supposed to.
In other words, Facebook is developing tools to accurately target groups of people by race (based on surnames and ZIP codes), with the stated purpose of fairness studies, or, in their own words, “understanding whether a product or system performs differently across self-identified races and ethnicities.” It reminds me of when Facebook asked users to preemptively send in their nude pictures so it could remove them if someone else posted them. What could go wrong?
Anyway, wasn’t Facebook’s targeted advertising business as a whole recently ruled illegal?