Nearly seven months have passed since Meta agreed to settle a Department of Justice lawsuit accusing the company of illegally allowing discrimination against its users based on race and other characteristics in its housing advertising system. Now the company says it’s finally ready to release a new machine learning technology it claims will distribute advertisements in more equitable ways and reduce algorithmic discrimination. The new technology, which Meta calls its Variance Reduction System (VRS), will start off with housing ads, but is expected to expand and apply to U.S. employment and credit ads by the end of 2023.
Meta says VRS will ensure that the audience that actually sees an ad more closely matches the eligible target audience for that ad. VRS uses a measurement method called Bayesian Improved Surname Geocoding to estimate the aggregate age, gender, and estimated race or ethnicity distribution of the users who have seen the ad. That aggregate demographic information, informed by U.S. Census statistics, is then compared against the demographic distribution of the target audience selected by the advertiser. Those changes, according to the DOJ, should “substantially reduce the variances between the eligible and actual audiences along sex and estimated race/ethnicity in the delivery of housing advertisements.”
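The pipeline described above — estimate each viewer's likely demographics with Bayesian Improved Surname Geocoding, aggregate those estimates, then measure the gap against the advertiser's eligible audience — can be sketched roughly as follows. The group names, probabilities, and the choice of total variation distance as the gap metric are illustrative assumptions for this sketch; Meta has not published the internals of VRS.

```python
# Toy sketch of a BISG-style estimate and an audience-gap measurement.
# All groups, probabilities, and viewer data below are invented.

def bisg_posterior(p_race_given_surname, p_geo_given_race):
    """Bayes' rule: P(race | surname, geo) ∝ P(race | surname) * P(geo | race)."""
    unnorm = {r: p_race_given_surname[r] * p_geo_given_race[r]
              for r in p_race_given_surname}
    total = sum(unnorm.values())
    return {r: v / total for r, v in unnorm.items()}

# Hypothetical inputs: surname-based priors and, for one geography,
# the share of each group's population living there (census-derived).
p_race_given_surname = {"group_a": 0.70, "group_b": 0.20, "group_c": 0.10}
p_geo_given_race = {"group_a": 0.02, "group_b": 0.05, "group_c": 0.03}

posterior = bisg_posterior(p_race_given_surname, p_geo_given_race)

# Aggregate per-viewer posteriors into an estimated actual-audience
# distribution (here just two hypothetical viewers).
viewers = [posterior, {"group_a": 0.50, "group_b": 0.30, "group_c": 0.20}]
actual = {r: sum(v[r] for v in viewers) / len(viewers) for r in posterior}

# Advertiser-selected eligible-audience distribution (hypothetical).
eligible = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}

# One possible gap metric: total variation distance (half the L1 gap).
# A system like VRS would adjust delivery to drive this toward zero.
gap = 0.5 * sum(abs(actual[r] - eligible[r]) for r in eligible)
```

Note that BISG never assigns an individual a race; it produces a probability distribution per person, and only the aggregate across many viewers is compared against the eligible audience.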
“This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms,” DOJ Civil Rights Division Assistant Attorney General Kristen Clarke said in a statement. “The Justice Department will continue to hold Meta accountable by ensuring the Variance Reduction System addresses and eliminates discriminatory delivery of advertisements on its platforms.”
In its lawsuit, the DOJ claimed Meta violated the Fair Housing Act by encouraging advertisers to target ad recipients based on characteristics like race, religion, and sex. The complaint alleged Meta’s previous “Special Ad Audience” advertising tool introduced bias when delivering the ads. Additionally, the DOJ said Meta’s system fed FHA-protected characteristics data into its delivery system and used that data to predict which housing ads were most relevant to users.
Meta ultimately paid a $115,054 civil penalty as part of the settlement and agreed to cease its use of the Special Ad Audience tool. The company also agreed to replace that tool with a new system that ultimately became VRS, though it never admitted wrongdoing. Guidehouse, a third-party reviewer, will now monitor Meta on an ongoing basis to make sure VRS is meeting the compliance metrics.
“Across the industry, approaches to algorithmic fairness are still evolving, particularly as it relates to digital advertising,” Meta said in a blog post. “But we know we cannot wait for consensus to make progress in addressing important concerns about the potential for discrimination—especially when it comes to housing, employment, and credit ads, where the enduring effects of historically unequal treatment still have the tendency to shape economic opportunities.”
The DOJ, on the other hand, said the Meta settlement and subsequent development of the VRS alternative should serve as a warning sign to other tech companies with their own dubious algorithms.
“Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws,” Clarke said.