In a settlement with the Justice Department announced on Tuesday, Meta agreed to change its ad technology and pay a penalty of $115,054 to resolve allegations that the company’s advertising platforms had discriminated against Facebook users by restricting who could see housing advertisements on the platform based on characteristics such as race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and deploy a new computer-assisted method that regularly checks whether the people who are targeted and eligible to receive housing advertisements are, in fact, seeing those ads. The new approach, called a “variance reduction system,” relies on machine learning to verify that advertisers are delivering housing-related ads to the protected groups of people they are meant to reach.
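Meta has not published the internals of the variance reduction system, but the description above implies a periodic comparison between the demographic make-up of an ad’s eligible audience and the make-up of the users who actually saw the ad. The following is a minimal illustrative sketch of that comparison; the group names, campaign numbers and tolerance threshold are all invented for illustration, and the real system’s measurements and targets are not public.

```python
# Hypothetical sketch of the kind of check a "variance reduction system"
# might run: compare the demographic shares of an ad's eligible audience
# with the shares of the audience the ad was actually delivered to, and
# flag the campaign when the gap exceeds a tolerance. All names and data
# here are invented; Meta's actual implementation is not public.

def shares(counts: dict[str, int]) -> dict[str, float]:
    """Convert raw per-group counts into fractional shares of the total."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def max_share_gap(eligible: dict[str, int], delivered: dict[str, int]) -> float:
    """Largest absolute difference in any group's share between the
    eligible audience and the audience that actually saw the ad."""
    e, d = shares(eligible), shares(delivered)
    return max(abs(e[g] - d.get(g, 0.0)) for g in e)

# Hypothetical campaign: who was eligible to see the ad vs. who saw it.
eligible_audience = {"group_a": 6_000, "group_b": 3_000, "group_c": 1_000}
delivered_audience = {"group_a": 7_600, "group_b": 1_900, "group_c": 500}

TOLERANCE = 0.05  # assumed threshold; the settlement's actual targets differ
gap = max_share_gap(eligible_audience, delivered_audience)
print(f"largest share gap: {gap:.3f}")
if gap > TOLERANCE:
    print("variance too high: ad delivery would need to be adjusted")
```

A production system would presumably adjust delivery continuously rather than merely flag a gap after the fact, but the quantity being checked is this kind of share comparison between the eligible audience and the audience actually reached.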
Damian Williams, the United States attorney for the Southern District of New York, said in a statement that “Meta will for the first time adjust its ad distribution system to address algorithmic discrimination.” He added that “our office will continue with the case if Meta fails to show that it has properly modified its distribution system to protect against algorithmic bias.”
Facebook has faced criticism for years that some of its business practices are unfair and discriminatory, ever since it began collecting data on its users and letting advertisers target ads based on the characteristics of an audience. The company’s ad systems have let marketers choose who saw their ads from thousands of different characteristics, which has also allowed them to exclude people who fall into protected categories such as race, gender and age.
The Justice Department filed both its lawsuit against Meta and the settlement agreement on Tuesday. In the suit, the agency said it had concluded that “Facebook could fulfil its goals in growing its income and offering relevant advertising to users via less discriminating ways.”
Although the settlement applies only to housing ads, Meta said it also plans to use its new system to monitor the targeting of ads related to employment and credit. The company has previously been criticised for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
The issue of biased ad targeting has been especially contentious in housing ads. A 2016 ProPublica investigation revealed the potential for ad discrimination on Facebook, showing that the company’s technology made it simple for marketers to exclude certain ethnic groups from seeing their campaigns.
In 2018, Ben Carson, then the secretary of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. In 2019, the Department of Housing and Urban Development (HUD) sued Facebook for engaging in housing discrimination and violating the Fair Housing Act.
HUD’s 2019 investigation and discrimination charge against Facebook formed part of the basis for the Justice Department’s lawsuit and settlement.
According to the complaint, housing ads in neighbourhoods where the majority of residents were white were shown primarily to white users, while housing ads in neighbourhoods where the majority of residents were Black were shown primarily to Black users.
In recent years, civil rights groups have also pushed back against the vast and complex ad systems that underpin some of the largest online platforms, arguing that those systems have inherent biases and that tech companies such as Meta, Google and others should do more to counter them.
The study of “algorithmic fairness” has drawn significant interest from computer scientists in the field of artificial intelligence. For years, prominent researchers, including Margaret Mitchell and Timnit Gebru, both of whom formerly worked at Google, have sounded the alarm about the dangers of such biases.
In the years since, Facebook has sharply reduced the number of categories that marketers can choose from when buying housing ads, cutting the options to the hundreds and removing the ability to target users based on factors such as race, age and ZIP code.