Professors Mark Andrejevic, Daniel Angus and Jean Burgess are researchers at the ARC Centre of Excellence for Automated Decision-Making and Society and lead the Australian Ad Observatory project.
The online economy, dominated by platforms such as Google, Facebook and Instagram, relies heavily on revenue generated from users’ personal data. Advertisements that track us across the Internet are targeted to us based on detailed information about our identities, interests, relationships and online activities.
Targeted advertising can sometimes be useful and convenient, such as when you’re looking for the perfect bedside lamp and Facebook shows you ads from lighting companies offering similar lamps. But it can also be irrelevant and boring. And it can have serious consequences: the same systems that target you as a potential buyer of a new bedside lamp could also persuade you to buy a fraudulent health product or to vote for a particular political party.
Serious harms can result – for example, online platforms have allowed job postings to be targeted by race or gender, in violation of anti-discrimination laws. In 2019, Facebook paid more than $6 million to settle a US lawsuit concerning precisely this type of discrimination – a slap on the wrist for a company whose annual revenue at the time amounted to more than $100 billion.
How much does Facebook regulate advertising on its platform?
Like all major platforms, Facebook hosts so much content that it must rely on automated systems to ensure ads follow its rules. But these measures do not appear to reliably prevent harmful and unethical advertising practices – as the world learned in 2016 during the US presidential election and the UK referendum on leaving the European Union (Brexit), when advertisers used personal data to create advertisements targeting specific groups and individuals with the aim of manipulating them and influencing the outcome.
Researchers from the internet accountability group Reset Australia recently tested Facebook’s anti-misinformation measures by submitting fictitious political ads, which Facebook approved. Designed to run ahead of the upcoming federal election, the ads included official-looking notifications falsely stating that people who weren’t vaccinated would not be allowed to vote, and that the federal election had been cancelled because of the COVID-19 pandemic. Reset withdrew the ads before they ran. Had they been published, the ads would likely have been flagged and taken down eventually – but only after potentially misleading or confusing large numbers of people.
Platforms keep us in the dark
When advertisements appear in newspapers, on billboards or on television, people can see what kind of messages are circulating and who else can receive them. This makes public accountability for advertising practices relatively straightforward.
But online, the situation is radically different: ads can target individuals on their personal devices for reasons those individuals may not fully understand. Someone who receives a job ad or a real estate ad, for example, has almost no way of knowing who else is seeing the same advertisement – and therefore whether they are being targeted based on their gender, age, ethnicity or other personal characteristics.
Targeted online ads are equally opaque to the groups we have traditionally relied on to hold advertisers to account, such as journalists, advocacy groups and regulators. This lack of observability is a major obstacle to accountability in online advertising: as a society, we need to find new ways to see and detect potentially harmful patterns in the way consumers and citizens are profiled, screened and targeted by platforms and advertisers.
In response to public pressure and legislation, platforms such as Facebook have started creating their own “ad transparency dashboards”. But from a public oversight perspective, these dashboards are difficult to use and very limited. They aggregate or summarise key information and remove most historical data. They also obscure much of the detailed data needed to identify reach and targeting patterns that could indicate discriminatory or predatory advertising. And there is no independent verification of the accuracy of the data they present.
How can we achieve better public control of online advertising?
People have the right to know how their data is used by platforms and advertisers to target them. As a society, we must ensure this data is not used to enable unfair discrimination or predatory business practices. So far, platforms and advertisers have not gone nearly far enough with self-regulation, and have aggressively resisted pushes for greater accountability.
In response, the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) created the Australian Ad Observatory. Inspired by a similar project in the United States, the Ad Observatory invites Australians to share the advertisements they encounter on Facebook with researchers. Anyone can participate by installing a browser extension that automatically collects the ads appearing in their Facebook News Feed – and nothing else. The extension works only on laptop and desktop computers, and can be removed or disabled at any time. If you install it, you will be able to see all the advertisements it collects while it is running.
We know it will be difficult for an independent project like this to build a complete picture of how people are targeted by online advertising. But we hope it will help build public support for greater transparency and accountability, including legal obligations for companies to provide detailed information about the ads they serve and how they are targeted. As a society, we need these measures to ensure targeted advertising is not used in predatory or discriminatory ways.
If you would like to find out more about the Australian Ad Observatory or would like to take part, please visit our website.
Disclaimer: Professor Jean Burgess has previously engaged with Facebook as an external academic advisor on policy issues.