Strategies for protecting Australian voters against digital disinformation campaigns

By Martha Knox-Haly  

From April to October 2023, Australian voters were subjected to an unprecedented social media disinformation campaign around the Voice to Parliament referendum. The Voice had been supported by Scott Morrison in 2017, and it had enjoyed the steady support of a majority of Australian voters for several years. The proposal was modest. It did not extend to a treaty, and there was no reason to suspect the referendum would be controversial. Then in April 2023, the new Coalition leader, Peter Dutton, signalled that he would side with the West Australian Liberals and campaign against the Coalition’s previous policy.

This was the cue for the launch of the ‘No’ campaign in April 2023. The associated digital disinformation campaign was unparalleled in intensity, spread and sophistication. The ultimate victims of the No campaign were amongst the most impoverished and marginalised Australians. The trashing of Indigenous dreams by wealthy donors was reprehensible. To date, those donors have not apologised for the spike in Indigenous suicide rates that occurred from April 2023. The No campaign claimed to champion free speech, but how can speech be free when discourse is the product of online manipulation and deceit? It was not the first time political actors had pursued digital disinformation campaigns, but it was the first time these strategies had succeeded.

How can Australian voters be protected against digital disinformation and attacks on democracy? A robust regulatory framework requires coercive powers. It needs to be able to combat disinformation from the point of initiation and within echo chambers. The framework needs to empower social media users and the associated regulatory institutions. Above all, it needs to be agile enough to make a difference within the tight time frames that surround electoral activity.

The Albanese Government proposed amendments to the Broadcasting Services Act 1992, strengthening the powers of the Australian Communications and Media Authority. Despite the proposed amendment containing many of the elements the Coalition Government had taken to the 2022 election, the Coalition’s response was predictably histrionic. There were assertions that the bill would establish ‘a ministry of truth’ and was ‘a threat to democracy’. In reality, the amendment bill provided ACMA with a modest increase in the power to gather information and maintain records about a social media platform’s responses to disinformation. ACMA was not given powers to force content to be taken down expeditiously, and the bill did not cover media organisations.

Coercive powers around the removal of content were reserved for the eSafety Commissioner. The eSafety Commissioner is concerned with protecting the rights of adults and children who are subject to abuse, and its scope does not extend to ensuring the safety of democratic electoral systems. Under the Online Safety Act 2021, online providers are required to develop codes of conduct, and the eSafety Commissioner can pursue fines. Online providers are required to respond to the eSafety Commissioner’s questions and take down content. There is nothing about compliance audits of social media platforms, or about promoting algorithmic transparency and sovereignty, in either the Online Safety Act or the proposed ACMA amendments. These frameworks are complaints-focused, and not designed to bring about systemic reform of social media providers.

There is a proposal to introduce new laws based on recommendations from the Royal Commission into the Robodebt Scheme. These recommendations are that all federal government agencies be transparent in explaining how algorithms and AI affect their decision-making processes. Unfortunately, the recommendations do not extend to online service providers.

The regulatory gap is a problem. The onus is on social media giants to be responsive to requests to remove offensive content, and the platform owner’s personality can influence that responsiveness. For example, former Twitter CEO Jack Dorsey was a programmer by background and proactively managed risks to elections. Regrettably, Elon Musk, the current owner of X, dismantled users’ capacity to flag political disinformation during the referendum campaign.

The ACMA amendments permit the regulator to raise concerns with a platform and investigate the platform’s self-regulatory processes. If those processes are deemed inadequate, there are potential penalties and enforcement of a mandatory code of conduct. The emphasis is on giving the platform as many opportunities as possible to take mitigating action before sanctions are levied. The process is not in any sense fast-moving or agile. There is nothing in the legislation around algorithmic sovereignty or opt-outs from personalised recommendations, yet these are the very tools a platform user needs to start creating an information ecosystem where disinformation is weeded out.

The European Union’s Digital Services Act provides an example of how social media users can be given these tools. On 20 October 2023, the European Union adopted a delegated regulation under the Digital Services Act covering compliance audits for what are referred to as very large online platforms (VLOPs) and very large online search engines (VLOSEs). The delegated regulation specified the role of independent auditors, who are required to use templates for implementation reports. In August 2023, Articles 34-48 of the Digital Services Act came into effect, with a range of compliance provisions such as risk assessments, opt-outs from personalised recommendations, algorithmic transparency, and data access for researchers. A mandatory annual independent audit assesses compliance with these provisions, and the results form the basis of mandatory reports to the European Commission and the Digital Services Coordinator. One notable weakness in the delegated regulation is that auditors are paid by the companies they audit. The EU’s Digital Services Act is not an agile framework either. Importantly, none of the regulatory frameworks in Australia or the EU is particularly effective at combating the formation of echo chambers, which are the repositories of disinformation.

Only technological solutions have the capacity to combat the lightning spread of disinformation. Examples of agile technology that could be incorporated into policy frameworks include BotSlayer, a software program designed by researchers at Indiana University. BotSlayer detects coordinated disinformation campaigns conducted through bots. It is free software that can be used to monitor sudden, suspicious spikes in activity. Another technological solution is the random dynamical nudge.
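To make the idea concrete, the sketch below illustrates the general technique this kind of tool relies on: flagging time buckets where activity jumps far above its recent baseline. It is a minimal Python illustration of spike detection, not BotSlayer’s actual algorithm, and the hashtag counts, window size and threshold are invented for the example.

```python
import statistics
from collections import deque

def spike_alerts(counts, window=24, threshold=3.0):
    """Flag time buckets whose activity is anomalously high.

    counts: per-bucket event counts (e.g. hourly mentions of a
    hashtag). A bucket is flagged when it exceeds the trailing
    mean by `threshold` standard deviations.
    """
    history = deque(maxlen=window)
    alerts = []
    for t, count in enumerate(counts):
        if len(history) == window:
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero
            if (count - mean) / stdev > threshold:
                alerts.append(t)
        history.append(count)
    return alerts

# Hourly mentions of a hypothetical hashtag: flat, then a coordinated burst.
hourly = [5, 7, 6, 4, 8, 5, 6, 7] * 6 + [90, 120, 110]
print(spike_alerts(hourly))  # -> indices of the burst hours
```

In a real deployment the counts would come from a platform’s streaming interface, and flagged spikes would be triaged by analysts rather than treated as proof of coordination.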

Researchers Currin, Vallejo Vera and Khaledi-Nasab have explained that social media is built around the advertising culture of the ‘hyper-nudge’: a marketing technique that communicates identity-based messages calculated to generate consumer behaviour. This design feature is responsible for generating echo chambers. Currin and colleagues developed the concept of the random dynamical nudge, in which social media users are periodically presented with a random selection of other users’ opinions. Their research found that random dynamical nudges led to consensus formation rather than fragmentation of political discourse and the formation of echo chambers.
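A toy simulation conveys the intuition. The model below is not the researchers’ exact formulation; it is a sketch in which each agent drifts toward like-minded agents (the echo-chamber force) while also being nudged toward the average opinion of a small random sample of all users. All parameter names and values here are illustrative assumptions.

```python
import random

def step(opinions, k=10, nudge=0.1, pull=0.05):
    """One update of a toy opinion model with a random dynamical nudge.

    Each agent drifts toward the k agents closest to its own view
    (the echo-chamber force, `pull`), but is also nudged toward the
    average opinion of k agents sampled uniformly at random (`nudge`).
    """
    n = len(opinions)
    new = []
    for x in opinions:
        # Echo-chamber term: average of the most like-minded agents.
        likeminded = sorted(opinions, key=lambda y: abs(y - x))[:k]
        echo = sum(likeminded) / k
        # Random dynamical nudge: mean opinion of a random sample.
        sample = random.sample(range(n), k)
        rdn = sum(opinions[i] for i in sample) / k
        x = x + pull * (echo - x) + nudge * (rdn - x)
        new.append(max(-1.0, min(1.0, x)))  # clamp opinions to [-1, 1]
    return new

# Two polarised camps; with the nudge the spread shrinks over time.
ops = ([random.uniform(-1.0, -0.6) for _ in range(50)]
       + [random.uniform(0.6, 1.0) for _ in range(50)])
for _ in range(200):
    ops = step(ops)
print(round(min(ops), 2), round(max(ops), 2))  # opinions have converged
```

Without the `nudge` term the two camps simply reinforce themselves; the random sample exposes each agent to the wider distribution of views, which is the mechanism the researchers credit for depolarisation.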

Policy frameworks could mandate joint systematic monitoring by social media platforms, ACMA and the AEC using BotSlayer-style software, with compulsory auditing of the dismantling of disinformation campaigns and the deployment of random dynamical nudges around electoral promises. Regulating online platforms and providers is complex, but protecting democracy is worth the effort.
