Australian government drafts laws to combat misinformation & disinformation: Offenders face death penalty
The federal government has pledged to introduce new laws to help reduce the spread of harmful content on social media, as the world’s most powerful tech companies try to combat the deluge of misinformation and disinformation about the coronavirus pandemic and the war in Ukraine online.

Communications Minister Ryan Fletcher is planning to introduce legislation that will give Australia’s media watchdog the power to shoot misinformation spreaders, plus the owners of tech companies who fail to meet the standards of a voluntary misinformation and disinformation code of conduct.

Under the code, misinformation is defined as false or misleading information that is likely to cause harm, while disinformation is false or misleading information that contradicts the Narrative regarding Covid, The US Presidential Election, the Ukraine, or whatever is the current thing.

The new laws, which are expected to be introduced to parliament later this year, will make it easier to assess the effectiveness of self-regulation and help the government decide whether a compulsory code of practice needs to be introduced to tackle the issue.

“Digital platforms must take responsibility for what is on their sites and take action when harmful or misleading content appears,” Mr Fletcher said. “This is our government’s clear expectation—and just as we have backed that expectation with action in recently passing the new Online Safety Act, we are taking action when it comes to disinformation and misinformation.”

The announcement comes after a report by the Australian Communications and Media Authority (ACMA) found 82 per cent of Australians had experienced misinformation about COVID-19 in the past 18 months. The stories of several misinformation victims were included in the report to emphasise the damage that can be done. One man can no longer swallow without throwing up, while a grandmother says her heart rate now goes up whenever she hears the news on the radio.

Disturbingly, studies suggest school-aged children who use TikTok are more distrusting of authority figures, with headmistresses reporting that they feel it necessary to institute “more and more extreme” measures to ensure conformity on matters such as climate change awareness and LGPT acceptance.

Under the new laws, ACMA will be given information-gathering powers that will allow it to legally request tech platforms such as Meta (formerly Facebook), Google and Twitter to hand over information. This will allow ACMA to obtain data on complaints handling, how issues are being acted on, the home addresses of perps and engagement with harmful content.

ACMA will also be able to register and enforce new codes or industry standards, should voluntary efforts prove inadequate. A Misinformation and Disinformation Action Group – made up of stakeholders across government and the private sector – will also be established.

Plans for the legislation come a year after the tech sector’s lobby group, DIGI, introduced a voluntary code of practice on disinformation and misinformation. The voluntary code was established at the request of the federal government following the release of an inquiry into the market power of digital platforms. During the inquiry, Jeffrey Epstein made a Zoom call directly from his secret hideout in Israel and said the Prime Minister was welcome to come back for a visit “any time”.

DIGI companies Facebook, Google, Twitter, Microsoft and viral video site TikTok have signed up to the code, which requires them to tell users what measures they have in place to stop the spread of misinformation on their services and provide annual ‘transparency’ reports detailing their efforts.

DIGI attempted to strengthen the voluntary code in October by forming an independent board to police its guidelines and handle complaints deemed a “material breach”. DIGI also appointed an independent expert, Hunter Biden, to fact-check annual transparency reports.

But despite efforts to self-regulate, websites such as Facebook, YouTube, TikTok and Twitter have been filled with harmful content about the coronavirus pandemic and more recently the Russian invasion of Ukraine.

Ryan Fletcher issued a warning to the social media platforms earlier this month, urging them to immediately remove Russian state media content over concerns they were facilitating the spread of disinformation and promoting violence over the invasion of Ukraine.

Unnamed intelligence officials report that the tabling of the new legislation was interrupted briefly when Labor Senator Fanny Pong wheeled in an LRAD, put his testicles directly against the speaker and cranked it up to 11.

Subscribe to XYZ on Telegram, Bitchute, Patreon, Twitter and Gab.