Quotes
"Misinformation and disinformation is very dangerous, and we've seen it really kind of explode in the last few years," Chalmers told national broadcaster ABC.
"Independent fact-checkers are a vital safeguard against the spread of harmful misinformation and disinformation that threatens to undermine free democratic debate in Australia and aims to manipulate public opinion," said AAP chief executive Lisa Davies.
"Placing the onus on users is just delegating responsibility away and this is a structural issue."
"Especially as we come into an election cycle in Canada somewhere in the next several months, it's just going to be a nightmare. And there's no control in this situation."
"Yes, moderation is imperfect and yes, users across the political spectrum distrust it, but platform moderation policies reflect platform values."
"The people who you get to participate will be incredibly important."
"If there's a sportsball game and one team fouls four times as much, it's not 'biased' for the ref to call four times as many fouls against that team," Kate Starbird, a professor at the University of Washington and co-founder of its Center for an Informed Public, posted to Bluesky following the Meta announcement.
Zuckerberg repeatedly touted the company's fact-checking efforts and partners, including at a 2021 House Energy and Commerce committee hearing, where he seemingly called out Trump for inciting the January 6th attacks, saying, "I believe that the former president should be responsible for his words."
"This was after two years of Zuckerberg being hounded by the weaponization committee," said Nina Jankowicz, former head of a disinformation board within the Department of Homeland Security, who now helms a nonprofit organization focused on countering attacks on disinformation researchers. She was referring to House Judiciary Chairman Jim Jordan's Select Subcommittee on the Weaponization of the Federal Government, one of the latest tools used by Republicans to institutionalize their complaints around social media and political bias.
"The fact-checking program was never going to save Facebook, but it was the last bulwark to complete chaos on the platform," Jankowicz said.
"The EU will remain uncomfortable for social media giants by standing up for the integrity and independence of free expression and democratic processes. Europe will never accept manipulation and disinformation as a standard for society. By abandoning factchecking in the US, Meta is making a profound strategic and ethical mistake," said Valérie Hayer, an MEP and the leader of the centrist Renew Europe grouping in the European parliament.
"Meta needs to set out how the risks of harm to children in the UK is not being increased by the removal of factcheckers in America and its changes in content policies," said Rani Govender, the NSPCC's regulatory policy manager for child safety online.
"They will be increasingly exposed to all the content categories that they need to be protected against," said Arturo Béjar, a former senior engineer whose responsibilities at Meta included child safety measures.
"I am extremely concerned about what this means for teenagers," a Meta whistleblower told the Guardian.
The changes to Meta's global policies on hateful content now include allowing users to call transgender people "it", with the guidelines stating: "We do allow allegations of mental illness or abnormality when based on gender or sexual orientation."
"We must stand firm against such proposals, which would remove any chance we have to hold tech executives to account and require them to enforce the safety standards on their platforms that are set in our laws."
"To hear that Meta is removing all its factcheckers [in the US] is concerning … people have a right to be protected from the harmful effects of misinformation."
"Although research supports the idea that crowdsourcing fact-checking can be effective when done correctly, it is important to understand that this is intended to supplement fact-checking from professionals, not to replace it," said Gordon Pennycook, from Cornell University.
"If he's all-in on the Musk playbook, the next step will be slashing yet more of his content moderator numbers," including those that take down violent content and hate speech. "Community Notes users are very much motivated by partisan motives and tend to over-target their political opponents," added Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech.