I'm organising an ECR workshop this Friday with an incredible lineup of scholars. It's called 'A field guide to infrastructural analysis'. We only have space for 20, so there is a short application form. If you're in Edinburgh this week, please consider joining us!
www.law.ed.ac.uk/news-events/...
Posts by Nadia Jude
Happy to see this out in the world! Please read if you're thinking critically about the fast adoption of content moderation systems like Community Notes, which combine crowdsourcing with bridging-based algorithmic ranking. These systems are being rolled out by X, YouTube, Meta and now TikTok.
@olgarithmic.bsky.social and I just published an article in The Conversation, talking about global online resistance to russia's war on Ukraine (including #nafo) and how it connects to the complexities of social media platform moderation. Read it here: theconversation.com/three-years-...
"These scholars... all highlight the centrality of popular discontent with neoliberalism—whether as a project of governance, a type of rationality, or a set of economic policies—as key to understanding the resurgence of an authoritarian, nationalist, and anti-globalist Right"
X, YouTube, and now Meta are capitalising on these successes, borrowing their elements and co-opting the language of ‘empowerment’ and ‘democracy’ to reduce moderation and market their techno-solutionist products as beneficial for the world. Research like ours is a useful reminder of the importance of critically interrogating platform content moderation systems: paying attention to these systems’ design, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms. Despite large tech companies' efforts to convince us otherwise, there is an urgent need to think beyond technology to address the societal challenges associated with the disinformation problem. Media system reform, market-shaping approaches, and “big tent” civil society coalitions led by the Global Majority would be a fruitful start.
We stress the importance of critically interrogating the design of content moderation systems, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms.
Our @techpolicypress.bsky.social article is based on our recent paper examining Community Notes through the lens of humour. We find that CN enacts a narrow conception of disinformation that is ill-equipped to address hate and harm. eprints.qut.edu.au/254907/1/Mat...
Thank you Tom ☺️ We felt our research really spoke to the global tech justice piece you wrote with @joncong.bsky.social on the impact of Meta's decisions for marginalised communities + regions grappling with deep-seated conflicts. CN is not designed to address harm + its ethos will likely sow harm.
Went to a great talk on innovation nationalism and the colonial dynamics of drone testing in Australia last night. Thanks so much @thaophan.bsky.social! The documentary below, interviewing people of Logan, is worth a watch. Logan is approx. 1h from my home.
www.careful.industries/ai-in-the-st...
I loved working with @ariadnamf.bsky.social on this. We conceptualise Community Notes as a "data infrastructure for soft moderation". CN has an operational logic that inscribes true/false and real/fake binaries, and it fails to address false narratives that mobilise the ambiguities within humour to harm.
No problem at all, Nicole! It’s what this place is for ☺️ I hope you find them useful!
“Worryingly, critical researchers and specialist organizations are redirected by donors away from healing initiatives to conflict frames. 💔”
citap.unc.edu/wp-content/u...
Hi Nicole, I'm writing my PhD on shifting conceptions of disinformation within Australian policy discourses. I'll link to some readings I've found particularly useful below (have many more, if this is the kind of work you're looking for) ☺️
We made a starter pack of researchers affiliated with the QUT Digital Media Research Centre