
Posts by Nadia Jude

A field guide to infrastructural analysis
An early career workshop with Julie Cohen, Gavin Sullivan and Morgan Currie | Edinburgh Law School

I'm organising an ECR workshop this Friday with an incredible lineup of scholars. It's called 'A field guide to infrastructural analysis'. We only have space for 20, so there is a short application form. If you're in Edinburgh this week, please consider joining us!
www.law.ed.ac.uk/news-events/...

6 months ago 1 0 0 0

Happy to see this out in the world! Please read if you're thinking critically about the fast adoption of content moderation systems like Community Notes, which combine crowdsourcing with bridging-based algorithmic ranking. These systems are being rolled out by X, YouTube, Meta and now TikTok.

1 year ago 9 1 0 0
Three years after Russia’s invasion, a global online army is still fighting for Ukraine
The North Atlantic Fella Organisation has devised creative ways to support the Ukrainian cause – while navigating the complexities of content moderation.

@olgarithmic.bsky.social and I just published an article in The Conversation, talking about global online resistance to russia's war on Ukraine (including #nafo) and how it connects to the complexities of social media platform moderation. Read it here: theconversation.com/three-years-...

1 year ago 16 6 0 0

"These scholars... all highlight the centrality of popular discontent with neoliberalism—whether as a project of governance, a type of rationality, or a set of economic policies—as key to understanding the resurgence of an authoritarian, nationalist, and anti-globalist Right"

1 year ago 2 1 0 0
X, YouTube, and now Meta are taking advantage of these successes, borrowing elements and co-opting the language of ‘empowerment’ and ‘democracy’ to reduce moderation and market their techno-solutionist products as beneficial for the world. Research like ours is a useful reminder of the importance of critically interrogating platform content moderation systems: paying attention to these systems’ design, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms. There is an urgent need to think beyond technology to address the societal challenges often associated with the disinformation problem, despite large tech companies' efforts to convince us otherwise. Media system reform, market-shaping approaches, and “big tent” civil society coalitions led by the Global Majority would be a fruitful start.

We stress the importance of critically interrogating the design of content moderation systems, the problems they are oriented to solve, their contexts of use, and their risk of directly supporting and entrenching online harms.

1 year ago 1 0 0 0

Our @techpolicypress.bsky.social article is based on our recent paper examining Community Notes through the lens of humour. We find that CN enacts a narrow conception of disinformation that is ill-equipped to address hate and harm. eprints.qut.edu.au/254907/1/Mat...

1 year ago 19 10 1 0

Thank you Tom ☺️ We felt our research really spoke to the global tech justice piece you wrote with @joncong.bsky.social on the impact of Meta's decisions for marginalised communities + regions grappling with deep-seated conflicts. CN is not designed to address harm + its ethos will likely sow harm.

1 year ago 1 1 0 0
Observatory - Logan — Careful Industries

Went to a great talk on innovation nationalism and the colonial dynamics of drone testing in Australia last night – thanks so much @thaophan.bsky.social. The documentary below, which interviews residents of Logan, is worth a watch. Logan is approx. 1h from my home.
www.careful.industries/ai-in-the-st...

1 year ago 4 0 1 0

I loved working with @ariadnamf.bsky.social on this. We conceptualise Community Notes as a "data infrastructure for soft moderation". CN has an operational logic that inscribes true-false, real-fake binaries, and fails to address false narratives that mobilise the ambiguities within humour to harm.

1 year ago 25 8 0 0

No problem at all, Nicole! It’s what this place is for ☺️ I hope you find them useful!

1 year ago 1 0 1 0

“Worryingly, critical researchers and specialist organizations are redirected by donors away from healing initiatives to conflict frames. 💔”

1 year ago 6 2 0 0

citap.unc.edu/wp-content/u...

1 year ago 1 0 0 0
Critical disinformation studies: History, power, and politics | HKS Misinformation Review
This essay advocates a critical approach to disinformation research that is grounded in history, culture, and politics, and centers questions of power and inequality. In the United States, identity, p...

misinforeview.hks.harvard.edu/article/crit...

1 year ago 1 0 1 0
Fake News is Not a Virus: On Platforms and Their Effects
Abstract. This article attempts to uncover the intellectual, economic, and methodological structures that have led to the recent emergence of a particular

academic.oup.com/ct/article-a...

1 year ago 1 0 1 0
Disinformation and the structural transformations of the public arena: addressing the actual challenges to democracy - ORA - Oxford University Research Archive
Current debate is dominated by fears of the threats of digital technology for democracy. One typical example is the perceived threat of malicious actors promoting disinformation through digital channels to sow confusion and exacerbate political divisions. The prominence of the threat of digital

ora.ox.ac.uk/objects/uuid...

1 year ago 1 0 2 0

Hi Nicole, I'm writing my PhD on shifting conceptions of disinformation within Australian policy discourses. I'll link to some readings I've found particularly useful below (have many more, if this is the kind of work you're looking for) ☺️

1 year ago 1 0 1 0

We made a starter pack of researchers affiliated with the QUT Digital Media Research Centre

1 year ago 54 30 3 7