2025 was a huge year for us at KGI! Can't wait to see what 2026 holds. :)
Posts by Leticia Bode
I think you might be trying to contact Letitia James.
Are social media feeds making us more polarized – or is the story more complicated? Join us this Thursday (10-11:30am ET) at Georgetown's Tech & Society Week 2026 as we unpack what the evidence actually says about algorithms & polarization. In person + livestream 👉 kgi.georgetown.edu/events/desig...
We’re hiring! KGI is seeking an Operations Associate to provide operational support for budgeting, people processes, events, and internal + external comms. This position is hybrid and must be based in Washington, D.C. Apply by April 6, 2026: kgi.georgetown.edu/operations-a...
Join us next Friday, March 20, for a presentation by @leticiabode.bsky.social and @pfchap.bsky.social on "Better Access: A Framework for Accessing High-Influence Public Platform Data". Registration required via events.gwdg.de/event/1260/
How do social media algorithms shape social and political polarization? Join KGI at Georgetown's Tech & Society Week for “Designing for Democracy: Social Media Feeds in a Hyper-Polarized World.”
📆 Mar 26, 10-11:30AM ET
📍Georgetown Capitol Campus + livestream
🎟️ kgi.georgetown.edu/events/desig...
My latest collaboration with students! Studying how promoted posts frame the use of alternatives to hormonal birth control. Framing Fertility: The Use of Rhetorical Appeals in Natural Cycles’ Sponsored Content on Instagram. Free eprints: www.tandfonline.com/doi/full/10....
I highly recommend reading the full report, which is eye-opening to say the least. kgi.georgetown.edu/research-and...
TikTok’s screentime management tools were subjected to A/B testing, which found that the actual impact on minors’ overall use was “about 10 minutes on weekdays and 15 minutes on weekends.” (13/13)
Adoption rates reported by TikTok show that from January 2024 to January 2025 the adoption rate for screentime breaks was approximately 1.5% of users, and for sleep reminders between 0.7% and 1.8% of users. (12/13)
One internal TikTok document suggesting that “minors do not have executive function to [voluntarily] control their screen time.” (11/13)
3. Platforms know the mitigations they employ to reduce these harms don’t work: (10/13)
TikTok’s Minor Safety Strategy Paper references studies that suggest 6 or more hours a day spent on TikTok heightens depression risk. (9/13)
TikTok staff discussed external research finding that “minors who spend more than three hours a day on social media double their risk of ‘poor mental health outcomes, including symptoms of depression and anxiety.’” (8/13)
2. Platforms know that this addictive design has problematic associations: (7/13)
Internal documents state that the platform can “get people into flow – the psychological state of extreme engagement, loss of sense of time and even loss of self” including through “the variability of rewards (which is what makes the app so addictive).” (6/13)
TikTok's own records describe how it uses “powerful coercive design tactics” to extend use. (5/13)
TikTok’s Digital Wellbeing Product strategy describes compulsive usage as “rampant” on the platform. (4/13)
I'll give TikTok as an example for this thread, but the report also looks at Meta. Here's what we know:
1. Platforms know their designs are addictive: (3/13)
The report leverages two sources of data: 1) DSA-mandated risk assessments, written by the platforms, and 2) internal platform documents that have emerged from US litigation against platforms. (2/13)
This new report from @knightgtown.bsky.social @pfchap.bsky.social @matt-steinberg.bsky.social shows the substantial gap between platform risk assessments and their actual approaches to mitigating risk. (1/13)
Lots of people are talking about age assurance these days, but few about how it actually works: what core tradeoffs are built into the technology, in terms of baseline accuracy, circumvention resistance, availability, and privacy. The new KGI report digs into these weedy issues. Check it out!
Registration for the Digital Competition Conference 2026 is now open to the public!📍Feb 5-6, 2026 | Washington, DC in person + livestream
DCC is where cutting-edge research meets policy on antitrust & digital competition. 🎟️ Secure your free ticket here: kgi.georgetown.edu/events/digit...
🚨Why does access to public platform data matter? Join our webinar "Better Access: Data for the Common Good" (Jan 28, 2026, 11am-12pm ET) for a discussion on the Better Access framework, current regulatory shifts in the EU, UK + US, and what changes 2026 might hold. kgi.georgetown.edu/events/bette...
Love this practice ranked choice voting ballot from DC board of elections, ranking your favorite singers. (Even better would be if it validated at the end whether you did it right or not.)
vr.dcboe.org/253434754272...
Please share with all the awesome applicants you know!
We recently concluded a special article series, “Seeing the Digital Sphere: The Case for Public Platform Data,” in collaboration with the Knight-Georgetown Institute, in which experts explored why access to public platform data is critical. Here’s a snapshot: (1/9)
Despite new EU transparency laws, researchers are still blocked from vital social media data, while corporations buy the same information freely, argues Brandon Silverman. The result: a two-tier system that shields platforms from scrutiny and weakens democratic oversight.
Think platform data from social media only matters for election or disinformation research? Think again. George Pearson, a researcher at the Truth Initiative, shows it’s vital for public health—revealing how tobacco companies can target youth, dodge regulations, and influence policy.
📣Announcing our new series with @techpolicypress.bsky.social "Seeing the Digital Sphere: The Case for Public Platform Data". Should we be able to understand the risks kids face online? How businesses target consumers? How politicians communicate? These qs depend on access to public platform data. 👇