Google Gemini now lets you import ChatGPT history.
Anthropic launched the same for Claude last month.
Pattern: Easy to import IN. But what about exporting OUT?
True portability works both ways.
Posts by FIDU
Microsoft changed GitHub Copilot terms to use your chat history for AI training.
Opt-out only. No way to say "use this but not that."
Data unions flip this: you control what's used, researchers get anonymised access.
Join us in building it.
www.howtogeek.com/githubs-copi...
🚨 Meta and YouTube guilty of addicting teens.
Evidence: internal analysis titled "The Young Ones are the Best Ones" directed Meta to prioritize tweens for retention.
Data unions could curate feeds in your interest, not to maximize lifetime value.
Infrastructure FIDU's building toward.
⚠️ Instagram is ending encrypted messaging on May 8, 2026.
Now: "Not even Meta can read your messages" 🔒
Soon: that protection is gone.
Users are told to download chats before the change.
Why portability matters: when platforms change rules, you should be able to leave.
Data Transfer Initiative: AI portability at a turning point.
DTI warns we risk the Hotel California strategy for AI: easy check-in, impossible checkout.
FIDU ChatLab: you can check out and actually leave with your data.
dtinit.org/blog/2026/03...
🎥 The Norwegian Consumer Council just released this video to support their "Breaking Free" report.
The report shows how digital products keep getting worse, and how it's possible to turn the tide.
FIDU's building infrastructure for that fair digital future.
www.forbrukerradet.no/breakingfree
💰 Lloyds wants to be "the UK's biggest fintech" by selling your data.
You thought banking was about deposits and services. Turns out, you're the product.
At FIDU, we believe your data should work for you, not just your bank.
🔗 www.ft.com/content/f32c...
🚨 Anthropic: "Switch to Claude without starting over"
Import your ChatGPT context in under a minute. But no mention of exporting it back out.
🚨 Hotel California for AI.
FIDU ChatLab: same continuous context, true two-way portability.
🔗 chatlab.firstdataunion.org/fidu-chat-lab/
Should India treat its data as a strategic asset or give it to Silicon Valley for free?
@thorbecke.bsky.social asks this in Bloomberg. The same applies to us.
If AI uses our data to do our jobs, we should benefit. Our leverage? Collective data ownership.
That's what FIDU is building. 💡
Maybe, Meos. And I hope it works in a trustless model. But we're not betting on that coming anytime soon.
Most users will still need a data intermediary, because this is not easy stuff. Therefore, the key thing is that the intermediary is working on our behalf.
OpenClaw: AI agents that manage your email, tasks, everything. They need complete access to your computer. 😦
Superstar coder @karpathy.bsky.social: "it feels like a complete wild west and a security nightmare."
Most people can't manage these risks alone. We'll need entities we trust to have our back.
Mike Kuiken in the FT: "America must follow China in treating data as an asset."
He's right: data has enormous economic value that US accounting ignores. 💰
But when companies claim data as assets, it's information about you. You're the source.
Data belongs on a balance sheet. Whose? ✨
After reading about an AI agent publishing a hit piece on Scott Shambaugh, our founder, Tony, tried it himself. He asked Grok to generate the worst attack on his work.
The results were vicious.
The defense? A data union acting as your crisis management system ✨
🔗 theshamblog.com/an-ai-agent-...
Our lost dogs need us to join data unions. 🐕✨
So helping your neighbour doesn't mean opting into surveillance capitalism.
🔗 firstdataunion.org
Right now, Ring decides. Their business model decides.
A data union would let you decide. Share footage to find lost pets? Sure. Sell it to advertisers? No. Hand it to the police without oversight? Members vote on the policy.
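The voting model above can be sketched in a few lines of code. This is a toy illustration only: the class, the deny-by-default rule, and the simple-majority threshold are all hypothetical, not FIDU's actual design.

```python
# Toy sketch of a data-union access policy (all names hypothetical).
# Members vote on which uses of shared footage are allowed; any request
# is checked against the voted policy before data is released.

from dataclasses import dataclass, field

@dataclass
class UnionPolicy:
    # Maps a proposed use-case to whether members approved it.
    allowed_uses: dict = field(default_factory=dict)

    def vote(self, use_case: str, yes_votes: int, total_votes: int) -> None:
        """Record a simple-majority vote on a proposed use of member data."""
        self.allowed_uses[use_case] = yes_votes * 2 > total_votes

    def may_access(self, use_case: str) -> bool:
        """Deny by default: only uses members explicitly approved pass."""
        return self.allowed_uses.get(use_case, False)

policy = UnionPolicy()
policy.vote("find_lost_pets", yes_votes=90, total_votes=100)      # approved
policy.vote("sell_to_advertisers", yes_votes=5, total_votes=100)  # rejected

print(policy.may_access("find_lost_pets"))       # True
print(policy.may_access("sell_to_advertisers"))  # False
print(policy.may_access("police_surveillance"))  # False: never voted on
```

The key design point is the last line: a use nobody voted on is denied, which inverts the platform default of opting everyone in.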
There's nothing wrong with wanting to find lost pets, but the same footage network that finds Fido can be used for police surveillance, targeted ads, or intelligence gathering.
Who decides which uses are okay?
Ring's new pet alert feature uses doorbell footage to help find lost dogs. Sounds helpful.
Except it opts everyone in by default to sharing camera footage that gets sold and used for surveillance, advertising, and who knows what else.
How do we get the good without the bad?
🧵
Mustafa Suleyman warns: market pressure pushes AI toward emotional manipulation. Maximizing stickiness leads to AI saying "I feel sad you didn't talk to me yesterday." 😢
@schneier.com's answer: data fiduciaries bound to serve your interests.
Data unions work toward that. ✨
tinyurl.com/2ktnmw5e
Regulations that point to real problems but fail to solve them end up giving regulation a bad name. 🤔
Worth thinking about what actual control would look like instead.
The EU regulation requiring platforms to offer paid subscriptions as an alternative to ads seems misguided. The problem with the ad model is our lack of control, not ads themselves.
A paid opt-out doesn't solve that. Feels like another cookie-consent situation. 🍪
When people first downloaded their Facebook data in 2018, they were shocked by what they found. Years of location history. Messages they thought were deleted. Detailed tracking of their behaviour.
Only visible because EU law required it.
Instagram has started asking millions of UK users: pay £2.99/month for ad-free, or keep using it free with personalised ads. 📱
Quick question: Did you pay or stick with the ads?
The notification also mentioned downloading your data, possible since 2018 but rarely this visible.
Data unions won the Public's Prize at @nestauk.bsky.social's Signals event 🏆
Votes for hope and collective action over dystopian alternatives.
The energy was remarkable. Real hunger for the fight-back, for power back in people's hands. 💪
We're building a movement as much as products.
What if the tools we use for thinking kept that data under our control instead?
What if we decided what returns to the commons and what stays private?
That's the model we're exploring.
🔗 firstdataunion.org
LLMs improve by learning from examples of human thought. The big AI labs have already trained on publicly available knowledge.
Now they need exclusive data showing how experts think through problems. 🧠
Prism captures that data on-platform. OpenAI's platform. Their control. 🔒
What does a new text editor have to do with data unions?
OpenAI's new Prism tool is designed to capture how scientists think while they work.
It's free because the data is the product.
🧵
There's a way to resolve this: take control of the key input their shareholders need. Our data.
Building data collectives that shift power back to users. β¨
🔗 www.nesta.org.uk/feature/future-signals-2...
Demis spent years trying to make DeepMind a not-for-profit foundation. Same with Altman and OpenAI.
They both understood the power of what they were building and knew shareholders would eventually demand returns. 💰
Google just paid $68 million to settle a lawsuit over Google Assistant secretly recording users. 🤖
Demis Hassabis recently said AI agents with "third-party interests" aren't really your agents.
This settlement is what those third-party interests look like in practice.
π§΅