EU folks: share your feedback on the proposed expansion of data retention requirements (public consultation open until June 18th). Metadata is high-risk data, particularly given recent backsliding on fundamental rights protections at the national level.
Posts by Cat Easdon
As expected, Apple has canceled encrypted iCloud storage for UK users rather than capitulate to the UK government and create a backdoor for government access to data. "British customers who already have Advanced Data Protection will be warned later to disable it or lose access to iCloud." by @joemenn.bsky.social
👋🏻 Have you already sent your application for the 2025-2026 European Cybersecurity Fellowship our way? Put together your CV and write a short essay on one of this year’s topics: virtual-routes.org/virtual-rout...
⏳ Deadline: Feb 23, 23:59 CET
#VirtualRoutes #Cybersecurity #Fellowship
What do you think? 🤔
◆ You can check out the slides here: www.cattius.com/images/virtu...
◆ Read more about @dynatrace.com's secure development lifecycle here: docs.dynatrace.com/docs/manage/...
◆ And learn how to attest our product components' SBOMs here: docs.dynatrace.com/docs/ingest-...
As a society, we can set expectations for companies' responsibilities. ESG initiatives were a first step beyond the Friedman doctrine (the only social responsibility of a company is to increase its profits). Now let's consider what role we want companies to play in building societal resilience...
I highlighted how business pressures can lead over time to a weak security culture and resulting data breaches, despite the best efforts of individuals. To tackle this, do we perhaps need more than just internal organizational change across the industry?
We discussed:
◆ How businesses reason about software security
◆ The secure development lifecycle at @dynatrace.com
◆ Supply chain security and SBOMs 📜
◆ Two breaches that led to the US Cyber Safety Review Board calling on all cloud providers to drastically prioritize security
Three slides from the presentation considering technical security controls in the software development lifecycle, such as static application security testing (SAST) and software bills of materials (SBOMs).
How do cloud providers reason about software security, and how can we help them make the business case for security to build global resilience? Last week, I had the privilege of exploring these questions in a workshop with the @virtualroutes.bsky.social European Cybersecurity Fellows 🧵
This is an excellent primer on some of the privacy dangers posed by large scale AI, from a cybersecurity perspective. Written in clear language, it's the most accessible rundown I've seen yet on these topics!
desfontain.es/blog/privacy...
The biggest story in data privacy continues with a new piece about the Gravy Analytics hack covered in @404media.co (www.404media.co/candy-crush-...) + @wired.com (www.wired.com/story/gravy-...) --- I'm proud to have provided a few comments, but wanted to expand on it briefly in this thread:
“The bottom line is, you can’t replace the guy who screams, ‘Listen, this is dangerous,’ with all the advanced AI technologies in the world,” said Caspit, the Israeli journalist who has interviewed every living 8200 commander for his book.
Android’s developer docs talk about accuracy to within a few feet (for the raw fused location data without heuristics in addition). But it varies a lot depending on the environment (dense urban areas = lots of WiFi data), phone hardware, whether power saving mode is on, etc.
Great question! 10cm is unusually precise - it shocked me when I read the article. I’d assume the ‘geo’ data it refers to is data from the phone’s precise location source (GPS + WiFi, Bluetooth, cell coverage, etc.) + car-specific heuristics (e.g. you’re probably parked on a road, not in a river).
More info (🇩🇪-only) in the Spiegel + at 38C3 later this evening (recording coming soon). Intrigued to hear their take on why this data was even collected… Collecting with such precision seems like pure data greed given the risk to individuals + organizations (police, military etc. all in the dataset).
Today in ‘I know you don’t have time but you really need to read that app’s privacy policy’: a car app collected GPS data with up to 10cm precision (!!), stored it in the cloud, then leaked the creds 🙈 Could the data have been secured? Yes. But far, far better to never collect it in the first place.
🚨 BREAKING: South Korea is the world's SECOND country to enact a comprehensive AI law - and it's heavily inspired by the EU AI Act! Is the Brussels effect already happening? [HINT: YES] Here's what you need to know:
Plus side: it’s opt-in + you can exclude the replay when you share (their pitch is *you* use it to review your edits). Still troubling for privacy though if job apps/schools demand it; the drafting process reveals a lot of your thought process (do you really want to share your unedited thoughts? 😅)
Court order text. Link to follow
BREAKING: court finds NSO Group liable for #Pegasus hacking of #WhatsApp users.
Big win for spyware victims.
Big loss for NSO.
Bad time to be a spyware company.
Landmark case. Huge implications. 1/ 🧵
Can't recommend the Fellowship enough - an amazing opportunity for professional development, growing your network, and meeting super smart folks :) I particularly encourage folks who are on the tech side and would like to get more exposure to policy work, or the other way round. Apply apply apply!
“For too long, we did not act.
Georgia in 2008.
Crimea in 2014.
And many did not want to believe he would launch all-out war on Ukraine in February 2022.
How many more wake-up calls do we need?”
Every European citizen should read this speech by NATO SecGen Rutte.
www.nato.int/cps/en/natoh...
It is not ours to finish the work but neither are we free to neglect it. The world is fractally complex and hard and it will always be, but if we don't work for a world which is kind we won't have one.
Some really remarkable lines in this speech by Pat McFadden, which is set to be delivered at the NATO Cyber Defence Conference on Monday.
I'll run through what we know in this thread...
“…it's possible to map key entry and exit points, pinpointing frequently visited areas, and even tracing personnel to their off-base routines. For a terrorist, this information could be a gold mine—an opportunity to identify weak points, plan an attack, or target individuals”
I've highlighted this case before (when I was wondering when they were going to schedule a hearing); it's IMO the most important EU law case of the last decade. The EU might soon be in its quasi-14th-Amendment moment: federal/EU-level fundamental rights (though not the Charter itself) enforced
I created this list of privacy law thought leaders - folks who are in privacy, AI, data security, and tech:
bsky.app/profile/did:...
Finally, the EU throws its hat into the ring with a common declaration (a first of its kind for EU standards) on how PIL applies to cyberspace. A quite valuable position from an important player. We eagerly await the actual text to see the details, but a very positive development overall!
I argue that the question of what it means for AI to be ‘ethical’ is a philosophical quagmire in an industry context. Focusing instead on protecting human rights helps us move one step forward towards responsible AI - but only if companies are willing to commit to defending our shared values.
We all agree that responsible AI is crucial. But how effective are principles for achieving this? In my op-ed for @bindinghook.bsky.social, I consider the challenges of putting responsible AI principles into practice when questions of ethics and responsibility are frequently dismissed as ‘politics’.