Posts by Marco Di Gennaro

📣 Our paper “On the (In)Security of Loading Machine Learning Models” has been accepted at IEEE S&P 2026 (13% acceptance rate this cycle). (1/5)

Together with @io-no.bsky.social, and with the support of @raistolo.bsky.social, Stefano Longari, and @johnmc88.bsky.social (all from NECSTLab @ Politecnico di Milano), we systematically studied the security of loading machine learning models across popular frameworks and model hubs. (2/5)

We identified six zero-day vulnerabilities, including the first CVEs ever assigned to Keras safe_mode. Our results show that loading a machine learning model can be equivalent to executing untrusted code, despite the security claims often present in framework and hub documentation. (3/5)

We also show that Hugging Face’s integrated scanners do not always provide an effective additional line of defense against framework-level exploits. (4/5)

Finally, through a survey of machine learning practitioners, we show that security claims in framework and hub documentation can create misplaced trust: over 90% of non-security ML practitioners perceived no risk of arbitrary code execution when loading a model with safe_mode=True. (5/5)

👇 Preprint: arxiv.org/abs/2509.06703
#ieeesp #ieeesp26 #ieee #cybersecurity #softwaresecurity #aisecurity #machinelearning #ml #zeroday
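The claim that loading a model can be equivalent to executing untrusted code is easy to demonstrate for pickle-based serialization formats. A minimal sketch (a generic illustration of the pickle risk, not one of the paper's six vulnerabilities; the class name and command are invented for the example):

```python
import os
import pickle

# A malicious "model": pickle lets any object specify, via __reduce__,
# a callable that will be invoked during deserialization.
class MaliciousModel:
    def __reduce__(self):
        return (os.system, ("echo pwned",))  # runs in the loader's shell

payload = pickle.dumps(MaliciousModel())

# The victim only *loads* the file; no model method is ever called,
# yet the embedded command executes inside pickle.loads.
result = pickle.loads(payload)  # prints "pwned"
```

With such formats the trust boundary is the loader itself, not any code the user later calls on the model, which is why documentation claims about "safe" loading deserve scrutiny.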
🚨 Our PoPETS 2025 work, "TimberStrike," finds that federated tree models are vulnerable to privacy leakage via dataset reconstruction.
See you in Washington D.C. in July!
📄 Preprint: www.arxiv.org/abs/2506.07605
#FederatedLearning #Privacy #PoPETS

We found that federated tree-based models (like XGBoost) are vulnerable to dataset reconstruction attacks, and we demonstrated this vulnerability across major FL frameworks. Our work introduces a novel attack and provides guidelines for building stronger defenses.

This research was developed at the NECSTLab (Politecnico di Milano), in the context of the TRUSTroke project, in collaboration with my co-authors: Giovanni De Lucia, Stefano Longari, @raistolo.bsky.social, and @johnmc88.bsky.social.
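The intuition behind this kind of leakage can be sketched with a toy decision stump: split thresholds are computed as midpoints between adjacent training values, so the published model directly encodes constraints on individual training records. (This is only an illustration of why tree structure reveals training data, not the TimberStrike attack; best_split and the data are invented for the example.)

```python
def best_split(xs, ys):
    """Fit a 1-D decision stump by minimising misclassification.
    Candidate thresholds are midpoints between adjacent sorted training
    values, so the learned threshold is derived from two training points."""
    pairs = sorted(zip(xs, ys))
    best_t, best_err = None, float("inf")
    for (x0, _), (x1, _) in zip(pairs, pairs[1:]):
        t = (x0 + x1) / 2
        err = sum((x <= t) != (y == 0) for x, y in pairs)
        err = min(err, len(pairs) - err)  # allow either class on each side
        if err < best_err:
            best_t, best_err = t, err
    return best_t

# Toy training data: the positive class starts at x = 5.0
xs = [1.0, 2.0, 3.0, 5.0, 6.0, 7.0]
ys = [0, 0, 0, 1, 1, 1]
t = best_split(xs, ys)
# t == 4.0: exactly the midpoint of the training values 3.0 and 5.0,
# so anyone inspecting the model recovers information about those records.
```

Gradient-boosted ensembles publish many such thresholds per tree, which is what makes tree models a rich target for reconstruction in federated settings.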