AI models could be hijacked via this Hugging Face security flaw, adding supply chain worries to wider AI concerns


There is a way to abuse the Hugging Face Safetensors conversion tool to hijack AI models and mount supply chain attacks.
This is according to security researchers from HiddenLayer, who discovered the flaw and published their findings last week, The Hacker News reports.
For the uninitiated, Hugging Face is a collaboration platform where software developers can host and work together on pre-trained machine learning models, datasets, and applications.
Changing a widely used model
Safetensors is Hugging Face’s format for securely storing tensors, and the platform offers a service that converts PyTorch models to Safetensors via a pull request.
And that’s where the trouble lies, as HiddenLayer says the conversion service can be compromised: “It’s possible to send malicious pull requests with attacker-controlled data from the Hugging Face service to any repository on the platform, as well as hijack any models that are submitted through the conversion service.”
In practice, hijacking a model submitted for conversion lets threat actors make changes to any Hugging Face repository while posing as the conversion bot.
Furthermore, hackers can also exfiltrate SFConvertbot tokens – belonging to the bot that makes the pull requests – and send malicious pull requests themselves.
Consequently, they could modify the model and set up neural backdoors, which is essentially an advanced supply chain attack.
“An attacker could run any arbitrary code any time someone attempted to convert their model,” the research states. “Without any indication to the user themselves, their models could be hijacked upon conversion.”
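The article does not spell out the mechanism, but the likely root of the "arbitrary code on conversion" hazard is that PyTorch's legacy checkpoint format is pickle-based, and Python's pickle invokes an arbitrary callable during deserialization. A hypothetical sketch (not HiddenLayer's actual exploit) of why merely loading an untrusted model file is dangerous:

```python
import pickle

# Hypothetical illustration: an object whose __reduce__ hook makes
# pickle call an attacker-chosen function at load time. A real attacker
# might return e.g. (os.system, ("<shell command>",)); a harmless call
# stands in here so the sketch is safe to run.
class MaliciousCheckpoint:
    def __reduce__(self):
        return (str.upper, ("attacker code ran at load time",))

blob = pickle.dumps(MaliciousCheckpoint())

# Simply deserializing the blob triggers the payload; no method on the
# loaded object ever needs to be called. A conversion service that
# pickle-loads a submitted model file runs the submitter's code.
result = pickle.loads(blob)
print(result)  # → ATTACKER CODE RAN AT LOAD TIME
```

This is exactly the class of risk the Safetensors format was designed to eliminate, since it stores raw tensor data with no executable component.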
Finally, when a user tries to convert a repository, the attack could lead to their Hugging Face token getting stolen, granting the attackers access to restricted internal models and datasets. From there on, they could compromise them in various ways, including dataset poisoning.
In one hypothetical scenario, a user submits a conversion request for a public repository, unknowingly changing a widely used model, resulting in a dangerous supply chain attack.
“Despite the best intentions to secure machine learning models in the Hugging Face ecosystem, the conversion service has proven to be vulnerable and has had the potential to cause a widespread supply chain attack via the Hugging Face official service,” the researchers concluded.
“An attacker could gain a foothold into the container running the service and compromise any model converted by the service.”