INCUBATION | Adversarial Robustness Toolbox (ART) provides tools that enable developers and researchers to evaluate, defend, certify, and verify machine learning models and applications against adversarial threats. GitHub: https://github.com/Trusted-AI/adversarial-robustness-toolbox |
Reference Information
- Website
- GitHub
- Mailing Lists
- Wiki
- Artwork
- Project Lead: Animesh Singh, singhan@us.ibm.com
- LF AI & Data Technical Advisory Council (TAC) Sponsor: Ibrahim Haddad, LF AI & Data Executive Director (temporary)