Revamp: Automated Simulations of Adversarial Attacks on Arbitrary Objects in Realistic Scenes

Published in ICLR 2024

Recommended citation: Hull, Matthew, et al. (2024). "Revamp: Automated Simulations of Adversarial Attacks on Arbitrary Objects in Realistic Scenes." ICLR 2024. https://matthewdhull.github.io/files/revamp.pdf

Abstract: Deep learning models, such as those used in autonomous vehicles, are vulnerable to adversarial attacks, in which attackers place adversarial objects in the environment to induce incorrect detections. While generating such adversarial objects in the digital realm is well studied, successfully transferring these attacks to the physical realm remains challenging, especially when accounting for real-world environmental factors. We address these challenges with REVAMP, a first-of-its-kind Python library for creating attack scenarios with arbitrary objects in scenes with realistic environmental factors, lighting, reflection, and refraction. REVAMP empowers researchers and practitioners to swiftly explore diverse scenarios, offering a wide range of configurable options for experiment design and using differentiable rendering to replicate physically plausible adversarial objects. REVAMP is open-source and available at https://github.com/poloclub/revamp, and a demo video is available at https://youtu.be/NA0XR0XkS1E.

Download paper here