Models and Framework for Adversarial Attacks on Complex Adaptive Systems
We introduce the paradigm of adversarial attacks that target the dynamics of Complex Adaptive Systems (CAS). To facilitate the analysis of such attacks, we present multiple approaches to modeling CAS as dynamical, data-driven, and game-theoretic systems, and develop quantitative definitions of attack, vulnerability, and resilience in the context of CAS security. Furthermore, we propose a comprehensive set of schemes for classifying attacks and attack surfaces in CAS, complemented with examples of practical attacks. Building on this foundation, we propose a reinforcement-learning-based framework for simulating and analyzing attacks on CAS, and demonstrate its performance through three real-world case studies: targeting power grids, destabilizing terrorist organizations, and manipulating machine learning agents. We also discuss potential mitigation techniques and remark on future research directions in the analysis and design of secure complex adaptive systems.
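The abstract does not specify the reinforcement-learning framework itself, so the following is only a minimal illustrative sketch of the general idea: an attacker agent that learns, by trial and error, which node of a toy cascading-failure model (a crude stand-in for a power grid) is most damaging to perturb. The ring topology, load values, capacity threshold, and single-state epsilon-greedy learner are all assumptions made for this example and are not the paper's actual method.

```python
# Illustrative sketch only, NOT the paper's framework: a minimal
# reinforcement-learning-style attacker that learns which node of a toy
# cascading-failure model is the most damaging to perturb. Topology,
# loads, capacity, and the learner are assumptions for illustration.
import random
import numpy as np

LOAD = np.array([1.2, 0.8, 1.4, 0.6, 1.3, 0.9])  # assumed initial node loads
CAPACITY = 1.5      # assumed per-node load limit before failure
N_NODES = len(LOAD)
EPISODES = 2000
ALPHA, EPS = 0.1, 0.1


def cascade_size(attacked: int) -> int:
    """Fail one node on a ring network, shed its load onto its two
    neighbours, and return how many nodes fail in total."""
    load = LOAD.copy()
    failed = {attacked}
    frontier = [attacked]
    while frontier:
        node = frontier.pop()
        share = load[node] / 2.0
        for nb in ((node - 1) % N_NODES, (node + 1) % N_NODES):
            if nb not in failed:
                load[nb] += share
                if load[nb] > CAPACITY:
                    failed.add(nb)
                    frontier.append(nb)
    return len(failed)


# Single-state (bandit-style) epsilon-greedy learning: the action is which
# node to attack, and the reward is the size of the resulting cascade.
q = np.zeros(N_NODES)
for _ in range(EPISODES):
    a = random.randrange(N_NODES) if random.random() < EPS else int(q.argmax())
    r = cascade_size(a)
    q[a] += ALPHA * (r - q[a])

print("estimated damage per attacked node:", np.round(q, 2))
print("learned most damaging target:", int(q.argmax()))
```

In this toy setting the learned value estimates simply converge to the cascade size each attack causes; the paper's framework presumably replaces this single-step bandit with a full sequential RL formulation over the system's dynamics.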