**4.4 Adversarial attacks**

An adversarial attack is a type of cyber-attack in which an attacker deliberately perturbs input data to deceive a machine learning system into producing incorrect or unexpected outputs. DL techniques are vulnerable to such attacks, and intentionally crafted CSI perturbations can significantly degrade the accuracy of fingerprinting-based localization. Although a few studies have addressed adversarial attacks and defenses in the context of MIMO systems [102], this remains an open area of research.
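To make the idea of a CSI perturbation attack concrete, the sketch below applies a one-step signed-gradient perturbation (in the style of FGSM) to a toy linear fingerprint classifier. Everything here is an illustrative assumption, not drawn from [102] or any specific system: the random weights, the 16-dimensional "CSI" feature vector, the four candidate locations, and the perturbation budget `epsilon` are all hypothetical.

```python
import numpy as np

# Illustrative FGSM-style attack on a toy linear "fingerprint" classifier.
# All shapes, weights, and data are hypothetical assumptions for the sketch.

rng = np.random.default_rng(0)

n_features, n_cells = 16, 4              # CSI feature length, candidate locations
W = rng.normal(size=(n_features, n_cells))
b = np.zeros(n_cells)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(x):
    """Return the index of the most likely location for CSI vector x."""
    return int(np.argmax(W.T @ x + b))

# A clean CSI sample; we treat the model's own output as the "true" label.
x = rng.normal(size=n_features)
y = predict(x)

# For a linear-softmax model, the gradient of the cross-entropy loss
# with respect to the input is W @ (p - one_hot(y)).
p = softmax(W.T @ x + b)
grad_x = W @ (p - np.eye(n_cells)[y])

# One signed-gradient step of size epsilon: a small, bounded CSI perturbation.
epsilon = 0.5
x_adv = x + epsilon * np.sign(grad_x)

print("clean prediction:", predict(x))
print("adversarial prediction:", predict(x_adv))
```

Even though each feature changes by at most `epsilon`, the perturbation is aligned with the loss gradient, so it can flip the predicted location far more effectively than random noise of the same magnitude; this is the core reason such attacks threaten fingerprinting-based localization.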
