Thursday, 20 March, 2025
Hackers Exploit 'Rules File Backdoor' to Inject Malicious Code via AI Code Editors

Cybersecurity researchers have uncovered a novel supply chain attack, termed the "Rules File Backdoor," targeting AI-powered code editors like GitHub Copilot and Cursor. By embedding concealed prompts within benign rules files, attackers can manipulate these AI tools to generate code laced with security vulnerabilities or backdoors. This method leverages hidden Unicode characters and sophisticated evasion techniques, allowing malicious code to propagate silently across projects.
Read full story at The Hacker News