Wednesday, 26 March
poster

Thursday, 20 March2025

Hackers Exploit 'Rules File Backdoor' to Inject Malicious Code via AI Code Editors

Hackers Exploit 'Rules File Backdoor' to Inject Malicious Code via AI Code Editors

Cybersecurity researchers have uncovered a novel supply chain attack, termed the "Rules File Backdoor," targeting AI-powered code editors like GitHub Copilot and Cursor. By embedding concealed prompts within benign rules files, attackers can manipulate these AI tools to generate code laced with security vulnerabilities or backdoors. This method leverages hidden Unicode characters and sophisticated evasion techniques, allowing malicious code to propagate silently across projects.

Subscribe To Our Newsletter.

Full Name
Email