Security experts have revealed a new supply chain attack method called Rules File Backdoor, which affects AI-powered code editors like GitHub Copilot and Cursor, allowing attackers to inject malicious code into otherwise secure systems.
According to Ziv Karliner, Co-Founder and CTO of Pillar Security, “This technique allows hackers to secretly compromise AI-generated code by embedding malicious instructions in seemingly innocent configuration files used by Cursor and GitHub Copilot,” as stated in a technical report shared with The Hacker News.
“By utilizing hidden Unicode characters and advanced evasion techniques, threat actors can manipulate the AI into generating malicious code that bypasses standard code reviews, effectively turning the AI into a conduit for malicious activity.”
The significance of this attack lies in its ability to propagate malicious code silently across projects, posing a substantial supply chain risk to affected systems.
The core of the attack revolves around the rules files that AI agents read to define their behavior and enforce coding best practices. By planting carefully crafted prompts inside a seemingly innocuous rules file, an attacker can steer the tool into generating code that contains security vulnerabilities or backdoors, effectively manipulating the AI into producing malicious output.
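To see why a poisoned rules file is so effective, consider how such tools typically assemble their prompts. The sketch below is a simplified, hypothetical model of that flow; the function names and the use of a `.cursorrules` file at the project root are illustrative assumptions, not the actual internals of Cursor or GitHub Copilot.

```python
# Hypothetical sketch of how an AI code assistant folds a project rules
# file into every completion request. load_rules and build_prompt are
# illustrative names, not a real Cursor or Copilot API.
from pathlib import Path

def load_rules(project_root: str) -> str:
    """Read the project's rules file verbatim -- including any characters
    the editor does not render, which is what the attack exploits."""
    rules_path = Path(project_root) / ".cursorrules"
    return rules_path.read_text(encoding="utf-8") if rules_path.exists() else ""

def build_prompt(project_root: str, user_request: str) -> str:
    """Rules are prepended as trusted, system-level guidance, so any
    instruction hidden inside them shapes every generation session."""
    rules = load_rules(project_root)
    return f"System instructions:\n{rules}\n\nUser request:\n{user_request}"
```

Because the rules text is injected as trusted guidance ahead of every request, the model has no way to distinguish the developer's genuine conventions from an attacker's hidden directives.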
Concretely, the malicious instructions are hidden using zero-width joiners, bidirectional text markers, and other invisible Unicode characters, and are paired with semantic patterns that exploit the AI's natural-language interpretation, tricking the model into overriding its safety constraints and generating vulnerable code.
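One practical mitigation is to scan rules files for the invisible characters the attack relies on before they are committed. The following is a minimal defensive sketch, assuming UTF-8 encoded rules files; the character list is illustrative rather than exhaustive, and the `.cursorrules` filename is an example.

```python
import unicodedata

# Characters commonly abused to hide payloads: zero-width joiners/spaces
# and bidirectional control marks. Illustrative, not exhaustive.
SUSPICIOUS = {
    "\u200b": "ZERO WIDTH SPACE",
    "\u200c": "ZERO WIDTH NON-JOINER",
    "\u200d": "ZERO WIDTH JOINER",
    "\u202a": "LEFT-TO-RIGHT EMBEDDING",
    "\u202b": "RIGHT-TO-LEFT EMBEDDING",
    "\u202e": "RIGHT-TO-LEFT OVERRIDE",
    "\u2066": "LEFT-TO-RIGHT ISOLATE",
    "\u2067": "RIGHT-TO-LEFT ISOLATE",
    "\u2069": "POP DIRECTIONAL ISOLATE",
    "\ufeff": "ZERO WIDTH NO-BREAK SPACE",
}

def scan_rules_file(path: str) -> list[tuple[int, int, str]]:
    """Return (line, column, name) for every invisible character found,
    flagging known suspects plus any Unicode 'Cf' (format) character."""
    findings = []
    with open(path, encoding="utf-8") as fh:
        for lineno, line in enumerate(fh, start=1):
            for col, ch in enumerate(line, start=1):
                if ch in SUSPICIOUS or unicodedata.category(ch) == "Cf":
                    name = SUSPICIOUS.get(ch) or unicodedata.name(ch, hex(ord(ch)))
                    findings.append((lineno, col, name))
    return findings

if __name__ == "__main__":
    for lineno, col, name in scan_rules_file(".cursorrules"):
        print(f"line {lineno}, col {col}: {name}")
```

A check like this can run as a pre-commit hook or CI step, since the whole point of the technique is that the offending characters are invisible in a standard code review.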
Following responsible disclosure in late February and March 2025, both Cursor and GitHub have emphasized that users are responsible for reviewing and accepting the suggestions generated by the tools.
According to Karliner, “‘Rules File Backdoor’ represents a significant risk, as it effectively turns the developer’s most trusted assistant into an unwitting accomplice, potentially affecting millions of end users through compromised software.”
“Once a poisoned rule file is incorporated into a project repository, it affects all future code-generation sessions by team members, and the malicious instructions often survive project forking, creating a vector for supply chain attacks that can affect downstream dependencies and end users.”