New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors
Cybersecurity researchers have disclosed details of a new supply chain attack vector dubbed Rules File Backdoor that affects artificial intelligence (AI)-powered code editors like GitHub Copilot and Cursor, causing them to inject malicious code.
"This technique enables hackers to silently compromise AI-generated code by injecting hidden malicious instructions into seemingly innocent configuration files," Pillar Security's Ziv Karliner said in a report shared with The Hacker News.
https://thehackernews.com/2025/03/new-rules-file-backdoor-attack-lets.html