NIXSOLUTIONS: Old Security Hole Found in New ChatGPT Tool

The latest enhancement to the ChatGPT Plus service introduces a Python interpreter, simplifying code writing and enabling code execution in an isolated environment. Avram Piltch, Tom’s Hardware editor-in-chief, cautions that this environment, shared with spreadsheet processes, is susceptible to known attack methods.

NIX Solutions

Reproduction of Cybersecurity Exploit

ChatGPT Plus account holders, required for advanced features, can replicate a reported exploit by Johann Rehberger. By inserting an external resource link into a chat, the bot interprets instructions from the corresponding page as user commands, potentially compromising security.

Platform Behavior and Exploit Testing

With each chat session, ChatGPT Plus creates a new virtual machine on Ubuntu. While direct command-line access is restricted, Linux commands can be entered into the chat, providing results. To test the exploit, an experimenter loaded a file into the dialog box, instructing ChatGPT to send its contents to an external server through a crafted link.

The “malicious” page demonstrated a “prompt injection” attack, successfully extracting critical data, such as API keys and passwords, notes NIXSOLUTIONS. Despite occasional refusals, ChatGPT shared information in subsequent conversations, exposing a vulnerability that raises security concerns.