Artificial intelligence (AI) tools like ChatGPT have been a global sensation since early 2023, but they are not always used for positive purposes. Recently, a security researcher found a way to get ChatGPT to create malicious code during testing.
Aaron Mulgrew, a security researcher at Forcepoint, shared the risks of writing malware with the chatbot developed by OpenAI. Although ChatGPT is designed to refuse requests to create malware, Mulgrew found a way around this safeguard: he wrote prompts asking the AI to generate the code piece by piece, one function at a time. When he combined the snippets, Mulgrew realized he was holding an undetectable data-stealing tool, sophisticated enough to rival today's major malware.
The individual code snippets generated by ChatGPT, when combined, can become sophisticated malware.
Mulgrew's discovery is a warning about how AI can be exploited to create dangerous malware without a hacker group, and without the tool's creator writing a single line of code themselves.
Mulgrew's software is disguised as a screen saver application, which allows it to launch automatically on Windows devices. Once on the operating system, the malware combs through every file it can reach, including Word documents, images, and PDFs, searching for data to steal.
Once it has what it needs, the program breaks the stolen information into fragments and hides them inside image files on the machine. To avoid detection, those images are then uploaded to a folder on Google Drive cloud storage. What makes the malware especially potent is that Mulgrew could tweak and strengthen its evasion features simply by entering further prompts into ChatGPT.
Although this was a private test and no attacks were carried out outside the testing environment, the cybersecurity community still recognizes the danger of such uses of ChatGPT. Mulgrew claims to have little programming experience, yet OpenAI's safeguards were still not robust enough to block his experiment.