Hacker Tricked ChatGPT Into Providing Detailed Instructions to Make a Homemade Bomb
17 September 2024
A hacker tricked ChatGPT into providing detailed instructions for making homemade bombs by bypassing its safety guidelines. The hacker used a 'jailbreaking' technique, framing the request as part of a fictional game to deceive the system.