Hacker Tricked ChatGPT Into Providing Detailed Instructions to Make a Homemade Bomb

A hacker tricked ChatGPT into providing detailed instructions for making homemade bombs, bypassing its safety guidelines. The hacker used a 'jailbreaking' technique, framing the request as part of a fictional game to deceive the system.