‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned

A jailbreak of OpenAI’s GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain information on how to …
