‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
A jailbreak of OpenAI's GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users…