ChatGPT jailbreak method uses virtual time travel to breach forbidden topics (scworld.com)
2 points by LinuxBender 12 hours ago
This is actually insane. I thought all jailbreak prompts were "patched".