- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
Sooo, does anyone have example prompts like the ones from the article that haven't been patched yet and can still successfully jailbreak current chat AIs?
I'm interested in doing some research with them.