How Johnny Can Persuade LLMs to Jailbreak Them: Rethinking Persuasion to Challenge AI Safety by Humanizing LLMs