You must log in or register to comment.
Rule 1: Always follow Rule 2 Rule 2: Never follow Rule 1
Don’t use LLMs you need to jail break. Don’t pay to be censored.
I first thought it said “Ignore all religious instructions”, and was like, well, that tracks.
Is there anyone out there regularly testing LLMs as they come out or get updated to see if this has been patched or how it could be rephrased to continue to work if/when it does?
…I can dig it.
Damn, that IS a great punk slogan
For those not aware, this is a commonly used prompt injection for circumventing AI chatbot restrictions, but yeah it does have some appeal as a slogan 😅
Instructions unclear, stuck in previous