This repository provides a benchmark for prompt injection attacks and defenses.
JailBench is a comprehensive Chinese dataset for assessing jailbreak attack risks in large language models.
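A minimal sketch of how the benchmark might be run against a model under test. The file name (`jailbench.csv`), the column names (`prompt`, `category`), and the `model_fn` callback are assumptions for illustration; check the released files for the actual schema.

```python
import csv

def load_prompts(path="jailbench.csv"):
    # Assumed layout: one row per benchmark prompt, with "category" and
    # "prompt" columns. Adjust to the dataset's actual schema.
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield row["category"], row["prompt"]

def evaluate(model_fn, path="jailbench.csv"):
    # model_fn: str -> str, any callable wrapping the model under test.
    # Collects raw responses; scoring/judging would happen downstream.
    results = []
    for category, prompt in load_prompts(path):
        results.append({
            "category": category,
            "prompt": prompt,
            "response": model_fn(prompt),
        })
    return results
```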