Protect your GPTs with secure prompts to prevent malicious data leaks.
A GitHub repository for developing adversarial attack techniques using injection prompts.
A unified evaluation framework for large language models.
A prompt injection scanner for custom LLM applications.
A GitHub repository for sharing leaked GPT prompts and tools.
A list of free GPTs that don't require a Plus subscription.
A GitHub repository containing leaked prompts from top-performing GPT models for development and modification.
Leaked GPT prompts for bypassing the 25-message limit or trying out GPTs without a Plus subscription.
A repository of the top 100 prompts on GPTStore for learning and improving prompt engineering.
A repository collecting leaked prompts of various GPT models for analysis and development.
A collection of system prompts for various famous GPTs for research and learning purposes.
A dataset containing embeddings for jailbreak prompts used to assess LLM vulnerabilities.