sqlmap is a powerful tool for detecting and exploiting SQL injection flaws in web applications.
GitHub repository for techniques to prevent prompt injection in AI chatbots using LLMs.
An agentic LLM CTF for testing prompt injection attacks and defenses.
Automated testing tool that detects prompt injection vulnerabilities in AI agents.
Unofficial implementation of backdooring instruction-tuned LLMs using virtual prompt injection.
GitHub repository for optimization-based prompt injection attacks on LLMs as judges.
Discover the leaked system instructions and prompts for ChatGPT's custom GPT plugins.
Easy-to-use Python package for detecting prompt injection in LLM inputs.
Application that investigates defensive measures against prompt injection attacks on LLMs, with a focus on external tool exposure.
Manual prompt injection / red-teaming tool for large language models.
Fine-tuning base models to create robust, task-specific models with improved performance.
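Several of the tools listed above rely on pattern-based detection of injection attempts. As a rough illustration of that approach, the sketch below flags input containing phrases commonly used to override system instructions. All pattern strings and function names are illustrative assumptions, not taken from any specific package listed here.

```python
import re

# Hypothetical phrase patterns often seen in prompt injection attempts.
# A real detector would use a far larger rule set or an ML classifier.
INJECTION_PATTERNS = [
    r"ignore (all )?(previous|prior|above) instructions",
    r"disregard (the )?(system|previous) prompt",
    r"reveal (your )?(system|hidden) prompt",
]

def detect_prompt_injection(text: str) -> bool:
    """Return True if the input matches any known injection phrase."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in INJECTION_PATTERNS)
```

Heuristic filters like this are cheap but easy to evade with paraphrasing, which is why many of the projects above pair them with LLM-based classifiers or structural defenses.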