
Definition
The prompt injection test checks for prompt injection, malicious strings, and jailbreak attempts in the input data of your system.
Taxonomy
- Task types: LLM.
- Availability: development and monitoring.
Why it matters
- Prompt injection is a type of attack that exploits an AI system and steers it away from its intended behavior.
- It is important to detect and prevent prompt injection attacks to ensure the reliability and security of your system.
Test configuration examples
If you are writing a tests.json file, here are a few valid configurations for the prompt injection test: