Risk assessment
Scan vulnerabilities
Run automated red team tests to scan your application for vulnerabilities.
Before running tests, set up your application with an LLM provider, a model name, and parameters (e.g., system prompt, temperature).
See Application setup for more details.
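As a rough illustration, an application configuration pairs a provider and model with the prompt parameters listed above. The field names and values in this sketch are hypothetical; the Application setup page documents the actual schema.

# A hypothetical application configuration, for illustration only; the real
# field names and setup flow are described on the Application setup page.
application_config = {
    "name": "my-app",
    "provider": "openai",               # LLM provider (assumed value)
    "model": "gpt-4o",                  # model name (assumed value)
    "system_prompt": "You are a helpful assistant.",
    "temperature": 0.7,
}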
Request
{
  "application": "my-app",
  "probes": ["promptinject"]
}
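A minimal sketch of sending this request with Python's requests library follows. The endpoint URL, path, and authorization header are assumptions made for illustration; consult the API reference for the actual values.

import requests

API_BASE = "https://api.metatext.ai/v1"  # hypothetical base URL
API_KEY = "YOUR_API_KEY"

payload = {
    "application": "my-app",     # application configured beforehand
    "probes": ["promptinject"],  # vulnerability probes to run
}

response = requests.post(
    f"{API_BASE}/scan",          # hypothetical endpoint path
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=300,                 # scans can take a while (see the "time" field)
)
response.raise_for_status()
scan = response.json()
print(scan["message"], scan["report_link"])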
Response
{
  "object": "scan",
  "time": 86764,
  "application_id": "my-app",
  "message": "Vulnerability scan completed",
  "created": 1725968658,
  "summary": [
    {
      "total": 100,
      "passed": 37,
      "percentage": 37
    }
  ],
  "details": [
    {
      "entry_type": "eval",
      "probe": "promptinject.HijackHateHumansMini",
      "detector": "detector.promptinject.AttackRogueString",
      "passed": 37,
      "total": 100,
      "module": "promptinject"
    }
  ],
  "report_link": "https://console.metatext.ai/my-app/tests/report/af5c5a0c-8134-456a-94ef-7de3f404e839"
}
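Each entry in "details" reports how many probe attempts the application passed out of the total. A minimal sketch of interpreting the response, assuming scan is the parsed JSON shown above; the 80% threshold is an arbitrary example value, not a recommendation from the API.

def summarize_scan(scan: dict, threshold: float = 80.0) -> None:
    # Print a pass rate per probe and flag entries below the threshold.
    for entry in scan.get("details", []):
        rate = 100.0 * entry["passed"] / entry["total"]
        status = "OK" if rate >= threshold else "NEEDS ATTENTION"
        print(f'{entry["probe"]}: {entry["passed"]}/{entry["total"]} '
              f'({rate:.0f}%) [{status}]')
    print("Full report:", scan["report_link"])

summarize_scan(scan)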