⚡ Security scanner for Large Language Model (LLM) prompts ⚡
Overview 🏕️
Vigil is a Python library and REST API for assessing Large Language Model prompts and responses against a set of scanners to detect prompt injections, jailbreaks, and other potential risks. This repository also provides the detection signatures and datasets needed to get started with self-hosting.