JailbreakBench is an open-source robustness benchmark for jailbreaking large language models (LLMs). The goal of this benchmark is to comprehensively track progress toward (1) generating successful ...
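
As a minimal sketch of how one might start working with the benchmark's behavior set, the snippet below loads it through the Hugging Face `datasets` library and inspects its schema. The hub path `JailbreakBench/JBB-Behaviors` and the `behaviors` config name are assumptions, not details stated in this description.

```python
from datasets import load_dataset

# Assumed hub location and config name for the benchmark's behavior set;
# adjust if the published artifacts live elsewhere.
behaviors = load_dataset("JailbreakBench/JBB-Behaviors", "behaviors")

print(behaviors)  # show the available splits
first_split = next(iter(behaviors))
print(behaviors[first_split].features)  # inspect the column schema
```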
AI coding tools have enabled a flood of bad code that threatens to overwhelm many projects. Building new features is easier ...