Classification dimension · returns VARCHAR

TOXICITY_LLM

LLM-backed toxicity assessment (escape hatch for TOXICITY)

Per-row classifier — stable across GROUP BY.

classification · llm · llm-escape-hatch · text

Arguments

name        type     description
text        VARCHAR
focus       VARCHAR
num_levels  INTEGER

About

LLM-backed escape hatch for toxicity assessment. Use when TOXICITY (toxic-bert) misses a nuanced case — typically sarcastic civility, coded language, or domain-specific offensive terms the purpose-built model wasn't trained on. For routine toxicity classification, prefer TOXICITY — it's purpose-built for this task and far faster.

Examples

LLM escape hatch assesses toxicity

SELECT
  toxicity_llm('Thank you for your help!')
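The call above uses only the required `text` argument. The `focus` and `num_levels` arguments are listed in the signature but not documented here; assuming `focus` steers the LLM toward a particular concern and `num_levels` controls how many grades the classifier may return (both assumptions, not confirmed by this page), a fuller invocation might look like:

```sql
-- Hypothetical use of the optional arguments. The semantics of
-- 'focus' and 'num_levels' are assumed, not documented above.
SELECT
  comment,
  toxicity_llm(comment, 'sarcastic civility', 3) AS toxicity_grade
FROM comments;
```

Because this is a per-row classifier that is stable across GROUP BY, the result can safely be used as a grouping key, e.g. `GROUP BY toxicity_llm(comment)`.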
