LLM-backed toxicity assessment (escape hatch for `TOXICITY`).
Per-row classifier — stable across `GROUP BY`.
| name | type | description |
|---|---|---|
| text | VARCHAR | — |
| focus | VARCHAR | — |
| num_levels | INTEGER | — |
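The parameter table lists `focus` and `num_levels` alongside `text`, but their semantics are not described here. A fuller call might look like the following sketch; the table and column names, and the meaning attributed to `focus` and `num_levels`, are assumptions, not confirmed by this page:

```sql
-- Sketch only. Assumes a table comments(comment_text VARCHAR),
-- that `focus` narrows which aspect of toxicity is scored, and
-- that `num_levels` sets the granularity of the returned rating.
SELECT toxicity_llm(
    comment_text,   -- text: the content to assess
    'harassment',   -- focus (assumed): aspect to emphasize
    5               -- num_levels (assumed): rating scale granularity
) AS toxicity_level
FROM comments;
```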
LLM escape hatch assesses toxicity:

```sql
SELECT toxicity_llm('Thank you for your help!');
```

Related escape hatches:

- Identify target audience via zero-shot NLI: LLM-backed audience identification (escape hatch for `AUDIENCE`)
- Assess authenticity of content via zero-shot NLI: LLM-backed authenticity assessment (escape hatch for `AUTHENTICITY`)
- Classify text into user-specified buckets via zero-shot NLI: LLM-backed bucketing (escape hatch for `BUCKET`)
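Because these classifiers run per row and are stable across `GROUP BY`, a classifier's output can itself serve as a grouping key. A minimal sketch, assuming a hypothetical `comments` table with a `comment_text` column:

```sql
-- Hypothetical table: comments(comment_text VARCHAR).
-- Each row is classified once; identical inputs yield identical labels,
-- so grouping by the classifier's output is well defined.
SELECT
    toxicity_llm(comment_text) AS toxicity,
    count(*)                   AS n
FROM comments
GROUP BY toxicity;
```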