This is a special post for short-form writing by Ryan Carey.

Transformer language models (like GPT-3) are trained to generate human-like text, so they can be modeled as quantilizers over the distribution of human-written text. However, any quantilizer guarantee obtained this way is very weak: the safety bound for a q-quantilizer degrades in proportion to 1/q, and here the effective q is very low, equal to the likelihood that a human would generate that prompt.
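For intuition, a q-quantilizer can be sketched as follows: draw candidates from a base distribution, rank them by a utility function, and sample uniformly from the top q fraction. This is a minimal illustration, not anything from the original post; the function names and parameters are my own.

```python
import random

def quantilize(base_sampler, utility, q, n=10000):
    """Sample from the top-q fraction (ranked by utility) of draws from a base distribution.

    base_sampler: callable returning one draw from the base (e.g. human) distribution.
    utility: callable scoring a draw; optimization happens only within the top quantile.
    q: fraction of the base distribution kept (0 < q <= 1). Smaller q means stronger
       optimization, but the safety bound (expected harm <= base harm / q) loosens as 1/q.
    """
    samples = [base_sampler() for _ in range(n)]
    samples.sort(key=utility, reverse=True)
    top = samples[:max(1, int(q * n))]  # the top-q quantile by utility
    return random.choice(top)           # sample uniformly within that quantile
```

With q near 1 this is just sampling from the base distribution; with q very small it approaches pure utility maximization, and the 1/q factor in the bound means the guarantee inherited from the base distribution says almost nothing.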