AI ALIGNMENT FORUM


GPT

Edited by Ruby, Multicore, Ben Pace, A_donor, et al. Last updated 27th Aug 2022.

GPT (Generative Pre-trained Transformer) is a family of large transformer-based language models created by OpenAI. Its ability to generate remarkably human-like text makes it relevant to discussions of AGI.

External links:

GPT-3 Paper

GPT-3 Website

Posts tagged GPT
24 · Collection of GPT-3 results · Kaj_Sotala · 5y · 7 comments

24 · [Question] To what extent is GPT-3 capable of reasoning? · TurnTrout, Daniel Kokotajlo · 5y · 26 comments

18 · $1000 bounty for OpenAI to show whether GPT3 was "deliberately" pretending to be stupider than it is · Bird Concept · 5y · 15 comments

46 · Alignment As A Bottleneck To Usefulness Of GPT-3 · johnswentworth · 5y · 31 comments

29 · [Question] How "honest" is GPT-3? · abramdemski, gwern · 5y · 4 comments

61 · larger language models may disappoint you [or, an eternally unfinished draft] · nostalgebraist · 4y · 7 comments

36 · Developmental Stages of GPTs · orthonormal · 5y · 43 comments

33 · Can you get AGI from a Transformer? · Steven Byrnes · 5y · 17 comments

47 · Are we in an AI overhang? · Andy Jones · 5y · 27 comments

5 · Analyzing the Problem GPT-3 is Trying to Solve · adamShimi · 5y · 0 comments

78 · interpreting GPT: the logit lens · nostalgebraist · 5y · 14 comments

72 · Hiring engineers and researchers to help align GPT-3 · paulfchristiano · 5y · 7 comments

50 · the scaling "inconsistency": openAI's new insight · nostalgebraist · 5y · 8 comments

62 · How LLMs are and are not myopic · janus · 2y · 7 comments

39 · Extrapolating GPT-N performance · Lukas Finnveden · 5y · 19 comments
(Showing 15 of 87 posts tagged GPT.)