
PaLM

PaLM (Pathways Language Model) is a 540-billion-parameter Transformer language model created by Google and announced in April 2022. Google claims that the model exhibits discontinuous jumps in capabilities as it scales. The original paper announcing PaLM is available here: https://storage.googleapis.com/pathways-language-model/PaLM-paper.pdf
