AI ALIGNMENT FORUM
Capability Phase Transition Examples

by gwern
8th Feb 2022
This is a linkpost for https://www.reddit.com/r/mlscaling/comments/sjzvl0/d_instances_of_nonlog_capability_spikes_or/
Comment by A Ray:

It's probably worth going through the current deep learning theories that propose parts of gears-level models and seeing how they fit with this. The first one that comes to mind is the Lottery Ticket Hypothesis. It seems intuitive to me that certain tasks correspond to some "tickets" that are harder to find.

I like the taxonomy in Viering and Loog, and it links to a bunch of other interesting approaches.

This paper shows phase transitions in data quality as opposed to data size, which is an angle I hadn't considered before.

There's the Google paper explaining neural scaling laws that describes two regimes that can be transitioned between: variance-limited and resolution-limited. Their theory seems to predict that behavior between the two is similar to a phase boundary.

I think there should also be a bit of a null hypothesis. It seems like there are simple functional maps where, even if the internal improvement on "what matters" (e.g. feature learning) is going smoothly, our metric of performance is "sharp" in a way that hides the internal improvement until some transition, after which it doesn't.

Accuracy metrics seem like an example of this -- you get 1 point if the correct answer has the highest probability, otherwise 0 points. It's easy to understand why this has a sharp transition in complex domains.
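A minimal toy sketch of that point (hypothetical numbers, not from the post): suppose a model's probability on the correct answer improves smoothly. A smooth metric like log-loss improves smoothly with it, but top-1 accuracy stays at 0 until the correct class overtakes the best distractor, then jumps to 1.

```python
import math

def metrics(p_correct):
    """Toy 3-class example: smooth log-loss vs. sharp 0/1 accuracy.

    Assumes (arbitrarily) that the remaining probability mass is split
    unevenly between two distractors, 70/30.
    """
    p_best_distractor = (1 - p_correct) * 0.7
    log_loss = -math.log(p_correct)                   # smooth in p_correct
    acc = 1 if p_correct > p_best_distractor else 0   # sharp 0/1 jump
    return log_loss, acc

for p in [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]:
    ll, acc = metrics(p)
    print(f"p_correct={p:.1f}  log_loss={ll:.2f}  accuracy={acc}")
```

As `p_correct` creeps up, log-loss falls smoothly at every step, while accuracy flips from 0 to 1 in a single step (here between 0.4 and 0.5) -- the metric is sharp even though the underlying quantity is not.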

Personal take: I've been spending more and more time thinking about modularity, and it seems like modularity in learning could drive sharp transitions (e.g. "breakthroughs").
