You are viewing revision 1.9.0, last edited by Miguel de Guzman

Archetypal Transfer Learning (ATL) is a proposal by @whitehatStoic for what is argued by the author to be a fine tuning approach that "uses archetypal data" to "embed Synthetic Archetypes". These Synthetic Archetypes are derived from patterns that models assimilate from archetypal data, such as artificial stories. The method yielded a shutdown activation rate of 57.33% in the GPT-2-XL model after fine-tuning. 

The team, consisting of @MiguelDev, @marc/er, @Abhay Chowdhry, and Mazianni is working to improve the build to a 100%. 

 ...

(Read More)

Posts tagged Archetypal Transfer Learning