AE Studio is launching a short, anonymous survey for alignment researchers in order to develop a stronger model of various field-level dynamics in alignment.
This appears to be an interestingly neglected research direction that we believe will yield specific and actionable insights related to the community’s technical views and more general characteristics.
The survey is a straightforward 10-15 minute Google Form with some simple multiple choice questions.
For every alignment researcher who completes the survey, we will donate $40 to a high-impact AI safety organization of your choosing (see specific options on the survey). We will also send each alignment researcher who wants one a customized report that compares their personal results to those of the field.
Together, we hope to not only raise some money for some great AI safety organizations, but also develop a better field-level model of the ideas and people that comprise alignment research.
We will open-source all data and analyses when we publish the results. Thanks in advance for participating and for sharing this around with other alignment researchers!
Survey full link: https://forms.gle/d2fJhWfierRYvzam8
I timed how long it took me to fill in the survey. It took 30 min. I could probably have done it in 15 min if I had skipped the optional text questions. This is to be expected, however: every time I've seen someone guess how long it will take to respond to their survey, the estimate has been off by a factor of 2-5.
Note: The survey took me 20 mins (but also note selection effects on leaving this comment)
Definitely good to know that it might take a bit longer than we had estimated from earlier respondents (with the well-taken selection effect caveat).
Note that if it takes 10-20 minutes to fill out, this still works out to donating $120-240/researcher-hour to high-impact alignment orgs (plus whatever value the comparison of one's individual results to those of the community provides), which hopefully is worth the time investment :)