AI ALIGNMENT FORUM

Ryan Kidd
  • Co-Executive Director at the ML Alignment & Theory Scholars (MATS) Program (2022-present)
  • Co-Founder & Board Member at London Initiative for Safe AI (2023-present)
  • Manifund Regrantor (2023-present)  |  RFPs here
  • Advisor, Catalyze Impact (2023-present)  |  ToC here
  • Advisor, AI Safety ANZ (2024-present)
  • Ph.D. in Physics at the University of Queensland (2017-2023)
  • Group organizer at Effective Altruism UQ (2018-2021)

Give me feedback! :)

Sequences

No sequences to display.

Posts (sorted by new)

  • 2 · Ryan Kidd's Shortform · 3y · 0 comments
  • 32 · SERI MATS Program - Winter 2022 Cohort · 3y · 0 comments
  • 25 · SERI ML Alignment Theory Scholars Program 2022 · 3y · 0 comments
  • 10 · Introduction to inaccessible information · 4y · 2 comments

Wikitag Contributions

  • MATS Program · 2y · (+255/-37)
  • MATS Program · 2y · (+14/-46)

Comments (sorted by newest)

On "What's the short timeline plan?" · Ryan Kidd · 8mo · 60

How would you operationalize a contest for short-timeline plans?