Six months ago, we launched the Guild Beta. Our goal was -- and is -- to produce people who are impressive in everyday life. We are interested in real-life effectiveness, not armchair philosophy. This attitude is why, when people join the Guild and embark on our Rank Advancement program, we have them make a list of problems they encounter in their day-to-day lives. These problems serve as a guidepost for their Guild endeavors and help keep them anchored to reality.

In that spirit, one of the Council's chief concerns is feedback and iteration. We want to help, and we want to stay tightly bound to reality. Articulation improves clarity of thought, so we strive to be transparent in our activities. This article is a look back on the past six months of the Guild Beta and a look forward at what we want to do with the next six months.

13 comments

Have you produced any people who are impressive in everyday life?

[anonymous] · 2y · 90

Not yet! Our members generally like the courses and have benefited from them in smaller ways, but our grand vision of producing impressive people remains unfulfilled.

We've talked about this a fair bit internally. Measuring progress is hard, and that's part of why we're developing the Path system. True, it's not a straightforward "rationality quotient" test, but it's at least a semi-objective way to measure people's self-improvement by how many positive changes they've made in their lives.

Another difficulty we've faced is that we're fully online and don't demand that much time from our members -- only three hours a week, split evenly among a class meeting, assignment work, and a cohort meeting. Very early on we had to decide whether to be aggressive about demanding time investment from our members. We decided to keep the intensity fairly low. A boot-camp-style system might work better, but it comes with the risk of cult dynamics -- Leverage Research, anyone?

I strong upvoted you. Thank you for the direct, honest question.

Thanks for responding. Follow-up question:

Why do you believe that you’ll be able to produce people who are impressive in everyday life?

Note: I do not mean “what is your plan for…”. I mean: “why do you think you’ll succeed?” Taking the outside view, it seems overwhelmingly probable that you won’t—right?

Or, perhaps I should ask this first: do you think you’ll succeed? What is the probability that you will? (Again, this is an outside-view question.)

[anonymous] · 2y · 20

Taking the outside view, it seems overwhelmingly probable that you won’t—right?

Yes, this is straightforwardly correct. We're not the first group to attempt something like this. Dragon Army and CFAR have both tried to improve people, and neither met with much success (CFAR's workshops seem decent, from my outsider perspective, but are very limited in scope). Many other educational institutions have attempted something similar, and most have fared even worse. With that said --

I believe rationality can be transformative because it was for me. My life has improved drastically since discovering the Sequences, and I effected most of the change while entirely alone. Cases like mine are somewhat uncommon even among rationalists, but this suggests to me that more is possible. We're missing something.

(Yes I know the above is inside view, but it's a critical part of why I think the Guild has a chance of success.)

Still, the vision of self-improvement isn't what drives me to pour my time and energy into the Guild. I'm more here for the community aspect of "community based self improvement". Community is a lot easier to do, and the outside view is far less pessimistic there. People start new communities all the time! I think we're well on our way to success here through the cohort system, but we won't really know for sure until we cross 150 members.

The rest of the Council might want to weigh in; we each have our own reasons for joining the project and I know David and Alex are much more focused on the grand vision aspects.

I believe rationality can be transformative because it was for me. My life has improved drastically since discovering the Sequences, and I effected most of the change while entirely alone. Cases like mine are somewhat uncommon even among rationalists, but this suggests to me that more is possible. We’re missing something.

It seems to me that the most straightforward explanation for this is “different things work for different people, but it’s basically impossible to predict or control this” plus self-selection plus survivorship bias. Do you disagree?

Still, the vision of self-improvement isn’t what drives me to pour my time and energy into the Guild. I’m more here for the community aspect of “community based self improvement”. Community is a lot easier to do, and the outside view is far less pessimistic there. People start new communities all the time!

Can you say more about this? (In particular—though not exclusively—can you say how this interacts with the remote nature of your association?)

[anonymous] · 2y · 10

the most straightforward explanation for this is “different things work for different people, but it’s basically impossible to predict or control this” plus self-selection plus survivorship bias.

This contains three claims.

  1. Different self-improvement strategies work for different people
  2. It's infeasible to predict which strategy will work
  3. Cases of transformative rationality are explained by self-selection and survivorship bias

#1 is something we've talked about internally and it seems mostly true to me. The fractal nature of the existing self-help landscape definitely supports this, and it's common to hear things like "I tried a bunch of things, none of them worked until I did X". And then X is always different.

#2 might be true? I agree that predicting it through S2 is unlikely to work, but I think approximating a solution with S1 isn't impossible. On the other hand, the kinds of problems S1 tends to be good at involve short feedback loops, and self-improvement very much doesn't. So this may ultimately prove to be intractable --

-- but I don't think that's a dealbreaker. We don't need to know exactly which solution will solve a person's problem. It would be valuable just to have a list of the top N most common solutions, and then be able to tell people "yeah we don't know which one will work but there's an 80% chance one of them will, so just start at the top and work your way down". As a Council, we've talked a lot about the broad categories most people's problems fall into (we came up with social, learning, motivation, and resources) and we've been targeting most of our courses to teach skills in those areas. But that's getting into the weeds.
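(A quick illustration of the arithmetic behind that "80%" figure: if you assume each of the top N solutions independently works for a given person at some rate, the chance that at least one works is one minus the product of the failure rates. The rates in the sketch below are hypothetical placeholders, not Guild data.)

    # Minimal sketch: chance that at least one of the top-N common solutions
    # works, assuming (hypothetically) each succeeds independently at the
    # rates below. The numbers are invented for illustration only.
    assumed_success_rates = [0.30, 0.25, 0.20, 0.15, 0.10]

    p_none = 1.0
    for p in assumed_success_rates:
        p_none *= 1.0 - p  # probability that every solution so far has failed

    p_at_least_one = 1.0 - p_none
    print(f"P(at least one of {len(assumed_success_rates)} works) = {p_at_least_one:.0%}")
    # With these made-up rates this comes out to roughly 68%; hitting "80%"
    # would take either more solutions on the list or higher per-solution rates.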

#3 seems straightforwardly true. But... I'm not sure that matters? Sure, you hear about the successes and not the failures, but the point is that success is possible. That selection and survivor biases play a role just means that we can't tell people to read the sequences and expect that to be a panacea. And that's not surprising! I bounced off the sequences a couple times before they got hold of me. Rationality's marketing is awful, and the sequences themselves have a lot of issues.

Can you say more about [Guild-as-community]?

About a year ago, someone posted a link to the beta invitation in a rationalist Discord server I hang out in (the Bayesian Conspiracy). I clicked the link, thought the courses were interesting, and figured I could give them a try, take what was useful, discard the rest, and move on once I got bored. In Sabien's terms, I was playing the Black Knight.

The Black Knight is a mercenary; it has goals of its own, and it’s only allied with you because your goals align with them. This does not mean that the Black Knight is dishonorable; the Black Knight is perfectly aware of the instrumental value of its reputation, and of the power of mutually beneficial coordination, and it is every bit as honest as the White or Red Knights.

Shortly after I joined, the Council put out a call for graphic designers and website developers. I ended up designing the second iteration of their logo and building the site. Again, that was because it benefited me -- taking on those projects gave me much-needed practice, and it was an opportunity to make a bit of money while boosting my ingroup status. Alex ultimately offered me a seat on the Council as a result of my work, but I declined because I didn't want the responsibility that came with it.

It wasn't until the courses started and I met my cohort that my mindset started changing. I love all my cohort mates, but meeting Octavia (@Slimepriestess/Hivewired) in particular did a lot to make me see how valuable the Guild could be as a community. A few months later, I was thinking about getting back into World of Warcraft and I... kind of realized that the reason I wanted to play MMORPGs again was loneliness. With ~two years of rationality under my belt, ignoring the community-shaped hole in my heart wasn't an option anymore. I also realized that I was going to fill it either with a gaming guild or with the Guild.

So I decided to go all-in and stop holding back. I asked Alex if he still wanted me on the Council, he said yes, and here we are a couple months later. I've put a lot into the Guild at this point and I don't regret it. There's a part of me that really wants to take care of a community. Not like, acquire followers or "friends" or whatever, but to build a flourishing, healthy social web. And I'd also like to make it less insular, less walled-off, less fractured, more accessible. There's a divide in the greater community between rationalists and postrationalists and I think that's really sad and disappointing.

I don't know if ROSE can heal that rift and bring rationality to the masses, but... I'm not here for the grand Purpose. I'm here because I like what the Guild is doing right now and because I'm better off with it in my life than without. I'm here for every step of the journey.

can you say how this interacts with the remote nature of your association?

Being remote has a number of advantages:

  • Accessibility. Rationalists are widely dispersed, and because we're not tied to a single city, it's much easier for new people to join.
  • Better dynamics. Most cults form in person because, when you're physically co-located, an organization can exert a LOT of pressure on you simply by taking over your physical space. As an online org, we don't have to worry about that, at least not as much.
    • Of course, "better dynamics" is only one frame. What's really going on here is that we have less power over our members, which means that, yes, we can't do toxic cult stuff, but it also means we can't push people as hard as they might want. Tradeoffs.
  • Smoother on-ramp. Joining an in-person community can be really intimidating. It's a lot easier to pitch "hey, join our Discord server" than it is to pitch "come to our meetup".
    • Especially if you've heard of "rationalist meetups" and have a bunch of negative associations.

The big disadvantage everyone thinks of is something along the lines of "yeah, but online relationships aren't real", which... can be true, but isn't if you do them right. At least for me, it's a lot easier to be genuine and honest in textual interactions. Moreover, online-only seems to be working fine for us. A few months ago we ran an informal retrospective survey of our members, and something that came up repeatedly was that people really liked their cohorts.

Being remote also makes it a lot easier to interact with the Guild. Anyone can ping a Councilor and ask them questions directly. Anyone can, at any time, post something in a cohort chat without having to wait for the next meetup. Since Discord is the primary communication medium instead of a supplement, it's much more likely that people will be active and respond.

I don't have hard research on this, but my gut feeling from what I've read over the past few years is that fully in-person and fully remote are both viable, but hybrid or transitional arrangements tend to have a lot of problems because the two paradigms are poorly integrated.

But ultimately, we don't have a choice. If the Guild had to be in-person, it'd never have gotten started (not enough people). Even now, we don't have enough members in any one place to sustain an informal meetup, let alone structured courses.

#3 seems straightforwardly true. But… I’m not sure that matters? Sure, you hear about the successes and not the failures, but the point is that success is possible. That selection and survivor biases play a role just means that we can’t tell people to read the sequences and expect that to be a panacea.

Well, if this is true, it also means that you’re unlikely to reliably succeed at deliberately making success happen (that is, you’re almost certain to succeed in some small fraction of cases, and just as certain to fail in the greater part). Of course, such things can be true to a greater or lesser degree, and I don’t want to push this point too hard.

Re: the Guild as community:

What do you think of the notion (which has been brought up by me, and others, at various times in the past) that a significant cause of many of the problems that plague the Bay Area rationalist communities is precisely the conflation of the community and the task/project group? Do you think this view is basically wrong, or that it’s basically correct but it won’t be a problem for you (for whatever reason), or something else?

[anonymous] · 2y · 10

What do you think of the notion that a significant cause of many of the problems that plague the Bay Area rationalist communities is precisely the conflation of the community and the task/project group?

It seems plausible, but mostly I think I'm unqualified to have an opinion, since I've neither researched nor interacted with that community. However... I'm very skeptical that craft/community mixing is inherently bad. Could you elaborate on why you think this is the case for the Bay Area, or link me to something that explains the reasoning a bit more?

Hmm, I’d have to dig up links, which I don’t have handy; I was hoping you’d already be familiar with this debate. Ah well.

I guess I’ll just say that past experience and observation leads me to be very skeptical of your “mixed approach”, so to speak.

Hi! CEO here. 

I would reframe the conversation by saying that everyone already has things about them that are impressive, but they also have weaknesses that prevent them from fulfilling their individual potential. The approach ROSE is taking is to help people shore up their weaknesses, leaving them only with strengths. On this front, we have already begun to succeed.

We have members who have made significant life decisions with the help of our Practical Decision Making course. We have helped people learn how to learn new skills more efficiently with our Meta-Learning course. We have helped people learn how to build stronger relationships, faster.

We have built ROSE to learn from the previous failed attempts at this. We have inoculated ourselves against the cultishness of the likes of Leverage. We have made ourselves more accessible and useful than the fleeting, expensive, and geographically bound CFAR workshops.

Even if we dissolve ROSE tomorrow, I already consider it a success that others can continue to learn from and iterate on.

Thanks for the response.

We have built ROSE to learn from the previous failed attempts at this. We have inoculated ourselves against the cultishness of the likes of Leverage. We have made ourselves more accessible and useful than the fleeting, expensive, and geographically bound CFAR workshops.

I think it would be very useful for us to hear more about these things—they sound like they might be important components of an answer to my outside-view question.

Please don't hide your work behind a surprise pop-up that asks for an email address. I'm sure you have great stuff, but 'beware trivial inconveniences'.