Does anyone here know what exactly happened to lesswrong to make it so cult-y? I hadn't seen or heard anything about it for years. Back in my day it was seen as that funny website full of strange people posting weird shit about utilitarianism; nothing cult-y, just weird. The article on TREACLES and this sub’s mentions of lesswrong made me very curious about how it went from people talking out of their ass for the sheer fun of “thought experiments” to a straight-up doomsday cult.
The one time I read lesswrong was probably in 2008 or so.

  • TerribleMachines@awful.systems · 10 months ago

    Only half joking: there was this one fanfic you see…

    Mainly I don’t think there was any one inciting incident beyond its creation: Yud was a one-man cult way before LW, and the sequences actively pushed all the cultish elements required to lose touch with reality. (Fortunately, my dyslexic ass only got as far as the earlier bits he mostly stole from other people, rather than the really crazy stuff.)

    There was definitely a step-change around the time CFAR was created; it was basically a recruitment mechanism for the cult, and part of the reason I got anywhere physically near those rubes myself. An organisation made to help people be more rational seemed like a great idea, except it literally became EY/MIRI’s personal sockpuppet. They would get people together in these fancy-ass mansions for their workshops and then tell them nothing other than AI research mattered. I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky. I don’t think it’s a coincidence that most of the really crazy cult stuff I’ve heard about happened after then.

    Not that bad stuff didn’t happen before then, either.

    • skillissuer@discuss.tchncs.de · 10 months ago

      I think it was 2014/15 when they decided internally that CFAR’s mission was to create more people like Yudkowsky

      the real AI doom is Eliezer cloning facility

  • swlabr@awful.systems · 10 months ago

    prefacing this with IANALCUM (i am not a legit cult understanding mechanism)

    I imagine it has been a cult from the start, or at least that the primordial soup of factors, before any of this hit the internet in earnest, had all the right ingredients for a cult:

    • A leader claiming nigh-omnipotence, with some version of high charisma among potential followers (Yud, who, despite everything, is charismatic within the confines of the ratsphere)
    • A framework of lore (in this case rationalism) into which recruits are indoctrinated and taught how to think
    • An upwards power structure in said lore that concentrates authority at the apex of the hierarchy (IQ as the intelligence metric; blindly doing what higher-IQ people say)
    • Purity tests that create a positive feedback cycle reinforcing adherence to doctrine (either you believe Yud about many worlds/AGI/whatever and are on the road to smartness, or you don’t and are cast out of the ratsphere)

    I mean, the list goes on. As to when it became culty? To use a trusty thought experiment: it’s the paradox of the heap.