• 10 Posts
  • 37 Comments
Joined 1 year ago
Cake day: July 6th, 2023


  • TinyTimmyTokyo@awful.systems (OP) to SneerClub@awful.systems · "OK doomer"
    4 months ago

    I’m probably not saying anything you didn’t already know, but Vox’s “Future Perfect” section, of which this article is a part, was explicitly founded as a booster for effective altruism. They’ve also memory-holed the fact that it was funded in large part by FTX. Anything by one of its regular writers (particularly Dylan Matthews or Kelsey Piper) should be mentally filed into the rationalist propaganda folder. I mean, this article throws in an off-hand remark by Scott Alexander as if it’s just taken for granted that he’s some kind of visionary genius.

  • You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed, and took seriously, people like Katja Grace, Dan Hendrycks, and Eliezer Yudkowsky.

    I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

  • I haven’t read Scott’s comment sections in a long time, so I don’t know if they’re all this bad, but that one is a total dumpster fire: a hive of Trump stans, anti-woke circle-jerkers, scientific racists, and self-proclaimed Motte posters. It certainly reveals the current demographic and political profile of his audience.

    Scott has always tried to hide his reactionary beliefs, but I’ve noticed he’s letting the mask slip a bit more lately.

  • She seems to do this kind of thing a lot.

    According to a comment, she apparently claimed on Facebook that, due to her post, “around 75% of people changed their minds based on the evidence!”

    After someone questioned how she knew it was 75%:

    Update: I changed the wording of the post to now state: “Around 75% of people upvoted the post, which is a really good sign*”

    And the * at the bottom says: “Did some napkin-math guesstimates based on the vote count and karma. Wide error bars on the actual ratio. And of course this is not proof that everybody changed their mind. There’s a lot of reasons to upvote the post or downvote it. However, I do think it’s a good indicator.”

    She then goes on to explain that she made the Facebook post private because she didn’t think it should be reposted in places where it’s not acceptable to lie and make things up.

    Clown. Car.


  • What a bunch of monochromatic, hyper-privileged, rich-kid grifters. It’s like a nonstop frat party for rich nerds. The photographs and captions make it obvious:

    The gang going for a hiking adventure with AI safety leaders. Alice/Chloe were surrounded by a mix of uplifting, ambitious entrepreneurs and a steady influx of top people in the AI safety space.

    The gang doing pool yoga. Later, we did pool karaoke. Iguanas everywhere.

    Alice and Kat meeting in “The Nest” in our jungle Airbnb.

    Alice using her surfboard as a desk, co-working with Chloe’s boyfriend.

    The gang celebrating… something. I don’t know what. We celebrated everything.

    Alice and Chloe working in a hot tub. Hot tub meetings are a thing at Nonlinear. We try to have meetings in the most exciting places. Kat’s favorite: a cave waterfall.

    Alice’s “desk” even comes with a beach doggo friend!

    Working by the villa pool. Watch for monkeys!

    Sunset dinner with friends… every day!

    These are not serious people. Effective altruism in a nutshell.