Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • lurker@awful.systems · 1 day ago

    This piece on how doomers and rationalists have made everything worse with their “AGI is nigh” shtick and ended up giving AI companies way more power than they should and getting chatbots into the military, where they will almost certainly fuck up and kill people

    • sc_griffith@awful.systems · 11 hours ago

      if we had made the podcast series on rationalists, their importance as useful idiots for billionaires was the structure i wanted to hang the whole thing on. so this is a gratifying read. that said i think the ideas here will be familiar to many stubsack readers

      The rationalist view of the world assumes, at some level, that the relevant actors are optimizing for well-understood, predictable variables and a clear understanding of what best serves their self-interest. What it cannot account for is bad faith, impulsiveness, ideological motivation untethered from evidence, random instances of force majeure, and personal whims and petty rivalries.

      i will go further and say that not accounting for such things is considered virtuous in rationalist ideology

      • YourNetworkIsHaunted@awful.systems · 11 hours ago

        It’s especially strange because becoming less prone to bias and developing a clear understanding of what serves your interest is so much of the pitch for Rationalism as a community/ideology/project. Like, here are unbearably long essays that promise to help cultivate the superpower of seeing the world clearly and acting in it effectively, but if you acknowledge that nobody outside this small set of group homes is actually doing that you’ll be shunned. And that’s not getting into how easily exploitable those assumptions of good faith are by bad-faith actors. It comes back to that quote from Scott that has stuck in my head apparently more than it did his: if you build a community based on the principle that you will absolutely never have a witch hunt, you will end up living among approximately seven principled civil libertarians and eleven million goddamn witches, and this is true even if you’re right that witch hunts are bad.

        • sc_griffith@awful.systems · 10 hours ago

          i think this is exactly why they had to come up with - or rather, misappropriate - the concept of coupled vs decoupled thinking. when they (especially the more, ahem, human-biodiversity-minded of them) fold ridiculous claims about what constitutes virtuous cognition into scientific- and sophisticated-sounding terminology, it makes those claims seem aligned with the broader sales pitch of rationalism

          also that scott quote is excellent. i hadn’t heard that one before

          • YourNetworkIsHaunted@awful.systems · 2 hours ago

            I actually dug up the context to make sure I wasn’t forgetting something horrific. It’s from a 2017 piece (CW: SSC link) back before he went mask-off, when he was firmly in the “I’m a liberal and I talk exclusively about how liberals and their institutions suck” useful-idiot phase of his career. The overall essay argues that the wing nuts have a point when they say that all so-called neutral institutions are actually secret communist indoctrinators that want to trans your children and take your guns. I’m paraphrasing, obviously; he believes/pretends that when they called these things left-wing they didn’t mean “literally in league with Stalin and the Devil”. However, in the middle of the usual beigeness he tries to maintain his air of neutrality by including a section on how bad Voat ended up being, which concludes with:

            The moral of the story is: if you’re against witch-hunts, and you promise to found your own little utopian community where witch-hunts will never happen, your new society will end up consisting of approximately three principled civil libertarians and seven zillion witches. It will be a terrible place to live even if witch-hunts are genuinely wrong.

    • gerikson@awful.systems · 17 hours ago

      it’s amusing to me that these nerds thought they could in any way affect policy even with a sane administration, not to mention this bugshit crazy one

      like I’ve said before, I’d be perversely happy if we managed to off ourselves by building the robot god. beats drowning in our own filth or blowing ourselves up

      • lurker@awful.systems · 8 hours ago

        I mean yeah, I guess in a competition between getting a bullet directly through my brain, getting all my limbs chainsawed off with my head last, and being drowned in boiling water, the bullet would win every time. Though the perversely funniest outcome would be if superintelligence turns out to be completely impossible and we fuck ourselves over with garbage-to-mediocre AI embedded in all our critical infrastructure.

    • YourNetworkIsHaunted@awful.systems · 14 hours ago

      That is what happens when your mode of analysis is closer to erotic Harry Potter fan fiction (which is indeed the medium in which Yudkowsky has delivered some of his prognostications)

      I was going to throw a point of order about not all fanfic being erotic, but given how they fetishize “intelligence” and “rationality” I can’t be sure that they don’t get off on that slog.

      • scruiser@awful.systems · 8 hours ago

        your mode of analysis is closer to erotic Harry Potter fan fiction

        To give Gary Marcus credit here, HPMOR may not be erotic, but many of Eliezer’s other works are erotic (or at least attempt to be), the most notable being Planecrash/Project Lawful which has entire sections devoted to deliberately bad (as in deliberately not safe, sane, consensual) bdsm.

        Eliezer tried to promote/hype up Project Lawful on twitter, maybe hoping it would be the next HPMOR, but it didn’t quite take. Maybe he failed to realize how much of HPMOR’s success came from being in the popular genre of Harry Potter fanfic (which at the time had crap like Partially Kissed Hero or Harry Crow among its most popular works), and not from his own genius writing.

        • zogwarg@awful.systems · 3 hours ago

          Also I think there’s enough manipulation fantasy in HPMOR, and enough lack of agency from Hermione, that it qualifies, in its own way, as implicitly erotic.

        • blakestacey@awful.systems · 7 hours ago

          I know I’ve said somewhere on here before that “Harry Potter for pop science nerds” is fanfiction on easy mode, but I’ll stand by it.