• 18 Posts
  • 425 Comments
Joined 2 years ago
Cake day: August 13th, 2023

  • stumbled across an ai doomer subreddit, /r/controlproblem. small by reddit standards, 32k subscribers, which I think translates to less activity than here.

    if you haven’t looked at it lately, reddit is still mostly pretty lib with rabid far right pockets. but after luigi and the trump inauguration it seems to have swung left pretty significantly, and in particular the site is boiling over with hatred for billionaires.

    the interesting bit about this subreddit is that it follows this trend. for example

     Why Billionaires Will Not Survive an AGI Extinction Event: As a follow up to my previous essays, of varying degree in popularity, I would now like to present an essay I hope we can all get behind - how billionaires die just like the rest of us in the face of an AGI induced human extinction... I would encourage anyone who would like to offer a critique or comment to read the full essay before doing so. I appreciate engagement, and while engaging with people who have only skimmed the sample here on Reddit can sometimes lead to interesting points, more often than not, it results in surface-level critiques that I’ve already addressed in the essay. I’m really here to connect with like-minded individuals and receive a deeper critique of the issues I raise - something that can only be done by those who have actually read the whole thing... Throughout history, the ultra-wealthy have insulated themselves from catastrophe. Whether it’s natural disasters, economic collapse, or even nuclear war, billionaires believe that their resources—private bunkers, fortified islands, and elite security forces—will allow them to survive when the rest of the world falls apart. In most cases, they are right. However, an artificial general intelligence (AGI) extinction event is different. AGI does not play by human rules. It does not negotiate, respect wealth, or leave room for survival. If it determines that humanity is an obstacle to its goals, it will eliminate us—swiftly, efficiently, and with absolute certainty. Unlike other threats, there will be no escape, no last refuge, and no survivors.

    or the comments under this

    Under Trump, AI Scientists Are Told to Remove ‘Ideological Bias’ From Powerful Models A directive from the National Institute of Standards and Technology eliminates mention of “AI safety” and “AI fairness.”

    comments include "So no more patriarchy?" and "This tracks with the ideological rejection of western values by the Heritage Foundation's P2025 and their Dark Enlightenment ideals. Makes perfect sense that their orders directly reflect Yarvin's attacks on the "Cathedral". "

    or the comments on a post about how elon has turned out to be a huge piece of shit because he’s a ketamine addict

    comments include "Cults, or to put it more nicely all-consuming social movements, can also revamp personality in a fairly short period of time. I've watched it happen to people going both far right and far left, and with more traditional cults, and it looks very similar in its effect on the person. And one of ketamine's effects is to make people suggestible; I think some kind of cult indoctrination wave happened in silicon valley during the pandemic's combo of social isolation, political radicalism, and ketamine use in SV." and "I can think of another fascist who used amphetamines, hormones and sedatives."

    mostly though they’re engaging in the traditional rationalist pastime of giving each other anxiety

    cartoon. a man and a woman in bed. the man looks haggard and is sitting on the edge of the bed, saying "How can you think about that with everything that's going on in the field of AI?"

    Comment from EnigmaticDoom: Yeah it can feel that way sometime... but knowing we probably have such a small amount of time left. You should be trying to enjoy every little sip left that you got rather than stressing ~

  • I think to understand why this is concerning, you need enough engineering mindset to understand why a tiny leak in a dam is a big deal, even though no water is flooding out today or likely to flood out next week.

    he certainly doesn’t have such a mindset himself, and I am not convinced that he knows why a tiny leak in a dam is a big deal, nor am I convinced that it is necessarily a big deal. for example, with five seconds of searching:

    All earth dams leak to some extent and this is known as seepage. This is the result of water moving slowly through the embankment and/or percolating slowly through the dam’s foundation. This is normal and usually not a problem with most earthen dams if measures are taken to control movement of water through and under the dam.

    https://damsafety.org/dam-owners/earth-dam-failures

    one would suspect a concrete dam leaking is pretty bad. but I don’t actually know without checking. there’s relevant domain knowledge I don’t have, and no amount of “engineering mindset” will substitute for me engaging with actual experts with actual knowledge


  • I listened to ezra klein’s podcast sometimes before he moved to the NYT, and thought it was occasionally interesting. every time I’ve listened to an episode since he moved, it’s been some of the most credulous shit I’ve ever heard

    like, there was one episode where he interviewed a woman whose shtick was spending the whole time talking in what I can loosely call subtext about how she fucked an octopus. she’d go on about how they were ‘tasting each other’ and their ‘fluids were mingling’ and such and he’d just be like wow what a fascinating encounter with an alien intelligence. this went on for an hour and at no point did he seem to have a clue what was going on




  • According to The Information, SoftBank CEO Masayoshi Son is planning to borrow $16 billion to invest in AI, and may borrow another $8 billion next year. The following points are drawn from The Information’s reporting, and I give serious props to Juro Osawa and Cory Weinberg for their work.

    SoftBank currently only has $31 billion in cash on its balance sheet as of December. Its net debt — which, despite what you think, doesn’t measure total debt but rather represents its cash minus any debt liabilities — stands at $29 billion… They plan to use the loan in question to finance part of their investment in OpenAI and their acquisition of chip design firm Ampere.

    According to SoftBank’s reported assets, their holdings are worth about $219 billion (33.66 trillion yen), including stock in companies like Alibaba and ARM.

    am I reading this correctly: softbank has $50 billion in debt, equal to about 25% of their total assets? is that… normal? these are genuine questions — I’m not sure whether I’m misunderstanding something or whether this is actually usual
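
    fwiw, here’s a back-of-envelope check of the quoted figures. this is a rough sketch, not a claim about softbank’s actual books: it assumes the conventional definition of net debt (total debt minus cash — the quoted passage words it the other way around, which reads like a slip), under which gross debt works out to roughly $60 billion, in the same ballpark as the ~$50B / ~25% reading above.

    ```python
    # back-of-envelope check of the SoftBank figures quoted above
    # (all numbers in $ billions, as reported by The Information;
    # assumes net debt = total debt - cash, the conventional definition)
    cash = 31        # cash on the balance sheet as of December
    net_debt = 29    # reported net debt
    assets = 219     # reported value of holdings (Alibaba, ARM, etc.)

    gross_debt = net_debt + cash   # implies roughly $60B of total debt
    ratio = gross_debt / assets    # debt as a share of reported assets
    print(f"gross debt ≈ ${gross_debt}B, about {ratio:.0%} of reported assets")
    ```

    which prints something like “gross debt ≈ $60B, about 27% of reported assets” — close to the ~25% figure in the question, so the reading seems about right, whatever the answer to “is that normal” turns out to be.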