• 1 Post
  • 10 Comments
Joined 1 year ago
Cake day: July 17th, 2023

  • HN:

    Also - using soylent, oculus and crypto to paint Andreessen as a bad investor (0 for 3 as he says) is a weird take. Come on - do better if you’re going to try and take my time.

    Reading comprehension is hard. The article actually says “Zero for three when it comes to picking useful inventions to reorder life as we know it, that is to say, though at no apparent cost to his power or net worth.” It’s saying he’s a good investor in the sense of making money, but a bad investor in the sense of picking investments that change the world. Rather telling that the commenter can’t seem to distinguish between the two.

    Good article, excited for part 2.

  • From the comments:

    Effects of genes are complex. Knowing a gene is involved in intelligence doesn’t tell us what it does and what other effects it has. I wouldn’t accept any edits to my genome without the consequences being very well understood (or in a last-ditch effort to save my life). … Source: research career as a computational cognitive neuroscientist.

    OP:

    You don’t need to understand the causal mechanism of genes. Evolution has no clue what effects a gene is going to have, yet it can still optimize reproductive fitness. The entire field of machine learning works on black box optimization.

    Very casually putting evolution in the same category as modifying my own genes one at a time until I become Jimmy Neutron.

    Such a weird, myopic way of looking at everything. OP didn’t appear to consider the downsides brought up by the commenter at all, and just plowed straight on through to “evolution did it without understanding, so we can too.”

  • The cool thing to note here is how badly Yud misunderstands what a normal person means when they say they have “100% certainty” in something. We’re not fucking infinitely precise Bayesian machines; 100% means exactly the same thing as 99.99%. It means exactly the same thing as “really really really sure.” A conversation between the two might go like this:

    Unwashed sheeple: Yeah, 53 is prime. 100% sure of that.

    Ellie Bayes-er: (grinning) Can you really claim to be 100% sure? Do not make the mistake of confusing the map with the territory, [5000 words redacted]

    Unwashed sheeple: Whatever you say, I’m 99% sure.

    Eddielazer remains seated, triumphant in his belief (epistemic status: 98.403% certainty) that he has added something useful to the conversation. The sheeple walks away, having changed exactly nothing about his opinion.