Inadequate Equilibria

My rating: 9/10
An interesting book which explores the efficiency and adequacy of systems, and presents a case against modest epistemology in favor of probability theory and decision theory.

Summary

This book can be seen as having two interwoven themes. The first explores and assesses the efficiency and adequacy of systems; the second rejects modest epistemology in favor of other frameworks for setting your confidence in your beliefs. Along the way it exposes some of the forces that beget Moloch and offers some wisdom for designing more adequate systems.

The core thesis of the first theme is that systems which persistently produce undesirable outputs are typically either inexploitable or inadequate. Inexploitable systems behave poorly because the people who know how to make them run properly have no way to capitalize on that knowledge. Inadequate systems behave poorly because they are set up to optimize for the wrong utility function.

The core thesis of the second theme is that you can and should use probability theory and decision theory to form your beliefs. Allow yourself high confidence in the result, but also master changing your mind when new data calls for it.

What I Got Out of It

I thoroughly enjoyed this book and would recommend it to anyone interested in human coordination. It gave me a much deeper appreciation for the role incentives play in determining the output of systems, and why broken systems are so hard to fix.

It can be easy to see something divinely ordained in why systems operate the way they do, but going forward I’ll have a better understanding of how a system can be efficient and yet still end up inadequate for some task.

This is especially true of multi-factor markets, where multiple people would each need to defy their own incentives before the system improved. “In the same way that inefficient markets tend systematically to be inexploitable, grossly inadequate systems tend systematically to be unfixable by individual non-billionaires.” This is typically where bad incentives bottom out.

The discussion of modest epistemology was interesting. Yudkowsky offers tools for justifying high confidence in your beliefs even when they appear to defy the experts on a subject. The idea is that while it may be unreasonable to generate beliefs that defy experts in a field you aren’t an expert in, it’s much more reasonable to synthesize the beliefs of experts, update their relative likelihoods based on your experience and background, and hold high confidence in the conclusions of that process.
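
As a minimal sketch of that synthesis step (my own illustration, not something from the book), you can treat each expert position as a hypothesis, set priors from whatever you know about track records, and apply Bayes’ rule to evidence you’re actually positioned to evaluate. The hypothesis names and all the numbers below are made up:

    # Hypothetical illustration: weighing two expert positions as Bayesian
    # hypotheses and updating on a piece of evidence you can evaluate yourself.
    # The positions, priors, and likelihoods are invented for the example.

    priors = {
        "expert_A_is_right": 0.6,  # prior weight, e.g. from track record
        "expert_B_is_right": 0.4,
    }

    # P(observed evidence | that position being right) -- your own assessment,
    # informed by your experience and background.
    likelihoods = {
        "expert_A_is_right": 0.2,  # position A predicts this evidence poorly
        "expert_B_is_right": 0.7,  # position B predicts it well
    }

    # Bayes' rule: posterior is proportional to prior * likelihood, normalized.
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    posteriors = {h: p / total for h, p in unnormalized.items()}

    for hypothesis, p in posteriors.items():
        print(f"{hypothesis}: {p:.2f}")
    # expert_A_is_right: 0.30
    # expert_B_is_right: 0.70

The point isn’t the arithmetic; it’s the stance. Rather than out-thinking the experts from scratch, you’re allocating probability among positions the experts already hold.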

Key Takeaways

  • Efficiency: Microsoft’s stock price is neither too low nor too high relative to anything you can possibly know about it
  • Inexploitability: You can’t make a profit by short-selling overpriced houses, so housing prices can stay too high; the market is inexploitable even where it’s inefficient.
  • Adequacy: “at least there’s no well-known but unused way to save ten thousand lives for just ten dollars each, right? Somebody would have picked up on it! Right?!”
  • “The academic and medical system probably isn’t that easy to exploit in dollars or esteem, but so far it does look like maybe the system is exploitable in SAD innovations, due to being inadequate to the task of converting dollars, esteem, researcher hours, etc. into new SAD cures at a reasonable rate—inadequate, for example, at investigating some SAD cures that Randall Munroe would have considered obvious, or at doing the basic investigative experiments that I would have considered obvious. And when the world is like that, it’s possible to cure someone’s crippling SAD by thinking carefully about the problem yourself, even if your civilization doesn’t have a mainstream answer.”
  • “There’s a whole lot more to be said about how to think about inadequate systems: common conceptual tools include Nash equilibria, commons problems, asymmetrical information, principal-agent problems, and more. There’s also a whole lot more to be said about how not to think about inadequate systems.”
  • Sources of inadequacy:
    • Decisionmakers who are not beneficiaries
    • Asymmetric information
    • Nash equilibria that aren’t the best Nash equilibrium, let alone Pareto-optimal (see the sketch after this list)
  • Coming up with and implementing a solution to an adequacy problem is much more difficult than identifying it
  • “If no hospital offers statistics, then you have no baseline to compare to if one hospital does start offering statistics. You’d just be looking at an alarming-looking percentage for how many patients die, with no idea of whether that’s a better percentage or a worse percentage. Terrible marketing! Especially compared to that other hospital across town that just smiles at you reassuringly.”
  • “This brings me to the single most obvious notion that correct contrarians grasp, and that people who have vastly overestimated their own competence don’t realize: It takes far less work to identify the correct expert in a pre-existing dispute between experts, than to make an original contribution to any field that is remotely healthy.”
  • Advice
    • Make sure to store data points and reflect on them
    • Bet real money. Helps a lot with learning.
  • “This fits into a very common pattern of advice I’ve found myself giving, along the lines of, “Don’t assume you can’t do something when it’s very cheap to try testing your ability to do it,” or, “Don’t assume other people will evaluate you lowly when it’s cheap to test that belief.””
  • If you always win, you’re doing something wrong.
  • “If a position calls for a leader of men, you often find a leader of men. If it instead requires super high levels of another skill, whether it’s coding, raising money, lifting weights, intricate chemistry or proving theorems, you’ll find that. However, if you need rare levels of such skills compared to what you can offer, you won’t select for anything else. You can’t demand ordinary competence in insufficiently important areas. There aren’t enough qualified applicants. Plus it wouldn’t be worth the distraction.”
  • Do more of the thing that is working!
  • “As Tetlock puts it in a discussion of the limitations of the fox/hedgehog model in the book Superforecasting: “Models are supposed to simplify things, which is why even the best are flawed. But they’re necessary. Our minds are full of models. We couldn’t function without them. And we often function pretty well because some of our models are decent approximations of reality.””
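
To make the bullet on bad Nash equilibria concrete, here is a toy coordination game (my own example, not from the book) with two pure-strategy equilibria. The payoffs are invented; the point is that the worse equilibrium is still stable, because no single player gains by deviating alone:

    # Hypothetical illustration: a two-player coordination game with two pure
    # Nash equilibria, one of which is worse for both players -- a "bad"
    # equilibrium no player can escape unilaterally. Payoffs are made up.

    # payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
    payoffs = {
        ("good", "good"): (3, 3),  # everyone coordinates on the better norm
        ("good", "bad"):  (0, 1),  # deviating alone makes you worse off
        ("bad",  "good"): (1, 0),
        ("bad",  "bad"):  (1, 1),  # everyone stuck on the worse norm
    }
    actions = ["good", "bad"]

    def is_nash(row, col):
        """A profile is a Nash equilibrium if neither player gains by
        unilaterally switching actions."""
        r, c = payoffs[(row, col)]
        row_ok = all(payoffs[(alt, col)][0] <= r for alt in actions)
        col_ok = all(payoffs[(row, alt)][1] <= c for alt in actions)
        return row_ok and col_ok

    for row in actions:
        for col in actions:
            if is_nash(row, col):
                print((row, col), payoffs[(row, col)])
    # ('good', 'good') (3, 3)  <- Pareto-optimal equilibrium
    # ('bad', 'bad') (1, 1)    <- stable, but worse for both players

Both profiles print as Nash equilibria. The (bad, bad) outcome is exactly the kind of stuck state the book describes: escaping it takes coordinated movement, not individual initiative.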

Other Reviews