In everyday life, this kind of thing is somewhat of a rare occurrence. But it doesn't have to be.

The double-crux game originated at CFAR.

I first learned about the Double-Crux game when McKenzie Amodei and Andrew Critch of the Center for Applied Rationality taught a workshop on it at EA Global 2015 in Mountain View. The idea is to get you and someone you disagree with into a position where, hopefully, one of you will be able to change your mind. It is genuinely one of the most awesome experiences I've ever had, and it is well worth the effort involved in really and truly thinking things through.

- First, you need a friend who's willing to do this with you. It only works if you're both intellectually honest, willing to change your mind if presented with sufficient evidence, and excited about making progress toward the truth, even if that means admitting your current view is incorrect.
- Second, you need to identify an intellectual disagreement between the two of you that you'd like to focus on. The process doesn't always work, but if it does, one of you will change your mind on this issue. The first time you try this, it will work best with a binary proposition of the form A or ~A. But once you get the hang of it, you can also use it for non-binary disagreements, like one of you believing A with 90% confidence while the other believes it with 60% confidence.
- Third, you might want some paper to write down your thoughts. It's not strictly necessary if you have a good memory, but it's definitely helpful, at least the first time you attempt it.

To start, write out statement A, for which the two of you disagree. For your first attempt at this game, one of you should think A is true; the other should think A is false. Next, each of you should look for cruxes of your position on A: statements that, if you changed your mind about them, would change your mind about A. Your goal is to find the exact same statement B that serves as a crux for both you and your partner.

From Richard Acton.

If you've done this correctly, then you will find that there are now two statements that the two of you disagree on. You think both A and B are true, while your partner thinks both A and B are false. Further, you honestly believe that, if B were false, then A would be false, too -- and your partner believes that, if B were true, then A would be true as well. This means that if either of you were to change your mind about B, then that person would also change their mind about A.

Now we can recursively go through the above steps, replacing A with B. The two of you find a double crux for B which will be called C. Then a double crux of C called D. And so on. At each step, write down the double crux statement. Each time you do so, make sure it is a genuine double crux. This doesn't work if you grudgingly accept a statement as a crux; it only works if you honestly believe that the truth value of the crux is causally related to the truth value of the former statement.

Eventually, the two of you may reach a double crux statement Z for which data can be looked up. This is the goal of the cooperative game. It should be an issue that is easy for the two of you to agree upon: perhaps you can look it up directly on Wikipedia, compute it in Wolfram Alpha, or simulate it in Guesstimate. Once this happens, the two of you will agree on statement Z. For one of you, this means that you have effectively changed your mind on Z.

Because Z was a crux of Y, that person should also change their mind on Y. And X, W, V, U, etc., all the way to A.

If you are both intellectually honest about what really constitutes a double crux each time, then once you reach the end condition of finding a checkable statement Z, one of you will correspondingly change your mind about A.
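The domino effect can be sketched in a few lines of Python. This is an illustrative toy (the function and variable names are made up for this sketch), which assumes every link in the chain really was a genuine double crux:

```python
# Toy model of the double-crux chain A, B, C, ..., Z.
# Assumes every adjacent pair really is a genuine double crux,
# so each statement's truth value must match the next one's.

def resolve_chain(chain, z_is_true):
    """Propagate the looked-up truth value of the final statement Z
    back up through every earlier statement in the chain."""
    return {statement: z_is_true for statement in chain}

chain = ["A", "B", "C", "Z"]

# Suppose the two players look up Z and find that it is false:
beliefs = resolve_chain(chain, z_is_true=False)
# Every earlier statement falls with it: C, B, and finally A are now false.
```

The sketch makes the key assumption visible: the back-propagation is only valid because each statement was accepted as a genuine double crux of the one before it.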

The experience of changing one's mind through this method is absolutely breathtaking. Being able to watch as your brain goes through each step, legitimately changing your mind on increasingly complex lemmas until you reach the main statement of disagreement, is awe-inducing. It is one of my favorite feelings, and I highly recommend trying it out for yourself.

Unfortunately, it doesn't always work. I've often gotten into situations where my partner and I can each find a crux, but not a double crux. And sometimes when working with people new to the double crux game, we run into the issue of someone claiming something is a crux when it actually turns out not to be. But even in these "failures", I've found the double crux game to be an immensely rewarding experience, because it usually helps both sides understand each other's perspectives much better.

Amodei lists a few additional hints on how to play the game:

- It can be easier to look for your individual single cruxes first, then check with your partner to see if one of them might be a double crux.
- You can find your single crux by asking "what things could, in principle, change my mind on this topic?"
- Cruxes are better when you make the claim concrete and specific. Vague claims might be legitimate cruxes, but it will be very difficult to recurse on something too vague. Be specific; quantify your claim; put a probability on your predictions. That way you won't get stuck, unable to find a double crux for a statement that is too vague.
- Once you understand how to do this with a boolean statement, feel free to move on to more complex disagreements. If you believe A is 90% likely while your partner believes it is 60% likely, that disagreement can be resolved by the double crux game as well.

From Regex.

Richard Acton has created a presentation that gives a great example of two players playing the double-crux game. It's short, but it can be helpful to see a real-life example of finding a double crux from a statement of disagreement.

Once you're familiar with the double crux game, PeteMichaud points out that you can use the game to resolve cognitive dissonance by examining your internal epistemic rationality, or even to build deep, sustainable caring by playing between the part of you that thinks some part of the world matters and the parts of you that are afraid to look in that direction. Of course, this requires that you first know how to notice and parse out these parts of yourself, and it might not be helpful to people who believe they have an intuitive grasp of value. But for those of us who are moral anti-realists that wish to construct value anyway, this seems like a promising technique to get the entirety of your brain on board with a consistent ethical position.

If you want to learn more about the double-crux game, or if you want to experience it first hand in a setting with others who are similarly interested in better navigating intellectual disagreement, you may be interested in going to one of the Center for Applied Rationality's workshops. Or, you might want to invite me to work through a mutual disagreement. Regardless, I heartily recommend playing the Double-Crux game. It's well worth the effort needed to work through each level with a trusted friend.

EDIT (Dec 2016): Duncan Sabien has posted a high-quality introduction to Double Crux on LessWrong. I highly recommend that people not only read Duncan's post, but also the several high-quality comments left by users of LW.

EDIT (July 2018): Richard J. Acton has commented below with a link to an updated prezi he's created that has some additional rationality techniques added. On LessWrong, deluks917 has posted a concrete multi-step variant of double crux. Both are worth checking out.

Awesome!

I'm down to play! I'm just gonna say go ahead and book a skype call with me, anyone who wants to: https://malcolmocean.youcanbook.me/

We can do this for life insurance decisions.

If anyone wants to play the double crux game with me, I could use the practice!

I'd be interested

"Your goal is to find the exact same statement B that serves as a crux for both you and your partner"

DeleteI didn't read the preceding sentences carefully enough the first time, so for a second I thought the goal was to find B such that one person thinks B <=> A and the other person thinks B <=> !A. Or to eventually find some analogous Z.

I think that would also be a very interesting game, if a very different one.

It might be interesting to play this with the proposition *about* the double crux game.

Let's.

That sounds fun, poke me sometime to videochat

Sounds fun/interesting. I am down to try!

Do you have some semi-controversial ideas brainstormed already? You might pique more targeted, action-ready interest if you share examples.

I'm going to do this!

I'd try it.

(Sorry, I'm coming to this long after it was posted. Perhaps no one's reading any more.)

Could someone clarify a couple of things?

1. If I understand right, a "double crux" of a proposition A means another proposition B for which you think A => B and B => A. But the post also refers to a "crux" and doesn't explain its terminology. Does it mean a proposition B for which you think A => B? (So that not-B => not-A, so that refuting B would require you to abandon belief in A.) Or something else?

2. The post suggests that you can play the same game when your disagreement is "softer": the two participants assign different probabilities to some proposition, but they aren't 0% and 100%. In this case, what happens to the notions of "crux" and "double-crux"? I mean, suppose we are arguing about whether freebles are glorkish; I say there's a 30% chance they are and you say there's a 70% chance. Am I now looking for other statements logically equivalent to "there's a 30% chance that freebles are glorkish"? That seems impossibly specific. Or am I looking for other statements equivalent to "freebles are glorkish" and to which I assign 30% probability? Or for statements equivalent to something like "there's a less than 50% chance that freebles are glorkish", trying to find a dividing line between my position and yours?

Regarding your first question:

No one needs to believe A↔B. We’re only looking for sufficient conditions, not necessary ones. One person believes A→B and another believes B→A, but neither need believe both simultaneously. If Peter and Dorek are playing, then these are the beliefs they’d need to have:

Peter: A; B; A→B. By modus tollens, Peter would also believe ¬B→¬A.

Dorek: ¬A; ¬B; ¬A→¬B. By modus tollens, Dorek would also believe B→A.

An example may help here. Let’s say Peter believes traditional Greek mythology is true, whereas Dorek believes Roman mythology is true. So Peter and Dorek decide to play the double crux game with the proposition A: “traditional Greek mythology is true”. To start with, Peter believes A and Dorek believes ¬A.

Now the two of them individually start looking for (single) cruxes of their belief. This means Peter is looking for a new statement B such that A→B. Once found, modus tollens will inform Peter that ¬B→¬A as well.

So Peter thinks: if Greek myths are true, then Zeus is the king of all gods. This will be Peter’s single crux B: “Zeus is the king of all gods.” Peter believes both B and A→B, so this works as a single crux. Peter mentions this as a possible double crux for Dorek.

So Dorek thinks: Zeus is not the king of all gods; Jupiter is. So Dorek believes ¬B. Also, if Greek myths are not true, then Zeus is unlikely to be the king of all gods. So ¬A→¬B. (Note that this uses a probability estimate, not a logical certainty, but that nevertheless works for the double crux game.) By modus tollens, Dorek also believes B→A.

Thus they have found their double crux: B. Peter believes A, B, and ¬B→¬A; while Dorek believes ¬A, ¬B, and B→A.

To finish the game, they visit a trusted oracle and ask if Zeus is the king of all gods. The oracle says no, so they now know ¬B. Because Peter already agreed that ¬B→¬A, Peter updates his belief from A to ¬A.
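To make Peter’s final inference explicit, here is a tiny Python sketch (purely illustrative; the helper function and names are my own, not part of the game):

```python
# Illustrative check of Peter's forced update in the Zeus example.

def implies(p, q):
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

B = False  # the oracle's verdict: Zeus is not the king of all gods

# Peter is committed to A -> B. Which values of A keep that commitment
# true now that B is known to be false?
consistent = [a for a in (True, False) if implies(a, B)]
# Only A = False survives, so Peter must update from A to not-A.
```

The point of the sketch is that Peter’s update isn’t a concession; it is the only option left that is consistent with his own prior commitment to A→B.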

Your second question is much more complicated to answer. It involves confidence intervals of nested propositions that you have to calculate twice at each step, so that when you find the final proposition Z that you can look up, you can see which of the two probability estimates is closer to reality. The domino-falling effect of changing confidence levels still occurs, but it rarely resolves to completely changing one’s mind on the original proposition; it just modifies the confidence interval by some calculated amount. Plus it requires an additional rule to finding a good double crux at each step.

My best recommendation is to go in-person to a CFAR event that teaches the double crux algorithm (like the one Amodei taught at EAG Oxford: https://docs.google.com/presentation/d/1CDA5GWVvM0ioIpRTJ83N4QNeXSV7l1TRPA50xVXEmUU), as I’m not sure how to effectively convey what I know in a blog post comment. But I’ll try to post a few thoughts anyway, just in case some of it helps you or another reader detail the idea in full.

First, ensure that both players have calibrated their credence well. See http://acritch.com/credence-game/ or http://lesswrong.com/lw/hak/link_how_to_calibrate_your_confidence_intervals/. Then try it out using a confidence interval only for the initial proposition A, not for the other propositions. This makes the math easier.

Example: I’m 70% confident in A; you’re 30%. We find a double crux B such that A→B for me and ¬A→¬B for you. Therefore here are our beliefs:

Eric: p(A)=.7; A→B; p(B|A)=1. Therefore: p(B)=.7.

G: p(¬A)=.7; ¬A→¬B; p(¬B|¬A)=1. Therefore: p(¬B)=.7.

Then, once you’re able to do the above well, you can move on to allowing confidence intervals on nested propositions, but require forced agreement on supporting facts. For example, let’s say we find a double crux C that holds 75% of the time if A is false, but 95% of the time if A is true. This would mean the beliefs would be:

Shared beliefs: p(C|¬A)=.75; p(C|A)=.95.

Eric: p(A)=.7; p(C)=.89 (calculated); A→C (per shared beliefs).

G: p(¬A)=.7; p(¬C)=.19 (calculated); ¬A→¬C (per shared beliefs).

The p(C) values for each of us would be found by multiplying out the confidence intervals, like so:

(.75)(.3) + (.95)(.7) = .89

The problem that you pointed out is that the p(C) values aren’t the same, so it’s at first unclear how to treat C as a double crux. However, you can still use C as a double crux — you just have to multiply out the confidence interval. Using the values that I provided (on purpose) here, this makes the end result somewhat unsatisfactory, as the disagreement on p(C) is .89 for me and .81 for you, which are quite similar. If you were to nest a few additional double cruxes in, you may find that the difference between your confidence in some future proposition p(Z) and my confidence in that same proposition might be nearly identical.
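The arithmetic here is just the law of total probability, p(C) = p(C|A)·p(A) + p(C|¬A)·p(¬A). A quick Python sketch of both players’ estimates (variable names are my own illustration):

```python
# Both players share the conditional beliefs, but differ on p(A).
p_C_given_A = 0.95      # shared: p(C | A)
p_C_given_not_A = 0.75  # shared: p(C | not-A)

def p_C(p_A):
    """Law of total probability: p(C) = p(C|A)p(A) + p(C|~A)(1 - p(A))."""
    return p_C_given_A * p_A + p_C_given_not_A * (1 - p_A)

eric_p_C = p_C(0.7)  # Eric has p(A) = .7  ->  about .89
g_p_C = p_C(0.3)     # G has p(A) = .3     ->  about .81

# A 0.4 gap in p(A) has shrunk to roughly a 0.08 gap in p(C):
# this is the washing-out effect described above.
```

This makes the shrinking disagreement easy to see: the gap in p(C) is exactly (p(C|A) − p(C|¬A)) times the gap in p(A), so weak conditionals compress the disagreement at every nesting step.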

To correct for this, there’s an additional rule in finding a double crux: it has to result in a significant difference in confidence intervals between the two players. This is harder to do than it sounds, and it doesn’t exactly sound easy to do.

Because I enjoy the double crux game as a social game to help resolve disagreements, I tend to just stick to only using confidence intervals in the first proposition, but not subsequent ones. That way it is easier to explain to people, it’s more fun to play, and it makes finding shared cruxes much easier. When I play, it’s more likely to get people to understand each other’s differences than to actually change someone’s mind, mostly because finding double cruxes is hard enough in the boolean version of the game.

If you find a better explanation of how to play the double crux game using confidence intervals somewhere online, please link it here so that future people can see it as well.

Whoops, I'm not sure where I got A<=>B from. Sorry about that. Anyway, it sounds as if I was right to take it that a crux of A is something you think is implied by A.

The fully-calculated Bayesian version of double-cruxing sounds like a lot of Not Fun in general, so I endorse your suggestion of not bothering with intermediate degrees of credence for propositions other than the original -- in which case, it seems like the thing to look for is propositions that we're *very confident* are necessary for our positions on the contentious proposition to be right.

An obvious difficulty here is that there's a reason why those probabilities tend to come out not-very-different: nontrivial chains of reasoning where all the steps are uncertain aren't, and shouldn't be, very convincing. What it should take to convince you in such cases is different in shape from what it takes to convince you when you have a watertight chain of logical inference.

(In other words, if you can't find suitable propositions that follow almost-certainly from the thing you disagree on, double-cruxing probably isn't actually the best way to resolve your disagreement.)

I'm not sure I understand your use of the term "confidence interval" here, by the way. Those things in your calculations look to me like probabilities, not confidence intervals.

There is an updated version of that prezi with some additional rationality techniques/concepts outlined: rationalist taboo, steel-manning, and inferential distance.

I'll hopefully be fixing some typos etc. and including a link to some exercises I put together for when I deliver this presentation as a workshop at some point.

https://prezi.com/_er3ebasdwr5/better-disagreement-2/

(I did actually remove the slide you used from my old presentation, as I found the formal-ish logic was a bit intimidating for some in my audiences.)

Glad to see someone found my prezi useful -- only just found this post ~2yrs after the fact.