Agreeing to Disagree

Robert Aumann, a winner of the 2005 Nobel Prize for Economics, once published a paper in The Annals of Statistics titled “Agreeing to Disagree.” The basic idea of the paper is that two rational people should, by sharing their beliefs with each other, come to a common understanding about what is likely to be true.

The argument is given as a mathematical theorem, which I will briefly describe. (Impatient readers may skip to the next paragraph.) Economists and others who study decision theory generally model beliefs as a probability distribution. This distribution describes the subjective (“Bayesian”) likelihood that I place on something being true or false. For example, I might believe with 75% probability that it will rain tomorrow, or I might believe with 90% probability that the woman sitting next to me has a name that starts with a “C.” When I am confronted with new evidence (I hear a weather forecast, the woman introduces herself as Susan) I update my beliefs, and the new probability distribution is called the “posterior.” The situation is modeled as follows: assume there are two people who have “common priors.” This means, roughly, that given the same total body of evidence they would come to the same conclusions. Furthermore, assume that each is able to observe some private information that is not observable to the other person. This should cause each to update their beliefs to reflect the new information. Now assume that the two people can get together and express their beliefs (“posteriors”) to each other. Then they should each update their beliefs to reflect the other person’s beliefs, until they have exactly the same beliefs. Sharing their beliefs is all that is required—there is no need for them to describe their private evidence.
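
To make the prior-to-posterior step concrete, here is the rain example worked as a small calculation (a sketch only: the 75% prior comes from this paragraph, while the forecast’s reliability numbers are invented for illustration):

```python
# Bayes' rule applied to the rain example. The 75% prior is from the
# text; the forecast reliability numbers are hypothetical.
prior_rain = 0.75
p_forecast_given_rain = 0.9  # chance the forecast says "rain" when it will rain
p_forecast_given_dry = 0.3   # chance it says "rain" when it won't

p_forecast = (p_forecast_given_rain * prior_rain
              + p_forecast_given_dry * (1 - prior_rain))
posterior_rain = p_forecast_given_rain * prior_rain / p_forecast
print(posterior_rain)  # ~0.9: hearing the forecast raises my belief from 0.75
```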

The idea is easy to understand: my beliefs summarize the weight of all the evidence that I’ve observed in my life. If I want to know the truth, I should place weight not just on the evidence I have personally observed, but on evidence that you have observed as well. In fact, if I believe that you are as smart and honest as I am, then I should give your evidence as much weight as my own. Since your beliefs summarize all your evidence, they allow me to update my beliefs to reflect your private information. Of course, you should do the same, until we believe the same thing. (Note that the theory applies only to objective truths, and not to matters of taste–there is nothing problematic about agreeing that you like chocolate and I like vanilla.)

Why might two truth-seeking people continue to disagree? One possible reason for disagreement is that they doubt each other’s honesty or rationality. I will not believe the evidence from your “posterior” if I think you have been faulty in evaluating your private evidence, or if I think you are lying about your beliefs, even to yourself. Another possible reason is that they don’t have “common priors,” or in other words, that they would not agree even if they saw all the same evidence. But again, this seems to require a judgment by me that your thinking is somehow less valid than mine. In the end, it is difficult for good-faith disagreement to persist. If it persists, at least one of the parties is essentially judging the other to be on some level dishonest, deluded, or dumb. In an unpublished paper, blogger-economists Tyler Cowen and Robin Hanson consider this question and conclude that “typical disagreements are dishonest.”

It is straightforward to apply this idea to religion, especially religion as it is understood by Mormons. First, we believe that a primary goal of true religion is to find the truth about objective facts: about the will of God, the authority of the church, the right way to get baptized, etc. Second, we believe that the most important evidences about these truths consist of private information: personal spiritual experiences and feelings. Other people cannot verify our evidences, and in fact we might not even be able to accurately describe them. Some maintain that we shouldn’t even try to describe these experiences, because they’re “too sacred.” Instead we resort to describing our posteriors, or, as we call it, bearing our testimonies.*

If others find us trustworthy, this testimony should be enough to convince them to adopt our beliefs. Indeed this process is explicitly approved by the Lord in D&C 46:

13 To some it is given by the Holy Ghost to know that Jesus Christ is the Son of God, and that he was crucified for the sins of the world.
14 To others it is given to believe on their words, that they also might have eternal life if they continue faithful.

Of course, this cuts both ways. We live in a world with many who do not, in the end, “believe on our words.” What are we to make of this? Many of these others seem just as rational and well-adjusted as we are, yet many have no religious beliefs, or, more problematically, strong convictions very different from ours. How should we account for these beliefs? Do we need to update our own posteriors?

One popular solution is simply to decide that these non-believers are dishonest or self-deluding, perhaps because they are involved in sin and ensnared by the wiles of the evil one. Or, we could decide that the non-believers don’t listen because they erroneously think we’re deluded or stupid. These solutions resolve the epistemological dilemma, but they also make it hard for us to have good-faith relationships with the non-believers. Our disagreement about these fundamental matters reveals an unresolved underlying mistrust, no matter how friendly a face we try to put on it.

Another solution, popular among the more liberal types, is to decide that religion isn’t so much about objective facts as it is like chocolate and vanilla…if you find beliefs that work for you, great! But this seems to deny some core aspects of our faith, and so many find it unworkable.

In the end, I think this theory helps explain why discussions like those in the bloggernacle can so easily become unpleasant. “Agreeing to disagree” sounds nice, but it is ultimately more problematic than we realize. Perhaps we’d be wise to take Cowen and Hanson’s advice, and focus ourselves, considering how we can “become more honest,” learning to “look for signs of self deception,” and being “a bit more wary of our own judgments when we disagree.”

* I’d love to see someone go to the podium on fast Sunday and say “I’d like to describe my posterior, that I believe with probability approaching one that the church is true….”

66 comments for “Agreeing to Disagree”

  1. Ed,

    Why did Aumann title his paper that? It does not appear to be what happens in his model. Or perhaps he was using the model to show conditions under which agreeing to disagree was untenable.

  2. Ed: I don’t see why the fact that people do not have the same posteriors necessarily implies that you think that the other guy is irrational, deluded, etc. It seems to me that you simply think that he is wrong. However, if our beliefs are basically a probability distribution, then why do I need to assume that there is something suspect about someone else coming to an incorrect conclusion? The very fact that I can question my own beliefs without necessarily convicting myself of some sort of fundamental breach against reason suggests that I can think that others are wrong without convicting them of a similar breach against reason.

    It seems to me that we can agree to disagree on fairly simple terms by saying something like, “Well, I think that you are a rational and reasonable guy. I just think that you are also wrong.”

  3. Interesting stuff, but maybe only to someone who enjoys economics, as I do. I find that people either love it or hate it.

    There is another factor at play in many discussions also, and that is play itself. A debate can be a game – either a game of skill in logic or a game of cat and mouse. Tennis players don’t reason out who is better, they play the game – sometimes to see who is better and sometimes just because they find the game enjoyable. Some games end in arguments, and some in hugs or handshakes, no matter who wins.

    The same happens in economics and other academic/scientific inquiries. Questions are asked. Assumptions are challenged. It is one way for everyone, including the questioners, to peel away falsehood, errors, dishonesty and gaps in knowledge, even within themselves.

    The advice to focus on our own honesty and judgements is good advice, but also very personal/private. We like to know someone a bit before we bare our prejudices, fears and ignorance with them. It isn’t as likely to happen online, especially early in a discussion. There’s a game to play first.

  4. Here is another position, Ed. Feelings are much more about ourselves and our desires than about a reality transcending ourselves or the empirically accessible world. Hence you can encounter an honest Jew for Jesus who claims that God has answered her prayer with respect to the Book of Mormon negatively while we believe that we received the opposite answer.

    Alma 32 is quite explicit. The testimony begins with a wish. If you wish the Book of Mormon to be true then that’ll be your answer. If you wish it to be of the devil then that’ll be your answer.

  5. Frank,

    I haven’t talked to Aumann, but I assume you’re right. He’s showing that “agreeing to disagree” doesn’t really make sense in a Bayesian model with common priors. Note that this is the model that you’d probably want to use in any formal investigation of beliefs. This is surprising because, as Nate points out, “agreeing to disagree” is commonly thought to be a fairly simple and wise thing to do.

  6. “Alma 32 is quite explicit. The testimony begins with a wish. If you wish the Book of Mormon to be true then that’ll be your answer. If you wish it to be of the devil then that’ll be your answer.”

    I believe that you wish Alma 32 to say that, but I’m not convinced it does :).

    Nate,

    If somebody has different beliefs than you, the first question to ask is, why? Is it based on some set of reasons that you would be interested in knowing about? If you weight those reasons differently than they do, why should you, unless you think you have some special insight? In other words, if they believe differently than you, there is a reason for that, and you should seek to use the information they have to make a better estimate of the truth.

    As for questioning oneself, I am not sure that is a problem; after all, a posterior is a distribution you accept giving probabilities of any given event being the case. There is quite a bit of room there for self-doubt! You may not have a clue, in which case you might, for example, assign equal probability to an infinite number of possibilities.

  7. Ed: But the whole argument seems a little circular. It seems to me that saying that one has common priors simply means that the ONLY way that people can reach different conclusions is by either having different information or through some sort of intellectual failing. The argument becomes true by definition.

    Why make this assumption at all? You suggest that this is the model that we want to use in any formal investigation of belief. Why?

  8. Nate, I’m not sure I understand your question. How can you think that someone else is just as smart and honest as you, but simultaneously think he’s more likely to be wrong than you?

    Let’s say you are interested in the truth of proposition A. Let’s say you initially believe A is true with 95% probability. You meet another guy who believes A is false with 95% probability. If you think he’s just as smart as you, then wouldn’t you think you are just as likely to be wrong or biased in your beliefs as he is? But if you conclude that, then you really don’t think “A is true with 95% probability” anymore; you think “it’s just as likely that I’m right and A is likely to be true, or that he’s right and A is likely to be false.” Combining these, you would now believe that A is true or false with equal probability.

    Of course the other guy should do the same thing if he thinks you are just as smart as him, so then he also thinks “A is true or false with equal probability.” Voila, you have the same beliefs.
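
    Here is that arithmetic as a quick sketch (the 50/50 weight simply encodes the assumption that he is as likely to be right as I am):

    ```python
    # Combining two opposed 95% beliefs under equal mutual trust.
    my_belief, his_belief = 0.95, 0.05   # each person's P(A is true)
    weight_on_me = 0.5                   # "he's just as smart as I am"
    combined = weight_on_me * my_belief + (1 - weight_on_me) * his_belief
    print(combined)  # 0.5: we both end up treating A as a toss-up
    ```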

  9. “ONLY way that people can reach different conclusions is by either having different information or through some sort of intellectual failing. The argument becomes true by definition.”

    You are correct that we can destroy the theorem by just assuming different priors. But that seems a little capricious. Why should you think your priors are so much better than everyone else’s? If everyone were to think that their personal priors were nothing special, that would reduce to the same thing as common priors.

  10. Clair, I think that’s a really good point. Intellectual debate is often about the fun of the debate as much as getting the right answer. In that case, agreeing to disagree would really just mean “we’ll take this up later for even more fun!” But when it comes to fundamental questions that are deemed to be really important, like religious truth, it’s not that simple.

  11. This post brings out a couple points:

    Although there are some people for whom this is true, in most cases the model does a horrible job of predicting behavior, which suggests that we are missing something. For example, I may have preferences over what I wish to believe is true, even if it means ignoring what is true. I may not actually be interested so much in a fundamental truth as in something that I find useful for me in its application, in which case people who have different applications may not be useful to me, because what I am after is not truth but something useful to my situation.

    Lastly, how does one think about situations where someone has a lot of knowledge and someone else has little, but the person with little adopts overly optimistic evaluations of the value of their knowledge? In fact, that may be a common characteristic of having little information. You think you know more than you do, because additional information would actually make you less confident. Is that a problem or not? I haven’t read the Aumann paper so I am not sure if that was allowed under his model.

  12. It is straightforward to apply this idea to religion, especially religion as it is understood by Mormons.

    I’m not so sure that “truth” or “Truth” is as objectively determinate and independent of desire as you suggest. Is your perspective reconcilable with D&C 88’s assertion that our condition depends, in the end, upon what condition we desire?*

    *32 And they who remain shall also be quickened; nevertheless, they shall return again to their own place, to enjoy that which they are willing to receive, because they were not willing to enjoy that which they might have received.

    33 For what doth it profit a man if a gift is bestowed upon him, and he receive not the gift? Behold, he rejoices not in that which is given unto him, neither rejoices in him who is the giver of the gift.

    34 And again, verily I say unto you, that which is governed by law is also preserved by law and perfected and sanctified by the same.

    35 That which breaketh a law, and abideth not by law, but seeketh to become a law unto itself, and willeth to abide in sin, and altogether abideth in sin, cannot be sanctified by law, neither by mercy, justice, nor judgment. Therefore, they must remain filthy still.

    36 All kingdoms have a law given;

    37 And there are many kingdoms; for there is no space in the which there is no kingdom; and there is no kingdom in which there is no space, either a greater or a lesser kingdom.

    38 And unto every kingdom is given a law; and unto every law there are certain bounds also and conditions.

    39 All beings who abide not in those conditions are not justified.

  13. Does this model leave room for “capital-T” Truth, however? There are some absolutes, otherwise we have no religion and no claim to the Truth (about God, about the plan of salvation, about the Godhead, about the Book of Mormon and Restoration, about priesthood, etc.). Before believing in personal spiritual experiences, we believe in absolute truths that people are then invited to discover for themselves. But different individual personal experiences don’t change what is True. I still may need to agree to disagree with someone, but that doesn’t change those things that are True (or Right and Wrong).

    If it persists, at least one of the parties is essentially judging the other to be on some level dishonest, deluded, or dumb.

    Or one is judging the situation against what is True, which is not necessarily a judgment on the person per se, but a realization that the person can’t be right because you can’t change what is True. That doesn’t give us the right to judge or condemn someone, but it does give some understanding about what is going on.

    For example, if someone comes to me and says that he/she has received revelation that the Book of Mormon really isn’t the word of God, then I can know that something about his/her evidence or experience or bias or whatever is inhibiting that person from finding Truth. The person may not be “dishonest, deluded, or dumb” but instead be “deceived.” We would still have to agree to disagree, but I could know that I am right — not by nature of my rightness but by nature of what is True.

    I would not be able to apply the same confidence about being Right to a more subjective decision such as that discussed re: the Mommy Wars in another thread. I have received revelation that I should stay home with my children and not actively pursue my career, and, while I believe strongly in what the prophets say about the ideal of women staying home, I cannot conclude that another woman who says she feels impressed to pursue her career is dishonest, deluded, dumb or deceived, because this is not a case of absolute Truth. It is based on true principles, yes, but there are different situations that may call for a different solution, and I cannot pass judgment on that.

    However, if she were to claim that the prophets are dishonest, deluded, dumb or deceived because she doesn’t like what they teach, then I could go back to saying that she is wrong — because it is True that prophets have authority to lead the Church and are called of God to teach Truth and to give inspired counsel that is for our good.

    Now, it is quite possible that I missed the point of the thread, since I really detest economics, so….

  14. * I’d love to see someone go to the podium on fast Sunday and say “I’d like to describe my posterior, that I believe with probability approaching one that the church is true….”

    Don’t tempt me.

    Otherwise, I think Frank’s first paragraph (#12) deserves particular attention if this model is to be applied in a religious sphere.

  15. The model predicts that honest, rational, non-self-deluded truth seekers with common priors will not disagree. We observe that people do in fact disagree. This “failure” could be because some people are:

    1. not fully rational (or subject to cognitive biases)
    2. dishonest
    3. deluded
    4. seeking for something other than truth
    5. do not have common priors.

    I would argue that it’s all of the above. The reasons you mention seem to fall into these categories. If you’re seeking for some instrumental value other than truth, that’s reason 4. Greenfrog and Hellmut also seem to be talking about people seeking for something other than the truth. If you are over-confident with little information, that is reason 1. (Or possibly 5. It still seems to me that having different priors itself is a form of non-rationality.)

  16. greenfrog,
    D&C 88 is talking about how much Truth we are willing to receive and what our resulting final state will be…but our degree of reception does not change what is True.

  17. Frank (#12): I don’t know about Aumann’s model, but in finance there has been a recent flurry of papers using heterogeneous beliefs where the “agreeing to disagree” comes from the fact that the stochastic processes involved are sufficiently non-stationary that convergence of beliefs never happens. An important part of the uncertainty involved is not knowing how much you know relative to what everyone else knows (I think this is not allowed in Aumann). In this case, Ed’s result that “Sharing their beliefs is all that is required—there is no need for them to describe their private evidence” breaks down. I think this is a good way to think about how Aumann’s result breaks down in religious, political, and other polarizing topics of conversation where people like to disagree (this is essentially the same as assuming we have different priors; it’s just giving a better and more rational story as to why people might have different priors, and continue to have different priors…).

  18. Robert,

    I think you’re wrong about what Aumann allows—if you are able to repeatedly share information and compare how you update it, your posteriors should converge to the same distribution. If you have jstor access, a link to the Aumann paper is here. It’s only 3 pages long, and page 3 has a nice example with coin flips.

    What finance papers are you talking about? Are you sure that the agents are sharing their beliefs with each other? Or is the result simply that agents start with different beliefs, and the flow of information is never enough to swamp the priors?
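
    For anyone who wants to see the convergence happen mechanically, here is a small simulation of the repeated-sharing process (a sketch: the state space, event, and partitions are invented, and it follows the announce-and-refine dialogue later formalized by Geanakoplos and Polemarchakis rather than reproducing Aumann’s coin example):

    ```python
    from fractions import Fraction

    # Nine equally likely states; each agent privately learns which
    # cell of her partition contains the true state.
    states = set(range(9))
    prior = {w: Fraction(1, 9) for w in states}
    event = {0, 1, 2, 3}                       # the proposition at issue
    part1 = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]  # agent 1's information
    part2 = [{0, 1, 2, 3}, {4, 5}, {6, 7, 8}]  # agent 2's information

    def cell(partition, w):
        return next(c for c in partition if w in c)

    def post(possible):
        return (sum(prior[w] for w in possible & event)
                / sum(prior[w] for w in possible))

    # K1[w], K2[w]: what each agent would consider possible in state w
    K1 = {w: cell(part1, w) for w in states}
    K2 = {w: cell(part2, w) for w in states}
    true_state = 4

    for rnd in range(10):
        a1 = {w: post(K1[w]) for w in states}  # announced posteriors
        a2 = {w: post(K2[w]) for w in states}
        print(rnd, a1[true_state], a2[true_state])
        if a1[true_state] == a2[true_state]:
            break                              # disagreement is gone
        # Each agent keeps only the states in which the other would
        # have announced what was actually announced.
        K1 = {w: K1[w] & {v for v in states if a2[v] == a2[w]} for w in states}
        K2 = {w: K2[w] & {v for v in states if a1[v] == a1[w]} for w in states}
    ```

    The two agents start from the same prior but different private information; after a single exchange of posteriors they agree (here, at zero), without ever describing their private evidence to each other.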

  19. Robert:

    Re# 18

    The financial models break down because participants do not attempt to share information. By sharing information any one participant loses the opportunity for arbitrage.

  20. ed, LOL on the testimony meeting. Although that might be close to how Moses would bear testimony, see Exodus 33:23.

    Thanks for this post ed, it is fascinating. While it is true that we belong to a religion that seeks to discover objective facts about important things, our religion also places great value on the avoidance of contention, e.g. the first commandment Jesus gave to the Nephites.

    While I don’t have the background to offer anything substantive, I want to make an observation.

    Isn’t this how the church is supposed to work? The apostles operate on a principle of unanimity; any one of them holds veto power over the other eleven. President Joseph F. Smith said he could not imagine any circumstances where he would act without the support of both his counselors. And Elder Oaks has described his service in a stake presidency where the SP and the other counselor were in agreement, but he (Oaks) was not. He recounts how the SP would revisit the issue every few months, just to see if anything had changed, until finally, a year or so later, it started to make sense to him and he gave his support. He apologized to the SP for the delay, but the reply was that unanimity was important enough to wait for. The idea that I have incomplete information or understanding, and that others have information that is necessary in order to reach an optimal solution, is implicit in our system of councils, I think. I also think this is how successful marriages operate.

  21. Ed:

    While I’ve never used a quote from Aumann as an introduction to a testimony, I have used Kahneman and Tversky. I think a lot of their work regarding decision-making and risk applies to your original post and to the comment you made in #16 responding to Frank in #12. While my peer and I may both eventually come to believe Prop A with 95% probability, we each also have to assess the potential consequences of acting upon Prop A– and I don’t remember Aumann treating this in his paper (if he did, please correct me, as I don’t currently have access to it). This might be the “something” missing that Frank mentioned that you do not cover in comment #16. As Kahneman and Tversky discovered, when people evaluate the consequences of decisions they don’t always behave in a “rational” way. I think it is possible for two people to believe Prop A and still make different decisions regarding Prop A, but for reasons that boil down to “chocolate and vanilla.” This is a huge theological problem for Mormons because it becomes apparent that getting people to believe in Mormon theology is insufficient for “saving” them in the Kingdom of God. It also means that if I believe Prop B but still make the same “correct” choice as someone believing Prop A, I may also be saved in the Kingdom of God.

  22. mulling&musing (m&m) wrote: D&C 88 is talking about how much Truth we are willing to receive and what our resulting final state will be…but our degree of reception does not change what is True.

    I think that if you were to ask five individuals of each of the innumerable kingdoms referred to in D&C 88 “Is X true?” you would get different answers from those in different kingdoms, based — at least in part — upon their desires. I think that such a conclusion is implied by the assertion in verse 34:

    And again, verily I say unto you, that which is governed by law is also preserved by law and perfected and sanctified by the same.

    I understand this verse to indicate that within the context of each specific kingdom, the inhabitants thereof are justified by the law that they accept/acknowledge. Law is, in the end, a function and expression of communal desire.

  23. Law is such a function of communal desire, and sometimes we say law to mean a particular truth, but I don’t think equating them broadly is a good idea. Truth is not a function of communal desire.

  24. Rob,

    On the other hand some recent papers in Finance correspond fairly nicely with Ed’s post. For example, Scheinkman and Xiong (JPE, 2003) sustain differences of opinion and hence speculative trading by modeling some investors as overconfident.

  25. Frank McIntyre wrote: Truth is not a function of communal desire.

    No, but communities are formed based upon what the community as a group believes to be true — both about relatively objective matters as well as relatively more subjective matters. To the extent that there is no community consensus, to that degree, there is no comm-unity.

  26. RE: #14: What’s the difference between “deluded” and “deceived”? I think you mean that delusions are endogenous, whereas deception is exogenous (I picture the Church Lady saying “From maybeeeee…SATAN?!”). Just wanted to ask for clarification though.

  27. Great post — the more economics on this site the better!

    Re #4 — If I recall correctly, Cowen and Hanson address to some extent the possibility of disagreement as a sport.

    Ed, your list in #16 is right on the money and I would agree that all come into play. Dennis Potter addressed a related question from a philosophical perspective in a 2002 Sunstone Symposium session, “Recollecting God.” In the context of the present discussion, the issue there is that two people come into contact, each of whose posterior regarding some important matter related to God approaches one. Unfortunately, what each “knows” about God is in fundamental disagreement with the other (imagine, for instance, Joseph Smith and Mohammed). If these two are able to ascertain that neither is dishonest nor dumb, etc., what is left as a basis for their disagreement? What is the result when they confer and update their beliefs? My personal conclusion is that it must be that mankind’s capacity for delusion is greater than we would like to acknowledge (even the most rational and honest of us). Further, I don’t see any particular reason to assume that we, as Mormons, are exempt. For that reason, I suppose, I have to say that my own posterior on a great many religious matters is not anywhere near one.

  28. Ed (16), more generally, I am not sure that the data that feelings generate in different people represents the same phenomenon. After all, it’s the feeling of a particular person. There is no reason why the feeling of another person should be the same. Therefore, two individuals may not be talking about the same data after all.

    When one acknowledges the data derived from “external” observables, on the other hand, then one does not have that problem. Even allowing for misunderstandings, eventually people of good will can determine whether they are talking about the same data.

    Since the determination of religious truth invokes feelings, according to Mormon theology, Aumann’s reasoning may not apply.

    In other words, the Jewess for Jesus that told me several years ago that God had answered her prayer about Joseph Smith and the Book of Mormon negatively might have related her emotional experience just as honestly, conscientiously, and competently as the missionaries who told me the opposite.

  29. Hellmut,

    According to the setup in the theorem, you do not need to know anything about why she believes what she does. It is enough to know that someone else has a different set of beliefs. And if both of you are seeking truth for its own sake and are comparably competent and intelligent, the theorem says you should both move to the same set of beliefs. Ed laid out above the reasons why this might fail, but since the theorem does not require people to reveal their reasons for believing (or their feelings), it still holds even with subjective spiritual experiences.

    And I think this is why some people are disconcerted by people saying they prayed and got the answer that the Book of Mormon is not true.

  30. Ed (#19): Thanks for the Aumann link. Ultimately I think it’s the common knowledge assumption that’s being relaxed in many finance models to motivate heterogeneous beliefs (see my note below to Karl for references and more on this).

    To take the coin-flipping example, a key assumption in Aumann is that I know how you update your beliefs. Although in a simple example a price might be a sufficient statistic for inferring this, in more realistic situations it is not (e.g. an observed price will not likely be a sufficient statistic for the mean and the standard deviation of prior or posterior beliefs–and in finance there’s good reason to believe that analysts might not understand how each other updates their beliefs). So if I don’t know how you’re updating your beliefs, our beliefs won’t necessarily converge. (On the other hand, if the coin flip were repeated infinitely and the mean stationary, then I think beliefs would converge anyway; my point above is that if the mean is non-stationary and if I don’t know exactly how you are updating your beliefs, then our beliefs won’t converge; more commonly, the assumption is that although the mean may ultimately be stationary, the convergence I just described will be so slow in practical terms that we might as well just assume heterogeneous beliefs.)

    In more realistic day-to-day conversations, I would argue this assumption of common knowledge is non-trivial. First, I believe that conversations are not sufficient to explicate fully what, how, and why people believe what they believe (as pertaining to religious or political belief; I think it’s a pretty good assumption in many economic settings), so the updating process of understanding how others update their beliefs doesn’t sufficiently occur (that is, people reach a standstill in communication/understanding, which is slightly different than agreeing to disagree).

    But even if perfect communication were granted, I think it’s misleading to talk about religious beliefs in the same way we talk about statistical beliefs–after all, I don’t even claim my own religious belief is rational. Instead I think religious belief requires a leap of faith (which is ultimately super-rational). It may be partially motivated by personal experience (a la Alma 32), but I think it’s going too far to say that personal experience can fully rationalize belief.

    Another way to think about religious beliefs violating Aumann’s assumptions might be in the perception of such an event; that is, Aumann assumes there is no uncertainty about an event taking place; in the case of religion, wouldn’t this be like assuming that everyone knows with 100% certainty that Joseph Smith saw God (though I still think other assumptions break down and that this wouldn’t be sufficient for religious beliefs to converge)?

  31. Karl (#25): Good to cross paths once in a while. We should catch up sometime….

    I agree—I would say there’s also been a flurry of behavioral finance papers on this topic too. Coming from Carnegie Mellon, I was predisposed to more rational motivations of heterogeneous belief models (for example, Burton Hollifield and Michael Gallmeyer have done work on heterogeneous beliefs, and although I’m not sure if they’ve explicitly written about this in their papers, I think they prefer the so-called rational way of motivating heterogeneous beliefs that I outlined above in #30).

    Stephen Morris has done a lot of very good work in this area (esp. in the mid 90’s). It’s been a while since I’ve looked at his work, but a google search will surely give anyone interested some good leads. In particular, I remember him giving some simple examples showing how relaxing the infinite iterations assumed in common knowledge can lead to some surprising pricing results….

  32. Frank,

    For one, I didn’t understand this post in the same way that you just summarized it. You defined “common priors” as “if both of you are seeking truth for its own sake and are comparably competent and intelligent”…which is far from how I understood the definition. It has more to do with having roughly the same past experience, a highly theoretical construct I would add.

    Which leads to my next question: how does having “common priors” lead one to more easily/honestly accept truth? I can see how it would lead to an increased likelihood of honest agreement, but this model says nothing about truth…especially the objective “Big T” kind referred to in a previous comment.

    I think that the ideas in this post say much about arriving at agreement, which I agree with…hah! But they say nothing about the process of arriving at truth. Hellmut’s point is well taken…just because two individuals can honestly agree upon a finding based upon common priors is no proof of objective Truth; only that those two people share the same assumptions about how to arrive at a conclusion.

    The truth…it can still be a party crasher, no matter how many people have agreed to agree otherwise.

  33. Robert (#30),

    Those are good points. There are various problems with applying these theories to real life. Personally, I have grave doubts that Bayesian probability provides a very complete model of what we mean by “belief,” but I don’t have a better model to suggest.

    But I still think this idea is basically right: if I’m interested in the truth, and I meet someone who believes something different than I do, shouldn’t I take this as some degree of evidence that my beliefs might be wrong? And shouldn’t I modify my beliefs accordingly? Not to do so is to privilege my own experiences and intellect over those of others. I actually think this works in real life, and that you really should take others’ beliefs as evidence to consider. And I also think that failure to come to agreement often does result from one or both parties’ unwillingness to rationally and honestly consider that they might be wrong.

  34. Robert,

    No, it does not mean that everyone agrees that JS saw God. You could have priors over the various possible interpretations, I think.

    Watt,

    I was actually dealing with other assumptions of the model. But as Ed points out, the “common priors” assumption is not as big a deal as it may appear. Priors, if we wish, can be defined as your beliefs before any evidence comes in. For example, I have a coin I am going to flip. My prior might be that there is a 95% chance that this coin will come up heads half the time and tails half the time. Or I might have a prior based on having no idea which number to favor– in which case I might say that the chance of being heads, call this H, could be anything from 0 to 1, and I have no preference for one possibility over another. So prior beliefs are a function of past experience, but not the same.

    Thus priors might be defined as your beliefs at birth, for example. Or, as Ed points out, if two people have different priors and they consider each other to be their equal, then they should probably (under not-crazy assumptions) average across their two sets of priors to get a new set. And regardless, the whole point in this Bayesian learning stuff is that you wish to collect info sufficient that the posterior is based on observation (evidence of some form), not just the hypothesized priors. So you don’t particularly want your conclusions to be sensitive to your priors, because you want the results to be based on the data.

    Hopefully some of that made sense to you. The point is that the common priors assumption simplifies the analysis, but is not, I think, really crucial all by itself to the point Ed is making here.
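
    One way to see the “you want the results to be based on the data” point is a conjugate-prior calculation (a sketch; the Beta priors and flip counts here are invented):

    ```python
    # Two observers with very different Beta priors over a coin's bias
    # watch the same flips; more data pulls their posteriors together.
    def posterior_mean(a, b, heads, tails):
        # Beta(a, b) prior plus binomial data -> Beta(a + heads, b + tails)
        return (a + heads) / (a + b + heads + tails)

    for heads, tails in [(60, 40), (600, 400)]:
        print(posterior_mean(9.5, 0.5, heads, tails),  # confident prior, mean 0.95
              posterior_mean(1.0, 1.0, heads, tails))  # uniform prior, mean 0.5
    # 100 flips:  ~0.632 vs ~0.598
    # 1000 flips: ~0.603 vs ~0.600, i.e. the data swamps both priors
    ```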

  35. Frank (34),

    You are right to suspect that I would have a difficulty following. :-)

    In (29) you referred specifically to a set of priors in order to refute Hellmut’s position. I question the claim that a common desire for truth and comparable competence and intelligence (your set of priors) will lead two people to agree…if both are honest. I also reaffirm my question that this set of priors is anything resembling what Ed presents as “common priors”.

    My question about how the theory presented in this post has any bearing on the discovery and transmission of objective truth still stands.

  36. Thanks for your clarifications, Frank. Obviously, I need to read Aumann myself. Maybe Tyler Cowen et al. have the solution to the religious heterogeneity, which tends to increase as freedom increases. Most people are not really willing to update their religious priors.

  37. See, when you said this:
    “And I think this is why some people are disconcerted by people saying they prayed and got the answer that the Book of Mormon is not true.”

    I understood it in the context of your comment (29) to mean that someone must be dishonest. This is a misrepresentation of Ed’s post as I understand it.

  38. Great comment, Frank. You explained the stuff about common priors better than I did.

    One thing to remember is that when I say that people no longer disagree, it doesn’t mean that they together discover the truth. On the contrary, it might often mean that both people become less sure of themselves, as each updates his beliefs to ascribe increased probability to the other’s position. If I think it’s going to rain and you think it’s not, then after talking about it we might both agree that we really don’t have much idea whether it’s going to rain or not.

  39. Alright Ed,

    So that clears it up for me a little…

    Just wanted to be clear that we’re not saying that agreement, or even level of certainty, is the measure of truth.

    ‘Cause my prior is that all humans (regardless of how certain they feel or how much I trust them according to my priors) are complete fools in the arena of judging objective truth.

  40. Ed (#33),

    I definitely agree. That is, I think careful and thoughtful discussion, which entails careful listening and probably some assertive effort to really understand your interlocutor, as well as questioning of your own prejudices and prior beliefs, generally leads to much better mutual understanding. And, notwithstanding my quibbles above, I think Aumann’s result can appropriately be used to make this point, as you, Cowen, and Hanson have nicely and thought-provokingly done.

    Frank (#34),

    Yeah, Joseph Smith is a bad example, it’s too easy to simply discuss existing evidence regarding his (alleged) historical experience. I think a better example might be personal religious experience. And on this point, I don’t have much to add to the points Ed has already made, only the terms I might use to describe the following experience might be different than Ed’s:

    Suppose I receive an answer to my sincere prayer that Mormonism is true and you don’t, and I believe it’s b/c you weren’t sincere in your prayer. Whereas Ed might say I believe you are irrational (am I putting words in your mouth, Ed?), I would prefer to say that we lack common knowledge about whether you sincerely prayed or not.

  41. “‘Cause my prior is that all humans (regardless of how certain they feel or how much I trust them according to my priors) are complete fools in the arena of judging objective truth.”

    Indeed. I suppose you include yourself in that group?

    Humbling yourself may be the first step to enlightenment.

  42. #40 – The last paragraph is about sincere prayer—see, my prayers don’t make it through the blogosphere, let alone past the ceiling….

  43. Ed (42),

    I did say “all”, and I consider myself human. :)

    Humility is certainly a step to enlightenment. Good point. And who knows who is truly humble? Can anyone discern the level of my humility? Or are we just passing judgment based upon the assumption that anyone who questions or has failed to receive the same answers must also be short on humility?

  44. Agreeing to disagree only works between people who don’t really know each other and don’t really care.

    In that case, agreement to disagree simply reflects a rational view of where one’s efforts are best spent.

    But for those you really care about, temporary ceasefire is never going to cut it.

    If you’re right, you’re right and it doesn’t matter whether everyone else agrees with you or tolerates you. Those who disagree will have the option of either agreeing with you, or being wrong. Simple.

    If you’re right, that is …

    In any event, I will never subscribe to an idealized world view that contemplates armed camps observing an uneasy and unearned ceasefire. By golly, we’re going to have this out until one or both of us are proven wrong! It’s not a warm-fuzzy kind of statement, I know. But outstanding disagreements over fundamental issues cannot be suppressed by layers of warm-fuzzies.

  45. Robert, in the example you give, if the person wasn’t sincere, they should know that they were not sincere, unless they’re deluded. Knowing this, if they trust you, they should change their beliefs to be more in line with yours (i.e. accept that Mormonism might well be true). Not to do so would indicate that they are irrational (or, more likely, that they don’t really trust you).

    This actually happens all the time…for example, young people go on missions without feeling they’ve received an answer, because they trust the assurances given by their parents and others.

    On a somewhat related note, I don’t like the way we teach missionaries to actually misrepresent their own posteriors, in that they are taught to bear a “testimony” even if they don’t feel they have one. That is what I was taught, and I think it’s just wrong. It would be better for everyone if missionaries were told to simply describe what they believe and why. Dishonesty in representing our beliefs is exactly what we should be trying to avoid. Ironically, many of the people these missionaries end up disagreeing with may be more sincere in expressing their beliefs than the missionaries themselves.

  46. Rob,

    We should definitely catch up. I usually go to the so-called “Mormon Finance Association” dinners at AFA and WFA, but I don’t think I am going to the WFA this year. You should come to the next NBER Behavioral Finance meeting. I almost always go to those.

    Coming from Carnegie Mellon, I was predisposed to more rational motivations of heterogeneous belief models

    Wow, that doesn’t put me in a very favorable light, does it? After all, I went to Chicago, and yet was still seduced by the dark side.

    Stephen Morris has done a lot of very good work in this area (esp. in the mid 90’s).

    Were you thinking of Morris, Stephen, 1996, Speculative investor behavior and learning, Quarterly Journal of Economics 111, 1111–1133? I agree.

  47. #47
    I’m pretty sure this is changing with the “raise the bar” and Preach My Gospel program. I think missionaries are better prepared spiritually in general.
    Even with the old discussions, I think there was room to explain what you believe and why. I was never given a rote testimony to bear, and I remember expressing my own feelings when I did so.

  48. I agree that this discussion is particularly pertinent for Mormons because of the peculiar attitudes and assumptions we share about the nature of Truth and its relationship to faith. I think M&M brought up an important distinction between truth and Truth (objective, eternal, unchanging, etc.). As Mormons, none of us approaches the theoretical situation described in the initial post under the assumption that the only factors in play are the intellect, honesty, experience, self-delusion, etc. of the other person. We assume the existence of third parties. And I don’t just mean God, since the experience of having God reveal truth to us falls under the same category as any other experience (i.e. if I would have had the experience or if I trust that you honestly had such an experience and believe that you are describing it to me accurately, then it will influence my assessment of the truth the same way it influenced yours). The third party I’m talking about is Satan. We believe that he is an independently acting agent with real power and an unquenchable desire to deceive the children of men. It is one thing to assume that a person is less intelligent or less honest than you are. It is something else to assume that they are being deceived by a real, clever, powerful being–a being whose actions they are unaware of. Add to that the doctrines taught in Alma 12 linking righteous desires and behaviors to a person’s actual capacity to grasp deep truths and mysteries, and that adds an entirely new dimension to an equation that otherwise appears to take place in a vacuum.

  49. Human thought combines observations and assumptions to form certain impressions. And scholar-logicians (such as the economist cited by Ed) reason out what this process would entail under optimal conditions. But the element that would be of the most interest in examining the complete picture of human thinking is the leaps of faith people must make in whatever givens they accept as their basic assumptions. And such assumptions are passed from one person to another, of course, through language. But ultimately such concepts as “God,” “good,” and “must,” although they are essential universals among humans, must in the end be seen as concepts that lack true precision — ideas or abstract “forms” that, although they are necessary, must ultimately be seen never to be concrete.

    For example, consider the abstractions True, as in “The Book of Mormon is true,” or False, as in “The Book of Mormon is false,” which have meaning according to how one emotionally and mentally wields the abstractions of either Truth or Falsity. And in the end the process of human “knowledge” is existential.

  50. Or instead of anything I just said with concern to truths, impressions, and assumptions, just read and then ponder Ecclesiastes.

    Which, of course, finally ends with: “OK, so then after all this, still do your duty and what you know to be right ANYWAY!”: the tenor of advice given by ethical philosophers from Plato and Aristotle to Confucius to whomever-can-be-named (and is also apparently the “Eugene England” approach of abiding by the positives of faith, obedience, and humility, as referred to by Travis in his post 96 to “On Earning the Right to Complain”).

    Amen.

  51. Besides some of the difficulties already mentioned, I think a very serious problem is that people’s beliefs consist of a complex, highly coupled set of concepts, propositions, and associations. Two parties A and B can generally only exchange partial information about their beliefs in any reasonable time frame.

    So even if A thinks B is a smart, honest, reliable, and sincere individual, partial information exchange will limit the credit that A can give to B’s beliefs, because the largely sub-conscious rational/associative process that A uses to form his own beliefs will have a greater level of input from A’s existing store of experiential and propositional knowledge – even if A weights any of the experiences related by B as if they occurred to him or her personally.

    Only with shared experiences and extensive communication over a period of years would A’s beliefs naturally converge with B’s, even if they trust each other implicitly.

    There is a further problem particularly relevant to religious / emotional beliefs – some experiences or emotions are extremely difficult to convey accurately.

  52. I understand its popular usage in certain contexts, but the notion of “relative truth” is strictly speaking an oxymoron. We have plenty of terms like “belief”, “model”, “probability” and “rational expectation” to cover propositional content that is held subjectively.

    The only reason to sully the standard sense of the term “truth” is if one doubts in the existence of an external reality. Truth is either absolute in some sense or it isn’t truth at all.

  53. It seems like two factors are at work to keep this thesis from bearing out in the real world. First of all, I can think my atheist friend is rational and smart, but that he hasn’t yet had the sort of spiritual experiences I have had, so he doesn’t yet believe as I do.

    On the other side, eyewitness testimony is notoriously unreliable, and it’s not rational for someone to overhaul their entire worldview based on someone else’s eyewitness testimony, regardless of how smart and honest they think the person to be. The most they might do is provisionally accept that something might be going on that doesn’t fit their worldview entirely, that they don’t completely understand. Otherwise, we could poll the smartest and most honest people in history, and simply believe what they believed. The problem with that is that all the various super-brilliant people in history have believed different things.

    An analogy would be when I was fixing people’s computers for a living. They would describe their computer problems to me, and they were always being honest and sincere in describing to me what happened, but their lack of knowledge about the underlying mechanisms of the software and hardware made their descriptions unreliable. I could only rarely figure out what was wrong based on their descriptions. On the other hand, when they knew how to make it happen again, and could do it while I was watching, I could usually tell instantly what was wrong, and if not, I could always figure it out eventually using the scientific method. It wasn’t that they were not smart or honest, but that they didn’t share my knowledge of the underlying mechanisms, so they (or their perception subsystems) didn’t know how to perceive what happened. I had to have more than their words, I had to have the experience myself, before it had meaning to me.

    If religious experiences were shareable and repeatable then what you say would be the case, because we could test our theories against our observations. As it is, I would think my friend irrational if he overturned his whole view of the universe based on my testimony of religious experiences of which he has had no hint himself. Everyone’s perceptions are based on their prior understanding, and we filter our experiences before we even register them. There are some things you just can’t do for someone else. One of them is feeling the spirit.

  54. Paul (#20): Sorry to somewhat overlook your comment. The not sharing of information is probably a better way to approach this than the hole I’m digging myself into trying to relate this to the common knowledge assumption….

    Karl (#48): The QJE paper is related to some of the work I was thinking of, but there he’s relaxing the common prior assumption (he has what I think is a very well-written piece—one that a lay audience should understand—discussing reasons why that may be a poor assumption here). Instead, I was thinking more about his work on higher order beliefs—see my comment to Ed below.

  55. Ed (#47): Hmm, I see your point and I’m having a hard time getting at the point I’m intuitively trying to make (b/c I don’t have a formal grasp of it…). So I’m going to back up a bit and explain the intuitive link I’m trying to make based on some finance results. Of course I’ll be glad if someone else can help me make the link more formal, or help me understand the intuition of the finance results (which I thought I had, but am having a hard time recovering, hopefully b/c I’m time pressed, not b/c my brain is aging!).

    In finance, several authors have shown that relaxing the common knowledge assumption can lead to speculative pricing bubbles. Here’s a paper where Morris does it by only allowing finite depth of knowledge (see p. 462 for the bubble implications). Here’s a paper where they get bubbles (or something similar) by relaxing the assumption of common knowledge of common priors.

    I’ve simply been trying to relate these types of bubble results, which come from relaxing the common knowledge assumption, to the post here. I think I’ve done a very poor job of it so far, probably b/c my memory has jumbled several of these results together in my head. I still think there’s a cogent argument to be made, I just no longer believe I’m the one to make it!

    Also, I think the rational justifications in the flurry of heterogeneous beliefs finance articles have relied more on the arguments for why common priors may be a poor assumption (see the Morris article I linked to in #56) than on the relaxation of the common knowledge assumption per se.

  56. Watt (#44), the omniscient economist-modeler gets to decide who’s sincere and who’s not! And I assume God is the great economist of them all (though it could be the god of this world, I often get confused between the two…). What I had in mind is if the one praying is sincere, then God would know and answer the prayer.

    Actually, I don’t think T-truth matters in many of these theorems/results—it only matters if there’s a relevant outcome in the model that depends on T-truth, in which case it’s usually assumed that T-truth is a weighted average of all the agents’ beliefs in the model, though other variations can be explored (usually of the “too optimistic” or “too pessimistic” flavor).

  57. 1. not fully rational (or subject to cognitive biases)
    2. dishonest
    3. deluded
    4. seeking for something other than truth
    5. do not have common priors.
    >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
    I simplified those to:

    I don’t agree with other people because I think:
    1) They have defective rational, cognitive functionality.
    2) They are motivated to gain something other than truth.
    3) They are just plain ignorant or inexperienced.

    Beyond these simplified three, people “agree to disagree” because their vocabularies are slightly or widely different. This is different from “uncommon priors” because the priors imply a difference of experience, and not a difference in a description of these experiences. All those differences amount to general dissonance in a discussion; and, if you’re persistent, you may discover disagreement in what a single word “actually” means.

    Using our imprecise communication, untangling accumulated disagreement, even partially, takes an enormous amount of time, emotional openness, persistent logicality, etc. Those requirements are rarely met. When we recognize our shared inability to communicate fully (with the required capacity and resources) on a given topic, it is wise to “agree to disagree” and to acknowledge the limitations in our human communication.

    We are not completely understood by any mortal. We must “agree to disagree” here in mortality, with our limited mental/emotional capacities and time resources. Positively, we have the potential for complete divine harmony today.

  58. Very good point, Chris. Failures to communicate can make disagreement persist. Sometimes what appears to be disagreement might be mostly just a different use of language.

  59. Language is the arbitrary clothing of ideas yet still places these ideas in place and time.

    E.g., Hassidim and Mormons are both Judeo-Christians, yet the us-versus-them imperative “Don’t be like unto heathen” is interpreted:

    among Hassidim –> beards = good

    among Mormons –> beards = bad

  60. A thought for the economists out there to poke holes in:

    As I understand it, I rationally take your beliefs into account because you are as rational as I am. But suppose I *think* you are as rational, but am not certain. Wouldn’t it make sense to wait and see if you take my beliefs into account? You would do it if you are rational. If you don’t, your rationality is suspect and I ought not to take your beliefs into account. Then suppose that you are rationally waiting to see if I take your beliefs into account . . .

  61. Adam (#62), clearly your scenario is a major reason that disagreement persists. However, while there is always going to be evidence that some people are less rational than I am, it probably is not rational for me to take that as my working assumption whenever I come across someone with a different posterior than me. In theory, given enough time I should be able to work out to a high degree of confidence whether or not someone shares my rationality. In practice, there are ways, however imperfect, to discover whether or not someone is willing to take my beliefs into account.

    More fundamentally, for me to be fully rational in the sense at issue, I would have to be willing to take your beliefs into account to some degree whether or not you were willing to take mine into account. You have had a different set of experiences than I have had and so your beliefs — assuming that you seem not to be completely irrational — contain some information that should prove useful to me if my objective is the truth. If the only thing that is preventing you from taking my beliefs into account is that you are waiting to see whether I will reciprocate, I may give less weight to your beliefs at first than I would otherwise, but as you see I have begun to incorporate your beliefs, you will start to incorporate mine and our disagreement will begin to vanish.

  62. Adam (#62): You are relaxing the “common knowledge of common priors” assumption, which is the same thing these guys are doing.

    While I’m here, let me take one more stab at giving an example where the asymmetric information about others’ beliefs bites:

    Round 1: Suppose I begin a discussion with person M who says something that I believe is true. I may rationally conclude both that M is relatively informed and, therefore, that M is likely right in this belief.

    Round 2: If in subsequent discussion I learn that M is a member of a fringe, cult-like religion (say, that believes in polygamy) which has a disproportionate share of relatively uninformed members (not irrational, just people with a lot of bad information) in it, then I may well reverse my beliefs in Round 1. If I now believe that M is most likely uninformed, and that most uninformed beliefs are wrong, it is conceivable that I now rationally question the belief that M expressed, even though that is what I originally believed.

    To formalize this example and relate it to Aumann’s framework would take a bit of work, but my conjecture (based on finance models where this kind of reversal of beliefs leads to asset pricing bubbles, speculation, excess volatility, and/or excess trading volume) is that the result can be generated by relaxing Aumann’s common knowledge assumption (not just by relaxing the rationality assumption).
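
    To put rough numbers on that reversal (a sketch; every probability below is invented):

    ```python
    # My belief that M's claim is right, as a mixture over whether M
    # is informed; between rounds only my estimate of P(informed)
    # changes, yet the conclusion flips.
    p_right_informed, p_right_uninformed = 0.9, 0.3   # hypothetical

    def p_right(p_informed):
        return (p_right_informed * p_informed
                + p_right_uninformed * (1 - p_informed))

    print(p_right(0.8))  # Round 1: M sounds informed       -> about 0.78
    print(p_right(0.2))  # Round 2: M's group is uninformed -> about 0.42
    ```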

  63. Adam (#62): Actually, rereading your post, it seems you’re proposing to relax not the “common knowledge of common priors” assumption, but the assumption of “common knowledge of rationality.” Aumann rules out this case. And I believe this confirms Ed’s critique: the reason I disagree with you is that I believe you are (or might be) irrational. So I don’t believe what you are suggesting eases the tension Ed is pinpointing. My example only marginally eases the tension—I may disagree with you, not because you are irrational, but because I believe you are uninformed. Toma(y)to, toma(h)to, you could argue, but to financial economists there is a big difference….

  64. Adam,

    I think your question is a clever one, and I too think it is closely related to the “common knowledge” assumptions that Aumann’s paper clarifies.

    But for purposes of this discussion, I think that it would work something like this: even if you think there is some probability that I am irrational or deluded, you would still give some weight to my beliefs in revising your own beliefs. So we would still expect people’s beliefs to move towards each other. As we continued to talk, the fact that we can see that we’re both taking each other’s beliefs into account might further reinforce the idea that we are both rational, and in the end we might come to have very similar beliefs.

    If, on the other hand, I refuse to budge and just suggest that we “agree to disagree,” you might conclude that there is something amiss (dishonesty, irrationality, lack of trust), and you’d give my beliefs little weight in revising your own beliefs. Then we’re back in the situation described in the original post.
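
    In a toy (and deliberately non-Bayesian) version of that back-and-forth, each round each person shifts partway toward the other’s stated belief; the trust weight and starting beliefs below are invented:

    ```python
    # Each round, each person moves a fraction of the way toward the
    # other's announced belief; the fraction encodes mutual trust.
    def converse(p, q, trust=0.4, rounds=6):
        for _ in range(rounds):
            p, q = (p + trust * (q - p),
                    q + trust * (p - q))
            print(round(p, 3), round(q, 3))

    converse(0.95, 0.05)  # the two beliefs meet near 0.5 in a few rounds
    ```

    With trust = 0 (a refusal to budge), the beliefs never move, which is exactly the standoff described above.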
