moralobjectivity.net: copyright Robert M. Ellis 2011

Cognitive bias 

'Cognitive bias' is a general term for a number of ways identified by psychologists in which we often do not engage with the world as objectively as we might. Cognitive biases interfere with our understanding of the world by distorting our judgement in one way or another. There is an obvious overlap between cognitive bias and Middle Way Philosophy, as objectivity in Middle Way Philosophy is understood as including the overcoming of such biases. If Middle Way Philosophy is correct, cognitive biases should be explicable in terms of attachments to metaphysical beliefs, and can be overcome through the development of integration.

However, before we look at some examples of cognitive bias and how they can be understood in terms of Middle Way Philosophy, some differences also need to be noted between the approaches typical of empirical psychology, where these biases have been identified and tested, and those of Middle Way Philosophy. Wherever a failure in judgement is noted, a standard of good judgement needs to be assumed for comparison. In empirical psychology this standard is based either on independently verifiable facts or on accepted conventional facts, neither of which can be absolute, but both of which nevertheless assume the naturalistic model of absolute facts opposed to 'subjective' values. In Middle Way Philosophy, whilst science and convention can often provide us with helpful indicators, it is the avoidance of metaphysics and the adequacy of experience itself that provide the standard of objectivity, rather than any appeal to ultimate 'external' facts of any kind. Moreover, because facts are recognised as necessarily inter-related with values in the very meaning of our language, there is no distinction between 'cognitive bias' and 'affective bias': both stop us engaging with conditions in parallel ways. Discussion of cognitive bias thus needs to be supplemented by psychological study of emotional limitations such as failures in empathy.

Nevertheless, the huge volume of work done by empirical psychologists is generally applicable and helpful in understanding the limitations on our objectivity of belief. It helps to provide evidence to support Middle Way Philosophy by showing the many ways that objectivity (or its lack) is incremental and psychological in nature rather than consisting in verified theories that are assumed to have a one-to-one representational relationship with reality.

I think that all, or nearly all, cognitive biases can probably be explained as (a) dependent on a metaphysical belief, (b) sometimes resolvable in the short term through awareness of the bias, and (c) always resolvable in the long term through integration. The awareness and the integration here are interdependent: one needs initial awareness that one has a bias in order to be able to address it, but to stop it arising unconsciously one needs a longer-term integration that addresses the conditions in which the cognitive bias develops. On the other hand, continued attachment to the metaphysical belief at the basis of the bias entrenches it, and prevents integration.

There is a very helpful list of cognitive biases on Wikipedia here. Many of these phenomena overlap, but have been given different names by different psychologists along with slightly different explanatory theories. Most of the ones listed on Wikipedia can be readily explained in terms of metaphysics and integration, so I am happy to respond to queries on specific ones not mentioned below. Here I will just give an indicative list of some of the more important cognitive biases that have been identified, with their metaphysical root and integrative resolution. All the names of the biases given as sub-headings below are linked to Wikipedia articles that explain them further and give references.

Actor-observer bias 

We tend to explain our own behaviour in terms of the demands of the situation, and others' behaviour in terms of their character. Here the metaphysical assumptions involved are those of freewill and determinism, although both may be found on either side. If we are attempting merely to describe actions without a weight of negative group-judgement attached to them, we tend to see ourselves as responding rationally to a situation and others as responding in predictable ways according to character. If we were to recognise the effects of character on ourselves here it would interfere with the belief that we have a choice, and if we were to recognise the effects of situations on others we would also have to recognise that they too reason and make choices in relation to situations. However, when negative moral judgement is involved, we tend to see ourselves as determined by a situation in order to avoid a negative judgement from the group ("I had to do it - I had no choice"), but others as responsible for their character, which led them to act as they did (e.g. the belief that a murderer is evil). Here we need to integrate our beliefs about character with our beliefs about situations so as to treat them in a more consistent way, and a decisive avoidance of implicit beliefs in freewill or determinism will help with this process.

Anchoring

This is the tendency to make judgements in dependence on a specific 'anchoring' focus on a fact or type of fact. Any metaphysical belief can be used as an 'anchor' in this way. For example, an anchoring belief in the existence of God can make us focus on types of event that show God's care or God's punishment, and neglect events that do neither of these. An 'anchor' here appears to be another name for a habitual identification due to conditioning. To integrate the anchor with other possible starting points we need to extend meaning, which may mean a process of learning in which others point out alternative approaches. 

Attentional bias

Here an emotionally more powerful identification leads us to pay far more attention to some factors than others, for example not considering counter-examples or negative results in experiments, but only the relationship between preferred belief and the evidence that directly supports it. Here a metaphysical belief of some kind supports the exclusive identification, e.g. the absolute value of one outcome. The integration needed to resolve this is probably at the level of desire, and involves the person extending their awareness systematically to other areas of experience (e.g. in meditation).

Attribute substitution

We tend to avoid difficult and complex judgements by unconsciously substituting easier ones: for example, judging a person according to stereotypes, or tackling a difficult maths problem by using an inappropriate but more familiar and simpler method. This type of cognitive bias can help to explain why any type of metaphysics is favoured as a basis for judgement. Metaphysical beliefs are simple, universal and catch-all, so are much easier to call upon than complex assessments based on experience. To deal with this tendency we need to integrate our desire for a quick and easy answer with our desire for an accurate answer that addresses conditions, and/or our belief that there are easy metaphysical answers with our recognition of their deceptiveness.

Availability cascade

A new idea becomes rapidly popular ('goes viral' in internet parlance) because of its increasing accessibility and the group pressure to adopt it. This tends to overwhelm critical scrutiny of the idea. This mechanism helps to explain both the links between any metaphysical belief and its supporting group, and the ways in which opposing metaphysical beliefs can rapidly replace one another. A belief adopted in this way cannot be provisional because of the way in which it is adopted, which precludes scrutiny. This mechanism can be avoided by developing individuality and critical thinking skills so as to integrate beliefs more effectively.

Belief bias

Here, the process of reasoning is undermined by a tendency to accept or reject the conclusion regardless of its validity based on the premises. This seems to be an example of a metaphysical defence mechanism, because it prevents reasoning based on experience from upsetting belief in a metaphysical claim that is already accepted. Experiential claims depend for their justification on such links of reasoning, so they cannot be served by the belief bias, only undermined by it. Metaphysical conclusions that are supported by metaphysical premises, on the other hand, are unaffected, because the conclusion is held either way, whether or not the reasoning actually produces it.

Belief disconfirmation paradigm, to avoid cognitive dissonance

Cognitive dissonance is created by inconsistent beliefs, and we have a drive to reduce this inconsistency, which is obviously one of the motivators of reasoning. So cognitive dissonance can be very useful to us. However, the drive to reduce cognitive dissonance is not necessarily satisfied by reasoning that brings about consistency, but can alternatively be tackled just by ignoring or denying information that creates conflicts with existing accepted beliefs. Through denial, we may not even see things that are dissonant in this way. Another technique that avoids cognitive dissonance is ad hoc reasoning (also known as 'moving the goalposts'), where the original information is re-assessed in a way that merely defends the original belief, as in the re-interpretation of failed prophecy. The defensiveness of these cognitive biases can be employed to support any kind of metaphysical claim, and is associated with metaphysical claims rather than provisional ones, because provisional claims do not need to be defended in this way.

The solution to cognitive dissonance is the use of critical reasoning either to identify what the conflicting beliefs have in common, or to show the justifiability of one and the lack of justifiability of the other. This is a dialectical process, rather than one of rejecting or denying one belief immediately without considering evidence. In this way conflicting beliefs can be integrated, rather than one merely being identified with and subjugating the other.

Clustering illusion

This is the tendency to find patterns in phenomena that are no more than random or otherwise lack the significance attributed to them. A harmless form of this is merely projecting imaginative significance onto forms, such as seeing dragons in the clouds (indeed this could be seen as a positive extension of meaning). However, a cognitive bias can obtrude when this projection is given a metaphysical status, i.e. taken to be the 'real' message of the phenomena. For example, numerological interpretations try to find hidden messages from God in the apparently insignificant numerical construction of the Bible, such as the numbers of chapters and verses in each book. The clustering illusion can also be detected in the design argument for the existence of God, which attributes divine significance to patterns that could otherwise be interpreted as the result of random processes (e.g. through evolutionary theory).
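
It is worth noting how readily randomness alone produces apparent patterns. A short sketch in Python (purely illustrative, with an arbitrary random seed) shows that sequences of fair coin flips routinely contain runs that look too orderly to be chance:

    import random

    # 200 fair coin flips. Long runs of heads or tails look 'too patterned'
    # to be chance, but arise in almost every sequence of this length.
    random.seed(4)
    flips = [random.choice("HT") for _ in range(200)]

    longest, current = 1, 1
    for previous, latest in zip(flips, flips[1:]):
        current = current + 1 if latest == previous else 1
        longest = max(longest, current)
    print("".join(flips[:40]) + "...")
    print(f"Longest run in 200 fair flips: {longest}")
    # Runs of around seven occur here purely by chance, yet they are exactly
    # the sort of cluster we are tempted to find significant.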

However, in applying the Middle Way to this instance of cognitive bias, I think there should also be a recognisable cognitive bias involved in denying such significance, as opposed to merely affirming it. For example, we cannot deny that a person may experience God's purpose in the world around them: the metaphysical move, and the cognitively harmful bias, arises either in affirming or denying a metaphysical status for that experience (see metaphysical agnosticism).

So, the clustering illusion needs to be treated carefully if we want to integrate the beliefs it tends to give rise to. If we experience significance in a given pattern that others think is 'merely random', we should accept that significance. However, we also need to stay with that experience as an experience rather than over-interpreting it as providing truths about the universe. Meditation can provide helpful practice in observing such experiences and their significance for us, without attaching unnecessary labels to them.

Confirmation bias

Confirmation bias is the tendency to look for information that confirms your existing beliefs, rather than neutral or falsifying information. It leads to the under-reporting of negative results in scientific research, and leads those considering a particular viewpoint to concentrate on arguments from viewpoints that favour it. For example, how many theists read as much work by atheists and agnostics as by theists? Scholarliness is another approach that is very subject to confirmation bias. Diligent scholars can give you hundreds of references to prove their point, and then be baffled as to why you should possibly still reject such overwhelming authority: but all the authorities quoted have been selected to support the scholar's position. Even when making an everyday observation (e.g. "John is lazy") we do not usually compare John's degree of laziness with other possible degrees of observable laziness, but just take characteristics of laziness by themselves to prove the point - but perhaps John is a lot more industrious than most of his peers.

Confirmation bias can be seen as part of the mechanism by which any metaphysical view is maintained. Given that we tend to seek evidence that confirms a metaphysical view because it can be easily interpreted in its terms, the view remains unchallenged. Opposed views may be given cursory consideration, but not with a degree of detail or seriousness that allows challenges to really be considered, because the nature of the view is such that alternatives become threatening and a move away from it would be discontinuous. Provisional views, however, are constantly checked against alternatives which may supplant them if they fit experience better.

Confirmation bias shows a lack of integration, because the potential for other beliefs is held opposed to the belief we identify with. Psychologists investigating polarisation have found an association between publicly declaring a position and the development of opposed perspectives on the same evidence amongst different groups. This suggests that the more we identify with a group, the more we need to reject an opposing set of beliefs, reinforced with confirmation bias, to reinforce our group identity. Confirmation bias can thus be tackled by integrating the beliefs of groups at a social level and/or the beliefs of individuals, who can be trained in critical thinking and/or rigorous use of scientific method to give just as much attention to opposing evidence or opposing views. In the long term this is the best way of supporting a view that one identifies with: make it address the conditions better!

Disregard of regression towards the mean

We tend to assume that exceptional activity will continue rather than falling back to a pattern that is more consistent over the longer term. For example, after a period of weeks of exceptionally hot weather, I expect more hot weather and am surprised when the temperature falls. A temperature that is actually normal for that time of year in my section of the globe feels cold for a while as a result. The 'gambler's fallacy' is another example of this: gamblers who have had a run of luck expect it to go on. The metaphysical belief involved here is that of scientific relativism: we absolutise the conclusions drawn from a localised pattern and assume it is universally and eternally the case for us (even if we theoretically recognise different patterns existing elsewhere), instead of considering it provisionally in relation to patterns in other times and places. The solution is to integrate our universal beliefs with our relative ones through a process of critical investigation.
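
The statistical point can be illustrated with a short simulation (a minimal sketch, with an invented norm and noise level rather than any real weather data). In a noisy process with a stable long-run average, the observations following an exceptional run tend to sit close to that average, simply because the exceptional values were partly a matter of chance:

    import random

    # Daily temperature modelled as a seasonal norm plus random noise.
    # The norm and noise level are illustrative assumptions only.
    random.seed(1)
    norm = 20.0
    days = [norm + random.gauss(0, 3) for _ in range(100000)]

    # Compare all days with the days immediately following a very hot day.
    after_hot = [days[i + 1] for i in range(len(days) - 1) if days[i] > norm + 5]
    print(f"Average temperature overall:        {sum(days) / len(days):.1f}")
    print(f"Average on the day after a hot day: {sum(after_hot) / len(after_hot):.1f}")
    # Both averages come out near the norm of 20: the exceptional heat does
    # not persist, which is regression towards the mean.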

Hindsight bias

This is the tendency to believe with hindsight that we knew how things were going to turn out: a belief that is not supported by any corresponding accuracy of prediction before the events. This cognitive bias is closely related to the metaphysical belief in determinism, that all events are inevitable and thus could have been known in advance. Even if we don't claim to have actually known in advance that certain events were going to happen, hindsight bias makes us feel that in some sense we should have known, had we had enough information or insight - the illusion perpetuated in determinism. Hindsight bias shows a lack of integration of belief between different times, because we are unable to acknowledge the ways in which our past beliefs were different from our present ones, but insist on imposing the present on the past. We have to acknowledge our past beliefs as different through critical investigation before we can genuinely integrate them with present ones.

Hyperbolic discounting

Hyperbolic discounting is the tendency to value benefits that are closer in time over those that are further away in time, so that $1 today is worth more to me than $1 in a year's time (regardless of inflation). This 'mistake in moral mathematics', as Derek Parfit calls it, is one of the difficulties of a utilitarian approach to moral reasoning, in which we must weigh up benefits that are the consequences of actions against each other in order to judge the actions. Whenever we weigh up benefits, our judgement is distorted by a preference for having them sooner, even though the benefits themselves are not experienced any differently later. Jeremy Bentham formalised this hyperbolic discounting by including propinquity, or nearness in time, as one of the aspects of pleasures we should consider in his moral calculus, but he was unable to give any further justification for why an imminent pleasure should be considered better than a distant one. It amounts to a metaphysical assumption that is made in consequentialist calculation.
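
The shape of this bias can be made concrete. On the commonly used hyperbolic model (sketched below with an arbitrary illustrative discount rate, not an empirically measured one), a benefit of amount A delayed by time D is felt to be worth roughly A / (1 + kD), so its felt value drops steeply at first and then flattens out:

    # Hyperbolic discounting: felt value of an amount after a delay in years.
    # k is an arbitrary illustrative discount rate, not an empirical estimate.
    def felt_value(amount, delay_years, k=1.0):
        return amount / (1 + k * delay_years)

    for delay in [0, 0.5, 1, 2, 5]:
        print(f"$1 in {delay} year(s) feels worth about ${felt_value(1, delay):.2f} now")
    # The dollar itself is no different when it arrives; only our present
    # valuation of it shrinks, and shrinks fastest for the nearest delays.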

The lack of integration involved is very clear here, because it consists in a privileging of desires that we can more closely relate to our present experience over those in the future. This reflects the demands of the ego, insisting that our current desires and views are the only ones and failing to identify with those of other times. The process of integration here depends on extending our identification of ourselves to different times, which might be done through reflection and/or meditation. Using Kantian deontological moral theory as a device to challenge the weaknesses of the utilitarian approach is also a good moral strategy for overcoming this cognitive bias in moral decision making (see 'What is Buddhist Ethics?')

Illusion of control

This is the tendency to believe that we are in control of events when our actions have a direct input into them, even though those actions may not be significant in determining such events. For example, when playing a lottery, being able to choose the number makes people feel that they have control over their chances of winning, even though the chances of winning with a chosen number are no greater than with a random number. Similarly, people tend to believe that there is less chance of an accident when they are driving than when they are in the passenger seat. This cognitive bias can be related to the metaphysical belief in freewill, which imposes a belief in our control over all our actions because we are making choices about those actions, without taking into account the conditioning affecting the choices, the actions and the outcomes.
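
The lottery example can be checked directly by simulation (a sketch with made-up lottery parameters): whether a number is deliberately chosen or randomly assigned makes no difference to its chance of being drawn:

    import random

    # A toy lottery: one winning number from 1 to 1000 in each draw.
    # 'Choosers' always play their lucky number 7; others take a random ticket.
    random.seed(0)
    draws = 200000
    chosen_wins = sum(1 for _ in range(draws) if random.randint(1, 1000) == 7)
    random_wins = sum(1 for _ in range(draws)
                      if random.randint(1, 1000) == random.randint(1, 1000))
    print(f"Win rate with a chosen number: {chosen_wins / draws:.3%}")
    print(f"Win rate with a random number: {random_wins / draws:.3%}")
    # Both rates hover around 0.1% (1 in 1000): choosing confers no control.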

Here the belief that we are in control needs to be integrated with the belief that we are not in control, probably associated with active and passive desires. We should be able to do this through critical reasoning together with a basic level of scientific knowledge about the conditions we encounter.

Illusory correlation

This is the tendency to correlate easily identifiable, available or unusual categories, and to have exaggerated beliefs about the extent of the correlation, based on limited experience. For example, an elderly white woman burgled by a black man might subsequently think of all black men as potential burglars. This involves metaphysical belief because it involves an absolutising of beliefs that may be correct in relation to certain limited times and places, but are not absolutely or universally the case. This can be seen in relation to moral beliefs when a moral rule that may be relevant to one situation is adhered to as an absolute: e.g. Jewish dietary rules, which seem to have had the original purpose of distinguishing the Israelites as a tribe from other tribes, continue to be applied by orthodox Jews when this distinction no longer serves the same function. Like any identification with an absolute position, this cognitive bias involves a lack of integration between identification with the absolute belief and other possible identifications, with denial of experiences that do not fit (see belief disconfirmation paradigm above). A process of critical thinking and sceptical enquiry is probably the primary means by which integration can be achieved here.
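
The statistical error here can also be stated precisely: judging whether two categories are actually correlated requires all four cells of a contingency table, not just the one memorable case. A sketch with invented counts:

    # Invented counts for illustration only. A single vivid case fills one
    # cell of the table; the correlation depends on all four.
    counts = {("group A", "burgled"): 1, ("group A", "not burgled"): 999,
              ("group B", "burgled"): 1, ("group B", "not burgled"): 999}

    for group in ("group A", "group B"):
        total = counts[(group, "burgled")] + counts[(group, "not burgled")]
        rate = counts[(group, "burgled")] / total
        print(f"{group}: burglary rate {rate:.3%}")
    # With these figures the rates are identical, so there is no correlation
    # at all, however memorable the single case from one group may be.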

Impact bias

This is our tendency to overestimate the strength and/or duration of future feelings, whether good or bad. In general we recover more quickly from loss, and get bored more quickly with a new source of satisfaction, than we think we will. Like hyperbolic discounting above, this involves a metaphysical assumption created by a lack of integration between different identifications at different times. However, here, instead of giving preference to our current identifications, we idealise future ones and identify with those at the expense of present ones. In this way, we give less credence to current pains and pleasures and exaggerated credence to the impact of future ones. The ultimate version of impact bias is the belief in eternal bliss or eternal damnation after death, in absolute contrast to the short-lived experiences of pleasures and pains we have in this life. As with hyperbolic discounting, both meditation working directly with desires, and Kantian moral reasoning dealing with moral principles, might help integrate opposed identifications with different times here.

Ingroup bias

Ingroup bias is the tendency to favour our own group over those outside it, both in preferential treatment and in greater credibility given to the beliefs of the ingroup. This has been shown by empirical psychologists to occur even in randomly created groups where there was no other reason for favouring an in-group. Ingroup bias is part of the mechanism of metaphysics, by which group beliefs become a badge of identification for a group. Ingroup bias research shows that individual judgements are (consciously or unconsciously) subjugated to the group through adherence to group beliefs. This could mean that alternative views are never entertained, that they are prejudicially rejected when considered, or that they are accepted secretly but not revealed to the group for fear of losing one's place in it. Any view can be held dogmatically by a group in this way, discouraging individual investigation or consideration, but metaphysical beliefs lend themselves to this process by being apparently immune to disproof. More empirical research needs to be done to establish (or falsify) the links between metaphysical beliefs specifically and ingroup bias in an experimental context.

The lack of integration involved in ingroup bias is obvious at a social level: it leads to conflict between groups, or between groups and transgressive individuals. However, it also applies at a psychological level, because our feelings of wishing to conform to the group tend to conflict with justified beliefs based on experience. We can only overcome this lack of integration in the long term by making group beliefs increasingly provisional and subject to evidence. Strongly-rooted drives here seem to be in conflict with our investigatory intelligence, but we can still integrate the two by recruiting the energy of those drives towards increasing the adequacy of group beliefs. A group of research scientists or a critical thinking class could each provide examples of groups where, despite some continuing ingroup bias, there is a strong attempt to use the resources and energies that individuals give to a group to actively undermine dogmatic group positions.

Information bias

Information bias is the tendency to prefer more information as an end in itself, even when this is irrelevant to our practical judgements. This is directly related to metaphysical belief, as metaphysical belief consists in assumed information that cannot be relevant to our practical judgements, even though we have a tendency to assume that it will be. For example, if we are making a judgement about a person's responsibility for a blameworthy action, we may believe that their freewill is relevant to the judgement, but this gives us no further information than that offered by experience of their character, behaviour, reasoning abilities etc. Whilst judging them on the basis of experience, we use freewill as a further false justification for allocating responsibility. We can overcome information bias by integrating our practical beliefs more fully with our cognitive construction, so that we are neither creating a cognitive construction that goes beyond practical experience, nor adopting an unduly narrow understanding of practical needs.

Introspection illusion

This is the tendency to believe that our own introspection is a more reliable guide to the causes of our choices and behaviours than are the outward observations of others. This belief is inconsistent with the way we treat others' introspections, which we tend to see as unreliable. We tend to believe that our own motives for choices are transparent to us, even when we have constructed our account of those motives with the benefit of hindsight. This enables us to believe that we are rational and virtuous in ways that others are not, because we see our own behaviour as motivated by clear, justified reasons, even when it may well have unconscious causes, and the behaviour of others as irrationally caused, even when it may have been carefully considered.

Introspection illusion is closely associated with metaphysical beliefs about true sources of knowledge known introspectively. The most obvious example of this is that of Descartes, who believed that what he "clearly and distinctly" perceived must be indubitable. Cartesian assumptions have motivated many subsequent philosophers to believe that either direct internal experience or a priori reasoning (which Descartes called "clear and distinct") must provide certainty in contrast with uncertain external observations - for example, Husserl and the phenomenologists on unmediated internal experience, or the analytic tradition on a priori reasoning. Meditators, mystics and other religious figures have also often claimed that their inner illuminations give them certainty about God or about universal truth, of a kind that external experience could not provide. In both these kinds of cases, the clarity and/or power of an internal experience has been mistaken for a source of absolute truth, even though the power or clarity of internal experience or reasoning bears no necessary relationship to the truth of assertions justified through it. Metaphysics is thus often justified in ways that illustrate the introspection illusion.

The introspection illusion can be overcome by integrating the beliefs that we associate with introspection with those we associate with outward observation. This involves extending our awareness to try to understand ourselves from the point of view of others and others from the point of view they have of themselves, in addition to (not instead of) our personal experience of authoritative introspection. Obviously communication may help with this process.

Just-world hypothesis 

This is a tendency to assume that the world is just, in order to avoid the cognitive dissonance (see above) created by injustice, to maintain a sense of security and avoid anxiety. The just world hypothesis leads people to blame victims for their suffering and praise those who are merely fortunate, preventing us from recognising conditions that do not accord with the belief that the good always prosper and the bad always suffer. There is some further discussion of this hypothesis in thesis2a (scroll down to section v).

The Just world hypothesis has strong links with the metaphysical belief in cosmic justice, discussed in A Theory of Moral Objectivity as one of the characteristic beliefs of eternalism. For example, the Just world hypothesis motivates belief in heaven and hell as reward or punishment for sins on earth, belief in karma and rebirth or reincarnation, and belief in a just outcome in history (such as the American 'Manifest Destiny' or the Israelite belief expressed in the Old Testament that the invasion of Israel by Assyrians and Babylonians was God's punishment for their lack of faith). In philosophy, it also motivates doctrines such as Schopenhauer's 'Principle of Sufficient Reason' (following Leibniz). However, the Just world delusion does not need to take such grand forms. If we simply feel that we should have been given that job that we clearly deserved, failing to comprehend the reasons why we didn't, the Just world delusion may be at work.

The solution to the Just world hypothesis is to integrate our beliefs about observable events with our desire for favourable outcomes. The Just world hypothesis is just extended wishful thinking, but if we take enough notice of what observation actually tells us, whilst recognising the idealising energy we invest in Just world fantasies, we might be able to start investing those energies in making the world more just, as opposed to pretending that it is already. A decisive rejection of Cosmic Justice beliefs can only help with this, even though we can still accept and enjoy the meaningfulness of Just world fantasies as we meet them, say, in Dante's Divine Comedy or the Buddhist Wheel of Samsara ('Wheel of Life').

Moral luck

Moral luck is our tendency to make moral judgements that depend, not just on actions that are the result of choices, but on events, circumstances or shaping influences that are not under our control. For example, we blame people who do things that are a danger to others (like reckless driving) much more when they actually result in harm, and we praise able students for success that is largely a result of intellectual capacities that have been genetically inherited. The assumption of moral luck is one factor that makes the analytic style of ethics (which appeals to convention as the only moral ground) metaphysical and nihilistic, because it takes a conventional view that includes moral luck as the basis of judgement when we cannot justify it through our moral experience (including experience of developing integration). Moral luck is also related to the actor-observer bias (see above), because we tend to discount mitigating circumstances that apply to others when holding them responsible, whilst exaggerating those that apply to ourselves. To address this, we need to integrate our desire for social acceptance (which drives us to accept the conventional blame of the unlucky) with our desire for individual control and freedom (which leads us to the rejection of moral luck).

Neglect of probability

We tend to neglect probability when acting in situations of uncertainty, but make our decisions based only on an absolute emotional response to a risk or a desired outcome. For example, players are attracted to lotteries by large prizes at extremely low odds, and people change their behaviour much more in response to risks that have been brought to their attention rather than ones that continue in the background (e.g. a well-publicised rapist makes some women afraid to walk the streets at night, even though the overall probability of an attack has not changed significantly). The neglect of probability is metaphysical because it involves thinking in absolutes rather than increments: probabilistic thinking is incremental and absolute thinking is metaphysical. Any kind of integration can help us to pay more attention to probability by reducing the unreflective power of the thoughts of one moment and helping us see them in the wider context of other thoughts.
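
The incremental alternative is simply to weigh each outcome by its probability. A minimal expected-value sketch (with an invented prize, odds and ticket price, not those of any real lottery):

    # Invented lottery figures for illustration only.
    prize = 10_000_000        # jackpot in dollars
    p_win = 1 / 14_000_000    # probability of winning the jackpot
    ticket = 2                # ticket price in dollars

    expected_return = p_win * prize
    print(f"Expected return per ticket: ${expected_return:.2f}")
    print(f"Expected loss per ticket:   ${ticket - expected_return:.2f}")
    # The absolute size of the prize dominates our emotional response, but
    # weighted by probability each ticket is expected to lose money.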

Self-serving bias

Self-serving bias is the tendency to evaluate all information in a way that supports our own interests. Ambiguous information is interpreted in accordance with our own interests, and self-serving information is selected to the exclusion of opposing information. We take credit for success but make excuses for failure. The metaphysical view here is obviously one that emphasises the importance of ourselves or our own identifications - although we do not always identify with ourselves exclusively, many people have a tendency to do so. Self-serving bias attributes an absolute value to a narrow conception of my own interests, whereas if this conception of my interests were extended and integrated with other opposing conceptions, my own longer-term interests would be better addressed, because they would address more conditions.

Status quo bias / system justification

This is the tendency to support the existing established system or status quo over alternatives, even when an alternative may be clearly in our interests. For example, this accounts for conservatism amongst the poor, some of whom prefer the existing social and economic system over any radical changes that may be proposed, even though it works to disadvantage them. System justification can be related to conservatism as a metaphysical ideology, which gives absolute value to tradition regardless of its usefulness, whether in a political, social, or religious context. To overcome system justification we need to integrate our identification with the past and present with our identification with future ideals, using ideals to influence present policy realistically through gradual change.

Stereotyping

Stereotyping is the tendency to assume that individuals in groups other than our own have features that we take to be typical of that group, regardless of individual differences. It is related to argument from anecdote and the problem of induction, because it involves over-generalising from limited examples. Stereotyping is due to interpreting our experience in terms of expectations, which take an absolute form, rather than taking in new incremental information from experience. In a sense every metaphysical view is a stereotype, because it consists in an absolute belief that can only be built on relative foundations, when all our experience is relative. We can overcome stereotyping by integrating our wider generalising beliefs with our specific observations. We do not need to go to the extreme of rejecting all generalisations, but we do need to keep all generalisations provisional and subject to revision in the light of new information.

Subjective validation / Forer effect

Subjective validation is the tendency to believe that two unrelated phenomena are connected because the belief in a link fits into our existing beliefs (closely related to confirmation bias, see above). The Forer effect is a specific example of this, found in people's tendency to believe that a vague or general character description that fits their beliefs about themselves was specifically written about them - as in horoscopes and fortune telling. Subjective validation tends to reinforce metaphysical beliefs because, as in confirmation bias, we tend to interpret the world around us in terms of those beliefs. In the particular case of the Forer effect, a deterministic belief either about our character or about our future fate is reinforced by a tendency to interpret ambiguous information in its terms. We can overcome subjective validation through the practice of critical thinking, which integrates our existing beliefs with new ones acquired through new experience or information, by examining the justification of the existing beliefs in the light of the new ones.

 

Links to related pages

The Middle Way and Psychology

Integration - concept page

Metaphysics - concept page

 
