Seeking Meaning in an Irrational and Meaningless World

Albert Camus, a 20th-century existentialist, argues that the absurdity of life surfaces when humans experience an epiphany: the moment when they begin to contemplate and reflect on who they are, what kind of life they live, and whether any of it holds meaning. This conflict between our attempt to find meaning, coherence, and value in our lives and our inability to find that meaning in a world Camus believes is irrational and meaningless is what he calls the Absurd, for no matter how hard we think and search, we can never find meaning; the two ideals will never be harmonious. We often choose to distract ourselves from ever having to think about these kinds of questions. There is a social standard that governs many of us: from a young age, family, institutions, and the media tell us what kind of life is "normal" and what path we should take. The majority of humans live their lives very cyclically, following a script akin to going to college, getting a job, and having a family. These outside factors shape who we are and what kind of values we have (education, family, honesty, and so on); they create us. We are molded in a particular way that we feel we cannot break free from, so that once we hit a certain period of life, changing who we are, or changing our values at all, seems impossible because it appears as though we have already established who we are. Existentialists, however, would press back with "well, what is 'us'?", "what is the purpose of your life?", and essentially "do the things you are doing hold any meaning right now?", and this creates the absurd. We live our lives without much thought or consideration, but when prompted with these questions, we begin to realize the agency that we do have over our lives.
For some people who have endured traumatic experiences such as child abuse, depression, or abandonment, it is very easy to blame the world for ruining their lives. They do not take responsibility for bettering themselves, and come to believe that they are doomed simply to be unhappy or broken. It is easy to be comfortable with these kinds of thoughts, thoughts that the world set you up for failure and that you have no control over the kind of life you live, blaming it on outside factors such as family or the school system; but no one is forcing you to act, and no one is stopping you either.


At all times in our lives we are making decisions, and these decisions define us, but we also have the ability to change. This is where the absurd comes in: once we realize that everything and anything we do is, in the grand scheme of things, meaningless, what is holding us back from truly living our lives? That is why the realization of absurdity shows the world in its primordial hostility: any kind of meaning that we attempt to give our lives is not an actual truth but a justification for why we chose to do certain things. In actuality we have full agency over our lives but are made to believe that the universe controls how we are to feel, to act, and to live. It is our own responsibility to take control of our lives and make ourselves happy. Camus' essay "The Myth of Sisyphus" illustrates this directly. Sisyphus is condemned to push a boulder up a hill every day, watch it roll back down that exact hill, and do the same thing again tomorrow. In this situation, Camus explains, the one who realizes the absurd is given two choices: suicide or rebellion. The task is obviously torturous and heavily exhausting. Sisyphus could easily start asking "why do I have to do this?" and "what is the purpose of rolling this boulder up the hill every day?", but he does not. He realizes that he will never get an answer and, despite how bad his situation is, embraces the absurd. He is happy in spite of how horrible his life is. He chooses to be happy; he chooses rebellion, for the choice of suicide is the admission that one is unable to handle the world and the idea of full agency. Suicide is an escape from the absurd, not a solution to it. Only when we choose to embrace the absurd does life become worthwhile.
A cat does not experience the absurd because the purpose of a cat is simply to be a cat, just as the purpose of a flower is not to fertilize the soil or to reproduce into many flowers; the purpose of a flower is to be a flower. The same cannot be said of humans, because we have the innate ability to reflect and contemplate, and we are all very individualistic. We are governed by ideas of good and bad, we have dreams and goals, and we have regrets. We are often labelled a 'bad person' or a 'good person' when really we are just living as individuals. As time passes we grow, and the world has no right to judge us for changing, for taking back the control we felt we had lost, for claiming full agency over our lives. These two ideals conflict for the majority of our lifespan. For some, the radical freedom that Sartre speaks of is their hope; for others, the entire concept of radical freedom is their death: the fact that they have so much control and power over their lives becomes overwhelming to the point where they want to commit suicide, because they realize no one can save them. It is their own choice and decision to save themselves, just as Sisyphus makes the conscious decision to be happy, ultimately saving himself from the choice of suicide. Suicide, in and of itself, has no meaning. It cannot offer value to your life, because in death we no longer have the freedom to choose; we lose ourselves, and whatever life we wanted or dreamed of having can no longer happen. Embrace the meaninglessness: while our tasks hold no meaning, that does not mean life is not worthwhile, and the experiences one goes through, both good and bad, build value. It is the little things in life that matter, whether that be seeing the sun or the fresh feeling and smell after it rains. There is no afterlife and there are no second chances; there is only the now, and you have the ability to change that now for your future.
Nagel, on the other hand, also discusses the absurd, but he takes a different approach. Rather than seeing it as a conflict between one's self and one's environment or world, he locates it in an internal battle between our tendency to take life very seriously and the recognition that the things we take so seriously, such as values, occupations, and decisions, are not that serious in the long run. It is not uncommon to hear someone describe overthinking their decisions to the point where it feels as though they must calculate their actions to arrive at the "right answer" to their situation, but Nagel argues, essentially, to stop thinking so deeply about it. Whatever happens, happens. As human beings we are cautious not to mess up, but Nagel says that whatever we choose to do has no ultimate justification; the decisions we make for our lives are open and free, not set in stone, and yet we act as though they are. Take, for example, the moment we graduate from Grade 12. At such a young age (17-18), we take this decision far more seriously than Nagel proposes we should. We act as though this decision, the choice of whether to go to university and which university to attend, is going to make or break our lives. However, once we have made a decision, we are simply thrown into another situation in which we must make another choice. So was the choice of that university the right one? It remains open to doubt, for once we get into the university, we may realize that we hate this kind of life, that we hate the school. Nagel is proposing that in the moment these decisions seem very grand and big, but once we take a step back and look at them from afar, we realize how minuscule and unimportant they are.
Nagel differs from Camus in that Nagel believes we have the ability to view our lives from a universal standpoint, from a third-person perspective, and to realize how insignificant our beings are in the grand scheme of the universe. Nagel discusses how humans have the option simply to go back to their everyday, cyclical life and attempt to ignore the absurd; but trying to ignore or forget the absurd means the absurd already exists, so you cannot really ignore it, as it is part of your subconscious. Another option is to not take things so seriously, though Nagel is not saying that we should be irrational beings with no thoughts whatsoever; what makes us human is precisely that we are able to think for ourselves and act upon those thoughts, and we still have to be governed in some way or else chaos will ensue. The last option is to commit suicide, but like Camus, Nagel views suicide as no answer. Rather, Nagel says, things happen and we have to accept them; he offers the solution of facing the absurd with irony. For if life has no meaning and our values cannot be justified, then the absurd itself is simply a part of life and should not matter either. In fact, the absurd might even be viewed optimistically, in a sense, because it is unique to humans. It is not something we should be running from or fighting against, as Camus puts it; rather, the absurd is simply there. The absurd should not be taken so seriously, and neither should life. This goes back to the idea that any attempt to solve these problems by finding meaning in outside factors, such as a God, does not help; for Nagel, we just have to live and accept the absurd as part of life, not as something to be solved. Any notion of a good or bad person is simply a social construct during life, and once we die, none of it matters.
We all end up in the same exact place, and as such, ideas of heroism and despair are rendered meaningless, because in death there are no heroes or villains. Nagel reminds me of a quotation I once stumbled upon, something akin to "you will always be the villain in someone's story." In that spirit, Nagel suggests just living your life, because no matter what kind of decision you make or how deeply you think about it, you cannot and will not be a perfect person.
 

Do empirical studies of human reasoning show that humans are fundamentally irrational?

Prior to the 1970s, mainstream psychology of decision-making assumed people to be fairly good and reasonable statisticians (Edwards, 1966; Lopes, 1991, p. 66). Since then, however, this assumption has been gradually undermined. This shift was sparked by a series of empirical findings published by psychologists Daniel Kahneman and Amos Tversky as part of their heuristics and biases program (summarised in Tversky and Kahneman, 1974; see also Kahneman, Slovic and Tversky, 1982). They suggested that, in assessing probabilities, people rely on a limited number of rules of thumb, or heuristics, which reduce complex reasoning tasks to simpler, more intuitive judgmental operations. Drawing on this idea, several researchers from various disciplines have argued, in a pessimistic vein, that humans are fundamentally irrational.


Evaluating some of the heuristics-and-biases tradition's empirical findings will indeed reveal seemingly irrational patterns of reasoning (I). Nevertheless, I will contend that these results should be approached with scepticism, as they are ultimately embedded in an unwarranted and problematic picture of human cognition. Indeed, counterarguments and evidence advanced by evolutionary psychologists will show that many of the alleged cognitive illusions, or biases, proposed by Kahneman, Tversky and several of their colleagues can be avoided by adopting a more instrumental approach to rationality (II). Against these opposite and conflicting extremes, I will finally propose and defend a more moderate 'middle way', offered by dual-process theories (III).
(I)
In their widely cited articles and books, Tversky and Kahneman set out to describe and discuss how people make judgments under uncertainty. In doing so, they designed a series of thought experiments to reveal people's underlying reasoning processes (Tversky and Kahneman, 1974, p. 1124; McKenzie, 2005, p. 323). To better understand their work, it is useful to engage directly with some of their most notable experiments. In the famous Linda problem, Tversky and Kahneman presented a group of statistically naïve subjects with this simple personality sketch:
‘Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations’ (Tversky and Kahneman, 1982, pp. 84–98).
Participants were then asked to rank the following statements by their probability:
a) Linda is a bank teller (T)
b) Linda is a bank teller and is active in the feminist movement (T&F) (Tversky and Kahneman, 1982, pp. 84–98).
The overwhelming majority of subjects (89%) ranked the compound target (T&F) as more probable than the simple target (T). This, however, clearly violates the conjunction rule – the requirement that a conjunction cannot be more probable than either of its conjuncts. All feminist bank tellers are, by definition, bank tellers; a person cannot be more likely to be a feminist bank teller than just a bank teller (Tversky and Kahneman, 1982, pp. 84–98; McKenzie, 2005, p. 326). Drawing upon these results, Tversky and Kahneman posited that, when asked to estimate the probability that A belongs to B, or that B will generate A, people rely on the representativeness heuristic; that is, on the degree to which A is representative of, or resembles, B (Tversky and Kahneman, 1974, p. 1124; 1982, pp. 84–98). Accordingly, because the description of Linda is highly consistent with the stereotype of feminists but not of bank tellers, subjects replaced a correct probability judgment with this more readily available heuristic. However, because similarity is not a factor affecting probability, judgments based on representativeness are frequently biased (Tversky and Kahneman, 1982, pp. 90, 92–93; Newell, 2013, pp. 606–607).
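The conjunction rule can be verified with a toy enumeration; the population counts below are invented purely for illustration, since any assignment of counts yields the same inequality:

```python
# Toy population: the feminist bank tellers are a subset of the bank
# tellers, so the conjunction can never be the more probable event.
# All counts here are invented for illustration.
population = 1000
bank_tellers = 50          # everyone who is a bank teller (T)
feminist_bank_tellers = 5  # the subset who are also feminists (T&F)

p_t = bank_tellers / population
p_t_and_f = feminist_bank_tellers / population

# The conjunction rule: P(T&F) can never exceed P(T).
assert p_t_and_f <= p_t
print(f"P(T) = {p_t:.3f}, P(T&F) = {p_t_and_f:.3f}")
```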
Impressively, this pattern of reasoning – labelled the conjunction fallacy – has been found repeatedly not only in later, similar experiments, but also within groups with backgrounds in statistics and probability theory, at both intermediate and sophisticated levels (Tversky and Kahneman, 1982, pp. 92–93). Moreover, representativeness-based biases have also been reported in problems concerning the assessment of prior probabilities. In the well-known lawyers–engineers problem, two groups of subjects were presented with personality sketches of several individuals allegedly sampled at random from a group of 100 lawyers and engineers (Tversky and Kahneman, 1974, pp. 1124–1125). In one condition participants were told that the group comprised 70 lawyers and 30 engineers; in the other condition the composition was reversed. Both groups were then asked to assess the probability that a given personality sketch belonged to an engineer rather than a lawyer.
According to Bayesian reasoning, the provided base rates of lawyers and engineers should have influenced reported probabilities (Tversky and Kahneman, 1974, pp. 1124–1125; Samuels and Stich, 2004, pp. 4–5). However, Tversky and Kahneman observed that the subjects in the two conditions produced the same probability judgment. This indicates that participants systematically ignored base-rate information, relying instead on the degree to which a given description was representative of either lawyers or engineers. Interestingly, in the absence of descriptive material, prior probabilities were correctly employed. Nevertheless, these were yet again ignored every time a personality sketch was introduced – even when the sketch was completely uninformative and undescriptive of either lawyers or engineers (Tversky and Kahneman, 1974, pp. 1124–1125; Samuels and Stich, 2004, pp. 4–5).
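Bayes' theorem makes precise how the base rate should have mattered. In the sketch below, only the 70/30 split comes from the experiment; the likelihood ratio (how much more 'engineer-like' a given sketch reads) is a hypothetical figure chosen for illustration:

```python
def posterior_engineer(prior_engineer, likelihood_ratio):
    """P(engineer | sketch) by Bayes' theorem in odds form, where
    likelihood_ratio = P(sketch | engineer) / P(sketch | lawyer)."""
    prior_odds = prior_engineer / (1 - prior_engineer)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

lr = 4.0  # hypothetical: the sketch is four times as typical of engineers

# The same sketch should yield different judgments under the two base rates:
print(round(posterior_engineer(0.30, lr), 2))  # 30 engineers, 70 lawyers -> 0.63
print(round(posterior_engineer(0.70, lr), 2))  # 70 engineers, 30 lawyers -> 0.9
```

Subjects in both conditions reporting the same probability is exactly what this calculation says should not happen.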
Involving fairly obvious errors, base-rate neglect and the conjunction fallacy are perhaps the most interesting phenomena discovered in decision-making research. However, they are far from isolated. Proponents of the heuristics-and-biases approach have reported a huge number of empirical findings concerning popular fallacies in probabilistic judgment.[1] Notably, for example, Peter Wason's selection task (1966) seems to indicate that people are biased towards confirmation. In the experiment, Wason presented subjects with four cards bearing letters on one side (e.g. 'A' and 'K') and numbers on the other side (e.g. '2' and '7'). Two cards were displayed with the letter side up, two with the number side up. Participants were then asked to select just those cards that, if turned over, would show whether or not the following statement is true: 'if there is a vowel on one side of a card, then there is an even number on the other side'. Subjects mostly selected the A card alone, or the A and the 2 cards, rarely choosing the A and 7 cards. Yet, if the 7 had a vowel on its other side, the rule would be false. Drawing on these results, Wason concluded that people are biased towards confirmation and fail to see the importance of the falsifying card (Wason, 1968, as quoted in McKenzie, 2005, p. 328).
Against these upsetting results, one might argue that many of the reasoning problems explored in the heuristics-and-biases literature have little practical importance. Yet, worryingly, this does not appear to be the case (Lopes, 1991, pp. 78–81; Gigerenzer, 1991, p. 85). For instance, in a renowned study, Casscells, Schoenberger and Graboys (1978) presented the following reasoning task to a group of staff members and students at Harvard Medical School:
‘If a test to detect a disease whose prevalence is 1/1000 has a false positive rate of 5%, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person’s symptoms or signs? __%’ (Casscells, Schoenberger and Graboys, 1978, pp. 999–1000).
The authors found that, under the most plausible interpretation of the problem, the majority of their subjects neglected probabilistic reasoning. Only eighteen percent of the participants gave the correct Bayesian answer (2%), while a striking forty-five percent ignored the base-rate information and assumed the correct answer to be 95% (Casscells, Schoenberger and Graboys, 1978, pp. 999–1000). In this particular case, the base-rate neglect cannot be explained in terms of the representativeness heuristic. Accordingly, it seems plausible to argue, as Kahneman and Tversky did, that judgmental biases are widespread even beyond the laboratory's walls, making disquieting inroads into applied disciplines with real-world implications (Tversky and Kahneman, 1982, p. 154; Casscells, Schoenberger and Graboys, 1978, pp. 999–1000; Cosmides and Tooby, 1996, pp. 21–22; Samuels, Stich and Bishop, 2002, p. 240).
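The correct 2% answer follows directly from Bayes' theorem, using only the figures stated in the question (the most plausible interpretation also assumes a perfectly sensitive test, i.e. no false negatives):

```python
# Figures from the problem itself.
prevalence = 1 / 1000        # disease prevalence
false_positive_rate = 0.05   # healthy people who nonetheless test positive
sensitivity = 1.0            # assumed: the test never misses the disease

# P(positive) mixes true positives and false positives.
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# P(disease | positive) by Bayes' theorem.
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"{p_disease_given_positive:.1%}")  # -> 2.0%
```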
On their face, these results show that, in making intuitive judgements involving probabilities and uncertainties, people systematically deviate from appropriate statistical, mathematical and logical rules. Instead, they employ normatively problematic heuristics which, more often than not, lead to biases (Tversky and Kahneman, 1974, p. 1124). Thus, some researchers have painted a rather bleak image of human rationality, claiming that people repeatedly commit errors in probabilistic judgement because they have not evolved 'an intellect capable of dealing conceptually with uncertainty' (Slovic, Fischhoff and Lichtenstein, 1976, p. 174; Nisbett and Borgida, 1975, p. 935). Kahneman and Tversky themselves seem to endorse this pessimistic interpretation, arguing that 'people do not appear to follow the calculus of chance or the statistical theory of prediction' not just in some or many cases, but in all cases – including those in which they get the right answer (Kahneman and Tversky, 1973, p. 48; Samuels, Stich and Bishop, 2002, p. 241).
This pessimistic view has some weight to it. The empirical findings discussed above do indeed seem to demonstrate that the untutored mind makes use only of 'quick-and-dirty' heuristics. Nonetheless, I find such a conclusion contentious. One problem with the pessimistic interpretation is the yardstick against which proponents of the heuristics-and-biases tradition assess people's cognitive mechanisms. Adopting the so-called 'standard picture of rationality', they maintain that to be rational is to reason in accordance with the rules of classical logic and probability theory (Samuels, Stich and Bishop, 2002, p. 247). However, this assumption is problematic. Firstly, the concept of 'probability' is itself hotly debated. For instance, some argue that the rules of probability theory apply to single events, while others contend that they apply only to classes of events. If the latter camp is correct, this would invalidate many of the heuristics-and-biases experiments involving unique events (Cosmides and Tooby, 1996, p. 3; Chase, Hertwig and Gigerenzer, 1998, p. 207). Secondly, this classical interpretation of human rationality is content-blind: it assumes the laws of logic and probability a priori as normative, independent of problem context and of subjects' judgements about it (Gigerenzer, 2006, pp. 106, 121–122; Chase, Hertwig and Gigerenzer, 1998, p. 207). Such criticisms indicate that a re-evaluation of the criteria used to assess rationality is needed. Thus, in the following section, I will consider evolutionary psychologists' call for a more instrumental view of human cognition.
(II)
As mentioned, following the classical picture of rationality, proponents of the heuristics-and-biases approach define errors in reasoning as discrepancies between people's judgments and probabilistic norms. However, as evolutionary psychologists contend, these laws of logic are neither necessary nor sufficient for making rational inferences in a world of uncertainty; normative theories and their rules are relevant to people only in some contexts (Gigerenzer, 2006, p. 118; 1991, p. 86; Over, p. 5). Echoing the tradition of Simon's bounded rationality (1956), these authors therefore emphasise the relationship between mind and environment – the 'ecology' of rationality – and reject the 'cognition in a vacuum' of the heuristics-and-biases approach. In particular, given that the human mind has been shaped by evolution, Gigerenzer (1994) and Cosmides and Tooby (1996) suggest that researchers should present problems in a more 'ecological' way; that is, in a way that resembles humans' natural evolutionary environment. In such an environment, they insist, probabilistic information is framed in terms of natural frequencies – e.g. 'X will happen 3 out of 20 times' – rather than percentages (Cosmides and Tooby, 1996; Gigerenzer, 1994; 2006, p. 119).
This speculation – the frequentist hypothesis – has prompted several evolutionary psychologists to re-design some of Kahneman and Tversky's most famous reasoning tasks in terms of natural frequencies. For example, Fiedler (1988) proposed a frequentist version of the Linda problem phrased as follows:
There are 100 people who fit [Linda’s description]. How many of them are:
a) bank tellers
b) bank tellers and active in the feminist movement
In this version of the experiment, as Fiedler predicted, the conjunction fallacy was significantly reduced: only 22% of participants judged (b) more probable than (a) (Fiedler, 1988, as quoted in Gigerenzer, 1991, pp. 91–92). Cosmides and Tooby (1996) presented even more impressive results by running a frequentist version of Casscells, Schoenberger and Graboys's medical diagnosis problem. In contrast to the original findings, their version of the problem elicited the correct Bayesian answer from 76% of the subjects; base-rate neglect simply seemed to disappear (Cosmides and Tooby, 1996, pp. 21–22). These findings do not invalidate those produced by the heuristics-and-biases approach, but they do show that people's inductive mechanisms embody aspects of the calculus of probability. Crucially, these mechanisms are designed to take frequency information as input and produce frequencies as output: just as the frequentist school does, the untutored mind distinguishes between frequencies and single events (Cosmides and Tooby, 1996, p. 5; Gigerenzer, 1991, p. 104). Moreover, and more importantly, these results demonstrate that good judgment under uncertainty is more than mechanically applying a formula of classical logic or probability theory: in making decisions, intuitive statisticians must first check the problem's context and content (Gigerenzer, 2006, p. 119; Newell, 2013, pp. 610–613).
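The effect of the frequency format is easy to see if the diagnosis problem is recast as counting people, which is essentially what the frequentist versions do (the figures are those of the original problem, again assuming a perfectly sensitive test):

```python
# Natural-frequency recasting of the medical diagnosis problem:
# imagine 1000 people instead of probabilities.
people = 1000
sick_positives = 1                            # 1 in 1000 has the disease and tests positive
false_positives = round(0.05 * (people - 1))  # ~50 of the 999 healthy also test positive

total_positives = sick_positives + false_positives
share = sick_positives / total_positives
print(f"{sick_positives} of {total_positives} positives is actually sick ({share:.1%})")
# -> 1 of 51 positives is actually sick (2.0%)
```

Counting "1 sick person among 51 positives" requires no formula at all, which is Gigerenzer's point about why the frequency framing helps.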
This argument introduces and informs Gigerenzer's notion of an adaptive toolbox of fast and frugal heuristics (Todd, Gigerenzer, and the ABC Research Group, 2012). Briefly, he compares the mind to an adaptive toolbox containing specific heuristics, selected depending on the constraints of the environment and the goals of the decision maker. The emphasis is on heuristics that perform well, rapidly, and on the basis of a small amount of information (Gigerenzer, 2006, pp. 124–126; Goldstein and Gigerenzer, 2002). The following example illustrates the approach.
Which US city has more inhabitants: San Diego or San Antonio?
Goldstein and Gigerenzer (2002) posed this question to groups of students from the University of Chicago and the University of Munich. Sixty-two percent of the Chicago students inferred correctly that San Diego was larger; but, surprisingly, every single Munich student answered correctly (Gigerenzer, 2006, pp. 124–126). Goldstein and Gigerenzer explained the result through the operation of the recognition heuristic, which states that when you are faced with two objects and have heard of one but not the other, you should choose the one you recognize. Most of the Chicago students had heard of both cities and so could not rely on this heuristic; in contrast, the ignorance of the Munich students – very few had heard of San Antonio – facilitated their judgment (Gigerenzer, 2006, pp. 124–126).
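A minimal sketch of the recognition heuristic makes its logic concrete; the recognition sets below are hypothetical stand-ins for the two student groups, not Goldstein and Gigerenzer's data:

```python
def recognition_choice(option_a, option_b, recognized):
    """Return the recognized option when exactly one of the two is
    recognized; return None when the heuristic cannot apply (both or
    neither recognized) and other cues or guessing must take over."""
    a_known = option_a in recognized
    b_known = option_b in recognized
    if a_known != b_known:
        return option_a if a_known else option_b
    return None

# Hypothetical Munich student: has never heard of San Antonio.
munich = {"San Diego", "New York", "Chicago"}
print(recognition_choice("San Diego", "San Antonio", munich))   # -> San Diego

# Hypothetical Chicago student: recognizes both, so the heuristic is silent.
chicago = {"San Diego", "San Antonio", "New York"}
print(recognition_choice("San Diego", "San Antonio", chicago))  # -> None
```

The Munich students' advantage falls out of the first case: partial ignorance is exactly what lets the heuristic fire.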
Evolutionary psychologists' results and conclusions urge a reconsideration of the heuristics-and-biases tradition's pessimistic view. They demonstrate that, if mental tasks are posed in a more instrumental, ecological frame, then people do not deviate from appropriate norms of rationality. Much of people's reasoning is carried out by 'elegant machines' shaped to survive an intricate evolutionary environment. Moreover, as Gigerenzer argues, 'fast and frugal' heuristics are useful strategies insofar as they capitalise on evolved abilities and environmental structure to make smart inferences (Gigerenzer, 2006, p. 120). On this view, concerns about human irrationality are ultimately unsubstantiated.
(III)
Unsurprisingly, the debate between heuristics-and-biases proponents and evolutionary psychologists has received huge attention. Some have attempted to make the dispute disappear, claiming that there is no fundamental disagreement between the two sides (Samuels, Stich and Bishop, 2002). For instance, Samuels, Stich and Bishop (2002) note that the empirical findings of the heuristics-and-biases approach do not provide any compelling reason to think that people base their judgments only on normatively problematic mechanisms of reasoning. At the same time, evolutionary psychologists have offered no empirical proof that all reasoning and decision-making is carried out by normatively unproblematic 'elegant machines' (Samuels, Stich and Bishop, 2002, pp. 245–260). This argument, however, underestimates the extent of the differences between the pessimistic and optimistic views of rationality (see Kahneman and Tversky, 1996; Gigerenzer, 1991, pp. 101–103). Nevertheless, it does correctly suggest that the two approaches do not necessarily invalidate each other.
I have suggested that the fast-and-frugal approach has helpfully refocused questions of human rationality on the relationship between mind and environment. Sometimes, however, the necessary cues cannot be found in the external environment; in these cases, careful thought about the available information and its cognitive representation can help to overcome erroneous judgments. Moreover, as Evans and Stanovich (2013) note, both the heuristics-and-biases tradition and evolutionary psychologists largely neglect individual differences. After all, some participants in the heuristics-and-biases experiments do give the standard normative response, whereas some subjects in the experiments championed by evolutionary psychologists still commit fairly obvious errors (Evans and Stanovich, 2013, pp. 234–235).
Drawing on this consideration, proponents of dual-process theories have claimed that human reasoning and related higher cognitive processes – such as judgement and decision-making – are underpinned by two kinds of thinking: one intuitive, the other reflective. The former – Type 1 processing – is fast, automatic, holistic, largely unconscious, and makes minimal cognitive demands; the latter – Type 2 processing – is relatively slow, rule-based, deliberately controlled, and requires more cognitive capacity (Evans and Stanovich, 2013, p. 225). Further, Evans and Stanovich speculate that Type 1 processing, as evolutionary psychologists suggest, has been shaped by natural selection to make smart judgements based on the environment, whereas Type 2 processing developed more recently, is aimed at maximising personal utility, and is more influenced by culture and education. Accordingly, individual differences can be explained in terms of subjects' cognitive abilities: participants who are more practised in Type 2 processing will be more likely to find the correct answer, regardless of how the problem is framed (Evans and Stanovich, 2013, pp. 236–237; Stanovich, 1999, p. 150).
Although I am sympathetic to the evolutionary psychologists' argument for human rationality, dual-process theories provide a tenable and, in some respects, more persuasive 'middle way'. Reviewing and assessing the experiments proposed by the heuristics-and-biases tradition and by evolutionary psychologists has shown that people are inclined both to make errors and to reason in accordance with optimal information-processing models. Although very influential, each of these views ultimately oversimplifies questions of human rationality, failing to see the complexities of the mind and its mechanisms. In contrast, by accommodating both the pessimistic and the optimistic interpretation, dual-process theories move beyond the blunt question 'are people rational?', acknowledging that the mind is neither a perfectly working machine nor a fundamentally flawed one. On these considerations, researchers should abandon the 'monolithic' views proposed by the heuristics-and-biases and evolutionary approaches and focus instead on questions concerning the extent of human cognitive abilities and the specific reasoning processes at play under given conditions.
Reference List
Casscells, W., Schoenberger, A. and Graboys, T. B., (1978), ‘Interpretation By Physicians of Clinical Laboratory Results’, New England Journal of Medicine, Vol. 299, No. 18: pp. 999–1001.
Chase, V. M., Hertwig, R. and Gigerenzer, G., (1998), ‘Visions of Rationality’, Trends in Cognitive Sciences, Vol. 2, No. 6: pp. 206–214.
Cohen, L. J., (1981), ‘Can Human Irrationality Be Experimentally Demonstrated?’, Behavioral and Brain Sciences, Vol. 4, pp. 317–370.
Cosmides, L. and Tooby, J., (1996), ‘Are Humans Good Intuitive Statisticians After All? Rethinking Some Conclusions from The Literature on Judgment Under Uncertainty’, Cognition, Vol. 58, pp. 1–73.
Edwards, W., (1966), Nonconservative Information Processing Systems. Ann Arbor: University of Michigan.
Evans, J. St. B. T. and Stanovich, K. E., (2013), ‘Dual-Process Theories of Higher Cognition: Advancing the Debate’, Perspectives on Psychological Science, Vol. 8, No. 3: pp. 223–241.
Evans, J. St. B. T., (2008), ‘Dual-Processing Accounts of Reasoning, Judgment, and Social Cognition’, Annual Review of Psychology, Vol. 59, pp. 255–278.
Fiedler, K., (1988), ‘The Dependence of the Conjunction Fallacy on Subtle Linguistic Factors’, Psychological Research, Vol. 50, No. 3: pp. 123–129.
Gigerenzer, G., Hertwig, R. and Pachur, T. (eds), (2011), Heuristics: The Foundations of Adaptive Behavior. Oxford: Oxford University Press.
Gigerenzer, G. and Goldstein, D. G., (1996), ‘Reasoning the Fast and Frugal Way: Models of Bounded Rationality’, Psychological Review, Vol. 103, No. 4, pp. 650–669.
Gigerenzer, G., (1991), ‘How to Make Cognitive Illusions Disappear: Beyond “Heuristics and Biases”’, European Review of Social Psychology, Vol. 2, No. 1: pp. 83–115.
Gigerenzer, G., (1994), ‘Why the Distinction between Single-event Probabilities and Frequencies is Important for Psychology (and Vice Versa)’, in G. Wright and P. Ayton (eds), Subjective Probability. New York: John Wiley & Sons.
Gigerenzer, G., (2006), ‘Bounded and Rational’, in R. Stainton (ed.), Contemporary Debates in Cognitive Science. Oxford: Blackwell Publishing.
Gilovich, T., Griffin, D. and Kahneman, D., (2002), Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press.
Kahneman, D. and Tversky, A., (1973), ‘On the Psychology of Prediction’, Psychological Review, Vol. 80, pp. 237–251.
Kahneman, D. and Tversky, A., (1996), ‘On the Reality of Cognitive Illusions’, Psychological Review, Vol. 103, No. 3: pp. 582–591.
Kahneman, D., Slovic, P. and Tversky, A. (eds), (1982), Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.
Lopes, L. L. (1991), ‘The Rhetoric of Irrationality’, Theory & Psychology, Vol. 1, No. 1: pp. 65–82.
McKenzie, C. R. M., (2005), ‘Judgment and Decision Making’, in K. Lamberts and R. L. Goldstone (eds), Handbook of Cognition. London: SAGE Publications.
Newell, B. R. (2013), ‘Judgement Under Uncertainty’, in D. Reisenberg (ed.), The Oxford Handbook of Cognitive Psychology. Oxford: Oxford University Press.
Nisbett, R. and Borgida, E., (1975), ‘Attribution and the Social Psychology of Prediction’, Journal of Personality and Social Psychology, Vol. 32, pp. 932–943.
Over, D., (2007), ‘Rationality and the Normative/Descriptive Distinction’, in D. J. Koehler and N. Harvey (eds), Blackwell Handbook of Judgment and Decision Making. London: John Wiley & Sons.
Samuels, R. and Stich, S., (2004), ‘Rationality and Psychology’ in A. R. Mele and P. Rawling (eds), The Oxford Handbook of Rationality. Oxford: Oxford University Press.
Samuels, R., Stich, S. and Bishop, M., (2002), ‘Ending the Rationality Wars: How to Make Disputes About Human Rationality Disappear’, in R. Elio (ed.), Common Sense, Reasoning and Rationality. Oxford: Oxford University Press.
Simon, H. A., (1956), ‘Rational Choice and the Structure of the Environment’, Psychological Review, Vol. 63, No. 2: pp. 129–138.
Slovic, P., Fischhoff, B. and Lichtenstein, S., (1976), ‘Cognitive Processes and Societal Risk Taking’, in J. S. Carroll and J. W. Payne (eds), Cognition and Social Behavior. Hillsdale, NJ: Lawrence Erlbaum.
Stanovich, K. E., (1999), Who Is Rational? Studies of Individual Differences in Reasoning. Mahwah, NJ: Erlbaum.
Todd, P. M., Gigerenzer, G. and the ABC Research Group, (2012), Ecological Rationality: Intelligence in the World. Oxford: Oxford University Press.
Tversky, A. and Kahneman, D., (1982), ‘Judgments of and by Representativeness’, in D. Kahneman, P. Slovic and A. Tversky (eds), Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.
Tversky, A. and Kahneman, D., (1974), ‘Judgment Under Uncertainty: Heuristics and Biases’, Science, Vol. 185, No. 4157: pp. 1124–1131.
Wason, P. C., (1968), ‘Reasoning About a Rule’, Quarterly Journal of Experimental Psychology, Vol. 20, No. 3: pp. 273–281.

[1] Amongst the most frequently mentioned are overconfidence biases and anchoring and framing effects. For a complete account, see Kahneman, D., Slovic, P. and Tversky, A. (eds), (1982), Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.