Final Address – Bar Practice Course 81 – 2 May 2024
Congratulations on completing the Bar Practice Course. You are about to be released, like someone I sentence who has served sufficient time in custody and is given today as a parole release date. I try to explain things that will help them: a GP mental health plan, a parenting course, voluntary work as a precursor to paid employment, and some other tips. I know that most of what I am saying is not being absorbed because all the person can think about is getting out and going home.
Tonight is not an evening for instructions about pleadings, ethics, or how to issue frame in written and oral submissions. I wrote a paper[1] on that last topic and I commend it to you. You already have been told by others that honesty, candour, and your higher duty to the administration of justice are more important than a fleeting advantage by cutting some ethical corner and misleading a court by omission. Your reputation is hard-earned: decades of toil in the making, culminating in this intense course. All of that can be lost in a moment of misjudgement.
Realise that the best lawyers were not born into legal dynasties. Chief Justice Susan Kiefel went to Sandgate High and left school at 15 to become a secretary. Chief Justice Stephen Gageler went to a one-teacher school in the Hunter Valley and was introduced to law by a hobby farmer who was a barrister. Lord Atkin’s father was not Lord Atkin. He was a journalist and progressive politician, who was training to be a barrister in Brisbane when he died aged 30. His eldest son, Dick Atkin, then aged 4, had been born around the corner in Tank Street. Dick’s widowed mother and grandparents brought him up in Wales. His success came from scholarships and hard work. When he wanted to start at the London Bar, he had no connections. He walked around the courts and saw who the best barristers were. He asked one of the best, Edward Scrutton, to be his pupil master. Approaching the formidable Scrutton must have taken courage. Good mentors have been invaluable to most of us in life. So seek out good mentors.
Atkin narrowly survived at the junior Bar. In his first few years, he had two loyal briefing solicitors: one of them was a young solicitor called Norman Herbert Smith whose small firm is now the global firm, Herbert Smith Freehills. From little things big things grow.
Luck plays a part. I came to the Bar in 1986 when I had the opportunity to go into good chambers. I was with that group for 22 happy years. You may not be so lucky and live a nomadic existence. Be brave, and politely offer to tag along to court with more senior barristers. Be seen. And be seen to do good work. That requires preparation and knowing what judges want.
Judges love to see brilliance. But we want assistance. That was one reason I wrote the paper on issue framing in written and oral submissions. Well-prepared junior barristers often try to impress me in the first few minutes by telling me all the details of the case and showing that they have read lots of cases. Feel free to impress me with your knowledge and tell me the details. Just don’t do it in the first two minutes.
At the start I simply want to know who did what to who; what the issue is; what the rule is; and why you say you should win. Do not start with a lot of dates and detail that I cannot absorb. Try not to start: “This is an application under section 38(b)(v) of the Dog Act”. First tell me: “The defendant’s dog bit a child”. I don’t need to know the dog’s name or the child’s name at that point.
When I was at the junior Bar, I was amazed by the brain power of judges like Bill Pincus who seemed to be able to absorb a great deal of information, quickly process it, and get to the point. With the advent of AI, people increasingly compare judicial decision-makers with computer programs. Think of IBM’s Deep Blue competing with World Champion Garry Kasparov at chess all those years ago. Will machines or humans, in time, prove to be the better judicial decision-makers? AI, with all its embedded human biases and superior processing power, may win that race. In the meantime, you are dealing with judges with human strengths and human failings.
In 1949, the great jurist, Jerome Frank, wrote “We must face the fact that judges are human”.
One part of being a human is what psychologists call “bounded rationality”. There is only so much information that even someone with the processing power of the late Bill Pincus can absorb in a short amount of time. Judges, Magistrates and tribunal members with crowded lists can absorb only so much information. Sometimes they have 20 cases to decide that morning.
Thinking, Fast and Slow: Intuitive thinking
In 2015, I attended a seminar conducted by Professor Daniel Kahneman at the Federal Court in New York City. Since then, I have become interested in, some might say mildly obsessed by, decision-making and cognitive biases. Kahneman and his co-author, Amos Tversky, founded modern decision-making theory, which lies behind behavioural economics. Its insights improve many aspects of our lives, such as helping busy doctors avoid misdiagnosing cases.
You may have heard of Professor Kahneman, who was awarded the Nobel Prize for Economics, without ever having taught an economics class. Tversky died before he could be jointly awarded the Nobel Prize. Kahneman wrote a best-seller called Thinking, Fast and Slow. It is about two systems which affect our thinking:
- System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
- System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
Kahneman writes:
“When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do.”
The labels of System 1 and System 2 are widely used in psychology. System 1 is remarkably complex and generally very good at what it does. Kahneman describes the circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1.
In our daily lives, we make intuitive decisions all the time about important matters, and if we did not, we’d be dead, disabled or suffering from some kind of decisional overload or paralysis because we cannot afford to deliberate over all the decisions we have to make every day.
What risk does the person approaching me at a dark station pose to my personal safety?
My assessment depends on whether the person is a smiling, little old lady, or an angry young man. My assessment is based on stereotypes and biases: tattoos, age, the way the baseball hat is positioned, media portrayals of offenders, even movies I have seen. If I take too long to deliberate, rather than run, I may get mugged.
My intuition may be wrong. But I make my intuitive decision about risk and probabilities based on biases and stereotypes. I’d be mad, or at least very unusual, if I didn’t.
However, when I decide a bail case, I am expected to make a more reflective assessment of risk. Yet, implicit biases and heuristics play their part in decision-making. Someone of a certain age, background and criminal history simply poses a statistically higher risk than someone without that profile. And if I have just heard on the news about an offence committed by someone with that profile, or if the day before I sentenced someone with that profile for committing an offence whilst on bail, my assessment of risk is affected, and perhaps over-estimated.
System 1 relies on patterns that develop based on the individual’s experiences with the world. The individual learns over time how to distinguish between things and people, and how to discern patterns. These help the brain process information quickly and efficiently.
By contrast, the reflective System 2 relies on deliberation and effort to perform a task.
Naturally, experience and practice are important. An experienced emergency doctor will be quicker at treating a gunshot wound than an average GP. Some things, like playing a top-spin backhand passing shot in tennis, require a lot of practice; through experience they become second nature and seem a lot easier.
But some things, like hard maths problems which cannot be done by most people in their heads, just require concentration and deliberation: slow thinking.
While System 1 can process information on an ongoing basis, the reflective system has a limited capacity. Thus, the brain is limited in its use of System 2: I cannot solve a hard maths problem, sing a song, and watch TV at the same time.
Courts and tribunals with heavy caseloads, like GPs with full waiting rooms during a flu epidemic, tend to rely on the automatic retrieval of schemas or heuristics to process incoming information and engage the reflective system only when motivated to do so.
The experts refer to this reliance on schemas as recognition-primed decision making. The idea is that we develop schemas that we subsequently use to size up a situation and decide what to do. For example, a first responder in an ambulance comes across an unconscious person at the scene of an accident and does not take 30 minutes to analyse all the potential options for action. Rather, he or she takes in information about the immediate situation and matches it to a response option that has worked well in similar situations in the ambulance officer’s past or has been engrained through training. The initial option may not have been the best option if there had been enough time to generate and analyse all possible options, but it is the best option in a time-pressured situation.
Judicial decision-makers, particularly when confronted with heavy caseloads, tend to use the same process.
Kahneman and Tversky’s insights are relevant to time-poor courts and tribunals: the equivalent of a crowded GP clinic during the flu season. Too many patients, not enough time, with rushed intuitive decisions based on recognition-primed patterns of thinking. A busy doctor thinks “You look like you’ve got the same flu as my last 10 patients” – an understandable, intuitive conclusion but possibly a wrong one.
In busy court lists there is the same potential for missed diagnoses and reliance on recognition-primed patterns of thinking and stereotypes. Not just racial or other wicked stereotypes, but stereotypes and unconscious biases that are based on experience of doing similar cases that are easily called to mind. It’s called the availability heuristic. Busy judicial officers may unconsciously say to themselves, “I’ve seen this case before, I know what this case is about, and I know how this case is going to end”.
There is no easy solution to this problem in the health system or in the justice system. Your task is to help the decision-maker to avoid cognitive biases. That starts with avoiding information overload in the first few minutes of your address or the first paragraph of your written submissions. Keep things as simple as possible without being misleading by omission. Spend time on your opening.
The problem of intuitive thinking is not confined to busy lower courts and tribunals. Senior judges in apex courts make policy decisions based on assumptions about how the world works. I developed this point in an article I wrote in the US Journal of Tort Law[2] about the imposition of duties of care in tort or the creation of judge-made immunities. Judges make assumptions about incentives and deterrent effects by assuming that certain occupations, like police or doctors, will respond or over-respond to the threat of civil liability. Those assumptions are based on hunches and biases, rather than empirical evidence.
Heuristics
Two things that can lead to inaccurate decisions are heuristics and implicit biases. In psychology, heuristics are simple, efficient rules which people often use to form judgments and make decisions. They are mental shortcuts that usually involve focusing on one aspect of a complex problem and ignoring others. These mental shortcuts ease the cognitive load of making a decision. They include rules of thumb, educated guesses, guesstimates and intuitive judgments.
Heuristics are schemas that rely on only some of the information available so an individual can make a decision quickly and with little effort.
Judging is typically seen as a rational and deliberative process. However, the emerging judicial cognition research suggests that, like other human decision-making, judging is partly an intuitive cognitive process. Sometimes this assists in quick and efficient decision-making. However, it can also produce systematic errors in decision-making.
Evidence that judges are susceptible to implicit biases and use heuristics comes from a series of studies by Judge Andrew Wistrich and law professors Jeffrey Rachlinski and Chris Guthrie.[3] They explored judges’ use of five heuristics and biases:
- anchoring;
- framing – the same information presented differently (e.g., the glass is half full versus half empty);
- hindsight bias – the sense, once an outcome is known, that it was more predictable than it actually was;
- representativeness heuristic – ignoring statistical base-rate information; and
- egocentric bias – overconfidence in one’s abilities.
Locally, Professor Kylie Burns from Griffith Law School has researched and written in this field. In “Judges, ‘Common Sense’ and Judicial Cognition”, she explains the availability heuristic and other ways of thinking that produce systemic errors in decision-making.[4]
Anchoring
Kahneman and Tversky wrote about anchoring. Judges may be influenced by anchoring: the starting points in competing submissions about quantum in a personal injury case or submissions on sentence. There is an inclination to think that the right answer lies in the middle of these anchoring points. It is part of being human. We have to try to resist anchoring by placing little weight on a submission that advances too high or too low a number.
Framing
Here is a simple example of framing. Imagine that you go into a supermarket to buy a product. Each container is marked differently: one label says 90% fat free and the other says 10% fat.
Which one are you more likely to buy? Each container, of course, contains the same product.
When people face a difficult decision, such as whether to undergo a medical procedure or to go to trial, the way in which the decision is framed influences the decision and people’s willingness to incur risk. Different ways of presenting the same information prompt different emotions. A cancer patient given statistics about the outcome of surgery and radiation might be given two descriptions of the short-term outcomes of surgery:
- The one-month survival rate is 90%
- There is a 10% mortality in the first month
The way in which such information is presented not only affects the decisions of patients, it affects the decisions of doctors. Physicians participating in a study that Tversky and others carried out at the Harvard Medical School were given these statistics. Surgery was much more popular in the former frame (84% of physicians chose it) than in the latter (where 50% favoured radiation). The logical equivalence of the two descriptions is obvious, and “a reality-bound decision-maker would make the same choice regardless of which version she saw”.[5] However, we are affected by emotion and emotional words: mortality is bad, survival is good. The statement that “the odds of survival one month after surgery are 90%” is more reassuring than the equivalent statement that “mortality within one month of surgery is 10%”.
Think about framing when you are advising clients with poor prospects of winning at trial. You might advise:
- You have a 90% chance of losing; or
- You have a 10% chance of winning
Incidentally, Kahneman and Tversky’s Prospect Theory explains why a plaintiff with a 90% chance of winning at trial will usually accept less than 90% of an agreed quantum to settle the case. The thought of how they would feel if they lost (a 10% risk) is so terrible that they will take more than a 10% discount to settle.
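To make that concrete, here is a small worked illustration with assumed figures (the amounts are hypothetical, not drawn from Kahneman and Tversky):

\[ \text{Expected value of going to trial} = 0.9 \times \$500{,}000 = \$450{,}000 \]

Yet many plaintiffs in that position will settle for, say, $400,000. The certainty of a good outcome is preferred even at a discount well beyond the 10% risk, because the prospect of losing everything looms larger than its probability warrants.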
The availability heuristic
I turn to what is called the availability heuristic. The more easily people can call some scenario to mind – the more available it is to them – the more probable they find it to be.
Any fact or incident that was especially vivid, or recent, or common – or anything that happened to preoccupy a person – is likely to be recalled with special ease, and so be disproportionately weighed in any judgment.
A couple of years ago I spent a week in Toowoomba and each day sentenced a number of drug offenders, mostly for street-level dealing in methamphetamine. On the Friday evening as my Associate and I drove back to Brisbane she asked me “Judge, is everyone in Toowoomba on meth?” My immediate response was “I don’t think my Auntie Violet is”. I first met Auntie Violet in the early 1960s when I was an infant. Her daughters, who were distant cousins, rode ponies. For a long time I assumed that most children in Toowoomba rode ponies. My Associate’s over-estimation of the number of Toowoomba citizens who use meth, like my childhood over-estimation of the number of Toowoomba children who rode ponies, is what scholars describe as an availability heuristic. The scholarship in this area can be traced to Tversky and Kahneman’s seminal 1974 article ‘Judgment under Uncertainty: Heuristics and Biases’.[6] That article described simplifying shortcuts of intuitive thinking and explained some 20 biases. In simple terms, the availability heuristic is the process of judging frequency by the ease with which instances come to mind.
Human judgments are often based on memory. If we do not have the necessary information to make a decision, we use information acquired in the past that we think will help us make a decision. However, this process can lead to incorrect assumptions, for example a wrong assumption about the frequency of an event based on how many similar events are brought to mind. Judges, like everyone else, are more likely to draw on information that can be easily called to mind. Based on the judge’s limited experience and lack of knowledge, erroneous assumptions may be made about a group whose behaviour is under consideration, for example, the behaviour of the victims of domestic violence or childhood sexual abuse. Also, like other people, judges may over-estimate the chance of something occurring because of their experience or exposure to media reports.
The availability heuristic suggests that people tend to think a risk is more serious if it can be readily called to mind. A terrorist attack in Paris or London that attracts media coverage will alter your feelings about the safety of visiting that city, and cause you to change your travel plans and go scuba diving in Fiji instead. Media reporting of divorces among Hollywood celebrities leads us to exaggerate their frequency.[7] An example given by Kahneman is that strokes cause almost twice as many deaths as all accidents combined, but 80 per cent of respondents to a survey which considered pairs of causes of death judged accidental death to be more likely. Tornadoes were seen as more frequent killers than asthma, although the latter caused 20 times more deaths.[8]
The representativeness heuristic
When people make categorical judgments (for example, in assessing the likelihood that a defendant is guilty) they tend to base their judgments on the extent to which the evidence being analysed is representative of the category. People typically rely on the “representativeness heuristic”, in which probabilities are evaluated by the degree to which A is representative of B, that is, by the degree to which A resembles B. When one thing resembles something else in a category, we judge the probability that the first item is a member of that category as high. On the other hand, if A does not resemble B, we judge the likelihood that A is in that category as low.
It is useful, but it can lead people to discount relevant statistical information: when people “undervalue statistical information, this can lead to decision errors”. In particular, people undervalue the importance of the frequency with which the underlying category occurs: this is known as the “base-rate” statistic. This heuristic can result in a “form of automated stereotyping” which leads people to rely on “impressionistic and intuitive reactions of the representativeness” of information.
Tversky and Kahneman illustrated judgment by representativeness by asking respondents to consider an individual who had been described by a former neighbour as follows:
“Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Rate the probability that Steve has one of the following occupations. Use 1 for most likely and 5 for the least likely:
- A farmer
- A salesman
- An airline pilot
- A librarian
- A physician”
As a result of the representativeness heuristic, the probability that Steve is a librarian is assessed by the degree to which he is representative of, or similar to, the stereotype of a librarian. This approach can lead to serious errors because the judgment is insensitive to the prior probability, or base-rate frequency, of the outcomes. For example, the fact that there are more farmers than librarians in the population should enter into any reasonable estimate of the probability that Steve is a librarian rather than a farmer. However, base-rate frequencies are ignored.[9]
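A small worked illustration shows why the base rate matters; the figures are assumed for the purpose of the example, not taken from Tversky and Kahneman. Suppose male farmers outnumber male librarians 20 to 1, and suppose the description of Steve fits 40% of librarians but only 2% of farmers. Applying Bayes’ theorem:

\[ P(\text{librarian} \mid \text{description}) = \frac{1 \times 0.40}{1 \times 0.40 + 20 \times 0.02} = \frac{0.40}{0.80} = 0.50 \]

Even though the description fits a librarian twenty times better than it fits a farmer, the sheer number of farmers means it is no better than a coin toss that Steve is a librarian.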
Fact Finding
Confirmation bias includes the human tendency to favour or interpret information in a way that confirms or strengthens an existing belief. Once someone has reached a certain view of the facts, it is difficult to dislodge.
Premature closure is jumping to a conclusion. Earlier, I gave the example of a busy doctor who makes a quick diagnosis (often based on pattern recognition), fails to consider other possible diagnoses, and prematurely stops collecting information.
Judges are not appointed to make intuitive decisions based on gut feelings. Our training and professional ethics require us to keep an open mind. The research shows that judges seem to be better than the average juror at avoiding what may be described as confirmation bias or premature closure.
Still, being human, judges and lawyers are not immune to confirmation bias and premature closure. Your difficult task is to help a judge or magistrate avoid that kind of error, and the harm it may do to your client’s interests. There may be tactical reasons in a criminal or civil case to keep your powder dry, so that the evidence or argument you introduce has its greatest effect. As against that, there are many cases in which introducing a judge or a jury to an alternative view of the case, and previewing evidence that challenges the early evidence, may be to your advantage.
Most of my work as a trial judge is not deciding novel questions of law, or even deciding questions of law. It is deciding what happened. Several years ago, after swallowing the Kahneman Kool-Aid, I read an article by an Australian-born academic, Professor Emma Cunliffe, who applied Kahneman’s ideas about fast and slow thinking to judicial decision‑making.
We judges pride ourselves on deciding cases by deliberation, but we are human and prone to intuitive decision-making. A judge should try to interrupt intuitive decision-making. Professor Cunliffe observes:[10]
“It seems to be human nature to value information that appears to confirm one’s pre‑existing beliefs and to disregard or fail to search for information that contests those beliefs”.
Faced with the prevalence and necessity for intuitive reasoning, Cunliffe considers the possibility of a process to “interrupt intuitive reasoning that is based on substitution or stereotypes, particularly where the stereotype is otherwise likely to distract the trier of fact from the most likely explanation”.
The approach to judicial decision-making suggested by academics like Cunliffe entails consciously asking:
“For the account that I find more coherent to have occurred, what must the protagonist have done? When and how must she have done it? In what time period did it occur, according to the best and most independent evidence I can muster? How well is this account grounded in the trial record, and to what extent am I making inferences from proven facts? What evidence challenges this account, and do I disbelieve that evidence? Are there things I would expect to see if this narrative were true, but which are absent from or contradicted by the record? And, what assumptions am I making about human behaviour to get to this result?”
Scholars in this area like Simon suggest that judges work through a process of generating two or more models of a case and then restructuring those models until they identify the most coherent model as a means of settling upon an outcome.
If that is the way that triers of fact go about deciding cases, then Cunliffe is surely right that the questions she suggests should be consciously asked have a capacity to focus a trier of fact “on the evidence that has not been accounted for by the preferred account, and to consider what inferences are being drawn to reach the preferred conclusion”.
The final question which Cunliffe suggests that decision-makers pose is:
“What assumptions am I making about human behaviour to get to this result?”
That is something of a checklist for any judge. I encourage you to incorporate it in your submissions to judges and juries.
Interrupting Intuitive thinking
From recent weeks in the Bar Practice Course you already have a long list: read lots of cases; be across your brief; ask your briefing solicitors for additional information and instructions; do not take on too much work, including pro bono work, if the quality of your work will suffer; be prepared to challenge something that has been assumed by those around you; challenge the empirical basis of the predictions and opinions of an expert witness, including one who has been giving the same predictions as a professional witness for decades; ask them if they have gone back and validated their predictions like any good scientist would do; be prepared to ask “Where is your evidence for that?” and be prepared to say “The Emperor has no clothes”. Do not just agree with a senior barrister who leads you. That is not what you are paid to do. Politely point out when you think they have got something wrong. Be prepared to politely correct a judge. Never mislead. Be ethical, find good mentors, be resilient, look after your mental health, be prepared to concede unmeritorious points, issue frame in oral and written submissions, unplug from your screen and earplugs, and go for a walk through an art gallery.
To that long list I have just added: help the judge to not take the cognitive shortcuts that Kahneman and Tversky wrote about in Science in 1974.
Kahneman and Tversky’s work inspired the whole field of behavioural economics, and the idea of nudge in public policy. It has improved health care by helping doctors avoid missed diagnoses. It has been used to reduce disparities in sentencing practices in the US.
As a child, Dr Donald Redelmeier was very good at maths. In 1977 Redelmeier’s high school teacher gave him an article to read from Science by Amos Tversky and Daniel Kahneman called ‘Judgment Under Uncertainty: Heuristics and Biases’.[11] In the 47 years since, it has become one of the most cited articles in social science. People like Dr Redelmeier have used its insights to improve decision-making by doctors.
Dr Redelmeier works at a hospital in Toronto that treats a large number of road trauma cases. The Emergency Department treats complex cases of people who have more than one thing wrong with them. Dr Redelmeier is used by the hospital to check the decisions of specialists for cognitive errors. He checks on other people’s thinking, by thinking about how other people think.
The following story is taken from Michael Lewis’ The Undoing Project:[12]
“But the dazed young woman who arrived in the emergency room directly from her head-on car crash, with her many broken bones, presented her surgeons, as they treated her, with a disturbing problem. The rhythm of her heartbeat had become wildly irregular. It was either skipping beats or adding extra beats; in any case, she had more than one thing seriously wrong with her.
Immediately after the trauma centre staff called Redelmeier to come to the operating room, they diagnosed the heart problem on their own – or thought they had. The young woman remained alert enough to tell them that she had a past history of an overactive thyroid. An overactive thyroid can cause an irregular heartbeat. And so, when Redelmeier arrived, the staff no longer needed him to investigate the source of the irregular heartbeat but to treat it. No one in the operating room would have batted an eye if Redelmeier had simply administered the drugs for hyperthyroidism. Instead, Redelmeier asked everyone to slow down. To wait. Just a moment. Just to check their thinking – and to make sure they were not trying to force the facts into an easy, coherent, but ultimately false story.
Something bothered him. As he said later, ‘Hyperthyroidism is a classic cause of an irregular heart rhythm, but hyperthyroidism is an infrequent cause of an irregular heart rhythm.’ Hearing that the young woman had a history of excess thyroid hormone production, the emergency room medical staff had leaped, with seeming reason, to the assumption that her overactive thyroid had caused the dangerous beating of her heart. They hadn’t bothered to consider statistically far more likely causes of an irregular heartbeat. In Redelmeier’s experience, doctors did not think statistically. ‘Eighty percent of doctors don’t think probabilities apply to their patients,’ he said. ‘Just like 95 percent of married couples don’t believe the 50 percent divorce rate applies to them, and 95 percent of drunk drivers don’t think the statistics that show that you are more likely to be killed if you are driving drunk than if you are driving sober applies to them.’
Redelmeier asked the emergency room staff to search for other, more statistically likely causes of the woman’s irregular heartbeat. That’s when they found her collapsed lung. Like her fractured ribs, her collapsed lung had failed to turn up on the X-ray. Unlike the fractured ribs, it could kill her. Redelmeier ignored the thyroid and treated the collapsed lung. The young woman’s heartbeat returned to normal. The next day, her formal thyroid tests came back: Her thyroid hormone production was perfectly normal. Her thyroid never had been the issue. ‘It was a classic case of the representativeness heuristic,’ said Redelmeier. ‘You need to be so careful when there is one simple diagnosis that instantly pops into your mind that beautifully explains everything all at once. That’s when you need to stop and check your thinking.’” (emphasis added)
A telling case of a missed diagnosis and a lesson about the dangers of intuitive thinking that leads to a satisfying, plausible conclusion, but a conclusion that is simply wrong.
Judges, particularly those under time pressure in busy Domestic Violence courts and tribunals, or a Supreme Court Judge deciding several bail cases in a morning, need your help to not decide cases based on stereotypes and intuitive thinking.
Also avoid error on your part by not making assumptions about your client or a witness based on the group that person comes from: be that a real estate agent from Surfers Paradise or a 19‑year-old youth from a Sudanese, refugee background. In my career at the Bar I acted for the professional indemnity insurer of many Gold Coast real estate agents: some were dodgy, at least one was an exceptional human being whose word was his bond. That experience taught me to not think all Gold Coast real estate agents were the same.
You need to avoid intuitive thinking, including stereotypical thinking, when you get a brief. Avoid premature closure: making up your mind after reading the first few witness statements, or making an assessment based on an inadequate sample size. Even if you have done 20 similar cases, that is too small a sample from which to draw any conclusion. I may have done 20 personal injury trials as a judge, but that does not entitle me to generalise about plaintiffs, insurance companies, psychiatrists called by plaintiffs, or orthopaedic surgeons called by defendants. I generalise because I am human. I have to remind myself of Kahneman’s chapter on the pitfalls of making decisions based on small numbers and the random nature of events.
In our vocation, the law, we are as prone as doctors to mistaken diagnosis by recognition-primed decision-making and false categorisation. We are human and prone to cognitive biases. Try not to think of a client, a witness, an opponent, or a judge according to some stereotype or group identity. Remember that Lord Atkin was not Lord Atkin at birth and did not live in a castle. He was born in a cottage that stood around the corner.
Avoid fitting an individual into a group to which you intuitively think that person belongs. Individuals are far more complex and interesting.
Professor Daniel Kahneman 1934–2024
Daniel Kahneman died a few weeks ago aged 90. May I end by reading a somewhat long extract from his biographical note when he won The Nobel Prize for Economics. The insight it gives into the human condition might be applied to contemporary conflict zones and the plight of the oppressed, minorities and refugees. Kahneman wrote:
“I was born in Tel Aviv, in what is now Israel, in 1934, while my mother was visiting her extended family there; our regular domicile was in Paris. My parents were Lithuanian Jews, who had immigrated to France in the early 1920s and had done quite well. My father was the chief of research in a large chemical factory. But although my parents loved most things French and had some French friends, their roots in France were shallow, and they never felt completely secure. Of course, whatever vestiges of security they’d had were lost when the Germans swept into France in 1940. What was probably the first graph I ever drew, in 1941, showed my family’s fortunes as a function of time – and around 1940 the curve crossed into the negative domain.
I will never know if my vocation as a psychologist was a result of my early exposure to interesting gossip, or whether my interest in gossip was an indication of a budding vocation. Like many other Jews, I suppose, I grew up in a world that consisted exclusively of people and words, and most of the words were about people. Nature barely existed, and I never learned to identify flowers or to appreciate animals. But the people my mother liked to talk about with her friends and with my father were fascinating in their complexity. Some people were better than others, but the best were far from perfect and no one was simply bad. Most of her stories were touched by irony, and they all had two sides or more.
In one experience I remember vividly, there was a rich range of shades. It must have been late 1941 or early 1942. Jews were required to wear the Star of David and to obey a 6 p.m. curfew. I had gone to play with a Christian friend and had stayed too late. I turned my brown sweater inside out to walk the few blocks home. As I was walking down an empty street, I saw a German soldier approaching. He was wearing the black uniform that I had been told to fear more than others – the one worn by specially recruited SS soldiers. As I came closer to him, trying to walk fast, I noticed that he was looking at me intently. Then he beckoned me over, picked me up, and hugged me. I was terrified that he would notice the star inside my sweater. He was speaking to me with great emotion, in German. When he put me down, he opened his wallet, showed me a picture of a boy, and gave me some money. I went home more certain than ever that my mother was right: people were endlessly complicated and interesting.” (emphasis added)
You can imagine why that kind of experience might make a person interested in psychology. As a child, Kahneman was the beneficiary of the representativeness heuristic. The blue-eyed, blond boy reminded the soldier of his son. The thought that the boy might be a Jew did not occur to him.
In his late 80s, Professor Kahneman was still producing amazing work, like the co-authored book Noise. When asked a few years ago what still motivated him, he said “curiosity”. He was unusual. He liked changing his mind. He stated:
“For me when I change my mind it is the pure experience of having learned something. That’s when I am sure I’ve learned something. Yesterday I was stupid, now I have seen the light.”
Conclusion
I hope you benefit from Kahneman’s wisdom, particularly about being curious and being prepared to change your mind about a witness or about what you think the evidence shows happened. Avoid simple and quick answers. Interrupt intuitive thinking by judges by politely suggesting that this case may be different to apparently similar cases they may have done. Suggest that the evidence or lack of evidence should make them stop and think about a provisional, intuitive view.
In your own work, be curious about people. Don’t jump to easy, intuitive conclusions about individuals or what happened. Keep an open mind. Consider the opposite. What is missing from the picture? What other explanation or diagnosis might explain the evidence? Stop and think, even for a few seconds. Stop and think.
Further listening
If you would like to listen to some podcast episodes about thinking, and the work of Kahneman and Tversky then here are some links:
BBC – Think with Pinker
Professor Steven Pinker has spent his life thinking about thinking. In this series, he discusses things that he hopes will help all of us make better decisions. He also interviews leaders in the field of psychology, and a former judge and Harvard Law Professor who discusses the life and death choices made by judges and juries.
https://www.bbc.co.uk/programmes/m0011lt1
Freakonomics: People I Mostly Admire – Remembering Daniel Kahneman
In 2021 Daniel Kahneman talked to Steve Levitt about their work and Kahneman’s book Noise.
Hidden Brain – The Transformative Ideas of Daniel Kahneman
This podcast remembers Kahneman by revisiting 2018 and 2021 conversations with him. The second part relates to “Noise” and discusses variability in decisions by different judges and between the same judge on different days.
Freakonomics – The Men who Started a Thinking Revolution
Michael Lewis wrote The Undoing Project about the Kahneman – Tversky collaboration. He explains how their work had such a profound influence on how we think about decision-making.
[1] The Hon P D T Applegarth, Issue Framing in Written and Oral Submission.
[2] P D T Applegarth, ‘Deciding Novel and Routine Cases without Evidence’ (2018) Journal of Tort Law 173–208.
[3] C Guthrie, J Rachlinski & A Wistrich, “Inside the Judicial Mind” (2001) 86 Cornell Law Review 777; C Guthrie, J Rachlinski & A Wistrich, “Blinking on the Bench: How Judges Decide Cases” (2007) 93 Cornell Law Review 1.
[4] Kylie Burns: “Judges, ‘common sense’ and judicial cognition” (2016) 25(3) Griffith Law Review 319.
[5] Kahneman, Thinking, Fast and Slow, p 367.
[6] This article also can be found as an appendix to Kahneman’s bestselling work Thinking, Fast and Slow (Penguin, 2011).
[7] Daniel Kahneman, Thinking, Fast and Slow (Penguin, 2011) 130.
[8] Ibid 138.
[9] Kahneman, Thinking, Fast and Slow, p 420, citing ‘Judgment under Uncertainty’.
[10] Emma Cunliffe, ‘Judging, fast and slow: using decision-making theory to explore judicial fact determination’, The International Journal of Evidence & Proof (2014) 139 at 176.
[11] Amos Tversky and Daniel Kahneman, ‘Judgment under Uncertainty: Heuristics and Biases’ (1974) 185(4157) Science 1124.
[12] Michael Lewis, The Undoing Project (WW Norton & Co, 2016) pp 215–216.