Life is full of decisions. “Third Millennium Thinking: Creating Sense in a World of Nonsense” outlines methods of making choices rationally using scientific methods.
Nobel Prize-winning physicist Saul Perlmutter, philosophy professor John Campbell, and social psychologist Robert MacCoun turned their course at the University of California, Berkeley on using scientific tools to approach everyday problems into a book.
Perlmutter says it’s easy to fall into mental traps or fool ourselves when making a choice. But when people assess all the variables that could influence them and the potential outcomes, they approach questions more thoughtfully, he says.
“There is so much of what is the scientific approach to the world that is never taught anywhere,” Perlmutter says. “It seemed like this was a time for us to be trying to figure out how can we teach this in ways that don’t require having to become a scientist in order to do it.”
Book excerpt: ‘Third Millennium Thinking: Creating Sense in a World of Nonsense’
By Saul Perlmutter, John Campbell and Robert MacCoun
INTRODUCTION
In just the past few decades, those of us who live in the internet-connected world have obtained access to a nearly unfathomable amount of information. We can click a link and instantly gain insight into whatever we’re curious about, whether it’s treatment options for a particular health condition, how to build a solar generator, or the political history of Malta. On the other hand, sometimes there is so much information we don’t know how to sort or evaluate it. The social science database ProQuest, for example, boasts of “a growing content collection that now encompasses . . . 6 billion digital pages and spans six centuries.” And that’s just old-school, print information! The Internet Archive’s Wayback Machine, an archive of websites and other digital artifacts dating back to 1996, hosts almost a trillion pages of digital content, tens of millions of books and audios, and nearly a million software programs.
More and more often, it can be hard to determine what to focus on, let alone how to distinguish what’s revelatory and enlightening, in and among all the highly technical, specialized, contradictory, incomplete, out‑of‑date, biased, or deliberately untrue information we can now access. Was that drug study funded by a pharmaceutical company? Did an AI system invent all those supposedly authentic product reviews? What do those statistics leave out? What does that article even mean? It is also increasingly tricky to identify whom to trust for expert guidance in interpreting this information. There are all sorts of people out there who claim expertise — and perhaps your favorite experts aren’t my favorite experts. Experts disagree, or have ulterior motives, or perhaps don’t understand the world or “real life” beyond their own narrow perspective. How do we find an expert we can safely trust?
To make a sound decision, take a meaningful action, or solve a problem — whether as individuals, in groups, or as a society — we need first to understand reality. But when reality is not easy to discern, and we’re not sure which experts to trust to clarify the matter, we adopt other strategies for navigating the clutter. We “go with our gut”; decide what we “believe” and look for evidence to reaffirm whatever that is; adopt positions based on our affiliations with people we know; even find reassurance in belittling the people who disagree with us. We choose to consult experts who tell us what we like to hear; or bond in shared mistrust of people providing or communicating the information that confuses us, whether they are scientists, scholars, journalists, community leaders, policymakers, or other experts. These coping strategies may help us get by in our personal or professional lives; they may provide a consoling sense of identity or belonging. But they do not actually help us see clearly or make good decisions. And resorting to them can have dangerous social and political consequences.
How can we navigate better — as individuals, and as a society — in this age of informational overwhelm? How do we ward off confusion, avoid mental traps, and sift sense out of nonsense? How do we make decisions and solve problems collaboratively with people who interpret information differently or have different values than we do?
The three of us — a physicist (Saul), a philosopher (John), and a psychologist (Rob) — have been working closely together for nearly a decade on a project to help our students learn to think about big problems and make effective decisions in this “too much information” age. We began our collaboration in 2011, in response to what was already a worrying trend toward no‑think, politics-driven decision-making. An issue like raising the national debt ceiling, for example, was being debated that summer as if it were a religious schism, rather than a simple, practical, probably even testable question of what economic approach would work best to improve the country’s economic well-being. Most of the arguments both yea and nay betrayed equal disregard for, or ignorance of, the most basic principles of scientific thought. We began to wonder whether it might be possible to first articulate and then teach the principles that would lead to clearer thinking, more rational arguments, and a more fruitful collaborative decision-making process.
The result was a team-taught, multidisciplinary Big Ideas course at UC Berkeley, intended to teach students the whole gamut of ideas, tools, and approaches that natural and social scientists use to understand the world. We also designed the course to show how useful these approaches can be for everybody in day‑to‑day life, whether working individually or collaboratively, in making reasoned decisions and solving the full range of problems that face us. To our great satisfaction, the course has been both popular and successful, and has since been replicated and adapted by other teachers at a growing number of other universities.1 Our students appear to rethink their worlds and emerge energized with new ways to approach both personal decision-making and our society’s problems. They are better able to investigate their questions, evaluate information and expertise, and work together as members of a group or a society. Inspired by their enthusiasm, we began to think about new ways to share these tools — and this new way of thinking and working together — beyond the classroom, with students and citizens of all ages.
We have become ever more concerned that our society is losing its way, causing suffering — and missing great opportunities — simply because we don’t have the tools that could help us make sense of the extraordinary amount of complex, often contradictory information now available to us. Practical problem-solving can come to a standstill when we cannot ascertain the facts of the problems, or, when those problems require communal or political solutions, even agree with others on what those facts are. We humans, who can figure out rocket science and fly to the moon, can’t always figure out how to navigate uncertainty and conflicting points of view to make a simple reasonable decision when we need to.
Part of the problem is that science itself is often a major source of the highly technical, opaque, inconsistent, and contradictory information that has overwhelmed, perplexed, and even angered people. Trust in science has eroded in the recent past.2 The achievements of science cannot live up to all the utopian expectations those successes have generated. Some scientific achievements have also come with negative social, political, or environmental side effects. For these and other reasons, science has become one of the totems of polarization in political discussions. In short, as science became harder to understand, was connected to undesirable side effects, and subjected to politically partisan critiques, many people lost their trust in scientists and in “science” itself.3
But science also has a phenomenal record of providing insight into — if not answers to — the most confounding questions humans have thought to ask. It has helped us to solve puzzles, address problems, and make better lives over millennia. It is a culture of inquiry rooted in the dawn of humankind, with centuries of practice in evaluating conflicting information in a baffling world, and in distinguishing what we know from what we don’t. Along the way, scientists have learned from both successes and mistakes, breakthroughs and blunders, to refine the tools with which to address new questions and solve new problems.
Over the past few years, we have all become aware of the shocking degree of polarization in our society, and the surprising interaction between this polarization and our society’s often-problematic relationship to science and scientific expertise. If we are to have any hope of finding the practical common plans and common understandings that can move our society ahead together, we need to learn to accept the possibility of errors in our own thinking, and our need for opposing views that help us see where we are going wrong. And we need to understand the source of the disenchantment with and backlash against scientific progress that arose during the end of the Second Millennium and seek to repair it.
No one book and no single approach can heal the rifts. Not all of our polarized disagreements will vanish. But we have to start somewhere. And we believe that one of our more promising starting points is with the culture of science — if we begin to borrow its tools, ideas, and processes, and make a Third Millennium shift in our own thinking.
Adapted from “Third Millennium Thinking” by Saul Perlmutter, John Campbell, and Robert MacCoun. Copyright © 2024 by Saul Perlmutter, John Campbell, and Robert MacCoun. Used with permission of Little, Brown Spark, an imprint of Little, Brown and Company. New York, NY. All rights reserved.
Emiko Tamagawa produced and edited this interview for broadcast with Todd Mundt. Grace Griffin adapted it for the web.
This article was originally published on WBUR.org.
Copyright 2024 NPR. To see more, visit https://www.npr.org.