This essay is excerpted from Guy McPherson’s (2011) memoir, Walking Away from Empire: A Personal Journey.
Asking a contemporary scientist in possession of a Ph.D. any question about philosophy typically draws a blank stare or, occasionally, an inquisitive gaze. Philosophy rarely is taught in science classes at any level of education, including the Ph.D. Across campus, a dose of science is taught in the philosophy department, but practicing scientists rarely are involved in the conversation.
Gemini, the twins
Yet the identical twins Science and Philosophy were born in ancient Athens. Most contemporary philosophers claim the twins’ father was Thales of Miletus, largely based on two events: Thales was the first to calculate the height of Egypt’s pyramids (which he did before traveling to Greece, by measuring the length of the pyramids’ shadows) and, even more notably, the first to predict a solar eclipse (in 585 BC).
Inseparable and indistinguishable for nearly two millennia, Science and Philosophy were viewed as one and the same child. Little evidence remains of Thales, and the majority of his ideas ultimately were buried beneath the landslide of Grecian reason capped by Socrates and Plato. Philosophical advances continued to pile up, but Alfred North Whitehead famously described two millennia of these advances as mere footnotes to Plato.
Despite minor quibbles, Science and Philosophy remained close for several centuries before they were irrevocably forced apart during, ironically, the Enlightenment.
Although most educated people could distinguish the twins by the mid-1600s, when intellectual and political battles produced notable differences in the twin icons of reason, they remained friends for another three centuries, until the biblical root of all evil came between them.
In 1945, Bertrand Russell introduced his comprehensive History of Western Philosophy by dividing knowledge into three categories: science represents the known universe, theology represents dogma (which I would hesitate to call “knowledge”), and philosophy represents the no man’s land between the two.
Russell concluded that philosophy, like science, relies on reason and that, like theology, it consists of speculations beyond definitive knowledge. Scientific advances resulting from the Enlightenment reduced philosophy to such a narrow domain that it “suffered more from modernity than any other field of human endeavor,” according to Hannah Arendt’s 1958 book, The Human Condition. The post-Aristotle shift from deduction to induction contributed to, or perhaps merely was symptomatic of, philosophy’s demise and the coincident rise of science.
In the wake of World War II and five years after Russell’s capacious historical account acknowledged and contributed to the chasm between science and philosophy, Harry S. Truman signed the legislation that created the National Science Foundation of the United States (NSF). This new and influential organization swung the final ax that doomed Science and Philosophy to separate existences.
Created in 1950, the NSF had become by 1955 a dominant influence — perhaps the dominant influence — on the nature and conduct of science.
Confined to separate quarters, Science and Philosophy barely speak to each other in the 21st century. Casual observers would never know they once looked alike, as evidenced by treatment of the two entities on university campuses, where compartmentalization is the order of the day.
The marginalization of philosophy has coincided with the rise of “big science.” British philosopher John Gray goes so far as to write (in his excellent short book, Straw Dogs: Thoughts on Humans and Other Animals), “philosophy is a subject without a subject matter.”
I tend to think of philosophy in much the same way I think about science and art: it’s personal.
Paul Feyerabend’s dogmatic postmodernism notwithstanding, science has rules, more or less. But science as a way of understanding the universe — in sharp contrast to the societal expectation of science as a never-ending font of technology — is a personal journey of curiosity addressed with unbridled creativity. So, too, are art and philosophy. Although science often produces knowledge that is more repeatable and reliable than the other two endeavors, it’s not at all clear that either outcome is used by, or useful for, the typical person.
On the other hand, many people use technology — the (incorrectly) perceived point of science — as a tool to assault the natural world while temporarily satisfying our insatiable urge to divorce ourselves from physical reality. Whether the divorce is intentional or not is beyond the scope of my knowledge and this essay.
The nature of unity
If reason arose in Athens, passion for the natural world was born in the Orient. Consider Lao Tzu’s masterful book of poetry, the Tao Te Ching, written at approximately the same time pre-Socratic philosophy was developing in Greece (the birth year of Lao Tzu, who may have been a single person or a conglomeration of identities, traditionally is accepted as 570 BC, 15 years after Thales predicted the solar eclipse).
Whereas Platonists often are blamed for divorcing humanity from the natural world, Eastern thought has maintained a tight connection between humans and their environment, and has exalted nature in the process (China’s recent embrace of free-market capitalism has produced the expected deterioration of that country’s environment). The Tao Te Ching is the most famous example in the Western world, but Lao Tzu merely was reflecting his culture. Further, cursory inspection of virtually any of the major Eastern religions reveals strong links between nature and humanity.
A reason to reason
Reason arose in Greece about 25 centuries ago, and is perhaps best known from Plato’s Socratic Dialogues.
Plato (ca. 428-348 BC) uses the conversations of Socrates to pose and explore questions in considerable detail. Although many of the issues and associated conversations seem unsophisticated to contemporary readers, these initial attempts to employ logic to study the natural world and the role of humans in the world are remarkable precisely because they were unprecedented. The contributions of ancient Greece to the material worldview that characterizes modernity cannot be overstated; that so many of the contributions came from Athens, a city that never exceeded 250,000 residents, is simply astonishing.
Although the ancient Greeks laid the foundation for modernity, few bricks were added to the structure for nearly two millennia.
During the early seventeenth century, the empiricist Francis Bacon (1561-1626) and the reductionist René Descartes (1596-1650) ushered in the Enlightenment, thereby triggering a flurry of construction on the edifice of knowledge. Almost overnight it became clear that the world was a material one that could be observed and quantified by all who dared think and observe. Nature obeyed rules, and humans were big-brained animals capable of discovering and describing those rules. Thus the Enlightenment eroded the role of authority as a source of knowledge. In the wake of Giordano Bruno’s heinous execution by the Catholic Church, Galileo recanted earlier statements in which he denied the Ptolemaic view that Earth was the center of the universe. But the erosion of authority that began as a trickle quickly became a flood, and the Church was increasingly marginalized as a source of knowledge.
David Hume (1711-1776), in his initial written piece of philosophy, presented a compelling case against miracles, hence against religion: Of Miracles was published in 1748 as an essay in An Enquiry Concerning Human Understanding. (Hume became particularly well known through the idea that what “is” does not indicate what “ought” to be.)
Shortly before Charles Darwin formalized the theory of evolution by natural selection in On the Origin of Species (1859), Schopenhauer (1788-1860) used Plato-like dialogue to question the basis of religion (Religion: A Dialogue) and Max Stirner declared the death of God in his 1845 book, The Ego and Its Own. Notably influenced by Schopenhauer and writing shortly after publication of Darwin’s dangerous idea, Friedrich Nietzsche (1844-1900) vociferously spread the word about God’s death (probably without awareness of Stirner’s work) while predicting that Reason would overwhelm worldviews based on mysticism.
With respect to the rise of Reason, Nietzsche was an optimist. As S. Jonathan Singer concludes in his 2001 book, The Splendid Feast of Reason, it appears unlikely that more than ten percent of people are capable of employing reason as a basis for how they live. Singer likely did not know he was echoing Schopenhauer, although Schopenhauer’s use of dialogue clearly indicates that he, in turn, knew he was echoing Plato in reaching the same conclusion.
Collectively, these voices from the Enlightenment illustrate the capacity for, and importance of, Reason. Reason is the basis for understanding the material world. As such, it serves as the foundation upon which conservation biology, nature-human interactions, and durable living in human communities can be understood and practiced.
Lifting the veil
We can willingly conserve nature and its parts only through description and understanding rooted in reality. Mysticism has proven an insufficient foundation for conserving nature. Ultimately, it doubtless will prove inadequate for saving humanity as well.
It is not at all clear that humankind can be saved (or, for that matter, is worth saving). Evolution drives us to survive, drives us to procreate, and drives us to accumulate material possessions. Evolution always pushes us toward the brink, and culture piles on, hurling us into the abyss. Nietzsche was correct about our lack of free will: as Gray points out in Straw Dogs, free will is an illusion. It’s not merely the foam on the beer: it’s the last bubble of foam, the one that just popped. It’s no surprise, then, that we are sleepwalking into the future, or that the future is a formidably tall cliff.
–Guy McPherson for Transition Voice