By Keith Tidman
The term postmodernism entered the lexicon in the second half of the twentieth century, critical of modernity and of notions like objective natural reality, absolute truth, grand ideologies or belief systems, and what were referred to from the outset as metanarratives: more specifically, an “incredulity toward metanarratives,” as couched by French philosopher Jean-François Lyotard in 1979.
Metanarratives were seen as single, universal truths, or overarching accounts of reality that attempt an all-encompassing (some say ‘totalizing’) description of the world. These metanarratives rest on society’s faith in Enlightenment principles: that science and technology are among the key wellsprings of human advancement, and that inquiry is grounded in greater certainty, in the binary illumination of true and false, and in the objectivity of knowledge. All of this falls under the banner of realism, which postmodernists tend roundly to reject.
Instead, postmodernists assert that knowledge, rather than being absolute, is flawed: subjectively dependent on the vagaries of individual perception and on the ways the human mind thinks, interprets, and presumes to understand the world. In this view, what we claim to know is swayed by the biasing influences of social constructs: institutional, structural, methodological, dialectical, linguistic, cultural, normative, and sociopolitical. A cornerstone of postmodernism’s philosophical viewpoint is skepticism toward the methods of scientific realism (the scientific method) used in attempting to grasp the world as it really is.
Certainly, postmodernism brings worthwhile nuggets to deliberations over the contributions of science and technology, offering fair criticisms and countervailing beliefs. There are, of course, different ways of knowing, some more favored and effective than others. To that extent, let’s be clear: no discipline is immune from fault and censure; there are only degrees of deserved censure. So it may be argued, as postmodernists indeed do, that knowledge is only ever provisional.
That being said, I suggest that postmodernist thinking at times unjustly overreaches in its criticisms, going so far as to vilify science and technology. Yet context matters. The breadth of knowledge that these two symbiotic fields of inquiry have contributed should inspire awe: from countless advancements that materially make our daily lives less feral, to understanding the colossal cosmos we inhabit, and ever so much in between. Through what-if pondering, structured hypotheses, tests and retests, and confirmation or refutation, progress in what we know, even if only provisional, comes in two forms: the gradualism of incremental, accreting change, with its measurable lurches forward in knowledge and understanding; and wholesale, inventive shifts from one paradigmatic model of reality to another.
Now let’s pivot: how might these doctrines of postmodernism collide with science? I suggest the question points us to science’s holy grail, namely its quest to develop what’s called a theory of everything, or ToE: an all-embracing, coherent theory resolutely pursued for decades by Albert Einstein, Stephen Hawking, and a line of other prominent theoretical physicists. The ToE, which sometimes also carries the moniker ‘unified field theory,’ is arguably science’s greatest affront to postmodernism’s disdain for the metanarratives mentioned above.
The basic aim of the envisioned ToE, in its narrowest form, is to develop a single theoretical scaffold that unifies all the forces and particles of nature into a master theory of the universe, describing all physical phenomena, in which no incompatibilities or unsolvable contradictions can exist. At the very least, unification must encompass gravity, electromagnetism, the weak nuclear force (responsible for particle decay), and the strong nuclear force (responsible for binding the fundamental particles of matter into larger particles). The latter three forces are encapsulated in what’s called the Standard Model; together with gravity, these four forces govern everything that happens in the universe.
However, there is currently a rather thorny incompatibility. It arises in trying to unite quantum mechanics, which applies at very small scales, with Einstein’s theory of general relativity, which applies at very large scales. Although each of these two theories has been repeatedly validated, working spectacularly in its separate domain, their flawless unification has thus far proven elusive, which some postmodernists might find confirming.
More likely, however, what’s needed is a search for a still deeper reality than these fields describe, one that would seamlessly integrate both into a single reigning reality or force, as they well may have been unified earlier in the life of the universe. That all-inclusive force would be described by the master theory, or ToE, that scientists envision. Among the research programs playing to these interests is string theory, in which particles are not the points we usually think of but minuscule, uniquely vibrating strings, in a spacetime of as many as ten dimensions. Another is quantum gravity, by which spacetime itself would be described in terms of quantum mechanical laws.
Even from a science standpoint, when it comes to a theory of everything, there remains the question: how truly everything is everything? Might it lead to understanding the whys and wherefores of all natural laws? Or might a ToE metanarrative always leave out some aspect of what composes this model of the world: a tantalizingly missing something whose description requires another set of equations and axioms, and then yet another set, indefinitely? Because of such uncertainties and incompleteness, humankind will irresistibly continue to hunt for a ToE; doing so reflects our natural exploratory constitution. In time, it might even help soften Lyotard’s disapproving ‘incredulity toward metanarratives.’
Humanity is thus riveted to probing for a greater understanding of those first principles that make the cosmos tick in such orderly and decipherable fashion. That is, a comprehensible theory of everything that answers the following fundamental queries about the universe: Why is there this particular ‘something’ that composes the universe rather than an entirely different something or rather than ‘nothing’ at all? And what is the ToE metanarrative — the single, all-unifying theoretical framework — which describes that something?
2 comments:
A comment on Keith Tidman’s article “Postmodernism Collides with ‘The Theory of Everything,’” published August 11, 2015
By E. Rohwer
A fruitful direction for answering Keith’s question—“Why is there this particular something that composes the universe rather than an entirely different something or nothing at all?”—is to turn to Wittgenstein’s Tractatus and identify the cause. The philosophical difficulty stems from overlooking the role of time.
It is at this particular moment in time that this particular something comprises the world. At an earlier moment, it was different: once people believed the Earth was flat. We can assume that the particular something of the future will differ again, given humanity’s ongoing scientific progress.
Wittgenstein accounted for the role of time in constructing the best possible philosophical model of the world. In his masterpiece, he claimed to have resolved all philosophical problems arising from the incorrect application of logic—in this case, the omission of time.
In today’s technical terms, Wittgenstein’s model is dynamic—it incorporates time—distributed—it recognizes that all language communication involves humans, who shape language and are in turn shaped by it—and, most importantly, probabilistic. Wittgenstein devotes the longest section of his masterwork (Section 5) to developing the application of probability theory to logic and language. This theory sets his model in motion.
The Tractatus famously opens with an answer to Keith’s other question: How truly everything is everything?
The world is everything that is the case. (TLP 1)
Etymologically, “is the case” means the die has fallen and its result is, at this moment in time, known. An event has happened and been witnessed. “Everything” therefore encompasses all events that have happened up to this point in time.
But events are witnessed by people, and different people witness different sets of events. They record them in speech or writing—communicating with each other in language through propositions. Thus, a proposition is both an event and the description of that event. The single, unifying theoretical framework that covers both is the propositional theory Wittgenstein develops in the Tractatus. In a letter to Russell, he stated that this theory is the main point of his book. It culminates in TLP 6 with the formula of the general propositional form, which describes how a proposition is generated—expressed as a string of words, articulated at the moment the dice falls. I believe this generalized description is what Keith is asking for. It is both mathematically precise and philosophically powerful.
Furthermore, Wittgenstein’s statement that the general propositional form is a variable (TLP 4.53) points to its statistical nature, alluding to what in computer science is known as information entropy. This concept lies at the heart of today’s hugely successful generative AI, powered by large language models—proving Wittgenstein not only right, but astonishingly ahead of his time.
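The link drawn here between probability and information entropy can be made concrete. As an illustrative sketch (the next-word distribution below is invented for the example, not taken from any actual language model), Shannon entropy measures the uncertainty in a probability distribution such as a model’s prediction of the next word:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits; zero-probability terms drop out."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical next-word distribution after the prompt
# "The world is everything that is the ..."
next_word_probs = {"case": 0.70, "same": 0.15, "cause": 0.10, "cost": 0.05}

h = shannon_entropy(next_word_probs.values())

# A uniform distribution over the same four words would have entropy
# log2(4) = 2 bits; the skewed distribution above is less uncertain,
# so its entropy is strictly lower.
assert 0 < h < 2.0
```

The lower the entropy, the more confidently the distribution singles out one continuation, which is one way of reading the statistical character attributed to the general propositional form above.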
Thank you, Elizabeth, for your informative take on my essay.
First off, my sense is that turning to Ludwig Wittgenstein and his “Tractatus Logico-Philosophicus” risks leading us down an avoidable rabbit hole. I propose that the argument, anchored to the “Tractatus” and its infamously myriad interpretations, may not get us to answers about cosmology and the “theory of everything.” I suggest that the incubator of answers is physics, not metaphysics.
As to the “role of time,” you say, “It is at this particular moment in time that this particular something comprises the world,” adding that “at an earlier moment, it was different.” Well, sure, but these observations don’t reveal much about a universe that has been changing for 13.8 billion years and will continue to do so, a matter on which astrophysics is clear.
Yes, I agree that time is better expressed as change. Though I prefer “arrow of entropy” — or, Stephen Hawking's “thermodynamic arrow of time.” These phrases work better than Arthur Eddington’s “arrow of time.” As I wrote elsewhere, “It is change — the neurophysiological perception of change in human consciousness — that renders the apparition of time visible to us.”
You say that “Wittgenstein accounted for time in constructing the best model of the world.” To be fair, physicists’ and astrophysicists’ models likewise account for time. And that time is one key to understanding reality isn’t surprising, especially given the churn, accelerating expansion, and entropy that science observes in the universe.
You say that Wittgenstein’s model is “dynamic,” recognizing that humans “shape language and are in turn shaped by it,” with reference to probabilities. These are notions also expressed by Noam Chomsky on how we perceive and process the world through language. I agree.
I suggest that mathematics, amid the debate over whether it is discovered (the basis of reality) or invented (a language), has similarly framed our understanding of the world. (The physicist Peter Higgs’s use of mathematics to predict the existence of a then-unobserved particle spectacularly resulted in the discovery of the Higgs boson.)
To this role of mathematics, the physicist Eugene Wigner famously titled a paper “The Unreasonable Effectiveness of Mathematics in the Natural Sciences.” The coinage went viral. Earlier, Galileo proclaimed that “mathematics is the alphabet with which God has written the universe.” As for your turn to probabilities, quantum mechanics surely says something more substantial than the “Tractatus.”
You cite my question about “how truly everything is everything,” set in the context of a putative “theory of everything.” The “holy grail” of some scientists. You cite Wittgenstein as having this answer: “The world is everything that is the case.” Unfortunately, the wording seems unhelpfully tautological.
After all, can the world really be “less than everything that is the case,” or “more than everything that is the case”? You say that “everything therefore encompasses all events that have happened up to this point.” I doubt the latter moves our needle of understanding.
Rather, the context for framing a theory of everything was the postmodernists’ angst over metanarratives. The theory of everything is, unapologetically, the quintessential metanarrative of which Jean-François Lyotard spoke, with science and technology at its root.
My essay's point was to suggest there may never be a final eureka moment, where further equations and scientific axioms cease. Hence the question what is “everything”? We must bear in mind that just as there are no degrees of "uniqueness," there are no degrees of "everything."
So the word “everything” in the “theory” may serve only as a placeholder, where “no incompatibilities or unsolvable contradictions can exist,” as my essay says. Those engaged in thought experimentation may need, per Isaac Newton, to stand on the shoulders of giants, crafting a theory of everything grounded in “those first principles that make the cosmos tick in such orderly and decipherable fashion.”