Monday, 21 September 2020

‘What Are We?’ Self-reflective Consciousness, Cooperation, and the Agents of Our Future Evolution

Cueva de las Manos, Río Pinturas

Posted by John Hands 

‘What are we?’ This is arguably the fundamental philosophical question. Indeed, ‘What are we?’ along with ‘Where do we come from?’ and ‘Why do we exist?’ are questions that humans have been asking for at least 25,000 years. During all of this time we have sought answers from the supernatural. About 3,000 years ago, however, we began to seek answers through philosophical reasoning and insight. Then, around 150 years ago, we began to seek answers through science: through systematic, preferably measurable, observation or experiment. 

As a science graduate and former tutor in physics for Britain's ‘Open University’*, I wanted to find out what answers science currently gives. But I couldn’t find any book that did so. There are two reasons for this.

  • First, the exponential increase in empirical data generated by rapid developments in technology had resulted in the branching of science into increasingly narrow, specialized fields. I wanted to step back from the focus of one leaf on one branch and see what the whole evolutionary tree shows us. 
  • Second, most science books advocate a particular theory, and often present it as fact. But scientific explanations change as new data is obtained and new thinking develops. 

And so I decided to write ‘the book that hadn’t been written’: an impartial evaluation of the current theories that explain how we evolved, not just from the first life on Earth, but where that came from, right back to the primordial matter and energy at the beginning of the universe of which we ultimately consist. I called it COSMOSAPIENS: Human Evolution from the Origin of the Universe*, and in the event it took more than ten years to research and write. What’s more, the conclusions I reached surprised me. I had assumed that the Big Bang was well-established science. But the more I investigated, the more I discovered that the Big Bang Theory had been contradicted by observational evidence stretching back 60 years. Cosmologists had continually changed the theory as more sophisticated observations and experiments produced ever more contradictions with it.

The latest theory is called the Concordance Model. It might more accurately be described as ‘The Inflationary-before-or-after-the-Hot Big Bang-unknown-27% Dark Matter-unknown-68% Dark Energy model’. Its central axiom, that the universe inflated at a trillion trillion trillion times the speed of light in a trillion trillion trillionth of a second, is untestable. Hence it is not scientific.

The problem arises because these cosmological theories are mathematical models. They are simplified solutions of Einstein’s field equations of general relativity applied to the universe. They are based on assumptions that the latest observations show to be invalid. That’s one surprising conclusion I found. 

Another surprise came when I examined the theory of how and why life on Earth evolved into so many different species that has been orthodox in the UK and the USA for the last 65 years. It is known as neo-Darwinism, and was popularised by Richard Dawkins in his bestselling book The Selfish Gene, which says that biological evolution is caused by genes selfishly competing with each other to survive and replicate.

Neo-Darwinism is based on the fallacy of ascribing intention to an acid, deoxyribonucleic acid, of which genes are composed. Dawkins admits that this language is sloppy and says he could express it in scientific terms. But I’ve read the book twice, and he never does manage to do this. Moreover, the theory is contradicted by substantial behavioural, genetic, and genomic evidence. When confronted with such evidence, instead of modifying the theory to take account of it, as a scientist should do, Dawkins lamely says that ‘genes must have misfired’.

The fact is, he couldn’t modify the theory because the evidence shows that Darwinian competition causes not the evolution of species but the destruction of species. It is cooperation, not competition, that has caused the evolution of successively more complex species.

Today, most biologists assert that we differ only in degree from other animals. I think that this too is wrong. What marked our emergence as a distinct species some 25,000 years ago wasn’t the size or shape of our skulls, or that we walked upright, or that we lacked bodily hair, or the genes we possess. These are differences in degree from other animals. What made us unique was reflective consciousness.

Consciousness is a characteristic of a living thing as distinct from an inanimate thing like a rock. It is possessed in rudimentary form by the simplest species like bacteria. In the evolutionary lineage leading to humans, consciousness increased with increasing neural complexity and centration in the brain until, with humans, it became conscious of itself. We are the only species that not only knows but also knows that it knows. We reflect on ourselves and our place in the cosmos. We ask questions like: What are we? Where did we come from? Why do we exist? 

This self-reflective consciousness has transformed existing abilities and generated new ones. It has transformed comprehension, learning, invention, and communication, which all other animals have in varying degrees. It has generated new abilities, like imagination, insight, abstraction, written language, belief, and morality that no other animal has. Its possession marks a difference in kind, not merely degree, from other animals, just as there is a difference in kind between inanimate matter, like a rock, and living things, like bacteria and animals. 

Moreover, Homo sapiens is the only known species that is still evolving. Our evolution is not morphological—physical characteristics—or genetic, but noetic, meaning ‘relating to mental activity’. It is an evolution of the mind, and has been occurring in three overlapping phases: primeval, philosophical, and scientific. 

Primeval thinking was dominated by the foreknowledge of death and the need to survive. Accordingly, imagination gave rise to superstition, which is a belief that usually arises from a lack of understanding of natural phenomena or fear of the unknown. 

It is evidenced by legends and myths, ranging from the beliefs in animism, totemism, and ancestor worship among hunter-gatherers, through polytheism in city-states in which the pantheon of gods reflected the social hierarchy of their societies, to a monotheism in which other gods were demoted to angels or subsumed into one God, reflecting the absolute power of king or emperor.

The instinct for competition and aggression, which had been ingrained over millions of years of prehuman ancestry, remained a powerful characteristic of humans, interacting with, and dominating, reflective consciousness. 

The second phase of reflective consciousness, philosophical thinking, emerged roughly between 1500 and 500 BCE. It was characterised by humans going beyond superstition and using reasoning and insight, often after disciplined meditation, to answer questions. In all cultures it produced the ethical view that we should treat all others, including our enemies, as ourselves. This ran counter to the predominant instinct of aggression and competition. 

The third phase, scientific thinking, gradually emerged from natural philosophy around 1600 CE. It branched into the physical sciences, the life sciences, and medical sciences. 

Physics, the fundamental science, then started to converge, rapidly so over the last 65 years, towards a single theory that describes all the interactions between all forms of matter. According to this view, all physical phenomena are lower energy manifestations of a single energy at the beginning of the universe. This is similar in many respects to the insight of philosophers of all cultures that there is an underlying energy in the cosmos that gives rise to all matter and energy. 

During this period, reflective consciousness has produced an increasing convergence of humankind. The development of technology has led to globalisation, both physically and electronically, in trade, science, education, politics (United Nations), and altruistic activities such as UNICEF and Médecins Sans Frontières. It has also produced a ‘complexification’ of human societies, a reduction in aggression, an increase in cooperation, and the ability to determine humankind’s future. 

This whole process of human evolution has been accelerating. Primeval thinking emerged roughly 25,000 years ago, philosophical thinking about 3,000 years ago, scientific thinking some 400 years ago, while convergent thinking began barely 65 years ago. 

I think that when we examine the evidence of our evolution from primordial matter and energy at the beginning of the universe, we see a consistent pattern. This shows that we humans are the unfinished product of an accelerating cosmic evolutionary process characterised by cooperation, increasing complexity and convergence, and that – uniquely as far we know – we are the self-reflective agents of our future evolution. 


 

*For further details and reviews of John’s new book, see https://johnhands.com 

Editor's note. The UK’s ‘Open University’ differs from other universities through its policy of open admissions and its emphasis on distance and online learning programmes.

Monday, 14 September 2020

Poetry: The Non-linear Mathematics of History



Posted by Chengde Chen *


Things are so obvious, why can’t we see them?
We are still obsessed with developing technology
as if we wished to hasten our extinction
This is because history is deceptive
We have no understanding of the mathematics of history
hence are immersed in a linear perception of ‘progress’:
history has proved that man controls technology
so technology must do more good than harm
This has been our experience of thousands of years
thus our unshakable faith and confidence

We, of course, need to rely on history
which seems to be the only thing we have
Yet, history is not a piece of repeatable music
but more of non-linear mathematics
Some histories may be mirrors of futures
while some futures have no reflection of history at all

It is hard to establish such a non-linear understanding
as it’s so different from our intuition
Thanks to the difficulty, as a famous tale relates
Dahir, an Indian wise-man of 3000 years ago
almost made the King bankrupt his Kingdom!

One day, the chess-loving King challenged Dahir
by asking him to play the final phase of a losing battle
As it seemed impossible for anyone to turn the table
the King promised Dahir smugly:
‘If you can win, I’ll grant you a request of any kind!’
Dahir, with his superior intelligence, did win
but he only made a very small request:
‘I would like to have some grain
placed on the chessboard in the following way:
one for the first square
two for the second square
four for the third square
and so on and so forth
so that each square is twice that of the previous one
until all sixty four squares of the chessboard are placed’


What an insignificant request, the King thought
and approved it immediately
He ordered his soldiers to bring in a sack of grain
and to place them in the way requested
When one sack was finished, another was served
Then another, and another…
until they exhausted all the grain in the Kingdom
it was still far from completing the 64 squares
The grains required are of such astronomical quantity that
even the amount of grain in today’s world
does not come near it (over 1000 billion tonnes)!
It was the modest figures of the early counting
as well as the linear intuition about ‘history’
that made the King miscalculate the matter completely
He is still in debt to Dahir to this day!

Technological progress is this kind of exponential curve
but it is even more deceptive
It had crawled very slowly for very long in ancient times
but rose quicker and quicker in recent centuries
People, however, have considered the change linearly
assuming the rate of growth the same as the past
Hence a common-sense conviction:
we have always progressed through technology
so through it we can always progress into the future
technology has always become more and more advanced
so with it we can always be more and more powerful

Oh, the linear thinking of progress!
History is not optics
nor is the future a mirror image of the past

In the past man was a small member of the club of nature
while today, we have changed the weather, raised oceans
and created new species, as well as new forms of energy
If we cannot see such a world of difference
we are as miscalculating as the old King was!

We cannot, however, afford to miscalculate
as we would have no time even to be surprised
The surface value of history is its usefulness
The deeper value of history is to prove itself useless

The history in which we controlled technology
was only history, no matter how brilliant it was
The future may mean a ruthless breaking away from it!



Editor's note. The amount required is 2 raised to the power of 64, minus one. Wikipedia offers that the total number of grains is eighteen quintillion, four hundred and forty-six quadrillion, seven hundred and forty-four trillion, seventy-three billion, seven hundred and nine million, five hundred and fifty-one thousand, six hundred and fifteen (18,446,744,073,709,551,615), and that this is “about 2,000 times annual world production”. 
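The doubling the poem describes can be checked with a few lines of Python — a minimal sketch, not part of the original post:

```python
# One grain on the first square, doubling on each subsequent square,
# summed over all 64 squares of the chessboard.
def total_grains(squares: int = 64) -> int:
    total = 0
    on_square = 1  # grains on the current square
    for _ in range(squares):
        total += on_square
        on_square *= 2  # each square holds twice the previous one
    return total  # equivalently, 2**squares - 1

print(total_grains())  # 18446744073709551615
```

Sixty-four doublings turn a single grain into the eighteen-quintillion figure quoted above — the non-linear surprise that bankrupted the King.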
 
* Chengde Chen is the author of the philosophical poems collection: Five Themes of Today, Open Gate Press, London. He can be contacted on chengde.chen@hotmail.com

Monday, 7 September 2020

‘Mary’s Room’: A Thought Experiment

Posted by Keith Tidman
Can we fully understand the world through thought and language—or do we only really understand it through experience? And if only through experience, can we truly communicate with one another on every level? These were some of the questions which lay behind a famous thought experiment of 1982:
A brilliant neurophysiologist, Mary, knows all there is to know about her academic specialty, the science of vision: the physics, biology, chemistry, physiology, and neuroscience, including how we see colour.

There’s a catch, however: Mary has lived her entire life in a totally black-and-white room, watching a black-and-white screen, and reading black-and-white books. An entirely monochromatic existence. Then, unexpectedly, her screen reveals a bright-red tomato.

What was it like for Mary to experience colour for the first time? Or, as the Australian philosopher Frank Jackson, who originated this thought experiment, asked: ‘Will [Mary] learn anything or not?’ *

Jackson’s original takeaway from his scenario was that Mary’s first-time experience of red amounted to new knowledge—despite her comprehensive scientific knowledge in the field of colour vision. Jackson believed at the time that colour perception cannot entirely be understood without a person visually experiencing colour.

However, not everyone agreed. Some proposed that Mary’s knowledge, in the absence of first-hand experience, was at best only ever going to be partial, never complete. Indeed, renowned philosopher Thomas Nagel, of ‘what is it like to be a bat’ fame, was in the camp of those who argue that some information can only be understood subjectively.

Yet, Mary's complete acquaintance with the science of vision might well be all there is to understanding the formation of knowledge about colour perception. Philosopher and neurobiologist Owen Flanagan was on board, concluding that seeing red is a physical occurrence. As he put it, 'Mary knows everything about colour vision that can be expressed in the vocabularies of a complete physics, chemistry, and neuroscience.'

Mary would not have learned anything new, then, when the bright-red tomato popped up on her screen. Through the completeness of her knowledge of the science of colour vision, she already fully knew what her exposure to the red tomato would entail by way of sensations. No qualities of the experience were unknowable. The key is in how the brain gives rise to subjective knowledge and experience.

The matter boils down to whether there are nonphysical, qualitative sensations—like colour, taste, smell, feeling, and emotion—that require experience in order for us to become fully familiar with them. Are there limits to our comprehension of something we don’t actually experience? If so, Mary did learn something new by seeing red for the first time.

A few years after Frank Jackson first presented the ‘Mary’s room’ thought experiment, he changed his mind. After considering opposing viewpoints, he came to believe that there was nothing apart from redness’s physical description, of which Mary was fully aware. This time, he concluded that first-hand experiences, too, are scientifically objective, fully measurable events in the brain and thus knowable by someone with Mary’s comprehension and expertise.

This switching of his original position was prompted, in part, by American philosopher and cognitive scientist Daniel Dennett. Dennett asserted that if Mary indeed knew ‘absolutely everything about colour’, as Jackson’s thought experiment presumes, by definition her all-encompassing knowledge would include the science behind people’s ability to comprehend the actual sensation of colour.

To these points, Mary’s factual expertise in the science of colour experience—and the experience’s equivalence and measurability in the brain—appears sufficient to conclude she already knew what red would look like. The experience of red was part of her comprehension of human cognitive functions. Not just with regard to colour, but also to the full array of human mental states: for instance, pain, sweetness, coldness, exhilaration, tedium—ad infinitum.

As Jackson ultimately concluded, the gist is that, given the special particulars of the thought experiment—Mary acquired ‘all the physical information there is to obtain about what goes on when we see ripe tomatoes, or the sky, and use terms like red and blue’—Mary did not acquire new information upon first seeing the red tomato. She didn’t learn anything. Her awareness of redness was already complete.



* Frank Jackson, 'Epiphenomenal Qualia', Philosophical Quarterly, 32, April 1982.

Monday, 31 August 2020

Thought Experiment: How do you Price the Office Parlour Palm?

Posted by Martin Cohen
Here's one of a collection of short puzzles that might be considered an A-Z of the tricks of high finance: not so much 'P is for Parlour Palm', though, as 'C is for Cheap Collateral'.

This is the idea that if a bank agrees to loan the office parlour palm to the next door bank for a million dollars, and in return to rent their aspidistra for a million dollars, they both can update their asset sheets accordingly!

Now that's magic. But it was also the basis of the B for Bubble that brought down most of the world's banking system in 2008!

Of course, banks don't do silly stuff like buy each others' pot plants. But they do buy each others' packaged securities. And for many years, these packages became more and more complex, and thus more and more about buyer and seller agreeing on what mysterious qualities made the deal realistic. We know where that ended up: with thousands of dodgy loans to people who had no income or maybe had even died being bundled up and sold as top quality assets. Banks are plagued by problems with so-called ‘ghost’ collateral that disappears or is pledged to several lenders at the same time! After the crisis, the European Central Bank looked at the use of such devices and in a discussion paper wrote:

"the use of collateral is neither a sufficient nor a necessary condition for financial stability."*

The logicians could not have put it better!


* https://www.ecb.europa.eu/pub/pdf/scpwps/ecb.wp2107.en.pdf

Monday, 24 August 2020

The Necessity of Free Will

Eric Hanson, ArtAsiaPacific Magazine, Mar/Apr 2013
by Thomas Scarborough

I propose to solve the problem of free will.

The problem is, quite simply, the view that we live in a world where causality reigns supreme. If causality reigns supreme, then there can be no free will. And if we admit quantum indeterminacy to the picture, neither is indeterminacy free will.

I propose that the problem rests on an ancient conceptual dichotomy: the things-relations distinction. I propose, too, that this distinction is illusory. Aristotle called it the features-dispositions distinction. Wittgenstein called it the objects-arrangements distinction. We find it, too, in language (the nouns-verbs distinction), and in maths (variables-operators).

The alternative is obvious: there is no such distinction, but rather a fusion of things.  The philosopher Mel Thompson describes our world as ‘a seamless web of causality that goes forwards and backward in time and outwards in space’. ‘Seamless’, if we take it to mean exactly that, implies that there are no seams; there is no separation between things; therefore there is no relation between them.

Our reality has been variously described as an undifferentiated stream of experience, a kaleidoscopic flux of impressions, a swirling cloud without determinate shape. To make sense of this, then, we need to separate it into sounds and sights, surfaces and motions—which is individual things. We take aspects of a seamless whole, and we isolate them from the whole. Once done, we are able to trace relations between them.

With this, we have the basis of causality.  But in a seamless reality, where there is a fusion of things, all things cause all things. Even the language which we speak has an urge towards such fusion. There is an ‘evil’, wrote the philosopher and statesman Francis Bacon, in defining natural and material things.  ‘Definitions themselves consist of words, and those words beget others’.  Ultimately, our words reach into everything.

In the midst of an undifferentiated expanse, therefore, we create things, and we create causes. We isolate causes from the seamless whole—and with them, effects. But these causes must always strip something off.  This is why our thinking in terms of causality—which is supremely embodied in the modern scientific method—must bring about unwanted side effects of all kinds, through stripped-off relations.

When we say that A causes B we are, as it were, placing our drawing compass on the seamless web of causality and demarcating a circle in the midst of it: 'A'.  Outside of this circle lies the entire, seamless universe, and this knows no 'things'—until we create them in its midst. And when we create them, we create the intractable problem as to what a relation actually is.  A property?  An attribute?

Someone might object. Even if we have no things, no objects, no features (and so on) with which to create causality, we still have a reality which is bound by the laws of the universe. There is therefore some kind of something which is not free. Yet every scientific law is about A causes B. Whatever is out there, it has nothing in common with such a scheme—that we can know of anyway.

One more step is required to prove free will. Every cause that I identify is a creation of my own mind, in that it is freely chosen.  I am free to create it—which is, to demarcate the circle with the drawing compass. When I say that A caused B, I omit C, D, E, and every other possible cause, with the exception of what I want to create.  This is a choice without any kind of necessity.

I fire a shot at a clay pigeon. I choose the cause, and with the cause I choose the effect, and the pigeon shatters in the sky.  Now I see a nearby church bell.   I choose the cause, and I choose the effect, and an entire village awakes from its slumbers on a drowsy afternoon.   In this lies free will.  Cause and effect might seem iron clad—yet it is itself freely chosen.

But did I not cause my causes to be created?  Are not the causes and effects we invent themselves caused in some way?  This possibility is excluded.  We would need to readmit A’s and B’s to our scheme before we could claim cause.

David Bohm wrote that quantum theory is ‘the dropping of the notion of analysis of the world into relatively autonomous parts, separately existent but in interaction’.  In fact, this applies in every sphere.  Causality is illusory.  Not only that, but to say that any such illusion is caused is to admit causality through the back door.  There is no back door. 

Monday, 17 August 2020

And the Universe Shrugged




Posted by Keith Tidman

It’s not a question of whether humankind will become extinct, but when.

To be clear, I’m not talking about a devastatingly runaway climate; the predations of human beings on ecosystems; an asteroid slamming into Earth; a super-volcano erupting; a thermonuclear conflagration; a global contagion; rogue artificial intelligence; an eventual red-giant sun engulfing us; the pending collision of the Milky Way and Andromeda galaxies. Nor am I talking about the record of short-lived survival of our forerunners, like the Neanderthals, Denisovans, and Homo erectus, all of whom slid into extinction after unimpressive spans.

Rather, I’m speaking of cosmic death!

Cosmic death will occur according to standard physics, including cosmology. Because of the accelerating expansion of the universe and the irrepressibility of entropy — the headlong plunge toward evermore disorder and chaos — eventually no new stars will form, and existing stars will burn out. The universe will become uninhabitable long before its actual demise. Eventually a near vacuum will result. Particles that remain will be so unimaginably distanced from one another that they’ll seldom, if ever, interact. This is the ultimate end of the universe, when entropy reaches its maximum or so-called thermodynamic equilibrium, more descriptively dubbed ‘heat death’. There’s no place to duck; spacefaring won’t make a difference. Nowhere in the universe is immune.

Assuredly, heat death will take trillions of years to happen. However, might anyone imagine that the timeframe veils the true metaphysical significance of universal extinction, including the extinction of humans and all other conscious, intelligent life? And does it really make a difference if it’s tens of years or tens of trillions of years? Don’t the same ontological questions about being still searingly pertain, irrespective of timescale? Furthermore, does it really make a difference if this would be the so-called ‘sixth extinction’, or the thousandth, or the millionth, or the billionth? Again, don’t the same questions still pertain? There remains, amidst all this, the reality of finality. The consequences — the upshot of why this actuality matters to us existentially — stay the same, immune to time.

So, to ask ‘what is the meaning of life?’ — that old chestnut from inquiring minds through the millennia — likely becomes moot and even unanswerable, in the face of surefire universal extinction. As we contemplate the wafer-thin slice of time that makes up our eighty-or-so-year lifespans, the question seems to make a bit of sense. That select, very manageable timeframe puts us into our comfort zone; we can assure ourselves of meaning, to a degree. But the cosmological context of cosmic heat death contemptuously renders the question about life’s purpose without an answer; all bets are off. And, in face of cosmic thermodynamic death, it’s easy to shift to another chestnut: why, in light of all this, is there something rather than nothing? All this while we may justifiably stay in awe of the universe’s size and majesty, yet know the timing and inevitability of our own extinction rests deterministically in its hands.

A more suitable question might be whether we were given, evolutionarily, consciousness and higher-order intelligence for a reason, making it possible for us to reflect on and try to make sense of the universe. And where that ‘reason’ for our being might originate: an ethereal source, something intrinsic to the cosmos itself, or other. It’s possible that the answer is simply that humankind is incidental, consigning issues like beginnings to unimportance or even nonsense. After all, if the universe dies, and is itself therefore arguably incidental, we may be incidental, too. Again, the fact that the timeframe is huge is immaterial to these inquiries. Also immaterial is whether there might, hypothetically, be another, follow-on Big Bang. Whereby the cosmological process restarts, to include a set of natural physical laws, the possible evolution of intelligent life, and, let’s not overlook it, entropy all over again.

We compartmentalise our lives, to make sense of the bits and pieces that competitively and sometimes contradictorily impact us daily. And in the case of cosmic death and the extinction of life — ours and everyone else’s possibly dotting the universe — that event’s speck-like remoteness in distant time and the vastness of space understandably mollifies. This, despite the event’s unavoidability and hard-to-fathom, hard-to-internalise conclusiveness, existential warts and all. To include, one might suppose, the end of history, the end of physics, and the end of metaphysics! This end of everything might challenge claims to any singular specialness of our and other species, all jointly riding our home planets to this peculiar end. 

Perhaps we have no choice, in the meantime, but to conduct ourselves in ways that reflect our belief systems and acknowledge the institutional tools (sociological, political, spiritual) used to referee those beliefs. As an everyday priority, we’ll surely continue to convert those beliefs into norms, to improve society and the quality of life in concrete, actionable ways. Those norms and institutions enable us to live an orderly existence — one that our minds can plumb and make rational sense of. Even though that may largely be a salve, it may be our best (realistically, only) default behaviour in contending with daily realities, ranging from the humdrum to the spectacular. We tend to practise what’s called ‘manic defence’, whereby people distract themselves by focusing on things other than what causes their anxiety and discomfort.

The alternative — to capitulate, falling back upon self-indulgent nihilism — is untenable, insupportable, and unsustainable. We are, after all, quite a resilient species. And we live every day with comparatively attainable horizons. There remains, moreover, a richness to our existence, when our existence is considered outside of extraordinary universal timeframes. Accordingly, we go on with our lives with optimism, not dwelling on the fact that something existential will eventually happen — our collective whistling past the graveyard, one might say. We seldom, if ever, factor this universal expiry date into our thinking — understandably so. There would be little to gain, on any practical level, in doing otherwise. Cosmic thermodynamic death, after all, doesn’t concern considerations of morality. Cosmic death is an amoral event, devoid of concerns about its rightness or wrongness. It will happen matter of factly.

Meanwhile, might the only response to cosmic extinction — and with it, our extinction — be for the universe and humanity to shrug?

Monday, 10 August 2020

A New Dark Age?

Genseric's Invasion of Rome, by Karl Bryullov, 1833
By Allister Marran
Are we living through a mini Dark Age in what was supposed to be a time of Enlightenment? Will history see this moment in the same light as it saw the decline of Western civilisation during the thousand lost years from 500 to 1500 AD?
The democratisation of, and free access to, information with the rise of the Internet, mobile phones, and social media should have made people smarter, more knowledgeable, and more aware of the world around them.

Being able to access information previously locked behind the paywall of a university education, a military career, or a scientific laboratory seemed to be a renaissance-like utopia ushering in the next stage of the socio-cultural evolution of humankind.

But instead, we are now ostensibly far dumber for it. Information without context is worthless, and the weaponisation of both information and context, by nefarious actors forging narratives that divide and conquer rather than unite and build, has had a devastating effect on world politics and social cohesion.

The value of information is that it is held in the trust that it is authentic, and this is where the manipulation of data has seen the biggest gains for bad actors seeking power and influence.

A false fact, or a distorted perspective, can be drip-fed into the social consciousness using complex social media algorithms, which identify those most willing to buy into the lie through confirmation bias. These influencers are encouraged to share and comment, thus lending the lie credibility and ensuring that the common man or woman will continue to share it downstream, until the lie becomes generally accepted as truth by those who want to believe it.

The right uses innate prejudice and hatred to rally support for bigotry and lies, which the anonymity of the Internet and a carefully chosen Twitter handle protect like a KKK mask of old.

And with the advent of cancel culture and left-wing propaganda, people are too scared to challenge obvious falsehoods that emerge on the left, for fear of being cancelled themselves — a modern version of the Salem Witch Trials, if you will.

This has permeated into every facet of life, not least of which being science and education. No student or professor dare take on any controversial research project, lest they be cancelled, stripped of tenure, or harmed if their scientifically verifiable results are taken to task by anyone on the left or right. Truth is no longer important; pandering to an already decided audience is the only thing that matters.

This is how progress stalls. This is how the last dark age began, when science and truth were made to conform to the beliefs and norms of the religious conveniences of men and women.

Perhaps humankind was not meant to have unfettered access to knowledge and information, as with great power comes great responsibility, and the average person does not have the ability to filter out the nonsense and internalise the good data.

The current course on which we are headed has dire consequences for everyone. We have not been this close together physically, yet this far apart ideologically, for almost a hundred years; and without some method of bringing us all back together again, human beings will either end up in conflict or, less desirably, enter another thousand years of Dark Ages.

Monday, 3 August 2020

Picture Post 57: A Clown's Playground



'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Photo credit: Rebecca Tidman
Posted by
Tessa den Uyl

Putting on a clown’s nose is a subtle and non-violent gesture of distinction — but distinguishing what, exactly? The red ball on the nose un-identifies its wearer immediately, almost as if to become part of another species. Clowns may be funny and dramatic, stupid and incredibly smart, poetic without prose, offensive, scary, or sweet. Clowns attain to a world that mirrors the exaggeration of our being human.

The image of the clown offers the spectator a space to de-personalise in its turn, and in this psychological game the clown creates its playground. If a clown communicates, this is by touching all the unfolded layers we carry along within ourselves.


Indeed, clowns could very well have become a branch like ‘action psychotherapy’, except that clowns are much older than psychotherapy itself. Perhaps this is why many ideas about clowns are misapprehended, and why the partly negative view of them, or the charge of childishness, belongs not so much to clowns themselves as to how being human has been debased in their former appearances.

Monday, 27 July 2020

Poem: Fragility

By Jeremy Dyer


Shattered Glass Shoots, by Claus Bellers.

Fragility is a foolish thing
I don’t believe you're made of glass
Stop wallowing in your suffering
It’s just a pose, now move your arse.

Your sensitivity is a sham
You’re hard as nails so drop the scam
Just pull yourself together now
You're not a sacred Indian cow.

Fragility is a hard tiara
With metal thorns to make you bleed
I don't want your psycho-drama
Just tell me what the hell you need.

Fragility is here to stay
First blowing up then tearing down
To get the child her selfish way
Bipolar like a circus clown.

Fragility, the role you wear
Spewing out your evil wrath
Mercenary, the cross you bear
Exploiting all who cross your path.

Fragility, the cruellest mask
Deceiving all with poison smile
Killing the ones you take to task
Victimising all the while.


Editors' note: In recent weeks, fragility as a social term has been covered, among others, by The Guardian, The New York Times, and The Atlantic. Where an issue becomes all too familiar, poetry may infuse fresh vigour. 'The function of poetry,' wrote the linguist and literary theorist Roman Jakobson, 'is to point out that the sign is not identical to the referent.'

Monday, 20 July 2020

Miracles: Confirmable, or Chimerical?

Posted by Keith Tidman

Multiplication of the Loaves, by Georges, Mount Athos.
We are often passionately told of claims to experienced miracles, in both the religious and secular worlds. The word ‘miracle’ comes from the Latin mirari, meaning to wonder. But what are these miracles that some people wonder about, and do they happen as told?

Scottish philosopher David Hume, a sceptic on this matter, defined a miracle as ‘a violation of the laws of nature’ — with much else to say on the issue in his An Enquiry Concerning Human Understanding (1748). He proceeded to define the transgression of nature as due to a ‘particular volition of the Deity, or by the interposition of some invisible agent’. Though how much credence might one place in ‘invisible agents’?

Other philosophers, like Denmark’s Søren Kierkegaard in his pseudonymous persona Johannes Climacus, also placed themselves in Hume’s camp on the matter of miracles. Earlier, Dutch philosopher Baruch Spinoza wrote of miracles as events whose source and cause remain unknown to us (Tractatus Theologico-Politicus, 1670). Yet, countless other people around the world, of many religious persuasions, earnestly assert that the appeal to miracles is one of the cornerstones of their faith. Indeed, some three-fourths of survey respondents indicated they believe in miracles, while nearly half said they have personally experienced or seen a miracle (Princeton Survey Research Associates, 2000; Harris poll, 2013).

One line of reasoning as to whether miracles are credible might start with the definition of miracles, such as transgressions of natural events uncontested convincingly by scientists or other specialists. The sufficiency of proof that a miracle really did occur and was not, deus ex machina, just imagined or stemming from a lack of understanding of the laws underlying nature is a very tall order, as surely it should be.

Purported proof would come from people who affirm they witnessed the event, raising questions about witnesses’ reliability and motives. In this regard, it would be required to eliminate obvious delusions, fraud, optical illusions, distortions, and the like. The testimony of witnesses in such matters is, understandably, often suspect. There are demanding conditions regarding definitions and authentication — such as of ‘natural events’, where scientific hypotheses famously, but for good reason, change to conform to new knowledge acquired through disciplined investigation. These conditions lead many people to dismiss the occurrence of miracles as pragmatically untenable, requiring by extension nothing less than a leap of faith.

But a leap of faith suggests that the alleged miracle happened through the interposition of a supernatural power, like a god or other transcendent, creative force of origin. This notion of an original source gives rise, I argue, to various problematic aspects to weigh.

One might wonder, for example, why a god would have created the cosmos to conform to what by all measures is a finely grained set of natural laws regarding cosmic reality, only later to decide, on rare occasion, to intervene. That is, where a god suspends or alters original laws in order to allow miracles. The assumption being that cosmic laws encompass all physical things, forces, and the interactions among them. So, a god choosing not to let select original laws remain in equilibrium, uninterrupted, seems selective — incongruously so, given theistic presumptions about a transcendent power’s omniscience and omnipotence and omniwisdom.

One wonders, thereby, what’s so peculiarly special about humankind to deserve to receive miracles — symbolic gestures, some say. Additionally, one might reasonably ponder why it was necessary for a god to turn to the device of miracles in order for people to extract signals regarding purported divine intent.

One might also wonder, in this theistic context, whether something was wrong with the suspended law to begin with, to necessitate suspension. That is, perhaps it is reasonable to conclude from miracles-based change that some identified law is not, as might have been supposed, inalterably good in all circumstances, for all eternity. Or, instead, maybe nothing was in fact defective in the original natural law, after all, there having been merely an erroneous read of what was really going on and why. A rationale, thereby, for alleged miracles — and the imagined compelling reasons to interfere in the cosmos — to appear disputable and nebulous.

The presumptive notion of ‘god in the gaps’ seems tenuously to pertain here, where a god is invoked to fill the gaps in human knowledge — what is not yet known at some point in history — and thus by extension allows for miracles to substitute for what reason and confirmable empirical evidence might otherwise and eventually tell us.

As Voltaire further ventured, ‘It is . . . impious to ascribe miracles to God; they would indicate a lack of forethought, or of power, or both’ (Philosophical Dictionary, 1764). Yet, unsurprisingly, contentions like Voltaire’s aren’t definitive as a closing chapter to the accounting. There’s another facet to the discussion that we need to get at — a nonreligious aspect.

In a secular setting, the list of problematic considerations regarding miracles doesn’t grow easier to resolve. The challenges remain knotty. A reasonable assumption, in this irreligious context, is that the cosmos was not created by a god, but rather was self-caused (causa sui). In this model, there were no ‘prior’ events pointing to the cosmos’s lineage. A cosmos that possesses integrally within itself a complete explanation for its existence. Or, a cosmos that has no beginning — a boundless construct having existed infinitely.

One might wonder whether a cosmos’s existence is the default, stemming from the cosmological contention that ‘nothingness’ cannot exist, implying no beginning or end. One might further ponder how such a cosmos — in the absence of a transcendent force powerful enough to tinker with it — might temporarily suspend or alter a natural law in order to accommodate the appearance of a happening identifiable as a miracle. I propose there would be no mechanism to cause such an alteration to the cosmic fabric to happen. On those bases, it may seem there’s no logical reason for (no possibility of) miracles. Indeed, the scientific method does itself call for further examining what may have been considered a natural law whenever there are repeated exceptions or contradictions to it, rather than assuming that a miracle is recurring.

Hume proclaimed that ‘no testimony is sufficient to establish a miracle’; centuries earlier, Augustine of Hippo articulated a third, and broader take on the subject. He pointedly asked, ‘Is not the universe itself a miracle?’ (The City of God, 426 AD). Here, one might reasonably interpret ‘a miracle’ as synonymous for a less emotionally charged, temporal superlative like ‘remarkable’. I suspect most of us agree that our vast, roiling cosmos is indeed a marvel, though debatably not necessitating an originating spiritual framework like Augustine’s. 

No matter how supposed miracles are perceived, internalised, and retold, the critical issue of what can or cannot be confirmed dovetails to an assessment of the ‘knowledge’ in hand: what one knows, how one knows it, and with what level of certainty one knows it. So much of reality boils down to probabilities as the measuring stick; the evidence for miracles is no exception. If we’re left with only gossamer-thin substantiation, or no truly credible substantiation, or no realistically potential path to substantiation — which appears the case — claims of miracles may, I offer, be dismissed as improbable or even phantasmal.
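Hume’s maxim that ‘no testimony is sufficient to establish a miracle’ can be given a probabilistic reading via Bayes’ theorem. A minimal sketch, with purely hypothetical numbers chosen only to illustrate the point: even highly reliable testimony cannot establish an event whose prior probability is vanishingly small.

```python
# An illustrative Bayesian reading of Hume's maxim (all numbers hypothetical):
# how likely is the event, given that a witness testifies to it?

def posterior(prior: float, true_positive: float, false_positive: float) -> float:
    """P(event | testimony) by Bayes' theorem.

    prior          : P(event) before any testimony
    true_positive  : P(testimony | event)      -- the witness reports truly
    false_positive : P(testimony | no event)   -- the witness errs or deceives
    """
    evidence = true_positive * prior + false_positive * (1 - prior)
    return true_positive * prior / evidence

# A witness who testifies truly 99% of the time, and falsely only 1% of the
# time, reporting an event with a one-in-a-million prior:
p = posterior(prior=1e-6, true_positive=0.99, false_positive=0.01)
print(f"P(event | testimony) = {p:.6f}")
# The posterior remains below one in ten thousand: it is still far more
# probable that the testimony is mistaken than that the event occurred.
```

The design point is Hume’s own: the testimony would need to be so reliable that its falsehood would be more miraculous than the miracle it reports.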

Monday, 13 July 2020

Staring Statistics in the Face

By Thomas Scarborough

George W. Buck’s dictum has it, ‘Statistics don’t lie.’ Yet the present pandemic should give us pause. The statistics have been grossly at variance with one another.

According to a paper in The Lancet, statistics ‘in the initial period’ estimated a case fatality rate or CFR of 15%. Then, on 3 March, the World Health Organisation announced, ‘Globally, about 3.4% of reported COVID-19 cases have died.’ By 16 June, however, an epidemiologist was quoted in Nature, ‘Studies ... are tending to converge around 0.5–1%’ (now estimating the infection fatality rate, or IFR).

Indeed it is not as simple as all this—but the purpose here is not to side with any particular figures. The purpose is to ask how our statistics could be so wrong. Wrong, rather than, shall we say, slanted. Statistical errors have been of such a magnitude as is hard to believe. A two-fold error should be an enormity, let alone ten-fold, or twenty-fold, or more.

The statistics, in turn, have had major consequences. The Lancet rightly observes, ‘Hard outcomes such as the CFR have a crucial part in forming strategies at national and international levels.’ This was borne out in March, when the World Health Organisation added to its announcement of a 3.4% CFR, ‘It can be contained—which is why we must do everything we can to contain it’. And so we did. At that point, human activity across the globe—sometimes vital human activity—came to a halt.

Over the months, the figures have been adjusted, updated, modified, revised, corrected, and in some cases, deleted. We are at risk of forgetting now. The discrepancies over time could easily slip our attention, where we should be staring them in the face.

The statistical errors are a philosophical problem. Cambridge philosopher Simon Blackburn points out two problems with regard to fact. Fact, he writes, 'may itself involve value judgements, as may the selection of particular facts as the essential ones'. The first of these problems is fairly obvious. For example, ‘Beethoven is overrated’ might seem at first to represent a statement of fact, where it really does not. The second problem is critical. We select facts, yet do so on a doubtful basis.

Facts do not exist in isolation. We typically insert them into equations, algorithms, models (and so on). In fact, we need to form an opinion about the relevance of the facts before we even seek them out—learning algorithms not excepted. In the case of the present pandemic, we began with deaths ÷ cases × 100 = CFR. We may reduce this to the equation a ÷ b × 100 = c. Yet notice now that we have selected variables a, b, and c, to the exclusion of all others. Say, x, y, or z.
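The consequence of that selection can be made concrete with a short sketch. The numbers below are purely hypothetical, chosen only to illustrate scale: the same deaths (variable a) divided by a different denominator (variable b) yield headline figures that differ ten-fold.

```python
# A sketch of the a ÷ b × 100 = c equation from the text, with hypothetical
# numbers, showing how the choice of denominator drives the headline figure.

def fatality_rate(deaths: int, denominator: int) -> float:
    """a ÷ b × 100 = c."""
    return deaths / denominator * 100

deaths = 50
confirmed_cases = 1_000        # only those tested and confirmed (CFR denominator)
estimated_infections = 10_000  # including mild and asymptomatic cases (IFR denominator)

cfr = fatality_rate(deaths, confirmed_cases)       # 5.0%
ifr = fatality_rate(deaths, estimated_infections)  # 0.5%

print(f"CFR: {cfr:.1f}%  IFR: {ifr:.1f}%")
```

Nothing in the arithmetic is wrong in either case; the ten-fold gap comes entirely from which facts were selected as ‘the essential ones’.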

What then gave us the authority to select a, b, and c? In fact, before we make any such selection, we need to 'scope the system'. We need to demarcate our enterprise, or we shall easily lose control of it. One cannot introduce any and every variable into the mix. Again, in the words of Simon Blackburn, it is the ‘essential’ facts we need. This in fact requires wisdom—a wisdom we cannot do without. In the words of the statistician William Briggs, we need ‘slow, maturing thought’.

Swiss Policy Research comments on the early phase of the pandemic, ‘Many people with only mild or no symptoms were not taken into account.’ This goes to the selection of facts, and reveals why statistics may be so deceptive. They are facts, indeed, but they are selected facts. For this reason, we have witnessed a sequence of events over recent months, something like this:
  • At first we focused on the case fatality rate, or CFR
  • Then we took into account the infection fatality rate, or IFR
  • Then we took social values into account (which led to some crisis of thought)
  • Now we take non-viral fatalities into account (which begins to look catastrophic)
This is too simple, yet it illustrates the point. Statistics require the wisdom to tell how we should delineate relevance. Statistics do not select themselves. Subjective humans do it. In fact, I would contend that the selection of facts in the case of the pandemic was largely subconscious and cultural. It stands to reason that, if we have dominant social values, these will tend to come first in our selection process.

In our early response to the pandemic, we quickly developed a mindset—a mental inertia which prevented us from following the most productive steps and the most adaptive reasoning, and every tragic death reinforced this mindset, and distracted us. Time will tell, but today we generally project that far more people will die through our response to the pandemic than died from the pandemic itself—let alone the suffering.

The biggest lesson we should be taking away from it is that we humans are not rational. Knowledge, wrote Confucius, is to know both what one knows, and what one does not know. We do not know how to handle statistics.

Monday, 6 July 2020

Picture Post 56: Fate on the Verge of Extinction



'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl
Photo credit: African shared pictures. Cameroon.

The woman in white, called ‘the female pastor’, cures a woman affected with COVID-19. What is interesting in the picture is the physical approach this female pastor takes in regard to a contagious disease. Noteworthy, too, is the posture of the patient, who completely surrenders to this kind of aid.

Superstition. Can it or can it not cure?

When we dive into other cultures, we should be careful in responding to this question. In the case of this specific picture, we are talking about a place where the native language itself is in the throes of extinction. And with a language that is only spoken, not written, the population of such an ethnic group becomes extremely vulnerable towards misinformation.

Suppose you have grown up believing in magic, and regular medicine has never reached your habitat, beyond perhaps an aspirin. To reach out for what your people have always known is not stupid; it is simply obvious. Less apparent is the exploitation of the superstition of minority groups for personal benefit in a context of capitalism and mass urbanisation. Hence the two often go together!

To exploit the nature of a virus like COVID-19 with a blow in the face is not taking care of ‘your flock’; rather, it draws upon very old traditions that cannot endure the loss of the mind as a mystical labyrinth, in favour of the power of the human mind alone to find a cure.

Inherently, this picture questions where the idea of destiny, which is characteristic of superstition, is going to stand in a globalising world.

Monday, 29 June 2020

The Afterlife: What Do We Imagine?

Posted by Keith Tidman


‘The real question of life after death isn’t whether 
or not it exists, but even if it does, what 
problem this really solves’

— Wittgenstein, Tractatus Logico-Philosophicus, 1921

Our mortality, and how we might transcend it, has been one of humanity’s central preoccupations since prehistory. One much-pondered possibility is that of an afterlife. This would potentially serve a variety of purposes: to buttress fraught quests for life’s meaning and purpose; to dull unpleasant visions of what happens to us physically upon death; to switch out fear of the void of nothingness with hope and expectation; and, to the point here, to claim continuity of existence through a mysterious hereafter thought to defy and supplant corporeal mortality.

And so, the afterlife, in one form or another, has continued to garner considerable support to the present. An Ipsos/Reuters poll in 2011 of the populations of twenty-three countries found that a little over half believe in an afterlife, with a wide range of outcomes correlated with how faith-based or secular a country is considered. The Pew Center’s Religious Landscape Study polling found, in 2014, that almost three-fourths of people seem to believe in heaven and more than half said that they believed in hell. The findings cut across most religions. Separately, research has found that some one-third of atheists and agnostics believe in an afterlife — one imagined to include ‘some sort of conscious existence’, as the survey put it. (This was the Austin Institute for the Study of Family and Culture, 2014.) 

Other research has corroborated these survey results. Researchers based at Britain's Oxford University in 2011 examined forty related studies conducted over the course of three years by a range of social-science and other specialists (including anthropologists, psychologists, philosophers, and theologians) in twenty countries and different cultures. The studies revealed an instinctive predisposition among people to an afterlife — whether of a soul or a spirit or just an aspect of the mind that continues after bodily death.

My aim here is not to exhaustively review all possible variants of an afterlife subscribed to around the world, like reincarnation — an impracticality for the essay. However, many beliefs in a spiritual afterlife, or continuation of consciousness, point to the concept of dualism, entailing a separation of mind and body. As René Descartes explained back in the 17th century:
‘There is a great difference between the mind and the body, inasmuch as the body is by its very nature always divisible, whereas the mind is clearly indivisible. For when I consider the mind, or myself insofar as I am only a thinking thing, I cannot distinguish any parts within myself. . . . By contrast, there is no corporeal or extended thing that I can think of which in my thought I cannot easily divide into parts. . . . This one argument would be enough to show me that the mind is completely different than the body’ (Sixth Meditation, 1641).
However, in the context of modern research, I believe that one may reasonably ask the following: Are the mind and body really two completely different things? Or are the mind and the body indistinct — the mind reducible to the brain, where the brain and mind are integral, inseparable, and necessitating each other? Mounting evidence points to consciousness and the mind as the product of neurophysiological activity. As to what’s going on when people think and experience, many neuroscientists favour the notion that the mind — consciousness and thought — is entirely reducible to brain activity, a concept sometimes variously referred to as physicalism, materialism, or monism. But the idea is that, in short, for every ‘mind state’ there is a corresponding ‘brain state’, a theory for which evidence is growing apace.

The mind and brain are today often considered, therefore, not separate substances. They are viewed as functionally indistinguishable parts of the whole. There seems, consequently, not to be broad conviction in mind-body dualism. Contrary to Cartesian dualism, the brain, from which thought comes, is physically divisible according to hemispheres, regions, and lobes — the brain’s architecture; by extension, the mind is likewise divisible — the mind’s architecture. What happens to the brain physically (from medical or other tangible influences) affects the mind. Consciousness arises from the entirety of the brain. A brain — a consciousness — that remarkably is conscious of itself, demonstrably curious and driven to contemplate its origins, its future, its purpose, and its place in the universe.

The contemporary American neuroscientist, Michael Gazzaniga, has described the dynamics of such consciousness in this manner:
‘It is as if our mind is a bubbling pot of water. . . . The top bubble ultimately bursts into an idea, only to be replaced by more bubbles. The surface is forever energized with activity, endless activity, until the bubbles go to sleep. The arrow of time stitches it all together as each bubble comes up for its moment. Consider that maybe consciousness can be understood only as the brain’s bubbles, each with its own hardware to close the gap, getting its moment’. (The Consciousness Instinct, 2018)
Moreover, an immaterial mind and a material world (such as the brain in the body), as dualism typically frames reality, would be incapable of acting upon each other: what’s been dubbed the ‘interaction problem’. Therefore the physicalist model — strengthened by research in fields like neurophysiology, which quicken to acquire ever-deeper learning — has, arguably, superseded the dualist model.

People’s understanding that, of course, they will die one day has spurred the search for a spiritual continuation of earthbound life. Apprehension motivates. The yearning for purpose motivates. People have thus sought evidence, empirical or faith-based or other, to underprop their hope for otherworldly survival. However, modern reality as to the material, naturalistic basis of the mind may prove an injurious blow to notions of an out-of-body afterlife. After all, if we are our bodies and our bodies are us, death must end hope for survival of the mind. As David Hume graphically described our circumstances in Of the Immortality of the Soul (1755), our ‘common dissolution in death’. That some people are nonetheless prone to evoke dualistic spectral spirits — stretching from disembodied consciousness to immortal souls — as a pretext for wishfully thwarting the interruption of life does not change the finality of existence.

And so, my conclusion is that perhaps we’d be better served to find ingredients for an ‘afterlife’ in what we leave by way of influences, however ordinary and humble, upon others’ welfare. That is, a legacy recollected by those who live on beyond us, in its ideal a benevolent stamp upon the present and the future. This earthbound, palpable notion of what survives us goes to answer Wittgenstein’s challenge we started with, regarding ‘what problem’ an afterlife ‘solves’, for in this sense it solves the riddle of what, realistically, anyone might hope for.

Monday, 22 June 2020

Hope Against Hope

Thomas Scarborough. After the Veldfire.
By Thomas Scarborough
There are better things to look forward to.  That is what hope is about.  I hope to be happy.  I hope to be well.  I hope to succeed.  Even through struggle and strife, I hope for it all to be worthwhile.  The philosopher Immanuel Kant put it simply, ‘All hope concerns happiness.’ 
But wait, said the ancient Greek philosophers.  On what does one base such hope?  Hope is 'empty', wrote Solon. ‘Mindless’, wrote Plato.  Then the Roman philosopher Seneca saw the dark side, which has cast a shadow over hope ever since.  Hope and fear, he wrote, ‘march in unison like a prisoner and the escort he is handcuffed to. Fear keeps pace with hope.’

The standard account of hope is this: the object of hope must be uncertain, and a person must wish for it—and here is the trouble with hope.  There is not much about hope that is rational.  We have no sound reason to believe it is justified.  It is clear that one’s hopes may not come true.

Why then hope?  Even when hopes are fulfilled—if they are fulfilled—the journey often involves struggle, and heartache, and not a little luck.  And when I have been through all that, I may well have to go through it all again.  Another goal, another relationship. How often?  At what cost?  Often enough, our hopes, once realised, may still disappoint.  They so often leave us with less to hope for than we had before.

There is a psychological problem, too.  It is called the ‘problem of action’.  Today few disagree that, most basically, I am motivated to act when I hold up the world in my mind to the world itself, and there discover a disjoint between the two.  To put it another way, we are motivated by mental models.

Yet the opposite is true, too.  Just as a disjoint between expectation and reality motivates me, so a lack of such disjoint demotivates me.  It may potentially remove any motivation at all.  We cannot go on with a view of the world which is born of the world itself.

There is a hope, observed the philosopher Roe Fremstedal, which occurs spontaneously in youth, yet is often disappointed in time.  Many start out in life with high hopes, pleasant dreams, and enthusiasm to spare.  But as we progress through life, disillusionment sets in.  And disillusionment, presumably, means coming to see things for what they are.  The disjoint is lost.

And then, death. What kind of hope can overcome death?  Death destroys everything.  An anonymous poet wrote,
Nothing remains but decline,
Nothing but age and decay.
Someone might object.  ‘This is seeing the glass half empty.  Why not see it half full?’  But put it like this.  There is certainly no greater reason to hope than there is to fear or despair.

Is there hope for me?  Is there hope for my environment?  For society?  History?  The universe?  I side with the ancient Greeks.  They had the courage to tell it like it is.  Hope as we generally know it is mere deception and superstition.  ‘Hope,’ wrote Nietzsche, ‘is the worst of all evils because it prolongs the torments of man.’

When I was at school, we sang a song.  To schoolboys at the time, it seemed like a statement of boundless optimism and cheer.  Titled ‘The Impossible Dream’, it came from a Broadway musical of 1965—and it closes with these words:
Yes, and I'll reach
The unreachable star!
It seems hard to tell now whether the songwriter was sincere.  Some say that the striving which the words represent is more important than the words themselves.  Some say the songwriter was characterising his starry-eyed younger self.  More likely, it seems, he was raving against a contradictory universe, in a nonsensical song.

People have tried in various ways to get around the problems of hope.  We should best project our hopes onto something else, they say: society, history, eternity.  Some have said that hope just happens—so let it happen.  Some have said that we should quell our hopes—which might work if our minds did not transcend time.  Lately, hope tends to be studied as a mere phenomenon: this is how we define it; this is what it does.

The only way to hope in this life, wrote the Danish philosopher Søren Kierkegaard, is to ‘relate oneself expectantly to the possibility of the good’.  In fact, ‘at every moment always,’ he wrote, ‘one should hope all things’.  We hope, because there are all good things to look forward to, always.*

If this is to be true, there is one necessary condition.  All of our present actions, and all events, must serve our good and happiness.  Even our greatest disappointments, our greatest causes for despair—even death itself—must be interpreted as hope and be grounded in hope.  True hope cannot be conditional, as the Greeks rightly saw.

What guarantees such hope?  The theologian Stephen Travis wrote, ‘To hope means to look forward expectantly for God’s future activity’.  This de-objectifies hope—it relativises it, because God's activity cannot be known—and it provides the translation of fear and despair, to hope.  Yet even without bringing God into it, there would have to be something that translates fear and despair.  The only challenge that remains is to identify it and appropriate it.

Whatever comes my way—everything that comes my way—is something to be hoped for, not because I hope according to the standard account, but because I have an unconditional hope.  We call it ‘hope against hope’.



* Note, however, that there is a more existential possibility. If I have an unconditional hope which is, as it were, already fulfilled in the present—the present already representing 'all good things'—then I may expect the same of the future.  This overcomes the notion that hope is too future-orientated.

Monday, 15 June 2020

Joad’s Concept of Personality

Posted by Richard W. Symonds
There is a small group of significant philosophers who had extraordinary turnarounds. The most famous of these is Ludwig Wittgenstein, who wrote of his magnum opus, ‘The author of the Tractatus was mistaken.’ So, too, A.J. Ayer, who, in an interview with the BBC, said of his former philosophy, ‘At the end of it all it was false’. Yet perhaps the most extraordinary turnaround was that of the enormously popular C.E.M. Joad.
Cyril Edwin Mitchinson Joad (1891-1953) was a university philosopher at Birkbeck College London, who wrote on a wide variety of philosophical subjects, both historical and contemporary. For most of his life he rejected religion—but in the 1940s and early 1950s he first abandoned atheism, then accepted a form of theism, and finally converted to Christianity.

Not until Recovery of Belief, in 1952, did he set out the Christian philosophy in which he had come to believe. This post explores just one aspect of that philosophy, namely his theory of personality and the soul—then briefly, what motivated him philosophically, to make such a radical about-turn. Here is Joad’s later view, in his own words:
‘Having considered and rejected a number of views as to the nature and interpretation of the cosmos, I shall state the one which seems to me to be open to the fewest objections. It is, briefly, what I take to be the traditional Christian view, namely, that the universe is to be conceived as two orders of reality, the natural order, consisting of people and things moving about in space and enduring in time, and a supernatural order neither in space nor in time, which consists of a Creative Person or Trinity of Persons from which the natural order derives its meaning, and in terms of which it receives its explanation.’
In his ‘interpretation of the cosmos’, then, Joad proceeds by seeking to vindicate ‘the traditional division of the human being [as] not twofold into mind and body, but threefold into mind, body and soul.’ The reference seems to be to the view identifiable in late-Scholastic theology, that a human being has an immortal part which can sin, be forgiven, and rise at the Last Judgement (the soul); a thinking part which can understand, affirm, deny, desire, imagine (the mind); and a body which is the agent of the mind and soul.

In fairness, Joad does not claim to demonstrate the validity of the threefold analysis; he claims no more than that ‘if it were true it would cover a number of facts which seem to be inexplicable on any other’. He offers it as what we might term an inference to the best explanation. He found no better way to explain the cosmos as he found it.

The soul, Joad tells us, is ‘the essential self and is timeless’. It is incarnated in bodies but can exist without them, since after our bodily death, it remains an individual entity and ‘sustains immortality’. At this point, the influence of Plato’s theory of the soul in the Phaedo is clear. Unplatonic, however, is the notion that the soul is ‘normally inaccessible to us’, and that we at least approximate to an awareness of it in ‘mystical experience’—experience with which ‘most of us, at any rate, are acquainted [in] certain moments of transport of tranquillity that we enjoy in our intercourse with nature’.

Yet Joad’s theory does not rely solely on mystical experience. There are those, he writes, to whom mystical experience is denied. Thus he posits the soul as our ‘point of contact and communication with the divine ... God, to use the language of religion, influences man through his soul’.

Joad suggests that ‘The phenomena of spiritual healing and spiritual regeneration are ... most plausibly to be explained on the assumption that God, in response to prayer, acts upon us through the soul to heal the body and strengthen the mind. The soul is also the “still small voice of God” of which we are conscious when the hubbub of ordinary life and consciousness dies down’. This presupposes the existence of God, and of a God who acts in these ways.

Of the mind, Joad tells us that it ‘is brought into being in consequence of the contact of the soul with the natural, temporal order, which results from its incorporation in a physical body’. The mind cannot be identified with matter, as Locke’s ‘thinking substance’, for instance. Mind ‘cannot be adequately conceived in material terms ... Is the notion of conscious matter really thinkable?’ Joad asks rhetorically and in protest against Julian Huxley.

Yet Joad concedes that ‘The mind is, it is clear, constantly interacting with the body and the brain.’ Again, it is not Joad’s purpose to demonstrate the validity of his analysis. In fact, he states that this is a paradoxical occurrence which ‘is, by us, incomprehensible’. This incomprehensibility, further, he sees as being characteristic of what he calls ‘all the manifestations of the supernatural in the natural order’; the supernatural here being the soul—with the mind and the natural being the brain and the body.

There is, however, a crucial concept which subsumes the categories of body, mind, and soul. This is ‘personality’, which Joad describes as being ‘logically prior’ to the soul, mind, and body as the three elements of our being. He introduces us to this concept by considering the relation of a sonata to its notes, and of nation or society to its members (with a more thorough discussion of mereology).

While Joad does not define logical priority, the basic idea is that the soul (to borrow a phrase from C.D. Broad) is ‘an existent substantive’ which temporarily ‘owns’ or is characterised by the mind, the brain, and the body. Hence any idea that the person is a composite, ‘resulting from the concurrence of a number of parts’ has things the wrong way round. The person, essentially identified with the soul as ‘the seat of personality’, is prior to the ‘parts’—the mind, brain, and body.

It came down to this. C.E.M. Joad considered the creeds of a single, materialist, physical order of reality ‘palpably inadequate’, almost meaningless, in explaining the universe and our place within it. ‘Personality’ seemed the only explanation left.

Fifteen years after Joad’s death, the philosophical theologian Francis Schaeffer’s major work, The God Who is There, was published in the USA. Interestingly, Schaeffer there presents ‘personality’ as his core idea. He writes that we have either ‘personality or a devilish din’. Schaeffer had an enormous influence on American society and religion. Among other things, President Ronald Reagan, thirteen years later, ascribed his election victory to Francis Schaeffer.

Joad’s final, almost forgotten book may have been more important than we suppose—but not only for society and religion. The idea of ‘personality’ as being logically prior to all else might become a critical pre-condition for humanity’s survival in the 21st century.
