Monday, 3 August 2020

Picture Post 57: A Clown's Playground

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Photo credit: Rebecca Tidman
Posted by Tessa den Uyl

Putting on a clown’s nose is a subtle and non-violent gesture of distinction, but distinction from what, exactly? The red ball on the nose instantly un-identifies its wearer, almost as if he had become part of another species. Clowns may be funny and dramatic, stupid and incredibly smart, poetic without prose, offensive, scary, or sweet. Clowns belong to a world that mirrors the exaggeration of our being human.

The image of the clown offers the spectator a space to de-personalise in its turn, and in this psychological game the clown creates its playground. If a clown communicates, this is by touching all the unfolded layers we carry along within ourselves.

Indeed, clowns could very well have become a branch of something like ‘action psychotherapy’, except that clowns are much older than psychotherapy itself. Perhaps this is why many ideas about clowns are misapprehended: the partly negative view of them, or the charge of childishness, belongs not so much to clowns themselves as to how being human has been debased in their former appearances.

Monday, 27 July 2020

Poem: Fragility

By Jeremy Dyer

Shattered Glass Shoots, by Claus Bellers.

Fragility is a foolish thing
I don’t believe you're made of glass
Stop wallowing in your suffering
It’s just a pose, now move your arse.

Your sensitivity is a sham
You’re hard as nails so drop the scam
Just pull yourself together now
You're not a sacred Indian cow.

Fragility is a hard tiara
With metal thorns to make you bleed
I don't want your psycho-drama
Just tell me what the hell you need.

Fragility is here to stay
First blowing up then tearing down
To get the child her selfish way
Bipolar like a circus clown.

Fragility, the role you wear
Spewing out your evil wrath
Mercenary, the cross you bear
Exploiting all who cross your path.

Fragility, the cruellest mask
Deceiving all with poison smile
Killing the ones you take to task
Victimising all the while.

Editors' note: In recent weeks, fragility as a social term has been covered, among others, by The Guardian, The New York Times, and The Atlantic. Where an issue becomes all too familiar, poetry may infuse fresh vigour. 'The function of poetry,' wrote the linguist and literary theorist Roman Jakobson, 'is to point out that the sign is not identical to the referent.'

Monday, 20 July 2020

Miracles: Confirmable, or Chimerical?

Posted by Keith Tidman

Multiplication of the Loaves, by Georges, Mount Athos.
We are often passionately told of claims to experienced miracles, in both the religious and secular worlds. The word ‘miracle’ comes from the Latin mirari, meaning to wonder. But what are these miracles that some people wonder about, and do they happen as told?

Scottish philosopher David Hume, a sceptic on this matter, defined a miracle as ‘a violation of the laws of nature’ — with much else to say on the issue in his An Enquiry Concerning Human Understanding (1748). He proceeded to define the transgression of nature as due to a ‘particular volition of the Deity, or by the interposition of some invisible agent’. Though how much credence might one place in ‘invisible agents’?

Other philosophers, like Denmark’s Søren Kierkegaard in his pseudonymous persona Johannes Climacus, also placed themselves in Hume’s camp on the matter of miracles. Earlier, Dutch philosopher Baruch Spinoza wrote of miracles as events whose source and cause remain unknown to us (Tractatus Theologico-Politicus, 1670). Yet, countless other people around the world, of many religious persuasions, earnestly assert that the appeal to miracles is one of the cornerstones of their faith. Indeed, some three-fourths of survey respondents indicated they believe in miracles, while nearly half said they have personally experienced or seen a miracle (Princeton Survey Research Associates, 2000; Harris poll, 2013).

One line of reasoning as to whether miracles are credible might start with the definition of miracles, such as transgressions of natural events uncontested convincingly by scientists or other specialists. The sufficiency of proof that a miracle really did occur and was not, deus ex machina, just imagined or stemming from a lack of understanding of the laws underlying nature is a very tall order, as surely it should be.

Purported proof would come from people who affirm they witnessed the event, raising questions about witnesses’ reliability and motives. In this regard, it would be required to eliminate obvious delusions, fraud, optical illusions, distortions, and the like. The testimony of witnesses in such matters is, understandably, often suspect. There are demanding conditions regarding definitions and authentication — such as of ‘natural events’, where scientific hypotheses famously, but for good reason, change to conform to new knowledge acquired through disciplined investigation. These conditions lead many people to dismiss the occurrence of miracles as pragmatically untenable, requiring by extension nothing less than a leap of faith.

But a leap of faith suggests that the alleged miracle happened through the interposition of a supernatural power, like a god or other transcendent, creative force of origin. This notion of an original source gives rise, I argue, to various problematic aspects to weigh.

One might wonder, for example, why a god would have created the cosmos to conform to what by all measures is a finely grained set of natural laws regarding cosmic reality, only later to decide, on rare occasion, to intervene. That is, where a god suspends or alters original laws in order to allow miracles. The assumption being that cosmic laws encompass all physical things, forces, and the interactions among them. So, a god choosing not to let select original laws remain in equilibrium, uninterrupted, seems selective — incongruously so, given theistic presumptions about a transcendent power’s omniscience and omnipotence and omniwisdom.

One wonders, thereby, what’s so peculiarly special about humankind to deserve to receive miracles — symbolic gestures, some say. Additionally, one might reasonably ponder why it was necessary for a god to turn to the device of miracles in order for people to extract signals regarding purported divine intent.

One might also wonder, in this theistic context, whether something was wrong with the suspended law to begin with, to necessitate suspension. That is, perhaps it is reasonable to conclude from miracles-based change that some identified law is not, as might have been supposed, inalterably good in all circumstances, for all eternity. Or, instead, maybe nothing was in fact defective in the original natural law, after all, there having been merely an erroneous read of what was really going on and why. Either way, alleged miracles — and the imagined compelling reasons to interfere in the cosmos — come to appear disputable and nebulous.

The presumptive notion of the ‘god of the gaps’ seems tenuously to pertain here, where a god is invoked to fill the gaps in human knowledge — what is not yet known at some point in history — and thus by extension allows miracles to substitute for what reason and confirmable empirical evidence might otherwise and eventually tell us.

As Voltaire further ventured, ‘It is . . . impious to ascribe miracles to God; they would indicate a lack of forethought, or of power, or both’ (Philosophical Dictionary, 1764). Yet, unsurprisingly, contentions like Voltaire’s aren’t definitive as a closing chapter to the accounting. There’s another facet to the discussion that we need to get at — a nonreligious aspect.

In a secular setting, the list of problematic considerations regarding miracles doesn’t grow easier to resolve. The challenges remain knotty. A reasonable assumption, in this irreligious context, is that the cosmos was not created by a god, but rather was self-caused (causa sui). In this model, there were no ‘prior’ events pointing to the cosmos’s lineage. A cosmos that possesses integrally within itself a complete explanation for its existence. Or, a cosmos that has no beginning — a boundless construct having existed infinitely.

One might wonder whether a cosmos’s existence is the default, stemming from the cosmological contention that ‘nothingness’ cannot exist, implying no beginning or end. One might further ponder how such a cosmos — in the absence of a transcendent force powerful enough to tinker with it — might temporarily suspend or alter a natural law in order to accommodate the appearance of a happening identifiable as a miracle. I propose there would be no mechanism to cause such an alteration to the cosmic fabric to happen. On those bases, it may seem there’s no logical reason for (no possibility of) miracles. Indeed, the scientific method does itself call for further examining what may have been considered a natural law whenever there are repeated exceptions or contradictions to it, rather than assuming that a miracle is recurring.

Hume proclaimed that ‘no testimony is sufficient to establish a miracle’; centuries earlier, Augustine of Hippo articulated a third, and broader, take on the subject. He pointedly asked, ‘Is not the universe itself a miracle?’ (The City of God, 426 AD). Here, one might reasonably interpret ‘a miracle’ as synonymous with a less emotionally charged superlative like ‘remarkable’. I suspect most of us agree that our vast, roiling cosmos is indeed a marvel, though debatably not one necessitating an originating spiritual framework like Augustine’s.

No matter how supposed miracles are perceived, internalised, and retold, the critical issue of what can or cannot be confirmed dovetails to an assessment of the ‘knowledge’ in hand: what one knows, how one knows it, and with what level of certainty one knows it. So much of reality boils down to probabilities as the measuring stick; the evidence for miracles is no exception. If we’re left with only gossamer-thin substantiation, or no truly credible substantiation, or no realistically potential path to substantiation — which appears the case — claims of miracles may, I offer, be dismissed as improbable or even phantasmal.
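In the spirit of that measuring stick, Hume’s point about testimony is sometimes put in Bayesian terms. The following is an illustrative sketch only, not an argument the essay itself makes; the function and every number in it are invented for the purpose:

```python
# Bayesian sketch of testimony about a miracle (all figures hypothetical).
def posterior(prior: float, p_report_if_true: float, p_report_if_false: float) -> float:
    """P(miracle | testimony) by Bayes' theorem."""
    numerator = p_report_if_true * prior
    denominator = numerator + p_report_if_false * (1 - prior)
    return numerator / denominator

# A very reliable witness (99% hit rate, 1% false-report rate)
# testifying to an event given a one-in-a-billion prior:
p = posterior(1e-9, 0.99, 0.01)
print(f"{p:.2e}")  # on these numbers, just under one in ten million
```

Even a witness who is right ninety-nine times in a hundred leaves the posterior probability of the miracle vanishingly small, because the prior is so small; on this arithmetic, the testimony is more likely mistaken than the law of nature suspended.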

Monday, 13 July 2020

Staring Statistics in the Face

By Thomas Scarborough

George W. Buck’s dictum has it, ‘Statistics don’t lie.’ Yet the present pandemic should give us pause. The statistics have been grossly at variance with one another.

According to a paper in The Lancet, statistics ‘in the initial period’ estimated a case fatality rate or CFR of 15%. Then, on 3 March, the World Health Organisation announced, ‘Globally, about 3.4% of reported COVID-19 cases have died.’ By 16 June, however, an epidemiologist was quoted in Nature, ‘Studies ... are tending to converge around 0.5–1%’ (now estimating the infection fatality rate, or IFR).

Indeed it is not as simple as all this—but the purpose here is not to side with any particular figures. The purpose is to ask how our statistics could be so wrong. Wrong, rather than, shall we say, slanted. Statistical errors have been of such a magnitude as is hard to believe. A two-fold error should be an enormity, let alone ten-fold, or twenty-fold, or more.

The statistics, in turn, have had major consequences. The Lancet rightly observes, ‘Hard outcomes such as the CFR have a crucial part in forming strategies at national and international levels.’ This was borne out in March, when the World Health Organisation added to its announcement of a 3.4% CFR, ‘It can be contained—which is why we must do everything we can to contain it’. And so we did. At that point, human activity across the globe—sometimes vital human activity—came to a halt.

Over the months, the figures have been adjusted, updated, modified, revised, corrected, and in some cases, deleted. We are at risk of forgetting now. The discrepancies over time could easily slip our attention, where we should be staring them in the face.

The statistical errors are a philosophical problem. Cambridge philosopher Simon Blackburn points out two problems with regard to fact. Fact, he writes, 'may itself involve value judgements, as may the selection of particular facts as the essential ones'. The first of these problems is fairly obvious. For example, ‘Beethoven is overrated’ might seem at first to represent a statement of fact, where it really does not. The second problem is critical. We select facts, yet do so on a doubtful basis.

Facts do not exist in isolation. We typically insert them into equations, algorithms, models (and so on). In fact, we need to form an opinion about the relevance of the facts before we even seek them out—learning algorithms not excepted. In the case of the present pandemic, we began with deaths ÷ cases × 100 = CFR. We may reduce this to the equation a ÷ b × 100 = c. Yet notice now that we have selected variables a, b, and c, to the exclusion of all others. Say, x, y, or z.
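The point can be made concrete by running the equation a ÷ b × 100 = c with hypothetical figures (the function name and all numbers below are invented for illustration): the same death count yields a ten-fold different headline figure depending on which denominator, b, we select.

```python
# Minimal sketch: how the choice of denominator changes the headline figure.
# All numbers are hypothetical, for illustration only.

def fatality_rate(deaths: int, denominator: int) -> float:
    """The generic a / b * 100 = c from the text."""
    return deaths / denominator * 100

deaths = 50
confirmed_cases = 1_000        # b for the CFR: detected cases only
estimated_infections = 10_000  # b for the IFR: mild/asymptomatic included

cfr = fatality_rate(deaths, confirmed_cases)
ifr = fatality_rate(deaths, estimated_infections)
print(f"CFR = {cfr:.1f}%  IFR = {ifr:.1f}%")  # same deaths, ten-fold apart
```

Nothing in the arithmetic is wrong in either case; the discrepancy comes entirely from which facts were selected as the denominator.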

What then gave us the authority to select a, b, and c? In fact, before we make any such selection, we need to 'scope the system'. We need to demarcate our enterprise, or we shall easily lose control of it. One cannot introduce any and every variable into the mix. Again, in the words of Simon Blackburn, it is the ‘essential’ facts we need. This in fact requires wisdom—a wisdom we cannot do without. In the words of the statistician William Briggs, we need ‘slow, maturing thought’.

Swiss Policy Research comments on the early phase of the pandemic, ‘Many people with only mild or no symptoms were not taken into account.’ This goes to the selection of facts, and reveals why statistics may be so deceptive. They are facts, indeed, but they are selected facts. For this reason, we have witnessed a sequence of events over recent months, something like this:
At first we focused on the case fatality rate or CFR
Then we took the infection fatality rate into account, or IFR
Then we took social values into account (which led to some crisis of thought)
Now we take non-viral fatalities into account (which begins to look catastrophic)
This is too simple, yet it illustrates the point. Statistics require the wisdom to tell how we should delineate relevance. Statistics do not select themselves. Subjective humans do it. In fact, I would contend that the selection of facts in the case of the pandemic was largely subconscious and cultural. It stands to reason that, if we have dominant social values, these will tend to come first in our selection process.

In our early response to the pandemic, we quickly developed a mindset—a mental inertia which prevented us from following the most productive steps and the most adaptive reasoning. Every tragic death reinforced this mindset, and distracted us. Time will tell, but today we generally project that far more people will die through our response to the pandemic than died from the pandemic itself—let alone the suffering.

The biggest lesson we should be taking away from it is that we humans are not rational. Knowledge, wrote Confucius, is to know both what one knows, and what one does not know. We do not know how to handle statistics.

Monday, 6 July 2020

Picture Post 56: Fate on the Verge of Extinction

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl
Photo credit: African shared pictures. Cameroon.

The woman in white, called ‘the female pastor’, cures a woman affected with COVID-19. What is interesting in the picture is the physical approach this female pastor takes towards a contagious disease. Noteworthy, too, is the posture of the patient, who surrenders completely to this kind of aid.

Superstition. Can it or can it not cure?

When we dive into other cultures, we should be careful in responding to this question. In the case of this specific picture, we are talking about a place where the native language itself is in the throes of extinction. And with a language that is only spoken, not written, the population of such an ethnic group becomes extremely vulnerable to misinformation.

Suppose you have grown up believing in magic, and regular medicine has never reached your habitat, beyond perhaps an aspirin. To reach out for what your people have always known is not stupid; it is simply obvious. Less apparent is the exploitation of the superstition of minority groups for personal benefit in a context of capitalism and mass urbanisation. And the two often go together!

To confront the nature of a virus like COVID-19 with a blow in the face is not taking care of ‘your flock’; rather, it trades upon very old traditions that cannot endure the loss of the mind as a mystical labyrinth, in favour of the power of the human mind alone to find a cure.

Inherently, this picture questions where the idea of destiny, which is characteristic of superstition, is going to stand in a globalising world.

Monday, 29 June 2020

The Afterlife: What Do We Imagine?

Posted by Keith Tidman

‘The real question of life after death isn’t whether 
or not it exists, but even if it does, what 
problem this really solves’

— Wittgenstein, Tractatus Logico-Philosophicus, 1921

Our mortality, and how we might transcend it, has been one of humanity’s central preoccupations since prehistory. One much-pondered possibility is that of an afterlife. This would potentially serve a variety of purposes: to buttress fraught quests for life’s meaning and purpose; to dull unpleasant visions of what happens to us physically upon death; to replace fear of the void of nothingness with hope and expectation; and, to the point here, to claim continuity of existence through a mysterious hereafter thought to defy and supplant corporeal mortality.

And so, the afterlife, in one form or another, has continued to garner considerable support to the present. An Ipsos/Reuters poll in 2011 of the populations of twenty-three countries found that a little over half believe in an afterlife, with a wide range of outcomes correlated with how faith-based or secular a country is considered. The Pew Research Center’s Religious Landscape Study polling found, in 2014, that almost three-fourths of people seem to believe in heaven and more than half said that they believed in hell. The findings cut across most religions. Separately, research has found that some one-third of atheists and agnostics believe in an afterlife — one imagined to include ‘some sort of conscious existence’, as the survey put it. (This was the Austin Institute for the Study of Family and Culture, 2014.)

Other research has corroborated these survey results. Researchers based at Britain's Oxford University in 2011 examined forty related studies conducted over the course of three years by a range of social-science and other specialists (including anthropologists, psychologists, philosophers, and theologians) in twenty countries and different cultures. The studies revealed an instinctive predisposition among people to an afterlife — whether of a soul or a spirit or just an aspect of the mind that continues after bodily death.

My aim here is not to exhaustively review all possible variants of an afterlife subscribed to around the world, like reincarnation — an impracticality for the essay. However, many beliefs in a spiritual afterlife, or continuation of consciousness, point to the concept of dualism, entailing a separation of mind and body. As René Descartes explained back in the 17th century:
‘There is a great difference between the mind and the body, inasmuch as the body is by its very nature always divisible, whereas the mind is clearly indivisible. For when I consider the mind, or myself insofar as I am only a thinking thing, I cannot distinguish any parts within myself. . . . By contrast, there is no corporeal or extended thing that I can think of which in my thought I cannot easily divide into parts. . . . This one argument would be enough to show me that the mind is completely different than the body’ (Sixth Meditation, 1641).
However, in the context of modern research, I believe that one may reasonably ask the following: Are the mind and body really two completely different things? Or are the mind and the body indistinct — the mind reducible to the brain, where the brain and mind are integral, inseparable, and necessitating each other? Mounting evidence points to consciousness and the mind as the product of neurophysiological activity. As to what’s going on when people think and experience, many neuroscientists favour the notion that the mind — consciousness and thought — is entirely reducible to brain activity, a concept sometimes variously referred to as physicalism, materialism, or monism. But the idea is that, in short, for every ‘mind state’ there is a corresponding ‘brain state’, a theory for which evidence is growing apace.

The mind and brain are today often considered, therefore, not separate substances. They are viewed as functionally indistinguishable parts of the whole. There seems, consequently, not to be broad conviction in mind-body dualism. Contrary to Cartesian dualism, the brain, from which thought comes, is physically divisible according to hemispheres, regions, and lobes — the brain’s architecture; by extension, the mind is likewise divisible — the mind’s architecture. What happens to the brain physically (from medical or other tangible influences) affects the mind. Consciousness arises from the entirety of the brain. A brain — a consciousness — that remarkably is conscious of itself, demonstrably curious and driven to contemplate its origins, its future, its purpose, and its place in the universe.

The contemporary American neuroscientist, Michael Gazzaniga, has described the dynamics of such consciousness in this manner:
‘It is as if our mind is a bubbling pot of water. . . . The top bubble ultimately bursts into an idea, only to be replaced by more bubbles. The surface is forever energized with activity, endless activity, until the bubbles go to sleep. The arrow of time stitches it all together as each bubble comes up for its moment. Consider that maybe consciousness can be understood only as the brain’s bubbles, each with its own hardware to close the gap, getting its moment’. (The Consciousness Instinct, 2018)
Moreover, an immaterial mind and a material world (such as the brain in the body), as dualism typically frames reality, would be incapable of acting upon each other: what’s been dubbed the ‘interaction problem’. Therefore the physicalist model — strengthened by research in fields like neurophysiology, which continues to yield ever-deeper knowledge — has, arguably, superseded the dualist model.

People’s understanding that, of course, they will die one day has spurred the search for a spiritual continuation of earthbound life. Apprehension motivates. The yearning for purpose motivates. People have thus sought evidence, empirical or faith-based or other, to underprop their hope for otherworldly survival. However, the modern account of the material, naturalistic basis of the mind may prove an injurious blow to notions of an out-of-body afterlife. After all, if we are our bodies and our bodies are us, death must end hope for survival of the mind. As David Hume graphically described our circumstances in Of the Immortality of the Soul (1755), our ‘common dissolution in death’. That some people are nonetheless prone to evoke dualistic spectral spirits — stretching from disembodied consciousness to immortal souls — as a pretext for wishfully thwarting the interruption of life does not change the finality of existence.

And so, my conclusion is that perhaps we’d be better served to find ingredients for an ‘afterlife’ in what we leave by way of influences, however ordinary and humble, upon others’ welfare. That is, a legacy recollected by those who live on beyond us, in its ideal a benevolent stamp upon the present and the future. This earthbound, palpable notion of what survives us goes to answer Wittgenstein’s challenge we started with, regarding ‘what problem’ an afterlife ‘solves’, for in this sense it solves the riddle of what, realistically, anyone might hope for.

Monday, 22 June 2020

Hope Against Hope

Thomas Scarborough. After the Veldfire.
By Thomas Scarborough
There are better things to look forward to.  That is what hope is about.  I hope to be happy.  I hope to be well.  I hope to succeed.  Even through struggle and strife, I hope for it all to be worthwhile.  The philosopher Immanuel Kant put it simply, ‘All hope concerns happiness.’ 
But wait, said the ancient Greek philosophers.  On what does one base such hope?  Hope is 'empty', wrote Solon. ‘Mindless’, wrote Plato.  Then the Roman philosopher Seneca saw the dark side, which has cast a shadow over hope ever since.  Hope and fear, he wrote, ‘march in unison like a prisoner and the escort he is handcuffed to. Fear keeps pace with hope.’

The standard account of hope is this: the object of hope must be uncertain, and a person must wish for it—and here is the trouble with hope.  There is not much about hope that is rational.  We have no sound reason to believe it is justified.  It is clear that one’s hopes may not come true.

Why then hope?  Even when hopes are fulfilled—if they are fulfilled—the journey often involves struggle, and heartache, and not a little luck.  And when I have been through all that, I may well have to go through it all again.  Another goal, another relationship. How often?  At what cost?  Often enough, our hopes, once realised, may still disappoint.  They so often leave us with less to hope for than we had before.

There is a psychological problem, too.  It is called the ‘problem of action’.  Today few disagree that, most basically, I am motivated to act when I hold up the world in my mind to the world itself, and there discover a disjoint between the two.  To put it another way, we are motivated by mental models.

Yet the opposite is true, too.  Just as a disjoint between expectation and reality motivates me, so a lack of such disjoint demotivates me.  It may potentially remove any motivation at all.  We cannot go on with a view of the world which is born of the world itself.

There is a hope, observed the philosopher Roe Fremstedal, which occurs spontaneously in youth, yet is often disappointed in time.  Many start out in life with high hopes, pleasant dreams, and enthusiasm to spare.  But as we progress through life, disillusionment sets in.  And disillusionment, presumably, means coming to see things for what they are.  The disjoint is lost.

And then, death. What kind of hope can overcome death?  Death destroys everything.  An anonymous poet wrote,
Nothing remains but decline,
Nothing but age and decay.
Someone might object.  ‘This is seeing the glass half empty.  Why not see it half full?’  But put it like this.  There is certainly no greater reason to hope than there is to fear or despair.

Is there hope for me?  Is there hope for my environment?  For society?  History?  The universe?  I side with the ancient Greeks.  They had the courage to tell it like it is.  Hope as we generally know it is mere deception and superstition.  ‘Hope,’ wrote Nietzsche, ‘is the worst of all evils because it prolongs the torments of man.’

When I was at school, we sang a song.  To schoolboys at the time, it seemed like a statement of boundless optimism and cheer.  Titled ‘The Impossible Dream’, it came from a Broadway musical of 1965—and it closes with these words:
Yes, and I'll reach
The unreachable star!
It seems hard to tell now whether the songwriter was sincere.  Some say that the striving which the words represent is more important than the words themselves.  Some say the songwriter was characterising his starry-eyed younger self.  More likely, it seems, he was railing against a contradictory universe, in a nonsensical song.

People have tried in various ways to get around the problems of hope.  We should best project our hopes onto something else, they say: society, history, eternity.  Some have said that hope just happens—so let it happen.  Some have said that we should quell our hopes—which might work if our minds did not transcend time.  Lately, hope tends to be studied as a mere phenomenon: this is how we define it; this is what it does.

The only way to hope in this life, wrote the Danish philosopher Søren Kierkegaard, is to ‘relate oneself expectantly to the possibility of the good’.  In fact, ‘at every moment always,’ he wrote, ‘one should hope all things’.  We hope, because there are all good things to look forward to, always.*

If this is to be true, there is one necessary condition.  All of our present actions, and all events, must serve our good and happiness.  Even our greatest disappointments, our greatest causes for despair—even death itself—must be interpreted as hope and be grounded in hope.  True hope cannot be conditional, as the Greeks rightly saw.

What guarantees such hope?  The theologian Stephen Travis wrote, ‘To hope means to look forward expectantly for God’s future activity’.  This de-objectifies hope—it relativises it, because God's activity cannot be known—and it provides for the translation of fear and despair into hope.  Yet even without bringing God into it, there would have to be something that translates fear and despair.  The only challenge that remains is to identify it and appropriate it.

Whatever comes my way—everything that comes my way—is something to be hoped for, not because I hope according to the standard account, but because I have an unconditional hope.  We call it ‘hope against hope’.

* Note, however, that there is a more existential possibility. If I have an unconditional hope which is, as it were, already fulfilled in the present—the present already representing 'all good things'—then I may expect the same of the future.  This overcomes the notion that hope is too future-orientated.

Monday, 15 June 2020

Joad’s Concept of Personality

Posted by Richard W. Symonds
There is a small group of significant philosophers who had extraordinary turnarounds. The most famous of these is Ludwig Wittgenstein, who wrote about his magnum opus, ‘The author of the Tractatus was mistaken.’ So, too, A.J. Ayer who, in an interview with the BBC, said of his former philosophy, ‘At the end of it all it was false’. Yet perhaps the most extraordinary turnaround was the enormously popular C.E.M. Joad.
Cyril Edwin Mitchinson Joad (1891-1953) was a university philosopher at Birkbeck College London, who wrote on a wide variety of philosophical subjects, both historical and contemporary. For most of his life he rejected religion—but in the 1940s and early 1950s he first abandoned atheism, then accepted a form of theism, and finally converted to Christianity.

Not until Recovery of Belief, in 1952, did he set out the Christian philosophy in which he had come to believe. This post explores just one aspect of that philosophy, namely his theory of personality and the soul—then briefly, what motivated him philosophically, to make such a radical about-turn. Here is Joad’s later view, in his own words:
‘Having considered and rejected a number of views as to the nature and interpretation of the cosmos, I shall state the one which seems to me to be open to the fewest objections. It is, briefly, what I take to be the traditional Christian view, namely, that the universe is to be conceived as two orders of reality, the natural order, consisting of people and things moving about in space and enduring in time, and a supernatural order neither in space nor in time, which consists of a Creative Person or Trinity of Persons from which the natural order derives its meaning, and in terms of which it receives its explanation.’
In his ‘interpretation of the cosmos’, then, Joad proceeds by seeking to vindicate ‘the traditional division of the human being [as] not twofold into mind and body, but threefold into mind, body and soul.’ The reference seems to be to the view identifiable in late-Scholastic theology, that a human being has an immortal part which can sin, be forgiven, and rise at the Last Judgement (the soul); a thinking part which can understand, affirm, deny, desire, imagine (the mind); and a body which is the agent of the mind and soul.

In fairness, Joad does not claim to demonstrate the validity of the threefold analysis; he claims no more than that ‘if it were true it would cover a number of facts which seem to be inexplicable on any other’. He offers it as what we might term an inference to the best explanation. He found no better way to explain the cosmos as he found it.

The soul, Joad tells us, is ‘the essential self and is timeless’. It is incarnated in bodies but can exist without them, since after our bodily death, it remains an individual entity and ‘sustains immortality’. At this point, the influence of Plato’s theory of the soul in the Phaedo is clear. Unplatonic, however, is the notion that the soul is ‘normally inaccessible to us’, and that we at least approximate to an awareness of it in ‘mystical experience’—experience with which ‘most of us, at any rate, are acquainted [in] certain moments of transport of tranquillity that we enjoy in our intercourse with nature’.

Yet Joad’s theory does not rely solely on mystical experience. There are those, he writes, to whom mystical experience is denied. Thus he posits the soul as our ‘point of contact and communication with the divine ... God, to use the language of religion, influences man through his soul’.

Joad suggests that ‘The phenomena of spiritual healing and spiritual regeneration are ... most plausibly to be explained on the assumption that God, in response to prayer, acts upon us through the soul to heal the body and strengthen the mind. The soul is also the “still small voice of God” of which we are conscious when the hubbub of ordinary life and consciousness dies down’. This presupposes the existence of God, and of a God who acts in these ways.

Of the mind, Joad tells us that it ‘is brought into being in consequence of the contact of the soul with the natural, temporal order, which results from its incorporation in a physical body’. The mind cannot be identified with matter, as Locke’s ‘thinking substance’, for instance. Mind ‘cannot be adequately conceived in material terms ... Is the notion of conscious matter really thinkable?’ Joad asks rhetorically and in protest against Julian Huxley.

Yet Joad concedes that ‘The mind is, it is clear, constantly interacting with the body and the brain.’ Again, it is not Joad’s purpose to demonstrate the validity of his analysis. In fact, he states that this is a paradoxical occurrence which ‘is, by us, incomprehensible’. This incomprehensibility, further, he sees as being characteristic of what he calls ‘all the manifestations of the supernatural in the natural order’; the supernatural here being the soul—with the mind and the natural being the brain and the body.

There is, however, a crucial concept which subsumes the categories of body, mind, and soul. This is ‘personality’, which Joad describes as being ‘logically prior’ to the soul, mind, and body as the three elements of our being. He introduces us to this concept by considering the relation of a sonata to its notes, and of a nation or society to its members (alongside a more thorough discussion of mereology).

While Joad does not define logical priority, the basic idea is that the soul (to borrow a phrase from C.D. Broad) is ‘an existent substantive’ which temporarily ‘owns’ or is characterised by the mind, the brain, and the body. Hence any idea that the person is a composite, ‘resulting from the concurrence of a number of parts’ has things the wrong way round. The person, essentially identified with the soul as ‘the seat of personality’, is prior to the ‘parts’—the mind, brain, and body.

It came down to this. C.E.M. Joad considered the creeds of a single, materialist, physical order of reality ‘palpably inadequate’, almost meaningless, in explaining the universe and our place within it. ‘Personality’ seemed the only explanation left.

Fifteen years after Joad’s death, the philosophical theologian Francis Schaeffer’s major work, The God Who Is There, was published in the USA. Interestingly, Schaeffer there presents ‘personality’ as his core idea. He writes that we have either ‘personality or a devilish din’. Schaeffer had an enormous influence on American society and religion. Among other things, President Ronald Reagan, thirteen years later, ascribed his election victory to Francis Schaeffer.

Joad’s final, almost forgotten book may have been more important than we suppose—but not only for society and religion. The idea of ‘personality’ as being logically prior to all else might become a critical pre-condition for humanity’s survival in the 21st century.

Monday, 8 June 2020

Rage and Retribution

By Seth Stancroff
What do emotions have to do with justice? A lot, it seems, when we survey the events of recent weeks in the USA. Here I call upon the so-called ‘moral sentimentalists’, who argue that emotions play a leading role in our determinations of what is morally right and wrong, many of whom believe that emotions are the primary source of moral knowledge.
It seems to me that moral sentimentalism has much to say when it comes to strong emotional responses to issues of injustice and criminal punishment. These responses, when viewed through the sentimentalist lens, might change the ways we view theories of just punishment.

Indeed, I would argue that emotional reactions to issues of injustice, and the sentimentalist analysis of these reactions, should indeed influence the ways we think about punishment and moral justifications for it. Specifically, the sentimentalist view might suggest that retribution (as opposed, for example, to deterrence, rehabilitation, or incapacitation) is well-suited to honour the feelings of those harmed by injustice.

In other words, while retributive justice is often criticised as being uncivilised and vindictive, retribution is perhaps uniquely able to acknowledge the pain and suffering that arises from injustice.

Consider the recent cases of Ahmaud Arbery, an unarmed 25-year-old black man who was shot and killed by Gregory and Travis McMichael, a former police officer and his son (both of whom are white), on 23 February 2020, and George Floyd, an unarmed 46-year-old black man who was murdered by a white police officer, Derek Chauvin, on 25 May 2020.

These incidents have come to serve as reminders of the violent racism that persists in the United States. Floyd’s case, in particular, illustrates the deep-seated racism that plagues police officers and informs policing practices. Arbery’s is reminiscent of the horrifying and relatively recent period in U.S. history when extralegal killings of black people by white vigilantes were common.

Both of these tragedies have rightly sparked disgust and outrage. Those protesting Arbery’s murder gathered holding signs stating, ‘We will get justice.’ Arbery’s mother said, ‘I want all hands involved in my son’s murder to be prosecuted to the highest … my son died, so they should die as well.’ Floyd’s murder has motivated widespread protests in cities around the world, with activists demanding justice and proclaiming, ‘No justice, no peace.’

These incidents—as well as many other cases in which black individuals have been killed by police or other white offenders—suggest that often, our first instincts are not to turn to deterrence, rehabilitation, or some other conception of punishment. Anthony Walsh and Virginia Hatch, in an article for the New Criminal Law Review in 2018 entitled, ‘Capital Punishment, Retribution, and Emotion: An Evolutionary Perspective,’ capture this well:
‘A retributive punishment justification is the only justification associated with deep emotions related to social concern. When people hear of some vicious criminal act, they become angry, outraged, and disgusted, and their first inclination is to want to exact some sort of retribution; it is highly unlikely that their first thoughts should be of deterrence or rehabilitation.’
The murders of Ahmaud Arbery and George Floyd highlight two features of emotional responses to injustice and the retributive urge:

1. When people hear about these acts of injustice, the kinds of punishments they seek for the offenders are indeed retributive. Impassioned calls such as ‘Justice for Floyd’ and ‘My son died, so they should die as well’, while perhaps understandable, do not imply an appeal to deterrence, and certainly not rehabilitation. These statements suggest that those who committed such crimes should be punished as a result of their injustices. They should be subjected to some harm because of the harms they caused.

2. The kinds of punishments for which many ask hinge heavily on the notion of desert, or the extent to which the offenders are deserving of punishment. Of course, retribution is the theory of punishment most concerned with desert. Deterrence, rehabilitation, and incapacitation are not the first ideas that come to mind in cases like these. Instead, many imagine that Derek Chauvin, as well as those who murdered Ahmaud Arbery, deserve to be punished.

All of this is to say that, although certain criticisms of retribution may be warranted, it is important to recognise that the theory occupies an important space within societies’ sensibilities and moral intuitions surrounding justice and punishment. If it is indeed the case, as the moral sentimentalists argue, that morality and emotions are closely tied, then emotional responses to injustice, and the retributive urges that accompany them, should not be deemed morally irrelevant.

While state-sanctioned punishment certainly should not be motivated by rage and vindictiveness, it is important to see that, in some cases, retributive urges will be strong and understandable. Although there are other theories that take more utilitarian and dispassionate approaches to punishment, it seems that they may not explicitly acknowledge the suffering caused by acts of injustice. Retribution, at the very least, honours this kind of pain.

Monday, 1 June 2020

Picture Post 55: Making Assumptions

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Martin Cohen

A Twitter friend of mine posted this image with the comment: “Believe in your limitations”.

I wasn’t sure what to make of that, but he explained that he was being “quietly subversive”, which I took to mean gently mocking religious iconography. That aside, though, I think the image does illustrate something important about the way our minds process images. The traffic lights are not, in fact, the Buddha's eyes, yet the impression that they might be is so compelling that it makes us re-evaluate the Buddha himself.

I say ‘himself’, as Buddhas are traditionally male; indeed, in some cultures, being female is formally an inferior state and an obstacle to following the Buddhist philosophy. Of course, being male or female might not actually have any implications for the ability to transcend this world and reach ‘nirvana’ – yet for centuries such quick assumptions have prevailed.

Which brings me back to this image, because it illustrates nicely the way that we link things that in reality are completely unconnected, when they fit a strongly preconceived ‘pattern’. Such assumptions are not necessarily good or bad. But perhaps we should be on guard against them. Which maybe fits with my friend’s cryptic comment after all.

Monday, 25 May 2020

Trading Lives Without Anger

Heinrich Hoerle, 1930, Monument to the Unknown Prosthesis
 Posted by Allister Marran
The COVID-19 crisis has brought into sharp focus modern man’s ideological belief that he has mastered science and medicine, and has so defeated—or at least delayed—the intrusions of the Grim Reaper.  Our misplaced belief that medical science can cure any ailment means we want to try to save everyone—and when we cannot, there is dismay and fury.
Centuries of loud, proud pronouncements from researchers, scientists, and the medical community, of sound progress being made in the battle against age-old enemies like cancer, malaria, tuberculosis, and innumerable mortal ailments, have lulled us into a false sense of security—a perception of invulnerability and, ultimately, immortality.

What happens, then, when death becomes an inevitable choice?  What if every choice set before us leads to death in any event?

Whilst the achievements of medical science cannot be overstated, and are undoubtedly impressive, our somewhat conceited overestimation of our ability to stave off death indefinitely has led us to a crossroads today which opens up the social, spiritual, and philosophical question of where to draw the line, who to try to save, and at what cost—if death is indeed inevitable.

At logical extremes, there are two distinct, divergent—apparently incompatible—viewpoints that could be held and debated. In the context of the coronavirus, or COVID-19:
Firstly, that we should lockdown indefinitely, or until a treatment or vaccine is found, saving every life we can at any cost.

Alternatively, when the cost becomes too high, to start trading the lives of the old and the sick for that of the starving young and poor.
There have of course been many pandemics, and COVID-19 is just the latest contagion in a long line of similar illnesses that have ripped through the human population over the last hundred or so years, starting with the Spanish Flu in 1918, and continuously assaulting us before retreating and coming back again in different forms.

There is a difference this time, however.  The connected world and social media have allowed us to track the progress of the disease and monitor its devastation, and the real-time outrage has been swift, palpable, and highly publicised.

A minister who has presided over countless funerals told me recently that there has been a perceptible change in the emotions expressed when family and friends come together to bury loved ones.  The old markers of grief and the grieving process are replaced with anger and fury today. 

But our fury has no object; it is just the way things work.  There must be a middle road—to save whom you can, but allow those whose time has come to leave.  A realisation, and a philosophical embracing of the fact, that our time on earth is finite, which in turn adds value to the little time we do have.  To say goodbye without anger or pain or fury, because, after all, shouldn’t your last memory of a departed one be one of love, not hate?

Monday, 18 May 2020

The Sweet Fruits of Authenticity

 Posted by Lina Scarborough
Think of the most authentic version of yourself. What do you see? Perhaps someone who has more than what you currently do—more skills, more money, more possessions—like a house and a car. Or perhaps more time and will power to pursue passions, or better relationships and fitness levels.
In other words, it would be a ‘sage’ ideal of oneself, travelling forward to a point in time when one has acquired success and achieved milestones.

But if I think of the most authentic version of myself, I am a young child again. A girl, around 8 or 10 years old, running through the forest park near home, gathering berries with my mother. The thorns pricked so very badly, but I was stubborn enough to throw myself into the bushes anyway. I had enough experience with scratches from our cat to treat them nonchalantly.

As a child, I never knew exactly what I wanted to become. I had fleeting ideas—a kindergarten teacher (out of a learned fear of any maths more complex than numeracy). A butterfly scientist (quickly shot down by mother dearest, a no-nonsense Russian gastroenterologist). The closest I had to a ‘dream’ was to travel the world, trying every ice-cream flavour on the planet, including funky ones like squid ink (yes, it actually exists; it’s called Ikasumi ice-cream). But as life would have it, I developed adolescent-onset lactose intolerance.

All I knew was that I wanted to become an adventurer. I wanted to be like the cool heroes I saw in video games and animations, pirates and space cowboys, princesses with magic powers with a loyal band of friends—and importantly, a sense of mission. I pictured myself as a female version of Indiana Jones, or a kind of Lara Croft, but I was born too late. It seemed like everything cool and exciting had been explored! And no one would pay me to live a life like Indiana Jones!

Of course, that’s not true. At least the first part isn’t. The discoveries and environments of adventure have simply changed shape. Instead of conquering the wilds of Peru, the modern-day explorers pave the way in nanotechnology, AI, and all kinds of other scientific adventures.

Alas. I am no nano-butterfly brain surgeon. That world is not for me. Thus, the closest you’d get to the old-fashioned type of adventurer would be an astronaut or a game ranger, but these too are exceptional, rather than easily attainable paths.

Apart from that, all kinds of life factors impact the reasons behind our choices. Do we choose our jobs, our partners, even mundane things like the clothes we wear or the music we listen to, out of authenticity to ourselves, or out of societal, family, or peer pressure? Of course, for most people, their motivations will fall somewhere on a spectrum between authentic and forced choices.

The next question, though, is whether or not authenticity is a good thing. Consider: if Hitler’s authentic version of an ideal world was one of mass genocide, is that a good thing? Obviously not.

This is one of the reasons I wish my peers—millennials and younger—would not place such great value on the individual—or, even worse, the individual’s fleeting feelings. Too little authenticity, and we may very well land up in a cult-like dictatorship; too much, and we lose a great part of what it means to be human. No man is an island entire of itself: an overused, but wise, quote. To be entirely authentic is to discredit how those we love and fear, admire and detest, have shaped and can shape us. It discredits the bond that is formed along the way.

To idealise authenticity or one’s own feelings is to firmly isolate oneself inside one’s head and tape one’s eyes and heart shut. We will always need a guiding entity to determine whether our motivations—sincere as they might be—are actually good and truthful to goodness or not.

I used to be exceptionally good at dance and gymnastics. I was the lead in several school productions. I had forgotten this. After my mother’s passing, I dropped all forms of artistic expression. I stopped dancing and writing and playing piano for almost 10 years, and pursued a career I rationalised was safest for my economic well-being—but not one that was aligned to authentic self—to the curious, ever-exploring and gleeful child in me.

Perhaps that is why God had me fall in love with a man like my husband. Someone who firmly believes that following one’s passion will, eventually, lead to financially stable means. Eight years ago, he perchance opened a book on elephants and mammoths. Today, he’s awaiting the response to his PhD submission on the dwarf elephants of Sicily.

As the decade-anniversary of my mother’s passing nears, God rest her soul, I reflect on how I came to be where I am today, and where I, at the tender age of a quarter-century, am headed. My choices have often not been authentic, mostly out of fear of failure and hardship. Both I have received nonetheless, in various portions; I might as well be as brave as the child I was, unafraid of the thorns in pursuit of the sweet, sweet berries of truth and earnest passion.

Monday, 11 May 2020

Zen ‘Koans’: What Is the Sound of One Hand?

Busy Busy Beggar (Aizu Museum, Waseda University)
Posted by Keith Tidman
‘Two hands clap and there is a sound; what is the sound of one hand?’
The puzzle above long ago entered popular culture, and is familiar to many: The question’s origins date back to one of the most influential Zen Buddhists, Hakuin Ekaku, whose life straddled the 17th and 18th centuries.

The Zen name for such a puzzle is koan — a paradoxical anecdote, dialogue, or question. The idea is that koans permit thinking to escape the bounds of rationality and instead embrace intuition-like ways to awaken enlightenment and arouse spiritual development. It’s a realm where logical reasoning is shown to be inadequate, and is to be suspended. By pondering the mystery of koans, contemplative monks absorb Buddhist teachings — letting go of the strictly analytic approach to understanding, and instead learning to accept ambiguity, paradox, and the absence of just one truth.

There are more than 1,700 classical koans, amassed over many centuries in China, Japan, and elsewhere (Thomas Cleary, Secrets of the Blue Cliff Record, 2002). Each one is a meditative device aimed at prompting the deep awareness that comes only from an open, freed-up mind. The interpretations of koans are often not obvious or clear-cut, their ambiguity making multiple alternative insights possible. In turn, these insights might lead to additional questions, inviting further reflection.

As far as the pursuit of open-mindedness and intuition goes, the following aphorism was offered in the Diamond Sutra:
‘Out of nowhere, the mind comes forth’. 
The observation originated in a 9th-century Sanskrit document, translated into Chinese, which was among thousands of scrolls hidden in ‘The Cave of a Thousand Buddhas’, evidently a library concealed to protect its contents.

Here’s another koan, perhaps less well-known outside of Zen circles:

Two monks were arguing about the temple flag waving in the wind. One insisted, ‘The flag moves’. The other equally insisted, ‘No, it is the wind that moves’. They argued back and forth but couldn’t agree. The Zen master Huineng was passing by and, having overheard the two monks, said, ‘It is not the flag moving. It is not the wind moving. It is your mind moving’. 

In this koan, the minds of the first two monks were riveted on the flapping flag, becoming increasingly obsessed with the issue of whether the flag was moving (the observable world) or whether it was in reality the wind that moved  (an invisible force acting on the observable world). Huineng’s point is that the two monks’ minds had become agitated over a minor distraction, consumed by binary, either-or thinking, instead of being in the restful state fostered by Zen Buddhism. Huineng reminded them to move beyond the diversionary tug of who was right or wrong — as both were seeing only partial truth — and instead calm their needlessly restless minds, caught up in the argument.

This anecdotal koan is less enigmatic, but likewise offers a valuable insight into human behaviour:

Tanzan and Ekido were once traveling together down a muddy road. A heavy rain was still falling. Coming around a bend, they met a woman in a silk kimono and sash, unable to cross the intersection. ‘Come on’, said Tanzan, at once lifting the woman and carrying her over the mud.

Ekido did not speak again until that night, when they reached a lodging temple. Then he no longer could restrain himself. ‘We monks don’t go near females’, he told Tanzan, ‘especially not young and attractive ones. It is dangerous. Why did you do that?’

‘I left the woman there’, said Tanzan. ‘Are you still carrying her?’

The account has various interpretations. One version is not to let the past consume you, such that an out-of-control preoccupation crowds out of the mind all else of greater value, including the present — forfeiting immediate experiences. Ekido was plagued by the niggling urge to judge and conform, unable to let go of re-litigating over and over whether Tanzan had violated the literal monastic code of conduct.

In doing so, Ekido succumbs to stepping outside of mindfulness, sacrificing what’s transpiring in the here and now. Meanwhile, Tanzan had moved on. Sometimes, moral codes are cloudy, even appropriately flexible in interpretation and application in order to bend to circumstances. As in this case, the right ‘moral’ choice may have been to break momentarily with convention in order to do a kindness — a higher good.

Another ‘paradoxical anecdote’ offers a different insight: 

Nan-in, a Zen master during the Meiji era, received a university professor who had arrived to inquire about Zen. Nan-in served tea. He poured his visitor’s cup full, and then kept on pouring.

The professor watched the overflow until he no longer could restrain himself. ‘It is overfull. No more will go in!’ ‘Like this cup’, Nan-in said, ‘you are full of your own opinions and speculations. How can I show you Zen unless you first empty your cup?’

Here, the need to let go of — to unlearn — long-held, unaccommodating beliefs, preconceptions, biases, expectations, knowledge, and presumed wisdom is a prerequisite to opening the mind to learn new and different things. The paradox is that arguably the professor might not be able, no matter how sincere his intentions, to disassociate from a lifetime of learning — unable to empty his metaphorical cup.

This is another classic koan:

A man traveling across a field encountered a tiger. He fled, the tiger racing after him. Coming to a precipice, the man caught hold of the root of a wild vine and swung himself down over the edge. The tiger sniffed at him from above. Trembling, the man looked down to where, far below, another tiger was waiting to eat him. Only the vine sustained him.

Two mice, one white and one black, little by little started to gnaw away at the vine. Just then, the man saw a luscious strawberry near him. Grasping the vine with one hand, he plucked the strawberry with the other. How sweet it tasted!

The man faces inevitable death on all sides, trapped by a hungry tiger above and one below. He also faces the two mice, whose gnawing on the vine bodes increasingly dire outcomes. The man chooses to live in the moment, enjoying the luscious strawberry. It is a moment of sublime happiness. Seeing the tigers on each side, the man sees his life similarly bracketed: Before life he had nonexistence and after life he will return to nonexistence, for eternity. He is left with the present. We might similarly conclude the best option in life is to grasp that singular moment in which we relish the ‘strawberry’.

So, what to make of the koan, at the start of this essay, asking about the sound of one hand? The point is that a koan is dynamic and transformational, in the sense that it is, to recall the words of philosopher and Zen monk G. Victor Sogen Hori (in Zen Sand, 2003):

‘…both the object being sought and the relentless seeking itself. In a koan, the self sees the self not directly, but under the guise of the koan. . . . When one realizes (‘makes real’) this identity, then two hands have become one. The practitioner becomes the koan that he or she is trying to understand. That is the sound of one hand.’

In today’s world, shaped by the ubiquity of the scientific method, analysis, quantification, and logic, people are heavily swayed by this way of thinking, through which society seeks insight, knowledge, understanding, and even wisdom. Yet despite the significant contributions such approaches make, they overlook and even obscure key aspects of the world and life: perspectives on what motivates our thinking, our relationships, our values, our connections to the planet, our happiness, our fundamental nature, our intent, our enlightenment, and our potential.

In this way, the ancient koans — with their emphasis on intuitiveness, open-mindedness, and spirituality — are still able in the modern era to inform, inspire, and guide these vital human interests.

Monday, 4 May 2020

Picture Post # 54 No Standing Still

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Thomas Scarborough

Tarawa lagoon, at Antebuka

The world, in the photo, is angled to the left, as if it might slip away.  A boy is suspended between sea and sky. 

Like Zeno's arrow, which is frozen in time and space, and is never able to reach its target, the boy has become a snapshot -- the permanent impression of something transitory. 

In reality, he soon plunges into the water.  In reality, there is no freezing of time and space.  There is no standing still.  There is no Pause button to press as our world hurtles towards the future.

Monday, 27 April 2020

The Curiosity of Creativity and Imagination

In Chinese mythology, dragon energy is creative. It is a magical energy, the fire of the soul itself. The dragon is the symbol of our power to transmute and create with imagination and purpose.
Posted by Keith Tidman

Most people would agree that ‘creativity’ is the facility to produce ideas, artifacts, and performances that are both original and valuable. ‘Original’ as in novel, where new ground is tilled. The qualifier ‘valuable’ is considered necessary in order to address German philosopher Immanuel Kant’s point in The Critique of Judgment (1790) that:

‘Since there can also be original nonsense, its products [creativities] must at the same time be models, i.e., be exemplary’.

An example of lacking value or appropriateness in such context might be a meaningless sequence of words, or gibberish.

Kant believed that creativity pertains mostly to the fine arts, or matters of aesthetics — a narrower perspective than today’s inclusive view. He contended, for example, that genius could not be found in science, believing (mistakenly, I would argue) that science only ever adheres to preset methods, and does not allow for the exercise of imagination. He even excluded Isaac Newton from history’s pantheon of geniuses, despite respecting him as a great man of science.

Today, however, creativity’s reach extends along vastly broader lines, encompassing fields like business, economics, history, philosophy, language, physics, biology, mathematics, technology, psychology, and social, political, and organisational endeavours. Fields, that is, that lend themselves to being, at their creative best, illuminative, nontraditional, gestational, and transformational, open to abstract ideas that prompt pondering novel possibilities. The clue as to the greatness of such endeavors is provided by the 16th/17th-century English philosopher Francis Bacon in the Novum Organum (1620), where he says that:

‘By far the greatest obstacle to the progress . . . and undertaking of new tasks and provinces therein is found in this — that men despair and think things impossible’.

Accordingly, such domains of human activity have been shown to involve the same explorative and generative functions associated with the brain’s large-scale neural networks. The result is a paradigm of creative cognition that is flexible and multidimensional, one that calls upon several features:
  • an unrestricted vision of what’s possible,
  • ideation, 
  • images, 
  • intuitions,
  • thought experiments, 
  • what-if gaming, 
  • analogical reasoning, 
  • metaphors, 
  • counterfactual reasoning, 
  • inventive free play, 
  • hypotheses, 
  • knowledge reconceptualisation, 
  • and theory selection.
Collectively, these are the cognitive wellspring of creative attainment. To those extents, creativity appears fundamental to defining humanity — what shapes us, through which individual and collective expression occurs — and humanity’s seemingly insatiable, untiring quest for progress and attainment.

Societies tend to applaud those who excel at original thought, both for its own sake and for how it advances human interests. That said, these principles are as relevant to the creative processes of everyday people as to those who eventually are recorded in the annals of history as geniuses. However, the creative process does not start out with the precise end (for example, a poem) and the precise means to getting there (for example, the approach to writing that poem) already known. Rather, both the means and the end product are discoverable only as the creative process unfolds.

Above all, imagination sits at the core of creativity. Imagination is representational: it depicts circumstances not yet real, yet capable of evoking emotions and behaviours in people. The world of imagination is, of course, boundless in theory and often in practice, depending on the power of one’s mind to stretch. The American philosopher John Dewey spoke to this point, chalking up every major leap in science, as he boldly put it in The Quest for Certainty, to ‘a new audacity of the imagination’. Albert Einstein’s thoughts paralleled these sentiments; he declared in a 1929 interview that ‘Imagination is more important than knowledge’. It is in imagination that new possibilities take shape. Accordingly and importantly, imagination yields ideas that surpass what’s already supposed.

Imagination is much more, however, than a mere synonym for creativity; otherwise the term would simply be redundant. Imagination, rather, is a tool: freeing up, even catalysing, creativity. To those ends, imagination entails visualisation (including thought experiments, engaged across disciplines) that enables a person to reach out for assorted, and changing, possibilities — of things, times, places, people, and ideas unrestricted by what is presumed already experienced and known of external reality. Additionally, ‘mirroring’ might occur in the imaginative process, where absent features of a mental scenario are filled in with analogues plucked from the external world around us. Ultimately, new knowledge and beliefs emerge, in a progressive loop of creation, validation, application, and re-imagination.

Imagination might revolve around diverse domains, like unconstrained creative thought, play, pretence, the arts, allegorical language, predictive possibilities, and imagery, among others. Imagination cannot, however, guarantee creative outcomes — nor can the role of intuition in human cognition — but it is essential (if not always sufficient) for creative results to happen. As Kant explained, imagination has a ‘constitutive’ role in creativity, something demonstrated by a simple example offered by the 17th-century English philosopher Thomas Hobbes:

‘as when from the sight of a man at one time, and a horse at another, we conceive in our mind a Centaur’. 

Such imaginative, metaphorical playfulness is the stuff not only of absorbed, undaunted children, of course — though they are notably gifted with it in abundance — but also of freethinking adults: adults whose minds marvel at alternatives, whether starting from scratch (tabula rasa) or picking apart (divergence) and reassembling (convergence) presumed reality.

The complexities of imagination best nourish what one might call ‘purposeful creativity’, where a person deliberately aims to achieve a broad, even if initially indeterminate, outcome. Such imagining might happen either alone or with the involvement of other participants. With purposeful creativity, there is agency, intentionality, and autonomy, as is quintessentially the case with the best thought experiments. It occasions deep immersion in the creative process. ‘Passive creativity’, on the other hand, is where someone arrives at a spontaneous, unsought solution (a Eureka! moment) regarding the matter at hand.

Purposeful, or directed, creativity draws on both conscious and unconscious mechanisms. Passive creativity — with mind open to the unexpected — largely depends on unconscious mental apparatuses, though with the mind’s executive function not uncommonly collaboratively and additively ‘editing’ afterwards, in order to arrive at the final result. To be sure, either purposeful or passive creativity is capable of summoning remarkable insights.

The 6th-century BC Chinese spiritual philosopher Laozi perhaps most pithily described people’s capacity for creativity, and its sometimes-companion genius, with a figurative depiction in the Tao Te Ching, defining ‘genius’ as the ability to see potential: ‘To see things in the seed’ — long before germination eventually makes those ‘things’ apparent, even obvious, to everyone else, and they become stitched into the fabric of society and culture.
