Monday, 20 August 2018

Our Destined Date

Posted by Jeremy Dyer *
Red Sky at Night II by Kimberly Conrad
The rivers run, the leaves do fall
The earth still turns its trick
The oceans and the prairies roar
But we are very sick.

Blasted earth, the toxins run
The blood is poisoned well
We cannot survive the fun
Of our consumer hell.

Garbage, plastic, rusted bike
It all runs to the sea
Killing man and beast alike
That poison's killing me.

When will we wake, alas too late
It's past the point of fixing
There is a destined, horror date
That no-one will be missing.

I have the hope that birds will sing
Kind winds will blow again
Stopping our destructioning
Healing up our pain.

But will we wake and heal the earth
Get rid of all the 'leaders'?
Reduce the greed, respect the hearth
Deal with all the breeders?

The earth will die, I think it's done
We're in the final hour
What's over when the song is sung
Is the funeral bower.

The rivers run, black as hell
They're dying as we speak
The urgent answers that we seek
Won't be on tv this week.
* Jeremy Dyer is an acclaimed Cape Town artist.

Monday, 13 August 2018

Ubuntu's Fifth Wave

'Unity in Diversity' by Oscar Olufisayo Awokunle
Posted by Thomas Scarborough
'Ubuntu' is legendary in Southern Africa.  Its meaning is encapsulated in the idiom, 'Ubuntu ngumtu ngabanye abantu'—a person is a person through other people, or 'I am because you are.'  It represents the very reverse of European Enlightenment thinking: 'The individual is prior to the group.'  According to the Lutheran minister William Flipping Jr., ubuntu means that 'we are first and foremost social beings.'  This is not to say that ubuntu denies our individuality; rather it incorporates our individuality in the group.
Ubuntu as one describes it, however, and ubuntu as one experiences it, would seem to be two different things.  The concept seems inadequate for describing what it really is.  A personal aside suggests itself here.  I myself, in 2013, 'married into Africa'.  My own identity, in a positive way, was incorporated into that of the clan, so that I was treated with warmth, generosity, and equality—despite being an outsider of sorts.  Apart from this, I discovered an energetic ubuntu which had a very practical interest in the good of all—something which a mere definition seems unable to convey.

There are said to have been three distinct waves of ubuntu—or four, if one inserts the first on this list.
The original 'village ubuntu'—the spirit of 'one for all, and all for one', which has existed since ancient times, and included hospitality to the stranger.
The ubuntu, first described around the middle of the 19th century, which referred to African personhood and dignity in the face of dehumanisation by colonists.
The ascendency of ubuntu as a political concept in the late 20th century, promising to restore the personhood and dignity of citizens following apartheid.
The theological turn of ubuntu, which originated with Anglican Archbishop Desmond Tutu—anchored in Christian teachings of forgiveness and reconciliation.
Yet if it is true that ubuntu understands that 'we are first and foremost social beings,' then we would seem, lately, to have travelled in the opposite direction.  We have witnessed increasing individualism in Africa—often heartless and often harmful to the group.  We see it in many ways: self-enrichment, wilful damage to local and national infrastructure, environmental destruction, and so on.  Ubuntu seems to have been fairly powerless in the face of such challenges.

The weakness of ubuntu is to me epitomised by a crisis which the BBC billed as a world first—the water shortage in Cape Town, which would have seen the world's first major city running dry on 'Day Zero'.  One of Pi's own contributors, Sifiso Mkhonto, on the news service News24, highlighted the need for ubuntu.  Yet ubuntu seemed in short supply.  Instead, the city mayor wrote a dramatic letter with the opening lines, 'Day Zero is now likely.  60% of Capetonians won’t save water, we must now force them.'

Ubuntu is an idea which was born in ancient time—in another world, we might say, far removed from us now.  More recent concepts of ubuntu were born in the optimism of social and political change, and seem ill fitted to the 'reality' which has set in today.  Archbishop Desmond Tutu describes ubuntu like this: 'When you do well, it spreads out.'  The trouble is, it both does and doesn't.  In the case of Cape Town's water crisis, ubuntu did not spread out to save the city. The city responded to a point, yet it was saved by rain.

We have not yet lifted up ubuntu to the level where Archbishop Tutu's 'doing well' is not just about me and you transmitting the warmth and goodness which a society absorbs, but about a society which can be so organised that it really works for all.  We need a 'fifth wave' of ubuntu, which goes beyond village ubuntu, beyond political ubuntu, and ingrains ubuntu in the fabric of society, as it were.  The advent of true democracy was a positive development for Southern Africa—yet once obtained, the healthful organisation of society is the primary goal.

Religions have known this for millennia.  They have a large foundation of unconditional 'bottom line injunctions'.  Societies, in a similar way, need a comprehensive set of core values which are sacrosanct—a kind of 'super-law' which not only favours ubuntu, but secures it and upholds it with the necessary mechanisms.  That is, an ubuntu of the body politic, embraced as the ideal, embedded in law, and effectively applied.  When that happens, ubuntu may be complete.

Monday, 6 August 2018

To Be is to Inherit

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen

Picture credit: Harry Rutter

Through the repetition of their connotations, words seem to follow rather rigid schemes of application; hence we might even think that, after all, words work. Authority depends on making words effective. Don’t move! Stop! Such words are not only verbal but are accompanied by a series of physical gestures, which we are expected to understand, not question.

Now let us enter the door in the picture above. After all, 'Welcome' is written there. If there were a person behind that door, we would be told we are not allowed to be there. “Have you not read the sign that says No Entry?” Well, no; we focused on the welcome, and would a welcome not be open to all?

This is not how it works, and we do understand this. Even when a combination of words clearly carries incongruous meanings, most often the no overrides the yes.

To be human is to be ambivalent by nature. We cannot avoid contradictions within our own selves, a plural reading of meaning, of relations. But somehow we have learned that property is connected to prohibition. Exclusion is our logic. Is this why, in our language, a welcome can be offered to some but not to all?

Monday, 30 July 2018

The Anthropic Principle: Was the Universe Made for Us?

Diagram on the dimensionality of spacetime, by Max Tegmark
Posted by Keith Tidman
‘It is clear that the Earth does not move, and that it does not lie elsewhere than at the center [of the universe]’ 
— Aristotle (4th century BCE)

Almost two millennia after Aristotle, in the 16th century, Nicolaus Copernicus dared to differ from the revered ‘father of Western philosophy’. Copernicus rattled the world by arguing that the Earth is not at the center of the universe — in a move that to many at the time seemed to knock humankind off its pedestal, and reduce it from exceptionalism to mediocrity. The so-called ‘Copernican principle’ survived, of course, along with the profound disturbance it had evoked for the theologically minded.

Five centuries later, in the early 1970s, an American astrophysicist called Brandon Carter came up with a different model — the ‘anthropic principle’ — that has kept philosophers and scientists debating its significance cosmologically and metaphysically. With some irony, Carter proposed the principle at a symposium to mark Copernicus’s 500th birthday. The anthropic principle points to what has been referred to as the ‘fine-tuning’ of the universe: a list of cosmological qualities (physical constants) whose extraordinarily precise values were essential to making intelligent life possible.

Yet, as Thomas Nagel, the contemporary American philosopher, suggested, even the physical constants known to be required for our universe and an intelligent carbon-based life form need to be properly understood, especially in the context of the larger-scaled universe:
‘One doesn’t show that something doesn’t require explanation by pointing out that it is a condition of one’s existence.’
The anthropic principle — its adherence to simplicity, consistency, and elegance notwithstanding — did not of course place Earth back at the center of the universe. As Carter put it, ‘Although our situation is not necessarily central, it is inevitably privileged’. To widen the preceding idea, let’s pose two questions: Did the anthropic principle reestablish humankind’s special place? Was the universe made for us?

First, some definitions. There are several variants of the anthropic principle, as well as differences among definitions, with Carter originally proposing two: the ‘weak anthropic principle’ and the ‘strong anthropic principle’. Of the weak anthropic principle, Carter says:
‘… our location in the universe [he was referring to the age of the universe at which humankind entered the world stage, as well as to location within space] is necessarily privileged to the extent of being compatible with our existence as observers.’
Of the strong anthropic principle, he explained,
‘The universe (and hence the fundamental parameters on which it depends) must be such as to admit the creation of observers within it at some stage’.
Although Carter is credited with coining the term ‘anthropic principle’, others had turned to the subject earlier than him. One in particular among them was the 19th-century German philosopher Arthur Schopenhauer, who presented a model of the world intriguingly similar to the weak anthropic principle. He argued that the world’s existence depended on numerous variables, like temperature and atmosphere, remaining within a very narrow range — presaging Carter’s fuller explanation. Here’s a snapshot of Schopenhauer’s thinking on the matter:
‘If any one of the actually appearing perturbations of [the planets’ course], instead of being gradually balanced by others, continued to increase, the world would soon reach its end’.
That said, some philosophers and scientists have criticized the weak variant as a logical tautology; however, that has not stopped others from discounting the criticism and favoring the weak variant. At the same time, the strong variant is considered problematic in its own way, as it’s difficult to substantiate this variant either philosophically or scientifically. It may be neither provable nor disprovable. However, at their core, both variants (weak and strong) say that our universe is wired to permit an intelligent observer — whether carbon-based or of a different substrate — to appear.

So, what kinds of physical constants — also referred to as ‘cosmic coincidences’ or ‘initial conditions’ — does the anthropic principle point to as ‘fine-tuned’ for a universe like ours, and an intelligent species like ours, to exist? There are many; however, let’s first take just one, to demonstrate significance. If the force of gravitation were slightly weaker, then following the Big Bang matter would have been distributed too fast for galaxies to form. If gravitation were slightly stronger — with the universe expanding even one millionth slower — then the universe would have expanded to its maximum and collapsed in a big crunch before intelligent life would have entered the scene.

Other examples of constants balanced on a razor’s edge have applied to the universe as a whole, to our galaxy, to our solar system, and to our planet. Examples of fine-tuning include the amount of dark matter and dark energy (minimally understood at this time) relative to all the observable lumpy things like galaxies; the ratio of matter and antimatter; mass density and space-energy density; speed of light; galaxy size and shape; our distance from the Milky Way’s center; the sun’s mass and metal content; atmospheric transparency . . . and so forth. These are measured, not just modeled, phenomena.

The theoretical physicist Freeman Dyson poignantly pondered these and the many other ‘coincidences’ and ‘initial conditions’, hinting at an omnipresent cosmic consciousness:
‘As we look out into the universe and identify the many accidents of physics and astronomy that have worked together to our benefit, it is almost as if the universe must in some sense have known we were coming.’
Perhaps as interestingly, humankind is indeed embedded in the universe, able to contemplate itself as an intelligent species; reveal the features and evolution of the universe in which humankind resides as an observer; and ponder our species’ place and purpose in the universe, including our alternative futures.

The metaphysical implications of the anthropic principle are many. One points to agency and design by a supreme being. Some philosophers, like St. Thomas Aquinas (13th century) and later William Paley (18th century), have argued this case. However, some critics of this explanation have called it a ‘God of the gaps’ fallacy — pointing out what’s not yet explained and filling the holes in our knowledge with a supernatural being.

Alternatively, there is the hypothetical multiverse model. Here, there are a multitude of universes each assumed to have its own unique initial conditions and physical laws. And even though not all universes within this model may be amenable to the evolution of advanced intelligent life, it’s assumed that a universe like ours had to be included among the infinite number. Which at least begins to speak to the German philosopher Martin Heidegger's question, ‘Why are there beings at all, instead of nothing?’

Monday, 23 July 2018

Seizing Control of Depression

The Man at the Tiller, 1892, by Théo van Rysselberghe
Posted by Simon Thomas
We know the symptoms of depression well. We read of them everywhere: sleeplessness, weight loss, reckless behaviour—and so on. Yet we tend to miss the fact that the foremost of these symptoms is deeply philosophical. 
The philosopher Tim Ruggerio defines depression, above all, as ‘the healthy suspicion that there may not be an aim or point to existence’. This broadly agrees with a symptom which stands at the top of many lists of symptoms: ‘Feelings of helplessness and hopelessness. A bleak outlook.’

Of course, depression does not exist purely on a philosophical plane. It is deeply felt. The symptoms one reads about do not begin to describe the darkness one feels in the throes of a depressive episode. It may be hard to see a way out when, frayed and tattered, one’s feelings start spiralling—and it seems no amount of positive talk can help.

Yet even then, there is one steady pole at the centre. My feelings belong to me. Only I can do something about them. This, too, is deeply philosophical. It is too easy to doubt or despair about something, without recognising that one is despairing over oneself. One needs to own it—and such ownership, in turn, forms the basis for a rational way forward.

The philosopher-theologian Paul Tillich wrote, ‘The acceptance of despair is in itself faith and on the boundary line of the courage to be ... The act of accepting meaninglessness in itself is a meaningful act.’ Here, then, is how this simple philosophical insight helps us further:
When we recognise that we are dealing with a philosophical struggle, our orientation to the problem may change. The acceptance of depression as my own, far from acceptance in the sense of surrender, becomes the source of the resolve to face the real issue. It is about the search for an all-embracing meaning of life.

When I see that depression is a philosophical problem, it stands to reason that I shall engage in activities which strengthen me philosophically—which enhance the mind and focus on the good.  Conversely, I shall as far as possible remove myself from the company of those who engage in negativity.

When I understand that it is too easy to doubt or despair about something, without recognising that I am despairing over myself, I know to set aside some of those thoughts and activities which are merely avoidant, which serve to continue a once-removed despair.

Knowing that the solution is philosophical, it stands to reason that it cannot be applied in a single day. It is a long-term process, and there are no quick fixes. One develops realistic expectations. Similarly, one does not let down one’s guard. Depression is a bit like the devil in Christian belief: it does not take time off. There is no time for peace until one walks free.

The ownership of depression represents an acceptance of one's own weakness. Socrates was an avid proponent of the dictum ‘Know thyself.’ To know one's weakness in times of distress is of great help, because if one knows what causes one to fall, one can take steps to stop the downward spiral of one’s mindset.

Philosophy in all its fullness includes the spiritual and artistic aspects of our personality. Therefore it is valuable to have an appreciation for the spiritual and aesthetic inclinations of the human ‘soul’, and to exercise and expand on them.
Of course, prevention is always better than cure. ‘Guard your heart, for out of it come the issues of life,’ wrote the wise King Solomon. Watch your life, and be careful what and whom you allow into your heart. We are always under the influence of something or someone, at some stage of our life. It is sensible to guard what one allows oneself to be influenced by.

This is not intended to diminish the help that medication gives, or wise counsel. Yet philosophy plays a central role in depression, and may present a definitive anchor for the soul, which enables us to find the way back to a place of reason and not to spiral into despair.

Monday, 16 July 2018

The Things | Relations Dichotomy

Yin-Yang by Sandi Baker
Posted by Thomas Scarborough
We humans have always been accused of dichotomous thinking: us and them, good and evil, for and against, and so on.  It pervades our thinking, and our existence.  Such dichotomous thinking is closely familiar to us.  Not a day goes by without someone suggesting that we should be more nuanced, less one-sided, better rounded. 
Yet there is a strange dichotomy which is more pervasive still, which passes all but unnoticed in our lives—and, I shall argue, bedevils all of our thinking.  Within its broader bounds, it goes by hundreds of names—which in itself suggests that it has too much escaped our attentions.  One might describe it as the static and dynamic, or being and becoming—but there are many ways to describe it besides:
things and relations (Kant)
objects and arrangements (Wittgenstein)
the spatial and the temporal
nouns and verbs
operators and variables
And so on.  It is the simple matter of a world where things exist (we include events, which are things that happen), and exist in a certain relation to one another.  This dichotomy pervades all of our thinking—and it does so at a deeply embedded level.  One sees it even in our grammar and our sums, for example.

It all has to do with individuation.  We all begin, apparently, with what William James called ‘one great blooming, buzzing confusion’, then we single out complexes from nature and call them things, entities, objects, even concepts—or events, actions, processes, and so on.  We distinguish these then from the relations between them.

We may have said enough in these few words to identify the presence of this dichotomy at the core of some major philosophical problems, of which just a sample here:
The fact-value distinction (Hume).  We have facts on the one hand—or statements which contain things—yet do not know, on the other, how we should arrange them or bring them into relation.
The ‘own goal’ of science (Hawking).  By singling out things from nature, and discarding all that (we think) does not belong to them, we create a world of unforeseen side-effects, as we relate them.
Free will and determinism.  Free will goes to the question of cause and effect, and causation in turn is about the relation between two or more events. This, too, rests on the dichotomy of things and relations.
The mind-body problem.  This problem may rest on our experience that things exist in the world (or so we feel), while only relations can exist in our networking brain—not things, of course.
God.  The problem of God’s existence may rest on the notion of causality, since that which is caused is not influenced by God.  Again, causality rests on the distinction between events and their relations.
We may put it this way.  If we did not have this dichotomy of things (and the like) versus relations, it would be impossible that we should have any of the problems listed above—and many more.  This suggests that we may solve these problems by doing away with one side of the dichotomy—say, things.  This would leave us only with relations, and relations within relations.  It is not an entirely new idea.

Someone might object.  Even if we have no things, objects, entities, events, actions, and so on, we do still have relations—and these relations are governed by scientific law.  But wait a moment.  Without things, there is no scientific law.  At least, not as we know it.

The fact of the dichotomy is presented here simply as food for further thought.  In my view, the dichotomy is artificial and false.  It is a reflection of something in the human fabric that insists first on our individuating things, then on relating them one to the other.  Yet there never has been anything to set this on a firm foundation.

Monday, 9 July 2018

Is Time What It Appears to Be?

Posted by Keith Tidman

Picture credit: Shutterstock

“Time itself flows in constant motion, just like a river; for neither the river nor the swift hour can stop its course; but, as wave is pushed on by wave, and as each wave as it comes is both pressed on and itself presses the wave in front, so time both flees and follows and is ever new.” – Ovid
We understand time both metaphorically and poetically as a flowing river — a sequence of discrete but fleeting moments — coursing linearly from an onrushing future to a tangible present to an accumulating past. Yet, might ‘time’ be different than that?

Our instincts embrace this model of flowing time as reality. The metaphor extends to suppose a unidirectional flow, or an ‘arrow of time’. According to this, a rock flies through a window, shattering the glass; the splinters of glass never reform into a whole window. The model serves as a handy approximation for our everyday experiences. Yet what if the metaphor of time as a flowing river does not reflect reality? What then might be an alternative model of time?

What if, rather than flowing, time actually entails only one now? Here, an important distinction must be made, for clarity. Time is not a sequence of ‘nows’, as proposed by some, such as the British author of alternative physics, Julian Barbour. That is, time is not points of time — corresponding to frames in a movie reel — with events and experiences following one another as ephemeral moments that, if slowed down, can be distinguished from one another. Rather, time entails just one now: a model of time in which the future is an illusion — it doesn’t exist. The future isn’t a predetermined block of about-to-occur happenings or about-to-exist things. Likewise, the past is an illusion — it doesn’t exist.

As to the past not existing, let me be specific. The point is that what we label as history, cosmology, anthropology, archaeology, evolution, and the like do not compose a separately distinguishable past. Rather, they are chronicles — memories, knowledge, understanding, awareness, information, insight, evidence — that exist only as seamless components of now. The Battle of Hastings did not add to an accumulating past as such; all that we know and have chronicled about the battle exists only in the now. ‘Now’ is the entirety of what exists — all things and all happenings: absent a future and past, absent a beginning and end. As the 4th-century philosopher St. Augustine of Hippo presciently noted:
‘There are three times: a present time about things past, a present time about things present, a present time about things future. The future exists only as expectations, the past exists only as memory, but expectation and memory exist in the present’.
In this construct, what we experience is not the flow of time — not temporal duration, as we are wont to envision — but change. All the diverse things and events that compose reality undergo change. Individual things change, as does the bigger landscape of which they are a part and to which they are bound. Critically, without change, we would not experience the illusion of time. And without things and events, we would not perceive change. Indeed, as Ernst Mach, the Austrian philosopher-physicist, pointed out: ‘... time is an abstraction, at which we arrive by means of the changes of things’.

It is change, therefore, that renders the apparition of ‘time’ visible to us — that is, change tricks the mind, making time seem real rather than the illusion it is. The illusion of time nonetheless remains helpful in our everyday lives — brown leaves drop from trees in autumn, we commute to work sipping our coffee, an apple rots under a tree, the embers of a campfire cool down, the newspaper is daily delivered to our front door, a lion chases down a gazelle, an orchestra performs Chopin to rapt audience members, and so forth. These kinds of experiences provide grounds for the illusion of time to exist rather than not to exist.

As Aristotle succinctly put it: ‘there is no time apart from change’. Yet, that said, change is not time. Change and time are often conflated, where change is commonly used as a measurement of the presumed passage (flow) of time. As such, change is more real to the illusion of time’s passing than is our observing the hands of a clock rotate. The movement of a clock’s hands simply marks off arbitrarily conventional units of something we call time; however, the hands’ rotation doesn’t tell us anything about the fundamental nature of time. Change leads to the orthodox illusion of time: a distinctly separate future, present, and past morphing from one to the other. Aristotle professed regarding this measurement aspect of time’s illusion:
‘Whether if soul [mind] did not exist, time would exist or not, is a question that may be asked; for if there cannot be someone to count, there cannot be anything that can be counted.’
So it is change — or more precisely, the neurophysiological perception of change in human consciousness — that deludes us into believing in time as a flowing river: a discrete future flowing into a discrete present flowing into a discrete past. The one-way arrow of time.

In this way, the expression of dynamic change provides our everyday illusion of time, flowing inexorably and eternally, as if to flow over us. The British idealist philosopher J.M.E. McTaggart wrote in the early years of the twentieth century that ‘in all ages the belief in the unreality of time has proved singularly attractive’. He underscored the point:
‘I believe that nothing that exists can be temporal, and that therefore time is unreal.’
To conclude, then: Although the intuitive illusion of time, passing from the future to the present to the past, serves as a convenient construct in our everyday lives at work, at home, and at play, in reality this model of time and its flow is a fiction. Actual experience exists only as a single, seamless ‘now’; there is no separately discrete future or past. Our sense of time’s allegorical flow — indeed, of time itself — arises from the occurrence of ‘change’ in things and events – and is ultimately an illusion.

Monday, 2 July 2018

PP #37 A Celebration of Brashness!

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen

A postcard presentation of Times Square
Times Square, New York.
‘The soft rush of taxis by him, and laughter, laughter hoarse as a crow’s, incessant and loud, with the rumble of the subways underneath - and over all, the revolutions of light, the growings and recedings of light - light dividing like pearls - forming and reforming in glittering bars and circles and monstrous grotesque figures cut amazingly on the sky.’
During the so-called Jazz Age, that is, the optimistic time after ‘the Great War’ and before the Depression, the rise of Nazism and the Second World War, F. Scott Fitzgerald’s metaphor in his novel The Beautiful and Damned reflects so well human despair combined with hope.

Acts of freedom and expression intertwine to be heard and noticed, to forget and to distract, to employ, and to hope... In those days, Times Square must have appeared promising, like a colourful stamp on the continent. But what did its message say?

Ideas about segregation and freedom brought ‘silent’ new horizons and made former distinctions tremble. With all there was to come, in those years of the Roaring Twenties, all the layers that combine to make a society were looking for ‘a voice’ and the call echoed, near and far. 
People rather grandly called Times Square the ‘crossroads of the world’ and in those days, that might well have been so. And today, on the edge of the square, the NASDAQ controls a good slice of the world’s wealth and the New York Times does likewise for the world's news.
Yet it is after dark, after the office day has finished, that the square really comes alive. It is doubtful whether that liveliness today is filled with the same complexity and struggle, or with that necessity, literal and symbolic, to survive. While the square once stimulated a proper voice, ‘light dividing like pearls’, Times Square now embraces more of a homogenisation, and offers monstrous grotesque figures cut amazingly out of the sky.

Monday, 25 June 2018

The Importance of Being Seen

Posted by Simon Thomas
Of all our innermost human desires, nothing seems to trump that of being seen. Humanity is meant for community and for togetherness. There is something very fundamental about being acknowledged, of being seen, as a person.
The Internet, with all its wonders and its ability to connect people from all over the world, does indeed meet some of this need, but virtual reality does not on any level substitute for real human interaction on a personal level. It does, however, point out the fact that there is a real need for it. A person can have untold ‘friends’ or followers on social media, but this does not replace real human conversation. And we are sick because of it.

Everyone needs to know that what they say, or the very fact of their being, is important to somebody. The heart longs for a human embrace, for the attentive ear of another who shows some interest in being with them and listening to what they say. Psychological research has shown that when people are perpetually in a situation where they are ignored, this causes real emotional pain, which in turn causes physical problems through the stress of being ignored on an ongoing basis.

This is all the more prevalent in our society of individualism. Everyone wants to connect, but there seems to be an incessant preoccupation with connecting with everything and everyone except that which is in our present reality. It is true that a person can feel lonely in a crowd, and feel intense feelings of abandonment even in the company of others. This is especially true in our society, with its preoccupation with distraction.

Things become more important than people; virtual friendships online become more important than friendships we can experience in real time and in real situations. We see, but we don’t see each other. We put each other in categories, and fail to recognise how much we are the same, with the same need for communication, and the real need simply to communicate with those who share our time and space.

Families today, too, have gone this route. People live under the same roof but do not communicate; there is little or no interaction. The whole emphasis has shifted from ‘how can I serve?’ to ‘how can I get something out of this person?’

I have this kind of relationship with my dog. He comes to me when he is hungry or wants something from me. And that is okay. Animals do not have the complex relationship needs that human beings have. But my dog has what I call ‘cupboard love’ -- he loves me for what he can get from me. That, however, is not to be the way we interact with our fellow human beings. It is the height of selfishness. And very often the cause of much emotional and mental anguish.

I have noticed, however, that this is how the Internet of things works. Someone has something on offer which the other wants, and while felt needs are met on a superficial level, there is no lasting connection. It is understandable that people want the connection, but they don’t want to acknowledge the person they interact with. We come across many people in our daily lives: in the office, at the bus station, in the shops, at church. But as many can testify, even after we leave a party or a group of people we feel drained.

It is important to listen to one another. Even a brief interaction can be meaningful if the person we talk to makes us feel that we have been seen, that we have been acknowledged. It is not uncommon to go through a day and, though we do many things in the course of our day’s activities, be left empty. What is that? Well, I perceive that the reason we fail to connect is that we seem to objectify people and treat them as less than they are.

Human beings are made imago dei -- in the image of God. We were created to interact and communicate; we were made to live in community and not in isolation. To be human is to share in the common human experience, and to live in such a way that we acknowledge one another, and not allow our many distractions to detract from how we relate to one another.

Monday, 18 June 2018

White Lies – Malevolence or Defence?

Little White Lies, by e9Art
Posted by Christian Sötemann
A little thought experiment: In the year 2088, a mentally highly volatile leader of an autocratic world power is undergoing yet another personal crisis. His wife, so he has heard, is secretly planning to leave him. Without her, he sees no meaning in going on. Since he is also a narcissistic megalomaniac, in his dark mood he decides that the world should perish if she left him. He prepares to give the order for a nuclear strike and confronts his wife on her secret plans.
Now, what would be a wise thing for her to answer, even if she actually planned on leaving him? Surely, most people would say something along these lines: calm him down, say that everything is fine, just get him away from ordering a nuclear strike. The rest will be sorted out later. Hence she should lie to save the world from a nuclear attack.

That’s that then, right? Not so fast. In ethics, the role of the lie has been a hotly debated one. Among the ethical stances, there are some which emphasise the consequences of an action to determine whether it is moral or not. Many of the supporters of these approaches would probably have few issues with the wife’s lie. The argument would go like this: lying in this particular case prevents unfathomable damage occurring to millions of people, so it is the right decision.

There are, however, perspectives in ethics that focus more on principles and duties rather than consequences of actions, notably in Kant’s categorical imperative: ‘Act only according to that maxim by which you can at the same time will that it should become a universal law’. From this point of view, in its strictest form, a lie cannot ever be legitimate, because human relationships would become poisoned if everybody lied to each other all the time.

In many cases, there is some validity to that principle. We have to be able, at least most of the time, to confide in what people around us tell us. The lie has to be the exception rather than the rule. Our everyday life would be seriously impaired if we all lied to each other all or most of the time. 

Still, there is a point to be made for white lies. Schopenhauer viewed lies as a legitimate form of self-defence in cases of extortion, threat or unauthorised interference or intrusion, among other things. If I am exposed to an evil will, lying can be part of the arsenal to defend myself.

For example, if somebody broke into my house, thus violating my right to privacy, my exclamation telling the burglar that the police were already on their way would represent a perfectly legitimate lie to make this intruder leave my house as quickly as possible. Similarly, a child threatened by bullies on his way home from school might want to use the white lie that his parents or elder brother were just around the corner. There is no malevolent deceit in situations such as these.

It seems that the most important aspect here is that there is a predicament which can make a white lie a suitable means to an end. To avert a catastrophe or a crime, white lies can come into consideration. Besides, from this perspective, the ‘lie’ aspect of the white lie becomes less relevant – rather, it becomes one of several means to defend oneself. It is something one can do to get out of a dangerous situation.

The application of the categorical imperative in this case should therefore not denounce the white lie as harmful, but could be reformulated as: ‘In a dangerous situation threatening the physical and psychological integrity of an individual in an illegitimate way, every individual should have the right to undertake sufficient actions to avert this threat’.

In German, one translation of ‘white lie’ is Notlüge, meaning, literally, ‘emergency lie’. Perhaps this serves to illustrate some cases in which a white lie seems appropriate. It is something that is more a verbal form of defence rather than a mere lie.

Certainly, it would be harmful to lie all of the time. And it can be harmful never to lie at all. The potential Kantian counterargument – that this position weighs the consequences of actions rather than holding to a principled stance regardless of what happens afterwards – is something that can be addressed.

But it represents another example of morality not necessarily being beholden to one orthodoxy throughout.  We may consider principles as well as consequences in our moral deliberations. There is something to be found between the extremes of rigidity and arbitrariness. So, we should not blame the dictator’s wife for her white lie. Those living in the year 2088 will be grateful for our leniency.

Monday, 11 June 2018

BOOK REVIEWS: Back to the Future with the Food Gathering Diet

Posted by Martin Cohen*

How we imagine hunting and gathering - in this case, on the South Texas Plains

Food Sanity: How to Eat in a World of Fads and Fiction
By David Friedman (Turner 2018).

Psst! Maybe someone should have told David Friedman, well-known media personality as well as the author of this new look at food issues – there are hardly any vegans. So if you pitch a book on 'how to eat' to that crowd, you take the risk of ending up preaching to a much reduced congregation. Add to which, the serious vegans in town won't like some of what Friedman has to say, because vegans don’t eat eggs and certainly don’t eat fish. All of which only goes to show that food is a pretty controversial and divisive issue these days, and if you want to be honest, as Friedman evidently does, you're going to have to risk trampling on the dearly held, indeed dearly munched, beliefs of lots of people.

But I hope Food Sanity does find that wider readership, because I’ve read a lot of books and articles recently about food and this one really does clear out a lot of the deadwood and present some pretty mind-boggling facts (and figures) to ‘put the record straight’, as Jack Canfield (of Chicken Soup for the Soul fame) puts it, by way of an endorsement of the book.

Take one opening salvo that, as I say, will surely lose Friedman lots of readers in one fell swoop: the Paleo or ‘Caveman’ Diet. This is probably the most popular diet going, and that’s likely because it fits people’s dearly held prejudices so well. Plus, it allows them to eat lots of beef-burgers and chips, while cutting out things like muesli, which only hippies eat anyway. But oh no, Friedman has done his research and found out that Stone Age folk didn’t really eat lots of red meat washed down with a beaker of blood, as we like to imagine. Instead, using both archaeological and anthropological research as a guide, he says that the earliest human tribes spent most of their time eating fruits and seeds, which they gathered, and probably only really sharpened the spears (or so, at least, I imagine) for internecine human disputes.

Friedman finishes his deconstruction of Paleo by considering human biology too: notably the fact that we just aren’t built to catch our fellow animals. We lack the right claws, teeth and general physique. He points out, a thing curiously overlooked, that Stone Age people would have been rather short and squat - not the fine figures wielding clubs that we imagine. He retells Jared Diamond’s tale of a hunting trip by one of today’s last remaining ‘stone age’ tribes, in New Guinea. At the end of the hunt, the tribe had caught only some baby birds, frogs and mushrooms.

This is all fascinating to me, but compelling too are Friedman’s physiological observations, most particularly on the acidity of the human stomach. The gastric fluids of carnivores are very acidic (pH 1), which is essential if they are to break down the proteins and to kill bacteria. Our stomachs, however, are much less acidic (pH 5), and simply can’t tolerate much uncooked meat. And if, yes, Stone Age man might have done a bit of cooking, it would probably have been rather rudimentary with parts of the meat not really cooked.

Actually, by the time I had finished reading all of the reasons that ‘humans can't eat meat’, I was left puzzled by Friedman’s conclusion, which was that a significant proportion of the prehistoric human diet (nonetheless) seems to have been meat. Less surprising was Friedman’s hearty endorsement of eggs, which surely everyone has heard by now are really not dangerous, and don’t cause heart attacks after all, and fish, which he carefully defends against claims that they are today dangerously contaminated with things like mercury.

However, dairy gets the thumbs down, with a disdain that I personally felt was unjustified. Dairy, after all, is much more than drinks of cow’s milk - it is goat and sheep milk, cheese and cream too - and an inseparable part of many dishes. We are advised here instead to swap to things like ‘almond milk’ and ‘hemp milk’, but I know these substitutes very well, and, well, they ain’t one. At least Friedman doesn’t try to suggest we switch to soya milk because, as he rightly observes, that is a food disaster just in itself.

There is, to be honest, a bit too much bad news in this book - so much so that I started to skip some sections, which fortunately the book’s modular structure permits. On the other hand, Friedman makes an effort to leaven the mix by including some good news and positive suggestions, including a two-page table of the healthiest foods on earth. What are they? They're all fruits and veggies - the things that Plato and Pythagoras were praising and recommending some two and a half thousand years ago. It seems that it’s time, if not indeed long overdue, to go back to following their advice.

*Martin Cohen is the author of a forthcoming book on food issues too called I Think Therefore I Eat, which is also published by Turner, and due out in November 2018

Monday, 4 June 2018

Picture Post #36 A postcard from Taroudant

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen

A postcard from Taroudant, Maroc

One piece of advice offered is to lower the gaze, to not allow it to dwell, as if the eye serves distraction.

The woman seated in front of the painting is possibly homeless. Her posture dissolves into the two figures on the wall, characterised by their carved-out eyes, urging us to imagine where this woman can put her gaze.

Eyes and hearts, their combination invites a myriad of symbolic attributions. One of them is that a woman with her eyes can reach the man in his heart. The carved-out eyes suggest that women, even when veiled, still look (and distract), which they should not... Or is the image saying something quite different, that the time for women to be veiled is consigned to history and that these days we can 'forget about the eyes’?

An eye is connected with light, and light with reflection. The ‘seduction’ begins with the question of where the reflection should place its attention.
