Faith

Martin Turner is a former chair of the Liberal Democrat Christian Forum. He previously worked with Operation Mobilisation in Belgium, and is a member of Bidford Baptist Church (theBarn).

Why we need less morality, not more

Morality. Of all types of opinions, we hold our moral ones most tightly, utter them uninvited most frequently, and are most outraged when they are breached. The sub-text of every tabloid shock headline is ‘and this should not be’. In journalistic and political writing, all we have to do is preface our complaint with ‘it is an outrage that…’, and we know that we have gained some kind of an aura as moral pundits, superior to those we criticise.

One of the very best places to find moral outrage is Facebook. Once the hangout of students and other cool folk, Facebook is increasingly populated by grumpy middle-aged people and others intent on policing the moral values of, well, everyone. Students and other cool folk long ago migrated to Instagram and WhatsApp, and, as often as not, maintain their Facebook accounts just to keep in touch with grumpy relatives.

Alongside the personality quizzes, pseudo-maths puzzles (where have the pictures of cats gone?) and general chit-chat, Facebook is the best place to pep up your daily moral outrage by seeing pictures of things you don’t like, with comforting text implying that you are a morally superior being for not liking them. If you are of a liberal tendency, you can be outraged by Republicans, right-wing Christians, oil companies and -ists of many kinds. If you are of a conservative tendency, you can be outraged by Democrats, extremist Muslims, feministas and -ists of most other kinds. If you sit somewhere in the middle, pictures of an empty House of Commons debating something important may well fill the gap.

A bit of moral outrage seems to tickle a spot in most of us. Newspapers have been selling on that basis for years. Far be it from me to try to limit people’s access to moral outrage. However, I do want to argue that such outrage is not a part of our moral sense at all, and that it can be quite dangerous: my ability to be outraged by, say, Donald Trump, can easily give me the impression that I am a moral person.

Our very best moral outrage is naturally reserved for hypocrisy. Ever since Jesus pointed this one out, it has been a perennial favourite. Most of us are able to keep in mind the things that we oppose. If we catch ourselves doing them, we generally drop our opposition to them and get wound up about something else instead. By this means, we can retain our stance against hypocrites, while making sure that we do not fall into their category.

Hypocrisy, though, is the symptom, not the cause. It is an outward expression of our natural and innate tendency toward self-deception. That we have learned to catch ourselves (usually) before making hypocritical statements merely demonstrates that we have gained some skills in thinking before we speak, at least in regard to a rather unfashionable vice.

Let me attempt to delve a little deeper.

The foundation of our shared morality—it seems to me—lies in an innate sense, present in all of us, that is often referred to as ‘conscience’. Our sense of right and wrong is intuitive rather than rational—at least, for things which are ‘obviously’ right and wrong. Some people have argued that conscience is identical to empathy, but I think this leads us very quickly into a cul-de-sac. The two senses overlap in many cases, but there are issues of morality, such as stealing from a corporation, where it is very hard to argue that the failing was an empathic one.

To some extent, what we regard as conscience is heavily shaped by upbringing and by society around us. There is nothing intrinsically wrong with eating with your fork in the right hand and your knife in the left, but for most British people, this prompts the same kind of reaction as the temptation to take a larger slice of the pie than the next person. However, anyone who spends much time with children (or who remembers what it was like to be a child) knows that ‘it’s not fair’ is a cry that does not have to be taught. Whether children get the notion of fairness from their parents or not, it is one that resonates early on, and is much more easily retained than that business with forks and knives, even though it is an abstract concept.

Here is the tricky bit. For most of us, when we are deciding whether something that we are contemplating is right or wrong, we consult our conscience. If introspection does not give us an immediate answer, our moral reasoning tends to proceed on the lines of ‘this is like that, and that is wrong’.[1]

However, when making our moral pronouncements about the behaviour of others, we tend to first try to articulate a rule, and then apply it to other people. Making moral pronouncements would seem to be merely a distillation of our sense of conscience, but is it?

A few months ago, there was an article on the BBC website about filtering on motorways when three lanes are brought down to two. According to the article, correct driving is to remain in your lane as long as possible; when filtering from the right, it is the other drivers’ responsibility to let you in. Interesting as it was, the article was nothing like as interesting as the comments that followed it. The language people were using was the language of morality, and, as the debate progressed, there were real signs of outrage. Just last week, LBC posted footage of a cyclist and a taxi colliding, and asked ‘but who is in the wrong?’ Perhaps the headline drew in those most interested in a moral debate. Either way, the language of the debate was moralistic, not technical.

It should be quite evident that the Highway Code is not a matter of intuitive conscience. It is a set of conventions for driving which form the basis for passing the UK driving test. As any driver coming from mainland Europe or the USA can attest, the most fundamental UK convention, that we drive on the left, is not intuitive at all. It certainly isn’t a matter of superior morality. What’s more, when a UK driver goes to Belgium and drives on the right, they are not being hypocritical by doing so.

I give this example because I hope it shows that our willingness to make moral pronouncements and then to reason from them is not actually a function of our moral sense at all. The commentator who argued (with a strong implication of moral outrage) that it is a driver’s duty to filter as soon as the ‘lane closed ahead’ sign appears (and the many who agreed with him) was doing exactly that. Most of the commentators on the LBC thread (at least, when I last looked) did exactly the same thing: they first created a ‘rule’, and then applied it to the situation. In most cases, the rule they created was not in the Highway Code at all, and the commentators who argued from the Highway Code were often dismissed by others.

I don’t intend to make a pronouncement about the cyclist or the taxi driver. What I’m more interested in is this as an example of our delight in rule-making and pronouncement.

Any system of justice, of course, relies on being able to make pronouncements. Roman law, indeed, relies on written rules, and even Anglo-Saxon common law has come to rely increasingly on regulation, often codifying what was previously judged on precedent. But law and morality are not the same thing, and were never intended to be.

While legal regulations have mushroomed, popular moral prescriptivism has exploded. In the last few days I have read articles and seen memes that tell me that it is immoral to explain things, immoral to accept refugees if there is any homelessness in one’s own country, immoral to own guns, immoral to control guns, even (I assume in jest) that it is immoral to put up Christmas decorations early. Political Correctness brings with it an ever-tightening set of strictures, but its opposite, US Republican-style anti-political correctness, seems just as laden with rules, just about different things.

There are only three moral thinkers who, to me, have contributed substantially to the debate, and all of them have sought to reduce a super-complexity of moral rules down to just one or two positions.

John Stuart Mill, building on the work of Jeremy Bentham, posited Utilitarianism as a single moral theory which allowed us to dispense with other rules. I don’t agree with Utilitarianism, but I recognise its importance. The notion of ‘the greatest good for the greatest number’, when taken alongside minimising harm, is a framework which large, impersonal bodies, such as corporations and governments, can use effectively in many circumstances. There are many examples where applying Utilitarianism would produce a result which was unjust, or even evil, and many of these have been condensed into moral philosophy puzzles involving runaway trams and other such crunch cases.

Immanuel Kant proposed the notion of a Categorical Imperative, which is “Act only according to that maxim whereby you can at the same time will that it should become a universal law without contradiction”. From this he derives: “Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end.” Further, “Therefore, every rational being must so act as if he were through his maxim always a legislating member in the universal kingdom of ends.”

Kant’s notion of the Categorical Imperative is widely cited (though not quoted) as something which it is not, quite. On that, more in a moment. On grand moral questions, it is well worth asking “if I did this, and it were to become the universal rule, would I be happy?” However, on day to day moral questions, issues of personality and personal style crowd in. As an Extrovert Intuiting Thinking Perceiving person (though I’m actually borderline on three of those), I have a view of the kind of things that make me happy, and which I think would make everyone happy. Some people do want a world full of parties, trying out the latest gadgets, bright clothes, late nights and loud music. To others, this would be hell on earth. One of the reasons for the explosion of new moral prescriptivism is that many people now imagine that they are legislating members of humanity.

Nonetheless, when applied as a personal code, Kant’s view is, I think, preferable to Mill’s, and Mill’s to today’s ad hoc prescriptivism. Mill’s needs a calculator to operate, whereas Kant’s needs a bit of introspection.

The moral thinker (it should surprise no one that I make this claim) who I think takes us the greatest distance is Jesus of Nazareth. He proposes two ‘laws’: ‘Love the Lord your God’, and ‘Love your neighbour as yourself’. When pressed on ‘who is my neighbour’, the answer is ‘anyone you encounter’. His other formulation is ‘do to others what you would have them do to you’.

Kant’s Categorical Imperative is often confused with this last pronouncement, typically referred to as the ‘golden rule’. Its negative form, ‘do not do to others what you would not have them do to you’, was already relatively widespread before Jesus, but he is credited as being the first to put it forward as an injunction to ‘do’ rather than to ‘refrain from doing’. It differs from Kant’s in that Kant is saying that you should only do that which you would want to be a universal rule. Jesus’s is more direct: ‘would you want it? Then do it’, and requires nothing in the way of extended introspection.

What Mill, Kant and Jesus all have in common is that they are proposing one or two simple rules by which moral agents (ie, us) can evaluate the actions we are about to take. Mill’s view can be applied retrospectively, in the sense of ‘did that produce the maximum good?’, but that is not its intention. In each case, they are rules for us, rather than rules for us to impose on others. Indeed, neither Kant’s position nor Jesus’s can be applied to someone else. I cannot know whether, at the time, someone did something because it was what they would want done to them, or because they wanted it to be the universal rule, or for entirely selfish reasons.

If we could simply wipe out all the extra moral rules, the extra bits of ethics, custom, judgement, prescription, outrage and memification, and go back to any one of Mill’s, Kant’s, or Jesus’s formulations—in other words, have less but better morality, rather than more but bittier—then we would be in a much better position to evaluate our own behaviour ahead of time, and be possessed of a much better understanding that it really isn’t our business to evaluate other people’s.

So, for everyone poised to create that new meme, or to post an outraged remark on Facebook or as a comment to a BBC article, or to pen the newspaper article that prompts storms of outrage, or to make a speech in the House of Commons denouncing this group or that group, or to create new legislation that forces people to behave ‘better’ (whatever better is), let me offer one final moral remark, also from Jesus of Nazareth: ‘Do not judge others, lest you be judged yourself.’


  1. Piaget, followed by Kohlberg, of course, argues that this is a form of reasoning engaged in only by those who are relatively morally developed. However, experimental evidence has not generally supported their view: we see quite sophisticated moral reasoning among children, and, indeed, the children’s literature that children themselves most enjoy tends to have an intuitive rather than a rule-bound moral sense.

Christianity is not British, Mr Cameron

Fifty-five secularists have written a letter to the Daily Telegraph complaining that David Cameron has referred to Britain as a Christian country. According to a YouGov poll, most people describe themselves as non-religious. According to the most recent census, most people in Britain describe themselves as Christian.

So who is right?

First, we should avoid putting words into Mr Cameron’s mouth: what he wrote in the Church Times was:

“We should be more confident about our status as a Christian country.

“Being more confident about our status as a Christian country does not somehow involve doing down other faiths or passing judgment on those with no faith at all.”

If we begin from the position that ‘Christian Country’ is a status which can be obtained, like ‘Fair Trade Town’ or ‘licensed to drive goods vehicles over 10 tonnes (unladen)’, then it’s possible to argue the toss all day about whether the YouGov survey is right, or the census, or whether having an established church gives Britain this status, or laws in some way connected with the Ten Commandments. We could look at weekly church attendance, which runs at around 10%, and argue that Britain is not Christian, or compare that 10% with membership of all political parties and arts associations, and visits to museums and libraries, and conclude that Christianity remains one of the country’s dominant forces. Likewise, we could look at the influence of the Bible on our language and cultural life, and argue that Britain is Christian, or look at the daily output of television, radio and newspapers, and argue that it is almost entirely secular.

The letter to the Telegraph takes a slightly different tack. Although it asserts that Britain is not Christian, its real force lies in suggesting that the effect of David Cameron’s pronouncement is division. This is a clever shot, because it should be relatively obvious to everyone that Cameron wrote his Church Times article because he wants to produce a particular effect. The secularist letter argues that the effect will be the opposite of the one intended.

Nonetheless, the letter accepts the same underlying question: “Is Britain a Christian country? Discuss”.


Historic Christianity is defined more or less completely by the New Testament, read in the context of the Old Testament. All Christian churches base their beliefs upon it, and none acknowledge any written authority above it. Contemporary Christianity exists in thousands of flavours, with believers in every one of the world’s 196 countries, expressing their faith in yet more culturally relevant ways — but all look back to the New Testament for their teaching.

So, what basis does the New Testament provide for the claim that Britain is (or ever was) a Christian country?

Actually, none whatsoever. The early Christians never had any official connection with any state, except through the courts during persecutions. Armenia claims to be the first country which was officially Christian, and that was not until AD 301. Not only did the early Christians have no opportunities for Christianising the state, they had no theology for it either. One of the rich contrasts between the ‘Old Covenant’ and the ‘New’ is that whereas the ideal Old Testament Jewish state was a theocracy ruled by God, the New Testament believers expected to live in a state which did not acknowledge their faith at all.

What proportion of a country would have to be personally Christian before the nation itself was Christian? In the New Testament’s theology, even 100 per cent would not achieve that. The notion of a ‘Christian nation’ is — in New Testament terms — simply a category error. How hot must we heat lead before it becomes gold? Or, perhaps, how much red must we add to blue before it becomes Tuesday?

Christians can, and should, be good citizens in any country. The number of ‘evangelical’ Christians in mainland China, to use the word with the meaning it had fifteen years ago (it now seems to mean little more than ‘enthusiastic’), is reliably estimated to be rather greater than the entire population of Great Britain. Christianity does not need state sponsorship to prosper.

Nonetheless, the claim that Britain is, or ever was, a ‘Christian’ nation, is simply one that neither David Cameron, nor even Henry VIII, was ever genuinely entitled to make.

In the words of Jesus, ‘my kingdom is not of this world’.

Happy Easter (and why ‘Good Friday’ is called good)

Happy Easter.

A friend recently shared a page on Facebook about how Easter was originally a pagan festival to the goddess Ishtar, which accounts for the chocolate and eggs.

Plausible, you might say, especially if you have read the Da Vinci Code.

Plausible, but entirely incorrect. I mention this, because it illuminates some of the difficulties modern people (or post-modern, if you like) have in coming to terms with something which is so radical that it is beyond revolutionary.

Let’s deal with the word ‘Easter’ first. Nobody really knows where it comes from. It enters written English through the Venerable Bede, who comments in De Temporum Ratione that the Anglo-Saxon name for the month we call April is Eosturmonaþ, which he presumes to be a month in which the pagan Saxons held feasts in honour of Eostre. This had entirely died out — if it ever occurred — by his time, and the Christians used the word purely as a fixture in the calendar.

Was Bede right? Probably not. Is this a fossil of a forgotten deity? Only possibly. Is this Ishtar? Not possible at all.

The problem with Bede’s account is that we have no other references to any such goddess in any extant Germanic mythology. Casual readers can get confused because they read about *Ostara in discussions by scholars. Starred forms, such as *tunaz as the origin of Dutch tuin (garden) and English town (originally meaning village), are reconstructed forms: they are not attested anywhere, and are merely etymological conjecture. Our knowledge of Anglo-Saxon deities is relatively limited, but names are quite well preserved, in the Chronicle, in poems, and in place names. Eostre is not among them.

If we look for parallels in North Germanic mythology, about which we have a wealth of knowledge, we find nothing at all. We might speculate that Eostre is the same as Freyja, but, really, that is idle speculation.

What about the identification with Ishtar? Not at all possible. Ishtar was an East Semitic goddess, not an Indo-European goddess at all. Although the iconography and stories of deities happily transmute from one culture to another, the names of the gods do not. Just as importantly, ‘Ishtar’ only sounds a bit like the modern form Easter. It sounds much less like ‘Eostre’, and even less like the putative reconstructions.

Why is this important? Well, to some extent it isn’t, but, in another sense, it is indicative of a widespread desire to explain Easter as something other than what it is: an exclusively Christian festival, though based upon the calendar of the Jewish Passover, with an exclusively Christian component: resurrection.

One might imagine that resurrection is a common theme in ancient mythology. After all, the desire to live for ever is well attested among the Greeks and the Babylonians. If Death is the ultimate enemy, one would imagine a whole range of beliefs about resurrection.

But this is not the case. If we consult Kittel, the seminal 20th-century Theological Dictionary of the New Testament, compiled predominantly by liberal German scholars who had no brief to defend orthodoxy, we find that the only usages of the word ‘anastasis’ — resurrection — in classical Greek were in phrases such as ‘resurrection is impossible’. One of the most famous Greek myths of them all, Orpheus and Eurydice, hammers home this point: you cannot bring someone back from the dead. The doctor-god, Asclepius, is described as raising Hippolytus from the dead, for which he was slain by Zeus. In other versions of the story, he is punished for similar crimes.

Even Asclepius does not resurrect. Gods in mythology can travel to the land of the dead and return, but this is only in the context of the land of death as a physical, though mythological, place.

We may look on the people of the ancient world as scientifically uninformed, but there were a couple of things they were more certain of than we are: babies are not born to virgins, and people don’t come back from the dead.

The resurrection of Jesus — a man who was dead, was put in a tomb, and two days later was alive again, without the intervention of any medicine or miracle worker — is, in any case, not presented as a mythological account. In Matthew, Luke and John (and Mark, if you count the disputed final section), and in all of the rest of the New Testament, the various writers are adamant that this was a physical resurrection, placed in the very physical and utterly brutal context of a Roman execution and a Jewish burial chamber.

A dead man came back to life.

People have been working for almost two thousand years to somehow explain this away. All of the explanations so far, however, have stumbled on their own illogicality. Resurrection may be a unique phenomenon, but the explanations are simply analytically false.

The centre-piece of the Christian faith is that its founder died, rose, and did not die again. Other people in the Bible were raised from the dead: two in the Old Testament, and several in the ministry of Jesus and in the rest of the New Testament. All these people, though, had relapses: they died again.

Atheists, naturally, dismiss all this as hokum. But that is to beg the question. One cannot simply dismiss an event which is presented as unique in all of human history and say ‘it is impossible’. Actually, what is more troubling is why modern science has not found the secret of reversing necrosis. Speculatively, resurrection ought to be extremely possible, but, aside from reviving people whose hearts have stopped beating, which is scarcely the same thing, death remains an obstinately final enemy.

Which brings us to Good Friday. Etymological speculation is that ‘Good’ is from the equivalent of ‘godly’ — a Holy Day. However, since we don’t have any other ‘good’ days in our calendar, this is little more than speculation.

Like all such things, no matter what its etymology, ‘Good’ Friday only continued to be used because it resonated with what people thought.

Christianity is the only faith which commemorates the death of its founder with a ‘Good’ anything. Good Friday was clearly one of the outstanding documented injustices of the ancient world. And yet Christians are resolute in calling it ‘Good’.

Why?

Simply, because without Good Friday, there is no Easter Sunday. Without death, no resurrection. Without Jesus suffering — as Christians believe — on a cross to save mankind, there is no Christianity, no saving, no hope of eternal life.

Of course, if death were all there was, then it wouldn’t be much of a Good Friday. It is the resurrection and Easter which hammer home the victory over death, the confirmation of every promise that Jesus made.

Happy Easter.

Spirituality is not dead, reveals poll

Theos, the Bible Society-backed think tank on faith in contemporary society, has conducted a poll which suggests that 77% of UK adults believe that “there are things in life that we simply cannot explain through science or any other means”. More interestingly, 61% of non-religious people agreed with the same statement. In other words, most people in the UK who describe themselves as non-religious believe at the very least in the ‘unexplained’.


This is a proper poll, conducted by ComRes, and you can look at the actual data tables here. 34% of people think that people’s thoughts can be influenced by spiritual forces, 27% that events in the human world can be, and 23% that events in the natural world can be. Only 25% believed that none of the above could be influenced by spiritual forces, and 22% said they didn’t know.


Those with higher university degrees, those with no formal education and those with only primary education were the most likely to agree that spiritual forces influence the world, though the ‘no formal education’ and ‘primary education’ groups were very small samples and carry little statistical weight. The group most likely to say that spiritual forces influence none of these things was those still in full-time education — 38%, compared with 26% of those with higher university degrees. In other words, belief in spiritual forces is not a function of simply not knowing very much. If this sounds like a big win for organised religion, it isn’t at all.


Opinion was mixed as to whether God was a universal life force (30%), a personal being (13%), or a higher spiritual being that can be called God (12%). The devil does rather better, on 14%. 39% of respondents believed there was a soul, 32% in life after death, 26% in heaven, 13% in hell, and 13% in the power of deceased ancestors. Just 13% believed that humans are purely material beings with no spiritual element, though 25% believed that humans are no different from any other animal; the twelve-point gap between those two figures suggests that roughly one person in eight believes that animals are spiritual as well. 12% said they didn’t know.


So where is all this spirituality going?


The question “Which, if any, of the following have you ever undergone?” threw up 39% for one or more of tarot, astrology, reflexology, reiki, having your aura read, healing with crystals or having an ayurveda session. Before anyone suggests that this means Britain is moving sharply from organised religion to the occult, consider that a parallel ‘ever’ question about attending a church service, a Christian wedding, a mosque or temple, or an informal Christian event like an Alpha Course or Christian Union meeting would most likely have produced figures approaching 100%. Asking the ‘ever’ question in a poll generally gives you the result you are looking for.


38% of people either tend to believe or believe strongly that prayer can heal people, against 50% who tend not to believe or don’t believe at all. Half of respondents (51%) believed that the impact of prayer was that it made you feel at peace, with 17% on each side for those who believed that prayer changes the world and those who believed that it does not work in any way. However, 21% of people pray a few times a week or even daily. Where does all this take us?


There is a myth, largely propagated by militant secularists, that the United Kingdom was once a fervently Christian nation, and is now essentially unreligious. Probably the simplest way to test attitudes for yourself over an extended period of time is the Google Ngram viewer. The Ngram viewer charts, in real time, something that could not have been countenanced twenty years ago: every use of a word across Google’s vast, though not comprehensive, collection of digitised printed books. Consider this one: faith, Christianity, Islam. I’m going back to 1800, because if you go back much before that, the change in the usage of words clouds the picture. What is interesting is that ‘faith’ was intensely popular in the 19th century, but ‘Christianity’, as a term, is about as popular now as it was then. You can do your own experiments to your heart’s content. Examining the corpus of English gets rid of all kinds of methodological difficulties, including the notorious bias in responses to surveys and, most importantly, the absence of surveys before the 20th century. True, the corpus has its own bias, since it correlates with education, but that is something we can take account of.
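For anyone who wants to run the comparison programmatically rather than in the browser, here is a minimal sketch in Python. It relies on the Ngram viewer’s JSON endpoint, which is unofficial and undocumented: the URL, the parameter names and the response shape shown below are assumptions based on commonly observed behaviour, and may change without notice.

# A sketch of fetching Ngram viewer data for 'faith', 'Christianity' and 'Islam'.
# Caveat: books.google.com/ngrams/json is an unofficial endpoint; treat the
# parameters and the response format below as assumptions, not a stable API.
import requests

params = {
    "content": "faith,Christianity,Islam",  # comma-separated search terms
    "year_start": 1800,                     # start at 1800, as in the article
    "year_end": 2019,
    "smoothing": 3,                         # moving average; the viewer's default
}
response = requests.get("https://books.google.com/ngrams/json",
                        params=params, timeout=30)
response.raise_for_status()

# The endpoint has been observed to return a list of series, each carrying an
# 'ngram' name and a 'timeseries' of yearly relative frequencies.
for series in response.json():
    frequencies = series["timeseries"]
    print(f"{series['ngram']}: peak relative frequency {max(frequencies):.3e}")

Nothing in the argument depends on the script: the same comparison can be made with no code at all, simply by typing the three words into the viewer’s search box.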


There have been historic periods of active Christianity in Britain — in the 1390s, when it was estimated that as many as one in three were Wycliffites or ‘Lollards’, in the Wesley revival in the mid to late 18th century, and again at the back end of the 19th century. In between, there have been periods of what secularists would call ‘advance’ and Christians would call ‘drought’.


Certainly when I was at school, the received wisdom was that Christianity was definitely on its way out, and had been since Matthew Arnold wrote Dover Beach, describing how the Sea of Faith which ‘was once, too, at the full, and round earth’s shore’ was now withdrawn, leaving Arnold and his fellows ‘here as on a darkling plain / Swept with confused alarms of struggle and flight’. And the mixed view of spirituality suggested in today’s survey results is certainly reminiscent of a darkling plain.


However, it is not remotely a spiritual desert. Life is there. People are enquiring. The great rationalist project to replace all forms of faith with scientific reason has not succeeded. Indeed, the victim of the false opposition between science and faith seems more likely to be science. Large numbers of people doubt climate change science, despite the fact that it is hugely attested and has overcome some of the best-funded hostile scrutiny of any scientific theory. Lots of people doubt the theory of evolution, without any corresponding belief in a form of creationism.


I cannot speak for other forms of religion, but Christianity helped to give rise to the modern world, and is no strange bed-fellow to science, to democracy, to feminism or to globalism. In Religion and the Decline of Magic, Keith Thomas carefully charted the way that it was the rise of coherent faith which was the principal cause of the collapse of folk-beliefs in magic. To some extent, it is these folk-beliefs, of an incoherent, uncoordinated supernatural universe, which are returning, as demonstrated by the Theos survey work. It is these, not Christianity, which are anti-scientific.


Likewise, long before any form of democracy that we would recognise today came into existence in British politics, non-conformists, especially Baptists, were organising their meetings on democratic principles. The influence of 17th-century Baptists on our form of government came, among other routes, through the rise of private enterprise in the 18th century, where the non-conformist work ethic and the business governance that proceeded from it gradually impinged on more exalted levels of society. As with any survey, people will make of the Theos results what they will.


Perhaps the best conclusion should be: it ain’t over.
