The Battle for Christmas: Who made it? Who owns it? Does it matter?

It is December, and the battle for Christmas is well underway. There was a marvellously provocative article in yesterday’s Daily Telegraph which claimed that all religion is delusion, but that abandoning Christmas is more or less the beginning of the end of life as we know it. I won’t link to it, as I don’t think it really had anything to say. Nonetheless, Christmas as an annual event becomes one of the most hotly disputed topics every year.

Key questions of 2015 include:

  • Aldi versus John Lewis: who wins our hearts and minds on TV (or will it be that German advert?)
  • Happy holidays or Merry Christmas? Does it matter what you say, and does anyone in the UK actually say ‘happy holidays’ anyway?
  • Should we insist that Christmas is a Christian festival, or should we (with a little more knowledge) simply say that it is a pagan festival which was for a while Christianised?
  • Did Coca Cola really invent Santa Claus? Should we stick with the more British ‘Father Christmas’?
  • What is the correct number of Christmas films to be shown per day during December? 15? Or is that substandard these days?
  • Will parents be allowed to take pictures at the Nativity this year?
  • Are we really going to have 30 days of snow and a White Christmas? Or will it just be early winter as usual?
  • Does any of this really matter? Wouldn’t we be better just not over-spending and over-eating this year?
  • And, finally, what does the story of ‘no room at the inn’ and the flight to Egypt have to say about our treatment of refugees today?

A little background: Christmas is not pagan, but it’s not really Christian either

Like probably everyone else, I was taught in junior school that Christmas was originally the pagan festival of the solstice, that the Christian version comes from the Roman Saturnalia, mixed in with the Old English Yule, and all the stuff about robins, Christmas trees, mistletoe, Father Christmas and so on is just paganism. Meanwhile, I was taught in Sunday School that the true meaning of Christmas was the baby Jesus, and everything else was pure commercialism.

A few years ago, I decided to look into the matter and get a firm grip on what it all really was.

First, bad news for the neo-pagans, good news for the traditionalists: there was no Roman festival on 25 December, or, at least, none until one was invented centuries into the Christian era, and, even then, there’s no actual evidence it was ever celebrated. The Roman Saturnalia, which did involve giving presents (though lots of festivals across the year did as well), finished on the 23rd of December. The festival of Sol Invictus, which was on 25 December, is first attested in 354 AD [1], in a Christian calendar. Not much is known about it, although the cult of Sol Invictus had been established in 274 AD by Aurelian. There was a rather more successful attempt to get the Christians and others to synchronise their dates with the introduction of Sunday as a day of shared worship between Christians and adherents of Sol Invictus [2].

As for Yule, we don’t actually know much concrete about the Anglo-Saxon festival of Yule, except that what we call ‘December’ was known as ‘Before Yule’, Ærra Geola, while our January was ‘After Yule’, Æftera Geola. The first real reference (and it’s Norse, not Anglo-Saxon) we have to an exact date comes in Snorri Sturluson’s Heimskringla, in the Saga of Hakon the Good, who is credited with moving Yule to the 25th of December to coincide with the Christian Christmas. Whether or not you believe that it was Hakon who actually did this, it seems evident that Snorri is curating a memory of the change of date of Yule, and giving it a historical explanation (a fairly standard thing for Snorri to do).

It’s Victorian…

Second, good news for the traditionalists, bad news for the ‘Christ-only Christmas’ people: Christmas as we know it isn’t especially Christian either, nor is it especially old. Check the dates on most of the carols we sing, and you will see that they are generally 19th century. Some are older, and a very few, including O Come O Come Immanuel and Personent Hodie, are descendants of medieval hymns. Snow, robins, and songs putting the two together are Victorian. In fact, our endless fascination with a White Christmas largely comes from Victorian illustrations. The turkey (or goose), the stuffing, the trimmings and even the Christmas tree are all Victorian.

A word on Christmas trees. My wife, being Frisian, was taught in school that when Saint Boniface came to Frisia in 716, he chopped down a pagan sacred tree, declared that it was now a Christmas tree, and so began the whole Christmas tree thing. According to a more or less contemporary account, Boniface did indeed chop down a pagan sacred tree, but it was an oak, not a conifer, and no mention was ever made of Christmas trees. They were in fact brought to these shores by Prince Albert, and, as far as we can make out, were something Protestant Christians started to use during the Reformation, partly to identify their celebrations as distinct from Roman Catholic ones.

Third, the ‘Christian’ Christmas we celebrate is not all that close to the accounts given in Matthew and Luke’s gospels. As most people these days know, the ‘three kings’ weren’t kings, and they weren’t three. The Gospel account is that wise men from the East brought gold, frankincense, and myrrh. At what point someone decided that three gifts = three givers, and expensive gifts = kings, is not known, but the carol ‘We three kings of orient are…’ has an awful lot more to do with popular conceptions than the original accounts do. Likewise, there’s no particular reason to believe that the birth of Christ, as described in the Gospels, was in the winter. Even if it was, this was Palestine, not North Yorkshire, and it would not have been ‘bleak’. Christina Rossetti’s wonderful carol ‘In the Bleak Mid-Winter’ contains some lovely sentiment, and some rather good theology in the final verse, but earth would not have stood ‘hard as iron’ nor ‘water like a stone’. Snow, almost certainly, had not fallen ‘snow on snow’.

Our nativity plays make much of Joseph and Mary wandering round Bethlehem, being turned away at each inn and finally giving birth in a stable. All the Gospels tell us is: “She [Mary] wrapped him in cloths and placed him in a manger, because there was no room for them in the inn.” [3]

Christmas carols and nativity plays conjure up powerful images of what Christmas meant to the Victorians, and represent a strong imaginative response to ‘what it would have been like if Jesus had been born somewhere near Warwick in 1834’. The people in our nativity plays may have tea-towels round their heads, but what we have is a fundamentally British reimagining.

Not surprisingly, very few of our traditions are shared elsewhere, except in the USA. I was astonished, when I first lived in Belgium (though I am now astonished that I was astonished) to discover that almost nothing of what I thought was ‘traditional’ Christmas existed in the French or Dutch speaking worlds. People did sing carols, but they were largely different carols: Il est né le divin enfant, not Hark the Herald, or Ere Zij God, not Once in Royal David’s City. The festivities were early in December, for Sint Niklaas. Christmas celebrations were held on Christmas Eve, not Christmas Day. Turkey? No. Stockings? No. Father Christmas? Absolutely not. In fact, I once tried to arrange a bit of Christmas promotion by churches in the city of Ghent using Father Christmas. A number of people took me on one side and told me that this was not appropriate, though I think they were more worried about an Anglo-Saxon invasion than a pagan one.

A British Value?

Aside from queuing and not complaining in shops, Christmas probably qualifies as one of the few genuinely British contributions to the world. British Christmas, that is: others have their own Christmases, which are quite different, and sometimes on different days. It’s now been amalgamated with American Christmas, which brings in Santa Claus instead of Father Christmas, reflecting Germanic input into that melting pot of nations. Coca Cola is widely credited with inventing the modern version, though all it really did was popularise one colour scheme. There are plenty of Victorian illustrations of Father Christmas aka Santa Claus in red, although he is sometimes seen wearing green and other colours.

British Christmas, though, is a very instructive cultural phenomenon for us to look at. Trees, decorations (frugally saved from year to year), lots of hot, fatty, plentiful and relatively inexpensive food, carol singing from door to door, giving of (originally) inexpensive gifts in bright wrapping, cards (a legacy of the rise of Royal Mail). Our Christmas is the very best of the Victorian age, frozen in time, and upgraded with ever more expensive gifts, dramatic light shows, and the inevitable email round-robin letters. Christmas was a good time to be ‘poor but happy’, though, as often as not, it has transmuted into a time to be substantially less poor, and significantly less happy. Retailers rely on us overspending at Christmas to balance their annual books. Many people will face a financial slump in January. Parents who, for whatever reason, are not able to spend lavishly, feel guilty and worried that they are not doing right by their children.

One piece of good news should allay these rather sombre thoughts. Contrary to popular opinion, suicide rates do not rise particularly at Christmas. On the other hand, Accident and Emergency departments are these days invariably overloaded, and that trend is getting worse, not better.

Time to celebrate Christmas like never before, to recover its original meaning, or to just let it go?

To read the papers (and the Facebook memes), it’s either time to celebrate the quintessentially British Christmas like never before, or else to strip it back to Jesus in the manger (with an indeterminate number of wise men who arrived long after the manger episode was over), or to abandon the whole thing.

Which should we do? Which can we do?

In one of its more infamous interludes, Birmingham City Council announced a few years ago that it was doing Winterval, rather than Christmas. This has been spun and respun by press officers and commentators so many times that many people believe it was an urban myth. However, I was living in Birmingham at the time, remember it well, and can confirm that this really did take place, and it really did cause an outcry — especially when it was put about that this was in order not to offend Muslims and Sikhs. I recall that a group of ethnic minority community leaders wrote to Birmingham City Council to point out that they were not offended by Christmas at all, and to request that they be kept out of it. I followed the story for several weeks in the Birmingham Evening Mail (as it then was), the Birmingham Post and on the radio. I’m sure it got spun well above what it was ever meant to be, and Birmingham City Council later stated that it had never been meant to replace Christmas, but merely to brand all of the winter activities together. Nonetheless, the attempt did not go down well.

In a sort of parallel-universe version of this, I’ve been getting memes on Facebook for months now (literally, since September) asking me to ‘Share if you say “Merry Christmas”. Don’t give in to “Happy Holidays”.’ To the very best of my knowledge, no one in Britain has ever said ‘Happy Holidays’ to me, nor have I ever been criticised for saying ‘Happy Christmas’, though I do remember a fight about corporate Christmas cards once, and what they should say. ‘Happy Holidays’ is an American phenomenon, and reflects the separation of church and state, rather than any take-over by leftists, atheists or people from non-Christian religious backgrounds.

The lesson of this, though, I think is that Christmas is here to stay. Even if the church abandoned it, John Lewis, Lidl, Aldi and the others would keep it going through their ever more elaborate advertisements. What Coca Cola understood years ago (and they are the ones with the ‘Holidays are Coming’ rather than ‘Christmas is coming’ advertisements) is that if you can associate your products with the festive season, the available profits are vast.

Should hard-line Christians, then, go on marches demanding that Christmas® is © Christ, and should no longer be used for any secular purpose? I’ve heard a lot about this from atheists, but relatively little from Christians themselves. Churches are generally too madly busy organising carol services, carol singing, Sunday School parties, Christmas collections and street events to have any time to do any protesting. Of course, if anyone does want to have a bit of a rant, there’s usually airspace for them, simply because newspapers need controversy. “Christmas going to be great again, says Vicar” is hardly going to make the front page.

There’s a rather good meme going around Facebook at the moment, though, sadly, it is barely more accurate than the run-of-the-mill memes you usually get. It’s about the irony of celebrating two Palestinian refugees trying to find shelter two thousand years ago, while not doing anything about the refugees on our doorstep today.

It’s factually incorrect: Joseph and Mary weren’t refugees at the time, and, notwithstanding the carols and Nativity plays, there isn’t much about them finding shelter. However, Joseph, Mary and Jesus did go on to become refugees a couple of years later, when they fled to Egypt. The meme could do with a bit of refining, but the underlying premise is sound: it is ironic that we should be celebrating Peace on Earth, Goodwill to all Humans while being unwilling to welcome in refugees who are shivering in Calais as winter closes in. This is really not a political point, unless we politicise it ourselves. Notwithstanding the posturings of Donald Trump and others on the American right, the Gospel of Jesus Christ is about light in darkness, loving enemies, and undeserved forgiveness.

I love Christmas, and will celebrate it in many different ways this year. Christmas, though, is the beginning but not the end of the Christian hope. One reason why the origins of Christmas as a festival are a little bit obscure is because the early Christians put so much more emphasis on Good Friday and Easter. The baby in the manger is good will and great joy, but it is the man who gives his life for others, defeats death and returns that is the essential, core and unmistakeable centre of Christian belief.

Footnotes

  1. In the Chronography of Philocalus
  2. Codex Justinianus 3.12.2
  3. Luke 2:7, New International Version

Good questions, weak answers? A review of the report of the Commission on Religion and Belief in British Public Life

Living With Difference, the report of the Commission on Religion and Belief in British Public Life, gives an excellent overview of the changing shape of faith in the UK, and asks some very pertinent questions. However, the answers it offers are, for the most part, neither particularly new nor particularly radical, and it is difficult to see how much difference they would actually make, if implemented.

These are my conclusions on the report.

I have to say, it’s an enjoyable read and I commend it to anyone who has an hour to spare and wants a nuanced view of the changes in Britain’s religious life and the questions that it throws up. In an area of debate famous for its tub-thumping, Baroness Butler-Sloss’s report consistently steers a middle path. However, in doing so, it fails to offer answers to the questions it asks, and instead provides answers to questions it did not ask.

The conclusions, which cover education, media, (inter-faith) dialogue, (social) action and law, have been at least partially covered in today’s press: broadening the representation of faith groups in the House of Lords, abolishing the requirement to have an act of worship in schools, introducing a statutory entitlement (whatever that is) to a broad-based belief subject in schools, and a recommendation that, in framing counter-terrorism legislation, government should seek to promote rather than limit freedom of enquiry, speech and expression.

However, the 37 recommendations, when taken together, fail to answer some of the key issues raised in the report itself. How do we balance the hierarchy of rights which, according to some contributors, puts religious freedom and conscience at the bottom and, according to others, puts it right at the top? How can religious groups be resourced to engage in all the dialogue that is now mandated on them, given that, by the report’s own admission, numbers and resources are shrinking? What practical steps can be taken in terms of press literacy on religion (the 7 recommendations here do not really answer the question as the report poses it)?

More importantly—and ever-present between the lines, though rarely emerging into the open—what steps do you take to change societal attitudes as a whole, and what changes should there be?

It seems to me that this is the crunch question which the report introduces, underpins with good research and much dialogue, and then does nothing about.

Let me say that I am not really against any of the 37 recommendations. Abolishing a compulsory act of worship in schools and replacing it with something more inclusive is not going to radically change anything. I would welcome hearing Richard Dawkins occasionally on Thought for the Day. I don’t agree with the House of Lords, but, if we have to have it, having religious representatives from a broader background is, I think, a good thing. Offering non-religious chaplaincy in hospitals, prisons and higher education would seem very reasonable.

And yet, and yet: we could do all these things, and not come up with a society which was any more tolerant of religious difference, any more generally informed about its varying strands of belief, and any better at steering people away from radicalisation and extremism.

Radicalisation and extremism, are, of course, the hidden terms in the report. They are referred to sparsely in the text, though more frequently in the bibliography. On the one hand, we are aware that some people are being drawn into patterns of thought which make them amenable to taking violent action. On the other hand, it is evident that large numbers of people who have no association with extremism, no tendency toward violence and no support for radical groups are being treated by mainstream society with suspicion and even ill-will. The report does discuss the poor use of language in the media which can exacerbate this. But it proposes no suitable remedy, beyond broader awareness among journalists.

I don’t want to be too harsh on the report. It brings together a vast amount of research, gives fairly authoritative access to lots of surveys which are otherwise hard to track down, has clearly been carefully checked (and is therefore a much better place to start a discussion than most online resources) and has been written without even a sliver of axe being ground.

To some extent, I feel it would have been better as an interim report, without the recommendations. Opening up the House of Lords and Thought for the Day may seem radical to some, but for the vast majority of people in this country, they won’t make a great deal of difference. I wonder if, at some points, some very good ideas have been watered down.


To me, the most important recommendation, though it doesn’t go nearly far enough, is that of a

“statutory entitlement for all schools within the state system for a subject dealing with religious and non-religious worldviews. They should establish content and learning objectives that can be flexibly applied by teachers, allowing the minimum requirements to be built on differently by different schools. The content should be broad and inclusive in a way that reflects the diversity of religion and belief in the UK, and the subject should have the same status as other humanities subjects.”

I sense that somewhere in there are the remnants of a much bolder and directly applicable proposal.

Let me unpack that for a moment.

I’m a committed Christian, and I went to a notionally Christian secondary school, which also happened, at the time, to be the most academically successful in Britain. However, even as a committed Christian, I found Religious Studies to be deeply dull. It gave me no new insights about my own faith (for which I do not blame my teachers at all), and very little about anybody else’s. I did learn about Utnapishtim, but I didn’t learn anything about Mohammed. I was entirely in the dark about the Pillars of Islam, about notions of Karma, and about the moral philosophy of John Stuart Mill and Immanuel Kant.

This is in no way a criticism of my school or my teachers. Having discussed the issue with teachers and pupils over many years, my sense is that this is not particularly unusual. There have been a number of attempts to reform religious education since. However, based on my conversations with Religious Studies teachers over the last couple of years, I sense that the consensus is that they have not yet achieved what they need to.

As written, I’m not sure how much better the recommendation above would be. However, what I think it tries to say, or someone at some point tried to say through it, is that we should be guaranteeing every young person a level of religious literacy similar to their level of ability in English and History. I would personally go beyond that, and say that we should be guaranteeing young people religious and cultural literacy. (I am not using the word ‘culture’ here as a euphemism for ethnicity.)

It seems to me that in today’s world, all non-Muslims need to be given a sympathetic understanding of Islam: not only its theology, but also Islamic cultures, and how they respond to Islam in different ways. In terms of Christianity, I think everyone (and especially people who are from a Christian background, and therefore imagine that they understand it) needs to understand the inter-relation between Christianity and culture in the USA, in Russia, in South America, in sub-Saharan Africa, and in China. For people from a UK Christian background, it is far too easy to assume that Christian=British (while simultaneously mocking Americans for making the same mistake). The same should also apply to Hinduism, Buddhism, Shinto, Judaism and various kinds of humanism.

The purpose of this should not be to study religion in order to become an expert on it, but rather to become religiously literate in a society which requires us to navigate increasingly complex currents of thinking.

I understand why such an attempt is easy to water down. At one end of the debate, there will be people who insist that Christianity is the dominant religion in Britain, and therefore should get the lion’s share of Religious Studies time, while at the other end there will be people who regard all religion as a form of delusion, and something which should therefore be banned from school premises.

The ultimate conclusion of the report — I think, if I may give my own summary of its summaries — is that Britain is no longer a ‘Christian’ nation (if it ever was, of course), but neither is it a secular nation. We need to develop our structural response to religion and public life in the light of that.

As a conclusion, it is hard to argue with. However, something firmer and further reaching than the report’s recommendations is required. More importantly, someone needs to have the task of taking action. Otherwise, like so many reports, it will fade like the proverbial morning mists.

One century of genocides: the anniversary that no one wants to celebrate, but no one should forget

I’ve refrained from writing this article for almost the whole of 2015, but now it’s time. Perhaps I should have written it in April, but I was hoping that someone else, better qualified, would write it. Perhaps they did, and I missed it.

On April 24 1915, a massacre of Armenians began which ushered in a century of genocides. The word ‘genocide’ had not yet been coined, but when Raphael Lemkin did coin it, in 1943, he specified that this, like the Holocaust, was a genocide. Since Hitler’s genocide, we have seen the Cambodian genocide, the Rwandan genocide, the Darfur genocide and the former Yugoslavian genocide. Unlike the Armenian genocide, which has never been formally acknowledged by the United Nations, and which Turkey still maintains was not a genocide, these others each prompted, rightly, International Criminal Tribunals.

One century. Six genocides — not including the actions of Daesh against the Yazidis and other tiny people groups in their own territories. We have also seen similar actions against Kurds, though these have not yet led to international criminal tribunals. In Turkey, it remains a prosecutable offence to claim that the Armenian massacres actually were a genocide. It seems unlikely, therefore, that Turkish actions against Kurds will ever be acknowledged.

I am not here trying to point the finger at Turkey, or Cambodia, Sudan, Rwanda or Germany. Far from it. The kind of thinking that suggests that particular nations or ethnic groups are more guilty of genocide than others, or more prone to it, is exactly the kind of thinking which puts us on the path to genocide. Indeed, in Rwanda, from my memories of reading transcripts of some of the court cases that followed, many people who took part in it said that they felt it was ‘us or them’. It was human beings who committed genocides, people like us. Some of the Rwandan transcripts made me feel keenly just how much like us they were.

The Armenian genocide should not have come as a surprise to the West. Indeed, Gladstone, in his last public speech, January 1895, urged Britain to take note of the plight of the Armenians. The Sultan demanded an immediate disavowal by the British government, and the Earl of Kimberley, Secretary of State for Foreign Affairs, responded immediately to assure the Sultan that this was purely a private speech [1], and did not reflect Britain’s view. Twenty years later, the genocide began, with many of the practices subsequently used by Adolf Hitler against the Jews.

Hitler was not merely unconsciously echoing what had happened to the Armenians. In August 1939, Hitler wrote [2]:

“…Accordingly, I have placed my death-head formations in readiness — for the present only in the East — with orders to them to send to death mercilessly and without compassion, men, women, and children of Polish derivation and language. Only thus shall we gain the living space (Lebensraum) which we need. Who, after all, speaks today of the annihilation of the Armenians?”

Monument to Soviet soldiers, Armenia


While the world has largely forgotten the Armenian genocide, Armenians have not. In Armenia, the landscape still contains many statues to Soviet soldiers. This surprised me, when I was there. When I asked local people about them, it was clear that they still regarded the USSR as the great deliverers from oppression.

There is, of course, no comparison before or since with Hitler’s genocidal passion. As his letter about the Poles indicates, his hatred was not just for Jews, but for Poles, Soviets, and anyone else who did not fit into his narrow category of worthwhile humanity.

Nonetheless, just like the Armenian genocide, the rest of the world should have seen the Holocaust coming. Mein Kampf, which is being republished, did not in any sense attempt to cover up what Hitler intended. Admittedly its first substantially distributed English translation — the one widely available at the time in the USA — had been purged of the parts which would not play well with Anglo-Saxon audiences. Even so, it should not have required a great deal of effort — and remember that resurgent Nazi Germany was already establishing itself as a power in Central Europe, and therefore should have attracted the attention of diplomatic and intelligence linguists — to analyse in the original.

I remember the post-Yugoslavian genocide unfolding in the press and on the news. What astonishes me now is that I, like many others, was far too quick to assume that there was probably right and wrong on both sides, that things would sort themselves out, that it was all terrible, but not a great deal could be done about it from our end.

In retrospect — and having read George Orwell’s review of Mein Kampf, in which he describes popular British responses to Hitler in the 1930s — I recognise now that those are exactly the conditions in which genocide flourishes. We assume that things cannot be as bad as they seem, we prefer the less extreme (and highly propagandist) accounts issued to counter the hard news, we want to let sleeping dogs lie, we worry what the impact of intervention will be, we find out far too late that all the men in a city have been rounded up to be executed, that a new term, in that case ‘ethnic cleansing’, is really no different from the old term. We wring our hands again, and wish that we had acted.

I am not going to apply this directly to Daesh and to Britain’s decision to join the bombing. I disagreed with that decision, as I have written elsewhere, while accepting that I did not have all the facts. We cannot use the possibility of genocide as a blanket excuse or reason to intervene in every foreign emergency.


Gregory Stanton, president of Genocide Watch, has put forward a theoretical framework of eight stages by which genocide progresses [3]. His analysis follows the pattern: classification, symbolisation, dehumanisation, organisation, polarisation, preparation, extermination and denial. Some have argued that it does not seem to apply particularly well to all genocides, and it does not seem to help us in predicting where they will happen next.

What is perhaps more useful to us is to recognise the failings and omissions of which we ourselves are guilty, in allowing genocides to progress. Since the Kosovan crisis, Western powers have accepted a military dogma known as ‘R2P’, or Responsibility to Protect. We accept that it is no longer the case that our moral responsibility and authority extends only as far as defending our NATO allies — which is the classic international military and diplomatic position — but that it now goes beyond to protecting those unable to protect themselves. And yet, often, we have not acted.

The eyes of the world are now on Daesh. They should be, but why have we done nothing about DR Congo? The situation in DR Congo is so bad that it could almost be referred to as an ‘Omnicide’, where everyone is attempting to exterminate everyone else.

I don’t want to chart out the stages by which Western nations permit genocide to take place. However, I do want to put forward five ways in which we as individuals in Britain can better help to prevent them.

  1. Engagement with world affairs. Britain is a democracy. Government listens when enough people start lobbying. Mostly, though, we only engage with places we’ve been to or might consider visiting. There was a fair degree of criticism on Facebook when many people (I was one) changed their profile picture to one with a French flag after the Paris massacres. Why had we not done so for other, much larger massacres? It was a valid point, though I still stuck with the French flag. Gladstone engaged with the plight of the Armenians in 1895, but his own supporters and followers did not continue the pressure. If they had, it might never have happened. Whether that would have had a substantial effect on later genocides I cannot say, but it would certainly have been infinitely better if it had not happened. We have better, cheaper and easier access to information about the world than ever before. We need to make use of it.
  2. Investment in international journalism and broadcasting. The BBC World Service is once again under threat: why should Britain pay for the world’s radio? Yet in many countries, the BBC is still regarded as the most authoritative source of news. When we invest in international journalism and broadcasting, we hold up a mirror to a wider world, as well as assisting our own engagement. People locally are in a much better position to speak out if they know what is happening; if a genocide is genuinely being contemplated, their own state-sponsored broadcasters will already be doing their best to suppress the news. Many of the people who participated in the Rwandan genocide said afterwards that they did not realise or understand the full scale of what they were taking part in. Would better journalism and broadcasting have helped? Almost certainly it would have, at least to some extent.
  3. Stamp out misinformation on social media. According to recent research by Nielsen, 4 people are now more likely to believe online opinions than traditional news. I’ve written elsewhere about the dangers of meme-sharing. Urban myths, of course, will always propagate. Nonetheless, there are particular kinds of myths that work to demonise particular people groups. Alongside the disgraceful memes alleging that all central banks are owned by the Rothschild family (really a coded version of the old ‘Jewish Conspiracy’ myth), we are seeing a growing number of memes demonising refugees and asylum seekers, based on supposedly accurate information. Whenever that information is checked, it almost always turns out to be wildly untrue. That doesn’t matter, though, if the people sharing it do so on the grounds that ‘there must be something in it’. Yet it was by exactly these mechanisms (or their pre-internet versions) that sympathy with what was happening in Nazi Germany spread. Britain would probably never have engaged in its own persecution of Jews, mainly because it did not have the same proportion of Jews living here (a result of earlier persecutions), but antagonism towards Jews, and sympathy with the Nazi view, meant that Britain did too little, too late, until it was almost too late.
    We need to be better committed to fact-checking and getting to the bottom of things. Copies of ‘The Protocols of the Elders of Zion’ were in wide circulation during the 1930s. To many people, they seemed to be genuine, documentary proof of the Jewish Conspiracy. It would not have taken anyone a great deal of thought or research to discover that they were a hoax. However, most of us are all too happy to believe a hoax if it appears to have a document behind it and it suits our prejudices.
  4. Move beyond narrow ‘British interests’. The job of the Foreign and Commonwealth Office is to look after Britain’s interests in the wider world. But what are Britain’s interests? The current mood in Britain is that they are narrow notions of what is good for Britain’s economy, good for our internal security, and good for keeping us at the top table. Ultimately, the Foreign and Commonwealth Office will pursue the agenda set by government. And who chooses the government? We do. We could argue that only a third of the electorate actually voted for the current government. Even so, everyone involved in politics, journalism or even local church-hall debates helps to set the agenda. If MPs, during election time, hear a thousand times that the price of butter is too high, and once that Britain is not doing enough to stop Boko Haram, they will go into the next parliamentary term determined to lower the price of butter. No matter how much sympathy they have with stopping Boko Haram, this is unlikely to progress beyond signing an Early Day Motion, which is really parliamentary graffiti. Amid all the online grass-roots pressure groups that have sprung up to demand this or that, I have yet to see one that insists Britain do more to protect the most vulnerable people groups from destruction.
  5. Abandon the right of selfishness. Selfishness is not one of the UN’s fundamental human rights, nor is it enshrined in the Human Rights Act. But it might as well be, because, in almost all debates, ‘looking after our own’ is presented as an unassailable and fundamental right. No one even bothers to argue with it. It is a clincher on television, on radio and on the doorstep. There is, of course, nothing wrong with looking after our own, but that is not what the phrase is used for. Facebook is now flooded with memes demanding that we do more for homeless ex-soldiers rather than refugees; before the refugee crisis, there was relatively little interest in helping these homeless veterans. ‘Charity begins at home’, the old version of ‘looking after our own’, was never used as an encouragement to be more charitable locally, merely as an excuse not to intervene internationally.
    When we turn back refugees at the border, or make them wait in migrant camps in Calais, or demonise them in the popular press, we send back the most powerful signal to those who persecute them: the West does not care, you may do what you like. Driving people out of a country to die as refugees on the journey was a feature of the Armenian genocide, and also a feature of the Nazi genocide. More Armenians died fleeing persecution than were actually killed in the persecution.
    The moment that we put our own comfort, convenience and prosperity above the needs of those fleeing for their lives, we, too, have become complicit.

My prayer — and I do mean prayer — is that as the 100 years of genocides comes to an end, the genocides themselves will come to an end. I understand that there will always be atrocities. There will always be people angry enough, or selfish enough, or heartless and calculating enough, to commit massacres. Nonetheless, if, for once, we could learn one lesson of history, let it be this: never again.

Footnotes

  1. See this newspaper article
  2. The letter was presented in evidence at the Nuremberg Tribunal as Exhibit USA-28. A copy is available here. Note that Turkey disputes its authenticity.
  3. Helpfully summarised on Wikipedia.
  4. Report here

Rights, right and might: unravelling the Syria vote

I don’t agree with the decision to bomb Syria, and I probably never will. I do continue to support Tim Farron as leader of the Liberal Democrats. At a time when Lib Dems are sharply divided over what the vote should have been, I want to unpack for a moment the democratic mandate for it, and why I believe that, although it was not (in my view) the right decision, those that made it had the right to decide as they did.

In the dialogue Protagoras, the philosopher Socrates explains why he does not believe in ‘sophists’. The sophists he attacks were professional wise men, pundits whose wisdom could be taken as authoritative. His argument was that, while in most matters we call in an expert in the relevant field, when we make major decisions for the state, the entire assembly decides together, and each person gets to make his speech.

Socrates is describing a Greek city-state where all adult, free men (thus, in fact, quite a small proportion of all adults) form the assembly, and the assembly decides together what should be done.

Our democracy is substantially more complex than that, and for very good reasons. However, the underlying point still stands: in great questions of the state, no one individual, nor group of pundits, is wise enough to answer the questions alone. It is in debate that we find our way forward.

One of the most fundamental requirements for a worthwhile debate is that there is at least the possibility of people changing their minds. If we are not open to persuasion, then we are not part of democratic debate.

However, for those watching from the sidelines — especially for those actively supporting one of the groups within the debate — this means that there is a real possibility that the people we supported, backed and worked to elect will make choices that we disagree with. They may even make choices which seem to us utterly wrong and nonsensical. Nonetheless, without the possibility of this, there is no debate, and there is no democracy.

A debate and a vote on their own is insufficient for a thing to be democratic. There are three tests which should always be applied. First, were the debate and subsequent vote conducted fairly, according to previously agreed processes which do not prejudice the outcome? Second, did the people voting have the authority to do so? Third, was sufficient time for the debate given, and sufficient information to the debaters for the decision reached to be an informed and considered one?

This third point is one which frequently baffles observers of our parliament. The most common way for a private member’s bill to fail is for it to be ‘talked out’ — the debate goes on to the limit of the allotted time, and there is no time left for the vote. In any kind of Board or executive meeting, the chair would have drawn matters to a close so that a vote could take place. In a policy debate at a Liberal Democrat party conference, once the time for the debate was finished, a vote would take place. However, the parliamentary rule is that sufficient time must be given for members to express their views. We can argue (I think sometimes correctly) that this is abused when MPs opposing an otherwise popular motion (or popular with those present and likely to vote) make extended, long-winded and only marginally topical speeches for this purpose. Nonetheless, the alternative, that business could be rushed through and ill-considered decisions made in haste with unexpected consequences, is worse.

The second point was the crux of Liberal Democrat opposition to the second Iraq war. For better or for worse, the recognised international authority to engage in acts of war, except under direct attack by an enemy, is a United Nations mandate. No mandate was ever given for the second Iraq war. Therefore, irrespective of whether, from a utilitarian perspective, the war would result in more lives saved or more lives lost, the British government did not have the legal right to go to war.

This in itself raises all kinds of questions about national sovereignty. However, since Britain has long championed the international rule of law, we are right to ask very serious questions not only about whether we were right to go to war for a second time in Iraq (a question which should have been debated in parliament) but also about whether we had the right (a question to be settled by legal processes before the decision, not after it).

In the current crisis, there is an emotional response on the second point as well. We (that is, a bit more than 50% of Liberal Democrats who voted in the leadership election) elected Tim Farron, therefore should he not pay more attention to what we think? Of course, we recognise there are divided loyalties on this issue. Six Liberal Democrat MPs voted to extend air strikes into Syria, two did not. Each of those MPs was elected not by the Liberal Democrats (though they were each selected at some point by a majority of local party members) but by their own constituents. Should they therefore not be polling their constituents, and us, to find out what they should do?

There is no realistic way of polling Liberal Democrats. We certainly aren’t going to hand over our membership database to YouGov, and we lack the polling machinery to conduct our own poll, though we do conduct surveys from time to time. 1 Where polls of Liberal Democrats have been published, they are polls of people who self-identify to the pollsters as Liberal Democrat supporters. Constituency polling is equally fraught, but for different reasons. In the recent and infamous Sun poll, one in five Muslims were alleged to have sympathy with jihadis. However, as Survation, the company which conducted the polling, pointed out, this was a complete misrepresentation of its results. Further reflection showed that the question was meaningless in that regard, because ‘sympathy’ does not distinguish between ‘feel the same way’ and ‘feel sorry for’.

The only legitimate way to find out what the public really thinks is a referendum. Referendums are core to Swiss democracy, which much more closely resembles the demokratia of ancient Athens, but they are generally foreign to British democracy. They fall foul of test number three: sufficient time for the debate, and sufficient information available to the debaters. In the Scottish referendum, the opposing sides presented wildly differing indications of the economic impact of secession. We can have our own view of which side presented the more accurate information, but most voters were forced to make up their minds based on whom they chose to believe: actual, established information, authoritatively condensed to a level that an educated and interested lay person could follow, was not available. This is not a problem particular to the Scottish referendum, but one with referendums in general, and it is one of the main reasons for having a representative, parliamentary democracy.

Such things are particularly the case with a decision to go to war. Without considering the impact of secret intelligence, such as was used to draw up the ‘Dodgy Dossier’ of the second Gulf War, parliamentarians have access to a range of political, military and strategic analysis which it is their full time job and duty to digest. Some of this comes out in the public debate, but all of it informs the debate—much of which takes place informally before the parliamentary debate begins.

The reason that a representative, parliamentary democracy is better than a universal, referendum-based democracy, is that we employ the parliamentarians to invest themselves in the proper information so that the debate can be legitimate—and only those involved in the debate (by virtue of being present in the chamber) are given the opportunity to vote.

Not in my name

But this brings us to a problem. If, when we elect an MP, or a leader of a party, we hand over the authority to make a decision with which we fundamentally disagree, what then? The decision to bomb Syria was not one to be taken lightly by anyone. There will be people at either end of the spectrum whose positions are so entrenched that there is little likelihood of them shifting, but in the centre there was a significant number who could have gone either way—so much so that, last weekend, the government did not believe it had enough votes to win. Among Liberal Democrat supporters, one poll put support for bombing at 46%, though that was a snapshot of one particular moment. Are the 54%, or whatever the figure of those opposed was or is, now tarred by that decision? Should people who voted for Tim Farron but are deeply opposed to bombing Syria now reconsider their membership?

No decision that parliamentarians make is made in our name. It is made in their own names, which is why it is the name of the individual that goes on the ballot paper at a General Election, and only subsequently (and relatively recently) the name of the party for which they stand.

If Tim Farron had stood on a platform of ‘tough action in the Middle East’, but had garnered votes from people opposed to that kind of rhetoric, they might well need to go through a period of soul-searching. I have previously urged people who voted Conservative, but simply did not believe some of the commitments to further austerity that the Conservatives made, and now regret it, to go through exactly such an exercise. On the other hand, if Tim Farron had stood on a platform of ‘bombing never solved anything’, then there would be legitimate reasons to call for his resignation, or, at least, to trigger a leadership contest.

Neither is the case here. What parliament has collectively chosen to do is a matter for the conscience of each person who cast a vote. It does not make us complicit, unless they promised to do what they have now done before we elected them.

Rights, right and might

I still don’t see how adding Royal Air Force capability to the existing forces bombing Syria improves the situation, nor what realistic and effective strategy we have for dealing with Daesh (ISIS).

To some extent, I don’t expect to. I recently had a conversation with an analyst about the military situation elsewhere. I was staggered, as I always am by these conversations, to learn how little my knowledge of the situation — garnered from what I hold to be an intelligent and informed approach to media reports — tallied with what was really happening. To be fair, and as he told me, all the information was out there, in the public domain, freely available. I had simply not considered the implications of the facts. Equally, I wouldn’t expect our military strategy for Daesh to be made common knowledge, as that would invalidate it as a strategy.

I am persuaded that Parliament — including Tim Farron and five of his Liberal Democrat colleagues — had the democratic right to make the decision it made yesterday. Crucial to this is that there was already a UN mandate, and that a proper debate took place. Neither of these was the case for the second Iraq war.

I am not persuaded that they made the right decision—though I accept that either decision would have had far-reaching consequences which could not be known or calculated beforehand.

If the Liberal Democrat MPs had made a decision which I believed they had no right to make, then I would have felt obligated to withdraw my support from them. Because they made a decision which they had a right to make, though not the decision I favoured, I will continue to support them.

We now move on to the critical question, which can only be answered by events. Will the extension of British military might (and Britain is, by many assessments, still the 5th largest military power in the world) save lives or destroy them? Will we properly accept our responsibilities, already laid out in the international protocol on refugees to which we are signatories, to play a full part in sheltering those displaced by this war? Will yesterday’s decision have an impact on our peaceful existence on these islands?

We wait to see.

Footnote

  1. A properly conducted poll uses a carefully constituted sample, weighted so that it gives a statistically valid indication of the result. A survey is a much broader questionnaire, in the Liberal Democrats’ case typically sent to all members, and reliant on who chooses to respond. Despite the much larger numbers involved, a survey is far more prone to response bias: the artefact of giving more weight to those interested enough to reply.
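    The distinction in this footnote can be illustrated with a toy calculation. The sketch below is purely illustrative (all names and numbers are invented, not drawn from any real poll): an unweighted survey-style average inherits whatever bias the respondents bring, while a poll-style estimate weights each group’s answers by its share of the real population.

    ```python
    # Toy illustration of weighted polling vs. a self-selected survey.
    # All figures are invented for the example.

    def weighted_estimate(groups):
        """Each group is (population_share, sample_size, yes_responses).

        Weight each group's observed 'yes' rate by its share of the
        real population, not by its share of the sample.
        """
        return sum(share * (yes / n) for share, n, yes in groups)

    # Population is 50/50 young/old, but young people answered twice as often.
    # Young: 40% say yes; old: 60% say yes. True population figure: 50%.
    sample = [
        (0.5, 200, 80),   # young: 200 responses, 80 'yes' (40%)
        (0.5, 100, 60),   # old:   100 responses, 60 'yes' (60%)
    ]

    raw = (80 + 60) / (200 + 100)         # naive survey-style average
    weighted = weighted_estimate(sample)  # poll-style weighted average

    print(round(raw, 3))       # 0.467 — skewed towards the over-represented group
    print(round(weighted, 3))  # 0.5   — matches the true population rate
    ```

    The same mechanism explains why a party survey of 10,000 self-selected members can be less reliable than a weighted poll of 1,000: sheer numbers do not correct for who chose to answer.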
