“Have you ever thought that nostalgia isn’t what it used to be…”

Human beings love nostalgia, perhaps strangely. For all the success of various self-help gurus and such telling us to ‘live in the moment’, there are few things more satisfying than sitting back and letting the memories flow over us, with rose-tinted spectacles all set up and in position. Looking back on our past may conjure up feelings of longing, of contentment, of pride or even resentment of the modern day when considering ‘the good old days’, but nobody can doubt how comforting the experience often is.

The real strangeness of nostalgia comes from how irrational it is; when analysing the facts of a given time period, whether in one’s own life or in a historical sense, it is hard not to come to the conclusion that the past is usually as bad as the present day, for some different and many of the same reasons. The older generations have, for example, always thought that current chart music (for any time period’s definition of ‘current’) is not as good as it was when they were teenagers, that their younger peers have less respect than they should, and that culture is on a downward spiral into chaos and mayhem that will surely begin within the next couple of years. Or at least so the big book of English middle-class stereotypes tells me. The point is that the idea that the modern day is worse than those that have gone before is a perennial one, and since at no point in history have we ever been rolling in wealth, freedom, happiness and general prosperity it is a fairly simple process to conclude that things have not, in fact, actually been getting worse. At the very least, whilst in certain areas the world probably is worse than it was, say, 30 years ago (the USA’s relationship with the Middle East, the drugs trade, the number of One Direction fans on planet Earth and so on), from other standpoints it could be said that our world is getting continually better; consider the scientific and technological advancements of the last two decades, or the increasing acceptance the world seems to have for certain sections of its society (the LGBT community and certain racial minorities spring to mind). Basically, the idea that everything was somehow genuinely better in the past is an irrational one, and thus nostalgia is a rather irrational idea.

What, then, is the cause of nostalgia? Why do we find it so comforting, and why is it so common to yearn for ‘good old days’ that, often, never truly were?

Part of the answer may lie in the nature of childhood, the period most commonly associated with nostalgia. Childhood in humans is an immensely interesting topic; no other animal enjoys a period of childhood lasting around a quarter of its total lifespan (indeed, if humans today lived as long as they did in the distant past, around half their life would be spent in the stage we nowadays identify as childhood), and the reasons for this could (and probably will one day) make up an entire post of their own. There is still a vast amount we do not know about how our bodies, particularly in terms of the brain, develop during this period of our lives, but what we can say with some certainty is that our perception of the world as a child is fundamentally different from our perception as adults. Whether it be the experience we do not yet have, the relative innocence of childhood, some deep neurological effect we do not yet know about or simply a lack of care for the outside world, the world as experienced by a child is generally a small, simple one. Children (more so the younger they are, though to a lesser extent this continues through into the teenage years) tend to be wrapped up in their own little world; what Timmy did in the toilets at school today is, quite simply, the biggest event in human history to date. What the current prime minister is doing to the economy, how the bills are going to get paid this month, the ups and downs of marriages and relationships; none matter to a childhood mind, and with hindsight we are well aware of it. There is a reason behind the oft-repeated (as well as slightly depressing and possibly wrong) claim that ‘schooldays are the best days of your life’. As adults we forget that, as kids, we did have worries, there was horrible stuff in the world and we were unhappy, often; it’s just that, because childhood worries are so different and ignore so many of the big things that would have troubled us were we adults at the time, we tend to regard them as trivial, with the benefit of that wonderful thing that is hindsight.

However, this doesn’t account so well for nostalgia that hits when we enter our teenage years and later life; for stuff like music, for example, which is unlikely to have registered in our pre-teen days either. To explain this, we must consider the other half of the nostalgia explanation; the simple question of perception. It is an interesting fact that some 70-80% of people consider themselves to be an above-average driver, and it’s not hard to see why; we may see a few hundred cars on our commute into work or school, but will only ever remember that one bastard who cut us up at the lights. Even though it represents a tiny proportion of all the driving we ever see, bad driving is still a common enough occurrence that we feel the majority of drivers must pull such stupid stunts on a regular basis, and that we are a better driver than said majority.

And the same applies to nostalgia. Many things will have happened to us during our younger days; we will hear some good music, and ignore a lot of crap music. We will have plenty of dull, normal schooldays, and a couple that are absolutely spectacular (along with a few terrible ones). And we will encounter many aspects of the world, be they news stories, encounters with people or any of the other pieces of random ‘stuff’ that makes up our day-to-day lives, that will either feel totally neutral to us, make us feel a little bit happy or make us slightly annoyed, exactly the same stuff that can sometimes make us feel like our current existence is a bit crappy. But all we will ever remember are the extremes; the stuff that filled us with joy, and the darkest and most memorable of horrors. And so, when we look back on our younger days, we smile sadly to ourselves as we remember those good times. All the little niggly bad things, all the dull moments, they don’t feature on our internal viewfinder. In our head, there really were ‘good old days’. Our head is, however, not a terribly reliable source when it comes to such things.

Time is an illusion, lunchtime doubly so…

In the dim and distant past, time was, to humankind, a thing and not much more. There was light-time, then there was dark-time, then there was another lot of light-time; during the day we could hunt, fight, eat and try to stay alive, and during the night we could sleep and have sex. However, we also realised that there were some parts of the year with short days and colder nights, and others that were warmer, brighter and better for hunting. Being the bright sort, we humans realised that the amount of time spent in winter, spring, summer and autumn (fall is the WRONG WORD) was about the same each time around, and worked out that, rather than just waiting for it to warm up each time, we could count how long one cycle (or year) took and so predict when it was going to get warm the following year. This enabled us to plan our hunting and farming patterns, and it became recognised that some knowledge of how the year worked was advantageous to a tribe. Eventually, this got so important that people started building monuments to the annual seasonal progression, hence such weird and staggeringly impressive prehistoric engineering achievements as Stonehenge.

However, this basic understanding of the year and the seasons was only one step on the journey, and as we moved from a hunter-gatherer paradigm to more of a civilised existence, we realised the benefits that a complete calendar could offer us, and thus began our still-continuing quest to quantify time. Nowadays our understanding of time extends to clocks accurate to the nanosecond, and an understanding of relativity, but for a long time the greatest leap in our quest to bring organised time into our lives was the creation of the concept of the week.

Having seven days of the week is, to begin with, a strange idea; seven is an awkward prime number, and it seems odd that we don’t pick a number that is easier to divide and multiply by, like six, eight or even ten, as the basis for our temporal system. Six would seem to make the most sense; most of our months have around 30 days, or 5 six-day weeks, and 365 days a year is only one less than a multiple of six, which could surely be some sort of religious symbolism (and there would be an exact multiple on leap years- even better). And it would mean a shorter week, and more time spent on the weekend, which would be really great. But no, we’re stuck with seven, and it’s all the bloody moon’s fault.

Y’see, the sun’s daily cycle is useful for measuring short-term time (night and day), and the earth’s orbit around it provides the crucial yearly change of season. However, the moon’s cycle is roughly 28 days long (fourteen to wax, fourteen to wane, regular as clockwork), providing a nice intermediary time unit with which to divide up the year into a more manageable number of pieces than 365. Thus, we began dividing the year up into ‘moons’ and using them as a convenient reference that we could refer to every night. However, even a moon cycle is a bit long for day-to-day scheduling, and it proved advantageous for our distant ancestors to split it up even further. Unfortunately, 28 is an awkward number to divide into pieces, and its only factors (besides itself) are 1, 2, 4, 7 and 14. An increment of 1 or 2 days is simply too small to be useful, and a 4-day ‘week’ isn’t much better. A 14-day week would hardly be an improvement on 28 for scheduling purposes, so seven is the only number of a practical size for the length of the week. The fact that months are now mostly 30 or 31 days rather than 28, to try and fit the awkward fact that there are 12.36 moon cycles in a year, hasn’t changed matters, so we’re stuck with an awkward 7-day cycle.
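For anyone who fancies checking the arithmetic, here is a minimal Python sketch of the divisor argument above (purely illustrative, and assuming the 28-day figure used in this post):

```python
# Illustrative only: which candidate week lengths divide a 28-day lunar cycle evenly?
lunar_cycle = 28
divisors = [n for n in range(1, lunar_cycle + 1) if lunar_cycle % n == 0]
print(divisors)  # [1, 2, 4, 7, 14, 28]
# 1, 2 and 4 days are too short to be useful, while 14 and 28 are barely an
# improvement on the full cycle, leaving 7 as the only practical week length.
```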

However, this wasn’t the end of the issue for the historic time-definers (for want of a better word); there’s not much advantage in defining a seven day week if you can’t then define which day of said week you want the crops to be planted on. Therefore, different days of the week needed names for identification purposes, and since astronomy had already provided our daily, weekly and yearly time structures it made sense to look skyward once again when searching for suitable names. At this time, centuries before the invention of the telescope, we only knew of seven planets, those celestial bodies that could be seen with the naked eye; the sun, the moon (yeah, their definition of ‘planet’ was a bit iffy), Mercury, Venus, Mars, Jupiter and Saturn. It might seem to make sense, with seven planets and seven days of the week, to just name the days after the planets in a random order, but humankind never does things so simply, and the process of picking which day got named after which planet was a complicated one.

In around 1000 BC the Egyptians had decided to divide the daylight into twelve hours (because they knew how to pick a nice, easy-to-divide number), and the Babylonians then took this a stage further by dividing the entire day, including night-time, into 24 hours. The Babylonians were also great astronomers, and had thus discovered the seven visible planets- however, because they were a bit weird, they decided that each planet had its place in a hierarchy, and that this hierarchy was dictated by which planet took the longest to complete its cycle and return to the same point in the sky. This order was, for the record, Saturn (29 years), Jupiter (12 years), Mars (687 days), Sun (365 days), Venus (225 days), Mercury (88 days) and Moon (28 days). So, did they name the days after the planets in this order? Of course not, that would be far too simple; instead, they decided to start naming the hours of the day after the planets (I did say they were a bit weird) in that order, going back to Saturn when they got to the Moon.

However, 24 hours does not divide nicely by seven planets, so the planet after which the first hour of the day was named changed each day. So, the first hour of the first day of the week was named after Saturn, the first hour of the second day after the Sun, and so on. Since the list repeated itself each week, the Babylonians decided to name each day after the planet after which its first hour was named, so we got Saturnday, Sunday, Moonday, Marsday, Mercuryday, Jupiterday and Venusday.
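If you want to see why that cycling works out, here is a small Python sketch of the scheme as I’ve described it (nothing here comes from a Babylonian source; it’s just the arithmetic): the planets sit in the slowest-to-fastest order given above, and because 24 leaves a remainder of 3 when divided by 7, the planet ruling the first hour jumps three places down the list each day.

```python
# A sketch of the hour-naming scheme described above.
# Planets in the 'slowest first' order given in the previous paragraph.
chaldean_order = ["Saturn", "Jupiter", "Mars", "Sun", "Venus", "Mercury", "Moon"]

hour = 0  # hours elapsed since the first hour of the first day
for day in range(1, 8):
    ruler = chaldean_order[hour % 7]  # planet ruling the first hour of this day
    print(f"Day {day}: {ruler}day")
    hour += 24  # 24 mod 7 = 3, so the ruler advances three places each day
```

Run it and out come Saturnday, Sunday, Moonday, Marsday, Mercuryday, Jupiterday and Venusday, in that order.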

Now, you may have noticed that these are not the days of the week we English speakers are exactly used to, and for that we can blame the Vikings. The planetary method for naming the days of the week was brought to Britain by the Romans, and when they left the Britons held on to the names. However, Britain then spent the next 7 centuries getting repeatedly invaded and conquered by various foreigners, and for most of that time it was the Germanic Vikings and Saxons who fought over the country. Both groups worshipped the same gods, those of Norse mythology (so Thor, Odin and so on), and one of the practices they introduced was to replace the names of four days of the week with those of four of their gods; Tyr’sday, Woden’sday (Woden was the Saxon word for Odin), Thor’sday and Frig’sday replaced Marsday, Mercuryday, Jupiterday and Venusday in England, and soon the fluctuating nature of language renamed the days of the week Saturday, Sunday, Monday, Tuesday, Wednesday, Thursday and Friday.

However, the old planetary names remained in the Romance languages (the French translations of the days Tuesday to Friday are Mardi, Mercredi, Jeudi and Vendredi), with one small exception. When the Roman Empire went Christian in the fourth century, the ten commandments dictated they remember the Sabbath day; but, to avoid copying the Jews (whose Sabbath was on Saturday), they chose to make Sunday the Sabbath day. It is for this reason that Monday, the first day of the working week after one’s day of rest, became the start of the week, taking over from the Babylonians’ choice of Saturday, but close to Rome they went one stage further and renamed Sunday ‘Dies Dominica’, or Day Of The Lord. The practice didn’t catch on in Britain, over a thousand miles from Rome, but the modern day Spanish, French and Italian words for Sunday are domingo, dimanche and domenica respectively, all of which are locally corrupted forms of ‘Dies Dominica’.

This is one of those posts that doesn’t have a natural conclusion, or even much of a point to it. But hey; I didn’t start writing this because I wanted to make a point, but more to share the kind of stuff I find slightly interesting. Sorry if you didn’t find it so.

Determinism

In the early years of the 19th century, science was on a roll. The dark days of alchemy were beginning to give way to the modern science of chemistry as we know it today, the world of physics and the study of electromagnetism were starting to get going, and the world was on the brink of an industrial revolution that would be powered by scientists and engineers. Slowly, we were beginning to piece together exactly how our world works, and some dared to dream of a day where we might understand all of it. Yes, it would be a long way off, yes there would be stumbling blocks, but maybe, just maybe, so long as we don’t discover anything inconvenient like advanced cosmology, we might one day begin to see the light at the end of the long tunnel of science.

Most of this stuff was the preserve of hopeless dreamers, but in the year 1814 Pierre-Simon Laplace, a brilliant mathematician and philosopher responsible for underpinning vast quantities of modern mathematics and cosmology, published a bold new article that took this concept to extremes. Laplace lived in the age of ‘the clockwork universe’, a theory that held Newton’s laws of motion to be sacrosanct truths and claimed that these laws of physics caused the universe to just keep on ticking over, just like the mechanical innards of a clock- and just like a clock, the universe was predictable. Just as one hour after five o’clock will always be six, presuming a perfect clock, so every event in the world can be predicted from the conditions that preceded it. Laplace’s arguments took such a theory to its logical conclusion; if some vast intellect were able to know the precise positions of every particle in the universe, and all the forces and motions acting upon them, at a single point in time, then using the laws of physics such an intellect would be able to know everything, see into the past, and predict the future.
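To give a flavour of what ‘clockwork’ prediction means in practice, here is a toy Python sketch (my own illustration, nothing Laplace ever wrote): feed in exact initial conditions and the force law, and the future of a thrown ball simply cranks out, one step at a time.

```python
# A toy illustration of the 'clockwork universe': exact initial conditions plus
# Newton's laws, iterated step by step, give the whole future of the system.
# Numbers are arbitrary; the point is that nothing here is left to chance.
dt = 0.001        # time step in seconds
g = -9.81         # gravitational acceleration, m/s^2

height, velocity, t = 0.0, 20.0, 0.0   # thrown straight up at 20 m/s
while height >= 0.0:
    velocity += g * dt       # the force law fixes how velocity changes
    height += velocity * dt  # the velocity fixes how position changes
    t += dt

print(f"The ball lands after roughly {t:.2f} s")  # analytically, 2*20/9.81 = 4.08 s
```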

Those who believed in this theory were generally disapproved of by the Church for devaluing the role of God and the unaccountable divine, whilst others thought it implied a lack of free will (although these issues are still considered somewhat up for debate to this day). However, among the scientific community Laplace’s ideas conjured up a flurry of debate; some entirely believed in the concept of a predictable universe, in the theory of scientific determinism (as it became known), whilst others pointed out that the sheer difficulty of getting any ‘vast intellect’ to fully comprehend so much as a heap of sand made Laplace’s arguments completely pointless. Other, far later, observers would call into question some of the axioms upon which the model of the clockwork universe was based, such as Newton’s laws of motion (which break down at very high velocities, where relativity must be taken into account); but the majority of the scientific community was rather taken with the idea that they could know everything about something should they choose to. Perhaps the universe was a bit much, but being able to predict everything, to an infinitely precise degree, about a few atoms perhaps, seemed like a very tempting idea, offering a delightful sense of certainty. More than anything, to these scientists their work now had one overarching goal; to complete the laws necessary to provide a deterministic picture of the universe.

However, by the late 19th century scientific determinism was beginning to stand on rather shaky ground, although the attack against it came from the rather unexpected direction of science being used to support the religious viewpoint. By this time the laws of thermodynamics, detailing the behaviour of molecules in relation to the heat energy they have, had been formulated, and fundamental to the second law of thermodynamics (which is, to this day, one of the fundamental principles of physics) was the concept of entropy. Entropy (denoted in physics by the symbol S, for no obvious reason) is a measure of the degree of uncertainty or ‘randomness’ inherent in the universe; or, for want of a clearer explanation, consider a sandy beach. All of the grains of sand in the beach can be arranged in a vast number of different ways to form the shape of a disorganised heap, but if we make a giant, detailed sandcastle instead there are far fewer arrangements of the grains of sand that will result in the same structure. Therefore, if we just consider the two situations separately, it is far, far more likely that we will end up with a disorganised ‘beach’ structure rather than a castle forming of its own accord (which is why sandcastles don’t spring fully formed from the sea), and we say that the beach has a higher degree of entropy than the castle. This increased likelihood of higher-entropy situations, on an atomic scale, means that the universe tends to increase the overall level of entropy in it; if we attempt to impose order upon it (by making a sandcastle, rather than waiting for one to be formed purely by chance), we must input energy, which increases the entropy of the surroundings, resulting in a net entropy increase. This is the second law of thermodynamics; entropy always increases, and this principle underlies vast quantities of modern physics and chemistry.
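To put a rough number on the sandcastle example, here is a toy Python calculation using Boltzmann’s relation S = k·ln(W), where W counts the microscopic arrangements that look the same from the outside; the grain and site counts are invented purely for illustration.

```python
# A toy version of the sandcastle argument, using Boltzmann's S = k * ln(W):
# the entropy of a state depends on how many microscopic arrangements (W) look
# like that state. All the numbers below are invented purely for illustration.
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K

grains = 1000            # grains of sand in our toy model
sites_per_grain = 50     # positions each grain could occupy in a loose heap

W_castle = 1                         # one exact arrangement counts as 'the castle'
S_castle = k_B * math.log(W_castle)  # = 0: perfectly ordered

# ln(50**1000) = 1000 * ln(50), computed this way to avoid a huge integer
S_heap = k_B * grains * math.log(sites_per_grain)

print(f"S(castle) = {S_castle:.3e} J/K")
print(f"S(heap)   = {S_heap:.3e} J/K")  # vastly larger, hence vastly more likely
```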

If we extrapolate this situation backwards, we realise that the universe must have had a definite beginning at some point; a starting point of order from which things get steadily more chaotic, for order cannot increase indefinitely as we look backwards in time. This suggests some point at which our current universe sprang into being, including all the laws of physics that make it up; but this cannot have occurred under ‘our’ laws of physics that we experience in the everyday universe, as they could not kickstart their own existence. There must, therefore, have been some other, higher power to get the clockwork universe in motion, destroying the image of it as some eternal, unquestionable predictive cycle. At the time, this was seen as vindicating the idea of the existence of God to start everything off; it would be some years before the work of Georges Lemaître and Edwin Hubble would give us the Big Bang theory, but even now we understand next to nothing about the moment of our creation.

However, this argument wasn’t exactly a death knell for determinism; after all, the laws of physics could still describe our existing universe as a ticking clock, surely? True; the killer blow for that idea would come from Werner Heisenberg in 1927.

Heisenberg was a particle physicist, often described as the inventor of quantum mechanics (work which won him a Nobel Prize). The key feature of his work here was the concept of uncertainty on a subatomic level; that certain properties, such as the position and momentum of a particle, are impossible to know exactly at any one time. There is an incredibly complicated explanation for this concerning wave functions and matrix algebra, but a simpler way to explain part of the concept concerns how we examine something’s position (apologies in advance to all physics students I end up annoying). If we want to know where something is, then the tried and tested method is to look at the thing; this requires photons of light to bounce off the object and enter our eyes, or hypersensitive measuring equipment if we want to get really advanced. However, at a subatomic level a photon of light represents a sizeable chunk of energy, so when it bounces off an atom or subatomic particle, allowing us to know where it is, it so messes around with the atom’s energy that it changes its velocity and momentum, although we cannot predict how. Thus, the more precisely we try to measure the position of something, the less accurately we are able to know its velocity (and vice versa; I recognise this explanation is incomplete, but can we just take it as read that finer minds than mine agree on this point). Therefore, we cannot ever measure every property of every particle in a given space, never mind the engineering challenge; it’s simply not possible.
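For a sense of scale, here is a quick Python back-of-the-envelope using the modern statement of the principle, Δx·Δp ≥ ħ/2 (the atom-sized choice of Δx is just an example of mine, not anything from Heisenberg’s paper):

```python
# A back-of-the-envelope number for the uncertainty principle, using the modern
# bound delta_x * delta_p >= hbar / 2. The atom-sized delta_x is just an example.
hbar = 1.054571817e-34    # reduced Planck constant, J*s
m_e = 9.1093837015e-31    # electron mass, kg

delta_x = 1e-10                    # pin an electron down to about one angstrom
delta_p = hbar / (2 * delta_x)     # minimum possible momentum uncertainty
delta_v = delta_p / m_e            # corresponding velocity uncertainty

print(f"delta_p >= {delta_p:.2e} kg*m/s")
print(f"delta_v >= {delta_v:.2e} m/s")  # ~5.8e5 m/s: enormous on an atomic scale
```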

This idea did not enter the scientific consciousness comfortably; many scientists were incensed by the idea that they couldn’t know everything, that their goal of an entirely predictable, deterministic universe would forever remain unfulfilled. Einstein was a particularly vocal critic, dedicating the rest of his life’s work to attempting to disprove quantum mechanics and back up his famous statement that ‘God does not play dice with the universe’. But eventually the scientific world came to accept the truth; that determinism was dead. The universe would never seem so sure and predictable again.

In a hole in the ground there lived a hobbit…

I read a lot; I have done since I was a kid. Brian Jacques, JK Rowling, Caroline Lawrence and dozens of other authors’ work sped through my young mind, throwing off ideas, philosophies, and any other random stuff I found interesting in all directions. However, as any committed reader will tell you, after a while spent flicking through any genre all the ‘low-hanging fruit’, the good books everyone’s heard of, will soon be absorbed, and it is often quite a task to find reliable sources of good reading material. It was partly for this reason that I, some years ago, turned to the fantasy genre because, like it or loathe it, it is impossible to deny the sheer volume of stuff, and good stuff too, that is there. Mountains of books have been written for it, many of which are truly huge (I refer to volumes 11 and 12 of Robert Jordan’s ‘Wheel of Time’, which I have yet to pluck up the courage to actually read, if anyone doubts this fact), and the presence of so many different subgenres (who can compare George RR Martin, creator of A Game of Thrones, with Terry Pratchett, of Discworld fame?) and different ideas gives it a nice level of innovation within a relatively safe, predictable sphere of existence.

This sheer volume of work does create one or two issues, most notably the fact that it can often be hard to consult with other fans about ‘epic sagas’ you picked up in the library that they may never have even heard of (hands up how many of you have heard of Raymond E Feist, who really got me started in this genre)- there’s just so much stuff, and not much of it can be said to be standard reading material for fantasy fans. However, there is one point of consistency, one author everyone’s read, and who can always be used as a reliable, if high, benchmark. I speak, of course, of the work of JRR Tolkien.

As has been well documented, John Ronald Reuel Tolkien was not an author by trade or any especial inclination; he was an academic, a professor first of Anglo-Saxon and later of English Language & Literature at Oxford (at Pembroke and then Merton College), for 34 years no less. He first rose to real academic prominence in 1936, when he gave (and later published) a seminal lecture entitled Beowulf: The Monsters and the Critics. Beowulf is one of the oldest surviving works of English literature, an Anglo-Saxon epic poem from around the 8th century AD detailing the adventures of a warrior-king named Beowulf, and Tolkien’s lecture defined many contemporary thoughts about it as a work of literature.

However, there was something about Beowulf that was desperately sad to Tolkien; it was just about the only surviving piece of Old English mythology, and certainly the only one with any degree of public knowledge. Tolkien was a keen student of Germanic mythology and that of other nations, and it always pained him that his home nation had no such traditional mythology to call upon, all the Saxon stories having been effectively wiped out with the coming of the Normans in 1066. Even our most famous ‘myths’, those of King Arthur, came from a couple of mentions in 8th century texts, and were only formalised after the Normans arrived- Sir Thomas Malory’s Le Morte d’Arthur, the first full set of the Arthurian legends, wasn’t published until 1485, and there is plenty of evidence that he made most of it up. This never struck Tolkien as how a myth should be; ancient, passed down from father to son over innumerable generations until it became so ingrained as to be considered true. Tolkien’s response to what he saw as a lamentable gap in our heritage was decidedly pragmatic- he began building his own mythological world.

Since he was a linguistic scholar, Tolkien began by working with what he knew: languages. His primary efforts were concerned with Elvish, for which he invented his own alphabet and grammar, eventually developing it into as deep and fully-fleshed a tongue as you could imagine. He then began experimenting with writing mythology based around the language- building a world of the Dark Ages and before that was as special, fantastical and magical as a story should be to become a fully-fledged myth (you will notice that at the start of The Lord Of The Rings, Tolkien refers to how we don’t see much of hobbits any more, implying that his world was set in our own past rather than in an alternate universe).

His first work in this field was the Quenta Silmarillion, a title that translates (from Elvish) as “the Tale of the Silmarils”. It is a collection of stories and legends supposedly originating from the First Age of his world, although compiled by an Englishman during the Dark Ages from tales edited during the Fourth Age, after the passing of the elves. Tolkien started this work multiple times without ever finishing, and it wasn’t until long after his death that his son published The Silmarillion as a finished article.

However, Tolkien also had a family with young children, and took delight in writing stories for them. Every Christmas (he was, incidentally, a devout Catholic) he wrote letters to them from Father Christmas that took the form of short stories (again, not published until after his death), and wrote numerous other tales for them. A few of these, such as The Adventures of Tom Bombadil, either drew inspiration from or became part of his world (or ‘legendarium’, as it is also known), but he never expected any of them to become popular. And they weren’t- until he, bored out of his mind marking exam papers one day in around 1930, found a blank back page and began writing another, longer story for them, beginning with the immortal line: “In a hole in the ground there lived a hobbit.”

This work, which would later become The Hobbit (or There and Back Again), was set in the Third Age of his legendarium and is soon to be made into a series of three films (don’t ask me how that works, given that it’s shorter than any one of the three volumes of The Lord Of The Rings, each of which got a single film to itself, but whatever). As with his other stories, he never intended it to be much more than a diverting adventure for his children, and for 4 years after its completion in 1932 it was just that. However, Tolkien was a generous soul who would frequently lend his stories to friends, and one of those, a student named Elaine Griffiths, showed it to another friend called Susan Dagnall. Dagnall worked at the publishing company Allen & Unwin, and she was so impressed upon reading it that she showed it to Stanley Unwin. Unwin lent the book to his son Rayner to review (this was his way of earning pocket money), who described it as ‘suitable for children between the ages of 6 and 12’ (kids were clearly a lot more formal and eloquent where he grew up). Unwin published the book, and everyone loved it. It received glowing reviews and an almost universally positive critical reception, and one of the first came from Tolkien’s friend CS Lewis in The Times, who wrote:

The truth is that in this book a number of good things, never before united, have come together: a fund of humour, an understanding of children, and a happy fusion of the scholar’s with the poet’s grasp of mythology… The professor has the air of inventing nothing. He has studied trolls and dragons at first hand and describes them with that fidelity that is worth oceans of glib “originality.”

In many ways, that quote describes all that was great about Tolkien’s writing; an almost childish, gleeful imagination combined with the brute seriousness of his academic work, which together made his fantasy world feel very, very real. However, this was most definitely not the end of JRR Tolkien, and since I am rapidly going over length, the rest of the story will have to wait until next time…

We Will Remember Them

Four days ago (this post was intended for Monday, when it would have been yesterday, but I was out then- sorry) was Remembrance Sunday; I’m sure you were all aware of that. On that day we acknowledged the dead, recognised the sacrifice they made in service of their country, and reflected upon the tragic horrors that war inflicted upon them and our nations. We gave our thanks that “for your tomorrow, we gave our today”.

However, as the greatest wars ever to rack our planet slip beyond the reach of living memory, a few dissenting voices have been raised about the place of the 11th of November as a day of national mourning and remembrance. They are not loud complaints, as anything that may be seen as an attempt to sully the memories of those who ‘laid so costly a sacrifice on the altar of freedom’ (to quote Saving Private Ryan) is unsurprisingly lambasted and vilified by the majority, but it would be wrong not to recognise that there are some who question the very idea of Remembrance Sunday in its modern incarnation.

‘Remembrance Sunday,’ so goes the argument, ‘is very much centred around the memories of those who died: recognising their act of sacrifice and championing the idea that “they died for us”.’ This may partly explain why the Church has such strong links with the ceremony; quite apart from religion being approximately 68% about death, the whole concept of sacrificing oneself for the good of others is a direct parallel to the story of Jesus Christ. ‘However,’ continues the argument, ‘the wars that we of the old Allied Powers chiefly celebrate and remember are ones which we won, and had we lost them, arguing that the fallen gave their lives in defence of their realm would make it seem like their sacrifice had been wasted- thus, this style of remembrance is not exactly fair. Furthermore, by putting the date of our symbolic day of remembrance on the anniversary of the end of the First World War, we invariably make that conflict (and WWII) our main focus of interest. But it is widely acknowledged that WWI was a horrific, stupid war, in which millions died for next to no material gain and which is generally regarded as a terrible waste of life. We weren’t fighting for freedom against some oppressive power, but because all the European top brass were squaring up to one another in a giant political pissing contest, making the death of 20 million people the result of little more than a game of satisfying egos. This was not a war for which “they died for us” is exactly an appropriate sentiment.’

Such an argument is a remarkably good one, and does call into question the very act of remembrance itself. It’s perhaps harder to make such an argument about more recent wars- the Second World War was a necessary conflict if ever there was one, and it cannot be said that those soldiers currently fighting in Afghanistan are not trying to make a deeply unstable and rather undemocratic part of the world a better place to live in (I said trying). However, this doesn’t change the plain and simple truth that war is a horrible, unpleasant activity that we ought to be trying to get rid of wherever humanly possible, and remembering soldiers from years gone by as if going off to die in a muddy trench was absolutely the most good and right thing to do does not seem like the best way of going about this- it reminds me, in the words of Wilfred Owen, of “that old lie:/Dulce Et Decorum Est/Pro Patria Mori”.

However, that is not to say that we should not remember the deaths and sacrifices of those dead soldiers, far from it. Not only would it be hideously insensitive to both their memories and families (my family was fortunate enough to not experience any war casualties in the 20th century), but it would also suggest to soldiers currently fighting that their fight is meaningless- something they are definitely not going to take well, which would be rather inadvisable since they have all the guns and explosives. War might be a terrible thing, but that is not to say that it doesn’t take guts and bravery to face the guns and fight for what you believe in (or, alternatively, what your country makes you believe in). As deaths go, it is at least honourable, if not exactly Dulce Et Decorum.

And then, of course, there is the whole point of remembrance, and indeed history itself, to remember. The old adage about ‘study history or else find yourself repeating it’ still holds true, and only by learning lessons from the past do we stand any chance of improving on our previous mistakes. Without the great social levelling and anti-imperialist effects of the First World War, women may never have got the vote, jingoistic ideas about empire and the glory of dying in battle may still abound, America may (for good or ill) not have made enough money out of the war to become the economic superpower it is today, and wars may, for many years more, have continued to waste lives through persistent use of outdated tactics on a modern battlefield with modern weaponry, to name but the first examples to come into my head- so to ignore the act of remembrance is not just disrespectful, but downright foolish.

Perhaps then, the message to learn is not to ignore the sacrifice that those soldiers have made over the years, but rather to remember what they died to teach us. We can argue for all of eternity as to whether the wars that led to their deaths were ever justified, but we can all agree that the concept of war itself is a wrong one, and that the death and pain it causes are the best reasons to pursue peace wherever we can. This then, should perhaps be the true message of Remembrance Sunday; that over the years, millions upon millions of soldiers have dyed the earth red with their blood, so that we might one day learn the lessons that enable us to enjoy a world in which they no longer have to.