One Year On

A year is a long time.

On the 16th of December last year, I was on Facebook. Nothing unusual about this (I spent and indeed, to a slightly lesser extent, still spend rather too much time with that little blue f in the top corner of my screen), especially given that it was the run-up to Christmas and I was bored, and neither was the precise content of the bit of Facebook I was looking at- an argument. Such things are common in the weird world of social networking, although they surely shouldn’t be, and this was just another such time. Three or four people were posting long, eloquent, semi-researched and furiously defended messages over some point of ethics, politics or internet piracy, I know not which (it was probably one of those anyway, since that’s what most of them seem to be about among my friends list). Unfortunately, one of those people was me, and I was losing. Well, I say losing; I don’t think anybody could be said to be winning, but I was getting angry and upset all the same, made worse by the realisation that what I was doing was a COMPLETE WASTE OF TIME. I was not in any position whereby my Views were going to have a massive impact on the lives of everyone else, nobody wanted to hear what they were, and there was no way in hell that I was going to convince anyone that my opinion was more ‘right’ than their strongly-held conviction- all I and my fellow arguees were achieving was getting very, very angry at one another, actively making us all more miserable. We could pretend that we were debating an important issue, but in reality we were just another group of people screaming at one another via the interwebs.

A little under a week later, the night after the winter solstice (22nd of December, which you should notice was exactly 366 days ago), I was again to be found watching an argument unfold on Facebook. Thankfully this time I was not participating, merely looking on with horror as another group of four or five people made their evening miserable by pretending they could convince others that they were ‘wrong’. The provocativeness of the original post, spouting one set of Views as gospel truth over the web, the self-righteousness of the responses and the steadily increasing vitriol of the resulting argument, all struck me as a terrible waste of some wonderful brains. Those participating I knew to be good people, smart people, capable of using their brains for, if not betterment of the world around them, then perhaps a degree of self-betterment or at the very least something that was not making the world a more unhappy place. The moment was not a happy one.

However, one of the benefits of not competing in such an argument is that I didn’t have to be reminded of it or spend much time watching it unfold, so I turned back to my news feed and began scrolling down. As I did so, I came across another friend putting up a link to his blog. This was a recent experiment for him, only a few posts old at the time, and he self-publicised it religiously every time a post went up. He has since discontinued his blogging adventures, to my disappointment, but they made fun reading whilst they lasted; short (mostly less than 300 words) and covering a wide range of random topics. He wasn’t afraid to just be himself online, and wasn’t concerned about being definitively right; if he offered an opinion, it was just something he thought, no more & no less, and there was no sense that it was ever combative. Certainly combativeness was never the point of any post he made; each was just something he’d encountered in the real world or online that he felt would be relatively cool and interesting to comment on. His blog’s description billed his posts as ‘musings’, and that was the right word for them; harmless, fun and nice. They made the internet and world in general, in some tiny little way, a nicer place to explore.

So, I read through his post. I smirked a little, smiled and closed the tab, returning once more to Facebook and the other distractions & delights the net had to offer. After an hour or so, my thoughts once again turned to the argument, and I rashly flicked over to look at how it was progressing. It had got to over 100 comments and, as these things do, was gradually wandering off-topic to a more fundamental, but no less depressing, point of disagreement. I was once again filled with a sense that these people were wasting their lives, but this time my thoughts were both more decisive and introspective. I thought about myself; listless, counting down the last few empty days before Christmas, looking at the occasional video or blog, not doing much with myself. My schedule was relatively free, I had a lot of spare time, but I was wasting it. I thought of all the weird and wonderful thoughts that flew across my brain, all the ideas that would spring and fountain of their own accord, all of the things that I thought were interesting, amazing or just downright wonderful about our little mental, spinning ball of rock and water and its strange, pink, fleshy inhabitants that I never got to share. Worse, I never got to put them down anywhere, so over time all these thoughts would die in some forgotten corner of my brain, and the potential they had to remind me of themselves was lost. Once again, I was struck by a sense of waste, but also of resolve; I could try to remedy this situation. So, I opened up WordPress, I filled out a few boxes, and I had my own little blog. My fingers hovered over the keyboard, before falling to the keys. I began to write a little introduction to myself.

Today, the role of my little corner of the interwebs has changed somewhat. Once, I would post poetry, lists, depressed trains of thought and last year’s ’round robin letter of Planet Earth’, which I still regard as one of the best concepts I ever put onto the net (although I don’t think I’ll do one this year- not as much major stuff has hit the news). Somewhere along the line, I realised that essays were more my kind of thing, so I’ve (mainly) stuck to them since; I enjoy the occasional foray into something else, but I find that I can’t produce as much regular stuff this way as otherwise. In any case, the essays have been good for me; I can type, research and get work done so much faster now, and it has paid dividends to my work rate and analytical ability in other fields. I have also found that in my efforts to add evidence to my comments, I end up doing a surprising amount of research that turns an exercise in writing down what I know into one of increasing the kind of stuff I know, learning all sorts of new and random stuff to pack into my brain. I have also violated my own rules about giving my Views on a couple of occasions (although I would hope that I haven’t been too obnoxious about it when I have), but broadly speaking the role of my blog has stayed true to those goals stated in my very first post; to be a place free from rants, to be somewhere to have a bit of a laugh and to be somewhere to rescue unwary travellers dredging the backwaters of the internet who might like what they’ve stumbled upon. But, really, this little blog is like a diary for me; a place that I don’t publicise on my Facebook feed, that I link to only rarely, and that I keep going because I find it comforting. It’s a place where there’s nobody to judge me, a place to house my mind and extend my memory. It’s stressful organising my posting time and coming up with ideas, but whilst blogging, the rest of the world can wait for a bit.
It’s a calming place, a nice place, and over the last year it has changed me.

A year is a long time.

Questionably Moral

We human beings tend to set a lot of store by the idea of morality (well, most of us anyway), and it is generally accepted that having a strong code of morals is a good thing. Even if many of us have never exactly qualified what we consider to be right or wrong, the majority of people have at least a basic idea of what they consider morally acceptable and a significant number are willing to make their moral standpoint on various issues very well known to anyone who doesn’t want to listen (internet, I’m looking at you again). One of the key features considered to be integral to such a moral code is the idea of rigidity and having fixed rules. Much like law, morality should ideally be inflexible, passing equal judgement on the same situation regardless of who is involved, how you’re feeling at the time and other outside factors. If only to avoid being accused of hypocrisy, social law dictates that one ‘should’ pass equal moral judgement on both your worst enemy and your spouse, and such a stringent dedication to ‘justice’ is a prized concept among those with strong moral codes.

However, human beings are nothing if not inconsistent, and even the strongest and most vehemently held ideas have a habit of withering in the face of context. One’s moral code is no exception, and with that in mind, let’s talk about cats.

Consider a person- call them a socialist, if you like that sort of description. Somebody who basically believes that we should be doing our bit to help our fellow man. Someone who buys The Big Issue, donates to charity, and gives their change to the homeless. They take the view that those in a more disadvantaged position should be offered help, and they live and share this view on a daily basis.

Now, consider what happens when, one day, said person is having a barbecue and a stray cat comes into the garden. Such strays are, nowadays, uncommon in suburban Britain, but across Europe (the Mediterranean especially), there may be hundreds of them in a town (maybe the person’s on holiday). Picture one such cat- skinny, with visible ribs, unkempt and patchy fur, perhaps a few open sores. A mangy, quite pathetic creature, clinging onto life through a mixture of tenacity and grubbing for scraps, it enters the garden and makes its way towards the man and his barbecue.

Human beings, especially modern-day ones, lead quite a wasteful and indulgent existence. We certainly do not need the vast majority of the food we produce and consume, and could quite happily do without a fair bit of it. A small cat, by contrast, can survive quite happily for at least a day on just one small bowl of food, or a few scraps of meat. From a neutral, logical standpoint, therefore, the correct and generous thing to do according to this person’s moral standpoint would be to throw the cat a few scraps and sleep comfortably with a satisfied conscience that evening. But, all our person sees is a mangy street cat, a dirty horrible stray that they don’t want anywhere near them or their food, so they do all they can to kick, scream, shout, throw water and generally drive away a starving life form that was after just a few scraps from a huge pile of pristine meat, much of which is likely to go to waste.

Now, you could argue that if the cat had been given food, it would have kept on coming back, quite insatiably, for more, and could possibly have got bolder and more aggressive. An aggressive, confident cat is more likely to try and steal food, and letting a possibly diseased and flea-ridden animal near food you are due to eat is probably not in the best interests of hygiene. You could argue that offering food is just going to encourage other cats to come to you for food, until you become a feeding station for all those in the area and are thus promoting the survival and growth of a feline population that nobody really likes to see around and would be unsustainable to keep. You could argue, if you were particularly harsh and probably not of the same viewpoint as the person in question, that a cat is not ‘worth’ as much as a human, if only because we should stick to looking after our own for starters and, in any case, it would be better for the world anyway if there weren’t stray cats around to cause such freak-outs and moral dilemmas. But all of this does not change the fact that this person has, from an objective standpoint, violated their moral code by refusing a creature less fortunate than themselves a mere scrap that could, potentially, represent the difference between its living and dying.

There are other examples of such moral inconsistency in the world around us. Animals are a common connecting factor (pacifists and people who generally don’t like murder will quite happily swat flies and such ‘because they’re annoying’), but there are other, more human examples (those who say we should be feeding the world’s poor whilst simultaneously both eating and wasting vast amounts of food and donating a mere pittance to help those in need). Now, does this mean that all of these moral standpoints are stupid? Of course not; if we all decided not to help and be nice to one another then the world would be an absolute mess. Does it mean that we’re all just bad, hypocritical people, as the violently forceful charity collectors would have you believe? Again, no- this ‘hypocrisy’ is something that all humans do to some extent, so either the entire human race is fundamentally flawed (in which case the point is not worth arguing) or we feel that looking after ourselves first and foremost before helping others is simply more practical. Should we all turn to communist leadership to try and redress some of these imbalances and remove the moral dilemmas? I won’t even go there.

It’s a little hard to identify a clear moral or conclusion to all of this, except to highlight that moral inconsistency is a natural and very human trait. Some might deplore this state of affairs, but we’ve always known humans are imperfect creatures; not that that gives us a right to give up on being the best we can be.

Fitting in

This is my third post in this little mini-series on the subject of sex & sexuality in general, and this time I would like to focus on the place that sex has in our society. I mean, on the face of it, we as a culture appear to be genuinely embarrassed by its existence a lot of the time, rarely referring to it explicitly if at all (at least certainly not among ‘correct’ and polite company), and making any mention of it a cause of scandal and embarrassment. Indeed, an entire subset of language seems to have developed over the last few years to try and enable us to talk about sexually-related things without ever actually making explicit reference to them- it’s an entire area where you just don’t go in conversation. It’s almost as if polite society has the mental age of a 13 year old in this regard.

Compare this to the societal structure of one of our closest relatives, the ‘pygmy great ape’ called the bonobo. Bonobos adopt a matriarchal (female-led) society, are entirely bisexual, and for them sex is a huge part of their social system. If a pair of bonobos are confronted with any new and unusual situation, be it merely the introduction of a cardboard box, their immediate response will be to briefly start having sex with one another almost to act as an icebreaker, before they go and investigate whatever it was that excited them. Compared to bonobos, humans appear to be acting like a bunch of nuns.

And this we must contrast against the fact that sex is something that we are not only designed for but that we actively seek and enjoy. Sigmund Freud is famous for claiming that most of human behaviour can be put down to the desire for sex, and as I have explained in a previous post, it makes evolutionary sense for us to both enjoy sex and seek it wherever we can. It’s a fact of life, something very few of us would be comfortable to do without, and something our children are going to have to come to terms with eventually- yet it’s something that culture seems determined to brush under the carpet, and that children are continually kept away from for as long as is physically possible in a last-ditch attempt to protect whatever innocence they have left. So why is this?

Part of the reasoning behind this would be the connection between sex and nakedness, as well as the connection to privacy. Human beings do not, obviously, have thick fur to protect themselves or keep them warm (nobody knows exactly why we lost ours, but it’s probably to do with helping to regulate temperature, which we humans do very well), and as such clothes are a great advantage to us. They can shade us when it’s warm and allow for more efficient cooling, protect us from harsh dust, wind & weather, keep us warm when we venture into the world’s colder climates, help stem blood flow and lessen the effect of injuries, protect us against attack from predators or one another, help keep us a little cleaner and replace elaborate fur & feathers for all manner of ceremonial rituals. However, they also carry a great psychological value, placing a barrier between our bodies and the rest of the world, and thus giving us a sense of personal privacy about our own bodies. Of particular interest to our privacy are those areas most commonly covered, including (among other things), the genital areas, which must be exposed for sexual activity. This turns sex into a private, personal act in our collective psyche, something to be shared only between the partners involved, and makes any exploration of it seem like an invasion of our personal privacy. In effect, then, it would seem the Bible got it the wrong way around- it was clothes that gave us the knowledge and shame of nakedness, and thus the ‘shame’ of sex.

Then we must consider the social importance of sex & its consequences in our society generally. For centuries the entire governmental structure of most of the world was based around who was whose son, who was married to whom and, in various roundabout ways, who either had, was having, or could in the future be having, sex with whom. Even nowadays the due process of law usually means inheritance by either next of kin, spouse or partner, and so the consequences of sex carry immense weight. Even in the modern world, with the invention of contraceptives and abortion and the increasing prevalence of ‘casual sex’, sex itself carries immense societal weight, often determining how you are judged by your peers, your ‘history’ among them and your general social standing. To quote a favourite song of a friend of mine: ‘The whole damn world is just as obsessed/ With who’s the best dressed and who’s having sex’. And so, sex becomes this huge social thing, its pursuit full of little rules and nuances, all about who with whom (and even the where & how, among some social groups), and it is simply not allowed to become ‘just this thing everyone does’ like it is with the bonobos. Thus, everything associated with sex & sexuality becomes highly strung and almost political in nature, making it a semi-taboo to talk about for fear of offending someone.

Finally, we must consider the development of the ‘sexual subculture’ that seems to run completely counter to this taboo attitude. For most of human history we have comfortably accepted and even encouraged the existence of brothels and prostitution, and whilst this has become very much frowned upon in today’s culture the position has been filled by strip clubs, lap dancing bars and the sheer mountains of pornography that fill the half-hidden corners of newsagents, small ads and the internet. This is almost a reaction to the rather more prim aloofness adopted by polite society, an acknowledgement and embracing of our enjoyment of sex (albeit one that caters almost exclusively to heterosexual men and has a dubious record for both women’s and, in places, human rights). But because this is almost a direct response to the attitudes of polite culture, it has naturally attracted connotations of being seedy and not respectable. Hundreds of men may visit strip clubs every night, but that doesn’t make it an OK career move for a prominent judge to be photographed walking out of one. Thus, as this sex-obsessed underworld has come into being on the wrong side of the public eye, so sex itself has attracted the same negative connotations, the same sense of lacking in respectability, among the ‘proper’ echelons of society, and has gone even more into the realms of ‘Do Not Discuss’.

But, you might say, sex appears to be getting even more prevalent in the modern age. You’ve mentioned internet porn, but what about the sexualisation of the media, the creation and use of sex symbols, the targeting of sexual content at a steadily younger audience? Good question, and one I’ll give a shot at answering next time…

Living for… when, exactly?

When we are young, we get a lot of advice and rules shoved down our throats in a seemingly endless stream of dos and don’ts. “Do eat your greens”, “Don’t spend too much time watching TV”, “Get your fingers away from your nose” and, an old personal favourite, “Keep your elbows off the table”. Some schools of psychology claim that this militant enforcement of rules, with no leeway or grey area, may be responsible for some of our more rebellious behaviour in older life and, particularly, the teenage years, but I won’t delve into that now.

But there is one piece of advice, very broadly applied in a variety of contexts, in fact more of a general message than a rule, that is of particular interest to me. Throughout our lives, from the cradle right into adulthood, we are encouraged to take time over our decisions, to make only sensible choices, to plan ahead and think of the consequences, living for long-term satisfaction rather than short-term thrills. This takes the form of a myriad of bits of advice like ‘save not spend’ or ‘don’t eat all that chocolate at once’ (perhaps the most readily disobeyed of all parental instructions), but the general message remains the same: make the sensible, analytical decision.

The reason that this advice is so interesting is because when we hit adult life, many of us will encounter another school of thought that runs totally counter to the idea of sensible analysis- the idea of ‘living for the moment’. The basic viewpoint goes along the lines of ‘We only have one short life that could end tomorrow, so enjoy it as much as you can whilst you can. Take risks, make the mad decisions, go for the off-chances, try out as much as you can, and try to live your life in the moment, thinking of yourself and the here & now rather than worrying about what’s going to happen 20 years down the line’.

This is a very compelling viewpoint, particularly to the fun-centric outlook of the early-to-mid-twenties age bracket who most commonly receive and promote this way of life, for a host of reasons. Firstly, it offers a way of living in which very little can ever be considered to be a mistake, only an attempt at something new that didn’t come off. Secondly, its practice generates immediate and tangible results, rather than the slower, more boring, long-term gains that a ‘sensible life’ may gain you, giving it an immediate association with living the good life. But, most importantly, following this life path is great fun, and leads you to the moments that make life truly special. Someone I know has often quoted their greatest ever regret as, when seriously strapped for cash, taking the sensible fiscal decision and not forking out to go to a Queen concert. Freddie Mercury died shortly afterwards, and this hardcore Queen fan never got to see them live. There is a similar and oft-quoted argument for the huge expense of the space program: ‘Across the galaxy there may be hundreds of dead civilizations, all of whom made the sensible economic choice to not pursue space exploration- who will only be discovered by whichever race made the irrational decision’. In short, sensible decisions may make your life seem good to an accountant, but might not make it seem that special or worthwhile.

On the other hand, this does not make ‘living for the moment’ an especially good life choice either- there’s a very good reason why your parents wanted you to be sensible. A ‘live for the future’ lifestyle is far more likely to reap long-term rewards in terms of salary and societal rank, and plans laid with the right degree of patience and care are invariably more successful, whilst a constant, ceaseless focus on satisfying the urges of the moment is only ever going to end in disaster. This was perhaps best demonstrated in that episode of Family Guy entitled “Brian Sings and Swings”, in which, following a near-death experience, Brian is inspired by the ‘live for today’ lifestyle of Frank Sinatra Jr. For him, this takes the form of singing with Sinatra (and Stewie) every night, and drinking heavily both before & during performances, quickly resulting in drunken shows, throwing up into the toilet, losing a baby and, eventually, the gutter. Clearly, simply living for the now with no consideration for future happiness will very quickly leave you broke, out of a job, possibly homeless and with a monumental hangover. Not only that, but such a heavy focus on the short term has been blamed for a whole host of unsavoury side effects ranging from the ‘plastic’ consumer culture of the modern world and a lack of patience between people to the global economic meltdown, the latter of which could almost certainly have been prevented (and cleared up a bit quicker) had the world’s banks been a little more concerned with their long-term future and a little less with the size of their profit margin.

Clearly then, this is not a clear-cut balance between a right and wrong way of doing things- for one thing everybody’s priorities will be different, but for another neither way of life makes perfect sense without some degree of compromise. Perhaps this is in and of itself a life lesson- that nothing is ever quite fixed, that there are always shades of grey, and that compromise is sure to permeate every facet of our existence. Living for the moment is costly in all regards and potentially catastrophic, whilst living for the distant future is boring and makes life devoid of real value, neither of which is an ideal way to be. Perhaps the best solution is to aim for somewhere in the middle; don’t live for now, don’t live for the indeterminate future, but perhaps live for… this time next week?

I am away on holiday for the next week, so posts should resume on the Monday after next. To tide you over until then, I leave you with a recommendation: YouTube ‘Crapshots’. Find a spare hour or two. Watch all of. Giggle.

Why do we call a writer a bard, anyway?

In Britain at the moment, there are an awful lot of pessimists. Nothing unusual about this, as it’s hardly atypical human nature and my country has never been noted for its sunny, uplifting outlook on life as a rule anyway. Their pessimism is typically of the sort adopted by people who consider themselves too intelligent (read: arrogant) to believe in optimism and nice things, and nowadays tends to focus around Britain’s place in the world. “We have nothing world-class” they tend to say, or “The Olympics are going to be totally rubbish” if they wish to be topical.

However, whilst I could dedicate an entire post to the ramblings of these people, I would probably have to violate my ‘no Views’ clause by the end of it, so will instead focus on one apparent inconsistency in their argument. You see, the kind of people who say this sort of thing also tend to be the kind of people who really, really like the work of William Shakespeare.

There is no denying that the immortal Bard (as he is inexplicably known) is a true giant of literature. He is the only writer of any form to be compulsory reading on the national curriculum and is known by just about everyone in the world, or at least the English-speaking part. He introduced between 150 and 1500 new words to the English language (depending on who you believe and how stringent you are in your criteria) as well as countless phrases ranging from ‘green-eyed monster’ (Othello) to ‘a sorry sight’ (Macbeth), wrote nearly 40 plays, innumerable sonnets and poems, and revolutionised the theatre of his time. As such he is idolised above all other literary figures, Zeus in the pantheon of the Gods of the written word, even in our modern age. All of which is doubly surprising when you consider how much of what he wrote was… well… crap.

I mean think about it- Romeo and Juliet is about a romance that ends with both lovers committing suicide over someone they’ve only known for three days, whilst Twelfth Night is nothing more than a romcom (in fact the film ‘She’s the Man’ turned it into a modern one), and not a great one at that. Julius Caesar is considered even by fans to be the most boring way to spend a few hours in known human history, the character of Othello is the dopiest human in history and A Midsummer Night’s Dream is about some fairies falling in love with a guy who turns into a donkey. That was considered, by Elizabethans, the very height of comedic expression.

So then, why is he so idolised? The answer is, in fact, remarkably simple: Shakespeare did stuff that was new. During the 16th century theatre hadn’t really evolved from its Greek origins, and as such every play was basically the same. Every tragedy had the exact same formulaic plot line of tragic flaw-catharsis-death, which, whilst a good structure used to great effect by Arthur Miller and the guy who wrote the plot for the first God of War game, does tend to lose interest after 2000 years of ceaseless repetition. Comedies & satyr plays had a bit more variety, but were essentially a mixture of stereotypes and pantomime that might have been entertaining had they not been mostly based on tired old stories, philosophy and mythology and been so unfunny that they required a chorus (basically a staged audience meant to show the real audience how to react). In any case there was hardly any call for these comedies anyway- they were considered the poorer cousins to the more noble and proper tragedy, amusing sideshows to distract attention from the monotony of the main dish. And then, of course, there were the irreversibly fixed tropes and rules that had to be obeyed- characters were invariably all noble and kingly (in fact it wasn’t until the 1920s that the idea of a classical tragedy of the common man was entertained at all) and spoke with rigid rhythm, making the whole experience more poetic than imitative of real life. The iambic pentameter was king, the new was non-existent, and there was no concept whatsoever that any of this could change.

Now contrast this with, say, Macbeth. This is (obviously) a tragedy, about a lord who, rather than failing to recognise a tragic flaw in his personality until right at the very end and then holding out for a protracted death scene in which to explain all of it (as in a Greek tragedy), starts off a good and noble man who is sent mental by a trio of witches. Before Shakespeare’s time a playwright could have been lynched for making such insulting suggestions about the noble classes (and it is worth noting that Macbeth wasn’t written until he was firmly established as a playwright), but Shakespeare was one of the first of a more common-born group of playwrights, raised an actor rather than an aristocrat. The main characters may be lords & kings, it is true (even Shakespeare couldn’t shake off the old tropes entirely, and it would take a long time for that to change), but the driving forces of the plot are all women, three of whom are old hags who speak in an irregular chanting and make up heathen prophecies. Then there is an entire monologue dedicated to an old drunk bloke, speaking just as irregularly, mumbling on about how booze kills a boner, and even the main characters get in on the act, with Macbeth and his lady scrambling structureless phrases as they fairly shit themselves in fear of discovery. Hell, he even managed to slip in an almost comic moment of parody as Macbeth compares his own life to that of a play (which, of course, it is; he pulls a similar trick in As You Like It).

This is just one example- there are countless more. Romeo and Juliet was one of the first examples of romance used as the central driving force of a tragedy, The Tempest was the Elizabethan version of fantasy literature, and Henry V deserves a mention for coming up with some of the best inspirational quotes of all time. Unsurprisingly, whilst Shakespeare was able to spark a revolution at home, other countries were rocked by his radicalism- the French especially were sharply divided into two camps, one supporting this theatrical revolution (such as Voltaire) and the other vehemently opposing it. It didn’t do any good- the wheels had been set in motion, and over the next 400 years theatre and literature continued (and continue) to evolve at a previously unprecedented rate. Nowadays, the work of Shakespeare seems to us as much of a relic as the old Greek tragedies must have appeared to him, but as theatre has moved on so too have our expectations of it (such as, for instance, jokes that are actually funny and speech we can understand without a scholar on hand). Shakespeare may not have told the best stories or written the best plays to our ears, but that doesn’t mean he wasn’t the best playwright.

The Problems of the Real World

My last post on the subject of artificial intelligence was something of a philosophical argument on its nature- today I am going to take on a more practical perspective, and have a go at just scratching the surface of the monumental challenges that the real world poses to the development of AI- and, indeed, how they are (broadly speaking) solved.

To understand the issues surrounding the AI problem, we must first consider what, in the strictest sense of the matter, a computer is. To quote… someone, I can’t quite remember who: “A computer is basically just a dumb adding machine that counts on its fingers- except that it has an awful lot of fingers and counts terribly fast”. This rather simplistic model is in fact rather good for explaining exactly what it is that computers are good and bad at- they are very good at numbers, data crunching, the processing of information. Information is the key thing here- if something can be input into a computer purely in terms of information, then the computer is perfectly capable of modelling and processing it with ease- which is why a computer is very good at playing games. Even real-world problems that can be expressed in terms of rules and numbers can be converted into a computer-recognisable format and mastered with ease, which is why computers make short work of things like ballistics modelling (calculating gunnery tables was one of the US military’s first uses for them) and logical games like chess.
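To see just how comfortably this sort of rules-and-numbers problem sits inside a computer, here is a minimal sketch of the idea behind a gunnery table- heavily simplified, since it ignores air resistance entirely (something real ballistics tables most certainly did not):

```python
import math

def ideal_range(velocity, angle_deg, g=9.81):
    """Range of a projectile on flat ground, ignoring air resistance."""
    theta = math.radians(angle_deg)
    return velocity ** 2 * math.sin(2 * theta) / g

# A miniature 'gunnery table' for a shell fired at 300 m/s
for angle in (15, 30, 45, 60):
    print(f"{angle:2d} degrees -> {ideal_range(300, angle):7.0f} m")
```

The whole problem reduces to one formula and a loop- exactly the sort of thing the “dumb adding machine” excels at.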

However, where a computer develops problems is in the barrier between the real world and the virtual. One must remember that the actual ‘mind’ of a computer is confined exclusively to the virtual world- the processing within a robot has no actual concept of the world surrounding it, and as such is notoriously poor at interacting with it. The problem is twofold- firstly, the real world is not a mere simulation where the rules are constant and predictable; rather, it is an incredibly complicated, constantly changing environment in which there are a thousand different things we living humans keep track of without even thinking. As such, there are a LOT of very complicated inputs and outputs for a computer to keep track of in the real world, which makes it very hard to deal with. But this is merely an engineering headache, a matter of grumbling over specifications and meeting the design brief- it is the second problem which is the real stumbling block for the development of AI.

The second issue is related to the way a computer processes information- bit by bit, without any real grasp of the big picture. Take, for example, the computer monitor in front of you. To you, it is quite clearly a screen- the most notable clue being the pretty pattern of lights in front of you. Now, turn your screen slightly so that you are looking at it from an angle. It’s still got a pattern of lights coming out of it, it’s still the same colours- it’s still a screen. To a computer, however, if you were to line up two pictures of your monitor from two different angles, it would be completely unable to realise that they showed the same screen, or even the same kind of object. Because the pixels are in a different order, the data is different, and so the two pictures are completely different- the computer has no concept of the idea that the two patterns of lights are the same basic shape, just seen from different angles.
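A toy example makes the point. Here are two tiny ‘photographs’ of the same bright bar, merely shifted two pixels sideways- a crude stand-in for a change of viewing angle- and when compared the way a computer naturally would, value by value, not a single lit pixel lines up:

```python
# Two 5x5 'images' of the same bright bar, shifted two columns sideways -
# a crude stand-in for photographing one monitor from two angles.
view_a = [[1 if 0 <= x <= 1 else 0 for x in range(5)] for _ in range(5)]
view_b = [[1 if 2 <= x <= 3 else 0 for x in range(5)] for _ in range(5)]

# Compare them pixel by pixel, as a computer naturally does
overlap = sum(a and b for row_a, row_b in zip(view_a, view_b)
                      for a, b in zip(row_a, row_b))
print(f"lit pixels that line up: {overlap} of 10")  # -> 0 of 10
```

To our eyes the two grids are obviously the same bar; to the byte-by-byte comparison they have nothing in common at all.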

There are two potential solutions to this problem. Firstly, the computer could look at the monitor and store an image of it from every conceivable angle against every conceivable background, so that it would be able to recognise it anywhere, from any viewpoint- this would, however, take up a library’s worth of memory space and be stupidly wasteful. The alternative requires some cleverer programming- by training the computer to spot patterns of pixels that look roughly similar (either shifted along by a few bytes, or missing a few here and there), it can be ‘trained’ to pick out basic shapes, and by using an algorithm to pick out changes in colour (an old trick that’s been used for years to clean up photos), the edges of objects can be identified and the objects themselves picked out. I am not by any stretch of the imagination an expert in this field so won’t go into details, but by this basic method a computer can begin to step back and look at the pattern of a picture as a whole.
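In its very simplest one-dimensional form, that colour-change trick is nothing more than taking the difference between neighbouring pixels- wherever the brightness jumps, there’s an edge. (This is a bare-bones sketch; real edge detectors such as the Sobel or Canny filters work in two dimensions and have to cope with noise.)

```python
# One row of grayscale pixels: dark background, a bright object, dark again
row = [0, 0, 0, 9, 9, 9, 0, 0]

# An 'edge' is simply a big brightness change between neighbouring pixels
edges = [abs(b - a) for a, b in zip(row, row[1:])]
print(edges)  # the spikes mark the object's left and right edges
```

The two spikes in the output pick out exactly where the object begins and ends- repeat the idea across every row and column of a picture and the outlines of objects start to appear.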

But all that information inputting, all that work… so your computer can identify just a monitor? What about the myriad other things our brains can recognise with such ease- animals, buildings, cars? And we haven’t even got on to differentiating between different types of thing yet… how will we ever match the human brain?

This realisation was a big setback for the development of modern AI- so far we have been able to develop AI that allows one computer to handle a few real-world tasks or applications very well (and in some cases, depending on the task’s suitability to the computational mind, better than humans can), but scientists and engineers were presented with a monumental challenge when faced with the prospect of coming close to the human mind (let alone its body) in anything like the breadth of tasks it is able to perform. So they went back to basics, and began to think about exactly how humans are able to do so much stuff.

Some of it can be put down to instinct, but a far bigger part comes down to learning. The human mind is especially remarkable in its ability to take in new information and learn new things about the world around it- and then to take this new-found information and apply it to our own bodies. Not only can we do this, but we can do it remarkably quickly- it is one of the main traits which has pushed us forward as a race.

So this is what inspires the current generation of AI programmers and roboticists- the idea of building into a robot’s design a capacity for learning. The latest generation of the Japanese ‘Asimo’ robots can learn what various objects presented to it are, and is then able to recognise them when shown them again- as well as having the best-functioning humanoid chassis of any existing robot, being able to run and climb stairs. Perhaps more exciting is a pair of robots currently under development that start pretty much from first principles, just like babies do- first they are presented with a mirror and learn to manipulate their leg motors in such a way as allows them to stand up straight and walk (although they aren’t quite so good at picking themselves up if they fail in this endeavour). They then face one another and begin to demonstrate and repeat actions to one another, giving each action a name as they do so. In doing this they build up an entirely new, if unsophisticated, language with which to make sense of the world around them- currently this covers just actions, but who knows what lies around the corner…
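At its barest, that ‘show an object, name it, recognise it later’ trick can be sketched as storing labelled examples and matching anything new to the nearest stored memory. This is a hypothetical nearest-neighbour toy, not a description of Asimo’s actual method, and the ‘features’ here are entirely made up:

```python
memory = {}  # name -> feature tuple

def learn(name, features):
    """Store a labelled example."""
    memory[name] = features

def recognise(features):
    """Return the name of the closest remembered example."""
    return min(memory, key=lambda name: sum((a - b) ** 2
               for a, b in zip(memory[name], features)))

learn("cup",  (0.9, 0.2))   # made-up features: roundness, height
learn("book", (0.1, 0.6))
print(recognise((0.8, 0.3)))  # the closest memory is the cup
```

Crude as it is, the shape of the idea is there: the robot is never told what a cup is in advance- it simply remembers what it has been shown and matches new experience against old.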