Alternative Marketing

Extra Credits is one of my favourite online productions: what started out as a couple of animated lectures on videogames as art, written by then-student Daniel Floyd and posted on YouTube, has now attracted a huge fan base of gamers wishing to better understand videogames as a form of artistic media. Nowadays the show is hosted by Floyd, utilises the art services of LeeLee Scaldaferri and Scott deWitt, and its content comes straight from the mind of James Portnow, one of the videogame industry’s leading lights when it comes to advancing games as a respected form of media and art. It provides intelligent yet easy-to-understand discussion on a topic too frequently ignored and trivialised by gamers and the general public alike, and its existence is a boon to the gaming world.

However, a while back they produced an episode that I found particularly interesting. Creative Assembly, the developers behind the hugely successful Total War franchise, apparently had some money left over in the marketing budget for their latest game, Total War: Rome II, and offered to subcontract the Extra Credits team (with their old art maestro Allison Theus) to make a few episodes about the Punic Wars, possibly the single most crucial series of events in Rome’s rise to power. They weren’t asked to talk about the Total War franchise or Rome II at all, or even so much as mention videogames; just to make some short historical lectures in the engaging style that has made them so successful. The only reason I know of this origin story is because they deliberately chose to mention it in their intro.

As a marketing tactic, hiring somebody not to talk about the content of your game is a somewhat strange one, at least on the surface of it, but when one works backwards from the end goal of marketing, Creative Assembly’s tactic starts to seem more and more clever. The final aim of games marketing is, of course, to make more people buy your game, which generally takes one of two forms: the creation, expansion and maintenance of a core fanbase who will always buy your game and will do their own viral marketing for you, and the attraction of buyers (both new and returning) outside this core bracket. The former area is generally catered for by means of convention panels, forums, Facebook groups and such, whilst the latter is what we are interested in right now.

Generally, attempting to attract ‘non-core’ buyers in the gaming world takes the form of showing off big, flashy adverts and gameplay demonstrations, effectively saying ‘look at all the stuff our game can do!’ amidst various bits of marketing jargon. However, gameplay features alone aren’t everything, and there is a growing body of evidence to suggest that, for many gamers (compulsive Call of Duty players perhaps being an exception), story is just as important a consideration as gameplay features. For a game such as the Total War series, where there is no predefined story and a distinct lack of character interaction*, one might think that this consideration becomes irrelevant, but it nonetheless demonstrates a key point: players’ core motivations frequently have little to do with the gameplay features that form the bulk of most marketing material.

For a Total War game, the key motivating factor is a power fantasy: the dream of controlling the entire world at the head of one of the greatest empires in history, and of winning epic battles against great odds. From here we can dissect the motivation further: the thrill of victory in some great, decisive battle against your nemesis comes not just from the victory itself, but also from the idea of the player’s skill allowing them to outsmart the enemy and overcome the no doubt overwhelming odds. The dream of dominion over all Europe and beyond is partly satisfying for the sense of power it alone generates, but this sense of achievement is enhanced when one knows it is being played out against some great historical background, full of its own great stories, giving it context and allowing it to carry even more weight. In Rome II, for example, you have the option to emulate or even surpass the achievements of the mightiest Roman generals and emperors, placing yourself on a par with Scipio and the various Caesars; alternatively, you can play as another faction and overcome what history tells us is one of the greatest empires and most unstoppable military forces ever to walk the earth. You can literally change the course of history.

One might ask, therefore, why marketeers don’t focus more on these aspects of the games, and to an extent they do; adverts for games such as the Total War franchise are frequently filled with inspiring messages along the lines of ‘Lead your nation to victory!’ or ‘Crush all who dare oppose you!’. But the very format of an advert makes really delivering on this historical power fantasy difficult; with screen time expensive and thus at a premium, there is little room to wax lyrical about any great history or to debate military tactics. A convention panel or gameplay demo can go a little further, but the usefulness of these is limited since most of the people who are going to be there will be fans of the series anyway; their main focus is community-building. And that’s where Extra Credits come in.

What Creative Assembly have realised is that Extra Credits have a large audience of gamers who are already well-indoctrinated into the habit of buying games (as some advert-viewers may not be) and who think deeply enough about their games that flashy adverts are unlikely to impress them as much as they might a more general audience. Thus, to recruit members of the EC audience as buyers, they needed to sell them on the core appeal of the campaign: the epic history surrounding the game and the player’s chance to manipulate it. And so they came up with the idea of simply educating the gaming world about this amazing piece of history, getting them interested in it and making them want to explore it through games, their favourite sort of media. The Punic Wars, too, are a masterful choice of subject matter: once commonly taught in schools (meaning there’s a pretty decent body of work analysing them to draw upon), they fell out of favour as Latin and other features of classical education began to drop out of the school system, meaning the majority of the population are unfamiliar with this epic tale of warfare on the grandest of scales. Given how relatively cheap and simple a technique it is, since it lets others do most of the legwork for you, it’s a truly clever piece of marketing. And I’m not just saying that because it’s resulted in a video I like.

*I didn’t mention it in the main post because it disrupts the flow, but even without a preset story, grand strategy games most certainly have a narrative. Indeed, the self-made stories of beating down a simultaneous rebellion and foreign invasion, and in the process gaining the moniker of ‘the Great’, are one of the main things that make me enjoy playing Crusader Kings II. There’s an entire post’s worth of discussion in the subject of videogames’ potential for fluid, non-linear storytelling, but that’s for another time.

Wub Wub

My brother has some dubstep on his iPod. To be honest, I’m not entirely sure why; he frequently says that the genre can be described less as music and more as ‘sounds’, and would be the first to claim that it’s far from a paradigm of deeply emotional musical expression. Indeed, to some, dubstep is the embodiment of everything that is wrong with 21st century music; loud, brash and completely devoid of meaning.

Personally, I find dubstep interesting more than anything else. I don’t listen to much of it myself (my musical tastes tend to be more… lyrical), but I find it inherently fascinating, purely because of the reaction people have to it. It’s a Marmite thing: people either love it or hate it, and some seem physically incapable of contemplating why others enjoy it. Or, indeed, why it’s considered music.

So, let’s take my favourite approach to the matter: an analytical perspective. The songs that people remember, that are catchy, that stick in the mind, that become old favourites and/or that the pop music industry attempts to manufacture in bulk, tend to have simple, regular and easily recognisable beats that one can tap or bounce along to easily. Their pace and rhythm, too, tend to be fairly standard, often being based approximately around the 70bpm rhythm of the human heart (or some multiple thereof). Overlaying these simple rhythms, usually based around a drumbeat (its strength depending on genre), we tend to find simple melodies with plenty of repeating patterns; think of how the pattern of a verse or chorus will usually be repeated multiple times throughout a song, and how even the different lines of a verse will often follow the same tune. And then there are lyrics; whilst many genres, particularly jazz, may have entire libraries of purely instrumental pieces, few of these (film soundtracks excepted) have ever gained mainstream cultural impact. Lyrics are important; they allow us to sing along, which makes a song stick in our heads more effectively, and they can allow a song to carry meaning too. Think of just about any famous, popular song (Bohemian Rhapsody excepted; NOBODY can explain why that is both so popular and so awesome), and chances are it’ll have several of the above features. Even rap, where music is stripped down to its barest bones, bases itself around a strong, simple rhythm and a voice-dictated melody (come to think of it, rap is basically poetry for the modern age… I should do a post on that some time).

Now, let’s compare that analysis with probably the most famous piece of dubstep around: Skrillex’s ‘Bangarang’. Bring it up on YouTube if you want to follow along. Upon listening to the song, the beat is the first thing that becomes apparent; timing it, I get a pace of 90bpm, the same rate as a fast walking pace or a fast, excited heartbeat: a mood that fits perfectly with the intentions of the music. It’s meant to excite, to get the blood pumping, to infuse the body with the beat, and to inspire excitement and a party atmosphere. The music is structured around this beat, but there is also an underlying ‘thump’, similar to the bass drum of a drumkit, just to reinforce the point. Then we come to the melody; after an intro that reminds me vaguely of something the Jackson 5 may once have done (judging purely from something I heard on the radio), we begin to layer over this underlying sound. This is a common trick employed across all genres: start with something simple and build on top of it, in terms of both scale and noise. The music industry has known for a long time that loudness is compelling, hooks us in and sells records, hence the trend over the last few decades for steadily increasing loudness in popular music… but that’s for another time. Building loudness encourages us to stick with a song, drawing us into it. The first added layer is a voice, not only giving us something to (after a fashion, since the words are rather unclear and almost meaningless) sing along to and recognise, adding another hook for us, but also offering an early example of a repeated lyrical pattern: we have one two-line couplet repeated four times, with more layers of repeated bassline patterns being successively added throughout, and the two lines of said couplet only differ in the way they end. Again, this makes it easy and compelling to follow. The words are hard to make out, but that doesn’t matter; one is supposed to just feel the gist, get into the rhythm of it. The words are just another layer. This portion of the song takes on a ‘verse’ role for the rest of it, as it is repeated several more times.
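
Incidentally, timing a song’s bpm like this is just arithmetic on the gaps between beats. Here’s a minimal sketch of the calculation; the tap timestamps are hypothetical, chosen to illustrate a ~90bpm track like this one.

```python
# Estimate tempo from the times (in seconds) at which you tap along
# to the beat: average the gap between taps, then convert to bpm.
tap_times = [0.00, 0.67, 1.33, 2.00, 2.67, 3.33, 4.00]

intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
avg_interval = sum(intervals) / len(intervals)
bpm = 60 / avg_interval

print(f"Estimated tempo: {bpm:.0f} bpm")  # prints ~90 for these taps
```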

And then we hit the meat and drink of the song; with the word ‘Bangarang’, everything shifts into a loud mesh of electronic sounds passed several times through an angle grinder. However, the beat (although pausing for emphasis at the moment of transition) remains the same, carrying over some consistency, and we once again find ourselves carried along by repeated patterns, both in the backing sounds and in the lyrics that are still (sort of) present. It’s also worth noting that the melody of the electronica pulses in time to the beat, enabling a listener/partygoer to rock to both beat and melody simultaneously, getting the whole body into it. This is our ‘chorus’: we again have repeating stanzas for a while, but we return to our verse (building once again) after a while so we don’t get bored of the repetition. Then chorus again, and then a shift in tone; another common trick employed across all genres to hold our interest. We have a slight key change up, and our melody is taken over by a new, unidentified instrument/noise. We still have our original sound playing in the background to ensure the shift is not too weirdly abrupt, but this melody, again utilising short, repeated units, is what takes centre stage. We then have another shift, to a quiet patch, still keeping the background. Here the muted sounds offer us time for reflection and preparation; the ‘loud-soft-loud’ pattern was one used extensively by Nirvana, The Pixies and other bands of the grunge era during the 1990s, and continues to find a place in popular music to this day. We have flashes of loudness, just to whet our appetites for the return to chaos that is to come, and then we start to build again (another repeating pattern, you see, this time built into the structure of the song). The loudness returns and then, just because this kind of thing doesn’t have a particularly natural end, we leave on some more unintelligible, distorted lyrics; because finishing on a lone voice isn’t just for ‘proper’ bands like, just off the top of my head, Red Hot Chili Peppers.

Notice how absolutely every one of those features I identified can be linked to other musical genres; the same kinds of things are present, admittedly in different formats, across the full spectrum of the musical world. The only real difference is that, in place of the voices & guitars of more traditional genres, dubstep favours entirely electronic sounds made on a computer; in that respect, combined with its unabashed loudness and party spirit, it is the perfect musical representation of the 21st century thus far. In fact, the only thing on my original list that it lacks is a strong lyrical focus; in that respect, I feel that it is missing a trick, and that it could use a few more intelligible words to carry some meaning and become more recognised as A Thing. Actually, after listening to that song a few times, it reminds me vaguely of The Prodigy (apologies to any fans who are offended by this; I don’t listen to them much), but maybe that’s just me. Does all this mean dubstep is necessarily ‘good’ as a musical genre? Of course not; taste is purely subjective. But to say that dubstep is merely noise, and that there is no reason anyone would listen to it, misses the point; it pulls the same tricks as every other genre, and they all have fans. No reason to begrudge them a few of those.

“Oh man, you gotta see this video…”

Everyone loves YouTube, or at least the numbers suggest so; the total number of site views it has racked up must be in the low trillions, the most popular video on the site (of course it’s still Gangnam Style) has over one billion views, and YouTube has become so ubiquitous that if a video of something cannot be found there, it probably doesn’t exist.

Indeed, YouTube’s ubiquity is perhaps the most surprising, or at least the most interesting, thing about it; YouTube is certainly not the only large-scale video hosting site, and wasn’t even the first, being launched in 2005, a year after Vimeo (the only other such site I am familiar with) and well after several others had made efforts at video-sharing. It was the brainchild of three early employees of PayPal: Chad Hurley, Jawed Karim and Steve Chen. According to a commonly reported story (one frequently claimed to be untrue), the three had recorded video at a dinner party but were having difficulty sharing it online, so, being reasonably gifted programmers, they decided to build the service themselves. What actually happened has never really been confirmed, but the first video (showing Karim at San Diego Zoo; yes, perhaps it wasn’t the most auspicious start) went up in April 2005, and the rest, of course, is history.

To some, YouTube’s meteoric rise might be considered surprising, or simply the result of good fortune favouring it over some other site. Indeed, given that Apple devices were for a long time unable to display videos using the Adobe Flash format the site relied on, it’s remarkable (and a testament to Microsoft’s dominance of the PC market for so many years) that the site was able to take off as it did. However, if one looks closely then it isn’t hard to identify the hallmarks of a business model that was born to succeed online, one that bears a striking resemblance to the story of Facebook: something that started purely as a cool idea for a website and considered monetisation a secondary priority to be dealt with when it came along. The audience was the first priority, and everything was geared to maximising the ability of users to both share and view content freely. Videos didn’t (and still don’t) have to be passed or inspected before being uploaded to the site (although anything flagged by users as inappropriate will be watched and taken down if the moderators see fit), there is no limit on the amount that can be watched or uploaded by a user, and there is never any need to pay for anything. YouTube understands the most important thing about the internet: it is a place with an almost infinite supply of stuff and a finite number of users willing to surf around and look for it. This makes the value of content to a user very low, so everything must be done to attract ‘customers’ before one can worry about such pesky things as money. YouTube is a place of non-regulation, of freedom; no wonder the internet loves it.

The proof of the pudding is, of course, in the money; even as early as November 2005, Sequoia Capital had enough faith in the company (along with superhuman levels of optimism and sheer balls) to invest over $11 million in it. Less than a year later, YouTube was bought by Google, the past masters at knowing how the internet works, for $1.65 billion. Given that Sequoia’s comparatively meagre investment is estimated to have netted them a 30% share by April 2006, this suggests the company’s value increased over 40 times in well under a year. That ballsy investment has proved a very, very profitable one, but some would argue that even this massive (and very quickly made) whack of cash hasn’t proved worth it in the long run. After all, less than two years after he was offered $500,000 for Facebook, Mark Zuckerberg’s company was worth several billion and still rising (it’s currently valued at $11 billion, after that messy stock market flotation), and YouTube is now, if anything, even bigger.
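
For the sceptical, the back-of-the-envelope arithmetic behind that claim runs roughly as follows, treating the reported ~$11 million for a ~30% stake as approximations:

```python
# Rough check of the valuation arithmetic quoted above.
investment = 11e6     # Sequoia's reported investment, USD
stake = 0.30          # approximate share of the company it bought
sale_price = 1.65e9   # Google's 2006 purchase price, USD

implied_valuation = investment / stake        # ~$37M at time of investment
growth_multiple = sale_price / implied_valuation

print(f"Implied valuation at investment: ${implied_valuation / 1e6:.0f}M")
print(f"Growth multiple by the Google sale: {growth_multiple:.0f}x")  # ~45x
```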

It’s actually quite hard to visualise just how big a thing YouTube has now managed to become, but I’ll try: every second, roughly one hour of footage is uploaded to the site, or to put it another way, you would have to watch continually for the next three and a half millennia just to get through the stuff published this year. Even watching just the ones involving cats would be a full-time job. I occasionally visit one channel with more than one and a half thousand videos, each around 20 minutes long, published by just one guy, and there are several thousand people across the world who are able to make a living through nothing more than sitting in front of a camera and showing their antics to the world.
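
That millennia figure sounds made up, but it follows directly from the upload rate; a quick sanity check, taking the quoted rate of one hour of footage per second at face value:

```python
# One hour of footage uploaded per second means footage accumulates
# 3,600 times faster than anyone can watch it.
seconds_per_year = 365 * 24 * 3600
hours_uploaded_per_year = seconds_per_year * 1  # one hour per second

# Watching non-stop gets through 24 hours of footage per day:
years_to_catch_up = hours_uploaded_per_year / (24 * 365)

print(f"Years of non-stop viewing for one year's uploads: {years_to_catch_up:,.0f}")
# -> 3,600 years, i.e. three and a half millennia, give or take
```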

Precisely because of this, the very concept of YouTube has not infrequently come under fire. In much the same way as with social networking sites, the free and open nature of YouTube means everything is on show for the whole world to see, so that video of your mate doing this hilarious thing while drunk one time could, at best, make him the butt of a few jokes among your mates or, at worst, subject him to large-scale public ridicule. For every TomSka, beloved by his followers and able to live off YouTube-related income, there is a Star Wars Kid who, after having the titular video put online without his permission, was forced to seek psychiatric help for the bullying and ridicule he became the victim of, and who launched a high-profile lawsuit against his antagonists. Like so many things, YouTube is neither beneficial nor detrimental to humanity on its own; it is merely a tool of our modern world, and to what degree of awesomeness or depravity we exploit it is down purely to us.

Sorry about that; wasn’t really a conclusion, was it?

“The most honest three and a half minutes in television history”

OK, I know this should have been put up on Wednesday, but I wanted to get this one right. Anyway…

This video appeared on my Facebook feed a few days ago, and I have been unable to get it out of my head since. It is, I am told, the opening scene of a new HBO series (The Newsroom), and since HBO’s most famous product, Game of Thrones, is famously the most pirated TV show on earth, I hope they won’t mind me borrowing another three-minute snippet too much.

OK, watched it? Good, now I can begin to get my thoughts off my chest.

This video is many things; to me, it is quite possibly one of the most poignant and beautiful things ever filmed, and in many ways the best summary of greatness ever put to screen. It is inspiring, it is blunt, it is great television. It is not, however, “The most honest three and a half minutes of television, EVER…” as claimed in its title; there are a lot of things I disagree with in it. For one thing, I’m not entirely sure of our protagonist’s reasons for saying ‘liberals lose’. If anything, the last century can be viewed as one long series of victories for liberal ideology: women have been given the vote, homosexuality has been decriminalised, racism has steadily been dying out, gender equality is advancing year by year and only the other day the British government legalised gay marriage. His viewpoint may have something to do with features of American politics that I’m missing, particularly his reference to the NEA (an organisation which I do not really understand), but even so. I’m basically happy with the next few seconds; I’ll agree that claiming to be the best country in the world based solely on rights and freedoms is not something that holds water in our modern, highly democratic world. Freedom of speech, information, press and so on are, to most eyes, prerequisites for any country wishing to have a claim to true greatness these days, rather than the scale against which such greatness is judged. Not entirely sure why he’s putting so much emphasis on the idea of a free Australia and Belgium, but hey ho.

Now, blatant insults to the intelligence of the questioner aside, we then start to quote statistics; always a good foundation for any political discussion. I’ll presume all his statistics are correct, so plus points there, but I’m surprised that he apparently didn’t notice that one key area America does lead the world in is size of economy; China is still, much to its chagrin, in second place on that front. However, I will always stand up for the viewpoint that economy does not equal greatness, so I reckon his point still stands.

Next, we move on to insulting 20-year-old college students, not too far off my own social demographic; as such, this is a generation I feel I can speak on with some confidence. This is probably the biggest problem I have with anything said during this little clip: no justification is offered as to why this group is the “WORST PERIOD GENERATION PERIOD EVER PERIOD”. Plenty of reasons for this opinion have been suggested in the past by other commentators, and these may or may not be true; but making assumptions and insults about a person based solely on their date of manufacture is hardly the most noble of activities. In any case, in the age of the internet and mass media, a lot of the world’s problems, with the younger generation in particular, get somewhat exaggerated… but no Views here, bad Ix.

And here we come to the meat of the video: the long, passionate soliloquy containing all the message and poignancy of the piece, with suitably beautiful backing music. But what he comes out with could still be argued against by an equally vitriolic critic; no time frame for when America genuinely was ‘the greatest country in the world’ is ever given. Earlier, he attempted to justify non-greatness by way of statistics, but his choice of language in his ‘we sure as hell used to be great’ passage appears to hark back to the days of Revolutionary-era and Lincoln-era America, when the country was led by the ‘great men’ he refers to. But if we look at these periods, the statistics don’t add up anywhere near as well; America didn’t become the world-dominating superpower with the stated ‘world’s greatest economy’ it is today until after making a bucketload of money from the two World Wars (America only became, in the words of then-President Calvin Coolidge, ‘the richest country in the history of the world’, during the 1920s). Back in the periods when American heroes were born, America was a relatively poor country, consisting of vast expanses of wilderness, hardline Christian motivation, an unflinching belief in democracy, and an obsession with the American spirit of ‘rugged individualism’ that never really manifested itself in any super-economy until the country became able to loan everyone vast sums of money to pay off war debts. And that’s not all; he makes mention of ‘making war for moral reasons’, but of the dozens of wars America has fought only two are popularly thought of as morally motivated. These were the American War of Independence, which was declared less for moral reasons and more because the Americans didn’t like being taxed, and the American Civil War, which ended with the southern states being legally allowed to pass the ‘Jim Crow’ laws that limited black rights until the 1960s; here they hardly ‘passed laws, struck down laws for moral reasons’. Basically, there is no period of history in which all his justifications for why America was once ‘the greatest country in the world’ actually stand up at once.

But this, to me, is the point of what he’s getting at; during his soliloquy, a historical period of greatness is never so much defined as a model and hope for greatness is presented. Despite all his earlier quoting of statistics and ‘evidence’, they are not what makes a country great. Money, and the power that comes with it, are not defining features of greatness, just stuff that makes doing great things possible. The soliloquy, intentionally or not, aligns itself with the Socratic idea of justice: that a just society is one in which every person concerns himself with doing his own, ideally suited, work, and does not try to be a busybody doing someone else’s job for him. Exactly how Socrates arrives at this conclusion is somewhat complex; Plato’s Republic gives the full discourse. The soliloquy applies this idea to political parties: defining ourselves by our political stance is a self-destructive idea, meaning all our political system ever does is bicker at itself rather than just concentrating on making the country a better place. Also mentioned is the idea of ‘beating our chest’, the kind of arrogant self-importance that further prevents us from seeking to do good in this world, and the equally destructive habit of belittling intelligence, which prevents us from bringing in the artistic and technological breakthroughs that make our world so awesome, and from making it a better, more righteous place. For, as he says so eloquently, what really makes a country great is to be right. To be just, to be fair, to mean something and above all to stand for something. To be willing to make sacrifices for the greater good, to back our promises and ideals, and to care, above all else, simply for what is right.

You know what, he put it better than I ever could analyse. I’m just going to straight up quote him:

“We stood up for what was right. We fought for moral reasons, we passed laws, struck down laws for moral reasons, we waged wars on poverty not poor people. We sacrificed, we cared about our neighbours, we put our money where our mouths were and we never beat our chest. We built great big things, made ungodly technological advances, explored the universe, cured diseases and we cultivated the world’s greatest artists and the world’s greatest economy. We reached for the stars, acted like men- we aspired to intelligence, we didn’t belittle it, it didn’t make us feel inferior. We didn’t identify ourselves by who we voted for in the last election and we didn’t scare so easy.”

Maybe his words don’t quite match the history; it honestly doesn’t matter. The message of that passage embodies everything that defines greatness, ideas of morality and justice and doing good by the world. That statement is not harking back to some mythical past, but a statement of hope and ambition for the future. That is beauty embodied. That is greatness.

The Six Nations Returns…

…and with it my weekly awards ceremony for the weekend’s matches, as last year. To be honest, I haven’t had much time to think about these, enthralled as I was with the actual games (over 150 points and 17 tries scored; absolutely fantastic stuff); I think I’ll just dive straight in with the first match of the weekend.

First, we must turn to WALES, who take the dubious honour of the Year-Long Nostalgia Award for Most Dramatic Fall From Grace, reclaiming a title they won in both 2006 and 2009. Last year the Welsh, after a proud performance at the World Cup the previous autumn, had their ranks positively blooming with talent and good form. Behind the scrum, Rhys Priestland was still hanging on to some of his outstanding 2011 form, Jamie Roberts was in the kind of hard-running, defence-busting mode that won him three Lions caps in 2009, George North (alongside, to a lesser extent, Alex Cuthbert) was terrorising defences through a mixture of raw speed and power, and Jonathan Davies’ smooth running and handling in the centres was causing him to be mentioned in the same breath as New Zealand’s great Conrad Smith. The team seemed unstoppable, battering, bludgeoning and otherwise smashing all who came before them as they romped home to the Grand Slam.

And then the slide began. Since they took the title against the French 11 months ago, Wales have lost eight games on the trot, of which Saturday’s display against Ireland was only the most recent. After some pretty dire performances against the southern hemisphere sides during the summer, a few traces of hope were salvaged during the autumn from close losses to the likes of Australia. Some of the more optimistic Welsh fans thought that the Six Nations might signal a return to form for their players; but the opening match against Ireland proved unforgiving. The Irish put 30 points past the Welsh in 50 minutes with only 3 in reply, and although Wales mounted a spirited comeback it all proved too much, too late.

On, then, to IRELAND; more specifically to left winger Simon Zebo, who takes the Nyan Cat Award for Most YouTube-Worthy Moment, inherited from George North in this fixture last year. Whilst North’s little moment of hilarity was typical of a player whose size and strength are his greatest assets, Zebo’s piece of magic was a more mercurial bit of skill. After Dan Biggar (the Welsh fly half) decided, for reasons best known to himself, to aim a kick straight at the face of the onrushing Rory Best, the Irishman managed to gather the ball on the rebound and set off for the line. Realising he was being pushed for space, he elected to throw a beautiful long pass out to captain Jamie Heaslip. If Heaslip could flick the ball to Zebo, sprinting along his outside, there was a fair chance that the winger could make the corner; but the skipper was under pressure and could only manage a flick off his knees. The pass was poor, thrown at knee height about a metre behind the onrushing winger, and most moves would have ended there with a loose ball. But Zebo produced a truly magical piece of skill; as the ball seemed destined to disappear behind him, he turned and flicked at it deftly with his left heel, before gathering the ball one-handed and continuing his run, all without breaking stride. He may not have got the try, but from his bit of sublimity prop Cian Healy did, and thus Zebo will be forever honoured in the hall of fame that is YouTube.

Onto Saturday’s second match and SCOTLAND, proud takers of the Holy Shit, How Did That Happen Award for Biggest Disparity Between Score and Performance. In all honesty, the Scots were always going to struggle against their English opponents; Calcutta Cups are ripe for upsets, it’s true, and there’s nothing the Scots like better than being cast as the underdog, but they had not won at Twickenham for 30 years and the current team was probably not in the best shape to break that duck. A new side under a new coach (Scott Johnson), they had taken last place and the wooden spoon in last year’s Six Nations, even losing rather badly to Italy, and they reached a nadir during the dire loss to Tonga that ended their autumn series and led old coach Andy Robinson to resign. By contrast, the Auld Enemy were ebullient after their emphatic win against New Zealand in November, and some smart money was being put on them to take the Six Nations title this year. And it showed during the game; for all Jim Telfer’s pre-match comments about the England side being ‘arrogant’, the young English side were clinical and efficient, winning twice as many breakdowns as the Scots, with Owen Farrell kicking everything he could get his boots on. Nonetheless, the Scots put in a pretty damn good show when they could; new winger Sean Maitland opened the game’s scoring with a neatly taken try in the corner, and fullback Stuart Hogg not only set up that try with a dazzling 60-metre break but eventually grabbed one of his own, and was probably the best back on the pitch. Johnnie Beattie was sublime in the back row, and if it wasn’t for England’s clinical territory game then they would certainly have managed a scoreline far closer than the 20 points it ended up being. We’ve all played games like that: you think you’re playing well, putting up a good fight and scoring some points, and then you look up at the scoreboard and think ‘how did that happen?’

As for ENGLAND, centre Billy Twelvetrees takes the Carlos Spencer Award for Most Impressive Debut Performance (and, incidentally, the Staff Sergeant Max Fightmaster Award for Best Name; dunno why, it’s just cool). England have in the recent past been rather good at debuts (Freddie Burns last year enjoyed a sound beating of the world champions on his first cap), and much speculation was put forward before the game as to whether the young Gloucester man could fill the sizeable hole left by the injured Manu Tuilagi. As it turned out, he did so splendidly; despite a somewhat ignominious start to his international career (ie he dropped the first ball that came his way), he spent most of the match running superb lines that repeatedly threatened the Scottish centre pairing and kept the tempo of the match nice and fast. To cap a great first performance, he even picked up England’s third try, running a typically lovely angle to seemingly pop up from nowhere and slip straight through a gap in the defence. Good stuff, and I look forward to seeing whether he can make it a habit.

And now to Sunday’s match, where FRANCE take the When Did I Get In Last Night Award for Least Looking Like They Wanted To Be On The Pitch. France are always a tricky bunch to predict, and their last visit to Rome ended in an embarrassing defeat that led coach Marc Lievremont to dub them cowards; but they’d fared the best out of all the northern hemisphere sides in the autumn, beating Australia and Argentina convincingly, and Frederic Michalak, once the French equivalent of Jonny Wilkinson, was back on form and in the No. 10 shirt. To many, a trip to face the usually table-propping Italians was the perfect warmup before the tournament really hotted up, and it seems the French may have made the mistake of thinking the Azzurri easybeats. It quickly transpired that they were not; Italy’s talismanic captain Sergio Parisse grabbed an early try courtesy of fly half Luciano Orquera, who had a stunning game, and the Italians led for most of the first half before a try from Louis Picamoles and some good kicking from Michalak put the French in front. But at no point in the game did France ever look threatening; in the first 25 minutes Italy controlled nearly 75% of the game’s possession whilst France seemed content to wait for mistakes that the Italians simply never made. They seemed lazy, lethargic, even as the precious minutes towards the end of the game ticked away, and never matched Italy’s sheer commitment and drive at the breakdown. Even when they did get good ball, the Italians’ surprisingly impressive kicking game meant they rarely had the territory to do anything with it.

As for ITALY themselves, they (and Luciano Orquera in particular) take the About Bloody Time Award for Finally Finding A Fly Half. Italy have always had strength in the pack thanks to such men as Parisse and Martin Castrogiovanni, but behind the scrum they have always lacked class. In particular, they have lacked a good kicker ever since Diego Dominguez retired, allowing teams to be ferocious at the breakdown with only a minimal risk attached to conceding penalties. Kris Burton and Orquera both tried and failed last year to ignite an Italian back division growing in strength with the achievements of Tommaso Benvenuti and Andrea Masi, but yesterday Orquera ran the show. He and Tobias Botes at scrum half kept the French pinned back with a long and effective kicking game, whilst Masi’s incisive running from full back and an energetic display from centre Luke McLean meant the French were never able to establish any sort of rhythm. With their backs to the wall and their fingers not yet pulled out, the French were sufficiently nullified to allow the Italian forwards to establish dominance at the breakdown; and with Orquera’s place kicking proving as accurate as his punts from hand, the French were punished through both the boot and the tries from Parisse and Castrogiovanni. An outstanding defensive effort to keep the French out in the final 10 minutes and two lovely drop goals from Orquera and Burton sealed the deal on a fantastic display, and the Italians can proudly say for the next two years that the frequently championship-winning French haven’t beaten them in Rome since 2009.

Final Scores: Wales 22-30 Ireland
England 38-18 Scotland
Italy 23-18 France

Drunken Science

In my last post, I talked about the societal impact of alcohol and its place in our everyday culture; today, however, my inner nerd has taken it upon himself to get stuck into the real meat of the question of alcohol, the chemistry and biology of it all, and how all the science fits together.

To a scientist, the word ‘alcohol’ does not refer to a specific substance at all, but rather to a family of chemical compounds, each containing an oxygen and hydrogen atom bonded to one another (known as an OH group) attached to a chain of carbon atoms. Different members of the family (or ‘homologous series’, to give it its proper name) have different numbers of carbon atoms, slightly different physical properties (such as melting point), and react chemically to form slightly different compounds. The stuff we drink is the one with two carbon atoms in its chain, and is technically known as ethanol.
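
For the chemically minded, the family follows a simple pattern; the general formula below is the standard textbook one, with ethanol as the second member:

```latex
% General formula for the alcohol homologous series, plus the first
% three members; ethanol (n = 2) is the one found in drinks.
\[ \text{alcohols: } \mathrm{C}_n\mathrm{H}_{2n+1}\mathrm{OH} \]
\[ \mathrm{CH_3OH}~(\text{methanol}), \quad \mathrm{C_2H_5OH}~(\text{ethanol}), \quad \mathrm{C_3H_7OH}~(\text{propanol}) \]
```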

There are a few things about ethanol that make it special stuff to us humans, and all of them come down to chemical reactions and biological interactions. The first is its formation; there are many different types of sugar found in nature (fructose & sucrose are two common examples; the ‘-ose’ ending is what denotes them as sugars), but one of the most common is glucose, with six carbon atoms. This is the substance our body converts starch and other sugars into in order to use for energy or store as glycogen. As such, many biological systems are primed to convert other sugars into glucose, and it just so happens that when glucose breaks down in the presence of the right enzymes, it forms carbon dioxide and an alcohol; ethanol, to be precise. The overall process is known as fermentation (strictly speaking, glycolysis is only its first stage, breaking the glucose down into pyruvate, which is then converted into the ethanol and carbon dioxide).
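
The overall reaction is the classic one from school chemistry; each glucose molecule yields two molecules of ethanol and two of carbon dioxide:

```latex
% Overall fermentation of glucose by yeast enzymes.
\[ \mathrm{C_6H_{12}O_6} \;\xrightarrow{\text{yeast enzymes}}\; 2\,\mathrm{C_2H_5OH} + 2\,\mathrm{CO_2} \]
```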

Yeast performs this process in order to respire (ie produce energy) anaerobically (in the absence of oxygen), leading to the two most common situations in which the reaction occurs. The first we know as brewing, in which an anaerobic atmosphere is deliberately produced to make alcohol; the other occurs when baking bread. The yeast we put in the bread causes the sugar (ie glucose) in it to produce carbon dioxide, which makes the bread rise as it fills with gas, whilst the ethanol tends to boil off in the heat of the baking process. For industrial purposes, ethanol is instead made by hydrating (reacting with water) an oil by-product called ethene, but the product isn’t generally something you’d want to drink.
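
For completeness, that industrial route is the standard catalytic hydration of ethene:

```latex
% Industrial ethanol: ethene and steam over a phosphoric acid catalyst.
\[ \mathrm{C_2H_4} + \mathrm{H_2O} \;\xrightarrow{\mathrm{H_3PO_4}\ \text{catalyst, heat, pressure}}\; \mathrm{C_2H_5OH} \]
```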

But anyway, back to the booze itself, and this time what happens upon its entry into the body. Exactly why alcohol acts as a depressant and intoxicant (if that’s a proper word) comes down to a very complex interaction with various parts and receptors of the brain that I am not nearly intelligent enough to understand, let alone explain. However, what I can explain is what happens when the body gets round to breaking the alcohol down and getting rid of the stuff. This takes place in the liver, an amazing organ that performs hundreds of jobs within the body and contains a vast repertoire of enzymes. One of these is known as alcohol dehydrogenase, which has the task of oxidising the alcohol (not a simple task, and one impossible without enzymes) into something the body can get rid of. However, the ethanol we drink is what is known as a primary alcohol (meaning the OH group is on the end of the carbon chain), and this causes it to oxidise in two stages, only the first of which can be done using alcohol dehydrogenase. This first stage converts the alcohol into an aldehyde (with an oxygen chemically double-bonded to the carbon where the OH group was), which in the case of ethanol is called acetaldehyde (or ethanal). This molecule cannot be broken down straight away, and instead gets itself lodged in the body’s tissues in such a way (thanks to its shape) as to produce mild toxins, activate our immune system and make us feel generally lousy. This is also known as having a hangover, and only ends when the body is able to complete the second stage of the oxidation process and convert the acetaldehyde into acetic acid, which the body can get rid of relatively easily. Acetic acid is commonly known as the active ingredient in vinegar, which is why alcoholics smell so bad and are often said to be ‘pickled’.
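
Put as a chain, the two-stage oxidation looks like this (ALDH, aldehyde dehydrogenase, is the enzyme that handles the second stage):

```latex
% Two-stage oxidation of ethanol in the liver.
\[ \underbrace{\mathrm{C_2H_5OH}}_{\text{ethanol}} \;\xrightarrow{\text{alcohol dehydrogenase}}\; \underbrace{\mathrm{CH_3CHO}}_{\text{ethanal (acetaldehyde)}} \;\xrightarrow{\text{ALDH}}\; \underbrace{\mathrm{CH_3COOH}}_{\text{acetic acid}} \]
```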

This process occurs in the same way when other alcohols enter the body, but ethanol is unique in how harmless (relatively speaking) its aldehyde is. Methanol, for example, can also be oxidised by alcohol dehydrogenase, but the aldehyde it produces (officially called methanal) is commonly known as formaldehyde: a highly toxic substance, used in preservation work and as a disinfectant, that will quickly poison the body. It is for this reason that methanol is present in the fuel commonly known as ‘meths’; ethanol actually produces more energy per gram and makes up 90% of the fuel by volume, but since meths is cheaper than most alcoholic drinks, the toxic methanol is added to prevent it being drunk by severely desperate alcoholics. Not that it stops many of them; methanol poisoning is a leading cause of death among the homeless.

Homeless people were also responsible for a major discovery in the field of alcohol research, concerning the causes of alcoholism. For many years it was thought that alcoholics were addicts mentally rather than biologically, and had just ‘let it get to them’, but some years ago a young student (I believe she was Canadian, but certainty of that fact and her name both escape me) was looking for some fresh cadavers for her PhD research. She went to the police and asked if she could use the bodies of the various dead homeless people whom they found on their morning beats, and when she started dissecting them she noticed signs of a compound known to be linked to heroin addiction. She mentioned to a friend that all these people appeared to be on heroin, but her friend pointed out that these people barely had enough to buy drink, let alone something as expensive as heroin. This young doctor-to-be realised she might be onto something, changed the focus of her research to studying how alcohol was broken down by different bodies, and discovered something quite astonishing. Inside serious alcoholics, ethanol was being broken down into this substance previously linked only to heroin addiction, leading her to believe that, for some unlucky people, the behaviour of their bodies made alcohol as addictive to them as heroin was to others. Whilst this research has by no means settled the issue, it did demonstrate two important facts: firstly, that whilst alcoholism certainly has some links to mental issues, it is also fundamentally biological and genetic in nature, and cannot be put down solely to a fault in the victim’s brain. Secondly, it ‘sciencified’ (my apologies to grammar nazis everywhere for making that word up) a fact already known by many reformed drinkers: that when a former alcoholic stops drinking, they can never go back. Not even one drink. There can be no ‘just having one’, or drinking socially with friends, because if one more drink hits their body, deprived for so long, there’s a very good chance it could kill them.

Still, that’s no reason to get totally down about alcohol, for two very good reasons. The first comes from some (admittedly rather spurious) research suggesting that ‘addictive personalities’, including alcoholics, are far more likely to do well in life, have good jobs and generally succeed; alcoholics are, by nature, present at the top as well as the bottom of our society. The other concerns the one bit of science I haven’t tried to explain here: your body is remarkably good at dealing with alcohol, and we all know it can make us feel better, so if only for your mental health a little drink now and then isn’t all bad. And anyway, it makes for some killer YouTube videos…

Bouncing horses

I have, over recent months, built up a rule against writing posts about individual YouTube videos, partly on the grounds that it’s bloody hard to make a full post out of them, but also because there are most certainly a hell of a lot of good ones out there that I haven’t heard of, so any discussion of them is sure to be incomplete and biased, which I try to avoid wherever possible. Normally, this blog also rarely delves into what might be even vaguely dubbed ‘current affairs’, but since it regularly does discuss the weird and wonderful world of the internet and its occasional forays into the real world, I thought I might make an exception; today, I’m going to be talking about Gangnam Style.

Now officially the most-liked video in the long and multi-faceted history of YouTube (taking over from the previous record holder and a personal favourite, LMFAO’s Party Rock Anthem), this music video by Korean rapper & pop star PSY was released over two and a half months ago, and for the majority of that time it lay in some obscure and foreign corner of the internet. Then, in that strange way that random videos, memes and general bits and pieces are wont to do online, it suddenly shot to prominence thanks to the web collectively pissing itself over the sight of a chubby Korean bloke in sunglasses doing ‘the horse riding dance’. Quite how this was even discovered by some casual YouTube-surfer is something of a mystery to me, given that said dance doesn’t even start for a good minute and a half or so, but the fact remains that it was, and that it is now absolutely bloody everywhere. Only the other day it became the first ever Korean single to reach no. 1 in the UK charts, despite not having been translated from its original language, and it has even prompted a dance-off between rival Thai gangs prior to a gunfight. Seriously.

Not that it has met with universal appeal, though. I’m honestly surprised that more critics didn’t get up in their artistic arms at the sheer ridiculousness of it, and at the apparent lack of reason for it to enjoy the degree of success it has (although quite a few probably got that out of their system after Call Me Maybe), but several did nonetheless. Some have called it ‘generic’ in musical terms, others have found its general ridiculousness more tiresome and annoying than fun, and one Australian journalist commented that the song “makes you wonder if you have accidentally taken someone else’s medication”. That such criticism has been fairly limited can be partly attributed to the fact that the song itself is actually intended as a parody. Gangnam is a classy, fashionable district of the South Korean capital Seoul (PSY has likened it to Beverly Hills in California), and ‘Gangnam style’ is a Korean phrase referring to the kind of lavish & upmarket (if slightly pretentious) lifestyle of those who live there; or, more specifically, of the kind of posers & hipsters who claim to affect it. The song’s self-parody comes from the contrast between PSY’s lyrics, written from the first-person perspective of such a poser, and his deliberately ridiculous dress and dance style.

Such an act of deliberate self-parody has certainly helped to win plaudits from serious music critics, who have found themselves surprisingly good-humoured once told that the ridiculousness is deliberate and therefore actually funny. However, it’s almost certainly not the reason for the video’s over 300 million YouTube views, most of which surely come from people who’ve never heard of Gangnam and certainly have no idea of the people PSY is mocking. In fact, several different theories have been proposed as to why its popularity has soared quite so violently.

Most point to PSY’s very internet-friendly position on his video’s copyright. The Guardian claim that PSY has in fact waived his copyright to the video, but what is certain is that he has declined to take any legal action over the dozens of parodies and alternate versions of his video, allowing others to spread the word in their own, unique ways and giving it enormous potential to spread, and spread far. These parodies have been many and varied in content, author and style, ranging from the North Korean government’s version aimed at satirising the South Korean politician Park Geun-hye (breaking their own world record for most ridiculous entry into a political pissing contest, especially given that it mocks her supposed devotion to an autocratic system of government, one moreover that ended over 30 years ago), to the apparently borderline racist “Jewish Style” (neither of which I have watched, so cannot comment on). One parody has even sparked a quite significant legal case, with 14 California lifeguards being fired for filming, dancing in, or even appearing in the background of their parody video “Lifeguard Style”; an investigation has since been launched by the City Council in response to the thousands of complaints and suggestions, one even from PSY himself, that the local government were taking themselves somewhat too seriously.

However, by far the most plausible reason for the mammoth success of the video is also the simplest: people simply find it funny as hell. Yes, it helps a lot that the joke was entirely intended (let’s be honest, he probably couldn’t have come up with quite such inspired lunacy by accident), and yes, it helps how easily it has been able to spread, but to be honest the internet is almost always able to overcome such petty restrictions when it finds something it likes. Sometimes, giggling ridiculousness is just plain funny, and sometimes I can’t come up with a proper conclusion to these posts.

P.S. I forgot to mention it at the time, but last post was my 100th ever published on this little bloggy corner of the internet. Weird to think it’s been going for over 9 months already. And to anyone who’s ever stumbled across it, thank you; for making me feel a little less alone.

Attack of the Blocks

I spend far too much time on the internet. As well as putting many hours of work into trying to keep this blog updated regularly, I while away a fair portion of time on Facebook, follow a large number of video series and webcomics, and can often be found wandering through the recesses of YouTube (an interesting and frequently harrowing experience that can tell one an awful lot about the extremes of human nature). But there is one thing that no resident of the web can hope to avoid for any great period of time, and quite often doesn’t want to: the strange world of Minecraft.

Since its release as a humble alpha-version indie game in 2009, Minecraft has boomed into a runaway success and something of a cultural phenomenon. By the end of 2011, before it had even appeared in its final release format, Minecraft had racked up 4 million purchases and four times that many registered users, which isn’t bad for a game that has never advertised itself, has spread semi-virally among nerdy gamers for its mere three-year history, and was made purely as an interesting project by its creator Markus Persson (aka Notch). Thousands of videos, ranging from gameplay to some quite startlingly good music videos (check out the work of Captain Sparklez if you haven’t already), litter YouTube, and many of the game’s features (such as TNT and the exploding mobs known as Creepers) have become memes in their own right to some degree.

So then, why exactly has Minecraft succeeded where hundreds of thousands of games have failed, becoming a revolution in gamer culture? What is it that makes Minecraft both so brilliant and so special?

Many, upon being asked this question, tend to revert to extolling the virtues of the game’s indie nature. Created entirely without funding, as an experiment in gaming rather than a profit-making exercise, Minecraft is firmly rooted in the humble sphere of independent gaming, and it shows. One obvious feature is the game’s inherent simplicity: initially featuring solely the ability to wander around and place and destroy blocks, the controls are mainly (although far from entirely) confined to move and ‘use’, whether that latter function be shoot, slash, mine or punch down a tree. The basic, cuboid, ‘blocky’ nature of the game’s graphics allows for simplicity of production whilst creating an iconic, retro aesthetic that makes it memorable and distinctive to look at. Whilst the game has frequently been criticised for not including a tutorial (I myself took a good quarter of an hour to find out that you started by punching a tree, and a further ten minutes to work out that you were supposed to hold down the mouse button rather than repeatedly click), this is another common feature of indie gaming, partly because it saves time in development, but mostly because it makes the game feel like it is not pandering to you, allowing indie gamers a degree of elitist satisfaction in being good enough to work it out by themselves. This also ties in with the very nature of the game: another criticism used to be (and, to an extent, still is, even with the addition of the Enderdragon as a final win objective) that the game appears largely devoid of point, existing only for its own sake. This is entirely true, whether you view it as a bonus or a detriment being entirely your own opinion, and this idea of an unfamiliar, experimental game structure is another feature common, in one form or another, to a lot of indie games.

However, to me these do not seem entirely worthy of the name ‘answers’ to the question of Minecraft’s phenomenal success. The reason I think this way is that they do not adequately explain exactly why Minecraft rose to such prominence whilst other, often similar, indie games have been left in relative obscurity. Limbo, for example, is a side-scrolling platformer and a quite disturbing, yet compelling, in-game experience, wringing almost as much intrigue and puzzlement from a set of game mechanics even simpler than Minecraft’s. It has also received critical acclaim often far in excess of Minecraft (which has received a positive, but not wildly amazed, response from critics), and yet is still known to only a comparative few. Amnesia: The Dark Descent has often been described as the greatest survival horror game in history, incorporating a superb set of graphics, a three-dimensional world view (unlike the 2D view common to most indie games) and the most pants-wettingly terrifying experience anyone who’s ever played it is likely to face- but again, it is confined to the indie realm. Hell, Terraria is basically Minecraft in 2D, but has sold around a fortieth as many copies as Minecraft itself. All three of these games have received fairly significant acclaim and coverage, and rightly so, but none has become the riotous cultural phenomenon that Minecraft has, and none has had an Assassin’s Creed mod (first example that sprang to mind).

So… why has Minecraft been so successful? Well, I’m going to be sticking my neck out here, but to my mind it’s because it doesn’t play like an indie game. Whilst most independently produced titles are 2D, confined to fairly limited surroundings and made as simple & basic as possible to save on development (Amnesia can be regarded as an exception), Minecraft takes its own inherent simplicity and blows it up to a grand scale. It is a vast, open-world sandbox game, with vague resonances of the Elder Scrolls games and MMORPGs, taking the freedom, exploration and experimentation that have always been the advantages of this branch of the AAA world and combining them with the innovative, simplistic gaming experience of its indie roots. In some ways it’s similar to Facebook, in that it takes a simple principle and then applies it to the largest stage possible, and both have enjoyed a similarly explosive rise to fame. The randomly generated worlds provide infinite caverns to explore, endless mobs to slay, and all the space imaginable to build the grandest of castles, the largest of cathedrals, or the USS Enterprise if that takes your fancy. There are a thousand different ways to play the game on a million different planes, all based on just a few simple mechanics. Minecraft is the best of indie and AAA blended together, and is all the more awesome for it.

Icky stuff

OK guys, time for another multi-part series (always a good fallback when I’m short of ideas). Actually, this one started out as just an idea for a single post about homosexuality, but when thinking about how much background stuff I’d have to stick in for the argument to make sense, I thought I might as well dedicate an entire post to background and see what I could do with it from there. So, here comes said background: an entire post on the subject of sex.

The biological history of sex must really start by considering the history of biological reproduction. Reproduction is a vital part of the experience of life for all species, a necessary feature for something to be classified as ‘life’, and among some thinkers is considered the only reason for existence in the first place. In order to be successful by any measure, a species must exist; in order to exist, those of the species who die must be replaced; and in order for this to occur, the species must reproduce. The earliest form of reproduction, occurring amongst the earliest single-celled life forms, was binary fission, a basic form of asexual reproduction whereby the internal structure of the organism is replicated before it splits in two, creating two organisms with identical genetic makeup. This is an efficient way of expanding a population very quickly, but it has its flaws. For one thing, it does not create any variation in the genetics of a population, meaning whatever kills one organism stands a very good chance of destroying the entire population; all genetic diversity is dependent on random mutations. For another, it is only really suitable for single-celled organisms such as bacteria, as trying to split up a multi-celled organism once all the data has been replicated is a complicated geometric task. Other organisms have tried other methods of reproducing asexually, such as budding in yeast, but about 1 billion years ago an incredibly strange piece of genetic mutation must have taken place, possibly among several different organisms at once. Nobody knows exactly what happened, but one type of organism began requiring the genetic data of two different creatures, rather than one, and thus was sexual reproduction, both metaphorically and literally, born.

Just about every complex organism alive on Earth today now uses this system in one form or another (although some can reproduce asexually as well, or self-fertilise), and it’s easy to see why. It may be a more complicated system, far harder to execute, but by naturally varying the genetic makeup of a species it makes the species as a whole far more resistant to external factors such as disease- natural selection demonstrated at its finest. Perhaps its most basic form is that adopted by aquatic animals such as most fish and lobsters- they simply spray their eggs and sperm into the water (usually as a group at roughly the same time and place to increase the chance of conception) and leave them to mix and fertilise one another. The zygotes are then left to grow into adults of their own accord- a lot are of course lost to predators, representing a huge loss in terms of inputted energy, but the sheer number of fertilised eggs still produces a healthy population. It is interesting to note that this most basic of reproductive methods, performed in a similar manner by plants, is used by animals as complex as fish (although their place on the evolutionary ladder is both confusing and uncertain), whilst supposedly more ‘basic’ animals such as molluscs have some of the weirdest and most elaborate courtship and mating rituals on earth (seriously, YouTube ‘snail mating’. That shit’s weird).

Over time, the process of mating and breeding in the animal kingdom has grown more and more complicated. Exactly why the male testes & penis and the female vagina developed in the way they did is unclear from an evolutionary perspective, but since most animals appear to use a broadly similar system (males have an appendage, females have a depository) we can presume this was just how it started off and things haven’t changed much since. Most vertebrates and insects have distinct sexes and mate via internal fertilisation of a female’s eggs, in many cases by several different males to enhance genetic diversity. However, many species also take the approach that caring for their offspring for some portion of their development is a worthwhile trade-off in terms of energy, compared to the advantage of giving them the best possible chance in life. This care is generally (but not always- seahorses are perhaps the most notable exception) the role of the mother, males having usually buggered off after mating to leave mother & baby well alone, and such an approach gives a species, especially its females, a vested interest in ensuring their baby is as well-prepared as possible. This manifests itself in the process of a female choosing her partner prior to mating. Selection dictates that females who pick characteristics in males that result in successful offspring, good at surviving, are more likely to pass on their genes along with that same attraction towards those characteristics, so over time these traits become ‘attractive’ to all females of a species. These traits tend to be strength-related, since strong creatures are generally better at competing for food and such, hence the fact that most pre-mating procedures involve a fight or physical contest of some sort between males to allow the winners to take their pick of available females. This is also why strong, muscular men are considered attractive to women among the human race, even though these people may not always be the most suitable to father their children for various reasons (although one could counter this by saying that they are more likely to produce children capable of surviving the coming zombie apocalypse). Selection pressure is also to blame for the fact that sex is so enjoyable- members of a species who enjoy sex are more likely to perform it more often, making them more likely to conceive and thus pass on their genes, hence the massive hit of endorphins our bodies experience both during and after sexual activity.

Broadly speaking then, we come to the ‘sex situation’ we have now- we mate by sticking penises in vaginas to allow sperm and egg to meet, and women generally tend to pick men whom they find ‘attractive’ because doing so has traditionally been an evolutionary advantage, as is the fact that we find sex as a whole fun. Clearly, however, the whole situation is a good deal more complicated than just this… but what else is a multi-parter for?

‘Before it was cool’

Hipsters are one of the few remaining groups it is generally considered OK to take the piss out of as a collective in modern culture, along with chavs and the kind of people who comment below YouTube videos. The main complaint against them as a group is their overly superior and rather arrogant attitude- the sense that they are inherently ‘better’ than those around them simply by virtue of dressing differently (or ‘individually’ as they would have it) and listening to music that nobody’s ever heard of before.

However, perhaps the single thing that hipster elitism is loathed for more than any other is the simple four-word phrase ‘before it was cool’. Invariably prefaced with ‘I was into that…’, ‘I knew about them…’ or ‘They were all over my iTunes…’ (although any truly self-respecting hipster would surely not stoop so low as to use such ‘mainstream’ software), and often surrounded by ‘y’know’s, this small phrase conjures up a quite alarming barrage of hatred from even the calmest music fan. It symbolises every piece of petty elitism and self-superiority that hipster culture appears to stand for, every condescending smirk and patronising drawl directed at a sense of taste that does not match their own, and every weird, idiosyncratic acoustic number that they insist is distilled awesome.

On the other hand, despite the hate they typically receive for their opinions, hipster reasoning is largely sound. The symbolism of their dress code and music taste marking them out from the crowd is an expression of individuality and separatism from the ‘mass-produced’ culture of the modern world, championing the idea that they are able to think beyond what is simply fed to them by the media and popular culture. It is also an undeniable truth that there is an awful lot of rubbish churned out of said media machine, from all the various flavours of manufactured pop to the way huge tracts of modern music sound the same, all voices having been put through a machine umpteen times. Indeed, whilst it is not my place to pass judgement on Justin Bieber and company (especially given that I haven’t listened to any of his stuff), many a more ‘casual’ music fan is just as quick to pass judgement on fans of that particular brand of ‘manufactured’ pop music as a hipster may be towards him or her.

In fact, this is nothing more than a very human trait- we like what we like, and would like as many other people as possible to like it too. What we don’t like we have a natural tendency to bracket as universally ‘bad’ rather than just ‘not our thing’, and thus anyone who likes what we don’t tends to be subconsciously labelled either ‘wrong’ or ‘misguided’ rather than simply ‘different’. As such, we feel the need to redress this issue by offering our views on what is ‘good’ and ‘bad’, which wouldn’t be a problem if other people didn’t happen to like what we see as bad, or fail to get on so well with (or have never heard of) the stuff we think of as good. Basically, the problem boils down to the fact that all people are different, but our subconscious treats them as all being like us- an unfortunate state of affairs responsible for much of the general confrontation & friction present in all walks of life today.

What, then, about that hated phrase of the hipster, ‘before it was cool’? Well, this too has some degree of logic behind it, as was best demonstrated in the early 1990s during the rise of Nirvana. When they first started out during the late 1980s they, along with other alternative rock bands of the time such as REM, represented a kind of rebellious undercurrent to the supposed good fortune of Reagan-era America, a country that was all well and good if you happened to be the kind of clean-cut kid who went to school, did his exams, passed through college and got an office job. However, for those left out on a limb by the system, such as the young Kurt Cobain, life was far harsher and less forgiving- he faced a life of menial drudgery, even working as a janitor in his old high school. His music was a way to express himself, to stand out from a world where he didn’t fit in, and thus it really meant something. When ‘Smells Like Teen Spirit’ first made Nirvana big, it was a major victory for that counter-culture, and it pretty much put grunge on the map, both as a music genre and as a cultural movement, for the first time.

And with success came money, and here things began to unravel. Unfortunately, where there is money there are always people willing to make more of it, and the big corporations began to move in. Record labels started to sign every grunge band and Nirvana-clone they could lay their hands on in a desperate hunt for ‘the next Nirvana’, and the odd, garish fashion sense of the grunge movement began to make itself felt in more mainstream culture, even finding its way onto the catwalk. The world began to get swamped with ‘grungy stuff’ without embracing what the movement really meant, and with that its whole meaning began to disappear altogether. This turning of his beloved underground scene into an emotionless mainstream culture broke Kurt Cobain’s heart, leaving him disillusioned with what he had unwittingly helped to create. He turned back to the drug abuse that had sprung from his poor health (both physical and mental) and traumatic childhood, and despite multiple attempts to pull him out of that vicious cycle, he committed suicide in 1994.

This is an incredibly dramatic (and very depressing) example, but it illustrates a point- that when a band gets too big for its boots and, in effect, ‘becomes cool’, it can sometimes cause them to lose what made them special in the first place. And once that something has been lost, they may never be the same in the eyes of those who saw them with it.

Although, having said that, there is a difference between being an indie rock fan and being a hipster- namely, being a pretentious, arrogant moron about it. *$%#ing hipsters.