Time is an illusion, lunchtime doubly so…

In the dim and distant past, time was, to humankind, a thing and not much more. There was light-time, then there was dark-time, then there was another lot of light-time; during the day we could hunt, fight, eat and try to stay alive, and during the night we could sleep and have sex. However, we also realised that there were some parts of the year with short days and colder nights, and others that were warmer, brighter and better for hunting. Being the bright sort, we humans realised that the amount of time spent in winter, spring, summer and autumn (fall is the WRONG WORD) was about the same each time around, and thought that rather than just waiting for it to warm up every time we could count how long it took for one cycle (or year) so that we could work out when it was going to get warm next year. This enabled us to plan our hunting and farming patterns, and it became recognised that some knowledge of how the year worked was advantageous to a tribe. Eventually, this got so important that people started building monuments to the annual seasonal progression, hence such weird and staggeringly impressive prehistoric engineering achievements as Stonehenge.

However, this basic understanding of the year and the seasons was only one step on the journey, and as we moved from a hunter-gatherer paradigm to more of a civilised existence, we realised the benefits that a complete calendar could offer us, and thus began our still-continuing effort to quantify time. Nowadays our understanding of time extends to clocks accurate to within nanoseconds, and an understanding of relativity, but for a long time our greatest quest into the realm of bringing organised time into our lives was the creation of the concept of the week.

Having seven days of the week is, to begin with, a strange idea; seven is an awkward prime number, and it seems odd that we don’t pick a number that is easier to divide and multiply by, like six, eight or even ten, as the basis for our temporal system. Six would seem to make the most sense; most of our months have around 30 days, or 5 six-day weeks, and 365 days a year is only one less than a multiple of six, which could surely be some sort of religious symbolism (and there would be an exact multiple on leap years- even better). And it would mean a shorter week, and more time spent on the weekend, which would be really great. But no, we’re stuck with seven, and it’s all the bloody moon’s fault.

Y’see, the sun’s daily cycle is useful for measuring short-term time (night and day), and the earth’s orbit around it provides the crucial yearly change of season. However, the moon’s cycle is 28 days long (fourteen to wax, fourteen to wane, regular as clockwork), providing a nice intermediate time unit with which to divide up the year into a more manageable number of pieces than 365. Thus, we began dividing the year up into ‘moons’ and using them as a convenient reference that we could refer to every night. However, even a moon cycle is a bit long for day-to-day scheduling, and it proved advantageous for our distant ancestors to split it up even further. Unfortunately, 28 is an awkward number to divide into pieces, and its only factors (besides 28 itself) are 1, 2, 4, 7 and 14. An increment of 1 or 2 days is simply too small to be useful, and a 4 day ‘week’ isn’t much better. A 14 day week would hardly be an improvement on 28 for scheduling purposes, so seven is the only number of a practical size for the length of the week. The fact that months are now mostly 30 or 31 days rather than 28, to try and fit the awkward fact that there are 12.36 moon cycles in a year, hasn’t changed matters, so we’re stuck with an awkward 7 day cycle.
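The arithmetic behind being stuck with seven is easy to check for yourself; a few lines of Python (purely illustrative, with ‘practical’ defined here as anywhere from five to ten days) confirm that seven is the only divisor of 28 of a sensible size:

```python
# Find every way a 28-day moon cycle divides evenly into 'weeks'.
moon_cycle = 28
divisors = [n for n in range(1, moon_cycle + 1) if moon_cycle % n == 0]
print(divisors)  # → [1, 2, 4, 7, 14, 28]

# Rule out lengths too short to be useful and too long to help with
# scheduling, and only one candidate survives.
practical = [n for n in divisors if 5 <= n <= 10]
print(practical)  # → [7]
```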

However, this wasn’t the end of the issue for the historic time-definers (for want of a better word); there’s not much advantage in defining a seven day week if you can’t then define which day of said week you want the crops to be planted on. Therefore, different days of the week needed names for identification purposes, and since astronomy had already provided our daily, weekly and yearly time structures it made sense to look skyward once again when searching for suitable names. At this time, centuries before the invention of the telescope, we only knew of seven planets, those celestial bodies that could be seen with the naked eye; the sun, the moon (yeah, their definition of ‘planet’ was a bit iffy), Mercury, Venus, Mars, Jupiter and Saturn. It might seem to make sense, with seven planets and seven days of the week, to just name the days after the planets in a random order, but humankind never does things so simply, and the process of picking which day got named after which planet was a complicated one.

In around 1000 BC the Egyptians had decided to divide the daylight into twelve hours (because they knew how to pick a nice, easy-to-divide number), and the Babylonians then took this a stage further by dividing the entire day, including night-time, into 24 hours. The Babylonians were also great astronomers, and had thus discovered the seven visible planets- however, because they were a bit weird, they decided that each planet had its place in a hierarchy, and that this hierarchy was dictated by which planet took the longest to complete its cycle and return to the same point in the sky. This order was, for the record, Saturn (29 years), Jupiter (12 years), Mars (687 days), Sun (365 days), Venus (225 days), Mercury (88 days) and Moon (28 days). So, did they name the days after the planets in this order? Of course not, that would be far too simple; instead, they decided to start naming the hours of the day after the planets (I did say they were a bit weird) in that order, going back to Saturn when they got to the Moon.

However, 24 hours does not divide nicely by seven planets, so the planet after which the first hour of the day was named changed each day. So, the first hour of the first day of the week was named after Saturn, the first hour of the second day after the Sun, and so on. Since the list repeated itself each week, the Babylonians decided to name each day after the planet after which its first hour was named, and so we got Saturnday, Sunday, Moonday, Marsday, Mercuryday, Jupiterday and Venusday.
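That hour-skipping pattern is neatly mechanical: with 24 hours in a day and a repeating cycle of 7 planets, the first hour of each day simply advances 24 mod 7 = 3 places through the Babylonian order. A short sketch:

```python
# Babylonian planetary order, slowest cycle first.
planets = ["Saturn", "Jupiter", "Mars", "Sun", "Venus", "Mercury", "Moon"]

# The planet ruling the first hour of day d is hour number 24*d in the
# endlessly repeating 7-planet cycle of hours.
week = [planets[(24 * day) % 7] for day in range(7)]
print(week)
# → ['Saturn', 'Sun', 'Moon', 'Mars', 'Mercury', 'Jupiter', 'Venus']
```

Which is exactly the Saturday-to-Friday order we ended up with.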

Now, you may have noticed that these are not the days of the week we English speakers are exactly used to, and for that we can blame the Vikings. The planetary method for naming the days of the week was brought to Britain by the Romans, and when they left the Britons held on to the names. However, Britain then spent the next 7 centuries getting repeatedly invaded and conquered by various foreigners, and for most of that time it was the Germanic Vikings and Saxons who fought over the country. Both groups worshipped the same gods, those of Norse mythology (so Thor, Odin and so on), and one of the practices they introduced was to replace the names of four days of the week with those of four of their gods; Tyr’sday, Woden’sday (Woden was the Saxon word for Odin), Thor’sday and Frig’sday replaced Marsday, Mercuryday, Jupiterday and Venusday in England, and soon the fluctuating nature of language renamed the days of the week Saturday, Sunday, Monday, Tuesday, Wednesday, Thursday and Friday.

However, the old planetary names remained in the romance languages (the French translations of the days Tuesday to Friday are Mardi, Mercredi, Jeudi and Vendredi), with one small exception. When the Roman Empire went Christian in the fourth century, the ten commandments dictated they remember the Sabbath day; but, to avoid copying the Jews (whose Sabbath was on Saturday), they chose to make Sunday the Sabbath day. It is for this reason that Monday, the first day of the working week after one’s day of rest, became the start of the week, taking over from the Babylonians’ choice of Saturday, but close to Rome they went one stage further and renamed Sunday ‘dies Dominica’, or Day Of The Lord. The practice didn’t catch on in Britain, thousands of miles from Rome, but the modern day Spanish, French and Italian words for Sunday are domingo, dimanche and domenica respectively, all of which are locally corrupted forms of ‘dies Dominica’.

This is one of those posts that doesn’t have a natural conclusion, or even much of a point to it. But hey; I didn’t start writing this because I wanted to make a point, but more to share the kind of stuff I find slightly interesting. Sorry if you didn’t find it so.

Why the chubs?

My last post dealt with the thorny issue of obesity, both its increasing presence in our everyday lives, and what for me is the underlying reason behind the stats that back up media scare stories concerning ‘the obesity epidemic’- the rise in size of the ‘average’ person over the last few decades. The precise causes of this trend can be put down to a whole host of societal factors within our modern age, but that story is boring as hell and has been repeated countless times by commentators far more adept in this field than me. Instead, today I wish to present the case for modern-day obesity as a problem concerning the fundamental biology of a human being.

We, and our dim and distant ancestors of the scaly/furry variety, have spent the last few million years living wild; hunting, fighting and generally acting much like any other evolutionary pathway. Thus, we can learn a lot about our own inbuilt biology and instincts by studying the behaviour of animals alive today, and when we do so, several interesting animal eating habits become apparent. As anyone who has tried it as a child can attest (and I speak from personal experience), grass is not good stuff to eat. It’s tough, it takes a lot of chewing and processing (many herbivores have multiple stomachs to make sure they squeeze the maximum nutritional value out of their food), and there really isn’t much in it to power a fully-functional being. As such, grazers on grass and other such tough plant matter (such as leaves) will spend most of their lives doing nothing but guzzle the stuff, trying to get as much as possible through their system. Other animals will favour food with a higher nutritional content, such as fruits, tubers or, in many cases, meat, but these frequently present issues. Fruits are highly seasonal and rarely available in a large enough volume to support a large population, as well as being quite hard to eat in bulk; plants try to ‘design’ fruits so that each visitor takes only a few at a time, so as best to spread their seeds far and wide, and as such there are few animals that can sustain themselves on such a diet. Other food such as tubers or nuts are hard to get at, needing to be dug up or broken open in highly energy-consuming activities, whilst meat has the annoying habit of running away or fighting back whenever you try to get at it. As anyone who watches nature documentaries will attest, most large predators will only eat once every few days (admittedly rather heavily).

The unifying factor of all of this is that food is, in the wild, highly energy- and time-consuming to get hold of and consume, since every source of it guards its prize jealously. Therefore, any animal that wants to survive in this tough world must be near-constantly in pursuit of food simply to fulfil all of its life functions, and this is characterised by being perpetually hungry. Hunger is a body’s way of telling us that we should get more food, and in the wild this constant desire for more is kept in check by the difficulty that getting hold of it entails. Similarly, animal bodies try to assuage this desire by being lazy; if something isn’t necessary, then there’s no point wasting valuable energy going after it (since this will mean spending more time going after food to replace lost energy).

However, in recent history (and a spectacularly short period of time from evolution’s point of view), one particular species called Homo sapiens came up with this great idea called civilisation, which basically entailed the pooling and sharing of skill and resources in order to best benefit everyone as a whole. As an evolutionary success story, this is right up there with developing multicellular body structures in terms of being awesome, and it has enabled us humans to live far more comfortable lives than our ancestors did, with correspondingly far greater access to food. This has proved particularly true over the last two centuries, as technological advances in a more democratic society have improved the everyman’s access to food and comfortable living to a truly astounding degree. Unfortunately (from the point of view of our waistline) the instincts of our bodies haven’t quite caught up to the idea that when we want/need food, we can just get food, without all that inconvenient running around after it to get in the way. Not only that, but a lack of pack hierarchy combined with this increased availability means that we can stock up on food until we have eaten our absolute fill if we so wish; the difference between ‘satiated’ and ‘stuffed’ can work out as well over 1000 calories per meal, and over a long period of time it only takes a little more than we should be having every day to start packing on the pounds. Combine that with our natural predilection to laziness meaning that we don’t naturally think of going out for some exercise as fun purely for its own sake, and the fact that we no longer burn calories chasing our food, or in the muscles we build up from said chasing, and we find ourselves consuming a lot more calories than we really should be.

Not only that, but during this time we have also got into the habit of spending a lot of time worrying over the taste and texture of our food. This means that, unlike our ancestors who were just fine with simply jumping on a squirrel and devouring the thing, we have to go through the whole rigmarole of getting stuff out of the fridge, spending two hours slaving away in a kitchen and attempting to cook something vaguely resembling tasty. This wait is not something our bodies enjoy very much, meaning we often turn to ‘quick fixes’ when in need of food; stuff like bread, pasta or ready meals. Whilst we all know how much crap goes into ready meals (which should, as a rule, never be bought by anyone who cares even in the slightest about their health; the salt content of those things is insane) and other such ‘quick fixes’, fewer people are aware of the impact a high intake of whole grains can have on our bodies. Stuff like bread and rice only started being eaten by humans a few thousand years ago, as we discovered the benefits of farming and cooking, and whilst they are undoubtedly a good food source (and are very, very difficult to cut from one’s diet whilst still remaining healthy) our bodies have simply not had enough time, evolutionarily speaking, to get used to them. This means they have a tendency to not make us feel as full as their calorie content should suggest, thus meaning that we eat more than our body in fact needs (if you want to feel full whilst not taking in so many calories, protein is the way to go; meat, fish and dairy are great for this).

This is all rather academic, but what does it mean for you if you want to lose a bit of weight? I am no expert on this, but then again neither are most of the people acting as self-proclaimed nutritionists in the general media, and anyway, I don’t have any better ideas for posts. So, look out for my next post for my, admittedly basic, advice for anyone trying to make themselves that little bit healthier, especially if you’re trying to work off a few of the pounds built up over this festive season.

Hitting the hay

OK, so it was history last time, so I’m feeling like a bit of science today. So, here is your random question for today; are the ‘leaps of faith’ in the Assassin’s Creed games survivable?

Between them, the characters of Altair, Ezio and Connor* jump off a wide variety of famous buildings and monuments across the five current games, but the jump that springs most readily to mind is Ezio’s leap from the Campanile di San Marco, in St Mark’s Square, Venice, at the end of Assassin’s Creed II. It’s not the highest jump made, but it is one of the most interesting and it occurs as part of the main story campaign, meaning everyone who’s played the game through will have made the jump and it has some significance attached to it. It’s also a well-known building with plenty of information on it.

[*Interesting fact; apparently, both Altair and Ezio translate as ‘Eagle’ in some form in English, as does Connor’s Mohawk name (Ratonhnhaké:ton, according to Wikipedia) and the name of his ship, the Aquila. Connor itself translates as ‘lover of wolves’ from the original Gaelic]

The Campanile as it stands today is not the same one as in Ezio’s day; in 1902 the original building collapsed and took ten years to rebuild. However, the new Campanile was made to be cosmetically (if not quite structurally) identical to the original, so current data should still be accurate. Wikipedia again tells me the brick shaft making up the bulk of the structure accounts for (apparently only) 50m of the tower’s 98.6m total height, with Ezio’s leap (made from the belfry just above) coming in at around 55m. With this information we can calculate Ezio’s total gravitational potential energy lost during his fall; GPE lost = mgΔh, and presuming a 70kg bloke this comes to GPE lost = 37730J (Δ is, by the way, the mathematical way of expressing a change in something- in this case, Δh represents a change in height). If his fall were made with no air resistance, then all this GPE would be converted to kinetic energy, where KE = mv²/2. Solving to make v (his velocity upon hitting the ground) the subject gives v = sqrt(2*KE/m), and replacing KE with our value of the GPE lost, we get v = 32.83m/s. This tells us two things; firstly that the fall should take Ezio at least three seconds, and secondly that, without air resistance, he’d be in rather a lot of trouble.

But, we must of course factor air resistance into our calculations, and to do so we must first make another assumption: that Ezio reaches terminal velocity before reaching the ground. Whether this statement is valid or not we will find out later. The terminal velocity is just a rearranged form of the drag equation: Vt = sqrt(2mg/ρACd), where m = Ezio’s mass (70kg, as presumed earlier), g = gravitational field strength (on Earth, 9.8m/s²), ρ = air density (on a warm Venetian evening at around 15 degrees Celsius, this comes out as 1.225kg/m³), A = the cross-sectional area of Ezio’s falling body (call it 0.85m², presuming he’s around the same size as me) and Cd = his body’s drag coefficient (a number evaluating how well the air flows around his body and clothing, for which I shall pick 1 at complete random). Plugging these numbers into the equation gives a terminal velocity of 36.30m/s, which is an annoying number; because it’s larger than our previous velocity value, calculated without air resistance, of 32.83m/s, this means that Ezio definitely won’t have reached terminal velocity by the time he reaches the bottom of the Campanile, so we’re going to have to look elsewhere for our numbers. Interestingly, the terminal velocity for a falling skydiver, without parachute, is apparently around 54m/s, suggesting that I’ve got numbers that are in roughly the correct ballpark but that could do with some improvement (this is probably thanks to my chosen Cd value; 1 is a very high value, selected to give Ezio the best possible chance of survival, but ho hum).
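For anyone who wants to check the arithmetic, the whole calculation so far fits in a few lines of Python (a sketch using the same assumed figures: 70kg assassin, 55m drop, and that deliberately generous drag coefficient):

```python
import math

m = 70.0      # Ezio's mass, kg (assumed)
g = 9.8       # gravitational field strength, m/s^2
h = 55.0      # height of the leap, m
rho = 1.225   # air density, kg/m^3
A = 0.85      # cross-sectional area of a falling body, m^2 (assumed)
Cd = 1.0      # drag coefficient, picked generously at random

gpe = m * g * h                   # energy lost in the fall, J
v_no_drag = math.sqrt(2 * g * h)  # impact speed ignoring air resistance, m/s
v_terminal = math.sqrt(2 * m * g / (rho * A * Cd))  # terminal velocity, m/s

print(f"GPE lost: {gpe:.0f} J")
print(f"Impact speed, no drag: {v_no_drag:.2f} m/s")
print(f"Terminal velocity: {v_terminal:.2f} m/s")
```

Since the no-drag impact speed comes out below the terminal velocity, the terminal-velocity shortcut can’t be used, exactly as argued above.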

Here, I could attempt to derive an equation for how velocity varies with distance travelled, but such things are complicated, time consuming and do not translate well into being typed out. Instead, I am going to take on blind faith a statement attached to my ‘falling skydiver’ number quoted above; that it takes about 3 seconds to achieve half the skydiver’s terminal velocity. We said that Ezio’s fall from the Campanile would take him at least three seconds (just trust me on that one), and in fact it would probably be closer to four, but no matter; let’s just presume he has jumped off some unidentified building such that it takes him precisely three seconds to hit the ground, at which point his velocity will be taken as 27m/s.

Except he won’t hit the ground; assuming he hits his target anyway. The Assassin’s Creed universe is literally littered with indiscriminate piles/carts of hay and flower petals that have been conveniently left around for no obvious reason, and when performing a leap of faith our protagonists always aim for them (the AC wiki tells me that these were in fact programmed into the memories that the games consist of in order to aid navigation, but this doesn’t matter). Let us presume that the hay is 1m deep where Ezio lands, and that the whole hay-and-cart structure is entirely successful in its task, in that it manages to reduce Ezio’s velocity from 27m/s to nought across this 1m distance, without any energy being lost through the hard floor (highly unlikely, but let’s be generous). At 27m/s, the 70kg Ezio has a momentum of 1890kgm/s, all of which must be dissipated through the hay across this 1m distance. This means an impulse of 1890Ns, and thus a force, will act upon him; Impulse = Force x ΔTime. This force will cause him to decelerate. If this deceleration is uniform (it wouldn’t be in real life, but modelling this is tricky business and it will do as an approximation), then his average velocity during his ‘slowing’ period will come to be 13.5m/s, and this deceleration will take 0.074s. Given that we now know the impulse acting on Ezio and the time for which it acts, we can now work out the force upon him; 1890 / 0.074 = 25515N. This corresponds to a 364.5m/s² deceleration, or around 37g’s to put it in G-force terms. Given that 5g’s has been known to break bones in stunt aircraft, I think it’s safe to say that, without quite a lot more hay, Ezio’s not getting up any time soon. So remember; next time you’re thinking of jumping off a tall building, I would recommend a parachute over a haystack.
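The hay-landing figures follow the same pattern, and can likewise be sketched in a few lines (again assuming perfectly uniform deceleration over 1m of hay, which real hay certainly wouldn’t deliver):

```python
m = 70.0  # mass, kg (assumed)
v = 27.0  # impact speed, m/s
d = 1.0   # stopping distance in the hay, m (assumed)
g = 9.8   # m/s^2, for converting to G-force

t = d / (v / 2)    # stopping time at uniform deceleration, s
force = m * v / t  # the impulse (m*v) spread over that time, N
decel = force / m  # resulting deceleration, m/s^2

print(f"Stopping time: {t:.3f} s")
print(f"Force on the assassin: {force:.0f} N")
print(f"Deceleration: {decel / g:.1f} g")
```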

N.B.: The resulting deceleration calculated in the last bit seems a bit massive, suggesting I may have gone wrong somewhere, so if anyone has any better ideas of numbers/equations then feel free to leave them below. I feel here is also an appropriate place to mention a story I once heard concerning an air hostess whose plane blew up. She was thrown free, landed in a tree on the way down… and survived.

EDIT: Since writing this post, this has come into existence, more accurately calculating the drag and final velocity acting on the falling Assassin. They’re more advanced than me, but their conclusion is the same; I like being proved right :).

Practical computing

This looks set to be my final post of this series about the history and functional mechanics of computers. Today I want to get onto the nuts & bolts of computer programming and interaction, the sort of thing you might learn as a budding amateur wanting to figure out how to mess around with these things, and who’s interested in exactly how they work (bear in mind that I am not one of these people and am, therefore, likely to get quite a bit of this wrong). So, to summarise what I’ve said in the last two posts (and to fill in a couple of gaps): silicon chips are massive piles of tiny electronic switches; memory is stored in tiny circuits that are either off or on; this pattern of off and on can be used to represent information in memory; memory stores data and instructions for the CPU; the CPU has no actual ability to do anything itself, but automatically delegates, through the structure of its transistors, to the areas that do; and the arithmetic logic unit is a dumb counting machine used to do all the grunt work, which is also responsible, through the CPU, for telling the screen how to make the appropriate pretty pictures.

OK? Good, we can get on then.

Programming languages are a way of translating the medium of computer information and instruction (binary data) into our medium of the same: words and language. Obviously, computers do not understand that the keys we press have symbols on them, that these symbols mean something to us, or that they are built to reproduce the same symbols on the monitor when pressed; but we humans do, and that makes computers actually usable for 99.99% of the world population. When a programmer brings up an appropriate program and starts typing instructions into it, at the time of typing their words mean absolutely nothing. The key thing is what happens when their data is committed to memory, for here the program concerned kicks in.

The key feature that defines a programming language is not the language itself, but the interface that converts words to instructions. Built into the workings of each is a list of ‘words’ in binary, each word having a corresponding, but entirely different, string of data associated with it that represents the appropriate set of ‘ons and offs’ that will get a computer to perform the correct task. This works in one of two ways: an ‘interpreter’ is an inbuilt system whereby the program is stored just as words and is then converted to ‘machine code’ by the interpreter as it is accessed from memory, but the most common form is to use a compiler. This basically means that once you have finished writing your program, you hit a button to tell the computer to ‘compile’ your written code into an executable program in data form. This allows you to delete the written file afterwards, makes programs run faster, and gives programmers an excuse to bum around all the time (I refer you here).
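Python happens to expose this words-to-instructions step from inside the language itself: its built-in compile() turns source text into a code object containing the instruction stream that actually gets executed (bytecode for a virtual machine rather than raw machine code, but the principle is the same), and the dis module can dump those instructions in human-readable form. A quick sketch:

```python
import dis

# A one-line 'program', existing at this point only as text.
source = "answer = 6 * 7"

# Compile it into a code object: the stream of numeric instructions
# that the Python virtual machine actually executes.
code = compile(source, "<example>", "exec")

# Dump those instructions in a human-readable form.
dis.dis(code)

# Running the compiled object behaves exactly like the original text.
namespace = {}
exec(code, namespace)
print(namespace["answer"])  # → 42
```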

That is, basically, how computer programs work- but there is one last key feature in the workings of a modern computer, one that has divided nerds and laymen alike across the years and decades and to this day provokes furious debate: the operating system.

An OS, something like Windows (Microsoft), OS X (Apple) or Linux (nerds), is basically the software that enables the CPU to do its job of managing processes and applications. Think of it this way: whilst the CPU might put two inputs through a logic gate and send an output to a program, it is the operating system that will set it up to determine exactly which gate to put it through and exactly how that program will execute. Operating systems are written onto the hard drive, and can, theoretically, be written using nothing more than a magnetized needle, a lot of time and a plethora of expertise to flip the magnetically charged ‘bits’ on the hard disk. They consist of many different parts, but the key feature of all of them is the kernel, the part that manages the memory, optimises the CPU performance and translates programs from memory to screen. The precise translation method by which this latter function happens differs from OS to OS, hence why a program written for Windows won’t work on a Mac, and why Android (Linux-powered) smartphones couldn’t run iPhone (iOS) apps even if they could access the store. It is also the cause of all the debate between advocates of different operating systems, since different translation methods prioritise or are better at dealing with different things, work with varying degrees of efficiency and are more or less vulnerable to virus attack. However, perhaps the most vital things that modern OS’s do on our home computers are those that, at first glance, seem secondary- moving stuff around and scheduling. A CPU cannot process more than one task at once, meaning that it should not be theoretically possible for a computer to multi-task; the sheer concept of playing minesweeper whilst waiting for the rest of the computer to boot up and sort itself out would be just too outlandish for words.
However, each OS contains a clever piece of software called a scheduler, which switches from process to process very rapidly (remember, computers run so fast that they can count to a billion, one by one, in under a second) to give the impression of everything happening simultaneously. Similarly, a kernel will allocate areas of empty memory for a given program to store its temporary information and run on, but may also shift some rarely-accessed memory from RAM (where it is accessible) to hard disk (where it isn’t) to free up more space (this is how computers with very little free memory run programs, and the time taken to do this for large amounts of data is why they run so slowly), and must cope when a program needs to access data from another part of the computer that has not been specifically allocated to that program.
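The scheduler’s trick of rapid switching can be sketched as a toy round-robin loop; the ‘processes’ here are just hypothetical names with a count of remaining work, not real programs, but the interleaving is the point:

```python
from collections import deque

# Each toy 'process' is a name plus how many slices of work it still needs.
processes = deque([["boot", 5], ["minesweeper", 3], ["music", 4]])
timeline = []

# Round-robin scheduling: give the front process one slice of CPU time,
# then rotate it to the back of the queue, until everything is finished.
while processes:
    name, remaining = processes.popleft()
    timeline.append(name)
    remaining -= 1
    if remaining:
        processes.append([name, remaining])

print(timeline)
# The slices interleave, so to a slow human observer all three
# 'processes' appear to run at the same time.
```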

If I knew what I was talking about, I could witter on all day about the functioning of operating systems and the vast array of headache-causing practicalities and features that any OS programmer must consider, but I don’t and as such won’t. Instead, I will simply sit back, pat myself on the back for having actually got around to researching and (after a fashion) understanding all this, and marvel at what strange, confusing, brilliant inventions computers are.

Attack of the Blocks

I spend far too much time on the internet. As well as putting many hours of work into trying to keep this blog updated regularly, I while away a fair portion of time on Facebook, follow a large number of video series and webcomics, and can often be found wandering through the recesses of YouTube (an interesting and frequently harrowing experience that can tell one an awful lot about the extremes of human nature). But there is one thing that any resident of the web cannot hope to avoid for any great period of time, and quite often doesn’t want to- the strange world of Minecraft.

Since its release as a humble alpha-version indie game in 2009, Minecraft has boomed to become a runaway success and something of a cultural phenomenon. By the end of 2011, before it had even been released in its final format, Minecraft had registered 4 million purchases and 4 times that many registered users, which isn’t bad for a game that has never advertised itself, spread semi-virally among nerdy gamers throughout its mere three-year history and was made purely as an interesting project by its creator Markus Persson (aka Notch). Thousands of videos, ranging from gameplay to some quite startlingly good music videos (check out the work of Captain Sparklez if you haven’t already), litter YouTube, and many of the game’s features (such as TNT and the exploding mobs known as Creepers) have become memes in their own right to some degree.

So then, why exactly has Minecraft succeeded where hundreds and thousands of games have failed, becoming a revolution in gamer culture? What is it that makes Minecraft both so brilliant, and so special?

Many, upon being asked this question, tend to revert to extolling the virtues of the game’s indie nature. Being created entirely without funding as an experiment in gaming rather than profit-making, Minecraft’s roots lie firmly in the humble sphere of independent gaming, and it shows. One obvious feature is the game’s inherent simplicity- initially solely featuring the ability to wander around, place and destroy blocks, the controls are mainly (although far from entirely) confined to move and ‘use’, whether that latter function be shoot, slash, mine or punch down a tree. The basic, cuboid, ‘blocky’ nature of the game’s graphics allows for both simplicity of production and an iconic, retro aesthetic that makes it memorable and striking to look at. Whilst the game has frequently been criticised for not including a tutorial (I myself took a good quarter of an hour to find out that you started by punching a tree, and a further ten minutes to work out that you were supposed to hold down the mouse button rather than repeatedly click), this is another common feature of indie gaming, partly because it saves time in development, but mostly because it makes the game feel like it is not pandering to you, thus allowing indie gamers to feel some degree of elitism in being good enough to work it out by themselves. This also ties in with the very nature of the game- another criticism used to be (and, to an extent, still is, even with the addition of the Enderdragon as a final win objective) that the game appears largely devoid of point, existing only for its own sake. This is entirely true, whether you view that as a bonus or a detriment being entirely your own opinion, and this idea of an unfamiliar, experimental game structure is another feature common in one form or another to a lot of indie games.

However, to me these do not seem entirely worthy of the name ‘answers’ regarding the question of Minecraft’s phenomenal success. The reason I think this way is that they do not adequately explain exactly why Minecraft rose to such prominence whilst other, often similar, indie games have been left in relative obscurity. Limbo, for example, is a side-scrolling platformer and a quite disturbing, yet compelling, in-game experience, with almost as much intrigue and puzzle from a set of game mechanics simpler even than those of Minecraft. It has also received critical acclaim often far in excess of Minecraft (which has received a positive, but not wildly amazed, response from critics), and yet is still known to only a select few. Amnesia: The Dark Descent has often been described as the greatest survival horror game in history, as well as incorporating a superb set of graphics, a three-dimensional world view (unlike the 2D view common to most indie games) and the most pants-wettingly terrifying experience anyone who’s ever played it is likely to ever face- but again, it is confined to the indie realm. Hell, Terraria is basically Minecraft in 2D, but has sold around a fortieth as many copies as Minecraft itself. All three of these games have received fairly significant acclaim and coverage, and rightly so, but none has become the riotous cultural phenomenon that Minecraft has, and none has had an Assassin’s Creed mod (first example that sprang to mind).

So… why has Minecraft been so successful? Well, I’m going to be sticking my neck out here, but to my mind it’s because it doesn’t play like an indie game. Whilst most independently produced titles are 2D, confined to fairly limited surroundings and made as simple & basic as possible to save on development (Amnesia can be regarded as an exception), Minecraft takes its own inherent simplicity and blows it up to a grand scale. It is a vast, open world sandbox game, with vague resonances of the Elder Scrolls games and MMORPGs, taking the freedom, exploration and experimentation that have always been the advantages of this branch of the AAA world, and combining them with the innovative, simplistic gaming experience of its indie roots. In some ways it’s similar to Facebook, in that it takes a simple principle and then applies it to the largest stage possible, and both have enjoyed a similarly explosive rise to fame. The randomly generated worlds provide infinite caverns to explore, endless mobs to slay, all the space imaginable to build the grandest of castles, the largest of cathedrals, or the USS Enterprise if that takes your fancy. There are a thousand different ways to play the game on a million different planes, all based on just a few simple mechanics. Minecraft is the best of indie and AAA blended together, and is all the more awesome for it.

This blog is getting WAY too nerdy…

Looking back over my more recent posts, I’m starting to spot a trend. My first few posts ranged widely in style and content, from poetry to random facts. Now, however (if I can spot a trend in a blog only 15 posts old), all my posts are basically mini-essays. Time, murder, SOPA… all have their little 1000-word studies here.
The reason why this has happened is fairly obvious- they are what’s going through my brain, and the way my mind and writing style works means it is easiest for me just to write studies of whatever crosses my mind. 2011’s round robin letter was an idea that popped into my head while out walking one day, while the poetry wasn’t even intended for online publication. Why are they up? Well, it seemed like a good idea at the time.
So, what to do? Either I carry on with all this essay-writing, or I try to think of more… abstract and different stuff to put up here. Those who know me would probably agree that abstract would be a better expression of my personality (although most of my supplies of odd are used up thinking of Facebook birthday messages- for anyone reading this, ‘Happy Birthday!’ is the dullest thing you could possibly write, and it would certainly make birthdays a lot more entertaining if there were a little more… variety). But… I don’t know… essays suit me. I haven’t had to write one formally for a long time now, and either I’m getting some seriously misplaced nostalgia/withdrawal symptoms or my inner soul is just a massive masochist when it comes to spending way too long writing stuff that is ultimately not likely to be read by more people than I can count on my fingers (can you tell I watched a Yahtzee video before writing that sentence?). Or maybe it’s just that writing essays is a lot more fun (to my inner nerd anyway) when the subject is something you find interesting, or at least that you have chosen, so can’t complain about.
Meh, I don’t know. In fact, I’m not even sure why I’m writing this post- it’s clearly not going to turn into something especially meaningful any time soon. But, then again, why am I writing this blog at all? It gets very few readers, little notice, and most people I know don’t even know I have one, so it’s clearly not because it gives me a sense of achievement. To me, this blog serves a similar purpose to a diary- it’s a vent for all the various thoughts and ideas that clutter up my mind and that I never get a chance to express. As the little description bar says, this is a window into my mind. But I don’t just have the deep, inner thoughts people write in diaries- I have all these random ideas, all these flames of interest and inspiration sparking off inside my head. It does them good, I suppose, to get out and get an airing. Maybe a few of them will help other people, maybe make them think. Maybe one or two will inspire them. This blog lets ideas live. So, for as long as my thoughts express themselves as facts and reasons and essays, I guess that’s what I’ll be writing. Thank you internet, for helping me make up my mind on this one. I wonder how often the web gets to read a train of thought? But that may be a post for another time…

Time is a funny old thing…

Today I am rather short on time- the work I have to do is beginning to mount up despite (and partly because of) a long weekend. To most people this is a perfectly good reason to put up an apologetic cop-out of a post to prevent them having to work on it, but for me, it is a perfectly good excuse for my bloodymindedness to take over, so I thought I might write something about time.
Strange and almost abstract a concept as it is, time can be viewed from a number of perspectives- the physical sense, the thermodynamic sense, and the human sense are the three obvious ones that spring to mind. To a physicist, time is a dimension much like width and length, and is far from unique- in fact there is a large sector of theoretical physics open to the idea that in the big bang there were many millions of dimensions, only 4 of which (3 spatial and one temporal) opened up into the rest of the universe, the other dimensions only existing on a microscopic, atomic scale (which might explain why the quantum world is so plain weird. Hey- I’m no physicist, and the web can probably tell you more). The really special thing about time compared to spatial dimensions, to a physicist (among a long list of things that are confusing and difficult to describe), is that it is the only dimension with an obvious direction. People often talk of ‘the arrow of time’, but the idea of any other dimension having an arrow is only a sort of arbitrary point of reference (north & south, up & down are only relative to our own earth and so are merely convenient reference points). This idea of time having an irreversible arrow annoys a lot of physicists, as there appears to be little, fundamentally, that means we couldn’t travel in time in the other direction- the theory of relativity, for example, shows how fluid time can be. The idea of time’s direction has a lot to do with thermodynamics, which is where the second perspective of time comes from.
Really I am using the word thermodynamic very loosely, as what I am really thinking of is more to do with the psychological arrow of time. To quickly paraphrase what I mean by thermodynamics: the second law of thermodynamics states that the universe’s level of entropy, or randomness, will always increase or stay the same, never decrease, because a more random, chaotic system can be arranged in vastly more ways and is thus statistically far more stable. One way of thinking of this is like a beach- the large swathes of sand can be arranged in a huge number of configurations and still seem the same, but if there are lots of sandcastles over it, there is a lot less randomness. One can seemingly reverse this process by building more sandcastles, making the universe more ordered, but to do this requires energy which, on a universal level, increases the universe’s entropic level. It is for this reason that a smashed pot will always have been preceded by, but never followed by, the same pot all in one piece.
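As an aside for the fellow nerds: the beach picture can actually be made quantitative with Boltzmann’s formula S = k·ln W, where W counts the microscopic arrangements consistent with what you see. Here’s a toy Python sketch (the twelve ‘grains’ and four ‘sites’ are made-up numbers, purely for illustration, and the grains are assumed distinguishable):

```python
from math import factorial, log

def microstates(occupancies):
    """Number of distinct ways to arrange distinguishable grains into sites
    with the given occupancies- the multinomial coefficient n!/(k1!*k2!*...)."""
    n = sum(occupancies)
    w = factorial(n)
    for k in occupancies:
        w //= factorial(k)
    return w

sandcastle = [12, 0, 0, 0]   # every grain piled neatly in one spot
flat_beach = [3, 3, 3, 3]    # grains spread evenly across the beach

print(microstates(sandcastle))       # 1 arrangement: perfectly ordered
print(microstates(flat_beach))       # 369600 arrangements
print(log(microstates(flat_beach)))  # entropy in units of k: ~12.8
```

The flat beach wins by five orders of magnitude, which is exactly why disorder is the statistically favoured direction- the asymmetry the arrow of time rides on.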
The thing is, the psychological and thermodynamic arrows of time point in the same direction, and their link to one another is integral. Our whole view of the passing of time is influenced by the idea of events that have irrevocably ‘happened’ and are ‘over’, hence our eternal fascination with ‘what ifs’. We persistently worry about past mistakes, what could have been, and what things were like, but never can be- hence the popularity of historical stories, ruins, nostalgia and grumbling about teenagers. For a better explanation of the various ‘arrows of time’, try Stephen Hawking’s ‘A Brief History of Time’- it is somewhat out of date, and it is fashionable nowadays to think of it as overly simplistic, but it’s still a good source of a grounding in high-level physics.
The final, and to me most interesting, perspective of time I want to talk about is deeply linked to the psychological arrow and our thoughts of the passing of time, but brings its own, uniquely relative, view- the human perspective (notice how it is always people I seem to find the most interesting.) We humans view time in a way that, when thought about, paints a weirdly fluid portrait of the behaviour of time. There is never enough time to work, too much time spent waiting, not enough time spent on holidays or relaxing, too much time spent out of work, too little time spent eating the cake and too much spent washing up. There are the awkward pauses in conversation that seem to drag on for an eternity, especially when they come just after the moment when the entire room goes silent for no accountable reason, enabling everyone to hear the most embarrassing part of your conversation. There are those hours spent doing things you love that you just gobble up, revelling in your own happiness, and the bitter, painful minutes of deep personal pain.
Popular culture and everyday life often mention or feature these weirdly human notions of time being so relative to the scenario- Albert Einstein himself described relativity thus: “When you are talking to a nice girl, an hour seems like a second. When you have your hand on a bar of red-hot iron, a second seems like an hour”. In fact, when you think about it, it is almost as if time appears to be a living thing, at least in the context of our references to it. This, I think anyway, is the nub of the matter- time is something that we encounter, in its more thermodynamic form, every day of our lives, and just as pet owners will tend to anthropomorphise their pets’ facial expressions, so the human race has personified time in general conversation (well, at least in the western world- I cannot speak for anywhere non English-speaking with any certainty). Time is almost one of the family- ever-present, ever-around, ever-referred to, until it becomes as human as a long-lost friend, in its own little way.
Finally, on the subject of time, Mr Douglas Adams: “Time is an illusion; lunchtime doubly so.”