Poverty Changes

£14,000 is quite a large amount of money. Enough for 70,000 Freddos, a decade's worth of holidays, two new Nissan Pixos, several thousand potatoes or a gold standard racing pigeon. However, if you're trying to live off just that amount in modern Britain, it quickly seems quite a lot smaller. Half of that could easily disappear on rent, whilst the average British family will spend a further £4,000 on food (significantly more than the European average, for one reason or another). Then we must factor in tax, work-related expenses, various repair bills, a TV licence, utility & heating bills, petrol money and other transport expenses, and it quickly becomes apparent that trying to live on this amount will require some careful budgeting. Still, not to worry too much; it's certainly possible to keep the body and soul of a medium-sized family together on £14k a year, if not absolutely comfortably, and in any case 70% of British families have an annual income in excess of this amount. It might not be a vast amount to live on, but it should be about enough.

However, there's a reason I quoted £14,000 specifically: I recently saw a statistic saying that if your income is above 14 grand a year, you are among the richest 4% of people on planet Earth. Or, to put it another way, if you were on that income and were then to select somebody totally at random from our species, then 24 times out of 25 you would be the richer of the two.

Now, this slightly shocking fact, as well as being a timely reminder of the prevalence of poverty amongst fellow members of our species, raises, to me, an interesting question: if £14,000 is only just about enough to let one's life operate properly in modern Britain, how on earth does the vast majority of the world manage to survive at all on significantly less than this? More than 70% of the Chinese population (in 2008, admittedly; the rate of Chinese poverty is falling at a staggering rate thanks to its booming economy) lived on less than $5 a day, and 35 years ago more than 80% were considered to be in absolute poverty. How does this work? How does most of the rest of the world physically survive?

The obvious answer is that much of it barely does. Despite the last few decades of massive improvement in living standards and poverty levels across the world in general, the World Bank estimates that some 20% of the world's populace is living below the absolute poverty line of less than $1.50 (roughly £1) per person per day, or about £365 a year (down from around 45% in the early 1980s- Bob Geldof's message has packed a powerful punch). This is the generally accepted marker for having less than what a person can physically keep body and soul together on, and having such a huge proportion of people living below it tends to drag down the global average. Reducing poverty is something the last quarter-century has seen a definitive effort towards on the part of humanity, but it remains a truly vast issue across the globe.

However, to my mind the main reason a seemingly meagre amount of money in the first world would be considered bountiful wealth in the third comes down simply to how economics works. We in the west are currently enjoying the fruits of two centuries of free-market capitalism, which has fundamentally changed the way our civilisation functions. When we as a race first came up with the concept of civilisation, of pooling and exchanging skills and resources for the betterment of the collective, this was largely confined to the local community, or at least to the small scale. Farmers provided for those living in the surrounding twenty miles or so, as did brewers, hunters and all other such 'small businessmen', as they would be called today. The concept of a country provided security from invasion and legal support on a larger scale, but that was about it; any international trade was generally conducted between kings and noblemen, and was very much small scale.

However, since the days of the British Empire and the Industrial Revolution, business has got steadily bigger and bigger. It started out with international trade between the colonies, built on the rich untapped resources the European imperial powers found there, moved on to the industrial-scale manufacture of goods, and then to the high-intensity sale of consumer products to the general population. Now we have vast multinational companies organising long, exhaustive chains of supply, manufacture and retail, and our society has become firmly rooted in this intensely commercial international economy. Without constantly selling vast quantities of stuff to one another, the western world as we know it simply would not exist.

This process causes many side effects, but one is of particular interest: everything becomes more expensive. To summarise very simply, the basic principle of capitalism involves workers putting in work and skill to increase the value of something; that something then gets sold, and the worker gets some of the difference between the cost of materials and the cost of sale as a reward for their effort. For this to work, one's reward for putting in that effort must be enough to purchase the stuff needed to keep one alive; capitalism rests on the principle of our bodies being X% efficient at turning the food we eat into the energy we can use to work. If a business is successful, then the workers of a company (here the term 'workers' covers everyone from factory floor to management) will gain money in the long term, enabling them to spend more money. This means that the market increases in size, and people can either sell more goods or start selling them for a higher price, so goods become worth more, so the people making those goods start getting more money, and so on.

The net result of this is that in an 'expensive' economy, everyone has a relatively high income and high expenditure, because all goods, taxes, land, utilities etc. cost quite a lot; but, for all practical purposes, this results in a remarkably similar situation to a 'cheap' economy, where the full force of western capitalism hasn't quite taken hold yet- for, whilst the people residing there have less money, the stuff that is there costs less, not having been through the corporate wringer. So, why would we find it tricky to live on less money than the top 4% of the world's population? Blame the Industrial Revolution.
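To illustrate the point with a toy example (entirely made-up numbers, purely for the sake of the argument): if wages and prices inflate together, the nominal figures of an 'expensive' economy drift a long way from those of a 'cheap' one, while purchasing power barely moves.

```python
# Toy model with illustrative numbers only: a wage and the price of a weekly
# 'basket' of goods both rising ~3% a year. The number of baskets a year's wage
# can buy stays the same, which is the sense in which an 'expensive' economy
# ends up looking much like a 'cheap' one for the people living in it.
wage, basket_price = 14000.0, 70.0   # hypothetical annual wage and basket cost
inflation = 0.03

for year in (0, 10, 20, 30):
    w = wage * (1 + inflation) ** year
    p = basket_price * (1 + inflation) ** year
    print(f"year {year:2d}: wage £{w:9.0f}, basket £{p:6.2f}, "
          f"baskets affordable: {w / p:.0f}")
```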

Time is an illusion, lunchtime doubly so…

In the dim and distant past, time was, to humankind, a thing and not much more. There was light-time, then there was dark-time, then there was another lot of light-time; during the day we could hunt, fight, eat and try to stay alive, and during the night we could sleep and have sex. However, we also realised that there were some parts of the year with short days and colder nights, and others that were warmer, brighter and better for hunting. Being the bright sort, we humans realised that the amount of time the world spent in winter, spring, summer and autumn (fall is the WRONG WORD) was about the same each time around, and thought that, rather than just waiting for it to warm up every time, we could count how long one cycle (or year) took, so that we could work out when it was going to get warm the following year. This enabled us to plan our hunting and farming patterns, and it became recognised that some knowledge of how the year worked was advantageous to a tribe. Eventually, this got so important that people started building monuments to the annual seasonal progression, hence such weird and staggeringly impressive prehistoric engineering achievements as Stonehenge.

However, this basic understanding of the year and the seasons was only one step on the journey, and as we moved from a hunter-gatherer paradigm to more of a civilised existence, we realised the benefits that a complete calendar could offer us, and thus began our still-continuing quest to quantify time. Nowadays our understanding of time extends to clocks accurate to the nanosecond, and an understanding of relativity, but for a long time our greatest step in bringing organised time into our lives was the creation of the concept of the week.

Having seven days in the week is, to begin with, a strange idea; seven is an awkward prime number, and it seems odd that we didn't pick a number that is easier to divide and multiply by, like six, eight or even ten, as the basis for our temporal system. Six would seem to make the most sense; most of our months have around 30 days, or five six-day weeks, and 365 days a year is only one less than a multiple of six, which could surely be some sort of religious symbolism (and there would be an exact multiple on leap years- even better). And it would mean a shorter week, and more time spent on the weekend, which would be really great. But no, we're stuck with seven, and it's all the bloody moon's fault.

Y'see, the sun's daily cycle is useful for measuring short-term time (night and day), and the earth's orbit around it provides the crucial yearly change of season. However, the moon's cycle is roughly 28 days long (fourteen to wax, fourteen to wane, regular as clockwork), providing a nice intermediary time unit with which to divide the year into a more manageable number of pieces than 365. Thus, we began dividing the year up into 'moons' and using them as a convenient unit that we could check against every night. However, even a moon cycle is a bit long for day-to-day scheduling, and it proved advantageous for our distant ancestors to split it up even further. Unfortunately, 28 is an awkward number to divide into pieces, and its only factors (besides itself) are 1, 2, 4, 7 and 14. An increment of 1 or 2 days is simply too small to be useful, and a 4-day 'week' isn't much better. A 14-day week would hardly be an improvement on 28 for scheduling purposes, so seven is the only number of a practical size for the length of the week. The fact that months are now mostly 30 or 31 days rather than 28, to accommodate the awkward fact that there are 12.36 moon cycles in a year, hasn't changed matters, so we're stuck with an awkward 7-day cycle.
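The arithmetic is easy enough to check; here's a quick sketch using the figures quoted above, plus the true synodic month of roughly 29.5 days, which is where the awkward 12.36 comes from.

```python
# Candidate week lengths: divisors of the (rounded) 28-day lunar cycle.
lunar_cycle = 28
print([d for d in range(1, lunar_cycle) if lunar_cycle % d == 0])
# [1, 2, 4, 7, 14] -> 7 is the only practically sized option

# Lunar cycles per year, using the true synodic month of ~29.53 days,
# giving the awkward non-integer figure quoted above.
print(round(365 / 29.53, 2))   # ~12.36
```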

However, this wasn't the end of the issue for the historic time-definers (for want of a better word); there's not much advantage in defining a seven-day week if you can't then define which day of said week you want the crops to be planted on. Therefore, different days of the week needed names for identification purposes, and since astronomy had already provided our daily, weekly and yearly time structures, it made sense to look skyward once again when searching for suitable names. At this time, centuries before the invention of the telescope, we only knew of seven planets, those celestial bodies that could be seen with the naked eye: the sun, the moon (yeah, their definition of 'planet' was a bit iffy), Mercury, Venus, Mars, Jupiter and Saturn. It might seem to make sense, with seven planets and seven days of the week, to just name the days after the planets in a random order, but humankind never does things so simply, and the process of picking which day got named after which planet was a complicated one.

In around 1000 BC the Egyptians had decided to divide the daylight into twelve hours (because they knew how to pick a nice, easy-to-divide number), and the Babylonians then took this a stage further by dividing the entire day, including night-time, into 24 hours. The Babylonians were also great astronomers, and had thus discovered the seven visible planets- however, because they were a bit weird, they decided that each planet had its place in a hierarchy, and that this hierarchy was dictated by which planet took the longest to complete its cycle and return to the same point in the sky. This order was, for the record, Saturn (29 years), Jupiter (12 years), Mars (687 days), Sun (365 days), Venus (225 days), Mercury (88 days) and Moon (28 days). So, did they name the days after the planets in this order? Of course not, that would be far too simple; instead, they decided to start naming the hours of the day after the planets (I did say they were a bit weird) in that order, going back to Saturn when they got to the Moon.

However, 24 hours does not divide nicely by seven planets, so the planet after which the first hour of the day was named changed each day. The first hour of the first day of the week was named after Saturn, the first hour of the second day after the Sun, and so on. Since the list repeated itself each week, the Babylonians decided to name each day after the planet its first hour was named after, so we got Saturnday, Sunday, Moonday, Marsday, Mercuryday, Jupiterday and Venusday.
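The cycling scheme is mechanical enough to reproduce in a few lines; here's a quick sketch of it (a toy reconstruction of the scheme as described above, not a historical source).

```python
# Planets in the Babylonian order (longest cycle first), assigned to the 24
# hours of each day in rotation; each day is named after its first hour's planet.
planets = ["Saturn", "Jupiter", "Mars", "Sun", "Venus", "Mercury", "Moon"]

day_names, hour = [], 0
for day in range(7):
    day_names.append(planets[hour % len(planets)])  # planet ruling hour one of this day
    hour += 24                                      # advance a full day of hours

print(day_names)
# ['Saturn', 'Sun', 'Moon', 'Mars', 'Mercury', 'Jupiter', 'Venus']
```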

Now, you may have noticed that these are not quite the days of the week we English speakers are used to, and for that we can blame the Vikings. The planetary method for naming the days of the week was brought to Britain by the Romans, and when they left the Britons held on to the names. However, Britain then spent the next seven centuries getting repeatedly invaded and conquered by various foreigners, and for most of that time it was the Germanic Vikings and Saxons who fought over the country. Both groups worshipped the same gods, those of Norse mythology (so Thor, Odin and so on), and one of the practices they introduced was to replace the names of four days of the week with those of four of their gods; Tyr'sday, Woden'sday (Woden was the Saxon word for Odin), Thor'sday and Frig'sday replaced Marsday, Mercuryday, Jupiterday and Venusday in England, and in time the shifting nature of language renamed the days of the week Saturday, Sunday, Monday, Tuesday, Wednesday, Thursday and Friday.

However, the old planetary names remained in the Romance languages (the French translations of the days Tuesday to Friday are mardi, mercredi, jeudi and vendredi), with one small exception. When the Roman Empire went Christian in the fourth century, the Ten Commandments dictated that they remember the Sabbath day; but, to avoid copying the Jews (whose Sabbath was on Saturday), they chose to make Sunday the Sabbath day. It is for this reason that Monday, the first day of the working week after one's day of rest, became the start of the week, taking over from the Babylonians' choice of Saturday, but close to Rome they went one stage further and renamed Sunday 'dies Dominica', or Day of the Lord. The practice didn't catch on in Britain, over a thousand miles from Rome, but the modern Spanish, French and Italian words for Sunday are domingo, dimanche and domenica respectively, all of which are locally corrupted forms of 'dies Dominica'.

This is one of those posts that doesn't have a natural conclusion, or even much of a point to it. But hey; I didn't start writing this because I wanted to make a point, but more to share the kind of stuff I find slightly interesting. Sorry if you didn't find it so.

Why the chubs?

My last post dealt with the thorny issue of obesity, both its increasing presence in our everyday lives and what for me is the underlying reason behind the stats that back up media scare stories concerning 'the obesity epidemic'- the rise in size of the 'average' person over the last few decades. The precise causes of this trend can be put down to a whole host of societal factors within our modern age, but that story is boring as hell and has been repeated countless times by commentators far more adept in this field than me. Instead, today I wish to present the case for modern-day obesity as a problem concerning the fundamental biology of a human being.

We, and our dim and distant ancestors of the scaly/furry variety, have spent the last few million years living wild; hunting, fighting and generally acting much like any other product of evolution. Thus, we can learn a lot about our own inbuilt biology and instincts by studying the behaviour of animals alive today, and when we do so, several interesting animal eating habits become apparent. As anyone who has tried it as a child can attest (and I speak from personal experience), grass is not good stuff to eat. It's tough, it takes a lot of chewing and processing (many herbivores have multiple stomachs to make sure they squeeze the maximum nutritional value out of their food), and there really isn't much energy in it to power a fully-functional being. As such, grazers on grass and other such tough plant matter (such as leaves) will spend most of their lives doing nothing but guzzle the stuff, trying to get as much as possible through their system. Other animals will favour food with a higher nutritional content, such as fruits, tubers or, in many cases, meat, but these frequently present issues. Fruits are highly seasonal and rarely available in a large enough volume to support a large population, as well as being quite hard to eat in large quantities; plants try to 'design' fruits so that each visitor takes only a few at a time, so as best to spread their seeds far and wide, and as such there are few animals that can sustain themselves on such a diet. Other foods, such as tubers or nuts, are hard to get at, needing to be dug up or broken open in highly energy-consuming activities, whilst meat has the annoying habit of running away or fighting back whenever you try to get at it. As anyone who watches nature documentaries will attest, most large predators will only eat once every few days (admittedly rather heavily).

The unifying factor in all of this is that food in the wild is highly energy- and time-consuming to get hold of and consume, since every source of it guards its prize jealously. Therefore, any animal that wants to survive in this tough world must be near-constantly in pursuit of food simply to fulfil all of its life functions, a state characterised by being perpetually hungry. Hunger is the body's way of telling us that we should get more food, and in the wild this constant desire for more is kept in check by the difficulty of getting hold of it. Similarly, animal bodies try to temper this desire by being lazy; if something isn't necessary, then there's no point wasting valuable energy going after it (since this would mean spending more time going after food to replace the lost energy).

However, in recent history (and a spectacularly short period of time from evolution's point of view), one particular species called Homo sapiens came up with this great idea called civilisation, which basically entailed the pooling and sharing of skill and resources in order to best benefit everyone as a whole. As an evolutionary success story, this is right up there with developing multicellular body structures in terms of being awesome, and it has enabled us humans to live far more comfortable lives than our ancestors did, with correspondingly far greater access to food. This has proved particularly true over the last two centuries, as technological advances in a more democratic society have improved the everyman's access to food and comfortable living to a truly astounding degree. Unfortunately (from the point of view of our waistlines) the instincts of our bodies haven't quite caught up with the idea that when we want or need food, we can just get food, without all that inconvenient running around after it getting in the way. Not only that, but a lack of pack hierarchy combined with this increased availability means that we can stock up on food until we have eaten our absolute fill if we so wish; the difference between 'satiated' and 'stuffed' can work out at well over 1,000 calories per meal, and over a long period of time it only takes a little more than we should be having every day to start packing on the pounds. Combine that with our natural predilection for laziness, meaning that we don't naturally think of going out for some exercise as fun purely for its own sake, and the fact that we no longer burn calories chasing our food, or in the muscles we would build up from said chasing, and we find ourselves consuming a lot more calories than we really should be.
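To put a rough number on that (a back-of-envelope sketch: the figure of roughly 7,700 kcal per kilogram of body fat is a widely quoted rule of thumb rather than a precise constant, and the daily surplus used here is purely illustrative):

```python
# Back-of-envelope estimate of how a small, persistent calorie surplus adds up.
kcal_per_kg_fat = 7700   # commonly quoted rule of thumb, not a precise constant
daily_surplus = 200      # illustrative: a modest snack's worth above requirements

yearly_gain_kg = daily_surplus * 365 / kcal_per_kg_fat
print(f"roughly {yearly_gain_kg:.1f} kg gained per year")   # ~9.5 kg
```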

Not only that, but during this time we have also got into the habit of spending a lot of time worrying over the taste and texture of our food. This means that, unlike our ancestors, who were just fine with simply jumping on a squirrel and devouring the thing, we have to go through the whole rigmarole of getting stuff out of the fridge and spending two hours slaving away in a kitchen attempting to cook something vaguely resembling a tasty meal. This wait is not something our bodies enjoy very much, meaning we often turn to 'quick fixes' when in need of food: stuff like bread, pasta or ready meals. Whilst we all know how much crap goes into ready meals (which should, as a rule, never be bought by anyone who cares even in the slightest about their health; the salt content of those things is insane) and other such 'quick fixes', fewer people are aware of the impact a high intake of whole grains can have on our bodies. Stuff like bread and rice only started being eaten by humans a few thousand years ago, as we discovered the benefits of farming and cooking, and whilst they are undoubtedly a good food source (and are very, very difficult to cut from one's diet whilst still remaining healthy) our bodies have simply not had enough time, evolutionarily speaking, to get used to them. This means they have a tendency not to make us feel as full as their calorie content would suggest, meaning that we eat more than our body in fact needs (if you want to feel full whilst not taking in so many calories, protein is the way to go; meat, fish and dairy are great for this).

This is all rather academic, but what does it mean for you if you want to lose a bit of weight? I am no expert on this, but then again neither are most of the people acting as self-proclaimed nutritionists in the general media, and anyway, I don't have any better ideas for posts. So, look out for my next post for my, admittedly basic, advice for anyone trying to make themselves that little bit healthier, especially if you're trying to work off a few of the pounds built up over this festive season.

The Conquest of Air

Everybody in the USA, and in fact just about everyone across the world, has heard of Orville and Wilbur Wright. Two of the pioneers of aviation, they finally achieved one of man's long-held dreams- control and mastery of air travel- when their experimental biplane Flyer made the first ever manned, powered, heavier-than-air flight on the morning of December 17, 1903.

However, what is often puzzling when considering the Wright brothers’ story is the number of misconceptions surrounding them. Many, for instance, are under the impression that they were the first people to fly at all, inventing all the various technicalities of lift, aerofoil structures and control that are now commonplace in today’s aircraft. In fact, the story of flight, perhaps the oldest and maddest of human ambitions, an idea inspired by every time someone has looked up in wonder at the graceful flight of a bird, is a good deal older than either of them.

Our story begins, as does nearly all technological innovation, in imperial China, around 300 BC (the Greek scholar Archytas had admittedly made a model wooden pigeon 'fly' some 100 years previously, but nobody is sure exactly how he managed it). The Chinese's first contribution was the invention of the kite, an innovation that would be insignificant if it wasn't for whichever nutter decided to build one big enough to fly in. However, being strapped inside a giant kite and sent hurtling skywards not only took some balls, but was heavily dependent on wind conditions, heinously dangerous and dubiously useful, so in the end the Chinese gave up on manned flight and turned instead to unmanned ballooning, which they used for both military signalling and ceremonial purposes. It isn't actually known if they ever successfully put a man into the air using a kite, but they almost certainly gave it a go. The Chinese did have one further attempt, this time at inventing the rocket engine, some years later, in which a young and presumably mental man theorised that if you strapped enough fireworks to a chair then they would send the chair and its occupant hurtling into the night sky. His prototype (predictably) exploded, and it wasn't for the best part of two millennia, after the passage of classical civilisation, the Dark Ages and the Renaissance, that anyone seriously tried flight again.

That is not to say that the idea didn't stick around. The science was, admittedly, beyond most people, but as early as 1500 Leonardo da Vinci, after close examination of bird wings, had successfully deduced the principle of lift and made several sketches showing designs for a manned glider. The design was never tested, and not fully rediscovered for many hundreds of years after his death (Da Vinci was not only a controversial figure and far ahead of his time, but wrote his notebooks in a code that took centuries to decipher), but modern-day experiments have shown that his design would probably have worked. Da Vinci also put forward the popular idea of ornithopters, aircraft powered by a flapping motion as in bird wings, and many subsequent would-be aviators attempted to emulate this method of propulsion. Needless to say, these attempts all failed (not least because very few of the inventors concerned actually understood aerodynamics).

In fact, it wasn't until the late 18th century that anyone started to make real headway in the pursuit of flight. In 1783, a Parisian physics professor, Jacques Charles, built on the work of several Englishmen concerning the newly discovered hydrogen gas and the properties and behaviour of gases themselves. Theorising that, since hydrogen was less dense than air, it should follow Archimedes' principle of buoyancy and rise, thus enabling it to lift a balloon, he launched the world's first hydrogen balloon from the Champ de Mars on August 27th. The balloon was only small, and there were significant difficulties encountered in building it, but in the design process Charles, aided by his engineers the Robert brothers, invented a method of treating silk to make it airtight, paving the way for future pioneers of aviation. Whilst Charles made considerable headway in the launch of ever-larger hydrogen balloons, he was beaten to the next significant milestones by the Montgolfier brothers, Joseph-Michel and Jacques-Etienne. In that same year, their far simpler hot-air balloon designs not only put the first living things (a sheep, a rooster and a duck) into the atmosphere but, just a month later, a human too- Jacques-Etienne was the first European, and probably the first human, ever to fly.
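Charles' reasoning is easy to put rough numbers on (a sketch using typical sea-level gas densities; the balloon volume and envelope mass below are purely illustrative):

```python
# Archimedes' principle applied to a hydrogen balloon: net lift per cubic metre
# is the difference between the density of the displaced air and the gas inside.
air_density = 1.2        # kg/m^3, typical near sea level
hydrogen_density = 0.09  # kg/m^3, at similar temperature and pressure

lift_per_m3 = air_density - hydrogen_density   # ~1.1 kg of lift per cubic metre
balloon_volume = 35.0                          # m^3, illustrative small balloon
envelope_mass = 10.0                           # kg, illustrative

spare_lift = lift_per_m3 * balloon_volume - envelope_mass
print(f"spare lift: {spare_lift:.1f} kg")      # roughly 29 kg
```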

After that, balloon technology took off rapidly (no pun intended). The French quickly became masters of the air, being the first to cross the English Channel and the creators of the first steerable and powered balloon flights. With Charles' hydrogen balloons eventually settled upon as the preferable method of flight, blimps and airships began, over the next century or so, to become an accepted method of travel, and would remain so right up until the Hindenburg disaster of 1937, which rather put people off the idea. For some scientists and engineers, humankind had made it- we could now fly, could control where we were going at least partially independently of the elements, and any attempt to do so with a heavier-than-air machine was a waste of both time and money, the preserve of dreamers. Nonetheless, to change the world you sometimes have to dream big, and that was where Sir George Cayley came in.

Cayley was an aristocratic Yorkshireman, a skilled engineer and inventor, and a magnanimous, generous man- he offered all of his inventions for the public good and expected no payment for them. He dabbled in a number of fields, including seatbelts, lifeboats, caterpillar tracks, prosthetics, ballistics and railway signalling. In his pursuit of flight, he even reinvented the wheel- he came up with the idea of holding a wheel in place using thin metal spokes under tension rather than solid ones under compression, in an effort to make his wheels lighter, and is thus responsible for making all modern bicycles practical to use. However, he is most famous for being the first man ever, in 1853, to put somebody into the air using a heavier-than-air glider (although Cayley may have put a ten-year-old in a biplane four years earlier).

The man in question was Cayley's chauffeur (or butler- historical sources differ widely), who was (perhaps understandably) so unenthused by his boss's mental contraption that he handed in his notice upon landing after his flight across Brompton Dale, stating as his reason that 'I was hired to drive, not fly'. Nonetheless, Cayley had shown that the impossible could be done- man could fly using just wings and wheels. He had also designed the aerofoil from scratch, identified the forces of thrust, lift, weight and drag that govern an aircraft's movements, and paved the way for the true pioneer of 'heavy' flight- Otto Lilienthal.

Lilienthal (aka 'The Glider King') was another engineer, filing 25 patents in his life, including a revolutionary new engine design. But his fame comes from a world without engines- the world of the sky, with which he was obsessed. He was just a boy when he first strapped wings to his arms in an effort to fly (which obviously failed completely), and he later published works detailing the physics of bird flight. It wasn't until 1891, aged 43, once his career and financial position were stable and he had finished fighting in the Franco-Prussian War, that he began to fly in earnest, building around 12 gliders over a 5-year period (of which 6 still survive). It might have taken him a while, but once he started there was no stopping him, as he made over 2,000 flights in just 5 years (averaging more than one every day). During this time he was only able to rack up 5 hours of flight time (meaning his average flight lasted just 9 seconds), but his contribution to his field was enormous. He was the first to be able to control and manoeuvre his machines by varying his position and weight distribution, a factor whose importance he realised was absolutely paramount, and he also recognised that powered flight (a pursuit that had been proceeding largely unsuccessfully for the past 50 years) could not properly be achieved without a grounding in unpowered glider flight and in working in harmony with aerodynamic forces. Tragically, one of Lilienthal's gliders crashed in 1896, and he died after two days in hospital.

But his work lived on, and the story of his exploits and his death reached across the world, including to a pair of brothers living in Dayton, Ohio, USA, by the name of Wright. Together, the Wright brothers made huge innovations- they redesigned the aerofoil to be more efficient, revolutionised aircraft control using wing warping technology (another idea possibly invented by da Vinci), conducted hours of testing in their own wind tunnel, built dozens of test gliders and brought together the work of Cayley, Lilienthal, da Vinci and a host of other, mostly sadly dead, pioneers of the air. The Wright brothers are undoubtedly the conquerors of the air, being the first to show that man need not be constrained by either gravity or wind, but can use the air as a medium of travel unlike any other. But the credit is not theirs alone- it is shared between all those who lived and died in pursuit of the dream of flying like birds. To quote Lilienthal's dying words, as he lay crippled by the mortal injuries from his crash: 'Sacrifices must be made'.

The Land of the Red

Nowadays, the country to talk about if you want to be seen as politically forward-looking is, of course, China. The most populous nation on Earth (containing 1.3 billion souls) with an economy and defence budget second only to the USA's in size, it also features a gigantic manufacturing and raw materials extraction industry, the world's largest standing army and one of only five remaining communist governments. In many ways, this is China's second boom as a superpower, after its early forays into civilisation and technological innovation around the time of Christ made it the world's largest economy for most of the intervening time. However, the technological revolution that swept the Western world in the two or three hundred years during and preceding the Industrial Revolution (which, according to QI, was entirely due to the development and use of high-quality glass in Europe, a material almost totally unheard of in China, having been invented in Egypt and popularised by the Romans) rather passed China by, leaving it a severely underdeveloped nation by the nineteenth century. After around 100 years of bitter political infighting, during which time the 2,000-year-old Imperial China was replaced by a republic whose control was fiercely contested between nationalists and communists, the chaos of the Second World War destroyed most of what was left of the system. The Second Sino-Japanese War (as that particular branch of WWII was called) killed around 20 million Chinese civilians, the second-biggest loss suffered by any country after the Soviet Union, as a Japanese army fresh from its own recent and rapid modernisation went on a rampage of rape, murder and destruction throughout underdeveloped northern China, where some warlords still fought with swords. The war also annihilated the nationalists, leaving the communists free to sweep to power after the Japanese surrender and establish the now 63-year-old People's Republic, then led by former librarian Mao Zedong.

Since then, China has changed almost beyond recognition. During the idolised Mao's reign, the Chinese population near-doubled in an effort to increase the available workforce, an idea tried far less successfully in other countries around the world with significantly less space to fill. This population was then put to work during Mao's "Great Leap Forward", in which he tried to move his country away from its previously agricultural economy and into a more manufacturing-centric system. However, whilst the Chinese government insists to this day that the three subsequent years of famine were entirely due to natural disasters such as drought and poor weather, and only killed 15 million people, most external commentators agree that the sudden change in the availability of food brought about by the Great Leap certainly contributed to a death toll actually estimated to be in the region of 20-40 million. Oh, and the whole business was an economic failure, as farmers uneducated in modern manufacturing techniques attempted to produce steel at home, resulting in a net replacement of useful food with useless, low-quality pig iron.

This event in many ways typifies the Chinese way- that if millions of people must suffer in order for things to work out better in the long run and on the numbers sheet, then so be it, partially reflecting the disregard for the value of life historically also common in Japan. China is a country that has said it would, in the event of a nuclear war, consider the death of 90% of its population acceptable losses so long as it won; a country whose main justification for the "Great Leap Forward" was to try and bring about a state of social structure and culture that the government could effectively impose socialism upon, as it attempted to do during its "Cultural Revolution" in the mid-sixties. All the latter served to do was get a lot of people killed, cause a decade of absolute chaos and all but destroy China's education system. And despite reaffirming Mao's godlike status (partially thanks to an intensification of his personality cult), some of his actions rather shamed the governmental high-ups, forcing the party to take the angle that, whilst his guiding thought was of course still the foundation of the People's Republic and entirely correct in every regard, his actions were somehow separate from that and could be quietly brushed under the carpet. It did help that, by this point, Mao was dead and was unlikely to have them all hanged for daring to question his actions.

But, despite all this chaos, all the destruction and all the political upheaval (nowadays the government is still liable to arrest anyone who suggests that the Cultural Revolution was a good idea), these things shaped China into the powerhouse it is today. It may have slaughtered millions of people and resolutely not worked for 20 years, but Mao's focus on a manufacturing economy has now started to bear fruit, giving the Chinese economy a stable footing that many countries would dearly love in these days of economic instability. It may have an appalling human rights record and have presided over the large-scale destruction of the Chinese environment, but Chinese communism has allowed the government to control its labour force and industry effectively, allowing it to escape the worst ravages of the last few economic downturns and preventing internal instability. And the extent to which it has imposed itself upon the people of China for decades, forcing them into the party line with an iron fist, has allowed its controls to be gently relaxed in the modern era whilst keeping the government's position secure, to an extent satisfying the criticisms of western commentators. Now, China is rich enough and positioned solidly enough to placate its people, to keep up its education system and to build cheap housing for the proletariat. To an accountant, therefore, this has all worked out in the long run.

But we are not all accountants or economists- we are members of the human race, and there is more for us to consider than just some numbers on a spreadsheet. The Chinese government employs thousands of internet security agents to ensure that ‘dangerous’ ideas are not making their way into the country via the web, performs more executions annually than the rest of the world combined, and still viciously represses every critic of the government and any advocate of a new, more democratic system. China has paid an enormously heavy price for the success it enjoys today. Is that price worth it? Well, the government thinks so… but do you?

The Hidden Benefits

Corporations are having a rather rough time of it at the minute in the PR department. This is only to be expected given the current economic climate, and given the fact that almost exactly the same feelings of annoyance and distrust were expressed during the other two major economic downturns of the last 100 years. Big business has always been the all-pervasive face of ‘the man’, and when said man has let us down (either during a downturn or at any point in history when somebody is holding a guitar), they tend to be (often justifiably) the main victims of hatred. In essence, they are ‘the bad guys’.

However, no matter how cynical you are, there are a couple of glaring inconsistencies in this concept- things that can either (depending on your perspective) make the bad guys seem nice, make nice things seem secretly evil, or just make you go “WTF?”. Here we can find the proverbial shades of grey.

Let us consider, for instance, tourism. Nobody who lives anywhere even remotely pretty or interesting likes tourists, and some of the local nicknames for them, especially in coastal areas for some reason, are simultaneously interesting, hilarious and bizarre. They are an annoying bunch of people, seeming always to be asking dumb questions and trailing around places like flocks of lost sheep, and with roughly the same mental agility- although since the rest of us all act exactly the same when we are on holiday, it's probably better to tolerate them a little. Then there is the damage they can do to a local area, ranging from footpath erosion and littering to the case of the planet Bethselamin, "which is now so worried about the cumulative erosion of 10 billion visiting tourists a year that any net imbalance between the amount you eat and the amount you excrete whilst on the planet is surgically removed from your body weight when you leave- so every time you go to the lavatory there it is vitally important to get a receipt" (Douglas Adams again). The tourism industry is often accused of stifling local economies in places like Yorkshire or the Lake District, where entire towns can consist of nothing but second homes (sending the local housing market haywire), tea shops and B&Bs, with seemingly no way out of a spiral of dependence upon it.

However, what if I was to tell you that tourism is possibly the single most powerful force acting towards the preservation of biodiversity and the combating of climate change? You might think me mad, but consider this- why is there still Amazonian rainforest left? Why are there vast tracts of national park all over southern Africa? We might (and in fact should) be able to think of dozens of very good reasons for preserving these habitats, not least the benefit of making sure that all of our great planet's inhabitants are allowed to survive without being crushed under the proverbial bulldozer that is civilisation, and the environmental value of the rainforests as a carbon sink. But, unfortunately, when viewed from a purely clinical standpoint these arguments do not stand up. Consider the rainforest- depending on your perspective this is either a natural resource that is useful for all sorts of namby-pamby reasons like ensuring the planet doesn't suffocate, or a source of a potentially huge amount of money. Timber is valuable stuff, especially given the types (such as mahogany) and sizes of trees one gets in the Amazon basin. Factor in that gain with the fact that many of the countries that own such rainforest are desperately poor and badly need the cash, and suddenly the plight of the Lesser Purple-Crested Cockroach seems less important.

And here tourists come to the rescue, for they are the sole financial justification for the preservation of the rainforests. The idea of keeping all this natural biodiversity for people to enjoy is all well and good, but backed up by the prospect of people paying large sums of money to come and see it, it becomes doubly attractive, interesting governments in potential long-term financial gain rather than the quick buck to be made from simply using up their various natural resources from a purely industrial point of view.

Tourism is not the only industry that props up an entire section of life that we all know and love. Let me throw some names at you: Yahoo, Facebook, Google, Twitter. What do all of those (and many others besides) have in common? Firstly, all are based on the internet, and secondly, the services they offer are entirely free. Contrast that against similarity three: all are multi-billion-dollar companies. How does this work? Answer, similarity four: all gain their income from the advertising industry.

Advertising and marketing is another sector of modern business that we all hate, as adverts are annoying by their very presence, and can be downright offensive in some cases. Aggressive marketing is basically the reason we can't have nice things generally, and there is something particularly soulless about an industry whose sole purpose is to sell you things based on what they say, rather than on what's good about whatever they're selling. It is perhaps the personification of the evils of big business, and yet without it, huge tracts of the internet, the home of the rebellion against modern consumer culture, would simply not be able to exist. Without advertising, the information Facebook has on its hundreds of millions of users would be financially useless, as would the users themselves, and thus it would not be able to exist as a company or, probably, as an entity at all, let alone one that has just completed one of the highest-value stock market flotations in commercial history. Google would perhaps exist merely as a neat idea, something a geek might have thought of in college and never been able to turn into a huge business that handles a gigantic share of web traffic as well as running its own social network, email service and even the web browser I am typing this on.

This doesn't suddenly make advertisers and tourism companies the angels of the business world, and they are probably just as deserving of all the cynicism they get (equally deserving, probably, are Facebook and Google, but that would ruin my argument). But it's worth remembering that, no matter how pushy or annoying they start to get, it may be a small price to pay for the benefits their very existence lends to us.