Poverty Changes

£14,000 is quite a large amount of money. Enough for 70,000 Freddos, a decade’s worth of holidays, two new Nissan Pixos, several thousand potatoes or a gold-standard racing pigeon. However, if you’re trying to live off just that amount in modern Britain, it quickly seems quite a lot smaller. Half of it could easily disappear on rent, whilst the average British family will spend a further £4,000 on food (significantly more than the European average, for one reason or another). Then we must factor in tax, work-related expenses, various repair bills, a TV licence, utility and heating bills, petrol money and other transport expenses, and it quickly becomes apparent that trying to live on this amount will require some careful budgeting. Still, not to worry too much; it’s certainly possible to keep the body and soul of a medium-sized family together on £14k a year, if not absolutely comfortably, and in any case 70% of British families have an annual income in excess of this amount. It might not be a vast amount to live on, but it should be about enough.

However, there’s a reason I quoted £14,000 specifically, because I recently saw another statistic saying that if your income is above 14 grand a year, you are among the richest 4% of people on planet Earth. Or, to put it another way, if you were on that income and then selected somebody totally at random from our species, 24 times out of 25 you would be richer than them.

Now, this slightly shocking fact, as well as being a timely reminder of the prevalence of poverty amongst fellow members of our species, raises an interesting question for me: if £14,000 is only just about enough to let one’s life operate properly in modern Britain, how on earth does the vast majority of the world manage to survive at all on significantly less than this? More than 70% of the Chinese population lived on less than $5 a day (in 2008, admittedly; the rate of Chinese poverty is decreasing at a staggering rate thanks to its booming economy), and 35 years ago more than 80% were considered to be in absolute poverty. How does this work? How does most of the rest of the world physically survive?

The obvious starting point is that much of it barely does. Despite the last few decades of massive improvement in living standards and poverty levels across the world in general, the World Bank estimates that some 20% of the world’s populace is living below the absolute poverty line of less than $1.50 per person per day, or roughly £365 a year (down from around 45% in the early 1980s; Bob Geldof’s message has packed a powerful punch). This is the generally accepted threshold below which a person cannot physically keep body and soul together, and having such a huge proportion of people living below it tends to drag down the global average. The last quarter of a century has seen a definitive effort on the part of humanity to reduce poverty, but it remains a truly vast issue across the globe.

However, to me the main factor behind how a seemingly meagre amount of money in the first world would be considered bountiful wealth in the third comes down simply to how economics works. We in the west are currently enjoying the fruits of two centuries of free-market capitalism, which has fundamentally changed the way our civilisation functions. When we as a species first came up with the concept of civilisation, of pooling and exchanging skills and resources for the betterment of the collective, this was largely confined to the local community, or at least to the small scale. Farmers provided for those living in the surrounding twenty miles or so, as did brewers, hunters, and all other such ‘small businessmen’, as they would be called today. The concept of a country provided security from invasion and legal support on a larger scale, but that was about it; any international trade was generally conducted between kings and noblemen, and was very much small scale.

However, since the days of the British Empire and the Industrial Revolution, business has got steadily bigger and bigger. It started out with international trade between the colonies, and the rich untapped resources the European imperial powers found there, moved on to the industrial-scale manufacture of goods, and then to the high-intensity sale of consumer products to the general population. Now we have vast multinational companies organising long, exhaustive chains of supply, manufacture and retail, and our society has become firmly rooted in this intensely sales-driven international economy. Without constantly selling vast quantities of stuff to one another, the western world as we know it simply would not exist.

This process causes many side effects, but one is of particular interest: everything becomes more expensive. To summarise very simply, the basic principle of capitalism involves workers putting in work and skill to increase the value of something; that something then gets sold, and the worker gets some of the difference between the cost of materials and the price of sale as a reward for their effort. For this to work, one’s reward for putting in that effort must be enough to purchase the stuff needed to keep one alive; capitalism ultimately rests on our bodies being efficient enough at turning the food we eat into the energy we need to work. If business is successful, then the workers of a company (here the term ‘workers’ covers everyone from factory floor to management) will gain money in the long term, enabling them to spend more. This means the market increases in size, and people can either sell more goods or start selling them at a higher price; so goods become worth more, so the people making those goods start earning more money, and so on.

The net result is that in an ‘expensive’ economy, everyone has a relatively high income and high expenditure, because all goods, taxes, land, utilities and so on cost quite a lot; but, for all practical purposes, this produces a remarkably similar situation to a ‘cheap’ economy where the full force of western capitalism hasn’t quite taken hold yet, for whilst the people living there have less money, the stuff that is there costs less, not having been through the corporate wringer. So, why would we find it tricky to live on less money than the top 4% of the world’s population? Blame the Industrial Revolution.
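
To put some rough numbers on that (all of them invented purely for illustration, not drawn from any real data), here is a toy sketch of the purchasing-power point:

```python
# A toy comparison with entirely invented figures: a high-income, high-price
# economy versus a low-income, low-price one. The point is that real
# purchasing power can end up surprisingly similar in both.

def baskets_affordable(annual_income, basket_price):
    """How many 'baskets of essentials' a year's income buys."""
    return annual_income / basket_price

# Hypothetical 'expensive' economy: a big income, but everything costs more.
expensive = baskets_affordable(annual_income=14_000, basket_price=700)
# Hypothetical 'cheap' economy: a far smaller income, and far smaller prices.
cheap = baskets_affordable(annual_income=1_200, basket_price=60)

print(expensive, cheap)  # -> 20.0 20.0: wildly different incomes, similar real standard of living
```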

“If I die before I wake…”

…which I might well do when this post hits the internet, in which case I hope somebody will at least look down upon my soul and life’s work favourably. Today I am going to be dealing with the internet’s least favourite topic, an idea adherence to which will get you first derided and later inundated with offers to go and be slaughtered in your bed, a subject that should be taboo for any blogger hoping not to infuriate everybody: religion.

I am not a religious person; despite a nominally Anglican upbringing, my formative years found most of my Sundays occupied on the rugby pitch, whilst a deep interest in science tended to form the foundations of my world beliefs, I think (sometimes) to some personal detriment. This is a pattern I see regularly among the people I keep as company (which may or may not say something about my choice of friends): predominantly atheists with little or no religious upbringing who tend to steer clear of religion and its various associated features wherever possible. However, where I differ from them tends to be when the subject is broached in the presence of a devoutly Christian friend of mine; whilst I tend to leave his beliefs to himself and try not to spark an argument, many others I know see a demonstration of his beliefs as a cue to start on a campaign of ‘ha ha, isn’t your world philosophy stupid’, and so on. I find these attacks more baffling, and a little saddening, than anything else, so I thought I might take this opportunity to take my usual approach and try to analyse the issue.

First up is a fact that most people are aware of, even if it hasn’t quite made the jump into an articulate thought yet: every religion is in fact two separate parts. The first of these can be dubbed the ‘faith’ aspect; the stories, the gods, the code of morals and general life guidelines, all the bits that form the core of a system of beliefs and are, to a theist, the ‘godly’ part of their religion. The second can be labelled the ‘church’ aspect; this is the more man-made, even artificial, side of the religious system, and covers the priesthood (or equivalent) of each religion, their holy buildings, the religious leaders, and even people’s personal interpretations of the ‘faith’ aspect. Holy books, such as the Bible or Torah, fall somewhere in between (Muslims believe, for example, that the Qur’an is literally the word of Allah, revealed through the Prophet Muhammad), as do the various prayers and religious music. In Buddhism, these two aspects are known as the Dharma (teachings) and Sangha (community), and together with the Buddha form the ‘three jewels’ of the religion. In some religions, such as Scientology (if that can technically be called a religion), the two aspects are so closely entwined as to be hard to separate, but they are still distinct and should be treated separately. The ‘faith’ aspect is, in most respects, the really important one, for it is this that actually forms the basis of a religion; without a belief system, a church is nothing more than a place where people go to shout their views at those who inexplicably turn up. A religion’s ‘church’ aspect is its organised arm, and exists for no greater or lesser purpose than to spread, cherish, protect and correctly interpret the word of God, or the other parts of the ‘faith’ aspect generally. This distinction is vital when we consider how great a difference there can be between what somebody believes and what another does in the same name.

For example, consider the ultra-fundamentalist Taliban currently fighting their jihad (a word which does not, on an unrelated note, technically translate as ‘holy war’, and the two should not be thought of as synonymous) in Afghanistan against the USA and other western powers. Their personal interpretation of the Qur’an and the teachings of Islam (their ‘church’ aspect) has led them to believe that women do not deserve equal rights to men, that the western powers are ‘infidels’ who should be purged from the world, and that they must use force and military intervention to defend Islam from said infidels; hence why they are currently fighting a massive war that is getting huge numbers of innocent civilians killed and destroying their faith’s credibility. By contrast, there are nearly 2 million Muslims currently living in the UK, the vast majority of whom do not interpret their religion in the same way and are not currently blowing up any buildings; and yet they still identify as Muslim and believe in, broadly speaking, the same faith. To pick a perhaps more ‘real world’ example, I’m sure that the majority of Britain’s Catholic population steadfastly disagree with the paedophilia practised by some of their Church’s priests, and that a certain proportion also disagree with the Pope’s views on the rights of homosexuals; and yet they are still just as Christian as their priests, are devout believers in the teachings of God and Jesus, and try to follow them as best they can.

This, I feel, is the nub of the matter: that one can be simultaneously a practising Christian, Muslim, Jew or whatever else, and still be a normal human being. Just because your vicar holds one view doesn’t mean you hold the same, and just because some people choose to base their entire life around their faith does not mean that a person must be defined by their belief system. And, returning to the ridicule many practising theists suffer, just because the ‘church’ aspect of a religion does something silly doesn’t mean all its practitioners deserve to be tarred with the same brush, or that their view of the world should even matter to you as you enjoy life in your own way (unless, of course, their belief actively impedes you in some way).

I feel like I haven’t really got my point across properly, so I’ll leave you with a few links that I think illustrate quite well what I’m trying to get at. I only hope that it will help others find a little more tolerance towards those who have found a religious path.

And sorry for this post being rather… weird

Today

Today, as I’m sure very few of you will be aware (hey, I wasn’t until a few minutes ago), is World Mental Health Day. I have touched on my own personal experiences of mental health problems before, having spent the last few years suffering from depression, but I feel today is an appropriate time to bring it up again, because this is an issue that, in the modern world, cannot be talked about enough.

Y’see, conservative estimates claim at least 1 in 4 of us will suffer from a mental health problem at some point in our lives, be it a relatively temporary one such as post-natal depression or a lifelong battle with the likes of manic-depressive disorder or schizophrenia. Mental health problems are also among the five biggest killers in the developed world, through a mixture of suicide, drug use, self-harm and self-neglect, and as such there is next to zero chance that you will go through your life without somebody you know very closely suffering, or even dying, as a result of what’s going on in their upstairs. If mental health disorders were a disease in the traditional sense, this would be labelled a red-alert, emergency-level pandemic.

However, despite the prevalence and danger associated with mental health problems, the majority of sufferers suffer in silence. Some have argued that the two correlate because of the mindset of sufferers, but this claim does not change the fact that 9 out of 10 people with a mental health problem say they feel a degree of social stigma and discrimination against their disability (and yes, that description is appropriate; a damaged mind is surely just as debilitating as a damaged body, if not more so), and this prevents them from opening up to their friends about their suffering.

The reason for this is an all too human one: we humans rely heavily, perhaps more so than any other species, on our sense of sight to build our mental picture of the world around us, from the obviously there to the unsaid subtext. We are therefore easily able to identify with and relate to physical injuries and obvious behaviours that suggest something is ‘broken’ with another person’s body; that they are injured or disabled is clear to us. A mental problem, however, is confined to the unseen recesses of the brain, hidden away from the physical world and hard for us to recognise as a problem. We may see people acting down a lot, hanging their heads and giving other hints through their body language that something’s up, but everybody looks that way from time to time, and it is generally considered a regrettable but normal part of being human. If we see someone acting like that every day, our sympathy for what we perceive as a short-term issue may often turn into annoyance that they aren’t resolving it, creating a sense that they are in the wrong for being so unhappy the whole time and not taking a positive outlook on life.

Then we must also consider the fact that mental health problems tend to place a lot of emphasis on the self, rather than on one’s surroundings. With a physical disability, such as a broken leg, the source of our problems, and of our worry, is centred on the physical world around us: how can I get up that flight of stairs, will I be able to keep up with everyone, what if I slip or get knocked over, and so on. However, when one suffers from depression, anxiety or whatever else, the source of the worry is generally to do with our own personal failings or problems, and less to do with the world around us. We might continually beat ourselves up over the most microscopic of failings and tell ourselves that we’re not good enough, or be filled with an overbearing, unidentifiable sense of dread that we can only identify as emanating from within ourselves. Thus, when suffering from mental issues we tend to focus our attention inwards, creating a barrier between our suffering and the outside world, and making it hard to break through that wall and let others know what we’re going through.

All this creates an environment in which mental health is a subject not to be broached in general conversation, one that just doesn’t get talked about; not so much because it is taboo as because of a sense that it doesn’t fit into the real world very well. This is a problem even in counselling, an environment specifically designed to address such issues, as people are naturally reluctant to let it all out or even to ‘give in’ and admit there is something wrong. Many people who take a break from counselling, myself included, confident that we’ve come a long way towards solving our various issues, are for this reason resistant to the idea of going back if things take a turn for the worse again.

And it’s not as simple as making people go to counselling either, because quite frequently that’s not the answer. Some people go to the wrong place and find their counsellor is not good at relating to and helping them; others may need medication or the like, rather than words, to get them through the worst times; and for others counselling just plain doesn’t work. But none of this detracts from the fact that no mental health condition in any person, however serious, is so bad as to be untreatable, and the best treatment I’ve ever found for my depression has been those moments when people are just nice to me, and make me feel like I belong.

This, then, is the two-part message of today, of World Mental Health Day, and of every day and every person across the world: if you have a mental health problem, talk. Get it out there, let people know. Tell your friends, tell your family, find a therapist and tell them, but break the walls of your own mental imprisonment and let the message out. This is not something that should be forever bottled up inside us.

And for the rest of you, those who do not suffer or are not suffering at the moment, your task is perhaps even more important: be there. Be prepared to hear that someone has a mental health problem, be ready to offer them support, a shoulder to lean on, but most importantly, just be a nice human being. Share a little love wherever and with whoever you can, and help to make the world a better place for every silent sufferer out there.

What we know and what we understand are two very different things…

If the whole Y2K debacle over a decade ago taught us anything, it was that the vast majority of the population did not understand the little plastic boxes known as computers that were rapidly filling up their homes. There’s nothing especially wrong or unusual about this; there are lots of things that only a few nerds understand properly, an awful lot of other stuff in our lives to understand, and in any case the personal computer had only just started to become commonplace. However, more than twelve and a half years later, the general understanding of a lot of us does not appear to have increased to any significant degree, and we remain largely ignorant of these little feats of electronic witchcraft. Oh sure, we can work and operate them (most of us, anyway), and we know roughly what they do, but as to exactly how they operate, precisely how they carry out their tasks? Sorry, not a clue.

This is largely understandable, particularly given what ‘understand’ means in a computing context. Computers are a rare example of a complex system of which an expert can genuinely understand, in minute detail, every single aspect: what each part does, why it is there, and why it is (or, in some cases, shouldn’t be) constructed to that particular specification. To understand a computer in its entirety is therefore an equally complex job, and this is one very good reason why computer nerds tend to be a rather solitary bunch, with few links to the rest of us and, indeed, the outside world at large.

One person who does not understand computers very well is me, despite the fact that I have been using them, in one form or another, for as long as I can comfortably remember. Over this summer, however, I had quite a lot of free time on my hands, and part of that time was spent finally relenting to the badgering of a friend and having a go with Linux (Ubuntu, if you really want to know) for the first time. Since I like to do my background research before getting stuck into any project, this necessitated quite a lot of research into the hows and whys of its installation, and along with that came quite a lot of info about the workings and practicalities of my computer generally. I thought, then, that I might spend the next couple of posts or so detailing some of what I learned, building up a picture of a computer’s functioning from the ground up, and starting with a bit of a history lesson…

‘Computer’ was originally a job title, the job itself being akin to accountancy without the imagination. A computer was a number-cruncher, a supposedly infallible data-processing machine employed to perform a range of jobs ranging from astronomical prediction to calculating interest. The job was a fairly good one, anyone clever enough to land it probably doing well by the standards of his age, but the output wasn’t always. The human brain is not built for infallibility and, not infrequently, would make mistakes. Most of these undoubtedly went unnoticed or at least rarely caused significant harm, but the system was nonetheless inefficient. Abacuses, log tables and slide rules all aided arithmetic manipulation to a great degree in their respective fields, but true infallibility was unachievable whilst still reliant on the human mind.

Enter Blaise Pascal, 17th century mathematician and pioneer of probability theory (among other things), who invented the mechanical calculator aged just 19, in 1642. His original design wasn’t much more than a counting machine, a sequence of cogs and wheels so constructed as to be able to count and carry between units, tens, hundreds and so on (i.e. a turn of four spaces on the ‘units’ cog whilst a seven was already counted would bring up eleven), as well as being able to work with currency denominations and distances. It could also subtract, multiply and divide (with some difficulty), and moreover proved an important point: that a mechanical machine could cut out the human error factor and reduce any inaccuracy to simply entering the wrong number.
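
That cascading carry between wheels is easy enough to mimic in software; here’s a minimal sketch (my own toy model, not a description of Pascal’s actual gearing) of digit wheels that nudge their neighbour whenever they pass nine:

```python
# A toy model of Pascal's cascading carry: each 'wheel' holds a digit 0-9,
# and turning the units wheel past nine nudges the tens wheel, and so on.
# This illustrates the principle only, not Pascal's actual mechanism.

def turn(wheels, spaces):
    """Advance the units wheel by `spaces`, propagating carries leftwards.
    `wheels` is a list of digits, least significant first."""
    carry = spaces
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10    # the digit this wheel now shows
        carry = total // 10       # complete revolutions passed on to the next wheel
    return wheels

# The example from the text: a seven already counted, then a turn of four spaces.
print(turn([7, 0, 0], 4))  # -> [1, 1, 0], read least-significant-first: eleven
```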

Pascal’s machine was both expensive and complicated, meaning only twenty were ever made, but his was the only working mechanical calculator of the 17th century. Several, of a range of designs, were built during the 18th century as showpieces, but in the 19th the release of Thomas de Colmar’s Arithmometer, after 30 years of development, signified the birth of an industry. It wasn’t a large one, since the machines were still expensive and only of limited use, but de Colmar’s machine was the simplest and most reliable model yet. Around 3,000 mechanical calculators, of various designs and manufacturers, had been sold by 1890, but by then the field had been given an unexpected shake-up.

Just two years after de Colmar had first patented his pre-development Arithmometer, an Englishman by the name of Charles Babbage showed an interesting-looking pile of brass to a few friends and associates: a small assembly of cogs and wheels that he said was merely a precursor to the design of a far larger machine, his difference engine. The mathematical workings of his design were based on Newton polynomials and the method of finite differences, a fiddly bit of maths that I won’t even pretend to fully understand, but one that could be used to closely approximate logarithmic and trigonometric functions. However, what made the difference engine special was that the original setup of the device, the positions of the various columns and so forth, determined what function the machine performed. This was more than just a simple device for adding up; this was beginning to look like a programmable computer.
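
The core trick the engine mechanised, as far as I understand it, is that once you know the first few values of a polynomial, every further value can be generated by nothing more than repeated addition of ‘difference’ columns, with no multiplication required. Here’s a rough sketch of the method (a toy illustration of the maths, not a model of Babbage’s actual columns):

```python
# Tabulating a polynomial by repeated addition alone, the trick the
# difference engine mechanised: for a degree-n polynomial, the n-th
# differences are constant, so each new value is just a cascade of additions.

def tabulate(seed_values, steps):
    """seed_values: the first n+1 values of a degree-n polynomial at
    consecutive integer points. Returns the next `steps` values using
    only addition of difference columns."""
    # Build the difference columns from the seed values.
    columns = [list(seed_values)]
    while len(columns[-1]) > 1:
        prev = columns[-1]
        columns.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
    # Keep the last entry of each column; the bottom one stays constant.
    state = [col[-1] for col in columns]
    results = []
    for _ in range(steps):
        # Add each column into the one above it; the top entry is the new value.
        for i in range(len(state) - 2, -1, -1):
            state[i] += state[i + 1]
        results.append(state[0])
    return results

# x^2 at x = 0, 1, 2 gives 0, 1, 4; the next squares come out by addition alone.
print(tabulate([0, 1, 4], 5))  # -> [9, 16, 25, 36, 49]
```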

Babbage’s machine was not the all-conquering revolutionary design the hype about it might have you believe. Babbage was commissioned to build one by the British government for military purposes, but since Babbage was often brash (once claiming that he could not fathom the idiocy of the mind that would think up a question an MP had just asked him) and prized academia above fiscal matters and practicality, the project fell through. After investing £17,000 in his machine, the government realised that he had switched to working on a new and improved design, known as the analytical engine, pulled the plug, and the machine never got made. Neither did the analytical engine, which is a crying shame; this was the first true computer design, with separate inputs for data and the required program (which could be a lot more complicated than just adding or subtracting) and an integrated memory system. It could even print results on one of three printers, in what could be considered the first human interfacing system (akin to a modern-day monitor), and had ‘control flow systems’ incorporated to ensure that the steps of a program were carried out in the correct order. We may never know whether Babbage’s analytical engine would have worked, since it has never been built, but a later model of his difference engine was constructed for the London Science Museum in 1991, yielding results accurate to 31 digits.

…and I appear to have run on a bit further than intended. No matter; my next post will continue this journey through the history of the computer, and we’ll see if I can get onto some actual explanation of how the things work.