Blubber

Fat is a much-maligned substance in the twenty-first century world we find ourselves in; exhortations for it to be burnt off or excised from one’s diet abound from all sides, and indeed entire industries are now founded on dealing with the unwanted stuff in one form or another. However, fat is not, in fact, some demonic hate figure designed specifically to kill all that is good and beautiful about our world, and since it is at least relatively interesting I thought it might be worth investigating a few bits and pieces surrounding it over the course of a post.

All fats are based upon a molecule called glycerol, or propane-1,2,3-triol to give it its technical IUPAC name. Glycerol is a very interesting substance used for a wide range of purposes both in the body and commercially; it can be broken down to form sugar, can be used as a laxative, is an effective antifreeze, a useful solvent, a sweetener, is a key ingredient in the production of dynamite and, of course, can be used to store energy in fatty form. Glycerol is, technically speaking, an alcohol, but unlike most everyday alcohols (such as the ethanol upon which many of our favourite drinks are based) each glycerol molecule contains not one but three alcohol functional groups. In a fat, these alcohol groups act like sticking points, allowing three long-chain carboxylic acid molecules known as ‘fatty acids’ to attach to each glycerol molecule. For this reason, fats are also known as ‘triglycerides’, and precisely which fat is formed depends on the structure of those fatty acids.
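
For the chemically inclined, the joining process is a standard esterification, sketched here in simplified form (R stands in for a generic fatty acid chain; each of the three joins releases a molecule of water):

$$\text{C}_3\text{H}_5(\text{OH})_3 + 3\,\text{RCOOH} \longrightarrow \text{C}_3\text{H}_5(\text{OOC-R})_3 + 3\,\text{H}_2\text{O}$$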

Fatty acids consisting of shorter chains of carbon atoms have fewer atoms with which to interact with their surroundings, and thus the intermolecular forces between the fatty acid chains and other molecules are weaker for shorter-chain acids. This has a number of effects on the properties of the final product, but one of the most obvious concerns its melting point; shorter-chain fatty acids generally result in a product that is liquid at room temperature, and such products are designated as ‘oils’ rather than fats (compare four-carbon butyric acid, a liquid at room temperature, with eighteen-carbon stearic acid, which doesn’t melt until around 70ºC). Thus, not all triglycerides are, technically speaking, fats, and even triglycerides themselves are part of a larger chemical family of fat-like substances known as ‘lipids’ (organic chemistry can be confusing). As a general rule, plants tend to produce oils and animals produce fats (presumably for reasons of storage), which is why you get stuff like duck fat and olive oil rather than the reverse.

The structure of the fatty acids also lies behind a major dietary consideration surrounding fats: whether they are saturated or unsaturated. In chemistry, carbon atoms are bonded to one another by covalent bonds, consisting of a shared pair of electrons (each atom providing one electron of the pair) that keeps the two atoms bonded together. Most of the time, only one pair of electrons forms the bond (known as a single bond), but sometimes the relevant carbon atoms have a surfeit of electrons and will create another shared pair, forming a double covalent bond. The nature of double bonds means that the carbon atoms involved can accept more hydrogen atoms (or other electrophiles such as bromine; bromine water is a good test for double bonds), whereas a molecule made up entirely of singly-bonded atoms couldn’t accept any more and would be said to be saturated with hydrogen. Thus, molecules (including fats and fatty acids) with only single bonds are described as saturated, whilst those with double bonds are known as unsaturated*. Between them, the food industry and the chemical fraternity have developed a whole host of more specific descriptive terms that give you more detail as to the chemical structure of your fats (stuff like monounsaturated and such), and have also subdivided unsaturated fats into two more categories, cis- and trans-fats (the names refer to the molecules’ arrangement in space about the double bond, not their gender orientation).
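
That bromine water test, incidentally, relies on precisely this addition chemistry: the bromine adds across the double bond and the orange colour of the solution disappears. A simplified sketch of the reaction at one double bond (R and R’ standing for the rest of the chain):

$$\text{R-CH=CH-R}' + \text{Br}_2 \longrightarrow \text{R-CHBr-CHBr-R}'$$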

With all these different labels, it’s no wonder people have so much trouble remembering, much less identifying, which fats they are ‘supposed to avoid’. Saturated and trans-unsaturated fats (which occur rarely in nature due to enzyme structure and are usually manufactured artificially) are apparently bad, mono-unsaturated (cis-) fats are good, and poly-unsaturated (cis-) fats are good in moderation.

The extent to which these fats are ‘good’ and ‘healthy’ does not refer to the effect they will have on your waistline; all fats you eat are first broken down by your digestive process, and the resulting calories are then either used to power your body or turned into other sorts of fat that take up belly space. This process is the same for all types of energy-containing food, and I shall come onto a few details about it in a paragraph or two. No, the relative health risk of these different fat types refers instead to the production of another type of lipid: cholesterol, which has such a complex, confusing structure and synthesis that I’m not even going to try to describe it. Cholesterol is a substance produced intentionally by the body and is very useful; it is used in the production of all sorts of hormones and vitamins, is a key ingredient of bile and helps cells rebuild themselves. It is transported through the body by two different substances known as LDL (low-density lipoprotein) and HDL (take a wild guess) that carry it via the bloodstream; and this is where problems arise. The precise mechanism behind it is not known, but an increased consumption of trans-fats and other ‘bad’ triglycerides leads to an increase in the amount of cholesterol and LDL in the bloodstream. If this stuff is allowed to build up, cholesterol can start to ‘stick’ to the sides of one’s blood vessels, slowly reducing the effective size of the blood vessel until it is almost completely shut. This greatly reduces the flow of blood through these vessels, and that can have particularly dramatic consequences if the large, important blood vessels close to or supplying the heart are affected, leading to coronary heart disease and a greatly increased risk of heart attacks. HDL, for some reason, doesn’t appear to contribute to this effect, which has led to HDL being dubbed (misleadingly, since it’s not actually cholesterol) ‘good cholesterol’ and LDL ‘bad cholesterol’.

Clearly, then, having too much of these ‘bad fats’ can have some pretty serious consequences, but public realisation of this has led all fat to be considered a disgusting thing to be shunned. Frankly, this is just plain old not true, and it is far easier to live a healthy life with a bit of meat** on the bones than to go down the super-skinny route. Fat is a vital body tissue, required for insulation, vitamin transport, energy storage and disease prevention, and it provides many essential nutrients; omega-3, the ‘essential fatty acid’ (meaning the body cannot produce it itself) found in fish that is thought to play a role in brain development and other bodily functions, is nothing more than an unusual fatty acid.

If you want further evidence as to the importance fat plays in one’s body, I refer you to a condition known as lipodystrophy, in which one’s body cannot produce or store fat properly. In some cases this is localised and relatively harmless, but in incredibly rare cases it manifests itself as a hereditary condition that causes abnormal bone and muscle growth and facial disfigurement, and requires an incredibly strict diet (in direct contravention of the massive appetite the condition gives you) in order to control one’s cholesterol levels and carbohydrate intake. In many cases, sufferers of this horrible condition will not live past twenty, if they even get that far.

*Vegetable oils tend to be more frequently unsaturated than animal fats, as this is another factor that reduces their melting point and keeps them liquid. A key process involved in producing margarine involves taking these vegetable oils and adding hydrogen to these double bonds, a process known as hydrogenation, in order to raise their melting point and make the margarine solid and spreadable. Chemistry!
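
In simplified form (R and R’ again standing for generic chains; industry typically uses a nickel catalyst), each hydrogenation step looks like:

$$\text{R-CH=CH-R}' + \text{H}_2 \xrightarrow{\text{Ni catalyst}} \text{R-CH}_2\text{-CH}_2\text{-R}'$$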

**Although, as anyone who likes their bacon skinny will tell you, fat is most certainly not meat. In fact, it’s not even alive.

Fish

‘Fish’ is one of my favourite words. Having only a single syllable means it can be dropped into conversation without a second thought, thus enabling one to cause maximum confusion with minimal time spent considering one’s move, which often rather spoils the moment. The very… forward nature of the word also suits this function- the sheer bluntness of it, its definitive end and beginning with little in the way of middle to prove distracting, almost forces it to take centre stage in any statement, whether alone or accompanied by other words, demanding it be said loud and proud without a trace of fear or embarrassment. It also helps that the word is very rarely an appropriate response to anything, enhancing its inherent weirdness.

Ahem. Sorry about that.

However, fish themselves are very interesting in their own right; and yes, I am about to attempt an overall summary of one of the largest groups in the animal kingdom in less than 1000 words. For one thing, every single vertebrate on the planet is descended from them; in 1999 a fossil less than 3cm long and 524 million years old was discovered in China with a single ‘stick’ of rigid material, probably cartilage, running down the length of its body. It may be the only example ever discovered of Myllokunmingia fengjiaoa (awesome name), but that tiny little fossil has proved to be among the most significant ever found. Though it cannot be proven, that little bit of cartilage is thought to be the first ever backbone, making Myllokunmingia the world’s first fish and the direct ancestor of everything from you to the pigeon outside your window. It’s quite a humbling thought.

The incredible age of fish as a group, which in turn means there are very few specimens of early fish, has meant that fish aren’t even really a single group in evolutionary terms; the three different classes of fish (bony, cartilaginous and jawless, representing the likes of cod, sharks and hagfish respectively- a fourth class of armoured fish died out some 360 million years ago) all split into separate entities long before any other group of vertebrates began to evolve, and all modern land-based vertebrates (tetrapods, meaning four-limbed) are direct descendants of the bony fish, the most successful of the three groups. This has two interesting side-effects; firstly that a salmon is more closely related to you than to a shark, and secondly (for precisely this reason) that some argue there is no such thing as a fish. The term ‘fish’ was introduced as a catch-all term for everything whose lack of weight-bearing limbs confines it to the water, before evolutionary biology had really got going, and technically the likes of sharks and lampreys should each get a name to themselves- but it appears we’re stuck with fish, so any grumpy biologists are just going to have to suck it.

The reason for this early designation of fish in our language is almost certainly culinary in origin, for this is the main reason we ever came, and indeed continue to come, into contact with them at all. Fish have been an available, nutritious and relatively simple-to-catch food source for humans for many a millennium, but a mixture of their somewhat limited size, the fact that they couldn’t easily be farmed and the fact that bacon tastes damn good meant they are considered by some, particularly in the west (fish has always enjoyed far greater popularity in far eastern cultures), to be the poor cousins of ‘proper meat’ like pork or beef. Indeed, many vegetarians (including me; it’s how I was brought up) will eschew meat but quite happily eat fish in large quantities, usually using the logic that fish are so damn stupid they’re almost vegetables anyway. Vegetarians were not, however, the main reason for fish’s survival as a common food for everyone in Europe, including those living far inland- for that we can thank the Church. Somewhere in the dim and distant past, the Catholic Church decreed that one should not eat meat on Fridays and other fast days- but that fish was permitted. This kept fish a common dish throughout Europe, as well as encouraging the rampant rule-bending that always accompanies any inconvenient law; beavers were hunted almost to extinction in Europe after being classed as fish under this rule. It was also this ruling that led to lamprey (a type of jawless fish that looks like a cross between a sea snake and a leech) becoming a delicacy among the crowned heads of Europe, and Henry I of England (third son of William the Conqueror, in case you wanted to know) is reported to have died from eating too many of the things.

The feature most characteristic of fish is, of course, gills, even though not all fish have them and many other aquatic species do (albeit less obviously). To many, how gills work is an absolute mystery, but then again how many of you can say, when it comes right down to the science of the gas exchange process, how your lungs work? In both systems, the basic principle is the same; the very thin blood vessels within the structure concerned are small and permeable enough to allow gas molecules to move across the blood vessel’s wall, allowing carbon dioxide built up from moving and generally being alive to move out of the bloodstream and fresh oxygen to move in. The only real difference concerns structure; the lungs consist of a complex, intertwining labyrinth of air spaces of various sizes with blood vessels spread over their surface, designed to filter oxygen from the air, whilst gills basically string the blood vessels up along a series of sticks and hold them in the path of flowing water to absorb the oxygen dissolved within it- gills are usually located such that water flows through the mouth and out via the gills as the fish swims forward. In order to ensure a constant supply of oxygen-rich water is flowing over the gills, most fish must keep swimming constantly or else the water beside their gills would begin to stagnate- but some species, such as nurse sharks, are able to pump water over their gills manually, allowing them to lie still and do… sharky things. Interestingly, the reason gills won’t work on land isn’t simply that they aren’t designed to filter oxygen from the air; a major contributory factor is the fact that, without the surrounding water to support them, the structure of the gills is prone to collapse, causing parts of it to cease functioning as a gas exchange mechanism.
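
For those who like their biology reduced to a formula, both lungs and gills are shaped by the same diffusion relationship (a simplified statement of Fick’s law):

$$\text{rate of gas exchange} \propto \frac{\text{surface area} \times \text{concentration difference}}{\text{thickness of the exchange surface}}$$

Lungs maximise the surface area term with millions of tiny air sacs; gills do it with stacks of thin filaments, and rely on the flow of water over them to keep the concentration difference topped up.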

Well, that was a nice ramble. What’s up next time, I wonder…

Hope and Obama

Before I start writing this post, a brief disclaimer; I am not American, do not live there and do not have extensive first-hand experience of the political situation over there. This post is inspired entirely by stuff I’ve seen other people talk about online, plus a few bits of joining the dots from me, so if anyone feels I’ve gone wildly off-target please drop me a line in the comments. OK? Good, let’s get started.

The ascendancy of Barack Hussein Obama to the Presidency of the USA in 2009 was among the most significant events in recent history. Not only did he become the first black person to sit in the Oval Office, he put the Democrats back in power (representing a fairly major shift in direction for the country after eight years under George Bush Jnr.) and managed to put his party in control of Congress too, the first time any Democrat leader had been in that position for quite some time. With bold claims regarding the wars in Afghanistan and Iraq, both of which had been… talking points during Bush’s time in charge, and big plans regarding the US healthcare system, this had all the hallmarks of a presidency dedicated to making change happen. Indeed, change was the key buzzword during Obama’s campaign; change for the punishing effects of US society on its young and poor, change for the recession-hit economy, and even change for the type of person in the White House (Bush had frequently been portrayed, rather unjustly for a man of notoriously quick wit, as stupid and socially incapable by satirists and left-leaning commentators, whilst even the right would find it hard to deny Obama’s natural charisma and intelligent, upright bearing) were all promised to voters, and it was a dream many took with them to the polling stations.

One of the key demographics the Democrats targeted, and benefited from, with this ‘pro-change’ style of campaign was the youth vote; early twenty-somethings or even late teens, many of whom were voting in their first elections, who had grown up both physically and politically during the Bush administration and railed against his management of everything from the economy to the welfare system with all the ardour and uncluttered train of thought of young people everywhere. I should know: living through the period as a young person in a left-leaning family getting my news via the liberally-inclined BBC (and watching too much satirical comedy), one could hardly escape the idea that Bush was an absolute moron who knew nothing about running his country. And this was whilst getting daily first-hand experience of what a left-wing government was like in Britain- I can imagine that to a young American with a similar outlook and position at the time, surrounded by right-leaning sentiment on all sides, the prospect of a Democratic president dedicated to change would have seemed like a shining beacon of hope for a brighter future. Indeed, the apparent importance of the youth vote to Obama’s success was illustrated during his 2012 re-election: when the news broke that Microsoft were planning on releasing a new Halo videogame on election day, conspiracy theorists had a wonderful time suggesting that Microsoft were embroiled in a great Republican plot to distract the youth vote by having them play Halo all day instead, thus meaning they couldn’t vote Democrat*.

Now, let us fast forward to the 2012 election. Obama won, but narrowly- and given he was up against a candidate whose comments- that he ‘didn’t care about the very poor’, and that the windows in passenger aircraft should be able to be opened- were very widely circulated and mocked, the result was far too close for comfort (even if, despite what some pundits and conservative commentators would have had you believe, all the pre-election statistics indicated a fairly safe Democrat victory). Whilst the airwaves weren’t exactly awash with anti-Obama messages, it wasn’t hard to find disillusionment and cynicism regarding his first term in office. For me, the whole thing was summed up by the attitudes of Jeph Jacques, the cartoonist behind the webcomic ‘Questionable Content’; reading through his back catalogue, you can see he frequently had to restrain himself from verbalising his Obama-fandom in the comments below his comics during the 2008 election, but come election season in 2012 he chose to publish this. That comic pretty much sums it up: a whole generation had been promised change, and change had refused to come on a sufficiently large scale. The youthful optimism of his rise to power was replaced by something more akin to the weariness Obama himself displayed during the first live TV debate, and whilst I’m sure many of these somewhat disillusioned voters still voted Democrat (I mean, he still won, and preliminary statistics suggest voter turnout actually rose in 2012 compared to 2008), the prevailing mood seemed to be one less of optimism than of ‘better him than Romney’.

Exactly what was to blame for the lack of the promised change is a matter of debate; apologists may point to the difficulty of getting such radical (by American standards) health reforms and similar through a decidedly moderate Congress, followed by the difficulty of getting anything at all through once Congress became Republican-controlled, whilst the more cynical or pro-Republican would probably make some statement referring to the corporate-sponsored nature of the Democratic party/American political system, or suggest that President Obama simply isn’t quite as good a politician/person (depending on the extent of your cynicism) as he came across as in 2008. Whatever the answer, the practical upshot has been quite interesting, as it has allowed one to watch as an entire generation discovered cynicism for the first time. All those hopes and dreams of some brave new vision for America went steaming face-first into the bitter reality of the world and of politics, and the dream slowly fell apart. I am not old enough to definitively say that this is a pattern that has repeated itself down the ages, but nonetheless I found the whole escapade fascinating in a semi-morbid way, and I will be intrigued to see if/when it happens again.

Damn, I’m really going for conclusion-less posts at the moment…

*Interestingly, this kind of tactic has, so the story goes, been deliberately used in the past to achieve precisely the opposite effect. When Boris Yeltsin attempted to get re-elected as Russian president in 1996, voting day was designated a public holiday. Unfortunately, it was soon realised that many urban Russians, Yeltsin’s main voter base, were going to take this as a cue for a long weekend in the country (presumably hunting bears or whatever else Russians do in their second home in Siberia) rather than to go and vote, so Yeltsin went to the makers of a telenovela (a kind of South American soap opera) called Tropikanka that was massively popular in the country and got them to make three brand-new episodes to be aired on election day. This kept the city dwellers at home, since many country spots didn’t have TV access, and meant they were around to go and vote. Yeltsin duly won, with 54.4% of the vote.

Keeping it Cool

We humans are unique in so many ways, but perhaps our mastery of the systems used in getting food into our mouths is the most remarkable. From our humble hunter-gatherer beginnings, in which we behaved much like any other animal, we have discovered agriculture, domesticated animals, learned to harvest milk and eggs and are nowadays even capable of growing a steak from just a few cells (we’ll temporarily gloss over the cost and taste of the finished product). However, arguably just as important as these advancements has been our ability to store food, allowing us to survive the harshest of winters and conditions in numbers few other animals could hope to match.

Our methods of food storage have varied widely over the years; beyond the simple tactic of ‘keep your food somewhere basically dry and clean’, in the last few decades we’ve moved on from our old favourites to explore solutions as varied as chemical preservatives and freeze-drying. However, today I wish to explore an older, yet arguably far more interesting, technique that remains our favourite method of home food preservation: refrigeration.

Refrigeration, or the highly technical art of ‘making food colder so bad things can’t survive’, is an ancient idea; ice houses have been found in Iran dating from 1700BC, and were in use in both China and the Roman Empire throughout both cultures’ long histories. Since making their own ice was impossible using the technology of the time, these ancient civilisations simply moved existing ice to a designated place where it would be useful and came up with ingenious ways to make sure it stayed cold throughout the long summers; these great buildings would have immensely thick walls and were packed with straw or sawdust to prevent the air circulating, thus helping to maintain their temperature. Thanks to their thick walls, ice houses were necessarily vast structures, acting rather like communal refrigerators for a local lord and his community and capable of holding up to thirty thousand tons of food.
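
The physics the ice house builders were exploiting, even if they didn’t know it, boils down to the standard conduction relationship

$$\frac{Q}{t} = \frac{k A \,\Delta T}{d}$$

where the rate of heat flow Q/t into the store depends on the conductivity k of the wall material, the wall area A, the temperature difference ΔT and the wall thickness d. Immensely thick walls make d large, and straw, sawdust and trapped still air all have a very low k, so the summer heat leaked in extremely slowly.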

In other countries, where snow and ice were harder to reliably come by (even in winter), refrigeration didn’t really catch on and people stuck with salting their food. However, because this a) made a lot of food taste disgusting and b) meant you still had to drink warm beer, by the seventeenth century it became relatively common for the rich across Europe to import ice (at vast expense) to their own personal ice houses, allowing them to serve fancy drinks at parties and the like and enjoy an unsalted pork roast in February. Ice was a symbol of luxury and status, which is presumably one of the reasons why ice sculptures are even today considered the pinnacle of class and fine living (that and the fact that they’re really, really cool). During the Georgian and Victorian eras, it was common practice for families going out for a day’s jolly (particularly in the colonies) to take an ice box of food with them, and there were even ice shops where the rich would go to buy high-quality, exceptionally clear ice for whatever party they happened to be hosting- but, by the end of the century, that business would be long bust.

Y’see, in 1805 a man named Oliver Evans, who would later become known as ‘the father of refrigeration’, designed a device called the vapour-compression refrigeration machine. This is, basically, a series of tubes containing a stable coolant; the coolant is first compressed and then condenses (causing it to lose the heat it’s picked up- this is the vapour-compression bit), before going back inside, expanding and evaporating again thanks to a mixture of a pressure change and temperature change, thus allowing it to pick up heat. This rather convoluted evaporation/condensation procedure (first investigated by Benjamin Franklin and a chemistry professor called John Hadley half a century earlier) wasn’t actually the preferred solution for a few decades, since the earliest devices built were ‘compression-compression’ systems that used air as a coolant and were thus only able to change its pressure rather than get it to liquefy. Regardless, it was soon realised that the vapour-compression system allows a device to control the transfer of heat from inside to outside far more efficiently, and it is now pretty much universally used in modern-day ‘heat pumps’ of all sorts. Incidentally, heat pumps are among the most efficient systems ever devised for heating/cooling a space, and nowadays they are increasingly used (in the opposite direction, of course) to heat houses, as they use far less energy than conventional methods of heating.
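
To get a feel for just how efficient, consider the thermodynamic upper limit on a heat pump’s performance (the Carnot limit). Here’s a minimal sketch in Python- the numbers are illustrative, and real heat pumps typically manage a coefficient of performance of around 3-4 rather than the theoretical ceiling:

```python
# Idealised (Carnot) coefficient of performance for a heat pump in
# heating mode: units of heat delivered per unit of work put in.
# COP = T_hot / (T_hot - T_cold), with temperatures in kelvin.

def carnot_cop_heating(t_inside_c: float, t_outside_c: float) -> float:
    """Theoretical upper bound on heat delivered per unit of work."""
    t_hot = t_inside_c + 273.15    # convert to kelvin
    t_cold = t_outside_c + 273.15
    return t_hot / (t_hot - t_cold)

# Keeping a house at 20 degrees C using outside air at 5 degrees C:
print(round(carnot_cop_heating(20.0, 5.0), 1))  # ~19.5 in theory
```

Even at a realistic figure of 3, that’s three units of heat moved into the house for every one unit of electricity consumed- which is why heat pumps beat simply turning the same electricity into heat in a resistive heater.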

But anyway; back to fridges. Evans never actually built a prototype of his design, but it was picked up on and revised several times over the next seventy-odd years until the design was sufficiently advanced to be used in commercial ice makers, putting the old ice ‘manufacturers’ (who simply got their ice out of a convenient mountain lake or glacier) out of business, and by the early 20th century the devices had got so good that they were able to liquefy air.

Surprisingly, it wasn’t until after this point that the modern science of refrigeration began to make it into our homes. It took until 1913 for a patent to be issued for a domestic refrigerator, and even that was just a way of keeping an existing ice box cool; it didn’t actually cool the interior of the fridge down. However, the following year the USA got the awesomely-named Kelvinator refrigerator, the first properly practical domestic fridge, which held some 80% of the market by 1923. During the economic boom of the 1920s, fridges were among the many devices whose popularity exploded, and they gradually became bigger, sleeker, more practical and more efficient in the process. By the 1930s they’d even managed to find a coolant that wasn’t highly corrosive or toxic, which all seemed terribly fantastic in the days before most people knew what ‘CFCs’ and ‘the ozone layer’ were. By 1940 the idea of attaching a freezer (at a sub-zero temperature) to one’s fridge (which usually operates at about 3ºC) had become commonplace, and since then most of the advancements in the field of domestic refrigeration have been limited to making fridges bigger, easier to clean (particularly with the introduction of injection-moulded plastic components), more energy-efficient and more of a middle-class fashion statement.

However, this does not mean that the science of refrigeration is slowing down: recently, a British company called Reaction Engines Ltd. demonstrated a prototype of their air-breathing rocket engine, whose key feature is a revolutionary new type of heat exchanger. Despite a design utilising pretty much exactly the same science you’d find at the back of your fridge at home, this heat exchanger is capable of dropping the temperature of air from several hundred degrees to -150ºC in a hundredth of a second. That change in heat energy represents roughly the power output of a medium-sized power station, from a device that weighs significantly less than a hatchback. I would love to explain all the mechanics of this technology to you, but right now I wish for little more than to sit back and marvel.
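
For the sceptical, that power station claim can at least be sanity-checked: the heat being moved is just mass flow × specific heat × temperature drop. A quick back-of-the-envelope sketch in Python- note that the mass flow figure below is my own illustrative guess, not a number from Reaction Engines:

```python
# Rough estimate of the heat power a precooler must handle:
# Q = m_dot * c_p * delta_T

C_P_AIR = 1005.0  # specific heat capacity of air in J/(kg*K), approximately

def heat_power_megawatts(mass_flow_kg_per_s: float,
                         t_in_celsius: float,
                         t_out_celsius: float) -> float:
    """Heat removed from the airflow per second, in megawatts."""
    delta_t = t_in_celsius - t_out_celsius  # temperature drop in kelvin
    return mass_flow_kg_per_s * C_P_AIR * delta_t / 1e6

# A guessed few hundred kg of air per second, cooled from ~1000 C to -150 C:
print(round(heat_power_megawatts(350.0, 1000.0, -150.0)))  # ~400 MW
```

A few hundred megawatts is indeed in the territory of a medium-sized power station’s output, which makes the claim rather less outlandish than it first sounds.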

The Most Contentious Patch of Land In Human History

“The situation in Palestine” has become something of a cliché: the definitive example of the terribly serious topic discussed at dinner parties by middle-class men with glasses and a humanities degree. It also happens to be about the single most politically delicate and contentious issue in the world today, and indeed concerns a patch of earth over which arguably more blood has been spilt and more destruction wrought than any other. Palestine’s is a long and bloody history, but it is a story often presumed rather than explained in full: so here is my effort to explain, in about as much fullness as a blog post will allow, what ‘the situation in Palestine’ actually is.

Palestine is an old geographical term that originally referred to a Roman province in and around what is now the country of Israel (although that statement is contentious enough on its own, for reasons that will become clear later). However, included within its borders is the city of Jerusalem and many of the holiest sites of Christianity, Islam and Judaism, and having three conflicting and very… forceful ideologies trying to share the same space was just never going to work. When Islam began to realise the potential of several hundred zealots and a lot of swords put together, the Holy Land (which included Palestine) came under Islamic rule and, as my previous posts on the Crusades explained, two centuries of throwing the military might of Christendom against it failed to make any long-term difference. In time, Palestine came under the control of the mighty Ottoman Empire, which would dominate the Middle East right up until the end of the nineteenth century. However, prior to the First World War what was left of the Empire, by that time a relatively technologically backward state compared to the industrialised powers of western Europe, threw its lot in with the Central Powers (ie. the Germans), and during the war itself Palestine was invaded by the British. Post-war, the British were given a mandate to manage the region by the short-lived League of Nations as it attempted to organise the remnants of the Empire, and thus the territory effectively became part of the British Empire.

Prior to that, and with Muslims proving difficult opponents for Christianity to fight, successions of Christian rulers turned on a far easier target: Jews. Christian doctrine forbade lending money at interest, but it was such an economically useful practice that Jews were often able to make a good living out of providing the service to Christians. This meant the Jewish population was rich and sinful in Christian eyes, and combining that with their ethnic differences and the fact that they had no nation or military power of their own made them very, very easy for the Christian world to hate and persecute. During the Norman period (and probably for quite a while after), the main entertainment for residents of London appears to have been trashing the Jewish quarter every time a significant event of some sort occurred/they got bored on a Friday evening. People have come up with all sorts of reasons for why Hitler and his ilk had such a vehement hatred of Jewish people, but the simplest explanation is also the most likely: anti-Semitism was just very, very common at the time, and Hitler was just one Jew-hater of many.

However, it was actually prior to the Second World War that tensions in the region of Palestine began to intensify. The British had promised the Jewish population of the world a homeland in the area, perhaps as a retroactive apology for the years of persecution they’d suffered at the hands of the British and others, and hoped that the Jews and Arabs could live side-by-side with one another. This didn’t really work, mostly since the Muslim population in the area was (at the time) ten times that of the Jewish one, and tensions in the region escalated; there were three rebellions against British rule whilst they governed, partly in response to this Jewish repatriation policy. By the time the Second World War ended, the western world was justifiably shocked at the sheer scale of the genocide perpetrated by the Nazis, but a collective look back over their own history ended in cringes of guilt as they realised they had very little in the way of moral high ground. This guilt, combined with the very liberal, democratic and anti-imperialist sentiments gripping Britain at the time (its first majority Labour government had, after all, just been installed), led Britain and the new United Nations, successor to the League of Nations that had created the mandate in the first place, to push forward with the plan to give the Jews a homeland. In 1947, the UN decided that having the two groups living alongside each other was just asking for even more trouble than was already present, and proposed a new, partitioned state of Palestine. Palestine would be divided into one area governed by the Jews and three separate areas within the country’s borders that would be Muslim-controlled. Jerusalem was to be under the UN’s jurisdiction (this was back when this was something the UN would do) and would be a free city, available to everyone. Which all sounds great in theory, but the thought of giving up yet more of their land to the Jewish occupiers was the final straw for the Arabs. This new border lasted less than a week before war was in full swing.

The Arab Higher Committee rejected the UN’s partition proposal, and civil war erupted in the new country, mostly driven by disorganised groups of unofficial Arab soldiers and snipers (there was no organised Israeli army yet, and the politicians from other countries were still arguing in the UN). Thousands were killed, and thousands more left the country in search of pastures less violent (mostly Arabs, who at least had other homelands to go to). The British were supposed to be keeping order in the region during the transition phase, but were mainly interested in covering themselves whilst they evacuated as many troops as possible. By May 1948, the Jewish population in the region had got themselves sufficiently organised to declare the new, Jewish state of Israel over the entirety of Palestine, and the civil war segued into a more official conflict as the newly formed Israeli army began squaring up against the local Arab countries (mainly Jordan and Egypt). Backed by the USA (whose population have historically supported the state of Israel, partly for a seemingly bizarre reason concerning the Biblical prediction of Jesus’ second coming- I’m not even joking), the Jewish forces took control of much of the area originally allotted to the Palestinian Muslims (including most of Jerusalem) and left them only with the areas we now call the Gaza Strip and the West Bank. Since the Arabs wouldn’t accept having control over only part of the country they considered theirs, and did not recognise the state of Israel anyway, no official Muslim state of Palestine was declared (the Arabs believed the old one had never actually ended), which is why these different areas don’t show up separately on maps.

With the new Jewish state formed and many Arabs driven from their land (in total nearly one and a half million Arabs were displaced or left the area of their own volition as a result of the two-part war, a refugee crisis that has yet to fully resolve itself), a sizeable chunk of the Jewish population of the Arab world immigrated to Israel, with the consequence that over three quarters of the current population of Israel are Jewish. This did not help the smouldering tensions along the borders Israel shared with its Arab neighbours, and for nearly two decades open hostility and sporadic outbreaks of fighting were the norm. On June 5 1967, the Israelis (in the latest of what was becoming a long series of aggressive political manoeuvres) launched a pre-emptive strike against their key enemies of Syria, Egypt and Jordan, using their aircraft to annihilate the air forces of all three nations whilst they were still on the ground, in what became known as the Six Day War (some people wonder how they ever got away with this; these people forget that this was the Cold War, and you did not go telling the USA’s allies what they could or couldn’t do). With control of the air now theirs, Israeli ground troops took full control of the city of Jerusalem, drove back Arab attempts at a counter-attack, took the Golan Heights from Syria and the Sinai desert from Egypt, increased fivefold in size (now Israel also had control of the West Bank and Gaza Strip) and eventually destroyed around 80% of Egypt’s military capacity and killed tens of thousands of Arab troops. In six days. It was one of the bloodiest, and militarily most impressive, weeks in modern history.

Now the Arab world was doubly furious, but there was little they, in their weakened state, could do about it. Israel hoped this would draw the Arabs to the negotiating table in pursuit of peace and prosperity, but (perhaps understandably) they still wouldn’t have anything to do with them, not even recognising the existence of the state of Israel. After six years of brooding and rebuilding their military strength, the Arab world launched an invasion of their own, called the Yom Kippur War after its timing to coincide with the holiest day of the Jewish calendar and backed by the Soviet Union, and the Egyptian army* crossed the psychologically significant Suez Canal that had marked the border. Although the war eventually cost over 18,000 Arab lives to under 3,000 Israeli ones, with Israeli air power eventually winning them the day and forcing a UN-backed ceasefire (and nearly precipitating nuclear war, but that’s another story), it deeply damaged the Israelis’ confidence that their military might could be used to bully their Arab neighbours. In November 1977, Egypt’s president visited Jerusalem, paving the way for formal recognition of Israel in 1979, and in 1982 Israel gave back the Sinai desert.

On the map, very little has changed since then; but the fundamental argument as to who the land of Israel/Palestine belongs to has yet to be settled, and probably never will be. Indeed, the situation has only intensified as great barriers have been built by the Israelis and they have attacked Muslim communities (both, they say, in an effort to combat terrorism); to this day, Israel and Syria are still technically at war. Some blame the Israelis’ gung-ho attitude, whilst others claim they are only acting in response to Muslim aggression (and anyone who’s ever travelled into Israel via their national airline can tell you how stringent their security policy is). The only things that can safely be said without picking sides are that ‘the situation in Palestine’ has claimed thousands of lives, ruined countless others, has no side clearly in the ‘right’, and doesn’t look like it will be ending any time soon. It is a sad state of affairs.

*The key instigator for the invasion was Egyptian president Anwar Sadat, who would be assassinated in 1981 by militants opposed to his peace treaty. His replacement was welcomed by the western world for bringing stability to Egypt; and Hosni Mubarak was still ‘bringing stability’ to his nation right up until the Arab Spring of two years ago. Another key ally was president Hafez al-Assad of Syria, who kept office from 1971 to 2000 when his son Bashar took over. This is the same Bashar al-Assad currently accused of using chemical weapons against Syrian rebels. I don’t know that this is relevant, just thought it was… interesting.