The Red Flower

Fire is, without a doubt, humanity’s oldest invention and its greatest friend; to many, it is the fundamental example of what separates us from other animals. The abilities to keep warm through the coldest nights and harshest winters, to scare away predators by harnessing this strange force of nature, and to cook a joint of meat because screw it, it tastes better that way, are incredibly valuable ones, and they have seen us through many a tough moment. Over the centuries, fire in one form or another has been used for everything from waging war to furthering science, and very grateful we are for it too.

However, whilst the social history of fire is interesting, if I were to do a post on it then you, dear readers, would be faced with 1000 words of rather repetitive and somewhat boring myergh (technical term), so instead I thought I would take this opportunity to turn to my other old friend in these matters: science, along with a few things learned from several years of very casual outdoorsmanship.

Fire is the natural product of any sufficiently exothermic reaction (i.e. one that gives out heat, rather than taking it in). These reactions can be of any type, but since fire can only form in air, most such reactions we are familiar with tend to be oxidation reactions; oxygen from the air bonding chemically with the substance in question (although there are exceptions; a sample of potassium placed in water will float on the top and react with the water itself, becoming surrounded by a lilac flame sufficiently hot to melt it, and start fizzing violently and pushing itself around the container. A larger dose of potassium, or a more reactive alkali metal such as rubidium, will explode). The emission of heat causes a relatively gentle warming effect for the immediate area, but close to the site of the reaction itself a very large amount of heat is emitted in a small area. This excites the molecules of air close to the reaction and causes them to vibrate violently, emitting photons of electromagnetic radiation as they do so in the form of heat & light (among other things). These photons cause the air to glow brightly, creating the visible flame we can see; this large amount of thermal energy also ionises a lot of atoms and molecules in the area of the flame, meaning that a flame carries a slight charge and is more conductive than the surrounding air. Because of this, flame probes are sometimes used to get rid of excess charge in sensitive electrostatic experiments, and flamethrowers can be made to fire lightning.
Most often the glowing flame results in the characteristic reddy-orange colour of fire, but some reactions, such as the potassium one mentioned above, cause the flame to emit radiation at other frequencies for a variety of reasons (chief among them the temperature of the flame and the spectral properties of the material in question), giving flames of different colours, whilst a white-hot area of a fire is so hot that the molecules don’t care what frequency the photons they’re emitting are at, so long as they can get rid of the things fast enough. Thus, light of all wavelengths gets emitted, and we see white light. The flickery nature of a flame is generally caused by the excited hot air moving about rapidly until it gets far enough away from the source of heat to cool down and stop glowing; this process happens continuously with hundreds of packets of hot air, causing the flame to flicker back and forth.

However, we must remember that fires do not just give out heat, but must take some in too. This is to do with the way the chemical reaction generating the heat works; the process requires the bonds between atoms to be broken, which uses up energy, before they can be reformed into a different pattern to release energy, and the energy needed to break the bonds and get the reaction going is known as the activation energy. Getting the molecules of the stuff you’re trying to react up to the activation energy is the really hard part of lighting a fire, and different reactions (involving the burning of different stuff) have different activation energies, and thus different ‘ignition temperatures’ for the materials involved. Paper, for example, famously has an ignition temperature of 451 Fahrenheit, or around 233 degrees centigrade (which means, incidentally, that you can cook with it if you’re sufficiently careful and not in a hurry to eat), whilst wood’s is a little higher at around 300 degrees centigrade, both of which are less than the temperature of a spark or flame. However, we must remember that neither fuel will ignite if it is wet; water is not a fuel that can be burnt, and it must be boiled away (soaking up heat in the process) before the material beneath can get hot enough to catch, meaning that it often takes a while to dry wood out sufficiently, and that big, solid blocks of wood take quite a bit of energy to heat up.
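For readers who, like me, can never keep Fahrenheit and centigrade straight, the comparison above is easy to sanity-check with the standard conversion formula (the figures for paper and wood are the rough ones quoted above, not precise laboratory values):

```python
def fahrenheit_to_celsius(f):
    """Standard conversion: subtract 32, then scale by 5/9."""
    return (f - 32) * 5 / 9

paper_ignition_c = fahrenheit_to_celsius(451)  # paper's famous 451 Fahrenheit
wood_ignition_c = 300                          # rough figure for wood, already in centigrade

print(round(paper_ignition_c))             # -> 233
print(wood_ignition_c > paper_ignition_c)  # -> True: wood needs the higher temperature
```

So 451 Fahrenheit, despite the bigger-sounding number, is the lower ignition point of the two.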

From all of this information we can extrapolate the first rule that everybody learns about firelighting: that in order to catch, a fire needs air, dry fuel and heat (the air provides the oxygen, the fuel the stuff it reacts with, and the heat the activation energy). When one of these is lacking, one must make up for it by providing an excess of at least one of the other two, whilst remembering not to let the provision of the others suffer; it does no good, for example, to throw tons of fuel onto a new, small fire, since doing so will cut off its access to the air and put the fire out. Whilst fuel and air are usually relatively easy to come by when starting a fire, heat is always the tricky thing; matches are short-lived, sparks even more so, and the fact that most of your fuel is likely to be damp makes the job even harder.

Provision of heat is also the main reason behind all of our classical methods of putting a fire out; covering it with cold water cuts it off from both heat and oxygen, and whilst blowing on a fire will provide it with more oxygen, it will also blow away the warm air close to the fire and replace it with cold, causing small flames like candles to be snuffed out (it is for this reason that a fire should be blown on very gently if you are trying to get it to catch, and also why doing so will cause the flames, which are made of hot air remember, to disappear, but the embers to glow more brightly and burn with renewed vigour once you have stopped blowing). Once a fire has sufficient heat, it is almost impossible to put out, and blowing on it will only provide it with more oxygen and cause it to burn faster, as was ably demonstrated during the Great Fire of London. I myself once, with a few friends, laid a fire that burned for 11 hours straight; many times it was reduced to a few humble embers, but it was so hot that all we had to do was throw another log on and it would instantly begin to burn again. When the time came to put it out, it took half an hour for the embers to dim their glow.


Why the chubs?

My last post dealt with the thorny issue of obesity: both its increasing presence in our everyday lives, and what for me is the underlying reason behind the stats that back up media scare stories concerning ‘the obesity epidemic’, namely the rise in size of the ‘average’ person over the last few decades. The precise causes of this trend can be put down to a whole host of societal factors within our modern age, but that story is boring as hell and has been repeated countless times by commentators far more adept in this field than me. Instead, today I wish to present the case for modern-day obesity as a problem concerning the fundamental biology of a human being.

We, and our dim and distant ancestors of the scaly/furry variety, have spent the last few million years living wild; hunting, fighting and generally behaving much like any other product of evolution. Thus, we can learn a lot about our own inbuilt biology and instincts by studying the behaviour of animals alive today, and when we do so, several interesting animal eating habits become apparent. As anyone who has tried it as a child can attest (and I speak from personal experience), grass is not good stuff to eat. It’s tough, it takes a lot of chewing and processing (many herbivores have multiple stomachs to make sure they squeeze the maximum nutritional value out of their food), and there really isn’t much nutrition in it to power a fully-functional being. As such, grazers on grass and other such tough plant matter (such as leaves) will spend most of their lives doing nothing but guzzle the stuff, trying to get as much as possible through their systems. Other animals will favour food with a higher nutritional content, such as fruits, tubers or, in many cases, meat, but these frequently present issues. Fruits are highly seasonal and rarely available in a large enough volume to support a large population, as well as being quite hard to eat in quantity; plants ‘design’ fruits so that each visitor takes only a few at a time, so as best to spread their seeds far and wide, and as such there are few animals that can sustain themselves on such a diet. Other foods such as tubers or nuts are hard to get at, needing to be dug up or broken open in highly energy-consuming activities, whilst meat has the annoying habit of running away or fighting back whenever you try to get at it. As anyone who watches nature documentaries will attest, most large predators will only eat once every few days (admittedly rather heavily).

The unifying factor in all of this is that food is, in the wild, highly energy- and time-consuming to get hold of and consume, since every source of it guards its prize jealously. Therefore, any animal that wants to survive in this tough world must be near-constantly in pursuit of food simply to fulfil all of its life functions, and this is characterised by being perpetually hungry. Hunger is a body’s way of telling us that we should get more food, and in the wild this constant desire for more is kept in check by the difficulty that getting hold of it entails. Similarly, animal bodies counterbalance this desire by being lazy; if something isn’t necessary, then there’s no point wasting valuable energy going after it (since doing so will mean spending more time going after food to replace the lost energy).

However, in recent history (and a spectacularly short period of time from evolution’s point of view), one particular species called Homo sapiens came up with this great idea called civilisation, which basically entailed the pooling and sharing of skills and resources in order to best benefit everyone as a whole. As an evolutionary success story, this is right up there with developing multicellular body structures in terms of being awesome, and it has enabled us humans to live far more comfortable lives than our ancestors did, with correspondingly far greater access to food. This has proved particularly true over the last two centuries, as technological advances in a more democratic society have improved the everyman’s access to food and comfortable living to a truly astounding degree. Unfortunately (from the point of view of our waistlines), the instincts of our bodies haven’t quite caught up with the idea that when we want or need food, we can simply get food, without all that inconvenient running around after it getting in the way. Not only that, but a lack of pack hierarchy combined with this increased availability means that we can stock up on food until we have eaten our absolute fill if we so wish; the difference between ‘satiated’ and ‘stuffed’ can work out at well over 1000 calories per meal, and over a long period of time it only takes a little more than we should be having every day to start packing on the pounds. Combine that with our natural predilection for laziness, meaning that we don’t naturally think of going out for some exercise as fun purely for its own sake, and the fact that we no longer burn calories chasing our food, or in the muscles we built up from said chasing, and we find ourselves consuming a lot more calories than we really should be.
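To put a rough number on ‘a little more than we should be having every day’: using the common (and admittedly crude) rule of thumb that a pound of body fat corresponds to roughly 3500 calories (an assumption borrowed for illustration, not a precise physiological figure), even a modest daily surplus adds up alarmingly over a year:

```python
KCAL_PER_POUND_OF_FAT = 3500   # rough rule-of-thumb figure, assumed for illustration

daily_surplus = 100            # calories over maintenance each day; about one biscuit
yearly_gain = daily_surplus * 365 / KCAL_PER_POUND_OF_FAT

print(round(yearly_gain, 1))   # -> 10.4 pounds in a year, from one biscuit a day
```

The real biology is messier than this (metabolisms adjust, for one thing), but the order of magnitude is the point: tiny, consistent excesses are all it takes.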

Not only that, but during this time we have also got into the habit of spending a lot of time worrying over the taste and texture of our food. This means that, unlike our ancestors, who were just fine with simply jumping on a squirrel and devouring the thing, we have to go through the whole rigmarole of getting stuff out of the fridge and spending two hours slaving away in a kitchen, attempting to cook something vaguely resembling tasty. This wait is not something our bodies enjoy very much, meaning we often turn to ‘quick fixes’ when in need of food: stuff like bread, pasta or ready meals. Whilst we all know how much crap goes into ready meals (which should, as a rule, never be bought by anyone who cares even in the slightest about their health; the salt content of those things is insane) and other such ‘quick fixes’, fewer people are aware of the impact a high intake of grains can have on our bodies. Stuff like bread and rice only started being eaten by humans a few thousand years ago, as we discovered the benefits of farming and cooking, and whilst they are undoubtedly a good food source (and are very, very difficult to cut from one’s diet whilst still remaining healthy), our bodies have simply not had enough time, evolutionarily speaking, to get used to them. This means they have a tendency not to make us feel as full as their calorie content would suggest, meaning that we eat more than our body in fact needs (if you want to feel full whilst not taking in so many calories, protein is the way to go; meat, fish and dairy are great for this).

This is all rather academic, but what does it mean for you if you want to lose a bit of weight? I am no expert on this, but then again neither are most of the people acting as self-proclaimed nutritionists in the general media, and anyway, I don’t have any better ideas for posts. So, look out for my next post for my, admittedly basic, advice for anyone trying to make themselves that little bit healthier, especially if you’re trying to work off a few of the pounds built up over this festive season.

Questionably Moral

We human beings tend to set a lot of store by the idea of morality (well, most of us anyway), and it is generally accepted that having a strong code of morals is a good thing. Even if many of us have never exactly spelled out what we consider to be right or wrong, the majority of people have at least a basic idea of what they consider morally acceptable, and a significant number are willing to make their moral standpoint on various issues very well known to anyone who doesn’t want to listen (internet, I’m looking at you again). One of the key features considered integral to such a moral code is rigidity: having fixed rules. Much like law, morality should ideally be inflexible, passing equal judgement on the same situation regardless of who is involved, how you’re feeling at the time and other outside factors. If only to avoid being accused of hypocrisy, social law dictates that one ‘should’ pass equal moral judgement on both one’s worst enemy and one’s spouse, and such a stringent dedication to ‘justice’ is a prized quality among those with strong moral codes.

However, human beings are nothing if not inconsistent, and even the strongest and most vehemently held ideas have a habit of withering in the face of context. One’s moral code is no exception, and with that in mind, let’s talk about cats.

Consider a person- call him a socialist, if you like that sort of description. Somebody who basically believes that we should be doing our bit to help our fellow man. Someone who buys The Big Issue, donates to charity, and gives their change to the homeless. They take the view that those in a more disadvantaged position should be offered help, and they live and share this view on a daily basis.

Now, consider what happens when, one day, said person is having a barbecue and a stray cat comes into the garden. Such strays are, nowadays, uncommon in suburban Britain, but across Europe (the Mediterranean especially), there may be hundreds of them in a town (maybe the person’s on holiday). Picture one such cat- skinny, with visible ribs, unkempt and patchy fur, perhaps a few open sores. A mangy, quite pathetic creature, clinging onto life through a mixture of tenacity and grubbing for scraps, it enters the garden and makes its way towards the man and his barbecue.

Human beings, especially modern-day ones, lead quite a wasteful and indulgent existence. We certainly do not need the vast majority of the food we produce and consume, and could quite happily do without a fair bit of it. A small cat, by contrast, can survive quite happily for at least a day on just one small bowl of food, or a few scraps of meat. From a neutral, logical standpoint, therefore, the correct and generous thing to do, according to this person’s moral standpoint, would be to throw the cat a few scraps and sleep comfortably with a satisfied conscience that evening. But all our person sees is a mangy street cat, a dirty horrible stray that they don’t want anywhere near them or their food, so they kick, scream, shout, throw water and generally do all they can to drive a starving life form, after just a few scraps, away from a huge pile of pristine meat, much of which is likely to go to waste.

Now, you could argue that if the cat had been given food, it would have kept on coming back, quite insatiably, for more, and could possibly have got bolder and more aggressive. An aggressive, confident cat is more likely to try and steal food, and letting a possibly diseased and flea-ridden animal near food you are due to eat is probably not in the best interests of hygiene. You could argue that offering food is just going to encourage other cats to come to you, until you become a feeding station for all those in the area, thus promoting the survival and growth of a feline population that nobody really likes to see around and that would be unsustainable to keep. You could argue, if you were particularly harsh and probably not of the same viewpoint as the person in question, that a cat is not ‘worth’ as much as a human, if only because we should stick to looking after our own for starters and, in any case, the world would be better off without stray cats around to cause such consternation and moral dilemmas. But none of this changes the fact that this person has, from an objective standpoint, violated their moral code by refusing a creature less fortunate than themselves a mere scrap that could, potentially, represent the difference between its living and dying.

There are other examples of such moral inconsistency in the world around us. Animals are a common connecting factor (pacifists and people who generally don’t like murder will quite happily swat flies and the like ‘because they’re annoying’), but there are other, more human examples (those who say we should be feeding the world’s poor whilst simultaneously eating and wasting vast amounts of food and donating a mere pittance to help those in need). Now, does this mean that all of these moral standpoints are stupid? Of course not; if we all decided not to help and be nice to one another, then the world would be an absolute mess. Does it mean that we’re all just bad, hypocritical people, as the violently forceful charity collectors would have you believe? Again, no; this ‘hypocrisy’ is something that all humans do to some extent, so either the entire human race is fundamentally flawed (in which case the point is not worth arguing) or we feel that looking after ourselves first and foremost before helping others is simply more practical. Should we all turn to communist leadership to try and redress some of these imbalances and remove the moral dilemmas? I won’t even go there.

It’s a little hard to identify a clear moral or conclusion to all of this, except to highlight that moral inconsistency is a natural and very human trait. Some might deplore this state of affairs, but we’ve always known humans are imperfect creatures; not that that gives us a right to give up on being the best we can be.