Once Were Hairy

Aesthetically, humans rather stand out from the rest of natural creation. We are multicellular organisms, instantly setting us apart from the vast majority of species on Earth today, and warm-blooded, differentiating us from every plant, fungus, invertebrate, fish, amphibian and reptile. We stand on two legs, but at the same time cannot fly, differentiating us from almost every species of bird and mammal. But this is only so much basic classification; the one trait that aesthetically differentiates us from nearly all of these is our hairlessness. Brian May excepted.

Technically, there are other members of the class Mammalia who go without fur; the simultaneously cute and horrifying naked mole rat is but one land-borne example, and no fully aquatic mammal (whales, dolphins and the like) has fur. And yes, we technically do have a full layer of hair covering us, meaning that you have more hairs (in terms of number, rather than volume) than a chimpanzee, but our hairy covering is so fine as to be practically not there, and the amount of insulation and protection it provides is minimal. Across most of our body, the only natural protection we have against our enemies and the elements is our bare skin.

Exactly why this is the case is somewhat unclear, because fur is very useful stuff. It offers a surprising degree of protection against cuts and attacks, and is just as effective at keeping out the elements, be they cold, wind or rain. Length can and does vary widely depending on location and need, and many species (including some humans) have incorporated their fur as a form of bodily decoration to attract mates and intimidate rivals; the lion’s mane is the most obvious example.

Also confusing is why we have hair where we do: upon our heads and around the pubic regions. Hair on the head may be an almost vestigial thing, left over from the days when our ancestors were furred all over, although this doesn't explain why it remained on the head in particular. Better explanations include the slight degree of extra shielding it provides to our brain, our greatest evolutionary advantage, or the fact that the obviousness of the head makes it a natural place to rig our hair into elaborate, ceremonial styles and headdresses, raising the social standing of those who can get away with such things and helping them attract a mate and further their genes.

The pubic region, however, is of particular interest to evolutionary biologists, in part because hair there seems counter-productive; the body keeps the testicles outside itself because they need to be kept slightly cooler than the body's interior temperature in order to keep sperm count high and ensure fertility (an interesting side effect of which is that people who take regular hot baths tend to have a lower sperm count). Surrounding such an area with insulating hair seems evolutionarily dumb, reducing our fertility and our chances of passing our genes on to the next generation. It is thought, however, that hair around these regions may aid the release of sexual pheromones, helping us to attract a mate, or that it may have helped to reduce chafing during sex, so that women tended to choose men with pubic hair (and vice versa) to make sex more comfortable. This is an example of sexual selection, where evolution is powered by our sexual preferences rather than environmental necessity, and sexual selection has itself been suggested as a theory as to why we humans lost our hair in the first place, or at least stayed that way once we lost it: we simply found it more attractive that way. This theory was proposed by Charles Darwin, which seems odd given the truly magnificent beard he wore. However, the 'chafing' theory regarding pubic hair is rather heavily disputed from a number of angles, among them the fact that many couples choose to shave their pubic region in order to enhance sexual satisfaction. Our preferences could, of course, have changed over time.

One of the more bizarre theories concerning human hairlessness is the 'aquatic ape' theory; it is well known that most swimming mammals, from river dolphins to sea lions, favour fat (or 'blubber') in place of fur, as it is more streamlined and efficient for swimming and better for warmth underwater. Some scientists have therefore suggested that humans went through a period of evolution during which we adopted a semi-aquatic lifestyle, fishing in shallow waters and making our homes in and around the water. They also point to the slight webbing effect between our fingers as evidence of a change that was just starting to happen before we left our waterborne lifestyle, and to humanity's ability to swim (I am told that if a newborn baby falls into water he will not sink but will instinctively 'swim', an ability we lose as toddlers and must re-learn later, but I feel it may be inappropriate to test this theory out). However, there is no real evidence for these aquatic apes, so most scientists feel we should look elsewhere.

Others have suggested that the reason may have been lice; one only needs to hear the horror stories of the First World War to know the horribleness of a lice infestation, and such parasites are frequently the vectors for virulent diseases that can wipe out a population with ease. Many animals spend the majority of their time picking through their fur to remove them (in other apes this is a crucial part of social bonding), but if we have no fur then the business becomes infinitely simpler, because we can actually see the lice. Once again, adherents point to sexual selection: without hair we can display our untarnished, healthy, parasite-free skin to the world and to our prospective mates (along with any impressive scars we want to show off), letting them know they are choosing a healthy partner, and this may go some way to explaining why the ultimate expression of male bodily beauty is considered to be a strong, hairless chest and six-pack, symbolising both strength and health. Ironically, our loss of fur and subsequent use of clothes gave rise to an entirely new species; the body louse lives only within the folds of our clothes, and is thought to have evolved from the head louse some 50,000 years ago (interestingly, something like a million years passed between our African ancestors going through the hairless phase and our use of clothes, during which time we diverged as a species from the Neanderthals, discovered tools and lived through an Ice Age. Must have been chilly, even in Africa). It's a nice theory, but one considered redundant by some in the face of another: homeostasis.

Apart from our brainpower, homeostasis (or, more precisely, thermoregulation: the ability to keep our body temperature stable) is humanity's greatest evolutionary advantage; warm-blooded mammals are naturally adept at it anyway, giving us the ability to hunt and forage in all weathers, times and climates, and in cold weather fur provides a natural advantage in this regard. However, without fur to slow the process of heat regulation (sweating, dilation of blood vessels and the like all become less effective when insulated by fur), human beings are able to maintain a steady body temperature almost regardless of the weather or climate. African tribesmen have been known to run through the bush for an hour straight and raise their body temperature by less than a degree, whilst our ability to regulate heat in colder climates was enough for scores of Ice Age-era human bones to be found across then-freezing Europe. Our ability to regulate temperature surpasses even that of those other 'naked' land mammals, the elephant and rhinoceros, thanks to our prominent nose and extremities that allow us to control heat even more precisely. In short, we're not 100% sure exactly why we humans evolved to be hairless, but it has proved a surprisingly useful trait.
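
To put some very rough numbers on that hour of running (these are assumed, illustrative figures rather than anything measured: a 70 kg runner putting out about 800 W of metabolic heat, and commonly quoted ballpark values for the body's heat capacity and the heat carried away by evaporating sweat), the arithmetic looks something like this:

```python
# A rough sketch of why sweating matters, using assumed round numbers.
BODY_SPECIFIC_HEAT = 3500      # J per kg per degree C (commonly quoted approximation)
SWEAT_LATENT_HEAT = 2.4e6      # J per kg of sweat evaporated (approximate)

mass_kg = 70                   # assumed body mass
heat_power = 800               # assumed metabolic heat output, in watts
duration_s = 3600              # one hour of running

heat_generated = heat_power * duration_s                          # joules
temp_rise_no_cooling = heat_generated / (mass_kg * BODY_SPECIFIC_HEAT)
sweat_needed_kg = heat_generated / SWEAT_LATENT_HEAT

print(f"Heat generated: {heat_generated/1e6:.1f} MJ")
print(f"Temperature rise if none escaped: {temp_rise_no_cooling:.1f} C")
print(f"Sweat evaporated to shed it all: {sweat_needed_kg:.1f} kg")
# Roughly 2.9 MJ: a rise of nearly 12 degrees if none of it escaped, or a
# little over a litre of evaporated sweat to keep body temperature flat.
```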

The Alternative Oven

During the Second World War, the RAF pioneered the use of radar to detect incoming Luftwaffe raids. One of the key components of these radar sets was the cavity magnetron, which uses a magnetic field to propel high-speed electrons and so generate the kind of high-powered radio waves needed for such a technology to work over long distances. The British government shared the technology with its American allies, and Raytheon, a private American enterprise, was granted permission to produce magnetrons. Whilst experimenting with such a radar set in 1945, a Raytheon engineer called Percy Spencer reached for the chocolate bar in his pocket and discovered it had melted. He later realised that the electromagnetic radiation generated by the radar set had been the cause of this heating effect, and thought that such technology could be put to a different, non-military use; and so the microwave oven was born.

Since then, the microwave has become the epitome of western capitalism's golden age: the near-ubiquitous kitchen gadget, usually in the traditional white plastic casing, designed to make certain specific aspects of a process already technically performed by another appliance (the oven) that bit faster and more convenient. As such, it has garnered its fair share of hate over the years, shunned by serious foodies as a taste-ruining harbinger of doom to one's gastric juices that wouldn't be seen dead in any serious kitchen. The simplicity of the microwaving process (especially given that there is frequently no need for a pot or container) has also led to the rise of microwavable meals, designed to take the concept of ultra-simple cooking to its extreme by creating an entire meal from a few minutes in the microwave. However, as everyone who's ever attempted a bit of home cooking will know, such a process does not naturally occur quite so easily, and thus these ready meals generally require large quantities of what is technically known as 'crap' in order to function as meals. This low-quality food has become distinctly associated with the microwave itself, further enhancing its image as a tool for the lazy and the kind of societal dregs that the media like to portray in scare statistics.

In fairness, this is hardly the device's fault, and it is a pretty awesome one. Microwave ovens work thanks to the polarity of water molecules; each consists of a positively charged end (where the hydrogen part of H2O is) and a negatively charged end (where the electron-rich oxygen bit is). Electromagnetic waves, such as the microwaves after which the oven takes its name, carry an electric field, and (being, y'know, waves) that field oscillates (aka 'wobbles') back and forth. This field wobbling back and forth causes the water molecules to oscillate too (technically it works with other polarised molecules as well, but there are very few other polarised liquids that one encounters in cookery; this is why microwaves can heat up stuff without water in it, but don't do it very well). This oscillation means the molecules gain kinetic energy from the microwave radiation; it just so happens that the frequency of the radiation is chosen so that it closely matches the resonant frequency of the water molecules' oscillation, making this energy transfer very efficient*. A microwave works out as a bit over 60% efficient (most of the energy being lost in the aforementioned magnetron used to generate the microwaves), which is exceptional compared to a conventional oven's level of around 10%. The efficiency really depends on the meal and how the appliance is being used, but for small meals or for reheating cold (although not frozen, since the molecules in ice aren't free to move about as much as those in liquid water) food the microwave is definitely the better choice. It helps even more that microwaves are really bad at penetrating the metal walls and meshed glass door of the oven, meaning they tend to bounce around until they hit the food, and very little of the energy gets lost to the surroundings once it has been emitted. However, if nothing is placed in the microwave then these waves are not 'used up' in heating food and tend to end up back in the emitter, causing it to burn out and doing the device some serious damage.

*I have heard it said that this is in fact a myth, and that the operating frequency is deliberately chosen some way off water's resonant peak, the argument being that radiation bang on the resonance would be absorbed entirely by the outer layer of the food and never penetrate to the middle. I can't properly cite my sources on this one, but it does at least make a degree of physical sense.
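
As a rough illustration of what that sort of efficiency means in practice (using assumed figures: an 800 W oven, the 60% efficiency quoted above, and a 300 ml mug of water warmed from 20°C to 80°C), the sums work out something like this:

```python
# Back-of-the-envelope microwave heating time, with assumed illustrative figures.
SPECIFIC_HEAT_WATER = 4186   # J per kg per degree C

mass_kg = 0.300              # 300 ml of water is about 0.3 kg
delta_t = 80 - 20            # desired temperature rise, degrees C
power_in = 800               # electrical power drawn by the oven, watts (assumed)
efficiency = 0.6             # fraction of that power ending up in the food (assumed)

energy_needed = mass_kg * SPECIFIC_HEAT_WATER * delta_t       # joules
time_seconds = energy_needed / (power_in * efficiency)

print(f"Energy needed: {energy_needed/1000:.0f} kJ")
print(f"Time at {power_in} W and {efficiency:.0%} efficiency: {time_seconds:.0f} s")
# About 75 kJ and roughly two and a half minutes, which matches kitchen experience.
```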

This use of microwave radiation to heat food has some rather interesting side-effects; up first is the oft-cited myth that microwaves cook food 'from the inside out'. This isn't actually true, for although the inside of a piece of food may be slightly better insulated than the outside, the microwaves should transfer energy to all of the food at a roughly equal rate; if anything the outside will get more heating, since it is hit by the microwaves first. The effect is really down to the chemical makeup of a lot of the food we put in a microwave, which generally has the majority of its water content beneath the surface; this leaves the surface relatively cool and crusty, with little water to heat it up, and the inside scaldingly hot. The use of high-power microwaves also means that just about everyone in the country has in their home a death ray capable of quite literally boiling someone's brain if the rays were directed towards them (hence why dismantling a microwave is semi-illegal, as I understand it), but it also means that everyone has ample opportunity, so long as they don't intend to use the microwave again afterwards and have access to a fire extinguisher, to do some seriously cool stuff with it. Doing so is dangerous, rather stupid and liable to get you into some quite weird situations, but nothing is a more sure-fire indicator of a scientific mind than the instinct to ask 'what happens when…' whilst looking at the powerful EM radiation emitter sitting in your kitchen. For the record, I did not say that this was a good idea…

The Red Flower

Fire is, without a doubt, humanity's oldest invention and its greatest friend; to many, it is the fundamental example of what separates us from other animals. The abilities to keep warm through the coldest nights and harshest winters, to scare away predators by harnessing this strange force of nature, and to cook a joint of meat because, screw it, it tastes better that way, are incredibly valuable ones, and they have seen us through many a tough moment. Over the centuries, fire in one form or another has been used for everything from waging war to furthering science, and very grateful we are for it too.

However, whilst the social history of fire is interesting, if I were to do a post on it then you, dear readers, would be faced with 1000 words of rather repetitive and somewhat boring myergh (technical term), so instead I thought I would take this opportunity to resort to my other old friend in these matters: science, along with a few things learned from several years of very casual outdoorsmanship.

Fire is the natural product of any sufficiently exothermic reaction (i.e. one that gives out heat rather than taking it in). These reactions can be of any type, but since fire generally forms in air, most of the ones we are familiar with tend to be oxidation reactions: oxygen from the air bonding chemically with the substance in question (although there are exceptions; a sample of potassium placed in water will float on the top and react with the water itself, becoming surrounded by a lilac flame sufficiently hot to melt it, and start fizzing violently and pushing itself around the container. A larger dose of potassium, or a more reactive alkali metal such as rubidium, will explode). The emission of heat causes a relatively gentle warming effect in the immediate area, but close to the site of the reaction itself a very large amount of heat is emitted into a small space. This excites the molecules of air close to the reaction and causes them to vibrate violently, emitting photons of electromagnetic radiation as they do so, in the form of heat and light (among other things). These photons cause the air to glow brightly, creating the visible flame we see; the large amount of thermal energy also ionises a lot of atoms and molecules in the area of the flame, meaning that a flame carries a slight charge and is more conductive than the surrounding air. Because of this, flame probes are sometimes used to get rid of excess charge in sensitive electromagnetic experiments, and flamethrowers can be made to fire lightning. Most often the glowing flame has the characteristic reddy-orange colour of fire, but some reactions, such as the potassium one mentioned above, emit radiation at other frequencies for a variety of reasons (chief among them the temperature of the flame and the spectral properties of the material in question), giving flames of different colours, whilst a white-hot area of a fire is so hot that the molecules don't care what frequency the photons they're emitting are at, so long as they can get rid of the things fast enough. Thus, light of all wavelengths gets emitted, and we see white light. The flickery nature of a flame is generally caused by the excited hot air moving about rapidly until it gets far enough away from the source of heat to cool down and stop glowing; this happens continuously with hundreds of packets of hot air, causing the flames to flicker back and forth.
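
For the curious, the 'white-hot means all wavelengths' point can be roughly quantified with Wien's displacement law, treating the hot region as an ideal black body (a simplification: real flame colour also owes a lot to excited atoms and glowing soot, and the temperatures below are illustrative guesses rather than measurements):

```python
# Peak black-body emission wavelength at a few illustrative temperatures,
# via Wien's displacement law: lambda_peak = b / T.
WIEN_CONSTANT = 2.898e-3   # metre-kelvins

def peak_wavelength_nm(temperature_k):
    """Peak black-body emission wavelength in nanometres."""
    return WIEN_CONSTANT / temperature_k * 1e9

for label, temp in [("cool ember", 1000), ("candle flame", 1900), ("white-hot region", 3000)]:
    print(f"{label:>18}: {temp} K -> peak at {peak_wavelength_nm(temp):.0f} nm")

# 1000 K peaks deep in the infrared (~2900 nm); by 3000 K the peak (~970 nm)
# has crept towards the visible band, and enough light is emitted right across
# the visible range for the glow to look white rather than red.
```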

However, we must remember that fires do not just give out heat, but must take some in too. This is down to the way the heat-generating chemical reaction works; the process requires the bonds between atoms to be broken, which uses up energy, before they can be reformed into a different pattern to release energy, and the energy needed to break the bonds and get the reaction going is known as the activation energy. Getting the molecules of the stuff you're trying to react up to the activation energy is the really hard part of lighting a fire, and different reactions (involving the burning of different stuff) have different activation energies, and thus different 'ignition temperatures' for the materials involved. Paper, for example, famously has an ignition temperature of 451 Fahrenheit (which means, incidentally, that you can cook with it if you're sufficiently careful and not in a hurry to eat), whilst wood's is a little higher at around 300 degrees centigrade, both of which are less than the temperature of a spark or flame. However, neither fuel will ignite if it is wet: water will not burn, and any heat supplied goes into drying the wood out before it can start raising it towards its ignition temperature, which is why damp wood takes a while to catch and why big, solid blocks of wood take quite a bit of energy to heat up.
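
Just to put those two figures on the same scale (taking the numbers quoted above at face value), here is a quick conversion:

```python
# Putting the quoted ignition temperatures on one temperature scale.
def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(f"Paper: 451 F is about {f_to_c(451):.0f} C")   # roughly 233 C
print(f"Wood:  300 C is about {c_to_f(300):.0f} F")   # roughly 572 F
```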

From all of this information we can extrapolate the first rule that everybody learns about firelighting: that in order to catch, a fire needs air, dry fuel and heat (the air provides the oxygen, the fuel the stuff it reacts with, and the heat the activation energy). When one of these is lacking, one must make up for it by providing an excess of at least one of the other two, whilst remembering not to let the provision of the others suffer; it does no good, for example, to throw tons of fuel onto a new, small fire, since it will snuff out the fire's access to the air and put it out. Whilst fuel and air are usually relatively easy to come by when starting a fire, heat is always the tricky thing; matches are short-lived, sparks even more so, and the fact that most of your fuel is likely to be damp makes the job even harder.

Heat, or rather the removal of it, is also the main principle behind our classical methods of putting a fire out; covering it with cold water cuts it off from both heat and oxygen, and whilst blowing on a fire will provide it with more oxygen, it will also blow away the warm air close to the fire and replace it with cold, causing small flames like candles to be snuffed out (it is for this reason that a fire should be blown on very gently if you are trying to get it to catch, and also why doing so will cause the flames, which are made of glowing hot air remember, to disappear, while the embers glow more brightly and burn with renewed vigour once you have stopped blowing). Once a fire has sufficient heat, it is almost impossible to put out, and blowing on it will only provide it with more oxygen and cause it to burn faster, as was ably demonstrated during the Great Fire of London. I myself have once, with a few friends, laid a fire that burned for 11 hours straight; many times it was reduced to a few humble embers, but it was so hot that all we had to do was throw another log on and it would instantly begin to burn again. When the time came to put it out, it took half an hour for the embers to dim their glow.

Socially Acceptable Druggies

Alcohol is, without a shadow of a doubt, our society's most socially acceptable drug of choice; no matter that one third of people admit to smoking cannabis at some point in their lives, or that smoking kills tens of thousands more people every year, neither can touch alcohol for its prevalence and importance within western civilisation. It's everywhere: at most polite social gatherings it is fundamentally necessary as an icebreaker, every settlement from the biggest city to the tiniest hamlet will have a bar, pub or other drinking venue, and many people will collect veritable hoards of the stuff, sometimes even in purpose-built rooms.

Which, on the face of it, might seem odd given how much it screws around with you. Even before the damage it causes to the liver and other internal organs was discovered, it had been known for centuries that alcohol was dangerously habit-forming stuff, and it was generally acknowledged that prolonged use 'pickled' the brain. It also leaves those who imbibe it severely confused and lacking in coordination, which has proved hideously dangerous in countless scenarios over the years (even contributing to several assassinations), and can be almost guaranteed to result in personal embarrassment and other decisions you're really going to regret when sober. If it wasn't for booze's noted enhancement of promiscuity, it might be surprising that drinking hadn't been bred out of us simply through natural selection, so much does it generally screw around with our ability to function as proper human beings.

Like many drugs, alcohol has its roots in the dim and distant past, when it felt quite nice and we didn't know any better; a natural product when sugar (usually in the form of fruit) comes into contact with yeast (a common, naturally occurring fungus), it was quickly discovered how to make this process happen efficiently and in a controlled fashion by putting both sugar and yeast under water (or in some other anaerobic environment). All the raw materials were easy to come by and the process didn't require any special skill, so it was only natural that it should catch on. Especially when we consider that alcohol is generally held to be the single best way of making the world feel like a less crappy place than it often appears.
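
For the chemically curious, the overall reaction the yeast is carrying out is, in its simplest form:

C6H12O6 (glucose) → 2 C2H5OH (ethanol) + 2 CO2 (carbon dioxide)

which is also where the fizz in beer and champagne comes from.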

However, the real secret to alcohol’s success in worming its way into our society is less linked to booze itself, and has more to do with water. From our earliest infancy as a species, water has been readily available in the world around us, whether it be from lakes, rivers, wells or wherever. Unfortunately, this means it is also available for lots of other things to use and make their homes in, including a vast array of nasty bacteria. As can be seen with the situation across swathes of Africa and the Third World (although this problem has been reduced quite significantly over the last decade or so), access to water that is not fetid, disgusting and dangerous can be nigh-on impossible for many, forcing them to settle for water containing diseases ranging from cholera to dysentery. And that’s where alcohol came in.

The great advantage of alcohol is that its production can be very carefully controlled; even if the majority of an alcoholic drink is water, this is generally a product of the fruit or other sugary substance used in the brewing process. This means it is a lot purer than most ‘fresh’ water, and in any case the alcohol present in the fluid kills off a lot of bacteria. Even for those that can survive that, alcoholic beverages are far more likely to be bottled (or at least they were, before someone discovered the sheer quantity of suckers willing to buy what you can get out of the tap) than water, keeping any more invading bacteria, parasites, insects and other crap out. All of this was, of course, not known before Louis Pasteur first came along with his Germ Theory, but the facts stayed the same; historically, you were far less likely to die from drinking alcohol than drinking water.

Still, come the 20th century most of our sanitation problems in the developed world were sorted, so we didn't need to worry about all that any more, did we? Surely we would have been fine to get rid of booze from our culture, to throw out a feature of our lives that ruins many a night out, body or family? Surely we'd all be far better off without alcohol in our culture? Wouldn't we?

In many cases this kind of question would remain a purely theoretical one, to be discussed by leading thinkers; however, much to the delight of all champions of evidence over opinion, the USA was kind enough to give banning alcohol a go back in the early days of the 20th century. A hundred years ago, campaigns from the likes of the church and the Anti-Saloon League painted alcohol as a decidedly destructive influence, so successfully that from 1920 to 1933 the production, sale and transportation of alcohol within the United States was illegal.

At the time, many people thought this was a brilliant idea that would yield great social change. They were right: society as a collective decided that the law was more like a guideline anyway, and threw its lot in with the mob. This was the golden age of organised crime, the era of Al Capone and others making fortunes dealing bootleg alcohol, either dangerous home-brewed 'moonshine' liquor or stuff smuggled across the Canadian border. Hundreds of illegal speakeasies, clubs whose drab outsides hid gaudy interiors housing illegal gambling dens, dancers, prostitutes and a hell of a lot of booze, sprang up in every major American city, and while the data is inconsistent, some figures suggest alcohol consumption actually rose during the Prohibition era (as it was known). Next to nobody was ever imprisoned or even charged for these crimes, however, because the now-wealthy mob could afford to bribe almost anyone, and in any case most police officers and legal officials were illicit drinkers themselves; even Al Capone wasn't taken down until after he was suspected of ordering some rival gangsters gunned down in what became known as the St Valentine's Day Massacre. Eventually a group of supremely dedicated federal agents known unofficially as 'The Untouchables' managed to pin tax evasion charges on him, and a bribed jury had to be swapped out to ensure he went down (a film, The Untouchables, was made about the story; give it a watch if you ever get the chance). By the time Franklin D. Roosevelt repealed Prohibition upon coming to power in 1933, the message was clear: America loved alcohol too much, and it wasn't about to let it go.

Alcohol is, in its effects at least, not a special drug; many others can be used to forget the bad times, enjoy the good times and make the world feel like a better place. But there's something about it, something about its cultural imagery, that makes it timeless, an immovable feature of our world. It could be that it's probably the cheapest recreational drug, or maybe that it's the oldest, but to me the real secret to its success is its weakness, combined with the way it is almost always served very dilute. Most illegal drugs give an instant hit, a huge rush followed by a crashing downer, and this makes any use of them a brief, wild experience. Alcohol is more mellow: something you can spend an entire night slowly drowning your sorrows in, or casually imbibe whilst chatting and generally functioning like a normal human being. It's slow, it's casual, a feature of an evening that does not necessarily have to define it; that is the cultural secret to alcohol's success.