Keeping it Cool

We humans are unique in so many ways, but perhaps our mastery of the systems used in getting food into our mouths is the most remarkable. From our humble hunter-gatherer beginnings, in which we behaved much like any other animal, we have discovered agriculture, domesticated animals, learned to harvest milk and eggs, and are nowadays even capable of growing a steak from just a few cells (we’ll temporarily gloss over the cost and taste of the finished product). However, arguably just as important as these advancements has been our ability to store food, allowing us to survive the harshest of winters and conditions in numbers few other animals could hope to match.

Our methods of food storage have varied widely over the years; beyond the simple tactic of ‘keep your food somewhere basically dry and clean’, in the last few decades we’ve moved on from our old favourites to explore solutions as varied as chemical preservatives and freeze-drying. However, today I wish to explore an older, yet arguably far more interesting, technique that remains our favourite method of home food preservation: refrigeration.

Refrigeration, or the highly technical art of ‘making food colder so bad things can’t survive’, is an ancient idea; ice houses have been found in Iran dating from 1700 BC, and were in use in both China and the Roman Empire throughout both cultures’ long histories. Since making their own ice was impossible using the technology of the time, these ancient civilisations simply moved existing ice to a designated place where it would be useful, and came up with ingenious ways to make sure it stayed cold throughout the long summers; these great buildings would have immensely thick walls and were then packed with straw or sawdust to prevent the air circulating, thus helping to maintain their temperature. Thanks to their thick walls and large stores, ice houses were necessarily vast structures, acting rather like communal refrigerators for a local lord and his community and capable of holding up to thirty thousand tons of food.

In other countries, where snow and ice were harder to reliably come by (even in winter), refrigeration didn’t really catch on and people stuck with salting their food. However, because this a) made a lot of food taste disgusting and b) meant you still had to drink warm beer, by the seventeenth century it became relatively common for the rich across Europe to import ice (at vast expense) to their own personal ice houses, allowing them to serve fancy drinks at parties and the like and enjoy an unsalted pork roast in February. Ice was a symbol of luxury and status, which is presumably one of the reasons why ice sculptures are even today considered the pinnacle of class and fine living (that and the fact that they’re really, really cool). During the Georgian and Victorian eras, it was common practice for families going out for a day’s jolly (particularly in the colonies) to take an ice box of food with them, and there were even ice shops where the rich would go to buy high-quality, exceptionally clear ice for whatever party they happened to be hosting. But, by the end of the nineteenth century, that business would be long bust.

Y’see, in 1805 a man named Oliver Evans, who would later become known as ‘the father of refrigeration’, designed a device called the vapour-compression refrigeration machine. This is, basically, a series of tubes containing a stable coolant; the coolant is first compressed, then condenses (causing it to lose the heat it’s picked up; this is the vapour-compression bit), before going back inside and evaporating again thanks to a combination of pressure and temperature changes, thus allowing it to pick up heat once more. This rather convoluted evaporation/condensation procedure (first investigated by Benjamin Franklin and a chemistry professor called John Hadley half a century earlier) wasn’t actually the preferred solution for a few decades, since the earliest devices actually built were air-cycle systems that used air as a coolant and were thus only able to change its pressure rather than get it to liquefy. Regardless, it was soon realised that the vapour-compression system allows a device to control the transfer of heat far more efficiently, and it is now pretty much universally used in modern-day ‘heat pumps’ of all sorts. Incidentally, heat pumps are among the most efficient systems ever devised for heating or cooling a space, and nowadays they are increasingly used (run in reverse, of course) to heat houses, as they use far less energy than conventional methods of heating.
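To see why moving heat beats generating it, it helps to look at the theoretical ceiling: the ideal (Carnot) coefficient of performance for heating, COP = T_hot / (T_hot − T_cold), with temperatures in kelvin. A minimal sketch (the specific temperatures below are illustrative, not from any particular device; real heat pumps manage a COP of perhaps 3–4 rather than the ideal figure):

```python
# Ideal (Carnot) heating COP for a heat pump working between two temperatures.
# Temperatures are converted from Celsius to kelvin before the ratio is taken.

def carnot_cop_heating(t_hot_c: float, t_cold_c: float) -> float:
    """Upper bound on joules of heat delivered per joule of work input."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

# Example: warming a 20 °C house using 5 °C outdoor air.
cop = carnot_cop_heating(20.0, 5.0)
print(f"Ideal COP: {cop:.1f}")  # prints "Ideal COP: 19.5"
```

Even though real machines fall well short of this limit, anything with a COP above 1 is delivering more heat than the energy it consumes, which is why heat pumps beat resistive heating.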

But anyway; back to fridges. Evans never actually built a prototype of his design, but it was picked up and revised several times over the next seventy-odd years until the design was sufficiently advanced to be used in commercial ice makers, putting the old ice ‘manufacturers’ (who simply got their ice out of a convenient mountain lake or glacier) out of business, and by the early 20th century the devices got so good that they were able to liquefy air.

Surprisingly, it wasn’t until after this point that the modern science of refrigeration began to make it into our homes. It took until 1913 for a patent to be issued for a domestic refrigerator, and even that was just a way of keeping an existing ice box cool; it didn’t actually cool the interior of the fridge down. However, the following year the USA got the awesomely-named Kelvinator refrigerator, the first properly practical domestic fridge that held some 80% of the market by 1923. During the economic boom of the 1920s, fridges were among the many devices whose popularity exploded, and they gradually became bigger, sleeker, more practical and more efficient in the process. By the 1930s they’d even managed to find a coolant that wasn’t highly corrosive or toxic, which all seemed terribly fantastic in the days before most people knew what ‘CFCs’ and ‘the ozone layer’ were. By 1940 the idea of attaching a freezer (at a sub-zero temperature) to one’s fridge (which usually operates at about 3ºC) became commonplace, and since then most of the advancements in the field of domestic refrigeration have been limited to making fridges bigger, easier to clean (particularly with the introduction of injection-moulded plastic components), more energy-efficient and more of a middle-class fashion statement.

However, this does not mean that the science of refrigeration is slowing down: recently, a British company called Reaction Engines Ltd. demonstrated their prototype air-breathing rocket engine, whose key feature is a revolutionary new type of heat exchanger. Despite a design utilising pretty much exactly the same science you’d find at the back of your fridge at home, this heat exchanger is capable of dropping the temperature of incoming air from several hundred degrees to -150ºC in a hundredth of a second. That change in heat energy represents roughly the power output of a medium-sized power station, from a device that weighs significantly less than a hatchback. I would love to explain all the mechanics of this technology to you, but right now I wish for little more than to sit back and marvel.
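The power-station comparison is easy to sanity-check with a back-of-envelope calculation: heat removal rate ≈ mass flow × specific heat × temperature drop. The figures below are my own illustrative assumptions (a few hundred kg/s of airflow, a drop from roughly 1000 °C to −150 °C), not Reaction Engines’ published specifications:

```python
# Back-of-envelope estimate of the heat a precooler must dump:
# power = mass_flow * specific_heat * temperature_drop.

CP_AIR = 1005.0  # J/(kg*K), specific heat of air at constant pressure (approx.)

def cooling_power_mw(mass_flow_kg_s: float, t_in_c: float, t_out_c: float) -> float:
    """Heat removal rate in megawatts for a given airflow and temperature drop."""
    return mass_flow_kg_s * CP_AIR * (t_in_c - t_out_c) / 1e6

# Assumed figures: ~400 kg/s of air cooled from 1000 °C to -150 °C.
print(f"{cooling_power_mw(400.0, 1000.0, -150.0):.0f} MW")  # prints "462 MW"
```

A few hundred megawatts is indeed the territory of a medium-sized power station, which is what makes doing it in a device lighter than a car so remarkable.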

Time is an illusion, lunchtime doubly so…

In the dim and distant past, time was, to humankind, a thing and not much more. There was light-time, then there was dark-time, then there was another lot of light-time; during the day we could hunt, fight, eat and try to stay alive, and during the night we could sleep and have sex. However, we also realised that there were some parts of the year with short days and colder nights, and others that were warmer, brighter and better for hunting. Being the bright sort, we humans realised that the amount of time spent in winter, spring, summer and autumn (fall is the WRONG WORD) was about the same each time around, and thought that rather than just waiting for it to warm up every time, we could count how long one cycle (or year) took so that we could work out when it was going to get warm the next year. This enabled us to plan our hunting and farming patterns, and it became recognised that some knowledge of how the year worked was advantageous to a tribe. Eventually, this got so important that people started building monuments to the annual seasonal progression, hence such weird and staggeringly impressive prehistoric engineering achievements as Stonehenge.

However, this basic understanding of the year and the seasons was only one step on the journey, and as we moved from a hunter-gatherer paradigm to more of a civilised existence, we realised the benefits that a complete calendar could offer us, and thus began our still-continuing quest to quantify time. Nowadays our understanding of time extends to clocks accurate to the nanosecond, and an understanding of relativity, but for a long time our greatest venture into the realm of bringing organised time into our lives was the creation of the concept of the week.

Having seven days in the week is, to begin with, a strange idea; seven is an awkward prime number, and it seems odd that we didn’t pick a number that is easier to divide and multiply by, like six, eight or even ten, as the basis for our temporal system. Six would seem to make the most sense; most of our months have around 30 days, or five six-day weeks, and 365 days a year is only one less than a multiple of six, which could surely be some sort of religious symbolism (and on leap years it would be an exact multiple, even better). And it would mean a shorter week, and more time spent on the weekend, which would be really great. But no, we’re stuck with seven, and it’s all the bloody moon’s fault.

Y’see, the sun’s daily cycle is useful for measuring short-term time (night and day), and the earth’s orbit around it provides the crucial yearly change of season. However, the moon’s cycle is roughly 28 days long (fourteen to wax, fourteen to wane, regular as clockwork), providing a nice intermediary time unit with which to divide up the year into a more manageable number of pieces than 365. Thus, we began dividing the year up into ‘moons’ and using them as a convenient reference that we could refer to every night. However, even a moon cycle is a bit long for day-to-day scheduling, and it proved advantageous for our distant ancestors to split it up even further. Unfortunately, 28 is an awkward number to divide into pieces, and its only factors besides itself are 1, 2, 4, 7 and 14. An increment of 1 or 2 days is simply too small to be useful, and a 4-day ‘week’ isn’t much better. A 14-day week would hardly be an improvement on 28 for scheduling purposes, so seven is the only number of a practical size for the length of the week. The fact that months are now mostly 30 or 31 days rather than 28, to accommodate the awkward fact that there are 12.36 moon cycles in a year, hasn’t changed matters, so we’re stuck with an awkward 7-day cycle.
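The factor argument above is quick to verify for yourself, a trivial sketch:

```python
# List the divisors of a 28-day lunar "month" to see which week lengths
# would divide it evenly.

def divisors(n: int) -> list[int]:
    """All positive integers that divide n exactly."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(28))  # prints "[1, 2, 4, 7, 14, 28]"
# 1, 2 and 4 days are too short to be a useful scheduling unit;
# 14 and 28 days are too long. Seven is the only practical option.
```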

However, this wasn’t the end of the issue for the historic time-definers (for want of a better word); there’s not much advantage in defining a seven day week if you can’t then define which day of said week you want the crops to be planted on. Therefore, different days of the week needed names for identification purposes, and since astronomy had already provided our daily, weekly and yearly time structures it made sense to look skyward once again when searching for suitable names. At this time, centuries before the invention of the telescope, we only knew of seven planets, those celestial bodies that could be seen with the naked eye; the sun, the moon (yeah, their definition of ‘planet’ was a bit iffy), Mercury, Venus, Mars, Jupiter and Saturn. It might seem to make sense, with seven planets and seven days of the week, to just name the days after the planets in a random order, but humankind never does things so simply, and the process of picking which day got named after which planet was a complicated one.

In around 1000 BC the Egyptians had decided to divide the daylight into twelve hours (because they knew how to pick a nice, easy-to-divide number), and the Babylonians then took this a stage further by dividing the entire day, including night-time, into 24 hours. The Babylonians were also great astronomers, and had thus discovered the seven visible planets. However, because they were a bit weird, they decided that each planet had its place in a hierarchy, and that this hierarchy was dictated by which planet took the longest to complete its cycle and return to the same point in the sky. This order was, for the record: Saturn (29 years), Jupiter (12 years), Mars (687 days), Sun (365 days), Venus (225 days), Mercury (88 days) and Moon (28 days). So, did they name the days after the planets in this order? Of course not, that would be far too simple; instead, they decided to start naming the hours of the day after the planets (I did say they were a bit weird) in that order, going back to Saturn when they got to the Moon.

However, 24 hours does not divide evenly by seven planets, so the planet after which the first hour of the day was named changed each day. So, the first hour of the first day of the week was named after Saturn, the first hour of the second day after the Sun, and so on. Since the list repeated itself each week, the Babylonians decided to name each day after the planet the first hour of that day was named after, so we got Saturnday, Sunday, Moonday, Marsday, Mercuryday, Jupiterday and Venusday.
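The scheme above is mechanical enough to sketch in a few lines of code: cycle through the planets hour by hour in the Babylonians’ order, and name each day after the planet ruling its first hour. Since 24 mod 7 = 3, the starting planet jumps three places down the list each day:

```python
# Babylonian planetary hours: planets in descending order of cycle length.
PLANETS = ["Saturn", "Jupiter", "Mars", "Sun", "Venus", "Mercury", "Moon"]

def day_names(num_days: int = 7) -> list[str]:
    """Name each day after the planet ruling its first hour."""
    names = []
    hour = 0  # running hour count across the whole week
    for _ in range(num_days):
        names.append(PLANETS[hour % 7] + "day")
        hour += 24  # the next day's first hour is 24 planetary hours later
    return names

print(day_names())
# prints "['Saturnday', 'Sunday', 'Moonday', 'Marsday',
#          'Mercuryday', 'Jupiterday', 'Venusday']"
```

Reassuringly, the 3-place skip reproduces exactly the weekday order we still use, rather than the planets’ hierarchy order.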

Now, you may have noticed that these are not the days of the week we English speakers are exactly used to, and for that we can blame the Vikings. The planetary method for naming the days of the week was brought to Britain by the Romans, and when they left the Britons held on to the names. However, Britain then spent the next 7 centuries getting repeatedly invaded and conquered by various foreigners, and for most of that time it was the Germanic Vikings and Saxons who fought over the country. Both groups worshipped the same gods, those of Norse mythology (so Thor, Odin and so on), and one of the practices they introduced was to replace the names of four days of the week with those of four of their gods; Tyr’sday, Woden’sday (Woden was the Saxon word for Odin), Thor’sday and Frig’sday replaced Marsday, Mercuryday, Jupiterday and Venusday in England, and soon the fluctuating nature of language renamed the days of the week Saturday, Sunday, Monday, Tuesday, Wednesday, Thursday and Friday.

However, the old planetary names remained in the Romance languages (the French translations of the days Tuesday to Friday are mardi, mercredi, jeudi and vendredi), with one small exception. When the Roman Empire went Christian in the fourth century, the ten commandments dictated they remember the Sabbath day; but, to avoid copying the Jews (whose Sabbath was on Saturday), they chose to make Sunday the Sabbath day. It is for this reason that Monday, the first day of the working week after one’s day of rest, became the start of the week, taking over from the Babylonians’ choice of Saturday, but close to Rome they went one stage further and renamed Sunday ‘dies Dominica’, or Day of the Lord. The practice didn’t catch on in Britain, thousands of miles from Rome, but the modern-day Spanish, French and Italian words for Sunday are domingo, dimanche and domenica respectively, all of which are locally corrupted forms of ‘dies Dominica’.

This is one of those posts that doesn’t have a natural conclusion, or even much of a point to it. But hey; I didn’t start writing this because I wanted to make a point, but more to share the kind of stuff I find slightly interesting. Sorry if you didn’t.