Aurum Potestas Est

We as a race and a culture have a massive love affair with gold. It is the basis of our currency, the definitive mark of wealth and status, in some ways the bedrock of our society. We hoard it, we covet it, we hide it away except for special occasions, but we never really use it.

This is perhaps the strangest thing about gold; for something on which we have based our economy, it is remarkably useless. To be sure, gold has many advantageous properties; it is an excellent thermal and electrical conductor and is pretty easy to shape, leading it to be used widely in contacts for computing and on the engine cover of the McLaren F1 supercar. But other than these relatively minor uses, gold is something we keep safe rather than make use of; it has neither the ubiquity nor the usefulness of such metals as steel or copper. So why did we settle on the gold standard? Why not base our economy around iron, around copper, around praseodymium (a long shot, I will admit), something a bit more functional? What makes gold so special?

In part we can blame gold’s chemical nature; as a transition metal it is dense, durable and solid at room temperature, meaning it can be mined, extracted, transported and used with ease and without degrading or breaking too easily. It is also very malleable, meaning it can be shaped easily to form coins and jewellery; shaping into coins is especially important in order to standardise the weight of metal worth a particular amount. However, by far its most defining chemical feature is its reactivity; gold is very chemically stable in its pure, unionised, ‘native’ form, meaning it is unreactive, particularly with such common substances as oxygen and water; for this reason it is often referred to as a noble metal. This means gold is usually found native, making it easier to identify and mine, but it also means that gold products take millennia to oxidise and tarnish, if they do so at all. Therefore, gold holds its purity like no other chemical (shush, helium & co.), and this means it holds its value like nothing else. Even silver, another noble and comparatively precious metal, will blacken eventually and lose its perfection, but not gold. To an economist, gold is eternal, and this makes it the most stable and safe of all potential investments. Nothing can replace it and it is always a safe bet; a fine thing to base an economy on.

However, just as important as gold’s refusal to tarnish and thus protect its beauty is the simple presence of a beauty to protect. This can partly be put down to the uniqueness of its colour; in the world around us there are many greens, blues, blacks, browns and whites, as well as the odd purple. However, red and yellow are (fire and a few types of fish and flower excepted) comparatively rare, and only four chemical elements that we commonly come across are red or yellow in colour; phosphorus, sulphur, copper and gold. And rusty iron but… just no. Of the others, phosphorus is rather dangerous given its white allotrope’s propensity to burst into flames, is more commonly found as that boring old white form than as the red one, and is rather reactive, meaning it is rarely found in its reddish form anyway. Sulphur is also reactive, also burns and also readily forms compounds; but these compounds have the added bonus of stinking to high heaven. It is partly for this reason, and partly for the fact that it turns blood-red when molten, that brimstone (aka sulphur) is heavily associated with hell, punishment and general sinfulness in the Bible, and that it would be rather an unpopular choice to base an economy on. In any case, the two non-metals have none of the properties that the transition metals copper and gold do; those of being malleable, hard-wearing, high-melting, shiny and pwettiful. Gold edged out copper partly for its unreactivity as explored above (over time copper loses its reddish beauty and takes on a dull green patina), but also because of its deep, beautiful, lustrous finish. That beauty made it precious to us, made it something we desired and lusted after, and (combined with gold’s relative rarity, which could be an entire section of its own) made it valuable. This value allows relatively small amounts of gold to represent large quantities of worth, and justifies its use as coinage, bullion and an economic standard.

However, for me the key feature of gold’s place as our defining scale of value concerns its relative uselessness. Consider the following scenario; in the years preceding the birth of Christ, the technology, warfare and overall political situation of the day were governed by one material, bronze. It was used to make swords, armour, jewellery, the lot; until one day some smartarse figured out how to smelt iron. Iron was easier to work than bronze, allowing better stuff to be made, and with some skill it could be turned into steel. Steel was stronger as well as more malleable than bronze, and could be tempered to change its properties; over time, skilled metalsmiths even learned how to make the edge of a sword blade harder than the centre, making it better at cutting whilst the core absorbed the impact. This was all several hundred years in the future, but in the end the result was the same; bronze fell from grace and its societal value slumped. It is still around today, but it will never again enjoy its place as the metal that ruled the world.

Now, consider if that metal had been not bronze but gold. Something that had been ultra-precious, the king of all metals, would have been reduced to something merely valuable. It would have been trumped by iron, and iron would forever carry the connotation of being the better metal; gold’s value would have dropped. In any economic system, even a primitive one, having the substance around which your economy is based change in value would be catastrophic; when Mansa Musa travelled from Mali on a pilgrimage to Mecca, he stopped off in Cairo, then the home of the world’s foremost gold trade, and spent so much gold (gold the non-Malian world had never even known existed) that its price collapsed, and it took more than a decade for the Egyptian economy to recover. If gold were to have a purpose, it could be usurped; we might find something better, we might decide we don’t need it any more, and thus gold’s value, once supported by those wishing to buy it for that purpose, would drop. Gold is used so little that this simply doesn’t happen, making it the most economically stable of substances; it is valuable precisely and solely because we want it to be and, strange though it may seem, gold is always in fashion. Economically as well as chemically, gold is uniquely stable- the perfect choice on which to base a global economy.

Getting bored with history lessons

Last post’s investigation into the post-Babbage history of computers took us up to around the end of the Second World War, before the computer age could really be said to have kicked off. However, with the coming of Alan Turing the biggest stumbling block for the intellectual development of computing as a science had been overcome, since it was now clearly understood what computing was and where it was going. From then on, therefore, the history of computing is basically one long series of hardware improvements and business successes, and the only thing of real scholarly interest is Moore’s law. This law is an unofficial, yet surprisingly accurate, model of the exponential growth in the capabilities of computer hardware, stating that every 18 months computing hardware gets either twice as powerful, half the size, or half the price for the same specification. The law is based on a 1965 paper by Gordon E. Moore, who noted that the number of transistors on integrated circuits had been doubling roughly every year since their invention some seven years earlier, a rate he later revised to a doubling every two years. The modern-day figure of an 18-monthly doubling in performance comes from an Intel executive’s estimate based on transistors both becoming more numerous and getting faster and more efficient… but I’m getting sidetracked. The point I meant to make was that there is no point in me continuing with a potted history of the last 70 years of computing, so in this post I wish to get on with the business of exactly how, fundamentally speaking, computers work.
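To get a feel for what an 18-monthly doubling actually implies, here is a minimal Python sketch of the arithmetic. The 2,300-transistor starting point (roughly the count in the very first commercial microprocessor of 1971) and the time spans are just illustrative assumptions for the example, and a fixed 18-month rate will overshoot how transistor counts actually grew, which tracked the two-yearly figure more closely.

```python
# A toy projection of exponential growth under an 18-month doubling period.
# The 2,300-transistor starting point and the spans below are illustrative
# assumptions, not figures taken from Moore's paper.

def projected_transistors(years, start=2_300, doubling_period=1.5):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for years in (10, 20, 30, 40):
    print(f"After {years} years: ~{projected_transistors(years):,.0f} transistors")
```

Even starting from a few thousand transistors, forty years of that compounding lands in the hundreds of billions, which is why the exponential nature of the law, rather than the precise doubling period, is the interesting bit.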

A modern computer is, basically, a huge bundle of switches- literally billions of the things. Normal switches are obviously not up to the job, being both far too large and requiring a mechanical rather than purely electrical input to operate, so computer designers have had to come up with electrically-activated switches instead. In Colossus’ day they used vacuum tubes, but these were large and prone to breaking so, in the late 1940s, the transistor was invented. This is a marvellous semiconductor-based device, but to explain how it works I’m going to have to go on a bit of a tangent.

Semiconductors are materials that do not conduct electricity freely and every which way like a metal, but do not insulate like wood or plastic either- sometimes they conduct, sometimes they don’t. In modern computing and electronics, silicon is the substance most commonly used for this purpose. For use in a transistor, silicon (an element with four electrons in its outer atomic ‘shell’) must be ‘doped’ with other elements, meaning that they are ‘mixed’ into the chemical, crystalline structure of the silicon. Doping with a substance such as boron, with three electrons in its outer shell, creates an area with a ‘missing’ electron, known as a hole. Holes have, effectively, a positive charge compared to a ‘normal’ area of silicon (since electrons are negatively charged), so this kind of doping produces what is known as p-type silicon. Similarly, doping with something like phosphorus, with five outer-shell electrons, produces an excess of negatively-charged electrons and n-type silicon. Thus electrons, and therefore electricity (made up entirely of the net movement of electrons from one area to another), find it easy to flow from n- to p-type silicon, but not very well going the other way- the junction conducts in one direction and insulates in the other, hence a semiconductor. However, it is vital to remember that p-type silicon is not an insulator and does allow the free passage of electrons, unlike pure, undoped silicon. A transistor generally consists of three layers of silicon sandwiched together, in the order NPN or PNP depending on the practicality of the situation, with each layer of the sandwich having a metal contact or ‘leg’ attached to it- the leg in the middle is called the base, and the ones at either side are called the emitter and collector.
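That one-way behaviour can be caricatured in a few lines of code. This is a toy sketch rather than real device physics: the 0.7 V figure is just the rough forward voltage of a silicon junction, and the function name is my own invention for illustration.

```python
# Toy model of a silicon p-n junction as a one-way valve for current.
# Real junctions have a smooth, exponential current-voltage curve; this only
# captures "conducts one way, insulates the other".

SILICON_FORWARD_VOLTAGE = 0.7  # rough voltage needed to forward-bias silicon

def junction_conducts(v_p: float, v_n: float) -> bool:
    """True if the p-side sits far enough above the n-side for current to flow."""
    return (v_p - v_n) >= SILICON_FORWARD_VOLTAGE

print(junction_conducts(v_p=5.0, v_n=0.0))  # True: forward biased, current flows
print(junction_conducts(v_p=0.0, v_n=5.0))  # False: reverse biased, it insulates
```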

Now, when the three layers of silicon are stuck next to one another, some of the free electrons in the n-type layer(s) jump across to fill the holes in the adjacent p-type, creating areas of neutral, or zero, charge. These are called ‘depletion zones’ and are good insulators, meaning that there is a high electrical resistance across the transistor and that a current cannot flow between the emitter and collector, despite there usually being a voltage ‘drop’ between them that is trying to get a current flowing. However, when a small voltage is applied between the base and the emitter, a current can flow across that junction without a problem, and as such it does. This pulls electrons across the border between layers and shrinks the depletion zones, lowering the electrical resistance across the transistor and allowing a much larger current to flow between the collector and emitter. In short, one current can be used to ‘turn on’ another.
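Treated purely as a switch, that idea is easy to sketch in code. The names and the 0.7 V threshold below are illustrative assumptions rather than anything standardised; the point is simply that a small base signal gates a big collector-emitter current, and that wiring two such switches in series (with a pull-up on the output) already gives you a NAND gate, the kind of building block that, repeated billions of times, becomes a computer.

```python
# A transistor reduced to its digital essence: a small base-emitter voltage
# permits (or blocks) the large collector-emitter current.
# The 0.7 V threshold and 5 V logic levels are illustrative assumptions.

BASE_THRESHOLD = 0.7  # rough base-emitter voltage needed to switch an NPN 'on'

def npn_switch(base_voltage: float) -> bool:
    """True if the collector-emitter path conducts (the switch is 'on')."""
    return base_voltage >= BASE_THRESHOLD

def nand_gate(a: float, b: float) -> bool:
    """Two NPN switches in series pulling the output low: a crude NAND gate.
    The output stays high unless BOTH inputs switch their transistor on."""
    return not (npn_switch(a) and npn_switch(b))

# 5 V stands in for logic 'high', 0 V for logic 'low':
for a, b in [(0.0, 0.0), (0.0, 5.0), (5.0, 0.0), (5.0, 5.0)]:
    state = 'high' if nand_gate(a, b) else 'low'
    print(f"A={a} V, B={b} V  ->  NAND output {state}")
```

NAND is a handy choice for the example because it is functionally complete: every other logic gate, and hence every adder, register and processor, can in principle be built from nothing but NANDs.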

Transistor radios use this principle to amplify the signal they receive into a loud, clear sound, and if you crack one open you should be able to see some transistors (well, if you know what you’re looking for). However, computer and manufacturing technology has got so advanced over the last 50 years that it is now possible to fit billions of these transistor switches onto a silicon chip the size of your thumbnail- and bear in mind that the entire Colossus machine, the machine that cracked the Lorenz cipher, contained only a couple of thousand or so vacuum tube switches all told. Modern technology is a wonderful thing, and the sheer achievement behind it is worth bearing in mind next time you get shocked at the price of a new computer (unless you’re buying an Apple- that’s just business elitism).

…and dammit, I’ve filled up a whole post again without getting onto what I really wanted to talk about. Ah well, there’s always next time…

(In which I promise to actually get on with talking about computers)