What we know and what we understand are two very different things…

If the whole Y2K debacle over a decade ago taught us anything, it was that the vast majority of the population did not understand the little plastic boxes known as computers that were rapidly filling up their homes. Nothing especially wrong or unusual about this- there are a lot of things that only a few nerds understand properly, an awful lot of other stuff in our lives to understand, and in any case the personal computer had only just started to become commonplace. However, over 12 and a half years later, the general understanding of most of us does not appear to have increased to any significant degree, and we remain largely ignorant of these little feats of electronic witchcraft. Oh sure, we can operate them (most of us, anyway), and we know roughly what they do, but as to exactly how they operate, precisely how they carry out their tasks? Sorry, not a clue.

This is largely understandable, particularly given what ‘understanding’ means when it comes to computers. Computers are a rare example of a complex system that an expert is genuinely capable of understanding in minute detail: every single aspect of the system’s workings, what each part does, why it is there, and why it is (or, in some cases, shouldn’t be) constructed to that particular specification. To understand a computer in its entirety is therefore an equally complex job, and this is one very good reason why computer nerds tend to be a quite solitary bunch, with few links to the rest of us and, indeed, the outside world at large.

One person who does not understand computers very well is me, despite the fact that I have been using them, in one form or another, for as long as I can comfortably remember. Over this summer, however, I had quite a lot of free time on my hands, and part of that time was spent finally relenting to the badgering of a friend and having a go with Linux (Ubuntu, if you really want to know) for the first time. Since I like to do my background research before getting stuck into any project, this meant quite a lot of reading into the hows and whys of its installation, which in turn brought with it quite a lot of information about how my computer works generally. I thought, then, that I might spend the next couple of posts or so detailing some of what I learned, building up a picture of a computer’s functioning from the ground up, and starting with a bit of a history lesson…

‘Computer’ was originally a job title, the job itself being akin to accountancy without the imagination. A computer was a number-cruncher, a supposedly infallible data-processing machine employed to perform tasks ranging from astronomical prediction to calculating interest. The job was a fairly good one, anyone clever enough to land it probably doing well by the standards of his age, but the output wasn’t. The human brain is not built for infallibility and would, not infrequently, make mistakes. Most of these undoubtedly went unnoticed, or at least rarely caused significant harm, but the system was nonetheless inefficient. Abacuses, log tables and slide rules all aided arithmetic manipulation to a great degree in their respective fields, but true infallibility was unachievable whilst still reliant on the human mind.

Enter Blaise Pascal, 17th century mathematician and pioneer of probability theory (among other things), who invented the mechanical calculator in 1642, aged just 19. His original design wasn’t much more than a counting machine, a sequence of cogs and wheels so constructed as to be able to count and carry between units, tens, hundreds and so on (ie a turn of 4 spaces on the ‘units’ cog whilst a seven was already counted would bring up eleven), and it could work with currency denominations and distances too. However, it could also subtract, multiply and divide (with some difficulty), and moreover it proved an important point- that a mechanical machine could cut out the human error factor and reduce any inaccuracy to one of simply entering the wrong number.
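If it helps to see that carrying in more familiar terms, here is a minimal software sketch of the same idea- my own loose analogy, not a description of Pascal’s actual gearing- in which each ‘wheel’ holds a digit from 0 to 9 and, on rolling past nine, nudges its neighbour along by one:

```python
class Pascaline:
    """Loose software analogy of a Pascaline-style decimal counter.

    Each 'wheel' holds a digit 0-9; turning a wheel past 9 carries one
    step onto the next wheel, much as the cogs cascade on the machine.
    """

    def __init__(self, wheels=6):
        self.digits = [0] * wheels  # digits[0] is the units wheel

    def turn(self, wheel, steps):
        """Advance one wheel by a number of steps, propagating any carries."""
        carry, self.digits[wheel] = divmod(self.digits[wheel] + steps, 10)
        while carry and wheel + 1 < len(self.digits):
            wheel += 1
            carry, self.digits[wheel] = divmod(self.digits[wheel] + carry, 10)

    def value(self):
        return int("".join(str(d) for d in reversed(self.digits)))


machine = Pascaline()
machine.turn(0, 7)      # count seven on the units wheel
machine.turn(0, 4)      # four more spaces: the carry brings up eleven
print(machine.value())  # 11
```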

Pascal’s machine was both expensive and complicated, meaning only twenty were ever made, but his was the only working mechanical calculator of the 17th century. Several, of a range of designs, were built during the 18th century as showpieces, but by the 19th the release of Thomas de Colmar’s Arithmometer, after 30 years of development, signified the birth of an industry. It wasn’t a large one, since the machines were still expensive and only of limited use, but de Colmar’s machine was the simplest and most reliable model yet. Around 3,000 mechanical calculators, of various designs and manufacturers, had been sold by 1890, but by then the field had been given an unexpected shake-up.

Just two years after de Colmar had first patented his pre-development Arithmometer, an Englishman by the name of Charles Babbage showed an interesting-looking pile of brass to a few friends and associates- a small assembly of cogs and wheels that he said was merely a precursor to the design of a far larger machine: his difference engine. The mathematical workings of his design were based on Newton polynomials, a fiddly bit of maths that I won’t even pretend to understand, but one that could be used to closely approximate logarithmic and trigonometric functions. However, what made the difference engine special was that the initial setup of the device, the positions of the various columns and so forth, determined what function the machine performed. This was more than just a simple device for adding up; this was beginning to look like a programmable computer.
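The trick that let such a machine get away with nothing but addition can be sketched in a few lines. This is only my own illustration of the method of differences, using a made-up example polynomial, not a model of Babbage’s actual hardware: once the starting value of a polynomial and its finite differences are set up on the columns, every subsequent value falls out of repeated addition alone.

```python
def difference_engine(initial_column, steps):
    """Tabulate a polynomial by repeated addition (the method of differences).

    initial_column holds f(0) followed by its finite differences, one value
    per 'column' of the engine; each turn of the crank adds each column into
    the one above it, producing the next value of f with no multiplication.
    """
    cols = list(initial_column)
    table = [cols[0]]
    for _ in range(steps):
        # work top-down so each column picks up its neighbour's old value
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
        table.append(cols[0])
    return table


# Example: f(x) = x**2 + x + 41, so f(0) = 41, first difference 2, second difference 2
print(difference_engine([41, 2, 2], 5))  # [41, 43, 47, 53, 61, 71]
```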

Babbage’s machine was not the all-conquering revolutionary design the hype about it might have you believe. Babbage was commissioned to build one by the British government for military purposes, but since he was often brash (once claiming that he could not fathom the idiocy of the mind that would think up a question an MP had just asked him) and prized academia above fiscal matters and practicality, the project fell through. The government had invested £17,000 in his machine before realising that he had switched to working on a new and improved design, known as the analytical engine, whereupon they pulled the plug and the machine never got made. Neither did the analytical engine, which is a crying shame; this was the first true computer design, with separate inputs for data and for the required program (which could be a lot more complicated than just adding or subtracting), and an integrated memory system. It could even print results on one of three printers, in what could be considered the first human interfacing system (akin to a modern-day monitor), and had ‘control flow systems’ incorporated to ensure that programs were carried out in the correct order. We may never know whether Babbage’s analytical engine would have worked, since it has never been built, but a later model of his difference engine was built for the London Science Museum in 1991, yielding accurate results to 31 decimal places.

…and I appear to have run on a bit further than intended. No matter- my next post will continue this journey down the history of the computer, and we’ll see if I can get onto any actual explanation of how the things work.

The Conquest of Air

Everybody in the USA, and in fact just about everyone across the world, has heard of Orville and Wilbur Wright. Two of the pioneers of aviation, the brothers finally realised one of man’s long-held dreams- control and mastery of air travel- when their experimental biplane Flyer achieved the first ever manned, powered, heavier-than-air flight on the morning of December 17, 1903.

However, what is often puzzling when considering the Wright brothers’ story is the number of misconceptions surrounding them. Many, for instance, are under the impression that they were the first people to fly at all, inventing all the various technicalities of lift, aerofoil structures and control that are now commonplace in today’s aircraft. In fact, the story of flight, perhaps the oldest and maddest of human ambitions, an idea rekindled every time someone has looked up in wonder at the graceful flight of a bird, is a good deal older than either of them.

Our story begins, as does nearly all technological innovation, in imperial China, around 300 BC (the Greek scholar Archytas had admittedly made a model wooden pigeon ‘fly’ some 100 years previously, but nobody is sure exactly how he managed it). China’s first contribution was the invention of the kite, an innovation that would be insignificant if it wasn’t for whichever nutter decided to build one big enough to fly in. However, being strapped inside a giant kite and sent hurtling skywards not only took some balls, but was heavily dependent on wind conditions, heinously dangerous and of dubious usefulness, so in the end the Chinese gave up on manned flight and turned instead to unmanned ballooning, which they used for both military signalling and ceremonial purposes. It isn’t actually known whether they ever successfully put a man into the air using a kite, but they almost certainly gave it a go. The Chinese did have one further attempt, this time at inventing the rocket engine, some years later, in which a young and presumably mental man theorised that if you strapped enough fireworks to a chair then they would send the chair and its occupant hurtling into the night sky. His prototype (predictably) exploded, and it would be the best part of two millennia, after the passage of classical civilisation, the Dark Ages and the Renaissance, before anyone tried flight again.

That is not to say that the idea didn’t stick around. The science was, admittedly, beyond most people, but as early as 1500 Leonardo da Vinci, after close examination of bird wings, had successfully deduced the principle of lift and made several sketches showing designs for a manned glider. The design was never tested, and was not fully rediscovered until many hundreds of years after his death (da Vinci was not only a controversial figure and far ahead of his time, but wrote his notebooks in a code that took centuries to decipher), but modern-day experiments have shown that it would probably have worked. Da Vinci also put forward the popular idea of ornithopters, aircraft powered by a flapping motion as in bird wings, and many subsequent attempts at flight sought to emulate this method of motion. Needless to say, they all failed (not least because very few of the inventors concerned actually understood aerodynamics).

In fact, it wasn’t until the late 18th century that anyone started to make any real headway in the pursuit of flight. In 1783, a Parisian physics professor, Jacques Charles, built on the work of several Englishmen concerning the newly discovered hydrogen gas and the properties and behaviour of gases themselves. Theorising that, since hydrogen was less dense than air, it should follow Archimedes’ principle of buoyancy and rise, thus enabling it to lift a balloon, he launched the world’s first hydrogen balloon from the Champ de Mars on August 27th. The balloon was only small, and there were significant difficulties encountered in building it, but in the design process Charles, aided by his engineers the Robert brothers, invented a method of treating silk to make it airtight, paving the way for future pioneers of aviation. Whilst Charles made some significant headway in the launch of ever-larger hydrogen balloons, he was beaten to the next significant milestones by the Montgolfier brothers, Joseph-Michel and Jacques-Etienne. In that same year, their far simpler hot-air balloon designs not only put the first living things (a sheep, a rooster and a duck) into the atmosphere, but, just a month later, a human too- Jacques-Etienne was the first European, and probably the first human, ever to fly.
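To put a rough (and entirely modern) number on the principle Charles was exploiting: the lift available is simply the weight of the air displaced minus the weight of the hydrogen doing the displacing. The densities and balloon size below are my own illustrative round figures, not those of his actual design:

```python
# Archimedes' principle applied to a gas balloon: net lift is the weight of
# the displaced air minus the weight of the lifting gas inside the envelope.
RHO_AIR = 1.225       # kg per cubic metre, roughly, at sea level (modern figure)
RHO_HYDROGEN = 0.090  # kg per cubic metre, roughly (modern figure)
volume = 35.0         # cubic metres -- an invented round number for illustration

net_lift_kg = (RHO_AIR - RHO_HYDROGEN) * volume
print(f"A {volume:.0f} cubic metre hydrogen balloon can lift about "
      f"{net_lift_kg:.0f} kg of envelope, basket and payload.")
```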

After that, balloon technology took off rapidly (no pun intended). The French quickly became masters of the air, being the first to cross the English Channel and the creators of the first steerable and powered balloon flights. With Charles’ hydrogen balloons eventually settled on as the preferred method of flight, blimps and airships began, over the next century or so, to become an accepted method of travel, and would remain so right up until the Hindenburg disaster of 1937, which rather put people off the idea. For some scientists and engineers, humankind had made it- we could now fly, and could control where we were going at least partially independently of the elements, and any attempt to do the same with a heavier-than-air machine was a waste of both time and money, the preserve of dreamers. Nonetheless, to change the world you sometimes have to dream big, and that was where Sir George Cayley came in.

Cayley was an aristocratic Yorkshireman, a skilled engineer and inventor, and a magnanimous, generous man- he offered all of his inventions for the public good and expected no payment for them. He dabbled in a number of fields, including seatbelts, lifeboats, caterpillar tracks, prosthetics, ballistics and railway signalling. In the course of his work on flight, he even reinvented the wheel- he developed the idea of holding a wheel in place using thin metal spokes under tension rather than solid ones under compression, in an effort to make his wheels lighter, and is thus responsible for making all modern bicycles practical to use. However, he is most famous for being the first man ever, in 1853, to put somebody into the air using a heavier-than-air glider (although Cayley may have put a ten-year-old boy aloft in an earlier machine four years before that).

The man in question was Cayley’s coachman (or butler- historical sources differ widely), who was (perhaps understandably) so hesitant to go up in his boss’ mental contraption that he handed in his notice upon landing after his flight across Brompton Dale, stating as his reason that ‘I was hired to drive, not fly’. Nonetheless, Cayley had shown that the impossible could be done- man could fly using just wings and wheels. He had also designed the aerofoil from scratch, identified the forces of thrust, lift, weight and drag that govern an aircraft’s movements, and paved the way for the true pioneer of ‘heavy’ flight- Otto Lilienthal.

Lilienthal (aka ‘The Glider King’) was another engineer, filing 25 patents in his life, including a revolutionary new engine design. But his fame comes from a world without engines- the world of the sky, with which he was obsessed. He was just a boy when he first strapped wings to his arms in an effort to fly (which, obviously, failed completely), and he later published works detailing the physics of bird flight. It wasn’t until 1891, aged 43, once his career and financial position were stable and he had finished fighting in the Franco-Prussian War, that he began to fly in earnest, building around 12 gliders over a 5-year period (of which 6 still survive). It might have taken him a while to get started, but once he did there was no stopping him: he made over 2000 flights in just 5 years (averaging more than one every day). During this time he was only able to rack up 5 hours of flight time (meaning his average flight lasted just 9 seconds), but his contribution to his field was enormous. He was the first to be able to control and manoeuvre his machines by varying his position and weight distribution, a factor whose importance he realised was absolutely paramount, and he also recognised that powered flight (a pursuit that had been proceeding largely unsuccessfully for the past 50 years) could not properly be achieved without a grounding in unpowered glider flight- without learning to work in harmony with aerodynamic forces.

Tragically, one of Lilienthal’s gliders crashed in 1896, and he died after two days in hospital. But his work lived on, and the story of his exploits and his death reached across the world, including to a pair of brothers living in Dayton, Ohio, USA, by the name of Wright. Together, the Wright brothers made huge innovations- they redesigned the aerofoil to be more efficient, revolutionised aircraft control using wing-warping technology (another idea possibly invented by da Vinci), conducted hours of testing in their own wind tunnel, built dozens of test gliders and brought together the work of Cayley, Lilienthal, da Vinci and a host of other, mostly sadly dead, pioneers of the air. The Wright brothers are undoubtedly the conquerors of the air, being the first to show that man need not be constrained by either gravity or wind, but can use the air as a medium of travel unlike any other. But the credit is not theirs alone- it is a credit shared between all those who have lived and died in pursuit of the dream of flying like birds. To quote Lilienthal’s dying words, as he lay crippled by mortal injuries from his crash: ‘Sacrifices must be made’.