The Hairy Ones

My last post on the subject of music history covered the relatively short timespan between around 1950 and 1965, leaving off at about the time The Beatles began leading the ‘British Invasion’ of American music culture. This invasion was a confluence of a whole host of factors: a fresh generation of youths wishing to identify with something new as ‘theirs’, different to their parents’; a British music scene that had been influenced by the American one without being so immersed in it as to stifle its ability to innovate and make a good sound; and the fact that said generation of youngsters were the first to grow up around guitar music, and thus the first to learn to play guitars and other genre-defining instruments en masse. Plus, some seriously good musicians in there. However, the British Invasion was only the first of a multi-part wave of insane musical experimentation and innovation, flooding the market with new ideas and spawning, in the space of less than a decade, almost every genre in existence today. And for the cause of much of part two, we must backtrack a little, to 1955.

Y’see, at the end of the Second World War Japan, until then the dominant East Asian power, had surrendered unconditionally to the Allies, leaving no dominant force in the region. This created something of a power vacuum, with a host of new governments trying to rise from the post-war chaos and establish themselves as such a power. Many of these new states, including China, Cambodia, North Korea and North Vietnam, were Communist, and therefore a serious concern to the western world. The US in particular, as a fiercely capitalist power, was deeply worried by the prospect of the whole of South East Asia amalgamating, as the ‘domino theory’ of the day predicted, into another great communist bloc, landing them with next to zero chance of triumphing in their ‘battle against communism’ against the already hugely powerful Soviet Union. As such, they were hell-bent on preserving every ounce of capitalist democracy they could in the area, and were prepared to defend such governments with as much force as necessary. In 1950 they had already gone to war in Korea to repel the communist north’s invasion of the capitalist south, with the practical upshot (after China joined in) of re-establishing the border pretty much exactly where it had been to start with and creating a state of war that, officially, has yet to end. In 1955, a similar situation was developing in Vietnam, and President Dwight D. Eisenhower once again committed American forces, though at first only in the form of military advisors- the start of an involvement that would only deepen.

Cut to ten years later, and the war was still going on. Once a crusade against the onward-marching forces of communism, it had just dragged on and on, its only tangible result a steady stream of dead and injured servicemen fighting a war that many, especially the young who had not grown up with the degree of Commie-hating their parents had, now considered futile and stupid. Also related to the Red Scare was the government’s allowing of capitalist corporations to run riot, ramping up their marketing and the consumer-saturation of America. This might have led to a 15-year economic boom, but again many of the younger generation were getting sick of it all. All of this, combined with a natural teenage predisposition to do exactly what their parents don’t want them to, led to a new, reactionary counter-culture that provided an impetus for a whole wave of musical experimentation: hippies.

The hippie movement (the word is, strangely, derived from ‘hipster’) was centred around pacifism, freedom of love and sex (hence ‘make love, not war’), an appreciation of the homemade and the natural over the plastic and capitalist, and drug use. The movement exists to this day, but it was most prevalent in the late 60s, when it took the American youth by storm. Hippies protested on a huge variety of issues, ranging from booing returning soldiers and more general anti-war stuff (they were also dubbed ‘flower children’ for their practice of giving flowers to police officers at such demonstrations) to demonstrations against the banning of LSD, or ‘acid’, one of their more commonly used drugs. This movement of wired, eco-centric vegetarians didn’t connect well with the relatively fresh, clean tones of rock & roll and The Beatles, and inspired new music based around psychedelic experiences and the ‘appreciation’ of drug use. It was in this vein that The Beatles recorded ‘Lucy in the Sky with Diamonds’, and why Jimi Hendrix and Janis Joplin rose to fame in a new genre known as ‘acid rock’ (named after the drug by which most of the lyrics were ‘inspired’). Characterised by long, confusing and hideously difficult solos (I’m looking at you, Hendrix), this was the prominent genre on show at the infamous Woodstock festival of 1969, featuring Hendrix, Joplin, The Who, The Grateful Dead and Carlos Santana among others. Woodstock was the high point of the hippie movement, with over half a million fans attending to smoke, listen to the music, skinny dip and make love in and around the lake, and generally be as hippie as possible.

Hippie culture went downhill post-Woodstock; public outcry following the Altamont Free Concert near San Francisco (where Hells Angels provided security and stabbed a concert-goer during The Rolling Stones’ set for brandishing a gun) coincided with ‘the hippie generation’ mostly growing up. The movement still exists today, and its legacy in terms of public attitudes to sexual freedom, pacifism and general tolerance (hippies were big on civil rights and respect for the LGBT community) is certainly considerable. But their contribution to the musical world is almost as massive; acid rock was a key driving force behind the development of folk rock (think Noah and the Whale) and heavy metal (which borrowed from Hendrix’s style of heavy guitar playing). Most importantly, the sheer centrality of music to hippie culture definitively established that the practice of everyone, even the lowliest, ‘commonest’ people, buying, listening to, sharing and, most importantly, making music themselves was here to stay.

The story of the hippies covers just one of the musical families spawned out of the late 60s. The wave of kids growing up with guitars and the idea that they could make their own music, could be the next big thing, with no preconceived ideas, resulted in a myriad of different styles and genres that form the roots of every style of modern rock music. This period is known as ‘the golden age of rock’ for a reason; before pop was big, before hip-hop, before rap, decades before dubstep, before even punk rock (born in the early seventies and disliked by many serious music nerds for being unimaginative and stupid), rock music ruled and rock music blossomed.

You could argue that this, then, marks the end of the story of rock, and that the rest of the tale is just one long spiral downwards- that once the golden age ended, everything since is just one long, depressing decline. Well, I certainly don’t like to think of that as true (if only because I would rather not adopt a mindset that makes me stop listening to music), but even if it were, there is a hell of a lot of stuff left in this story. Over? Not for another post or two…

What we know and what we understand are two very different things…

If the whole Y2K debacle over a decade ago taught us anything, it was that the vast majority of the population did not understand the little plastic boxes known as computers that were rapidly filling up their homes. Nothing especially wrong or unusual about this- there are a lot of things that only a few nerds understand properly, an awful lot of other stuff in our lives to understand, and in any case the personal computer had only just started to become commonplace. However, over twelve and a half years later, the general understanding of a lot of us does not appear to have increased to any significant degree, and we remain largely ignorant of these little feats of electronic witchcraft. Oh sure, we can work and operate them (most of us, anyway), and we know roughly what they do, but as to exactly how they operate, precisely how they carry out their tasks? Sorry, not a clue.
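
(As an aside, the bug at the heart of that debacle was mundane: to save what was then precious memory, a lot of old software stored years as just two digits, so the rollover from ‘99’ to ‘00’ broke any arithmetic that assumed years only ever counted upwards. A minimal sketch- my own illustration, not real Y2K-era code:)

```python
# A deliberately naive illustration (my own, not real Y2K-era code) of
# the two-digit-year storage that caused all the trouble.
def age_in_years(birth_yy, current_yy):
    # Assumes years only ever count upwards within one century...
    return current_yy - birth_yy

print(age_in_years(65, 99))  # 34: fine all the way through the 1900s
print(age_in_years(65, 0))   # -65: the year '00' breaks the assumption
```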

This is largely understandable, particularly given the meaning of ‘understand’ that applies in computer-based situations. Computers are a rare example of a complex system of which an expert is genuinely capable of understanding every single aspect in minute detail: what each part does, why it is there, and why it is (or, in some cases, shouldn’t be) constructed to that particular specification. To understand a computer in its entirety is therefore an equally complex job, and this is one very good reason why computer nerds tend to be a quite solitary bunch, with few links to the rest of us and, indeed, the outside world at large.

One person who does not understand computers very well is me, despite the fact that I have been using them, in one form or another, for as long as I can comfortably remember. Over this summer, however, I had quite a lot of free time on my hands, and part of that time was spent finally relenting to the badgering of a friend and having a go with Linux (Ubuntu, if you really want to know) for the first time. Since I like to do my background research before getting stuck into any project, this necessitated quite some research into the hows and whys of installation, along with which came quite a lot of info about the workings of my computer generally. I thought, then, that I might spend the next couple of posts detailing some of what I learned, building up a picture of a computer’s functioning from the ground up, and starting with a bit of a history lesson…

‘Computer’ was originally a job title, the job itself being akin to accountancy without the imagination. A computer was a number-cruncher, a supposedly infallible data-processing machine employed to perform tasks ranging from astronomical prediction to calculating interest. The job was a fairly good one, anyone clever enough to land it probably doing well by the standards of his age, but the output wasn’t so reliable. The human brain is not built for infallibility and, not infrequently, these computers would make mistakes. Most errors undoubtedly went unnoticed, or at least rarely caused significant harm, but the system was nonetheless inefficient. Abacuses, log tables and slide rules all aided arithmetic manipulation to a great degree in their respective fields, but true infallibility was unachievable whilst the process was still reliant on the human mind.

Enter Blaise Pascal, 17th-century mathematician and pioneer of probability theory (among other things), who invented the mechanical calculator in 1642, aged just 19. His original design wasn’t much more than a counting machine, a sequence of cogs and wheels so constructed as to be able to count and carry between units, tens, hundreds and so on (ie a turn of 4 spaces on the ‘units’ cog whilst a seven was already counted would bring up eleven), and to work with currency denominations and distances too. It could also subtract, multiply and divide (with some difficulty), and moreover proved an important point: that a mechanical machine could cut out the human error factor and reduce any inaccuracy to one of simply entering the wrong number.
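
(If you fancy seeing that principle in action, here’s a toy model of the carrying mechanism- my own simplification, ignoring the mixed bases Pascal’s real machine used for currency and distance:)

```python
# A toy model of the carrying mechanism in a Pascaline-style adder.
# Each 'cog' holds a digit 0-9; turning one a full revolution past 9
# nudges its left-hand neighbour on by one space.

class CogRegister:
    def __init__(self, wheels=6):
        self.cogs = [0] * wheels  # cogs[0] is the units wheel

    def advance(self, wheel, spaces):
        """Turn one wheel forward, propagating carries leftwards."""
        for _ in range(spaces):
            i = wheel
            self.cogs[i] += 1
            while self.cogs[i] == 10:   # a full revolution...
                self.cogs[i] = 0
                if i + 1 == len(self.cogs):
                    break               # overflow: the machine just wraps
                i += 1
                self.cogs[i] += 1       # ...carries one to the next wheel

    def value(self):
        return sum(d * 10 ** i for i, d in enumerate(self.cogs))

reg = CogRegister()
reg.advance(0, 7)   # count a seven on the units cog...
reg.advance(0, 4)   # ...then turn it four more spaces
print(reg.value())  # 11, the carry handled purely mechanically
```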

Pascal’s machine was both expensive and complicated, meaning only around twenty were ever made, but his was the only working mechanical calculator of the 17th century. Several, of a range of designs, were built during the 18th century as showpieces, but by the 19th the release of Thomas de Colmar’s Arithmometer, after 30 years of development, signified the birth of an industry. It wasn’t a large one, since the machines were still expensive and only of limited use, but de Colmar’s machine was the simplest and most reliable model yet. Around 3,000 mechanical calculators, of various designs and manufacturers, had been sold by 1890, but by then the field had been given an unexpected shake-up.

Just two years after de Colmar had first patented his pre-development Arithmometer, an Englishman by the name of Charles Babbage showed an interesting-looking pile of brass to a few friends and associates- a small assembly of cogs and wheels that he said was merely a precursor to a far larger machine: his difference engine. The mathematical workings of his design were based on Newton polynomials and the method of finite differences, a fiddly bit of maths that I won’t even pretend to fully understand, but one that could be used to closely approximate logarithmic and trigonometric functions. However, what made the difference engine special was that the original setup of the device, the positions of the various columns and so forth, determined what function the machine performed. This was more than just a simple device for adding up; this was beginning to look like a programmable computer.
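
(The core trick is easier to see in code than in brass: for a polynomial of degree n, the nth differences are constant, so once a few ‘columns’ are seeded every subsequent value needs nothing but addition- exactly the sort of repetitive work stacked cogs are good at. A minimal sketch, with an example polynomial of my own choosing:)

```python
# The method of differences that the engine mechanised, in miniature.
# For a polynomial of degree n, the nth differences are constant, so
# once the columns are seeded, every further value needs only addition.
# Example polynomial (my choice, not Babbage's): f(x) = 2x^2 + 3x + 1.

def f(x):
    return 2 * x * x + 3 * x + 1

degree = 2
# Seed the 'columns': f(0), the first difference f(1) - f(0), and the
# constant second difference.
cols = [f(0), f(1) - f(0), (f(2) - f(1)) - (f(1) - f(0))]

for x in range(8):
    print(x, cols[0], f(x))      # the two values should always agree
    for i in range(degree):      # each column absorbs the one to its right
        cols[i] += cols[i + 1]
```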

Babbage’s machine was not the all-conquering revolutionary design the hype about it might have you believe. He was commissioned to build one by the British government for military purposes, but since Babbage was often brash (once claiming that he could not fathom the idiocy of the mind that would think up a question an MP had just asked him) and prized academia above fiscal matters and practicality, the project fell through. After investing £17,000 in his machine, the government realised that he had switched to working on a new and improved design known as the analytical engine, pulled the plug, and the machine never got made. Neither did the analytical engine, which is a crying shame; this was the first true computer design, with separate inputs for data and for the required program (which could be a lot more complicated than just adding or subtracting), and an integrated memory system. It could even print results on one of three printers, in what could be considered the first human interfacing system (akin to a modern-day monitor), and had ‘control flow systems’ incorporated to ensure that the steps of a program occurred in the correct order. We may never know whether Babbage’s analytical engine would have worked, since it has never been built, but a later model of his difference engine was built by the London Science Museum in 1991, yielding accurate results to 31 decimal places.
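
(To give a flavour of why that combination- a program fed in separately from its data, a memory, arithmetic and control flow- matters so much, here’s a toy machine in Python. The instruction set is entirely my own invention rather than anything of Babbage’s, but the shape of the thing is the point:)

```python
# A toy sketch (my own, not Babbage's actual design) of the analytical
# engine's key ideas: a program fed in separately from the data, a
# memory (the 'store'), arithmetic, output, and control flow.

def run(program, store):
    pc = 0  # position in the program
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":
            a, b, dest = args
            store[dest] = store[a] + store[b]
        elif op == "SUB":
            a, b, dest = args
            store[dest] = store[a] - store[b]
        elif op == "PRINT":                # output, Babbage's 'printer'
            print(store[args[0]])
        elif op == "JUMP_IF_POS":          # control flow: conditional loop
            if store[args[0]] > 0:
                pc = args[1]
                continue
        pc += 1

# Program and data arrive separately, as on Babbage's punched cards:
program = [
    ("ADD", 0, 1, 0),          # store[0] += store[1]
    ("SUB", 2, 3, 2),          # decrement the loop counter
    ("JUMP_IF_POS", 2, 0),     # repeat while the counter is positive
    ("PRINT", 0),
]
store = [0, 5, 3, 1]           # data: accumulator, addend, counter, one
run(program, store)            # prints 15 (5 added three times over)
```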

…and I appear to have run on a bit further than intended. No matter- my next post will continue this journey down the history of the computer, and we’ll see if I can get onto any actual explanation of how the things work.