War Games

So, what haven’t I done a post on in a while? Hmm…

Film reviewing?

WarGames was always going to struggle to age gracefully; even in 1983, setting one’s plot against the backdrop of the Cold War was something of an old idea, and the fear of the unofficial conflict degenerating into armageddon had certainly lessened since the ‘Red Scare’ days of the 50s and 60s. Then there’s the subject matter and plot- ‘supercomputer almost destroys world via nuclear war’ must have seemed terribly futuristic and sci-fi at the time, but three decades of filmmaking have rendered the idea somewhat cliched; it’s no coincidence that the film’s 2008 ‘sequel’ went straight to DVD. In an age where computers have become ubiquitous, the computing technology on display also seems hilariously old-fashioned, but a bigger flaw is the film’s presentation of how computers work. Our AI antagonist, ‘Joshua’, shows the ability to think creatively, to talk and respond like a human and to learn from experience and repetition, all features that 30 years of technological advancement in the field of computing have still not been able to pull off with any real success; the first in a long series of plot holes. I myself spent much of the second act inwardly shouting at the characters for making quite so many either hideously dumb or just plain illogical decisions, ranging from agreeing on a whim to pay for a flight across the USA for a friend met just days earlier, to deciding that the best way to convince a bunch of enraged FBI officers that you are not a Soviet-controlled terrorist bent on the destruction of the USA is to break out of their custody.

The first act largely avoided these problems, and the setup was well executed; our protagonist is David (Matthew Broderick), a late-teenage high school nerd who manages to avoid the typical Hollywood idea of nerd-dom by being articulate, well-liked, not particularly concerned about his schoolwork and relatively normal. Indeed, the only clues we have to his nerdery come thanks to his twin loves of video gaming and messing around in his room with a computer, hacking into anything undefended that he considers interesting. The film also manages to avoid reverting to formula with regards to its female lead, his friend Jennifer (Ally Sheedy), who manages not to fall into the role of designated love interest whilst acting as an effective sounding board for the audience’s questions; a nice touch when dealing with subject matter that audiences of the time would doubtless have found difficult to understand. This does leave her character somewhat lacking in depth, but thankfully this proves the exception rather than the rule.

Parallel to this, we have NORAD, the USA’s nuclear defence headquarters, which, after realising the potential risk of human missile operators being unwilling to launch their deadly weapons, decides to place the entire nuclear arsenal under computerised control. The computer in question is the WOPR, a supercomputer intended to continually play ‘war games’ to identify the optimal strategy in the event of nuclear war. So we have a casual computer hacker at one end of the story and a computer with far too much control for its own good at the other; you can guess how things are going to go from there.

Unfortunately, things start to unravel once the plot starts to gather speed. Broderick’s presentation of David works great when he’s playing a confident, playful geek, but when he starts trying to act scared or serious his delivery becomes painfully unnatural. Since he and Sheedy’s rather depthless character get the majority of the screen time, this leaves large portions of the film lying fallow; the supporting characters, such as the brash General Beringer (Barry Corbin) and the eccentric Dr. Stephen Falken (John Wood), do a far better job of filling out their respective character patterns, but they can’t quite overshadow the plot holes and character deficiencies of the twin leads. This is not to say the film is bad, far from it; director John Badham clearly knows how to build tension, using NORAD’s Defcon level as a neat indicator of just how high the stakes are/how much **** is waiting to hit the proverbial fan. Joshua manages to be a compelling bad guy, in spite of being faceless and having less than five minutes of actual screen time, and his famous line “A strange game. The only winning move is not to play” carries enough resonance and meaning that I’d heard of it long before I’d heard of the film it came from. The film also attempts the classic trick, demonstrated to perfection in Inception, of dealing with subject matter that blurs the line between fiction (the ‘war games’) and reality (nuclear war) in an effort to similarly blur its own fiction with the reality of the audience; it is all desperately trying to be serious and meaningful.

But in the end, it all feels like so many add-ons, and somehow the core dynamics and characterisation left me outside the experience. WarGames tries so very hard to hook the viewer into a compelling, intriguing, high-stakes plot, but for me it just failed to quite pull it off. It’s not a bad film, but to me it all felt somehow underwhelming. The internet tells me that for some people it’s a favourite, but for me it was gently downhill from the first act onwards. I don’t really have much more to say.


The Sting

I have twice before used this blog to foray into the strange world of film reviewing; something that I enjoy, given that I enjoy cinema, but am usually unable to make a stable source of material, since I don’t generally have the time (or, given a lot of the films that get released in my local cinema, the inclination) to see too many of them. My first foray was a rather rambling (and decidedly rubbish) examination of The Hunger Games, with a couple of nods to the general awesomeness of The Shawshank Redemption, whilst I felt compelled to write my second just to articulate my frustration after seeing The Dark Knight Rises. Today, I wish to return to the magical fairy kingdom of the big screen, this time concerning something that I would ordinarily never have seen at all; 70s crime flick ‘The Sting’.

The Sting is quite clearly a film from another era of filmmaking; I am not old enough to remember the times when a stock ‘thump’ sound effect was inserted into the footage every time an object was put onto a table, but this film contains such cinematic anachronisms in spades. Similarly, this is the first film I have ever seen starring Robert Redford and my first from director George Roy Hill, but age should be no barrier to quality entertainment if it’s there to shine through, and thankfully its basic plot and premise lend it to a graceful ageing process.

The plot can be fairly summarily described as uncomplicated; a young confidence trickster who ends up accidentally making a small fortune from a fairly routine con is pursued by the mob boss whose money he has now lost, so teams up with an experienced ‘old head’ to bring him down. So, Ocean’s Eleven with a simpler character base and more realistic motivations. Where the two differ, however, is in their dedication to their subject material; whilst the Ocean’s films are generally content to follow some rather formulaic Hollywood scriptwriting, placing their emphasis heavily on interpersonal relationships and love interests, The Sting goes out of its way to be a true crime story to its very core. Set in the golden age of organised crime (1930s Prohibition-era Illinois, real-life home of Al Capone), with a memorable ragtime soundtrack to match, every stage of the film’s overarching ‘big con’ plot (illustrated explicitly through the use of old-fashioned title cards) takes the form of a classic confidence trick, from an old-fashioned money switch to a large-scale rigged betting house, incorporating along the way possibly the finest played (and cheated) game of poker ever to appear on screen. Every feature, facet and subplot, from the cheated cop to the seemingly out-of-place love interest, has its place in the big con, and there is nothing there that doesn’t have a very good reason to be. Not only does this create a rollercoaster of a focused, central plot without unnecessary distractions, but the authenticity of the tricks, characters and terminology used builds a believable, compelling world to immerse oneself in and enjoy. Combine that with a truly stellar portrayal of the seen-it-all genius conman Henry Gondorff by Paul Newman, and Robert Redford’s evident gift for building a very real, believable character in the form of naive youngster Johnny Hooker, and we have the makings of an incredibly immersive story that you often have to remind yourself isn’t actually real.

However, by putting such focus on its central con, The Sting puts itself under an awful lot of pressure, for without any extraneous components (hell, there aren’t even any proper action scenes, despite the not infrequent bouts of gunfire) it has nowhere to fall back on if its central plot fails. Thus, the success of the film rests very much on the success of the con it centres around, not just in terms of the execution itself but in making that execution fit its style. The Sting is not about coming up with something on the fly, about something unexpected cropping up and winning through on the day- it is an homage to planning, to the skill of the con, of hooking in the mark and making them think they’ve won before playing the ace in the hole. Turning successful planning- what was intended to happen, happening- into compelling drama is a task indeed for a filmmaker.

And yet, despite all the odds, The Sting pulls it off, thanks to the extraordinary depth director Hill packs into his seemingly simplistic plot. Each subplot put into play is like adding another dot to the puzzle, and it is left to the viewer to try and join them all to formulate the finished picture- or, alternatively, to watch the film do so with staggering aplomb. Every element is laid out on the table, everyone can see the cards, and it’s simply a matter of the film being far smarter than you are in revealing how it pulls its trick, just like a conman and his mark. You, the viewer, have been stung just as much as Robert Shaw’s mob boss of a mark, except that you can walk out of the room with your wallet full and a smile on your face.

This is not to say that the film doesn’t have problems. Whilst the basic premise is simple and well executed enough to be bulletproof, its ‘setup’ phase (as the title cards call it) spends an awful lot of time on world-, scenario- and character-building, filling the early parts of the film with enough exposition to leave me feeling decidedly lukewarm about it- it’s all necessary to remove plot holes and to build the wonderful air of depth and authenticity, but something about its execution strikes me as clunky. It also suffers from Inception’s problem of being potentially confusing to anyone not keeping very close track of what’s going on, and one or two of the minor characters suffer from having enough of a role to be significant but not enough characterisation to seem especially real. That said, this film won seven Oscars for a reason, and regardless of how slow it may seem to begin with, it’s definitely worth sticking it out to the end. I can promise you it will be worth it.

What we know and what we understand are two very different things…

If the whole Y2K debacle over a decade ago taught us anything, it was that the vast majority of the population did not understand the little plastic boxes known as computers that were rapidly filling up their homes. There’s nothing especially wrong or unusual about this- there are a lot of things that only a few nerds understand properly, an awful lot of other stuff in our lives to understand, and in any case the personal computer had only just started to become commonplace. However, over 12 and a half years later, the general understanding of a lot of us does not appear to have increased to any significant degree, and we still remain largely ignorant of these little feats of electronic witchcraft. Oh sure, we can work and operate them (most of us, anyway), and we know roughly what they do, but as to exactly how they operate, precisely how they carry out their tasks? Sorry, not a clue.

This is largely understandable, particularly given the meaning of ‘understand’ that applies in computer-based situations. Computers are a rare example of a complex system in which an expert is genuinely capable of understanding, in minute detail, every single aspect of the system’s workings: what each part does, why it is there, and why it is (or, in some cases, shouldn’t be) constructed to that particular specification. To understand a computer in its entirety, therefore, is an equally complex job, and this is one very good reason why computer nerds tend to be a quite solitary bunch, with relatively few links to the rest of us and, indeed, the outside world at large.

One person who does not understand computers very well is me, despite the fact that I have been using them, in one form or another, for as long as I can comfortably remember. Over this summer, however, I had quite a lot of free time on my hands, and part of that time was spent finally relenting to the badgering of a friend and having a go with Linux (Ubuntu, if you really want to know) for the first time. Since I like to do my background research before getting stuck into any project, this necessitated quite some reading into the hows and whys of its installation, along with which came a lot of information about how my computer works more generally. I thought, then, that I might spend the next couple of posts or so detailing some of what I learned, building up a picture of a computer’s functioning from the ground up, and starting with a bit of a history lesson…

‘Computer’ was originally a job title, the job itself being akin to accountancy without the imagination. A computer was a number-cruncher, a supposedly infallible data-processing machine employed to perform jobs ranging from astronomical prediction to calculating interest. The job was a fairly good one, anyone clever enough to land it probably doing well by the standards of his age, but the output wasn’t always so reliable. The human brain is not built for infallibility and, not infrequently, would make mistakes. Most of these undoubtedly went unnoticed or at least rarely caused significant harm, but the system was nonetheless inefficient. Abacuses, log tables and slide rules all aided arithmetic manipulation to a great degree in their respective fields, but true infallibility was unachievable whilst still reliant on the human mind.

Enter Blaise Pascal, 17th-century mathematician and pioneer of probability theory (among other things), who invented the mechanical calculator in 1642, aged just 19. His original design wasn’t much more than a counting machine, a sequence of cogs and wheels so constructed as to be able to count and convert between units, tens, hundreds and so on (i.e. a turn of four spaces on the ‘units’ cog whilst a seven was already counted would bring up eleven), and it could also work with currency denominations and distances. It could even subtract, multiply and divide (with some difficulty), and moreover it proved an important point- that a mechanical machine could cut out the human error factor and reduce any inaccuracy to one of simply entering the wrong number.
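
For those who like seeing the mechanism spelled out, here’s a minimal sketch of that carry principle- purely illustrative Python, making no claim to model Pascal’s actual gearing, and with the function name and digit-list representation entirely my own:

```python
# A toy illustration of the carry principle Pascal's machine mechanised
# (nothing to do with his actual gearing): each 'cog' counts 0-9, and
# rolling past 9 nudges the next cog along by one.

def turn(cogs, places, position=0):
    """Advance the cog at `position` by `places` steps, propagating carries.

    `cogs` holds one digit per cog, least significant first, e.g. [7, 0, 0].
    """
    cogs[position] += places
    for i in range(position, len(cogs)):
        carry, cogs[i] = divmod(cogs[i], 10)
        if carry and i + 1 < len(cogs):
            cogs[i + 1] += carry  # the overflow nudges the next cog along
    return cogs

# The example from the text: four turns on the 'units' cog with a seven
# already counted brings up eleven.
print(turn([7, 0, 0], 4))  # -> [1, 1, 0], i.e. 011, read as eleven
```

The point being that once the carry is handled mechanically, the only mistake left to make is turning the wrong cog in the first place.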

Pascal’s machine was both expensive and complicated, meaning only twenty were ever made, but his was the only working mechanical calculator of the 17th century. Several more, of a range of designs, were built during the 18th century as showpieces, but it was the release in the 19th century of Thomas de Colmar’s Arithmometer, after 30 years of development, that signified the birth of an industry. It wasn’t a large one, since the machines were still expensive and only of limited use, but de Colmar’s machine was the simplest and most reliable model yet. Around 3,000 mechanical calculators, of various designs and from various manufacturers, had been sold by 1890, but by then the field had been given an unexpected shake-up.

Just two years after de Colmar had first patented his pre-development Arithmometer, an Englishman by the name of Charles Babbage showed an interesting-looking pile of brass to a few friends and associates- a small assembly of cogs and wheels that he said was merely a precursor to the design of a far larger machine: his difference engine. The mathematical workings of his design were based on Newton polynomials, a fiddly bit of maths that I won’t even pretend to understand, but one that could be used to closely approximate logarithmic and trigonometric functions. However, what made the difference engine special was that the original setup of the device, the positions of the various columns and so forth, determined what function the machine performed. This was more than just a simple device for adding up; this was beginning to look like a programmable computer.
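
Without pretending to do Babbage’s maths justice, the core trick can be sketched roughly as follows- a sketch of the general principle rather than of his actual mechanism, with the function and example polynomial below being my own illustrations: for a polynomial, the differences between successive values eventually become constant, so once the columns are loaded with the right starting values an entire table can be ground out using nothing but repeated addition.

```python
# A toy sketch of the principle behind a difference engine (not Babbage's
# actual mechanism): load the columns with the right starting values and a
# polynomial can be tabulated using nothing but repeated addition.

def difference_engine(columns, steps):
    """columns = [f(0), 1st difference, 2nd difference, ...]; returns f(0..steps-1)."""
    cols = list(columns)
    values = []
    for _ in range(steps):
        values.append(cols[0])
        # Advance every column by adding in the column to its right;
        # the last column (the constant difference) never changes.
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
    return values

# Illustrative setup for f(x) = x^2 + x + 1 at x = 0, 1, 2, ...
# f(0) = 1, first difference f(1) - f(0) = 2, second difference = 2 (constant).
print(difference_engine([1, 2, 2], 6))  # [1, 3, 7, 13, 21, 31]
```

That initial loading of the columns is the ‘setup’ mentioned above: change the starting values and the same machine tabulates a different function.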

Babbage’s machine was not the all-conquering revolutionary design the hype about it might have you believe. Babbage was commissioned to build one by the British government for military purposes, but since Babbage was often brash- he once claimed that he could not fathom the idiocy of the mind that would think up a question an MP had just asked him- and prized academia above fiscal matters and practicality, the idea fell through. After investing £17,000 in his machine, the government realised that he had switched to working on a new and improved design, known as the analytical engine, and pulled the plug; the machine never got made. Neither did the analytical engine, which is a crying shame; this was the first true computer design, with separate inputs for data and for the required program (which could be a lot more complicated than just adding or subtracting), and an integrated memory system. It could even print results on one of three printers, in what could be considered the first human interfacing system (akin to a modern-day monitor), and had ‘control flow systems’ incorporated to ensure that the steps of a program were performed in the correct order. We may never know whether Babbage’s analytical engine would have worked, since it has never been built, but a later model of his difference engine was built for the London Science Museum in 1991, yielding accurate results to 31 decimal places.

…and I appear to have run on a bit further than intended. No matter- my next post will continue this journey through the history of the computer, and we’ll see if I can get onto any actual explanation of how the things work.