F=ma

On Christmas Day 1642, a baby boy was born to a well-off Lincolnshire family in Woolsthorpe Manor. His childhood was somewhat chaotic; his father had died before he was born, and his mother remarried (to a stepfather he came to acutely dislike) when he was three. He later ran away from school, discovered he hated the farming alternative and returned to become the school’s top pupil. He went on to attend Trinity College, Cambridge; oh, and to become arguably the greatest scientist and mathematician of all time. His name was Isaac Newton.

Newton started off in a small way, developing the binomial theorem: a technique for expanding powers of binomials (expressions like (x + y)^n), and a fundamental tool used pretty much everywhere in modern science and mathematics; the advanced mathematical equivalent of knowing that 2 x 4 = 8. Oh, and did I mention that he was still a student at this point? Taking a break from his Cambridge career for a couple of years due to the minor inconvenience of the Great Plague, he whiled away the hours inventing calculus, which he finalised upon his return to Cambridge. Calculus is the collective name for differentiating and integrating, which allow one to work out algebraically the rate at which something is changing, the gradient of a graph and the area under it, plus to reverse all of the above processes. This makes it sound like rather a neat and useful gimmick, but that belies the fact that it allows us to mathematically describe everything from water flowing through a pipe to how aeroplanes fly (the Euler equations mentioned in my aerodynamics posts come from advanced calculus), and its discovery alone would have been enough to warrant Newton’s place in the history books. OK, and Leibniz’s, who discovered pretty much the same thing independently at roughly the same time; but Newton got there first, even if Leibniz published first. So there.
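
If you fancy seeing the binomial theorem in action, here’s a minimal Python sketch (the code and numbers are purely my own illustration, obviously, not anything of Newton’s):

```python
# The binomial theorem: (x + y)^n = sum over k of C(n, k) * x^(n-k) * y^k,
# where C(n, k) is the binomial coefficient "n choose k".
from math import comb

def binomial_coefficients(n):
    """The coefficients in the expansion of (x + y)^n."""
    return [comb(n, k) for k in range(n + 1)]

# (x + y)^4 = x^4 + 4x^3y + 6x^2y^2 + 4xy^3 + y^4
print(binomial_coefficients(4))  # [1, 4, 6, 4, 1]
```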

However, discovering the most important mathematical tool available to modern scientists and engineers was clearly not enough to occupy Newton’s prodigious mind during his downtime, so he also turned his attention to optics, aka the behaviour of light. He began by discovering that white light was composed of all colours, revolutionising all contemporary scientific understanding of light itself by suggesting that coloured objects did not create their own colour, but reflected only certain portions of already coloured light. He combined this with the study of refraction: that light shone through glass or another transparent material at an angle will bend. This then led him to explain how telescopes worked, why the existing designs (based around refracting light through a lens) were flawed, and to design an entirely new type of telescope (the reflecting telescope) whose basic principle is used in virtually every major astronomical telescope today, allowing us to study, look at and map the universe like never before. Oh, and he also took the time to theorise the existence of photons (he called them corpuscles), which wouldn’t be confirmed for another two centuries or so.
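
For the record, the amount of bending is governed by Snell’s law, n1·sin(θ1) = n2·sin(θ2), where n is each material’s refractive index. A quick sketch (the indices are typical textbook values of my choosing, not Newton’s):

```python
# Snell's law of refraction: n1 * sin(theta1) = n2 * sin(theta2)
import math

def refraction_angle(incidence_deg, n1=1.0, n2=1.5):
    """Angle of a ray after passing from a medium of index n1
    (air, roughly 1.0) into one of index n2 (a typical glass, ~1.5)."""
    sin_refracted = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(sin_refracted))

print(round(refraction_angle(45.0), 1))  # ~28.1 degrees: the ray bends towards the normal
```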

When that got boring, Newton turned his attention to a subject that he had first fiddled around with during his calculus time: gravity. Nowadays gravity is a concept taught to every schoolchild, but in Newton’s day the question of why objects fall to earth was barely even considered. Aristotle’s theories dictated that every object ‘wanted’ to be in a state of stillness on the ground unless disturbed, and Newton was the first person to make a serious challenge to that theory in nearly two millennia (whether an apple tree was involved in his discovery is heavily disputed). Not only did he (spurred on by a somewhat fractious correspondence with Robert Hooke, who had also guessed at an inverse-square relation) define the force of gravity, but he also established the inverse-square law for its behaviour (aka if you multiply your distance from a planet by 2, then you will decrease the gravitational force on you by a factor of 2 squared, or 4) and turned it into an equation (F=-GMm/r^2). This single equation would explain Kepler’s work on celestial mechanics, accurately predict the orbit of the ****ing planets (predictions based, just to remind you, on the thoughts of one bloke on earth with little technology more advanced than a pen and paper) and form the basis of his subsequent book: “Philosophiæ Naturalis Principia Mathematica”.
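
To put some numbers on that equation (my own illustrative figures, needless to say, not anything from Principia):

```python
# Newton's law of universal gravitation: F = G * M * m / r^2
G = 6.674e-11  # gravitational constant, N m^2 / kg^2

def gravitational_force(M, m, r):
    """Magnitude of the attraction between masses M and m (kg) a distance r (m) apart."""
    return G * M * m / r**2

M_earth, r_earth = 5.972e24, 6.371e6  # Earth's mass (kg) and radius (m)
print(round(gravitational_force(M_earth, 70, r_earth)))      # ~687 N: a 70 kg person's weight
print(round(gravitational_force(M_earth, 70, 2 * r_earth)))  # ~172 N: double the distance, quarter the force
```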

Principia, as it is commonly known, is probably the single most important piece of scientific writing ever written. Not only does it set down all Newton’s gravitational theories and explore their consequences (in minute detail; the book in its original Latin is bigger than a pair of good-sized bricks), but it also defines the concepts of mass, momentum and force properly for the first time; indeed, his definitions survive to this day and, within classical mechanics, have yet to be improved upon. He also set down his three laws of motion: an object’s velocity is constant unless a force acts upon it; the acceleration of an object is proportional to the force acting on it and inversely proportional to its mass (summarised in the title of this post); and action and reaction are equal and opposite. These three laws not only tore two thousand years of scientific theory to shreds, but nowadays underlie everything we understand about the mechanics of everyday objects; indeed, no flaw was found in Newton’s equations until relativity came along over 200 years later, and that only really matters for objects travelling at around 100,000 kilometres per second or faster; not something Newton was ever likely to come across.
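
For the curious, here’s a toy sketch of the second law in action (a simple Euler time-stepper of my own devising, nothing Newton ever wrote down):

```python
# Newton's second law, F = m * a, stepped forward in time (simple Euler integration).
def simulate_constant_force(force, mass, dt=0.01, steps=500):
    """Velocity of an object that starts at rest and feels a constant force for steps*dt seconds."""
    v = 0.0
    for _ in range(steps):
        a = force / mass  # the second law: acceleration from force and mass
        v += a * dt       # the first law is the special case force = 0: v never changes
    return v

# 10 N on a 2 kg object gives a = 5 m/s^2; after 5 s, v = a * t = 25 m/s.
print(round(simulate_constant_force(force=10.0, mass=2.0), 6))  # 25.0
```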

Isaac Newton’s life outside science was no less successful; he was something of an amateur alchemist and when he was appointed Master of the Royal Mint (a post he held for nearly 30 years until his death; there is speculation that his alchemical meddling may have resulted in mercury poisoning) he used those skills to great effect in assaying coinage, in an effort to fight Britain’s massive forgery problem. He was successful in this endeavour and later became the first man to put Britain onto the gold, rather than silver, standard, reflecting his knowledge of the superior chemical qualities of the former metal (see another previous post). He is still considered by many to be the greatest genius who ever lived, and I can see where those people are coming from.

However, the reason I find Newton especially interesting concerns his private life. Newton was a notoriously hard man to get along with; he never married, almost certainly died a virgin and is reported to have only laughed once in his life (when somebody asked him what was the point in studying Euclid. The joke is somewhat highbrow, I’ll admit). His was a lonely existence, largely friendless, and he lived, basically, for his work (he has been posthumously diagnosed with everything from bipolar disorder to Asperger’s syndrome). In an age when we are used to such charismatic scientists as Richard Feynman and Stephen Hawking, Newton’s cut-off, isolated existence with only his prodigious intellect for company seems especially alien. That the approach was effective is most certainly not in doubt; any one of his scientific discoveries would alone be enough to place him in science’s hall of fame, and to have made all of them puts him head and shoulders above all of his contemporaries. In many ways, Newton’s story is one of the price of success. Was Isaac Newton a successful man? Undoubtedly, in almost every field he turned his hand to. Was he a happy man? We don’t know, but it would appear not. Given the choice between success and happiness, where would you fall?

Determinism

In the early years of the 19th century, science was on a roll. The dark days of alchemy were beginning to give way to the modern science of chemistry as we know it today, the world of physics and the study of electromagnetism were starting to get going, and the world was on the brink of an industrial revolution that would be powered by scientists and engineers. Slowly, we were beginning to piece together exactly how our world works, and some dared to dream of a day where we might understand all of it. Yes, it would be a long way off, yes there would be stumbling blocks, but maybe, just maybe, so long as we don’t discover anything inconvenient like advanced cosmology, we might one day begin to see the light at the end of the long tunnel of science.

Most of this stuff was the preserve of hopeless dreamers, but in the year 1814 a brilliant mathematician and philosopher called Pierre-Simon Laplace, responsible for underpinning vast quantities of modern mathematics and cosmology, published a bold new essay that took this concept to extremes. Laplace lived in the age of ‘the clockwork universe’, a theory that held Newton’s laws of motion to be sacrosanct truths and claimed that these laws of physics caused the universe to just keep on ticking over, just like the mechanical innards of a clock; and just like a clock, the universe was predictable. Just as, on a perfect clock, one hour after five o’clock will always be six, so every event in the world could be predicted from its causes. Laplace’s arguments took such theory to its logical conclusion: if some vast intellect were able to know the precise positions of every particle in the universe, and all the forces and motions acting upon them, at a single point in time, then using the laws of physics such an intellect would be able to know everything, see into the past, and predict the future.

Those who believed in this theory were generally disapproved of by the Church for devaluing the role of God and the unaccountable divine, whilst others thought it implied a lack of free will (although these issues are still considered somewhat up for debate to this day). However, among the scientific community Laplace’s ideas conjured up a flurry of debate; some entirely believed in the concept of a predictable universe, in the theory of scientific determinism (as it became known), whilst others pointed out that the sheer difficulty of getting any ‘vast intellect’ to fully comprehend so much as a heap of sand made Laplace’s arguments completely pointless. Other, far later, observers would call into question some of the axioms upon which the model of the clockwork universe was based, such as Newton’s laws of motion (which collapse when one does not take into account relativity at very high velocities); but the majority of the scientific community was rather taken with the idea that they could know everything about something should they choose to. Perhaps the universe was a bit much, but being able to predict everything, to an infinitely precise degree, about a few atoms perhaps, seemed like a very tempting idea, offering a delightful sense of certainty. More than anything, to these scientists their work now had one overarching goal: to complete the laws necessary to provide a deterministic picture of the universe.

However, by the late 19th century scientific determinism was beginning to stand on rather shaky ground, with the attack against it coming from the rather unexpected direction of science being used to support the religious viewpoint. By this time the laws of thermodynamics, detailing the behaviour of molecules in relation to the heat energy they have, had been formulated, and fundamental to the second law of thermodynamics (which is, to this day, one of the fundamental principles of physics) was the concept of entropy. Entropy (denoted in physics by the symbol S, for no obvious reason) is a measure of the degree of uncertainty or ‘randomness’ inherent in a system; or, for want of a clearer explanation, consider a sandy beach. All of the grains of sand in the beach can be arranged in a vast number of different ways to form the shape of a disorganised heap, but far fewer arrangements of those same grains will result in a giant, detailed sandcastle. Therefore, if we just consider the two situations separately, it is far, far more likely that we will end up with a disorganised ‘beach’ structure than with a castle forming of its own accord (which is why sandcastles don’t spring fully formed from the sea), and we say that the beach has a higher degree of entropy than the castle. This increased likelihood of higher-entropy situations, on an atomic scale, means that the universe tends to increase the overall level of entropy in it; if we attempt to impose order upon it (by making a sandcastle, rather than waiting for one to be formed purely by chance), we must input energy, which increases the entropy of the surroundings and thus results in a net entropy increase. This is the second law of thermodynamics: entropy always increases, and this principle underlies vast quantities of modern physics and chemistry.
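
Boltzmann later made this counting argument precise with his formula S = k·ln(W), where W is the number of microscopic arrangements consistent with what we see from the outside. A toy sketch (the arrangement counts are made-up illustrative numbers, naturally):

```python
# Boltzmann's statistical take on entropy: S = k_B * ln(W), where W is the
# number of microscopic arrangements ("microstates") that look the same
# macroscopically. More arrangements means higher entropy.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(microstates):
    return k_B * math.log(microstates)

beach_arrangements = 10 ** 100  # made-up count for the disorganised heap
castle_arrangements = 10 ** 5   # far fewer ways to be exactly the castle
print(entropy(beach_arrangements))   # ~3.2e-21 J/K
print(entropy(castle_arrangements))  # ~1.6e-22 J/K: the heap has 20x the entropy
```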

If we extrapolate this situation backwards, we realise that the universe must have had a definite beginning at some point: a starting point of order from which things get steadily more chaotic, for order cannot increase indefinitely as we look backwards in time. This suggests some point at which our current universe sprang into being, including all the laws of physics that make it up; but this cannot have occurred under ‘our’ laws of physics that we experience in the everyday universe, as they could not kickstart their own existence. There must, therefore, have been some other, higher power to get the clockwork universe in motion, destroying the image of it as some eternal, unquestionable predictive cycle. At the time, this was seen as vindicating the idea of the existence of God to start everything off; it would be some years before the work of Georges Lemaître and Edwin Hubble gave us the Big Bang theory, and even now we understand next to nothing about the moment of our creation.

However, this argument wasn’t exactly a death knell for determinism; after all, the laws of physics could still describe our existing universe as a ticking clock, surely? True; the killer blow for that idea would come from Werner Heisenberg in 1927.

Heisenberg was a theoretical physicist, often described as the inventor of quantum mechanics (work that won him a Nobel Prize). The key feature of his work here was the concept of uncertainty on a subatomic level: that certain pairs of properties, such as the position and momentum of a particle, are impossible to know exactly at any one time. There is an incredibly complicated explanation for this concerning wave functions and matrix algebra, but a simpler way to explain part of the concept concerns how we examine something’s position (apologies in advance to all physics students I end up annoying). If we want to know where something is, then the tried and tested method is to look at the thing; this requires photons of light to bounce off the object and enter our eyes, or hypersensitive measuring equipment if we want to get really advanced. However, at a subatomic level a photon of light represents a sizeable chunk of energy, so when it bounces off an atom or subatomic particle, allowing us to know where it is, it so messes around with the atom’s energy that it changes its velocity and momentum, and we cannot predict how. Thus, the more precisely we try to measure the position of something, the less accurately we are able to know its velocity (and vice versa; I recognise this explanation is incomplete, but can we just take it as read that finer minds than mine agree on this point). Therefore, we cannot ever measure every property of every particle in a given space, never mind the engineering challenge; it’s simply not possible.
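
The modern statement puts a hard number on the trade-off: Δx·Δp ≥ ħ/2. A quick illustrative calculation (the electron-in-an-atom scenario is my choice of example):

```python
# Heisenberg's uncertainty principle: delta_x * delta_p >= hbar / 2
hbar = 1.054571817e-34  # reduced Planck constant, J s

def min_momentum_spread(delta_x):
    """Smallest possible uncertainty in momentum once position is known to within delta_x."""
    return hbar / (2 * delta_x)

m_electron = 9.109e-31  # electron mass, kg
dp = min_momentum_spread(1e-10)  # pin an electron down to atomic size (~1e-10 m)
print(dp)                        # ~5.3e-25 kg m/s
print(dp / m_electron)           # ~5.8e5 m/s of velocity uncertainty; not small at all
```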

This idea did not enter the scientific consciousness comfortably; many scientists were incensed by the idea that they couldn’t know everything, that their goal of an entirely predictable, deterministic universe would forever remain unfulfilled. Einstein was a particularly vocal critic, dedicating much of his later life to attempting to poke holes in quantum mechanics and back up his famous statement that ‘God does not play dice with the universe’. But eventually the scientific world came to accept the truth: determinism was dead. The universe would never seem so sure and predictable again.