F=ma

On Christmas Day 1642, a baby boy was born to a well-off Lincolnshire family in Woolsthorpe Manor. His childhood was somewhat chaotic; his father had died before he was born, and his mother remarried (to a stepfather he came to acutely dislike) when he was three. He later ran away from school, discovered he hated the farming alternative, and returned to become the school’s top pupil. He went on to attend Trinity College, Cambridge; oh, and to become arguably the greatest scientist and mathematician of all time. His name was Isaac Newton.

Newton started off in a small way, developing the generalised binomial theorem: a technique for expanding powers of binomials (expressions of the form x + y), which is a fundamental tool used pretty much everywhere in modern science and mathematics; the advanced mathematical equivalent of knowing that 2 x 4 = 8. Oh, and did I mention that he was still a student at this point? Taking a break from his Cambridge career for a couple of years due to the minor inconvenience of the Great Plague, he whiled away the hours inventing calculus, which he refined upon his return to Cambridge. Calculus is the collective name for differentiation and integration, which between them let us work out algebraically the rate at which something is changing (the gradient of a graph) and the area under that graph, plus reverse either process. This makes it sound like rather a neat and useful gimmick, but that description belies the fact that it allows us to mathematically describe everything from water flowing through a pipe to how aeroplanes fly (the Euler equations mentioned in my aerodynamics posts come from advanced calculus), and its discovery alone would have been enough to warrant Newton’s place in the history books. OK, and Leibniz’s, who discovered pretty much the same thing independently at roughly the same time, but he got there later than Newton. So there.
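For the curious, here is roughly what those two discoveries look like in modern notation (a sketch in today’s symbols, which are not the ones Newton himself used):

```latex
% The binomial theorem (whole-number powers shown; Newton's version
% generalised it to fractional and negative powers too)
(x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^{n-k} y^{k}

% Differentiation: the gradient of f at a point, defined as a limit
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

% Integration reverses it: the area under f' between a and b
% recovers f (the fundamental theorem of calculus)
\int_a^b f'(x)\,dx = f(b) - f(a)
```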

However, discovering the most important mathematical tool available to modern scientists and engineers was clearly not enough to occupy Newton’s prodigious mind during his downtime, so he also turned his attention to optics, aka the behaviour of light. He began by discovering that white light was composed of all colours, revolutionising all contemporary scientific understanding of light itself by suggesting that coloured objects did not create their own colour, but reflected only certain portions of already coloured light. He combined this with his study of refraction: that light shone through glass or another transparent material at an angle will bend. This then led him to explain how telescopes worked, why the existing designs (based around refracting light through a lens) were flawed, and to design an entirely new type of telescope (the reflecting telescope) whose basic principle underpins virtually all large modern astronomical instruments, allowing us to study, look at and map the universe like never before. Oh, and he also took the time to theorise that light was made of particles (he called them corpuscles), an idea that wouldn’t be vindicated, in the form of the photon, for another 250 years.
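As an aside, the bending involved is captured by Snell’s law, which actually predates Newton (it’s usually credited to Snell and Descartes) and is written nowadays as:

```latex
% Snell's law: light passing from a medium of refractive index n_1
% into one of index n_2 bends from angle theta_1 to angle theta_2
% (angles measured from the perpendicular to the surface)
n_1 \sin\theta_1 = n_2 \sin\theta_2
```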

When that got boring, Newton turned his attention to a subject that he had first fiddled around with during his calculus time: gravity. Nowadays gravity is a concept taught to every schoolchild, but in Newton’s day the question of why objects fall to earth was barely even considered. Aristotle’s theories dictated that every object ‘wanted’ to be in a state of stillness on the ground unless disturbed, and Newton mounted the most serious challenge that theory had faced in nearly two millennia (whether an apple tree was involved in his discovery is heavily disputed). Not only did he (spurred on by, and in bitter priority dispute with, Robert Hooke) define the force of gravity, but he also established the inverse-square law for its behaviour (aka if you multiply your distance from a planet by 2, then you decrease the gravitational force on you by a factor of 2 squared, or 4) and turned it into an equation (F=GMm/r^2, with a minus sign attached if you want to keep track of direction). This single equation would explain Kepler’s work on celestial mechanics, accurately predict the orbits of the ****ing planets (predictions based, just to remind you, on the thoughts of one bloke on earth with little technology more advanced than a pen and paper) and form the basis of his subsequent book: “Philosophiæ Naturalis Principia Mathematica”.
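To see the inverse-square arithmetic spelled out (in modern symbols rather than Newton’s geometric style):

```latex
% Newton's law of universal gravitation: the magnitude of the force
% between masses M and m a distance r apart; G is a constant
F(r) = \frac{G M m}{r^2}

% Double the distance and the force drops to a quarter:
F(2r) = \frac{G M m}{(2r)^2} = \frac{1}{4}\,\frac{G M m}{r^2} = \frac{F(r)}{4}
```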

Principia, as it is commonly known, is probably the single most important piece of scientific writing ever produced. Not only does it set down all Newton’s gravitational theories and explore their consequences (in minute detail; the book in its original Latin is bigger than a pair of good-sized bricks), but it also defines the concepts of mass, momentum and force properly for the first time; indeed, his definitions survive to this day and have yet to be improved upon. He also set down his three laws of motion: an object’s velocity is constant unless a force acts upon it; the acceleration of an object is proportional to the force acting on it and inversely proportional to its mass (summarised in the title of this post); and action and reaction are equal and opposite. These three laws not only tore two thousand years of scientific theory to shreds, but nowadays underlie everything we understand about the mechanics of objects; indeed, no flaw was found in Newton’s equations until relativity was discovered 250 years later, and relativity only really matters for objects travelling at around 100,000 kilometres per second or faster; not something Newton was ever likely to come across.
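In modern symbols (a compact restatement rather than Newton’s original wording), the three laws come out as:

```latex
% First law: no net force means velocity does not change
\sum \vec{F} = 0 \;\Rightarrow\; \vec{v} = \text{constant}

% Second law: force equals mass times acceleration
% (the title of this post, rearranged)
\vec{F} = m\vec{a}

% Third law: the force A exerts on B is equal and opposite
% to the force B exerts on A
\vec{F}_{AB} = -\vec{F}_{BA}
```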

Isaac Newton’s life outside science was no less successful; he was something of an amateur alchemist (there is speculation that his alchemical meddling may have resulted in mercury poisoning), and when he was appointed Master of the Royal Mint (a post he held for around 30 years until his death) he used those skills to great effect in assessing coinage, in an effort to fight Britain’s massive forgery problem. He was successful in this endeavour and later became the first man to put Britain onto the gold, rather than silver, standard, reflecting his knowledge of the superior chemical qualities of gold (see another previous post). He is still considered by many to be the greatest genius who ever lived, and I can see where those people are coming from.

However, the reason I find Newton especially interesting concerns his private life. Newton was a notoriously hard man to get along with; he never married, almost certainly died a virgin and is reported to have laughed only once in his life (when somebody asked him what the point was in studying Euclid; the joke is somewhat highbrow, I’ll admit). His was a lonely existence, largely friendless, and he lived, basically, for his work (he has been posthumously diagnosed with everything from bipolar disorder to Asperger’s syndrome). In an age when we are used to such charismatic scientists as Richard Feynman and Stephen Hawking, Newton’s cut-off, isolated existence, with only his prodigious intellect for company, seems especially alien. That the approach was effective is most certainly not in doubt; any one of his scientific discoveries would alone have been enough to place him in science’s hall of fame, and to have made all of them puts him head and shoulders above all of his contemporaries. In many ways, Newton’s story is one of the price of success. Was Isaac Newton a successful man? Undoubtedly, in almost every field he turned his hand to. Was he a happy man? We don’t know, but it would appear not. Given the choice between success and happiness, where would you fall?


NUMBERS

One of the most endlessly charming parts of the human experience is our capacity to see something we can’t describe and just make something up in order to do so, never mind whether it makes any sense in the long run or not. Countless examples have been demonstrated over the years, but the mother lode of such situations has to be humanity’s invention of counting.

Numbers do not, in and of themselves, exist- they are simply a construct designed by our brains to help us get to grips with the awe-inspiring concept of the relative amounts of things. However, this hasn’t prevented this ‘neat little tool’ spiralling out of control to form the vast field that is mathematics. Once merely a diverting pastime designed to help us get more use out of our counting tools, maths (I’m British, live with the spelling) first tentatively applied itself to shapes and geometry, experimented with trigonometry, stormed onwards to algebra, turned calculus into a total mess about four nanoseconds after its discovery, and then threw the whole lot together into a melting pot of cross-genre mayhem that eventually ended up as the closest thing STEM (science, technology, engineering and mathematics) has to an art form, in that much of it has no discernible purpose other than the sake of its own existence.

This is not to say that mathematics is not a useful field, far from it. The study of different ways of counting led to the discovery of binary arithmetic and enabled the birth of modern computing, huge chunks of astronomy and classical scientific experiments were and are reliant on the application of geometric and trigonometric principles, mathematical modelling has allowed us to predict behaviour ranging from economics & statistics to the weather (albeit with varying degrees of accuracy) and just about every aspect of modern science and engineering is grounded in the brute logic that is core mathematics. But… well, perhaps the best way to explain where the modern science of maths has led over the last century is to study the story of i.
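To pick on the binary point for a second: counting in base two is just counting in a different notation, which a few lines of Python (my choice of illustration; nothing about binary requires it) can demonstrate:

```python
# The same quantities written in base ten and base two: different
# notation, identical numbers.
for n in range(1, 11):
    print(f"decimal {n:2d} = binary {n:04b}")

# Arithmetic works identically in either base, just with only the
# digits 0 and 1 - which is what suits circuits that can only
# distinguish 'on' from 'off'.
assert 0b0101 + 0b0011 == 0b1000  # 5 + 3 = 8
```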

One of the most basic functions we are able to perform on a number is to multiply it by something- a special case, when we multiply it by itself, is ‘squaring’ it (since a number ‘squared’ is equal to the area of a square with side lengths of that number). Naturally, there is a way of reversing this function, known as finding the square root of a number (ie square rooting the square of a number will yield the original number). However, the rules of arithmetic dictate that a negative number squared makes a positive one; hence no number squared makes a negative, and there is no such thing as the square root of a negative number, such as -1. So far, all I have done is use a very basic application of logic, something a five-year-old could understand, to explain a fact about ‘real’ numbers; but maths decided that it didn’t want to be unable to square root a negative number, so it had to find a way round that problem. The solution? Invent an entirely new type of number, based on the quantity i (which equals the square root of -1), with its own totally arbitrary and made-up way of fitting onto a number line (one set at right angles to the ordinary one, as it turns out), and which can in no way exist in real life.
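Python, as it happens, has these numbers built in (it writes i as j, an electrical engineer’s habit), so you can watch the made-up rule in action:

```python
# Python's built-in complex type: 1j is the imaginary unit i.
i = 1j

# The defining, 'impossible' property: i squared equals -1.
print(i ** 2)          # (-1+0j)

# Complex numbers live on a plane rather than a line: a real part
# and an imaginary part at right angles to each other.
z = 3 + 4j
print(z.real, z.imag)  # 3.0 4.0
print(abs(z))          # 5.0 - the distance from zero, by Pythagoras
```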

Admittedly, i has turned out to be useful. When dealing with oscillating electromagnetic fields and signals, physicists and electrical engineers routinely bundle two quantities (amplitude and phase, or the two components of a field) into a single complex number to keep track of them both at once, and quantum mechanics is built on complex-valued wavefunctions from the ground up; but its main purpose was only ever to satisfy the OCD nature of mathematicians by filling a hole in their theorems. Since then, it has just become another toy in the mathematician’s arsenal, something for them to play with, slip into inappropriate situations to try and solve abstract and largely irrelevant problems, and with which they can push the field of maths in ever more ridiculous directions.
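Here is the engineers’ trick in miniature (a sketch only; the component values are invented purely for illustration):

```python
import cmath

# AC analysis of a resistor and capacitor in series: the circuit's
# total opposition to the current (its impedance) is one complex number.
R = 100.0   # resistance in ohms (illustrative value)
C = 1e-6    # capacitance in farads (illustrative value)
f = 1000.0  # signal frequency in hertz (illustrative value)

omega = 2 * cmath.pi * f      # angular frequency
Z = R + 1 / (1j * omega * C)  # series RC impedance

# The magnitude says how strongly the circuit resists the current;
# the phase says how far the voltage and current are shifted in time.
magnitude, phase = cmath.polar(Z)
print(f"|Z| = {magnitude:.1f} ohms, phase = {phase:.3f} radians")
```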

A good example of the way mathematics has started to lose any semblance of its grip on reality concerns the most famous problem in the whole of the mathematical world- Fermat’s last theorem. Pythagoras famously showed that, for a right-angled triangle, a squared plus b squared equals c squared, and whole-number solutions (3, 4 and 5, for instance) were used to solve some basic problems of geometry; but it was never known whether a cubed plus b cubed could ever equal c cubed if a, b and c were whole numbers. The same question stood for all other powers greater than 2, but in 1637 the brilliant French mathematician Pierre de Fermat claimed, in a scrawled note inside his copy of Diophantus’ Arithmetica, to have a proof of the impossibility ‘that is too large for this margin to contain’. This statement ensured the immortality of the puzzle, but its eventual solution (not found until 1995, leading most independent observers to conclude that Fermat must have made a mistake somewhere in his ‘marvellous proof’) took one man, Andrew Wiles, the best part of a decade to complete. His proof involved showing that any counterexample to the theorem could be used to build an incredibly weird equation (an elliptic curve) that doesn’t correspond to anything in the real world, that all equations of this type must have a counterpart of an equally irrelevant type (a modular form), and that this particular ‘Fermat equation’ would be too weird to have one. Since it couldn’t have the counterpart it was obliged to have, it could not logically exist, and neither could any counterexample.
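You can get a feel for why everybody believed the theorem long before Wiles proved it by simply hunting for counterexamples and failing (a toy search in Python, in no way resembling a proof):

```python
# Brute-force hunt for whole numbers satisfying a^3 + b^3 = c^3.
# Fermat's last theorem says this loop will never print anything,
# however far you push the bound - and since 1995 that's a theorem.
LIMIT = 200  # arbitrary small search bound, purely for illustration

cubes = {c ** 3: c for c in range(1, 2 * LIMIT)}
for a in range(1, LIMIT):
    for b in range(a, LIMIT):
        total = a ** 3 + b ** 3
        if total in cubes:
            print(f"Counterexample! {a}^3 + {b}^3 = {cubes[total]}^3")
```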

To a mathematician, this was the holy grail; not only did it finally lay to rest an ages-old riddle, but it linked two hitherto unrelated branches of algebraic mathematics by proving a major chunk of what was then the Taniyama-Shimura conjecture (and is now, it having been proved, the modularity theorem). To anyone interested in the real world, this exercise made no contribution to it whatsoever- apart from satisfying a few nerds, nobody’s life was made easier by the solution, it didn’t solve any real-world problem, and it did not make the world a tangibly better place. In this respect then, it was a total waste of time.

However, despite everything I’ve just said, I’m not going to conclude that all modern-day mathematics is a waste of time; very few human activities ever are. Mathematics is many things; among them ridiculous, confusing, full of contradictions and potential slip-ups and, in a field where the major prizes are won at a younger age than anywhere else in STEM (the Fields Medal famously won’t be awarded to anyone over 40), apparently full of those likely to belittle you out of future success should you enter the world of serious academia. But, for some people, maths is just what makes the world make sense, and at its heart that was all it was ever created to do. And if some people want their life to be all about the little symbols that make the world make sense, then well done to the world for making a place for them.

Oh, and there’s a theory doing the rounds of cosmology nowadays that reality is nothing more than a mathematical construct. Who knows in what obscure branch of reverse logarithmic integrals we’ll find answers about that one…