Pineapples (TM)

If the last few decades of consumerism have taught us anything, it is just how much faith people are capable of placing in a brand. In everything from motorbikes to washing powder, we do not simply test and judge the effectiveness of competing products objectively (although, especially when considering expensive items such as cars, this is sometimes impractical); we also weigh up what we think of the brand and the label, what reputation the product has and what it is supposedly good at, whether we think it suits our social standing and how others will judge our use of it. And a good thing too, from many companies' perspective; without it, the amount of business they do would be slashed. There are many companies whose success can be put down almost entirely to the effect of their branding and the impact their marketing has had on the psyche of western culture, but perhaps the most spectacular example is Apple.

In some ways, typecasting Apple as a brand-built company is a harsh judgement; their products are doubtless good ones, and they have shown a staggering gift for bringing existing ideas together into forms that, if not quite new, are always the first to become a practical, genuine market presence. It is also true that Apple products are often better than their competitors in very specific fields; in computing, for example, OS X is better at dealing with media than other operating systems, whilst Windows has traditionally been far stronger when it comes to word processing, gaming and absolutely everything else (although Windows 8 looks very likely to change all of that- I am not looking forward to it). However, it is almost universally agreed (among non-Apple devotees anyway) that once the rest of the market gets hold of an idea, Apple's version of a product is almost never the definitive best from a purely analytical perspective (the iPod is a possible exception, solely because iTunes redefined the music industry before everyone else and remains competitive to this day), and that every Apple product is ridiculously overpriced for what it is. Seriously, who genuinely thinks that top-end Macs are a good investment?

Still, Apple make high-end, high-quality products that do a few things really, really well and are basically capable of doing everything else. On that basis they should have a small market share, perhaps among the creative or the indie crowd, and a somewhat larger one in the MP3 player sector. They should be a status symbol for those who can afford them, a nice company with a good history that nowadays has to face up to a lot of competitors. As it is, the Apple way of doing business has proven successful enough to make them the most valuable non-state-owned company in the world. Bigger than every other technology company, bigger than every hedge fund or finance company, bigger than any oil company, worth more than every single one (excluding state-owned companies such as Saudi Aramco, which is estimated to be worth around 3 trillion dollars on the back of Saudi oil exports). How has a technology company come to be worth $400 billion? How?

One undoubted factor is Apple's uncanny knack of getting there first- the Apple II was the first real personal computer and provided the genes for Windows-powered PCs to take over the world, whilst the iPod was the first MP3 player that was genuinely enjoyable to use, the iPhone the first smartphone to become a true mass-market product (after just four years, somewhere in the region of 30% of the world's phones are now smartphones) and the iPad the first tablet computer to establish a real market. Being in the technology business has made this kind of innovation especially rewarding for them; every company is constantly terrified of being left behind, so whenever a new innovation comes along they will knock something together as soon as possible just to jump on the bandwagon. However, technology is a difficult business to get right, meaning that these rushed products are usually rubbish and make the Apple version shine by comparison. It also means that when Apple comes up with the idea first, they have had a couple of years of working time to make sure they get it right, whilst everyone else's first efforts have had only a few scant months; it takes a while for any serious competitors to develop, by which time Apple have already made a few hundred million off it and have moved on to something else. Innovation matters in this business.

But the real reason for Apple's success can be put down to the aura the company have built around themselves and their products. From the company's earliest infancy Apple fans have dubbed themselves the independent, the free thinkers, the creative, those who love to be different and stand out from the crowd of grey, calculating Windows-users (which sounds disturbingly like a conspiracy theory or a dystopian vision of the future when it is articulated like that). Whilst Windows has its problems, Apple has decided on what is important and has made something perfect in this regard (their view, not mine), and being willing to pay for it is just part of the induction into the wonderful world of being an Apple customer (still their view). It's a compelling world view, and one that millions of people have subscribed to, simply because it is so comforting; it sells us the idea that we are special, individual, and not just one of the millions of customers responsible for Apple's phenomenal size and success as a company. But the secret to the success of this vision is not just the view itself; it is the method and the longevity of its delivery. This is an image that has been present in their advertising campaigns from the beginning, and is now so ingrained that it doesn't have to be articulated any more; it's just present in the subtle hints, the colour scheme, the way the Apple store is structured and the very existence of Apple-dedicated shops generally. Apple have delivered the masterclass in successful branding; and that's all the conclusion you're going to get for today.

3500 calories per pound

This looks set to be the concluding post in this particular little series on the subject of obesity and being overweight. So, to summarise where we've been so far- post 1: there are a lot of slightly chubby people in the western world, which is what lies behind the statistics supporting a massive obesity problem, and even this moderate degree of fatness can be seriously damaging to your health. Post 2: why we have spent recent history getting slightly chubby. And for today, post 3: how you can try to do your bit, especially following the Christmas excesses and the soon-broken promises of New Year, to lose some of that excess poundage.

It was Albert Einstein who first demonstrated that mass is nothing more than stored energy, and although the theory behind that precise idea doesn't really apply to biology, the principle still stands; fat is your body's way of storing energy. It's also a vital body tissue, and not a 100% bad and evil thing to ingest, but if you want to lose it then the aim should simply be to ensure that your energy output, in the form of exercise, exceeds your energy input, in the form of food. The body's response to this is to use up some of its fat stores to replace the lost energy (although this process can take up to a week to run its full course; the body is a complicated thing), meaning that the amount of fat in/on your body will gradually decrease over time. Therefore, slimming down is a process best approached from two directions: restricting what's going in, and increasing what's going out (doing both at the same time is far more effective than an either/or approach). I'll deal with what's going in first.

The most important point to make about improving one's diet, and about weight loss generally, is that there are no cheats. There are no wonder pills that will shed 20lb of body fat in a week, and no super-foods or nutritional supplements that will slim you down in a matter of weeks. Losing weight is always going to be a messy business that will take several months at a minimum (the title of this post refers to the calorie content of body fat, meaning that to lose one pound you must expend 3500 more calories than you ingest over a given period of time), and unfortunately prevention is better than cure; but moping won't help anyone, so let's just gather our resolve and move on.
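
To put some very rough numbers on that (this is my own back-of-envelope arithmetic using only the 3500-calorie figure quoted above, not dietary advice), here is a minimal sketch of how a daily deficit translates into time:

```python
# Back-of-envelope arithmetic only: 3500 kcal of deficit per pound of fat,
# as quoted above. Real weight loss is messier (water weight, metabolism
# adapting, etc.), so treat this as illustration, not prediction.

CALORIES_PER_POUND = 3500

def weeks_to_lose(pounds, daily_deficit_kcal=500):
    """Estimate how many weeks it takes to lose the given weight."""
    total_deficit = pounds * CALORIES_PER_POUND
    days = total_deficit / daily_deficit_kcal
    return days / 7

# A sustained 500 kcal/day deficit works out at roughly a pound a week:
print(f"{weeks_to_lose(10):.0f} weeks to lose 10 lb")          # ~10 weeks
print(f"{weeks_to_lose(10, 250):.0f} weeks at 250 kcal/day")   # ~20 weeks
```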

There is currently a huge debate going on over whether the nation's diet problem is one of amount or of content; whether people are eating too much, or just the wrong stuff. In most cases it's probably going to be a mixture of the two, but I tend to favour the latter answer; and in any case, there's not much I can say about the former beyond 'eat less stuff'. I am not a good enough cook to offer any great advice on what foods you should or shouldn't be avoiding, particularly since the consensus appears to change every fortnight, so instead I will concentrate on the one solid piece of advice that I can champion: cook your own stuff.

This is a piece of advice that many people find hard to cope with- as I said in my last post, our body doesn't want to waste time cooking when it could be eating. When faced with the unknown product of one's efforts in an hour's time, versus the surety of a ready meal or fast food within five minutes, the latter option and all the crap that goes into it starts to seem a lot more attractive. The trick, therefore, is to learn how to cook quickly- the best meals should either take less than 10-15 minutes of actual effort to prepare and make, or be able to be made in large amounts and last for a week or more. Or, even better, both. Skilled chefs achieve this by having their skills honed to a fine art and working at a furious rate, but then again they're getting paid for it; for the layman, a better solution is to know the right dishes. I'm not going to include a full recipe list, but there are thousands online, and there is a skill to reading recipes; it can be easy to get lost between a long list of quantities and a complicated ordering of steps, but reading between the lines one can often identify which recipes really just mean 'chop it all up and chuck it in some water for half an hour'.

That's a very brief touch on the issue, but now I want to move on and look at energy going out: exercise. I personally would recommend sport, particularly team sport, as the most reliably fun way to get fit and enjoy oneself on a weekend- rugby has always done right by me. If you're looking in the right place, age shouldn't be an issue (I've seen a 50-year-old play alongside a 19-year-old student at a club rugby match near me), and neither should skill, so long as you are willing to give it a decent go; but sport's not for everyone and can present injury issues, so I'll also look elsewhere.

The traditional form of fat-burning exercise is jogging, but that's an idea to be taken with a large pinch of salt and caution. Regular joggers will lose weight, it's true, but jogging places an awful lot of stress on one's joints (swimming, cycling and rowing are all good forms of 'low-impact exercise' that avoid this issue), and it suffers the crowning flaw of being boring as hell. To me, anyway- it takes up a good chunk of time, during which one's mind is so filled with the thump of footfalls and aching limbs that one is forced to endure the experience rather than enjoy it. I'll put up with that for strength exercises, but not for weight loss when two far better techniques present themselves: intensity sessions and walking.

'Intensity sessions' is just a posh name for doing very, very tiring exercise for a short period of time; they're great for burning fat & building fitness, but I'll warn you now that they are not pleasant. As the name suggests, these involve very high-intensity exercise (as a general rule, you should not be able to hold a conversation during high-intensity work) performed either continuously or nearly continuously for relatively short periods of time- an 8-minute session a few times a week should be plenty. This exercise can take many forms; shuttle runs (sprinting back and forth as fast as possible between two marked points or lines), suicides (doing shuttle runs between one 'base' line and a number of different lines at different distances from the base, such that one's runs change in length after each set) and tabata sets (picking an easily repeatable exercise, such as squats, performing it as fast as possible for 20 seconds, followed by 10 seconds of rest, then another 20 seconds of exercise, and so on for 4-8 minutes) are just three examples. Effective though these are, it's difficult to find an area of empty space to perform them without getting awkward looks and the odd spot of abuse from passers-by or neighbours, so they may not be ideal for many people (tabata sets or other exercises such as press-ups are an exception, and can generally be done in a bedroom; Mark Lauren's excellent 'You Are Your Own Gym' is a great place to start for anyone interested in pursuing this route to lose weight & build muscle). This leaves us with one more option: walking.
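
If you want to try the tabata pattern described above without counting in your head, here is a minimal sketch of a 20-seconds-on, 10-seconds-off timer; the default exercise and session length are just placeholders of my own, not any official protocol.

```python
import time

def tabata(exercise="squats", work_s=20, rest_s=10, total_min=4):
    """Run a simple 20s-on / 10s-off tabata-style timer for the given total time."""
    rounds = (total_min * 60) // (work_s + rest_s)
    for i in range(1, rounds + 1):
        print(f"Round {i}/{rounds}: {exercise} for {work_s}s - go!")
        time.sleep(work_s)
        print(f"Rest for {rest_s}s")
        time.sleep(rest_s)
    print("Done - session complete.")

if __name__ == "__main__":
    tabata()  # 4 minutes of squats by default
```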

To my mind, if everyone ate properly and walked 10,000 steps per day, the scare stats behind the media's obesity fixation would disappear within a matter of months. 10,000 steps may seem a lot, and for many holding office jobs it may seem impossible, but walking is a wonderful form of exercise since it allows you to lose yourself in thought or music, whichever takes your fancy. Even if you don't have time for a separate walk, with a pedometer in hand (they are built into many modern iPods, and free pedometer apps are available for both iPhone and Android) and a target in mind (10k is the standard), after a couple of weeks it's not unusual to find yourself subtly changing the tiny aspects of your day (stairs instead of lift, that sort of thing) to try and hit your target; and the results will follow. As car ownership, an office economy and a lack of free time have all grown in the last few decades, we as a nation do not walk as much as we used to. It's high time that changed.

Big Pharma

The pharmaceutical industry is (some might say amazingly) the second largest on the planet, worth over 600 billion dollars in sales every year and acting as the force behind the cutting edge of research that continues to push medicine onwards as a field- and while we may never develop a cure for everything, you can be damn sure that the modern medical world will have given it a good shot. In fact the pharmaceutical industry is in quite an unusual position in this regard, forming the only part of the medical public service, and indeed of any major public service, that is privatised the world over.

The reason for this is quite simply one of practicality; the sheer amount of startup capital required to develop even one new drug, let alone to build a public service around this R&D, runs into the hundreds of millions of dollars, something that no government would be willing to set aside for so little immediate gain. All modern companies in the 'big pharma' demographic were formed many decades ago on the basis of a surprise cheap discovery or suchlike, and are now so big that they are the only ones capable of fronting such a large initial investment. There are a few organisations (the National Institutes of Health, the Royal Society, universities) who conduct such research outside the private sector, but they are small in number and are also very old institutions.

Many people, in a slightly different field, have voiced the opinion that people whose primary concern is profit are those we should least be putting in charge of our healthcare and wellbeing (although I'm not about to get into that argument now), and a similar argument has been raised concerning private pharmaceutical companies. However, that is not to say that a profit-driven approach is necessarily a bad thing for medicine, for without it many of the 'minor' drugs that have greatly improved the overall healthcare environment would not exist. I, for example, suffer from irritable bowel syndrome, a far from life-threatening but nonetheless annoying and inconvenient condition that has been greatly helped by a drug called mebeverine hydrochloride. If all medicine focused on the greater good of 'solving' life-threatening illnesses, a potentially futile task anyway, this drug would never have been developed and I would be even more at the mercy of my fragile digestive system. In the western world, motivated-by-profit makes a lot of sense when trying to make life just that bit more comfortable. Oh, and they also make the drugs that, y'know, save your life every time you're in hospital.

Now, normally at this point in any 'balanced argument/opinion piece' thing on this blog, I try to come up with another point to keep each side of the argument at a roughly equal 500 words. However, this time I'm going to break that rule and jump straight into the reverse argument. Why? Because I can genuinely think of no more good things to say about big pharma.

If I may just digress a little: in the UK and USA (I think, anyway) a drug's period of patent-protected exclusivity works out at roughly 10 years, on the basis that these little capsules can be very valuable things and it wouldn't do to let one company hang onto the sole rights to make them forever. This means that just about every really vital lifesaving drug in medical use today, given the time it takes for an experimental treatment to become commonplace, now exists outside its patent and is manufactured by either the lowest bidder or, in a surprisingly high number of cases, the health service itself (the UK, for instance, is currently trying to become self-sufficient in morphine poppies to prevent it from having to import from Afghanistan or wherever), so these costs are kept relatively low by market forces. It also means that during their exclusivity period, drug companies will do absolutely everything they can to extract cash from their product; when the antihistamine loratadine (another drug I use relatively regularly, for hay fever and the like) was passing through the last two years of its patent, its market price was quadrupled by the company making it; they had been trying to get the market hooked on it before jacking up the price to wring out as much cash as possible. This behaviour is far from untypical, and applies to a huge number of drugs, many of which deal with serious illness rather than being semi-irrelevant cures for the snuffles.

So far, so much normal corporate behaviour. Reaching this point, we must now turn to consider some practices of the big pharma industry that would make Rupert Murdoch think twice. Drug companies, for example, have a reputation for setting up price-fixing cartels, many of which have been worth several hundred million dollars. One, made up of what were technically food-supplement businesses (subsidiaries of pharmaceutical firms), went on to attract what was then the largest fine ever levied in criminal history, a record that still stands. And all this in an industry where the cost of physically producing the capsules themselves rarely exceeds a couple of pence each, hundreds of times less than their asking price.

"Oh, but they need to make heavy profits because of the cost of the R&D behind all their new drugs." Good point, well made, and it would be entirely valid if the numbers behind it actually stacked up. In the USA, the National Institutes of Health last year had a total budget of $23 billion, whilst all the drug companies in the US collectively spent $32 billion on R&D. This might seem at first glance like the private sector has won this particular moral battle; but remember that the American drug industry generated $289 billion in sales in 2006, and accounting for inflation (and the fact that pharmaceutical profits tend to stay high despite the current economic situation affecting other industries) we can approximate that only around 10% of company turnover is, on average, spent on R&D. Even accounting for manufacturing costs, salaries and such, a hefty chunk of that turnover goes into profit, making the pharmaceutical industry one of the most profitable on the planet.
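
As a quick sanity check on that percentage, using only the two figures quoted above (so rough arithmetic, nothing more):

```python
# Rough arithmetic using only the figures quoted above: collective US R&D
# spend versus US drug industry sales in 2006. Ignores inflation and
# everything else; it's just the back-of-envelope ratio.
rd_spend_bn = 32    # US drug companies' collective R&D spend, $bn
us_sales_bn = 289   # US drug industry sales in 2006, $bn

print(f"R&D as a share of turnover: {rd_spend_bn / us_sales_bn:.0%}")  # ~11%
```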

I know that health is an industry, I know money must be made, I know it’s all necessary for innovation. I also know that I promised not to go into my Views here. But a drug is not like an iPhone, or a pair of designer jeans; it’s the health of millions at stake, the lives of billions, and the quality of life of the whole world. It’s not something to be played around with and treated like some generic commodity with no value beyond a number. Profits might need to be made, but nobody said there had to be 12 figures of them.

Practical computing

This looks set to be my final post in this series about the history and functional mechanics of computers. Today I want to get onto the nuts & bolts of computer programming and interaction: the sort of thing you might learn as a budding amateur who wants to figure out how to mess around with these things and is interested in exactly how they work (bear in mind that I am not one of these people and am, therefore, likely to get quite a bit of this wrong). So, to summarise what I've said in the last two posts (and to fill in a couple of gaps): silicon chips are massive piles of tiny electronic switches; memory is stored in tiny circuits that are either off or on; this pattern of off and on can be used to represent information in memory; memory stores data and instructions for the CPU; the CPU has no ability to do anything by itself but automatically delegates, through the structure of its transistors, to the areas that do; and the arithmetic logic unit is a dumb counting machine used to do all the grunt work, which is also responsible, via the CPU, for telling the screen how to make the appropriate pretty pictures.

OK? Good, we can get on then.

Programming languages are a way of translating between the computer's medium of information and instruction (binary data) and our medium of the same: words and language. Obviously, computers do not understand that the keys we press on our keyboards have symbols on them, that these symbols mean something to us, or that the machine is built to produce the same symbols on the monitor when we press them; but we humans do, and that is what makes computers actually usable for 99.99% of the world's population. When a programmer brings up an appropriate program and starts typing instructions into it, at the time of typing those words mean absolutely nothing to the computer. The key thing is what happens when the code is committed to memory, for this is where the programming language's own software kicks in.

The key feature that defines a programming language is not the language itself, but the interface that converts words to instructions. Built into the workings of each is a list of recognised 'words', each with a corresponding, but entirely different, string of binary data associated with it that represents the appropriate set of 'ons and offs' that will get a computer to perform the correct task. This works in one of two ways: an 'interpreter' is a system whereby the program is stored just as words and is converted to 'machine code' piece by piece as it is accessed from memory, but the more common approach is to use a compiler. This basically means that once you have finished writing your program, you hit a button to tell the computer to 'compile' your written code into an executable program in data form. This allows you to delete the written file afterwards, makes programs run faster, and gives programmers an excuse to bum around all the time (I refer you here).
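
To make the 'list of words, each with an action attached' idea a bit more concrete, here is a toy sketch of the principle; the 'words' ADD, SUB and MUL and the whole mini-language are invented for illustration, and no real interpreter or compiler is anywhere near this simple.

```python
# A toy illustration of the lookup-table idea described above: each word the
# programmer types maps to a low-level operation the machine already knows
# how to perform. Purely a sketch, with made-up 'words'.

OPERATIONS = {
    "ADD": lambda a, b: a + b,
    "SUB": lambda a, b: a - b,
    "MUL": lambda a, b: a * b,
}

def interpret(line):
    """Translate one line of 'source code' (e.g. 'ADD 2 3') into an action."""
    word, x, y = line.split()
    return OPERATIONS[word](int(x), int(y))

print(interpret("ADD 2 3"))   # 5
print(interpret("MUL 4 10"))  # 40
```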

That is, basically, how computer programs work- but there is one last key feature in the workings of a modern computer, one that has divided nerds and laymen alike across the years and decades and to this day provokes furious debate: the operating system.

An OS, something like Windows (Microsoft), OS X (Apple) or Linux (nerds), is basically the software that enables the CPU to do its job of managing processes and applications. Think of it this way: whilst the CPU might put two inputs through a logic gate and send an output to a program, it is the operating system that sets things up to determine exactly which gate the inputs go through and exactly how that program will execute. Operating systems are written onto the hard drive, and could, theoretically, be written using nothing more than a magnetised needle, a lot of time and a plethora of expertise to flip the magnetically charged 'bits' on the hard disk. They consist of many different parts, but the key feature of all of them is the kernel, the part that manages the memory, optimises the CPU's performance and translates programs from memory to screen. The precise method by which this last function happens differs from OS to OS, which is why a program written for Windows won't work on a Mac, and why Android (Linux-powered) smartphones couldn't run iPhone (iOS) apps even if they could access the store. It is also the cause of all the debate between advocates of different operating systems, since different methods prioritise, and are better at dealing with, different things, work with varying degrees of efficiency and are more or less vulnerable to virus attack.

However, perhaps the most vital things that modern OSs do on our home computers are the things that at first glance seem secondary: moving stuff around and scheduling. A CPU cannot process more than one task at once, meaning that it should not, in theory, be possible for a computer to multi-task; the sheer concept of playing minesweeper whilst waiting for the rest of the computer to boot up and sort itself out would be just too outlandish for words. However, a clever piece of software in each OS called a scheduler switches from process to process very rapidly (remember computers run so fast that they can count to a billion, one by one, in under a second) to give the impression of everything happening simultaneously. Similarly, the kernel will allocate areas of empty memory for a given program to store its temporary information and run in, but may also shift some rarely-accessed memory from RAM (where it is quickly accessible) to the hard disk (where it isn't) to free up more space (this is how computers with very little free memory manage to run programs at all, and the time taken to do this for large amounts of data is why they run so slowly), and must cope when a program needs to access data from another part of the computer that has not been specifically allocated to that program.
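
To illustrate the 'switch rapidly between processes' trick described above, here is a toy round-robin scheduler; the process names and work units are made up, and real OS schedulers juggle priorities, interrupts and hardware state, so this is only a sketch of the idea.

```python
# A toy round-robin scheduler: each fake 'process' gets a small slice of
# work in turn until all of them have finished, giving the illusion that
# they are all running at once.

from collections import deque

def round_robin(processes, slice_units=2):
    """processes: dict of name -> units of work remaining."""
    queue = deque(processes.items())
    while queue:
        name, remaining = queue.popleft()
        done = min(slice_units, remaining)
        print(f"running {name} for {done} unit(s)")
        remaining -= done
        if remaining > 0:
            queue.append((name, remaining))  # not finished: back of the queue
        else:
            print(f"{name} finished")

round_robin({"minesweeper": 3, "boot tasks": 5, "music player": 2})
```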

If I knew what I was talking about, I could witter on all day about the functioning of operating systems and the vast array of headache-causing practicalities and features that any OS programmer must consider, but I don’t and as such won’t. Instead, I will simply sit back, pat myself on the back for having actually got around to researching and (after a fashion) understanding all this, and marvel at what strange, confusing, brilliant inventions computers are.