Computers have done mankind a massive disservice.
That’s not to say they haven’t also done some amazing things, so let’s dismiss in a paragraph anyone who would read a silent ‘only’ into that intentionally controversial sentiment. They’ve changed the face of society, transformed our economies and revolutionised science in every field. They have assisted in the development of other technologies. And, most trite of all, you’re reading this article on the Internet, a huge all-permeating network of the damn things. Computers are amazing. Wow.
The history of computer science is a sufficiently short one that everyone’s dad has an amusing ancient computer anecdote. Perhaps your father was the proud owner of a room-sized computer with 64 K of memory, or maybe he had a mate down in accounts who used to program on punch-cards. Hell, I’ve got some mildly amusing computer anecdotes; my thousand-pound cutting-edge Windows 3.11/gaming/multimedia monster of the mid-nineties was a 486-66. That means it had a 66 MHz processor to go with its 8 MB of RAM. My current computer has 4 GB of RAM to go with its dual 2.2 GHz processors. Oh, and did I mention that it was cheaper than the 486? Ha ha. A conversation about the computer technology of yesteryear is always bound to raise a smirk. Isn’t progress entertaining?
It’s entertaining because it’s exponential. Computer technology has (give or take) been following Moore’s law since Gordon Moore set it down in 1965; in its popular form, the law predicts that computing power will double every eighteen months. This means you only have to go back a few years for computers to be four, eight or sixteen times slower than modern ones, and going all the way back to when your dad was young would see performance drops of thousands or millions of times. Old stuff which is millions of times more rubbish than current stuff is funny.
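If you want to see the arithmetic, here it is as a quick sketch in Python. The eighteen-month figure is the popular rule of thumb rather than Moore’s original statement, and the function name is purely illustrative:

```python
# A sketch of the Moore's-law arithmetic, assuming the popular
# figure of one performance doubling every eighteen months.
def moores_law_speedup(years):
    """Relative computing power after a given number of years."""
    doublings = years * 12 / 18  # one doubling per 18 months
    return 2 ** doublings

for years in (3, 6, 15, 40):
    print(f"{years:>2} years -> {moores_law_speedup(years):,.0f}x faster")
# 3 years -> 4x, 6 years -> 16x, 15 years -> ~1,000x,
# 40 years -> ~100,000,000x: hence dad's computer being
# millions of times more rubbish than yours.
```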
However, the law has been largely confined to a very specific niche of technology (computation), and it’s been a revolution partially in quality, but mostly in size. The fact is that much of the last half-century of technological change has been attributable to one component: the silicon transistor. But silicon transistors have presided over this revolution not by getting ‘better’; mainly, they’ve been getting smaller. And because the components are smaller, each one needs less ultra-pure silicon, so everything gets cheaper and cheaper to boot. It’s also fairly simple as technological challenges go, because the fundamental physics of transistors is the same as when the first one was made in 1947; only the tools we use to make them have become more precise, taking us from soldering irons to deep-ultraviolet photolithography.
There’s simply nowhere else where everyday technology has been so radically transformed, and almost every transformation outside of computing is really just computers in disguise: the circuits in your mobile, for example, can transmit and receive microwaves yet slip into your pocket only because of shrunken silicon transistors.
The massive disservice for which computers are responsible is to imbue a lot of people with false expectations of the pace of technological change. Science is thought to be a panacea for all manner of social problems because of the unprecedented and wide-ranging triumph of the silicon microchip. The fact is that other areas of science and technology haven’t just failed to keep up: in most cases keeping pace would break the laws of physics.
The efficiency of a modern internal combustion engine is somewhere in the vicinity of 25%: for every horsepower with which your sporty convertible propels you down the road, three are lost as heat, light and the reassuring purr of its well-tuned V6. Even if someone came up with a design which swept aside the last century of automotive engineering and achieved 100% efficiency, it would only be four times better, a mere thirty-six months of progress under Moore’s law. And you can never get more than 100% efficiency, because the laws of thermodynamics trump Moore’s law: even Moore can’t beat the Universe. Even at 100% efficiency, you still need to burn a certain amount of petrol to get yourself to a certain speed and keep yourself there, albeit a quarter of what a contemporary car burns.
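Spelled out in code, the sums look like this; the 25% figure comes from the paragraph above, and the rest is arithmetic:

```python
import math

# The engine arithmetic from above: a 25%-efficient engine can be
# improved at most fourfold before thermodynamics calls time.
current_efficiency = 0.25  # roughly a modern petrol engine
ceiling = 1.00             # hard thermodynamic limit

best_possible_gain = ceiling / current_efficiency  # 4x at most
moore_months = math.log2(best_possible_gain) * 18  # doublings * 18 months
print(f"At best {best_possible_gain:.0f}x better, worth "
      f"{moore_months:.0f} months of Moore's-law progress")
# -> At best 4x better, worth 36 months of Moore's-law progress
```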
In reality, it’s going to be a lot more than thirty-six months before we see 100% efficient petrol engines. Their design is getting better incrementally rather than exponentially; we’ll be lucky to break 50% in production models any time soon—or possibly ever.
Another everyday example is the humble oven: think about roasting a chicken. It might take an hour or two to get your fowl crispy on the outside and well-cooked in the middle, and it has taken about that long ever since man first tamed fire, thousands of years before Christ. The problem is that it’s physically impossible to cook a chicken any quicker. As any impatient cook knows, turning up the thermostat gives you a chicken black on the outside, and moist, pink and salmonella-riddled in the middle. Increase the amount of power you’re throwing at the chicken and you don’t give the heat time to conduct through to the middle, so all that power ends up being absorbed by the top few centimetres, which duly burn.
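To put a rough number on ‘physically impossible’: the time for heat to conduct to the centre of a roast follows a standard diffusion scaling, and that timescale is set by the meat, not by the dial. The figures in this sketch are assumed, ballpark values rather than measurements:

```python
import math

# Order-of-magnitude conduction time to the centre of the bird,
# using the textbook diffusion scaling t ~ L^2 / (pi^2 * alpha).
# Both numbers below are rough, assumed values.
alpha = 1.4e-7  # thermal diffusivity of meat, m^2/s (approximate)
L = 0.05        # half-thickness of the chicken, m (assumes ~10 cm across)

t_seconds = L**2 / (math.pi**2 * alpha)
print(f"~{t_seconds / 60:.0f} minutes for heat to reach the middle")
# -> tens of minutes, whatever the thermostat says: extra power only
#    raises the surface temperature, which is why the outside burns.
```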
You could try throwing radiation of a different wavelength at your chicken. Enter the microwave oven: microwaves have a much longer wavelength than the infrared in a conventional oven, and so they penetrate much deeper into your chicken and heat it throughout rather than from the surface inwards. The two problems with this technique are that the heating is very inhomogeneous, which is why you always have to stir things you’re heating up in the microwave, and that the surface never gets hot and dry enough to brown, which leaves the bird cooked, but not delicious and crispy like a good oven-roasted one.
These are not technological restrictions: these are the laws of physics. We can make bigger ovens, more powerful ovens or microwave ovens, but all we’ll end up with is burnt, burnt or soggy chicken, respectively.
The other reason microwaves are a good example in this context is that they were not the result of an incremental scientific advance in the way microchip technology has been. Your microwave oven works in a totally different way to your conventional oven (it’s more like a mobile ’phone than an electric heater) and was developed from a serendipitous discovery, not by making normal oven components smaller and more intricate. No incremental advance in oven technology will allow us to cook a chicken twice as fast, let alone double the speed every eighteen months.
However, you and I do not expect our cars to be twice as efficient next year, nor our ovens to cook chickens in half the time, because we’ve seen enough technological progress (or lack thereof) to know that this simply isn’t the way these industries work. The false expectations bestowed upon the public by the computing revolution extend only to spheres where we have not been jaded by continual failure to deliver, and they are most acute in technologies which simply don’t exist yet.
One area is the public perception of technological solutions to climate change. There is a significant subset of the population whose justification for apathy and inaction on global warming is that technology will come to our rescue. For example, ‘all we need is a way to suck carbon dioxide out of the atmosphere and turn it into carbon and oxygen,’ the optimists explain, ‘and look at how technology has marched forth of late, beyond the wildest predictions of even our most optimistic forefathers!’ (The infuriating recursion is that the optimists tend to believe that, no matter how wildly optimistic they are being, it’s probably not optimistic enough because their imagination is limited by the restricted mindset of current technology.)
The problem is that there are physical and chemical limits to the efficiency of this process. Turning CO2 into C and O2 requires exactly as much energy as was released by burning C in O2 to make the CO2 in the first place. The difficulty is that neither process comes anywhere near 100% efficiency (because of the heat, light and sound produced along the way), so recapturing the carbon dioxide from the atmosphere costs far more energy than you ever got by burning the fossil fuel. You don’t need to be an economist or an engineer to wonder whether it might be cheaper simply not to emit the carbon dioxide in the first place, by making electric cars powered by solar cells, rather than building a hundred solar cells for every petrol-driven car in order that we can power an atmospheric decarbonising plant. There’s no sign of exponential change in the pipeline, and there are physical limits on how far exponential change could take us anyway: you can never be more than 100% efficient.
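Here is that bookkeeping as a sketch; the 394 kJ/mol figure for carbon combustion is textbook chemistry, but both efficiency figures are invented purely for illustration:

```python
# Hess's-law bookkeeping for the atmospheric-scrubbing idea.
# Burning C in O2 releases ~394 kJ per mole of CO2, so splitting
# that CO2 back apart needs at least the same 394 kJ/mol.
# The two efficiencies below are illustrative assumptions.
combustion_enthalpy = 394  # kJ per mole of CO2, approximate
burn_efficiency = 0.35     # fraction captured as useful work
split_efficiency = 0.50    # fraction of input that does the splitting

energy_gained = combustion_enthalpy * burn_efficiency    # ~138 kJ
energy_to_undo = combustion_enthalpy / split_efficiency  # ~788 kJ
ratio = energy_to_undo / energy_gained
print(f"Gain {energy_gained:.0f} kJ, spend {energy_to_undo:.0f} kJ "
      f"to undo it ({ratio:.1f}x what you gained)")
# -> the scrubbing plant consumes several times more energy than
#    burning the carbon ever produced.
```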
The alternative to holding out for a technological revolution which may never come is to act now. The best science and economics tell us that it would be cheaper to jump on the bandwagon of low-carbon technology today than to rely on magic atmospheric scrubbing plants later. We have the know-how to start reducing our global carbon footprint. We should of course continue to invest heavily in research and, some day, some egghead may find the silver bullet. I really hope that they do, but I don’t want to be the one to bet the future of our species on it.