But there was a second and crucial aspect to this creation of proprietary culture – it enabled more products to exist. The motion picture, as you all are very well aware, is the salesman’s friend. In the shared imaginative experience of the dark room, all feeling together the same fantasy, we pass as primates into a very receptive state for the forming of affiliative bonds to bright shiny little objects. Whether those are cigarettes, or whisky, or automobiles, or diamonds, we find ourselves seeking the products that the motion picture offers to us as part of that shared communal activity of learning what we want.
Now, I don’t want to hinge only on the motion picture as an engine of want creation in the 20th century. But anybody who lives on planet Earth at the present day knows what the descendant invention, television, did in the creation of needs for products in the world. So that in a short period of time, amounting to less than a generation and a half, a man in Georgia in the United States was able to turn the temporary consumption of a rather spicily flavoured carbonated drink into “being together”. And all the world became a place where everybody wanted to share a Coke. In other words, proprietary culture grew out of a particular form of technological deployment, centered around the concept of making experiences of community into property, in the form of products that could be alienated and consumed at a distance. And this system was self-reinforcing, because once that technology was deployed, it created what we would now call a platform for the deployment of many more products. And the 20th century became, of all the periods in the history of human beings, the century of the product.
It is over.
So, why is it over, this century of the product? Because with the same ironic relentlessness that brought the ruling class of the world to the century of the product, the engine of invention has continued running. And the inventiveness of humans transformed the product back into a service. The short career of computer software in going from service to product to service again resulted from the fact that software was in a way the last of the 20th century’s inventions – it went the whole course briefly up and down again, as a little vignette, transforming in the smallest possible time all the forces back into their raw materials. So the story I am telling you is one in which, if you like to think in terms of scale, the small-scale illustration of the large-scale historical process now wearing away at proprietary culture is the brief history of unfree software. That’s the structure of what I’m talking about. Just in case you were wondering.
But I’m still going to try to get there via the historian’s route, that is, forward from the beginning, rather than backward from the end. The product-driven century that was the 20th century, the American century, the one in which the American empire came to dominate the globe with the economy of its products, demanded larger and larger entities operating at higher and higher levels of efficiency. It became clear that there were two fundamental industries in the world, both of them fed by steel. One was automobiles and the other was war. Both consumed gasoline, and petroleum became the single central defining commodity of the moment. Petroleum, in the end, produced the steel and made it move. And out of that petrochemical stew, that was steel plus gunpowder, we found our way to world empires concerned with the possession of oil reserves. And it became characteristic of the greatest empire in the history of the planet that it used automobile executives as its Secretaries of War, now called Defense. And whether it was Charley Wilson of General Motors or Robert McNamara of Ford, the process of moving the American war machine around the planet was essentially a process to be governed by an efficiency expert who had run an automobile company. And what had he learned there? Data is everything.
The 20th century management revolution in the United States, which with the products went around the world, was a revolution in management information. The doctrine that McNamara and his bright whiz kid boys – whether at Ford or at the Pentagon – best embodied was the doctrine that the enlightened manager has more data and analyzes it more thoroughly. And so the very economic processes of product formation and distribution in the great American century cried out for data. And the manufacturers of time clocks and telephones responded. IBM and AT&T built more and more and more data-handling machinery. AT&T to consume it internally, to enable the telephone network to embrace the whole economy, and IBM to allow those industrial institutions to understand better who they were and what they did. The consequence was the development of expensive digital analytic technology, which grew out of the Second World War and which was maturing by the early 1960s, as McNamara moved from Ford to the Pentagon, bringing COBOL with him.
The forces that moved the American economy – and therefore the world economy – forward became dependent upon high quality digital processing hardware, and the manufacturers of that hardware used software to differentiate their products. Nobody could speak in useful terms, not really, of the performance-related aspects of hardware engineering in the era of magnetic core. Everything operated with what we would now consider painful slowness. The earliest computer I ever owned – though not the first I used – was an IBM 1401 with magnetic core memory, 4K, but I had an extra 4K embodied in a thing twice the size of a refrigerator, consisting of copper wires with small toroidal magnets at the intersection points of each of the wires in the warp. Can you imagine, then, the idea of billboards explaining the performance of my core as against Honeywell core?
Nobody could have stood for it – it was the software that did the work, that differentiated the products: did IBM software do a better job than Honeywell software? But nobody hid the software; you couldn’t do it. Users had to have it, they had to see it. They had to go through it. And when there was something wrong, they were the ones who found it, and they were the ones who suggested how to fix it. User innovation was the lifeblood of this industry, because it was where products had always come from.
So software was the embodied technical knowledge about how to use highly sophisticated tools, resident in the hands of the engineers who made the products, that made the century, that produced the culture. This is the house that Jack built. Now I am going to change its foundation.
The grave problem at IBM in the 1970s – I saw it, I lived through it, those who were there too will also remember it – the grave problem of IBM in the palmiest days of the monopoly was how to avoid inventing hardware that would eat the product line. I worked on experimental hardware at IBM in the late 1970s and early 1980s that was very good hardware. Stunningly fast, innovatively architected, capable of things that nothing less than a $12 million computer could do, and capable of being produced for $10,000, which is why we knew, even as we wrote the software for it, that IBM wasn’t going to build and sell it. There was no point in eliminating $12 million machines with $10,000 machines if you were a hardware monopoly.
Of course, the hardware monopoly didn’t last. And the result was that products got better very fast. On the day I went to work at the IBM Santa Teresa laboratory in the summer of 1979, 330 professional mainframe programmers worked in a laboratory containing more than 20 370/168s or equivalent mainframes – one of the 12 largest data centers inside IBM, one of the 20 largest data centers in the world. There were hectares of IBM 3330 and 3350 disk drives, things with 8-inch platters that you lifted out of the end of a long stretched arm. Hectares of those drives. I have a spreadsheet from the day I joined, given to me as part of my “Welcome to the laboratory, see what wonders we make here” presentation, and I winkled it out of an old box some years ago for something I was writing. In the first week of July, 1979, that laboratory had a stunning total disk drive capacity of 29 gigabytes. And we thought that that was just beyond amazement. 29 GB, as you all know, is now 9.8 mm high, 2.5 inches wide, and costs approximately $30, bought through the net.
In other words, the hardware engineers in the generation after the collapse of the hardware monopoly did wonders. They changed the scale of the performance of their products by orders of magnitude. We have rocket ships that run on air, that travel at three quarters of the speed of light without breaking a sweat. We have hardware we can hardly imagine, it’s so good. And the software we put on that hardware is not much better than the software we wrote in the 1970s – arguably, if you are using Microsoft software, substantially worse. It’s hard to explain the complete and total divergence of the quality course of hardware and software on the theory that the people who make hardware are really, really smart and the people who make software are really, really stupid. Because the people who make software are obviously not really, really stupid; there just aren’t many brains on the planet allowed to make the software most people use. Which results in a very untenable structure for the production of software, as a result of which quality is low. These are simple engineering propositions, but the economics of it wasn’t simple.
Largely because IBM itself set in motion the demonopolization of the hardware industry. How did that happen? Well, that happened because, after 16 years of bruising, economically and legally destructive antitrust litigation with the United States Government and the European Union, IBM created a tiny little break-all-the-rules project to build a cheap personal computer out of in-stock available parts, and succeeded wildly. I remember, as a kid lawyer working at IBM in the summer of 1983, when a large insurance company in Hartford for the first time asked to buy 12,000 IBM PCs in a single order. And the guys who served that account, one of the 100 largest accounts at IBM, didn’t want to fill the order, because they thought that selling 12,000 personal computers was beneath contempt. They needed that company to buy more 3084 cubes, and they didn’t want to waste any time moving 12,000 units of something that looked to them like a typewriter. So, part of the reason that the whole thing happened was that the very people who sold the hardware didn’t understand what it was they were selling, and didn’t understand that the time had actually come when a cheap product was going to eat the product line.
All that great hardware that had been made back in the days of corporate strength had gone on the shelf because it would have destroyed too much profitable existing market – there wasn’t anybody who understood that about the IBM PC, and they let it happen. And not only did it happen, but it happened in a way which allowed hardware manufacturers around the world to go crazy making it slightly better. And it allowed Intel to do the work of trying to eat IBM’s lunch, for the first time, by improving the chips and shifting the value inside. And everything would have been just great, except that the IBM lawyers bent over backward not to remonopolize the industry, and they allowed the software owner to own the software.
Next thing you knew, you had a wildly profitable system going on in which the dominant software contributor was a guy who made no hardware. He had no product except the secret of his source code. And so we were well on our way to the moment crystallized by Microsoft CTO Craig Mundie in a speech in Brazil in the summer of 2004, in which Mr. Mundie memorably said, “The one great principle of the software industry is: Never show anybody the source code to anything”. This was an inevitable economic consequence of the one thing that had happened, which was: they made no hardware, all they had was software, the software was the source code, the source code was the recipe for the product, the product – like Coke – had to be a secret in order to be a product, therefore the source code had to be a secret, and therefore, when you get right down to it, the software had to be bad. This was an inescapable conclusion. It is remarkably difficult, as all of you who have built any software know, it is remarkably difficult to make software with your eyes closed. If you’re not allowed to see it, if you’re not allowed to build it, if you’re not allowed to debug it, if you’re not allowed to get your hands dirty in it, it’s really, really hard to do.
I tried to estimate, two years ago, what proportion of the people who work on Windows actually have the authority to build Windows. I can’t answer the question because I don’t have access to trade secret information, but I can set a reasonable upper bound. It is certain that no more than 2% of the people who work on Windows can build Windows. That means 98% of the people who work on the product can’t build the product. That means, as far as I’m concerned, that it’s extremely surprising that it runs at all, and they probably deserve a great deal of credit that it does, because it’s really, really hard to build software when you can’t actually build the software you’re working on, and everything has to be done under the assumption that you know what’s going on before you turn the key and run the smoke test. This also tends to explain why the software is so terrible the moment it comes out the door. Because there are a remarkable number of joints that have been tested but never actually worked on by anybody whose eyes weren’t welded shut during the course of the work.
This process of turning software into a product was a misbegotten idea from the beginning, but it acquired $400 billion worth of business around the world. It taxed everybody. It turned the guy who thought it up into the richest man on the planet. And it therefore became religious doctrine that everything about it was right, and that it was responsible for innovation. This is a grand historical irony only to those of us who care about software, which is a very tiny proportion of humanity. To the rest it will never even be history, it’s too small a detail for most people to understand. Yet it’s our lives. And because it’s our lives we can also explain what it has to do with what happens to the rest of the human race for the next 100 years.