Once upon a time, there were great businesses that did great science - with massive central labs pursuing both product development and basic research. Today, most corporate labs have refocused on applied research. Benoit B Mandelbrot says they do not know what they are missing.
Business follows fashion. Fifty years ago, it was the fashion for big technology companies to do big science. Today, the fashion is towards incremental improvements and practical product development. Investors like the change; they want "value" from the lab, not Nobel prizes. But they are short-sighted. They quite literally do not know what they are missing.
I started my career as an industrial scientist in 1958. IBM seemed, at that time, a strange place for any serious scientist to work. It was a prosaic manufacturer of tabulating machines, not a high-class university laboratory. But it was retooling itself at speed into what would later become the world's largest computer company.
For that, it knew it needed science. Computers were new, and their uses unimagined. It needed lots of practical, applied researchers - just as do big companies today. But it also needed a number of great scientists to do the unexpected and inspire the rest. So with extravagant promises of freedom and competitive pay, it attracted an odd collection of brilliant misfits who pursued every imaginable topic - most related to computers, but some not obviously so.
Too many zeroes
I was at that time a young assistant math professor in war-ravaged France, a misfit in that system. Along came IBM in 1958 with what, to me, was a humungous amount of money - in fact, I think that for the first summer I spent there they wrote down one more zero than they had intended (though that was later corrected). They wanted to design a machine translation system; and my earlier research on the mathematics of linguistics had attracted their attention, as had my postdoctoral apprenticeship to John von Neumann.
But by the time they offered me the job, I had already left that field and was doing other work, in economics. I told them. "Well, that's marvellous," came the reply, "we're interested in almost everything now." It was an age of new computers, new applications, new prospects (and for those who failed at research, there were thousands of openings in development and manufacturing).
The lab's history since then is well known. Five of its employees won Nobel prizes. Its achievements ranged across all fields of electronics and computers: inventing magnetic disk storage, the standard "dynamic" computer memory chip, the relational database, the FORTRAN programming language, the RISC computer architecture, and much more.
But there was also much work that was, at first sight, only tangentially useful to a computer company (but led to patents that continue to bring income today) - such as the invention of the scanning tunnelling microscope to "see" individual atoms, or the discovery of new high-temperature superconductors.
A Christmas weekend
Or my own work. To some, fractals at first seemed a scientific curiosity. They are a way of looking at the world - of analysing patterns, of measuring how regular or irregular something is. The field's development was possible only with the help of computers: in the 1960s and 1970s I devoted many nights - even one Christmas weekend - to getting a few precious hours on the company's biggest computers.
The results seemed odd to some. In fact, one computer-lab technician threw out some of my early print-outs - intentional, and successful, computer imitations of ink blots; he assumed they were mere "mistakes." But fractals have since proved central to computing. They are used today to compress digital files, to analyse Internet traffic, and to find structure in data of all kinds.
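To make "measuring how irregular something is" concrete, here is a minimal sketch - my illustration, not anything from the IBM archives - in Python: estimating the box-counting dimension of the middle-thirds Cantor set. Covering the set with boxes of width 3^-j takes 2^j boxes, so log(boxes)/log(1/width) settles at log 2/log 3 ≈ 0.631, a "dimension" strictly between that of a point (0) and a line (1).

```python
from math import log

def cantor_points(depth):
    """Cantor-set points at resolution 3**-depth, kept as exact integers
    in [0, 3**depth): base-3 numerals using only the digits 0 and 2."""
    pts = [0]
    for _ in range(depth):
        pts = [3 * p for p in pts] + [3 * p + 2 for p in pts]
    return pts

def box_count(pts, depth, j):
    """How many boxes of width 3**-j are needed to cover the points."""
    return len({p // 3 ** (depth - j) for p in pts})

pts = cantor_points(12)                  # 2**12 = 4096 points
for j in (3, 6, 9):
    n = box_count(pts, 12, j)
    print(f"scale 3^-{j}: {n:4d} boxes, dimension ≈ {log(n) / log(3 ** j):.3f}")
# Each line prints ≈ 0.631, matching log 2 / log 3: the set is too sparse
# to be a line, yet too rich to be a scattering of points.
```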
But there was much more. In the early days of computer communications, the company's engineers were wrestling with a technical problem. As the computer signals went down a phone line, crackles and popping noises could interrupt them and cause transmission errors. How to fix that? Perhaps make the signal stronger, at great expense? Or spend more on the quality of the phone lines and junction boxes?
A friend of mine, who was leading the research, mentioned it to me - and by the chance of my own background, I recognised the "popping" and argued that it was random noise on the line, inherent in the physics of the situation. You cannot engineer that out of existence, no matter how much you spend on the problem. You have to live with it, by getting the computer to correct the errors itself - that is, you need error-correction algorithms in the software rather than noise suppression in the hardware. Simple logic, perhaps - but it saved IBM (and later AT&T) a great deal of money that would otherwise have been poured into the wrong investment.
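What does "getting the computer to correct the errors itself" look like? No particular scheme is named here, so the sketch below is only an illustration: a Hamming(7,4) code in Python, which spends three parity bits per four data bits to locate and repair any single flipped bit in a seven-bit block.

```python
def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4              # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4              # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Locate and fix a single-bit error, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 1-based error position, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1           # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

# A burst of line noise flips one bit; the receiver recovers the data anyway.
word = encode([1, 0, 1, 1])
word[4] ^= 1                           # simulate a "pop" on the phone line
assert decode(word) == [1, 0, 1, 1]
```

The trade-off is exactly the one argued for above: rather than paying to make the channel quieter, you pay a little redundancy in software and let the receiving end undo the damage.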
Some today argue that my kind of science belongs in a public university, not a corporate laboratory. In fact, none of my discoveries could have been made in the formal atmosphere of a university, with its peer pressures and incessant demands to publish and teach. The corporate lab provides a unique interplay between the practical and the theoretical - an interplay that benefits both society and the shareholders who pay for it.