Abstract
Science, technology and innovation each represent a successively larger category of activities which are highly interdependent but distinct. Science contributes to technology in at least six ways: (1) new knowledge which serves as a direct source of ideas for new technological possibilities; (2) source of tools and techniques for more efficient engineering design and a knowledge base for evaluation of feasibility of designs; (3) research instrumentation, laboratory techniques and analytical methods used in research that eventually find their way into design or industrial practices, often through intermediate disciplines; (4) practice of research as a source for development and assimilation of new human skills and capabilities eventually useful for technology; (5) creation of a knowledge base that becomes increasingly important in the assessment of technology in terms of its wider social and environmental impacts; (6) knowledge base that enables more efficient strategies of applied research, development, and refinement of new technologies.
The converse impact of technology on science is of at least equal importance: (1) through providing a fertile source of novel scientific questions and thereby also helping to justify the allocation of resources needed to address these questions in an efficient and timely manner, extending the agenda of science; (2) as a source of otherwise unavailable instrumentation and techniques needed to address novel and more difficult scientific questions more efficiently.
1: Introduction
Much public debate about science and technology policy has been implicitly dominated by a ‘pipeline’ model of the innovation process in which new technological ideas emerge as a result of new discoveries in science and move through a progression from applied research, design, manufacturing and, finally, commercialization and marketing. This model seemed to correspond with some of the most visible success stories of World War II, such as the atomic bomb, radar, and the proximity fuze, and appeared to be further exemplified by developments such as the transistor, the laser, the computer, and, most recently, the nascent biotechnology industry arising out of the discovery of recombinant DNA techniques. The model was also, perhaps inadvertently, legitimated by the influential Bush report, Science, the Endless Frontier, which over time came to be interpreted as saying that if the nation supported scientists to carry out research according to their own sense of what was important and interesting, technologies useful to health, national security, and the economy would follow almost automatically once the potential opportunities opened up by new scientific discoveries became widely known to the military, the health professions, and the private entrepreneurs operating in the national economy. (See United States Office of Scientific Research and Development (1945); for a recent account of the political context and general intellectual climate in which this report originated, see Frederickson, 1993.) The body of research knowledge was thought of as a kind of intellectual bank account on which society as a whole would be able to draw almost automatically as required to fulfil its aspirations and needs.
Though most knowledgeable people understood that such a model corresponded only to the rare and exceptional cases cited above, it became embodied in political rhetoric and took considerable hold on the public imagination. Seemingly confirmed by a sufficient number of dramatic anecdotes, it came to be regarded as typical of the entire process of technological innovation, though it was severely criticized by many scholars. (See Kline and Rosenberg (1986) for an example of criticism and an excellent discussion of a more realistic and typical model.) One consequence was considerable confusion in the public mind between science and engineering, an excessive preoccupation with technical originality and priority of conception as not only necessary but sufficient conditions for successful technological innovation, and, in fact, an equating of organized research and development (R&D) with the innovation process itself. The ratio of national R&D expenditures to gross domestic product (GDP) often became a surrogate measure of national technological performance and, ultimately, of long-term national economic potential. The content of R&D was treated as a ‘black box’ that yielded benefits almost independently of what was inside it (Brooks, 1993, pp. 30-31).
The public may be forgiven its confusions, as indeed the relationships between science and technology are very complex, though interactive, and are often different in different fields and at different phases of a technological ‘life cycle’. Nelson (1992) has given a definition of technology both as “. . . specific designs and practices” and as “generic knowledge . . . that provides understanding of how [and why] things work . . .” and what are the most promising approaches to further advances, including “. . . the nature of currently binding constraints.” It is important here to note that technology is not just things, but also embodies
a degree of generic understanding, which makes it seem more like science, and yet it is understanding that relates to a specific artifact, which distinguishes it from normal scientific understanding, although there may be a close correspondence.
Similarly, Nelson (1992, p. 349) defines innovation as “. . . the processes by which firms master and get into practice product designs that are new to them, whether or not they are new to the universe, or even to the nation.” The current US mental model of innovation often places excessive emphasis on originality in the sense of newness to the universe as opposed to newness in context. In general, the activities and investments associated with ‘technological leadership’ in the sense of absolute originality differ much less than is generally assumed from those associated with simply staying near the forefront of best national or world practice. Yet R&D is also necessary for learning about technology even when it is not ‘new to the universe’ but only in the particular context in which it is being used for the first time (Brooks, 1991, pp. 20-25).
Innovation is more than R&D
However, innovation involves much more than R&D. Charpie (1967) has provided a representative
allocation of effort that goes into the introduction of a new product, as follows:
- (a) conception, primarily knowledge generation (research, advanced development, basic invention), 5-10%;
- (b) product design and engineering, 10-20%;
- (c) getting ready for manufacturing (lay-out, tooling, process design), 40-60%;
- (d) manufacturing start-up, debugging production, 5-15%;
- (e) marketing start-up, probing the market, 10-20%.
It does not follow from this that R&D or knowledge generation is only 5-10% of total innovative activity because many projects are started that never get beyond stage (a) and an even smaller proportion of projects are carried all the way through stage (e). In addition, there is a certain amount of background research that is carried out on a level-of-effort basis without any specific product in mind. There is no very good estimate of what percentage of the innovative activity of a particular firm would be classified in category (a) if unsuccessful projects or background research are taken into account. The fact remains that all five stages involve a certain proportion of technical work which is not classified as R&D, and the collection of statistical data on this portion of ‘downstream’ innovative activity is in a very rudimentary state compared with that for organized R&D. Indeed, only about 35% of scientists and engineers in the US are employed in R&D.
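The arithmetic behind this caveat can be sketched with a simple portfolio calculation. The per-project cost shares below use the midpoints of Charpie's ranges, but the survival fractions (what proportion of started projects reaches each later stage) are purely hypothetical assumptions chosen for illustration:

```python
# Illustrative only: per-project stage costs use midpoints of Charpie's
# ranges; the survival fractions are hypothetical, not from the article.
stage_cost = {            # relative cost per project that reaches the stage
    "a_conception": 7.5,            # midpoint of 5-10%
    "b_design": 15.0,               # midpoint of 10-20%
    "c_manufacturing_prep": 50.0,   # midpoint of 40-60%
    "d_manufacturing_startup": 10.0,  # midpoint of 5-15%
    "e_marketing_startup": 15.0,    # midpoint of 10-20%
}
survival = {              # hypothetical fraction of projects reaching each stage
    "a_conception": 1.00,
    "b_design": 0.50,
    "c_manufacturing_prep": 0.30,
    "d_manufacturing_startup": 0.25,
    "e_marketing_startup": 0.20,
}

# Portfolio-level spending per stage = per-project cost x survival fraction.
total_by_stage = {s: stage_cost[s] * survival[s] for s in stage_cost}
total = sum(total_by_stage.values())

for stage, spend in total_by_stage.items():
    print(f"{stage}: {100 * spend / total:.1f}% of portfolio spending")
```

Under these assumed attrition rates, conception accounts for roughly 21% of total portfolio spending, about three times its 5-10% share within any single completed project, which is the article's point: stage (a) looms larger in aggregate because most projects never get past it.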
In small firms, especially technological ‘niche’ firms whose business is based on a cluster of specialized technologies which are often designed in close collaboration with potential users, there is a good deal of technical activity by highly trained people which is never captured in the usual R&D statistics.
Thus, science, technology and innovation each represent a successively larger universe of activities
which are highly interdependent, yet nevertheless distinct from each other. Even success in technology by itself, let alone science, provides an insufficient basis for success in the whole process of technological innovation.
In fact, the relation between science and technology is better thought of in terms of two parallel streams of cumulative knowledge, which have many interdependencies and cross relations, but whose internal connections are much stronger than their cross connections.
The metaphor I like to use is two strands of DNA which can exist independently, but cannot be truly functional until they are paired.
2: The contributions of science to technology
The relations between science and technology are complex and vary considerably with the particular
field of technology being discussed. For mechanical technology, for example, the contribution
of science to technology is relatively weak, and it is often possible to make rather important
inventions without a deep knowledge of the underlying science. By contrast, electrical, chemical,
and nuclear technology are deeply dependent on science, and most inventions are made only by
people with considerable training in science. In the following discussion, we outline the variety of
ways in which science can contribute to technological development. The complexity of the interconnections
of science and technology is further discussed in Nelson and Rosenberg (1993).
2.1: Science as a direct source of new technological ideas
In this case, opportunities for meeting new social needs or previously identified social needs in new ways are conceived as a direct sequel to a scientific discovery made in the course of an exploration of natural phenomena undertaken with no potential application in mind. The discovery of uranium fission leading to the concept of a nuclear chain reaction and the atomic bomb and nuclear power is, perhaps, the cleanest example of this. Other examples include the laser and its numerous embodiments and applications, the discoveries of X-rays and of artificial radioactivity and their subsequent applications in medicine and industry, the discovery of nuclear magnetic resonance (NMR) and its subsequent manifold applications in chemical analysis, biomedical research, and ultimately medical diagnosis, and maser amplifiers and their applications in radioastronomy and communications. These cases do exemplify most of the features of the pipeline model of innovation described above. Yet they are the rarest, though therefore also the most dramatic, cases, which may account for the persistence of the pipeline model in public discussions. It also suits the purposes of basic scientists arguing for government support of their research in a pragmatically oriented culture.
A more common example of a direct genetic relationship between science and technology occurs when the exploration of a new field of science is deliberately undertaken with a general anticipation that it has a high likelihood of leading to useful applications, though there is no specific end-product in mind. The work at Bell Telephone Laboratories and elsewhere which led eventually to the invention of the transistor is one
of the clearest examples of this. The group that was set up at Bell Labs to explore the physics of Group IV semiconductors such as germanium was clearly motivated by the hope of finding a method of making a solid state amplifier to substitute for the use of vacuum tubes in repeaters for the transmission of telephone signals over long distances.
As indicated above, much so-called basic research undertaken by industry or supported by the military services has been undertaken with this kind of non-specific potential applicability in mind, and indeed much basic biomedical research is of this character. The selection of fields for emphasis is a ‘strategic’ decision, while the actual day-to-day ‘tactics’ of the research are delegated to the ‘bench scientists’. Broad industrial and government support for condensed matter physics and atomic and molecular physics since World War II has been motivated by the well-substantiated expectation that it would lead to important new applications in electronics, communications, and computers. The determination of an appropriate level of effort, and the creation of an organizational environment that facilitates the earliest possible identification of technological opportunities without too much constraint on the research agenda, remain continuing challenges for research planning in respect of this particular mechanism of science-technology interaction.
2.2: Science as a source of engineering design tools and techniques
While the process of design is quite distinct from the process of developing new knowledge of natural phenomena, the two processes are very intimately related. This relationship has become more and more important as the cost of empirically testing and evaluating complex prototype technological systems has mounted. Theoretical prediction, modeling, and simulation of large systems, often accompanied by measurement and empirical testing of subsystems and components, has increasingly substituted for full scale empirical testing of complete systems, and this requires design tools and analytical methods grounded in
phenomenological understanding. This is particularly important for anticipating failure modes under
extreme but conceivable conditions of service of complex technological systems. (See Alic et al., 1992, Chapter 4; for a discussion of the technical knowledge underlying the engineering design process, cf. Chapter 2.)
Much of the technical knowledge used in design and the comparative analytical evaluation of alternative designs is actually developed as ‘engineering science’ by engineers, and is in fact the major activity comprising engineering research in academic engineering departments. This research is very much in the style of other basic research in the ‘pure’ sciences and is supported in a similar manner by the Engineering Division of the National Science Foundation, i.e. as unsolicited, investigator-originated project research. Even though it is generally labelled as ‘engineering’ rather than ‘science’, such research is really another
example of basic research whose agenda happens to be motivated primarily by potential applications in design ‘downstream’ though its theoretical interest and its mathematical sophistication are comparable with that of pure science.
2.3: Instrumentation, laboratory techniques, and analytical methods
Laboratory techniques or analytical methods used in basic research, particularly in physics, often find their way either directly, or indirectly via other disciplines, into industrial processes and process controls largely unrelated either to their original use or to the concepts and results of the research for which they were originally devised (Rosenberg, 1991). According to Rosenberg (1991), “this involves the movement of new instrumentation technologies . . . from the status of a tool of basic research, often in universities, to the status of a production tool, or capital good, in private industry.” Examples are legion and include electron diffraction, the scanning electron microscope (SEM), ion implantation, synchrotron radiation sources, phase-shifted lithography, high vacuum technology, industrial cryogenics, and superconducting magnets (originally developed for cloud chamber observations in particle physics, then commercialized for ‘magnetic resonance
imaging’ (MRI) in medicine). In Rosenberg’s words, “the common denominator running through and connecting all these experiences is that instrumentation that was developed in the pursuit of scientific knowledge eventually had direct applications as part of a manufacturing process.” Also, in considering the potential economic benefits of science, as Rosenberg says, “there is no obvious reason for failing to examine
the hardware consequences of even the most fundamental scientific research.” One can also envision ultimate industrial process applications from many other techniques now restricted to the research laboratory. One example might be techniques for creating selective chemical reactions using molecular beams.
2.4: The development of human skills
An important function of academic research often neglected in estimating its economic benefits is that it imparts research skills to graduate students and other advanced trainees, many of whom “go on to work in applied activities and take with them not just the knowledge resulting from their research, but also the skills, methods, and a web of professional contacts that will help them tackle the technological problems that they later face.” (See Rosenberg (1990) and Pavitt (1991).) This is especially important in light of the
fact that basic research instrumentation so often later finds application not only in engineering and other more applied disciplines such as clinical medicine, but also ultimately in routine industrial processes and operations, health care delivery, and environmental monitoring.
A study based on a ranking by 650 industrial research executives in 130 industries of the relevance
of a number of academic scientific disciplines to technology in their industry, first, on the basis of their skill base and, second, on the basis of their research results, showed strikingly higher ratings for the skill base from most disciplines than from the actual research results. In the most extreme case, 44 industries rated physics high in skill base (second only to materials science, computer science, metallurgy and chemistry, in that order), whereas physics was almost at the bottom of the list in respect to the direct contribution of
academic research results to industrial applications. Only in biology and medical science were the contributions of skill base and research results comparable (Nelson and Levin, 1986; Pavitt, 1991, p. 114 (Table 1)). The conclusion was “that most scientific fields are much more strategically important to technology than data on direct transfers of knowledge would lead us to believe” (Pavitt, 1991). From these data, Pavitt inferred that “policies for greater selectivity and concentration in the support of scientific fields have
probably been misconceived”, for the contribution of various disciplines to the development of potentially useful skills appears to be much more broadly distributed among fields than are their practically relevant research contributions. A part of the problem here is, of course, that this conclusion is contrary to much of the rhetoric used in advocating the support of basic research by governments.
As a further example of the importance of the widely usable generalized skills derived from participation in any challenging field of research, the National Research Council in 1964 surveyed about 1900 doctoral scientists working in industry in solid state physics and electronics. By that date, most of the basic ideas underlying the most important advances in solid state electronics had already been developed. It was found, however, that only 2.5% of the scientists surveyed had received their Ph.D. training in solid state physics;
19% were chemists, and 73% had received their doctorates in physics fields other than solid state,
with nuclear physics predominating (Brooks, 1985). In fact, the shift of physics graduate study into solid state and condensed matter physics (about 40% of all physics Ph.D.s by the early 1970s) occurred after many of the fundamental inventions had already been made. The skills acquired in graduate training in nuclear physics had been readily turned to the development and improvement of solid state devices (Brooks, 1978).
2.5: Technology assessment
The past two decades have witnessed an enormous growth of interest and concern with predicting and controlling the social impact of technology, both in anticipating new technologies and their social and environmental implications and in assessing the consequences of the ever-increasing scale of application of older technologies (Brooks, 1973). In general, the assessment of technology, whether for evaluating its feasibility to assess entrepreneurial risk, or for foreseeing its societal side-effects, requires a deeper and more fundamental scientific understanding of the basis of the technology than does its original creation, which can often be carried out by empirical trial-and-error methods. Further, such understanding often requires basic scientific
knowledge well outside the scope of what was clearly relevant in the development of the technology. For example, the manufacture of a new chemical may involve disposal of wastes which require knowledge of the groundwater hydrology of the manufacturing site. Thus, as the deployment of technology becomes more extensive, and the technology itself becomes more complex, one may anticipate the need for much
more basic research knowledge relative to the technical knowledge required for original development. This has sometimes been called ‘defensive research’, and it can be shown that, over time, the volume of research that can be described as defensive has steadily increased relative to the research that can be described as ‘offensive’, i.e. research aimed at turning up new technological opportunities. This has led me to call science the ‘conscience’ of technology.
2.6: Science as a source of development strategy
Somewhat similarly to the case of technology assessment, the planning of the most efficient strategy of technological development, once general objectives have been set, is often quite dependent on science from many fields. This accumulated stock of existing scientific (and technological) knowledge helps to avoid blind alleys and hence wasteful development expenditures. Much of this is, of course, old knowledge, rather than the latest research results, but it is nonetheless important and requires people who know the field of relevant background science. One piece of evidence of this is the observation that very creative engineers and inventors tend to read very widely and eclectically both in the history of science and technology, and about contemporary scientific developments.
Brooks, Harvey. “The Relationship Between Science and Technology.” Research Policy, 1994