Philosophy of Technology
If philosophy is the attempt “to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term”, as Sellars (1962) put it, philosophy should not ignore technology. It is largely by technology that contemporary society hangs together. It is hugely important not only as an economic force but also as a cultural force. During the last two centuries, much philosophy of technology has been concerned with the impact of technology on society. Mitcham (1994) calls this type of philosophy of technology ‘humanities philosophy of technology’ because it is continuous with social science and the humanities. In addition to this, there is also a branch of the philosophy of technology that is concerned with technology in itself. This entry focuses on the latter branch of the philosophy of technology, which seeks continuity with the philosophy of science rather than social science and the humanities. The approach is analytic; other approaches are possible, but will not be discussed.
The entry starts with a brief historical overview, then presents an introduction to the modern philosophy of technology, and ends with a discussion of the societal and ethical aspects of technology. This discussion takes into consideration both the development of technology as a process originating within, and guided by, the practice of engineering, by standards over which society exercises only limited control, and the consequences for society of its implementation, resulting from processes over which only limited control can be exercised.
- 1. Historical Developments
- 2. Analytic Philosophy of Technology
- 2.1. Introduction: Philosophy of technology and philosophy of science
- 2.2. The relationship between technology and science
- 2.3. The centrality of design to technology
- 2.4. Methodological issues: design as decision making
- 2.5. Metaphysical issues: The status and characteristics of artifacts
- 2.6. Other topics
- 3. Ethical and Social Aspects of Technology
- Other Internet Resources
- Related Entries
Philosophical reflection on technology is about as old as philosophy itself. It started in ancient Greece. There are four prominent themes.
One early theme is the thesis that technology learns from or imitates nature (Plato, Laws X 899a ff.). According to Democritus, for example, house-building and weaving were first invented by imitating swallows and spiders building their nests and nets, respectively (fr D154; perhaps the oldest extant source for the exemplary role of nature is Herakleitos fr D112). Aristotle referred to this tradition by repeating Democritus' examples, but he did not maintain that technology can only imitate nature: “generally art in some cases completes what nature cannot bring to a finish, and in others imitates nature” (Physics II.8, 199a15; see also Physics II.2, and see Schummer 2001 for discussion).
A second theme is the thesis that there is a fundamental ontological distinction between natural things and artifacts. According to Aristotle, Physics II.1, the former have their principles of generation and motion inside, whereas the latter, insofar as they are artifacts, are generated only by outward causes, namely human aims and forms in the human soul. Natural products (animals and their parts, plants, and the four elements) move, grow, change, and reproduce themselves by inner final causes; they are driven by purposes of nature. Artifacts, on the other hand, cannot reproduce themselves. Without human care and intervention, they vanish after some time by losing their artificial forms and decomposing into (natural) materials. For instance, if a wooden bed is buried, it decomposes to earth or changes back into its botanical nature by putting forth a shoot. The thesis that there is a fundamental difference between man-made products and natural substances had a long-lasting influence. In the Middle Ages, Avicenna criticized alchemy on the ground that it can never produce ‘genuine’ substances. Even today, some still maintain that there is a difference between, for example, natural and synthetic vitamin C.
Aristotle's doctrine of the four causes—material, formal, efficient and final—can be regarded as a third early contribution to the philosophy of technology. Aristotle explained this doctrine by referring to technical artifacts such as houses and statues (Physics II.3).
A final point that deserves mentioning is the extensive employment of technological images by Plato and Aristotle. In his Timaeus, Plato described the world as the work of an Artisan, the Demiurge. His account of the details of creation is full of images drawn from carpentry, weaving, modelling, metallurgy, and agricultural technology. Aristotle used comparisons drawn from the arts and crafts to illustrate how final causes are at work in natural processes. Despite their criticism of the life led by merely human artisans, both Plato and Aristotle found technological imagery indispensable for expressing their belief in the rational design of the universe (Lloyd 1973: 61).
Although there was much technological progress in the Roman empire and during the Middle Ages, philosophical reflection on technology did not grow at a corresponding rate. Comprehensive works such as Vitruvius' De Architectura (first century BC) and Agricola's De re metallica (1556) paid much attention to practical aspects of technology but little to philosophy.
In the realm of scholastic philosophy, there was an emergent appreciation for the mechanical arts. They were generally considered to be born of—and limited to—the mimicry of nature. This view became challenged when alchemy was introduced in the Latin West around the mid-twelfth century. Some alchemical writers such as Roger Bacon were willing to argue that human art, even if it learned by imitating natural processes, could successfully reproduce natural products or even surpass them. The result was a philosophy of technology in which human art was raised to a level of appreciation difficult to find in other writings until the Renaissance. However, the last three decades of the thirteenth century witnessed an increasingly hostile attitude by religious authorities toward alchemy that culminated eventually in the denunciation Contra alchymistas, written by the inquisitor Nicholas Eymeric in 1396 (Newman 1989, 2004).
The Renaissance led to a greater appreciation of human beings and their creative efforts, including technology. As a result, philosophical reflection on technology and its impact on society increased. Francis Bacon is generally regarded as the first modern author to put forward such reflection. His view, expressed in his fantasy New Atlantis (1627), was overwhelmingly positive. This positive attitude lasted well into the nineteenth century, incorporating the first half-century of the industrial revolution. Karl Marx did not condemn the steam engine or the spinning mill for the vices of the bourgeois mode of production and believed that ongoing technological innovation would support the more blissful stages of socialism and communism of the future (see Bimber 1990 for a recent discussion of different views on the role of technology in Marx's theory of historical development).
A turning point in the appreciation of technology as a socio-cultural phenomenon is marked by Samuel Butler's Erewhon (1872) written under the influence of the Industrial Revolution and Darwin's Origin of Species. This book gave an account of a fictional country where all machines are banned and the possession of a machine or the attempt to build one is a capital crime, ever since the population was convinced by an argument that ongoing technical improvements are likely to lead to a ‘race’ of machines that will replace mankind as the dominant species on earth.
During the last quarter of the nineteenth century and most of the twentieth century a critical attitude predominated in philosophy. The representatives of this attitude were, overwhelmingly, schooled in the humanities or the social sciences and had virtually no first-hand knowledge of engineering practice. Whereas Bacon wrote extensively on the method of science and conducted physical experiments himself, Butler was a clergyman. The author of the first text in which the term ‘philosophy of technology’ occurred, Ernst Kapp's Eine Philosophie der Technik (1877), was a philologist and historian. Most of the authors who reflected critically on technology and its socio-cultural role during the twentieth century were philosophers of a general outlook (Heidegger, Jonas, Gehlen, Anders, Feenberg) or had a background in one of the other humanities or in social science, such as law (Ellul), political science (Winner) or literary studies (Borgmann). Carl Mitcham (1994) has called this type of philosophy of technology ‘humanities philosophy of technology’. It can be interpreted as exercising continuing influence in the field known as ‘Science and Technology Studies (STS)’, which studies how social, political, and cultural values affect scientific research and technological innovation, and how these in turn affect society, politics, and culture. For those interested in the humanities philosophy of technology, Mitcham's books provide an excellent overview.
Since the 1960s, a form of the philosophy of technology has been developing that can be regarded as an alternative to the humanities philosophy of technology. It has gained momentum in the past 10 or 15 years, and it is now becoming the dominant form of philosophy of technology. This form of the philosophy of technology may be called ‘analytic’. It is not so much concerned with the relations between technology and society as with technology itself. It does not see technology as a ‘black box’, but as a phenomenon that deserves study. It regards technology as a practice, basically the practice of engineering. It analyzes this practice, its goals, its concepts and its methods, and it relates these issues to various themes from philosophy. The following discussion will be concerned with this form of the philosophy of technology.
Few practices in our society are as closely related as science and technology. Experimental science is nowadays crucially dependent on technology for the realization of its research settings and for the creation of circumstances in which a phenomenon will become observable. Theoretical research within technology has often become indistinguishable from theoretical research in science, making engineering science largely continuous with ‘ordinary’ science. This is a relatively recent development, not more than a century old, and is responsible for great differences between modern technology and traditional, craft-like techniques. The educational training that aspiring scientists and engineers receive starts off being largely identical and only gradually diverges into a science or engineering curriculum. It might therefore be thought that there are equally strong resemblances between the philosophy of science and the philosophy of technology. Almost the contrary is true, however. Ever since the scientific revolution of, primarily, the seventeenth century, characterized by its two major innovations, the experimental method and the mathematical articulation of scientific theories, philosophical reflection on science has concentrated on the method by which scientific knowledge is generated, on the reasons for thinking scientific theories to be true, and on the nature of evidence and the reasons for accepting one theory and rejecting another. Hardly ever have philosophers of science posed questions that did not have the community of scientists, their concerns, their aims, their intuitions, their arguments and choices, as the primary target. The philosophy of technology, in contrast, has traditionally largely ignored the community of engineers and has almost exclusively dealt with the place of technology in, and its meaning for, human society, human culture, and human existence, in terms of Sellarsian broadness.
To say that this is understandable because science affects society only through technology will not do. Right from the start of the scientific revolution, science affected human culture and thought fundamentally and directly, not with a detour through technology, and the same is true for later developments such as relativity, atomic physics and quantum mechanics, the theory of evolution, genetics, biochemistry, and the increasingly unified scientific world view overall. Philosophers of science gladly leave questions addressing this side of things to other philosophical disciplines, or to historical studies.
A major difference between the historical development of modern technology and that of modern science, which can at least partly explain this situation, is that science emerged in the seventeenth century from philosophy itself. The answers that Galileo, Huygens, Newton, and others gave, by which they initiated the alliance of empiricism and mathematical description that is so characteristic of modern science, were answers to questions that had belonged to the core business of philosophy since antiquity. Science, therefore, kept the attention of philosophers. Philosophy of science is a transformation of epistemology in the light of the emergence of science. The foundational issues—the reality of atoms, the status of causality and probability, questions of space and time, the nature of the quantum world—that were so vigorously discussed during the end of the nineteenth and the beginning of the twentieth century are an illustration of this close relationship between scientists and philosophers. No such intimacy has ever existed between those same philosophers and technicians; their worlds still barely touch. To be sure, a case can be made for a similar continuity between central questions in philosophy, having to do with human action and practical rationality, and the way technology approaches and systematizes the solution of practical problems. This continuity appears only in hindsight, however, and dimly, as the historical development is at most a slow approach in the direction of these philosophical thoughts on action and rationality, not away from them as its place of birth. Significantly, it is only the academic outsider Ellul who has, in his idiosyncratic way, recognized in technology the emergent single dominant way of answering all questions concerning human action, comparable to science as the single dominant way of answering all questions concerning human knowledge (Ellul 1964).
Ellul, however, does not merely interpret technology as the sum total of rational action; he also deplores the hold it has on modern society, due to its forcing all aspects of human life into the mould of a single narrowed-down criterion of rationality: maximum efficiency. In this respect Ellul is not an outsider but entirely typical of an approach to the philosophy of technology that has dominated the debate during the twentieth century. This approach is openly critical of technology: it tends to have a negative judgment, all things considered, of the way technology has affected human society and culture, or it concentrates on the negative effects of technology on human society and culture. This does not necessarily mean that technology itself is pointed out as the direct cause of these negative developments. In the case of Heidegger, in particular, the paramount position of technology in modern society is a symptom of something more fundamental, namely a wrongheaded attitude towards Being which has been in the making for almost 25 centuries. It is therefore questionable whether Heidegger should be considered as a philosopher of technology, although within the traditional view he is considered to be among the most important ones.
The close relationship between the practices of science and technology may easily keep the important differences between the two from view. The predominant position of science in the philosophical perspective did not easily lead to a recognition that technology merited special attention for involving issues that did not emerge in science. This situation is often presented, perhaps somewhat dramatized, as coming down to a claim that technology is ‘merely’ applied science.
A questioning of the relation between science and technology was the central issue in one of the earliest discussions among analytic philosophers of technology. In 1966, in a special issue of the journal Technology and Culture, Henryk Skolimowski argued that technology is something quite different from science (Skolimowski 1966). As he phrased it, science concerns itself with what is, whereas technology concerns itself with what is to be. A few years later, in his well-known book The sciences of the artificial (1969), Herbert Simon emphasized this important distinction in almost the same words, stating that the scientist is concerned with how things are but the engineer with how things ought to be. Although it is difficult to imagine that earlier philosophers of science were blind to this difference in orientation, their inclination, in particular in the tradition of logical empiricism, to view knowledge as a system of statements may have led to a conviction that in technology no knowledge claims play a role that cannot also be found in science, and that therefore the study of technology poses no new challenges and holds no surprises regarding the interests of analytic philosophy.
In the same issue of Technology and Culture, Mario Bunge (1966) defended the view that technology is applied science, but in a subtle way that does justice to the differences between science and technology. Bunge acknowledges that technology is about action, but an action heavily underpinned by theory—that is what distinguishes technology from the arts and crafts and puts it on a par with science. According to Bunge, theories in technology come in two types: substantive theories, which provide knowledge about the object of action, and operative theories, which are concerned with action itself. The substantive theories of technology are indeed largely applications of scientific theories. The operative theories, in contrast, are not preceded by scientific theories but are born in applied research itself. Still, as Bunge claims, operative theories show a dependency on science in that in such theories the method of science is employed. This includes such features as modeling and idealization, the use of theoretical concepts and abstractions, and the modification of theories by the absorption of empirical data through prediction and retrodiction.
In his comment on Skolimowski's paper in Technology and Culture, Ian Jarvie (1966) proposed as important questions for a philosophy of technology an inquiry into the epistemological status of technological statements and the way technological statements are to be demarcated from scientific statements. This suggests a thorough investigation of the various forms of knowledge occurring in either practice, in particular, since scientific knowledge has already been so extensively studied, of the forms of knowledge that are characteristic of technology and are lacking, or of much less prominence, in science. A distinction between ‘knowing that’—traditional propositional knowledge—and ‘knowing how’—non-articulated and even impossible-to-articulate knowledge—had been introduced by Gilbert Ryle (1949) in a different context. The notion of ‘knowing how’ was taken up by Michael Polanyi under the name of tacit knowledge and made a central characteristic of technology (Polanyi 1958). However, emphasizing too much the role of unarticulated knowledge, of ‘rules of thumb’ as they are often called, easily underplays the importance of rational methods in technology. The following two sections take up the role of rational methods in technology. An emphasis on tacit knowledge may also be ill-fit for distinguishing the practices of science and technology because the role of tacit knowledge in science may well be more important than current philosophy of science acknowledges, for example in concluding causal relationships on the basis of empirical evidence. This was also an important theme in the writings of Thomas Kuhn on scientific theory-change (Kuhn 1962).
To claim, with Skolimowski and Simon, that technology is about what is to be or what ought to be rather than what is may serve to distinguish it from science but will hardly make it understandable why so much philosophical reflection on technology has taken the form of socio-cultural criticism. Technology is a continuous attempt to bring the world closer to the way it is to be. Whereas science aims to understand the world as it is, technology aims to change the world. These are abstractions, of course. Unlike scientists, however, who are considered personally motivated in their attempts at describing and understanding the world, engineers are considered, not least by engineers themselves, as undertaking their attempts to change the world as a service to the public. The ideas on what is to be or what ought to be are seen as originating outside of technology itself; engineers then take it upon themselves to realize these ideas. This view is the source of the widespread picture of technology as being instrumental, as delivering instruments that will be used ‘elsewhere’. This view involves a considerable distortion of reality. Many engineers are intrinsically motivated to change the world; they are their own best customers. The same is true for most industrial companies, particularly in a market economy. As a result, much technological development is ‘technology-driven’.
Be that as it may, technology is a practice focused on the creation of artifacts and, of increasing importance, artifact-based services. The design process, the structured process leading toward that goal, forms the core of the practice of technology. The design process is commonly represented as consisting of a series of translational steps. In the first step the needs or wishes of the customer are translated into a list of functional requirements, which defines the design task. The functional requirements specify as precisely as possible what the device to be designed must be able to do. This step is required because customers usually focus on just one or two features and are unable to articulate the requirements that are necessary to support the functionality they desire. This is all the more true if the needs of the customer are ‘deduced’ from market developments. In the second step the functional requirements are translated into design specifications, which specify the exact physical parameters of crucial components by which the functional requirements are going to be met. In the third and final step, the design parameters are combined and amended such that a blueprint of the device results. The blueprint contains all the details that must be known for the manufacture of the device and can be considered as the end result of a design process, rather than a finished copy of the device. Actual copies of a device play a role only as prototypes for the purpose of testing. Prototyping and testing presuppose that the sequence of steps making up the design process can, and often will, contain iterations, leading to revisions of the design parameters and/or the functional requirements. For a general discussion of the structure of design processes, see, e.g., Suh (2001).
Certainly for mass-produced items, the manufacture of a product is not considered part of the design phase. Still, the manufacturing process is often reflected in the functional requirements of a device, for example in putting restrictions on the number of different components of which the device consists. Ease of maintenance is often a functional requirement as well. An important modern development is that the complete life cycle of an artifact is now considered to be the designing engineer's concern, up till the final stages of the recycling and disposal of its components and materials, and the functional requirements of any device should reflect this.
An important input for the design process is scientific knowledge: knowledge about the behavior of components and the materials they are composed of in specific circumstances. This is the point where science is applied. However, much of this knowledge is not directly available from the sciences, since it often concerns exact behavior in very specific circumstances. This scientific knowledge is therefore often generated within technology, by the engineering sciences. The sort of knowledge involved in engineering design is of a broader character, however. In his book What engineers know and how they know it (Vincenti 1990), the aeronautical engineer Walter Vincenti gave a six-fold categorization of engineering design knowledge (leaving aside production and operation as the other two basic constituents of engineering practice). Vincenti distinguishes
- Fundamental design concepts, including primarily the operational principle and the normal configuration of a particular device;
- Criteria and specifications;
- Theoretical tools;
- Quantitative data;
- Practical considerations;
- Design instrumentalities.
The fourth category concerns the quantitative knowledge just referred to, and the third the theoretical tools used to acquire it. These two categories can be assumed to match Bunge's notion of substantive technological theories. The status of the remaining four categories is much less clear, however, partly because they are less familiar, if familiar at all, from the well-explored context of science. Of these categories, Vincenti claims that they represent prescriptive forms of knowledge rather than descriptive ones. Here, the activity of design introduces an element of normativity, which is absent from scientific knowledge. Take such a basic notion as ‘operational principle’, which refers to the way in which the function of a device is realized, or, in short, how it works. This is still a purely descriptive notion. Subsequently, however, it plays a role in arguments that seek to prescribe a course of action to someone who has a goal that could be realized by the operation of such a device. At this stage, the issue changes from a descriptive to a prescriptive or normative one. Although the notion of ‘operational principle’—a term that seems to originate with Polanyi (1958)—is central to engineering design, no single clear-cut definition of it seems to exist. The issue of disentangling descriptive from prescriptive aspects in an analysis of technical action and its constituents is therefore a task that has hardly begun.
This task requires a clear view on the extent and scope of technology. If one follows Joseph Pitt in his book Thinking about technology (1999) and defines technology broadly as ‘humanity at work’, then to distinguish between technological action and action in general becomes difficult, and the study of technological action must absorb all descriptive and normative theories of action, including the theory of practical rationality, and much of theoretical economics in its wake. There have indeed been attempts at such an encompassing account of human action, for example Tadeusz Kotarbinski's Praxiology (1965), but a perspective of such generality makes it difficult to arrive at results of sufficient depth. It would be a challenge for philosophy to single out three prominent practices, technology, organization and management, and economics, and to specify the differences among these forms of action and the reasoning that grounds them.
Design is an activity that is subject to rational scrutiny but in which creativity is considered to play an important role as well. Since design is a form of action, a structured series of decisions to proceed in one way rather than another, the form of rationality that is relevant to it is practical rationality, the rationality incorporating the criteria on how to act, given particular circumstances. This suggests a clear division of labor between the part to be played by rational scrutiny and the part to be played by creativity. Theories of rational action generally conceive their problem situation as one involving a choice among various courses of action open to the agent. Rationality then concerns the question how to decide among given options, whereas creativity concerns the generation of these options. This distinction is similar to the distinction between the context of justification and the context of discovery in science. The suggestion that is associated with this distinction, however, that rational scrutiny only applies in the context of justification, is difficult to uphold for technological design. If the initial creative phase of option generation is conducted sloppily, the result of the design task can hardly be satisfactory. Unlike the case of science, where the practical consequences of entertaining a particular theory are not taken into consideration, the context of discovery in technology is governed by severe constraints of time and money, and an analysis of the problem how best to proceed certainly seems in order. There has been little philosophical work done in this direction.
The ideas of Herbert Simon on bounded rationality (see, e.g., Simon 1982) are relevant here, since decisions on when to stop generating options, and when to stop gathering information about these options and the consequences of adopting them, are crucial in decision making if informational overload and calculative intractability are to be avoided, but it has proved difficult to further develop Simon's ideas on bounded rationality. Another notion that is relevant here is means-ends reasoning. In order to be of any help here, theories of means-ends reasoning should concern not just the evaluation of given means with respect to their ability to achieve given ends, but also the generation or construction of means for given ends. Such theories, however, are not yet available; for a proposal on how to develop means-ends reasoning in the context of technical artifacts, see Hughes, Kroes and Zwart (2007). In the practice of technology, alternative proposals for the realization of particular functions are usually taken from ‘catalogs’ of existing and proven realizations. These catalogs are extended by ongoing research in technology rather than under the urge of particular design tasks.
When engineering design is conceived as a process of decision making, governed by considerations of practical rationality, the next step is to specify these considerations. Almost all theories of practical rationality conceive of it as a reasoning process where a match between beliefs and desires or goals is sought. The desires or goals are represented by their value or utility for the decision maker, and the decision maker's problem is to choose an action that realizes a situation that has maximal value or utility among all the situations that could be realized. If there is uncertainty concerning the situations that will be realized by a particular action, then the problem is conceived as aiming for maximal expected value or utility. Now the instrumental perspective on technology implies that the value that is at issue in the design process viewed as a process of rational decision making is not the value of the artifacts that are created. Those values are the domain of the users of the technology so created. They are supposed to be represented in the functional requirements defining the design task. Instead the value to be maximized is the extent to which a particular design meets the functional requirements defining the design task. It is in this sense that engineers share an overall perspective on engineering design as an exercise in optimization. But although optimization is a value-orientated notion, it is not itself a value driving engineering design.
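The decision-theoretic picture sketched above, choosing the action with maximal expected utility under uncertainty, can be made concrete in a few lines of Python. The design options, probabilities, and utilities below are invented purely for illustration; nothing in them reflects a real engineering case.

```python
# Illustrative sketch of rational choice as expected-utility maximization.
# All numbers are hypothetical.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Two candidate designs whose performance is uncertain:
actions = {
    "design A": [(0.7, 10.0), (0.3, 2.0)],   # likely good, small risk of a poor result
    "design B": [(0.5, 14.0), (0.5, 1.0)],   # higher upside, but riskier
}

# The rational choice, on this account, is the action with the
# highest probability-weighted utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
for name, outcomes in actions.items():
    print(name, expected_utility(outcomes))
print("choose:", best)  # design A (7.6 vs 7.5)
```

The example also shows how thin the formal apparatus is: everything of substance is packed into the utilities and probabilities, which the theory itself does not supply.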
The functional requirements that define most design problems do not prescribe explicitly what should be optimized; usually they set levels to be attained minimally. It is then up to the engineer to choose how far to go beyond meeting the requirements in this minimal sense. Efficiency, in energy consumption and use of materials first of all, is then often a prime value. Under the pressure of society, other values have come to be incorporated, in particular safety and, more recently, sustainability. Sometimes it is claimed that what engineers aim to maximize is just one factor, namely market success. Market success, however, can only be assessed after the fact. The engineer's maximization effort will instead be directed at what are considered the predictors of market success. Meeting the functional requirements and being relatively efficient and safe are plausible candidates as such predictors, but additional methods, informed by market research, may introduce additional factors or may lead to a hierarchy among the factors.
Choosing the design option that maximally meets all the functional requirements originating with the prospective user, and all other considerations and criteria that are taken to be relevant, then becomes the practical decision-making problem to solve in a particular engineering-design task. This creates several methodological problems. Most important of these is that the engineer faces a multi-criteria decision problem. The various requirements come with their own operationalizations in terms of design parameters and measurement procedures for assessing their performance. This results in a number of rank orders or quantitative scales which represent the various options out of which a choice is to be made. The task is to come up with a final score in which all these results are ‘adequately’ represented, such that the option that scores best can be considered the optimal solution to the design problem. Engineers describe this situation as one where trade-offs have to be made: in judging the merit of one option relative to other options, relatively bad performance on one criterion can be balanced by relatively good performance on another. An important problem is whether a rational method for doing this can be formulated. It has been argued (Franssen 2005) that this problem is structurally similar to the well-known problem of social choice, for which Kenneth Arrow proved his notorious impossibility theorem in 1950, implying that no general rational solution method can be found for this problem. This poses serious problems for the claim of engineers that their designs are optimal solutions, since Arrow's theorem implies that in a multi-criteria problem the notion of ‘optimal’ cannot be rigorously defined.
Another problem for the decision-making view of engineering design is that in modern technology almost all design is done by teams. Such teams are composed of experts from many different disciplines. Each discipline has its own theories, its own models of interdependencies, its own assessment criteria, and so forth, such that its practitioners must be considered as inhabitants of different object worlds, as Louis Bucciarelli (1994) phrases it. The different team members are, therefore, likely to disagree on the relative rankings and evaluations of the various design options under discussion. Agreement on one option as the overall best one can here be even less arrived at by an algorithmic method exemplifying engineering rationality. Instead, models of social interaction, such as bargaining and strategic thinking, are relevant here. An example of such an approach to an (abstract) design problem is (Franssen and Bucciarelli 2004).
To look in this way at technological design as a decision-making process is to view it normatively from the point of view of practical or instrumental rationality. At the same time it is descriptive in that it is a description of how engineering methodology generally presents the issue of how to solve design problems. From that somewhat higher perspective there is room for all kinds of normative questions that are not addressed here, such as whether the functional requirements defining a design problem can be seen as an adequate representation of the values of the prospective users of an artifact or a technology, or by which methods values such as safety and sustainability can best be elicited and represented in the design process. These issues will be taken up in Section 3.
Another issue of central concern to internal philosophy of technology is the status and the character of artifacts. Artifacts are man-made objects: they have an author (see Hilpinen, article ‘artifact’). The artifacts that are of relevance to technology are, in particular, made to serve a purpose. This excludes, within the set of all man-made objects, on the one hand byproducts and waste products and on the other hand works of art. Byproducts and waste products result from an intentional act to make something, but they are not themselves the precise object of that intention, although the author at work may be well aware of their creation. Works of art result from an intention directed at their creation (although in exceptional cases of conceptual art, this directedness may involve many intermediate steps), but it is contested whether artists include in their intentions concerning their work an intention that the work serves some purpose. A further discussion of this aspect belongs to the philosophy of art. An interesting general account has been presented by Dipert (1993).
Technical artifacts, then, are made to serve some purpose, generally to be used for something or to act as a component in a larger artifact, which in its turn is either something to be used or again a component. Whether end product or component, an artifact is ‘for something’, and what it is for is called the artifact's function. Several researchers have emphasized that an adequate description of artifacts must refer both to their status as tangible physical objects and to the intentions of the people engaged with them. Kroes and Meijers (2006) have dubbed this view ‘the dual nature of technical artifacts’. They suggest that the two aspects are ‘tied up’, so to speak, in the notion of artifact function. This gives rise to several problems. One, which will be passed over quickly because little philosophical work seems to have been done concerning it, is that structure and function mutually constrain each other, but the constraining is only partial. It is unclear whether a general account of this relation is possible and what problems need to be solved to arrive there. There may be interesting connections with the issue of multiple realizability in the philosophy of mind and with accounts of reduction in science, but these have not yet been explored.
It is equally problematic whether a unified account of the notion of function as such is possible, but this issue has received considerably more philosophical attention. The notion of function is of paramount importance for characterizing artifacts, but the notion is used much more widely. The notion of an artifact's function seems to refer necessarily to human intentions. Function is also a key concept in biology, however, where no intentionality plays a role, and it is a key concept in cognitive science and the philosophy of mind, where it is crucial in grounding intentionality in non-intentional, structural and physical properties. Up till now there is no accepted general account of function that covers both the intentionality-based notion of artifact function and the non-intentional notion of biological function—not to speak of other areas where the concept plays a role, such as the social sciences. The most comprehensive theory, which has the ambition to account for the biological notion, the cognitive notion and the intentional notion, is Ruth Millikan's (Millikan 1984); for criticisms and replies, see Preston (1998, 2003), Millikan (1999) and Houkes & Vermaas (2003). The collection of essays edited by Ariew, Cummins and Perlman (2002) presents a recent introduction to the topic of defining the notion of function, although the emphasis is, as is generally the case in the literature on function, on biological functions.
Against the view that the notion of function refers necessarily to intentionality at least in the case of artifacts, it could be argued that even there, when discussing the functions of the components of a larger device and their interrelations, the intentional ‘side’ of these functions is of secondary importance only. This, however, would be to ignore the possibility of the malfunctioning of such components. This notion seems to be definable only in terms of a mismatch between actual behavior and intended behavior. The notion of malfunction also sharpens an ambiguity in the general reference to intentions when characterizing technical artifacts. These artifacts usually engage many people, and the intentions of these people may not all pull in the same direction. A major distinction can be drawn between the intentions of the actual user of an artifact for a particular purpose and the intentions of the artifact's designer. Since an artifact may be used for a purpose different from the one for which its designer intended it to be used, and since people may also use natural objects for some purpose or other, one is invited to allow that artifacts can have multiple functions, or to enforce a hierarchy among all relevant intentions in determining the function of an artifact, or to introduce a classification of functions in terms of the sorts of determining intentions. In the latter case, which is a sort of middle way between the two other options, one commonly distinguishes between the proper function of an artifact as the one intended by its designer and the accidental function of the artifact as the one given to it by some user on private considerations. Accidental use can become so common, however, that the original function drops out of memory.
Closely related to this issue of the extent to which use and design determine the function of an artifact is the problem of characterizing artifact kinds. It may seem that we use functions to classify artifacts: an object is a knife because it has the function of cutting, or more precisely, of enabling us to cut. It is hardly recognized, however, that the link between function and kind-membership is not that straightforward. The basic kinds in technology are, for example, ‘knife’, ‘airplane’ and ‘piston’. The members of these kinds have been designed in order to be used to cut something with, to transport something through the air and to generate mechanical movement through thermodynamic expansion. However, one cannot create a particular kind of artifact just by designing something with the intention that it be used for some particular purpose: a member of the kind so created must actually be useful for that purpose. Despite innumerable design attempts and claims, the perpetual motion machine is not a kind of artifact. A kind like ‘knife’ is defined, therefore, not only by the intention of the designer of each of its members that it be useful for cutting but also by an operational principle known to these designers, and on which they based their design. This is, in a different setting, also defended by Thomasson, who in her characterization of what she in general calls an artifactual kind says that such a kind is defined by the designer's intention to make something of that kind, by a substantive idea that the designer has of how this can be achieved, and by his or her largely successful achievement of it (Thomasson 2003, 2007). Qua sorts of kinds in which artifacts can be grouped, a distinction must therefore be made between a kind like ‘knife’ and a corresponding but different kind ‘cutter’. A ‘knife’ indicates a particular way a ‘cutter’ can be made.
One can also cut, however, with a thread or line, a welding torch, a water jet, and undoubtedly by other sorts of means that have not yet been thought of. A ‘cutter’ is an example of what could be looked upon as a truly functional kind. As such, it is subject to the conflict between use and design: one could mean by ‘cutter’ anything that can be used for cutting or anything that has been designed to be used for cutting, by the application of whatever operational principle, presently known or unknown.
This distinction between artifact kinds and functional kinds is relevant for the status of such kinds in comparison to other notions of kinds. Philosophy of science has emphasized that the concept of natural kind, such as exemplified by ‘water’ or ‘atom’, lies at the basis of science. On the other hand it is generally taken for granted that there are no regularities that all knives or airplanes or pistons answer to. This, however, is loosely based on considerations of multiple realizability that apply only to functional kinds, not to artifact kinds. Artifact kinds share an operational principle that gives them some commonality in physical features, and this commonality becomes stronger once a particular artifact kind is subdivided into narrower kinds. Since these kinds are specified in terms of physical and geometrical parameters, they are much closer to the natural kinds of science, in that they support law-like regularities; see for a defense of this position (Soavi 2008).
It was not until the twentieth century that the development of the ethics of technology as a systematic and more or less independent subdiscipline of philosophy started. This late development may seem surprising given the large impact that technology has had on society, especially since the industrial revolution.
A plausible reason for this late development of ethics of technology is the instrumental perspective on technology that was mentioned in Section 2.2. This perspective implies, basically, a positive ethical assessment of technology: technology increases the possibilities and capabilities of humans, which seems in general desirable. Of course, since antiquity, it has been recognized that the new capabilities may be put to bad use or lead to human hubris. Often, however, these undesirable consequences are attributed to the users of technology, rather than the technology itself, or its developers. This vision is known as the instrumental vision of technology, resulting in the so-called neutrality thesis. The neutrality thesis holds that technology is a neutral instrument that can be put to good or bad use by its users. During the twentieth century, this neutrality thesis met with severe critique, most prominently by Heidegger and Ellul, who have been mentioned in this context in Section 2, but also by philosophers from the Frankfurt School (Adorno, Horkheimer, Marcuse, Habermas).
As the brief overview above illustrates, the scope and agenda of the ethics of technology depend to a large extent on how technology is conceptualized. The second half of the twentieth century has witnessed a richer variety of conceptualizations of technology that move beyond the conceptualization of technology as a neutral tool, as a world view or as a historical necessity. This includes conceptualizations of technology as a political phenomenon (Winner, Feenberg, Sclove), as a social activity (Latour, Callon, Bijker and others in the area of science and technology studies), as a cultural phenomenon (Ihde, Borgmann), as a professional activity (engineering ethics, e.g., Davis), and as a cognitive activity (Bunge, Vincenti). Despite this diversity, the development in the second half of the twentieth century is characterized by two general trends. One is a move away from technological determinism and the assumption that technology develops autonomously to an emphasis on choices in technological development. The other is a move away from ethical reflection on technology as such to ethical reflection on specific technologies and on specific phases in the development of technology. Both trends together have resulted in an enormous increase in the number and scope of ethical questions that are asked about technology. The developments also imply that ethics of technology is to be adequately empirically informed, not only about the exact consequences of specific technologies but also about the actions of engineers and the process of technological development. This has also opened the way to the involvement of other disciplines in ethical reflections on technology, such as Science and Technology Studies (STS) and Technology Assessment (TA).
Not only is the ethics of technology characterized by a diversity of approaches, it might even be doubted whether something like a subdiscipline of ethics of technology, in the sense of a community of scholars working on a common set of problems, exists. The scholars studying ethical issues in technology have diverse backgrounds (e.g., philosophy, STS, TA, law, political science) and they do not always consider themselves (primarily) ethicists of technology. Moreover, there is limited interaction and discussion between different strands in the ethics of technology, like the ethics of engineering, the ethics of specific technologies (such as computer ethics) and approaches that remain primarily inspired by the traditional philosophy of technology. To give the reader an overview of the field, three basic approaches or strands that might be distinguished in the ethics of technology will be discussed.
3.2.1. Cultural and political approaches
Both cultural and political approaches build on the traditional philosophy and ethics of technology of the first half of the twentieth century. Whereas cultural approaches conceive of technology as a cultural phenomenon that influences our perception of the world, political approaches conceive of technology as a political phenomenon, i.e. as a phenomenon that is ruled by and embodies institutional power relations between people.
Cultural approaches are often phenomenological in nature or at least position themselves in relation to phenomenology as post-phenomenology. Examples of philosophers in this tradition are Don Ihde, Albert Borgmann, Peter-Paul Verbeek and Evan Selinger (e.g., Borgmann 1984; Ihde 1990; Verbeek 2005). The approaches are usually influenced by developments in STS, especially the idea that technologies contain a script that influences not only people's perception of the world but also human behavior, and the idea of the absence of a fundamental distinction between humans and non-humans, including technological artifacts (Akrich 1992; Latour 1992; Latour 1993; Ihde and Selinger 2003). The combination of both ideas has led to the claim that technology has (moral) agency.
Political approaches to technology go back to Marx, who assumed that the material structure of production in society, in which technology is obviously a major factor, determined the economic and social structure of that society. Similarly, Langdon Winner has argued that technologies can embody specific forms of power and authority (Winner 1980). According to him, some technologies are inherently normative in the sense that they require or are strongly compatible with certain social and political relations. Railroads, for example, seem to require a certain authoritative management structure. In other cases, technologies may be political due to the particular way they have been designed. Some political approaches to technology are inspired by (American) pragmatism and, to a lesser extent, discourse ethics. A number of philosophers, for example, have pleaded for a democratization of technological development and the inclusion of ordinary people in the shaping of technology (Winner 1983; Sclove 1995; Feenberg 1999).
Although political approaches have obviously ethical ramifications, many philosophers who have adopted such approaches do not engage in explicit ethical reflection on technology. An interesting recent exception, and an attempt to consolidate a number of recent developments and to articulate them into a more general account of what an ethics of technology should look like, is the volume Pragmatist Ethics for a Technological Culture (Keulartz et al. 2002). In this volume, the authors plead for a revival of the pragmatist tradition in moral philosophy because it is better suited to deal with a number of moral issues in technology. Instead of focusing on how to reach and justify normative judgments about technology, a pragmatist ethics focuses on how to recognize and trace moral problems in the first place. Moreover, the process of dealing with these problems is considered more important than the outcome.
3.2.2. Engineering ethics
Engineering ethics is a relatively new field of education and research. It started off in the 1980s in the United States, merely as an educational effort. Engineering ethics is concerned with ‘the actions and decisions made by persons, individually or collectively, who belong to the profession of engineering’ (Baum 1980: 1). According to this approach, engineering is a profession, in the same way as medicine is a profession.
Although there is no agreement on how a profession exactly should be defined, the following characteristics are often mentioned:
- The use of specialized knowledge and skills that require a long period of study.
- The occupational group has a monopoly on the carrying out of the occupation.
- The assessment of whether the professional work is carried out in a competent way is done, and can only be done, by colleague professionals.
Typical ethical issues that are discussed in engineering ethics are professional obligations of engineers as exemplified in, for example, codes of ethics of engineers, the role of engineers versus managers, competence, honesty, whistle-blowing, concern for safety and conflicts of interest (Davis 2005; Martin and Schinzinger 2005; Harris, Pritchard, and Rabins 2008).
Recently, a number of authors have pleaded for broadening the traditional scope of engineering ethics (e.g., Herkert 2001). This call for a broader approach derives from two concerns. One concern is that the traditional micro-ethical approach in engineering ethics tends to take the contexts in which engineers have to work as given, while major ethical issues pertain to how this context is ‘organized’. It is one thing to deliberate whether an engineer should blow the whistle in a specific situation; it is quite another to deliberate whether cases of whistle-blowing could be largely avoided by better procedures, organizational structures and laws. Another concern is that the traditional micro-ethical focus tends to neglect issues relating to the impact of technology on society or issues relating to decisions about technology. Broadening the scope of engineering ethics would then, among other things, imply more attention to such issues as sustainability and social justice.
3.2.3. Ethics of specific technologies
The last decades have witnessed an increase in ethical inquiries into specific technologies. One of the most visible new fields is probably computer ethics (e.g., Johnson 2001; Weckert 2007; Van den Hoven and Weckert 2008), but biotechnology has spurred dedicated ethical investigations as well (e.g., Morris 2006; Thompson 2007). Also more traditional fields like architecture and urban planning have attracted specific ethical attention (Fox 2000). More recently, nanotechnology and so-called converging technologies have led to the establishment of what is called nanoethics (Allhoff et al. 2007). Apart from this, there has been a debate over the ethics of nuclear deterrence (Finnis et al. 1988).
Obviously the establishment of such new fields of ethical reflection is a response to social and technological developments. Still, the question can be asked whether the social demand is best met by establishing new fields of applied ethics. This issue is in fact regularly discussed as new fields emerge. Several authors have, for example, argued that there is no need for nanoethics because nanotechnology does not raise any really new ethical issues (e.g., Grunwald 2005). The alleged absence of newness here is supported by the claim that the ethical issues raised by nanotechnology are a variation on, and sometimes an intensification of, existing ethical issues, but hardly really new, and by the claim that these issues can be dealt with by means of existing theories and concepts from moral philosophy. For an earlier, similar discussion concerning the supposed new character of ethical issues in computer ethics, see (Maner 1996).
The new fields of ethical reflection are often characterized as applied ethics, that is, as applications of theories, normative standards, concepts and methods developed in moral philosophy. For each of these elements, however, application is usually not straightforward but requires a further development or revision. This is the case because general moral standards, concepts and methods are often not specific enough to be applicable in any direct sense to specific moral problems. ‘Application’ therefore often leads to new insights which might well result in the reformulation or at least refinement of existing normative standards, concepts and methods. In some cases, ethical issues in a specific field might require new standards, concepts or methods. Beauchamp and Childress, for example, have proposed a number of general ethical principles for biomedical ethics (Beauchamp and Childress 2001). These principles are more specific than general normative standards, but still so general and abstract that they apply to different issues in biomedical ethics. In computer ethics, existing moral concepts relating to, for example, privacy and ownership have been redefined and adapted to deal with issues which are typical of the computer age (Johnson 2003). New fields of ethical application might also require new methods for, for example, discerning ethical issues that take into account relevant empirical facts about these fields, like the fact that technological research & development usually takes place in networks of people rather than by individuals (Zwart et al. 2006).
The above suggests that different fields of ethical reflection on specific technologies might well raise their own philosophical and ethical issues. Even if this is true, it is not clear whether this justifies the development of separate subfields or even subdisciplines. It might well be argued that a lot can be learned from interaction and discussion between these fields and a fruitful interaction with the two other strands discussed above (cultural and political approaches and engineering ethics). Currently, such interaction in many cases seems absent, although there are of course exceptions.
We now turn to the description of some themes in the ethics of technology. We focus on a number of general themes that provide an illustration of general issues in the ethics of technology and the way these are treated.
3.3.1. Neutrality versus moral agency
One important general theme in the ethics of technology is the question whether technology is value-laden. Some authors have maintained that technology is value-neutral, in the sense that technology is just a neutral means to an end, and accordingly can be put to good or bad use (e.g., Pitt 2000). This view might have some plausibility in as far as technology is considered to be just a bare physical structure. Most philosophers of technology, however, agree that technological development is a goal-oriented process and that technological artifacts by definition have certain functions, so that they can be used for certain goals but not, or only with far more difficulty or less effectively, for other goals. This conceptual connection between technological artifacts, functions and goals makes it hard to maintain that technology is value-neutral. Even if this point is granted, the value-ladenness of technology can be construed in a host of different ways. Some authors have maintained that technology can have moral agency. This claim suggests that technologies can autonomously and freely ‘act’ in a moral sense and can be held morally responsible for their actions.
The debate whether technologies can have moral agency is most vivid in computer ethics, because (future) computers and artificial agents behave more like humans than other technologies do (Bechtel 1985; Snapper 1985; Dennett 1997; Floridi and Sanders 2004). Still the claim that technologies can have moral agency is also made more generally (Latour 1992; Verbeek 2005). Typically, the authors who claim that technologies (can) have moral agency often redefine the notion of agency, and its connection to human will and freedom (e.g., Latour 1993; Floridi and Sanders 2004). A disadvantage of this strategy is that it tends to blur the morally relevant distinctions between people and technological artifacts. More generally, the claim that technologies have moral agency sometimes seems to have become shorthand for claiming that technology is morally relevant. This, however, overlooks the fact that technologies can be value-laden in other ways than by having moral agency. One might, for example, claim that technology enables (or even invites) and constrains (or even inhibits) certain human actions and the attainment of certain human goals and therefore is to some extent value-laden, without claiming moral agency for technological artifacts.
Responsibility has always been a central theme in the ethics of technology. The traditional philosophy and ethics of technology, however, tended to discuss responsibility in rather general terms and were rather pessimistic about the possibility of engineers to assume responsibility for the technologies they developed. Ellul, for example, has characterized engineers as the high priests of technology, who cherish technology but cannot steer it.
In engineering ethics, the responsibility of engineers is often discussed in relation to codes of ethics that articulate specific responsibilities of engineers. Such codes of ethics stress three types of responsibilities of engineers: 1) conducting the profession with integrity and honesty and in a competent way, 2) responsibilities towards employers and clients and 3) responsibility towards the public and society. With respect to the latter, most US codes of ethics maintain that engineers ‘should hold paramount the safety, health and welfare of the public’.
One may wonder what the grounds are for the responsibilities that are listed in codes of ethics. A possible answer is suggested by Davis (1998): engineers are subject to special moral standards to which other people are not subject because engineering is a profession.
Another possible explanation is that ethical codes do not constitute a contract among professionals, but a contract between a profession and the rest of society. According to this explanation, such a contract would be worthwhile for professionals because in exchange for serving a moral ideal, the profession receives several privileges, such as status, a monopoly on carrying out the occupation, and good salaries.
A third explanation is that the codes of ethics as such are not morally binding but that they express moral responsibilities that are grounded otherwise. One may, for example, ground the responsibility of engineers by applying general philosophical notions of responsibility. Typical conditions for responsibility mentioned in the literature on responsibility include (e.g., Fischer and Ravizza 1993):
- The responsible actor is an intentional agent concerning the action;
- The action, resulting in the outcome, was voluntary;
- The actor knew, or could have known, the outcome;
- The action of the actor contributed causally to the outcome; and
- The causally contributory action was in some way faulty, i.e. the actor can be blamed for the contributory action.
Some of these conditions are hard to meet in engineering practice (Nissenbaum 1996; Swierstra and Jelsma 2006). For example, engineers may feel compelled to act in a certain way due to hierarchical or market constraints, so that the second condition is not fulfilled. Negative consequences may be very hard or impossible to predict beforehand, so that the third condition is not met. The causality condition is also often hard to meet, due to the long chain from the research and development of a technology to its use, and the many people involved in this chain.
If the traditional philosophical notion of responsibility is applied to engineering and technology it might well turn out that nobody is responsible for certain undesirable consequences of technology. This seems an undesirable result, not only because the social consequences of technology are often considerable, but also because it is often the case that negative consequences could have been prevented if certain precautions had been taken or certain people had cooperated better. One could basically react in two ways to this outcome. One reaction is largely to retain the classical notion of responsibility, to identify current barriers to responsibility and then to devise strategies for overcoming these barriers (cf. the entry on Computing and Moral Responsibility in this Encyclopedia). The other reaction is to plead for a new notion of responsibility. Some authors, for example, have pleaded for a notion of responsibility in engineering that is more like the legal notion of strict liability, in which the conditions for being responsible are seriously weakened (e.g., Zandvoort 2000). Other authors have criticized the traditional notion of responsibility for being backward-looking and too much focused on blame (Ladd 1991). An alternative might be sought in a notion of responsibility that is based on virtue ethics. The notion of collective responsibility might also offer an alternative (see, e.g., May and Hoffman 1991). It remains to be seen, however, to what extent such alternative notions are both philosophically and morally tenable and help to overcome the current problems with responsibility in engineering practice.
In recent decades, increasing attention has been paid not only to ethical issues that arise during the use of a technology, but also to those that arise during the design phase. An important consideration behind this development is the thought that during the design phase technologies, and their social consequences, are still malleable, while during the use phase technologies are more or less given and negative social consequences may be harder to avoid or positive effects harder to achieve. Although the design phase is regularly identified as an ethically relevant phase of technological development, there is remarkably little in-depth research on ethics in design. One reason might be that many studies in engineering ethics tend to focus on disasters and on choices and dilemmas that are obviously ethical, while the ethical issues in design are often more subtle, more difficult to recognize and arise on a day-to-day basis (see, e.g., Lloyd and Busby 2003).
Van Gorp and Van de Poel (2001) distinguish five choices in design processes that are potentially ethically relevant: 1) the formulation of goals, design criteria and requirements and their operationalization, 2) the choice of alternatives to be investigated during a design process and the selection among those alternatives at a later stage in the process, 3) the assessment of trade-offs between design criteria (given particular alternatives) and decisions about the acceptability of particular trade-offs, 4) assessment of risks and unintended or unforeseen effects and decisions about the acceptability or desirability of these and 5) the assessment of scripts and political and social visions that are (implicitly) inherent in a design and decisions about the desirability of these scripts. Van Gorp (2005) has shown that the type of ethical issues that arise in design and the way they are dealt with depends on the type of design process: in normal, incremental design, engineers usually have recourse to existing normative frameworks in engineering practice, while such frameworks are absent or difficult to apply in radical, innovative design.
As was mentioned in section 2.4, design is not only a cognitive process but also a social process in which different individuals and groups are involved and in which negotiation plays an important role (Bucciarelli 1994). The social nature of design raises a range of ethical issues, such as: Who is to be involved in the design process? How are decisions to be made in a morally acceptable way? How are responsibilities to be allocated between the various participants (Devon and Van de Poel 2004)?
In computer ethics, an approach known as Value Sensitive Design has been developed to explicitly address the ethical nature of design. Value Sensitive Design aims at integrating values of ethical importance in a systematic way in engineering design (Friedman and Kahn 2003). The approach combines conceptual, empirical and technical investigations.
3.3.4. Technological risks
The risks of technology are one of the traditional ethical concerns in the ethics of technology. Risk is usually defined as the product of the probability of an undesired event and the effect of that event, although other definitions are also around (Hansson 2004b). In general it seems desirable to keep technological risks as small as possible. The larger the risk, the larger either the likelihood or the impact of an undesirable event. Risk reduction therefore is an important goal in technological development, and engineering codes of ethics often assign engineers a responsibility for reducing risks and designing safe products. Still, risk reduction is not always feasible or desirable. It is sometimes not feasible, because there are no absolutely safe products and technologies. But even if risk reduction is feasible it may not be desirable from a moral point of view. Reducing risk often comes at a cost. Safer products may be more difficult to use, more expensive or less sustainable. So sooner or later, one is confronted with the question: what is safe enough? What makes a risk (un)acceptable?
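The standard definition just given — risk as the product of the probability of an undesired event and the magnitude of its effect — has a consequence worth making explicit, and one that partly motivates the alternative definitions mentioned above. A minimal sketch (the probabilities and harm values are invented for illustration) shows that a frequent minor harm and a rare severe harm can count as exactly the same risk:

```python
# Illustrative sketch only; the probabilities and harm values are invented.
def risk(probability, harm):
    """Risk as the product of the probability of an undesired
    event and the magnitude of its effect."""
    return probability * harm

frequent_minor = risk(0.1, 10)    # 10% chance of a harm valued at 10
rare_severe = risk(0.001, 1000)   # 0.1% chance of a harm valued at 1000

print(frequent_minor == rare_severe)  # True: both equal 1.0
```

Whether these two cases should really be treated as morally equivalent is precisely the kind of question that the probability-times-effect definition leaves open.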
The process of dealing with risks is often divided into three stages: risk assessment, risk evaluation and risk management. Of these, the second is most obviously ethically relevant. However, risk assessment also involves value judgments, for example about what risks should be assessed in the first place (Shrader-Frechette 1991). An important, and morally relevant, issue is also the degree of evidence that is needed to establish a risk. In establishing a risk on the basis of a body of empirical data one might make two kinds of mistakes. One can establish a risk when there is actually none (type I error) or one can mistakenly conclude that there is no risk while there actually is a risk (type II error). Science traditionally aims at avoiding type I errors. Several authors have argued that in the specific context of risk assessment it is often more important to avoid type II errors (Cranor 1990; Shrader-Frechette 1991). The reason for this is that risk assessment does not just aim at establishing scientific truth but has a practical aim, namely to provide the knowledge on the basis of which decisions can be made about whether it is desirable to reduce or avoid certain technological risks in order to protect users or the public.
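The contrast between the two error types can be made concrete with a hypothetical threshold sketch (all rates and thresholds below are invented). Declaring a risk only when the observed harm rate clearly exceeds the background rate guards against type I errors, as science traditionally does; a more precautionary threshold guards against type II errors, at the cost of more false alarms:

```python
# Hypothetical illustration; all rates and thresholds are invented.
def declares_risk(observed_rate, threshold):
    """Declare a risk when the observed harm rate exceeds the threshold."""
    return observed_rate > threshold

observed = 0.012        # observed harm rate in the exposed population
strict = 0.014          # demanding threshold: guards against type I errors
precautionary = 0.011   # lenient threshold: guards against type II errors

print(declares_risk(observed, strict))         # False: a real risk may be missed
print(declares_risk(observed, precautionary))  # True: a non-risk may be flagged
```

The choice between the two thresholds is not a purely scientific matter but reflects the practical aim of risk assessment described above.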
Risk evaluation is carried out in a number of ways (see, e.g., Shrader-Frechette 1985). One possible approach is to judge the acceptability of risks by comparing them to other risks or to certain standards. One could, for example, compare technological risks with naturally occurring risks. This approach, however, runs the danger of committing a naturalistic fallacy: naturally occurring risks may (sometimes) be unavoidable but that does not necessarily make them morally acceptable. More generally, it is often dubious to judge the acceptability of the risk of technology A by comparing it to the risk of technology B if A and B are not alternatives in a decision (for this and other fallacies in reasoning about risks, see Hansson 2004a).
A second approach to risk evaluation is risk-cost-benefit analysis, which is based on weighing the risks of an activity against its benefits. Different decision criteria can be applied if a risk-cost-benefit analysis is carried out (Kneese, Ben-David, and Schulze 1983). According to Hansson (2003: 306), usually the following criterion is applied: ‘... a risk is acceptable if and only if the total benefits that the exposure gives rise to outweigh the total risks, measured as the probability-weighted disutility of outcomes’.
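The criterion Hansson reports can be sketched in a few lines; the benefit and disutility figures below are hypothetical, and the sketch assumes, as the criterion does, that disutilities can be quantified and aggregated:

```python
# Sketch of the criterion Hansson (2003: 306) reports; all figures hypothetical.
def acceptable(total_benefits, outcomes):
    """outcomes: list of (probability, disutility) pairs.
    Acceptable iff total benefits outweigh the probability-weighted
    disutility of the possible undesired outcomes."""
    expected_disutility = sum(p * d for p, d in outcomes)
    return total_benefits > expected_disutility

# Two possible harms: expected disutility = 0.05*1000 + 0.001*20000 = 70
print(acceptable(100, [(0.05, 1000), (0.001, 20000)]))  # True: 100 > 70
print(acceptable(50, [(0.05, 1000), (0.001, 20000)]))   # False: 50 < 70
```

Note that the criterion aggregates over outcomes but not over persons: it says nothing about who receives the benefits and who bears the risks, which is the gap the consent-based and fairness-based approaches discussed below try to fill.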
A third approach is to base risk acceptance on the consent of people who suffer the risks after informing them about these risks (informed consent). A problem of this approach is that technological risks usually affect a large number of people at once. Informed consent may therefore lead to a ‘society of stalemates’ (Hansson 2003: 300).
Several authors have proposed alternatives to the traditional approaches to risk evaluation on the basis of philosophical and ethical arguments. Shrader-Frechette (1991) has proposed a number of reforms in risk assessment and evaluation procedures on the basis of a philosophical criticism of current practices. Roeser (2006) argues for a role of emotions in judging the acceptability of risks. Hansson has proposed the following alternative principle for risk evaluation: ‘Exposure of a person to a risk is acceptable if and only if this exposure is part of an equitable social system of risk-taking that works to her advantage’ (Hansson 2003: 305). Hansson's proposal introduces a number of moral considerations into risk evaluation that are traditionally not addressed, or only marginally so: whether individuals profit from a risky activity and whether the distribution of risks and benefits is fair.
- Agricola, G. (1556) De re metallica. Translated by H. C. Hoover and L. H. Hoover. London: The Mining Magazine, 1912.
- Akrich, M. (1992) The description of technical objects. In Shaping technology/building society: Studies in sociotechnical change, edited by W. Bijker and J. Law. Cambridge, Mass.: MIT Press.
- Allhoff, F., P. Lin, J. Moor, and J. Weckert, eds. (2007) Nanoethics: the ethical and social implications of nanotechnology. Hoboken, N.J.: Wiley-Interscience.
- Ariew, A., R. Cummins, and M. Perlman, eds. (2002) Functions: new essays in the philosophy of psychology and biology. New York/Oxford: Oxford University Press.
- Bacon, F. (1627) New Atlantis: A worke vnfinished. In Bacon, Sylva sylvarum: or a naturall historie, in ten centuries. London: William Lee.
- Baum, R. J. (1980) Ethics and engineering curricula. Hastings-on-Hudson: The Hastings Center.
- Beauchamp, T. L. (2003) The nature of applied ethics. In A companion to applied ethics, edited by R. G. Frey and C. H. Wellman. Oxford/Malden, Mass.: Blackwell.
- Beauchamp, T. L., and J. F. Childress (2001) Principles of biomedical ethics. 5th ed. Oxford/New York: Oxford University Press.
- Bechtel, W. (1985) Attributing responsibility to computer systems. Metaphilosophy 16: 296–306.
- Bimber, B. (1990) Karl Marx and the three faces of technological determinism. Social Studies of Science 20: 333–351.
- Borgmann, A. (1984) Technology and the character of contemporary life: a philosophical inquiry. Chicago/London: University of Chicago Press.
- Bucciarelli, L. L. (1994) Designing engineers. Cambridge, Mass.: MIT Press.
- Bunge, M. (1966) Technology as applied science. Technology and Culture 7: 329–347.
- Butler, S. (1872) Erewhon. London: Trubner and Co.
- Cranor, C. F. (1990) Some moral issues in risk assessment. Ethics 101: 123–143.
- Davis, M. (1998) Thinking like an engineer: studies in the ethics of a profession. New York/Oxford: Oxford University Press.
- ––– (2005) Engineering ethics. Aldershot/Burlington, Vt.: Ashgate.
- Dennett, D. C. (1997) When HAL kills, who's to blame? Computer ethics. In Hal's legacy: 2001's computer as dream and reality, edited by D. G. Stork. Cambridge, Mass.: MIT Press.
- Devon, R., and I. R. van de Poel (2004) Design ethics: the social ethics paradigm. International Journal of Engineering Education 20: 461–469.
- Dipert, R. R. (1993) Artifacts, art works, and agency. Philadelphia: Temple University Press.
- Ellul, J. (1964) The technological society. Translated by J. Wilkinson. New York: Alfred A. Knopf.
- Feenberg, A. (1999) Questioning technology. London/New York: Routledge.
- Finnis, J., J. Boyle, and G. Grisez (1988) Nuclear deterrence, morality and realism. Oxford: Oxford University Press.
- Fischer, J. Martin, and M. Ravizza, eds. (1993) Perspectives on moral responsibility. Ithaca, N.Y.: Cornell University Press.
- Floridi, L., and J. W. Sanders (2004) On the morality of artificial agents, Minds and Machines 14: 349–379.
- Fox, W. (2000) Ethics and the built environment, Professional ethics. London/New York: Routledge.
- Franssen, M. and L. L. Bucciarelli (2004) On rationality in engineering design. Journal of Mechanical Design, 126: 945–949.
- Franssen, M. (2005) Arrow's theorem, multi-criteria decision problems and multi-attribute preferences in engineering design. Research in Engineering Design 16: 42–56.
- Friedman, B., and P. H. Kahn, Jr. (2003) Human values, ethics and design. In Handbook of human-computer interaction, edited by J. Jacko and A. Sears. Mahwah, N.J.: Lawrence Erlbaum Associates.
- Grunwald, A. (2005) Nanotechnology: a new field of ethical inquiry? Science and Engineering Ethics 11: 187–201.
- Habermas, J. (1970) Technology and science as ideology. In Toward a rational society. Boston, Mass.: Beacon Press.
- Hansson, S. O. (2003) Ethical criteria of risk acceptance. Erkenntnis 59: 291–309.
- ––– (2004a) Fallacies of risk. Journal of Risk Research 7: 353–360.
- ––– (2004b) Philosophical perspectives on risk. Technè 8: 10–35.
- Harris, C. E., M. S. Pritchard, and M. J. Rabins (2008) Engineering ethics: concepts and cases. 4th ed. Belmont, Cal.: Wadsworth.
- Heidegger, M. (1977) The turning. In The question concerning technology and other essays. New York: Harper and Row.
- Herkert, J. R. (2001) Future directions in engineering ethics research: microethics, macroethics and the role of professional societies. Science and Engineering Ethics 7: 403–414.
- Hughes, J. L., P. A. Kroes, S. D. Zwart (2007) A semantics for means-end relations. Synthese 158: 207–231.
- Ihde, D. (1990) Technology and the lifeworld: from garden to earth. Bloomington: Indiana University Press.
- Ihde, D., and E. Selinger (2003) Chasing technoscience: matrix for materiality. Bloomington: Indiana University Press.
- Jarvie, I. C. (1966) The social character of technological problems: comments on Skolimowski's paper. Technology and Culture 7: 384–390.
- Johnson, D. G. (2001) Computer ethics. 3rd ed. Upper Saddle River, N.J.: Prentice Hall.
- ––– (2003) Computer ethics. In A companion to applied ethics, edited by R. G. Frey and C. H. Wellman. Oxford/Malden, Mass.: Blackwell.
- Kapp, E. (1877) Grundlinien einer Philosophie der Technik: Zur Entstehungsgeschichte der Cultur aus neuen Gesichtspunkten. Braunschweig: Westermann.
- Keulartz, J., M. Korthals, M. Schermer, and T. Swierstra, eds. (2002) Pragmatist ethics for a technological culture. Dordrecht: Kluwer Academic.
- Kneese, A. V., S. Ben-David, and W. D. Schulze (1983) The ethical foundations of benefit-cost analysis. In Energy and the future, edited by D. MacLean and P. G. Brown. Totowa, N.J.: Rowman and Littefield.
- Kotarbinski, T. (1965) Praxiology: an introduction to the sciences of efficient action, Oxford: Pergamon Press.
- Kroes, P., and A. Meijers, eds. (2006) The dual nature of technical artifacts. Special issue of Studies in History and Philosophy of Science 37: 1–158.
- Kroes, P. A., M. Franssen and L. L. Bucciarelli (2009) Rationality in engineering design. In Philosophy of technology and the engineering sciences, edited by A. W. M. Meijers. Elsevier.
- Kuhn, T. S. (1962) The structure of scientific revolutions. Chicago: University of Chicago Press.
- Ladd, J. (1991) Bhopal: an essay on moral responsibility and civic virtue. Journal of Social Philosophy 22: 73–91.
- Latour, B. (1992) Where are the missing masses? In Shaping technology/building society: studies in sociotechnical change, edited by W. Bijker and J. Law. Cambridge, Mass.: MIT Press.
- ––– (1993) We have never been modern. New York: Harvester Wheatsheaf.
- Lloyd, G. E. R. (1973) Analogy in early Greek thought. In The dictionary of the history of ideas, edited by P. P. Wiener, Vol. 1. New York: Charles Scribner's Sons.
- Lloyd, P. A., and J. A. Busby (2003) “Things that went well—no serious injuries or deaths”: Ethical reasoning in a normal engineering design process. Science and Engineering Ethics 9: 503–516.
- Maner, W. (1996) Unique ethical problems in information technology, Science and Engineering Ethics 2: 137–154.
- Martin, M. W., and R. Schinzinger (2005) Ethics in engineering. 4th ed. Boston, Mass.: McGraw-Hill.
- May, L., and S. Hoffman (1991) Collective responsibility: five decades of debate in theoretical and applied ethics. Savage, Md.: Rowman and Littlefield Publishers.
- Millikan, R. G. (1999) Wings, spoons, pills, and quills: a pluralist theory of function. The Journal of Philosophy 96: 191–206.
- Mitcham, C. (1994) Thinking through technology: the path between engineering and philosophy. Chicago: University of Chicago Press.
- Morris, J. (2006) The ethics of biotechnology: biotechnology in the 21st century. [Philadelphia]: Chelsea House Publishers.
- Newman, W. R. (1989) Technology and alchemical debate in the late Middle Ages. Isis 80, 423–445.
- ––– (2004) Promethean ambitions: alchemy and the quest to perfect nature. Chicago: University of Chicago Press.
- Nissenbaum, H. (1996) Accountability in a computerized society. Science and Engineering Ethics 2: 25–42.
- Pitt, J. C. (1999) Thinking about technology: foundations of the philosophy of technology. New York: Seven Bridges Press.
- Polanyi, M. (1958) Personal knowledge: towards a post-critical philosophy. London: Routledge and Kegan Paul.
- Preston, B. (1998) Why is a wing like a spoon? A pluralist theory of function. The Journal of Philosophy 95: 215–254.
- ––– (2003) Of marigold beer: a reply to Vermaas and Houkes. British Journal for the Philosophy of Science 54: 601–612.
- Roeser, S. (2006) The role of emotions in judging the moral acceptability of risks. Safety Science 44: 689–700.
- Ryle, G. (1949) The concept of mind. London: Hutchinson.
- Schummer, J. (2001) Aristotle on technology and nature. Philosophia Naturalis 38: 105–120.
- Sclove, R. E. (1995) Democracy and technology. New York: The Guilford Press.
- Sellars, W. (1962) Philosophy and the scientific image of man. In Frontiers of science and philosophy, edited by R. Colodny, Pittsburgh: University of Pittsburgh Press, 35–78.
- Shrader-Frechette, K. S. (1985) Risk analysis and scientific method: methodological and ethical problems with evaluating societal hazards. Dordrecht: Reidel.
- ––– (1991) Risk and rationality: philosophical foundations for populist reform. Berkeley etc.: University of California Press.
- Simon, H. (1969) The sciences of the artificial. Cambridge, Mass./London: MIT Press.
- Simon, H. A. (1982) Models of bounded rationality. Cambridge, Mass./London: MIT Press.
- Skolimowski, H. (1966) The structure of thinking in technology. Technology and Culture 7: 371–383.
- Snapper, J. W. (1985) Responsibility for computer-based errors. Metaphilosophy 16: 289–295.
- Soavi, M. (2009) Realism and artifact kinds. In Functions and more: comparative philosophy of technical artifacts and biological organisms, edited by U. Krohs and P. A. Kroes. Cambridge, Mass.: MIT Press.
- Suh, N. P. (2001) Axiomatic design: advances and applications. Oxford/New York: Oxford University Press.
- Swierstra, T., and J. Jelsma (2006) Responsibility without moralism in techno-scientific design practice. Science, Technology and Human Values 31: 309–332.
- Thomasson, A. (2003) Realism and human kinds. Philosophy and Phenomenological Research 67: 580–609.
- ––– (2007) Artifacts and human concepts. In Creations of the mind: essays on artifacts and their representation, edited by E. Margolis and S. Laurence, Oxford: Oxford University Press, pp. 52–73.
- Thompson, P. B. (2007) Food biotechnology in ethical perspective. 2nd ed. Dordrecht: Springer.
- Van den Hoven, M. J., and J. Weckert, eds. (2008) Information technology and moral philosophy. Cambridge/New York: Cambridge University Press.
- Van der Pot, J. H. J. (1994) Steward or sorcerer's apprentice? The evaluation of technical progress: a systematic overview of theories and opinions. 2 vols. Delft: Eburon. (2nd ed. 2004 under the title Encyclopedia of technological progress: A systematic overview of theories and opinions.)
- Van Gorp, A. (2005) Ethical issues in engineering design: safety and sustainability. Delft: Simon Stevin Series in the Philosophy of Technology.
- Van Gorp, A., and I. R. van de Poel (2001) Ethical considerations in engineering design processes. IEEE Technology and Society Magazine 20: 15–22.
- Verbeek, P. P. (2005) What things do: philosophical reflections on technology, agency, and design. Translated by R. P. Crease. Penn State: Penn State University Press.
- Vermaas, P. E., and Houkes, W. (2003) Ascribing functions to technical artifacts: a challenge to etiological accounts of functions. British Journal for the Philosophy of Science 54: 261–289.
- Vincenti, W. A. (1990) What engineers know and how they know it: analytical studies from aeronautical history. Baltimore, Md./London: Johns Hopkins University Press.
- Vitruvius (first century BC) The ten books on architecture. Translated by M. H. Morgan. Cambridge, Mass.: Harvard University Press, 1914.
- Weckert, J. (2007) Computer ethics. Aldershot/Burlington, Vt.: Ashgate.
- Winner, L. (1980) Do artifacts have politics? Daedalus 109: 121–136.
- ––– (1983) Techne and politeia: the technical constitution of society. In Philosophy and technology, edited by P. T. Durbin and F. Rapp. Dordrecht: Reidel.
- Zandvoort, H. (2000) Codes of conduct, the law, and technological design and development. In The empirical turn in the philosophy of technology, edited by P. Kroes and A. Meijers. Amsterdam etc.: JAI/Elsevier.
- Zwart, S. D., I. R. van de Poel, H. van Mil and M. Brumsen (2006) A network approach for distinguishing ethical issues in research and development. Science and Engineering Ethics 12: 663–684.
- Ethics and Information Technology
- Science and Engineering Ethics
- Techné: Research in Philosophy and Technology
- Society for Philosophy and Technology.
- Online Ethics Center.
- 3TU.Centre for Ethics and Technology (3TU = T.U. Delft, T.U. Eindhoven, and U. Twente).
- The Dual Nature of Artifacts (Research project, T.U. Delft).
Aristotle, Special Topics: causality | artifact | Bacon, Francis | computer and information ethics | computing: and moral responsibility | information technology: and moral values | moral responsibility | responsibility: collective | risk
The SEP Editors would like to thank Carl Mitcham for his helpful comments on, and suggestions for, this entry. We'd also like to thank Gintautas Miliauskas (Vilnius University) for carefully proofreading the text and suggesting numerous improvements.