Ways To Go Wrong
Herman Kahn, Founder, Hudson Institute
[Photo: Herman Kahn lecturing]

Let us consider some of the mechanisms responsible for undesirable (by contemporary values) results in social processes. There seem to be at least ten important—though sometimes overlapping—"pitfalls."

1. Criteria Too Narrow. Criteria for decisions are often too narrow because the decision-makers are parochial, partisan, or self-interested, or simply not accustomed to considering the new criteria that are becoming relevant.

Because of new technologies, new wealth, new conditions of domestic life and of international relations, unprecedented criteria and issues are coming up for national decision. But in the usual bureaucratic situation an executive is expected to be concerned with his own immediate responsibilities and not to worry unduly about others except for purposes of "political" bargaining or compromise. The only man who has nominal responsibility for the "overall" problem is the President (or other chief executive), who has little time to spend on anything except already "felt" pressures; and his competence obviously is limited. Furthermore, executives often do not make the crucial design decisions or even have much effective influence on them; they tend to make choices among already designed systems. As a result the principles of contingency design are often neglected, or the choices and compromises that are formulated may be far from optimal.

Thus the national executive viewpoint may be narrow simply because there is no group whose professional and continuous job it is to worry about the appropriate issue. We have seen more than one instance in which a new issue is identified, all the officials to whom it is pointed out agree that it is significant, and it remains neglected simply because no one in the bureaucratic structure has a "mission" that would permit him to take cognizance of the new problem. Similarly it often happens that although there are offices in many parts of the government that are responsible for various parts of a problem, there is no one who has responsibility for the problem as a whole, for fitting the traditionally recognized part into a newly glimpsed total system—except of course the Chief Executive, for whom the problem may be at too high or low a level of abstraction or too low in priority in comparison to more pressing (and possibly less important) demands on his attention. It is one of the most important objectives of a policy-oriented research organization to function as a "lobby for the future": that is, to make a deliberate attempt to take a broad and long-range view of problems, and to try to create intellectual pressures on behalf of considerations outside the institutionalized criteria, particularly those considerations relevant to the long-run future or to the larger community, as in the case of issues such as international security. Policy studies should seek to discover important issues that are not currently recognized and should try to see that they are not unduly neglected in favor of more obvious, more pressing, or better-institutionalized considerations.

2. Decisions at Inappropriate Point in Structure. The above discussion inevitably raises the question of who determines the "good of society." This question is further obscured by the ways in which preferences are modified and decisions changed by the decision-making process. In its simplest form this is evident in the so-called Committee Paradox, in which the result of group voting—depending on the agenda—can be different from the result any member of the group would have preferred.
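
The paradox is easiest to see in a concrete case. The sketch below is a minimal illustration with three hypothetical voters whose rankings form a cycle; depending on the order in which the options are paired off, any of the three options can emerge as the committee's choice.

```python
# A minimal sketch of the "Committee Paradox": with cyclic majority
# preferences, the outcome of sequential pairwise voting depends on
# the agenda (the order in which options are paired off).
# The three voters and their preference orderings are hypothetical.

from itertools import permutations

voters = [
    ["A", "B", "C"],   # voter 1 prefers A > B > C
    ["B", "C", "A"],   # voter 2 prefers B > C > A
    ["C", "A", "B"],   # voter 3 prefers C > A > B
]

def majority_winner(x, y):
    """Return whichever of x, y a majority of voters ranks higher."""
    votes_for_x = sum(1 for pref in voters if pref.index(x) < pref.index(y))
    return x if votes_for_x > len(voters) / 2 else y

def run_agenda(agenda):
    """Pit the first two options against each other; the winner meets the next."""
    survivor = agenda[0]
    for challenger in agenda[1:]:
        survivor = majority_winner(survivor, challenger)
    return survivor

for agenda in permutations(["A", "B", "C"]):
    print(agenda, "->", run_agenda(agenda))
# Every option wins under some agenda, even though no voter's ranking changed.
```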

A closely related and common error is to assume that a prescription aimed at macro-behavior will also govern micro-behavior, or vice versa. For example, a park and a police force are, most people will agree, good things. This does not mean that people will contribute toward them, unless they are public-spirited, for each individual is better off, on a strictly individual utilitarian calculus, if other people contribute and he does not have to. Since the decisions of all individuals are presumably independent, no individual has reason to expect that his behavior will affect that of any significant number of other people, if the community is large. (This is one reason why such goals are best implemented by collective rather than individual action.) The converse of this may also occur. If the price of wheat is dropping, a farmer may want to maintain his income by increasing his production. But the same factors that impelled him to this decision will impel others to the same decision, reducing the price of wheat still further. If it drops only a little they may still gain from their added effort, but if it drops below the marginal planting cost, they will have lost as a result of their extra efforts.
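
A back-of-the-envelope version of the wheat case makes the composition effect concrete. The figures and the demand curve in the sketch below are invented purely for illustration: planting more is profitable for a single farmer who takes the price as given, but unprofitable when every farmer does the same.

```python
# A minimal numerical sketch of the wheat example (all figures hypothetical).
# Price falls as total output rises; each farmer, taking the price as given,
# gains by planting more, but when every farmer does the same the price
# drops below the marginal planting cost and all are worse off.

N_FARMERS = 1000
BASE_OUTPUT = 100          # bushels per farmer before the extra planting
EXTRA_OUTPUT = 20          # extra bushels each farmer can plant
MARGINAL_COST = 2.7        # cost per extra bushel planted

def price(total_supply):
    """Hypothetical demand curve: price declines as supply grows."""
    return 300_000 / total_supply

# One farmer plants extra while the rest stand pat: the price barely moves.
p_alone = price(N_FARMERS * BASE_OUTPUT + EXTRA_OUTPUT)
gain_alone = EXTRA_OUTPUT * (p_alone - MARGINAL_COST)

# Every farmer plants extra: the price falls below the marginal cost.
p_all = price(N_FARMERS * (BASE_OUTPUT + EXTRA_OUTPUT))
gain_all = EXTRA_OUTPUT * (p_all - MARGINAL_COST)

print(f"acting alone:    price {p_alone:.2f}, gain from extra planting {gain_alone:+.2f}")
print(f"all act at once: price {p_all:.2f}, gain from extra planting {gain_all:+.2f}")
```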

There are many other instances in which decisions that are rational for members of a group lead to results that are undesirable from the point of view of the whole group. If there is tension between two nations, they would both—considered together—usually be better off to avoid an arms race or a war; but in view of the uncertainties one nation might rationally decide that its best course is to enter the arms race, or even to fight preemptively; the other might then find it better to respond in kind than not.
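
The arms-race logic can be laid out as a simple two-by-two payoff table. The payoffs in the sketch below are hypothetical; the point is only that arming is each side's best reply whatever the other does, even though both prefer mutual restraint.

```python
# A minimal sketch of the arms-race logic as a two-player payoff table
# (the payoff numbers are hypothetical). Whatever the other nation does,
# each is individually better off arming, yet both would prefer the
# outcome in which neither arms.

# payoffs[(my_choice, their_choice)] = my payoff
payoffs = {
    ("restrain", "restrain"): 3,   # both save the cost of an arms race
    ("restrain", "arm"):      0,   # unilateral restraint leaves one exposed
    ("arm",      "restrain"): 4,   # unilateral arming yields an advantage
    ("arm",      "arm"):      1,   # both pay for arms and neither gains an edge
}

for their_choice in ("restrain", "arm"):
    best = max(("restrain", "arm"), key=lambda mine: payoffs[(mine, their_choice)])
    print(f"if the other side chooses {their_choice!r}, my best reply is {best!r}")

# Both sides reason the same way, so both arm and each gets 1,
# although mutual restraint would have given each 3.
```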

From the individual point of view, voting may be judged to be not worth the effort, as there is no reason to believe that a tie vote will occur. (…) Yet if each followed the utilitarian rule for the individual, catastrophe might result. Thus naive utilitarianism can do considerable social damage; and moral rules are required under a variety of circumstances.
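
A rough expected-value calculation, with invented numbers, shows why the narrowly utilitarian voter stays home: the chance of casting the deciding vote is so small that the expected benefit is dwarfed by even a trivial cost of voting.

```python
# A minimal sketch of the individual voting calculus (all numbers hypothetical).
# The individual benefit of voting is roughly (value of swinging the outcome)
# times (probability the election is otherwise tied), which is vanishingly
# small in a large electorate.

value_of_outcome = 10_000.0     # hypothetical personal value of the preferred result
prob_pivotal = 1e-7             # hypothetical chance that one vote decides the election
cost_of_voting = 5.0            # hypothetical cost in time and effort

expected_benefit = value_of_outcome * prob_pivotal
print(f"expected benefit {expected_benefit:.4f} vs cost {cost_of_voting:.2f}")
# Yet if everyone reasoned this way, no one would vote at all.
```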

A mistake related to those in the paragraphs above is to confuse agreement on goals with agreement on means of achieving them, or to confuse agreement on a specific set of means with agreement on goals. Thus the fact (if it is a fact) that two states favor disarmament does not mean that they can agree on the rates and categories of disarmament. Many kinds of instabilities and hidden advantages or disadvantages can inhere in any specific means of implementation; no means may be substantially neutral with respect to the interests of the parties in the interim. If these asymmetries are great enough, they may foreclose agreement even though the destination is mutually agreed. Conversely, as was true during World War II, nations may be able to agree on means, for example, the defeat of Nazi Germany, while disagreeing about their destination, that is, the regime for Europe and the world in the postwar period. Sometimes these disagreements on destinations are obscured in a way that is functional for agreement on present means, say, the war effort, but dysfunctional for the future goals, because of the excessive fear of "rocking the boat" or lack of attention to "less important issues."

3. Inadequate Thought. Failures of perspective in decision-making can be due to aspects of the social utility paradox, but more often result from simple mistakes caused by inadequate thought. It is, for example, common enough that unnecessarily poor analyses are made. Obviously some analyses are done badly because they cannot be done well. But often sufficient information simply is not gathered, or there is a culpable failure in understanding theory. Both errors are avoidable if sufficient thought and time are devoted to the problem. Or there may be insufficient attention paid to hedging against complexities and contingencies. It is usually possible to make a plan that will work well if things go according to specially selected assumptions but that fails disastrously if certain not-unlikely variations from the assumptions take place. It is the purpose of such techniques as systems analysis to make designs that are relatively insensitive to changes in assumptions. For a remarkable number of cases this can be done, given sufficient intelligence, care, and interest. While of course there are problems for which no reasonable "contingency design" is possible, there are still a great many plans that simply have not been thought through because custom, doctrine, or disastrous experience has not created any pressure to do so.
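
A toy comparison can illustrate what it means for a design to be relatively insensitive to changes in assumptions. The two plans and their payoff functions below are hypothetical: one is tuned to a single assumed value of an uncertain parameter, the other hedges, and looking across the whole range of the parameter shows the difference.

```python
# A minimal sketch of the contingency-design point (plans and payoff
# functions are hypothetical). Plan A is tuned to a single assumed value
# of an uncertain parameter; Plan B hedges. Comparing them across the
# whole range of the parameter shows which design is relatively
# insensitive to changes in assumptions.

def payoff_plan_a(x):
    """Excellent if the assumption x == 1.0 holds, poor otherwise."""
    return 100 - 400 * abs(x - 1.0)

def payoff_plan_b(x):
    """Never spectacular, but tolerable across the whole range."""
    return 60 - 20 * abs(x - 1.0)

scenarios = [0.5, 0.75, 1.0, 1.25, 1.5]   # plausible values of the uncertain parameter

for name, plan in (("A", payoff_plan_a), ("B", payoff_plan_b)):
    outcomes = [plan(x) for x in scenarios]
    print(f"plan {name}: worst case {min(outcomes):6.1f}, "
          f"result under the favored assumption {plan(1.0):6.1f}")
```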

Most administrators dislike debating or thinking about fundamentals, even when vague, implicit, and half-formulated views obviously are governing choices, and when some searching debate is clearly desirable. Administrators resist even more "unnecessary" discussions that may become unpleasant or divisive. They tend to resist still more the very basic or very speculative thinking that may be essential to raising issues about the future, in part because of a well-grounded feeling that such thinking and discussing are usually unproductive and expose those who make the attempt to criticism, bureaucratic animosity, or ridicule. Yet it is often necessary to be courageous—or seemingly irresponsible—in suggesting and defending far-fetched issues in argument, if unprecedented but crucial considerations are to be discovered and appreciated. It is necessary to spend some time and energy in a process that frequently leads nowhere for the sake of the instances in which something new is learned.

And even when something new is learned, it is difficult to get "responsible" people to take the results seriously or to face up to thinking the issues through and then to providing relatively clear guidance or to making decisions. Again part of the reason is a lack of confidence based on experience of failure with similar issues. The result is a tendency to make important decisions almost arbitrarily, as if there were no way to judge whether any one decision was better than any other. A surprising number of government committees will make important decisions on fundamental matters with less attention than each individual would give to buying a suit.

4. Bad Luck: Unknown Issues. Sometimes, of course, certain information is simply not available. Then, in a sense, no mistake is made: the decision-maker did not understand the problem even though he thought he did and perhaps had every right to think he did; it is simply bad luck that there were aspects of the problem that could not be assessed. There are undoubtedly situations in which the theory or empirical data are insufficient not only to supply the information needed, but even to alert the planner to the fact that important information is missing. The recognition of this possibility is one of the reasons one must be both humble and skeptical about relying too heavily on either new "logical" analyses or old intuitions in unprecedented situations. It is one reason why decision-makers lack confidence in their ability to raise and settle basic issues; it is also a main justification for contingency design and for trying to defer irrevocable or firm choices by preserving flexibility as long as possible. Flexibility is not always good; firm and irrevocable policies may be better if the policies are correct, and sometimes almost any policy pursued firmly is better than no policy. More often, though, wrong policies pursued firmly foreclose the good results that greater flexibility would have preserved. Unfortunately no single rule of thumb is sufficiently good, and judgment will have to be exercised on each case of assumed importance.

5. Bad Luck: Unlikely Events. Sometimes the best-laid plans go awry for "statistical" reasons. That is, a proper judgment may be made on the basis of the probabilities as they are known, but the improbable occurs; either conditions are met that are far worse than anyone could have anticipated, or some bizarre combination of accidents—each one of which was unlikely in itself but could have been handled—takes place, and "swamps" the system. The most dramatic possibility for bad luck today would be an accidental or inadvertent nuclear war caused by some extremely unlikely, but not absolutely impossible, combination of technological and human errors or failures. Good planning is designed to decrease not only the likelihood of bad luck but also the consequences if it occurs, since the "extremely improbable" is not the same as the impossible.
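
A small arithmetic sketch, with a purely hypothetical probability, shows why the "extremely improbable" cannot be treated as impossible when a system is exposed to the risk day after day for decades.

```python
# A minimal arithmetic sketch of why "extremely improbable" is not "impossible"
# (the probability is hypothetical). An accident chain with only one chance
# in 100,000 of occurring on any given day still becomes an appreciable risk
# when the system is exposed to it day after day for decades.

P_PER_DAY = 1e-5                 # hypothetical daily probability of the accident chain

for years in (1, 10, 50):
    days = 365 * years
    p_at_least_once = 1 - (1 - P_PER_DAY) ** days
    print(f"over {years:2d} years: probability of at least one occurrence ~ {p_at_least_once:.3f}")
```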

6. Changes in Actors. Miscarriages of policy decisions can result from a lack of continuity in the effective actors or pressure groups. In a typical situation one group initiates, another formulates, a third sets up the program, while a fourth actually carries it through. It is this fourth group (or possibly still another group) that furnishes the continuous pressure and determines what the program actually accomplishes. In many cases this turns out to be quite different from what all the previous groups wanted and intended.

7. Inappropriate Models. One kind of inappropriate model is simply technically wrong; someone has made a mistake. For example, many people feel that the unrest in underdeveloped societies results primarily from their poverty. They conclude that foreign aid can decrease the amount of unrest in these countries. Yet one thing that seems very clear by now is that the process of development is itself disruptive and usually increases violence and unrest. Social change is disruptive and partly destructive, causing many breakdowns and strains in existing systems and creating new systems that clash with the old.

Another common error in models is mirror-imaging: one knows one's own self and motives, and one imputes these motives to others. Mistaken mirror images play important roles in foreign policy, in ethnic issues, and in confrontations between classes. Most important of all, mistakes arise out of attempts to treat complex and intractable issues by overly abstract or simple models. This is often done in foreign affairs as well as in middle-class judgments on the poor. No matter how much energy or effort one puts into an analysis or the execution of a policy, if the efforts are guided by a badly formulated model they can be ineffective and even counterproductive.

8. Inappropriate Values. Some of our misgivings about the future may simply be due to the fact that our values are inappropriate to the future. Within broad limits the future's values should belong to the future. It is quite possible that our apprehensions about alienation and affluence, admittedly based upon certain current middle-class and democratic American values, may seem entirely misplaced by the year 2000. We, of course, do not think so; but this may be our limitation.

Almost any decision-maker will find many aspects of subsequent events undesirable. The medieval church doubtless would have more strongly resisted the Renaissance if it had understood that the Renaissance would lead to the secularization of European society. Kings would have fought the rise of the bourgeoisie sooner and more strongly if they had understood that the bourgeoisie would not only support them against the nobles but would eventually take over their role.

Our values may be inappropriate in still another, less easily recognized, way. We may think that we prize a certain aspect of the current system and regard it as an end in itself when in fact it would be better understood as a means to an end. It is common—and very often of great importance—for people to treat means as values, as ends in themselves, since to consider means as merely instrumental is to subject them to questioning. Yet when conditions change, the failure to reconsider the relation of old means to continuing ends can result not only in misdirected efforts but in behavior that becomes destructive to other, more important goals.

9. Over- or Under-Discounting of Uncertainty or the Future. Probably the most important reason that apparently reasonable decisions lead in the long run to undesirable results is that, by and large, it is so difficult to discount uncertain and/or distant difficulties appropriately—neither too much nor too little. For example, it is difficult to imagine a Virginia planter's wanting to stop the slave trade in 1620—or in 1800—because in 1861 there might be civil war. Of course, this might have been reasonable and realistic of him; the future is a region of great uncertainty; and it is the present in which we live and have the power to act. Yet two kinds of mistakes can be made: those who focus pragmatically on case-by-case decisions may take the long run too little into account, while those who are most concerned with adherence to principles, now and forever (and these principles may be radical or conservative), may fail to deal adequately with problems as they arise in the present.
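
A simple present-value calculation, using invented figures, shows how strongly conventional discounting shrinks a distant harm, and why a planter in 1620 could hardly have been moved by a danger more than two centuries off.

```python
# A minimal sketch of how discounting shrinks distant harms (figures hypothetical).
# At any conventional discount rate, a catastrophe more than two centuries away
# is worth almost nothing today, which is one reason long-run dangers get too
# little weight; a zero rate errs in the opposite direction.

harm = 1_000_000.0            # hypothetical present-day magnitude of the future harm
years_away = 241              # e.g., 1620 to 1861 in the planter example

for rate in (0.00, 0.02, 0.05):
    present_value = harm / (1 + rate) ** years_away
    print(f"discount rate {rate:.0%}: present value ~ {present_value:,.2f}")
```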

Of course, the planter in our example might have wanted to stop the slave trade on moral rather than prudential grounds; and events would have shown him right in consequential as well as in absolute terms. One problem of the secular humanist's relativistic "ethic of consequences"—in which the consequences, including both means and ends of each decision, taken together, are weighed against the total consequences of the alternatives—is that it depends so much on fallible assessments of consequences. It could be argued that human judgment on such matters is so typically bad that an absolute morality, which prohibits certain means no matter how comparatively attractive the "total consequences" of means and ends may appear, actually leads to better results, even in consequential terms. On many kinds of issues we find this argument persuasive; and we would find the argument persuasive on many additional issues, if social conditions were not changing so rapidly as to require continual reexamination of means as ends change, lest the means become too much ends in themselves. Under twentieth-century conditions of flux, however, there seems to be more to lose by routinely deciding major policy questions on the basis of received doctrine or principle than by making such decisions on the basis of fallible assessments of the likely results of one choice compared to another.

10. The Best May Be the Enemy of the Good (and Sometimes Vice Versa). Desirability and feasibility may be separable for analytic purposes, but when it comes to making choices they are intimately related. By trying for a great deal one ordinarily increases the risk of failure; by attempting too little one may ensure that at best one does not get very much. On the one hand, if a goal is very desirable it may be possible to arouse a great deal of enthusiasm for it, and its feasibility may be greater than one would have thought; on the other hand, a goal that seems within reach looks more attractive than one that is hard to get.

Obviously there can be no general rule for making such choices; they often turn on subtle, difficult-to-evaluate factors. On the whole, our own judgment is indicated in the title above: limited objectives usually do not preclude further incremental progress, but excessive or utopian objectives often prevent even limited gains from being obtained. If there is any general idea in this field that we would reject, it is the radical or "dialectical" notion that to make things better one should first make them worse, since only then will people understand that something must be done. On this dubious basis Communists resisted ameliorating the lot of workers during the Depression, since to do so would postpone the revolution; similarly there are those who oppose intellectuals giving constructive advice to the military, political, or economic "establishment," since to cause improvements in policies is merely to cloak the "power structure" in a "veneer" of rationality. While this principle of refusing to improve matters, or even of making them worse, no doubt works sometimes, more often it simply makes or leaves things worse than they need be.