Even the development of weapons systems themselves (and Wright's discussion of their increasing complexity over time) left him open to criticism, voiced by Steven Pinker, a linguist and cognitive scientist specializing in evolutionary psychology:

:"Natural selection has the "goal" of enhancing replication, period. An increase in complexity and cooperation is just one of many subgoals that help organisms attain that ultimate goal. Other subgoals include increases in size, speed, motor coordination, weaponry, energy efficiency, perceptual acuity, parental care, and so on. All have increased over evolutionary time, but none is the "natural end" of the evolutionary process. Would anyone single out lethal weaponry as "highly likely" or our "destiny," just because weapons have become more lethal over organic and human history?" - Steven Pinker, from Nonzero, Slate.com

Similarly, the idea that ever greater non-zero-sum gains benefit the world at large is also debated, since the same technologies that enable those gains also allow the injury of ever larger numbers of people. While Wright believes that the goal of natural selection is increasing non-zero-sum gains, it is also clear that these gains might not benefit everyone. Though this does not invalidate Wright's thesis, it does dampen the optimism Wright appears to hold for non-zero-sum dynamics. Indeed, in a world of separated, village-like units, atrocities within
Joseph Stalin's Soviet Union or
Adolf Hitler's
Third Reich could not have occurred. (Of course, life within those village-like units had its own inherent problems, and the question of which point in history was better is addressed by arguments within
teleology: whether history has a direction, and thus whether history has shown consistent progress.) Wright believes that overall there has been net progress (with some exceptions), and further that this progress will continue. In response to Wright's assumption that cooperation and communication will continue to increase, Pinker writes:

:"...global cooperation and moral progress will not increase toward some theoretical maximum or Teilhardesque Omega Point, but will level off at a point where the pleasures resulting from global cooperation (having more stuff than you had before) are balanced by the pleasures resulting from non-cooperation (having more stuff than your neighbors, or the warm glow of ethnic chauvinism)." - Steven Pinker, from Nonzero, Slate.com

Pinker also challenges Wright's core thesis, echoing the case made by
Stephen Jay Gould, that human-like organisms are no more than a coincidence:

:"A species with humanlike intelligence was no more "in the cards" than a species with an elephantlike trunk--both are just handy biological gadgets. (Of course, given enough time, humanlike intelligence is near-certain to evolve; but given enough time, anything with nonzero probability is near-certain to evolve, including an elephantlike trunk.) A brain with the intelligence necessary for cooperation and specialization is metabolically expensive and biomechanically hazardous, and evolves only when the evolutionary precursor and current ecosystem make the benefits exceed the costs. Most lineages (e.g., of plants) never got smart, and all lineages of animals on earth except ours were stuck well beneath the subgenius level." - Steven Pinker, from Nonzero, Slate.com

== Wright's response to criticism ==