Human bounds: rationality for our species
Is there such a thing as bounded rationality? I first try to make sense of the question, and then suggest which of its disambiguated versions might have answers. We need an account of bounded rationality that takes into account detailed contingent facts about the ways in which human beings fail to perform as we might ideally want. But we should not think in terms of rules or norms that define good responses to an individual’s limitations; rather, we should think in terms of desiderata, situations that limited agents can hope to achieve, and corresponding virtues of achieving them. In particular, we should not adopt watered-down, bounded versions of formal theories defining optimal behavior, since even these can impose enormous computational or cognitive demands.