I would like to clarify what I mean by ``dignity.'' The American Heritage dictionary lists as the first definition of dignity: ``the presence of poise and self-respect in one's deportment to a degree that inspires respect.'' The image is of Charlie Chaplin's little tramp: he carried himself with poise no matter what troubles came his way. Will machines have dignity based on this definition? Yes. To some degree they already do. This kind of dignity is not as great a concern as the dignity we find in the second definition: ``inherent worth and nobility.''
The big question is: \emph{if} we could give machines the same kinds of emotions as people, then what important distinctions would remain between such future computers and future humans? How would human worth compare to machine worth? It is easy to imagine the relative worth being greater, lesser, or equal in certain situations, for certain tasks. Will this lead to an overall assessment of relative worth?
Distinguished computer scientist Marvin Minsky has stated that computers will eventually be so vastly superior to us that we will be lucky if they keep us around as household pets. We might find this funny for a moment. Some pets get treated quite well. It might not be that bad to lounge around a house all day, given food and water and toys. But these musings only delay the sting of Minsky's remark: owners have more worth than their pets. At issue is man's preeminence over his creation. At risk is the exchange of human dignity for machine dignity---what I will call the ``human-machine dignity exchange.''