Wednesday, December 06, 2006

Two Relationships Between Technology and Values

In many of my previous posts, I have assumed there are human values both independent of technology, and capable of providing benchmarks against which to measure the value of technological change. But technology itself often affects human values. I think some of these effects are crucial to any ideal of progress, but others threaten its very possibility. Here's a preliminary classification:

1) Technology that gives us a better understanding of the world and ourselves: Here, technology can play a crucial role in revealing to us the partiality or flat wrongness of our assumptions. The telescope revealed the shortcomings of a geocentric worldview, occasioning all manner of responsive revisions in elite and popular thought. Philosopher Charles Taylor would call such breakthroughs, and many of their social repercussions, "epistemic gains," which permit us a clearer and better view of the world and ourselves.

2) Technology that blunts or otherwise obscures our understanding of the world and ourselves: As Joel Garreau has argued, human beings are not merely the authors of technological change, but also its objects, to an extent barely imaginable in earlier times. The convergence of genetics, robotics, information technology, and nanotechnology (GRIN) has radically altered our sense of the possible in the realm of self-manipulation.

I'd like to focus on one angle here: the potential for "cosmetic psychopharmacology" to dampen or reverse negative emotional states. (I use the term "cosmetic" to distinguish between the therapeutic alleviation of abnormal states (such as depression) and the enhancement of emotion to the point of feeling "better than well." So "cosmetic" is meant to designate, not the triviality or superficiality of the intervention, but rather how different it is from classic methods of restoring health to a norm.)

On a purely individualist and hedonist account of well-being, we should welcome such a development--bring on the soma! But if we share Martha Nussbaum's account of emotions as judgments of value, a great deal is lost here. The technology may well have "extended human capacity," but for what end? And, more importantly, has it diminished the possibility of our rightly discerning our ends?

To bioethicists like Carl Elliott, using drugs to alleviate mild alienation may lead to self-betrayal, since intuitions about the worth or worthlessness of forms of life around us are constitutive of our identity. Peter Kramer counters that current drugs don’t dispatch such intuitions, but only relieve the negative affect they generate in those who hold them. But this response does not begin to address the social concerns raised by future technological interventions.

5 Comments:

Blogger Gaia Bernstein said...

Some comments about drawing the lines:

First, I think there is a difference between a technology that alters our definition of ourselves--alters our conception of identity--and a technology that obscures our understanding of ourselves. It seems to me that the examples you bring forth of genetics, robotics, and information technology are actually of technologies that alter our conception of identity, perhaps even evolve it. But your example of cosmetic psychopharmacology is different in that it is about blunting or obscuring. So perhaps these are two separate categories?

Second, it is always difficult to define the line between therapy and enhancement. But in the case of mental states, the line may be harder than ever to draw. Who are we to decide when someone is undergoing unbearable and inhuman grief that justifies the use of drugs, and when she is experiencing sorrow, a part of the human experience that she should learn to live through for the sake of carrying on the tradition of humanity?

12/06/2006 9:04 PM  
Blogger LyriaBM said...

I've got to say I find human enhancement an interesting concept. Within our conception of what it means to be human, such things seem wrong. But we are speaking from our own perspective; in the future, it could well be described as evolution. Despite this somewhat distressing argument, I remain, of course, trapped in my own time.

12/07/2006 5:03 AM  
Blogger Frank said...

1) As for Gaia’s point of clarification: absolutely, there’s a big difference between “obscuring” self-understanding and creating a whole new self. I should have made that clearer.

As for “who are we to judge the line between therapy and enhancement here”: Certainly interpersonal comparisons of negative affect are as difficult as (or even more difficult than) the interpersonal comparisons of utility that have bedeviled economic theory for so long. However, given how embedded we are in social contexts, somebody is going to judge—it’s just a matter of how robust social norms are going to be vis-à-vis the various actors with more immediate contact with the subject.

The prospect of emotional “control” via drugs may be, on this level, a “poisoned gift”—what was once merely an option may become an expectation. An anxious person may not be accommodated by a workplace, but rather expected to take beta blockers or whatever the relevant pill is. As Nikolas Rose suggests in The Politics of Life Itself, this process of “responsibilization” can be just as diminishing of autonomy as the old social norms it’s designed to render irrelevant.

Admittedly, sometimes these older social norms are more restrictive than any drugs on the horizon--compare, for instance, the treatment of widows in the Indian film Water and in some parts of Greece (as reported in Kramer's book Listening to Prozac); in both cases, "mourning" was supposed to take over one's life for years, if not a lifetime. But all I'm trying to do is point out how competitive dynamics can render an "option" a necessity. This certainly happened with the car in many parts of America...it's almost impossible to live in LA without one.

12/07/2006 8:50 AM  
Blogger Frank said...

As for Lyria's point: I don't think there's any need to apologize for being "trapped in one's time." I think the Achilles' heel of many visions of enhancement is that they facilely assume that improvement comes to everyone, when in fact the story so often appears to be one of one group gaining power relative to others. Given the libertarian background of most transhumanist advocacy, there is little reason to assume a "peaceable kingdom" of universal enhancement. (See, e.g., Paulina Borsook, Cyberselfish.) On the other hand, I will admit that some of the more socially aware transhumanists have started addressing equality issues (see, e.g., James Hughes, Citizen Cyborg).

12/07/2006 9:01 AM  
Blogger Art said...

I continue to like the duality that you've set up, Frank, looking to the upsides as well as the potential downsides of technology. With respect to the downsides, whether or not taking 'cosmetic psychopharmacology' is 'bad' is clearly fraught with significant value judgments. I recall, for instance, that the prescription of certain types of calming-down drugs (e.g., Ritalin) is widespread for teens and younger students who live in particular areas while it is negligible in others. Presumably the usage of these drugs for kids in certain school districts has become socially acceptable while it remains controversial elsewhere.
Should we care? To bring back the instrumentalist perspective, one view would suggest that parents and other adults should be able to seek whatever assistance from Big Pharma that they see fit--within a broad regulatory regime to inhibit health risks associated with taking the prescription drugs. On the other side of the coin are those theorists who, like Garreau, suggest that the technology itself has an 'essence' of sorts that overwhelms human willpower. Max Weber and Jacques Ellul also probably fall into the category of technological pessimists, as they worried that our tendency to look to technology as a magic bullet for all of our problems would dehumanize us over time. The latter perspective would seem to call for more regulatory efforts to ensure that over-usage does not become a problem; maybe there should be a government screening program to monitor the system to discern what jurisdictions appear to have unacceptably high usage rates. But that approach seems so darned paternalistic ...

12/08/2006 1:44 PM  
