A Synthetic Theory of Law and Technology
I’d like to start my last post by thanking Gaia, Frank and Jim for putting this blog together. I also now see that my link to ‘digital biosphere’ in yesterday’s post was broken; it should be okay now.
Today’s post will discuss a forthcoming co-authored work (with Jason Pridmore), ‘A Synthetic Theory of Law and Technology’, Minnesota Journal of Science and Technology (forthcoming 2007), where we discuss how a synthetic theory of law and technology could inform law and tech analysis. I don’t have a copy posted anywhere, but I’d be happy to email you a copy of the draft if you’re interested.
The theory draws from existing literature, mainly developed by sociologists. I suppose it might be possible to develop a theory from scratch, examining issues such as the definition of technology, but it may make more sense to draw from a mature body of literature. Other disciplines, such as economists' theories of economic diffusion, might also serve to ground a law and tech theory.
First off, why a synthetic theory? Why not say polyester, or perhaps a nice cotton blend? The synthetic theory is a synthesis of two broad theories of technology: instrumental theories and substantive theories. Instrumental theories (probably more like social perspectives than an outright theory) tend to treat technology as a neutral tool without examining its broader social and cultural impact. In contrast, substantive theories emphasize the ways that technological systems can exert ‘control’ over individuals, often without their knowledge that this process is taking place.
From our perspective, each theory, standing alone, has disadvantages that reduce its utility for legal analysis. Instrumental theories fail to take full account of the contextual complexities that could inform legal analysis in search of optimal policy solutions in an environment of technological change. Substantive theories, on the other hand, appear to over-emphasize the need to address the social impact of technological structures at the expense of a fuller consideration of human agency and of examining each case on its particular facts and circumstances. We tried to draw out and integrate the most helpful elements of both theories to create the synthetic theory.
It may be helpful to offer an example of the ways that technologies can have a substantive impact (whether political, social, cultural or some other kind) on society such that, according to the substantive theories, they should not be viewed as merely neutral tools. In ‘Do Artifacts Have Politics?’, Langdon Winner takes it as a given that technologies are interwoven into modern politics and in fact embody specific forms of power and authority. To sustain this point, Winner uses the examples of low highway overpasses and mechanical iron molding machines. The overpass bridges were built low to deliberately prevent low-income transportation (e.g., buses) from travelling out of New York towards the homes of the wealthy on Long Island. The molding machines did not work as well, or as cheaply, as skilled iron workers, but they were implemented to effectively prevent the workers from unionizing, as the factory owners now had an alternative if needed. To Winner, it is obvious that technologies stack the deck in favour of certain social or political interests and, as such, have a substantive impact on society that exists outside of their intended use.
For a more modern example, consider cell phones: they were developed to enable wireless communications, but an unintended use is that they reveal the geographic location of the user, potentially for state investigatory purposes at some later date. So many of us now carry around a state tracking device without a second thought. Substantive theorists, including critical theorists and thinkers like Max Weber and Jacques Ellul, worry that technology is embedded within social structures such as capitalism (or Ellul’s technique) that render the actions of human agents insignificant. We no longer seem to mind carrying around tracking devices, which may help to change or ‘determine’ individual and social expectations about privacy in the context of state searches.
We propose a synthetic theory that tries to balance the potentials for restrictive and beneficial forms of social structure against the limitations and potentials of human agency. The synthetic theory could then be directed at the analysis of the three broad themes or general principles at the intersection of law and technology discussed in the last post: (a) the analysis needs to try to account for the complex and interdependent relationship between law and tech; (b) the analysis needs to explore how the regulation of tech could indirectly protect legal interests; and (c) the analysis should explore whether tech change is subverting traditional legal interests and, if so, deploy creative analysis that is less deferential to traditional doctrine in order to preserve these interests.
Consider briefly the deployment of new surveillance technologies and enhanced sharing of personal information among governments in the post-9/11 environment, coupled with legal changes in many countries that reduce traditional protections against unreasonable state searches. A judge presented with a case involving state searches of terrorist suspects could, by drawing from substantive perspectives of technology, gain a more accurate assessment of the risks associated with reducing legal protections in an era of enhanced surveillance technologies. Under the substantive view, legal analysis should recognize the ‘public’ or ‘social’ aspect of privacy: society’s interest in preserving privacy apart from a particular individual’s interest.
Priscilla Regan, for instance, argues that privacy serves purposes beyond those that it performs for a particular individual: one aspect of the social value of privacy is that it sets boundaries that the state’s exercise of power should not transgress, in order to preserve, for example, freedom of speech and association within a democratic political system (see Priscilla Regan, Legislating Privacy: Technology, Social Values, and Public Policy (1995)). Under this view, even if privacy becomes less important to certain individuals, it continues to serve other critical interests in a free and democratic state (e.g., the need to protect political dissent). As such, the preservation of the social value of privacy can be portrayed as consistent with the promotion of long-term security interests.
Consistent with this view, research by sociologists, political scientists and others discusses how advances in surveillance technology heighten the risk of unanticipated adverse social consequences. These outcomes include repression of political dissent, as surveillance technologies are used to target identifiable groups such as Muslims despite no evidence of individual wrongdoing; this sort of profiling also tends to lead to social alienation of the targeted group, who increasingly take on an ‘us’ versus ‘them’ mentality. Our research team, the Queen’s Surveillance Project, discusses some of these issues in the context of a recent public inquiry involving a Canadian citizen who was sent by U.S. authorities to Syria, where he was tortured for over a year, in part as a result of inaccurate information provided by Canadian police agencies. We are trying to reform Canadian law to exert more public oversight over Canadian agencies and their sharing of information about Canadians with foreign agencies.
1 Comments:
Just a thought - rather than technologies exercising control, a nice phrase I encountered recently (I think it is from Latour) is that technologies carry a "script." That is, they don't necessarily force behaviour (although they may), but they carry with them a usual means of interaction. A speed hump, for instance, does not MAKE you drive slowly, but there are consequences if you don't. Cell phones are similar - the existence of cell phones does not force us to carry a tracking device, but the two come packaged together. It is different from, say, heat-seeking detection that inevitably makes it possible for police to detect things through brick walls whether or not I "opt in." Should this distinction matter?