Crafting Nonlinear Technology Regulation
Nonlinear developmental theory offers five concrete lessons for crafting successful technology regulation.
First, nonlinear developmental theory instructs us that human development and learning are always situated; the zone of proximal development varies across individuals. Development is not something that happens to humans in a preordained manner; it is an interactive process that occurs not within the individual but at the person-society border. The society the person experiences thus shapes the course of development, and vice versa. The same biological individual in two different technology-mediated social contexts will arrive at two different developmental outcomes, and potentially two different regulatory prescriptions.
Second, development is an emergent phenomenon. The social context, including the technology itself, changes in frequently unpredictable ways. Regulation predicated on static assumptions about people and technology is therefore destined for quick obsolescence: both human behavior and technology will evolve in response to law. Nonlinear developmental theory shows us that effects on individuals' development and behavior emerge across multiple layers of context, and those layers must coincide in pushing humans in the direction the regulation seeks. The influence of the exosystem of social norms, the mesosystem of peer groups and economic exchange, and the microsystem of the individual's current state of development all come into play. Regulation that fails to consider all of these layers can frequently be circumvented or ignored.
Third, learning and development do not always map cleanly onto chronological age. An adult user whose only interactions with a software application occur once a week for an hour on a shared library machine experiences technology learning and development differently than a ten-year-old child with a dedicated laptop in her bedroom. Technology can act both as an equalizer of abilities and as an exacerbator of differences.
Fourth, regulating the way humans interact with technology means contemplating multiple layers of context that cooperate or conflict to generate development. At various stages of life, developmental progress intersects with identity goals, creating another lens that guides individual behavior and developmental outcomes. Because these identity goals are inherently social, two layers of context press on the individual: first, the context shaping development through interactions, and second, the context in which the individual works toward those identity goals.
Finally, technology is merely a tool that helps humans achieve more than they otherwise could; the regulatory and developmental focus should always remain human-centric. New technologies should be analyzed as tools in the Vygotskian sense: they enable a user to accomplish more than the user ordinarily could without them. The conduct that arises from this assisted action is therefore not new conduct; it is amplified conduct. Regulating technology creation is not the answer; regulating humans, their conduct, and their use of the technology is a more promising approach. Humans, perhaps unlike the technology itself, can demonstrate extreme variation, but they provide a more efficacious, if more complicated, point of regulation.
Placing these five lessons in regulatory context, the Children's Online Privacy Protection Act demonstrates how ignoring the lessons of contextualist developmental theory can result in regulatory suboptimality.