One of the, shall we say, interdisciplinary topics I can really geek out about is unintended consequences.
I first got interested in it with respect to engineering. I like disasters. No, I don't really mean that I like disasters, but they fascinate me. I'm always grumpy about news coverage of bridges falling down and malfunctioning airplanes, because the coverage always stops once it's done with the human-interest stories, and there's never a big fanfare when the NTSB comes out with its final report and explains why and how it all happened.
Sometimes the cause of things falling down is a fairly uncomplicated error, triggered by the sudden application of unusual circumstances (like the 35W bridge collapse here in Minneapolis -- an underdesigned set of parts failed 40 years into the bridge's life during a massive resurfacing project that altered the loads on the bridge) or a perfect storm of errors (like the Gimli Glider story from 1983 that I linked to just a few days ago). But I am particularly interested in bad things that result from good intentions: unintended consequences.
(Actually, the Gimli Glider story is a bit of an example of that: the accident would not have happened had Canada not decided to convert to the metric system, which (one assumes) was a well-intentioned policy change.)
I have a good book on the topic of unintended engineering consequences somewhere here at home. It's called Why Things Bite Back, and it's all about technological fixes that create bigger problems than the ones they were intended to solve. There are also a few examples among the case studies of failures in Henry Petroski's To Engineer Is Human.
But I'm also interested in unintended consequences in law and policy, and unintended consequences in social engineering. What put me in mind of these today was a post at the Volokh Conspiracy about the abuse of privacy law:
New Hampshire is one of about a dozen “all party consent” states. The federal government and most states are “one party consent” jurisdictions, where recording is legal if one participant agrees to it. ... Recently, though, these laws have mainly protected the police, who’ve used the laws to arrest bystanders for making cell phone videos of police conduct.
Judging from the outrage such arrests have sparked, it’s safe to say that protecting police from public scrutiny is an unintended consequence of this privacy law. (As I’ve pointed out recently, that is not the only unintended consequence of all-party consent laws. They’re also bad for computer security. In most states, I can hire someone to screen incoming messages for malware, and as long as I consent to the monitoring there’s no legal problem. In all-party consent states, though, there’s a real risk that I need the consent of the malware sender before someone can screen his incoming message.)
...Privacy laws are largely efforts to regulate technology. Some new technology comes along, and we don’t like some of the changes it is likely to bring. The privacy campaigners tell us that we can keep the good parts of the technology and ban the bad. So we adopt a new privacy law based on some principle that sounds good to us at the time.
All-party consent laws, for example, responded to cheap taping equipment by adopting the principle that both parties should agree before their conversation is recorded. It sounded good at the time; after all, wouldn’t any other rule encourage treachery? Then, gradually, cheap recording technology spread, and it became easier and easier to violate the law. After a while, the principle that sounded so good a decade or two earlier began to seem a little artificial. Our internal privacy expectations had changed, but the law hadn’t.
Inevitably, violations of the law proliferated, to the point where the violations didn’t feel like wrongdoing.
When law-breaking is widespread and unapologetic, the authorities can pick and choose whom they prosecute. Is it any surprise that they choose to prosecute people who inconvenience the authorities? Or that the laws end up being used to bolster the status quo? ...[H]aving laws on the books that are widely violated because they no longer fit our actual sense of right and wrong practically invites abuse by those in power.
The point of having laws is to punish, stigmatize, and prevent behavior that is widely accepted to be wrong, and to do so fairly. Once a law is technically broken by lots of people, it eventually gets used only by the powerful and privileged to punish people who threaten them. If that isn't an unintended consequence, I don't know what is.
It occurred to me I could use an "unintended consequences" category, so I'm creating one.
Oh I like looking at unintended consequences too. Fascinating stuff.
Posted by: MelanieB | 14 August 2012 at 09:18 AM