I recently discovered a blog that's new to me but has been around since 2006: Engineering Ethics Blog, written by "Kaydee," whose profile states that he teaches engineering courses at Texas State and that he has worked in industry and as a consulting engineer.
Engineering ethics is a great interest of mine; I come at it from two directions -- the conventional one, and a less conventional one as well.
The first and common understanding of "engineering ethics" would be the professional ethical standards of practicing and teaching engineers. Engineering organizations have their own published codes of ethics (for example, here is the code of ethics of AIChE, in which I briefly held membership before switching my job description to "never worked a day in her life"*). Engineering colleges routinely offer courses in engineering ethics to their students, although the courses are not always required. Here is an online, non-credit course in engineering ethics offered through MIT OpenCourseWare. (I have not reviewed this course and this link is not a recommendation.)
In that sense, "engineering ethics" encompasses questions of serving public safety, of truth-telling in public statements, of refusing to take part in bribery or corruption, of accurate self-representation, and of working within one's own sphere of competence.
There is another sense in which I, personally, like to think of "engineering ethics." That is the special perspective and specialized knowledge that engineering training brings to the consideration of ethical questions in general.
Such questions confront all human beings above the age of reason. They include questions that societies face in balancing diverse interests through public policy; questions of evaluating costs and benefits, risks and rewards in the workplace and in our personal lives; and how specific ethical questions ("case studies," if you will) refine and sharpen the broad principles of moral philosophy.
What do I mean by the "special perspective and specialized knowledge" of engineers? Emphatically not that the engineer, or any other kind of scientist, has a specially privileged moral sense. Scientists and engineers are no more likely to make good interior ethical judgments than anyone else. We are just as likely to be selfish, corruptible, or arrogant. If all our policymakers were scientists and engineers, we might have more technically informed policy, but we would not necessarily have policy that better served the public good. There is precious little about "what one ought to do" in an engineering education -- and what there is, appears in the engineering ethics courses.
Part of what I mean by the relevant perspective and specialized knowledge is our expertise in technical matters that may inform ethical decisions.
- We may not be specially able to tell the public whether we ought to mandate such-and-such an adaptive technology to better accommodate the disabled in public buildings. But we can tell the public how much the machines will cost, how soon they could be installed, in which places installation would be more difficult, and how frequently they would require maintenance.
- We may not be specially able to tell the public whether we ought to ban a particular type of gadget that has fallen out of favor because of some impact its production or disposal has on the environment. But we can provide a list of feasible substitute gadgets, count up how much the substitutes would cost, estimate how soon new technologies might come to market, consider whether there are alternative means of dealing with the old gadget's problems, and try to predict whether the substitute gadgets will bring worse costs and risks than the old ones.
My husband, the other engineer on this blog, says about his responsibilities giving technical advice at work: "I never tell them that such-and-such an idea is impossible, or that it is a bad idea. That is not my job. I always tell them how much their idea would cost, and let them make the conclusions."
But another aspect that the engineer brings to ethical problems is our trained approach to problems in general. It's sometimes maddening for the people who have to work with us and live with us, but we can be accused of treating every problem, from public policy to household budgetary dilemmas to difficult childrearing decisions to friends' marital woes, as an engineering problem.
What are the constraints? Make a list. What are the available resources? Tally them up. Which pieces of information do we lack? Take a measurement, go get advice, or make an appropriate assumption along with the necessary caveats. What are the conceivable courses of action? Plot them out. Where are the critical decision points? Identify them and identify the possible decisions.
How much could each path cost? What are the possible benefits? Who benefits at each stage? What are the risks -- and what is the probability of each projected undesirable outcome? Can we live with those possibilities?
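The checklist in the two paragraphs above can be sketched as a toy expected-value comparison. Everything here is invented for illustration -- the options, the dollar figures, and the risk probabilities are placeholders, not a real analysis:

```python
# A toy sketch of the engineer's checklist above: list the options,
# tally costs and benefits, weight each risk by its probability,
# and compare. Every name and number is invented for illustration.

options = {
    "repair the old machine": {
        "cost": 2_000,
        "benefit": 5_000,
        "risks": [(0.30, 3_000)],  # (probability, cost if it goes wrong)
    },
    "buy a new machine": {
        "cost": 8_000,
        "benefit": 9_000,
        "risks": [(0.05, 1_000)],
    },
}

def expected_value(option):
    """Benefit minus cost, minus the probability-weighted cost of each risk."""
    risk_cost = sum(p * c for p, c in option["risks"])
    return option["benefit"] - option["cost"] - risk_cost

for name, data in options.items():
    print(f"{name}: expected value = {expected_value(data):,.0f}")

best = max(options, key=lambda name: expected_value(options[name]))
print("Best path by expected value:", best)
```

Of course, real decisions rarely reduce to a single number -- the point of the checklist is the discipline of listing constraints, risks, and decision points, not the arithmetic at the end.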
It's not that we don't take into consideration people's feelings about their problems. It's just that the potential effect of feelings has to be factored into the cost-and-benefit structure. If you're going to have an irrational reaction to one potential outcome of this decision, then I have to factor your irrationality into the potential negative costs.
We tend to drive nontechnical types a little crazy when we import our approach into nontechnical problems. But I think it's a good way to approach problems of all kinds, and is one reason why I'm grateful for my engineering education, which has shaped the way I look at all kinds of things, even though (as noted) I haven't worked a day in my life. At least not recently.
Anyway, here is a sample of posts from Engineering Ethics Blog.
On the unintended consequences of the ethanol-in-gasoline mandate, EISA (the Energy Independence and Security Act of 2007):
At the time EISA was passed, ethanol was the only biofuel that had any reasonable chance of making it into the nation’s gas tanks in a reasonable time frame....
The not-so-advertised reasons for the law have to do with the strength of the agricultural lobby. The E10 mandate was a tremendous windfall for everybody who grows corn. While some ethanol from corn was being used voluntarily as a fuel additive before 2007, the mandate caused this use to skyrocket. By 2011, according to the Mosbacher Institute report by economist James Griffin, 37% of the entire U. S. corn crop went toward ethanol production. And corn prices soared from $2.50 per bushel up to as high as $7.50.
If the only people hurt were U. S. food consumers (not everybody drives a car, but everybody eats), it would be bad enough. But the U. S. grows and sells more corn than any other nation, and much of it is exported to poorer countries, where it is a staple in many diets. While the rise in corn prices was not solely responsible for the worldwide inflation in food costs that led to food riots in many nations in recent years, the timing is suspicious, and there is no question that the EISA law led to hardships for many poor people around the world who were now even less able to afford to eat....
Unintended consequences show up all the time in considering engineering ethics, and the EISA mandate has plenty. The parties who appear to have benefited are: growers of corn and producers of corn-based ethanol (a lot), the U. S. driving public (a little), and the U. S. overall, from the viewpoint of slightly improved energy security. The losers include refiners (who have had to fool with the mandate and change their processes), anybody who buys corn (U. S. food consumers, U. S. livestock growers, and millions of foreign food consumers, many of whom are poor), and the U. S. public in the sense that they have had to pay the 45-cent-a-gallon subsidy through the U. S. treasury. Quite a mixed bag, to say the least.
On the Air France 447 crash of June 1, 2009:
While we will never know why co-pilot Bonin (the one with least experience) did what he did, the fact remains that at 2:10, he pulled the stick back and basically kept it there until it was too late to correct his mistake....
In older aircraft, the two pilot sticks are mechanically coupled together, so only one message goes from the cockpit to the control surfaces. If two pilots disagree on what to do with such a stick, they find themselves literally fighting a tug-of-war in the cockpit, and most reasonable people would react by at least talking about what to do next.
But even in the autopilot-off mode, the Airbus sticks could be moved independently, and the plane responds to the average of the two sticks’ motion. To my ears, this sounds like a software engineer’s solution to a human-factors problem. In the event, even though the senior pilot eventually did the right thing with his stick, the computer averaged it with Bonin’s all-way-back stick, and the stall continued.
... I hope the software and hardware engineers working on the next Airbus rethink their strategy of independent sticks and averaging. While human-machine communication is important, this accident emphasizes the fact that interpersonal communication in a crisis is vital. That single additional channel of communication through a mechanical link between sticks might have been enough to avoid this accident.
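The hazard of the averaging scheme described in the quote can be shown in a few lines. This is a deliberate oversimplification, not actual fly-by-wire logic, and the function name is invented for the sketch:

```python
# Simplified illustration of the two sidestick policies described above.
# Stick positions range from -1.0 (full forward, nose down) to
# +1.0 (full back, nose up). Real fly-by-wire control laws are far
# more complex; this sketch only shows the averaging problem.

def averaged_command(left_stick, right_stick):
    """Independent sticks: the aircraft responds to the average of both."""
    return (left_stick + right_stick) / 2.0

# The junior pilot holds his stick all the way back (+1.0) while the
# senior pilot pushes fully forward (-1.0) to break the stall:
print(averaged_command(-1.0, +1.0))  # 0.0 -- the nose-down correction is cancelled

# With mechanically coupled sticks there is only one shared position,
# so the pilots would physically feel the disagreement and be forced
# to resolve it before any command reached the control surfaces.
shared_stick = -1.0  # whoever wins the tug-of-war sets this single value
```

The averaging rule silently discards the conflict instead of surfacing it, which is exactly the human-factors complaint the quote raises.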
On the MythBusters:

Hyneman and Savage are really doing what used to be called “natural philosophy,” back when philosophy really meant the love of knowledge, and not some arcane specialty that you have to get a Ph. D. in to understand, which is mostly what it means today. Before about 1800, most science was done simply because people were curious and wanted to know whether a thing was true or not. There were no huge funding agencies, no boards of proposal review or journal referees—just a few curious guys (it was nearly all guys then) who got together in coffee shops and wrote each other letters about their experiments. And because there was almost no organized industry producing scientific instruments, they had to build almost all their equipment and experiments themselves.
Hyneman ran a special-effects shop before getting involved with MythBusters, and so the very hands-on demands of that type of work (especially before digital technology took over movies to the degree it has) gave him a set of skills that fits very well into the kind of things required by the MythBusters shows. So his lack of formal scientific training isn’t really a disadvantage—instead, he goes about things the way the average guy with time on his hands might look into them.
Somewhat to my regret, I noted that the Wikipedia biographies of both stars list them as sympathetic with the skeptic or atheist turn of mind. While such a philosophy may be an advantage in their particular line of work, it is by no means a necessity....
The MythBusters people deserve credit for popularizing both science and how to do dangerous things safely. Their latest mishap, although attention-getting, could have been a lot worse, and I’m sure they will be more careful in the future while investigating questions from the past, such as whether a cannonball could really breach a stone wall. And I’m glad they are continuing a long-established tradition of science for science’s sake—even if they are interrupted by messages from their sponsors.
_______
*This joke is the property of MrsDarwin.