I recently discovered a blog that's new to me but has been around since 2006: Engineering Ethics Blog, written by "Kaydee," whose profile states that he teaches engineering courses at Texas State and that he has worked in industry and as a consulting engineer.
Engineering ethics is a great interest of mine; I come at it from two directions -- the conventional one, and a different one as well.
The first and common understanding of "engineering ethics" would be the professional ethical standards of practicing and teaching engineers. Engineering organizations have their own published codes of ethics (for example, here is the code of ethics of AIChE, in which I briefly held membership before switching my job description to "never worked a day in her life"*). Engineering colleges routinely offer courses in engineering ethics to their students, although the courses are not always required. Here is an online, non-credit course in engineering ethics offered through MIT OpenCourseWare. (I have not reviewed this course and this link is not a recommendation.)
In that sense, "engineering ethics" encompasses questions of serving public safety, of truth-telling in public statements, of refusing to take part in bribery or corruption, of accurate self-representation, and of working within one's own sphere of competence.
There is another sense in which I, personally, like to think of "engineering ethics." That is the special perspective and specialized knowledge that engineering training brings to the consideration of ethical questions in general.
Such questions confront all human beings above the age of reason. They include questions that societies face in balancing diverse interests through public policy; questions of evaluating costs and benefits, risks and rewards in the workplace and in our personal lives; and questions of how specific ethical cases ("case studies," if you will) refine and sharpen the broad principles of moral philosophy.
What do I mean by the "special perspective and specialized knowledge" of engineers? Emphatically not that the engineer, or any other kind of scientist, has a specially privileged moral sense. Scientists and engineers are no more likely to make good interior ethical judgments than anyone else. We are just as likely to be selfish, corruptible, or arrogant. If all our policymakers were scientists and engineers, we might have more technically informed policy, but we would not necessarily have policy that better served the public good. There is precious little about "what one ought to do" in an engineering education -- and what there is appears in the engineering ethics courses.
Part of what I mean by the relevant perspective and specialized knowledge is our expertise in technical matters that may inform ethical decisions.
- We may not be specially able to tell the public whether we ought to mandate such-and-such an adaptive technology to better accommodate the disabled in public buildings. But we can tell the public how much the machines will cost, how soon they could be installed, in which places they would be more difficult to install, and how frequently they would require maintenance.
- We may not be specially able to tell the public whether we ought to ban a particular type of gadget that has fallen out of favor because of some impact its production or disposal has on the environment. But we can provide a list of feasible substitute gadgets, count up how much the substitutes would cost, estimate how soon new technologies might come to market, consider whether there are alternative means of dealing with the old gadget's problems, and try to predict whether the substitute gadgets will bring worse costs and risks than the old ones.
My husband, the other engineer on this blog, says about his responsibilities giving technical advice at work: "I never tell them that such-and-such an idea is impossible, or that it is a bad idea. That is not my job. I always tell them how much their idea would cost, and let them make the conclusions."
But another aspect that the engineer brings to ethical problems is our trained approach to problems in general. It's sometimes maddening for the people who have to work with us and live with us, but we can be accused of treating every problem, from public policy to household budgetary dilemmas to difficult childrearing decisions to friends' marital woes, as an engineering problem.
What are the constraints? Make a list. What are the available resources? Tally them up. Which pieces of information do we lack? Take a measurement, go get advice, or make an appropriate assumption along with the necessary caveats. What are the conceivable courses of action? Plot them out. Where are the critical decision points? Identify them and identify the possible decisions.
How much could each path cost? What are the possible benefits? Who benefits at each stage? What are the risks -- and what is the probability of each projected undesirable outcome? Can we live with those possibilities?
It's not that we don't take into consideration people's feelings about their problems. It's just that the potential effect of feelings has to be factored into the cost-and-benefit structure. If you're going to have an irrational reaction to one potential outcome of this decision, then I have to factor your irrationality into the potential negative costs.
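To make that habit of mind concrete -- purely as a toy sketch of my own, with a made-up furnace dilemma and invented numbers and probabilities -- the tallying might look something like this:

```python
# A toy expected-cost tally for a household decision. Every option carries
# its direct cost plus each risk's cost weighted by its probability --
# including the "cost" of someone's predictable reaction to the outcome.
# All figures here are invented for illustration.

options = {
    "repair the old furnace": {
        "direct_cost": 1200,
        "risks": [              # (probability, cost if it happens)
            (0.30, 4000),       # it fails again within two years
            (0.10, 500),        # household grumbling that we didn't just replace it
        ],
    },
    "replace the furnace now": {
        "direct_cost": 6500,
        "risks": [
            (0.05, 1000),       # installation goes badly
        ],
    },
}

def expected_cost(option):
    """Direct cost plus the probability-weighted cost of each risk."""
    return option["direct_cost"] + sum(p * c for p, c in option["risks"])

for name, option in sorted(options.items(), key=lambda kv: expected_cost(kv[1])):
    print(f"{name}: expected cost ${expected_cost(option):,.0f}")
```

No real family decision reduces to a dictionary and a loop, of course; the point is only that the engineer's instinct is to make the costs, the risks, and even the feelings explicit, and then compare.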
We tend to drive nontechnical types a little crazy when we import our approach into nontechnical problems. But I think it's a good way to approach problems of all kinds, and is one reason why I'm grateful for my engineering education, which has shaped the way I look at all kinds of things, even though (as noted) I haven't worked a day in my life. At least not recently.
Anyway, here is a sample of posts from Engineering Ethics Blog.
On the unintended consequences of EISA, the ethanol-in-gasoline mandate:
At the time EISA was passed, ethanol was the only biofuel that had any reasonable chance of making it into the nation’s gas tanks in a reasonable time frame....
The not-so-advertised reasons for the law have to do with the strength of the agricultural lobby. The E10 mandate was a tremendous windfall for everybody who grows corn. While some ethanol from corn was being used voluntarily as a fuel additive before 2007, the mandate caused this use to skyrocket. By 2011, according to the Mosbacher Institute report by economist James Griffin, 37% of the entire U. S. corn crop went toward ethanol production. And corn prices soared from $2.50 per bushel up to as high as $7.50.
If the only people hurt were U. S. food consumers (not everybody drives a car, but everybody eats), it would be bad enough. But the U. S. grows and sells more corn than any other nation, and much of it is exported to poorer countries, where it is a staple in many diets. While the rise in corn prices was not solely responsible for the worldwide inflation in food costs that led to food riots in many nations in recent years, the timing is suspicious, and there is no question that the EISA law led to hardships for many poor people around the world who were now even less able to afford to eat....
Unintended consequences show up all the time in considering engineering ethics, and the EISA mandate has plenty. The parties who appear to have benefited are: growers of corn and producers of corn-based ethanol (a lot), the U. S. driving public (a little), and the U. S. overall, from the viewpoint of slightly improved energy security. The losers include refiners (who have had to fool with the mandate and change their processes), anybody who buys corn (U. S. food consumers, U. S. livestock growers, and millions of foreign food consumers, many of whom are poor), and the U. S. public in the sense that they have had to pay the 45-cent-a-gallon subsidy through the U. S. treasury. Quite a mixed bag, to say the least.
On the Air France 447 crash of June 1, 2009:
While we will never know why co-pilot Bonin (the one with least experience) did what he did, the fact remains that at 2:10, he pulled the stick back and basically kept it there until it was too late to correct his mistake....
In older aircraft, the two pilot sticks are mechanically coupled together, so only one message goes from the cockpit to the ailerons. If two pilots disagree on what to do with such a stick, they find themselves literally fighting a tug-of-war in the cockpit, and most reasonable people would react by at least talking about what to do next.
But even in the autopilot-off mode, the Airbus sticks could be moved independently, and the plane responds to the average of the two sticks’ motion. To my ears, this sounds like a software engineer’s solution to a human-factors problem. In the event, even though the senior pilot eventually did the right thing with his stick, the computer averaged it with Bonin’s all-way-back stick, and the stall continued.
... I hope the software and hardware engineers working on the next Airbus rethink their strategy of independent sticks and averaging. While human-machine communication is important, this accident emphasizes the fact that interpersonal communication in a crisis is vital. That single additional channel of communication through a mechanical link between sticks might have been enough to avoid this accident.
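The averaging problem is easy to see with a little arithmetic. Here is a toy sketch of my own -- not the actual Airbus control law, just invented sign conventions: if one stick is held full back and the other pushed full forward, the two commands average to zero and the correction simply disappears, whereas a mechanical link would at least make the disagreement unmistakable.

```python
# Toy comparison of independent-and-averaged side sticks vs. mechanically
# coupled ones. Signs are invented: +1.0 = full nose-up, -1.0 = full nose-down.
# This is not the real Airbus control law, only arithmetic for illustration.

def averaged_command(stick_a: float, stick_b: float) -> float:
    """Independent sticks: the flight computer acts on the average."""
    return (stick_a + stick_b) / 2

def coupled_command(stick_a: float, stick_b: float):
    """Mechanically linked sticks: a disagreement is a physical tug-of-war,
    so here we simply flag the conflict instead of producing a command."""
    if stick_a * stick_b < 0:
        return None  # conflict -- the pilots have to sort it out between them
    return max(stick_a, stick_b, key=abs)

bonin, senior = +1.0, -1.0               # full back vs. full forward
print(averaged_command(bonin, senior))   # 0.0 -- the nose-down correction vanishes
print(coupled_command(bonin, senior))    # None -- the disagreement is unmistakable
```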
On the MythBusters, Jamie Hyneman and Adam Savage:
Hyneman and Savage are really doing what used to be called “natural philosophy,” back when philosophy really meant the love of knowledge, and not some arcane specialty that you have to get a Ph. D. in to understand, which is mostly what it means today. Before about 1800, most science was done simply because people were curious and wanted to know whether a thing was true or not. There were no huge funding agencies, no boards of proposal review or journal referees—just a few curious guys (it was nearly all guys then) who got together in coffee shops and wrote each other letters about their experiments. And because there was almost no organized industry producing scientific instruments, they had to build almost all their equipment and experiments themselves.
Hyneman ran a special-effects shop before getting involved with MythBusters, and so the very hands-on demands of that type of work (especially before digital technology took over movies to the degree it has) gave him a set of skills that fits very well into the kind of things required by the MythBusters shows. So his lack of formal scientific training isn’t really a disadvantage—instead, he goes about things the way the average guy with time on his hands might look into them.
Somewhat to my regret, I noted that the Wikipedia biographies of both stars list them as sympathetic with the skeptic or atheist turn of mind. While such a philosophy may be an advantage in their particular line of work, it is by no means a necessity....
The MythBusters people deserve credit for popularizing both science and how to do dangerous things safely. Their latest mishap, although attention-getting, could have been a lot worse, and I’m sure they will be more careful in the future while investigating questions from the past, such as whether a cannonball could really breach a stone wall. And I’m glad they are continuing a long-established tradition of science for science’s sake—even if they are interrupted by messages from their sponsors.
_______
*This joke is the property of MrsDarwin.
1a. When my child can't or won't respond to an apology with "I forgive you." Truthfully, my children usually say the words "I forgive you" easily because, I think, they are so well practiced at it. They know that the sky doesn't fall down if they say it before they "feel" it. They know that life goes on, and usually the words have helped.
But if it were one of my children who felt they could not forgive another, I would take that as a sign that the kids need additional intervention before the discussion is over. We would stay there and explore it further -- maybe my child is reasonably afraid that the behavior won't stop, in which case it is probably time for some redirection to different activities. You can forgive someone and still decide you're done playing with them for the day. On the other hand, if my child is just being obstinate, well, I don't allow that for the young ones. If you can say "I forgive you," you do.
1b. When my child seeks forgiveness and doesn't hear it.
More commonly, it's my child who has apologized and asked for forgiveness, and another child doesn't say the words. I've been through this one a lot.
2a. The "fake apology" coming from my kid. Well, young children rarely give "fake apologies," right? The "I'm sorry you took what I said the wrong way" kind? They sometimes refuse to apologize, and they sometimes say "I'm sorry" when they don't mean it -- and that last is something I wholeheartedly support!
Fake apologies, which have the words "I am sorry" or "I regret" in them but point the sorrow or the regret the wrong direction because they are not grounded in a desire for forgiveness but instead in a desire to continue making a point, are the domain of older kids and adults. People who want to save face.
I have yet to hit the teen years, but I imagine that if I hear one of those coming out of my tween's mouth, I will take him aside and explain that there is nothing wrong or unusual about feeling that you have been misunderstood or wrongly accused, but that the fake apology is never appropriate. If you really desire to be forgiven (and even if the other person is wrong about you, you should desire his forgiveness, because forgiveness is good for him and good for your relationship), you will find a way to express that desire sincerely.
Maybe you will have to suck it up and say "I am sorry" and let the person think it is an admission of wrongdoing. Maybe it is not advisable to admit wrongdoing (there are sometimes legal consequences, after all), and if no apology you can offer is accepted, at that point you just have to let it go and try (silently) to forgive *him* for refusing to forgive *you*.
2b. Other people's fake apologies. "If someone fake-apologizes to you," I guess I will tell my kids, "the ball is in your court."
You have the choice to accept it as if it were a sincere apology. This is called "taking the high road." It is difficult, but you can have some satisfaction because it is an exercise in humility. It means you let the other person have the last word, and you let it stand for what it is. Sometimes the exact choice of words can be a little tricky, though, because what makes a fake apology fake is that it does not, actually, ask for forgiveness, and so "I forgive you" is a non sequitur.
(Try it: "I'm sorry you took what I said out of context." "I forgive you." Doesn't work, does it? You see why the fake apology is so insidious? It deprives both people of forgiveness. The only logical response is... "...Um... I'm sorry I took what you said out of context, too?" Or... "I forgive you for being so unclear that I couldn't tell what the context was?" Logically, the argument continues.)
Anyway, "I accept your apology" might work.
There is an alternative response, particularly if you care what the other person thinks of you. You have the choice to treat it as an opportunity for more dialogue, chock-full of I-statements ("I get the sense that you feel I have misunderstood you. Do you want to tell me more about that?"). Past history and expected future interaction are the guide to which approach makes sense, and you can stop at any point and accept the apology -- such as it is -- on the theory that it is the best you will get.
+ + +
One more thing: I think it is totally appropriate to apply lessons for children to adult problems. That is exactly what we are supposed to do as we grow up: apply what we have learned in the past to whatever is going on right now. It works really well if the lessons were good. And I think that the purpose of apologies does not change with age.