I’m not sure if this is a field, but it’s the name I’ve given to the intersection of ethics and systems thinking. Some fundamental questions that arise from this intersection are:
- How can we locate ethical responsibility within complex systems with emergent properties where cause and effect are not straightforward to identify? (Here I see a parallel with the ethics of risk in the need to consider ethically the potential effects of our actions.)
- How can we behave ethically within systems that are themselves unethical?
Some practical questions I think are related:
- What are my ethical obligations with respect to climate change? To do nothing, to move my family to a self-sufficient commune, or something in between like buying carbon offsets? (which I’m doing and encourage you to do as well!)
- While I’ll choose to dismantle white supremacy regardless, what are white people’s ethical obligations with regard to multi-generational systems of white supremacy, in which we may not have chosen our race but neither can we opt out of the privilege that comes with it? (This may relate to political philosophy and in particular social contract theory with the idea that we implicitly accept certain ethical responsibilities just by living in a society.)
Historically, ethics has focused on the intentions behind our actions, on what we mean to do. But for people embedded within systems as we are, the intentional effects of our actions are only the tip of the iceberg. To behave ethically within a system is to recognize the vast majority of unintended effects of our actions, and to modify our behavior accordingly. (In economics, these unintended effects are called externalities.)
When unintended, harmful outcomes of one’s actions are pointed out, we typically defend ourselves by saying “Oh, I didn’t mean for that to happen,” and leave it at that. But that assumes the point of ethics is to assign blame, and that once blame is assigned (or deflected) we’re done. Yet it seems to me that what really matters is what we do after that. When it becomes clear that our actions are having a negative effect we did not intend, do we stop? If we don’t, we can no longer claim ignorance of that effect.
But of course, it’s often not a simple question of stopping an action or not. Once I became aware that cattle farming is a significant contributor to climate destabilization, I began eating less beef, and when I do eat beef I seek out beef raised more sustainably. But I didn’t stop eating beef. I did enough to feel like I was doing something; after all, even if I stopped eating meat entirely, that alone wouldn’t be enough to stop climate destabilization. However, I think this notion of “enough” is a trap. Ethics doesn’t call us to do enough; it calls us to do all we can. When the systems in which we live produce outcomes we don’t want, perhaps we should focus on what we personally are putting into the system. With interdependence in mind, we never have total control over the outputs of a system, but we do have control over what we put into it, so perhaps that is where we should place the locus of ethical responsibility.
Systems thinker Peter Senge on the topic, in a talk titled *Systems Thinking for a Better World*:
This is, as far as any of us know, the first time in human history where simple daily acts of living, very mundane things like we plug it into the wall. Very few of us think of this as an ethical action: charging our device, whatever that device might be. But of course that device uses electricity. That electricity has to come from some place…it comes from an electric grid, a grid that moves electricity all over this part of the world… In my country, about 70% of electricity comes from burning coal. I always try to remind people—we get caught up, dazzled by our technology—but we have to realize that none of it works if we don’t plug it into the wall and burn former living things to make it work.
…How many of us want to destroy species? Really, you wake up in the morning and say “What a beautiful day to destabilize the climate a bit more.” Of course we don’t think that way! No one wants to produce the systemic outcomes that we consistently produce. And what I started to realize is that is almost the archetypal definition of systems intelligence, or let’s just say systems ignorance.
Eugenia Cheng in *The Art of Logic in an Illogical World*:
When people argue about who is to blame for something, it is usually an “and” situation. Everyone involved collectively caused whatever it was to happen, linked by their particular situational brand of “and”. And usually, whatever they did was in turn caused by something else, some other pressure in the system or society. So who is to blame? It is possibly futile even trying to answer that question. A better question is “who is going to take responsibility for changing it?”