Whatever happened to the ethics of engineering?
We’ve seen one disastrous news story after another these past few years, almost all of them knowable and preventable. Planes falling out of the sky. Nuclear power plants melting down. Foreign powers gorging on user data. Emissions tests cheated. Electrical grids burning states to the ground.
These failures don’t cluster by discipline or nationality, nor do the events share an obvious social structure. Facebook machine learning programmers mostly don’t hang out with German VW automotive engineers or Japanese nuclear plant designers. They weren’t taught at the same schools, don’t share the same textbooks, and don’t read the same journals.
Instead, there is a more fundamental thread that binds these disparate and heinous stories together: the increasingly noxious alchemy of complexity and capitalism. Only through a rejuvenation of safety culture can we hope to mend the pair.
Unexpected disasters are really “normal accidents”
Before we start to assign blame though, we need to take a step back and look at these technical systems. Automotive emissions controls, nuclear power plants, airplanes, application platforms, and electrical grids all have one thing in common: they are complex, tightly coupled systems.
They are complex in the sense that they have many individual parts connected in sometimes non-linear ways. They are tightly coupled in the sense that a perturbation to one component can rapidly change the behavior of the entire system.
And so you get a reasonably small safety system on the 737 MAX that downs planes. And a reasonably limited API on a social platform that leaks its users’ entire data stream. And an electrical grid brushing against trees that sparks a fire, killing dozens of people.
All of these outcomes are theoretically preventable, but then, the scale of the interactions in these systems is uncountable. Again, small changes can have enormous effects.
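To make the coupling point concrete, here is a toy sketch (entirely illustrative; none of the names or numbers come from the incidents above): components running near capacity hand their load to a neighbor when they fail, so one small perturbation can cascade through the whole chain.

```python
# Toy model of a tightly coupled system (illustrative only).
# Each component has a capacity; when one fails, its load shifts
# to the next component, which may then fail in turn.

def cascade_failures(capacities, loads, perturbed, extra_load):
    """Add `extra_load` to component `perturbed`, propagate any
    overloads to the neighboring component, and return the sorted
    indices of all components that ultimately fail."""
    loads = list(loads)
    loads[perturbed] += extra_load
    failed = set()
    frontier = [perturbed]
    while frontier:
        i = frontier.pop()
        if i in failed or loads[i] <= capacities[i]:
            continue  # already failed, or still within capacity
        failed.add(i)
        nxt = (i + 1) % len(loads)  # tight coupling: load shifts to a neighbor
        loads[nxt] += loads[i]
        frontier.append(nxt)
    return sorted(failed)

# Five components, each running near capacity.
caps  = [10, 10, 10, 10, 10]
loads = [9, 9, 9, 9, 9]
print(cascade_failures(caps, loads, perturbed=0, extra_load=2))
# prints [0, 1, 2, 3, 4]
```

A two-unit perturbation takes down all five components. Loose coupling would mean shedding load instead of passing it on (dropping requests, tripping breakers); in Perrow’s terms, that is what keeps a local failure from becoming a system failure.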
Years ago, Charles Perrow wrote a splendid book connecting the rising complexity and coupledness of technical systems with the increase in catastrophic, but “normal accidents,” which he used as the book’s title. His thesis wasn’t that such disasters are rare and should be shocking, but rather that the very design of these systems guarantees that accidents must occur. No level of testing or systems design can prevent a mistake across billions and billions of interactions. Thus, we get normal accidents.
He paints a dour picture of the future of engineering, one that may well be too cynical. Engineers have matched some of this growing complexity with more sophisticated tools, mostly derived from greater computing power and better modeling. But there are limits to how far technical tools can help, given how poorly organizations grapple with the complexity of these systems.
Management’s safety delusion
Even if engineers are (potentially) acquiring more sophisticated tools, management itself most definitely is not.
Safety is a very slippery concept. No business leader is anti-safety. None. Every single business leader and manager in the world at least pays lip service to the value of safety. Construction sites may be warrens of danger, but they always have a “hard hats required” sign out front.
Safety may indeed be the first value of almost all of these organizations, but you can spend hours inside a company’s 10-K or 10-Q before finding a single statement about it (except, of course, after disaster strikes).
It’s this intersection of capitalism and complexity where things have gone awry.
One pattern that binds all of these engineering disasters together is that they all had whistleblowers who were aware of the looming danger before it happened. Someone, somewhere knew what was about to transpire, and couldn’t hit the red button to stop the line.
And of course they couldn’t. That’s what happens when the pressure for quarterly earnings and growth is so intense that no one in the organization — not even the CEO — has the capability to stop the system.
What’s strange is that these knowable disasters are hardly profitable for their creators. PG&E entered bankruptcy. Facebook is facing a multi-billion dollar fine. VW settled its scandal for $14.7 billion. The 737 MAX situation is leading to questions about whether Boeing can remain a going concern.
No shareholder wants to shred worthless stock certificates. So where is the disconnect?
Rebuilding an ethical base within engineering culture
Ethics starts with leadership at the top, and specifically with better communication around safety and regulatory concerns to all stakeholders, but most definitely shareholders. Owners of stock in companies with complex technical products need to be told — again and again — that the companies they own will prioritize safety over immediate profits. The tone must always be to value long-term growth and sustainability.
To those who don’t frequent Wall Street watering holes, it may come as a jolt to learn that such a sales process may well be difficult. Investors don’t like to hear that their return on equity will lose some basis points, and would prefer to just buy a credit-default swap and jump ship when the ship literally and metaphorically sinks.
Yet, short-term traders aren’t the only investors available. The capital markets are diverse, and there are trillions of dollars of wealth handled by managers seeking to invest in long-term growth, without the downsides of inevitable disasters. One key part of investor relations is to acquire the investors that match the culture of the firm. If your investors don’t care about safety, no one else will either.
The upshot of most of these scandals is that there is now an extended graveyard of companies to point to, and that will help with these conversations.
Beyond boardrooms and shareholders though, engineering cultures need to build the resiliency to ship and approve products only when they are ready. Engineering leaders need to explain safety concerns to their business executives just as much as they need to constantly reinforce to every individual contributor that safety and security are priorities.
Engineering managers probably have the most challenging role, since they need to sell both upwards and downwards within an organization to maintain safety standards. The pattern I have gleaned from reading many disaster reports over the years is that most safety breakdowns start right here: the engineering manager begins to prioritize business concerns from their leadership over the safety of their own product. Resisting these pecuniary impulses is not enough — safety has to be the watchword for everyone.
Finally, for individual contributors and employees, the key is to always be observant, to think about safety and security while conducting engineering work, and to raise any concerns early and often. Safety requires tenacity. And if the organization you work for is sufficiently corrupt, then frankly, it may be incumbent on you to hit that proverbial red button and blow the whistle to stop the madness.
Here at Extra Crunch, we are trying to do our part to increase awareness of these issues. Our resident humanist, Greg Epstein, interviews and discusses the challenging ethics of our modern technical world with all kinds of thinkers.
Take some of his work as inspiration, since the demise of the ethical engineer doesn’t have to be a fait accompli. Nor do normal accidents — as normal as they are — have to be so common. We can repair capitalism by adding better tools and accountability for all levels of technical organizations. And in the long run — peering into that burgeoning corporate cemetery — that’s an incredible investment for future returns.
Originally published at techcrunch.com