Saturday, May 18, 2013

Complexity and Hubris


In the book "Immoderate Greatness," author William Ophuls describes how an organization like a civilization grows complex by creating chaos and degradation in its environment, eventually using up or rendering useless one or more resources that is critical to its survival. Ironically, the complexity works against it, because the people in the civilization are inherently unable to understand it well enough to predict its effects, and can't recognize the symptoms of impending decline. When the decline starts, chaos grows and overwhelms them. They then react inappropriately, and the civilization rapidly collapses. This concurs with other things I've read, along with observations of how governments and businesses tend to develop.

There are, of course, some people who recognize the symptoms and what they mean. Unfortunately, they're not in the majority, and certainly not in the majority of leadership positions. As our worldwide civilization rapidly approaches the threshold where nothing can save it, in part due to the consequences of global warming, the majority of those who know what's coming hold out hope that somehow enlightenment will spread or technology will triumph, just in time. The rest have already given up, and are focused on explaining what's happening while salvaging what will be left by developing alternative values and ways of living like those I suggested in Beyond Hope.

In Efficiency and Completion Time, I described how progress on "tasks" evolves over time and depends upon preparation, action, and luck. As a test engineer, it's been my job to help designers and manufacturers determine how much of a task remains, where the task is the creation and deployment of a technological system that meets a set of expectations called "requirements." This part of the preparation phase can only be done after the first attempt at completing the task, when real-world conditions (including how people will use the system) can be applied to demonstrate what the system will actually do, along with what affects it. Requirements, more often than not, are very simplistic guesses that should be modified or supplemented based on experience that checks the assumptions they were based on and reveals unintended consequences of their application. I say "should be," because my specialty is finding these oversights, which tend to comprise most of the problems discovered after the second attempt at completing the task, and which can amount to as much as 25% of the total task. Unfortunately, managers and those who pay them typically plan for no more than two attempts, using their best guess as to how much time and resources are needed, and compensating for luck by hiring the most experienced and capable people they can find.
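To make that arithmetic concrete, here is a minimal sketch in Python. The model and its numbers are my own illustrative assumptions, not measurements: it simply assumes each attempt completes a fixed fraction of whatever work remains, with the fraction chosen so that two attempts land near 80% completion, leaving roughly the 20-25% of undiscovered oversights described above.

```python
# A minimal sketch (hypothetical numbers, not data) of task completion
# across repeated attempts. Assumes each attempt finishes a fixed
# fraction of whatever work remains: a crude stand-in for the
# diminishing returns described above.

def completion_after(attempts, fraction_per_attempt=0.55):
    """Fraction of the total task complete after `attempts` attempts."""
    remaining = 1.0
    for _ in range(attempts):
        remaining *= 1.0 - fraction_per_attempt
    return 1.0 - remaining

for n in range(1, 6):
    print(f"attempt {n}: {completion_after(n):.0%} complete")
```

Under these assumptions, stopping after two attempts leaves about a fifth of the task undone, and it takes roughly five attempts to get within a few percent of finished.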

This is a good example from my experience of the linear thinking that Ophuls blames for system failure. I've heard it explained away as "realistic" and "pragmatic" by people who swear they would do more "in a perfect world." Yet they are also typically people who don't have the "bandwidth" (read: a limit to the rate at which they can process information) to handle explanations that can't be captured in a single page of bullet points. To be fair, all of us can keep only a handful of ideas in our heads at a time (I've heard between three and seven); and the people who have the power to decide what others should do often have more than a handful of complex tasks of their own that they are expected to work on simultaneously. The more power they have, the less time they have to do any part of it, even if they're highly efficient and work every waking hour. So it's no wonder that more than two attempts at a task is considered a luxury, and that 80% completion is considered acceptable, with the rest – hopefully – undetected, explained away, or blamed on someone else (or on bad luck, in the form of "acts of God").

Given enough time, we will experience the consequences of ignoring the details that we missed. These consequences pile up, especially since we're driven to accomplish more and more, and they interact to create amplifying feedback loops. Eventually they can't be ignored; but as Ophuls points out, by that point it may be too late to fix the underlying problems before we're overwhelmed. To the extent that people recognize this as a valid threat, they might be inclined to limit what they do to the consequences they can adequately foresee (matching power to acceptable responsibility), but our culture – and arguably human nature – makes it highly unlikely that they will act on it. Most likely, people like me who are good at identifying the problems that aren't obvious, yet are potentially the most destructive in the long term, will be vilified, shunned, or merely tolerated, especially if we're vocal about what we find and what needs to be done to fix it.
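As a toy illustration of why that amplification matters, here's another short Python sketch. All of the rates are hypothetical placeholders: each period adds a fixed amount of new, unaddressed problems, and everything already on the pile worsens by an assumed growth rate before the new problems land, so the total compounds rather than growing linearly.

```python
# A toy model (all rates are assumptions) of consequences compounding
# through an amplifying feedback loop. Each period adds `new` units of
# fresh, unaddressed problems, and the existing backlog worsens by
# `growth` before the new problems pile on.

def backlog(periods, new=1.0, growth=0.10):
    total = 0.0
    for _ in range(periods):
        total = total * (1.0 + growth) + new
    return total

for t in (5, 10, 20, 40):
    print(f"after {t:2d} periods: backlog = {backlog(t):6.1f} "
          f"(vs. {t} if nothing compounded)")
```

With these made-up numbers, the backlog after 40 periods is roughly 440 times one period's worth of new problems, versus only 40 times if nothing compounded; that's the difference between a nuisance and being overwhelmed.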

Ophuls considers what I call "graceful shutdown" all but impossible without an extremely unlikely shift in values toward ones like those I've promoted. I'm obviously inclined to agree; but I've grown even less optimistic than he is that it will happen, given that the net effect of the forces we have collectively unleashed is to accelerate global extinction rates. In short, having already lost hope for civilization, I'm now a hair's breadth away from searching for the planetary equivalent of hospice.