Interesting article by Nicholas Carr at the Wall Street Journal this weekend, “Automation Makes Us Dumb.” Carr wants to make the case that, just as factory automation de-skilled workers in the years after World War II, the increasingly sophisticated software that helps us do our jobs may be de-skilling even our smartest people.
Worrisome evidence suggests that our own intelligence is withering as we become more dependent on the artificial variety. Rather than lifting us up, smart software seems to be dumbing us down.
I’m not sure I buy the argument entirely. The “worrisome evidence” he cites is thin. The first case involves airline pilots who rely too heavily on “fly-by-wire” software: if you’ve forgotten (or never learned) how to fly a plane manually, a tricky maneuver like a safe landing in the Hudson River becomes much harder when the moment demands it.
The second item, about computerized health systems, is ripped from the headlines and a little more worrisome:
In a recent paper published in the journal Diagnosis, three medical researchers—including Hardeep Singh, director of the health policy, quality and informatics program at the Veterans Administration Medical Center in Houston—examined the misdiagnosis of Thomas Eric Duncan, the first person to die of Ebola in the U.S., at Texas Health Presbyterian Hospital Dallas. They argue that the digital templates used by the hospital’s clinicians to record patient information probably helped to induce a kind of tunnel vision. “These highly constrained tools,” the researchers write, “are optimized for data capture but at the expense of sacrificing their utility for appropriate triage and diagnosis, leading users to miss the forest for the trees.” Medical software, they write, is no “replacement for basic history-taking, examination skills, and critical thinking.”
Where I agree with Carr almost completely is on his solution, which sounds like one of my favorite hobby horses: Doug Engelbart’s augmented computing, or what Carr calls “human-centered automation.”
In “human-centered automation,” the talents of people take precedence. Systems are designed to keep the human operator in what engineers call “the decision loop”—the continuing process of action, feedback and judgment-making. That keeps workers attentive and engaged and promotes the kind of challenging practice that strengthens skills.
In this model, software plays an essential but secondary role. It takes over routine functions that a human operator has already mastered, issues alerts when unexpected situations arise, provides fresh information that expands the operator’s perspective and counters the biases that often distort human thinking. The technology becomes the expert’s partner, not the expert’s replacement.
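The pattern Carr describes can be sketched in code. Below is a minimal, illustrative Python sketch (my own construction, not anything from Carr or the cited paper) of a human-in-the-loop design: the software handles the routine case, raises an alert on anything unexpected, and always routes the final judgment back to the operator. The names (`triage`, `Reading`, `human_decide`) are hypothetical.

```python
# Illustrative sketch of "human-centered automation": the software assists
# and alerts, but the human operator stays in the decision loop.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Reading:
    """A single observation the system must act on (hypothetical example type)."""
    value: float


def triage(
    reading: Reading,
    is_routine: Callable[[Reading], bool],
    human_decide: Callable[[Reading, str], str],
) -> str:
    """Automation proposes and flags; the human makes the final call."""
    if is_routine(reading):
        # Routine work the operator has already mastered: automate it,
        # but still surface the result rather than hiding it.
        context = "proposal=auto-handled, alert=none"
    else:
        # Unexpected situation: don't decide silently -- raise an alert.
        context = (
            f"proposal=needs review, "
            f"alert=value {reading.value} outside expected range"
        )
    # The final judgment is always the operator's, which keeps the
    # "action, feedback and judgment-making" loop exercised.
    return human_decide(reading, context)


# Usage: the "human" here is a stand-in callback for demonstration.
decision = triage(
    Reading(120.0),
    is_routine=lambda r: r.value < 100,
    human_decide=lambda r, context: f"operator reviewed ({context})",
)
```

The design choice worth noticing is that `triage` never returns an answer on its own; even the routine path ends at `human_decide`, which is the "secondary role" Carr assigns to software.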
Something tells me I’ll have to read his referenced books.