
Does Technology Make You Complacent?

Is autopilot dangerous? The National Transportation Safety Board is holding a three-day conference in Washington, D.C., to discuss pilot and air traffic controller professionalism, including whether automation makes pilots complacent. The New York Times reports:

Automation is generally considered a positive development in aviation safety because it reduces pilot workload and eliminates errors in calculation and navigation. “The introduction of automation did good things,” said Key Dismukes, chief scientist for aerospace human factors at NASA. But it changed the essential nature of the pilot’s role in the cockpit. “Now the pilot is a manager, which is good, and a monitor, which is not so good.”

...

Finding the balance between too much technology and too little is crucial, according to William B. Rouse, an engineering and computing professor at the Georgia Institute of Technology. “Complacency is an issue, but designing the interaction between human and technical so the human has the right level of judgment when you need them is a design task in itself,” Mr. Rouse said. “When the person has no role in the task, there’s a much greater risk of complacency.”

Law offices certainly don't run themselves. But some functions are now automated, like document assembly, which uses software, templates, and the organization's knowledge base. There's no dispute this is a good development: it reduces the time and expense of legal work while producing higher-quality, more consistent work product.
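To make the idea concrete, here is a minimal sketch of template-driven document assembly using Python's standard string.Template. The template text, field names, and values are hypothetical, for illustration only; real systems draw these from the firm's knowledge base.

```python
# Minimal document-assembly sketch: fill a firm template from a matter record.
# Template wording and field names are hypothetical examples.
from string import Template

ENGAGEMENT_LETTER = Template(
    "Dear $client_name,\n\n"
    "This letter confirms that $firm_name will represent you in the matter of "
    "$matter_description, at an hourly rate of $$${hourly_rate}.\n"
)

def assemble(template: Template, fields: dict) -> str:
    # safe_substitute leaves unknown placeholders visible instead of raising,
    # so gaps in the data are easy to spot during review.
    return template.safe_substitute(fields)

if __name__ == "__main__":
    draft = assemble(ENGAGEMENT_LETTER, {
        "client_name": "Acme Corp",
        "firm_name": "Example & Partners LLP",
        "matter_description": "the Springfield lease dispute",
        "hourly_rate": 250,
    })
    print(draft)
```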

Yet the danger of complacency is real. The technology makes it easy to produce good-looking work product without dwelling on the details of the process. Professionals can be lulled into clicking buttons rather than thinking carefully. They can overlook special circumstances or reasons for deviating from standard work.

Good countermeasures might include checklists to ensure people think through the issues, plus a solid review process to ensure final quality. Most important, as the article suggests, humans must maintain a role in the task -- important work shouldn't be completely automated.
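One way to keep a human in the loop is to make the assembly step refuse to finalize a draft until someone has explicitly confirmed each review item. A rough sketch under that assumption follows; the checklist items and function names are hypothetical.

```python
# Human-in-the-loop gate: the draft is not finalized until every review item
# has been explicitly confirmed. Items and names are illustrative only.
REVIEW_CHECKLIST = [
    "Client name and matter description match the engagement records",
    "Fee terms reflect any negotiated deviations from the standard rate",
    "No special circumstances require departing from the template language",
]

def finalize(draft: str, confirmations: set) -> str:
    # Collect any checklist items the reviewer has not confirmed.
    missing = [item for item in REVIEW_CHECKLIST if item not in confirmations]
    if missing:
        raise ValueError("Review incomplete:\n- " + "\n- ".join(missing))
    return draft  # returned only once every checklist item is confirmed
```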

Lean bonus: Discussing the Northwest Airlines flight that overshot its destination, the article quotes Chesley B. Sullenberger III, the captain who famously landed the US Airways plane in the Hudson in January 2009, reminding us to look for root causes of problems rather than reflexively blaming technology:

“Something in the system allowed these well-trained, experienced, well-meaning, well-intentioned pilots not to notice where they were, and we need to find out what the root causes are,” he said. “Simply to blame individual practitioners is wrong and it doesn’t solve the underlying issues or prevent it from happening.”

Also see this post by Mark Graban on aviation, standardized work, and automation.