automation

John Henry Reviews Documents

Integreon has an interesting discussion of a recent study pitting humans against machines. No, this isn't about supercomputers and Jeopardy! It's something much more practical:

The underlying study by a trio of recognized experts in cognitive science, information management, and e-discovery, Herb Roitblat, Anne Kershaw, and Patrick Oot, is described in detail in their journal article, Document Categorization in Legal Electronic Discovery: Computer Classification vs. Manual Review, published in the January 2010 issue of the Journal of the American Society for Information Science and Technology [link available at the Posse List].

It raises - and partially answers - the important question whether we are approaching a breakthrough in terms of the capability of automated review tools to render ‘consistent’ and ‘correct’ decisions, as measured against an existing standard, while classifying documents in a legal discovery context. The study pitted two teams of contract attorneys against two commercial electronic discovery applications to review a limited set - 5,000 documents - culled from a collection of 1.6 million documents. The larger collection had been reviewed two years earlier by attorney teams in connection with a Second Request relating to Verizon’s acquisition of MCI. The authors’ hypothesis was that “the rate of agreement between two independent [teams of] reviewers of the same documents will be equal to or less than the agreement between a computer-aided system and the original review.”

The study set out to test whether an automated review tool would show similar levels of agreement with classifications made by the original reviewers as did the two contract teams. The two re-review teams agreed with the original review on about 75% of document classification decisions; the commercial automated applications fared slightly better.

There are a number of obvious (and not so obvious) flaws in the study, which the Integreon post nicely lays out. My first reaction is that "rate of agreement" is a lousy benchmark, since the measure conflates too many significant variables.
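To see why, consider a toy example. When most documents are non-responsive, two reviewers can agree 75% of the time while barely outperforming chance; a chance-corrected measure like Cohen's kappa makes the gap visible. A minimal sketch in Python, with invented numbers chosen only to mirror the 75% figure above:

```python
# Toy illustration with invented numbers: when most documents are
# non-responsive, two reviewers agree largely by chance. Cohen's kappa
# corrects the raw agreement rate for that chance agreement.

def agreement_rate(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    n = len(a)
    p_o = agreement_rate(a, b)  # observed agreement
    # expected chance agreement, from each reviewer's marginal label rates
    p_e = sum((a.count(l) / n) * (b.count(l) / n) for l in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

# 20 hypothetical documents: 1 = responsive, 0 = not responsive
original  = [0] * 16 + [1] * 4
re_review = [0] * 13 + [1] * 5 + [0] * 2

print(agreement_rate(original, re_review))          # 0.75
print(round(cohens_kappa(original, re_review), 2))  # 0.29
```

Raw agreement here is 0.75, but kappa is roughly 0.29: weak agreement once chance is stripped out. Two measures, two very different stories about the same review.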

I'm also fascinated by this quest for the document review holy grail: total automation. Contrary to lean principles, these managers seek to automate the process without fully understanding how it works manually. How and why, exactly, do document reviewers make different calls?

And what about a hybrid approach?

A potential hybrid model would have senior attorneys review representative sets of documents while the tool analyzes features of the reviewed documents to identify and auto-tag “like” documents in the larger collection. As the review proceeded, the tool would ‘percolate’ to the review team’s attention subsets of documents dissimilar from those already reviewed. Based on the reviewers’ decisions as to these documents, the tool would continue to apply tags to more of the collection.

The attraction of this approach is two-fold: human attorneys are still making the initial determinations, but the application magnifies the effect of those determinations by propagating decisions to similar documents throughout the larger collection. It has been suggested that, in the proper context, this approach would permit a single attorney to “review” a vast collection of documents in several hours. A test of that claim is warranted; if the premise were proved, it would be impressive and could directly drive increased use of automation in review, even if, for all the reasons stated above, wide adoption of such processes would take a while.
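For the curious, here is roughly how such a tool might work under the hood. This is a minimal active-learning sketch under my own assumptions, not any vendor's implementation: the TF-IDF features, logistic-regression classifier, and least-confidence sampling are illustrative choices, and ask_attorney is a hypothetical stand-in for a human reviewer.

```python
# Minimal active-learning sketch of the hybrid model described above.
# Assumptions, not any vendor's design: TF-IDF features, logistic
# regression, and least-confidence sampling to choose which documents
# get "percolated" back to the human reviewers.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def hybrid_review(documents, ask_attorney, seed_idx, rounds=5, batch=10):
    # ask_attorney(text) -> 0 or 1 stands in for a human reviewer;
    # the seed set must include at least one document of each class.
    X = TfidfVectorizer().fit_transform(documents)
    labels = {i: ask_attorney(documents[i]) for i in seed_idx}
    for _ in range(rounds):
        train = sorted(labels)
        model = LogisticRegression(max_iter=1000)
        model.fit(X[train], [labels[i] for i in train])
        # Score every unreviewed document and surface the ones the model
        # is least sure about, the "dissimilar" subsets the post describes.
        pool = [i for i in range(len(documents)) if i not in labels]
        confidence = model.predict_proba(X[pool]).max(axis=1)
        for i in np.asarray(pool)[np.argsort(confidence)[:batch]]:
            labels[int(i)] = ask_attorney(documents[int(i)])
    # Finally, propagate the accumulated human decisions to the
    # entire collection with a single pass of the trained model.
    return model.predict(X)
```

In this sketch the attorneys only ever see the seed set plus the model's hardest calls, yet every one of their decisions is propagated across the entire collection, which is precisely the leverage described above.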

As a lawyer who likes to tightly control processes, I'll admit the attraction of this approach. As one moves down the hierarchy in any litigation team, deep knowledge of the client and issues is inevitably lost. If technology can leverage the knowledge of the most engaged attorneys, the result should, in theory, be better.

(cross-posted at California E-Discovery Law)

Does Automation Diminish Our Basic Skills?

Photo Credit: Rui Caldeira

Pilot Patrick Smith has another interesting article on cockpit automation and flight safety, something this blog has considered before.

Has automation reduced pilots' basic "stick and rudder" skills?  His answer:  "Probably, yes."

But the more interesting discussion is how automation has grafted a new technological skill set onto basic flying:

[A]utomation is merely a tool. You still need to tell the airplane what to do, when to do it, and how to do it. There are, for example, no fewer than six different ways that I can program in a simple climb or descent on my 757, depending on preference or circumstances. The automation is not flying the plane. The pilots are flying the plane through this automation.

A fitting metaphor for other knowledge work.  Technology hasn't changed what we do so much as how we interface with machines to get it done.  The tools have changed.  The work, fundamentally, has not.

Of course, interfaces are complicated and can even add to our overall workload:

If you ask me, the modern cockpit hasn't sapped away a pilot's skills so much as overloaded and overburdened them, in rare instances leading to a dangerous loss of situational awareness.

A danger for all of us.  Alarms, notifications, badges, and our ever-expanding landscape of electronic inputs distract us from real work, whether that's landing a plane or delivering a project.

This has given birth to a meta-skill: the ability to sift, filter, and organize the elements of our work.  Our first challenge, then, is to maintain situational awareness in a complicated world.

Update:  Interesting post on maintaining situational awareness in e-discovery.

Does Technology Make You Complacent?

Is autopilot dangerous? The National Transportation Safety Board is holding a three-day conference in Washington, D.C. to discuss pilot and air traffic controller professionalism, including whether automation makes pilots complacent.  The New York Times reports:

Automation is generally considered a positive development in aviation safety because it reduces pilot workload and eliminates errors in calculation and navigation. “The introduction of automation did good things,” said Key Dismukes, chief scientist for aerospace human factors at NASA. But it changed the essential nature of the pilot’s role in the cockpit. “Now the pilot is a manager, which is good, and a monitor, which is not so good.”

...

Finding the balance between too much technology and too little is crucial, according to William B. Rouse, an engineering and computing professor at the Georgia Institute of Technology. “Complacency is an issue, but designing the interaction between human and technical so the human has the right level of judgment when you need them is a design task in itself,” Mr. Rouse said. “When the person has no role in the task, there’s a much greater risk of complacency.”

Law offices certainly don't run themselves. But some functions are now automated, like document assembly, which utilizes software, templates, and the organization's knowledge base. There's no dispute this is a good development, reducing the time and expense of legal work and producing higher quality and more consistent work product.
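To make that concrete, here is a toy sketch of the kind of template-driven assembly described, with invented clause text and field names:

```python
# Toy sketch of template-driven document assembly: a clause library
# encodes the organization's knowledge, and matter-specific facts are
# merged in to produce a draft. Clause text and field names are invented.
from string import Template

CLAUSE_LIBRARY = {
    "confidentiality": Template(
        "$receiving_party shall hold all Confidential Information of "
        "$disclosing_party in strict confidence for $term years."
    ),
}

def assemble(clause_name, facts):
    # substitute() raises a KeyError if a required fact is missing,
    # a small guard against clicking buttons without thinking.
    return CLAUSE_LIBRARY[clause_name].substitute(facts)

print(assemble("confidentiality", {
    "receiving_party": "Acme Corp.",
    "disclosing_party": "Widget LLC",
    "term": "three",
}))
```

Even this toy version hints at the trade-off: the template encodes the organization's knowledge, but it will happily merge in facts nobody has examined closely.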

Yet the danger of complacency exists. The technology makes it easy to produce good-looking work product without dwelling on the details of the process. Professionals can be lulled into clicking buttons rather than thinking carefully.  They can overlook special circumstances or reasons for deviating from standard work.

Good countermeasures might include checklists to ensure people think through the issues, and a review process to ensure final quality. Most importantly, as mentioned in the article, humans must maintain a role in the task -- important work shouldn't be completely automated.

Lean bonus: Discussing the Northwest Airlines flight that overshot its destination, the article quotes Chesley B. Sullenberger III, the captain who famously landed the US Airways plane in the Hudson last summer, reminding us to look for the root causes of problems rather than reflexively blaming individuals:

“Something in the system allowed these well-trained, experienced, well-meaning, well-intentioned pilots not to notice where they were, and we need to find out what the root causes are,” he said. “Simply to blame individual practitioners is wrong and it doesn’t solve the underlying issues or prevent it from happening.”

Also see this post by Mark Graban on aviation, standardized work, and automation.