Is Automation and Artificial Intelligence a Threat to Legal?
When we think of automation technology affecting industries, we tend to associate it with lower-skill jobs that can adequately be replaced with robotics or software. When the Associated Press conducted interviews two years ago with CEOs, software developers and various other experts, however, it found that most of the jobs being lost to new technology were more traditional middle-class positions requiring greater skills. As The Guardian pointed out, “Software was replacing administrators and travel agents, bookkeepers and secretaries, and at alarming rates.”
Will this upward trend continue? Should lawyers be concerned that automation could impact their careers? Almost certainly, but perhaps not in the way many imagine. The legal futurist professor Richard Susskind, for instance, believes that the practice of law will change dramatically over time. He argues that the legal industry of the future will offer new services in new ways, and that technology could end up creating new jobs in the process. The Guardian’s report also noted a potential upside of a cheaper, more technology-driven practice: a large, untapped client base. This so-called “latent legal market” is estimated to be worth as much as $40 billion, and could lead to greater demand for lawyers.
At the moment, however, there are serious limits to the artificial intelligence (AI) needed to automate most legal practices, not to mention the difficulty computer systems have in providing a “human element” in negotiation. On the other hand, we are already seeing a significant shift in what people will accept from computers for services they once would have seen as exclusively needing a personal touch. WebMD is a great example in medicine—not a replacement for your doctor, but a way of being better informed, and for some, a way to reduce cost by self-managing minor ailments.
A recent post on the Balkinization blog written by law professor Frank Pasquale found that “to the extent lawyers are presently doing rather simple tasks, computation can replace them.” But many legal tasks are beyond today’s automation technology, including:
- Perception and manipulation tasks: Robots are still unable to match the depth and breadth of human perception.
- Creative intelligence tasks: The psychological processes underlying human creativity are difficult to specify.
- Social intelligence tasks: Human social intelligence is important in a wide range of work tasks, such as those involving negotiation, persuasion and care.
As professor Pasquale noted, “There is a world of difference between computation as substitution for attorneys, and computation as complement. The latter increases lawyers’ private income and (if properly deployed) contribution to society.” And while Oxford academics Carl Benedikt Frey and Michael A. Osborne have predicted that computerisation could make nearly half of jobs redundant within 10 to 20 years, they view legal careers as being at “low risk” of this fate.
Last April, the ABA Journal cited warnings from the technology journalist Nicholas Carr that lawyers need to better understand the software and computers they use. Since many firms are already employing automation for document review and even drafting contracts, it’s more important than ever that lawyers maintain their skills. Well-designed automation technology will allow them to do this, while freeing up their time for more complex—and profitable—legal services.
Carr also addressed a potentially serious side effect of automation. In his keynote speech earlier this year he noted that he “came across research that showed that, as software gets better, humans learn less and practice their skills less. In theory, the more we can take tasks away from humans, the more time humans will have to work on more important things. What actually happens is people start to get passive and lose attentiveness. What we end up seeing is a fading of human talent.”
I see this every day in our heavily computer- and automation-supported industry. But I also see the innate human need for purpose driving people to “add value” around those automated elements, and from this comes progress that would be unimaginable without access to such tools.
Ultimately, the extent of legal automation will be closely tied to where the legal industry perceives value to be added. Where lawyers act as brokers in negotiations over truth, fairness or any “emotionally significant” transaction, some humanistic element seems imperative; computers are still a long way from weighing the ethical intricacies of all the discrete factors in such negotiations. Where the work consists of legally approved but otherwise predictable transactions, however, computers are on their way as fast as people can make money by putting them there.
So, is automation and AI a threat to legal? No. In fact, it could open up the industry to some great opportunities.