Cue the sad tuba and attorney jokes: Machines just landed the hurt on lawyers.
LawGeex, an Israel-based startup focused on automating contract reviews, released a study showing its AI software pummels lawyers in document review accuracy. The AI service outperformed 20 corporate lawyers at identifying legal risks in nondisclosure agreement contracts.
But don’t worry, the machines got no papercuts. Undisclosed, however, is whether the lawyers involved in the study have sent their billable hours invoices to the machines for payment.
Bad jokes aside, the lawyers-vs-AI contest involved the review of five previously unseen contracts containing 153 paragraphs of legalese. The AI software reached 94 percent accuracy in surfacing risks in NDA contracts. That compared with an average of 85 percent for the lawyers.
Don’t expect machines to kill lawyers’ careers though, said Shmuli Goldberg, vice president of marketing at LawGeex. He likened the use of LawGeex AI to people running spellcheckers on their emails or pilots using autopilot on planes — tools that enhance their work rather than replace it.
“Lawyers are realizing this actually can make their job better,” Goldberg said. “This work is the everyday gruntwork, and most lawyers can’t wait for an AI to take it off their plate.”
Machines were as much as several hundred times faster, too. LawGeex AI’s service did the NDA reviews in a blazing 26 seconds, compared with a span of about 1 to 2.5 hours for the lawyers.
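A quick back-of-the-envelope check of that speedup claim, using the figures above (26 seconds for the AI versus 1 to 2.5 hours for the lawyers):

```python
# Rough speedup check using the review times cited in the article.
ai_seconds = 26
lawyer_seconds_low = 1 * 3600      # 1 hour
lawyer_seconds_high = 2.5 * 3600   # 2.5 hours

speedup_low = lawyer_seconds_low / ai_seconds    # ~138x
speedup_high = lawyer_seconds_high / ai_seconds  # ~346x
print(f"{speedup_low:.0f}x to {speedup_high:.0f}x faster")
```

That works out to roughly 138x to 346x, consistent with "several hundred times faster" at the upper end.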
Consulting firm McKinsey has estimated that 22 percent of a lawyer’s job and 35 percent of a paralegal’s job can be automated. Customers using LawGeex include Deloitte, law firm White & Case and Quadrant Law Group.
The LawGeex AI service spent three years in development and was trained on tens of thousands of contracts.
LawGeex AI relied on GeForce GTX 1080 Ti GPUs to build and test its models and Tesla P100 GPUs running on AWS with TensorFlow for its intelligence. The startup, which has raised about $9 million in funding in total, plans to move its inferencing onto GPUs next.
“The learning engine requires a lot of computational power,” Goldberg said.