E-discovery is typically considered a niche area within the practice of law. But at its core, e-discovery is just the science of finding the proverbial data “needle” in increasingly large sets of information. In the modern landscape, organizations are not only concerned with finding relevant data, but finding it fast.
This charge, however, is sometimes difficult to accomplish on a government budget. While corporations are technologically capable of flying through data, many government agencies lag behind.
Artificial intelligence software company Veritone is hoping to bring some of the data culling tools and practices that have worked well in its legal division into the public sector, forming a new division of the organization oriented toward government and local agencies. Called Veritone Government, this division will help agencies analyze audio and video data for use in ongoing records requests and investigations.
John Newsom, executive vice president at Veritone Enterprise, explained that the company’s AI software can help law enforcement officials move through security video footage to identify key facial features or comply with records requests seeking all available information on a given topic.
Government agencies are required to be accountable to the public, so the technology they use is subject to considerable scrutiny. Recently, predictive algorithm-based technology employed by criminal justice agencies has come under fire following civil liberties advocates’ concerns about law enforcement surveillance tactics and a ProPublica investigation of racial bias in predictive algorithms in “risk assessment tools” used by county agencies.
While Veritone’s tools perhaps fall into a separate category within government technology, Newsom believes that many public concerns around law enforcement and technology are somewhat exacerbated by a fear of the unknown.
“I think the privacy concern is a really big deal, but it’s frustrating when people don’t have an understanding of what it is. That’s probably the tech world’s fault,” he said, adding that the technology community can potentially do more to demystify algorithmic learning.
To date, Newsom said, AI technology’s use in government agencies is a far cry from the Minority Report-like scenarios that critics may envision. “At this stage, this is nothing different than I could do with a large group of humans,” he said. Veritone’s technology, he said, is simply intended to look through tons of data and essentially identify whether a given key search term is present or not, something he referred to as “narrow AI.”
“What’s different about having a team of 400 people do that or if I can teach a machine to do it?” Newsom asked. “Agencies can’t afford to do that, so why not train a machine?”
While it may be daunting to allow law enforcement to lean heavily on machines, Newsom noted the company is not intending for any matters of actual judgment to be left to AI technology. “There’s always human judgment at the very top of this whole thing,” he said.
As for privacy, Newsom noted that Veritone’s government-facing division has no plans to hack into private social media data, but hopes to help law enforcement move more quickly through the vast amount of data available from public social media platforms. “It’s not like these machines are going through the back door and digging into non-publicly visible social media; it is going to look at public information that’s publicly visible,” he said.
Instead, Newsom hopes that Veritone’s attempt to introduce machine learning into law enforcement data review can actually help members of the public feel more secure in the accuracy and equity of law enforcement. With records requests, for example, law enforcement officials are often left to conduct long, manual reviews of documents, which Newsom said may contribute to some of the public mistrust in the process.
“They’re going to have the appearance of hiding things, whereas AI can help them very quickly isolate the subset of content they need to review and automatically redact the simple things,” he said.