
LexisNexis rises to the generative AI challenge

Feature
Dec 01, 2023 | 6 mins
Cloud Computing, Digital Transformation, Generative AI

With generative AI, the legal information services giant faces its most formidable disruptor yet. That’s why CTO Jeff Reihl is embracing and enhancing the technology swiftly to keep in front of the competition.

Credit: Jeff Reihl / LexisNexis

IT leaders looking for a blueprint for staving off the disruptive threat of generative AI might benefit from a tip from LexisNexis EVP and CTO Jeff Reihl: Be a fast mover in adopting the technology to get ahead of potential disruptors.

Since its origins in the early 1970s, LexisNexis and its portfolio of legal and business data and analytics services have faced competitive threats heralded by the rise of the Internet, Google Search, and open source software — and now perhaps its most formidable adversary yet: generative AI, Reihl notes.

Reihl concedes that generative AI is evolving faster than anything he has seen in his nearly four decades in IT leadership roles. To address this new reality, his company’s C-suite convened to strategize after the debut of OpenAI’s GPT-4 last March. The meeting consensus? To rewrite and reprioritize all the company’s annual goals to address the new innovations head-on.

“We were all-hands-on-deck,” Reihl says. “We did a major pivot because this was a game changer in terms of its interactive abilities, as well as the comprehensiveness of its answers and its data generation capabilities. It was just staggering in terms of its capabilities.”

Given LexisNexis’ core business of gathering and providing information and analytics to legal, insurance, and financial firms, as well as government and law enforcement agencies, the threat of generative AI is real. But Reihl is confident that LexisNexis can meet it, thanks to the imperfections of today’s general-purpose large language models (LLMs), as well as the proprietary data and unique tools LexisNexis has honed to enhance and customize the LLMs it uses for its services, including Anthropic’s Claude AI assistant and GPT-4 on Microsoft Azure.

LexisNexis’ 2,000-plus technologists and roughly 200 data scientists have been working feverishly to incorporate unique features that exploit generative AI and add more value for the company’s global customer base. But the foray isn’t entirely new. LexisNexis has been working with BERT, a family of natural language processing (NLP) models, since Google introduced it in 2018, and with ChatGPT since its inception. But now the company supports all major LLMs, Reihl says.

“If you’re an end user and you are part of our conversational search, some of those queries will go to both ChatGPT-4 in Azure as well as Anthropic in AWS in a single transaction,” the CTO says. “If I type in a query, it could go to both based on the type of question that you’re asking. We will pick the optimal LLM. We use AWS and Azure. We’ll take the optimal model to answer the question that the customer asks.”
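Reihl doesn’t detail how that routing works, but the general idea of dispatching each query to whichever model is judged best for it can be sketched roughly as follows. The classification heuristics, the route table, and the call_azure_gpt4 and call_claude_on_aws helpers are hypothetical placeholders for illustration, not LexisNexis’ implementation (which, per the quote above, can also fan a single transaction out to both models).

```python
# Illustrative sketch of routing a query to the model judged best for it.
# The heuristics and helper functions below are hypothetical placeholders,
# not a description of LexisNexis' actual routing logic.

def classify_query(query: str) -> str:
    """Very rough query typing: drafting vs. summarization vs. research."""
    q = query.lower()
    if any(kw in q for kw in ("draft", "write a", "compose")):
        return "drafting"
    if any(kw in q for kw in ("summarize", "summary", "key points")):
        return "summarization"
    return "research"

def call_azure_gpt4(query: str) -> str:
    # Placeholder for a call to GPT-4 hosted on Microsoft Azure.
    return f"[GPT-4 on Azure] answer to: {query}"

def call_claude_on_aws(query: str) -> str:
    # Placeholder for a call to Anthropic's Claude hosted on AWS.
    return f"[Claude on AWS] answer to: {query}"

# Hypothetical mapping of query type to the backend judged "optimal" for it.
ROUTES = {
    "drafting": call_azure_gpt4,
    "summarization": call_claude_on_aws,
    "research": call_claude_on_aws,
}

def answer(query: str) -> str:
    """Pick a backend based on the query type and return its response."""
    backend = ROUTES[classify_query(query)]
    return backend(query)

if __name__ == "__main__":
    print(answer("Summarize the key points of the filing"))
```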

Late last month, LexisNexis launched Lexis+ AI, its own generative AI solution, in the US. The offering promises to eradicate AI “hallucinations” and to provide linked legal citations so lawyers have access to accurate, up-to-date legal precedents — weaknesses discovered in the current slew of general-purpose LLMs.

Laying the foundation for innovation

None of this would have been possible without the cloud migration LexisNexis began in 2015. Primarily an AWS customer, LexisNexis also runs on Microsoft Azure for many customers that use Microsoft Office and other Microsoft platforms.

But it was an uphill climb to get to the cloud.

When Reihl joined LexisNexis in 2007, roughly half of the company’s infrastructure, including its core platform, was based on the mainframe. The company operated two very large data centers in the US and had made several acquisitions, leading to a very diverse set of technologies and data in a wide variety of formats.

Soon after, LexisNexis IT leaders approached the board of directors to request several hundred million dollars to replace all that infrastructure with XML-based open systems, Reihl says. The company migrated much of the data in lift-and-shift fashion from the mainframe to those open systems, while adding proprietary search capabilities, as well as indexing and automation. But the applications were not optimized for the cloud, so they eventually had to be rearchitected for microservices when the company began embracing the cloud nearly a decade ago.

In 2020, LexisNexis shut down its final mainframes — representing a major cost savings — and put the full force of its energies on its cloud platform. 

While some workloads still run in the remaining data center, most of the data LexisNexis leverages flows from more than 50,000 sources, such as court filings, law firms, news sources, and websites, into the company’s proprietary content fabrication system. The service’s editorial staff also enhance and enrich the proprietary content, while automation adds value to the workflow on the cloud.
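Reihl doesn’t describe that system’s internals, but the pattern he outlines, ingesting documents from tens of thousands of sources, enriching them automatically, and layering editorial work on top, can be sketched in broad strokes as below. Every type, field, and step in the sketch is a hypothetical illustration rather than LexisNexis’ pipeline.

```python
# Minimal sketch of a source-to-enriched-content flow. Every step and field
# name here is hypothetical; it only illustrates the general pattern of
# ingesting raw documents, enriching them, and publishing to a cloud store.

from dataclasses import dataclass, field

@dataclass
class Document:
    source: str          # e.g. a court, law firm, news outlet, or website
    raw_text: str
    metadata: dict = field(default_factory=dict)

def ingest(source: str, raw_text: str) -> Document:
    """Wrap an incoming raw document with its source for downstream steps."""
    return Document(source=source, raw_text=raw_text)

def enrich(doc: Document) -> Document:
    # Automated enrichment: attach simple derived metadata. In a real
    # workflow, editors would add legal annotations on top of this.
    doc.metadata["word_count"] = len(doc.raw_text.split())
    doc.metadata["looks_like_case"] = " v. " in doc.raw_text
    return doc

def publish(doc: Document, store: list) -> None:
    # Stand-in for writing the enriched document to a cloud content store.
    store.append(doc)

content_store: list = []
publish(enrich(ingest("Example Court", "Smith v. Jones, slip op. ...")), content_store)
print(content_store[0].metadata)
```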

LexisNexis has enjoyed many of the same benefits enterprises gain by moving to the cloud, including significant cost savings, scalability, agility, and speed of innovation. But perhaps the biggest benefit has been LexisNexis’ ability to swiftly embrace machine learning and LLMs in its own generative AI applications.

“This is where some of our initial work with AI started,” Reihl says. “We were doing all that through NLP and some basic machine learning, which evolved into more deep learning over time.”

Another major aspect of the transformation has been the company’s efforts in upskilling employees and acquiring new talent. The team makeup at LexisNexis has expanded beyond UX designers, product managers, and software engineers to also include subject matter experts such as IP attorneys who understand the law and legal language, as well as nearly 200 data scientists and machine learning engineers.

In total, LexisNexis spent $1.4 billion on its digital transformation, the CTO says. It appears to have been well worth the investment.  

LexisNexis launched Lexis+ AI, its multimodel LLM solution with generative AI enhancements, in the US market in October. The fine-tuned AI platform for the legal industry — one of the few AI SaaS platforms on the market — features a retrieval-augmented generation (RAG) engine designed to eliminate hallucinations, along with refined conversational search, legal document drafting, case summarization, and document upload capabilities that let users analyze, summarize, and extract core insights from legal documents in minutes, according to the company.
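LexisNexis hasn’t published that engine’s internals, but the retrieval-augmented generation pattern the company references, grounding the model’s answer in retrieved, citable passages, can be sketched as follows. The toy keyword-overlap retriever and the generate_with_llm stub are assumptions made purely for illustration, not the Lexis+ AI implementation.

```python
# Illustrative retrieval-augmented generation (RAG) sketch: retrieve citable
# passages first, then ask the model to answer using only those passages.
# The keyword-overlap retriever and generate_with_llm stub are simplifications
# for illustration, not the Lexis+ AI implementation.

CITATION_STORE = [
    {"cite": "Case A, 100 F.3d 1 (2000)", "text": "Standard for summary judgment ..."},
    {"cite": "Case B, 200 F.3d 2 (2010)", "text": "Elements of a negligence claim ..."},
]

def retrieve(question: str, k: int = 2) -> list:
    """Rank stored passages by naive keyword overlap with the question."""
    q_terms = set(question.lower().split())
    scored = sorted(
        CITATION_STORE,
        key=lambda p: len(q_terms & set(p["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def generate_with_llm(prompt: str) -> str:
    # Placeholder for a call to an LLM such as GPT-4 or Claude.
    return f"[model answer grounded in a prompt of {len(prompt)} chars]"

def answer_with_citations(question: str) -> str:
    """Build a prompt from retrieved passages so the answer can cite them."""
    passages = retrieve(question)
    context = "\n".join(f"{p['cite']}: {p['text']}" for p in passages)
    prompt = (
        "Answer using ONLY the passages below and cite them.\n"
        f"Passages:\n{context}\n\nQuestion: {question}"
    )
    return generate_with_llm(prompt)

print(answer_with_citations("What is the standard for summary judgment?"))
```

The design intent the company describes is that the model is constrained to material it can cite, so each claim in the response links back to a retrievable legal source rather than to the model’s own recall.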

The CTO says the platform was co-developed with customers, who worked on the beta version and helped the company refine prompts and searches, and implement security to ensure privacy and to keep certain searches in-house, which is critical for attorneys.

The greatest challenge for LexisNexis is the same one all organizations face: finding enough talent.

“There’s just not a lot of people out there, so we’re also training people that have data acumen in those skills,” says Reihl, who still believes that, with 200 data scientists on board, the company is well poised to release its offerings in international markets over the next year.