July 18, 2024

LexisNexis Accelerates Business Research with GenAI

LexisNexis says its new generative AI solution will save hours of time for customers who are conducting research using the company’s massive repository of news, legal filings, and other business information. Dubbed Nexis+ AI, the new offering leverages large language models (LLMs) in several ways to surface “gold nuggets” of insight from the reams of available data.

Since it was founded in 1970, LexisNexis has been a trusted source of information for thousands of corporations, law firms, and other entities. The New York analytics firm today provides a curated flow of information from more than 20,000 licensed sources, including the Associated Press, Gannett, McClatchy, Benzinga, FiscalNote – CQ Roll Call, and many others.

The trusty search engine remains a primary means for customers to interact with LexisNexis’ treasure trove of information. In addition to basic keyword search, LexisNexis has long since adopted neural search technology that uses vector embeddings to help users find the information they’re after. It has also adopted GenAI in other avenues of the business.

But even with neural search, the vast universe of available information can still be overwhelming. That’s where its new Nexis+ AI offering comes in.

“What we’re trying to resolve… is that there is oodles of data out there,” says Dani McCormick, the vice president of product for Global Nexis Solutions at LexisNexis. “There are lots of things you can look at, but not enough time, unless you can clone yourself to effectively go through all of it.”

It can take a journalist or corporate researcher many hours to comb through all of the sources of data that LexisNexis can surface on a given topic and complete a research project, McCormick says. The need to be thorough and get accurate answers means that customers in regulated or otherwise exacting industries, such as investigative journalism, can’t take shortcuts. That makes for painfully long hours of reading, note-taking, link-copying, synthesizing, summarizing, and ultimately writing a report.

Nexis+ AI uses LLMs to accelerate research and report-writing tasks

McCormick demonstrated for Datanami how Nexis+ AI works. The researcher begins her session the standard way: by entering a term into the search bar (it is hard to improve on good old search). But once the researcher starts reading through the various newspaper articles, annual reports, and other sources of information that the search presents, the power of Nexis+ AI begins to show itself.

Nexis+ AI uses GenAI tech in multiple ways to accelerate the research and report-writing process. For starters, the software uses an LLM to provide a summary of a given newspaper article or other document, which tells the researcher whether the information is likely to be relevant to her task.

As the researcher browses the stories or documents, another LLM enables the researcher to have an interactive question-and-answer session about the information in the story or document, which is the second way that the product uses GenAI.

Finally, when the researcher finds a piece of relevant information, she can take a “snippet” of the information by highlighting the text using her mouse and keyboard. These snippets, along with the metadata describing their sources, are then stored by LexisNexis in a “hub.” The researcher can load the hub with snippets up to a maximum of 2,000 characters, the company says.

After the research is completed, Nexis+ AI will summarize the various snippets in the hub and provide a draft of a report, complete with links to source material, which is the third use of an LLM.
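The snippet-and-hub workflow described above can be pictured as a simple data structure: snippets carry their source metadata with them, so the drafted report can cite where each piece of information came from. The sketch below is illustrative only; the class names, fields, and the `summarize` callable (standing in for the LLM call) are assumptions, not LexisNexis’ actual implementation. The 2,000-character limit is the per-snippet cap the company describes.

```python
from dataclasses import dataclass, field

MAX_SNIPPET_CHARS = 2_000  # per-snippet limit described by LexisNexis


@dataclass
class Snippet:
    """A highlighted piece of text plus metadata describing its source."""
    text: str
    source_title: str
    source_url: str

    def __post_init__(self):
        if len(self.text) > MAX_SNIPPET_CHARS:
            raise ValueError("snippet exceeds the 2,000-character limit")


@dataclass
class ResearchHub:
    """Collects snippets during research, then assembles a draft report."""
    snippets: list = field(default_factory=list)

    def add(self, snippet: Snippet) -> None:
        self.snippets.append(snippet)

    def draft_report(self, summarize) -> str:
        # `summarize` stands in for the LLM summarization call; it takes
        # the list of snippet texts and returns a synthesized body.
        body = summarize([s.text for s in self.snippets])
        sources = "\n".join(
            f"- {s.source_title}: {s.source_url}" for s in self.snippets
        )
        return f"{body}\n\nSources:\n{sources}"
```

Because every snippet keeps its source title and link, the generated draft can include the links back to source material that the article mentions, without the LLM having to reconstruct citations.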

When you put it all together, this significantly streamlines access to actionable information within LexisNexis, McCormick says.

“What we’re trying to do [with Nexis+ AI] is to make driving you to the gold nuggets within that data… effortless,” she tells Datanami. “This allows people to dive into the document super quickly. It reduces a job down to about 15 minutes that could have taken, dependent on the report size, anywhere from 45 minutes to two to three hours.”

LexisNexis used several LLMs to build Nexis+ AI, including Anthropic Claude running on AWS Bedrock and OpenAI’s GPT models running on Microsoft Azure, says Snehit Cherian, CTO for Global Nexis Solutions at LexisNexis. The company uses retrieval-augmented generation (RAG) to improve the accuracy of the information generated by the LLMs, and also put in place a separate framework to combat hallucinations, he says.
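Retrieval-augmented generation of the kind Cherian describes follows a common pattern: retrieve the most relevant passages, then constrain the LLM to answer from those passages alone, which also gives it sources to cite. The sketch below is a generic illustration of that pattern, not LexisNexis’ pipeline; the keyword-overlap scorer is a toy stand-in for their vector search, and `llm` is a placeholder for a call to Claude on AWS Bedrock or GPT on Azure.

```python
def retrieve(query: str, corpus: list[dict], k: int = 2) -> list[dict]:
    """Toy retrieval step: rank documents by keyword overlap with the query.
    A production system would use vector embeddings instead."""
    query_terms = set(query.lower().split())
    return sorted(
        corpus,
        key=lambda doc: -len(query_terms & set(doc["text"].lower().split())),
    )[:k]


def answer_with_rag(query: str, corpus: list[dict], llm) -> str:
    """Ground the LLM's answer in retrieved passages to reduce hallucination."""
    passages = retrieve(query, corpus)
    context = "\n".join(f"[{doc['source']}] {doc['text']}" for doc in passages)
    prompt = (
        "Answer using ONLY the passages below, citing sources in brackets.\n"
        f"{context}\n\nQuestion: {query}"
    )
    return llm(prompt)  # placeholder for the hosted-model call
```

Grounding answers in retrieved, attributed passages is also what makes the source links in Nexis+ AI’s summaries possible, since each passage arrives tagged with its origin.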

LexisNexis worked closely with its publishers and content providers to ensure that their intellectual property is protected, Cherian says. In addition to IP rights, the publishers were particularly concerned with the context that the information is used in. The capability in Nexis+ AI to automatically attach metadata to the snippets, which in turn provides links back to information sources cited in the LLM-generated summaries, really resonated with the publishers, he says.

“We started sharing this framework with them starting the middle of 2023,” he says. “We’ve been working with one publisher at a time [including] with their legal team, their licensing team, and their technology team.”

This is not the last GenAI product that you’ll see from LexisNexis, according to McCormick, who says the company has a solid roadmap for using the tech.

“This is the beginning of a really exciting product journey,” she says. “We have a huge development roadmap that we’re going on with this product. But even the functionality that I’ve shown you today saves significant time on this journey.”

For more information about Nexis+ AI, go to lexisnexis.com/NexisAI.

Related Items:

GenAI-Generated News: Less Accurate, Timelier, Survey Finds

Where US Spy Agencies Get Americans’ Personal Data From

LexisNexis Touts HPCC’s Benefits Over Hadoop