Get to the Good Stuff Faster with Active Learning

Active learning, the newest addition to RelativityOne Assisted Review, learns from your team’s coding decisions and uses them to continuously deliver the documents that matter most to your reviewers.

Get to the good stuff faster.

Active learning monitors coding decisions in real time to refine its understanding of what’s responsive. As your project progresses and reviewers code more documents, the engine gets smarter, continually sharpening its sense of what’s most important to your matter, so you can get to the heart of the issue faster.
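The loop described above can be sketched in a few lines. This is an illustrative toy, not Relativity’s actual engine: it scores each uncoded document by how much its vocabulary overlaps with documents already coded responsive, and always serves the highest-scoring document next. All names and the scoring rule here are assumptions for illustration.

```python
# Toy sketch of a prioritized review queue (illustrative only; not
# Relativity's actual ranking algorithm). Terms from documents coded
# responsive form a "responsive vocabulary"; uncoded documents are
# re-scored after each coding decision so the queue always serves the
# likeliest-responsive document next.

def score(doc_terms, responsive_terms):
    """Fraction of a document's terms seen in responsive documents."""
    if not doc_terms:
        return 0.0
    hits = sum(1 for t in doc_terms if t in responsive_terms)
    return hits / len(doc_terms)

def next_document(uncoded, responsive_terms):
    """Serve the highest-scoring uncoded document from the queue."""
    return max(uncoded, key=lambda d: score(uncoded[d], responsive_terms))

# Hypothetical corpus: document id -> extracted terms.
docs = {
    "d1": ["contract", "breach", "invoice"],
    "d2": ["lunch", "menu"],
    "d3": ["invoice", "payment", "shipping"],
}
# Vocabulary gathered from documents reviewers already coded responsive.
responsive_terms = {"contract", "invoice"}

pick = next_document(docs, responsive_terms)  # "d1" scores highest (2 of 3 terms)
```

In a real engine the scorer would be a trained text classifier rather than term overlap, but the shape of the loop is the same: code a document, update the model, re-rank the remainder.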

Spend less time on setup and administration.

Getting to the most important documents doesn’t have to take a lot of effort. Active learning handles the brunt of the work, with minimal setup and human input. There’s no need for training sets and no manual batching of documents. You can take a 100,000-document project from setup to review in under 10 minutes.

Reviewers simply log in, click a button, and start reviewing the most relevant data. The review queue is continuous, so administrators don’t have to manage next steps and can easily monitor the results.

Flex your analytics muscles to meet the needs of any review project.

Combine active learning with structured and unstructured analytics tools like email threading, clustering, sample-based learning, and visualizations to create unique workflows that match the needs of your project—whether it’s investigating the merits of a claim, sorting your data into key issues, or preparing evidence for litigation.

For example, you can use cluster visualization to home in on relevant documents based on search terms and key players, then use those documents to kick-start active learning. Right off the bat, the active learning engine will have a solid understanding of what’s relevant—based on your coding of relevant documents—and deliver more of those types of documents to your team.

Call a project "done" with confidence.

Your review is winding down, but when is it safe to call it? Elusion testing can help with that. It’s a new validation test you run at the end of the project to estimate how many relevant documents were missed.

To help you determine when it’s time to run the elusion test, check the review progress report, which tracks the percentage of documents coded responsive from the queue. As your review progresses, RelativityOne will serve up fewer and fewer responsive documents. Once the graph shows that only a small proportion of served documents are responsive, it’s time to run the elusion test.
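The idea behind an elusion test can be sketched with a simple simulation. This is a hedged illustration of the general sampling technique, not Relativity’s actual statistical procedure: draw a random sample from the documents the engine ranked non-responsive, have reviewers code the sample, and treat the share of responsive documents found there as an estimate of the elusion rate. The 2% figure and sample size below are hypothetical.

```python
# Simplified sketch of an elusion test (illustrative; Relativity's
# actual statistics may differ). Sample the low-ranked, unreviewed
# population and estimate what fraction of responsive documents eluded
# the review.

import random

def elusion_rate(sample_codes):
    """sample_codes: reviewer decisions on the sample,
    True = responsive, False = non-responsive."""
    return sum(sample_codes) / len(sample_codes)

random.seed(0)  # deterministic for the example

# Hypothetical population the engine ranked non-responsive, where 2%
# are in fact still responsive (unknown to us in a real project).
population = [random.random() < 0.02 for _ in range(50_000)]

# Reviewers code a random sample of 500 documents from it.
sample = random.sample(population, 500)
rate = elusion_rate(sample)  # a low rate supports calling the review done
```

In practice you would pair the point estimate with a confidence interval, and a rate below your agreed threshold is the signal that the project can be closed with confidence.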