The Series A investment came entirely from a single venture capital firm, Lightspeed Venture Partners. Today’s funding comes on top of $4 million in seed capital the company previously raised from Boldstart Ventures, Comcast Ventures and Eniac Ventures.

Pankaj Chowdhry, founder and CEO of FortressIQ, says that his company essentially replaces the high-cost consultants who are paid to do time-and-motion studies, automating that process in a fairly creative way. It’s a bit like Robotic Process Automation (RPA), a space that is attracting a lot of investment right now, but instead of simply recording what’s happening on the desktop and reproducing it digitally, it takes things a step further with a process called “imitation learning.”

“We want to be able to replicate human behavior through observation. We’re targeting this idea of how can we help people understand their processes. But imitation learning is I think the most interesting area of artificial intelligence because it focuses not on what AI can do, but how can AI learn and adapt,” he explained.

FortressIQ starts by capturing a low-bandwidth movie of the process. “So we build virtual processors. And basically the idea is we have an agent that gets deployed by your enterprise IT group, and it integrates into the video card,” Chowdhry explained.

He points out that it’s not actually using a camera, but it captures everything going on as a person interacts with a Windows desktop. In that regard it’s similar to RPA. “The next component is our AI models and computer vision. And we build these models that can literally watch the movie and transcribe the movie into what we call a series of software interactions,” he said.
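To make the idea concrete, here is a minimal sketch of what “transcribing the movie into a series of software interactions” could look like. This is purely illustrative: FortressIQ’s real system runs computer vision models over video frames, while here each frame is stubbed as a dict of already-recognized fields, and the `Interaction` type and `transcribe` function are hypothetical names.

```python
# Illustrative sketch only: frames are stubbed as dicts rather than pixels,
# and the names below are assumptions, not FortressIQ's actual API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interaction:
    app: str       # application in the foreground
    action: str    # e.g. "click", "type"
    target: str    # UI element the action touched

def transcribe(frames):
    """Collapse per-frame observations into a deduplicated series of interactions."""
    events = []
    for frame in frames:
        event = Interaction(frame["app"], frame["action"], frame["target"])
        # Many consecutive frames showing the same state describe one
        # interaction, not many, so only record state changes.
        if not events or events[-1] != event:
            events.append(event)
    return events

frames = [
    {"app": "SAP", "action": "click", "target": "Create Invoice"},
    {"app": "SAP", "action": "click", "target": "Create Invoice"},  # repeated frame
    {"app": "SAP", "action": "type", "target": "Amount"},
]
print(transcribe(frames))  # two interactions, not three
```

The key step is the deduplication: a screen recording captures the same state over and over, and the transcription reduces it to discrete events.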

Another key differentiator is the data mining component built on top of this: if the person in the movie is doing something like booking an invoice and stops to check email or Slack, FortressIQ can recognize that the activity isn’t part of the process and filter it out automatically.
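One simple way this kind of filtering can work, sketched below under stated assumptions, is frequency mining across many recorded sessions: applications that show up in nearly every run of a process are part of it, while one-off detours like an email check fall below a support threshold. This is a generic process-mining heuristic, not a description of FortressIQ’s actual models, and the function and parameter names are invented.

```python
# Hypothetical frequency-based filter for off-process activity; the
# heuristic and names are assumptions, not FortressIQ's actual method.
from collections import Counter

def filter_off_process(sessions, min_support=0.5):
    """Keep only apps appearing in at least min_support of all sessions;
    rare interruptions (email, Slack) fall below the threshold."""
    counts = Counter(app for session in sessions for app in set(session))
    n = len(sessions)
    keep = {app for app, c in counts.items() if c / n >= min_support}
    return [[app for app in session if app in keep] for session in sessions]

sessions = [
    ["SAP", "Excel", "Outlook"],   # one worker checked email mid-process
    ["SAP", "Excel"],
    ["SAP", "Excel", "Slack"],     # another briefly switched to Slack
    ["SAP", "Excel"],
]
print(filter_off_process(sessions))
# Outlook and Slack each appear in only 1 of 4 sessions and are dropped.
```

With enough observed sessions, the core process emerges as the common subsequence, while incidental activity washes out statistically.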