Getting smart about intelligent automation

By FCW Staff

Jun 01, 2018

Federal agencies are drowning in data. Whether it’s network security logs, mundane financial transactions or satellite feeds and other mission-specific datasets, the volume has far outstripped the processing capacity of any human — not to mention that of many existing IT systems.

There’s tremendous potential in all that data, however, and automation is essential to extracting it. FCW recently gathered a group of agency leaders to discuss the opportunities they see in their organizations and the challenges — technical, budgetary, cultural and otherwise — that must be addressed. Identity management was the use case that kicked off the conversation, but participants quickly zeroed in on the underlying tensions that are complicating a far broader range of automation ambitions.

The conversation was on the record but not for individual attribution (see below for a list of participants), and the quotes below have been edited for length and clarity. Here’s what the group had to say.

Silos, snowflakes and overlapping solutions

Participants agreed that a main driver of the data overload is the sheer number of overlapping tools and systems creating and touching that data. For identity and access management (IAM) alone, one agency has half a dozen pilot programs underway.

“At the top level, you have attributes that are generic,” an executive from that agency said, “but as we go further down into those particular databases or authoritative sources, it quickly becomes more case-specific. But to avoid creating still more silos, you eventually bring them up to the airspace level.”

Different missions, meanwhile, bring unique constraints. “Ships at sea have only a thimbleful of bandwidth,” another participant noted. “I can’t do certificate revocation in real time over something like that when I’ve got mission data going up.” There are plenty of tools for almost every process, the executive added. But when 25 solutions cover 95 percent of the need, “there are about 175,000 overlaps, right? We’ve got to get out of that madness.”

Overlapping teams also complicate matters, especially for IAM. Access privileges vary widely, especially when a team spans multiple organizations, and the growing reliance on mobile technology requires different sorts of authentication factors. “We have a mix,” one participant said. “That’s causing a problem.”

The group agreed that agencies too often wind up adding still more layers of complexity. “We take those tools and customize them,” one participant said. “Then they lose their ability to be fast, to be upgraded and to be migrated. It really is about protecting the data. Unfortunately, sometimes we have to put the shell around it because we need speed.”

Another participant was even more succinct. “It’s messy,” she said.

What automation can do for you

Several participants said a holy grail for IAM would be continuous multifactor authentication that could manage physical and digital access and be standardized across agencies.

“It would happen on entry,” one executive said. “Walking in the first turnstile, you look at the camera. Better than having a card. That entrance security system should notify the IT system that, ‘Hey, Bob just walked in the door.’” The camera on the user’s computer would verify that “that’s the same guy who walked in,” and anywhere that individual goes in the building, there “should be a sensor that knows, ‘Hey, that’s him. Yeah, he can come into this office. Or no, he can’t.’ It’s got to be continual.”

Such a solution would require a new data layer that spans countless existing systems and highly automated handoffs from one system to the next. But it’s not as far-fetched as it might sound, multiple participants said.

One executive shared how he had connected 22 systems and used robotic process automation to take a business process from 110 days to “what I publicly said would be 10 days but actually can be as fast as one hour.” RPA extracted and rationalized the data from the legacy systems, allowing the new AI-powered process to be spun up while leaving those existing systems in place. “So to me that’s not really blue sky,” he said.

That approach could be applied to a range of other missions. Network and user data is already being mined to identify potential risks, but one participant said the process for verifying and sharing threats with the U.S. Computer Emergency Readiness Team is ripe for automation.

The acquisition process has similar potential, another participant said. “If I can reach into my contract writing data and do an analysis of it, leveraging machine learning or RPA, now I can get a range of the terms and conditions that exist and the prices paid. I can get an understanding of what I’m buying. That’s very powerful.”

Even basic knowledge management could benefit, a third participant said. “You can’t secure what you don’t know you have. I have stuff in Google Drive, stuff in my shared drive, my local drive. None of it’s named anything. There is no naming convention. How do I make that into a rational format so I can say who gets access to that data?”

Note: FCW Editor-in-Chief Troy K. Schneider led the roundtable discussion. The April 11 gathering was underwritten by KPMG, but both the substance of the discussion and the recap on these pages are strictly editorial products. Neither KPMG nor any of the roundtable participants had input beyond their April 11 comments.

Regardless of the use case, the group agreed, data is the key component. “Whether you call it RPA or AI,” one participant said, agencies must accept that “humans can’t process through the data. If you have five cents to spend, spend it on analytics first.”

Overcoming the obstacles

Pursuing any of those use cases, however, requires more than scrubbing data and having some dollars to spend. Participants agreed that data governance is crucial, though there was some debate over whether too much or too little poses the bigger risk.

“Honestly, if you don’t have a good governance layer to start that with, someone may turn to techies and fall in love with the tool, but the tool becomes the problem, not the answer,” one executive said. “Figure out what is the information we’re trying to protect and how you’re going to try to protect it. And now that I’ve got that, what tools are available to make that work?”

But another participant said starting with policy “is a terrible step to take. I think you have to start with, ‘How would we operate if we didn’t have system constraints?’”

She added that “tools solve a problem that exists with my existing systems. But they don’t help me get to an optimal business process. I think the discussion has to occur as, ‘Can we get to an optimal business process at a low cost and in a flexible way?’ If you don’t start at that point, you’re starting on the wrong foot.”

Good governance can also facilitate standardization, which most of the participants said should be a high priority. “If you’re building policy to address your problem now, you’re not building the right policy,” one executive said. “Your policy really needs to have the future vision of interoperability and of normalized functions.”

“One of the challenges is to abstract away from the symptoms themselves and understand what are these discrete problems that we’re trying to solve,” another added. “Use standards-based integration approaches instead of tools.”

“And it should apply to all the government because we all want to be on the same page,” yet another said.

In some cases, being on the same page should mean being on a common platform. Several participants argued that ticketing systems, certain finance operations, and security information and event management could all benefit from solutions that bring intelligent automation to most or all of government.

Other participants, however, raised concerns about aiming for the lowest common denominator. “I agree with the interoperability,” one said. “Centralization is what scares me. Centralization slows business processes down. Centralization really creates a lot of frustration if you’re an operator.”

Another participant agreed: “It doesn’t matter what tool you pick if you have a vision of how you’re going to interact with that tool and others. If the standards are there, fantastic.”

To strike that balance between standards and mission-driven focus, several executives advocated a microservices approach. “If we start smaller, we can talk about data elements and attributes,” one said. “The fact that we can come to some sort of agreement on five or seven is big.”

“It’s a specific business problem where you can do a proof,” another noted. “Get some measurable return-on-investment numbers out of it and then you can try to build your business network and scale it. But agencies aren’t structured to execute in that way. That’s where the big challenge is.”

Moreover, the “start small” advocates argued, picking a single business process and getting its owner on board facilitates the most important part: rethinking that process before you automate it.

One participant objected, saying: “If you have that opportunity to do blue sky, that’s awesome. But a lot of us are trying to build the plane while we’re flying it, so we don’t have that opportunity to start completely from scratch.”

“But if you write a performance work statement by asking people what outcome they want,” another countered, “they’re going to write it on the basis of what they do every day.”

A third executive suggested a different approach: “If you take a step back and say, ‘Are there any new capabilities in the marketplace that we could use, that we could redesign for the way we work, so that we could manage that issue?’ Then the outcome may be completely different.”