I like writing about cool applications of technology that are so pregnant with the promise of the future that they have to be seen to be believed, and here’s another one that’s almost ready for prime time.

The Washington Post today launched an exciting new prototype that applies powerful new technologies to journalism and democratic accountability in politics and government. As you can see from the screenshot (left), it runs an automated fact-checking algorithm against the streaming video of politicians or other talking heads and displays in real time a “True” or “False” label as they’re speaking.

Called “Truth Teller,” the system uses technologies from Microsoft Research and Windows Azure cloud-computing services (I have included some of the technical details below).

But first, a digression on motivation. Back in the late 1970s I was living in Europe and was very taken with punk rock. Among my favorite bands were the UK’s anarcho-punk collective Crass, and in 1980 I bought their compilation LP “Bullshit Detector,” whose title certainly appealed to me because of my equally avid interest in politics 🙂

Today, my driving interests are in the use of novel or increasingly powerful technologies for the public good, by government agencies or in the effort to improve the performance of government functions. Because of my Jeffersonian tendencies (I did after all take a degree in Government at Mr. Jefferson’s University of Virginia), I am even more interested in improving government accountability and popular control over the political process itself, and I’ve written or spoken often about the “Government 2.0” movement.

In an interview with GovFresh several years ago, I was asked: “What’s the killer app that will make Gov 2.0 the norm instead of the exception?”

My answer then looked to systems that might “maintain the representative aspect (the elected official, exercising his or her judgment) while incorporating real-time, structured, unfiltered but managed visualizations of popular opinion and advice… I’m also a big proponent of semantic computing – called Web 3.0 by some – and that should lead the worlds of crowdsourcing, prediction markets, and open government data movements to unfold in dramatic, previously unexpected ways. We’re working on cool stuff like that.”

The Truth Teller prototype is an attempt to construct a rudimentary automated “Political Bullshit Detector,” and it addresses each of those factors I mentioned in GovFresh – recognizing the importance of political leadership and its public communication, incorporating iterative aspects of public opinion and crowd wisdom, all while imbuing automated systems with semantic sense-making technology to operate at the speed of today’s real world.

Real-time politics? Real-time truth detection. Or at least that’s the goal; this is just a budding prototype, built in three months.

Cory Haik, who is the Post’s Executive Producer for Digital News, says it “aims to fact-check speeches in as close to real time as possible,” whether in live speeches, TV ads, or interviews. Here’s how it works:

The Truth Teller prototype was built and runs with a combination of several technologies — some new, some very familiar. We’ve combined video and audio extraction with a speech-to-text technology to search a database of facts and fact checks. We are effectively taking in video, converting the audio to text (the rough transcript below the video), matching that text to our database, and then displaying, in real time, what’s true and what’s false.
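To make the flow of that pipeline concrete, here is a toy, self-contained sketch of the stages the Post describes: take in a chunk of “video,” convert it to text, match it against a small database of fact checks, and emit a verdict as each chunk arrives. Everything here is invented for illustration – the function names, the fake transcription stage, and the tiny fact database are stand-ins, not the Post’s actual code, which transcribes real video and queries a search index:

```python
# Illustrative skeleton of the Truth Teller pipeline: video chunks stream in,
# each is "transcribed," and the text is matched against known fact checks.
# All names and data here are hypothetical stand-ins for the real system.

FACT_DB = {
    "unemployment is at a record low": "FALSE",
    "the bill passed with bipartisan support": "TRUE",
}

def transcribe(video_chunk: str) -> str:
    """Stand-in for the speech-to-text stage (MAVIS in the real prototype).

    Here we pretend each chunk already *is* its transcript.
    """
    return video_chunk

def check_stream(chunks):
    """Yield (chunk_text, verdict) pairs as each chunk of 'video' arrives."""
    for chunk in chunks:
        text = transcribe(chunk).lower()
        for claim, verdict in FACT_DB.items():
            if claim in text:           # the real system matches fuzzily
                yield chunk, verdict
```

In the real prototype the exact-substring test on the last line is replaced by approximate matching, since a politician rarely repeats a fact-checked claim word for word.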

We are transcribing videos using Microsoft Audio Video Indexing Service (MAVIS) technology. MAVIS is a Windows Azure application which uses state-of-the-art Deep Neural Network (DNN) based speech recognition technology to convert audio signals into words. Using this service, we are extracting audio from videos and saving the information in our Lucene search index as a transcript. We are then looking for the facts in the transcription. Finding distinct phrases to match is difficult. That’s why we are focusing on patterns instead.

We are using approximate string matching, also known as fuzzy string searching. Our first implementation is a modified version of the Rabin-Karp algorithm combined with Levenshtein distance; in the future this will be extended to recognize paraphrasing and negative connotations.
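A minimal sketch of the approximate-matching idea: find the minimum Levenshtein (edit) distance between a known claim and any substring of the transcript, and flag a hit when that distance is small relative to the claim’s length. Note this sketch uses the classic dynamic-programming variant of approximate substring search rather than the rolling-hash Rabin-Karp approach the Post mentions, and the claim phrases and 25% threshold are purely illustrative:

```python
def approx_substring_distance(pattern: str, text: str) -> int:
    """Minimum Levenshtein distance between `pattern` and any substring of `text`.

    The first DP row is all zeros, so a match may begin anywhere in the text;
    taking the minimum of the last row lets it end anywhere too.
    """
    prev = [0] * (len(text) + 1)               # matching zero pattern chars is free
    for i in range(1, len(pattern) + 1):
        curr = [i] + [0] * len(text)           # skipping i pattern chars costs i
        for j in range(1, len(text) + 1):
            cost = 0 if pattern[i - 1] == text[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # delete a pattern char
                          curr[j - 1] + 1,     # skip a text char
                          prev[j - 1] + cost)  # substitute or match
        prev = curr
    return min(prev)

def find_claims(transcript: str, fact_db: dict, max_ratio: float = 0.25):
    """Return (claim, verdict) pairs whose phrase approximately occurs in the transcript."""
    hits = []
    for claim, verdict in fact_db.items():
        dist = approx_substring_distance(claim.lower(), transcript.lower())
        if dist <= max_ratio * len(claim):     # tolerate ~25% edits
            hits.append((claim, verdict))
    return hits
```

The tolerance lets the matcher catch a claim even when the transcription mangles a word or two, which is exactly the failure mode speech-to-text introduces.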

What you see in the prototype is actual live fact checking — each time the video is played the fact checking starts anew.

What other uses could be made of semantic “truth detection” or fact-checking, in other aspects of the relationship between the government and the governed?

Could the justice system use something like Truth Teller, or will human judges and juries always have a preeminent role in determining the veracity of testimony? Will police officers and detectives be able to use cloud-based mobile services like Truth Teller in real time during criminal investigations as they’re evaluating witness accounts? Should the Intelligence Community be running intercepts of foreign terrorist suspects’ communications through a massive look-up system like Truth Teller?

Perhaps, and time will tell how valuable – or error-prone – these systems can be. But in the next couple of years we will be developing (and be able to assess the adoption of) increasingly powerful semantic systems against big-data collections, using faster and faster cloud-based computing architectures.

In the meantime, watch for further refinements and innovation from The Washington Post’s prototyping efforts; after all, we just had a big national U.S. election but congressional elections in 2014 and the presidential race in 2016 are just around the corner. Like my fellow citizens, I will be grateful for any help in keeping candidates accountable to something resembling “the truth.”

The easiest prediction in Washington is this one: “bureaucratic turf war.” The Obama Administration isn’t immune. Several months ago when the president-elect announced his names for DNI and CIA director, I put forth this idea: “Swap Blair and Panetta: A Modest Proposal.” In it I wondered, between the two of them, “Who gets the top bunk?”

“Two years after launching the most technologically savvy presidential campaign in history, Obama officials ran smack into the constraints of the federal bureaucracy yesterday, encountering a jumble of disconnected phone lines, old computer software, and security regulations forbidding outside e-mail accounts.”

“What does that mean in 21st-century terms? No Facebook to communicate with supporters. No outside e-mail log-ins. No instant messaging. Hard adjustments for a staff that helped sweep Obama to power through, among other things, relentless online social networking.” -Washington Post

Some say that whoever has been responsible for information technology in the White House itself should be fired — but then perhaps the change of Administration just took care of that 🙂

Overall, this situation is familiar to anyone who has worked in what I call “Big-G IT,” the information technology of a federal government agency. I’ve argued about its challenges and sub-optimality before: see my previous pieces on “Roadmap for Innovation: From the Center to the Edge,” and more specifically “Puncturing Circles of Bureaucracy.” In that latter piece back in March of 2008, I wrote about “the defensive perimeters of overwhelming bureaucratic torpor,” and the frustrating reality within much of Big Government: “Federal employees have an entire complex of bizarrely-incented practices and career motivations, which make progress on technology innovation very difficult, not to mention general business-practice transformation as a whole.”

Here’s the truly frustrating, mind-bending part: it isn’t always true! Other elements of the White House have cutting-edge, world-class technologies operating day in, day out.

Not long after the initial establishment of a “Director of National Intelligence,” the DNI CIO held an inaugural “DNI Information Sharing Conference” in Denver in the summer of 2006. I was asked to sit on a panel about “Innovation across the Intelligence Community,” representing the Defense Intelligence Agency and sharing the stage with two counterparts, from the CIA and NSA. Our panel chair was Mr. CJ Chapla, then the Chief Technology Officer (CTO) of the old Intelink Management Office, redubbed “Intelligence Community Enterprise Services,” an office now under the Office of the DNI (ODNI). CJ asked the three of us to describe briefly the goals and projects we were each working on, and, seriatim, that’s what we did for 90 minutes or so.

When it was time for questions, the very first audience member asked: “It seems that each of you is independently working on, and paying for, very similar kinds of technology projects. It would make sense to combine or rationalize the work, so why are you continuing to do it independently?”

Today’s Friday – usually a big news day in Washington, whether by design (bury bad news late in a deep weekend news hole) or by human error (bureaucrats tried all week to get something done and slipped it in at the deadline). There should be Obama cabinet announcements today, and meanwhile tech luminaries across the country are sitting by their phones, drumming their fingers and hoping for a call offering them the position of the nation’s first Chief Technology Officer. Norm Lorentz, who was OMB’s first-ever CTO, told C-SPAN this week that “If I were asked, I would serve in a heartbeat.”

Analysis: My earlier post about John Brennan being President-elect Obama’s “imminent” selection as CIA director is now a curio, given Brennan’s decision yesterday to withdraw from consideration.

Like any good intelligence analyst writing a balanced assessment, I had included the caveat that the only thing standing between Brennan and the appointment was the likelihood of a last-minute political squabble or contretemps.

This being Washington we’re talking about, that is precisely what happened.

Stop the presses! Microsoft Research is getting national front page coverage!

The work of Eric Horvitz and Jure Leskovec got top coverage in major newspapers and news sites today. With that fame, Eric will probably never again be willing to just while away a Friday afternoon with our Microsoft Institute folks, brainstorming some outside-the-box ideas for future work, as he did this week with us in Redmond’s Building 99.