Now Big Brother Is REALLY Watching You

In a government-sponsored research project eerily reminiscent of the 2002 film “Minority Report,” the Defense Department’s Defense Advanced Research Projects Agency (DARPA) has partnered with Carnegie Mellon University to create “an artificial intelligence (AI) system that can watch and predict what a person will likely do in the future.”

In “Minority Report,” a specialized “PreCrime” unit, part of the
Washington, D.C. police department, arrests criminals based on the
precognition of three psychics. In the near future, DARPA hopes that rather than relying on psychics, computers will be able to identify individuals and order them detained based on their “anomalous behavior.”

Tapping into live surveillance video feeds and using specially
programmed software, a new computer system dubbed “Mind’s Eye” will
filter surveillance footage to support human operators and automatically alert them whenever suspicious behavior is recognized.

According to the research coming from Carnegie Mellon, the security
camera system can monitor a scene in real time and sound an alarm if the
program detects illicit activity. The program would be sophisticated enough to determine whether, for example, a person in an airport has set a bag down because he is sitting next to it or has abandoned it altogether.

The researchers noted that humans are extremely skilled at choosing
important pieces of information out of a mass of visual data and making
decisions based on both the recorded information and acquired background
knowledge. The DARPA project strives to mimic that human ability, picking out important pieces of information from a sea of visual data and predicting how people will behave based on their actions under uncertain conditions.

DARPA wants to deploy this software initially in airports and bus
stations, and if the pilot program is successful, the software could be
installed at every red light, street corner, and public place in
America. It could also capture feeds from video conferencing systems,
video emails, and other forms of streaming media.

According to Forbes, Carnegie Mellon is just one of 15 research teams
that are participating in the program to develop smart video software.
The final version of the program is scheduled to be deployed in 2015.

Mark Geertsen, a spokesman for DARPA, said in a statement that the
goal of the project is “to invent new approaches to the identification
of people, places, things and activities from still or moving defense
and open-source imagery.”

The first part of the project involves a program called PetaVision. This
initiative is a cooperative effort between Los Alamos National
Laboratory (LANL) and Portland State University with the support of the
National Science Foundation. The goal of this initiative is to “achieve human-level performance in a ‘synthetic visual cognition’ system”; in other words, to create a computer program that will duplicate a human’s ability to see and recognize objects, specifically faces. It would
incorporate advanced artificial intelligence to identify people and
objects in a video feed by looking at their shape, color, and texture, as well as how they move.

To do this type of advanced computing, the program is being developed on an IBM “Roadrunner” supercomputer that performs one quadrillion (a million billion) mathematical operations every second. While the initial software is being programmed by humans, the program has the ability to learn as it is being programmed.

According to the Los Alamos National Laboratory, the goal of the
project is to recreate the visual functions of the human brain. The laboratory already has plans for a second phase, which would develop a program that mimics the function of the entire brain.

The second part of the project is another program called Videovor.
While little is known about this program, the information that is available seems to indicate that it will be used to “summarize” data taken from video cameras.

The most time-consuming part of surveillance analysis is looking at
the accumulated video intelligence and determining its value. Videovor
captures the video feed, analyzes it, and presents a summary of the
useful information and events found in the feed. All this would be done in real time, eliminating the need to wait for results.

The third part of the project is the development of a “geospatial
oriented structure extraction” program, designed to automatically render
a crude “wireframe” representation of the important events in the video
from several angles, eventually eliminating the need for a human to
condense hours of video into a few minutes of pertinent information.

This automated approach to video surveillance could one day replace
using humans to monitor cameras. With Mind’s Eye installed, the computer
system would be cheaper to maintain than human operators and would
never need a lunch break or a day off. The computer could monitor every camera in a city around the clock, 365 days a year.

Also, current surveillance systems can only report what has happened in the past; they cannot forecast future behavior. Today, investigators can only see how a car was stolen or a person mugged after the fact. This new software is being designed to prevent crimes before they happen.

Buried in the footnotes of the Carnegie Mellon paper was a reference
to P. W. Singer’s book, “Wired for War: The Robotics Revolution and
Conflict in the 21st Century.” It is an interesting glimpse into the
direction the research team may be taking. The book examines the
revolution that is taking place on the battlefield and how it is
changing not only how wars are fought, but also the politics, economics,
laws, and ethics that surround war itself.

The book talks about the explosion of unmanned systems on the
battlefield. It notes that the number of unmanned systems on the ground
in Iraq during the Second Gulf War had gone from zero to 12,000 in just
five years. The book also notes that these new computer systems will
soon make human fighter pilots obsolete. Robotic scouts the size of houseflies do reconnaissance work once conducted by Special Forces units, and military pilots fly combat missions from their cubicles outside Las Vegas.

However, critics suggest that just as there are inherent dangers
associated with turning over wars to machines, so too are there dangers
associated with turning over national security and the criminal justice
system to mechanical watchdogs.

The Mind’s Eye AI system poses a very real danger to individual civil
liberties. Critics say relinquishing surveillance and law enforcement
to a machine leaves a society open to a future where all activities will
be monitored and recorded in the name of public safety. That surveillance would not be limited to public venues. As the courts have increasingly narrowed an individual’s “expectation of privacy,” automated monitoring of human behavior can take on increasingly invasive proportions.

As with so many other government programs, the scope of the Mind’s
Eye project can be vastly expanded into areas far outside of its
original intent.

Deployment of this project could be a major threat to an individual’s privacy rights and turn a Hollywood script into reality.

Steve Elwart, P.E., Ph.D., is the executive research analyst with
the Koinonia Institute and a subject matter expert for the Department
of Homeland Security. He can be contacted at steve.elwart@studycenter.com.