Artificial intelligence has always been based on an understanding of human intellect – but AI "behavior" is not usually programmed to actually simulate human beings. In fact, AI systems tend to be designed to act without the variable wills and idiosyncrasies we recognize in our fellow humans.

Though there is surely a desire to keep AI inhuman, the challenge of assembling a large enough controlled pool of participants to study is also a constraint on AI development. Despite the technical necessity and ethical intent behind inhuman AI, the tendency for AI to behave like obviously artificial systems may soon change.

Researchers at Charles III University of Madrid (UC3M), University of Valencia, and elsewhere across Europe are developing a project called Bridging the Gap: From Individual Behavior to the Socio-Technical Man (IBSEN). By engaging and analyzing 1,000 or more subjects simultaneously, the researchers hope to uncover patterns in human behavior on an individual and societal level, with a particular emphasis on how these behaviors manifest within our highly technological and connected culture. In measuring behavior at this scale, IBSEN project head Anxo Sánchez says, "We are going to lay the foundations to start a new way of doing social science…"

IBSEN is funded as a part of the European Union’s Horizon 2020 program, a tremendous effort (the EU’s biggest ever of its kind) to provide almost €80 billion (around $85 billion) to research and innovation initiatives within science and industry.

To determine individuals' idiosyncratic social behaviors, the researchers will connect subjects via online networks and present them with economic games, cooperation puzzles, and social problems while observing how they interact. The researchers will be watching for subtle patterns and cues in the subjects' decision making. By then combining their observations into a comprehensive whole, the researchers hope to develop a system that can accurately simulate human conduct when fed real-world socioeconomic problems. The system will need to be able to react on an individual level, representing one human's behavior, and on a societal level, representing the aggregate of many thousands of humans' behaviors.
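To give a flavor of the kind of economic game the article mentions, here is a minimal sketch of one round of a public goods game, a standard experiment of this type. The function name, endowment, and multiplier are illustrative assumptions, not details from the IBSEN project itself.

```python
# Hypothetical sketch of one round of a public goods game, a standard
# economic game of the kind IBSEN's experiments use. All parameters
# here are illustrative, not taken from the project.

def public_goods_round(contributions, endowment=10, multiplier=1.6):
    """Each player contributes part of an endowment to a common pool;
    the pool is multiplied and split evenly among all players, so
    free-riding pays individually but hurts the group overall."""
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    # Payoff = whatever a player kept, plus an equal share of the pool.
    return [endowment - c + share for c in contributions]

# Four players: two full contributors, one free-rider, one partial.
payoffs = public_goods_round([10, 10, 0, 5])
print(payoffs)  # the free-rider ends up with the highest payoff
```

Researchers running games like this look at how each subject's choices shift across rounds – for example, whether cooperation decays once free-riding is observed – which is the kind of individual-level decision pattern the project aims to capture.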

Though it might seem that this latter step of representing an individual and a society would be the most challenging aspect of the project, the researchers suggest that the greatest difficulty lies in designing and building a test management system that enables all subjects to engage in the experiment at the same time. Since participants will be connected online, they will not be under the researchers' direct observation. And while most projects of this sort engage tens of subjects, IBSEN hopes to engage more than 1,000.

With a working system simulating individual behavior, researchers hope to humanize AI such as the video game characters we interact with on a personal level. But the project's main focus – the insights into collective conduct the researchers hope to glean – will be applied to large-scale social and economic issues.

To make IBSEN all the more expansive, the project will draw on individuals from fields like economics, physics, and social psychology. Still, the researchers acknowledge the many risks behind IBSEN. "This is a high-risk project," they write on their website, "as the experimental design may prove unfeasible for really large systems and extracting meaningful data from the participants' actions may not be possible." But results from pilot studies conducted by the researchers' partners have given IBSEN and the EU's Horizon 2020 program enough optimism to advance the project and – hopefully – build a repertoire of human behavior.

We all know what robots sound like. The awkward, choppy, nasally voice is ingrained in popular culture. And even though contemporary robots have significantly advanced vocal capabilities, companies like Yamaha continue to represent the AI voice as mechanical and constrained.

A few weeks ago, Chinese software company Baidu released key parts of an artificial intelligence/speech recognition algorithm as open source, following in the footsteps of Facebook and Google last year.
