Yoshida claimed that future technology will read far more than player movement, also detecting where players are looking and how they might be feeling.

“It’s really difficult to judge this, but I’d like to think that in ten years, game developers will have access to [this kind of] player information in real-time,” he added.

Mick Hocking, a senior director at Sony Worldwide Studios, elaborated on Yoshida’s claim, suggesting that future games will “involve the player as an actor, as a participant”.

“Having a camera being able to study a player’s biometrics and movements [is possible],” he said, “so perhaps you can play a detective game that decides whether you’re lying based on what it reads from your face.

“In ten years’ time I’d like to think we’ll be able to form a map of the player, combining other sorts of sensory data together, from facial expressions to heart rate.

“You can see how, over a period of time, you can form a map of the player and their emotional state, whether they’re sad or happy.

“Maybe people in their social network can comment on it,” he continued. “The more accurate that map can become, the more we can tailor it to the experience.”

Asked whether Sony was internally testing this technology, Hocking would say only that the PlayStation group does “lots and lots of R&D in these areas”.

Also taking part in the Gamescom panel debate was Thatgamecompany co-founder Kellee Santiago, who was similarly excited by emotion-reading technologies.

“Our goal is to have that kind of interaction as accessible as the eye, and being able to take that kind of information from even just observing the player’s face, or extracting an experience just by looking at the very subtle movements of a player’s body,” she said.