Oculus VR Jam entry promises a "personalized afterlife experience."

How far is too far when it comes to pushing the boundaries of virtual reality? As VR devices grow ever more sophisticated—and the tools to create software for them ever more accessible—where do we draw the line between what’s ethically acceptable in the real world and what’s ethically acceptable in the virtual world?

One of the developers putting this question to the test is Australia-based Paranormal Games. Project Elysium, its entry into the upcoming Oculus VR Jam 2015, treads some shaky moral ground by promising to create a "personalized afterlife experience," reuniting people with loved ones who have passed on. Exactly how the developer hopes to do this isn’t clear at this point (it will be required to showcase screenshots by April 27, followed by video footage the week after to be eligible for the jam’s grand prize), although a screenshot from Project Elysium’s development does show a friend of the studio being transformed into a 3D model.

Naturally, this raises more questions. Would potential users of Project Elysium have to send pictures and video of the deceased to the developer in order to have him or her mapped into the game? And what about that person’s personality? How much data would the developer need in order to create a realistic representation of that person rather than just a robotic and potentially distressing facsimile? Perhaps you'll be able to do it yourself, using a character editor a la Skyrim.

Most importantly of all, though, what effect would seeing the deceased in a virtual world have on the mental health of the user?

Perhaps there will be a subset of people for whom Project Elysium provides real comfort and support in times of grief. We’d hope that if Paranormal Games is truly serious about helping people with its "personalized afterlife experience," it will make a real effort to study the potential mental health effects of its software. For now, VR remains a wild west of unregulated innovation, where projects like Project Elysium can push the applications of the technology without necessarily following real-world rules.

Undoubtedly things will change in the future, for better or worse, if we see VR used more frequently in this way. At the very least, Project Elysium will add to the growing discussion of what is and isn’t appropriate in VR, whether that’s violent gameplay, reviving the dead or otherwise, and whether we’re all at risk of losing our grip on reality.

Promoted Comments

I lost my mother when I was three. I have very few memories of her. I remember visiting her in the hospital and I remember her funeral, but the memories themselves are... incomplete. I have the scene, but at least for the funeral, I do not remember the emotional context of the event. I think that's because I didn't understand it.

I would love a program that would allow me to talk to a simulation of my mother, in the same fashion that Star Trek does with a holodeck in several episodes. I'd give a great deal to hear her thoughts or feelings on topics, on me, on my life. To know more about her likes and dislikes, her opinions and desires. I would never mistake such a program for *her*, but it would help fill a void in my life.

(For the record, I have an amazing stepmother -- but you still can't help but wonder, what would your birth parent have thought of this or that?)

However, with that said, I don't believe such a program could be brought to fruition without the active and comprehensive participation of a great many other people. If you're going to simulate someone, your users are going to want to *interact* with that person. Needless to say, such goals are far, far beyond our current understanding of AI.

You might not need a comprehensive brain scan to mimic someone, but you'd need their diaries, journals, YouTube videos, Facebook posts, and interviews with friends and family, all to create a comprehensive picture of the person to be emulated.

I'm sorry, but this just can't be healthy. We've all lost someone, and while I'm sure we all wish we could have them back, after enough time passes we realize that we need to move on, and we're better for it.

This honestly sounds like a more exploitative business than funeral homes.

I'm all about technology, but this doesn't seem like a great idea. People need to learn to deal with loss, as terrible as that might be; this can lead to some unhealthy relationships with a machine (as if we don't have enough of those already).

Charlie Brooker's Black Mirror episode 'Be Right Back', starring Domhnall Gleeson (most recently of Ex Machina fame). Everyone knows this is not going to end well. Who is this supposed to be for, exactly?

I think I've seen this movie once or twice or three times. It never ends well.

Suddenly conversations like this won't seem so implausible.

"The walls of reality will come crashing down. One minute, you'll be the saviour of the rebel cause; next thing you know, you'll be Cohaagen's bosom buddy. You even have fantasies about alien civilizations as you requested; but in the end, back on Earth, you'll be lobotomized! So get a grip on yourself, Doug, and put down that gun!"

I could see this as a way to leave a personalized message for a loved one after you yourself have died. But taking the image and information of someone already deceased to do this is creepy as hell.

This seems like a terrible idea. Also, shouldn't Oculus release... anything to the public before attempting anything more ambitious?

If you read the article, you'll find that this thing is being developed by Paranormal Games, not Oculus VR. As far as the Oculus VR Jam, it makes sense for Oculus VR to hold stuff like that even before full release of the Rift, since they want content for the Rift ready at the same time as the Rift launch.

That was pretty much the whole plot of Caprica. Great show; now I hate you for reminding me it's gone. Damn you, SyFy, and your WWE crap.

There was actually something weird with the genesis of Caprica. There was another show that was being developed at the same time -- one that was all about virtual reality and mind uploading. SyFy decided to focus on Caprica, but they folded in some concepts from the other show. I wish I could remember the other show...

Reminds me of John Edward. The bullshit psychic, not the bullshit politician with a similar name. He built his fortune on exploiting people who knew dead people. Same basic concept replacing the supernatural bits with technology.

Sounds like an AI problem. Of what value is interacting with a script that cannot pass the Turing test? And even if it can pass the Turing test, can it pass my own test of whether I'm speaking with so-and-so? No, of course not. There can be no conversation that explores a shared memory, or that is truly free-flowing, because the script lacks the actual experiences that form the basis for such a conversation. Or will the avatar just smile wistfully and look into the purchaser's eyes nostalgically?

It really seems to me that this sort of idea is doomed to the uncanny valley for even longer than a visual representation of Cortana and the like. We're too close to these people to be fooled easily, and I doubt it will be comfortable when the simulations fail to fool us.

None of that gets into whether it's really healthy, psychologically speaking. Who knows, perhaps such things will work for some people in the future. A sort of ancestral realm...

This press release is a good way to drum up attention, but really a much less alarming way of positioning this service is "we will make a virtual statue of someone". That's it. I guess it's a statue that can move around, maybe even say some things that sound like the dead person. I'm pretty sure Disneyland has a bunch of those already at the Hall of Presidents, and they're not a threat to anyone's mental health.

Animatronic statues of dead people... now in VR! is not as exciting as a press release.

There's a big psychological difference between doing this with dead historical figures and doing this with your own family members.

Obviously, virtual worlds that simulate real life to a high degree of accuracy are the target everyone is looking toward. This in itself brings up issues. Ex Machina, the upcoming movie, has a premise that envisions AI supported by big data achieving sentience. Kurzweil believes that computers can run a simulation of us that extends our consciousness past death. Game designers use social-engineering tactics to encourage gameplay over-indulgence.

This leads me to believe that at some point people will enter VR and never leave, a condition that has been produced in rats with Skinner boxes and pleasure-center stimulation.

Some may think of Snow Crash, but I don't think the brain will "crash"; rather, humans are prone to making bad life choices thanks to the hedonic treadmill.

I think the idea of creating a virtual "me" for my loved ones to interact with when I'm gone is a great idea; the idea of someone else creating it after the fact is not. Say I'm diagnosed with a terminal illness tomorrow: it would be awesome to create an avatar that could "sit down" and give pre-recorded talks to my son, which he could interact with (in a more realistic way than watching a video) as he grows up without me there. Having an AI try to speak for me, though, is something completely different and not at all a good idea. I imagine it more like Superman "talking" with his dad.

Sure. My point is that this is not new technology. Someone could have offered to build an animatronic version of your dead relative 40 years ago -- that would have been just as creepy, and I suspect just as (un)popular. Nothing has changed except that they're doing it in VR instead of real life -- in other words, the technology is a lot less impressive.