The whole “camera adds ten pounds” thing isn’t an old wives’ tale — it’s the result of distortion created by distance and camera angle. But could it soon be a thing of the past? Ohad Fried, a computer science PhD candidate at Princeton University, along with a team of students and Adobe researchers, has designed an algorithm, called perspective-aware manipulation, that negates that distortion.

The closer an object is to the camera, the larger it seems. So when a person is close to the camera, their nose appears larger and their face appears thinner. With a wide-angle lens, the photographer has to move closer to fill the frame with the subject, creating more distortion. That’s why most portrait photographers use a 50mm or 85mm lens — it allows them to shoot at a distance that doesn’t make the nose look overly large but also doesn’t make the face look too wide.
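The math behind this is simple pinhole-camera geometry. Here is a minimal sketch (not the researchers’ code, and the numbers — a nose tip 30 mm closer to the lens than the ears — are made up for illustration):

```python
# Pinhole model: a feature's apparent size is focal_length * real_size / distance.
def apparent_size(focal_length, real_size, distance):
    return focal_length * real_size / distance

# Hypothetical face: assume the nose tip sits 30 mm closer to the lens
# than the ears. Compare their relative magnification at a given distance.
def nose_to_ear_ratio(distance_mm, nose_depth_mm=30, focal_length_mm=50):
    nose = apparent_size(focal_length_mm, 1.0, distance_mm - nose_depth_mm)
    ears = apparent_size(focal_length_mm, 1.0, distance_mm)
    return nose / ears

print(nose_to_ear_ratio(500))   # arm's length (~0.5 m): nose roughly 6% oversized
print(nose_to_ear_ratio(2000))  # portrait distance (~2 m): under 2%
```

Notice that the focal length cancels out of the ratio entirely, which is exactly the point: the distortion comes from the distance, not the lens.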


This GIF from photographer Dan Vojtech shows the effect in action (just keep in mind that it’s actually the distance from the camera, not the focal length itself, that creates the distortion):

Of course, distortion from being too close to the camera is an even greater problem in the selfie era. Held at arm’s length, the camera is close enough to create significant distortion, widening the nose even as it appears to slim the face.

Recognizing the issue, Fried used a 3D model of the face to warp the two-dimensional image, mimicking the effect of shooting from a different distance with a corresponding focal length. Thanks to that 3D model, the software can also mimic slight adjustments in the position of the face, fixing a (minor) pose issue.
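The core idea can be sketched as a reprojection. This is my own simplification, not the paper’s actual pipeline (which fits a full 3D head model): pull the virtual camera back and scale the focal length so features on a reference plane — say, the eyes — keep the same apparent size; anything in front of or behind that plane then lands at a new image position, and those shifts define the warp.

```python
# Pinhole projection: image coordinate of a point at lateral offset x, depth z.
def project(f, x, z):
    return f * x / z

# Reproject an image coordinate as if the camera were moved back by
# `pullback` mm, with the focal length rescaled so points at depth z_ref
# (the reference plane) keep the same apparent size. All values are
# illustrative, not taken from the paper.
def reproject(x_img, z, f=50.0, z_ref=500.0, pullback=1500.0):
    x_world = x_img * z / f                  # recover the 3D lateral offset
    f_new = f * (z_ref + pullback) / z_ref   # keep the reference plane's size
    return project(f_new, x_world, z + pullback)

# A point on the reference plane stays put; a nose tip 30 mm in front of
# it shifts toward the image center, shrinking the nose.
print(reproject(10.0, 500.0))  # on the eye plane: unchanged
print(reproject(10.0, 470.0))  # nose tip: moves inward
```

Doing this for every pixel (with depths supplied by the fitted 3D model) yields a dense 2D warp field, which is the spirit of what the researchers apply to the photo.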

According to the paper, the technique isn’t a retouch that creates a different person (like a Photoshop diet plan). “Our editing operations remain in the realm of ‘plausible’ — they do not create new people, rather they show the same people under different viewing conditions,” the research team writes. “In that sense, they are the post-processing equivalent of a portrait photographer making a different decision about composition.”

While the software appears quite accurate, it doesn’t perfectly replicate an actual change in camera perspective — the researchers note, for example, that the hair doesn’t adjust to the new virtual distance.

While the software is currently in the research stage, there is an online demo that anyone can try with their own photo or a sample. Who knows, maybe the feature could be added to Photoshop — or a selfie app — down the road.