According to a study from the Internet Watch Foundation (IWF), much of the debauchery documented on the Internet in your hazy, poor-judgement-ridden youth is being hoarded and reused inappropriately by other websites.

The study was conducted to ascertain how many self-taken or willingly-taken images of young users were online, and researchers found that a shockingly high number of them were being pulled from the sites where they were originally posted by parasite domains.

“Most of the images and videos (88 percent) appeared on ‘parasite websites,’ meaning they were taken from the original area where they were uploaded and made public on other websites,” says the IWF. “These parasite websites had often been created for the sole purpose of offering sexually explicit images and videos of young people and therefore contained large amounts of sexually explicit content.”

Here’s a quick look at the important numbers associated with the study:

A total of 12,224 images and videos were analysed and logged.

The content was on 68 discrete websites.

7,147 were images.

5,077 were videos.

Of the 12,224 images/videos, 10,776 were on parasite websites.

Therefore, 88 percent of the content had been taken from its original source site.

In only 14 instances could analysts not determine whether the site was a parasite website.

The IWF isn’t naming names — meaning we don’t have any information about the websites that are pulling these NSFW images of young users and turning them into explicit photo galleries. “The analysts sanitized the data before [we] saw it on the basis that some of this content could potentially be criminal and it may constitute an offense if [we] knowingly look at it under UK law,” IWF director of communications Emma Lowther tells me in response to a request for information about the guilty parties. Lowther did say that most of the images pulled for these purposes originated from social networking sites, though the IWF won’t “point the finger at any in particular.”

While the IWF might not reveal where these images are coming from or where they're going, the Internet has seen more than a few photo scandals thanks to Facebook, Instagram, and Tumblr. The teenage "sexting" problem has been debated at length, and while some dispute its prevalence, it has certainly led to private images being exposed.

The purpose of the study, obviously, is to serve as a warning for users of any age: the Internet never forgets, and sometimes it will pull those half-naked shots you took in your parents' bathroom mirror at 16 and preserve them in some seedy gallery for all eternity. So while you can go back and edit that unfortunate album you made in college, you can't trust any entity that has stolen your image to do the same; it may keep a copy for who knows how long. Which means it could take little more than a Google image search to surface your stupid mistake.

This report follows hot on the heels of two separate but overlapping incidents involving the Internet and indecent youth exposure. Earlier this month, 15-year-old Amanda Todd committed suicide after a man she'd met on the video chat site BlogTV urged her to expose her breasts, then outed her by releasing a screenshot of the clip. The bullying that followed is what led Todd to kill herself.

And last week, Reddit and the rest of the Internet watched an all-out war between Gawker and Redditor Violentacrez, the creator of subreddits like the now-infamous Jailbait, forums primarily dedicated to lewd pictures of women. Violentacrez was outed by Gawker's Adrian Chen, who was lambasted by much of the community for violating its anonymity norms. The argument quickly spiraled into a debate over which matters more: women's right not to have their images unknowingly flood these forums, or someone's right to anonymously find or share those photos without repercussions. Regardless, the fact remains that young users, and those among us who took risky pictures in their youth, could be targeted.

With the proliferation of photo-sharing services, if you absolutely must send or share explicit photos (and really, you mustn't), don't trust the privacy settings on large social networks or photo-sharing platforms to be good enough. Abine privacy analyst and attorney Sarah Downey suggests using Snapchat or OneShar.es, which set expirations so that images can't stick around forever.

For the record, deleting your old Facebook albums or Tumblr posts isn't good enough, either. You're left to ask the site to pull the content, or ask Google to unindex the image, neither of which is an easy or quick solution. Downey says to contact the site's hosting company first, then move to Google and its URL removal tool if need be: "I suggest moving in that order, from the party with the greatest control over the content to the least." And if it's a child pornography complaint, sites should act quickly; Downey says generally within 48 hours.

Users can feel incredibly powerless in this situation, and Downey mentions one very important thing to remember: “If it’s a picture that you took, you own the copyright to it. Under the DMCA, all websites are required to have a process for removing material that infringes copyright.”

Obviously, the best course of action is to refrain from posting explicit images. But for some, it’s too late, and it’s becoming increasingly clear that users have less and less control over their photos.