A viral YouTube video is exposing what appears to be a worldwide ring of child-exploitation videos hidden in plain sight on the site, aided by its recommendation algorithm in spreading questionable content featuring adolescent girls, with the videos even carrying advertising from major companies.

I doubt this is intentional in any way. YouTube's algorithms don't get everything right: they recognize similar activity and build lists based on it. It looks to me like this started naturally from user behavior. Pretty disturbing though; I hope YouTube takes action.

That said, I'm not sure this is the type of news best suited to this site.

Defoler

02-18-2019 11:25 AM

Considering how long it took YouTube to handle "Elsagate," this doesn't surprise me at all.

bigjdubb

02-18-2019 12:09 PM

I didn't watch the video, but I did read the article. I've noticed that YouTube tends to recommend videos from underage YouTubers quite often. I'm not sure which of my viewing habits triggers it, because the videos are never related to what I typically watch; it's always makeup videos, clothing videos, or something along those lines. I assume the trend exists because those videos are quite popular, even if for the wrong reasons. My guess is that I get these recommendations because I watch some vlog channels, so YouTube starts recommending other vlog channels.

There is no way to control what someone is thinking while they watch videos of kids doing innocent things, but YouTube should be able to flag a channel that uploads multiple videos of different children. Still, considering how many users there are and how easy it is to create a new channel, I really doubt they'll be able to do much about this. Parents face the same problem at the park or the public swimming pool, and I think this fear is part of the reason so many kids end up living sheltered lives.

huzzug

02-18-2019 08:25 PM

Quote:

Originally Posted by UltraMega
(Post 27858448)

I doubt this is intentional in any way. YouTube's algorithms don't get everything right: they recognize similar activity and build lists based on it. It looks to me like this started naturally from user behavior. Pretty disturbing though; I hope YouTube takes action.

YouTube's algorithm can already detect videos featuring minors; it can even detect swear words within such videos and take action against them. But it doesn't seem to manage this for all videos, and videos like these make up a large share of what's on YouTube. Moreover, YouTube lets the uploaders of such videos stay even after the videos have been reported.

Quote:

That said, I'm not sure this is the type of news best suited to this site.

Hey, if we're able to talk about decrypting devices because there are going to be "children at risk," this directly affects those children.

JackCY

02-19-2019 11:41 AM

Considering the kind of useless news that gets posted here at times, I don't think this big old issue YouTube still has is useless; it should definitely be brought to attention again.

YouTube is screwing over good content creators with demonetization and by leaving them out of algorithmic suggestions. Yet YouTube has no problem monetizing videos of minors that shouldn't be on their platform, or any platform, in the first place; meanwhile, their fancy algorithm is so "great" that it very quickly recommends nothing but this kind of content to those interested in it. Sure, many platforms run into this sooner or later and either have to deal with it or have protections in place from the get-go. But YouTube is a gigantic money machine at this point, with no regulation over what data it accepts, promotes, or sells. It's disgusting that after so long they still haven't done anything about it, and instead went after regular channels with sensible content. The only things that can force YouTube to make at least some impactful changes are public outcry; laws, fines, or shutdown until ample protections are in place; and advertisers withdrawing their ads and money, not because they don't want their ads on sensitive topics of discussion (gun videos, videos with swearing, etc.), but because they don't want their ads shown on videos exploiting minors or otherwise inappropriate videos of minors.

atomicmew

02-19-2019 01:54 PM

Quote:

Originally Posted by UltraMega
(Post 27858448)

I doubt this is intentional in any way. YouTube's algorithms don't get everything right: they recognize similar activity and build lists based on it. It looks to me like this started naturally from user behavior. Pretty disturbing though; I hope YouTube takes action.

That said, I'm not sure this is the type of news best suited to this site.

YouTube spends too many resources demonetizing creators and going after people making off-color jokes. The exploitation problem isn't intentional, but it is damning: YouTube's priorities are not aligned with the public good.

azanimefan

02-19-2019 03:23 PM

Quote:

Originally Posted by bigjdubb
(Post 27858502)

I didn't watch the video, but I did read the article. I've noticed that YouTube tends to recommend videos from underage YouTubers quite often. I'm not sure which of my viewing habits triggers it, because the videos are never related to what I typically watch; it's always makeup videos, clothing videos, or something along those lines. I assume the trend exists because those videos are quite popular, even if for the wrong reasons. My guess is that I get these recommendations because I watch some vlog channels, so YouTube starts recommending other vlog channels.

There is no way to control what someone is thinking while they watch videos of kids doing innocent things, but YouTube should be able to flag a channel that uploads multiple videos of different children. Still, considering how many users there are and how easy it is to create a new channel, I really doubt they'll be able to do much about this. Parents face the same problem at the park or the public swimming pool, and I think this fear is part of the reason so many kids end up living sheltered lives.

No clue; I've never once seen this kind of content in my recommended videos. I get a bunch of Tim Pool videos, lots of National Geographic wildlife stuff, anime, and superhero/Star Trek/Star Wars stuff (plus video games). That's pretty much all I use YouTube for, so it's got me nailed down pretty well.

steelbom

02-19-2019 03:41 PM

I can't say I've had any of these sorts of videos recommended, but I do get wacky recommendations. I miss the old days when YouTube actually gave me good suggestions based on the video I was currently watching.

Now my list of related or recommended videos is full of garbage.

battlenut

02-20-2019 01:27 AM

I'm in the same puddle as the two above me. My personal favorite is "stupid people getting hurt for doing stupid things"; that's all that shows up for me. Also, I'm not going to completely blame YouTube for this either. Parents are mostly to blame, either for posting this stuff at their kids' request or for allowing their kids to post it unbeknownst to them.