(Reuters) - Facebook Inc is introducing a new detection technology to stop the spread of intimate photos posted on Facebook or Instagram without people's permission, sometimes called "revenge porn," the company said on Friday.

"By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," the social networking giant said in a blog post (https://newsroom.fb.com/news/2019/03/detecting-non-consensual-intimate-images).

Facebook will also launch a support hub called "Not Without My Consent" on its safety center page.