Instagram will notify users if their account may be removed. Now, Instagram will also remove accounts that accrue a certain number of violations within a certain time frame, such as when someone goes on a racist, homophobic, or violent Instagram rant, which brings it more in line with Facebook's policy.

Until now, Instagram's policy banned users who posted "a certain percentage of violating content"; now it will also ban people who repeatedly violate its policies within a window of time.

The app already disables users' accounts if a certain percentage of their content or comments violates community guidelines, which cover things like respecting other users and following laws. Initially, only some types of removed content can be appealed, such as pictures taken down for nudity or hate speech.

Facebook-owned Instagram has received a lot of flak in the past for disabling accounts without any prior notice. If the content has been wrongly flagged, users will be able to appeal.

Instagram said it will be tweaking its policy on removing accounts found to be frequently violating its terms of service.

The announcement comes during the same week that a man suspected of killing a 17-year-old girl posted a photo of her bloody body on what appeared to be his Instagram account.

Under increasing pressure, major platforms have begun to reevaluate their policies regarding offensive or harmful content.

The FTC recently announced that Facebook would be fined $5 billion for various privacy violations over the last two years.