October is National Bullying Prevention Month, and Instagram is adding new tools to help reduce cyberbullying within its app and protect its more vulnerable users. The company is testing a feature that will automatically hide comments similar to ones that have already been reported as offensive or harmful.

Currently, Instagram has an automated warning system that alerts users when they try to post an offensive comment. Under that system, the comment remains visible, and the user can choose to delete it or post it anyway. The new system uses the same detection process but hides flagged comments from view, with the goal of limiting exposure to harmful comments online. Instagram is going a step further by adding a new alert warning users that their account could be deleted if their offensive comments are repeatedly reported.

According to a survey conducted in 2017, Instagram was "the worst social media network for mental health and wellbeing," with the platform reportedly contributing to higher levels of anxiety and depression. By taking these steps to curb cyberbullying within its app, Instagram hopes the new measures will provide an extra layer of protection and make users think twice about what they post. It's wise to remember that even with all the advanced technology available to help prevent cyberbullying, the real power remains in the hands of the person holding the device.