The fact that images of child sexual abuse are prevalent across the internet is, unfortunately, not a new reality. What is new is Apple's announcement of its plans to help combat the problem. By adding a feature that will allow Apple to scan U.S. iPhones for images of child pornography, the tech giant hopes to identify child predators.

While that all sounds promising, a new investigation by the Tech Transparency Project (TTP) reveals that Apple is far from being able to back up its own statements. Apple claims the App Store is a "safe place for kids" and that it "rejects apps that are 'over the line - especially when it puts children at risk.'" TTP's research, however, came to a different conclusion.

The team created a fake account posing as a 14-year-old user. Using that account, it found that of 80 App Store apps restricted to users 17 and older, the minor could bypass those restrictions in most cases. One dating app opened directly to pornography before ever asking the user's age, and several adult chat apps full of explicit images never asked for an age at all. TTP also discovered that Apple essentially "passes the buck" to the apps, and vice versa, when it comes to blocking underage users.

To be perfectly transparent: parental controls were not activated on the minor's test iPhone. Those controls are disabled by default, though, so unless a parent knows to turn them on, they would typically be off anyway.

Here's the bottom line. I realize there is much work needed to protect our children. And while I appreciate the steps Apple claims to be taking to do just that, I find them hard to believe when the company can't even protect our kids in its own App Store.
If Apple really wants to make a difference, maybe it needs to take an introspective look and start in its own home.