On Nov. 8, 2022, 21-year-old Michael Watson Jr. was arrested for his involvement in 13 rapes, four of them involving minors. He allegedly threatened to post nude images of his victims, whom he met via Instagram, online if they didn't send him money.
A year later on Nov. 13, 2023, police arrested 46-year-old California teacher Michelle Christine Solis for sexually assaulting an eighth-grade student in a locked classroom on the day of his graduation after grooming him via Instagram.
Nine months after that, 23-year-old Alejandro Garcia Aranda allegedly used Instagram to advertise and distribute sexually explicit images of high school girls without their consent. Los Angeles police arrested him on July 2, 2024.
Incidents like these have been happening to teenagers more frequently as social media use rises around the world, according to a global social media statistics research summary on Smart Insights.
More than 2 billion people use Instagram worldwide, but the app has come under increasing fire for failing to sufficiently address issues such as its role in fostering child sexualization and the teenage mental health crisis, according to a Tuesday, Sept. 17, online article from National Public Radio [NPR].
Instagram has decided to take action against the increasing harm inflicted on its users, especially teenagers, through easily accessible public accounts.
According to a Tuesday, Aug. 27, online article from Avast, about half of the teenagers in the United States have experienced some form of cyberbullying this year. The article further reveals that Instagram is a “particularly ripe platform for bullying.”
In light of the increasing rates of harm, the platform has implemented six restrictions for minors using Instagram since last fall. Here they are, according to Instagram’s website:
- Teenagers’ accounts were set to private instead of public by default in the United States, United Kingdom, Canada and Australia. Now, those under 16 must approve another Instagram user’s follow request; before, teens’ public accounts meant anyone could see their content, which made them more vulnerable to pedophiles and adult cyberbullies.
- Any newly created teenage account was also automatically registered as a private one.
- Already existing accounts were modified within the first 60 days after Tuesday, Sept. 17.
- These changes also limited whom underage users can message and tag: only their approved followers.
- Instagram started using AI to automatically detect and filter offensive or suggestive words in messages and posts that minors receive from both familiar and unfamiliar accounts. According to Meta AI’s feature, the filters cover profanity, hate speech, sexual content, spam, self-promotion, threats, bullying, drug-related content and violent content.
- The app placed strict restrictions on content posted by accounts that the user does not follow; before, a user could view any post regardless of its content.
- The platform installed a periodic check-in every 60 minutes to make sure users 16 or under are not spending too much time on the app.
- And at the end of the day, at 10 p.m., the app goes into sleep mode, which automatically mutes notifications and sends auto-replies to any direct messages.
Several Sunny Hills students who are avid Instagram users, such as sophomore Marisa Thienprasiddhi, think negatively of this update.
“I think the new update would make it a lot harder to communicate with other people at school because one of the reasons I liked Instagram was so I could message people if I didn’t have their numbers,” Thienprasiddhi said. “For example, if I needed help with homework but didn’t have any close friends in my class, I could just DM [direct message] them, but now that all the accounts are private by default, I can only contact people that I already follow.”
Some of the main uses of Instagram within the school are club activities and promotions, and junior Mia Gonzales expressed her worry that the restrictions could make it more difficult to reach out to SH students.
“I am a social media manager for the Recognize Accept Dance [RAD] club,” Gonzales said. “I think this update will negatively impact our club going forward because it hinders our ability to directly share information about the club to students on campus.”
Instagram additionally introduced a new system that detects teenagers who lie about their age to avoid the extra restrictions on their accounts.
The platform implemented built-in artificial intelligence programs along with services from Yoti, a British company that Meta, Instagram’s parent company, has been working with since 2022.
Yoti has special technology that scans faces in photos and estimates the person’s age, ultimately making it difficult for teenagers to input false birthdays.
Parental supervision is now offered as an option, allowing parents to view their children’s recent activity.
According to Instagram, parents who opt to monitor their child’s account will be able to see whom their child communicates with, but they will not be able to read the actual messages.
Despite the stronger protection services offered, some parents believe the changes won’t be very effective.
Gaming and coding teacher Sonya Joyce, a mother of three teenage boys, remained skeptical about Instagram’s changes. Instead, Joyce said social media use should come down to parental guidelines.
“If a teen has a phone, a parent needs to be able to monitor and talk to them about phone usage because they know what their child does on their phone, so there’s really no benefit on what type of account they have because they can create a non-teen account and gain access to all content,” Joyce said. “It’s not just Instagram, it’s YouTube and games and Netflix and all the apps that can make a student lose track of time and be four hours into something and not do anything else, that’s the ‘real’ issue.”