Instagram Announces Plans for Parental Controls and Other Safety Features Ahead of Congressional Hearing
Following the recent testimony from Facebook whistleblower Frances Haugen, Instagram head Adam Mosseri is slated to appear before the Senate for the first time on Wednesday to address how the platform harms kids’ mental health. Just ahead of that hearing, Instagram has launched a new set of safety features, including its first set of parental controls. The changes were announced in a company blog post written by Mosseri. Some of them are minor enhancements to safety features the company already had in the works.

The bigger story today is Instagram’s announcement that it will roll out its first set of parental controls in March. Parents and guardians will be able to see how much time their children spend on Instagram and set screen time limits. Teens will also have the option of notifying their parents when they file a report. These tools are opt-in: teens can choose not to send notifications, and neither parents nor teens are required to use the parental controls.

The parental controls, as described, are also less robust than those on TikTok, where parents can lock their children’s accounts into a restricted experience, block access to search, and control their child’s visibility on the platform, including who can see, comment on, or message them.

Meanwhile, mobile platforms themselves already provide screen time limits; Apple’s iOS and Google’s Android, for example, offer comparable controls. In other words, Instagram is not doing much in the way of novel parental controls right now, though it promises to “offer additional alternatives over time.”

Another of the new features was previewed earlier. Instagram began testing its “Take a Break” feature earlier this month, which lets users set a reminder to stop using the app after 10, 20, or 30 minutes, depending on their preference. The feature is now rolling out in the United States, the United Kingdom, Ireland, Canada, Australia, and New Zealand.

Unlike rival TikTok, where videos encouraging users to leave the app appear in the main feed after a certain amount of time, Instagram’s “Take a Break” feature is only available to those who opt in. The company will prompt users to set these reminders, but they will not be required to do so.

That creates the impression that Instagram is doing something to address app addiction without going so far as to enable “Take a Break” by default or, like TikTok, regularly reminding users to leave the app.

Another change broadens earlier efforts to keep teens away from unwanted adult contact. Instagram has already begun setting teens’ profiles to private and restricting targeted advertising and unwanted adult interaction, the latter by deploying technology to detect “possibly suspect behavior” from adult users and then blocking them from interacting with teens’ accounts.

Such adults are no longer able to contact teens who do not already follow them, and teens receive warnings when an adult engages in questionable conduct, along with tools for blocking and reporting.

Instagram will now broaden this set of protections to prevent adults from tagging or mentioning teens who do not follow them, as well as from including teens’ content in Reels Remixes (video content) or Guides. These will be the new default settings, rolling out in the coming year.

Instagram also says it will be stricter about what it recommends to teens in areas like Search, Explore, Hashtags, and Suggested Accounts. However, in describing this change, the company does not appear to have made a firm decision about what will actually be modified. Instead, Instagram says it is “experimenting” with restricting material in Explore, using the newer set of sensitive content controls that debuted in July. The company says it is considering expanding the “Limit Even More” setting to cover not just Explore, but also Search, Hashtags, Reels, and Suggested Accounts.