This article was published on December 9, 2021

What we learned from Instagram head’s Congress testimony

Congress lets Mosseri know it's watching the app closely — oh, and the chronological feed is coming back


Image by: Alexander Shantov/Unsplash

Adam Mosseri, who took over Instagram in 2018, appeared in front of Congress for the first time last night. The company came under major scrutiny after a series of reports from the Wall Street Journal published earlier this year highlighted the firm’s negligence toward teen mental health.

Ex-Meta employee and whistleblower Frances Haugen said last month Instagram was “more dangerous than other forms of social media.”

All of this led to lawmakers being concerned about the social media firm’s lack of action to protect kids online. In October, Senator Richard Blumenthal had called for Meta CEO Mark Zuckerberg to testify on the issue, but later said Mosseri could appear instead.

Here are the six major talking points about the hearing, and why you should care about all this.

Criticism of Instagram’s last-minute tools rollout

Hours before the congressional hearing started, Instagram dropped a blog post detailing its efforts towards teen safety. The company said it will introduce parental control tools early next year, which will give guardians the option to set time limits on the app for their kids.

Plus, the company will introduce an educational hub for parents to teach them about the intricacies of Instagram, so they can understand how the app works.

For teens, it will launch a bulk delete option in January — for posts, comments, and likes — so they don’t have to keep seeing how they looked or what they liked, years after the fact. Other upcoming measures include control over tagging, tighter handling of recommendations, and nudging teens towards different topics if they’re getting too deep into a rabbit hole.

While these tools seem important, they drew scrutiny at the hearing. Blumenthal said, “These changes seem like PR moves because they were announced hours before the hearing.” He pointed out that these tools should’ve been introduced years ago, and since many of them are in testing, we don’t know when they’ll reach users. 

US Senator Richard Blumenthal grilled Adam Mosseri about Instagram’s safety measures for teens

Senator Marsha Blackburn said, “While we share the common goal of protecting kids and teens online, what we aren’t sure about is how the half measures you’ve introduced are going to get us there.”

Senatorial finstas

To demonstrate Instagram’s negligence, members of the subcommittee on Consumer Protection, Product Safety, and Data Security created fake accounts to replicate the app experience for teens.

Blackburn noted that they were able to create an account as a 15-year-old girl, which defaulted to public, despite Instagram’s announcement in July about turning all under-16 accounts private by default at sign-up.

Blumenthal said his office created a fake account and followed a few accounts about eating disorders, and within hours it was being shown harmful pro-anorexia and eating disorder content.

Notably, his office had run the same experiment two months ago, when Meta’s global head of safety, Antigone Davis, testified before Congress. This time, the goal was to show Mosseri that Instagram hasn’t made any changes despite the issue being highlighted.

Instagram is bringing back its chronological feed

During his testimony in front of Congress, Mosseri said that in order to give people more control over what they see on their feed, the company is “currently working on a version of a chronological feed that we hope to launch next year.” This means people will see posts in the order they were published, without any algorithm interference. 

What’s more, the social network said it’s also experimenting with favorites, so you can choose accounts whose posts you want to see higher up in the feed. Notably, Facebook already has this feature for friends and pages.
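To make the distinction concrete, here’s a minimal, purely hypothetical sketch in Python of what ordering a feed by publication time, with a favorites boost, could look like. The Post structure, field names, and favorites logic are illustrative assumptions, not Instagram’s actual implementation.

```python
# Hypothetical illustration, not Instagram's code: a feed sorted purely by
# publication time, with posts from "favorite" accounts surfaced first.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    caption: str
    created_at: datetime

def chronological_feed(posts: list[Post], favorites: set[str]) -> list[Post]:
    # Favorited authors sort first (False < True), then newest posts first.
    return sorted(
        posts,
        key=lambda p: (p.author not in favorites, -p.created_at.timestamp()),
    )
```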

Instagram wants industry-wide standards

The Instagram head quoted an independent study claiming that teens use YouTube and TikTok more than Instagram. He noted that teen safety is “an industry-wide problem, so we need industry-wide standards, and industry-wide solutions.”

He also put out a proposal to create an independent industry body that identifies best practices concerning age verification, parental controls, and age-appropriate content. Mosseri said this body should get input from civil society and regulators to create universal standards.

He noted that Instagram is using new technology-based tools to detect and remove accounts belonging to users under the age of 13, and is not just relying on government-issued ID cards for verification.

Transparency and regulation

Blumenthal didn’t quite like the idea of a self-regulatory body and said, “The time for self-policing and self-regulation is over. We need independent researchers, objective overseers not chosen by big tech. Standards that stop toxic content that has driven kids to dark rabbit holes.”

He also pointed out that the US should follow the UK in forming child protection regulations aimed at preventing addictive app designs.

In his opening statement, Mosseri said, “We’ve been calling for regulation for three years now.” He also committed to providing data to independent researchers to study Instagram’s algorithms.

Instagram head Adam Mosseri
Credit: Wikimedia Commons

Blumenthal said that algorithm transparency should be a legal requirement, and Mosseri agreed to work with him on the issue.

There have been several proposals to change the Children’s Online Privacy Protection Act (COPPA). For example, a proposal from Senators Edward Markey and Bill Cassidy, introduced in May, would prohibit companies from collecting personal data of 13- to 15-year-olds without consent.

Is Instagram Kids still in the works?

When Mosseri was asked whether the company would permanently shelve plans to develop a kids’ version, he said, “if we ever manage to build Instagram for 10- to 12-year-olds, they wouldn’t have access to that without explicit parental consent.”

Instagram put a pause on its plan to develop a kid-friendly version of the app this September after backlash from lawmakers.

Age verification is a massive problem, as many kids under the age of 13 use Instagram without facing any hurdles. At an event with the company’s head in June, influencer JoJo Siwa said she had been using the app since she was eight without any problems.

Shares don’t care

Despite the many issues Congress raised over Instagram’s measures to protect children, the stock market was bullish about Meta. At the time of writing, the company’s shares had risen by more than two percent since the last close.

Meta’s stock price rose by 2% over the course of the last 24 hours

Why should you care?

Such hearings bring about changes in technology services that have a direct impact on users. The right amount of regulation and grilling from lawmakers across the world might force companies like Meta to be more proactive about protecting their users.

Constant criticism of Instagram’s lax attitude towards teen safety has led to the company launching many tools to provide a better and safer experience for teens — notably restricting unknown adults from messaging kids. Under constant pressure, Instagram might have to think proactively and introduce new measures, like the algorithm-free chronological feed, to give users more control over the content they see on the app.

If there’s legislation in place, the social network might implement stronger guardrails around sign-ups so that underage users are kept off the platform. Mosseri himself suggested that companies like Instagram “should have to adhere to these standards to earn some of our Section 230 protections.” But we don’t know what happens when a social network is not protected by the safe harbor law. 

Congress might not take up all of these suggestions, but it’s important for Instagram, and its parent Meta, to ensure that people don’t come to harm as a result of using their products.
