
Informed Citizens are Better Citizens

by Maria Wood

According to the Pew Research Center, approximately 95% of kids ages 13 to 17 are on social media, with more than a third of them admitting “they use social media ‘almost constantly.’” In May 2023, U.S. Surgeon General Dr. Vivek Murthy released an advisory about the harmful effects of social media usage on our nation’s youth.

“Children are exposed to harmful content on social media, ranging from violent and sexual content, to bullying and harassment,” Dr. Murthy said in a statement when the advisory was released. “And for too many children, social media use is compromising their sleep and valuable in-person time with family and friends. We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis—one that we must urgently address.”

Specifically, the Surgeon General’s report claims spending time online damages an adolescent’s self-esteem, perpetuates eating disorders, and increases negative feelings about their bodies. One statistic in the report states that 46% of 13- to 17-year-olds surveyed said being on social media made them feel worse about their bodies. In addition, a Centers for Disease Control and Prevention report, released in 2023, revealed that teenage girls face “unprecedented levels of hopelessness” and one in three girls have seriously considered suicide.

U.S. senators take on issue

Two bills being considered in the U.S. Senate would set age limits on social media usage. The Protecting Kids on Social Media Act, introduced in April 2023 by a bipartisan group of senators, would bar children under 13 from creating social media accounts. Young people between the ages of 13 and 17 would need consent from a parent or guardian before creating an account. In addition, the legislation stipulates that social media companies would not be permitted to use a teen’s personal information to push advertising or content to them. The legislation would also establish a program, run by the Department of Commerce, to verify the age of users. The government-run program would be voluntary. Social media companies could also develop in-house age verification technology or hire a third party to verify ages.

In July 2023, another bipartisan bill, the Kids Online Safety Act (KOSA), was introduced. Similar to the Protecting Kids on Social Media Act, KOSA would ban anyone under 13 from joining a social media network and require parental permission for young people under 17. KOSA would also empower the Federal Trade Commission and state attorneys general to penalize social media companies that expose children to harmful content, including content that glamorizes eating disorders, suicide and substance abuse.

In addition, Senate lawmakers proposed an update to the Children’s Online Privacy Protection Act (COPPA), a law enacted in 1998 to protect the privacy of children under 13 by giving parents control over what data is collected about their children on the internet. Under COPPA 2.0, the age of protection would be raised to 16. COPPA 2.0 would also bar social media companies from collecting personal information without parental consent and ban advertising aimed at children and teens.

Jonathan D. Bick, an adjunct professor at Rutgers Law School in Newark, notes that under COPPA, commercial social media companies can be held liable if someone under 13 goes on their platform. Bick, who has published more than 200 articles dealing with internet law, points out that these companies can avoid liability by just prohibiting children under 13 from accessing their sites.

In July 2023, KOSA and COPPA 2.0 advanced out of the Senate Commerce Committee with a unanimous vote. At press time, the bills were awaiting a hearing in the full U.S. Senate, and no action had been taken on the Protecting Kids on Social Media Act.

On the state level

In March 2023, Utah became the first state to prohibit social media use by those under 18 between the hours of 10:30 p.m. and 6:30 a.m. Utah’s law also requires age verification and parental consent to open a social media account for those under age 18. The law is slated to take effect in March 2024.

According to the National Conference of State Legislatures, 35 states and Puerto Rico have proposed legislation to curb the harm of social media on the young. In addition, 11 states have either passed laws or resolutions to address the problem, including creating study commissions, requiring age verification or parental consent to open a social media account, and requiring age verification in order to access certain websites.

Some of these laws have met with pushback in the courts. In fact, when Utah Governor Spencer Cox signed the state’s legislation into law, he said he anticipated that it would be challenged in court; at press time, however, it had not been.

Federal courts have so far blocked California and Arkansas from enforcing laws requiring parental consent for minors to create new social media accounts on the grounds that they may violate the First Amendment. Professor Bick contends banning children under 13 from social media sites isn’t a First Amendment violation because the platforms are private entities, not government agencies.

In the Garden State, Governor Phil Murphy signed legislation in July 2023 to establish a commission to study the effects of social media usage on adolescents. New Jersey Advance Media also reported that in November 2023 Assemblyman Herb Conaway introduced a bill in the New Jersey Assembly mandating that social media platforms verify that users in New Jersey are at least 18 years of age or have parental consent to be on the platform. The bill is not expected to have a hearing until January 2024 at the earliest.

Problem with limits

Critics of laws that ban or limit young people’s social media use argue that such laws add another layer of government oversight of social media users and could clash with the right to access information. Youth who belong to marginalized groups, such as the LGBTQ+ community, could also be negatively affected by these laws.

“The internet is like a public square. When you have a restriction that actually requires people to only participate by verifying their age by uploading a government-issued ID, that really prevents people of all ages from safely participating in the public square and public conversation,” Dillon Reisman, an attorney for the American Civil Liberties Union of New Jersey, told New Jersey Advance Media. “If you have kids who belong to marginalized communities and don’t receive support in school or from their own parents, being able to access information about health, identity, politics online without parents acting as the gatekeeper is important.”

Gaia Bernstein, a professor at Seton Hall University Law School who teaches courses on technology, privacy and policy, says the laws do not interfere with young users’ ability to gather information online.

“Searching for information online has nothing to do with joining social networks,” explains Professor Bernstein, who is also co-director of the Institute for Privacy Protection, and co-director of the Gibbons Institute for Law, Science and Technology at Seton Hall.

School districts take to the courts

According to reporting from The Wall Street Journal, nearly 200 school districts across the country are suing social media companies over the harm they allegedly cause children.

In New Jersey, school districts in Chatham, Matawan-Aberdeen and Watchung are suing to recoup money spent to provide extra mental health services to students they claim were harmed by the platforms. Chatham Superintendent Michael LaSusa told nj.com his district spends $1 million annually for additional mental health treatment for its students, an added expense he believes is due to social media usage.

Professor Bernstein says the lawsuits fall under the public nuisance theory, a broad legal doctrine covering conduct said to interfere with the rights of the public. Under this theory, parties not directly harmed by an action can sue, she explains, pointing out that it was used when states sued tobacco companies to recoup the cost of providing healthcare to smokers.

Some legal scholars are not convinced that social media companies can be held liable for the cost of providing mental health services.

“Think about all the things that are social ills that manifest themselves on school property—drugs, political discord, domestic violence,” Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University in California, told Law.com. “Can schools sue all the potential sources of those social ills for nuisance?”

Eugene Volokh, a law professor at UCLA and an expert on First Amendment issues, is also skeptical.

“Providing recommendations of videos you might watch, providing social media services where people can communicate with each other, is generally protected by the First Amendment,” Professor Volokh told Education Week. “The government generally can’t impose liability on publishers for supposedly publishing material that causes some of the users to be psychologically harmed. You can’t sue a movie studio for putting out a movie that is bad for some small fraction of the audience.”

Professor Bernstein says “there is no magic bullet” to solving the issue of kids online. “We are not going back to the 20th century with no screens,” she says. “The goal is to have a healthier online/offline experience.”

Discussion Questions

  1. The Surgeon General says the nation is in a “youth mental health crisis.” What do you do to promote your mental health? Do you have coping skills if you feel down?
  2. Choose one of these statements to defend: (a) Limiting time spent on social media is beneficial. (b) Limiting time spent on social media is not necessary. Explain your answer in detail.
  3. Read the sidebar “Can’t Stop Scrolling.” If you had to pay for social media, whether with a subscription or a pay-as-you-go plan, how would that affect the hours you spend on social media platforms? What do you think you might do instead? Explain your answer.

Glossary Words
algorithm — a set of rules to be followed in calculations.
bipartisan — supported by two political parties.

BONUS CONTENT: Can’t Stop Scrolling

In October 2023, New Jersey joined more than 40 states in suing Meta, the parent company of Facebook and Instagram, for alleged harm to young people. The suit claims Meta intentionally hooked users and wrongly assured the public that its features were safe and appropriate for children.

Gaia Bernstein, a professor at Seton Hall University Law School and author of the book, Unwired: Gaining Control Over Addictive Technologies, says addictive features such as infinite scrolling, which keeps users endlessly scrolling through a site, and intermittent rewards, such as likes and comments, are designed to maximize time on the site.

“Social media companies have no incentive to limit the time we spend online because the whole business model is based on collecting our data,” she says. “For that they need to have us online as long as possible to sell advertising.”

It’s addictive

In September 2020, Tim Kendall, a former Facebook executive, testified before Congress during a hearing about online extremism.

“We sought to mine as much human attention as possible and turn it into historically unprecedented profits,” Kendall said in his testimony. “To do this, we didn’t simply create something useful and fun. We took a page from Big Tobacco’s playbook, working to make our offering addictive at the outset.”

One New Jersey teen told New Jersey Monitor, “The algorithm, it curates to what you like….The addicting thing is that there’s always something endlessly there, so you keep scrolling.”

Expecting social media companies to self-regulate will not work, Professor Bernstein maintains. Regulation, however, can be part of the solution, she says, especially if the laws impose a “duty of care” on social media companies. Duty of care is a legal obligation that requires individuals or companies to adhere to legal and ethical standards in order not to cause harm to others.

Professor Bernstein also suggests including a rating for addictiveness on sites. These ratings would be similar to how video games are rated for age appropriateness and violence. Social media companies could also change their business model from free to subscription-based or a pay-as-you-go plan, she says. Her reasoning is that if kids have to pay to be on social media networks they will “think much more carefully about how much time they spend on them.” She is also skeptical about the effectiveness of parental permission, surmising that children will find a way around it.

“It’s very difficult to tell a kid they can’t be on Instagram when all their friends are on Instagram,” Professor Bernstein says.—Maria Wood

This article and sidebar originally appeared in the winter 2024 issue of The Legal Eagle, Special Social Media Edition.