Cyber Safety Consulting Quoted in Washington Times on Landmark Social Media Verdicts

We are always honored to give our thoughts on cyber safety issues. Check out the article at this link:

https://www.washingtontimes.com/news/2026/mar/25/juries-find-social-media-platforms-knowingly-harmed-children/

The Hidden Dangers of Roblox: What Parents Need to Know About the Whisper Chat Function

By: Allison Bonacci

Based on my time with students in schools, I have no doubt that the most popular online game is Roblox! Whether I am working with 3rd graders or 8th graders, it is a hugely popular game. At a casual glance, Roblox certainly appears to be a ‘kids’ game’, and some may even call it a ‘little kids’ game’. Because of this perception, most parents instinctively assume that if it is a little kids’ game, then the dangers are probably fairly limited. Not so much…

Looking at the hard numbers, Roblox is one of the most popular online gaming platforms for kids, with over two hundred million active users each month. Its interactive, user-generated worlds and immersive multiplayer experience make it incredibly appealing to young players. Like many online platforms, however, Roblox comes with its fair share of risks, particularly around communication features like the whisper chat function, a feature that was new to me until I recently read about it.

What Is the Whisper Chat Function?

The whisper chat function in Roblox is a predator’s dream. It allows players to send private messages to each other within the game. Unlike the general chat, which can easily be monitored, whisper messages are private and harder to track. The function was designed for teamwork and private discussions, but it has opened the door to real dangers: chief among them, a predator grooming an innocent child, along with cyberbullying, inappropriate content, scams, and phishing attempts. Here is a short YouTube video to familiarize yourself with how it works.

Security experts have said that platforms like Roblox have become hotspots for predators. In fact, in 2024, a report by Hindenburg Research called Roblox a ‘pedophile hellscape for kids,’ alleging that children have been “groomed” on the app for years. The whisper chat functionality contributes to this problem: since whisper messages are not always visible to moderators or parents, it becomes easier for predators to build trust and manipulate young players.

How Parents Can Protect Their Kids on Roblox

While Roblox offers many opportunities for creativity and socialization, parents must be aware of the potential dangers lurking within the game and take proactive steps to protect their children. Roblox has built-in parental controls that can significantly improve online safety:

  • Restrict Chat Functions: Under the account settings, parents can limit who can message their child or disable chat features altogether. This will also disable whisper chats.
  • Enable Account Restrictions: Turning on Account Restrictions in the Privacy Settings helps ensure that only age-appropriate content is accessible to the child.
  • Set a Parent PIN: By enabling a Parent PIN, settings cannot be changed without parental approval.
  • Monitor Friend Requests: Set the account to only allow friend requests from known, trusted individuals.
  • Use the Report & Block Features: Teach children how to report and block users who behave inappropriately. This allows children to control who they wish to communicate with online and is a great skill to teach for future online interactions.

Additionally, I want to emphasize the importance of talking to your child about this feature. Show them what it looks like and explain that it is dangerous for anyone to talk to them privately on any game. Also emphasize the importance of only talking to people they know in real life or whom you, the parent, have approved. Even if you have chat disabled, your child may play at a friend’s house where chat is not disabled, and they need the knowledge to protect themselves.

It is also important to educate children about general online safety so that they understand the dangers of private messaging and online interactions. This includes never sharing personal information, such as real names, addresses, or passwords, and understanding that not everyone online is who they claim to be. Students should also be encouraged to report any uncomfortable interactions to a parent or trusted adult immediately. These could include predatory behavior, such as someone asking them to keep secrets or trying to move the conversation to another platform.

Checking Roblox chat logs and in-game activities periodically can help parents identify any concerning interactions or behaviors. We think it is best to foster open communication with your child by encouraging them to talk openly about their gaming experiences, friends they meet online, and any messages that make them uncomfortable. Assure them they will not be punished for reporting concerns.

Roblox can be a fun and engaging platform for children, but it is essential for parents to stay informed and proactive about its risks. The whisper chat function, while useful for gameplay, poses dangers that should not be ignored. By utilizing parental controls, educating kids on online safety, and fostering open communication, parents can create a safer online environment for their children.

Cyber Safety in the News

More Than 1 in 3 Adolescent and Teen Boys Are Gambling—And It Often Starts with Video Games

Parents Magazine, February 3, 2026

A new study from Common Sense Media finds that gambling is far more common among U.S. boys ages 11–17 than many parents realize, with about 36% reporting they gambled in the past year. Much of this is not traditional betting, such as sports wagering or card games, but rather gambling-like systems embedded in video games, such as loot boxes and randomized reward mechanics that require real money and mimic slot machine behavior. These features normalize chance-based spending in contexts kids view as harmless gaming, making the line between play and gambling blurry.

The study also highlights several ways boys are exposed to gambling: social media ads, sports broadcasts, peer influence, and algorithms pushing gambling-related content into feeds. A significant majority of boys who see gambling content online are not seeking it out; instead, the content is delivered to them as part of regular scrolling on platforms like YouTube or TikTok. Peer groups were among the strongest predictors of gambling behavior, with boys far more likely to gamble if their friends do.

Experts warn that starting these habits early, especially while teens’ brains are still developing, can increase the risk of addictive behaviors, anxiety, depression, and negative impacts on school or relationships. Parents are encouraged to talk openly about gambling, set spending limits, monitor in-game purchases, and watch for patterns of frequent gambling rather than isolated instances. The goal is to help students understand the risks and recognize gambling-like mechanics before they escalate into problematic habits.

 

‘Firearm Influencers’ Are Targeting Kids on Social Media: What Parents Should Know

TODAY, February 10, 2026

The article explains that many children and teens are encountering firearm content on social media and video platforms, often without their parents knowing, because algorithms can recommend videos and posts related to guns even when kids are not searching for them. This content can include “firearm influencers,” unsafe handling demonstrations, and marketing that frames guns as exciting or desirable, all of which may shape young people’s perceptions of guns and normalize risky behavior. Campaigners and researchers argue that this kind of exposure can reach children quickly after they start using social platforms and that platforms should be more transparent about what kids see and how it is recommended.

The guide portion of the article focuses on what parents need to know and do: it urges caregivers to be proactive in understanding the type of gun-related content their children might encounter, to talk openly about it, and to use available tools like parental controls or monitoring features to limit exposure. Experts also suggest that simply forbidding access is not enough; parents should engage with their kids about why certain content can be harmful and help them think critically about online material. As always, we promote open and honest communication between students and parents.

 

Cell Phones to Be Banned in Michigan Classrooms

Detroit Free Press, February 10, 2026

Michigan Governor Gretchen Whitmer has signed a new statewide law that will ban students from using smartphones during instructional time in K-12 public school classrooms starting in the 2026–27 school year. The legislation, passed with bipartisan support in the legislature, requires every school district to adopt policies that prohibit phone use while class is in session, though students can still bring phones to school and use them between classes or at lunch. Basic “flip phones” and medically necessary devices are exempt, and schools can implement even stricter rules if they choose.

The law is intended to reduce distractions, improve academic focus, and address concerns about high screen time and its effects on student learning and mental health. Local school districts retain control over enforcement details, and the policy includes exceptions for emergency communication and teacher-approved academic uses. Supporters argue the ban will help students engage more in lessons and reduce disruptions, while also aligning Michigan with a growing number of states adopting similar restrictions.

 

The Surging Online Risk to 13-Year-Olds Most Parents Aren’t Talking About

Newsweek, February 13, 2026

A recent national study of more than 3,400 U.S. adolescents ages 13–17 shows that sexting has become widespread, with nearly one-third of teens reporting they have received sexually suggestive images or videos and about one in four having sent them. Researchers found that sending explicit content to someone outside a committed relationship greatly increases the risk of harmful outcomes: those teens were over 13 times more likely to have their images shared without consent and nearly five times more likely to face sextortion, which is when someone threatens to distribute the images unless the victim sends more content, pays money, or complies with other demands. Requests for sexts were also common, with roughly 30% of teens saying they had been asked to send explicit content, indicating that social pressure often drives these interactions rather than mutual choice.

The study highlighted troubling patterns among diverse groups: boys reported higher rates of sending and receiving sexts than girls, non-heterosexual teens experienced higher involvement and pressure, and younger teens, especially 13-year-olds, were particularly vulnerable to having content shared without permission. Nearly half of teens who had sent explicit images said they were later targeted with sextortion. Experts emphasize that simply telling teens “Don’t sext” is ineffective; instead, education should focus on consent, boundaries, digital privacy, and how to handle risky online situations, helping them navigate digital relationships safely and seek help if something goes wrong. We always discuss sextortion risks and dangers with our secondary students, as the FBI has labeled it the fastest-growing crime online.

 

High School Student Facing More Than 300 Felony Charges for Running a ‘Sextortion’ Scheme That Exploited Minors

People Magazine, February 22, 2026

An 18-year-old senior at Peters Township High School in Pennsylvania has been charged with more than three hundred felony counts in connection with a large-scale sextortion and catfishing operation that targeted minors. Prosecutors allege the student, identified as Zachariah Abraham Meyers, used fake profiles on social media platforms like TikTok and Snapchat, including posing as an adult woman, to contact boys between the ages of about 14 and 17. He reportedly tricked them into sending explicit images and videos and, in some cases, used threats to coerce further material or money by threatening to share the content with family and friends. At least twenty-one victims have been identified so far, and evidence from seized devices linked him directly to the alleged network.

According to authorities, Meyers’ alleged conduct was not limited to obtaining images: in some cases, he is accused of directing victims to produce sexually explicit recordings, including one involving adult men, and of exploiting his access to school environments. He is currently held without bail as investigators continue to analyze devices and determine the full scope of the scheme; the school district, which is cooperating with law enforcement, has stated that there is no ongoing threat to student safety. At Cyber Safety Consulting, we always warn parents to monitor their children’s digital interactions and be vigilant about online enticement and exploitation.

Instagram To Alert Parents When Teens Search for Info on Suicide or Self-Harm

CBS News, February 26, 2026

Meta-owned Instagram announced it will start notifying parents if teenage users repeatedly search for terms related to suicide or self-harm on the platform. These alerts will be sent via email, text, WhatsApp, or an in-app notification, but only if parents are enrolled in Instagram’s parental supervision tools. Instagram said it already blocks such content for teens under eighteen and directs them to helplines and resources when they try to search for harmful terms.

The alerts are designed to give parents an early warning that their teen may be struggling, so they can intervene and offer support or resources for sensitive conversations about mental health. Meta specified that the alert will only trigger after a teen performs multiple related searches within a short timeframe, a threshold it set to reduce unnecessary notifications and avoid overwhelming parents.

This rollout begins next week in the United States, United Kingdom, Australia, and Canada, with plans to expand to other regions later in 2026. The update comes as the company faces ongoing legal scrutiny and trials over how its platforms affect young users’ mental health, including claims about platform design and youth harm. Instagram’s teen safety enhancements also include prior content restrictions for minors and efforts to bolster parental controls.

Allison Bonacci featured in Washington Times Article: More states consider ‘bell-to-bell’ cellphone bans for K-12th grade students 

We are always honored to contribute to articles about online student safety…

https://www.washingtontimes.com/news/2026/feb/5/states-consider-bell-bell-cellphone-bans-kindergarten-12th-grade/

Cyber Safety in the News

Believe It or Not, Kids Actually Want to Get Off Their Phones – Dr. Jonathan Haidt Says He Has Proof

Parents Magazine, January 3, 2026

The article highlights research from Dr. Jonathan Haidt and co-author Catherine Price in their book The Amazing Generation, which argues that many children actually want to spend less time on their smartphones and more time engaging in real-world activities like playing outside and socializing face-to-face. The authors gathered testimonials and survey data from young people themselves to show that when given a choice, many kids prefer unstructured, screen-free interactions with friends over hours spent on phones. This challenges the common belief that children are simply addicted to their devices and reveals that kids sometimes feel trapped by the expectations and norms around digital communication and social media.

Drawing on both research and real kids’ voices, the article suggests that parents have an opportunity to help their children reclaim a more balanced childhood by setting healthier boundaries around technology use. Rather than banning smartphones outright, the book’s message focuses on giving kids freedom and encouraging activities that foster real-world connections, which many young people say they genuinely want. The authors also explain how pressures from peers and fear of missing out can keep kids glued to their screens, even when they would rather be doing something else. The solution seems to involve creating environments that make screen-free time more appealing. When we are working with students in the classroom, we often encourage them to make a list of alternative offline activities that they enjoy, which helps foster those real-world connections.

 

Phones Ruled Their Lives. A New College Class Helped Them Break Free.

The Washington Post, January 6, 2026

At Loyola University Maryland in Baltimore, a psychology professor created an experimental “digital detox” course to help students break free from excessive smartphone dependency, which many described as feeling “trapped in a phone prison.” Before the class began, some students reported checking their phones hundreds of times a day or having dozens of games downloaded and expressed concerns that constant screen use was hurting their focus, sleep, and emotional wellbeing. Over the semester, participants dramatically reduced their phone pickups and began recognizing how much time their devices consumed.

The class ran without phones, computers, or tablets; instead, students engaged in analog activities, digital fasts, and outdoor experiences like football and hiking. They studied the psychology behind attention and notifications and practiced skills such as uninterrupted conversation, something many students said they’d rarely experienced. By the end of the semester, students created “digital manifestos” outlining how they planned to use technology more intentionally going forward.

Many participants said the experience helped them rediscover boredom and the value of in-person interaction, and several pledged to set concrete limits on social media and screen time after the class ended. The course reflects a growing awareness among educators that college-aged young adults often need structured support to rethink their relationship with technology. As part of Cyber Safety Consulting’s CASE curriculum, we work with students to create awareness surrounding their current daily screen time and being more intentional about offline activities in the future.

Character.AI And Google Agree to Settle Lawsuits Over Teen Mental Health Harms and Suicides

CNN, January 13, 2026

Google and the AI startup Character.AI have agreed to settle multiple U.S. lawsuits brought by families who alleged that interactions with Character.AI’s chatbot platform contributed to teenagers’ suicides or serious psychological harm. The legal claims include wrongful death and negligence, with one case involving a Florida mother who said her 14-year-old son formed a harmful emotional connection with a chatbot before ending his life.

The lawsuits were filed in several states, including Florida, Colorado, New York, and Texas, with plaintiffs arguing that the chatbots lacked adequate safety protections or crisis-intervention features for minors. Google was named in many of the suits because of its financial and technological ties to Character.AI; plaintiffs claimed that this connection made Google partly responsible for the product’s design and deployment.

In response to growing concerns, Character.AI has already implemented changes aimed at protecting youth, such as banning under-eighteen users from open-ended chats and introducing age-verification measures to reduce harm. The settlement marks one of the first major legal resolutions tied directly to safety issues with AI chatbot use among teens. This highlights a broader debate about how tech companies should safeguard AI engagement with vulnerable users like teenagers in the future.

 

YouTube Will Let Parents Stop Their Teens from Endlessly Scrolling Short Videos

CNN, January 14, 2026

YouTube has announced expanded parental control features that let parents of supervised teen accounts manage how much time their children spend watching YouTube Shorts, the platform’s short-form video feed. These controls allow parents to set a daily time limit on Shorts viewing, ranging from up to two hours down to zero minutes, effectively blocking access altogether when needed, such as during homework or bedtime. The update is part of YouTube’s broader effort to respond to concerns from families, child advocates, and lawmakers about the addictive nature of endless scrolling on short-video platforms.

In addition to time limits, YouTube is introducing features like custom “Bedtime” and “Take a Break” reminders for teens, giving families more tools to promote healthier viewing habits and digital wellbeing. The company is also making it easier for parents to create and manage supervised accounts and to switch between adult and teen accounts on shared devices. These tools build on existing protections already in place for users under eighteen, including default recommendations aimed at reducing harmful content loops.

YouTube’s announcement reflects growing scrutiny of social media’s impact on youth, as platforms grapple with how to balance engagement with safety. By prioritizing parental control over Shorts viewing and refining content recommendations, including promoting more educational or uplifting videos for younger audiences, YouTube aims to tailor experiences more appropriately for teens. Critics and advocates alike see such features as increasingly necessary given the attention-grabbing design of short-form video feeds. While this is a step in the right direction, it would be extremely easy for students to circumvent this parental control by using an alternative YouTube account or using the platform as a guest. As always, open communication between parents and kids about online safety is best.

 

Meta Halts Teens’ Access to AI Characters Globally

Reuters, January 23, 2026

Meta Platforms announced that it will suspend access for teenagers to its AI characters across all its apps worldwide while it builds an updated experience specifically for teen users. The pause will begin “in the coming weeks,” and teens will not be able to interact with the character-based AI until the revised version is ready. According to Meta, the new iteration will include parental controls designed to give guardians more oversight once it is launched.

Meta said that earlier previewed parental controls, which would let parents disable their teens’ private chats with AI characters, have not yet been fully rolled out, so the company is taking this step as an interim measure. The updated version of the characters is intended to be guided by a PG-13 content standard aimed at keeping interactions appropriate for minors and preventing access to harmful or age-inappropriate material.

The move comes as regulators and critics scrutinize how AI chatbots interact with minors, including past reporting that Meta’s AI rules at times allowed provocative or inappropriate conversations with younger users. Meta’s decision reflects rising industry and regulatory concerns over teen safety and content risks associated with AI-powered characters on social platforms. We are always happy to see parental controls put into place and would like to see more platforms follow suit in the future.