“Mindless digital content; the fixation on it and harmful mental effects of it” — that is the definition behind “brain rot,” the 2024 Oxford Word of the Year. Brain rot is the latest worry over technology’s effects on today’s youth and others deemed most susceptible to the unhelmed beast of the internet.
Labeling technology as addictive and linking it to mental health decline is nothing new.
As Cambridge professor Amy Orben (2020) highlights in “The Sisyphean Cycle of Technology Panics,” concerns over new technology follow a predictable, recurring pattern. The same underlying issue is simply rebranded as soon as a new technology, or a new trend within technology, is introduced.
Brain rot is just the latest in a long history of moral panics related to technology. Before the internet, fears revolved around television, rock music, video games and even the telephone.
Each new medium brings fresh concerns that society, especially older generations, claims will corrupt the youth.
Examples of this cycle include Mary I. Preston’s study of media consumption in the ’40s, Congress’s hearings on the impact of comic books on juvenile delinquency in the ’50s, conversations about video game violence in the ’80s, the Y2K debacle and, most recently, the development of AI.
This fear of technology’s effects on youth dates back to Preston, who, in her 1940s study “Children’s Reactions to Movie Horrors and Radio Crime,” described children’s consumption of movie horror and radio crime programs as an addiction.
But what does this have to do with modern day issues with technology?
Absolutely nothing. In fact, some would argue that an addiction to movie horror and radio crime would be far preferable to doomscrolling.
As technology evolves, its predecessors fade into insignificance, even though they were once among the most misunderstood parts of our society.
A poor understanding of these technologies breeds scrutiny and fear at the height of their popularity.
Fast forward two to 10 years, and these one-time fixations of the technology world are nearly irrelevant, the fear once mongered around them dissipated.
The latest target of technological disavowal is “brain rot.”
This fear can easily be seen when the word is searched on Google, with some of the top results titled: “Brain Rot | Why You Are Losing Control Of Your Brain?”, “Why ‘Brain Rot’ Can Hurt Learning — and How One District Is Kicking It Out of School” and “Online is undermining our ability to tell meaningful stories.”
The correlation between time spent on the internet or an electronic device and mental health decline is about as meaningful as the correlation between the popularity of the first name Annabelle and UFO sightings in Maryland.
Essentially, it’s a spurious correlation: short on data and spread by word of mouth.
Take Oxford University’s word for it. Drawing on the Adolescent Brain Cognitive Development (ABCD) study, the most significant long-term study of brain development and child health in the United States, researchers examined more than 12,000 children and found no correlation between screen time and well-being or brain function.
But why do we keep falling for this loop of fears? One reason that seems to be most prevalent is media coverage and its correlation to selection bias.
In modern media, sources are heavily skewed, with some more focused on creating entertainment that draws in viewers than on actually informing them.
As defined by the University of North Carolina, selection bias “occurs when data is selectively presented in a way that skews perception, reinforcing existing fears rather than reflecting reality.”
The consumption of media and coverage that aligns with individuals’ fears surrounding technology topics is one reason this cycle of fear continues.
Most articles surrounding brain rot are negative, and many studies on technology lack fundamental data or state outright falsehoods.
One of the biggest examples of selection bias in modern media is The Guardian’s use of Glenn Wilson’s 2005 “Infomania” experiment, an unpublished manuscript commissioned by Hewlett-Packard, in an article titled “Emails Pose Threat to IQ.”
The Wilson (2005) study claimed that frequent digital distractions caused an IQ decline of 10 points, greater than the reported 4-point decline from marijuana use.
However, the study did not establish a baseline IQ for participants before measuring the effects of email interruptions. Without an initial cognitive assessment, it is impossible to determine whether the supposed IQ drop was due to email use or other external factors.
Additionally, the study’s methodology lacked a proper control group, meaning there was no comparison with individuals not exposed to digital distractions.
A properly controlled study would have measured participants’ IQ under normal conditions and then compared it to their IQ when exposed to distractions.
The absence of this fundamental scientific standard renders Wilson’s conclusions highly questionable.
Despite the research paper’s flaws and lack of peer review, The Guardian ran the piece under the headline “Emails Pose Threat to IQ,” spreading its misleading claims to the paper’s entire audience.
So, if these fears don’t hold up under scrutiny, why do they keep resurfacing? The answer lies in our tendency to romanticize the past and distrust the present.
Every generation believes the one after it is “ruined” by the latest innovation. The same parents who worried about their kids’ video game or social media addictions once spent hours watching MTV, and the same people decrying TikTok likely spent their youth glued to cable television.
Technology isn’t inherently harmful, but our relationship with it determines its impact.
We as individuals are products of our surroundings, and an inability to adapt to those surroundings can breed panic, fear and misunderstanding.
Instead of blindly fearing brain rot, perhaps the conversation should shift to how we engage with digital spaces mindfully, not whether the internet is melting our brains.
Voelker can be reached at voelkerw0364@uwec.edu.