A growing wave of studies is painting a troubling picture of how excessive dependence on artificial intelligence tools and social media platforms is quietly reshaping the way we think, write, and remember. Researchers warn that this digital overreliance, especially among younger generations, may be dulling critical thinking, weakening memory retention, and eroding the skills needed for deep learning. Yet experts insist that with conscious use, technology can still become a tool for growth rather than a trigger for what many now call “brain rot.”
Last spring, Shiri Melumad, a professor at the Wharton School of the University of Pennsylvania, ran a revealing experiment with 250 participants. Each was asked to write a short piece of advice for a friend on how to live a healthier life. One group used traditional Google searches to gather information, while another relied solely on AI-generated summaries provided by Google’s automated tools. The difference was stark: those who depended on AI wrote bland, repetitive, predictable advice (“eat healthy,” “stay hydrated,” “get more sleep”), while the participants who searched manually produced far richer, more layered suggestions, discussing mental, emotional, and physical well-being in a way that reflected personal engagement and thought.
Tech companies promise that AI will elevate how we learn and create, but research like Melumad’s suggests the opposite: people who outsource thinking tasks to AI tend to perform worse than those who do the cognitive work themselves. “I’m pretty frightened,” Melumad admitted. “I worry that younger people may be losing the ability to think critically or even conduct a basic search.” Her concern echoes a rising chorus of educators and psychologists who fear that tools like chatbots, meant to assist learning, may instead be numbing curiosity.
The term “brain rot,” once internet slang, has now entered the mainstream lexicon. When Oxford University Press named it the 2024 Word of the Year, it defined the phrase as the mental dullness caused by overconsumption of low-quality online content. Platforms like TikTok and Instagram have mastered the art of keeping users hooked on short bursts of dopamine-fueled entertainment: snappy clips that captivate the senses but starve the intellect. This endless-scrolling habit, researchers say, is reshaping the brain’s reward systems and shortening attention spans at an alarming rate.
Of course, fears about new technology impairing the human mind aren’t new. Socrates once lamented that writing itself would destroy memory. In 2008, long before AI came into the picture, The Atlantic famously asked, “Is Google Making Us Stupid?” But today’s digital dependency, magnified by AI’s growing role in everyday life, presents a more pervasive challenge, one that’s reflected in the alarming decline of reading and comprehension skills among American students.
Recent data from the National Assessment of Educational Progress, often called the nation’s report card, showed that reading scores for eighth graders and high school seniors have plunged to their lowest levels in decades. The drop, first recorded after the COVID-19 pandemic, coincides with a sharp increase in screen time and the widespread use of AI-powered learning tools. Researchers now believe that constant exposure to bite-sized content and automated answers may be reprogramming the brain to crave instant gratification rather than sustained focus.
A landmark study from MIT added weight to these concerns. In an experiment involving 54 college students, researchers compared writing performance across three groups: one using ChatGPT, another relying on Google searches, and a third depending solely on memory. EEG sensors revealed that ChatGPT users exhibited the lowest neural activity during the writing tasks. More alarmingly, one minute after completing their essays, 83% of those students couldn’t recall even a single line they had written. In contrast, students who wrote without AI could recite large portions of their essays from memory. “If you can’t remember what you wrote,” said Nataliya Kosmyna, who led the study, “then you don’t feel ownership. Do you even care?”
This cognitive disconnect is not limited to AI writing tools. Social media platforms, too, are being linked to diminishing learning capacity. A study published in JAMA by researchers from the University of California, San Francisco, followed over 6,500 children aged 9 to 13. It found that those who spent three or more hours daily on social media scored markedly lower in tests of reading, memory, and vocabulary than peers who used no social media at all. Every hour spent scrolling, the researchers warned, was an hour lost from meaningful activities like reading, sleep, or face-to-face interaction.
In response, schools in several U.S. states, including New York, Indiana, and Florida, have banned smartphones in classrooms, citing growing evidence of social media’s detrimental effects on concentration. Pediatric experts advise parents to enforce “screen-free zones” at home, particularly in bedrooms and at dinner tables, to help children reclaim their focus and creativity.
But the research is not all doom and gloom. The same MIT study also revealed that when students began their writing tasks independently and only later used ChatGPT for refinement, their cognitive engagement actually increased. This suggests that AI can be a powerful tool when used as a supplement rather than a substitute for human effort—similar to how a calculator enhances math skills only after the formulas are first learned by hand.
Melumad agrees that the key lies in mindful use. “AI turns active learning into a passive experience,” she said. “But if you use it strategically, to clarify doubts, cross-check facts, or refine your work, it can still be valuable.” Instead of asking a chatbot to summarize an entire topic, she recommends using it for specific queries while maintaining traditional habits like reading books and verifying sources.
As AI and social media continue to dominate daily life, society faces a choice: allow convenience to erode curiosity, or use technology as a tool to sharpen it. “Brain rot” may be the phrase of the decade, but it doesn’t have to define the future. The human mind, after all, has survived every previous technological revolution by remembering how to think for itself.