“If the spectacle — understood in the limited sense of those ‘mass media’ that are its most stultifying superficial manifestation — seems at times to be invading society in the shape of a mere apparatus, it should be remembered that this apparatus has nothing neutral about it.” — Guy Debord, The Society of the Spectacle, 1967.
After 13 years weaving hammocks on a hippie commune, I reentered the mainstream and, after a period of adjustment (social and technological), I found employment as a Teacher’s Assistant (Special Ed) at a public middle school. What I discovered there were the psychosocial dynamics resulting from a technological colonization beyond my wildest sci-fi dreams (or nightmares). Ironically enough, in an environment of professed diversity and ‘cultural responsiveness’ (a staff certification mandated by the school district), the choice to unplug from the matrix built by Gates, Jobs and other cyber-totalitarians is not an option; although the school has a personal-device ban in place for students (“Away for the Day”), the internet intrudes upon almost every educational hour.
1.
I was relieved I landed a position with that particular age group, probably because of my positive (sentimental) recollections of my own middle school years. That was the early 70’s, and my middle school experience was characterized by the safety of childhood combined with the liberated sophistication of the era.1 Questioning authority went mainstream, perhaps for the first time; after the successful popular opposition to the Vietnam War — especially its mandatory draft — the primary villain of that outrage, President Richard Nixon, was being publicly threatened with impeachment, which led to his shamed resignation. For a brief, euphoric moment, it seemed like maybe all the protesting longhairs had been proven right.
Adolescent development wasn’t all fun and games for me by any means, however. As an introverted ‘artistic’ kid, I proved increasingly worrisome to my (unhappily divorced and remarried) mother, who remembered her 1940’s adolescence best characterized by wholesome participation in sports, especially a local girls’ softball team. Frustrated by my tendency to commune with John Lennon albums while drawing psychedelic sketches all the time — perhaps pissed off I was squandering the opportunity to do the ‘guy stuff’ mainly denied her as a girl enlisted to become a ‘housewife’ in the unenlightened early 60’s — she sent me to a shrink when I entered 7th grade, a secret weekly rendezvous I was terrified would out me at school. Her central complaint: Why didn’t I have friends and do “normal activities” like other kids? Fortunately, the psychiatrist who saw me (for two sessions) knew a bit more about so-called normal kids in 1972 than my mother did. At least I wasn’t skipping classes to smoke weed behind the school like the cool teenagers.
By 8th grade, my nerd specs got upgraded to granny glasses and I grew out my hair into a regular rat’s nest. This — along with albums by groovy dudes like CSN&Y (a tolerant, and beloved, English teacher allowed students to play songs on an ancient Crosley prior to first class) — seriously upped my social credibility. Around the time Tricky Dick was stammering “I am not a crook” to a burned electorate, it was better business in my school to be a ‘freak’ rather than a jock. Still, my hapless mother didn’t get it. One morning, grousing about my uncombed hair, she suggested that one reason I “had no friends” was my poor hygiene. Joyfully challenged, I spent my freewheeling art class that day designing a colorful card with which to enlist my classmates’ opinions about messy long hair. It was unanimous: some 20 kids wrote comments about the benefits of “expressing yourself” and “being your own person” and other outward forms of right-on high-fiving me, all gleefully signed. See, mom, I do have friends — a regular social network.
2.
Fast-forwarding almost half a century and quite a few adulthood eras into the future, I found middle school a changed place.
While working in the classroom my first month there, I jotted down a poem (“Swipples” [a pseudonym for laptops]) which included these observations:
“We’re going to leave no child behind —
a swipple’s better than a mind;
just push a button to reset
the information you’ll forget;
you need a swipple on your lap —
instead of thinking, you just tap;
machines we all depend upon —
they’ll rule the world when we are gone;
it’s time for the commercial break —
when we’re asleep, they’re still awake.”2
Although Special Ed is a world removed from conventional public middle school, my job necessitates traveling with my students into the ‘regular’ classes (electives such as PE and art, not to mention open lunchtime in the school cafeteria), offering me a direct comparison to my recollected experience in middle school. Additionally, the 13 sheltered years lived in a rurally isolated, “voluntary poverty” intentional community gave me a telescoped perspective on modern times (I went there the year the iPhone went on sale, for example). Of course, future shock was the first thing I noticed. That old Crosley phonograph had been replaced by a 4-by-5 foot screen called a Promethean Board, most likely glowing with a buffered YouTube video of a Taylor Swift song — but, first, an advertisement for Doritos (“Ad in 5”). Not that any of the students were looking at the monstrous TV screen; they were staring into the individually glowing screens of their own laptops, often with earbuds in, playing their own selections. Or gaming. First up, I acknowledge a solipsistic customization of media, along with a corresponding relentlessness. The classes aren’t even quiet when the kids are consuming their video feeds (the way children used to be when enthralled in groups by their unrewindable television shows), but once the teacher insists (often for several agonizing minutes), they close their contraptions and the room erupts into a ferocious squall.
Soon enough, I notice other key generational shifts. These kids have severe difficulty sitting still at their desks; for some of them, 10 minutes is the limit before they spring up, walk around the room, usually grabbing some other kid (aggressively, affectionately; a coded combination of both) and flinging school supplies around aimlessly before returning to their seats (after the teacher yells at them) and immediately cracking open their laptops (or attempting to). Ten minutes later, repeat; all day long. That didn’t exist in the 1970s. Then I notice vocabulary has shrunk — even the insults have been reduced — and what there is of it expresses less range and more monocultural homogeneity. Arguments are rote Twitter quips shot out in staccato, often devoid of passion (or context); it seems like even arguing has been dumbed down. (Example: as I’m attempting to scold a kid outside the classroom, she yells “Body space! Body space!” although I’m three feet away.) Eye contact is almost non-existent; was that the norm when I was a kid?
There are other, more obvious, markers of a vast cultural shift — eating Doritos in class; wearing pajamas and bedroom slippers (even while throwing footballs during recess in the winter); carrying blankets slung over their shoulders; calling teachers “bro” and “yo,” if not outright ignoring them; filming each other in the restrooms [!]; and other salient indicia that a classroom is merely an extension (albeit a disappointing one) of home. This impulse to cocoon and turn public spaces into domestic ones seems to prove the thesis of Jean M. Twenge’s iGen that, overall, children today are increasingly sheltered and “growing up more slowly” than their generational predecessors. Another, less charitable, way of making this observation is to note that children nowadays are far less mature than children ever were before. Several of the teachers I work with (some of them drawing upon decades of experience) tell me that, especially post-Covid, 12-year-olds are now the emotional equivalent of 8-year-olds. Impatience and impulsivity are universal traits.
Then I notice some teachers (literally) toss out candy to their students. I definitely don’t remember that from the 70’s.
3.
There is no credible doubt that education — overwhelmingly demonstrated by plummeting scores in reading — has been in peril over the last generation. (Try as some bureaucrats might, Covid will not credibly take all the heat for this.) Illiteracy in the U.S., which was nearly eradicated in 1979, has now risen to 21% — a level not seen in this country since the conclusion of the Civil War3 (when literacy was illegal for some of the population). Currently, half of the U.S. reads at the 7th-to-8th grade level. None of this surprises me; I see only a rare 5% of kids reading for pleasure in the school where I work (and sadly, since the Harry Potter novels ostensibly “proved” that reading can “succeed in the digital era,” most teen reading is now fantastical, often at an elementary-school level4). Most kids I encounter openly ridicule the idea of reading. Studying is cut-and-pasting from Google search questions or, at best, the first paragraph from Wikipedia5; as Cory Doctorow famously put it, “The internet is a copying machine.” (Increasingly, as a recent article by Charlie Warzel in The Atlantic reports [“We Finally Have Proof that the Internet is Worse,” Oct. 7, 2023], Google searches now conspicuously direct users toward “more lucrative keywords” in order to promote product placement over ‘information.’ Not an ideal place to research, say, middle school social studies essays.) Almost all of the professional development trainings for educators now transpire on video instead of text, as if the educators can’t be taxed with reading. Intelligence has eroded with the lack of reading. This is quantified by what is known as the Reverse Flynn Effect: since 1975, average IQ has dropped roughly 7 points with each generation.6
Edward Tenner has suggested that the “topographic experience” of reading books — that is, the tactile contact of the pages in concert with the intellectual content — creates “higher level recall.”7 Scientific American (April 2013) concurs, saying, “Compared with paper, screens may drain more of our mental resources while we are reading and make it harder to remember what we read when we are done. […] Whether they realize it or not, many people approach computers and tablets with a state of mind less conducive to learning than the one they bring to paper.” (I never see my typos until my document goes to paper.) Nicholas Carr likewise argues as much, concluding: “The idea that our minds operate as high-speed data-processing machines is not only built into the workings of the internet; it is the network’s reigning business model… The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction.”8
Distraction is the operating system of modern society — and it shows in children’s behavior and academic decline. Auto-correct and auto-complete don’t help, nor does the polished look of a Word document on screen, which creates the impression that a first draft is the final version. Robo ed!
When I watch kids read, I see they read text vertically, as if they are scrolling words, paragraphs reduced to a quickly-swiped F pattern. Reading directions is a relic of an antiquated past; hopefully they have the patience to rewind that YouTube video (when the Doritos ad ends). The result: I have observed 12-year-old children unable to assemble a cardboard packing box; the flap pattern befuddles them. (I am not talking about the Special Ed kids.) Luckily for them, as Scott Galloway points out,9 Amazon ‘fulfillment center’ workers get verbally directed by algorithms, obviating the need for any literacy skills. Not for nothing does Mark Bauerlein call today’s youth the “dumbest generation.”
On a similar note, another — and frustrating — characteristic I encounter daily is an implausible absence of accountability; “I didn’t do it” (even when directly observed) is a default posture. It’s as if real life events are transmitted — perhaps by Snapchat — into an atomizing network of ether, or Twitter flame wars, continuously refreshed and contested. As Seth Stephens-Davidowitz noted, “On social media, there’s no incentive to tell the truth.”10 This axiom appears to be leaking into the offline world. Neuroscientist Susan Greenfield has observed, “The digital world offers the possibility, even the temptation, of becoming a world unto itself. Life in front of a computer screen is threatening to outcompete real life.”11 Although it seems counterintuitive that the era of searchability would foster throwing the dice on errant falsity (not that that ever deterred Donald Trump), even when the act was witnessed, perhaps an algorithmic avalanche of bias-confirmation has ‘postmodernized’ up and down (and left and right). Supposedly there are ‘Flat Earthers’ pulling it off, so why not the kid who throws his tater tots on the cafeteria floor? Maybe if 100 followers retweeted it, the little punk would finally fess up.
Questioning authority is yielding to denying authority — and (admitting I’m now the authority in question) that feels qualitatively different.
Elsewhere, there’s not much controversy in the idea that video gaming carries health risks; horror stories abound like AA testimonials. (Certainly, I see middle school boys sprinting and zapping their animated avatars with great gusto whenever they have the opportunity — in study hall, home room, lunch and whenever the teacher is too burnt-out to intervene. Nancy Colier has noted that most teenage boys spend as many hours gaming as attending middle and high school combined.12) Less obvious, however, is the proposition that almost all internet interaction is gaming. As Justin E. H. Smith observes, “Social media platforms like Facebook and Twitter are, in the end, video games, and so is LinkedIn, and so is ResearchGate. […] Twitter is a video game in which you start as a mere ‘reply guy,’ and the goal is to work your way up to the rank of at least a ‘microinfluencer’ by developing strategies to unlock rewards that result in increased engagement with your posts, thereby accruing to you more ‘points’ in the form of followers. […] All of these platforms are contributing to the gamification of social reality.”13 James Williams corroborates this thought, observing of his own experience, “Social interaction had become a numbers game for me, and I was focused on ‘winning’ — even though I had no idea what winning looked like.”14 We know what losing looks like, thanks in part to Frances Haugen’s Congressional testimony in 2021 regarding Facebook’s internal studies that documented the psychological harms that it, and Instagram (a FB property), cause pre-teen and teenage girls. As Twenge observed, teens on screens are 35% more likely to develop suicide risk factors.15 Incredibly enough (and despite the working ban on social media at school), the school where I work maintains both an official Facebook and an Instagram account.
4.
Meanwhile, back in the Special Ed classroom where I spend most of my time, we’re all taking a break with an abridged video of “It’s the Great Pumpkin, Charlie Brown” to celebrate Halloween. Only it’s not what it seems — as the dialog accompanying Lucy and Linus’ pumpkin carving evinces. Lucy hollows out the top of the jack-o’-lantern with a knife and quips, “This is what I’m gonna do to Charlie Brown’s big, fat head.” The teacher rises from her seat, hits the pause button, notices the title on YouTube is “The Great Lumpkin, Charlie Brown” and starts searching for another clip. I have read about this phenomenon in James Bridle’s New Dark Age (2018): children’s programming is increasingly conducted (and created) by algorithms ‘optimizing’ content descriptions for YouTube searches, resulting in occasionally contextless, even terrifying, ‘black box’ content which, in turn, gets fed into automated scrolling recommendations and autoplay for preschool children. Elsagate was what the media called it when it came to public light; when Minnie Mouse got dismembered in one especially gruesome clip for toddlers, “YouTube’s machines saw [it] only as a children’s cartoon.”16 Of course, YouTube eventually apologized and informed concerned parents that it was assigning more (and better) review controls to prevent such things from ever happening again. That was six years ago. Imagine unattended toddlers at home sucking up hours of this material on autoplay!
I would unequivocally state that I consider the internet child abuse. It’s been demonstrated — repeatedly, and often by the internal studies of Big Tech companies (such as Facebook) themselves — that the internet is addictive as well as distracting and disorienting. Information overload equals attention deficit. Algorithms are programmed to agitate the amygdala. With its irresponsible dopamine production and subsequent exploitation, Big Tech has obvious analogues in the cigarette and alcohol industries. No surprise, there are laws limiting the exposure and sale of these commodities to minors. (Operating an automobile is another apt comparison.) Considering that the American Academy of Pediatrics has recommended limiting screen time for children under 16 to only two hours daily, and the recent Surgeon General’s warning about the perils of social media, it seems blindingly evident that regulation of children’s access to the internet (especially regarding personal devices) is vital to protect them during their developmental years. The internet is a business; and business has little business in the lives of children.
Ian Bremmer opined, “Facebook, Twitter and YouTube should come with a warning label that says ‘Use as directed will be hazardous to your health.’ […] It’s very clear that these companies are bad for human beings, as their business model necessarily productizes people in ways that are fundamentally unhealthy and needs to be regulated.”17 That’s for consenting adults; I believe minors should be prohibited from using iPhones and tablets until a certain age, perhaps 16 or 18. (If not for the addictive and deranging content, then for the blue-light brain damage itself.) The growing link between screen exposure and autism is also a public health concern. Yup, government intervention sucks, but what sucks even more is unrestrained predatory capitalism. Care for a smoke, kiddo? Like a tobacco executive who didn’t smoke, Steve Jobs (a Montessori pupil) famously denied his young kids the use of iPhones and tablets. Yet when the ‘genius’ met Obama, he suggested to the President that public schools — in addition to being open until 6pm, 11 months a year — replace all books with digital tablets18; what an epic hypocrite.
Concluding, I remember the period of my young life when I enjoyed comics, cartoons and superheroes. It seems — somewhere between 6th and 8th grade — all that primary-colored fantasy stuff from early childhood evolved into an interest in the more ‘sophisticated’ rock and roll; the Beatles strolled across Abbey Road like a different, grown-up sort of superheroes. And, seeing the seemingly infinite fascination the public has for Marvel movies (and other infantilizing pablum), I wonder what happened to that subsequent step into adolescence, now that Napster and push-button songwriting have killed off the teenagers’ music. I believe Twenge is right: childhood lasts longer in this era, the era of the (ubiquitous, mobile) internet. And, certainly, the internet has become more ‘childish’ since the unwashed masses (and their kids, who spend the most time online) moved into a medium previously dominated by university access. Seeing all the emojis, the thumbs-up icons, not to mention the bright logos everywhere, I can only surmise the internet is a cartoonized fountain of youth — a dangerous, destructive one that has seduced adults, who should know better, into child-like complacency.
NOTES
1. I attended a racially integrated public school in Cincinnati, Ohio from 1964-73.
2. Originally published in The 2022 Rhysling Anthology.
3. National Center for Education Statistics, 2023.
4. When I attended middle school, recreational books included J.R.R. Tolkien (of course) but also Carlos Castaneda, Kurt Vonnegut, Jonathan Livingston Seagull, Tom Wolfe, even Mary Shelley’s Frankenstein (the first novel about AI).
5. According to the American Osteopathic Association, 9 out of 10 Wikipedia medical entries contain major mistakes.
6. Psychology Today, September 2018.
7. Edward Tenner, The Efficiency Paradox, 2018.
8. Nicholas Carr, “Is Google Making Us Stupid?,” The Atlantic, 2008. Thesis expanded in The Glass Cage, 2014.
9. Scott Galloway, The Four, 2017.
10. Seth Stephens-Davidowitz, Everybody Lies, 2017.
11. Susan Greenfield, Mind Change, 2015.
12. Nancy Colier, The Power of Off, 2016.
13. Justin E. H. Smith, The Internet Is Not What You Think It Is, 2022.
14. James Williams, Stand Out of Our Light, 2018.
15. Jean M. Twenge, iGen, 2017.
16. Mark Bergen, Like, Comment, Subscribe, 2022.
17. Quoted in Tomorrows Vs Yesterdays, edited by Andrew Keen, 2020.
18. Walter Isaacson, Steve Jobs, 2011.