
From penny press to Snapchat: Parents fret through the ages

September 4, 2018


“This whole idea that we even worry about what kids are doing is pretty much a 20th century thing.” (File Photo: Stock Catalog/Flickr, CC BY 2.0)

NEW YORK — When Stephen Dennis was raising his two sons in the 1980s, he never heard the phrase “screen time,” nor did he worry much about the hours his kids spent with technology. When he bought an Apple II Plus computer, he considered it an investment in their future and encouraged them to use it as much as possible.

Boy, have things changed with his grandkids and their phones and their Snapchat, Instagram and Twitter.

“It almost seems like an addiction,” said Dennis, a retired homebuilder who lives in Bellevue, Washington. “In the old days you had a computer and you had a TV and you had a phone but none of them were linked to the outside world but the phone. You didn’t have this omnipresence of technology.”

Today’s grandparents may have fond memories of the “good old days,” but history tells us that adults have worried about their kids’ fascination with new-fangled entertainment and technology since the days of dime novels, radio, the first comic books and rock ’n’ roll.

“This whole idea that we even worry about what kids are doing is pretty much a 20th century thing,” said Katie Foss, a media studies professor at Middle Tennessee State University. But when it comes to screen time, she added, “all we are doing is reinventing the same concern we were having back in the ’50s.”

True, the anxieties these days seem particularly acute — as, of course, they always have. Smartphones have a highly customized, 24/7 presence in our lives that feeds parental fears of antisocial behaviour and stranger danger.

What hasn’t changed, though, is a general parental dread of what kids are doing out of sight. In previous generations, this often meant kids wandering around on their own or sneaking out at night to drink. These days, it might mean hiding in their bedroom, chatting with strangers online.

Less than a century ago, the radio sparked similar fears.

“The radio seems to find parents more helpless than did the funnies, the automobile, the movies and other earlier invaders of the home, because it can not be locked out or the children locked in,” Sidonie Matsner Gruenberg, director of the Child Study Association of America, told The Washington Post in 1931. She added that the biggest worry radio gave parents was how it interfered with other interests — conversation, music practice, group games and reading.

In the early 1930s a group of mothers from Scarsdale, New York, pushed radio broadcasters to change programs they thought were too “overstimulating, frightening and emotionally overwhelming” for kids, said Margaret Cassidy, a media historian at Adelphi University in New York who authored a chronicle of American kids and media.

The group, known as the Scarsdale Moms, prompted the National Association of Broadcasters to adopt a code of ethics for children’s programming in which broadcasters pledged not to portray criminals as heroes and to refrain from glorifying greed, selfishness and disrespect for authority.

Then television burst into the public consciousness with unrivaled speed. By 1955, more than half of all U.S. homes had a black and white set, according to Mitchell Stephens, a media historian at New York University.

The hand-wringing started almost as quickly. A 1961 Stanford University study of 6,000 children, 2,000 parents and 100 teachers found that more than half of the kids studied watched “adult” programs such as Westerns, crime shows and shows that featured “emotional problems.” Researchers were aghast at the TV violence present even in children’s programming.

By the end of that decade, Congress had authorized $1 million (about $7 million today) to study the effects of TV violence, prompting “literally thousands of projects” in subsequent years, Cassidy said.

That eventually led the American Academy of Pediatrics to adopt, in 1984, its first recommendation that parents limit their kids’ exposure to technology. The medical association argued that television sent unrealistic messages around drugs and alcohol, could lead to obesity and might fuel violence. Fifteen years later, in 1999, it issued its now-infamous edict that kids under 2 should not watch any television at all.

The spark for that decision was the British kids’ show “Teletubbies,” which featured cavorting humanoids with TVs embedded in their abdomens. But the odd TV-within-the-TV-beings conceit of the show wasn’t the problem — it was the “gibberish” the Teletubbies directed at preverbal kids whom doctors thought should be learning to speak from their parents, said Donald Shifrin, a University of Washington pediatrician and former chair of the AAP committee that pushed for the recommendation.

Video games presented a different challenge. Decades of study have failed to validate the most prevalent fear, that violent games encourage violent behaviour. But from the moment the games emerged as a cultural force in the early 1980s, parents fretted about the way kids could lose themselves in games as simple and repetitive as “Pac-Man,” “Asteroids” and “Space Invaders.”

Some cities sought to restrict the spread of arcades; Mesquite, Texas, for instance, insisted that the under-17 set required parental supervision. Many parents imagined the arcades where teenagers played video games “as dens of vice, of illicit trade in drugs and sex,” Michael Z. Newman, a University of Wisconsin-Milwaukee media historian, wrote recently in Smithsonian.

This time, some experts were more sympathetic to kids. Games could relieve anxiety and feed the age-old desire of kids to “be totally absorbed in an activity where they are out on an edge and can’t think of anything else,” Robert Millman, an addiction specialist at the New York Hospital-Cornell University Medical Center, told the New York Times in 1981. He cast them as benign alternatives to gambling and “glue sniffing.”

Initially, the internet — touted as an “information superhighway” that could connect kids to the world’s knowledge — got a similar pass for helping with homework and research. Yet as the internet began linking people together, often in ways that connected previously isolated people, familiar concerns soon resurfaced.

Sheila Azzara, a grandmother of 12 in Fallbrook, California, remembers learning about AOL chatrooms in the early 1990s and finding them “kind of a hostile place.” Teens with more permissive parents who came of age in the ’90s might remember these chatrooms as places a 17-year-old girl could pretend to be a 40-year-old man (and vice versa), and talk about sex, drugs and rock ‘n’ roll (or more mundane topics such as current events).

Azzara still didn’t worry too much about technology’s effects on her children. Cellphones weren’t in common use, and computers — if families had them — were usually set up in the living room. But she, too, worries about her grandkids.

“They don’t interact with you,” she said. “They either have their head in a screen or in a game.”
