When Did TV Watching Peak?

It’s probably later than you think, and long after the internet became widespread.

Two people sitting on a yellow sofa watching TV at Burning Man (Jim Urquhart / Reuters)

With Netflix and Amazon Prime, Facebook Video and YouTube, it’s tempting to imagine that the tech industry destroyed TV. The world is more than 25 years into the web era, after all; more than half of American households have had home internet access for 15 years; and the current smartphone paradigm began more than a decade ago.

But no.

Americans still watch an absolutely astounding amount of traditional television. In fact, television viewing didn’t peak until 2009-2010, when the average American household watched 8 hours and 55 minutes of TV per day. And the ’00s saw the greatest growth in TV viewing time of any decade since Nielsen began keeping track in 1949-1950: Americans watched 1 hour and 23 minutes more television at the end of the decade than at the beginning. Run the numbers (a gain of 83 minutes out of a 260-minute rise from 1949-1950 to the peak) and you’ll find that 32 percent of the increase in viewing time from the birth of television to its peak occurred in the first decade of the 21st century.

Over the last eight years, all the new, non-TV things—Facebook, phones, YouTube, Netflix—have cut only about an hour per day from the dizzying amount of TV that the average household watches. Americans are still watching more than 7 hours and 50 minutes per household per day.
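Here is a minimal sanity check of that arithmetic in Python, using only the Nielsen figures quoted above; the variable names are mine, and the final number is an approximation rather than a Nielsen statistic.

```python
# Sanity-check the viewing-time arithmetic using the article's own numbers.
# All quantities are in minutes per household per day.

start_1949 = 4 * 60 + 35   # 1949-1950 average: 4 hours 35 minutes
peak_2009 = 8 * 60 + 55    # 2009-2010 peak: 8 hours 55 minutes
gain_2000s = 1 * 60 + 23   # growth over the '00s: 1 hour 23 minutes

total_gain = peak_2009 - start_1949   # 260 minutes, birth of TV to the peak
print(f"Share of the rise that came in the '00s: {gain_2000s / total_gain:.0%}")
# -> 32%

today = peak_2009 - 60                # roughly an hour lost since the peak
print(f"Viewing today: about {today // 60}h{today % 60:02d}m per day")
# -> about 7h55m per day
```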

Nielsen numbers for television viewing time since 1949 (Nielsen)

The thing that Americans do most often with their free time is not cooking or exercising or hiking or any other seemingly salutary activity. No, Americans watch TV. That’s the default against which the current move to even tinier screens has to be measured.

In 1949-1950, American households were already watching 4 hours and 35 minutes of TV per day. Viewing time grew every decade. In an era of technological thinking that now seems long, long ago—2008—the media theorist Clay Shirky argued that the internet was tapping into the “cognitive surplus” of postwar affluence that television had, until then, soaked up. This was a good thing, he said.

“However lousy it is to sit in your basement and pretend to be an elf [online], I can tell you from personal experience it’s worse to sit in your basement and try to figure out if Ginger or Mary Ann is cuter,” Shirky said, referring to two characters from Gilligan’s Island. “And I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, ‘If you have some sans-serif fonts on your computer, you can play this game, too.’”

To Shirky’s point, sometimes doing something equaled making lolcats, but other times people were doing something better. According to Shirky, the time put into making the entirety of Wikipedia, as it existed in 2008, added up to just 1/2000th of the time Americans spent watching TV each year in the ’00s.
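Shirky’s ratio is straightforward to reproduce. The two inputs below are his own rough 2008 estimates (about 100 million cumulative hours of effort behind Wikipedia against about 200 billion hours of American TV viewing per year), not Nielsen figures; the sketch simply divides one by the other.

```python
# Reproduce Shirky's 2008 back-of-the-envelope ratio.
# Both inputs are his rough estimates, not measured figures.

wikipedia_hours = 100e6     # ~100 million cumulative hours of effort
tv_hours_per_year = 200e9   # ~200 billion hours of U.S. TV viewing per year

print(f"Wikipedia is 1/{tv_hours_per_year / wikipedia_hours:.0f} "
      "of a year of American TV watching")
# -> Wikipedia is 1/2000 of a year of American TV watching
```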

Even though phones have pulled attention to a new kind of glowing rectangle, and to new things to do on it, TV remains the biggest monopolizer of Americans’ attention. Whatever else we might say about Facebook as an ever-optimizing attention-mining machine or the clickbaity excesses of YouTube, Americans still watch as much TV as they did before the creation of Facebook, YouTube, and Netflix.

Alexis Madrigal is a contributing writer at The Atlantic and the host of KQED’s Forum.