Alan Jacobs


Until the late 19th century, according to historian Howard Chudacoff, age wasn’t such a defining fact about people’s lives. A professor at Brown University and the author of the book “How Old Are You? Age Consciousness in American Culture,” Chudacoff found that for most of the country’s history, people of different ages tended to mingle: Families were bigger, generations often worked side by side, and kids and adults got their entertainment at the same county fairs. Schoolchildren, meanwhile, were often assigned to classes based on how much they knew rather than when they were born.

All that changed with the Industrial Revolution. Child labor laws kept children out of dangerous factory jobs; older people were also deemed badly suited for new kinds of physically demanding work. Society began to divide people up into distinct stages. “Standardization spilled over into many different facets of life,” Chudacoff says, including the way people thought about the passage of time. Schools introduced so-called age-batching; birthdays became a bigger deal. In health care, pediatrics and gerontology broke off from the rest of medicine.

Today we divide people into generations and micro-generations almost obsessively, spending energy and marketing dollars trying to understand how millennials are constitutionally distinct from Gen-Xers. In dividing everybody into categories—tweens, thirtysomethings, senior citizens—our society implicitly treats age as a force that separates us.

What “age segregation” does to America. Today at work I talked with only one person twenty years older than me and with many people thirty-five years younger.