The Baby Boomers were born between 1946 and 1963, inclusive.  At least that’s how this site defines that generation.  You can quibble with the boundary dates; for instance, in their book Generations, Neil Howe and William Strauss use a somewhat unorthodox span, 1943 to 1960.  But what I’ve been seeing a lot lately is a redefinition that goes far beyond this kind of quibble.  Consider exchanges such as the following:

“The NBA’s marketing team isn’t thinking long-term.  They keep pushing Boomers like Steph and LeBron, so people who aren’t already into basketball don’t know anything about the next generation of stars.”

“LeBron James was born in 1984.  Steph Curry was born in 1988.  They’re not Boomers; they’re not even Gen X.  They’re Millennials.”

Or this:

“World War II was popular in the U.S. because the Japanese attacked us and the Nazis were obviously evil.  But in the ’60s there were lots of protests about the Vietnam War because you had a bunch of old Boomers sending young people off to die for a cause that didn’t seem important.”

“But in the ’60s, the young people were the Boomers.  The ‘generation gap’ of that era was between the Baby Boomers, in their teens and early twenties, and their parents from the World War II generation.”

Human nature being what it is, rarely have I run across such exchanges in which the first person replies, “Oh, I see! Apparently I misunderstood the term!”  Almost invariably, when told that the word “Boomer” means “member of the Baby Boom generation”, the response is instead, “That’s not what it means anymore. Now it just means ‘old person’.”  And often someone else will chime in, “Yeah, it’s linguistic drift! Language evolves! What are you, some kind of >shudder< prescriptivist?”

For a lot of people, it seems, nothing could be worse than to be a linguistic prescriptivist.  To say that this use of language is “right” and that use of language is “wrong”… for many, that means that at best you’re a stodgy schoolmarm trying to keep everyone stuck in the past, but an even more serious charge is that you’re turning language into a tool that allows one class/race/etc. to perpetuate its privilege.  And language can be that.  But language is primarily a tool for communication.  And there is nothing wrong with advocating against making it a less effective tool.  If someone breaks into your toolbox and snaps the Phillips head off your screwdriver so that you’re left with a metal spike on a handle, I can’t imagine that you’d take too kindly to someone scolding you that “Hey, it’s screwdriver drift! Tools evolve! What are you, some kind of hardware prescriptivist?”

To expand a bit: the primary purpose of language is to transmit ideas from one mind into another.  Ideally, this happens with a minimum of friction: someone speaks, and without conscious effort, you parse the flow of sounds into words and translate the words into concepts.  Contrast this with listening to a foreign language in which you are not fully fluent—in such a case, you do have to consciously work through the steps involved.  What did she say?  “Kellerateel”? All right, that must be quelle heure est-il, which means, “What hour (or time) is it?”  Written language seems as though it would add yet another step to this process, as you have to look at a set of abstract markings, mentally convert them into sounds, the sounds into words, and the words into concepts… except this isn’t actually how it works!  Though it’s easy to see why people might think it is.  In 1955, Rudolf Flesch released a book called Why Johnny Can’t Read, which argued that English-speaking children should be taught to read using phonics.  At the time, most American schools taught whole-word recognition: kids would learn that the word net meant “net” without any explanation that the n made the “nuh” sound and the e made the “ehh” sound and the t made the “tuh” sound.  As a consequence, when these kids encountered the written word ten, they had no idea what spoken word it could possibly correspond to.  This sort of thing made Flesch ask why we were acting as though English words were Chinese ideograms.  The advantage of an alphabetic system is that once you’ve learned the alphabet, you’ve theoretically unlocked every word in the language (though this certainly isn’t true for English, with all its irregularities: “laugh” is /læf/, not “lah-uh-guh-huh”).  Flesch’s argument was hugely influential, and my generation was taught how to read by Sesame Street and The Electric Company, which used phonics.

All that said—literate adults don’t actually use phonics when they read!  Phonics is a great tool for learning to read, the same way that training wheels are a great tool for learning to ride a bike—but in both cases, the eventual goal is to toss out the crutch.  This is why the standardization of spelling and punctuation was so important: as late as the eighteenth century people spelled words and punctuated sentences pretty much however they pleased, and that made written documents harder to read than they had to be.  Standardization allowed for, yes, whole-word recognition without phonics: you glance at the word and just know what word it is.  There’s some leeway there: the studies I’ve read have suggested that intelligibility depends on the first and last letters being correct, on ascenders and descenders being in roughly the right place, and on most of the remaining letters being accounted for if not necessarily in their proper locations.  That’s why it’s not as hrad as you mihgt epxect to mkae snese of smoetihng lkie tihs.  But it’s certainly not a smooth read.  And it doesn’t even take that level of garbling to introduce friction to the reading process.
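
If you want to play with that effect yourself, here is a minimal sketch in Python (mine, not something taken from the studies in question) that applies the rule described above: keep each word’s first and last letters in place and shuffle the rest.

    import random

    def scramble_interior(text, seed=0):
        """Shuffle the interior letters of each word, keeping the first and
        last letters (and any punctuation) where they were."""
        rng = random.Random(seed)
        out = []
        for word in text.split():
            letters = [c for c in word if c.isalpha()]
            if len(letters) > 3:
                middle = letters[1:-1]
                rng.shuffle(middle)          # garble only the interior
                letters = [letters[0]] + middle + [letters[-1]]
            it = iter(letters)
            # Reassemble, leaving non-letter characters in their original spots.
            out.append("".join(next(it) if c.isalpha() else c for c in word))
        return " ".join(out)

    print(scramble_interior("make sense of something like this"))

The exact garbling depends on the shuffle, but the result stays surprisingly legible, if not exactly a smooth read.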

Consider the word “definite”.  Is there anything “wrong” with spelling it “definate”?  There are plenty of words that use that -ate suffix rather than -ite—accurate, intimate, legitimate, who knows how many others—and the key vowels are pronounced the same (all these words end with /ɪt/).  If you spell “definite” with an a, no one reading is going to be confused about what word you mean.  But there’s that moment—maybe just a fraction of a second—when the parser operating subconsciously in the reader’s mind gets tripped up: “Wait, ‘definate’? No match! What might the writer have been going for…? Oh, aha! The writer must have meant ‘definite’!”  And that’s a problem.  Ideas are no longer flowing, seemingly unmediated, from the writer’s mind to the reader’s; rather, the reader is on some level aware of looking at glyphs on a page or a screen.  To argue that spelling doesn’t matter, then, strongly suggests that you haven’t internalized the standard lexicon well enough to ever experience this seamless flow of ideas.  If reading is never effortless for you—if it’s always a matter of consciously decoding those squiggly marks, or even mentally sounding things out—then, yeah, it probably doesn’t matter to you whether a piece of text reads “definite” or “definate”, or “could’ve” or “could of”, or “you” or “u”.  For the “oh, you knew what I meant” crowd, it’s always a bumpy ride, so complaints about getting snagged on their errors sound to them like so much pedantry.  But standards exist for a reason.  If anything goes where orthography is concerned, then it’s a bumpy ride for everyone.

Thus, in determining whether we should welcome or fight a given change to the language, the question is not just whether the change is good, but whether it is good enough to be worth the disruption the change would cause.  Sometimes it is!  I’m old enough to have been taught in elementary school that, in referring to individual members of a mixed-sex group, the proper form to use was the masculine singular: “Every student must keep his eyes on his own work.”  By the time I was teaching the SAT II Writing test a dozen years later, that rule had been replaced, and good riddance.  That said, I do think that it’s useful to be able to distinguish between the singular and the plural—“cat” vs. “cats”, “mouse” vs. “mice”—and so I find it grating when the singular and plural forms of a noun are identical.  I would have no objection to changing the English language to allow us to speak of “deers” and “sheeps”.  But I can sympathize with those who have internalized “one sheep, two sheep, three sheep”, and I’m not sure that what we gain in clarity and regularity is worth making several hundred million people do a double take every time they see the word “sheeps” in print.  And if there’s no gain at all, then linguistic drift that might seem neutral is actually a net negative, due to this disruption.  For instance, while “definate” is still generally recognized as a mistake, here’s one that’s a lot more contentious.  Just in the past ten years or so, a huge swath of people have stopped spelling the word “whoa” correctly.  These days I see “woah” more often than the correctly spelled version, at least online.  I think I can see why this error cropped up.  There’s a Gothic letter, ƕ, that is pronounced in such a way that it came into early English as “hw”.  Linguistic drift changed this to “wh” (so Old English “hwæte” became modern “wheat”), but it continued to be pronounced “hw”… until phonetic drift led “wh” to be pronounced not “hw” but “w” in most dialects.  In fact, the “hw” pronunciation fell so far out of favor that it became a joke.  So if “whoa” is pronounced to you not as /hwoʊ/ but as /woʊ/, then it stands to reason that, if you haven’t internalized standard English spelling, you would assume that the word starts w-o rather than w-h.  But you vaguely recall that there’s an h in there somewhere, and lots of words have a silent h at the end.  So, stick the h there, and you’re left with “woah”.  And when people online point out that, no, it really is “whoa”, the “woah” crowd gets weirdly defensive and breaks out the usual vitriol about the evolution of language and the horror of linguistic prescriptivism.

As the example above suggests, linguistic drift is generally a matter of mistakes catching on until they’re no longer regarded as mistakes.  Mistakes are rarely positive developments.  Sure, you might wind up with the occasional happy accident, but more frequently we see the loss of useful distinctions.  Consider that in modern English, the second-person pronoun, whether subject or object and whether singular or plural, is the same: “you”.  Those all used to be different!  “You” was the plural object pronoun, corresponding to the first-person “us”; the plural subject pronoun, corresponding to the first-person “we”, was “ye”.  The singular second-person subject and object pronouns were “thou” and “thee”, respectively.  But English speakers have a hell of a time with case (look at how commonly we hear errors such as “Me and my friend went to the park” or “This is a great honor for my wife and I”), so it may not be too surprising that the distinction between subject and object in the second-person plural collapsed.  The distinction between singular and plural collapsed for a different reason, common in many languages: the singular form came to be seen, first, as informal, to be used among the very well acquainted, and later as impolite, applied only to those deemed inferior to the speaker.  Correspondingly, “you” went from solely plural to formal to polite to universal.  And yet, we can see that something has been lost, as English speakers grope for ways to specify that the second person is being used in the plural.  Perhaps most famously, in the South the term “y’all” has long been common… except it has also been subject to the collapse between singular and plural, so that you will sometimes hear “y’all” applied to a single person and “all y’all” applied to groups.

So let’s finally turn from syntax to semantics.  It’s not very common for useful new grammatical forms to enter the language, but useful new words are coined all the time.  I once happened upon a list of words coined for the first time in the year of my birth, 1974; to pick an example off that list more or less at random, “closeted” is clearly a lot more concise than “in a state of not having revealed that one is not heterosexual”, which I guess is what people had to say in 1973.  The list from 2017 is much shorter, but it does include a word that I use all over this site: “Zoomer”, which is a lot snappier than “member of the generation following the Millennials”.  (As it happens, according to the definitions I’m using on this site, 2017 is also the Zoomers’ final birth year.)  However, the fact that we’re adding words all the time doesn’t mean that the English lexicon is becoming richer and more precise with each passing day; simultaneously, words that had once represented useful distinctions are reduced to mere synonyms.  So it’s more like we’re a wartime factory hoping that the tanks we’re producing will make up for the ones we’re losing on the battlefield.  Some battles have long been lost.  I once took a class in which a professor told a very old joke about a lexicographer’s wife coming home unexpectedly to find her husband in bed with his mistress.  “Samuel,” the wife says, “I’m surprised!”  “No, madam,” the lexicographer replies, “you are astonished; we are surprised.”  The point was that the joke doesn’t even make sense anymore, because it depends on knowing that, historically, the word “surprised” meant “taken unawares”, and it was centuries before that definition was largely supplanted by the modern meaning of “emotionally stirred by something unexpected”.  But the transition happened so long ago that it is no longer part of living memory.  A battle still in progress, but in which defeat seems certain, is to preserve the distinction between “uninterested”, meaning “feeling no desire to pay attention”, and “disinterested”, meaning “having no personal stake in an outcome”.  I suspect that the culprit in this case is that while “a feeling of curiosity and appreciation” was once a secondary meaning of the word “interest”, arising in the late eighteenth century, it has come to eclipse the original meaning of “a legal stake in something, generally conferring a benefit”, which dates back to the mid-fifteenth.  The dictionary definition of “disinterested” thus strikes modern speakers as unintuitive, and for expressing how little attention they care to devote to a topic, they prefer “disinterested” to “uninterested” because the prefix “dis-” seems harsher.  But still—whatever the explanation, these changes didn’t start as conscious decisions to collapse what had been useful distinctions in the English language.  They started as mistakes.  And then they caught on.

The same is true of the use of the word “Boomer” to mean “old person”.  Again, the Baby Boomers were born between 1946 and 1963, meaning that as I write this in mid-2023, they are between fifty-nine and seventy-seven years old, inclusive.  I’m sure that the first person to direct an eyeroll at an old codger and say “Okay, Boomer” was well aware of this.  But humans are hardwired for mimicry.  We don’t have the physiology to survive for very long on our own; we need to fold into groups and work together, and mimicry is a useful tool for demonstrating group affiliation.  People unfamiliar with generational terminology started to hear others saying “Okay, Boomer” to old people and, largely unconsciously, thought, “Oh, is that what we’re calling old people now? Okay! I’ll do it too!”  But that is a mistake.  We already have plenty of words that mean “old person”.  We only have this one to mean “person born between 1946 and 1963”.  And that is an important enough concept that it is vital that we have a word for it.

A surprising number of people have made elementary errors in imagining how we travel through time.  Consider a couple of popular TV shows about generation gaps: All in the Family, from the 1970s, and Family Ties, from the 1980s.  In the former, a stodgy right-winger in late middle age clashes with his liberal son-in-law; in the latter, a couple who had been hippies in their youth are bewildered by the materialistic outlook of their kids.  This gives us the following simple chart:

                1970s      1980s
    parents     stodgy     liberal
    children    liberal    materialistic

What Howe and Strauss term the “life-course fallacy” is to imagine that a snapshot of one moment in time could represent a biography.  That is:

    life-course fallacy (reading each column vertically, as though a
    single decade’s snapshot were a biography):

                1970s      1980s
    parents     stodgy     liberal
    children    liberal    materialistic

So, liberal children grow up to become reactionary parents, while preppy Reaganites grow up to become left-wingers upon reaching adulthood?  We know that’s not how it works.  And the “age-bracket fallacy” is no better:

    age-bracket fallacy (reading each row horizontally, as though the
    “children” of both decades were the same people):

                1970s      1980s
    parents     stodgy     liberal
    children    liberal    materialistic

Could people in the 1980s say that children sure have become a lot more materialistic than they used to be?  They could, and did, but they were wrong.  The children weren’t changing; they were being replaced.  The real story is this:

    generational diagonal (the liberal children of the 1970s grow up to
    become the liberal parents of the 1980s):

                1970s      1980s
    parents     stodgy     liberal
    children    liberal    materialistic

The hippie twenty-somethings of All in the Family and the ex-hippie thirty-somethings of Family Ties were members of the same generation.  They aged as they moved through time, as do we all.  A generation is a diagonal, not a horizontal bracket.  One of the things that Howe and Strauss point out in Generations is that the Boomers were distinguished by having been, at least as of the book’s publication in 1991, the center of American culture throughout their entire lives.  In the 1950s they were kids, and the culture revolved around kids: families were expected to be raising great gaggles of kids behind white picket fences.  In the 1960s the Boomers were teenagers and young adults, and the culture revolved around teenagers and young adults and the way they were transforming American life.  The 1970s were “the Me Decade”, and the “Me” in that appellation was almost certainly a Boomer enjoying the prime of adulthood.  Then came the 1980s, and the culture revolved around the thirtysomethings, the hippies turning into yuppies and reflecting on their receding youth.  For us Gen-Xers, this was all kind of infuriating.  When we were kids, we were not the center of American culture; we were the neglected latchkey kids, left to raise ourselves.  When we were teenagers, we were not celebrated: in 1966, Time’s “Man of the Year” had been the young Baby Boom generation in its entirety, while upon reaching the same age we were derided as stupid, shallow, immoral, cynical.  The Boomers had brought about a glorious revolution when they were our age, we were told, while we just wasted our youth, addicted to video games and MTV.  When we finally had our brief moment in the spotlight in the early 1990s, when the “Baby Busters” were renamed “Generation X” and our movies and music pushed aside Boomer nostalgia, it was essentially to collectively lament how we’d been broken by our upbringing, or lack thereof.  And then before the decade was out, the kids took center stage, and the great generation gap that took shape in the early twenty-first century was between those kids, the Millennials, and their Boomer parents.  Generation X was effectively erased from public consciousness.  Many of us became the Zoomers’ parents.  And now they call us “Boomers”.
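
To put the diagonal in numbers, here is a minimal sketch in Python (mine, not Howe and Strauss’s; it simply applies this site’s 1946 to 1963 Baby Boom boundaries to a few snapshot years of my own choosing) showing the same birth cohort sliding from one age bracket to the next:

    # The "generational diagonal" as arithmetic: one birth cohort, many
    # age brackets.  Boundary years follow this site's definition of the
    # Baby Boom; the snapshot years are arbitrary.
    BOOM_START, BOOM_END = 1946, 1963

    def cohort_age_range(year):
        """Youngest and oldest possible ages of the Boomer cohort in a given year."""
        return max(year - BOOM_END, 0), year - BOOM_START

    for year in (1955, 1967, 1976, 1985):
        youngest, oldest = cohort_age_range(year)
        print(f"{year}: the Boomers are roughly {youngest} to {oldest} years old")

The printout climbs from childhood in the 1950s to thirtysomething parenthood in the 1980s, which is exactly the diagonal traced in the chart above.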

“But our experience as a generation was virtually the opposite of the Boomers’!”  “Who cares? You’re both old, and ‘Boomer’ just means ‘old person’ now.”  It’s like calling a crow a duck and then defiantly insisting that “duck” just means any kind of bird now, and if you think that makes the language less useful, that just goes to show what a duck you are.  Now, maybe you think, okay, sure, it is in fact more useful for “Boomer” to mean “person born between 1946 and 1963 who had a particular set of generational experiences that people who grew old before them did not have and that people who will grow old after them will not have” than it is for it to mean “old person”—but that there’s nothing to be done.  After all, language evolves!  You can’t stop linguistic drift!  But to say this is to treat “linguistic drift” as some kind of impersonal force.  It is not.  This “linguistic drift” consists of large numbers of people making a mistake and many more making a conscious decision to obstinately persist in a mistake, a mistake that threatens to make the language that much less useful a tool for communication.  And if you can use language to make ill-considered changes, you can also use language to argue against making ill-considered changes.
