
Unless your web connection has been down for a decade, you’ve doubtless heard the “generations” mantra, ad nauseam, in many a trendy learning article. You know the one: “One generation is characterized by this. Another by that. Digital natives are superhuman multi-taskers.” And so on. One stereotype is heaped upon another until… what? We’ve had our self-images formed for us? “I’m under thirty; therefore, texting shall be my preferred mode of communication.” Or, “I’m over sixty; I shall flee at the sight of an iPad.”

I’m no conspiracy buff, but the ubiquity of this same article leads me to suspect a propaganda office somewhere, texting talking points to every pundit in our industry. These experts then promulgate the stereotypes as if they were beyond controversy. And the stereotypes run as a current (or undertow) through our grand dialogue, framing our approach to the challenges we face in Learning.

I say, “Millennials Shmillennials.” Call me a heretic, but I’ve grown weary of the dogma. And I’m a little embarrassed that our industry has so eagerly adopted this fluff as orthodoxy. Some critical thought is in order.

Differences Aplenty
“But don’t generations differ?” Sure, they do. We are products of our time. Yet we’re products of many things. Demographics offer convenient sociological labels but do little to explain behavior – why one person adopts technology while another doesn’t. Factor in personalities, proclivities, experiences, economic circumstances, beliefs, and intelligence, and we begin to see the paucity of the argument.

When we consider the forces that go into one’s makeup, we recognize age for what it is – just one thing. And in an era where age discrimination is considered a “bad thing,” how is it that we’ve allowed this mumbo-jumbo to skate the lines of acceptable conversation? Shall we hire on these assumptions? Why not pass over the old guys and gals if we’ve been told the smart money is on Gen-Y?

One outcome of this thinking is the myth of multi-tasking. One argument goes like this: The younger, “digital natives” are born with multi-threading processors for brains, able to handle myriad inputs at once, whereas the rest of us come with a pre-frontal Commodore 64, or maybe one of those abacus thingies with the strings and beads and…well, you get the point. We’re to accept that mankind has taken a progressive biological leap, merely because some of us have pre-natal experience with Facebook (“Day 230: Muffled voices outside. Dark in here. I wonder what they’ll name me. 50 Likes for ‘Gertrude’?”).

We are uni-taskers. We. Do. One. Thing. At. A. Time. Sure, we juggle. But we think one thought at a time, and when we slow down the tape, we’ll see that the juggler grasps one ball at a time, a pretty good metaphor for cognition. Nothing’s changed.

Take the happy homemaker of 1954: The TV’s on. There are three pots on the stove. She’s monitoring the kiddies. She’s on the phone. And, oh, is that the doorbell?… What has changed, cognitively, since then?

Yet some would float the conceit that our tools make us more advanced. Indeed, our brains become wired to our tools, but the new tools are an analog to the old – and those were harder, not easier, to use. (See Nicholas Carr’s now-classic “Is Google Making Us Stupid?”, The Atlantic, 2008 – a thousand web-years ago and still as relevant.)


Forget this multi-generational stuff. I give you The Frog Principle: If you drop a live frog in boiling water, you’ll have a challenging case of change management on your hands (I have no experience boiling frogs, but indulge me). Now, if you place that frog in tepid water with the burner on low, it won’t perceive a drastic change and will gradually cook. My point is that we are the frogs, all of us together in a technologically simmering pot for quite some time now. And that renders the “digital native” moniker more meaningless with each passing year.

Murder 101
What’s true of homicide holds for tech adoption: it’s about motive and opportunity. The year in which you were born matters little if you don’t have internet access. Your Gen-Y status doesn’t make you a tech wunderkind if you’re unmotivated to use tech.

Desire. Access. It’s that simple. The technology itself is inclusive. You see diversity all over the web. I defy you to find me the geriatric wing of Instagram or Twitter, so what’s up with the technological caste system we’ve manufactured? There’s no data to support homogeneous generational behavior. It doesn’t exist.

Tech is opening new horizons. As bandwidth grows and location matters less, constraints are falling away. We can serve many contexts, meeting learners at their point of need. More than ever, we can design content for the mode and manner of its consumption. And that’s just what we should do.

“But we need to appeal to young people.” Unless your content is age-specific, then to fixate upon age is a design error. You will pigeon-hole your audience or condescend. Worse, you’ll polarize and alienate. I think the root fear is that we won’t engage. We’re afraid to bore in the media-driven age.

The good news is that all generations appreciate well-crafted content. Our palates have become sophisticated. All us frogs know what’s readable, what’s watchable, what’s usable. Visual language evolves, devices morph, but fundamentals remain unchanged. If we design our brand experience (content, context), we’ll serve all of our community and alienate no one.


ANTHONY ROTOLO eschews his Gen-X label while managing e-learning programs for the award-winning Defense Acquisition University. This article has also appeared in the DAU publication, INSIGHT.
