(Hat-tip to Tressie McMillan Cottom for the idea behind this one.)
Is “lifelong learning” a good thing or a bad thing? I suspect the answer depends on the definition, and on what you do for a living.
I hear plenty of my fellow academics wax rhapsodic about the virtues of lifelong learning. They’re sincere in what they’re saying; many of them embody the idea themselves. They mean several different things by “lifelong learning,” though, which leads to some confusion.
My personal understanding of “lifelong learning” involves picking up the skills to investigate and discover things on your own. That means a level of literacy and numeracy sufficient to navigate relatively complicated material without help, or at least without sustained or intensive help. Ideally, it also involves a lively curiosity. Without that, the skills are largely wasted. Sadly, higher education struggles as much with the curiosity piece as with the skills piece.
If that were all it meant, I doubt many people would object.
But some take a more literal approach. They seem to envision graduates cycling back to credit-bearing courses every few years until retirement, if not longer.
From within academia, I see the appeal of that view. Enrollments are the backbone of our economic model; if we get more repeat customers, that helps pay for all sorts of things. From a less cynical perspective, it’s certainly true that workers in many industries need to refresh some element of their content knowledge from time to time in order to remain current. Whether the changes are technological, scientific, or even regulatory, high-performing workers effectively don’t have the option of ignoring them. So they have to return to the well periodically to avoid obsolescence.
But that version of “lifelong learning” strikes much of the public more as a chore than a goal. It’s acknowledged as a sort of necessary evil, or tolerated as a fact of life, but it’s hardly considered positive. Each return to school costs money and time, and people in midlife often have plenty of competing demands on both.
Returning to retool for a new career is usually regarded more positively, but people who do it once usually do it with the hope of not doing it twice. Having to go back to school to change careers every few years is a pretty dispiriting prospect. At some point, most people want to move past the “student” stage and get on with life. Those of us who have made academia our lives may not grasp the distinction, but most civilians do.
It’s easy to chalk those attitudes up to anti-intellectualism, and some of that is always around. But much of it is exhaustion, and the exhaustion comes from entirely respectable sources. Work a full-time job, help the kids with their homework, shuttle them to after-school activities, take care of meals and laundry, and get some sleep, and time for anything else is at a premium. In that setting -- a pretty common one -- “lifelong learning” can just sound like one more task. It’s not the way to recruit allies.
I understand why academics might feel threatened or insulted by talk of “acceleration” of completion, or even of “completion” as a goal in itself. At its worst, it mistakes credentials for what those credentials are supposed to signify. But it’s also a sign of respect for people’s time: getting students to the credential efficiently acknowledges that they have lives to get on with.
If we want to keep “lifelong learning” as a rallying cry, we need to embrace forms of it that don’t come across as chores. Otherwise, it lands with a thud and defeats the purpose. If that means being a bit more open to different means to an end, well, we probably should be anyway.