There was a time when we spoke of education in the singular. Tell me about your education. One must have a good education. She is pursuing an education. Then came the binary world of “traditional” and “alternative” educations. Running parallel to this, but operating in a sort of educational netherworld, was the ever shape-shifting concept of “vocational” education.
Now, as we enter what is being called the Fourth Industrial Revolution (“characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres”), we are experiencing a disrupted world of learning: a kaleidoscope of theories, strategies, and approaches, each trying to gain a foothold, each making a case for being the most strategic path to a secure future.
The “Prove It” Economy
As it turns out, there may be no one right answer. Each of us may instead need to assemble our own personal learning construct—an aggregate of experiences, skills, and accomplishments that together symbolize our leverageable “value” as contributing members of a workforce. A recent article in The Atlantic offers a succinct description of this new reality:
“The country has entered a ‘prove it’ economy in which codified skills are currency.”
The idea of skills being currency is of course not a new one. We have been “exchanging” our learned skills for employment opportunities for as long as there have been jobs. But the paradigm within which these exchanges take place has changed. The full quote reads as follows:
“The country has entered a ‘prove it’ economy in which codified skills are currency. It’s driving a revolution in how education is constructed, delivered, used—and credentialed. Even as degrees, from associates to doctorates, proliferate, they are joined—maybe trumped—by thousands of resume-worthy credentials from shorter, non-degree programs.”
This is the disrupted world of learning we referred to in our introduction. This is also how learning connects to jobs in the Fourth Industrial Revolution.
The Atlantic article, entitled “When a College Degree Isn’t Enough,” highlights the story of a computer science major named Martin Chibwe:
Days after getting his diploma, and despite the big investment ($39,000 in student loans), he sought another credential to “stack” on top to make him more marketable. He enrolled in Udacity’s iOS Developer Nanodegree program, a five-course cluster from the online platform known for its techie-skills focus. Cost: $900.
“I knew I needed help to land a job,” said Chibwe. In January, he was hired to develop apps at the National Center for Telehealth and Technology near Tacoma, Washington.
In Martin’s case, the addition of a Nanodegree credential, and his foresight in recognizing the need for this additional layer of learning, proved to be a critical differentiating factor for his candidacy.
This all may seem like a daunting state of affairs, but it plays to a foundational strength within us all—the ability to create ourselves.
As citizens and employees both, we are largely creatures of our own design. We choose political affiliations. We join social organizations. We attend houses of worship. We move to certain neighborhoods because of the schools. We contribute to particular charities that align with our moral and economic values. We buy from certain companies and boycott others. We learn this skill, take that class, and attend this institution.
In a “prove it” economy, acts of educational self-determination become a kind of career pathing, and embracing lifelong learning becomes the most strategic response to the complex challenges posed by a rapidly changing employment landscape. By “curating” a personalized suite of demonstrable accomplishments, we build an educational identity that can be presented to potential employers as evidence of our readiness to successfully fulfill roles from the general to the specialized.
What is striking about this model is the way it subverts a formerly prevailing set of educational distinctions. No longer do we have traditional vs. alternative, online vs. offline, vocational vs. general. Instead, borrowing a term from technology, we can understand this new model as a kind of “stack” approach.
A technology stack (or software stack, or solution stack) can be defined as: “a set of software subsystems or components needed to create a complete platform.” In similar fashion, we can “stack” educational credentials, certificates, and degrees, to create a “complete” candidate.
“We are watching the job market become more and more competitive, and working professionals need additional knowledge and skills … certificates and stackable credentials are the wave of the future.” —Susan Aldridge, president of Drexel University Online
It is not just the already-working who need to be thinking along these lines. Those preparing to enter (or re-enter) the workforce would do well to consider supplemental additions to their “stack” before they even begin the job-seeking process.
As you seek to answer for yourself the question posed by the title of this post, consider the following advice—culled from an article by digital product development firm Thinslices—about choosing a technology stack:
“Choosing the right technology stack is simply a matter of thinking ahead.”
In many ways, this is what lifelong learning is all about. Thinking ahead. That is why lifelong learning is front and center when you visit Udacity’s home page.
Ultimately, because the imperative to “prove it” is ongoing, your education must be as well. Embracing lifelong learning is both the solution and the way forward.