The first three industrial revolutions were each defined by the specific technological advance that enabled them: steam engines, electricity and computers, respectively. Now, well into the 21st century, we are entering a Fourth Industrial Revolution (4IR), characterized by the parallel development of a swath of seemingly independent technologies, each with world-changing potential. From artificial intelligence to genetic engineering, virtual reality and digital currencies, these advances promise huge benefits to society. Yet this utopian view does not tell the whole story: the 4IR also poses serious challenges.

As with previous industrial revolutions, the 4IR will fundamentally transform the way we live, work and govern ourselves. Many argue, however, that this era differs fundamentally from previous ones, owing to both the speed and the magnitude of the change facing us. And because new technologies will come from a range of fields, it is harder than ever to see where opportunities and risks lie.

Preparing ourselves for what the 4IR will bring tomorrow must begin with education today. But with our destination so unclear, navigating our way forward is no small feat. How should we educate tomorrow’s citizens and leaders to innovate and capitalize on unforeseen opportunities? And how should we educate our students to guide society through the many and significant changes they will face in their lifetimes? What will our jobs be, if so many of them are made redundant by advances like artificial intelligence, robotics and autonomous vehicles? How will civil society fare in the face of “false news”? Who will monitor the use of big data, and the implicit biases it contains, with their concomitant implications for an equitable society? And who will make the ethical decisions bound to arise around genetic engineering and the internet of things? These changes are coming, and they’re coming quickly.

In contrast, universities – society’s main institutions for educating tomorrow’s leaders – are not known for changing quickly: 70 of the 100 oldest institutions in the world that have existed essentially unchanged for over 500 years are universities. This conservative approach to change has benefits, as chasing current fashions can be dangerous. But it is increasingly clear the sector must adapt to stay relevant. To our minds, while it would be foolish to compromise the knowledge that comes from deep study in a specific field, a new institutional approach that encourages broadened skillsets and experiences to complement that depth will be crucial. While all universities have made some inroads in this area in response to public pressure, we believe more must urgently be done.


A chronology of university siloes

How has academia got to where it is today? Ironically, many of the technological advances that underlie this need for academic change owe their heritage to last century’s academic breakthroughs. The 20th century witnessed an unprecedented fragmentation of academic disciplines. During this period the ‘core subjects’ splintered into multiple semi-independent sub-disciplines, and terms such as nanotechnology, artificial intelligence, neuroscience and synthetic biology – many of the pioneering technologies of our time – first emerged.

Things were once very different: the separation and specialization among ‘core subjects’ was not always so rigidly observed. The 20th century was, in fact, the first in human history in which it was not common for natural scientists to identify also as philosophers, pursuing interests ranging across the world of atoms as well as music, philosophy, ethics and language. This earlier breed of scientist made connections across different fields, often in service of scientific breakthroughs. Many of today’s greatest scientists, engineers and entrepreneurs are similar polymaths. Yet generally speaking, in today’s system, the narrowness and depth of the disciplines and sub-disciplines make it virtually impossible to be an expert across such a broad range of fields.

The structures of the academic world exacerbate the problem: the tenure system, departmental structures, research councils and journals all play a role in emphasizing narrow disciplinary work. Today’s academic world is still full of experts whose domains of expertise grow ever narrower – just at a time when we need broad thinking. This narrowing of expertise hampers the tackling of the ‘big problems’, which simply do not come neatly packaged within disciplinary boundaries.

Why making university education more universal is crucial

To serve society and the industries of the future, academic training should regain some of the breadth that it has lost. With disciplines being created and merged at a rapid rate, it makes little sense to teach only within boundaries that may soon cease to exist. This holds for more than just technical skills: in a world of such technological hype, it is easy to forget that a broad understanding of humanistic issues is also crucial to tackling big challenges. The separation of the humanities and technology has led to the risk that technology is viewed as the ultimate solution rather than as a tool to improve human existence. Technological expertise without a humanistic understanding of the relationship between technology, user and broader society will yield technology that does not actually reflect human needs.

Moreover, only 23% of UK employers currently think universities are preparing graduates adequately for work (the numbers are similar in other developed economies, including the US and the EU). Importantly, many of the skills employers identify as lacking – with interpersonal, problem-solving and creative-thinking skills at the top of the list – are crucial not just to being a good employee, but to being a good citizen too. The demand from the job market is already moving toward a mix of social and technical skills, with many technology companies now hiring liberal arts graduates at unprecedented rates. Looking further forward, the jobs that AI creates will likely be as much about understanding human relationships with the technology as about understanding the technology itself.

Any narrow focus on today’s industries will surely fall victim to the same problem. Ultimately, the best safeguard against an unpredictable future is to hew to the traditional foundations and mission of higher education: to ground people in fundamental principles and teach them how to think in service of becoming productive and responsible citizens – how to question assumptions, analyse, reason, formulate arguments and express themselves clearly. Such critical thinking is a necessary complement to discipline-based expertise: it keeps domain experts from losing the contextualisation that a broader view can provide, and preserves the ability and disposition to learn more when they need to.


Promising steps forward

A number of initiatives have developed across the academic world that are promising examples of the direction higher education might take in the coming years. We offer below a few illustrative examples that could guide a way forward.

Targeted curricula around mission-based and cross-disciplinary learning

The rise of entrepreneurial courses at universities has produced a range of initiatives that meld science and technology studies with entrepreneurship, business, marketing and the like. For example, Tulane University’s Bioinnovation PhD programme trains students across the university’s science, engineering, medicine, law and business faculties, and supplements that academic breadth with ‘real-world’ support from the US FDA and the New Orleans Bioinnovation Center. UCSF has its QB3 program. But there is no reason why entrepreneurship need be the focus of interdisciplinary courses; a course on ethics and bio-engineering, for example, meets a similar need. These courses, while very time-consuming for faculty to design and teach, can also lead to interesting scholarly research.

Purpose learning is a related pedagogical philosophy (with many similarities to problem-based or experiential learning) that encourages students to “choose a mission, not a major”: to seek to solve a problem they feel passionate about, and to pick whichever tools and approaches, from any available discipline, will best enable them to do so. The ethos is already gaining currency, with many schools offering bespoke majors and concentrations to suit each undergraduate’s needs.

Varied departmental structures

Another potentially fruitful strategy is the development of academic structures that span traditional disciplinary divides. For example, Cambridge University’s Institute for Manufacturing offers students a cross-disciplinary collection of expertise in management, engineering, technology and policy as they relate to manufacturing. Cranfield University has organized its departments around specialty-focused themes (such as Transport Systems, Agrifood and Energy) in order to provide real-world solutions to the vast challenges facing society. The various disciplines collaborate and come together to offer, variously, views on systems design, human factors, technology, and trends in business and economics. While these new structures offer interesting potential, they also risk losing the appearance – or the reality – of independent thought and unbiased research. Alternatively, as new industries rise and old ones are disrupted, such an approach risks irrelevance. Indeed, the heart of the challenge lies in achieving a delicate balance between high-quality, impactful academic research and the flexibility to meet rapidly changing needs.

Life-long learning

Increasingly, the contemporary requirement for three- or four-year undergraduate degrees seems outdated. With information freely available on the internet, curious students can learn at an individualized pace and in an individualized direction. Moreover, it seems unreasonable to expect an 18-year-old to have a clear view of her interests when the job she may one day do does not yet exist. Finally, the old model of life-long employment has gone the way of the dinosaur: people can no longer expect to hold one job with one company for life, nor does the millennial generation want such a work life. The very nature of work is evolving, and research shows individuals will hold multiple jobs over a lifetime; continuous learning will be necessary in order to keep pace with changing jobs and technologies.

The explosion of MOOCs, for example, will perhaps be seen as the first step in the progress from campus-based university education toward more online education, in which campus-based learning plays only a part in the process. Other initiatives include inviting alumni back for various educational experiences. Stanford2025 envisions a four-year education spread over 30 years rather than four consecutive years. MIT, in collaboration with Udacity, recently announced a graduate school program that does not require a formal undergraduate degree. Georgia Tech launched the first-ever 100% online master’s degree in computer science, allowing students to study while employed full-time, regardless of geographic location. The programme was developed in collaboration with AT&T, offering a taste of the academic-industry alliances that may become common in years to come. Distance EMBA programs at business schools – adapted to enable students to learn while working, anywhere in the world, at any career stage – are now standard rather than outliers. We expect, and hope, that universities will continue to experiment broadly in this arena.

In parallel with these efforts, numerous professional-services companies have developed their own training academies to meet this demand (e.g. Deloitte University and McKinsey Academy). Institutions such as Singularity University and THNK (an Amsterdam-based leadership institution) cater to mid- and senior-career individuals.

Conclusion

As educators, we believe in the power of education to change lives and effect a better future for the world. To continue doing so at the advent of the 4IR, however, schools and universities must reconsider how to conduct core basic research and translate applied research into real-world solutions; what constitutes the core components of an education; and how to provide education as and when it is needed. Each of these poses a considerable challenge, and higher education must confront all of them simultaneously.

At the same time, universities must do a better job of ensuring students grasp the fundamental principles of a discipline, along with the basic skills of reasoning and communicating. The case for reform rests both on efficiency – educating students faster, better, more cheaply and more conveniently – and on existential safety: as technologies become ever more powerful, we need technologists with a humane eye and politicians with technological savvy to ensure a positive future. For almost a millennium, universities have led humanity forward. They still have a crucial role to play – but they must tackle the task with urgency and principle.