- New roles are emerging for those who can translate how the engineering and design of new technologies and applications affect our political, economic, social and cultural worlds;
- Education in this field and notions of digital literacy must expand to reflect this need;
- In secondary schools and universities, it will be important to teach the building blocks of digital literacy, from basic computing concepts to how platforms are built.
Are doctors and engineers more similar than different? New York Times journalist Natasha Singer connected these two professions in 2018: “The medical profession has an ethic: First, do no harm. [But] Silicon Valley has a contrasting ethos: Build it first and ask for forgiveness later.” Singer’s point is that a new “human-centred” approach toward technology can and should start with what we learn in our schools and universities.
What could that education look like and how do we get there? In this excerpt from my recently released book, Beyond the Valley, I argue for two major interventions: firstly, changing technology education to tie it more closely to political, economic, cultural, and humanist thinking; and secondly, opening up what we mean by design and digital literacy and how we teach this.
Schools and universities will need to expand and revise their curricula if they wish to educate students for a digital future that is inclusive, sustainable and collaborative. An unfortunate legacy today is that most education systems treat the fields of science and engineering as merely technical, and therefore neutral, rather than socially constructed. This is why we rarely see courses in which software design is taught along with materials that “understand” the places where the software will “work.”
As we begin to see cultural or social topics being taught in conjunction with engineering, however, we are likely to see engineers who are better equipped to think deeply about how the systems they design will transform the world. New jobs can be created for those who can translate across technical and ethical domains as technologies are developed and rolled out. In the US, this process is just beginning – the prestigious Association for Computing Machinery has released a code of ethics, and a newly compiled list of computer science ethics classes taught at dozens of universities around the world reveals new course titles, such as Race and Gender in Silicon Valley or Ethics in Video Games.
Yes, our education is supposed to prepare us to work, to enter the job market. But it is also supposed to prepare us to be creative, reflective, deliberative humans with social, ethical and creative needs. That’s why it’s unrealistic to think of science or technology as a given, as some sort of airtight study of “what is”, without recognizing the deep influence of philosophy, ethics, human behaviour, politics and the arts.
Mitchell Baker, Executive Chairwoman of the Mozilla Foundation, believes “we are intentionally building the next generation of technologists who have not even the framework or the education or vocabulary to think about the relationship of STEM (…) to society or humans or life.” As users (who number in the billions) become complacent, blindly following what technologies tell us to do, we may lose our ability to ask fundamental questions such as “who does this serve?” or “how can we apply technical knowledge in different manners?”
What about design, which is often only seen as a way to make something look pleasing or “usable”? From this limited perspective, we give designers all the power, leaving us none. But this is shortsighted: despite the “lone genius” myths we tend to circulate, great scientists (like Newton) or split-brain artist-engineers (like Leonardo da Vinci) didn’t work in a vacuum; their technical and artistic expertise evolved in response to (and was shaped by) the societal visions of their times. Design is also a process that can be imaginative and speculative. What if supporting user autonomy, for example, were a design principle itself?
Good design can be a source of empowerment, a way of delivering value to everyone. What we need to do, however, is escape from the “false loops” that drive our endless need to check in “just one more time”, whether that’s scrolling through our Facebook feeds or clicking on that “up next” button on a YouTube video.
Related to design ethics is the potential of digital literacy. The term might seem self-explanatory, something like “ensuring that everyone knows how to use the existing technology”, but, in reality, it is far subtler and more important. Literacy, in its traditional definition, isn’t just the ability to read or write – it’s actually about the capacity to reflect, analyse and create. It’s about taking a book, a newspaper or magazine article, even a fictional story and reflecting on what’s behind it, who wrote it, what their assumptions were, what world they were a part of and what other information there might be on a similar subject. After all, children who can sound out the words in a book still aren’t quite “reading” if they can’t understand what the characters are doing or why. It’s even more powerful to grasp the meaning of the story, the reasons why someone would tell it or what cultural significance it has.
Our experience of the “open universe of information” – the way the internet was supposed to be – has been clouded by algorithmic goggles that filter the near-infinite possibilities available on the internet with results that masquerade as truth or knowledge. In our secondary schools and universities, it will be important to teach the building blocks of digital literacy: how different platforms are built; the basic concepts of computing, even if we don’t wade into code; and real glimpses into what is (or is not) happening behind the scenes when we use a system.
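One classroom-friendly way to offer that glimpse behind the scenes is a toy feed-ranking exercise. The sketch below is a deliberately simplified illustration, not any real platform’s algorithm – the posts, scores and weighting are all invented – but it shows how a feed that ranks purely by predicted engagement quietly narrows what a user sees:

```python
# Toy illustration of "algorithmic goggles": an engagement-ranked feed.
# All posts and predicted-click scores are invented for classroom discussion.

posts = [
    {"title": "Local council budget explained", "predicted_clicks": 0.02},
    {"title": "Celebrity feud, part 7",         "predicted_clicks": 0.31},
    {"title": "How vaccines work",              "predicted_clicks": 0.05},
    {"title": "Outrageous hot take",            "predicted_clicks": 0.40},
]

def rank_by_engagement(posts, top_n=2):
    """Sort posts by predicted clicks and keep only the top few --
    the filtering step that decides what actually reaches the screen."""
    ranked = sorted(posts, key=lambda p: p["predicted_clicks"], reverse=True)
    return [p["title"] for p in ranked[:top_n]]

feed = rank_by_engagement(posts)
print(feed)  # the civic and scientific posts never make the cut
```

Students can then change the scoring rule – for instance, adding a weight for topic diversity – and watch the feed change, which makes the point concrete: ranking rules are design choices, not neutral facts.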
Digital literacy is just the doorway, then, to other literacies that we must wrap our heads around to ensure that technology serves all our best interests. As we step through that door, we can develop:
- Algorithmic literacy (understanding bias in artificial intelligence systems or how a search engine system works);
- Data literacy (how/when/where data is collected, how it is aggregated and retained, by whom and with what effects);
- Political and economic literacy (what technologies are owned by whom, what industries are shaped by technology in what manners, how technologies shape public and political life and the relationships between corporate and public/political interests).