Without the humanities, great tech cannot exist. Here's why

Education isn't always about the capacity to think, but the choice of what to think about. Image: REUTERS/Michaela Rehle

Tracy Chou
Entrepreneur, Project Include

In 2005, the late writer David Foster Wallace delivered a now-famous commencement address. It starts with the story of the fish in water, who spend their lives not even knowing what water is. They are naively unaware of the ocean that permits their existence, and the currents that carry them.

The most important education we can receive, Wallace goes on to explain, “isn’t really about the capacity to think, but rather about the choice of what to think about.” He talks about finding appreciation for the richness of humanity and society. But it is the core concept of meta-cognition, of examining and editing what it is that we choose to contemplate, that has fixated me as someone who works in the tech industry.

As much as code and computation and data can feel as if they are mechanistically neutral, they are not. Technology products and services are built by humans who build their biases and flawed thinking right into those products and services—which in turn shapes human behavior and society, sometimes to a frightening degree. It’s arguable, for example, that online media’s reliance on clickbait journalism and Facebook’s role in spreading “fake news” and other sensationalized stories influenced the results of the 2016 US presidential election. This criticism is not merely outward-facing; it comes from a place of self-reflection.

I studied engineering at Stanford University, and at the time I thought that was all I needed to study. I focused on problem-solving in the technical domain, and learned to see the world through the lens of equations, axioms, and lines of code. I found beauty and elegance in well-formulated optimization problems, tidy mathematical proofs, clever time- and space-efficient algorithms. Humanities classes, by contrast, I felt to be dreary, overwrought exercises in finding meaning where there was none. I dutifully completed my general education requirements in ethical reasoning and global community. But I was dismissive of the idea that there was any real value to be gleaned from the coursework.

Upon graduation, I went off to work as a software engineer at a small startup, Quora, then composed of only four people. Partly as a function of it being my first full-time job, and partly because the company and our product—a question and answer site—were so nascent, I found myself for the first time deeply considering what it was that I was working on, and to what end, and why.

I was no longer operating in a world circumscribed by lesson plans, problem sets and programming assignments, and intended course outcomes. I also wasn’t coding to specs, because there were no specs. As my teammates and I built the product, we were simultaneously defining what it should be, whom it would serve, what behaviors we wanted to incentivize amongst our users, what kind of community it would become, and what kind of value we hoped to create in the world.

I still loved immersing myself in code and falling into a state of flow—those hours-long intensive coding sessions where I could put everything else aside and focus solely on the engineering tasks at hand. But I also came to realize that such disengagement from reality and societal context could only be temporary.

The first feature I built when I worked at Quora was the block button. Even when the community numbered only in the thousands, there were already people who seemed to delight in being obnoxious and offensive. I was eager to work on the feature because I personally felt antagonized and abused on the site (gender not being an unlikely reason why), and so I had an immediate desire for a blocking function. But if I hadn’t had that personal perspective, it’s possible that the Quora team wouldn’t have prioritized building a block button so early in its existence.

Our thinking around anti-harassment design also intersected a great deal with our thinking on free speech and moderation. We pondered the philosophical question—also very relevant to our product—of whether people were by default good or bad. If people were mostly good, then we would design the product around the idea that we could trust users, with controls for rolling back the actions of bad actors in the exceptional cases. If they were by default bad, it would be better to put all user contributions and edits through approval queues for moderator review.

We debated the implications for open discourse: If we trusted users by default, and then we had an influx of “low quality” users (and how appropriate was it, even, to be labeling users in such a way?), what kind of deteriorative effect might that have on the community? But if we didn’t trust Quora members, and instead always gave preference to existing users that were known to be “high quality,” would we end up with an opinionated, ossified, old-guard, niche community that rejected newcomers and new thoughts?

In the end, we chose to bias ourselves toward an open and free platform, believing not only in people but also in positive community norms and our ability to shape those through engineering and design. Perhaps, and probably, that was the right call. But we’ve also seen how the same bias in the design of another, pithier public platform has empowered and elevated abusers, harassers, and trolls to levels of national and international concern.
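
To make that design fork concrete, here is a minimal, purely hypothetical sketch in Python (not Quora's actual code, and with invented class and method names) of the two moderation models we were weighing: trust by default with after-the-fact rollback, versus holding every contribution in an approval queue.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Contribution:
    author: str
    text: str
    visible: bool = False


class TrustByDefaultFeed:
    """Optimistic model: contributions go live immediately; moderators
    can roll back a bad actor's posts after the fact."""

    def __init__(self) -> None:
        self.posts: List[Contribution] = []

    def submit(self, author: str, text: str) -> Contribution:
        post = Contribution(author, text, visible=True)  # visible right away
        self.posts.append(post)
        return post

    def roll_back(self, author: str) -> None:
        # The exceptional case: hide everything from a known bad actor.
        for post in self.posts:
            if post.author == author:
                post.visible = False


class PreModeratedFeed:
    """Pessimistic model: nothing is visible until a moderator approves it."""

    def __init__(self) -> None:
        self.queue: List[Contribution] = []
        self.posts: List[Contribution] = []

    def submit(self, author: str, text: str) -> Contribution:
        post = Contribution(author, text, visible=False)  # held for review
        self.queue.append(post)
        return post

    def approve_next(self) -> None:
        if self.queue:
            post = self.queue.pop(0)
            post.visible = True
            self.posts.append(post)
```

The code itself is trivial; the hard part is everything it leaves out: deciding who counts as a bad actor, how quickly an approval queue drains, and what either choice signals to the community.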

At Quora, and later at Pinterest, I also worked on the algorithms powering their respective homefeeds: the streams of content presented to users upon initial login, the default views we pushed to them. It seems simple enough to want to show users “good” content when they open up an app. But what makes for good content? Is the goal to help users discover new ideas and expand their intellectual and creative horizons? To show them exactly the sort of content that they know they already like? Or, most easily measurable, to show them the content they’re most likely to click on and share, and that will make them spend the most time on the service?
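
As a rough, hypothetical illustration (not the actual Quora or Pinterest ranking systems, and with made-up fields and weights), the choice of objective can be thought of as the scoring function a feed is sorted by; swapping that function changes what rises to the top.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Item:
    title: str
    predicted_click_rate: float  # how likely this user is to click or share
    topic_novelty: float         # how far the topic is from the user's history


def rank(items: List[Item], score: Callable[[Item], float]) -> List[Item]:
    """Sort a candidate feed by whatever objective we hand it."""
    return sorted(items, key=score, reverse=True)


# Objective 1: maximize engagement -- the most easily measurable goal.
def engagement(item: Item) -> float:
    return item.predicted_click_rate


# Objective 2: favor discovery -- weight novelty alongside likely interest.
# The 0.7/0.3 split is an arbitrary illustration, not a tuned parameter.
def discovery(item: Item) -> float:
    return 0.3 * item.predicted_click_rate + 0.7 * item.topic_novelty


candidates = [
    Item("More of what you already like", predicted_click_rate=0.9, topic_novelty=0.1),
    Item("A topic you have never seen", predicted_click_rate=0.4, topic_novelty=0.9),
]

print([i.title for i in rank(candidates, engagement)])  # familiar content first
print([i.title for i in rank(candidates, discovery)])   # the new topic first
```

Engagement is the easiest objective to measure, which is partly why it tends to win by default; optimizing for discovery requires deciding, and defending, how much novelty is worth.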

Ruefully—and with some embarrassment at my younger self’s condescending attitude toward the humanities—I now wish that I had strived for a proper liberal arts education. That I’d learned how to think critically about the world we live in and how to engage with it. That I’d absorbed lessons about how to identify and interrogate privilege, power structures, structural inequality, and injustice. That I’d had opportunities to debate my peers and develop informed opinions on philosophy and morality. And even more than all of that, I wish I’d even realized that these were worthwhile thoughts to fill my mind with—that all of my engineering work would be contextualized by such subjects.

It worries me that so many of the builders of technology today are people like me: people who haven’t spent anywhere near enough time thinking about these larger questions of what it is that we are building, and what the implications are for the world.

But it is never too late to be curious. Each of us can choose to learn, to read, to talk to people, to travel, and to engage intellectually and ethically. I hope that we all do so—so that we can come to acknowledge the full complexity and wonder of the world we live in, and be thoughtful in designing the future of it.
