As a global society, we face a host of major challenges – whether it's dealing with climate change, seeking new sources of energy and security, curing cancer or lifting billions of our fellow humans out of poverty. There are thousands of people engaged in solving these problems, in scores of laboratories and research centers around the globe. They are generating large amounts of data about the issues at hand; yet we find ourselves at a juncture where we have so much data that it has created a bottleneck. We face a classic needle-in-a-haystack problem: finding the data that are most useful. We struggle to determine which data we can share and how to share them. We struggle to understand how data can open new questions for us to pursue.
This is our bold new world – a world on the cusp of data-driven research, innovation and discovery. At the World Economic Forum in Davos, we will talk about the challenges of working with large data sets, and the nearly limitless opportunities that big data presents for addressing societal issues that know no borders. We will talk about how effectively sifting through massive data can help the societies of today preserve the civilizations of yesterday. We will talk about how decoding data can help scientists share information from a single, hard-to-find specimen, making new discoveries more possible and more frequent. We will talk about how understanding big data can yield advances in the study of the human brain, offering the prospect, perhaps someday soon, of a once-paralyzed child moving as freely and effortlessly as any other.
It's a funny thing about data: look up "data" in the literature and it invariably comes laden with negative baggage – words such as "overload" and "deluge". With terms like these, no wonder scientists and scholars alike shy away! But instead of resisting large data sets, we should be embracing them. Multinational companies are learning how to work with large data sets, and they're getting better at it all the time. Mid-size and smaller businesses, however, lack that expertise. Yet the largest companies will benefit greatly if smaller businesses – many of them key to their supply chains, production and workforces – can handle massive data too. Such ease of data handling can grease the wheels of global commerce.
To realize this vision, data and data awareness need to be integrated into the educational curriculum. We need to teach students how to use massive amounts of data, in higher-education curricula and even at the secondary-school level. Our educational focus must change, or our children won't be prepared to handle the vast amounts of data that they invariably will encounter every day.
Data creates a bridge between traditional disciplines, spawning discovery and innovation from the humanities to the hard sciences. Data dissolves barriers, opening up new channels of communication, lines of research, and commercial opportunities. Data will be the engine, the spark to create a better world for all.
Jan Hesthaven, professor of applied mathematics at Brown University.
Pictured: A server in the CERN LHC computing grid centre in Geneva, October 3, 2008. The centre is one of 140 data-processing centres, located in 33 countries, taking part in the grid project. More than 15 million gigabytes of data, produced by the hundreds of millions of subatomic collisions in the Large Hadron Collider (LHC), should be collected every year. REUTERS/Valentin Flauraud