This post first appeared on The Economist’s GE LookAhead blog. Publication does not imply endorsement of views by the World Economic Forum.
The amount of data circulating and available worldwide can make you dizzy. According to International Data Corp. (IDC), 4.4 zettabytes (4.4trn gigabytes) of digital data had been created by 2013. The Industrial Internet is destined to boost that total to 44 zettabytes (ZB) by 2020.
How is the world supposed to keep up with a tenfold increase in data creation over seven years? The first, most natural step is to increase capacity. For hard disk drives (HDDs), which currently serve as the primary storage option, developments like shingled magnetic recording have helped increase storage capacity by 25%, while heat-assisted magnetic recording could eventually see the introduction of 60TB drives (the largest HDD currently stands at 10TB).
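The scale of that growth is easier to grasp as an annual rate. A quick back-of-the-envelope calculation, using only the IDC figures cited above (the per-year rate is derived here, not taken from IDC):

```python
# IDC figures cited above: 4.4 ZB of data in 2013, a projected 44 ZB in 2020.
start_zb, end_zb = 4.4, 44.0
years = 2020 - 2013

# Compound annual growth rate implied by a tenfold increase over seven years
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # roughly 39% per year
```

In other words, the world's data would need to grow by nearly 40% every single year to hit the 2020 projection.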
For flash memory, a less mature technology, even larger incremental gains are possible. Stacking layers of flash storage cells vertically, for example, can boost solid-state drive (SSD) capacity by up to a factor of three. “This enables 2.5-inch SSDs to hold 10TB,” says Doug Rollins, SSD marketing engineer at Micron Technology.
The 3D approach, in fact, may lead to even further gains. Micron and Intel, for example, are jointly working on a new class of flash-like, transistor-less memory known as “3D XPoint” that they hope will increase density by 10x and speed by 1000x.
Even that may not be enough, however. According to IDC’s projections, available storage capacity will shrink from 33% to 15% of the data generated by 2020. Efficient use of available space will therefore be crucial.
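What that shrinking ratio means in absolute terms can be worked out from the figures already quoted (the 6.6 ZB result is derived arithmetic, not an IDC number):

```python
# IDC projections cited above: 44 ZB of data generated in 2020, with
# storage capacity falling to 15% of the data generated by then.
data_2020_zb = 44.0
capacity_share_2020 = 0.15

storable_2020_zb = data_2020_zb * capacity_share_2020
print(f"Storable in 2020: {storable_2020_zb:.1f} ZB of {data_2020_zb} ZB generated")
# i.e. only about 6.6 ZB of the 44 ZB generated could actually be retained
```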
Approaches to storage efficiency will rely on three primary levers: storage policy about what to store and for how long; data-reduction techniques such as de-duplication and compression; and automated storage tiering, which distributes data across flash, disk or tape depending on the data’s priority and access frequency. Software-defined storage, which decouples physical storage from the software that manages it, is also maturing; the data lake is one recent illustration of the concept.
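De-duplication, the second of those levers, works by storing each unique chunk of data only once and keeping lightweight references everywhere else it appears. A minimal sketch of the idea (the fixed 4KB block size and SHA-256 fingerprints are illustrative choices, not details from the article):

```python
import hashlib

def deduplicate(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, storing each unique block once.

    Returns the block store (fingerprint -> block) and an ordered list of
    fingerprints from which the original data can be reassembled.
    """
    store = {}   # fingerprint -> block bytes, one copy per unique block
    recipe = []  # ordered fingerprints describing the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)  # keep only the first copy of each block
        recipe.append(fp)
    return store, recipe

# Highly repetitive data de-duplicates well:
data = b"A" * 4096 * 100 + b"B" * 4096 * 100
store, recipe = deduplicate(data)
print(len(recipe), "blocks referenced,", len(store), "stored")
# prints: 200 blocks referenced, 2 stored
```

Real storage systems add refinements such as variable-size chunking and collision handling, but the space saving comes from exactly this mechanism: 200 blocks of logical data held in just two physical blocks plus a list of references.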
For the Industrial Internet to achieve true real-time capabilities, updated communication protocols will also be required to ensure that information can be moved rapidly from sensors to local storage repositories and analysis software, as well as relayed to centralised data centres. That’s where interface standards like NVMe (Non-Volatile Memory Express) come in. Just now hitting the market, NVMe brings data-transport mechanisms up to the velocity of modern processors and flash architectures. “Early NVMe products have demonstrated up to six times greater read and write performance compared to SSDs,” says Mr Rollins. As for the physical infrastructure, the Industrial Internet Consortium is also working on testing cross-continental wires that can transmit machine-to-machine communications at speeds of up to 100Gb/s, according to GE.
Could this combination of faster, denser storage, as well as higher efficiencies and greater data-transport velocities become the data-storage platform for the Industrial Internet age? Tell us what you think.
Author: Drew Robb is a contributing writer for GE Look Ahead.
Image: An illustration picture shows a projection of binary code on a man holding a laptop computer. REUTERS.