TOBY OWEN, vice president of product, Fungible
Articles flood the web about the rise of artificial intelligence, machine learning and other big data technologies that will transform our lives in unimaginable ways. No one is more attuned to this radical revolution than those on the front lines responsible for delivering the new paradigm — data center operators and cloud service providers.
The numbers are staggering. The amount of data stored in data centers is growing at a 36% CAGR, 300 hours of video are uploaded to YouTube every minute and more than 145 billion text messages are sent each day. Much of this is unstructured data, both machine-generated, such as GPS data from mobile phones, and human-generated, such as social media posts and viral videos. Both the volume of data and the variety of data types are exploding. In this data-centric era, mass customization is the rule, not the exception. We even expect customized food when we eat out, with the latest trend of "build your own bowl or pizza" in real time: a composable meal.
The data-hungry use cases go on and on. To get a clear understanding of the daunting infrastructure challenges they create, Fungible surveyed more than 300 data center and cloud computing professionals to explore what is needed to usher in the new Data-Centric Era.
Unsurprisingly, these experts believe big data is an important part of the future. In fact, 65% of data center and cloud computing professionals view newer workloads, such as AI/ML and big data analytics, as critical to business strategy and organization. The Data-Centric Era is placing new demands on data processing: nearly 84% of survey respondents say there will be a critical or substantial impact on their business if they do not make infrastructure changes in the next year or two.
The quantity, variety, unpredictability and customization of data drive the trend toward data-centric computing. The characteristics of data-centric computing are:
- Work arrives in the form of packets
- High I/O to compute ratio and low operations per byte
- Unpredictable, mixed data patterns that arrive as interleaved flows of packets
- Computation is stateful: processing a packet requires reading and writing state belonging to the packet's flow, as in the sketch after this list
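To make the stateful, flow-oriented nature of this work concrete, here is a minimal sketch of per-flow packet processing. It is illustrative only: the `FlowState` and `handle_packet` names are hypothetical and are not drawn from any Fungible product or API.

```python
# Minimal sketch of stateful packet processing: each packet's handling requires
# reading and updating state that belongs to the packet's flow.
from dataclasses import dataclass


@dataclass
class FlowState:
    packets_seen: int = 0
    bytes_seen: int = 0


# Per-flow state table, keyed by a flow identifier (e.g. the 5-tuple).
flow_table: dict[tuple, FlowState] = {}


def handle_packet(flow_id: tuple, payload: bytes) -> None:
    """Look up (or create) the flow's state, then update it for this packet."""
    state = flow_table.setdefault(flow_id, FlowState())  # read existing flow state
    state.packets_seen += 1                              # write state back
    state.bytes_seen += len(payload)


# Packets from different flows arrive interleaved, back to back.
packets = [
    (("10.0.0.1", "10.0.0.2", 51000, 443, "tcp"), b"GET /"),
    (("10.0.0.3", "10.0.0.2", 51001, 443, "tcp"), b"POST /upload"),
    (("10.0.0.1", "10.0.0.2", 51000, 443, "tcp"), b"Host: example.com"),
]
for flow_id, payload in packets:
    handle_packet(flow_id, payload)

for flow_id, state in flow_table.items():
    print(flow_id, state)
```

Even in this toy version, every packet triggers at least one read and one write of per-flow state, which is why the I/O-to-compute ratio stays high while the work done per byte stays low.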
These characteristics present challenges for data centers, which are typically built with compute-centric technologies. For example, with highly mixed, high-I/O workloads, low-latency context switching is critical to responsiveness, yet a CPU can take tens of microseconds to switch contexts. General-purpose CPUs were not optimized for rapid, low-latency context switching and were not designed to handle I/O efficiently. This inefficiency, combined with the large I/O-to-compute ratio, causes servers to spend significant time processing I/O instead of running applications.
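A rough back-of-the-envelope calculation shows why this matters. The figures below are assumptions for the sake of illustration, not measurements from the survey: if each I/O event forces a context switch costing about ten microseconds while the useful per-packet work takes about one microsecond, the CPU spends most of its time switching rather than computing.

```python
# Illustrative arithmetic only; both per-event figures are assumed, not measured.
CONTEXT_SWITCH_US = 10.0  # assumed context-switch cost per I/O event (tens of microseconds)
USEFUL_WORK_US = 1.0      # assumed useful processing time per I/O event

overhead = CONTEXT_SWITCH_US / (CONTEXT_SWITCH_US + USEFUL_WORK_US)
print(f"Fraction of CPU time lost to context switching: {overhead:.0%}")  # -> 91%
```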
Data-centric workloads are driving new demands. According to the Fungible survey, the workloads driving improvements in performance and growth are web applications at 77%, followed by cloud-native applications at 73%. When asked what one thing they would change about their infrastructure in 2020, respondents put moving to a cloud architecture at the top of the list.
Read the full article in the December issue of Semiconductor Digest.