StandFirst: The data center industry’s proven ability to cope with massive demand growth can be applied to broader sustainability challenges
An important principle in the development of computing over the decades has been Moore's Law. Simply put, Gordon Moore predicted that the density of transistors in processors would double roughly every two years as development progressed.
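Moore's prediction can be written as a simple exponential. A minimal sketch (the function name and parameters are illustrative, not from any standard library):

```python
# Moore's Law as a formula: transistor density doubles roughly every two years.
def moore_density(years, base=1.0, doubling_period=2.0):
    """Relative transistor density after `years`, given a doubling period."""
    return base * 2 ** (years / doubling_period)

print(moore_density(2))   # after one doubling period: 2x the starting density
print(moore_density(10))  # after a decade: 2^5 = 32x the starting density
```

The same compounding logic is what makes even modest-sounding doubling periods produce enormous growth over a decade or two.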
Despite many predictions of its demise, it has remained more or less a guiding principle. What is perhaps less well known, however, is a similar trend in the data center space.
Despite a 6-fold increase in data processed since 2010, data center energy consumption increased by only 6% through 2018 (Masanet et al., 2020). How was this possible, and how does it influence future developments in sustainability?
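A back-of-envelope calculation shows just how large the implied efficiency gain is. Assuming the two figures from Masanet et al. (6x workload, 6% more energy):

```python
# Back-of-envelope: how much more efficient did data centers become, 2010-2018?
workload_growth = 6.0   # data processed grew ~6x (Masanet et al., 2020)
energy_growth = 1.06    # total energy consumption grew only ~6%

# Energy used per unit of work, relative to the 2010 level
energy_per_workload = energy_growth / workload_growth
efficiency_gain = 1 / energy_per_workload

print(f"Energy per unit of work: {energy_per_workload:.2f}x the 2010 level")
print(f"Implied efficiency improvement: ~{efficiency_gain:.1f}x")
```

In other words, each unit of work in 2018 consumed less than a fifth of the energy it did in 2010, a roughly 5.7-fold efficiency improvement.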
Where does the data come from?
To contextualize this evolution, we must first understand where the increase in data processing is coming from.
Apple’s iPad debuted in 2010, which also saw the introduction of Instagram and Microsoft’s Azure cloud service. 2011 introduced us to Minecraft, Snapchat and Uber; 2013 brought the Xbox One and PlayStation 4; 2014 brought Amazon’s Alexa; and 2017 brought Fortnite and TikTok.
Social media engagement during the period increased significantly, while global data generation grew from an estimated 2 zettabytes in 2010 to 41 zettabytes in 2019. IDC estimates that the global data load will reach a staggering 175 zettabytes by 2025.
The pandemic effect has been substantial, with MENA seeing a surge in messaging and social media usage: social media users in MEA and Latin America spend the most time on social media, on average more than 3.5 hours a day.
More than half of MEA users (57%) reported (May 2020) spending even more time on social media due to the pandemic. Similarly, in a separate study, 71% of respondents in the Middle East reported that the use of WhatsApp and other messaging apps had increased since the onset of the pandemic.
What impact does all this data have?
To understand the impact of this data explosion, a concept called data gravity has been developed. Coined by engineer Dave McCrory, the term refers to the tendency of an accumulation of data to pull applications and services towards it, precipitating further accumulation, which can lead to data stranding as well as underutilization. Data that grows too large too quickly can become immobile, reducing its value and increasing its opacity. Only low-latency, high-bandwidth services, combined with new data architectures, can counter this growing and largely undocumented phenomenon.
What technological developments have made this possible?
Multiple technological developments explain how this explosion of data has been managed with only minimal increases in power consumption: improvements in the design and manufacture of processors, power supplies and storage, as well as the migration of workloads from on-premises infrastructure to the cloud.
Schneider Electric has been committed to sustainable business for decades. This means a renewed focus on efficiency in all aspects of design and operation. Efficiency gains have been made in power and cooling, with UPS systems and modular power supplies showing significant gains with each generation, culminating in the current Galaxy VL range. The use of lithium-ion batteries in this range has not only increased efficiency, but also extended operational life, reduced environmental impact by reducing raw materials, and facilitated “hot swapping”, where power modules can be added or replaced without downtime, while increasing the protection of operators and service personnel.
Cooling advancements such as airflow control through rack, row and module containment systems, liquid cooling and intelligent software control ensure that gains in raw data-processing efficiency are realized at scale.
By ensuring that every link in the power chain, from power grid to rack, is as efficient, smart, and instrumented as possible, we provide the right foundation for the rapid compute, network, and storage development that continues daily.
What is the place of software and applications here?
Another key element of technology development that has enabled such efficiency has been the application of better instrumentation, data collection and analysis, allowing for better control and orchestration. This was illustrated by Google’s DeepMind AI, which in 2016 reduced the energy used for cooling in one of its data centers by around 40%, representing a 15% reduction in overall energy overhead. This was accomplished using historical data from data center sensors, such as temperature, power, pump speeds and set points, to improve the energy efficiency of the facility. The AI system predicted the data center’s temperature and pressure over the next hour and made recommendations to control consumption appropriately.
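The predict-then-recommend loop described above can be sketched in a few lines. This is a deliberately naive illustration of the pattern (trend extrapolation plus a proportional setpoint adjustment), not Google's actual model, and all names and numbers here are hypothetical:

```python
# Illustrative predict-then-recommend loop (not DeepMind's actual system).

def predict_next_temp(history):
    """Naive forecast: extrapolate the recent temperature trend."""
    recent = history[-5:]
    trend = (recent[-1] - recent[0]) / (len(recent) - 1)
    return recent[-1] + trend  # expected temperature one interval ahead

def recommend_setpoint(predicted_temp, target=24.0, setpoint=18.0, gain=0.5):
    """Proportional adjustment: cool harder when a warm hour is predicted."""
    error = predicted_temp - target
    return setpoint - gain * error

temps = [23.1, 23.4, 23.8, 24.3, 24.9]  # hypothetical inlet temperatures (deg C)
predicted = predict_next_temp(temps)
print(f"Predicted next reading: {predicted:.1f} C")
print(f"Recommended chilled-water setpoint: {recommend_setpoint(predicted):.1f} C")
```

A production system replaces both functions with learned models trained on years of sensor history, but the control structure — forecast, then adjust — is the same.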
The development of Data Center Infrastructure Management (DCIM) systems has also continued, enabling the integration of AI to take advantage of all these hardware and infrastructure developments. These capabilities are now built-in features, enabling unprecedented visibility and control. For those designing new developments, software such as ETAP makes it possible to integrate energy efficiency into the design from the start, while taking microgrid architectures into account.
What new data sources will contribute to this?
The data explosion is expected to continue, with developments such as Industrial IoT, 5G, increasing general automation and autonomous vehicles as driving factors. The data that will be generated, away from centralized data infrastructure, must be manipulated, processed and transformed into intelligence quickly, where it is needed.
New data architectures should improve the efficiency of managing all of this. Edge computing is seen as an important approach to managing more data generated at the edge.
In one example, genomics research generates terabytes of data, often daily. Sending all this data to a centralized data center would be slow, require high bandwidth, and be inefficient. The Wellcome Sanger Institute created an edge computing approach that allowed it to process data close to where it was produced, at the genomic sequencers, sending only what was needed centrally. This saves on storage and bandwidth and speeds the time from data to intelligence. “This is where the edge paradigm came to us,” said Simon Binley, data center manager, Sanger Institute.
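The pattern is simple to sketch: do the heavy processing next to the data source and ship only a compact summary upstream. This is an illustrative toy (hypothetical field names and thresholds, not the Sanger Institute's actual pipeline):

```python
# Illustrative edge pattern: filter and summarize locally, forward only a summary.

def process_at_edge(raw_reads):
    """Run local quality filtering next to the sequencer."""
    passing = [r for r in raw_reads if r["quality"] >= 30]
    return {
        "total_reads": len(raw_reads),
        "passing_reads": len(passing),
        "mean_quality": sum(r["quality"] for r in passing) / max(len(passing), 1),
    }

# Hypothetical batch of sequencer reads with per-read quality scores
raw_reads = [{"id": i, "quality": q} for i, q in enumerate([12, 35, 38, 28, 40])]

summary = process_at_edge(raw_reads)  # a few hundred bytes instead of terabytes
print(summary)  # only this summary crosses the network to the central site
```

The design choice is the same at any scale: the raw terabytes never leave the edge site, so central storage and bandwidth requirements shrink to the size of the summaries.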
Modular data centers, micro data centers, and better storage management will all help manage this developing wave effectively, keeping data center power consumption as stable in the future as it has been to date. In the MENA region, 5G and edge architectures will be balanced by larger-scale installations linking major demand centers.
What effects will this have on the entire data ecosystem?
Efficiency must extend not only across the supply chain, but also across entire life cycles. Vendors, suppliers, and partners all need to be engaged to ensure that no part of the ecosystem lags in applying the tools that ensure effectiveness. This applies as much to the design of new equipment and applications as it does to service life and decommissioning. Understanding the environmental impact of a business ecosystem as a whole will be key to truly achieving net-zero goals.
Agreed standards, transparency and measurability are all essential factors in ensuring results.
These considerations apply throughout the region, with great efforts being made to do better. Greater transparency is now accepted and embraced, with more and more organizations reporting on their progress.
Shared tools and processes
The data center sector has much to offer organizations and industries embarking on a sustainability path towards greater circularity. With its efficiency expertise and experience, combined with operational tools and intelligence, and deep commitments to net-zero operations, the data center industry can not only manage the world’s data explosion and digital demands, but do so sustainably, while providing others with the tools and knowledge to do the same for their respective industries.
For further reference, see our Cloud and Service Provider page and Future Views: Data Centers.