How Software Will Handle the Coming Data Explosion, Experts Point the Way Forward
- The development of IT has remained the fulcrum of global growth across all fields as companies race to keep pace with innovation
- According to experts, innovations in the IT ecosystem have been responsible for most of the growth seen across sectors
- The same trend can be seen in the electricity sector, where software development is playing a huge role in power supply across the globe
An important principle in the development of IT over the decades has been Moore’s Law, which predicted that the transistor density of processors would double roughly every two years.
Despite many predictions of its demise, it has remained a more or less reliable guiding principle. What is perhaps less well known, however, is a similarly persistent trend in the data centre space.
Despite a six-fold increase in the data being processed since 2010, data centre energy consumption rose by only about 6% between 2010 and 2018.
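Taken together, those two figures imply a steep fall in the energy consumed per unit of data processed. A rough back-of-the-envelope check, using only the ratios quoted above, illustrates the scale (a sketch, not a measured figure):

```python
# Back-of-the-envelope check of the figures quoted above:
# a ~6x increase in data processed against a ~6% rise in energy use
# implies a large drop in energy consumed per unit of data.
data_growth = 6.0        # data processed in 2018 relative to 2010 (six-fold)
energy_growth = 1.06     # energy consumption in 2018 relative to 2010 (+6%)

energy_per_unit_data = energy_growth / data_growth
print(f"Energy per unit of data: {energy_per_unit_data:.2f}x of the 2010 level")
print(f"Implied efficiency gain: {data_growth / energy_growth:.1f}x")
# Energy per unit of data: 0.18x of the 2010 level
# Implied efficiency gain: 5.7x
```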
Where the data comes from
According to experts, the Apple iPad made its debut in 2010, the same year that saw the introduction of Instagram and Microsoft’s Azure cloud service. 2011 brought Minecraft, Snapchat, and Uber, with 2013 delivering the Xbox One and PlayStation 4, and Amazon’s Alexa arriving the following year. 2017 brought Fortnite and TikTok.
Social media engagement over the period increased manifold, while global data production went from an estimated 2 zettabytes in 2010 to 41 zettabytes in 2019. IDC estimates that the global data load will rise to a staggering 175 zettabytes by 2025.
Reports say that the pandemic effect has been substantial, with the MENA region seeing a big increase in messaging and social media usage: social media users in MEA and Latin America spend the most time on social networks, averaging over 3.5 hours a day.
The impact of data
Data that grows too big too fast can become immobile, reducing its value and increasing its opacity. Only low-latency, high-bandwidth services, combined with new data architectures, can combat this growing and largely undocumented phenomenon.
Multiple technological developments account for this data explosion being handled with only minimal increases in energy consumption: improvements in processor design and manufacture, in power supply units and in storage, as well as the migration of workloads from on-premises infrastructure to the cloud.
Schneider Electric has been committed to sustainable business for decades.
That has meant a renewed focus on efficiency in all aspects of design and operation. Efficiency gains have been made in power and cooling, with UPS systems and modular power supplies improving significantly with each generation, culminating in the likes of the current Galaxy VL line.
This line’s use of lithium-ion batteries has not only increased efficiency, it has also extended operational life, reduced environmental impact by cutting raw material use, and facilitated “energized swapping”, in which power modules can be added or replaced with zero downtime while improving protection for operators and service personnel.
Advances in cooling, such as flow control through rack, row and pod containment systems, liquid cooling, and intelligent software control, ensure that the efficiency gains made in pure data processing are met and matched on the facilities side.
By ensuring that every link in the chain of power from energy grid to rack is as efficient, intelligent, and instrumented as possible, we provide the right basis for the rapid development in computing, network, and storage that continues daily.
The role of software and apps
Another key element of the technological development that has allowed such relentless efficiency has been the application of better instrumentation, data gathering, and analysis, which allows for better control and orchestration. This was illustrated by Google’s DeepMind AI, which in 2016 cut the energy used for cooling at one of its data centres by some 40%, equivalent to a 15% reduction in overall power usage effectiveness (PUE) overhead.
This was accomplished using historical data from data centre sensors, such as temperatures, power draw, pump speeds and setpoints, to improve the facility’s energy efficiency.
The AI system predicted the future temperature and pressure of the data centre over the coming hour and made recommendations to control consumption appropriately.
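Google has not published the DeepMind model itself, but the general pattern, training a model on historical sensor telemetry to forecast conditions an hour ahead and then using those forecasts to guide control decisions, can be sketched simply. The example below is a simplified, hypothetical illustration in Python; the file name, column names and model choice are assumptions, not details of the DeepMind system.

```python
# Simplified illustration of the approach described above: learn from
# historical sensor telemetry to predict a data centre condition one
# hour ahead. This is NOT DeepMind's model; the file name and column
# names ("timestamp", "inlet_temp_c", "it_load_kw", "pump_speed_rpm",
# "chiller_setpoint_c") are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Historical telemetry sampled every 5 minutes (hypothetical file).
df = pd.read_csv("dc_telemetry.csv", parse_dates=["timestamp"]).sort_values("timestamp")

features = ["inlet_temp_c", "it_load_kw", "pump_speed_rpm", "chiller_setpoint_c"]
horizon = 12  # 12 samples x 5 minutes = 1 hour ahead

# Target: the inlet temperature one hour after each observation.
df["temp_plus_1h"] = df["inlet_temp_c"].shift(-horizon)
df = df.dropna(subset=["temp_plus_1h"])

# Chronological split so the model is evaluated on later, unseen data.
split = int(len(df) * 0.8)
train, test = df.iloc[:split], df.iloc[split:]

model = GradientBoostingRegressor()
model.fit(train[features], train["temp_plus_1h"])

pred = model.predict(test[features])
print("MAE (deg C):", mean_absolute_error(test["temp_plus_1h"], pred))
```

Forecasts of this kind can then feed a controller that adjusts setpoints or pump speeds before temperatures drift, rather than reacting after the fact, which is where the energy savings come from.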
Managed Power Services, the next big thing in the power sector, firm says
Legit.ng reported that Schneider Electric has identified Managed Power Services as the next big thing in the power sector, with the leader in energy management and automation encouraging its partners, professionals and end users to take advantage of the next wave of growth opportunities in the sector.
Speaking at a recent press conference, Oluwaseun Oloyede, Secure Power Leader for Anglophone West Africa at APC by Schneider Electric, stressed the need for its partners and IT professionals to be well positioned to take advantage of this growth opportunity.
He added that it was normal to see innovation in the sector because, in his words, “if a business isn’t growing, it’s likely on its way to extinction.”
Source: Legit.ng