Towards the #FutureOfCompute: Sustainable, Flexible and Secure Data Centers
Juan A. Fraire - Cloud&Heat Technologies GmbH

Cloud data centers are increasingly being deployed by information technology service providers to cater for the world's digital needs. Social media, search, file sharing, big data, the Internet of Things and streaming are creating huge amounts of data that need to be processed, stored and transmitted via an efficient infrastructure. On top of this, computing resources such as CPU, memory, software, disk space and maintenance routines are becoming on-demand services provided by the cloud. To keep pace with the growing number of cloud-based applications and users, the number and size of data centers have been increasing exponentially over the past decade.

The challenges in this context are not few. On the one hand, data centers have emerged as major consumers of electricity, with an estimated consumption of more than 2.4% of the electricity worldwide, expected to grow by 15-20% annually. Sustainability and efficient energy management are thus a major concern. On the other hand, a flexible placement of data centers is becoming increasingly important to comply with the hard real-time latency requirements of modern cloud applications. Security and data integrity are yet another fundamental aspect, as more and more sensitive user data has to be protected from unauthorized access.

We, at Cloud&Heat, tackle these intertwined challenges with smart water cooling, modular containerized data centers and state-of-the-art open-source software, as discussed below.

Sustainability

The depletion of the world's limited reservoirs of fossil fuels, the worldwide impact of global warming and the high cost of energy are driving a renewed interest in the capture and reuse of waste energy. In the case of data centers, recent studies suggest that on average ∼40% of the electricity consumed powers the thermal management equipment, which exists precisely to get rid of the waste heat produced by the computing platform as it consumes the remaining ∼60% of the total power.

An immediate perspective on sustainable development is to address both of these challenges by reusing the waste heat for space and/or water heating in commercial, residential, or even industrial and power plant buildings and facilities. Such an approach is crucial, as the heating and cooling of buildings is known to consume ∼50% of the world's total energy generated from non-renewable sources. Reusing the heat produced by data centers leads to a vision where computing becomes a means of transforming, rather than wasting, energy. Ideally, data processing becomes just an excuse to transform electricity into heat for other purposes.
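To make the scale of this opportunity concrete, here is a minimal back-of-envelope sketch in Python; the 1 MW facility size is an illustrative assumption, and only the ∼40%/∼60% split comes from the figures above.

```python
# Back-of-envelope estimate of reusable waste heat in a data center.
# Illustrative assumptions: a 1 MW facility, the ~40%/~60% split cited
# above, and the simplification that essentially all power drawn by the
# IT equipment ends up as low-grade heat.

FACILITY_POWER_KW = 1_000   # assumed total electrical draw (kW)
COOLING_SHARE = 0.40        # ~40% spent on thermal management
IT_SHARE = 0.60             # ~60% spent on the computing platform

it_power_kw = FACILITY_POWER_KW * IT_SHARE
cooling_power_kw = FACILITY_POWER_KW * COOLING_SHARE

# Nearly all electricity consumed by servers is dissipated as heat,
# so the IT load is a rough upper bound on the heat available for reuse.
reusable_heat_kw = it_power_kw

print(f"IT load (roughly the reusable heat): {reusable_heat_kw:.0f} kW thermal")
print(f"Overhead spent on cooling:           {cooling_power_kw:.0f} kW electrical")
```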

While the heat captured from air-cooled servers is typically at 35–40 °C, liquid-cooling techniques provide waste heat at 50–60 °C, making water cooling a key feature of future sustainable data centers.

Flexibility

To keep up with the ever-increasing demand for data center and cloud capabilities, the trend is towards "hyper-scale" data centers: huge server farms with scalable, highly efficient infrastructures. Large cloud data centers from major service providers tend to sit in cold-weather regions where power is available at low cost.

However, more and more companies want to access their data in their physical proximity and therefore build their own data centers. The closer the data center is to the respective user, the lower the "reaction time" for querying and receiving information on the terminals; in technical jargon, this is called latency. Even though these are often only milliseconds that the user hardly notices, there are more and more applications in which these latencies matter, such as autonomous driving. In contrast with "hyper-scale" data centers, flexible "edge" or "containerized" data centers can be placed at ideal geographical locations and integrated into a distributed data center network to provide high-quality services at low latencies.
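For a rough feeling of why proximity matters, the following sketch estimates the one-way propagation delay over optical fiber; the distances and the two-thirds-of-light-speed figure for fiber are illustrative assumptions, and real networks add routing, queuing and processing delays on top of this lower bound.

```python
# Rough one-way propagation delay over optical fiber.
# Assumptions: signals travel at ~2/3 of the speed of light in fiber and
# the path is a straight line; real routes are longer and add processing
# delays, so these numbers are only a physical lower bound.

SPEED_OF_LIGHT_KM_S = 300_000
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3

def propagation_delay_ms(distance_km: float) -> float:
    """Lower bound on the one-way delay, in milliseconds."""
    return distance_km / FIBER_SPEED_KM_S * 1_000

for distance_km in (10, 100, 1_000, 5_000):
    print(f"{distance_km:>5} km  ->  {propagation_delay_ms(distance_km):5.2f} ms one way")
```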

Furthermore, decentralizing large data centers into smaller data center nodes is perhaps an even more effective waste heat utilization philosophy, as the nodes can be directly integrated into the buildings they serve, providing not only low latency but also a cost-efficient and sustainable solution. Computational jobs would then be migrated from node to node based on the computational requirements of the job, the availability of servers in each node, and the waste heat required by the integrated building. The intermittency of power supply, a typical feature of renewable energy sources such as wind and solar, can also be accommodated via smart job migration.
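Purely as a conceptual sketch (the node attributes, scoring weights and names below are illustrative assumptions, not Cloud&Heat's actual scheduler), such a migration policy could score candidate nodes on spare capacity, on how much waste heat the host building currently demands, and on locally available renewable power:

```python
from dataclasses import dataclass

# Conceptual sketch of node selection for migrating jobs in a decentralized
# data center network. All attributes and weights are illustrative
# assumptions, not an actual Cloud&Heat scheduler.

@dataclass
class Node:
    name: str
    free_cpus: int              # compute capacity currently available
    heat_demand_kw: float       # waste heat the host building asks for
    renewable_power_kw: float   # locally available (possibly intermittent) supply

@dataclass
class Job:
    name: str
    required_cpus: int
    power_draw_kw: float

def score(node: Node, job: Job) -> float:
    """Higher is better: prefer nodes with spare capacity, unmet heat
    demand, and enough renewable power to cover the job's draw."""
    if node.free_cpus < job.required_cpus:
        return float("-inf")                       # cannot host the job at all
    headroom = node.free_cpus - job.required_cpus  # leftover capacity
    heat_fit = min(node.heat_demand_kw, job.power_draw_kw)
    green_fit = min(node.renewable_power_kw, job.power_draw_kw)
    return 1.0 * headroom + 2.0 * heat_fit + 2.0 * green_fit

def pick_node(nodes: list[Node], job: Job) -> Node:
    return max(nodes, key=lambda n: score(n, job))

nodes = [
    Node("office-basement", free_cpus=8, heat_demand_kw=12.0, renewable_power_kw=5.0),
    Node("pool-facility", free_cpus=32, heat_demand_kw=40.0, renewable_power_kw=20.0),
]
job = Job("batch-render", required_cpus=16, power_draw_kw=10.0)
print(pick_node(nodes, job).name)  # -> pool-facility
```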

As a result, decentralization is a compelling and flexible approach to managing energy resources as energy-hungry computing technologies become ever more integrated into our society.

Security

As data centers go distributed, our sensitive data is everywhere except under our control. When you subscribe to Amazon, Google, Microsoft, or some other cloud provider, you are blindly entrusting your data to closed third-party software. Beyond the promises of security and data integrity made by such vendors, no one can be really sure how the data is being handled. In particular, a key security concern is whether sensitive data can be exposed by accidental, intentional, or government-enforced backdoor mechanisms.

In a context where data integrity complaints against major cloud service providers are just starting to emerge, an on-premise or private cloud infrastructure is the better and safer choice in many cases. By building your own cloud in your own data center, you have more control over your data, can exploit the flexibility of data center location for reduced latency, and can take advantage of potential heat reuse on your own premises. Furthermore, an immediate way to ensure our personal or company data is not manipulated by third-party entities is to rely on transparent, secure and open-source software. In the cloud computing world, this is materialized by SecuStack and OpenStack.
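For readers who want to experiment, the following sketch shows roughly how a private OpenStack (or SecuStack-based) cloud can be driven through the standard openstacksdk client; the cloud name "my-private-cloud" and the image, flavor and network names are placeholders that must match your own clouds.yaml entry and deployment.

```python
# Minimal sketch of talking to a private OpenStack/SecuStack cloud through
# the standard openstacksdk client. The cloud name, image, flavor and
# network below are placeholders: they must match your own clouds.yaml
# entry and what your deployment actually offers.
import openstack

conn = openstack.connect(cloud="my-private-cloud")  # resolved via clouds.yaml

# List what is already running in your own data center.
for server in conn.compute.servers():
    print(server.name, server.status)

# Launch a small instance (names are placeholders).
image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

server = conn.compute.create_server(
    name="edge-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
conn.compute.wait_for_server(server)
print("Launched", server.name)
```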

By empowering our cloud with open source, anyone who wishes to see the source code for any part of the project can do so. Bugs, including security vulnerabilities, may be spotted by the many eyes on the code, experts and novices alike. Open-source code is subject to security reviews (as in any professional software development enterprise), but in addition to "in-house" reviews by the engineers tied to the project, it is also subject to unsolicited security reviews that may be conducted by any of the many users around the world. This includes the security mechanisms themselves, which, instead of being obscured, are open for anyone to see and to verify that only the user with the valid key is able to decode and access their sensitive data.
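As a toy illustration of this "only the holder of the valid key can read the data" principle (not SecuStack's actual mechanism, just the general idea), the widely used open-source cryptography library makes it tangible in a few lines:

```python
# Toy illustration of key-based protection of sensitive data, using the
# Fernet recipe from the open-source `cryptography` library. Without the
# right key, the ciphertext is useless.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # stays with the data owner
token = Fernet(key).encrypt(b"sensitive user record")

# The rightful key holder can recover the plaintext...
print(Fernet(key).decrypt(token))    # b'sensitive user record'

# ...while anyone else, even with full access to the ciphertext, cannot.
try:
    Fernet(Fernet.generate_key()).decrypt(token)
except InvalidToken:
    print("decryption rejected without the valid key")
```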

As a result, open-source code such as SecuStack and OpenStack benefits from broader code scrutiny, removes the need to blindly trust a third party, and helps guarantee that no unwanted backdoors are present in your private or public cloud.

By offering our sustainable, flexible and secure cloud products, we are pushing the boundaries of cloud computing and mastering the so-called #FutureOfCompute. If you feel you are meant to be part of this compute revolution, do not hesitate to reach out to us at Cloud&Heat.
