
Data centers: Taking up the energy challenge

Increasingly present throughout the world, data centers consume significant amounts of energy. Researchers at IMT Atlantique have conducted a study that combines renewable energy with the electricity grid to power these infrastructures. To cut energy consumption further, the scientists are also working on the network itself and on anticipating and organizing the tasks running on the various servers.

 

Accessing this website via a search engine generates four times more CO2 than reaching it by typing the address or using a bookmark, according to ADEME. [1] Because digital technology has no obvious physical presence, it is easy to forget that it emits greenhouse gases: through their electricity consumption, data centers emit as much carbon dioxide as civil aviation (2% of global emissions). This observation is far from trivial, given the rapid increase in the number of data centers around the world. Moreover, with the rise of cloud computing, data centers are consuming ever more energy, creating both ecological and economic problems.

With each new data center that opens, reducing the environmental impact is therefore a key consideration. With the aim of reducing energy consumption, the EPOC collaborative project (Energy Proportional and Opportunistic Computing systems) combines alternative energies and the EDF grid to power single-site infrastructures containing around fifty servers. This purely theoretical project, launched in 2013 and financed by the Labex CominLabs, brings together researchers in computer science and telecommunications from IMT Atlantique. For this Green IT project, the researchers’ skills in computer optimization, virtualization, and software adaptation are combined with expertise in very high-speed optical networks. Let’s take a look at the theory at the heart of this project, while awaiting its implementation.

 

Technical optimization

In the EPOC theory, data centers are connected to the EDF grid while also being powered by renewable energy. This renewable energy delivers electricity intermittently, unlike the traditional grid. With a photovoltaic panel, for example, production surges when the sun is at its peak and drops to nothing at night. “Currently, the most expensive aspect of a renewable-energy electrical system is the batteries used to store the energy that is not consumed. We would like to do away with storage and try to consume the electricity directly as it is produced,” explains Jean-Marc Menaud, a researcher specialized in Green IT at IMT Atlantique and the EPOC coordinator. All the more reason to make the best use of this uneven supply of energy and to power the data centers consistently over time.

To achieve these goals, the researchers are working in particular on network communication. They have opted for an installation connected entirely by optical fiber, which reduces energy consumption during transfers between servers. Transmitting information over fiber, in other words as light signals, consumes less electricity than an ADSL connection, which relies on copper cables and on equipment that draws power continuously.

 

Organizing tasks according to the electrical load

Once they are running and connected to a network, data centers host two types of applications. Some, such as search engines, must be permanently online. Others simply have to be completed before a deadline. The researchers can therefore coordinate the applications in use according to the energy received: when green energy is available, they schedule the deferrable tasks as they see fit. Jean-Marc Menaud gives an example: “At the end of every month, the accounting department must produce the pay stubs in PDF format. These files must be available by the 30th of each month, but they can be generated at any time before that date. They can therefore be created when a high level of green electricity is available.”
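As a rough illustration (this is not code from the EPOC project), the Python sketch below schedules hypothetical deferrable jobs, such as the pay-stub generation described above, into the hours where a made-up solar forecast offers the most green power, while respecting each job's deadline. The job names, durations, deadlines and forecast values are all invented for the example, and jobs are assumed to be divisible into one-hour slices.

```python
# Illustrative sketch (not EPOC code): place deferrable batch jobs in the
# hours with the most forecast green power, without missing their deadlines.

# Hypothetical forecast of surplus solar power (kW) for the next 8 hours.
green_forecast = [0.0, 0.5, 2.0, 3.5, 3.0, 1.0, 0.2, 0.0]

# Hypothetical deferrable jobs: (name, duration in hours, deadline hour).
jobs = [("payroll_pdfs", 2, 8), ("log_archiving", 1, 6), ("backup", 1, 4)]

def schedule(jobs, forecast):
    """Greedy scheduler: jobs with the earliest deadlines pick, among the
    hours still free before their deadline, those with the most green power.
    Jobs are assumed to be splittable into one-hour slices."""
    free_hours = set(range(len(forecast)))
    plan = {}
    for name, duration, deadline in sorted(jobs, key=lambda j: j[2]):
        candidates = sorted(
            (h for h in free_hours if h < deadline),
            key=lambda h: forecast[h],
            reverse=True,
        )
        if len(candidates) < duration:
            raise ValueError(f"cannot fit {name} before hour {deadline}")
        chosen = candidates[:duration]
        free_hours -= set(chosen)
        plan[name] = sorted(chosen)
    return plan

print(schedule(jobs, green_forecast))
# {'backup': [3], 'log_archiving': [4], 'payroll_pdfs': [2, 5]}
```

The pay-stub job ends up in the sunniest hours that remain before its deadline, which is exactly the behavior the researchers describe.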

In addition to adjusting the number of tasks in progress, it is also possible to act on the applications that are always online. The goal of a data center is to constantly guarantee a certain quality of service, formalized in an SLA (Service Level Agreement). This is a question of software elasticity, meaning the ability to adapt an application’s behavior to the available energy. Take, for example, a website that computes a route from one point to another. Its SLA consists of providing a route within a given time frame. If the available green power is low, the site only meets this basic requirement. If, however, plenty of green electricity is available, it can also offer alternative routes, improving the service provided.
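To make the idea of elasticity concrete, here is a minimal sketch, not taken from EPOC, of a routing service that always returns at least one route (its SLA) and only computes alternatives when enough green power is available. The function names, the 2 kW threshold and the stand-in route helpers are all hypothetical.

```python
# Illustrative sketch (not EPOC code): software "elasticity" in a routing
# service.  The SLA is "always return at least one route"; extra alternatives
# are only computed when enough green power is available.

def plan_routes(origin, destination, green_power_kw):
    """Hypothetical route service that adapts its effort to available power."""
    routes = [fastest_route(origin, destination)]  # minimum required by the SLA
    if green_power_kw > 2.0:          # arbitrary threshold for this example
        routes += alternative_routes(origin, destination, max_alternatives=3)
    return routes

# The helpers below are stand-ins so the sketch runs on its own.
def fastest_route(origin, destination):
    return f"{origin} -> {destination} (fastest)"

def alternative_routes(origin, destination, max_alternatives):
    return [f"{origin} -> {destination} (alt {i + 1})" for i in range(max_alternatives)]

print(plan_routes("Nantes", "Brest", green_power_kw=0.5))   # low power: 1 route
print(plan_routes("Nantes", "Brest", green_power_kw=3.2))   # high power: 4 routes
```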

 

Regulating the workload

Reducing energy consumption also involves reducing the number of servers that are running. Before a server can be turned off, the applications running on it must be transferred to another server. To do this, the researchers rely on virtual machines. They have two ways of relieving a server of its workload: either suspend the computation, or migrate its virtual machines elsewhere. Scheduling tasks across the servers is a complex problem; it is, above all, a question of placement and distribution.
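As a rough sketch of this consolidation idea, assuming made-up server names, VM loads and capacities, the code below tries to migrate all the virtual machines off the least-loaded servers so that the emptied servers can be switched off. It is a simple first-fit heuristic for illustration, not the scheduling algorithm developed in EPOC.

```python
# Illustrative sketch (not EPOC code): consolidate virtual machines onto as
# few servers as possible so that the emptied servers can be switched off.
# Loads are expressed as a fraction of one server's capacity (1.0 = full).

# Hypothetical starting placement: server name -> list of VM loads.
servers = {
    "s1": [0.2, 0.1],
    "s2": [0.5],
    "s3": [0.3, 0.2],
}

def consolidate(servers, capacity=1.0):
    """First-fit consolidation: try to move every VM of the least-loaded
    server onto the remaining servers; if they all fit, that server can be
    powered off (in reality each move would be a live VM migration)."""
    placement = {name: list(vms) for name, vms in servers.items()}
    powered_off = set()
    for name in sorted(placement, key=lambda n: sum(placement[n])):
        moves = []
        for vm in placement[name]:
            target = next(
                (t for t in placement
                 if t != name and t not in powered_off
                 and sum(placement[t]) + vm <= capacity),
                None,
            )
            if target is None:
                break
            placement[target].append(vm)   # tentative migration
            moves.append((vm, target))
        if len(moves) == len(placement[name]):
            placement[name] = []           # server emptied: it can be switched off
            powered_off.add(name)
        else:
            for vm, target in moves:       # roll back the partial migrations
                placement[target].remove(vm)
    return placement, powered_off

print(consolidate(servers))
# ({'s1': [], 's2': [0.5, 0.2, 0.1], 's3': [0.3, 0.2]}, {'s1'})
```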

Jean-Marc Menaud explains: “This placement principle is similar to filling a backpack. Imagine you are leaving for a trek with a 60-liter backpack. You can choose from a wide variety of food items to take with you. Each item has a calorific value, a volume and a weight. Your goal is to pack as many calories as possible within the backpack’s fixed volume, while minimizing the final weight. The solution is easy when there are only 5 items. But with 10,000 items the problem becomes much harder, because it is impossible to test every combination. Here we have a similar situation: a server is a backpack that can contain a certain number of virtual machines, and we must maximize the service provided (the calories) while minimizing the energy consumed (the weight).”
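To make the analogy concrete, here is a small sketch of the classic 0/1 knapsack solved by dynamic programming, with invented food items and volumes. In the data-center reading, the backpack becomes a server, the volume its capacity, the calories the service rendered and the weight the energy consumed; the real EPOC problem spans many servers and objectives at once, which is precisely what makes it hard.

```python
# Illustrative sketch (not EPOC code): the "backpack" side of the analogy as a
# classic 0/1 knapsack, solved by dynamic programming.

# Hypothetical food items: (name, calories, volume in liters).
items = [("rice", 3600, 4), ("dried fruit", 2400, 3), ("cheese", 3000, 2),
         ("chocolate", 2200, 1), ("bread", 1500, 3)]

def knapsack(items, volume_limit):
    """Return (best calories, chosen items) for a 0/1 knapsack."""
    # best[v] = (calories, chosen items) achievable with at most v liters.
    best = [(0, [])] * (volume_limit + 1)
    for name, calories, volume in items:
        new_best = list(best)
        for v in range(volume, volume_limit + 1):
            cal, chosen = best[v - volume]
            if cal + calories > new_best[v][0]:
                new_best[v] = (cal + calories, chosen + [name])
        best = new_best
    return best[volume_limit]

print(knapsack(items, volume_limit=6))
# (7600, ['dried fruit', 'cheese', 'chocolate'])
```

With five items the answer is found almost instantly; with thousands of items and several constraints (volume and weight, or service and energy across many servers), exact methods stop scaling, which is why placement is treated as a hard optimization problem.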

 

SeDuCe: a full-scale test data center

The last area the EPOC researchers are studying is anticipation. Predicting an application’s energy needs and combining that prediction with electricity-production forecasts is the key to responsible consumption. This aspect will be tested directly in 2017 with the launch of the CPER [2] SeDuCe project (Sustainable Data Center), the logical follow-up to three years of theory. It will establish a single-site test data center powered by photovoltaic panels, a complete infrastructure on which the theories developed in EPOC can be analyzed in practice. “This type of site is rare in France. They are only beginning to emerge at the industrial level. With photovoltaic panels becoming increasingly affordable, we will be able to test the hypotheses of this advanced research directly. The site should be operational by the summer of 2017,” Jean-Marc Menaud concludes.

 

[1] ADEME: The French Environment and Energy Management Agency (Agence De l’Environnement et de la Maîtrise de l’Energie)

[2] CPER: French State-Regional Plan Contract (Contrat Plan Etat-Région)

 
