telecommunications, energy transition, Loutfi Nuaymi

Energy and telecommunications: brought together by algorithms

It is now widely accepted that algorithms can have a transformative effect on a particular sector. In the field of telecommunications, they may indeed greatly impact how energy is produced and consumed. Between reducing the energy consumption of network facilities and making better use of renewable energy, operators have embarked on a number of large-scale projects. And each time, algorithms have been central to these changes. The following is an overview of the transformations currently taking place and findings from research by Loutfi Nuaymi, a researcher in telecommunications at IMT Atlantique. On April 28 he gave a talk about this subject at the IMT symposium dedicated to energy and the digital revolution.

 

20,000: the average number of relay antennae owned by a mobile operator in France. Also called “base stations,” they represent 70% of the energy bill for telecommunications operators. Since each station transmits with a power of approximately 1 kW, reducing their demand for electricity is a crucial issue for operators seeking to improve the energy efficiency of their networks. To achieve this objective, the sector is currently focusing more on technological advances in hardware than on the software component. Thanks to the latest advances, a recent base station consumes significantly less energy for a data throughput that is nearly a hundred times higher. But new algorithms that promise energy savings are being developed, including some which simply involve… switching off base stations at certain scheduled times!

This solution may seem radical, since switching off a base station in a cellular network means potentially preventing users within a cell from accessing the service. Loutfi Nuaymi, a researcher in telecommunications at IMT Atlantique, is studying this topic in collaboration with Orange. He explains: “Base stations would only be switched off during low-load times, and in urban areas where there is greater overlap between cells.” In large cities, switching off a base station from 3 to 5am would have almost no consequence, since users are likely to be located in areas covered by at least one other base station, if not more.

Here, the role of algorithms is twofold. First of all, they would manage the switching off of antennas when user demand is lowest (at night) while maintaining sufficient network coverage. Secondly, they would gradually switch the base stations back on when users reconnect (in the morning) and up to peak hours during which all cells must be activated. This technique could prove to be particularly effective in saving energy since base stations currently remain switched on at all times, even during off-peak hours.
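To make the idea concrete, here is a minimal sketch of the kind of rule such an algorithm might apply. The thresholds, traffic profile and overlap counts are hypothetical and do not come from the research described above; a real algorithm would also weigh quality-of-service constraints and ramp stations back up toward peak hours.

```python
# Hypothetical illustration of night-time standby decisions for base stations.
# All thresholds and data are invented for the example.

def stations_to_standby(stations, hour, load_threshold=0.15, min_overlap=1):
    """Return the IDs of stations that can safely be put on standby at a given hour."""
    candidates = []
    for s in stations:
        low_load = s["predicted_load"][hour] < load_threshold    # off-peak traffic
        covered = s["overlapping_neighbours"] >= min_overlap     # area still served by other cells
        if low_load and covered:
            candidates.append(s["id"])
    return candidates

# Example: two urban stations at 4 a.m.
stations = [
    {"id": "BS-001", "predicted_load": {4: 0.05}, "overlapping_neighbours": 3},
    {"id": "BS-002", "predicted_load": {4: 0.40}, "overlapping_neighbours": 2},
]
print(stations_to_standby(stations, hour=4))   # -> ['BS-001']
```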

Loutfi Nuaymi points out that “the use of such algorithms for putting base stations on standby mode is taking time to reach operators.” Their reluctance is understandable, since interruptions in service are by their very definition the greatest fear of telecom companies. Today, certain operators could put one base station out of ten on standby mode in dense urban areas in the middle of the night. But the IMT Atlantique researcher is confident in the quality of his work and asserts that it is possible “to go even further, while still ensuring high quality service.”

 


Gradually switching base stations on or off in the morning and at night according to user demand is an effective energy-saving solution for operators.

 

While energy management algorithms already allow for significant energy savings in 4G networks, their contributions will be even greater over the next five years, with 5G technology leading to the creation of even more cells to manage. The new generation will be based on a large number of femtocells covering areas as small as ten meters — in addition to traditional macrocells with a one-kilometer area of coverage.

Femtocells consume significantly less energy, but given the large number of these cells, it may be advantageous to switch them off when not in use, especially since they are not used as the primary channel for data transmission, but rather to support macrocells. Switching them off would not in any way prevent users from accessing the service. Loutfi Nuaymi describes one way this could work: “It could be based on a system in which a user’s device will be detected by the operator when it enters a femtocell. The operator’s energy management algorithm could then calculate whether it is advantageous to switch on the femtocell, by factoring in, for example, the cost of start-up or the availability of the macrocell. If the macrocell is not overloaded, there is no reason to switch on the femtocell.”
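As an illustration only, here is a sketch of the kind of cost/benefit check described above; the variable names and figures are hypothetical, not the actual algorithm studied with Orange.

```python
# Hypothetical femtocell wake-up rule: switch the small cell on only if the macrocell
# is close to saturation, or if the energy saved by offloading traffic exceeds the
# start-up cost. All values below are invented for the example.

def should_wake_femtocell(macro_load, startup_cost_wh, offload_saving_wh, macro_capacity=1.0):
    macro_overloaded = macro_load > 0.9 * macro_capacity
    worth_the_energy = offload_saving_wh > startup_cost_wh
    return macro_overloaded or worth_the_energy

print(should_wake_femtocell(macro_load=0.35, startup_cost_wh=5.0, offload_saving_wh=2.0))  # False
print(should_wake_femtocell(macro_load=0.95, startup_cost_wh=5.0, offload_saving_wh=2.0))  # True
```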

 

What is the right energy mix to power mobile networks?

The value of these algorithms lies in their capacity to calculate cost/benefit ratios using a model that takes as many parameters as possible into account. They can therefore provide autonomy, flexibility, and responsiveness in base station management. Researchers at IMT Atlantique are building on this decision-support principle and going a step further than simply determining whether base stations should be switched on or put on standby mode. In addition to limiting energy consumption, they are developing other algorithms for optimizing the energy mix used to power the network.

They begin with two observations: renewable sources of energy are less expensive, and operators who equip themselves with solar panels or wind turbines must also store the energy produced to make up for periodic variations in sunshine and the intermittent nature of wind. So, how can an operator decide between using stored energy, energy supplied by its own solar or wind facilities, or energy from the traditional grid, which may rely on a varying degree of smart technology? Loutfi Nuaymi and his team are also working on use cases related to this question and have joined forces with industrial partners to test and develop algorithms which could provide some answers.

“One of the very concrete questions operators ask is what size battery is best to use for storage,” says the researcher. “Huge batteries cost as much as what operators save by replacing fossil fuels with renewable energy sources. But if the batteries are too small, they will have storage problems. We’re developing algorithmic tools to help operators make these choices, and determine the right size according to their needs, the type of battery used, and their production capacity.”
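A toy calculation can show why sizing matters. The production and demand figures below are invented, and the real tools obviously model far more (battery type, ageing, electricity prices); this is only a sketch of the trade-off.

```python
# Toy battery-sizing loop: simulate one day of hypothetical solar production and
# base-station demand, and count how much surplus each battery size fails to capture.

def lost_surplus_kwh(battery_kwh, production, demand):
    stored, lost = 0.0, 0.0
    for prod, dem in zip(production, demand):
        surplus = prod - dem
        if surplus > 0:
            accepted = min(surplus, battery_kwh - stored)   # cap at remaining capacity
            stored += accepted
            lost += surplus - accepted
        else:
            stored = max(0.0, stored + surplus)             # draw from the battery
    return lost

production = [0, 0, 0, 1, 3, 5, 6, 5, 3, 1, 0, 0]   # kWh per 2-hour slot (invented)
demand     = [2] * 12                                # flat 2 kWh demand per slot
for size in (2, 5, 10):
    print(f"{size} kWh battery -> {lost_surplus_kwh(size, production, demand)} kWh of surplus lost")
```

With these invented numbers, a 2 kWh battery loses 10 kWh of surplus over the day, while a 10 kWh battery loses only 2 kWh: the optimization consists in finding the point where extra capacity no longer pays for itself.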

Another question: is it more profitable to equip each base station with its own solar panel or wind turbine, or rather to create an energy farm to supply power to several antennas? The question is still being explored, but preliminary findings suggest that for solar panels, neither arrangement is clearly preferable. Wind turbines, however, are imposing structures that neighbors sometimes object to, which makes grouping them together the preferable option.

 

Renewable energies at the cost of diminishing quality of service?  

Once this type of constraint has been addressed, operators must calculate the maximum proportion of renewable energies they can include in the energy mix with the least possible impact on the quality of mobile service. Sunshine and wind are intermittent by nature. For an operator, a sudden drop in production at a wind or solar power farm could have direct consequences on network availability — no energy means no working base stations.

Loutfi Nuaymi admits that these limitations reveal the complexity of developing the algorithms: “We cannot simply consider the cost of the operators’ energy bills. We must also take account of the minimum proportion of renewable energies they are willing to use so that their practices correspond to consumer expectations, the average data rate needed to satisfy users, and so on.”

Results from research in this field show that in most cases, the proportion of renewable energies used in the energy mix can be raised to 40%, with only an 8% drop in quality of service as a result. In off-peak hours, this represents only a slight deterioration and does not have a significant effect on network users’ ability to access the service.

And even if a drastic reduction in quality of service should occur, Loutfi Nuaymi has some solutions. “We have worked on a model for a mobile subscription that delays calls if the network is not available. The idea is based on the principle of overbooking on planes. Voluntary subscribers — who, of course, do not have to choose this subscription — accept the risk of the network being temporarily unavailable and, in return, receive financial compensation if it affects their use.”

Although this new subscription format is only one possible solution for operators, and is still a long way from becoming reality, it shows how the field of telecommunications may be transformed in response to energy issues. Questions have arisen at times about the outlook for mobile operators. Given the energy consumed by their facilities and the rise of smart grids which make it possible to manage both self-production and the resale of electricity, these digital players could, over time, come to play a significant role in the energy sector.

“It is an open-ended question and there is a great deal of debate on the topic,” says Loutfi Nuaymi. “For some, energy is an entirely different line of business, while others see no reason why they should not sell the energy collected.” The controversy could be settled by new scientific studies in which the researcher is participating. “We are already performing techno-economic calculations in order to study operators’ prospects.” The highly anticipated results could significantly transform the energy and telecommunications markets.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 

 

Dark Matter

Even without dark matter, Xenon1T is a success

Xenon1T is the largest detector of dark matter in the world. Unveiled in 2015, it searches for this invisible material — which is five times more abundant in the universe than ordinary matter — from the Gran Sasso laboratory in Italy, buried under a mountain. In May 2017, an international collaboration of 130 scientists published the first observations made by the instrument. Dominique Thers, the coordinator of the experiment in France and a researcher at Subatech*, explains the importance of these initial results from Xenon1T. He gives us an overview of this cutting-edge research, which could unlock the secrets of the universe. 

[divider style=”dashed” top=”20″ bottom=”20″]
Learn more about Xenon1T by reading our article about the experiment.
[divider style=”dashed” top=”20″ bottom=”20″]

What did the Xenon1T collaboration work on between the inauguration a year and a half ago and the first results published last month?

Dominique Thers: We spent the better part of a year organizing the validation of the instruments to make sure that they worked properly. The entire collaboration worked on this qualification and calibration phase between fall 2015 and fall 2016. This phase can be quite long and it’s difficult to predict in advance how long it will take. We were very satisfied to finish it in a year — a short time for a large-scale experiment like Xenon1T.

 

So you had to wait one year before launching the first real experiment?

DT: That’s right. The first observations were launched in early December 2016. We exposed the ton of xenon to potential dark matter particles for exactly 34.2 days. In reality, the actual time was a bit longer, since we have to recalibrate the instruments regularly and no data is recorded during these periods. This period of exposure ended on January 18, when three high-magnitude earthquakes were recorded near Gran Sasso. Due to the mechanical disturbance, the instruments had to be serviced over the course of a week, and we decided at that time to proceed to what we call “unblinding.”

 

Does that mean you only discovered what you had been recording with the experiment once it was finished, rather than in real time?

DT: Yes, this is in line with the logic of our community. We perform data analysis that is independent from data acquisition. This allows us to keep bias in our analysis to a minimum, the kind of bias that could occur if we stopped an observation period to check whether or not there had been an interaction between the xenon and dark matter. Once we have reached an exposure time we consider satisfactory, we stop the experiment and look at the data. The analysis is prepared in advance, and in general everyone is ready for this moment. The earthquake occurred very close to the scheduled end date, so we preferred to stop the measurements then.

 

The results did not reveal interactions between the xenon and dark matter particles, which would have represented a first direct observation of dark matter. Does this mean that the collaboration has been a failure?

DT: Not at all! It’s important to understand that there is fierce competition around the globe to increase the volume of ordinary material exposed to dark matter. With a ton of xenon, Xenon1T is the world’s largest experiment, and potentially, the most likely to observe dark matter. It was out of the question to continue over a long period of time without first confirming that the experiment had reached an unprecedented level of sensitivity. With this first publication, we have proven that Xenon1T is up to the challenge. However, Xenon1T will only reach its maximum sensitivity in sessions lasting 18 to 24 months, so it holds great promise.

 

How does this sensitivity work? Is Xenon1T really more sensitive than other competing experiments?

DT: A simple but illustrative way to put it is that the longer the detector is exposed to dark matter, the more likely it is to record an interaction between it and ordinary matter. We therefore have a law that is proportional to time. So it’s clear why, after obtaining this world record in just one month, we are optimistic about the capacities of Xenon1T over an 18 to 24-month period. But we cannot go much further than that, since we would encounter an excessive level of background noise, which would hide potential observations of dark matter with Xenon1T.
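As a rough illustration of the “law proportional to time” mentioned here (a deliberate simplification that ignores background effects), the interaction-rate limit an experiment can reach in the background-free regime improves in inverse proportion to the exposure, i.e. the product of target mass and measurement time:

```latex
% Simplified scaling, valid only while background noise remains negligible
\sigma_{\min} \;\propto\; \frac{1}{M \times t},
\qquad M \simeq 1~\text{t of xenon},\quad t = 34.2~\text{days for this first run}.
```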

 


The Xenon1T experiment in the Gran Sasso laboratory in Italy. On the left, the xenon reservoir enclosed in protective casing. On the right, rooms housing instruments used for analysis and control in the experiment.

 

So it was more important for the Xenon1T partnership to confirm the superiority of its experiment than to directly carry out an 18 to 24-month period of exposure which may have been more conclusive?  

DT: This enabled us to confirm the quality of Xenon1T, both in the eyes of the scientific community and of the governments which support us and need to justify the investments they make. This was a way of responding to the financial and human resources provided by our partners, our collaborators and ourselves. And we do not necessarily control observation time at our level. It also depends on results from competing experiments. The idea is not to keep our eyes closed for 18 months without concerning ourselves with what is happening elsewhere. If another experiment claims to have found traces of dark matter in an energy range where Xenon1T has visibility, we can stop the acquisitions in order to confirm or disprove these results. This first observation positions us as the best-placed authority to settle any scientific disagreements.

 

Your relationship with other experiments seems a bit unusual: you are all in competition with one another but you also need each other.

DT: It is very important to have several sites on Earth which can report a direct observation of dark matter. Naturally, we hope that Xenon1T will be the first to do so. But even if it is, we’ll still need other sites to demonstrate that the dark matter observed in Italy is the same as that observed elsewhere. But this does not mean that we cannot all improve the sensitivity of our individual experiments in order to maintain or recover the leading role in this research.

 

So Xenon1T is already looking to the future?

DT: We are already preparing the next experiment and determining what Xenon1T will be like in 2019 or 2020. The idea is to gain an order of magnitude in the mass of ordinary material exposed to potential dark matter particles with XENONnT. We are thinking of developing an instrument which will contain ten tons of xenon. In this respect we are competing with the American LZ experiment and the Chinese PandaX collaboration. They also hope to work with several tons of xenon in a few years’ time. By then, we may have already observed dark matter…

*Subatech is a joint research unit between IMT Atlantique, CNRS and Université de Nantes.

 

energy transitions

Energy Transitions: The challenge is a global one, but the solutions are also local

For Bernard Bourges, there is no doubt: there are multiple energy transitions. A researcher at IMT Atlantique studying changes in the energy sector, he takes a multi-faceted view of the transformations happening in this field. Each situation and each territory has its own specificities which, rather than pointing to a single overall solution, give rise to a multitude of responses to today’s great energy challenges. This is one of the central points in the “Energy Transitions: mechanisms and levers” MOOC which he is running from 15 May until 17 July 2017. On this occasion, he gives us his view of the current metamorphosis in the field of energy.

 

You prefer to talk about energy transitions in the plural, rather than the energy transition. Why is the plural form justified?

Bernard Bourges: There is a lot of talk about global challenges, the question of climate change, and energy resources to face the growing population and economic development. This is the general framework, and it is absolutely undeniable. But, on the scale of a country, a territory, a household, or a company, these big challenges occur in extremely different ways. The available energy resources, the level of development, public policy, economic stakes, or the dynamics of those involved, are parameters which change between two given situations, and which have an impact on the solutions put in place.

 

Is energy transition different from one country to another?

BB: It can be. The need to switch energy models in order to reduce global warming is absolutely imperative. In vast regions, global warming is a matter of life or death for populations, as in the Pacific Islands. Conversely, in some cases, rising temperatures may even be seen as an opportunity for economic development: countries like Russia or Canada will gain new cultivable land. There are also contradictions in terms of resources. The development of renewable energies means that countries with a climate suited to solar or wind power production will gain greater energy independence. At the same time, technical advances and melting ice caps are making some fossil fuel deposits accessible that had previously been too costly to exploit. This implies a multitude of opportunities, some of which are dangerously tempting, and contradictory interests, often within the same country or the same company.

 

You highlight the importance of economic stakes. What about political decisions?

BB: Of course, there is an important political dimension, as there is a wide range of possibilities. To make the system more complex, energy overlaps with other environmental challenges, as well as social ones like employment. This results in a specific alchemy. Contradictory choices will arise, according to the importance politicians place on these great societal problems. In France, as in other countries, there is a law on energy transition. But the apparent unanimity this suggests is not necessarily real. It is important to realize that behind the scenes, there may be strong antagonism. This conditions political, social and even technological choices.

 

“Behind the question of energy, there are physical laws, and we cannot just do what we want with them.”

 

On the question of technology, there is a kind of optimism which consists in believing that science and innovation will solve the problem. Is it reasonable to believe this?

BB: This feeling is held by part of the population. We need to be careful about this point, as it is also marketing speak used to sell solutions. However, it is very clear that technology will contribute greatly to the solutions put in place, but for now there is no miracle cure. Technology will probably never be capable of satisfying all needs for growth, at a reasonable cost, and without a major impact on the climate or the environment. I often see inventors pop up, promising perpetual motion or efficiency rates of 100%, or even more. It’s absurd! Behind the question of energy, there are physical laws, and we cannot just do what we want with them.

 

What about the current technologies for harvesting renewable resources? They seem satisfactory on a large scale.

BB: The enthusiasm needs to be tempered. For example, there is currently a lot of enthusiasm surrounding solar power, to the point where some people imagine all households on the planet becoming energy independent thanks to solar panels. However, this utopia has a technological limit. The sun is an intermittent resource: it is only available for half the day, and only in fine weather. This energy must therefore be stored in batteries. But batteries use rare resources such as lithium, which are not limitless. Extracting these resources has environmental impacts. What could be a solution for several tens of millions of Europeans can therefore become a problem for billions of other people. This is one illustration of the multifaceted nature of energy transitions, which we highlight in our MOOC.

 

Does this mean we should be pessimistic about the potential solutions provided by natural resources?

BB: The ADEME carried out a study on a 100% renewable electricity mix by 2050. One of the most symbolic conclusions was that it is possible, but that we will have to manage demand. This implies being sure that new types of usage will not appear. But this is difficult, as innovations will result in a drop in energy prices. If costs decrease, new types of use become possible, which will increase demand. The realistic approach is to use a combination of solutions that draw on renewable resources (locally or on a large scale), intelligent management of energy networks, and innovative technologies. Managing demand is not only based on technological solutions, but also on changes in organization and behavior. Each combination will therefore be specific to a given territory or situation.

 

Doesn’t this type of solution make energy management more complex for consumers, whether individuals or companies? 

BB: This question is typical of the mistake people often make, that of limiting the question of energy to electricity. Energy is certainly a question of electricity usage, but also thermal needs, heating, and mobility. The goal for mobility will be to switch to partially electric transport modes, but we are not there yet, as this requires a colossal amount of investment. For thermal needs, the goal is to reduce demand by increasing the energy efficiency of buildings. Electricity is really only a third of the problem. Local solutions must also provide answers to other uses of energy, with completely different types of action. Having said this, electricity does take center-stage, as there are great changes underway. These changes are not only technological but also institutional (liberalization for example), difficult to understand, and sometimes even misleading for consumers.

 

What do you mean by that?

BB: For the moment, we cannot differentiate between the electrons in the network. No provider can tell you at a given moment whether you are receiving electricity produced by a wind farm, or generated by a nuclear power plant. We therefore must be wary of energy providers who tell us the opposite. This is another physical constraint. There are also legal and economic constraints. But we have understood that in this time of great change, there are many actors who are trying to win, or at least trying not to lose.

This is also why we are running this MOOC. The consumer needs to be helped in understanding the energy chain: where does energy come from? What are the basic physical laws involved? We have to try and decipher these points. But, in order to understand energy transitions, we also have to identify the constraints linked specifically to human societies and organizations. This is another point we present in the MOOC, and we make use of the diverse range of skills of people at IMT’s schools and external partners.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 

[divider style=”normal” top=”20″ bottom=”20″]

The MOOC “Energy Transitions: mechanisms and levers” in brief

The MOOC “Energy Transitions: mechanisms and levers” at IMT is available (in French) on the “Fun” platform. It will take place from 15 May to 17 July 2017. It is aimed at both consumers wanting to gain a better understanding of energy, and professionals who want to identify specific levers for their companies.

[divider style=”normal” top=”20″ bottom=”20″]

 

SEAS, ITEA

How the SEAS project is redefining the energy market

The current energy transition has brought with it new modes of energy production and consumption. Coordinated by Engie, the European SEAS project aims to foster these changes in order to create a more responsible energy market. SEAS is seeking to invent the future of energy usage by facilitating the integration of new economic stakeholders to redistribute energy, as well as increasing the energy management options offered to individuals. These ideas have been made possible through the contributions of researchers from several IMT graduate schools (IMT Atlantique, Mines Saint-Étienne, Télécom ParisTech and Télécom SudParis). Among these contributions, two innovations are supported by IMT Atlantique and Mines Saint-Étienne.

 

“An increasing number of people are installing their own energy production tools, such as solar panels. This breaks with the traditional producer-distributor-consumer energy model,” notes Guillaume Habault, a computer science researcher at IMT Atlantique. Redefining the stakeholders in the energy chain is at the heart of the issue addressed by the Smart Energy Aware Systems (SEAS) project. The project was completed in December, after three years of research as part of the European ITEA program. It brought together 34 partners from 7 countries, one of which was IMT in France. On 11 May, the SEAS project won the ITEA Award of Excellence, in acknowledgement of the high quality of its results.

The project is especially promising as it does not only involve individuals wanting to produce their own energy using solar panels. New installations such as wind turbines provide new sources of energy on a local scale. However, this creates complications for stakeholders in the chain such as network operators: the energy production of these installations is erratic, as it depends on the seasons and the weather. Yet it is important to be able to forecast energy production in the very short term in order to ensure that every consumer is supplied. Overestimating the production of a wind farm or a neighborhood equipped with solar panels means taking the risk of not having enough energy to cope with a shortfall in production, and ultimately causing power cuts for residents. “Conversely, underestimating production means having to store or dispatch the surplus energy elsewhere. Poor planning can create problems in the network, and even reduce the lifespan of some equipment,” the researcher warns.

 

An architecture for smart energy grid management

Among the outcomes of the SEAS project is a communication architecture capable of gathering all the information from different production and consumption modes locally, almost in real time. “The ideal goal is to be able to inform the network in 1-hour segments: with this length of time, we can avoid getting excessively precise information about user consumption, while still anticipating cases of over- or under-consumption,” explains Guillaume Habault, the creator of the architecture.

For individuals, SEAS may take the form of an electric device that can transmit information about their consumption and production to their electricity provider. “This type of data will allow people to optimize their power bills,” the researcher explains. “By having perfect knowledge of the local energy production and demand at a given moment, residents will be able to tell if they should store the energy they produce, or redistribute it on the network. With the help of the network, they may also decide what time would be the most economical to recharge their electric car, according to electricity prices, for instance.”
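To illustrate the 1-hour reporting principle described above, a device could aggregate its raw meter samples into hourly consumption/production balances before sending them. This is only a sketch with invented readings, not the actual SEAS protocol or data format.

```python
# Illustrative aggregation of raw meter samples into 1-hour segments, so the operator
# sees an hourly balance rather than fine-grained usage. Not the actual SEAS protocol.
from collections import defaultdict

def hourly_report(samples):
    """samples: iterable of (hour, consumption_kwh, production_kwh) tuples."""
    totals = defaultdict(lambda: {"consumption": 0.0, "production": 0.0})
    for hour, cons, prod in samples:
        totals[hour]["consumption"] += cons
        totals[hour]["production"] += prod
    return {h: {**v, "net": v["production"] - v["consumption"]} for h, v in totals.items()}

samples = [(18, 0.4, 0.0), (18, 0.3, 0.1), (19, 0.6, 0.0)]   # invented readings
print(hourly_report(samples))
# hour 18 shows a net draw of roughly 0.6 kWh from the grid, hour 19 of 0.6 kWh
```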

 

 

These data on the current state of a sub-network point to the emergence of new stakeholders, known as “flexibility operators”. The first reason is that optimizing consumption by adapting the way each appliance in the house is used requires specific equipment and takes time. While it is easy to predict that energy will be more expensive at times of peak demand, such as in the evenings, it is more difficult to anticipate the price of electricity according to how strongly the wind is blowing at a wind farm located several dozen kilometers away. It is safe to say that, with suitable equipment, some individuals will be inclined to delegate the optimization of their energy consumption to third-party companies.

The perspectives of intelligent energy management offered by SEAS go beyond the context of the individual. If the inhabitants of a house are away on holiday, couldn’t the energy produced by their solar panels be used to supply the neighborhood, thus taking pressure off a power plant located a hundred kilometers away? Another example: refrigerators operate periodically, they don’t cool constantly, but rather at intervals. In a neighborhood, or a city, it would therefore be possible to intelligently shift the startup time of a group of these appliances to outside peak hours, so that an already heavily-used network can be concentrated on the heaters people switch on when they return home from work.
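A minimal sketch of the refrigerator example follows. The peak window and start-up times are invented; a real flexibility operator would also spread the shifted start-ups out rather than stack them at the same hour.

```python
# Illustrative load shifting: move any periodic appliance start-up requested during
# the evening peak to the first off-peak hour that follows. Values are invented.

PEAK_HOURS = range(18, 21)   # 6 p.m. to 9 p.m.

def shift_startups(requested_hours):
    shifted = []
    for hour in requested_hours:
        while hour % 24 in PEAK_HOURS:
            hour += 1
        shifted.append(hour % 24)
    return shifted

print(shift_startups([17, 18, 19, 22]))   # -> [17, 21, 21, 22]
```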

Companies are particularly keen to obtain these types of services. Load management allows them to temporarily switch off machines that are not essential to their operations in exchange for compensation from those in charge of this load management. The SEAS architecture incorporates communication security in order to ensure trust between stakeholders. In particular, personal data are decentralized: each party owns their own data and can decide not only to allow a flexibility operator to have access to them, but also to determine their granularity and level of use. “An individual will have no trouble accepting that their refrigerator cools at different times from usual, but not that their television gets cut off while they are watching it,” says Guillaume Habault. “And companies will want to have even more control over whether machines are switched off or on.”

 

Objects that speak the same language

In order to achieve such efficient management of electricity grids, the SEAS project turned to the semantic web expertise of Mines Saint-Étienne. “The semantic web is a set of principles and formalisms that are intended to allow machines to exchange knowledge on the web,” explains Maxime Lefrançois, the researcher who led the development of the knowledge model for the SEAS project. This knowledge model is the pivotal language that allows objects to be interoperable in the context of energy network management.

“Up to now, each manufacturer had their own way of describing the world, and the machines made by each company evolved in their own separate worlds. With SEAS, we used the principles and formalisms of the semantic web to provide machines with a vocabulary allowing them to ‘talk energy’, to use open data that exists elsewhere on the web, or to use innovative optimization algorithms on the web,” says the researcher. In other words, SEAS proposes a common language enabling each entity to interpret a given message in the same way. Concretely, this involves giving each object a URL, which can be consulted in order to obtain information on it, in particular to find out what it can do and how to communicate with it. Maxime Lefrançois adds: “We also contributed to the principles and formalisms of the semantic web with a series of projects aimed at making them more accessible to companies and machine designers, so that they could adapt their existing machines and web services to the SEAS model at a lower cost.”
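The following sketch shows the general idea of describing a device with semantic web formalisms, using the rdflib Python library. The namespace and property names are invented for illustration; they are not the actual SEAS knowledge model.

```python
# Give a device its own URL and describe it with RDF triples so that any machine
# sharing the vocabulary can query what it is and what it can do.
# The "EX" vocabulary below is invented; it stands in for a shared energy ontology.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/energy#")
g = Graph()

fridge = URIRef("http://example.org/devices/fridge-42")   # the object's own URL
g.add((fridge, RDF.type, EX.CoolingAppliance))
g.add((fridge, EX.ratedPowerWatts, Literal(120)))
g.add((fridge, EX.canDeferStartup, Literal(True)))

print(g.serialize(format="turtle"))   # a Turtle document other machines can interpret
```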

Returning to a previous example, using this extension of the web makes it possible to adapt two refrigerators of different brands so that they can communicate, agree on the way they operate, and avoid creating a consumption peak by starting up at the same time. In terms of services, this will allow flexibility operators to create solutions without being limited by the languages specific to each brand. As for manufacturers, it is an opportunity for them to offer household energy management solutions that go beyond simple appliances.

Thanks to the semantic web, communication between machines can be more easily automated, improving the energy management service proposed to the customer. “All these projects point to a large-scale deployment,” says Maxime Lefrançois. Different levels of management can thus be envisioned. Firstly, for households, for coordinating appliances. Next, for neighborhoods, redistributing the energy produced by each individual according to their neighbors’ needs. Finally on a regional or even national scale, for coordinating load management for overall consumption, relieving networks in cases of extremely cold temperatures, for example. The SEAS project could therefore change things on many levels, offering new modes of more responsible energy consumption.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 


 

[divider style=”normal” top=”20″ bottom=”20″]

SEAS wins an “ITEA Award of Excellence for Innovation and Business impact”

Coordinated by Engie, and with one of its main academic partners being IMT, SEAS won an award of excellence on May 11 at the Digital Innovation Forum 2017 in Amsterdam. This award recognizes the relevance of the innovation in terms of its impact on the industry.

[divider style=”normal” top=”20″ bottom=”20″]

green it, epoc, cloud computing, data center

Data centers: Taking up the energy challenge

Increasingly present throughout the world, data centers consume significant amounts of energy. Researchers at IMT Atlantique have conducted a study that combines renewable energy and electricity grids to power these infrastructures. To reduce energy consumption, scientists are also looking at the network and the anticipation and organization of tasks on the various servers.

 

Accessing this website via a search engine results in four times more CO2 emissions than getting here by typing the website address or using a bookmark, according to the ADEME. [1] While its lack of physical presence makes it easy to forget that digital technology emits greenhouse gases, it is important to remember that data centers emit as much carbon dioxide as civil aviation (2% of global emissions) due to their electricity consumption. This observation is far from trivial, considering the significant increase in the number of data centers around the world. Furthermore, with the arrival of cloud computing, data centers are consuming increasing amounts of energy, leading to ecological and economic problems.

With each new data center that opens, reducing the environmental impact is therefore a key consideration. With the aim of reducing energy consumption, the EPOC collaborative project (Energy Proportional and Opportunistic Computing systems) combines alternative energies and the EDF grid to power single-site infrastructures containing around fifty servers. This purely theoretical project, launched in 2013 and financed by the Labex CominLabs, brings together researchers in computer science and telecommunications from IMT Atlantique. For this Green IT project, the researchers’ skills in computer optimization, virtualization, and software adaptation are combined with expertise in very high-speed optical networks. Let’s take a look at the theory at the heart of this project, while awaiting its implementation.

 

Technical optimization

In the EPOC scenario, data centers are connected to the EDF grid while also being powered by renewable energy. This renewable energy provides electricity in a discontinuous manner, unlike the electricity provided by the traditional network. For example, in the case of a photovoltaic panel, there is a surge in electricity when the sun is at its peak, whereas there is no production at night. “Currently, the most expensive aspect in a renewable energy electric system is the batteries used to store the energy that is not consumed. We would like to do away with the storage aspect and try to consume the electricity that is produced directly,” explains Jean-Marc Menaud, a researcher specialized in Green IT at IMT Atlantique and the EPOC coordinator. Hence the importance of making the best use of this uneven supply of energy and powering the data centers in a consistent manner over time.

To achieve these goals, the researchers are improving network communication in particular. They have opted for an installation that is entirely connected via fiber optic technology. This reduces energy consumption during transfers between the servers. The transmission of information via a fiber optic system or, in other words, via light signals, consumes less electricity than an ADSL connection. ADSL, for example, uses cables containing copper and equipment that constantly consumes electricity.

 

Organizing tasks according to the electrical load

Once they are running and connected to a network, data centers are able to host two types of applications. Some, like search engines, require a constant online connection. Others simply need to be completed before a deadline. The researchers are therefore able to coordinate the applications being run based on the energy received. When green energy is available, they can organize the tasks as they like. Jean-Marc Menaud gives us an example: “At the end of every month, the accounting services must create the pay stubs in PDF format. These files must be available by the 30th of each month, but can be produced any time before this date. They can therefore be created when a high level of green electricity is available.”

In addition to the number of tasks in progress, it is also possible to act on the applications that are always online. The goal of a data center is to constantly ensure a certain level of quality of service, or SLA (Service Level Agreement). This is a question of software elasticity, meaning the ability to adapt an application’s operation according to the available energy. Take, for example, a website used to calculate a route from one point to another. Its SLA consists in providing a route within a given time frame. If little green electricity is available, the site will only meet this basic requirement. If, however, the green electricity rate is high, the website can provide alternative routes, thus improving the service provided.
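A minimal sketch of this kind of scheduling decision (not the EPOC implementation; the forecast values are invented): run a deferrable batch job, like the monthly pay stubs, in the slot with the most green energy before its deadline.

```python
# Illustrative green-aware scheduling: pick the time slot with the highest forecast
# of renewable production among those that still meet the job's deadline.

def best_slot(green_forecast_kwh, deadline_slot):
    candidates = range(deadline_slot + 1)                      # all slots up to the deadline
    return max(candidates, key=lambda slot: green_forecast_kwh[slot])

forecast = [0.2, 0.1, 0.8, 1.5, 1.1, 0.3]    # invented solar forecast per slot
print(best_slot(forecast, deadline_slot=4))   # -> 3, the sunniest slot before the deadline
```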

 

Regulating the workload

Reducing energy consumption also involves reducing the number of servers that are running. Before turning a server off, the applications running on it must be transferred to another server. To do this, the researchers use the virtual machine principle. The researchers have two possibilities for relieving the workload on a server: they either suspend the calculation, or they perform a migration using virtual machines. This process of scheduling tasks on the servers is a complex problem. It is, above all, a question of placement and distribution.

Jean-Marc Menaud explains: “This principle of placement is similar to the principle involved in filling a backpack. Imagine you are leaving for a trek with a 60-liter backpack. You can choose from a wide variety of food items to take with you. Each food item has a calorific value, a volume and a weight. Your goal is to gather the maximum number of calories within the limitations of a backpack with a set volume, while minimizing the final weight. The solution is easy when there are only 5 food items. But if you have 10,000 items, the problem becomes more complex, because it is impossible to test all the possibilities. Here we have a similar situation. A server is a backpack that can contain a certain quantity of virtual machines. We must maximize the service provided (the calories) and minimize the energy (the weight).”
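The backpack analogy maps directly onto a small knapsack problem. The sketch below brute-forces a toy instance with invented numbers, which also shows why exhaustive search stops being an option once the number of virtual machines grows.

```python
# Toy version of the "backpack" analogy: choose which virtual machines to keep on a
# server of limited capacity, maximising the service value delivered. Exhaustive search
# is fine for four VMs, but the number of subsets doubles with every VM added.
from itertools import combinations

def best_placement(vms, capacity):
    """vms: list of (name, service_value, energy_cost) tuples."""
    best_value, best_set = 0, ()
    for r in range(len(vms) + 1):
        for subset in combinations(vms, r):
            cost = sum(v[2] for v in subset)
            value = sum(v[1] for v in subset)
            if cost <= capacity and value > best_value:
                best_value, best_set = value, subset
    return best_value, [v[0] for v in best_set]

vms = [("web", 8, 5), ("batch", 4, 4), ("db", 7, 6), ("cache", 3, 2)]
print(best_placement(vms, capacity=10))   # -> (12, ['web', 'batch']) with this toy data
```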

 

SeDuCe: a full-scale test data center

The last area the EPOC researchers are studying is anticipation. Predicting an application’s energy needs, combined with the electricity production forecast, is the key to responsible consumption. This aspect will be tested directly in 2017, with the implementation of the CPER [2] SeDuCe (Sustainable Data Center) project, the logical follow-up to three years of theory. It will enable a single-site test data center to be established, powered by photovoltaic panels. This complete infrastructure will enable the practical analysis of the theories being developed through EPOC. “This type of site is rare in France. They are only beginning to emerge at the industrial level. With photovoltaic panels that are increasingly affordable, we will be able to test the hypotheses of this advanced research directly. The site should be operational by the summer of 2017,” Jean-Marc Menaud concludes.

 

[1] ADEME: The French Environment and Energy Management Agency (Agence De l’Environnement et de la Maîtrise de l’Energie)

[2] CPER: French State-Regional Plan Contract (Contrat Plan Etat-Région)

 

mySMARTLife, Nantes

In Nantes, the smart city becomes a reality with mySMARTlife

Alongside European smart city champions like Barcelona, Copenhagen and Stockholm, France boasts a few gems of its own. One such city is Nantes, a participant in the European H2020 research project mySMARTlife since December 1st, 2016. Thanks to this project, the capital of the Pays de la Loire region plans to put its scientific heritage, represented by IMT Atlantique, to good use as it continues its transformation into a smart city.

 

When searching for proof that major French cities are truly transitioning to become smart cities, look no further than Nantes. For several years now, the city has been engaged in a transformational process, turning the concept of the city of the future into a reality. This is a role and ambition the city has taken to the European level, with Johanna Rolland, the Mayor of Nantes and President of Nantes Métropole, serving on the Executive Committee of Eurocities. This network of cities — also chaired by the Mayor of Nantes from 2014 to 2016 — advocates with European authorities for the interests of major metropolitan areas, and includes some big names among smart cities: Barcelona, Stockholm and Milan, etc. In short, at a time when few European cities can claim to be undertaking tangible measures towards becoming smart cities, Nantes can boast of being a pioneer in this area.

On December 1st, 2016, this position was further strengthened with the official launch of the H2020 mySMARTlife research project. As proof of the place Nantes holds in the European ecosystem of cities of the future, the city is now working alongside Hamburg and Helsinki as the project’s pilot cities. At the local level in Nantes, mySMARTlife is aimed at modernizing several of the city’s major areas of governance, particularly energy and transport. More specifically, one of the objectives is to “have a platform for Nantes Métropole, and its associated stakeholders,[1] to enable new services to be developed and monitored and to provide decision-support,” explains Bruno Lacarrière, a researcher at IMT Atlantique. The institution is participating in this H2020 project and offers dual expertise: in energy efficiency related to heating networks and in decision-support. This expertise is provided by the Department of Energy Systems and Environment (member of the UMR CNRS 6144 GEPEA) in Nantes, and by the Department of Logics in Uses, Social Science and Management (member of the UMR CNRS 6285 LAB-STICC) in Brest.

 

Optimizing the city’s energy efficiency

The researchers from the IMT Atlantique Department of Energy Systems and Environment will specifically provide their knowledge in energy efficiency and system analysis, applied to heating networks. “Our skills in the field allow us to model these systems with an integrated approach that goes beyond thermal-hydraulic studies, for example,” explains Bruno Lacarrière. “We do not only model pipes, but an entire set of connected technological objects,” he continues. The researchers take into account the variety of systems that can provide heat sources for the network (boilers, cogeneration units, geothermal energy, recovering surplus industrial heat…), and the diversity of the consumers connected to the network. All of the heating network components are therefore integrated into the researchers’ models. This approach, which is complex because it is based on a comprehensive view of the network, makes it possible to better assess the areas for improvement in optimizing energy efficiency, and to better predict the consequences, for example, of renovating a building.

The researchers will greatly benefit from their proximity to the industrial partners in this project. To develop their models, they need field data such as heat output measurements from various points in the network. “This data is difficult to obtain, because in this case the system is connected to several hundred buildings,” Bruno Lacarrière points out. Furthermore, this information is not public. “Being able to work with stakeholders on the ground, such as Erena (Engie subsidiary and the network operator in Nantes), is therefore a real advantage for us, provided, of course, that the necessary confidentiality clauses are established,” the researcher adds.

 

No smart cities without decision support 

At the same time, the role of the Department of Logics in Uses, Social Science and Management is to develop decision-support tools, an important aspect in many of the smart city’s activities. This is true for mobility and transport, as Bruno Lacarrière points out: “In the context of the boom in electric vehicles, one application of decision-support is providing users with the nearest locations of available charging stations in real time.” Decision-support can also be used by public authorities to determine the best location for charging stations based on the configuration of the infrastructures and electrical distribution. “This is where having a platform becomes truly valuable: the information is centralized and made available to several stakeholders,” the researcher explains.

While the two types of expertise provided by IMT Atlantique are different in terms of research, they are very much complementary. Decision-support can, for example, use information obtained via the heating network models to propose new buildings to be connected to the network, or to study the deployment of new production sources. On the other hand, the results from decision-support based on several criteria (often related to various stakeholders) help to define new modeling scenarios for the networks. The researchers in energy efficiency and those in decision-support therefore complement each other through the platform, and provide support to the different stakeholders in the decisions they must make.

 

Ensuring the transformations are here to stay

While the mySMARTlife project will last five years, all the project’s actions — including rolling out the platform — must be completed within the first three years. The last two years will be dedicated to assessing the various actions, studying the impacts and making revisions if necessary. “For example, the first three years could be spent implementing an optimized energy management system, and the two follow-up years would provide feedback on the actual optimization. It is necessary to have sufficient hindsight, spanning several heating seasons,” explains Bruno Lacarrière.

The platform’s specific features must still be determined, and this will be the partners’ first mission. Because although it will initially be a demo platform, it is intended to remain after the project has ended. Therefore, planning must be done ahead of time to determine what form it will take, specifically so that industrial partners, as well as public authorities and final users, can make the most of it. Through this H2020 project, the European Commission is therefore planning to develop concrete actions that are made to last.

 

From a focus on Nantes to an international perspective

The work will initially focus on the Île de Nantes, located at the heart of the city on the Loire river. However, because certain heating and transportation networks are not confined to this area alone, the project will extend to other parts of the city from the outset. For example, the energy used by the Île de Nantes is partially produced outside the district’s boundaries, so the geographic area covered by the models must be expanded. Several actions involving other zones in the metropolitan area are already planned.

Furthermore, the mySMARTlife project should not be seen solely as an effort to modernize a few areas of Nantes and the other two pilot cities. Brussels’ desire to ensure the sustainability of the actions over time is also related to its stated intention to ensure the scaling-up of the results from mySMARTlife. The key challenge is to produce knowledge and results that can be transferred to other urban areas, in France and abroad. This explains the advantage of entrusting the H2020 project management to Helsinki and Hamburg, in addition to Nantes.

By working together with the partners from these other two cities, the researchers will be able to validate their models by applying them to other major metropolitan areas. They will also attempt to test the validity of their work in smaller cities, since the project also includes the cities of Bydgoszcz (Poland), Rijeka (Croatia), Varna (Bulgaria) and Palencia (Spain). “The project is therefore aimed at demonstrating that the technology implemented can replicate on a large scale the actions used to develop a smart city,” the researcher points out. A key challenge in transforming cities is to make the transition to a smart city available not only to major metropolitan areas that are technologically advanced in this area, but also to smaller cities.

 

[1] At the local level in Nantes, Nantes Métropole will be supported by nine different partners: IMT Atlantique, Nantes Métropole Habitat, la Semitan, Armines, Atlanpole, Cerema, Engie and Enedis.

 

 

 

 

Gasification, Pilote, Mines Albi, VALTHERA, Javier Escudero

Gasification, the future of organic waste recovery

At a time when the challenge of waste recovery is becoming increasingly evident, gasification is emerging as a promising solution. The process allows organic waste to be decomposed into synthetic gas, which can be burned for energy purposes, or reprocessed to obtain gases of interest, such as methane and hydrogen. Javier Escudero has been studying this virtuous alternative to incineration for over eight years at Mines Albi. At the RAPSODEE laboratory (UMR CNRS 5302), he is developing a pilot process for recovering problematic waste, such as non-recyclable plastic materials and certain types of agricultural residue.

 

This century-old technique is now more relevant than ever. Gasification, which generates combustible gas from carbonaceous solids such as coal and wood, was popularized in the 19th century to power producer-gas vehicles. It sparked renewed interest during World War II, when it was used to produce synthetic fuels from coal during the oil shortage.

 

Waste, tomorrow’s resource

In this era of energy transition, researchers are reviving this technique to recover a much more promising carbon source: organic waste! Javier Escudero is one such researcher. His credo? “Waste is tomorrow’s resource.” At Mines Albi, he is working to optimize this recovery method, which is more virtuous than outright incineration. His target materials include forest residues, household waste and non-recyclable plastics. “Gasification is particularly suited to dry and solid waste. It is complementary to the biological methanation process, which is used more for wet waste,” he explains.

Several steps are involved in the gasification process of transforming waste into gas. The waste, which is preconditioned and dried beforehand, first undergoes pyrolysis in a low-oxygen atmosphere at temperatures of over 300°C. “In these conditions, the energy produced breaks the molecular bonds. The carbonaceous materials separate into gas and solid residue. The following step is the true gasification stage: at 750°C or higher, the water vapor or carbon dioxide that are present complete the decomposition of these elements into a mixture of small molecules called synthesis gas, essentially composed of carbon monoxide and hydrogen,” Javier Escudero explains.
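The two reactions at work in that final stage are textbook gasification chemistry (recalled here as background, not as a result of the Mines Albi work): the carbon-rich char left by pyrolysis reacts with water vapor or carbon dioxide to give the carbon monoxide/hydrogen mixture.

```latex
% Steam gasification (water-gas reaction) and CO2 gasification (Boudouard reaction),
% both endothermic and favored at the temperatures quoted above (roughly 750 °C and higher):
C + H_2O \longrightarrow CO + H_2
\qquad\qquad
C + CO_2 \longrightarrow 2\,CO
```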

This synthesis gas, the basic “building block” of petrochemistry, has proven to be very useful: it can be incinerated, providing a greater yield than the combustion of the original solid. It can also power a cogeneration motor to produce heat and electricity. Finally, it can be reprocessed to produce gases of interest: methane, hydrogen, acetylene, etc… “We can therefore replace one source of energy or fossil material with its renewable energy equivalent,” Javier Escudero explains. It is thanks to this great versatility that gasification provides a virtuous alternative to incineration. However, some optimizations must still be made to improve its economic results.

 

Thermal recovery for industrial benefit

Javier Escudero has been working towards this goal since his arrival at Mines Albi in 2008. His aim is to identify the best means of enhancing the yield of the process, some of whose mechanisms remain relatively unknown. In 2013, one of his team’s publications,1 explaining the respective influences of carbon dioxide and water vapor on the efficiency of gasification, was well received by the scientific community.

 

[box type=”shadow” align=”” class=”” width=””]

VALTHERA, waste recovery here, there, and everywhere

The VALTHERA platform (which in French stands for VALorisation THErmique des Résidus de transformation des Agro-ressources, the Thermal Recovery of Processing Residues from Agro-Resources), is located at the Mines Albi site, and is backed by the Agri Sud-Ouest and Derbi competitiveness clusters. It is a technological platform specialized in the development of highly energy-efficient thermal processes for the recovery of biomass waste and by-products. Its technological offer includes drying, pyrolysis, torrefaction, combustion, and gasification. Different means of recovery are being studied for this waste that is widely available, which would generate energy or value-added materials. Another specific feature of the VALTHERA platform is that it develops a source of solar power intended to power all of these thermal processes and improve their ecological footprint. It also offers high-performance equipment for treating various types of emissions and pollutants. The platform also acts as a catalyst for companies, and specifically for SMEs seeking to carry out research and development programs, demonstrate the feasibility of a project, or generalize a process.[/box]

 

Now, the time has come to apply this research. The researcher and his team are therefore working to develop the VALTHERA platform, aimed at developing various thermal waste-recovery processes in partnership with industrial stakeholders (see box). In particular, Javier Escudero and his colleagues at the RAPSODEE laboratory (Recherche d’Albi en génie des Procédés des Solides Divisés, de l’Énergie et de l’Environnement, the Albi Research Centre for Process Engineering in Particulate Solids, Energy and the Environment) are working on a 100 kW pilot gasification process. Scheduled to be operational by the end of 2016, it will be the forerunner of final processes reaching up to 3 MW, “a power range suited to processing small volumes of organic waste, which could meet the needs of an SME.” The team is particularly focused on “fixed-bed” technology, in which the entire process takes place within a single reactor: the waste is loaded in from the top and gradually passes through the pyrolysis and gasification steps, driven downwards by gravity, until the synthesis gas is recovered at the bottom of the reactor.

The researchers are working in partnership with the French gasifier manufacturer CogeBio to expand the possibilities of this technology. “The only commercial solutions that exist are for wood chips. We are going to assess the use of other types of waste, such as vine shoots,” explains Javier Escudero. Eventually, the project will expand to include other sources, such as non-recyclable plastics, again in connection with the solutions industrial stakeholders are seeking. “Today, the processing cost for certain types of waste is negative, because the demand to get rid of this waste exceeds the processing capacities,” the researcher explains. In terms of recovery, the synthesis gas will first be burned for energy purposes. Depending on the partnerships formed, more ambitious recovery processes could then be implemented, starting with the production of hydrogen, a high-value-added energy carrier. All of these initiatives are aimed at transforming our waste into renewable energy!

 


Curiosity: the single driving force

Nothing predestined Javier Escudero to develop gasification in France… unless it was his scientific curiosity. After falling in love with research during an internship at a Swiss polymer manufacturer, the Spanish student began a thesis on polymerization, co-supervised by a Spanish manufacturer. After completing post-graduate research on the same theme at the Laboratory of Chemical Engineering (LGC, UMR 5503) in Toulouse, in 2008 he applied for a research position at Mines Albi in the field of waste gasification, a subject far removed from his beginnings in chemistry. His curiosity and industrial experience, however, combined to bring him success. Eight years later, he is an Assistant Professor at the RAPSODEE laboratory (UMR CNRS 5302)… and deeply passionate about sustainable development. In addition to his daily work on gasification, he co-organizes the international WasteEng conference (Conference on Engineering for Waste and Biomass Valorisation), which brings together stakeholders from across the waste chain, from the identification of sources to their recovery.

 

(1) Guizani, C. et al., “The gasification reactivity of high-heating-rate chars in single and mixed atmospheres of H2O and CO2,” Fuel 108 (2013), 812–823.

 

Bitcoin, blockchain, Michel Berne, Fabrice Flipo

Bitcoin and the blockchain: energy hogs

By Fabrice Flipo and Michel Berne, researchers at Télécom École de Management.
Editorial originally published in French in The Conversation France

_______________________________________________________________________________________

 

The digital world still lives under the illusion that it is intangible. While governments gathered in Paris at COP21 and pledged to reduce their carbon emissions to keep global warming below 2°C, the spread of digital technology continues without the slightest concern for the environment. The current popularity of bitcoin and the blockchain provides a perfect example.

The principle of the blockchain can be summarized as follows: each transaction is recorded in thousands of copies of an accounting ledger, and each copy is scrutinized by a different observer. Yet no mention is made of the energy footprint of this unprecedented transaction ledger, or of that of the new “virtual currency” (bitcoin) it manages.
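To make the idea concrete, here is a minimal sketch in Python of a ledger whose records are chained together by cryptographic hashes, so that every holder of a copy can check independently that nothing has been altered. It is purely illustrative and far simpler than Bitcoin’s actual protocol; all names and values are invented for the example.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (serialized as deterministic JSON) with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    """Append a block that references the hash of the previous block."""
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": previous, "transactions": transactions})

def verify(chain):
    """Any observer holding a copy can recompute the links independently."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 1}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(ledger))  # True: this copy agrees on an untampered history
```

Every one of the thousands of copies mentioned above stores, transmits and re-verifies this ever-growing file, which is precisely where the energy cost comes from.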

Read the blog post What is a blockchain?

 

Electricity consumption equivalent to that of Ireland

In a study published in 2014, Karl J. O’Dwyer and David Malone showed that the consumption of the bitcoin network was likely to be approximately equivalent to the electricity consumption of a country like Ireland, i.e. an estimated 3 GW.

Imagine the consequences if this type of currency were to become widespread. The global money supply in circulation is estimated at $11,000 billion. The corresponding energy consumption would therefore exceed 4,000 GW, which is 8 times the electricity consumption of France and twice that of the United States. It is not without reason that a recent headline on the Novethic website proclaimed “The bitcoin, a burden for the climate”.
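The order of magnitude can be reproduced with a simple proportional scaling. The base figure used here for the monetary mass handled by Bitcoin at the time (a few billion dollars) is an assumption for the sake of illustration, not a figure taken from the study:

```python
# Back-of-the-envelope extrapolation (illustrative only).
bitcoin_power_gw = 3        # estimated draw of the Bitcoin network (O'Dwyer & Malone)
bitcoin_money_bn = 8        # assumed monetary mass handled by Bitcoin, in billions of dollars
world_money_bn = 11_000     # global money supply in circulation, in billions of dollars

scaled_power_gw = bitcoin_power_gw * world_money_bn / bitcoin_money_bn
print(f"{scaled_power_gw:,.0f} GW")  # roughly 4,000 GW and above, as quoted in the text
```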

 

What do the numbers say?

Since every blockchain is a ledger (and therefore a file) that exists in many copies, the computing resources required to calculate, transmit and store the information increase, as does the energy footprint, even when improvements in the underlying technologies are taken into account.

The two important factors here are the length of the blockchain and the number of copies. For bitcoin, the blockchain’s length has grown very quickly: according to Quandl, it was 27 GB in early 2015 and had risen to 74 GB by mid-2016.

Bitcoin, whose system is modeled on the former gold-standard currencies, is generated through computations that become increasingly complex over time, much like an increasingly depleted gold mine in which production costs keep rising.
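Those increasingly complex computations are proof-of-work puzzles: miners search for a number whose hash, combined with the block’s data, starts with a required number of zeros, and the network raises that requirement over time. The toy Python version below illustrates the principle; Bitcoin’s real difficulty mechanism is far more elaborate.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` zero hex digits; each extra digit multiplies the work by ~16."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Raising the difficulty is the "depleted goldmine" effect: the same block
# costs ever more hashing attempts -- and therefore ever more electricity.
print(mine("some transactions", difficulty=4))
```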

In 2015, Genesis Mining revealed in Business Insider that it was one of the most energy-consuming companies in Iceland, with electricity costs of 60 dollars per “extracted” bitcoin, despite benefiting from a low price per kWh and a favorable climate.

Finally, we can also imagine all the “smart contract” type applications supported by the Internet of Things. These will also have a considerable impact on energy and the environment, considering the manufacturing requirements, the electrical supply (often autonomous, and therefore complicated and not very efficient) and disposal.

And although the majority of connected objects will probably not support smart contracts, a very large number of connected objects is anticipated in the near future, with the total likely to reach 30 billion by 2020, according to the American consulting firm McKinsey.

Bitcoin is just one of many systems being developed without concern for their energy impact. Faced with the climate issue, their promoters act as if it did not exist, or as if alternative energy solutions were already available.

 

An increasingly high price to pay

Yet decarbonizing the energy system is a vast issue, involving major risks. And the proposed technical solutions in this area offer no guarantees of being able to handle the massive and global increase in energy consumption, while still reducing greenhouse gas emissions.

Digital technology already accounts for approximately 15% of the national electricity consumption in France, and consumes as much energy, on the global scale, as aviation. Today, nothing suggests that there will be a decrease in the mass to be absorbed, nor is there any indication that digital technology will enable a reduction in consumption, as industrialists in this sector have confirmed (see the publication entitled La Face cachée du numérique – “The hidden face of digital technology”).

The massive decarbonization of energy faces many challenges: the reliability of the many carbon sequestration techniques being proposed, and the “energy cannibalism” involved in launching renewable energies, which require energy to be manufactured and come up against technical, social and political limitations (for example, the various sources of renewable energy require large surface areas, yet much of the potentially usable space is already occupied). The challenges are huge.

With PREVER, all residues can be recovered as energy.

PREVER: residue turned into energy

Mines Nantes is home to PREVER – France’s only technological platform offering a wide range of competencies and tools for converting industrial and household waste into energy. It provides customized support for companies wanting to produce combustible materials or fuels from miscellaneous residues.

 

As a species, Homo sapiens have a very strange way of disposing of their waste. They dig an enormous hole, put their waste and debris in it, cover it up, then start the process all over again in a nearby location. And there you have it! “Out of sight and out of mind”. Over the past few decades, some individuals from this species have been warning their fellow creatures about the dangers of continuing this practice. They describe its consequences for the sustainability of their habitat and environment, and therefore their own survival. Consequences that could almost make them deny their very identity as humans. Fortunately, solutions are beginning to emerge!

At Mines Nantes, an area of almost 520 m² is dedicated to a research and analysis platform for energy recovery from residues (PREVER). What is the goal of the scientists working there? To work with companies to explore new solutions for recycling household and industrial waste and transforming it into energy sources. “This includes any companies that want to use their residues to produce their own energy, as well as SMEs and start-ups that want to develop a recycling program,” explains Mohand Tazerout, who is in charge of PREVER.

The platform team meets with companies and identifies their needs, whether this involves solving specific problems for very small businesses or developing large-scale programs for major companies. The researchers then prototype a solution, taking the entire recovery chain into account, from the raw residue to the final product. This end product may take the form of a synthesis gas, a solid fuel or another type of fuel. The platform also offers the benefit of working alongside companies from the prototyping stage through to full-scale production.

 

A technological solution that’s unique in France

Another advantage is that PREVER has all the machinery required to study the performance of recovery processes. “When we’re working with a liquid fuel, for example, we have engine test units that enable us to study its energy characteristics or its impact on the environment,” explains Mohand Tazerout. The platform boasts cutting-edge equipment, including electricity-generating boilers, gas microturbines and hydrothermal liquefaction machinery. “I believe we are the only platform in France to bring together all the equipment for the recovery chain in the same place,” the researcher continues.

In addition to the technical aspect, PREVER also boasts a physico-chemical analysis laboratory, which extends its range of services: atomic spectroscopy, thermogravimetric analysis, octane number measurement, etc. It can also rely on highly qualified staff. As Mohand Tazerout explains, “The department associated with PREVER includes around ten researchers who work with the companies on the programs, plus technicians and PhD students who are specially recruited for specific projects.”

The reason the platform offers such a wide range of energy recovery methods is that it can work with all types of residue. The researcher explains, “We work with all types of waste, from the food industry to household waste, organic, inorganic, liquid and solid…” This also includes unusual and atypical waste, such as maritime wood contaminated by oil spills. PREVER seeks to demonstrate that all residues can be converted into energy, and that hiding them in a hole is no longer a solution in this day and age.

Read the blog post Vabhyogaz uses our waste to produce hydrogen

 

The ALICE detector.

ALICE sees the origins of the universe in a quark-gluon plasma droplet

ALICE, the CERN experiment in charge of studying the quark-gluon plasma, restarted its main activities on November 17. Based at the Large Hadron Collider (LHC) in Switzerland, ALICE has the mission of tracking every single particle ejected when lead ions collide. Thanks to the “Run 2” upgrade carried out at the LHC in early 2015, these collisions are now more energetic, reaching 5 TeV* per nucleon. The results of the experiment could deepen our knowledge of matter, but also allow researchers to go far back in time, to a few fractions of a second after the big bang, to study how matter initially formed. Subatech, a joint research unit of Mines Nantes, CNRS and Université de Nantes, is part of this great scientific adventure.

 

 


The ALICE detector is as tall as a five-story building. Credits: CERN.

ALICE. Behind this poetic, whimsical acronym lies one of the four main experiments of the Large Hadron Collider (LHC). Although CERN’s Swiss particle collider is famous for hosting the discovery of the Higgs boson, other research there focuses on fundamental matter. ALICE, which stands for A Large Ion Collider Experiment, is among these: its mission is to study and characterize the quark-gluon plasma. “It is a deconfined state of matter, in which the interactions binding the components of protons and neutrons no longer exist,” explains Ginés Martinez of Subatech, a joint research unit of Mines Nantes, CNRS and Université de Nantes and a full member of the ALICE collaboration.

Less well known than ATLAS and CMS, the LHC experiments that led to the 2013 Nobel Prize, ALICE has nothing to envy its two big sisters: 26 meters long, 16 meters high and weighing 10,000 tons, its dimensions live up to the LHC’s superlative standards. Although the particle collider began its “Run 2” last March, with proton collisions now reaching 13 TeV*, ALICE’s main mission only truly started last month.

Once upon a time, in Switzerland…

Back in 2010, the first collisions between protons — hydrogen atoms stripped of their electrons — took place at the LHC. ATLAS, CMS and ALICE all watched them with their huge detectors. But ALICE remained backstage: proton-proton collisions are not energetic enough to produce a quark-gluon plasma, as RHIC, a smaller and older collider located on Long Island, USA, had already demonstrated. ALICE had to wait for other collisions: those between lead ions — lead atoms, likewise stripped of their electrons — which occurred later that same year, several months after the first proton-proton collisions.

Unfortunately, each time the nature of the collisions changes, “all of the LHC’s optics have to be recalibrated,” explains Ginés Martinez. He adds that “lead-lead collisions only take place one month each year,” while “proton-proton collisions keep the LHC busy 90% of the time.” In 2015, the first lead-lead collisions were produced on November 17. The period during which ALICE reveals its full potential is therefore right now, and it lasts until mid-December.


ALICE staff members during the launch of the LHC “Run 2”. Credits: Laurent Egli / CERN.

 

A very hot plasma

To form this plasma, lead ions are accelerated around the 27 km ring of the LHC to a speed close to the speed of light. When they collide, the lead ions release a tremendous amount of energy. The protons and neutrons forming the atomic nuclei melt into a droplet of plasma, releasing the quarks and gluons that compose them. These particles are fundamental components of matter, just like electrons, meaning they cannot be subdivided into smaller building blocks. The collision is so intense that “matter heats up to staggering temperatures of several trillion degrees,” says Ginés Martinez. By comparison, the Sun’s core, at 15 million degrees, would feel like a winter chill.
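To give a sense of scale, a quick order-of-magnitude calculation converts the 5 TeV quoted per nucleon into a Lorentz factor and a speed. This is an illustrative estimate using rounded values, not an official ALICE figure:

```python
import math

nucleon_rest_energy_gev = 0.94   # approximate rest energy of a proton or neutron
nucleon_energy_gev = 5_000       # 5 TeV per nucleon, as quoted in the article

gamma = nucleon_energy_gev / nucleon_rest_energy_gev   # Lorentz factor, ~5,300
beta = math.sqrt(1 - 1 / gamma**2)                     # speed as a fraction of c

print(f"gamma ~ {gamma:.0f}, v ~ {beta:.9f} c")  # about 0.999999982 c
```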

ALICE’s mission begins with this droplet of deconfined matter, of quark-gluon plasma. Because the collision point is under vacuum, the droplet’s state is short-lived: under vacuum, matter tends to occupy the maximum amount of space, so the plasma droplet expands as much as it can and then disappears. The phenomenon is similar to what would happen to a drop of water under vacuum: the H2O molecules would spread so far apart that the liquid would become a gas. Quark-gluon plasma behaves the same way. ALICE’s detectors track every particle flying out from the collision point and measure its trajectory. “In particular, we pay attention to muons coming from particles formed by heavy quarks, such as beauty and charm quarks.”

ALICE and its silicon eyes

This wide range of particles is ejected along many different trajectories. Scattered in all directions, they pass through thin silicon plates arranged in layers all around the collision point. Every time a particle crosses a silicon layer, it produces an electrical signal. “We can measure the trajectory of each emitted particle very precisely,” explains Ginés Martinez. These data are crucial, as they allow scientists to determine the particles’ mass, speed and energy. With such information, physicists can better understand how matter is organized.
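The principle of track reconstruction can be sketched in a few lines: given the positions at which a particle produced signals in successive silicon layers, a simple fit recovers its direction of flight. The values below are invented for illustration, and real ALICE tracking is far more sophisticated (curved tracks in a magnetic field, thousands of simultaneous particles):

```python
import numpy as np

# Toy track reconstruction from hits in successive silicon layers (made-up values).
layer_radius = np.array([4.0, 7.0, 15.0, 24.0])   # radial distance of each layer, in cm
hit_position = np.array([1.1, 1.9, 4.2, 6.6])     # transverse position of the hit, in cm

# A straight-line fit through the hits gives the particle's direction.
slope, offset = np.polyfit(layer_radius, hit_position, deg=1)
print(f"estimated slope: {slope:.2f}, offset: {offset:.2f} cm")
```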


3D visualization of the trajectories of particles ejected in a lead-lead collision. Credits: CERN.

A drop of big bang

Once the particles have been ejected, a phenomenon called “hadronization” occurs. As the droplet of plasma is under vacuum, its temperature decreases and it loses energy. Quarks and gluons then bind together to form thousands of species of hadrons, including protons and neutrons. This cooling is close to what happened to matter as the universe rapidly expanded during the big bang. Studying hadronization therefore gives researchers insight into how matter formed in the first instants of the universe.

Each year, the ALICE team, made up of 1,500 researchers, takes another step towards understanding matter. To increase the sensitivity of its detectors, Subatech is working with the CEA’s Institute of Research into the Fundamental Laws of the Universe (Irfu) and CERN to develop a new generation of sensors based on MAPS** technology. The goal is to be able to track 100,000 collisions by 2020, compared with 10,000 today. Such an increase in ALICE’s capacity should allow more precise observations and better tracking of particle trajectories.

*TeV stands for teraelectronvolt, i.e. 10¹² electronvolts (eV). The electronvolt is a unit of energy suited to the scale of the nucleus. During the collision, each nucleon of a lead ion carries an energy of 5 TeV. A proton can reach 13 TeV thanks to its higher charge-to-mass ratio. Such energies can currently only be produced at the LHC.

**MAPS stands for Monolithic Active Pixel Sensors, sensors inspired by a medical-imaging technology for detecting visible light. MAPS sensors fully integrate the electronics and the detection volume in the same silicon wafer.

 

[box type=”shadow” align=”aligncenter” class=”” width=”95%”]


Quarks, gluons, bosons… Diving into the standard model

Fundamental particles are the smallest components of matter: these entities cannot be subdivided. Electrons belong to this category, as do quarks and gluons. Together, quarks and gluons form bigger particles called “hadrons,” such as the protons and neutrons that bind together to form a nucleus. Hadrons are therefore not fundamental particles. Each fundamental particle has its own properties and is therefore more or less sensitive to the elementary physical interactions. The properties of quarks make them sensitive to the strong nuclear interaction, which confines them inside hadrons. This strong interaction is carried by other fundamental particles, the gluons, just as the electromagnetic force is carried by photons.


The standard model in a table. Credits: Dopplerlover.

[/box]