
Optical networks

The virtualization of optical networks to support… 5G

Mobile networks are not entirely wireless. They also rely on a network of optical fibers, which connect antennas to the core network, among other things. With the arrival of 5G, optical networks must be able to keep up with the ramp-up of the rest of the mobile network to ensure the promised quality of service. Two IMT Atlantique researchers are working on this issue by making optical networks smarter and more flexible.

In discussions of issues surrounding 5G, it is common to hear about the installation of a large number of antennas or the need for compatible devices. But we often overlook a crucial aspect of mobile networks: the fiber optic infrastructure on which they rely. Like previous generations, 5G relies on a wired connection in most cases. Fiber is also used in the “last mile”: it connects antennas to core network equipment, which is linked to most of the connected machines around the world, and it can also connect various devices within the same antenna site.

In reality, 5G is even more dependent on this infrastructure than previous generations, since the next-generation technology comes with new requirements related to new uses, such as the Internet of Things (IoT). For example, an application such as an autonomous car requires high availability, perfect reliability, very low latency, etc. All of these constraints weigh on the overall architecture, which includes fiber optics. If optical networks cannot adapt to new demands within the last mile, the promises of 5G will be jeopardized. And new services (industry 4.0, connected cities, telesurgery, etc.) simply cannot be provided in a reliable, secure way.

Facilitating network management through better interoperability

Today, optical networks are usually over-provisioned in relation to current average throughput needs. They are designed to be able to absorb 4G peak loads and are neither optimized nor able to adapt intelligently to fluctuating demand. The new reality created by 5G therefore represents both a threat for infrastructure, in terms of its ability to respond to new challenges, and an opportunity to rethink its management.

Isabel Amigo and Luiz Anet Neto, telecommunications researchers at IMT Atlantique, are working with a team of researchers and PhD students to conduct research in this area. Their goal is to make optical networks smarter, more flexible and more independent from the proprietary systems imposed by vendors. A growing number of operators are moving in this direction. “At Orange, it used to be common to meet specialists in configuration syntaxes and equipment management for just one or two vendors,” explains Luiz Anet Neto, who worked for the French group for five years. “Now, teams are starting to set up a “translation layer” that turns the various configurations, which are specific to each vendor, into a common language that is more straightforward and abstract.”

This “translation layer”, on which he is working with other researchers, is called SDN, which stands for Software-Defined Networking. This model is already used in the wireless part of the network and involves offloading certain functions of network equipment. Traditionally, this equipment fulfills many missions: data processing (receiving and sending packets on to their destination), as well as a number of control tasks (routing protocols, transmission interfaces, etc.). With SDN, equipment is relieved of these control tasks, which are centralized within an “orchestrator” entity that can control several devices at once.

Read more on I’MTech: What is SDN?

There are many benefits to this approach. It provides an overview of the network, making it easier to manage, while making it possible to control all of the equipment, regardless of its vendor and without having to know any proprietary language. “To understand the benefit of SDN, we can use an analogy between a personal computer and the SDN paradigm,” says Isabel Amigo. “Today, it would be unthinkable to have a computer that would only run applications that use a specific language. So, machines have an additional layer – the operating system – that is in charge of “translating” the various languages, as well as managing resources, memory, disks, etc. SDN therefore aims to act like an operating system, but for the network.” Similarly, the goal is to be able to install applications that are able to work on any equipment, regardless of the hardware vendor. These applications could, for example, distribute the load based on demand.
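
A minimal sketch can make the “operating system for the network” analogy concrete. All names below (Orchestrator, VendorADriver, limit_rate, etc.) are hypothetical and do not come from any real SDN controller; the point is only that an application issues one abstract instruction while vendor-specific syntax stays hidden inside drivers.

```python
# Illustrative sketch only: the classes and methods are hypothetical,
# not a real SDN controller API.

class VendorADriver:
    """Translates an abstract intent into one vendor's proprietary syntax."""
    def apply(self, port: str, rate_mbps: int) -> None:
        print(f"[vendor A CLI] set interface {port} shaper {rate_mbps}M")

class VendorBDriver:
    """Same intent, different proprietary dialect."""
    def apply(self, port: str, rate_mbps: int) -> None:
        print(f"[vendor B config] rate-limit {port} {rate_mbps} mbps")

class Orchestrator:
    """Centralized control plane: one abstract call reaches any device."""
    def __init__(self):
        self.devices = {}                 # device name -> vendor driver

    def register(self, name, driver):
        self.devices[name] = driver

    def limit_rate(self, name, port, rate_mbps):
        # The application never sees vendor-specific syntax.
        self.devices[name].apply(port, rate_mbps)

orchestrator = Orchestrator()
orchestrator.register("olt-1", VendorADriver())
orchestrator.register("olt-2", VendorBDriver())
# A load-distribution "application" could now throttle links on demand:
orchestrator.limit_rate("olt-1", "eth0", 500)
orchestrator.limit_rate("olt-2", "xe-0/0/1", 800)
```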

Breaking our dependence on hardware vendors

SDN often goes hand in hand with another concept, inspired by virtualization in data centers: NFV (Network Functions Virtualization). Its principle: being able to execute any network functionality (not just control functions) on generic servers via software applications. “Usually, dedicated equipment is required for these functions,” says the IMT researcher. “For example, if you want to have a firewall, you need to buy a specific device from a vendor. With NFV, this is no longer necessary: you can implement the function on any server via an application.”
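
As a rough illustration of the NFV principle, the sketch below implements a firewall check as ordinary software that could run on any generic server. The packet structure, subnet and port rules are assumptions made for the example and are not taken from any real NFV platform.

```python
# Minimal sketch of a network function implemented in software rather than
# in a dedicated appliance. Packet format and rules are illustrative only.

from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_ip: str
    dst_port: int

BLOCKED_PORTS = {23, 445}        # e.g. telnet and SMB, blocked in this example
ALLOWED_PREFIX = "10.0."         # naive source-subnet check for the demo

def firewall(packet: Packet) -> bool:
    """Return True if the packet may be forwarded, False if it is dropped."""
    if packet.dst_port in BLOCKED_PORTS:
        return False
    if not packet.src_ip.startswith(ALLOWED_PREFIX):
        return False
    return True

print(firewall(Packet("10.0.1.5", "10.0.2.8", 443)))    # True: forwarded
print(firewall(Packet("192.168.1.5", "10.0.2.8", 80)))  # False: dropped
```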

Read more on I’MTech: What is NFV?

As with SDN, the arrival of virtualization in optical networks promotes better interoperability. This makes it harder for vendors to require the use of their proprietary systems linked to their equipment. The market is also changing, making more room for software developers. “But there is still a long way to go,” says Luiz Anet Neto. “Software providers can also try to make their customers dependent on their products, through closed systems. So operators have to remain vigilant and offer an increasing level of interoperability.”

Operators are working with the academic world precisely for this purpose. They would fully benefit from standardization, which would simplify the management of their optical networks. Laboratory tests carried out by IMT Atlantique in partnership with Orange provide them with technical information and areas to explore ahead of discussions with vendors and standardization bodies.

Sights are already set on 6G

For the research teams, there are many areas for development. First of all, the scientists are seeking to further demonstrate the value of their research, through testing focusing on a specific 5G service (up to now, the experiments have not applied to a specific application). Their aim is to establish recommendations for optical link dimensioning to connect mobile network equipment.

The goal is then to move towards smart optimization of optical networks. To provide an example of how findings by IMT Atlantique researchers may be applied, it is currently possible to add a “probe” that can determine if a path is overloaded and shift certain services to another link if necessary. The idea would then be to develop more in-depth mathematical modeling of the phenomena encountered, in order to automate incident resolution using artificial intelligence algorithms.
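
The sketch below gives a hedged idea of the kind of logic such a probe could feed: measured link loads are compared to a threshold and the services carried by an overloaded path are shifted to the least loaded alternative. The threshold, link names and decision rule are illustrative assumptions, not the researchers’ actual algorithm.

```python
# Toy rebalancing logic driven by a load "probe". All values are invented
# for illustration; a real system would rely on measurements and on more
# elaborate (possibly AI-based) decision rules, as described above.

LOAD_THRESHOLD = 0.8   # fraction of capacity considered "overloaded"

links = {"fronthaul-A": 0.92, "fronthaul-B": 0.35, "fronthaul-C": 0.60}
services_on = {"fronthaul-A": ["video-backhaul", "iot-aggregation"]}

def rebalance(links, services_on, threshold=LOAD_THRESHOLD):
    """List (service, overloaded link, target link) moves to relieve congestion."""
    moves = []
    for link, load in links.items():
        if load <= threshold:
            continue
        # pick the least loaded alternative link
        target = min((other for other in links if other != link), key=links.get)
        for service in services_on.get(link, []):
            moves.append((service, link, target))
    return moves

for service, src, dst in rebalance(links, services_on):
    print(f"shift {service}: {src} -> {dst}")
```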

And it is already time for researchers to look toward the future of technology. “Mobile networks are upgraded at a dizzying pace; new generations come out every ten years,” says Luiz Anet Neto. “So we already have to be thinking about how to meet future requirements for 6G!”

Bastien Contreras

IoT, Internet of Things

A standardized protocol to respond to the challenges of the IoT

The arrival of 5G has put the Internet of Things back in the spotlight, with the promise of an influx of connected objects in both the professional and private spheres. However, several obstacles remain before the projected revolution can take place. This is precisely what researchers at IMT Atlantique are working on, and they have already achieved results of global significance.

The Internet of Things (IoT) refers to the interconnection of various physical devices via the Internet for the purpose of sharing data. Sometimes referred to as the “Web 3.0”, this field is set to develop rapidly in the coming years, thanks to the arrival of new networks, such as 5G, and the proliferation of connected objects. Its applications are infinite: monitoring of health data, the connected home, autonomous cars, real-time and predictive maintenance on industrial devices, and more.

Although it is booming, the IoT still faces major challenges. “We need to respond to three main constraints: energy efficiency, interoperability and security,” explains Laurent Toutain, a researcher at IMT Atlantique. But there is one problem: these three aspects can be difficult to combine.

The three pillars of the IoT

First, energy is a key issue for the IoT. For most connected objects, the autonomy of a smartphone is not sufficient. In the future, a household may have several dozen such devices. If they each need to be recharged every two or three days, the user will have to devote several hours to this task. And what about factories that could be equipped with thousands of connected objects? In some cases, these are only of value if they have a long battery life. For example, a sensor could be used to monitor the presence of a fire extinguisher at its location and send an alert if it does not detect one. If you have to recharge its battery regularly, such an installation is no longer useful.

For a connected object, communication features account for the largest share of energy consumption. Thus, the development of IoT has been made possible by the implementation of networks, such as LoRa or Sigfox, allowing data to be sent while consuming little energy.

The second issue is interoperability, i.e. the ability of a product to work with other objects and systems, both current and future. Today, many manufacturers still rely on proprietary universes, which necessarily limits the functionalities offered by the IoT. Take the example of a user who has bought connected light bulbs from two different brands. They will not be able to control them via a single application.

Finally, the notion of security remains paramount within any connected system. This observation is all the more valid in the IoT, especially with applications involving the exchange of sensitive data, such as in the health sector. There are indeed many risks. An ill-intentioned user could intercept data during transmission, or send false information to connected objects, thus inducing wrong instructions, with potentially disastrous consequences.

Read more on I’MTech: The IoT needs dedicated security – now

On the Internet, methods are already in place to limit these threats. The most common is end-to-end data encryption. Its purpose is to make information unreadable while it is being transported, since the content can only be deciphered by the sender and receiver of the message.
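
As a minimal illustration of data being unreadable in transit, the sketch below uses symmetric encryption from the third-party Python cryptography library. A real end-to-end scheme would also need the sender and receiver to agree on keys without exposing them to intermediaries, which this toy example does not cover.

```python
# Symmetric-encryption sketch using the "cryptography" library
# (pip install cryptography). Only the holder of the key can read the
# sensor data; an eavesdropper on the network sees opaque ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # secret shared by sender and receiver
cipher = Fernet(key)

reading = b"temperature=21.5;device=fire-extinguisher-07"
token = cipher.encrypt(reading)      # what travels over the network
print(token)                         # unreadable without the key
print(cipher.decrypt(token))         # the receiver recovers the original data
```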

Three contradictory requirements?

Unfortunately, each of the three characteristics can influence the others. For example, by multiplying the number of possible interlocutors, interoperability raises more security issues. But it also affects energy consumption. “Today, the Internet is a model of interoperability,” explains Laurent Toutain. “For this, it is necessary to send a large amount of information each time, with a high degree of redundancy. It offers remarkable flexibility, but it also takes up a lot of space.” This is only a minor disadvantage for a broadband network, but not for the IoT, which is constrained in its energy consumption.

Similarly, if you want to have a secure system, there are two main possibilities. The first is to close it off from the rest of the ecosystem, in order to reduce risks, which radically limits interoperability.

The second is to implement security measures, such as end-to-end encryption, which results in more data being sent, and therefore increased energy consumption.

Reducing the amount of data sent, without compromising security

For about seven years, Laurent Toutain and his teams have been working to reconcile these different constraints in the context of the IoT. “The idea is to build on what makes the current Internet so successful and adapt it to constrained environments,” says the researcher. “We are therefore taking up the principles of the encryption methods and protocols used today, such as HTTP, while taking into account the specific requirements of the IoT.”

The research team has developed a compression mechanism named SCHC (Static Context Header Compression, pronounced “chic”). It aims to improve the efficiency of encryption solutions and provide interoperability in low-power networks.

For this purpose, SCHC works on the headers of the usual Internet protocols (IP, UDP and CoAP), which contain various details: source address, destination address, location of the data to be read, etc. The particularity of this method is that it takes advantage of the specificity of the IoT: a simple connected object, such as a sensor, has far fewer functions than a smartphone. It is therefore possible to anticipate the type of data sent. “We can thus free ourselves from the redundancy of classic exchanges on the web,” says Laurent Toutain. “We then lose flexibility, which could be inconvenient for standard Internet use, but not for a sensor, which is limited in its applications.”

In this way, the team at IMT Atlantique has achieved significant results. It has managed to reduce the size of the headers traditionally sent, weighing 70-80 bytes, to only 2 bytes, and to 10 bytes in their encrypted version. “A quantity that is perfectly acceptable for a connected object and compatible with network architectures that consume very little energy,” concludes the researcher.
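
To make the static-context idea concrete, here is a toy sketch: both ends share a rule describing the header fields a given sensor always uses, so only a small rule identifier travels over the air and the receiver rebuilds the full header locally. The rule contents and the one-byte encoding are simplifications for illustration; they do not reproduce the actual SCHC wire format standardized by the IETF.

```python
# Toy illustration of static-context header compression. Both ends hold the
# same rule table; only a rule identifier is transmitted. Field names, values
# and sizes are simplified and purely illustrative.

SHARED_CONTEXT = {
    1: {"src": "2001:db8::42", "dst": "2001:db8::1",
        "udp_dst_port": 5683, "coap_method": "POST"},
}

def compress(header: dict) -> bytes:
    for rule_id, fields in SHARED_CONTEXT.items():
        if all(header.get(k) == v for k, v in fields.items()):
            return bytes([rule_id])             # whole header collapses to 1 byte
    raise ValueError("no matching rule: send the header uncompressed")

def decompress(data: bytes) -> dict:
    return dict(SHARED_CONTEXT[data[0]])        # rebuild the full header locally

header = {"src": "2001:db8::42", "dst": "2001:db8::1",
          "udp_dst_port": 5683, "coap_method": "POST"}
on_air = compress(header)
print(f"{len(on_air)} byte on the air instead of the usual 70-80")
print(decompress(on_air) == header)             # True: nothing was lost
```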

A protocol approved by the IETF

But what about that precious interoperability? With this objective, the authors of the study approached the IETF (Internet Engineering Task Force), the international organization for Internet standards. The collaboration has paid off, as SCHC has been approved by the organization and now serves as the global standard for compression. This recognition is essential, but is only a first step towards effective interoperability. How can we now make sure that manufacturers really integrate the protocol into their connected objects? For this, Laurent Toutain has partnered with Alexander Pelov, also a researcher at IMT Atlantique, to found the start-up company Acklio. The company works directly with manufacturers and offers them solutions to integrate SCHC in their products. It thus intends to accelerate the widespread adoption of the protocol, an effort supported in particular by €2 million in funds raised at the end of 2019.

Read more on I’MTech: Acklio: linking connected objects to the Internet

Nevertheless, manufacturers still need to be convinced that the use of a standard is also in their interest. To this end, Acklio also aims to position SCHC among the protocols used within 5G. To achieve this, it will have to prove itself with the 3GPP (3rd Generation Partnership Project), which brings together the world’s leading telecommunications standards bodies. “A much more constraining process than that of the IETF,” warns Laurent Toutain.

Bastien Contreras

La Ruche à vélos, bicycle parking

La Ruche à Vélos is developing secure bicycle parking throughout France

Innovative and appropriate parking solutions must be created for the long-term development of cycling. The start-up La Ruche à Vélos, incubated at IMT Atlantique, offers an automated, secure and easy-to-use parking facility. This modular concept is connected to a mobile application and is intended for all users, with facilities acquired by local authorities. For this solution, La Ruche à Vélos won the 2020 Bercy-IMT Innovation Award on February 2nd.

In 2020, many French people got back on their bikes. In its annual report published last October, the Vélo & Territoires association reported an average increase in bicycle use of 9% between January and September 2020 (compared to 2019) [1]. In a year strongly marked by strikes and the health crisis, exceptional circumstances strongly supported this trend. The appeal of cycling shows no signs of slowing down. While local authorities support this trend, it also raises new issues in terms of security and parking. How many cyclists have already found their bike without a saddle, without a wheel, or perhaps not found their bike at all? To meet these challenges, the start-up La Ruche à Vélos, incubated at IMT Atlantique, proposes an innovative secure bicycle storage solution.

Automatic and secure parking

The increase in the number of cyclists is due in part to the emergence of electric bicycles. These bikes are heavier, bulkier and require a significant financial investment by their users. They therefore pose new constraints and require more security when parked. La Ruche à Vélos has developed a product that meets these expectations. Their solution consists of a secure bicycle parking facility connected to a mobile application. Its three founders paid particular attention to ease of use. “It takes between 20 and 30 seconds to drop off or pick up a bike,” says Antoine Cochou, co-creator of the start-up. But how does it work?

The application allows the user to geolocate a parking facility with available spaces and to reserve one in advance. After identifying themselves on site, cyclists have access to a chamber, and deposit their bike on a platform before validating. There are also compartments available allowing users to recharge their batteries. Inside the parking facility, a machine stores the bike automatically. The facility covers several floors, thus saving ground space and facilitating integration of the system into the urban landscape. It can hold about 50 bikes over 24 square meters, dividing the bicycle parking space otherwise required on sidewalks by four! In addition, the size of the parking facility is flexible. The number of spaces therefore varies according to the order.

In June 2021, a first prototype of about ten spaces will be installed in the city of Angers. The young innovators hope to collect enough feedback from users to improve their next product. Two more facilities are planned for the year. They will have 62 to 64 spaces. “Depending on the location, a balance must be struck between user waiting time and the demand for services. These two parameters are related to the number of sites and the flow of users at peak times (train station, shops, etc.),” says Antoine Cochou.

Strategic locations with adapted subscriptions

La Ruche à Vélos is aimed directly at local authorities who can integrate this solution into their mobility program. It also targets businesses and real estate developers wishing to offer an additional service to their employees or future residents. Depending on the needs, the parking facilities could therefore be installed in different strategic locations. “Local authorities are currently focusing on railway stations and city centers, but office or residential areas are also being considered,” says Antoine Cochou. Each zone has its own target and therefore its own form of subscription. In other words, one-off parking in the city, daytime offers for offices, and evening and weekend passes for residents.

Initially, subscriptions for the prototype installed in Angers will be managed by the start-up. However, future models are expected to couple parking passes with local public transit passes. Subscriptions will thus be taken care of by the cities. The start-up will focus on maintenance support. “In this sense, our next models will be equipped with cameras and it will be possible to control them remotely,” says Maël Beyssat, co-creator of La Ruche à Vélos. Communities will have a web interface to monitor the condition and operating status of the parking facility (rate of use, breakdowns, availability, etc.)

For the future, the company is considering the installation of solar panels to offer a zero-electricity consumption solution. Finally, other locations could be considered outside of popular touring sites on cycle routes.

[1] Result obtained with the help of sensors measuring the number of bikes going past.

By Anaïs Culot

e-VITA

e-VITA, a virtual coach for seniors

Virtual coaching can play a crucial role in maintaining healthy and active ageing through early detection of risks and intervention tailored to the individual needs of senior citizens. However, current technologies do not meet these requirements. Instead they offer limited interaction and are often intrusive. The 22 European and Japanese partners of the e-VITA project will develop a “multi-modal personal voice coach” to assist and safeguard elderly people at home. With a budget of €4 million funded by the European Union and an equivalent amount funded by the Japanese MIC (Ministry of Internal Affairs and Communications), the project began in January 2021 for a duration of 3 years. Interview with Jérôme Boudy, researcher at Télécom SudParis and project partner.

How did the European e-VITA project come about?

Jérôme Boudy – In a context of ageing populations, the idea of this project gradually took shape from 2016 onwards. Initially, there were ongoing projects such as EMPATHIC, of which Télécom SudParis is a partner, followed by a collaboration with Brazil, and finally the e-VITA (European-Japanese virtual coach for smart ageing) project with Japan, which aims to develop tools to ensure active and healthy ageing (AHA) through the early detection of the risks associated with old age. 

Read more on I’MTech: AI to assist the elderly

What is the goal of e-VITA?

JB – The aim is to keep the elderly at home in a secure environment. Founded on international cooperation between Europe and Japan, e-VITA offers an innovative approach to “virtual coaching” that addresses the crucial areas of active and healthy ageing: cognition, physical activity, mobility, mood, social interaction, leisure… enabling older people to better manage their own health and daily activities.

What method will be used?

JB – By taking into account different cultural factors in European countries and in Japan, in particular the acceptability of the interfaces preferred in these countries (smartphones, 3D holograms, social robots, etc.), e-VITA will develop an automatic multi-modal human-machine interface. Based on Natural Language Processing (NLP) and automatic spoken dialog management, it will also be equipped with several complementary non-verbal modalities such as recognition of a person’s gestures, emotions and situation.

This “virtual coach” will detect potential risks in the user’s daily environment and how these risks could be prevented by collecting data from external sources and non-intrusive sensors. It will provide individualized profiling and personalized recommendations based on big data analytics and socio-emotional informatics. Interoperability and data confidentiality will be guaranteed through FIWARE and a federated data AI platform.

What expertise will Télécom SudParis and IMT Atlantique researchers involved in e-VITA bring to the table?

JB – Researchers from IMT schools will mainly ensure the interoperability and processing of the data provided by the different sensors, as well as the automatic monitoring of emotions on the face. In addition, our two living labs – Experiment’HaaL for IMT Atlantique and Evident for Télécom SudParis –  will be made available to project partners. Finally, we will be in charge of the management of the “dissemination and exploitation” work package.

The project brings together a large number of partners. What are their roles in this project?

JB – The consortium brings together 12 partners in Europe and 10 in Japan, each with complementary roles. Siegen University (Germany) and Tohoku University (Japan) are coordinating the project for Europe and for Japan respectively. The consortium brings together three major groups: end users responsible for needs specification and field assessment, such as APHP (France), AGE Platform Europe (Belgium), IRNCA (Italy), Caritas Germany, NCGG and IGOU (Japan); academic and research organizations specializing in AI algorithms (automatic learning, fusion, expression recognition, etc.), with Fraunhofer and INFAI (Germany), UNIVPM (Italy), Tohoku University, AIST and Waseda University (Japan) alongside the IMT schools; and lastly, industrial partners in charge of technical definition and process integration, mainly SMEs – IXP (Germany), Ingegneria Informatica (Italy), Delta Dore (France), Gatebox and NEU (Japan) – and a single large group, Misawa (Japan).

What are the expected results?

JB – The creation of a “multi-modal personal voice coach” whose job is to assist, accompany and safeguard the elderly at home, and the operation of this system through several physical interfaces (smartphones, robots, etc.), thanks to the integration of start-up incubators in our living labs and structures.

The coaching system will be installed in the living environments of healthy elderly people in France, Germany, Italy and Japan to evaluate its feasibility and effectiveness. The results of the e-VITA project also include new standards and policies beyond technology, and will therefore be explored and transferred across Europe, Japan and worldwide.

What are the next big steps for the e-VITA project?

JB – The next step is the phase of specifying user needs according to cultural factors, and defining the architecture of the targeted system, which requires the organization of several workshops.

Find out more about e-VITA

Interview by Véronique Charlet

In search of a future for fast neutron reactors

In August 2019, it was announced that the Astrid project for sodium fast reactors (SFR) was to be abandoned. In late 2020, Stéphanie Tillement, a researcher at IMT Atlantique, analyzed the rationale behind this abandonment in an article for I’MTech. But what is the global situation? Does this technology still have a future? Stéphanie Tillement and her colleague Frédéric Garcias analyze the prospects for this industry.

In 2000, fast neutron reactors returned to center stage, after years of being forgotten. The United States Department of Energy (DOE) organized a very important event for the nuclear industry at the global level, the “Generation IV International Forum” (GIF). This forum sought to help the nuclear industry recover by kick-starting research and innovation based on what were described as “revolutionary” reactors, which had to fulfill a number of very general objectives: be safer, be more cost-effective, reduce the risks of proliferation, save natural resources and minimize waste. And sodium fast reactors (SFR) fulfill these five criteria.

It was during this forum that this notion of generations of nuclear reactors was first defined. Those currently in operation in France – all of which are Pressurized Water Reactors (PWR) – are referred to as generation II. The European Pressurized Reactor (EPR) being built in Flamanville is referred to as generation III – as are those being built in England and in Finland and the two EPRs in operation in Taishan in China. Generation IV reactors refer to reactors that are able to fulfill the previously-mentioned objectives. The members of the Generation IV forum agreed on six reactor concepts referred to as generation IV, three of which are fast neutron reactors. Among them, one is lead-cooled, another is gas-cooled, and the third is sodium-cooled (SFR), like the Astrid prototype introduced by France.

Saving uranium

“We have to put ourselves in the context of the 2000s,” says Frédéric Garcias, a researcher in organizational management at the University of Lille. “The nuclear industry was going through a lull in the construction of new reactors, in particular in the wake of the Chernobyl accident, but many believed that it remained a solution for the future. In what form, and within what timeframe? Growth was anticipated in China and in emerging countries, which could give rise to a high level of uranium consumption. Thus the interest in seeking uranium-efficient sectors.” Fast neutron reactors are able to consume depleted uranium and plutonium, which are waste products of previous generations of reactors.

Today, Russia and China are at the forefront of the SFR sector. Russia has two in operation (BN-600 and BN-800, with 560 and 820 megawatts of electrical output respectively) that are based on an earlier design and are not considered generation IV. China started out in 2011 with an experimental generation IV low-power sodium-cooled fast reactor (20 megawatts of electrical output). France, which had also been at the forefront, has fallen behind since the Astrid project was abandoned in 2019. As for the United States, after kick-starting research efforts through the Generation IV Forum, it abandoned nuclear energy for a while, enticed by the prospects of shale gas and oil. But the prospect of a controllable, zero-carbon energy offered by nuclear power has led the country to turn its attention to this industry once again. While many start-ups are working in this area, there are no plans to build reactors at this time.

Towards small, modular reactors

If nuclear energy is struggling to revive itself, especially in countries like France and the United States, it may be that the prevailing approach to development throughout the second half of the 20th century is now outdated. It focused on building ever-bigger, more powerful, more complex reactors. This meant that they were more expensive and harder to fund. This race to build giant reactors may have reached its limits with the EPR. “Big reactors are, or were, developed in highly-centralized countries like Russia or China, or like France was in some ways in the 1980s,” observes Stéphanie Tillement, a researcher in industrial sociology at IMT Atlantique. “It’s clear that historically, nuclear energy has been less successful in decentralized countries. With the rise of decentralized methods of governance, stakeholders wondered, ‘Why not consider smaller, modular reactors built to respond to needs – in short, decentralized?’” This is precisely the principle of Small Modular Reactors (SMR), which are anywhere from three to one hundred times less powerful than Generation III reactors. They do not have the same business model or organization: smaller reactors require less funding, and seem to be easier to deploy when there is less long-term visibility. SMR concepts are extremely varied: some are inspired by tried-and-tested technologies while others are more “exotic.” They all use fission, and can be pressurized water reactors (like generations II and III) or fast neutron reactors.

Read more on I’MTech: Nuclear: A Multitude of Scenarios to Help Imagine the Future of the Industry

Will we see a resurgence of nuclear power in France, whether through SMR or otherwise? “So far, the French government has been quiet on these issues,” says Stéphanie Tillement. “Neither the decision to launch (and then stop) a project to build a reactor like Astrid nor the launch of new projects has been debated or voted on in Parliament. There is no real industrial strategy for nuclear power in the multiannual energy program, a tool designed to steer France’s energy policy.” Clearly, no government, current or previous, seems to want to discuss these issues, as they are deemed to be too risky from a political viewpoint. And yet, nuclear energy is built over time, requiring long-term strategies and public investment. Without this, there is a risk that there will be a massive loss of skills, which will have definite consequences on the industry.

Loss of skills

“The French project to build an EPR was started in 2007, after ten years without any construction in France,” explains Frédéric Garcias. “The longer we go between projects, the more skills we lose, along with the entire industrial fabric. Industrial capacity requires upkeep. But politicians are unaware of this question of skills – they think that we can stop for twenty years and then just flip a switch to start up again.” Moreover, when there are few construction projects, and few prospects, the nuclear industry becomes less attractive, which impacts recruitment.

Politicians and the nuclear industry do not operate on the same timeframe. A presidential term lasts five years, while nuclear power takes decades to develop. Could SMRs be an answer to this short-term vision? “We don’t yet know the answer to this question,” say the two researchers. SMRs would certainly be better suited to a more volatile, less centralized world, with more participatory democracy. But we would also lose some of the advantages of the sector, such as its small physical footprint. And there would still be safety issues. France currently counts 18 nuclear power plants (56 reactors) – far more SMRs would be needed to produce the same amount of energy. This is unlikely to gain wide public acceptance!

Learning to innovate again

The nuclear industry is struggling, partly because of the difficulty of planning for the future as it awaits government decisions that never come. Perhaps it should also reconsider the way it works. “The abandonment of the Astrid project raises questions about opportunities for innovation in the French nuclear sector,” says Stéphanie Tillement. “The French nuclear sector depends on a very small group of players, primarily EDF/Framatome for operations and design, the French Atomic and Alternative Energy Commission (CEA) and regulators — the Nuclear Safety Authority (ASN) with technical support from the French Institute for Radiation Protection and Nuclear Safety (IRSN). This provides stability, but also a certain inertia. The system has a hard time innovating and breaking away from pre-established models and modes of operation. This was seen in the work on Generation IV: only CNRS dared to develop a concept that was truly a radical breakthrough from previously-developed technologies, a concept for a molten salt reactor, which had never been built in France.” A debate has therefore been set in motion within the nuclear sector: is it still capable of innovating and changing?

By Cécile Michaut.

Power-to-gas, when hydrogen electrifies research

Hydrogen is presented as an energy vector of the future. In a power-to-gas system, it serves as an intermediary for the transformation of electricity into synthetic methane. The success of this energy solution depends heavily on its production cost, so IMT researchers are working on optimizing the different processes for a more competitive power-to-gas solution.

Increasing production of renewable energy and reducing greenhouse gas emissions. What if the solution to these two ambitions were to come from a single technology: power-to-gas? In other words, the conversion of electricity into gas. But why? This method allows the storage of surplus electricity produced by intermittent renewable sources that cannot be injected into the grid. The energy is then used to produce hydrogen by electrolysis of water. The gas can then be consumed on site, stored, or used to power hydrogen vehicles. But these applications are still limited. This is why researchers are looking at transforming it into other useful products such as methane (CH4) to supply natural gas networks. What is the potential of this technology?

Costly hydrogen?

“The main issue with the development of power-to-gas today is its cost,” says Rodrigo Rivera Tinoco, a researcher in energy systems modeling at Mines ParisTech. “If we take into account the cost of producing hydrogen using a low-temperature electrolyzer (PEM), the technology envisaged in power-to-gas installations, a 1 GW hydrogen reactor (almost the power equivalent of a nuclear reactor) would today cost €3 billion.” In September, the French government allocated a budget of €7 billion in aid for the development of the national hydrogen industry. A reduction in the production cost of this gas is therefore necessary. All the more so since power-to-gas technologies are destined to compete with other energy modes on the market.

France wants to reach a cost of €50 per megawatt-hour in 2030. However, a low-cost but short-lasting technology would not be suitable. “To be cost-effective, systems must have a minimum 60,000 to 90,000 hour operating guarantee,” adds Rodrigo Rivera Tinoco. Currently, low-temperature electrolyzers (PEMs) have an operating life of between 30,000 and 40,000 hours. This is where research comes in. The objective is to optimize the energy efficiency of low-cost technology.
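
A back-of-the-envelope sketch shows why lifetime matters so much. It uses only the figures quoted above (a €3 billion, 1 GW electrolyzer and operating lifetimes between 30,000 and 90,000 hours) and spreads the capital cost over the electricity processed, ignoring electricity prices, maintenance and conversion losses, so it is purely illustrative.

```python
# Illustrative capital-cost spread for a 1 GW PEM electrolyzer costing about
# €3 billion (figures quoted in the article). Electricity price, maintenance
# and conversion efficiency are deliberately left out.

CAPEX_EUR = 3e9       # €3 billion for 1 GW of electrolysis
POWER_MW = 1000       # 1 GW expressed in MW

def capex_per_mwh(lifetime_hours: float) -> float:
    """Capital cost per MWh of electricity processed over the whole lifetime."""
    return CAPEX_EUR / (POWER_MW * lifetime_hours)

for hours in (30_000, 40_000, 60_000, 90_000):
    print(f"{hours:>6} h  ->  {capex_per_mwh(hours):5.1f} €/MWh from capex alone")
# 30,000 h gives 100 €/MWh and 90,000 h about 33 €/MWh, which is why lifetimes
# of 60,000-90,000 hours are needed to get anywhere near the €50/MWh target.
```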

Which technology for which use?

Digital modeling enables the identification of the strengths and weaknesses of technologies prior to their installation. “We carry out technical and economic studies on the various water electrolysis processes in order to increase their efficiency and reduce their cost,” says Chakib Bouallou, an expert in digital modeling and energy storage at Mines ParisTech. Several technologies exist, but which one is the most suitable for storing renewable energy? On an industrial scale, low-temperature electrolyzers are mature and seem to respond to the intermittent nature of these energy sources.

However, in the evaluation phase, no technology is being ruled out. As part of the ANR MCEC project and in collaboration with Chimie ParisTech, Chakib Bouallou’s team is currently working on a solution based on molten carbonates that relies on the co-electrolysis of water and CO2. “Using performance curves of materials as a function of the current, we estimate the efficiency of the systems under different usage scenarios. The overall analysis of this technology will then be compared to other existing techniques,” says the researcher. Indeed, the adaptability of a system will depend above all on the intended use. To complete these studies, however, experiments are essential.

Minerve: a demonstrator for research purposes

In order to gain the knowledge needed to make the transition to power-to-gas, the Minerve demonstrator was installed in 2018 on the Chantrerie campus north of Nantes. “The platform is first and foremost a research tool that meets the needs of experimentation and data collection. The results are intended to help develop simulation models for power-to-gas technologies,” explains Pascaline Pré, a process engineering researcher at IMT Atlantique. Equipped with solar panels and a wind turbine, Minerve also has an electrolyzer dedicated to the production of hydrogen, which is then converted into methane using CO2 from cylinders. This methane is redistributed to a fuel distribution station for natural gas vehicles (CNG) and used for mobility. The next step is to integrate CO2 capture technology from the combustion fumes of the site’s heating network boilers to replace the cylinders.

Carbon dioxide is very stable in the air. Turning it into useful products is therefore difficult. Pascaline Pré’s team is developing a new process to capture this gas by absorption using a solvent. The gas collected is purified, dried, compressed and sent to the methane plant. However, some hurdles need to be overcome in order to optimize this approach: “Solvent regeneration consumes a lot of heat. It would be possible to improve the energy efficiency of the device by developing an electrified microwave heating system,” explains the researcher. This concept would also reduce the size of the facilities needed for this process for a future industrial installation.

In the long term, Minerve should also serve as a basis for the study of many issues in the Carnot HyTrend project, which brings together part of the French scientific community to look at hydrogen. Within three years, initial recommendations on the different technologies (electrolysis, methanation, CO2 capture, etc.) will be published to improve the existing situation, as well as studies on the risks and environmental impacts of power-to-gas.

What about power-to-gas-to-power?

It is possible to go beyond current power-to-gas techniques by adding an oxycombustion step. As part of the ANR project FluidStory, Chakib Bouallou’s team focused on modeling a device based on three advanced technologies: low-temperature PEM electrolysis, methanation (allowing the storage of electricity in the form of gas) and oxycombustion power plants for the destocking stages. The first two steps are therefore the same as in a classical power-to-gas infrastructure as mentioned above. The difference here is that the oxygen and CH4, obtained respectively by electrolysis of water and by methanation, are stored in underground caverns for an indefinite period of time. Thus, when the price of electricity rises, the oxy-fuel combustion process reuses these gases to produce electricity. The CO2 also emitted during this reaction is reused by the methanation process in the next cycle.

This closed-cycle design therefore allows autonomous operation with regard to the required reagents, which is not possible in conventional power-to-gas setups. However, analyses aimed at better understanding its mechanics and the nature of the interactions between its components have yet to be conducted.
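
The closed loop can be checked with simple stoichiometry. The sketch below sums the species balances of the three steps (water electrolysis, Sabatier methanation and methane oxycombustion, all standard reactions) and shows that every reagent is regenerated over a full cycle; real installations would of course have losses, which this toy balance ignores.

```python
# Net species balance over one idealized cycle (negative = consumed,
# positive = produced). The electrolysis step is scaled to supply the
# 4 H2 that the methanation step needs.
from collections import Counter

electrolysis  = Counter({"H2O": -4, "H2": +4, "O2": +2})              # 4 H2O -> 4 H2 + 2 O2
methanation   = Counter({"CO2": -1, "H2": -4, "CH4": +1, "H2O": +2})  # Sabatier reaction
oxycombustion = Counter({"CH4": -1, "O2": -2, "CO2": +1, "H2O": +2})  # CH4 + 2 O2 -> CO2 + 2 H2O

net = Counter()
for step in (electrolysis, methanation, oxycombustion):
    net.update(step)

print({species: n for species, n in net.items() if n != 0})   # {} -> everything is recycled
```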

Looking towards power-to-X

The methanation at the heart of the processes mentioned so far is only one example of the transformation of hydrogen in contact with CO2. Indeed, these reactions, called hydrogenation reactions, are used to synthesize many chemicals usually obtained from fossil resources. At IMT Mines Albi, Doan Pham Minh’s team is working on the optimization of these processes. As well as methane production, researchers are targeting the synthesis of liquid biofuels, methanol, ethanol and other carbon-based chemicals. All these “X” compounds are therefore obtained from hydrogen and CO2. Two factors determine the nature of the result: the operating conditions (temperature, pressure, residence time, etc.) and the catalyst used. “This is what drives the reaction to a target product. Thus, by developing active, selective and stable catalytic materials, we will improve yields in synthesizing the desired product,” the researcher explains.
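
To illustrate how the same two reagents lead to different “X” products, the short sketch below checks the atom balance of three standard hydrogenation routes (methane, methanol, ethanol). The list of routes is illustrative; as the researcher explains, the catalyst and operating conditions determine which product is actually favored, something a simple balance cannot capture.

```python
# Atom-balance check for three standard CO2 hydrogenation routes.
# Atom counts per molecule are hardcoded for readability.

ATOMS = {"CO2": {"C": 1, "O": 2}, "H2": {"H": 2}, "H2O": {"H": 2, "O": 1},
         "CH4": {"C": 1, "H": 4}, "CH3OH": {"C": 1, "H": 4, "O": 1},
         "C2H5OH": {"C": 2, "H": 6, "O": 1}}

REACTIONS = {   # product name: (reagents, products), values in moles
    "methane":  ({"CO2": 1, "H2": 4}, {"CH4": 1, "H2O": 2}),
    "methanol": ({"CO2": 1, "H2": 3}, {"CH3OH": 1, "H2O": 1}),
    "ethanol":  ({"CO2": 2, "H2": 6}, {"C2H5OH": 1, "H2O": 3}),
}

def atom_count(side: dict) -> dict:
    total = {}
    for species, moles in side.items():
        for atom, n in ATOMS[species].items():
            total[atom] = total.get(atom, 0) + moles * n
    return total

for name, (reagents, products) in REACTIONS.items():
    assert atom_count(reagents) == atom_count(products), name
    print(f"{name:<9} balanced, atoms involved: {atom_count(reagents)}")
```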

Methanol is of particular interest to industries. Indeed, this compound is everywhere around us and is used in particular for the surface materials of furniture, in paints, plastics for cars, etc. The same is true for ethanol, biofuels or chemical intermediates of renewable origin. Beyond the role of hydrogen in the national energy mix, the researcher therefore insists on its use by other high-consumption sectors: “It is widely used by the chemical industry and we must be ready to develop competitive and high-performance processes by anticipating future uses of hydrogen and power-to-X.”

By Anaïs Culot

Frédéric Boyer

IMT Atlantique | #biorobotics #incarnatedintelligence #electricalsense
