
Astrid: a nuclear project goes up in smoke

The abandonment of the Astrid project marks a turning point for France’s nuclear industry. The planned nuclear reactor was supposed to be “safer, more efficient and more sustainable”, an ambition that required significant funding. Stéphanie Tillement, a researcher at IMT Atlantique, has studied how Fukushima impacted the nuclear industry. Her work has focused in particular on the rationale for abandoning the Astrid project, taking into account the complicated history of nuclear energy and how it has evolved in the public and political spheres.

 

Since the early days of nuclear energy, France has positioned itself as a global leader in terms of both research and energy production. In this respect, the abandonment of the Astrid project in August 2019 marked a move away from this leading position. Astrid (Advanced Sodium Technological Reactor for Industrial Demonstration) was supposed to be France’s first industrial demonstrator for what are referred to as “4th-generation” reactors. The selected technology was the sodium-cooled fast neutron reactor (FNR). At present, nuclear power in France is supplied by 58 second-generation pressurized water reactors, which operate with “slowed-down” neutrons. As an FNR, Astrid held the promise of more sustainable energy – it was supposed to be able to use depleted uranium and plutonium resulting from the operation of current plants as a fuel source, meaning it would consume much less natural uranium.

As part of the AGORAS research project, IMT Atlantique researcher Stéphanie Tillement studied the impact of the Fukushima accident on the world of nuclear energy. This led her to study the Astrid project, and in particular the many challenges it encountered. “We ruled out the link with Fukushima early on,” says the researcher. “The problems Astrid ran into are not related to a paradigm shift as a result of the catastrophe. The reasons it was abandoned are endogenous to the industry and its history.” And financial reasons, though by no means negligible, are not enough to explain why the project was abandoned.

A tumultuous history

In the 2000s, the United States Department of Energy launched the Generation IV International Forum to develop international cooperation for new concepts for nuclear reactors. Out of the six concepts selected by this forum as the most promising, France focused on sodium-cooled reactors, a project which would be launched in 2010 under the name Astrid. The country preferred this concept in particular because three French reactors using the technology had already been built. However, none of them had been used on an industrial scale and the technology had not advanced beyond the prototyping stage. The first such reactor, Rapsodie, was dedicated purely to research. The second, Phénix, was an intermediary step – it produced energy but remained an experimental reactor, far from an industrial scale. It was the third such reactor, Superphénix, that was to serve as the first in a series of industrial-scale French reactors. But from the beginning, it experienced shut-down periods following several incidents, and in 1997 Prime Minister Lionel Jospin announced that it would be shut down once and for all.

“This decision was widely criticized by the nuclear industry,” says Stéphanie Tillement, “which accused him of acting for the wrong reasons.” During the election campaign, Lionel Jospin had aligned himself with the Green party, which was openly in favor of decommissioning the power plant. “Its sudden shutdown was taken very badly and destroyed all hope for the use of such technology on an industrial scale. Superphénix was supposed to be the first in a long line, and some remember it as ‘a cathedral in a desert.'” This also reflected public opinion on nuclear energy: the industry was facing growing mistrust and opposition.

“For a lot of stakeholders in the nuclear industry, in particular the CEA (The French Atomic and Alternative Energy Commission), Astrid gave hope to the idea of reviving this highly promising technology,” explains the researcher. One of the biggest advantages was the possibility of a closed nuclear cycle, which would make it possible to recycle nuclear material from current power plants – such as plutonium – to use as a fuel source in the reactors. “In this respect, the discontinuation of the Astrid project may in the long run call into question the very existence of the La Hague reprocessing plant,” she says. This plant processes used fuel, a portion of which (plutonium in particular) is reused in reactors, in the form of MOX fuel. “Without reactors that can use reprocessed materials effectively, it’s difficult to justify its existence.”

Read more on I’MTech: MOx strategy and the future of French nuclear plants

“From the beginning, our interviews showed that it was difficult for the Astrid stakeholders to define the status of the project precisely,” explains Stéphanie Tillement. The concept proposed when applying for funding was that of an industrial demonstrator. The goal was therefore to build a reactor within a relatively short period of time, which could produce energy on a large scale based on technology for which there was already a significant amount of operating experience. But the CEA also saw Astrid as a research project, to improve the technology and develop new design options. This would require far more time. “As the project advanced,” adds the researcher, “the CEA increasingly focused on a research and development approach. The concept moved away from previous reactors and its development was delayed. When they had to present the roadmap in 2018, the project was at a ‘basic design’ stage and still needed a lot of work, as far as design was concerned, but also in terms of demonstrating compliance with nuclear safety requirements.”

An abandoned or postponed project?

Stéphanie Tillement confirms that “the Astrid project, as initially presented, has been permanently abandoned.” Work on the sodium technology is expected to be continued, but the construction of a potential demonstrator of this technology will be postponed until the second half of the 21st century. “It’s a short-sighted decision,” she insists. Uranium, which is used to operate reactors, is currently inexpensive, so there is no need to turn to more sustainable resources – at least not yet. But abandoning the Astrid project means running the risk of losing the expertise acquired for this technology. Though some research may be continued, it will not be enough to maintain industrial expertise in developing new reactors, and the knowledge in this sector could be lost. “The process of regaining lost knowledge,” she says, “is ultimately as expensive as starting from scratch.”

A short-term decision, then, with France relying instead on the EPR, a 3rd-generation reactor. But the construction of this type of reactor in Flamanville also faces its own set of hurdles. According to Stéphanie Tillement, “the challenges the Astrid project encountered are similar to those of the EPR project.” To secure funding for such projects, nuclear industry stakeholders seek to align themselves with the short timeframes of the political world. Yet these short deadlines are ultimately unrealistic and inconsistent with the timeframes for developing nuclear technology, all the more so when it’s a matter of the first of a series. This creates problems for nuclear projects – they fall behind schedule and their costs rise dramatically. In the end, this makes politicians rather wary of funding this sort of project. “So nuclear energy gets stuck in this vicious circle,” says the researcher, “in a world that’s increasingly unfavorable to this sector.”

This decision also aligns with the government’s energy strategy. In broad terms, the State has announced that nuclear energy will be reduced to 50% of France’s energy mix, in favor of renewable energies. “The problem,” says Stéphanie Tillement, “is that we only have an outline. If there’s a political strategy on nuclear issues, it remains unclear. And there’s no long-term position – this is a way of leaving the decision to future decision-makers. But making no decision is a decision. Choosing not to pursue the development of technologies which require a long time to develop may implicitly mean abandoning the idea of any such development in the future. Which leads some to consider, rather cynically, that politicians must think that when we need the technology, we’ll buy it from other powers (China, Russia) who have already developed it.”


A dictionary for connected devices

The field of connected devices is growing at a staggering pace across all industries. There is a growing need to develop a communication standard, meaning a ‘common language’ that different smart systems could understand and interpret. To contribute to this goal, ETSI (European Telecommunications Standards Institute) is funding a European project in which Mines Saint-Étienne researchers Maxime Lefrançois and Antoine Zimmermann[1] are taking part.

 

In order to work together, connected devices must be able to communicate with one another. This characteristic, known as ‘semantic interoperability,’ is one of the key challenges of the digital transition. To be effective, semantic interoperability must be based on the adoption of an agreed-upon set of best practices. This would culminate in the creation of a standard adopted by the IoT community. At the European level, ETSI (European Telecommunications Standards Institute) is in charge of setting standards for information and communication technologies. “For example, ETSI standardized the SIM card, which acts as an identifier in mobile phone networks to this day,” explains Maxime Lefrançois. He and his colleague Antoine Zimmermann are researchers at Mines Saint-Étienne and specialize in the semantic web and knowledge representation. They are taking part in the STF 578 project on the interoperability of connected devices funded by ETSI, in partnership with two researchers from Universidad Politécnica de Madrid.

“Instead of proposing a standard that strictly defines the content of communications between connected devices, we define and formally identify the concepts involved, through what is known as an ontology,” says Antoine Zimmermann. This provides IoT players with greater flexibility since the content of messages exchanged may use the language and format best suited to the device, as long as an explicit link is made with the concept identified in the reference ontology. The two researchers are working on the SAREF reference ontology (Smart Applications Reference Ontology), a set of ETSI specifications which includes a generic base and specializations for the various sectors related to the IoT: energy, environment, building, agriculture, smart cities, industry and manufacturing, water, automotive, e-health, wearables.

“The SAREF standard describes smart devices, their functions and the services they provide, as well as the various properties of the physical systems these devices can control,” explains Maxime Lefrançois. For example, a light bulb can say, “I can provide light,” by using a concept defined by SAREF. A system or application may then refer to the same lighting concept to tell the object to turn on. “Ultimately, this knowledge should be described following the same standard models within each industry to facilitate harmonization between industries,” adds the researcher. The aim of the project is therefore to develop a public web portal for the standard SAREF ontology, to facilitate its adoption by companies and collect their feedback and suggestions for improvement.
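
To make this concrete, here is a minimal sketch, in Python with the rdflib library, of what such a machine-readable device description could look like. The SAREF core namespace is the one published by ETSI; the example device namespace and the choice of an on/off function for the bulb are our illustrative assumptions, not an official SAREF example.

```python
# A hypothetical SAREF-style description of a light bulb (illustrative only).
from rdflib import Graph, Namespace, RDF

SAREF = Namespace("https://saref.etsi.org/core/")   # ETSI SAREF core
EX = Namespace("http://example.org/devices/")       # hypothetical namespace

g = Graph()
g.bind("saref", SAREF)
g.bind("ex", EX)

# "I am a device and I expose an on/off function": any application that
# understands the SAREF concepts can interpret this, whatever the vendor.
g.add((EX.bulb1, RDF.type, SAREF.Device))
g.add((EX.bulb1, SAREF.hasFunction, EX.bulb1_switch))
g.add((EX.bulb1_switch, RDF.type, SAREF.OnOffFunction))

print(g.serialize(format="turtle"))
```

A lighting application can then look for any resource typed as a SAREF device whose function it understands, without knowing anything about the manufacturer’s own message format.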

A specially designed ‘dictionary’

“The SAREF public web portal is a little bit like a ‘dictionary’ for connected devices,” explains Maxime Lefrançois. “If we take the example of a water heater that can measure energy consumption and can be remotely controlled, SAREF will describe its possible actions, the services it can provide, and how it can be used to lower energy costs or improve household comfort.” But his colleague Antoine Zimmermann explains, “It isn’t a dictionary in the traditional sense. SAREF specifies in particular the technical and IT-related constraints we may encounter when communicating with the water heater.”

Imagine if one day all water heaters and heat pumps were connected to the IoT and could be remotely controlled. They could then theoretically be used as an energy resource that could ensure the stability and energy efficiency of the country’s electricity grid. If, in addition, there was a uniform way to describe and communicate with these devices, companies in the smart building and energy sectors would waste less time individually integrating products made by different manufacturers. They could then focus instead on developing innovative services connected to their core business, giving them a competitive advantage. “The goal of semantic interoperability is to develop a service for a certain type of smart equipment, and then reuse this service for all similar types of equipment,” says Maxime Lefrançois. “That’s the heart of SAREF”.

Read more on I’MTech: How the SEAS project is redefining the energy market

At present, the existing standards are compartmentalized by sector. The energy industry has standards for describing and communicating with the electrical equipment of a water tower, but the water tower must then implement different standards to interface with other equipment in the water distribution network. “There are several different consortia for each sector,” explain the researchers, “but we now have to bridge the gap between these consortia, in order to harmonize their standards.” Thus the need for a ‘dictionary,’ a common vocabulary that can be used by connected devices in all industries.

Take the example of automotive manufacturers who are developing new batteries for electric vehicles. Such batteries could theoretically be used by energy suppliers to regulate the voltage and frequency of the electricity grid. “The automotive and energy industries are two sectors that had absolutely no need to communicate until now,” says Maxime Lefrançois. “In the future, they may have to work together to develop a common language, and SAREF could be the solution.”

A multilingual ‘dictionary’

The IoT community is currently engaged in something of a ‘standards war’ in which everyone is developing their own specification and hoping that it will become the standard. Impetus from public authorities is therefore needed to channel the existing initiatives — SAREF at the European level. “We can well imagine that in the future, there will only be a single, shared vocabulary for everyone,” says Antoine Zimmermann. “But we may find ourselves with different vocabularies being developed at the same time, which then remain. That would be problematic. This is how it is today, for example, with electrical outlets. A machine intended to be used in the United States will not work with European outlets and vice versa.”

“The development of the SAREF public web portal is an important step since it encourages companies to take part in creating this dictionary,” adds Maxime Lefrançois. The more companies are involved in the project, the more comprehensive and competitive it will be. “The value of a standard is related to the size of the community that adopts it,” he says.

“The semantic web is particularly useful in this respect,” says Antoine Zimmermann, “it allows everyone to agree. Companies are all engaged in digital transformation and use the web as a common platform to get in touch with clients and partners. They use the same protocols. We think the semantic web is also a good way to build these common vocabularies that will work in various sectors. We aren’t looking for the right solution, but to demonstrate best practices and make them more widespread so that companies look beyond their own community.” 

A collaborative ‘dictionary’

The researchers’ work also involves developing a methodology for building this standard: a company must be able to suggest a new addition to the vocabulary that is highly specific to a certain field, while ensuring that this contribution aligns with the standard models and best practices that have been established for the entire ‘dictionary.’

“And that’s the tricky part,” says Maxime Lefrançois. How can the SAREF public portal be improved and updated to make sure that companies use it? “We know how to write ‘dictionaries’ but supporting companies is no simple task.” Because there are a number of constraints involved: all these different vocabularies and jargons must be assimilated, and companies may not necessarily be familiar with them.

“So we have to reinvent collaborative support methods for this dictionary. That’s where DevOps approaches implemented for software development are useful,” he says. These approaches make it possible to automatically check the suggestions based on a set of quality criteria, then automatically make a new version of the portal available online if the criteria are fulfilled. “The goal is to shorten SAREF development cycles while maintaining an optimal level of quality,” concludes the researcher.
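
As a rough sketch of what such an automated quality gate could look like (the tooling here, rdflib plus pySHACL, is our assumption rather than the project’s documented stack), a contributed Turtle file can be validated against a set of SHACL quality shapes, with a failing report blocking the automated release of a new portal version:

```python
# Hypothetical CI check for a contributed ontology file (sketch only).
import sys
from rdflib import Graph
from pyshacl import validate

def check_contribution(ttl_path: str, shapes_path: str) -> bool:
    data = Graph().parse(ttl_path, format="turtle")       # the suggestion
    shapes = Graph().parse(shapes_path, format="turtle")  # quality criteria
    conforms, _, report = validate(data, shacl_graph=shapes)
    if not conforms:
        print(report)          # feedback returned to the contributor
    return conforms

if __name__ == "__main__":
    ok = check_contribution(sys.argv[1], sys.argv[2])
    sys.exit(0 if ok else 1)   # a non-zero exit blocks the release
```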

There are other hurdles to overcome to get the connected devices themselves to ‘speak SAREF,’ due to the specific limitations of connected devices – limited storage and computing capacity, low battery life, limited bandwidth, intermittent connectivity. The use of ontologies for communication and ‘reasoning’ was first thought up without these constraints, and must be reinvented for these types of ‘edge computing’ configurations. These issues will be explored in the upcoming ANR CoSWoT project (Constrained Semantic Web of Things), which will include researchers from LIRIS, Mines Saint-Étienne, INRAE (merger of INRA and IRSTEA), Université Jean-Monnet and the company Mondeca.

 

[1] Maxime Lefrançois and Antoine Zimmermann are researchers at the Hubert Curien Laboratory, a joint research unit between CNRS/Mines Saint-Étienne/Université Jean Monnet.

Being Human with Algorithms: Marc-Oliver Pahl meets Raimund Seidel

Marc-Oliver Pahl is a researcher in cybersecurity at IMT Atlantique. In 2018, he launched “Being Human with Algorithms”, a series of video interviews between technicians and non-technicians around the topic of digital transformation. Through open discussions and dialogues, he depicts how digital technologies are perceived and how they affect humans as citizens, consumers, workers…

In this episode, Marc-Oliver meets with Raimund Seidel, Director of the Schloss Dagstuhl – Leibniz Center for Informatics.


Understanding the resilience of the immune system through mathematical modeling

Gaining insight into how the immune system works using mathematics is the ultimate goal of the research carried out by IMT Atlantique researcher Dominique Pastor, along with his team. Although the study involves a great degree of abstraction, the scientists never lose sight of practical applications, and not only in relation to biology.

 

In many industries, the notion of “resilience” is a key issue, even though there is no clear consensus on the definition of the term. From the Latin verb meaning “to rebound,” the term does not refer to exactly the same thing as resistance or robustness. A resilient system is not unaffected by external events, but it is able to fulfill its function, even in a degraded mode, in a hostile environment. For example, in computer science, resilience means the ability to provide an acceptable level of service in the event of a failure.

This capacity is also found in the human body — and in general, in all living beings. For example, when you have a cold, your abilities may be reduced, but in most cases you can keep living more or less normally.

This phenomenon is regularly observed in all biological systems, but remains quite complex. It is still difficult to understand how resilience works and the set of behaviors to which it gives rise.

A special case of functional redundancy: degeneracy

It was through discussions with Véronique Thomas-Vaslin, a biologist at Sorbonne University, that Dominique Pastor, a telecommunications researcher at IMT Atlantique, became particularly aware of this property of biological systems. Working with Roger Waldeck, who is also a researcher at IMT Atlantique, and PhD student Erwan Beurier, he carried out research to mathematically model this resilience, in order to demonstrate its basic principles and better understand how it works.

To do so, they drew on publications by other scientists, including American biologist Gerald Edelman (Nobel prize winner for medicine in 1972), underscoring another property of living organisms: degeneracy. (This term is usually translated in French as dégénérescence, which means ‘degeneration,’ but this word is misleading). “Degeneracy” refers to the ability of two structurally different elements to perform the same function. It is therefore a kind of functional redundancy, which also implies different structures. This characteristic can be found at multiple levels in living beings.

For example, amino acids, the essential building blocks of proteins, are produced from “messages” included in portions of DNA. More specifically, each message is called a “codon”: a sequence of three molecules, known as nucleotides. However, there are 4 possible nucleotides, meaning there are 4³ = 64 possible codons, for only 22 amino acids. Several codons therefore correspond to the same amino acid: a perfect example of degeneracy.
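
The arithmetic, and the resulting redundancy, can be checked in a few lines of Python; the codon table below is a small illustrative excerpt of the standard genetic code, not the full table.

```python
# Degeneracy in the genetic code: 4^3 = 64 codons for only ~22 amino acids,
# so several structurally different codons perform the same function.
from itertools import product
from collections import Counter

codons = ["".join(c) for c in product("ACGU", repeat=3)]
assert len(codons) == 64

# Illustrative excerpt of the standard codon table
codon_to_aa = {
    "UUU": "Phe", "UUC": "Phe",
    "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
}
print(Counter(codon_to_aa.values()))  # Counter({'Leu': 4, 'Ala': 4, 'Phe': 2})
```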

“My hunch is that degeneracy is central to any resilient system,” explains Dominique Pastor. “But it’s just a hunch. The aim of our research is to formalize and test this idea based on mathematical results. This can be referred to as the mathematics of resilience.”

To this end, he relied on the work of French mathematician Andrée Ehresmann, Emeritus Professor at the University of Picardie Jules Verne, who established a mathematical model of degeneracy, known as the “Multiplicity Principle,” with Jean-Paul Vanbremeersch, an Amiens-based physician who specializes in gerontology.

Recreating resilience in the form of mathematical modeling

Dominique Pastor and his team therefore started out with biologists’ concrete observations of the human body, and then focused on theoretical study. Their goal was to develop a mathematical model that could imitate both the degeneracy and resilience of the immune system in order to “establish a link between the notion of resilience, this Multiplicity Principle, and statistics.” Once this link was established, it would then be possible to study it and gain insight into how the systems work in real life.

The researchers therefore examined the performance of two categories of statistical testing for a given problem, namely detecting a phenomenon. The first category, known as “Neyman-Pearson testing,” is optimal for determining whether or not an event has occurred. The second category, RDT (Random Distortion Testing), is also optimal, but for a different task: detecting whether an event has moved away from an initial model.

The two types of procedures were not created with the same objective. However, the researchers successfully demonstrated that RDT testing could also be used, in a “degenerate” manner, to detect a phenomenon, with performance comparable to Neyman-Pearson testing. That means that in the theoretical case of an infinite amount of data, they can detect the presence or absence of a phenomenon with the same level of precision. The two categories therefore perform the same function, although they are structurally different. “We therefore made two sub-systems in line with the Multiplicity Principle,” concludes the IMT Atlantique researcher.
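
A deliberately simplified numerical caricature of this comparison can be run in a few lines (our sketch, not the authors’ construction): detecting a constant signal in Gaussian noise with a one-sided Neyman-Pearson test that knows the signal model, versus a distortion-style test that only asks whether the observation has drifted away from the nominal model theta0 = 0.

```python
# Toy power comparison: both tests reduce to thresholding the sample mean.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, sigma, theta, alpha, trials = 25, 1.0, 0.5, 0.05, 50_000

np_thresh = norm.ppf(1 - alpha) * sigma / np.sqrt(n)       # knows the sign
rdt_thresh = norm.ppf(1 - alpha / 2) * sigma / np.sqrt(n)  # two-sided drift

hits_np = hits_rdt = 0
for _ in range(trials):
    m = (theta + sigma * rng.standard_normal(n)).mean()
    hits_np += m > np_thresh
    hits_rdt += abs(m) > rdt_thresh

# Detection rates are close (~0.80 vs ~0.71 here): two structurally
# different procedures performing essentially the same function.
print(hits_np / trials, hits_rdt / trials)
```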

What’s more, the nature of RDT testing gives it an advantage over Neyman-Pearson testing since the latter only works optimally when real events follow a certain mathematical model. If this is not the case — as so often happens in nature — it is more likely to be incorrect. RDT testing can adapt to a variable environment, since it is designed to detect such variations, and is therefore more robust. Combining the two types of testing can result in a system with the inherent characteristics of resilience, meaning the ability to function in a variety of situations.

From biology to cybersecurity

These findings are not intended to remain confined to a theoretical universe. “We don’t work with theory for the sake of theory,” says Dominique Pastor. “We never forget the practical side: we continually seek to apply our findings.” The goal is therefore to return to the real world, and not only in relation to biology. In this respect, the approach is similar to that used in research on neural networks – initially focused on understanding how the human brain works, it ultimately resulted in systems used in the field of computer science.

“The difference is that neural networks are like black boxes: we don’t know how they make their decisions,” explains the researcher. “Our mathematical approach, on the other hand, provides an understanding of the principles underlying the workings of another black box: the immune system.” This understanding is also supported by collaboration with David Spivak, a mathematician at MIT (United States), again in the field of mathematical modeling of biological systems.

The first application Dominique Pastor is working on falls within the realm of cybersecurity. The idea is to imitate the resilient behavior of an immune system for protective purposes. For example, many industrial sites are equipped with sensors to monitor various factors (light, opening and closing of doors, filling of a container, etc.). To protect these devices, they could be combined with a system to detect external attacks. This could be made up of a network, which would receive data recorded by the sensors and run a series of tests to determine whether there has been an incident. Since these tests could be subject to attacks themselves, they would have to be resilient in order to be effective – hence the importance of using different types of tests, in keeping with the previously obtained results.

For now it is still too early to actually apply these theories. It remains to be proven that the Multiplicity Principle is a sufficient guarantee of resilience, given that this notion does not have a mathematical definition as of today. This is one of Dominique Pastor’s ambitions. The researcher admits that it is still his “pipe dream” and says, “My ultimate goal would still be to go back to biology. If our research could help biologists better understand and model the immune system, in order to develop better care strategies, that would be wonderful.”


Unéole on our roofs

We know how to use wind to produce electricity, but large three-bladed turbines do not have their place in urban environments. The start-up Unéole has therefore developed a wind turbine that is suitable for cities, as well as other environments. It also offers a customized assessment of the most efficient energy mix. Clovis Marchetti, a research engineer at Unéole, explains the innovation developed by the start-up, which was incubated at IMT Lille Douai.

 

The idea for the start-up Unéole came from a trip to French Polynesia, islands that are cut off from the continent, meaning that they must be self-sufficient in terms of energy. Driven by a desire to develop renewable energies, Quentin Dubrulle focused on the fact that such energy sources are scarce in urban areas. Wind, in particular, is an untapped energy source in cities. “Traditional three-bladed wind turbines are not suitable,” says Clovis Marchetti, a research engineer at Unéole. “They’re too big, make too much noise and are unable to capture the swirling winds created by the corridors between buildings.”

Supported by engineers and researchers, Quentin Dubrulle put together a team to study the subject. Then, in July 2014, he founded Unéole, which was incubated at IMT Lille Douai. Today the start-up proposes an urban turbine measuring just under 4 meters high and 2 meters wide that can produce up to 1,500 kWh per year. It is easy to install on flat roofs and designed to be used in cities, since it captures the swirling winds found in urban environments.

Producing energy with a low carbon footprint is a core priority for the project. This can be seen in the choice of materials and method of production. The parts are cut by laser, a technology that is well-understood and widely used by many industries around the world. So if these wind turbines have to be installed on another continent, the parts can be cut and assembled on location.

Another important aspect is the use of eco-friendly materials. “This is usually a second step,” says Clovis Marchetti, “but it was a priority for Unéole from the very beginning.” The entire skeleton of the turbine is built with recyclable materials. “We use aluminum and recycled and recyclable stainless steel,” he says. “For the electronics, it’s obviously a little harder.”

Portrait of an urban wind turbine

The wind turbine has a cylindrical shape and is built in three similar levels with slightly curved blades that are able to trap the wind. These blades are offset by 60° from one level to the next. “This improves performance since the production is more uniform throughout the turbine’s rotation,” says Clovis Marchetti. Another advantage of this architecture is that it makes the turbine easy to start: no matter what direction the wind comes from, a part of the wind turbine will be sensitive to it, making it possible to induce movement.

 

Photograph of the urban wind turbine proposed by Unéole.

 

To understand how a wind turbine works, two concepts of aerodynamics are important: lift and drag. In the former, a pressure difference diverts the flow of air and therefore exerts a force. “It’s what makes planes fly, for example,” explains Clovis Marchetti. In the latter, the wind blows on a surface and pushes it. “Our wind turbine works primarily with drag, but lift effects also come into play,” he adds. “Since the wind turbine is directly pushed by the wind, its blades always move at roughly the speed of the wind.”

And that plays a significant role in terms of the noise produced by the wind turbine. Traditional three-bladed turbines turn faster than the wind due to lift. They therefore slice through the wind and produce a swishing noise. “Drag doesn’t create this problem since the wind turbine vibrates very little and doesn’t make any noise,” he says.

An optimal energy mix

The urban wind turbine is not the only innovation proposed by Unéole. The central aim of this project is to combine potential renewable energies to find the optimal energy mix for a given location. As such, a considerable amount of modeling is required in order to analyze the winds on site. That means modeling a neighborhood by taking into consideration all the details that affect wind: topographical relief, buildings, vegetation etc. Once the data about the wind has been obtained from Météo France, the team studies how the wind will behave in a given situation on a case-by-case basis.

“Depending on relief and location, the energy capacity of the wind turbine can change dramatically,” says Clovis Marchetti. These wind studies allow them to create a map identifying the locations that are best suited to the turbine, and the places where it will not work as well. “The goal is to determine the best way to use roofs to produce energy and optimize the energy mix, so we sometimes suggest that clients opt for photovoltaic energy,” he says.

“An important point is the complementary nature of photovoltaic energy and wind turbines,” says Clovis Marchetti. Wind turbines maintain production at night, and are also preferable for winter, whereas photovoltaics are better for summer. Combining the two technologies offers significant benefits at the energy level, for example, more uniform production. “If we only install solar panels, we’ll have a peak of productivity at noon in the summer, but nothing at night,” he explains. This peak output must therefore be stored, which is costly and still involves some loss of production. A more uniform production would therefore make it possible to produce energy on a more regular basis without having to store the energy produced.

To this end, Unéole is working on a project for an energy mix platform: a system that includes their urban wind turbines, supplemented with a photovoltaic roof. Blending the two technologies would make it possible to produce up to 50% more energy than photovoltaic panels installed alone.

A connected wind turbine

“We’re also working on making this wind turbine connected,” says Clovis Marchetti. This would provide two major benefits. First, the wind turbine could provide information directly about its production and working condition. This is important so that the owner can monitor the energy supply and ensure that the turbine is working properly. “If the wind turbine communicates the fact that it is not turning even though it’s windy, we know right away that action is required,” he explains.

In addition, a connected wind turbine could predict its production capacity based on weather forecasts. “A key part of the smart city of tomorrow is the ability to manage consumption based on production,” he says. Today, weather forecasts are fairly reliable up to 36 hours in advance, so it would be possible to adjust our behavior accordingly. Imagine, for example, that strong winds were forecast for 3 pm. In this case, it would be better to wait until then to launch a simulation that requires a lot of energy.


A sorting algorithm to improve plastic recycling

Producing high-quality raw materials from waste is contingent on effective sorting. Plastics from waste electrical and electronic equipment (WEEE) are no exception. To help solve this problem, researchers at IMT Mines Alès have developed an algorithm that automates the sorting of these plastics. It can be integrated into new industrial-scale sorting machines.

 

How will your coffee maker be reincarnated after it dies? This electrical appliance composed primarily of plastic, metal and glass falls into the category of waste electrical and electronic equipment (WEEE). Your smartphone and washing machine are also included in this category. After it is thrown away, the coffee maker will find itself drowning in what amounts to over 750,000 tons of WEEE collected every year in France, before it is recovered by a specialized recycling center. There, it is dismantled, crushed and separated from its ferrous and non-ferrous metals, such as copper or aluminum, until all that’s left of the machine is a heap of plastic. Plastic is the second-largest component of WEEE after steel, so recycling it is a major concern.

And successful recycling starts with effective sorting. 20% of plastic materials are recovered through flotation after being placed in a tank filled with water. But how is the remaining 80% processed? “Samples measuring 1 cm² are placed on a conveyor belt equipped with an infrared camera at the end, which scans the material and determines what type of plastic it’s made of,” says Didier Perrin, a physical chemist at IMT Mines Alès. The radiation excites the atomic bonds of the molecules and creates a spectral signature that characterizes the plastic to be identified. A technique using a near-infrared source (NIRS) is especially rapid but cannot be used to identify dark plastics, which absorb the radiation. Yet black plastic, which holds up over time better than colored plastic, represents nearly 50% of the waste. “Accurate and effective identification of the material is therefore crucial to generate high-quality raw material to be recycled, combining purity and mechanical performance,” adds the researcher. However, this method does not always make it possible to determine the exact type of plastic contained within a sample.

An automated sorting algorithm

Researchers at IMT Mines Alès have therefore developed an automated method for sorting plastic, working with SUEZ and Pellenc ST, a company that develops smart, connected sorting machines. The focus of their collaboration was on establishing a classification of the plastics contained in WEEE. The researchers generated a database in which each plastic has its own clearly defined spectral identity. WEEE plastics were therefore divided into four major families: ABS (acrylonitrile butadiene styrene), a polymer commonly used in industry which represents 50 to 60% of plastic waste (cases, covers, etc.); HIPS (high-impact polystyrene), similar to ABS but less expensive and with lower mechanical performance (refrigerator racks, cups); polypropylene, a material more ductile than ABS and HIPS (soft covers for food containers, cups); and what is referred to as ‘crystal’ polystyrene (refrigerator interiors, clear organic glass).

Their first step was to better recognize the plastics to be sorted. “We used a supervised learning method on the data measured in the laboratory and then analyzed the same samples in industrial conditions,” explains PhD student Lucie Jacquin. Nevertheless, it is not always easy to characterize the type of plastic contained in waste. First of all, plastic degrades over time, which modifies its properties and makes it difficult to identify. And second, industrial conditions — with 3,000 kg of waste analyzed per hour — often result in incomplete spectral measurements.

Beyond the uncertainties of the measurements, the most traditional sorting methods also have their flaws. For example, they are based on probabilistic classification algorithms, which are used to determine how similar a sample is to those in a reference database. Except that these algorithms do not distinguish between equiprobability and ignorance. In the event of equiprobability, the spectrum of a sample is 50% similar to the spectrum of plastic A and 50% similar to that of plastic B. In the event of ignorance, even though the spectrum of a sample is not similar to any element within the database, the algorithm gives the same result as in the event of equiprobability (50% A and 50% B). So how can it be determined whether the information provided by the algorithm reflects equiprobability or ignorance? The researchers’ aim is therefore to better manage uncertainty in measurements in real conditions.

Understanding the material to recycle it better

“We approached this problem using modern uncertainty theories, which allow us to better represent uncertainty in the classification of a sample, based on the uncertainty in its spectrum obtained in real conditions. Belief functions can distinguish between equiprobability and ignorance, for example,” explains computer science researcher Abdelhak Imoussaten. The algorithm attempts to determine the class of plastic to which a sample belongs. When there is a doubt, it determines the set of classes of plastic to which it may belong and eliminates the others. For example, we can be sure that a sample is either ABS or HIPS, but definitely not polypropylene. “In this way, we use ‘cautious’ machine learning to control what the machine will send to the sorting bins,” adds Abdelhak Imoussaten. Since that’s the real goal: determining, in an automated way, which sorting bin these small bits of plastic will be sent to.
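
A toy illustration of this distinction (our sketch, not the researchers’ classifier): a mass function assigns belief to sets of classes, so “50/50 between ABS and HIPS” stays distinct from total ignorance, and the cautious decision is a set of candidate bins rather than a single guess.

```python
# Belief functions distinguish equiprobability from ignorance.
CLASSES = {"ABS", "HIPS", "PP"}

equiprobable = {frozenset({"ABS"}): 0.5, frozenset({"HIPS"}): 0.5}
ignorant = {frozenset(CLASSES): 1.0}   # all mass on "could be anything"

def plausibility(mass, cls):
    """Total mass of the class sets that contain cls."""
    return sum(m for subset, m in mass.items() if cls in subset)

def candidate_bins(mass, cutoff=0.3):
    """Cautious decision: keep every class that remains plausible."""
    return {c for c in CLASSES if plausibility(mass, c) >= cutoff}

print(candidate_bins(equiprobable))  # {'ABS', 'HIPS'}: A or B, but not PP
print(candidate_bins(ignorant))      # all three: reject for manual sorting
```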

“Each category of plastic accepts a certain quantity of other plastics without affecting the matrix of the recycled material,” says Didier Perrin. In practice, this means that it is possible to send a plastic to a sorting bin with some certainty, even if the exact type of plastic is unclear (A or B but not C). While completing his PhD at IMT Mines Alès under the supervision of Didier Perrin, Charles Signoret studied all the possible mixtures of the various plastics and their compatibility. For example, ABS may only contain 1% polypropylene in its structure in order to maintain its mechanical properties, but it may contain up to 8% HIPS.

While the presence of impurities is inevitable in recycling, the researchers consider a sorting method to be effective when it results in materials with 5% impurities or less. One thing is certain: the collaborative work of the researchers, SUEZ and Pellenc ST has proved to be effective in terms of sorting quality. It has already resulted in a demonstration machine which will subsequently be implemented in the production of new sorting machines.

Improving the effectiveness of sorting systems is crucial to the economic viability of the recycling industry. The ADEME estimates that 1.88 million tons of household appliances are brought to the market every year in France. These products will eventually have to be sorted in order to provide high-quality material to produce future equipment for this ever-growing market. “Our goal is also to ensure that the term ‘recycled,’ when referring to plastics, does not mean low-quality, as has already been achieved with glass and steel, two recycled materials whose quality is no longer questioned,” concludes Didier Perrin.

 

Article written in French by Anaïs Culot for I’MTech

 


Tracking mobile industrial equipment in real time with Hiboo

The Hiboo start-up was incubated at Télécom Paris and provides a platform to help companies better manage their mobile assets: equipment, merchandise, vehicles etc. This solution is now widely used in the construction industry.

 

In 2016, the start-up Hiboo, which was incubated at Télécom Paris at the time, created a connected device to bring connected equipment to the construction industry. But the industry was already facing an overwhelming amount of unused data, and resolving this problem was not one of its top priorities. Instead, the sector sought to solve an ongoing problem: although it generated significant annual revenue, its profit margins remained low.

“We started out with the idea that one of the best ways to optimize this profit margin was to better understand the equipment’s activity using the available data,” explains François Jacob, co-founder of Hiboo. The start-up therefore made a radical shift to become a super aggregator of data sources. This transformation gave rise to its current platform, which helps companies manage their operations more effectively.

Hiboo helps collect and sort data

On paper, all construction companies face the same problems when it comes to mobile industrial assets: renting equipment, inventory, time spent on site, consumption, identifying machine problems etc. But on site, they lack an overall vision, and taking inventory of their equipment takes a long time and is not always very thorough. Hiboo collects information provided by three categories of equipment: connected vehicles, unpowered equipment, and non-connected equipment containing an onboard computer.

In the construction industry, companies may manage thousands of pieces of equipment at the same time. Such equipment may include some twenty brands of connected vehicles. However, if a company wants to understand how each brand fits into their overall equipment operations, users must log into each brand’s individual platform to retrieve this information, which is impossible to do on a daily basis.

Hiboo solves this problem by aggregating key data such as GPS coordinates, energy consumption and machine error codes, logging in to all of the manufacturers’ individual servers on the client’s behalf. The data are then harmonized before being automatically analyzed by ‘workers’. These robots isolate outliers, such as a vehicle that has consumed 2,500 liters of fuel in one day. The process is then checked by engineers at Hiboo, who send a final report to the clients. Users may therefore access all operations inputs and outputs for connected equipment on a single website.
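
As a hypothetical illustration of this kind of rule-based screening (Hiboo’s actual ‘workers’ are not public, and the capacity figure is invented), a physically implausible daily fuel reading can be flagged before it reaches a client report:

```python
# Flag daily fuel readings that exceed what the machine could plausibly hold.
import pandas as pd

readings = pd.DataFrame({
    "machine": ["excavator-1"] * 4,
    "day": pd.date_range("2020-01-01", periods=4, freq="D"),
    "fuel_l": [180.0, 210.0, 2500.0, 195.0],   # one corrupted reading
})

TANK_CAPACITY_L = 400   # assumed spec for this machine type
readings["outlier"] = readings["fuel_l"] > TANK_CAPACITY_L
print(readings[readings["outlier"]])   # escalated to engineers for review
```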

Solutions hidden in data

Hiboo also equips unpowered equipment such as crane parts, dumpsters and trailers with connected devices that communicate via low-frequency networks. They are energy-efficient, and make it possible to obtain GPS coordinates and track equipment’s activity over a number of years. The information is sent to Hiboo using traditional telephone networks. With the help of a partner, the start-up also equips non-connected vehicles with devices in order to collect the information obtained in their on-board computers. “So we provide equipment managers with a comprehensive solution for practically all of their assets,” adds François Jacob.

All of this data is made available to users on the Hiboo platform. But it can also be integrated in applications such as invoicing software. The start-up helped the Swiss company Bernasconi shorten its invoicing process by one week every month by eliminating paper invoices. And a major industrial equipment rental company was able to save up to 700 billable days a month by identifying the over-usage of its equipment. “By processing data from the field, we can help companies revolutionize asset management, maintenance, assignment, invoicing etc.” explains François Jacob.

A versatile technology

Hiboo wishes to go further in leveraging data, in particular machine error codes and their severity levels. “Using this data and maintenance records, we want to provide predictive maintenance so that we can predict the probability of a machine breaking down,” explains François Jacob. This could involve a failure on a compressor, an oil leak, a motor with low voltage etc. To do so, the start-up team combines information about the errors with the computerized maintenance management systems (CMMS) already used by companies to monitor machines and keep them in good working order.

Although originally intended for the construction industry, Hiboo’s solution can be used for other applications, given its ability to control the flow of data between different networks. For example, the start-up will be covering the Dakar rally in 2020. “By connecting to Marlink, the satellite communication network used to track the rally participants, we can collect information about the various vehicles and track their performance on our platform,” explains François Jacob.

Learn more about Hiboo


SOCCRATES automates cybersecurity for industrial systems

SOCCRATES is an H2020 European project launched in September 2019 for a three-year period. It aims to develop a platform to automate the detection of certain attacks and launch appropriate countermeasures. In doing so, it should help cybersecurity operators for industrial systems act more quickly and effectively in the event of a cyber attack. Hervé Debar, an information systems security researcher at Télécom SudParis, explains how the research consortium, which includes the school, is going about developing this solution.

 

What is the SOCCRATES platform?

Hervé Debar: The SOCCRATES platform is a “Security Information and Event Management” environment that aims to detect and block cyber-attacks more effectively. To do so, the platform collects data about the vulnerabilities present on the monitored system, malicious activity targeting the IT environment, and general information about the threat. It then proposes appropriate countermeasures for the attacks that are detected and makes it possible to implement them.

How does it hope to address the needs of companies and organizations?

HD: SIEM platforms are the core of Security Operations Centers (SOC), where operators manage cyber threats. All operators of critical infrastructures must monitor their information systems as required by French and European regulations. Faced with growing threats, the SOCCRATES platform aims to provide a greater degree of automation, making it possible to respond to attacks more quickly and precisely. Operators could then focus on the most complex attacks.

What is your approach to developing this platform?

HD: The project focuses primarily on the knowledge with which SOC operators are provided in order to respond to attacks. This knowledge takes one of three forms. The first is increased knowledge of the monitored information system, and of the potential attack paths that could be used to compromise a vulnerable target. Blocking the easiest attack paths can help prevent a hacker from spreading throughout the system. The second form of knowledge is based on an understanding of the threat. This means observing internet attack phenomena in order to improve the detection mechanisms used. And the third form of knowledge involves understanding the impact an attack has on operations in order to assess the risks of countermeasures and the benefits in terms of limiting the impact of an attack.
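
The first form of knowledge lends itself to a simple sketch (ours, not the SOCCRATES implementation): if the monitored system is modeled as a graph whose edges are exploitable steps weighted by attacker effort, the easiest attack paths toward a critical target can be computed and blocked first.

```python
# Toy attack graph: find the cheapest path an attacker could take.
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("internet", "web-server", 1.0),      # weight ~ attacker effort
    ("web-server", "app-server", 2.0),
    ("internet", "vpn-gateway", 4.0),
    ("vpn-gateway", "app-server", 1.0),
    ("app-server", "scada-controller", 3.0),
])

path = nx.shortest_path(g, "internet", "scada-controller", weight="weight")
print(path)   # the easiest path: a priority candidate for hardening
```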

What expertise are Télécom SudParis researchers contributing to this project?

HD: We’re contributing our expertise in cyber attack remediation, which we developed in particular through the MASSIF and PANOPTESEC European FP7 projects. Our work on these two projects, which were launched in 2013 and 2014, gave us the opportunity to develop in-depth knowledge about industrial cybersecurity, managing attacks and implementing countermeasures. Our response model provides a quantitative assessment of the impact — whether positive or negative — of the remediations proposed to block attacks.

Read more on I’MTech: SPARTA: Defining Cybersecurity in Europe

How do you plan to test the effectiveness of the SOCCRATES platform?

HD: The platform will be implemented and deployed in two pilot environments involving critical infrastructures: in the field of cloud computing with the company Mnemonic, and in the energy sector with Vattenfall. Mnemonic is a managed security service provider. At Vattenfall, the SOCCRATES platform will be used to monitor networks that control electricity production and distribution.

Beyond these industry partners, how is the project organized?

HD: SOCCRATES is coordinated by the Netherlands Organisation for Applied Scientific Research (TNO). In addition to IMT, there are three Swedish partners (KTH, Foreseeti and Mnemonic), a Finnish partner (F-Secure), ATOS Spain, Vattenfall IT Services (Poland), the Austrian Institute of Technology (AIT), and another Dutch partner, ShadowServer. This consortium is divided into three kinds of contributions: vulnerability analysis, behavioral detection, and attack remediation. Our first major step is to define the use cases and demonstration scenarios that we will use to develop, approve and demonstrate the components of the project. We plan to do this by the end of January.

Learn more about SOCCRATES


A window, and silence!

To combat noise pollution and its effects on human health, DeNoize, a start-up incubated at Mines Saint-Étienne, offers a solution: a window that ‘mutes sound’. This connected window would analyze outside noise and adapt to cancel it out.

 

Double glazing increases thermal insulation, but when it comes to noise, it’s another story. When we’re indoors at home or the office, the vast majority of the outside noise that reaches us comes through the windows. This is an especially big concern for people who live or work near airports or major roads. Since May 2018, DeNoize co-founders Olivier Schevin and Aman Jindal have made it their mission to reduce this noise pollution, which is detrimental to our health.

DeNoize, a start-up incubated at Mines Saint-Étienne, offers an innovative solution for improving sound insulation in windows. “Our challenge is now to miniaturize the system so that it can be integrated into window frames,” says co-founder Olivier Schevin. The concept could easily be integrated into standard windows available today.

The problem with double glazing

“Double glazing is actually less effective than single glazing when it comes to sound insulation, for the same thickness of glass,” says Olivier Schevin. Although it may seem counterintuitive, double glazing offers less resistance to low frequencies — between 50 and 500 Hz, the frequency band that is the main source of noise from airports and roads. “Double glazing was designed to solve thermal insulation problems, without considering the acoustic aspect,” he explains.

Double glazing is first and foremost two masses, the panes, with air or gas between them. This structure poses a problem from an acoustic point of view: certain frequencies – low frequencies – cause the air trapped between the panes to resonate, and the sound propagates. This effect may be counteracted by increasing the thickness of the panes, or the space between the two panes. But this passive reduction results in a bulky look from an architectural viewpoint and is also very expensive.
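
A rough worked example using the textbook mass-air-mass formula (generic figures, not DeNoize data) shows why the problem sits in the low-frequency band: the air gap acts as a spring between the two panes, and the resulting resonance lands well inside 50-500 Hz.

```python
# Mass-air-mass resonance of a standard double-glazed unit (rough estimate).
import math

rho, c = 1.2, 343.0   # air density (kg/m^3), speed of sound (m/s)
m1 = m2 = 10.0        # surface mass of each 4 mm glass pane (kg/m^2)
d = 0.012             # 12 mm air gap (m)

f0 = (1 / (2 * math.pi)) * math.sqrt(rho * c**2 * (m1 + m2) / (m1 * m2 * d))
print(round(f0), "Hz")   # ~244 Hz: inside the problematic 50-500 Hz band
```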

Sound fights back

DeNoize’s innovation is to use sound to fight sound, making it an active noise reduction system. “We’re going to generate a counter-vibration suited to the vibration of the outside noise,” explains Olivier Schevin. “The system produces a vibration that counters that of the outside noise, creating a destructive interference.” The vibrations ‘cancel each other out,’ reducing the noise transmitted by up to 75% for low frequencies.  
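
The principle of destructive interference can be seen in a minimal simulation (an idealized sketch; a real window controller works on measured vibrations, not a known sine wave): adding a phase-inverted copy of a tone cancels it, while an imperfect copy only attenuates it.

```python
# Destructive interference: a counter-vibration cancels or attenuates noise.
import numpy as np

fs, f = 8000, 100                    # sample rate (Hz), low-frequency tone
t = np.arange(fs) / fs
noise = np.sin(2 * np.pi * f * t)

anti_perfect = -noise                # ideal counter-vibration
anti_partial = -0.5 * noise          # actuator only half-matches amplitude

def rms(x):
    return np.sqrt(np.mean(x ** 2))

print(rms(noise + anti_perfect))               # ~0.0: full cancellation
print(rms(noise + anti_partial) / rms(noise))  # 0.5: partial attenuation
```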

“This technology is somewhat similar to that used in noise-cancelling headphones,” adds Olivier Schevin. “The technical difference is the surface of the area we want to treat. For the headphones, it’s a really small area close to the ear.” The system developed by DeNoize uses sensors to analyze outside noise in real time and adapt to it accordingly. Actuators produce a counter-vibration that interferes with the original noise. The system also includes a control unit and an electronic board responsible for determining the most effective actions for the sensors and actuators.

The system is integrated into the window frame and requires a nearby electrical connection to supply it with energy. This is already common today with rolling shutters, for example. The innovation is in step with advances in smart home technology.

Read more on I’MTech: Smart homes: A world of conflict and collaboration

This communication between actuators, sensors and control unit makes it possible to adapt noise reduction in real time to outside variations. “As of now, we have a working prototype,” says Olivier Schevin, “but the system doesn’t calculate in real time yet. So we still have a development phase ahead of us for the electronics part.”

Olivier Schevin is launching an industrial project with students to develop a real-time demonstrator. The electronic component is still to be developed, since the existing control unit was made using laboratory equipment that cannot be integrated into window frames. “In general, we’re still looking for ways to improve performance at the lowest possible cost.”


What is a digital twin?

Digital twins, digital doubles – what exactly do these terms mean? Raksmey Phan, an engineer at the Mines Saint-Étienne Centre for Biomedical and Health Engineering (CIS)[1], talks to us about the advantages and advances offered by these new tools, as well as the issues involved.

 

What does a digital twin refer to?

Raksmey Phan: If you have a digital, mathematical model representing a real system, based on data from this real system, then you have a digital twin. Of course, the quality of the digital twin depends first and foremost on the mathematical model. Industrial ovens are a historic example that can help explain this idea.

To create a digital twin, we record information about the oven, which could include its operating hours or the temperature each time it’s used. Combined with algorithms that take into account the physical components that make up the oven, this digital twin will calculate its rate of wear and tear and anticipate breakdown risks. The use of the oven can then be monitored in real time and simulated in its future state with different use scenarios in order to plan for its replacement.
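
In code, the idea fits in a small class (a minimal sketch with made-up wear figures, not an industrial model): a software object is kept in sync with usage records from the real oven and can then be run forward under hypothetical scenarios.

```python
# A minimal digital twin of an industrial oven (illustrative figures only).
from dataclasses import dataclass

@dataclass
class OvenTwin:
    rated_hours: float = 10_000.0   # assumed service life at 200 deg C
    hours_used: float = 0.0
    wear: float = 0.0               # 0 = new, 1 = end of life

    def ingest(self, hours: float, temperature_c: float) -> None:
        """Update the twin from one usage record of the real oven."""
        stress = max(temperature_c / 200.0, 1.0)  # hotter runs wear faster
        self.hours_used += hours
        self.wear += stress * hours / self.rated_hours

    def hours_remaining(self, temperature_c: float = 200.0) -> float:
        """Simulate a future scenario: hours left at a given set point."""
        stress = max(temperature_c / 200.0, 1.0)
        return max((1.0 - self.wear) * self.rated_hours / stress, 0.0)

twin = OvenTwin()
twin.ingest(hours=500, temperature_c=250)    # replayed sensor history
print(twin.wear, twin.hours_remaining(250))  # 0.0625, 7500.0
```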

In what fields are they used?

RP: They can be used in any field where there is data to be recorded. We could say that climatologists make a digital twin of our planet: based on observational data recorded about our planet, they run simulations, and therefore mathematical models, resulting in different scenarios. To give another example, at the Mines Saint-Étienne CIS, we have scientists such as Xiaolan Xie, who are internationally renowned for their experience and expertise in the field of modeling healthcare systems. One of our current projects is a digital twin of the emergency department at Hôpital Nord de Saint-Étienne, which is located 200 meters from our center.

What advantages do digital twins offer?

RP: Let’s take the example of the digital twin of the emergency room. We’ve integrated anonymized patient pathways over a one-year period in a model of the emergency room. In addition to this mathematical model, we receive data in what can be referred to as ‘pseudo-real time,’ since there is a lapse of one hour from the time patients arrive in the department. This makes it possible for us to do two important things. The first is to track the patients’ movement through the department in pseudo-real time, using the data received and the analysis of pathway records. The second is the ability to plan ahead and predict future events. Imagine if there was a bus accident in the city center. Since we know what types of injuries result from such an accident, we can visualize the impact it would have on the department, and if necessary, call in additional staff.
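
A discrete-event toy version of this scenario planning (our illustration, written with the simpy library, not the CIS model itself) might look like this: walk-in patients queue for a limited number of treatment bays, and a scripted “bus accident” injects a burst of arrivals whose impact on the department can then be measured.

```python
# Toy emergency-department simulation with an injected accident scenario.
import random
import simpy

def patient(env, bays, served):
    with bays.request() as req:
        yield req                                        # wait for a free bay
        yield env.timeout(random.expovariate(1 / 30.0))  # ~30 min treatment
        served.append(env.now)

def walk_ins(env, bays, served):
    while True:
        yield env.timeout(random.expovariate(1 / 15.0))  # ~1 arrival / 15 min
        env.process(patient(env, bays, served))

def bus_accident(env, bays, served, at=120, victims=10):
    yield env.timeout(at)                                # accident at t = 2 h
    for _ in range(victims):
        env.process(patient(env, bays, served))

random.seed(1)
env = simpy.Environment()
bays = simpy.Resource(env, capacity=4)
served = []
env.process(walk_ins(env, bays, served))
env.process(bus_accident(env, bays, served))
env.run(until=480)                                       # one 8-hour period
print(len(served), "patients treated")
```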

What did people do before there were digital twins?

RP: Companies and industries were already using the concept before the term existed. Since we’ve been using machines, engineers have tried to monitor tools with replicas – whether digitally or on paper. It’s a bit like artificial intelligence. The term is back in fashion but the concept goes back much further. Algorithms are mathematics, and Napoleon used algorithms for his war logistics.

When did the term digital twin first start to be used?

RP: The term ‘digital twin’ was first used in 2002 in articles by Michael Grieves, a researcher at the Florida Institute of Technology. But the concept has existed since we have been trying to model real phenomena digitally, which is to say since the early days of computing. But there has been renewed interest in digital twins in recent years due to the convergence of three scientific and technological innovations. First, the impressive growth in our ability to analyze large amounts of data — Big Data. Second, the democratization of connected sensors — the Internet of Things. And third, renewed interest for algorithms in general, as well as for cognitive sciences — Artificial Intelligence.

How have the IoT and Big Data transformed digital twins?

RP: A digital twin’s quality depends on the quantity and quality of data, as well as on its ability to analyze this data, meaning its algorithms and computing capacity. IoT devices have provided us with a huge amount of data. The development of these sensors is an important factor – production has increased while costs have decreased. The price of such technologies will continue to drop, and at the same time, they will become increasingly accurate. That means that we’ll be able to create digital twins of larger, more complex systems, with a greater degree of accuracy. We may soon be able to make a digital twin of a human being (project in the works at CIS).

Are there technological limitations to digital twins?

RP: Over the last five years, everything’s been moving faster at the technological level. It’s turned into a race for the future. We’ll develop better sensors, and we’ll have more data, and greater computing power. Digital twins will also follow these technological advances. The major limitation is sharing data – the French government was right to take steps towards Open Data, which is free data, shared for the common good. Protecting and securing data warehouses are limiting factors but are required for the technological development of digital twins. In the case of our digital twin of the hospital, this involves a political and financial decision for hospital management.

What are some of the challenges ahead?

RP: The major challenge, which is a leap into the unknown, is ethics. For example, we can assess and predict the fragility of senior citizens, but what should we do with this information after that? If an individual’s health is likely to deteriorate, we could warn them, but without help it will be hard for them to change their lifestyle. However, the information may be of interest to their insurance providers, who could support individuals by offering recommendations (appropriate physical activity, accompanied walks, etc.). This example hinges on the issues of confidentiality and anonymization of data, not to mention the issue of informed consent of the patient.

But it’s incredible to be talking about confidentiality, anonymization and informed consent as a future challenge — although it certainly is the case — when for the past ten years or so, a portion of the population has been publishing their personal information on social media and sharing their data with wellness applications whose data servers are often located on another continent.

[1] Raksmey Phan is a researcher at the Laboratory of Informatics, Modelling and Optimization of Systems (LIMOS), a joint research unit between Mines Saint-Étienne/CNRS/Université Clermont-Auvergne.
