
MADEin4: digital twinning and predictive maintenance for industry

The European MADEin4 project was launched in April 2019 for a three-year period. It aims to help semiconductor manufacturers and equipment suppliers play an active role in the continuous improvement of their equipment. How? By relying on new digital twinning and predictive maintenance technologies. Agnès Roussy and Valéria Borodin, research professors at Mines Saint-Étienne, a partner in the MADEin4 project, explain the context that gave rise to this project and discuss the scientific hurdles to overcome.

 

What was the context for developing the MADEin4 project?

Agnès Roussy: The MADEin4 project (Metrology Advances for Digitized ECS Industry 4.0) is an ECSEL project (Electronic Components and Systems for European Leadership). Its aim is to support and bring together the semiconductor industry in Europe in the transition to digital technology.

What is the overall goal of this project?

Valéria Borodin: To increase production output without affecting reliability levels in the manufacturing of electronic devices, whose quality must comply with the increasingly demanding requirements of the highly competitive semiconductor market.

And how are you going about this?

AR: To improve productivity and facilitate the integration of digital technology into the organization of manufacturing processes for semiconductor and equipment manufacturers, going beyond the state of the art, the project relies on an Industry 4.0 approach. To do so, two complementary boosters will be leveraged in the development of a pilot line: a physical accelerator, based on next-generation metrology and inspection equipment for the microelectronics industry; and a digital accelerator – the digital twin (see box below) – integrating artificial intelligence technology to improve output and equipment performance prediction.


The technique of digital twinning is used by manufacturers to monitor the operational status of their equipment (manufacturing, metrology, inspection). Digital twins of physical equipment evolve over time using data collected by sensors that measure the health status of the equipment, in order to prevent or anticipate breakdowns.
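To make the idea of condition monitoring more concrete, here is a deliberately minimal sketch (not taken from the MADEin4 project, whose digital twins combine physical equipment models with artificial intelligence) showing how a stream of sensor readings can be turned into a simple health indicator that flags abnormal behaviour before a breakdown. The sensor type, window size and alarm threshold are illustrative assumptions.

```python
import numpy as np

def health_alarms(readings, window=50, threshold=3.0):
    """Flag readings that deviate from recent 'healthy' behaviour.

    readings  : 1-D array of sensor values (e.g. vibration amplitude)
    window    : number of past samples used as the healthy baseline
    threshold : alarm when a reading is more than `threshold` standard
                deviations away from the baseline mean
    Returns the indices of the readings that triggered an alarm.
    """
    readings = np.asarray(readings, dtype=float)
    alarms = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            alarms.append(i)
    return alarms

# Illustrative use: normal sensor noise followed by a sudden shift
# (a loose fixture or worn bearing, say).
rng = np.random.default_rng(0)
signal = rng.normal(1.0, 0.05, 500)  # healthy operation
signal[400:] += 0.4                  # fault appears at sample 400
print(health_alarms(signal)[:5])     # first samples flagged as abnormal
```

A real digital twin goes much further than this threshold rule: it couples such indicators with a model of the equipment to anticipate when maintenance will actually be needed.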

What technological and scientific challenges do you face?

VB: The development of digital twins and simulation models for managing and coordinating a production unit at different levels of decision-making poses a number of challenges, in particular in terms of the consistency of digital technology and decision-making across all industrial activities. In this regard, the help and expertise of semiconductor manufacturers and equipment suppliers (manufacturing and metrology) play a pivotal role in confirming the usefulness and industrial feasibility of the solutions we propose as academics.

How are Mines Saint-Étienne researchers contributing to the project?

AR: One of the research areas in which Mines Saint-Étienne’s Manufacturing and Logistics Sciences department (SFL) is primarily active focuses on microelectronics manufacturing. This involves advanced process control, quantitative management of operations in the manufacturing process, and decision support at different levels (operational, tactical and strategic). As part of the MADEin4 project, we seek to explore opportunities and identify the limitations of new digital technologies in the intensive use and analysis of the massive quantities of data collected by inspection and metrology equipment.

Who are your partners for this project, and which collaborations are important for your work?

VB: The MADEin4 project brings together the expertise of 42 industrial and academic partners from 11 countries. Our key industrial partners for this project are STMicroelectronics in Rousset and Crolles. Among other things, this project allows us to continue the long-standing, successful academic collaboration between the Manufacturing and Logistics Sciences department at Mines Saint-Étienne’s Provence Microelectronics Center (CMP) and the ST sites in Rousset and Crolles, with which we have worked for over 15 years. Many equipment suppliers are also involved in this project, so we’ll have the opportunity to work with them more closely on the equipment. And likewise for the academic partners involved: this European project will help foster new opportunities for collaboration through PhD theses or future calls for projects.

What are the expected benefits?

AR: The expected benefits of the MADEin4 project closely reflect the scientific and strategic priorities of Mines Saint-Étienne and the Provence Microelectronics Center (CMP), which promote a number of important topics: the industry of the future (Industry 4.0) and artificial intelligence (AI). Through the MADEin4 project, we seek to provide process control solutions for semiconductor manufacturers, explore opportunities for applications of digital twinning technology, strengthen the partnership with semiconductor manufacturers, and increase international recognition for the CMP on topics related to microelectronics manufacturing.

What are the important steps coming up for the project?

VB: The MADEin4 project started just over six months ago. This initial phase is exciting because everything seems to be possible. As for Mines Saint Étienne, the industrial data soon to be provided by the different partners will allow us to compare our research to the realities of industry. By the end of the first year, the research findings will be publicized through articles in international journals and presentations to the scientific communities involved.

Find out more about the MADEin4 project


A tribe of irreducible codling moths

In agriculture, biological alternatives to pesticides are being sought for environmental and health reasons. Use of a virus as a biocontrol agent for crop pests has become relatively widespread. One such example is Cydia pomonella granulovirus. It has been used for decades to target a crop pest that is fond of apples and pears: the codling moth. Miguel Lopez-Ferber, a researcher at IMT Mines Alès, has been working on this topic since 2005, the year in which the larvae of this insect developed resistance to a commercial product made from this granulovirus.

 

Cydia pomonella. Common name: codling moth. Adult moths measure approximately 18 millimeters long and are a brownish-gray color. Shortly after they hatch, larvae bore into orchard fruits and feed on them from the inside. The apples and pears are damaged and rot, so they are no longer suitable to be sold. However, these insects are very susceptible to a virus: their granulovirus, which is known as CpGV. If they come into contact with it, the larvae become ill and die, leaving the orchards healthy and untouched.

Unfortunately for apple trees, codling moths have developed resistance to CpGV, which poses a number of problems. For one, farmers need to find a new way to protect their orchards. And manufacturers need to determine whether it is possible to improve the viral preparation, find a substitute, or if production must be stopped. Then there’s the scientific question: “Codling moths have been in contact with this virus for millions of years and have never developed resistance to it. Why now?” wonders Miguel Lopez-Ferber, a researcher at IMT Mines Alès: “If there had been widespread resistance in the past, we would no longer find this virus in nature.” 

One possible explanation is that, “we’ve underestimated the inventiveness of nature,” says Miguel Lopez-Ferber. “We’ve acted as if it were a chemical product: for years, exactly the same viral solution has been spread over orchards.” In nature, when an insect repeatedly comes into contact with the same chemical product, it will adapt and find a way to resist it. So the product will no longer work as well. Viruses, on the other hand, will also adapt and find new ways to reach the insects if we don’t prevent them from doing so – they are in co-evolution.

“It works the same way with humans, with the flu virus, for example,” explains the researcher. “We develop defenses to protect ourselves from the virus, but it evolves and comes back stronger the next year.” And CpGV exists in different forms throughout the world. There are slight variations in genotype – which is the sum total of an individual’s genes. And the solution available on the market corresponds to the culmination of research on a single isolated genotype of this virus.

Research to overcome resistance

With CpGV, the same virus isolate has been applied massively for years. This means that it’s possible that codling moth larvae are not resistant to other isolates of the virus. The different genotypes of the virus have been divided into 5 groups, from A to E. The “A” group is most similar to the Mexican isolate, which has been used historically. The researchers found that the other groups infected resistant larvae. At the beginning, however, the other viral isolates were less effective than those originally used – a greater quantity of product was needed for the same plot of land. But with a little selection, the performance reached the same levels as that of the original product. “We were also worried that we would observe resistance to these new genotypes in the future,” says Miguel Lopez-Ferber. But it is likely that this resistance phenomenon will not appear if there is greater viral diversity.

The researchers therefore tried another method: they combined the Mexican isolate to which the larvae had become resistant, with another genotype of the virus that infected the larvae. And they discovered that together, the two were even more effective in infecting the larvae. In a way, the second genotype, “opened the door to the one that had been blocked until then,” explains Miguel Lopez-Ferber, “but we still don’t really understand how it works.” The researchers are therefore studying how the different forms of the virus interact with one another to infect larvae. They could then use this knowledge to develop one or several optimal mixtures, by appropriately combining the specific characteristics of each genotype.

“Viral diversity is an asset, but we don’t yet fully understand how it works,” explains the researcher. “Imagine, for example, if we wanted to colonize a desert island. If all the colonists were miners, there’d be a lack of skills for building houses, planting crops etc. We need different, complementary jobs. That’s what we get when we combine several genotypes of a virus – except we don’t really know what their jobs are. We just know they work better together.”

And studying the virus’s behavior in codling moth larvae is no simple task. Let’s say that a type A and type B virus are dispersed in an orchard. How can we determine if a larva has absorbed both forms of the virus or only a single form? Or if one of them has prevented the other from developing? In order to understand how the tasks are distributed, the researchers would need to be able to track A and B as they colonize the larva. The molecular tools available today are not well suited for this. Miguel Lopez-Ferber’s team is currently working in partnership with NeoVirTech, a Toulouse-based company, to develop a better technique for tracking viruses.


The origins of granulovirus

“Using a natural predator to protect our harvests is not a new idea,” says Miguel Lopez-Ferber. “We domesticated cats to combat mice. It’s the same principle with granulovirus.”

It was in Mexico in 1964 that the codling moth granulovirus (CpGV) was discovered. Codling moth larvae were found dead and researchers sought to determine the cause. They then isolated the virus responsible: the Mexican isolate of CpGV. Shortly after, other forms of the virus were observed in Russia, followed by the United Kingdom. Slight variations existed between the viruses, subtle differences in genotype – the sum total of an individual’s genes. The Mexican isolate was favored for a potential commercial product because it was more homogenous. This made it easier to characterize in order to comply with regulatory criteria for insecticides, which are equivalent for phytosanitary and biological products and require rigorous identification of a product’s makeup.  

After 25 years of research, the viral solution was ready for use and authorized for sale. In practice, it is used like a chemical product. A concentrated liquid solution is prepared in a tank and is then dispersed in a spray of fine droplets, ideally covering the entire orchard in a uniform manner. Starting in the 1990s, the product was widely used and applied several times a year. Until 2005, that is, when it was observed that codling moths were still present after the treatment.


Guillaume Balarac, turbulence simulator

Turbulence is a mysterious phenomenon in fluid mechanics. Although it has been observed and studied for centuries, it still holds secrets that physicists and mathematicians strive to unlock. Guillaume Balarac is part of this research community. A researcher at Grenoble INP (at the LEGI Geophysical and Industrial Flows Laboratory), he uses and improves simulations to understand turbulent flows better. His research has given rise to innovations in the energy sector. The researcher, who has recently received the 2019 IMT-Académie des Sciences Young Scientist Award, discusses the scientific and industrial challenges involved in his field of research.

 

How would you define turbulent flows, which are your research specialty?

Guillaume Balarac: They are flows with an unpredictable nature. The weather is a good example for explaining this. We can’t predict the weather more than five days out, because the slightest disturbance at one moment can radically alter what occurs in the following hours or days. It’s the butterfly effect. Fluid flows in the atmosphere undergo significant fluctuations that limit our ability to predict them. This is typical of turbulent flows, unlike laminar flows, which are not subject to such fluctuations and whose state can be predicted more easily.

Apart from air mass movements in the atmosphere, where can turbulent flows be found?

GB: Most of the flows that we may encounter in nature are actually turbulent flows. The movement of oceans is described by turbulent flows, as is that of rivers. The movement of molten masses in the Sun generates a turbulent flow. This is also the case for certain biological flows in our bodies, like blood flow near the heart. Apart from nature, these flows are found in rocket propulsion, in the motion of wind turbines, and in hydraulic or gas turbines, etc.

Why do you seek to better understand these flows?

GB: First of all, because we aren’t able to do so! It’s still a major scientific challenge. Turbulence is a rather singular case – it has been observed for centuries. We’ve all seen a river or felt the wind. But the mathematical description of these phenomena still eludes us. The equations that govern turbulent flows have been known for two centuries, and the underlying mechanics have been understood since ancient times. And yet, we aren’t able to solve these equations and we’re ill-equipped to model and understand these events.

You say that researchers can’t solve the equations that govern turbulent flows. Yet, some weather forecasts for several days out are accurate…

GB: The iconic equation that governs turbulent flows is the Navier-Stokes equation. That’s the one that has been known since the 19th century. No one is able to find a solution with a pencil and paper. Proving the existence and smoothness of its solutions is even one of the seven Millennium Prize Problems established by the Clay Mathematics Institute. As such, the person who solves it will be awarded $1 million. That gives you an idea of the magnitude of the challenge. To get around our inability to find this solution, we either try to approach it using computers, as is the case for weather forecasts – with varying degrees of accuracy – or we try to observe it. And finding a link between observation and equation is no easy task either!
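For reference – the equation itself is not written out in the interview – the incompressible form of the Navier-Stokes equations for the velocity field u and pressure p reads:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0,
```

where ρ is the fluid density and ν its kinematic viscosity. The nonlinear term (u·∇)u is what couples eddies of all sizes and makes the behaviour of solutions so hard to characterize.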

Beyond this challenge, what can a better understanding of turbulent flows help accomplish?

GB: There is a wide range of applications which require an understanding of these flows and the equations that govern them. Our ability to produce energy relies in part on fluid mechanics, for example. Nuclear power plants function with water and steam systems. Hydroelectric turbines work with water flows, as do water current turbines. For wind turbines, it’s air flows. And those are just examples from the energy sector.

You use high-resolution simulation to understand what happens at the fundamental level in a turbulent flow. How does that work?

GB: One of the characteristics of turbulent flows is the presence of eddies. The more turbulent the flow, the more eddies of varying sizes it has. The principle of high-resolution simulation is to define billions of points in the space in which the flow is produced, and to calculate the fluid velocity at each of these points. This is called a mesh, and it must be fine enough to describe the smallest eddy in the flow. These simulations use the most powerful supercomputers in France and Europe. And even with all that computing power, we can’t simulate realistic situations – only academic flows in idealized conditions. These high-resolution simulations allow us to observe and better understand the dynamics of turbulence in canonical configurations.
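To give an order of magnitude for why such meshes become prohibitive, here is a standard scaling argument from the turbulence literature (added for illustration, not quoted from the interview). In homogeneous turbulence the smallest eddies have the Kolmogorov size η, so that

```latex
\frac{\eta}{L} \sim \mathrm{Re}^{-3/4}
\quad\Longrightarrow\quad
N \sim \left(\frac{L}{\eta}\right)^{3} \sim \mathrm{Re}^{9/4},
```

where L is the size of the largest eddies, Re the Reynolds number and N the number of mesh points needed to resolve all scales. A realistic flow at Re ≈ 10⁶ would already require on the order of 10¹³ to 10¹⁴ points, which is why only idealized, “academic” flows can be simulated at full resolution.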


Simulation of turbulent flows on a marine turbine.

Along with using these simulation tools, you work on improving them. Are the two related?

GB: They are two complementary approaches. The idea for that portion of my research is to accept that we don’t have the computing power to simulate the Navier-Stokes equation in realistic configurations. So the question I ask myself is: how can this equation be modified so that it becomes possible to solve with our current computers, while ensuring that the prediction is still reliable? The approach is to resolve the large eddies only. And since we don’t have the power to make a mesh fine enough for the small eddies, we look for physical terms – mathematical expressions – which replace the influence of the small eddies on the large ones. That means that the small eddies are not present in this modeling, but their overall contribution to flow dynamics is taken into account. This helps us improve simulation tools by making them able to address flows in realistic conditions.
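Formally, this large-eddy simulation (LES) approach solves a filtered version of the Navier-Stokes equation, in which the effect of the unresolved small eddies appears through an extra term that has to be modeled:

```latex
\frac{\partial \bar{u}_i}{\partial t}
 + \frac{\partial}{\partial x_j}\!\left(\bar{u}_i\,\bar{u}_j\right)
 = -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
   + \nu\,\nabla^{2}\bar{u}_i
   - \frac{\partial \tau_{ij}}{\partial x_j},
\qquad
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i\,\bar{u}_j,
```

where the bar denotes the filtered (large-scale) field and τᵢⱼ is the subgrid-scale stress. One classical closure, the Smagorinsky model, is shown here purely as an illustration – it is not necessarily the model used in the researcher’s work:

```latex
\tau_{ij} - \tfrac{1}{3}\delta_{ij}\,\tau_{kk} = -2\,\nu_t\,\bar{S}_{ij},
\qquad
\nu_t = \left(C_s\,\Delta\right)^{2}\,\lvert\bar{S}\rvert,
```

with Δ the filter (mesh) size, S̄ᵢⱼ the resolved strain-rate tensor and C_s a model constant.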

Are these digital tools you’re developing used solely by researchers?

GB: I seek to carry out research that is both fundamental and application-oriented. For example, we worked with Hydroquest on the performance of water current turbines used to generate electricity. The simulations we carried out made it possible to assess the performance loss due to the support structures, which do not contribute to capturing the energy from the flow. Our research led to patents for new designs, with a 50% increase in yield.

More generally, do energy industry players realize how important it is to understand turbulent flows in order to make their infrastructures more efficient?

GB: Of course, and we have a number of partners who illustrate industrial interest in our research. For example, we’ve adopted the same approach to improve the design of floating wind turbines. We’re also working with General Electric on hydroelectric dam turbines. These hydraulic turbines are increasingly required to operate far from their optimal operating point, in order to mitigate the intermittency of renewable solar or wind energy. In these systems, hydrodynamic instability develops, which has a significant effect on the machines’ performance. So we’re trying to optimize the operation of these turbines to limit yield loss.

What scientific challenges do you currently face as you continue your efforts to improve simulations and our understanding of turbulent flows?

GB: At the technical level, we’re trying to improve our simulation codes to take full advantage of advances in supercomputers. We’re also trying to improve our numerical methods and models to increase our predictive capacity. For example, we’re now trying to integrate learning tools to avoid simulating small eddies and save computing time. I’ve started working with Ronan Fablet, a researcher at IMT Atlantique, on precisely this topic. Then, there’s the huge challenge of ensuring the reliability of the simulations carried out. As it stands now, if you give a simulation code to three engineers, you’ll end up with different models. This is due to the fact that the tools aren’t objective, and a lot depends on the individuals using them. So we’re working on mesh and simulation criteria that are objective. This should eventually make it possible for industry players and researchers to work with the same foundations, and to better understand one another when discussing turbulent flows.

 


Véronique Bellon-Maurel: from infrared spectroscopy to digital agriculture

Measuring and quantifying have informed Véronique Bellon-Maurel’s entire scientific career. A pioneer in near infrared spectroscopy, the researcher’s work has ranged from analyzing fruit to digital agriculture. Over the course of her fundamental research, Véronique Bellon-Maurel has contributed to the optimization of many industrial processes. She is now the Director of #DigitAg, a multi-partner Convergence Lab, and is the winner of the 2019 IMT-Académie des Sciences Grand Prix. In this wide-ranging interview, she retraces the major steps of her career and discusses her seminal work.   

 

You began your research career by working with fruit. What did this research involve?

Véronique Bellon-Maurel: My thesis dealt with the issue of measuring the taste of fruit in sorting facilities. I had to meet industrial requirements, particularly in terms of speed: three pieces of fruit per second! The best approach was to use near infrared spectroscopy to measure the sugar level, which is indicative of taste. But when I was beginning my thesis in the late 1980s, it took spectrometers one to two minutes to scan a piece of fruit. I suggested working with very near infrared, meaning a different type of radiation than the infrared that had been used up to then, which made it possible to use new types of detectors that were very fast and inexpensive.

So that’s when you started working on near infrared spectroscopy (NIRS), which went on to become your specialty. Could you tell us what’s behind this technique with such a complex name?

VBM: Near infrared spectroscopy (NIRS) is a method for analyzing materials. It provides a simple way to obtain information about the chemical and physical characteristics of an object by illuminating it with infrared light, which will pass through the object and become charged with information. For example, when you place your finger on your phone’s flashlight, you’ll see a red light shining through it. This light is red because the hemoglobin has absorbed all the other colors of the original light. So this gives you information about the material the light has passed through. NIRS is the same thing, except that we use particular radiation with wavelengths that are located just beyond the visible spectrum.
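In the simplest case – a clear, non-scattering sample, which real fruit is not, as the rest of the interview explains – the link between the measured light and the chemical composition is the Beer-Lambert law:

```latex
A(\lambda) = \log_{10}\frac{I_0(\lambda)}{I(\lambda)} = \varepsilon(\lambda)\,\ell\,c,
```

where A is the absorbance at wavelength λ, I₀ and I the incident and transmitted intensities, ε the absorptivity of the compound of interest (sugar, for instance), ℓ the optical path length and c the concentration. Scattering by cell walls, particles and air pockets breaks this simple proportionality, which is precisely the “physical component” discussed below.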

Out of all the methods for analyzing materials, what makes NIRS unique?

VBM: Near infrared waves pass through materials easily. Much more easily than “traditional” infrared waves, which are called “mid-infrared.” They are produced by simple sources such as sunlight or halogen lamps. The technique is therefore readily available and is not harmful: it is used on babies’ skulls to assess the oxygen saturation of their brains! But when I was starting my career, there were major drawbacks to NIRS. The signal we obtain is extremely cluttered because it contains information about both the physical and chemical components of the object.

And what is hiding behind this “cluttered signal”?

VBM: In concrete terms, you obtain hill-shaped curves and the shape of these curves depends on both the object’s chemical composition and its physical characteristics. You’ll get a huge hill that is characteristic of water. And the signature peak of sugar, which allows you to calculate a fruit’s sugar level, is hidden behind it. That’s the chemical component of the spectrum obtained. But the size of the hills also depends on the physical characteristics of your material, such as the size of the particles or cells that make it up, physical interfaces — cell walls, corpuscles — the presence of air etc. Extracting solely the information we’re interested in is a real challenge!

Near infrared spectra of apples.

 

One of your earliest significant findings for NIRS was precisely that – separating the physical component from the chemical component on a spectrum. How did you do that?

VBM: The main issue at the beginning was to get away from the physical component, which can be quite a nuisance. For example, light passes through water, but not the foam in the water, which we see as white, even though they are the same molecules! Depending on whether or not the light passes through foam, the observation — and therefore the spectrum — will change completely. Fabien Chauchard was the first PhD student with whom I worked on this problem. To better understand this optical phenomenon, which is called diffusion, he went to the Lund Laser Center in Sweden. They have highly-specialized cameras: time-of-flight cameras, which operate at a very high speed and are able to capture photos “in flight.” We send photons onto a fruit in an extremely short period of time and we recover the photons as they come out since not all of them come out at the same time. In our experiments, if we place a transmitter and a receiver on a fruit spaced 6 millimeters apart, when they came out, certain photons had travelled over 20 centimeters! They had been reflected, refracted, diffracted etc. inside the fruit. They hadn’t travelled in a straight line at all. This gave rise to an innovation, spatially resolved spectroscopy (SRS) developed by the Indatech company that Fabien Chauchard started after completing his PhD.

We looked for other optical arrangements for separating the “chemical” component from the “physical” component. Another PhD student, Alexia Gobrecht, with whom I worked on soil, came up with the idea of using polarized near infrared light. If the photons penetrate the soil, they lose their polarization. Those that have only travelled on the surface retain it. By differentiating between the two, we recover spectra that only depend on the chemical component. This research on separating chemical and physical components was continued in the laboratory, even after I stopped working on it. Today, my colleagues are very good at identifying aspects that have to do with the physical component of the spectrum and those that have to do with the chemical component. And it turns out that this physical component is useful! And to think that twenty years ago, our main focus was to get rid of it.

After this research, you transitioned from studying fruit to studying waste. Why did you change your area of application?

VBM: I’d been working with the company Pellenc SA on sorting fruit since around 1995, and then on detectors for grape ripeness. Over time, Pellenc transitioned to waste characterization for the purpose of sorting, based on the infrared knowledge developed through sorting fruit. They therefore called on us, with a new speed requirement, but this one was much tougher. A belt conveyor moves at a speed of several meters per second. In reality, the areas of application for my research were already varied. In 1994, while I was still working on fruit with Pellenc, I was also carrying out projects for biodegradable plastics. NIRS made it possible to provide quality measurements for a wide range of industrial processes. I was Ms. “Infrared sensors!”

 

“I was Ms. ‘Infrared sensors’!”
– Véronique Bellon-Maurel

 

Your work on plastics was among the first in the scientific community concerning biodegradability. What were your contributions in this area?

VBM: 1990 was the very beginning of biodegradable plastics. Our question was whether we could measure a plastic’s biodegradability in order to say for sure, “this plastic is truly biodegradable.” And to do so as quickly as possible, so why not use NIRS? But first, we had to define the notion of biodegradability with a laboratory test. For 40 days, the plastics were put in reactors in contact with microorganisms, and we measured their degradation. We were also trying to determine whether this test was representative of biodegradability in real conditions, in the soil. We buried hundreds of samples in different plots of land in various regions and we dug them up every six months to compare real biodegradation with biodegradation in the laboratory. We wanted to find out if the NIRS measurement was able to achieve the same result, which was estimating the degradation kinetics of a biodegradable plastic – and it worked. Ultimately, this benchmark research on the biodegradability of plastics contributed to the industrial production and deployment of the biodegradable plastics that are now found in supermarkets.

For that research, was your focus still on NIRS?

VBM: The crux of my research at that time was the rapid, non-destructive characterization — physical or chemical— of products. NIRS was a good tool for this. We used it again after that on dehydrated household waste in order to assess the anaerobic digestion potential of waste. With the laboratory of environmental biotechnology in Narbonne, and IMT Mines Alès, we developed a “flash” method to quickly determine the quantity of bio-methane that waste can release, using NIRS. This research was subsequently transferred to the Ondalys company, created by Sylvie Roussel, one of my former PhD students. My colleague Jean-Michel Roger is still working with them to do the same thing with raw waste, which is more difficult.

So you gradually moved from the agri-food industry to environmental issues?

VBM: I did, but it wasn’t just a matter of switching topics, it also involved a higher degree of complexity. In fruit, composition is restricted by genetics – each component can vary within a known range. With waste, that isn’t the case! This made environmental metrology more interesting than metrology for the food industry. And my work became even more complex when I started working on the topic of soil. I wondered whether it would be possible to easily measure the carbon content in soil. This took me to Australia, to a specialized laboratory at the University of Sydney. To my mind, all this different research is based on the same philosophy: if you want to improve something, you have to measure it!

So you no longer worked with NIRS after that time? 

VBM: A little less, since I changed from sensors to assessment. But even that was a sort of continuation: when sensors were no longer enough, how could we make measurements? We had to develop assessment methods. It’s all very well to measure the biodegradability of a plastic, but is that enough to determine whether that biodegradable plastic has a low environmental impact? No, it isn’t – the entire system must be analyzed. I started working on life-cycle analysis (LCA) in Australia after realizing that LCA methods were not suited to agriculture: they did not account for water or for land use. Based on this observation, we improved the LCA framework to develop the concept of a regional LCA, which didn’t exist at the time, allowing us to make an environmental assessment of a region and compare scenarios for how this region might evolve. What I found really interesting with this work was determining how to use data from information systems and sensors to build the most reliable and reproducible model possible. I wanted the assessments to be as accurate as possible. This is what led me to my current field of research – digital agriculture.

Read more on I’MTech: The many layers of our environmental impact

You went on to found #DigitAg, an institute dedicated to this topic. What research is carried out there?

VBM: The “Agriculture – Innovation 2025” report submitted to the French government in 2015 expresses a need to structure French research on digital agriculture. We took advantage of the opportunity to create Convergence Labs by founding #DigitAg, the Digital Agriculture Convergence Lab. It’s one of ten institutes funded by the Investments in the Future program. All of these institutes were created in order to carry out interdisciplinary research on a major emerging issue. At #DigitAg, we draw on engineering sciences, digital technology, biology, agronomy, economics, social sciences, humanities, management, etc. Our aim is to establish knowledge bases to ensure that digital agriculture develops in a harmonious way. The challenge is to develop technologies but also to anticipate how they will be used and how such uses will transform agriculture – we have to predict how technologies will be used and the impacts they will have to help ensure ethical uses and prevent misuse. To this end, I’ve also set up a living lab, Occitanum – for Occitanie Digital Agroecology – set to start in mid-2020. The lab will bring together stakeholders to assess the use value of different technologies and understand innovation processes. It’s a different way of carrying out research and innovation, by incorporating the human dimension.


Fukushima: 8 years on, what has changed in France?

Fukushima was the most devastating nuclear disaster since Chernobyl. The 1986 disaster led to radical changes in international nuclear governance, but has the Japanese catastrophe had the same effect? This is what the AGORAS project is trying to find out. IMT Atlantique, the IRSN, Mines ParisTech, Orano and SciencesPo are all working on the AGORAS project, which aims to understand the impact of Fukushima on France’s nuclear industry. Stéphanie Tillement, a sociologist at IMT Atlantique explains the results of the project, which is coming to an end after 6 years of research.

 

Why do we need to know about the consequences of a Japanese nuclear incident in France?

Stéphanie Tillement: Fukushima was not just a shock for Japan. Of course, the event had an influence everywhere nuclear energy is an important part of energy production, such as Europe, North America and Russia; but it also affected less nuclearized countries. Fukushima called into question the safety, security and reliability of nuclear power plants. Groups which are strongly involved in the industry, such as nuclear operators, counter-experts, associations and politicians, were all affected by the event. Therefore, we expected that Fukushima would have a strong impact on nuclear governance. There is also another, more historical, reason: both the Chernobyl and Three Mile Island accidents had an impact on the organization of the nuclear industry. So Fukushima could be part of this trend.

How did Chernobyl and Three Mile Island impact the industry?

ST: The consequences of nuclear disasters are generally felt 10 to 20 years after the event itself. In France, Chernobyl contributed to the 2006 French Nuclear Safety and Transparency Act, which marked a major change in the nuclear risk governance system. This law notably led to the creation of the French Nuclear Safety Authority, ASN. A few years earlier, the French Radioprotection and Nuclear Safety Institute, IRSN, was created. The 2006 law still regulates the French nuclear industry today. The Three Mile Island disaster caused the industry to question people’s involvement in these complex systems, notably in terms of human error. This led to major changes in human-computer interfaces within nuclear infrastructure, and the understanding of human error mechanisms.

Has the Fukushima accident led to similar changes?

ST: The disaster was in 2011; it’s not even been 10 years since it happened. However, we can already see that Fukushima will probably not have the same effect in France as the other accidents. Rather than criticizing the French system, industry analysis of Fukushima has emphasized the benefits of France’s current mode of governance. Although technical aspects have undergone changes, particularly regarding the complementary safety assessments, the relationships between nuclear operators, the ASN and the IRSN have not changed after Fukushima.

Why has this disaster not considerably affected the French mode of governance?

ST: At first, the French nuclear industry thought that the Fukushima disaster was unlikely to happen in France, as the Japanese power plant was managed in a completely different way. In Japan, several operators share the country’s nuclear power plants. When analyzing crisis management, the post-accident report showed that the regulator’s independence was not guaranteed, and that there was collusion between the government, the regulators and the operators. In France, the Nuclear Safety and Transparency Act strictly regulates relationships between industry players and guarantees the independence of each party. This is a strength of the French governance model that is recognized internationally. As well as this, French nuclear power plants are managed by a single operator, EDF, which runs a fleet of 58 standardized reactors. The governance issues in Japan reassured French operators, as they confirmed that legally enforcing the independence of the regulatory authority was the right thing to do.

How did the anti-nuclear movement respond to this lack of change?

ST: During our investigations into Fukushima, we realized that the accident did not create any new anti-nuclear movements or opinions. Opposition already existed. There is no denying that the event gave these anti-nuclear organizations, collectives and experts some material, but this didn’t radically change their way of campaigning nor their arguments. This again shows how Fukushima did not cause major changes. The debate surrounding the nuclear industry is still structured in the same way as it was before the disaster.

Does that also mean that there have been no political consequences post-Fukushima?

ST: No, and that’s also one of the findings of the AGORAS project. Recent political decisions on nuclear sector strategy have been mainly made according to processes established before the Fukushima accident. For example, the cancellation of the ASTRID project was not due to a radical political change in the nuclear sector, but actually because of economic arguments and a lack of political desire to tackle the subject. Clearly, politicians do not want to tackle these issues, as the decisions they make have an impact in 10, 20, or even 30 years’ time. This just doesn’t work with their terms of office. The political turnover also means that very few politicians know enough about the subject, which raises questions about the government’s ability to get involved in nuclear, and therefore energy politics.

Read on I’MTech: What nuclear risk governance exists in France?

Your work suggests that there has been almost no change in any aspect of nuclear governance

ST: The AGORAS project started by asking the question: did Fukushima cause a change in governance in the same way as the accidents that preceded it? If we look at it from this perspective, our studies say no, for all the reasons that I’ve already mentioned. However, we need to put this into context. Many things have changed, just not in the same radical way as they did after Chernobyl or Three Mile Island. Amongst these changes is the modification of certain technical specifications for infrastructure. For example, one of the reasons why ASN called for EDF to review the welding of its EPR reactor was technical developments decided following Fukushima. There have also been changes in crisis management and post-accident management.

How have we changed the way we would manage this type of disaster?

ST: Following Fukushima, a rapid response force for nuclear accidents (FARN) was created in France to manage the emergency phase of an accident. Changes were also made to the measures taken during a crisis, so that civil security services and prefects can act more quickly. The most notable changes have been in the post-accident phase. Historically, accident preparation measures were mainly focused on the emergency phase. As a result, the different roles are well defined in this phase. However, Fukushima showed that managing the after-crisis is just as important. What’s unique about a nuclear accident is that it has extremely long-term consequences. In Fukushima, once the emergency phase was over, the organization became less defined. No one knew who was responsible for controlling food consumption, soil contamination or urban planning. The local information commissions (CLIS) have therefore worked with nuclear operators to improve the post-accident phase in particular. But, once again, our research has shown that this work was started before the Fukushima disaster. The accident just accelerated these processes and increased the importance of this issue.

Fukushima took place less than 10 years ago; do you plan on continuing your work and studying the implications of the disaster after 10 and 20 years have passed?

ST: We would particularly like to continue to address other issues and to develop our results further. We have already carried out field research with ASN, IRSN, local information commissions, politicians, associations, and manufacturers such as Framatome or Orano. However, one of the biggest limitations to our work is that we cannot work with EDF, which is a key player in nuclear risk governance. In the future, we want to be able to work with plant operators, so we can study the impact of an accident on their operations. As well as this, politicians’ understanding could also be improved. Understanding politicians’ opinions regarding nuclear governance, and the nuclear strategy decision-making process, is a real challenge.


What is a lithium-ion battery?

The lithium-ion battery is one of the best-sellers of recent decades in microelectronics. It is present in most of the devices we use in our daily lives, from our mobile phones to electric cars. The 2019 Nobel Prize in Chemistry was awarded to John Goodenough, Stanley Whittingham and Akira Yoshino, in recognition of their initial research that led to its development. In this new episode of our “What’s?” series, Thierry Djenizian explains the success of this component. Djenizian is a researcher in microelectronics at Mines Saint-Étienne and is working on the development of new generations of lithium-ion batteries.

 

Why is the lithium-ion battery so widely used?

Thierry Djenizian: It offers a very good balance between storage and energy output. To understand this, imagine two containers: a glass and a large bottle with a small neck. The glass contains little water but can be emptied very quickly. The bottle contains a lot of water but will be slower to empty. The electrons in a battery behave like the water in the containers. The glass is like a high-power battery with a low storage capacity, and the bottle a low-power battery with a high storage capacity. Simply put, the lithium-ion battery is like a bottle but with a wide neck.

How does a lithium-ion battery work?

TD: The battery consists of two electrodes separated by a liquid called the electrolyte. One of the two electrodes is an alloy containing lithium. When you connect a device to a charged battery, the lithium will spontaneously oxidize and release electrons – lithium is the chemical element that releases electrons most easily. The electrical current is produced by the electrons flowing between the two electrodes via an electrical circuit, while the lithium ions from the oxidation reaction migrate through the electrolyte into the second electrode.

The lithium ions will thus be stored until they no longer have any available space or until the first electrode has released all its lithium atoms. The battery is then discharged and you simply apply a current to force the reverse chemical reactions and have the ions migrate in the other direction to return to their original position. This is how lithium-ion technology works: the lithium ions are inserted into and extracted from the electrodes reversibly depending on whether the battery is charging or discharging.
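As an illustration, here are the discharge half-reactions for one common chemistry, a graphite / lithium-cobalt-oxide cell (the interview does not refer to a specific chemistry); charging simply reverses both reactions:

```latex
\text{negative electrode:}\quad \mathrm{LiC_6 \;\longrightarrow\; C_6 + Li^{+} + e^{-}}
\qquad
\text{positive electrode:}\quad \mathrm{CoO_2 + Li^{+} + e^{-} \;\longrightarrow\; LiCoO_2}
```

The electrons travel through the external circuit, powering the device, while the Li⁺ ions migrate through the electrolyte from one electrode to the other.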

What were the major milestones in the development of the lithium-ion battery?

TD: Whittingham discovered a high-potential material composed of titanium and sulfur capable of reacting with lithium reversibly, then Goodenough proposed the use of metal oxides. Yoshino developed the first marketable lithium-ion battery using graphite and a metal oxide as electrodes, which considerably reduced the size of batteries.

What are the current scientific issues surrounding lithium-ion technology?

TD: One of the main trends is to replace the liquid electrolyte with a solid electrolyte. It is best to avoid the presence of flammable liquids, which also present risks of leakage, particularly in electronic devices. If the container is pierced, this can have irreversible consequences on the surrounding components. This is particularly true for sensors used in medical applications in contact with the skin. Recently, for example, we developed a connected ocular lens with our colleagues from IMT Atlantique. The lithium-ion battery we used included a solid polymer-based electrolyte because it would be unacceptable for the electrolyte to come into contact with the eye in the event of a problem. Solid electrolytes are not new. What is new is the research work to optimize them and make them compatible with what is expected of lithium-ion batteries today.

Are we already working on replacing the lithium-ion battery?

TD: Another promising trend is to replace the lithium with sodium. The two elements belong to the same family and have very similar properties. The difference is that lithium is extracted from mines at a very high environmental and social cost. Lithium resources are limited. Although lithium-ion batteries can reduce the use of fossil fuels, if their extraction results in other environmental disasters, they are less interesting. Sodium is naturally present in sea salt. It is therefore an unlimited resource that can be extracted with a considerably lower impact.

Can we already do better than the lithium-ion battery for certain applications?

TD: It’s hard to say. We have to change the way we think about our relationship to energy. We used to solve everything with thermal energy. We cannot use the same thinking for electric batteries. For example, we currently use lithium-ion button cell batteries for the internal clocks of our computers. For this very low energy consumption, a button cell has a life span of several hundred years, while the computer will probably be replaced in ten years. A 1mm² battery may be sufficient. The size of energy storage devices needs to be adjusted to suit our needs.

Read on I’MTech: Towards a new generation of lithium batteries?

We also have to understand the characteristics we need. For some uses, a lithium-ion battery will be the most appropriate. For others, a battery with a greater storage capacity but a much lower output may be more suitable. For still others, it will be the opposite. When you use a drill, for example, it doesn’t take four hours to drill a hole, nor do you need a battery that will remain charged for several days. You want a lot of power, but you don’t need a lot of autonomy. “Doing better” than the lithium-ion battery, perhaps simply means doing things differently.

What does it mean to you to have a Nobel Prize awarded to a technology that is at the heart of your research?

TD:  They are names that we often mention in our scientific publications, because they are the pioneers of the technologies we are working on. But beyond that, it is great to see a Nobel Prize awarded to research that means something to the general public. Everyone uses lithium-ion batteries on a daily basis, and people recognize the importance of this technology. It is nice to know that this Nobel Prize in Chemistry is understood by many people.


When engineering helps improve healthcare

Editorial.

 

Tomorrow’s medicine will be at least 4P: personalized, preventive, predictive, participative. ‘At least,’ because some would readily add “precise,” “proof” (evidence-based), “pathway-based” etc. Beyond naming this type of medicine and determining the correct number of Ps, medicine is clearly undergoing a profound change. A transformation supported in part by engineering, which is bringing major innovations to the healthcare industry. New technologies — whether in relation to digital technology or materials — have led to advances in many areas of medicine.

Researchers at Mines Saint-Étienne’s Centre for Biomedical and Health Engineering (CIS) are carrying out their research in the heart of the Saint-Étienne hospital campus. They are working to improve health systems and technology, in daily contact with medical professionals. Jérémie Pourchez is developing an artificial lung model to study certain respiratory diseases and the toxicity of inhaled particles. David Marchat is developing a new generation of bone implants to facilitate bone regeneration in grafts. As such, they are directly addressing needs for knowledge and tools expressed by practitioners.


Meanwhile, at IMT Lille Douai, Caroline Duc is developing an “artificial nose,” an electronic device that can monitor patients by analyzing their breath – an innovation which could eventually be used to help diagnose diseases.

Sometimes innovation comes from unexpected places. This is the case for a new type of liquid xenon scanner developed at IMT Atlantique. The technology developed by physicists was drawn directly from their efforts to search for dark matter. In the future, we may be able to detect the smallest diseased cells based on our observations of the most distant galaxies!

To learn more about new healthcare technologies and how they are impacting patients and practitioners, I’MTech suggests a selection of our archives on this topic:

Dominique Thers’ team at IMT Atlantique is working on XEMIS, a medical imaging device that uses liquid xenon.

Is dark matter the key to the medical scanner of the future?

A team of researchers at IMT Atlantique is developing a new type of medical scanner called XEMIS. To create the device, the team drew on their previous research in fundamental physics and the detection of dark matter, using liquid xenon technology. The device will first be tested on small animals. It should allow the scientists to significantly lower the injected dose, shorten examination times and improve the resolution of the images produced.

 

This article is part of our dossier “When engineering helps improve healthcare”.

For the past 10 years, researchers at IMT Atlantique have been tracking dark matter as part of the XENON international collaboration. Their approach, which uses liquid xenon, currently makes them world leaders in one of the biggest mysteries of the universe: what is dark matter made of? Although the answer to this question is still waiting to be discovered, the team’s fundamental work in physics has already given rise to new ideas… in medicine! As well as detecting dark matter, the innovations produced by the XENON collaboration have proven to be extremely useful for medical imaging, as they are much more efficient than current scanners.

Improving current medical imaging techniques is one of the great challenges for medicine’s future. The personalization of patients’ healthcare and follow-up care, as well as the prediction of illnesses, mean that doctors need to acquire more patient data, more often. However, having to remain still for 30 minutes straight is not enjoyable for patients – especially when they are now being asked to get scans more often! For hospitals, more examinations mean more machines and more staff. This means that faster imaging techniques are not only more practical for patients, but also more economical for health services.

All over the world, several initiatives are competing to try and find the scanners of the future. These all have similar structures to scanners that are currently being used. “This is an area of study where investment is important,” states Dominique Thers, a researcher in fundamental physics at IMT Atlantique and French coordinator of the XENON collaboration. Manufacturers are improving the scanners logically, for example by increasing their camera size or resolution. “Since we are researching dark matter, our technology comes from another field. This means that our solution is completely different from the routes that are currently being explored,” highlights Thers, whilst reminding us that his team’s work is still in the research phase, and not for industrial use.

A xenon bath

The physicists’ solution has been named XEMIS (Xenon Medical Imaging System). Although he uses the word ‘camera’, Dominique Thers’ description of the device is completely different from anything we could have imagined. “XEMIS is a cylindrical bathtub filled with liquid xenon.” Imagine a donut laid on its side and then stretched out lengthways, with a hole in the middle to form a tube. The patient lies down inside the hole, surrounded by the tube’s 12 cm-thick wall, which is filled with liquid xenon. Although XEMIS is shaped like any other scanner, the imaging principle is completely different. In this device, the entire tube is the ‘camera’.

To understand this, let’s go into more detail. Currently, there are two main types of medical scanners: the CT scanner and the PET scanner. The CT scanner uses an X-ray source which passes through the patient’s body and is received by a receptor on the other side of the tube. For the PET scanner, on the other hand, the patient is injected with a weakly radioactive substance. This substance then emits ionizing radiation, which is detected by a ring of sensors that moves along the patient in the tube.

However, both devices have a major limitation, known as the parallax effect. Since the sensors face the center of the tube, their detection capacity is not the same in every direction around the patient. Image resolution is therefore better in the center of the field of view than at the edge. This is why a PET scanner can only produce medical images section by section, as the receptors need to be repositioned to get an accurate scan of each area of the patient.

Although XEMIS uses an injection like the PET scanner, there is no need to move sensors during the scan, as each atom of liquid xenon surrounding the patient acts as a sensor. This means that it has a large field of view that offers the same image quality in every direction. The device offers a huge advantage; now, there is no longer any need to move the sensors and work bit-by-bit. In the same amount of time as a traditional scan, XEMIS provides a more precise image; or the same quality image in a shorter amount of time.

Three photons are better than two

The highly consistent level of detection is not the only advantage of this new scanner. “The substances in traditional injections, such as those with a fluorine-18 base, emit two diametrically opposed gamma rays”, explains Dominique Thers. “When they come into contact with the xenon surrounding the patient, two things happen: light is emitted, and an electrical current is produced”. This method, which is based on contact with xenon, comes directly from the approaches used to detect dark matter.

Therefore, on both sides of the tube there are interaction points with the xenon. The line between these two points passes through the patient and, more precisely, through the point where the gamma rays were emitted. Algorithms then record these signals and join them together with the associated points in the patient. This is then used to build an image. “Because XEMIS technology eliminates the parallax effect, the signal-to-noise ratio of the image created by this detection method is ten times better than the images produced by classical PET scanners,” explains Dominique Thers.

As well as this, XEMIS cameras give doctors a new imaging modality called three-photon imaging! Instead of using fluorine-18, the researchers would inject the patient with scandium-44, which is made by the ARRONAX cyclotron in Nantes. This isotope emits two diametrically opposed photons, as well as a third. “Not only do we have a line that passes through the patient, but XEMIS also measures a hollow cone which contains the third photon. This cone passes through the emission point.” The additional geometric information allows the machine to efficiently calculate the emission point using triangulation, which ultimately leads to an even better signal-to-noise ratio.
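The geometry behind this triangulation can be illustrated with a toy calculation (a simplified sketch, not the reconstruction algorithm actually used in XEMIS): the emission point lies both on the line joining the two annihilation-photon interaction points and on the cone defined by the third photon, so finding it amounts to intersecting a line with a cone. All coordinates and the cone half-angle below are made-up example values.

```python
import numpy as np

def line_cone_intersection(a, b, apex, axis, half_angle):
    """Intersect the segment [a, b] with a cone.

    a, b       : interaction points of the two annihilation photons
    apex       : interaction point of the third photon (cone apex)
    axis       : unit vector along the cone axis
    half_angle : cone half-angle in radians
    Returns the intersection points lying on the segment.
    """
    u = b - a                    # direction of the line of response
    w = a - apex
    cos2 = np.cos(half_angle) ** 2
    # A point p = a + t*u lies on the cone surface when
    # dot(p - apex, axis)^2 = cos^2(theta) * |p - apex|^2,
    # which is a quadratic equation A*t^2 + B*t + C = 0.
    A = np.dot(u, axis) ** 2 - cos2 * np.dot(u, u)
    B = 2 * (np.dot(u, axis) * np.dot(w, axis) - cos2 * np.dot(w, u))
    C = np.dot(w, axis) ** 2 - cos2 * np.dot(w, w)
    if A == 0 and B == 0:
        return []                # degenerate configuration, ignored here
    if B ** 2 - 4 * A * C < 0:
        return []                # no real intersection
    roots = np.roots([A, B, C]) if A != 0 else [-C / B]
    points = []
    for t in roots:
        if np.isreal(t) and 0.0 <= t.real <= 1.0:
            p = a + t.real * u
            # keep only the correct nappe of the double cone
            if np.dot(p - apex, axis) >= 0:
                points.append(p)
    return points

# Example: line of response along the x axis, cone apex above it.
a = np.array([-10.0, 0.0, 0.0])
b = np.array([10.0, 0.0, 0.0])
apex = np.array([2.0, 5.0, 0.0])
axis = np.array([0.0, -1.0, 0.0])           # cone opens downwards
print(line_cone_intersection(a, b, apex, axis, np.deg2rad(30)))
```

In general the cone cuts the line of response in at most two points; additional information from the detector (measured energies, geometry) can then select the physically meaningful one as the estimated emission position.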

According to Dominique Thers, “By removing the parallax effect, the device can improve the signal-to-noise ratio of a medical image of a human by a factor of ten. It can improve this ratio by another factor of ten if it uses a camera that can capture the entire patient in its field of view, and another factor of ten with the third photon. In total, this means that XEMIS can improve the signal-to-noise ratio by a factor of 1,000.” This makes it possible to make adjustments that can reduce the injected dose, or to improve the imaging protocol in order to increase the frame rate of the cameras.

Can this become an industrial scanner?

Researchers at IMT Atlantique have already demonstrated the effectiveness of XEMIS using three-photon radioactive tracers, but they have not yet tried this with living things. A second phase of the project, called XEMIS2, is now coming to an end. “Our goal now is to produce the first image of a living thing using a small animal, for example. This will demonstrate the huge difference between XEMIS and traditional scanners”, Thers explains. This is a key step in ensuring that the technology is adopted by the medical imaging community. The team are working with the Nantes University Hospital, which should help them to achieve this objective.

In the meantime, the IMT Atlantique team has already patented several innovations with Air Liquide for a xenon renewal and emptying system for the XEMIS camera. 30% of this technology uses cryogenic liquids. It is important that the team have already planned technical processes that will make it easier for health services to use XEMIS, just in case the scanner is adopted by the medical profession. This is a step forward in the international development of this unique technology.