SOCCRATES

SOCCRATES automates cybersecurity for industrial systems

SOCCRATES is an H2020 European project launched in September 2019 for a three-year period. It aims to develop a platform to automate the detection of certain attacks and launch appropriate countermeasures. In doing so, it should help cyber security operators for industrial systems act more quickly and effectively in the event of a cyber attack. Hervé Debar, an information systems security researcher at Télécom SudParis, explains how the research consortium, which includes the school, is going about developing this solution.

 

What is the SOCCRATES platform?

Hervé Debar: The SOCCRATES platform is a “Security Information and Event Management” environment that aims to detect and block cyber-attacks more effectively. To do so, the platform collects data about the vulnerabilities present on the monitored system, malicious activity targeting the IT environment, and general information about the threat. It then proposes appropriate countermeasures for the attacks that are detected and makes it possible to implement them.

How does it hope to address the needs of companies and organizations?

HD: SIEM platforms are the core of Security Operations Centers (SOC), where operators manage cyber threats. All operators of critical infrastructures must monitor their information systems as required by French and European regulations. Faced with growing threats, the SOCCRATES platform aims to provide a greater degree of automation, making it possible to respond to attacks more quickly and precisely. Operators could then focus on the most complex attacks.

What is your approach to developing this platform?

HD: The project focuses primarily on the knowledge with which SOC operators are provided in order to respond to attacks. This knowledge takes one of three forms. The first is increased knowledge of the monitored information system, and of the potential attack paths that could be used to compromise a vulnerable target. Blocking the easiest attack paths can help prevent a hacker from spreading throughout the system. The second form of knowledge is based on an understanding of the threat. This means observing internet attack phenomena in order to improve the detection mechanisms used. And the third form of knowledge involves understanding the impact an attack has on operations in order to assess the risks of countermeasures and the benefits in terms of limiting the impact of an attack.

What expertise are Télécom SudParis researchers contributing to this project?

HD: We’re contributing our expertise in cyber attack remediation, which we developed in particular through the MASSIF and PANOPTESEC European FP7 projects. Our work on these two projects, which were launched in 2013 and 2014, gave us the opportunity to develop in-depth knowledge about industrial cybersecurity, managing attacks and implementing countermeasures. Our response model provides a quantitative assessment of the impact — whether positive or negative — of the remediations proposed to block attacks.

Read more on I’MTech: SPARTA: Defining Cybersecurity in Europe

How do you plan to test the effectiveness of the SOCCRATES platform?

HD: The platform will be implemented and deployed in two pilot environments involving critical infrastructures: in the field of cloud computing with the company Mnemonic, and in the energy sector with Vattenfall. Mnemonic is a managed security service provider. At Vattenfall, the SOCCRATES platform will be used to monitor networks that control electricity production and distribution.

Beyond these industry partners, how is the project organized?

HD: SOCCRATES is coordinated by the Netherlands Organisation for Applied Scientific Research (TNO). In addition to IMT, there are three Scandinavian partners (KTH, Foreseeti and Mnemonic), a Finnish partner (F-Secure), ATOS Spain, Vattenfall IT Services (Poland), the Austrian Institute of Technology (AIT), and another Dutch partner, ShadowServer. This consortium is divided into three kinds of contributions: vulnerability analysis, behavioral detection, and attack remediation. Our first major step is to define the use cases and demonstration scenarios that we will use to develop, approve and demonstrate the components of the project. We plan to do this by the end of January.

Learn more about SOCCRATES

DeNoize

A window, and silence!

To combat noise pollution and its effects on human health, DeNoize, a start-up incubated at Mines Saint-Étienne, offers a solution: a window that ‘mutes sound’. This connected window would analyze outside noise and adapt to cancel it out.

 

Double glazing increases thermal insulation, but when it comes to noise, it’s another story. When we’re indoors at home or the office, the vast majority of the outside noise that reaches us comes through the windows. This is an especially big concern for people who live or work near airports or major roads. Since May 2018, DeNoize co-founders Olivier Schevin and Aman Jindal have made it their mission to reduce this noise pollution, which is detrimental to our health.

DeNoize, a start-up incubated at Mines Saint-Étienne, offers an innovative solution for improving sound insulation in windows. “Our challenge is now to miniaturize the system so that it can be integrated into window frames,” says co-founder Olivier Schevin. The concept could easily be integrated into standard windows available today.

The problem with double glazing

“Double glazing is actually less effective than single glazing when it comes to sound insulation for the same thickness of glass,” says Olivier Schevin. Although it may seem counterintuitive, double glazing offers less resistance to low frequencies, between 50 and 500 Hz, a frequency band that is the main source of noise from airports and roads. “Double glazing was designed to solve thermal insulation problems, without considering the acoustic aspect,” he explains.

Double glazing is first and foremost two masses, the panes, with air or gas between them. This structure poses a problem from an acoustic point of view: certain frequencies – low frequencies – cause the air trapped between the panes to resonate, and the sound propagates. This effect may be counteracted by increasing the thickness of the panes or the space between them. This passive reduction results in a bulky look from an architectural viewpoint and is also very expensive.

Sound fights back

DeNoize’s innovation is to use sound to fight sound, making it an active noise reduction system. “We’re going to generate a counter-vibration suited to the vibration of the outside noise,” explains Olivier Schevin. “The system produces a vibration that counters that of the outside noise, creating a destructive interference.” The vibrations ‘cancel each other out,’ reducing the noise transmitted by up to 75% for low frequencies.  

“This technology is somewhat similar to that used in noise-cancelling headphones,” adds Olivier Schevin. “The technical difference is the surface of the area we want to treat. For the headphones, it’s a really small area close to the ear.” The system developed by DeNoize uses sensors to analyze outside noise in real time and adapt accordingly. The actuators produce a counter-vibration that interferes with the original noise. It must also include a control unit and an electronic board responsible for determining the most effective actions for sensors and actuators.
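
To make the destructive-interference principle concrete, here is a minimal, self-contained sketch in Python. The values are purely illustrative and this is not DeNoize's actual control algorithm, which is not public: a low-frequency tone is cancelled by an equal signal in anti-phase, and a small gain or phase error shows why real systems quote a percentage of reduction rather than total silence.

```python
import numpy as np

# Illustrative values only: a 100 Hz "outside noise" tone sampled at 8 kHz.
fs = 8000                       # sampling rate in Hz
t = np.arange(0, 0.1, 1 / fs)   # 100 ms of signal

noise = np.sin(2 * np.pi * 100 * t)   # incoming low-frequency vibration
anti_noise = -noise                   # counter-vibration in perfect anti-phase
residual = noise + anti_noise         # what would reach the room

print(f"perfect cancellation RMS: {np.sqrt(np.mean(residual**2)):.2e}")  # ~0

# In practice the counter-vibration is never perfect; a small gain and phase
# error leaves a residual, which is why reductions are quoted as percentages
# (e.g. "up to 75%") rather than total silence.
imperfect = -0.9 * np.sin(2 * np.pi * 100 * (t - 0.0002))
print(f"partial cancellation RMS: {np.sqrt(np.mean((noise + imperfect)**2)):.2e}")
```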

The system is integrated into the window frames and requires a nearby electrical connection to supply it with energy. This is already common today with rolling shutters, for example. The innovation is in step with advances in smart home technology.

Read more on I’MTech: Smart homes: A world of conflict and collaboration

This communication between actuators, sensors and the control unit makes it possible to customize noise reduction in real time, adapting to outside variations. “As of now, we have a working prototype,” says Olivier Schevin, “but the system doesn’t calculate in real time yet. So we still have a development phase ahead of us for the electronics part.”

Olivier Schevin is launching an industrial project with students to develop a real-time demonstrator. The electronic component is still to be developed, since the existing control unit  was made using laboratory equipment that cannot be integrated into window frames. “In general, we’re still looking for ways to improve performance at the lowest possible cost.”

digital twin

What is a digital twin?

Digital twins, digital doubles – what exactly do these terms mean? Raksmey Phan, an engineer at the Mines Saint-Étienne Centre for Biomedical and Health Engineering (CIS)[1], talks to us about the advantages and advances offered by these new tools, as well as the issues involved.

 

What does a digital twin refer to?

Raksmey Phan: If you have a digital, mathematical model representing a real system, based on data from this real system, then you have a digital twin. Of course, the quality of the digital twin depends first and foremost on the mathematical model. Industrial ovens are a historic example that can help explain this idea.

To create a digital twin, we record information about the oven, which could include its operating hours or the temperature each time it’s used. Combined with algorithms that take into account the physical components that make up the oven, this digital twin will calculate its rate of wear and tear and anticipate breakdown risks. The use of the oven can then be monitored in real time and simulated in its future state with different use scenarios in order to plan for its replacement.
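
As a toy illustration of this oven example, a digital twin can be reduced to a state that is updated with measurements from the real system and queried with "what if" scenarios. The wear model, constants and names below are hypothetical, chosen only to make the idea concrete:

```python
from dataclasses import dataclass

@dataclass
class OvenTwin:
    """Toy digital twin of an industrial oven (hypothetical wear model)."""
    operating_hours: float = 0.0
    wear: float = 0.0            # 0 = new, 1 = end of life
    WEAR_PER_HOUR = 1e-4         # assumed base wear rate per operating hour
    TEMP_PENALTY = 2.0           # assumed extra wear factor above 250 °C

    def record_run(self, hours: float, temperature: float) -> None:
        """Update the twin with data measured on the real oven."""
        factor = self.TEMP_PENALTY if temperature > 250 else 1.0
        self.operating_hours += hours
        self.wear += self.WEAR_PER_HOUR * hours * factor

    def hours_to_failure(self, planned_temperature: float) -> float:
        """Simulate a future usage scenario to anticipate replacement."""
        factor = self.TEMP_PENALTY if planned_temperature > 250 else 1.0
        return (1.0 - self.wear) / (self.WEAR_PER_HOUR * factor)

twin = OvenTwin()
twin.record_run(hours=8, temperature=300)               # data from the real oven
print(twin.hours_to_failure(planned_temperature=300))   # remaining life in this scenario
```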

In what fields are they used?

RP: They can be used in any field where there is data to be recorded. We could say that climatologists make a digital twin of our planet: based on observational data recorded about our planet, they run simulations built on mathematical models, resulting in different scenarios. To give another example, at the Mines Saint-Étienne CIS, we have scientists such as Xiaolan Xie, who are internationally renowned for their experience and expertise in the field of modeling healthcare systems. One of our current projects is a digital twin of the emergency department at Hôpital Nord de Saint-Étienne, which is located 200 meters from our center.

What advantages do digital twins offer?

RP: Let’s take the example of the digital twin of the emergency room. We’ve integrated anonymized patient pathways over a one-year period in a model of the emergency room. In addition to this mathematical model, we receive data in what can be referred to as ‘pseudo-real time,’ since there is a lapse of one hour from the time patients arrive in the department. This makes it possible for us to do two important things. The first is to track the patients’ movement through the department in pseudo-real time, using the data received and the analysis of pathway records. The second is the ability to plan ahead and predict future events. Imagine if there was a bus accident in the city center. Since we know what types of injuries result from such an accident, we can visualize the impact it would have on the department, and if necessary, call in additional staff.

What did people do before there were digital twins?

RP: Companies and industries were already using the concept before the term existed. For as long as we have used machines, engineers have tried to monitor tools with replicas – whether digital or on paper. It’s a bit like artificial intelligence: the term is back in fashion but the concept goes back much further. Algorithms are mathematics, and Napoleon used algorithms for his war logistics.

When did the term digital twin first start to be used?

RP: The term ‘digital twin’ was first used in 2002 in articles by Michael Grieves, a researcher at the Florida Institute of Technology. But the concept has existed for as long as we have been trying to model real phenomena digitally, which is to say since the early days of computing. There has been renewed interest in digital twins in recent years due to the convergence of three scientific and technological innovations. First, the impressive growth in our ability to analyze large amounts of data — Big Data. Second, the democratization of connected sensors — the Internet of Things. And third, renewed interest in algorithms in general, as well as in cognitive sciences — Artificial Intelligence.

How have the IoT and Big Data transformed digital twins?

RP: A digital twin’s quality depends on the quantity and quality of data, as well as on its ability to analyze this data, meaning its algorithms and computing capacity. IoT devices have provided us with a huge amount of data. The development of these sensors is an important factor – production has increased while costs have decreased. The price of such technologies will continue to drop, and at the same time, they will become increasingly accurate. That means that we’ll be able to create digital twins of larger, more complex systems, with a greater degree of accuracy. We may soon be able to make a digital twin of a human being (project in the works at CIS).

Are there technological limitations to digital twins?

RP: Over the last five years, everything’s been moving faster at the technological level. It’s turned into a race for the future. We’ll develop better sensors, and we’ll have more data, and greater computing power. Digital twins will also follow these technological advances. The major limitation is sharing data – the French government was right to take steps towards Open Data, which is free data, shared for the common good. Protecting and securing data warehouses are limiting factors but are required for the technological development of digital twins. In the case of our digital twin of the hospital, this involves a political and financial decision for hospital management.

What are some of the challenges ahead?

RP: The major challenge, which is a leap into the unknown, is ethics. For example, we can assess and predict the fragility of senior citizens, but what should we do with this information after that? If an individual’s health is likely to deteriorate, we could warn them, but without help it will be hard for them to change their lifestyle. However, the information may be of interest to their insurance providers, who could support individuals by offering recommendations (appropriate physical activity, accompanied walks etc.) This example hinges on the issues of confidentiality and anonymization of data, not to mention the issue of informed consent of the patient.

But it’s incredible to be talking about confidentiality, anonymization and informed consent as a future challenge  — although it certainly is the case — when for the past ten years or so, a portion of the population has been publishing their personal information on social media and sharing their data with wellness applications whose data servers are often located on another continent.

[1] Raksmey Phan is a researcher at the Laboratory of Informatics, Modelling and Optimization of Systems (LIMOS), a joint research unit between Mines Saint-Étienne/CNRS/Université Clermont-Auvergne.


AiiNTENSE

AiiNTENSE: AI for intensive care units

The start-up AiiNTENSE was incubated at IMT Starter and develops decision support tools for healthcare with the aim of advising intensive care personnel on the most appropriate therapeutic procedures. To this end, the start-up is developing a data platform of all diseases and conditions, which it has made available to researchers. It therefore seeks to provide support for launching clinical studies and increase medical knowledge.

 

Patients are often admitted to intensive care units due to neurological causes, especially in the case of a coma. And patients who leave these units are at risk of developing neurological complications that may impact their cognitive and functional capacities. These various situations pose diagnostic, therapeutic and ethical problems for physicians. How can neurological damage following intensive care be predicted in the short, medium and long term in order to provide appropriate care? What will the neurological evolution of a coma patient involve, between brain death, a vegetative state and partial recovery of consciousness? An incorrect assessment of the prognosis could have tragic consequences.

In 2015, Professor Tarek Sharshar, a neurologist specialized in intensive care, saw a twofold need for training – on one hand, neurology training for intensivists, and on the other, intensive care training for neurologists. He proposed a tele-expertise system connecting the two communities. In 2017, this project gave rise to AiiNTENSE, a start-up incubated at IMT Starter, whose focus soon expanded. “We started out with our core area of expertise, neuro-intensive care. Then, drawing on support from other experts and learned societies, we shifted to developing decision support tools for all of the diseases and conditions encountered in intensive care units,” says Daniel Duhautbout, co-founder of AiiNTENSE. The start-up is developing a database of patient records which it analyzes with artificial intelligence algorithms.

AI to aid in diagnosis and prognosis

The start-up team is working on a prototype concerning post-cardiac arrest coma. Experts largely agree on methods for assessing the neurological prognosis for this condition. And yet, in 50% of the cases of this condition, physicians are not yet able to determine whether or not a patient will awake from the coma. “Providing a prognosis for a patient in a coma is extremely complex and many available variables are not taken into account, due to a lack of appropriate clinical studies and tools to make use of these variables,” explains Daniel Duhautbout. That’s where the start-up comes in.

In 2020, AiiNTENSE will launch its pilot prototype in five or six hospitals in France and abroad. This initial tool comprises, first and foremost, patient records, taken from the hospital’s information system, which contain all the relevant data for making medical decisions. This includes structured biomedical information and non-structured clinical data (hospitalization or exam reports). In order to make use of the latter, the start-up uses technology for the automated processing of natural language. This results in patient records with semantic, homogenized data, which take into account international standards for interoperability.

A use for each unit

The start-up is developing a program that will in time respond to intensivists’ immediate needs. It will provide a quick, comprehensive view of an individual patient’s situation. The tool will offer recommendations for therapeutic procedures or additional observations to help reach a diagnosis. Furthermore, it will guide the physician in order to assess how the patient’s state will evolve. The intensivist will still have access to an expert from AiiNTENSE’s tele-expertise network to discuss cases in which the medical knowledge implemented in the AiiNTENSE platform is not sufficiently advanced.

The start-up also indirectly responds to hospital management issues. Proposing accurate, timely diagnoses means limiting unnecessary exams, making for shorter hospital stays and, therefore, lower costs. In addition, the tool optimizes the traceability of analyses and medical decisions, a key medical-legal priority.

In the long term, the start-up seeks to develop a precision intensive care model. That means being able to provide increasingly reliable diagnoses and prognoses tailored for each patient. “For the time being, for example, it’s hard to determine what a patient’s cognitive state will be when they awaken from a coma. We need clinical studies to improve our knowledge,” says Daniel Duhautbout. The database and its analytical tools are therefore open to researchers who wish to improve our knowledge of conditions that require intensive care. The results of their studies will then be disseminated through AiiNTENSE’s integration platform.

Protecting data on a large scale

In order to provide a viable and sustainable solution, AiiNTENSE must meet GDPR requirements and protect personal health data. With this aim, the team is collaborating with researchers at IMT Atlantique and plans to use blockchain technology to protect data. Watermarking, a sort of invisible mark attached to data, also appears to be a promising approach. It would make it possible to track those who use the data and to identify who may have been involved in the event of a data leak to external servers. “We also take care to ensure the integrity of our algorithms so that they support physicians confronted with critical neurological patients in an ethical manner,” concludes Daniel Duhautbout.

 

supply chain

Meet your supply chain using virtual reality

Immersive technologies for industrial engineering and risk management? This is the focus of research to be carried out at the SIReN laboratory launched on 15 November, bringing together researchers from IMT Mines Albi (France) and Georgia Tech (USA). On the French side, Frédérick Benaben, an expert in collaborative networks, is already using virtual reality to develop a supply chain decision support and management tool for companies.  

 

In front of you, a beam of green light traces a path leading you straight to the finishing line. You’re continuing along this path when, suddenly, a floating red sphere comes flying right toward you! It makes you veer sharply from your path, taking you away from where you were headed. You now face a crucial question: how can you achieve your goal from this new point? This is not a revolutionary video game set to be released at the end of the year. It’s a decision support tool developed by researchers at IMT Mines Albi and Georgia Tech to facilitate the visualization of data produced by artificial intelligence.

Building on their collaboration begun in 2015, the two academic partners have continued their joint research since 15 November through SIReN[1], a new international associated laboratory jointly based in Albi and Atlanta. “At the laboratory, we’re carrying out research on how immersive technology can help us develop and manage response networks,” explains Frédérick Benaben, an IMT Mines Albi researcher who specializes in the field of collaborative networks and information systems. Such networks include supply chains and crisis management. The researchers’ expertise is based on an original vision of artificial intelligence, at the crossroads between industrial engineering and computer and data sciences, and on a partnership that is already as concrete as the work it seeks to carry out.

Making the abstract concrete with virtual reality

A supply chain is a dynamic system that must be as agile as possible. It evolves over time in response to opportunities (opening of a new market, tax reductions, etc.) or risks (weather events, closing of a border, etc.). Yet, understanding how these various events could impact the supply chain proves to be very complex. That’s where virtual reality comes in!

Read more on I’MTech: What is supply chain management?

Unlike “traditional” uses of virtual reality where the aim is to represent a copy of reality, as with digital twins for example, the researchers use virtual reality to get away from reality. “We can then project ourselves into a world where our physical reference points (such as up and down, distance, etc.) are conserved, but where we have the possibility of visualizing abstract concepts. Using spheres, we represent opportunities or risks, for example. Color effects can indicate how dangerous they are,” says the researcher.

In the virtual universe, the spatio-temporal reference points are defined by the set of performance indicators for a supply chain. Let’s consider a simplified case where there are only three indicators: cost, product quality and delivery time. The researchers therefore define a three-dimensional frame of reference in which the supply chain is situated. Like a mechanical force, each risk or opportunity that has an impact on the network will push or pull it in a certain direction. For example, flooding along a delivery route will push the supply chain down on the delivery time axis.
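
A minimal sketch of this simplified three-indicator frame of reference, in Python: the supply chain is a point in indicator space, and each risk or opportunity displaces it along the axes like a mechanical force. All numbers and event "forces" below are invented purely for illustration.

```python
import numpy as np

# Axes of the simplified frame of reference described above:
# [cost, product quality, delivery time]  (illustrative units)
state = np.array([100.0, 0.95, 48.0])   # current position of the supply chain

# Each event acts like a mechanical force pushing the state along the axes.
forces = {
    "flooding on a delivery route": np.array([+5.0, 0.0, +24.0]),  # extra cost, longer delays
    "new market opportunity":       np.array([-2.0, 0.0,  0.0]),   # economies of scale
}

for event, force in forces.items():
    state = state + force   # the supply chain drifts under the applied force
    print(f"{event}: cost={state[0]:.1f}, quality={state[1]:.2f}, delay={state[2]:.0f} h")
```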

Through virtual reality, a user can observe a supply chain, move around it, and examine the risks and opportunities related to it.

 

In reality, logistics networks have dozens of performance indicators and over a hundred risks and opportunities — and therefore mechanical forces — to represent at each instant, making them complex to visualize. Representing them is made possible through extensive work to identify and process data. To continue with the flooding example, data is identified such as the number of blocked routes, how delayed the trucks are, the percentage of damaged warehouses, implications for the products, etc. The researchers turn this information into a macroscopic force exerted on the performance indicators.

Read more on I’MTech: C2Net: supply chain logistics on cloud nine

Virtual reality therefore helps respond to a need for agility in supply chains in an interactive way. Once the users are immersed in their supply chain universe, they can follow its movement and also interact with the supply chain. The spheres representing risks and opportunities are closer or further from the user based on how likely they are to occur. Their color indicates how dangerous they are, making it possible to identify areas of action more effectively. “The user steers the course of their system towards a precise objective. Virtual reality allows them to identify the forces they must use to achieve their goal, much like a sailor uses winds and currents. If a risk materializes, they deviate from the path but may be able to use a force to correct the effect,” explains Frédérick Benaben.

This decision support tool could also help anticipate the effects of an event on the path and, if possible, avoid them. These are precisely the questions being explored through the SCAN research program (Agile, Digital Collaborative Systems), conducted through the SIReN laboratory and launched in September 2019 with Scalian, a company that specializes in digital transformation.

Virtuality and risk management

The supply chain is not the only area of research to benefit from virtual reality through the SIReN laboratory. In March 2019, researchers from IMT Mines Albi created the EGCERSIS[2] research program with support from the Occitanie region and in partnership with the companies Immersive Factory and Report One. The aim is to use immersive technologies to develop crisis management systems for sensitive industrial environments. In particular, they are building on an emerging need expressed by Immersive Factory, a company specialized in developing digital twins to provide safety training for staff at industrial sites. The company is seeking to expand its offerings by providing training for crisis situations. Employees may have already learned how to make sure a valve is closed after using it, but what should they do if it catches on fire? The four-year partnership will be based on demonstrating the strength of digital simulation to respond to this sort of issue.

To do so, the researchers will rely on the IOMEGA platform equipped with multi-screen interfaces displayed in the form of a cockpit, allowing interaction between software, artificial intelligence visualization etc. They will also take advantage of state-of-the-art equipment for immersion and virtual reality, giving users 360° freedom of movement, via the new IOMEGA VR platform launched on 15 November. On the other side of the Atlantic, a twin platform is being developed at Georgia Tech.

More generally, the two partner institutions will draw on their complementary expertise for the projects carried out through SIReN. They seek to increase the agility of collaborative networks but come at the problem from two different angles. The French team is developing technologies intended for the supply chain, while the American team has given rise to the concept of the physical internet, which aims to transport physical goods as efficiently as data is transmitted over the internet. Like the internet, logistics must become fluid, and immersive technologies have a key role to play in making this possible.

[1] Sentient Immersive Response Network

[2] Crisis Management Training in an Environment Representative of Sensitive Industrial Sites

digital transformation

No, employees are not afraid of digital transformation

This article was originally published (in French) on The Conversation. By Emmanuel Baudoin, Institut Mines-Télécom Business School.


The 2019 edition of the study (in French) “French employees in the age of digital transformation”, conducted by the HRM Digital Lab at Institut Mines-Télécom Business School, shows that French employees are not afraid of digital transformation, and some even hope to see their company take greater steps towards it.

This study was conducted on a representative sample of just over 1,000 French employees and was managed by Opinion Way with support from CFA-EVE and the Essonne ANDRH (French Association of Human Resources Directors). An additional study was carried out with 100 decision-makers in the private sector, located in the Paris region, in order to collect their perceptions of digital transformation in their companies.

A multi-faceted transformation

The first finding is that 90% of the employees report that digital technology has had an impact on their job. This figure is unquestionable. However, it covers a wide variety of situations: 48% of these employees consider that digital technology has had a great impact on their job, while 42% say that it has had only a moderate impact, if any. As an illustration of this variety, when asked how important digital technology was to their work, 11% said it represented the core part of their work, 35% said it represented a significant part, 32% said it was used only to support their work, 14% said it was little-used and 9% said they did not use digital technology at all in their work.

The interviews carried out with decision-makers from companies of all sizes and industries told the same story –  digital technology is here to stay. 98% of those interviewed said digital technology has had an impact on their company. 65% consider that it has had a strong impact, while 11% feel that it has had a minor impact, if any. They report that all company processes have been affected: customer experience, supply chain management, administrative process management, the way products and services are developed, the way work is organized, the approach to managing teams, and employee experience.

Employees have a relatively high digital maturity level

Far from being pessimistic about the impact digital technology has had on their job, 84% of the employees say that it is making their work more interesting or that it has not had a major impact on their work. When asked to choose a response that describes their relationship with digital technology, 43% chose a very positive response, 47% chose a neutral response, while 11% chose a negative response. Another significant finding was that 40% hope their companies will go further in their digital transformation while 13% do not.

This overall positive perception of digital technology contributes to employees’ digital maturity level. This level can be defined as an employee’s overall relationship with digital technology at work, their perceived level of proficiency with digital technology, and the extent to which they use it to carry out their work activities.

As far as perceived level of proficiency is concerned, 61% consider that they have a very high or high level of proficiency, whereas 6% say that they have a low or very low level. At the same time, they are aware of their potential for improvement. 62% of respondents felt that they wasted time every day due to a lack of proficiency in certain digital skills.

A final point of interest is that the results also confirm that a new type of employee has emerged, one who can be called a ‘self HR employee,’ as identified in the first edition of this study. These ‘self HR’ employees take advantage of digital technology for a wide variety of purposes, including to learn independently and develop their skills, which means that they engage in informal digital learning. They also implement strategies to enhance their employee brand or even sell their skills.

 

French employees are optimistic about the digital transformation! (Emmanuel Baudoin on Xerfi canal, 2019).


Emmanuel Baudoin, Associate Professor in HR, Institut Mines-Télécom Business School

The original version of this article (in French) was published on The Conversation under a Creative Commons license. Read the original article.

MADEin4

MADEin4: digital twinning and predictive maintenance for industry

The European MADEin4 project was launched in April 2019 for a three-year period. It aims to help semiconductor manufacturers and equipment suppliers play an active role in the continuous improvement of their equipment. How? By relying on new digital twinning and predictive maintenance technologies. Agnès Roussy and Valéria Borodin, research professors at Mines Saint-Étienne, a member of the MADEin4 project, explain the context that gave rise to this project and discuss the scientific hurdles to overcome.

 

What was the context for developing the MADEin4 project?

Agnès Roussy: The MADEin4 project (Metrology Advances for Digitized ECS Industry 4.0) is an ECSEL project (Electronic Components and Systems for European Leadership). Its aim is to support and bring together the semiconductor industry in Europe in the transition to digital technology.

What is the overall goal of this project?

Valéria Borodin: To increase production output without affecting reliability levels in the manufacturing of electronic devices, the quality of which must comply with the increasingly demanding requirements of the highly competitive semiconductors market.

And how are you going about this?

AR: In order to improve productivity and facilitate the integration of digital technology into the organization of manufacturing processes for semiconductor and equipment manufacturers, going beyond the state of the art, the project will rely on an Industry 4.0 approach. To do so, two complementary boosters will be leveraged in the development of a pilot line: a physical accelerator based on next-generation metrology and inspection equipment for the microelectronics industry; and a digital accelerator – the digital twin (see box) – integrating artificial intelligence technology to improve output and equipment performance prediction.

[box type=”shadow” align=”” class=”” width=””]

The technique of digital twinning is used by manufacturers to monitor the operational status of their equipment (manufacturing, metrology, inspection). Digital twins of physical equipment evolve over time using data collected by sensors measuring the health status of the equipment, in order to prevent or anticipate breakdowns.[/box]

What technological and scientific challenges do you face?

VB: The development of digital twins and simulation models for managing and coordinating a production unit at different levels of decision-making poses a number of challenges, in particular, in terms of consistency of digital technology and decision-making across all industrial activities. In this regard, the help and expertise of semiconductor manufacturers and equipment suppliers (manufacturing and metrology) play a pivotal role in confirming the usefulness and industrial feasibility of the solutions we propose as academics.

How are Mines Saint-Étienne researchers contributing to the project?

AR: One of the research areas, in which Mines Saint Étienne’s Manufacturing and Logistics Sciences department (SFL) is primarily active, focuses on microelectronic manufacturing. This involves advanced process control, quantitative management of operations in the manufacturing process, and decision support at different levels (operational, tactical and strategic). As part of the MADEin4 project, we seek to explore opportunities and identify the limitations of new digital technologies in the intensive use and analysis of the massive quantities of data collected by inspection and metrology equipment.

Who are your partners for this project, and which collaborations are important for your work?

VB: The MADEin4 project brings together the expertise of 42 industrial and academic partners from 11 countries. Our key industrial partners for this project are STMicroelectronics in Rousset and Crolles. This project, among others, allows us to continue the long-standing, successful academic collaboration between the Manufacturing and Logistics Sciences Department at Mines Saint Etienne’s Provence Microelectronics Center (CMP) and the ST sites of Rousset and Crolles, who we’ve worked with for over 15 years. Many equipment suppliers are also involved in this project, so we’ll have the opportunity to work with them more closely on the equipment. And likewise for the academic partners involved: this European project will help foster new opportunities for collaboration through PhD theses or future calls for projects.

What are the expected benefits?

AR: The expected benefits of the MADEin4 project closely reflect the scientific and strategic priorities of Mines Saint-Étienne and the Provence Microelectronics Center (CMP), which promote a number of important topics: the industry of the future (Industry 4.0) and artificial intelligence (AI). Through the MADEin4 project, we seek to provide process control solutions for semiconductor manufacturers, explore opportunities for applications of digital twinning technology, strengthen the partnership with semiconductor manufacturers, and increase international recognition for the CMP on topics related to microelectronic manufacturing.

What are the important steps coming up for the project?

VB: The MADEin4 project started just over six months ago. This initial phase is exciting because everything seems to be possible. As for Mines Saint Étienne, the industrial data soon to be provided by the different partners will allow us to compare our research to the realities of industry. By the end of the first year, the research findings will be publicized through articles in international journals and presentations to the scientific communities involved.

Find out more about the MADEin4 project

Guillaume Balarac

Guillaume Balarac, turbulence simulator

Turbulence is a mysterious phenomenon in fluid mechanics. Although it has been observed and studied for centuries, it still holds secrets that physicists and mathematicians strive to unlock. Guillaume Balarac is part of this research community. A researcher at Grenoble INP (at the LEGI Geophysical and Industrial Flows Laboratory), he uses and improves simulations to understand turbulent flows better. His research has given rise to innovations in the energy sector. The researcher, who has recently received the 2019 IMT-Académie des Sciences Young Scientist Award, discusses the scientific and industrial challenges involved in his field of research.

 

How would you define turbulent flows, which are your research specialty?

Guillaume Balarac: They are flows with an unpredictable nature. The weather is a good example for explaining this. We can’t predict the weather more than five days out, because the slightest disturbance at one moment can radically alter what occurs in the following hours or days. It’s the butterfly effect. Fluid flows in the atmosphere undergo significant fluctuations that limit our ability to predict them. This is typical of turbulent flows, unlike laminar flows, which are not subject to such fluctuations and whose state may be predicted more easily.

Apart from air mass movements in the atmosphere, where can turbulent flows be found?

GB: Most of the flows that we may encounter in nature are actually turbulent flows. The movement of oceans is described by turbulent flows, as is that of rivers. The movement of molten masses in the Sun generates a turbulent flow. This is also the case for certain biological flows in our bodies, like blood flow near the heart. Apart from nature, these flows are found in rocket propulsion, the motion  of wind turbines and that of hydraulic or gas turbines etc.

Why do you seek to better understand these flows?

GB: First of all, because we aren’t able to do so! It’s still a major scientific challenge. Turbulence is a rather unusual case – it has been observed for centuries. We’ve all seen a river or felt the wind. But the mathematical description of these phenomena still eludes us. The equations that govern these turbulent flows have been known for two centuries, and the underlying mechanics have been understood since ancient times. And yet, we aren’t able to solve these equations and we’re ill-equipped to model and understand these events.

You say that researchers can’t solve the equations that govern turbulent flows. Yet, some weather forecasts for several days out are accurate…

GB: The iconic equation that governs turbulent flows is the Navier-Stokes equation. That’s the one that has been known since the 19th century. No one is able to find a solution with a pencil and paper. Finding a unique, exact solution to this equation is even one of the seven millennium problems established by the Clay Mathematics Institute.  As such, the person who finds the solution will be awarded $1 million. That gives you an idea about the magnitude of the challenge. To get around our inability to find this solution, we either try to approach it using computers, as is the case for weather forecasts  — with varying degrees of accuracy — or we try to observe it. And finding a link between observation and equation is no easy task either!
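
For reference, the incompressible form of the Navier-Stokes equation mentioned here can be written as

$$\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\,\mathbf{u} = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f}, \qquad \nabla \cdot \mathbf{u} = 0,$$

where $\mathbf{u}$ is the velocity field, $p$ the pressure, $\rho$ the density, $\nu$ the kinematic viscosity and $\mathbf{f}$ any external force per unit mass. The Clay Millennium Problem concerns proving the existence and smoothness of solutions to this system.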

Beyond this challenge, what can a better understanding of turbulent flows help accomplish?

GB: There is a wide range of applications which require an understanding of these flows and the equations that govern them. Our ability to produce energy relies in part on fluid mechanics, for example. Nuclear power plants function with water and steam systems. Hydroelectric turbines work with water flows, as do water current turbines. For wind turbines, it’s air flows. And these are only examples from the energy sector.

You use high-resolution simulation to understand what happens at the fundamental level in a turbulent flow. How does that work?

GB: One of the characteristics of turbulent flows is eddies. The more turbulent the flow, the more eddies of varying sizes it has. The principle of high-resolution simulation is to define billions of points in the space in which the flow is produced, and to calculate the fluid velocity at each of these points. This is called a mesh, and it must be fine enough to describe the smallest eddy in the flow. These simulations use the most powerful supercomputers in France and Europe. And even with all that computing power, we can’t simulate realistic situations – only academic flows in idealized conditions. These high-resolution simulations allow us to observe and better understand the dynamics of turbulence in canonical configurations.


Simulation of turbulent flows on a marine turbine.

Along with using these simulation tools, you work on improving them. Are the two related?

GB: They are two complementary approaches. The idea for that portion of my research is to accept that we don’t have the computing power to simulate the Navier-Stokes equation in realistic configurations. So the question I ask myself is: how can this equation be modified so that it becomes possible to solve with our current computers, while ensuring that the prediction is still reliable? The approach is to solve the big eddies first. And since we don’t have the power to make a mesh fine enough for the small eddies, we look for physical terms, mathematical expressions, which replace the influence of the small eddies on the big ones. That means that we don’t have the small eddies in this modeling, but their overall contribution to flow dynamics is taken into account. This helps us improve simulation tools by making them able to address flows in realistic conditions.
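
This strategy of resolving the large eddies and modeling the influence of the small ones is known as large-eddy simulation (LES). As one standard example of the kind of "physical term" added to the equations (a textbook model, not necessarily the specific one used by the researcher), the Smagorinsky model represents the unresolved eddies through an eddy viscosity

$$\nu_t = (C_s\,\Delta)^2\,|\bar{S}|, \qquad |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\,\bar{S}_{ij}},$$

where $\Delta$ is the mesh size, $\bar{S}_{ij}$ the strain-rate tensor of the resolved flow and $C_s$ a model constant. The added viscosity drains energy from the resolved scales in the way the missing small eddies would.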

Are these digital tools you’re developing used solely by researchers?

GB: I seek to carry out research that is both fundamental and application-oriented. For example, we worked with Hydroquest on the performance of water current turbines for generating electricity. The simulations we carried out made it possible to assess the performance loss due to the support structures, which do not contribute to capturing the energy from the flow. Our research led to patents for new designs, with a 50% increase in yield.

More generally, do energy industry players realize how important it is to understand turbulent flows in order to make their infrastructures more efficient?

GB: Of course, and we have a number of partners who illustrate industrial interest in our research. For example, we’ve adopted the same approach to improve the design of floating wind turbines. We’re also working with General Electric on hydroelectric dam turbines. These hydraulic turbines are increasingly operated far from their optimal operating point, in order to mitigate the intermittence of renewable solar or wind energy. In these systems, hydrodynamic instability develops, which has a significant effect on the machines’ performance. So we’re trying to optimize the operation of these turbines to limit yield loss.

What scientific challenges do you currently face as you continue your efforts to improve simulations and our understanding of turbulent flows?

GB: At the technical level, we’re trying to improve our simulation codes to take full advantage of advances in supercomputers. We’re also trying to improve our numerical methods and models to increase our predictive capacity. For example, we’re now trying to integrate learning tools to avoid simulating small eddies and save computing time. I’ve started working with Ronan Fablet, a researcher at IMT Atlantique, on precisely this topic. Then, there’s the huge challenge of ensuring the reliability of the simulations carried out. As it stands now, if you give a simulation code to three engineers, you’ll end up with different models. This is due to the fact that the tools aren’t objective, and a lot depends on the individuals using them. So we’re working on mesh and simulation criteria that are objective. This should eventually make it possible for industry players and researchers to work with the same foundations, and better understand one another when discussing turbulent flows.

 

Véronique Bellon-Maurel

Véronique Bellon-Maurel: from infrared spectroscopy to digital agriculture

Measuring and quantifying have informed Véronique Bellon-Maurel’s entire scientific career. A pioneer in near infrared spectroscopy, the researcher’s work has ranged from analyzing fruit to digital agriculture. Over the course of her fundamental research, Véronique Bellon-Maurel has contributed to the optimization of many industrial processes. She is now the Director of #DigitAg, a multi-partner Convergence Lab, and is the winner of the 2019 IMT-Académie des Sciences Grand Prix. In this wide-ranging interview, she retraces the major steps of her career and discusses her seminal work.   

 

You began your research career by working with fruit. What did this research involve?

Véronique Bellon-Maurel: My thesis dealt with the issue of measuring the taste of fruit in sorting facilities. I had to meet industrial requirements, particularly in terms of speed: three pieces of fruit per second! The best approach was to use near infrared spectroscopy to measure the sugar level, which is indicative of taste. But when I was beginning my thesis in the late 1980s, it took spectrometers one to two minutes to scan a piece of fruit. I suggested working with very near infrared, meaning a different type of radiation than the infrared that had been used up to then, which made it possible to use new types of detectors that were very fast and inexpensive.

So that’s when you started working on near infrared spectroscopy (NIRS), which went on to become your specialization. Could you tell us what’s behind this technique with such a complex name?

VBM: Near infrared spectroscopy (NIRS) is a method for analyzing materials. It provides a simple way to obtain information about the chemical and physical characteristics of an object by illuminating it with infrared light, which will pass through the object and become charged with information. For example, when you place your finger on your phone’s flashlight, you’ll see a red light shining through it. This light is red because the hemoglobin has absorbed all the other colors of the original light. So this gives you information about the material the light has passed through. NIRS is the same thing, except that we use particular radiation with wavelengths that are located just beyond the visible spectrum.
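
The absorption effect described here is classically quantified by the Beer-Lambert law, which underpins this kind of spectroscopy:

$$A(\lambda) = \log_{10}\frac{I_0(\lambda)}{I(\lambda)} = \varepsilon(\lambda)\,\ell\,c,$$

where $A$ is the absorbance at wavelength $\lambda$, $I_0$ and $I$ the incident and transmitted intensities, $\varepsilon$ the absorptivity of the compound, $\ell$ the path length through the material and $c$ the concentration. Measuring absorbance at near infrared wavelengths therefore gives access to concentrations such as a fruit's sugar level.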

Out of all the methods for analyzing materials, what makes NIRS unique?

VBM: Near infrared waves pass through materials easily – much more easily than “traditional” infrared waves, which are called “mid-infrared.” They are produced by simple sources such as sunlight or halogen lamps. The technique is therefore readily available and is not harmful: it is used on babies’ skulls to assess the oxygen saturation of their brains! But when I was starting my career, there were major drawbacks to NIRS. The signal we obtain is extremely cluttered because it contains information about both the physical and chemical components of the object.

And what is hiding behind this “cluttered signal”?

VBM: In concrete terms, you obtain hill-shaped curves and the shape of these curves depends on both the object’s chemical composition and its physical characteristics. You’ll get a huge hill that is characteristic of water. And the signature peak of sugar, which allows you to calculate a fruit’s sugar level, is hidden behind it. That’s the chemical component of the spectrum obtained. But the size of the hills also depends on the physical characteristics of your material, such as the size of the particles or cells that make it up, physical interfaces — cell walls, corpuscles — the presence of air etc. Extracting solely the information we’re interested in is a real challenge!

Near infrared spectrums of apples.

 

One of your earliest significant findings for NIRS was precisely that – separating the physical component from the chemical component on a spectrum. How did you do that?

VBM: The main issue at the beginning was to get away from the physical component, which can be quite a nuisance. For example, light passes through water, but not the foam in the water, which we see as white, even though they are the same molecules! Depending on whether or not the light passes through foam, the observation — and therefore the spectrum — will change completely. Fabien Chauchard was the first PhD student with whom I worked on this problem. To better understand this optical phenomenon, which is called diffusion, he went to the Lund Laser Center in Sweden. They have highly-specialized cameras: time-of-flight cameras, which operate at a very high speed and are able to capture photos “in flight.” We send photons onto a fruit in an extremely short period of time and we recover the photons as they come out since not all of them come out at the same time. In our experiments, if we place a transmitter and a receiver on a fruit spaced 6 millimeters apart, when they came out, certain photons had travelled over 20 centimeters! They had been reflected, refracted, diffracted etc. inside the fruit. They hadn’t travelled in a straight line at all. This gave rise to an innovation, spatially resolved spectroscopy (SRS) developed by the Indatech company that Fabien Chauchard started after completing his PhD.

We looked for other optical arrangements for separating the “chemical” component from the “physical” component. Another PhD student, Alexia Gobrecht, with whom I worked on soil, came up with the idea of using polarized near infrared light. If the photons penetrate the soil, they lose their polarization. Those that have only travelled on the surface conserve it. By differentiating between the two, we recover spectrums that only depend on the chemical component. This research on separating chemical and physical components was continued in the laboratory, even after I stopped working on it. Today, my colleagues are very good at identifying aspects that have to do with the physical component of the spectrum and those that have to do with to the chemical component. And it turns out that this physical component is useful! And to think that twenty years ago, our main focus was to get rid of it.

After this research, you transitioned from studying fruit to studying waste. Why did you change your area of application?

VBM: I’d been working with the company Pellenc SA on sorting fruit since around 1995, and then on detectors for grape ripeness. Over time, Pellenc transitioned to waste characterization for the purpose of sorting, based on the infrared knowledge developed through sorting fruit. They therefore called on us, with a new speed requirement, but this one was much tougher. A belt conveyor moves at a speed of several meters per second. In reality, the areas of application for my research were already varied. In 1994, while I was still working on fruit with Pellenc, I was also carrying out projects for biodegradable plastics. NIRS made it possible to provide quality measurements for a wide range of industrial processes. I was Ms. “Infrared sensors!”

 

“I was Ms. ‘Infrared sensors’!”
– Véronique Bellon-Maurel

 

Your work on plastics was among the first in the scientific community concerning biodegradability. What were your contributions in this area?

VBM: 1990 was the very beginning of biodegradable plastics. Our question was determining whether we could measure a plastic’s biodegradability in order to say for sure, “this plastic is truly biodegradable.” And to do so as quickly as possible, so why not use NIRS? But first, we had to define the notion of biodegradability, with a laboratory test. For 40 days, the plastics were put in reactors in contact with microorganisms, and we measured their degradation. We were also trying to determine whether this test was representative of biodegradability in real conditions, in the soil. We buried hundreds of samples in different plots of land in various regions and we dug them up every six months to compare real biodegradation and biodegradation in the laboratory. We wanted to find out if the NIRS measurement was able to achieve the same result, which was estimating the degradation kinetics of a biodegradable plastic – and it worked. Ultimately, this benchmark research on the biodegradability of plastics contributed to the industrial production and deployment of the biodegradable plastics that are now found in supermarkets.

For that research, was your focus still on NIRS?

VBM: The crux of my research at that time was the rapid, non-destructive characterization — physical or chemical — of products. NIRS was a good tool for this. We used it again after that on dehydrated household waste in order to assess the anaerobic digestion potential of waste. With the laboratory of environmental biotechnology in Narbonne, and IMT Mines Alès, we developed a “flash” method to quickly determine the quantity of bio-methane that waste can release, using NIRS. This research was subsequently transferred to the Ondalys company, created by Sylvie Roussel, one of my former PhD students. My colleague Jean-Michel Roger is still working with them to do the same thing with raw waste, which is more difficult.

So you gradually moved from the agri-food industry to environmental issues?

VBM: I did, but it wasn’t just a matter of switching topics, it also involved a higher degree of complexity. In fruit, composition is restricted by genetics – each component can vary within a known range. With waste, that isn’t the case! This made environmental metrology more interesting than metrology for the food industry. And my work became even more complex when I started working on the topic of soil. I wondered whether it would be possible to easily measure the carbon content in soil. This took me to Australia, to a specialized laboratory at the University of Sydney. To my mind, all this different research is based on the same philosophy: if you want to improve something, you have to measure it!

So you no longer worked with NIRS after that time? 

VBM: A little less, since I moved from sensors to assessment. But even that was a sort of continuation: when sensors were no longer enough, how could we make measurements? We had to develop assessment methods. It’s all very well to measure the biodegradability of a plastic, but is that enough to determine whether that biodegradable plastic has a low environmental impact? No, it isn’t – the entire system must be analyzed. I started working on life-cycle analysis (LCA) in Australia after realizing that LCA methods were not suited to agriculture: they did not account for water or the use of space. Based on this observation, we improved the LCA framework to develop the concept of a regional LCA, which didn’t exist at the time, allowing us to make an environmental assessment of a region and compare scenarios for how it might evolve. What I found really interesting in this work was determining how to use data from information systems and sensors to build as reliable and reproducible a model as possible. I wanted the assessments to be as accurate as possible. This is what led me to my current field of research – digital agriculture.

Read more on I’MTech: The many layers of our environmental impact

In 2017 you founded #DigitAg, an institute dedicated to this topic. What research is carried out there?

VBM: The “Agriculture – Innovation 2025” report submitted to the French government in 2015 expressed a need to structure French research on digital agriculture. We seized the opportunity offered by the Convergence Labs program by founding #DigitAg, the Digital Agriculture Convergence Lab. It’s one of ten institutes funded by the Investments in the Future program, all created to carry out interdisciplinary research on a major emerging issue. At #DigitAg, we draw on engineering sciences, digital technology, biology, agronomy, economics, social sciences, humanities, management, etc. Our aim is to establish the knowledge bases needed for digital agriculture to develop in a harmonious way. The challenge is not only to develop technologies but also to anticipate how they will be used and how such uses will transform agriculture, so as to encourage ethical uses and prevent misuse. To this end, I’ve also set up a living lab, Occitanum – for Occitanie Digital Agroecology – set to start in mid-2020. The lab will bring together stakeholders to assess the use value of different technologies and understand innovation processes. It’s a different way of carrying out research and innovation, one that incorporates the human dimension.


Aerosol therapy: An ex vivo model of lungs

Jérémie Pourchez, a researcher in Health Engineering at Mines Saint-Étienne, and his colleagues at the Saint-Étienne University Hospital have developed an ex vivo model of lungs to help improve medical aerosol therapy devices. One advantage of this technology is that scientists can study inhalation therapy while limiting the need for animal testing.

 

This article is part of our dossier “When engineering helps improve healthcare”.

Imagine a laboratory where, sitting on top of the workbench, is a model of a human head made using 3D printing. This anatomically correct replica is connected to a pipe that mimics the trachea, and then to a vacuum chamber. Inside this chamber is a pair of lungs. Periodically, a pump simulates the natural movement of the rib cage to induce breathing, and the lungs inflate and deflate. This physical model of the respiratory system might sound like science fiction, but it is actually located in the Health Engineering Center at Mines Saint-Étienne.

Developed as part of the ANR AMADEUS project by a team led by Jérémie Pourchez, a specialist in inhaled particles and aerosol therapy, this ex vivo model simulates breathing and the illnesses associated with it, such as asthma or fibrosis. For the past two years, the team have been working on developing and validating the model, and have already created both an adult-sized and a child-sized version of the device. By offering an alternative, less expensive solution to animal testing, the device has allowed the team to study drug delivery using aerosol therapy. The ex vivo pulmonary model is also far more ethical, as it only uses organs that would otherwise be thrown away by slaughterhouses.

Does this ex vivo model work the same as real lungs?

One of the main objectives for the researchers is to prove that these ex vivo models can predict what happens inside the human body accurately. To do this, they must demonstrate three things.

First, the researchers need to study the respiratory physiology of the models. By using the same indicators that are used on real patients, such as those used to measure the respiratory parameters of an asthmatic, the team demonstrate that the model’s parameters are the same as those of a human being. After this, the team must analyze the model’s ventilation, for example by making sure that there are no obstacles in the bronchi. To do this, the ex vivo lungs inhale a radioactive isotope of krypton, which is used as a tracer to visualize the air flow throughout the model. Finally, the team study aerosol deposition in the respiratory tract, which involves observing where inhaled particles settle when using a nebulizer or spray. Again, this is done using radioactive materials and nuclear medical imaging.

These results are then compared with the results you would expect to see in humans, as defined in the scientific literature. If the parameters match, the model is validated. However, the pig lungs used in the model behave like the lungs of a healthy individual. This poses a problem for the team, as the aim of their research is to develop a model that can mimic illnesses, so that they can test the effectiveness of aerosol therapy treatments.
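As a schematic illustration of this validation logic (not the team’s actual protocol, which relies on clinical instrumentation and nuclear imaging), one can picture comparing the parameters measured on the ex vivo model with reference ranges taken from the literature. The parameters and values in the sketch below are hypothetical placeholders.

```python
# Illustrative sketch only: checking whether ventilation parameters measured on
# the ex vivo model fall within human reference ranges reported in the
# literature. All figures below are hypothetical placeholders.

# (lower bound, upper bound) for a healthy resting adult -- placeholder values.
reference_ranges = {
    "respiratory_rate_per_min": (12.0, 20.0),
    "tidal_volume_ml": (400.0, 600.0),
    "compliance_ml_per_cmH2O": (50.0, 170.0),
}

# Hypothetical measurements taken on the ex vivo model.
model_measurements = {
    "respiratory_rate_per_min": 15.0,
    "tidal_volume_ml": 520.0,
    "compliance_ml_per_cmH2O": 95.0,
}

validated = True
for parameter, (low, high) in reference_ranges.items():
    value = model_measurements[parameter]
    ok = low <= value <= high
    validated = validated and ok
    print(f"{parameter}: {value} -> {'within range' if ok else 'out of range'}")

print("Model consistent with human reference values" if validated
      else "Model needs adjustment")
```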

From a healthy model to a diseased one

There are various pathologies that can affect someone’s breathing, and air pollution tends to increase the likelihood of contracting one of them. For example, fibrosis damages the elastic fibers that help our lungs to expand when we breathe in. This makes the lung more rigid and breathing more difficult. In order to mimic this in the ex vivo lungs, the organs are heat treated with steam to stiffen the surface of the tissue. This then changes their elasticity, and recreates the mechanical behavior of human lungs with fibrosis.

Other illnesses such as cystic fibrosis occur, amongst other things, due to the lungs secreting substances that make it difficult for air to travel through the bronchi. To recreate this, the researchers insert artificial secretions made from thickening agents, which allows the model lung to mimic these breathing difficulties.

Future versions of the model

Imitating these illnesses is an important first step. But in order to study how aerosol therapy treatments work, the researchers also need to observe how the drugs diffuse into the bloodstream. This diffusion can be either an advantage or a disadvantage. Since the lungs are also an entry point through which a drug can spread through the body, the research team installed one last tool: a pump to simulate a heartbeat. “The pump allows fluid to circulate around the model in the same way that blood circulates in lungs,” explains Jérémie Pourchez. “During a test, we can then measure the amount of inhaled drug that will diffuse into the bloodstream. We are currently validating this improved model.”

One challenge the team is now tackling is the development of new systemic inhalation treatments. These are designed to treat illnesses in other organs, but are inhaled and use the lungs as an entry point into the body. “A few years ago, an insulin spray was put on the market,” says Jérémie Pourchez. Insulin, which is used to treat diabetes, normally has to be injected regularly. “This would be a relief for patients suffering from the disease, as it would replace these injections with inhalations. But the drug also requires an extremely precise dose of the active ingredient, and obtaining this dose of insulin using an aerosol remains a scientific and technical challenge.”

As well as being easier to use, inhaled treatments have the advantage of delivering the active ingredient into the bloodstream quickly. “That’s why people who are trying to quit smoking find that electronic cigarettes work better than patches at satisfying nicotine cravings,” says the researcher. The dose of nicotine inhaled is deposited in the lungs and diffuses directly into the blood. “It also led me to study electronic cigarette-type devices and evaluate whether they can be used to deliver different drugs by aerosol,” explains Jérémie Pourchez.

By modifying certain technical aspects, these devices could become aerosol therapy tools and be used in the future to treat certain lung diseases. This is one of the Saint-Étienne team’s ongoing projects. However, substantial research is still needed on the device’s potential toxicity and on its effectiveness with the different drugs being tested. Rather than just being a tool for quitting smoking, the electronic cigarette could one day become the newest technology in medical aerosol treatment.

Finally, this model will also provide an answer to another important issue surrounding lung transplants. When an organ is donated, it is up to the biomedicine agency to decide whether the donation can be given to the transplant teams. But this urgent decision may be based on factors that are sometimes not sufficient to assess the real quality of the organ. “For example, to assess the quality of a donor’s lung,” says Jérémie Pourchez, “we refer to important data such as smoking, age, or the donor’s known illnesses. Our experimental device, which makes lungs breathe ex vivo, can therefore be used as a tool to assess more accurately the quality of the lungs when they are received by the transplant team. Then, based on the tests performed on the organ that is going to be transplanted, we can determine whether it is safe to perform the operation.”