immune system

Understanding the resilience of the immune system through mathematical modeling

Gaining insight into how the immune system works using mathematics is the ultimate goal of the research carried out by IMT Atlantique researcher Dominique Pastor, along with his team. Although the study involves a great degree of abstraction, the scientists never lose sight of practical applications, and not only in relation to biology.

 

In many industries, the notion of “resilience” is a key issue, even though there is no clear consensus on the definition of the term. Derived from a Latin verb meaning “to rebound,” the term does not refer to quite the same thing as resistance or robustness. A resilient system is not unaffected by external events, but it is able to fulfill its function, even in a degraded mode, in a hostile environment. In computer science, for example, resilience means the ability to provide an acceptable level of service in the event of a failure.

This capacity is also found in the human body — and, in general, in all living beings. For example, when you have a cold, your abilities may be reduced, but in most cases you can keep living more or less normally.

This phenomenon is regularly observed in all biological systems, but remains quite complex. It is still difficult to understand how resilience works and the set of behaviors to which it gives rise.

A special case of functional redundancy: degeneracy

It was through discussions with Véronique Thomas-Vaslin, a biologist at Sorbonne University, that Dominique Pastor, a telecommunications researcher at IMT Atlantique, became particularly aware of this property of biological systems. Working with Roger Waldeck, who is also a researcher at IMT Atlantique, and PhD student Erwan Beurier, he carried out research to mathematically model this resilience, in order to demonstrate its basic principles and better understand how it works.

To do so, they drew on publications by other scientists, including American biologist Gerald Edelman (Nobel prize winner for medicine in 1972), underscoring another property of living organisms: degeneracy. (This term is usually translated in French as dégénérescence, which means ‘degeneration,’ but this word is misleading). “Degeneracy” refers to the ability of two structurally different elements to perform the same function. It is therefore a kind of functional redundancy, which also implies different structures. This characteristic can be found at multiple levels in living beings.

For example, amino acids, which are the building blocks of essential proteins, are produced from “messages” included in portions of DNA. More specifically, each message is called a “codon”: a sequence of three molecules, known as nucleotides. However, there are 4 possible nucleotides, meaning there are 64 possible combinations, for only 22 amino acids. That means that some codons correspond to the same amino acid: a perfect example of degeneracy.
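
This degeneracy is easy to exhibit in a few lines of code. The sketch below uses a small excerpt of the standard genetic code (not all 64 codons) and groups codons by the amino acid they encode:

```python
from collections import defaultdict

# A small excerpt of the standard genetic code (DNA codons -> amino acids):
# structurally different codons encoding the same amino acid.
CODON_TABLE = {
    "TTA": "Leu", "TTG": "Leu", "CTT": "Leu",
    "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",
    "ATG": "Met",  # methionine is the exception: a single codon
}

synonyms = defaultdict(list)
for codon, amino_acid in CODON_TABLE.items():
    synonyms[amino_acid].append(codon)

for amino_acid, codons in synonyms.items():
    print(f"{amino_acid}: {len(codons)} codon(s) -> {codons}")
# Leu: 6 codon(s) -> six different structures, one identical function: degeneracy.
```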

“My hunch is that degeneracy is central to any resilient system,” explains Dominique Pastor. “But it’s just a hunch. The aim of our research is to formalize and test this idea based on mathematical results. This can be referred to as the mathematics of resilience.”

To this end, he relied on the work of French mathematician Andrée Ehresmann, Emeritus Professor at the University of Picardie Jules Verne, who established a mathematical model of degeneracy, known as the “Multiplicity Principle,” with Jean-Paul Vanbremeersch, an Amiens-based physician who specializes in gerontology.

Recreating resilience in the form of mathematical modeling

Dominique Pastor and his team therefore started out with biologists’ concrete observations of the human body, and then focused on theoretical study. Their goal was to develop a mathematical model that could imitate both the degeneracy and resilience of the immune system in order to “establish a link between the notion of resilience, this Multiplicity Principle, and statistics.” Once this link was established, it would then be possible to study it and gain insight into how the systems work in real life.

The researchers therefore examined the performance of two categories of statistical tests for a given problem, namely detecting a phenomenon. The first category, Neyman-Pearson testing, is optimal for determining whether or not an event has occurred. The second category, RDT (Random Distortion Testing), is also optimal, but for a different task: detecting whether an event has moved away from an initial model.

The two types of procedures were not created with the same objective. However, the researchers successfully demonstrated that RDT testing could also be used, in a “degenerate” manner, to detect a phenomenon, with performance comparable to that of Neyman-Pearson testing. That means that in the theoretical case of an infinite amount of data, they can detect the presence or absence of a phenomenon with the same level of precision. The two categories therefore perform the same function, although they are structurally different. “We therefore made two sub-systems in line with the Multiplicity Principle,” concludes the IMT Atlantique researcher.
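
The flavor of this result can be conveyed with a toy sketch of our own (a deliberate simplification, not the researchers’ actual formalism): detecting a constant signal in Gaussian noise. The Neyman-Pearson detector thresholds the sample mean using the exact signal model, while the distortion-style detector merely checks whether the observations have drifted from a nominal model. Given enough data, the two structurally different tests flag the same events:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, sigma = 10_000, 0.2, 1.0   # sample size, signal amplitude, noise level

def neyman_pearson(y, threshold=0.1):
    # The likelihood-ratio test for H0: mean 0 vs H1: mean theta > 0 in Gaussian
    # noise reduces to thresholding the sample mean. Optimal, but it needs the
    # signal model to be exact.
    return y.mean() > threshold

def distortion_test(y, xi0=0.0, tolerance=0.1):
    # RDT-flavored detector: flag any deviation of the observations from the
    # nominal model xi0 beyond a tolerance, with no model of the signal itself.
    return abs(y.mean() - xi0) > tolerance

for label, signal in [("noise only", 0.0), ("signal present", theta)]:
    y = signal + sigma * rng.standard_normal(n)
    print(label, "| NP:", neyman_pearson(y), "| distortion:", distortion_test(y))
# Both detectors reject "noise only" and flag "signal present":
# same function, different structure.
```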

What’s more, the nature of RDT testing gives it an advantage over Neyman-Pearson testing since the latter only works optimally when real events follow a certain mathematical model. If this is not the case — as so often happens in nature — it is more likely to be incorrect. RDT testing can adapt to a variable environment, since it is designed to detect such variations, and is therefore more robust. Combining the two types of testing can result in a system with the inherent characteristics of resilience, meaning the ability to function in a variety of situations.

From biology to cybersecurity

These findings are not intended to remain confined to a theoretical universe. “We don’t work with theory for the sake of theory,” says Dominique Pastor. “We never forget the practical side: we continually seek to apply our findings.” The goal is therefore to return to the real world, and not only in relation to biology. In this respect, the approach is similar to that used in research on neural networks – initially focused on understanding how the human brain works, it ultimately resulted in systems used in the field of computer science.

“The difference is that neural networks are like black boxes: we don’t know how they make their decisions,” explains the researcher. “Our mathematical approach, on the other hand, provides an understanding of the principles underlying the workings of another black box: the immune system.” This understanding is also supported by collaboration with David Spivak, a mathematician at MIT (United States), again, in the field of mathematical modeling of biological systems.

The first application Dominique Pastor is working on falls within the realm of cybersecurity. The idea is to imitate the resilient behavior of an immune system for protective purposes. For example, many industrial sites are equipped with sensors to monitor various factors (light, opening and closing of doors, filling of a container, etc.). To protect these devices, they could be combined with a system to detect external attacks. This could be made up of a network, which would receive data recorded by the sensors and run a series of tests to determine whether there has been an incident. Since these tests could be subject to attacks themselves, they would have to be resilient in order to be effective – hence the importance of using different types of tests, in keeping with the previously obtained results.

For now it is still too early to actually apply these theories. It remains to be proven that the Multiplicity Principle is a sufficient guarantee of resilience, given that this notion still lacks a mathematical definition. Providing one is among Dominique Pastor’s ambitions. The researcher admits that it is still his “pipe dream” and says, “My ultimate goal would still be to go back to biology. If our research could help biologists better understand and model the immune system, in order to develop better care strategies, that would be wonderful.”

Unéole

Unéole on our roofs

We know how to use wind to produce electricity, but large three-bladed turbines do not have their place in urban environments. The start-up Unéole has therefore developed a wind turbine that is suitable for cities, as well as other environments. It also offers a customized assessment of the most efficient energy mix. Clovis Marchetti, a research engineer at Unéole, explains the innovation developed by the start-up, which was incubated at IMT Lille Douai.

 

The idea for the start-up Unéole came from a trip to French Polynesia, islands that are cut off from the continent, meaning that they must be self-sufficient in terms of energy. Driven by a desire to develop renewable energies, Quentin Dubrulle focused on the fact that such energy sources are scarce in urban areas. Wind, in particular, is an untapped energy source in cities. “Traditional, three-bladed wind turbines are not suitable,” says Clovis Marchetti, a research engineer at Unéole. “They’re too big, make too much noise and are unable to capture the swirling winds created by the corridors between buildings.”

Supported by engineers and researchers, Quentin Dubrulle put together a team to study the subject. Then, in July 2014 he founded Unéole, which was incubated at IMT Lille Douai. Today the start-up offers an urban wind turbine measuring just under 4 meters high and 2 meters wide that can produce up to 1,500 kWh per year. It is easy to install on flat roofs and designed to be used in cities, since it captures the swirling winds found in urban environments.
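
As an order-of-magnitude check (our own back-of-the-envelope sketch, not Unéole’s sizing method), the standard wind-power formula P = ½ρAv³Cp, with an assumed power coefficient for a drag-based rotor, suggests that the announced 1,500 kWh per year corresponds to a consistently windy roof:

```python
RHO = 1.225                 # air density, kg/m^3
HEIGHT, WIDTH = 4.0, 2.0    # approximate turbine dimensions from the article, m
AREA = HEIGHT * WIDTH       # swept area of a vertical-axis rotor ~ height x width
CP = 0.15                   # assumed power coefficient for a drag rotor (our guess)

def annual_energy_kwh(mean_wind_speed_ms: float) -> float:
    """Crude estimate: applies P = 1/2 * rho * A * v^3 * Cp at a constant mean
    wind speed. A real study, like Unéole's site modeling, integrates over the
    full wind-speed distribution instead."""
    power_w = 0.5 * RHO * AREA * mean_wind_speed_ms ** 3 * CP
    return power_w * 8760 / 1000  # hours per year, W -> kWh

for v in (4.0, 5.0, 6.0):
    print(f"{v} m/s -> ~{annual_energy_kwh(v):.0f} kWh/year")
# ~410, ~800, ~1390 kWh/year: the 1,500 kWh figure implies a well-exposed site.
```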

Producing energy with a low carbon footprint is a core priority for the project. This can be seen in the choice of materials and method of production. The parts are cut by laser, a technology that is well-understood and widely used by many industries around the world. So if these wind turbines have to be installed on another continent, the parts can be cut and assembled on location.

Another important aspect is the use of eco-friendly materials. “This is usually a second step,” says Clovis Marchetti, “but it was a priority for Unéole from the very beginning.” The entire skeleton of the turbine is built with recyclable materials. “We use aluminum and recycled and recyclable stainless steel,” he says. “For the electronics, it’s obviously a little harder.”

Portrait of an urban wind turbine

The wind turbine has a cylindrical shape and is built in three similar levels with slightly curved blades that are able to trap the wind. These blades are offset by 60° from one level to the next. “This improves performance since production is more uniform throughout the turbine’s rotation,” says Clovis Marchetti. Another advantage of this architecture is that it makes the turbine easy to start: no matter what direction the wind comes from, a part of the wind turbine will be sensitive to it, making it possible to induce movement.

 

Photograph of the urban wind turbine proposed by Unéole.

 

To understand how a wind turbine works, two concepts of aerodynamics are important: lift and drag. In the former, a pressure difference diverts the flow of air and therefore exerts a force. “It’s what makes planes fly for example,” explains Clovis Marchetti. In the latter, the wind blows on a surface and pushes it. “Our wind turbine works primarily with drag, but lift effects also come into play,” he adds. “Since the wind turbine is directly pushed by the wind, its rotational speed will always be roughly equal to the wind speed.”

And that plays a significant role in terms of the noise produced by the wind turbine. Traditional three-bladed turbines turn faster than the wind due to lift. They therefore slice through the wind and produce a swishing noise. “Drag doesn’t create this problem since the wind turbine vibrates very little and doesn’t make any noise,” he says.

An optimal energy mix

The urban wind turbine is not the only innovation proposed by Unéole. The central aim of this project is to combine potential renewable energies to find the optimal energy mix for a given location. As such, a considerable amount of modeling is required in order to analyze the winds on site. That means modeling a neighborhood by taking into consideration all the details that affect wind: topographical relief, buildings, vegetation etc. Once the data about the wind has been obtained from Météo France, the team studies how the wind will behave in a given situation on a case-by-case basis.

“Depending on relief and location, the energy capacity of the wind turbine can change dramatically,” says Clovis Marchetti. These wind studies allow them to create a map in order to identify locations that are best suited for promoting the turbine, and places where it will not work as well. “The goal is to determine the best way to use roofs to produce energy and optimize the energy mix, so we sometimes suggest that clients opt for photovoltaic energy,” he says.

“An important point is the complementary nature of photovoltaic energy and wind turbines,” says Clovis Marchetti. Wind turbines maintain production at night, and are also preferable for winter, whereas photovoltaics are better for summer. Combining the two technologies offers significant benefits at the energy level, for example, uniform production. “If we only install solar panels, we’ll have a peak of productivity at noon in the summer, but nothing at night,”  he explains. This peak of activity must therefore be stored, which is costly and still involves some loss of production. A more uniform production would therefore make it possible to produce energy on a more regular basis without having to store the energy produced.
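
The storage argument can be illustrated with a stylized sketch. The daily profiles below are invented for illustration (solar peaking at noon, wind stronger at night), and the peak-to-mean ratio serves as a crude proxy for how much buffering a production profile requires:

```python
import math

HOURS = range(24)
# Stylized, made-up daily production profiles (kW), purely illustrative.
solar = [2.0 * math.sin(math.pi * (h - 6) / 12) if 6 <= h <= 18 else 0.0
         for h in HOURS]                                    # peaks at noon
wind = [0.8 + 0.3 * math.cos(math.pi * h / 12) for h in HOURS]  # stronger at night

def peak_to_mean(profile):
    # The flatter the profile, the closer this ratio is to 1 and the less
    # surplus there is to store.
    return max(profile) / (sum(profile) / len(profile))

mix = [s + w for s, w in zip(solar, wind)]
print(f"solar alone: peak/mean = {peak_to_mean(solar):.1f}")  # ~3.2
print(f"solar + wind: peak/mean = {peak_to_mean(mix):.1f}")   # ~1.7
# The mix is flatter: less surplus to store at noon, no gap at night.
```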

To this end, Unéole is working on a project for an energy mix platform: a system that includes their urban wind turbines, supplemented with a photovoltaic roof. Blending the two technologies would make it possible to produce up to 50% more energy than photovoltaic panels installed alone.

A connected wind turbine

“We’re also working on making this wind turbine connected,” says Clovis Marchetti. This would provide two major benefits. First, the wind turbine could provide information directly about its production and working condition. This is important so that the owner can monitor the energy supply and ensure that it is working properly. “If the wind turbine communicates the fact that it is not turning even though it’s windy, we know right away that action is required,” he explains.

In addition, a connected wind turbine could predict its production capacity based on weather forecasts. “A key part of the smart city of tomorrow is the ability to manage consumption based on production,” he says. Today, weather forecasts are fairly reliable up to 36 hours in advance, so it would be possible to adjust our behavior. Imagine, for example, that strong winds were forecast for 3 pm. In this case, it would be better to wait until then to launch a simulation that requires a lot of energy.

plastics

A sorting algorithm to improve plastic recycling

Producing high-quality raw materials from waste is contingent on effective sorting. Plastics from waste electrical and electronic equipment (WEEE) are no exception. To help solve this problem, researchers at IMT Mines Alès have developed an automated sorting algorithm designed for these plastics. It can be integrated into new industrial-scale sorting machines.

 

How will your coffee maker be reincarnated after it dies? This electrical appliance composed primarily of plastic, metal and glass falls into the category of waste electrical and electronic equipment (WEEE). Your smartphone and washing machine are also included in this category. After it is thrown away, the coffee maker will find itself drowning in what amounts to over 750,000 tons of WEEE collected every year in France, before it is recovered by a specialized recycling center. There, it is dismantled, crushed and separated from its ferrous and non-ferrous metals, such as copper or aluminum, until all that’s left of the machine is a heap of plastic. Plastic is the second-largest component of WEEE after steel, so recycling it is a major concern.

And successful recycling starts with effective sorting. 20% of plastic materials are recovered through flotation after being placed in a tank filled with water. But how can the remaining 80% be processed? “Samples measuring 1 cm² are placed on a conveyor belt equipped with an infrared camera at the end, which scans the material and determines what type of plastic it’s made of,” says Didier Perrin, a physical chemist at IMT Mines Alès. The radiation excites the atomic bonds of the molecules and creates a spectral signature that characterizes the plastic to be identified. A technique using a near-infrared source (NIRS) is especially rapid but cannot be used to identify dark plastics, which absorb the radiation. But black plastic, which holds up over time better than colored plastic, represents nearly 50% of the waste. “Accurate and effective identification of the material is therefore crucial to generate high-quality raw material to be recycled, combining purity and mechanical performance,” adds the researcher. However, this method does not always make it possible to determine the exact type of plastic contained within a sample.

An automated sorting algorithm

Researchers at IMT Mines Alès have therefore developed an automated method for sorting plastic by working with SUEZ and Pellenc ST, a company that develops smart, connected sorting machines. The focus of their collaboration was on establishing a classification of the plastics contained in WEEE. The researchers generated a database in which each plastic has its own clearly-defined spectral identity. WEEE plastics were divided into four major families: ABS (acrylonitrile butadiene styrene), a polymer commonly used in industry which represents 50 to 60% of plastic waste (cases, covers, etc.); HIPS (high-impact polystyrene), which is similar to ABS but less expensive and with lower mechanical performance (refrigerator racks, cups); polypropylene, which is more ductile than ABS and HIPS (soft covers for food containers, cups); and what is referred to as ‘crystal’ polystyrene (refrigerator interiors, clear organic glass).

Their first step was to better recognize the plastics to be sorted. “We used a supervised learning method on the data measured in the laboratory and then analyzed the same samples in industrial conditions,” explains PhD student Lucie Jacquin. Nevertheless, it is not always easy to characterize the type of plastic contained in waste. First of all, plastic degrades over time, which modifies its properties and makes it difficult to identify. And second, industrial conditions — with 3,000 kg of waste analyzed per hour — often result in incomplete spectral measurements.

Beyond the uncertainties of the measurements, the most traditional sorting methods also have their flaws. For example, they are based on probabilistic classification algorithms, which are used to determine how similar a sample is to those in a reference database. Except that these algorithms do not distinguish between equiprobability and ignorance. In the event of equiprobability, the spectrum of a sample is 50% similar to the spectrum of plastic A and 50% similar to that of plastic B. In the event of ignorance, even though the spectrum of a sample is not similar to any element within the database, the algorithm gives the same result as in the event of equiprobability (50% A and 50% B). So how can it be determined whether the information provided by the algorithm reflects equiprobability or ignorance? The researchers’ aim is therefore to better manage uncertainty in measurements in real conditions.

Understanding the material to recycle it better

“We approached this problem using modern uncertainty theories, which allow us to better represent uncertainty in the classification of a sample, based on the uncertainty in its spectrum obtained in real conditions. Belief functions can distinguish between equiprobability and ignorance, for example,” explains computer science researcher Abdelhak Imoussaten. The algorithm attempts to determine the class of plastic to which a sample belongs. When there is doubt, it determines the set of classes of plastic to which it may belong and eliminates the others. For example, we can be sure that a sample is either ABS or HIPS, but definitely not polypropylene. “In this way, we use ‘cautious’ machine learning to control what the machine will send to the sorting bins,” adds Abdelhak Imoussaten. And that is the real goal: automatically determining to which sorting bin these small bits of plastic will be sent.
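
A minimal sketch of this distinction, using Dempster-Shafer belief functions over a three-class frame (the classes and mass values are ours, for illustration):

```python
FRAME = frozenset({"ABS", "HIPS", "PP"})

def belief(mass, hypothesis):
    # Total mass committed to subsets entirely inside the hypothesis.
    return sum(m for subset, m in mass.items() if subset <= hypothesis)

def plausibility(mass, hypothesis):
    # Total mass that does not contradict the hypothesis.
    return sum(m for subset, m in mass.items() if subset & hypothesis)

# Equiprobability: the spectrum genuinely matches ABS and HIPS equally well.
equiprobable = {frozenset({"ABS"}): 0.5, frozenset({"HIPS"}): 0.5}
# Ignorance: the spectrum matches nothing; all mass goes to the whole frame.
ignorant = {FRAME: 1.0}

h = frozenset({"ABS"})
print(belief(equiprobable, h), plausibility(equiprobable, h))  # 0.5 0.5
print(belief(ignorant, h), plausibility(ignorant, h))          # 0.0 1.0
# A probabilistic classifier would output 0.5 for ABS in both cases; the
# [belief, plausibility] interval tells the two situations apart.
```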

“Each category of plastic accepts a certain quantity of other plastics without affecting the matrix of the recycled material,” says Didier Perrin. In practice, this means that it is possible to send a plastic to a sorting bin with some certainty, even if the exact type of plastic is unclear (A or B but not C). While completing his PhD at IMT Mines Alès under the supervision of Didier Perrin, Charles Signoret studied all the possible mixtures of the various plastics and their compatibility. For example, ABS may only contain 1% polypropylene in its structure in order to maintain its mechanical properties, but it may contain up to 8% HIPS.

While the presence of impurities is inevitable in recycling, the researchers consider a sorting method to be effective when it results in materials with 5% impurities or less. One thing is certain: the collaborative work of the researchers, SUEZ and Pellenc ST has proved to be effective in terms of sorting quality. It has already resulted in a demonstration machine which will subsequently be implemented in the production of new sorting machines.

Improving the effectiveness of sorting systems is crucial to the economic viability of the recycling industry. The ADEME estimates that 1.88 million tons of household appliances are brought to the market every year in France. These products will eventually have to be sorted in order to provide high-quality material to produce future equipment for this ever-growing market. “Our goal is also to ensure that the term ‘recycled,’ when referring to plastics, does not mean low-quality, as has already been achieved with glass and steel, two recycled materials whose quality is no longer questioned,” concludes Didier Perrin.

 

Article written in French by Anaïs Culot for I’MTech

 

Hiboo

Tracking mobile industrial equipment in real time with Hiboo

The Hiboo start-up was incubated at Télécom Paris and provides a platform to help companies better manage their mobile assets: equipment, merchandise, vehicles etc. This solution is now widely used in the construction industry.

 

In 2016, the start-up Hiboo, which was incubated at Télécom Paris at the time, created a connected device for construction equipment. But the industry was already facing an overwhelming amount of unused data, and resolving this problem was not one of its top priorities. Instead, the sector sought to solve an ongoing problem: although it generated significant annual revenue, its profit margins remained low.

“We started out with the idea that one of the best ways to optimize this profit margin was to better understand the equipment’s activity using the available data,” explains François Jacob, co-founder of Hiboo. The start-up therefore made a radical shift to become a super aggregator of data sources. This transformation gave rise to its current platform, which helps companies manage their operations more effectively.

Hiboo helps collect and sort data

On paper, all construction companies face the same problems when it comes to mobile industrial assets: renting equipment, inventory, time spent on site, consumption, identifying machine problems etc. But on site, they lack an overall vision, and taking inventory of their equipment is time-consuming and not always thorough. Hiboo collects information provided by three categories of equipment: connected vehicles, unpowered equipment, and non-connected equipment containing an onboard computer.

In the construction industry, companies may manage thousands of pieces of equipment at the same time. Such equipment may include some twenty brands of connected vehicles. However, if a company wants to understand how each brand fits into their overall equipment operations, users must log into each brand’s individual platform to retrieve this information, which is impossible to do on a daily basis.

Hiboo solves this problem by aggregating key data such as GPS coordinates, energy consumption and machine error codes by logging in to all of the manufacturers’ individual servers in the client’s place. The data are then harmonized before being automatically analyzed by ‘workers’. These robots isolate outliers, such as a vehicle that has consumed 2,500 liters of fuel in one day. The process is then checked by engineers at Hiboo who send a final report to the clients. Users may therefore access all operations inputs and outputs for connected equipment on a single website.
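
The ‘worker’ logic can be pictured as a robust outlier filter. Here is a minimal sketch with hypothetical readings and a rule of thumb of our own; Hiboo’s actual rules are not public and are surely more elaborate:

```python
from statistics import median

# Hypothetical daily fuel readings (liters) for one machine, aggregated from a
# manufacturer's server; values and thresholds are invented for illustration.
daily_fuel = [180, 195, 172, 188, 2500, 190, 176]

def flag_outliers(readings, k=5.0):
    # Robust rule of thumb: flag values further than k median-absolute-deviations
    # from the median (the median resists being dragged by the outlier itself).
    med = median(readings)
    mad = median(abs(r - med) for r in readings)
    return [r for r in readings if mad and abs(r - med) > k * mad]

print(flag_outliers(daily_fuel))  # [2500] -> routed to an engineer for review
```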

Solutions hidden in data

Hiboo also equips unpowered equipment such as crane parts, dumpsters and trailers with connected devices that communicate via low-frequency networks. They are energy-efficient, and make it possible to obtain GPS coordinates and track equipment’s activity over a number of years. The information is sent to Hiboo using traditional telephone networks. With the help of a partner, the start-up also equips non-connected vehicles with devices in order to collect the information obtained in their on-board computers. “So we provide equipment managers with a comprehensive solution for practically all of their assets,” adds François Jacob.

All of this data is made available to users on the Hiboo platform. But it can also be integrated in applications such as invoicing software. The start-up helped the Swiss company Bernasconi shorten its invoicing process by one week every month by eliminating paper invoices. And a major industrial equipment rental company was able to save up to 700 billable days a month by identifying the over-usage of its equipment. “By processing data from the field, we can help companies revolutionize asset management, maintenance, assignment, invoicing etc.” explains François Jacob.

A versatile technology

Hiboo wishes to go further in leveraging data, in particular machine error codes and their severity levels. “Using this data and maintenance records, we want to provide predictive maintenance so that we can predict the probability of a machine breaking down,” explains François Jacob. This could involve a failure on a compressor, an oil leak, a motor with low voltage etc. To do so, the start-up team combines information about the errors with the computerized maintenance management systems (CMMS) already used by companies to monitor machines and keep them in good working order.

Although originally intended for the construction industry, Hiboo’s solution can be used for other applications, given its ability to control the flow of data between different networks. For example, the start-up will be covering the Dakar rally in 2020. “By connecting to Marlink, the satellite communication network used to track the rally participants, we can collect information about the various vehicles and track their performance on our platform,” explains François Jacob.

Learn more about Hiboo

SOCCRATES

SOCCRATES automates cybersecurity for industrial systems

SOCCRATES is an H2020 European project launched in September 2019 for a three-year period. It aims to develop a platform to automate the detection of certain attacks and launch appropriate countermeasures. In doing so, it should help cybersecurity operators for industrial systems act more quickly and effectively in the event of a cyber attack. Hervé Debar, an information systems security researcher at Télécom SudParis, explains how the research consortium, which includes the school, is going about developing this solution.

 

What is the SOCCRATES platform?

Hervé Debar: The SOCCRATES platform is a “Security Information and Event Management” environment that aims to detect and block cyber-attacks more effectively. To do so, the platform collects data about the vulnerabilities present on the monitored system, malicious activity targeting the IT environment, and general information about the threat. It then proposes appropriate countermeasures for the attacks that are detected and makes it possible to implement them.

How does it hope to address the needs of companies and organizations?

HD: SIEM platforms are the core of Security Operations Centers (SOC), where operators manage cyber threats. All operators of critical infrastructures must monitor their information systems as required by French and European regulations. Faced with growing threats, the SOCCRATES platform aims to provide a greater degree of automation, making it possible to respond to attacks more quickly and precisely. Operators could then focus on the most complex attacks.

What is your approach to developing this platform?

HD: The project focuses primarily on the knowledge with which SOC operators are provided in order to respond to attacks. This knowledge takes three forms. The first is increased knowledge of the monitored information system, and of the potential attack paths that could be used to compromise a vulnerable target. Blocking the easiest attack paths can help prevent a hacker from spreading throughout the system. The second form of knowledge is based on an understanding of the threat. This means observing internet attack phenomena in order to improve the detection mechanisms used. And the third form of knowledge involves understanding the impact an attack has on operations in order to assess the risks of countermeasures and the benefits in terms of limiting the impact of an attack.
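
The first form of knowledge, attack paths, lends itself to a small illustration. In the toy sketch below (the graph, assets and effort scores are invented), the “easiest attack path” is simply the minimum-cost path through an attack graph, and blocking its cheapest step forces the attacker onto a more expensive route:

```python
import heapq

# Toy attack graph: nodes are assets, edge weights are attacker effort.
GRAPH = {
    "internet": {"webserver": 1, "vpn": 4},
    "webserver": {"app_server": 2},
    "vpn": {"app_server": 1},
    "app_server": {"database": 3},
    "database": {},
}

def easiest_path_cost(graph, source, target):
    # Dijkstra's algorithm: cost of the cheapest path from source to target.
    queue, seen = [(0, source)], set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node == target:
            return cost
        if node in seen:
            continue
        seen.add(node)
        for neighbor, effort in graph[node].items():
            heapq.heappush(queue, (cost + effort, neighbor))
    return float("inf")

print(easiest_path_cost(GRAPH, "internet", "database"))  # 6, via the webserver
GRAPH["internet"].pop("webserver")  # "patch" the exposed webserver
print(easiest_path_cost(GRAPH, "internet", "database"))  # 8, forced via the VPN
```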

What expertise are Télécom SudParis researchers contributing to this project?

HD: We’re contributing our expertise in cyber attack remediation, which we developed in particular through the MASSIF and PANOPTESEC European FP7 projects. Our work on these two projects, which were launched in 2013 and 2014, gave us the opportunity to develop in-depth knowledge about industrial cybersecurity, managing attacks and implementing countermeasures. Our response model provides a quantitative assessment of the impact — whether positive or negative — of the remediations proposed to block attacks.

Read more on I’MTech: SPARTA: Defining Cybersecurity in Europe

How do you plan to test the effectiveness of the SOCCRATES platform?

HD: The platform will be implemented and deployed in two pilot environments involving critical infrastructures: in the field of cloud computing with Mnemonic, a managed security service provider, and in the energy sector with Vattenfall, where the platform will be used to monitor networks that control electricity production and distribution.

Beyond these industry partners, how is the project organized?

HD: SOCCRATES is coordinated by the Netherlands Organisation for Applied Scientific Research (TNO). In addition to IMT, there are three Scandinavian partners (KTH, Foreseeti and Mnemonic), a Finnish partner (F-Secure), ATOS Spain, Vattenfall IT Services (Poland), the Austrian Institute of Technology (AIT), and another Dutch partner, ShadowServer. The consortium is organized around three kinds of contributions: vulnerability analysis, behavioral detection, and attack remediation. Our first major step is to define the use cases and demonstration scenarios that we will use to develop, approve and demonstrate the components of the project. We plan to do this by the end of January.

Learn more about SOCCRATES

DeNoize

A window, and silence!

To combat noise pollution and its effects on human health, DeNoize, a start-up incubated at Mines Saint-Étienne, offers a solution: a window that ‘mutes sound’. This connected window would analyze outside noise and adapt to cancel it out.

 

Double glazing increases thermal insulation, but when it comes to noise, it’s another story. When we’re indoors at home or the office, the vast majority of the outside noise that reaches us comes through the windows. This is an especially big concern for people who live or work near airports or major roads. Since May 2018, DeNoize co-founders Olivier Schevin and Aman Jindal have made it their mission to reduce this noise pollution, which is detrimental to our health.

DeNoize, a start-up incubated at Mines Saint-Étienne, offers an innovative solution for improving sound insulation in windows. “Our challenge is now to miniaturize the system so that it can be integrated into window frames,” says co-founder Olivier Schevin. The concept could easily be integrated into standard windows available today.

The problem with double glazing

“Double glazing is actually less effective than single glazing when it comes to sound insulation for the same thickness of glass,” says Olivier Schevin. Although it may seem counterintuitive, double glazing offers less resistance to low frequencies (between 50 and 500 Hz), the frequency band that is the main source of noise from airports and roads. “Double glazing was designed to solve thermal insulation problems, without considering the acoustic aspect,” he explains.

Double glazing is first and foremost two masses, the panes, with air or gas between them. This structure poses a problem from an acoustic point of view: certain frequencies – low frequencies – cause the air trapped between the panes to resonate, and the sound propagates. This effect may be counteracted by increasing the thickness of the panes or the space between them, but this passive reduction makes for bulky windows from an architectural viewpoint and is also very expensive.

Sound fights back

DeNoize’s innovation is to use sound to fight sound, making it an active noise reduction system. “We’re going to generate a counter-vibration suited to the vibration of the outside noise,” explains Olivier Schevin. “The system produces a vibration that counters that of the outside noise, creating a destructive interference.” The vibrations ‘cancel each other out,’ reducing the noise transmitted by up to 75% for low frequencies.  
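
The principle of destructive interference can be shown in a few lines. This is a toy illustration: the counter-vibration here is perfect by construction, whereas a real system must compute it adaptively from sensor measurements:

```python
import math

# A 100 Hz disturbance sampled at 8 kHz, standing in for low-frequency noise.
samples = [math.sin(2 * math.pi * 100 * t / 8000) for t in range(8000)]
anti = [-s for s in samples]                    # ideal counter-vibration
residual = [s + a for s, a in zip(samples, anti)]

print(max(abs(r) for r in residual))  # 0.0: the two waves cancel exactly

# With a 10% amplitude error, cancellation is only partial (~0.1 of the
# original amplitude), which is why the controller must keep adapting.
imperfect = [s - 0.9 * s for s in samples]
print(max(abs(r) for r in imperfect))
```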

“This technology is somewhat similar to that used in noise-cancelling headphones,” adds Olivier Schevin. “The technical difference is the surface of the area we want to treat. For the headphones, it’s a really small area close to the ear.” The system developed by DeNoize uses sensors to analyze outside noise in real time and adapt to it accordingly. Actuators produce a counter-vibration that interferes with the original noise. The system also includes a control unit and an electronic board responsible for determining the most effective actions for the sensors and actuators.

The system is integrated into the window frame and requires a nearby electrical connection to supply it with energy. This is already common today with rolling shutters, for example. The innovation is in step with advances in smart home technology.

Read more on I’MTech: Smart homes: A world of conflict and collaboration

This communication between actuators, sensors and the control unit makes it possible to adjust noise reduction in real time, adapting to outside variations. “As of now, we have a working prototype,” says Olivier Schevin, “but the system doesn’t calculate in real time yet. So we still have a development phase ahead of us for the electronics part.”

Olivier Schevin is launching an industrial project with students to develop a real-time demonstrator. The electronic component is still to be developed, since the existing control unit was made using laboratory equipment that cannot be integrated into window frames. “In general, we’re still looking for ways to improve performance at the lowest possible cost.”

digital twin

What is a digital twin?

Digital twins, digital doubles – what exactly do these terms mean? Raksmey Phan, an engineer at the Mines Saint-Étienne Centre for Biomedical and Health Engineering (CIS)[1], talks to us about the advantages and advances offered by these new tools, as well as the issues involved.

 

What does a digital twin refer to?

Raksmey Phan: If you have a digital, mathematical model representing a real system, based on data from this real system, then you have a digital twin. Of course, the quality of the digital twin depends first and foremost on the mathematical model. Industrial ovens are a historic example that can help explain this idea.

To create a digital twin, we record information about the oven, which could include its operating hours or the temperature each time it’s used. Combined with algorithms that take into account the physical components that make up the oven, this digital twin will calculate its rate of wear and tear and anticipate breakdown risks. The use of the oven can then be monitored in real time and simulated in its future state with different use scenarios in order to plan for its replacement.
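
A minimal sketch of this oven example, with an invented wear model (the point is the pattern: mirror recorded usage, then replay scenarios on the model):

```python
from dataclasses import dataclass, field

@dataclass
class OvenTwin:
    """Toy digital twin: mirrors recorded usage and extrapolates wear.
    The wear formula below is invented purely for illustration."""
    hours: float = 0.0
    wear: float = 0.0
    records: list = field(default_factory=list)

    def log_cycle(self, duration_h: float, temperature_c: float):
        # Mirror a real usage event; assume wear grows with time and temperature.
        self.hours += duration_h
        self.wear += duration_h * (temperature_c / 1000.0)
        self.records.append((duration_h, temperature_c))

    def hours_until_limit(self, duration_h=8.0, temperature_c=900.0, limit=100.0):
        # "What if" simulation: keep replaying a usage scenario until the
        # wear limit, to anticipate breakdown risk and plan replacement.
        wear_per_cycle = duration_h * (temperature_c / 1000.0)
        cycles_left = max(0.0, (limit - self.wear) / wear_per_cycle)
        return cycles_left * duration_h

twin = OvenTwin()
twin.log_cycle(8, 900)             # feed it a day of recorded data
print(twin.hours_until_limit())    # remaining hours under the same scenario
```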

In what fields are they used?

RP: They can be used in any field where there is data to be recorded. We could say that climatologists make a digital twin of our planet: based on observational data recorded about our planet, they run simulations, and therefore mathematical models, resulting in different scenarios. To give another example, at the Mines Saint-Étienne CIS, we have scientists such as Xiaolan Xie, who are internationally renowned for their experience and expertise in the field of modeling healthcare systems. One of our current projects is a digital twin of the emergency department at Hôpital Nord de Saint-Étienne, which is located 200 meters from our center.

What advantages do digital twins offer?

RP: Let’s take the example of the digital twin of the emergency room. We’ve integrated anonymized patient pathways over a one-year period in a model of the emergency room. In addition to this mathematical model, we receive data in what can be referred to as ‘pseudo-real time,’ since there is a lapse of one hour from the time patients arrive in the department. This makes it possible for us to do two important things. The first is to track the patients’ movement through the department in pseudo-real time, using the data received and the analysis of pathway records. The second is the ability to plan ahead and predict future events. Imagine if there was a bus accident in the city center. Since we know what types of injuries result from such an accident, we can visualize the impact it would have on the department, and if necessary, call in additional staff.

What did people do before there were digital twins?

RP: Companies and industries were already using the concept before the term existed. Since we’ve been using machines, engineers have tried to monitor tools with replicas – whether digitally or on paper. It’s a bit like artificial intelligence. The term is back in fashion but the concept goes back much further. Algorithms are mathematics, and Napoleon used algorithms for his war logistics.

When did the term digital twin first start to be used?

RP: The term ‘digital twin’ was first used in 2002 in articles by Michael Grieves, a researcher at the Florida Institute of Technology. But the concept has existed for as long as we have been trying to model real phenomena digitally, which is to say since the early days of computing. There has been renewed interest in digital twins in recent years, however, due to the convergence of three scientific and technological innovations. First, the impressive growth in our ability to analyze large amounts of data — Big Data. Second, the democratization of connected sensors — the Internet of Things. And third, renewed interest in algorithms in general, as well as in cognitive sciences — Artificial Intelligence.

How have the IoT and Big Data transformed digital twins?

RP: A digital twin’s quality depends on the quantity and quality of data, as well as on its ability to analyze this data, meaning its algorithms and computing capacity. IoT devices have provided us with a huge amount of data. The development of these sensors is an important factor – production has increased while costs have decreased. The price of such technologies will continue to drop, and at the same time, they will become increasingly accurate. That means that we’ll be able to create digital twins of larger, more complex systems, with a greater degree of accuracy. We may soon be able to make a digital twin of a human being (project in the works at CIS).

Are there technological limitations to digital twins?

RP: Over the last five years, everything’s been moving faster at the technological level. It’s turned into a race for the future. We’ll develop better sensors, and we’ll have more data, and greater computing power. Digital twins will also follow these technological advances. The major limitation is sharing data – the French government was right to take steps towards Open Data, which is free data, shared for the common good. Protecting and securing data warehouses are limiting factors but are required for the technological development of digital twins. In the case of our digital twin of the hospital, this involves a political and financial decision for hospital management.

What are some of the challenges ahead?

RP: The major challenge, which is a leap into the unknown, is ethics. For example, we can assess and predict the fragility of senior citizens, but what should we do with this information after that? If an individual’s health is likely to deteriorate, we could warn them, but without help it will be hard for them to change their lifestyle. However, the information may be of interest to their insurance providers, who could support individuals by offering recommendations (appropriate physical activity, accompanied walks, etc.). This example hinges on the issues of confidentiality and anonymization of data, not to mention the issue of informed consent of the patient.

But it’s incredible to be talking about confidentiality, anonymization and informed consent as a future challenge — although it certainly is the case — when for the past ten years or so, a portion of the population has been publishing their personal information on social media and sharing their data with wellness applications whose data servers are often located on another continent.

[1] Raksmey Phan is a researcher at the Laboratory of Informatics, Modelling and Optimization of Systems (LIMOS), a joint research unit between Mines Saint-Étienne/CNRS/Université Clermont-Auvergne.


AiiNTENSE

AiiNTENSE: AI for intensive care units

The start-up AiiNTENSE was incubated at IMT Starter and develops decision support tools for healthcare with the aim of advising intensive care personnel on the most appropriate therapeutic procedures. To this end, the start-up is developing a data platform of all diseases and conditions, which it has made available to researchers. It therefore seeks to provide support for launching clinical studies and increase medical knowledge.

 

Patients are often admitted to intensive care units due to neurological causes, especially in the case of a coma. And patients who leave these units are at risk of developing neurological complications that may impact their cognitive and functional capacities. These various situations pose diagnostic, therapeutic and ethical problems for physicians. How can neurological damage following intensive care be predicted in the short, medium and long term in order to provide appropriate care? How will a coma patient evolve neurologically: toward brain death, a vegetative state, or a partial recovery of consciousness? An incorrect assessment of the prognosis could have tragic consequences.

In 2015, Professor Tarek Sharshar, a neurologist specialized in intensive care, saw a twofold need for training: on one hand, neurology training for intensivists, and on the other, intensive care training for neurologists. He proposed a tele-expertise system connecting the two communities. In 2017, this project gave rise to AiiNTENSE, a start-up incubated at IMT Starter, whose focus soon expanded. “We started out with our core area of expertise, neuro-intensive care, and, drawing on support from other experts and learned societies, we shifted to developing decision support tools for all of the diseases and conditions encountered in intensive care units,” says Daniel Duhautbout, co-founder of AiiNTENSE. The start-up is developing a database of patient records which it analyzes with artificial intelligence algorithms.

AI to aid in diagnosis and prognosis

The start-up team is working on a prototype concerning post-cardiac arrest coma. Experts largely agree on methods for assessing the neurological prognosis for this condition. And yet, in 50% of the cases of this condition, physicians are not yet able to determine whether or not a patient will awake from the coma. “Providing a prognosis for a patient in a coma is extremely complex and many available variables are not taken into account, due to a lack of appropriate clinical studies and tools to make use of these variables,” explains Daniel Duhautbout. That’s where the start-up comes in.

In 2020, AiiNTENSE will launch its pilot prototype in five or six hospitals in France and abroad. This initial tool comprises, first and foremost, patient records, taken from the hospital’s information system, which contain all the relevant data for making medical decisions. This includes structured biomedical information and non-structured clinical data (hospitalization or exam reports). In order to make use of the latter, the start-up uses technology for the automated processing of natural language. This results in patient records with semantic, homogenized data, which take into account international standards for interoperability.

A use for each unit

The start-up is developing a program that will in time respond to intensivists’ immediate needs. It will provide a quick, comprehensive view of an individual patient’s situation. The tool will offer recommendations for therapeutic procedures or additional observations to help reach a diagnosis. Furthermore, it will guide the physician in order to assess how the patient’s state will evolve. The intensivist will still have access to an expert from AiiNTENSE’s tele-expertise network to discuss cases in which the medical knowledge implemented in the AiiNTENSE platform is not sufficiently advanced.

The start-up also indirectly responds to hospital management issues. Proposing accurate, timely diagnoses means limiting unnecessary exams, making for shorter hospital stays and, therefore, lower costs. In addition, the tool optimizes the traceability of analyses and medical decisions, a key medico-legal priority.

In the long term, the start-up seeks to develop a precision intensive care model. That means being able to provide increasingly reliable diagnoses and prognoses tailored for each patient. “For the time being, for example, it’s hard to determine what a patient’s cognitive state will be when they awaken from a coma. We need clinical studies to improve our knowledge,” says Daniel Duhautbout. The database and its analytical tools are therefore open to researchers who wish to improve our knowledge of conditions that require intensive care. The results of their studies will then be disseminated through AiiNTENSE’s integration platform.

Protecting data on a large scale

In order to provide a viable and sustainable solution, AiiNTENSE must meet GDPR requirements and protect personal health data. With this aim, the team is collaborating with researchers at IMT Atlantique and plans to use the blockchain to protect data. Watermarking, a sort of invisible mark attached to data, would also appear to be a promising approach. It would make it possible to track those who use the data and who may have been involved in the event of data leakage to external servers. “We also take care to ensure the integrity of our algorithms so that they support physicians confronted with critical neurological patients in an ethical manner,” concludes Daniel Duhautbout.

 

supply chain

Meet your supply chain using virtual reality

Immersive technologies for industrial engineering and risk management? This is the focus of research to be carried out at the SIReN laboratory launched on 15 November, bringing together researchers from IMT Mines Albi (France) and Georgia Tech (USA). On the French side, Frédérick Benaben, an expert in collaborative networks, is already using virtual reality to develop a supply chain decision support and management tool for companies.  

 

In front of you, a beam of green light traces a path leading you straight to the finishing line. You’re continuing along this path when, suddenly, a floating red sphere comes flying right toward you! It makes you veer sharply from your path, taking you away from where you were headed. You now face a crucial question: how can you achieve your goal from this new point? This is not a revolutionary video game set to be released at the end of the year. It’s a decision support tool developed by researchers at IMT Mines Albi and Georgia Tech to facilitate the visualization of data produced by artificial intelligence.

Building on their collaboration begun in 2015, the two academic partners have continued their joint research since 15 November through SIReN[1], a new international associated laboratory jointly based in Albi and Atlanta. “At the laboratory, we’re carrying out research on how immersive technology can help us develop and manage response networks,” explains Frédérick Benaben, an IMT Mines Albi researcher who specializes in the field of collaborative networks and information systems. Such networks include supply chains and crisis management. The researchers’ expertise is based on an original vision of artificial intelligence, at the crossroads between industrial engineering and computer and data sciences, and on a partnership that is already as concrete as the work it seeks to carry out.

Making the abstract concrete with virtual reality

A supply chain is a dynamic system that must be as agile as possible. It evolves over time in response to opportunities (opening of a new market, tax reductions, etc.) or risks (weather events, closing of a border, etc.). Yet understanding how these various events could impact the supply chain proves to be very complex. That’s where virtual reality comes in!

Read more on I’MTech: What is supply chain management?

Unlike “traditional” uses of virtual reality where the aim is to represent a copy of reality, as with digital twins for example, the researchers use virtual reality to get away from reality. “We can then project ourselves into a world where our physical reference points (such as up and down, distance, etc.) are conserved, but where we have the possibility of visualizing abstract concepts. Using spheres, we represent opportunities or risks, for example. Color effects can indicate how dangerous they are,” says the researcher.

In the virtual universe, the spatio-temporal reference points are defined by the set of performance indicators for a supply chain. Let’s consider a simplified case where there are only three indicators: cost, product quality and delivery time. The researchers therefore define a three-dimensional frame of reference in which the supply chain is situated. Like a mechanical force, each risk or opportunity that has an impact on the network will push or pull it in a certain direction. For example, flooding along a delivery route will push the supply chain down on the delivery time axis.
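
A minimal sketch of this simplified three-indicator case (axes, event forces and probabilities are invented for illustration):

```python
import numpy as np

# The supply chain is a point in KPI space; each event acts on it like a force.
KPI_AXES = ["cost", "quality", "delivery_time"]
state = np.array([0.0, 0.0, 0.0])   # current position of the supply chain

EVENTS = {
    # (force vector, probability of occurring) — all values invented
    "flooded_route": (np.array([0.2, 0.0, 0.8]), 0.3),  # pushes delivery time up
    "new_market": (np.array([-0.1, 0.0, -0.3]), 0.6),   # an opportunity
}

def step(state, events, dt=1.0):
    # Expected resultant force: sum of each force weighted by its probability,
    # exactly as forces are summed in mechanics.
    resultant = sum(force * p for force, p in events.values())
    return state + resultant * dt

state = step(state, EVENTS)
print(dict(zip(KPI_AXES, state.round(2).tolist())))
# {'cost': 0.0, 'quality': 0.0, 'delivery_time': 0.06}
```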

Through virtual reality, a user can observe a supply chain, move around it, and examine its associated risks and opportunities.

 

In reality, logistics networks have dozens of performance indicators and over a hundred risks and opportunities — and therefore mechanical forces — to represent at each instant, making them complex to visualize. This is made possible through extensive work to identify and process data. To continue with the flooding example, data is identified such as the number of blocked routes, how delayed the trucks are, the percentage of damaged warehouses, implications for the products etc. The researchers turn this information into a macroscopic force exerted on the performance indicators.

Read more on I’MTech: C2Net: supply chain logistics on cloud nine

Virtual reality therefore helps respond to a need for agility in supply chains in an interactive way. Once the users are immersed in their supply chain universe, they can follow its movement and also interact with the supply chain. The spheres representing risks and opportunities are closer or further from the user based on how likely they are to occur. Their color indicates how dangerous they are, making it possible to identify areas of action more effectively. “The user steers the course of their system towards a precise objective. Virtual reality allows them to identify the forces they must use to achieve their goal, much like a sailor uses winds and currents. If a risk materializes, they deviate from the path but may be able to use a force to correct the effect,” explains Frédérick Benaben.

This decision support tool could also help anticipate the effects of an event on the path and, if possible, avoid it. These are precisely the questions being explored through the SCAN research program (Agile, Digital Collaborative Systems), launched in September 2019 with Scalian, a company that specializes in digital transformation, and conducted through the SIReN laboratory.

Virtuality and risk management

The supply chain is not the only area of research to benefit from virtual reality through the SIReN laboratory. In March 2019, researchers from IMT Mines Albi created the EGCERSIS[2] research program with support from the Occitanie region and in partnership with the companies Immersive Factory and Report One. The aim is to use immersive technologies to develop crisis management systems for sensitive industrial environments. In particular, they are building on an emerging need expressed by Immersive Factory, a company specialized in developing digital twins to provide safety training for staff at industrial sites. The company is seeking to expand its offerings by providing training for crisis situations. Employees may have already learned how to make sure a valve is closed after using it, but what should they do if it catches on fire? The four-year partnership will be based on demonstrating the strength of digital simulation to respond to this sort of issue.

To do so, the researchers will rely on the IOMEGA platform equipped with multi-screen interfaces displayed in the form of a cockpit, allowing interaction between software, artificial intelligence visualization etc. They will also take advantage of state-of-the-art equipment for immersion and virtual reality, giving users 360° freedom of movement, via the new IOMEGA VR platform launched on 15 November. On the other side of the Atlantic, a twin platform is being developed at Georgia Tech.

More generally, the two partner institutions will draw on their complementary expertise for the projects carried out through SIReN. They seek to increase the agility of collaborative networks but come at the problem from two different angles. The French team is developing technologies intended for the supply chain, while the American team has given rise to the concept of the physical internet, which aims to transport physical goods as efficiently as data is transmitted over the internet. Like the internet, logistics must become fluid, and immersive technologies have a key role to play in making this possible.

[1] Sentient Immersive Response Network

[2] Crisis Management Training in an Environment Representative of Sensitive Industrial Sites

ixblue

iXblue: Extreme Fiber Optics

Since 2006, iXblue, a French company based in Lannion, and the Hubert Curien laboratory [1] in Saint-Étienne have partnered to develop cutting-edge fiber optics. This long partnership has established iXblue as a global reference in the use of fiber optics in harsh environments. The scientific and technological advances have enabled the company to offer solutions for the nuclear, space and health sectors. But there’s something different about these optical fibers: they’re not used for telecommunications.

 

Last June, iXblue and the Hubert Curien laboratory officially opened LabH6, a joint research laboratory dedicated to fiber optics. This latest development stems from a partnership that dates back to 2006, and, further back, to the bursting of the internet bubble. In fact, iXblue was born from the ashes of a start-up specializing in fiber optics for telecommunications. After the disappointment experienced in the digital technology sector in the early 2000s, “we decided to make a complete U-turn, leaving telecommunications behind, while remaining in fiber optics,” explains Thierry Robin, who has been with the company since the beginning and is currently its CTO.

A daring move, at a time when fiber optics in domestic networks was in its infancy. But it was a move that paid off. In 13 years, the young company became a pivotal stakeholder in fiber optics for harsh environments. The company owes its success to the innovations developed with the Hubert Curien laboratory. The company’s products are now used in high-temperature conditions, under nuclear irradiation and in the vacuum of space.

Measuring nuclear irradiation

One of the major achievements of this partnership has been the development of optical fibers that can measure the radiation dose in an environment. The light passing through an optical fiber is naturally diminished over the length of the fiber. This attenuation, called optical loss, increases when the fiber is under nuclear radiation. “We understand the law governing the relationship between optical loss and the radiation dose received by the fiber,” explains Sylvain Girard, a researcher at the Hubert Curien laboratory. “We can therefore have an optical fiber play the role of hundreds of dosimeters by measuring the radiation value.”

There are two advantages to this application of the fiber. First of all, the resulting data can be used to establish a continuous mapping of the radiation over the length of the fiber, whereas dosimeters provide a value from their specific location. Secondly, the optical fiber provides a real-time measurement, since the optical loss is measured live. Dosimeters, on the other hand, are usually left for days or months in their locations before the value of the accumulated radiation can be measured.
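
As a sketch of how such distributed dosimetry works: invert a calibration law to turn the excess attenuation measured along the fiber into a dose profile. The power-law form and coefficients below are assumptions for illustration; the actual calibration belongs to the researchers:

```python
# Hypothetical calibration: radiation-induced attenuation RIA = a * dose**b
A_COEFF, B_COEFF = 0.05, 0.7   # invented constants, for illustration only

def dose_from_attenuation(ria_db_per_m: float) -> float:
    """Invert RIA = a * D**b  ->  D = (RIA / a) ** (1 / b)."""
    return (ria_db_per_m / A_COEFF) ** (1.0 / B_COEFF)

# One excess-attenuation reading (dB/m) per meter of fiber, e.g. from a
# distributed reflectometry trace (values invented):
attenuation_profile = [0.01, 0.02, 0.40, 0.45, 0.02]
dose_map = [round(dose_from_attenuation(ria), 1) for ria in attenuation_profile]
print(dose_map)  # dose peaks where the fiber crosses the irradiated zone
```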

The fibers used in this type of application are unique. They must be highly sensitive to radiation in order to accurately measure the variations. Research conducted for this purpose resulted in fibers doped with phosphorus or aluminum. This type of optical fiber is currently installed in the CERN Large Hadron Collider (LHC) in Geneva during the 2-year shutdown that will continue until 2020. “This will enable CERN to assess the vulnerability of the electronic equipment to radiation and hence avoid unplanned shutdowns caused by outages,” Sylvain Girard explains.

These optical fibers are also being assessed at the TRIUMF particle accelerator center in Canada for proton therapy. This high-precision medical technique treats ocular melanomas using radiation. The radiation dose deposited on the melanoma must be very precise. “The fiber should make it possible to measure the radiation dose in real-time and stop it once the required value is reached,” the researcher explains. “Without the fiber, doctors can only determine the total dose the patient received at the end of the treatment. They must therefore accumulate three low-dose radiation sessions one after the other to come as close as possible to the total target dose.”

Surviving space

While the fibers used in dosimetry must be sensitive to radiation for measurement purposes, others must be highly resistant. This is the case for fibers used in space. Satellites are exposed to space radiation, yet the gyroscopes they use to position themselves rely on optical fiber amplifiers. iXblue and the Hubert Curien laboratory therefore partnered to develop hydrogen-loaded or cerium-doped optical fibers. Two patents have been filed for these fiber amplifiers, and their level of resistance has made them the reference in optical fibers for the space sector.

The same issue of resistance to radiation exists in the nuclear industry, where it is important to measure the temperature and mechanical stress in the core of nuclear reactors. “These environments are exposed to doses of a million Grays. For comparison purposes, a lethal dose for humans is 5 Grays,” Sylvain Girard explains. The optical fiber sensors must therefore be extremely resistant. Once again, the joint research conducted by iXblue and the Hubert Curien laboratory led to two patents for new fibers that meet the needs of manufacturers like Orano (formerly AREVA). These fibers will also be deployed in the fusion reactor project, ITER.

All this research will continue at the new LabH6, which will facilitate the industrial application of the research conducted by iXblue and the Hubert Curien laboratory. The stakes are high, as the uses for optical fibers beyond telecommunications continue to increase. While space and nuclear environments may seem to be niche sectors, the optical fibers developed for these applications could also be used in other contexts. “We are currently working on fibers that are resistant to high temperatures for use in autonomous cars,” says Thierry Robin. “These products are indirectly derived from developments made for radiation-resistant fibers,” he adds. After leaving the telecommunications sector and large-volume production 13 years ago, iXblue could soon return to its origins.

A word from the company: Why partner with an academic institute like the Hubert Curien laboratory?

We knew very early on that we wanted an open approach and exchanges with scientists. Our partnership with the Hubert Curien laboratory allowed us to progress within a virtuous relationship. In an area where competitors maintain a culture of secrecy, we inform the researchers we work with of the exact composition of the fibers. We even produce special fibers for them that are only used for the scientific purposes of testing specific compositions. We want to enable our academic partners to conduct their research by giving them all the elements they need to make advances in the field. This spirit is what has allowed us to create unique products for the space and nuclear sectors.

[1] The Hubert Curien Laboratory is a joint research unit of CNRS/Université Jean Monnet/Institut d’Optique Graduate School, where Télécom Saint-Étienne conducts much of its research.