
CUBAIR: a prototype for purifying indoor air

Improving indoor air quality? That’s what the CUBAIR project aims to do. By developing a new treatment system, researchers have managed to significantly reduce concentrations of fine particles and nitrogen oxides.

 

An important reminder: indoor air is often more polluted than outdoor air. In addition to the automobile exhaust and industrial pollution that enter our homes and offices through the windows, molds and pollutants also come from building materials or cleaning products. What can we do to make the air we breathe inside our homes and offices healthier? That is the big question for researchers working on the CUBAIR project funded by ADEME.

For four years, a group of researchers from Cerema, IMT Atlantique and LaTep (a laboratory of the Université de Pau et des Pays de l’Adour) has been developing a prototype for an air purification system. The air is cleaned through a 3-step process. First, the air taken into the system is filtered by activated carbons with different characteristics. These materials are able to capture organic compounds present in the air, such as pesticides. The air then goes through a more traditional filtering stage to eliminate fine particles. The third stage is photocatalysis: when exposed to ultraviolet light, titanium dioxide reacts with some of the pollutants that remain in the air.

Last year, this prototype was tested at the Human Resource Development Centre in Paris. The goal was to study how effective it was in real conditions throughout an entire year. The device’s performance was measured for different kinds of pollutants: volatile organic compounds, fine particles, mold, etc. The results were especially promising for nitrogen oxides — particularly nitrogen dioxide, a major air pollutant — since the treatment system reduces their concentration by 60% in the treated air. Positive results were also observed for fine particles, with the concentration dropping by 75% for particles with diameters less than 1 micron.

The only drawbacks: volatile organic compounds are not eliminated as effectively, and the system tends to heat up during use, which leads to extra air conditioning costs in summer. The researchers noted, however, that this can be an advantage in cooler weather and that this inconvenience should be weighed against the significantly improved air quality in a room.

Overall, the CUBAIR project offers good prospects for breathing healthier air in our future buildings. Figures published by the World Health Organization in 2018 serve as a reminder that air pollution causes 7 million premature deaths worldwide every year. This pollution also represents an annual cost of approximately €20 billion in France. Combating this pollution is therefore a major health, environmental and economic issue.

Also read on I’MTech:


Renewal of the Cyber CNI Chair: Partners Signed at FIC 2019

After three years of operation, the Cybersecurity of Critical Infrastructure Chair (Cyber CNI) is being renewed for three more years thanks to the commitment of its academic and industrial partners. The IMT chair, led by IMT Atlantique, benefits from partnerships with the Télécom ParisTech and Télécom SudParis schools and from the support of the Brittany region — a region at the cutting edge of technology — within the framework of the Cyber Center of Excellence.

In the context of a sponsor partnership led by Fondation Mines-Télécom, five industrial partners have committed to this new period: AIRBUS, AMOSSYS, BNP Paribas, EDF and Nokia Bell Labs. The official signing to renew the Chair took place at FIC (Forum International de la Cybersécurité) in Lille on 22 January 2019.

 

The Cyber CNI Chair was founded in January 2016. Since then, it has led research and contributed to training in its area of expertise: the cybersecurity of critical infrastructures (energy networks, digital infrastructures, industrial processes, water production plants, financial systems, etc.). This recent scientific discipline is rapidly becoming an essential topic for many companies and organizations. The Chair’s first phase made significant headway in improving the security and resilience of infrastructures, particularly in detecting malicious behavior using machine learning techniques, 3D visualization of security events and the diagnosis of accidental or malicious causes of security incidents.

While the main theme remains the cybersecurity of critical infrastructures, in its second phase, the Cyber CNI Chair will expand its field of expertise and research: reinforcing efforts aimed at applying artificial intelligence (AI) in the area of cybersecurity, sharing and pooling information on threats (Threat Intelligence), industrial applications of connected objects (Industrial IoT) and the blockchain are all included in the Chair’s new research topics.

New theses will contribute to these objectives, in addition to those already underway. At the same time, phase 2 of the Cyber CNI Chair will increase communication efforts and promotion events to highlight the work achieved.

“The Cyber CNI Chair, led by IMT Atlantique, is historically one of the first to have addressed cybersecurity with such a high level of skills and innovation in the complex area of critical infrastructures. The remarkable work carried out in recent years will continue in the second phase through the renewed commitment of all the academic and industrial partners, sponsors, students and PhD students on new vital topics. The Chair’s renewal is therefore critically important and is justified by new lines of research, growing challenges and the major expectations of contributors, whose needs are increasingly acute,” says Serge Maurice (Airbus), Chairman of the Cyber CNI Chair Steering Committee.

 


When AI helps predict a patient’s care pathway

Researchers at Mines Saint-Étienne are using process mining tools to attempt to describe typical care pathways for patients with a given disease. These models can be used to help doctors predict the next steps for treatment or how a disease will progress.

 

Will doctors soon be able to anticipate patient complications arising from a disease? Will they be able to determine an entire care pathway in advance for patients with a specific disease? These are the goals of Vincent Augusto and his team at Mines Saint-Étienne. “Based on a patient’s treatment records, their condition at a given moment, and care pathways of similar patients, we’re trying to predict what the next steps will be for the patients,” says Hugo De Oliveira, a PhD student in Health Systems Engineering whose CIFRE thesis is funded by HEVA, a company based in Lyon.

Anticipating how a disease will progress and treatment steps helps limit risks to which the patient is exposed. For people with diabetes — the example used by the researchers in their work — the process is based on detecting weak signals that are precursors of complications as early as possible. For a given patient, the analysis would focus on several years of treatment records and a comparison with other diabetic patients. This would make it possible to determine the patient’s risk of developing renal failure or requiring an amputation related to diabetes.

In order to predict these progressions, the researchers do not rely on personal medical data, such as X-rays or biological analyses. They use medico-administrative data from the national health data system (SNDS). “In 2006, activity-based pricing was put into place,” notes Hugo De Oliveira. With this shift in the funding principle for healthcare institutions, a large database was created to provide hospitals with the information needed for the reimbursement of treatment. “It’s a very useful database for us, because each line collects information about a patient’s stay: age, sex, care received, primary diagnosis, associated pathologies, etc.,” says the young researcher.
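
To make this more concrete, here is a sketch of what one such stay record might look like. It is a hypothetical, simplified structure loosely based on the fields listed in the quote above; the real SNDS schema is far richer and is not reproduced here.

```python
# Hypothetical, simplified shape of one anonymized hospital-stay record,
# loosely based on the fields listed above. The real SNDS schema is far
# richer and is not reproduced here.
stay = {
    "patient_id": "anon-000123",                 # pseudonymized identifier
    "age": 64,
    "sex": "F",
    "primary_diagnosis": "E11",                  # ICD-10 code for type 2 diabetes
    "associated_pathologies": ["I10", "N18"],    # hypertension, chronic kidney disease
    "care_received": ["blood test", "endocrinology consultation"],
    "admission_date": "2017-03-02",
    "discharge_date": "2017-03-05",
}
```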

An entire pathway in one graph

Vincent Augusto’s team is developing algorithms that analyze these large volumes of data. Patients are sorted into groups based on similar criteria. Different care pathway categories can then be established, each of which groups together several thousand similar pathways (similar patients, identical complications, etc.). In one category — diabetic patients who have undergone amputation, for example — the algorithm analyzes all of the steps for the entire group of patients in order to deduce which ones are most characteristic. A graph is produced to represent the typical pathway for this category of patient. It may then be used as a reference to find out whether a patient in the early stages of the disease is following similar steps, and to determine the probability that he or she belongs to this category.

This graph represents the care pathway for patients monitored over an 8-year period who have had a cardiac defibrillator implanted. The part before the implantation can be used to establish statistics for the steps preceding the procedure. The part after the implantation provides information about the future of patients following the implantation.

 

In this way, the researchers are working on developing longitudinal graphs: each treatment step represents a point on the graph, and the whole graph can be read chronologically: “Doctors can read the graph very easily and determine where the patient is situated in the sequence of steps that characterize his/her pathway,” explains Hugo De Oliveira. The difficulty with this type of data representation comes from its comprehensiveness: “We have to find a way to fit an entire patient pathway into a single line,” says the PhD student. In order to do so, the team chose to use process mining, a data mining and knowledge extraction tool. Machine learning is another such tool.
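
The article does not detail the team’s algorithms, but the kind of structure basic process mining produces can be sketched in a few lines of Python. The example below builds a “directly-follows” graph from invented care-event sequences, keeps the most frequent transitions as the typical pathway for a category, and scores how closely a new patient’s history matches it. The event names, the support threshold and the scoring rule are illustrative assumptions, not the method used at Mines Saint-Étienne.

```python
from collections import Counter

# Hypothetical care pathways for one patient category (e.g. diabetic patients).
# Each list is one patient's chronological sequence of care events.
pathways = [
    ["GP visit", "blood test", "endocrinologist", "hospital stay"],
    ["GP visit", "blood test", "endocrinologist", "renal check"],
    ["GP visit", "endocrinologist", "hospital stay"],
    ["GP visit", "blood test", "renal check", "hospital stay"],
]

def directly_follows(pathways):
    """Count how often each care event is directly followed by another."""
    transitions = Counter()
    for steps in pathways:
        transitions.update(zip(steps, steps[1:]))
    return transitions

def typical_graph(pathways, min_support=2):
    """Keep only transitions seen at least `min_support` times: a rough
    sketch of the 'typical pathway' graph for this category."""
    return {edge: n for edge, n in directly_follows(pathways).items()
            if n >= min_support}

def match_score(patient_steps, graph):
    """Fraction of a new patient's transitions found in the typical graph,
    used here as a naive indicator of how closely the patient follows it."""
    steps = list(zip(patient_steps, patient_steps[1:]))
    if not steps:
        return 0.0
    return sum(1 for t in steps if t in graph) / len(steps)

graph = typical_graph(pathways)
print(graph)                                                              # typical transitions and their counts
print(match_score(["GP visit", "blood test", "endocrinologist"], graph))  # 1.0 for this example
```

Real process mining tools produce much richer models, with timestamps, durations and statistical filtering, but the principle of counting and filtering transitions between events is the same.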

Process mining helps make care pathway descriptions more effective and easier to read, but it also provides another benefit: it is not a ‘black box’. Black-box behavior is typical of neural-network-type algorithms, which are effective at processing data but make it impossible to understand the processes that led to their results. Unlike these algorithms, the process mining algorithms used to predict treatment pathways are transparent. “When a patient is characterized by a type of graph, we’re able to understand why by looking at past treatment steps, and studying each graph for the patient’s categories to understand how the algorithm evaluated the pathway,” says Hugo De Oliveira.

Making artificial intelligence applications more transparent is one of the issues raised by the working group, led by Cédric Villani, that produced the French national report on AI. The project is also in keeping with the objectives set by the mathematician and member of the French parliament to facilitate AI experimentation, for healthcare applications in particular. “Our research directly benefits from policies for opening access to health data,” says the PhD student. This access will continue to open up for the researchers: later this year they will be able to use the database of the national health insurance cross-scheme information system (SNIIRAM), whose 1.2 billion healthcare forms will be used to improve the algorithms and better identify patient treatment pathways.

 

Canaries were once used in coal mines to forewarn of impending firedamp explosions. This story has inspired a cyberdefense tool: stack canaries.

Stack canaries: overestimating software protection

Android, Windows, Mac, Linux… All operating systems contain stack canaries — one of the most common forms of software protection. These safeguards that protect computer systems from intrusions are perceived as very effective. Yet recent research carried out by EURECOM and the Technical University of Munich shows that most stack canaries contain vulnerabilities. The results, obtained through a project led by the German-French Academy for the Industry of the Future, highlight the fragility of computer systems in the context of increasingly digitized organizations.

 

During the 19th century, canaries were used in coal mines to forewarn of impending firedamp explosions. The flammable, odorless gas released by the miners’ activities caused the birds either to lose consciousness or to die. This alerted the workers that something was wrong. Several decades later, in the early 2000s, researchers in cybersecurity were inspired by the story of canaries in coal mines. They invented a simple protection system for detecting software corruption—calling it “stack canary”. Since then, it has become one of the most common protection systems in the software we use and is now present in almost all operating systems. But is it really effective?

Perhaps it seems strange to be asking this question some 20 years after the first stack canaries were used in computer products. “The community assumed that the protection worked,” explains Aurélien Francillon, a researcher in cybersecurity at EURECOM. “There was some research revealing potential vulnerabilities of stack canaries, but without any in-depth investigation into the issue.” Researchers from EURECOM and the Technical University of Munich (TUM) have therefore partnered to remedy this lack of knowledge. They assessed the vulnerabilities of stack canaries in 17 different combinations of 6 operating systems, to detect potential defects and determine good practices for remedying the situation. Linux, Windows 10, macOS Sierra and Android 7.0 were all included in the study.

“We showed that, in the majority of operating systems, these countermeasures for detecting defects are not very secure,” Aurélien Francillon explains. 8 of the 17 tested combinations are classified by the researchers as using an ineffective stack canary (see table below). 6 others can be improved, and the last 3 are beyond reproach. This study of the vulnerabilities of stack canaries, carried out in the context of the Secure Connected Industry of the Future (SeCIF) project, part of the German-French Academy for the Industry of the Future, is linked to the growing digital component of organizations. Industries and companies are increasingly reliant on connected objects and IT processes. Defects in the protection mechanisms of operating systems can therefore endanger companies’ overall security, whether through access to confidential data or the takeover of industrial machinery.

Out of the 17 combinations tested, only Android 7.0 “Nougat”, macOS 10.12.1 “Sierra” and OpenBSD 6.0 (Unix) had completely secure stack canaries. A red cross means that it is possible to bypass the stack canary in the given combination; an orange cross means that stack canary security can be improved. The table columns correspond to different memory types from a programming logic standpoint.


The canary in the memory

To understand the impacts of the defects revealed by this research, it is important to first understand why stack canaries are used and how they work. Many attacks that occur are aimed at changing values in a program that are not meant to be changed. The values are stored in memory space. “Let’s say this space has a capacity of 20 bytes,” says Julian Kirsch, a cybersecurity researcher at TUM and co-author of this study. “I would store my name and height on 20 of these bytes. Then, on another space located just behind it, I would store my bank account number. If a hacker wants to corrupt this information, he will add values, for example by adding a number to the value for my height. By doing this, my height data will overflow from the 20-byte space to the space where my bank account number is stored, and the information will no longer be correct. When the program needs to read and use this data, things will go wrong.”

In more complex cases for operating systems, the consequences include more critical errors than the wrong bank account number. To determine whether the information stored in the memory was altered, a known numerical value can be inserted between the storage spaces, as a type of memory buffer. If a hacker adds information, like in Julian Kirsch’s example in which the height was changed, everything will shift, and the value indicated in the memory buffer will change. The stack canary is simply a memory buffer. If the stack canary’s security is compromised, the hacker can modify it and then hide it by resetting it to the initial value.
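
This mechanism can be illustrated with a toy model. The sketch below is purely pedagogical: real stack canaries are placed by the compiler between a function’s local buffers and its return address in native code, not in Python. Here, a fixed-size “buffer” is followed in simulated memory by a canary value and then by the sensitive data; an overflowing write clobbers the canary, and the corruption is caught before the sensitive data is trusted.

```python
import secrets

CANARY = secrets.randbits(32)   # secret reference value chosen at program start

# Toy "memory": a 20-byte buffer, then the canary, then sensitive data.
memory = {
    "buffer": bytearray(20),
    "canary": CANARY,
    "account_number": "FR76-3000-...",
}

def unsafe_write(data: bytes):
    """Write into the buffer without bounds checking (the bug an attacker exploits).
    Bytes beyond the 20th overflow into the neighbouring slot, i.e. the canary."""
    memory["buffer"][:20] = data[:20].ljust(20, b"\x00")
    if len(data) > 20:
        # The overflowing bytes clobber the canary, as they would in real memory.
        memory["canary"] = int.from_bytes(data[20:24].ljust(4, b"\x00"), "little")

def check_canary():
    """Called before the sensitive data is trusted: abort if the canary changed."""
    if memory["canary"] != CANARY:
        raise RuntimeError("stack smashing detected: canary value changed")

unsafe_write(b"Julian Kirsch, 1.80m")             # exactly 20 bytes: fits, canary intact
check_canary()                                    # passes silently

unsafe_write(b"Julian Kirsch, 1.80m" + b"XXXX")   # 24 bytes: overflows onto the canary
try:
    check_canary()
except RuntimeError as err:
    print(err)                                    # the corruption is detected
```

As described above, an attacker who wants to remain unnoticed would also have to restore the canary to its secret initial value.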

To make the hacker’s work more difficult, the value of most stack canaries is changed regularly. A copy of the new value is stored in another memory space and both values, the real one and the reference one, are compared to ensure the integrity of the software. In their work, the researchers showed that the vulnerabilities of stack canaries are primarily linked to the place where this reference value is stored. “Sometimes it is stored in a memory space located right next to the stack canary,” Julian Kirsch explains. The hacker therefore does not need to access another part of the system and can change both values at the same time. “This is a defect we see in Linux, for example, which really surprised us because this operating system is widely used,” the TUM researcher explains.

How can such commonly used protection systems be so vulnerable on operating systems like Linux and Windows? First of all, Aurélien Francillon reminds us that stack canaries are not the only countermeasures that exist for operating systems. “In general, these are not the only countermeasures used, but stack canaries still represent significant hurdles that hackers must overcome to gain control of the system,” the EURECOM researcher explains. Their vulnerability therefore does not threaten the overall security of operating systems, but it does mean one less barrier standing in hackers’ way.

The second, less technical reason for this permissiveness regarding stack canaries is related to developers’ choices. “They do not want to increase the security of these countermeasures, because it would decrease performance,” Julian Kirsch explains. For software publishers, security is a weaker selling point than performance. Greater security implies allocating more computing resources to tasks that do not directly respond to the user’s requests. Still, customers rarely appreciate computer system intrusions. Given organizations’ growing concerns about cybersecurity, we can hope that the software they choose will better integrate this aspect. Security could then become a serious argument in the software solution market.


BioDigital, a new technology to combat identity spoofing

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot Institute (TSN), to which Télécom SudParis belongs. The original version of this article was published in French on the Télécom SudParis website.

[divider style=”normal” top=”20″ bottom=”20″]

Following an 18-month collaboration agreement, Télécom SudParis (a member of the Télécom & Société Numérique Carnot Institute) and IDEMIA, the global leader in augmented identity, have finalized the design of a contactless biometric reader based on a patent filed by two Télécom SudParis researchers. The technology transfer to IDEMIA has just been completed.

 

Fingerprint spoof detection

The technology comprises a next-generation biometric fingerprint scanner called BioDigital. It is an effective tool for combating identity spoofing and also provides a solution to difficulties related to the very nature of biometric data through improved recognition of damaged fingerprint surfaces. “The quality of the reconstructed image of the internal fingerprint is what makes our technology truly original,” says Bernadette Dorizzi, Dean of Research at Télécom SudParis.

Télécom SudParis and IDEMIA worked in close collaboration. The group provided an assessment algorithm and compiled a database for its evaluation, which made it possible to demonstrate that BioDigital is able to provide safer and more effective fingerprint matching by also detecting spoofed fingerprints, with a success rate of nearly 100%.

Subcutaneous fingerprint and sweat pore network recognition

This contactless technology recognizes not only the surface fingerprint, but also the subcutaneous print and the network of sweat pores. It is based on optical coherence tomography, which produces 3D images using light “echoes”. This allows BioDigital to capture fingerprints without direct contact with the reader. Along with this innovation, the system also provides exceptional image reconstruction quality. “By fusing phase and intensity images, we’ve succeeded in obtaining as natural an image as possible,” says Yaneck Gottesman, research professor at Télécom SudParis.

“For a certain number of crucial applications, such as the protection of critical infrastructures, spoofing attacks are a real issue, and it’s a race between hackers and technology developers like IDEMIA. Once this technology is put into production and integrated into our products, it has the potential to put us another step ahead,” adds Jean-Christophe Fondeur, Executive Vice-President for Research & Development at IDEMIA.

 

[divider style=”normal” top=”20″ bottom=”20″]

A guarantee of excellence in partnership-based research since 2006

The Télécom & Société Numérique Carnot Institute (TSN) has been partnering with companies since 2006 to research developments in digital innovations. With over 1,700 researchers and 50 technology platforms, it offers cutting-edge research aimed at meeting the complex technological challenges posed by the digital, energy and industrial transitions currently underway in the French manufacturing industry. It focuses on the following topics: industry of the future, connected objects and networks, sustainable cities, transport, health and safety.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (LIX and CMAP laboratories), Strate École de Design and Femto Engineering.

[divider style=”normal” top=”20″ bottom=”20″]


Fine particles: how can their impact on health be better assessed?

In order to assess the danger posed by fine particles in ambient air, it is crucial to do more than simply take regulatory measurements of their mass in the air. The diversity of their chemical composition means that, for an equal mass, different toxicological impacts are possible. Chemists at IMT Lille Douai are working to understand the physicochemical properties of the fine-particle components responsible for their adverse biological effects on health. They are developing a new health-effect indicator based on measuring the oxidizing potential of these pollutants, in order to better identify those which pose risks to our health.

 

The smaller they are, the greater their danger. That is the rule of thumb for summing up the toxicity of the various types of particles present in the atmosphere. It is based on the ease with which the smallest particles penetrate deep into our lungs and get trapped there. While the size of particles clearly plays a major role in how dangerous they are, the impact of their chemical composition must not be underestimated. For an equal mass of fine particles in the air, those we breathe in Paris are not the same as the ones we breathe in Dunkirk or Grenoble, due to the different nature of the sources which produce them. And even within the same city, the particles we inhale vary greatly depending on where we are located in relation to a road or a factory.

“Fine particles are very diverse: they contain hundreds, or even thousands, of chemical compounds,” say Laurent Alleman and Esperanza Perdrix, researchers in atmospheric pollution in the department of atmospheric sciences and environmental engineering at IMT Lille Douai. Carboxylic acids and polycyclic aromatic hydrocarbons are just two of the many examples of molecules found in particles in higher or lower proportions. A great number of metals and metalloids can be added to this organic cocktail: copper, iron, arsenic, etc., as well as black carbon. The final composition of a fine particle therefore depends on its proximity to sources of each of these ingredients. Copper and antimony, for example, are commonly found in particles near roads, produced by cars when braking, while nickel and lanthanum are typical of fine particles produced by the petrochemical industry.

Read more on I’MTech: What are fine particles?

Today, only the mass concentration of particles below certain sizes is taken into account when establishing thresholds for warning the population. For Laurent Alleman and Esperanza Perdrix, it is important to go beyond mass and size in order to better understand and prevent the health impacts of particles based on their chemical properties. Each molecule, each chemical species present in a particle has a different toxicity. “When they penetrate our lungs, fine particles break down and release these components,” explains Laurent Alleman. “Depending on their physicochemical properties, these exogenous agents will have a more or less aggressive effect on the cells that make up our respiratory system.”

Measuring particles’ oxidizing potential

This aggression mainly takes the form of oxidation chemical reactions in cells: this is oxidative stress. This effect induces deterioration of biological tissue and inflammation, which can lead to different pathological conditions, whether in the respiratory system — asthma, chronic obstructive pulmonary diseases — or throughout the body. Since the chemical components and molecules produced by these stressed cells enter the bloodstream, they also create oxidative stress elsewhere in the body. “That’s why fine particles are also responsible for cardiovascular diseases such as cardiac rhythm disorders,” says Esperanza Perdrix. When it becomes too severe and chronic, oxidative stress can have mutagenic effects by altering DNA and can promote cancer.

For researchers, the scientific challenge is therefore to better assess a fine particle’s ability to cause oxidative stress. At IMT Lille Douai, the approach is to measure this ability in test tubes by determining the resulting production of oxidizing molecules for a specific type of particle. “We don’t directly measure the oxidative stress produced at the cellular level, but rather the fine particle’s potential to cause this stress,” explains Laurent Alleman. As such, the method is less expensive and quicker than a study in a biological environment. Most importantly, “Unlike tests on biological cells, measuring particles’ oxidizing potential is quick and can be automated, while giving us a good enough indication of the oxidative stress that would be produced in the body,” says Esperanza Perdrix. A winning combination, which would make it possible to make oxidizing potential a reference base for the analysis and ongoing, large-scale prevention of the toxicity of fine particles.

To measure the toxicity of fine particles, researchers are finding alternatives to biological analysis.

 

This approach has already allowed the IMT Lille Douai team to measure the harmfulness of metals. They have found that copper and iron are the chemical elements with the highest oxidizing potential. “Iron reacts with the hydrogen peroxide in the body to produce what we call free radicals: highly reactive chemical species with short lifespans, but very strong oxidizing potential,” explains Laurent Alleman. If the iron provided by the fine particles is not counterbalanced by an antioxidant — such as vitamin C — the radicals formed can break molecular bonds and damage cells.

Researchers caution, however, that “measuring oxidizing potential is not a unified method; it’s still in the developmental stages.” It is based on the principle of bringing the component whose oxidizing potential is to be assessed together with an antioxidant, and then measuring the quantity or rate of antioxidant consumed. In order for oxidizing potential to become a reference method, it still has to become more widely adopted by the scientific community, demonstrate its ability to accurately reflect the biological oxidative stress produced in vivo, and be standardized.
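
As a purely illustrative example (the article does not specify the assay, the antioxidant or the units used at IMT Lille Douai), the rate-based version of this principle boils down to a short calculation: fit the antioxidant depletion over time, take the slope as the consumption rate, and normalize it by the particle mass in the extract.

```python
import numpy as np

def oxidizing_potential(times_min, antioxidant_umol, particle_mass_ug):
    """Estimate an oxidizing potential as the antioxidant consumption rate
    (slope of a linear fit, in µmol/min) normalized by particle mass (µg).
    Illustrative only: the assay, antioxidant and units are assumptions."""
    slope, _ = np.polyfit(times_min, antioxidant_umol, 1)   # µmol per minute
    consumption_rate = -slope                               # depletion gives a negative slope
    return consumption_rate / particle_mass_ug              # µmol/min/µg

# Hypothetical measurements: remaining antioxidant at 0, 10, 20 and 30 minutes
# after mixing with an extract containing 10 µg of fine particles.
times = [0, 10, 20, 30]
remaining = [0.100, 0.082, 0.066, 0.049]

print(f"OP ≈ {oxidizing_potential(times, remaining, particle_mass_ug=10):.1e} µmol/min/µg")
```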

So for now, the mass concentration of fine particles remains the preferred method. Nevertheless, a growing number of studies are being carried out with the aim of taking account of chemical composition and health aspects. This is reflected in the many disciplines involved in this research. “Toxicological issues bring together a wide variety of fields such as chemistry, physics, biology, medicine, bioinformatics and risk analysis, to name just a few,” says Esperanza Perdrix, who also cites communities other than those with scientific expertise. “This topic extends beyond our disciplinary fields and must also involve environmental groups, citizens, elected officials and others,” she adds. 

Research is ongoing at the international level as well, in particular through MISTRALS, a large-scale meta-program led by CNRS and launched in 2010 for a ten-year period. One of its programs, called ChArMEx, aims to study pollution phenomena in the Mediterranean basin. “Through this program, we’re developing international collaboration to improve methods for measuring oxidizing potential,” explains Laurent Alleman. “We plan to develop an automated tool for measuring oxidizing potential over the next few years, by working together with a number of other countries, especially those in the Mediterranean region such as Crete, Lebanon, Egypt and Turkey.”

 

Also read on I’MTech:

[divider style=”normal” top=”20″ bottom=”20″]

A MOOC to learn all about air pollution

On October 8, IMT launched a MOOC dedicated to air quality, drawing on the expertise of IMT Lille Douai. It presents the main air pollutants and their origin, whether man-made or natural. The MOOC will also provide an overview of the health-related, environmental and economic impacts of air pollution.

[divider style=”normal” top=”20″ bottom=”20″]


Three new winners of the IMT-Académie des Sciences awards

The IMT-Académie des Sciences Awards, established in 2017, honor outstanding European scientific contributions to the fields of digital technology, energy and the environment. The 2018 formal awards ceremony was held on Tuesday, November 20, under the dome of the Institut de France. The three award winners (two “Grand Prix” winners and one “Young Scientist” winner) answered our questions.

 

Awards Ceremony at the Institut de France, 20 November 2018

 

Two winners share the IMT-Académie des Sciences Grand Prix

For this second edition, the jury selected two winners for the IMT-Académie des Sciences Grand Prix: Ange Nzihou, director of the RAPSODEE-CNRS laboratory at IMT Mines Albi, and Pierre Comon, CNRS research director at Grenoble INP.

– Ange NZIHOU, 52, is a pioneer in waste recovery. His research has led to the development of new energy production mechanisms using industrial, urban and biomass waste. Ange Nzihou has greatly contributed to creating an international scientific community on this subject.

– Pierre COMON first made his mark in the field of signal processing. He significantly contributed to laying the foundations of signal source separation techniques. He then focused his work on the factorization of mathematical arrays known as tensors, which has applications in imaging and in chemistry for detecting toxic molecules.

The IMT-Académie des Sciences Young Scientist Prize

The IMT-Académie des Sciences Young Scientist Prize was awarded to Ioan-Mihai Miron, research director at CNRS-CEA.

Ioan-Mihai MIRON, 38, is a global expert in spintronics—a field that seeks to study and control the electromagnetic properties of electrons. His work has applications in the development of new technology for data storage that is both more powerful and more energy efficient.

 

Élise Vareilles

IMT Mines Albi | Configuration, Decision Support Systems, Constraint Programming, Knowledge Management

 

Dr Élise Vareilles has been an Associate Professor at IMT Mines Albi, France, since 2005. She received her PhD in 2005 and won the prize for the best PhD thesis at INP Toulouse that same year. She received her accreditation to supervise research (HdR) in the field of Industrial Engineering in 2015. She works on the development of interactive, knowledge-based configuration, design and decision-aiding tools, and contributes to the development and improvement of the CoFiADe software. Since January 2016, she has led the ORKID (Organization, Risk, Knowledge in Design) research team in the Industrial Engineering Center of IMT Mines Albi.



Ioan-Mihai Miron: Magnetism and Memory

Ioan-Mihai Miron’s research in spintronics focuses on new magnetic systems for storing information. The research carried out at the Spintec laboratory in Grenoble is still young, having begun in 2011, but it already shows major potential for addressing the current limits of our computers’ memory technology. It also offers a solution to the problems that have so far prevented the industrial development of magnetic memories. Ioan-Mihai Miron received the 2018 IMT-Académie des Sciences Young Scientist Award for this groundbreaking and promising research.

 

Ioan-Mihai Miron’s research is a matter of memory… and a little architecture too. When presenting his work on the design of new nanostructures for storing information, the researcher from Spintec* uses a three-level pyramid diagram. The base represents broad and robust mass memory. Its large size enables it to store large amounts of information, but it is difficult to access. The second level is the central memory, which is not as big but faster to access. It holds the information required to run programs. Finally, the top of the pyramid is cache memory, which is much smaller but more easily accessible. “The processor only works with this cache memory,” the researcher explains. “The rest of the computer system is there to retrieve information lower down in the pyramid as fast as possible and bring it up to the top.”

Of course, computers do not actually contain pyramids. In microelectronics, this memory architecture takes the form of thousands of microscopic transistors that are responsible for the central and cache memories. They work as switches, storing information in binary format by either letting current flow or blocking it. With the commercial demand for miniaturization, transistors have gradually reached their limit. “The smaller the transistor, the greater the stand-by consumption,” Ioan-Mihai Miron explains. This is why the goal is now for the types of memory located at the top of the pyramid to rely on new technologies that store information at the level of electrons. Sending a current into a magnetic material makes it possible to alter its magnetization at certain points. “The material’s electrical resistance will be different based on this magnetization, meaning information is being stored,” Ioan-Mihai Miron explains. In simpler terms, a high electrical resistance corresponds to one value and a low resistance to another, which forms a binary system.
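
That last sentence can be made concrete with a toy read-out: given a resistance measurement for each magnetic cell, a simple threshold turns a row of cells into bits. The resistance values and the threshold below are invented for illustration; real MRAM read circuits rely on analog sense amplifiers and much smaller margins.

```python
# Toy read-out of a row of magnetic memory cells: each cell's electrical
# resistance (in ohms) depends on its magnetization state.
# Resistance values and threshold are invented for illustration.
resistances = [2.1e3, 3.4e3, 3.5e3, 2.0e3, 3.6e3, 2.2e3, 2.1e3, 3.3e3]
THRESHOLD = 2.8e3   # midway between the low- and high-resistance states

bits = [1 if r > THRESHOLD else 0 for r in resistances]
byte = int("".join(map(str, bits)), 2)

print(bits)        # [0, 1, 1, 0, 1, 0, 0, 1]
print(hex(byte))   # 0x69
```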

In practical terms, information is written in these magnetic materials by sending two perpendicular currents, one from above and one from below the material. The point of intersection is where the magnetization is modified. While this principle is not new, it is still not used for cache memory in commercial products. Pairing magnetic technologies with this type of data storage has remained a major industrial challenge for almost 20 years. “Memory capacities are still too low in comparison with transistors, and miniaturizing the system is complicated,” the researcher explains. These two disadvantages are not offset by the energy savings that the technology offers.

To compensate for these limitations, the scientific community has developed a simplified geometry for these magnetic architectures. “Rather than intersecting two currents, a new approach has been to send only a single linear path of current into the material,” Ioan-Mihai Miron explains. “But while this technique solved the miniaturization and memory capacity problems, it created others.” In particular, writing the information involves applying a strong electric current that can damage the element where the information is stored. “As a result, the writing speed is not sufficient. At 5 nanoseconds, it is slower than the latest generations of transistor-based memory technology.”

Electrical geometry

In the early 2010s, Ioan-Mihai Miron’s research opened major prospects for solving all of these problems. By slightly modifying the geometry of the magnetic structures, he demonstrated the possibility of writing in under a nanosecond. And at the same size, the memory capacity is greater. The principle is based on the use of a current sent in a plane parallel to the layers of the magnetized material, whereas previously the current had been perpendicular to them. This difference makes the change in magnetization faster and more precise. The technology developed by Ioan-Mihai Miron offers still more benefits: less wear on the elements and the elimination of writing errors. It is called SOT-MRAM, for Spin-Orbit Torque Magnetic Random Access Memory. This technical name reflects the complexity of the effects at work in the layers of electrons of the magnetic materials exposed to the interactions of the electrical currents.

The nanostructures developed by Ioan-Mihai Miron and his team are opening new prospects for magnetic memories.

 

The progressive developments of magnetic memories may appear minimal. At first glance, a transition from two perpendicular currents to one linear current to save a few nanoseconds seems to be only a minor advance. However, the resulting changes in performance offer considerable opportunities for industrial actors. “SOT-MRAM has only been in existence since 2011, yet all the major microelectronics businesses already have R&D programs on this technology that is fresh out of the laboratory,” says Ioan-Mihai Miron. SOT-MRAM is perceived as the technology that is able to bring magnetic technologies to the cache memory playing field.

The winner of the 2018 IMT-Académie des Sciences Young Scientist award seeks to remain realistic regarding the industrial sector’s expectations for SOT-MRAM. “Transistor-based memories are continuing to improve at the same time and have recently made significant progress,” he notes. Not to mention that these technologies have been mature for decades, whereas SOT-MRAM has not yet passed the ten-year milestone of research and refinement. According to Ioan-Mihai Miron, this technology should not be seen as a total break with previous technology, but as an alternative that is gradually gaining ground, albeit rapidly and with significant competitive opportunities.

But there are still steps to be taken to optimize SOT-MRAM and integrate it into our computer products. These steps may take a few years. In the meantime, Ioan-Mihai Miron is continuing his research on memory architectures, while increasingly entrusting SOT-MRAM to those who are best placed to transfer it to society. “I prefer to look elsewhere rather than working to improve this technology. What interests me is discovering new capacities for storing information, and these discoveries happen a bit by chance. I therefore want to try other things to see what happens.”

*Spintec is a joint research unit of CNRS, CEA and Université Grenoble Alpes.

[author title=”Ioan-Mihai Miron: a young expert in memory technology” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/11/mihai.png”]

Ioan-Mihai Miron is a researcher at the Spintec laboratory in Grenoble. His major contribution involves the discovery of the reversal of magnetization caused by spin orbit coupling. This possibility provides significant potential for reducing energy consumption and increasing the reliability of MRAM, a new type of non-volatile memory that is compatible with the rapid development of the latest computing processors. This new memory should eventually come to replace SRAM memories alongside processors.

Ioan-Mihai Miron is considered a world expert, as shown by the numerous citations of his publications (over 3,000 citations in a very short period of time). In 2014 he was awarded an ERC Starting Grant. His research has also led to several patents and contributed to the creation of the company Antaios, which won the Grand Prix in the i-Lab innovative company creation competition in 2016. Fundraising is currently underway, demonstrating the economic and industrial impact of the work carried out by the winner of the 2018 IMT-Académie des Sciences Young Scientist award.[/author]


Pierre Comon: from the brain to space, searching for a single solution

Pierre Comon’s research focuses on a subject that is as simple as it is complex: how to find a single solution to a problem. From environment to health and telecommunications, this researcher in information science at GIPSA-Lab is taking on a wide range of issues. Winner of the IMT-Académie des Sciences 2018 Grand Prix, he juggles mathematical abstraction and the practical, scientific reality in the field.

 

When asked to explain what a tensor is, Pierre Comon gives two answers. The first is academic, factual, and rather unappealing despite an attempt at popularization: “it is a mathematical object that is equivalent to a polynomial with several variables.” The second answer reveals a researcher conscious of the abstract nature of his work, passionate about explaining it and experienced at doing so. “If I want to determine the concentration of a molecule in a sample, or the exact position of a satellite in space, I need a single solution to my mathematical problem. I do not want several possible positions of my satellite or several concentration values, I only want one. Tensors allow me to achieve this.”

Tensors are particularly powerful when the number of parameters is not too high. For example, they cannot be used to find the unknown positions of 100 satellites with only 2 antennas. However, when the ratio between the parameters to be determined and the available data samples is balanced, they become a very useful tool. There are many applications for tensors, including telecommunications, the environment and healthcare.

Pierre Comon recently worked on tensor methods for medical imaging at the GIPSA-Lab* in Grenoble. For patients with epilepsy, one of the major problems is determining the source of the seizures in the brain. This not only makes it possible to treat the disease, but also, potentially, to prepare for surgery. “When patients have a disease that is too resistant to treatment, it is sometimes necessary to perform an ablation,” the researcher explains.

Today, these points are localized using invasive methods: probes are introduced into the patient’s skull to record brainwaves, a stage that is particularly difficult for patients. The goal is therefore to find a way to determine the same parameters using non-invasive techniques, such as electroencephalography and magnetoencephalography. Tensor tools are integrated into the algorithms used to process the brain signals recorded through these methods. “We have obtained promising results,” explains Pierre Comon. Although he admits that invasive methods currently remain more efficient, he also points out that they are older. Research on this topic is still young but has already provided reason to hope that treating certain brain diseases could become less burdensome for patients.

An entire world in one pixel

For environmental applications, on the other hand, the results are much closer to application. Over the past decade, Pierre Comon has demonstrated the relevance of using tensors in planetary imaging. In satellite remote sensing, each pixel can cover anywhere from a few square meters to several square kilometers. The elements present in each pixel are therefore very diverse: forests, ice, bodies of water, limestone or granite formations, roads, farm fields, etc. Detecting these different elements can be difficult depending on the resolution. Yet there is a clear benefit in being able to automatically determine the number of elements within one pixel. Is it just a forest? Is there a lake or road running through this forest? What is the rock type?

The tensor approach answers these questions. It makes it possible to break down pixels by indicating the different components they contain and in what proportions. Better still, it can do this “without using a dictionary, in other words, without knowing ahead of time what elements might be in the pixel,” the researcher explains. This possibility stems from an intrinsic property of tensors, which Pierre Comon has brought to light: under certain mathematical conditions, they can be decomposed in only one way. In practice, for satellite imaging, a minimum set of variables is required: the intensity received for each pixel, each wavelength and each angle of incidence must be known. The uniqueness of the tensor decomposition then makes it possible to retrace the exact proportion of the different elements in each image pixel.
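
As a numerical illustration of this uniqueness property (and not of the method actually applied to the Mars data), the sketch below builds a synthetic intensity tensor indexed by pixel, wavelength and angle of incidence from invented “pure element” signatures and abundances, then recovers the rank-one components with a basic alternating-least-squares CP decomposition written with numpy alone. No dictionary of pure spectra is given to the algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def khatri_rao(A, B):
    """Column-wise Kronecker product: row (i, j) of the result is A[i] * B[j]."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def unfold(X, mode):
    """Mode-n unfolding, keeping the remaining axes in their original order."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=200):
    """Plain alternating-least-squares CP decomposition of a 3-way tensor."""
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Synthetic scene: 50 pixels, 3 pure elements (say ice, rock, dust),
# 40 wavelengths and 5 angles of incidence. All values are invented.
abundances = rng.dirichlet(np.ones(3), size=50)   # proportions of each element per pixel
spectra    = rng.random((40, 3))                  # signature of each element per wavelength
angles     = rng.random((5, 3))                   # signature of each element per angle

# Intensity tensor (pixel x wavelength x angle) as a sum of rank-1 contributions.
X = np.einsum('pr,wr,ar->pwa', abundances, spectra, angles)

A, B, C = cp_als(X, rank=3)
X_hat = np.einsum('pr,wr,ar->pwa', A, B, C)
print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

In this synthetic setting the reconstruction error drops to near zero and, up to the usual permutation and scaling ambiguities, the recovered factors correspond to the abundances, spectra and angular signatures used to build the tensor, which is what the uniqueness of the decomposition guarantees under suitable conditions.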

For planet Earth, this approach has limited benefits, since the various elements are already well known. However, it could be particularly helpful in monitoring how forests or water supplies develop. On the other hand, the tensor approach is especially useful for other planets in the solar system. “We have tested our algorithms on images of Mars,” says Pierre Comon. “They helped us to detect different types of ice.” For planets that are still very distant and not as well known, the advantage of this “dictionary free” approach is that it helps bring unknown geological features to light. Whereas the human mind tends to compare what it sees with something it is familiar with, the tensor approach offers a neutral description and can help reveal structures with unknown geochemical properties.

The common theme: a single solution

Throughout his career, Pierre Comon has sought to understand how a single solution can be found for mathematical problems. His first major research in this area began in 1989 and focused on blind source separation in telecommunications. How could the mixed signals from two transmitting antennas be separated without knowing where they were located? “Already at that point, it was a matter of finding a single solution,” the researcher recalls. This research led him to develop techniques for analyzing signals and decomposing them into independent parts to determine the source of each one.
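
The antenna problem can be sketched with tools that grew directly out of this line of work. In the example below, two invented source signals are mixed by an unknown matrix and then separated blindly with scikit-learn’s FastICA (assuming scikit-learn is installed); the signals, the mixing matrix and the noise level are all illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 2000)

# Two independent source signals (what the two transmitters actually emit).
s1 = np.sign(np.sin(2 * np.pi * 13 * t))   # square-wave-like signal
s2 = np.sin(2 * np.pi * 5 * t + 0.5)       # sinusoidal signal
S = np.column_stack([s1, s2])

# Each antenna receives an unknown mixture of the two sources, plus a little noise.
mixing = np.array([[0.7, 0.3],
                   [0.4, 0.6]])
X = S @ mixing.T + 0.01 * rng.standard_normal(S.shape)

# Blind source separation: recover the sources without knowing the mixing matrix.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)

# Correlation between estimated and true sources indicates separation quality.
corr = np.corrcoef(S_est.T, S.T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```

The recovered signals are only defined up to permutation and scaling, which is why the check at the end looks at correlations rather than exact equality.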

The results he proposed in this context during the 1990s resonated widely in both academia and industry. In 1988, he joined Thales, where he developed several patents used to analyze satellite signals. His pioneering article on independent component analysis has been cited by fellow researchers thousands of times and continues to be used by scientists. According to Pierre Comon, this work formed the foundation for his research topic. “My results at the time allowed us to understand the conditions for the uniqueness of a solution, but did not always provide the solution. That required something else.” That “something else” is, in part, tensors, which he has shown to be valuable in finding single solutions.

His projects now focus on increasing the number of practical applications of his research. Beyond the environment, telecommunications and brain imaging, his work also involves chemistry and public health. “One of the ideas I am currently very committed to is that of developing an affordable device for quickly determining the levels of toxic molecules in urine,” he explains. This type of device would quickly reveal polycyclic aromatic hydrocarbon contaminations—a category of harmful compounds found in paints. Here again, Pierre Comon must determine certain parameters in order to identify the concentration of pollutants.

*The GIPSA-Lab is a joint research unit of CNRS, Université Grenoble Alpes and Grenoble INP.

[author title=”Pierre Comon: the mathematics of practical problems” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/11/pierre-comon.jpg”]Pierre Comon is known in the international scientific community for his major contributions to signal processing. He became interested very early on in exploring higher-order statistics for separating sources, establishing the theoretical foundations of independent component analysis, which has since become one of the standard tools for the statistical processing of data. His significant recent contributions include highly original results on tensor factorization.

The applications of Pierre Comon’s contributions are very diverse and include telecommunications, sensor networks, health and the environment. All these areas demonstrate the scope and impact of his work. His long industrial experience, his strong desire to ground his scientific approach in practical problems and his great care in developing algorithms that implement his results all further demonstrate how strongly Pierre Comon’s qualities match the criteria for the 2018 IMT-Académie des Sciences Grand Prix.[/author]