
What is hydrogen energy?

In the context of environmental and energy challenges, hydrogen energy offers a clean alternative to fossil fuels. Doan Pham Minh, a chemist and environmental engineering specialist at IMT Mines Albi, explains why this energy is so promising, how it works and the prospects for its development.

 

What makes hydrogen so interesting?

Doan Pham Minh: The current levels of interest in hydrogen energy can be explained by the pollution problems linked to carbon-based energy sources. They emit fine particles, toxic gases and volatile organic compounds. This poses societal and environmental problems that must be remedied. Hydrogen offers a solution because it does not emit any pollutants. In fact, hydrogen reacts with oxygen to “produce” energy in the form of heat or electricity. The only by-product of this reaction is water. It can therefore be considered clean energy.

Is hydrogen energy “green”? 

DPM: Although it is clean, it cannot be called “green”. It all depends on how the dihydrogen molecule is formed. Today, around 96% of hydrogen is produced from fossil raw materials, like natural gas and hydrocarbon fractions from petrochemicals. In these cases, hydrogen clearly is not “green”. The remaining 4% is produced through the electrolysis of water. This is the reverse reaction of the combustion of hydrogen by oxygen: water is separated into oxygen and hydrogen by consuming electricity. This electricity can be produced by nuclear power stations, coal-fired plants or by renewable energies: biomass, solar, hydropower, wind, etc. The environmental footprint of the hydrogen produced by electrolysis depends on the electricity’s origin.
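For reference, the combustion and electrolysis reactions described here are exact mirrors of each other, and can be written as standard chemical equations:

```latex
% Combustion of hydrogen (fuel cell or burner): energy out, water as the only by-product
2\,\mathrm{H_2} + \mathrm{O_2} \longrightarrow 2\,\mathrm{H_2O} + \text{energy}

% Electrolysis of water: the reverse reaction, driven by an electric current
2\,\mathrm{H_2O} + \text{electricity} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
```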

How is hydrogen produced from biomass?

DPM: In terms of the chemistry, it is fairly similar to the production of hydrogen from oil. Biomass is also made up of hydrocarbon molecules, but with a little more oxygen. At IMT Mines Albi, we work a great deal on thermo-conversion. Biomass (wood, wood waste, agricultural residues, etc.) is heated without oxygen, or in a low-oxygen atmosphere. The biomass then splits into small molecules, primarily producing carbon monoxide and dihydrogen. Biomass can also be transformed into biogas through anaerobic digestion by microorganisms. This biogas can then be transformed into a mixture of carbon monoxide and dihydrogen. An additional reforming step uses water vapor to transform the carbon monoxide into carbon dioxide and hydrogen. We work with industrial partners like Veolia to make use of the CO2 and prevent the release of this greenhouse gas. For example, it can be used to manufacture sodium bicarbonate, which neutralizes the acidic and toxic gases from industrial incinerators. The production of hydrogen from biomass is therefore also very clean, making it a promising technique.
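The reforming step mentioned above is the classic water-gas shift reaction, in which steam converts carbon monoxide into carbon dioxide and additional hydrogen:

```latex
\mathrm{CO} + \mathrm{H_2O} \rightleftharpoons \mathrm{CO_2} + \mathrm{H_2}
```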

Read more on I’MTech: Vabhyogaz uses our waste to produce hydrogen

Why is it said that hydrogen can store electricity?

DPM: Storing electricity is difficult: on a large scale, it requires complex batteries. A good strategy is therefore to transform electricity into another form of energy that is easier to store. Through the electrolysis of water, electrical energy is used to produce dihydrogen molecules. This hydrogen can easily be compressed, transported, stored and distributed before being reused to produce heat or generate electricity. This is a competitive energy storage method compared to mechanical and kinetic solutions, such as dams and flywheels.
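To make the storage argument concrete, here is a minimal sketch of a full store-and-release cycle. The efficiency figures are assumptions chosen for illustration (roughly 70% for an electrolyzer and 50% for a fuel cell are plausible orders of magnitude, not measured values):

```python
# Illustrative round trip for hydrogen energy storage.
# Both efficiency figures are assumptions for this sketch, not measurements.

ELECTROLYSIS_EFFICIENCY = 0.70  # fraction of electricity stored as hydrogen (assumed)
FUEL_CELL_EFFICIENCY = 0.50     # fraction of hydrogen energy returned as electricity (assumed)

def round_trip(energy_in_kwh: float) -> float:
    """Electricity -> hydrogen (electrolysis) -> electricity (fuel cell)."""
    stored_as_hydrogen = energy_in_kwh * ELECTROLYSIS_EFFICIENCY
    return stored_as_hydrogen * FUEL_CELL_EFFICIENCY

energy_in = 100.0  # kWh of surplus electricity, e.g. from a wind farm
energy_out = round_trip(energy_in)
print(f"{energy_in:.0f} kWh in -> {energy_out:.0f} kWh back ({energy_out / energy_in:.0%} round trip)")
```

With these assumed figures, roughly a third of the original electricity comes back after a full cycle.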

Why is it taking so long to develop hydrogen energy?

DPM: In my opinion, it is above all a matter of will. We see major differences between countries. Japan, for example, is very advanced in the use of hydrogen energy. South Korea, the United States and China have also invested in hydrogen technologies. Things are beginning to change in certain countries. France now has a hydrogen plan, launched last June by Nicolas Hulot. However, it remains a new development, and it will take time to establish the infrastructure. We currently have only around 20-25 hydrogen fuel stations in France, which is not many. Hydrogen vehicles remain expensive: a Toyota Mirai sedan costs €78,000 and a hydrogen bus costs approximately €620,000. These vehicles are much more expensive than their diesel or gas equivalents, largely because the number of hydrogen vehicles produced is still very limited. Prices are nevertheless expected to decline in the coming years as volumes grow. Investment programs must be established, and they take time to implement.



From human feelings to digital emotions

Making powerful machines is no longer enough. They must also connect with humans socially and emotionally. This imperative to increase the efficiency of human-computer interactions has created a new field of research: social computing. This field is aimed at understanding, modeling and reproducing human emotions. But how can an emotion be extracted and then reproduced, based only on an audio track, a video or a text? Such is the complexity of the research Chloé Clavel is working on at Télécom ParisTech.

 

“All those moments will be lost in time, like tears in rain.” We are in Los Angeles in 2019, and Roy Batty utters these words, knowing he has only seconds left to live. Melancholy, sadness, regret… Many different feelings fill this famous scene from Blade Runner, Ridley Scott’s 1982 cult film. There would not be anything very surprising about these words, were it not for the fact that Roy Batty is a replicant: an anthropomorphic machine.

In reality, there is little chance of us seeing a humanoid robot capable of developing such complex emotions walking the streets in 2019. Yet there is a clear trend towards equipping our machines to create emotional and social connections with humans. In 1995, this led to the creation of a new field of research called affective computing. Today, it has given rise to sub-disciplines such as social computing.

“These fields of research involve two aspects,” explains Chloé Clavel, a researcher in the field at Télécom ParisTech. “The first is the automatic analysis of our social behaviors, interactions and emotions. The second is our work to model these behaviors, simulate them and integrate them into machines.” The objective: promote common ground and produce similarities to engage the user. Human-computer interaction would then become more natural and less frustrating for users, who sometimes wish they were dealing with another human who would better understand their position and desires.

Achieving this result first requires understanding how we communicate our emotions to others. Researchers in affective computing are working to accomplish this by analyzing different modes of human expression. They are interested in the way we share a feeling in writing on the internet, whether on blogs, in reviews on websites or on social networks. They are also studying the acoustic content of the emotions we communicate through speech, such as pitch, speed and melody of the voice, as well as the physical posture we adopt, our facial expressions and our gestures.

The transition from signals to behaviors

All this data is communicated through signals such as a series of words, the frequency of a voice and the movement of points on a video. “The difficulty we face is transitioning from this low-level information to rich information related to social and emotional behavior,” explains Chloé Clavel. In other words, what variation in a tone of voice is characteristic of fear? Or what semantic choice in speech reflects satisfaction? This transition is a complex one because it is subjective.

The Télécom ParisTech researcher uses the example of voice analysis to explain this subjectivity criterion. “Each individual has a different way of expressing their social attitudes through speech, so large volumes of data must be used to develop models that integrate this diversity.” For example, dominant people generally express themselves with a deeper voice. To verify and model this tendency, multiple recordings are required, and several third parties must validate the various audio excerpts. “The concept of a dominant attitude varies from one person to another. Several annotations are therefore required for the recordings to avoid bias in the interpretation,” Chloé Clavel explains.

The same is true in the analysis of comments on online platforms. The researchers use a corpus of texts annotated by external individuals. “We collect several annotations for a single piece of text data,” the researcher explains. Scientists provide the framework for these annotations using guides based on literature in sociology and psychology. “This helps us ensure the annotations focus on the emotional aspects and makes it easier to reach a consensus from several annotations.” Machine learning methods are then used, without introducing any linguistic expertise into the algorithms first. This provides classifications of emotional signals that are as unbiased as possible, which can be used to identify semantic structures that characterize discontent or satisfaction.
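As an illustration of the pipeline described above (several annotations per text, a consensus label, then a classifier learned from the data alone), here is a minimal sketch using scikit-learn; the comments, annotator votes and test sentence are invented placeholders:

```python
# Minimal sketch: majority-vote labels from several annotators, then a text
# classifier learned directly from the data, with no hand-coded linguistic rules.
from collections import Counter

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus: each comment was labeled by three annotators (placeholder data).
corpus = [
    ("The assistant solved my problem quickly", ["pos", "pos", "pos"]),
    ("Useless answers, I want to talk to a human", ["neg", "neg", "pos"]),
    ("The interface is confusing even when the answer is right", ["neg", "pos", "neg"]),
    ("Great service, clear explanations", ["pos", "pos", "pos"]),
]

def majority(annotations):
    """Consensus label across annotators, to limit interpretation bias."""
    return Counter(annotations).most_common(1)[0][0]

texts = [text for text, _ in corpus]
labels = [majority(annotations) for _, annotations in corpus]

# Bag-of-words features + logistic regression: the model infers which word
# patterns signal discontent or satisfaction from the annotated data alone.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["the assistant gave useless answers"]))
```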

Emotions for mediation

Beyond the binary categorization of an opinion—as positive or negative—one of the researchers’ greatest tasks is to determine the purpose and detailed nature of this opinion. Chloé Clavel led a project on users’ interactions with a chatbot. The goal was to determine the source of a user’s negative criticism: was it caused by the chatbot itself being unable to answer the user correctly, by the interaction (for example, an unsuitable interface format), or by the user, who might simply be in a bad mood? For this project, which involved EDF’s virtual assistant, the semantic details in messages written to the chatbot had to be examined. “For example, the word ‘power’ does not have the same connotation when someone refers to contract power with EDF as it does when used to refer to the graphics power of a video game,” explains Chloé Clavel. “To gain an in-depth understanding of opinions, we must disambiguate each word based on the context.”

Read more on I’MTech: Coming soon: new ways to interact with machines

The chatbot example does not only illustrate the difficulty involved in understanding the nature and context of an opinion; it also offers a good example of the value of this type of research for the end user. If the machine is able to understand the reasons why the human it is interacting with is frustrated, it will have a better chance of adapting to provide its services in the best conditions. If the cause is the user being in a bad mood, the chatbot can respond in a humorous or soothing tone. If the problem is caused by the interaction, the chatbot can determine when it is best to refer the user to a human operator.

Recognizing emotions and reacting in a social manner therefore allow the machine to play a conciliatory role. This aspect of affective computing is being put to use in the H2020 Animatas project, in which Télécom ParisTech has been involved since 2018 and which will run for four years. “The goal is to introduce robots in schools to assist teachers and manage the social interactions with students,” Chloé Clavel explains. The idea is to provide robots with social skills to help promote children’s learning. The robot could offer each student personalized assistance during class to support the teacher’s lessons. Far from the imaginary humanoid robot hidden among humans, such an educational mediator could improve learning for children.

 


CUBAIR: a prototype for purifying indoor air

Improving indoor air quality? That’s what the CUBAIR project aims to do. By developing a new treatment system, researchers have managed to significantly reduce fine particle concentration and nitrogen oxides.

 

An important reminder: indoor air is often more polluted than outdoor air. In addition to the automobile exhaust and industrial pollution that enter our homes and offices through the windows, molds and pollutants also come from building materials or cleaning products. What can we do to make the air we breathe inside our homes and offices healthier? That is the big question for researchers working on the CUBAIR project funded by ADEME.

For four years, a group of researchers from Cerema, IMT Atlantique and LaTep (a laboratory of the Université de Pau et des Pays de l’Adour) has been developing a prototype air purification system. The air is cleaned through a three-step process. First, the air taken into the system is filtered by activated carbons with different characteristics. These materials are able to capture organic compounds present in the air (pesticides are one such example). The air then goes through a more traditional filtering stage to eliminate fine particles. The last step is a photocatalysis stage: when exposed to ultraviolet light, titanium dioxide molecules react with some of the pollutants that remain in the air.
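As an aside on how serial treatment stages combine: each stage removes some fraction of a pollutant, so the fractions that remain multiply. The per-stage efficiencies below are invented for illustration and are not CUBAIR measurements:

```python
# Toy model of serial air-treatment stages. Each stage removes a fraction of
# a pollutant, so the remaining fractions multiply across stages.
# The efficiencies are hypothetical, not measured CUBAIR values.

def overall_removal(stage_efficiencies):
    remaining = 1.0
    for efficiency in stage_efficiencies:
        remaining *= 1.0 - efficiency
    return 1.0 - remaining

# Hypothetical stages: activated carbon, particle filter, photocatalysis
stages = [0.30, 0.50, 0.20]
print(f"Overall removal: {overall_removal(stages):.0%}")  # 72%
```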

Last year, this prototype was tested at the Human Resource Development Centre in Paris. The goal was to study how effective it was in real conditions throughout an entire year. The device’s performance was measured for different kinds of pollutants: volatile organic compounds, fine particles, mold, etc. The results were especially promising for nitrogen oxides, particularly nitrogen dioxide, a major air pollutant, since the treatment system reduces their concentration by 60% in the treated air. Positive results were also observed for fine particles, with the concentration dropping by 75% for particles with diameters of less than 1 micron.

The only drawbacks: volatile organic compounds are not eliminated as effectively, and the system tends to heat up during use, which leads to extra air conditioning costs in summer. The researchers noted, however, that this heating can be an advantage in cooler weather, and that the inconvenience should be weighed against the significantly improved air quality in a room.

Overall, the CUBAIR project offers good prospects for breathing healthier air in our future buildings. Figures published by the World Health Organization in 2018 serve as a reminder that air pollution causes 7 million premature deaths worldwide every year. This pollution also represents an annual cost of approximately €20 billion in France. Combating this pollution is therefore a major health, environmental and economic issue.



Renewal of the Cyber CNI Chair: Partners Signed at FIC 2019

After three years of operation, the Cybersecurity of Critical Infrastructure (Cyber CNI) Chair is being renewed for three more years thanks to the commitment of its academic and industrial partners. The IMT Chair, led by IMT Atlantique, benefits from partnerships with the Télécom ParisTech and Télécom SudParis schools and from the support of the Brittany region, a region at the forefront of cutting-edge technology, within the framework of the Cyber Center of Excellence.

Under a sponsorship partnership led by the Fondation Mines-Télécom, five industrial partners have committed to this new period: AIRBUS, AMOSSYS, BNP Paribas, EDF and Nokia Bell Labs. The official signing to renew the Chair took place at the FIC (Forum International de la Cybersécurité) in Lille on 22 January 2019.

 

The Cyber CNI Chair was founded in January 2016. Since then, it has led research and contributed to training in its area of expertise: the cybersecurity of critical infrastructures (energy networks, digital infrastructures, industrial processes, water production plants, financial systems, etc.). This young scientific discipline is rapidly becoming an essential topic for many companies and organizations. The Chair’s first phase made significant headway in improving the security and resilience of infrastructures, particularly in detecting malicious behavior using machine learning techniques, 3D visualization of security events, and the diagnosis of accidental or malicious causes of security incidents.

While the main theme remains the cybersecurity of critical infrastructures, in its second phase, the Cyber CNI Chair will expand its field of expertise and research: reinforcing efforts aimed at applying artificial intelligence (AI) in the area of cybersecurity, sharing and pooling information on threats (Threat Intelligence), industrial applications of connected objects (Industrial IoT) and the blockchain are all included in the Chair’s new research topics.

New theses will contribute to these objectives, in addition to those already underway. At the same time, phase 2 of the Cyber CNI Chair will increase communication efforts and promotion events to highlight the work achieved.

“The Cyber CNI Chair, led by IMT Atlantique, is historically one of the first to have addressed cybersecurity with such a high level of skill and innovation in the complex area of critical infrastructures. The remarkable work carried out in recent years will continue in the second phase through the renewed commitment of all the academic and industrial partners, sponsors, students and PhD students on new vital topics. The Chair’s renewal is therefore critically important and is justified by new lines of research, growing challenges and the major expectations of contributors, whose needs are increasingly acute,” says Serge Maurice of Airbus, Chairman of the Cyber CNI Chair Steering Committee.

 


When AI helps predict a patient’s care pathway

Researchers at Mines Saint-Étienne are using process mining tools to attempt to describe typical care pathways for patients with a given disease. These models can be used to help doctors predict the next steps for treatment or how a disease will progress.

 

Will doctors soon be able to anticipate patient complications arising from a disease? Will they be able to determine an entire care pathway in advance for patients with a specific disease? These are the goals of Vincent Augusto and his team at Mines Saint-Étienne. “Based on a patient’s treatment records, their condition at a given moment, and care pathways of similar patients, we’re trying to predict what the next steps will be for the patients,” says Hugo De Oliveira, a PhD student in Health Systems Engineering whose CIFRE thesis is funded by HEVA, a company based in Lyon.

Anticipating how a disease will progress and treatment steps helps limit risks to which the patient is exposed. For people with diabetes — the example used by the researchers in their work — the process is based on detecting weak signals that are precursors of complications as early as possible. For a given patient, the analysis would focus on several years of treatment records and a comparison with other diabetic patients. This would make it possible to determine the patient’s risk of developing renal failure or requiring an amputation related to diabetes.

In order to predict these progressions, the researchers do not rely on personal medical data, such as X-rays or biological analyses. They use medico-administrative data from the national health data system (SNDS). “In 2006, activity-based pricing was put into place,” notes Hugo De Oliveira. With this shift in the funding principle for healthcare institutions, a large database was created to provide hospitals with the information necessary for the reimbursement of treatment. “It’s a very useful database for us, because each line collects information about a patient’s stay: age, sex, care received, primary diagnosis, associated pathologies from which they suffer, etc.,” says the young researcher.

An entire pathway in one graph

Vincent Augusto’s team is developing algorithms that analyze these large volumes of data. Patients are sorted and put into groups with similar criteria. Different care pathway categories can then be established, each of which groups together several thousand similar pathways (similar patients, identical complications, etc.). In one category — diabetic patients who have undergone amputation, for example — the algorithm analyzes all of the steps for the entire group of patients in order to deduce which ones are most characteristic. A graph is produced to represent the typical pathway for this category of patient. It may then be used as a reference to find out whether a patient in the early stages of the disease is following similar steps, and to determine the probability that he/she belongs to this category.

This graph represents the care pathway for patients monitored over an 8-year period who have had a cardiac defibrillator implanted. The part before the implantation can be used to establish statistics for the steps preceding the procedure. The part after the implantation provides information about the future of patients following the implantation.

 

In this way, the researchers are working on developing longitudinal graphs: each treatment step represents a point on the graph, and the whole graph can be read chronologically: “Doctors can read the graph very easily and determine where the patient is situated in the sequence of steps that characterize his/her pathway,” explains Hugo De Oliveira. The difficulty with this type of data representation comes from its comprehensiveness: “We have to find a way to fit an entire patient pathway into a single line,” says the PhD student. In order to do so, the team chose to use process mining, a data mining and knowledge extraction tool. Machine learning is another such tool.
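As a minimal sketch of the core process-mining step, the snippet below builds a “directly-follows” graph by counting how often one care step immediately leads to the next across many pathways. The event sequences are invented placeholders, not SNDS data, and real process-mining tools go well beyond this:

```python
# Minimal process-mining sketch: count direct transitions between care steps
# across patient pathways to obtain the skeleton of a typical-pathway graph.
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Placeholder pathways (one list of steps per patient), not real SNDS data.
pathways = [
    ["diagnosis", "consultation", "treatment", "follow-up"],
    ["diagnosis", "consultation", "hospitalization", "treatment", "follow-up"],
    ["diagnosis", "treatment", "follow-up"],
]

# Directly-follows graph: edge (a, b) weighted by how often step b follows step a.
edges = Counter(edge for path in pathways for edge in pairwise(path))

for (source, target), count in edges.most_common():
    print(f"{source} -> {target}: {count} patient(s)")
```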

Process mining helps make care pathway descriptions more effective and easier to read, but it also provides another benefit: it is not a ‘black box’, a characteristic often encountered in neural-network-type algorithms. Such algorithms are effective at processing data, but it is impossible to understand the processes that led to their results. Unlike these algorithms, the process mining algorithms used to predict treatment pathways are transparent. “When a patient is characterized by a type of graph, we’re able to understand why by looking at past treatment steps, and by studying each graph for the patient’s categories to understand how the algorithm evaluated the pathway,” says Hugo De Oliveira.

Making artificial intelligence applications more transparent is one of the issues raised by the working group, led by Cédric Villani, that produced the French national report on AI. The project is also in keeping with the objectives set by the mathematician and member of the French parliament to facilitate AI experimentation, for healthcare applications in particular. “Our research directly benefits from policies for opening access to health data,” says the PhD student. This access will continue to open up for the researchers: later this year they will be able to use the database of the national health insurance cross-scheme system (SNIIRAM), whose 1.2 billion healthcare forms will be used to improve the algorithms and better identify patient treatment pathways.

 

Canaries were once used in coal mines to forewarn of impending firedamp explosions. This story has inspired a cyberdefense tool: stack canaries.

Stack canaries: overestimating software protection

Android, Windows, Mac, Linux… All operating systems contain stack canaries — one of the most common forms of software protection. These safeguards, which protect computer systems from intrusions, are perceived as very effective. Yet recent research carried out by EURECOM and the Technical University of Munich shows that most stack canaries contain vulnerabilities. The results, obtained through a project led by the German-French Academy for the Industry of the Future, highlight the fragility of computer systems in the context of increasingly digitized organizations.

 

During the 19th century, canaries were used in coal mines to forewarn of impending firedamp explosions. The flammable, odorless gas released by the miners’ activities caused the birds either to lose consciousness or to die, alerting the workers that something was wrong. Over a century later, in the late 1990s, cybersecurity researchers were inspired by the story of canaries in coal mines. They invented a simple protection system for detecting software corruption, calling it the “stack canary”. Since then, it has become one of the most common protection systems in the software we use and is now present in almost all operating systems. But is it really effective?

Perhaps it seems strange to be asking this question over 20 years after the first stack canaries were used in computer products. “The community assumed that the protection worked,” explains Aurélien Francillon, a researcher in cybersecurity at EURECOM. “There was some research revealing potential vulnerabilities of stack canaries, but without any in-depth investigation into the issue.” Researchers from EURECOM and the Technical University of Munich (TUM) therefore partnered to remedy this lack of knowledge. They assessed the vulnerabilities of stack canaries in 17 different combinations of 6 operating systems, to detect potential defects and determine good practices for remedying them. Linux, Windows 10, macOS Sierra and Android 7.0 were all included in the study.

“We showed that, in the majority of operating systems, these countermeasures for detecting defects are not very secure,” Aurélien Francillon explains. 8 of the 17 tested combinations were found by the researchers to use an ineffective stack canary (see table below). 6 others can be improved, and the last 3 are sound. This study of the vulnerabilities of stack canaries, carried out in the context of the Secure Connected Industry of the Future (SeCIF) project, part of the German-French Academy for the Industry of the Future, is linked to the growing digital component of organizations. Industries and companies are increasingly reliant on connected objects and IT processes. Defects in the protection mechanisms of operating systems can therefore endanger companies’ overall security, whether through access to confidential data or by allowing control of industrial machinery.

Out of the 17 combinations tested, only Android 7.0 “Nougat”, macOS 10.12.1 “Sierra” and OpenBSD 6.0 (Unix) had completely secure stack canaries. A red cross means that it is possible to bypass the stack canary in the given combination. An orange cross means that the stack canary’s security can be improved. The table columns correspond to different memory types from a programming-logic standpoint.


The canary in the memory

To understand the impacts of the defects revealed by this research, it is important to first understand why stack canaries are used and how they work. Many attacks that occur are aimed at changing values in a program that are not meant to be changed. The values are stored in memory space. “Let’s say this space has a capacity of 20 bytes,” says Julian Kirsch, a cybersecurity researcher at TUM and co-author of this study. “I would store my name and height on 20 of these bytes. Then, on another space located just behind it, I would store my bank account number. If a hacker wants to corrupt this information, he will add values, for example by adding a number to the value for my height. By doing this, my height data will overflow from the 20-byte space to the space where my bank account number is stored, and the information will no longer be correct. When the program needs to read and use this data, things will go wrong.”

In more complex cases for operating systems, the consequences include more critical errors than the wrong bank account number. To determine whether the information stored in the memory was altered, a known numerical value can be inserted between the storage spaces, as a type of memory buffer. If a hacker adds information, like in Julian Kirsch’s example in which the height was changed, everything will shift, and the value indicated in the memory buffer will change. The stack canary is simply a memory buffer. If the stack canary’s security is compromised, the hacker can modify it and then hide it by resetting it to the initial value.
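To make the mechanism concrete, here is a toy simulation of the layout Julian Kirsch describes: a fixed-size field, a canary value placed right after it, and a check that detects the overflow. Real canaries live on the call stack and are verified by compiler-generated code; this sketch only mimics the principle:

```python
# Toy simulation of a stack canary. Real canaries sit on the call stack and
# are checked by compiler-generated code; this only mimics the memory layout.
import os

FIELD_SIZE = 20
CANARY = os.urandom(8)  # random reference value, as real canaries use

# Layout: [ 20-byte data field | 8-byte canary | adjacent data ]
memory = bytearray(FIELD_SIZE) + bytearray(CANARY) + bytearray(b"BANK-ACCOUNT")

def write_field(data: bytes) -> None:
    """An unchecked write: data longer than the field spills over the canary."""
    memory[0:len(data)] = data  # no bounds check, like a C buffer overflow

def canary_intact() -> bool:
    return memory[FIELD_SIZE:FIELD_SIZE + 8] == CANARY

write_field(b"A" * 32)  # 12 bytes too many: the overflow tramples the canary
print("all good" if canary_intact() else "corruption detected!")
```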

To make the hacker’s work more difficult, the value of most stack canaries is changed regularly. A copy of the new value is stored in another memory space and both values, the real one and the reference one, are compared to ensure the integrity of the software. In their work, the researchers showed that the vulnerabilities of stack canaries are primarily linked to the place where this reference value is stored. “Sometimes it is stored in a memory space located right next to the stack canary,” Julian Kirsch explains. The hacker therefore does not need to access another part of the system and can change both values at the same time. “This is a defect we see in Linux, for example, which really surprised us because this operating system is widely used,” the TUM researcher explains.

How can such commonly used protection systems be so vulnerable on operating systems like Linux and Windows? First of all, Aurélien Francillon reminds us that stack canaries are not the only countermeasures protecting operating systems. “In general, these are not the only countermeasures used, but stack canaries still represent significant hurdles that hackers must overcome to gain control of the system,” the EURECOM researcher explains. Their vulnerability therefore does not threaten the entire security of an operating system, but it is one less obstacle for hackers to overcome.

The second, less technical reason for this permissiveness regarding stack canaries is related to developers’ choices. “They do not want to increase the security of these countermeasures, because it would decrease performance,” Julian Kirsch explains. For software publishers, security is a less competitive argument than the software’s performance. Greater security implies allocating more computing resources to tasks that do not directly respond to the software user’s requests. Still, customers rarely appreciate computer system intrusions. Given organizations’ growing concerns about cybersecurity issues, we can hope that future software choices will better integrate this aspect. Security could then become a serious argument in the software solution market.


BioDigital, a new technology to combat identity spoofing

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot Institute (TSN), to which Télécom SudParis belongs. The original version of this article was published in French on the Télécom SudParis website.


Following an 18-month collaboration agreement, Télécom SudParis (a member of the Télécom & Société Numérique Carnot Institute, TSN) and IDEMIA, the global leader in augmented identity, have finalized the design for a contactless biometric reader based on a patent filed by two Télécom SudParis researchers. The technology transfer to IDEMIA has just been completed.

 

Fingerprint spoof detection

The technology comprises a next-generation biometric fingerprint scanner called BioDigital. It is an effective tool for combating identity spoofing and also provides a solution to difficulties related to the very nature of biometric data through improved recognition of damaged fingerprint surfaces. “The quality of the reconstructed image of the internal fingerprint is what makes our technology truly original,” says Bernadette Dorizzi, Dean of Research at Télécom SudParis.

Télécom SudParis and IDEMIA have worked in close collaboration. The group provided an assessment algorithm and compiled a database for evaluating it, which made it possible to demonstrate that BioDigital provides safer and more effective fingerprint matching while also detecting spoofed fingerprints, with a success rate of nearly 100%.

Subcutaneous fingerprint and sweat pore network recognition

This contactless technology recognizes not only the fingerprint, but also the subcutaneous print and the network of sweat pores. It is based on optical coherence tomography, which produces 3D images using light “echoes”. This allows BioDigital to read fingerprints without direct contact with the reader. Along with this innovation, the system also provides exceptional image reconstruction quality. “By fusing phase and intensity images, we’ve succeeded in obtaining as natural an image as possible,” says Yaneck Gottesman, research professor at Télécom SudParis.

“For a certain number of crucial applications, such as the protection of critical infrastructures, spoofing attacks are a real issue, and it’s a race between hackers and technology developers like IDEMIA. Once this technology is put into production and integrated in our products, it has the potential to put us another step ahead,” adds Jean-Christophe Fondeur, Executive Vice-President for Research & Development at IDEMIA.

 


A guarantee of excellence in partnership-based research since 2006

The Télécom & Société Numérique Carnot Institute (TSN) has been partnering with companies since 2006 to research developments in digital innovations. With over 1,700 researchers and 50 technology platforms, it offers cutting-edge research aimed at meeting the complex technological challenges posed by the digital, energy and industrial transitions currently underway in the French manufacturing industry. It focuses on the following topics: industry of the future, connected objects and networks, sustainable cities, transport, health and safety.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.



Fine particles: how can their impact on health be better assessed?

In order to assess the danger posed by fine particles in ambient air, it is crucial to do more than simply take regulatory measurements of their mass in the air. The diversity of their chemical composition means that different toxicological impacts are possible for an equal mass. Chemists at IMT Lille Douai are working on understanding the physicochemical properties of the fine particle components responsible for their adverse biological effects on health. They are developing a new method to indicate health effects, based on measuring the oxidizing potential of these pollutants in order to better identify those which pose risks to our health.

 

The smaller they are, the greater their danger. That is the rule of thumb used to sum up the toxicity of the various types of particles present in the atmosphere. It is based on the ease with which the smallest particles penetrate deep into our lungs and get trapped there. While the size of particles clearly plays a major role in how dangerous they are, the impact of their chemical composition must not be underestimated. For an equal mass of fine particles in the air, those we breathe in Paris are not the same as the ones we breathe in Dunkirk or Grenoble, due to the different nature of the sources that produce them. And even within the same city, the particles we inhale vary greatly depending on where we are located in relation to a road or a factory.

“Fine particles are very diverse: they contain hundreds, or even thousands, of chemical compounds,” say Laurent Alleman and Esperanza Perdrix, researchers in atmospheric pollution in the department of atmospheric sciences and environmental engineering at IMT Lille Douai. Carboxylic acids and polycyclic aromatic hydrocarbons are just two of the many kinds of molecules found in particles in higher or lower proportions. A great number of metals and metalloids can be added to this organic cocktail (copper, iron, arsenic, etc.), as well as black carbon. The final composition of a fine particle therefore depends on its proximity to sources of each of these ingredients. Copper and antimony, for example, are commonly found in particles near roads, produced by cars when braking, while nickel and lanthanum are typical of fine particles produced by petrochemical activities.

Read more on I’MTech: What are fine particles?

Today, only the mass concentration of particles below certain sizes in the air is taken into account when establishing the thresholds used to warn the population. For Laurent Alleman and Esperanza Perdrix, it is important to go beyond mass and size to better understand and prevent the health impacts of particles based on their chemical properties. Each molecule, each chemical species present in a particle has a different toxicity. “When they penetrate our lungs, fine particles break down and release these components,” explains Laurent Alleman. “Depending on their physicochemical properties, these exogenous agents will have a more or less aggressive effect on the cells that make up our respiratory system.”

Measuring particles’ oxidizing potential

This aggression mainly takes the form of oxidation chemical reactions in cells: this is oxidative stress. This effect induces deterioration of biological tissue and inflammation, which can lead to different pathological conditions, whether in the respiratory system — asthma, chronic obstructive pulmonary diseases — or throughout the body. Since the chemical components and molecules produced by these stressed cells enter the bloodstream, they also create oxidative stress elsewhere in the body. “That’s why fine particles are also responsible for cardiovascular diseases such as cardiac rhythm disorders,” says Esperanza Perdrix. When it becomes too severe and chronic, oxidative stress can have mutagenic effects by altering DNA and can promote cancer.

For researchers, the scientific challenge is therefore to better assess a fine particle’s ability to cause oxidative stress. At IMT Lille Douai, the approach is to measure this ability in test tubes by determining the resulting production of oxidizing molecules for a specific type of particle. “We don’t directly measure the oxidative stress produced at the cellular level, but rather the fine particle’s potential to cause this stress,” explains Laurent Alleman. As such, the method is less expensive and quicker than a study in a biological environment. Most importantly, “Unlike tests on biological cells, measuring particles’ oxidizing potential is quick and can be automated, while giving us a good enough indication of the oxidative stress that would be produced in the body,” says Esperanza Perdrix. A winning combination, which would make it possible to make oxidizing potential a reference base for the analysis and ongoing, large-scale prevention of the toxicity of fine particles.

To measure the toxicity of fine particles, researchers are finding alternatives to biological analysis.

 

This approach has already allowed the IMT Lille Douai team to measure the harmfulness of metals. They have found that copper and iron are the chemical elements with the highest oxidizing potential. “Iron reacts with the hydrogen peroxide in the body to produce what we call free radicals: highly reactive chemical species with short lifespans, but very strong oxidizing potential,” explains Laurent Alleman. If the iron provided by the fine particles is not counterbalanced by an antioxidant — such as vitamin C — the radicals formed can break molecular bonds and damage cells.
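The iron chemistry described here is the classic Fenton reaction, in which iron(II) converts hydrogen peroxide into a highly reactive hydroxyl radical:

```latex
\mathrm{Fe^{2+}} + \mathrm{H_2O_2} \longrightarrow \mathrm{Fe^{3+}} + \mathrm{OH^{\bullet}} + \mathrm{OH^{-}}
```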

Researchers caution, however, that “Measuring oxidizing potential is not a unified method; it’s still in the developmental stages.” It is based on the principle of bringing the component whose oxidizing potential is to be assessed into contact with an antioxidant, and then measuring the quantity or rate of antioxidant consumed. In order for oxidizing potential to become a reference method, it still has to gain wider acceptance in the scientific community, demonstrate its ability to accurately reflect the biological oxidative stress produced in vivo, and be standardized.
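As a rough sketch of this measurement principle, the snippet below takes an invented time series of antioxidant concentration measured after mixing with a particle sample, and uses its depletion rate as the oxidizing-potential index. Real protocols, such as the dithiothreitol (DTT) assay, differ in their details:

```python
# Sketch of an oxidizing-potential readout: the faster the antioxidant mixed
# with the particle sample is consumed, the higher the oxidizing potential.
# The data points below are invented for illustration.

def consumption_rate(times_min, concentrations_um):
    """Least-squares slope of antioxidant concentration vs. time (in µM/min)."""
    n = len(times_min)
    mean_t = sum(times_min) / n
    mean_c = sum(concentrations_um) / n
    covariance = sum((t - mean_t) * (c - mean_c)
                     for t, c in zip(times_min, concentrations_um))
    variance = sum((t - mean_t) ** 2 for t in times_min)
    return covariance / variance

times = [0, 10, 20, 30, 40]                      # minutes after mixing (hypothetical)
concentration = [100.0, 92.0, 85.5, 78.0, 71.0]  # antioxidant remaining, in µM

rate = consumption_rate(times, concentration)
print(f"Oxidizing potential index: {-rate:.2f} µM of antioxidant consumed per minute")
```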

So for now, the mass concentration of fine particles remains the preferred method. Nevertheless, a growing number of studies are being carried out with the aim of taking account of chemical composition and health aspects. This is reflected in the many disciplines involved in this research. “Toxicological issues bring together a wide variety of fields such as chemistry, physics, biology, medicine, bioinformatics and risk analysis, to name just a few,” says Esperanza Perdrix, who also cites communities other than those with scientific expertise. “This topic extends beyond our disciplinary fields and must also involve environmental groups, citizens, elected officials and others,” she adds. 

Research is ongoing at the international level as well, in particular through MISTRALS, a large-scale meta-program led by CNRS, launched in 2010 for a ten-year period. One of its programs, called ChArMEx, aims to study pollution phenomena in the Mediterranean basin. “Through this program, we’re developing international collaboration to improve methods for measuring oxidizing potential,” explains Laurent Alleman. “We plan to develop an automated tool for measuring oxidizing potential over the next few years, by working together with a number of other countries, especially those in the Mediterranean region such as Crete, Lebanon, Egypt, Turkey etc.”

 



A MOOC to learn all about air pollution

On October 8, IMT launched a MOOC dedicated to air quality, drawing on the expertise of IMT Lille Douai. It presents the main air pollutants and their origin, whether man-made or natural. The MOOC will also provide an overview of the health-related, environmental and economic impacts of air pollution.



Three new winners of the IMT-Académie des Sciences awards

The IMT-Académie des Sciences Awards, established in 2017, honor outstanding European scientific contributions to the fields of digital technology, energy and the environment. The 2018 formal awards ceremony was held on Tuesday, November 20th in the dome of the Institut de France. The three award winners (two “Grand Prix” winners and one “Young Scientist” winner) answered our questions.

 

Awards Ceremony at the Institut de France, 20 November 2018

 

Two winners share the IMT-Académie des Sciences Grand Prix

For this second edition, the jury selected two winners for the IMT-Académie des Sciences Grand Prix: Ange Nzihou, director of the RAPSODEE-CNRS laboratory at IMT Mines Albi, and Pierre Comon, CNRS research director at Grenoble INP.

– Ange NZIHOU, 52, is a pioneer in waste recovery. His research has led to the development of new energy production mechanisms using industrial, urban and biomass waste. Ange Nzihou has greatly contributed to creating an international scientific community on this subject.

– Pierre COMON first made his mark in the field of signal processing, where he significantly contributed to laying the foundations of signal source separation techniques. He then focused his work on matrix factorization, which has applications in imaging and in chemistry for detecting toxic molecules.

The IMT-Académie des Sciences Young Scientist Prize

The IMT-Académie des Sciences Young Scientist Prize was awarded to Ioan-Mihai Miron, research director at CNRS-CEA.

Ioan-Mihai MIRON, 38, is a global expert in spintronics—a field that seeks to study and control the electromagnetic properties of electrons. His work has applications in the development of new technology for data storage that is both more powerful and more energy efficient.

 

Élise Vareilles

IMT Mines Albi | Configuration, Decision Support Systems, Constraint Programming, Knowledge Management

 

Dr Élise Vareilles has been an Associate Professor at IMT Mines Albi, France, since 2005. She received her PhD in 2005 and won INP Toulouse’s prize for the best PhD thesis that same year. She received her accreditation to supervise research (HDR) in the field of industrial engineering in 2015. She works on the development of interactive, knowledge-based configuration, design and decision-aiding tools, and contributes to the development and improvement of the CoFiADe software. Since January 2016, she has led the ORKID (Organization, Risk, Knowledge in Design) research team in the Industrial Engineering Center of IMT Mines Albi.
