
MAGIC: the wonders of infrared camouflage

The MAGIC project aims to develop a camouflage technique against infrared cameras. Mines Saint-Étienne is using its expertise in the optical properties of materials to achieve the project’s objective. Funded by the DGA and supported by the ANR, MAGIC focuses primarily on military applications. Jenny Faucheu, a researcher on the project at Mines Saint-Étienne, explains the scientific approach used.

 

Infrared detection is particularly known for its application in thermal cameras. How do these cameras work?

Jenny Faucheu: They are based on thermography, which is used in thermal diagnostics, for example. The technique produces false-color images that map thermal radiation. Cameras that produce this kind of image work by capturing far-infrared wavelengths: wavelengths longer than those of visible light, corresponding to the electromagnetic radiation emitted by objects whose temperature ranges from a few tens to several hundred degrees. The image displayed reflects how much of this radiation is received.
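As a rough illustration of why these cameras look at far (long-wave) infrared light, Wien’s displacement law relates a body’s temperature to its peak emission wavelength. The short Python sketch below is our own back-of-the-envelope check, not part of the MAGIC project; the temperatures chosen are arbitrary examples:

```python
# Wien's displacement law: lambda_max = b / T, with T in kelvin.
# Illustrative only; the constant is standard, the temperatures are arbitrary.
WIEN_B = 2.898e-3  # Wien's displacement constant, in m*K

for celsius in (20, 100, 500):
    t_kelvin = celsius + 273.15
    peak_um = WIEN_B / t_kelvin * 1e6  # peak emission wavelength, in micrometers
    print(f"{celsius:>3} °C -> peak emission near {peak_um:.1f} µm")

# 20 °C -> ~9.9 µm, 100 °C -> ~7.8 µm, 500 °C -> ~3.7 µm:
# all far beyond visible light (0.4-0.7 µm), in the thermal infrared band.
```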

The ANR MAGIC project aims to develop a camouflage technique against this type of detection. What is this exactly?

JF: We use a material based on vanadium dioxide. It has thermochromic properties, meaning that its ability to emit infrared radiation changes with temperature. More precisely, we use a polymorph of this vanadium oxide, a particular crystalline form. When heated above 70°C, its crystalline form changes and the material drops from radiating 80% of its thermal energy to 40%, making it appear colder than it actually is on thermal cameras. 40% radiation from an object at 75°C still corresponds to less radiation than 80% from an object at 65°C. This is one of the two camouflage properties we aim to develop.
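That last comparison can be checked with the Stefan-Boltzmann law, which states that the power radiated per unit area is the emissivity times σT⁴ (with T in kelvin). The sketch below is a minimal illustration of that arithmetic, assuming the 40% and 80% figures quoted above can be read as emissivities; it is not code from the project:

```python
# Stefan-Boltzmann comparison (illustrative assumption: 40% and 80% are emissivities).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiant_exitance(emissivity: float, temp_celsius: float) -> float:
    """Power radiated per unit area by a grey body, in W/m^2."""
    return emissivity * SIGMA * (temp_celsius + 273.15) ** 4

hot_camouflaged = radiant_exitance(0.40, 75)   # low-emissivity state, above the transition
cooler_uncoated = radiant_exitance(0.80, 65)   # high-emissivity state, below it

print(f"{hot_camouflaged:.0f} W/m^2 vs {cooler_uncoated:.0f} W/m^2")
# ~333 W/m^2 vs ~593 W/m^2: the hotter but low-emissivity surface radiates less,
# so a thermal camera reads it as colder.
```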

What is the other camouflage property you are working on?

JF: Thermographic cameras that produce multicolor images are not the only cameras based on infrared emission. The other detection mechanism is the one used by cameras that produce grayscale night images. These cameras amplify near-infrared wavelengths, close to visible light, and display them in white in the image. Things that emit no infrared radiation are displayed in black. If there is not enough energy to amplify in the image, the camera emits a beam and records what is reflected back to it, a bit like sonar. In this case, even if the vanadium oxide material emits less radiation, it will still be detected because it will reflect the camera’s beam.

How can you remain undetected by this second type of camera?

JF: We need to work on the surface texture of the materials and their structure. Our approach consists of laser-texturing the vanadium oxide material. We shape the surface to scatter the infrared beam emitted by the camera in different directions. To do this, we are working with Manutech-USD, which has a laser-texturing platform capable of working on large and complex parts. Since the beam is not reflected back towards the camera, it is as though it had passed straight through the object. As far as the camera is concerned, if it receives no reflection there is nothing in front of it. Objects that would be displayed in white in the image without camouflage will instead be displayed in black.

What applications do you foresee for this work?

JF: MAGIC is a response to the ASTRID call, whose projects are funded by the Directorate General for Armaments (DGA). The planned applications are therefore essentially military. We are working with Hexadrone to build a surveillance drone like those found in stores… but a stealthy one. We also want to show that it is possible to reduce the thermal signature of engines and infantrymen. By adding a few tungsten atoms to the vanadium oxide material, the temperature at which the crystalline form changes can be lowered from about 70°C to about 35°C. This is very practical for potential human applications. A normally dressed person would appear at around 37°C on a camera, but a suit made of this special material could make them undetectable by making them appear much colder.

 

Imed Boughzala

Institut Mines-Télécom Business School | Digital intelligence, Digital transformation, Data intelligence

 

Imed Boughzala is Professor of Information Systems and Director of the TIM (Technology, Information & Management, formerly DSI) department at IMT-BS. He holds a PhD in Computer Science from Université Pierre & Marie Curie in Paris and is HDR-accredited in both Computer Science and Management Science. Imed has deep and rich international experience acquired through his research, lecturing and collaboration on major projects. He recently completed an Executive MBA at IMT-BS and the “Management and Leadership in Higher Education” program at the Harvard Graduate School of Education.

His research interests focus on Digital Intelligence and Digital Transformation. He is the founder of the SMART BIS (Smart Business Information Systems) research team and is currently Director of the IS (Innovation Support) Lab, which brings together scholars from different areas working on the next generation of information systems. Since September 2018, he has been co-heading the observatory of digital transformation within business schools and is a member of the labeling committees for pedagogical initiatives at FNEGE.



KM3NeT: Searching the Depths of the Sea for Elusive Neutrinos

The Sun alone produces more than 64 billion neutrinos per second per cm² that pass right through the Earth. These elementary particles of matter are everywhere, yet they remain almost entirely elusive. The key word is almost… The European infrastructure KM3NeT, currently being installed in the depths of the Mediterranean Sea, has been designed to detect the extremely faint light generated by neutrino interactions in the water. Researcher Richard Dallier from IMT Atlantique offers insight into the major scientific and technical challenge of searching for neutrinos.

 

These “little neutral particles” are among the most mysterious in the universe. “Neutrinos have no electric charge, very low mass and move at a speed close to that of light. They are hard to study because they are extremely difficult to detect,” explains Richard Dallier, member of the KM3NeT team from the Neutrino group at Subatech laboratory[1]. “They interact so little with matter that only one particle out of 100 billion encounters an atom!”

Although their existence was first postulated in the 1930s by physicist Wolfgang Pauli, it was not confirmed experimentally until 1956, by American physicists Frederick Reines and Clyde Cowan; Reines was awarded the Nobel Prize in Physics for this discovery in 1995. This was a small revolution for particle physics. “It could help explain the excess of matter that enabled our existence. The Big Bang created as much matter as it did antimatter, but they mutually annihilated each other very quickly. So, there should not be any left! We hope that studying neutrinos will help us understand this imbalance,” Richard Dallier explains.

The Neutrino Saga

While there is still much to discover about these shy particles, we do know that neutrinos exist in three forms or “flavors”: the electron neutrino, the muon neutrino and the tau neutrino. The neutrino is certainly an unusual particle, capable of transforming over the course of its journey. This phenomenon is called oscillation: “The neutrino, which can be generated from different sources, including the Sun, nuclear power plants and cosmic rays, is born as one type, takes on a hybrid form combining all three flavors as it travels, and can then appear as a different flavor when it is detected,” Richard Dallier explains.

The oscillation of neutrinos was first revealed in 1998 by the Super-Kamiokande experiment, a Japanese neutrino observatory, a discovery that also earned a Nobel Prize in Physics in 2015. This change of identity is key: it provides indirect evidence that neutrinos do have a mass, albeit an extremely low one. However, another mystery remains: what is the mass hierarchy of these three flavors? The answer to this question would further clarify our understanding of the Standard Model of particle physics.

The singularity of neutrinos is a fascinating area of study. A growing number of observatories and detectors dedicated to the subject are being installed at great depths, where the combination of darkness and concentration of matter is ideal. Russia has installed a detector at the bottom of Lake Baikal and the United States at the South Pole. Europe, for its part, is working in the depths of the Mediterranean Sea. This fishing for neutrinos began in 2008 with the Antares experiment, a unique type of telescope that can detect even the faintest light crossing the depths of the sea. Antares then made way for KM3NeT, whose sensitivity is improved by several orders of magnitude. This experiment brings together nearly 250 researchers from around 50 laboratories and institutes, including four French laboratories. In addition to studying the fundamental properties of neutrinos, the collaboration aims to discover and study the astrophysical sources of cosmic neutrinos.

Staring into the Universe

KM3NeT is actually made up of two gigantic neutrino telescopes currently being installed at the bottom of the Mediterranean Sea. The first, called ORCA (Oscillation Research with Cosmics in the Abyss), is located off the coast of Toulon in France. Submerged at a depth of nearly 2,500 meters, it will eventually be composed of 115 strings attached to the seabed. “Optical detectors are placed along each of these 200-meter flexible strings, which are spaced 20 meters apart: 18 spheres measuring 45 centimeters across, spaced 9 meters apart, each contain 31 light sensors,” explains Richard Dallier, who is participating in the construction and installation of these modules. “This unprecedented density of detectors is required in order to study the properties of the neutrinos: their nature, their oscillations, and thus their masses and how they are ordered. The sources of neutrinos ORCA will focus on are the Sun and the terrestrial atmosphere, where they are generated in large numbers by the cosmic rays that bombard the Earth.”


Each of KM3NeT’s optical modules contains 31 photomultipliers that detect the light produced by interactions between neutrinos and matter. These spheres, 47 centimeters in diameter (including a covering of nearly 2 cm!), were designed to withstand pressures of 350 bar.

The second KM3NeT telescope is ARCA (Astroparticles Research with Cosmics in the Abyss). It will be located 3,500 meters under the sea off the coast of Sicily. There will be twice as many strings, which will be longer (700 meters) and spaced further apart (90 meters), but with the same number of sensors per string. With a volume of over one km³ (hence the name KM3NeT, for km³ Neutrino Telescope), ARCA will be dedicated to searching for and observing the astrophysical sources of neutrinos, which are much rarer. A total of over 6,000 optical modules containing around 200,000 light sensors will be installed by 2022. These numbers make KM3NeT the largest detector in the world, on a par with its cousin, IceCube, in Antarctica.
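For a sense of scale, the detector counts follow from the geometry described above. The tally below is our own arithmetic based on the figures quoted in this article (strings, modules per string, sensors per module); the exact counts of the final detectors may differ slightly:

```python
# Rough tally from the figures quoted in the article (illustrative, not official).
PMTS_PER_MODULE = 31     # light sensors (photomultipliers) per optical module
MODULES_PER_STRING = 18  # optical modules per string

orca_strings = 115
arca_strings = 2 * orca_strings  # "twice as many strings" as ORCA

modules = (orca_strings + arca_strings) * MODULES_PER_STRING
sensors = modules * PMTS_PER_MODULE
print(modules, sensors)  # 6210 optical modules, 192510 light sensors
# i.e. over 6,000 modules and on the order of 200,000 sensors in total.
```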

Both ORCA and ARCA operate on the same principle, based on the indirect detection of neutrinos. When a neutrino encounters an atom of matter (in the air, in water, or in the Earth itself, since neutrinos easily travel right through it), it can “deposit” its energy there. This energy is instantly transformed into the particle corresponding to the neutrino’s flavor: an electron, a muon or a tau. This “daughter” particle then continues along the same path as the initial neutrino and at the same speed, emitting light in the medium it is passing through, or itself interacting with atoms in the environment and disintegrating into other particles, which will also radiate blue light.

“Since this is all happening at the speed of light, an extremely short light pulse of a few nanoseconds occurs. If the medium the neutrino is passing through is transparent, which is the case for the water of the Mediterranean Sea, and the path goes through the volume occupied by ORCA or ARCA, the light sensors will detect this extremely faint flash,” Richard Dallier explains. So if several sensors are hit, the direction of the trajectory can be reconstructed and the energy and nature of the original neutrino determined. But regardless of the source, the probability of neutrino interactions remains extremely low: even with a volume of 1 km³, ARCA expects to detect only a few neutrinos of cosmic origin.

Neutrinos: New Messengers Revealing a Violent Universe

Seen as cosmic messengers, these phantom particles open a window onto a violent universe. “Among other things, the study of neutrinos will provide a better understanding and knowledge of cosmic cataclysms,” says Richard Dallier. Collisions of black holes and neutron stars, supernovae, and even massive stars that collapse all produce bursts of neutrinos that bombard us without being absorbed or deflected along the way. Light is therefore no longer the only messenger of the objects in the universe.

Neutrinos have thus strengthened the arsenal of “multi-messenger” astronomy, which relies on cooperation among as many observatories and instruments around the world as possible. Each wavelength and each particle contributes to the study of different processes and complementary aspects of astrophysical objects and phenomena. “The more observers and objects observed, the greater the chances of finding something,” Richard Dallier explains. And in these extraterrestrial particles lies the possibility of tracing our own origins with greater precision.

[1] SUBATECH is a research laboratory jointly operated by IMT Atlantique, the Institut National de Physique Nucléaire et de Physique des Particules (IN2P3) of the CNRS, and the Université de Nantes.

Article written for I’MTech (in French) by Anne-Sophie Boutaud


Chronic disease: what does the Internet really change in patients’ lives?

For the first time, a study has assessed the impact of digital technology on the lives of patients with chronic diseases. It was conducted by the ICA patient association collective, in partnership with researchers from the Smart Objects and Social Networks chair at Institut Mines-Télécom Business School. The study provides a portrait of the benefits and limitations perceived by chronically ill people for three technologies: the Internet, mobile applications and smart objects. Multiple factors were evaluated, such as the quality of the relationship with the physician, the degree of expertise, the patient’s level of incapacitation and their quality of life.

 

Searching the Internet has become an automatic reflex for learning about any disease. From the common cold to the rarest conditions, patients find information about their cases on more or less specialized sites. Scientific publications have already shown that social networks and health forums are used especially by patients when they are diagnosed. However, the true usefulness of the Internet, apps or smart objects for patients remains unclear. To gain a better understanding of how new technology helps patients, the Impatients, Chroniques & Associés (ICA) collective contacted the Smart Objects and Social Networks Chair at Institut Mines-Télécom Business School. The study, conducted between February and July 2018, focused on people living with chronic disease and their use of digital technology. The results were presented on February 20, 2019 at the Cité des Sciences et de l’Industrie in Paris.

In all, 1,013 patients completed the questionnaire designed by the researchers. The data collected on technology usage shows that, overall, patients are not very attracted to smart objects. 71.8% of respondents reported that they used only the Internet, 1 to 3 times a month. 19.3% said they used both the Internet and mobile applications on a weekly basis. Only 8.9% use smart objects in addition to the Internet and apps.

Read on I’MTech Healthcare: what makes some connected objects a success and others a flop?

The study therefore shows that uses vary widely and that a certain proportion of patients fall into a “multi-technology” category. However, “contrary to what we might think, the third group comprising the most connected respondents is not necessarily made up of the youngest people,” indicates Christine Balagué, holder of the Smart Objects and Social Networks chair. In the 25-34 age group, the study found “almost no difference between the three technology use groups (20% of each use group is in this age group)”. The desire for digital health solutions is therefore not a generational issue.

Digital technology: a relative benefit for patients?

The specificity of the study is that it cross-references the use of digital technology (Internet, mobile applications and smart objects) with variables commonly used in the literature to characterize patients’ behavior towards their health. This comparison revealed a new result: the patients who use technology the most are, on average, no more knowledgeable about their disease than patients who are not very connected. Nor are they any better at adopting preventive behavior related to their disease.

“On the other hand, the more connected patients are, the greater their ability to take action in the management of their disease,” says Christine Balagué. Patients in the most connected category believe they are better able to make preventive decisions and to reassure themselves about their condition. However, technology has little impact on the patient-doctor relationship. “The benefit is relative: there is a difference between the benefit perceived by the patient and the reality of what digital tools provide,” concludes Christine Balagué.

Some of the criteria measured by the researchers nevertheless show a correlation with the degree of technology use and the number of technologies used. This is the case, for example, with patient empowerment. Notably, the most connected patients are also those who most frequently take the initiative to ask their doctor for information or to give their opinion about treatment. These patients also report being the most involved by the doctor in their medical care. On this point, the study concludes that:

“The use of technology […] does not change the self-perception of chronically ill patients, who all feel equally knowledgeable about their disease regardless of their use of digital technology. On the other hand, access to this information may subtly change their position in their interactions with the medical and nursing teams, leading to a more positive perception of their ability to play a role in decisions concerning their health.”

The flip side of the coin

Although information found on the Internet offers genuine benefits in the relationship with the medical profession, the use of technologies also has some negative effects, according to patient feedback. 45% believe that the use of digital technology has negative emotional consequences. “Patients find that the Internet reminds them of the disease on a daily basis, and that this increases stress and anxiety,” says Christine Balagué. This result may be linked to the type of use among the chronically ill. The vast majority of them generally search for stories from other people with similar pathologies, which frequently exposes them to the experiences of other patients and their relatives.

Patients consider personal stories to be the most reliable source of information, ahead of content provided by health professionals and patient associations, a preference explained by the sheer amount of information available and its uneven quality. Three quarters of respondents indicated that it is difficult to identify and choose reliable information. This sense of mistrust is underlined by other data collected by the researchers through the questionnaire: “71% believe that the Internet is likely to induce self-diagnosis errors.” In addition, a certain proportion of patients (48%) also express mistrust about how certain websites and mobile applications handle their privacy. This highlights the challenge for applications and websites of being more transparent about the use of personal data and respect for privacy, in order to gain users’ trust.

Read on I’MTech Ethical algorithms in health: a technological and societal challenge

The future development of dedicated web services and patient usage is an issue that researchers want to address. “We want to continue this work of collecting experiences to evaluate changes in use over time,” says Christine Balagué. The continuation of this work will also integrate other developing uses, such as telemedicine and its impact on patients’ quality of life. Finally, the researchers are also considering taking an interest in the other side: the doctors’ side. How do practitioners use digital technologies in their practice? What are the benefits in the relationship with the patient? By combining the results from patient and physician studies, the aim will be to obtain the most accurate portrait possible of patient-physician relationships and of treatment processes in the era of hyperconnectivity.

 

 


What is cyber sovereignty?

Sovereignty is a concept that is historically linked to the idea of a physical territory, whereas the digital world is profoundly dematerialized and virtual. So what does the notion of cyber sovereignty mean? It combines the economic strength of online platforms, digital technologies and regulations based on new societal values. Francis Jutand, Deputy CEO of IMT and member of the Scientific Council of the Institut de la Souveraineté Numérique (Institute of Cyber Sovereignty), presents his view on the foundations of this concept.

 

What does it mean to be “sovereign”?

Francis Jutand: The notion of sovereignty can apply to individuals, companies or nations. To be sovereign is to be able to choose. This means being able to both understand and act. Sovereignty is therefore based on a number of components for taking action: technological development, economic and financial autonomy (and therefore power), and the ability to influence regulatory mechanisms. In addition to these three conditions, there is security, in the sense that being sovereign also means being in a space where you can protect yourself from the potential hostility of others. The fifth and final parameter of sovereignty for large geographical areas, such as nations or economic spaces, is the people’s ability to make their voices heard.

How does this notion of sovereignty apply in the case of digital technology?

FJ: The five components of the ability to act carry over naturally to this field. Being sovereign in a digital world means having our own technology and being independent from the major economic players in the sector, such as Google, with their huge financial capacity. It also means developing specific regulations on digital technology and being able to protect against cyber-attacks. As far as the general public is concerned, sovereignty consists in training citizens to understand and use digital technology in an informed way. Based on these criteria, three main zones of cyber sovereignty can be defined around three geographical regions: the United States, Europe and China.

What makes these zones of sovereignty so distinct?

FJ: The American zone is based on economic superpowers and a powerful national policy on security and technology operated by government agencies. On the other hand, the state of their regulation in the cyber field is relatively weak. China relies on an omnipresent state with strict regulation and major investments. After lagging behind scientifically and industrially in this area, China has caught up over the past few years. Lastly, Europe has good technological skills in both industry and academia, but is not in a leading position. In its favor, the region of European sovereignty has strong market power and pioneering regulations based on certain values, such as the protection of personal data. Its biggest weakness is a lack of economic leadership that could give rise to global digital players.

How is the concept of sovereignty embodied in concrete terms in Europe?

FJ: Europe and its member countries are already investing heavily in the digital field, through the European Framework Programmes as well as national programs and ongoing academic research. On the other hand, the small number of world-class companies in this field weakens the potential for research and for fruitful collaboration between the academic and industrial worlds. The European Data Protection Board, which is composed of the national data protection authorities of the European Union member states, is another illustration of sovereignty at work in the European zone. However, when it comes to regulations on competition law and financial regulation, Europe is still lagging behind in the development of laws and is unassertive in their interpretation. This makes it vulnerable to lobbying, as shown by the debates on the European Copyright Directive.

How does the notion of cyber sovereignty affect citizens?

FJ: Citizens are consumers and users of cyber services. They play a major role in this field, as most of their activities generate personal data. They are a driving force of the digital economy, which, we must remember, is one of the five pillars of sovereignty. This data, which directly concerns users’ identity, is also governed by regulations. Citizens’ expression is therefore very important in the constitution of an area of sovereignty.

Why is the academic world concerned by this issue of cyber sovereignty?

FJ: Researchers, whether from IMT or other institutions, have insights to offer on cyber sovereignty. They are at the forefront of the development and mastery of new technology, which is one of the conditions of sovereignty. They train students and work with companies to disseminate this technology. IMT and its schools are active in all these areas. We therefore also have a role to play, notably by using our neutrality to inform our parliamentarians. We have experimented along these lines with a first event for deputies and senators on the theme of technological and regulatory sovereignty. Our researchers discussed the potential impacts of technology on citizens, businesses and the economy in general.

 

Chantal Morley

Institut Mines-Télécom Business School | Management, Information systems, Gender studies

Professor Chantal Morley is a faculty member of Institut Mines-Télécom Business School. She holds a PhD in Information Systems Management from HEC Paris and an accreditation to direct research (HDR) from IAE Montpellier II. She previously served as a consultant on IT projects (Steria, CGI) and has published several books on project management and information systems modelling. After graduating from EHESS (sociology of gender), she has been working since 2005 on gender and information technology in the Gender@IMT research group. Her research focuses on the male gendering of the computing field, the dynamics of IT stereotyping, the inclusion of women in digital occupations, and feminist approaches to research. In 2018, she developed a MOOC on Gender Diversity in Digital Occupations.



Indoor Air: underestimated pollutants

While some sources of indoor air pollution are well known, there are others that researchers do not yet fully understand. This is the case for cleaning products and essential oils. The volatile organic compounds (VOCs) they release, and the dynamics of those compounds within buildings, are being studied by chemists at IMT Lille Douai.

When it comes to air quality, staying indoors does not keep us safe from pollution. “In addition to outdoor pollutants, which enter buildings, there are the added pollutants from the indoor environment! A wide variety of volatile organic compounds are emitted by building materials, paint and even furniture,” explains Marie Verriele Duncianu, researcher in atmospheric chemistry at IMT Lille Douai. Compressed wood bound with resin, often used to make indoor furniture, is one of the leading sources of formaldehyde. In fact, indoor air is generally more polluted than outdoor air. This observation is not new; it has been the focus of numerous information campaigns by environmental agencies, including ADEME and the OQAI, the French monitoring center for indoor air quality. However, recent results from academic research tend to show that the sources of indoor pollutants are still underestimated and their emissions poorly understood.

“In addition to sources from construction and interior design, many compounds are emitted by the occupants’ activities,” the researcher explains. Little research has been conducted on sources of volatile organic compounds such as cleaning products, cooking, and hygiene and personal care products. Unlike their counterparts produced by furniture and building materials, these pollutants originating from residents’ products are much more dynamic. While a wall constantly emits small quantities of VOCs, a cleaning product spontaneously emits a quantity up to ten times more concentrated. This rapid emission makes measuring concentrations and identifying sources much more complex.

Since they are not as well known, these pollutants linked to users are also less controlled. “They are not taken into account in regulations at all,” explains Marie Verriele Duncianu. “The only legislation related to this issue is the legislation for nursery schools and schools, and the legislation requiring a label for construction materials.” Since January 1, 2018, institutions that host children and young people have been required to monitor the concentrations of formaldehyde and benzene in their indoor air. However, no measures have been imposed regarding the sources of these pollutants. Meanwhile, ADEME has issued a series of recommendations advocating the use of green cleaning products for cleaning floors and buildings.

The green product paradox

These recommendations come at a time when consumers are becoming increasingly responsible in their purchases, including of cleaning products. Certain cleaning products benefit from an Ecolabel, for example, guaranteeing a smaller environmental footprint. However, the emissions of pollutants from these environmentally friendly products have not been studied any more than those of their label-free counterparts. Supported by marketing arguments alone, products featuring essential oils are hailed as beneficial without any evidence to back this up. Simply put, researchers do not yet have a good understanding of the indoor pollution caused by cleaning products, whether traditional or presented as green. However, it is fairly easy to find false information claiming the opposite.

In fact, it was after seeing preconceived notions and “miracle” properties touted on consumer websites that Marie Verriele Duncianu decided to start a new project called ESSENTIEL. “My fellow researchers and I saw statements claiming that essential oils purified indoor air,” the researcher recalls. “On some blogs, we even read consumer testimonials about how essential oils eliminate pollutants. It’s not true: while they do have the ability to clean the environment in terms of bacteria, they definitely do not eliminate air pollutants. On the contrary, they add more!”

In the laboratory, the researchers are studying the behavior of products featuring essential oils. What VOCs do they release? How are they distributed in indoor air?

 

Essential oils are in fact high in terpenes. These molecules are allergenic, particularly for the skin. They can also react with ozone to form fine particles or formaldehyde. By focusing on essential oils and the molecules they release into the air, the ESSENTIEL project aims to help remedy this lack of knowledge about indoor pollutants. The researchers are therefore pursuing two objectives: understanding how emissions of volatile organic compounds from essential oils behave, and determining the risks related to these emissions.

The initial results show unusual emission dynamics. For floor cleaners, “there is a peak concentration of terpenes during the first half-hour following use,” explains Shadia Angulo Milhem, a PhD student working on the project with Marie Verriele Duncianu’s team. “Furthermore, the concentration of formaldehyde begins to increase steadily four hours after the cleaning activity.” Formaldehyde is a highly regulated substance because it is an irritant and is carcinogenic in cases of high and repeated exposure. The concentrations measured up to several hours after the use of cleaning products containing essential oils can be attributed to two factors: first, terpenes react with ozone to form formaldehyde; second, the formaldehyde donors used as preservatives and biocides in the cleaning products decompose.
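To give an intuition for why a pulse-like emission such as a floor-cleaning event fades over a few hours in a ventilated room, indoor air chemists often reason with a single-zone mass-balance model. The sketch below is only an illustrative toy model with made-up values (room volume, air-exchange rate, emitted mass); it does not reproduce the study’s measurements, nor the delayed chemistry that produces formaldehyde:

```python
import math

def concentration(t_hours: float, emitted_ug: float,
                  volume_m3: float = 30.0, air_changes_per_h: float = 0.5) -> float:
    """Concentration (µg/m³) after an instantaneous, well-mixed emission pulse at t = 0,
    assuming the only loss process is dilution by ventilation: C(t) = C0 * exp(-a*t)."""
    c0 = emitted_ug / volume_m3                      # initial well-mixed concentration
    return c0 * math.exp(-air_changes_per_h * t_hours)

for t in (0, 0.5, 2, 4):
    print(f"t = {t:>3} h : {concentration(t, emitted_ug=3000):6.1f} µg/m³")
# 100.0, 77.9, 36.8, 13.5 µg/m³: the terpene pulse decays within hours,
# while secondary formaldehyde (not modeled here) appears later.
```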

A move towards regulatory thresholds?

As part of the ESSENTIEL project, the researchers have not only measured cleaning products containing essential oils; they have also studied essential oil diffusers. The results show characteristic emissions for each device. “Reed diffusers, which are small bottles containing wooden sticks, take several hours to reach full capacity,” Shadia Angulo Milhem explains. “The terpene concentrations then stabilize and remain constant for several days.” Vaporizing devices, on the other hand, which heat the oils, emit more abruptly, resulting in terpene concentrations that persist for less time in the home.

Beyond measuring concentrations, the dynamics of the volatile organic compounds released are difficult to determine. In some buildings, they can be trapped in porous materials and then released later as humidity and temperature change. One of the areas the researchers want to explore in the future is how these compounds are absorbed by indoor surfaces. Understanding the behavior of pollutants is essential for establishing the risks they present: how dangerous a compound is depends on whether it disperses quickly in the air or accumulates for several days in paint or in drop ceilings.

Currently, there are no regulatory thresholds for terpene concentrations in the air, due to a lack of knowledge about the public’s exposure and about short- and long-term toxicity. We must keep in mind that the risk associated with exposure to a pollutant depends on the toxicity of the compound, its concentration in the air and the duration of contact. Upon completion of the ESSENTIEL project, expected in 2020, the project team will provide ADEME with a technical and scientific report. While waiting for legislation to be introduced, the results should at least lead to recommendation sheets on the use of products containing essential oils. This will provide consumers with real information regarding the benefits as well as the potentially harmful effects of the products they purchase, a far cry from pseudo-scientific marketing arguments.