
Medicine for Materials

Did you know that materials have health problems too? To diagnose their structural integrity, researchers are increasingly turning to techniques similar to those used in human medicine. X-rays, temperature checks and ultrasound imaging are just a few of the tools that help detect abnormalities in parts. The advantage of these techniques is that they are non-destructive. Used together, they can provide a great deal of information about a mechanical system without taking it out of service. Salim Chaki is one of the French pioneers in this area. The IMT Lille Douai researcher explains why manufacturers are keeping a close watch on the latest advances in this field.

What is the principle behind non-destructive inspection?

Salim Chaki: It is a set of techniques that can provide information about a part’s state of health without modifying it. Before these techniques were developed, the traditional approach involved cutting up a defective part and inspecting it to identify the defect. With the non-destructive method, the philosophy is the same as in human medicine: we use x-rays and ultrasound, for example, to study what is inside the part, or infrared thermography to take its surface temperature and detect abnormalities. The development of nuclear energy during the post-war period called for this type of technique, since radioactivity introduced new constraints on handling radioactive objects.

Your research relies on a “multi-technical” non-destructive approach. What is its advantage?

SC: Historically, engineers would choose x-rays, ultrasound or other techniques based on their needs. For several decades, manufacturers did not really consider using several techniques simultaneously, whereas the medical world was already taking a more global approach, combining a clinical examination, blood tests, x-rays and possibly further tests to diagnose a patient’s illness. In 2006, we became pioneers by proposing a combination of several techniques to diagnose the structural integrity of composite parts during operation. At that point, manufacturers became very interested, convinced by the high potential of the approach. The possibility of diagnosing a defect without modifying the part, and even without taking it out of service, represents a major economic advantage. We demonstrated the benefit of the non-destructive multi-technical approach by using infrared cameras, optical cameras for measuring deformation fields, and passive acoustic sensors attached to the structure. These sensors pick up the sound of the vibrations emitted by the part when it cracks. Combining several non-destructive techniques thus makes it possible to confirm the diagnosis of a part’s condition: the techniques complement one another and make the diagnosis more reliable.

Salim Chaki was one of the pioneers in this field when he began working on non-destructive multi-technical inspection in 2006.

For manufacturers who have not yet taken this step, is it really that difficult to combine two or more techniques?

SC: Yes, actually implementing several techniques is not necessarily straightforward. There are real-time technical problems related to synchronization: the data collected by one sensor must be able to be correlated both spatially and temporally with the data from the others, which requires all the sensors to be perfectly synchronized during the measurements. There is also a major “data processing” aspect: infrared cameras, for example, record very large volumes of imaging data, and the storage and processing of these data must be managed. Finally, the interpretation process requires multiple skills, since the data originate from different sensors belonging to different fields—optics, acoustics, heat science. However, we are currently working on data processing algorithms that would facilitate the use and interpretation of these data in industrial settings.
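
To give a concrete, simplified picture of the temporal side of this problem, here is a minimal Python sketch that pairs each acoustic event with the infrared frame captured closest in time. It is an illustration only; real acquisition chains synchronize sensors at the hardware level with shared clocks or triggers, and the timestamps below are invented:

```python
from bisect import bisect_left

# Invented timestamped data: IR frames at 25 Hz and two acoustic-emission events.
ir_frames = [(0.00, "frame0"), (0.04, "frame1"), (0.08, "frame2"), (0.12, "frame3")]
acoustic_events = [(0.051, "possible crack"), (0.115, "possible crack")]

def nearest_frame(frames, t):
    """Return the frame whose timestamp is closest to time t."""
    times = [ft for ft, _ in frames]
    i = bisect_left(times, t)
    candidates = frames[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda frame: abs(frame[0] - t))

for t, label in acoustic_events:
    ft, frame = nearest_frame(ir_frames, t)
    print(f"acoustic event at t={t:.3f} s ({label}) <-> IR {frame} at t={ft:.2f} s")
```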

What are the concrete applications of non-destructive multi-technical inspections?

SC: One of the most interesting applications involves pressure vessels—typically gas storage tanks. Regulations require that they be inspected periodically to assess their condition and decide whether they should remain in use. The non-destructive multi-technical approach not only allows this inspection to take place without emptying the tank and taking it out of service each time, it could also be used to forecast the device’s remaining useful life. This is currently one of the major issues in our research. However, the multi-technical approach is still fairly recent, so not many industrial applications exist yet. We believe, though, that the future will be more conducive to multi-technical processes, which will make inspection more reliable, an aspect repeatedly requested by industrial equipment and plant operators, as well as by the administrative authorities responsible for their safety.

What are your lines of research now that manufacturers have begun adopting these techniques?

SC: First of all, it is important to continue our efforts to convince manufacturers of the advantages of multi-technical inspection, particularly its increased reliability. There is no universal technique that offers a comprehensive diagnosis of a part’s condition. This introduces another interesting parallel with human medicine: it would be unrealistic to think a single test could detect everything. Also, as I said earlier, we are trying to go beyond diagnosis by estimating a part’s remaining useful life based on non-destructive measurements carried out while the part is in service. Very soon we will extend this concept to inspecting a part’s initial health condition. The goal is to quickly predict whether a part is healthy, starting at the production phase, and determine the duration of its service life. This is known as predictive maintenance.

Is the analysis of the data collected from all of the combined techniques also a research issue?

SC: Yes, of course! Since IMT Lille Douai was founded in 2017 through the merger of Télécom Lille and Mines Douai, new perspectives have opened up through the synergy between our expertise in non-destructive testing of materials and our computer science colleagues’ specialization in data processing. The contribution of artificial intelligence algorithms and big data techniques to processing large volumes of data is crucial for anticipating anomalies in predictive maintenance. If we could streamline the prognosis using these digital tools, it would be a major advantage for industrial applications.


Q4Health: a network slice for emergency medicine

How can emergency response services be improved? The H2020 Q4Health project tackled this question. The European consortium, which includes EURECOM, the University of Malaga and RedZinc, demonstrated the possibility of relaying video between first responders at an emergency scene and doctors located remotely. To do so, the researchers had to develop innovative tools for 4G network slicing. This work has paved the way for applications in other types of services and lays the groundwork for 5G.

Doctors are rarely the first to intervene in emergency situations. In the event of traffic accidents, strokes or everyday accidents and injuries, victims first receive care from nearby witnesses. The response chain is such that citizens then usually hand the situation over to a team of trained first responders — which does not necessarily include a doctor — who then bring the victim to the hospital. But before the patient reaches the doctor for a diagnosis, time is precious. Patients’ lives depend on medical action being taken as early as possible in this chain. The European H2020 Q4Health project studied a video streaming solution to provide doctors with real-time images of victims at the emergency scene.

The Q4Health project, which started in January 2016 and was completed in December 2017, faced the challenge of ensuring that the video stream was of high enough quality to make a diagnosis. To this end, the project consortium, which includes EURECOM, the University of Malaga in Spain and the project leader, the SME RedZinc, proved the feasibility of programming a mobile 4G network that can be virtually sliced. The network “slice” created in this way includes all the functions of the regular network, from its structural portion (antennas) to its control software. It is isolated from the rest of the network and reserved for communication between emergency response services and nearby doctors.

Navid Nikaein, a communication systems researcher at EURECOM, states that “the traditional method of creating a network slice consists of establishing a contract with an operator who guarantees the quality of service for the slice.” But there is a problem with this sort of system: emergency response services do not have complete control over the network; they remain dependent on the operator. “What we have done with Q4Health is to give real control to emergency response services over inbound and outbound data traffic,” adds the researcher.

Controlling the network

In order to carry out this demonstration, the researchers developed application programming interfaces (APIs) for the infrastructure network (the central portion of the internet, which interconnects all the access points) and for the mobile network that connects 4G devices, such as telephones, to an access point (referred to as an access network). These programming interfaces allow emergency response services to define priority levels for their members. The service can use the SIM card associated with a firefighter’s or paramedic’s professional mobile phone to identify the user’s network connection. Via the API, it can then grant the paramedic privileged access to the network, enabling dynamic use of the slice reserved for emergency services.
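
To make this concrete, here is a hypothetical Python sketch of the kind of policy such an interface could express: professional SIM identifiers mapped to priority levels on the reserved slice. All identifiers, field names and values are invented for the example and do not describe the actual Q4Health API:

```python
SLICE_ID = "emergency-video"

# Hypothetical mapping of professional SIM identifiers (IMSIs) to priorities.
priority_policy = {
    "208011234567890": {"role": "paramedic",   "priority": 1},  # highest priority
    "208019876543210": {"role": "firefighter", "priority": 2},
}

def admit(imsi):
    """Decide how a connecting device is treated: reserved slice or default network."""
    entry = priority_policy.get(imsi)
    if entry is None:
        return {"slice": "default", "priority": 9}   # ordinary traffic
    return {"slice": SLICE_ID, **entry}              # privileged access to the slice

print(admit("208011234567890"))  # paramedic's SIM -> emergency slice, priority 1
print(admit("208555550000001"))  # unknown SIM -> default network
```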

In the Q4Health project, this privileged access allows first responders to stream video independently of other data traffic in the area, which is a great advantage in crowded places. Without such privileged access, in a packed stadium for example, it would be impossible to transmit high-quality video over a 4G network. To ensure the quality of the video stream, a system analyzes the radio rate between the antenna and the first responders’ device — for the Q4Health project, this is not necessarily a smartphone but glasses equipped with a camera to facilitate emergency care. The video rate is then adjusted depending on the radio rate. “If there is a lower radio rate, video processing is optimized to prevent deterioration of image quality,” explains Navid Nikaein.
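
The rate-adaptation principle can be sketched in a few lines of Python; the bitrate ladder and the 20% safety margin below are illustrative assumptions, not the project’s actual parameters:

```python
# Candidate video bitrates in kbit/s, e.g. from low to high resolution.
BITRATE_LADDER = [500, 1200, 2500, 4500]

def select_video_bitrate(radio_rate_kbps, headroom=0.8):
    """Pick the highest bitrate that fits under the measured radio rate,
    keeping 20% headroom so the stream survives short fades."""
    budget = radio_rate_kbps * headroom
    feasible = [b for b in BITRATE_LADDER if b <= budget]
    return feasible[-1] if feasible else BITRATE_LADDER[0]

for radio_rate in (800, 3000, 6000):   # measured antenna <-> device rates
    print(f"{radio_rate} kbit/s radio -> {select_video_bitrate(radio_rate)} kbit/s video")
```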

Through this system, first responders are able to give doctors a real-time view of the situation. These may be doctors at the hospital to which the patient will be transported, or volunteer doctors nearby who are available to provide emergency assistance. They obtain not only visual information about the victim’s condition, which facilitates diagnosis, but also a better understanding of the circumstances of the accident by observing the scene. They can therefore guide non-physician responders through delicate actions, or even allow them to perform treatment that could not be carried out without a doctor’s consent.

Beyond its medical application, Q4Health has above all proved the feasibility of network slicing through a control protocol in which the service provider, rather than the operator, has control. This demonstration is of particular interest for the development of 5G networks, which will require network slicing. “As far as I know, the tool we have developed to achieve this result is one of the first of its kind in the world,” notes Navid Nikaein. Highlighting these successful results, achieved in part thanks to EURECOM’s OpenAirInterface and Mosaic5G platforms, the researcher adds: “Week after week, we are increasingly contacted about using these tools.” This has opened up a wide range of prospects for use cases, representing opportunities to accelerate 5G prototyping. In addition to emergency response services, many other sectors could be interested in this sort of network slicing, starting with security services and transport systems.


Philosophy of science and technology in support of political ecology

Fabrice Flipo, a philosopher of science and technology and researcher at Institut Mines-Télécom Business School, has specialized in political ecology, sustainable development and social philosophy for nearly 20 years. Throughout the fundamental research that shapes his more technical teaching, he tries to produce an objective view of current political trends, the ecological impact of digital technology and an understanding of the world more broadly.


For Fabrice Flipo, the philosophy of science and technology can be defined as the study of how truth is created in our society. “As a philosopher of science and technology, I’m interested in how knowledge and know-how are created and in the major trends in technical and technological choices, as well as how they are related to society’s choices,” he explains. It is therefore necessary to understand technology, the organization of society and how politics shapes the interaction between major world issues.

The researcher shares this methodology with students at Institut Mines-Télécom Business School, in his courses on major technological and environmental risks and his introductory course on sustainable development. He helps students analyze the entire ecosystem surrounding some of the most disputed technological and environmental issues (ideas, stakeholders, players, institutions etc.) of today and provides them with expertise to navigate this divisive and controversial domain.

Fundamental research to understand global issues

This is why Fabrice Flipo has focused his research on political ecology for nearly 20 years. Political ecology, which first appeared in France in the 1960s, strives to profoundly challenge France’s social and economic organization and to reconsider the relationship between man and his environment. It is rooted in the ideas of a variety of movements, including feminism, third-worldism, pacifism and self-management.

Almost 40 years later, Fabrice Flipo seeks to explain and provide insight into this political movement by examining how its emergence has created controversies with other political movements, primarily liberalism (free-market economics), socialism and conservatism. “I try to understand what political ecology is, and the issues involved, not just as a political party of its own, but also as a social movement,” explains the researcher.

Fabrice Flipo carries out his research in two ways. The first is a traditional approach to studying political theory, based on analyzing arguments and debates produced by the movement and the issues it supports. This approach is supplemented by ongoing work with the Laboratory of Social and Political Change at the University of Paris 7 Diderot and other external laboratories specializing in the subject. He works in collaboration with an interdisciplinary team of engineers, sociologists and political scientists to examine the relationship between ICT (Information and Communication Technologies) and ecology. He also involves networks linked to ecology to expand this collaboration, works with NGOs and writes and appears in specialized or national media outlets. For some of his studies, he also draws on a number of different works in other disciplines, such as sociology, history or political science.

The societal impact of political ecology

“Today political ecology is a minor movement compared to the liberal, socialist and conservative majorities,” says the researcher. Indeed, despite growing awareness of environmental issues (COP21, the development of a trade press, the energy transition for companies, the adoption of “greener” lifestyles, etc.), the environmental movement has not had a profound effect on the organization of industrialized human societies, so it needs to be more convincing. Its minority status on the political spectrum makes it necessary to present arguments. “Can political ecology be associated with liberalism, socialism or even conservatism?” asks the researcher. “Although it does not belong to any of the existing currents, each of them tries to claim it for its own.”

More than just nature is at stake. A major ecosystem crisis could open the door to an authoritarian regime seeking to defend the essential foundation of a particular society against all others. This sort of eco-fascism would strive to protect resources rather than nature (and could not therefore be considered “environmentalism”), pitting one society against another. Political ecology is therefore firmly aligned with freedom.

To stay away from extremes, “the challenge is to carry out basic research to better understand the world and political ideas, and to go beyond debates based on misunderstandings or overly-passionate approaches,” explains Fabrice Flipo. “The goal is to produce a certain objectivity about political currents, whether environmentalism, liberalism or socialism. The ideas interact with, oppose, and are defined by one another.”

Challenging the notion that modernity is defined by growth and a Cartesian view of nature, the study of political ecology has led Fabrice Flipo to philosophical anthropological questions about freedom.

[box type=”shadow” align=”” class=”” width=””]

Analyzing the environmental impact of digital technology in the field

Political ecology raises questions about the ecology of infrastructures. Fabrice Flipo has begun fieldwork with sociologists on an aspect of digital technology that has been little studied overall: the environmental impacts of making human activities paper-free, the substitution of functions and “100% digital” systems.

Some believe that we must curb our use of digital technologies since manufacturing these devices requires great amounts of energy and raw materials and the rise of such technology produces harmful electronic waste. But others argue that transitioning to an entirely digital system is a way to decentralize societies and make them more environmentally-friendly.

Through his research project on recovering mobile phones (based on the idea that recycling helps reduce planned obsolescence), Fabrice Flipo seeks to highlight existing solutions in the field that are underused in a market that gives priority to the latest products and constant renewal.[/box]

Philosophy to support debates about ideas

“Modernity defines itself as the only path to develop freedom (the ability to think), control nature, technology, and democracy. The ecological perspective asserts that it may not be that simple,” explains the researcher. “In my different books I’ve tried to propose a philosophical anthropology that considers ecological questions and different propositions offered by post-colonial and post-modern studies,” he continues.

Current societal debates prove that ecological concerns are a timely subject, underscore the relevance of the researcher’s work in this area, and show that there is growing interest in the topic. Based on the literature, it would appear that citizens have become more aware of available solutions (electric cars, solar panels etc.) but have been slow to adopt them. Significant contradictions between the majority call to “produce more and buy more” and the minority call encouraging people to be “green consumers” as part of the same public discourse make it difficult for citizens to form their own opinions.

“So political ecology could progress through an open debate on ecology,” concludes Fabrice Flipo, “involving politicians, scientists, journalists and specialists. The ideas it champions must resonate with citizens on a cultural level, so that they can make connections between their own lifestyles and the ecological dimension.” Extensive public communication, to which the researcher contributes through his work, coupled with citizens’ greater internalization and understanding of these issues and ideas, could help spark a profound, far-reaching societal shift towards true political ecology.

[author title=”Political ecology: The common theme of a research career” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/02/Fabrice-Flipo_format_en_hauteur.jpg”]A philosopher of science and technology, Fabrice Flipo is an associate research professor accredited to direct research in social and political philosophy and specializes in environmentalism and modernity. He teaches courses in sustainable development and major environmental and technological risks at Télécom École de Management, and is a member of the Laboratory of Social and Political Change at the University of Paris Diderot. His research focuses on political ecology, philosophical anthropology of freedom and the ecology of digital infrastructures.

He is the author of many works including: Réenchanter le monde. Politique et vérité “Re-enchanting the world. Politics and truth” (Le Croquant, 2017), Les grandes idées politiques contemporaines “Key contemporary political ideas” (Bréal, 2017), The ecological movement: how many different divisions are there?  (Le Croquant, 2015), Pour une philosophie politique écologiste “For an ecological political philosophy” (Textuel, 2014), Nature et politique (Amsterdam, 2014), and La face cachée du numérique “The Hidden Face of Digital Technology” (L’Echappée, 2013).[/author]

What nuclear risk governance exists in France?

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]I[/dropcap]t will take a long time to learn all the lessons from the Fukushima accident, and even longer to bring about a change in the practices and principles of nuclear risk governance. Yet several major themes are already emerging in France in this respect.

Next Sunday, March 11, 2018, will mark the seventh anniversary of the Fukushima disaster, when the northeast coast of Japan was struck by a record magnitude-9 earthquake, followed by a tsunami. These natural disasters led to an industrial disaster: a nuclear accident rated 7, the highest level on the INES scale, at the Fukushima Dai-ichi nuclear power plant.

In the aftermath of the disaster, the world was stunned by the realization of the seriousness and suddenness of this event, which, according to Jacques Repussard, Director General of the French Institute for Radiological Protection and Nuclear Safety (IRSN), calls on us to “imagine the unimaginable and prepare for it.” It confronts all those involved in nuclear safety with a critical challenge: how can we guarantee safety in the midst of unexpected events?

Beyond its unpredictable nature, this accident served as a brutal and particularly relevant reminder that nuclear energy, more than any other technology or industry, transcends all borders, whether they be geographic, temporal, institutional or professional. The consequences of nuclear accidents extend well beyond the borders of a region or a country and remain present for hundreds or even thousands of years, thus exceeding any “human” time scale.

La Hague nuclear waste reprocessing plant. Jean Marie Taillat/Wikimedia, CC BY-SA


Fukushima revealed that the safety of socio-technical systems of this level of complexity cannot be left to a few stakeholders alone, nor can it be ensured without creating strong and transparent ties between a multitude of stakeholders, including nuclear operators, citizens, safety authorities, technical support bodies and government services. Fukushima calls into question the nature and quality of the relationships between these multiple stakeholders and demands that we reconsider nuclear risk governance practices, including in France, and rethink the boundaries of the “ecosystem of nuclear safety,” to use the term proposed by Benoît Journé.

Learning from nuclear accidents: a long-term process

Immediately after the accident, the entire international community of experts worked to manage the crisis and to understand the dynamics of the accident in its technical, human and socio-organizational aspects. A few months later, the European Commission asked nuclear countries to carry out what it termed “stress tests” aimed at assessing nuclear facilities’ ability to withstand external stress (such as major weather events) and serious technical malfunctions. In France, this led to the launch of complementary safety assessments (ECS) for the country’s nuclear facilities.

While the technical causes of the Fukushima accident were quickly understood, socio-organizational causes were also identified. The Japanese Fukushima Nuclear Accident Independent Investigation Commission found that the “collusion between the government, the regulators and TEPCO, and the lack of governance by said parties” was one of the major causes of the disaster. The accident also highlighted the importance of involving civil society participants in risk prevention and in risk management preparation very early on.

Volunteers from the town of Minamisoma, near the nuclear power plant. Hajime Nakano/Flickr, CC BY


Above all, it reveals the need to plan and equip ourselves for the long-term management of a nuclear accident. Far too often, efforts concentrate on the emergency phase, the days or weeks immediately following the accident, leaving local stakeholders virtually on their own during the “post-accident” phase. Yet this phase involves major problems concerning, for example, the consumption of basic foodstuffs (water, milk, etc.), the displacement of populations and the cultivation of potentially contaminated land.

After the Three Mile Island (1979) and Chernobyl (1986) accidents brought the human and organizational aspects of safety into consideration, Fukushima marks a new era focused on examining inter-organizational relations and the long-term methods for managing nuclear risks.

The need for openness towards civil society

Although the term is sometimes criticized and even mocked as a buzzword, nuclear risk “governance” refers to a very practical reality: all the stakeholders, measures and policies mobilized to guide the decisions made primarily by the public authorities and nuclear operators, in order to better manage nuclear risks and ensure greater transparency about them. This implies reflecting on how each stakeholder can participate, on the material and immaterial resources that could enable this participation, and on the software that could support and help coordinate it.

Public awareness, organized by the Nuclear Safety Authority. ASN, CC BY


In this sense, Fukushima serves as a powerful reminder of the need for greater transparency and greater involvement of civil society participants. Contrary to popular belief, the longstanding institutional stakeholders in the nuclear industry are aware of the need for greater openness to civil society. In 2012 Jacques Repussard stated: “Nuclear energy must be brought out of the secrecy of executive boards and ministerial cabinets.” And as early as 2006, the French Nuclear Safety and Transparency Act confirmed this desire to involve civil society stakeholders in nuclear safety issues, particularly by creating local information committees (CLI), although some regret that this text has only been half-heartedly implemented.

Of course, bringing about a change in practices and pushing back boundaries is no easy thing, since the nuclear industry has often been described, sometimes rightly, as a world frozen in time. It continues to be burdened by its history. For a long time, nuclear safety was an issue reserved for a small group of stakeholders, sometimes referred to as “authorized” experts, and traces of these practices are still visible today. This characteristic is embodied in the extremely centralized organization of safety. Even the French term for a nuclear power plant, “centrale nucléaire,” attests to the prominence given to centralization.

French nuclear power plants. Sting, Roulex_45, Domaina/Wikimedia, CC BY-SA


One thing is for sure, there must be an ongoing dialog between the communities. This implies taking the heat out of the debates and moving beyond the futile and often exaggerated divide between the pro-nuclear and anti-nuclear camps.

A form of governance founded on open dialog and the recognition of citizen expertise is gradually emerging. The challenge for longstanding stakeholders is to help increase this citizen expertise. The AGORAS project (improvement of the governance of organizations and stakeholder networks for nuclear safety) questions governance practices, but also seeks to create a place for dialog and collective reflection. A symposium organized in late 2017 provided the first opportunity for implementing this approach through discussions organized between academic researchers and operational and institutional stakeholders. The 2018 symposium (more information here: colloque2agoras@imt-atlantique.fr) will continue this initiative.


[divider style=”normal” top=”20″ bottom=”20″]

The original version of this article was published in The Conversation.


What is a volatile organic compound (VOC)?

Pollution in urban areas is a major public health issue. While peaks in the concentration of fine particles often make the news, they are not the only urban pollutants. Volatile organic compounds, or VOCs, also present a hazard. Some are carcinogenic, while others react in the atmosphere, contributing to the formation of secondary pollutants such as ozone or secondary aerosols—which are very small particles. Nadine Locoge, a researcher at IMT Lille Douai, reviews the basics about VOCs, reminding us that they are not only present in outdoor air.

What is a volatile organic compound (VOC)?

Nadine Locoge: It is a chemical compound made up primarily of carbon and hydrogen. Other atoms, such as nitrogen or sulfur, can be incorporated into the molecule in variable amounts. All VOCs are volatile at ambient temperature. This is what differentiates them from other pollutants such as fine particles, which are in condensed form at ambient temperature.

Read more on I’MTech: What are fine particles?

How do they form?

NL: On a global scale, nature is still the primary source of VOCs. Vegetation, typically forests, produces 90% of the earth’s total emissions. But in urban settings this trend is reversed, and anthropogenic sources dominate. In cities, the main sources of emissions are automobiles, through both exhaust and fuel evaporation, and various heating methods: oil, gas, wood... Manufacturers are also major sources of VOC emissions.

Are natural VOCs the same as those produced by humans?

NL: No, in general they are not part of the same chemical families. They have different structures, which implies different consequences. The natural types produce a lot of isoprene and terpenes, which are often used for their fragrant properties. Anthropogenic activities, on the other hand, produce aromatic compounds, such as benzene, which is highly carcinogenic.

Why is it important to measure the concentrations of VOCs in the air?

NL: There are several reasons. First, because some have direct impacts on our health. The concentration of benzene in outdoor air, for example, is regulated: it must not exceed an annual average of 5 micrograms per cubic meter. Also, some VOCs react once they are in the air, forming other pollutants. For example, they can generate aerosols—nanoparticles—after interacting with other reactive species. VOCs can also react with atmospheric oxidants and cause the formation of ozone.
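
As a simple illustration of how an annual-average limit of this kind is checked, here is a toy Python calculation; the monthly values are invented:

```python
from statistics import mean

LIMIT_UG_M3 = 5.0  # regulatory annual average for benzene in outdoor air

# Hypothetical monthly benzene measurements, in µg/m³.
monthly_benzene = [3.1, 2.8, 4.0, 5.6, 6.2, 4.9, 3.5, 3.0, 4.4, 5.1, 4.7, 3.9]

annual_mean = mean(monthly_benzene)
status = "compliant" if annual_mean < LIMIT_UG_M3 else "above the limit"
print(f"annual mean: {annual_mean:.2f} µg/m³ -> {status}")  # 4.27 µg/m³ -> compliant
```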

Are VOCs only found in outside air?

NL: No, in fact these species are particularly present in indoor air. All the studies at both the national and European level show that VOC concentrations in indoor air in buildings are higher than outside. These are not necessarily the same compounds in these two cases, yet they pose similar risks. One of the emblematic indoor air pollutants is formaldehyde, which is carcinogenic.

There are several sources of VOCs in indoor air: outdoor air, due to the renewal of indoor air, but also construction materials and furniture, which are particularly significant sources of VOC emissions. Regulation in this area is progressing, particularly through labels on construction materials that take this aspect into account. The legislative aspect is crucial as buildings become more energy efficient, since this often means less air is exchanged in order to retain heat, and therefore the indoor air is renewed less frequently.

How can we fight VOC emissions?

NL: Indoors, in addition to using materials with the lowest possible emissions and ventilating rooms as recommended by ADEME, there are devices that can trap and destroy VOCs. The principle is either to trap them irreversibly, or to make them react in order to destroy them—or more precisely, to transform them into species that do not affect our health, ideally carbon dioxide and water. These techniques are widely used in industrial environments, where emission concentrations are relatively high and the chemical species are not very diverse. But in indoor environments VOCs are more varied and present in lower concentrations, which makes them harder to treat. In addition, the use of these treatment systems remains controversial: if the chemical processes used are not optimized and adapted to the target species, they can cause chemical reactions that generate secondary compounds even more hazardous to human health than the primary species.

Is it possible to decrease VOC concentrations in the outside air?

NL: The measures in this area are primarily regulatory and aim to reduce emissions. Exhaust fumes from automobiles, for example, are subject to emission limits. For sources associated with heating, the requirements vary greatly depending on whether the heating is collective or individual. In general, the methods are ranked according to the amount of emissions. Minimum performance requirements are imposed to optimize combustion, and therefore produce fewer VOCs, and emission limit values have been set for certain pollutants (including VOCs). In general, emission-reduction targets are set at the international and national levels and are then broken down by industry.

In terms of ambient concentrations, there have been some experiments in treating pollutants—including volatile organic compounds—such as the tunnel in Brussels whose walls and ceiling were covered with a cement-based photocatalytic coating. Yet the results of these tests have not been convincing. It is important to keep in mind that in ambient air the sources of VOCs are numerous and diffuse, so it is difficult to lower concentrations. The best method is still to reduce the quantity of emissions directly.


20 words for understanding digital trust

The issue of digital trust has never been more relevant. The 15th meeting of the IMT Personal Data Values and Policies Chair, held on March 8, a few weeks before the European General Data Protection Regulation (GDPR) enters into force, was devoted to presenting the Chair’s book “Signes de confiance : l’impact des labels sur la gestion des données personnelles” (Signs of Trust: The impact of labels on personal data management). Here is a closer look at some key terms surrounding digital trust.

API – Application Programming Interface, an interface that enables the user to connect to an application in order to access the data it produces.

Auditability – The availability of proof that information has been supplied in an authenticated and non-repudiated manner.

Audit trail – The complete history of a transaction.

Blockchain – Technology that stores and transfers intangible assets without an intermediary in a transparent, secure, reliable and inalterable manner (a minimal illustration follows this glossary).

Read more on our blog: What is a blockchain?

Confidence – Trust related to a social context and established institutions.

Consortium – Refers to a hybrid blockchain that is not public, involving participants with different rights.

Crypto-currency – Electronic currency used in a peer-to-peer or decentralized computer network that relies on the principles of cryptography to validate transactions and issue currency.

Decentralized autonomous organization – A program that maintains an organization’s governance by embedding it into a blockchain. It involves several smart contracts (see definition below) that interact together.

Decentralized consensus – Mechanisms used to ensure that all the nodes within a network have the same information available and the same overall internal status.

Distributed Ledger Technology (DLT) – Refers to private blockchains and consortia.

Immutability – The property of being inalterable once created.

Ledger – A book of accounts, a register. A blockchain is a decentralized ledger or register.

Oracle – A service that gathers data from one or more sources (private or public databases, social networks, etc.) and submits it for use by smart contracts (see definition below).

Pseudonymity – An individual’s ability to prove a coherent identity without providing a real name.

Side chain – A secondary blockchain attached to the primary one that can be used to increase the (otherwise limited) volume of information the blockchain can process.

Smart contracts – Autonomous programs that automatically apply the terms of a contract without requiring any human intervention once initiated.

Token – Generic name for a transactional information unit within a blockchain, which does not necessarily refer to the idea of currency.

Transaction – Refers to an operation involving the transfer of assets or information between two participants.

Trusted Third Party – An entity authorized to perform transactions that must remain confidential and secure on behalf of a third party.
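
To tie several of these definitions together (blockchain, immutability, ledger, transaction), here is a toy Python sketch of a hash-chained ledger. It is an illustration only; a real blockchain adds a consensus mechanism and a peer-to-peer network:

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class Ledger:
    """Append-only chain: each block commits to the hash of its predecessor."""
    def __init__(self):
        self.chain = [{"index": 0, "transactions": [], "previous_hash": "0" * 64}]

    def add_block(self, transactions):
        previous = self.chain[-1]
        self.chain.append({"index": previous["index"] + 1,
                           "transactions": transactions,
                           "previous_hash": block_hash(previous)})

    def is_valid(self):
        """Tampering with any past block breaks the hash link that follows it."""
        return all(self.chain[i]["previous_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add_block([{"from": "alice", "to": "bob", "tokens": 1}])
ledger.add_block([{"from": "bob", "to": "carol", "tokens": 1}])
print(ledger.is_valid())                              # True
ledger.chain[1]["transactions"][0]["tokens"] = 100    # rewrite history...
print(ledger.is_valid())                              # False: the alteration is detected
```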

To find out more about this topic, check out our series on trust in the digital age.



The end of roaming charges in the European Union: a cure-all solution?

Patrick Maillé, IMT Atlantique – Institut Mines-Télécom (IMT), and Bruno Tuffin, Inria

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]T[/dropcap]he European Union has required wireless network operators to stop charging roaming fees during trips to other EU countries. For nomadic users who regularly travel throughout Europe, this added comfort is truly appreciated: no more fear of additional charges. However, while the benefit is real, some questions about roaming costs remain.

Respecting European unity

Before the end of roaming fees in June 2017, your mobile plan let you communicate within your country and included a maximum amount of internet data you could consume (once it was depleted, you would either be charged additional fees or your service would be restricted). Any travel outside your country of origin involved an additional flat-rate fee or charges based on volume. This situation limited communication and went against the European spirit of unity. To remedy this, in October 2016 the European Commission approved a law prohibiting operators from charging their users for communications and data usage while traveling abroad.

The goal of this decision was clearly established: create a single open market for electronic communications. Now when you travel, your usage is charged to your plan exactly as it is in your country of origin. This means no more fear of extra fees, including for data usage: no need to wait until you find Wi-Fi access, since 3G and 4G networks can now be used without any unpleasant surprises. This new system required agreements between the different operators and countries, transparent to users, in order to locate mobile phones and route communications.

To prevent any unfair competition within the EU and prevent citizens from choosing a plan from the least expensive country, the rule was established that users must take out a plan in their own country, which is defined as the country where they spend the most time. In addition, roaming usage must be “reasonable”.

Completely free roaming?

As mentioned, “free” roaming is guaranteed by the law only “within a reasonable limit of use”. Specifically, operators can set a limit on mobile internet roaming without additional fees in order to prevent usage, and the associated costs, from skyrocketing. However, this limit must be controlled by regulation and the user must be clearly informed. The framework that applies abroad is therefore not necessarily the same as in the user’s country. In addition, the roaming rules only apply to services within the European Economic Area (EEA); your plan may therefore include services intended for countries outside the EEA which only apply when you are in your country of origin.

It is also worth noting that one step is still missing to truly achieve a single market and real freedom within the EU. In general, calling another EU country from your own country is not included in your mobile plan and incurs additional costs, so a distinction is still made within the European Community. Similarly, if you make such a call while traveling, the call is not counted within your plan but is charged as if you were calling from your country of origin, which could potentially fall outside your plan; yet it would be natural to be able to call a local restaurant to make a reservation without paying extra fees.

Therefore, integrating these additional aspects, in other words no longer differentiating between a call from or to another EU country, could be the final step towards achieving a fully open market perceived by users as a single market.

A risk of rising rates?

Another aspect to monitor is how this new rule will impact the price of users’ plans: is there a risk that it will lead to a rise in prices, through an averaging effect in which those who rarely travel pay for those who travel frequently? This potential risk has been highlighted in scientific publications through theoretical modeling and game theory. Operators’ income could also decrease. It is still too soon since the application of this new regulation to effectively assess its impact, but all these reasons clearly show that we will need to pay special attention to how prices change.

[divider style=”dotted” top=”20″ bottom=”20″]

To learn more:
– P. Maillé and B. Tuffin (2017), “Enforcing free roaming among EU countries: an economic analysis”, 13th International Conference on Network and Service Management (CNSM), Tokyo, Japan.
– P. Maillé and B. Tuffin (2017), “How does imposing free roaming in EU impact users and ISPs’ relations?”, 8th International Conference on the Network of the Future, London, UK.

Patrick Maillé, Professor, IMT Atlantique – Institut Mines-Télécom (IMT), and Bruno Tuffin, Director of Research, Inria

The original version of this article was published in French in The Conversation.


Koovea: an intelligent system for monitoring temperature-sensitive drugs

Koovea offers a monitoring service for temperature-sensitive drugs that ensures safe packaging conditions throughout the entire shipping process. The startup has just raised €60K from Créalia Occitanie. The interest-free loan will help the startup finance its R&D and strengthen its capital before launching its product in June 2018.

One out of every two drugs is temperature-sensitive. These fragile and expensive drugs are vulnerable to alteration during shipping if the cold chain is broken, which can have serious consequences: time lost in transit, significant financial losses for laboratories, and safety risks for patients who consume altered, ineffective or even dangerous drugs. In response to this problem, Koovea has invented a connected tracking and recording solution that reports data in real time.

Adrien Content and his associates worked for two and a half years to develop this solution. The incubator and mechatronics platform at IMT Mines Alès provided the startup with the support it needed to overcome technical challenges and create a prototype. This dual assistance, combining economic and technological support, helped structure the company as it developed and gave it the opportunity to present its innovation at the Consumer Electronics Show (CES) in Las Vegas in January 2018.

From manufacturing to use, the integrity of the cold chain is guaranteed

Koovea’s solution makes it possible to track the temperature and location of batches of drugs in real time, providing an opportunity to react if necessary. Its major benefit is that it sends warnings if it detects a deterioration in the storage conditions of a batch of products. The young company’s solution is based on three elements. First, it relies on a flexible temperature sensor the size of a credit card, which features a system for recording and displaying data. This sensor is complemented by an intelligent, autonomous router that can report data in real time anywhere in the world. Finally, the Koovea application provides an optimal solution for sharing and using this data.
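
To illustrate the kind of alerting logic such a service relies on, here is a minimal Python sketch. The 2-8 °C range, field names and message format are assumptions for the example, not Koovea’s actual design:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative storage range: many cold-chain drugs must stay between 2 and 8 °C.
TEMP_MIN_C, TEMP_MAX_C = 2.0, 8.0

@dataclass
class Reading:
    sensor_id: str        # the card-sized temperature sensor
    timestamp: datetime
    temperature_c: float
    latitude: float       # position reported by the router
    longitude: float

def check_reading(reading, alert):
    """Send an alert as soon as a reading leaves the allowed range."""
    if not TEMP_MIN_C <= reading.temperature_c <= TEMP_MAX_C:
        alert(f"{reading.timestamp:%H:%M} sensor {reading.sensor_id}: "
              f"{reading.temperature_c:.1f} °C outside {TEMP_MIN_C}-{TEMP_MAX_C} °C "
              f"at ({reading.latitude:.4f}, {reading.longitude:.4f})")

# The alert callback could push a notification to the mobile app; print() here.
check_reading(Reading("S-42", datetime.now(), 9.3, 43.6119, 3.8772), print)
```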

The device is currently undergoing a full-scale test phase in the French Hérault department. It has already proven its appeal by winning several awards: Coup de Pousse 2016, Bourse French Tech Emergence 2016 and Booste Ton Projet 2016. Today, the startup’s growth has reached a new milestone thanks to the interest-free innovation loan it received from Créalia Occitanie. Koovea makes no secret of its desire to become a benchmark in the intelligent monitoring of drug products, and it then hopes to branch out to other costly and sensitive products.

Better monitoring solutions for fragile and expensive products

Koovea’s solution is an interesting one for stakeholders in the medical sector: specifically, for laboratories and for the transport of blood, bone marrow and organs. Yet other sectors could also benefit from intelligent real-time monitoring. All the sensitive and expensive products handled in the agri-food, viticulture, cosmetics, luxury and art sectors could benefit from this type of solution. This is especially true since Koovea aims to extend its range to other controlled parameters, such as brightness and humidity. This expansion would pave the way for numerous fields of application. The data could even make it possible to determine the time, place and circumstances under which a product was altered.


From design to management, virtualization is inventing the industry of the future

How is industry reinventing itself? Three startups from the Télécom ParisTech incubator give us a glimpse of the changes underway in industry.


If the industry of the future is difficult to define, it is because it is as nebulous as the term used to describe it. Does it still make sense to talk about the “future” when the industrial world is already several years into its digital transformation? Although artificial intelligence and connected-object technologies may still be little used, the modernization of processes is already a pressing issue for today’s industries. We can hardly use the term “industry of the present”—it isn’t sexy enough—and some prefer “industry 4.0” over “industry of the future.” While industries 1.0, 2.0 and 3.0 can be precisely defined, and no one wonders what patches 2.1 or 3.2 refer to, we are free to choose our favorite term for this rapidly changing industry. Would “industry in transformation” not be a better name? The concept encompasses a plethora of technologies that do not have much in common, other than contributing to the same goal: reorganizing production facilities in a more intelligent way. This makes it difficult to identify a common thread explaining how industry is transforming. However, virtualization remains a cross-disciplinary theme for many different types of technologies. Technicians, engineers and managers increasingly rely on it in their approach to the technical and organizational challenges they face.

Modeling and simulation software has been used in the design sector for several decades. The abbreviation CAD (computer-aided design) has become an everyday word for those involved in designing and manufacturing industrial parts. But the arrival of artificial intelligence (AI) has brought its share of changes. Smart tools are being developed that do more than simply help realize engineers’ ideas more efficiently: they have become an integral part of the design stage. “What’s important with AI is optimization,” explains Pierre-Emmanuel Dumouchel, founder of the startup Dessia. “The engineer works on parts at the unit level, and it’s difficult for him to optimize complicated layouts because he has to think about a large number of structures to find the best one.” The startup has developed software that uses AI to study a large number of layouts at the same time and find the best ones. The tool then models them virtually and provides engineers with different propositions. Engineers may then print the engineering drawings once they have been approved. In sectors such as the automotive industry, where drive shafts are increasingly complex, the Dessia software saves time in the design and prototyping stages. Here, virtualization goes beyond helping to visualize systems: it bypasses a long process of study, reflection and comparison.
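
The underlying principle of exploring many candidate layouts automatically can be illustrated with a deliberately naive Python sketch: a brute-force search over component orderings with an invented cost function, not Dessia’s actual engine:

```python
from itertools import permutations

# Stand-in for AI-driven layout exploration: enumerate candidate orderings
# of components and rank them by a cost function.
components = ["motor", "reducer", "shaft", "coupling"]

def cost(layout):
    """Invented objective: penalize forbidden adjacencies between components."""
    forbidden = {("motor", "coupling"), ("coupling", "motor")}
    return sum((a, b) in forbidden for a, b in zip(layout, layout[1:]))

ranked = sorted(permutations(components), key=cost)
for layout in ranked[:3]:            # propose the best candidates to the engineer
    print(cost(layout), " -> ".join(layout))
```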

“A major headache”

For Philippe Gicquel, founder of CIL4Sys, virtualization has other benefits. One of them is that it helps simplify product specifications. The specifications stage involves establishing a written description of the product’s behavior, part by part. “This must be done to create specifications books for suppliers,” explains the entrepreneur. With the rise of electronics, parts function in increasingly complex ways, making specifications increasingly long to write. “The electronic control unit for a car, which includes GPS, telephone and other functions, requires specifications with over 10,000 lines of text,” says Philippe Gicquel. “This is a huge headache for the engineering teams!” Rather than continuing to work on increasingly complicated documents, CIL4Sys uses advances in software engineering to simplify the specifications stage. Instead of writing out lines of text, engineers use the startup’s tools to create diagrams describing the objects involved, their actions and their interactions. In short, they create a sort of tree covering the events associated with the object and how it works (see video below). The generated code may then be executed in a simulator developed by the startup, and the specifications text is generated automatically. “We still send the requirements in a text document, but before doing so we also send a model showing how the product works and a simulation to ensure that the product behaves as it is supposed to,” explains the founder.
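
As a toy illustration of this model-based approach (not CIL4Sys’s actual tool), the sketch below describes behavior as event-driven state transitions, then uses the same model both to generate requirement sentences and to simulate the product’s behavior:

```python
# Behavior of a barrier, described as (state, event, next_state, action) transitions.
transitions = [
    ("locked",   "valid_badge",   "unlocked", "raise the barrier"),
    ("unlocked", "car_through",   "locked",   "lower the barrier"),
    ("locked",   "invalid_badge", "locked",   "flash the red light"),
]

def generate_requirements(transitions):
    """Turn each transition of the model into a numbered requirement sentence."""
    for i, (state, event, nxt, action) in enumerate(transitions, start=1):
        yield (f"REQ-{i:03d}: When the system is '{state}' and '{event}' occurs, "
               f"it shall {action} and enter state '{nxt}'.")

def simulate(transitions, state, events):
    """Execute the same model to check the behavior before shipping the spec."""
    table = {(s, e): (n, a) for s, e, n, a in transitions}
    for event in events:
        state, action = table[(state, event)]
        print(f"{event} -> {action} (now {state})")
    return state

for requirement in generate_requirements(transitions):
    print(requirement)
simulate(transitions, "locked", ["valid_badge", "car_through"])
```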


Example of the use of CIL4Sys tools on an automated parking lot management system:


The benefits of the CIL4Sys tools were demonstrated in a concrete case, when PSA put the startup in competition with an engineering firm to produce a specifications document. “We only used one engineer instead of the two our competitor used, and the PSA experts gave us a 30% higher score,” says Philippe Gicquel. By virtualizing this step, the startup helps make operations clearer. Engineers can now quickly get a sense of how far the specifications process has progressed for a given part, whereas before they had to decipher a lengthy text. “The design process is often represented as a V-shaped cycle: throughout the process leading to the prototype, the downward portion of the V, teams make their way through a dark tunnel because no one really knows where they are in the process. By introducing simulation starting in the specifications stage, we bring some light into this tunnel.”

Looking to video games for inspiration

Design in the broad sense has greatly benefited from the virtualization of specific processes in industrial activities, but it is not the only field to take advantage of the technology. The startup Perfect Industry develops tools for managing production lines inspired by technology from the video gaming world. The startup’s founder, Emmanuel Le Gouguec, sees two major strengths to draw on: “In the world of video games, there aren’t any consultants who spend hours training the player. And there is a motivational aspect that makes the experience fun and lively.” Based on this observation, the startup provides a complete virtualization of production lines. Sensors are installed in key locations to aggregate data about the machines’ performance. Using its Perfect Twin product, a manager can visit a production line from his office using virtual reality (VR) and access different data, such as the speed of the machines. This data may also be consulted on smartphones. “We are developing applications based on this idea, such as tracking the virtual trips made by individuals wearing VR headsets,” says the founder. This helps provide a better understanding of how the space is laid out and how people move through it.
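
The data side of such a digital twin can be illustrated with a small Python sketch that aggregates machine telemetry and flags machines running below nominal speed. Field names, values and the 95% threshold are invented for the example:

```python
from collections import defaultdict
from statistics import mean

# Invented telemetry readings collected by sensors along the line.
readings = [
    {"machine": "press-1", "speed_rpm": 118}, {"machine": "press-1", "speed_rpm": 122},
    {"machine": "oven-2",  "speed_rpm": 61},  {"machine": "oven-2",  "speed_rpm": 42},
]

NOMINAL_RPM = {"press-1": 120, "oven-2": 60}   # hypothetical nominal speeds

by_machine = defaultdict(list)
for r in readings:
    by_machine[r["machine"]].append(r["speed_rpm"])

for machine, speeds in by_machine.items():
    avg = mean(speeds)
    ratio = avg / NOMINAL_RPM[machine]
    flag = "" if ratio >= 0.95 else "  <- investigate: running below nominal"
    print(f"{machine}: avg {avg:.0f} rpm ({ratio:.0%} of nominal){flag}")
```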

The entire goal of Perfect Industry’s projects is to manage this complexity for operators. Improving the performance of production lines is one of the challenges facing industry today, and the data collected, combined with quick immersion, makes it easier to identify losses. “Our tools provide managers with the same sorts of recommendations made by consultants,” explains Emmanuel Le Gouguec. To prove his point, he cites the example of an SME that needed to optimize its production line to reduce the cost of a product in response to a call for tenders. “The recommendations made based on an analysis of data and the production space allowed them to increase line speed by 15%,” he says. These results were achieved by borrowing tools from another sector which, according to the founder, is not that different from industrial data processing: “There is a major division in the digital sector between people who do the same thing: write code. From a technical perspective, what we do is commonplace in the video gaming world. We simply apply it to factories.” So transforming industry may not only mean looking to future technologies. Importing what is done in neighboring sectors also appears to be a promising way to drive progress.


Healthcare: what makes some connected objects a success and others a flop?

Christine Balagué, Institut Mines-Télécom Business School (formerly Télécom École de Management)

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]W[/dropcap]earing the Oura connected ring on your finger day and night can help you find out how well you sleep. A connected patch diabetics wear on their arms enables them to monitor their blood sugar levels without pricking their fingers. On February 9, these two objects received one of the mobile healthcare trophies presented at Paris-Diderot IUT, awarded by a panel of experts, attesting to their significant added value for users.

In recent years manufacturers of watches, bracelets, glasses and other connected objects have made many promises. Too many, judging by the gap between the proliferation of these objects and the modest role these devices play in our daily lives. For the most part they are seen as gadgets, bought on a whim then quickly forgotten in the back of a drawer. The time has not yet come where these devices are as familiar and vital to us as our smartphones.

While connected objects for well-being struggle to prove their usefulness, certain connected medical devices have become indispensable for patients. They are primarily used for diagnostic or preventative purposes or to help treat a disease, such as blood glucose monitors for diabetes. This leads us to explore the process through which users make these objects their own.

More connected objects than humans on our planet

In 2017, for the first time, the number of connected objects surpassed the number of humans on our planet. There are now 8.4 billion of these devices that collect, store, process and transmit data, according to the technology consulting firm Gartner, which expects this number to exceed 20 billion by the end of 2020.

The Freestyle Libre connected blood glucose monitor

Health and well-being devices are expected to grow just as dramatically. The number of these devices is set to increase from 73 million worldwide in 2016 to 161 million in 2020, according to the Grand View Research consulting firm.

But what do users think? They remain… doubtful. Though 73% of French people believe that connected objects may be useful for their health, according to a survey carried out by Opinion Way in March 2017, only 35% say they see the benefit of such products for monitoring their health. And just 11% report owning a connected watch.

High prices, risk of dependence and doubts about measurement reliability

So how can this lack of enthusiasm amongst users be explained? In 2017, Acsel and the Carrefour de l’Internet des objets, the two associations that bring together the major manufacturers of connected objects, published an Observatory of Connected Life. Their study revealed several obstacles for these devices: excessively high prices, the fear of having personal data used without informed consent, the risk of becoming dependent, and problems with the reliability and security of measurements.

Even beyond these concerns, it would seem that manufacturers were a bit too quick to believe that these revolutionary objects would win over the public. As a result, though some consumers have adopted them, very few have actually taken ownership of these objects.

These are two entirely different concepts, as manufacturers are only starting to find out. A product or service is “adopted” by consumers when they decide to try it out or buy it. “Taking ownership” of these objects, however, involves a longer process and is only achieved when the technology has become a part of an individual’s daily life.

A physical object, coupled with a service for the individual

Taking ownership of a connected object means taking ownership of each of its four specific aspects.

First, users must take ownership of the product itself, in its physical aspects. A connected watch, for example, is first and foremost a watch, meaning an object worn on the wrist to tell the time.

The Oura ring records information about sleep quality

Then, users must take ownership of the service provided by the object, its intangible dimension, often through a mobile application. This service involves presenting the collected data in the form of graphs or charts and usually offers a coaching function or a program designed to improve the user’s health. For example, connected scales transmit weight and body fat percentage measurements to an app, which then provides recommendations to help the user keep these values stable.
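As an illustration of this service dimension, here is a minimal sketch of the sort of rule-based coaching such an app might apply to transmitted readings. The thresholds, messages and type names are invented for the example; no vendor’s actual logic is implied.

```python
# Illustrative sketch of the scale-to-app coaching loop described above.
# Thresholds and wording are assumptions made up for this example.
from dataclasses import dataclass

@dataclass
class ScaleReading:
    weight_kg: float
    body_fat_pct: float

def coaching_message(history: list) -> str:
    """Turn a series of transmitted readings into a simple recommendation."""
    if len(history) < 2:
        return "Keep weighing in so we can track your trend."
    delta = history[-1].weight_kg - history[0].weight_kg
    if abs(delta) < 0.5:  # stable within half a kilogram
        return "Your weight is stable. Keep up your current routine."
    direction = "gained" if delta > 0 else "lost"
    return f"You have {direction} {abs(delta):.1f} kg; consider adjusting activity or diet."

readings = [ScaleReading(72.4, 21.0), ScaleReading(72.1, 20.8), ScaleReading(71.2, 20.5)]
print(coaching_message(readings))  # -> "You have lost 1.2 kg; ..."
```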

Third, the object is connected to one or several other objects: it transmits data to a smartphone, to other connected devices or to a data platform. This dimension goes beyond the object itself and must also become part of the individual’s everyday life.

Lastly, the object makes it possible to communicate with others, by sharing the number of steps taken during the day with a group of friends participating in a challenge, for instance. Users may only get used to this human-to-human social connectedness through a process in which they take full ownership of the device.

Four steps for taking ownership of connected objects

Before making a connected object part of our daily lives, we go through four different steps without realizing it. Studies carried out in recent years by our team at the Conservatoire National des Arts et Métiers (Cnam), with individuals who own these devices, have allowed us to describe each of these steps.

The first stage is taking ownership of the object on a symbolic level. This happens either in the store before purchasing the object, or the first time the individual sees the connected object if it is a gift. The interactions are primarily sensory-based: seeing, touching, hearing. For some people a so-called “wow” effect can be observed: a reaction expressing astonishment or even fascination with an object seen as “smart.” At this stage, the user projects an imagined value onto the object and service.

Then the user enters the second stage, called “exploration.” This stage involves two kinds of interactions: physically handling the object and its application, which sets off a cognitive process through which the user comes to understand how it works; and object-to-object interactions, in which the device communicates with the mobile phone to transfer the collected data so the application can provide its service. During this stage, use of the object leads to real value creation for the user.

Measuring heart rate to strengthen the heart

The third phase of taking ownership is determining the object’s function for its user. Individuals may use an object for one of the many specific functions available, such as measuring physical activity, heart rate or weight. This phase is accompanied by joint value production between the object and the user: the user determines and sets his/her desired function. For example, someone who wants to strengthen his heart decides to monitor his heart rate on a daily basis.

In the final phase, known as “stabilization,” the user makes the object a part of his/her daily life. The user’s interactions with the device become passive: the user wears a connected bracelet, for example, but forgets it is there, while the object continuously collects data and automatically sends it to the mobile application on the user’s smartphone. This stage also gives rise to emotional responses, forging a relationship between individual and object.

During this stage, the perceived value of the object is “transformative,” meaning that the object has transformed the individual’s habits. For example, he/she may have made a habit of getting off the subway two stops early to walk more during the commute, or of automatically choosing the stairs over the elevator.

Different uses than those intended by manufacturers

If manufacturers of connected objects were to study more closely how individuals take ownership of devices and focus their strategies on users, they could better anticipate uses and increase the objects’ value. In the hyperconnected world of today, it is paradoxical to observe such a great “disconnect” between manufacturers and users. This distance contributes to individuals’ limited use of connected objects and their tendency to abandon them over time.

And yet, most companies do incorporate use cases into the development of their objects. But these strategies are based on imagining how users may behave, whereas it has been shown that in real life, individuals do not use connected household objects as manufacturers imagined they would! This was observed in 2015 by the American researchers Donna Hoffman and Thomas Novak.

For individuals to really use their connected objects, manufacturers must develop responsible technologies: secure, reliable devices that respect privacy, both in terms of data collected and algorithms for processing the data. Most importantly, these devices must gain real value in the eyes of users. For this to happen, companies must learn how to study users’ behavior in real-life situations and how they come to take ownership of these objects.

Christine Balagué, Professor and holder of the Connected Objects and Social Networks Chair at Institut Mines-Telecom Business School (ex Télécom École de Management)

The original version of this article (in French) was published on The Conversation.