
Philosophy of science and technology in support of political ecology

Fabrice Flipo, a philosopher of science and technology and researcher at Institut Mines-Télécom Business School, has specialized in political ecology, sustainable development and social philosophy for nearly 20 years. Throughout the fundamental research that shapes his more technical teaching, he tries to produce an objective view of current political trends, the ecological impact of digital technology and an understanding of the world more broadly.

 

For Fabrice Flipo, the philosophy of science and technology can be defined as the study of how truth is created in our society. “As a philosopher of science and technology, I’m interested in how knowledge and know-how are created and in the major trends in technical and technological choices, as well as how they are related to society’s choices,” he explains. It is therefore necessary to understand technology, the organization of society and how politics shapes the interaction between major world issues.

The researcher shares this methodology with students at Institut Mines-Télécom Business School, in his courses on major technological and environmental risks and his introductory course on sustainable development. He helps students analyze the entire ecosystem surrounding some of the most disputed technological and environmental issues (ideas, stakeholders, players, institutions etc.) of today and provides them with expertise to navigate this divisive and controversial domain.

Fundamental research to understand global issues

This is why Fabrice Flipo has focused his research on political ecology for nearly 20 years. Political ecology, which first appeared in France in the 1960s, strives to profoundly challenge France’s social and economic organization and to reconsider relationships between humans and their environment. It is rooted in the ideas of a variety of movements, including feminism, third-worldism, pacifism and self-management.

Almost 40 years later, Fabrice Flipo seeks to explain and provide insight into this political movement by examining how its emergence has created controversies with other political movements, primarily liberalism (free-market economics), socialism and conservatism. “I try to understand what political ecology is, and the issues involved, not just as a political party of its own, but also as a social movement,” explains the researcher.

Fabrice Flipo carries out his research in two ways. The first is a traditional approach to studying political theory, based on analyzing arguments and debates produced by the movement and the issues it supports. This approach is supplemented by ongoing work with the Laboratory of Social and Political Change at the University of Paris 7 Diderot and other external laboratories specializing in the subject. He works in collaboration with an interdisciplinary team of engineers, sociologists and political scientists to examine the relationship between ICT (Information and Communication Technologies) and ecology. He also involves networks linked to ecology to expand this collaboration, works with NGOs and writes and appears in specialized or national media outlets. For some of his studies, he also draws on a number of different works in other disciplines, such as sociology, history or political science.

The societal impact of political ecology

“Today political ecology is a minor movement compared to the liberal, socialist and conservative majorities,” says the researcher. Indeed, despite growing awareness of environmental issues (COP21, the development of a trade press, corporate energy transitions, the adoption of “greener” lifestyles, etc.), the environmental movement has not had a profound effect on the organization of industrialized societies, and its minority status on the political spectrum forces it to argue its case all the more convincingly. “Can political ecology be associated with liberalism, socialism or even conservatism?” asks the researcher. “Although it does not belong to any of the existing currents, each of them tries to claim it for its own.”

More than just nature is at stake. A major ecosystem crisis could open the door for an authoritarian regime seeking to defend the essential foundation of a particular society from all others. This sort of eco-fascism would strive to protect resources rather than nature (and could not therefore be considered “environmentalism”), pitting one society against another. Political ecology is therefore firmly aligned with freedom.

To stay away from extremes, “the challenge is to carry out basic research to better understand the world and political ideas, and to go beyond debates based on misunderstandings or overly-passionate approaches,” explains Fabrice Flipo. “The goal is to produce a certain objectivity about political currents, whether environmentalism, liberalism or socialism. The ideas interact with, oppose, and are defined by one another.”

Challenging the notion that modernity is defined by growth and a Cartesian view of nature, the study of political ecology has led Fabrice Flipo to philosophical anthropological questions about freedom.

[box type=”shadow” align=”” class=”” width=””]

Analyzing the environmental impact of digital technology in the field

Political ecology raises questions about the ecology of infrastructures. Fabrice Flipo has begun fieldwork with sociologists on an aspect of digital technology that has been little studied overall: the environmental impacts of making human activities paper-free, the substitution of functions and “100% digital” systems.

Some believe that we must curb our use of digital technologies since manufacturing these devices requires great amounts of energy and raw materials and the rise of such technology produces harmful electronic waste. But others argue that transitioning to an entirely digital system is a way to decentralize societies and make them more environmentally-friendly.

Through his research project on recovering mobile phones (with the idea that recycling helps reduce planned obsolescence), Fabrice Flipo seeks to highlight solutions that already exist in the field but are underused, as priority is instead given to the latest products and constant renewal.[/box]

Philosophy to support debates about ideas

“Modernity defines itself as the only path to develop freedom (the ability to think), control nature, technology, and democracy. The ecological perspective asserts that it may not be that simple,” explains the researcher. “In my different books I’ve tried to propose a philosophical anthropology that considers ecological questions and different propositions offered by post-colonial and post-modern studies,” he continues.

Current societal debates prove that ecological concerns are a timely subject, underscore the relevance of the researcher’s work in this area, and show that there is growing interest in the topic. Based on the literature, it would appear that citizens have become more aware of available solutions (electric cars, solar panels etc.) but have been slow to adopt them. Significant contradictions between the majority call to “produce more and buy more” and the minority call encouraging people to be “green consumers” as part of the same public discourse make it difficult for citizens to form their own opinions.

“So political ecology could progress through an open debate on ecology,” concludes Fabrice Flipo, “involving politicians, scientists, journalists and specialists. The ideas it champions must resonate with citizens on a cultural level, so that they can make connections between their own lifestyles and the ecological dimension.” Extensive public communication, to which the researcher contributes through his work, coupled with citizens’ greater internalization and understanding of these issues and ideas, could help spark a profound, far-reaching societal shift towards true political ecology.

[author title=”Political ecology: The common theme of a research career” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/02/Fabrice-Flipo_format_en_hauteur.jpg”]A philosopher of science and technology, Fabrice Flipo is an associate research professor accredited to direct research in social and political philosophy and specializes in environmentalism and modernity. He teaches courses in sustainable development and major environmental and technological risks at Télécom École de Management, and is a member of the Laboratory of Social and Political Change at the University of Paris Diderot. His research focuses on political ecology, philosophical anthropology of freedom and the ecology of digital infrastructures.

He is the author of many works including: Réenchanter le monde. Politique et vérité “Re-enchanting the world. Politics and truth” (Le Croquant, 2017), Les grandes idées politiques contemporaines “Key contemporary political ideas” (Bréal, 2017), The ecological movement: how many different divisions are there?  (Le Croquant, 2015), Pour une philosophie politique écologiste “For an ecological political philosophy” (Textuel, 2014), Nature et politique (Amsterdam, 2014), and La face cachée du numérique “The Hidden Face of Digital Technology” (L’Echappée, 2013).[/author]

What nuclear risk governance exists in France?

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]I[/dropcap]t will take a long time to learn all the lessons from the Fukushima accident, and even longer to bring about a change in the practices and principles of nuclear risk governance. Yet several major themes are already emerging in France in this respect.

Next Sunday, March 11, 2018 will mark the 7-year anniversary of the Fukushima disaster, when the Northeast coast of Japan was struck by a record magnitude-9 earthquake, followed by a tsunami. These natural disasters led to an industrial disaster, a nuclear accident rated 7, the highest level on the INES scale, at the Dai-ichi nuclear power plant in Fukushima.

In the aftermath of the disaster, the world was stunned at the realization of the seriousness and suddenness of this event, which, according to Jacques Repussard, Director General of the French Institute for Radiological Protection and Nuclear Safety (IRSN), calls for us to “imagine the unimaginable and prepare for it.” It confronts all those involved in nuclear safety with a critical challenge: how can we guarantee safety in the midst of unexpected events?

Beyond its unpredictable nature, this accident served as a brutal and particularly relevant reminder that nuclear energy, more than any other technology or industry, transcends all borders, whether they be geographic, temporal, institutional or professional. The consequences of nuclear accidents extend well beyond the borders of a region or a country and remain present for hundreds or even thousands of years, thus exceeding any “human” time scale.

La Hague nuclear waste reprocessing plant. Jean Marie Taillat/Wikimedia, CC BY-SA


 

Fukushima revealed that the safety of socio-technical systems with this level of complexity cannot be limited to only a few stakeholders, nor can it be ensured without creating strong and transparent ties between a multitude of stakeholders, including nuclear operators, citizens, safety authorities, technical support, and government services. Fukushima calls into question the nature and quality of the relationships between these multiple stakeholders and demands that we reconsider nuclear risk governance practices, including in France, and rethink the boundaries of the “ecosystem of nuclear safety,” to use the term proposed by Benoît Journé.

Learning from nuclear accidents: a long-term process

Immediately after the accident, the entire community of international experts worked to manage the crisis and to understand the dynamics of the accident in terms of its technical, human and socio-organizational aspects. A few months later, the European Commission asked nuclear countries to carry out what it termed “stress tests” aimed at assessing nuclear facilities’ ability to withstand external stress (such as major weather events) and serious technical malfunctions. In France, this led to the launch of complementary safety assessments (ECS) for the country’s nuclear facilities.

While the technical causes of the Fukushima accident were quickly understood, socio-organizational causes were also identified. The Japanese Fukushima Nuclear Accident Independent Investigation Commission found that the “collusion between the government, the regulators and TEPCO, and the lack of governance by said parties” was one of the major causes of the disaster. The accident also highlighted the importance of involving civil society participants in risk prevention and in risk management preparation very early on.

Volunteers from the town of Minamisoma, near the nuclear power plant. Hajime Nakano/Flickr, CC BY

 

Above all, it revealed the need to plan and get equipped, over the long term, to manage a nuclear accident. Far too often, efforts concentrate on the emergency phase, the days or weeks immediately following the accident, leaving local stakeholders virtually on their own in the “post-accident” phase. Yet this phase raises major problems involving, for example, the consumption of basic foodstuffs (water, milk, etc.), the displacement of populations and the cultivation of potentially contaminated land.

After the Three Mile Island (1979) and Chernobyl (1986) accidents brought the human and organizational aspects of safety into consideration, Fukushima marks a new era focused on examining inter-organizational relations and the long-term methods for managing nuclear risks.

The need for openness towards civil society

Although this term is sometimes criticized and even mocked as a popular buzzword, nuclear risk “governance” refers to a very practical reality: all the stakeholders, measures and policies mobilized to guide the decisions made primarily by the public authorities and the nuclear operators, in order to better manage nuclear risks and help ensure greater transparency about these risks. This implies reflecting on how each stakeholder can participate, on the material and immaterial resources that could enable this participation, and on the tools that could support and help coordinate it.

Public awareness, organized by the Nuclear Safety Authority. ASN, CC BY


 

In this sense, Fukushima serves as a powerful reminder of the need for greater transparency and greater involvement of civil society participants. Contrary to popular belief, the longstanding institutional stakeholders in the nuclear industry are aware of the need for greater openness to civil society. In 2012 Jacques Repussard stated: “Nuclear energy must be brought out of the secrecy of executive boards and ministerial cabinets.” And as early as 2006, the French Nuclear Safety and Transparency Act confirmed this desire to involve civil society stakeholders in nuclear safety issues, particularly by creating local information committees (CLI), although some regret that this text has only been half-heartedly implemented.

Of course, bringing about a change in practices and pushing the boundaries is not an easy thing, since the nuclear industry has often been described, sometimes rightly, as a world frozen in time. It continues to be burdened by its history. For a long time, nuclear safety was an issue reserved only for a small group of stakeholders, sometimes referred to as “authorized” experts, and traces of these practices are still visible today. This characteristic is embodied in the extremely centralized safety organization. Even the French word for a nuclear power plant, “centrale nucléaire” attests to the prominence given to centralization.

French nuclear power plants. Sting, Roulex_45, Domaina/Wikimedia, CC BY-SA


 

One thing is for sure, there must be an ongoing dialog between the communities. This implies taking the heat out of the debates and moving beyond the futile and often exaggerated divide between the pro-nuclear and anti-nuclear camps.

A form of governance founded on open dialog and the recognition of citizen expertise is gradually emerging. The challenge for longstanding stakeholders is to help increase this citizen expertise. The AGORAS project (improvement of the governance of organizations and stakeholder networks for nuclear safety) questions governance practices, but also seeks to create a place for dialog and collective reflection. A symposium organized in late 2017 provided the first opportunity for implementing this approach through discussions organized between academic researchers and operational and institutional stakeholders. The 2018 symposium (contact: colloque2agoras@imt-atlantique.fr) will continue this initiative.

 

[divider style=”normal” top=”20″ bottom=”20″]

The original version of this article was published in The Conversation.


What is a volatile organic compound (VOC)?

Pollution in urban areas is a major public health issue. While peaks in the concentration of fine particles often make the news, they are not the only urban pollutants. Volatile organic compounds, or VOCs, also present a hazard. Some are carcinogenic, while others react in the atmosphere, contributing to the formation of secondary pollutants such as ozone or secondary aerosols—which are very small particles. Nadine Locoge, researcher at IMT Lille Douai, reviews the basics about VOCs, reminding us that they are not only present in outdoor air.

 

What is a volatile organic compound (VOC)?

Nadine Locoge: It is a chemical composed primarily of carbon and hydrogen. Other atoms can be integrated into this molecule in variable amounts, such as nitrogen, sulfur, etc. All VOCs are volatile at ambient temperature. This is what differentiates them from other pollutants like fine particles, which are in condensed form at ambient temperature.

Read more on I’MTech: What are fine particles?

How do they form?

NL: On a global scale, nature is still the primary source of VOCs. Vegetation, typically forests, produces 90% of the earth’s total emissions. But in the urban setting, this trend is reversed, and anthropogenic sources are more prominent. In cities, the main sources of emissions are automobiles, both from exhaust and the evaporation of fuel, and various heating methods—oil, gas, wood… Manufacturers are also major sources of VOC emissions.

Are natural VOCs the same as those produced by humans?

NL: No, in general they are not part of the same chemical families. They have different structures, which implies different consequences. The natural types produce a lot of isoprene and terpenes, which are often used for their fragrant properties. Anthropogenic activities, on the other hand, produce aromatic compounds, such as benzene, which is highly carcinogenic.

Why is it important to measure the concentrations of VOCs in the air?

NL: There are several reasons. First, because some have direct impacts on our health. For example, the concentrations of benzene in the outside air are regulated: they must not exceed an annual average of 5 micrograms per cubic meter. Also, some VOCs react once they are in the air, forming other pollutants. For example, they can generate aerosols—nanoparticles—after interacting with other reactive species. VOCs can also react with atmospheric oxidants and cause the formation of ozone.
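As a simple numerical illustration of the benzene rule mentioned above, compliance is judged on the annual mean rather than on individual peaks. The sketch below uses invented monthly values; only the 5 µg/m³ annual limit comes from the interview:

```python
# Toy compliance check for the benzene limit cited above:
# the ANNUAL AVERAGE must not exceed 5 µg/m³.
# The monthly values below are invented for illustration.

BENZENE_ANNUAL_LIMIT = 5.0  # micrograms per cubic meter of outside air

def annual_average(concentrations):
    # Arithmetic mean of the measured concentrations over the year.
    return sum(concentrations) / len(concentrations)

monthly_ug_m3 = [3.2, 4.1, 5.6, 4.8, 3.9, 4.4, 5.0, 4.7, 3.5, 4.0, 4.6, 4.2]
mean = annual_average(monthly_ug_m3)

# A single monthly value above 5 µg/m³ (here 5.6 in March) does not
# breach the rule as long as the annual mean stays under the limit.
assert mean <= BENZENE_ANNUAL_LIMIT
```

Real monitoring networks work from much finer-grained measurements, but the averaging principle is the same.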

Are VOCs only found in outside air?

NL: No, in fact these species are particularly present in indoor air. All the studies at both the national and European level show that VOC concentrations in indoor air in buildings are higher than outside. These are not necessarily the same compounds in these two cases, yet they pose similar risks. One of the emblematic indoor air pollutants is formaldehyde, which is carcinogenic.

There are several sources of VOCs in indoor air: outdoor air due to the renewal of indoor air, for example, but construction materials and furniture are particularly significant sources of VOC emissions.  Regulation in this area is progressing, particularly through labels on construction materials that take this aspect into account. The legislative aspect is crucial as buildings become more energy efficient, since this often means less air is exchanged in order to retain heat, and therefore the indoor air is renewed less frequently.

How can we fight VOC emissions?

NL: Inside, in addition to using materials with the least possible emissions and ventilating rooms as recommended by the ADEME, there are devices that can trap and destroy VOCs. The principle is either to trap them in an irreversible manner, or to cause them to react in order to destroy them—or more precisely, to transform them into species that do not affect our health, ideally into carbon dioxide and water. These techniques are widely used in industrial environments, where the concentrations of emissions are relatively significant and the chemical species are not very diverse. But in indoor environments VOCs are more varied, with lower concentrations, and are therefore harder to treat. In addition, the use of these treatment systems remains controversial: if the chemical processes used are not optimized and adapted to the target species, they can cause reactions that generate secondary compounds even more hazardous to human health than the primary species.

Is it possible to decrease VOC concentrations in the outside air?

NL: The measures in this area are primarily regulatory and are aimed at reducing emissions. Exhaust fumes from automobiles, for example, are regulated in terms of emissions. For the sources associated with heating, the requirements vary greatly depending on whether the heating is collective or individual. In general, the methods are ranked according to the amount of emissions. Minimum performance requirements are imposed to optimize combustion and therefore lead to fewer VOCs being produced, and emission limit values have been set for certain pollutants (including VOCs). In general, emission-reduction targets are set at the international and national level and are then broken down by industry.

In terms of ambient concentrations, there have been some experiments in treating pollutants—including volatile organic compounds—like in the tunnel in Brussels where the walls and ceiling were covered with a cement-based photocatalytic coating. Yet the results from these tests have not been convincing. It is important to keep in mind that in ambient air, the sources of VOCs are numerous and diffuse. It is therefore difficult to lower the concentrations. The best method is still to act to directly reduce the quantity of emissions.



20 words for understanding digital trust

The issue of digital trust has never been more relevant. The 15th IMT Personal Data Values and Policies Chair Meeting, held on 8 March, a few weeks before the entry into force of the European General Data Protection Regulation (GDPR), was devoted to presenting the Chair’s book “Signes de confiance : l’impact des labels sur la gestion des données personnelles” (Signs of Trust: The impact of labels on personal data management). Here is a closer look at some key terms surrounding digital trust.

 

API – Application Programming Interface, an interface that enables the user to connect to an application in order to access the data it produces.

Auditability – The availability of proof that information has been supplied in an authenticated and non-repudiated manner.

Audit trail – The complete history of a transaction.

Blockchain – Technology that stores and transfers intangible assets without an intermediary in a transparent, secure, reliable and inalterable manner.

Read more on our blog: What is a blockchain?

Confidence – Trust related to a social context and established institutions.

Consortium – Refers to a hybrid blockchain that is not public, involving participants with different rights.

Crypto-currency – Electronic currency used in a peer-to-peer or decentralized computer network that relies on the principles of cryptography to validate transactions and issue currency.

Decentralized autonomous organization – A program that maintains an organization’s governance by embedding it into a blockchain. It involves several smart contracts (see definition below) that interact together.

Decentralized consensus – Mechanisms used to ensure that all the nodes within a network have the same information available and the same overall internal status.

Distributed Ledger Technology (DLT) – Refers to private blockchains and consortia.

Immutability – The property of being inalterable once created.

Ledger – Book of accounts, a register. A blockchain is a decentralized ledger or register.

Oracle – A service that gathers data from one or more services (private or public databases, social networks…) which it submits to be used by smart contracts (see definition below).

Pseudonymity – An individual’s ability to prove a coherent identity without providing a real name.

Side chain – A secondary blockchain attached to the primary one that can be used to increase the (otherwise limited) volume of information the blockchain can process.

Smart contracts – Autonomous programs that automatically apply the terms of a contract without requiring any human intervention once initiated.

Token – Generic name for a transactional information unit within a blockchain, which does not necessarily refer to the idea of currency.

Transaction – Refers to an operation involving the transfer of assets or information between two participants.

Trust – An individual accepting something as true based on a personal frame of reference.

Trusted Third Party – An entity authorized to perform transactions that must remain confidential and secure on behalf of a third party.
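Several of the terms above (ledger, transaction, immutability) can be illustrated with a minimal hash-linked chain. This is a toy sketch, not any particular blockchain implementation: real systems add decentralized consensus, signatures and much more.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash,
    # so that altering any earlier block invalidates every later link.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transaction):
    # Append a block that points back to the hash of its predecessor.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"transaction": transaction, "prev_hash": prev_hash}
    block["hash"] = block_hash({"transaction": transaction, "prev_hash": prev_hash})
    chain.append(block)

def is_valid(chain):
    # The ledger is valid only if every stored hash matches its contents
    # and each block references the hash of the block before it.
    for i, block in enumerate(chain):
        expected = block_hash({"transaction": block["transaction"],
                               "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"from": "A", "to": "B", "amount": 3})
add_block(chain, {"from": "B", "to": "C", "amount": 1})
assert is_valid(chain)

# Tampering with an earlier transaction breaks every later link:
# this is the "immutability" property defined in the glossary.
chain[0]["transaction"]["amount"] = 300
assert not is_valid(chain)
```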

To find out more about this topic, check out our series on trust in the digital age.



The end of roaming charges in the European Union: a cure-all solution?

Patrick Maillé, IMT Atlantique – Institut Mines-Télécom (IMT), and Bruno Tuffin, Inria

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]T[/dropcap]he European Union has required wireless network operators to stop charging roaming fees during trips to other EU countries. For nomadic users who regularly travel throughout Europe, this added comfort is truly appreciated: no more fears of additional charges. However, while the benefit is real, some questions about roaming costs remain.

Respecting European unity

Before the end of roaming fees in June 2017, your mobile plan allowed you to communicate within your country and included a maximum amount of internet data (once depleted, you were either charged additional fees or your service was restricted). Any travel outside your country of origin involved an additional flat-rate fee or charges based on volume. This situation limited communication and went against the European spirit of unity. To remedy this, in October 2016, the European Commission approved a law prohibiting operators from charging their users for communications and data usage while traveling abroad.

The goal of this decision was clearly established: create a single open market for electronic communications. Now when you travel, your usage is charged to your plan exactly as it is in your country of origin. This means no more fears of extra fees, including for data usage: no need to wait for Wi-Fi access to use data, since 3G and 4G networks can now be used without any unpleasant surprises. This new system required agreements between the different operators and countries, transparent for users, in order to locate mobile phones and route communications.

To prevent any unfair competition within the EU and prevent citizens from choosing a plan from the least expensive country, the rule was established that users must take out a plan in their own country, which is defined as the country where they spend the most time. In addition, roaming usage must be “reasonable”.

Completely free roaming?

As mentioned, “free” roaming is guaranteed by the law only “within a reasonable limit of use”. Specifically, operators can set a roaming limit for mobile internet usage without additional fees in order to prevent usage and associated costs from rocketing. However, this limit must be controlled by the regulation and the user must be clearly informed. The framework for this application is therefore not necessarily the same abroad as in the user’s country. In addition, the roaming rules only apply to services within the European Economic Area (EEA); therefore your plan may include services intended for countries outside the EEA which will only apply if you are in your country of origin.

It is also worth noting that there is still a missing step to truly achieving a single market and real freedom within the EU. In general, calling another EU country from your own country is not included in your mobile plan and incurs additional costs, so a distinction is still made within the European Community. Similarly, if you make such a call while traveling, it is not counted within your plan but is charged as if you were calling from your country of origin, which could fall outside your plan; yet it would be natural to be able to call to make a reservation at a restaurant without paying extra fees.

Therefore, integrating these additional aspects, in other words no longer differentiating between a call from or to another EU country, could be the final step towards achieving a fully open market perceived by users as a single market.

A risk of rising rates?

Another aspect to monitor is how this new rule will impact the rates of users’ plans: is there a risk that it will lead to a rise in prices through an averaging effect, in which those who rarely travel pay for those who travel frequently? This potential risk has been brought to light in scientific publications through theoretical modeling and game theory. The operators’ income could also decrease. It is still too soon to effectively assess the impact of this new regulation, but all these reasons clearly show that we will need to pay special attention to how prices change.
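The averaging effect described above can be sketched with a toy cost model. The numbers and the simple uniform-pricing assumption are invented for illustration; they do not come from the cited studies, which use far richer game-theoretic models:

```python
# Toy illustration of the averaging effect (hypothetical numbers):
# an operator recovers its costs through a single uniform monthly price,
# so abolishing roaming surcharges spreads travelers' extra network costs
# over every subscriber, travelers and non-travelers alike.

def uniform_price(base_cost, roaming_cost, share_travelers):
    # Average cost per subscriber once roaming is bundled into the plan.
    return base_cost + share_travelers * roaming_cost

# Before: travelers paid roaming surcharges separately, so the uniform
# part of the price only covered the base cost.
before = uniform_price(base_cost=20.0, roaming_cost=0.0, share_travelers=0.3)

# After: surcharges are abolished and the roaming cost is mutualized.
after = uniform_price(base_cost=20.0, roaming_cost=10.0, share_travelers=0.3)

assert before == 20.0
assert after == 23.0
assert after > before  # non-travelers' price rises even though they never roam
```

In this caricature, every subscriber pays 3 extra units per month so that the 30% who travel roam "for free"; the real question studied in the literature is how operators and regulators share that burden.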

[divider style=”dotted” top=”20″ bottom=”20″]

To learn more:
– P. Maillé and B. Tuffin (2017), “Enforcing free roaming among UE countries: an economic analysis”, 13th International Conference on Network and Service Management (CNSM), Tokyo, Japan.
– P. Maillé and B. Tuffin (2017), “How does imposing free roaming in EU impact users and ISPs’ relations?”, 8th International Conference on the Network of the Future, London, UK.

Patrick Maillé, Professor, IMT Atlantique – Institut Mines-Télécom (IMT), and Bruno Tuffin, Director of Research, Inria

The original version of this article (in French) was published in The Conversation.


Koovea: an intelligent system for monitoring temperature-sensitive drugs

Koovea offers a service for monitoring temperature-sensitive drugs that ensures safe packaging conditions throughout the entire shipping process. The startup has just raised €60K through Créalia Occitanie. The interest-free loan will help the startup finance its R&D and strengthen its own capital before launching its product in June 2018.

 

One out of every two drugs is temperature-sensitive. These fragile and expensive drugs are vulnerable to alterations during shipping if the cold chain is broken. This could result in serious consequences: time lost in transit, significant financial loss for the laboratories, safety risks for patients if they consume altered, ineffective or even dangerous drugs. In response to this problem, Koovea has invented a connected tracking and recording solution that reports data in real time.

Adrien Content and his associates worked together for two and a half years to develop this solution. The incubator and mechatronics platform at IMT Mines Alès provided the startup with the support it needed to overcome technical challenges and create a prototype. This dual assistance, combining economic and technological support, helped structure the company as it developed and offered the opportunity to present its innovation at the Las Vegas Consumer Electronics Show (CES) in January 2018.

From manufacturing to use, the integrity of the cold chain is guaranteed

Koovea’s solution makes it possible to track the temperature and location of batches of drugs in real time, providing an opportunity to react if necessary. Its major benefit is that it sends warnings if it detects a deterioration in storage conditions for a supply of products.  The young company’s solution is based on three elements. First, it relies on a flexible temperature sensor the size of a credit card, which features a system for recording and displaying data.  This sensor is complemented by an intelligent and self-reliant router which can report data in real time, anywhere in the world. Finally, a “Koovea” application provides an optimal solution for sharing and using this data.

The device is currently in the midst of a full-scale test phase in the French Hérault department. It has already proven its appeal by winning several awards: Coup de Pousse 2016, Bourse French Tech Emergence 2016, Booste Ton Projet 2016. Today, the startup’s growth has reached a new milestone thanks to the interest-free innovation loan it received from Créalia Occitanie. Koovea makes no secret of its desire to become a benchmark in the intelligent monitoring of drug products. It then hopes to branch out to other costly and sensitive products.

Better monitoring solutions for fragile and expensive products

Koovea’s solution is an interesting one for stakeholders in the medical sector: specifically, for laboratories and transport systems for blood, bone marrow and organs. Yet other sectors could also benefit from intelligent real-time monitoring. All the sensitive and expensive products handled in the agri-food sector, viticulture, cosmetics, luxury market and the art world could benefit from this type of solution. This is especially true since Koovea aims to extend its range to integrate other controlled parameters, such as brightness and humidity.  This expansion would pave the way for numerous fields of application. The data could even make it possible to predict the time, place and circumstances under which a product was altered.


From design to management, virtualization is inventing the industry of the future

How is industry reinventing itself? Three startups from the Télécom ParisTech incubator give us a glimpse of the changes underway in industry.

 

If the industry of the future is difficult to define, it is because it is as nebulous as the term used to describe it. Does it still make sense to talk about the “future” when the industrial world is already several years into its digital transformation? Although artificial intelligence and connected-object technologies may still be little used, the modernization of processes is already a pressing issue for today’s industries. We can hardly use the term “industry of the present” (it isn’t catchy enough), and some prefer “industry 4.0” over “industry of the future.” And if industries 1.0, 2.0 and 3.0 can indeed be precisely defined, while no one wonders what a version 2.1 or 3.2 would refer to, we are free to choose our favorite term for this rapidly changing industry. Would “industry in transformation” not be a better name? The concept encompasses a plethora of technologies that do not have much in common beyond contributing to the same goal: reorganizing production facilities in a more intelligent way. This makes it difficult to identify a common thread explaining how industry is transforming. Virtualization, however, is a cross-disciplinary theme running through many of these technologies. Technicians, engineers and managers increasingly rely on it in their approach to the technical and organizational challenges they face.

Modeling and simulation software has been used in the design sector for several decades. The abbreviation CAD (computer-aided design) has become an everyday word for those involved in designing and manufacturing industrial parts. But the arrival of artificial intelligence (AI) has brought its share of changes. Smart tools are being developed that do more than help engineers realize their ideas more efficiently: they have become an integral part of the design stage itself. “What’s important with AI is optimization,” explains Pierre-Emmanuel Dumouchel, founder of the startup Dessia. “The engineer works on parts at the unit level, and it’s difficult for him to optimize complicated layouts because he has to think about a large number of structures to find the best one.” The startup has developed a software program that uses AI to study a large number of layouts at once and select the best ones. The tool then models them virtually and presents engineers with several propositions. Once a proposition has been approved, the engineering drawings can be printed. In sectors such as the automotive industry, where drive shafts are increasingly complex, the Dessia software saves time in the design and prototyping stages. Here, virtualization goes beyond helping to visualize systems: it replaces a long process of study, reflection and comparison.
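The layout-exploration idea described above can be sketched in a few lines: enumerate candidate arrangements, score each with a cost function, and surface the best ones for the engineer to review. The component names and the cost function below are invented for illustration; they are not Dessia’s actual models.

```python
# Minimal sketch of automated layout exploration: enumerate candidate
# arrangements, score each one, and keep the best few as propositions.
# Components and cost function are hypothetical.
from itertools import permutations

components = ["motor", "gearbox", "shaft", "coupling"]

def cost(layout):
    # Hypothetical score: penalize placing the gearbox far from the motor.
    return abs(layout.index("motor") - layout.index("gearbox"))

# Rank all possible orderings from best (lowest cost) to worst.
candidates = sorted(permutations(components), key=cost)

# Propose the top layouts to the engineer for review.
for layout in candidates[:3]:
    print(layout, cost(layout))
```

With only four components there are 24 orderings and brute force suffices; the value of an AI-driven tool lies in pruning this search when real assemblies make exhaustive enumeration impossible.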

“A major headache”

For Philippe Gicquel, founder of CIL4Sys, there are other benefits to virtualization. One of them is that it helps simplify product specifications. The specifications stage involves establishing a written description of the product’s behavior, part by part. “This must be done to create specifications books for suppliers,” explains the entrepreneur. With the rise of electronics, parts function in increasingly complex ways, making specifications increasingly long to write. “The electronic control unit for a car, which includes GPS, telephone and other functions, requires specifications with over 10,000 lines of text,” says Philippe Gicquel. “This is a huge headache for the engineering teams!” Rather than continuing to work on increasingly complicated documents, CIL4Sys uses advances in software engineering to simplify the specifications stage. Instead of writing out lines of text, engineers can use the startup’s tools to create diagrams to describe the objects involved, their actions and their interactions. In short, they create a sort of tree covering the events associated with the object and how it works (see video below). The generated code may then be executed in a simulator developed by the startup, and the specifications text is generated automatically. “We still send the requirements in a text document, but before doing so we also send a model showing how the product works and a simulation to ensure that the product behaves as it is supposed to,” explains the founder.
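The principle of replacing textual specifications with executable diagrams can be sketched as a tiny state machine: behavior is described as transition rules, the same rules can be simulated, and the requirements text is generated from that single model. The parking-barrier example and all names below are hypothetical, not CIL4Sys’s actual tool.

```python
# Toy version of diagram-style specification: behavior is a set of
# (state, event) -> next_state rules that can be both simulated and
# turned into requirements text. Example scenario is invented.

transitions = {
    ("closed", "ticket_taken"): "open",
    ("open", "car_passed"): "closed",
}

def simulate(state, events):
    """Replay a sequence of events against the transition rules."""
    for event in events:
        state = transitions.get((state, event), state)  # ignore invalid events
    return state

def generate_spec(rules):
    """Auto-generate the specifications text from the same model."""
    return [f"When in state '{s}', event '{e}' shall lead to state '{t}'."
            for (s, e), t in rules.items()]

print(simulate("closed", ["ticket_taken", "car_passed"]))
for line in generate_spec(transitions):
    print(line)
```

Because simulation and text generation share one model, the written requirements can never drift away from the behavior that was actually tested, which is the clarity gain the approach aims for.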

 

Example of the use of CIL4Sys tools on an automated parking lot management system:

 

The benefits of the CIL4Sys tools were demonstrated in a concrete example, when PSA put the startup in competition with an engineering firm to develop a specifications document. “We only used one engineer instead of the two our competitor used and we were given a 30% higher score by the PSA experts,” says Philippe Gicquel. By virtualizing this step the startup helps improve the clarity of operations. Engineers can now quickly get a sense of the progress of the specifications process for a given part, whereas before they had to decipher a lengthy text. “The design process is often represented as a V-shaped cycle: throughout the process leading to the prototype, the downward portion of the V, teams make their way through a dark tunnel because no one really knows where they are in the process. By introducing simulation starting in the specifications stage, we bring some light to this tunnel.”

Looking to video games for inspiration

Design in the broad sense has greatly benefited from the virtualization of specific processes in industrial activities, but it is not the only field to take advantage of the technology. The startup Perfect Industry develops tools for managing production lines inspired by technology from the video gaming world. The startup’s founder, Emmanuel Le Gouguec, sees two major strengths to draw on: “In the world of video games, there aren’t any consultants who spend hours training the player. And there is a motivational aspect that makes the experience fun and lively.” Based on this observation, the startup provides a complete virtualization of production lines. Sensors are installed in key locations to aggregate data about the machines’ performance. Using its Perfect Twin product, a manager can then visit a production line from his office using virtual reality (VR) and access different data, such as the speed of the machines. This data may also be consulted using smartphones. “We are developing applications based on this idea, such as tracking virtual trips made by individuals with VR headsets,” says the founder. This provides a better understanding of how the space is laid out and how people move through it.

Perfect Industry’s work focuses entirely on managing complexity for operators. Improving the performance of production lines is one of the challenges facing industry today, and the data collected, combined with quick immersion, makes it easier to identify losses. “Our tools provide managers with the same sorts of recommendations made by consultants,” explains Emmanuel Le Gouguec. To prove his point, he cites the example of an SME that needed to optimize its production line to reduce the cost of a product in response to a call for tenders. “The recommendations made based on an analysis of data and the production space allowed them to increase line speed by 15%,” he says. He was able to achieve these results by borrowing tools from another sector that, according to the founder, is not that different from industrial data processing. “There is a major division in the digital sector between people who do the same thing: write code. From a technical perspective, what we do is common practice in the video gaming world. We simply apply it to factories.” So transforming industry may not only mean looking to future technologies; importing what is done in neighboring sectors also appears to be a promising way to drive progress.

 

 


Healthcare: what makes some connected objects a success and others a flop?

Christine Balagué, Institut Mines-Telecom Business School (ex Télécom École de Management)

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]W[/dropcap]earing the Oura connected ring on your finger day and night can help you find out how well you sleep. A connected patch diabetics wear on their arms enables them to monitor their blood sugar levels without pricking their fingers. On February 9, these two objects received one of the mobile healthcare trophies presented at Paris-Diderot IUT, awarded by a panel of experts, attesting to their significant added value for users.

In recent years, manufacturers of watches, bracelets, glasses and other connected objects have made many promises. Too many, judging by the gap between the proliferation of these objects and the modest role these devices play in our daily lives. For the most part they are seen as gadgets, bought on a whim then quickly forgotten in the back of a drawer. The time has not yet come when these devices are as familiar and vital to us as our smartphones.

While connected objects for well-being struggle to prove their usefulness, certain connected medical devices have become indispensable for patients. They are primarily used for diagnostic or preventative purposes or to help treat a disease, such as blood glucose monitors for diabetes. This leads us to explore the process through which users make these objects their own.

More connected objects than humans on our planet

In 2017, for the first time, the number of connected objects surpassed the number of humans on our planet. There are now 8.4 billion of these devices that collect, store, process and transmit data, according to the Gartner technological consulting firm. And it expects this number to exceed 20 billion by the end of 2020.

Connected blood glucose monitor by Freestyle Libre

Health and well-being devices are expected to grow just as dramatically. The number of these devices is set to increase from 73 million worldwide in 2016 to 161 million in 2020, according to the Grand View Research consulting firm.

But what do users think? They remain… doubtful. Though 73% of French people believe that connected objects may be useful for their health, according to a survey carried out by Opinion Way in March 2017, only 35% say that they see the benefit of such products for monitoring their health. And just 11% report owning a connected watch.

High prices, risk of dependence and lack of reliability measurements

So how can this lack of enthusiasm amongst users be explained? In 2017, the two associations that group together the major manufacturers of connected objects, Acsel and the Carrefour de l’Internet des objets, published an Observatory of Connected Life. Their study revealed several obstacles for these devices: excessively high prices, the fear of having personal data used without informed consent, the risk of becoming dependent, and problems with the reliability and security of measurements.

Even beyond these concerns, it would seem that manufacturers were a bit too quick to believe that these revolutionary objects would win over their fellow citizens. As a result, though some consumers have adopted them, very few have actually taken ownership of these objects.

These are two entirely different concepts, as manufacturers are only starting to find out. A product or service is “adopted” by consumers when they decide to try it out or buy it. “Taking ownership” of these objects, however, involves a longer process and is only achieved when the technology has become a part of an individual’s daily life.

A physical object, coupled with a service for the individual

Taking ownership of a connected object means taking ownership of each of its four specific aspects.

First, users must take ownership of the product itself, in its physical aspects. A connected watch, for example, is first and foremost a watch, meaning it is an object worn on the wrist to tell the time.

The Oura ring records information about sleep quality

Then, users must take ownership of the service provided by the object, its intangible dimension–often through a mobile application. This service involves presenting data collected in the form of graphs or charts and usually offers a coaching function or program designed to improve the user’s health. For example, connected scales transmit weight and body fat percentage measurements to an app. The app then provides recommendations to help us stabilize them.

Third, the object is connected to one or several other objects: it transmits data to a smartphone, to other connected objects or to a data platform. This dimension goes beyond the object itself and must also become part of the individual’s everyday life.

Lastly, the object makes it possible to communicate with others, by sharing the number of steps taken during the day with a group of friends participating in a challenge, for instance. Users may only get used to this human-to-human social connectedness through a process in which they take full ownership of the device.

Four steps for taking ownership of connected objects

Before making a connected object part of our daily lives, we must go through four different steps without realizing we are doing so. Studies carried out in recent years by our team at the Conservatoire National des Arts et Métiers (Cnam), with individuals who own these devices, have allowed us to describe each of these steps.

The first stage is taking ownership of the object on a symbolic level. This either happens in the store before purchasing the object, or the first time the individual sees the connected object if it is a gift. The interactions are primarily sensory-based: seeing, touching, hearing. For some people a so-called “wow” factor can be observed: this user reaction expresses astonishment or even fascination for an object seen as “smart.” At this stage, the user projects an imagined value onto the object and service.

Then the user enters the second stage, called “exploration.” This stage involves physically handling the object to learn about the device and its application: human-object interactions that give rise to a cognitive process through which the user comes to understand how it works, and object-to-object interactions in which the object communicates with the mobile phone to transfer the data collected and enable the application to provide its service. During this stage, use of the object leads to real value creation for the user.

Measuring heart rate to strengthen the heart

The third phase of taking ownership of an object is determining the object’s function for its user. Individuals may use an object for one of many specific functions available, such as measuring physical activity, heart rate or weight. This phase is accompanied by joint value production between the object and the user—the user determines and sets his/her desired function. For example, someone who wants to strengthen his heart decides to monitor his heart rate on a daily basis.

In the final phase, known as “stabilization,” the user makes the object a part of his/her daily life. The user’s interactions with the device become passive. For example, the user wears a connected bracelet but forgets that it is there, while the object continuously collects data and automatically sends it to the mobile application on the user’s smartphone. This stage also gives rise to emotional responses, forging a relationship between individual and object.

During this stage, the perceived value of the object is “transformative,” meaning that the object has transformed the individual’s habits. For example, he/she may have made a habit of getting off the subway two stops early to walk more during his/her commute, or automatically choose the stairs over the elevator.

Different uses than those intended by manufacturers

If manufacturers of connected objects were to carry out a closer study of how individuals take ownership of devices and focus their strategies on users, they could better anticipate uses and increase objects’ value. In the hyperconnected world of today, it is paradoxical to observe such a great “disconnect” between manufacturers and users. This distance contributes to individuals’ limited use of connected objects and their tendency to abandon them over time.

And yet, most companies do incorporate use cases in the development of objects. But these strategies are based on imagining how users may behave, while it has been shown that in real life, individuals do not use connected household objects as manufacturers imagined they would! This was observed in 2015 by American researchers Donna Hoffman and Thomas Novak.

For individuals to really use their connected objects, manufacturers must develop responsible technologies: secure, reliable devices that respect privacy, both in terms of data collected and algorithms for processing the data. Most importantly, these devices must gain real value in the eyes of users. For this to happen, companies must learn how to study users’ behavior in real-life situations and how they come to take ownership of these objects.

Christine Balagué, Professor and holder of the Connected Objects and Social Networks Chair at Institut Mines-Telecom Business School (ex Télécom École de Management)

The original version of this article (in French) was published on The Conversation.

 


What is an eco-material?

Reducing the environmental footprint of human constructions is one of the major issues facing the ecological transition. Achieving this goal requires the use of eco-materials. Gwenn Le Saout, a researcher in materials at IMT Mines Alès, explains what these materials are, their advantages and the remaining constraints that prevent their large-scale use.

 

How would you define an eco-material?

Gwenn Le Saout: An eco-material is an alternative to a traditional material for a specific use. It has a lower environmental impact than the traditional material it replaces, yet it maintains similar properties, particularly in terms of durability. Eco-materials are used within a general eco-construction approach aimed at reducing the structures’ environmental footprint.

Can you give us an example of an eco-material?

GLS: Cement has a significant CO₂ footprint. Cement eco-materials are therefore being developed in which part of the cement is replaced by foundry slags. Slags are byproducts of steelmaking processes, generated when metal is melted. So, interestingly, we now call slags “byproducts”, whereas they used to be seen as waste! This shows that there is a growing interest in recovering them, partly for the cement industry.

Since concrete is one of the primary construction materials, are there any forms of eco-concrete?

GLS: Eco-concrete is a major issue in eco-construction, and a lot of scientific work has been carried out to support its development. Producing concrete requires aggregates—often sand from mining operations. These natural aggregates can be replaced by aggregates from demolition concrete which can thus be reused. Another way of producing eco-concrete is by using mud. Nothing revolutionary here, but this process is gaining in popularity due to a greater awareness of materials’ environmental footprint.

Are all materials destined to be replaced by eco-materials?

GLS: No, the goal of eco-materials is not to replace all existing materials. Rather, the aim is to target uses for which materials with a low environmental impact can be used. For example, it is completely possible to build a house using concrete containing demolition aggregates. However, this would not be a wise choice for building a bridge, since the materials do not have exactly the same properties and different expertise is required.

What are the limitations of eco-materials?

GLS: The key point is their durability. For traditional concrete and materials, manufacturers have several decades of feedback. For eco-materials, and particularly eco-concrete, there is less knowledge about their durability, and many question marks remain concerning their behavior over time. This is an important aspect of the research: finding formulations that can ensure good long-term behavior and characterizing existing eco-materials to predict their durability. At the Civil Engineering Institute (IGC), we worked on the national RECYBETON project from 2014 to 2016 with Lafarge-Holcim, and were able to provide demonstrators for testing the use of recycled aggregates.

How can industrial stakeholders be convinced to switch to these eco-materials?

GLS: The main advantage is economic. Transporting and storing demolition materials is expensive. In the city, reusing demolition materials in the construction of new buildings therefore represents an interesting opportunity because it would reduce the transport and storage costs. We also participated in the ANR project ECOREB with IGC on this topic to find solutions for recycling concrete. We must also keep in mind that Europe has imposed an obligation to reuse materials: 70% of demolition waste must be recycled. Switching to eco-materials using demolition products therefore offers a way for companies to comply with this directive.


Seamless vacations thanks to a research lab

For four years now, researchers from EURECOM and the startup Data-Moove have worked together to radically improve the tourist experience in various regions. With help from technological innovations from laboratories, they have succeeded in aggregating the information available on the web and social networks to create a local and comprehensive picture of what a geographical area has to offer.

 

Finding a restaurant, concert hall or hotel when traveling abroad can turn into quite an ordeal. Every restaurant and event has a Facebook page and website, yet few sites gather all a destination’s activities into one spot. For tourists, this means spending time on social networks, time they would rather spend enjoying their vacation. Data-Moove’s challenge was therefore significant: the French startup’s mission was to offer a solution to this problem by creating a comprehensive overview of a region’s tourism offering. On March 2nd, the young company inaugurated an interactive board in the Saint-Barthélemy airport in the West Indies. Travelers arriving on the island can now see an overview of the activities available to them in the area and can create an itinerary for their stay. This interactive board is complemented by a mobile application offered by the island’s Tourism Board which is free for the end user.

This service responds to tourism offices’ growing demand for digital technology to help promote their regions. To meet this need, Data-Moove worked together with EURECOM research teams, which are part of the Télécom & Société Numérique Carnot Institute. Their partnership started in 2015. At that time, Raphaël Troncy, a researcher in data science at EURECOM, and his team were involved in the European project 3cixty led by EIT Digital. “We were working to automate the collection of tourism and cultural information,” the scientist recalls. “We wanted a platform that would bring together all the information about accommodation, places of interest, and seasonal, sports and cultural activities…” In short, offering comprehensive and local information. The project had been launched a year earlier and already provided a fully developed technical solution. All that was missing was a commercial partner. Data-Moove, which had just been founded, met this need throughout the rest of the project, which ended in 2016.

Searching social media

During the three-year 3cixty project, the EURECOM researchers needed to solve the problem posed by the heterogeneity of the information sources. TripAdvisor and Facebook do not use the same language and information about a restaurant is not always available in the same format. They therefore needed to represent this stream of data collected from social networks by using semantic graphs: word clouds were linked together based on how they were related. People, places, dates and actions were described in a standardized way and then processed to provide the user with streamlined information, regardless of the source.

“Because we aggregate information from many sources, there is a good chance the same information will be presented twice in the data stream,” says Raphaël Troncy. This brings us to the second technological challenge: solving the problem of duplicates, which involved measuring the similarity between references to places, dates and names of events. “We therefore developed a learning algorithm to automatically carry out this work of studying the similarities,” the researcher explains. Another learning model was established to automatically predict the category of an event, even when little description is available. This makes it possible to directly present information as being related to sports, theater or music, for example.
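The deduplication step can be illustrated with a minimal sketch. The real system uses a trained similarity model; here a simple string-similarity ratio from Python’s standard library stands in for it, and the event listings and threshold are invented.

```python
# Illustrative deduplication of aggregated event listings. A plain
# string-similarity ratio stands in for the learned similarity model;
# the listings and the 0.8 threshold are hypothetical.
from difflib import SequenceMatcher

def similar(a, b, threshold=0.8):
    """True if two listings likely describe the same event."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

events = [
    "Jazz Festival - Gustavia, 12 May",
    "Jazz festival in Gustavia, May 12",   # same event, different source
    "Sailing regatta, Saint-Barthelemy, 20 May",
]

deduplicated = []
for e in events:
    if not any(similar(e, kept) for kept in deduplicated):
        deduplicated.append(e)
print(deduplicated)
```

Even this crude measure merges the two jazz-festival listings while keeping the regatta separate; the learned model plays the same role but also weighs structured fields such as places and dates.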

A tourism application for discovering all the tourist attractions Saint-Barthélemy has to offer.

Data-Moove implemented the technical solutions developed during the 3cixty project in its first product: City Moove, based on an application like the one used by Saint-Barthélemy. “Our technology for aggregating flows of information can also be connected to a preexisting application,” explains Frédéric Bossard, co-founder of Data-Moove. The goal is to avoid having an excessive number of digital tools for a region. The company also prefers to work with tourist offices to improve the tools they already use. “The problem many regions have is that they often have too many applications, each for a specific area,” he explains.

Tourism of the future, brick by brick

The two partners decided to capitalize on this success by taking the use of digital technology in tourism a step further. In 2017, they began partnering with the European PasTime project—also supported by EIT Digital—which is intended to make suggestions for activities when people are traveling. “The idea is to ask end users when they will arrive in a city and then directly propose an itinerary,” Raphaël Troncy explains. Once again, they carried out machine learning research on large volumes of data. They developed standard profiles based on interactions with users on social media. “The real challenge is to develop a package, in other words, connect interests with tastes in food and preferences for events,” the researcher explains. Here they were able to build on City Moove, to take the technology to a new level by adding a customized aspect.

And a third level is underway. Since February 2018, EURECOM and Data-Moove have been working on a new product: a smart conversational assistant to answer questions about a region’s tourist attractions. Their work, entitled MinoTour, is being carried out in the context of the European H2020 project Data Pitch. The chatbot they are developing will also learn from users’ searches and provide answers based on the aggregated data flow from City Moove. “There is a logic to our products,” says Frédéric Bossard. “We build brick by brick, from the database to the chatbot, developing the solutions best adapted to the geographical areas.”

After Saint-Barthélemy, Data-Moove will test its solutions in Saint-Tropez, Madeira and, on a wider scale, in the Provence-Alpes-Côte d’Azur region: areas with significant tourism activity, which will allow the partners to continue improving their products to better meet the needs of both the regions and the tourists.

[divider style=”normal” top=”20″ bottom=”20″]

The advantage of a partnership with Eurecom: “An operational perspective”

Frédéric Bossard, co-founder of Data-Moove

One of the objectives of the Télécom & Société Numérique Carnot Institute is to professionalize relations between companies and researchers. Frédéric Bossard, co-founder of Data-Moove, can testify to this: “It is nice to work with EURECOM because the researchers truly have an operational perspective, which is rare among academic partners. They quickly understand our constraints and what we want to accomplish. The quality of these discussions convinced us to enter a partnership rather than a simple collaboration. Today, EURECOM is a partner of Data-Moove, since the school has taken shares in the company. By making their laboratories and knowledge available to us, they allow us to take the development of our products to whole new levels.”

[divider style=”normal” top=”20″ bottom=”20″]

 

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]