
New Heroism: a paradoxical model of success

Today, the ideals of success cover multiple areas of society, such as work, cinema, and personal life. In his book Le Nouvel Héroïsme, Olivier Fournout, a sociologist and semiologist at Télécom Paris, analyzes the conditions that have allowed the emergence of a model of success that is riddled with paradoxes.

A hero is someone capable of bravery, of feats that reveal extraordinary courage. That is the dictionary definition, in any case. Another dictionary entry defines a hero as an individual worthy of public esteem for their strength of character, their genius, or their total dedication to their work or cause. In fiction, the term relates to mythology and to legendary characters who accomplish great feats; it also refers to the central characters of literary, dramatic and cinematographic works.

According to Olivier Fournout, a researcher in sociology at Télécom Paris, the modern approach to the hero intersects all these definitions. In our society, a hero can be Arnaud Beltrame, the policeman who saved a hostage and defended republican values. A hero can also be Johnny Hallyday: at the singer’s funeral, Emmanuel Macron proclaimed that the star, who conveyed an imaginary of rebellion and freedom, was a hero. It was also Emmanuel Macron who declared, in an interview in August 2017: “We must return to political heroism”. “Right now, on the roads of France,” reports Olivier Fournout, “there are Carrefour delivery trucks with the slogan ‘Thank you, heroes’ written on the side, and a photo of the supermarket’s employees”. For the sociologist, “the common use of the word hero to refer to such different people calls our most contemporary modernity into question”.

The matrix of heroism

The heroic model can be found in a myriad of paradoxical orders that seem suited to the present time and appear in extremely diverse and heterogeneous fields. The whole imaginary of the tragic hero is found in paradoxes that abound today on a multitude of levels. According to the discourses and images in broad circulation, in order to be a hero today, one has to be both “with others and against others, respect the frameworks and shatter them, and to be good, both on the outside and in one’s inner dimension,” argues Olivier Fournout, based on numerous pieces of evidence. Individuals are pushed to strive for this ideal either by myths or by real-life examples such as bosses and artists.

The difficulty lies in having to be empathetic while also being in competition. The researcher illustrates this in his book Le Nouvel Héroïsme with a Nike advertisement that portrays a young hockey player who knocks over garbage cans in the street, slams doors in people’s faces, and destroys walls by hitting pucks at them. Yet he also carries a disabled person up the stairs. Here we see both violence and a concern for others in everyday life. “This must be seen both as a form of empowerment, which can be positive for individuals, and as a form of endangerment. This duality, which characterizes the complexity of the matrix of heroism, is what I analyze in my book,” explains Olivier Fournout.

“The pressure on individuals to succeed and to constantly surpass themselves can give rise to psychosocial risks such as burnout or depression,” says the sociologist. In striving for this heroic model presented as an ideal, a person can overwork themselves. The difficulty of managing paradoxes such as cooperating and competing with those in one’s milieu can put an individual under psychological or cognitive stress, and the discourse of surpassing oneself creates difficulties of its own. Furthermore, the pressure weighing on each person is accompanied by a call for training or self-training, with the promise of an “increase in skills of self-expression, of creativity, and of management of social relations,” Olivier Fournout believes.

To describe the matrix of heroism, which he also calls the “matrix of paradoxical injunctions”, the sociologist drew on more than 200 treatises on management and personal growth, advertisements, news articles portraying bosses, and a corpus of 500 Hollywood-style movies. The goal was to show the common structure of these extremely diverse fields. “Even though the word hero comes from cinema, I have seen it used by professors and consultants in the United States to illustrate management theories,” says the researcher.

Establishing an imaginary

In his book, Olivier Fournout indicates that, to establish a dominant imaginary in our media spaces, it must first be incarnated in as wide a range of characters as possible. In the case of new heroism, this could be Arnaud Beltrame or Johnny Hallyday, but also representatives of Generation Z or the Start-up Nation, activists, or even a Sunday mountain biker. This imaginary must then be placed in a game of distorting mirrors across very heterogeneous universes, such as the world of work, individuals’ private lives, and great Hollywood myths. Thirdly, the matrix must be stabilized in the dominant editorial forms. Finally, the imaginary must pass through ‘portrait galleries’, i.e. role models conveyed in the press or in the world of management. These could be soccer players, artists, big bosses, or everyday heroes.

Olivier Fournout uses a theatrical metaphor to articulate this. He speaks of scenes and counter-scenes to illustrate the succession of public and private moments, of great, exceptional feats and of everyday heroism for everyone. He thus highlights the heterogeneity that forms part of the foundation of the heroic model. The sociologist uses the example of Shakespeare’s theater, which, in its historical plays, invites the spectator to observe the great official parades of power and to take a look behind the scenes. Some scenes portray the grand speeches of the future King Henry V, while others draw the spectator into the life of this prince who, before becoming king, lived in taverns with thieves. “What I call counter-scenes are the gray areas, the sequences that are less official than those that take place in the spotlight,” says the researcher.

Applied to the professional world, counter-scenes refer to the personal investment in one’s work, everything related to, for example, passion, sweat, or emotions such as fear in the face of risks or changes. The scenes, on the other hand, portray the performance in social roles with a control over the outward signals that one conveys. “Another metaphor that can illustrate this heterogeneity of the scenes and counter-scenes is that of forging and counter-forging. When blacksmiths forge, they strike the metals to shape them, but they also hold back their blows at times to regain momentum, which they call counter-forging,” says Olivier Fournout.

A model that covers different spheres

“In my corpus, there are management books written by professors from Harvard and MIT (Massachusetts Institute of Technology). These universities have a great power of dissemination that facilitates the propagation of an imaginary such as that of new heroism,” says the researcher. There is also a porosity between these universities and the world of consultants, who participate in the writing of bestsellers in this field.

But universities and businesses are not the only environments covered by the heroic model. During the Covid-19 pandemic, Camille Étienne, an environmental activist, made a video clip in which she referred to citizens as ‘heroes in pyjamas’, in reference to the reduction in pollution. The matrix of success is highly malleable and able to adapt to the world of tomorrow. This power of metamorphosis has been theorized by sociologists Ève Chiapello and Luc Boltanski in their work Le Nouvel Esprit du Capitalisme (The New Spirit of Capitalism): the strength of capitalism is to incorporate criticism in order to remain in a state of constant innovation. The same could apply to the model of new heroism. “Among the paradoxical orders of the modern hero is the injunction to follow the rules and to break them. A bestselling management book advises: ‘First, break all the rules’ – but of course, when you look closely, it is not all the rules. The art of the hero is there, hanging in a precarious balance, which can border on the tragic in times of crisis,” concludes Olivier Fournout.

Rémy Fauvel


What is digital sufficiency?

Digital consumption doubles every 5 years. This is due in particular to the growing number of digital devices and their increased use. This consumption also has an increasing impact on the environment. Digital sufficiency refers to finding the right balance for the use of digital technology in relation to the planet and its inhabitants. Fabrice Flipo, a researcher at Institut Mines-Télécom Business School and the author of the book “L’impératif de la sobriété numérique” (The Imperative of Digital Sufficiency) explains the issues relating to this sufficiency.

What observation is the concept of digital sufficiency based on?

Fabrice Flipo: On the observation of our increasing consumption of digital technology and its impact on the environment, especially in terms of greenhouse gases. This impact comes from both the growing use of digital tools and their manufacturing. Manufacturing digital tools requires extracting materials, which relies primarily on fossil fuels, and therefore on carbon. The use of these tools is also increasingly energy-intensive.

The goal is to include digital technology in discussions currently underway in other sectors, such as energy or transportation. Until recently, digital technology has been left out of these debates. This is the end of the digital exception.

How can we calculate the environmental impacts of digital technology?

FF: The government’s roadmap for digital technology primarily addresses the manufacturing of digital tools, which it indicates accounts for 75% of its impacts. According to this roadmap, the solution is to extend the lifespan of digital tools and combat planned obsolescence. But that’s not enough, especially since digital devices have proliferated in all infrastructure and their use is increasingly costly in energy. The amount of data consumed doubles every 5 years or so and the carbon footprint of the industry has doubled in 15 years.  
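As a rough illustration of what these figures imply, and assuming smooth exponential growth (a simplification), the doubling times quoted above can be converted into annual growth rates. This is a back-of-the-envelope sketch, not an official calculation method:

```python
# Back-of-the-envelope check of the growth figures quoted above, assuming
# smooth exponential growth: data volumes doubling every 5 years and the
# sector's carbon footprint doubling in 15 years.

def annual_growth_rate(doubling_time_years: float) -> float:
    """Annual growth rate implied by a given doubling time."""
    return 2 ** (1 / doubling_time_years) - 1

print(f"Data volume:      ~{annual_growth_rate(5):.1%} per year")   # ~14.9%
print(f"Carbon footprint: ~{annual_growth_rate(15):.1%} per year")  # ~4.7%
```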

It’s hard to compare figures about digital technology because they don’t all measure the same thing. For example, what should we count in order to measure internet consumption? The number of devices, the number of individual uses, the type of uses? So standardization work is needed.

A device such as a smartphone is used for many purposes. Consumption estimations are averages based on typical use scenarios. Another standardization issue is making indicators understandable for everyone. For example, what measurements should be taken into account to evaluate environmental impact?

What are the main energy-intensive uses of digital technology?

FF: Today, video is one of the uses that consumes the most energy. What matters is the size of the files and the fact that they are transmitted and processed across computers and networks: every time they are transmitted, energy is consumed. Video, especially high-resolution video, requires pixels to be refreshed up to 60 times per second. The size of the files makes their transmission and processing very energy-intensive. This is also the case for artificial intelligence programs that process images and video. Autonomous vehicles are also likely to use a lot of energy in the future, since they involve huge amounts of information.

What are the mechanisms underlying the growth of digital technology?

FF: Big companies are investing heavily in this area. They use traditional marketing strategies: target an audience that is particularly receptive to their arguments and able to pay, then gradually expand this audience and find new market opportunities. The widespread use of a device and of a practice leads to a gradual phasing out of alternative physical methods. When digital technology starts to take hold in a certain area, it often ends up becoming a necessary part of our everyday lives, and is then hard to avoid. This is referred to as the “lock-in” effect: a device is first considered to be of little use, but then becomes indispensable. For example, the adoption of smartphones was largely facilitated by offers funded by charging other users, through the sale of SMS messages. This helped lower the market entry cost for the earliest adopters of smartphones and create economies of scale. Smartphones then became widespread, and now it is hard to do without one.

How can we apply digital sufficiency to our lifestyles?

FF: Sufficiency is not simply a matter of “small acts”, but it cannot be enforced by a decree either. The idea is to bring social mindedness to our lifestyles, to regain power over the way we live. The balance of power is highly asymmetrical: on one side are the current or potential users who are scattered, and on the other are salespeople who tout only the advantages of their products and have extensive resources for research and for attracting customers. This skewed balance of power must be shifted. An important aspect is informing consumers’ choices. When we use digital devices today, we have no idea about how much energy we’re consuming or our environmental impact: we simply click. The aim is to make this information perceptible at every level, and to make it a public issue, something everyone’s concerned about. Collective intelligence must be called upon to change our lifestyles and reduce our use of digital technology, with help from laws if necessary.

For example, we could require manufacturers to obtain marketing authorization, as is required for medications. Before marketing a product or service (a new smartphone or 5G), the manufacturer or operator would have to provide figures for the social-ecological trajectory they seek to produce, through their investment strategy. This information would be widely disseminated and would allow consumers to understand what they are signing up for, collectively, when they choose 5G or a smartphone. That is what it means to be socially-minded: to realize that the isolated act of purchasing actually forms a system.

Today, this kind of analysis is carried out by certain associations or non-governmental organizations. For example, this is what The Shift Project does for free. The goal is therefore to transfer this responsibility and its cost to economic players who have far greater resources to put these kinds of analyses in place. Files including these analyses would then be submitted to impartial public organizations, which would decide whether or not a product or service may be marketed. The organizations that currently make such decisions are not impartial since they base their decisions on economic criteria and are stakeholders in the market that is seeking to expand.

How can sufficiency be extended to a globalized digital market?  

FF: It works through a leverage effect: when a new regulation is established in one country, it helps give more weight to collectives that are dealing with the same topic in other countries. For example, when the electronic waste regulation was introduced, many institutions protested. But gradually, an increasing number of countries have adopted this regulation.

Some argue that individual efforts suffice to improve the situation, while others think that the entire system must be changed through regulations. We must get away from such either-or reasoning and go beyond opposing viewpoints in order to combine them. The two approaches are not exclusive and must be pursued simultaneously.

By Antonin Counillon


What is life cycle analysis?

Life cycle analysis (LCA) is increasingly common, in particular for eco-design or to obtain a label. It is used to assess the environmental footprint of a product or service by taking into account as many sources as possible. In the following interview, Miguel Lopez-Ferber, a researcher in environmental assessment at IMT Mines Alès, offers insights about the benefits and complexity of this tool.

What is life cycle analysis?

Miguel Lopez-Ferber: Life cycle analysis is a tool for considering all the impacts of a product or service over the course of its life, from design to dismantling of assemblies, and possibly recycling – we also refer to this as “cradle to grave.” It’s a multi-criteria approach that is as comprehensive as possible, taking into account a wide range of environmental impacts. This tool is crucial for analyzing performance and optimizing the design of goods and services.

Are there standards?

MLF: Yes, there are European regulations and today there are standards, in particular ISO standards 14040 and 14044. The first sets out the principles and framework of LCA. It clearly presents the four phases of an LCA study: defining the objectives and scope of the study; the inventory phase; the impact assessment; and the interpretation phase. The ISO 14044 standard specifies the requirements and guidelines.

What is LCA used for?

MLF: The main benefit is that it allows us to compare different technologies or methods to guide decision-making. It’s a tremendous tool for companies looking to improve their products or services. For example, the LCA will immediately pinpoint the components of a product with the biggest impact. Possible substitutes for this component may then be explored, while studying the impacts these changes could lead to. And the same goes for services. Another advantage of the “life cycle” view is that it takes impact transfer into account. For example, in order to lower the impact of an oven’s power consumption, we can improve its insulation. But that will require more raw material and increase the impact of production. The LCA allows us to take these aspects into account and compare options over the entire lifetime of a product. It is a very powerful tool for quickly detecting these impact transfers.

How is this analysis carried out?

MLF: The ISO 14040 and 14044 standards clearly set out the procedure. Once the framework and objectives of the study have been identified, the inflows and outflows associated with the product or service must be determined – this is the inventory phase. These flows must then be traced back to flows exchanged with the environment. To do so, there are growing databases, with varying degrees of ease of access, containing general or specialized information: some focus on agricultural products and their derivatives, others on plastics or electricity production. This information about flows is collected, assembled and related to the flows for a functional unit (FU), which makes it possible to make comparisons. There is also accounting software to help compile the impacts of the various stages of a product or service.

The LCA does not directly analyze the product, but its function, which means it can compare very different technologies. So we define an FU that focuses on the service provided. Take two shoe designs, for example. Design A is of very high quality, so it requires more material to produce, but it lasts twice as long as Design B. Design A may have greater production impacts, but it will be equivalent to two Design Bs over time. For the same service provided, Design A could ultimately have a lower impact.
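A minimal sketch of this functional-unit reasoning, with made-up impact figures (the numbers, class and field names below are illustrative assumptions, not data from the interview):

```python
# Illustrative comparison of two shoe designs per functional unit (FU).
# The FU here is "one year of wear"; the impact figures are invented for
# the sake of the example.

from dataclasses import dataclass

@dataclass
class Design:
    name: str
    production_impact: float  # e.g. kg CO2-eq to produce one pair (assumed)
    lifetime_years: float     # how long one pair provides the service

    def impact_per_fu(self) -> float:
        """Impact per functional unit: one year of service."""
        return self.production_impact / self.lifetime_years

design_a = Design("Design A (high quality)", production_impact=12.0, lifetime_years=2.0)
design_b = Design("Design B", production_impact=8.0, lifetime_years=1.0)

for d in (design_a, design_b):
    print(f"{d.name}: {d.impact_per_fu():.1f} kg CO2-eq per year of wear")
```

With these assumed figures, Design A costs more to produce but less per year of service, which is exactly the kind of trade-off the functional unit makes visible.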

What aspects are taken into account in the LCA?

MLF: The benefit of life cycle analysis is that it has a broad scope, and therefore takes a wide range of factors into account. This includes direct as well as indirect impacts, consumption of resources such as raw material extraction, carbon footprint, and pollution released. So there is a temporal aspect, since the entire lifetime of a good or service must be studied, a geographical aspect, since several sites are taken into consideration, and the multi-criteria aspect, meaning all the environmental compartments. 

Who conducts the LCA?

MLF: When they are able to, and have the expertise to do so, companies have them done in-house. This is increasingly common. Otherwise, they can hire a consulting firm to conduct them. In any case, if the goal is to share this information with the public, the findings must be made available so that they can be reviewed, verified and validated by outside experts.

What are the current limitations of the tool?

MLF: There is the question of territoriality. For example, power consumption will not have the same impact from one country to another. In the beginning, we used global averages for LCA. We now have continental, and even national averages, but not yet regional ones. The more specific the data, the more accurate the LCA will be.  

Read more on I’MTech: The many layers of our environmental impact

Another problem is that of additional impacts. We operate under the assumption that impacts are cumulative and linear, meaning that manufacturing two pens doubles the impacts of a single pen. But this isn’t always the case. Imagine a factory that releases a certain amount of pollutants: this may be sustainable if it is the only one doing so, but not if three other companies are also doing so. Beyond a certain threshold, the environmental impact may increase in a non-linear way.

And we’re obviously limited by our scientific knowledge. Environmental and climate impacts are complex and the data changes in response to scientific advances. We’re also starting to take social aspects into consideration, which is extremely complex but very interesting.

By Tiphaine Claveau


What is NFV (Network Function Virtualization)?

The deployment of 5G has been made possible by the development of new technologies. The role of Network Function Virtualization, or NFV, is to virtualize network equipment. Adlen Ksentini, a researcher at EURECOM, gives us a detailed overview of this virtualization.

 

What is NFV?

Adlen Ksentini: NFV is the virtualization of network functions, a system that service providers and network operators had hoped for in order to decouple software from hardware. It’s based on cloud computing: the software can be placed in a virtual environment – the cloud – and run on everyday computers. The goal is to be able to use software that implements a network function and run it on different types of hardware, instead of having to purchase dedicated hardware.

How does it work?

A.K.: It relies on the use of a hypervisor, a virtualization layer that makes it possible to abstract the hardware. The goal is to virtualize the software that implements a network function to make it run on a virtual machine or a cloud-based container.

What kind of functions are virtualized?

A.K.: When we talk about network functions, this could refer to the router that sends packets to the right destination, firewalls that protect networks, DNS servers that translate domain names into IP addresses, or intrusion detection. All of these functions can be deployed in virtual machines or containers, so that a small or medium-sized company, for example, doesn’t have to invest in infrastructure to host these services, and may instead rent them from a cloud services provider, using the Infrastructure as a Service (IaaS) model.

What are the advantages of NFV?

A.K.: NFV provides all the benefits of cloud computing. First of all, it lowers costs since you only have to pay for the resources used. It also provides greater freedom since the virtualization layer enables the software to work on several types of hardware. It also makes it possible to react to varying degrees of traffic: if there’s a sudden rise in traffic, it’s possible to scale up to respond to the demand.

Performance is another factor involved. Under normal circumstances, the computer’s operating system will not dedicate all of the processor’s capacity to a single task – it will spread it out and performance may suffer. The benefit of cloud computing is that it can take advantage of the almost unlimited resources of the cloud. This also makes for greater elasticity, since resources can be freed up when they are no longer needed.
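A toy sketch of the scaling and elasticity idea described above: the number of virtual instances of a network function follows the traffic, growing when load rises and shrinking when the resources are no longer needed. The capacity figure and function name are illustrative assumptions, not part of any NFV standard or cloud API:

```python
# Toy sketch of NFV elasticity: the number of virtual instances of a network
# function (e.g. a virtual firewall) follows the traffic.
import math

def target_instances(requests_per_second: float,
                     capacity_per_instance: float = 1000.0,
                     minimum: int = 1) -> int:
    """How many instances are needed for the current load (scale out or in)."""
    return max(minimum, math.ceil(requests_per_second / capacity_per_instance))

for load in (200, 1500, 4800, 300):  # simulated traffic samples (req/s)
    print(f"{load:>5} req/s -> {target_instances(load)} instance(s)")
```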

Why is this technology central to 5G?

A.K.: 5G core networks are virtualized: they run natively in the cloud. So we need software that is able to run these network functions in the cloud. NFV provides a number of advantages, and that’s why it is used for the 5G core. NFV and SDN are complementary and together make it possible to obtain a virtual network.

Read more on I’MTech: What is SDN (Software-Defined networking)?

What developments are ahead for NFV?

A.K.: Telecommunications standards have created a framework for orchestrating and managing virtual resources, but the standard continues to evolve and a number of studies seek to improve it. Some aim to work on the security aspect, to better defend against attacks. But we’re also increasingly hearing about using artificial intelligence to enable the operator to improve resources without human intervention. That’s the idea behind Zero Touch Management: NFV networks that are self-correcting, self-managing and, of course, secure.

 

Tiphaine Claveau for I’MTech


What is tribology?

The science of friction: this is the definition of tribology. Tribology is a focal point shared by several disciplines and an important field of study for industrial production. Far from trivial, friction is a particularly complex phenomenon. Christine Boher, a tribologist at IMT Mines Albi[1], introduces us to this subject.

 

What does tribology study?

Christine Boher: The purpose of tribology is to understand what happens at the contact surface of two materials when they rub together, or are in “relative movement” as we call it. Everyone is familiar with the notion of friction: rubbing your hands together to warm up, playing the guitar, skiing, braking, oiling machines – all involve friction. Friction induces forces that oppose the movement, resulting in damage. We try to understand these forces by studying how they manifest themselves and the consequences they have on the behavior of materials. Tribology is therefore the science of friction, wear and lubrication. The phenomenon of wear and tear may seem terribly banal, but when you look more closely, you realize how complex it is!
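As a first-order illustration of the forces involved (a textbook model, not necessarily the one used by the researcher), the friction force opposing sliding is classically written with the Amontons-Coulomb law:

$$ F_t = \mu \, F_n $$

where $F_t$ is the tangential friction force opposing the relative movement, $F_n$ the normal load pressing the two surfaces together, and $\mu$ the friction coefficient. In practice $\mu$ depends on the pair of materials, their surface state, lubrication, wear particles and temperature, which is precisely why real contacts are far more complex than this simple law suggests.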

What scientific expertise is used in this discipline?

CB: It is a “multiscience”, because it involves many disciplines. A tribologist can be a researcher specializing in solid mechanics, fluid mechanics, materials, vibratory behavior, etc. Tribology is the conjunction of all these disciplinary fields, and this is what makes it so complex. Personally, I specialize in material sciences.

Why is friction so interesting?

CB: You first need to understand the role of friction in contact. Although it sounds intuitive, when two materials rub together, many phenomena occur: the surface temperature increases, the mechanical behavior of both parts changes, and wear particles are created, which have an impact on the load and sliding speed. As a result, material properties arise that would not appear without friction. Tribology focuses on both the macrometric and micrometric aspects of the surfaces of materials in contact.

How is the behavior of a material changed by friction?

CB: Take for example the wear particles generated during friction. As they are generated, they can decrease the frictional resistance between the two bodies. They then act as a solid lubricant, and in most cases they have a rather useful, desirable effect. However, these particles can damage the materials if they are too hard. If this is the case, they will accelerate the wear. Tribologists therefore try to model how, during friction, these particles are generated and under what conditions they are produced in optimal quantities.

Another illustration is the temperature increase of parts. In some cases of high-speed friction, the temperature of the materials can rise from 20°C to 700°C in just a few minutes. The mechanical properties of the material are then completely different.

Could you illustrate an application of tribology?

CB: Take the example of a rolling mill, a large tool designed to produce sheet metal by successive reductions in thickness. There is a saying in the discipline: “no friction, no rolling”. If problems arise during friction, that is, if there are problems of contact between the surface of the sheets and that of the rolls, the sheets will be damaged. For the automotive industry, this means body sheets damaged during the production phase, compromising surface integrity. To avoid this, we work in collaboration with the relevant manufacturers either on new metallurgical alloys or on new coatings to be applied to the rolls. The purpose of the coating is to protect the material from wear and to slow down the damage to the working surfaces as much as possible.

Who are the main beneficiaries of tribology research?

CB: We work with manufacturers in the shaping industry, such as ArcelorMittal or Aubert & Duval. We also have partnerships with companies in the aeronautics sector, such as Ratier-Figeac. Generally, we are called in by major groups or subcontractors of major industrial groups because they are interested in increasing their production speeds, and this is where friction-related performance becomes important.

 

[1] Christine Boher is a researcher at the Institut Clément Ader, a joint research unit
of IMT Mines Albi/CNRS/INSA Toulouse/ISAE Supaero/University Toulouse III Paul Sabatier/Federal University Toulouse Midi-Pyrénées.


What is eco-design?

In industry, it is increasingly necessary to design products and services with concern and respect for environmental issues. Such consideration is expressed through a practice that is gaining ground in a wide range of sectors: eco-design. Valérie Laforest, a researcher in environmental assessment, environmental engineering and organizations at Mines Saint-Étienne, explains the term.

 

What does eco-design mean?

Valérie Laforest: The principle of eco-design is to incorporate environmental considerations from the earliest stages of creating a service or product, meaning from the design stage. It’s a method governed by standards, at the national and international level, describing concepts and setting out current best practices for eco-design. We can just as well eco-design a building as we can a tee-shirt or a photocopying service.

Why this need to eco-design?

VL: There is no longer any doubt about the environmental pressure on the planet. Eco-design is one concrete way for us to think about how our actions impact the environment and consider alternatives to traditional production. Instead of producing, and then looking for solutions, it’s much more effective and efficient to ask questions from the design stage of a product to reduce or avoid the environmental impact.

What stages does eco-design apply to?

VL: In concrete terms, it’s based entirely on the life cycle of a system, from very early on in its existence. Eco-design thinking takes into account the extraction of raw materials, as well as the processing and use stages, until end of life. If we recover the product when it is no longer usable, to recycle it for example, that’s also an example of eco-design. As it stands today, end-of-life products are either sent to landfills, incinerated or recycled. Eco-design means thinking about the materials that can be used, but also thinking about how a product can be dismantled so as to be incorporated within another cycle.

When did we start hearing about this principle?

VL: The first tools arrived in the early 2000s but the concept may be older than that. Environmental issues and associated research have increased since 1990. But eco-design really emerged in a second phase when people started questioning the environmental impact of everyday things: our computer, sending an email, the difference between a polyester or cotton tee-shirt.

What eco-design tools are available for industry?

VL: The tools can fall into a number of categories. There are relatively simple ones, like check-lists or diagrams, while others are more complex. For example, there are life-cycle analysis tools to identify the environmental impacts, and software to incorporate environmental indicators in design tools. The latter require a certain degree of expertise in environmental assessment and a thorough understanding of environmental indicators. And developers and designers are not trained to use these kinds of tools.

Are there barriers to the development of this practice?

VL: There’s a real need to develop special tools for eco-design. Sure, some already exist, but they’re not really adapted to eco-design and can be hard to understand. This is part of our work as researchers, to develop new tools and methods for the environmental performance of human activities. For example, we’re working on projects with the Écoconception center, a key player in the Saint-Etienne region as well as at the national level.

In addition to tools, we also have to go visit companies to get things moving and see what’s holding them back. We have to consider how to train, change and push companies to get them to incorporate eco-design principles. It’s an entirely different way of thinking that requires an acceptance phase in order to rethink how they do things.

Is the circular economy a form of eco-design?

VL: Or is eco-design a form of the circular economy? That’s an important question, and answers vary depending on who you ask. Stakeholders who contribute to the circular economy will say that eco-design is part of this economy. And on the other side, eco-design will be seen as an initiator of the circular economy, since it provides a view of the circulation of material in order to reduce the environmental impact. What’s certain is that the two are linked.

Tiphaine Claveau for I’MTech

[box type=”info” align=”” class=”” width=””]

This article was published as part of Fondation Mines-Télécom‘s 2020 brochure series dedicated to sustainable digital technology and the impact of digital technology on the environment. Through a brochure, conference-debates, and events to promote science in conjunction with IMT, this series explores the uncertainties and challenges of the digital and environmental transitions.

[/box]

 


What is a digital twin?

Digital twins, digital doubles – what exactly do these terms mean? Raksmey Phan, an engineer at the Mines Saint-Étienne Centre for Biomedical and Health Engineering (CIS)[1], talks to us about the advantages and advances offered by these new tools, as well as the issues involved.

 

What does a digital twin refer to?

Raksmey Phan: If you have a digital, mathematical model representing a real system, based on data from this real system, then you have a digital twin. Of course, the quality of the digital twin depends first and foremost on the mathematical model. Industrial ovens are a historic example that can help explain this idea.

To create a digital twin, we record information about the oven, which could include its operating hours or the temperature each time it’s used. Combined with algorithms that take into account the physical components that make up the oven, this digital twin will calculate its rate of wear and tear and anticipate breakdown risks. The use of the oven can then be monitored in real time and simulated in its future state with different use scenarios in order to plan for its replacement.
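A minimal sketch of what such a twin could look like in code. The wear model, its coefficients and the class names below are illustrative assumptions, not the actual models used at the CIS:

```python
# Minimal sketch of the oven digital twin described above: it records data
# from the real oven and estimates wear from it. Hotter runs are assumed to
# wear the oven faster; all figures are invented for the illustration.

from dataclasses import dataclass, field

@dataclass
class OvenTwin:
    wear_per_hour: float = 0.001        # fraction of service life per hour at 200 °C (assumed)
    reference_temp_c: float = 200.0
    runs: list = field(default_factory=list)  # recorded (hours, temperature) uses

    def record_run(self, hours: float, temperature_c: float) -> None:
        """Store one real-world use of the physical oven."""
        self.runs.append((hours, temperature_c))

    def wear_estimate(self) -> float:
        """Cumulative wear estimated from the recorded uses."""
        return sum(h * self.wear_per_hour * (t / self.reference_temp_c)
                   for h, t in self.runs)

twin = OvenTwin()
twin.record_run(hours=8, temperature_c=220)
twin.record_run(hours=5, temperature_c=180)
print(f"Estimated wear: {twin.wear_estimate():.2%} of service life")
```

The same structure can then be fed with simulated future runs to anticipate breakdowns and plan the oven's replacement, which is the monitoring-plus-prediction role described above.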

In what fields are they used?

RP: They can be used in any field where there is data to be recorded. We could say that climatologists make a digital twin of our planet: based on observational data recorded about our planet, they run simulations, and therefore mathematical models, resulting in different scenarios. To give another example, at the Mines Saint-Étienne CIS, we have scientists such as Xiaolan Xie, who are internationally renowned for their experience and expertise in the field of modeling healthcare systems. One of our current projects is a digital twin of the emergency department at Hôpital Nord de Saint-Étienne, which is located 200 meters from our center.

What advantages do digital twins offer?

RP: Let’s take the example of the digital twin of the emergency room. We’ve integrated anonymized patient pathways over a one-year period in a model of the emergency room. In addition to this mathematical model, we receive data in what can be referred to as ‘pseudo-real time,’ since there is a lapse of one hour from the time patients arrive in the department. This makes it possible for us to do two important things. The first is to track the patients’ movement through the department in pseudo-real time, using the data received and the analysis of pathway records. The second is the ability to plan ahead and predict future events. Imagine if there was a bus accident in the city center. Since we know what types of injuries result from such an accident, we can visualize the impact it would have on the department, and if necessary, call in additional staff.

What did people do before there were digital twins?

RP: Companies and industries were already using the concept before the term existed. Since we’ve been using machines, engineers have tried to monitor tools with replicas – whether digitally or on paper. It’s a bit like artificial intelligence. The term is back in fashion but the concept goes back much further. Algorithms are mathematics, and Napoleon used algorithms for his war logistics.

When did the term digital twin first start to be used?

RP: The term ‘digital twin’ was first used in 2002 in articles by Michael Grieves, a researcher at the Florida Institute of Technology. But the concept has existed for as long as we have been trying to model real phenomena digitally, which is to say since the early days of computing. There has been renewed interest in digital twins in recent years, however, due to the convergence of three scientific and technological innovations. First, the impressive growth in our ability to analyze large amounts of data — Big Data. Second, the democratization of connected sensors — the Internet of Things. And third, renewed interest in algorithms in general, as well as in cognitive sciences — Artificial Intelligence.

How have the IoT and Big Data transformed digital twins?

RP: A digital twin’s quality depends on the quantity and quality of data, as well as on its ability to analyze this data, meaning its algorithms and computing capacity. IoT devices have provided us with a huge amount of data. The development of these sensors is an important factor – production has increased while costs have decreased. The price of such technologies will continue to drop, and at the same time, they will become increasingly accurate. That means that we’ll be able to create digital twins of larger, more complex systems, with a greater degree of accuracy. We may soon be able to make a digital twin of a human being (project in the works at CIS).

Are there technological limitations to digital twins?

RP: Over the last five years, everything’s been moving faster at the technological level. It’s turned into a race for the future. We’ll develop better sensors, and we’ll have more data, and greater computing power. Digital twins will also follow these technological advances. The major limitation is sharing data – the French government was right to take steps towards Open Data, which is free data, shared for the common good. Protecting and securing data warehouses are limiting factors but are required for the technological development of digital twins. In the case of our digital twin of the hospital, this involves a political and financial decision for hospital management.

What are some of the challenges ahead?

RP: The major challenge, which is a leap into the unknown, is ethics. For example, we can assess and predict the fragility of senior citizens, but what should we do with this information after that? If an individual’s health is likely to deteriorate, we could warn them, but without help it will be hard for them to change their lifestyle. However, the information may be of interest to their insurance providers, who could support individuals by offering recommendations (appropriate physical activity, accompanied walks, etc.). This example hinges on the issues of confidentiality and anonymization of data, not to mention the issue of the informed consent of the patient.

But it’s incredible to be talking about confidentiality, anonymization and informed consent as a future challenge — although it certainly is the case — when for the past ten years or so, a portion of the population has been publishing their personal information on social media and sharing their data with wellness applications whose data servers are often located on another continent.

[1] Raksmey Phan is a researcher at the Laboratory of Informatics, Modelling and Optimization of Systems (LIMOS), a joint research unit of Mines Saint-Étienne/CNRS/Université Clermont-Auvergne.



Mendeleev: The history of a table

2019 marks the 150th anniversary of the periodic table of elements. To celebrate this anniversary, the Mines ParisTech Library and Mineralogy Museum have teamed up to create the exhibition Before Mendeleev: Genesis of a Table, on view until 31 January 2020. The exhibition traces the contributions of the scientists who preceded Mendeleev and led him to present the periodic table of elements, which has since served as a reference for all scientists and students.

 

To celebrate the 150th anniversary of the periodic table of elements, Mines ParisTech is presenting the exhibition Before Mendeleev: Genesis of a Table until 31 January 2020. Visitors have the opportunity to discover the scientists who contributed to formulating this classification and to developing knowledge over the years. Amélie Dessens is a curator at the library and head of Mines ParisTech’s heritage collections and Sarah Hijmans is a PhD student at the Université de Paris’ SPHère laboratory. They created the exhibition in collaboration with Didier Nectoux, curator at the Mineralogy Museum, to showcase and share the rich cultural collections of the school’s library and museum. “It’s this type of exhibition, along with school partnerships,” says Didier Nectoux, “that allows us to keep this heritage alive outside of the school.” He adds, “this rich heritage must be preserved and shared. And these collections are still essential today. The transformations of the 21st century are driving us to study new possibilities to find alternatives, and we need documentation, archives, in order to know which avenues have already been studied and abandoned, and the reasons why.”

The exhibition, which is presented in chronological order, starts on the doorstep of the Library with the beginnings of the study of elements: alchemy. “The alchemists were not just interested in turning lead into gold,” explains Amélie Dessens. “Beyond the esoteric sense with which alchemy is often associated today, it was also – and more importantly – the beginning of chemistry and of identifying the elements that are presented here.” In display cases, eight minerals accompany the works: the first seven elements identified, and bismuth, whose earliest written record dates from 1558, by the German scholar Georg Agricola. It was, however, already well-known in European mining centers before that date, which demonstrates the importance of accompanying discoveries with publication – crucial for situating knowledge in time.

A long road to developing the table

From Bergen to Lavoisier and Döbereiner to Newlands, a series of display cases present the various steps of the advances, decisions and research that shaped the study and classification of the elements. First, there was Lavoisier, who brought about a true chemical revolution by introducing a scientific method to prove his theories, proposing the first classification of the “33 simple substances,” and working with Berthollet to develop a chemical nomenclature, which made it possible for everyone to use the same names for the elements. The second major turning point came in the 1860s, when scientists realized that elements could have similar chemical properties related to their atomic weight. They thus started to classify the elements based on these criteria and proposed potential classification formats, which are presented in the exhibition through diagrams, notes and publications.

For example, there was Alexandre-Émile Béguyer de Chancourtois, a geologist, mineralogist and professor at the École des Mines de Paris, who made a significant contribution in 1862. He was the first to demonstrate the principle of periodicity through a spiral-shaped classification: the telluric screw. “Mendeleev was not the first to demonstrate periodicity, or to indicate where the missing elements should be placed in the table,” explains exhibition curator Amélie Dessens, “but unlike the others, he dared to predict the properties of the missing elements.” Dmitri Mendeleev published his table in 1869. When gallium was discovered in 1875, confirming his predictions, the news spread throughout the scientific community. It was at this point that Mendeleev’s classification made its mark on history and earned its place in our textbooks.


20 terms for understanding quantum technology

Quantum mechanics is central to much of the technology we use every day. But what exactly is it? The 11th Fondation Mines-Télécom booklet explores the origins of quantum technology, revealing its practical applications by offering a better understanding of the issues. To clarify the concepts addressed, the booklet includes a glossary, from which this list is taken.

 

Black-body radiation – Thermal radiation of an ideal object absorbing all the electromagnetic energy it receives.

Bra-ket notation (from the word bracket) – Formalism that facilitates the writing of equations in quantum mechanics.

Coherent detectors – Equipment used to detect photons based on the amplitude and phase of the electromagnetic signal rather than on interactions with other particles.

Decoherence – Phenomenon by which each possibility of a quantum superposition state interacts with its environment, to a degree of complexity that makes the different possibilities mutually incoherent and unobservable.

Entanglement – Phenomenon in which two quantum systems present quantum states that are dependent on one another, regardless of the distance separating them.

Locality (principle of) – The idea that two distant objects cannot directly influence each other.

Momentum – Product of the mass and the velocity vector of an object at a given time.

NISQ (Noisy Intermediate-Scale Quantum) – Current class of quantum computers

Observable (noun) – Concept in the quantum world comparable to a physical value (position, momentum, etc.) in the classical world.

Quanta – The smallest indivisible unit (of energy, momentum, etc.)

Quantum Hall effect – The classical Hall effect refers to the voltage created by an electric current flowing through a material immersed in a magnetic field. Under certain conditions, this voltage increases in discrete increments: this is the quantum Hall effect.

Quantum state – A concept that differs from a classical physical system, in which measured physical values like position and speed are sufficient in defining the system. A quantum state provides a probability distribution for each observable of the quantum system to which it refers.

Quantum system – Refers to an object studied in a context in which its quantum properties are interesting, such as a photon, mass of particles, etc.

Qubit – Refers to a quantum system in which a given observable (the spin, for example) is a superposition of two independent quantum states (written out in the notation example after this glossary).

Spin – Like the electric charge, one of the properties of particles.

Superposition principle – Principle whereby the same quantum state can have several values for a given observable.

The Schrödinger wave function – A fundamental concept of quantum mechanics, a mathematical function representing the quantum state of a quantum system.

Uncertainty Principle – Mathematical inequality that expresses a fundamental limit to the precision with which two physical properties of the same particle can be known simultaneously.

Wave function collapse – Fundamental concept of quantum mechanics that states that after a measurement, a quantum system’s state is reduced to what was measured.

Wave-particle duality (or wave-corpuscle duality) – The principle that a physical object sometimes has wave properties and sometimes corpuscular properties.
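To make a few of these definitions concrete, here is the standard textbook notation (an illustration added alongside this glossary, not taken from the booklet) for a qubit written in bra-ket notation as a superposition of two states, and for the uncertainty principle applied to position and momentum:

$$ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 $$

$$ \Delta x \,\Delta p \;\geq\; \frac{\hbar}{2} $$

Measuring the qubit collapses this state to |0⟩ or |1⟩ with probabilities |α|² and |β|², which ties together the superposition principle, wave function collapse and the probabilistic nature of a quantum state.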

Also read on I’MTech


Interactions Materials-Microorganisms

This book is devoted to biocolonization, the biodeterioration of materials and possible improvements in their performance. Many materials age according to their use and their environment, and the presence of microorganisms can then lead to biodeterioration. However, microorganisms can also help protect structures, provided their properties are used wisely. Christine Lors, a researcher at IMT Lille Douai, is co-author of this book, published in English. Here is its presentation.

Read on I’MTech: When microorganisms attack or repair materials

[box type=”shadow” align=”” class=”” width=””]This multidisciplinary book is the result of a collective work synthesizing presentations made by various specialists during the CNRS “BIODEMAT” school, which took place in October 2014 in La Rochelle (France). It is designed for readers from a range of scientific specialties (chemistry, biology, physics, etc.) and examines various industrial problems (e.g. water, sewerage and maintaining building materials).

Metallic, cementitious, polymeric and composite materials age depending on their service and operational environments. In such cases, the presence of microorganisms can lead to biodeterioration. However, microorganisms can also help protect structures, provided their immense possibilities are mastered and put to good use.

This book is divided into five themes related to biocolonization, material biodeterioration, and potential improvements to such materials resulting in better performance levels with respect to biodeterioration:
• physical chemistry of surfaces;
• biofilm implication in biodeterioration;
• biocorrosion of metallic materials;
• biodeterioration of non-metallic materials;
• design and modification of materials.

The affiliations of the authors of the various chapters illustrate the synergy between academic research and its transfer to industry. This demonstrates the essential interaction between the various actors in this complex field: analysing, understanding, and responding to the scientific issues related to biodeterioration.[/box]

[divider style=”normal” top=”20″ bottom=”20″]

Interactions Materials – Microorganisms
Concretes and Metals more Resistant to Biodeterioration
Christine Lors, Françoise Feugeas, Bernard Tribollet
EDP Sciences, 2019
416 pages
€75.00 (paperback) – €51.99 (PDF)

Order the book