
Without noise, virtual images become more realistic

With increased computing capacity, computer-generated images are becoming more and more realistic. Yet generating these images is very time-consuming. Tamy Boubekeur, a specialist in 3D computer graphics at Télécom ParisTech, is working to solve this problem. He and his team have developed a new technology that relies on noise-reduction algorithms to save computing resources while still delivering high-quality images.


Have you ever been impressed by the quality of an animated film? If you are familiar with cinematic video games or short films created with computer-generated images, you probably have. If not, keep in mind that the latest Star Wars and Fantastic Beasts and Where to Find Them movies were not shot on a satellite superstructure the size of a moon or by filming real magical beasts. The sets and characters in these big-budget films were primarily created using 3D models of astonishing quality. One of the many examples of these impressive graphics: the demonstration by the team from Unreal Engine, a video game engine, at the Game Developers Conference last March. Working in collaboration with Nvidia and ILMxLAB, they created a fictitious Star Wars scene in which all of the characters and sets were computer-generated.


To trick viewers, high-quality images are crucial. This is an area Tamy Boubekeur and his team from Télécom ParisTech specialize in. Today, most high-quality animation is produced using a specific type of computer-generated image: photorealistic computer generation using path tracing. This method begins with a 3D model of the desired scene, with the structures, objects and people. Light sources are then placed in the artificial scene: the sun outside, or lamps inside. Then paths are traced starting from the camera—what will be projected on the screen from the viewer’s vantage point—and moving towards the light source. These are the paths light takes as it is reflected off the various objects and characters in the scene. Through these reflections, the changes in the light are associated with each pixel in the image.

“This principle is based on the laws of physics and Helmholtz’s principle of reciprocity, which makes it possible to ‘trace the light’ using the virtual sensor,” Tamy Boubekeur explains. Each time the light bounces off objects in the scene, the equations governing the light’s behavior and the properties of the modeled materials and surfaces define the path’s next direction. Simulating the spread of the light therefore makes it possible to capture all the changes and optical effects that the eye perceives in real life. “Each pixel in the image is the result of hundreds or even thousands of paths of light in the simulated scene,” the researcher explains. The final color of the pixel is then generated by computing the average of the color responses from each path.
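To make this averaging step concrete, here is a minimal sketch of the idea in Python. It is not the team’s renderer: `trace_path` is a hypothetical stand-in for the full simulation of one light path, returning a single color sample for the pixel.

```python
import random

def render_pixel(trace_path, num_paths=1024):
    """Estimate a pixel's color by averaging the (r, g, b) radiance
    carried by many simulated light paths (Monte Carlo estimation).
    trace_path() stands in for the full simulation of one path from
    the virtual camera into the scene."""
    total = [0.0, 0.0, 0.0]
    for _ in range(num_paths):
        sample = trace_path()  # one light path -> one color sample
        for i in range(3):
            total[i] += sample[i]
    return tuple(channel / num_paths for channel in total)

# Toy usage: a "scene" whose paths return noisy gray samples.
# With more paths, the average converges and the noise fades.
noisy_gray = lambda: [random.uniform(0.0, 1.0)] * 3
print(render_pixel(noisy_gray, num_paths=16))    # visibly noisy
print(render_pixel(noisy_gray, num_paths=4096))  # close to (0.5, 0.5, 0.5)
```

The cost described below follows directly from this structure: every additional path improves the estimate only slowly, so reaching a noise-free average takes thousands of samples per pixel.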

Saving time without noise

The problem is, achieving a realistic result requires a tremendous number of paths. “Some scenes require thousands of paths per pixel and per image: it takes a week of computing to generate the image on a standard computer!” Tamy Boubekeur explains. This is simply too long and too expensive. A film contains 24 images per second; in one year of computing, less than two seconds of film would be produced on a single machine. Enter noise-reduction algorithms—specifically those developed by the team from Télécom ParisTech. “The point is to stop the calculations before reaching thousands of paths,” the researcher explains. “Since we have not gone far enough in the simulation process, the image still contains noise. Other algorithms are used to remove this noise.” The noise alters the sharpness of the image and depends on the type of scene, the materials, the lighting and the virtual camera.

Research on noise reduction has flourished since 2011, and many algorithms based on different approaches now exist. Competition is fierce, and performance is measured by a program’s capacity to reduce computing times while producing a final result free of noise. The Bayesian collaborative denoiser (BCD) technology, developed by Tamy Boubekeur’s team, is particularly effective in achieving this goal. Developed from 2014 to 2017 as part of Malik Boudiba’s thesis, the algorithms used in this technology are based on a unique approach.

Normally, noise removal methods attempt to guess the amount of noise present in a pixel based on properties of the observed scene—especially its visible geometry—in order to remove it. “We recognized that the properties of the scene being observed could not account for everything,” Tamy Boubekeur explains. “The noise also originates from areas not visible in the scene, from materials reflecting the light, the semi-transparent matter the light passes through or properties of the modeled optics inside the virtual camera.” A defocused background or a window in the foreground can create varying degrees of noise in the image. The BCD algorithm therefore only takes into account the color values associated with the hundreds of paths calculated before the simulation is stopped, just before these values are averaged into a pixel color. “Our model estimates the noise associated with a pixel based on the distribution of these values, analyzes similarities with the properties of other pixels, and removes the noise from them all at once,” the researcher explains.
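The sketch below only illustrates this general idea under simplified assumptions. It is not the published BCD algorithm, which uses a Bayesian distance between per-pixel sample histograms rather than the crude mean-and-variance summary used here:

```python
import numpy as np

def toy_collaborative_denoise(samples, threshold=0.1):
    """samples: (num_pixels, num_paths, 3) array of the per-path color
    values kept *before* they are collapsed into a single pixel color.
    Pixels whose sample distributions (summarized here by mean and
    variance) look alike are denoised together, pooling their
    information instead of averaging each pixel in isolation."""
    means = samples.mean(axis=1)      # (num_pixels, 3)
    variances = samples.var(axis=1)   # (num_pixels, 3)
    denoised = np.empty_like(means)
    for i in range(means.shape[0]):
        # crude distance between distribution summaries; the real BCD
        # uses a Bayesian measure on sample histograms
        dist = (np.abs(means - means[i]).mean(axis=1)
                + np.abs(variances - variances[i]).mean(axis=1))
        similar = dist < threshold
        denoised[i] = means[similar].mean(axis=0)
    return denoised

# 100 pixels of a flat region, 64 noisy paths each: pooling similar
# pixels yields a steadier estimate than each pixel's own average.
rng = np.random.default_rng(0)
samples = 0.5 + 0.2 * rng.standard_normal((100, 64, 3))
print(toy_collaborative_denoise(samples).std(), samples.mean(axis=1).std())
```

The key design point matches the article: the method needs only the per-path color samples, not the scene's geometry, so it works regardless of what caused the noise.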

A sharp image of Raving Rabbids

The BCD technology was developed within the PAPAYA project, launched under the French National Fund for Digital Society. The project was led in partnership with Ubisoft Motion Pictures to define the key noise-reduction challenges for professional animation. The company was impressed by the BCD algorithms and integrated them into its graphics production engine, Shining, then used them to produce its animated series, Raving Rabbids. “They liked that our algorithms work with any type of scene, and that the technology is integrated without causing any interference,” Tamy Boubekeur explains. The BCD noise-remover does not require any changes in image calculation methods and can easily be integrated into systems and teams that already have well-established tools.

The source code for the technology has been published as open source on GitHub. It is freely available, particularly for animation professionals who prefer open technology over more rigid proprietary tools. An update to the code adds an interactive preview module that allows users to adjust the algorithm’s parameters, making it easier to optimize computing resources.

The BCD technology has therefore proven its worth and has now been integrated into several rendering engines. It offers access to high-quality image synthesis, even for those with limited resources. Tamy Boubekeur reminds us that a film like Disney’s Big Hero 6 contains approximately 120,000 images and required some 200 million hours of computing time and thousands of processors to be produced in a reasonable timeframe. For students and amateur artists, such technical resources are out of reach. Algorithms like those used in the BCD technology offer them the hope of more easily producing very high-quality films. And the team from Télécom ParisTech is continuing its research to further reduce the computing time required. Their objective: new methods for distributing light-simulation calculations across several low-capacity machines.

[divider style=”normal” top=”20″ bottom=”20″]

Illustration of BCD denoising a scene, before and after implementing the algorithm



Coopetition between individuals, little-understood interactions

Mehdi Elmoukhliss, Institut Mines-Télécom Business School and Christine Balagué, Institut Mines-Télécom Business School

[divider style=”normal” top=”20″ bottom=”20″]

Coopetition is a concept from management science (especially strategy), originally used to describe situations in which organizations (companies, clubs, etc.) simultaneously cooperate and compete with one another, as paradoxical as that may seem. A recent article in The Conversation pointed to the potential role of coopetition in evolution, underscoring that it can be found in the animal kingdom (to explain the evolution of species) as much as in companies and organizations.

We would like to provide another perspective here by highlighting the fact that coopetition can be observed in relationships between individuals, which opens up a wide range of potential applications.

A few examples

A variety of situations can be considered examples of coopetition between individuals. In companies, for example, how many colleagues cooperate while knowing that only one of them will become the boss in the event of a promotion? When he was serving as Minister of the Economy under François Hollande while secretly preparing to run for president, was Emmanuel Macron not in coopetition with the President of the Republic, since he had to cooperate with his rival?

Relationships between individuals are rarely archetypal (purely cooperative or purely competitive). They are often mixed, hybrid, simultaneously cooperative and competitive. Inter-individual coopetition is even a hiring technique in human resource management: some recruiters interview candidates by asking them to work together on a project only to select certain candidates to continue in the interview process.

Inter-individual coopetition can also be seen in the scientific world, where researchers often cooperate with others to carry out a study, while competing with one another in terms of career or prestige. Online, a number of platforms (for crowdsourcing ideas, for example) seek to promote cooperation between users while making them compete with one another, for instance to identify “the best contributors.” Coopetition also occurs in the world of sports. In football or cycling, athletes must sometimes cooperate to win, while competing to become the “star” of the game or race. In basketball, the famous Shaquille O’Neal–Kobe Bryant duo helped the Lakers win three consecutive titles from 2000 to 2002, despite the rivalry between the two players.


But coopetition is not a sort of interdependence reserved for the “ruthless” worlds of business, politics, research, competition for ideas or competitive sports. Consider the example of mushroom lovers. Many of them communicate on forums or social networks. In these virtual communities, members exchange advice (for example, opinions about the toxicity of mushrooms) as well as important information about locations where highly coveted mushrooms grow. While amateur and experienced mycologists collaborate to identify zones of interest, the information exchanged is intentionally vague. Members indicate their geographic area but rarely specify the slope or altitude, let alone the GPS coordinates! The information they share is enough to help others without “letting the mushrooms out of the bag.”

A forgotten model

Coopetition, like cooperation and competition, appears to be an observable phenomenon in a wide range of social situations. It is not a new ideological ideal but rather a “forgotten” model for collective action. It is not unique to contemporary western societies either. Anthropologist Margaret Mead’s research showed that certain indigenous tribes are based on “varying” degrees of cooperation and competition.

Surprisingly, this possibility has received little research attention. As pointed out by Paul Chiambaretto and Anne-Sophie Fernandez or Julien Granata in The Conversation, this can be explained by a cultural approach, specific to the western world, anchored in philosophical views in which cooperation and competition are seen as opposites.

Further reading: Coopétition, moteur de l’évolution des espèces (Coopetition, a driving force for the evolution of species)

In social psychology – one major area for studies on cooperation and competition between individuals – Morton Deutsch’s research led to the development of social interdependence theory in 1949, now considered the reference theory on cooperation and competition between individuals. One of the assumptions of this structuralist theory is that mixed situations are common but of little theoretical interest, since they will always be guided by a dominant mechanism (cooperation or competition).

Deutsch adds that these situations are, in any event, sub-optimal. As a result, studies on cooperation and competition in psychology have primarily adopted an either/or approach to cooperation and competition. Yet, the opposition assumed by Morton Deutsch has not been formally proven, and for many psychological researchers, this assumption should be challenged. Although this limitation was originally pointed out in the 1960s, several decades would go by before social sciences researchers started working on this topic, showing how coopetition between individuals differs from the two traditional models.

What we know

Emerging research on inter-individual coopetition focuses primarily on companies and virtual platforms, which have been studied in laboratory experiments. This research shows that coopetition between individuals boosts their creativity in a variety of contexts, whether in face-to-face or online situations. Far from being counterproductive, this duality has certain benefits.

Research carried out in companies shows that inter-individual coopetition does not hinder learning in teams. Although little is known about how this particular organizational method impacts individuals, it has been shown that employees do not all react to coopetition in the same way: some easily accept the situation in which they find themselves and know how to “play the game” with great skill, while others find it more difficult and ultimately “choose a side” – cooperate or compete. Inter-individual coopetition can also create tension and governance issues, which may be resolved in part through a new management style better suited to “coopetitive” teams.

The risks of inter-individual coopetition

Despite the level of enthusiasm for these little-studied situations, the risks of inter-individual coopetition must not be ignored. It raises some important questions:

  • Does it not open the door to widespread suspicion, and to paranoia? Does coopetition between individuals not create an unhealthy atmosphere? How can tension and ambivalence be handled?
  • Is coopetition not conducive to conflicts of interest, which are harmful to team dynamics? Is it not a question of paradoxical demands likely to give rise to anxiety and psychosomatic disorders? And what role does Machiavellianism play in these situations?
  • In what cases are the results of coopetition worse than those which would have been obtained through a purely cooperative and/or purely competitive approach?

In other words, the conditions for genuinely constructive, socially-positive coopetition must still be established, to ensure that it is not detrimental to individuals’ health or to group dynamics.

A radical change of perspective

Still, hybrid situations are common and in some cases they prove to be useful. For the philosopher Pierre Lévy, who evokes “competitive cooperation” and “cooperative competition”, inter-individual coopetition is even “the preferred way of organizing collective intelligence.” This promising new research area requires further studies in order to confirm the usefulness of coopetitive inter-individual systems by studying their benefits and potentially harmful effects in greater detail.

More fundamentally, the idea of coopetition between individuals proposes a radical change of perspective: competition is not the opposite of cooperation and these two types of interdependence can be combined. This sheds new light on how we function as individuals and in groups and suggests a more nuanced understanding of human relationships. It is exciting on an intellectual level and represents a potential source of innovation in fields such as management, education or digital technology.

[divider style=”normal” top=”20″ bottom=”20″]

Mehdi Elmoukhliss, PhD student in Management Sciences and expert in collective intelligence systems, Institut Mines-Télécom Business School and Christine Balagué, Professor and Head of the Smart Objects and Social Networks Chair, Institut Mines-Télécom Business School

The original version of this article (in French) was published on The Conversation and republished under a Creative Commons license. Read the original article.



MOx Strategy and the future of French nuclear plants

Nicolas Thiollière, a researcher in nuclear physics at IMT Atlantique, and his team are assessing various possibilities for the future of France’s nuclear power plants. They seek to answer the following questions: how can the quantity of plutonium in circulation in the nuclear cycle be reduced? What impacts will the choice of fuel — specifically MOx — have on nuclear plants? To answer these questions, they are using a computer simulator that models different scenarios: CLASS (Core Library for Advanced Scenario Simulation).


Today, the future of French nuclear power plants remains uncertain. Many reactors are coming to the end of their roughly forty-year lifespan, and new proof-of-concept trials must be carried out to extend their operating life. To determine which options are viable, Nicolas Thiollière, a researcher at IMT Atlantique with the Subatech laboratory, and his team are conducting nuclear scenario studies, assessing future options for France’s nuclear power plants.

Understanding the nuclear fuel cycle

The nuclear fuel cycle encompasses all the steps in the nuclear energy process, from uranium mining to managing the radioactive waste. UOx fuel, which stands for uranium oxide, represents roughly 90% of the fuel used in the 58 pressurized water reactors in French nuclear power plants. It consists of uranium enriched in uranium-235. After a complete cycle, i.e. after it has passed through the reactor, irradiation produces approximately 4% fission products (the matter that served to produce energy), 1% plutonium and 0.1% minor actinides. In most countries, these components are not recycled; this is referred to as an open cycle.
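As a quick illustration of these proportions, here is a back-of-the-envelope mass balance for one metric ton of spent UOx fuel, using only the percentages quoted above:

```python
def spent_uox_composition(mass_kg):
    """Approximate composition of spent UOx fuel after one cycle,
    using the article's proportions: ~4% fission products,
    ~1% plutonium, ~0.1% minor actinides, the remainder uranium."""
    fractions = {
        "fission_products": 0.040,
        "plutonium": 0.010,
        "minor_actinides": 0.001,
    }
    composition = {name: f * mass_kg for name, f in fractions.items()}
    composition["uranium"] = mass_kg - sum(composition.values())
    return composition

# One metric ton of spent fuel:
# {'fission_products': 40.0, 'plutonium': 10.0,
#  'minor_actinides': 1.0, 'uranium': 949.0}
print(spent_uox_composition(1000))
```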

However, France has adopted a partially closed cycle, in which the plutonium is reused. The plutonium is therefore not considered waste, despite being the element with the highest radiotoxicity. In other words, it is the most hazardous of the nuclear cycle materials in the medium-to-long term, over thousands to millions of years. France’s plutonium recycling system is based on MOx fuel, which stands for “mixed oxide”. “MOx is fuel that consists of 5% to 8% plutonium produced during the UOx combustion cycle and supplemented by depleted uranium,” Nicolas Thiollière explains.

The use of this new mixed fissile material slightly reduces the consumption of uranium resources. In France’s nuclear power plants, MOx fuel represents approximately 10% of total fuel—the rest is UOx. After a MOx irradiation cycle, the spent fuel still contains 3% to 5% plutonium, which is not considered waste and could theoretically be reused. In practice, however, it currently is not. Spent MOx fuel must therefore be stored for processing, forming a strategic reserve of plutonium. “We estimate that there were approximately 350 tons of plutonium in the French nuclear cycle in 2018. The majority is located in used UOx and MOx fuel,” explains Nicolas Thiollière. Thanks to their simulations, the researchers estimate that with an open cycle—without MOx recycling—there would be approximately 16% more plutonium in 2020 than is currently projected with the closed cycle.

The fast neutron reactor strategy

In the current pressurized water reactors, the natural uranium must first be enriched: 8 mass units of natural uranium are needed to produce 1 unit of enriched uranium. In the reactor, only 4% of the fuel’s mass undergoes fission and produces energy. Directly or indirectly fissioning the entire mass of the natural uranium would therefore yield resource gains by a factor of 20. In practice, this involves multi-recycling the plutonium produced by neutron absorption in uranium during irradiation, so as to incinerate it continuously. One possible industrial solution is the use of Fast Neutron Reactors (FNRs), which rely on fast neutrons that fission plutonium more effectively, enabling it to be recycled several times.

Historically, the development of MOx fuel was part of a long-term industrial plan based on multi-recycling plutonium in FNRs. Now, a completely different story is in the making. Although three FNRs operated in France beginning in the 1960s (Rapsodie, Phénix and Superphénix), the 1997 decision by the Council of State to permanently shut down Superphénix signaled the end of the expansion of FNRs in France. The three pioneer reactors were shut down, and no FNRs have been operated since. However, the 2006 Act on the sustainable management of radioactive materials and waste revitalized the project by setting a goal of commissioning an FNR prototype by 2020. The ASTRID project, led by the CEA (the French Alternative Energies and Atomic Energy Commission), took shape.

Recently, funding for this pre-industrial reactor (approximately 600 megawatts, compared to 1 gigawatt for an industrial reactor) has been scaled back. The power of the ASTRID concept, significantly reduced to 100 megawatts, redefines its status and probably pushes the prospect of industrial FNR deployment beyond 2080. “Without the prospect of deploying FNRs, the MOx strategy is called into question. The industrial processing of plutonium is a cumbersome and expensive process resulting in limited gains in terms of inventory and resources if the MOx is only used in the current reactors,” Nicolas Thiollière observes.

In this context of uncertainty regarding the deployment of FNRs, and as plutonium accumulates in the cycle, Nicolas Thiollière and his team are asking a big question: under what circumstances can nuclear power plants multi-recycle (recycle more than once) plutonium using current reactors and technology, so as to stabilize the inventory? In practice, major research and development efforts would be required to define a new type of fuel assembly compatible with multi-recycling. “Many theoretical studies have already been carried out by nuclear industry operators, revealing a few possibilities to explore,” the researcher explains.

Nuclear scenario studies: simulating different courses of action for nuclear power plants

Baptiste Mouginot and Baptiste Leniau, former researchers with the Subatech laboratory, developed the cycle simulator CLASS (Core Library for Advanced Scenario Simulation) from 2012 to 2016. This modeling tool can scientifically assess future strategies for the fuel cycle. It can be used to calculate and monitor the inventory and flow of materials over time for all nuclear plant units (fuel fabrication and separation plants, power stations, etc.) based on hypotheses about plant development and installed nuclear capacity.

As part of her PhD work, supervised by Nicolas Thiollière, Fanny Courtin studied the objective of stabilizing the quantity of plutonium recycled in the reactors of nuclear plants by 2100. One constraint in the simulation was that all the reactors had to use current pressurized water technology. Based on this criterion, the CLASS tool ran thousands of simulations to identify possible strategies. “The condition for stabilizing the quantity of plutonium and minor actinides would be to have 40% to 50% of the pressurized water reactors dedicated to the multi-recycling of plutonium,” Nicolas Thiollière explains. “However, the availability of plutonium in these scenarios would also mean a regular decrease in nuclear capacity, to a level between 0 and 40% of the current capacity.” This effect is caused by minor actinides, which are not recycled and therefore build up. The plants must incinerate plutonium to stabilize the overall inventory, and incinerating plutonium implies reducing the power plants’ capacity at an equivalent rate.

On these charts, each line represents a possible course of action. In purple, the researchers indicated the scenarios that would meet a mass stabilization condition for the plutonium and minor actinides in circulation (top). These scenarios imply reducing the thermal energy of the power plants over the course of the century (bottom).


The researchers also tested the condition of minimizing the inventory of plutonium and minor actinides. In addition to increasing the number of reactors used for multi-recycling, they showed that a scenario reducing the quantity of plutonium and minor actinides in the cycle would imply phasing out nuclear power within a few years. Reducing the stock of plutonium is tantamount to reducing the fuel inventory, which would mean no longer having enough to supply all the nuclear power plants. “Imagine you have 100 units of plutonium to supply 10 power plants. At the end of the cycle, you would only have 80 units remaining and would only be able to supply 8 plants. You would have to close 2. In recycling the 80 units, you would have even less in the output, etc.,” Nicolas Thiollière summarizes. In practice, it therefore seems unwise to invest major R&D efforts in recycling MOx without FNRs, considering that this strategy implies phasing out nuclear power in the short term.
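The thought experiment in this quote can be written out directly. The numbers below (100 units of plutonium, 10 units per plant, 80% recovered per cycle) are the illustrative figures from the quote, not real reactor data:

```python
def shrinking_fleet(pu_stock=100.0, pu_per_plant=10.0, recovery=0.8):
    """Each cycle, the plutonium recovered from spent fuel is only a
    fraction of what was loaded, so the number of plants the stock
    can fuel keeps falling until the fleet can no longer be supplied."""
    cycle = 0
    while pu_stock >= pu_per_plant:
        plants = int(pu_stock // pu_per_plant)
        print(f"cycle {cycle}: {pu_stock:.0f} units of Pu -> {plants} plants supplied")
        pu_stock = plants * pu_per_plant * recovery  # Pu left after irradiation
        cycle += 1

shrinking_fleet()
# cycle 0: 100 units of Pu -> 10 plants supplied
# cycle 1: 80 units of Pu -> 8 plants supplied
# cycle 2: 64 units of Pu -> 6 plants supplied
# ... and so on, until no plant can be fueled
```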

The industrial feasibility of these options must first be validated by extensive safety studies. In the initial stages, however, the scenarios involving the stabilization of plutonium and minor actinides seem compatible with diversifying France’s electricity mix and rolling out renewable energy to replace nuclear sources. Industrial feasibility studies looking at both safety and costs are all the more valuable given the uncertainty surrounding the deployment of fast neutron reactors and the future of the nuclear sector. These economic and safety uncertainties must be addressed before deploying a strategy that would radically change France’s nuclear power plants.

Also read on I’MTech: What nuclear risk governance exists in France?

Article written for I’MTech (in French) by Anaïs Culot


Debate: Purchasing power and carbon tax, the obsolescence of political categories

Fabrice Flipo, Institut Mines-Télécom Business School

[divider style=”dotted” top=”20″ bottom=”20″]

When it comes to social and ecological concerns, many dream of reconciling the two, but few have a solution. The “yellow vests” have just provided a reminder of this fact, as they do not identify with anything offered by organized groups, whether political, union-based or even community-oriented.

So what can be done? Are we destined for failure? The study of major political ideas can help provide a way forward.

With the “yellow vests,” the traditional questions about ways of life can be seen from an unexpected angle. The key demand is for purchasing power, but observations and surveys have also revealed concern about climate issues. What can be done? How can social and ecological issues be brought together?

“Being poor” in France and elsewhere

First let’s take a look at the key factors involved.

To begin with, let us remember that if we disregard redistribution through taxes, purchasing power is determined by two key elements: income and wealth. As for income, the richest 10% of the population in France earns around €300 billion, while the poorest 10% earn 10 times less. But this figure masks the slow salary growth for the majority of the population; those in management positions are the only ones who have seen significant increases in their income over time.

As far as wealth is concerned, there is even greater inequality: the poorest 50% of the population possesses 8% of all wealth while those who make up the richest 1% possess 17% of the wealth (and this figure rises to 50% when considering the wealthiest 10%). That explains the average wealth of €16,000 for unskilled workers. And this is only taking averages into consideration. Individual examples of success are even more striking.

Bernard Arnault earns €3.5 million a month for his work at LVMH, a salary comparable to footballer Kylian Mbappé’s €2 million a month. This means Bernard Arnault earns the equivalent of a monthly minimum-wage salary every four minutes. He also possesses a fortune of €73 billion, which provides him with €300 million in dividends, the equivalent of 100 times his monthly salary at LVMH. It could be argued that this is an extreme case, but it is a visible reality in France, where so many people are struggling just to make ends meet.

Furthermore, there is the problem of “necessities,” to use Marx’s expression, meaning items that are considered necessary to live. While inflation may be low, the basket price for items that are deemed essential is on the rise. The cost of digital technology, for example, has been added to this list.

The scale used by the French charity Secours Populaire Français illustrates this point: “being poor” in France corresponds to an ever-higher level of income; it is now defined as earning €1,118 a month, while the minimum net monthly salary is €1,150. Earning minimum wage in France now means earning only €32 more than what is considered the poverty line.

This threshold must be compared with that used to define belonging to the “global middle class” based on the same statistics as those used in France: this level is defined as between €4,000 and €6,000 per year, which works out to between €300 and €500 per month.

As economics Nobel Prize winner Amartya Sen has pointed out, poverty is a social construct, and on this issue the benchmarks are still largely national. Taken together, these two observations clearly show that the majority of French people are trapped between two blades of scissors: the one that allows them to earn money and the one that controls how they spend it.

1,000,000 tons of CO2 for Bernard Arnault

We will now try to express this information in terms of climate and energy.

Based on several studies, economist Jean Gadrey estimates, using “the means available,” that the richest 1% of the French population emit approximately 160 tons of carbon per person per year, compared to 4 tons for the poorest 10%. The poorest 10% of the population therefore emits 28 million tons, compared to 112 million tons for the richest 1% (based on a population of 70 million).
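The aggregate figures follow directly from the per-capita estimates; a quick check, using the population figure and per-capita values quoted above:

```python
population = 70_000_000  # figure used in the article

# Jean Gadrey's per-capita estimates (tons of CO2 per year)
richest_1pct_per_capita = 160
poorest_10pct_per_capita = 4

richest_total = 0.01 * population * richest_1pct_per_capita
poorest_total = 0.10 * population * poorest_10pct_per_capita

print(f"richest 1%:  {richest_total / 1e6:.0f} million tons")   # 112
print(f"poorest 10%: {poorest_total / 1e6:.0f} million tons")   # 28
```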

Based on this type of calculation, it can be concluded that Bernard Arnault emits 1,000,000 tons by himself (if one minimum wage salary is worth 4 tons of CO2).

This simple observation illustrates the futility of a carbon tax that is not, at a minimum, based on income. Either the tax is high and low-income households, meaning the majority of French households, cannot afford to pay it, or it is kept at a very low level and has no impact on the climate. A recent report by ADEME (the French Environment and Energy Management Agency) shows that French people are engaged in climate protection, but to take further action they ask that the effort be shared in a fair, democratic way. These opinions have remained stable over the last two surveys.

Surprisingly, the figures presented here, and those reported by Jean Gadrey using “the means available,” have not been a focus of the debate, or at least have not been presented clearly. The “yellow vests” plainly see that those being asked to make the biggest sacrifices are those who need their small quantity of CO2 the most. Admittedly, it is still too much for the planet… but then what can be said about the others!

Strategies to be invented

So what can be done?

The current government has embraced a traditionally liberal argument: let the market sort itself out. In other words, the economy should be left alone, in order to “respond to demand” and help France prosper. Historically, this strategy has been partially successful since France possesses huge multinationals and remains one of the top-ranking world economies despite its small size, without being as financialized as London’s City. Yet there is a downside to this strategy: an ever-greater concentration of wealth.

Can people really improve their situation simply by “crossing the street” or by founding start-ups? Economist Thomas Piketty’s work has shown that this is clearly not the case: when there is little growth – meaning few possibilities to create wealth that could result in income – those with the largest fortunes benefit since they run the game.

A number of different solutions may be explored, from all sides of the political spectrum.

Those with the most liberal leanings will encourage the richest part of the population to decarbonize the economy, which is undoubtedly what Emmanuel Macron hoped to do by giving businesses and their owners more room for maneuver. This would require these stakeholders to be ready to take on this role and to a great enough extent. Whatever we may think about the credibility of such a scenario, the fact remains that it has not been supported by any evidence.

Another scenario would be for major players, Bernard Arnault for one, to stop acting like rentiers, using and abusing their market position to keep new players out. This would mean putting an end to the “laissez-faire” system, but not necessarily the end of a liberal system, in the sense that some liberals – like the authors of the classic Natural Capitalism, Paul Hawken, Amory B. Lovins and L. Hunter Lovins – also consider that giant fortunes pose a threat to freedom. This was one of the reasons behind United States antitrust laws, for example.

Those who are more conservative (including the Rassemblement National) will certainly be averse to opposing major interests and will refuse to change the social order as it stands.

A more socialist approach would seek to make use of the State, whether directly through public expenditure (Keynesian), by controlling public companies, or by increasing the minimum wage. And we must remember that the current government did not increase the minimum wage because it wants to use taxes, meaning the income of non-minimum-wage employees, to increase in-work benefits rather than wages themselves.

Yet the State alone will not be able to change people’s daily lives: reorganizing territories and deploying renewable energy require much greater efforts – setting up networks, training people to work with the equipment, etc.

The traditional political positions appear to be poorly suited to respond to ecological issues. This can be seen by studying the political ideas and events that have occurred over the past decades. The “yellow vests” have successfully demonstrated this through their refusal of the existing divisions. So aren’t they the ones who could help us determine where to go from here? An alliance of progressives, across historic divides, is the most plausible path to take.

[divider style=”dotted” top=”20″ bottom=”20″]

Fabrice Flipo, Professor of social and political philosophy, epistemology and history of science and technology at Institut Mines-Télécom Business School

The original version of this article (in French) was published on the website of The Conversation France.



Imagination: an architect and driving force of transitions

All technology starts with a vision, a tool created to meet one of society’s objectives. Its development path is formed by both human projections and the dynamics of the transformations it generates. It is therefore important to take the time to ask ourselves what we intended to do with digital technology and what we will do with it. We must also analyze the transformations this technology has already initiated in the digital transition and work to build the world of tomorrow. In the book by Carine Dartiguepeyrou and Gilles Berhault entitled Un autre monde est possible – Lost in transitions?, Francis Jutand—Deputy President of IMT—raised the question of the role imagination plays in the current digital transition. He describes how important it is in defining our future. Upon the release of this book, I’MTech spoke with Francis Jutand to learn more.


Francis Jutand

How can we study a transformation as profound as the one generated by digital technology?

Francis Jutand: This is a true metamorphosis, the fourth to occur in the history of humanity. A metamorphosis is characterized by an initial transformation period that is extremely fast and powerful, which can be referred to as the transition period. We do not often have the opportunity to study the conditions of a metamorphosis before the transition occurs, except in the case of artists and creators who sense its approach, or foresight experts who suspect its coming. The work of foresight experts takes place during this transition period, or better yet, this “prenatal” period. Their work is aimed at analyzing, understanding and sharing their findings to influence the path of development and, above all, to contribute to designing the world of tomorrow. Every transformation has causes. This means there are also early signs of its development and the implementation of structures that will make it possible. The printing press, the encyclopedia and the scientific developments of the 17th and 18th centuries all paved the way for the industrial transformation. Electronics, telecommunications, computer science and media paved the way for the digital transformation that took place as they converged in 2000.

Why use imagination to study the digital transition?

FJ: It is impossible to master the dynamics of the transformation that is underway: before we even have time to see what will come of one innovation, others have emerged. This results in a divergence and a tipping point, which is more Lamarckian than Darwinian in nature. We simply know that all activities and individuals will be reached and transformed in the process. We are actors, but at the same time we are also subject to the forces working to operate this change. The question that arises is: how can we anticipate and act now to design and influence the world of tomorrow? This world of tomorrow is shaped by the ideas at the origin of the transformation as well as those that emerge as it progresses. It is in the convergence of these ideas that imagination can act as an architect and builder of this new world. The last transformation was industrial, and researchers like Pierre Musso at Télécom Paris have carefully analyzed it, along with the role imagination played in structuring industrial society and in creating the infrastructures of communication networks, services and content on which the metamorphosis was built. When a transformation begins, imagination changes. Digital imagination cannot be regarded as a mere extension of industrial imagination.

Why is the industrial imagination insufficient in explaining the digital metamorphosis underway?

FJ: The industrial imagination is above all based on processes and rational models. This is the mentality that takes a complex problem, cuts it into smaller pieces, clearly defines the steps that must be taken to resolve them and creates an assembly design to make it all work. It relies on methods of design, description, fragmentation, task automation, deployment and monitoring, structured around successive phases: analysis, modeling, simulation, decision-making, implementation, feedback and adaptation. It is extremely effective, yet this type of imagination and its methods have reached their limits and are now being exhausted by the new complexities of digital technology. This is primarily because the process is slow: it takes years to carry out a large project or design infrastructures and large-scale information systems. Industrial imagination was successful for large systems: nuclear energy, aeronautics, space, transportation systems and many types of networks. Yet it cannot withstand the complexity and acceleration of the digital world. This form of imagination is based on a rationality and efficiency that seek to minimize human involvement, since humans are seen as cost factors. It therefore promotes automation to achieve performance, to the detriment of development. This approach has reached its limit in the current context of new social and environmental issues and the expectations of new generations seeking to develop their individuality rather than integrate into a system. Our society must establish a new projection that will allow us to solve new problems. This is already being done and will continue to develop as we pursue a new form of imagination.

What characterizes this new digital imagination?

FJ: We need to understand that this imagination thrives, on the one hand, on the development of science and digital technology and, on the other, on the development of the consumer society, which emphasized the value of personality and of desires that must be expressed and satisfied. These changes fueled the phenomenon of individuation, which was accelerated by consumer society and further matured through the development of networks. The individual has therefore taken on an increasingly important role. The individual no longer exists as part of a community or class, but as an autonomous entity capable of becoming personally involved in an activity and of defining and adopting his own positions. In this sense it shares a common point with the hippy and hacker movements—in terms of hacker ethics, despite hackers often being mistakenly viewed as mere attackers. At different points in the development of the digital imagination, these two groups took a stand to demand that individuals be taken into consideration, not as belonging to a consumer class, but on the basis of individualities that can work alone and in cooperation with others. This all led to the creation of an imagination combining sharing with instant, global communication. In short: the creation of cooperative networks of individuals.

Today, the digital imagination offers a vision of the world in which individuals can act, experiment, share, cooperate and, in so doing, explore multiple answers aimed at providing solutions to problems. This is the open-source and start-up spirit, which relies on collective synchronization based on common goals and values. It is a sort of inversion of architectures, organizations and decision-making methods. It also marks a transition from an economy of automation and efficiency that limits the human factor, to a culture of effectiveness and performativity that relies on cooperative, associative and parallel exploration. This vision of progress relies on personal and collective experience. Finally, it is the power of a multitude of individuals searching for solutions through discussion and decision-making processes.

Does this form of imagination completely replace the industrial form that preceded it?

FJ: One form of imagination does not replace the previous one, it enriches it and adds new dimensions. Digital imagination governs new spaces and alters the industrial imagination, just like the industrial imagination transformed that of agriculture and traditional trades, which were also altered, but not destroyed. It is clear, however, that we can expect these new areas of human development taking shape through digital imagination to play an increasingly significant role in society. This form of imagination can go a long way, since it affects cognitive functions and permeates collective narratives. Science fiction, as both an artistic and projective activity, contributes to bringing the digital imagination into being and is playing a leading role in exploring the magnitude of the possible utopian and dystopian outcomes of digital technology.

Many works of science fiction are more dystopian. How would you explain this pessimism in the digital imagination?

FJ: The inner workings of society have always been reflected through, on the one hand, problems related to power, domination and money and, on the other, a spiritual dimension. The hubris, or excess, arising from the first aspect is not specific to the digital transformation. It can, however, lead to forms of pride or even perversity that could influence developments in digital technology. We therefore see positions of domination that use digital tools to undermine democracy and privacy, and transhumanist currents advocating messianic hubris such as immortality. There is also a less conspicuous but real phenomenon taking place as new sources of inequality progressively deepen existing ones, as rural areas are neglected due to a focus on urban issues, and even in the threat of deterritorialization. These kinds of developments can cause us to lose perspective and want to bail out, leaving behind the collective interests of the human project. Today, one of the ways we can control this type of hubris is to prioritize ecological and global concerns and focus on social justice.

If digital imagination is not sufficient in finding solutions to the challenges facing society, should we expect a new form of imagination to emerge?

FJ: In my opinion, within the context of digital society, the digital transition will lay the foundations for a new transformation: that of cognition, which will be partially based on powerful artificial intelligence, once it is developed. This is a form of co-development, a symbiotic relationship between humans and machines and the capacity of intermediated collective individuation. And what forms of technology will make all this possible? We do not yet know. What is more, this coming transformation might not even be physical. Transitions also alter beliefs, approaches to spirituality, social structures, the nature of wealth… For now, we can only observe that a new form of imagination linked to cognition is beginning to develop. What remains to be seen is whether this will be a completely new form of imagination, or an extension of the digital imagination we have been building on for a few decades now.



The future of production systems, between customization and sustainable development

Editorial


What will the production lines of tomorrow look like? Over the past decades, machines have played an increasingly important role in factories. We all have an image in our minds of robotic arms moving at lightning speed and with truly superhuman precision, carrying parts that are undoubtedly too heavy for our arms. Faced with such a demonstration of physical superiority, it is hard to imagine how anything organic can compete. When it comes to production rate, one thing is certain: we are beaten by machines. And we’re already imagining humans being excluded from production lines, or at least reassigned to different tasks—complex programming of robots, overseeing machine networks, data analysis, etc. All of these “new careers” are exclusively high-skilled positions and require profound changes in training and in companies.

But being so quick to eliminate humans and replace them with robots may be going a step too far. When we talk about production, we’re talking first and foremost about meeting a demand. What is produced is that which is desired, bought and consumed by end users. And what today’s customers want more than anything is a customized product. They want a car that aligns with their own needs, desires and values. They do not want to buy one of the 500,000 diesel cars with options they won’t use. They want the same model, only electric, without air conditioning because it’s bad for the environment, but with a sun roof because they love pulling over in the countryside and looking up at the stars.

But entirely-automated production lines have a hard time adapting to such specific demands. It is amusing to learn that researchers studying the issues involved in this new commercial paradigm are reasserting the importance of humans in production systems. Yes, we are slower, weaker and less precise, but we are also more flexible, versatile and better able to adapt to the typically human demand for diversity. At Mines Saint-Étienne, Xavier Delorme is one such researcher. His work has shown that it is important not to dehumanize production in order to respond to new demands from customers.

This does not mean adopting a primarily anti-technology stance, but rather emphasizing the strength of human-machine cooperation. At IMT Mines Albi, Élise Vareilles is working on software tools that do precisely that by helping teams understand customers’ needs. The tools developed by the Albi-based researcher tackle the task of breaking down a demand, understanding it in great detail to determine whether it is feasible, then determining how it can be met as effectively as possible.

But growing demand for tailor-made products is just one of many new demands. Having a customized product is not enough. Customers also need to have it right away—or at least, as soon as possible. For this reason, new production systems cannot be considered in isolation from the transportation and distribution networks further downstream. The reality is that the entire supply chain is undergoing a transformation. It must transport goods more quickly, but must also meet sustainable development requirements. The environmental footprint is no longer a mere detail. Trucks can no longer travel half-empty and must progressively be replaced by trains. For this to happen, companies will have to learn how to communicate and collaborate with one another. The logistics network is undergoing profound changes.


This series takes a look at some of the new issues facing industry, for which researchers are trying to find solutions. It was created following the IMT symposium on production systems of the future. As such, it focuses less on political and social aspects—training for new careers, disappearance of low-skilled jobs—than on technical subjects involving major scientific challenges. Against a backdrop of artificial intelligence, ecological and energy transition and human-machine interaction, it presents some interesting examples of research for the benefit of society and the industry of the future.


What is the physical internet?

The physical internet is a strange concept. It borrows its name from the best-known computer network, yet it bears little connection to it, other than serving as an inspiration for bringing economic stakeholders together and making them work with one another. The physical internet is in fact a new way of organizing the logistics network. In light of the urgent climate challenges facing our planet and the economic challenges facing companies, we must rethink logistics from a more sustainable perspective. Shenle Pan, a researcher in management science at Mines ParisTech and a specialist in logistics and transport, explains this concept and its benefits.

This article is part of our series on “The future of production systems, between customization and sustainable development.”


What does the physical internet refer to?

Shenle Pan: It’s the metaphor of the internet applied to supply chain networks and related services. When we talk about the physical internet, the objective is to interconnect distribution networks, storage centers, suppliers, etc. Today, each contributor to the supply chain system is on their own. Companies are independent and have their own network. The idea of the physical internet is to introduce interoperability between stakeholders. The internet is a good analogy for guiding the ideas and structuring new organizational methods.

What is the benefit of this subject?

SP: Above all, it is a way of making logistics more sustainable. For example, when each stakeholder works on its own, a delivery truck may leave without being full: the delivery must be on time, so the truck leaves even if it is only half full. By connecting stakeholders, a truck can be filled with goods from another supplier. If enough companies share transport resources, they can even reach a flow of goods significant enough to use rail freight. Since one full truck emits less CO2 than two half-filled trucks, and the train runs on electricity, the environmental impact would be greatly reduced for the same flow of goods. Companies also save money through economies of scale. The benefits extend to other logistics functions as well, such as storage, packaging and handling.
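A toy calculation shows the mechanism. The loads and truck capacity below are hypothetical, chosen only to illustrate the pooling effect:

```python
import math

def trucks_needed(loads, capacity):
    """Compare the number of trucks required when each company ships
    its goods separately versus when the flows are pooled."""
    separate = sum(math.ceil(load / capacity) for load in loads)
    pooled = math.ceil(sum(loads) / capacity)
    return separate, pooled

# Three companies shipping 50%, 60% and 40% of a truckload each:
# separately they dispatch 3 half-empty trucks; pooled, 2 full ones.
separate, pooled = trucks_needed([50, 60, 40], capacity=100)
print(f"separate: {separate} trucks, pooled: {pooled} trucks")
```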

How will this impact the logistics markets?

SP: By interconnecting stakeholders, competing companies will be connected. Yet today, these stakeholders do not share their information and logistical means. New rules and protocols must therefore be established to govern stakeholders’ access to components of the supply chain: using the networks, transporting goods, etc. This is the role protocols play on the internet, TCP/IP being one example. New intermediaries must also be introduced into the markets, and some are already beginning to appear. Start-ups offer to mutualize transport to maximize the trucks’ capacity. Others sell storage space for a single pallet for a short period of time to adapt to demand, whereas stakeholders are generally used to buying entire warehouses they do not always fill. The physical internet therefore leads us toward a new logistics model called Logistics as a Service, one that is more flexible, efficient, interoperable and sustainable.

What makes the physical internet a field of study?

SP: Real interdisciplinary research is needed to make all these changes. It is not easy, for example, to design standardized means of promoting interoperability. We must determine which mechanisms are best suited and why. Then, in the area of management science, we must ask which intermediaries should be introduced into the network to manage this openness, and which new business models it would involve. From a computer science perspective: how can the services of the various stakeholders be connected? Personally, I am working on the mathematical side, modeling new types of network organization, for example to assess the gains.

What are the tangible gains of the physical internet in terms of logistics?

SP: We took two major supply chains from mass distribution in France and integrated their data into our new organizational models to simulate the gains. Depending on the scenario, truck filling rates improved by 65% to 85%, and CO2 emissions decreased by 60% thanks to multimodality. In our simulations, these significant results were directly linked to interoperability and the creation of the network. Our models allow us to determine the strategic locations where storage centers shared by several companies should be established, optimize transport times, and reduce supply times and storage volumes. We also achieved gains of over 20% on stock sizes.

Does the logistics sector already use the principles of the physical internet?

SP: The physical internet is a fairly recent concept. The first scientific publication on the topic dates to 2009, and companies have only been interested in the subject for approximately three years. They are adopting the concept very quickly, but they still need time. This is why we have a research chair on the physical internet at Mines ParisTech, with French and European companies; they submit their questions and use cases to help develop the potential of this concept. They recognize that we need a new form of organization to make logistics more sustainable, but the market has not yet reached a point where the major players are restructuring based on the physical internet model. We are currently seeing start-ups beginning to emerge and offer new intermediary services.

When will we experience the benefits of the physical internet?

SP: In Europe, the physical internet has established a solid roadmap, developed in particular by the ALICE alliance, which connects the most significant logistics platforms on the continent. This alliance regularly issues recommendations that are used by European H2020 research programs. Five focus areas have been proposed for integrating the physical internet principles in European logistics by 2030. This is one of the largest initiatives worldwide. In Europe, we therefore hope to quickly see the physical internet comprehensively redefine logistics and offer its benefits, particularly in terms of environmental impacts.

 

transportation

Synchronizing future transportation: from trucks to drones

With the development of delivery services, the proliferation of means of transportation, saturated cities and pooled goods, optimizing logistics networks is becoming so complex that humans can no longer find solutions without intelligent software. Olivier Péton, who specializes in operational research for optimizing transportation at IMT Atlantique, is seeking to answer this question: how can deliveries be made to thousands of customers under good conditions? He presented his research in October at the IMT symposium on production systems of the future.

This article is part of our series on “The future of production systems, between customization and sustainable development.”

 

Have you ever thought about the journey of the book, pair of jeans or alarm clock you buy with just one click? Moved, transferred, stored and redistributed, these objects make their way from one strategic site to the next, across the entire country to your city. Several trucks, vans and bikes are involved in the delivery. You receive your order thanks to the careful organization of a logistics network that is becoming increasingly complex.

At IMT Atlantique, Fabien Lehuédé and Olivier Péton are carrying out operational research on how to optimize transportation solutions and logistics networks. “A logistics network must take into account the location of the factories and warehouses, decide which production site will serve a given customer, etc. Our job is to establish a network and develop it over time using recent optimization methods,” explains Olivier Péton.

This expertise is in high demand. Changes in legislation restricting certain vehicles’ access to city centers during given timeframes have required companies to rethink their distribution methods. At the same time, alongside these new urban requirements, the development of new technologies and new distribution methods offers opportunities to re-optimize transportation.

What are the challenges facing the industry of the future?

“Most of the work from the past 10 years pertains to logistics systems and the synchronization of vehicles,” remarks Olivier Péton. “In other words, several vehicles must manage to arrive at practically the same time at the same place.” This is the case, for example, in projects involving the pooling of transportation means, in which goods are grouped together at a logistics platform before being sent to the final customer. “This is also the case for multimodal transportation, in which high-capacity vehicles transfer their contents to several smaller-capacity vehicles for the last mile,” the researcher explains. These concepts of pooling and multimodal transport are at the heart of the industry of the future.
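In optimization terms, this synchronization is expressed as a constraint on arrival times. Here is a minimal sketch, with invented times and a hypothetical 10-minute tolerance that is not a figure from the research:

```python
# Minimal sketch of a synchronization check: all figures are invented.
# Arrival times at the transfer platform, in minutes since midnight.
arrivals = {"line_haul_truck": 540, "cargo_bike_1": 545, "cargo_bike_2": 556}

TOLERANCE_MIN = 10  # hypothetical tolerance, not a figure from the research

def synchronized(arrival_times, tolerance):
    """True if every vehicle reaches the platform within `tolerance` minutes."""
    times = arrival_times.values()
    return max(times) - min(times) <= tolerance

print(synchronized(arrivals, TOLERANCE_MIN))  # False: cargo_bike_2 is 16 min behind
```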

Along the path from supplier to customer, the network sometimes shifts from the national scale to that of a city. On the one hand, national transport relies on a network of logistics hubs that handle large volumes of goods. On the other hand, urban networks, particularly for e-commerce, focus on last-mile delivery. “The two networks involve different constraints. For a national network, the delivery forecast can be limited to one week. The trucks often visit only three or four places per day. In the city, we can visit many more customers in one day and replenish supplies at a warehouse. We must take into account delays, congestion, and the possibility of adjusting the itinerary along the way,” Olivier Péton explains.

Good tools make good networks

A network’s complexity depends on the number of combinations that can be made with the elements it contains. The higher the number of sites, customer orders and stops, the more difficult it becomes to optimize the network. There may be billions of possible solutions, and it is impossible to list them all to find the best one. This is where the researchers’ algorithms come into play. They rely on heuristic methods, in other words, methods that come as close as possible to an optimal solution within a reasonable computation time of a few seconds or minutes. To accomplish this, it is vital to have reliable data: transport costs, delivery time windows, etc.
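To give a sense of what a heuristic looks like, here is a minimal sketch of the classic nearest-neighbor construction rule on an invented toy instance. It is not the researchers’ actual algorithm, but it illustrates the trade-off: a good route in milliseconds rather than the best route in hours.

```python
# A classic construction heuristic, on an invented toy instance:
# always drive to the closest not-yet-visited stop.
def nearest_neighbor_route(depot, stops, dist):
    route, current, remaining = [depot], depot, set(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    route.append(depot)  # return to the depot at the end of the tour
    return route

dist = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
print(nearest_neighbor_route((0, 0), [(2, 1), (5, 4), (1, 3)], dist))
# [(0, 0), (2, 1), (1, 3), (5, 4), (0, 0)] -- a decent route, rarely the optimal one
```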

There are also constraints specific to each company. “In some cases, transport companies require truck itineraries in straight lines, with as few detours as possible to make deliveries to intermediate customers,” explains Olivier Péton. Other constraints include the maximum number of customers on one route, fair working times for drivers, etc. These constraints are modeled as equations. “To solve these optimization problems, we start with an initial transport plan and we try to improve it iteratively. Each time we change the transport plan, we make sure it still meets all the constraints.” The final result is judged on quality of service: ensuring that each customer is served within the agreed time slot and in a single delivery.
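That improvement loop can be sketched generically. The fragment below is a textbook 2-opt local search, not the team’s production code, with an invented precedence rule standing in for the business constraints described above; every candidate move is accepted only if it both shortens the route and remains feasible:

```python
def route_length(route, dist):
    return sum(dist(a, b) for a, b in zip(route, route[1:]))

def two_opt(route, dist, feasible):
    """Reverse route segments, keeping only feasible moves that shorten the plan."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(route) - 2):
            for j in range(i + 1, len(route) - 1):
                candidate = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
                if feasible(candidate) and route_length(candidate, dist) < route_length(route, dist):
                    route, improved = candidate, True
    return route

dist = lambda p, q: ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
depot, a, b, c = (0, 0), (2, 1), (5, 4), (1, 3)
feasible = lambda r: r.index(a) < r.index(b)  # invented precedence rule: serve a before b
initial = [depot, c, a, b, depot]             # a deliberately poor but feasible plan
print(two_opt(initial, dist, feasible))       # converges to the shorter depot-a-b-c-depot
```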

Growing demand

Today, this research is primarily used upstream of delivery in national networks. It helps design transport plans, determine how many trucks must be chartered and create the drivers’ schedules. Olivier Péton adds: “It also helps develop simulations that show the savings a company can hope to make by changing its logistics practices. To accomplish this, we work with 4S Network, a company that supports its customers throughout their transport pooling projects.” This work can also be of interest to major decision-makers managing fleets whose transport demand varies greatly from day to day. If requests differ greatly from one day to the next, the software can produce a new transport plan in a few minutes.

Read more on I’MTech: What is the Physical Internet?

What is the major challenge facing researchers? The tool’s robustness, in other words, its ability to react to unforeseeable incidents: congestion, technical problems… It must absorb small variations without having to re-optimize the entire solution. This is all the more important as new issues arise. Which exchange zone should be used in a city to transfer goods: a parking lot or a vacant area? At what tonnage is it worth investing in electric trucks? There are many points to settle before real-time optimization can be achieved.

Another challenge involves developing technologically viable solutions, with a sustainable business model, that are acceptable from a societal and environmental perspective. As part of the Franco-German ANR project OPUSS, Fabien Lehuédé and Olivier Péton are working to optimize complex distribution systems that combine urban trucks with fleets of smaller autonomous vehicles for last-mile deliveries. That is, until drones come on the scene…

 

Article written by Anaïs Gall, for I’MTech.

human

Production line flexibility: human operators to the rescue!

Changing customer needs have cast a veil of uncertainty over the future of industrial production. To respond to these demands, production systems must be flexible. Although industry is becoming increasingly automated, a good way to provide flexibility is to reintroduce human operators. This observation, which goes against current trends, was presented by Xavier Delorme, an industrial management researcher at Mines Saint-Étienne, at the IMT symposium on “Production Systems of the Future”.

This article is part of our series on “The future of production systems, between customization and sustainable development.”

 

Automation, digitization and robotization are concepts associated with our ideas about the industry of the future. With a history marked by technological and technical advances, industry is counting on autonomous machines that make it possible to produce more, faster. Yet, this sector is now facing another change: customer needs. A new focus on product customization is upending how production systems are organized. The automotive industry is a good example of this new problem. Until now, it has invested in production lines that would be used for ten to twenty years. But today, the industry has zero visibility on the models it will produce over such a period of time. A production system that remains unchanged for so long is no longer acceptable.

In order to meet a wide range of customer demands affecting many steps of their production, companies must have flexible manufacturing systems. “That means setting up a system that can evolve to respond to demands that have not yet been identified – flexibility – and that can be adjusted by physically reconfiguring the system more or less extensively,” explains Xavier Delorme, a researcher at Mines Saint-Étienne. Flexibility can be provided through digital controls or by reprogramming a machine, for example.

But in this increasingly machine-dominated environment, “another good way to provide flexibility is to reintroduce versatile human operators, who have an ability to adapt,” says the researcher. The primary aim of his work is to leverage each side’s strengths, while attempting to limit the weaknesses of the other side. He proposes software solutions to help design production lines and ensure that they run smoothly.

Versatility of human operators

This conclusion is based on field observations, in particular a collaboration with MBtech Group, during which the manufacturer drew attention to this problem: the advanced automation of its production lines was reducing their versatility. The solution proposed by the researchers: reintroduce human operators. “We realized that some French companies had retained this valuable resource, although they were often behind in terms of automation. There’s a balance to be found between these two elements,” says Xavier Delorme. The best way to create value, in terms of both economic efficiency and flexibility, appears to be combining robots and humans in a complementary manner.

A system that produces engines manufactures different models but does not need to be modified for each variant. It adapts to the product, switching almost instantaneously from one to another. However, the workload for different stations varies according to the model. This classic situation requires versatility. “A well-trained, versatile human operator reorganizes his work by himself. He repositions himself as needed at a given moment; this degree of autonomy doesn’t exist in current automated systems, which cannot be moved quickly from one part of the production line to another,” says Xavier Delorme.
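A toy example makes the point concrete. In the sketch below, with invented workload figures rather than data from an actual engine line, a change of product variant shifts the bottleneck from one station to another, and the versatile operator is reassigned accordingly:

```python
# Invented workloads (minutes of work per unit) for two engine variants;
# not data from an actual production line.
workloads = {
    "variant_1": {"assembly": 8, "wiring": 5, "testing": 3},
    "variant_2": {"assembly": 4, "wiring": 9, "testing": 3},
}

def bottleneck(variant):
    """Station where the versatile operator is most useful for this variant."""
    stations = workloads[variant]
    return max(stations, key=stations.get)

for variant in workloads:
    print(variant, "-> reinforce", bottleneck(variant))
# variant_1 -> reinforce assembly
# variant_2 -> reinforce wiring
```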

This flexibility presents a twofold problem for companies. Treating an operator like a machine narrows his range of abilities, which ultimately undermines efficiency. It is therefore in companies’ interest to enhance operators’ versatility through training and by assigning them various tasks in different parts of the production system. But the risks of turnover, and of the loss of skills associated with short contracts and frequent changes in staff, remain.

The arduous working conditions of multiskilled employees must also not be overlooked. This issue is usually considered too late in the design process, leading to serious health problems and malfunctions in production systems. “That’s why we also focus on workstation ergonomics from the design stage onwards,” explains Xavier Delorme. The biggest health risks are primarily physical: fatigue due to poor posture, repetitive tasks, etc. The versatility of human operators can reduce these risks, but it can also contribute to them: the risks increase if an employee lacks experience and finds it difficult to carry out tasks at different workstations. Once again, the best solution is to find the right balance.

Educating SMEs about the industry of the future

Large corporations are already on their way to the industry of the future, but it’s more difficult for SMEs, says Xavier Delorme. In June 2018, Mines Saint-Étienne researchers launched IT’mFactory, an educational platform developed in partnership with the Union des industries et métiers de la métallurgie (Union of Metallurgy Industries). This demonstration tool makes it possible to connect with SMEs to discuss the challenges and possibilities presented by the industry of the future. It also provides an opportunity to address problems facing SMEs and direct them towards appropriate innovations.

Such interactions are invaluable at a time when production methods are undergoing considerable changes (cloud manufacturing, additive manufacturing, etc.). The business models involved present researchers with new challenges. And flexibility alone will not meet the needs of customization, that is, producing by the unit and on demand. Servicization, which consists in selling a service associated with a product, is also radically changing how companies must be organized.

 

Article written by Anaïs Culot, for I’MTech.

 

personalizing, customization

Breaking products down for customization

Customers’ desire to take ownership of products is driving companies to develop more customized products. Élise Vareilles, a researcher in industrial engineering at IMT Mines Albi, works to develop interactive decision support tools. Her goal: help companies mass-produce customized goods while controlling the risks related to production. This research was presented at the IMT symposium on “Production Systems of the Future”.

This article is part of our series on “The future of production systems, between customization and sustainable development.”

 

Mr. Martin wants a red 5-door car with a sunroof. Mrs. Martin wants a hybrid with a rearview camera and leather seats. The salesperson wants to sell them a model combining all these features, but the “hybrid” and “sunroof” options are incompatible. More and more companies are beginning to offer customized services (loans, credit, etc.) and goods (cars, furniture, etc.). Yet they face a challenge: how can they mass-produce a product that meets each customer’s specific request? To solve this problem, companies must make their production customizable. But how is this possible?

To configure their products, companies must, in a sense, break them down into pieces. For a car, for example, they must separate the engine from the wheels and the bodywork. Identifying all these elements plays a major role in customizing a product.

“The goal is to develop computer tools that allow us to model each of these elements, like Lego bricks we put together. I have bricks of different colors, called ‘variants’, and several shapes that represent options. We model their compatibility. Companies’ goal is to ensure the customer’s request can be met using all the options they offer,” explains Élise Vareilles, a researcher at IMT Mines Albi.

What catalog of products or services can I offer my customers using the options I have and their compatibility? To address this growing concern, Élise Vareilles’ team turned to artificial intelligence.
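That question can be made concrete with a deliberately simplified sketch. The option families and the incompatibility rule below are invented for illustration, echoing the hybrid/sunroof example above, and are not taken from the team’s models; the point is simply that a feasible catalog can be enumerated from the bricks and their compatibility rules:

```python
from itertools import product

# Hypothetical option families ("bricks") and variants, invented for illustration.
options = {
    "engine": ["gasoline", "hybrid"],
    "roof": ["standard", "sunroof"],
    "doors": ["3-door", "5-door"],
}

# Invented incompatibility rule, echoing the example above.
incompatible = {("hybrid", "sunroof")}

def is_feasible(config):
    """A configuration is feasible if no pair of chosen variants is incompatible."""
    return not any((a, b) in incompatible or (b, a) in incompatible
                   for a in config for b in config)

catalog = [c for c in product(*options.values()) if is_feasible(c)]
print(f"{len(catalog)} feasible configurations out of {2 * 2 * 2}")
# 6 feasible configurations out of 8: both 'hybrid + sunroof' combinations are ruled out
```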

Identifying and controlling risks

The research team works on developing interactive configurators that enable a dialogue with the user. These configurators allow the user to enter various customization criteria for a product or service and view the result. For example, the options for your next car. To accomplish this, the artificial intelligence is fueled by the company’s knowledge. “We make computerized records of explicit knowledge (behavior laws, weight of components, etc.) and implicit knowledge related to the trade (manufacturing processes, best practices, etc.). All this information allows us to create modules, or building blocks, that make up the software’s knowledge base,” Élise Vareilles explains.

Yet not all of a company’s knowledge is needed to manufacture a given product. The tool therefore activates the relevant part of the knowledge base according to the context specified by the user. Targeting this pertinent information allows the system to guide the user by making suitable suggestions. Élise Vareilles adds: “With some configurators, I enter all my needs and they indicate, without explanation, that none of the products match my request. I do not know which options are causing this. Our tool guides the user by specifying the incompatibility of certain criteria. For example, it can tell the user that the size of the engine affects the size of the wheels and vice versa.”
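That explanatory behavior can likewise be sketched in a few lines. Again, the incompatibility table is invented, including the hypothetical rearview-camera rule; the idea is that when no product matches, the configurator names the conflicting criteria rather than returning a bare failure:

```python
# Invented incompatibility table; the '3-door / rearview_camera' conflict is
# hypothetical, added only to make the example richer.
incompatible = {("hybrid", "sunroof"), ("3-door", "rearview_camera")}

def explain(request):
    """Name the conflicting criteria instead of returning a bare 'no match'."""
    conflicts = [(a, b) for (a, b) in incompatible
                 if a in request and b in request]
    return conflicts if conflicts else "request is consistent"

print(explain({"hybrid", "sunroof", "leather_seats"}))
# [('hybrid', 'sunroof')] -- the user knows exactly which criterion to relax
```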

Challenges 4.0 for the factory of the future

Researchers have developed a generic tool that can be applied to a variety of contexts. It has especially helped architects in configuring new insulation systems for the facades of 110 social housing units in the Landes area of France. “The best way to arrange the panels was to install them symmetrically, but the architects told us it didn’t look nice! An attractive appearance is not a parameter we can program with an equation. We had to find a compromise from among the software’s proposals by assessing all the assembly options and the constraints that only a human can evaluate,” the researcher recalls. The tool’s interactive aspect helped remedy this problem. It proposed assembly configurations for the insulation panels that the architects could adjust. It could also intervene to add to the architects’ proposals based on constraints related to the facades (windows, shutters, etc.) and the geometric nature of the panels.

In the context of the industry of the future, this type of tool could offer a competitive advantage by covering 80% of customers’ needs. It also helps control design costs. Breaking knowledge and possible associations down into bricks means the tool can help design increasingly adaptable products, which can be modified according to customers’ whims. It also improves control of production risks by preventing the salesperson from selling a product that is too difficult, or even impossible, to manufacture. In addition, data mining techniques draw on the company’s memory to offer recommendations. However, if the knowledge is not kept up to date, the model risks becoming obsolete. The company’s experts must therefore determine the best time to update their tool.

Humans take on new roles

Two major risks in the manufacturing process are thus reduced by this tool from IMT Mines Albi. First, it reduces the risk of designing an object that does not match the customer’s request. Second, by integrating the knowledge of the company’s experts (risks, marketing, etc.) into the software, the company secures the feasibility of long projects: the tool reduces risks linked to staff turnover, such as the loss of skills when an engineer leaves the company.

However, humans are not being replaced; instead, they are taking on new roles. “With this tool, 40% of an employee’s activities will be redirected to more complex tasks in which the added value of humans is undeniable. Keep in mind that our tool offers decision support and must rely on the previous work of experts,” Élise Vareilles adds. Yet implementing this type of solution is a long process, lasting approximately two years. This conflicts with the short-term investment mentality that prevails in industrial culture. It is now up to stakeholders to recognize these long-term benefits before their competitors do.

 

Article by Anaïs Culot, for I’MTech.