Jean-Louis de Bougrenet de la Tocnaye

IMT Atlantique | Optics, Photonics, Augmented reality



Eclore and ThermiUp, new beneficiaries of the IMT “Industry & Energy 4.0” honor loans

After the IMT Digital Fund, Institut Mines-Télécom (IMT) and the Fondation Mines-Télécom launched a second fund last October, dedicated to the sciences of energy, materials and processes: “Industry & Energy 4.0”. Its committee, made up of experts from the major partners of the Fondation Mines-Télécom (Orange, BNP Paribas, Accenture, Airbus, Dassault Systèmes and Sopra Steria) met on March 18. Eclore and ThermiUp were granted honor loans for a total amount of €80,000. They are both incubated at IMT Atlantique.


Eclore Actuators offers a bio-inspired pneumatic and hydraulic actuator solution which is highly energy efficient, 100% recyclable, and based on unique, patented industrial bending processes. Eclore actuators are less expensive, lighter, less bulky and require less maintenance than traditional actuators. There are many sectors of application, such as industrial automation, robotics, IoT and home appliances. Find out more


ThermiUp has developed a heat exchanger that recovers heat from the gray water of buildings to preheat domestic water. It allows builders to save up to 1/3 of the energy needed to produce domestic hot water, which represents half of the energy needs in new housing. This renewable energy device reduces greenhouse gas emissions by 1/3. Find out more


Digital Service Act: Regulating the content of digital platforms, Act 1

The Digital Service Act, proposed by the European Commission in early 2020, seeks to implement a new regulatory framework for digital platforms. Grazia Cecere, an economics researcher at Institut Mines-Télécom Business School, explains various aspects of these regulations.

Why has it become necessary to regulate the content of platforms?

Grazia Cecere: Technological developments have changed the role of the internet and platforms. Previous regulations specified that publishers were responsible for the totality of their content, but that web hosts were only responsible if flagged content was not handled adequately. With the emergence of super platforms and social media, the role of web hosts has changed. Their algorithms lead to more specific distribution of content, through rankings, search engine optimization and highlighting content, which may have significant impacts and contain dangerous biases.

What kind of content must be better regulated by digital platforms?

GC: There are many issues addressed, in particular combating cyber-bullying, disinformation and fake news, as well as different types of discrimination. Today the platforms’ algorithms self-regulate based on the available data and may reproduce and amplify discrimination that exists in society. For example, if the data analyzed by an algorithm shows wage gaps between men and women, it is likely to build models based on this information. So it’s important to identify these kinds of biases and correct them. Discrimination not only poses ethical problems: it also has economic implications. For example, if an algorithm designed to propose a job profile is biased based on an individual’s gender or skin color, the only criterion that should matter, professional ability, becomes obscured.
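To make that mechanism concrete, here is a minimal illustrative sketch in Python, using synthetic data and hypothetical variables rather than any platform’s actual system: a model trained on historically biased salary data learns a non-zero weight on gender, and any recommendation built on it would reproduce the gap.

```python
# Minimal illustrative sketch with synthetic data: a model fitted on
# historically biased salaries learns to use gender as a predictor.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000
experience = rng.uniform(0, 20, n)      # years of experience
gender = rng.integers(0, 2, n)          # toy encoding: 0 = women, 1 = men
# Historical salaries contain a gap unrelated to ability
salary = 30_000 + 2_000 * experience + 5_000 * gender + rng.normal(0, 3_000, n)

model = LinearRegression().fit(np.column_stack([experience, gender]), salary)
print(dict(zip(["experience", "gender"], model.coef_.round())))
# The non-zero coefficient on gender shows the bias has been learned:
# left uncorrected, any ranking built on this model amplifies it.
```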

Read more on I’MTech: Social media: The everyday sexism of advertising algorithms

What does the Digital Service Act propose so that platforms regulate their content?

GC: The Digital Service Act seeks to set clear rules for the responsibilities that come with running digital platforms. They must monitor the information distributed on their platforms, especially fake news and potentially harmful content. The goal is also to inform users better about the content and to ensure their fundamental rights online. Platforms must also increase their transparency and make data about their activity available. This data would then be available to researchers who could test whether it contains biases. The purpose of the Digital Service Act is to provide a harmonized legislative and regulatory system across all EU member states.

How can platforms regulate their own content?

GC: Another aspect of the Digital Service Act is providing the member states with regulatory instruments for their platforms. Different kinds of tools can be implemented. For example, a tool called “Fast Tracking” is being developed for Google to detect false information about Covid-19 automatically. This kind of tool, which determines whether information is false based on its written content, can be difficult to build since it requires sophisticated natural language processing. Some issues are more complicated to regulate than others.
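As a rough illustration of why such detection is hard, the sketch below trains a generic text classifier on a tiny, hypothetical labelled corpus. It is not the “Fast Tracking” tool mentioned above; real systems rely on far larger datasets, multilingual models and human review.

```python
# Hedged sketch of content-based misinformation detection (toy corpus,
# not the actual tool discussed in the interview).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Covid-19 vaccines alter human DNA",                          # false claim
    "Drinking bleach cures Covid-19",                             # false claim
    "Clinical trials showed the vaccine reduces severe illness",  # reliable
    "Health authorities recommend washing hands regularly",       # reliable
]
labels = [1, 1, 0, 0]  # 1 = likely misinformation, 0 = reliable

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
# Flags the new claim for review; with such a tiny corpus this is only indicative.
print(clf.predict(["Bleach is an effective Covid-19 treatment"]))
```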

Are digital platforms starting to take into account the Digital Service Act?

GC: It depends on the platform. Airbnb and Uber, for example, have made a lot of data available to researchers so that they can determine what kinds of discriminatory biases it contains. Google and Facebook are also providing access to an increasing amount of data. But Snapchat and TikTok are a whole other story!

Will the Digital Service Act also help regulate the internet market?

GC: The previous regulation, the E-Commerce Directive, dates from 2000. Over time, it has become obsolete. Internet players today are different than they were 20 years ago and some have a lot more power. One of the challenges is for the internet market to remain open to everyone and for new companies to be able to be founded independently of the super platforms in order to boost competition, since today, any company that is founded depends on the monopoly of the big tech companies.

By Antonin Counillon


Our indoor air is polluted, but new materials could provide solutions

Frédéric Thévenet, IMT Lille Douai – Institut Mines-Télécom

We spend 80% of our lives in enclosed spaces, whether at home, at work or in transit. We are therefore very exposed to this air, which is often more polluted than outdoor air. The issue of health in indoor environments is thus associated with chronic exposure to pollutants and to volatile organic compounds (VOCs) in particular. These species can cause respiratory tract irritation or headaches, a set of symptoms that is referred to as “sick building syndrome.” One VOC has received special attention: formaldehyde. This compound is a gas at room temperature and pressure and is very frequently present in our indoor environments although it is classified as a category 1B CMR compound (carcinogenic, mutagenic, reprotoxic). It is therefore subject to indoor air quality guidelines which were updated and made more restrictive in 2018.

The sources of volatile organic compounds

VOCs may be emitted in indoor areas by direct, or primary, sources. Materials are often identified as major sources, whether associated with the building (building materials, pressed wood, wood flooring, ceiling tiles), furniture (furniture made from particle board, foams), or decoration (paint, floor and wall coverings). The adhesives, resins and binders contained in these materials are clearly identified and well-documented sources.

To address this issue, mandatory labeling has existed for these products since 2012: they are classified in terms of emissions. While these primary sources related to the building and furniture are now well-documented, those related to household activities and consumer product choices (cleaning activities, cooking, smoking, etc.) are more difficult to characterize. For example, what products are used for cleaning? Are air fresheners or interior fragrances used? Are dwellings ventilated regularly? Research is being conducted in our laboratory to better characterize how these products contribute to indoor pollution. We have recently worked on cleaning product emissions and their elimination. Studies have also recently been carried out at our laboratory (at IMT Lille Douai) on the impact of essential oils, in partnership with the CSTB (French National Scientific and Technical Center for Building) and in coordination with ADEME (French Environmental and Energy Management Agency).

Emission, deposition and reactivity of essential oils in indoor air (Shadia Angulo-Milhem, IMT Lille Douai). Author provided

In addition to the primary sources of VOCs, there are also secondary sources resulting from the transformation of primary VOCs. These transformations are usually related to oxidative processes. Through these reactions, other kinds of VOCs are also formed, including formaldehyde, among others.

What solutions are there for VOCs in indoor air?

Twenty years ago, an approach referred to as a “destructive process” was being considered. The idea was to pass the air to be treated through a purification system to destroy the VOCs. These can either be stand-alone devices and therefore placed directly inside a room to purify the air, or integrated within a central air handling unit to treat incoming fresh air or re-circulated air.

Photocatalysis was also widely studied to treat VOCs in indoor air, as was cold plasma. Both of these processes target the oxidation of VOCs, ideally their transformation into CO2 and H2O. Photocatalysis is a process that draws on the ability of a material, usually titanium dioxide (TiO2), to adsorb and oxidize VOCs under ultraviolet irradiation. Cold plasma is a process where, under the effect of a high electric field, electrons ionize a fraction of the air circulating in the system and form oxidizing species.

The technical limitations of these systems lie in the fact that the air to be treated must be directed and moved through the system, and most importantly, the treatment systems must be supplied with power. Moreover, depending on the device’s design and the nature of the effluent to be treated (nature of the VOC, concentration, moisture content, etc.), it has been found that some devices may lead to the formation of by-products, including formaldehyde. Standards are currently available to oversee the assessment of this type of system’s performance, and they are updated as the technology advances.

Over the past ten years, indoor air remediation solutions have been developed focusing on the adsorption – meaning the trapping – of VOCs. The idea is to integrate materials with adsorbent properties in indoor environments to trap the VOCs. We have seen the emergence of materials, paint, tiles and textiles that incorporate adsorbents in their compositions and claim these properties.

Among these adsorbent materials, there are two types of approaches. Some trap the VOCs and do not re-emit them: the process is permanent and irreversible. The VOC trap can therefore fill up completely after some time and become inoperative once it is saturated. Today, it seems wiser to develop materials with “reversible” trapping properties: when there is a peak in pollution, the material adsorbs the pollutant, and when the pollution decreases, for example when the room is aired, it releases the pollutant, which is then evacuated through ventilation.
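The behavior described above can be captured with a very simple mass-balance model. The sketch below is only a rough illustration, with hypothetical rate constants rather than data from a real material: the adsorbent loads while the source emits, then releases the pollutant once the room is aired and ventilation carries it away.

```python
# Toy box model of a reversible VOC sink (hypothetical rates and units).
import numpy as np

dt = 1.0                         # time step [s]
k_ads, k_des = 1e-3, 2e-4        # adsorption / desorption rate constants [1/s]
C, M = 50.0, 0.0                 # air concentration [µg/m3], sorbed amount [µg/m3 equiv.]

for t in np.arange(0, 6 * 3600, dt):             # simulate 6 hours
    emission = 0.02 if t < 2 * 3600 else 0.0     # source active for the first 2 h
    air_change = 1e-5 if t < 2 * 3600 else 2e-4  # room aired afterwards
    dC = emission - air_change * C - k_ads * C + k_des * M
    dM = k_ads * C - k_des * M
    C, M = C + dC * dt, M + dM * dt

print(round(C, 1), round(M, 1))  # air concentration drops; the sink has partly unloaded
```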

These materials are currently being developed by various academic and industry players working in this field. It is interesting to note that these materials were considered sources of pollution 20 years ago, but can now be viewed as sinks for pollution.

How to test these materials’ ability to remove pollutants

Many technical and scientific obstacles remain, regardless of the remediation strategy chosen. The biggest one is determining whether these new materials can be tested on a 1:1 scale, as they will be used by the end consumer, meaning in “real life.”  

That means these materials must be able to be tested in a life-size room, under conditions that are representative of real indoor atmospheres, while controlling environmental parameters perfectly. This technical aspect is one of the major research challenges in indoor air quality (IAQ), since it determines the representativeness, and therefore the validity, of the results we obtain.

Experimental IRINA room (Innovative Room for Indoor Air studies, IMT Lille Douai). Author provided

We developed a large enclosed area in our laboratory for precisely this purpose a few years ago. With its 40 square meters, it is a real room that we can go into, called IRINA (Innovative Room For Indoor Air Studies). Seven years ago, it was France’s first fully controlled and instrumented experimental room on a 1:1 scale. Since its development and validation, it has housed many research projects and we upgrade it and make technical updates every year. It allows us to recreate the indoor air composition of a wood frame house, a Parisian apartment located above a ring road, an operating room and even a medium-haul aircraft cabin. The room makes it possible to effectively study indoor air quality and treatment devices in real-life conditions.

Connected to this room, we have a multitude of measuring instruments, for example to measure VOCs in general, or to monitor the concentration of one in particular, such as formaldehyde.

Frédéric Thévenet, Professor (heterogeneous/atmospheric/indoor air quality physical chemistry), IMT Lille Douai – Institut Mines-Télécom

This article has been republished from The Conversation under a Creative Commons license. Read the original article (in French).

IMPETUS: towards improved urban safety and security

How can traffic and public transport be managed more effectively in a city, while controlling pollution, ensuring the safety of users and at the same time, taking into account ethical issues related to the use of data and mechanisms to ensure its protection? This is the challenge facing IMPETUS, a €9.3 million project receiving funding of €7.9 million from the Horizon 2020 programme of the European Union[1]. The two-year project launched in September 2020 will develop a tool to increase cities’ resilience to security-related events in public areas. An interview with Gilles Dusserre, a researcher at IMT Mines Alès, a partner in the project.

What was the overall context in which the IMPETUS project was developed?

Gilles Dusserre: The IMPETUS project was the result of my encounter with Matthieu Branlat, the scientific coordinator of IMPETUS, who is a researcher at SINTEF (Norwegian Foundation for Scientific and Industrial Research), which supports research and development activities. Matthieu and I have been working together for many years. As part of the eNOTICE European project, he came to take part in a use case organized by IMT Mines Alès on health emergencies and the resilience of hospital organizations. Furthermore, IMPETUS is the concrete outcome of efforts made by research teams at Télécom SudParis and IMT Mines Alès for years to promote joint R&D opportunities between IMT schools.

What are the security issues in smart cities?

GD: A smart city can be described as an interconnected urban network of sensors, such as cameras and environmental sensors; it generates a wealth of valuable big data. In addition to enabling better management of traffic and public transport and better control of pollution, this data allows for better police surveillance and adequate crowd control. But these smart systems increase the risk of unethical use of personal data, in particular given the growing use of AI (artificial intelligence) combined with video surveillance networks. Moreover, they increase the attack surface of a city, since several interconnected IoT (Internet of Things) and cloud systems control critical infrastructure such as transport, energy, water supply and hospitals (which play a central role in current problems). These two types of risks associated with new security technologies are taken very seriously by the project: a significant part of its activities is dedicated to the impact of the use of these technologies on operational, ethical and cybersecurity aspects. We have groups within the project and external actors overseeing ethical and data privacy issues. They work with project management to ensure that the solutions we develop and deploy adhere to ethical principles and data privacy regulations. Guidelines and other decision-making tools will also be developed to help cities identify and take into account the ethical and legal aspects related to the use of intelligent systems in security operations.

What is the goal of IMPETUS?

GD: In order to respond to these increasing threats to smart cities, the IMPETUS project will develop an integrated toolbox that covers the entire physical and cybersecurity value chain. The tools will advance the state of the art in several key areas such as detection (social media, web-based threats), simulation and analysis (AI-based tests) and intervention (human-machine interface and eye tracking, optimization of the physical and cyber response based on AI). Although the toolbox will be tailored to the needs of smart city operators, many of the technological components and best practices will be transferable to other types of critical infrastructure.

What expertise are researchers from IMT schools contributing to the project?  

GD: The work carried out by Hervé Debar’s team at Télécom SudParis, in connection with researchers at IMT Mines Alès, resulted in the overall architecture of the IMPETUS platform, which will integrate the various smart city modules proposed in the project. Within this framework, the specification of the various system components, and of the system as a whole, will be designed to meet the requirements of the final users (the cities of Oslo and Padua), but also to be scalable to future needs.

What technological barriers must be overcome?

GD: The architecture has to be modular, so that each individual component can be independently upgraded by the provider of the technology involved. The architecture also has to be integrated, which means that the various IMPETUS modules can exchange information, thereby providing significant added value compared to independent smart city and security solutions that work as silos.

To provide greater flexibility and efficiency in collecting, analyzing, storing and accessing data, the IMPETUS platform architecture will combine IoT and cloud computing approaches. Such an approach will reduce the risks associated with excessive centralization of large amounts of smart city data and is in line with expected changes in communication infrastructure, which will be explored at a later date.
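As a purely hypothetical illustration of what “integrated” rather than “siloed” can mean in practice, the sketch below shows a shared event format that independently developed modules could publish and consume. The schema, module names and field names are invented; they are not the actual IMPETUS interfaces.

```python
# Hypothetical shared event schema for exchanging information between modules.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PlatformEvent:
    source_module: str   # e.g. "social-media-detection"
    event_type: str      # e.g. "threat.detected"
    severity: int        # 1 (informational) .. 5 (critical)
    payload: dict
    timestamp: str

def publish(event: PlatformEvent) -> str:
    """Serialise an event; a real deployment would hand this to a message broker."""
    return json.dumps(asdict(event))

print(publish(PlatformEvent(
    "social-media-detection", "threat.detected", 3,
    {"keyword": "evacuation", "city": "Oslo"},
    datetime.now(timezone.utc).isoformat(),
)))
```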

The project will also develop a testing plan. The plan will include the prerequisites, the execution of tests and the expected results. The acceptance criteria will be defined based on the priority and percentage of successful test cases. In close collaboration with the University of Nîmes, IMT Mines Alès will work on an innovative approach to environmental risks, in particular those related to chemical or biological agents, and to hazard assessment processes.

The consortium includes 17 partners from 11 EU member states and associated countries. What are their respective roles?

GD: The consortium was formed to bring together a group of 17 organizations that are complementary in terms of basic knowledge, technical skills, ability to create new knowledge, business experience and expertise. It brings together academic institutions (universities) and research organizations, innovative SMEs, industry representatives, NGOs and final users.

The work is divided into a set of interdependent work packages. It involves interdisciplinary innovation activities that require a high level of collaboration. The overall strategy consists of an iterative exploration, an assessment and a validation, involving the final users at every step.

[1] This project receives funding from Horizon 2020, the European Union’s Framework Programme for Research and Innovation (H2020) under grant agreement N° 883286. Learn more about IMPETUS.


Three Mile Island, Chernobyl, Fukushima: the role of accidents in nuclear governance

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom, and Olivier Borraz, Sciences Po

Until the 1970s, nuclear power plants were considered to be inherently safe, by design. Accidents were perceived as being highly unlikely, if not impossible, by designers and operators, in spite of recurring incidents that were not publicized.

This changed abruptly in 1979 with the Three Mile Island (TMI) accident in the United States. It was given wide media coverage, despite the fact that there were no casualties, and demonstrated that what were referred to as “major” accidents were possible, with a meltdown in this case.

The decades that followed have been marked by two other major accidents rated level 7 on the INES scale (International Nuclear Event Scale): Chernobyl in 1986 and Fukushima in 2011.

Turning point in the 1980s

This article will not address this organization or the invention, in the wake of the Chernobyl accident, of the INES scale used to rank events that jeopardize safety on a graduated scale, ranging from a deviation from a standard to a major accident.

Our starting point will be the shift that occurred in 1979, when accidents went from being seen as inconceivable to being seen as possible events, considered and described by nuclear experts as opportunities for learning and improvement.

Accidents therefore provide an opportunity to “learn lessons” in order to enhance nuclear safety and strive for continuous improvement.

But what lessons precisely? Has the most recent accident, Fukushima, led to profound changes in nuclear risk governance, as Chernobyl did?

The end of the human error rationale

Three Mile Island is often cited as the first nuclear accident: despite the technical and procedural barriers in place at the time, the accident occurred – such an accident was therefore possible.

Some, such as sociologist Charles Perrow, even described it as “normal,” meaning inevitable, due to the complexity of nuclear facilities and their tightly coupled nature (the components that make up the system are closely interconnected), which makes hard-to-control “snowball effects” likely.

For institutional, industrial and academic experts, the analysis of the accident changed views on the role of humans in these systems and on human error: accidents went from being a moral problem, attributable to humans’ “bad behavior”, to a systemic problem, attributable to poor system design.

Breaking with the human error rationale, these lessons paved the way for the systematization of learning from experience, promoting a focus on transparency and learning.  

Chernobyl and risk governance

It was with Chernobyl that accidents became “organizational,” leading nuclear organizations and public authorities to introduce structural reforms of safety doctrines, based on recognition of the essential nature of “organizational and cultural problems […] for the safety of operations” (IAEA, 1999).

Chernobyl also marked the beginning of major changes in risk governance arrangements at the international, European and French levels. An array of organizations and legal and regulatory provisions were introduced, with the twofold aim of learning from the accident that occurred at the Ukrainian power plant and preventing such an accident from happening elsewhere.

The law of 13 June 2006 on “Nuclear Transparency and Safety” (referred to as the TSN law), which, among other things, established the ASN’s status as an administrative authority independent from the government, is one emblematic example.

A possibility for every country

25 years after Chernobyl, Japan experienced an accident at its Fukushima-Daiichi power plant.

Whereas the accident that occurred in 1986 could be attributed in part to the Soviet regime and its RBMK technology, the 2011 catastrophe involved American-designed technology and a country that many considered to be at the forefront of modernity.

With Fukushima, a serious accident once again became a possibility that no country could rule out. And yet, it did not give rise to the same level of mobilization as that of 1986.  

Fukushima – a breaking point?

Ten years after the Japanese catastrophe, it can be said that it did not bring about any profound shifts – whether in the way facility safety is designed, managed and monitored, or in the plans and arrangements designed to manage a similar crisis in France (or in Europe).

This has been shown in the research carried out through the Agoras research project.

As far as preparedness for crisis management is concerned, Fukushima led to a re-examination of the temporal boundaries between the emergency phase and the post-accident phase, and to greater investment in the latter.

This catastrophe also led the French authorities to publish a preparedness plan in 2014 for managing a nuclear accident, making it a part of the common crisis management system.

These two aspects are reflected in the strengthening of the public safety portion of the national crisis management exercises carried out annually in France.   

But, as underscored by recent research, the observation of these national exercises did not reveal significant changes, whether in the way they are organized and carried out, the content of plans and arrangements, or, more generally, in the approach to a crisis caused by a major accident – with the exception of the creation of national groups that can intervene quickly on site (FARN).

Limited changes

It may, of course, be argued that, as with the effects of the Three Mile Island and Chernobyl accidents, structural transformations take time, and that it may still be too early to conclude that significant change will not occur.

But the research carried out through the Agoras project leads us to put forward the hypothesis that changes will remain limited, for two reasons.

The first reason is that structural changes were initiated in the 20 years following the Chernobyl accident. This period saw the rise of organizations dedicated to accident prevention and crisis management preparedness, such as the ASN in France, and European (WENRA, ENSREG) and international cooperation organizations.

These organizations initiated continuous research on nuclear accidents, gradually developing tools for understanding and responding to accidents, as well as mechanisms for coordination between public officials and industry leaders at the national and international levels.

These tools were “activated” following the Fukushima accident and made it possible to quickly provide an explanation for the accident, launch shared procedures such as supplementary safety assessments (the much-discussed “stress tests”), and collectively propose limited revisions to nuclear safety standards.

This work contributed to normalizing the accident, by bringing it into existing organizations and frameworks for thinking about nuclear safety.

This helped establish the conviction, among industry professionals and French public authorities, that the governance regime in place was capable of preventing and responding to a large-scale event, without the need to profoundly reform it.

The inertia of the French system

A second reason comes from the close relationships in France between the major players in the civil nuclear sector (operators – EDF primarily – and regulators – the ASN and its technical support organization IRSN), in particular with regard to establishing and assessing safety measures at power plants.

These relationships form an exceptionally stable organized action system. The Fukushima accident provided a short window of opportunity to impose additional measures on operators.

Read more: L’heure des comptes a sonné pour le nucléaire français (Time for a Reckoning in the French Nuclear Industry)

But this window closed quickly, and the action system returned to a stable state. The inertia of this system can be seen in the production of new regulatory instruments, the development and upgrading of which take several years.   

It can also be seen in the organization of crisis management exercises, which continue to perpetuate distinctions between safety and security, accident and crisis, the facility interiors and the environment, and more generally, between technical and political considerations – distinctions that preserve the structure and content of relationships between regulators and operators.

Learning from accidents

Like Chernobyl, Fukushima was first viewed as an exceptional event: by emphasizing the exceptional combination of a tsunami of unprecedented magnitude and a nuclear power plant, highlighting the lack of an independent regulatory agency in Japan, and pointing to the excessive respect for hierarchy among the Japanese, the aim was to construct a unique event and thereby suggest that it could not happen in the same way in other parts of the world.

But, at the same time, a normalization process took place, in France in particular, focusing not so much on the event itself as on the risks it posed for the organization of the nuclear industry, meaning the stakeholders and forms of knowledge endowed with legitimacy and authority.

The normalization process led to the accident being included in the existing categories, institutions and systems, in order to demonstrate their ability to prevent such an accident from happening and to limit the impact, should such an accident occur.

This was the result of efforts to delineate the boundaries, with some parties seeking to maintain them and others disputing them and trying to change them.

Ultimately, the boundaries upheld so strongly by industry stakeholders (operators and regulators) – between technical and political considerations, between experts and laymen – were maintained.

Relentlessly questioning nuclear governance

While the Fukushima accident was taken up by political and civil society leaders to challenge the governance of the nuclear industry and its “closed-off” nature, operators and regulators in France and throughout Europe quickly took steps to demonstrate their ability both to prevent such an accident, and to manage the consequences, in order to suggest that they could continue to be entrusted with regulating this sector.

As far as making the sector more open to civil society players is concerned, this movement was initiated well before the Fukushima accident (with the TSN Law in 2006, notably), and was, at best, the continuation of a pre-existing trend.

But other boundaries seem to have emerged or been strengthened in recent years, especially between technical factors and human and organizational factors, or safety requirements and other requirements for nuclear organizations (economic and industrial performance in particular), although it is not exactly clear whether this is related to the accidents.

These movements go hand in hand with a bureaucratization of relationships between the regulator and its technical expert, and between these two parties and operators, and require further research in order to investigate their effects on the foundations of nuclear risk governance.

Talking and listening to one another

Since like causes produce like effects, the real problem is the nuclear industry’s unreceptiveness to “uncomfortable knowledge,” to use the notion introduced by Steve Rayner.

Social science research has long demonstrated that in order to solve complex problems, a wide range of individuals from various backgrounds and training must be brought together, for research that transcends disciplinary and institutional boundaries.

Social science researchers, engineers and public authorities must talk to – and more importantly – listen to one another. For engineers and policy-makers, that means being ready to take into account facts or knowledge that may challenge established doctrines and arrangements and their legitimacy.  

And social science researchers must be ready to go and see nuclear organizations, to get a first-hand look at their day-to-day operations, listen to industry stakeholders and observe working situations.

But our experience, in particular through Agoras, has shown us that not only is such work time-consuming and costly, it is also fraught with pitfalls. For even when one stakeholder does come to see the soundness of certain knowledge, the highly interconnected nature of relationships with other industry stakeholders, who make up the governance system, complicates the practical implementation of this knowledge, and therefore prevents major changes from being made to governance arrangements.

Ultimately, the highly interconnected nature of the nuclear industry’s governance system is arguably one of its vulnerabilities.

Stéphanie Tillement, Sociologist, IMT Atlantique – Institut Mines-Télécom and Olivier Borraz, CNRS Research Director – Centre for the Sociology of Organisations, Sciences Po

This article has been republished from The Conversation under a Creative Commons license. Read the original article (in French).


DMS Logistics is optimizing inland container transportation

The inland container logistics chain suffers from low digitization, which limits organization and communication between the various parts of the chain. To overcome this problem, the start-up DMS Logistics, incubated at Mines Saint-Étienne, is developing a platform to optimize management of these flows of goods. It uses machine learning methods to automate the creation of schedules, reduce congestion at terminals and boost the competitiveness of ports and their logistics networks.

Ports’ ability to streamline the transfer of goods impacts their competitiveness in a sector where competition is fierce. Yet one of the major problems for this type of infrastructure is permanent congestion. This is explained in part by the limited physical capacity to receive trucks at terminals, but also by the lack of anticipation in the exchange of containers with carriers. The result is costly slowdowns and detention charges for every delay. At the heart of the problem is a lack of communication between the various participants in the same supply chain: terminals, road carriers, container parks, ship-owners, freight forwarders and so on.

To help overcome this problem, the start-up DMS Logistics seeks to bring together all of these participants through a platform to optimize and anticipate the inland flow of containers. “This is a major market with 800 million containers exchanged every year worldwide,” explains one of the three founders, Xavier Des Minières, a specialist in inland logistics. Using operational data from each participant in the chain, the start-up has successfully optimized the overall supply chain rather than an individual part of the chain. This software solution therefore achieves the goals of the French government’s strategic plan “France Logistics 2025” initiated in 2016 to make national logistics more efficient and attractive.

Digitizing companies that work with containers

“The logistics sector is still little digitized. It is made up of many SMEs with very small profit margins who cannot afford to buy digital tools to manage their operations,” explains Xavier Des Minières. DMS Logistics is solving this problem by equipping these users digitally and adapting to their resources. However, the solution becomes more useful when it groups together all the parts of the supply chain. To do so, the company is targeting the terminals around which all the other inland transportation participants revolve.

DMS Logistics’ solution is a distributed cloud-based SaaS (Software as a Service) platform. It enables users to enter their operational data online: container movements, missions to be carried out or already completed, etc. For participants that have already gone digital, the service connects to their data through an API (Application Programming Interface). Since it was founded in 2020, the start-up has collected 700,000 container movements. This massive amount of data will feed its machine learning algorithms. “We’re automating three key time-consuming actions: managing operations schedules, making appointments at terminals and communication between partners,” says Xavier Des Minières.
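By way of illustration only, the sketch below shows what connecting a participant’s data through an API might look like; the endpoint, authentication scheme and field names are hypothetical, not DMS Logistics’ actual interface.

```python
# Hypothetical REST ingestion of container movements (invented endpoint and fields).
from dataclasses import dataclass
from datetime import datetime
import requests

@dataclass
class ContainerMove:
    container_id: str
    carrier: str
    terminal: str
    timestamp: datetime

def fetch_moves(base_url: str, token: str) -> list[ContainerMove]:
    """Pull recent movements and normalise them for downstream scheduling models."""
    resp = requests.get(f"{base_url}/moves",
                        headers={"Authorization": f"Bearer {token}"},
                        timeout=10)
    resp.raise_for_status()
    return [ContainerMove(m["container_id"], m["carrier"], m["terminal"],
                          datetime.fromisoformat(m["timestamp"]))
            for m in resp.json()]
```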

Predicting flows based on data history

Why does this sector need to be automated? In the field, many participants in the chain respond to management difficulties in real time, using walkie-talkies or over the phone. They are constantly dealing with seemingly unforeseen difficulties. “However, we have shown that there is a lot of redundancy in the operational behavior of the various participants. The difficulties are therefore predictable and our algorithms make it possible to anticipate them,” explains Cyriac Azefack, a data scientist at DMS Logistics who holds a PhD in artificial intelligence.

The prediction is even more accurate when it cross-references data from the various participants. For example, carriers can optimize drivers’ schedules based on times for appointments offered by terminals to pick up goods. Furthermore, carriers’ behavior (history of their operations, inventory movement etc.) can be used to identify appropriate time slots for these appointments. Carriers can then access the terminal when it is convenient for them to do so and when it is not crowded. This seemingly simple organization was not possible before now.
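A toy version of this idea is sketched below, with invented data and without any claim to reflect the start-up’s actual models: a carrier’s historical gate-in hours are ranked by how quiet the terminal usually is at those times, yielding appointment slots that suit both sides.

```python
# Toy slot suggestion from historical behaviour (invented data).
from collections import Counter

carrier_history = [8, 9, 9, 10, 14, 15, 15, 15, 16]                    # hours of past gate-ins
terminal_congestion = Counter([7, 8, 8, 8, 9, 9, 10, 10, 10, 10, 11])  # trucks per hour

def suggest_slots(history, congestion, n=3):
    """Rank hours the carrier already uses by how quiet the terminal is then."""
    preferred = Counter(history)
    return sorted(preferred, key=lambda h: (congestion[h], -preferred[h]))[:n]

print(suggest_slots(carrier_history, terminal_congestion))  # -> [15, 14, 16]
```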

An even higher level of optimization can be reached. “Still using carriers’ behavioral data, we identify the drivers and trucks that are most suited for a mission (local, long distance, etc.),” adds Taki-Eddine Korabi, a data scientist at DMS Logistics who holds a PhD in mathematics, computer science and automation. Ultimately, the overall optimization of an ecosystem results in better local management.

Towards the optimization of local logistics ecosystems

DMS Logistics’ solution is distributed in the Ivory Coast and in Marseille, where a team of 12 people is based. “After 4 months of operations at our pilot facility, we can predict the arrival of trucks at a terminal with a reliability rate of 98% over a week,” explains Xavier Des Minières. For the terminal, this means 15% savings in resources. Moreover, when a port is efficient, it boosts the attractiveness of the entire region. The economic benefits are therefore wide-ranging.

Another key finding: optimizing flow at the terminals also helps the ports in their efforts toward ecological transition. More efficient organization means less unnecessary transportation, a reduction in traffic at ports, and therefore less local pollution. And air quality is improved. 

On the scientific side, research has only focused on optimizing container carriers’ operations since 2015 and on-the-ground information is still lacking. “We’re going to be starting a Cifre PhD with Mines Saint-Étienne which will rely on the use of data collected by our platform. That will allow us to explore this topic in an optimal way and offer bright prospects for research and the logistics business,” concludes Taki-Eddine Korabi.

By Anaïs Culot

Read on I’MTech: AlertSmartCity, Cook-e, Dastra, DMS, GoodFloow, JobRepublik, PlaceMeet and Spectronite supported by the “honor loan” scheme