
Locked-down world, silent cities

Last spring, France decided to impose a lockdown in response to the health crisis. Our cities came to a standstill and cars disappeared from the streets, allowing residents to rediscover quieter sounds like birdsong. A team of researchers decided to take advantage of this calm that suddenly settled over our lives to better understand the impacts of noise pollution, and created the Silent Cities project.

 

When the lockdown was announced and France was getting ready to come to a halt, a team of researchers launched a collaborative, interdisciplinary project: Silent Cities. The team includes Samuel Challéat,¹ Nicolas Farrugia,² Jérémy Froidevaux³ and Amandine Gasc,⁴ researchers in environmental geography, artificial intelligence, biology and ecology, respectively. The aim of their project is to record the sounds heard in cities around the world to study the impacts that lockdown and social distancing measures may have on noise pollution. The project also seeks to assess the effects of the variation of our activities on other animal species as our lives gradually return to normal.

Listening to cities

“We had to develop a standard protocol to obtain high-quality recordings for the analyses, but it also had to be light and easy to implement during the lockdown,” explains Nicolas Farrugia, a researcher in machine learning and deep learning at IMT Atlantique. Due to the lockdown, it was not possible to go directly into the field to carry out these acoustic surveys. A collaborative system was set up to allow a large number of participants around the world to take part in the project by making recordings from their homes. The four researchers provided a collaborative platform so that the participants could then upload their recordings.

Interactive map of the Silent Cities project participants around the world.

The researchers analyzed and compared recordings at different sites using what they call ecoacoustic indices. These are mathematical values: the higher they are, the greater the diversity and complexity of the sounds in an acoustic survey. “Still using an open-access approach, we used a code base to develop an algorithm that would automatically calculate these ecoacoustic indices in order to catalogue our recordings,” explains Nicolas Farrugia.
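
To give a concrete idea of what such an index looks like, here is a minimal sketch of one widely used ecoacoustic index, the Acoustic Complexity Index (ACI), computed from a single recording. It is an illustrative re-implementation of a standard formula, not the Silent Cities code base, and the file name is a placeholder.

```python
# Minimal sketch: Acoustic Complexity Index (ACI) of one recording.
# Illustrative only; not the Silent Cities pipeline.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def acoustic_complexity_index(wav_path, n_fft=512):
    """Sum, over frequency bins, of the normalised frame-to-frame intensity
    differences: higher values indicate a more varied soundscape."""
    rate, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                       # keep a single channel
        audio = audio.mean(axis=1)
    _, _, spec = spectrogram(audio, fs=rate, nperseg=n_fft)
    diffs = np.abs(np.diff(spec, axis=1))    # intensity changes between frames
    aci_per_bin = diffs.sum(axis=1) / (spec.sum(axis=1) + 1e-12)
    return float(aci_per_bin.sum())

# print(acoustic_complexity_index("survey_0800.wav"))  # placeholder file name
```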

“The goal is to run audio-tagging algorithms to automatically recognize and tag different sounds heard in a recording,” he adds. This makes it possible to obtain a fairly accurate identification of sound sources, indicating, for example, the presence of a car, a raven’s caw or a discussion between several people in a sound survey.

These algorithms, based on deep neural networks, have become increasingly popular in recent years. For acoustic ecologists, they provide recognition that is relatively accurate and, more importantly, multi-targeted: the algorithm is able to listen for many different sounds at the same time in order to tag all the acoustic surveys. “We can also use them as a filter if we want to find all the recordings where we hear a raven. That could be useful for measuring the appearance of a species, by visualizing the time, date or location,” says Nicolas Farrugia.
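
The filtering use case can be sketched as follows. The `tag_audio` function below is a hypothetical placeholder for a pretrained audio-tagging network (its real interface is not documented here); only the filtering logic is illustrated.

```python
# Illustrative sketch of filtering acoustic surveys by tag.
from pathlib import Path

def tag_audio(wav_path: Path) -> dict:
    """Hypothetical stand-in for a deep audio-tagging model: returns a
    {label: probability} dictionary for one recording."""
    raise NotImplementedError("plug in an audio-tagging model here")

def find_recordings_with(label: str, folder: str, threshold: float = 0.5):
    """List the recordings in `folder` where `label` is detected, so that the
    time, date or location metadata can then be inspected."""
    hits = []
    for wav in sorted(Path(folder).glob("*.wav")):
        probs = tag_audio(wav)
        if probs.get(label, 0.0) >= threshold:
            hits.append((wav.name, probs[label]))
    return hits

# e.g. find_recordings_with("Crow", "recordings/site_042/")  # paths are placeholders
```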

Artificial intelligence also helps estimate the frequency of different categories of sounds — automobile traffic, for example — and visualize increases or decreases. During the lockdown, the researchers clearly observed a drop in automobile traffic and now expect to see it go back up as our lives gradually return to normal. What interests them is being able to visualize how this may disturb the behavior of other animal species.
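
Assuming the tagging results have been gathered into a table (the file and column names below are hypothetical), tracking such a trend can be as simple as a daily aggregation:

```python
# Sketch: daily share of recordings in which road traffic is heard.
# "tagging_results.csv" and its columns `date` and `labels` are hypothetical.
import pandas as pd

tags = pd.read_csv("tagging_results.csv", parse_dates=["date"])
tags["has_traffic"] = tags["labels"].str.contains("vehicle|car|traffic", case=False)

# A drop during lockdown and a rise afterwards show up directly in this series.
daily_traffic = tags.set_index("date")["has_traffic"].resample("D").mean()
print(daily_traffic.tail())
```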

What changes?

“Some studies have shown that in urban environments, birds can change the frequency or time of day at which they communicate, due to ambient noise,” says Nicolas Farrugia. The sound of human activities, which saturates the urban environment, can, for example, make it difficult for certain species to reproduce. “That said, it’s hard to talk about causality since, in normal times, we can’t listen to urban ecosystems without the contribution of human activities.” It is therefore usually difficult for eco-acoustics researchers to fully understand the biodiversity of our cities.

In this respect, the Silent Cities project provides an opportunity to directly study the variation in human activity and how it impacts ecosystems. Some of the measures put in place to respond to the health crisis could subsequently be promoted for ecological reasons. One such example is cycling, which is now being encouraged through financial assistance to repair old bicycles and the creation of new cycle paths. Another example is initiatives to establish staggered working hours, which would also limit the associated noise pollution. One of the possible prospects of the project is to inform discussions about how urban environments should be organized.

“Samuel Challéat, the researcher who initiated this project, works on light pollution and what can be done to limit artificial light,” he adds. Like the “green and blue belts,” which seek to promote the preservation of so-called “ordinary” biodiversity, including in urban environments, Samuel Challéat is currently working on an emerging planning tool, the “black belt,” which aims to restore nocturnal ecological continuity harmed by artificial light. Since we know that the sounds created by human activities disturb certain ecological processes, this reasoning on ecological continuity could be transferred to the field of eco-acoustics, where the challenge would be to work to maintain or restore spaces free from any noise pollution. The data and results of the Silent Cities project could help provide insights in this area.

By Tiphaine Claveau

 

¹Samuel Challéat, Environmental Geography, University of Toulouse 2, CNRS, GEODE (guest researcher), Toulouse, France

²Nicolas Farrugia, Machine Learning & Deep Learning, IMT Atlantique, CNRS, Lab-STICC, Brest, France

³Jérémy Froidevaux, Conservation Biology, University of Bristol, School of Biological Sciences, Bristol, UK

⁴Amandine Gasc, Conservation Ecology, Aix Marseille University, Avignon University, CNRS, IRD, IMBE, Marseille, France

 


Data sharing, a common European challenge

Promoting data sharing between economic players is one of the major objectives of Europe’s digital governance strategy. To accomplish this, two specific challenges must be met. Firstly, a community must be created around data issues, bringing together various stakeholders from multiple sectors. Secondly, the technological choices implemented by these stakeholders must be harmonised.

 

‘If we want more efficient algorithms, with qualified uncertainty and reduced bias, we need not only more data, but more diverse data’, explains Sylvain Le Corff. This statistics researcher at Télécom SudParis thus sums up the challenge of data sharing. This need applies not only to researchers. Industrial players must also strengthen their data with that from their ecosystem. For instance, an energy producer will benefit greatly from sharing industrial data with suppliers or consumer groups, and vice versa. A car manufacturer will become all the more efficient with more data sources from its sub-contractors.

The problem is that sharing data is far from a trivial operation. The reason lies in the numerous technical solutions that exist to produce, store and use data. The long-standing and overriding idea for economic players was to exploit their data themselves, and each organisation therefore made its own choices in terms of architecture, format or data-related protocols. An algorithm developed to exploit data sets in a specific format cannot use data packaged in another format. This then calls for a major harmonisation phase.

‘This technical aspect is often under-estimated in data sharing considerations’, Sylvain Le Corff comments. ‘Yet we are aware that there is a real difficulty with the pre-processing needed to harmonise data.’ The researcher cites the example of automatic language processing, a key issue for artificial intelligence, which relies on the automatic processing of texts from multiple sources: raw texts, texts generated from audio or video documents, texts derived from other texts, etc. This is the notion of multi-modality. ‘The plurality of sources is well-managed in the field, but the manner in which we oversee this multi-modality can vary within the same sector.’ Two laboratories or two companies will therefore not harmonise their data in the same way. In order to work together, they absolutely need to go through this tedious pre-processing, which can hamper collaboration.
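
As a concrete illustration of that pre-processing step, the sketch below maps text records delivered by two partners in different layouts onto one shared schema before any joint algorithm can be trained. The field names are hypothetical.

```python
# Illustrative harmonisation step: map heterogeneous partner records onto a
# common {text, lang, source} schema. Field names are hypothetical.
def from_partner_a(record: dict) -> dict:
    return {"text": record["body"], "lang": record["language"], "source": "A"}

def from_partner_b(record: dict) -> dict:
    # Partner B ships transcripts of audio documents with a different key set.
    return {"text": record["transcript"], "lang": record.get("lang", "fr"), "source": "B"}

def harmonise(records_a: list, records_b: list) -> list:
    """Return all records expressed in the shared schema."""
    return [from_partner_a(r) for r in records_a] + [from_partner_b(r) for r in records_b]
```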

A European data standard

Olivier Boissier, a researcher in artificial intelligence and inter-operability at Mines Saint-Étienne, adds another factor to this issue: ‘The people who help to produce or process data are not necessarily data or AI specialists. In general, they are people with high expertise in the field of application, but don’t always know how to open or pool data sets.’ Given such technical limitations, a promising approach consists in standardising practices. This task is being taken on by the International Data Spaces Association (IDSA), whose role is to promote data sharing on a global scale, and more particularly in Europe.

Contrary to what one might assume, the idea of a data standard does not mean imposing a single norm on data format, architecture or protocol. Each sector has already worked on ontologies to help facilitate dialogue between data sets. ‘Our intention is not to provide yet another ontology’, explains Antoine Garnier, project head at IDSA. ‘What we are offering is more of a meta-model which enables a description of data sets based on those sector ontologies, and with an agnostic approach in terms of the sectors it targets.’

This standard could be seen as a list of conditions on which to base data use. To summarise the conditions in IDSA’s architectural model, ‘the three cornerstones are the inter-operability, certification and governance of data’, says Antoine Garnier. Thanks to this approach, the resulting standard serves as a guarantee of quality between players. It enables users to determine rapidly whether an organisation fulfils these conditions and is thus trustworthy. This system also raises the question of security, which is one of the primary concerns of organisations who agree to open their data.

Europe, the great lake region of data?

While developing a standard is a step forward in technical terms, it remains to be put into actual use. For this, its design must incorporate the technical, legal, economic and political concerns of European data stakeholders – producers and users alike. Hence the importance of creating a community consisting of as many organisations as possible. In Europe, since 2020, this community has had a name: Gaia-X, an association of players, bringing together IMT and IDSA in particular, created to structure efforts around the federation of data, software and infrastructure clouds. Via Gaia-X, public and private organisations aim to roll out standardisation actions, using the IDSA standard among others, but may also implement research, training or awareness activities.

‘This is such a vast issue that if we want to find a solution, we must approach it through a community of experts in security, inter-operability, governance and data analysis’, Olivier Boissier points out, emphasising the importance of dialogue between specialists around this topic. Alongside their involvement in Gaia-X, IMT and IDSA are organising a winter school from 2 to 4 December to raise awareness among young researchers of data-sharing issues (see insert below). With the support of the German-French Academy for the Industry of the Future, it will provide the keys to understanding technical and human issues, through concrete cases. ‘Within the research community, we are used to taking part in conferences to keep up to date on the state of play of our field, but it is difficult to have a deeper understanding of the problems faced by other fields’, Sylvain Le Corff admits. ‘This type of Franco-German event is essential to structuring the European community and forming a global understanding of an issue, by taking a step back from our own area of expertise.’

The European Commission has made no secret of its ambition to create a space for the free circulation of data within Europe. In other words, a common environment in which personal and confidential data would be secured, but also in which organisations would have easy access to a significant amount of industrial data. To achieve this idyllic scenario of cooperation between data players, the collective participation of organisations is an absolute prerequisite. For academics, the community-based approach is a core practice and does not represent a major challenge. For businesses, however, there remains a certain number of stakeholders to win over. The majority of major industries have understood the benefits of data sharing, ‘but some companies still see data as a monetizable war chest that they must avoid sharing’, says Antoine Garnier. ‘We must take an informative approach and shatter preconceived ideas.’

Read on I’MTech: Data sharing: an important issue for the agricultural sector

What about non-European players? When we speak about data sharing, we systematically refer to the cloud, a market cornered by three American players, Amazon, Microsoft and Google, behind which we find other American stakeholders (IBM and Oracle) and a handful of Chinese interests such as Alibaba and Tencent. How do we convince these ‘hyper-scalers’ (the term refers to their ability to scale up to meet growing demand, regardless of the sector) to adopt a standard which is not their own, when they are the owners of the technology upon which the majority of data use is based? ‘Paradoxically, we are perhaps not such bad news for them’, Antoine Garnier assures us. ‘Along with this standard, we are also offering a form of certification. For players suffering from a negative image, this allows them to demonstrate compliance with the rules.’

This standardisation strategy also impacts European digital sovereignty and the transmission of its values. In the same way as Europe succeeded in imposing a personal data protection standard in the 2010s with the formalisation of the GDPR, it is currently working to define a standard around industrial data sharing. Its approach to this task is identical, i.e. to make standardisation a guarantee of security and responsible management. ‘A standard is often perceived as a constraint, but it is above all a form of freedom’, concludes Olivier Boissier. ‘By adopting a standard, we free ourselves of the technical and legal constraints specific to each given use.’

From 2 to 4 December: a winter school on data sharing

Around the core theme of Data Analytics & AI, IMT and TU Dortmund are organising a winter school on data sharing for industrial systems, from 2 to 4 December 2020, in collaboration with IDSA, the German-French Academy for the Industry of the Future and with the support of the Franco-German University. Geared towards doctoral students and young researchers, its aim is to open perspectives and establish a state of play on the question of data exchange between European stakeholders. Through the participation of various European experts, this winter school will examine the technical, economic and ethical aspects of data sharing by bringing together the field expertise of researchers and industrial players.

Information and registration



DAGOBAH: Tables, AI will understand

Human activities produce massive amounts of raw data presented in the form of tables. To help machines understand these tables quickly, EURECOM and Orange are developing DAGOBAH, a semantic annotation platform. It aims to provide a generic solution that can optimize AI applications such as personal assistants, and facilitate the management of complex data sets for any company.

 

On a day-to-day basis, online keyword searches often suffice to make up for our thousands of memory lapses, clear up any doubts we may have or satisfy our curiosity. The results even anticipate our needs by offering more information than we asked for: a singer’s biography, a few song titles, upcoming concert dates etc. But have you ever wondered how the search engine always provides an answer to your questions? In order to display the most relevant results, computer programs must understand the meaning and nuances of data (often in the form of tables) so that they can answer users’ queries. This is one of the key goals of the DAGOBAH platform, created through a partnership between EURECOM and Orange research teams in 2019.

DAGOBAH’s aim is to automatically understand the tabular data produced by humans. Since there is a lack of explicit context for this type of data – compared to a text – understanding it depends on the reader’s knowledge. “Humans know how to detect the orientation of a table, the presence of headers or merged rows, relationships between columns etc. Our goal is to teach computers how to make such natural interpretations,” says Raphaël Troncy, a data science researcher at EURECOM.

The art of leveraging encyclopedic knowledge

After identifying a table’s form, DAGOBAH tries to understand its content. Take two columns, for example. The first lists names of directors and the second, film titles. How does DAGOBAH go about interpreting this data set without knowing its nature or content? It performs a semantic annotation, which means that it effectively applies a label to each item in the table. To do so, it must determine the nature of a column’s content (directors’ names etc.) and the relationship between the two columns. In this case: director – directed – film. But an item may mean different things. For example, “Lincoln” may refer to a last name, a British or American city, the title of a Steven Spielberg film etc. In short, the platform must resolve any ambiguity about the content of a cell based on the overall context.

To achieve its goal, DAGOBAH searches existing encyclopedic knowledge bases (Wikidata, DBpedia). In these bases, knowledge is often formalized and associated with attributes: “Wes Anderson” is associated with “director.” To process a new table, DAGOBAH compares each item to these knowledge bases and proposes possible candidate attributes: “film title”, “city” etc. At this stage, they remain mere candidates. Then, for each column, the candidates are grouped together and put to a majority vote. The nature of the column is thus deduced with a greater or lesser degree of confidence.
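
The candidate-and-vote logic can be sketched as follows. The `lookup_candidate_types` function is a hypothetical stand-in for a query to Wikidata or DBpedia; DAGOBAH's actual lookup and scoring are more elaborate.

```python
# Sketch of candidate generation followed by a majority vote over a column.
from collections import Counter

def lookup_candidate_types(cell_value: str) -> list:
    """Hypothetical knowledge-base lookup, e.g.
    'Lincoln' -> ['family name', 'city', 'film']."""
    raise NotImplementedError("query Wikidata or DBpedia here")

def guess_column_type(column: list) -> tuple:
    """Vote across all cells and return the winning type with its vote share."""
    votes = Counter()
    for cell in column:
        votes.update(set(lookup_candidate_types(cell)))
    best_type, count = votes.most_common(1)[0]
    return best_type, count / len(column)

# e.g. guess_column_type(["Lincoln", "The Grand Budapest Hotel", "Parasite"])
# could return "film" as the majority type even though "Lincoln" alone is ambiguous.
```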

However, there are limitations to this method when it comes to complex tables. Beyond applications for the general public, industrial data may contain statistics related to business-specific knowledge or highly specialized scientific data that is difficult to identify.

Neural networks to the rescue  

To reduce the risk of ambiguity, DAGOBAH uses neural networks and a word embedding technique. The principle: represent a cell’s content as a vector in a multidimensional space. Within this space, the vectors of two semantically close words lie geometrically close together. Visually speaking, the directors are grouped together, as are the film titles. Applying this principle to DAGOBAH is based on the assumption that items in the same column must be similar enough to form a coherent whole. “To remove ambiguity between candidates, categories of candidates are grouped together in vector space. The problem is then to select the most relevant group in the context of the given table,” explains Thomas Labbé, a data scientist at Orange. This method becomes more effective than a simple search with a majority vote when there is little information available about the context of a table.
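
A minimal version of this disambiguation step is sketched below: for each cell, keep the candidate whose embedding vector is closest to the centroid of all candidates in the column, i.e. the most coherent group. The embeddings are assumed to come from a pretrained model; this is not DAGOBAH's actual algorithm.

```python
# Sketch of embedding-based disambiguation of candidate entities in a column.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def disambiguate(candidates_per_cell):
    """candidates_per_cell[i] maps each candidate entity of cell i to its
    embedding vector (numpy array); returns one chosen candidate per cell."""
    all_vecs = np.vstack([v for cands in candidates_per_cell for v in cands.values()])
    centroid = all_vecs.mean(axis=0)          # rough "column context" vector
    return [max(cands, key=lambda name: cosine(cands[name], centroid))
            for cands in candidates_per_cell]
```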

However, one of the drawbacks of using deep learning is the lack of visibility into what happens inside the neural network. “We change the hyperparameters, turning them like oven dials to obtain better results. The process is highly empirical and takes a long time since we repeat the experiment over and over again,” explains Raphaël Troncy. The approach is also costly in terms of computing time, and the teams are working on scaling up the process. As such, Orange’s dedicated big data infrastructures are a major asset. Ultimately, the researchers seek to implement an all-purpose approach, built end to end, that is generic enough to meet the needs of highly diverse applications.

Towards industrial applications

The semantic interpretation of tables is a goal but not an end in itself. “Working with EURECOM allows us to have almost real-time knowledge of the latest academic advances, as well as an informed opinion on the technical approaches we plan to use,” says Yoan Chabot, a researcher in artificial intelligence at Orange. DAGOBAH’s use of encyclopedic data makes it possible to optimize question-answering engines in the kind of natural language used by voice assistants. But the holy grail will be to provide an automatic processing solution for business-specific knowledge in an industrial environment. “Our solution will be able to address the private sector market, not just the public sector, for internal use by companies who produce massive amounts of tabular data,” adds Yoan Chabot.

This will be a major challenge, since industry does not have knowledge graphs to which DAGOBAH may refer. The next step will therefore be to succeed in semantically annotating data sets using knowledge bases in their embryonic stages. To achieve their goals, for the second year in a row the academic and industry partners have committed to take part in an international semantic annotation challenge, a very popular topic in the scientific community. For four months, they will have the opportunity to test their approach in real-life conditions and will compare their results with the rest of the international community in November.

To learn more: DAGOBAH: Make Tabular Data Speak Great Again

Anaïs Culot for I’MTech


Smarter models of the ocean

The ocean is a system that is difficult to observe, whose biodiversity and physical phenomena we still know very little about. Artificial intelligence could be an asset in understanding this environment better. Ronan Fablet, a researcher at IMT Atlantique, presents the projects of the new Océanix Research Chair. What is the objective? To use AI to optimize models for observing the ocean.

 

More than 70% of the surface area of our planet is covered by oceans and seas. They make up a colossal system that we know little about. The TARA expedition discovered hundreds of millions of previously unknown species of plankton, while our ability to explore the ocean floor remains limited. This is also the case with observing physical phenomena such as the dynamics of ocean currents, at the surface or at depth.

And yet, understanding ocean dynamics is essential for a good understanding of ecological aspects, biodiversity and ecosystems. But unlike the atmosphere, which can be observed directly, the ocean is difficult to study. Space technologies offer some visibility of the ocean surface, including surface currents and winds, but can see nothing below. In addition, orbiting satellites capture images as they pass over certain areas but cannot provide instantaneous observation of the entire globe, and the presence of clouds can obscure the view of the ocean. As for beacons and buoys, some of them collect data down to depths of 2,000 meters, but such measurements remain very sparse.

Using AI to see the unknown

“No observation system can provide a high-resolution image of the oceans all around the globe, everywhere and all the time,” says Ronan Fablet, signal and communications researcher at IMT Atlantique. “And even decades from now I don’t think that will be possible, if we use only physical observations.” The solution is artificial intelligence: AI could make it possible to optimize observation systems and reconstruct missing data based on the observed data. Ronan Fablet launched the Océanix chair at IMT Atlantique in order to investigate this further, in collaboration with numerous institutional partners (CNES, École Navale, ENSTA Bretagne, Ifremer, IRD, ESA) and industrial partners (Argans, CLS, e-odyn, ITE-FEM, MOi, Microsoft, NavalGroup, ODL, OceanNext, Scalian).

Machine learning is a way of estimating parameters to get the best prediction of an unknown, for example at a time in the future. This works like image recognition models: “We could feed the model a lot of pictures of dogs, for example, so that it learns to recognize them,” Ronan Fablet explains. “The difference here is that we’re working on systems with larger dimensions, and images of the ocean.”

Take the example of an oil spill. To find out how the oil will drift through the ocean after a spill, researchers use simulations based on physical models related to fluid dynamics. “These models are either difficult to solve or difficult to calibrate, and may require unknown parameters,” he says. Machine learning techniques should make it possible to develop digital models that are more compact, and therefore faster in simulation. This would make it easier to simulate the physical processes involved in the drift of an oil slick.
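
The idea of a compact learned model can be illustrated with a minimal sketch: a small neural network is trained to reproduce one time step of a simulated system, then iterated in place of the slower physical solver. This is purely illustrative and far simpler than the chair's actual models; the training data are assumed to come from an existing reference simulation.

```python
# Minimal sketch of a learned surrogate of a dynamical system (illustrative).
import torch
import torch.nn as nn

step = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))
opt = torch.optim.Adam(step.parameters(), lr=1e-3)

def train(states_t, states_t1, epochs=200):
    """states_t, states_t1: (N, 2) tensors of simulated states at t and t+dt,
    produced beforehand by the reference physical model (assumed available)."""
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(step(states_t), states_t1)
        loss.backward()
        opt.step()

def rollout(x0, n_steps):
    """Fast simulation: iterate the learned one-step model from state x0."""
    traj = [x0]
    with torch.no_grad():
        for _ in range(n_steps):
            traj.append(step(traj[-1]))
    return torch.stack(traj)
```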

Read on I’MTech: Marine oil pollution detected from space

This also applies to obtaining better representations of climate variability, which involves very long timescales. “The objective is to use the data available today, and to couple it with machine learning techniques to find the missing information, to better understand the situation tomorrow.”

A better view of sea routes

Model optimization and data reconstruction are also of great interest in vessel traffic monitoring. Possible applications include the detection of abnormal behavior, such as a fishing vessel changing course or stopping, or of illegal behavior, such as a vessel entering a restricted area. “It is unimaginable to equip an entire maritime route as we would a motorway to monitor traffic. Observation is therefore based on other space technologies,” says the researcher.

In the field of maritime traffic, there are two main types of information: AIS (Automatic Identification System) signals and satellite imagery. Every shipping vessel is required to transmit an AIS signal so that it can be located, but vessels smuggling cargo usually turn off this signal. Among other things, satellite imagery makes it possible to check whether the vessels that have sailed through an area were transmitting, by comparing the images with AIS signals.

This type of study of abnormal behavior related to AIS signals was the subject of the ANR Astrid Sesame project. “We applied dedicated neural networks to training data, particularly from western Brittany, to learn what normal ship behavior is,” says Ronan Fablet. The aim is then to identify behaviors that deviate from the norm, even if they are infrequent or of very low probability. An abnormal event would then trigger an alert in monitoring software so that operators can determine whether specific actions are required, such as sending a patrol.
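
A much simplified illustration of this idea is an autoencoder trained only on normal trajectories: tracks that it reconstructs poorly are flagged as abnormal. The feature extraction and architecture below are placeholders, not the project's actual networks.

```python
# Simplified sketch of AIS anomaly detection with an autoencoder (illustrative).
import torch
import torch.nn as nn

FEATS = 32  # hypothetical fixed-length encoding of positions, speeds, headings

auto_encoder = nn.Sequential(
    nn.Linear(FEATS, 8), nn.ReLU(),   # compress normal behaviour
    nn.Linear(8, FEATS),              # reconstruct it
)

def anomaly_score(track_features: torch.Tensor) -> float:
    """High reconstruction error = behaviour unlike the (normal) training data."""
    with torch.no_grad():
        recon = auto_encoder(track_features)
        return float(nn.functional.mse_loss(recon, track_features))

# After training on normal tracks, an alert could be raised whenever
# anomaly_score(features) exceeds a threshold chosen on validation data.
```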

Applications of artificial intelligence in oceanography are developing rapidly today, as the links between neural networks and the mathematical models used in oceanography become more explicit and easier to implement. The Océanix research chair at IMT Atlantique brings together institutions specialized in artificial intelligence and others more focused on oceanography.

Some teams have been working together for several years, such as Ifremer with IMT Atlantique. These studies will make it possible to provide answers where analytical models cannot, and to speed up calculations considerably. Ronan Fablet adds that “the Holy Grail for our teams would be to identify new laws for physical, biogeochemical or ecological processes. To be able to identify new models directly from the data – representations corresponding to a general rule”.

 

Tiphaine Claveau

 


Protective masks: towards widespread reuse?

How can protective masks be recycled and reused without risking safety? Scientists, medical practitioners and manufacturers have teamed up to explore different treatment methods. As part of this consortium, IMT Atlantique researchers are studying the impact of decontamination processes on mask performance.

 

Surgical and FFP2 masks are intended for single use. Thrown away after just a few hours of use, they are designed to protect the wearer from inhaling infectious agents spread through the air. The question of recycling these masks has not been raised before, but high demand for masks to protect healthcare workers and the general public has been a game-changer. To help find a solution to the current shortage, an interdisciplinary consortium bringing together nearly 25 laboratories and manufacturers throughout France was created in early March, led by professor Philippe Cinquin from Grenoble University Hospital, the CNRS and the CEA. Its goal is to find a treatment process that makes it possible to reuse masks.

Currently, various decontamination methods recognized for both their virucidal and bactericidal effects are being explored: among others, gamma or beta irradiation, thermal decontamination with steam at 121°C, ethylene oxide treatment, and wet or dry heating at 70°C or higher. These methods must be able to reduce the bioburden of protective masks without reducing filtration efficiency or breathability.

At IMT Atlantique, which is a member of the research consortium, Laurence Le Coq and her colleagues Aurélie Joubert and Yves Andrès are working mainly on this second aspect of the project.  The researchers are drawing on their research on filtration applied to air treatment — for industrial waste applications, for example, or indoor air treatment in ventilation networks. The team has been able to quickly shift its focus and adapt its expertise to work on recycling used masks. “The contribution and dedication of researchers and technical staff, who were called on to respond to an urgent need to develop technical solutions and establish experimental conditions, has been instrumental,” says Laurence Le Coq.

By mid-March, the scientists had set up an initial test bed to closely reflect the AFNOR standards for masks in order to test their performance following decontamination. “If the masks are normally intended for single use, it is partly because they first undergo a treatment process which gives them a certain level of efficiency, as well as their mechanical strength and specific shape. When they are decontaminated, part of this pre-treatment is removed, depending on the type of decontamination and its conditions. What’s more, depending on how a mask has been put on, worn and taken off, it may be damaged and its structure could be altered,” explains the researcher.

Preliminary findings    

So, how can effective decontamination be combined with a sufficient level of protection?  “We compare how performance is maintained between new treated masks and used treated masks. More precisely, we measure the changes brought about by decontamination treatments, in particular their level of breathability and their filtration efficiency for particles with a diameter ranging from 0.3 to 3 µm, since the virus is spread by microdroplets,” explains Laurence Le Coq.
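
The two quantities being compared are standard ones: filtration efficiency is the fraction of test particles stopped by the mask, and breathability is assessed through the pressure drop across it. The sketch below only illustrates these definitions; the figures are placeholders, not project measurements.

```python
# Illustrative comparison of a new vs. a decontaminated mask (placeholder data).
def filtration_efficiency(upstream_count: float, downstream_count: float) -> float:
    """Fraction of particles (e.g. 0.3-3 um) removed by the mask."""
    return 1.0 - downstream_count / upstream_count

def summarize(label: str, up: float, down: float, pressure_drop_pa: float) -> None:
    eff = filtration_efficiency(up, down)
    print(f"{label}: efficiency {eff:.1%}, pressure drop {pressure_drop_pa:.0f} Pa")

summarize("new mask (placeholder)", up=10_000, down=250, pressure_drop_pa=35)
summarize("treated mask (placeholder)", up=10_000, down=600, pressure_drop_pa=33)
```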

After ruling out certain methods, the scientists were able to determine favorable treatment conditions for decontaminating the masks without having too much of an effect on their inherent qualities. “Dry heat treatments, for example, are promising but we can’t move forward for the time being. Certain findings are encouraging following irradiation or washing at 95°C, but only for surgical masks. For now, our findings do not allow us to converge on a single treatment, a single protocol. And most importantly, there is a huge difference between what we do in good laboratory conditions and what could be done on a greater quantity of masks in a hospital environment, or at home,” says Laurence Le Coq.

The researchers are currently trying to clarify and confirm these preliminary findings. Their goal is now to quickly establish treatment conditions that are effective for all surgical and FFP2 masks, regardless of the manufacturer.

Is widespread mask recycling possible?

The majority of masks provided for the “general public” during the lockdown are reusable after being washed at 60°C for 30 minutes. This is not yet the case for professional masks.

Lockdown measures have been accompanied by efforts to raise public awareness about the importance of wearing masks. In French departments classified as red due to a high number of cases, masks intended for the “general public” are mandatory on public transportation and in high schools. These fabric masks are less effective than professional protective masks, but they are easily reusable and can be washed at least five times. “What is lost in effectiveness is made up for by widespread mask-wearing and ease of use,” says Laurence Le Coq.

For the researcher, this unprecedented research project could also be an opportunity to consider recycling protective masks in the long term, even when there is not a shortage. “Is it really appropriate to have single-use masks if at some point we are required to use them to a greater extent, or even on a daily basis? How should the environmental costs of this medical waste be weighed? Of course, what happens next will depend on the treatment we’re able to develop.”

 

By Anne-Sophie Boutaud


In the midst of a crisis, hospitals are using digital simulation to organize care

Thierry Garaix and Raksmey Phan are systems engineering researchers at Mines Saint-Étienne[1]. In response to the current health crisis, they are making digital simulation and digital twins available to health services to inform their decision-making. This assistance is crucial to handling the influx of patients in hospitals and managing the post-peak period of the epidemic.

 

The organization of the various departments within a hospital is a particular concern in the management of this crisis. Based on the number of incoming patients and how many of them require special care, certain departments must be turned into dedicated wards for Covid-19 patients. Hospitals must therefore decide which departments they can afford to close in order to allocate beds and resources for new patients. “We’re working on models to simulate hospitalizations and intensive care units,” says Thierry Garaix, a researcher in healthcare systems engineering at Mines Saint-Étienne.

“Cardiac surgery operating rooms are already equipped with certain resources needed for Covid wards, such as respirators,” explains the researcher. This makes them good candidates for receiving Covid patients in respiratory distress. These simulations give caregivers a clearer view in order to anticipate the need for hospital and intensive care beds. “At the peak of the epidemic, all possible resources are reassigned,” he explains. “Once the peak has passed, the number of cases admitted to the hospital begins to drop, and the hospital must determine how to reallocate resources to the usual activities.”

Visualizing the hospital

It is essential for hospitals to have a good understanding of how the epidemic is evolving in order to define their priorities and identify possibilities. Once the peak has passed, fewer new patients are admitted to the hospital every day but those that remain still require care. These simulations make it possible to anticipate how long these departments will remain occupied by Covid patients and estimate when they will be available again.

“The tool I’m developing makes it possible to visualize how the flow of Covid patients will progress over time to help the university hospital make decisions,” says Thierry Garaix. The researcher provides the model with data about the length of hospital stays, time spent in the hospital or intensive care unit and the capacity of each hospital unit. The model can then digitally simulate patient pathways and visualize flows throughout the hospital. “It’s important to understand that the progression isn’t necessarily linear,” he adds, emphasizing that “if we see a drop in the number of cases, we have to consider the possibility that there could then be a rise in the epidemic.”
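
As an illustration of this kind of model, the toy discrete-event simulation below (using the simpy library) admits patients at random intervals, occupies a bed for a sampled length of stay and records occupancy over time. All parameters are illustrative, not the hospital's data, and the real model is considerably richer.

```python
# Toy discrete-event simulation of Covid bed occupancy (illustrative parameters).
import random
import simpy

BEDS = 40           # capacity of the Covid ward
MEAN_ARRIVAL = 0.2  # mean days between admissions
MEAN_STAY = 9.0     # mean length of stay in days

def patient(env, beds, log):
    with beds.request() as bed:
        yield bed                                  # wait for a free bed
        log.append((env.now, beds.count))          # record occupancy over time
        yield env.timeout(random.expovariate(1 / MEAN_STAY))

def admissions(env, beds, log):
    while True:
        yield env.timeout(random.expovariate(1 / MEAN_ARRIVAL))
        env.process(patient(env, beds, log))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=BEDS)
log = []
env.process(admissions(env, beds, log))
env.run(until=60)                                  # simulate 60 days
print("peak occupancy:", max(count for _, count in log))
```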

But even if a hospital unit could be freed up and reallocated to its regular activities, it may be more cautious to keep it available to handle new cases. “At the beginning of the epidemic, health services had to rush to allocate resources and set up Covid units quickly,” says Thierry Garaix. “The benefit of these simulations is that they make it easier to anticipate the management of resources, so that resources can be allocated gradually depending on how the epidemic evolves.”

“Strictly speaking, it is not a digital twin since the model does not directly interact with reality,” says the researcher. “But if a digital twin of all of the hospital’s departments had been available, it would have been of great help in planning how resources should be allocated at the beginning of the epidemic.” 

Visualizing people

A digital twin could help assess a number of complex aspects, including the effects of isolation on the health of elderly people. “It’s a project we’ve been working on for a while, but it has taken on new importance in light of the lockdown measures,” says Raksmey Phan, who is also a healthcare systems researcher at Mines Saint-Étienne. The AGGIR scale is generally used to measure an individual’s loss of autonomy. It breaks health status into different categories (autonomous, at risk, fragile, dependent) in order to propose appropriate care. The digital twin would be used to anticipate changes in health status, identify at-risk individuals and prevent them from moving towards a situation of dependence.

“It’s important to point out that a fragile individual can, with appropriate physical activity, return to a category corresponding to a better health status. However, once an individual enters into a situation of dependence,  there’s no going back,” explains Raksmey Phan. The aim of this new digital twin project is to predict this progression in order to propose appropriate activities before it is too late. At present, the lack of physical activity as a result of the lockdown raises the risk of adverse health outcomes since it implies a loss of mobility.

In the context of lockdown, this digital twin therefore makes it possible to estimate the impact of a lack of physical activity on elderly people. Before the lockdown period, researchers installed sensors in volunteers’ homes, on doors, on objects such as refrigerators, on the front door, etc., to evaluate their presence and level of activity at home. “With fairly simple sensors, we have a model that is well-aligned with reality and is effective for measuring changes in an individual’s health status,” he adds.

These sensors evaluate the time spent in bed or on the couch, or indicate if, on the contrary, individuals spend a lot of time standing up and moving around, or leave the house often. With this data, the digital twin can extrapolate new data about a future situation, and therefore predict how an individual’s health status will progress over time. “The goal is essentially to analyze troubling changes that may lead to a risk of fragility, and react in order to prevent this from occurring,” explains the researcher.
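
A very simplified version of this kind of prediction is sketched below: daily activity derived from the sensors is summarised as minutes of activity per day, and a sustained downward trend triggers an alert. The threshold, window and feature are hypothetical, not the actual model.

```python
# Sketch: flag a sustained decline in daily activity (hypothetical thresholds).
import numpy as np

def declining_activity(daily_active_minutes, window=28, slope_alert=-2.0):
    """Return True if activity has dropped faster than `slope_alert`
    minutes/day on average over the last `window` days."""
    recent = np.asarray(daily_active_minutes[-window:], dtype=float)
    slope = np.polyfit(np.arange(len(recent)), recent, deg=1)[0]
    return slope < slope_alert

# e.g. a steady drop from ~300 to ~200 active minutes over four weeks:
history = list(np.linspace(300, 200, 28))
print(declining_activity(history))   # True -> propose adapted activities
```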

The researchers, who are working with the insurance company EOVI MCD, could then propose appropriate activities to maintain good health. Even in the midst of a pandemic, and taking social distancing measures and an effort to limit contact into account, it is possible to propose activities to be done at home, in front of the TV for example. “The insurance provider could propose activities and home services or potentially direct them to a retirement home,” says Thierry Garaix. “The key focus is providing an opportunity to act before it’s too late by estimating the future health status of the individuals  concerned, and reacting by proposing appropriate structures or facilities,” say the two researchers.

[1] Thierry Garaix and Raksmey Phan are researchers at the Laboratory of Informatics, Modeling and Optimization of Systems (LIMOS), a joint research unit between Mines Saint-Étienne/CNRS/University of Clermont-Auvergne.

 

Tiphaine Claveau


Mathematical tools to meet the challenges of 5G

The arrival of 5G marks a turning point in the evolution of mobile telecommunications standards. In order to cope with the constant increase in data traffic and the requirements and constraints of future uses, teams at Télécom SudParis and Davidson Consulting have joined forces in the AIDY-F2N joint laboratory. Their objective is to provide mathematical and algorithmic solutions to optimize the 5G network architecture.

 

Before the arrival of 5G, which is expected to be rolled out in Europe in 2020, many scientific barriers remain to be overcome. “5G will concern business networks and certain industrial sectors that have specific needs and constraints in terms of real time, security and mobility. In order for these extremely diverse uses to coexist, 5G must be capable of adapting,” explains Badii Jouaber, a telecommunications researcher at Télécom SudParis. To meet this challenge, he is leading a new joint laboratory between Télécom SudParis and Davidson Consulting, launched in early 2020. The main objective of this collaboration is to use artificial intelligence and mathematical modeling technologies to meet the requirements of new 5G applications.

Read on I’MTech: What is 5G?

Configuring custom networks

In order to support levels of service adapted to both business and consumer uses, 5G uses the concept of network slicing. The network is thus split into several virtual “slices” operated from a common shared infrastructure. Each of these slices can be configured to deliver an appropriate level of performance in terms of reliability, latency, bandwidth capacity or coverage. 5G networks will thus have to be adaptable, dynamic and programmable from end to end by means of virtual structures.

“Using slicing for 5G means we can meet these needs simultaneously and in parallel. Each slice of the network will thus correspond to a use, without encroaching on the others. However, this coexistence is very difficult to manage. We are therefore seeking to improve the dynamic configuration of these new networks in order to manage resources optimally. To do so, we are developing mathematical and algorithmic analysis tools. Our models, based on machine learning techniques, among other things, will help us to manage and reconfigure these networks on a permanent basis,” says Badii Jouaber. These slices can therefore be set up, removed, expanded or reduced according to demand.
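
To make the underlying allocation problem concrete, here is a toy, static version of it solved as a linear program: share a common capacity between slices with different priorities, minimum guarantees and demand ceilings. The joint laboratory's models are dynamic and machine-learning based, so this is only an illustration, with made-up numbers.

```python
# Toy slice-allocation problem: maximise priority-weighted throughput subject to
# a shared capacity and per-slice min/max bounds (illustrative numbers only).
from scipy.optimize import linprog

capacity = 100.0                      # total bandwidth units of the shared infrastructure
priority = [5.0, 3.0, 1.0]            # critical industrial, enterprise, best-effort slices
min_guarantee = [10.0, 5.0, 0.0]      # guaranteed minimum per slice
max_demand = [40.0, 60.0, 80.0]       # demand ceiling per slice

# linprog minimises, so negate the priorities to maximise weighted throughput.
res = linprog(
    c=[-p for p in priority],
    A_ub=[[1.0, 1.0, 1.0]], b_ub=[capacity],      # shared capacity constraint
    bounds=list(zip(min_guarantee, max_demand)),  # per-slice min/max
    method="highs",
)
print("allocation per slice:", res.x)             # e.g. [40., 60., 0.]
```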

A priority for Davidson Consulting

Anticipating issues with 5G is one of the priorities of Davidson Consulting. The company is present in major cities in France and abroad, with 3,000 employees. It was co-founded in 2005 by Bertrand Bailly, a former Télécom SudParis student, and is a major player in telecoms and information systems. “For 15 years we have been carrying out expert assessment for operators and manufacturers. The arrival of 5G brings up new issues. For us, it is essential to contribute to these issues by putting our expertise to good use. It’s also an opportunity to support our clients and help them overcome these challenges”, says David Olivier, Director of Research and Development at Davidson. For him, it is thus necessary to take certain industrial constraints into account from the very first stages of research, so that their work can be operational quickly.

“Another one of our goals is to achieve energy efficiency. With the increase in the number of connected objects, we believe it is essential to develop these new models of flexible, ultra-dynamic and configurable mobile networks, to reduce their impact by optimizing energy consumption,” David Olivier continues.

Bringing technology out of the labs for the networks of the future

The creation of the AIDY-F2N joint laboratory is the culmination of several years of collaboration between Télécom SudParis and Davidson Consulting, beginning in 2016 with a thesis supervised by Badii Jouaber. “By initiating a new joint research activity, we aim to strengthen our common research interests around the networks of the future, and the synergies between academic research and industry. Our two worlds have much in common!” says David Olivier enthusiastically.

Under this partnership, the teams at Davidson Consulting and Télécom SudParis will coordinate and pool their skills and research efforts. The company has also provided experts in AI and Telecommunications modeling to co-supervise, with Badii Jouaber, the scientific team of the joint laboratory that will be set up in the coming months. This work will contribute to enhancing the functionality of 5G within a few years.