Covid-19 detection in wastewater

Covid-19: what could wastewater detection mean for the pandemic?

The detection of SARS-CoV-2 viral infections responsible for Covid-19 allows us to monitor the evolution of the pandemic. Most methods are based on individual patient screening, with the disadvantages of cost and time. Other approaches based on the detection of SARS-CoV-2 in urban wastewater have been developed to monitor the trends in infections. Miguel Lopez-Ferber, a researcher at IMT Mines Alès, conducted a study to detect the virus in wastewater on the school’s campus. This precise, small-scale approach allows us to collect information on the probable causes of infection.

How do you detect the presence of SARS-CoV-2 in wastewater?

Miguel Lopez-Ferber: We use the technique developed by Medema in 2020. After recovering the liquid part of the wastewater samples, we use a centrifugation technique that allows us to isolate a phase that contains the virus-sized particles. From this phase, we extract the viral genomes present in order to perform PCR tests. PCR (polymerase chain reaction) is a technique used to amplify a genetic signal. If the PCR amplifies viral genome fragments specific to SARS-CoV-2, then the virus is present in the wastewater sample.

Does this technique tell us the concentration of the virus?

MLF: Yes. Thanks to our partnership with the PHYSE team of the HydroSciences Montpellier laboratory and the IAGE startup, we use the digital PCR technique which is a higher-resolution version of quantitative PCR. This allows us to know how many copies of the viral genome are present in the samples. With weekly sampling, we can know the trend in virus concentrations in the wastewater.
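In digital PCR, the sample is split into thousands of micro-partitions, each is amplified separately, and the number of positive partitions is counted; a Poisson correction then converts that count into a concentration, since a partition may contain more than one genome copy. As a purely illustrative sketch of that calculation (the partition counts and volume below are hypothetical, not figures from the study):

```python
import math

def copies_per_microliter(positive_partitions: int,
                          total_partitions: int,
                          partition_volume_ul: float) -> float:
    """Estimate target concentration from a digital PCR run using the standard
    Poisson correction: the fraction of positive partitions underestimates the
    true count because some partitions hold several copies."""
    p = positive_partitions / total_partitions
    if p >= 1.0:
        raise ValueError("All partitions positive: sample too concentrated to quantify")
    mean_copies_per_partition = -math.log(1.0 - p)  # lambda of the Poisson law
    return mean_copies_per_partition / partition_volume_ul

# Hypothetical run: 4,120 positive partitions out of 20,000, each of 0.85 nL.
print(round(copies_per_microliter(4120, 20_000, 0.00085), 1), "copies/µL")
```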

What value is there in quantifying the virus in wastewater?

MLF: This method allows for early detection of viral infections: SARS-CoV-2 is present in feces the day after infection. It is therefore possible to detect infection well before the first potential symptoms appear in individuals. This makes it possible to determine quickly whether the virus is actively circulating or not and whether there is an increase, stagnation or decrease in infections. However, at the scale at which these studies are conducted, it is impossible to know who is infected, or how many people are infected, because the viral load is variable among individuals.

How can your study on the IMT Mines Alès campus contribute to this type of approach?

MLF: To date, studies of this type have been conducted at a city level. We have reduced the cohorts to the scale of the school campus, as well as to different buildings on campus. This has allowed us to trace the sampling information from the entire school to specific points within it. Since mid-August, we have been able to observe the effects of the different events that influence the circulation of the virus, in both directions.

What kind of events are we talking about?

MLF: For example, in October, we quickly saw the effect of a party in a campus building: only 72 hours later, we observed a spike in virus circulation in the wastewater of that building, indicating new infections. Conversely, when restrictive measures were put in place, such as quarantine or a second lockdown, we could see a decrease in virus circulation in the following days. This is faster than waiting to see the impact of a lockdown on infection rates 2 to 3 weeks after its implementation. This not only shows the effectiveness of the measures, but also allows us to know where the infections come from and to link them to probable causes.

What could this type of approach contribute to the management of the crisis?

MLF: This approach is less time-consuming and much less expensive than testing every person to track the epidemic. On the scale of schools or similar organizations, this would allow rapid action to be taken, for example, to quarantine certain areas before infection rates become too great. In general, this would better limit the spread and anticipate future situations, such as peak hospitalizations, up to three weeks before they occur.

By Antonin Counillon

5G antenna

What is beamforming?

Beamforming is a telecommunications technology that delivers stronger, faster signals to targeted areas. The development of 5G relies in particular on beamforming. Florian Kaltenberger, researcher at EURECOM and 5G specialist, explains how this technology works.

What is beamforming?

Florian Kaltenberger: Beamforming consists of transmitting synchronized waves in the form of beams from an antenna. This makes it possible to target a precise area, unlike conventional transmission systems that emit waves in all directions. This is not a new technology: it has been used for a long time in satellite communication and for radar. But it is entering mobile telecommunications for the first time with 5G.

Why is beamforming used in 5G?

FK: The principle of 5G is to direct the wave beams directly at the users. This limits interference between the waves, makes the signal more reliable and saves energy. These three conditions are among the demands that 5G must meet. Because 5G signals use high frequencies, they can carry more information, and do so faster. This system avoids congestion in hotspots, i.e. there will be no throughput problems in places where many connections occur simultaneously. The network can also be more locally diverse: completely different services can be used on the same network at the same time.
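One way to make the "more information, faster" point concrete is the Shannon capacity formula C = B·log2(1 + SNR): the wider channels available at the higher 5G frequencies directly raise the achievable throughput. A rough back-of-the-envelope comparison, in which the bandwidths and the signal-to-noise ratio are illustrative assumptions rather than 5G specifications:

```python
import math

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Upper bound on the throughput of a single link (Shannon capacity)."""
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e9

# Illustrative figures only: a 20 MHz 4G-style channel versus a 400 MHz
# millimeter-wave channel, both at a comfortable 20 dB signal-to-noise ratio.
print(f"20 MHz channel : {shannon_capacity_gbps(20e6, 20):.2f} Gbit/s")
print(f"400 MHz channel: {shannon_capacity_gbps(400e6, 20):.2f} Gbit/s")
```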

How does network coverage work with this system?

FK: Numerous antennas are needed. There are several reasons for this. The size of the antennas is proportional to the length of the waves they generate. As the wavelength of 5G signals is smaller, so is the size of the antennas: they are only a few centimeters long. But the energy that the antennas are able to emit is also proportional to their size: a 5G antenna alone could only generate a signal with a range of about ten meters. In order to increase the range, multiple 5G antennas are assembled on base stations and positioned to target a user whenever they are in range. This allows a range of about 100 meters in all directions. So you still need many base stations to cover the network of a city. With beamforming it is possible to target multiple users in the same area at the same time, as each beam can be directed at a single user.

How are the beams targeted to users and how are they then tracked?

FK: The user’s position signal is received by different parts of the 5G antennas. On each of these parts, there is a shift in the time of arrival of the signal, depending on the angle at which it hits the antenna. With mathematical models that incorporate these different time shifts, it is possible to locate the user and target the beam in their direction.
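To make the geometry concrete, here is a minimal, purely illustrative sketch of that idea: with antenna elements spaced half a wavelength apart, the arrival angle determines the phase shift of the signal from one element to the next, and scanning candidate angles for the best match recovers the user's direction. The carrier frequency, array size and noise level below are assumptions, and real base stations follow standardized procedures rather than this toy estimator.

```python
import numpy as np

c = 3e8                     # speed of light (m/s)
freq = 28e9                 # assumed millimeter-wave carrier (Hz)
lam = c / freq              # wavelength, about 1 cm
d = lam / 2                 # spacing between antenna elements
n_elements = 16

def steering_vector(angle_rad: float) -> np.ndarray:
    """Relative phase of the signal on each element for a given arrival angle."""
    k = np.arange(n_elements)
    return np.exp(-2j * np.pi * d * k * np.sin(angle_rad) / lam)

# Simulated snapshot: a user located at 25 degrees, plus receiver noise.
true_angle = np.deg2rad(25)
rx = steering_vector(true_angle) + 0.05 * (np.random.randn(n_elements)
                                           + 1j * np.random.randn(n_elements))

# Scan candidate angles and keep the one whose steering vector best matches
# the received phases (a basic delay-and-sum estimate).
candidates = np.deg2rad(np.linspace(-90, 90, 721))
powers = [abs(np.vdot(steering_vector(a), rx)) for a in candidates]
estimate = candidates[int(np.argmax(powers))]
print(f"Estimated user direction: {np.degrees(estimate):.1f} degrees")
```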

Then you have to track the users, and that’s more complicated. Base stations use sets of fixed beams that point at preset angles. There is a mechanism that allows the user’s device to measure the power of the received beam relative to adjacent beams. The device sends this information back to the base station, which is then able to choose the best beam.

What are the main difficulties when it comes to implementing beamforming?

FK: Today the 5G network still cannot work without the 4G network because of the short range of the beams, which makes it effective and useful only in urban environments, and especially in hotspots. In more remote areas, 4G takes over. Beamforming cannot be used for a mobile user located several hundred meters from the antenna – let alone a few kilometers away in the countryside. Another difficulty is the movement of users from one base station to another. Algorithms are being developed to anticipate these movements, which is also what we are working on at EURECOM.

Should we expect the next generation of mobile communications, 6G, to go even further than beamforming?

FK: With every generation, there is a breakthrough. For example, 3G was initially designed as a voice communication network, then all the aspects related to internet data were implemented. For 4G it was the other way around: the network was designed to carry internet data, then voice communication was implemented. The operating principle of 6G has not yet been clearly defined. There’s roughly one new generation of cell phones every ten years, so it shouldn’t be long before the foundation for 6G is laid, and we’ll know more about the future of beamforming.

Interview by Antonin Counillon

Data collection protection, GDPR impact

GDPR: Impact on data collection at the international level

The European General Data Protection Regulation (GDPR), which came into effect in 2018, set limits on the use of trackers that collect personal data. This data is used to target advertising to users. Vincent Lefrère, associate professor in digital economy at Institut Mines-Télécom Business School, worked with Alessandro Acquisti from Carnegie Mellon University to study the impact of the GDPR on the tracking of users in Europe and internationally.

What was your strategy for analyzing the impact of GDPR on tracking users in different countries?

Vincent Lefrère: We conducted our research on online media such as Le Monde in France or the New York Times in the United States. We looked at whether the introduction of the GDPR has had an impact on the extent to which users are tracked and the amount of personal data collected.

How were you able to carry out these analyses at the international level?

VL: The work was carried out in partnership with researchers at Carnegie Mellon University in the United States, in particular Alessandro Acquisti, who is one of the world’s specialists in personal digital data. We worked together to devise the experimental design and create a wider partnership with researchers at other American universities, in particular the Minnesota Carlson School of Management and Cornell University in New York.

How does the GDPR limit the collection of personal data?

VL: One of the fundamental principles of the GDPR is consent. This makes it possible to require websites that collect data to obtain users’ consent  before tracking them. In our study, we never gave our consent or explicitly refused the collection of data. That way, we could observe how a website behaves in relation to a neutral user. Moreover, one of the important features of GDPR is that it applies to all parties who wish to process data pertaining to European citizens. As such, the New York Times must comply with the GDPR when a website visitor is European. 

How did you compare the impact of the GDPR on different media?

VL: We logged into different media sites with IP addresses from different countries, in particular with French and American IP addresses.

We observed that American websites limit tracking more than European websites, and therefore better comply with the GDPR, but only when we were using a European IP address.  It would therefore appear that the GDPR has been more dissuasive on American websites for these users. However, the American websites increased the tracking of American users, for whom the GDPR does not apply.  One hypothesis is that this increase is used to offset the loss of data from European users.
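By way of illustration only (this is not the authors' actual measurement pipeline), a crude sketch of this kind of experiment: load a news site in a headless browser and count the resources it pulls from domains other than its own, a rough proxy for third-party tracking. Routing the browser through a French or an American IP address, for instance via a proxy, would then allow the comparison described above.

```python
from urllib.parse import urlparse
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def third_party_host_count(url: str) -> int:
    """Count distinct third-party hosts contacted while loading a page."""
    opts = Options()
    opts.add_argument("--headless=new")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(url)
        site_host = urlparse(url).hostname or ""
        site_root = ".".join(site_host.split(".")[-2:])  # e.g. "lemonde.fr"
        # The browser's Performance API lists every resource the page loaded.
        resources = driver.execute_script(
            "return performance.getEntriesByType('resource').map(r => r.name);"
        )
        hosts = {urlparse(r).hostname or "" for r in resources}
        return sum(1 for h in hosts if h and not h.endswith(site_root))
    finally:
        driver.quit()

print(third_party_host_count("https://www.lemonde.fr"))
```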

How have online media adapted to the GDPR?

VL: We were able to observe a number of effects. First of all, online media websites have not really played along. Since mechanisms of consent are somewhat vague,  the formats developed in recent years have often encouraged users to accept personal data collection rather than reject it. There are reasons for this: data collection has become crucial to the business model of these websites, but little has been done to offset the loss of data resulting from the introduction of the GDPR, so it is understandable that they have stretched the limits of the law in order to continue offering high quality content for free. With the recent update by the French National Commission on Information Technology and Liberties (CNIL) to fight against this, consent mechanisms will become clearer and more standardized.  

In addition, the GDPR has limited tracking of users by third parties, and replaced it with tracking by first parties. Before, when a user logged into a news site, other companies such as Google, Amazon or Facebook could collect their data directly on the website. Now, the website itself tracks data, which may then be shared with third parties.

Following the introduction of the GDPR, the market share of Google’s online advertising service increased in Europe, since Google is one of the few companies able to bear the cost of ensuring compliance with the regulation. This is an unintended, perverse consequence: smaller competitors have disappeared and there has been a concentration of ownership of data by Google.

Has the GDPR had an effect on the content produced by the media?

VL: We measured the quantity and quality of content produced by the media. Quantity simply reflects the number of posts. The quality is assessed by the user engagement rate, meaning the number of comments or likes, as well as the number of pages viewed each time a user visits the website.

In the theoretical framework for our research, online media websites use targeted advertising to generate revenue. Since the GDPR makes access to data more difficult, it could decrease websites’ financing capacity and therefore lead to a reduction in content quality or quantity. By verifying these aspects, we can gain insights into the role of personal data and targeted advertising in the business model for this system.   

Our preliminary results show that after the introduction of the GDPR, the quantity of content produced by European websites was not affected, and the amount of engagement remained stable. However, European users reduced the amount of time they spent on European websites in comparison to American websites. This could be due to the fact that certain American websites may have prohibited access to European users, or that American websites covered European topics less since attracting European users had become less profitable. These are hypotheses that we are currently discussing.

We are assessing these possible explanations by analyzing data about the newspapers’ business models, in order to estimate how important personal data and targeted advertising are to these business models.  

By Antonin Counillon

Artificial intelligence

Is there intelligence in artificial intelligence?

Jean-Louis Dessalles, Télécom Paris – Institut Mines-Télécom (IMT)

Nearly a decade ago, in 2012, the scientific world was enthralled by the achievements of deep learning. Three years later, this technique enabled the AlphaGo program to beat Go champions. And this frightened some people. Elon Musk, Stephen Hawking and Bill Gates were worried about an imminent end to the human race, replaced by out-of-control artificial intelligence.

Wasn’t this a bit of an exaggeration? AI thinks so. In an article it wrote in The Guardian in 2020, GPT-3, a gigantic neural network with 175 billion parameters, explains:

“I’m here to convince you not to worry. Artificial intelligence will not destroy humans. Believe me.”

At the same time, we know that the power of computers continues to increase. Training a network like GPT-3 was literally inconceivable just five years ago. It is impossible to know what its successors may be able to do five, ten or twenty years from now. If current neural networks can replace dermatologists, why would they not eventually replace all of us? Let’s turn the question around.

Are there any human mental abilities that remain strictly out of reach for artificial intelligence?

The first things that come to mind are skills involving our “intuition” or “creativity.” No such luck – AI is coming for us in these areas too. This is evidenced by the fact that works created by programs are sold at high prices, reaching nearly half a million dollars at times. When it comes to music, everyone will obviously form their own opinion, but we can already recognize acceptable bluegrass or works that approach Rachmaninoff in imitations by the MuseNet program created, like GPT-3, by OpenAI.

Should we soon submit with resignation to the inevitable supremacy of artificial intelligence? Before calling for a revolt, let’s take a look at what we’re up against. Artificial intelligence relies on many techniques,  but its recent success is due to one in particular: neural networks, especially deep learning ones. Yet a neural network is nothing more than a matching machine. The deep neural network that was much discussed in 2012 matched images –  a horse, a boat, mushrooms – with corresponding words. Hardly a reason to hail it as a genius.

Except that this matching mechanism has the rather miraculous property  of being “continuous.” If you present the network with a horse it has never seen, it recognizes it as a horse. If you add noise to an image, it does not disturb it. Why? Because the continuity of the process ensures that if the input to the network changes slightly, its output will change slightly as well. If you force the network, which always hesitates, to opt for its best response, it will probably not vary: a horse remains a horse, even if it is different from the examples learned, even if the image is noisy.

Matching is not enough

But why is such matching behavior referred to as “intelligent?” The answer seems clear: it makes it possible to diagnose melanoma, grant bank loans, keep a vehicle on the road, detect disorders in physiological signals and so forth. Through their matching ability, these networks acquire forms of expertise that require years of study for humans. And when one of these skills, for example, writing a press article, seems to resist for a while, the machine must simply be fed more examples, as was the case with GPT-3, so that it can start to produce convincing results.

Is this really what it means to be intelligent? No, this type of performance represents only a small aspect of intelligence, at best. What the neural networks do resembles learning by heart. It isn’t, of course, since networks continuously fill in the gaps between the examples with which they have been presented. Let’s call it almost-by-heart. Human experts, whether doctors, pilots or Go players, often act the same way when they decide instinctively, based on the large number of examples learned during their training. But humans have many other powers too.

Learning to calculate or reason over time  

Neural networks cannot learn to calculate. Matching operations like 32+73 with their results only goes so far. They can only reproduce the strategy of the struggling student who tries to guess the result and sometimes happens upon the right answer. If calculating is too difficult, what about a basic IQ test like: continue the sequence 1223334444. Matching based on continuity is of no help in seeing that the structure, n repeated n times, continues with five 5s. Still too difficult? Matching programs cannot even guess that an animal that is dead on Tuesday will not be alive on Wednesday. Why? What do they lack?

Modeling in cognitive science has shown the existence of several mechanisms, other than matching based on continuity, which are all components of human intelligence. Since their expertise is entirely precalculated, neural networks cannot reason over time to determine that a dead animal remains dead or to understand the meaning of the sentence “he still isn’t dead” and the oddity of this other sentence: “he is not still dead.” And digesting large amounts of data in advance is not enough to allow them to recognize new structures that are very simple for us, such as groups of identical numbers in the sequence 1223334444. Their almost-by-heart strategy is also blind to unprecedented anomalies.
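Incidentally, the rule behind the 1223334444 sequence, each integer n written n times, is a one-line symbolic program; it is precisely this kind of explicit structure that matching based on continuity has no way of extracting.

```python
# Each integer n written n times -- trivial to state as a rule, invisible to
# a matcher that only interpolates between memorized examples.
def n_repeated_n_times(up_to: int) -> str:
    return "".join(str(n) * n for n in range(1, up_to + 1))

print(n_repeated_n_times(4))   # 1223334444
print(n_repeated_n_times(5))   # 122333444455555 -- the sequence continues with five 5s
```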

Detecting anomalies is an interesting example, since we often judge others’ intelligence based precisely on this. A neural network will not “see” that a face is missing a nose. Based on continuity, it will continue to recognize the person, or may confuse him or her with someone else. But it has no way of realizing that the absence of a nose in the middle of a face represents an anomaly.

There are many other cognitive mechanisms that are inaccessible to neural networks. Research is being conducted on the automation of these mechanisms. It implements operations carried out at the time of processing,  while neural networks simply make associations learned in advance.

With a decade of perspective on deep learning, the informed public is starting to see neural networks  more as “super-automation” and less as intelligent. For example, the media recently reported on the astonishing performances of the DALL-E program, which produces creative images based on a verbal description – for example, images that DALL-E imagined based on the terms “avocado-shaped chair” on the OpenAI site. We now hear much more tempered assessments than the alarmist reactions following the release of AlphaGo: “It is quite impressive, but we must not forget that it is an artificial neural network, trained to perform a task; there is no creativity or form of intelligence.” (Fabienne Chauvière, France Inter, 31 January 2021)

No form of intelligence? Let’s not be too demanding, but at the same time, let’s remain clear-sighted about the huge gap that separates neural networks from what would be a true artificial intelligence.

Jean‑Louis Dessalles wrote “Des intelligences très artificielles” (Very Artificial Intelligence)  published by Odile Jacob (2019).

Jean-Louis Dessalles, Associate professor at Télécom Paris – Institut Mines-Télécom (IMT)

This article has been republished from The Conversation under a Creative Commons license. Read the original article in French.

IMT Digital Fund

AlertSmartCity, Cook-e, Dastra, DMS, GoodFloow, JobRepublik, PlaceMeet and Spectronite supported by the “honor loan” scheme

The members of the IMT Digital Fund-IGEU, IMT and Fondation Mines-Télécom held a meeting on 23 February. On this occasion, 8 start-ups from the incubators of IMT Mines Albi, IMT Atlantique, IMT Lille Douai, Télécom Paris, Mines Saint-Étienne, Télécom SudParis and Institut Mines-Télécom Business School were awarded 18 honor loans (interest-free) for a total of €340,000.


AlertSmartCity (the incubator at IMT Mines Albi) wishes to create an interoperable alert management platform, to be used in the event of a major risk (natural, industrial, health or terrorist disaster). This platform will allow municipalities to send qualified and geolocalized alerts to their public institutions (schools, cultural and sports facilities, hospitals, administrations and other places receiving the public) using dedicated communication terminals that are resilient to network outages and are interactive (bi-directional communication). These reception terminals will allow disaster victims to report back to the crisis unit.
Two honor loans of €20,000 each.


Cook-e (Télécom Paris Novation Center) proposes a multi-function connected robot for restaurant kitchens. The restaurant owner enters a recipe into the robot software and then loads the ingredient tanks. These tanks can be stored cool, dry or warm. The robot then prepares the recipe: it measures out, cuts, cooks, mixes and cleans itself automatically. It can prepare all dishes with mixed ingredients in small pieces: pasta with sauce, salads, bowls, rice, meat and fish in small pieces, vegetable side dishes, etc.
One honor loan of €20,000 and two honor loans of €10,000. Find out more


Dastra (IMT Starter) is the simple, guided data governance solution that enables data protection professionals to meet the requirements of the GDPR, save time, and develop a company data culture. One small step for DPOs, one giant leap for data protection!
Two honor loans of €8,000 and two honor loans of €12,000. Find out more


DMS (the incubator at Mines Saint-Etienne) is an AI platform for managing and anticipating container flows, allowing for the fluidity of port and land container traffic. It connects all the players in the container port logistics chain (shipowners/terminals) with those located inland (carriers/depots).
Three honor loans of €20,000 each. Find out more


GoodFloow (the IMT Lille Douai incubator) automates the tracking and management of reusable packaging. Their service consists of using IoT in individual packaging along with a web/mobile app. This solution eliminates asset management and change management issues related to packaging, makes flows more reliable, and enables a sustainable transition in logistics.
One honor loan of €40,000. Find out more


JobRepublik (IMT Starter) is the meeting point between companies in need of temporary workers and anyone looking for additional income. The start-up offers the first open marketplace dedicated to “blue collar” freelancers that allows a direct relationship between 700,000 small businesses in the logistics, retail and restaurant sectors and 3 million independent workers.
Two honor loans of €20,000 each. Find out more


Placemeet (incubator at IMT Atlantique) is a simple and intuitive platform optimized for engagement and interaction. Attendees can move between rooms as if it were a physical event and enjoy an exceptional experience from anywhere in the world.
Two honor loans of €20,000 each. Find out more


Spectronite (Télécom Paris Novation Center) has developed a breakthrough technology, with the implementation of an architecture based on Software Defined Radio, which can offer speeds up to 10 Gbps over very long distances, i.e. up to 20x the speed offered by traditional products. Spectronite offers a disruptive innovation for mobile operators, enabling them to deploy 4G and soon 5G, even in territories where fiber is not available.
One honor loan of €10,000 and one honor loan of €30,000. Find out more

The honor loan program

Created in late 2011 under the aegis of the Grandes Écoles and Universities Initiative (IGEU) association, the IMT Digital Fund for honor loans is co-financed by the Fondation Mines-Télécom, BPI France and Revital’Emploi.

What do we know about the environmental impact of biocomposites?

Bio-based materials are an alternative to those derived from fossil resources. They have been increasingly promoted in recent years. However, given the recent development of this sector, their real environmental impact is still relatively unknown. Joana Beigbeder, researcher in environmental analysis at IMT Mines Alès, provides an update on the life cycle analysis (LCA) of these emerging materials.

Although bio-based materials are presented as an effective alternative to their petroleum-based predecessors, the details of their environmental impact are still not well known. Plant-based materials are certainly an improvement for some impacts, such as carbon footprint, but it can be trickier when it comes to soil use or soil pollution, for example. To get to the bottom of this, life cycle analysis is an indispensable tool.

Joana Beigbeder, researcher in environmental impacts at IMT Mines Alès, is working on the life cycle analysis of these new bio-based materials, particularly biocomposites. The objective is to compare the environmental impact of different scenarios, either between bio-based and petroleum-based materials, or according to the different stages of the life cycle of the same material. The researcher says, “the focus of life cycle analysis is to study several environmental impacts in parallel and to identify possible transfers of pollution throughout the life cycle of a product”.

Different sources of impact

With regard to environmental impact, some points seem obvious. If we only look at carbon footprint, it seems obvious that bio-based materials are more beneficial than their petroleum-based cousins. But the reality is more complex because there are many variables to consider. “Carbon impact is a well-known indicator, but we must also take into account the different impacts of pollution, human toxicity, soil or water pollution or global warming”, says Joana Beigbeder. Take the automotive sector, for example. If for the same function, a part made of bio-based material has a higher weight, it will require more energy to use and will not necessarily be beneficial. “It may be a better solution from an environmental point of view, but that’s precisely what the LCA will tell us,” she says.

Some points seem more obvious, such as the consumption of fossil resources and the impact on global warming in general. Plant materials absorb CO2 during their growth, which gives them an undeniable advantage. That said, their growth implies an agricultural impact, especially on land use. “Depending on the type of crop, this can lead to soil pollution. Generally, the more fertilizers and additives that are required for growth, the less beneficial some of the impacts will be,” says Joana Beigbeder. This brings up a new issue that does not exist with petroleum-based materials.

“It’s always a case of compromising, and we look for ways to improve the process or the product,” says the researcher. “Plants, such as flax or hemp, require little or no input and are therefore a preferred option. But the material is only one area for improvement when it comes to eco-design”, she notes. To reduce the environmental impact of a product, lightening its weight and extending its lifespan are key points. This can include the reparability of the product, or even new concepts such as prioritizing the use of the product over the purchase of the product.

“There is also a question of ecological common sense, if we study a disposable product, reusable or compostable materials will be favored,” says Joana Beigbeder. One research topic includes the development of compostable diapers and band-aids, a promising step in reducing plastic pollution.

Predicting potential impacts

“Life cycle analysis study is really done on a case-by-case basis depending on the service provided by a product,” says Joana Beigbeder. To estimate the consumption of solvents or the chemical synthesis of materials, the researcher uses previous scientific publications and mathematical models. But in some areas, data is missing. “We sometimes have to use fairly strong approximations on certain impacts, but this still allows us to map trends and areas for improvement,” she maintains. This can be the case for the production of plant fibers, as well as for their disposal at the end of their use.

As these materials do not yet have a major presence on the market, there are currently no channels for processing the resulting waste. In fact, there is no data on the environmental impact of the incineration, landfill, or pollution from these newly emerging materials. “To estimate these impacts, we liken them to other, better-known, closely-related materials on which we have data, in order to develop our hypotheses,” she explains. This work also generates new data for LCA, such as recycling simulations for end-of-life treatment.

“Some bio-based materials are at the laboratory stage today, and it’s still hard to imagine what will happen at the industrial stage,” says Joana Beigbeder. The researcher then works on predictions: she imagines scenarios for materials that are not yet on the market and analyzes whether this has any impact on an environmental level. Also, given the recent nature of the bio-materials sector, their production will be less optimized than that of traditional petroleum-based materials which have evolved and improved. “This works against the newer materials, as they will need to go through several developmental stages,” she points out.

This research is essential for the development of sustainable alternatives to conventional materials and is part of an overall vision to meet the challenges of plastic pollution. “A lot of research is focused on the development of new materials with a smaller environmental impact. The VOICE project, for example, launched in 2018 in partnership with Innovation Plasturgie Composites, Silvadec, IMT Mines Alès and funded by ADEME, focuses on the recycling of biocomposite decking. Or the Agrobranche project, which brings together 8 French partners[1], and focuses on, among other things, the study of bio-based reinforcements from agroforestry,” she concludes. These two projects reflect the blossoming of new emerging materials and the current development of research to find sustainable alternatives to the plastic problem.

[1] Scop Agroof (coordinator), IMT Mines Alès, CIRAD BiowooEB, LERMAB, FRD, CRA Bretagne, CIRAD AMAP, INRA SPO

Tiphaine Claveau

El Niño

El Niño: communities in the face of weather’s bad boy

South America must regularly face a climate event with far-ranging effects:  El Niño, which leads to localized flooding. This type of catastrophe also results in changes in the behavior of local communities – a topic which has been little studied. Yet these changes provide a good example of individuals’ resilience to crises. By studying consumption habits in the regions affected by El Niño, Vincent Gauthier, a researcher at Télécom SudParis, seeks to understand how communities react to this event.

El Niño is a recurring climate event, which takes place every two to seven years on the equatorial Pacific coast of South America. It leads to localized heavy precipitation with serious consequences. “The  2017 El Niño phenomenon was especially violent and was characterized by two periods of heavy rainfall, resulting in human casualties and extensive destruction of physical structures,” says Vincent Gauthier, a researcher at Télécom SudParis who studies complex networks and is analyzing the impact of the 2017 episode on the behavior of the local community.  

Peru was strongly impacted by the most recent El Niño phenomenon, especially in the north of the country and on its Pacific coast, which includes the Lima region. The episodes of rainfall gave rise to two periods of flooding: the first in February and the second in early April. Vincent Gauthier’s research seeks to understand how economic behavior changes before, during and after these periods.

To study these changes, the researcher uses data about consumption in the region. “Our approach is to analyze banking transaction data, with different granularity levels,” he explains. Studies were carried out in partnership with the Pacific University in Lima and led to the publication of a research article in the journal Plos One.

At the countrywide level, the results are conclusive: during each period of heavy rainfall there is a significant drop in the number and volume of transactions overall, therefore indicating that individuals consume less during the weather event. Transactions return to normal in the days following the rainfall, indicating that the overall impact is fairly limited in duration.   

Resilience to El Niño

The study was then carried out in a specific way in the region of Lima, which includes the capital and surrounding rural areas. This made it possible to categorize areas according to dynamic changes in consumption.  Unsurprisingly, the areas recording the most significant drops in transactions were the most affected by the rainfall. However, certain areas recorded rises in consumption before and during the episode, a behavior which may reflect a trend of purchasing as a precautionary measure.

To better understand such variations, Vincent Gauthier established a retail network model. This representation indicates not only consumers’ purchases, but also consumption paths. Such a model shows the various connections between stores, based on how busy they are, their ranking and the type of products sold. For example, a consumer who carries out a transaction at a pharmacy and then another at a supermarket strengthens the link between these two types of stores within the network. This makes it possible to study which links are the strongest in the event of a disturbance.  
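As a rough illustration of this construction (the study’s exact definition of the network is not detailed here), the sketch below builds such a graph from a handful of hypothetical, time-ordered transactions: each consecutive pair of purchases by the same anonymized consumer adds weight to the link between the two store categories involved.

```python
import networkx as nx

# Hypothetical, time-ordered transactions: (consumer_id, store_category).
transactions = [
    ("u1", "pharmacy"), ("u1", "supermarket"),
    ("u2", "fuel"), ("u2", "supermarket"),
    ("u1", "supermarket"), ("u1", "restaurant"),
]

G = nx.Graph()
last_stop = {}  # last category visited by each consumer
for user, category in transactions:
    if user in last_stop and last_stop[user] != category:
        a, b = last_stop[user], category
        weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=weight + 1)  # strengthen the link between the two store types
    last_stop[user] = category

for a, b, data in G.edges(data=True):
    print(f"{a} -- {b}: weight {data['weight']}")
```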

“During periods of heavy rainfall, the size of the network was strongly impacted,” says the researcher. “The connections were therefore reduced to stores that sell food, medical supplies and fuel. These connections represent the core of the network and if this core collapses, so does the whole system,” explains Vincent Gauthier. Modeling and studying resilience therefore allow us to understand the vulnerability and risks to this core network.

Using this approach, it can be seen that the first episode of rainfall had a stronger impact than the second one on the size of the core network, as well as on the time it took to rebuild a wider network. Yet the second period of rainfall was more violent from a weather perspective. This counterintuitive observation may be explained by better community preparedness for the second period of heavy rainfall and flooding. This difference in behavior, highlighted by modeling, is a marker of the resilience of the Peruvian people.

Understanding people through their purchases

To put these models in place, researchers used all the metadata associated with banking transactions. “Each transaction produces data accompanied by nomenclatures, which contain information about the type of store in which it was carried out, for example supermarkets, restaurants, pharmacies or service stations,” says Vincent Gauthier. “This nomenclature also contains the date of purchase and the anonymized identity of the person who made the purchase,” he continues.

This means that each individual’s purchasing path can be traced over time to provide an overview of his or her specific economic behavior during various periods. This analysis makes it possible to determine which stores are most often visited after one another by consumers, which is influenced both by the geographical proximity of the businesses to one another and similar interests among consumers.

“By analyzing this data, stores can be ranked according to the number and volume of transactions carried out there, then divergence measurements can be taken to identify changes in these rankings,” explains the researcher. The divergence measurements focus on differences in stores’ rankings at the time of the El Niño phenomenon compared to the original distribution. Such differences can also be seen during festive events, when there is a greater number of transactions in certain kinds of stores. “We therefore categorized stores based on the variation in their ranking during the El Niño phenomenon,” says Vincent Gauthier.
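By way of illustration (the article does not specify which divergence measure is used; the Jensen–Shannon distance is shown here as one common choice), a minimal sketch comparing hypothetical transaction shares across store categories in a normal week and a heavy-rain week, together with the corresponding rank changes:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

categories = ["supermarket", "pharmacy", "restaurant", "fuel"]
baseline_counts = np.array([5200, 900, 2100, 800], dtype=float)   # hypothetical normal week
event_counts    = np.array([4800, 1500, 600, 1100], dtype=float)  # hypothetical heavy-rain week

baseline = baseline_counts / baseline_counts.sum()
event = event_counts / event_counts.sum()

print("Jensen-Shannon distance between the two periods:",
      round(float(jensenshannon(baseline, event)), 3))

# Rank 1 = most-visited category; argsort of the negated counts gives the ranks.
for cat, b, e in zip(categories,
                     (-baseline_counts).argsort().argsort(),
                     (-event_counts).argsort().argsort()):
    print(f"{cat}: rank {b + 1} -> rank {e + 1}")
```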

This approach allows researchers to create a profile of various stores over time so that they can see how their rankings vary at the time of events. For example, the ranking of restaurants fell sharply during the short periods corresponding to times of heavy rainfall, while the ranking of stores selling medical supplies increased for a relatively long period of time. Supermarkets were the type of store whose rankings were generally the most stable.

Better preparing for crises

“Future climate change will lead to an increase in extreme phenomena. Community resilience to these events will become an important issue to understand,” says Vincent Gauthier. The research carried out in relation to El Niño offers insights into community preparedness. It provides valuable knowledge for regions that are not used to dealing with extreme climate events, but that may have to face them in the years to come.

“That would make it possible to identify what services to develop and logistics to put in place in order to mitigate the effects of future crises, by organizing supply and inventory as well as keeping essential services open during crises. For example, we observed serious gasoline supply problems, although the demand for this product was high during the crisis and in its aftermath, and significant delays in consumption in geographic areas that were less exposed to the crisis,” says the researcher.

Beyond the climate issue, the wider issue of preparedness and resilience to crises was studied. Understanding how the consumption network varies, which parts must be strengthened, and which, on the other hand, are secondary, makes it possible to better focus efforts in an exceptional situation. The study of the current health crisis is part of this work. “We’re studying the effects of the Covid-19 pandemic on the behavior of the Peruvian people, by analyzing consumption data as well as mobility data.” The analysis of mobility patterns could have a major impact on decisions to make in the event of a lockdown. “The methodology for the Covid-19 health crisis will be a bit different since the impact will be measured over a longer term, unlike the crisis caused by El Niño, where the underlying processes were essentially transitional,” concludes Vincent Gauthier.

Antonin Counillon