Bitcoin crash: cybercrime and over-consumption of electricity, the hidden face of cryptocurrency

Donia Trabelsi, Institut Mines-Télécom Business School ; Michel Berne, Institut Mines-Télécom Business School et Sondes Mbarek, Institut Mines-Télécom Business School

Wednesday 19 May will be remembered as the day of a major cryptocurrency crash: -20% for dogecoin, -19% for ethereum, -22% for dfinity, the supposedly-infinite blockchain that was recently launched with a bang. The best-known of these currencies, bitcoin, limited the damage to 8.5% (US $39,587) after being down by as much as 30% over the course of the day. It is already down 39% from its record value reached in April.

Elon Musk has gone from being seen as an idol to a traitor in the cryptocurrency market. Commons.wikimedia.org

Very few of the 5,000 cryptocurrencies recorded today have experienced growth. The latest ones to be launched, “FuckElon” and “StopElon”, say a lot about the identity of the individual considered to be responsible for this drop in prices set off over a week ago.

The former idol of the cryptocurrency world and iconic leader of Tesla Motors, Elon Musk, now seems to be seen as a new Judas by these markets. The founders of “StopElon” have even stated that their aim is to drive up the price of their new cryptocurrency in order to buy shares in Tesla and oust its top executive. Bitcoin’s comparatively limited drop, however, seems to be credited to more reassuring signals.

Elon Musk sent shockwaves rippling through the crypto world last week when he announced that it would no longer be possible to pay for his cars in bitcoin, reversing the stance he had taken in March. He even hinted that Tesla may sell all of its bitcoins. As the guest host of the Saturday Night Live comedy show in early May, he had already caused dogecoin to plummet by referring to it as a “hustle” during a sketch, even though he had appeared on the show to support it.

 

The reason for his change of heart? The fact that it is harmful to the planet, as transactions using this currency require high electricity consumption. “Cryptocurrency is a good idea on many levels and we believe it has a promising future, but this cannot come at great cost to the environment,” stated Musk, who is also the head of the SpaceX space projects.

China also appears to have played a role in Wednesday’s events. As the country is getting ready to launch a digital yuan, its leaders announced that financial institutions would be prohibited from using cryptocurrency. “After Tesla’s about-face, China twisted the knife by declaring that virtual currencies should not and cannot be used in the market because they are not real currencies,” Fawad Razaqzada, analyst at Thinkmarkets, told AFP yesterday.

While a single man’s impact on the price of these assets – which have seen a dramatic rise over the course of a year – may be questioned, his recent moves and about-face urge us to at least examine the ethical issues they raise. Our research has shown that there are at least two categories of issues.

The darknet and ransomware

The ethical issues surrounding cryptocurrencies remain closely related to the nature and very functioning of these assets. Virtual currencies are not associated with any governmental authority or institution. The bitcoin system was even specifically designed to avoid relying on conventional trusted intermediaries, such as banks, and escape the supervision of central banks. The value of a virtual currency therefore relies entirely, in theory, on the trust and honesty of its users, and on the security of an algorithm that can track all of the transactions.

Yet, due to their anonymity, lack of strict regulation and gaps in infrastructure, cryptocurrencies also appear to be likely to attract groups of individuals who seek to use them in a fraudulent way. Regulatory concerns focus on their use in illegal trade (drugs, hacking and theft, illegal pornography), cyberattacks and their potential for funding terrorism, laundering money and evading taxes.

Illegal activities accounted for no less than 46% of bitcoin transactions from 2009 to 2017, amounting to US $76 billion per year over this period, which is equivalent to the scale of US and European markets for illegal drugs. In April 2017, approximately 27 million bitcoin market participants were using bitcoin primarily for illegal purposes.

One of the best-known examples of cybercrime involving cryptocurrency is still the “Silk Road.” In this online black marketplace dedicated to selling drugs on the darknet – the part of the internet that can only be accessed with specific protocols – payments were made exclusively in cryptocurrencies.

In 2014, at a time when the price of bitcoin was around US $150, the FBI’s seizure of over US $4 million in bitcoins on the Silk Road gave an idea of the magnitude of the problem facing regulators. At the time, the FBI estimated that this sum accounted for nearly 5% of the total bitcoin economy.

Cryptocurrencies have also facilitated the spread of attacks using ransomware, malware that blocks companies’ access to their own data, and will only unblock it in exchange for a cryptocurrency ransom payment. A study carried out by researchers at Google revealed that victims paid over US $25 million in ransom between 2015 and 2016. In France, according to a Senate report submitted in July 2020, such ransomware attacks represent 8% of requests for assistance from professionals on the cybermalveillance.gouv.fr website and 3% of requests from private individuals.

Energy-intensive assets

The main cryptocurrencies use a large quantity of electricity for mining – that is, the computing operations required to create new units and verify transactions. The two main virtual currencies, bitcoin and ethereum, rely on complicated calculations that are extremely energy-intensive.
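To make these “complicated calculations” concrete, here is a minimal, purely illustrative Python sketch of the proof-of-work principle behind bitcoin mining: candidate block data is hashed over and over with a changing nonce until the result falls below a difficulty target. The block data, the difficulty level and the hex-prefix target used below are simplified assumptions for illustration; the real protocol hashes binary block headers against a 256-bit target.

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int):
    """Find a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` zero hex digits. Illustrative only: real bitcoin mining
    uses double SHA-256 over an 80-byte block header and a 256-bit target."""
    target_prefix = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest

# Even this toy difficulty requires on the order of a million hash attempts;
# the real network performs on the order of 10^20 hashes per second,
# which is what drives the electricity consumption discussed below.
nonce, digest = mine("block 683000 | tx_root=abc123 | prev=00ff...", difficulty=5)
print(nonce, digest)
```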

According to Digiconomist, bitcoin’s energy consumption peaked at between 60 and 73 TWh in October 2018. On an annualized basis, in mid-April 2021, the figure was somewhere between 50 and 120 TWh, which is higher than the electricity consumption of a country such as Kazakhstan. These figures are even more staggering when given per transaction: 432 kWh per transaction on 6 May 2019 and over 1,000 kWh in mid-April 2021, equivalent to the annual electricity consumption of a 30 m² studio apartment in France.

A comparison is often made with the Visa electronic payment system, which requires roughly 300,000 times less energy per transaction than bitcoin. The figures cannot be strictly compared, but they clearly show that bitcoin transactions are extremely energy-intensive compared with routine electronic payments.
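As a rough, order-of-magnitude check of the per-transaction figures quoted above, the short sketch below redoes the arithmetic. All values are rounded assumptions taken from, or chosen to be consistent with, the estimates cited (the Visa and studio-apartment figures in particular are assumptions), not authoritative measurements.

```python
# Order-of-magnitude check of the per-transaction figures quoted above.
# All numbers are rounded assumptions for illustration, not measurements.
btc_kwh_per_tx = 1_000       # bitcoin, mid-April 2021 estimate cited above (kWh)
visa_wh_per_tx = 3           # assumed Visa figure (Wh), consistent with the cited ratio
studio_kwh_per_year = 1_000  # assumed annual electricity use of a small studio (kWh)

ratio = (btc_kwh_per_tx * 1_000) / visa_wh_per_tx
print(f"One bitcoin transaction ≈ {ratio:,.0f} Visa transactions")             # ≈ 333,333
print(f"≈ {btc_kwh_per_tx / studio_kwh_per_year:.1f} year(s) of studio electricity use")
```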

How can we find a balance?

There are solutions to reduce the cost and energy impact of bitcoin, such as using green energy or increasing the energy efficiency of mining computers.

However, computer technology must still be improved to make this possible. Most importantly, the miners’ reward for mining new bitcoins and verifying transactions is expected to decrease in the future, forcing them to consume more energy to ensure the same level of income.

The initiators of this technology consider that the innovation offered by bitcoin promotes a free world market and connects the world financially. However, it remains a challenge to find the right balance between promoting an innovative technology and deterring the crime and reducing the ecological impact associated with it.

Donia Trabelsi, associate professor of finance, Institut Mines-Télécom Business School; Michel Berne, economist, director of training (retired), Institut Mines-Télécom Business School; and Sondes Mbarek, associate professor of finance, Institut Mines-Télécom Business School

This article was republished from The Conversation under a Creative Commons license. Read the original article (in French).


Digital Service Act: Regulating the content of digital platforms, Act 1

The Digital Service Act, proposed by the European Commission in early 2020, seeks to implement a new regulatory framework for digital platforms. Grazia Cecere, an economics researcher at Institut Mines-Télécom Business School, explains various aspects of these regulations.

Why has it become necessary to regulate the content of platforms?

Grazia Cecere: Technological developments have changed the role of the internet and platforms. Previous regulations specified that publishers were responsible for the totality of their content, but that web hosts were only responsible if flagged content was not handled adequately. With the emergence of super platforms and social media, the role of web hosts has changed. Their algorithms lead to more specific distribution of content, through rankings, search engine optimization and highlighting content, which may have significant impacts and contain dangerous biases.

What kind of content must be better regulated by digital platforms?

GC: There are many issues addressed, in particular combating cyber-bullying, disinformation and fake news, as well as different types of discrimination. Today the platforms’ algorithms self-regulate based on the available data and may reproduce and amplify discrimination that exists in society. For example, if the data analyzed by an algorithm shows wage gaps between men and women, it is likely to build models based on this information. So it’s important to identify these kinds of biases and correct them. Discrimination not only poses ethical problems: it also has economic implications. For example, if an algorithm designed to propose a job profile is biased based on an individual’s gender or skin color, the only important criterion – professional ability – will be less clear.

Read more on l’IMTech: Social media: The everyday sexism of advertising algorithms

What does the Digital Service Act propose so that platforms regulate their content?

GC: The Digital Service Act seeks to set clear rules for the responsibilities that come with running digital platforms. They must monitor the information distributed on their platforms, especially fake news and potentially harmful content. The goal is also to inform users better about the content and ensure their fundamental rights online. Platforms must also increase their transparency and make data about their activity available. This data would then be available to researchers who could test whether it contains biases. The purpose of the Digital Service Act is to provide a harmonized legislative and regulatory system across all EU member states.

How can platforms regulate their own content?

GC: Another aspect of the Digital Service Act is providing the member states with regulatory instruments for their platforms. Different kinds of tools can be implemented. For example, a tool called “Fast Tracking” is being developed for Google to detect false information about Covid-19 automatically. This kind of tool, which determines whether information is false based on written content, can be complicated since it requires sophisticated natural language processing tools. Some issues are more complicated to regulate than others.

Are digital platforms starting to take into account the Digital Service Act?

GC: It depends on the platform. AirBnb and Uber, for example, have made a lot of data available to researchers so that they can determine what kinds of discriminatory biases it contains. And Google and Facebook are also providing access to an increasing amount of data. But Snapchat and TikTok are a whole other story!

Will the Digital Service Act also help regulate the internet market?

GC: The previous regulation, the E-Commerce Directive, dates from 2000. Over time, it has become obsolete. Internet players today are different from what they were 20 years ago, and some have a lot more power. One of the challenges is to keep the internet market open to everyone and to allow new companies to be founded independently of the super platforms in order to boost competition – since today, any company that is founded depends on the monopoly of big tech companies.

By Antonin Counillon


Three Mile Island, Chernobyl, Fukushima: the role of accidents in nuclear governance

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom, and Olivier Borraz, Sciences Po

Until the 1970s, nuclear power plants were considered to be inherently safe, by design. Accidents were perceived as being highly unlikely, if not impossible, by designers and operators, in spite of recurring incidents that were not publicized.

This changed abruptly in 1979 with the Three Mile Island (TMI) accident in the United States. It was given wide media coverage, despite the fact that there were no casualties, and demonstrated that what were referred to as “major” accidents were possible, with a meltdown in this case.

The decades that followed have been marked by the occurrence of two other major accidents rated as level 7 on the INES (International Nuclear Event) scale: Chernobyl in 1986 and Fukushima in 2011.

Turning point in the 1980s

This article will not address this organization or the invention, in the wake of the Chernobyl accident, of the  INES scale used to rank events that jeopardize safety on a graduated scale, ranging from a deviation from a standard to a major accident.

Our starting point will be the shift that occurred in 1979, when accidents went from being seen as inconceivable to being considered a possible event, described by nuclear experts as an opportunity for learning and improvement.

Accidents therefore provide an opportunity to “learn lessons” in order to enhance nuclear safety and strive for continuous improvement.

But what lessons precisely? Has the most recent accident, Fukushima, led to profound changes in nuclear risk governance, as Chernobyl did?

The end of the human error rationale

Three Mile Island is often cited as the first nuclear accident: despite the technical and procedural barriers in place at the time, the accident occurred – such an accident was therefore possible.

Some, such as sociologist Charles Perrow, even described it as “normal,” meaning inevitable, due to the complexity of nuclear facilities and their tightly coupled nature – meaning that the components that make up the system are closely interconnected – which makes hard-to-control “snowball effects” likely.

For institutional, industrial and academic experts, the analysis of the accident changed views on the role of humans in these systems and on human error: accidents went from being a moral problem, attributable to humans’ “bad behavior”, to a systemic problem, attributable to poor system design.

Breaking with the human error rationale, these lessons paved the way for the systematization of learning from experience, promoting a focus on transparency and learning.  

Chernobyl and risk governance

It was with Chernobyl that accidents became “organizational,” leading nuclear organizations and public authorities to introduce structural reforms of safety doctrines, based on recognition of the essential nature of “organizational and cultural problems […] for the safety of operations” (IAEA, 1999).

Chernobyl also marked the beginning of major changes in risk governance arrangements at the international, European and French levels. An array of organizations and legal and regulatory provisions were introduced, with the twofold aim of learning from the accident that occurred at the Ukrainian power plant and preventing such an accident from happening elsewhere.

The law of 13 June 2006 on “Nuclear Transparency and Safety” (referred to as TSN), which, among other things, established the ASN – the French nuclear safety authority – as an administrative authority independent from the government, is one emblematic example.

A possibility for every country

25 years after Chernobyl, Japan experienced an accident at its Fukushima-Daiichi power plant.

Whereas the accident that occurred in 1986 could be attributed in part to the Soviet regime and its RBMK technology, the 2011 catastrophe involved American-designed technology and a country that many considered to be at the forefront of modernity.

With Fukushima, a serious accident once again became a possibility that no country could rule out. And yet, it did not give rise to the same level of mobilization as that of 1986.  

Fukushima – a breaking point?

Ten years after the Japanese catastrophe, it can be said that it did not bring about any profound shifts – whether in the way facility safety is designed, managed and monitored, or in the plans and arrangements designed to manage a similar crisis in France (or in Europe).

This has been shown in the research carried out through the Agoras research project.

As far as preparedness for crisis management is concerned, Fukushima led to a re-examination of the temporal boundaries between the emergency phase and the post-accident phase, and to greater investment in the latter.

This catastrophe also led the French authorities to publish a preparedness plan in 2014 for managing a nuclear accident, making it a part of the common crisis management system.

These two aspects are reflected in the strengthening of the public safety portion of the national crisis management exercises carried out annually in France.   

But, as underscored by recent research, the observation of these national exercises did not reveal significant changes, whether in the way they are organized and carried out, the content of plans and arrangements, or, more generally, in the approach to a crisis caused by a major accident – with the exception of the creation of national groups that can intervene quickly on site (FARN).

Limited changes

It may, of course, be argued that, as with the effects of the Three Mile Island and Chernobyl accidents, structural transformations take time, and that it may simply be too early to observe significant change.

But the research carried out through the Agoras project leads us to put forward the hypothesis that changes remain limited, based on two reasons.

The first reason is that structural changes were initiated in the 20 years following the Chernobyl accident. This period saw the rise of organizations dedicated to accident prevention and crisis management preparedness, such as the ASN in France, and European (WENRA, ENSREG) and international cooperation organizations.

These organizations initiated continuous research on nuclear accidents, gradually developing tools for  understanding and responding to accidents, as well as mechanisms for coordination between public officials and industry leaders at the national and international levels.

These tools were “activated” following the Fukushima accident and made it possible to quickly provide an explanation for the accident, launch shared procedures such as supplementary safety assessments (the  much-discussed “stress tests”), and collectively propose limited revisions to nuclear safety standards.

This work contributed to normalizing the accident, by bringing it into existing organizations and frameworks for thinking about nuclear safety.

This helped establish the conviction, among industry professionals and French public authorities, that the  governance regime in place was capable of preventing and responding to a large-scale event, without the need to profoundly reform it.

The inertia of the French system

A second reason comes from the close relationships in France between the major players in the civil nuclear sector (operators – EDF primarily – and regulators – the ASN and its technical support organization IRSN), in particular with regard to establishing and assessing safety measures at power plants.

These relationships form an exceptionally stable organized action system. The Fukushima accident provided a short window of opportunity to impose additional measures on operators.

Read more: L’heure des comptes a sonné pour le nucléaire français (Time for a Reckoning in the French Nuclear Industry)

But this window closed quickly, and the action system returned to a stable state. The inertia of this system can be seen in the production of new regulatory instruments, the development and upgrading of which take several years.   

It can also be seen in the organization of crisis management exercises, which continue to perpetuate distinctions between safety and security, accident and crisis, the facility interiors and the environment, and more generally, between technical and political considerations – distinctions that preserve the structure and content of relationships between regulators and operators.

Learning from accidents

Like Chernobyl, Fukushima was first viewed as an exceptional event: by insisting on the perfect storm of a tsunami of unprecedented magnitude hitting a nuclear power plant, by highlighting the lack of an independent regulatory agency in Japan, and by insisting on the excessive respect for hierarchy among the Japanese, the aim was to construct a unique event so as to suggest that it could not happen in the same way in other parts of the world.

But, at the same time, a normalization process took place, in France in particular, focusing not so much on the event itself as on the risks it posed for the organization of the nuclear industry, meaning the stakeholders and forms of knowledge endowed with legitimacy and authority.

The normalization process led to the accident being included in the existing categories, institutions and systems, in order to demonstrate their ability to prevent such an accident from happening and to limit the impact, should such an accident occur.

This was the result of efforts to delineate the boundaries, with some parties seeking to maintain them and others disputing them and trying to change them.

Ultimately, the boundaries upheld so strongly by industry stakeholders (operators and regulators) – between technical and political considerations, between experts and laymen – were maintained.

Relentlessly questioning nuclear governance

While the Fukushima accident was taken up by political and civil society leaders to challenge the governance of the nuclear industry and its “closed-off” nature, operators and regulators in France and throughout Europe quickly took steps to demonstrate their ability both to prevent such an accident, and to manage the consequences, in order to suggest that they could continue to be entrusted with regulating this sector.

As far as making the sector more open to civil society players is concerned, this movement was initiated well before the Fukushima accident (with the TSN Law in 2006, notably), and was, at best, the continuation of a pre-existing trend.

But other boundaries seem to have emerged or been strengthened in recent years, especially between technical factors and human and organizational factors, or safety requirements and other requirements for nuclear organizations (economic and industrial performance in particular), although it is not exactly clear whether this is related to the accidents.

These movements go hand in hand with a bureaucratization of relationships between the regulator and its technical expert, and between these two parties and operators, and require further research in order to investigate their effects on the foundations of nuclear risk governance.

Talking and listening to one another

Since like causes produce like effects, the real problem is the fact that the nuclear industry is unreceptive to any “uncomfortable knowledge” – to borrow the idea introduced by Steve Rayner.

Social science research has long demonstrated that in order to solve complex problems, a wide range of individuals from various backgrounds and training must be brought together, for research that transcends disciplinary and institutional boundaries.

Social science researchers, engineers and public authorities must talk to – and more importantly – listen to one another. For engineers and policy-makers, that means being ready to take into account facts or knowledge that may challenge established doctrines and arrangements and their legitimacy.  

And social science researchers must be ready to go and see nuclear organizations, to get a first-hand look at their day-to-day operations, listen to industry stakeholders and observe working situations.

But our experience, in particular through Agoras, has shown us that not only is such work time-consuming and costly, it is also fraught with pitfalls. For even when one stakeholder does come to see the soundness of certain knowledge, the highly interconnected nature of relationships with other industry stakeholders, who make up the governance system, complicates the practical implementation of this knowledge, and therefore prevents major changes from being made to governance arrangements.

Ultimately, the highly interconnected nature of the nuclear industry’s governance system is arguably one of its vulnerabilities.

Stéphanie Tillement, Sociologist, IMT Atlantique – Institut Mines-Télécom and Olivier Borraz, CNRS Research Director – Centre for the Sociology of Organisations, Sciences Po

This article has been republished from The Conversation under a Creative Commons license. Read the  original article (in French).


Do hospital staff feel prepared?

Marie Bossard, a specialist in the social psychology of health, has been studying the feeling of preparedness among hospital staff in the face of exceptional health situations in her PhD since 2018. She explores the factors that may influence this feeling to better understand the dynamics of preparation in health systems.

The Covid-19 crisis is a case in point: our care system must sometimes confront exceptional health situations. Hospital staff are trained to respond to such situations, but there is little scientific literature on the way in which those concerned perceive their preparation. So how do caregivers, medical doctors, administrative staff and medical center directors feel in the face of these exceptional situations? This is the subject of Marie Bossard’s PhD at IMT Mines Alès and the University of Nîmes.

When she began her work in 2018, the Covid-19 crisis and pandemics were not yet a major concern. Exceptional health situations include anything that goes beyond the usual functioning of healthcare services. “We originally had in mind the emergency services being overwhelmed after an attack”, explains Gilles Dusserre, a researcher in risk sciences at IMT Mines Alès and joint supervisor of Marie Bossard with Karine Weiss at the University of Nîmes. Whatever the cause, this research fits into a global reflection on the current problems in emergency medicine. This is what the researchers want to understand better, in order to provide operational responses for users and hospital staff.

The feeling of “preparedness”

“The idea is to start with the individual and study how each person perceives his or her level of preparedness, and then develop these reflections on a collective scale,” says Marie Bossard. The aim is to measure the feeling of “preparedness” and identify the factors that influence it, as well as to apply psychosocial models to the level of preparedness of hospital staff. The PhD student is exploring the social representations of hospital staff through interviews with medical doctors, paramedics, health executives and administrative employees in different French university hospitals.

“We can differentiate the feeling of preparedness, the perception of our preparation, and the reported preparation,” explains Marie Bossard. “If hospital staff consider that exceptional health situations are only linked to an attack, for example, they might never be prepared for a fire,” she continues.

And, although the preparation received has an influence on the feeling of preparedness, she insists that “there are many other aspects to take into account. The feeling of self-efficacy is important, in particular.” This psycho-social concept represents, in a way, the power to act: the individual perception of having sufficient skills to manage a situation and knowing how to apply them. The perception of preparation, whether positive or negative, also affects the feeling of preparedness. The role of the collective is also undeniable. “A common response is that, individually, the person doesn’t feel ready, but they still have confidence in the collective,” she adds. “There’s a certain resignation,” says her joint PhD supervisor. “Hospital systems are already going through a difficult time and are coping, so collectively they feel capable of facing one more challenge.”

In a second phase, the aim is to propose hypotheses on the structure and content of these social representations. For example, health executives do not give the same type of spontaneous responses as paramedics when asked to list words in connection with exceptional health situations. The former generally talk about the practice of preparation (logistics, influx), while the latter generally mention everyday examples or emotion (danger, serious, disaster).

The context of the Covid crisis

Given that the development of an exceptional health situation was completely unforeseeable, it initially seemed impossible to carry out a field study. However, the pandemic caused by the new coronavirus in early 2020 provided a characteristic field of study for the researchers. Marie Bossard and her joint supervisors reorganized their methodology and two new studies were prepared. The first, carried out before the arrival of the virus in France, studied the feeling of preparedness of more than 400 participants among hospital personnel, at both individual and collective levels. The second was conducted after the first peak of the epidemic and before a potential second wave, which was still an uncertainty at the time. The questionnaires from this second study, carried out among 534 participants, provide a comparison between the feeling of preparedness before and after Covid-19.

The post-Covid study confirmed that the feeling of preparedness depends on psycho-social variables and not just the level of preparation. Age and years of professional experience also influence this feeling, as do the profession and any previous experience of managing an exceptional health situation. These are individual variables, but the role of the collective was also confirmed. “The more ready and prepared others are, the higher the perception of personal preparedness,” says Marie Bossard. “Similarly, perceiving the hospital as ready, with sufficient human and material resources, has a great influence.” The PhD student is currently studying the results of the latest study conducted in September.

The situation, although difficult, provides “a context for the answers given during the first interviews,” says the PhD student. For example, it confirms that all hospital staff are involved, not just those considered on the front line. Indeed, the mobilization affects every hospital department. She admits that “the Covid-19 health crisis has given us a new perspective on this PhD subject, which is now topical and concretely demonstrates the need for a better understanding in this field”. It is also an opportunity to explore, through a concrete application of the subject, the effect of this exceptional health situation on the feeling of preparedness among those first concerned, and the factors that influence this feeling.

“We haven’t found any previous studies that have explored this subject from the same angle,” says Marie Bossard. “We’re starting from scratch. The aim is to remain as open-minded as possible to identify initial indicators, and then dig deeper into more specific questions,” she concludes. It could lead to new studies, for example to understand why the feeling of self-efficacy plays such an important role in the feeling of preparedness.

 Tiphaine Claveau

Circular economy, environmental assessment and environmental budgeting

To implement a robust and durable circular economy strategy, it is important to assess its environmental impacts. Valérie Laforest and Natacha Gondran, both researchers at Mines Saint-Étienne, explain the reasons for incorporating an absolute environmental sustainability assessment method and the underlying concept of environmental budgeting.

The lifestyles of our contemporary societies are exerting constant and unsustainable pressure on the balance of our planet. One of the proposed strategies for protecting the Earth’s resources is the circular economy. The concept may seem simple – to encourage recycling and reuse to limit the consumption of raw materials – but environmental impact assessment involves a large number of variables and makes things complicated. This is why researchers are working to design more effective assessment methods for these impacts than the current tools, which are still insufficient. In particular, they are developing a systemic approach that integrates absolute environmental impact assessment.

This issue is at the heart of Valérie Laforest and Natacha Gondran’s work, both researchers at Mines Saint-Étienne1 and members of the Environmental Assessment of Waste, Effluents, Materials, Sediments and Soils (EDEEMS) Scientific Interest Group (SIG). Bringing together seven regional institutions, the EDEEMS SIG carries out, among other things, research on the health and environmental impacts of the circular economy. “The aim is to show that our collaborations can offer the economic world scientific support to overcome the obstacles that still pose a problem,” says Valérie Laforest. The researcher is a specialist in environmental assessment and focuses on the methods used to evaluate these impacts. At the heart of the issue is the definition of indicators to assess the pressure that human activity exerts on natural resources and environments.

A systemic approach

“This can be very experimental,” says Valérie Laforest. Within the SIG, “we’re starting out on a laboratory scale, then we’ll progressively move up to a pilot level to demonstrate the validity of our work on an industrial scale”. Let us consider the building sector and its impact on ecosystems as an example. Analyses and monitoring are done through ecotoxicology studies or environmental impact assessments from the source of pollutant emissions to their final destination. At the same time, the different transfers constituting all possible interactions between the source and the target, such as groundwater or soils, are also studied.

In the context of the circular economy, evaluating the “source” elements of pollution requires meticulous characterization of the materials produced from recycling. For example, besides the composition of the recycled materials itself, their reactivity must also be studied, with biodegradation tests for sources of organic pollution. These indicators are essential for assessing the different types of pressure on the ecosystems in greater detail.

“There is a growing interest in research into the planet’s limits today. The idea is to compare this work with the impacts generated by production systems, using what are known as absolute environmental sustainability assessment methods,” says Valérie Laforest. The Earth not only has a limited amount of resources, but also a limited capacity for absorption. We must therefore take account of all the impacts, both positive and negative, across all sectors. The researcher adds that in order to implement a sustainable circular economy, it is necessary to have “robust and transparent methods that allow us to act with knowledge of the consequences and perfect control of the risks.”

Environmental budgeting

“It is essential to integrate a systemic approach to standardize indicators for the evaluation of environmental impacts,” says Valérie Laforest. And, ultimately, to understand the impact of anthropogenic activities in relation to our planet’s capacity to absorb them. To avoid exceeding this capacity, one idea is to put in place an “environmental budget”. “We are aiming to break down the planet’s absorption capacity by type of activity according to the needs and contribution of each one,” explains Valérie Laforest. “Imagine allocating to each sector of activity a level of emissions that can be absorbed by the planet without too much disruption to the natural balance.”

However, distributing the planet’s total budget across the different activities of society raises various scientific, ethical and political questions. In addition, the total environmental budget for a given sector would have to be able to be broken down between the different brands or companies to see what they consume out of the available budget. “As part of a PhD by Anastasia Wolff, we adapted existing models and tested these methods for the food industry branch of a major retail group. For some indicators, such as climate change, they had already exceeded the allocated budget. Just for eating, this brand and its clients were already exceeding the environmental budget available to them,” explains Natacha Gondran.
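To make the notion of an environmental budget more tangible, here is a minimal sketch of the kind of downscaling calculation involved: a global budget for one indicator is allocated to a sector via an allocation key, then compared with the sector’s estimated footprint. Every figure and the proportional allocation rule below are hypothetical; choosing real indicators, budgets and allocation keys is precisely the scientific and ethical question the researchers describe.

```python
# Minimal sketch of downscaling a global environmental budget to one sector.
# All numbers and the proportional allocation key are illustrative assumptions.

GLOBAL_CO2_BUDGET_GT = 40.0     # hypothetical yearly CO2 budget, in gigatonnes
SECTOR_ALLOCATION_SHARE = 0.05  # hypothetical share allocated to the sector (5%)
SECTOR_FOOTPRINT_GT = 2.6       # hypothetical estimated footprint of the sector

def budget_overshoot(global_budget: float, share: float, footprint: float) -> float:
    """Return the ratio of the sector's footprint to its allocated budget.
    A value above 1.0 means the allocated budget is exceeded."""
    allocated = global_budget * share
    return footprint / allocated

ratio = budget_overshoot(GLOBAL_CO2_BUDGET_GT, SECTOR_ALLOCATION_SHARE, SECTOR_FOOTPRINT_GT)
print(f"Sector uses {ratio:.0%} of its allocated budget")  # 130% -> budget exceeded
```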

Valérie Laforest and Natacha Gondran’s team focuses its work on the choice of relevant indicators, the definition and allocation of this ecological budget to a sector of activity and the evaluation of a given sector’s consumption of and contribution to this budget. It is a mammoth task. This global approach also aims to raise awareness of the scope of the issues in order to target which points to work on to efficiently reduce the environmental impact.

Besides this, there are other essential dimensions for implementing a sustainable circular economy. “The participation and involvement of local actors in the process is essential. It is a key factor of success”, says Valérie Laforest. While the researchers are developing the right tools, it is still vital to work with local actors to understand the situation and implement the process. “At IMT, the circular economy is one of the priority actions on the theme of renewable energy and resources. In addition, IMT is at the heart of numerous projects within its different schools. IMT also supports platforms such as the Plateforme Territoire at Mines Saint-Étienne, which aims in particular to help local actors visualize information through a spatial representation and target priority issues,” says Valérie Laforest.

1 Valérie Laforest and Natacha Gondran carry out their research in the framework of the Environment, City and Society laboratory, a joint CNRS research unit composed of 7 members including Mines Saint-Étienne.

Tiphaine Claveau

The Alicem app: a controversial digital authentication system

Laura Draetta, Télécom Paris – Institut Mines-Télécom and Valérie Fernandez, Télécom Paris – Institut Mines-Télécom

Some digital innovations, although considered to be of general interest, are met with distrust. A responsible innovation approach could anticipate and prevent such confidence issues.

“Alicem” is a case in point. Alicem is a smartphone app developed by the State to offer the French people a national identity solution for online administrative procedures. It uses face recognition as a technological solution to activate a user account and allow the person to prove their digital identity in a secure way.

After its authorization by decree of May 13, 2019 and the launch of the experimentation of a prototype among a group of selected users a few months later, Alicem was due to be released for the general public by the end of 2019.

However, in July of the same year, La Quadrature du Net, an association for the defense of rights and freedoms on the Internet, filed an appeal before the Council of State to have the decree authorizing the system annulled. In October 2019, the information was relayed in the general press and the app was brought to the attention of the general public. Since then, Alicem has been at the center of a public controversy surrounding its technological qualities, potential misuses and regulation, leading to it being put on hold to dispel the uncertainties.

At the start of the summer of 2020, the State announced the release of Alicem for the end of the autumn, more than a year later than planned in the initial roadmap. Citing the controversy on the use of facial recognition in the app, certain media actors argued that it was still not ready: it was undergoing further ergonomic and IT security improvements and a call to tender was to be launched to build “a more universal and inclusive offer” incorporating, among other things, alternative activation mechanisms to facial recognition.

Controversy as a form of “informal” technology assessment

The case of Alicem is similar to that of other controversial technological innovations pushed by the State such as the Linky meters, 5G and the StopCovid app, and leads us to consider controversy as a form of informal technology assessment that defies the formal techno-scientific assessments that public decisions are based on. This also raises the issue of a responsible innovation approach.

Several methods have been developed to evaluate technological innovations and their potential effects. In France, the Technology Assessment – a form of political research that examines the short- and long-term consequences of innovation – is commonly used by public actors when it comes to technological decisions.

In this assessment method, the evaluation is entrusted to scientific experts and disseminated among the general public at the launch of the technology. The biggest challenge with this method is supporting the development of public policies while managing the uncertainties associated with any technological innovation through evidence-based rationality. It must also “educate” the public, whose mistrust of certain innovations may be linked to a lack of information.

The approach is perfectly viable for informing decision-making when there is no controversy or little mobilization of opponents. It is less pertinent, however, when the technology is controversial. A technological assessment focused exclusively on scholarly expertise runs the risk of failing to take account of all the social, ethical and political concerns surrounding the innovation, and thus not being able to “rationalize” the public debate.

Participation as a pillar of responsible innovation

Citizen participation in technology assessment – whether to generate knowledge, express opinions or contribute to the creation and governance of a project – is a key element of responsible innovation.

Participation may be seen as a strategic tool for “taming” opponents or skeptics by getting them on board or as a technical democracy tool that gives voice to ordinary citizens in expert debates, but it is more fundamentally a means of identifying social needs and challenges upstream in order to proactively take them into account in the development phase of innovations.

In all cases, it relies on work carried out beforehand to identify the relevant audiences (users, consumers, affected citizens etc.) and choose their spokespersons. The definition of the problem, and therefore the framework of the response, depends on this identification. The case of Linky meters is an emblematic example: anti-radiation associations were not included in the discussions prior to deployment because they were not deemed legitimate to represent consumers; consequently, the figure of the “affected citizen” was nowhere to be seen during the discussions on institutional validation but is now at the center of the controversy.

Experimentation in the field to define problems more effectively

Responsible innovation can also be characterized by a culture of experimentation. During experimentation in the field, innovations are confronted with a variety of users and undesired effects are revealed for the first time.

However, the question of experimentation is too often limited to testing technical aspects. In a responsible innovation approach, experimentation is the place where different frameworks are defined, through questions from users and non-users, and where tensions between technical efficiency and social legitimacy emerge.

If we consider the Alicem case through the prism of this paradigm, we are reminded that technological innovation processes carried out in a confined manner – first through the design of devices within the closed ecosystem of paying clients and designers, then through experiments on the use of artifacts already considered stable – inevitably lead to acceptability problems. Launching a technological innovation without user participation in its development undoubtedly makes the process faster, but may come at the cost of its legitimacy and even lead to a loss of confidence in its promoters.

In the case of Alicem, the experiments carried out among “friends and family”, with the aim of optimizing the user experience, could be a case in point. This experimentation focused more on improving the technical qualities of the app than on taking account of its socio-political dimensions (risk of infringing upon individual freedoms, loss of anonymity, etc.). As a result, when the matter was reported in the media it was presented through an amalgamation of facial recognition use cases and anxiety-provoking arguments (“surveillance”, “freedom-killing technology”, “China”, “social credit”, etc.), without, however, presenting the reality of more common uses of facial recognition that carry the same risks as those being questioned.

These acceptability problems encountered by Alicem are not circumstantial issues unique to a specific technological innovation; they must be understood as structural markers of how contemporary society functions. For, although the “unacceptability” of this emerging technology is a threat to its promoters and a hindrance to its adoption and diffusion, it is above all indicative of a lack of confidence in the State that goes beyond the intrinsic qualities of the innovation itself.

This text presents the opinions stated by the researchers Laura Draetta and Valérie Fernandez during their presentation at the Information Mission on Digital Identity of the National Assembly in December 2019. It is based on the case of the biometric authentication app Alicem, which sparked controversy in the public media sphere from the first experiments.

Laura Draetta, a Lecturer in Sociology, joint holder of the Responsibility for Digital Identity Chair, Research Fellow Center for Science, Technology, Medicine & Society, University of California, Berkeley, Télécom Paris – Institut Mines-Télécom and Valérie Fernandez, Professor of Economics, Holder of the Responsibility for Digital Identity chair, Télécom Paris – Institut Mines-Télécom

This article was republished from The Conversation under the Creative Commons license. Read the original article here.

 


Digital transformation: how to avoid missing out on the new phase of work that has begun

Aurélie Dudézert, Institut Mines-Télécom Business School, and Florence Laval, IAE de Poitiers


After a lockdown that has helped reveal how far along companies are in their digital transformation, the easing of lockdown measures has ushered in a new phase marked by a desire to return to “normal” activities, which is impossible due to changing health restrictions.

Some organizations have therefore tried to use the health context as a pretext for regaining control over informal exchanges and adjustments that are impossible to control in a remote work environment (employees clocking in on site vs. remotely; identifying who is working with whom, at what time, etc.).

The organizational agility required for the goal of digital transformation and implemented in teams during the lockdown has been undermined by attempts to standardize work and return to uniform processes for the entire organization.

Mask-wearing has also become a point of tension. Besides being uncomfortable, masks conceal faces after what was often a period of remote work – in which it was difficult to perceive others’ emotions – and therefore complicate relationships. We must learn to read emotions differently and communicate with others differently.

These challenges are compounded by uncertainty over changing health restrictions. Highly adaptive ways of working must be put in place. Periods of site closures are followed by periods of hybrid work with employees taking turns working on site to comply with health restrictions.

Designing the transformation

After learning how to work together remotely, employees and managers must once again learn how to work together constantly. 

To respond to this situation, three strategies, which we explain in the collective work L’impact de la crise sur le management (The Impact of the Crisis on Management, Éditions EMS) seem to be useful to help get through this second wave of the crisis and continue the digital transformation of working methods.

The first is to work with teams on emerging stress and tensions by seeing them not as symptoms of individuals’ inability/incompetence to cope with the situation, but as drivers for developing appropriate ways to coordinate work.

For instance, if mask-wearing is a source of tension, bringing teams together to discuss what is causing the tension could provide an opportunity to create a new working arrangement that is more effective and better-suited to the new digital environment. This means that the manager must acknowledge employees’ experiences and perceptions and take them seriously so they can be revealed as expectations, such as creativity, or as the rejection of the organization and its goals.

The second strategy is to develop reflexive management, which takes an objective look at the work methods put in place in the current adaptation phase. It is quite clear today that work practices are moving towards a hybridization between working from the office/remotely and synchronous/asynchronous.

Rather than seeing the successive changes in health regulations as constraints, which make it difficult to do business and seamlessly continue their digital transformation, organizations would benefit from taking advantage of these periodic adjustments to gain insight into the pros and cons of this hybrid system.  

This objective look could provide an opportunity to characterize which activities specific to each team are indisputably more productive in person than remotely, or to determine how to manage teams working both from home and on-site.

The third strategy is to “encourage digital intelligence”, meaning working with the team to determine the requirements and uses of digital technology, depending on working methods. For example, it may not be necessary to upgrade employees’ skills to increase their proficiency in an entire collaborative work suite if the goal is simply to enable them to work together via web conference.

Overstretching employees at such an uncertain and strange time is an additional risk that could undermine the digital transformation process. Going back to the basic uses of digital technology in order to carry out tasks seems to be much more useful and effective.

Aurélie Dudézert, Full Professor, IMT BS, Institut Mines-Télécom Business School and Florence Laval, Lecturer at IAE de Poitiers

This article has been republished from The Conversation under a Creative Commons license. Read the original article (in French).


Crisis management: better integration of citizens’ initiatives

Caroline Rizza, Télécom Paris – Institut Mines-Télécom


As part of my research into the benefits of digital technologies in crisis management, and in particular the digital skills of those involved in a crisis (whether institutions or grassroots citizens), I had the opportunity to shadow the Fire and Emergency Department of the Gard (SDIS) in Nîmes, from 9 to 23 April 2020, during the COVID-19 health crisis.

This immersive investigation enabled me to fine-tune my research hypotheses on the key role of grassroots initiatives in crises, regardless of whether they emerge in the common or virtual public space.

Social media to allow immediate action by the public

So-called “civil security” crises are often characterized by their rapidity (a sudden rise to a “peak”, followed by a return to “normality”), uncertainties, tensions, victims, witnesses, etc.

The scientific literature in the field has demonstrated that grassroots initiatives appear at the same time as the crisis in order to respond to it: during an earthquake or flood, members of the public who are present on-site are often the first to help the victims, and after the crisis, local people are often the ones who organize the cleaning and rebuilding of the affected area. During the Nice terror attacks of July 2016, for example, taxi-drivers responded immediately by helping to evacuate the people present on the Promenade des Anglais. A few months earlier, during the Bataclan attacks in 2015, Parisians opened their doors to those who could not go home and used the hashtag #parisportesouvertes (parisopendoors). Genoa experienced two rapid and violent floods in 1976 and 2011; on both occasions, young people volunteered to clean up the streets and help shop owners and inhabitants in the days that followed the event.

These initiatives have increased since the arrival of social media in our daily lives, which has helped them emerge and get organized online as a complement to actions that usually arise spontaneously in the field.

My research lies within the field of “crisis informatics”. I am interested in these grassroots initiatives which emerge and are organized through social media, as well as the issues surrounding their integration into crisis management. How can we describe these initiatives? What mechanisms are they driven by? How does their creation change crisis management? Why should we integrate them into crisis response?

Social media as an infrastructure for communication and organization

Since 2018, I have been coordinating the ANR MACIV project (Citizen and volunteer management: the role of social media in crisis scenarios). We have been looking at all the aspects of social media in crisis management: the technological aspect with the tools which can automatically supply the necessary information to institutional players; the institutional aspect of the status of the information coming from social media and its use in the field; the grassroots aspect, linked to the mechanisms involved in the creation and sharing of the information on social media and the integration of grassroots initiatives into the response to the crisis.

We usually think of social media as a means of communication used by institutions (ministries, prefectures, municipalities, fire and emergency services) to communicate with citizens top-down and improve the situational analysis of the event through the information conveyed bottom-up from citizens.

The academic literature in the field of  “crisis informatics” has demonstrated the changes brought by social media, and how citizens have used them to communicate in the course of an event, provide information or organize to help.

On-line and off-line volunteers

We generally distinguish between “volunteers” in the field and online. As illustrated above, volunteers who are witnesses or victims of an event are often the first to intervene spontaneously, while social media focuses on organizing online help. This distinction can help us understand how social media have become a means of expressing and organizing solidarity.

It is interesting to note that certain groups of online volunteers are connected through agreements with public institutions and their actions are coordinated during an event. In France, VISOV (international volunteers for virtual operation support) is the French version of the European VOST (Virtual Operations Support Team); but we can also mention other groups such as the WAZE community.

Inform and organize

There is therefore an informational dimension and an organizational dimension to the contribution of social media to crisis management.

Informational in that the content that is published constitutes a source of relevant information to assess what is happening on site: for example, fire officers can use online social media, photos and videos during a fire outbreak, to readjust the means they need to deploy.

And organizational in that the aim is to work together to respond to the crisis.

For example, creating a Wikipedia page about an ongoing event (and clearing up uncertainties), communicating pending an institutional response (Hurricane Irma, Cuba, in 2017), helping to evacuate a place (Gard, July 2019), taking in victims (Paris, 2015; Var, November 2019), or helping to rebuild or to clean a city (Genoa, November 2011).


Screenshot of the Facebook page of VISOV to inform citizens of available accommodation following the evacuation of certain areas in the Var in December 2019. VISOV Facebook page

An increased level of organization

During my immersion within the SDIS (departmental fire and rescue service) of the Gard as part of the management of the COVID-19 crisis, I had the chance to discuss and observe the way in which social media were used to communicate with the public (reminding them of preventative measures and relaying daily updates from regional health agencies), as well as to integrate some grassroots initiatives.

Although the crisis was a health crisis, it was also a logistics crisis. Many citizens (individuals, businesses, associations, etc.) organized to support the institutions: sewing masks or making them with 3D printers, converting soap production to hand sanitizer production, and offering to translate information on preventative measures into different languages and share it to reach as many citizens as possible. These were all initiatives which I came across and which helped institutions organize during the peak of the crisis.


Example of protective visors made using 3D printers for the SDIS 30.

 

The “tunnel effect”

However, the institutional actors I met and interviewed within the framework of the two studies mentioned above (SDIS, Prefecture, Defense and Security Zone, DGSCGC) all highlighted the difficulty of taking into account information shared on social media – and grassroots initiatives – during crises.

The large number of calls about the same event, the excess of information to be processed and the gravity of the situation mean that the focus has to remain on the essentials. This is the “tunnel effect”, which these institutions identify as one of the main reasons why it is difficult to integrate these tools into their work and these initiatives into their emergency response.

The information and citizen initiatives that circulate on social media during an event may therefore help the process of crisis management and response, but, paradoxically, they can also make it more difficult.

Then there is also the circulation on social media of rumors and fake news, especially when there is an information gap or contradictory information about an event (as seen, for example, on the Wikipedia page during the COVID-19 crisis).

How and why should we encourage this integration?

Citizen initiatives have had a horizontal impact on institutions' professional practices.

My observation of crisis management within the SDIS 30 enabled me to go a step further and put forward the hypothesis that another dimension slows down the integration of these initiatives that emerge in the shared or virtual public space: integrating them implies placing the public on the same level as the institution. In other words, these initiatives do not just have a horizontal “impact” on professional practices and their rules (doctrines); their integration requires citizens to be recognized as participants in crisis management and response.

There is still a prevailing idea that the public needs to be protected, but the current crisis shows that the public also wants to play an active role in protecting itself and others.

The main question that then arises is that of the necessary conditions for this recognition of citizens as participants in the management and response to the crisis.

Relying on proximity

It is interesting to note that at a very local level, the integration of the public has not raised problems; on the contrary, it is seen as a good opportunity to diversify initiatives and to recognize each of the participants within the area.

However, higher up the operational chain of command, this recognition poses more problems because of the commitment and responsibility it entails for institutions.

My second hypothesis is therefore as follows: close relations between stakeholders within the same territorial fabric allow better familiarity with grassroots players, thereby fostering mutual trust. This trust seems to me to be the key to success, and it explains the successful integration of grassroots initiatives during a crisis, as illustrated by VISOV and the VOSTs.


The original version of this article (in French) was published on The Conversation.
By Caroline Rizza, researcher in information sciences at Télécom Paris.


The end or beginning of third places?

After our homes and workplaces, the social environments in which we spend time are referred to as third places. These are places for gathering and adventure, but at the same time places of safety, security and control. In the following article, Müge Özman[1], Mélissa Boudes[2], Cynthia Srnec (FESP-MGEN)[3], Nicolas Jullien[4] and Cédric Gossart[5], members of IMT’s INESS idea lab, explore our relationship with third places and the challenges and opportunities of digital technology.

 

“Space is a common symbol of freedom in the Western world. Space lies open; it suggests the future and invites action[…]. To be open and free is to be exposed and vulnerable […] Compared to space, place is a calm center of established values. Human beings require both space and place. Human lives are a dialectical movement between shelter and venture, attachment and freedom.”

Yi-Fu Tuan, Space and Place, 1980

Observing that humans clearly spent a lot of time in coffee shops, restaurants, libraries, bars and hair salons, Ray Oldenburg coined the expression “third places” to describe these places other than home or work. Although they are known by different names around the world, they always serve the same essential purpose of socialization: giving people a chance to take a break and to expose themselves and open up to others, but in relative security. Who would have thought that a virus would suddenly deprive billions of humans of these local gathering places by turning them into areas of vulnerability?

For many humans, the Covid-19 pandemic has led to an extension of their virtual spaces by contracting physical space to the ultimate shelter: home. From streams of video conferences to “Zoom cocktail parties,” digital technology has helped maintain a sense of continuity in social interactions, at times even strengthening relationships within an apartment building or neighborhood and opening up new forms of work. It has also shown that for many meetings, virtuality is enough, saving thousands of tons of CO2 in the process. The world of tomorrow is first and foremost one of sustainable human activity, and for such uses at least, digital technology has proven to be effective.

But at the same time, digital technology has also raised concerns about accessibility and exposing oneself to risks. Exploring these new spaces with peace of mind requires opportunities for refuge and control safeguards. Driven by the algorithms of globalized platforms, digital technology shapes our way of life in terms of communication, information and consumption, without necessarily providing an opportunity to express our attachment to a “place” and our need to shape it.

Between third place and virtuality

Between infinitely large digital spaces and the intimacy of home, will there be nothing else in this “world of tomorrow”? How can systems of refuge spaces such as third places flourish once again and plant the seeds of greater resilience to external shocks? Must we give up digital technologies to save our cherished third places? The success with which the reopening of bars and restaurants has been met shows that this need for in-person socialization has not gone away. But when it comes to communicating face-to-face with individuals outside of our local area, must we choose between physical travel, which is harmful to the planet, and digital interactions, which, controlled by global companies, are outside of individuals’ control and therefore lead to a heightened sense of vulnerability?

Luckily, our choices are not limited to this binary alternative. Many solutions seek to combine the effectiveness of digital technology with the dynamics of local communities. Take FabLabs, for example. With nearly 400 active FabLabs in France, these “fabrication laboratories” make digital technology available in a collaborative way to solve concrete problems. Most of them are small-scale production units open to the general public, with a high degree of flexibility that allows them to adapt their production processes to the needs of their local communities. From the beginning of the lockdown, the French FabLab Network reorganized its members’ local activities in order to produce and distribute masks and face shields. In particular, it made plans and manufacturing guidelines available through open access and provided technical and logistical advice.

Other local initiatives based on digital platforms controlled by their users have increased their activity to provide services, share resources and help strengthen resilience in local communities. One such example is Pwiic, a mutual aid platform through which neighbors helped one another procure food and medication during the lockdown period. Others include the Open Food Network, which supports the organization of local food systems, and CoopCycle, for the delivery of online purchases. The latter platform helps bicycle delivery workers organize through associations or cooperatives to obtain more dignified working conditions than those offered by other better-known platforms. Restrictions on travel and gatherings imposed by lockdown orders may also benefit fair travel platforms such as Les Oiseaux de Passage, which combine human connections and tourism.

The digital tools behind such platforms can also be developed locally and/or by their users. The free software movement has led the way, but there are also initiatives to produce and host free tools locally. This is the aim of CHATONS (the Collective of Alternative, Transparent, Open, Neutral and Solidarity-based Web Hosts), another example of an organization that has been very active both before and during the crisis we are experiencing.

Commons

Even though they are digitally-based, platforms can have significant tangible effects on people and places, for example, the rise in housing prices driven by tourist rentals. How, then, can we preserve the vitality of our local places without giving up the benefits of digital technology?

The first French Forum on Cooperative Platforms highlighted the importance of partnerships between local authorities and digital platforms in order to develop new business models that are sustainable from a social and ecological point of view. As illustrated in the April 2020 presentation to the European Commission by Plateformes en Communs (Commons Platforms), the French network of cooperative platforms, such partnerships can help make local territories more resilient and autonomous by sharing resources through inclusive governance.

These solutions proposed by citizens, social economy organizations and public players represent alternatives to technologies that tend to be monopolistic. Because they offer governance that is closer to local needs – and more importantly, shared – they invent new virtual third places. By pooling time, digital technologies, knowledge and a variety of other resources for the benefit of other citizens and organizing the collective management of these resources,  those behind such initiatives have opened the door to new digital “commons.”

Like physical commons, digital commons are based on co-management, by a portion of the users, of the (digital) resources so that as many people as possible may benefit. These commons, whether physical or digital, are always threatened by the breakdown of the collective and by competition from private solutions that are more appealing, at least in the short term. But the success of the initiatives we have cited – and many others – and their flexibility and resilience in this time of crisis have proven their effectiveness. They are viable, resilient solutions, and probably more sustainable in terms of the diffusion, appropriation and control of technologies and the digital space.

What digital technology and these initiatives have shown us is that a place is not necessarily physical. It is that which is close and familiar, which we can influence and shape. Even under lockdown, places were not erased from the horizon of human activity; they continued to organize and host collective action, in particular in virtual third places. The lockdown was above all a period in which leisure third places were off-limits, and the backdrop for locally rooted social and digital innovation. The creation and renovation of commons – tangible, intangible and hybrid – initiated by cooperative platforms leads us to rethink the dimensions and potential of third places in the 21st century. But not what makes them so necessary: these are places for exploration, of course, but ones that participants can control, places they can help shape and organize, in a word: govern.

[1] Müge Özman is a researcher at the Institut Mines-Télécom Business School.
[2] Mélissa Boudes is a researcher at the Institut Mines-Télécom Business School.
[3] Cynthia Srnec is a researcher at the MGEN Foundation for Public Health and an associate researcher at LITEM.
[4] Nicolas Jullien is a researcher at IMT Atlantique and a member of the GIS Marsouin Scientific Interest Group.
[5] Cédric Gossart is a researcher at the Institut Mines-Télécom Business School.

 


In French nursing homes, the Covid-19 crisis has revealed the detrimental effects of austerity policies

This article was originally published (in French) in The Conversation.
By Laura Nirello, IMT Lille Douai, and Ilona Delouette, University of Lille.


 

With apocalyptic accounts of conditions in French nursing homes, where deaths have soared (over 9,000 estimated as of 3 May 2020), the Covid-19 pandemic has revealed, more than ever, the hardships facing this sector.

For years, care providers who work in France’s nursing homes (known as EHPADs in French) have been sounding the alarm about the crisis facing such facilities, underscoring the pitfalls of austerity policies and budgeting when applied to healthcare and care for dependent persons.

The ‘EHPAD’ nursing home status was created in 1997, when, after twenty years of discussion, the idea of covering care for dependent persons through the national Social Security program was set aside. That decision was based above all on technical considerations, in particular uncertainty about how the cost of providing care for dependent persons would develop in the future, and therefore about how it could be budgeted over time.

In 1997, a welfare allowance managed by the departments was therefore put in place (the PSD assistance allowance, which has since been replaced by the APA personal care allowance).

Impossible to separate ‘cure’ from ‘care’?

This theoretical separation between healthcare, funded by Social Security, and care for dependent persons, funded by the departments, is at odds with the reality of care situations. Indeed, how can that which pertains to health (cure) be separated from that which pertains to assisting dependent persons (care)?

It is even more difficult to separate the two aspects in the case of highly dependent persons who require medical care in an institutional setting. The ‘EHPAD’ nursing home status was created precisely to cope with the influx of highly dependent persons: it makes facilities eligible for funding from both the public health insurance program and the departments.

Funding for nursing homes is therefore based on a three-part pricing system according to a theoretical categorization of costs (medical care, dependent care, living expenses). This funding is provided by public authorities, all of whom have limited budgets.

Living expenses are paid for by residents and their families. ‘Medical care’ is fully funded by the public health insurance program, via the Regional Health Agencies (ARS), while ‘dependent care’ is primarily funded by the Departmental Councils. The Regional Health Agencies are limited to the fixed budgets voted upon annually through the Social Security Financing Act, while the Departmental Councils are limited to the funds transferred from the State through the personal care allowance (APA).

Medical care for the lowest possible cost

As part of the austerity policies imposed on the hospital sector, healthcare regulators gradually sought to remove expenditure for dependent elderly persons from hospital accounts. As such, according to IGAS, over a ten-year period (2006-2016) more than half of the beds in long-term care units (USLD) filled by highly dependent persons whose condition requires constant medical supervision were converted to places in nursing homes. Elderly people suffering from a loss of independence had no choice but to follow this trend and were sent to nursing homes. The State also invested in home care services and independent living facilities for the most independent individuals. This made it possible to limit the number of new places created in nursing homes.

The funding for nursing homes is negotiated through multi-year performance and resource contracts (CPOM) that determine an average level of dependency and severity of medical conditions for residents for a five-year period: the institutions are responsible for remaining within these averages and controlling resident admissions and discharges based on their level of dependency.

In this way, the authorities who fund the nursing homes pushed them to specialize in highly dependent residents by taking in individuals excluded from the hospital setting and no longer independent enough to live at home or in intermediate living facilities. Nursing homes also tend to provide care for a community with an increasing number of medical conditions: more than a third of residents suffer from Alzheimer’s disease and struggle to perform everyday tasks (90% of residents need help with bathing and grooming); residents are admitted at an increasingly advanced age (85 years and 8 months) and stays in nursing homes are shorter (2 years and 5 months), according to data from the DREES (Directorate for Research, Studies, Evaluation and Statistics, a directorate of the central administration of the health and social affairs ministries).

But nursing homes’ resources have not kept pace with this changing profile of the residents receiving care. According to the DREES, while nursing homes now provide care for residents whose needs closely resemble those in long-term care units (USLD), the caregiver-to-resident ratio is 0.62 full-time equivalent employees per resident, compared with 1.1 full-time equivalent employees per patient in long-term care units.

Moreover, while the staff of long-term care units are primarily made up of nurses, geriatric medicine specialists and nursing aides, in nursing homes there is only a single coordinating physician. And this physician is only present on a part-time basis, since they work at several facilities. Likewise, there are few nurses (5.8 per 100 residents) and they are not on site at night, whereas nurses are present 24 hours a day in long-term care units. Nursing home staff are primarily made up of nursing aides and auxiliary staff, who are undoubtedly extremely devoted to their work, but are not adequately trained for the tasks they carry out and are certainly underpaid as a result.

Deteriorating work and care conditions

Nursing homes find themselves facing a chronic lack of public funding. It therefore comes as no surprise that, faced with emergency situations and endless needs, employees inevitably perform tasks that extend beyond their job description: they have no choice but to carry out tasks that are essential, but for which they are not qualified, to provide residents with the care and assistance they need (auxiliary staff help with grooming while nursing aides provide medical care). There is a disconnect between the work performed and salary levels, which remain low, making the sector unappealing. As a result, most nursing homes struggle to recruit staff, which further worsens the already low caregiver-to-resident ratio in these facilities.

Working conditions have become even more difficult as managerial practices have changed under efforts to control public spending, leading to a demand for cost-effectiveness in nursing homes. These changes run counter to the founding principles of the facilities. As successors to retirement homes, these institutions are also living communities, with a great number of interpersonal needs relating to accommodation (laundry, dining services), individual relationships and social life (care).

But in an effort to streamline operations, which goes hand in hand with cutting costs, work is “industrialized,” tasks are standardized and must be completed at a faster pace. The goal is to cut back on time considered to be “unproductive” – meaning saying “Good morning” and “How are you today?” to residents in the morning and talking with them calmly in the evening – which ultimately amounts to all interpersonal aspects.

As far as the indicators used to fund institutions are concerned, public authorities prioritize tasks meant to accurately reflect operational productivity: the number of residents assisted with bathing and grooming or the number of meals served! This intensifies the trend toward the dehumanization of living conditions in nursing homes, which are gradually turning into “places to die.”

Dependency care: a challenge for Social Security in the recovery from the crisis

This situation is alarming under normal circumstances, in particular from an ethical and social justice perspective, but it becomes tragic in the event of a health crisis. This is especially true today during the Covid-19 crisis. As the virus wreaks havoc in these institutions, nursing homes lack the medical staff needed to prescribe and administer the medications that keep patients alive and maintain their cognitive functions, or to provide end-of-life care (Midazolam, Perfalgan). Staff members who are not considered caregivers had to wait for public authorities to decide to provide them with protective equipment, even though such equipment is critical to protecting high-risk residents. And while these residents are isolated in their rooms and visits are prohibited, employees do not have the time to comfort and support them at this difficult time.

These tragic circumstances call for a drastic rethinking of the nursing home model as many reports have suggested (Mission Flash Iborra-Fiat in 2018, Libault report in 2019). The fact is that these issues are related to the way the sector is funded.

While the various studies have assessed funding requirements for nursing homes at between €7 billion and €10 billion, establishing a way to cover care for dependent persons within the healthcare sector, accompanied by increased resources based on needs, would have the advantage of doing away with the impossible separation between ‘cure’ and ‘care’, which has been maintained up to now for budgetary reasons but which has shown its limitations, both in terms of managing hospitals and caring for dependent persons.


Laura Nirello, Assistant Professor in economics, IMT Lille Douai – Institut Mines-Télécom and Ilona Delouette, PhD student in economics, University of Lille

This article has been republished from The Conversation under a Creative Commons license. Read the original article (in French).