
H2sys: hydrogen in the energy mix

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot institute (TSN), to which IMT and Femto Engineering belong.

[divider style=”normal” top=”20″ bottom=”20″]

H2sys is helping make hydrogen an energy of the future. This spin-off company from the FCLAB and Femto-ST laboratories in Franche-Comté offers efficient solutions for integrating hydrogen fuel cells. Some examples of these applications include generators and low-carbon urban mobility. And while the company was officially launched only 6 months ago, its history is closely tied to the pioneers of hydrogen technology from Franche-Comté.

 

1999, the turn of the century. Political will was focused on the new millennium and energy was already a major industrial issue. The end of the 90s marked the beginning of escalating oil prices after over a decade of price stability. In France, the share of investment in nuclear energy was waning. The quest for other forms of energy production had begun, a search for alternatives worthy of the 2000s. This economic and political context encouraged the town of Belfort and the local authorities of the surrounding region to invest in hydrogen. Thus, the FCLAB research federation was founded, bringing together the laboratories working on this theme. Almost two decades later, Franche-Comté has become a major hub for the discipline. FCLAB is the first national applied research community to work on hydrogen energy and the integration of fuel cell systems. It also integrates a social sciences and humanities research approach which looks at how our societies adopt new hydrogen technologies. This federation brings together 6 laboratories including FEMTO-ST and is under the aegis of 10 organizations, including the CNRS.

It was from this hotbed of scientific activity that H2sys was born. Described by Daniel Hissel, one of its founders, as “a human adventure”, the young company’s history is intertwined with that of the Franche-Comté region. First, because it was created by scientists from FCLAB. Daniel Hissel is himself a professor at the University of Franche-Comté and leads a team of researchers at Femto-ST, both of which are partners of the federation. Second, because the idea at the heart of the H2sys project grew out of regional activity in the field of hydrogen energy. “As a team, we began our first discussions on the industrial potential of hydrogen fuel cell systems as early as 2004-2005,” Daniel Hissel recalls. The FCLAB teams were already working on integrating these fuel cells into energy production systems. However, the technology was not yet sufficiently mature. The fundamental work did not yet target large-scale applications.

Ten more years would be needed for the uses to develop and for the hydrogen fuel cell market to truly take shape. In 2013, Daniel Hissel and his colleagues watched intently as the market emerged. “All that time we had spent working to integrate the fuel cell technology provided us with the necessary objectivity and allowed us to develop a vision of the future technical and economic issues,” he explains. The group of scientists realized that it was the right time to start their business. They created their project the same year. They quickly received support from the Franche-Comté region, followed by the Technology Transfer Accelerator (SATT) in the Grand Est region and the Télécom & Société Numérique Carnot institute. In 2017, the project officially became the company H2sys.

Hydrogen vs. Diesel?

The spin-off now offers services for integrating hydrogen fuel cells based on its customers’ needs. It focuses primarily on generators ranging from 1 to 20 kW. “Our goal is to provide electricity to isolated sites to meet needs on a human scale,” says Daniel Hissel. The applications range from generating electric power for concerts or festivals to supporting rescue teams responding to road accidents or fires. The solutions developed by H2sys integrate expertise from FCLAB and Femto-ST, whose research involves work in system diagnosis and prognosis aimed at understanding and anticipating failures, lifespan analysis, predictive maintenance and artificial intelligence for controlling devices.

Given their uses, H2sys systems are in direct competition with traditional generators which run on combustion engines—specifically diesel. However, while the power ranges are similar, the comparison ends there, according to Daniel Hissel, since the hydrogen fuel cell technology offers considerable intrinsic benefits. “The fuel cell is powered by oxygen and hydrogen, and only emits energy in the form of electricity and hot water,” he explains. The lack of pollutant emissions and exhaust gas means that these generators can be used indoors as well as outdoors. “This is a significant benefit when indoor facilities need to be quickly installed, which is what firefighters sometimes must do following a fire,” says the co-founder of the company.

Another argument is how unpleasant it is to work near a diesel generator. Anyone who has witnessed one in use understands just how much noise and pollutant emissions the engine generates. Hydrogen generators, on the other hand, are silent and emit only water. Their maintenance is also easier and less frequent: “Within the system, the gases react through an electrolyte membrane, which makes the technology much more robust than an engine with moving parts,” Daniel Hissel explains. All of these benefits make hydrogen fuel cells an attractive solution.

In addition to generators, H2sys also works on range extenders. “This is a niche market for us because we do not yet have the capacity to integrate the technology into most vehicles,” the researcher explains. However, the positioning of the company does illustrate the existing demand for solutions that integrate hydrogen fuel cells. Daniel Hissel sees even more ambitious prospects. While the electric yield of these fuel cells is much better than that of diesel engines (55% versus 35%), the hot water they produce can also be recovered for various purposes. Many different options are being considered, including a water supply network for isolated sites, or for household consumption in micro cogeneration units for electricity and heating.
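
The yield figures above lend themselves to a quick back-of-the-envelope comparison. The sketch below uses only the 55% and 35% electrical efficiencies quoted in the text; the `heat_recovery` fraction is a purely illustrative assumption, since the share of the hot water that can actually be reused depends on the installation.

```python
# Back-of-the-envelope comparison of useful energy per kWh of fuel energy,
# using the electrical efficiencies quoted in the article.
FUEL_CELL_ELECTRIC_EFF = 0.55  # hydrogen fuel cell (article figure)
DIESEL_ELECTRIC_EFF = 0.35     # diesel generator (article figure)

def useful_energy(fuel_kwh, electric_eff, heat_recovery=0.0):
    """Return (electricity, recovered_heat) in kWh.

    heat_recovery is the fraction of the non-electric losses recovered
    as usable hot water: an illustrative assumption, not a measured value.
    """
    electricity = fuel_kwh * electric_eff
    heat = fuel_kwh * (1 - electric_eff) * heat_recovery
    return electricity, heat

elec, heat = useful_energy(1.0, FUEL_CELL_ELECTRIC_EFF, heat_recovery=0.5)
print(f"Fuel cell: {elec:.2f} kWh electricity + {heat:.3f} kWh recovered heat")
elec_d, _ = useful_energy(1.0, DIESEL_ELECTRIC_EFF)
print(f"Diesel:    {elec_d:.2f} kWh electricity")
```

Even with no heat recovery at all, the fuel cell delivers more electricity per unit of fuel energy; cogeneration of hot water only widens the gap.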

But finding new uses through intelligent integrations is not the only challenge facing H2sys. As a spin-off company from research laboratories, it must continue to drive innovation in the field. “With FCLAB, we were the first to work on diagnosing hydrogen fuel cell systems in the 2000s,” says Daniel Hissel. “Today, we are preparing the next move.” Their sights are now set on developing better methods for assessing the systems’ performance to improve quality assurance. By contributing to making the technology safer, H2sys is heavily involved in developing fuel cells. And the technology’s maturation since the early 2000s is now producing results: hydrogen is now attracting the attention of manufacturers for the large-scale storage of renewable energies. Will this technology therefore truly be that of the new millennium, as foreseen by the pioneers of the Franche-Comté region in the late 90s? Without going that far, one thing is certain: it has earned its place in the energy mix of the future.

 

[box type=”shadow” align=”” class=”” width=””]

A guarantee of excellence
in partnership-based research since 2006

 

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]


Startups AskHub, DessIA and WaToo receive interest-free loans

On June 7, the Digital Fund of the Grandes Ecoles and Universities Initiative selected three new startups to receive interest-free loans. AskHub and DessIA, from ParisTech Entrepreneurs, the Télécom ParisTech incubator, and WaToo, from the IMT Atlantique incubator, will each receive a €20,000 interest-free loan. These financial aid programs, co-financed by Fondation Mines-Télécom, Caisse des Dépôts and Revital’Emploi, provide these growing companies with the funds they need to pursue their development.

 

[one_half][box type=”shadow” align=”” class=”” width=””]

AskHub is a platform that analyzes requests that were not understood by chatbots and then offers an ecosystem of ready-to-use chat plug-ins to improve the user experience. Find out more

[/box]

[/one_half]

[one_half_last]

[box type=”shadow” align=”” class=”” width=””]

DessIA is design software for mechanical engineering. Using an approach based on artificial intelligence, the software can select the solution best adapted to users’ needs from among billions of possibilities. Find out more

[/box]

[/one_half_last]

[box type=”shadow” align=”” class=”” width=””]


WaToo offers a solution to prevent the misappropriation and falsification of sensitive documents by authorized users by concealing digital watermarks in the documents to protect them. Find out more

[/box]

Learn more about previous winners

 


GDPR comes into effect. Now it’s time to think about certification seals!

The new European Personal Data Protection Regulation (GDPR) comes into effect on May 25. Out of the 99 articles contained in the regulation, two are specifically devoted to the question of certification. While establishing seals to demonstrate compliance with the regulation seems like a good idea in order to reassure citizens and economic stakeholders, a number of obstacles stand in the way.

 

Certification marks are ubiquitous these days since they are now used for all types of products and services. As consumers, we have become accustomed to seeing them everywhere: from the organic farming label for products on supermarket shelves to Energy certification for appliances. They can either be a sign of compliance with legislation, as is the case for CE marking, or a sign of credibility displayed by a company to highlight its good practices. While it can sometimes be difficult to make sense of the overwhelming number of seals and marks that exist today, some of them represent real value. AOC appellations, for example, are well-known and sought out by many consumers. So, why not create seals or marks to display responsible personal data management?

While this may seem like an odd question to citizens who see these seals as nothing more than red labels on free-range chicken packaging, the European Union has taken it into consideration. So much so that Articles 42 and 43 of the new European Data Protection Regulation (GDPR) are devoted to this idea. The text encourages the creation of seals and marks to enable companies established in the EU that process citizens’ data responsibly to demonstrate their compliance with the regulation. On paper, everything points to the establishment of clear signs of trust in relation to personal data protection.

However, a number of institutional and economic obstacles stand in the way.  In fact, the question of seals is so complicated that IMT’s Personal Data Values and Policies Chair* (VPIP) has made it a separate research topic, especially in terms of how the GDPR affects the issue. This research, carried out between the adoption of the European text on April 14, 2016 and the date it is set to come into force, May 25, 2018, has led to the creation of a work of more than 230 pages entitled Signes de confiance : l’impact des labels sur la gestion des données personnelles (Signs of Trust — the impact of seals on personal data management).

For Claire Levallois-Barth, a researcher in Law at Télécom ParisTech and coordinator of the publication, the complexity stems in part from the number and heterogeneity of personal data protection marks. In Europe alone, there are at least 75 different marks, with a highly uneven geographic distribution. “Germany alone has more than 41 different seals,” says the researcher. “In France, we have nine, four of which are granted by the CNIL (National Commission for Computer Files and Individual Liberties).” Meanwhile, the United Kingdom has only two and Belgium only one. Each country has its own approach, largely for cultural reasons. It is therefore difficult to make sense of such a disparate assortment of marks with very different meanings.

Seals for what?

Because one of the key questions is: what should the seal describe? Services? Products? Processes within companies? “It all depends on the situation and the aim,” says Claire Levallois-Barth. Until recently, the CNIL granted the “digital safe box” seal to certify that a service respected “the confidentiality and integrity of data that is stored there” according to its own criteria. At the same time, the Commission also has a “Training” seal that certifies the quality of training programs on European or national legislative texts. Though both were awarded by the same organization, they do not have the same meaning. So saying that a company has been granted “a CNIL seal” provides little information. One must delve deeper into the complexity of these marks to understand what they mean, which seems contradictory to the very principle of simplification they are intended to represent.

One possible solution could be to create general seals to encompass services, internal processes and training for all individuals responsible for data processing at an organization. However, this would be difficult from an economic standpoint. For companies it could be expensive — or even very expensive — to have their best practices certified in order to receive a seal. And the more services and teams there are to be certified, the more time and money companies would have to spend to obtain this certification.

On March 31, 2018, the CNIL officially transitioned from a labeling activity to a certification activity.

The CNIL has announced that it would stop awarding seals for free. “The Commission has decided that once the GDPR comes into effect it will concentrate instead on developing or approving certification standards. The seals themselves will be awarded by accredited certification organizations,” explains Claire Levallois-Barth. Afnor Certification or Bureau Veritas, for example, could offer certifications for which companies would have to pay. This would allow them to cover the time spent assessing internal processes and services, analyzing files, auditing information systems etc.

And for all the parties involved, the economic profitability of certification seems to be the crux of the issue. In general, companies do not want to spend tens of thousands, or even hundreds of thousands, of euros on certification just to receive a little-known seal. Certification organizations must therefore find the right formula: comprehensive enough to make the seal valuable, but without representing too much of an investment for most companies.

While it seems unlikely that a general seal will be created, some stakeholders are examining the possibility of creating sector-specific seals based on standards recognized by the GDPR, for cloud computing for example. This could occur if criteria were approved, either at the national level by a competent supervisory authority within a country (the CNIL in France), or at the European Union level by the European Data Protection Board (EDPB). A critical number of seals would then have to be granted. GDPR sets out two options for this as well.

According to Article 43 of the GDPR, certification may either be awarded by the supervisory authorities of each country, or by private certification organizations. In France, the supervisory authority is the CNIL, and certification organizations include Afnor and Bureau Veritas. These organizations are themselves monitored. They must be accredited either by the supervisory authority, or by the national accreditation body, which is the COFRAC in France.

This naturally leads to the question: if the supervisory authorities develop their own sets of standards, will they not tend to favor the accreditation of organizations that use these standards? Eric Lachaud, a PhD student in Law and Technology at Tilburg and guest at the presentation of the work by the Personal Data Values and Policies Chair on March 8, says, “this clearly raises questions about competition between the sets of standards developed by the public and private sectors.” Sophie Nerbonne, Director of Compliance at the CNIL, who was interviewed at the same event, says that the goal of the national commission is “not to foreclose the market but to draw on [its] expertise in very precise areas of certification, by acting as a data protection officer.”

A certain vision of data protection

It should be acknowledged, however, that the area of expertise of a supervisory authority such as the CNIL, a pioneer in personal data protection in Europe, is quite vast. Beyond serving as a data protection officer responsible for ensuring compliance with the GDPR within an organization that has appointed it, the CNIL, as an independent authority, is in charge of regulating issues involving personal data processing, governance and protection, as indicated by the seals it has granted until now. It is therefore hard to imagine that the supervisory authorities would not emphasize this broad expertise.

And even more so since not all the supervisory authorities are as advanced as the CNIL when it comes to certification in relation to personal data. “So competition between the supervisory authorities of different countries is an issue,” says Eric Lachaud. Can we hope for a dialogue between the 28 Member States of the European Union in order to limit this competition? “This leads to the question of having mutual recognition between countries, which has still not been solved,” says the Law PhD student. As Claire Levallois-Barth is quick to point out, “there is a significant risk of ‘a race to the bottom’.” However, there would be clear benefits. By recognizing the standards of each country, the countries of the European Union have the opportunity to give certification a truly transnational dimension, which would make the seals and marks valuable throughout Europe, thereby making them shared benchmarks for the citizens and companies of all 28 countries.

The high stakes of harmonization extend beyond the borders of the European Union. While the CE standard is criticized at times for how easy it is to obtain in comparison to stricter national standards, it has successfully imposed certain European standards around the world.  Any manufacturer that hopes to reach the 500 million-person market that the European Union represents must meet this standard. For Éric Lachaud, this provides an example of what convergence between the European Member States can lead to: “We can hope that Europe will reproduce what it has done with CE marking: that it will strive to make the voices of the 28 states heard around the world and to promote a certain vision of data protection.”

The uncertainties surrounding the market for seals must be offset by the aims of the GDPR. The philosophy of this regulation is to establish strong legislation for technological changes with a long-term focus. In one way, Articles 42 and 43 of the GDPR can be seen as a foundation for initiating and regulating a market for certification. The current questions being raised then represent the first steps toward structuring this market. The first months after the GDPR comes into effect will define what the 28 Member States intend to build.

 

*The Personal Data Values and Policies Chair brings together the Télécom ParisTech, Télécom SudParis graduate schools, and Institut Mines-Télécom Business School. It is supported by Fondation Mines-Télécom.

[box type=”info” align=”” class=”” width=””]

Personal data certification seals – what is the point?

For companies, having a personal data protection seal allows them to meet the requirements of accountability imposed by article 24 of the GDPR. It requires all organizations responsible for processing data to be able to demonstrate compliance with the regulation. This requirement also applies to personal data subcontractors.

This is what leads many experts to think that the primary application for seals will be business-to-business relationships rather than business-to-consumer relationships. SME economic stakeholders could seek certification in order to meet growing demand amongst their customers, especially major firms, for compliance in their subcontracting operations.

Nevertheless, the GDPR is a European regulation. This means that compliance is assumed: all companies are supposed to abide by the regulation as soon as it comes into effect. A compliance seal cannot therefore be used as a marketing tool. It is, however, likely that the organizations responsible for establishing certification standards will choose to encourage seals that go beyond the requirements of the GDPR. In this case, stricter control over personal data processing than what is called for by the legislation could be a valuable way to set a company apart from its competitors. [/box]


Astatine halogen bond finally revealed!

Astatine is the last member of the halogen family, which also includes fluorine and chlorine. These chemical elements have a distinct feature: they are able to form an unusual kind of bond with molecules. Yet for astatine, the existence of this specific halogen bond had never before been proven, because astatine is the rarest naturally occurring element. Now all that has changed. The bond was revealed thanks to work by Subatech (a research unit including IMT Atlantique, CNRS and the University of Nantes) and the CEISAM. Their results were published in the prestigious journal Nature Chemistry on March 19.

 

Fluorine, chlorine, bromine, iodine? Toothpaste, swimming pool, photographic film, fish! Four chemical elements, and four objects that even those of us who are unscientific can associate with them. At first glance, brushing your teeth and swimming have little in common. And yet the four chemical elements mentioned above are all part of the same family: the halogen family. In fact, incandescent “halogen” lamps owe their name to the iodine and bromine contained in their bulbs. This just proves that in our lives we are often in contact with halogens, sometimes daily, for example with the chlorine that makes up half of our table salt (the other half is sodium). These elements are also well known to chemists, who in the early 20th century brought to light their ability to create an unusual type of bond: the halogen bond. These bonds are weaker than typical chemical bonds yet are significant enough to play a role in the development of liquid crystals, conductive polymers, and nanoporous materials.

However, this chemical family includes an element that is more discreet. Astatine, the fifth and last member of the group, is not very sociable: you’re unlikely to have come across this one during your lifetime. This is because scientists estimate the entire amount of astatine in the earth’s crust at less than 30 grams. It is the least abundant of all the elements naturally present on Earth. Its scarcity makes it difficult to study, and researchers even questioned its ability to form halogen bonds. The mystery was even more intriguing since experiments had so far shown a link between an atom’s properties and the strength of the halogen bonds it forms: astatine was suspected to be the source of the strongest halogen bonds. However, this still needed to be proven experimentally.

Now this has been accomplished, thanks to the Subatech and CEISAM teams, research laboratories which include participants from IMT Atlantique, CNRS and the University of Nantes. The work published on March 19 in the prestigious Nature Chemistry journal not only revealed astatine’s ability to create a halogen bond but confirmed that it is the strongest of its kind.  These results greatly contribute to understanding this element which is so difficult to study due to its extreme rarity. “The halogen bond shows that it is possible to form stable molecular structures with astatine,” explains Julie Champion, a chemist at Subatech. “This is very interesting for alpha-immunotherapy applications in particular.”

This radiotherapy technique involves introducing molecules which emit specific radioactive radiation—alpha particles—into the body to target cancer cells, for example. Since some astatine isotopes are radioactive and emit alpha radiation, the element is considered to be a good candidate for alpha-immunotherapy. Moreover, its scarcity stems from its short lifespan: after approximately eight hours, half of any quantity of astatine-210, the isotope with the longest lifetime, has already disintegrated. This characteristic represents a great advantage for treatment, since astatine’s rapid disintegration limits side effects. Yet difficulties remain despite the first encouraging in vitro and in vivo attempts. The demonstration of astatine’s ability to form halogen bonds opens new areas to explore with the goal of strengthening the connection between astatine and new biomolecules, which could potentially lead to more effective protocols for alpha-immunotherapy.
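
The eight-hour figure is a half-life, so the quantity remaining after a given time follows the usual exponential decay law. A minimal sketch, using the approximate 8-hour half-life of astatine-210 quoted above:

```python
def fraction_remaining(hours, half_life_hours):
    """Fraction of a radioactive sample left after `hours`,
    from the decay law N / N0 = 0.5 ** (t / T_half)."""
    return 0.5 ** (hours / half_life_hours)

# Astatine-210, with the ~8-hour half-life quoted in the article:
print(fraction_remaining(8, 8))   # one half-life: 0.5
print(fraction_remaining(72, 8))  # after three days, about 0.2% is left
```

This is why, as described below, the chemists face a race against time: three days after production, almost none of the sample remains.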

Working with the rarest element in the world

To demonstrate the existence of this bond, researchers had to adapt to the chemical element’s constraints. “We worked with isotope 211, which has an even shorter lifetime than isotope 210: after three days there is not enough astatine left for the experiments,” Julie Champion explains. The chemists had to be cunning. First, it is impossible to extract a few grams of astatine from the Earth’s crust; it must be produced artificially. “This is why we work with the Arronax cyclotron in Nantes.  This instrument is a particle accelerator used specifically for the alpha particles we use to bombard a target containing bismuth atoms. The resulting nuclear reaction produces astatine,” the researcher explains. The race against time begins for the synthetic halogen.

It is then extracted in chloroform and transported by truck to Subatech. The precious radioactive cargo must first undergo radiation protection inspections before being inserted into small test tubes which will be used for the radiochemistry experiments. “It is important to understand that we are working at the ultra-trace level,” explains Julie Champion. The quantities are so low that we cannot see the element we are studying and cannot study it using the usual spectroscopic methods.

How then can astatine’s halogen bond be revealed in this situation? For the solution, the chemists employed techniques from nuclear metrology. These tools make it possible to detect radioactivity within the sample. Two immiscible liquid phases are present in the test tubes. The first is an aqueous phase, containing the chemical species that are soluble in water. The second phase is called organic and contains the chemical species which are not soluble in water. The principle is the same as that of a drop of oil in a glass of water: the two liquids do not mix. Astatine, as it is extracted from the cyclotron, is usually present in the aqueous phase. However, when it forms a halogen bond with a molecule composed primarily of carbon and hydrogen, the astatine returns to the organic phase. By observing the radioactive emissions—in other words astatine’s signature—in both phases, the chemists were able to see astatine transitioning from one phase to the other.
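
One common way to quantify this kind of liquid-liquid partitioning is a distribution ratio: the radioactivity measured in the organic phase divided by that measured in the aqueous phase. The sketch below is illustrative only; the activity values are made-up numbers, not data from the experiment.

```python
def distribution_ratio(activity_organic, activity_aqueous):
    """Distribution ratio D of a species between two immiscible phases,
    computed from the radioactivity (e.g. counts per second) in each."""
    return activity_organic / activity_aqueous

# Hypothetical counts: most of the astatine has moved to the organic phase,
# the signature of complexation (here, a halogen bond) with an organic molecule.
print(distribution_ratio(950.0, 50.0))  # D = 19.0
```

A ratio well above 1 indicates that the astatine has left the water for the organic phase, which is precisely the transition the chemists tracked.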

In addition to the alpha-immunotherapy applications, this discovery paves the way for further research. In the experiment that was carried out, the Subatech and CEISAM teams first attached astatine to an iodine atom through a classical bond; this species then formed the halogen bond with a molecule. Yet iodine can also form a halogen bond! Could it therefore be possible to create two halogen bonds, one with astatine and the other with iodine? This is the type of question that Julie Champion and her colleagues hope to study soon.

 


How working classes use digital tools: The Facebook example

For over a decade now, the use of digital tools and internet connectivity has greatly developed among households, including among working classes. Yet very few studies exist on this part of the population’s specific uses of digital technology. In the context of the Poplog project, which counts Télécom ParisTech among its partners, Dominique Pasquier, a researcher in sociology, has studied this question through interviews and with help from a data set from Facebook accounts.*

 

Among low-income households, internet connectivity figures have skyrocketed. According to INSEE (the French national institute for statistics and economic studies), in 2006, 47.9% of employees and 37% of manual workers had access to the Internet at home. Ten years later, these figures had risen to 88% among manual workers and 91.5% among employees. In the space of a decade, internet use became fully integrated into the daily lives of the working classes.

Yet, within the social sciences, barely any studies have focused on how the working classes relate to digital technology. “There is no reason to believe that internet uses are the same at the top and bottom of the social ladder,” explains Dominique Pasquier, researcher in sociology at Télécom ParisTech and Director of Research at the CNRS (the French national center for scientific research).

This observation is what led to the creation of the Poplog project. Funded by the ANR (the French National Research Agency), the partners for this project include Télécom ParisTech, the Centre Atlantique de Philosophie and Université de Bretagne Occidentale. The researchers looked at the use of digital technology among working classes with stable employment. Unlike very low-income classes that live on the outskirts of urban areas, the studied individuals live in rural areas and most own their own home. “This fraction of the population consists primarily of traditional families; there are very few single-parent families,” Dominique Pasquier explains. “In general, few have earned degrees and they work as manual workers or employees.”

In the framework of this project, in order to study this category of the population and its relationship with digital tools, Dominique Pasquier looked specifically at how they use Facebook.

 

Data from Facebook accounts as research material

The researcher in sociology first attempted to collect information using various survey methods, particularly interviews. Yet very few people responded positively to requests for interviews. These difficulties are common in general sociology, according to Dominique Pasquier, especially when the study focuses on working classes. “These individuals do not have a clear understanding of what sociology is and do not see the point of these discussions,” she notes. “And this is a group that primarily welcomes family to their homes, but not strangers. Therefore, we face a rejection phenomenon.”

This problem was avoided thanks to another project called Algopol, led by the Center for Social Analysis and Mathematics, Orange Labs France Télécom, LIAFA and Linkfluence from 2012 to 2015. The team carried out a major survey of Facebook networks, recording and anonymizing data from approximately 15,000 accounts. Only 50 of the 15,000 accounts matched the social profiles Poplog was interested in, a number well suited to a qualitative study of the data.

“The principle was that I was not allowed to meet the people who owned these accounts,” Dominique Pasquier explains. “The only information I had was their age, sex, municipality of residence, number of friends and the content they exchanged, excluding personal photos.” Yet this limited content was sufficient for conducting a sociological analysis of this data. Especially since this content complemented the information obtained during the interviews. “The two different formats do not provide the same insights,” the researcher continues. “The Facebook data reveals discussions in which the sociologist was not involved. Whereas during an interview, the person wants to give a good impression of themselves and therefore will not talk about certain subjects.”

Certain topics related to the use of digital technology came up only in the interviews, such as searching for information or making online purchases. Other topics appeared only on Facebook, such as employment problems or difficulties related to undesired singleness, a reality that affects unskilled male workers in particular.

 

Significant variations in how Facebook is used

“The 50 accounts were exactly what I was looking for: adults between 30 and 50 years old who live in rural areas and are manual workers or work in personal care services,” Dominique Pasquier explains. “This is where we saw that the uses of Facebook are extremely varied.” There were many different types of users: some attempt to use the social network but do not know what to say, do not receive enough feedback and give up. Others try their hardest to attract attention, sharing ready-made catchphrases and impressive links. Some are very prolific in sharing events from their daily life, whereas others never talk about this aspect.
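The selection step described above amounts to filtering accounts on a handful of demographic criteria. A minimal sketch in Python, purely illustrative (the field names and sample records are invented; the Algopol data itself is not reproduced here):

```python
# Hypothetical, simplified account records; the real Algopol fields differ.
accounts = [
    {"id": 1, "age": 42, "area": "rural", "occupation": "manual worker"},
    {"id": 2, "age": 27, "area": "urban", "occupation": "engineer"},
    {"id": 3, "age": 35, "area": "rural", "occupation": "personal care"},
]

def matches_profile(account):
    """Adults aged 30-50, living in rural areas, manual or care workers."""
    return (30 <= account["age"] <= 50
            and account["area"] == "rural"
            and account["occupation"] in {"manual worker", "personal care"})

selected = [a["id"] for a in accounts if matches_profile(a)]
print(selected)  # → [1, 3]
```

Such a filter reduces a large anonymized corpus to the small subset suited to qualitative analysis.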

However, certain behaviors and phenomena were frequently observed throughout this selection of accounts. “There is a whole set of phrases about life that express a kind of circulating common ethic. During the interviews, people called them ‘quotes’,” Dominique Pasquier explains. “Furthermore, when someone posts a status update, those who respond are intergenerational and both male and female.”

Finally, some of what men shared about romantic difficulties, situations of undesired singleness or separation, caught Dominique Pasquier’s attention. She analyzed these comments and how others responded to them. “Some of what was shared was very aggressive, with misogynistic remarks. In this case, the comment always drew a response from the poster’s contacts, especially from women, who countered the remarks.”

The researcher’s goal was to analyze both what is shared on the social network and others’ reactions to it: “I analyze this content as things the individuals considered worthy of sharing and making known to their Facebook contacts who, in the context of this group of individuals from working classes with stable employment, are primarily made up of close friends and family.”

 

A different use of digital tools

“I think this survey also demonstrates that these individuals are faring well with the internet, but in a completely different way,” Dominique Pasquier explains. “In the case of Facebook, the social network is mainly used to maintain a kinship group.”

Through these interviews and analysis, the researcher noticed other specific features in the use of digital tools among the studied population. “It is a social universe that presents different uses, and it is important for the public authorities to be aware of this,” says Dominique Pasquier. Public policy is indeed moving towards establishing fully digital means of communication via email with social assistance institutions such as Pôle Emploi and the Caisse d’Allocations Familiales. This digital transformation poses a problem. In the course of her study, the researcher observed that the individuals she surveyed did not use email as a means of interpersonal communication; they used it only to make purchases. “These email addresses are shared by spouses or the entire family. With all the online purchases, the emails from Pôle Emploi will be lost among hundreds of spam emails and ads,” the researcher observes. “There is also a sort of rage that develops among this population, because of this inability to contact each other.”

This shows how important it is to continue this work on the issue of digital technology and its use by working classes… while remaining vigilant. Although many sociology students are interested in studying digital corpora, these types of materials pose methodological problems. “Much of the data is anonymous; we often do not know who has produced it,” Dominique Pasquier explains. “Also, we often do not realize that 50% of online participation is produced by 1% of the population, by heavy contributors. We therefore mistake anecdotal occurrences for mass social phenomena.” Yet despite these challenges, digital data has “enormous potential, since we can work on large volumes of data and network phenomena,” offering enough information to understand how certain social groups are structured.
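The warning about heavy contributors (a small fraction of users producing half of all online participation) is easy to check on any corpus of per-user post counts. A minimal sketch, using synthetic data:

```python
def top_share(post_counts, top_fraction=0.01):
    """Share of total posts produced by the most active `top_fraction` of users."""
    counts = sorted(post_counts, reverse=True)
    k = max(1, int(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Synthetic corpus: 3 heavy contributors among 300 users
posts = [500, 400, 300] + [2] * 297
print(f"Top 1% of users produce {top_share(posts):.0%} of posts")  # → 67%
```

Computing this share before interpreting a corpus guards against mistaking the output of a few prolific accounts for a mass phenomenon.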

 

[box type=”shadow” align=”” class=”” width=””]* The i3 seminar on digital data analysis methodologies in the social sciences

The Poplog project and Dominique Pasquier’s research were presented at the Methods for the Analysis of Online Participation Seminar, organized by i3, a joint CNRS research unit of which Télécom ParisTech is a member. This seminar, which will run through June 2018, focuses on issues surrounding methods for processing digital data for research in the humanities and social sciences. The discussions focus on how the corpus is formed, analysis methods and the relationship between digital data and conventional survey methods.[/box]

 

MT 180, surgery, chirurgie, thesis, thèse

MT 180: 3D organ models facilitate surgery on children

Alessio Virzì, Biomedical Engineer – PhD student in Medical Image Processing, Télécom ParisTech – Institut Mines-Télécom, Université Paris-Saclay

The original version of this article (in French) was published on The Conversation, in connection with Alessio Virzì’s participation in the competition “My thesis in 180 seconds”.
[divider style=”normal” top=”20″ bottom=”20″]

What area have you chosen to focus on for your thesis?

I am interested in developing new tools for processing medical images for pediatric minimally invasive pelvic surgery.

This type of surgery involves a technique in which small incisions are made to enable the surgeon to reach the target area while limiting surgical trauma as much as possible. The technique involves robotic tools and a video imaging system that helps guide the surgeon’s movements. Due to the anatomical complexity of a child’s pelvis, the surgical planning stage, based on the study of medical imaging, is a crucial step.

What will your thesis work contribute to this area?

My thesis proposes new methods for processing medical images adapted to children with pelvic tumors or malformations. I am developing IT solutions for generating virtual 3D models of organs, tumors and nerve fibers based on MRI images.

Surgeons can view these 3D models before and during the surgery, thus improving the way they plan for the operation and providing more information than a simple video imaging system.

For example, I used artificial intelligence to analyze the pixels in the images to detect the different anatomical structures. I also used generic models of organs that I then adapted to the child to obtain the final 3D model.
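As a rough illustration of the general idea (not the actual method used in the thesis, which relies on far richer models and features), classifying pixels by intensity can be sketched as a nearest-class-mean rule:

```python
# Toy training pixels: (intensity, label), where 1 = organ, 0 = background.
train = [(0.71, 1), (0.68, 1), (0.74, 1), (0.19, 0), (0.22, 0), (0.17, 0)]

def class_means(samples):
    """Mean intensity of each class in the labeled training pixels."""
    sums, counts = {}, {}
    for value, label in samples:
        sums[label] = sums.get(label, 0.0) + value
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(value, means):
    # Assign the pixel to the class whose mean intensity is closest
    return min(means, key=lambda label: abs(means[label] - value))

means = class_means(train)
print(classify(0.70, means), classify(0.20, means))  # → 1 0
```

Real MRI segmentation uses many more features per pixel and spatial regularization, but the principle of learning class statistics from labeled examples and assigning each pixel to the closest class is the same.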

I integrate all of these IT methodologies into interactive software that makes the surgeon the main actor in the image analysis process. In addition, thanks to this software, surgeons can easily verify the quality of the 3D models obtained and can make corrections as needed, based on their anatomical knowledge.

I developed this software based on existing open-source software that I improved by integrating specific models for MRI images of children’s pelvises.

It was very important for me to offer a tool that could easily be used in a clinical context by surgeons who are not specialized in computer science.

Example of a 3D model of the pelvis (right) obtained by processing MRI images (left).

What challenges have you faced along the way?

The first challenge was the limited amount of scientific literature on this topic, since this research area is underexplored. I therefore had to base my work on medical imaging studies on other anatomical structures like the adult brain.

Another major challenge was the need to develop methods that could be used in clinical practice. They needed to be extremely effective and easy for surgeons to use. This required additional efforts in the design and development of the software.

My communication with surgeons and radiologists played a crucial role in developing my research and allowed me to discover anatomical knowledge that I had not necessarily been aware of before and helped me understand their requirements for IT tools.

When did you decide to start a thesis?

My desire to do a thesis first arose during a research internship I did during my second year of Master’s studies in biomedical engineering, which provided an opportunity to work on new applications in neuro-imaging.

In the future, I would like to continue working in the medical field because I find this area very motivating. My desire to find new applications has led me to explore the possibility of working in medical imaging in the private sector.

What are your thoughts on “My Thesis in 180 Seconds”?

I think the ability to share scientific knowledge is a key skill for researchers.

Unfortunately, I believe this aspect is not sufficiently present in scientific training. As researchers, we often have the unfortunate habit of using terms that are too specific and not accessible for non-scientists. Yet it is essential for us to help everyone understand what we are doing, both to demonstrate the importance of our work and stimulate its development.

This experience will definitely help me improve my skills in popularizing scientific knowledge and help me become more comfortable presenting this information to the public. It is also a very motivating challenge to have to present my thesis in a language that is not my first language and that I began using three years ago when I arrived here from Italy.

 

 

non-destructive inspection

Medicine for Materials

Did you know that materials have health problems too? To diagnose their structural integrity, researchers are increasingly using techniques similar to those used in the medical field for humans. X-rays, temperature checks and ultrasound imaging are just a few of the tools that help detect abnormalities in parts. The advantage of these various techniques is that they are non-destructive. When used together, they can provide a great deal of information about a mechanical system without taking it out of service. Salim Chaki is one of the French pioneers in this area. The researcher with IMT Lille Douai explains why manufacturers are keeping a close watch on the latest advances in this field.

 

What is the principle behind a non-destructive inspection?

Salim Chaki: It is a set of techniques that can provide information about a part’s state of health without modifying it. Before these techniques were developed, the traditional approach involved cutting up a defective part and inspecting it to identify the defect. With the non-destructive method, the philosophy is the same as that of human medicine: we use x-rays and ultrasounds, for example, to study what is inside the part, or infrared thermography to take its surface temperature to detect abnormalities. The development of nuclear energy during the post-war period demanded these kinds of techniques, since radioactivity introduced new constraints on handling radioactive objects.

Your research involves a non-destructive “multi-technical” approach. What is the advantage of this approach?

SC: Historically, engineers would choose to use x-rays, ultrasound or other techniques based on their needs. For several decades, manufacturers did not really consider using several techniques simultaneously, whereas in the medical world a more global approach was already being used, including a clinical examination, blood test, x-rays and possibly further tests to diagnose a patient’s illness. In 2006, we became pioneers by proposing a combination of several techniques to diagnose the structural integrity of composite parts during operation. At that point, manufacturers became very interested, convinced by the high potential of the approach. The possibility of diagnosing a defect without modifying the part and even without taking it out of service represents a major economic advantage. We demonstrated the benefit of the non-destructive multi-technical approach by using infrared cameras, optical cameras for measuring the deformation fields and passive acoustic sensors attached to the structure. These sensors pick up the sound of the vibrations emitted by the part when it cracks. Combining several non-destructive techniques therefore makes it possible to confirm the diagnosis of a part’s condition; it complements the information and improves its reliability.

 

Salim Chaki was one of the pioneers in this field when he began working on non-destructive multi-technical inspection in 2006.

Is it really that difficult for manufacturers who have not yet done this to combine two or more techniques?

SC: Yes, actually implementing several techniques is not necessarily straightforward. There are real-time technical problems related to synchronization: the data collected by one sensor must be able to be correlated, both spatially and temporally, with the data from the others. This requires all the sensors to be perfectly synchronized during the measurements. There is also a major “data processing” aspect. For example, infrared cameras record very large volumes of imaging data, whose storage and processing must then be managed. Finally, the interpretation process requires multiple skills, since the data originate from different sensors related to different fields: optics, acoustics and heat science. However, we are currently working on data processing algorithms that would facilitate the use and interpretation of data in industrial settings.
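The temporal side of this correlation problem can be sketched as nearest-timestamp alignment between two sensor streams. This is a simplified illustration with invented data, not the laboratory’s actual pipeline:

```python
import bisect

def align(reference, other, tolerance=0.05):
    """Pair each reference (time, value) sample with the other stream's
    nearest-in-time sample, if it falls within `tolerance` seconds."""
    times = [t for t, _ in other]
    pairs = []
    for t, value in reference:
        i = bisect.bisect_left(times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - t))
        if abs(times[j] - t) <= tolerance:
            pairs.append((t, value, other[j][1]))
    return pairs

thermal = [(0.00, 21.5), (0.10, 21.6), (0.20, 23.1)]   # infrared camera, °C
acoustic = [(0.01, 0.2), (0.11, 0.3), (0.60, 5.1)]     # acoustic emission
print(align(thermal, acoustic))  # the 0.20 s sample has no close match
```

In practice, hardware triggering keeps the clocks aligned; a tolerance-based merge like this is only a post-processing fallback when streams are sampled independently.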

What are the concrete applications of non-destructive multi-technical inspections?

SC: One of the most interesting applications involves pressure vessels—typically gas storage tanks. Regulations require that they be inspected periodically to assess their condition and whether they should remain in use. The non-destructive multi-technical approach not only allows this inspection to occur without emptying the tank and taking it out of service for each inspection, it could also be used to forecast the device’s remaining useful life. This is currently one of the major issues in our research. However, the multi-technique approach is still fairly recent, and therefore not many industrial applications exist. On the other hand, we believe that the future will be more conducive to multi-technical processes which will make this inspection more reliable, an aspect that is repeatedly requested by industrial equipment and plant operators, as well as by the administrative authorities responsible for their safety.

What are your lines of research now that manufacturers have begun adopting these techniques?

SC: First of all, it is important to pursue our efforts in convincing manufacturers of the advantages of multi-technical inspections, particularly the increased reliability of the inspection. There is no universal technique that offers a comprehensive diagnosis of a part’s condition. This introduces another interesting parallel with human medicine: it would be unrealistic to think a single test could detect everything. Also, as I said earlier, we are trying to go beyond the diagnosis by proposing an estimated remaining useful life for a part based on non-destructive measurements carried out while the part is in service. Very soon we will extend this concept to the inspection of parts’ initial health condition. The goal is to quickly predict if a part is healthy or not, starting at the production phase, and determine the duration of its service life. This is known as predictive maintenance.

Is the analysis of the data collected from all of the combined techniques also a research issue?

SC: Yes, of course! Since IMT Lille Douai was founded in 2017, as a result of the merger between Télécom Lille and Mines Douai, new perspectives have opened up through the synergy between our expertise in non-destructive testing of materials and our computer science colleagues’ specialization in data processing. The particular contribution of artificial intelligence algorithms and of big data to processing large volumes of data is crucial in anticipating anomalies for predictive maintenance.  If we could streamline the prognosis using these digital tools it would be a major advantage for industrial applications.

Q4Health

Q4Health: a network slice for emergency medicine

Projets européens H2020

How can emergency response services be improved? The H2020 Q4Health project raised this question. The European consortium, which includes EURECOM, the University of Malaga and RedZinc, demonstrated the possibility of relaying video between first responders at an emergency scene and doctors located remotely. To do so, the researchers had to develop innovative tools for 4G network slicing. This work has paved the way for applications for other types of services and lays the groundwork for 5G.

 

Doctors are rarely the first to intervene in emergency situations. In the event of traffic accidents, strokes or everyday accidents and injuries, victims first receive care from nearby witnesses. The response chain is such that citizens then usually hand the situation over to a team of trained first responders — which does not necessarily include a doctor — who then bring the victim to the hospital. But before the patient reaches the doctor for a diagnosis, time is precious. Patients’ lives depend on medical action being taken as early as possible in this chain. The European H2020 Q4Health project studied a video streaming solution to provide doctors with real-time images of victims at the emergency scene.

The Q4Health project, which started in January 2016 and was completed in December 2017, had to meet the challenge of ensuring that the video stream was of high enough quality to make a diagnosis. To this end, the project consortium, which includes EURECOM, the University of Malaga in Spain and the project leader, the SME RedZinc, proved the feasibility of programming a mobile 4G network that can be virtually sliced. The network “slice” created in this way includes all the functions of the regular network, from its structural portion (antennas) to its control software. It is isolated from the rest of the network and is reserved for communication between emergency response services and nearby doctors.

Navid Nikaein, a communication systems researcher at EURECOM, states: “The traditional method of creating a network slice consists of establishing a contract with an operator who guarantees the quality of service for the slice.” But there is a problem with this sort of system: emergency response services do not have complete control over the network; they remain dependent on the operator. “What we have done with Q4Health is to give emergency response services real control over inbound and outbound data traffic,” adds the researcher.

Controlling the network

In order to carry out this demonstration, the researchers developed application programming interfaces (APIs) for the infrastructure network (the central portion of the internet, which interconnects all the other access points) and for the mobile network that connects 4G devices, such as telephones, to an access point (referred to as an access network). These programming interfaces allow emergency response services to define priority levels for their members. The service can use the SIM card associated with a firefighter’s or paramedic’s professional mobile phone to identify the user’s network connection. Via the API, the service grants the paramedic privileged access to the network, enabling dynamic use of the slice reserved for emergency services.

In the Q4Health project, this privileged access allows first responders to stream video independently of data traffic in the area, which is a great advantage in crowded areas. Without such privileged access, in a packed stadium, for example, it would be impossible to transmit high-quality video over a 4G network. To ensure the quality of the video stream, a system analyzes the radio rate between the antenna and the first responders’ device (for the Q4Health project, this is not necessarily a smartphone but rather glasses equipped with a camera to facilitate emergency care). The video rate is then adjusted depending on the radio rate. “If there is a lower radio rate, video processing is optimized to prevent deterioration of image quality,” explains Navid Nikaein.
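The adjustment Nikaein describes resembles the rate-selection step used in adaptive streaming: pick the highest video encoding that fits within the measured radio rate. A minimal sketch (the bitrate ladder and the overhead margin are invented for illustration, not the project’s actual parameters):

```python
def select_video_bitrate(radio_rate_kbps, overhead=0.8):
    """Pick the highest video bitrate that fits within the measured radio
    rate, keeping a safety margin for protocol overhead."""
    ladder = [250, 500, 1000, 2500, 5000]  # available encodings, kbit/s
    budget = radio_rate_kbps * overhead
    usable = [b for b in ladder if b <= budget]
    return usable[-1] if usable else ladder[0]

print(select_video_bitrate(4000))  # budget 3200 → 2500
print(select_video_bitrate(600))   # budget 480 → 250
```

Re-running this selection as the radio rate is re-measured keeps the stream below the channel capacity, which is what prevents the image-quality degradation mentioned above.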

Through this system, first responders are able to give doctors a real-time view of the situation. These may be doctors at the hospital to which the patient will be transported, or volunteer doctors nearby who are available to provide emergency assistance. They obtain not only visual information about the victim’s condition, which facilitates diagnosis, but also a better understanding of the circumstances of the accident by observing the scene. They can therefore guide non-physician responders through delicate actions, or even allow them to perform treatment that could not be carried out without a doctor’s consent.

Beyond its medical application, Q4Health has above all proved the feasibility of network slicing through a control protocol in which the service provider, rather than the operator, has control. This demonstration is of particular interest for the development of the 5G network, which will require network slicing. “As far as I know, the tool we have developed to achieve this result is one of the first of its kind in the world,” notes Navid Nikaein. And highlighting these successful results, achieved in part thanks to EURECOM’s OpenAirInterface and Mosaic5G platforms, the researcher adds, “Week after week, we are increasingly contacted about using these tools.” This has opened up a wide range of prospects for use cases, representing opportunities to accelerate 5G prototyping. In addition to emergency response services, many other sectors could be interested in this sort of network slicing, starting with security services or transport systems.

 

political ecology, écologie politique, Fabrice Flipo, Télécom École de Management

Philosophy of science and technology in support of political ecology

Fabrice Flipo, a philosopher of science and technology and researcher at Institut Mines-Télécom Business School, has specialized in political ecology, sustainable development and social philosophy for nearly 20 years. Throughout the fundamental research that shapes his more technical teaching, he tries to produce an objective view of current political trends, the ecological impact of digital technology and an understanding of the world more broadly.

 

For Fabrice Flipo, the philosophy of science and technology can be defined as the study of how truth is created in our society. “As a philosopher of science and technology, I’m interested in how knowledge and know-how are created and in the major trends in technical and technological choices, as well as how they are related to society’s choices,” he explains. It is therefore necessary to understand technology, the organization of society and how politics shapes the interaction between major world issues.

The researcher shares this methodology with students at Institut Mines-Télécom Business School, in his courses on major technological and environmental risks and his introductory course on sustainable development. He helps students analyze the entire ecosystem (ideas, stakeholders, institutions, etc.) surrounding some of today’s most disputed technological and environmental issues and provides them with the expertise to navigate this divisive and controversial domain.

Fundamental research to understand global issues

This is why Fabrice Flipo has focused his research on political ecology for nearly 20 years. Political ecology, which first appeared in France in the 1960s, strives to profoundly challenge France’s social and economic organization and to reconsider the relationship between humans and their environment. It is rooted in the ideas of a variety of movements, including feminism, third-worldism, pacifism and self-management.

Almost 40 years later, Fabrice Flipo seeks to explain and provide insight into this political movement by examining how its emergence has created controversies with other political movements, primarily liberalism (free-market economics), socialism and conservatism. “I try to understand what political ecology is, and the issues involved, not just as a political party of its own, but also as a social movement,” explains the researcher.

Fabrice Flipo carries out his research in two ways. The first is a traditional approach to studying political theory, based on analyzing arguments and debates produced by the movement and the issues it supports. This approach is supplemented by ongoing work with the Laboratory of Social and Political Change at the University of Paris 7 Diderot and other external laboratories specializing in the subject. He works in collaboration with an interdisciplinary team of engineers, sociologists and political scientists to examine the relationship between ICT (Information and Communication Technologies) and ecology. He also involves networks linked to ecology to expand this collaboration, works with NGOs and writes and appears in specialized or national media outlets. For some of his studies, he also draws on a number of different works in other disciplines, such as sociology, history or political science.

The societal impact of political ecology

“Today political ecology is a minor movement compared to the liberal, socialist and conservative majorities,” says the researcher. Indeed, despite growing awareness of environmental issues (COP21, the development of a trade press, companies’ energy transitions, the adoption of “greener” lifestyles, etc.), the environmental movement has not had a profound effect on the organization of industrialized human societies, so it needs to be more convincing. Its minority status on the political spectrum forces it to argue its case. “Can political ecology be associated with liberalism, socialism or even conservatism?” asks the researcher. “Although it does not belong to any of the existing currents, each of them tries to claim it as its own.”

More than just nature is at stake. A major ecosystem crisis could open the door to an authoritarian regime seeking to defend the essential foundations of a particular society against all others. This sort of eco-fascism would strive to protect resources rather than nature (and could not therefore be considered “environmentalism”), pitting one society against another. Political ecology is therefore firmly aligned with freedom.

To stay away from extremes, “the challenge is to carry out basic research to better understand the world and political ideas, and to go beyond debates based on misunderstandings or overly-passionate approaches,” explains Fabrice Flipo. “The goal is to produce a certain objectivity about political currents, whether environmentalism, liberalism or socialism. The ideas interact with, oppose, and are defined by one another.”

Challenging the notion that modernity is defined by growth and a Cartesian view of nature, the study of political ecology has led Fabrice Flipo to philosophical anthropological questions about freedom.

[box type=”shadow” align=”” class=”” width=””]

Analyzing the environmental impact of digital technology in the field

Political ecology raises questions about the ecology of infrastructures. Fabrice Flipo has begun fieldwork with sociologists on an aspect of digital technology that has been little studied overall: the environmental impacts of making human activities paper-free, the substitution of functions and “100% digital” systems.

Some believe that we must curb our use of digital technologies since manufacturing these devices requires great amounts of energy and raw materials and the rise of such technology produces harmful electronic waste. But others argue that transitioning to an entirely digital system is a way to decentralize societies and make them more environmentally-friendly.

Through his research project on recovering mobile phones (with the idea that recycling helps reduce planned obsolescence), Fabrice Flipo seeks to highlight existing solutions in the field which are not used enough, as priority is instead given to the latest products and constant renewal.[/box]

Philosophy to support debates about ideas

“Modernity defines itself as the only path to develop freedom (the ability to think), control nature, technology, and democracy. The ecological perspective asserts that it may not be that simple,” explains the researcher. “In my different books I’ve tried to propose a philosophical anthropology that considers ecological questions and different propositions offered by post-colonial and post-modern studies,” he continues.

Current societal debates prove that ecological concerns are a timely subject, underscore the relevance of the researcher’s work in this area, and show that there is growing interest in the topic. Based on the literature, it would appear that citizens have become more aware of available solutions (electric cars, solar panels etc.) but have been slow to adopt them. Significant contradictions between the majority call to “produce more and buy more” and the minority call encouraging people to be “green consumers” as part of the same public discourse make it difficult for citizens to form their own opinions.

“So political ecology could progress through an open debate on ecology,” concludes Fabrice Flipo, “involving politicians, scientists, journalists and specialists. The ideas it champions must resonate with citizens on a cultural level, so that they can make connections between their own lifestyles and the ecological dimension.” An extensive public communication, to which the researcher contributes through his work, coupled with a greater internalization and understanding of these issues and ideas by citizens could help spark a profound, far-reaching societal shift towards true political ecology.

[author title=”Political ecology: The common theme of a research career” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/02/Fabrice-Flipo_format_en_hauteur.jpg”]A philosopher of science and technology, Fabrice Flipo is an associate research professor accredited to direct research in social and political philosophy and specializes in environmentalism and modernity. He teaches courses in sustainable development and major environmental and technological risks at Télécom École de Management, and is a member of the Laboratory of Social and Political Change at the University of Paris Diderot. His research focuses on political ecology, philosophical anthropology of freedom and the ecology of digital infrastructures.

He is the author of many works including: Réenchanter le monde. Politique et vérité “Re-enchanting the world. Politics and truth” (Le Croquant, 2017), Les grandes idées politiques contemporaines “Key contemporary political ideas” (Bréal, 2017), The ecological movement: how many different divisions are there?  (Le Croquant, 2015), Pour une philosophie politique écologiste “For an ecological political philosophy” (Textuel, 2014), Nature et politique (Amsterdam, 2014), and La face cachée du numérique “The Hidden Face of Digital Technology” (L’Echappée, 2013).[/author]

What nuclear risk governance exists in France?

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]I[/dropcap]t will take a long time to learn all the lessons from the Fukushima accident, and even longer to bring about a change in the practices and principles of nuclear risk governance. Yet several major themes are already emerging in France in this respect.

Next Sunday, March 11, 2018, will mark the seven-year anniversary of the Fukushima disaster, when the northeast coast of Japan was struck by a record magnitude-9 earthquake, followed by a tsunami. These natural disasters led to an industrial disaster: a nuclear accident rated 7, the highest level on the INES scale, at the Fukushima Dai-ichi nuclear power plant.

In the aftermath of the disaster, the world was stunned by the seriousness and suddenness of this event, which, according to Jacques Repussard, Director General of the French Institute for Radiological Protection and Nuclear Safety (IRSN), calls for us to “imagine the unimaginable and prepare for it.” It confronts all those involved in nuclear safety with a critical challenge: how can we guarantee safety in the midst of unexpected events?

Beyond its unpredictable nature, this accident served as a brutal and particularly relevant reminder that nuclear energy, more than any other technology or industry, transcends all borders, whether they be geographic, temporal, institutional or professional. The consequences of nuclear accidents extend well beyond the borders of a region or a country and remain present for hundreds or even thousands of years, thus exceeding any “human” time scale.

La Hague nuclear waste reprocessing plant. Jean Marie Taillat/Wikimedia, CC BY-SA


Fukushima revealed that the safety of socio-technical systems with this level of complexity cannot be limited to only a few stakeholders, nor can it be ensured without creating strong and transparent ties between a multitude of stakeholders, including nuclear operators, citizens, safety authorities, technical support organizations, and government services. Fukushima calls into question the nature and quality of the relationships between these multiple stakeholders and demands that we reconsider nuclear risk governance practices, including in France, and rethink the boundaries of the “ecosystem of nuclear safety,” to use the term proposed by Benoît Journé.

Learning from nuclear accidents: a long-term process

Immediately after the accident, the entire community of international experts worked to manage the crisis and to understand the dynamics of the accident in terms of its technical, human and socio-organizational aspects. A few months later, the European Commission asked nuclear countries to carry out what it termed “stress tests” aimed at assessing nuclear facilities’ ability to withstand external stress (such as major weather events) and serious technical malfunctions. In France, this led to the launch of complementary safety assessments (ECS) for the country’s nuclear facilities.

While the technical causes of the Fukushima accident were quickly understood, socio-organizational causes were also identified. The Japanese Fukushima Nuclear Accident Independent Investigation Commission found that the “collusion between the government, the regulators and TEPCO, and the lack of governance by said parties” was one of the major causes of the disaster. The accident also highlighted the importance of involving civil society participants in risk prevention and in risk management preparation very early on.

Volunteers from the town of Minamisoma, near the nuclear power plant. Hajime Nakano/Flickr, CC BY

 

Above all, it revealed the need to plan and equip ourselves for the long-term management of a nuclear accident. Far too often, efforts concentrate on the emergency phase, the days or weeks immediately following the accident, leaving local stakeholders virtually on their own in the “post-accident” phase. Yet this phase raises major problems involving, for example, the consumption of basic foodstuffs (water, milk, etc.), the displacement of populations and the cultivation of potentially contaminated land.

While the Three Mile Island (1979) and Chernobyl (1986) accidents led to the human and organizational aspects of safety being taken into account, Fukushima marks a new era, focused on examining inter-organizational relations and the long-term methods for managing nuclear risks.

The need for openness towards civil society

Although this term is sometimes criticized and even mocked as a popular buzzword, nuclear risk “governance” refers to a very practical reality: all the stakeholders, measures and policies mobilized to guide the decisions made primarily by public authorities and nuclear operators, towards better management and greater transparency of nuclear risks. This implies reflecting on how each stakeholder can participate, as well as on the material and immaterial resources and tools that could enable, support and coordinate this participation.

Public awareness, organized by the Nuclear Safety Authority. ASN, CC BY


In this sense, Fukushima serves as a powerful reminder of the need for greater transparency and greater involvement of civil society participants. Contrary to popular belief, the longstanding institutional stakeholders in the nuclear industry are aware of the need for greater openness to civil society. In 2012 Jacques Repussard stated: “Nuclear energy must be brought out of the secrecy of executive boards and ministerial cabinets.” And as early as 2006, the French Nuclear Safety and Transparency Act confirmed this desire to involve civil society stakeholders in nuclear safety issues, particularly by creating local information committees (CLI), although some regret that this text has only been half-heartedly implemented.

Of course, bringing about a change in practices and pushing back the boundaries is not an easy thing, since the nuclear industry has often been described, sometimes rightly, as a world frozen in time. It continues to be burdened by its history. For a long time, nuclear safety was an issue reserved for a small group of stakeholders, sometimes referred to as “authorized” experts, and traces of these practices are still visible today. This characteristic is embodied in the extremely centralized organization of safety. Even the French word for a nuclear power plant, “centrale nucléaire,” attests to the prominence given to centralization.

French nuclear power plants. Sting, Roulex_45, Domaina/Wikimedia, CC BY-SA


One thing is for sure: there must be an ongoing dialog between these communities. This implies taking the heat out of the debates and moving beyond the futile and often exaggerated divide between the pro-nuclear and anti-nuclear camps.

A form of governance founded on open dialog and the recognition of citizen expertise is gradually emerging. The challenge for longstanding stakeholders is to help increase this citizen expertise. The AGORAS project (improvement of the governance of organizations and stakeholder networks for nuclear safety) questions governance practices, but also seeks to create a place for dialog and collective reflection. A symposium organized in late 2017 provided the first opportunity for implementing this approach through discussions organized between academic researchers and operational and institutional stakeholders. The 2018 symposium (more information here: colloque2agoras@imt-atlantique.fr) will continue this initiative.

 

[divider style=”normal” top=”20″ bottom=”20″]

The original version of this article was published in The Conversation.