
Why women have become invisible in IT professions

Female students have deserted computer science schools, and women seem mostly absent from companies in this sector. The culprit: the common preconception that female computer engineers are naturally less competent than their male counterparts. The MOOC entitled Gender Diversity in IT Professions*, launched on 8 March 2018, looks at how sexist stereotypes are constructed, often insidiously. Why are women now a minority, rendered invisible in the digital sector, despite the many female pioneers and entrepreneurs who paved the way for the development of software and video games? Chantal Morley, a researcher within the Gender@Telecom group at Institut Mines-Telecom Business School, takes a look back at the creation of this MOOC, highlighting the research underpinning the course.

 

In 2016, only 33% of digital jobs were held by women (OPIIEC). Taking into account only the “core” professions in the sector (engineer, technician or project manager), the percentage falls to 20%. Why is there such a gender gap? No, it’s not because women are less talented in technical professions, nor because they prefer other areas. The choices made by young women in their training and by women in their careers are not always the result of a free and informed decision. The influence of stereotypes plays a significant role. These popular beliefs reinforce the idea that the IT field is inherently masculine, a place where women do not belong, and this influences our choices and behaviors even when we do not realize it.

The research group Gender@Telecom, which brings together several female researchers from IMT schools, is looking at the issue of women’s role in the field of information and communication technologies, and specifically in the software sector. Through their studies and analyses, the group’s researchers have observed and described how these stereotypes are expressed. “We interviewed professionals in the sector, and asked students specific questions about their choices and opinions,” explains Chantal Morley, researcher at Institut Mines-Telecom Business School. By analyzing the discourse from these interviews, the researchers identified many preconceived notions: “Women do not like computer science, it does not interest them,” for example. “These representations are unproven and do not match reality!” the researcher continues. These little phrases that convey stereotypes are heard from both men and women. “One might think that this type of differentiation in representations would not exist among male and female students, but that is not the case,” says Chantal Morley. “During a study conducted in Switzerland, we found that guidance counselors are also very much influenced by these stereotypes.” Among professionals, these views are even cited as arguments justifying certain choices.

 

Little phrases, big impacts

The Gender Diversity in IT Professions MOOC* developed by the Gender@Telecom group is aimed at deconstructing these stereotypes. “We used these studies to try to show learners how little things in everyday life, which we do not even notice, contribute to instilling these differentiated views,” Chantal Morley explains. These little phrases or representations can also be found in our speech as well as in advertisements, posters… When viewed individually, these small occurrences are insignificant, yet it is their repetition and systematic nature that pose a problem. Together they work to establish and reinforce sexist stereotypes. “They form a common knowledge, a popular belief that everyone is aware of, that we all accept, saying ‘that’s just the way it is!’”

To study this phenomenon, the researchers from the group analyzed the discourse from semi-structured interviews conducted with stakeholders in the digital industry. The researchers’ questions focused on the relationship with technology and an entrepreneurship competition that had recently been held at Institut Mines-Telecom Business School. “Again, in this study, some types of arguments were frequently repeated and helped reinforce these stereotypes,” Chantal Morley observes. “For example, when someone mentions a woman who is especially talented, the person will often add, ‘yes, but with her it’s different, that doesn’t count.’ There is always an excuse for not questioning the general rule that says women lack the abilities required in digital professions.”

 

Unjustified stereotypes

Yet despite their pervasiveness, there is nothing to justify these remarks. The history of computer science professions proves this fact. However, the contribution of women has long been hidden behind the scenes. “When we studied the history of computer science, we were primarily looking at the area of hardware, equipment. Women were systematically rejected by universities and schools in this field, where they were not allowed to earn a diploma,” says Chantal Morley. “Also, some companies refused to keep their employees if they had a child or got married. This made careers very difficult.” In recent years, research on the history of the software industry, in which there were more opportunities, has revealed that many women contributed to major aspects of its development.

“Ada Lovelace is sort of the Marie Curie of computer science… People think she is the only one! Yet she is one contributor among others,” the researcher explains. For example, the computer scientist Grace Hopper invented the first compiler in the 1950s and played a key role in creating the COBOL language. “She had the idea of inventing a translator that would translate a relatively understandable and accessible language into machine language. Her contribution to programming was crucial,” Chantal Morley continues. “We can also mention Roberta Williams, a computer scientist who greatly contributed to the beginnings of video games, or Stephanie Shirley, a pioneer computer scientist and entrepreneur…”

In the past these women were able to fight for their place in software professions. What has happened to make women seem absent from these arenas? According to Chantal Morley, the exclusion of women occurred with the arrival of microcomputing, which at the time was designed for a primarily male target: executives. “The representations conveyed at that time progressively led everyone to associate working with computers with men.” But although women are a minority in this sector, they are not completely absent. “Many have participated in the creation of very large companies, they have founded startups, and there are some very famous women hackers,” Chantal Morley observes. “But they are not at all in the spotlight and do not get much press coverage. As if they were an anomaly, something unusual…”

Finally, women’s role in the digital industry varies greatly depending on the country and culture. In India and Malaysia, for example, computer science is a “women’s” profession. It is all a matter of perspective, not a question of innate abilities.

 

[box type=”shadow” align=”” class=”” width=””]*A MOOC combating sexist stereotypes

How are these stereotypes constructed and maintained? How can they be deconstructed? How can we promote the integration of women in digital professions? The Gender Diversity in IT Professions MOOC (in French), launched on 8 March 2018, uncovers the little-known contribution of women to the development of the software industry and the mechanisms that keep them hidden and discourage them from entering this sector. The MOOC is aimed at raising awareness among companies, schools and research organizations about these issues, in order to provide them with keys for developing a more inclusive culture for women. [/box]


 


GDPR comes into effect. Now it’s time to think about certification seals!

The new European Personal Data Protection Regulation (GDPR) comes into effect on May 25. Out of the 99 articles contained in the regulation, two are specifically devoted to the question of certification. While establishing seals to demonstrate compliance with the regulation seems like a good idea in order to reassure citizens and economic stakeholders, a number of obstacles stand in the way.

 

Certification marks are ubiquitous these days since they are now used for all types of products and services. As consumers, we have become accustomed to seeing them everywhere: from the organic farming label for products on supermarket shelves to Energy certification for appliances. They can either be a sign of compliance with legislation, as is the case for CE marking, or a sign of credibility displayed by a company to highlight its good practices. While it can sometimes be difficult to make sense of the overwhelming number of seals and marks that exist today, some of them represent real value. AOC appellations, for example, are well-known and sought out by many consumers. So, why not create seals or marks to display responsible personal data management?

While this may seem like an odd question to citizens who see these seals as nothing more than red labels on free-range chicken packaging, the European Union has taken it into consideration. So much so, that Articles 42 and 43 of the new European Data Protection Regulation (GDPR) are devoted to this idea. The creation of seals and marks is encouraged by the text in order to enable companies established in the EU that process citizens’ data responsibly to demonstrate their compliance with the regulation. On paper, everything points to the establishment of clear signs of trust in relation to personal data protection.

However, a number of institutional and economic obstacles stand in the way.  In fact, the question of seals is so complicated that IMT’s Personal Data Values and Policies Chair* (VPIP) has made it a separate research topic, especially in terms of how the GDPR affects the issue. This research, carried out between the adoption of the European text on April 14, 2016 and the date it is set to come into force, May 25, 2018, has led to the creation of a work of more than 230 pages entitled Signes de confiance : l’impact des labels sur la gestion des données personnelles (Signs of Trust — the impact of seals on personal data management).

For Claire Levallois-Barth, a researcher in Law at Télécom ParisTech and coordinator of the publication, the complexity stems in part from the number and heterogeneity of personal data protection marks. In Europe alone, there are at least 75 different marks, with a highly uneven geographic distribution. “Germany alone has more than 41 different seals,” says the researcher. “In France, we have nine, four of which are granted by the CNIL (National Commission for Computer Files and Individual Liberties).” Meanwhile, the United Kingdom has only two and Belgium only one. Each country has its own approach, largely for cultural reasons. It is therefore difficult to make sense of such a disparate assortment of marks with very different meanings.

Seals for what?

One of the key questions is: what should a seal describe? Services? Products? Processes within companies? “It all depends on the situation and the aim,” says Claire Levallois-Barth. Until only recently, the CNIL granted the “digital safe box” seal to certify that a service respected “the confidentiality and integrity of data that is stored there” according to its own criteria. At the same time, the Commission also has a “Training” seal that certifies the quality of training programs on European or national legislative texts. Though both were awarded by the same organization, they do not have the same meaning. So saying that a company has been granted “a CNIL seal” provides little information. One must delve deeper into the complexity of these marks to understand what they mean, which seems to contradict the very principle of simplification they are intended to represent.

One possible solution could be to create general seals to encompass services, internal processes and training for all individuals responsible for data processing at an organization. However, this would be difficult from an economic standpoint. For companies it could be expensive — or even very expensive — to have their best practices certified in order to receive a seal. And the more services and teams there are to be certified, the more time and money companies would have to spend to obtain this certification.

On March 31, 2018, the CNIL officially transitioned from a labeling activity to a certification activity.

The CNIL announced that it would stop awarding seals free of charge. “The Commission has decided that once the GDPR comes into effect it will concentrate instead on developing or approving certification standards. The seals themselves will be awarded by accredited certification organizations,” explains Claire Levallois-Barth. Afnor Certification or Bureau Veritas, for example, could offer certifications for which companies would have to pay. This would allow them to cover the time spent assessing internal processes and services, analyzing files, auditing information systems, etc.

And for all the parties involved, the economic profitability of certification seems to be the crux of the issue. In general, companies do not want to spend tens of thousands, or even hundreds of thousands, of euros on certification just to receive a little-known seal. Certification organizations must therefore find the right formula: comprehensive enough to make the seal valuable, but without representing too much of an investment for most companies.

While it seems unlikely that a general seal will be created, some stakeholders are examining the possibility of creating sector-specific seals based on standards recognized by the GDPR, for cloud computing for example. This could occur if criteria were approved, either at the national level by a competent supervisory authority within a country (the CNIL in France), or at the European Union level by the European Data Protection Board (EDPB). A critical number of seals would then have to be granted. GDPR sets out two options for this as well.

According to Article 43 of the GDPR, certification may either be awarded by the supervisory authorities of each country, or by private certification organizations. In France, the supervisory authority is the CNIL, and certification organizations include Afnor and Bureau Veritas. These organizations are themselves monitored. They must be accredited either by the supervisory authority, or by the national accreditation body, which is the COFRAC in France.

This naturally leads to the question: if the supervisory authorities develop their own sets of standards, will they not tend to favor the accreditation of organizations that use these standards? Eric Lachaud, a PhD student in Law and Technology at Tilburg and guest at the presentation of the work by the Personal Data Values and Policies Chair on March 8, says, “this clearly raises questions about competition between the sets of standards developed by the public and private sectors.” Sophie Nerbonne, Director of Compliance at the CNIL, who was interviewed at the same event, says that the goal of the national commission is “not to foreclose the market but to draw on [its] expertise in very precise areas of certification, by acting as a data protection officer.”

A certain vision of data protection

It should be acknowledged, however, that the area of expertise of a supervisory authority such as the CNIL, a pioneer in personal data protection in Europe, is quite vast. Beyond the role of the data protection officer, who is responsible for ensuring compliance with the GDPR within the organization that has appointed him or her, the CNIL, as an independent authority, is in charge of regulating issues involving personal data processing, governance and protection, as indicated by the seals it has granted until now. It is therefore hard to imagine that the supervisory authorities would not emphasize their large area of expertise.

And even more so since not all the supervisory authorities are as advanced as the CNIL when it comes to certification in relation to personal data. “So competition between the supervisory authorities of different countries is an issue,” says Eric Lachaud. Can we hope for a dialogue between the 28 Member States of the European Union in order to limit this competition? “This leads to the question of having mutual recognition between countries, which has still not been solved,” says the Law PhD student. As Claire Levallois-Barth is quick to point out, “there is a significant risk of ‘a race to the bottom’.” However, there would be clear benefits. By recognizing the standards of each country, the countries of the European Union have the opportunity to give certification a truly transnational dimension, which would make the seals and marks valuable throughout Europe, thereby making them shared benchmarks for the citizens and companies of all 28 countries.

The high stakes of harmonization extend beyond the borders of the European Union. While the CE standard is criticized at times for how easy it is to obtain in comparison to stricter national standards, it has successfully imposed certain European standards around the world.  Any manufacturer that hopes to reach the 500 million-person market that the European Union represents must meet this standard. For Éric Lachaud, this provides an example of what convergence between the European Member States can lead to: “We can hope that Europe will reproduce what it has done with CE marking: that it will strive to make the voices of the 28 states heard around the world and to promote a certain vision of data protection.”

The uncertainties surrounding the market for seals must be weighed against the aims of the GDPR. The philosophy of this regulation is to establish strong legislation for technological changes with a long-term focus. In one way, Articles 42 and 43 of the GDPR can be seen as a foundation for initiating and regulating a market for certification. The current questions being raised then represent the first steps toward structuring this market. The first months after the GDPR comes into effect will define what the 28 Member States intend to build.

 

*The Personal Data Values and Policies Chair brings together the Télécom ParisTech, Télécom SudParis graduate schools, and Institut Mines-Télécom Business School. It is supported by Fondation Mines-Télécom.

[box type=”info” align=”” class=”” width=””]

Personal data certification seals – what is the point?

For companies, having a personal data protection seal is a way of meeting the accountability requirement imposed by Article 24 of the GDPR, which requires all organizations responsible for processing data to be able to demonstrate compliance with the regulation. This requirement also applies to personal data subcontractors.

This is what leads many experts to think that the primary application for seals will be business-to-business relationships rather than business-to-consumer relationships. SMEs could seek certification in order to meet growing demand from their customers, especially major firms, for compliance in their subcontracting operations.

Nevertheless, the GDPR is a European regulation. This means that compliance is assumed: all companies are supposed to abide by the regulation as soon as it comes into effect. A compliance seal cannot therefore be used as a marketing tool. It is, however, likely that the organizations responsible for establishing certification standards will choose to encourage seals that go beyond the requirements of the GDPR. In this case, stricter control over personal data processing than what is called for by the legislation could be a valuable way to set a company apart from its competitors. [/box]


How working classes use digital tools: The Facebook example

For over a decade now, the use of digital tools and internet connectivity has greatly developed among households, including among working classes. Yet very few studies exist on this part of the population’s specific uses of digital technology. In the context of the Poplog project, which counts Télécom ParisTech among its partners, Dominique Pasquier, a researcher in sociology, has studied this question through interviews and with help from a data set from Facebook accounts.*

 

Among low-income households, internet connectivity figures have skyrocketed. According to INSEE (the French national institute for statistics and economic studies), in 2006, 47.9% of employees and 37% of manual workers had access to the Internet at home. Ten years later, these figures had risen to 88% among manual workers and 91.5% among employees. Within a decade, internet use became fully integrated into the daily lives of the working classes.

Yet, within the social sciences, barely any studies have focused on how the working classes relate to digital technology. “There is no reason to believe that internet uses are the same at the top and bottom of the social ladder,” explains Dominique Pasquier, researcher in sociology at Télécom ParisTech and Director of Research at the CNRS (the French national center for scientific research).

This observation is what led to the creation of the Poplog project. Funded by the ANR (the French National Research Agency), the partners for this project include Télécom ParisTech, the Centre Atlantique de Philosophie and Université de Bretagne Occidentale. The researchers looked at the use of digital technology among working classes with stable employment. Unlike very low-income classes that live on the outskirts of urban areas, the studied individuals live in rural areas and most own their own home. “This fraction of the population consists primarily of traditional families, there are very few single-parent families,” Dominique Pasquier explains. “In general, few have earned degrees and they work as manual workers or employees.”

As part of this project, in order to study this category of the population and its relationship with digital tools, Dominique Pasquier looked specifically at how these individuals use Facebook.

 

Data from Facebook accounts as research material

The researcher in sociology first attempted to collect information using various survey methods, particularly interviews. Yet very few people responded positively to requests for interviews. These difficulties are common in general sociology, according to Dominique Pasquier, especially when the study focuses on working classes. “These individuals do not have a clear understanding of what sociology is and do not see the point of these discussions,” she notes. “And this is a group that primarily welcomes family to their homes, but not strangers. Therefore, we face a rejection phenomenon.”

This problem was avoided thanks to another project called Algopol, led by the Center for Social Analysis and Mathematics, Orange Labs France Télécom, LIAFA and Linkfluence from 2012 to 2015. The team carried out a major survey of Facebook networks, recording and anonymizing data from approximately 15,000 accounts. Only 50 of the 15,000 accounts matched the social profiles Poplog was interested in, a number well suited to a qualitative study of the data.

“The principle was that I was not allowed to meet the people who owned these accounts,” Dominique Pasquier explains. “The only information I had was their age, sex, municipality of residence, number of friends and the content they exchanged, excluding personal photos.” Yet this limited content was sufficient for conducting a sociological analysis of this data. Especially since this content complemented the information obtained during the interviews. “The two different formats do not provide the same insights,” the researcher continues. “The Facebook data reveals discussions in which the sociologist was not involved. Whereas during an interview, the person wants to give a good impression of themselves and therefore will not talk about certain subjects.”
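To make the selection step concrete, here is a minimal sketch, in Python, of how a qualitative subsample like this one could be drawn from anonymized account metadata. It is purely illustrative: the field names (age, sex, type of municipality, occupation, number of friends) are assumptions based on the information described above, not the actual Algopol data schema.

```python
# A minimal sketch (not the actual Algopol/Poplog pipeline) of drawing a small
# qualitative subsample from anonymized account metadata. All field names and
# category labels are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccountMeta:
    account_id: str         # anonymized identifier
    age: int
    sex: str
    municipality_type: str  # e.g. "rural" or "urban"
    occupation: str         # e.g. "manual worker", "personal care", "executive"
    n_friends: int

def matches_poplog_profile(acc: AccountMeta) -> bool:
    """Keep adults aged 30-50, living in rural areas, working as manual
    workers or in personal care services (the profile described above)."""
    return (
        30 <= acc.age <= 50
        and acc.municipality_type == "rural"
        and acc.occupation in {"manual worker", "personal care"}
    )

def select_subsample(accounts: list[AccountMeta]) -> list[AccountMeta]:
    return [acc for acc in accounts if matches_poplog_profile(acc)]

# Example with two invented records: out of ~15,000 accounts, such a filter
# would retain only the few dozen matching the studied social profile.
sample = [
    AccountMeta("a1", 42, "F", "rural", "personal care", 118),
    AccountMeta("a2", 27, "M", "urban", "executive", 310),
]
print(len(select_subsample(sample)))  # 1
```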

Certain topics related to the use of digital technology were only addressed in the interviews, such as searches for information or online purchases. On the other hand, some topics were only available on Facebook, such as employment problems, or difficulties related to undesired singleness, a reality that affects unskilled male workers in particular.

 

Significant variations in how Facebook is used

“The 50 accounts were exactly what I was looking for: adults between 30 and 50 years old who live in rural areas and are manual workers or work in personal care services,” Dominique Pasquier explains. “This is where we saw that the uses of Facebook are extremely varied.” There were many different types of users: some attempt to use the social network but do not know what to say, do not receive enough feedback and give up. Others try their hardest to attract attention, sharing ready-made catchphrases and impressive links. Some are very prolific in sharing events from their daily life, whereas others never talk about this aspect.

However, certain behaviors and phenomena were frequently observed throughout this selection of accounts. “There is a whole set of phrases about life that express a kind of circulating common ethic. During the interviews, people called them ‘quotes’,” Dominique Pasquier explains. “Furthermore, when someone posts a status update, those who respond are intergenerational and both male and female.”

Finally, some things men shared about romantic difficulties, situations of undesired singleness or separation, caught Dominique Pasquier’s attention. She analyzed these comments and how others responded to them. “Some of what was shared was very aggressive, with misogynistic remarks. In this case, the comment always brought a response from the poster’s contacts, especially from women, who counteracted the remarks.”

The researcher’s goal was to analyze both what is shared on the social network and others’ reactions to it: “I analyze this content as things the individuals considered worthy of sharing and making known to their Facebook contacts who, in the context of this group of individuals from working classes with stable employment, are primarily made up of close friends and family.”

 

A different use of digital tools

“I think this survey also demonstrates that these individuals are faring well with the internet, but in a completely different way,” Dominique Pasquier explains. “In the case of Facebook, the social network is mainly used to maintain a kinship group.”

Through these interviews and analysis, the researcher noticed other specific features in the use of digital tools among the studied population. “It is a social universe that presents different uses and it is important for the public authorities to be aware of this,” says Dominique Pasquier. Public policy is indeed moving towards establishing fully digital means of communication via email with social assistance institutions like Pôle Emploi and the Caisse d’Allocations Familiales. This digital transformation poses a problem. In the course of her study, the researcher observed that the individuals she surveyed did not use email as a means of interpersonal communication; they used it only to make purchases. “These email addresses are shared by spouses or the entire family. With all the online purchases, the emails from Pôle Emploi will be lost among hundreds of spam emails and ads,” the researcher observes. “There is also a sort of rage that develops among this population, because of this inability to make contact.”

This shows how important it is to continue this work on the issue of digital technology and its use by working classes… while remaining vigilant. Although many sociology students are interested in studying digital corpora, these types of materials pose methodological problems. “Much of the data is anonymous, we often do not know who has produced it,” Dominique Pasquier explains. “Also, we often do not realize that 50% of online participation is produced by 1% of the population, by heavy contributors. We therefore mistake anecdotal occurrences for mass social phenomena.” Yet despite these challenges, digital data has “enormous potential, since we can work on large volumes of data and network phenomena…”, offering enough information to provide an understanding of how certain social groups are structured.
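The caution about heavy contributors can be made concrete with a short, hypothetical check: before treating a digital corpus as representative, one can measure what share of the content comes from the most active accounts. The sketch below uses invented figures purely for illustration.

```python
# A minimal sketch of the methodological check described above: measuring how
# concentrated online participation is in a corpus. Data is purely illustrative.

from collections import Counter
import math

def top_share(posts_by_author: Counter, top_fraction: float = 0.01) -> float:
    """Share of all posts produced by the top `top_fraction` of contributors."""
    counts = sorted(posts_by_author.values(), reverse=True)
    k = max(1, math.ceil(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Illustrative corpus: a handful of heavy contributors, many occasional ones.
corpus = Counter({f"user{i}": 500 for i in range(10)})       # heavy posters
corpus.update({f"user{i}": 2 for i in range(10, 1010)})      # occasional posters

print(f"Top 1% of contributors produce {top_share(corpus):.0%} of the posts")
```

If the top 1% of accounts account for most of the material, conclusions drawn from the corpus describe those heavy contributors rather than the wider population.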

 

[box type=”shadow” align=”” class=”” width=””]* The i3 seminar on digital data analysis methodologies in the social sciences

The Poplog project and Dominique Pasquier’s research were presented at the Methods for the Analysis of Online Participation Seminar, organized by i3, a joint CNRS research unit of which Télécom ParisTech is a member. This seminar, which will run through June 2018, focuses on issues surrounding methods for processing digital data for research in the humanities and social sciences. The discussions focus on how the corpus is formed, analysis methods and the relationship between digital data and conventional survey methods.[/box]

 


Philosophy of science and technology in support of political ecology

Fabrice Flipo, a philosopher of science and technology and researcher at Institut Mines-Télécom Business School, has specialized in political ecology, sustainable development and social philosophy for nearly 20 years. Throughout the fundamental research that shapes his more technical teaching, he tries to produce an objective view of current political trends, the ecological impact of digital technology and an understanding of the world more broadly.

 

For Fabrice Flipo, the philosophy of science and technology can be defined as the study of how truth is created in our society. “As a philosopher of science and technology, I’m interested in how knowledge and know-how are created and in the major trends in technical and technological choices, as well as how they are related to society’s choices,” he explains. It is therefore necessary to understand technology, the organization of society and how politics shapes the interaction between major world issues.

The researcher shares this methodology with students at Institut Mines-Télécom Business School, in his courses on major technological and environmental risks and his introductory course on sustainable development. He helps students analyze the entire ecosystem surrounding some of the most disputed technological and environmental issues (ideas, stakeholders, players, institutions etc.) of today and provides them with expertise to navigate this divisive and controversial domain.

Fundamental research to understand global issues

This is why Fabrice Flipo has focused his research on political ecology for nearly 20 years. Political ecology, which first appeared in France in the 1960s, strives to profoundly challenge France’s social and economic organization and to reconsider relationships between man and his environment. It is rooted in the ideas of a variety of movements, including feminism, third-worldism, pacifism and self-management, among others.

Almost 40 years later, Fabrice Flipo seeks to explain and provide insight into this political movement by examining how its emergence has created controversies with other political movements, primarily liberalism (free-market economics), socialism and conservatism. “I try to understand what political ecology is, and the issues involved, not just as a political party of its own, but also as a social movement,” explains the researcher.

Fabrice Flipo carries out his research in two ways. The first is a traditional approach to studying political theory, based on analyzing arguments and debates produced by the movement and the issues it supports. This approach is supplemented by ongoing work with the Laboratory of Social and Political Change at the University of Paris 7 Diderot and other external laboratories specializing in the subject. He works in collaboration with an interdisciplinary team of engineers, sociologists and political scientists to examine the relationship between ICT (Information and Communication Technologies) and ecology. He also involves networks linked to ecology to expand this collaboration, works with NGOs and writes and appears in specialized or national media outlets. For some of his studies, he also draws on a number of different works in other disciplines, such as sociology, history or political science.

The societal impact of political ecology

“Today political ecology is a minor movement compared to the liberal, socialist and conservative majorities,” says the researcher. Indeed, despite growing awareness of environmental issues (COP 21, the development of a trade press, the energy transition in companies, the adoption of “greener” lifestyles, etc.), the environmental movement has not had a profound effect on the organization of industrialized human societies and still needs to be more convincing. Its minority status on the political spectrum forces it to argue its case. “Can political ecology be associated with liberalism, socialism or even conservatism?” asks the researcher. “Although it does not belong to any of the existing currents, each of them tries to claim it as its own.”

More than just nature is at stake. A major ecosystem crisis could open the door for an authoritarian regime seeking to defend the essential foundation of a particular society from all others. This sort of eco-fascism would strive to protect resources rather than nature (and could not therefore be considered “environmentalism”), pitting one society against another. Political ecology is therefore firmly aligned with freedom.

To stay away from extremes, “the challenge is to carry out basic research to better understand the world and political ideas, and to go beyond debates based on misunderstandings or overly-passionate approaches,” explains Fabrice Flipo. “The goal is to produce a certain objectivity about political currents, whether environmentalism, liberalism or socialism. The ideas interact with, oppose, and are defined by one another.”

Challenging the notion that modernity is defined by growth and a Cartesian view of nature, the study of political ecology has led Fabrice Flipo to philosophical anthropological questions about freedom.

[box type=”shadow” align=”” class=”” width=””]

Analyzing the environmental impact of digital technology in the field

Political ecology raises questions about the ecology of infrastructures. Fabrice Flipo has begun fieldwork with sociologists on an aspect of digital technology that has been little studied overall: the environmental impacts of making human activities paper-free, the substitution of functions and “100% digital” systems.

Some believe that we must curb our use of digital technologies since manufacturing these devices requires great amounts of energy and raw materials and the rise of such technology produces harmful electronic waste. But others argue that transitioning to an entirely digital system is a way to decentralize societies and make them more environmentally-friendly.

Through his research project on recovering mobile phones (with the idea that recycling helps reduce planned obsolescence), Fabrice Flipo seeks to highlight existing solutions in the field that are not used enough, because priority is given to the latest products and constant renewal.[/box]

Philosophy to support debates about ideas

“Modernity defines itself as the only path to develop freedom (the ability to think), control nature, technology, and democracy. The ecological perspective asserts that it may not be that simple,” explains the researcher. “In my different books I’ve tried to propose a philosophical anthropology that considers ecological questions and different propositions offered by post-colonial and post-modern studies,” he continues.

Current societal debates prove that ecological concerns are a timely subject, underscore the relevance of the researcher’s work in this area, and show that there is growing interest in the topic. Based on the literature, it would appear that citizens have become more aware of available solutions (electric cars, solar panels etc.) but have been slow to adopt them. Significant contradictions between the majority call to “produce more and buy more” and the minority call encouraging people to be “green consumers” as part of the same public discourse make it difficult for citizens to form their own opinions.

“So political ecology could progress through an open debate on ecology,” concludes Fabrice Flipo, “involving politicians, scientists, journalists and specialists. The ideas it champions must resonate with citizens on a cultural level, so that they can make connections between their own lifestyles and the ecological dimension.” An extensive public communication, to which the researcher contributes through his work, coupled with a greater internalization and understanding of these issues and ideas by citizens could help spark a profound, far-reaching societal shift towards true political ecology.

[author title=”Political ecology: The common theme of a research career” image=”https://imtech-test.imt.fr/wp-content/uploads/2018/02/Fabrice-Flipo_format_en_hauteur.jpg”]A philosopher of science and technology, Fabrice Flipo is an associate research professor accredited to direct research in social and political philosophy and specializes in environmentalism and modernity. He teaches courses in sustainable development and major environmental and technological risks at Télécom École de Management, and is a member of the Laboratory of Social and Political Change at the University of Paris Diderot. His research focuses on political ecology, philosophical anthropology of freedom and the ecology of digital infrastructures.

He is the author of many works including: Réenchanter le monde. Politique et vérité “Re-enchanting the world. Politics and truth” (Le Croquant, 2017), Les grandes idées politiques contemporaines “Key contemporary political ideas” (Bréal, 2017), The ecological movement: how many different divisions are there?  (Le Croquant, 2015), Pour une philosophie politique écologiste “For an ecological political philosophy” (Textuel, 2014), Nature et politique (Amsterdam, 2014), and La face cachée du numérique “The Hidden Face of Digital Technology” (L’Echappée, 2013).[/author]

What nuclear risk governance exists in France?

Stéphanie Tillement, IMT Atlantique – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]I[/dropcap]t will take a long time to learn all the lessons from the Fukushima accident, and even longer to bring about a change in the practices and principles of nuclear risk governance. Yet several major themes are already emerging in France in this respect.

Next Sunday, March 11, 2018 will mark the seventh anniversary of the Fukushima disaster, when the northeast coast of Japan was struck by a record magnitude-9 earthquake, followed by a tsunami. These natural disasters led to an industrial disaster at the Fukushima Dai-ichi nuclear power plant: a nuclear accident rated 7, the highest level on the INES scale.

In the aftermath of the disaster, the world was stunned at the realization of the seriousness and suddenness of this event, which, according to Jacques Repussard, Director General of the French Institute for Radiological Protection and Nuclear Safety (IRSN), calls for us to “imagine the unimaginable and prepare for it.” It confronts all those involved in nuclear safety with a critical challenge: how can we guarantee safety in the midst of unexpected events?

Beyond its unpredictable nature, this accident served as a brutal and particularly relevant reminder that nuclear energy, more than any other technology or industry, transcends all borders, whether they be geographic, temporal, institutional or professional. The consequences of nuclear accidents extend well beyond the borders of a region or a country and remain present for hundreds or even thousands of years, thus exceeding any “human” time scale.

La Hague nuclear waste reprocessing plant. Jean Marie Taillat/Wikimedia, CC BY-SA


 

Fukushima revealed that the safety of socio-technical systems of this level of complexity cannot be limited to a few stakeholders, nor can it be ensured without creating strong and transparent ties between a multitude of stakeholders, including nuclear operators, citizens, safety authorities, technical support organizations and government services. Fukushima calls into question the nature and quality of the relationships between these multiple stakeholders and demands that we reconsider nuclear risk governance practices, including in France, and rethink the boundaries of the “ecosystem of nuclear safety,” to use the term proposed by Benoît Journé.

Learning from nuclear accidents: a long-term process

Immediately after the accident, the entire community of international experts worked to manage the crisis and to understand the dynamics of the accident in terms of its technical, human and socio-organizational aspects. A few months later, the European Commission asked nuclear countries to carry out what it termed “stress-tests” aimed at assessing nuclear facilities’ ability to withstand external stress (such as major weather events) and serious technical malfunctions. In France, this led to the launch of safety assessment reports (ECS) for the country’s nuclear facilities.

While the technical causes of the Fukushima accident were quickly understood, socio-organizational causes were also identified. The Japanese Fukushima Nuclear Accident Independent Investigation Commission found that the “collusion between the government, the regulators and TEPCO, and the lack of governance by said parties” was one of the major causes of the disaster. The accident also highlighted the importance of involving civil society participants in risk prevention and in risk management preparation very early on.

Volunteers from the town of Minamisoma, near the nuclear power plant. Hajime Nakano/Flickr, CC BY

 

Above all, it reveals the need to plan and equip ourselves, over the long term, to manage a nuclear accident. Far too often, efforts concentrate on the emergency phase, the days or weeks immediately following the accident, leaving local stakeholders virtually on their own in the “post-accident” phase. Yet this phase involves major problems, involving, for example, the consumption of basic foodstuffs (water, milk, etc.), the displacement of populations and the cultivation of potentially contaminated land.

After the Three Mile Island (1979) and Chernobyl (1986) accidents caused the human and organizational aspects of safety measures to be considered, Fukushima marks a new era focused on examining inter-organizational relations and the long-term methods for managing nuclear risks.

The need for openness towards civil society

Although this term is sometimes criticized and even mocked as a popular buzzword, nuclear risk “governance” refers to a very practical reality: all the stakeholders, measures and policies mobilized to guide the decisions made, primarily by the public authorities and the nuclear operators, in order to better manage nuclear risks and ensure greater transparency about them. This implies the need to reflect on how each stakeholder can participate, on the material and immaterial resources that could enable this participation, and on the mechanisms that could support and help coordinate it.

Public awareness, organized by the Nuclear Safety Authority. ASN, CC BY


 

In this sense, Fukushima serves as a powerful reminder of the need for greater transparency and greater involvement of civil society participants. Contrary to popular belief, the longstanding institutional stakeholders in the nuclear industry are aware of the need for greater openness to civil society. In 2012 Jacques Repussard stated: “Nuclear energy must be brought out of the secrecy of executive boards and ministerial cabinets.” And as early as 2006, the French Nuclear Safety and Transparency Act confirmed this desire to involve civil society stakeholders in nuclear safety issues, particularly by creating local information committees (CLI), although some regret that this text has only been half-heartedly implemented.

Of course, bringing about a change in practices and pushing the boundaries is not an easy thing, since the nuclear industry has often been described, sometimes rightly, as a world frozen in time. It continues to be burdened by its history. For a long time, nuclear safety was an issue reserved only for a small group of stakeholders, sometimes referred to as “authorized” experts, and traces of these practices are still visible today. This characteristic is embodied in the extremely centralized safety organization. Even the French word for a nuclear power plant, “centrale nucléaire” attests to the prominence given to centralization.

French nuclear power plants. Sting, Roulex_45, Domaina/Wikimedia, CC BY-SA


 

One thing is for sure, there must be an ongoing dialog between the communities. This implies taking the heat out of the debates and moving beyond the futile and often exaggerated divide between the pro-nuclear and anti-nuclear camps.

A form of governance founded on open dialog and the recognition of citizen expertise is gradually emerging. The challenge for longstanding stakeholders is to help increase this citizen expertise. The AGORAS project (improvement of the governance of organizations and stakeholder networks for nuclear safety) questions governance practices, but also seeks to create a place for dialog and collective reflection. A symposium organized in late 2017 provided the first opportunity for implementing this approach through discussions organized between academic researchers and operational and institutional stakeholders. The 2018 symposium (more information here: colloque2agoras@imt-atlantique.fr) will continue this initiative.

 

[divider style=”normal” top=”20″ bottom=”20″]

The original version of this article was published in The Conversation.


The end of roaming charges in the European Union: a cure-all solution?

Patrick Maillé, IMT Atlantique – Institut Mines-Télécom (IMT) and Bruno Tuffin, Inria

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]T[/dropcap]he European Union has required wireless network operators to stop charging roaming fees during trips to other EU countries. For nomadic users who regularly travel throughout Europe, this added comfort is truly appreciated: no more fear of additional charges. However, while the benefit is real, some questions about roaming costs remain.

Respecting European unity

Before the end of roaming fees in June 2017, your mobile plan allowed you to communicate within your country and included a maximum amount of internet data you could consume (once depleted, you would either be charged additional fees or your service would be restricted). Any travel outside your country of origin involved an additional flat-rate fee or charges based on volume. This situation limited communication and went against the European spirit of unity. To remedy this, in October 2016, the European Commission approved a law prohibiting operators from charging their users extra for communications and data usage while traveling in other EU countries.

The goal of this decision was clearly established: create a single open market for electronic communications. Now when you travel, your usage will be charged to your plan exactly as it is in your country of origin. This means no more fear of extra fees, including for data usage: no need to wait until you find Wi-Fi access to use data, since 3G and 4G networks can now be used without any unpleasant surprises. This new system required agreements between the different operators and countries, transparent to users, in order to locate mobile phones and route communications.

To prevent any unfair competition within the EU and prevent citizens from choosing a plan from the least expensive country, the rule was established that users must take out a plan in their own country, which is defined as the country where they spend the most time. In addition, roaming usage must be “reasonable”.

Completely free roaming?

As mentioned, “free” roaming is guaranteed by the law only “within a reasonable limit of use”. Specifically, operators can set a limit on mobile internet usage while roaming without additional fees, in order to prevent usage and the associated costs from skyrocketing. However, this limit is governed by the regulation and the user must be clearly informed. The conditions that apply abroad are therefore not necessarily the same as in the user’s country. In addition, the roaming rules only apply to services within the European Economic Area (EEA); your plan may therefore include services intended for countries outside the EEA which will only apply when you are in your country of origin.

It is also worth noting that there is still a missing step to truly achieving a single market and real freedom within the EU. In general, calling another EU country from your own country is not included in your mobile plan and incurs additional costs, so a distinction is still made within the European Community. Similarly, if you make a call while traveling, the call is charged as if you were calling from your country of origin, which could potentially place it outside your plan; yet it would be natural to be able to call a restaurant to make a reservation without paying extra fees.
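To make these rules concrete, here is a stylized sketch of the charging logic described above. The rates, the fair-use cap and the function names are illustrative assumptions, not figures from the regulation or from any operator’s actual billing system.

```python
# A stylized numerical sketch of the rules described above, with purely
# illustrative rates and caps: usage while roaming in the EU is drawn from the
# domestic plan, operators may apply a fair-use cap to roaming data, and a call
# placed from the home country to another EU country is still billed as an
# international call.

def roaming_data_surcharge(mb_used_abroad: float, fair_use_cap_mb: float,
                           surcharge_eur_per_mb: float) -> float:
    """Extra charge for roaming data beyond the operator's fair-use cap (0 below it)."""
    excess_mb = max(0.0, mb_used_abroad - fair_use_cap_mb)
    return excess_mb * surcharge_eur_per_mb

def intra_eu_call_charge(minutes: float, callee_in_home_country: bool,
                         intl_rate_eur_per_min: float) -> float:
    """A call from home to another EU country is typically billed outside the plan."""
    if callee_in_home_country:
        return 0.0                              # covered by the domestic plan
    return minutes * intl_rate_eur_per_min      # billed as an international call

# Example: 15 GB used while traveling against a 12 GB fair-use cap, and a
# 5-minute call from home to a restaurant in another EU country.
print(roaming_data_surcharge(15_000, 12_000, 0.004))   # 12.0 EUR surcharge
print(intra_eu_call_charge(5, False, 0.30))            # 1.5 EUR outside the plan
```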

Therefore, integrating these additional aspects, in other words no longer differentiating between a call from or to another EU country, could be the final step towards achieving a fully open market perceived by users as a single market.

A risk of rising rates?

Another aspect to monitor is how this new rule will impact the rates of users’ plans: is there a risk that it will lead to a rise in prices, through an averaging effect in which those who rarely travel end up paying for those who travel frequently? This potential risk has been brought to light in scientific publications through theoretical modeling and game theory. The operator’s income could also decrease. Too little time has passed since the regulation came into application to effectively assess its impact, but all these reasons clearly show that special attention will need to be paid to how prices change.
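The averaging effect can be illustrated with a toy calculation; this is not one of the cited game-theoretic models, and all figures are invented. If surcharges disappear and the operator keeps its revenue constant, the uniform plan price rises, so occasional travelers subsidize frequent ones.

```python
# A toy illustration of the "averaging effect" mentioned above. All figures are
# invented; the only point is the direction of the transfer between users.

n_users = 1_000
base_price = 20.0        # EUR/month per plan before the reform
travelers = 200          # users who used to pay roaming surcharges
avg_surcharge = 10.0     # EUR/month of former roaming fees per traveler

old_revenue = n_users * base_price + travelers * avg_surcharge
new_uniform_price = old_revenue / n_users   # price that keeps revenue unchanged

print(f"Uniform price after the reform: {new_uniform_price:.2f} EUR")              # 22.00
print(f"Extra paid by a non-traveler:   {new_uniform_price - base_price:.2f} EUR")  # 2.00
print(f"Saved by a frequent traveler:   {base_price + avg_surcharge - new_uniform_price:.2f} EUR")  # 8.00
```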

[divider style=”dotted” top=”20″ bottom=”20″]

To learn more:
– P. Maillé and B. Tuffin (2017), “Enforcing free roaming among EU countries: an economic analysis”, 13th International Conference on Network and Service Management (CNSM), Tokyo, Japan, Presses de Sciences Po.
– P. Maillé and B. Tuffin (2017), “How does imposing free roaming in EU impact users and ISPs’ relations?”, 8th International Conference on the Network of the Future, London, UK.

Patrick Maillé, Professor, IMT Atlantique – Institut Mines-Télécom (IMT) and Bruno Tuffin, Director of Research, Inria

The original version of this article (in French) was published on The Conversation.


Digital Advertising and Algorithms

Romain Gola, Télécom École de Management – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]I[/dropcap]n 2016, for the first time in France, online advertising investment exceeded that of television advertising. Algorithms now play an increasingly significant role in the purchase of advertising space on websites, raising many ethical and legal issues.

Algorithms rise to power

The digital advertising market in France is now estimated at €3.5 billion. Whereas up until now this advertising mainly involved display ads on websites and the purchase of Google AdWords, automated purchasing of advertising space (known as “programmatic buying”) has now emerged. Internet users are profiled using the traces of their web activity, which makes it possible to predict their interest in an ad at any given time. Thanks to algorithms, it is therefore possible to calculate, in real time, the value of the advertising space on the page a user is viewing.
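As an illustration of this real-time valuation, here is a minimal sketch of how a buyer-side algorithm might estimate what an impression is worth before bidding in a programmatic auction. The prediction function is a placeholder assumption standing in for a learned model; no real ad-tech API is used.

```python
# A minimal sketch of the idea described above: estimating, in real time, the
# value of showing an ad to the user viewing a given page, by combining a
# predicted click probability with the value of a click. The prediction logic
# is a toy placeholder, not a real model or library call.

def predict_click_probability(user_profile: dict, page_context: dict) -> float:
    """Placeholder for a learned model scoring the (user, page) pair."""
    score = 0.01                                        # baseline click rate
    if page_context.get("topic") in user_profile.get("interests", []):
        score += 0.04                                   # user recently browsed this topic
    return min(score, 1.0)

def impression_value_eur(user_profile: dict, page_context: dict,
                         value_per_click_eur: float) -> float:
    """Expected value of the impression = P(click) x value of a click."""
    return predict_click_probability(user_profile, page_context) * value_per_click_eur

# The resulting value is roughly what the buyer would be willing to bid in the
# real-time auction for this advertising slot.
bid = impression_value_eur({"interests": ["travel"]}, {"topic": "travel"},
                           value_per_click_eur=2.0)
print(f"Maximum bid for this impression: {bid:.3f} EUR")   # 0.100
```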

The use of algorithms has the advantage of displaying banner ads that match our interests, but their uncontrolled use carries risks. The lack of transparency in how these algorithms operate influences internet users’ behavior without them realizing it. What is more, algorithms are sometimes granted exaggerated confidence, even though their results can be discriminatory. This raises the question of algorithms’ neutrality, along with ethical issues. The study of ethics in this area must be based on an understanding of how we relate to these new technologies. This involves, on the one hand, how algorithms are covered by law and, on the other hand, the development of the digital advertising ecosystem.

In light of these new challenges, it would be wise to focus on the algorithms themselves, rather than on the data that is processed, by establishing systems capable of testing and controlling them, in order to prevent harmful consequences.

Law and algorithms: reforms in Europe

A new revolution is underway, based on data collection and processing that has reached an unprecedented scale, and stimulates the creation of new products and services. This increase in the amount and diversity of data is explained by the development of connected objects and the empowerment of consumers. Their ability to act has increased with the development of technology: businesses are becoming more and more dependent not only on the data consumers produce, but also on their opinion, and must therefore constantly ensure they maintain a good e-reputation.

In light of this situation, European institutions have begun the process of reforming personal data legislation. The new European General Data Protection Regulation (GDPR) will enter into force in all Member States in May 2018. It imposes increased transparency and the accountability of those who process data, based on a policy of compliance with the law, and it provides for severe penalties. Similarly, it affirms the right to data portability, and those in charge of processing personal data must ensure that their operations comply with personal data protection standards, starting at the design stages for a product or service (privacy by design).

The GDPR strives to implicitly regulate the algorithmic processing of data. We see a trend in the advertising sector: in general, all sites, services and products that use algorithms are careful not to refer to them. They hide the crucial role algorithms play, instead referring to “customization”. However, when there is customization, often there is “algorithmization”.

Legislation ill-suited to digital advertising

Laws pertaining to “traditional” advertising are based on the principle of receiving prior informed consent from individuals before processing their data. However, this concept of data protection is less relevant when it comes to digital advertising. Data collected in the context of traditional marketing often involves objective and relatively predictable information such as name, age, gender, address or marital status. Yet the concept of “data” radically changes when it comes to digital marketing. On social networks, the data is not only basic classification information (age, gender, address), but also includes data from everyday life: what I’m doing, what I’m listening to, etc.


Traces of web activity and individuals’ behavior on social networks make it possible to determine their profile. VisualHunt

 

This new situation calls into question the relevance of the distinction between personal and non-personal data. It also raises questions about the relevance of the principle of prior consent. It is often virtually impossible to use an application without agreeing to be tracked. Consent therefore becomes mandatory in order to use the technology, and exactly how the data will be used by the data controller remains completely unknown. The problem, therefore, is no longer prior consent, but rather the automatic, predictive deductions made by the companies that collect this data.

Algorithms accentuate this trend by multiplying the collection and use of trivial and decontextualized data, likely to be used for specifically profiling individuals, and creating “knowledge” about them based on probabilities rather than certainties about their personal and intimate inclinations. In this situation, rather than examining the data feeding the algorithms, wouldn’t it be more relevant to examine the algorithms that process them and generate new data?

Legal and ethical challenges of online advertising

Influencing consumer choices, exerting subliminal influence, inducing a form of submission that alters the perception of reality: behavioral targeting carries serious risks. Requirements for the accountability, transparency and verifiability of the actions triggered by algorithms have become crucial to preventing potential excesses.

This situation calls into question the relationship between law and ethics, two notions that are unfortunately often confused. Laws are established to regulate behavior—what is allowed, forbidden or required from a legal perspective—whereas ethics refers more broadly to the distinction between good and bad, independent of and beyond any compliance with the law. Ethics applied to algorithmic processing would need to focus on two major principles: transparency, and the establishment of tests to check algorithms’ results in order to prevent possible damage.

Transparency and accountability of algorithms

The activities of online platforms are essentially based on the selection and classification of information, as well as on offers for goods or services. Platforms design and activate various algorithms that influence consumption behavior and how users think. This customization is sometimes misleading, since it is based on the machine’s concept of how we think, a concept built not on who we are, but on what we have done and looked at. This observation reveals the need for transparency: the people affected by an algorithm should first of all be informed of the existence of the algorithmic processing, as well as what it implies, the type of data it uses and its end purpose, so that they may file a claim if need be.

Tests for algorithms?

In advertising, algorithms can lead to a differentiation in the price of a product or service, or can even establish typologies of high-risk policyholders in order to calculate insurance premiums based on criteria that are sometimes illegal, obtained by cross-checking “sensitive” information. Not only are the collection and processing of this data (racial and ethnic origins, political and religious opinions) generally prohibited, but the results of these algorithmic methods can also be discriminatory. For example, the first international beauty contest judged entirely by algorithms selected almost exclusively white candidates.

To avoid this type of abuse, urgent steps must be taken to establish tests for the results produced by algorithms. In addition to legislation and the role played by data protection authorities (such as France’s CNIL), codes of conduct are also beginning to appear: advertising professionals belonging to the Digital Advertising Alliance (DAA) have introduced a protocol, represented by a visible icon next to a targeted ad, that explains how the targeting works.

It is in companies’ interests to adopt more ethical behavior in order to maintain a good reputation, and hence a competitive advantage. Internet users are weary of advertising they deem too intrusive. If the ultimate goal of advertising is to better anticipate our needs so that we “consume better”, it must operate in an environment that is legally compliant, responsible and ethical. This could be a vector for a new industrial revolution that is mindful of fundamental rights and freedoms, in which citizens are invited to take their rightful place and take ownership of their data.

Romain Gola, Professor of Business Law, Télécom École de Management – Institut Mines-Télécom

The original version of this article (in French) was published on The Conversation.

Also read on I’MTech

anonymized data, Teralab

Is anonymized data of any value?

Anonymization is still sometimes criticized as a practice that supposedly makes data worthless because it deletes important information. The CNIL set out to prove the contrary through the Cabanon project, conducted in 2017. It received assistance from TeraLab, IMT’s big data platform, to anonymize New York taxi data and show that a transportation service could still be built from it.

 

On 10 March 2014, an image published on Twitter by the New York taxi commission sparked Chris Whong’s curiosity. It wasn’t the information on the vehicle occupancy rate during rush hour that caught the young urban planner’s interest. Rather, what caught his eye was the source of the data, cited at the bottom, that had allowed New York City’s Taxi and Limousine Commission (NYC TLC) to create the image. In a tweet, he joined another Twitter user, Ben Wellington, in asking whether the raw data was available. What ensued was a series of exchanges that enabled Chris Whong to retrieve the dataset through a process that was tedious, yet accessible to anyone with enough determination to cut through the red tape. Once he had the data in his possession, he put it online. This allowed Vijay Pandurangan, a computer engineer, to demonstrate that the identity of the drivers, customers, and their addresses could all be found using the information stored in the taxi logs.

Problems in anonymizing open datasets are not new. They were not even new in 2014 when the story emerged about NYC TLC data. Yet this type of case still persists. One of the reasons is that anonymized datasets are deemed less useful than their unfiltered counterparts. Removing any possibility of tracing the identity would amount to deleting the information. In the case of the New York taxis, for example, this would mean limiting the information on the taxis’ location to geographical areas, rather than indicating the coordinates to the nearest meter. For service creators who want to build applications, and data managers who want the data to be used as effectively as possible, anonymizing means losing value.

As a fervent advocate for the protection of personal data, the French National Commission for Information Technology and Civil Liberties (CNIL) decided to confront this misconception. The Cabanon project, led by the CNIL laboratory of digital innovation (LINC) in 2017, took on the challenge of anonymizing the NYC TLC dataset and using specific scenarios for creating new services. “There are several ways to anonymize data, but there is no miracle solution that fits every purpose,” warns Vincent Toubiana, who was in charge of anonymizing the datasets for the project and has since moved from the CNIL to ARCEP. The Cabanon team therefore had to design a dedicated solution.

 

Spatial and temporal degradation

First step: the GPS coordinates were replaced by the ZCTA code, the U.S. equivalent of French postal codes. This is the method chosen by Uber to ensure personal data security. The operation degrades the spatial data: it drowns the taxi’s departure and arrival positions in areas composed of several city blocks. However, this may prove insufficient to truly ensure the anonymity of customers and drivers. At certain times during the night, sometimes only one taxi made a trip from one area of the city to another. Even if the GPS positions are erased, it is still possible to link a geographical position to an identity.
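To make the operation concrete, here is a minimal sketch of such a spatial degradation in Python, assuming a trips table with pickup coordinates and a ZCTA boundary file; the file name, column names and ZCTA attribute are illustrative, not those actually used by the Cabanon team.

```python
# Minimal sketch: replace exact GPS pickup points with ZCTA zone codes.
# "zcta_nyc.geojson" is a hypothetical boundary file whose zones carry a
# "ZCTA5CE10" code; the trip column names are also illustrative.
import geopandas as gpd
import pandas as pd

def degrade_spatially(trips: pd.DataFrame, zcta_path: str = "zcta_nyc.geojson") -> pd.DataFrame:
    zones = gpd.read_file(zcta_path)[["ZCTA5CE10", "geometry"]]
    points = gpd.GeoDataFrame(
        trips,
        geometry=gpd.points_from_xy(trips["pickup_longitude"], trips["pickup_latitude"]),
        crs="EPSG:4326",
    ).to_crs(zones.crs)
    # Each exact GPS point is "drowned" in the zone (several city blocks)
    # that contains it, then the precise coordinates are dropped.
    joined = gpd.sjoin(points, zones, how="left", predicate="within")
    return (joined
            .rename(columns={"ZCTA5CE10": "pickup_zcta"})
            .drop(columns=["geometry", "index_right",
                           "pickup_longitude", "pickup_latitude"]))
```

The same operation would be applied to the drop-off coordinates; on a dataset the size of the NYC TLC logs, it would in practice run on a platform such as TeraLab rather than on a single machine.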

“Therefore, in addition to the spatial degradation, we had to introduce a temporal degradation,” Vincent Toubiana explains. The time slots are adapted to avoid the single-customer problem. “In each departure and arrival area, we look at all the people who take a taxi in time slots of 5, 15, 30 and 60 minutes,” he continues. In the dataset, the time slicing is adjusted so that no time slot contains fewer than ten people. If, despite these precautions, a single customer remains within the largest time slot of 60 minutes, the data is simply deleted. According to Vincent Toubiana, “the goal is to find the best mathematical compromise for keeping a maximum amount of data with the smallest possible time intervals.”
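Purely as an illustration, the adaptive time-slot logic described above could look like the following sketch, applied to the output of the previous step. The 5/15/30/60-minute ladder and the threshold of ten trips come from the article; the column names remain hypothetical.

```python
# Minimal sketch: widen pickup time slots per origin/destination pair
# until every published slot contains at least K_MIN trips.
import pandas as pd

SLOT_WIDTHS_MIN = [5, 15, 30, 60]   # candidate slot widths, in minutes
K_MIN = 10                          # minimum trips per published slot

def degrade_temporally(trips: pd.DataFrame) -> pd.DataFrame:
    published = []
    for _, group in trips.groupby(["pickup_zcta", "dropoff_zcta"]):
        for width in SLOT_WIDTHS_MIN:
            slots = group["pickup_datetime"].dt.floor(f"{width}min")
            if slots.value_counts().min() >= K_MIN:
                break  # narrowest slicing that meets the threshold
        else:
            # Even 60-minute slots leave under-populated slots: the
            # isolated trips are simply deleted, as in the project.
            keep = slots.map(slots.value_counts()) >= K_MIN
            group, slots = group[keep], slots[keep]
        published.append(
            group.assign(pickup_slot=slots, slot_minutes=width)
                 .drop(columns=["pickup_datetime"])
        )
    return pd.concat(published, ignore_index=True)
```

The sketch only expresses the compromise the researcher describes: the smallest slot width for which every slot stays above the threshold, falling back to deletion when even the widest slot is too sparse.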

In the 2013 data used by the CNIL, the same data made public by Chris Whong, New York taxis made over 130 million trips. The double degradation operation therefore demanded significant computing resources. Handling the data with different temporal and spatial slicings required assistance from TeraLab, IMT’s big data platform. “It was essential for us to work with TeraLab in order to query the database over 5-minute intervals, or to test the minimum number of people we could group together,” Vincent Toubiana explains.

Read more on I’MTech: Teralab, a big data platform with a European vision

Data visualization assisting data usage

Once the dataset has been anonymized in this way, it still has to prove useful. To make it easier to read, a data visualization was produced in the form of a choropleth map, a geographical representation that associates a color with each area based on the number of trips. “The visual offers a better understanding of the differences between anonymized data and data that is not, and facilitates the analysis and narration of this data,” says Estelle Hary, the designer at the CNIL who produced the data visualization.

To the left: a map representing the trips using non-anonymized data. To the right: choropleth map representing the journeys with a granularity that ensures anonymity.
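As an illustration, the aggregation behind such a choropleth map can be sketched as follows, reusing the hypothetical ZCTA file and anonymized trips from the previous sketches; the rendering relies on geopandas’ built-in plotting.

```python
# Minimal sketch: count anonymized departures per zone and color each
# zone accordingly (a choropleth map).
import geopandas as gpd
import matplotlib.pyplot as plt
import pandas as pd

def plot_choropleth(anonymized: pd.DataFrame, zcta_path: str = "zcta_nyc.geojson") -> None:
    counts = anonymized.groupby("pickup_zcta").size().rename("trips")
    zones = gpd.read_file(zcta_path).merge(
        counts, left_on="ZCTA5CE10", right_index=True, how="left"
    )
    # One color per zone, scaled by the number of departures it contains;
    # zones with no recorded departure are shown in grey.
    ax = zones.plot(column="trips", cmap="viridis", legend=True,
                    missing_kwds={"color": "lightgrey"})
    ax.set_axis_off()
    plt.show()
```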

 

Based on this map, the team began to consider the kinds of services that could be created using anonymized data. The map helped identify points in Brooklyn where people order taxis to complete their journey home. “We started thinking about the idea of a private transportation network that would complement public transport in New York,” says Estelle Hary. Cheaper than taxis, this private transit service could cover areas neglected by buses. “This is a typical example of a viable service that anonymized data can be used to create,” she continues. In this case, the information lost to protect personal data had no impact: the processed dataset is just as effective. And this is only one example of a potential use. “By linking anonymized datasets with other public data, the possibilities are multiplied,” the designer explains. In other words, the value of an open dataset depends on our capacity for creativity.

There will certainly always be cases in which the degradation of raw data limits the creation of a service, particularly for more personalized services. But perhaps anonymity should be seen not as a binary value, but as a gradient. Instead of seeing anonymity as a characteristic that is either present or absent from a dataset, wouldn’t it be more appropriate to consider several degrees of anonymity, accessible according to the exposure of the dataset and the control over its use? That is what the CNIL proposed in the conclusion of the Cabanon project. The data could be publicly accessible in fully anonymized form, while the same dataset could also be accessible in progressively less anonymized versions, in exchange for a more significant level of control over its use.


TeraLab, big data service for researchers

Teralab is a big data and artificial intelligence platform serving research, innovation and education. It is led by Institut Mines-Télécom (IMT) and the Group of National Schools of Economics and Statistics (GENES). Teralab was founded in 2014 through a call for projects by the Investments for the Future program called “Cloud Computing and Big Data”. The goal of the platform is to aggregate the demand for software and infrastructure for projects involving large volumes of data. It also offers security and sovereignty, enabling stakeholders to entrust their data to researchers with confidence.

BBM project, e-health, business model

e-Health companies face challenges in developing business models

The development of technological tools has opened the way for many innovations in the e-health sector. These products and services allow doctors to remotely monitor their patients and help empower dependent persons. Yet the companies that develop and market these solutions find it very difficult to establish viable and sustainable business models. As part of the Better Business Models project, Charlotte Krychowski, a researcher in management at Télécom École de Management, and Myriam Le Goff-Pronost, a researcher in economics at IMT Atlantique, have focused on company case studies to better understand this situation.

 

Connected capsules and blood pressure monitors, platforms for medical consultation by telephone, home automation and remote assistance services for dependent persons… All these innovations are the work of companies in the e-health sector, which has been booming with the development of new technologies. “The innovations in e-health offer real benefits for patients: some of the connected objects, for example, are able to detect when an elderly person falls and alert a doctor,” Myriam Le Goff-Pronost explains. Far from being unnecessary gadgets, these new products and services help establish efficient medical services, while reducing healthcare costs. But despite the quality of the services they offer, companies in the e-health sector face many challenges in establishing viable business models.

The BBM (Better Business Models) project, funded by the ANR, with partners including Myriam Le Goff-Pronost (IMT Atlantique), Charlotte Krychowski (Télécom École de Management), Université de Lille, Université Savoie Mont Blanc and Grenoble École de Management, focuses on the challenges companies face in establishing business models in the areas of e-health and video games. “These two industries were chosen because digital technology plays a predominant role in both, and in France there is a dynamic group of companies in these sectors. Along with twenty researchers from other schools, Charlotte and I have worked on e-health companies,” Myriam Le Goff-Pronost explains. The researchers worked on case studies to understand how business models have developed in these sectors. “We studied businesses that were very different in terms of their size and activities, but also in terms of their success, so that we could study an eclectic and representative panel,” says Myriam Le Goff-Pronost. “This has required a lot of work through regular meetings and interviews with business leaders in order to understand their decisions in terms of their business model and the way these models have developed.” Unfortunately, for now, none of the e-health companies they studied have succeeded in generating profits.

 

Two different worlds with different problems

While all the companies studied experienced economic difficulties, they faced different challenges in establishing a sustainable business model. Myriam Le Goff-Pronost and Charlotte Krychowski observed that the companies could be divided into two distinct groups: the well-being world, geared toward the general public, and the medical world, which offers medical devices.

From connected scales to wristbands that track activity, the “well-being” products are usually sold directly to the general public. “The main difficulty for the “well-being” companies is that, often, they find themselves competing with big American manufacturers, and it is hard to make their product stand out,” explains Charlotte Krychowski. “Not to mention that they are adversely affected by the ban on marketing health data.”

In the medical world, because of how difficult it is to obtain marketing authorizations and because of the health system’s structural problems, it takes a long time to reach the break-even point. While waiting to reach this break-even point in the health sector, Bodycap, which offers a connected capsule for measuring body temperature in real time, turned to veterinary medicine and top-level sports to survive. Yet there are many possible applications for human health: monitoring a patient during a lengthy surgical operation, post-surgery follow-up after the patient returns home, monitoring patients confined to sterile rooms, etc. “To survive, companies are turning to sectors where regulation is much more flexible: no need for marketing authorizations! And in top-level sports, prices can be very elastic,” Charlotte Krychowski explains.

Finally, there is a reason it is so complicated for companies offering medical services and devices to establish business models in the e-health sector: the patient, who would benefit from the service, is not the one who pays. Social security, complementary health insurance organizations and EHPAD (residential homes for dependent older people) are just a few examples of the intermediaries that complicate the process of establishing cost-effective and sustainable business models. The Médecin Direct platform, which offers medical consultations by telephone, has chosen to build partnerships with insurance providers to establish a viable business model: the insurer offers the service and pays the company. “The State’s validation of their remote medical consultation activity has enabled them to write prescriptions remotely, which really helped them economically,” Myriam Le Goff-Pronost explains. “Still, the company is not yet generating profit…”

 

Structural problems to resolve

“Although these companies are struggling to find a suitable business model, this does not mean they are not doing well or have made bad choices,” says Charlotte Krychowski. “For most of them, it will take years to become profitable, because the viability of their business model depends on the long-term resolution of structural problems in France’s health sector.” Because while innovations in e-health help in prevention, hospitals and doctors are paid on a fee-for-service basis, for example for a consultation or operation. “Currently, whether a patient is doing well or poorly after an operation has no impact whatsoever on the hospital. And what’s more, if the person must be re-hospitalized due to complications, the hospital earns more money! The pay received should be higher if the operation goes well and the post-surgery follow-up is carried out properly,” says Charlotte Krychowski. In her opinion, our health system will have to transition to a flat-rate fee for each patient receiving follow-up care in order to integrate e-health innovations and provide companies in the sector with a favorable environment for their economic development. It will also be necessary to train caregivers to use the digital tools, since they will increasingly need to provide follow-up care using connected devices.

Furthermore, other legislative barriers hinder the success of e-health companies and the development of their innovations, such as marketing authorization procedures. “The companies studied that produce medical devices are required to conduct clinical trials that are extremely long in relation to the speed of technological developments,” says Charlotte Krychowski. Finally, current legislation prohibits the marketing of sensitive health data, which deprives companies of an economic lever.

The difficulties encountered by all the companies in the study led the two researchers to publish the case studies and their findings in a book currently in progress, which will give business leaders keys to establishing their business models. According to Myriam Le Goff-Pronost, this work must be continued to produce specific recommendations to help e-health entrepreneurs break into this complicated market.

Also read on I’MTech:

Soft Landing

Soft Landing: A partnership between European incubators for developing international innovation

How can European startups be encouraged to reach beyond their countries’ borders to develop internationally? How can they come together to form new collaborations? The Soft Landing project, in which business incubator IMT Starter is participating, allows growing startups and SMEs to discover the ecosystems of different European incubators. The goal is to offer them support in developing their business internationally.

 

“Europe certainly acknowledges the importance of each country developing its own ecosystem of startups and SMEs, yet each ecosystem is developing independently,” explains Augustin Radu, business manager at IMT Starter. The Soft Landing project, which receives funding from the European Union’s Horizon 2020 program, seeks to find a solution to this problem. “The objective is, on the one hand, to promote exchanges between the different startup and SME ecosystems, and on the other hand, to provide these companies with a more global vision of the European market beyond their borders,” he explains.

Soft Landing resulted from collaboration between five European incubators: Startup Division in Lithuania, Crosspring Lab in the Netherlands, GTEC in Germany, F6S Network in the UK, and IMT Starter, the incubator run by Télécom SudParis and Télécom École de Management in Évry, France. As part of the project, each of these stakeholders must first discover the startup and SME ecosystems developing in their partners’ countries. Interested startups that see a need for this support will then be able to join an incubator abroad for a limited period.

 

Discovering each country’s unique characteristics

Over the course of the two-year project, representatives from each country will visit partner incubators to discover and learn about the startup ecosystem that is developing there. The representatives are also seeking to identify specific characteristics, skills, and potential markets in each country that could interest startups in their own country. “Each country has its specific areas of interest: the Germans work a lot on the theme of industry, whereas in the Netherlands and Lithuania, the projects are more focused on FinTech,” Augustin Radu adds. “At IMT Starter, we are more focused on information technologies.”

Once they have completed these discovery missions, the representatives will return to the startups in their own countries to present the potential opportunities. “At IMT Starter, we have planned a mission in Germany in March, another in the Netherlands in April, in May we will host a foreign representative, and in June we will go to Lithuania,” Augustin Radu explains. “There may be other missions outside the European Union as well, in Silicon Valley and in India.”

 

Hosting foreign startups in the incubators

Once each incubator’s specific characteristics and possibilities have been defined, the startups can request to be hosted by a partner ecosystem for a limited period. “As an incubator, we will host startups that will benefit from our customized support,” says Augustin Radu. “They will be able to move into our offices, take advantage of our network of industrial partners, and work with our researchers and laboratories. The goal is to help them find talent to help grow their businesses.”

“Of course, there is a selection process for startups that want to join an incubator,” the business manager adds. “What are their specific needs? Does this match the host country’s areas of specialization?” In addition, the startup or SME should ideally have an advanced level of maturity, be well rooted in its country of origin and have a product that is already finalized. According to Augustin Radu, these are the prerequisites for a company to benefit from this opportunity to continue its development abroad.

 

Removing the barriers between startups and research

“While all four of the partner structures are radically different, they are all very well-rooted in their respective countries,” the business manager explains. IMT Starter is in fact the only incubator participating in this project that is connected to a higher education and research institution, IMT, a factor that Augustin Radu believes will greatly enhance the French incubator’s visibility.

In addition to fostering the development of startups abroad, the Soft Landing project also removes barriers between companies and the research community by proposing that researchers at schools associated with IMT Starter form partnerships with the young foreign companies. “Before this initiative, it was difficult to imagine a French researcher working with a German startup! Whereas today, if a young European startup joins our incubator because it needs our expertise, it can easily work with our laboratories.”

The project therefore represents a means of accelerating the development of innovation, both by building bridges between the research community and the startup ecosystem, as well as by pushing young European companies to seek an international presence. “For those of us in the field of information technology, if we don’t think globally we won’t get anywhere!” Augustin Radu exclaims. “When I see that in San Francisco, companies immediately think about exporting outside the USA, I know our French and European startups need to do the same thing!” This is a need the Soft Landing project seeks to fulfill by broadening the spectrum of possibilities for European startups. This could allow innovations produced in the Old World to receive the international attention they deserve.