
Security and Privacy in the Digital Era

“The state, which must eradicate all feelings of insecurity, even potential ones, has been caught in a spiral of exception, suspicion and oppression that may lead to a complete disappearance of liberties.”
—Mireille Delmas-Marty, Libertés et sûreté dans un monde dangereux, 2010

This book examines the security/freedom duo in space and time with regard to electronic communications and the technologies used in social control. It follows a diachronic path from the relative balance between philosophy and human rights, so dear to Western civilization at the end of the 20th century, to the current situation, where freedom appears to be receding in the name of security, to the point that some scholars have wondered whether privacy should be redefined in this era. The actors involved (Western states, digital firms, human rights organizations, etc.) have seen their roles reshape both the legal and the political science fields.

 

Author Information

Claudine Guerrier is Professor of Law at the Institut Mines-Télécom and the Télécom École de Management in Paris, France. Her research focuses on the tense relationship between technology, security and privacy.

 

Security and Privacy in the Digital Era
Claudine Guerrier
Wiley-ISTE, 2016
284 pages
108,70 € (hardcover) – 97,99 € (E-book)

Read an excerpt and order online

 


Ocean remote sensing: solving the puzzle of missing data

The satellite measurements taken every day depend greatly on atmospheric conditions, the main cause of missing data. In a scientific publication, Ronan Fablet, a researcher at Télécom Bretagne, proposes a new method for reconstructing sea surface temperature from incomplete observations. The reconstructed data provide fine-scale maps with a homogeneous level of detail, which are essential for understanding many physical and biological phenomena.

 

What do a fish’s migration through the ocean, a cyclone, and the Gulf Stream have in common? They can all be studied using satellite observations. This is a theme Ronan Fablet knows well. As a researcher at Télécom Bretagne, he is particularly interested in processing satellite data to characterize the dynamics of the ocean. This work covers several themes, including the reconstruction of incomplete observations. Missing data impair satellite observations and limit the representation of the ocean, its dynamics and its interactions, which are essential components in areas ranging from marine biology to the ocean-atmosphere exchanges that directly influence the climate. In an article published in June 2016 in IEEE J-STARS[1], Ronan Fablet proposes a new statistical interpolation approach to compensate for the lack of observations. Let’s take a closer look at the data assimilation challenges in oceanography.

 

Temperature, salinity…: the oceans’ critical parameters

In oceanography, the term “geophysical field” refers to the ocean’s fundamental parameters: sea surface temperature (SST), salinity (the quantity of salt dissolved in the water), ocean color, which provides information on primary production (chlorophyll concentration), and altimetry (the topography of the ocean surface).

Ronan Fablet’s article focuses on the SST for several reasons. First of all, the SST is the parameter measured most often in oceanography. It benefits from high-resolution measurements: only about one kilometer separates two observed points, unlike salinity measurements, whose resolution is much coarser (around 100 km between two measurement points). Surface temperature is also an input parameter often used in numerical models of ocean-atmosphere interactions, since many heat transfers take place between the two. One obvious example is cyclones, which are fed by pumping heat from the oceans’ warmer regions. Furthermore, temperature is essential for identifying the major ocean structures: it allows surface currents to be mapped at fine scales.

But how can a satellite measure the sea surface temperature? Like any material, the ocean responds differently depending on the wavelength. “To study the SST, we can, for example, use an infrared sensor that first measures the energy. A law can then be used to convert this into the temperature,” explains Ronan Fablet.
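To make this conversion step concrete, here is a minimal sketch of how an infrared radiance can be turned into a brightness temperature by inverting Planck’s law. The wavelength and radiance values below are invented for illustration; operational SST products rely on sensor-specific calibration and atmospheric corrections rather than this bare formula.

```python
# Illustrative sketch only: inverting Planck's law to turn a measured spectral
# radiance into a brightness temperature. Real SST retrievals use sensor-specific
# calibration and atmospheric correction; wavelength and radiance are example values.
import math

H = 6.62607015e-34   # Planck constant (J.s)
C = 2.99792458e8     # speed of light (m/s)
K = 1.380649e-23     # Boltzmann constant (J/K)

def brightness_temperature(radiance: float, wavelength: float) -> float:
    """Brightness temperature (K) from spectral radiance (W.m-3.sr-1) at a wavelength (m)."""
    a = H * C / (wavelength * K)            # hc / (lambda k)
    b = 2.0 * H * C**2 / wavelength**5      # 2hc^2 / lambda^5
    return a / math.log(1.0 + b / radiance)

# Example: a thermal-infrared channel around 11 µm observing a ~290 K sea surface.
print(f"{brightness_temperature(radiance=8.2e6, wavelength=11e-6):.1f} K")
```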

 

Overcoming the problem of missing data in remote sensing

Unlike geostationary satellites, which orbit at the same speed as the Earth’s rotation, moving satellites generally complete one orbit in a little over an hour and a half. This enables them to fly over many points on Earth in a single day, building images by accumulating data. Yet some points in the ocean cannot be seen. The main cause of missing data is the sensitivity of satellite sensors to atmospheric conditions: in the case of infrared measurements, clouds block the observations. “In a predefined area, it is sometimes necessary to accumulate two weeks’ worth of observations in order to have enough information to begin reconstructing the field,” explains Ronan Fablet. The heterogeneous nature of the cloud cover must also be taken into account. “The rate of missing data in certain areas can be as high as 90%,” he adds.

The lack of data is a true challenge. The modelers must find a compromise between the generic nature of the interpolation model and the complexity of its calculations. The problem is that the equations that characterize the movement of fluids, such as water, are not easy to process. This is why these models are often simplified.

 

A new interpolation approach

According to Ronan Fablet, the techniques currently in use do not take full advantage of the available information. The approach he proposes reaches beyond these limits: “We currently have access to 20 to 30 years of SST data. The idea is that among these samples we can find an implicit representation of the ocean’s variability that can guide an interpolation. Based on this knowledge, we should be able to reconstruct the incomplete observations that currently exist.”

The general idea behind Ronan Fablet’s method is based on the principle of learning: if a situation observed today corresponds to a previous situation, past observations can be used to reconstruct the current data. It is an approach based on analogy.
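As a rough illustration of this analogy principle (a toy sketch, not the statistical model used in the paper), one can search an archive of past, fully observed fields for the samples that best match the pixels observed today, and fill the gaps with a weighted average of those analogs:

```python
# Toy analog reconstruction: fill missing pixels of a partially observed field by
# averaging the most similar fully observed fields from an archive. Illustration only.
import numpy as np

def analog_fill(obs, mask, archive, k=5):
    """obs: 2-D field with gaps; mask: True where observed; archive: (n, H, W) complete fields."""
    dist = np.linalg.norm(archive[:, mask] - obs[mask], axis=1)   # similarity on observed pixels
    best = np.argsort(dist)[:k]                                   # k closest analogs
    weights = 1.0 / (dist[best] + 1e-6)
    weights /= weights.sum()
    estimate = np.tensordot(weights, archive[best], axes=1)       # weighted analog average
    filled = obs.copy()
    filled[~mask] = estimate[~mask]                               # keep real observations
    return filled

# Synthetic example: an archive of 1000 past fields, ~60% of today's pixels hidden by clouds.
rng = np.random.default_rng(0)
archive = rng.normal(20.0, 2.0, size=(1000, 32, 32))
truth = archive[0] + rng.normal(0.0, 0.1, size=(32, 32))
mask = rng.random((32, 32)) > 0.6
obs = np.where(mask, truth, np.nan)
print("mean absolute error:", np.abs(analog_fill(obs, mask, archive[1:]) - truth).mean())
```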

 

Implementing the model

In his article, Ronan Fablet therefore used an analogy-based model. He characterized the SST with the statistical law that best represents its spatial variations, in other words the law that most closely reflects reality.

In his study, Ronan Fablet used low-resolution SST observations (about 100 km between two observations). With low-resolution data, optimal interpolation is usually favored: it minimizes the reconstruction error (the difference between the simulated field and the observed field) at the expense of small-scale details, and the resulting image has a smooth appearance. For the interpolation, however, the researcher chose to maintain a high level of detail. The only remaining uncertainty is where a given detail is located on the map. This is why he opted for stochastic interpolation, a method that simulates several realizations placing the detail in different locations. Ultimately, this approach enabled him to create SST fields with the same level of detail throughout, at the cost of a local reconstruction error that is no better than that of the optimal method.
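The difference between the two strategies can be pictured with a small one-dimensional Gaussian-process example (a sketch with invented covariance settings, not the scheme used in the study): the optimal interpolation is the smooth conditional mean, while stochastic interpolation draws conditional samples that keep small-scale variability.

```python
# Toy 1-D comparison: optimal interpolation (conditional mean, smooth) versus
# stochastic interpolation (conditional simulation, detail-preserving).
# The covariance model and observations are invented for illustration.
import numpy as np

def cov(x1, x2, sill=1.0, length=0.1):
    return sill * np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / length) ** 2)

rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 200)                 # reconstruction grid
x_obs = np.sort(rng.random(15))                   # sparse observation locations
y_obs = np.sin(6 * np.pi * x_obs) + 0.05 * rng.normal(size=15)

K_oo = cov(x_obs, x_obs) + 1e-4 * np.eye(15)      # obs/obs covariance (+ noise)
K_go = cov(grid, x_obs)                           # grid/obs covariance

mean = K_go @ np.linalg.solve(K_oo, y_obs)        # optimal interpolation (kriging mean)

post_cov = cov(grid, grid) - K_go @ np.linalg.solve(K_oo, K_go.T)   # conditional covariance
L = np.linalg.cholesky(post_cov + 1e-8 * np.eye(len(grid)))
samples = mean[:, None] + L @ rng.normal(size=(len(grid), 3))       # 3 stochastic interpolations

print("std of mean field    :", mean.std())
print("std of sampled fields:", samples.std(axis=0))  # samples keep more small-scale energy
```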

“The proportion of ocean energy at scales below 100 km is very significant in the overall balance. At these scales, a lot of interaction takes place between physics and biology. For example, schools of fish and plankton structures form below the 100 km scale. Maintaining a small-scale level of detail also serves to measure the impact of physics on ecological processes,” explains Ronan Fablet.

 

The blue circle indicates the area of missing data. The maps show SST variations at low resolution based on a model (left), at high resolution based on observations (center), and at high resolution based on the model presented in the article (right).

 

New methods ahead using deep learning

Another modeling approach has recently begun to emerge, based on deep learning techniques. A model designed this way learns from past images of the ocean. According to Ronan Fablet, this method holds real promise: “it incorporates the idea of analogy, in other words, it uses past data to find situations that are similar to the current context. The advantage lies in the ability to build a model with many parameters that are calibrated on large training data sets. It would be particularly helpful in reconstructing the missing high-resolution data from geophysical fields observed using remote sensing.”

 


[1] Journal of Selected Topics in Applied Earth Observations and Remote Sensing. An IEEE peer-reviewed journal.


Michèle Wigger: improving communications through coordination

Last September, Michèle Wigger was awarded a Starting Grant from the European Research Council (ERC). Each year, this distinction supports projects led by the best young researchers in Europe. It will enable Michèle Wigger to further develop the work she is conducting at Télécom ParisTech on information and communications theory. She is particularly interested in optimizing information exchanges through cooperation between communicating objects.

 

The European Commission’s objective regarding 5G is clear: the next-generation mobile network must be available in at least one major city in each Member State by 2020. However, the rapid expansion of 5G raises questions about network capacity. With this fifth-generation system, it is only a matter of time before our smartphones handle virtual and augmented reality, 4K video and high-definition video games. It is therefore already necessary to think about quality of service, particularly during peaks in data traffic, which must not slow loading times for users.

Optimizing the communication of a variety of information is a crucial matter, especially for researchers, who are on the front lines of this challenge. At Télécom ParisTech, Michèle Wigger explores the theoretical aspects of information transmission. One of her research topics focuses on using storage space distributed throughout a network, for example in base stations or in the home gateways (“boxes”) provided by internet access providers. “The idea is to put the data in these areas when traffic is low, during the night for example, so that they are more readily available to the user the next evening during peaks in network use,” summarizes Michèle Wigger.
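As a very rough sketch of this idea (the names, sizes and popularity figures are invented, and this is not the information-theoretic model studied by the researcher), one can picture a box filling its cache overnight with the items predicted to save the most traffic during the next evening’s peak:

```python
# Rough illustration of off-peak prefetching: push popular content to a local cache
# overnight so that peak-hour traffic is reduced. All figures are invented.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    size_gb: float
    predicted_views: int   # expected requests during the next evening peak

def prefetch_plan(catalog: list[Item], cache_gb: float) -> list[Item]:
    """Greedy cache filling: most-requested items first, as long as they fit."""
    plan, used = [], 0.0
    for item in sorted(catalog, key=lambda it: it.predicted_views, reverse=True):
        if used + item.size_gb <= cache_gb:
            plan.append(item)
            used += item.size_gb
    return plan

catalog = [Item("series_ep1", 1.5, 900), Item("film_a", 4.0, 300), Item("clip_b", 0.2, 1200)]
print([it.name for it in prefetch_plan(catalog, cache_gb=2.0)])   # ['clip_b', 'series_ep1']
```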

Statistical models have shown that it is possible to track how a video spreads geographically and therefore to anticipate, a few hours in advance, where it will be viewed. Michèle Wigger’s work would therefore enable smoother use of networks and help prevent saturation. Yet she is not only focused on the theoretical aspects behind this method of managing flows: her research also addresses the physical layer of the networks, in other words the construction of the modulated signals transmitted by antennas, so as to reduce bandwidth usage.

She adds that these communications assisted by cache memory can go a step further, by using data stored not on our own boxes but on our neighbors’ boxes. “If I want to send a message to two people who are next to each other, it’s much more practical to distribute the information between them both, rather than repeat the same thing to each person,” she explains. To further develop this aspect, Michèle Wigger is exploring power modulations that enable different data to be sent, using only one signal, to two recipients — for example, neighbors — who can then cooperate to exchange the data. “Less bandwidth is therefore required to send the required data to both recipients,” she explains.
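A classic way to carry different messages to two recipients in a single signal is superposition coding. The sketch below is a simplified baseband illustration with an arbitrary power split, not the modulations studied by the researcher: a high-power layer carries the first user’s bits, a low-power layer superimposed on it carries the second user’s bits, and the stronger receiver decodes and subtracts the coarse layer before reading its own.

```python
# Simplified superposition coding over a two-user channel: one transmitted signal
# carries two messages at different power levels. Power split, noise levels and
# BPSK mapping are arbitrary example choices.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
alpha = 0.8                                    # fraction of power for the weaker user

bits1 = rng.integers(0, 2, n)                  # message for user 1 (weak receiver)
bits2 = rng.integers(0, 2, n)                  # message for user 2 (strong receiver)
s1, s2 = 2 * bits1 - 1, 2 * bits2 - 1          # BPSK symbols

x = np.sqrt(alpha) * s1 + np.sqrt(1 - alpha) * s2   # superimposed unit-power signal

y1 = x + 0.5 * rng.normal(size=n)              # weak user sees a noisier channel
y2 = x + 0.1 * rng.normal(size=n)              # strong user sees a cleaner channel

hat1 = (y1 > 0).astype(int)                    # user 1: decode the high-power layer only
coarse = np.sign(y2)                           # user 2: decode the coarse layer first...
hat2 = ((y2 - np.sqrt(alpha) * coarse) > 0).astype(int)   # ...subtract it, then decode its own

print("user 1 bit error rate:", np.mean(hat1 != bits1))
print("user 2 bit error rate:", np.mean(hat2 != bits2))
```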

 

Improving coordination between connected objects for smart cities

Beyond optimizing communications using cache memories, Michèle Wigger’s research is more generally related to exchanging information between communicating agents. One of the other projects she is developing involves coordination between connected objects. Still focusing on the theoretical aspect, she uses the example of intelligent transportation to illustrate the work she is currently carrying out on the maximum level of coordination that can be established between two communicating entities. “Connected cars want to avoid accidents. In order to accomplish this, what they really want to do is to work together,” she explains.

Read on our blog The autonomous car: safety hinging on a 25cm margin

However, in order to work together, these cars must exchange information using the available networks, which may depend on the technology used by manufacturers or on the environment where they are located. In short, the coordination to be established will not always be implemented in the same manner, since the available network will not always be of the same quality. “I am therefore trying to find the limits of the coordination that is possible based on whether I am working with a weak or even non-existent network, or with a very powerful network,” explains Michèle Wigger.

A somewhat similar issue exists for sensors connected to the Internet of Things that are intended to assist decision-making. A typical example is buildings subject to risks such as avalanches, earthquakes or tsunamis. Instruments measuring temperature, vibration, noise and a variety of other parameters collect data that is sent to decision-making centers, which decide whether to issue a warning. Often, the information communicated is redundant, because the sensors are close together and their measurements are correlated.

In this case, it is important to differentiate the useful information from the repeated information, which does not add much value but still requires resources to be processed. “The goal is to coordinate the sensors so that they transmit the minimum amount of information with the smallest possible probability of error,” explains Michèle Wigger. The end goal is to facilitate the decision-making process.
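One very simple way to picture this trade-off (a toy “send-on-delta” scheme for illustration, not the coordination strategies under study) is to let each sensor transmit only when its reading departs from the decision center’s current estimate by more than a tolerance, so that redundant readings stay silent:

```python
# Toy "send-on-delta" scheme: correlated sensors transmit only readings that differ
# from the fusion centre's current estimate by more than a tolerance, trading
# transmitted volume against the risk of missing a deviation. Purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
steps, n_sensors, tol = 500, 5, 0.3

common = np.cumsum(0.05 * rng.normal(size=steps))                        # shared signal
readings = common[:, None] + 0.05 * rng.normal(size=(steps, n_sensors))  # correlated sensors

sent, estimate = 0, 0.0                        # value currently assumed by the decision centre
for t in range(steps):
    for s in range(n_sensors):
        if abs(readings[t, s] - estimate) > tol:   # only surprising readings are transmitted
            estimate = readings[t, s]
            sent += 1

print(f"transmissions: {sent} of {steps * n_sensors} possible "
      f"({100 * sent / (steps * n_sensors):.1f}%)")
```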

 

Four focus areas, four PhD students

Her research was awarded a Starting Grant from the European Research Council (ERC) in September both for its promising nature and for its level of quality. A grant of €1.5 million over a five-year period will enable Michèle Wigger to continue to develop a total of four areas of research, all related in one way or another to improving how information is shared with the aim of optimizing communications.

Through the funding from the ERC, she plans to double the size of her team at the information processing and communication laboratory (UMR CNRS and Télécom ParisTech), which will expand to include four new PhD students and two post-doctoral students. She will therefore be able to assign each of these research areas to a PhD student. In addition to expanding her team, Michèle Wigger is planning to develop partnerships. For the first subject addressed here — that of communications assisted by cache memory — she plans to work with INSA Lyon’s Cortexlab platform. This would enable her to test the codes she has created. Testing her theory through experimental results will enable her to further develop her work.


Research and economic impacts: “intelligent together”

What connections currently exist between the world of academic research and the economic sphere? Does the boundary between applied research and fundamental research still have any meaning at a time when the very concept of collaboration is being reinterpreted? Godefroy Beauvallet, Director of Innovation at IMT and Vice Chairman of the National Digital Technology Council, provides some possible answers to these questions. During the Digital Technology Meetings of the French National Research Agency (ANR) on November 17, he awarded the Economic Impact Prize to the Trimaran project, which unites Orange, Institut Paul-Langevin, Atos and Télécom Bretagne in a consortium that has succeeded in creating a connection between two worlds often wrongly perceived as opposites.

 

 

When we talk about the economic impact of research, what exactly does this mean?

Godefroy Beauvallet: When we talk about economic impact, we’re referring to research that causes a “disruption,” work that transforms a sector by drastically improving a service or product, or the productivity of their development. This type of research affects markets that potentially impact not just a handful, but millions of users, and therefore also directly impact our daily lives.

 

Has it now become necessary to incorporate this idea of economic impact into research?

GB: The role of research institutions is to explore realities and describe them. The economic impacts of their work can be an effective way of demonstrating they have correctly understood these realities. The impacts do not represent the compass, but rather a yardstick—one among others—for measuring whether our contribution to the understanding of the world has changed it or not. At IMT, this is one of our essential missions, since we are under the supervision of the Ministry of the Economy. Yet it does not replace fundamental research, because it is through a fundamental understanding of a field that we can succeed in impacting it economically. The Trimaran project, which was recognized alongside another project during the ANR Digital Technology Meetings, is a good example of this, as it brought together fundamental research on time reversal and issues of energy efficiency in telecommunication networks through the design of very sophisticated antennas.

 

So, for you, applied research and fundamental research do not represent two different worlds?

GB: If we only want a little economic impact, we will be drawn away from fundamental research, but obtaining major economic impacts requires a return to fundamental research, since high technological content involves a profound understanding of the phenomena that are at work. If the objective is to cause a “disruption”, then researchers must fully master the fundamentals, and even discover new ones. It is therefore necessary to pursue the dialectic in an environment where a constant tension exists between exploiting research to reap its medium-term benefits, and truly engaging in fundamental research.

“If the objective is to cause a disruption, then researchers must fully master the fundamentals”

And yet, when it comes to making connections with the economic sphere, some suspicion remains at times among the academic world.

GB: Everyone is talking about innovation these days. Which is wonderful; it shows that the world is now convinced that research is useful! We need to welcome this desire for interaction with a positive outlook, even when it causes disturbances, and without compromising the identity of researchers, who must not be expected to turn into engineers. This requires new forms of collaboration to be created that are suitable for both spheres. But failure to participate in this process would mean researchers having to accept an outside model being imposed on them. Yet researchers are in the best position to know how things should be done, which is precisely why they must become actively involved in these collaborations. So, yes, hesitations still exist. But only in areas where we have not succeeded in being sufficiently intelligent together.

 

Does succeeding in making a major economic impact, finding the disruption, necessarily involve a dialogue between the world of research and the corporate world?

GB: Yes, but what we refer to as “collaboration” or “dialogue” can take different forms. Like the crowdsourcing of innovation, it can provide multiple perspectives and more openness in facing the problems at hand. It is also a reflection of the start-up revolution the world has been experiencing, in which companies are created specifically to explore technology-market pairs. Large companies are also rethinking their leadership role by sustaining an ecosystem that redefines the boundary between what is inside and outside the company. Both spheres are seeking new ways of doing things that do not rely on becoming more alike, but rather on embracing their differences. They have access to tools that propose faster integration, with the idea that there are shortcuts available for working together more efficiently. In our field this translates into an overall transformation of the concept of collaboration, which characterizes this day and age – particularly due to the rise of digital technology.

 

From a practical perspective, these new ways of cooperating result in the creation of new working spaces, such as industrial research chairs, joint laboratories, or simply through projects carried out in partnership with companies. What do these working spaces contribute?

GB: Often, they provide the multi-company context. This is an essential element, since the technology that results from this collaboration is only effective and only has an economic impact if it is used by several companies and permeates an entire market. Companies, meanwhile, operate under short-term constraints, with annual or even quarterly targets. From this point of view, it is important for a company to work with actors who have a slower, more long-term tempo, to ensure that it has a resilient long-term strategy. And these spaces build trust among the participants: the practices and interactions are tightly regulated, legally and culturally, which protects the researchers’ independence. This is the contribution of academic institutions like Institut Mines-Télécom and public research funding authorities like the ANR, which provide the spaces and means for inventing collaborations that are fruitful and respectful of each party’s identity.

 


The Internet of Things in the European Ecosystem

Thanks to major technological advances promoting the miniaturization of sensors and the speed of digital exchanges, the Internet of Things is fast becoming a vast field of experimentation whose possibilities have yet to be fully exploited. Thanks also to the services embedded in our digitalized daily lives, there will soon be dozens of these new objects in every European household.

 

These major issues will be the prime focus of the 12th meeting of Institut Mines-Télécom’s Chair on Values and Policies of Personal Information on November 25th, organized (in English) in partnership with Contexte, a specialist in European politics.

The morning session will offer the opportunity to listen to four well-known players of the digital ecosystem who are involved in the issues and scope of connected objects on a European scale. They will be debating political, economic and industrial issues.

Godefroy Beauvallet, Director of Innovation for Institut Mines-Télécom, Vice-President of the French Digital Council (CNNum),
Thibaut Kleiner, Information Coordinator for the European Commission for Digital Economy and Society,
Robert Mac Dougall, President within the Alliance for Internet of Things Innovation (AIOTI),
Olivier Ezratty, expert in the innovation sector and influential blogger.

The afternoon session will focus on two key themes examined from an academic point of view. First, the legal aspects of the Internet of Things, particularly in relation to the implementation of the new European General Data Protection Regulation (GDPR), which will come into effect in May 2018: what impact will it have on the design of devices, applications and uses of the Internet of Things? Then, the societal and philosophical aspects of this new human-machine environment and its implications, on both an individual and a collective scale. How will the structure of our societies evolve? What are the advantages, and at what price?

With:
Yann Padova, Auditor at the French Energy Regulation Commission,
Denise Lebeau-Marianna, Head of Data Protection at Baker & McKenzie,
Bernard Benhamou, Secretary General for the Institut de la souveraineté numérique,
Rob van Kranenburg, founder of the Council, promoter for the Internet of Things.

Together with all the research teams of the Chair on Values and Policies of Personal Information.

 

Meeting program

9:00 – Reception

9:30 – Round table: ‘European Internet of Things Ecosystem’

Who are the stakeholders and what are the issues of this new ecosystem? What are the possible directions on a European scale?

14:00 – Round table: ‘The Internet of Things and the implementation of the European General Data Protection Regulation (GDPR)’

The European General Data Protection Regulation (GDPR) will come into effect in May 2018. What will the main impacts be on the design of objects, applications and services?

15:15 – Round table: ‘Brave New IoT? Societal and ethical aspects of the new man-machine environments’

What will the implications of these technologies be on both an individual and collective level? How will the structure of our societies evolve? What are the advantages, and at what price?

16:15 – End of meeting

 

12th meeting of the Chair on Values and Policies of Personal Information
The Internet of Things in the European Ecosystem

Friday, November 25th, 2016
Télécom ParisTech, 46 rue Barrault, Paris 13e


Removing pollutants from our homes

Indoor air is polluted with several volatile organic compounds, some of which are carcinogenic. Frédéric Thévenet, a researcher at Mines Douai, develops solutions for trapping and eliminating these pollutants, and for improving tests for air purifying devices.

 

We spend nearly 90% of our time indoors: at home, at the office, at school, or in our cars. Yet the air there is not as clean as we think – it contains a category of substances called volatile organic compounds (VOCs), some of which are harmful. Fighting these VOCs is the mission of Frédéric Thévenet, a researcher with the Department of Atmospheric Sciences and Environmental Engineering (SAGE) at Mines Douai, a laboratory specializing in analytical chemistry and able to analyze trace amounts of molecules.

 

Proven carcinogens

VOCs are gaseous organic molecules emitted in indoor environments from construction materials, paint and glue on furniture, cleaning and hygiene products, and even from cooking. One specific molecule is a particular cause for concern: formaldehyde, both a proven carcinogen and the compound with the highest concentration levels. Guideline values exist (concentration levels that must not be exceeded) for formaldehyde, but they are not yet mandatory.

The first way to reduce VOCs is through commonsense measures: limit sources by choosing materials and furniture with low emissions, choose cleaning products carefully and, above all, ventilate frequently with outdoor air. But sometimes this is not enough. This is where Frédéric Thévenet comes into play: he develops solutions for eliminating these VOCs.

 

Trap and destroy

There are two methods for reducing VOCs in the air. They can be trapped on a surface through adsorption (the molecules bind irreversibly to the surface), and the traps are then regenerated or replaced. The compounds can also be trapped and destroyed immediately, generally through oxidation using light (photocatalysis). “But in this case, you must make sure the VOCs have been completely destroyed; they decompose into water and CO2, which are harmless,” the researcher explains. “Sometimes the VOCs are only partially destroyed, thus generating by-products that are also dangerous.”

 


 

At the SAGE department, Frédéric’s work complements that of his colleagues on the VOC metrology team: they take their measurement devices into the field, while he prefers to reproduce field conditions in the laboratory. He created a 40-cubic-meter experimental room called IRINA (Innovative Room for INdoor Air studies), where he recreates different types of atmospheres and tests procedures for capturing and destroying VOCs. These procedures are at varying stages of development: Frédéric tests technology already available on the market that ADEME (the French Environment and Energy Management Agency) wants to evaluate, as well as adsorbent materials developed by manufacturers looking to improve their composition. He also works at even earlier stages, developing his own solutions in the laboratory. “For example, we test the regeneration of adsorbents using different techniques, particularly with plasma,” he explains.

 


A long-overdue law

Only laws and standards will force manufacturers to develop effective solutions for eliminating volatile organic compounds. Yet current legislation is not up to par. Decree no. 2011-1727 of 2 December 2011 on guideline values for formaldehyde and benzene in indoor air provides that the concentration levels of these two VOCs must not exceed certain limits in establishments open to the public: 30 µg/m³ for formaldehyde and 5 µg/m³ for benzene, for long-term exposure. However, this text has not yet come into force, since its implementing decrees have not yet been issued. The number of locations affected makes it very difficult to apply, and its entry into force has been postponed until 2018, a date that itself remains uncertain.

Furthermore, the Decree of 19 April 2011 on labelling volatile pollutant emissions is aimed at better informing consumers about VOC emissions from construction products, wall cladding, floor coverings, paints and varnishes. These products must carry a label indicating emission levels for 11 substances, on a four-category scale ranging from A+ to C, modeled on the energy label for household appliances.
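Read as a simple conformity check, the decree’s two long-term guideline values quoted above translate into a few lines of code (a sketch only; the measurement values are fictitious and real regulatory assessments follow detailed protocols):

```python
# Simple check of measured indoor-air concentrations against the long-term guideline
# values of the French decree of 2 December 2011: 30 µg/m³ for formaldehyde and
# 5 µg/m³ for benzene. The example measurements are invented.
GUIDELINE_UG_M3 = {"formaldehyde": 30.0, "benzene": 5.0}

def check_guidelines(measurements_ug_m3: dict[str, float]) -> dict[str, bool]:
    """Return True for each compound whose measured level stays within its guideline value."""
    return {
        compound: measurements_ug_m3[compound] <= limit
        for compound, limit in GUIDELINE_UG_M3.items()
        if compound in measurements_ug_m3
    }

# Example classroom measurement (fictitious values).
print(check_guidelines({"formaldehyde": 42.0, "benzene": 3.1}))
# -> {'formaldehyde': False, 'benzene': True}
```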

 

Improving the standards

What are the results? For now, the most promising ones concern adsorbent construction materials, designed for example to act as VOC traps. “They don’t consume energy, and show good results in terms of long-term trapping, despite variations due to seasonal conditions (temperature and humidity),” explains Frédéric. “When these materials are well designed, they do not release the emissions they trap.” All these materials are tested in realistic conditions, for example by verifying how such partitions perform once they are painted.

As well as testing the materials themselves, the research is also aimed at improving the standards governing anti-VOC measures, which seek to come as close as possible to real operating conditions. “We were able to create a list of precise recommendations for qualifying the treatments,” the researcher adds. The goal was to obtain standards that truly prove the devices’ effectiveness. Yet today, this is far from the case. An investigation published in the magazine Que Choisir in May 2013 showed that most of the air purifiers sold in stores were ineffective, or even negatively affected the air quality by producing secondary pollutants. There was therefore an urgent need to establish a more scientific approach in this area.

 


A passion for research

For some, becoming a researcher is the fulfilment of a childhood dream. Others are led to the profession through chance and the people they happen to meet. Frédéric Thévenet did not initially see himself as a researcher. His traditional career path, taking preparatory classes for an engineering school (Polytech’ Lyon), was initially leading him towards a future in engineering. Yet a chance meeting caused him to change his mind. During his second year at Polytech’, he did an internship at a research lab under the supervision of Dominique Vouagner, a researcher who was passionate about her work at the Institut Lumière Matière (ILM), a joint research unit affiliated with the Claude Bernard University Lyon 1 and CNRS. “I thought it was wonderful, the drive to search, to question, the experimental aspect… It inspired me to earn my DEA (now a Master 2) and apply for a thesis grant.” He was awarded a grant from ADEME on the subject of air treatment… although his studies had focused on material sciences. Still, it was a logical choice, since materials play a key role in capturing pollutants. Frédéric does not regret this choice: “Research is a very inspiring activity, involving certain constraints, but also much room for freedom and creativity.”


The major transformations of the 21st century: “the humanities and social sciences are essential”

New materials, artificial intelligence, green energy, virtual reality, 5G… many innovations are impacting our society. The transformations they bring about change organizations and redefine the role humans play in their environment, in both the professional and private spheres. According to Christian Roux, Executive VP for Research and Innovation at IMT, this aspect must not be overlooked. He defends a systemic, multidisciplinary approach to the digital, productive and environmental transitions that are taking place. Here, he shares his view of the role the humanities and social sciences play in the reflections on these issues, giving us an enticing sneak peek of the upcoming “Society, Business, Economy: Transformation in Progress” symposium that IMT is organizing on November 3 and 4 to present the latest developments in this area.

 

 

You believe that a systemic approach to the transitions caused by new technologies is essential. Why?

Christian Roux: Only a global approach can produce solutions that truly meet needs. A co-development approach that includes all the issues is therefore essential. Involving the humanities and social sciences in questions of technological innovation provides an opportunity to question their relevance, which prevents situations in which designers realize too late that a product or service is completely out of step with users’ needs. A very practical example is the industry of the future — or industry 4.0 — which is transforming processes through technologies such as augmented reality, changing operators’ practices by guiding their movements, for example. If we do not consider the human factor—the people who are the users—there’s a good chance the solution will miss its intended objective. The humanities and social sciences are therefore essential.

 

Is business a priority research area for IMT in the humanities and social sciences?

ChR: Our key areas are connected to the complex context of the major transformations taking place. Because businesses are particularly affected by these transitions, they are naturally an area of interest. Companies are increasingly thinking in terms of networks, beyond their own walls. This raises the question of new forms of organization and creates heightened tension among the various areas within a company, such as logistics. This new form also demands more responsible management, with an expected level of performance in this area as well. More generally, companies are undergoing many changes due to far-reaching digitization. The concept of value is being challenged, and there is a need to understand what it really is. This leads to a redefinition of the components that have traditionally made up a company, such as production, the use of this value, or design.

The question of design is also a major focus of our research. What changes are made to the decisive individual and collective processes involved in the various design phases of a product or service? This is the type of design and innovation question that our researchers are working on. Our interactions with the corporate ecosystem in this area are very valuable, particularly anything related to fab labs, open innovation, etc.

 

The corporate world is part of the human environment, but digitization also affects the personal sphere. What issues are your researchers exploring from that perspective?

ChR: The ethical aspects of technological innovations are an obvious issue. The governance of algorithms, for example, is directly linked to questions about artificial intelligence. Humans are also part of new networks of connected objects, but what is their role in these networks? This is the question the Télécom École de Management’s Social Networks and Connected Objects Chair is seeking to answer.

The individual’s position as a consumer is also being redefined. This is the field opened up by reflections on digital labor, in which crowdsourcing platforms such as YouTube are emerging as forms of work. What compensation does the user expect in return in this type of situation?

 

The issues that you mention involve regulatory aspects. What role can an institute like IMT play in this area? 

ChR: Our role is situated upstream of regulations. It involves advising public authorities and informing the debate on regulatory mechanisms. We can offer keys for analysis and understanding. The social sciences are fully integrated in this area, since the issue of regulation involves the concept of social compromise. Once the regulations have been established, we must be able to analyze them, especially for the purpose of studying and explaining the inadequacies. A good example of this is data management, in which a compromise must be found between protecting privacy and creating value. This is the purpose of the Chair on Values and Policies of Personal Information, which brings three of our schools together to focus on this issue, while the Innovation & Regulation Chair at Télécom ParisTech also studies these issues very closely.

 

How can these skills be integrated into a systemic approach?

ChR: First of all, the chairs mentioned above bring together several skills and disciplines, and involve industrial stakeholders who present their problems. We also develop places for experimentation, like living labs, which are spaces where we can analyze human behavior in a variety of controlled technological contexts. For IMT, the systemic approach is, above all, the legacy of the French engineering training method, which is very broad-based, from both a technological standpoint and from the perspective of the humanities and social sciences, and enables the development of practical solutions to the problems at hand. Over time, this approach has inevitably been applied to the research we conduct. Some of our schools are now over two hundred years old, and have always maintained a close connection with the corporate world and society.