
Mathematical tools to meet the challenges of 5G

The arrival of 5G marks a turning point in the evolution of mobile telecommunications standards. In order to cope with the constant increase in data traffic and the requirements and constraints of future uses, teams at Télécom SudParis and Davidson Consulting have joined forces in the AIDY-F2N joint laboratory. Their objective is to provide mathematical and algorithmic solutions to optimize the 5G network architecture.

 

Before the arrival of 5G, which is expected to be rolled out in Europe in 2020, many scientific barriers remain to be overcome. “5G will concern business networks and certain industrial sectors that have specific needs and constraints in terms of real time, security and mobility. In order for these extremely diverse uses to coexist, 5G must be capable of adapting,” explains Badii Jouaber, a telecommunications researcher at Télécom SudParis. To meet this challenge, he is heading a new joint laboratory between Télécom SudParis and Davidson Consulting, which was launched in early 2020. The main objective of this collaboration is to use artificial intelligence and mathematical modeling technologies to meet the requirements of new 5G applications.

Read on I’MTech: What is 5G?

Configuring custom networks

In order to support levels of service adapted to both business and consumer uses, 5G uses the concept of network slicing. The network is thus split into several virtual “slices” operated from a common shared infrastructure. Each of these slices can be configured to deliver an appropriate level of performance in terms of reliability, latency, bandwidth capacity or coverage. 5G networks will thus have to be adaptable, dynamic and programmable from end to end by means of virtual structures.

“Using slicing for 5G means we can meet these needs simultaneously and in parallel. Each slice of the network will thus correspond to a use, without encroaching on the others. However, this coexistence is very difficult to manage. We are therefore seeking to improve the dynamic configuration of these new networks in order to manage resources optimally. To do so, we are developing mathematical and algorithmic analysis tools. Our models, based on machine learning techniques, among other things, will help us to manage and reconfigure these networks on a permanent basis,” says Badii Jouaber. These slices can therefore be set up, removed, expanded or reduced according to demand.
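By way of illustration, the sketch below shows how a slice might be described and rescaled automatically according to the traffic it carries. The names, parameters and thresholds are hypothetical and not the laboratory’s actual models:

```python
from dataclasses import dataclass

@dataclass
class SliceConfig:
    """Requested service level for one network slice (illustrative fields)."""
    name: str
    max_latency_ms: float    # latency bound for the slice
    bandwidth_mbps: float    # bandwidth currently reserved
    reliability: float       # target availability, e.g. 0.999

def rescale_slice(cfg: SliceConfig, observed_load: float, capacity: float) -> SliceConfig:
    """Very simple reactive rule: grow or shrink the reserved bandwidth
    so that the slice stays below roughly 80% utilisation."""
    utilisation = observed_load / cfg.bandwidth_mbps
    if utilisation > 0.8:
        new_bw = min(cfg.bandwidth_mbps * 1.5, capacity)              # scale up, capped by the infrastructure
    elif utilisation < 0.3:
        new_bw = max(cfg.bandwidth_mbps * 0.5, observed_load * 1.2)   # scale down, keep some headroom
    else:
        new_bw = cfg.bandwidth_mbps
    return SliceConfig(cfg.name, cfg.max_latency_ms, new_bw, cfg.reliability)

# Example: an industrial slice whose traffic has grown to 90 Mb/s
industrial = SliceConfig("factory-slice", max_latency_ms=5, bandwidth_mbps=100, reliability=0.999)
print(rescale_slice(industrial, observed_load=90, capacity=1000))
```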

A priority for Davidson Consulting

Anticipating issues with 5G is one of the priorities of Davidson Consulting. The company is present in major cities in France and abroad, with 3,000 employees. It was co-founded in 2005 by Bertrand Bailly, a former Télécom SudParis student, and is a major player in telecoms and information systems. “For 15 years we have been providing expert assessments for operators and manufacturers. The arrival of 5G brings up new issues, and for us it is essential to address them by putting our expertise to good use. It’s also an opportunity to support our clients and help them overcome these challenges,” says David Olivier, Director of Research and Development at Davidson. For him, it is thus necessary to take certain industrial constraints into account from the very first stages of research, so that their work can be operational quickly.

“Another one of our goals is to achieve energy efficiency. With the increase in the number of connected objects, we believe it is essential to develop these new models of flexible, ultra-dynamic and configurable mobile networks, to minimize their impact by optimizing energy consumption,” David Olivier continues.

Bringing technology out of the labs for the networks of the future

The creation of the AIDY-F2N joint laboratory is the culmination of several years of collaboration between Télécom SudParis and Davidson Consulting, beginning in 2016 with the support of a thesis supervised by Badii Jouaber. “By initiating a new joint research activity, we aim to strengthen our common research interests around the networks of the future, and the synergies between academic research and industry. Our two worlds have much in common!” says David Olivier enthusiastically.

Under this partnership, the teams at Davidson Consulting and Télécom SudParis will coordinate and pool their skills and research efforts. The company has also provided experts in AI and Telecommunications modeling to co-supervise, with Badii Jouaber, the scientific team of the joint laboratory that will be set up in the coming months. This work will contribute to enhancing the functionality of 5G within a few years.

The IoT needs dedicated security – now

The world is more and more driven by networked computer systems. They dominate almost all aspects of our lives. These systems are connected to the Internet, resulting in a high threat potential. Marc-Oliver Pahl, chairholder of the cybersecurity chair Cyber CNI at IMT Atlantique, talks about what is at stake when it comes to IoT security.

 

What is the importance of securing the Internet of things (IoT)?

Marc-Oliver Pahl: Securing the IoT is one of the most important challenges I see for computer systems at the moment, if not the most important. The IoT is ubiquitous. Most of us interact with it many times every day – only we are not aware of it, as it surrounds us in the background. An example is the water supply system that brings drinking water to our houses. Other examples are the electricity grid, transportation, finance, or health care. The list is long. My examples are critical to our society. They are so-called “critical infrastructures.” If the IoT is not sufficiently protected, critical things can happen, such as water or power outages, or even worse, manipulated processes leading to bacteria in the water, faulty products such as cars that pose safety risks, and many more.

This strong need for security, combined with the fact that IoT devices are often not sufficiently secured, and at the same time connected to the Internet with all its threat potential, illustrates the importance of the subject. The sheer number of devices, with 41.6 billion connected IoT devices expected by 2025, shows the urgent need for action: the IoT needs the highest security standards possible to protect our society.

Why are IoT networks so vulnerable?

MOP: I want to focus on two aspects here, the “Internet”, and the “Things”. As the name Internet of Things says, IoT devices are often connected to the Internet. This makes them connected to every single user of the Internet, including bad guys. Through the Internet, the bad guys can comfortably attack an IoT system on the other side of the planet without leaving their sofa. If an attacked IoT system is not sufficiently secured, attackers can succeed and compromise the system with potentially severe consequences for security, safety, and privacy.

The term “Thing” implies a broad range of entities and applications. Consequently, IoT systems are heterogeneous. This heterogeneity includes vendors, communication technology, hardware, or software. The IoT is a mash-up of such Things, making the resulting systems complex. Securing the IoT is a big challenge. Together with our partners at the chaire Cyber CNI, in our research we contribute every day to making the IoT more secure. Our upcoming digital PhD school from October 5-9, 2020 is a wonderful opportunity to get more insights.

What would be an example challenge that IoT security needs to address and how could it be addressed?

MOP: Taking the two areas from before, one thing we work on is ensuring that access to IoT devices over the Internet is strictly limited. This can be done via diverse mechanisms including firewalls for defining and enforcing access policies, and Software Defined Networking for rerouting attackers away from their targets.
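As a rough illustration, a default-deny access policy of the kind such firewalls enforce can be sketched in a few lines. The services and networks below are invented for the example, not taken from the chair’s work:

```python
from ipaddress import ip_address, ip_network

# Hypothetical allow-list: which networks may reach which IoT services.
ACCESS_POLICY = {
    "sensor-gateway": [ip_network("10.0.0.0/24")],        # only the local OT network
    "firmware-update": [ip_network("203.0.113.0/28")],    # the vendor's update servers
}

def is_allowed(service: str, source_ip: str) -> bool:
    """Default-deny: a packet is accepted only if its source network
    is explicitly authorised for the requested service."""
    allowed_networks = ACCESS_POLICY.get(service, [])
    return any(ip_address(source_ip) in net for net in allowed_networks)

print(is_allowed("sensor-gateway", "10.0.0.42"))     # True: local OT network
print(is_allowed("sensor-gateway", "198.51.100.7"))  # False: blocked by default
```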

Regarding the heterogeneity, we look at how we can enable human operators to see what happens in the ambient IoT systems, how we can support them to express what security properties they want, and how we can build systems “secure-by-design”, so that they enforce the security policies. This is especially challenging as IoT systems are not static.

What makes securing IoT systems so difficult?

MOP: Besides the previously mentioned aspects, connectivity to the Internet and heterogeneity, a third major challenge of the IoT is its dynamicity: IoT systems continuously adapt to their environments. This is part of their job and a reason for their success. From a security-perspective, this dynamicity is a highly demanding challenge. On the one hand, we want to make the systems as restrictive as possible, to protect them as much as possible. On the other hand, we have to give the IoT systems enough room to breathe to fulfill their purpose.

Then, how can you provide security for such continuously changing systems?

MOP: First of all, security-by-design has to be applied properly, resulting in a system that applies all security mechanisms appropriately, in a non-circumventable way. But this is not enough, as we have seen before. The dynamic changes of a system cannot fully be anticipated with security-by-design mechanisms. They require the same dynamics on the defender side.

Therefore, we work on continuous monitoring of IoT systems, automated analysis of the monitoring data, and automated or adaptive defense mechanisms. Artificial Intelligence, or more precisely Machine Learning, can be of great help in this process as it allows the meaningful processing of possibly unexpected data.
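As a hedged illustration of this last point, the sketch below uses a standard machine-learning library (scikit-learn’s IsolationForest, not the chair’s own tools) to flag unusual traffic windows after training only on data assumed to be normal. The feature choice and figures are invented:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy monitoring data: one row per time window, columns such as
# [packets per second, mean packet size, number of distinct destinations].
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=[100, 500, 5], scale=[10, 50, 1], size=(500, 3))

# Train on traffic assumed to be normal; deviations will be flagged later.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

new_windows = np.array([
    [105, 480, 5],    # looks like ordinary traffic
    [900, 60, 250],   # sudden fan-out to many destinations: suspicious
])
print(detector.predict(new_windows))  # 1 = normal, -1 = anomaly
```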

More on this topic: What is the industrial internet of things?

If we are talking about AI, does this mean future security systems will be fully autonomous?

MOP: Though algorithms can do much, humans have to be in the loop at some point. This has multiple reasons, including our ability to analyze certain complex situations even better than machines. With the right data and expertise, humans outperform machines. This includes the important aspect of ethics, which is another story, but central when building algorithms for autonomous IoT systems.

Another reason for the need of humans in the loop is that there is no objective measure for security. By that I mean that the desired security level for a concrete system has to be defined by humans. Only they know what they want. Afterwards, computer systems can take over what they are best at, the extremely fast execution of complex tasks: enforcing that the high-level goals given by human operators are implemented in the corresponding IoT systems.

 


From October 5 to 7, IoT meets security Summer School

“IoT meets Security” is the 3rd edition of the Future IoT PhD school series. It will be 100% digital to comply with current guidelines regarding the Covid-19 pandemic. It offers insider perspectives from industry and academia on this hot topic. It will cover a broad range of settings, use cases, applications, techniques, and security philosophies, from humans to Information Technology (IT) and Operational Technology (OT), from research to industry.

The organizers come from the top research and education institutions in the heart of Europe, IMT Atlantique and Technische Universität München:

  • Marc-Oliver Pahl (IMT/TUM) is heading the industrial Chaire for Cybersecurity in Critical Networked Infrastructures (cyber-cni.fr) and the CNRS UMR LAB-STICC/IRIS. Together with his team, he is working on the challenges sketched above.
  • Nicolas Montavont (IMT) is heading the IoT research UMR IRISA/OCIF. With his team, he is constantly working on making the IoT more reliable and efficient.

Learn more and register here



A European project to assess the performance of robotic functions

Healthcare, maintenance, the agri-food industry, agile manufacturing. Metrics, a three-year H2020 project launched in January, is organizing robot competitions geared towards these four industries and is developing metrological methods to assess the data collected. TeraLab, IMT’s big data and AI platform, is a partner in this project. Interview with Anne-Sophie Taillandier, Director of TeraLab.

 

What is the aim of the European project Metrics?

Anne-Sophie Taillandier: The aim of Metrics (Metrological Evaluation and Testing of Robots in International Competitions) is threefold: First, it will organize robot competitions geared towards the industries in the four priority fields: healthcare, inspection and maintenance, agri-food, and agile manufacturing. The second goal is to develop metrological methods to assess the data provided by the robots. And lastly, the project will help structure the European robotics community around the competitions in the four priority sectors mentioned before.

Other European projects are organizing robot competitions to encourage scientific progress and foster innovation. How does Metrics stand out from these projects?

AST: One of the main things that makes the Metrics project different is that it aims to directly address the reliability and validity of AI algorithms during the competitions. To do so, the competitions must at once focus on the robot’s behavior in a physical environment and on the behavior of its AI algorithms when they are confronted with correctly qualified and controlled data sets. To the best of our knowledge, this question has not been addressed in previous European robotics competitions.

What are the challenges ahead?

AST: Ultimately, we hope to make the use of assessment tools and benchmarking widespread and ensure the industrial relevance of challenge competitions. We will also have to gain attention from industrial players, universities and the general public for the competitions and ensure that the robots comply with ethical, legal, social and economic requirements.

How will you go about this?

AST: Metrics is developing an evaluation framework based on metrological principles in order to assess the reliability of the different competing robots in a thorough and impartial manner. For each competition, Metrics will organize three field evaluation campaigns (in physical environments) and three cascade evaluation campaigns (on data sets) in order to engage with the AI community. Specific benchmarks for functions and tasks are defined in advance to assess the performance of robotic functions and the execution of specific tasks.

The Metrics partners have called upon corporate sponsors to support the competitions, verify their industrial relevance, contribute to an awareness program and provide effective communication.

How is TeraLab – IMT’s big data and AI platform – contributing to the project?

AST: First of all, TeraLab will provide sovereign, neutral spaces, enabling the Metrics partners and competitors to access data and software components in dedicated work spaces. TeraLab will provide the required level of security to protect intellectual property, assets and data confidentiality.

TeraLab and IMT are also in charge of the Data Management Plan setting out the rules for data management in Metrics, based on best practices for secure data sharing, with contributions from IMT experts in the fields of cybersecurity, privacy, ethics and compliance with GDPR (General Data Protection Regulation).

The consortium brings together 17 partners. Who are they?

AST: Coordinated by the French National Laboratory for Metrology and Testing (LNE), Metrics brings together 17 European partners: higher education and research institutions and organizations with expertise in the field of testing and technology transfer. They contribute expertise in robotic competitions and metrology. The partners provide test facilities and complementary networks throughout Europe in the four priority industrial areas.

What are the expected results?

AST: On a technological level, Metrics should encourage innovation in the field of robotic systems. It will also have a political impact with information for policymakers and support for robotic systems certification. And it will have a tangible socio-economic impact as well, since it will raise public awareness of robotic capacity and lead to greater engagement of commercial organizations in the four priority industries. All of this will help ensure the sustainability, at the European level, of the competition model for robotics systems that address socio-economic challenges.

Learn more about the Metrics project

Interview by Véronique Charlet for I’MTech

 

Reducing the duration of mechanical ventilation with a statistical theory

A team of researchers from IMT Atlantique has developed an algorithm that can automatically detect anomalies in mechanical ventilation by using a new statistical theory. The goal is to improve synchronization between the patient and ventilator, thus reducing the duration of mechanical ventilation and consequently shortening hospital stays. This issue is especially crucial for hospitals under pressure due to numerous patients on respirators as a result of the Covid-19 pandemic.

 

Dominique Pastor never imagined that the new theoretical approach in statistics he was working on would be used to help doctors provide better care for patients on mechanical ventilation (MV). The researcher in statistics specializes in signal processing, specifically anomaly detection. His work usually focuses on processing radar signals or speech signals. It wasn’t until he met Erwan L’Her, head of emergencies at La Cavale Blanche Hospital in Brest, that he began focusing the application of his theory, called Random Distortion Testing, on mechanical ventilation. The doctor shared a little known problem with the researcher, which would become a source of inspiration: a mismatch that often exists between patients’ efforts while undergoing MV and the respirator’s output.

Signal anomalies with serious consequences

Respirators–or ventilators–feature a device enabling them to supply pressurized air when they recognize demand from the patient. In other words, the patient is the one to initiate a cycle. Many adjustable parameters are used to best respond to an individual’s specific needs, which change as the illness progresses. These include inspiratory flow rate and number of cycles per minute. Standard settings are used at the start of MV and then modified based on flow rate/pressure curves–the famous signal processed by the Curvex algorithm, which resulted from collaboration between Dominique Pastor and Erwan L’Her.

Patient-ventilator asynchronies are defined as time lags between the patient’s inspiration and the ventilator’s flow rate. For example, the device cannot detect a patient’s demand for air because the trigger threshold level is set too high. This leads to ineffective inspiratory effort. It can also lead to double triggering when the ventilator generates two cycles for one patient inspiratory effort. The patient may also not have time to completely empty their lungs before the respirator begins a new cycle, leading to dynamic hyperinflation of the lungs, also known as intrinsic PEEP (positive end-expiratory pressure).


Example of ineffective inspiratory effort: patient demand does not result in insufflation.

 


Example of double triggering: a single inspiratory effort results in two ventilator insufflations within a short time span.

 


Example of positive end expiratory pressure: the next ventilator insufflation occurs before the flow has returned to zero at the end of expiration.

 

These patient-ventilator anomalies are believed to be very common in clinical practice. They have serious consequences, ranging from patient discomfort to increased respiratory efforts that can lead to invasive ventilation–intubation. They lead to an increase in the duration of mechanical ventilation and in weaning failure (failure to come off MV), and therefore to longer hospital stays.

However, the number of patients in need of mechanical ventilation has skyrocketed with the Covid-19 pandemic, while the number of health care workers, respirators and beds has only moderately increased, which at times gives rise to difficult ethical choices. A reduction in the duration of ventilation would therefore be a significant advantage, both for the current situation and in general, since respiratory diseases are becoming increasingly common, especially with the aging of the population.

A statistical model that adapts to various signals

Patient-ventilator asynchronies result in visible anomalies in air flow rate and pressure curves. These curves model the series of inspiratory phases, when pressure increases and expiratory phases, when it decreases, with inversion of the air flow. Control monitors for most next-generation devices display these flow rate and pressure curves. The anomalies are visible to the naked eye, but this requires regular monitoring of the curves, and a doctor to be present who can adjust the ventilator settings. Dominique Pastor and Erwan L’Her had a common objective: develop an algorithm that would detect certain anomalies automatically. Their work was patented under the name Curvex in 2013.

“The detection of an anomaly represents a major deviation from the usual form for a signal. We chose an approach called supervised learning by mathematical modeling,” Dominique Pastor explains. One characteristic of his Random Distortion Testing theory is that it makes it possible to detect signal anomalies with very little prior knowledge. “Often, the signal to be processed is not well known, as in the case of MV, since each patient has unique characteristics, and it is difficult to obtain a large quantity of medical data. The usual statistical theories have difficulty taking into account a high degree of uncertainty in the signal. Our model, on the other hand, is generic and flexible enough to handle a wide range of situations.”

Dominique Pastor first worked on intrinsic PEEP detection algorithms with PhD student Quang-Thang Nguyen, who helped find solutions. “The algorithm is a flow rate signal segmentation method used to identify the various breathing phases and calculate models for detecting anomalies. We introduced an adjustable setting (tolerance) to define the deviation from the model used to determine whether it is an anomaly,” Dominique Pastor explains. According to the researcher from IMT Atlantique, this tolerance is a valuable asset. It can be adjusted by the user, based on their needs, to alter the sensitivity and specificity.
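To give a rough idea of the principle (without reproducing the patented Curvex method or the Random Distortion Testing theory itself), the sketch below segments a flow-rate signal into breathing cycles and flags those that deviate from a reference cycle by more than a user-set tolerance:

```python
import numpy as np

def segment_cycles(flow: np.ndarray) -> list[tuple[int, int]]:
    """Split a flow-rate signal into breathing cycles, using the points where
    the flow crosses zero from negative (expiration) to positive (inspiration)
    as cycle boundaries."""
    starts = np.where((flow[:-1] <= 0) & (flow[1:] > 0))[0] + 1
    return list(zip(starts[:-1], starts[1:]))

def flag_anomalies(flow: np.ndarray, model_cycle: np.ndarray, tolerance: float) -> list[int]:
    """Flag every cycle whose deviation from the model cycle exceeds the
    user-set tolerance (a larger tolerance means fewer alarms)."""
    flagged = []
    for i, (a, b) in enumerate(segment_cycles(flow)):
        # Resample the cycle to the length of the reference before comparing.
        cycle = np.interp(np.linspace(0, 1, len(model_cycle)),
                          np.linspace(0, 1, b - a), flow[a:b])
        if np.max(np.abs(cycle - model_cycle)) > tolerance:
            flagged.append(i)
    return flagged
```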

The Curvex platform not only processes flow data from ventilators, but also a wide range of physiological signals (electrocardiogram, electroencephalogram). A ventilation simulator was included, with settings that can be adjusted in real-time, in order to test the algorithms and perform demonstrations. By modifying certain pulmonary parameters (compliance, airway resistance, etc.) and background noise levels, different signal anomalies (intrinsic PEEP, ineffective inspiratory effort, etc.) appear randomly. The algorithm detects and characterizes them. “In terms of methodology, it is important to have statistical signals that we can control in order to make sure it is working and then move on to real signals,” Dominique Pastor explains.

The next step is to create a proof of concept (POC) by developing electronics to detect anomalies in ventilatory signals, to be installed in emergency and intensive care units and used by health care providers. The goal is to provide versatile equipment that could adapt to any ventilator. “The theory has been expanding since 2013, but unfortunately the project has made little progress from a technical perspective due to lack of funding. We now hope that it will finally materialize, in partnership with a laboratory, or designers of ventilators, for example. I think this is a valuable use of our algorithms, both from a scientific and medical perspective,” says Dominique Pastor.

By Sarah Balfagon for I’MTech.

Learn more:

– Mechanical ventilation system monitoring: automatic detection of dynamic hyperinflation and asynchrony. Quang-Thang Nguyen, Dominique Pastor, François Lellouche and Erwan L’Her

Illustration sources:

Curves 1 and 2

Curve 3

 


Covid-19 crisis management maps

The prefecture of the Tarn department worked with a research team from IMT Mines Albi to meet their needs in managing the Covid-19 crisis. Frédérick Benaben, an industrial engineering researcher, explains the tool they developed to help local stakeholders visualize the necessary information and facilitate their decision-making.

 

“The Covid-19 crisis is original and new, because it is above all an information crisis,” says Frédérick Benaben, a researcher in information system interoperability at IMT Mines Albi. Usually, crisis management involves complex organization to get different stakeholders to work together. This has not been the case in the current health crisis. The difficulty here lies in obtaining information: it is important to know who is sick, where the sick people are and where the resources are. The algorithmic crisis management tools that Frédérick Benaben’s team has been working on are thus incompatible with current needs.

“When we were contacted by the Tarn prefecture to provide them with a crisis management tool, we had to start almost from scratch,” says the researcher. This crisis is not so complex in its management that it requires the help of artificial intelligence, but it is so widespread that it is difficult to display all the information at once. The researchers therefore developed a tool that provides both a demographic visualization of the territory and the optimization of volunteer workers’ routes.

The Tarn team was able to make this tool available quickly and thus save a considerable amount of time for stakeholders in the territory. The success of this project also lies in the cohesion at the territorial level between a research establishment and local stakeholders, reacting quickly and effectively to an unprecedented crisis. The prefecture wanted to work on maps to visualize the needs and resources of the department, and that is what Frédérick Benaben and his colleagues, Aurélie Montarnal, Julien Lesbegueries and Guillaume Martin provided them with.

Visualizing the department

The first requirement was to be able to visualize the needs of the municipalities in the department. It was then necessary to identify the people most at risk of being affected by the disease. Researchers drew on INSEE’s public data to pool information such as age or population density. “The aim was to divide the territory into municipalities and cantons in order to diagnose fragility on a local scale,” explains Frédérick Benaben. For example, there are greater risks for municipalities whose residents are mostly over 65 years of age.

The researchers therefore created a map of the department with several layers that can be activated to visualize the different information: one showing the fragility of the municipalities, another indicating the resilience of the territory – based, for example, on the number of volunteers. By registering on the prefecture’s website, these people volunteer to go shopping for others, or simply to keep in touch or check on residents. “We can then see the relationship between the number of people at risk and the number of volunteers in a town, to see if the town has sufficient resources to respond,” says the researcher.

Some towns with a lot of volunteers appear mostly in green, while those with a lack of volunteers are very red. “This gives us a representation of the Tarn as a sort of paving with red and green tiles, the aim being to create a uniform color by associating the surplus volunteers with those municipalities which need them,” says Frédérick Benaben.
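The principle can be illustrated with a toy calculation. The figures and the threshold below are invented; the real tool works from INSEE data and the prefecture’s volunteer register:

```python
# Fictitious numbers, for illustration only.
municipalities = {
    "Albi":    {"over_65": 12000, "volunteers": 300},
    "Castres": {"over_65": 9000,  "volunteers": 40},
}

def tile_colour(data: dict, volunteers_per_100_at_risk: float = 2.0) -> str:
    """Colour a municipality green when it has enough volunteers for its
    at-risk population, red otherwise (the ratio is a made-up threshold)."""
    needed = data["over_65"] / 100 * volunteers_per_100_at_risk
    return "green" if data["volunteers"] >= needed else "red"

for name, data in municipalities.items():
    print(name, tile_colour(data))   # Albi green, Castres red
```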

This territorial visualization tool offers a simple and clear view to local stakeholders to diagnose the needs of their towns. With this information in hand it is easier for them to make decisions to prepare or react. “If a territory is red, we know that the situation will be difficult when the virus hits,” says the researcher. The prefecture can then allocate resources for one of these territories, for example by requisitioning premises if there is no emergency center in the vicinity. It may also include decisions on communication, such as a call for volunteers.

Optimizing routes

This dynamic map is continuously updated with new data, such as the registration of new volunteers. “There is a very contemplative aspect and a more dynamic aspect that optimizes the routes of volunteers,” says Frédérick Benaben. There are many parameters to be taken into account when deciding on routes and this can be a real headache for the employees of the prefecture. Moreover, these volunteer routes must also be designed to limit the spread of the epidemic.

The needs of people who are ill or at risk must be matched with the skills of the volunteers. Some residents ask for help with errands or gardening, but others also need medical care or help with personal hygiene that requires special skills. It is also necessary to take into account the ability of volunteers to travel, whether by vehicle, bicycle or on foot. With regard to Covid-19, it is also essential to limit contact and reduce the perimeter of the routes as much as possible.

“With this information, we can develop an algorithm to optimize each volunteer’s routes,” says the researcher. This is of course personal data to which the researchers do not have access. They have tested the algorithm with fictitious values to ensure functionality when the prefecture enters the real data.
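A much simplified version of such an assignment logic, with entirely fictitious data, could look like this greedy matching of each request to the nearest volunteer with the right skill:

```python
from math import dist

# Hypothetical data: the real system uses personal data the researchers never saw.
volunteers = [
    {"name": "V1", "skills": {"errands", "gardening"}, "pos": (0.0, 0.0)},
    {"name": "V2", "skills": {"errands", "nursing"},   "pos": (5.0, 5.0)},
]
requests = [
    {"id": "R1", "needs": "nursing", "pos": (4.0, 6.0)},
    {"id": "R2", "needs": "errands", "pos": (1.0, 0.0)},
]

def assign(requests, volunteers):
    """Greedy assignment: each request goes to the nearest volunteer who has
    the required skill, keeping routes (and therefore contacts) short."""
    plan = {}
    for req in requests:
        candidates = [v for v in volunteers if req["needs"] in v["skills"]]
        if candidates:
            plan[req["id"]] = min(candidates,
                                  key=lambda v: dist(v["pos"], req["pos"]))["name"]
    return plan

print(assign(requests, volunteers))  # {'R1': 'V2', 'R2': 'V1'}
```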

“The value of this mapping solution lies in the possibilities for development,” says Frédérick Benaben. Depending on the available data, new visualization layers can be added. “Currently we have little or no data on those who are contaminated or at risk of dangerous contamination and remain at home. If we had this data we could add a new layer of visualization and provide additional support for decision making. We can configure as many layers of visualization as we want.”

 Tiphaine Claveau for I’MTech


Gaia-X: a sovereign, interoperable European cloud network

France and Germany have unveiled the Gaia-X project, which aims to harmonize cloud services in Europe to facilitate data sharing between different parties. It also seeks to reduce companies’ dependence on cloud service providers, which are largely American. For Europe, this project is therefore an opportunity to regain sovereignty over its data.

 

“When a company chooses a cloud service provider, it’s a little bit like when you accept terms of service or sale: you never really know how you’ll be able to change your service or how much it will cost.” Anne-Sophie Taillandier uses this analogy to illustrate the challenges companies currently face in relying on cloud service providers. As director of IMT’s TeraLab platform specializing in data analysis and AI, she is contributing to the European Gaia-X project, which aims to introduce transparency and interoperability in cloud services in Europe.

Initiated by German Economy Minister Peter Altmaier, Gaia-X currently brings together ten German founding members and ten French founding members, including cloud service providers and major users of these services, of all sizes and from all industries. Along with these companies, a handful of academic players specialized in research in digital science and technology – including IMT – are also taking part in the project. This public-private consortium is seeking to develop two types of standards to harmonize European cloud services.

First of all, it aims to introduce technical standards to harmonize practices among various players. This is an important condition to facilitate data and software portability. Each company must be able to decide to switch service providers if it so wishes, without having to modify its databases to make them compatible with a new service. The standardization of the technical framework for every cloud service is a key driver to facilitate the movement of data between European parties.

Environmental issues illustrate the significance of this technical problem. “In order to measure the environmental impact of a company’s operations, its data must be combined with that of its providers, and possibly, its customers,” explains Anne-Sophie Taillandier, who, for a number of years, has been leading research at TeraLab into the issues of data transparency and portability. “If each party’s data is hosted on a different service, with its own storage and processing architecture, they will first have to go through a lengthy process in order to harmonize the data spaces.”  This step is currently a barrier for organizations that lack either financial resources or skills, such as small companies and public organizations.

Also read on I’MTech: Data sharing: an important issue for the agricultural sector

In addition to technical standards, the members of the Gaia-X partnership are also seeking to develop a regulatory and ethical framework for cloud service stakeholders in Europe. The goal is to bring clarity to contractual relationships between service providers and customers. “SMEs don’t have the same legal and technical teams as large companies,” says Anne-Sophie Taillandier. “When they enter into an agreement with a cloud service provider, they don’t have the resources to evaluate all the subtleties of the contract.”

The consortium has already begun to work on these ethical rules. For example, there must not be any hidden costs when a company wishes to remove its data from a service provider and switch to another provider. Ultimately, this part of the project should give companies the power to choose their cloud service providers in a transparent way. This approach recalls the GDPR, which gives citizens the ability to choose their digital services with greater transparency and to ensure the portability of their personal data when necessary.

Restoring European digital sovereignty

It is no coincidence that the concepts guiding the Gaia-X project evoke those of the GDPR. Gaia-X is rooted in a general European Union trend for data sovereignty. The initiative is also an integral part of the long-term EU strategy to create a sovereign space for industrial and personal data, protected by technical and legal mechanisms, which are also sovereign.

The Cloud Act adopted by the United States in 2018 gave rise to concerns among European  stakeholders. This federal law gives local and national law enforcement authorities the power to request access to data stored by American companies, should this data be necessary to a criminal investigation, including when these companies’ servers are located outside the United States. Yet, the cloud services market is dominated by American players. Together, Amazon, Microsoft and Google have over half the market share for this industry. For European companies, the Cloud Act poses a risk to the sovereignty of their data.

Even so, the project does not aim to create a new European cloud services leader, but rather to encourage the development of existing players, through its regulations and standards, while harmonizing the practices already in place among the various players. The goal is not to prevent American players or those in other countries — Chinese giant Alibaba’s cloud service is increasingly gaining ground — from tapping into the European market. “Our goal is to issue standards that respect European values, and then tell anyone who wishes to enter the European market that they may, as long as they play by the rules.”

For now, Gaia-X has adopted an associative structure. In the months ahead, the consortium should be opening up to incorporate other European companies who want to take part. “The project was originally a Franco-German initiative,” says Anne-Sophie Taillandier, “but it is meant to open up to include other European players who wish to contribute.” In line with European efforts over recent years to develop digital technology with a focus on cybersecurity and artificial intelligence, Gaia-X and its vision for a European cloud will rely on joint creation.

 

Benjamin Vignard for I’MTech


What is NFV (Network Function Virtualization)?

The development of 5G has been made possible through the development of new technologies. The role of Network Function Virtualization, or NFV, is to virtualize network equipment. Adlen Ksentini, a researcher at EURECOM, gives us a detailed overview of this virtualization.

 

What is NFV?

Adlen Ksentini: NFV is the virtualization of network functions, a system that service providers and network operators had hoped for in order to decouple software from hardware. It’s based on cloud computing: the software can be placed in a virtual environment – the cloud – and run on everyday PCs. The goal is to be able to use software that implements a network function and run it on different types of hardware, instead of having to purchase dedicated hardware.

How does it work?

A.K.: It relies on the use of a hypervisor, a virtualization layer that makes it possible to abstract the hardware. The goal is to virtualize the software that implements a network function to make it run on a virtual machine or a cloud-based container.

What kind of functions are virtualized?

A.K.: When we talk about network functions, it could refer to the router that sends packets to the right destination, firewalls that protect networks, DNS servers that translate domain names into IP addresses, or intrusion detection. All of these functions will be deployed in virtual machines or containers, so that a small or medium-sized company, for example, doesn’t have to invest in infrastructure to host these services, and may instead rent them from a cloud services provider, using the Infrastructure as a Service (IaaS) model.
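As a rough illustration of this idea, the sketch below uses the Docker SDK for Python to start a DNS resolver as a containerised network function. The image and settings are only examples, and a real NFV deployment would go through an orchestrator rather than a direct container call:

```python
import docker  # pip install docker; requires a running Docker daemon

client = docker.from_env()

# Deploy a DNS resolver as a containerised network function instead of
# buying a dedicated appliance. Image name and settings are illustrative.
dns_vnf = client.containers.run(
    "coredns/coredns",        # example image of a DNS network function
    detach=True,
    name="dns-vnf",
    ports={"53/udp": 53},     # expose the DNS port on the host
)

print(dns_vnf.status)

# Scaling out or tearing down the function is just another API call:
# dns_vnf.stop(); dns_vnf.remove()
```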

What are the advantages of NFV?

A.K.: NFV provides all the benefits of cloud computing. First of all, it lowers costs since you only have to pay for the resources used. It also provides greater freedom since the virtualization layer enables it to work on several types of hardware. It also makes it possible to react to varying degrees of traffic: if there’s a sudden rise in traffic, it’s possible to scale up to meet demand.

Performance is another factor involved. Under normal circumstances, the computer’s operating system will not dedicate all of the processor’s capacity to a single task – it will spread it out and performance may suffer. The benefit of cloud computing is that it can take advantage of the almost unlimited resources of the cloud. This also makes for greater elasticity, since resources can be freed up when they are no longer needed.

Why is this technology central to 5G?

A.K.: 5G core networks are virtualized; they will run natively in the cloud. So we need software that is able to run these network functions in the cloud. NFV provides a number of advantages and that’s why it is used for the core of 5G. NFV and SDN are complementary and make it possible to obtain a virtual network.

Read more on I’MTech: What is SDN (Software-Defined networking)?

What developments are ahead for NFV?

A.K.: Communication technologies have created a framework for orchestrating and managing virtual resources, but the standard continues to evolve and a number of studies seek to improve it. Some aim to work on the security aspect, to better defend against attacks. But we’re also increasingly hearing about using artificial intelligence to enable the operator to improve resources without human intervention. That’s the idea behind Zero Touch Management, so that NFV networks can be self-correcting, self-manageable and, of course, secure.

 

Tiphaine Claveau for I’MTech


How are borders in industry changing?

The Industry Without Borders project launched by the German-French Academy for the Industry of the Future in 2017 seeks to challenge the idea that digital technology dissolves borders. Madeleine Besson and Judith Igelsboeck, from Institut Mines-Télécom Business School and the Technical University of Munich respectively, explain why it is not so easy in practice.

 

Industry is going digital and this has brought about a wave of changes. The emphasis on open innovation pushes for the dissolution of borders within a company and in relationships between various organizations. “It’s not so easy,” says Madeleine Besson, a researcher in management at Institut Mines-Télécom Business School. “We’ve seen that it’s becoming more fluid, but digitalization can also create new borders or reinforce those that already exist.”

The aim of the Industry Without Borders project launched in 2017 was to identify factors that can lead to the creation of borders in companies and look at how these aspects are evolving. The project is led by the German-French Academy for the Industry of the Future, which brings together teams from IMT and the Technical University of Munich (TUM). “We looked at the way borders can be built, rebuilt, and at times, strengthened or effectively broken down,” says Judith Igelsboeck, an organizational studies researcher at TUM. Research teams on both sides of the Rhine worked with companies, through field studies and qualitative interviews, in order to determine the changes that have been brought about by digital technology.

“We considered the idea of open innovation in particular,” says Madeleine Besson. Today, companies consult consumers much more often in the processes of creation and innovation, but it often remains under the company’s control. Conversely, Judith Igelsboeck reports that “a study in an IT consulting firm in Germany showed that customers went so far as to request the firm’s skills database so that they could choose the profiles of the IT specialists for their project directly themselves. The opening here is therefore clear.”

What borders?

“For a long time, borders in the business world were formalized from an economic viewpoint,” explains the French researcher. This includes assets and goods, employees and the machines used. “But the scope is much wider than that, and most importantly, it’s very mobile.”  A number of other aspects may also come into play, such as customer relationships and their involvement in innovation processes, as in the previous example, and relationships between different companies.

As far as internal borders are concerned, for example concerning organization within a department, management models tend to be moving toward eliminating borders. “This idea is reflected in efforts to redesign the very architecture of the office – the principle of open space,” explains Judith Igelsboeck. Workspaces become more agile, flexible and free of internal separations. The aim is to create more communal spaces, so that co-workers get to know each other better in order to work together.

But ultimately, open space may not be as open as it seems. “Employees reclaim ownership of the space by creating borders to mark out their personal space,” says Madeleine Besson. They do so by making special adjustments – at times perceptible only to those who use the workspace – to mark a space as their own.

Read more on I’MTech: Can workspaces become agile?

Madeleine Besson reminds us that “the general consensus in scientific literature and the media is that digital tools and artificial intelligence facilitate instant connections, not only between people, but between things.” Supply chains should be developed in a single, seamless automated process that can work beyond organizational borders. But it is not so clear in practice, and digital tools even appear to add new barriers.

Between theory and practice

“Imagine a printer that uses an automated tool to help manage the paper supply,”  says the French researcher. “A connected IT system between the paper supplier and the printer could help regulate paper ordering depending on current stock and the factories’ operations. The supplier becomes a sort of stockpile the company can draw on – the system is shared and the borders are therefore weakened.”

Yet, the same example could also be used to illustrate how new borders are created. If these companies use competing systems, such as Apple and Android, they will face insurmountable barriers since these two systems are not interoperable. “Technological change can also create a new border,”  adds Madeleine Besson. “It can create sub-categories between organizations that have the desire and skills to converse with computers, and others that may feel like they are merely assistants for automatons.”

“Our team encountered such a feeling during an interview with the staff of an after-sales service company,” says the researcher. Their workday revolves around making rounds to customers whose equipment has broken down. Traditionally, these employees organized their own rounds. But their schedule is now managed by a computer system and they receive the list of customers to visit the night before. “The employees were frustrated that they were no longer in control of their own schedule. They didn’t want their responsibilities to be taken away,” she explains.

“They would meet up in the morning before making their rounds to exchange appointments. Some didn’t want to go to into big cities, others wanted to keep the customers they’d been working with for a long time. So the digital tool puts up a barrier within the company and is a source of tension and frustration, which could potentially give rise to conflicts or disputes.”  These changes are made without adequately consulting the internal parties involved and can lead to conflict in the company’s overall operations.

Across the Rhine

This initial phase of the project with a number of field studies in France and Germany is expected to lead to new collaborations. For the researchers, it would be interesting to study the changes on either side of the Rhine and determine whether similar transformations are underway, or if a cultural aspect may lead to either the dissolution or crystallization of borders.

“Each country has its own vision and strategy for the industry of the future,”  says Judith Igelsboeck. So it is conceivable that cultural differences will be perceptible. “The intercultural aspect is a point to be considered, but for now, we haven’t been able to study it in a single company with a German and French branch.”  This may be the topic for a new French-German collaboration. The German researcher says that another possible follow-up to this project could focus on the use of artificial intelligence in business management.

 

Tiphaine Claveau for I’MTech


Social media: the everyday sexism of advertising algorithms

Social media advertising algorithms can create paradoxical situations, where messages aimed at women are mostly displayed to men. These are the findings of successive research projects carried out by Grazia Cecere at the Institut Mines-Télécom Business School, in partnership with EPITECH, the University of Paris-Saclay and the MIT School of Management. The team has shed light on some of the mechanisms of algorithms that, at first glance, maintain or amplify non-parity biases.

 

Advertising algorithms prefer men. At least, those of social networks such as Facebook, Snapchat, Twitter, and LinkedIn do. This is the conclusion of several successive research projects by Grazia Cecere, a privacy economist at the Institut Mines-Télécom Business School, who has been working on the biases of algorithms for several years. In her research, she provides insights into the mystery of the advertising algorithms used by the major social platforms.  “These algorithms decide and define the information seen by the users of social networks, who are mostly young people”, she stresses.

Through collaborative work with researchers from EPITECH (Clara Jean) and the University of Paris-Saclay (Fabrice Le Guel and Matthieu Manant), Grazia Cecere looked at how an advertiser’s message is processed and distributed by Facebook algorithms. The team launched two sponsored advertisements aimed at recruiting engineering school students. The advertisements used the same image, at the same price per appearance on user accounts, and the same target population: high school students between 16 and 19 years old, with no gender specified. The advertisement was therefore aimed at teenagers and young students.

There was one difference in the text of the advertisements, both of which promoted school-leaving pay rates for engineers and their rate of integration into the working world. On one of the ads: “€41,400 gross annual salary on average.” On the second: “€41,400 gross annual salary on average for women.” The researchers’ question was: how will these two ads be distributed among men and women by the algorithm?

The results: first, the advertisement with a message aimed at women received fewer views from users, regardless of the target, and it was shown predominantly to young men. The specification “for women” in the advertising text was not enough to direct the algorithm towards targeting high school girls more than high school boys. However, the researchers note in their publication that the algorithm appeared to treat targets between 16 and 17 years of age, minors, differently than targets between 18 and 19 years of age, adults. The algorithm slightly favored adult high school girls in the advertisement “for women”, compared to minor high school girls who were less likely to see it.

“This indicates that the algorithm uses different decision processes for younger and older targets,” say Grazia Cecere and her colleagues. “This is consistent with the strict legislation, such as the GDPR and COPPA, surrounding the use of digital technology by minors in Europe and the United States.” While adult high school girls were more likely to see the advertisement than their younger peers, it is important to remember that they were still targeted less often than their male counterparts. The difference in algorithm treatment between minors and adults does not correct the gender bias in the advertising.

Another observation: the neutral advertisement – which did not specify “for women” – was more widely disseminated than the advertisement targeted at women, and here again, it was mainly aimed at men. This observation can be explained both by the length of the advertising text but also by its gendered orientation. Generally speaking, women have privileged access to this type of content when advertising is not specifically for women. Moreover, the word “women” in the text also led the algorithm to introduce an additional criterion, thus reducing the sample of targets – but clearly without favoring high school girls either.

Nevertheless, after several campaigns aimed at understanding the targeting mechanisms of these two ads, the researchers showed that the algorithm was capable of adapting its target according to the gender-specific text of the ad, which nonetheless reveals a market bias: targeting adult women costs advertisers more.

Complexity for advertisers

These results show the opacity of advertising algorithms and the paradoxical biases they entail. For engineering schools, diversity and parity are major recruitment challenges. Every year, schools invest efforts and resources in campaigns specifically targeted at women to attract them into sectors that remain highly masculine, without realizing that there are algorithmic decision parameters that are very complicated to control.

Read on I’MTech: Restricting algorithms to limit their powers of discrimination

This type of research sheds light on the avidly protected mechanisms of advertising algorithms and identifies good practices. However, Grazia Cecere reminds us that the biases generated by the algorithms are not necessarily voluntary: “They are often the consequences of how the algorithm optimizes the costs and views of the ads.” And these optimization methods are not initially based on male favoritism.

In 2019, research by Grazia Cecere, conducted with the same team and Catherine Tucker, a distinguished researcher at the MIT Sloan School of Management, showed the complexity of the link between optimization and algorithm bias, through an example of Snapchat advertising campaigns. The content of the advertisements was identical: advertising an engineering school for recruitment purposes. In this research, four similar advertising campaigns were launched with identical populations in all major cities in France. All other conditions remained the same, but a different photo was used for each campaign: a man from behind with a T-shirt bearing a message for men, a woman from behind with a T-shirt bearing a message for women, and the equivalents of these two photos without the people’s heads.


To test the differences in the way algorithms process the images for men and women, the researchers published four photos on Snapchat.

 

During the advertising campaign, the full photo of the man was the most often displayed, ahead of that of the man’s torso only, the woman’s torso only, and finally the full photo of the woman. Behind these results is an explanation of how the algorithm optimizes dissemination dynamically. “On the first day, the full photo of the man was the one that attracted the most visits by Parisians to the associated website” says Grazia Cecere. “This then led us to demonstrate that the algorithm bases itself on choices from cities with large populations to optimize targets. It replicates this in the other towns. It tends to optimize an entire campaign on the initial results obtained in these areas, by replicating them in all other areas.”

This case is typical of an indirect bias. “Maybe the Parisian users were more sensitive to this photo because there were more male students who identified with the ad in that city? Perhaps there are simply more male users in Paris? In any case, it is the behavior of Parisian users that oriented the algorithm towards this bias; it is not the algorithm that sought this result,” stresses the researcher. However, without knowledge of the mechanisms of the algorithm, it is difficult for advertisers to predict these behaviors. The results of the research raise a question: is it acceptable, when trying to reach a balanced population – or even to target women preferentially in order to correct inequalities in professional fields – that the platforms’ algorithms lead to the opposite effect?

Interview by Benjamin Vignard, for I’MTech.



What is SDN (Software-Defined Networking)?

5G is coming and is bringing a range of new technologies to the table, including Software-Defined Networking. It is an essential element of 5G, and is a network development concept with a completely new infrastructure. Adlen Ksentini, a researcher at EURECOM, presents the inner workings of SDN.

 

How would you define SDN?

Adlen Ksentini: Software-Defined Networking (SDN) is a concept that was designed to “open up” the network, to make it programmable in order to manage its resources dynamically: on-demand routing, load distribution between equipment, intrusion detection, etc. It is an approach that allows network applications to be developed using a classic programming language, without worrying about how it will be deployed.

A central controller (or SDN controller) with overall control over the infrastructure will take care of this. This creates more innovation and productivity, but above all greater flexibility. SDN has evolved significantly in recent years to be integrated into programming networks such as 5G.

How does SDN “open up” the network?

AK: A network is arranged in the following way: there is a router, a kind of traffic agent for data packets, a control plane that decides where those data packets go, and a forwarding plane that transmits them.

The initial aim was to separate the control plane from the forwarding plane in the equipment, because each piece of equipment had its own configuration method. With SDN, router configuration is shared and obtained via an application above the SDN controller. The application uses the functions offered by the SDN controller, and the SDN controller translates these functions into a configuration understood by the routers. Communication between the SDN controller and the routers is done through a standardized protocol, such as OpenFlow.
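As an illustration only, an SDN application might push a forwarding rule to the controller’s northbound interface along these lines. The URL and JSON schema are invented, not those of a specific controller:

```python
import requests  # pip install requests

# Hypothetical northbound REST API of an SDN controller.
CONTROLLER = "http://sdn-controller.example:8181/api/flows"

flow_rule = {
    "switch": "of:0000000000000001",       # datapath ID of the target router/switch
    "priority": 100,
    "match": {"ipv4_dst": "10.1.2.0/24"},  # traffic to this subnet...
    "action": {"output_port": 3},          # ...is forwarded out of port 3
}

# The application only expresses intent; the controller translates it into
# OpenFlow messages understood by the routers.
response = requests.post(CONTROLLER, json=flow_rule, timeout=5)
print(response.status_code)
```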

How was the SDN developed?

AK:  The term first appeared about ten years ago and has been widely used ever since Cloud Computing became commonly used. “On-demand” networks were created, with virtual machines that then needed to be interconnected. This is the purpose of the SDN controller that will link these virtual machines together, translating information coming from different services. The concept has evolved and become a core technology, making it possible to virtualize infrastructure.

Why is it an essential part of 5G?

AK: 5G is intended for use in different markets. For example, Industry 4.0 or augmented reality require a variety of network services. Industry 4.0 will require very low latency, while augmented reality will focus on high bandwidth. To manage these different types of services, 5G will use the concept of network slicing.

This consists in virtualizing an infrastructure in order to share it. SDN is the key to interconnecting these slices, as it creates the ability to allocate network resources on demand. Thanks to this flexibility, it is possible to create specific network slices for each use. This is the principle of core network virtualization that is fundamental to 5G.

How does this principle of “resources on demand” work?

AK:  Imagine a company that does not have enough resources to invest in hardware. They will rent a virtual network: a cloud service offered for example by Amazon, requesting resources defined according to their needs. It could be a laboratory that wants to run simulations but does not have the computing capacity. They would use a cloud operator who will run these simulations for them. Storage capacity, computing power, bandwidth, or latency are thus configured to best meet the needs of the company or laboratory.

Why do we talk about new infrastructure with the SDN?

AK: The shift from 3G to 4G was an improvement in throughput or bandwidth, but was basically the same thing. 5G, with SDN, has a better infrastructure through this virtualization and can not only capture classic mobile phone users, but also open the market to industries.

SDN offers unique flexibility to develop innovative services and open the networks to new uses, such as autonomous vehicles, e-health, industry 4.0, or augmented reality. All these services have special needs and we need a network that can connect all these resources, which will certainly be virtual.

Tiphaine Claveau for I’MTech