
What is NFV (Network Function Virtualization)?

The development of 5G has been made possible by a range of new technologies. One of them is Network Function Virtualization, or NFV, whose role is to virtualize network equipment. Adlen Ksentini, a researcher at EURECOM, gives us a detailed overview of this virtualization.

 

What is NFV?

Adlen Ksentini: NFV is the virtualization of network functions, a system that service providers and network operators had hoped for in order to decouple software from hardware. It’s based on cloud computing: the software can be placed in a virtual environment – the cloud – and run on everyday PCs. The goal is to be able to use software that implements a network function and run it on different types of hardware, instead of having to purchase dedicated hardware.

How does it work?

A.K.: It relies on the use of a hypervisor, a virtualization layer that makes it possible to abstract the hardware. The goal is to virtualize the software that implements a network function to make it run on a virtual machine or a cloud-based container.
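As a minimal sketch of this idea, the following launches a network function (here a DNS server) as a software container on ordinary hardware, using the Docker SDK for Python. The image name and port mapping are illustrative assumptions, not part of any NFV standard.

```python
# A minimal sketch of a virtualized network function (VNF): the network
# function is ordinary software packaged in a container, launched on
# commodity hardware instead of shipped as a dedicated appliance.
# The image name and port mapping below are illustrative assumptions.
import docker

client = docker.from_env()

vnf = client.containers.run(
    "ubuntu/bind9",          # hypothetical image providing a DNS server
    detach=True,             # run in the background, like an appliance
    name="vnf-dns",
    ports={"53/udp": 5353},  # expose the DNS port on the host
)
print(f"VNF running in container {vnf.short_id}")
```

The same image could just as well be scheduled onto a virtual machine or a cloud cluster: the function no longer cares what hardware sits underneath.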

What kind of functions are virtualized?

A.K.: When we talk about network functions, it could refer to the router that sends packets to the right destination, firewalls that protect networks, DNS servers that translate domain names into IP addresses, or intrusion detection. All of these functions will be deployed in virtual machines or containers, so that a small or medium-sized company, for example, doesn’t have to invest in infrastructure to host these services, and may instead rent them from a cloud services provider, using the Infrastructure as a Service (IaaS) model.

What are the advantages of NFV?

A.K.: NFV provides all the benefits of cloud computing. First of all, it lowers costs since you only pay for the resources you use. It also provides greater freedom since the virtualization layer enables the software to work on several types of hardware. Finally, it makes it possible to react to varying degrees of traffic: if there’s a sudden rise in traffic, it’s possible to scale up to meet demand.

Performance is another factor involved. Under normal circumstances, the computer’s operating system will not dedicate all of the processor’s capacity to a single task – it will spread it out and performance may suffer. The benefit of cloud computing is that it can take advantage of the almost unlimited resources of the cloud. This also makes for greater elasticity, since resources can be freed up when they are no longer needed.
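A toy sketch of that elasticity, assuming a simple threshold rule: instances are added when traffic rises and released when it falls. The capacity figure and the notion of “load” are placeholders, not values from any NFV specification.

```python
# A toy autoscaler for a virtualized network function: compute how many
# instances are needed so that none runs above a utilization threshold.
# The threshold and the load values are illustrative assumptions.
import math

def instances_needed(traffic_load: float, capacity_per_instance: float = 0.7) -> int:
    """Smallest number of instances keeping each below its capacity."""
    return max(1, math.ceil(traffic_load / capacity_per_instance))

# Traffic expressed as a multiple of one instance's nominal capacity:
for load in [0.3, 1.5, 4.2, 0.9]:
    print(f"load {load:.1f} -> {instances_needed(load)} instance(s)")
```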

Why is this technology central to 5G?

A.K.: 5G core networks are virtualized: they will run natively in the cloud. So we need software that is able to run these network functions in the cloud. NFV provides a number of advantages, and that’s why it is used for the 5G core. NFV and SDN are complementary and together make it possible to obtain a virtual network.

Read more on I’MTech: What is SDN (Software-Defined Networking)?

What developments are ahead for NFV?

A.K.: Communication technologies have created a framework for orchestrating and managing virtual resources, but the standard continues to evolve and a number of studies seek to improve it. Some aim to work on the security aspect, to better defend against attacks. But we’re also increasingly hearing about using artificial intelligence to enable operators to improve resources without human intervention. That’s the idea behind Zero Touch Management: NFV networks that are self-healing, self-managing and, of course, secure.

 

Tiphaine Claveau for I’MTech


STREAM: Bone tissue model culture

The STREAM project, which Mikhaël Hadida is working on at Mines Saint-Étienne, aims to develop a controlled, carefully managed bone tissue culture platform. This system would facilitate observation in order to study the mechanisms at work in bone tissue, while reducing the costs and time required for research.

 

A culture system for bone tissue models in a controlled environment that allows for simplified observations: this is the aim of the STREAM project. Led by Mikhaël Hadida, a researcher at Mines Saint-Étienne, the STREAM research project (System for Bone Tissue Relevant Environment And Monitoring) aims to provide academic and industrial players with a useful tool for developing simplified, standardized and automated bone tissue models.

This in vitro culture system seeks to provide an innovative alternative to animal testing in bone biology research. It is designed for laboratories, but could also be used to validate pharmaceutical compounds or medical devices. This could pave the way for advances in research on osteoporosis, for example, or help make living bone grafts for regenerative medicine applications.

“The structure allows us to control the mechanical parameters of the system,” explains Mikhaël Hadida. It will be possible to control the culture conditions “and measure cell activity by collecting key data in real time.” The ability to control the mechanical environment of this type of culture has never before been reported in the scientific literature. This would allow for better reproducibility between experiments, a key issue in research, while lowering costs and reducing the time required for research.

Little-understood mechanisms

One of the problems with current systems is the lack of a controlled environment. If the environment is not homogenous, it is difficult to draw conclusions from the experiments. “In research, we still don’t have a clear understanding of what is involved in mechanical stress on bone tissue,” explains Mikhaël Hadida.

Current systems are generally designed using biomimicry based on demineralized bone. They remain very close to the human system, which suggests better performance. “But it’s also a very complex system that is not possible to control or manage,” says the researcher.

There may be other phenomena involved, which in turn could affect the results observed. It is difficult to determine whether the parameters used have a real impact on the culture, or if other mechanisms the researchers are not aware of come into play. New collagen-based structures have also been developed, but they are still based on these highly complex systems. “It’s an attempt to find a shortcut – seeking performance before the foundations of these mechanisms are properly understood,” he adds.

The Bone Stream system

Mechanical stress in bones is not directly related to wear and tear: bone cells react to mechanical stimuli, and a lack of such stimulation can damage bone tissue. Our bones are made up of a solid matrix and pore fluid. Walking or running leads to micro-stimulations – micro-compressions of this fluid – which create “currents” that stimulate the mechanisms involved.

“At the Centre for Biomedical and Healthcare Engineering, David Marchat can make us tailor-made culture scaffolds, which are perforated,” explains Mikhaël Hadida. The system the research team is working on is perfused to reproduce these fluid movements in the culture medium and obtain conditions similar to those of real life. “So we have a good combination in terms of the culture chamber and the culture medium to ensure that all the bone cells have a homogenous environment.”

Read more on I’MTech: Bone implants to stimulate bone regeneration

This homogeneity is essential to control the system and understand what impacts the culture, so as to be certain that it is indeed the mechanisms used that are responsible and not another variable the researchers are not aware of. Parameters tested include, for instance, fluid flow speed and what is known as shear stress. If you put pressure on a cube by moving your hand down its side, you apply a force parallel to the surface: this is shear stress.
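For reference, the cube example corresponds to the textbook definition of shear stress, and for the fluid moving over cells in a perfused scaffold the same quantity follows from the flow’s velocity gradient at the wall. These are standard definitions, not formulas specific to STREAM:

```latex
% Shear stress: a force F applied parallel to a surface of area A.
\tau = \frac{F}{A}
% Wall shear stress from a fluid of dynamic viscosity \mu flowing over
% a surface, evaluated from the velocity gradient at the wall (y = 0).
\tau_w = \mu \left. \frac{\partial u}{\partial y} \right|_{y=0}
```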

Direct observation

“This system would also offer a major advantage for making observations,” says Mikhaël Hadida. “Current culture systems are destructive. In order to observe the culture they must be interrupted,” he explains. The culture scaffold must first be taken out of its environment. Then cell markers must be injected and it must be cut into very fine slices so that they can be examined under a microscope. It can also be ground up to analyze the DNA.

If you wish to study the development of your culture, these destructive methods pose a major problem. “If you want to observe your culture on a given day, then two days later, then a week later, your needs increase with all these observations,” he says. This means culture scaffolds must be available for each observation, which requires a significant investment, in terms of time and costs.

“So we wanted to develop a platform that could observe the culture in real time,” says the researcher. Their culture scaffold is installed on the transparent inner face of the culture chamber. This extends into a viewing chamber to monitor its evolution with a microscope without disturbing the culture scaffold. “We also have sensors to study the evolution of cell activity,” he adds. “They monitor changes in the culture medium very closely and therefore keep track of how the cells develop.”

The goal is now to continue developing this platform to bring a culture tool to the market that stands out from those already available. “For example, we’re working on a project with the European Space Agency (ESA) on this topic,” says Mikhaël Hadida. The astronauts on board the International Space Station (ISS) are subjected to specific mechanical stress and present a phenomenon of accelerated aging of bone tissue upon their return. ESA is therefore actively involved in this research in order to better understand this phenomenon and find solutions.

 

By Tiphaine Claveau, for I’MTech


How are borders in industry changing?

The Industry Without Borders project launched by the German-French Academy for the Industry of the Future in 2017 seeks to challenge the idea that digital technology dissolves borders. Madeleine Besson and Judith Igelsboeck, from Institut Mines-Télécom Business School and the Technical University of Munich respectively, explain why it is not so easy in practice.

 

Industry is going digital and this has brought about a wave of changes. The emphasis on open innovation pushes for the dissolution of borders within a company and in relationships between various organizations. “It’s not so easy,” says Madeleine Besson, a researcher in management at Institut Mines-Télécom Business School. “We’ve seen that it’s becoming more fluid, but digitalization can also create new borders or reinforce those that already exist.”

The aim of the Industry Without Borders project launched in 2017 was to identify factors that can lead to the creation of borders in companies and look at how these aspects are evolving. The project is led by the German-French Academy for the Industry of the Future, which brings together teams from IMT and the Technical University of Munich (TUM). “We looked at the way borders can be built, rebuilt, and at times, strengthened or effectively broken down,” says Judith Igelsboeck, an organizational studies researcher at TUM. Research teams on both sides of the Rhine worked with companies, through field studies and qualitative interviews, in order to determine the changes that have been brought about by digital technology.

“We considered the idea of open innovation in particular,” says Madeleine Besson. Today, companies consult consumers much more often in their creation and innovation processes, but this input often remains under the company’s control. Conversely, Judith Igelsboeck reports that “a study in an IT consulting firm in Germany showed that customers went so far as to request the firm’s skills database so that they could choose the profiles of the IT specialists for their project directly themselves. The opening here is therefore clear.”

What borders?

“For a long time, borders in the business world were formalized from an economic viewpoint,” explains the French researcher. This includes assets and goods, employees and the machines used. “But the scope is much wider than that, and most importantly, it’s very mobile.” A number of other aspects may also come into play, such as customer relationships and their involvement in innovation processes, as in the previous example, and relationships between different companies.

As far as internal borders are concerned, for example concerning organization within a department, management models tend to be moving toward eliminating borders. “This idea is reflected in efforts to redesign the very architecture of the office – the principle of open space,” explains Judith Igelsboeck. Workspaces become more agile, flexible and free of internal separations. The aim is to create more communal spaces, so that co-workers get to know each other better in order to work together.

But ultimately, open space may not be as open as it seems. “Employees reclaim ownership of the space by creating borders to mark out their personal space,” says Madeleine Besson. They do so by making special adjustments – at times perceptible only to those who use the workspace – to mark a space as their own.

Read more on I’MTech: Can workspaces become agile?

Madeleine Besson reminds us that “the general consensus in scientific literature and the media is that digital tools and artificial intelligence facilitate instant connections, not only between people, but between things.” Supply chains should develop into a single, seamless, automated process that can work beyond organizational borders. But it is not so clear in practice, and digital tools even appear to add new barriers.

Between theory and practice

“Imagine a printing company that uses an automated tool to help manage its paper supply,” says the French researcher. “A connected IT system between the paper supplier and the printer could help regulate paper ordering depending on current stock and the factories’ operations. The supplier becomes a sort of stockpile the company can draw on – the system is shared and the borders are therefore weakened.”

Yet the same example could also be used to illustrate how new borders are created. If these companies use competing systems, such as Apple and Android, they will face insurmountable barriers since these two systems are not interoperable. “Technological change can also create a new border,” adds Madeleine Besson. “It can create sub-categories between organizations that have the desire and skills to converse with computers, and others that may feel like they are merely assistants for automatons.”

“Our team encountered such a feeling during an interview with the staff of an after-sales service company,” says the researcher. Their workday revolves around making rounds to customers whose equipment has broken down. Traditionally, these employees organized their own rounds. But their schedule is now managed by a computer system and they receive the list of customers to visit the night before. “The employees were frustrated that they were no longer in control of their own schedule. They didn’t want their responsibilities to be taken away,” she explains.

“They would meet up in the morning before making their rounds to exchange appointments. Some didn’t want to go into big cities, others wanted to keep the customers they’d been working with for a long time. So the digital tool puts up a barrier within the company and is a source of tension and frustration, which could potentially give rise to conflicts or disputes.” These changes are made without adequately consulting the internal parties involved and can lead to conflict that affects the company’s overall operations.

Across the Rhine

This initial phase of the project with a number of field studies in France and Germany is expected to lead to new collaborations. For the researchers, it would be interesting to study the changes on either side of the Rhine and determine whether similar transformations are underway, or if a cultural aspect may lead to either the dissolution or crystallization of borders.

“Each country has its own vision and strategy for the industry of the future,” says Judith Igelsboeck. So it is conceivable that cultural differences will be perceptible. “The intercultural aspect is a point to be considered, but for now, we haven’t been able to study it in a single company with a German and French branch.” This may be the topic for a new French-German collaboration. The German researcher says that another possible follow-up to this project could focus on the use of artificial intelligence in business management.

 

Tiphaine Claveau for I’MTech


In French nursing homes, the Covid-19 crisis has revealed the detrimental effects of austerity policies

This article was originally published (in French) in The Conversation.
By Laura Nirello, IMT Lille Douai, and Ilona Delouette, University of Lille.


 

With apocalyptic accounts of conditions in French nursing homes, where deaths have soared (over 9,000 estimated as of 3 May 2020), the Covid-19 pandemic has revealed, more than ever, the hardships facing this sector.

For years, care providers who work in France’s nursing homes (known as EHPADs in French) have been sounding the alarm about the crisis facing such facilities, underscoring the pitfalls of austerity policies and budgeting when applied to healthcare and care for dependent persons.

The ‘EHPAD’ nursing home status was created in 1997 when, after twenty years of discussions, the government abandoned the idea of covering care for dependent persons through the national Social Security program. At the time, the decision was based on a number of technical considerations, in particular uncertainty as to how the cost of providing care for dependent persons would develop in the future, and therefore how it would be budgeted over time.

In 1997, a welfare allowance managed by the departments was therefore put in place: the PSD (specific dependency allowance), which has since been replaced by the APA (personal care allowance).

Impossible to separate ‘cure’ from ‘care’?

This theoretical separation between healthcare, funded by Social Security, and care for dependent persons, funded by the departments, is at odds with the reality of care situations. Indeed, how can that which pertains to health (cure) be separated from that which pertains to assisting dependent persons (care)?

It is even more difficult to separate the two aspects in the case of highly dependent persons who require medical care in an institutional setting. The ‘EHPAD’ nursing home status was created precisely to cope with the influx of highly dependent persons: it makes facilities eligible for funding from both the public health insurance program and the departments.

Funding for nursing homes is therefore based on a three-part pricing system according to a theoretical categorization of costs (medical care, dependent care, living expenses). This funding is provided by public authorities, all of whom have limited budgets.

Living expenses are paid for by residents and their families. ‘Medical care’ is 100% funded through public health insurance, through the Regional Health Agency (ARS) while ‘dependent care’ is primarily funded by Departmental Councils. The Regional Health Agencies are limited to the fixed budgets voted upon annually through the Social Security Financing Act, while the Departmental Councils are limited to the funds transferred from the State through the personal care allowance (APA).

Medical care for the lowest possible cost

As part of the austerity policies imposed on the hospital sector, healthcare regulators gradually sought to remove expenditure for dependent elderly persons from hospital accounts. As such, according to IGAS (the French General Inspectorate of Social Affairs), over a ten-year period (2006-2016) more than half of the beds in long-term care units (USLD) filled by highly dependent persons whose condition requires constant medical supervision were converted into places in nursing homes. Elderly people suffering from a loss of independence had no choice but to follow this trend and were sent to nursing homes. The State also invested in home care services and independent living facilities for the most independent individuals. This made it possible to limit the number of new places created in nursing homes.

The funding for nursing homes is negotiated through multi-year performance and resource contracts (CPOM) that determine an average level of dependency and severity of medical conditions for residents for a five-year period: the institutions are responsible for remaining within these averages and controlling resident admissions and discharges based on their level of dependency.

In this way, the authorities who fund the nursing homes pushed them to specialize in highly dependent residents by taking in individuals excluded from the hospital setting and no longer independent enough to live at home or in intermediate living facilities. Nursing homes also tend to provide care for a community with an increasing number of medical conditions: more than a third of residents suffer from Alzheimer’s disease and struggle to perform everyday tasks (90% of residents need help with bathing and grooming); residents are admitted at an increasingly advanced age (85 years and 8 months on average) and stays in nursing homes are shorter (2 years and 5 months on average), according to data from the DREES (Directorate for Research, Studies, Evaluation and Statistics, a directorate of the central administration of the health and social ministries).

But nursing homes’ resources have not kept pace with this changing resident profile. According to the DREES, while nursing homes now provide care for residents whose needs closely resemble those in long-term care units (USLD), the caregiver-to-patient ratio is 0.62 full-time equivalent employees per resident, compared to 1.1 full-time equivalent employees per patient in long-term care units.

Moreover, while the staff of long-term care units are primarily made up of nurses, geriatric medicine specialists and nursing assistants, in nursing homes there is only a single coordinating physician. And this physician is only present on a part-time basis, since they work at several facilities. Likewise, there are few nurses (5.8 per 100 residents) and they are not on site at night, whereas nurses are present 24 hours a day in long-term care units. Nursing home staff are primarily made up of nursing assistants and auxiliary staff, who are undoubtedly extremely devoted to their work, but are not adequately trained for the tasks they carry out and are certainly underpaid as a result.

Deteriorating work and care conditions

Nursing homes find themselves facing a chronic lack of public funding. It therefore comes as no surprise that, faced with emergency situations and endless needs, employees inevitably perform tasks that extend beyond their job description: they have no choice but to carry out tasks that are essential, but for which they are not qualified, to provide residents with the care and assistance they need (auxiliary staff help with grooming while nursing assistants provide medical care). There is a disconnect between the work performed and salary levels, which remain low, making the sector unappealing. As a result, most nursing homes struggle to recruit staff, which exacerbates the already low caregiver-to-resident ratio in these facilities.

Working conditions have become even more difficult as managerial practices have changed as a result of efforts to control public spending, leading to a demand for cost-effectiveness in nursing homes. These changes run counter to the founding principles of the facilities. As successors to retirement homes, these institutions are also living communities, with a great number of interpersonal needs relating to accommodation (laundry, dining services), individual relationships and social life (care).

But in an effort to streamline operations, which goes hand in hand with cutting costs, work is “industrialized”: tasks are standardized and must be completed at a faster pace. The goal is to cut back on time considered to be “unproductive” – saying “Good morning” and “How are you today?” to residents in the morning, or talking with them calmly in the evening – which ultimately means all interpersonal aspects.

As far as indicators for funding institutions are concerned, public authorities prioritize tasks meant to accurately reflect operational productivity: the number of patients assisted with bathing and grooming or the number of meals served! This intensifies the trend toward the dehumanization of living conditions in nursing homes, which are gradually turning into “places to die.”

Dependent care, a post-crisis challenge for Social Security

This situation is alarming under normal circumstances, in particular from an ethical and social justice perspective, but it becomes tragic in the event of a health crisis. This is especially true today during the Covid-19 crisis. As the virus is wreaking havoc in these institutions, nursing homes lack medical staff to prescribe and administer the medications needed to keep patients alive and maintain their cognitive functions, or provide end-of-life care (Midazolam, Perfalgan). Staff members who are not considered caregivers had to wait for public authorities to decide to provide them with protective equipment, although it is critical to protecting high-risk residents. And while these residents are isolated in their rooms and visits are prohibited, employees do not have the time to comfort and support them at this difficult time.

These tragic circumstances call for a drastic rethinking of the nursing home model as many reports have suggested (Mission Flash Iborra-Fiat in 2018, Libault report in 2019). The fact is that these issues are related to the way the sector is funded.

While various studies have assessed the funding requirements for nursing homes at between €7 and €10 billion, establishing coverage of care for dependent persons within the Social Security system, accompanied by increased resources based on needs, would have the advantage of doing away with the impossible separation between ‘cure’ and ‘care’, which has been maintained up to now for budgetary reasons, but which has shown its limitations, both in terms of managing hospitals and caring for dependent persons.


Laura Nirello, Assistant Professor in economics, IMT Lille Douai – Institut Mines-Télécom and Ilona Delouette, PhD student in economics, University of Lille

This article has been republished from The Conversation under a Creative Commons license. Read original article (in French).


Social media: the everyday sexism of advertising algorithms

Social media advertising algorithms can create paradoxical situations, where messages aimed at women are mostly displayed to men. These are the findings of successive research projects carried out by Grazia Cecere at the Institut Mines-Télécom Business School, in partnership with EPITECH, the University of Paris-Saclay and the MIT Sloan School of Management. The team has shed light on some of the mechanisms of algorithms that, at first glance, maintain or amplify gender biases.

 

Advertising algorithms prefer men. At least, those of social networks such as Facebook, Snapchat, Twitter, and LinkedIn do. This is the conclusion of several successive research projects by Grazia Cecere, a privacy economist at the Institut Mines-Télécom Business School, who has been working on algorithmic bias for several years. In her research, she provides insights into the mystery of the advertising algorithms used by the major social platforms. “These algorithms decide and define the information seen by the users of social networks, who are mostly young people,” she stresses.

Through collaborative work with researchers from EPITECH (Clara Jean) and the University of Paris-Saclay (Fabrice Le Guel and Matthieu Manant), Grazia Cecere looked at how an advertiser’s message is processed and distributed by Facebook algorithms. The team launched two sponsored advertisements aimed at recruiting engineering school students. The advertisements used the same image, the same price per impression, and the same target population: high school students between 16 and 19 years old, with no gender specified. The advertisement was therefore aimed at teenagers and young students.

There was one difference in the text of the advertisements, both of which promoted the starting salaries of engineering graduates and their rate of integration into the working world. On one of the ads: “€41,400 gross annual salary on average.” On the second: “€41,400 gross annual salary on average for women.” The researchers’ question was: how would these two ads be distributed among men and women by the algorithm?

The results: first, the advertisement with a message aimed at women was viewed by fewer users overall, regardless of the target, and it was shown predominantly to young men. The specification “for women” in the advertising text was not enough to direct the algorithm towards targeting high school girls more than high school boys. However, the researchers note in their publication that the algorithm appeared to treat targets between 16 and 17 years of age, minors, differently than targets between 18 and 19 years of age, adults. The algorithm slightly favored adult high school girls in the advertisement “for women”, compared to minor high school girls who were less likely to see it.

“This indicates that the algorithm uses different decision processes for younger and older targets,” say Grazia Cecere and her colleagues. “This is consistent with the strict legislation, such as the GDPR and COPPA, surrounding the use of digital technology by minors in Europe and the United States.” While adult high school girls were more likely to see the advertisement than their younger peers, it is important to remember that they were still targeted less often than their male counterparts. The difference in how the algorithm treats minors and adults does not correct the gender bias in the advertising.

Another observation: the neutral advertisement – which did not specify “for women” – was more widely disseminated than the advertisement targeted at women, and here again, it was shown mainly to men. This observation can be explained both by the length of the advertising text and by its gendered orientation. Generally speaking, women were relatively more likely to see this type of content when the advertising was not specifically aimed at women. Moreover, the word “women” in the text also led the algorithm to introduce an additional criterion, thus reducing the sample of targets – but clearly without favoring high school girls either.

Nevertheless, after several campaigns aimed at understanding the targeting mechanisms of these two ads, the researchers showed that the algorithm was capable of adapting its target according to the gender-specific text of the ad, which nonetheless reveals a market bias: targeting adult women costs advertisers more.
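As a rough sketch of how such a skew can be read out of a campaign report, the following computes each ad’s impression split and effective cost per thousand impressions (CPM) by gender. All figures are made-up placeholders, not the study’s data.

```python
# Hypothetical campaign report: (ad, gender) -> (impressions, spend in euros).
# The numbers are placeholders chosen only to illustrate the computation.
report = {
    ("neutral", "men"):     (6200, 31.0),
    ("neutral", "women"):   (3800, 26.6),
    ("for women", "men"):   (4100, 20.5),
    ("for women", "women"): (2600, 20.8),
}

for (ad, gender), (impressions, spend) in report.items():
    cpm = 1000 * spend / impressions  # effective cost per 1,000 impressions
    print(f"{ad!r} ad, {gender}: {impressions} impressions, CPM = {cpm:.2f} EUR")
```

With numbers like these, the report would show both effects described above: fewer impressions reach women, and each thousand impressions shown to women costs the advertiser more.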

Complexity for advertisers

These results show the opacity of advertising algorithms and the paradoxical biases they entail. For engineering schools, diversity and parity are major recruitment challenges. Every year, schools invest efforts and resources in campaigns specifically targeted at women to attract them into sectors that remain highly masculine, without realizing that there are algorithmic decision parameters that are very complicated to control.

Read on I’MTech: Restricting algorithms to limit their powers of discrimination

This type of research sheds light on the closely guarded mechanisms of advertising algorithms and identifies good practices. However, Grazia Cecere reminds us that the biases generated by the algorithms are not necessarily deliberate: “They are often the consequences of how the algorithm optimizes the costs and views of the ads.” And these optimization methods are not initially based on male favoritism.

In 2019, research by Grazia Cecere, conducted with the same team and Catherine Tucker, a distinguished researcher at the MIT Sloan School of Management, showed the complexity of the link between optimization and algorithm bias, through an example of Snapchat advertising campaigns. The content of the advertisements was identical: advertising an engineering school for recruitment purposes. In this research, four similar advertising campaigns were launched with identical populations in all major cities in France. All other conditions remained the same, but a different photo was used for each campaign: a man from behind with a T-shirt bearing a message for men, a woman from behind with a T-shirt bearing a message for women, and the equivalents of these two photos without the people’s heads.


To test the differences in the way algorithms process the images for men and women, the researchers published four photos on Snapchat.

 

During the advertising campaign, the full photo of the man was displayed most often, ahead of the man’s torso only, the woman’s torso only, and finally the full photo of the woman. Behind these results is an explanation of how the algorithm optimizes dissemination dynamically. “On the first day, the full photo of the man was the one that attracted the most visits by Parisians to the associated website,” says Grazia Cecere. “This then led us to demonstrate that the algorithm bases itself on choices from cities with large populations to optimize targets. It tends to optimize an entire campaign on the initial results obtained in these areas, by replicating them in all other areas.”

This case is typical of an indirect bias. “Maybe the Parisian users were more sensitive to this photo because there were more male students who identified with the ad in that city? Perhaps there are simply more male users in Paris? In any case, it is the behavior of Parisian users that oriented the algorithm towards this bias; it is not the algorithm that sought this result,” stresses the researcher. However, without knowledge of the mechanisms of the algorithm, it is difficult for advertisers to predict these behaviors. The results of the research raise a question: is it acceptable, when trying to reach a balanced population – or even to target women preferentially in order to correct inequalities in professional fields – that the platforms’ algorithms lead to the opposite effect?

Interview by Benjamin Vignard, for I’MTech.



What is SDN (Software-Defined Networking)?

5G is coming and is bringing a range of new technologies to the table, including Software-Defined Networking. An essential element of 5G, it is a networking concept that comes with a completely new approach to infrastructure. Adlen Ksentini, a researcher at EURECOM, presents the inner workings of SDN.

 

How would you define SDN?

Adlen Ksentini: Software-Defined Networking (SDN) is a concept that was designed to “open up” the network, to make it programmable in order to manage its resources dynamically: on-demand routing, load distribution between equipment, intrusion detection, etc. It is an approach that allows network applications to be developed using a classic programming language, without worrying about how they will be deployed.

A central controller (or SDN controller) with overall control over the infrastructure takes care of this. This enables more innovation and productivity, but above all greater flexibility. SDN has evolved significantly in recent years and has been integrated into programmable networks such as 5G.

How does SDN “open up” the network?

AK: A network is arranged in the following way: there is the router, a kind of traffic officer for data packets; a control plane that decides where those data packets go; and a forwarding plane that transmits them.

The initial aim was to separate the control plane from the data plane in the equipment, because each piece of equipment had its own configuration method. With SDN, router configuration is shared and obtained via an application above the SDN controller. The application uses the functions offered by the SDN controller, and the SDN controller translates these functions into a configuration understood by the routers. Communication between the SDN controller and the routers is done through a standardized protocol, such as OpenFlow.
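To make this split concrete, here is a minimal sketch of an application sitting above an SDN controller, written with the open-source Ryu framework and OpenFlow 1.3. Ryu is an illustrative choice; the article does not name a specific controller. The application takes the forwarding decision, and the controller turns it into OpenFlow messages the switch understands.

```python
# A minimal sketch of an SDN application above the controller (Ryu,
# OpenFlow 1.3). The switch escalates packets it has no rule for; the
# application decides what to do with them.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class FloodSwitch(app_manager.RyuApp):
    """Floods every unknown packet: the simplest possible network function."""
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        msg = ev.msg
        datapath = msg.datapath          # the switch that asked
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser
        in_port = msg.match['in_port']

        # Decision: send the packet out of every port except the one it
        # arrived on (a real app would learn MAC addresses instead).
        actions = [parser.OFPActionOutput(ofproto.OFPP_FLOOD)]
        data = msg.data if msg.buffer_id == ofproto.OFP_NO_BUFFER else None
        out = parser.OFPPacketOut(datapath=datapath,
                                  buffer_id=msg.buffer_id,
                                  in_port=in_port,
                                  actions=actions,
                                  data=data)
        datapath.send_msg(out)
```

In this split, the switch keeps only the forwarding plane; the decisions that used to live in each box’s own configuration method now live in one program.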

How was SDN developed?

AK: The term first appeared about ten years ago and has been widely used ever since cloud computing became commonplace. “On-demand” networks were created, with virtual machines that then needed to be interconnected. This is the purpose of the SDN controller, which links these virtual machines together, translating information coming from different services. The concept has evolved and become a core technology, making it possible to virtualize infrastructure.

Why is it an essential part of 5G?

AK: 5G is intended for use in different markets. For example, Industry 4.0 or augmented reality require a variety of network services. Industry 4.0 will require very low latency, while augmented reality will focus on high bandwidth. To manage these different types of services, 5G will use the concept of network slicing.

This consists in virtualizing an infrastructure in order to share it. SDN is the key to interconnecting these slices, as it creates the ability to allocate network resources on demand. Thanks to this flexibility, it is possible to create specific network slices for each use. This is the principle of core network virtualization that is fundamental to 5G.
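A toy sketch of the slicing idea: each use case is described by its own resource profile and admitted onto shared capacity on demand. The slice names and figures are illustrative assumptions, not 5G specification values.

```python
# A toy model of network slicing: each use case gets a slice with its
# own resource profile, carved out of shared capacity on demand.
# Slice names and figures are illustrative assumptions.
slices = {
    "industry_4_0":      {"max_latency_ms": 1,  "bandwidth_mbps": 50},
    "augmented_reality": {"max_latency_ms": 20, "bandwidth_mbps": 500},
    "mobile_broadband":  {"max_latency_ms": 50, "bandwidth_mbps": 100},
}

def admit(name: str, available_mbps: int) -> tuple[bool, int]:
    """Admit a slice only if the shared capacity can still carry it."""
    need = slices[name]["bandwidth_mbps"]
    if need > available_mbps:
        return False, available_mbps
    return True, available_mbps - need

capacity = 600  # shared bandwidth of the underlying infrastructure
for name in slices:
    ok, capacity = admit(name, capacity)
    print(f"{name}: {'admitted' if ok else 'rejected'} ({capacity} Mbps left)")
```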

How does this principle of “resources on demand” work?

AK: Imagine a company that does not have enough resources to invest in hardware. They will rent a virtual network: a cloud service offered for example by Amazon, requesting resources defined according to their needs. It could be a laboratory that wants to run simulations but does not have the computing capacity. They would use a cloud operator who will run these simulations for them. Storage capacity, computing power, bandwidth, or latency are thus configured to best meet the needs of the company or laboratory.

Why do we talk about new infrastructure with SDN?

AK: The shift from 3G to 4G was an improvement in throughput and bandwidth, but it was basically the same thing. With SDN, 5G has a better infrastructure through this virtualization and can not only capture classic mobile phone users, but also open the market to industries.

SDN offers unique flexibility to develop innovative services and open the networks to new uses, such as autonomous vehicles, e-health, Industry 4.0, or augmented reality. All these services have special needs and we need a network that can connect all these resources, which will certainly be virtual.

Tiphaine Claveau for I’MTech


Health crisis, state of emergency: what safeguards are there for our fundamental freedoms?

This article originally appeared (in French) in newsletter no. 17 (April 2020) of the VP-IP Chair: Data, Identity, Trust in the Digital Age.


The current pandemic and unprecedented measures taken to slow its spread provide an opportunity to measure and assess the impact of digital technology on our societies, including in terms of its legal and ethical contradictions.

While the potential it provides, even amid a major crisis, is indisputable, the risks of infringements on our fundamental freedoms are even more evident.

Without giving in to a simplistic and unrealistic techno-solutionism, it would seem appropriate to put the current situation in perspective by looking back forty years, to a time when there was no internet or digital technology at our sides to soften the shock and provide a rapid global response.

The key role of digital technology during the health crisis

Whether to continue economic activity with remote work and remote conferences or to stay in touch with family and friends, digital technology, in its diverse uses, has proved to be invaluable in these exceptional times.

Without such technology, the situation would clearly have been much worse, and the way of life imposed upon us by the lockdown even harder to bear. It also would have been much more difficult to ensure outpatient care and impossible to provide continued learning at home for primary, secondary and higher education students.

The networks and opportunities for remote communication it provides and the knowledge it makes available are crucial assets when it comes to structuring our new reality, in comparison to past decades and centuries.

This health crisis requires an urgent response and therefore serves as a brutal reminder of the importance of research in today’s societies, marked by the appearance of novel, unforeseeable events in a time of unprecedented globalization and movement of people.

Calls for projects launched in France and Europe – whether for research on testing, immunity, treatments for the virus and vaccines that could be developed – also include computer science and humanities components. Examples include aspects related to mapping the progression of the epidemic, factors that can improve how health care is organized and ways to handle extreme situations.

Digital technology has enabled all of the researchers involved in these projects to continue working together, thinking collectively and at a fast pace. And through telemedicine (even in its early stages), it has provided a way to better manage, or at least significantly absorb, the current crisis and its associated developments.

In economic terms, although large swaths of the economy are in tatters and the country’s dependence on imports has made certain situations especially strained – as illustrated by the shortage of protective masks, a topic that has received wide media coverage – other sectors, far from facing a crisis, have seen a rise in demand for their products or services. This has been the case for companies in the telecommunications sector and related fields like e-commerce.

Risks related to personal data

While the crisis has led to a sharp rise in the use of digital technology, the present circumstances also present clear risks as far as personal data is concerned.

Never before has there been such a constant flow of data, since almost all the information involved in remote work is stored on company servers and passed through third party interfaces from employees’ homes. And never before has so much data about our social lives, family and friends been made available to internet and telecommunications companies since – with the exception of those with whom we are spending the lockdown period – more than ever, all of our communication depends on networks.

This underscores the potential offered by the dissemination and handling of the personal data that is now being generated and processed, and consequently, the potential danger, at both the individual and collective level, should it be used in a way that does not respect the basic principles governing data processing.

Yet, adequate safeguards are not yet in place, since the social contract relating to this area is still being developed and debated.

Companies that do not comply with the GDPR have not changed their practices [1], and the massive use of new forms of online connection continues to create risks, given the sensitive nature of the data that may be or is collected.

Examples include debates about the data protection policy for medical consultation interfaces and issuing prescriptions online [2]; emergency measures to enable distance learning via platforms whose data protection policies have been criticized or are downright questionable (Collaborate or Discord, which has been called “spyware” by some [3], to name just a few of many examples); and the increased use of video conferencing, for which some platforms do not offer sufficient safeguards in terms of personal data protection, or have received harsh criticism following an examination of their capacity for cybersecurity and for protecting the privacy of the information exchanged.

For Zoom, notably, which is very popular at the moment, it has reportedly been “revealed that the company shared information about some of its users with Facebook and could discreetly find users’ LinkedIn profiles without their knowledge” [4].

There are also more overall risks, relating to how geolocation could be used, for example. The CNIL (the French Data Protection Authority) [5], the European Data Protection Committee [6] and the European Data Protection Supervisor [7] have given precise opinions on this matter and their advice is being sought at the moment.

In general, mobile tracking applications, such as the StopCovid application being studied by the government [8], and the issue of aggregation of personal data require special attention. This topic has been widely covered by the French [9], European [10] and international [11] media. The CNIL has called for vigilance in this area and has published recommendations [12].

The present circumstances are exceptional and call for exceptional measures – but these measures must only infringe on our freedom of movement, right to privacy and personal data protection rights with due respect for our fundamental principles: necessity of the measure, proportionality, transparency, and loyalty, to name just a few.

The least intrusive solutions possible must be used. In this respect, Ms Marie-Laure Denis, President of the CNIL, explains, “Proportionality may also be assessed with regard to the temporary nature, related solely to the management of the crisis, of any measure considered” [13].

The exceptional measures must not last beyond these exceptional circumstances. They must not be left in place for the long term and chip away at our fundamental rights and freedoms. We must be particularly vigilant in this area, as the precedent for measures adopted for a transitional period in order to respond to exceptional circumstances (the Patriot Act in the United States, the state of emergency in France) has unfortunately shown that these measures have been continued – with doubts as to their necessity – and some have been established in ordinary law provisions and have therefore become part of our daily lives [14].


Claire Levallois-Barth, Lecturer in Law at Télécom Paris, Coordinator of the VP-IP Chair (Personal Information Values and Policies)

Maryline Laurent, Computer Science Professor at Télécom SudParis and Co-Founder of the VP-IP Chair

Ivan Meseguer, European Affairs, Institut Mines-Télécom, Co-Founder of the VP-IP Chair

Patrick Waelbroeck, Professor of Industrial Economics and Econometrics at Télécom Paris, Co-Founder of the VP-IP Chair

Valérie Charolles, Philosophy Researcher at Institut Mines-Télécom Business School, member of the VP-IP Chair, Associate Researcher at the Interdisciplinary Institute of Contemporary Anthropology (EHESS/CNRS)

 


Something phishy is going on!

Cyberattacks have been on the rise since March 2020. Hervé Debar, an information systems security researcher at Télécom SudParis, talked to us about the relationship between cyberattacks – such as phishing and zoom-bombing – and the Covid-19 health crisis.

 

For some, the crisis brought about by Covid-19 has created opportunities: welcome to the world of cyberattacks. The month of March saw a significant rise in online attacks, in part due to increased reliance on e-commerce and digital working practices, such as video conferences. “Such attacks include zoom-bombing,” says Hervé Debar, an information systems security researcher at Télécom SudParis.

Zoom-Bombing

Zoom is a video conference platform that has become widely used for communication and to facilitate remote work. “It’s a system that works really well,” says the researcher, “although it’s controversial since it’s hosted in the United States and raises questions about compliance, in particular with the GDPR.”

Zoom-bombing is a way of attacking users of such platforms by hijacking meetings. “There are real problems with the use of collaborative software because you have to install a client, so you run the risk of leaving the door to your computer open,” explains Hervé Debar.

Zoom-bombers seek to make a disturbance, but may also potentially try to spy on users, even if “a lot of power is needed for a malicious third party to hijack a desired meeting.” These virtual meetings are defined by IDs – sets of characters of varying lengths. In order to try to hijack a meeting, a hacker generates IDs at random in the hope of finding an active meeting.

“This means that there is little likelihood of finding a specific meeting in order to spy on users,” says Hervé Debar. “That being said, arriving uninvited in an active meeting at random to make trouble is easier in our current circumstances, since there are a much greater number of meetings.” An algorithm could be used to generate these valid IDs. It works like a robot dialing preset numbers: it calls numbers on a continual basis and, if someone picks up on the other end, it hands the call over to an operator.
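A back-of-the-envelope sketch of why this works like war-dialing: a random guess almost never finds one specific meeting, but once many meetings are running it regularly finds some meeting. The ID-space size, meeting count and guess rate below are illustrative assumptions, not Zoom’s figures.

```python
# Random ID guessing: specific target vs. any active meeting.
# All three constants are illustrative assumptions.
ID_SPACE = 10 ** 9            # e.g. 9-digit numeric IDs (assumption)
ACTIVE_MEETINGS = 200_000     # concurrent meetings (assumption)
GUESSES_PER_HOUR = 10 * 3600  # 10 random guesses per second (assumption)

p_specific = 1 / ID_SPACE           # probability of hitting one chosen meeting
p_any = ACTIVE_MEETINGS / ID_SPACE  # probability of hitting some active meeting

print(f"P(specific meeting per guess): {p_specific:.1e}")
print(f"P(some active meeting per guess): {p_any:.1e}")
print(f"Expected uninvited entries per hour: {GUESSES_PER_HOUR * p_any:.1f}")
```

The asymmetry is the point: the more meetings are active, the more often blind guessing lands somewhere, even though any single target remains effectively out of reach.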

It is worth noting that Zoom has taken certain cybersecurity aspects into account for its services and is making efforts to provide appropriate solutions. To lower the risk, meetings can also be protected by using an access code. Besides zoom-bombing, more traditional attacks are well-suited to the current health crisis. One such attack is phishing.

For what purpose?

The goal of phishing is to find a convincing bait that gets the recipient of an email to click on a link and take further action. “Phishing techniques have gone from selling Viagra a few years ago to selling masks or other medical products,” says Hervé Debar. “This reflects people’s needs. The more worried they are, the more vulnerable they are to this kind of attack.” The fear of getting sick, coupled with a shortage of available protective equipment, can therefore increase the risk of these types of practices.

You get an email saying: “We have masks in stock! Buy X masks for X euros.” So you pay for your order but never receive it. “It’s fraud, plain and simple,” says Hervé Debar. But such practices may also take a more indirect form, by asking for your credit card number or other sensitive personal information. This information may be used directly or sold to a third party. Messages, links and videos can also contain links to download malware that is then installed on the user’s computer, or a malicious application for smartphones.

Recently, this type of email has started using a new approach. Hervé Debar says that “the message is worded as a response, as if the person receiving it had placed an order with their supplier.” The goal is to build trust by making people think they know the sender, therefore making them more likely to fall for the scam.

From an economic viewpoint, unfortunately, such practices are profitable. “Even if very few people actually become victims, the operational cost is very low,” explains the researcher. “It works the same way as telephone scams and premium-rate numbers.”

Uncertainty about information sources tends to amplify this phenomenon. In these anxious times, this can be seen in the controversy over chloroquine. “The doubts surrounding this information make it conducive to phishing campaigns, especially since it has been difficult to get people to understand that it may be dangerous.”

How can we protect ourselves?

“Vigilance and common sense are needed to react appropriately and protect ourselves from phishing attacks,” says Hervé Debar, adding that “knowing suppliers well and having a good overview of inventory would be the best defense.” For hospitals and healthcare facilities, the institutional system ensures that they are familiar with their suppliers and know where to place orders safely.

“We also have to keep in mind that these products must meet a certain quality level,” adds Hervé Debar. “It would be surprising if people could just magically produce them. We know that there is a major shortage of such supplies, and if governments are struggling to obtain them, it shouldn’t be any easier for individuals.”

Information technology assists with logistical aspects in healthcare institutions – for example inventory and transporting patients – and it’s important for these institutions to be able to maintain communication. So they may be targeted by attacks, in particular denial-of-service attacks that seek to saturate networks. “There have been denial-of-service attacks on Paris hospitals, which were effectively blocked. It seems like a good idea to limit exterior connections and bandwidth usage.”

Tiphaine Claveau for I’MTech