
High expectations for AI to ensure the security of new networks

As networks increasingly rely on new, primarily software-based architectures, security cannot be overlooked. Artificial intelligence is one field researchers are exploring to provide sufficient protection for these new networks, from 5G (and beyond) to constrained networks such as the IoT. It is an approach being explored in particular by cybersecurity researchers at IMT Lille Douai.

 

“In a matter of a few dozen milliseconds, a well-targeted attack can wipe out an entire network and the services that go with it.” Ahmed Meddahi, a research professor at IMT Lille Douai, offers this frightening reminder about the threats to Internet-of-Things networks in particular. Behind the control screens in their security operations centers, network operators can identify a multitude of diverse attacks in the blink of an eye. Granted, not all attacks are carried out so quickly. But this example is a good illustration of the constraints weighing on the researchers and engineers who develop cyberdefense systems. Such systems must be able to monitor, analyze, sort, detect and react, all in just a few milliseconds.

For this, humans have two complementary technological tools on their side: new network architectures and artificial intelligence. 5G and WPAN (a network technology for the Internet of Things) are based on two important characteristics with cryptic acronyms: SDN — SDN-WISE for the IoT — and NFV. SDN, which stands for software-defined networking, “is a network’s capacity to be programmed, configured and controlled in a centralized, dynamic way,” explains Ahmed Meddahi, who has been working on architecture security for the past several years. As for NFV, “it’s the virtualization of the IT world, adapted to the world of networks. The network functions which were purely hardware-based up to now are becoming software functions.” SDN and NFV are complementary, and their primary aim is to reduce the development cycle for telecom services as well as the cost of network operations and maintenance.

Read more on I’MTech: SDN and Virtualization: More Intelligence in 5G networks

As far as cybersecurity is concerned, NFV and SDN could serve as a basis for providing an overview of the network, or could absorb a portion of the complexity of IoT networks. The network operator in charge of security could then establish an overall security policy from the control post, defining the rules and baseline behavior of the network, and allow the network to make its own decisions instantaneously. The goal is to move towards more autonomous network security.

In the event of a threat or an attack, such an organization makes it possible to rapidly deny access or introduce filter rules for network traffic, and therefore isolate or migrate segments of the network that are under attack. This sort of architecture is an advantage for responding effectively to threats and making the network more resilient. But sometimes, the speed at which humans can analyze situations and make decisions does not suffice. That’s where artificial intelligence comes in.
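To make this concrete, here is what isolating a compromised segment through an SDN controller’s northbound interface might look like in Python. This is only a sketch: the controller address, endpoint and rule format are illustrative assumptions, not any particular controller’s actual API.

```python
import requests

# Hypothetical northbound REST API of an SDN controller; the address,
# endpoint and JSON schema are illustrative assumptions.
CONTROLLER = "http://sdn-controller.example:8181/flows"

def quarantine_segment(switch_id: str, subnet: str) -> None:
    """Push a high-priority drop rule to isolate traffic from a subnet."""
    rule = {
        "switch": switch_id,
        "match": {"ipv4_src": subnet},  # traffic from the attacked segment
        "actions": [],                  # empty action list = drop
        "priority": 1000,               # override normal forwarding rules
    }
    resp = requests.post(CONTROLLER, json=rule, timeout=2)
    resp.raise_for_status()

# Example: isolate an IoT segment suspected of hosting a botnet
quarantine_segment("switch-edge-04", "10.42.7.0/24")
```

Real controllers (OpenDaylight, ONOS, Ryu and others) each expose their own schema, but the principle of programmatically pushing a high-priority drop rule is the same.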

Detecting more quickly than humans

“It’s one of the major areas of research in cybersecurity: first of all, how can we collect the most relevant information about network activity, out of a huge and widely-varying volume of traffic data which, on top of that, is ever-growing? And second, how can we detect, identify and isolate an attack that only lasts a fraction of a second, or even anticipate it, to prevent the worst from happening?” says Ahmed Meddahi. New SDN and NFV architectures could help answer these questions, since these technologies will facilitate the integration of learning algorithms in network control systems. This is another promising new area of research for network and computer security scientists, which is naturally of interest to researchers at IMT Lille Douai.

The first challenge is to choose the right approach. Which algorithms should be used? Supervised, unsupervised or hybrid? And with which data? Traditional learning methods consist of showing the algorithm how the network behaves in a normal situation, and how it behaves in an abnormal situation or when under attack. It will then be able to learn and recognize situations that are almost identical to those it has learned. But there’s a problem: these learning methods based on examples or records are not compatible with the reality of cyberthreats.

“Attacks are dynamic and constantly changing,” says Ahmed Meddahi. “Hackers can get past even the strongest counter-measures and defenses since they change their approach on a regular basis and constantly change their signature.” But with supervised learning, an algorithm is trained on existing attacks, at the risk of quickly being outpaced by the attacks of tomorrow.

That’s why researchers and industry stakeholders are instead focusing on an unsupervised or hybrid learning approach, and even on new AI algorithms designed especially for cybersecurity purposes. In this case, an algorithm would learn by itself what qualifies as normal or abnormal network operation. Rather than detecting the trace or signature of an attack, it would learn to recognize the conditions in which an attack has occurred in the past, and notify operators if the same conditions occur or are being brought together.
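As an illustration of the idea, the following Python sketch trains an unsupervised anomaly detector (scikit-learn’s Isolation Forest) on features describing normal traffic only, then flags a traffic pattern it has never seen. The features and numbers are invented for the example; a production system would work on real telemetry.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Illustrative traffic features: packets/s, mean packet size, new flows/s.
normal = rng.normal(loc=[500, 800, 20], scale=[50, 100, 5], size=(5000, 3))

# Train only on "normal" operation: no attack signatures are needed.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A sudden flood: huge packet rate, tiny packets, burst of new flows.
suspicious = np.array([[4000, 60, 900]])
print(detector.predict(suspicious))  # -1 means flagged as anomalous
```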

“The unsupervised approach also poses another problem: it requires constant learning on the network, which implies a significant cost in terms of resources,” says the IMT Lille Douai researcher. That is precisely the challenge facing scientists: finding a realistic approach to learning in an extremely dynamic, ever-changing environment. While researchers are only beginning to work on these new security issues for 5G and IoT networks, businesses already have high expectations. With 5G set to launch in France in 2020, operators and managers of these next-generation networks are more concerned than ever about the security of users and their data.

 


Recovering knowledge of local, traditional building materials

Why is an old country farmhouse more pleasant in summer than a modern city building? Traditional building materials and natural stone provide old buildings with better thermal and hygrometric properties. Unfortunately, they often lack the technical characterizations they need to find their place in the construction industry. The European regional development project OEHM has set out to resolve this problem. It brings together IMT Mines Alès, the University of Montpellier and the National School of Architecture of Montpellier. Aymeric Girard, a materials researcher at IMT Mines Alès, gives us an overview of the project and the challenges involved.

 

You’re studying natural building materials through the OEHM project. Why is this?

Aymeric Girard: All building materials require technical characterization. It’s important, since proposals for buildings are always simulated by computer nowadays as a first step. But traditional building materials, which are not produced by industry, lack technical characteristics. By studying local, traditional materials through the project, we are striving to fill this gap.

If the construction industry doesn’t use these materials, is it interested in this knowledge?

AG: Yes, since one of the major observations about current buildings is that they rely too heavily on internal insulation. The main reason for this is a lack of thermal mass in modern buildings, meaning a mass of materials that serves as a heat regulator. In a new building made with conventional building materials, you’re hot in the summer and cold in the winter. So you need heating and air conditioning. But this is far less of a problem in old buildings built with traditional building materials. In Seville, which is one of the hottest cities in Europe, old churches and cathedrals remain cool in the summer. The construction industry is now seeking to model new buildings after these traditional structures.

Read more on I’MTech: In Search of Forgotten Cements

There’s also a second benefit. The construction industry is a sector that contributes heavily to greenhouse gas emissions. This is partially due to the environmental footprint of transporting materials. Using local stones encourages short supply chains, thereby reducing the environmental impact.

What materials are we talking about?

AG: For the OEHM project, we’re working with a clay brick factory and four natural stone quarries: one for granite and three for limestone. Some of these stones are truly local, since they come from the Occitanie region where IMT Mines Alès is located. Others are local in the sense that they come from France at least.

What aspects of these stones and bricks do you study?

AG: We conduct two main analyses of these stones: a thermal analysis and a hygrometric analysis. Hygrometry allows us to study a material’s ability to absorb humidity. That’s important because in winter, for example, the windows in a house are usually closed and you cook, take showers, sweat, etc. All of these things increase the humidity level in rooms, which affects quality of life. Certain stones with very low porosity will not absorb this humidity at all, while others with high porosity will have a buffering effect and provide greater comfort.

How do you obtain the technical characteristics you’re seeking?

AG: The quarries send us small five-centimeter cubes to be analyzed. We use the hot-wire method to study heat transfer. This involves taking two cubes of the same stone, and putting a sensor the size of a post-it note between them. We heat one side and observe the speed at which the stone on the other side heats up. We also study the stones’ heat capacity, by putting even smaller samples measuring 5 mm per side in a mini-oven. This provides us with information about how long it takes to raise the stone’s temperature and about how it behaves.
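For the curious, the analysis behind the hot-wire method can be sketched in a few lines of Python: in the idealized transient hot-wire model, the temperature rise grows linearly with the logarithm of time, and the slope of that line yields the thermal conductivity. The heating power and conductivity values below are textbook-style assumptions for illustration, not measurements from the project.

```python
import numpy as np

# Idealized transient hot-wire model: at long times the temperature rise
# follows dT(t) ≈ (q / (4·pi·k)) · ln(t) + C, where q is the heating power
# per unit length of wire (W/m) and k the thermal conductivity (W/(m·K)).
q = 15.0                                   # W/m, assumed heating power
t = np.linspace(5, 120, 200)               # s, measurement window
k_true = 2.5                               # e.g. a dense limestone
dT = q / (4 * np.pi * k_true) * np.log(t)  # synthetic, noise-free data

slope, _ = np.polyfit(np.log(t), dT, 1)    # fit dT against ln(t)
k_est = q / (4 * np.pi * slope)
print(f"estimated conductivity: {k_est:.2f} W/(m·K)")
```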

In terms of humidity, we have a sort of refrigerator in which we maintain a constant humidity level; we then compare the weight of the dry stone with its weight once saturated, and deduce its capacity to absorb moisture. It’s a very long process that can take up to four months.
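The quantity deduced from that weighing is simple arithmetic; a small worked example with invented masses:

```python
def water_absorption(mass_dry_g: float, mass_saturated_g: float) -> float:
    """Water absorption capacity as a percentage of the dry mass."""
    return 100 * (mass_saturated_g - mass_dry_g) / mass_dry_g

# A hypothetical porous limestone cube: 285 g dry, 312 g once saturated.
print(f"{water_absorption(285.0, 312.0):.1f} %")  # ~9.5 %
```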

With whom are you working on this project?

AG: On the industrial side, we’re only working with the quarries for now. They’re interested in the technical characteristics we’re producing so they can provide their partners and customers with data about the materials. It’s important knowledge, just as when you buy glass wool to renovate your home and compare offers to decide what to buy. On the research side, the project is part of a long collaboration between IMT Mines Alès, the University of Montpellier and the National School of Architecture of Montpellier.

What will the project produce besides these technical characteristics?

AG: We plan to use the data we recover to develop our own material simulation software. And we’re also going to carry out real-site testing in collaboration with the National School of Architecture of Montpellier. They have a replica of a house that can be adapted to test materials. This will give us the opportunity to test our results and share insights with architects about the opportunities offered by natural materials suited to the Mediterranean climate.


What is eco-design?

In industry, it is increasingly necessary to design products and services with concern and respect for environmental issues. Such consideration is expressed through a practice that is gaining ground in a wide range of sectors: eco-design. Valérie Laforest, a researcher in environmental assessment and in environmental and organizational engineering at Mines Saint-Étienne, explains the term.

 

What does eco-design mean?

Valérie Laforest: The principle of eco-design is to incorporate environmental considerations from the earliest stages of creating a service or product, meaning from the design stage. It’s a method governed by standards, at the national and international level, describing concepts and setting out current best practices for eco-design. We can just as well eco-design a building as we can a tee-shirt or a photocopying service.

Why this need to eco-design?

VL: There is no longer any doubt about the environmental pressure on the planet. Eco-design is one concrete way for us to think about how our actions impact the environment and consider alternatives to traditional production. Instead of producing, and then looking for solutions, it’s much more effective and efficient to ask questions from the design stage of a product to reduce or avoid the environmental impact.

What stages does eco-design apply to?

VL: In concrete terms, it’s based entirely on the life cycle of a system, from very early on in its existence. Eco-design thinking takes into account the extraction of raw materials, as well as the processing and use stages, until end of life. If we recover the product when it is no longer usable, to recycle it for example, that’s also an example of eco-design. As it stands today, end-of-life products are either sent to landfills, incinerated or recycled. Eco-design means thinking about the materials that can be used, but also thinking about how a product can be dismantled so as to be incorporated within another cycle.

When did we start hearing about this principle?

VL: The first tools arrived in the early 2000s but the concept may be older than that. Environmental issues and associated research have increased since 1990. But eco-design really emerged in a second phase when people started questioning the environmental impact of everyday things: our computer, sending an email, the difference between a polyester or cotton tee-shirt.

What eco-design tools are available for industry?

VL: The tools can fall into a number of categories. There are relatively simple ones, like check-lists or diagrams, while others are more complex. For example, there are life-cycle analysis tools to identify the environmental impacts, and software to incorporate environmental indicators in design tools. The latter require a certain degree of expertise in environmental assessment and a thorough understanding of environmental indicators. And developers and designers are not trained to use these kinds of tools.

Are there barriers to the development of this practice?

VL: There’s a real need to develop special tools for eco-design. Sure, some already exist, but they’re not really adapted to eco-design and can be hard to understand. This is part of our work as researchers, to develop new tools and methods for the environmental performance of human activities. For example, we’re working on projects with the Écoconception center, a key player in the Saint-Etienne region as well as at the national level.

In addition to tools, we also have to go visit companies to get things moving and see what’s holding them back. We have to consider how to train, change and push companies to get them to incorporate eco-design principles. It’s an entirely different way of thinking that requires an acceptance phase in order to rethink how they do things.

Is the circular economy a form of eco-design?

VL: Or is eco-design a form of the circular economy? That’s an important question, and answers vary depending on who you ask. Stakeholders who contribute to the circular economy will say that eco-design is part of this economy. And on the other side, eco-design will be seen as an initiator of the circular economy, since it provides a view of the circulation of material in order to reduce the environmental impact. What’s certain is that the two are linked.

Tiphaine Claveau for I’MTech


This article was published as part of Fondation Mines-Télécom‘s 2020 brochure series dedicated to sustainable digital technology and the impact of digital technology on the environment. Through a brochure, conference-debates, and events to promote science in conjunction with IMT, this series explores the uncertainties and challenges of the digital and environmental transitions.


 


Joint AI: a platform to facilitate German-French research in AI

In 2019, the German-French Academy for the Industry of the Future launched the Joint AI platform project. This platform, which brings together IMT and the Technical University of Munich, promotes collaboration between researchers and industry to develop artificial intelligence tools. Its secure environment allows for intellectual property protection and the reproducibility of scientific results.

 

“The primary aim is to support artificial intelligence research projects between France and Germany.” This is how Anne-Sophie Taillandier begins her description of the Joint AI platform launched in 2019 by IMT and the Technical University of Munich. Since 2015, the two institutions have been working together through the German-French Academy for the Industry of the Future. This partnership has given rise to a number of research projects, some of which have focused on artificial intelligence. Researchers working in this area face a recurring problem: intellectual property protection for their results.

“One of the major risks for AI researchers is presenting their work to academic peers or industry stakeholders and having it stolen,” explains Anne-Sophie Taillandier. For several years, this French artificial intelligence expert has headed IMT’s TeraLab, which aims to facilitate AI research in a secure environment. “Through discussions with our colleagues at the Technical University of Munich, we realized that we each had infrastructures to host and develop AI projects, but that there was no transnational equivalent,” she explains. This gave rise to the Joint AI platform project: a shared, reliable, protected site for German-French research on artificial intelligence.

Read more on I’MTech: TeraLab, a European Data Sanctuary

The platform is based on technological and legal tools. The hardware architecture and workspaces are designed to host data and work on it with the desired security level. Using a set of APIs, the results of a project can be highlighted and shared on both sides of the border, without having to move the data or the software developed. “Everyone can work with confidence, without having to provide access to their executable or data,” says Anne-Sophie Taillandier.

A tool for researchers…

For researchers working on AI — as in other scientific disciplines — facilitating cooperation means facilitating the progress of research projects and results. This is especially true for all research related to Industry 4.0, as is the case for the German-French Academy for the Industry of the Future projects that the Joint AI platform currently hosts. “Research on industry involves complex infrastructures, made up of human users and sensors that link the physical and digital dimensions,” says Georg Carle, holder of the Network Architectures and Services Chair at the Technical University of Munich and co-director of the project with Anne-Sophie Taillandier.

He explains that, “In order to be valuable, this research must be based on real data and generate realistic models.” And the more the data is shared and worked on by different teams of researchers, the more effective the resulting algorithms will be. For Georg Carle, “the Joint AI platform makes it possible to improve the reproducibility of results” between the French and German teams. “This leads to higher-quality results, with a bigger impact for the industry stakeholders.”

And for companies!

In addition to providing a collaborative tool for researchers, the Joint AI platform also provides innovation opportunities for companies involved in partnership-based research. When a German industry stakeholder seeks to collaborate with French researchers or vice versa, the legal constraints on moving data represent a major hurdle. Such collaboration is further limited by the fact that, even within the same large company, it can be difficult for the French and German branches to exchange data. “This can be for a variety of reasons: personal data from human resources, data related to industrial property, or data concerning clients with whom there is a confidentiality guarantee,” says Anne-Sophie Taillandier.

Companies therefore need a secure location, from both a technological and legal standpoint, to facilitate joint research. Joint AI makes it easier for private stakeholders to take part in research projects at the European level, such as Horizon 2020 framework program projects — or Horizon Europe for future European research projects as of next year. Such a platform offers a prototype for a solution to one of the biggest problems facing AI and digital innovation: secure data sharing between different stakeholders.



Can workspaces become agile?

Innovating, adapting ever-more rapidly to changes in the environment. Breaking away from the traditional office. The digital revolution has dramatically changed working methods, and with them, the way we organize space. Researchers from Institut Mines-Télécom Business School and IMT Atlantique have studied the paradoxes and tensions that arise when workspaces are designed to embody, promote and foster agility.

 

In recent years, the quest for agility has pushed companies to completely rethink their organization, methods and processes. They have urged their employees to develop new work practices. These changes often go hand-in-hand with a reconfiguration of spaces: the flexible office, digital workspaces that are modular and open, organized by work activity etc. But a sort of ambivalence can be seen behind these efforts.

“Spaces, locations and offices are often synonymous with a sense of bearings and longevity, a specific territory we claim as the base for our work. Agility, on the other hand, encourages continual reconfiguration, a transitory organization, keeping bodies and heads in constant motion,” explains Marie Bia Figueiredo, a management researcher at Institut Mines-Télécom Business School. So what’s work life like when the office is designed to embody and foster organizational agility? How do employees experience this apparent contradiction and make these new workspaces their own? Marie Bia Figueiredo and her fellow researchers at Institut Mines-Télécom Business School and IMT Atlantique set out to explore these issues.

“These questions first occurred to us in 2016, at an observation day in the new offices of a major French bank, which we called ‘The Oases’,” says Madeleine Besson, who is also a researcher in management at Institut Mines-Télécom Business School. “We were struck by the omnipresence of references to agility in the talks presenting the buildings, in the way the space was designed, in the signage and even in the decorative elements. On one hand, companies have always relied on physical spaces to convey and embody the changes they hope to bring about in order to standardize and organize practices and behaviors. But at the same time, we must remember that the agile movement establishes a principle of autonomy and self-organization for teams. There was a certain dissonance to it.”

From agile methods to agile environment

The agile movement was formally established in the early 2000s with the aim of adapting quickly to change, whether disruptive technologies, customer volatility or regulatory developments. This quest for agility was first expressed through new project management methods, collectively referred to as “agile methods”. In principle, these methods are based on a willingness to accept risk and change, and on reorganizing and adapting to such change on a permanent basis. Today, companies increasingly see the workspace as a vehicle for change and agility. “Organizations are seeking to align space, work and information technologies,” explain the Institut Mines-Télécom Business School researchers in a forthcoming publication in Terminal [1].

The “Oases” created by the French bank observed by the researchers exemplify this trend. They were designed to embody an “exceptional” drive for transformation in a banking sector which has been particularly affected by technological and economic transformations. During the researchers’ investigation, the company’s real estate director explained their motivation for creating such spaces, “We wanted the Oases to provide fertile grounds for new ways of working in an effort to attract new talents.” The decision was inspired by the iconic workspaces of companies such as Google or Apple, which can “create well-being and conditions that allow employees to work differently.”

The research team’s study shows how this requirement for agility in the corporate world is expressed, in particular through a requirement for adaptation and ubiquity. In order to preserve their modularity, the workspaces cannot be personalized. Instead, workspaces are reserved on arriving at the office, and rolling chairs and height-adjustable tables ensure that the space and office can constantly be rearranged. Work is primarily coordinated in the digital space, and collective work has become invisible. Physical space and digital space are closely interlinked to convey a requirement for ubiquity.

The researchers also note a requirement for creativity and happiness. “The environment is decorated with plastic smiley faces. The office is designed to provide a fun environment where employees are encouraged to play ping pong or pool,” they note. Lastly, a requirement for speed is expressed by the pervasiveness of references to the passage of time. Hourglasses of varying sizes serve as reminders that there is no leeway when it comes to projects being completed on schedule. “Agility claims to prioritize individual interactions above processes and tools, but these interactions are still subject to strong time pressure. And agility means working at a faster pace, since you have to be ready to cancel or repeat operations as required by customers, the context or new developments,” points out Géraldine Guérillot, a researcher from the IMT Atlantique team.

Attempts to make a “non-place” one’s own

How do employees perceive these changes? Some find it difficult to break with their previous work habits. “Many of them told us that senior executives had to set an example by using the game room or nap room before the employees dared to use it,” says Jean-Luc Moriceau, a researcher at Institut Mines-Télécom Business School. “Others, without showing strong opposition to the new working methods, find ways to get around them. This can be seen in teams who regularly meet up to ‘recreate their territory’ or high-level managers who reserve a room for an entire day.” In some workplaces, employees leave personal belongings to (re)gain their bearings. The flex-office depersonalizes the workplace, so employees attempt to make the space their own, thereby developing behaviors that run contrary to the constant, agile reorganization of space.

Others play along. For example, one of the managers explains that for him, the hourglasses are a polite way of reminding everyone that time is tight. “He’ll meet with anyone who wants to see him, but they must present their views within the allotted time, which is physically represented by the sand flowing through the hourglass,” explain the researchers. Agility appears to aim to shake up the work environment.

But the researchers provide a warning: “The quest for agility, embodied by the reconfiguration of space, when implemented in too prescriptive and uniform a manner, can end up producing ‘non-places’. These spaces deny the role that feelings, territories, memory and status play in the operations of an organization and its work collectives.” The researchers demonstrate how, in turn, this gives rise to discreet ways of taking ownership of such spaces and “producing places,” understood as “minor uses resulting in alternative ways of occupying the shared space.”

By Anne-Sophie Boutaud for I’MTech

[1] Moriceau J.-L., Besson M., Bia Figueiredo M., Guérillot G. (2020), L’espace agile, oasis ou mirage ? Mise en perspective de quelques difficultés et paradoxes pour les travailleurs (Agile Spaces, Oasis or Mirage? A Perspective on Difficulties and Paradoxes for Employees), Terminal, Technologie de l’Information, Culture et Société (forthcoming).



Putting drones to the 5G test

5G!Drones, a European H2020 project bringing together industrialists, network operators and research centers, was launched in June 2019 for a three-year period. It should ultimately validate the use of 5G for delivery services by drone. Adlen Ksentini, a researcher at EURECOM, a key partner in the project, explains the challenges involved.

 

What was the context for developing the European 5G!Drones project?

Adlen Ksentini: The H2020 5G!Drones project is funded by the European Commission as part of phase 3 of the 5G PPP (5G Infrastructure Public Private Partnership) projects. This phase aims to test use cases for vertical industry applications (IoT, Industry 4.0, autonomous cars etc.) on 5G test platforms. 5G!Drones focuses on use cases involving flying drones, or Unmanned Aerial Vehicles (UAVs), such as package transport, extension of network coverage with drones, public security etc.

What is the aim of this project?

AK: The aim is twofold. First, to test eight use cases for UAV services on 5G platforms located in Sophia Antipolis, Athens (Greece), Espoo and Oulu (Finland) to collect information that will allow us to validate the use of 5G for a wider roll-out of UAV services. And second, the project seeks to highlight the ways in which 5G must be improved to guarantee these services.

What technological and scientific challenges do you face?

AK: A number of obstacles will have to be overcome during the project; these obstacles are related to safeguarding drone flights. To fly drones, certain conditions are required. First, there has to be a reliable network with low latency, since remote control of the drones requires low latency in order to correct the flight path and monitor the drones’ position in real time. There also has to be strong interaction between the U-Space service (see box) and the network operator to plan flights and check conditions: weather, availability of network coverage etc. In addition to overcoming these obstacles, the 5G!Drones project will develop a software system that will sit above the platforms, to automate the trials and display the results in real time.


The U-Space service is in charge of approving the flight plan submitted by drone operators. Its job is to check whether the flight plan is feasible, meaning ensuring that there are no other flights planned on the selected path and determining whether the weather conditions are favorable.


How are EURECOM researchers contributing to this project?

AK: EURECOM is a key partner in the project and will provide its 5G testing platform based on its OpenAirInterface (OAI) tool, which provides Network Function Virtualization (NFV) and Multi-access Edge Computing (MEC) solutions. It will host two trials on public safety using flying drones, led by partners representing the vertical industry. In addition, EURECOM will be studying and proposing a solution for developing a 5G network dedicated to UAVs, based on the concept of network slicing.

Who are your partners and what collaborations are important for you?

AK: The project has 20 partners, including network operators (Orange France and Poland, COSMOTE), specialists in the UAV field (Alerion, INVOLI, Hepta Airborne, Unmanned System Limited, CAFA Tech, Frequentis, DRONERADAR), industrial groups (NOKIA, Thalès and AIRBUS), an SME (INFOLYSIS) and research centers and universities (Oulu University, Aalto University, DEMOKRITOS, EURECOM), as well as the municipality of Egaleo in Greece. EURECOM is playing a central role in the project by collaborating with all the members of the consortium and acting as a liaison between the UAV vertical industry partners, industrial groups and network operators.

What are the expected benefits of the project?

AK: In addition to the scientific benefits in terms of publications, the project will allow us to verify whether 5G networks are ready to deliver UAV services. Feedback will be provided to the 3GPP standards organization, as well as to the authorities that control the airspace for UAVs.

What are the next important steps for the project?

AK: After a first year in which the consortium focused on studying an architecture that would make it possible to establish a link between the vision of UAV industry stakeholders and 5G networks, as well as on a detailed description of the use cases to be tested, the project will be starting its second year, which will focus on deploying the tests on the various sites and then beginning the trials.

Learn more about the 5G!Drones project

Interview by Véronique Charlet for I’MTech

 


How can industrial risk be assessed?

Safety is a key concern in the industrial sector. As such, studying risk represents a specialized field of research. Experiments in this area are particularly difficult to carry out, as they involve explosions and complicated measurements. Frédéric Heymes, a researcher at IMT Mines Alès who specializes in industrial risk, discusses the unique aspects of this field of research, and new issues to be considered.

 

What does research on industrial risk involve?

Frédéric Heymes: Risk is the likelihood of the occurrence of an event that could lead to negative, high-stakes consequences. Our research is broken down into three levels of anticipation (understanding, preventing, protecting) and one operational level (helping manage accidents). We have to understand what can happen and do everything possible to prevent dangerous events from happening in real life. Since accidents remain inevitable, we have to anticipate protective measures to best protect people and resources in the aftermath of an accident. We must also be able to respond effectively. Emergency services and the parties responsible for managing industrial disasters need simulation tools to help them make the right decisions. Risk research is cross-sectorial and can be applied to a wide range of industries (energy, chemistry, transport, pharmaceuticals, agri-food).

What’s a typical example of an industrial risk study?

FH: Although my research may address a wide variety of themes, on the whole it’s primarily connected to explosive risk. That means understanding the phenomenon and why it occurs, in order to make sure it won’t happen again. A special feature of our laboratory is that we can carry out experimental field testing for dangerous phenomena that can’t be performed in a laboratory setting.

What does an experiment on explosive risk look like?

FH: We partnered with Total to carry out an especially impressive experiment, which had never before been done anywhere in the world. It was a study on the explosion of superheated water, under very high pressure at a very high temperature. It was potentially dangerous since the explosion releases a very large amount of energy. It was important for Total to understand what happens in the event of such an explosion and what the consequences would be. Carrying out the experiment was a real team effort and called for a great deal of logistical planning. Right away, it was different from working in a lab setting. There were between 5 and 8 people involved in each test, and everyone had their own specific role and specialty: data acquisition, control, high-speed cameras, logistics, handling. We needed a prototype that weighed about a ton, which we had made by a boilermaker. That alone was no simple task. Boilermakers are responsible for producing compliant equipment that is known to be reliable. But for our research, we knew that the prototype would explode. So we had to reassure the manufacturer in terms of liability.

How do you set up such an explosion?

FH: We need a special testing ground to carry out the experiment, and to get permission to use it, we have to prove that the test is perfectly controlled. For these tests, we collaborated with the Camp des Garrigues, a military range located north of Nîmes. The test area is secure but completely empty, so it took a lot of preparation and set-up. In addition, firefighters were also on site with our team. And a great deal of research was dedicated to sensors in order to obtain precise measurements. The explosion lasts less than a second. It’s a very short test. Most of the time, we only have access to the field for a relatively short period, which means we carry out the tests one after another, non-stop. We’re also under a lot of stress, since we know that the slightest error could have dramatic consequences.

What happens after this study?

FH: The aim of this research was to study the consequences of such an explosion on the immediate environment. That provides us with an in-depth understanding of the event so that those involved can take appropriate action. We therefore obtain information about the explosion, the damage it causes and the size of the damaged area. We also observe whether it can give rise to a shock wave or projectile expulsion, and if so, we study their impacts.

Has there ever been a time when you were unable to carry out tests you needed for your research?

FH: Yes, that was the case for a study on the risk of propane tank explosions during wildfires. Ideally, we would have to control a real wildfire and expose propane tanks to this hazard. But we’re not allowed to do that, and it’s extremely dangerous. It’s a real headache. Ultimately, we have to divide the project into two parts and study each part separately. That way, we obtain results that we can link using modeling. On one hand, we have the wildfire with a huge number of variables that must be taken into account: wind strength and direction, slope inclination, types of species in the vegetation, etc. And on the other hand, we study fluid mechanics and thermodynamics to understand what happens inside propane tanks.

What results did you achieve through this study?

FH: We arrived at the conclusion that gas tanks are not likely to explode if brush clearing regulations are observed. In residential areas located near forests, there are regulations for maintenance, and brush clearing in particular. But if these rules are not observed, safety is undermined. We therefore suggested a protective component with good thermal properties and flame resistance to protect tanks in scenarios that do not comply with regulations.

What are some current issues surrounding industrial risk?

FH: Research in the field of industrial risk really took off in the 1970s. A number of industrial accidents underscored the need to anticipate risks, leading to extensive research to prevent and protect against risks more effectively. But today, all energy sectors are undergoing changes and there are new risks to consider. New sectors are emerging and raising new issues, as is the case for hydrogen, for example. Hydrogen is a very attractive energy source since its use produces only water, and no carbon dioxide. But it is a dangerous compound since it’s highly flammable and explosive. The question is how to organize hydrogen supply chains (production, transport, storage, use) as effectively as possible. How can hydrogen best be used across a region while minimizing risks? It’s a question that warrants further investigation. A cross-disciplinary research project on this topic with other IMT partners is in the start-up phase, as part of Carnot HyTrend.

Read more on I’MTech: What is Hydrogen Energy?

So does that mean that energy and environmental transition come with their own set of new risks to be studied?

FH: Yes, that’s right, and global warming is another current field of research. To go back to wildfires, they’re becoming more common, which raises concerns. How can we deal with the growing number of fires? One solution is to consider passive self-protection scenarios, meaning reducing vulnerability to risks through technological improvements, for example. The energy transition is bringing new technologies, along with new uses. Like I was saying before, hydrogen is a dangerous chemical compound, but we’ve known that for a long time. However, its operational use to support the energy transition raises a number of new questions.

How can we deal with these new risks?

FH: The notion of new industrial risk is clearly linked to our social and technological evolutions. And evolution means new risks. Yet it’s hard to anticipate such risks since it’s hard to anticipate such evolutions in the first place. But at the same time, these evolutions provide us with new tools: artificial intelligence, for example. We can now assimilate large amounts of data and quickly extract useful, relevant results to recognize an abnormal, potentially dangerous situation. Artificial intelligence also helps us overcome a number of technological hurdles. For example, we’re working with Mines ParisTech to conduct research on predicting the hydrodynamic behavior of gas leaks using artificial intelligence methods, with unprecedented computing speed and accuracy.

How is research with industrial players organized on this topic?

FH: Research can grow out of partnerships with research organizations, such as the IRSN (French Institute for Radiological Protection and Nuclear Safety). During the decommissioning of a power plant, even though there’s no longer any fissile material, residual metal dust could potentially ignite. So we have to understand what may happen in order to act accordingly in terms of safety. But for the most part, I collaborate directly with industrialists. In France, they’re responsible for managing the risks inherent in their operations. So there’s a certain administrative pressure to improve on these issues, and that sometimes involves research questions. But most of the time, investments are driven not by administrative requirements, but by a profound commitment to reducing risks.

What’s quite unique about this field of research is that we have complete freedom to study the topic and complete freedom to publish. That’s really specific to the field of risk. In general, results are shared easily, and often published so that “the competition” can also benefit from the findings. It’s also quite common for several companies in the same industry to team up to fund a study, since they all stand to benefit from it.


DNA as a data storage medium

By 2025, the volume of data produced in the world will have reached 250 zettabytes (1 zettabyte = 10²¹ bytes). Current storage media have insufficient storage capacity or suffer from obsolescence. Preserving even a fraction of this data means finding a storage device with density and durability characteristics significantly superior to those of existing systems. The European H2020 OligoArchive project, launched in October 2019 for three years, proposes to use DNA (DeoxyriboNucleic Acid) as a storage medium. Raja Appuswamy, a researcher at EURECOM, a partner in the project, explains further.

 

In what global context did the European OligoArchive project come about?

Raja Appuswamy: Today, everything in our society is driven by data. If data is the oil that fuels the metaphorical AI vehicle, storage technologies are the cog that keeps the wheel spinning. For decades, we wanted fast storage devices that could quickly deliver data, and optical, magnetic and solid-state storage technologies evolved to meet this requirement. As data-driven decision-making becomes a part of our society, we are increasingly faced with a new need: one for cheap, long-term storage devices that can safely store the collective knowledge we generate for hundreds or even thousands of years. Imagine you have a photograph that you would like to pass down to your great-great-grandchildren. Where would you store it? How much space would it take? How much energy would it use? How much would it cost? Would your storage media still be readable two generations from now? This is the context for project OligoArchive.

What is at stake in this project?

RA: Today, tape drives are the gold standard when it comes to data archival across all disciplines, from Hollywood movie archives to particle accelerator facilities. But tape media suffers from several fundamental limitations that make it unsuitable for long-term data storage. First, the storage density of tape (the amount of data you can store per inch) is improving at a rate of 30% annually; archival data, in contrast, has a growth rate of 60%. Second, if one stores 1 PB in 100 tape drives today, within five years it would be possible to store the same data in just 25 drives. While this might sound like a good thing, using tape for archival storage implies constant data migration with each new generation of tape, and such migrations cost millions of dollars.
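To see where those figures lead, a quick back-of-the-envelope check (assuming density compounds at roughly 30% per year):

$$100 \text{ drives} \times \frac{1}{1.3^{5}} \approx \frac{100}{3.7} \approx 27 \text{ drives},$$

which is indeed on the order of the 25 drives cited.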

This problem is so acute that Hollywood movie archives have openly admitted that we are living in a dead period during which the productions of several independent artists will not be saved for the future! At the rate at which we are generating data for feeding our AI machinery, enterprises will soon be at this point. Thus, the storage industry as a whole has come to the realization that a radically new storage technology is required if we are to preserve data across generations.

What will be the advantages of the technology developed by OligoArchive?

RA: Project OligoArchive undertakes the ambitious goal of retasking DNA, a biological building block, to function as a radically new digital storage medium. DNA possesses three key properties that make it relevant for digital data storage. First, it is an extremely dense three-dimensional storage medium that has the theoretical ability to store 455 exabytes in 1 gram. The sum total of all data generated worldwide (the global datasphere) is projected to be 175 zettabytes by 2025. This could be stored in just under half a kilogram of DNA. Second, DNA can last several millennia, as demonstrated by experiments that have read the DNA of ancient, extinct animal species from fossils dated back thousands of years. If we can bring the woolly mammoth back to life from its DNA, we can store data in DNA for millennia. Third, the density of DNA is fixed by nature, and we will always have the ability and the need to read DNA: everything from archeology to precision medicine depends on it. Thus, DNA is an immortal storage medium that does not have the media obsolescence problem and hence can never become outdated, unlike other storage media (remember floppy disks?).
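The half-kilogram figure follows directly from the stated density:

$$\frac{175\ \text{ZB}}{455\ \text{EB/g}} = \frac{175\,000\ \text{EB}}{455\ \text{EB/g}} \approx 385\ \text{g},$$

i.e. just under half a kilogram of DNA for the entire projected global datasphere.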

What expertise do EURECOM researchers bring?

RA: The Data Science department at EURECOM is contributing to several aspects of this project. First, we are building on our deep expertise in storage systems to architect various aspects of using DNA as a storage medium, like developing solutions for implementing a block abstraction over DNA, or providing random access to data stored in DNA. Second, we are combining our expertise in data management and machine learning to develop novel, structure-aware encoding and decoding algorithms that can reliably store and retrieve data in DNA, even though the underlying biological tasks of synthesis (writing) and sequencing (reading) introduce several errors.
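As a toy illustration of the encoding side, here is the simplest possible bits-to-bases mapping in Python. It is only a sketch: real codecs, including those developed in the project, add error correction and avoid homopolymer runs and skewed GC content, precisely because synthesis and sequencing are error-prone.

```python
# Naive 2-bits-per-nucleotide mapping, for illustration only.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {base: bits for bits, base in TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn a byte string into a DNA strand, one base per 2 bits."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from the strand."""
    bits = "".join(TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"IMT")
print(strand)                  # "CAGCCATCCCCA"
assert decode(strand) == b"IMT"
```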

Who are your partners and what are their respective contributions?

RA: The consortium brings together a truly multi-disciplinary group of people with diverse expertise from across Europe. The Institute of Molecular and Cellular Pharmacology (IPMC) in Sophia Antipolis, home to the largest sequencing facility in the PACA region, is a partner that contributes its biological expertise to the project. Our partners at I3S, CNRS, are working on new compression techniques customized for DNA storage that will drastically reduce the amount of DNA needed to store digital content. Our colleagues at Imperial College London (UK) are building on our work and pushing the envelope further by using DNA not just as a storage medium but as a computational substrate, showing that some SQL database operations that run in silico (on a CPU) today can be translated efficiently into in vitro biochemical reactions directly on DNA. Finally, we also have HelixWorks, a startup from Ireland investigating novel enzymatic synthesis techniques for reducing the cost of generating DNA, as an industrial partner.

What results are expected and ultimately what will be the applications?

RA: The ambitious end goal of the project is to build a DNA disk: a fully working end-to-end prototype showing that DNA can indeed function as a replacement for current archival storage technology like tape. Application-wise, archival storage is a billion-dollar industry, and we believe that DNA is a fundamentally disruptive technology that has the potential to reshape this market. But we believe that our project will have an impact on areas beyond archival storage.

First, our work on DNA computation opens up an entirely new field of research on near-molecule data processing that mirrors the current trend of moving computation closer to data to avoid time-consuming data movement. Second, most of the models and tools we develop for DNA storage are actually applicable for analyzing genetic data in other contexts. For instance, the algorithm we are developing for reading data back from DNA provides a scalable solution for sequence clustering–a classic computational genomics problem with several applications. Thus, our work will also contribute to advances in computational genomics.

Learn more about OligoArchive


C in your Browser

In the academic world, teaching and carrying out research often go hand-in-hand. This is especially true for Rémi Sharrock, a computer science researcher at Télécom Paris, who has developed a C language learning program comprising 7 MOOCs. The teaching approach used for his online courses called for the development of innovative tools, drawing on the researcher’s expertise. Rémi Sharrock was rewarded for this work in November 2019 by the edX platform, a leading global MOOC provider, which presented him with its 2019 edX Prize. He talked to us about the story behind this digital learning program developed in partnership with Dartmouth College in the United States.

 

What led you to undertake research in order to create an online learning program?

Rémi Sharrock: The original aim was to propose a new way of learning the C language. To do so, we had to develop a number of tools that didn’t exist at the time. This work, carried out with Dartmouth College, gave rise to research opportunities. Our goal was always to facilitate exchange with the learner, and to make it a central part of the learning process. The tools we developed made it possible to carry out learning activities directly on the user’s computer, with many features that had never been seen before.

What are some examples of the tools you developed?

RS: The idea of a MOOC is that it’s open to as many people as possible. We didn’t know what type of computer users would connect with, or what operating system or browser they would use. But regardless of their system, we had to be able to provide users with a high-quality learning experience. The first tool we developed for this was WebLinux. It met the challenge of being able to code in the C language with Linux from any computer, using any browser. We didn’t want to make learners download an application, since that could discourage beginners. WebLinux therefore allowed us to emulate Linux for everyone, directly on the web-based learning platform.

How did you do this from a technical perspective?

RS: Technically, we run Linux directly in the browser, without going through a server. To do so, we use an OpenRISC processor emulator that runs in the browser, and a Linux build that is compatible with this type of processor. That allows us to do without servers running Linux, and therefore to operate on a large scale with limited server resources.

That’s an advantage in terms of access to education, but does the tool also facilitate educational activities?  

RS: For that part we had to develop an additional tool, called Codecast. It’s a C language emulator that runs on any browser and is synchronized with the professor’s audio explanation. It was a real challenge to develop this tool, because we wanted to make it possible for anyone to run C language instructions directly in their browser, without having to go through a remote computer server or use third-party software on their computer. We created a specialized C language interpreter for the web, which works with all browsers. When you’re watching the professor’s course in the video, you can directly edit lines of code and run them in your browser, right from the course web page. And on top of that, when the professor includes an instruction to be learned and tested as part of the lesson, you can pause the video, edit the instruction and try different things, then resume the video without any consequences.

You also responded to another challenge with this type of MOOC: assessing learners.

RS: Yes, with a third tool, Taskgrader. In a traditional classroom course, the teacher assesses the code proposed by each student one by one, and corrects it. This is inconceivable with a MOOC, since you have tens or hundreds of thousands of learners to correct. Taskgrader makes it possible to automatically assess students’ code in real time, providing personalized feedback without the professor having to look it over.
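The underlying principle of such an autograder can be sketched in a few lines of Python. This is only an illustration of the idea (compile, run against test cases, report), not Taskgrader’s actual implementation; the file names and test cases are made up.

```python
import pathlib
import subprocess
import tempfile

# Hypothetical test cases for an exercise: (stdin, expected stdout).
TESTS = [("3 4\n", "7\n"), ("10 -2\n", "8\n")]

def grade(c_source: str) -> str:
    """Compile a learner's C submission and run it against the tests."""
    with tempfile.TemporaryDirectory() as tmp:
        src = pathlib.Path(tmp, "answer.c")
        src.write_text(c_source)
        exe = pathlib.Path(tmp, "answer")
        build = subprocess.run(["gcc", str(src), "-o", str(exe)],
                               capture_output=True, text=True)
        if build.returncode != 0:
            return "compilation failed:\n" + build.stderr  # feedback to learner
        passed = 0
        for stdin, expected in TESTS:
            try:
                run = subprocess.run([str(exe)], input=stdin, text=True,
                                     capture_output=True, timeout=2)
                passed += run.stdout == expected
            except subprocess.TimeoutExpired:
                pass  # infinite loops count as failed tests
        return f"{passed}/{len(TESTS)} tests passed"
```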

Do all these tools have applications outside the scope of the MOOC C language learning program?

RS: Codecast could be of interest to big community-driven development websites like GitHub. Amateur and professional developers share bits of code for applications on the website. But cooperation is often difficult: to correct someone’s code you have to download the incorrect version, correct it, then send it back to the person, who then has to download it again. An emulator in the browser would make it possible to work directly online in real time. And as for Taskgrader, it’s a valuable tool for all computer language teachers, even outside the world of MOOCs.

Is your research work in connection with these MOOCs over now that the learning program has been completed?  

RS: No, since we’ve also committed to a second type of research. We’ve teamed up with Cornell and Stanford universities to carry out large-scale sociological experiments on these MOOC learners in an effort to better understand our learner communities.

What kind of research are you conducting to that end?

RS: We have 160,000 learners in the MOOC program worldwide, from a wide range of social, ethnic and demographic backgrounds. We wanted to find out whether there are differences in the way men and women learn, for example, or between older and younger people. We therefore introduce differences in the given courses according to individuals’ profiles, based on A/B testing: the sample of learners is split in two, and one learning parameter changes between the groups, such as the teacher’s age, voice or gender. This should eventually allow us to better understand learning processes and adapt them to provide each individual with a program that facilitates knowledge transfer.
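For a sense of how such an experiment might be analyzed, here is a hedged sketch: learners are randomly split between two variants and completion rates are compared with a two-proportion z-test. All numbers are invented for illustration; the actual studies with Cornell and Stanford presumably use richer outcome measures.

```python
import math

# Invented A/B test results: vary one parameter (e.g. the teacher's voice)
# and compare course completion rates between the two groups.
completed_a, n_a = 5200, 40000   # variant A
completed_b, n_b = 5600, 40000   # variant B

p_a, p_b = completed_a / n_a, completed_b / n_b
p_pool = (completed_a + completed_b) / (n_a + n_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal distribution
p_value = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p_value:.2g}")  # small p: the variants differ
```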