industrial risk

How can industrial risk be assessed?

Safety is a key concern in the industrial sector. As such, studying risk represents a specialized field of research. Experiments in this area are particularly difficult to carry out, as they involve explosions and complex measurements. Frédéric Heymes, a researcher at IMT Mines Alès who specializes in industrial risk, discusses the unique aspects of this field of research and the new issues it must now take into account.

 

What does research on industrial risk involve?

Frédéric Heymes: Risk is the likelihood of the occurrence of an event that could lead to serious negative consequences. Our research is broken down into three levels of anticipation (understanding, preventing, protecting) and one operational level (helping manage accidents). We have to understand what can happen and do everything possible to prevent dangerous events from happening in real life. Since accidents remain inevitable, we have to anticipate protective measures to best protect people and resources in the aftermath of an accident. We must also be able to respond effectively. Emergency services and the parties responsible for managing industrial disasters need simulation tools to help them make the right decisions. Risk research is cross-sectoral and can be applied to a wide range of industries (energy, chemistry, transport, pharmaceuticals, agri-food).

What’s a typical example of an industrial risk study?

FH: Although my research may address a wide variety of themes, on the whole it’s primarily connected to explosive risk. That means understanding the phenomenon and why it occurs, in order to make sure it won’t happen again. A special feature of our laboratory is that we can carry out experimental field testing for dangerous phenomena that can’t be studied in a laboratory setting.

What does an experiment on explosive risk look like?

FH: We partnered with Total to carry out an especially impressive experiment, which had never before been done anywhere in the world. It was a study on the explosion of superheated water, under very high pressure at a very high temperature. It was potentially dangerous since the explosion releases a very large amount of energy. It was important for Total to understand what happens in the event of such an explosion and what the consequences of concern would be. Carrying out the experiment was a real team effort and called for a great deal of logistical planning. Right away, it was different from working in a lab setting. There were between 5 and 8 people involved in each test, and everyone had their own specific role and specialty: data acquisition, control, high-speed cameras, logistics, handling. We needed a prototype that weighed about a ton, which we had built by a boilermaker. That alone was no simple task. Boilermakers are responsible for producing compliant equipment that is known to be reliable. But for our research, we knew that the prototype would explode. So we had to reassure the manufacturer about liability.

How do you set up such an explosion?

FH: We need a special testing ground to carry out the experiment, and to get permission to use it, we have to prove that the test is perfectly controlled. For these tests, we worked with the Camp des Garrigues, a military range located north of Nîmes. The test area is secure but completely empty, so it took a lot of preparation and set-up. Firefighters were also on site with our team. And a great deal of research went into the sensors in order to obtain precise measurements. The explosion lasts less than a second. It’s a very short test. Most of the time, we only have access to the field for a relatively short period, which means we carry out the tests one after another, non-stop. We’re also under a lot of stress – we know that the slightest error could have dramatic consequences.

What happens after this study?

FH: The aim of this research was to study the consequences of such an explosion on the immediate environment. That provides us with an in-depth understanding of the event so that those involved can take appropriate action. We therefore obtain information about the explosion, the damage it causes and the size of the damaged area. We also observe whether it can give rise to a shock wave or projectile expulsion, and if so, we study their impacts.

Has there ever been a time when you were unable to carry out tests you needed for your research?

FH: Yes, that was the case for a study on the risk of propane tank explosions during wildfires. Ideally, we would need a real, controlled wildfire and would expose propane tanks to this hazard. But we’re not allowed to do that, and it’s extremely dangerous. It’s a real headache. Ultimately, we have to divide the project into two parts and study each part separately. That way, we obtain results that we can link using modeling. On one hand, we have the wildfire, with a huge number of variables that must be taken into account: wind strength and direction, slope inclination, types of species in the vegetation, etc. On the other hand, we study fluid mechanics and thermodynamics to understand what happens inside propane tanks.

What results did you achieve through this study?

FH: We arrived at the conclusion that gas tanks are not likely to explode if brush clearing regulations are observed. In residential areas located near forests, there are maintenance regulations, for brush clearing in particular. But if these rules are not observed, safety is undermined. We therefore proposed a protective component with good thermal properties and flame resistance to protect tanks in scenarios that do not comply with regulations.

What are some current issues surrounding industrial risk?

FH: Research in the field of industrial risk really took off in the 1970s. A number of industrial accidents underscored the need to anticipate risks, leading to extensive research to prevent and protect against risks more effectively. But today, all energy sectors are undergoing changes and there are new risks to consider. New sectors are emerging and raising new issues, as is the case for hydrogen, for example. Hydrogen is a very attractive energy source since its use only produces water, and no carbon dioxide. But it is a dangerous compound since it’s highly flammable and explosive. The question is how to organize hydrogen supply chains (production, transport, storage, use) as effectively as possible. How can hydrogen best be used across a region while minimizing risks? It’s a question that warrants further investigation. A cross-disciplinary research project on this topic with other IMT partners is getting underway as part of Carnot HyTrend.

Read more on I’MTech: What is Hydrogen Energy?

So does that mean that energy and environmental transition come with their own set of new risks to be studied?

FH: Yes, that’s right, and global warming is another current field of research. To go back to wildfires, they’re becoming more common, which raises concerns. How can we deal with the growing number of fires? One solution is to consider passive self-protection scenarios, meaning reducing vulnerability to risks through technological improvements, for example. The energy transition is bringing new technologies, along with new uses. Like I was saying before, hydrogen is a dangerous chemical compound, but we’ve known that for a long time. However, its operational use to support the energy transition raises a number of new questions.

How can we deal with these new risks?

FH: The notion of new industrial risk is clearly linked to our social and technological evolutions. And evolution means new risks. Yet it’s hard to anticipate such risks since it’s hard to anticipate these evolutions in the first place. But at the same time, these evolutions provide us with new tools: artificial intelligence, for example. We can now assimilate large amounts of data and quickly extract useful, relevant results to recognize an abnormal, potentially dangerous situation. Artificial intelligence also helps us overcome a number of technological hurdles. For example, we’re working with Mines ParisTech on predicting the hydrodynamic behavior of gas leaks using artificial intelligence methods, with unprecedented computing speed and accuracy.

How is research with industrial players organized on this topic?

FH: Research can grow out of partnerships with research organizations, such as the IRSN (French Institute for Radiological Protection and Nuclear Safety). During the decommissioning of a power plant, even though there’s no longer any fissile material, residual metal dust could potentially ignite. So we have to understand what may happen in order to act accordingly in terms of safety. But for the most part, I collaborate directly with industrial companies. In France, they’re responsible for managing the risks inherent in their operations. So there’s a certain amount of administrative pressure to improve on these issues, and that sometimes involves research questions. But most of the time, investments are driven not by administrative requirements, but by a genuine commitment to reducing risks.

What’s quite unique about this field of research is that we have complete freedom to study the topic and complete freedom to publish – that’s really specific to risk research. In general, results are shared easily, and often published so that “the competition” can also benefit from the findings. It’s also quite common for several companies in the same industry to team up to fund a study, since they all stand to benefit from it.

OligoArchive

DNA as the data storage medium

By 2025, the volume of data produced in the world will have reached 250 zettabytes (1 zettabyte = 10²¹ bytes). Current storage media have insufficient storage capacity or suffer from obsolescence. Preserving even a fraction of this data means finding a storage device with density and durability characteristics significantly superior to those of existing systems. The European OligoArchive project, launched in October 2019 for a period of three years, proposes to use DNA (DeoxyriboNucleic Acid) as a storage medium. Raja Appuswamy, a researcher at EURECOM, a partner in the project, explains further.

 

In what global context did the European OligoArchive project come about?

Raja Appuswamy: Today, everything in our society is driven by data. If data is the oil that fuels the metaphorical AI vehicle, storage technologies are the cog that keeps the wheel spinning. For decades, we wanted fast storage devices that could quickly deliver data, and optical, magnetic, and solid-state storage technologies evolved to meet this requirement. As data-driven decision-making becomes a part of our society, we are increasingly faced with a new need: cheap, long-term storage devices that can safely store the collective knowledge we generate for hundreds or even thousands of years. Imagine you have a photograph that you would like to pass down to your great-great-grandchildren. Where would you store it? How much space would it take? How much energy would it use? How much would it cost? Would your storage media still be readable two generations from now? This is the context for project OligoArchive.

What is at stake in this project?

RA: Today, tape drives are the gold standard when it comes to data archival across all disciplines, from Hollywood movie archives to particle accelerator facilities. But tape media suffers from several fundamental limitations that make it unsuitable for long-term data storage. First, the storage density of tape – the amount of data you can store per inch – is improving at a rate of 30% annually, whereas archival data is growing at a rate of 60%. Second, if one stores 1 PB on 100 tapes today, within five years it would be possible to store the same data on just 25 tapes. While this might sound like a good thing, using tape for archival storage implies constant data migration with each new generation of tape, and such migrations cost millions of dollars.
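As a rough back-of-the-envelope check of that figure, assuming the 30% density improvement compounds annually over five years:

$$\frac{100\ \text{tapes}}{1.3^{5}} \approx \frac{100}{3.7} \approx 27\ \text{tapes},$$

which is consistent with the roughly 25 tapes quoted above.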

This problem is so acute that Hollywood movie archives have openly admitted that we are living in a dead period, during which the productions of several independent artists will not be saved for the future! At the rate at which we are generating data to feed our AI machinery, enterprises will soon reach this point. Thus, the storage industry as a whole has come to the realization that a radically new storage technology is required if we are to preserve data across generations.

What will be the advantages of the technology developed by OligoArchive?

RA: Project OligoArchive undertakes the ambitious goal of retasking DNA – a biological building block – to function as a radically new digital storage medium. DNA possesses three key properties that make it relevant for digital data storage. First, it is an extremely dense three-dimensional storage medium that has the theoretical ability to store 455 exabytes in 1 gram. The sum total of all data generated worldwide (the global datasphere) is projected to be 175 zettabytes by 2025. This could be stored in just under half a kilogram of DNA. Second, DNA can last several millennia, as demonstrated by experiments that have read the DNA of ancient, extinct animal species from fossils dating back thousands of years. If we can bring the woolly mammoth back to life from its DNA, we can store data in DNA for millennia. Third, the density of DNA is fixed by nature, and we will always have the ability and the need to read DNA – everything from archeology to precision medicine depends on it. Thus, DNA is an immortal storage medium that does not have the media obsolescence problem and can never become outdated, unlike other storage media (remember floppy disks?).
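A quick sanity check of the half-kilogram figure, using the densities quoted above:

$$\frac{175\ \text{ZB}}{455\ \text{EB/g}} = \frac{175\,000\ \text{EB}}{455\ \text{EB/g}} \approx 385\ \text{g},$$

i.e. a bit under half a kilogram of DNA for the entire projected 2025 datasphere.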

What expertise do EURECOM researchers bring?

RA: The Data Science department at EURECOM is contributing to several aspects of this project. First, we are building on our deep expertise in storage systems to architect various aspects of using DNA as a storage medium, like developing solutions for implementing a block abstraction over DNA, or providing random access to data stored in DNA. Second, we are combining our expertise in data management and machine learning to develop novel, structure-aware encoding and decoding algorithms that can reliably store and retrieve data in DNA, even though the underlying biological tasks of synthesis (writing) and sequencing (reading) introduce several errors.

Who are your partners and what are their respective contributions?

RA: The consortium brings together a truly multidisciplinary group of people with diverse expertise across Europe. The Institute of Molecular and Cellular Pharmacology (IPMC) in Sophia Antipolis, home to the largest sequencing facility in the PACA region, is a partner that contributes its biological expertise to the project. Our partners at I3S, CNRS, are working on new compression techniques customized for DNA storage that will drastically reduce the amount of DNA needed to store digital content. Our colleagues at Imperial College London (UK) are building on our work and pushing the envelope further by using DNA not just as a storage medium but as a computational substrate, showing that some SQL database operations that run in silico (on a CPU) today can be translated efficiently into in vitro biochemical reactions performed directly on DNA. Finally, we also have HelixWorks, a startup from Ireland investigating novel enzymatic synthesis techniques for reducing the cost of generating DNA, as an industrial partner.

What results are expected and ultimately what will be the applications?

RA: The ambitious end goal of the project is to build a DNA disk – a fully working end-to-end prototype that shows that DNA can indeed function as a replacement for current archival storage technology like tape. Application-wise, archival storage is a billion-dollar industry, and we believe that DNA is a fundamentally disruptive technology that has the potential to reshape this market. But we believe that our project will have an impact on areas beyond archival storage.

First, our work on DNA computation opens up an entirely new field of research on near-molecule data processing that mirrors the current trend of moving computation closer to data to avoid time-consuming data movement. Second, most of the models and tools we develop for DNA storage are also applicable to analyzing genetic data in other contexts. For instance, the algorithm we are developing for reading data back from DNA provides a scalable solution for sequence clustering – a classic computational genomics problem with several applications. Thus, our work will also contribute to advances in computational genomics.

Learn more about OligoArchive

Rémi Sharrock

C in your Browser

In the academic world, teaching and carrying out research often go hand in hand. This is especially true for Rémi Sharrock, a computer science researcher at Télécom Paris, who has developed a C language learning program comprising 7 MOOCs. The teaching approach used for his online courses called for the development of innovative tools, drawing on the researcher’s expertise. Rémi Sharrock was rewarded for this work in November 2019 by the edX platform, a leading global MOOC provider, which presented him with its 2019 edX Prize. He talked to us about the story behind this digital learning program, developed in partnership with Dartmouth College in the United States.

 

What led you to undertake research in order to create an online learning program?

Rémi Sharrock: The original aim was to propose a new way of learning the C language. To do so, we had to develop a number of tools that didn’t exist at the time. This work, carried out with Dartmouth College, gave rise to research opportunities. Our goal was always to facilitate exchange with the learner and to make it a central part of the learning process. The tools we developed made it possible to carry out learning activities directly on the user’s computer, with many features that had never been seen before.

What are some examples of the tools you developed?

RS: The idea of a MOOC is that it’s open to as many people as possible. We didn’t know what type of computer users would connect from, or what operating system or browser they would use. But regardless of their system, we had to be able to provide users with a high-quality learning experience. The first tool we developed for this was WebLinux. It met the challenge of being able to code in the C language with Linux from any computer, using any browser. We didn’t want to make learners download an application, since that could discourage beginners. WebLinux therefore allowed us to emulate Linux for everyone, directly on the web-based learning platform.

How did you do this from a technical perspective?

RS: Technically, we run Linux directly in the browser, without going through a server. To do so, we use an OpenRISC processor emulator that runs in the browser, and a Linux build that is compatible with this type of processor. That allows us to do without servers running Linux, and therefore to operate on a large scale with limited server resources.

That’s an advantage in terms of access to education, but does the tool also facilitate educational activities?  

RS: For that part we had to develop an additional tool, called Codecast. It’s a C language emulator that runs on any browser and is synchronized with the professor’s audio explanation. It was a real challenge to develop this tool, because we wanted to make it possible for anyone to run C language instructions directly in their browser, without having to go through a remote server or use third-party software on their computer. We created a specialized C language interpreter for the web, which works with all browsers. When you’re watching the professor’s course in the video, you can directly edit lines of code and run them in your browser, right from the course web page. And on top of that, when the teacher includes an instruction to be learned and tested as part of the lesson, you can pause the video, edit the instruction and try different things, then resume the video without any consequences.

You also responded to another challenge with this type of MOOC: assessing learners.

RS: Yes, with a third tool, Taskgrader. In a traditional classroom course, the teacher assesses the code proposed by students one by one, and corrects it. This is inconceivable with a MOOC, since you have tens or hundreds of thousands of learners to correct. Taskgrader makes it possible to automatically assess students’ code in real time and provide personalized feedback, without the professor having to look it over.
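The interview does not detail how Taskgrader works internally, but the general principle of this kind of automatic assessment can be sketched as follows: compile the learner’s C submission, run it against reference test cases, and return a score with feedback. The snippet below is a minimal illustration of that idea (the test cases, file names and scoring rule are invented for the example), not the actual Taskgrader.

```python
import subprocess
import tempfile
from pathlib import Path

# Minimal sketch of an automatic grader for C submissions.
# Illustration only: a real grader would also sandbox execution.

TEST_CASES = [
    ("3 4\n", "7\n"),    # (stdin fed to the program, expected stdout)
    ("10 -2\n", "8\n"),
]

def grade(source_code: str) -> dict:
    """Compile a C submission with gcc and run it against the test cases."""
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp) / "submission.c"
        binary = Path(tmp) / "submission"
        src.write_text(source_code)

        # 1. Compile; a compilation error is immediate feedback for the learner.
        build = subprocess.run(
            ["gcc", str(src), "-o", str(binary)],
            capture_output=True, text=True,
        )
        if build.returncode != 0:
            return {"score": 0.0, "feedback": "Compilation failed:\n" + build.stderr}

        # 2. Run each test case with a timeout and compare the output.
        passed = 0
        for stdin_data, expected in TEST_CASES:
            try:
                run = subprocess.run(
                    [str(binary)], input=stdin_data,
                    capture_output=True, text=True, timeout=2,
                )
            except subprocess.TimeoutExpired:
                continue  # infinite loops count as a failed test case
            if run.returncode == 0 and run.stdout == expected:
                passed += 1
        return {"score": passed / len(TEST_CASES),
                "feedback": f"{passed}/{len(TEST_CASES)} test cases passed."}

print(grade("#include <stdio.h>\nint main(){int a,b;scanf(\"%d %d\",&a,&b);printf(\"%d\\n\",a+b);return 0;}"))
```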

Do all these tools have applications outside the scope of the MOOC C language learning program?

RS: Codecast could be of interest to big community-driven development websites like GitHub. Amateur and professional developers share bits of code for applications on the website. But cooperation is often difficult: to correct someone’s code, you have to download the incorrect version, correct it, then send it back to the person, who then has to download it again. An emulator in the browser would make it possible to work directly online, in real time. And as for Taskgrader, it’s a valuable tool for all computer language teachers, even outside the world of MOOCs.

Is your research work in connection with these MOOCs over now that the learning program has been completed?  

RS: No, since we’ve also committed to a second type of research. We’ve teamed up with Cornell and Stanford universities to carry out large-scale sociological experiments on these MOOC learners in an effort to better understand our learner communities.

What kind of research are you conducting to that end?

RS: We have 160,000 learners in the MOOC program worldwide, from a wide range of social, ethnic and demographic backgrounds. We wanted to find out whether there are differences in the way in which men and women learn, for example, or between older and younger people. We therefore introduce variations into the courses according to individuals’ profiles, based on A/B testing – the sample of learners is split in two, and one learning parameter changes between the groups, such as the teacher’s age, voice or gender. This should eventually allow us to better understand learning processes and adapt them to provide each individual with a program that facilitates knowledge transfer.
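As a rough sketch of the mechanics involved – random assignment of learners to two course variants, then a comparison of an outcome such as completion rate – the snippet below uses a standard two-proportion test. The variable names and completion counts are invented for illustration; they do not describe the actual experimental protocol used in these studies.

```python
import hashlib
from math import sqrt

# Hypothetical A/B-testing sketch: deterministically assign learners to two
# course variants, then compare completion rates with a two-proportion z-test.
# Group sizes and completion counts below are invented for illustration.

def assign_variant(learner_id: str) -> str:
    """Stable 50/50 split based on a hash of the learner identifier."""
    digest = int(hashlib.sha256(learner_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """z statistic for the difference between two completion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

print(assign_variant("learner-0042"))           # which variant this learner sees
z = two_proportion_z(4200, 10000, 4050, 10000)  # made-up completion counts
print(f"z = {z:.2f} (|z| > 1.96 suggests a significant difference at the 5% level)")
```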

ASTRID project

Astrid: a nuclear project goes up in smoke

The abandonment of the Astrid project marks a turning point for France’s nuclear industry. The planned nuclear reactor was supposed to be “safer, more efficient and more sustainable”, but therefore required significant funding. Stéphanie Tillement, a researcher at IMT Atlantique, has studied how Fukushima impacted the nuclear industry. Her work has focused in particular on the rationale for abandoning the Astrid project, taking into account the complicated history of nuclear energy and how it has evolved in the public and political spheres.

 

Since the early days of nuclear energy, France has positioned itself as a global leader in terms of both research and energy production. In this respect, the abandonment of the Astrid project in August 2019 marked a move away from this leading position. Astrid (Advanced Sodium Technological Reactor for Industrial Demonstration) was supposed to be France’s first industrial demonstrator for what are referred to as “4th-generation” reactors. The selected technology was the sodium-cooled fast neutron reactor (FNR). At present, nuclear power in France is supplied by 58 second-generation pressurized water reactors, which operate with “slowed-down” neutrons. As an FNR, Astrid held the promise of more sustainable energy – it was supposed to be able to use the depleted uranium and plutonium resulting from the operation of current plants as a fuel source, meaning it would consume much less natural uranium.

As part of the AGORAS research project, IMT Atlantique researcher Stéphanie Tillement studied the impact of the Fukushima accident on the world of nuclear energy. This led her to study the Astrid project, and in particular the many challenges it encountered. “We ruled out the link with Fukushima early on,” says the researcher. “The problems Astrid ran into are not related to a paradigm shift as a result of the catastrophe. The reasons it was abandoned are endogenous to the industry and its history.” And financial reasons, though by no means negligible, are not enough to explain why the project was abandoned.

A tumultuous history

In the 2000s, the United States Department of Energy launched the Generation IV International Forum to develop international cooperation on new concepts for nuclear reactors. Out of the six concepts selected by this forum as the most promising, France focused on sodium-cooled reactors, a project which would be launched in 2010 under the name Astrid. The country preferred this concept in particular because three French reactors using the technology had already been built. However, none of them had been used on an industrial scale, and the technology had not advanced beyond the prototyping stage. The first such reactor, Rapsodie, was dedicated purely to research. The second, Phénix, was an intermediate step – it produced energy but remained an experimental reactor, far from an industrial scale. The third, Superphénix, was given the role of being the first in a series of reactors for this new French industrial-scale energy. But from the beginning, it experienced shutdown periods following several incidents, and in 1997 Prime Minister Lionel Jospin announced that it would be shut down once and for all.

“This decision was widely criticized by the nuclear industry,” says Stéphanie Tillement, “which accused him of acting for the wrong reasons.” During the election campaign, Lionel Jospin had aligned himself with the Green party, which was openly in favor of decommissioning the power plant. “Its sudden shutdown was taken very badly and destroyed all hope for the use of such technology on an industrial scale. Superphénix was supposed to be the first in a long line, and some remember it as ‘a cathedral in a desert.’” This also reflected public opinion on nuclear energy: the industry was facing growing mistrust and opposition.

“For a lot of stakeholders in the nuclear industry, in particular the CEA (The French Atomic and Alternative Energy Commission), Astrid gave hope to the idea of reviving this highly promising technology,” explains the researcher. One of the biggest advantages was the possibility of a closed nuclear cycle, which would make it possible to recycle nuclear material from current power plants – such as plutonium – to use as a fuel source in the reactors. “In this respect, the discontinuation of the Astrid project may in the long run call into question the very existence of the La Hague reprocessing plant,” she says. This plant processes used fuel, a portion of which (plutonium in particular) is reused in reactors, in the form of MOX fuel. “Without reactors that can use reprocessed materials effectively, it’s difficult to justify its existence.”

Read more on I’MTech: MOx strategy and the future of French nuclear plants

“From the beginning, our interviews showed that it was difficult for the Astrid stakeholders to define the status of the project precisely,” explains Stéphanie Tillement. The concept proposed when applying for funding was that of an industrial demonstrator. The goal was therefore to build a reactor within a relatively short period of time, which could produce energy on a large scale based on technology for which there was already a significant amount of operating experience. But the CEA also saw Astrid as a research project, to improve the technology and develop new design options. This would require far more time. “As the project advanced,” adds the researcher, “the CEA increasingly focused on a research and development approach. The concept moved away from previous reactors and its development was delayed. When they had to present the roadmap in 2018, the project was at a ‘basic design’ stage and still needed a lot of work, as far as design was concerned, but also in terms of demonstrating compliance with nuclear safety requirements.”

An abandoned or postponed project?

Stéphanie Tillement confirms that “the Astrid project, as initially presented, has been permanently abandoned.” Work on sodium technology is expected to continue, but the construction of a potential demonstrator of this technology has been postponed until the second half of the 21st century. “It’s a short-sighted decision,” she insists. Uranium, which is used to operate reactors, is currently inexpensive, so there is no need to turn to more sustainable resources – at least not yet. But abandoning the Astrid project means running the risk of losing the expertise acquired for this technology. Though some research may continue, it will not be enough to maintain industrial expertise in developing new reactors, and the knowledge in this sector could be lost. “The process of regaining lost knowledge,” she says, “is ultimately as expensive as starting from scratch.”

It is therefore a short-term decision, relying instead on the EPR, a 3rd-generation reactor. But the construction of this type of reactor in Flamanville also faces its own set of hurdles. According to Stéphanie Tillement, “the challenges the Astrid project encountered are similar to those of the EPR project.” To secure funding for such projects, nuclear industry stakeholders seek to align themselves with the short timeframes of the political world. Yet short deadlines are ultimately unrealistic and inconsistent with the timeframes for developing nuclear technology, especially when it comes to the first in a series. This creates problems for nuclear projects – they fall behind schedule and their costs rise dramatically. In the end, this makes politicians rather wary of funding this sort of project. “So nuclear energy gets stuck in this vicious circle,” says the researcher, “in a world that’s increasingly unfavorable to this sector.”

This decision also aligns with the government’s energy strategy. In broad terms, the State has announced that nuclear energy will be reduced to 50% of France’s energy mix, in favor of renewable energies. “The problem,” says Stéphanie Tillement, “is that we only have an outline. If there’s a political strategy on nuclear issues, it remains unclear. And there’s no long-term position – this is a way of leaving the decision to future decision-makers. But making no decision is a decision. Choosing not to pursue the development of technologies which take a long time to develop may implicitly mean abandoning the idea of any such development in the future. This leads some to consider, rather cynically, that politicians must think that when we need the technology, we’ll buy it from other powers (China, Russia) that have already developed it.”

connected devices

A dictionary for connected devices

The field of connected devices is growing at a staggering pace across all industries. Hence the need to develop a communication standard, meaning a ‘common language’ that different smart systems could understand and interpret. To contribute to this goal, ETSI (the European Telecommunications Standards Institute) is funding a European project in which Mines Saint-Étienne researchers Maxime Lefrançois and Antoine Zimmermann[1] are taking part.

 

In order to work together, connected devices must be able to communicate with one another. This characteristic, known as ‘semantic interoperability,’ is one of the key challenges of the digital transition. To be effective, semantic interoperability must be based on the adoption of an agreed-upon set of best practices, culminating in the creation of a standard adopted by the IoT community. At the European level, ETSI (European Telecommunications Standards Institute) is in charge of setting standards for information and communication technologies. “For example, ETSI standardized the SIM card, which acts as an identifier in mobile phone networks to this day,” explains Maxime Lefrançois. He and his colleague Antoine Zimmermann are researchers at Mines Saint-Étienne who specialize in the semantic web and knowledge representation. They are taking part in the STF 578 project on the interoperability of connected devices, funded by ETSI, in partnership with two researchers from Universidad Politécnica de Madrid.

“Instead of proposing a standard that strictly defines the content of communications between connected devices, we define and formally identify the concepts involved, through what is known as an ontology,” says Antoine Zimmermann. This provides IoT players with greater flexibility, since the content of the messages exchanged may use the language and format best suited to the device, as long as an explicit link is made with the concept identified in the reference ontology. The two researchers are working on the SAREF reference ontology (Smart Applications Reference Ontology), a set of ETSI specifications which include a generic base and specializations for the various sectors related to the IoT: energy, environment, building, agriculture, smart cities, industry and manufacturing, water, automotive, e-health, wearables.

“The SAREF standard describes smart devices, their functions and the services they provide, as well as the various properties of the physical systems these devices can control,” explains Maxime Lefrançois. For example, a light bulb can say “I can provide light” by using a concept defined by SAREF. A system or application may then refer to the same lighting concept to tell the device to turn on. “Ultimately, this knowledge should be described following the same standard models within each industry to facilitate harmonization between industries,” adds the researcher. The aim of the project is therefore to develop a public web portal for the standard SAREF ontology, to facilitate its adoption by companies and collect their feedback and suggestions for improvement.
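To make the idea concrete, here is a minimal sketch of what such a device description could look like, written with the rdflib Python library. The SAREF namespace is the published one, but the specific classes and properties used here (saref:Device, saref:OnOffFunction, saref:hasFunction, saref:hasManufacturer) are used for illustration and should be checked against the current version of the ontology.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Illustrative SAREF-style description of a connected light bulb.
# Class and property names should be verified against the published ontology.
SAREF = Namespace("https://saref.etsi.org/core/")
EX = Namespace("http://example.org/devices/")

g = Graph()
g.bind("saref", SAREF)
g.bind("ex", EX)

# "I am a device, and I offer an on/off (lighting) function."
g.add((EX.bulb42, RDF.type, SAREF.Device))
g.add((EX.bulb42, SAREF.hasManufacturer, Literal("ExampleCorp")))
g.add((EX.bulb42, SAREF.hasFunction, EX.bulb42_switch))
g.add((EX.bulb42_switch, RDF.type, SAREF.OnOffFunction))

print(g.serialize(format="turtle"))
```

An application on the other side only needs to look for the shared concept – here, the on/off function – to know it can switch the device, regardless of the manufacturer’s own message format.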

A specially-designed ‘dictionary’

“The SAREF public web portal is a little bit like a ‘dictionary’ for connected devices,” explains Maxime Lefrançois. “If we take the example of a water heater that can measure energy consumption and can be remotely controlled, SAREF will describe its possible actions, the services it can provide, and how it can be used to lower energy costs or improve household comfort.” But his colleague Antoine Zimmermann explains, “It isn’t a dictionary in the traditional sense. SAREF specifies in particular the technical and IT-related constraints we may encounter when communicating with the water heater.”

Imagine if one day all water heaters and heat pumps were connected to the IoT and could be remotely controlled. They could then theoretically be used as an energy resource that could ensure the stability and energy efficiency of the country’s electricity grid. If, in addition, there was a uniform way to describe and communicate with these devices, companies in the smart building and energy sectors would waste less time individually integrating products made by different manufacturers. They could then focus instead on developing innovative services connected to their core business, giving them a competitive advantage. “The goal of semantic interoperability is to develop a service for a certain type of smart equipment, and then reuse this service for all similar types of equipment,” says Maxime Lefrançois. “That’s the heart of SAREF”.

Read more on I’MTech: How the SEAS project is redefining the energy market

At present, the existing standards are compartmentalized by sector. The energy industry has standards for describing and communicating with the electrical equipment of a water tower, but the water tower must then implement different standards to interface with other equipment in the water distribution network. “There are several different consortia for each sector,” explain the researchers, “but we now have to bridge the gap between these consortia, in order to harmonize their standards.” Thus the need for a ‘dictionary,’ a common vocabulary that can be used by connected devices in all industries.

Take the example of automotive manufacturers who are developing new batteries for electric vehicles. Such batteries could theoretically be used by energy suppliers to regulate the voltage and frequency of the electricity grid. “The automotive and energy industries are two sectors that had absolutely no need to communicate until now,” says Maxime Lefrançois. “In the future, they may have to work together to develop a common language, and SAREF could be the solution.”

A multilingual ‘dictionary’

The IoT community is currently engaged in something of a ‘standards war,’ in which everyone is developing their own specification and hoping that it will become the standard. Impetus from public authorities is therefore needed to channel the existing initiatives – SAREF plays this role at the European level. “We can well imagine that in the future, there will only be a single, shared vocabulary for everyone,” says Antoine Zimmermann. “But we may find ourselves with different vocabularies being developed at the same time, which then persist. That would be problematic. This is how it is today, for example, with electrical outlets. A machine intended to be used in the United States will not work with European outlets, and vice versa.”

“The development of the SAREF public web portal is an important step since it encourages companies to take part in creating this dictionary,” adds Maxime Lefrançois. The more companies are involved in the project, the more comprehensive and competitive it will be. “The value of a standard is related to the size of the community that adopts it,” he says.

“The semantic web is particularly useful in this respect,” says Antoine Zimmermann. “It allows everyone to agree. Companies are all engaged in digital transformation and use the web as a common platform to get in touch with clients and partners. They use the same protocols. We think the semantic web is also a good way to build these common vocabularies that will work across various sectors. We aren’t looking for the one right solution, but to demonstrate best practices and make them more widespread, so that companies look beyond their own community.”

A collaborative ‘dictionary’

The researchers’ work also involves developing a methodology for building this standard: a company must be able to suggest a new addition to the vocabulary that is highly specific to a certain field, while ensuring that this contribution aligns with the standard models and best practices that have been established for the entire ‘dictionary.’

“And that’s the tricky part,” says Maxime Lefrançois. How can the SAREF public portal be improved and updated to make sure that companies use it? “We know how to write ‘dictionaries,’ but supporting companies is no simple task.” There are a number of constraints involved: all these different vocabularies and jargons must be assimilated, and companies may not necessarily be familiar with them.

“So we have to reinvent collaborative support methods for this dictionary. That’s where the DevOps approaches implemented for software development are useful,” he says. These approaches make it possible to automatically check suggestions against a set of quality criteria, then automatically publish a new version of the portal online if the criteria are fulfilled. “The goal is to shorten SAREF development cycles while maintaining an optimal level of quality,” concludes the researcher.
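As an illustration of what such an automated check might look like, the sketch below validates a proposed ontology contribution before it is merged: it parses the file and applies one example quality rule. The file handling and the namespace rule are hypothetical; the actual SAREF pipeline applies a much richer set of criteria.

```python
import sys
from rdflib import Graph

# Hypothetical pre-merge check for an ontology contribution (illustration only;
# the real SAREF quality criteria are far more extensive).

ALLOWED_PREFIXES = ("https://saref.etsi.org/", "http://www.w3.org/")

def check_contribution(path: str) -> bool:
    g = Graph()
    try:
        g.parse(path, format="turtle")  # rule 1: the file must be valid Turtle
    except Exception as err:
        print(f"Syntax error in {path}: {err}")
        return False

    # Rule 2 (example): every subject URI must live in an agreed namespace.
    rogue = [str(s) for s in set(g.subjects())
             if str(s).startswith("http") and not str(s).startswith(ALLOWED_PREFIXES)]
    if rogue:
        print("Terms outside the agreed namespaces:", ", ".join(sorted(rogue)))
        return False

    print(f"{path}: {len(g)} triples, all checks passed.")
    return True

if __name__ == "__main__":
    sys.exit(0 if check_contribution(sys.argv[1]) else 1)
```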

There are other hurdles to overcome to get the connected devices themselves to ‘speak SAREF,’ due to their specific limitations: limited storage and computing capacity, low battery life, limited bandwidth, intermittent connectivity. The use of ontologies for communication and ‘reasoning’ was first thought up without these constraints, and must be reinvented for these kinds of ‘edge computing’ configurations. These issues will be explored in the upcoming ANR CoSWoT project (Constrained Semantic Web of Things), which will include researchers from LIRIS, Mines Saint-Étienne, INRAE (a merger of INRA and IRSTEA), Université Jean-Monnet and the company Mondeca.

 

[1] Maxime Lefrançois and Antoine Zimmermann are researchers at the Hubert Curien Laboratory, a joint research unit of CNRS, Mines Saint-Étienne and Université Jean Monnet.

Being Human with Algorithms: Marc-Oliver Pahl meets Raimund Seidel

Marc-Oliver Pahl is a researcher in cybersecurity at IMT Atlantique. In 2018, he launched “Being Human with Algorithms”, a series of video interviews between technicians and non-technicians on the topic of digital transformation. Through open discussions and dialogues, he explores how digital technologies are perceived and how they affect humans as citizens, consumers, workers…

In this episode, Marc-Oliver meets with Raimund Seidel, Director of the Schloss Dagstuhl – Leibniz Center for Informatics.

immune system

Understanding the resilience of the immune system through mathematical modeling

Gaining insight into how the immune system works using mathematics is the ultimate goal of the research carried out by IMT Atlantique researcher Dominique Pastor, along with his team. Although the study involves a great degree of abstraction, the scientists never lose sight of practical applications, and not only in relation to biology.

 

In many industries, the notion of “resilience” is a key issue, even though there is no clear consensus on the definition of the term. Derived from the Latin verb meaning “to rebound,” the term does not refer to exactly the same thing as resistance or robustness. A resilient system is not unaffected by external events, but it is able to fulfill its function, even in a degraded mode, in a hostile environment. For example, in computer science, resilience means the ability to provide an acceptable level of service in the event of a failure.

This capacity is also found in the human body – and, in general, in all living beings. For example, when you have a cold, your abilities may be reduced, but in most cases you can keep living more or less normally.

This phenomenon is regularly observed in all biological systems, but remains quite complex. It is still difficult to understand how resilience works and the set of behaviors to which it gives rise.

A special case of functional redundancy: degeneracy

It was through discussions with Véronique Thomas-Vaslin, a biologist at Sorbonne University, that Dominique Pastor, a telecommunications researcher at IMT Atlantique, became particularly aware of this property of biological systems. Working with Roger Waldeck, who is also a researcher at IMT Atlantique, and PhD student Erwan Beurier, he carried out research to mathematically model this resilience, in order to demonstrate its basic principles and better understand how it works.

To do so, they drew on publications by other scientists, including American biologist Gerald Edelman (winner of the Nobel Prize in Medicine in 1972), underscoring another property of living organisms: degeneracy. (The term is usually translated into French as dégénérescence, i.e. ‘degeneration,’ but that word is misleading.) “Degeneracy” refers to the ability of two structurally different elements to perform the same function. It is therefore a kind of functional redundancy, one that also implies different structures. This characteristic can be found at multiple levels in living beings.

For example, amino acids, the basic building blocks of proteins, are produced from “messages” included in portions of DNA. More specifically, each message is a “codon”: a sequence of three molecules known as nucleotides. However, there are 4 possible nucleotides, meaning there are 64 possible combinations, for only 22 amino acids. That means that several codons can correspond to the same amino acid: a perfect example of degeneracy.
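The count behind this example is straightforward:

$$4^{3} = 64\ \text{possible codons} \;\gg\; 22\ \text{amino acids},$$

so on average roughly three distinct codons encode the same amino acid.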

“My hunch is that degeneracy is central to any resilient system,” explains Dominique Pastor. “But it’s just a hunch. The aim of our research is to formalize and test this idea based on mathematical results. This can be referred to as the mathematics of resilience.”

To this end, he relied on the work of French mathematician Andrée Ehresmann, Emeritus Professor at the University of Picardie Jules Verne, who established a mathematical model of degeneracy, known as the “Multiplicity Principle,” with Jean-Paul Vanbremeersch, an Amiens-based physician who specializes in gerontology.

Recreating resilience in the form of mathematical modeling

Dominique Pastor and his team therefore started out with biologists’ concrete observations of the human body, and then focused on theoretical study. Their goal was to develop a mathematical model that could imitate both the degeneracy and resilience of the immune system in order to “establish a link between the notion of resilience, this Multiplicity Principle, and statistics.” Once this link was established, it would then be possible to study it and gain insight into how the systems work in real life.

The researchers therefore compared the performance of two categories of statistical tests on a given problem: detecting a phenomenon. The first category, Neyman-Pearson testing, is optimal for determining whether or not an event has occurred. The second category, RDT (Random Distortion Testing), is also optimal, but for a different task: detecting whether an event has moved away from an initial model.

The two types of procedures were not created with the same objective. However, the researchers successfully demonstrated that RDT testing could also be used, in a “degenerate” manner, to detect a phenomenon, with performance comparable to Neyman-Pearson testing. That means that in the theoretical case of an infinite amount of data, they can detect the presence or absence of a phenomenon with the same level of precision. The two categories therefore perform the same function, although they are structurally different. “We therefore made two sub-systems in line with the Multiplicity Principle,” concludes the IMT Atlantique researcher.
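A toy numerical illustration of this idea (a sketch only, not the actual RDT framework or the researchers’ proof): calibrate two structurally different detectors to the same false-alarm rate under a nominal Gaussian model, then check that both detect a shift of the mean with comparable power.

```python
import numpy as np

# Toy illustration of two structurally different detectors performing the
# same detection task (a loose analogue of the Multiplicity Principle above).
# This is NOT the RDT framework itself, just a sketch.

rng = np.random.default_rng(42)
n, theta, alpha, trials = 64, 0.5, 0.05, 20_000

# Monte Carlo calibration under H0: samples ~ N(0, 1).
h0_means = rng.normal(0.0, 1.0, size=(trials, n)).mean(axis=1)
thr_np = np.quantile(h0_means, 1 - alpha)            # one-sided mean test (Neyman-Pearson-like)
thr_dev = np.quantile(np.abs(h0_means), 1 - alpha)   # deviation-from-model test (RDT-like)

# Detection performance under H1: samples ~ N(theta, 1).
h1_means = rng.normal(theta, 1.0, size=(trials, n)).mean(axis=1)
power_np = np.mean(h1_means > thr_np)
power_dev = np.mean(np.abs(h1_means) > thr_dev)

print(f"mean-threshold test      : power ~ {power_np:.3f}")
print(f"deviation-threshold test : power ~ {power_dev:.3f}")
# Both detectors, though built differently, detect the shift almost equally well.
```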

What’s more, the nature of RDT testing gives it an advantage over Neyman-Pearson testing, since the latter is only optimal when real events follow a certain mathematical model. If this is not the case – as so often happens in nature – it is more likely to be incorrect. RDT testing can adapt to a variable environment, since it is designed to detect such variations, and is therefore more robust. Combining the two types of testing can result in a system with the inherent characteristics of resilience, meaning the ability to function in a variety of situations.

From biology to cybersecurity

These findings are not intended to remain confined to a theoretical universe. “We don’t work with theory for the sake of theory,” says Dominique Pastor. “We never forget the practical side: we continually seek to apply our findings.” The goal is therefore to return to the real world, and not only in relation to biology. In this respect, the approach is similar to that used in research on neural networks – initially focused on understanding how the human brain works, it ultimately resulted in systems used in the field of computer science.

“The difference is that neural networks are like black boxes: we don’t know how they make their decisions,” explains the researcher. “Our mathematical approach, on the other hand, provides an understanding of the principles underlying the workings of another black box: the immune system.” This understanding is also supported by collaboration with David Spivak, a mathematician at MIT (United States), again in the field of mathematical modeling of biological systems.

The first application Dominique Pastor is working on falls within the realm of cybersecurity. The idea is to imitate the resilient behavior of an immune system for protective purposes. For example, many industrial sites are equipped with sensors to monitor various factors (light, the opening and closing of doors, the filling of a container, etc.). To protect these devices, they could be combined with a system for detecting external attacks. This could be made up of a network which would receive the data recorded by the sensors and run a series of tests to determine whether there has been an incident. Since these tests could be subject to attacks themselves, they would have to be resilient in order to be effective – hence the importance of using different types of tests, in keeping with the previously obtained results.

For now it is still too early to actually apply these theories. It remains to be proven that the Multiplicity Principle is a sufficient guarantee of resilience, given that this notion does not have a mathematical definition as of today. This is one of Dominique Pastor’s ambitions. The researcher admits that it is still his “pipe dream” and says, “My ultimate goal would still be to go back to biology. If our research could help biologists better understand and model the immune system, in order to develop better care strategies, that would be wonderful.”