
What is eco-design?

In industry, it is increasingly necessary to design products and services with concern and respect for environmental issues. Such consideration is expressed through a practice that is gaining ground in a wide range of sectors: eco-design. Valérie Laforest, a researcher in environmental assessment and environmental and organizational engineering at Mines Saint-Étienne, explains the term.


What does eco-design mean?

Valérie Laforest: The principle of eco-design is to incorporate environmental considerations from the earliest stages of creating a service or product, meaning from the design stage. It’s a method governed by standards, at the national and international level, describing concepts and setting out current best practices for eco-design. We can just as well eco-design a building as we can a tee-shirt or a photocopying service.

Why this need to eco-design?

VL: There is no longer any doubt about the environmental pressure on the planet. Eco-design is one concrete way for us to think about how our actions impact the environment and consider alternatives to traditional production. Instead of producing, and then looking for solutions, it’s much more effective and efficient to ask questions from the design stage of a product to reduce or avoid the environmental impact.

What stages does eco-design apply to?

VL: In concrete terms, it’s based entirely on the life cycle of a system, from very early on in its existence. Eco-design thinking takes into account the extraction of raw materials, as well as the processing and use stages, until end of life. If we recover the product when it is no longer usable, to recycle it for example, that’s also an example of eco-design. As it stands today, end-of-life products are either sent to landfills, incinerated or recycled. Eco-design means thinking about the materials that can be used, but also thinking about how a product can be dismantled so as to be incorporated within another cycle.

When did we start hearing about this principle?

VL: The first tools arrived in the early 2000s but the concept may be older than that. Environmental issues and associated research have increased since 1990. But eco-design really emerged in a second phase when people started questioning the environmental impact of everyday things: our computer, sending an email, the difference between a polyester or cotton tee-shirt.

What eco-design tools are available for industry?

VL: The tools can fall into a number of categories. There are relatively simple ones, like check-lists or diagrams, while others are more complex. For example, there are life-cycle analysis tools to identify the environmental impacts, and software to incorporate environmental indicators in design tools. The latter require a certain degree of expertise in environmental assessment and a thorough understanding of environmental indicators. And developers and designers are not trained to use these kinds of tools.

Are there barriers to the development of this practice?

VL: There’s a real need to develop special tools for eco-design. Sure, some already exist, but they’re not really adapted to eco-design and can be hard to understand. This is part of our work as researchers, to develop new tools and methods for the environmental performance of human activities. For example, we’re working on projects with the Écoconception center, a key player in the Saint-Etienne region as well as at the national level.

In addition to tools, we also have to go visit companies to get things moving and see what’s holding them back. We have to consider how to train, change and push companies to get them to incorporate eco-design principles. It’s an entirely different way of thinking that requires an acceptance phase in order to rethink how they do things.

Is the circular economy a form of eco-design?

VL: Or is eco-design a form of the circular economy? That’s an important question, and answers vary depending on who you ask. Stakeholders who contribute to the circular economy will say that eco-design is part of this economy. And on the other side, eco-design will be seen as an initiator of the circular economy, since it provides a view of the circulation of material in order to reduce the environmental impact. What’s certain is that the two are linked.

Tiphaine Claveau for I’MTech


This article was published as part of Fondation Mines-Télécom's 2020 brochure series dedicated to sustainable digital technology and the impact of digital technology on the environment. Through a brochure, conference-debates, and events to promote science in conjunction with IMT, this series explores the uncertainties and challenges of the digital and environmental transitions.



Joint AI: a platform to facilitate German-French research in AI

In 2019, the German-French Academy for the Industry of the Future launched the Joint AI platform project. This platform, which brings together IMT and the Technical University of Munich, promotes collaboration between researchers and industry to develop artificial intelligence tools. Its secure environment protects the intellectual property of research results and supports their reproducibility.


“The primary aim is to support artificial intelligence research projects between France and Germany.” This is how Anne-Sophie Taillandier begins her description of the Joint AI platform launched in 2019 by IMT and the Technical University of Munich. Since 2015, the two institutions have been working together through the German-French Academy for the Industry of the Future. This partnership has given rise to a number of research projects, some of which have focused on artificial intelligence. Researchers working in this area face a recurring problem: intellectual property protection for the results.

“One of the major risks for AI researchers is presenting their work to academic peers or industry stakeholders and having it stolen,” explains Anne-Sophie Taillandier. For several years, this French artificial intelligence expert has headed IMT's TeraLab, which aims to facilitate AI research in a secure environment. “Through discussions with our colleagues at the Technical University of Munich, we realized that we each had infrastructures to host and develop AI projects, but that there was no transnational equivalent,” she explains. This gave rise to the Joint AI platform project: a shared, reliable, protected site for German-French research on artificial intelligence.

Read more on I’MTech: TeraLab, a European Data Sanctuary

The platform is based on technological and legal tools. The hardware architecture and workspaces are designed to host data and work on it with the desired security level. Using a set of APIs, the results of a project can be highlighted and shared on both sides of the border, without having to move the data or the software developed. “Everyone can work with confidence, without having to provide access to their executable or data,” says Anne-Sophie Taillandier.
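By way of illustration, here is a hypothetical sketch of the kind of API-mediated exchange described above. The URL, endpoint, token and payload are all invented for this example; this is not the Joint AI platform's actual interface.

```python
# Hypothetical illustration of the principle described above: one partner
# exposes its model behind an API on the shared platform, so the other
# partner can evaluate it without ever receiving the executable or the
# training data. Every name below is invented.
import requests

response = requests.post(
    "https://jointai.example.org/partner-model/predict",  # invented endpoint
    json={"features": [0.12, 3.4, 5.6]},                  # a single test sample
    headers={"Authorization": "Bearer <project-token>"},  # placeholder credential
)

# Only the prediction crosses the border, not the data or the software.
print(response.json())  # e.g. {"prediction": 1, "confidence": 0.93}
```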

A tool for researchers…

For researchers working on AI, as in other scientific disciplines, facilitating cooperation means facilitating the progress of research projects and results. This is especially true for all research related to Industry 4.0, as is the case for the German-French Academy for the Industry of the Future projects that the Joint AI platform currently hosts. “Research on industry involves complex infrastructures, made up of human users and sensors that link the physical and digital dimensions,” says Georg Carle, holder of the Network Architectures and Services Chair at the Technical University of Munich, and co-director of the project with Anne-Sophie Taillandier.

He explains that “in order to be valuable, this research must be based on real data and generate realistic models.” And the more the data is shared and worked on by different teams of researchers, the more effective the resulting algorithms will be. For Georg Carle, “the Joint AI platform makes it possible to improve the reproducibility of results” between the French and German teams. “This leads to higher-quality results, with a bigger impact for the industry stakeholders.”

And for companies!

In addition to providing a collaborative tool for researchers, the Joint AI platform also provides innovation opportunities for companies involved in partnership-based research. When a German industry stakeholder seeks to collaborate with French researchers or vice versa, the legal constraints for moving data represent a major hurdle. Such collaboration is further limited by the fact that, even within the same large company, it can be difficult for the French and German branches to exchange data. “This can be for a variety of reasons: human resources personal data, data related to industrial property, or data concerning clients with whom there is a confidentiality guarantee,” says Anne-Sophie Taillandier.

Companies therefore need a secure location, from both a technological and legal standpoint, to facilitate joint research. Joint AI therefore makes it easier for private stakeholders to take part in research projects at the European level, such as Horizon 2020 framework program projects — or Horizon Europe for future European research projects as of next year. Such a platform offers a prototype for a solution to one of the biggest problems facing AI and digital innovation: secure data sharing between different stakeholders.


Can workspaces become agile?

Innovating and adapting ever more rapidly to changes in the environment; breaking away from the traditional office. The digital revolution has dramatically changed working methods and, with them, the way we organize space. Researchers from Institut Mines-Télécom Business School and IMT Atlantique have studied the paradoxes and tensions that arise when workspaces are designed to embody, promote and foster agility.


In recent years, the quest for agility has pushed companies to completely rethink their organization, methods and processes. They have urged their employees to develop new work practices. These changes often go hand-in-hand with a reconfiguration of spaces: the flexible office, digital workspaces that are modular and open, organized by work activity etc. But a sort of ambivalence can be seen behind these efforts.

“Spaces, locations and offices are often synonymous with a sense of bearings and longevity, a specific territory we claim as the base for our work. Agility, on the other hand, encourages continual reconfiguration, a transitory organization, keeping bodies and heads in constant motion,” explains Marie Bia Figueiredo, a management researcher at Institut Mines-Télécom Business School. So what's work life like when the office is designed to embody and foster organizational agility? How do employees experience this apparent contradiction and make these new workspaces their own? Marie Bia Figueiredo and her fellow researchers at Institut Mines-Télécom Business School and IMT Atlantique set out to explore these issues.

“These questions first occurred to us in 2016, at an observation day in the new offices of a major French bank, which we called ‘The Oases’,” says Madeleine Besson, who is also a researcher in management at Institut Mines-Télécom Business School. “We were struck by the omnipresence of references to agility in the talks presenting the buildings, in the way the space was designed, in the signage and even in the decorative elements. On one hand, companies have always relied on physical spaces to convey and embody the changes they hope to bring about in order to standardize and organize practices and behaviors. But at the same time, we must remember that the agile movement establishes a principle of autonomy and self-organization for teams. There was a certain dissonance to it.”

From agile methods to agile environment

The agile movement was formally established in the early 2000s with the aim of adapting quickly to change, whether in terms of disruptive technologies, the volatility of customers or regulatory developments. This quest for agility was first expressed through new project management methods, collectively referred to as “agile methods”. In principle, these methods are based on a willingness to accept risk and change and on reorganizing and adapting to such change on a permanent basis. Today, companies increasingly see the workspace as a vehicle for change and agility. “Organizations are seeking to align space, work and information technologies,” explain the Institut Mines-Télécom Business School researchers in a forthcoming publication in Terminal[1].

The “Oases” created by the French bank observed by the researchers exemplify this trend. They were designed to embody an “exceptional” drive for transformation in a banking sector which has been particularly affected by technological and economic transformations. During the researchers' investigation, the company's real estate director explained their motivation for creating such spaces: “We wanted the Oases to provide fertile ground for new ways of working, in an effort to attract new talent.” The decision was inspired by the iconic workspaces of companies such as Google or Apple, which can “create well-being and conditions that allow employees to work differently.”

The research team's study shows how this requirement for agility in the corporate world is expressed, in particular through requirements for adaptation and ubiquity. To preserve modularity, the workspaces cannot be personalized. Instead, workspaces are reserved on arrival at the office, and rolling chairs and height-adjustable tables ensure that the space and office can constantly be rearranged. Work is primarily coordinated in the digital space and collective work has become invisible. Physical space and digital space are closely interlinked to convey a requirement for ubiquity.

The researchers also note a requirement for creativity and happiness. “The environment is decorated with plastic smiley faces. The office is designed to provide a fun environment where employees are encouraged to play ping pong or pool,” they note. Lastly, a requirement for speed is expressed by the pervasiveness of references to the passage of time. Hourglasses of varying sizes serve as reminders that there is no leeway when it comes to projects being completed on schedule. “Agility claims to prioritize individual interactions above processes and tools, but these interactions are still subject to strong time pressure. And agility means working at a faster pace, since you have to be ready to cancel or repeat operations as required by customers, the context or new developments,” points out Géraldine Guérillot, a researcher from the IMT Atlantique team.

Attempts to make a “non-place” one’s own

How do employees perceive these changes? Some find it difficult to break with their previous work habits. “Many of them told us that senior executives had to set an example by using the game room or nap room before the employees dared to use them,” says Jean-Luc Moriceau, a researcher at Institut Mines-Télécom Business School. “Others, without showing strong opposition to the new working methods, find ways to get around them. This can be seen in teams who regularly meet up to ‘recreate their territory’ or high-level managers who reserve a room for an entire day.” In some workplaces, employees leave personal belongings to (re)gain their bearings. The flex-office depersonalizes the workplace, so employees attempt to make the space their own, thereby developing behaviors that are contrary to the constant, agile reorganization of space.

Others play along. For example, one of the managers explains that for him, the hourglasses are a polite way of reminding everyone that time is tight. “He’ll meet with anyone who wants to see him, but they must present their views within the allotted time, which is physically represented by the sand flowing through the hourglass,” explain the researchers. Agility appears to aim to shake up the work environment.

But the researchers provide a warning, “The quest for agility, embodied by the reconfiguration of space, when implemented in a too prescriptive and uniform manner, can lead to producing ‘non-places’. These spaces deny the role that feelings, territories, memory and status play in the operations of an organization and work collectives.” The researchers demonstrate how, in turn, this gives rise to discreet ways of taking ownership of such spaces and “producing places,” understood as “minor uses resulting in alternative ways of occupying the shared space.”

By Anne-Sophie Boutaud for I’MTech

[1] Moriceau J.-L., Besson M., Bia Figueiredo M., Guérillot G. (2020), L’espace agile, oasis ou mirage ? Mise en perspective de quelques difficultés et paradoxes pour les travailleurs (Agile Spaces, Oasis or Mirage? A Perspective on Difficulties and Paradoxes for Employees), Terminal, Technologie de l’Information, Culture et Société (forthcoming).


Putting drones to the 5G test

5G!Drones, a European project bringing together industrialists, network operators and research centers, was launched in June 2019 for a three-year period. It should ultimately validate the use of 5G for delivery services by drone. Adlen Ksentini, a researcher at EURECOM, a key partner in the project, explains the challenges involved.


What was the context for developing the European 5G!Drones project?

Adlen Ksentini: The H2020 5G!Drones project is funded by the European Commission as part of phase 3 of the 5G PPP projects (5G Infrastructure Public Private Partnership). This phase aims to test use cases for vertical industry applications (IoT, industry 4.0, autonomous cars etc.) on 5G test platforms. 5G!Drones focuses on use cases involving flying drones, or Unmanned Aerial Vehicles (UAV), such as transport of packages, extension of network coverage with drones, public security etc.

What is the aim of this project?

AK: The aim is twofold. First, to test eight use cases for UAV services on 5G platforms located in Sophia Antipolis, Athens (Greece), Espoo and Oulu (Finland) to collect information that will allow us to validate the use of 5G for a wider roll-out of UAV services. And second, the project seeks to highlight the ways in which 5G must be improved to guarantee these services.

What technological and scientific challenges do you face?

AK: A number of obstacles will have to be overcome during the project, all related to safeguarding drone flights. To fly drones, certain conditions must be met. First, there has to be a reliable network with low latency, since remote control of the drones requires low latency in order to correct the flight path and monitor the drones' position in real time. There also has to be strong interaction between the U-Space service (see below) and the network operator to plan flights and check conditions: weather, availability of network coverage etc. Beyond these obstacles, the 5G!Drones project will develop a software system, placed above the platforms, to automate the trials and display the results in real time.


The U-Space service is in charge of approving the flight plan submitted by drone operators. Its job is to check whether the flight plan is feasible, meaning ensuring that there are no other flights planned on the selected path and determining whether the weather conditions are favorable.

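To make the role of U-Space concrete, here is a hypothetical sketch of such a feasibility check. The data structures, grid-cell model of the airspace and wind threshold are all invented for illustration; they are not taken from the actual U-Space specification.

```python
# Invented sketch: approve a flight plan only if its path conflicts with no
# already-approved plan and the weather is favorable.
from dataclasses import dataclass

@dataclass
class FlightPlan:
    drone_id: str
    cells: set[str]   # airspace grid cells along the path (invented model)
    start: int        # start time, minutes
    end: int          # end time, minutes

def conflicts(a: FlightPlan, b: FlightPlan) -> bool:
    """Two plans conflict if they overlap in both time and space."""
    overlap_in_time = a.start < b.end and b.start < a.end
    return overlap_in_time and bool(a.cells & b.cells)

def approve(plan: FlightPlan, approved: list[FlightPlan], wind_kmh: float) -> bool:
    if wind_kmh > 40:  # invented weather threshold
        return False
    return not any(conflicts(plan, p) for p in approved)

approved: list[FlightPlan] = []
p1 = FlightPlan("drone-1", {"c3", "c4"}, start=0, end=30)
if approve(p1, approved, wind_kmh=15):
    approved.append(p1)
# A second plan crossing cell c4 during the same window is rejected:
print(approve(FlightPlan("drone-2", {"c4"}, 10, 20), approved, wind_kmh=15))  # False
```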

How are EURECOM researchers contributing to this project?

AK: EURECOM is a key partner in the project. EURECOM will provide its 5G testing platform based on its OpenAirInterface (OAI) tool, which provides Network Function Virtualization (NFV) and Multi-access Edge Computing (MEC) solutions. It will host two trials on public safety using flying drones, led by partners representing the vertical industry. In addition, EURECOM will be studying and proposing a solution for developing a 5G network dedicated to UAVs, based on the concept of network slicing.

Who are your partners and what collaborations are important for you?

AK: The project counts 20 partners, including network operators (Orange France and Poland, COSMOTE), specialists in the UAV field (Alerion, INVOLI, Hepta Airborne, Unmanned System Limited, CAFA Tech, Frequentis, DRONERADAR), industrial groups (NOKIA, Thales and AIRBUS), an SME (INFOLYSIS) and research centers and universities (Oulu University, Aalto University, DEMOKRITOS, EURECOM), as well as the municipality of Egaleo in Greece. EURECOM is playing a central role in the project by collaborating with all the members of the consortium and acting as a liaison between the UAV vertical industry partners, industrial groups and network operators.

What are the expected benefits of the project?

AK: In addition to the scientific benefits in terms of publications, the project will allow us to verify whether 5G networks are ready to deliver UAV services. Feedback will be provided to the 3GPP standards organization, as well as to the authorities that control the airspace for UAVs.

What are the next important steps for the project?

AK: After a first year in which the consortium focused on studying an architecture to link the vision of UAV industry stakeholders with 5G networks, and on describing in detail the use cases to be tested, the project is starting its second year, which will focus on deploying the trials on the various sites and then beginning the tests.

Learn more about the 5G!Drones project

Interview by Véronique Charlet for I’MTech


How can industrial risk be assessed?

Safety is a key concern in the industrial sector. As such, studying risk represents a specialized field of research. Experiments in this area are particularly difficult to carry out, as they involve explosions and complicated measures. Frédéric Heymes, a researcher at IMT Mines Alès who specializes in industrial risk, discusses the unique aspects of this field of research, and new issues to be considered.


What does research on industrial risk involve?

Frédéric Heymes: Risk is the likelihood of the occurrence of an event that could lead to negative and high-stakes consequences. Our research is broken down into three levels of anticipation (understanding, preventing, protecting) and one operational level (helping manage accidents). We have to understand what can happen and do everything possible to prevent dangerous events from happening in real life. Since accidents remain inevitable, we have to anticipate protective measures to best protect people and resources in the aftermath of an accident. We must also be able to respond effectively. Emergency services and the parties responsible for managing industrial disasters need simulation tools to help them make the right decisions. Risk research is cross-sectorial and can be applied to a wide range of industries (energy, chemistry, transport, pharmaceuticals, agri-food).

What’s a typical example of an industrial risk study?

FH: Although my research may address a wide variety of themes, on the whole, it's primarily connected to explosive risk. That means understanding the phenomenon and why it occurs, in order to make sure it won't happen again. A special feature of our laboratory is that we can carry out experimental field testing for dangerous phenomena that can't be performed in the laboratory setting.

What does an experiment on explosive risk look like?

FH: We partnered with Total to carry out an especially impressive experiment, which had never before been done anywhere in the world. It was a study on the explosion of superheated water, under very high pressure at a very high temperature. It was potentially dangerous, since the explosion releases a very large amount of energy. It was important for Total to understand what happens in the event of such an explosion and what the consequences would be. Carrying out the experiment was a real team effort and called for a great deal of logistical planning. Right away, it was different than working in a lab setting. There were between 5 and 8 people involved in each test, and everyone had their own specific role and specialty: data acquisition, control, high-speed cameras, logistics, handling. We needed a prototype that weighed about a ton, which we had made by a boilermaker. That alone was no simple task. Boilermakers are responsible for producing compliant equipment that is known to be reliable. But for our research, we knew that the prototype would explode. So we had to reassure the manufacturer in terms of liability.

How do you set up such an explosion?

FH: We need a special testing ground to carry out the experiment, and to get permission to use it, we have to prove that the test is perfectly controlled. For these tests, we collaborated with the Camp des Garrigues, a military range located north of Nîmes. The test area is secure but completely empty, so it took a lot of preparation and set-up. In addition, firefighters were also on site with our team. And there was a great deal of research dedicated to sensors in order to obtain precise measurements. The explosion lasts less than a second. It's a very short test. Most of the time, we only have access to the field for a relatively short period of time, which means we carry out the tests one after another, non-stop. We're also under a lot of stress: we know that the slightest error could have dramatic consequences.

What happens after this study?

FH: The aim of this research was to study the consequences of such an explosion on the immediate environment. That provides us with an in-depth understanding of the event so that those involved can take appropriate action. We therefore obtain information about the explosion, the damage it causes and the size of the damaged area. We also observe whether it can give rise to a shock wave or projectile expulsion, and if so, we study their impacts.

Has there ever been a time when you were unable to carry out tests you needed for your research?

FH: Yes, that was the case for a study on the risk of propane tank explosions during wildfires. Ideally, we would have to control a real wildfire and expose propane tanks to this hazard. But we’re not allowed to do that, and it’s extremely dangerous. It’s a real headache. Ultimately, we have to divide the project into two parts and study each part separately. That way, we obtain results that we can link using modeling. On one hand, we have the wildfire with a huge number of variables that must be taken into account: wind strength and direction, slope inclination, types of species in the vegetation, etc. And on the other hand, we study fluid mechanics and thermodynamics to understand what happens inside propane tanks.

What results did you achieve through this study?

FH: We arrived at the conclusion that gas tanks are not likely to explode if brush clearing regulations are observed. In residential areas located near forests, there are regulations for maintenance, and brush clearing in particular. But if these rules are not observed, safety is undermined. We therefore suggested a protective component with good thermal properties and flame resistance to protect tanks in scenarios that do not comply with regulations.

What are some current issues surrounding industrial risk?

FH: Research in the field of industrial risk really took off in the 1970s. There were a number of industrial accidents, which underscored the need to anticipate risks, leading to extensive research to prevent and protect against risks more effectively. But today, all energy sectors are undergoing changes and there are new risks to consider. New sectors are emerging and raising new issues, as is the case for hydrogen, for example. Hydrogen is a very attractive energy source since its use only produces water, and no carbon dioxide. But it is a dangerous compound since it's highly flammable and explosive. The question is how best to organize hydrogen supply chains (production, transport, storage, use). How can hydrogen best be used in the territory while minimizing risks? It's a question that warrants further investigation. A cross-disciplinary research project on this topic with other IMT partners is in the start-up phase, as part of Carnot HyTrend.

Read more on I’MTech: What is Hydrogen Energy?

So does that mean that energy and environmental transition come with their own set of new risks to be studied?

FH: Yes, that's right, and global warming is another current field of research. To go back to wildfires, they're becoming more common, which raises concerns. How can we deal with the growing number of fires? One solution is to consider passive self-protection scenarios, meaning reducing vulnerability to risks through technological improvements, for example. The energy transition is bringing new technologies, along with new uses. Like I was saying before, hydrogen is a dangerous chemical compound, but we've known that for a long time. However, its operational use to support the energy transition raises a number of new questions.

How can we deal with these new risks?

FH: The notion of new industrial risk is clearly linked to our social and technological evolutions. And evolution means new risks. Yet it's hard to anticipate such risks since it's hard to anticipate such evolutions in the first place. But at the same time, these evolutions provide us with new tools: artificial intelligence, for example. We can now assimilate large amounts of data and quickly extract useful, relevant results to recognize an abnormal, potentially dangerous situation. Artificial intelligence also helps us overcome a number of technological hurdles. For example, we're working with Mines ParisTech to conduct research on predicting the hydrodynamic behavior of gas leaks using artificial intelligence methods, with unprecedented computing speed and accuracy.

How is research with industrial players organized on this topic?

FH: Research can grow out of partnerships with research organizations, such as the IRSN (French Institute for Radiological Protection and Nuclear Safety). During the decommissioning of a power plant, even though there’s no longer any fissile material, residual metal dust could potentially ignite. So we have to understand what may happen in order to act accordingly in terms of safety. But for the most part, I collaborate directly with industrialists. In France, they’re responsible for managing the risks inherent in their operations. So there’s a certain administrative pressure to improve on these issues, and that sometimes involves research questions. But most of the time, investments are driven not by administrative requirements, but by a profound commitment to reducing risks.

What's quite unique about this field of research is that we have complete freedom to study the topic and complete freedom to publish. That's really unique to the field of risk. In general, results are shared easily, and often published so that “the competition” can also benefit from the findings. It's also quite common for several companies in the same industry to team up to fund a study, since they all stand to benefit from it.


DNA as the data storage medium

By 2025, the volume of data produced in the world will have reached 250 zettabytes (1 zettabyte = 10²¹ bytes). Current storage media either lack sufficient capacity or suffer from obsolescence. Preserving even a fraction of this data means finding a storage device with density and durability characteristics significantly superior to those of existing systems. The European OligoArchive project, launched in October 2019 for three years, proposes to use DNA (deoxyribonucleic acid) as a storage medium. Raja Appuswamy, a researcher at EURECOM, a partner in the project, explains further.


In what global context did the European OligoArchive project come about?

Raja Appuswamy: Today, everything in our society is driven by data. If data is the oil that fuels the metaphorical AI vehicle, storage technologies are the cog that keeps the wheel spinning. For decades, we wanted fast storage devices that could quickly deliver data, and optical, magnetic, and solid-state storage technologies evolved to meet this requirement. As data-driven decision-making becomes part of our society, we are increasingly faced with a new need: cheap, long-term storage devices that can safely store the collective knowledge we generate for hundreds or even thousands of years. Imagine you have a photograph that you would like to pass down to your great-great-grandchildren. Where would you store it? How much space would it take? How much energy would it use? How much would it cost? Would your storage media still be readable two generations from now? This is the context for project OligoArchive.

What is at stake in this project?

RA: Today, tape drives are the gold standard for data archival across all disciplines, from Hollywood movie archives to particle accelerator facilities. But tape media suffers from several fundamental limitations that make it unsuitable for long-term data storage. First, the storage density of tape (the amount of data you can store per inch) is improving at about 30% annually, while archival data is growing at about 60% annually. Second, if you store 1 PB on 100 tape drives today, within five years it would be possible to store the same data on just 25 drives. While this might sound like a good thing, using tape for archival storage implies constant data migration with each new generation of tape, and such migrations cost millions of dollars.

This problem is so acute that Hollywood movie archives have openly admitted that we are living in a dead period, during which the productions of several independent artists will not be saved for the future! At the rate at which we are generating data to feed our AI machinery, enterprises will soon reach this point. Thus, the storage industry as a whole has come to the realization that a radically new storage technology is required if we are to preserve data across generations.
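As a rough illustration of the gap between the two growth rates quoted above, here is a minimal sketch; the 30% and 60% figures come from the interview, everything else is assumed.

```python
# Back-of-envelope sketch: tape density improves ~30% per year while
# archival data grows ~60% per year (figures from the interview).
tape_density = 1.0   # relative capacity per cartridge, year 0
archive_size = 1.0   # relative volume of archival data, year 0

for year in range(1, 11):
    tape_density *= 1.30
    archive_size *= 1.60
    print(f"year {year:2d}: data/density ratio = {archive_size / tape_density:.2f}")

# The ratio grows by 1.60 / 1.30 ≈ 23% per year: the number of cartridges
# needed for the same archive keeps increasing. Note also 1.30**5 ≈ 3.7,
# which is roughly the "100 drives down to 25" effect mentioned above.
```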

What will be the advantages of the technology developed by OligoArchive?

RA: Project OligoArchive undertakes the ambitious goal of retasking DNA, a biological building block, to function as a radically new digital storage medium. DNA possesses three key properties that make it relevant for digital data storage. First, it is an extremely dense three-dimensional storage medium, with the theoretical ability to store 455 exabytes in 1 gram. The sum total of all data generated worldwide (the global datasphere) is projected to be 175 zettabytes by 2025. This could be stored in just under half a kilogram of DNA. Second, DNA can last several millennia, as demonstrated by experiments that have read the DNA of ancient, extinct animal species from fossils dating back thousands of years. If we can bring the woolly mammoth back to life from its DNA, we can store data in DNA for millennia. Third, the density of DNA is fixed by nature, and we will always have the ability and the need to read DNA; everything from archeology to precision medicine depends on it. Thus, DNA is a storage medium that does not suffer from the media obsolescence problem and hence can never become outdated, unlike other storage media (remember floppy disks?).
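The “just under half a kilogram” figure follows directly from the two numbers quoted; a quick back-of-the-envelope check (decimal units assumed):

```python
# Quick check of the theoretical figures quoted above.
EB_PER_GRAM = 455           # theoretical DNA density: 455 exabytes per gram
GLOBAL_DATASPHERE_ZB = 175  # projected global datasphere by 2025, in zettabytes

grams = GLOBAL_DATASPHERE_ZB * 1000 / EB_PER_GRAM  # 1 ZB = 1000 EB
print(f"{grams:.0f} g of DNA")  # ~385 g, i.e. just under half a kilogram
```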

What expertise do EURECOM researchers bring?

RA: The Data Science department at EURECOM is contributing to several aspects of this project. First, we are building on our deep expertise in storage systems to architect various aspects of using DNA as a storage medium, such as developing solutions for implementing a block abstraction over DNA, or providing random access to data stored in DNA. Second, we are combining our expertise in data management and machine learning to develop novel, structure-aware encoding and decoding algorithms that can reliably store and retrieve data in DNA, even though the underlying biological tasks of synthesis (writing) and sequencing (reading) introduce several errors.
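To make the encoding problem concrete, here is a deliberately naive sketch of the simplest possible mapping from bits to bases. This is not the project's structure-aware algorithm: it ignores the biochemical constraints (homopolymer runs, GC content) and the synthesis and sequencing errors mentioned above.

```python
# Naive sketch: map every 2 bits to one nucleotide (A, C, G, T).
BASES = "ACGT"

def encode(data: bytes) -> str:
    """Turn each byte into four nucleotides (2 bits per base, MSB first)."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    """Inverse mapping: four nucleotides back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        b = 0
        for base in strand[i:i + 4]:
            b = (b << 2) | BASES.index(base)
        out.append(b)
    return bytes(out)

strand = encode(b"Hi")
print(strand)                  # CAGACGGC
assert decode(strand) == b"Hi"
```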

Who are your partners and what are their respective contributions?

RA: The consortium brings together a truly multidisciplinary group of people with diverse expertise across Europe. The Institute of Molecular and Cellular Pharmacology (IPMC) in Sophia Antipolis, home to the largest sequencing facility in the PACA region, contributes its biological expertise to the project. Our partners at I3S, CNRS, are working on new compression techniques customized for DNA storage that will drastically reduce the amount of DNA needed to store digital content. Our colleagues at Imperial College London (UK) are building on our work and pushing the envelope further by using DNA not just as a storage medium but as a computational substrate, showing that some SQL database operations that run in silico (on a CPU) today can be translated efficiently into in vitro biochemical reactions directly on DNA. Finally, we also have HelixWorks, a startup from Ireland investigating novel enzymatic synthesis techniques for reducing the cost of generating DNA, as an industrial partner.

What results are expected and ultimately what will be the applications?

RA: The ambitious end goal of the project is to build a DNA disk: a fully working end-to-end prototype showing that DNA can indeed function as a replacement for current archival storage technology like tape. Application-wise, archival storage is a billion-dollar industry, and we believe that DNA is a fundamentally disruptive technology with the potential to reshape this market. But we believe that our project will have an impact on areas beyond archival storage.

First, our work on DNA computation opens up an entirely new field of research on near-molecule data processing that mirrors the current trend of moving computation closer to data to avoid time-consuming data movement. Second, most of the models and tools we develop for DNA storage are also applicable to analyzing genetic data in other contexts. For instance, the algorithm we are developing for reading data back from DNA provides a scalable solution for sequence clustering, a classic computational genomics problem with several applications. Thus, our work will also contribute to advances in computational genomics.

Learn more about OligoArchive


C in your Browser

In the academic world, teaching and carrying out research often go hand-in-hand. This is especially true for Rémi Sharrock, a computer science researcher at Télécom Paris, who has developed a C language learning program comprising 7 MOOCs. The teaching approach used for his online courses called for the development of innovative tools, drawing on the researcher's expertise. Rémi Sharrock was rewarded for this work in November 2019 by edX, a leading global MOOC provider, which presented him with its 2019 edX Prize. He talked to us about the story behind this digital learning program, developed in partnership with Dartmouth College in the United States.


What led you to undertake research in order to create an online learning program?

Rémi Sharrock: The original aim was to propose a new way of learning the C language. To do so, we had to develop a number of tools that didn't exist at the time. This work, carried out with Dartmouth College, gave rise to research opportunities. Our goal was always to facilitate exchange with the learner, and to make it a central part of the learning process. The tools we developed made it possible to carry out learning activities directly on the user's computer, with many features that had never been seen before.

What are some examples of the tools you developed?

RS: The idea of a MOOC is that it's open to as many people as possible. We didn't know what type of computer users would connect with, or what operating system or browser they would use. But regardless of their system, we had to be able to provide users with a high-quality learning experience. The first tool we developed for this was WebLinux. It met the challenge of being able to code in C with Linux from any computer, using any browser. We didn't want to make learners download an application, since that could discourage beginners. WebLinux therefore allowed us to emulate Linux for everyone, directly on the web-based learning platform.

How did you do this from a technical perspective?

RS: Technically, we run Linux directly in the browser, without going through a server. To do so, we use an OpenRISC processor emulator that runs in the browser, and a Linux build that is compatible with this type of processor. That allows us to do without servers running Linux, and therefore to operate on a large scale with limited server resources.

That’s an advantage in terms of access to education, but does the tool also facilitate educational activities?  

RS: For that part we had to develop an additional tool, called Codecast. It's a C language interpreter that runs in any browser and is synchronized with the professor's audio explanation. It was a real challenge to develop this tool, because we wanted to make it possible for anyone to run C language instructions directly in their browser, without having to go through a remote computer server or use third-party software on their computer. We created a specialized C language interpreter for the web, which works with all browsers. When you're watching the professor's course in the video, you can directly edit lines of code and run them in your browser, right from the course web page. And when the teacher includes an instruction to be learned and tested as part of the lesson, you can pause the video, edit the instruction and try different things, then resume the video without any consequences.

You also responded to another challenge with this type of MOOC: assessing learners.

RS: Yes, with a third tool, Taskgrader. In a traditional classroom course, the teacher assesses the code proposed by students one by one, and corrects it. This is inconceivable with a MOOC, since you have tens or hundreds of thousands of learners to correct. Taskgrader makes it possible to automatically assess students' code in real time, without the professor having to look it over, by providing personalized feedback.
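As an illustration of the principle only (Taskgrader's actual implementation is not described here), a minimal autograder sketch might look like this; the test cases and feedback messages are invented.

```python
# Invented sketch of an autograder: compile a learner's C submission, run it
# against test cases, and build personalized feedback.
import os
import subprocess
import tempfile

TESTS = [("2 3\n", "5\n"), ("10 -4\n", "6\n")]  # (stdin, expected stdout)

def grade(c_source: str) -> str:
    with tempfile.TemporaryDirectory() as tmp:
        src, exe = os.path.join(tmp, "sub.c"), os.path.join(tmp, "sub")
        with open(src, "w") as f:
            f.write(c_source)
        build = subprocess.run(["gcc", src, "-o", exe], capture_output=True, text=True)
        if build.returncode != 0:
            return f"Does not compile:\n{build.stderr}"
        for stdin, expected in TESTS:
            run = subprocess.run([exe], input=stdin, capture_output=True,
                                 text=True, timeout=2)  # kill runaway programs
            if run.stdout != expected:
                return f"For input {stdin!r}: expected {expected!r}, got {run.stdout!r}"
    return "All tests passed."

# Example: a correct "add two integers" submission.
print(grade('#include <stdio.h>\n'
            'int main(){int a,b;scanf("%d %d",&a,&b);printf("%d\\n",a+b);return 0;}'))
```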

Do all these tools have applications outside the scope of the MOOC C language learning program?

RS: Codecast could be of interest to big community-driven development websites like GitHub. Amateur and professional developers share bits of code for applications on the website. But cooperation is often difficult: to correct someone's code you have to download the incorrect version, correct it, then send it back to the person, who then has to download it again. An emulator in the browser would make it possible to work directly online in real time. And as for Taskgrader, it's a valuable tool for all computer language teachers, even outside the world of MOOCs.

Is your research work in connection with these MOOCs over now that the learning program has been completed?  

RS: No, since we’ve also committed to a second type of research. We’ve teamed up with Cornell and Stanford universities to carry out large-scale sociological experiments on these MOOC learners in an effort to better understand our learner communities.

What kind of research are you conducting to that end?

RS: We have 160,000 learners in the MOOC program worldwide, from a wide range of social, ethnic and demographic backgrounds. We wanted to find out whether there are differences in the way in which men and women learn, for example, or between older and younger people. We therefore vary elements of the courses according to individuals' profiles, based on A/B testing: the sample of learners is split in two, and each group has one learning parameter that changes, such as the teacher's age, voice or gender. This should eventually allow us to better understand learning processes and adapt them to provide each individual with a program that facilitates knowledge transfer.
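A minimal sketch of the A/B mechanics described above, with invented data; the actual experiments and metrics are of course far richer.

```python
# Invented sketch: split learners at random into two groups that differ by a
# single course parameter, then compare a completion metric between groups.
import random

def assign_group(learner_id: int) -> str:
    """Deterministic split so a learner always sees the same variant."""
    return "A" if random.Random(learner_id).random() < 0.5 else "B"

# Hypothetical outcome data: learner_id -> (group, completed_course)
outcomes = {i: (assign_group(i), random.Random(i * 7).random() < 0.4)
            for i in range(10_000)}

for g in ("A", "B"):
    done = [completed for group, completed in outcomes.values() if group == g]
    print(f"group {g}: {sum(done) / len(done):.1%} completion "
          f"over {len(done)} learners")
```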


Astrid: a nuclear project goes up in smoke

The abandonment of the Astrid project marks a turning point for France's nuclear industry. The planned nuclear reactor was supposed to be “safer, more efficient and more sustainable”, but this required significant funding. Stéphanie Tillement, a researcher at IMT Atlantique, has studied how Fukushima impacted the nuclear industry. Her work has focused in particular on the rationale for abandoning the Astrid project, taking into account the complicated history of nuclear energy and how it has evolved in the public and political spheres.


Since the early days of nuclear energy, France has positioned itself as a global leader in terms of both research and energy production. In this respect, the abandonment of the Astrid project in August 2019 marked a move away from this leading position. Astrid (Advanced Sodium Technological Reactor for Industrial Demonstration) was supposed to be France's first industrial demonstrator for what are referred to as “4th-generation” reactors. The selected technology was the sodium-cooled fast neutron reactor (FNR). At present, nuclear power in France is supplied by 58 second-generation pressurized water reactors, which operate with “slowed-down” neutrons. As an FNR, Astrid held the promise of more sustainable energy: it was supposed to be able to use the depleted uranium and plutonium resulting from the operation of current plants as a fuel source, meaning it would consume much less natural uranium.

As part of the AGORAS research project, IMT Atlantique researcher Stéphanie Tillement studied the impact of the Fukushima accident on the world of nuclear energy. This led her to study the Astrid project, and in particular the many challenges it encountered. “We ruled out the link with Fukushima early on,” says the researcher. “The problems Astrid ran into are not related to a paradigm shift as a result of the catastrophe. The reasons it was abandoned are endogenous to the industry and its history.” And financial reasons, though by no means negligible, are not enough to explain why the project was abandoned.

A tumultuous history

In the 2000s, the United States Department of Energy launched the Generation IV International Forum to develop international cooperation on new concepts for nuclear reactors. Out of the six concepts selected by this forum as the most promising, France focused on sodium-cooled reactors, a project which would be launched in 2010 under the name Astrid. The country preferred this concept in particular because three French reactors using the technology had already been built. However, none of them had been used on an industrial scale, and the technology had not advanced beyond the prototyping stage. The first such reactor, Rapsodie, was dedicated purely to research. The second, Phénix, was an intermediary step: it produced energy but remained an experimental reactor, far from an industrial scale. The third, Superphénix, was given the role of being the first in the series of this new French industrial-scale energy source. But from the beginning, it experienced shut-down periods following several incidents, and in 1997 Prime Minister Lionel Jospin announced that it would be shut down once and for all.

“This decision was widely criticized by the nuclear industry,” says Stéphanie Tillement, “which accused him of acting for the wrong reasons.” During the election campaign, Lionel Jospin had aligned himself with the Green party, who were openly in favor of decommissioning the power plant. “Its sudden shutdown was taken very badly and destroyed all hope for the use of such technology on an industrial scale. Superphénix was supposed to be the first in a long line, and some remember it as ‘a cathedral in a desert.'” This also reflected public opinion on nuclear energy: the industry was facing growing mistrust and opposition.

“For a lot of stakeholders in the nuclear industry, in particular the CEA (The French Atomic and Alternative Energy Commission), Astrid gave hope to the idea of reviving this highly promising technology,” explains the researcher. One of the biggest advantages was the possibility of a closed nuclear cycle, which would make it possible to recycle nuclear material from current power plants – such as plutonium – to use as a fuel source in the reactors. “In this respect, the discontinuation of the Astrid project may in the long run call into question the very existence of the La Hague reprocessing plant,” she says. This plant processes used fuel, a portion of which (plutonium in particular) is reused in reactors, in the form of MOX fuel. “Without reactors that can use reprocessed materials effectively, it’s difficult to justify its existence.”

Read more on I’MTech: MOx strategy and the future of French nuclear plants

“From the beginning, our interviews showed that it was difficult for the Astrid stakeholders to define the status of the project precisely,” explains Stéphanie Tillement. The concept proposed when applying for funding was that of an industrial demonstrator. The goal was therefore to build a reactor within a relatively short period of time, which could produce energy on a large scale based on technology for which there was already a significant amount of operating experience. But the CEA also saw Astrid as a research project, to improve the technology and develop new design options. This would require far more time. “As the project advanced,” adds the researcher, “the CEA increasingly focused on a research and development approach. The concept moved away from previous reactors and its development was delayed. When they had to present the roadmap in 2018, the project was at a ‘basic design’ stage and still needed a lot of work, as far as design was concerned, but also in terms of demonstrating compliance with nuclear safety requirements.”

An abandoned or postponed project?

Stéphanie Tillement confirms that, “the Astrid project, as initially presented, has been permanently abandoned.” Work on the sodium technology is expected to be continued, but the construction of a potential demonstrator of this technology will be postponed until the second half of the 21st century. “It’s a short-sighted decision,” she insists. Uranium, which is used to operate reactors, is currently inexpensive. So there’s no need to turn to more sustainable resources – at least not yet. But abandoning the Astrid project means running the risk of losing the expertise acquired for this technology. Though some research may be continued, it will not be enough to maintain industrial expertise in developing new reactors, and the knowledge in this sector could be lost. “The process of regaining lost knowledge,” she says, “is ultimately as expensive as starting from scratch.”

A short-term decision, therefore, relying instead on EPR, 3rd-generation reactors. But the construction of this type of reactor in Flamanville also faces its own set of hurdles. According to Stéphanie Tillement, “the challenges the Astrid project encountered are similar to those of the EPR project.” To secure funding for such projects, nuclear industry stakeholders seek to align themselves with the short timeframes of the political world. Yet short deadlines are ultimately unrealistic and inconsistent with the timeframes for developing nuclear technology, all the more so when it's a matter of the first of a series. This creates problems for nuclear projects: they fall behind schedule and their costs rise dramatically. In the end, this makes politicians rather wary of funding this sort of project. “So nuclear energy gets stuck in this vicious circle,” says the researcher, “in a world that's increasingly unfavorable to this sector.”

This decision also aligns with the government's energy strategy. In broad terms, the State has announced that nuclear energy will be reduced to 50% of France's energy mix, in favor of renewable energies. “The problem,” says Stéphanie Tillement, “is that we only have an outline. If there's a political strategy on nuclear issues, it remains unclear. And there's no long-term position: this is a way of leaving the decision to future decision-makers. But making no decision is a decision. Choosing not to pursue the development of technologies which require a long time to develop may implicitly mean abandoning the idea of any such development in the future. This leads some to consider, rather cynically, that politicians must think that when we need the technology, we'll buy it from other powers (China, Russia) who have already developed it.”


A dictionary for connected devices

The field of connected devices is growing at a staggering pace across all industries. There is a growing need to develop a communication standard, meaning a ‘common language’ that different smart systems could understand and interpret. To contribute to this goal, ETSI (European Telecommunications Standards Institute) is funding a European project in which Mines Saint-Étienne researchers Maxime Lefrançois and Antoine Zimmermann[1] are taking part.


In order to work together, connected devices must be able to communicate with one another. This characteristic, known as ‘semantic interoperability,' is one of the key challenges of the digital transition. To be effective, semantic interoperability must be based on the adoption of an agreed-upon set of best practices. This would culminate in the creation of a standard adopted by the IoT community. At the European level, ETSI (European Telecommunications Standards Institute) is in charge of setting standards for information and communication technologies. “For example, ETSI standardized the SIM card, which acts as an identifier in mobile phone networks to this day,” explains Maxime Lefrançois. He and his colleague Antoine Zimmermann are researchers at Mines Saint-Étienne who specialize in the semantic web and knowledge representation. They are taking part in the STF 578 project on the interoperability of connected devices funded by ETSI, in partnership with two researchers from Universidad Politécnica de Madrid.

“Instead of proposing a standard that strictly defines the content of communications between connected devices, we define and formally identify the concepts involved, through what is known as an ontology,” says Antoine Zimmermann. This provides IoT players with greater flexibility since the content of messages exchanged may use the language and format best suited to the device, as long as an explicit link is made with the concept identified in the reference ontology. The two researchers are working on the SAREF reference ontology (Smart Applications Reference Ontology), a set of ETSI specifications which include a generic base and specializations for the various sectors related to the IoT: energy, environment, building, agriculture, smart cities, smart manufacturing, industry and manufacturing, water, automotive, e-health, wearables.

“The SAREF standard describes smart devices, their functions and the services they provide, as well as the various properties of the physical systems these devices can control,” explains Maxime Lefrançois. For example, a light bulb can say, “I can provide light” by using a concept defined by SAREF. A system or application may then refer to the same lighting concept to tell the object to turn on. “Ultimately, this knowledge should be described following the same standard models within each industry to facilitate harmonization between industries,” adds the researcher. The aim of the project is therefore to develop a public web portal for the standard SAREF ontology, to facilitate its adoption by companies and collect their feedback and suggestions for improvement.
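As a rough illustration, the light-bulb example might be written down with a semantic web toolkit such as rdflib as follows. The saref: namespace is ETSI's, and Device, hasFunction and OnOffFunction are SAREF core terms, but the exact triples below are an illustrative sketch, not a normative SAREF description.

```python
# Sketch of a SAREF-style device description; the ex: namespace is invented.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

SAREF = Namespace("https://saref.etsi.org/core/")
EX = Namespace("http://example.org/")  # hypothetical device namespace

g = Graph()
g.bind("saref", SAREF)

bulb = EX.bulb42
g.add((bulb, RDF.type, SAREF.Device))            # "I am a device"
g.add((bulb, SAREF.hasFunction, EX.switch))      # "I can be switched"
g.add((EX.switch, RDF.type, SAREF.OnOffFunction))
g.add((bulb, SAREF.hasManufacturer, Literal("ExampleCorp")))

print(g.serialize(format="turtle"))
```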

A specially-designed ‘dictionary’

“The SAREF public web portal is a little bit like a ‘dictionary' for connected devices,” explains Maxime Lefrançois. “If we take the example of a water heater that can measure energy consumption and can be remotely controlled, SAREF will describe its possible actions, the services it can provide, and how it can be used to lower energy costs or improve household comfort.” But his colleague Antoine Zimmermann explains, “It isn't a dictionary in the traditional sense. SAREF specifies in particular the technical and IT-related constraints we may encounter when communicating with the water heater.”

Imagine if one day all water heaters and heat pumps were connected to the IoT and could be remotely controlled. They could then theoretically be used as an energy resource that could ensure the stability and energy efficiency of the country's electricity grid. If, in addition, there was a uniform way to describe and communicate with these devices, companies in the smart building and energy sectors would waste less time individually integrating products made by different manufacturers. They could then focus instead on developing innovative services connected to their core business, giving them a competitive advantage. “The goal of semantic interoperability is to develop a service for a certain type of smart equipment, and then reuse this service for all similar types of equipment,” says Maxime Lefrançois. “That's the heart of SAREF.”

Read more on I’MTech: How the SEAS project is redefining the energy market

At present, the existing standards are compartmentalized by sector. The energy industry has standards for describing and communicating with the electrical equipment of a water tower, but the water tower must then implement different standards to interface with other equipment in the water distribution network. “There are several different consortia for each sector,” explain the researchers, “but we now have to bridge the gap between these consortia, in order to harmonize their standards.” Thus the need for a ‘dictionary,’ a common vocabulary that can be used by connected devices in all industries.

Take the example of automotive manufacturers who are developing new batteries for electric vehicles. Such batteries could theoretically be used by energy suppliers to regulate the voltage and frequency of the electricity grid. “The automotive and energy industries are two sectors that had absolutely no need to communicate until now,” says Maxime Lefrançois. “In the future, they may have to work together to develop a common language, and SAREF could be the solution.”

A multilingual ‘dictionary’

The IoT community is currently engaged in something of a ‘standards war' in which everyone is developing their own specification and hoping that it will become the standard. Impetus from public authorities is therefore needed to channel the existing initiatives: SAREF at the European level. “We can well imagine that in the future, there will only be a single, shared vocabulary for everyone,” says Antoine Zimmermann. “But we may find ourselves with different vocabularies being developed at the same time, which then remain. That would be problematic. This is how it is today, for example, with electrical outlets. A machine intended to be used in the United States will not work with European outlets and vice versa.”

“The development of the SAREF public web portal is an important step since it encourages companies to take part in creating this dictionary,” adds Maxime Lefrançois. The more companies are involved in the project, the more comprehensive and competitive it will be. “The value of a standard is related to the size of the community that adopts it,” he says.

“The semantic web is particularly useful in this respect,” says Antoine Zimmermann. “It allows everyone to agree. Companies are all engaged in digital transformation and use the web as a common platform to get in touch with clients and partners. They use the same protocols. We think the semantic web is also a good way to build these common vocabularies that will work in various sectors. We aren't looking for the right solution, but to demonstrate best practices and make them more widespread so that companies look beyond their own community.”

A collaborative ‘dictionary’

The researchers’ work also involves developing a methodology for building this standard: a company must be able to suggest a new addition to the vocabulary that is highly specific to a certain field, while ensuring that this contribution aligns with the standard models and best practices that have been established for the entire ‘dictionary.’

“And that's the tricky part,” says Maxime Lefrançois. How can the SAREF public portal be improved and updated to make sure that companies use it? “We know how to write ‘dictionaries', but supporting companies is no simple task.” There are a number of constraints involved: all these different vocabularies and jargons must be assimilated, and companies may not necessarily be familiar with them.

“So we have to reinvent collaborative support methods for this dictionary. That's where DevOps approaches implemented for software development are useful,” he says. These approaches make it possible to automatically check the suggestions based on a set of quality criteria, then automatically make a new version of the portal available online if the criteria are fulfilled. “The goal is to shorten SAREF development cycles while maintaining an optimal level of quality,” concludes the researcher.
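A minimal sketch of what such an automated check might look like, assuming contributions arrive as Turtle files. The quality criteria shown (the file parses, every subject has a label) are invented examples of such criteria, not SAREF's actual rules, and the file name is hypothetical.

```python
# Invented sketch of a DevOps-style gate: accept an ontology contribution,
# and "publish" a new portal version, only if automated checks pass.
from rdflib import Graph
from rdflib.namespace import RDFS

def quality_checks(turtle_file: str) -> list[str]:
    """Return a list of problems; an empty list means the contribution passes."""
    g = Graph()
    try:
        g.parse(turtle_file, format="turtle")
    except Exception as e:
        return [f"does not parse: {e}"]
    errors = []
    for term in set(g.subjects()):
        if (term, RDFS.label, None) not in g:   # invented criterion
            errors.append(f"{term} has no rdfs:label")
    return errors

errors = quality_checks("contribution.ttl")  # hypothetical submitted file
if errors:
    print("rejected:", *errors, sep="\n  ")
else:
    print("accepted: publishing new portal version")  # e.g. trigger a deploy job
```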

There are other hurdles to overcome to get the connected devices themselves to ‘speak SAREF,' due to the specific limitations of connected devices: limited storage and computing capacity, low battery life, limited bandwidth, intermittent connectivity. The use of ontologies for communication and ‘reasoning' was first thought up without these constraints, and must be reinvented for these types of ‘edge computing' configurations. These issues will be explored in the upcoming ANR CoSWoT project (Constrained Semantic Web of Things), which will include researchers from LIRIS, Mines Saint-Étienne, INRAE (merger of INRA and IRSTEA), Université Jean-Monnet and the company Mondeca.


[1] Maxime Lefrançois and Antoine Zimmermann are researchers at the Hubert Curien Laboratory, a joint research unit of CNRS, Mines Saint-Étienne and Université Jean Monnet.

Being Human with Algorithms: Marc-Oliver Pahl meets Raimund Seidel

Marc-Oliver Pahl is a researcher in cybersecurity at IMT Atlantique. In 2018, he launched “Being Human with Algorithms”, a series of video interviews between technicians and non-technicians around the topic of digital transformation. Through open discussions and dialogues, he explores how digital technologies are perceived and how they affect humans as citizens, consumers, workers…

In this episode, Marc-Oliver meets with Raimund Seidel, Director of the Schloss Dagstuhl – Leibniz Center for Informatics.