
Data sharing, a common European challenge

Promoting data sharing between economic players is a major objective of Europe’s digital governance strategy. To accomplish this, two specific challenges must be met. Firstly, a community must be created around data issues, bringing together stakeholders from multiple sectors. Secondly, the technological choices implemented by these stakeholders must be harmonised.

 

‘If we want more efficient algorithms, with qualified uncertainty and reduced bias, we need not only more data, but more diverse data’, explains Sylvain Le Corff. This statistics researcher at Télécom SudParis thus sums up the central challenge of data sharing. Nor does this need apply only to researchers: industrial players must also enrich their data with data from their ecosystem. For instance, an energy producer will benefit greatly from sharing industrial data with suppliers or consumer groups, and vice versa. A car manufacturer will become all the more efficient with more data sources from its sub-contractors.

The problem is that sharing data is far from a trivial operation. The reason lies in the numerous technical solutions that exist to produce, store and use data. For a long time, the prevailing idea among economic players was to exploit their data themselves, and each organisation therefore made its own choices in terms of architecture, format or data-related protocols. An algorithm developed to exploit data sets in a specific format cannot use data packaged in another format. This calls for a major harmonisation phase.
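To make the difficulty concrete, here is a minimal sketch of the kind of pre-processing involved, assuming two hypothetical partners that record the same measurement under different field names, units and time formats; the schemas and values are invented for illustration.

```python
from datetime import datetime, timezone

# Two hypothetical partners exposing the same measurement in different shapes.
record_a = {"ts": "2020-11-02T14:00:00+00:00", "temp_C": 21.4}   # partner A
record_b = {"timestamp": 1604325600, "temperature_F": 70.5}      # partner B

def harmonise_a(r: dict) -> dict:
    """Map partner A's schema onto the common schema."""
    return {"time": r["ts"], "temperature_c": r["temp_C"]}

def harmonise_b(r: dict) -> dict:
    """Map partner B's schema onto the common schema, converting units."""
    iso = datetime.fromtimestamp(r["timestamp"], tz=timezone.utc).isoformat()
    return {"time": iso, "temperature_c": (r["temperature_F"] - 32) * 5 / 9}

# Only after this pre-processing can a single algorithm consume both sources.
dataset = [harmonise_a(record_a), harmonise_b(record_b)]
```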

‘This technical aspect is often under-estimated in data sharing considerations’, Sylvain Le Corff comments. ‘Yet we are aware that there is a real difficulty with the pre-processing needed to harmonise data.’ The researcher cites the example of natural language processing, a key issue for artificial intelligence, which relies on the automatic processing of texts from multiple sources: raw texts, texts generated from audio or video documents, texts derived from other texts, etc. This is the notion of multi-modality. ‘The plurality of sources is well-managed in the field, but the manner in which we oversee this multi-modality can vary within the same sector.’ Two laboratories or two companies will therefore not harmonise their data in the same way. In order to work together, they absolutely must go through this tedious pre-processing, which can hamper collaboration.

A European data standard

Olivier Boissier, a researcher in artificial intelligence and interoperability at Mines Saint-Étienne, adds another factor to the issue: ‘The people who help to produce or process data are not necessarily data or AI specialists. In general, they are people with high expertise in their field of application, but they don’t always know how to open or pool data sets.’ Given such technical limitations, a promising approach is to standardise practices. This task is being taken on by the International Data Spaces Association (IDSA), whose role is to promote data sharing on a global scale, and more particularly in Europe.

Contrary to what one might assume, the idea of a data standard does not mean imposing a single norm on data format, architecture or protocol. Each sector has already worked on ontologies to help facilitate dialogue between data sets. ‘Our intention is not to provide yet another ontology’, explains Antoine Garnier, project head at IDSA. ‘What we are offering is more of a meta-model which enables a description of data sets based on those sector ontologies, and with an agnostic approach in terms of the sectors it targets.’

This standard can be seen as a list of conditions on which to base data use. To summarise the conditions in IDSA’s architectural model, ‘the three cornerstones are the interoperability, certification and governance of data’, says Antoine Garnier. Thanks to this approach, the resulting standard serves as a guarantee of quality between players: it enables users to determine rapidly whether an organisation fulfils these conditions and is thus trustworthy. The system also addresses security, one of the primary concerns of organisations that agree to open their data.
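To picture what such a list of conditions might look like in machine-readable form, here is a hypothetical dataset self-description organised around the three cornerstones quoted above; all field names and values are invented for illustration and do not reproduce the actual IDSA meta-model.

```python
# Hypothetical self-description of a shared data set (invented fields,
# not the actual IDSA information model).
dataset_description = {
    "interoperability": {
        "sector_ontology": "https://example.org/energy-ontology#",  # invented URI
        "format": "text/csv",
    },
    "certification": {
        "provider_certified": True,
        "certification_level": "base",
    },
    "governance": {
        "usage_policy": "non-commercial research only",
        "retention_days": 365,
    },
}
```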

Europe, the great lake region of data?

While developing a standard is a step forward in technical terms, it remains to be put into actual use. For this, its design must incorporate the technical, legal, economic and political concerns of European data stakeholders – producers and users alike. Hence the importance of creating a community of as many organisations as possible. In Europe, since 2020, this community has had a name: Gaia-X, an association of players (bringing together IMT and IDSA in particular) formed to structure efforts around the federation of data, software and infrastructure clouds. Via Gaia-X, public and private organisations aim to roll out standardisation actions, using the IDSA standard among others, but may also implement research, training or awareness activities.

‘This is such a vast issue that if we want to find a solution, we must approach it through a community of experts in security, interoperability, governance and data analysis’, Olivier Boissier points out, emphasising the importance of dialogue between specialists on this topic. Alongside their involvement in Gaia-X, IMT and IDSA are organising a winter school from 2 to 4 December to raise awareness among young researchers of data-sharing issues (see insert below). With the support of the German-French Academy for the Industry of the Future, it will provide the keys to understanding technical and human issues through concrete cases. ‘Within the research community, we are used to taking part in conferences to keep up to date on the state of play of our field, but it is difficult to have a deeper understanding of the problems faced by other fields’, Sylvain Le Corff admits. ‘This type of Franco-German event is essential to structuring the European community and forming a global understanding of an issue, by taking a step back from our own area of expertise.’

The European Commission has made no secret of its ambition to create a space for the free circulation of data within Europe. In other words, a common environment in which personal and confidential data would be secured, but also in which organisations would have easy access to a significant amount of industrial data. To achieve this idyllic scenario of cooperation between data players, the collective participation of organisations is an absolute prerequisite. For academics, working as a community is core practice and does not represent a major challenge. For businesses, however, there remains a certain number of stakeholders to win over. The majority of major industries have understood the benefits of data sharing, ‘but some companies still see data as a monetisable war chest that they must avoid sharing’, says Antoine Garnier. ‘We must take an informative approach and shatter preconceived ideas.’

Read on I’MTech: Data sharing: an important issue for the agricultural sector

What about non-European players? When we speak about data sharing, we systematically refer to the cloud, a market cornered by three American players, Amazon, Microsoft and Google, behind which we find other American stakeholders (IBM and Oracle) and a handful of Chinese interests such as Alibaba and Tencent. How do we convince these ‘hyper-scalers’ (the term refers to their ability to scale up to meet growing demand, regardless of the sector) to adopt a standard which is not their own, when they own the technology on which the majority of data use is based? ‘Paradoxically, we are perhaps not such bad news for them’, Antoine Garnier assures us. ‘Along with this standard, we are also offering a form of certification. For players suffering from a negative image, this allows them to demonstrate compliance with the rules.’

This standardisation strategy also impacts European digital sovereignty and the transmission of its values. Just as Europe succeeded in imposing a personal data protection standard in the 2010s with the formalisation of the GDPR, it is currently working to define a standard for industrial data sharing. Its approach is identical: to make standardisation a guarantee of security and responsible management. ‘A standard is often perceived as a constraint, but it is above all a form of freedom’, concludes Olivier Boissier. ‘By adopting a standard, we free ourselves of the technical and legal constraints specific to each given use.’

[box type=”info” align=”” class=”” width=””]From 2 to 4 December: a winter school on data sharing

Around the core theme of Data Analytics & AI, IMT and TU Dortmund are organising a winter school on data sharing for industrial systems, from 2 to 4 December 2020, in collaboration with IDSA, the German-French Academy for the Industry of the Future and with the support of the Franco-German University. Geared towards doctoral students and young researchers, its aim is to open perspectives and establish a state of play on the question of data exchange between European stakeholders. Through the participation of various European experts, this winter school will examine the technical, economic and ethical aspects of data sharing by bringing together the field expertise of researchers and industrial players.

Information and registration

[/box]


A European project to assess the performance of robotic functions

Healthcare, maintenance, the agri-food industry, agile manufacturing. Metrics, a three-year H2020 project launched in January, is organizing robot competitions geared towards these four industries and is developing metrological methods to assess the data collected. TeraLab, IMT’s big data and AI platform, is a partner in this project. Interview with Anne-Sophie Taillandier, Director of TeraLab.

 

What is the aim of the European project Metrics?

Anne-Sophie Taillandier: The aim of Metrics (Metrological Evaluation and Testing of Robots in International Competitions) is threefold. First, it will organize robot competitions geared towards the four priority fields: healthcare, inspection and maintenance, agri-food, and agile manufacturing. Second, it will develop metrological methods to assess the data provided by the robots. And lastly, the project will help structure the European robotics community around the competitions in these four priority sectors.

Other European projects are organizing robot competitions to encourage scientific progress and foster innovation. How does Metrics stand out from these projects?

AST: One of the main things that sets the Metrics project apart is that it directly addresses the reliability and validity of AI algorithms during the competitions. To do so, the competitions must focus both on the robot’s behavior in a physical environment and on the behavior of its AI algorithms when they are confronted with correctly qualified and controlled data sets. To the best of our knowledge, this question has not been addressed in previous European robotics competitions.

What are the challenges ahead?

AST: Ultimately, we hope to make the use of assessment tools and benchmarking widespread and to ensure the industrial relevance of the challenge competitions. We will also have to attract the attention of industrial players, universities and the general public to the competitions, and ensure that the robots comply with ethical, legal, social and economic requirements.

How will you go about this?

AST: Metrics is developing an evaluation framework based on metrological principles in order to assess the reliability of the different competing robots in a thorough and impartial manner. For each competition, Metrics will organize three field evaluation campaigns (in physical environments) and three cascade evaluation campaigns (on data sets) in order to engage with the AI community. Specific benchmarks for functions and tasks are defined in advance to assess the performance of robotic functions and the execution of specific tasks.

The Metrics partners have called upon corporate sponsors to support the competitions, verify their industrial relevance, contribute to an awareness program and provide effective communication.

How is TeraLab – IMT’s big data and AI platform – contributing to the project?

AST: First of all, TeraLab will provide sovereign, neutral spaces, enabling the Metrics partners and competitors to access data and software components in dedicated work spaces. TeraLab will provide the required level of security to protect intellectual property, assets and data confidentiality.

TeraLab and IMT are also in charge of the Data Management Plan setting out the rules for data management in Metrics, based on best practices for secure data sharing, with contributions from IMT experts in the fields of cybersecurity, privacy, ethics and compliance with GDPR (General Data Protection Regulation).

The consortium brings together 17 partners. Who are they?

AST: Coordinated by the French National Laboratory for Metrology and Testing (LNE), Metrics brings together 17 European partners: higher education and research institutions and organizations with expertise in the field of testing and technology transfer. They contribute expertise in robotic competitions and metrology. The partners provide test facilities and complementary networks throughout Europe in the four priority industrial areas.

What are the expected results?

AST: On a technological level, Metrics should encourage innovation in the field of robotic systems. It will also have a political impact with information for policymakers and support for robotic systems certification. And it will have a tangible socio-economic impact as well, since it will raise public awareness of robotic capacity and lead to greater engagement of commercial organizations in the four priority industries. All of this will help ensure the sustainability, at the European level, of the competition model for robotics systems that address socio-economic challenges.

Learn more about the Metrics project

Interview by Véronique Charlet for I’MTech

 


Innovating to improve radioactive waste management

The PREDIS European project aims to develop innovative activities for the management of radioactive waste, for which there is currently no solution. IMT Atlantique is one of the project’s seven work package leaders and will contribute to research on innovative approaches for the treatment and conditioning of metallic waste. Abdesselam Abdelouas, a researcher working on the project at IMT Atlantique, gives us an overview.

 

Can you describe the broader context for the PREDIS European project?

AA: The management of radioactive waste from the nuclear power cycle, as well as from other industries such as healthcare, radiopharmaceutical production, farming and mining operations, remains a challenge and requires the development of new methods, processes and technologies.

What is the project’s goal?

AA: The aim of PREDIS is to reduce the overall volume of waste destined for disposal and to recycle radioactively contaminated metallic waste. Reducing the volume of waste will make it possible to avoid building costly new disposal sites. The consortium will strive to test and assess innovative approaches (methods, processes, technologies and demonstrators) for the treatment and conditioning of radioactive waste.

How do you plan to achieve this goal and what are the scientific hurdles to overcome?

AA: As part of this project, we’ll be selecting a well-known or new chemical process, improving it and adapting it for greater applicability. This process will also have to meet environmental requirements, in particular in regard to the toxicity of the materials used and the volume of effluents produced by the treatment.

How are IMT Atlantique researchers contributing to this project?

AA: Bernd Grambow and I are radiochemistry professors at IMT Atlantique’s Subatech laboratory, and we are coordinating Work Package 4 on metallic waste treatment. Beyond this coordination mission, we will be conducting research into decontamination and management of treatment effluents.

The PREDIS consortium brings together 48 partners. Which ones are you working with the most?

AA: In Work Package 4, we interact with some twenty mainly European partners, but we work more closely with the CEA (Marcoule), the University of Pannonia (Hungary) and the Czech Technical University (CTU).

What are the next big steps for the project?

AA: The PREDIS management team met on 16 June 2020 to prepare for the kick-off meeting scheduled for September 2020.

Interview by Véronique Charlet for I’MTech

 


Gaia-X: a sovereign, interoperable European cloud network

France and Germany have unveiled the Gaia-X project, which aims to harmonize cloud services in Europe to facilitate data sharing between different parties. It also seeks to reduce companies’ dependence on cloud service providers, which are largely American. For Europe, this project is therefore an opportunity to regain sovereignty over its data.

 

“When a company chooses a cloud service provider, it’s a little bit like when you accept terms of service or sale: you never really know how you’ll be able to change your service or how much it will cost.” Anne-Sophie Taillandier uses this analogy to illustrate the challenges companies currently face in relying on cloud service providers. As director of IMT’s TeraLab platform specializing in data analysis and AI, she is contributing to the European Gaia-X project, which aims to introduce transparency and interoperability in cloud services in Europe.

Initiated by German Economy Minister Peter Altmaier, Gaia-X currently brings together ten German founding members and ten French founding members, including cloud service providers and major users of these services, of all sizes and from all industries. Along with these companies, a handful of academic players specialized in research in digital science and technology – including IMT – are also taking part in the project. This public-private consortium is seeking to develop two types of standards to harmonize European cloud services.

First of all, it aims to introduce technical standards to harmonize practices among various players. This is an important condition to facilitate data and software portability. Each company must be able to decide to switch service providers if it so wishes, without having to modify its databases to make them compatible with a new service. The standardization of the technical framework for every cloud service is a key driver to facilitate the movement of data between European parties.

Environmental issues illustrate the significance of this technical problem. “In order to measure the environmental impact of a company’s operations, its data must be combined with that of its providers, and possibly, its customers,” explains Anne-Sophie Taillandier, who, for a number of years, has been leading research at TeraLab into the issues of data transparency and portability. “If each party’s data is hosted on a different service, with its own storage and processing architecture, they will first have to go through a lengthy process in order to harmonize the data spaces.”  This step is currently a barrier for organizations that lack either financial resources or skills, such as small companies and public organizations.

Also read on I’MTech: Data sharing: an important issue for the agricultural sector

In addition to technical standards, the members of the Gaia-X partnership are also seeking to develop a regulatory and ethical framework for cloud service stakeholders in Europe. The goal is to bring clarity to contractual relationships between service providers and customers. “SMEs don’t have the same legal and technical teams as large companies,” says Anne-Sophie Taillandier. “When they enter into an agreement with a cloud service provider, they don’t have the resources to evaluate all the subtleties of the contract.”

The consortium has already begun to work on these ethical rules.  For example, there must not be any hidden costs when a company wishes to remove its data from a service provider and switch to another provider. Ultimately, this part of the project should give companies the power to choose their cloud service providers in a transparent way. An approach that recalls the GDPR, which gives citizens the ability to choose their digital services with greater transparency and to ensure the portability of their personal data when necessary.

Restoring European digital sovereignty

It is no coincidence that the concepts guiding the Gaia-X project evoke those of the GDPR. Gaia-X is rooted in a general European Union trend toward data sovereignty. The initiative is also an integral part of the long-term EU strategy to create a sovereign space for industrial and personal data, protected by technical and legal mechanisms that are themselves sovereign.

The Cloud Act adopted by the United States in 2018 gave rise to concerns among European stakeholders. This federal law gives local and national law enforcement authorities the power to request access to data stored by American companies, should this data be necessary to a criminal investigation, including when these companies’ servers are located outside the United States. Yet the cloud services market is dominated by American players. Together, Amazon, Microsoft and Google hold over half the market share for this industry. For European companies, the Cloud Act poses a risk to the sovereignty of their data.

Even so, the project does not aim to create a new European cloud services leader, but rather to encourage the development of existing players, through its regulations and standards, while harmonizing the practices already in place among the various players. The goal is not to prevent American players or those in other countries — Chinese giant Alibaba’s cloud service is increasingly gaining ground — from tapping into the European market. “Our goal is to issue standards that respect European values, and then tell anyone who wishes to enter the European market that they may, as long as they play by the rules.”

For now, Gaia-X has adopted an associative structure. In the months ahead, the consortium should be opening up to incorporate other European companies who want to take part. “The project was originally a Franco-German initiative,” says Anne-Sophie Taillandier, “but it is meant to open up to include other European players who wish to contribute.” In line with European efforts over recent years to develop digital technology with a focus on cybersecurity and artificial intelligence, Gaia-X and its vision for a European cloud will rely on joint creation.

 

Benjamin Vignard for I’MTech


How are borders in industry changing?

The Industry Without Borders project launched by the German-French Academy for the Industry of the Future in 2017 seeks to challenge the idea that digital technology dissolves borders. Madeleine Besson and Judith Igelsboeck, from Institut Mines-Télécom Business School and the Technical University of Munich respectively, explain why it is not so easy in practice.

 

Industry is going digital and this has brought about a wave of changes. The emphasis on open innovation pushes for the dissolution of borders within a company and in relationships between various organizations. “It’s not so easy,” says Madeleine Besson, a researcher in management at Institut Mines-Télécom Business School. “We’ve seen that it’s becoming more fluid, but digitalization can also create new borders or reinforce those that already exist.”

The aim of the Industry Without Borders  project launched in 2017 was to identify factors that can lead to the creation of borders in companies and look at how these aspects are evolving. The project is led by the German-French Academy for the Industry of the Future, which brings together teams from IMT and the Technical University of Munich (TUM). “We looked at the way borders can be built, rebuilt, and at times, strengthened or effectively broken down,”  says Judith Igelsboeck, an organizational studies researcher at TUM. Research teams on both sides of the Rhine worked with companies, through field studies and qualitative interviews, in order to determine the changes that have been brought about by digital technology.

“We considered the idea of open innovation in particular,” says Madeleine Besson. Today, companies consult consumers much more often in their creation and innovation processes, but the process often remains under the company’s control. Conversely, Judith Igelsboeck reports that “a study in an IT consulting firm in Germany showed that customers went so far as to request the firm’s skills database so that they could choose the profiles of the IT specialists for their project directly themselves. The opening here is therefore clear.”

What borders?

“For a long time, borders in the business world were formalized from an economic viewpoint,” explains the French researcher. This includes assets and goods, employees and the machines used. “But the scope is much wider than that, and most importantly, it’s very mobile.”  A number of other aspects may also come into play, such as customer relationships and their involvement in innovation processes, as in the previous example, and relationships between different companies.

As far as internal borders are concerned, for example concerning organization within a department, management models tend to be moving toward eliminating borders. “This idea is reflected in efforts to redesign the very architecture of the office – the principle of open space,” explains Judith Igelsboeck. Workspaces become more agile, flexible and free of internal separations. The aim is to create more communal spaces, so that co-workers get to know each other better in order to work together.

But ultimately, open space may not be as open as it seems. “Employees reclaim ownership of the space by creating borders to mark out their personal space,” says Madeleine Besson. They do so by making special adjustments – at times perceptible only to those who use the workspace – to mark a space as their own.

Read more on I’MTech: Can workspaces become agile?

Madeleine Besson reminds us that “the general consensus in scientific literature and the media is that digital tools and artificial intelligence facilitate instant connections, not only between people, but between things.” Supply chains should develop into a single, seamless automated process that can work beyond organizational borders. But practice is not so clear-cut, and digital tools can even appear to add new barriers.

Between theory and practice

“Imagine a printer that uses an automated tool to help manage the paper supply,”  says the French researcher. “A connected IT system between the paper supplier and the printer could help regulate paper ordering depending on current stock and the factories’ operations. The supplier becomes a sort of stockpile the company can draw on – the system is shared and the borders are therefore weakened.”
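A minimal sketch of the shared replenishment logic described above, with invented names and thresholds rather than any actual supplier system:

```python
# Illustrative only: a reorder rule running on a system shared by the
# printing company and its paper supplier (invented thresholds).
REORDER_THRESHOLD_KG = 500    # hypothetical minimum stock level
REORDER_QUANTITY_KG = 2000    # hypothetical batch size

def check_paper_stock(current_stock_kg: float, planned_usage_kg: float) -> float:
    """Return the quantity to order from the supplier, or 0 if stock suffices.
    Because both companies see the same rule and data, the organizational
    border between them is weakened."""
    projected = current_stock_kg - planned_usage_kg
    return REORDER_QUANTITY_KG if projected < REORDER_THRESHOLD_KG else 0.0

order = check_paper_stock(current_stock_kg=1200, planned_usage_kg=900)  # -> 2000
```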

Yet, the same example could also be used to illustrate how new borders are created. If these companies use competing systems, such as Apple and Android, they will face insurmountable barriers since these two systems are not interoperable. “Technological change can also create a new border,”  adds Madeleine Besson. “It can create sub-categories between organizations that have the desire and skills to converse with computers, and others that may feel like they are merely assistants for automatons.”

“Our team encountered such a feeling during an interview with the staff of an after-sales service company,” says the researcher. Their workday revolves around making rounds to customers whose equipment has broken down. Traditionally, these employees organized their own rounds. But their schedule is now managed by a computer system and they receive the list of customers to visit the night before. “The employees were frustrated that they were no longer in control of their own schedule. They didn’t want their responsibilities to be taken away,” she explains.

“They would meet up in the morning before making their rounds to exchange appointments. Some didn’t want to go to into big cities, others wanted to keep the customers they’d been working with for a long time. So the digital tool puts up a barrier within the company and is a source of tension and frustration, which could potentially give rise to conflicts or disputes.”  These changes are made without adequately consulting the internal parties involved and can lead to conflict in the company’s overall operations.

Across the Rhine

This initial phase of the project with a number of field studies in France and Germany is expected to lead to new collaborations. For the researchers, it would be interesting to study the changes on either side of the Rhine and determine whether similar transformations are underway, or if a cultural aspect may lead to either the dissolution or crystallization of borders.

“Each country has its own vision and strategy for the industry of the future,”  says Judith Igelsboeck. So it is conceivable that cultural differences will be perceptible. “The intercultural aspect is a point to be considered, but for now, we haven’t been able to study it in a single company with a German and French branch.”  This may be the topic for a new French-German collaboration. The German researcher says that another possible follow-up to this project could focus on the use of artificial intelligence in business management.

 

Tiphaine Claveau for I’MTech


5G-Victori: large-scale tests for vertical industries

Twenty-five European partners have joined together in the three-year 5G-Victori project launched in June 2019. They are conducting large-scale trials for advanced use case validation in commercially relevant 5G environments. Navid Nikaein, researcher at EURECOM, key partner of the 5G-Victori project, details the challenges here.

 

What was the context for developing the European 5G-Victori project?

Navid Nikaein: 5G-Victori stands for VertIcal demos over Common large scale field Trials fOr Rail, energy and media Industries. This H2020 project is funded by the European Commission as part of the third phase of the 5G PPP (5G Infrastructure Public Private Partnership) projects. This phase aims to validate use cases for vertical industry applications in realistic and commercially relevant 5G test environments. 5G-Victori focuses on use cases in Transportation, Energy, Media, and Factories of the Future.

What is the aim of this project?

NN: The aim is threefold. First, the integration of the different 5G operational environments required to demonstrate the large variety of 5G-Victori vertical and cross-vertical use cases. Second, testing the four main use cases, namely Transportation, Energy, Media and Factories of the Future, on 5G platforms located in Sophia Antipolis (France), Athens (Greece), and Espoo and Oulu (Finland), allowing partners to validate their 5G use cases in view of a wider roll-out of services. Third, the transformation of current closed, purpose-built and dedicated infrastructures into open environments where resources and functions are exposed to the telecom and vertical industries through common repositories.

What technological and scientific challenges do you face?

NN: A number of challenges have been identified for each use case; they will be tackled during the course of the project in their relevant 5G environments (see figure below).

In the Transport use case, we validate the sustainability of critical services, such as collision avoidance, and enhanced mobile broadband applications, such as 4K video streaming, under high-speed mobility in railway environments. The main challenges considered in 5G-Victori are (a) the interconnection of on-board devices with the trackside, and of the trackside with the edge and/or core network (see figure below), and (b) the guaranteed delivery of railway-related critical data and signalling services addressing on-board and trackside elements using a common software-based platform.

In the Energy use case, the main challenge is to facilitate smart energy metering, fault detection and preventive maintenance by taking advantage of low-latency signal exchange between substations and the control center over 5G networks. Both high-voltage and low-voltage energy operations are considered.

In the Media use case, the main challenge is to enable diverse content delivery networks capable of providing services in dense, static and mobile environments. In particular, this covers 4K video streaming service continuity in mobile scenarios under 5G network coverage, and the bulk transfer of large volumes of content for disconnected operation of personalized Video on Demand (VoD) services.

In the Factories of the Future use case, the main challenge is the design and development of a fully automated Digital Utility Management system over a 5G network demonstrating advanced monitoring solutions. Such a solution should be able to track all operations, detect equipment faults, and support the decision-making process of first responders based on the collected data.

How are EURECOM researchers contributing to this project?

NN: EURECOM is one of the key partners in this project, as it will provide its operational 5G testing facilities based on the OpenAirInterface (OAI) and Mosaic5G platforms. The facility provides software-defined networking (SDN), Network Function Virtualization (NFV) and Multi-access Edge Computing (MEC) solutions for 5G networks. In addition, EURECOM will design and develop a complete 5G network slicing solution that will be used to deploy a virtualized 5G network tailored to the above-mentioned use cases. Finally, EURECOM will pre-validate a subset of the scenarios considered in the project.

Also read on I’MTech: SDN and virtualization: more intelligence in 5G networks

Who are your partners and what are your collaborations?

NN: The project brings together 25 European partners, represented in the figure below: SMEs, network operators, vendors, academia, etc. EURECOM plays a key role in the project in that it provides (a) 5G technologies through the OpenAirInterface and Mosaic5G platforms to a subset of partners, and (b) 5G deployment and testing facilities located at the Sophia Antipolis campus.

What are the expected benefits of the project?

NN: In addition to the scientific benefits in terms of publications, the project will support the continuous development and maintenance of the OpenAirInterface and Mosaic5G software platforms. It will also allow us to validate whether the 5G network is able to deliver the considered use cases with the expected performance. We also plan to leverage our results by providing feedback, when possible, to standardization bodies such as 3GPP and O-RAN.

What are the next important steps for the project?

NN: In the first year, the project focused on refining the 5G architecture and software platforms to enable efficient execution of the considered use cases. In the second year, the project will focus on deploying the use cases on the target 5G testing facilities provided by 5G-EVE, 5G-VINNI, 5GENESIS, and 5G-UK.

Learn more about the 5G-Victori project

Interview by Véronique Charlet for I’MTech


The artificial fish of the Venice lagoon

The European H2020 Subcultron project was completed in November 2019 and successfully deployed an autonomous fleet of underwater robots in the Venice lagoon. After four years of work, the research consortium — which includes IMT Atlantique — has demonstrated the feasibility of synchronizing a swarm of over one hundred autonomous units in a complex environment. An achievement made possible by the use of robots equipped with a bio-inspired sixth sense known as an “electric sense.”

 

Curious marine species inhabited the Venice lagoon from April 2016 to November 2019. Nautical tourists and divers were able to observe strange transparent mussels measuring some forty centimeters, along with remarkable black lily pads drifting on the water’s surface. But amateur biologists would have been disappointed had they made the trip to observe them, since these strange plants and animals were actually artificial.  They were robots submerged in the waters of Venice as part of the European H2020 Subcultron project. Drawing on electronics and biomimetics, the project’s aim was to deploy an underwater swarm of over 100 robots, which were able to coordinate autonomously with one another by adapting to the environment.

To achieve this objective, the scientists taking part in the project chose Venice as the site for carrying it out. “The Venice lagoon is a sensitive, complex environment,” says Frédéric Boyer, a robotics researcher at IMT Atlantique — a member of the Subcultron research consortium. “It has shallow, very irregular depths, interspersed with all sorts of obstacles. The water is naturally turbid. The physical quantities of the environment vary greatly: salinity, temperature etc.” In short, the perfect environment for putting the robots in a difficult position and testing their capacity for adaptation and coordination.

An ecosystem of marine robots

As a first step, the researchers deployed 130 artificial mussels in the lagoon. The mussels were actually electronic units encapsulated in a watertight tube. They were able to collect physical data about the environment but could not move, other than sinking and resurfacing. Their autonomy was ensured by an innovative charging system developed by one of the project partners: the Free University of Brussels. On the surface, the floating “lily pads” powered by solar energy were actually data-processing bases. There was just one problem: the artificial mussels and lily pads could not communicate with one another. That’s where the notion of coordination and a third kind of robot came into play.

In the turbid waters of the Venice lagoon, artificial fish were responsible for transmitting environmental data from the bottom of the lagoon to the surface.

 

To send information from the bottom of the lagoon to the surface, the researchers deployed some fifty robotic fish. “They’re the size of a big sea bream and are driven by small propellers, so unlike the other robots, they can move,” explains Frédéric Boyer. There is thus a single path for transmitting data between the bottom of the lagoon and the surface: the mussels transmit information to the fish, which swim up to deliver it to the lily pads and then return to the mussels to start the process over again. And all of this takes place in a variable marine environment, where the lily pads drift and the fish have to adapt.

Fish with a sixth sense

Developing this autonomous robot ecosystem was particularly difficult. “Current robots are developed with a specific goal, and are rarely intended to coordinate with other robots with different roles,” explains Frédéric Boyer. Developing the artificial fish, which played a crucial role, was therefore the biggest challenge of the project. The IMT Atlantique team contributed to these efforts by providing expertise on a bio-inspired sense: electric sense.

“It’s a sense found in certain fish that live in the waters of tropical forests,” says the researcher. “They have electrosensitive skin, which allows them to measure the distortions of electric fields produced by themselves or others in their immediate environment: another fish passing nearby causes a variation that they can feel. This means that they can stalk their prey or detect predators in muddy water or at night.” The artificial fish of the turbid Venice lagoon were equipped with this electric sense.

This capacity made it possible for the fish to engage in organizational, cooperative behaviors. Rather than each fish looking for the mussels and the lily pads on their own, they grouped together and travelled in schools. They were therefore better able to detect variations in the electric field, whether under the water or on the surface, and align themselves in the right direction. “It’s a bit like a compass that aligns itself with the Earth’s electromagnetic field,” says Frédéric Boyer.
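A toy illustration of this compass-like behaviour, under heavily simplified assumptions (a two-dimensional field vector and invented sensor readings, nothing like the robots’ real electronics):

```python
import math

def align_to_field(ex: float, ey: float) -> float:
    """Heading (radians) that aligns a robot with a local electric
    field vector (ex, ey), much as a compass aligns with a magnetic field."""
    return math.atan2(ey, ex)

def school_heading(readings: list[tuple[float, float]]) -> float:
    """Pool the field sensed by each fish in the school: averaging many
    noisy measurements yields a more reliable common heading, which is
    the benefit of travelling in groups described above."""
    sx = sum(e[0] for e in readings)
    sy = sum(e[1] for e in readings)
    return math.atan2(sy, sx)

# Three fish with noisy individual readings converge on one direction.
heading = school_heading([(1.0, 0.1), (0.9, -0.05), (1.1, 0.08)])
```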

The Subcultron project therefore marked two important advances in the field of robotics: the coordination of a fleet of autonomous agents, and equipping underwater robots with a bio-inspired sense. These advances are of particular interest for monitoring ecosystems and the marine environment. One of the secondary aims of the project, for example, was tracking the phenomenon of oxygen depletion in the water of the Venice lagoon. An event that occurs at irregular intervals, in an unpredictable manner, and leads to local mortality of aquatic species. Using the data they measured, the swarm of underwater robots successfully demonstrated that it is possible to forecast this phenomenon more effectively. In other words, an artificial ecosystem for the benefit of the natural ecosystem.

Learn more about Subcultron

[box type=”info” align=”” class=”” width=””]

The Subcultron project was officially launched in April 2015 as part of the Horizon 2020 research program. It was coordinated by the University of Graz, in Austria. It brought together IMT Atlantique in France, along with partners in Italy (the Pisa School of Advanced Studies and the Venice Lagoon Research Consortium), Belgium (the Free University of Brussels), Croatia (the University of Zagreb) and Germany (Cybertronica).

[/box]


Data portability: Europe supports research players in this field

The right to data portability, introduced by the GDPR, allows individuals to obtain and reuse their personal data across different services. Launched in November 2019 for a period of three years, the European DAPSI project promotes advanced research on data portability by supporting researchers and tech SMEs and start-ups working in this field. The IMT Starter incubator is one of the project partners. IMT Starter business manager Augustin Radu explains the aim of DAPSI below.

 

What was the context for developing the DAPSI project?

Augustin Radu: Since the entry into force of the GDPR (General Data Protection Regulation) in 2018, all citizens have had the right to obtain, store and reuse personal data for their own purposes. The right to portability gives people more control over their personal data. It also creates new development and innovation opportunities by facilitating personal data sharing in a secure manner, under the control of the person involved.

What is the overall goal of the project?

AR: The Data Portability and Services Incubator (DAPSI) will empower internet innovators to develop human-centric technology solutions, meaning web technologies that can boost citizens’ control over data (privacy by design), trust in the internet and web decentralization, etc.

The goal is to develop new solutions in the field of data portability. The DAPSI project aims to allow citizens to transmit all the data stored by a service provider directly to another service provider, responding to the challenge of personal data portability on the internet, as provided for by the GDPR.

How will you achieve this goal?

AR: DAPSI will support up to 50 teams as part of a ten-month incubation program during which experts from various fields will provide an effective work methodology, access to cutting-edge infrastructure, training in business and data sectors, coaching, mentoring, visibility, as well as investment and a strong community. In addition, each DAPSI team will receive up to €150K in equity-free funding, which represents a total of €5.6 M through the three open calls.

How is IMT Starter contributing to the project?

AR: IMT Starter, in partnership with Cap Digital, will be in charge of this ten-month incubation program. In concrete terms, the selected projects will have access to online training sessions and one-to-one coaching sessions.

Who are your partners in this project?

AR: The project is led by Zabala (Spain); IMT Starter is working alongside four other European partners: F6S (United Kingdom), Engineering (Italy), Fraunhofer (Germany) and Cap Digital (France).

What are the expected benefits of DAPSI?

AR: This initiative aims to develop a more human-centric internet based on the values of openness, cross-border cooperation, decentralization and privacy protection. The primary objective is to allow users to regain control in order to increase trust in the internet. This should lead to more transparent services with more intelligence, greater engagement and increased user participation, thereby fostering social innovation.

What are some important steps for the project?

AR: The first call was launched at the end of February. Anyone with an innovative project in the portability field may submit an application.

Learn more about DAPSI

Interview by Véronique Charlet for I’MTech


Recovering knowledge of local, traditional building materials

Why is an old country farmhouse more pleasant in summer than a modern city building? Traditional building materials and natural stone provide old buildings with better thermal and hygrometric properties. Unfortunately, they often lack the technical characterizations they need to find their place in the construction industry. The European regional development project OEHM has set out to resolve this problem. It brings together IMT Mines Alès, the University of Montpellier and the National School of Architecture of Montpellier. Aymeric Girard, a materials researcher at IMT Mines Alès, gives us an overview of the project and the challenges involved.

 

You’re studying natural building materials through the OEHM project. Why is this?

Aymeric Girard: All building materials require technical characterization. It’s important, since proposals for buildings are now always simulated by computer as a first step. But traditional building materials, which are not produced by industry, lack these technical characterizations. By studying local, traditional materials through the project, we are striving to fill this gap.

If the construction industry doesn’t use these materials, is it interested in this knowledge?

AG: Yes, since one of the major observations about current buildings is that they rely too heavily on internal insulation. The main reason for this is a lack of thermal mass in modern buildings, meaning a mass of materials that serves as a heat regulator. In a new building made with conventional building materials, you’re hot in the summer and cold in the winter. So you need heat and air conditioning. But this is far less of a problem in old buildings built with traditional building materials. In Seville, which is one of the hottest cities in Europe, old churches and cathedrals remain cool in the summer.   The construction industry is now seeking to model new buildings after these traditional structures.

Read more on I’MTech: In Search of Forgotten Cements

There’s also a second benefit. The construction industry contributes heavily to greenhouse gas emissions, partly due to the environmental footprint of transporting materials. Using local stones encourages short supply chains, thereby reducing the environmental impact.

What materials are we talking about?

AG: For the OEHM project, we’re working with a clay brick factory and four natural stone quarries: one for granite and three for limestone. Some of these stones are truly local, since they come from the Occitanie region where IMT Mines Alès is located. Others are local in the sense that they come from France at least.

What aspects of these stones and bricks do you study?

AG: We conduct two main analyses of these stones: a thermal analysis and a hygrometric analysis. Hygrometry allows us to study a material’s ability to absorb humidity. That’s important because in winter, for example, the windows in a house are usually closed and you cook, take showers, sweat, etc. All of these things increase the humidity level in rooms, which affects quality of life. Certain stones with very low porosity will not absorb this humidity at all, while others with high porosity will have a buffering effect and provide greater comfort.

How do you obtain the technical characteristics you’re seeking?

AG: The quarries send us small five-centimeter cubes to be analyzed. We use the hot-wire method to study heat transfer. This involves taking two cubes of the same stone, and putting a sensor the size of a post-it note between them. We heat one side and observe the speed at which the stone on the other side heats up. We also study the stones’ heat capacity, by putting even smaller samples measuring 5 mm per side in a mini-oven. This provides us with information about how long it takes to raise the stone’s temperature and about how it behaves.

In terms of humidity, we have a sort of refrigerator in which we apply a constant level of moisture; we then compare the weight of the dry stone with that of the saturated stone and deduce its capacity to absorb moisture. It’s a very long process that can take up to four months.
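To give an idea of the figures these two measurements produce, here is a minimal sketch of both calculations. The hot-wire part uses the standard ideal line-source approximation (temperature rise proportional to the logarithm of time); all numeric values are invented for illustration, not measurements from the OEHM project.

```python
import math

def hot_wire_conductivity(q: float, t1: float, t2: float, dT: float) -> float:
    """Thermal conductivity (W/m·K) from the hot-wire method. For an ideal
    line source, dT = q * ln(t2/t1) / (4 * pi * k), where q is the heating
    power per unit length of wire (W/m) and dT is the temperature rise
    measured between times t1 and t2 (s)."""
    return q * math.log(t2 / t1) / (4 * math.pi * dT)

def moisture_capacity(dry_mass: float, saturated_mass: float) -> float:
    """Moisture uptake as a fraction of dry mass, from the dry/saturated
    weighing described above."""
    return (saturated_mass - dry_mass) / dry_mass

# Illustrative values only.
k = hot_wire_conductivity(q=10.0, t1=10.0, t2=100.0, dT=1.8)   # ~1.0 W/m·K
w = moisture_capacity(dry_mass=290.0, saturated_mass=320.4)    # ~10.5 %
print(f"conductivity ≈ {k:.2f} W/(m·K), moisture uptake ≈ {w:.1%}")
```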

With whom are you working on this project?

AG: On the industrial side, we’re only working with the quarries for now. They’re interested in the technical characteristics we’re producing in order to provide their partners and customers with data about the materials. It’s important knowledge, just as when you buy glass wool to renovate your home, or when you compare offers to decide what to buy. On the research side, the project is part of a long collaboration between IMT Mines Alès, the University of Montpellier, and the National School of Architecture of Montpellier.

What will the project produce besides these technical characteristics?

AG: We plan to use the data we recover to develop our own material simulation software. And we’re also going to carry out real-site testing in collaboration with the National School of Architecture of Montpellier. They have a replica of a house that can be adapted to test materials. This will give us the opportunity to test our results and share insights with architects about the opportunities offered by natural materials suited to the Mediterranean climate.


Joint AI: a platform to facilitate German-French research in AI

In 2019, the German-French Academy for the Industry of the Future launched the Joint AI platform project. The platform, which brings together IMT and the Technical University of Munich, promotes collaboration between researchers and industry to develop artificial intelligence tools. Its secure environment protects the intellectual property of results and supports the reproducibility of scientific findings.

 

“The primary aim is to support artificial intelligence research projects between France and Germany.” This is how Anne-Sophie Taillandier begins her description of the Joint AI platform launched in 2019 by IMT and the Technical University of Munich. Since 2015, the two institutions have been working together through the German-French Academy for the Industry of the Future. This partnership has given rise to a number of research projects, some of which have focused on artificial intelligence. Researchers working in this area face a recurring problem: intellectual property protection for the results.

“One of the major risks for AI researchers is presenting their work to academic peers or industry stakeholders and having it stolen,” explains Anne-Sophie Taillandier. For several years, this French artificial intelligence expert has headed IMT’s TeraLab, which aims to facilitate AI research in a secure environment. “Through discussions with our colleagues at the Technical University of Munich, we realized that we each had infrastructures to host and develop AI projects, but that there was no transnational equivalent,” she explains. This gave rise to the Joint AI platform project: a shared, reliable, protected site for German-French research on artificial intelligence.

Read more on I’MTech: TeraLab, a European Data Sanctuary

The platform is based on technological and legal tools. The hardware architecture and workspaces are designed to host data and work on it with the desired security level. Using a set of APIs, the results of a project can be highlighted and shared on both sides of the border, without having to move the data or the software developed. “Everyone can work with confidence, without having to provide access to their executable or data,” says Anne-Sophie Taillandier.
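The mechanism can be pictured as remote, query-based access: data and trained models stay where they are, and partners interact with them through a narrow interface. Below is a minimal illustrative sketch; the endpoint URL, route and JSON fields are hypothetical, not the platform’s actual API.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint exposed by a partner's workspace; the model and
# its training data never leave that environment.
ENDPOINT = "https://partner-workspace.example/api/v1/predict"

def remote_predict(features: dict, token: str) -> dict:
    """Send input features to the partner's model and return predictions.
    Only the query and the result cross the border, not data or code."""
    req = Request(
        ENDPOINT,
        data=json.dumps({"features": features}).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
    )
    with urlopen(req) as resp:
        return json.load(resp)

# Example call (commented out, as the endpoint is hypothetical): a French
# team could evaluate a German team's model without ever downloading it.
# result = remote_predict({"sensor_1": 0.42, "sensor_2": 17.3}, token="...")
```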

A tool for researchers…

For researchers working on AI — as well as other scientific disciplines — facilitating cooperation means facilitating the progress of research projects and results. This is especially true for all research related to Industry 4.0, as is the case for the German-French Academy for the Industry of the Future projects that the Joint AI platform currently hosts. “Research on industry involves complex infrastructures, made up of human users and sensors that link the physical and digital dimensions,” says  Georg Carle, holder of the Network Architectures and Services Chair at the Technical University of Munich, and co-director of the project with Anne-Sophie Taillandier.

He explains that, “In order to be valuable, this research must be based on real data and generate realistic models.” And the more the data is shared and worked on by different teams of researchers,  the more effective the resulting algorithms will be. For Georg Carle, “the Joint AI platform makes it possible to improve the reproducibility of results” between the French and German teams. “This leads to higher-quality results, with a bigger impact for the industry stakeholders.”

And for companies!

In addition to providing a collaborative tool for researchers, the Joint AI platform also provides innovation opportunities for companies involved in partnership-based research. When a German industry stakeholder seeks to collaborate with French researchers or vice versa, the legal constraints for moving data represent a major hurdle. Such collaboration is further limited by the fact that, even within the same large company, it can be difficult for the French and German branches to exchange data. “This can be for a variety of reasons: human resources personal data, data related to industrial property, or data concerning clients with whom there is a confidentiality guarantee,” says Anne-Sophie Taillandier.

Companies therefore need a secure location, from both a technological and legal standpoint, to facilitate joint research. Joint AI therefore makes it easier for private stakeholders to take part in research projects at the European level, such as Horizon 2020 framework program projects — or Horizon Europe for future European research projects as of next year. Such a platform offers a prototype for a solution to one of the biggest problems facing AI and digital innovation: secure data sharing between different stakeholders.

Also read on I’MTech: