
HyBlockArch: hybridizing the blockchain for the industry of the future

Within the framework of the German-French Academy for the Industry of the Future, a partnership between IMT and Technische Universität München (TUM), the HyBlockArch project examines the future of the blockchain. The project aims to adapt this technology to an industrial scale to create a powerful tool for companies. To accomplish this goal, the teams led by Gérard Memmi (Télécom ParisTech) and Georg Carle (TUM) are working on new blockchain architectures. Gérard Memmi shares his insight.

 

Why are you looking into new blockchain architectures?

Gérard Memmi: Current blockchain architectures are limited in terms of performance in the broadest sense: turnaround time, memory, energy… In many cases, this hinders the adoption of blockchain technology in Industry 4.0. Companies would like to see faster validation times or to be able to put even more information into a blockchain block. A bank that wants to track an account history over several decades will be concerned about the number of blocks in the blockchain and the possible increase in block latency times. Yet today we cannot foresee the behavior of blockchain architectures for many years to come. There is also the energy issue: the need to reduce the consumption caused by the proof of work required to enter data into a blockchain, while still ensuring a comparable level of security. We must keep in mind that Bitcoin's proof of work consumes about as much electrical energy as a country like Venezuela.

What type of architecture are you trying to develop with the HyBlockArch project?

GM: We are working on hybrid architectures. These multi-layer architectures make it possible to reach an industrial scale. We start with a blockchain protocol in which each node of the ledger communicates with a mini data storage network on an upper layer. This upper layer does not necessarily run a blockchain protocol; it can operate slightly differently while still maintaining similar properties. The structure is transparent to users; they do not notice a difference. The miners who perform the proof of work required to validate data only see the blockchain aspect. This is an advantage for them, allowing them to work faster without taking the upper layer of the architecture into account.
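To make the idea more concrete, here is a minimal, purely illustrative Python sketch of such a two-layer structure. It is not the HyBlockArch design: every class, method and the toy proof of work are invented for illustration. Bulk payloads live in an upper storage tier, while the base chain (and therefore the miners) only ever handle compact block headers containing a fingerprint of the data:

```python
import hashlib
import json
import time

class UpperLayerStore:
    """Illustrative upper tier: keeps bulk payloads off-chain, keyed by their hash."""
    def __init__(self):
        self._data = {}

    def put(self, payload: bytes) -> str:
        digest = hashlib.sha256(payload).hexdigest()
        self._data[digest] = payload          # bulk data never enters a block
        return digest

    def get(self, digest: str) -> bytes:
        return self._data[digest]

class Block:
    def __init__(self, prev_hash: str, payload_digest: str):
        self.prev_hash = prev_hash
        self.payload_digest = payload_digest  # only a fingerprint goes on-chain
        self.timestamp = time.time()
        self.nonce = 0

    def header_hash(self) -> str:
        header = json.dumps([self.prev_hash, self.payload_digest,
                             self.timestamp, self.nonce]).encode()
        return hashlib.sha256(header).hexdigest()

    def mine(self, difficulty: int = 3):
        """Toy proof of work: miners only touch the compact header, never the bulk data."""
        while not self.header_hash().startswith("0" * difficulty):
            self.nonce += 1

class HybridLedger:
    def __init__(self):
        self.store = UpperLayerStore()    # upper storage layer
        self.chain = []                   # base blockchain layer

    def add_record(self, payload: bytes):
        digest = self.store.put(payload)
        prev = self.chain[-1].header_hash() if self.chain else "0" * 64
        block = Block(prev, digest)
        block.mine()
        self.chain.append(block)

ledger = HybridLedger()
ledger.add_record(b"pallet #42 handed over from vendor to carrier")
print(len(ledger.chain), ledger.chain[0].header_hash()[:12])
```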

What would the practical benefits be for a company?

GM: For a company this would mean smart contracts could be created more quickly, and the computer operations that rely on this architecture would have shorter latency times, resulting in a broader scope of application. The private blockchain is very useful in the field of logistics. For example, each time a product changes hands, such as from the vendor to the carrier, the operation is recorded in the blockchain. A hybrid architecture records this information more quickly and at a lower cost for companies.

This project is being carried out in the framework of the German-French Academy for the Industry of the Future. What is the benefit of this partnership with Technische Universität München (TUM)?

GM: Our German colleagues are developing a platform that measures the performance of the different architectures. We can therefore determine the optimal architecture in terms of energy savings, fast turnaround and security for typical uses in the industry of the future. We contribute a more theoretical aspect: we analyze the smart contracts to develop more advantageous protocols, and we work on proof of work mechanisms for recording information in the blockchain.

What does this transnational organization represent in the academic field?

GM: This creates a European dynamic in the work on this issue. In March we launched a blockchain alliance between French institutes: BART. By working together with TUM on this topic, we are developing a Franco-German synergy in an area that just a few years ago appeared only as a minor issue at research conferences, confined to a single session. The blockchain now has scientific events all to itself. This new discipline is booming and through the HyBlockArch project we are participating in this growth at the European level.

 


C2Net: supply chain logistics on cloud nine

A cloud solution to improve supply chain logistics? This is the principle behind the European C2Net project. Launched on January 1, 2015, the project was completed on December 31, 2017. It successfully demonstrated how a cloud platform can enable the various players in a supply chain to better anticipate and manage future problems. To do so, C2Net drew on research on interoperability and on the automation of alerts using data taken directly from companies in the supply chain. Jacques Lamothe and Frédérick Benaben, researchers in industrial engineering specializing in logistics and information systems respectively, give us an overview of the work they carried out at IMT Mines Albi on the C2Net project.

 

What was the aim of the C2Net project?

Jacques Lamothe: The original idea was to provide cloud tools for SMEs to help them with advanced supply chain planning. The goal was to identify future inventory management problems companies may have well in advance. As such, we had to work on three parts: recovering data from SMEs, generating alerts for issues to be resolved, and monitoring planning activity to see if everything went as intended. It wasn’t easy because we had to respond to interoperability issues — meaning data exchange between the different companies’ information systems. And we also had to understand the business rules of the supply chain players in order to evaluate the relevant alerts.

Could you give us an example of the type of problem a company may face?

Frédérick Benaben: One thing that can happen is that a supplier is only able to manufacture 20,000 units of an item while the SME is expecting 25,000. This makes for a strained supply chain and solutions must be found, such as compensating for this change by asking suppliers in other countries if they can produce more. It’s a bit like an ecosystem: when there’s a problem in one part, all the players in the supply chain are affected.
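To illustrate the kind of automated check such a platform can run, here is a minimal Python sketch built on the figures quoted above (25,000 units expected versus 20,000 confirmed). The record fields and the alert format are invented for the example; this is not C2Net code:

```python
from dataclasses import dataclass

@dataclass
class SupplyPlan:
    item: str
    expected_units: int    # what the SME has planned for
    confirmed_units: int   # what the supplier says it can actually deliver

def generate_alerts(plans):
    """Raise an alert whenever confirmed supply falls short of the plan."""
    alerts = []
    for p in plans:
        shortfall = p.expected_units - p.confirmed_units
        if shortfall > 0:
            alerts.append(f"ALERT {p.item}: short by {shortfall} units "
                          f"({p.confirmed_units}/{p.expected_units} confirmed)")
    return alerts

plans = [SupplyPlan("item-A", expected_units=25_000, confirmed_units=20_000)]
for alert in generate_alerts(plans):
    print(alert)   # ALERT item-A: short by 5000 units (20000/25000 confirmed)
```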

Jacques Lamothe: What we actually realized is that a lot of the time, certain companies have very effective tools to assess demand on one side, while other companies have very effective tools to measure production on the other. But it is difficult for them to establish a dialogue between these two parts. In the chain, the manufacturer does not necessarily notice when there is lower demand, and vice versa. This is one of the things the C2Net demonstrator helped correct in the use case we developed with the companies.

And what were the companies’ expectations for this project?  

Jacques Lamothe: For the C2Net project, each academic partner brought an industrial partner it had already worked with. And each of these SMEs had a different set of problems. In France, our partner for the project was Pierre Fabre. They were very interested in data collection and creating an alert system. On the Spanish side, this was less of a concern than optimizing planning. Every company has its own issues, and the use cases the industrial partners brought us meant we had to find solutions for everyone: from generating data on their supply chains to creating tools to allow them to manage alerts or planning.

To what extent has your research work had an impact on the companies’ structures and the way they are organized?

Frédérick Benaben: What was smart about the project is that we did not propose the C2Net demonstrator as a cloud platform that would replace companies' existing systems. Everything we did sits a level above the organizations, so that they are not impacted, and integrates with the existing systems, especially the information systems already in place. So the companies themselves did not have to change. This also explains why we had to work so hard on interoperability.

What did the work on interoperability involve?

Frédérick Benaben: There were two important interoperability issues. The first was being able to plug into existing systems in order to collect information and understand what was collected. A company may have different subcontractors, all of whom use different data formats. How can a company understand and use the data from subcontractor A, which is provided in one language, and that of subcontractor B, which is provided in another? We therefore had to propose data reconciliation plans.

The second issue involves interpretation. Once the data has been collected and everyone is speaking the same language, or at least can understand one another, how can common references be established? For example, having everyone speak in liters for quantities of liquids instead of vials or bottles. Or, when a subcontractor announces that an item may potentially be out of stock, what does this really mean? How far in advance does the subcontractor notify its customers? Does everyone have the same definition? All these aspects had to be harmonized.
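As a rough illustration of this second step — translating quantities reported in vials or bottles into a single reference unit — here is a short Python sketch. The conversion factors and record format are invented for the example and are not taken from the project:

```python
# Hypothetical conversion table: each subcontractor reports in its own unit.
TO_LITERS = {
    "liter": 1.0,
    "vial": 0.01,     # assuming a 10 ml vial
    "bottle": 0.75,   # assuming a 75 cl bottle
}

def to_common_reference(record):
    """Translate one subcontractor's record into the shared 'liters' reference."""
    return {
        "product": record["product"],
        "liters": record["quantity"] * TO_LITERS[record["unit"]],
    }

records = [
    {"product": "syrup", "quantity": 4_000, "unit": "vial"},   # subcontractor A
    {"product": "syrup", "quantity": 60, "unit": "bottle"},    # subcontractor B
]
print([to_common_reference(r) for r in records])
# [{'product': 'syrup', 'liters': 40.0}, {'product': 'syrup', 'liters': 45.0}]
```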

How will these results be used?

Jacques Lamothe: The demonstrator has been installed at the University of Valencia in Spain and should be reused for research projects. As for us, the results have opened up new research possibilities. We want to go beyond a tool that can simply detect future problems or allow companies to be notified. One of our ideas is to work on solutions that enable more or less automated decisions to adjust the supply chain.

Frédérick Benaben: A spin-off has also been developed in Portugal. It uses a portion of the data integration mechanisms to propose services for SMEs. And we are still working with Pierre Fabre too, since their feedback has been very positive. The demonstrator helped them see that it is possible to do more than what they are currently able to do. In fact, we have developed and submitted a partnership research project with them.


 


What is artificial intelligence?

Artificial intelligence (AI) is a hot topic. In late March, the French government organized a series of events dedicated to this theme, the most notable of which was the publication of the report “For a Meaningful Artificial Intelligence,” written by Cédric Villani, a mathematician and member of the French parliament. The buzz around AI coincides with companies’ and scientists’ renewed interest in the topic. Over the last few years AI has become fashionable again, as it was in the 1950s and 1960s. But what does this term actually refer to? What can we realistically expect from it? Anne-Sophie Taillandier, director of IMT’s TeraLab platform dedicated to big data and AI, is working on innovations and technology transfer in this field. She was recently listed as one of the top 20 individuals driving AI in France by L’Usine Nouvelle. She sat down with us to present the basics of artificial intelligence.

 

How did AI get to where it is today?

Anne-Sophie Taillandier: AI has played a key role in innovation questions for two or three years now. What has helped create this dynamic are closer ties between two scientific fields: information sciences and big data, both of which focus on the question, “How can information be extracted from data, whether big or small?” The results have been astonishing. Six years ago, we were only able to automatically recognize tiny pieces of images. When deep learning was developed, the recognition rate skyrocketed. But if we have been able to use the algorithms on large volumes of images, it is because of hardware that has made it possible to perform the computations in a reasonable amount of time.

What technology is AI based on?

AST: Artificial intelligence is the principle of extracting and processing information. This requires tools and methods. Machine learning is a method that brings together highly statistical techniques such as neural networks. Deep learning is another technique that relies on deeper neural networks. These two methods have some things in common; what makes them different is the tools chosen. In any event, both technologies are based on the principle of learning. The system learns from an initial database and it is then used on other data. The results are assessed so that the system can keep learning. But AI itself is not defined by these technologies. In the future, there may be other types of technology which will also be considered artificial intelligence. And even today, researchers in robotics sometimes use different algorithms.
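The learning principle described here — learn from an initial database, apply the model to data it has never seen, then measure the results — can be sketched in a few lines with scikit-learn. This is a generic illustration on a standard digits dataset, not TeraLab code:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 1. An initial database to learn from, and held-out data the system will be used on.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# 2. Learning phase: a small neural network fits the training examples.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# 3. Assessment phase: results on unseen data show how well the system has learned.
print(f"accuracy on unseen images: {model.score(X_test, y_test):.2f}")
```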

Can you give some specific examples of the benefits of artificial intelligence?

AST: The medical sector is a good illustration. In medical imaging, for example, we can teach an algorithm to detect cancerous tumors. It can then help doctors look for parts of an image that require their attention. We can also adjust a patient’s treatment depending on a lot of different data: is he alone or does he have a support network? Is he active or inactive? What is his living environment like? All these aspects contribute to personalized medicine, which has only become possible because we know how to process all this data and automatically extract information. For now, artificial intelligence is mainly used as a decision-making aid. Ultimately, it’s a bit like what doctors do when they ask patients questions, but in this case we help them gather information from a wide range of data. With AI, the goal is first and foremost to reproduce something that we know very well.

How can we distinguish between solutions that involve AI and others?

AST: I would say that it’s not really important. What matters is if using a solution provides real benefits. This question often comes up with chatbots, for example. Knowing whether AI is behind them or not — whether it’s just a decision tree based on a previous scenario or if it’s a human — is not helpful. As a consumer, what’s important to me is that the chatbot in front of me can answer my questions. They’re always popping up on sites now, which is frustrating since a lot of the time they are not particularly useful! So it is how a solution is used that really matters, more than the technology behind it.

Does the fact that AI is “trendy” adversely affect important innovations in the sector?

AST: With TeraLab we are working on very advanced topics with researchers and companies seeking cutting-edge solutions. If people exaggerate in their communication materials or use the term “artificial intelligence” in their keywords, it doesn’t affect us. I’d rather that the public become familiar with the term and think about the technology already present in their smartphones than fantasize about something inaccessible.

 

Atmospheric reactive trace gases: low concentrations, major consequences

Despite only being present in very small quantities, trace gases leave their mark on the atmospheric composition. Since they are reactive, they may lead to the formation of secondary compounds such as ozone or aerosols that have a significant impact on health and the climate. IMT Lille Douai is a partner in the ACTRIS H2020 project, which aims to carry out long-term observations of trace gases, aerosols and clouds to better understand how they interact with one another and how they impact the climate and air quality.

 

Take some nitrogen, add a dose of oxygen, sprinkle in some argon and a few other inert gases, add a touch of water vapor and a pinch of carbon dioxide and you have the Earth’s atmosphere, or almost! Along with this mix composed of approximately 78% nitrogen, an honorable 21% oxygen, less than 1% argon and 0.04% carbon dioxide, you will also find trace gases with varying degrees of reactivity.  Emitted by both anthropogenic and natural sources, these gases exist in concentrations in the nanogram range, meaning 0.000000001 gram per cubic meter of the atmosphere. Does this mean they are negligible? Not really! “Once emitted these gases are not inert, but reactive,” explains Stéphane Sauvage, a researcher in atmosphere sciences and environmental technology at IMT Lille Douai. “They will react with one another in the atmosphere and lead to the formation of secondary species, such as ozone or certain aerosols that have a major impact on health and the climate.” This is why it is important to be able to identify and measure the precise quantity of these gases in the atmosphere.

ACTRIS (Aerosols, Clouds and Trace Gases Research Infrastructure) is a large-scale H2020 project which brings together 24 countries and over 100 laboratories, including IMT Lille Douai, as part of ESFRI (the European Strategy Forum on Research Infrastructures). By combining ground-based and satellite measurements, the aim is to carry out long-term observations of the composition of the atmosphere to better understand the factors behind the contaminants and their impact on the climate and air quality. In terms of innovation, the project seeks to develop new techniques and methods of observation. “At IMT Lille Douai, we have been developing our skills in ground-based observation of trace gases for many years, which has led to our being identified as contributors with extensive expertise on the topic,” says Stéphane Sauvage.

 

Gases that leave a mark on the atmosphere

Trace gases, which come from automobile exhausts, household heating, agricultural activities and emissions from plants and volcanoes, are good “tracers,” meaning that when they are measured, it is possible to identify their original source. But out of the 200 to 300 different species of trace gases that have been identified, some are still little-known since they are difficult to measure. “There are some very reactive species that play a key role in the atmosphere, but with such short lifetimes or in such low concentrations that we are not able to detect them,” explains Stéphane Sauvage.

Sesquiterpenes, a family of trace gases, are highly reactive. Emitted from vegetation, they play an important role in the atmosphere but remain difficult to quantify with current methods. “These gases have a very short lifetime, low atmospheric concentrations and they degrade easily during sample collection or analysis,” says Stéphane Sauvage.

On the other hand, some species, such as ethane, are well-known and measurable. Ethane results from human activity and has a low level of reactivity, but this does not make it any less problematic. It is present at a non-negligible level on a global scale and has a real impact on the formation of ozone. “We recently published an article in the Nature Geoscience journal about the evolution of this species and we realized that its emissions have been underestimated,” notes Stéphane Sauvage.

 

Complex relationships between aerosols, clouds and trace gases

In addition, by reacting with other atmospheric compounds, trace gases can lead to the formation of aerosols, which are suspensions of fine particles. Due to their capacity to absorb light, these particles impact the climate but can also penetrate the respiratory system leading to serious health consequences. “Although natural and anthropogenic sources are partially responsible for these fine particles, they are also produced during reactions with reactive trace gases through complex processes which are not yet entirely understood,” explains Stéphane Sauvage. This illustrates the importance of the ACTRIS project, which will observe the interactions between trace gases and aerosols, as well as clouds, which are also affected by these compounds.

Read more on I’MTech: What are fine particles?

The measurements taken as part of the ACTRIS project will be passed on to a number of different players including weather and climate operational services, air quality monitoring agencies, the European Space Agency and policy-makers, and will also be used in agriculture, healthcare and biogeosciences. “The ACTRIS infrastructure is currently being built. We will enter the implementation phase in 2019, then the operational phase will begin in around 2025 and will last 25 years,” says Stéphane Sauvage. This is a very long-term project to organize research on a European scale, drawing on the complementary skills of over 100 research laboratories from 24 countries — to take atmospheric sciences to a stratospheric level!

 

[box type=”shadow” align=”” class=”” width=””]

A workshop on data from observations of reactive trace gases

Engineers and researchers from ten European countries met at IMT Lille Douai from 16 to 18 May for the annual ACTRIS project workshop on reactive trace gases. The objective was to review the data collected in Europe in 2017 and to discuss its validity along with the latest scientific and technical developments. All the players involved in making ground-based measurements of trace gases, aerosols and clouds will meet at IMT Lille Douai in October. Learn more

[/box]

 


H2sys: hydrogen in the energy mix

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot institute (TSN), to which IMT and Femto Engineering belong.

[divider style=”normal” top=”20″ bottom=”20″]

H2sys is helping make hydrogen an energy of the future. This spin-off company from the FCLAB and Femto-ST laboratories in Franche-Comté offers efficient solutions for integrating hydrogen fuel cells. Some examples of these applications include generators and low-carbon urban mobility. And while the company was officially launched only 6 months ago, its history is closely tied to the pioneers of hydrogen technology from Franche-Comté.

 

1999, the turn of the century. Political will was focused on the new millennium and energy was already a major industrial issue. The end of the 90s marked the beginning of escalating oil prices after over a decade of price stability. In France, the share of investment in nuclear energy was waning. The quest for other forms of energy production had begun, a search for alternatives worthy of the 2000s. This economic and political context encouraged the town of Belfort and the local authorities of the surrounding region to invest in hydrogen. Thus, the FCLAB research federation was founded, bringing together the laboratories working on this theme. Almost two decades later, Franche-Comté has become a major hub for the discipline. FCLAB is the first national applied research community to work on hydrogen energy and the integration of fuel cell systems. It also integrates a social sciences and humanities research approach which looks at how our societies adopt new hydrogen technologies. This federation brings together 6 laboratories including FEMTO-ST and is under the aegis of 10 organizations, including the CNRS.

It was from this hotbed of scientific activity that H2sys was born. Described by Daniel Hissel, one of its founders, as “a human adventure”, the young company’s history is intertwined with that of the Franche-Comté region. First, because it was created by scientists from FCLAB. Daniel Hissel is himself a professor at the University of Franche-Comté and leads a team of researchers at Femto-ST, both of which are partners of the federation. Secondly, because the idea at the heart of the H2sys project grew out of regional activity in the field of hydrogen energy. “As a team, we began our first discussions on the industrial potential of hydrogen fuel cell systems as early as 2004-2005,” Daniel Hissel recalls. The FCLAB teams were already working on integrating these fuel cells into energy production systems. However, the technology was not yet sufficiently mature. The fundamental work did not yet target large-scale applications.

Ten more years would be needed for the uses to develop and for the hydrogen fuel cell market to truly take shape. In 2013, Daniel Hissel and his colleagues watched intently as the market emerged. “All that time we had spent working to integrate the fuel cell technology provided us with the necessary objectivity and allowed us to develop a vision of the future technical and economic issues,” he explains. The group of scientists realized that it was the right time to start their business. They created their project the same year. They quickly received support from the Franche-Comté region, followed by the Technology Transfer Accelerator (SATT) in the Grand Est region and the Télécom & Société Numérique Carnot institute. In 2017, the project officially became the company H2sys.

Hydrogen vs. Diesel?

The spin-off now offers services for integrating hydrogen fuel cells based on its customers’ needs. It focuses primarily on generators ranging from 1 to 20 kW. “Our goal is to provide electricity to isolated sites to meet needs on a human scale,” says Daniel Hissel. The applications range from generating electric power for concerts or festivals to supporting rescue teams responding to road accidents or fires. The solutions developed by H2sys integrate expertise from FCLAB and Femto-ST, whose research involves work in system diagnosis and prognosis aimed at understanding and anticipating failures, lifespan analysis, predictive maintenance and artificial intelligence for controlling devices.

Given their uses, H2sys systems are in direct competition with traditional generators which run on combustion engines—specifically diesel. However, while the power ranges are similar, the comparison ends there, according to Daniel Hissel, since the hydrogen fuel cell technology offers considerable intrinsic benefits. “The fuel cell is powered by oxygen and hydrogen, and only emits energy in the form of electricity and hot water,” he explains. The lack of pollutant emissions and exhaust gas means that these generators can be used inside as well as outside. “This is a significant benefit when indoor facilities need to be quickly installed, which is what firefighters sometimes must do following a fire,” says the co-founder of the company.
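In textbook terms (this is standard hydrogen fuel cell chemistry, not anything specific to H2sys), the overall reaction behind that statement is:

$$ 2\,\mathrm{H_2} + \mathrm{O_2} \;\longrightarrow\; 2\,\mathrm{H_2O} + \text{electrical energy} + \text{heat} $$

The only by-products are water and heat, which is why there is no exhaust gas to evacuate.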

Another argument is how unpleasant it is to work near a diesel generator. Anyone who has witnessed one in use understands just how much noise and pollutant emissions the engine generates. Hydrogen generators, on the other hand, are silent and emit only water. Their maintenance is also easier and less frequent: “Within the system, the gases react through an electrolyte membrane, which makes the technology much more robust than an engine with moving parts,” Daniel Hissel explains. All of these benefits make hydrogen fuel cells an attractive solution.

In addition to generators, H2sys also works on range extenders. “This is a niche market for us because we do not yet have the capacity to integrate the technology into most vehicles,” the researcher explains. However, the positioning of the company does illustrate the existing demand for solutions that integrate hydrogen fuel cells. Daniel Hissel sees even more ambitious prospects. While the electric yield of these fuel cells is much better than that of diesel engines (55% versus 35%), the hot water they produce can also be recovered for various purposes. Many different options are being considered, including a water supply network for isolated sites, or for household consumption in micro cogeneration units for electricity and heating.

But finding new uses through intelligent integrations is not the only challenge facing H2sys. As a spin-off company from research laboratories, it must continue to drive innovation in the field. “With FCLAB, we were the first to work on diagnosing hydrogen fuel cell systems in the 2000s,” says Daniel Hissel. “Today, we are preparing the next move.” Their sights are now set on developing better methods for assessing the systems’ performance to improve quality assurance. By contributing to making the technology safer, H2sys is heavily involved in developing fuel cells. And the technology’s maturation since the early 2000s is now producing results: hydrogen is now attracting the attention of manufacturers for the large-scale storage of renewable energies. Will this technology therefore truly be that of the new millennium, as foreseen by the pioneers of the Franche-Comté region in the late 90s? Without going that far, one thing is certain: it has earned its place in the energy mix of the future.

 

[box type=”shadow” align=”” class=”” width=””]

A guarantee of excellence
in partnership-based research since 2006

 

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]


Startups AskHub, DessIA and WaToo receive interest-free loans

On June 7, the Digital Fund of the Grandes Ecoles and Universities Initiative selected three new startups to receive interest-free loans. AskHub and DessIA, from ParisTech Entrepreneurs, the Télécom ParisTech incubator, and WaToo, from the IMT Atlantique incubator, will each receive a €20,000 interest-free loan. These financial aid programs, co-financed by Fondation Mines-Télécom, la Caisse des Dépôts and Revital’Emploi, provide these growing companies with the funds they need to pursue their development.

 

[one_half][box type=”shadow” align=”” class=”” width=””]askhub

AskHub is a platform that analyzes requests that were not understood by chatbots and then offers an ecosystem of ready-to-use chat plug-ins to improve the user experience. Find out more

[/box]

[/one_half]

[one_half_last]

[box type=”shadow” align=”” class=”” width=””]DessIA

DessIA is design software for mechanical engineering. Using an approach based on artificial intelligence, the software can select the solution best adapted to users’ needs from among billions of possibilities. Find out more

[/box]

[/one_half_last]

[box type=”shadow” align=”” class=”” width=””]

logo watoo

WaToo offers a solution to prevent the misappropriation and falsification of sensitive documents by authorized users: it conceals digital watermarks in the documents to protect them. Find out more

[/box]

Learn more about previous winners

 


GDPR comes into effect. Now it’s time to think about certification seals!

The new European General Data Protection Regulation (GDPR) comes into effect on May 25. Out of the 99 articles contained in the regulation, two are specifically devoted to the question of certification. While establishing seals to demonstrate compliance with the regulation seems like a good idea in order to reassure citizens and economic stakeholders, a number of obstacles stand in the way.

 

Certification marks are ubiquitous these days since they are now used for all types of products and services. As consumers, we have become accustomed to seeing them everywhere: from the organic farming label for products on supermarket shelves to Energy certification for appliances. They can either be a sign of compliance with legislation, as is the case for CE marking, or a sign of credibility displayed by a company to highlight its good practices. While it can sometimes be difficult to make sense of the overwhelming number of seals and marks that exist today, some of them represent real value. AOC appellations, for example, are well-known and sought out by many consumers. So, why not create seals or marks to display responsible personal data management?

While this may seem like an odd question to citizens who see these seals as nothing more than red labels on free-range chicken packaging, the European Union has taken it into consideration. So much so that Articles 42 and 43 of the new General Data Protection Regulation (GDPR) are devoted to this idea. The creation of seals and marks is encouraged by the text in order to enable companies established in the EU that process citizens’ data responsibly to demonstrate their compliance with the regulation. On paper, everything points to the establishment of clear signs of trust in relation to personal data protection.

However, a number of institutional and economic obstacles stand in the way.  In fact, the question of seals is so complicated that IMT’s Personal Data Values and Policies Chair* (VPIP) has made it a separate research topic, especially in terms of how the GDPR affects the issue. This research, carried out between the adoption of the European text on April 14, 2016 and the date it is set to come into force, May 25, 2018, has led to the creation of a work of more than 230 pages entitled Signes de confiance : l’impact des labels sur la gestion des données personnelles (Signs of Trust — the impact of seals on personal data management).

For Claire Levallois-Barth, a researcher in Law at Télécom ParisTech and coordinator of the publication, the complexity stems in part from the number and heterogeneity of personal data protection marks. In Europe alone, there are at least 75 different marks, with a highly uneven geographic distribution. “Germany alone has more than 41 different seals,” says the researcher. “In France, we have nine, four of which are granted by the CNIL (National Commission for Computer Files and Individual Liberties).” Meanwhile, the United Kingdom has only two and Belgium only one. Each country has its own approach, largely for cultural reasons. It is therefore difficult to make sense of such a disparate assortment of marks with very different meanings.

Seals for what?

Because one of the key questions is: what should the seal describe? Services? Products? Processes within companies? “It all depends on the situation and the aim,” says Claire Levallois-Barth. Until only recently, the CNIL granted the “digital safe box” seal to certify that a service respected “the confidentiality and integrity of data that is stored there” according to its own criteria. At the same time, the Commission also has a “Training” seal that certifies the quality of training programs on European or national legislative texts. Though both were awarded by the same organization they do not have the same meaning. So saying that a company has been granted “a CNIL seal” provides little information. One must delve deeper into the complexity of these marks to understand what they mean, which seems contradictory to the very principle of simplification they are intended to represent.

One possible solution could be to create general seals to encompass services, internal processes and training for all individuals responsible for data processing at an organization. However, this would be difficult from an economic standpoint. For companies it could be expensive — or even very expensive — to have their best practices certified in order to receive a seal. And the more services and teams there are to be certified, the more time and money companies would have to spend to obtain this certification.

On March 31, 2018, the CNIL officially transitioned from a labeling activity to a certification activity.

The CNIL has announced that it will stop awarding seals for free. “The Commission has decided that once the GDPR comes into effect it will concentrate instead on developing or approving certification standards. The seals themselves will be awarded by accredited certification organizations,” explains Claire Levallois-Barth. Afnor Certification or Bureau Veritas, for example, could offer certifications for which companies would have to pay. This would allow them to cover the time spent assessing internal processes and services, analyzing files, auditing information systems, etc.

And for all the parties involved, the economic profitability of certification seems to be the crux of the issue. In general, companies do not want to spend tens of thousands, or even hundreds of thousands, of euros on certification just to receive a little-known seal. Certification organizations must therefore find the right formula: comprehensive enough to make the seal valuable, but without representing too much of an investment for most companies.

While it seems unlikely that a general seal will be created, some stakeholders are examining the possibility of creating sector-specific seals based on standards recognized by the GDPR, for cloud computing for example. This could occur if criteria were approved, either at the national level by a competent supervisory authority within a country (the CNIL in France), or at the European Union level by the European Data Protection Board (EDPB). A critical number of seals would then have to be granted. GDPR sets out two options for this as well.

According to Article 43 of the GDPR, certification may either be awarded by the supervisory authorities of each country, or by private certification organizations. In France, the supervisory authority is the CNIL, and certification organizations include Afnor and Bureau Veritas. These organizations are themselves monitored. They must be accredited either by the supervisory authority, or by the national accreditation body, which is the COFRAC in France.

This naturally leads to the question: if the supervisory authorities develop their own sets of standards, will they not tend to favor the accreditation of organizations that use these standards? Eric Lachaud, a PhD student in Law and Technology at Tilburg and guest at the presentation of the work by the Personal Data Values and Policies Chair on March 8, says, “this clearly raises questions about competition between the sets of standards developed by the public and private sectors.” Sophie Nerbonne, Director of Compliance at the CNIL, who was interviewed at the same event, says that the goal of the national commission is “not to foreclose the market but to draw on [its] expertise in very precise areas of certification, by acting as a data protection officer.”

A certain vision of data protection

It should be acknowledged, however, that the area of expertise of a supervisory authority such as the CNIL, a pioneer in personal data protection in Europe, is quite vast. Beyond serving as a data protection officer and being responsible for ensuring compliance with the GDPR within an organization that has appointed it, as an independent authority the CNIL is in charge of regulating issues involving personal data processing, governance and protection, as indicated by the seals it has granted until now. It is therefore hard to imagine that the supervisory authorities would not emphasize their large area of expertise.

And even more so since not all the supervisory authorities are as advanced as the CNIL when it comes to certification in relation to personal data. “So competition between the supervisory authorities of different countries is an issue,” says Eric Lachaud. Can we hope for a dialogue between the 28 Member States of the European Union in order to limit this competition? “This leads to the question of having mutual recognition between countries, which has still not been solved,” says the Law PhD student. As Claire Levallois-Barth is quick to point out, “there is a significant risk of ‘a race to the bottom’.” However, there would be clear benefits. By recognizing the standards of each country, the countries of the European Union have the opportunity to give certification a truly transnational dimension, which would make the seals and marks valuable throughout Europe, thereby making them shared benchmarks for the citizens and companies of all 28 countries.

The high stakes of harmonization extend beyond the borders of the European Union. While the CE standard is criticized at times for how easy it is to obtain in comparison to stricter national standards, it has successfully imposed certain European standards around the world.  Any manufacturer that hopes to reach the 500 million-person market that the European Union represents must meet this standard. For Éric Lachaud, this provides an example of what convergence between the European Member States can lead to: “We can hope that Europe will reproduce what it has done with CE marking: that it will strive to make the voices of the 28 states heard around the world and to promote a certain vision of data protection.”

The uncertainties surrounding the market for seals must be offset by the aims of the GDPR. The philosophy of this regulation is to establish strong legislation for technological changes with a long-term focus. In one way, Articles 42 and 43 of the GDPR can be seen as a foundation for initiating and regulating a market for certification. The current questions being raised then represent the first steps toward structuring this market. The first months after the GDPR comes into effect will define what the 28 Member States intend to build.

 

*The Personal Data Values and Policies Chair brings together the Télécom ParisTech and Télécom SudParis graduate schools and Institut Mines-Télécom Business School. It is supported by Fondation Mines-Télécom.

[box type=”info” align=”” class=”” width=””]

Personal data certification seals – what is the point?

For companies, having a personal data protection seal allows them to meet the requirements of accountability imposed by article 24 of the GDPR. It requires all organizations responsible for processing data to be able to demonstrate compliance with the regulation. This requirement also applies to personal data subcontractors.

This is what leads many experts to think that the primary application for seals will be business-to-business relationships rather than business-to-consumer relationships. SME economic stakeholders could seek certification in order to meet growing demand amongst their customers, especially major firms, for compliance in their subcontracting operations.

Nevertheless, the GDPR is a European regulation. This means that compliance is assumed: all companies are supposed to abide by the regulation as soon as it comes into effect. A compliance seal cannot therefore be used as a marketing tool. It is, however, likely that the organizations responsible for establishing certification standards will choose to encourage seals that go beyond the requirements of the GDPR. In this case, stricter control over personal data processing than what is called for by the legislation could be a valuable way to set a company apart from its competitors. [/box]


Astatine halogen bond finally revealed!

Astatine is the last member of the halogen family, which also includes fluorine and chlorine. These chemical elements have a distinct feature: they are able to form an unusual kind of bond with molecules. Yet for astatine, the existence of this specific halogen bond had never before been proven. This is because in its natural state, astatine is the rarest element on Earth. Now all that has changed. The bond was revealed thanks to work by Subatech (a research unit including IMT Atlantique, CNRS and the University of Nantes) and the CEISAM. Their results were published in the prestigious journal Nature Chemistry on March 19.

 

Fluorine, chlorine, bromine, iodine? Toothpaste, swimming pool, photographic film, fish! Four chemical elements, and four objects that even those of us who are unscientific can associate with them. At first glance, brushing your teeth and swimming have little in common. And yet the four chemical elements mentioned above are all part of the same family: the halogen family. In fact, incandescent “halogen” lamps owe their name to the iodine and bromine contained in their bulbs. This just proves that in our lives we are often in contact with halogens, sometimes daily, for example with the chlorine that makes up half of our table salt (the other half is sodium). These elements are also well known to chemists, who in the early 20th century brought to light their ability to create an unusual type of bond: the halogen bond. These bonds are weaker than typical chemical bonds yet are significant enough to play a role in the development of liquid crystals, conductive polymers, and nanoporous materials.

However, this chemical family includes an element that is more discreet. Astatine, the fifth and last member of the group, is not very sociable: you are unlikely to have come across this one during your lifetime. This is because scientists estimate the entire amount of astatine in the Earth’s crust at less than 30 grams. It is the least abundant of all the elements naturally present on Earth. Its scarcity makes it difficult to study, and researchers even questioned its ability to form halogen bonds. The mystery was even more intriguing since experience so far had shown a link between an atom’s properties and the strength of the halogen bonds it forms—astatine was suspected to produce the strongest halogen bonds. However, this still needed to be proven experimentally.

Now this has been accomplished, thanks to the Subatech and CEISAM teams, research laboratories which include participants from IMT Atlantique, CNRS and the University of Nantes. The work published on March 19 in the prestigious Nature Chemistry journal not only revealed astatine’s ability to create a halogen bond but confirmed that it is the strongest of its kind.  These results greatly contribute to understanding this element which is so difficult to study due to its extreme rarity. “The halogen bond shows that it is possible to form stable molecular structures with astatine,” explains Julie Champion, a chemist at Subatech. “This is very interesting for alpha-immunotherapy applications in particular.”

This radiotherapy technique involves introducing molecules which emit specific radioactive radiation—alpha particles—into the body to target cancer cells, for example. Since some astatine isotopes are radioactive and emit alpha radiation, the element is considered to be a good choice for alpha-immunotherapy. Moreover, its scarcity stems from its short lifespan: after approximately eight hours, half of a sample of astatine-210, the longest-lived isotope, has already disintegrated. This characteristic represents a great advantage for treatment, since astatine’s rapid disintegration limits side effects. Yet difficulties remain despite the first encouraging in vitro and in vivo attempts. The revelation of astatine’s potential to form halogen bonds opens new areas to explore with the goal of strengthening the connection between astatine and new biomolecules, which could potentially lead to more effective protocols for alpha-immunotherapy.

Working with the rarest element in the world

To demonstrate the existence of this bond, researchers had to adapt to the chemical element’s constraints. “We worked with isotope 211, which has an even shorter lifetime than isotope 210: after three days there is not enough astatine left for the experiments,” Julie Champion explains. The chemists had to be cunning. First, it is impossible to extract a few grams of astatine from the Earth’s crust; it must be produced artificially. “This is why we work with the Arronax cyclotron in Nantes. This instrument is a particle accelerator that produces the alpha particles we use to bombard a target containing bismuth atoms. The resulting nuclear reaction produces astatine,” the researcher explains. The race against time then begins for the synthetic halogen.
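To put this race against time in numbers: astatine-211’s half-life is roughly 7.2 hours (a literature value, not stated in the interview), so a back-of-the-envelope calculation shows why three days is the practical limit:

```python
half_life_h = 7.2        # approximate half-life of astatine-211 (assumed literature value)
elapsed_h = 3 * 24       # three days of transport and experiments

remaining_fraction = 0.5 ** (elapsed_h / half_life_h)
print(f"fraction left after 3 days: {remaining_fraction:.5f}")   # ~0.00098, i.e. about 0.1%
```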

It is then extracted in chloroform and transported by truck to Subatech. The precious radioactive cargo must first undergo radiation protection inspections before being inserted into small test tubes which will be used for the radiochemistry experiments. “It is important to understand that we are working at the ultra-trace level,” explains Julie Champion. The quantities are so low that we cannot see the element we are studying and cannot study it using the usual spectroscopic methods.

How then can astatine’s halogen bond be revealed in this situation? For the solution, the chemists employed techniques from nuclear metrology. These tools make it possible to detect radioactivity within the sample. Two immiscible liquid phases are present in the test tubes. The first is an aqueous phase, containing the chemical species that are soluble in water. The second phase is called organic and contains the chemical species which are not soluble in water. The principle is the same as that of a drop of oil in a glass of water: the two liquids do not mix. Astatine, as it is extracted from the cyclotron, is usually present in the aqueous phase. However, when it forms a halogen bond with a molecule composed primarily of carbon and hydrogen, the astatine returns to the organic phase. By observing the radioactive emissions—in other words astatine’s signature—in both phases, the chemists were able to see astatine transitioning from one phase to the other.

In addition to the alpha-immunotherapy applications, this discovery paves the way for further research. In the experiment that was carried out, the Subatech and CEISAM teams created a classical bond between astatine and an iodine atom before forming a halogen bond with a molecule. Yet iodine can also form a halogen bond! Could it therefore be possible to create two halogen bonds, one with astatine and the other with iodine? This is the type of question that Julie Champion and her colleagues hope to study soon.

 


How working classes use digital tools: The Facebook example

For over a decade now, the use of digital tools and internet connectivity has grown considerably among households, including working-class households. Yet very few studies exist on this part of the population’s specific uses of digital technology. In the context of the Poplog project, which counts Télécom ParisTech among its partners, Dominique Pasquier, a researcher in sociology, has studied this question through interviews and with the help of a data set from Facebook accounts.*

 

Among low-income households, internet connectivity figures have skyrocketed. According to INSEE (the French national institute for statistics and economic studies), in 2006, 47.9% of employees and 37% of manual workers had access to the Internet at home. Ten years later, these figures had risen to 88% among manual workers and 91.5% among employees. Within 10 years, internet use became fully integrated into the daily lives of the working classes.

Yet, within the social sciences, barely any studies have focused on how the working classes relate to digital technology. “There is no reason to believe that internet uses are the same at the top and bottom of the social ladder,” explains Dominique Pasquier, researcher in sociology at Télécom ParisTech and Director of Research at the CNRS (the French national center for scientific research).

This observation is what led to the creation of the Poplog project. Funded by the ANR (the French National Research Agency), the partners for this project include Télécom ParisTech, the Centre Atlantique de Philosophie and Université de Bretagne Occidentale. The researchers looked at the use of digital technology among working classes with stable employment. Unlike very low-income classes that live on the outskirts of urban areas, the studied individuals live in rural areas and most own their own home. “This fraction of the population consists primarily of traditional families; there are very few single-parent families,” Dominique Pasquier explains. “In general, few have earned degrees and they work as manual workers or employees.”

In the framework of this project, in order to study this category of the population and its relationship with digital tools, Dominique Pasquier looked specifically at how they use Facebook.

 

Data from Facebook accounts as research material

The researcher in sociology first attempted to collect information using various survey methods, particularly interviews. Yet very few people responded positively to requests for interviews. These difficulties are common in general sociology, according to Dominique Pasquier, especially when the study focuses on the working classes. “These individuals do not have a clear understanding of what sociology is and do not see the point of these discussions,” she notes. “And this is a group that primarily welcomes family to their homes, but not strangers. Therefore, we face a rejection phenomenon.”

This problem was avoided thanks to another project called Algopol, led by the Center for Social Analysis and Mathematics, Orange Labs France Télécom, LIAFA and Linkfluence from 2012 to 2015. The team carried out a major survey on the Facebook networks and recorded and anonymized data from approximately 15,000 accounts. Only 50 of the 15,000 accounts matched the social profiles Poplog was interested in. This number was suited to a qualitative study of the data.

“The principle was that I was not allowed to meet the people who owned these accounts,” Dominique Pasquier explains. “The only information I had was their age, sex, municipality of residence, number of friends and the content they exchanged, excluding personal photos.” Yet this limited content was sufficient for conducting a sociological analysis of this data. Especially since this content complemented the information obtained during the interviews. “The two different formats do not provide the same insights,” the researcher continues. “The Facebook data reveals discussions in which the sociologist was not involved. Whereas during an interview, the person wants to give a good impression of themselves and therefore will not talk about certain subjects.”

Certain topics related to the use of digital technology were only available in the interviews, such as searches for information or online purchases. On the other hand, some topics were only available on Facebook, such as employment problems, or difficulties related to undesired singleness, a reality that affects unskilled male workers in particular.

 

Significant variations in how Facebook is used

“The 50 accounts were exactly what I was looking for: adults between 30 and 50 years old who live in rural areas and are manual workers or work in personal care services,” Dominique Pasquier explains. “This is where we saw that the uses of Facebook are extremely varied.” There were many different types of users: some attempt to use the social network but do not know what to say, do not receive enough feedback and give up. Others try their hardest to attract attention, sharing ready-made catchphrases and impressive links. Some are very prolific in sharing events from their daily life, whereas others never talk about this aspect.

However, certain behaviors and phenomena were frequently observed throughout this selection of accounts. “There is a whole set of phrases about life that express a kind of circulating common ethic. During the interviews, people called them ‘quotes’,” Dominique Pasquier explains. “Furthermore, when someone posts a status update, those who respond are intergenerational and both male and female.”

Finally, some things men shared about romantic difficulties, situations of undesired singleness or separation, caught Dominique Pasquier’s attention. She analyzed these comments and how others responded to them. “Some of what was shared was very aggressive, with misogynistic remarks. In this case, the comment always brought a response from the poster’s contacts, especially from women, who counteracted the remarks.”

The researcher’s goal was to analyze both what is shared on the social network and others’ reactions to it: “I analyze this content as things the individuals considered worthy of sharing and making known to their Facebook contacts who, in the context of this group of individuals from working classes with stable employment, are primarily made up of close friends and family.”

 

A different use of digital tools

“I think this survey also demonstrates that these individuals are faring well with the internet, but in a completely different way,” Dominique Pasquier explains. “In the case of Facebook, the social network is mainly used to maintain a kinship group.”

Through these interviews and analysis, the researcher noticed other specific features in the use of digital tools among the studied population. “It is a social universe that presents different uses and it is important for the public authorities to be aware of this,” says Dominique Pasquier. Public policy is indeed moving towards establishing fully digital means of communication via email with social assistance institutions like Pôle Emploi and the Caisse d’Allocations Familiales. This digital transformation poses a problem. In the course of her study, the researcher observed that the individuals she surveyed did not use email as a means of interpersonal communication; they used it only to make purchases. “These email addresses are shared by spouses or the entire family. With all the online purchases, the emails from Pôle Emploi will be lost among hundreds of spam emails and ads,” the researcher observes. “There is also a sort of rage that develops among this population, because of this inability to contact each other.”

This shows how important it is to continue this work on the issue of digital technology and its use by the working classes… while remaining vigilant. Although many sociology students are interested in studying digital corpora, these types of materials pose methodological problems. “Much of the data is anonymous; we often do not know who has produced it,” Dominique Pasquier explains. “Also, we often do not realize that 50% of online participation is produced by 1% of the population, by heavy contributors. We therefore mistake anecdotal occurrences for mass social phenomena.” Yet despite these challenges, digital data has “enormous potential, since we can work on large volumes of data and network phenomena…”, offering enough information to understand how certain social groups are structured.

 

[box type=”shadow” align=”” class=”” width=””]* The i3 seminar on digital data analysis methodologies in the social sciences

The Poplog project and Dominique Pasquier’s research were presented at the Methods for the Analysis of Online Participation Seminar, organized by i3, a joint CNRS research unit of which Télécom ParisTech is a member. This seminar, which will run through June 2018, focuses on issues surrounding methods for processing digital data for research in the humanities and social sciences. The discussions focus on how the corpus is formed, analysis methods and the relationship between digital data and conventional survey methods.[/box]

 


MT 180: 3D organ models facilitate surgery on children

Alessio Virzì, Biomedical Engineer – PhD student in Medical Image Processing, Télécom ParisTech – Institut Mines-Télécom, Université Paris-Saclay

The original version of this article (in French) was published on The Conversation, in connection with Alessio Virzì’s participation in the competition “My thesis in 180 seconds”.
[divider style=”normal” top=”20″ bottom=”20″]

What area have you chosen to focus on for your thesis?

I am interested in developing new tools for processing medical images for pediatric minimally invasive pelvic surgery.

This type of surgery involves a technique in which small incisions are made to enable the surgeon to reach the target area while limiting surgical trauma as much as possible. The technique involves robotic tools and a video imaging system that helps guide the surgeon’s movements. Due to the anatomical complexity of a child’s pelvis, the surgical planning stage, based on the study of medical imaging, is a crucial step.

What will your thesis work contribute to this area?

My thesis proposes new methods for processing medical images adapted to children with pelvic tumors or malformations. I am developing IT solutions for generating virtual 3D models of organs, tumors and nerve fibers based on MRI images.

Surgeons can view these 3D models before and during the surgery, thus improving the way they plan for the operation and providing more information than a simple video imaging system.

For example, I used artificial intelligence to analyze the pixels in the images to detect the different anatomical structures. I also used generic models of organs that I then adapted to the child to obtain the final 3D model.
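As a toy illustration of that pixel-classification idea — learning to label pixels from simple intensity-based features on an annotated image, then predicting on a new one — here is a short Python sketch. It uses synthetic data and a random forest rather than the models actually developed in the thesis, and every name in it is made up:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_slice():
    """Synthetic 'MRI slice': a bright disc (the organ) on a noisy background."""
    yy, xx = np.mgrid[0:64, 0:64]
    mask = (xx - 32) ** 2 + (yy - 32) ** 2 < 15 ** 2
    img = 0.3 + 0.5 * mask + 0.1 * rng.standard_normal((64, 64))
    return img.astype(np.float32), mask.astype(int)

def pixel_features(img):
    """One feature vector per pixel: its intensity and a crude local average."""
    padded = np.pad(img, 1, mode="edge")
    local_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:]) / 4
    return np.stack([img.ravel(), local_mean.ravel()], axis=1)

# Learn from one annotated slice...
train_img, train_mask = make_slice()
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(pixel_features(train_img), train_mask.ravel())

# ...then label the pixels of a new, unseen slice.
test_img, test_mask = make_slice()
pred = clf.predict(pixel_features(test_img)).reshape(test_img.shape)
print("pixel accuracy on the new slice:", (pred == test_mask).mean())
```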

I integrate all of these IT methodologies into interactive software that makes the surgeon the main actor in the image analysis process. In addition, thanks to this software, surgeons can easily verify the quality of the 3D models obtained and can make corrections as needed, based on their anatomical knowledge.

I developed this software based on existing open-source software that I improved by integrating specific models for MRI images of children’s pelvises.

It was very important for me to offer a tool that could easily be used in a clinical context by surgeons who are not specialized in computer science.

Example of a 3D model of the pelvis (right) obtained by processing MRI images (left).

What challenges have you faced along the way?

The first challenge was the limited amount of scientific literature on this topic, since this research area is underexplored. I therefore had to base my work on medical imaging studies on other anatomical structures like the adult brain.

Another major challenge was the need to develop methods that could be used in clinical practice. They needed to be extremely effective and easy for surgeons to use. This required additional efforts in the design and development of the software.

My communication with surgeons and radiologists played a crucial role in developing my research and allowed me to discover anatomical knowledge that I had not necessarily been aware of before and helped me understand their requirements for IT tools.

When did you decide to start a thesis?

My desire to do a thesis first arose during a research internship I did for my Master 2 in biomedical engineering, which provided an opportunity to work on new applications in neuroimaging.

In the future, I would like to continue working in the medical field because I find this area very motivating. My desire to find new applications has led me to explore the possibility of working in medical imaging in the private sector.

What are your thoughts on “My Thesis in 180 Seconds”?

I think the ability to share scientific knowledge is a key skill for researchers.

Unfortunately, I believe this aspect is not sufficiently present in scientific training. As researchers, we often have the unfortunate habit of using terms that are too specific and not accessible for non-scientists. Yet it is essential for us to help everyone understand what we are doing, both to demonstrate the importance of our work and stimulate its development.

This experience will definitely help me improve my skills in popularizing scientific knowledge and help me become more comfortable presenting this information to the public. It is also a very motivating challenge to have to present my thesis in a language that is not my first language and that I began using three years ago when I arrived here from Italy.