
20 terms for understanding quantum technology

Quantum mechanics is central to much of the technology we use every day. But what exactly is it? The 11th Fondation Mines-Télécom booklet explores the origins of quantum technology and reveals its practical applications, offering a better understanding of the issues at stake. To clarify the concepts addressed, the booklet includes a glossary, from which this list is taken.

 

Black-body radiation – Thermal radiation of an ideal object absorbing all the electromagnetic energy it receives.

Bra-ket notation (from the word bracket) – Formalism that facilitates the writing of equations in quantum mechanics.

Coherent detectors – Equipment used to detect photons based on the amplitude and phase of the electromagnetic signal rather than on interactions with other particles.

Decoherence – Phenomenon in which each possibility of a quantum superposition state interacts with its environment to a degree of complexity that destroys the coherence between the different possibilities, making them unobservable.

Entanglement – Phenomenon in which two quantum systems present quantum states that are dependent on one another, regardless of the distance separating them.

Locality (principle of) – The idea that two distant objects cannot directly influence each other.

Momentum – Product of the mass and the velocity vector of an object at a given time.

NISQ (Noisy Intermediate-Scale Quantum) – Current class of quantum computers, which have an intermediate-scale number of qubits and are subject to noise.

Observable (noun) – Concept in the quantum world comparable to a physical value (position, momentum, etc.) in the classical world.

Quanta – The smallest indivisible units (of energy, momentum, etc.).

Quantum Hall effect – The classical Hall effect is the voltage created across a material carrying an electric current while immersed in a magnetic field. Under certain conditions, this voltage increases in discrete increments; this is the quantum Hall effect.

Quantum state – A concept that differs from that of a classical physical system, in which measured physical values such as position and speed are sufficient to define the system. A quantum state provides a probability distribution for each observable of the quantum system to which it refers.

Quantum system – Refers to an object studied in a context in which its quantum properties are of interest, such as a photon, a collection of particles, etc.

Qubit – Refers to a quantum system in which a given observable (the spin, for example) is a superposition of two independent quantum states (see the worked example after this glossary).

Spin – An intrinsic property of particles, like the electric charge.

Superposition principle – Principle according to which a quantum state can simultaneously hold several values for a given observable.

The Schrödinger wave function – A fundamental concept of quantum mechanics, a mathematical function representing the quantum state of a quantum system.

Uncertainty Principle – Mathematical inequality that expresses a fundamental limit to the precision with which two physical properties of the same particle can be simultaneously known.

Wave function collapse – Fundamental concept of quantum mechanics that states that after a measurement, a quantum system’s state is reduced to what was measured.

Wave-particle duality (or wave-corpuscle duality) – The principle that a physical object sometimes has wave properties and sometimes corpuscular properties.
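To make the bra-ket notation, the qubit and the uncertainty principle more concrete, here is a short worked illustration (standard textbook relations added by the editor, not taken from the booklet). A qubit state written in bra-ket notation is a superposition of the two basis states |0⟩ and |1⟩:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

where \(|\alpha|^2\) and \(|\beta|^2\) are the probabilities of measuring 0 or 1. The uncertainty principle, for position \(x\) and momentum \(p\), reads

\[
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}.
\]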


CloudButton: Big Data in one click

Launched in January 2019 for a three-year period, the European H2020 project CloudButton seeks to democratize Big Data by drastically simplifying its programming model. To achieve this, the project relies on a new cloud service that frees the final customer from having to physically manage servers. Pierre Sutra, a researcher at Télécom SudParis, one of the CloudButton partners, shares his perspective on the project.

 

What is the purpose of the project?

Pierre Sutra: Modern computer architectures are massively distributed across machines, and a single click can require computations from tens to hundreds of servers. However, it is very difficult to build this type of system, since it requires linking together many heterogeneous components. The key objective of CloudButton is to radically simplify this approach to programming.

How do you intend to do this? 

PS: To accomplish this feat, the project builds on a recent concept that will profoundly change computer architectures: Function-as-a-Service (FaaS). FaaS makes it possible to invoke a function in the cloud on demand, as if it were a local computation. Since it uses the cloud, a huge number of functions can be invoked concurrently, and only the usage is charged—with millisecond precision. It is a little like having your own supercomputer on demand.
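To give a rough feel for this programming model, here is a hypothetical sketch (not CloudButton’s actual API): work is expressed as ordinary function calls mapped over a collection of inputs, and the executor decides where they run. Locally, a process pool stands in for the cloud; a FaaS backend would dispatch each call to a short-lived serverless function instead.

```python
# Hypothetical sketch of the FaaS programming model described above.
# A local process pool stands in for the cloud here; a FaaS backend would
# dispatch each call to an on-demand serverless function instead.
from concurrent.futures import ProcessPoolExecutor

def count_words(chunk: str) -> int:
    """The function we want to invoke many times, as if it were local."""
    return len(chunk.split())

def main() -> None:
    chunks = ["some text to analyse", "more text", "yet another chunk"] * 100
    # One call per input element, executed concurrently; with FaaS, hundreds or
    # thousands of such invocations could run in parallel, billed per use.
    with ProcessPoolExecutor() as executor:
        counts = list(executor.map(count_words, chunks))
    print(sum(counts))

if __name__ == "__main__":
    main()
```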

Where did the idea for the CloudButton project come from?

PS: The idea came from a discussion with colleagues from the Spanish university Rovira i Virgili (URV) during the 2017 ICDCS in Atlanta (International Conference on Distributed Computing Systems). We had just presented a new storage layer for programming distributed systems. This layer was attractive, yet it lacked an application that would make it a true technological novelty. At the time, the University of California, Berkeley had proposed an approach for writing parallel applications on top of FaaS. We agreed that this was what we needed to move forward. It would allow us to use our storage system with the ultimate goal of moving single-computer applications to the cloud with minimal effort. The button metaphor illustrates this concept.

Who are your partners in this project?

PS: The consortium brings together five academic partners: URV (Tarragona, Spain), Imperial College (London, UK), EMBL (European Molecular Biology Laboratory, Heidelberg, Germany), The Pirbright Institute (Surrey, UK) and IMT, and several industrial partners, including IBM and RedHat. The institutes specializing in genomics (The Pirbright Institute) and molecular biology (EMBL) will be the end users of the software. They also provide us with new use cases and issues.

Can you give us an example of a use case?

PS: EMBL offers its associate researchers access to a large bank of images from around the world. These images are annotated with information on the subject’s chemical composition by combining artificial intelligence and the expertise of EMBL researchers. For now, the system must compute these annotations in advance. A use case for CloudButton would be to perform these computations on demand, for example to customize user requests.

How are Télécom SudParis researchers contributing to this project?

PS: Télécom SudParis is working on the storage layer for CloudButton. The goal is to design programming abstractions that are as close as possible to those of standard programming languages. Of course, these abstractions must also be effective under the FaaS delivery model. This research is being conducted in collaboration with IBM and RedHat.

What technological and scientific challenges are you facing?

PS: In their current state, storage systems are not designed to handle massively parallel computations over a short period of time. The first challenge is therefore to adapt storage to the FaaS model. The second challenge is to reduce the synchronization between parallel tasks to a strict minimum in order to maximize performance. The third challenge is fault tolerance. Since the computations run on large-scale infrastructure, they are regularly subject to errors. However, the faults must be hidden in order to present a simplified programming interface.

What are the expected benefits of this project?

PS: The success of a project like CloudButton can take several forms. Our first goal is to allow the institutes and companies involved in the project to resolve their computing and big data issues. The software we are developing could also meet with success in the open source community. Finally, we hope that this project will produce new design principles for computer system architectures that will be useful in the long run.

What are the important next steps in this project?

PS: We will meet with the European Commission one year from now for a mid-term assessment. So far, the prototypes and applications we have developed are encouraging. By then, I hope we will be able to present an ambitious computing platform based on an innovative use case.

 



Data brokers: the middlemen running the markets

Over the past five years, major digital technology stakeholders have boosted the data broker business. These middlemen collect and combine masses of traces that consumers leave online. They then offer them to the companies of their choice in exchange for income. Above all, they use this capital to manipulate markets around the world. These powerful new stakeholders remain poorly understood. Patrick Waelbroeck, an economist at Télécom Paris, studies this phenomenon in the context of the Values and Policies of Personal Information Chair, which he co-founded.

 

Data brokers have existed since the 1970s and the dawn of direct marketing. These data middlemen collect, sort and prepare data from consumers for companies in need of market analysis. But since the advent of the Web, data brokers like Acxiom, Epsilon and Quantum have professionalized this activity. Unlike their predecessors, they are the ones who choose the partners to whom they will sell the information. They employ tens of thousands of individuals, with turnover sometimes exceeding 1 billion dollars.

As early as 2015, in a book entitled The Black Box Society, Frank Pasquale, a law professor at the University of Maryland, identified over 4,000 data brokers in a 156-billion-dollar market. In 2014, according to the American Federal Trade Commission (FTC), one of these companies held information on 1.4 billion transactions carried out by American consumers, and over 700 billion aggregate items!

Yet these staggering figures are already dated, since technology giants have joined the data broker game over the past five years. Still, “economists are taking no notice of the issue and do not understand it,” says Patrick Waelbroeck, professor of industrial economics and econometrics at Télécom Paris. In the context of the IMT Chair Values and Policies of Personal Information, he specifically studies the effect of data brokers on fair competition and the overall economy.

Opaque activities

“There are supply and demand dynamics, companies that buy, collect, modulate and build databases and sell them in the form of targeted market segments based on the customer’s needs,” the researcher adds. Technology giants have long understood that personal data is of little value on its own. A data broker’s activities entail not only finding and collecting data online or offline. More importantly, they must combine it to describe increasingly targeted market segments.

Five years ago, the FTC already estimated that some data brokers held over 3,000 categories of information on each American, from first and last names, addresses, occupations and family situations to intentions to purchase a car and wedding plans. But unlike “native” data brokers, technology giants do not sell this high value-added information directly. They exchange it for services and compensation. We know nothing about these transactions and activities, and it is impossible to measure their significance.

A market manipulation tool

“One of the key messages from our research has been that these data brokers, and digital technology giants especially, do not only collect data to sell or exchange,” says Patrick Waelbroeck. “They use it to alter market competition.” They are able to finely identify market potential for a company or a product anywhere in the world, giving them extraordinary leverage.

“Imagine, for example, a small stakeholder who has a monopoly on a market in China,” says the economist. “A data broker whose analysis indicates interest in this company’s market segment for a Microsoft or Oracle product, for example, has the power to disrupt this competitive arena. For a variety of reasons—the interest of a customer, disruption of a competitor, etc.—it can sell the information to one of the major software companies to support them or, on the contrary, decide to support a Chinese company instead.”

As a practical example of this power, in 2018, British Parliament revealed internal emails from Facebook. The conversations suggest that the Californian company may have favored third-party applications such as Netflix by sharing certain market data, while limiting access to smaller applications like Vine. “In economics, this is called a spillover effect on other markets,” Patrick Waelbroeck explains. “By selling more or less data to certain market competitors, data brokers can make the market more or less competitive and choose to favor or disadvantage a given stakeholder. ”

In a traditional market, the interaction between supply and demand introduces a natural form of self-regulation. In choosing one brand rather than another, the consumer exercises countervailing power. Internet users could do the same. But digital market mechanics are so difficult to understand that no users actually do so. Although users regularly leave Facebook to prevent it from invading their privacy, it is unlikely they will do the same to prevent the social network from distorting competition by selling their data.

Data neutrality?

“One of our Chair’s key messages is that the influence of data brokers is almost completely overlooked,” Patrick Waelbroeck continues. “No one is examining this issue of data brokers manipulating market competition. Not even regulators. Yet existing mechanisms could be used as a source of inspiration to counter this phenomenon.” The concept of net neutrality, for example, which in theory gives everyone the same access to all online services, could inspire a form of data neutrality. It would prevent certain data brokers or digital stakeholders from favoring certain companies over others by providing them with their data analysis.

Read more on I’MTech: What is Net Neutrality?

Another source of inspiration for regulation is the natural resource market. Some resources are considered common goods. If only a limited number of people have access to a natural resource, competition is distorted, and the refusal of a commercial transaction can be sanctioned. Finally, an equivalent of intellectual property rights could be applied to data. Certain patents, which are necessary to comply with a standard, are regarded as raw materials and are therefore protected. The companies holding these “essential patents” are required by regulation to grant a license to all who want to use them, at a reasonable and non-discriminatory rate.

The value of the data involved in digital mergers and acquisitions

In the meantime, pending regulation, the lack of knowledge about data brokers among competition authorities is leading to dangerous collateral damage. Unaware of the true value of certain mergers and acquisitions, like those between Google and DoubleClick, WhatsApp and Facebook, or Microsoft and LinkedIn, competition authorities use a traditional market analysis approach.

They see the two companies as belonging to different markets—for example, WhatsApp as an instant messaging service and Facebook as a social network—and generally conclude that they would not gain any more market power by joining forces than they had individually. “That is entirely false!” Patrick Waelbroeck protests. “They are absolutely in the same sector, that of data brokerage. After each of these unions, the companies merged their user databases and increased the number of their users.”

“We must view the digital world through a new lens,” the researcher concludes. “All of us—economists, regulators, politicians and citizens—must understand this new data economy and its significant influence on markets and competition. In fact, in the long term, all companies, even the most traditional ones, will be data brokers. Those unable to follow suit may well disappear.”

Article by Emmanuelle Bouyeux for I’MTech

 


Interactions Materials-Microorganisms

This book is devoted to biocolonization, the biodeterioration of materials and possible improvements in their performance. Many materials age according to their use and their environment, and the presence of microorganisms can then lead to biodeterioration. However, microorganisms can also help protect structures, provided their properties are used wisely. Christine Lors, a researcher at IMT Lille Douai, is co-author of this book, published in English. Here is its presentation.

Read on I’MTech: When microorganisms attack or repair materials

This multidisciplinary book is the result of a collective work synthesizing presentations made by various specialists during the CNRS «BIODEMAT» school, which took place in October 2014 in La Rochelle (France). It is designed for readers of a range of scientific specialties (chemistry, biology, physics, etc.) and examines various industrial problems (e.g., water, sewerage and maintaining building materials).

Metallic, cementitious, polymeric and composite materials age depending on their service and operational environments. In such cases, the presence of microorganisms can lead to biodeterioration. However, microorganisms can also help protect structures, provided their immense possibilities are mastered and put to good use.

This book is divided into five themes related to biocolonization, material biodeterioration, and potential improvements to such materials resulting in better performance levels with respect to biodeterioration:
• physical chemistry of surfaces;
• biofilm implication in biodeterioration;
• biocorrosion of metallic materials;
• biodeterioration of non-metallic materials;
• design and modification of materials.

The affiliations of the authors of the various chapters illustrate the synergy between academic research and its transfer to industry. This demonstrates the essential interaction between the various actors in this complex field: analysing, understanding, and responding to the scientific issues related to biodeterioration.


Interactions Materials – Microorganisms
Concretes and Metals more Resistant to Biodeterioration
Christine Lors, Françoise Feugeas, Bernard Tribollet
EDP Sciences, 2019
416 pages
75,00 € (Paperback) – 51,99 € (PDF)

Order the book


Servitization of products: towards a value-creating economy

Businesses are increasingly turning towards selling the use of their products. This shift in business models affects SMEs and major corporations alike. In practice, this has an impact on all aspects of a company’s organization, from its design chain to developing collaborations, to rolling out new offerings for customers. Xavier Boucher and his colleagues, researchers in industrial systems design and optimization at Mines Saint-Étienne, help companies navigate this transformation.  

 

Selling uses instead of products. This shift in the business world towards a service economy has been emerging since the early 2010s. It is based on new offerings in which the product is integrated within a service, with the aim of increasing value creation. Leading manufacturers, such as Michelin, are at the forefront of this movement. With its Michelin Fleet Solutions, the company has transitioned from selling tires to selling kilometers to European commercial trucking fleets. But the trend also increasingly affects SMEs, especially as it is recognized as having many benefits, including new opportunities to create value and drive growth, positive environmental impacts, stronger customer loyalty, and greater employee motivation and involvement.

However, such a transition is not easy to implement and requires a long-term strategy. What innovation strategies are necessary? What services should be rolled out and how? What company structures and expertise must be put in place? It all depends on market developments, the economic impacts of such a transformation on a company and the means to implement it, whether alone or with partners, to achieve a sustainable transformation. More generally, shifting a company’s focus to a product-service system means changing its business model. With his team, Xavier Boucher, a researcher at Mines Saint-Étienne, supports companies in this shift.

In the Rhône-Alpes region where he carries out his research, the majority of manufacturers are developing a service dimension to varying degrees through logistics or maintenance activities. “But out of the 150,000 companies in the region, only a few hundred have truly shifted their focus to selling services and to product life-cycle management,” explains the researcher. Nevertheless, his team is facing increasing demand from manufacturers.

Tailored support for companies

The transition from a model of selling products to a product-service system involves a number of issues of company organization, reconfiguration of the production chain and customer relationship management, which the researchers analyze using models. After a diagnostic phase, the goal is often to support a company with its transformation plan. The first step is changing how a product is designed. “When we design a product, we have to consider all the transformations that will make it possible to develop services throughout its use and throughout all the other phases of its life cycle,” explains Xavier Boucher. As such, it is often useful to equip a product with sensors so that its performance and life cycle can be traced while it is in customers’ possession. But production management is also impacted: this business strategy is part of a broader push for agility. The goal? To create continually evolving value through flexible, reconfigurable industrial processes aligned with this purpose.

To this end, Xavier Boucher’s team develops different tools, ranging from strategic analysis to decision support tools for bringing a solution to market. “For example, we’ve created a business model that can be used while developing a new service offering to determine the right value creation chain to put in place and the best way for the company to sell the service,” says the researcher. Using a generic simulation platform and a customization approach, the researchers tailor these economic calculators to manufacturers’ specific circumstances.
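As a purely illustrative sketch of the kind of comparison such an economic calculator might make (all prices, costs and durations below are invented, and this is not the researchers’ actual model): a one-off product sale versus a pay-per-use offering over the equipment’s service life.

```python
# Illustrative sketch of a product-sale vs. pay-per-use comparison.
# All prices, costs and durations are invented for the example.
def product_sale_margin(price: float, margin_rate: float) -> float:
    """One-off sale: the margin is earned once, at delivery."""
    return price * margin_rate

def pay_per_use_margin(monthly_fee: float, months: int,
                       monthly_service_cost: float) -> float:
    """Servitized offering: recurring fees minus the cost of operating the service."""
    return (monthly_fee - monthly_service_cost) * months

if __name__ == "__main__":
    sale = product_sale_margin(price=50_000, margin_rate=0.20)       # 10,000
    usage = pay_per_use_margin(monthly_fee=900, months=96,           # 8-year life
                               monthly_service_cost=700)             # 19,200
    print(f"one-off sale margin: {sale:,.0f} EUR, pay-per-use margin: {usage:,.0f} EUR")
```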

This is important since each situation is unique and requires a tailored business model. Indeed, marketing a mobile phone and deploying a cleaning robot will not rely on the same channels of action. The latter will call for services including customized installation for customers, maintenance and upgradability as well as management of consumables and measuring and guaranteeing cleaning quality. Moreover, companies vary in terms of their progress toward servitization. The researchers may collaborate with a start-up that has adopted a product-service model from the outset or with companies with established business models looking for a tailored, long-term transformation.

What are the steps toward a product-service system?

Companies may call upon the expertise of the Mines Saint-Étienne researchers at various stages in their progress toward this transition. For example, a manufacturer may be considering in advance how selling a service would impact its economic balance. Would such a change be the right move based on its situation? In this case, the models establish a progressive trajectory for its transformation and break it down into steps.

Another type of collaboration may be developed with a company who is ready to move towards selling services and is debating how to carry out its initial offering. Researchers use their simulation tools to determine three possible business models: the first is to market its product and add on the sale of services throughout its lifecycle; the second is to shift the company’s business to selling services and integrate the product within a service; and finally, the third model sells performance to customers.

The researchers helped the SME Innovtec develop an autonomous robot offering for industrial cleaning. “We developed computer-aided design tools: modeling, organizational scenarios, simulations. The goal was to expand the traditional product-oriented tools by adding a service package dimension,” explains Xavier Boucher. The company thus benefitted from different scenarios: identifying technologies to ensure its robots’ performance, determining which services were appropriate for this new product etc. But the projections also address topics beyond the production chain, such as whether it should integrate the new services within the current company or create a new legal structure to deploy them.

A final possibility is a company that has already made the transition to servitization but is seeking to go further, as is the case for Clextral, an SME that produces extrusion machines for the food processing industry and was supported through the European DiGiFoF project (Digital Skills for Factories of the Future). Its machines have a long service life and provide an opportunity to create value through maintenance and upgrading operations. The researchers have therefore identified a service development focus based on a retrofitting service, a sort of technical upgrade. This consists of exchanging obsolete parts while maintaining a machine’s configuration, or modifying the configuration of a piece of equipment to allow for a different industrial use than originally intended.

Digitization and risk management in a multi-stakeholder context

The current trend towards servitization has been made possible by the digitization of companies. The Internet of Things has enabled companies to monitor their machines’ performance. In several years’ time, it may even be possible to fully automate the monitoring process, from ordering spare parts to scheduling on-site intervention. Smart product-service systems, which combine digitization and servitization, are a research focus and a central part of the work carried out with elm.leblanc, a company seeking to put in place real-time information processing in order to respond to customers more quickly.

However, this change in business models affects not only the company itself, but its entire ecosystem. elm.leblanc, for example, is considering sharing costs and risks between various stakeholders. One option would be to bring in partner companies to implement this service. But how would the economic value or brand image be distributed between the partners without them taking over the company’s market? Research on managing risk and uncertainty is of key importance for Xavier Boucher’s team. “One of the challenges of our work is the number of potential failures for companies, due to the difficulties of effectively managing the transition. Although servitization has clearly been shown to be a path to the future, it is not associated with immediate, consistent economic success. It’s essential to anticipate challenges.”

Article written (in French) by Anaïs Culot for I’MTech


Data sharing: an important issue for the agricultural sector

Agriculture is among the sectors most affected by digital transition, given the amount of data it possesses. But for the industry to benefit from its full potential, it must be able to find a sound business model for sharing this data. Anne-Sophie Taillandier, the Director of TeraLab — IMT’s big data and AI platform — outlines the digital challenges facing this sector in five answers.

 

How important of an issue is data in the agricultural sector?

Anne-Sophie Taillandier: It’s one of the most data-intensive sectors and has been for a long time. This data comes from tools used by farmers, agricultural cooperatives and distribution operations, all the way to the agrifood industry downstream. Data is therefore found at every step. It’s an extremely competitive industry, so the economic stakes for using data are huge.

How can this great quantity of data in the sector be explained?

AST: Agriculture has used sensors for a long time. The earliest IoT (Internet of Things) systems were dedicated to collecting weather data, and were therefore quickly used in farming to make forecasts. And tractors are state-of-the-art vehicles in terms of intelligence – they were among the earliest autonomous vehicles. Farms also use drones to  survey land. And precision agriculture is based on satellite imagery to optimize harvests while using as few resources as possible. On the livestock farming side, infrastructures also have a wealth of data about the quality and health of animals. And all of these examples only have to do with the production portion.

What challenges does agriculture face in relation to data?

AST: The tricky question is determining who has access to which data and in what context. These data sharing issues arise in other sectors too, but there are scientific hurdles that are specific to agriculture. The data is heterogeneous: it comes from satellites, ground-based sensors, information about markets, etc. It comes in the form of texts, images and measurements. We must find a way to make these data sources communicate with each other. And once we’ve done so, we have to make sure that all the stakeholders in the industry can benefit from it, by accessing a level of data aggregation that does not exceed what the other stakeholders wish to make available.
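As a hypothetical illustration of that last point (the farms, regions and yields below are invented for the example): each stakeholder could expose only an aggregate, such as a regional average, rather than its raw plot-level data.

```python
# Hypothetical sketch: share aggregated figures rather than raw farm data.
# Farms, regions and yields are invented for illustration only.
from collections import defaultdict
from statistics import mean

raw_yields = [  # (farm_id, region, yield in tonnes per hectare)
    ("farm-001", "Brittany", 7.2),
    ("farm-002", "Brittany", 6.8),
    ("farm-003", "Occitanie", 5.9),
    ("farm-004", "Occitanie", 6.1),
]

def aggregate_by_region(rows):
    """Return only regional averages, hiding individual farm values."""
    by_region = defaultdict(list)
    for _, region, value in rows:
        by_region[region].append(value)
    return {region: round(mean(values), 2) for region, values in by_region.items()}

print(aggregate_by_region(raw_yields))
# {'Brittany': 7.0, 'Occitanie': 6.0}
```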

How can the owners of the data be convinced to share it?

AST: Everyone has to find a business model they find satisfactory. For example, a supermarket already knows its sales volumes – it has its own processing plants and different qualities of products. What it’s interested in is obtaining data from slaughterhouses about product quality. Similarly, livestock farmers are interested in sales forecasts for different qualities of meat in order to optimize their prices. So we have to find these kinds of virtuous business models to motivate the various stakeholders. At the same time, we have to work on spreading the word that data sharing is not just a cost. Farmers must not spend hours every day entering data without understanding its value.

What role can research play in all this? What can a platform like TeraLab contribute?

AST: We help highlight this value, by demonstrating proof of concept for business models and considering potential returns on investment. This makes it possible to overcome the natural hurdles to sharing data in this sector. When we carry out tests, we see where the value lies for each party and which tools build trust between stakeholders — which is important if we want things to go well after the research stage. And with IMT, we provide all the necessary digital expertise in terms of infrastructure and data processing.

Learn more about TeraLab, IMT’s big data and AI platform.


Marine oil pollution detected from space

Whether it is due to oil spills or cleaning out of tanks at sea, radar satellites can detect any oil slick on the ocean’s surface. Over 15 years ago, René Garello and his team from IMT Atlantique worked on the first proof of concept for this idea to monitor oil pollution from space. Today, they are continuing to work on this technology, which they now use in partnership with maritime law enforcement. René Garello explains to us how this technology works, and what is being done to continue improving it.

Most people think of oil pollution as oils spills, but is this the most serious form of marine oil pollution?

René Garello: The accidents which cause oil spills are spectacular, but rare. If we look at the amount of oil dumped into the seas and oceans, we can see that the main source of pollution comes from deliberate dumping or washing of tanks at sea. Over a year or a decade, this dumping releases 10 to 100 times more oil than oil spills. Although it does not get as much media coverage, the oil released by tank washing reaches our coastlines in exactly the same way.

Are there ways of finding out which boats are washing out their tanks?

RG: By using sensors placed on satellites, we can have large-scale surveillance technology. The sensors allow us to monitor areas of approximately 100 km². The maritime areas close to the coast are our priority, since this is where the tankers stay as they cannot sail on the high seas. Satellite detection methods have improved a lot over the past decade. Fifteen years ago, detecting tank dumping from space was a fundamental research issue. Today, the technology is used by state authorities to fight against this practice.

How does the satellite detection process work?

RG: The process uses imaging radar technology, which has been available in large quantities for research purposes since the 2000s. This is why IMT Atlantique [at the time called Télécom Bretagne] participated in the first fundamental work on large quantities of data around 20 years ago. The satellites emit a radar wave towards the ocean’s surface, which is reflected back towards the satellite. The reflection of the wave differs depending on the roughness of the water’s surface. The roughness is increased by things such as wind, currents or waves, and decreased by cold water, algae masses, or oil produced by tank dumping. When the satellite receives the radar wave, it reconstructs an image of the water that displays the roughness of the surface. Natural, accidental or deliberate events which reduce the roughness appear as a black mark on the image. This work was carried out as a European research project in partnership with the European Space Agency and a startup based in our laboratories, Boost Technology – which has since been acquired by CLS – and it showed the importance of this technique for detecting oil slicks.
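The principle can be sketched as follows (a simplified editorial illustration, not the operational processing chain): in a radar image, lower backscatter means a smoother surface, so candidate slicks show up as connected groups of unusually dark pixels.

```python
# Simplified sketch of dark-spot detection in a radar (SAR) image.
# Real processing chains use calibrated backscatter, speckle filtering and
# shape analysis; this only illustrates the thresholding idea.
import numpy as np
from scipy import ndimage

def detect_dark_spots(backscatter: np.ndarray, k: float = 2.0, min_pixels: int = 50):
    """Label connected regions whose backscatter is well below the image mean."""
    threshold = backscatter.mean() - k * backscatter.std()
    dark = backscatter < threshold           # smooth (possibly oily) surface
    labels, n = ndimage.label(dark)          # group dark pixels into regions
    sizes = ndimage.sum(dark, labels, range(1, n + 1))
    # Keep only regions large enough to be a plausible slick.
    return [i + 1 for i, size in enumerate(sizes) if size >= min_pixels]

# Example with synthetic data: a smooth (dark) patch inside a rougher sea.
sea = np.random.normal(loc=1.0, scale=0.1, size=(200, 200))
sea[80:120, 50:150] -= 0.5                   # artificial "slick"
print(detect_dark_spots(sea))
```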

If several things can alter the roughness of the ocean’s surface, how do you differentiate an oil slick from an upwelling of cold water or algae?

RG: It is all about investigation. You can tell whether it is an oil slick from the size and shape of the black mark. Usually, the specialist photo-interpreter behind the screen has no doubt about the source of the mark, as an oil slick has a long, regular shape which is not similar to any natural phenomena. But this is not enough. We have to carry out rigorous tests before raising an alert. We cross-reference our observations with datasets to which we have direct access, such as the weather, temperature and state of the sea, wind, and algae cycles…. All of this has to be done within 30 minutes of the slick being discovered in order to alert the maritime police quickly enough for them to take action. This operational task is carried out by CLS, using the VIGISAT satellite radar data reception station that they operate in Brest, which also involves IMT Atlantique. As well as this, we also work with Ifremer, IRD and Météo France to make the investigation faster and more efficient for the operators.

Detecting an oil spill is one thing, but how easy is it to then find the boat responsible for the pollution?

RG: Radar technology and data cross-referencing allow us to identify an oil spill accurately. However, the radar doesn’t give a clear answer as to which boat is responsible for the pollution. The transmission speed of the information sometimes allows the authorities to find the boat which is directly responsible, but sometimes we find the spill several hours after it has been created. To solve this problem, we cross-reference radar data with the Automatic Identification System for vessels, or AIS. Every boat has an AIS transponder which provides GPS information about its location at sea. By identifying where the slick started and when it was made, we can identify which boats in the area at that time could have dumped their tanks.

It is possible to identify a boat suspected of dumping (in green) amongst several vessels in the area (in red) using satellites.


 

This requires knowing how to date the slick’s creation and how it has changed at sea.

RG: We also work in partnership with oceanographers and geophysicists. How does a slick drift? How does its shape change over time? To answer these questions, we again use data about the currents and the wind. From this data, physicists use fluid mechanics models to predict how the sea would impact an oil slick. We are very good at retracing the evolution of the slick in the hour before it is detected. When we combine this with AIS data, we can eliminate vessels whose position at the time was incompatible with the behavior of the oil on the surface. We are currently trying to do this going further back in time.
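A hypothetical sketch of that AIS cross-check (vessel names, positions and times are invented): given an estimated origin position and time for the slick, vessels whose reported track never passes close to that point around that time can be ruled out.

```python
# Hypothetical sketch of cross-referencing AIS tracks with a slick's
# estimated origin; all vessels and coordinates are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_vessels(tracks, origin, t_origin, max_km=5.0, max_hours=1.0):
    """Keep vessels with at least one AIS fix near the slick's estimated origin."""
    suspects = []
    for name, fixes in tracks.items():  # fixes: list of (hour, lat, lon)
        for t, lat, lon in fixes:
            if abs(t - t_origin) <= max_hours and haversine_km(lat, lon, *origin) <= max_km:
                suspects.append(name)
                break
    return suspects

tracks = {
    "vessel_A": [(9.5, 48.10, -5.02), (10.0, 48.12, -5.00)],
    "vessel_B": [(9.5, 47.20, -6.50), (10.0, 47.25, -6.45)],
}
print(candidate_vessels(tracks, origin=(48.11, -5.01), t_origin=10.0))
# ['vessel_A']
```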

Is this research specific to oil spills, or could it be applied to other subjects?

RG: We would like to use everything that we have developed for oil for other types of pollution. At the moment we are interested in sargassum, a type of brown seaweed which is often found on the coast. Its production increases with global warming. The sargassum invades the coastline and releases harmful gases when it decomposes. We want to know whether we can use radar imaging to detect it before it arrives on the beaches. Another issue that we’re working on involves micro-plastics. They cannot be detected by satellites. We are trying to find out whether they modify the characteristics of water in a way that we can identify using secondary phenomena, such as a change in the roughness of the surface. We are also interested in monitoring and predicting the movement of large marine debris…. The possibilities are endless!


In France, AMAPs (associations for community-supported agriculture) are emblematic examples of the social solidarity economy. But they are not the only social solidarity economy (SSE) organizations. Other examples include cooperative banks, non-profit groups and mutual funds.

What is the social and solidarity economy?

The social and solidarity economy (SSE) encompasses organizations that seek to respond to human problems through responsible solutions. Far from being an epiphenomenon, the SSE accounts for a significant share of the economy both in France and around the world. Contrary to popular belief, these principles are far from new. Mélissa Boudes, a researcher in management at Institut Mines-Télécom Business School, helps us understand the foundations of this economy.

 

What makes the social and solidarity economy unique?

Mélissa Boudes: The social and solidarity economy (SSE) is based on an organizational structure that is different from, and complementary to, the public economy and the capitalist economy. This structure is dedicated to serving human needs. For example, organizations that are part of the SSE are either non-profit or low-profit limited companies. In the second case, profits are largely reinvested in projects rather than being paid to shareholders in the form of dividends. In general, SSE organizations have a democratic governance model, in which decisions are made collectively based on the “one person, one vote” principle and involve those who benefit from their services.

What types of organizations are included in this economy?

MB: A wide range! Non-profit groups typically fall within this framework. Although sports and community non-profit groups do not necessarily claim to be part of the SSE, they fall within the framework based on their official statutes. Cooperatives, mutual funds and social businesses of varying sizes are also part of the SSE. One example is the cooperative group Up—formerly called Chèque déjeuner—which now has an international dimension. Other organizations include mutual health insurance groups, wine cooperatives, and cooperative banks.

How long has this economy existed?

MB: We often say that it has existed since the 19th century. The social and solidarity economy developed in response to the industrial revolution. At this time, workers entered a subordinate relationship that was difficult to accept. They wanted a way out. Alternative organizations were created with a primary focus on workers’ concerns. The first organizations of this kind were mutual aid companies that provided access to medical care and consumer cooperatives that helped provide access to good quality food. At the time, people often went into debt buying food. Citizens therefore created collective structures to help each other and facilitate access to good quality, affordable food.

So why have we only heard about the social and solidarity economy in recent years?

MB: It’s true that we seem to be witnessing the re-emergence of the SSE, which was the subject of a law in 2014. The SSE is now back in the forefront because the issues that led to its creation in the 19th century are reappearing—access to food that is free of pollution, access to medical care for “uberized” workers. AMAPs (associations for community-supported agriculture) and cooperative platforms such as Label Emmaüs are examples of how the SSE can respond to these new expectations. Although recent media coverage would suggest that these organizational models are new, they actually rely on practices that have existed for centuries. However, the historical structures behind the SSE are less visible now because they have become institutionalized. For example, we sometimes receive invitations to participate in the general meetings of our banks or mutual funds. We don’t pay much attention to this, but it shows that, even without knowing it, we are all part of the SSE.

Is the social and solidarity economy a small-scale phenomenon, or does it play a major role in the economy?

MB: The SSE exists everywhere in France, but also around the world. We must understand that SSE organizations aim to provide solutions to universal human problems: better access to education, mobility, healthcare… In France, the SSE represents 10% of employment.  This share rises to 14% if we exclude the public economy and only look at private employment. Many start-ups have been created based on the SSE model. This is therefore an important economic phenomenon.

Can any type of organization claim to be part of the social and solidarity economy?

MB: No, they must define an official status that is compatible with the SSE at the time the organization is founded, or request authorization if the company has a commercial status. They must request specific approval as a solidarity-based company of social benefit, which is attributed by the regional French employment authority (DIRECCTE).  Approval is granted if the company demonstrates that it respects certain principles, including providing a social benefit, a policy in its statutes limiting remuneration, an absence from financial markets, etc.

How does the social and solidarity economy relate to the concept of corporate social responsibility (CSR)?

MB: In practice, CSR and SSE concepts sometimes overlap when commercial companies partner with SSE companies to develop their CSR. However, these two concepts are independent. The CSR concept does, however, reveal an economic movement that places increasing importance on organizations’ social aims. More and more commercial companies are opting for a hybrid structure: without becoming SSE companies, they impose limited salary scales to avoid extremely high wages. We are in the process of moving towards an environment in which the dichotomies are more blurred. We can no longer think in terms of virtuous SSE organizations on one side and the profit-driven capitalist economy on the other. The boundaries are not nearly as clear-cut as they used to be.

Read on I’MTech: Social and solidarity economy in light of corporate reform


What happens to nanoparticles when they become waste?

For the past twenty years, industry sectors across the board have been producing a wide range of nanomaterials. They have developed rapidly and with little in the way of regulation. This has led to a regulatory vacuum when it comes to the end-of-life management of these nanomaterials, and little is known about the environmental and health impacts of this nanowaste. IMT Atlantique researchers have therefore led two successive projects on the incineration of nanowaste as part of a research consortium: NanoFlueGas and Nano-Wet. The results confirm that certain nanoparticles persist after leaving the incinerator, in the form of effluent and ashes. Last April, the research consortium, made up of IMT Atlantique, INERIS and industrial partner Trédi – Groupe Séché Environnement, submitted its technical recommendations to ADEME. Aurélie Joubert, a researcher at IMT Atlantique and lead author of the Nano-Wet report, looks back at how this pioneering program came about.

 

What was the purpose of the NanoFlueGas and Nano-Wet project?

Aurélie Joubert: There are currently no regulations on the management of nanoparticle waste, especially for end-of-life materials. If the nanoparticles contained in the waste are identified as being hazardous for the environment or health, this waste is categorized and treated in the hazardous waste cycle. If not, they follow the common domestic waste incineration cycle. Our goal was to understand what was happening to the nanoparticles over the course of this treatment, to determine whether the cycles are adapted to nanowaste and possibly how to optimize them.

During the NanoFlueGas project, we studied the incineration of nanowaste at temperatures of 850°C, which is the common practice in the domestic waste management cycle. A lab incineration furnace was developed for this purpose at INERIS. It reproduced the same conditions in terms of temperature, air turbulence and oxygen levels. We were able to confirm that nanoparticles were released during treatment at varying concentrations depending on the nature of the nanowaste being studied. We also observed a capture efficiency of over 99% of the nanoparticles released thanks to a system developed at IMT Atlantique: a pilot blowback bag filter.

During the second project, Nano-Wet, which ended on April 1st, we focused on the hazardous waste incineration cycle at 1,100°C and on a technique for the wet processing of smoke. ADEME, which funded the project, now has the results of this study, and they are now public.

Which types of nanowaste have you worked on, and where do they come from?

AJ: A national registry—called R-nano—has existed since 2013. It requires all nanoparticles present on French territory to be declared. We worked on actual waste deposits received by Trédi that are part of this registry. For the Nano-Wet project, we worked on chlorinated waste, which is waste produced during the manufacturing of PVC floor covering in the construction sector. We also had sulfur waste from ion exchange resin beads used for water treatment. Finally, we worked with organosilicon waste, in the form of a polymer from mastic, which we had already studied in the previous project and found to have high-emissivity.

We went through a fairly long and in-depth characterization phase for the types of waste selected. This was a deliberate choice, unlike other research projects in which laboratories produce their own product and are therefore perfectly aware of its composition. We wanted to use real waste deposits, for which we did not know the exact composition, in order to develop relevant methodological tools for analysis. These could now allow industrialists like Trédi to identify the potential nanofillers in their waste in the lab, enabling them to stay ahead of the regulations in this area.

What were the findings from the Nano-Wet project?

AJ: For the incineration component, we observed different scenarios for the nanoparticles present in the smoke. This revealed that nanowaste behaves differently during incineration depending on its composition. With organosilicon polymer waste, the initial nanostructure of the silica is preserved. It even increases through the formation of nano-silica caused by the degradation of the polymer. Conversely, the nanostructure disappears in PVC waste. In the case of resin waste, which does not originally contain nanofiller, a nanostructure appears due to reactions involving the impurities that were present from the beginning.

The results also suggest that the “hazardous waste” cycle with incineration at 1,100°C should be applied to nanowaste. With the organosilicon polymer waste, we found that incineration at 1,100°C instead of 850°C reduced nanoparticle emissions due to a sintering phenomenon. This interesting finding could be applied to other types of waste. Concerning the smoke treatment system, we specifically focused on a device for treating gaseous pollutants: a washing column.  We showed that the washing column reduced the quantity of nanoparticle emissions by 60% after incineration. This column therefore significantly contributes to the overall effectiveness of the treatment system.

Why did you decide to focus on the water column smoke treatment process?

AJ: Both researchers and industrialists see scrubbers as an unusual process for nanoparticles, because they are normally used to treat acid gases in the smoke. With Nano-Wet, we demonstrated that they are still significantly effective with nanoparticles, reducing the quantity by 60%. This was a rather unexpected outcome, especially since it was obtained without optimizing the process: we studied the column’s effectiveness without adapting it to treat nanoparticles. Despite its effectiveness, it is important to maintain the complementary procedures designed specifically for particle collection, namely bag filters and electrostatic precipitators, which have an effectiveness of 99%. We are pioneers in this type of study and we will continue. We want to assess the influence of the operating conditions to improve the scrubbers’ effectiveness for nanoparticles, while maintaining their effectiveness with acid gases.
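As a rough order-of-magnitude illustration (an editorial example, assuming purely for the sake of the arithmetic that the 60%-efficient scrubber and a 99%-efficient particle-collection device operate in series and act independently), the combined removal efficiency would be

\[
\eta_{\text{total}} = 1 - (1 - 0.60)(1 - 0.99) = 1 - 0.004 = 99.6\,\%,
\]

leaving roughly 0.4% of the incoming nanoparticles in the emitted gas.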

How did you go about studying the effectiveness of the water columns?

AJ: The idea was to build a water column that sprays fine droplets in our research facility. We developed our pilot test to enable us to work in realistic conditions in terms of temperature and humidity, but on a reduced scale. We kept the same turbulent flow regime, the same ratio between the flow of the liquid sprayed into the column and the air flow, the same size for the sprayed droplets, etc. Here we clearly saw the advantage of working with an industrialist on this project: Trédi provided us with the operating conditions in the form of technical specifications at the start of the project. However, we did simplify the experiment: the conditions in the flue gas treatment lines are very complicated due to the large amount of acid gases. We decided to inject air enriched with nanoparticles without injecting corrosive acid gases like hydrochloric acid and sulfur dioxide.

Was it easy to transition from the laboratory to the industrial site?

AJ: When possible, we try to incorporate an on-site measurement phase in order to compare the laboratory scale with the real scale. In the context of the Nano-Wet project, a specialized team from INERIS conducted two measurement campaigns at the industrial site. They were faced with very complicated sampling conditions. Results from samples taken at the entrance to the water column allowed us to determine the particle size and concentration, conditions that we could then reproduce in the lab.

When the particles are removed from the column, they end up in the wastewater. What happens to them after that?

AJ: This is an issue that must be addressed. We did not characterize the particles in the liquid phase after they leave the column. We do not know what form they take, whether they remain nanoparticles or agglomerate, in which case they could easily be separated in the water treatment station. Other research laboratories are working on these issues.

You just submitted your last report to ADEME. How could the legislation change?

AJ: The threshold values for particle emissions for waste incinerators are currently expressed in terms of total mass concentration in micrograms per cubic meter of air. This is not relevant for nanoparticles. Despite their negligible mass, they are suspected to be highly toxic. In the future, I think the standards could specify a concentration threshold in terms of quantity of particles, or in terms of mass according to the given particle size. For indoor air quality, for example, new tools have been developed to characterize mass concentrations by size of fine particulate matter: PM 10, PM 2.5 or PM 0.1.

This article was written (in French) by Alice Mounissamy for I’MTech.


ALGIMEL, a ‘marine’ polystyrene

In the future, materials will not only need to be more efficient; they will also need to be environmentally friendly. With this in mind, researchers from IMT Mines Alès who specialize in bio-sourced materials have spent the past few decades developing environmentally-friendly alternatives to the most polluting materials. One of their latest designs is a natural polymer foam which can replace polystyrene in several of its uses. Eric Guibal and Thierry Vincent tell us about their work.

 

What is the material you have developed like?

Eric Guibal: We see it as a material with similar properties to polystyrene. It is a low-density foam with a structure that is mainly made up of biopolymers. This material, named ALGIMEL, is mostly made up of alginate, a natural polymer which is found in the cell wall of brown algae. The team’s expertise is mainly focused on the synthesis and composition of these foams.

Why did you choose this type of biopolymer?

Thierry Vincent: From the beginning, our aim was to develop the most environmentally-friendly material possible. This decision goes far beyond the choice of polymer: during the synthesis of the material, we don’t use any toxic products. For example, when we are developing the foam, we don’t use any chemical solvents. We don’t use products which could be dangerous for the technicians. The additives which we use to improve the properties of the material are also natural. Our manufacturing processes have low energy consumption and the drying process is carried out at a maximum temperature of 50°C. Some synthetic products are still used during manufacturing, but they make up less than 1% of all the materials that we use, and we are working on replacing them with bio-sourced materials.

Why did you want to develop these foams?

EG: Our aim was to produce an insulating material which was also an alternative to polystyrene. This is because polystyrene takes several hundred years to biodegrade, pollutes water and releases several toxic substances when it burns. Although our biopolymer foam has similar thermal insulating properties to polystyrene, it is also different due to its outstanding fire-resistant properties. ALGIMEL also has a controlled lifespan. When it is being used, the material is stable; however, at the end of its life, its biodegradable properties mean that it can be put in the household compost.

What could this material be used for?

TV: This material is light and extremely versatile, which means it can have many uses. As well as its thermal insulation and flame-retardant properties, its surface properties can be modified by applying biopolymers which make the foam hydrophobic, so it can be used in humid environments. To increase its mechanical strength, you can easily add plant-based fibers or mineral fillers. We can also add pigments or dyes, both in the interior and on the surface, to change its appearance. Combined with other materials such as wood, fabrics, leather, etc., either in composite form or as a sandwich structure, the foam has many uses: decoration, packaging, bags, clips, fashion and luxury items, etc.

How did your research lead you to this material?

TV: For around thirty years, our work has focused on developing biopolymers. Our main area of experience is with alginate, as well as other biopolymers such as chitin and chitosan, which are made from crustacean shells. We have always directed our research towards developing materials that are more environmentally friendly. ALGIMEL is the result of all the skills we have acquired during our research.

Will the material soon be used outside of your laboratory?

EG: We are currently working with organizations that specialize in technology transfer, including SATT AxLR Occitanie Est. We are also lucky enough to have a contract with Institut Carnot M.I.N.E.S and the Carnot Carats network. Today, we are working on improving process quality and moving towards pre-industrialization. In collaboration with several partners, we are working on improving the design of our foams so they can be used in decorative, fashion and luxury products. We know that the development of this type of material is in line with many announcements made by the public authorities and industrialists. The most recent example is Emmanuel Macron’s aim of making the fashion and luxury goods industries more sustainable, a task which he entrusted to François-Henri Pinault. All of this shows that this product has a promising future ahead of it.