
Artificial Intelligence hiding behind your computer screen!

Far from the dazzle of intelligent humanoid robots and highly effective chatbots, artificial intelligence is now used in many ordinary products and services. In the software and websites consumers use on a daily basis, AI is quietly improving the digital experience. This new dynamic is perfectly illustrated by two startups incubated at Télécom ParisTech: BEYABLE and AiZimov.

 

Who are the invisible workers managing the aisles of digital shops? At the supermarket, shoppers regularly see employees stocking the shelves, but the shelves of online sales sites are devoid of human contact. “Whether a website has 500 or 10,000 items for sale, there are always fewer employees managing the products than at a real store,” explains Julien Dugaret, founder of the startup BEYABLE. The young company is well aware that these digital showcases still require maintenance. Currently accelerated at Télécom ParisTech and formerly incubated there, it offers a solution for detecting anomalies on online shopping sites.

BEYABLE’s artificial intelligence algorithms use a clustering technique. By analyzing data from internet users’ visits to websites and the data associated with each product, they group the items together into coherent “clusters”. The items that cannot be included in any of the clusters are then identified as anomalies and corrected so they can be reintegrated into the right place.
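To make the idea concrete, here is a minimal sketch of anomaly detection by clustering, assuming invented product features and using scikit-learn’s DBSCAN, whose “noise” label plays the role of the anomaly; it illustrates the general technique, not BEYABLE’s implementation.

```python
# Illustrative sketch only, not BEYABLE's code: cluster catalogue items by
# simple numeric features and flag whatever fits no cluster. All features
# and values are invented for the example.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

# hypothetical features per item: [price, heel height (cm), shaft height (cm)]
items = np.array([
    [120.0, 9.0, 2.0],    # heels
    [115.0, 8.5, 2.5],    # heels
    [118.0, 9.5, 2.0],    # heels
    [140.0, 4.0, 30.0],   # boots
    [150.0, 5.0, 32.0],   # boots
    [135.0, 4.5, 28.0],   # boots
    [999.0, 0.0, 0.0],    # mis-entered record
])

X = StandardScaler().fit_transform(items)
labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(X)

# DBSCAN gives the label -1 to points that belong to no coherent cluster:
# exactly the "anomalies" a human employee should review and fix.
print("items flagged for review:", np.where(labels == -1)[0])
```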

“Some products do not have the right images, descriptions or references. For example, a pair of heels might be included in the ‘boots’ category of an online shop,” explains the entrepreneur. The software then identifies the heels so that an employee can correct the description. While this type of error may seem anecdotal or funny, for the companies that use BEYABLE’s services, the quality of the customer experience is at stake.

Some websites offer thousands of articles with product references that are constantly changing. It is important to make sure visitors using the website do not feel lost from one day to the next. “If a real merchant sold t-shirts one day and coffee tables the next, you can imagine all the logistics that would be required overnight. For an online shop, the logistics involved in changing the collection or promoting certain articles is much simpler, but not effortless. The reduced number of online store ‘department managers’ makes the logistics all the more difficult,” explains Julien Dugaret. Artificial intelligence tools play an essential role in these logistics, helping digital marketing teams save a lot of time and ensuring visitor satisfaction.

BEYABLE is increasingly working with websites run by major brands. These websites invest hundreds of thousands of euros to earn consumers’ loyalty. “These websites have now become very important assets for companies,” the founder of the startup observes. They therefore need to understand what customers are looking for and how they interact with the pages. BEYABLE does more than perform the basic analyses offered by so-called “analytics” tools (the best-known being Google Analytics): it also offers these companies “a look at what they cannot see,” says Julien Dugaret.

The company’s algorithms learn from the visits by categorizing them and identifying several types of internet users: those who look at the maps for nearby shops, those who want to discover items before they buy them, those who are interested in the brand’s activities… “Companies do not always have data experts who can analyze all the information about their visitors, so we offer AI tools suited to this purpose,” Julien Dugaret explains.

Artificial intelligence for professional emails?

For those who use digital services, the hidden AI processes are not only used to improve their online shopping experience. Jérôme Devosse worked as a salesperson for several years and used to study social networks, company websites and news sites to glean information about the people he wanted to contact. “This is business as usual for salespeople: adapting the sales hook and initial contact based on the person’s interests and the company’s needs,” he explains.

After growing weary of doing this task the slow way, he decided to create a tool to automate the research he carried out before appointments. And that was how AiZimov was born, another startup incubated at Télécom ParisTech. “It’s an assistant,” explains Jérôme Devosse. “All I have to do is tell it ‘I want to contact that person’ and it will write an email based on the public data available online.” Interviews with the person, their company’s financial reports, their place of residence, their participation at trade shows: all of this information is useful for the software. “For example, the assistant will automatically write a message saying, ‘I saw you will be at Vivatech next week, come meet us!’” AiZimov’s founder explains.

The tool works in three stages. First, there is the data acquisition stage which uses technology to search through large volumes of data. Next, the data must be understood. Is the sentence containing the targeted person’s name from an interview or a financial report? What are the associated key words and what logical connections can be made? Finally, the text is generated automatically and can be checked based on different criteria. The user can then choose to send an email that is more formal or more emotional—using things the contact is passionate about—or a very friendly email.
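As a rough illustration of these three stages, the toy sketch below strings together acquisition, understanding and generation; every data source, rule and template in it is invented, as AiZimov’s actual implementation is not described beyond this outline.

```python
# Toy sketch of a three-stage assistant like the one described above.
# Every source, rule and template here is invented for illustration;
# AiZimov's real implementation is not public.

def acquire(person: str) -> list[str]:
    """Stage 1, data acquisition: gather public snippets about the person.
    A real system would crawl news sites, social networks, reports..."""
    return [
        f"{person} will attend the Vivatech trade show next week.",
        f"{person} discussed quarterly results in a recent interview.",
    ]

def understand(snippets: list[str]) -> dict:
    """Stage 2, understanding: classify snippets and extract keywords."""
    facts = {}
    for s in snippets:
        if "trade show" in s:
            facts["event"] = "Vivatech"
        if "interview" in s:
            facts["source"] = "interview"
    return facts

def generate(person: str, facts: dict, tone: str = "friendly") -> str:
    """Stage 3, generation: fill a tone-dependent template.
    A human approves or edits the draft before anything is sent."""
    if "event" in facts:
        hook = f"I saw you will be at {facts['event']} next week, come meet us!"
    else:
        hook = "I would love to introduce our team."
    greeting = "Hi" if tone == "friendly" else "Dear"
    return f"{greeting} {person},\n\n{hook}\n"

print(generate("Alex", understand(acquire("Alex"))))
```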

Orange and Renault are already testing the startup’s software. “For salespeople from large companies, the time they save by not writing emails to new contacts is used to maintain the connections they have with existing customers to continue the relationship,” explains Jérôme Devosse. Today, the tool does not send an email without the salesperson’s approval. The personnel can still choose to modify a few details. The entrepreneur is not seeking an entirely automatic process. His areas for future development are focused on using the software for other activities.

“I would like to go beyond emails: once the information is acquired, it could be used to write a detailed or general script for making contact via telephone,” he explains. AiZimov’s technology could also be used for professions other than sales. In press relations, it could be used to contact the most relevant journalists by sending them private messages on social networks, for example. And why not make this software available to human resources departments for contacting individuals for recruitment purposes? Artificial intelligence could therefore continue to be used in many different online interactions.


From springs to lasers: energy’s mysterious cycle

In 1953, scientists theorized the energy behavior of a chain of springs and revealed a paradox in fundamental physics. Over 60 years later, a group of researchers from IMT Lille Douai, CNRS and the universities of Lille and Ferrara (Italy) has succeeded in observing this paradox. Their results have greatly enhanced our understanding of nonlinear physical systems, which are basic ingredients in detecting exoplanets, navigating driverless cars and forming big waves in the ocean. Arnaud Mussot, a physicist and member of the partnership, explains the process and the implications of the research, published in Nature Photonics on April 2, 2018.

 

The starting point for your work was the Fermi-Pasta-Ulam-Tsingou problem. What is that?

Arnaud Mussot: The name refers to the four researchers who wanted to study a complex problem in the 1950s. They were interested in observing the behavior of masses connected by springs. They used 64 of them in their experiment. With a chain like this, each spring’s behavior depends on that of the others, but in a non-proportional manner – what we call “nonlinear” in physics. A theoretical study of the nonlinear behavior of such a large system of springs required them to use a computer. They thought that the theoretical results the computer produced would show that when one spring is agitated, all the springs begin to vibrate until the energy spreads evenly to the 64 springs.

Is that what happened?

AM: No. To their surprise, the energy spread throughout the system and then returned to the initial spring. It was then redispersed into the springs and then again returned to the initial point of agitation, and so on. These computer results completely contradicted their prediction of energy being evenly and progressively distributed, known as equipartition of energy. Since then, these results have been called the “Fermi-Pasta-Ulam-Tsingou paradox” or “Fermi-Pasta-Ulam-Tsingou recurrence”, referring to the recurring behavior of the system of springs. However, since the 1950s, other theoretical research has been carried out. This research has shown that by allowing a system of springs to vibrate for a very long time, equipartition is achieved.
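The experiment is easy to reproduce numerically today. The sketch below simulates the alpha-FPUT chain with 64 masses; the coupling strength, time step and run length are illustrative choices, not those of the original 1953 study.

```python
# Minimal numerical sketch of the alpha-FPUT experiment: 64 unit masses on
# weakly nonlinear springs with fixed ends, energy injected into the lowest
# normal mode. Parameters are illustrative, not those of the 1953 runs.
import numpy as np

N, alpha, dt, steps = 64, 0.25, 0.05, 400_000
j = np.arange(1, N + 1)
x = 4.0 * np.sin(np.pi * j / (N + 1))   # excite only the first normal mode
v = np.zeros(N)

def accel(x):
    xp = np.concatenate(([0.0], x, [0.0]))          # fixed ends
    lin = xp[2:] - 2.0 * xp[1:-1] + xp[:-2]
    nl = alpha * ((xp[2:] - xp[1:-1]) ** 2 - (xp[1:-1] - xp[:-2]) ** 2)
    return lin + nl

def mode_energy(x, v, k):
    """Energy stored in normal mode k of the linearized chain."""
    mode = np.sqrt(2.0 / (N + 1)) * np.sin(np.pi * k * j / (N + 1))
    Q, P = x @ mode, v @ mode
    omega = 2.0 * np.sin(np.pi * k / (2 * (N + 1)))
    return 0.5 * (P ** 2 + (omega * Q) ** 2)

for step in range(steps):                            # velocity-Verlet
    a = accel(x)
    x += v * dt + 0.5 * a * dt ** 2
    v += 0.5 * (a + accel(x)) * dt
    if step % 40_000 == 0:
        print(f"t = {step * dt:8.0f}   E(mode 1) = {mode_energy(x, v, 1):.3f}")
# Instead of spreading evenly over all 64 modes, the energy in mode 1 drains
# away and periodically flows back: the Fermi-Pasta-Ulam-Tsingou recurrence.
```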

Why is the work you refer to primarily theoretical and not experimental?

AM: In reality, the vibrations in the springs are absorbed by many external factors, such as friction with the air. In fact, experiments carried out to observe this paradox have not only concerned springs, but all nonlinear oscillating systems, such as laser beams in optical fibers. In this case, the vibrations are mainly absorbed by impurities in the glass that makes up the fiber. In all these systems, energy losses due to external factors prevent the observation of anything beyond the first recurrence. The system returns to its initial state, but it is then difficult to make it return to this state a second time. Yet it is in this step that new and rich physics emerge.

Is this where your work published in Nature Photonics on April 2nd comes into play?

AM: We thought it was a shame to be limited to a single recurrence, because many interesting things happen when we can observe at least two. We therefore had to find ways to limit the absorption of the vibrations in order to reach the second recurrence. To accomplish this, we added a second laser which amplified the first one to compensate for the losses. This type of amplification is already used in fiber optics to carry data over long distances. We diverted it from its original purpose to solve part of our problem. The other part was to succeed in observing the recurrence.

Whether it be a spring or an optical fiber, Fermi-Pasta-Ulam-Tsingou recurrence is common to all nonlinear systems.

Was the observation difficult to achieve?

AM: Compensating for energy losses was a crucial step, but it was pointless if we were not able to clearly observe what was happening in the fiber. To achieve this, we used the same impurities in the glass that absorb the light signal. These impurities reflect a small part of the laser circulating in the fiber. The returning light provides us with information on the evolution of the laser beam’s power as it propagates. This reflected part is then measured with another laser, synchronous with the first, to assess the phase difference between the two. This gives us additional information that allowed us to clearly reveal the second recurrence for the first time in the world.

What did these observation techniques reveal about the second recurrence?

AM: We were able to conduct the first experimental demonstration of what we call a break in symmetry. For given initial values of the energy sent into the system, there is a recurrence. But we also knew that, theoretically, if we slightly changed the initial values used to disturb the system, the distribution of energy would shift during the second recurrence: the system would not repeat the same values. In our experiment, we managed to reverse the maximum and minimum energy levels in the second recurrence compared to the first.

What perspectives does this observation create?

AM: From a fundamental perspective, the experimental observation of the theory predicting the break in symmetry is very interesting because it provides a confirmation. But in addition to this, the techniques we implemented to limit the absorption and observe what was occurring are very promising. We want to perfect them in order to go beyond the second recurrence. If we succeed in reaching the equipartition point predicted by Fermi, Pasta, Ulam and Tsingou, we will then be able to observe a continuum of light. In very simple terms, this is the moment when we no longer see the lasers’ pulsations.

Does this fundamental work have applications?

AM: In terms of applications, our work has allowed us to better understand how nonlinear systems develop. Yet these systems are often all around us. In nature, for example, they are the basic ingredients for forming rogue waves, the exceptionally high waves that can be observed in the ocean. With a better understanding of Fermi-Pasta-Ulam-Tsingou recurrence and the energy variations in nonlinear systems, we could better understand the mechanisms involved in shaping rogue waves and better detect them. Nonlinear systems are also present in many optical tools. Modern LIDARs which use frequency combs, or “laser rulers,” calculate distances by sending a laser beam and then very precisely timing how long it takes to return, like a radar except with light. Since these lasers have nonlinear behavior, here again our work can help optimize the operation of new-generation LIDARs, which could be used for navigating autonomous cars. Finally, calculations on nonlinear physical systems are also involved in detecting exoplanets, thanks to their extreme precision.

 


Can we trust blockchains?

Maryline Laurent, Télécom SudParis – Institut Mines-Télécom, Université Paris-Saclay


Blockchains were initially presented as a very innovative technology with great promise in terms of trust. But is this really the case? Recent events, such as the hacking of the Parity wallet ($30 million US) or the Tether firm ($31 million US), have raised doubts.

This article provides an overview of the main elements outlined in Chapter 11 of the book, Signes de confiance : l’impact des labels sur la gestion des données personnelles (Signs of trust: the impact of seals on personal data management) produced by the Personal Data Values and Policies Chair of which Télécom SudParis is the co-founder. The book may be downloaded from the chair’s website. This article focuses exclusively on public blockchains.

Understanding the technology

A blockchain can traditionally be thought of as a “big,” accessible and auditable account ledger deployed on the internet. It relies on a large number of IT resources spread around the world, called “nodes,” which help make the blockchain work. In the case of a public blockchain, everyone can contribute, as long as they have a powerful enough computer to execute the associated code.

Executing the code implies acceptance of the blockchain’s governance rules. These contributors are responsible for collecting transactions made by blockchain customers, aggregating transactions into a structure called a “block” (of transactions) and validating the blocks before they are linked to the blockchain. The resulting blockchain can be up to several hundred gigabytes and is duplicated a great number of times on the internet, which ensures wide availability of the blockchain.
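As a toy illustration of how such a ledger holds together, the sketch below chains blocks by their hashes and adds a miniature proof of work; real public blockchains add peer-to-peer networking, consensus rules and digital signatures on top of this skeleton.

```python
# Toy sketch of the "chain" in a blockchain: each block stores the hash of
# the previous one, so altering any past block breaks every later link.
import hashlib, json, time

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(transactions, prev_hash, difficulty=4):
    """Assemble a block and do a miniature proof of work: find a nonce
    whose hash starts with `difficulty` zero hex digits."""
    block = {"time": time.time(), "tx": transactions,
             "prev": prev_hash, "nonce": 0}
    while not block_hash(block).startswith("0" * difficulty):
        block["nonce"] += 1
    return block

chain = [new_block(["genesis"], prev_hash="0" * 64)]
chain.append(new_block(["alice -> bob: 5"], block_hash(chain[-1])))
chain.append(new_block(["bob -> carol: 2"], block_hash(chain[-1])))

# Auditability: anyone holding a copy can re-verify every link.
for prev, cur in zip(chain, chain[1:]):
    assert cur["prev"] == block_hash(prev), "chain has been tampered with"
print("chain valid:", len(chain), "blocks")
```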

Elements of trust

The blockchain is based on the following major conceptual principles, naturally positioning it as an ideal technology of trust:

  • Decentralized architecture and neutrality of governance based on the principle of consensus: it relies on a great number of independent contributors, making it decentralized by definition. This means that unlike a centralized architecture where decisions can be made unilaterally, a consensus must be reached, or a party must manage to control over 50% of the blockchain’s computing power (computer resources) to have an effect on the system. Therefore, any change in the governance rules must previously have been approved by consensus between the contributors, who must then update the software code executed.
  • Transparency of algorithms makes for better auditability: all transactions, all blocks, and all governance rules are freely accessible and can be read by everyone. This means that anyone can audit the system to ensure the correct operation of the blockchain and legitimacy of the transactions. The advantage is that experts in the community of users may closely examine the code and report anything that seems suspicious. Trust is therefore based on whistleblowers.
  • Secure underlying technology: Cryptographic techniques and terms of use guarantee that the blockchain cannot be altered, that the transactions recorded are authentic even if they have been made under a pseudonym, and that blockchain security is able to keep up with technological advances thanks to an adaptive security level.

Questions remain

Now we will take a look at blockchains in practice and discuss certain events that have raised doubts about this technology:

  • A 51% attack: Several organizations that contribute significantly to running a blockchain can join forces in order to control at least 51% of the blockchain’s computing power between them. For example, China is known to concentrate a large share of the bitcoin blockchain’s computing power, around two thirds of it in 2017. This raises questions about the distributed character of the blockchain and the neutrality of its governance, since it results in completely uneven decision-making power. Indeed, majority organizations can censor transactions, which impacts the blockchain’s history, or worse still, they gain considerable power to push through governance rules of their own choosing.
  • Hard fork: When new governance rules that are incompatible with previous ones are brought forward in the blockchain, this leads to a “hard fork,” meaning a permanent change in the blockchain, which requires a broad consensus amongst the blockchain contributors for the new rules to be accepted. If a consensus is not reached, the blockchain forks, resulting in the simultaneous existence of two blockchains: one that operates according to the previous rules and the other according to the new rules. This forking of the chain undermines the credibility of the two resulting blockchains, leading to the devaluation of the associated cryptocurrency. It is worth noting that a hard fork brought about as part of a 51% attack is more likely to succeed in getting the new rules adopted, since a consensus will be reached more easily.
  • Money laundering: Blockchains are transparent by their very nature but the traceability of transactions can be made very complicated, which facilitates money laundering. It is possible to open a large number of accounts, use the accounts just once, and carry out transactions under the cover of a pseudonym. This raises questions about all of a blockchain’s contributors, since their moral values are essential to running the blockchain, and harms the technology’s image.
  • Programming errors: Errors can be made in smart contracts, the programs that are automatically executed within a blockchain, and can have a dramatic impact on industrial players. Due to one such error, an attacker was able to steal $50 million US from the DAO organization in 2016. Organizations that fall victim to such bugs may seek to invalidate these harmful transactions (the DAO succeeded in provoking a hard fork for this purpose), calling into question the very principle of the inalterability of the blockchain. Indeed, if blocks that have previously been recorded as valid in a blockchain are then made invalid, this raises questions about the blockchain’s reliability.

To conclude, the blockchain is a very promising technology that offers many characteristics to guarantee trust, but the problem lies in the disconnect between the promises of the technology and the ways in which it is used. This leads to a great deal of confusion and misunderstandings about the technology, which we have tried to clear up in this article.

Maryline Laurent, Professor and Head of the R3S Team at the CNRS SAMOVAR Laboratory, Télécom SudParis – Institut Mines-Télécom, Université Paris-Saclay

The original version of this article was published in French on The Conversation France.


 


Campus Mondial de la Mer: promoting Brittany’s marine science and technology research internationally

If the ocean were a country, it would be the world’s 7th-largest economic power, according to a report by the WWF, and the wealth it produces could double by 2030. The Brittany region, at the forefront of marine science and technology research, can make an important contribution to this global development. This is what the Campus Mondial de la Mer (CMM), a Brittany-based academic community, intends to prove. The aim of the Campus is to promote regional research at the international level and support the development of a sustainable marine economy. René Garello, a researcher at IMT Atlantique, a partner of the CMM, answers our questions about this new consortium’s activities and areas of focus.

 

What is the Campus Mondial de la Mer (CMM) and what are its objectives?

René Garello: The Campus Mondial de la Mer is a community of research institutes and other academic institutions, including IMT Atlantique, created through the initiative of the Brest-Iroise Technopôle (Technology Center). Its goal is to highlight the excellence of research carried out in the region focusing on marine sciences and technology. The CMM monitors technological development, promotes research activities and strives to bring international attention to this research. It also helps organize events and symposiums and disseminates information related to these initiatives. The campus’s activities are primarily intended for academics, but they also attract industrial players.

The CMM hosts events and supports individuals seeking to develop new projects as part of its goal to boost the region’s economic activity and create a sustainable maritime economy, which represents tremendous potential at the global level. An OECD report on the ocean economy in 2030 shows that by developing all ocean-based industries, the ocean economy’s output could be doubled, from $1.5 trillion US currently to $3 trillion US in 2030! The Campus de la Mer strives to support this development by promoting Brittany-based research internationally.

What are the Campus Mondial de la Mer‘s areas of focus?

RG: The campus is dedicated to the world of research in the fields of marine science and technology. On the technological side, underwater exploration using underwater drones, or autonomous underwater vehicles, is an important focus area. These vehicles are highly autonomous; it’s as if they had their own little brains!

Another important focus area involves observing the ocean and the environment using satellite technology. Research in this area mainly involves the application of data from these observations, from both a geophysical and oceanographic perspective and in order to monitor ocean-based activities and the pollution they create.

Finally, a third research area is concerned more with physics, biology and chemistry. This area is primarily led by the University of Western Brittany, which has a large research department related to oceanography, and Institut Universitaire Européen de la Mer.

What sort of activities and projects does the Campus de la Mer promote?

RG: One of the CMM’s aims is to promote the ESA-BIC Nord-France project (European Space Agency – Business Incubator Center), a network of incubators for the regions of Brittany, Hauts-de-France, Ile-de-France and Grand-Est, which provides opportunities for financial and technological support for startups. This project is also connected to the Seine Espace Booster and Morespace, which have close ties with the startup ecosystem of the IMT Atlantique incubator.

Another project supported by the Campus Mondial de la Mer involves creating a collaborative space between IMT Atlantique and Institut Universitaire Européen de la Mer, based on shared research themes for academic and industrial partners and our network of startups and SMEs.

The CMM also supports two projects led by UBO. The first is ISblue, the University Research School (EUR) for Marine Science and Technology, developed through the 3rd Investments in the Future program. Ifremer and some of the laboratories associated with the engineering schools IMT Atlantique, ENSTA Bretagne, ENIB and École Navale (Naval Academy) are involved in this project. The second project consists of housing the UNU-OCEAN institute on the site of the Brest-Iroise Technology Center, with a five-year goal of accommodating 25-30 individuals working at the center of an interdisciplinary research and training ecosystem dedicated to marine science and technology.

Finally, the research themes highlighted by the CMM are in keeping with the aims of GIS BreTel, a Brittany Scientific Interest Group on Remote Sensing that I run. Our work aligns perfectly with the Campus’s approach. When we organize a conference or a symposium, whether at the Brest-Iroise Technology Center or the CMM, everyone participates! This also helps give visibility to research carried out at GIS BreTel and to promote our activities.


Laure Bouquerel wins the SAMPE France competition for her thesis on composite materials for aeronautics

Simulating deformations during the molding stage in a new composite material for the aeronautics industry: this is the subject of Laure Bouquerel’s research at Mines Saint-Étienne as part of her CIFRE PhD thesis with INSA Lyon. The young researcher, winner of the SAMPE France competition, will present her work at the SAMPE France technical days in Bordeaux on 29 and 30 November 2018 and will compete for the World Selection in Southampton during the European meetings in September.

 

An aircraft must be lightweight… but durable! The aircraft’s primary parts, such as the wings and the fuselage, form its structure and bear the greatest stress. These parts, initially manufactured using aluminum, have progressively been replaced by composite materials containing carbon fibers and polymer resin for enhanced mechanical performance and resistance to corrosion, while also reducing the mass. The mass issue is at the heart of the air transport industry: mass savings lead to a higher payload proportion for aircraft, while also decreasing fuel consumption.

Traditionally, composite materials for primary parts are molded using indirect processes. This involves using a set of carbon fibers that are pre-impregnated with resin. The part is manufactured by an automated process that superimposes the layers, which are then cured in an autoclave, a pressurized oven. This is currently the most widely used process in the aeronautics industry. It is also the most expensive, due to the processes involved, the material used and its storage.

“Hexcel offers a direct process using a new-generation material it has developed: HiTape®. It is a dry, unidirectional reinforcement composed of carbon fibers sandwiched between two thermoplastic webs. It is intended to be deposited using an automated process, then molded before the resin is injected,” Laure Bouquerel explains. The researcher is conducting a thesis at Mines Saint-Étienne on this material that Hexcel is working to develop. The goal is to simulate the molding process involving the stacking of carbon fiber reinforcements in order to better understand and anticipate the deformations and defects that could occur. This work is what earned the young materials specialist an award at the SAMPE France* competition.

Anticipating defects to limit costs

“The carbon fibers in the HiTape® material are all aligned in the same direction. The rigidity is at its maximum in the direction of the fibers. Several layers are deposited in different directions to manufacture a part. This offers very good rigidity in the desired directions, which were identified during the design phase for the structure,” Laure Bouquerel explains. Yet due to the HiTape® material’s specific structure and the presence of the thermoplastic web, specific deformations occur during the molding phase. The tension in the reinforcement is predominant, and wrinkling can occur when the material is bent. Finally, friction can occur between the various reinforcement layers.

“The appearance of wrinkles is a classic problem. As they become wrinkled, the fibers are no longer straight, and the load placed on the material will not be transferred as well,” the researcher observes. “These wrinkles also cause the development of areas that are less dense in fiber, where the resin will accumulate after the molding stage, creating zones of weakness in the material.” As these deformations appear, the final part’s overall structure is weakened.

The aim of Laure Bouquerel’s thesis work is to digitally simulate the molding process for the HiTape® material in order to identify and predict the appearance of deformations and then improve the molding process through reverse engineering. Why the use of digital simulation? This method eliminates all the trial and error involving real materials in the laboratory, thus reducing the time and cost involved in developing the product.

A great opportunity for the young researcher

A graduate of Centrale Nantes engineering school, the young researcher became specialized in this field while working toward her Master’s in advanced materials from Cranfield University in England. After earning these two degrees, she further defined her vocation during her work placement year. Laure Bouquerel began her career with Plastic Omnium, an automobile parts supplier in Lyon, and with Airbus in Germany, which explains her specialization in composite materials for the aeronautics industry.

As a winner of the SAMPE France competition, the PhD student will present her work at the SAMPE France technical days in Bordeaux on 29 and 30 November and will participate in the SAMPE Europe competition in Southampton from 11 to 13 September. This will provide a unique opportunity to give visibility to her work. “It will be an opportunity to meet with other industry stakeholders and other PhD students working on similar topics. Talking with peers can inspire new ideas for advancing our own research!”


*An international competition dedicated to materials engineering

SAMPE (Society for the Advancement of Material Process Engineering) rewards the best theses on the study of structural materials through an international competition. The French edition, SAMPE France, which Laure Bouquerel won, was held at Mines Saint-Étienne on March 22 and 23. The global competition will be held in Southampton from September 11 to 13 during the SAMPE Europe days. The aim of these international meetings is to bring together manufacturers and researchers from the field of advanced materials to develop professional networks and present the latest technical innovations.

 

 


HyBlockArch: hybridizing the blockchain for the industry of the future

Within the framework of the German-French Academy for the Industry of the Future, a partnership between IMT and Technische Universität München (TUM), the HyBlockArch project examines the future of the blockchain. This project aims to adapt this technology to an industrial scale to create a powerful tool for companies. To accomplish this goal, the teams led by Gérard Memmi (Télécom ParisTech) and Georg Carle (TUM) are working on new blockchain architectures. Gérard Memmi shares his insight.

 

Why are you looking into new blockchain architectures?

Gérard Memmi: Current blockchain architectures are limited in terms of performance in the broadest sense: turnaround time, memory, energy… In many cases, this hinders the adoption of the blockchain in Industry 4.0. Companies would like to see faster validation times or to be able to put even more information into a blockchain block. A bank that wants to track an account history over several decades will be concerned about the number of blocks in the blockchain and the possible increase in block latency times. Yet today we cannot foresee how blockchain architectures will behave many years from now. There is also the energy issue: the need to reduce the consumption caused by the proof of work required to enter data into a blockchain, while still ensuring a comparable level of security. We must keep in mind that bitcoin’s proof of work consumes as much electrical energy as a country like Venezuela.

What type of architecture are you trying to develop with the HyBlockArch project?

GM: We are working on hybrid architectures. These multi-layer architectures make it possible to reach an industrial scale. We start with a blockchain protocol in which each node of the ledger communicates with a mini data storage network on a layer above. This upper layer is not necessarily a blockchain protocol, and it can operate slightly differently while still maintaining similar properties. The structure is transparent for users; they do not notice a difference. The miners who perform the proof of work required to validate data only see the blockchain aspect. This is an advantage for them, allowing them to work faster without taking the upper layer of the architecture into account.
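The interview does not detail HyBlockArch’s internals, but one plausible reading of such a hybrid is sketched below: a fast upper layer batches records and periodically anchors only a compact digest of each batch in the slower base chain, so users see quick writes while miners see ordinary blocks.

```python
# Highly simplified sketch of one plausible hybrid pattern; the interview
# does not spell out HyBlockArch's actual design. A fast upper layer batches
# records off-chain and anchors only a digest of each batch in the base chain.
import hashlib

def digest(records: list[str]) -> str:
    """One fingerprint for a whole batch (a Merkle root in practice)."""
    h = hashlib.sha256()
    for r in records:
        h.update(hashlib.sha256(r.encode()).digest())
    return h.hexdigest()

base_chain: list[str] = []   # slow layer: one anchor per batch
batch: list[str] = []        # fast layer: individual logistics events

def record(event: str, batch_size: int = 3) -> None:
    batch.append(event)              # instant from the user's point of view
    if len(batch) == batch_size:     # anchor the full batch, then start over
        base_chain.append(digest(batch))
        batch.clear()

for e in ["vendor -> carrier", "carrier -> warehouse", "warehouse -> store"]:
    record(e)
print("anchors on the base chain:", base_chain)
```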

What would the practical benefits be for a company?

GM: For a company, this would mean smart contracts could be created more quickly and the computer operations that rely on this architecture would have shorter latency times, resulting in a broader scope of application. The private blockchain is very useful in the field of logistics. For example, each time a product changes hands, such as from the vendor to the carrier, the operation is recorded in the blockchain. A hybrid architecture records this information more quickly and at a lower cost for companies.

This project is being carried out in the framework of the German-French Academy for the Industry of the Future. What is the benefit of this partnership with Technische Universität München (TUM)?

GM: Our German colleagues are developing a platform that measures the performance of the different architectures. We can therefore determine the best architecture in terms of energy savings, fast turnaround and security for typical uses in the industry of the future. We contribute a more theoretical aspect: we analyze smart contracts to develop more advantageous protocols, and we work on proof-of-work mechanisms for recording information in the blockchain.

What does this transnational organization represent in the academic field?

GM: This creates a European dynamic in the work on this issue. In March we launched a blockchain alliance between French institutes: BART. By working together with TUM on this topic, we are developing a Franco-German synergy in an area that only a few years ago appeared at research conferences as a minor issue, the topic of a single session. The blockchain now has scientific events all to itself. This new discipline is booming, and through the HyBlockArch project we are participating in this growth at the European level.

 


C2Net: supply chain logistics on cloud nine

A cloud solution to improve supply chain logistics? This is the principle behind the European C2Net project. Launched on January 1, 2015, the project was completed on December 31, 2017. It successfully demonstrated how a cloud platform can enable the various players in a supply chain to better anticipate and manage future problems. To do so, C2Net drew on research on interoperability and on the automation of alerts using data taken directly from companies in the supply chain. Jacques Lamothe and Frédérick Benaben, researchers in industrial engineering specializing in logistics and information systems respectively, give us an overview of the work they carried out at IMT Mines Albi on the C2Net project.

 

What was the aim of the C2Net project?

Jacques Lamothe: The original idea was to provide cloud tools for SMEs to help them with advanced supply chain planning. The goal was to identify future inventory management problems companies may have well in advance. As such, we had to work on three parts: recovering data from SMEs, generating alerts for issues to be resolved, and monitoring planning activity to see if everything went as intended. It wasn’t easy because we had to respond to interoperability issues — meaning data exchange between the different companies’ information systems. And we also had to understand the business rules of the supply chain players in order to evaluate the relevant alerts.

Could you give us an example of the type of problem a company may face?

Frédérick Benaben: One thing that can happen is that a supplier is only able to manufacture 20,000 units of an item while the SME is expecting 25,000. This makes for a strained supply chain and solutions must be found, such as compensating for this change by asking suppliers in other countries if they can produce more. It’s a bit like an ecosystem: when there’s a problem in one part, all the players in the supply chain are affected.

Jacques Lamothe: What we actually realized is that, a lot of the time, certain companies have very effective tools to assess the demand on one side, while other companies have very effective tools to measure production on the other side. But it is difficult for them to establish a dialogue between these two parts. In the chain, the manufacturer does not necessarily notice when there is lower demand and vice versa. This is one of the things the C2Net demonstrator helped correct in the use case we developed with the companies.
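The kind of alert being described can be pictured in a few lines. The sketch below is purely illustrative, with figures echoing the supplier example above; nothing in it is C2Net code.

```python
# Purely illustrative sketch (not C2Net code) of the kind of automated alert
# discussed above: compare confirmed supplier capacity with planned demand
# and flag future shortfalls early. Figures echo the 20,000 vs 25,000 example.
planned_demand = {"week 40": 25_000, "week 41": 24_000}
confirmed_supply = {"week 40": 20_000, "week 41": 24_000}

for week, needed in planned_demand.items():
    available = confirmed_supply.get(week, 0)
    if available < needed:
        print(f"ALERT {week}: shortfall of {needed - available:,} units, "
              "look for alternative suppliers")
```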

And what were the companies’ expectations for this project?  

Jacques Lamothe: For the C2Net project, each academic partner brought an industrial partner it had already worked with. And each of these SMEs had a different set of problems. In France, our partner for the project was Pierre Fabre. They were very interested in data collection and creating an alert system. On the Spanish side, this was less of a concern than optimizing planning. Every company has its own issues, and the use cases the industrial partners brought us meant we had to find solutions for everyone: from generating data on their supply chains to creating tools to allow them to manage alerts or planning.

To what extent has your research work had an impact on the companies’ structures and the way they are organized?

Frédérick Benaben: What was smart about the project is that we did not propose the C2Net demonstrator as a cloud platform that would replace companies’ existing systems. Everything we did is situated a level above the organizations so that they will not be impacted, and integrates the existing systems, especially the information systems already in place. So the companies did not have to be changed. This also explains why we had to work so hard on interoperability.

What did the work on interoperability involve?

Frédérick Benaben: There were two important interoperability issues. The first was being able to plug into existing systems in order to collect information and understand what was collected. A company may have different subcontractors, all of whom use different data formats. How can a company understand and use the data from both subcontractor A, which is provided in one language and that of subcontractor B, which is provided in another? We therefore had to propose data reconciliation plans.

The second issue involves interpretation. Once the data has been collected and everyone is speaking the same language, or at least can understand one another, how can common references be established? For example, having everyone speak in liters for quantities of liquids instead of vials or bottles. Or, when a subcontractor announces that an item may potentially be out of stock, what does this really mean? How far in advance does the subcontractor notify its customers? Does everyone have the same definition? All these aspects had to be harmonized.
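A minimal sketch of this reconciliation step, with invented record formats and conversion factors: each partner’s quantities are mapped onto one shared reference unit before any comparison is made.

```python
# Sketch of the reconciliation step described above, with invented formats
# and conversion factors: every partner's quantities are mapped onto one
# shared reference unit (liters) before any comparison is made.
UNIT_TO_LITERS = {"liter": 1.0, "bottle": 0.75, "vial": 0.01}

def normalize(record: dict) -> dict:
    """Translate a supplier-specific record into the shared schema."""
    liters = record["quantity"] * UNIT_TO_LITERS[record["unit"]]
    return {"product": record["product"], "liters": liters}

supplier_a = {"product": "syrup", "quantity": 2_000, "unit": "vial"}
supplier_b = {"product": "syrup", "quantity": 40, "unit": "bottle"}

print(normalize(supplier_a))   # {'product': 'syrup', 'liters': 20.0}
print(normalize(supplier_b))   # {'product': 'syrup', 'liters': 30.0}
```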

How will these results be used?

Jacques Lamothe: The demonstrator has been installed at the University of Valencia in Spain and should be reused for research projects. As for us, the results have opened up new research possibilities. We want to go beyond a tool that can simply detect future problems or allow companies to be notified. One of our ideas is to work on solutions that make it possible to make more or less automated decisions to adjust the supply chain.

Frédérick Benaben: A spin-off has also been developed in Portugal. It uses a portion of the data integration mechanisms to propose services for SMEs. And we are still working with Pierre Fabre too, since their feedback has been very positive. The demonstrator helped them see that it is possible to do more than what they are currently able to do. In fact, we have developed and submitted a partnership research project with them.


 


What is artificial intelligence?

Artificial intelligence (AI) is a hot topic. In late March, the French government organized a series of events dedicated to this theme, the most notable of which was the publication of the report “For a Meaningful Artificial Intelligence,” written by Cédric Villani, a mathematician and member of the French parliament. The buzz around AI coincides with companies’ and scientists’ renewed interest in the topic. Over the last few years AI has become fashionable again, as it was in the 1950s and 1960s. But what does this term actually refer to? What can we realistically expect from it? Anne-Sophie Taillandier, director of IMT’s TeraLab platform dedicated to big data and AI, is working on innovations and technology transfer in this field. She was recently listed as one of the top 20 individuals driving AI in France by L’Usine Nouvelle. She sat down with us to present the basics of artificial intelligence.

 

How did AI get to where it is today?

Anne-Sophie Taillandier: AI has been a key innovation topic for two or three years now. What has helped create this dynamic is the closer relationship between two scientific fields: information sciences and big data, both of which focus on the question, “How can information be extracted from data, whether big or small?” The results have been astonishing. Six years ago, we were only able to automatically recognize tiny pieces of images. When deep learning was developed, the recognition rate skyrocketed. And if we have been able to run the algorithms on large volumes of images, it is because hardware has made it possible to perform the computations in a reasonable amount of time.

What technology is AI based on?

AST: Artificial intelligence is the principle of extracting and processing information. This requires tools and methods. Machine learning is a method that brings together statistical techniques such as neural networks. Deep learning is another technique that relies on deeper neural networks. These two methods have some things in common; what makes them different is the tools chosen. In any event, both technologies are based on the principle of learning: the system learns from an initial database and is then used on other data. The results are assessed so that the system can keep learning. But AI itself is not defined by these technologies. In the future, there may be other types of technology that will also be considered artificial intelligence. And even today, researchers in robotics sometimes use different algorithms.
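That learning loop (learn from an initial dataset, apply the system to unseen data, assess the results) fits in a few lines. The sketch below uses scikit-learn’s small digits dataset and a basic neural network; it is a generic illustration, not TeraLab code.

```python
# Generic illustration of the learning principle (not TeraLab code): fit a
# small neural network on an initial dataset, then measure its recognition
# rate on data it has never seen.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)                 # small image dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)          # learn from the initial database
print(f"recognition rate on unseen data: {model.score(X_test, y_test):.1%}")
```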

Can you give some specific examples of the benefits of artificial intelligence?

AST: The medical sector is a good illustration. In medical imaging, for example, we can teach an algorithm to detect cancerous tumors. It can then help doctors look for parts of an image that require their attention. We can also adjust a patient’s treatment depending on a lot of different data: is he alone or does he have a support network? Is he active or inactive? What is his living environment like? All these aspects contribute to personalized medicine, which has only become possible because we know how to process all this data and automatically extract information. For now, artificial intelligence is mainly used as a decision-making aid. Ultimately, it’s a bit like what doctors do when they ask patients questions, but in this case we help them gather information from a wide range of data. With AI, the goal is first and foremost to reproduce something that we know very well.

How can we distinguish between solutions that involve AI and others?

AST: I would say that it’s not really important. What matters is if using a solution provides real benefits. This question often comes up with chatbots, for example. Knowing whether AI is behind them or not — whether it’s just a decision tree based on a previous scenario or if it’s a human — is not helpful. As a consumer, what’s important to me is that the chatbot in front of me can answer my questions. They’re always popping up on sites now, which is frustrating since a lot of the time they are not particularly useful! So it is how a solution is used that really matters, more than the technology behind it.

Does the fact that AI is “trendy” adversely affect important innovations in the sector?

AST: With TeraLab we are working on very advanced topics with researchers and companies seeking cutting-edge solutions. If people exaggerate in their communication materials or use the term “artificial intelligence” in their keywords, it doesn’t affect us. I’d rather that the public become familiar with the term and think about the technology already present in their smartphones than fantasize about something inaccessible.

 

Atmospheric reactive trace gases: low concentrations, major consequences

Despite only being present in very small quantities, trace gases leave their mark on the atmospheric composition. Since they are reactive, they may lead to the formation of secondary compounds such as ozone or aerosols that have a significant impact on health and the climate. IMT Lille Douai is a partner in the ACTRIS H2020 project, which aims to carry out long-term observations of trace gases, aerosols and clouds to better understand how they interact with one another and how they impact the climate and air quality.

 

Take some nitrogen, add a dose of oxygen, sprinkle in some argon and a few other inert gases, add a touch of water vapor and a pinch of carbon dioxide and you have the Earth’s atmosphere, or almost! Along with this mix composed of approximately 78% nitrogen, an honorable 21% oxygen, less than 1% argon and 0.04% carbon dioxide, you will also find trace gases with varying degrees of reactivity.  Emitted by both anthropogenic and natural sources, these gases exist in concentrations in the nanogram range, meaning 0.000000001 gram per cubic meter of the atmosphere. Does this mean they are negligible? Not really! “Once emitted these gases are not inert, but reactive,” explains Stéphane Sauvage, a researcher in atmosphere sciences and environmental technology at IMT Lille Douai. “They will react with one another in the atmosphere and lead to the formation of secondary species, such as ozone or certain aerosols that have a major impact on health and the climate.” This is why it is important to be able to identify and measure the precise quantity of these gases in the atmosphere.

ACTRIS (Aerosols, Clouds and Trace Gases Research Infrastructure) is a large-scale H2020 project which brings together 24 countries and over 100 laboratories, including IMT Lille Douai, as part of the ESFRI (European Strategy Forum on Research Infrastructure). By combining ground-based and satellite measurements, the aim is to carry out long-term observations of the composition of the atmosphere to better understand the factors behind the contaminants and their impact on the climate and air quality. In terms of innovation, the project seeks to develop new techniques and methods of observation. “At IMT Lille Douai, we have been developing our skills in ground-based observation of trace gases for many years, which has led to our being identified as contributors with extensive expertise on the topic,” says Stéphane Sauvage.

 

Gases that leave a mark on the atmosphere

Trace gases, which come from automobile exhausts, household heating, agricultural activities and emissions from plants and volcanoes, are good “tracers,” meaning that when they are measured, it is possible to identify their original source. But out of the 200 to 300 different species of trace gases that have been identified, some are still little-known since they are difficult to measure. “There are some very reactive species that play a key role in the atmosphere, but with such short lifetimes or in such low concentrations that we are not able to detect them,” explains Stéphane Sauvage.

Sesquiterpenes, a family of trace gases, are highly reactive. Emitted from vegetation, they play an important role in the atmosphere but remain difficult to quantify with current methods. “These gases have a very short lifetime, low atmospheric concentrations and they degrade easily during sample collection or analysis,” says Stéphane Sauvage.

On the other hand, some species, such as ethane, are well-known and measurable. Ethane results from human activity and has a low level of reactivity, but this does not make it any less problematic. It is present at a non-negligible level on a global scale and has a real impact on the formation of ozone. “We recently published an article in the Nature Geoscience journal about the evolution of this species and we realized that its emissions have been underestimated,” notes Stéphane Sauvage.

 

Complex relationships between aerosols, clouds and trace gases

In addition, by reacting with other atmospheric compounds, trace gases can lead to the formation of aerosols, which are suspensions of fine particles. Due to their capacity to absorb light, these particles impact the climate but can also penetrate the respiratory system leading to serious health consequences. “Although natural and anthropogenic sources are partially responsible for these fine particles, they are also produced during reactions with reactive trace gases through complex processes which are not yet entirely understood,” explains Stéphane Sauvage. This illustrates the importance of the ACTRIS project, which will observe the interactions between trace gases and aerosols, as well as clouds, which are also affected by these compounds.

Read more on I’MTech: What are fine particles?

The measurements taken as part of the ACTRIS project will be passed on to a number of different players including weather and climate operational services, air quality monitoring agencies, the European Space Agency and policy-makers, and will also be used in agriculture, healthcare and biogeosciences. “The ACTRIS infrastructure is currently being built. We will enter the implementation phase in 2019, then the operational phase will begin in around 2025 and will last 25 years,” says Stéphane Sauvage. This is a very long-term project to organize research on a European scale, drawing on the complementary skills of over 100 research laboratories from 24 countries — to take atmospheric sciences to a stratospheric level!

 


A workshop on data from observations of reactive trace gases

Engineers and researchers from ten European countries met at IMT Lille Douai from 16 to 18 May for the annual ACTRIS project workshop on reactive trace gases. The objective was to review the data collected in Europe in 2017 and to discuss its validity along with the latest scientific and technical developments. All the players involved in making ground-based measurements of trace gases, aerosols and clouds will meet at IMT Lille Douai in October.
