
AiiNTENSE: AI for intensive care units

The start-up AiiNTENSE, incubated at IMT Starter, develops decision support tools for healthcare, with the aim of advising intensive care personnel on the most appropriate therapeutic procedures. To this end, the start-up is building a data platform covering all the diseases and conditions encountered in intensive care, which it has made available to researchers. It thereby seeks to support the launch of clinical studies and expand medical knowledge.

 

Patients are often admitted to intensive care units for neurological reasons, especially comas. And patients who leave these units are at risk of developing neurological complications that may impact their cognitive and functional capacities. These situations pose diagnostic, therapeutic and ethical problems for physicians. How can neurological damage following intensive care be predicted in the short, medium and long term in order to provide appropriate care? What will the neurological evolution of a coma patient be: brain death, a vegetative state or partial recovery of consciousness? An incorrect assessment of the prognosis could have tragic consequences.

In 2015, Professor Tarek Sharshar, a neurologist specializing in intensive care, saw a twofold need for training: neurology training for intensivists on the one hand, and intensive care training for neurologists on the other. He proposed a tele-expertise system connecting the two communities. In 2017, this project gave rise to AiiNTENSE, a start-up incubated at IMT Starter, whose focus soon expanded. “We started out with our core area of expertise, neuro-intensive care, and then, drawing on support from other experts and learned societies, we shifted to developing decision support tools for all of the diseases and conditions encountered in intensive care units,” says Daniel Duhautbout, co-founder of AiiNTENSE. The start-up is developing a database of patient records which it analyzes with algorithms using artificial intelligence.

AI to aid in diagnosis and prognosis

The start-up team is working on a prototype concerning post-cardiac arrest coma. Experts largely agree on methods for assessing the neurological prognosis for this condition. And yet, in 50% of cases, physicians are still unable to determine whether or not a patient will wake from the coma. “Providing a prognosis for a patient in a coma is extremely complex, and many available variables are not taken into account, due to a lack of appropriate clinical studies and tools to make use of these variables,” explains Daniel Duhautbout. That’s where the start-up comes in.

In 2020, AiiNTENSE will launch its pilot prototype in five or six hospitals in France and abroad. This initial tool comprises, first and foremost, patient records taken from the hospital’s information system, which contain all the data relevant to making medical decisions. This includes structured biomedical information and unstructured clinical data (hospitalization and exam reports). To make use of the latter, the start-up uses natural language processing technology. The result is patient records with semantically homogenized data that comply with international interoperability standards.
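As a purely illustrative sketch of this kind of processing (and not AiiNTENSE’s actual technology), a few structured fields can be extracted from a free-text report with simple pattern matching before being mapped to an interoperable format. The report, field names and patterns below are hypothetical:

```python
import re

# Hypothetical free-text hospitalization report (illustrative only).
REPORT = "Patient admitted after cardiac arrest. GCS 6 on admission. Lactate 4.2 mmol/L."

# Hypothetical extraction patterns; a real system would rely on full natural
# language processing and standard terminologies rather than regexes.
PATTERNS = {
    "glasgow_coma_scale": r"GCS\s+(\d+)",
    "lactate_mmol_per_l": r"Lactate\s+([\d.]+)\s*mmol/L",
}

def extract_structured(report: str) -> dict:
    """Return a flat record of the numeric fields found in the free text."""
    record = {}
    for field, pattern in PATTERNS.items():
        match = re.search(pattern, report, flags=re.IGNORECASE)
        if match:
            record[field] = float(match.group(1))
    return record

print(extract_structured(REPORT))
# {'glasgow_coma_scale': 6.0, 'lactate_mmol_per_l': 4.2}
```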

A use for each unit

The start-up is developing a program that will, in time, respond to intensivists’ immediate needs. It will provide a quick, comprehensive view of an individual patient’s situation. The tool will offer recommendations for therapeutic procedures or additional observations to help reach a diagnosis. It will also guide the physician in assessing how the patient’s condition is likely to evolve. The intensivist will still have access to an expert from AiiNTENSE’s tele-expertise network to discuss cases in which the medical knowledge implemented in the AiiNTENSE platform is not sufficiently advanced.

The start-up also indirectly addresses hospital management issues. Proposing accurate, timely diagnoses means limiting unnecessary exams, making for shorter hospital stays and, therefore, lower costs. In addition, the tool optimizes the traceability of analyses and medical decisions, a key medico-legal priority.

In the long term, the start-up seeks to develop a precision intensive care model. That means being able to provide increasingly reliable diagnoses and prognoses tailored for each patient. “For the time being, for example, it’s hard to determine what a patient’s cognitive state will be when they awaken from a coma. We need clinical studies to improve our knowledge,” says Daniel Duhautbout. The database and its analytical tools are therefore open to researchers who wish to improve our knowledge of conditions that require intensive care. The results of their studies will then be disseminated through AiiNTENSE’s integration platform.

Protecting data on a large scale

In order to provide a viable and sustainable solution, AiiNTENSE must meet GDPR requirements and protect personal health data. With this aim, the team is collaborating with researchers at IMT Atlantique and plans to use blockchain technology to protect data. Watermarking, a sort of invisible mark attached to data, also appears to be a promising approach. It would make it possible to track those who use the data and identify who may have been involved in the event of a data leak to external servers. “We also take care to ensure the integrity of our algorithms so that they support physicians confronted with critical neurological patients in an ethical manner,” concludes Daniel Duhautbout.
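As a toy illustration of the watermarking idea (and not the scheme actually being studied with IMT Atlantique), the digits of a recipient identifier can be hidden in the low-order decimals of numeric values, so that a leaked copy can be traced back to whoever received it:

```python
# Toy data watermarking: hide the digits of a recipient identifier in the
# 4th decimal place of numeric values. Purely illustrative; real database
# watermarking schemes are far more robust than this.

def watermark(values, recipient_id):
    """Embed the digits of recipient_id cyclically in the 4th decimal place."""
    digits = [int(d) for d in str(recipient_id)]
    return [round(v, 2) + digits[i % len(digits)] * 1e-4
            for i, v in enumerate(values)]

def extract_id(marked_values, id_length):
    """Recover the embedded identifier from the first id_length values."""
    digits = [int(round((v - round(v, 2)) * 1e4)) for v in marked_values[:id_length]]
    return int("".join(str(d) for d in digits))

measurements = [37.2, 36.8, 38.1]                 # e.g. temperature readings
marked = watermark(measurements, recipient_id=42)
print(extract_id(marked, id_length=2))            # prints 42
```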

 


Meet your supply chain using virtual reality

Immersive technologies for industrial engineering and risk management? This is the focus of research to be carried out at the SIReN laboratory launched on 15 November, bringing together researchers from IMT Mines Albi (France) and Georgia Tech (USA). On the French side, Frédérick Benaben, an expert in collaborative networks, is already using virtual reality to develop a supply chain decision support and management tool for companies.  

 

In front of you, a beam of green light traces a path leading you straight to the finishing line. You’re continuing along this path when, suddenly, a floating red sphere comes flying right toward you! It makes you veer sharply from your path, taking you away from where you were headed. You now face a crucial question: how can you achieve your goal from this new point? This is not a revolutionary video game set to be released at the end of the year. It’s a decision support tool developed by researchers at IMT Mines Albi and Georgia Tech to facilitate the visualization of data produced by artificial intelligence.

Building on their collaboration begun in 2015, the two academic partners have continued their joint research since 15 November through SIReN[1], a new international associated laboratory jointly based in Albi and Atlanta. “At the laboratory, we’re carrying out research on how immersive technology can help us develop and manage response networks,” explains Frédérick Benaben, an IMT Mines Albi researcher who specializes in the field of collaborative networks and information systems. Such networks include supply chains and crisis management. The researchers’ expertise is based on an original vision of artificial intelligence, at the crossroads between industrial engineering and computer and data sciences, and on a partnership that is already as concrete as the work it seeks to carry out.

Making the abstract concrete with virtual reality

A supply chain is a dynamic system that must be as agile as possible. It evolves over time in response to opportunities (opening of a new market, tax reductions, etc.) or risks (weather events, closing of a border, etc.). Yet understanding how these various events could impact the supply chain proves to be very complex. That’s where virtual reality comes in!

Read more on I’MTech: What is supply chain management?

Unlike “traditional” uses of virtual reality where the aim is to represent a copy of reality, as with digital twins for example, the researchers use virtual reality to get away from reality. “We can then project ourselves into a world where our physical reference points (such as up and down, distance, etc.) are conserved, but where we have the possibility of visualizing abstract concepts. Using spheres, we represent opportunities or risks, for example. Color effects can indicate how dangerous they are,” says the researcher.

In the virtual universe, the spatio-temporal reference points are defined by the set of performance indicators for a supply chain. Let’s consider a simplified case where there are only three indicators: cost, product quality and delivery time. The researchers therefore define a three-dimensional frame of reference in which the supply chain is situated. Like a mechanical force, each risk or opportunity that has an impact on the network will push or pull it in a certain direction. For example, flooding along a delivery route will push the supply chain down on the delivery time axis.
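As a rough sketch of this framing (a toy model, not the researchers’ actual tool), the supply chain can be represented as a point in the space of its performance indicators, and each risk or opportunity as a force that displaces it:

```python
import numpy as np

# Performance indicators define the axes of the virtual space.
INDICATORS = ["cost", "quality", "delivery_time"]

# Current position of the supply chain in that space (arbitrary units).
state = np.array([100.0, 0.95, 5.0])

# Each event (risk or opportunity) acts like a force along the indicator axes.
# The events and magnitudes below are invented for illustration.
EVENTS = {
    "flooding_on_delivery_route": np.array([5.0, 0.0, 2.0]),   # higher cost, longer delivery
    "new_market_opportunity":     np.array([-2.0, 0.0, 0.0]),  # lower unit cost
}

def apply_events(state, active_events):
    """Return the new position after summing the forces of the active events."""
    displacement = sum(EVENTS[name] for name in active_events)
    return state + displacement

new_state = apply_events(state, ["flooding_on_delivery_route"])
for name, before, after in zip(INDICATORS, state, new_state):
    print(f"{name}: {before} -> {after}")
```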

Through virtual reality, a user can observe a supply chain, move around it, and explore the risks and opportunities related to it.

 

In reality, logistics networks have dozens of performance indicators and over a hundred risks and opportunities — and therefore mechanical forces — to represent at each instant, making them complex to visualize. This is made possible through extensive work to identify and process data. To continue with the flooding example, data is identified such as the number of blocked routes, how delayed the trucks are, the percentage of damaged warehouses, implications for the products etc. The researchers turn this information into a macroscopic force exerted on the performance indicators.

Read more on I’MTech: C2Net: supply chain logistics on cloud nine

Virtual reality therefore helps respond to a need for agility in supply chains in an interactive way. Once the users are immersed in their supply chain universe, they can follow its movement and also interact with the supply chain. The spheres representing risks and opportunities are closer or further from the user based on how likely they are to occur. Their color indicates how dangerous they are, making it possible to identify areas of action more effectively. “The user steers the course of their system towards a precise objective. Virtual reality allows them to identify the forces they must use to achieve their goal, much like a sailor uses winds and currents. If a risk materializes, they deviate from the path but may be able to use a force to correct the effect,” explains Frédérick Benaben.

This decision support tool could also help anticipate the effects of an event on the path and, if possible, avoid it. These are precisely the questions being explored through the SCAN research program (Agile, Digital Collaborative Systems), launched in September 2019 with Scalian, a company that specializes in digital transformation, and conducted through the SIReN laboratory.

Virtuality and risk management

The supply chain is not the only area of research to benefit from virtual reality through the SIReN laboratory. In March 2019, researchers from IMT Mines Albi created the EGCERSIS[2] research program with support from the Occitanie region and in partnership with the companies Immersive Factory and Report One. The aim is to use immersive technologies to develop crisis management systems for sensitive industrial environments. In particular, they are building on an emerging need expressed by Immersive Factory, a company specialized in developing digital twins to provide safety training for staff at industrial sites. The company is seeking to expand its offerings by providing training for crisis situations. Employees may have already learned how to make sure a valve is closed after using it, but what should they do if it catches on fire? The four-year partnership will be based on demonstrating the strength of digital simulation to respond to this sort of issue.

To do so, the researchers will rely on the IOMEGA platform, equipped with multi-screen interfaces arranged in the form of a cockpit, allowing interaction between software tools, artificial intelligence visualization, etc. They will also take advantage of state-of-the-art equipment for immersion and virtual reality, giving users 360° freedom of movement, via the new IOMEGA VR platform launched on 15 November. On the other side of the Atlantic, a twin platform is being developed at Georgia Tech.

More generally, the two partner institutions will draw on their complementary expertise for the projects carried out through SIReN. They seek to increase the agility of collaborative networks but come at the problem from two different angles. The French team is developing technologies intended for the supply chain, while the American team has given rise to the concept of the physical internet, which aims to transport physical goods as efficiently as data is transmitted over the internet. Like the internet, logistics must become fluid, and immersive technologies have a key role to play in making this possible.

[1] Sentient Immersive Response Network

[2] Crisis Management Training in an Environment Representative of Sensitive Industrial Sites


iXblue: Extreme Fiber Optics

Since 2006, iXblue, a French company based in Lannion, and the Hubert Curien laboratory [1] in Saint-Étienne have partnered to develop cutting-edge fiber optics. This long partnership has established iXblue as a global reference in the use of fiber optics in harsh environments. The scientific and technological advances have enabled the company to offer solutions for the nuclear, space and health sectors. But there’s something different about these optical fibers: they’re not used for telecommunications.

 

Last June, iXblue and the Hubert Curien laboratory officially opened LabH6, a joint research laboratory dedicated to fiber optics. This latest development grew out of a partnership that dates back to 2006 and the bursting of the internet bubble. iXblue was born from the ashes of a start-up specializing in fiber optics for telecommunications. After the disappointment experienced in the digital technology sector in the early 2000s, “we decided to make a complete U-turn, leaving telecommunications behind, while remaining in fiber optics,” explains Thierry Robin, who has been there since the beginning and is currently the company’s CTO.

A daring move, at a time when fiber optics in domestic networks was in its infancy. But it was a move that paid off. In 13 years, the young company became a pivotal stakeholder in fiber optics for harsh environments. The company owes its success to the innovations developed with the Hubert Curien laboratory. The company’s products are now used in high-temperature conditions, under nuclear irradiation and in the vacuum of space.

Measuring nuclear irradiation

One of the major achievements of this partnership has been the development of optical fibers that can measure the radiation dose in an environment. The light passing through an optical fiber is naturally diminished over the length of the fiber. This attenuation, called optical loss, increases when the fiber is under nuclear radiation. “We understand the law governing the relationship between optical loss and the radiation dose received by the fiber,” explains Sylvain Girard, a researcher at the Hubert Curien laboratory. “We can therefore have an optical fiber play the role of hundreds of dosimeters by measuring the radiation value.”

There are two advantages to this application of the fiber. First of all, the resulting data can be used to establish a continuous mapping of the radiation over the length of the fiber, whereas dosimeters provide a value from their specific location. Secondly, the optical fiber provides a real-time measurement, since the optical loss is measured live. Dosimeters, on the other hand, are usually left for days or months in their locations before the value of the accumulated radiation can be measured.
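As a hedged illustration of how such a distributed measurement could be exploited (the actual calibration law used by the partners is not given here, so a simple linear relationship is assumed), the attenuation measured along the fiber can be converted point by point into a dose profile:

```python
# ASSUMPTION: a linear calibration dose = K * radiation-induced attenuation.
# K below is a made-up constant; real calibrations depend on the fiber type.

K_GRAY_PER_DB = 50.0   # hypothetical calibration constant (Gy per dB/km)

def dose_profile(attenuation_db_per_km):
    """Convert local attenuation measurements along the fiber (dB/km, one
    value per position) into an estimated dose profile in grays."""
    return [K_GRAY_PER_DB * a for a in attenuation_db_per_km]

# Higher attenuation at the third position reveals a local radiation hot spot.
measured = [0.10, 0.12, 0.85, 0.15, 0.11]
print(dose_profile(measured))
```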

The fibers used in this type of application are unique. They must be highly sensitive to radiation in order to accurately measure the variations. Research conducted for this purpose resulted in fibers doped with phosphorus or aluminum. This type of optical fiber is currently installed in the CERN Large Hadron Collider (LHC) in Geneva during the 2-year shutdown that will continue until 2020. “This will enable CERN to assess the vulnerability of the electronic equipment to radiation and hence avoid unplanned shutdowns caused by outages,” Sylvain Girard explains.

These optical fibers are also being assessed at the TRIUMF particle accelerator center in Canada for proton therapy. This high-precision medical technique treats ocular melanomas using radiation. The radiation dose deposited on the melanoma must be very precise. “The fiber should make it possible to measure the radiation dose in real-time and stop it once the required value is reached,” the researcher explains. “Without the fiber, doctors can only determine the total dose the patient received at the end of the treatment. They must therefore accumulate three low-dose radiation sessions one after the other to come as close as possible to the total target dose.”

Surviving space

While the fibers used in dosimetry must be sensitive to radiation for measurement purposes, others must be highly resistant. This is the case for fibers used in space. Satellites are exposed to space radiation, yet the gyroscopes they use to position themselves rely on optical fiber amplifiers. iXblue and the Hubert Curien laboratory therefore partnered to develop hydrogen- or cerium-doped optical fibers. Two patents have been filed for these fiber amplifiers, and their level of resistance has made them the reference in optical fibers for the space sector.

The same issue of resistance to radiation exists in the nuclear industry, where it is important to measure the temperature and mechanical stress in the core of nuclear reactors. “These environments are exposed to doses of a million Grays. For comparison purposes, a lethal dose for humans is 5 Grays,” Sylvain Girard explains. The optical fiber sensors must therefore be extremely resistant. Once again, the joint research conducted by iXblue and the Hubert Curien laboratory led to two patents for new fibers that meet the needs of manufacturers like Orano (formerly AREVA). These fibers will also be deployed in the fusion reactor project, ITER.

All this research will continue at the new LabH6, which will facilitate the industrial application of the research conducted by iXblue and the Hubert Curien laboratory. The stakes are high, as the uses for optical fibers beyond telecommunications continue to increase. While space and nuclear environments may seem to be niche sectors, the optical fibers developed for these applications could also be used in other contexts. “We are currently working on fibers that are resistant to high temperatures for use in autonomous cars,” says Thierry Robin. “These products are indirectly derived from developments made for radiation-resistant fibers,” he adds. After leaving the telecommunications sector and large-volume production 13 years ago, iXblue could soon return to its origins.

A word from the company: Why partner with an academic institute like the Hubert Curien laboratory?

We knew very early on that we wanted an open approach and exchanges with scientists. Our partnership with the Hubert Curien laboratory allowed us to progress within a virtuous relationship. In an area where competitors maintain a culture of secrecy, we inform the researchers we work with of the exact composition of the fibers. We even produce special fibers for them that are only used for the scientific purposes of testing specific compositions. We want to enable our academic partners to conduct their research by giving them all the elements they need to make advances in the field. This spirit is what has allowed us to create unique products for the space and nuclear sectors.

[1] The Hubert Curien Laboratory is a joint research unit of CNRS/Université Jean Monnet/Institut d’Optique Graduate School, where Télécom Saint-Étienne conducts much of its research.


No, employees are not afraid of digital transformation

This article was originally published (in French) on The Conversation. By Emmanuel Baudoin, Institut Mines-Télécom Business School.


The 2019 edition of the study (in French) “French employees in the age of digital transformation”, conducted by the HRM Digital Lab at Institut Mines-Télécom Business School, shows that French employees are not afraid of digital transformation, and some even hope to see their company take greater steps towards it.

This study was conducted on a representative sample of just over 1,000 French employees and was managed by Opinion Way with support from CFA-EVE and the Essonne ANDRH (French Association of Human Resources Directors). An additional study was carried out with 100 decision-makers in the private sector, located in the Paris region, in order to collect their perceptions of digital transformation in their companies.

A multi-faceted transformation

The first finding is that 90% of the employees report that digital technology has had an impact on their job. The figure is unequivocal, but it covers a wide variety of situations: 48% of these employees consider that digital technology has had a great impact on their job, while 42% say that it has had only a moderate impact, if any. As an illustration of this variety, when asked how important digital technology was to their work, 11% said it represented the core of their work, 35% a significant part of it, 32% said it merely supported their work, 14% said it was little-used in their work, and 9% said they did not use digital technology at all.

The interviews carried out with decision-makers from companies of all sizes and industries told the same story –  digital technology is here to stay. 98% of those interviewed said digital technology has had an impact on their company. 65% consider that it has had a strong impact, while 11% feel that it has had a minor impact, if any. They report that all company processes have been affected: customer experience, supply chain management, administrative process management, the way products and services are developed, the way work is organized, the approach to managing teams, and employee experience.

Employees have a relatively high digital maturity level

Far from being pessimistic about the impact digital technology has had on their job, 84% of the employees say that it is making their work more interesting or that it has not had a major impact on their work. When asked to choose a response that describes their relationship with digital technology, 43% chose a very positive response, 47% chose a neutral response, while 11% chose a negative response. Another significant finding was that 40% hope their companies will go further in their digital transformation while 13% do not.

This overall positive perception of digital technology contributes to employees’ digital maturity level. This level can be defined as an employee’s overall relationship with digital technology at work, their perceived level of proficiency with digital technology, and the extent to which they use it to carry out their work activities.

As far as perceived level of proficiency is concerned, 61% consider that they have a very high or high level of proficiency, whereas 6% say that they have a low or very low level. At the same time, they are aware of their potential for improvement. 62% of respondents felt that they wasted time every day due to a lack of proficiency in certain digital skills.

A final point of interest is that the results also confirm that a new type of employee has emerged, one who can be called a ‘self HR employee,’ as identified in the first edition of this study. These ‘self HR’ employees take advantage of digital technology for a wide variety of purposes, including to learn independently and develop their skills, which means that they engage in informal digital learning. They also implement strategies to enhance their employee brand or even sell their skills.

 

French employees are optimistic about the digital transformation! (Emmanuel Baudoin on Xerfi canal, 2019).


Emmanuel Baudoin, Associate Professor in HR, Institut Mines-Télécom Business School

The original version of this article (in French) was published on The Conversation under a Creative Commons license. Read the original article.


Why have girls been turning their backs on computer science?

Column written in partnership with The Conversation.
By Chantal Morley from Institut Mines-Télécom Business School.


For an organizer of a careers fair or an engineering school open day, the gender bias in digital professions is clear. Boys are far more likely than girls to flock to stands about computer science or other digital professions. It is easy to assume that this is how things have always been. Our opinions are based on the stereotypes we see every day, with ‘geeks’ almost always being portrayed as male.

It is easy to forget that many female mathematicians played a key role in the history of computing. Examples of these incredible women include Grace Hopper, who developed the theory behind the compiler and built its first prototype in 1952; Mary Keller, the first person to earn a doctorate in computer science, in 1965; Kathleen Booth, who was a pioneer in character recognition and machine translation; and Barbara Liskov, who programmed one of the first chess games in 1968 and became a professor of artificial intelligence at MIT.

Research and “coding stations”

After leading pioneering research, Adele Goldberg developed the first graphical-interface computer in 1973, while Alice Recoque designed the Mitra line of minicomputers, including the Mitra 15, which was a huge commercial success. It is important to mention that MIT, in its pursuit of academic excellence, had an extremely forward-thinking policy focused on embracing diversity. As a result, between 1965 and 1985, the proportion of female computer science students rose from 5% to nearly 30%.

During these pioneering years, women were also very much involved at an operational level. In the 1940s, the first electronic computer, ENIAC, was fully programmed by six female mathematicians. In the early 1950s, 40% of programmers at computer manufacturer Eckert-Mauchly were women. Until 1960, in Great Britain’s public service, computer “coding stations” were almost exclusively “manned” by women.

In 1955, a team of four female programmers founded the first computer services company, Computer Usage Company, in the United States. Three years later, Elsie Schutt founded Computations Inc., which allowed mothers to pursue a career in computer science by working from home. The company lasted 50 years. In 1962, in Great Britain, Stephanie Shirley founded Freelance Programmers at the age of 29 with the same aim. The company was an international success until it was bought by Steria in 2007.

The turn of the 80s

So, how did the situation change? What caused computer science to become a male-dominated sector? The first turning point came in the 1980s and was due to recruitment procedures. To make it easier to whittle down the large number of job applicants they were receiving, an American company defined the psychological profile of a ‘good programmer’.

The profile was based on a sample of men working in a military environment and had two major characteristics: someone who was slightly less social than the average person, and someone who enjoyed typically ‘masculine’ activities. This was far from what companies had been looking for in the 1940s and 1950s, when they recruited patient, logical, imaginative people who liked crossword puzzles, played chess or used knitting patterns! Nonetheless, the profile was widely used.

Secondly, as the need for computing staff increased, salaries became relatively high. At the time, female coders were not paid well, and it was unthinkable for a woman to supervise teams of both men and women. This caused the leading computing employer in Great Britain, the public sector, to stop recruiting competent, experienced and motivated female programmers in favor of men. This ultimately led to computer science becoming male-dominated.

The third factor was a takeover by the academic world, working hand in hand with the industry and excluding women. In 1968, world-leading computer science companies began to understand the importance of software in relation to hardware. This led them to organize a conference sponsored by NATO which brought together global programming specialists. However, no women, not even Grace Hopper or Jean Sammet, were invited to take part.

As well as this, universities encouraged industrialists to talk about ‘software engineering’ and ‘software engineers’ to make the career path sound more respectable. However, the term ‘engineer’  made computer science courses sound traditionally masculine.

Meanwhile, the main American professional association (the Data Processing Management Association, DPMA), which was overwhelmingly made up of men, created a professional skills qualification, open to both men and women, to raise skills in the sector. However, due to the hours of study required and the unequal sharing of housework between men and women, the qualification remained less accessible to women.

The influence of popular culture

In 1965, women represented 30% of the workforce in programming in the USA. In 1982, 35% of computer science jobs in France were held by women. From the late 1960s, computers gradually became widely used in society and carved out a presence in popular culture. However, this was often done in ways which did not allow a place for women. In Stanley Kubrick’s film, 2001: A Space Odyssey, the computer has a male voice and the relationship between humans and computers leaves no room for women.

In computer advertisements in the late 1970s, the target consumers were executives; in France, 80% of executives were male in 1985. Computers were marketed to men in a practical way, with the idea that they could work from home. But, to a large extent, they were also marketed as something fun. At home, PCs were not used equally either: men spent more time on them than women, and sons had more access to them than daughters, a mentality often passed down from their fathers.

When people started buying personal computers for children, boys were for a long time the main target audience and girls were sidelined. This portrayal of computers in society gradually spread and contributed to computers being viewed as a ‘masculine’ thing, all the more so as the free software movement, whose communities regularly exclude women, has constructed the figure of the hacker as the model developer.

Algorithms and bias

Finally, with the rise of social media and the generation of automatic advertisements, we have seen how algorithms reproduce gender bias, which reinforces discrimination against women in digital education.

All these factors have caused women to gradually withdraw from computer science, and despite various initiatives, this trend has not been reversed. Since women’s role in computer science was never very visible, society has now forgotten the important role they once played.

If we look at countries which have equal numbers of men and women in their computer science sectors, such as India and Malaysia, as well as universities that have a permanent gender parity in these fields, such as CMU, NTNU and Harvey Mudd College, we can see how inclusive policies at a local level have enabled women to regain their place in the digital professions.

Also read on I’MTech: In IT professions, diversity is all about inclusion, not exclusion


Chantal Morley, Professor in Information systems, Institut Mines-Télécom (IMT).

The original version of this article (in French) was published in The Conversation under a Creative Commons license. Read the original article.

 


Our top ten articles of 2019!

At this beginning of the year 2020, I’MTech takes a look back at 10 of the most noteworthy articles from the past year. What scientific topics made headlines at the close of the 2010s and the dawn of the 2020s? A look at this pivotal year, in which unsurprisingly, AI and the environment feature prominently… but not exclusively!

 

#1 Some people are paid fractions of cents to train algorithms. On social media sites, we all do it for free

#2 It’s leaving the beach and coming to cities. After air and water, sand is the third most-used resource for human beings.

#3 Do we want AI to be effective or to be able to understand what it’s doing?

#4 In rivers or on land, some plants are our best allies in the fight against heavy metals.

#5 Composite materials offer many advantages: weight, lifespan, mechanical performance etc. But for the time being, they also have one major drawback: their ability to be recycled.

#6 We no longer simply buy a product, but the services that go with it too. Companies clearly understand this.

#7 Water stress, impact on biodiversity, natural land transformation etc. Calculating the environmental footprint of a product or service is no simple task.

#8 Between optimizing sales and helping consumers, how AI is slowly making its way into our supermarkets.  

#9 Autonomous cars will soon be on our roads. But we still have to get them to communicate with one another!

#10 Do mobile applications and web services really help patients live better with their illnesses?


MADEin4: digital twinning and predictive maintenance for industry

The European MADEin4 project was launched in April 2019 for a three-year period. It aims to help semiconductor manufacturers and equipment suppliers play an active role in the continuous improvement of their equipment. How? By relying on new digital twinning and predictive maintenance technologies. Agnès Roussy and Valéria Borodin, research professors at Mines Saint-Étienne, a partner in the MADEin4 project, explain the context that gave rise to this project and discuss the scientific hurdles to overcome.

 

What was the context for developing the MADEin4 project?

Agnès Roussy: The MADEin4 project (Metrology Advances for Digitized ECS Industry 4.0) is an ECSEL project (Electronic Components and Systems for European Leadership). Its aim is to support and bring together the semiconductor industry in Europe in the transition to digital technology.

What is the overall goal of this project?

Valéria Borodin: To increase production output without affecting reliability levels in the manufacturing of electronic devices, the quality of which must comply with the increasingly demanding requirements of the highly competitive semiconductors market.

And how are you going about this?

AR: In order to improve productivity and facilitate the integration of digital technology into the organization of manufacturing processes for semiconductor and equipment manufacturers, the project will rely on an Industry 4.0 approach that goes beyond the state of the art. To do so, two complementary accelerators will be leveraged in the development of a pilot line: a physical accelerator, based on next-generation metrology and inspection equipment for the microelectronics industry; and a digital accelerator – the digital twin (see box) – integrating artificial intelligence technology to improve output and equipment performance prediction.


The technique of digital twinning is used by manufacturers to monitor the operational status of their equipment (manufacturing, metrology, inspection). Digital twins of physical equipment are used: they evolve over time using data collected by sensors to measure the health status of the equipment, in order to prevent or anticipate breakdowns.
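As a rough illustration of the predictive maintenance side of such a twin (a toy sketch, not one of the project’s actual models), a health indicator can be maintained from streaming sensor data and used to flag equipment drift before it turns into a breakdown:

```python
from collections import deque

class HealthMonitor:
    """Toy equipment health monitor: compares each new sensor reading to a
    rolling baseline and flags readings that deviate abnormally. The window
    size and threshold are illustrative, not taken from MADEin4."""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, reading: float) -> bool:
        """Store a reading and return True if it looks anomalous."""
        anomalous = False
        if len(self.history) >= 10:               # wait for a minimal baseline
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = var ** 0.5 or 1e-9
            anomalous = abs(reading - mean) > self.threshold * std
        self.history.append(reading)
        return anomalous

monitor = HealthMonitor()
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.1, 1.0, 1.02, 5.0]
for step, value in enumerate(readings):
    if monitor.update(value):
        print(f"Possible equipment drift at step {step}: {value}")
```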

What technological and scientific challenges do you face?

VB: The development of digital twins and simulation models for managing and coordinating a production unit at different levels of decision-making poses a number of challenges, in particular, in terms of consistency of digital technology and decision-making across all industrial activities. In this regard, the help and expertise of semiconductor manufacturers and equipment suppliers (manufacturing and metrology) play a pivotal role in confirming the usefulness and industrial feasibility of the solutions we propose as academics.

How are Mines Saint-Étienne researchers contributing to the project?

AR: One of the research areas, in which Mines Saint Étienne’s Manufacturing and Logistics Sciences department (SFL) is primarily active, focuses on microelectronic manufacturing. This involves advanced process control, quantitative management of operations in the manufacturing process, and decision support at different levels (operational, tactical and strategic). As part of the MADEin4 project, we seek to explore opportunities and identify the limitations of new digital technologies in the intensive use and analysis of the massive quantities of data collected by inspection and metrology equipment.

Who are you partners for this project, and which collaborations are important for your work?

VB: The MADEin4 project brings together the expertise of 42 industrial and academic partners from 11 countries. Our key industrial partners for this project are STMicroelectronics in Rousset and Crolles. Among other things, this project allows us to continue the long-standing, successful academic collaboration between the Manufacturing and Logistics Sciences Department at Mines Saint-Étienne’s Provence Microelectronics Center (CMP) and the ST sites of Rousset and Crolles, with which we’ve worked for over 15 years. Many equipment suppliers are also involved in this project, so we’ll have the opportunity to work with them more closely on the equipment. And likewise for the academic partners involved: this European project will help foster new opportunities for collaboration through PhD theses or future calls for projects.

What are the expected benefits?

AR: The expected benefits of the MADEin4 project closely reflect the scientific and strategic priorities of Mines Saint-Étienne and the Provence Microelectronics Center (CMP), which promote a number of important topics: the industry of the future (Industry 4.0) and artificial intelligence (AI). Through the MADEin4 project, we seek to provide process control solutions for semiconductor manufacturers, explore opportunities for applications of digital twinning technology, strengthen the partnership with semiconductor manufacturers, and increase international recognition for the CMP on topics related to microelectronic manufacturing.

What are the important steps coming up for the project?

VB: The MADEin4 project started just over six months ago. This initial phase is exciting because everything seems possible. As for Mines Saint-Étienne, the industrial data soon to be provided by the different partners will allow us to compare our research to the realities of industry. By the end of the first year, the research findings will be disseminated through articles in international journals and presentations to the scientific communities involved.

Find out more about the MADEin4 project


A tribe of irreducible codling moths

In agriculture, biological alternatives to pesticides are being sought for environmental and health reasons. Use of a virus as a biocontrol agent for crop pests has become relatively widespread. One such example is Cydia pomonella granulovirus. It has been used for decades to target a crop pest that is fond of apples and pears: the codling moth. Miguel Lopez-Ferber, a researcher at IMT Mines Alès, has been working on this topic since 2005, the year in which the larvae of this insect developed resistance to a commercial product made from this granulovirus.

 

Cydia pomonella. Common name: codling moth. Adult moths measure approximately 18 millimeters long and are a brownish-gray color. Shortly after they hatch, larvae bore into orchard fruits and feed on them from the inside. The apples and pears are damaged and rot, so they are no longer suitable to be sold. However, these insects are very susceptible to a virus: their granulovirus, which is known as CpGV. If they come into contact with it, the larvae become ill and die, leaving the orchards healthy and untouched.

Unfortunately for apple trees, codling moths have developed resistance to CpGV, which poses a number of problems. For one, farmers need to find a new way to protect their orchards. And manufacturers need to determine whether it is possible to improve the viral preparation, find a substitute, or if production must be stopped. Then there’s the scientific question: “Codling moths have been in contact with this virus for millions of years and have never developed resistance to it. Why now?” wonders Miguel Lopez-Ferber, a researcher at IMT Mines Alès: “If there had been widespread resistance in the past, we would no longer find this virus in nature.” 

One possible explanation is that, “we’ve underestimated the inventiveness of nature,” says Miguel Lopez-Ferber. “We’ve acted as if it were a chemical product: for years, exactly the same viral solution has been spread over orchards.” In nature, when an insect repeatedly comes into contact with the same chemical product, it will adapt and find a way to resist it. So the product will no longer work as well. Viruses, on the other hand, will also adapt and find new ways to reach the insects if we don’t prevent them from doing so – they are in co-evolution.

“It works the same way with humans, with the flu virus, for example,” explains the researcher. “We develop defenses to protect ourselves from the virus, but it evolves and comes back stronger the next year.” And CpGV exists in different forms throughout the world. There are slight variations in genotype – which is the sum total of an individual’s genes. And the solution available on the market corresponds to the culmination of research on a single isolated genotype of this virus.

Research to overcome resistance

With CpGV, the same virus isolate has been applied massively for years. This means that it’s possible that codling moth larvae are not resistant to other isolates of the virus. The different genotypes of the virus have been divided into 5 groups, from A to E. The “A” group is most similar to the Mexican isolate, which has been used historically. The researchers found that the other groups infected resistant larvae. At the beginning, however, the other viral isolates were less effective than those originally used – a greater quantity of product was needed for the same plot of land. But with a little selection, the performance reached the same levels as that of the original product. “We were also worried that we would observe resistance to these new genotypes in the future,” says Miguel Lopez-Ferber. But it is likely that this resistance phenomenon will not appear if there is greater viral diversity.

The researchers therefore tried another method: they combined the Mexican isolate to which the larvae had become resistant, with another genotype of the virus that infected the larvae. And they discovered that together, the two were even more effective in infecting the larvae. In a way, the second genotype, “opened the door to the one that had been blocked until then,” explains Miguel Lopez-Ferber, “but we still don’t really understand how it works.” The researchers are therefore studying how the different forms of the virus interact with one another to infect larvae. They could then use this knowledge to develop one or several optimal mixtures, by appropriately combining the specific characteristics of each genotype.

“Viral diversity is an asset, but we don’t yet fully understand how it works,” explains the researcher. “Imagine, for example, if we wanted to colonize a desert island. If all the colonists were miners, there’d be a lack of skills for building houses, planting crops etc. We need different, complementary jobs. That’s what we get when we combine several genotypes of a virus – except we don’t really know what their jobs are. We just know they work better together.”

And studying the virus’s behavior in codling moth larvae is no simple task. Let’s say that a type A and a type B virus are dispersed in an orchard. How can we determine whether a larva has absorbed both forms of the virus or only one? Or whether one of them has prevented the other from developing? In order to understand how the tasks are distributed, the researchers would need to be able to track A and B as they colonize the larva. The molecular tools available today are not well suited for this. Miguel Lopez-Ferber’s team is currently working in partnership with NeoVirTech, a Toulouse-based company, to develop a better technique for tracking viruses.


The origins of granulovirus

“Using a natural predator to protect our harvests is not a new idea,” says Miguel Lopez-Ferber. “We domesticated cats to combat mice. It’s the same principle with granulovirus.”

It was in Mexico in 1964 that the codling moth granulovirus (CpGV) was discovered. Codling moth larvae were found dead and researchers sought to determine the cause. They then isolated the virus responsible: the Mexican isolate of CpGV. Shortly after, other forms of the virus were observed in Russia, followed by the United Kingdom. Slight variations existed between the viruses, subtle differences in genotype – the sum total of an individual’s genes. The Mexican isolate was favored for a potential commercial product because it was more homogenous. This made it easier to characterize in order to comply with regulatory criteria for insecticides, which are equivalent for phytosanitary and biological products and require rigorous identification of a product’s makeup.  

After 25 years of research, the viral solution was ready for use and authorized for sale. In practice, it is used like a chemical product. A concentrated liquid solution is prepared in a tank and is then dispersed in a spray of fine droplets, ideally covering the entire orchard in a uniform manner. Starting in the 1990s, the product was widely used and applied several times a year. Until 2005, that is, when it was observed that codling moths were still present after the treatment.


Guillaume Balarac, turbulence simulator

Turbulence is a mysterious phenomenon in fluid mechanics. Although it has been observed and studied for centuries, it still holds secrets that physicists and mathematicians strive to unlock. Guillaume Balarac is part of this research community. A researcher at Grenoble INP (at the LEGI Geophysical and Industrial Flows Laboratory), he uses and improves simulations to understand turbulent flows better. His research has given rise to innovations in the energy sector. The researcher, who has recently received the 2019 IMT-Académie des Sciences Young Scientist Award, discusses the scientific and industrial challenges involved in his field of research.

 

How would you define turbulent flows, which are your research specialty?

Guillaume Balarac: They are flows with an unpredictable nature. The weather is a good example. We can’t predict the weather more than five days out, because the slightest disturbance at one moment can radically alter what happens in the following hours or days. It’s the butterfly effect. Fluid flows in the atmosphere undergo significant fluctuations that limit our ability to predict them. This is typical of turbulent flows, unlike laminar flows, which are not subject to such fluctuations and whose state can be predicted more easily.

Apart from air mass movements in the atmosphere, where can turbulent flows be found?

GB: Most of the flows that we encounter in nature are actually turbulent. The movement of oceans is described by turbulent flows, as is that of rivers. The movement of molten masses in the Sun generates a turbulent flow. This is also the case for certain biological flows in our bodies, like blood flow near the heart. Beyond nature, these flows are found in rocket propulsion, in the motion of wind turbines, and in hydraulic or gas turbines, for example.

Why do you seek to better understand these flows?

GB: First of all, because we aren’t able to do so! It’s still a major scientific challenge. Turbulence is quite an unusual case: it has been observed for centuries. We’ve all seen a river or felt the wind. But the mathematical description of these phenomena still eludes us. The equations that govern these turbulent flows have been known for two centuries, and the underlying mechanics have been studied since ancient times. And yet, we aren’t able to solve these equations, and we’re ill-equipped to model and understand these events.

You say that researchers can’t solve the equations that govern turbulent flows. Yet, some weather forecasts for several days out are accurate…

GB: The iconic equation that governs turbulent flows is the Navier-Stokes equation. That’s the one that has been known since the 19th century. No one is able to find a solution with pencil and paper. Proving the existence of smooth solutions to this equation is even one of the seven Millennium Prize Problems established by the Clay Mathematics Institute, and whoever solves it will be awarded $1 million. That gives you an idea of the magnitude of the challenge. To get around our inability to find this solution, we either try to approach it using computers, as is the case for weather forecasts (with varying degrees of accuracy), or we try to observe it. And finding a link between observation and equation is no easy task either!
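For reference, the incompressible form of the Navier-Stokes equations he refers to can be written as:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u}
  = -\frac{1}{\rho} \nabla p + \nu \nabla^{2} \mathbf{u},
\qquad
\nabla \cdot \mathbf{u} = 0
```

where u is the velocity field, p the pressure, ρ the density and ν the kinematic viscosity. The nonlinear term (u·∇)u is what couples all scales of motion and makes the equation so hard to solve.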

Beyond this challenge, what can a better understanding of turbulent flows help accomplish?

GB: There are a wide range of applications which require an understanding of these flows and the equations that govern them. Our ability to produce energy relies in part on fluid mechanics, for example. Nuclear power plants function with water and steam systems. Hydroelectric turbines work with water flows, as do water current turbines. For wind turbines, it’s air flows.  And these examples are only as far as the energy sector is concerned.

You use high-resolution simulation to understand what happens at the fundamental level in a turbulent flow. How does that work?

GB: One of the characteristics of turbulent flows is the presence of eddies. The more turbulent the flow, the more eddies of varying sizes it contains. The principle of high-resolution simulation is to define billions of points in the space in which the flow occurs and to calculate the fluid velocity at each of these points. This is called a mesh, and it must be fine enough to describe the smallest eddy in the flow. These simulations use the most powerful supercomputers in France and Europe. And even with all that computing power, we can’t simulate realistic situations, only academic flows in idealized conditions. These high-resolution simulations allow us to observe and better understand the dynamics of turbulence in canonical configurations.


Simulation of turbulent flows on a marine turbine.

Along with using these simulation tools, you work on improving them. Are the two related?

GB: They are two complementary approaches. The idea for that portion of my research is to accept that we don’t have the computing power to simulate the Navier-Stokes equation in realistic configurations. So the question I ask myself is: how can this equation be modified so that it becomes possible to solve with our current computers, while ensuring that the prediction remains reliable? The approach is to solve the big eddies first. And since we don’t have the power to make a mesh fine enough for the small eddies, we look for physical terms, mathematical expressions, which replace the influence of the small eddies on the big ones. That means that we don’t have the small eddies in this modeling, but their overall contribution to flow dynamics is taken into account. This helps us improve simulation tools by making them able to address flows in realistic conditions.
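A classical example of such a term (a standard large-eddy simulation closure, not necessarily the specific model used by the researcher) is the Smagorinsky eddy viscosity, which represents the drain of energy from the resolved large eddies towards the unresolved small ones:

```latex
\nu_t = (C_s \Delta)^2 \, |\bar{S}|,
\qquad
|\bar{S}| = \sqrt{2\, \bar{S}_{ij} \bar{S}_{ij}},
\qquad
\bar{S}_{ij} = \frac{1}{2}\left( \frac{\partial \bar{u}_i}{\partial x_j} + \frac{\partial \bar{u}_j}{\partial x_i} \right)
```

where Δ is the local mesh size, C_s a constant of order 0.1–0.2, and the bar denotes the resolved (filtered) velocity field.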

Are these digital tools you’re developing used solely by researchers?

GB: I seek to carry out research that is both fundamental and application-oriented. For example, we worked with Hydroquest on the performance of water current turbines used to generate electricity. The simulations we carried out made it possible to assess the performance loss due to the support structures, which do not contribute to capturing energy from the flow. Our research led to patents for new designs, with a 50% increase in yield.

More generally, do energy industry players realize how important it is to understand turbulent flows in order to make their infrastructures more efficient?

GB: Of course, and we have a number of partners whose involvement illustrates industrial interest in our research. For example, we’ve adopted the same approach to improve the design of floating wind turbines. We’re also working with General Electric on hydroelectric dam turbines. These hydraulic turbines are increasingly operated far from their optimal operating point, in order to mitigate the intermittency of renewable solar or wind energy. In these systems, hydrodynamic instabilities develop, which have a significant effect on the machines’ performance. So we’re trying to optimize the operation of these turbines to limit yield loss.

What scientific challenges do you currently face as you continue your efforts to improve simulations and our understanding of turbulent flows?

GB: At the technical level, we’re trying to improve our simulation codes to take full advantage of advances in supercomputers. We’re also trying to improve our numerical methods and models to increase our predictive capacity. For example, we’re now trying to integrate learning tools to avoid simulating small eddies and save computing time. I’ve started working with Ronan Fablet, a researcher at IMT Atlantique, on precisely this topic. Then, there’s the huge challenge of ensuring the reliability of the simulations carried out. As it stands now, if you give a simulation code to three engineers, you’ll end up with different models. This is due to the fact that the tools aren’t objective, and a lot depends on the individuals using them. So we’re working on mesh and simulation criteria that are objective. This should eventually make it possible for industry players and researchers to work with the same foundations, and to better understand one another when discussing turbulent flows.

 


Véronique Bellon-Maurel: from infrared spectroscopy to digital agriculture

Measuring and quantifying have informed Véronique Bellon-Maurel’s entire scientific career. A pioneer in near infrared spectroscopy, the researcher’s work has ranged from analyzing fruit to digital agriculture. Over the course of her fundamental research, Véronique Bellon-Maurel has contributed to the optimization of many industrial processes. She is now the Director of #DigitAg, a multi-partner Convergence Lab, and is the winner of the 2019 IMT-Académie des Sciences Grand Prix. In this wide-ranging interview, she retraces the major steps of her career and discusses her seminal work.   

 

You began your research career by working with fruit. What did this research involve?

Véronique Bellon-Maurel: My thesis dealt with the issue of measuring the taste of fruit in sorting facilities. I had to meet industrial requirements, particularly in terms of speed: three pieces of fruit per second! The best approach was to use near infrared spectroscopy to measure the sugar level, which is indicative of taste. But when I was beginning my thesis in the late 1980s, it took spectrometers one to two minutes to scan a piece of fruit. I suggested working with very near infrared, meaning a different type of radiation than the infrared that had been used up to then, which made it possible to use new types of detectors that were very fast and inexpensive.

So that’s when you started working on near infrared spectroscopy (NIRS), which went on to become your specialization. Could you tell us what’s behind this technique with such a complex name?

VBM: Near infrared spectroscopy (NIRS) is a method for analyzing materials. It provides a simple way to obtain information about the chemical and physical characteristics of an object by illuminating it with infrared light, which picks up information about the object as it passes through it. For example, when you place your finger on your phone’s flashlight, you’ll see a red light shining through it. This light is red because the hemoglobin has absorbed all the other colors of the original light. So this gives you information about the material the light has passed through. NIRS works the same way, except that we use radiation whose wavelengths lie just beyond the visible spectrum.
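
In practice, the information carried by the light is read off as absorbance at each wavelength. As a minimal numerical illustration (the Beer-Lambert conversion below is standard practice, but the intensity values are invented), a raw NIR measurement is typically turned into an absorbance spectrum before any chemical interpretation:

```python
# Illustration: converting measured intensities to absorbance (Beer-Lambert),
# the usual first step before interpreting a near infrared spectrum.
import numpy as np

wavelengths_nm = np.linspace(900, 1700, 5)       # a few NIR wavelengths
I0 = np.array([1.00, 0.98, 0.97, 0.95, 0.93])    # reference intensity (no sample)
I = np.array([0.62, 0.55, 0.70, 0.48, 0.66])     # intensity after the sample

absorbance = -np.log10(I / I0)                   # A = -log10(I / I0)
for wl, a in zip(wavelengths_nm, absorbance):
    print(f"{wl:6.0f} nm  ->  A = {a:.3f}")
```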

Out of all the methods for analyzing materials, what makes NIRS unique?

VBM: Near infrared waves pass through materials easily. Much more easily than “traditional” infrared waves, which are called “mid-infrared.” They are produced by simple sources such as sunlight or halogen lamps. The technique is therefore readily available and is not harmful: it is used on babies’ skulls to assess the oxygen saturation of their brains! But when I was starting my career, there were major drawbacks to NIRS. The signal we obtain is extremely cluttered because it contains information about both the physical and chemical components of the object.

And what is hiding behind this “cluttered signal”?

VBM: In concrete terms, you obtain hill-shaped curves, and the shape of these curves depends on both the object’s chemical composition and its physical characteristics. You’ll get a huge hill that is characteristic of water. And the signature peak of sugar, which allows you to calculate a fruit’s sugar level, is hidden behind it. That’s the chemical component of the spectrum obtained. But the size of the hills also depends on the physical characteristics of your material, such as the size of the particles or cells that make it up, physical interfaces (cell walls, corpuscles), the presence of air, and so on. Extracting solely the information we’re interested in is a real challenge!

Near infrared spectra of apples.

 

One of your earliest significant findings for NIRS was precisely that – separating the physical component from the chemical component on a spectrum. How did you do that?

VBM: The main issue at the beginning was to get away from the physical component, which can be quite a nuisance. For example, light passes through water but not through the foam on the water, which we see as white, even though they are the same molecules! Depending on whether or not the light passes through foam, the observation, and therefore the spectrum, will change completely. Fabien Chauchard was the first PhD student with whom I worked on this problem. To better understand this optical phenomenon, known as scattering, he went to the Lund Laser Center in Sweden. They have highly specialized time-of-flight cameras, which operate at very high speed and are able to capture photons “in flight.” We send photons onto a fruit over an extremely short period of time and recover them as they come out, since they do not all come out at the same time. In our experiments, with a transmitter and a receiver placed on a fruit 6 millimeters apart, certain photons had travelled over 20 centimeters by the time they came out! They had been reflected, refracted, diffracted and so on inside the fruit. They hadn’t travelled in a straight line at all. This gave rise to an innovation, spatially resolved spectroscopy (SRS), developed by the Indatech company that Fabien Chauchard started after completing his PhD.
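
To give an order of magnitude for that result, a photon’s path length inside the fruit follows directly from its time of flight and the speed of light in the medium. The refractive index and timing below are assumptions chosen only to reproduce the roughly 20-centimeter figure mentioned above:

```python
# Illustration: turning a photon time-of-flight into a travelled path length.
# The refractive index and the 1 ns delay are assumed values for the example.
C_VACUUM = 299_792_458.0          # speed of light in vacuum, m/s
n_fruit = 1.4                     # assumed refractive index of fruit flesh

time_of_flight = 1.0e-9           # 1 nanosecond between emission and detection
path_length = (C_VACUUM / n_fruit) * time_of_flight
print(f"path travelled inside the fruit ≈ {path_length * 100:.0f} cm")
```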

We looked for other optical arrangements for separating the “chemical” component from the “physical” component. Another PhD student, Alexia Gobrecht, with whom I worked on soil, came up with the idea of using polarized near infrared light. If the photons penetrate the soil, they lose their polarization. Those that have only travelled along the surface retain it. By differentiating between the two, we recover spectra that depend only on the chemical component. This research on separating chemical and physical components was continued in the laboratory, even after I stopped working on it. Today, my colleagues are very good at identifying the aspects of a spectrum that have to do with the physical component and those that have to do with the chemical component. And it turns out that this physical component is useful! And to think that twenty years ago, our main focus was to get rid of it.
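
Alongside these optical arrangements, the physical component can also be attenuated numerically. The interview does not mention this particular technique, but a standard chemometric preprocessing step such as Standard Normal Variate (SNV) correction gives an idea of how scattering-related variability is removed from spectra in software; the spectra below are synthetic:

```python
# Illustration: Standard Normal Variate (SNV) correction, a common chemometric
# preprocessing step that reduces scattering-related (physical) variability
# in near infrared spectra so the chemical signal stands out.
import numpy as np

def snv(spectra: np.ndarray) -> np.ndarray:
    """Center and scale each spectrum (row) by its own mean and standard deviation."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# Synthetic example: the same chemical signal seen through different amounts
# of scattering (multiplicative and additive distortions).
wavelengths = np.linspace(1100, 2500, 200)
chemical_signal = np.exp(-((wavelengths - 1930) / 80) ** 2)   # a water-like band
raw = np.vstack([a * chemical_signal + b
                 for a, b in [(1.0, 0.0), (1.6, 0.2), (0.7, 0.5)]])

corrected = snv(raw)
# After SNV, the three distorted spectra collapse onto nearly the same curve.
print(np.max(np.std(corrected, axis=0)))   # spread across samples is near zero
```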

After this research, you transitioned from studying fruit to studying waste. Why did you change your area of application?

VBM: I’d been working with the company Pellenc SA on sorting fruit since around 1995, and then on detectors for grape ripeness. Over time, Pellenc moved into waste characterization for sorting purposes, building on the infrared know-how developed for sorting fruit. They therefore called on us, with a new speed requirement, but this one was much tougher: a conveyor belt moves at several meters per second. In reality, the areas of application for my research were already varied. In 1994, while I was still working on fruit with Pellenc, I was also carrying out projects on biodegradable plastics. NIRS made it possible to provide quality measurements for a wide range of industrial processes. I was Ms. “Infrared sensors”!

 

“I was Ms. ‘Infrared sensors’!”
– Véronique Bellon-Maurel

 

Your work on plastics was among the first in the scientific community concerning biodegradability. What were your contributions in this area?

VBM: 1990 marked the very beginning of biodegradable plastics. Our question was whether we could measure a plastic’s biodegradability in order to say for sure, “this plastic is truly biodegradable.” And to do so as quickly as possible – so why not use NIRS? But first, we had to define the notion of biodegradability with a laboratory test. For 40 days, the plastics were put in reactors in contact with microorganisms, and we measured their degradation. We were also trying to determine whether this test was representative of biodegradability in real conditions, in the soil. We buried hundreds of samples in different plots of land in various regions and dug them up every six months to compare real biodegradation with biodegradation in the laboratory. We wanted to find out whether the NIRS measurement could achieve the same result, namely estimating the degradation kinetics of a biodegradable plastic – and it worked. Ultimately, this benchmark research on the biodegradability of plastics contributed to the industrial production and deployment of the biodegradable plastics now found in supermarkets.
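
The interview does not specify the kinetic model used to compare laboratory and field degradation, so the following sketch simply fits a first-order decay curve to invented 40-day reactor data, as one plausible way of estimating degradation kinetics:

```python
# Illustration: fitting a simple first-order degradation model to lab
# measurements. The model form and the numbers are assumptions for the example.
import numpy as np
from scipy.optimize import curve_fit

def remaining_fraction(t_days, k):
    """First-order decay: fraction of plastic remaining after t days."""
    return np.exp(-k * t_days)

# Hypothetical measurements over the 40-day reactor test.
t = np.array([0, 10, 20, 30, 40], dtype=float)
measured = np.array([1.00, 0.78, 0.62, 0.47, 0.38])

(k_fit,), _ = curve_fit(remaining_fraction, t, measured, p0=[0.01])
half_life = np.log(2) / k_fit
print(f"fitted rate constant k = {k_fit:.3f} / day, half-life ≈ {half_life:.1f} days")
```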

For that research, was your focus still on NIRS?

VBM: The crux of my research at that time was the rapid, non-destructive characterization (physical or chemical) of products. NIRS was a good tool for this. We used it again after that on dehydrated household waste in order to assess its anaerobic digestion potential. With the laboratory of environmental biotechnology in Narbonne and IMT Mines Alès, we developed a “flash” method that uses NIRS to quickly determine the quantity of bio-methane that waste can release. This research was subsequently transferred to the Ondalys company, created by Sylvie Roussel, one of my former PhD students. My colleague Jean-Michel Roger is still working with them to do the same thing with raw waste, which is more difficult.
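
The regression method behind such a “flash” calibration is not named here; partial least squares (PLS) regression is a common workhorse for relating NIR spectra to a lab-measured property, so a hypothetical calibration might be sketched as follows (all data below are synthetic):

```python
# Illustration: calibrating a NIRS-based "flash" predictor with partial least
# squares (PLS) regression. The data are synthetic; in practice the targets
# would be lab-measured bio-methane potentials for a set of waste samples.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

n_samples, n_wavelengths = 120, 300
spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)  # smooth-ish curves
true_coeffs = np.zeros(n_wavelengths)
true_coeffs[100:110] = 0.5                                            # a few informative bands
bmp = spectra @ true_coeffs + rng.normal(scale=0.5, size=n_samples)   # target values

pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, spectra, bmp, cv=5, scoring="r2")
print("cross-validated R^2:", scores.round(2))
```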

So you gradually moved from the agri-food industry to environmental issues?

VBM: I did, but it wasn’t just a matter of switching topics, it also involved a higher degree of complexity. In fruit, composition is restricted by genetics – each component can vary within a known range. With waste, that isn’t the case! This made environmental metrology more interesting than metrology for the food industry. And my work became even more complex when I started working on the topic of soil. I wondered whether it would be possible to easily measure the carbon content in soil. This took me to Australia, to a specialized laboratory at the University of Sydney. To my mind, all this different research is based on the same philosophy: if you want to improve something, you have to measure it!

So you no longer worked with NIRS after that time? 

VBM: A little less, since I shifted from sensors to assessment. But even that was a sort of continuation: when sensors were no longer enough, how could we make measurements? We had to develop assessment methods. It’s all very well to measure the biodegradability of a plastic, but is that enough to determine whether that biodegradable plastic has a low environmental impact? No, it isn’t – the entire system must be analyzed. I started working on life-cycle analysis (LCA) in Australia after realizing that LCA methods were not suited to agriculture: they did not account for water or for land use. Based on this observation, we improved the LCA framework to develop the concept of a regional LCA, which didn’t exist at the time, allowing us to make an environmental assessment of a region and compare scenarios for how that region might evolve. What I found really interesting in this work was determining how to use data from information systems and sensors to build as reliable and reproducible a model as possible. I wanted the assessments to be as accurate as possible. This is what led me to my current field of research – digital agriculture.

Read more on I’MTech: The many layers of our environmental impact

In 2017 you founded #DigitAg, an institute dedicated to this topic. What research is carried out there?

VBM: The “Agriculture – Innovation 2025” report submitted to the French government in 2015 expressed a need to structure French research on digital agriculture. We took advantage of the call to create Convergence Labs by founding #DigitAg, the Digital Agriculture Convergence Lab. It’s one of ten institutes funded by the Investments in the Future program, all created to carry out interdisciplinary research on a major emerging issue. At #DigitAg, we draw on engineering sciences, digital technology, biology, agronomy, economics, social sciences, humanities, management and more. Our aim is to establish the knowledge bases needed for digital agriculture to develop in a harmonious way. The challenge is not only to develop technologies but also to anticipate how they will be used and how those uses will transform agriculture, so that we can encourage ethical uses and prevent misuse. To this end, I’ve also set up a living lab, Occitanum (for Occitanie Digital Agroecology), set to start in mid-2020. The lab will bring together stakeholders to assess the use value of different technologies and to understand innovation processes. It’s a different way of carrying out research and innovation, one that incorporates the human dimension.