
Yelda and OSO-AI: new start-ups receive honor loans

On December 6, the Committee for the Digital Fund of the Graduate Schools and Universities Initiative selected two start-ups to receive honor loans: Yelda and OSO-AI. Together, Yelda, a start-up from the IMT Starter incubator, and OSO-AI, incubated at IMT Atlantique, will receive three honor loans totaling €80,000.

These interest-free loans, aimed at boosting the development of promising young companies, are co-financed by the Fondation Mines-Télécom, the Caisse des Dépôts and Revital’Emploi. The initiative has supported over 84 start-ups since 2012.

 

[box type="shadow" align="" class="" width=""]

Yelda is developing the first voice assistant for companies. The start-up’s team, composed of experts in bots, natural language processing, voice management and machine learning, is convinced that chat and voice interactions will soon replace traditional interfaces. This will revolutionize the way users interact with companies, whether as customers or as employees. Yelda, a start-up from the IMT Starter incubator, received an honor loan of €40,000. Find out more [/box]

[box type="shadow" align="" class="" width=""]

OSO-AI is already improving quality of life for the hearing impaired. The start-up intends to become the reference partner in artificial intelligence for hearing aids and to invent Augmented Auditory Reality. Incubated at IMT Atlantique, it received honor loans of €30,000 and €10,000. Find out more [/box]

 


The legal professions coping with the challenge of digital technology

[dropcap]T[/dropcap]he first two issues of Enjeux numériques/Digital issues focused on artificial intelligence (AI) and big data. These topics reappear in this new issue devoted to the impact of digital technology on the legal professions at a time when big data and advanced algorithms are two major factors in the pursuit of the digital transformation of our society and of jobs. This special issue seeks to better understand the social, economic and societal issues in this transformation of the legal professions, in particular the regulatory or ethical issues and the stakes of being economically competitive and attractive. Since the world is undergoing this transformation, we would like to present a panorama of European studies on this topic […]

Introduction by Françoise Trassoudaine and Jean-Pierre Dardayrol
Conseil Général de l’Économie

About Digital issues, the new series of Annales des Mines

Digital Issues is a quarterly series (March, June, September and December) that can be downloaded free of charge from the Annales des Mines website, with a print version available in French. The series focuses on the issues of the digital transition for an informed, though not necessarily expert, readership. As in all the Annales des Mines series, it combines perspectives from technology, economics and society.

Download all the articles in this issue for free


How has digital technology changed migrants’ lives?

Over the past few decades, migrants have become increasingly connected, as have societies in both their home and host countries. The use of new technologies allows them to maintain ties with their home countries while helping them integrate in their new countries. They also play an important role in the process of migration itself. Dana Diminescu, a sociologist at Télécom ParisTech, is exploring this link between migration and digital technology and challenging the traditional image of the uprooted migrant. She explains how new uses have changed migratory processes and migrants’ lives.

 

When did the link between migration and digital technology first appear?

Dana Diminescu: The link really became clear during the migration crisis of 2015. Media coverage highlighted the migrants’ use of smartphones and the public discovered the role telephones play in the migration process. A sort of “technophoria” appeared for refugees. This led to a great number of hackathons being organized to make applications to help immigrants, with varying degrees of success. In reality, the migrants were already connected well before the media hype of 2015. In 2003, I’d already written an epistemological manifesto on the figure of the connected migrant, based on observations dating from the late 1990s.

In the 1990s, smartphones didn’t exist yet; how were migrants ‘connected’ at that time?

DD: My earliest observation was the use of a mobile phone by a collective of migrants living in a squat. For them, the telephone was a real revolution and an invaluable resource. They used it to develop a network and find contacts. This helped them find jobs and housing; in short, it helped them integrate into society. Two years later, those who had been living in the squat had gotten off the street, and the mobile phone played a large role in making this possible.

What has replaced this mobile phone today?

DD: Social media play a very strong role in supporting integration for all migrants, regardless of their home country or cultural capital. One of the first things they do when they get to their country of destination is to use Facebook to find contacts. WhatsApp is also widely used to develop networks. And YouTube helps them learn languages and professional skills.

Dana Diminescu has been studying the link between migrants and new technologies since the late 1990s.

Are digital tools only useful in terms of helping migrants integrate?

DD: No, that’s not all. They also have an immediate, performative effect on the migratory process itself. In other words, an individual’s action on social media can have an almost instantaneous effect on migration movements. A message posted by a migrant showing that he was able to successfully cross the border at a certain point on the Balkan route creates movement: the other migrants will adjust their journey that same day. That’s why we now talk about migration traceability rather than migration movement. Each migrant uses and leaves behind a record of his or her journey, and these records are used in sociology to understand migrants’ choices and actions.

Does the importance of digital technology in migration activity challenge the traditional image of the migrant?

DD: For a long time, humanities research focused on the figure of the uprooted migrant. In this perception, migrants are at once absent from their home country and absent from their destination country, since they find it difficult to fit in completely. New technologies have had an impact on this view, because they have made these forms of presence more prominent. Today, migrants can use tools like Skype to see their family and loved ones from afar and instantly. In interviews, migrants tell me, “I don’t have anything to tell them when I go back to see them, since I’ve already told them everything on Skype.” As for presence in their destination countries, digital tools play an increasingly important role in access, whether through biometric passports or cards for accessing work, transport, etc. For migrants, the use of these tools makes their way of life very different from what it would have been a few years ago, when such access had not yet been digitized. It is now easier for them to exercise their rights.

Does this have an effect on the notion of borders?

DD: Geographical borders don’t have the same meaning they used to. As one migrant explained in his account one day, “They looked for me on the screen, they didn’t find me, I got through.” Borders are now based on our personal data: they’re connected to our date of birth, digital identities, locations. These are the borders migrants have to get around today. That’s why their telephones are confiscated by smugglers so that they can’t be traced, or why they don’t bring their smartphones with them, so that border police won’t be able to force them to open Facebook.

So digital technology can represent a hurdle for migrants?

DD: Since migrants are connected, they can, of course, be traced. This limiting aspect of digital technology also appears in the uses of new technology in destination countries. Technology has increased the burden of the informal contract between those who leave and those who stay behind. Families expect migrants to be very present. They expect individuals to be available at the times they used to spend in their company. In interviews, migrants say that it’s a bit like a second job. They don’t want to appear to have broken away, so they have to check in. At times, this leads migrants to lie, saying that they’ve lost their mobile phone or that they don’t have internet access, in order to free themselves from the burden of this informal contract. In such cases, digital technology is experienced as a constraint, and at times it can even be highly detrimental to social well-being.

In what sort of situations is this the case?

DD: In refugee camps, we’ve observed practices that cut migrants off from social ties. In Jordan, for example, it’s impossible to send children to get food for their parents. Individuals must identify themselves with a biometric eye scanner and that’s the only way for them to receive their rations. If they can’t send their children, they can’t send their friends or neighbors either. There is a sort of destruction of the social fabric and support networks. Normal community relationships become impossible for these refugees. In a way, these technologies have given rise to new forms of discrimination.

Does this mean we must remain cautious?

DD: We must be wary of digital solutionism. We conducted a research project with Simplon on websites that provide assistance for migrants. A hundred sites were listed. We found that, for the most part, the sites were either unusable or unfinished, and when they were finished, they were rarely used. Migrants still prefer using social media over specific digital tools. For example, they would still rather learn a language with Google Translate than use a language-learning application. They realize that they need certain things to facilitate their learning and integration process; it’s just that the tools developed for these purposes aren’t effective. So we have to be cautious and acknowledge that there are limitations to digital technology. What could we delegate to a machine in the realm of hospitality? How many humans are there behind training programs and personal support organizations?

 


Audrey Francisco-Bosson, particle tracker

Audrey Francisco-Bosson has just won a L’Oréal-UNESCO For Women in Science Scholarship. This well-deserved award is in recognition of the young researcher’s PhD work in fundamental physics, carried out at the Subatech laboratory at IMT Atlantique. By exploring the furthest depths of matter through the eyes of the ALICE detector of the Large Hadron Collider (LHC) at CERN, Audrey Francisco-Bosson tracks particles in order to better understand the mysterious quark-gluon plasma.

 

How can matter be recreated in the state it was in at the origin of the universe?

Audrey Francisco-Bosson: At our level, all matter is made up of atoms, whose nuclei are composed of protons and neutrons. Inside these protons and neutrons are quarks, bound together by gluons responsible for what we call the “strong interaction.” The Large Hadron Collider (LHC) at CERN allows us to break matter apart in order to study this strong interaction. When heavy nuclei collide with one another, the energy released is enough to free these quarks. What we end up with is a state of matter in which the quarks and gluons are no longer bound together: the quark-gluon plasma. This state corresponds to that of the universe a few microseconds after the Big Bang, at a temperature 100,000 times higher than that of the sun’s core.

What do you look at in the plasma?

AFB: The plasma itself has a very short lifetime: over a billion times shorter than a nanosecond. We cannot observe it. We can, however, observe the particles that are produced in this plasma. When they cool down, the quarks and gluons which were released in the plasma join together to form new particles. We measure their energy, momentum, charge and mass in order to identify and characterize them. All of these aspects provide us with information about the plasma. Since there are lots of different particles, it’s important to specialize a bit. For my PhD thesis I concentrated on the J/ψ particle.
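Identifying a particle from the energy, momentum, charge and mass of its decay products comes down, in essence, to an invariant-mass computation. The sketch below is my own illustration, not ALICE analysis code; the muon pair and its momenta are invented for the example (natural units, with c = 1 and energies in GeV).

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system.

    Each particle is given as (E, px, py, pz) in GeV (natural units, c = 1):
    m^2 = (E1 + E2)^2 - |p1 + p2|^2
    """
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(E**2 - (px**2 + py**2 + pz**2))

# Hypothetical back-to-back muon pair: muon mass 0.1057 GeV,
# each with momentum 1.5449 GeV, so E = sqrt(p^2 + m^2).
m_mu, p = 0.1057, 1.5449
E = math.sqrt(p**2 + m_mu**2)
mu_plus = (E, 0.0, 0.0, p)
mu_minus = (E, 0.0, 0.0, -p)

# The reconstructed mass lands near the J/psi mass (~3.097 GeV),
# which is how such a pair would be attributed to a J/psi decay.
print(round(invariant_mass(mu_plus, mu_minus), 3))  # -> 3.097
```

In a real detector, many candidate pairs are combined and the J/ψ appears as a peak in the invariant-mass distribution rather than a single clean value.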

Audrey Francisco-Bosson, winner of a 2018 L’Oréal-Unesco For Women in Science Scholarship. Photo: Fondation L’Oréal/Carl Diner.

 

What is special about the J/ψ particle?

AFB: Researchers have long been interested in it because it was identified as a good probe for measuring the temperature of the plasma. It is composed of a quark-antiquark pair, which breaks apart above a certain temperature. Researchers historically suspected that by looking at whether or not the pair had split apart, it would be possible to derive the temperature of the quark-gluon plasma. In practice, it turned out to be a bit more complicated than that, but the J/ψ particle is still used as a probe for the plasma. For my purposes, I used it to deduce information about the plasma’s viscosity rather than its temperature.

How do you use J/ψ to deduce the viscosity of the quark-gluon plasma?

AFB: It’s important to understand that there are huge pressure variations in the environment we’re observing. The particles do not all have the same characteristics and, importantly, they aren’t all the same mass. They are thus carried differently by the pressure differences. Since the J/ψ is quite heavy, observing how it moves allows us to observe the flow of the plasma: as in a river, objects travel at different speeds depending on their weight. By combining these observations of J/ψ particles with those of other particles, we deduce the viscosity properties of the plasma. That’s how it was proved that the quark-gluon plasma doesn’t behave like a gas, as we had thought, but like a nearly perfect, very low-viscosity fluid.

Does your scientific community still have any big questions about the quark-gluon plasma that the J/ψ particle could help answer?

AFB: One of the big questions is finding out at what moment this fluid characteristic is reached. That means that we can use the laws of fluid mechanics, and those of hydrodynamics in particular, to describe it. More generally, all this research makes it possible to test the properties and laws of quantum chromodynamics. This theory describes the strong interaction that binds the quarks. By testing this theory, we can assess whether the model used to describe matter is correct.

You are going to start working at Yale University in the USA in the coming weeks. What kind of research will you be carrying out there?

AFB: I’ll be working on the results of the STAR detector, which is located at the heart of the RHIC collider. It’s similar to the LHC ALICE detector but with different collision energies. The two detectors are complementary, so they allow us to compare different results in order to study variations between one energy and another and deduce new information about the plasma. For my part, the idea will also be to analyze collision data, like I did with ALICE. I’ll also work on developing new sensors. It’s an important task for me since I studied physical engineering before beginning my PhD thesis. I like to really understand how a detector works before using it. That’s also why I worked on a new sensor for ALICE during my PhD thesis which will be installed on the detector in 2021.

 



What’s new in the atmosphere?

In conjunction with the 4th National Conference on Air Quality, held in Montrouge on 9 and 10 October 2018, I’MTech sat down with François Mathé, a researcher in atmospheric sciences at IMT Lille Douai, to ask him five questions. He gave us a glimpse of the major changes ahead in measuring and monitoring air pollutants. From the revision of the ATMO index to technical challenges, he explains the role scientists play in one of today’s major public health and environmental challenges.

 

The ATMO index, which represents air quality in France with a number ranging from 1 (very good) to 10 (very bad), is going to be revised. What is the purpose of this change?

François Mathé: The purpose of an outdoor air quality index is to provide a daily report on the state of the atmosphere in a clear, easily accessible way for people who live in cities with over 100,000 residents. The ATMO index is based on measured concentrations of pollutants that are representative of their origins: ozone (O3), particulate matter (PM10), nitrogen dioxide (NO2) and sulfur dioxide (SO2). A sub-index is calculated daily for each of these chemical species, based on the average pollution levels recorded at specific stations, those that are representative of ambient or “background” pollution. The ATMO index is the highest of these sub-indices: the higher the value, the lower the air quality. The problem is that this approach takes into account neither proximity phenomena, such as vehicle or industrial emissions, nor the cocktail effect: if all four pollutants have a sub-index of 6, the ATMO index will be lower than if three of them have a sub-index of 1 and the fourth a sub-index of 8. Yet the cocktail effect can have short- and long-term health impacts. This is one of the reasons for the planned revision of the index: to better report on the state of the atmosphere, while updating the list of pollutants taken into consideration and making our national index consistent with those used by our European neighbors.
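The selection rule described above, one sub-index per pollutant with the overall index taken as the worst of them, can be sketched in a few lines. The threshold tables below are invented for illustration and are not the official regulatory scales:

```python
# Illustrative sketch of an ATMO-style index: each pollutant's daily
# average concentration maps to a sub-index (1 = very good ... 10 = very
# bad), and the overall index is the maximum of the sub-indices.
# The bounds below (in µg/m³) are made up for the example, NOT official.

ILLUSTRATIVE_THRESHOLDS = {
    # pollutant: upper bounds for sub-indices 1..9; above the last -> 10
    "O3":   [30, 55, 80, 105, 130, 150, 180, 210, 240],
    "PM10": [7, 14, 21, 28, 35, 42, 50, 65, 80],
    "NO2":  [30, 55, 85, 110, 135, 165, 200, 275, 400],
    "SO2":  [40, 80, 120, 160, 200, 250, 300, 400, 500],
}

def sub_index(pollutant, concentration):
    """Map a daily average concentration to a 1..10 sub-index."""
    for idx, bound in enumerate(ILLUSTRATIVE_THRESHOLDS[pollutant], start=1):
        if concentration <= bound:
            return idx
    return 10

def atmo_index(concentrations):
    """Overall index = highest (worst) sub-index among the pollutants."""
    return max(sub_index(p, c) for p, c in concentrations.items())

# The "cocktail effect" blind spot: three very low sub-indices and one
# high one give a higher overall index than four moderate ones.
print(atmo_index({"O3": 20, "PM10": 5, "NO2": 25, "SO2": 350}))   # -> 8
print(atmo_index({"O3": 140, "PM10": 40, "NO2": 150, "SO2": 220}))  # -> 6
```

Because only the maximum survives, four simultaneously moderate pollutants report a lower index than a single high one, which is exactly the limitation the revision aims to address.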

Why does the list of pollutants considered have to be updated?

FM: Sulfur dioxide (SO2) and carbon monoxide (CO) are chemical compounds that were in the spotlight for a long time. Although their toxicity is a real issue, these pollutants are now associated with very specific, clearly defined situations, such as industrial sites or underground parking lots. At the national level, it is no longer appropriate to take them into account. Conversely, new species are emerging that are worth monitoring on a national scale. In June, ANSES published a notice on non-regulated air pollutants that should be taken into consideration in monitoring air quality. The list includes pollutants such as 1,3-butadiene, ultrafine particles (UFP) and black carbon. In France, we also have a very specific problem: plant protection products, i.e. pesticides. ANSES has established a list of over 90 such products, which are currently being assessed through a year-long project covering the entire French territory. All of these ‘new pollutants’ require us to re-examine how air quality is presented to citizens. We could also mention pollens, which are often ‘marginalized’ when it comes to monitoring air quality in France.

Against this backdrop of changing the way air quality is assessed and represented, what role do researchers play?

FM: Behind these notions of measuring, monitoring and representing air quality, there are regulations at both the national and European level. And regulations imply technical standards and organizational guidelines. That’s where the central laboratory for air quality monitoring (LCSQA) comes in. It serves as the central scientific reference body, bringing together IMT Lille Douai, Ineris and the LNE (the national laboratory of metrology and testing). By pooling these different skills, it acts as a foundation of expertise, responsible for tasks such as validating the technical documents that establish measurement methodologies for pollutants, setting requirements for using devices, and verifying the technical compliance of the instruments themselves. For example, we conduct tests on sensors in the laboratory and in real conditions to assess their performance and make sure that they measure correctly compared with reference instruments.

Where do the regulations and guidelines you use in your work come from?

FM: Mainly from European directives. The first regulations date back to the 1980s; the texts currently in force, which establish the thresholds that must not be exceeded and the measuring techniques to be used, date from 2004, 2008 and 2015. The 2004 text specifically covers the chemical composition of PM10, in particular the concentrations of certain heavy metals (arsenic, cadmium, nickel) and organic compounds (benzo[a]pyrene as a tracer of polycyclic aromatic hydrocarbons). All other regulated gaseous and particulate pollutants are covered by the 2008 directive, updated by the 2015 text. These regulations determine our actions, but as end users we also play the opposite role: participating in the drafting and revision of standards at the European level. The LCSQA provides technical and scientific expertise on the application and evolution of these regulations. For example, we’re currently working hard on the technical guidelines that will be used for measuring pesticides. We also help verify the technical compliance of common and innovative instruments used to improve real-time measurement, which is essential for obtaining information of high enough quality to take appropriate steps more quickly.

What is at stake in this challenge of faster measurement?

FM: Air quality is one of people’s biggest concerns. There is no point in finding out today that we breathed poor-quality air last night; the damage has already been done. Quicker information enables us to take preventive action earlier, and therefore to be more effective in helping populations manage the risk of exposure. It also allows us to act more quickly on the causes: regulating traffic, working with industry, citizen initiatives, etc. So it’s a big challenge. To rise to it, real-time measurement, provided it is of sufficient quality, is our main focus. In the current system, for a number of the pollutants involved, the methodology consists of collecting a sample with an instrument, sending it to a laboratory for analysis, and reporting the results. The idea is to make this measurement chain as short as possible through direct, on-site analysis, with results reported as close as possible to the time samples are collected. This is where our role in qualifying devices is meaningful: these new systems have to produce results that meet our needs and are reliable, ideally approaching the level of quality of the current system.

 

[divider style="normal" top="20" bottom="20"]

A MOOC to learn all about air pollution

On October 8, IMT launched a MOOC dedicated to air quality, drawing on the expertise of IMT Lille Douai. It presents the main air pollutants and their origin, whether man-made or natural. The MOOC will also provide an overview of the health-related, environmental and economic impacts of air pollution.

[divider style="normal" top="20" bottom="20"]


Left out of the digital revolution?

Dominique Pasquier, Télécom ParisTech, Institut Mines-Télécom (IMT)

This text is published as part of the “Digital Society” column written by researchers from the Economic and Social Sciences department at Télécom ParisTech, members of the Interdisciplinary Institute for Innovation (CNRS).

[divider style="dotted" top="20" bottom="20"]

[dropcap]S[/dropcap]ome revolutions are silent ones. Internet integration in working-class areas is one such example.

Where do we stand in terms of the “digital divide”?

In the early 2000s there was a lot of talk about the “digital divide”, focusing on inequality in terms of both access and uses. Annual reports by CREDOC have shown that over the past ten years working-class categories have started to catch up in terms of internet access: in France, between 2006 and 2017, the proportion of employees with internet access at home increased from 51% to 93%, while for blue-collar workers it rose from 38% to 83% (CREDOC 2017: 48).

Age, rather than income or level of education, is now the most decisive factor (8 out of 10 of those who never use the internet are 60 or older). But these same reports show that while the divide is closing in terms of internet access, internet use among the working and lower classes remains less varied and less frequent than among the middle and upper classes. Individuals without college degrees find it harder to adjust to paperless administrative services, do less research, buy less online and only very rarely produce content. In short, there appears to be a kind of “working-class internet” that is less creative, less daring and, to a certain extent, less useful.

Shifting the focus

Perhaps the question should be looked at from a different angle. These statistical surveys are based on tallying up and lamenting shortcomings in comparison to the most innovative internet practices, meaning those of young, educated urban populations. The issue could be examined from a different perspective by starting out with the idea that the internet practices favored by the working and lower classes make sense in light of their everyday needs and that they are important indicators for understanding their relationship with the world and possible transformations in this relationship.

As Jacques Rancière explained in his analysis of the written productions of workers in the 19th century, it is a matter of setting the equality of intelligences as a starting point for discussing and understanding how “a common language appropriated by others” can be re-appropriated by those for whom it was not intended (Rancière 2009: 152).

This shifted focus would make it possible to perceive uses that may not stand out as remarkable, but which have dramatically transformed the relationship with knowledge and learning among those with low levels of education. Examples include looking up words used by doctors or found in the headings of children’s homework. To sophisticated internet users, these uses may seem rather basic, but they nevertheless bring about profound changes by making relationships with experts less asymmetrical and by reducing the phenomenon of the “deferential attitude” among the working classes that Annette Lareau analyzes in her excellent book, Unequal Childhoods (2011).

Online research: learning and shopping

Employees with lower-paying jobs who do not use digital technology for work also spend a great deal of time online to inform themselves about their profession or their rights: the success of websites for caregivers is a good example. Childcare workers discuss their views on bringing up children and hospital service workers talk about their relationships with patients. It is also important to note the ways in which tutorials have revived interest in expertise traditionally held by the working class: novel cooking ingredients, new gardening or DIY methods and never-seen-before knitting patterns have made their way into homes.

Working-class families thus use the internet to learn, but they also use it to make purchases. For people who live in rural or semi-rural areas, being able to access goods that had previously been impossible to find nearby in just a few clicks would appear to represent a huge opportunity. In reality, it is a bit more complicated. Online marketplaces are appreciated less for their choice than for the savings they provide through browsing for special offers. Looking for deals is the main motivation, along with the ability to manage supplies by purchasing in batches. At the same time, these savings play a role in weakening local business, or what remains of it.

In communities where people know each other well, shopkeepers are also neighbors, and sometimes friends, and betrayal can leave both sides feeling bitter. On the other hand, marketplaces for secondhand goods, such as Le Bon Coin, which has recruited a significant number of rural and working-class customers, are described as virtuous markets: they allow people to enjoy leisurely browsing by geographic location (a new source of gossip!) and provide an alternative to throwing away used goods. These marketplaces also give people the opportunity to earn a little money, and they help buyers maintain a sense of pride, since they are able to furnish their homes or buy clothes at low cost without resorting to donation systems. Online shopping has therefore led to paradoxical changes in relationships with the local community, destroying certain ties while creating others.

Using the internet for reading and communication

The internet can be described as a relationship with writing – the mark of those who have created it. This can be difficult for individuals with low levels of education who do not write much in their work. Email, which requires standardized writing, is largely underused in these working-class families: it is only used to communicate with e-commerce websites or administrative services, and is often related to a problem in the latter case.

This is also because email is based on an interpersonal, asynchronous communication rationale that goes against the standards of face-to-face relationships and group discussion that are so prevalent in working-class communities. Facebook has fared much better: it allows users to exchange content by sharing links, is in keeping with a group discussion system and does not require formal writing of any kind. This social network appears to be a good way to stay in touch with extended family and close friends, while seeking consensus on shared values. It is a self-segregating network which is not particularly open to other social worlds.

While the internet has kept many of its promises in terms of our relationship with knowledge, it clearly has not managed to break down the barriers between different social worlds.

[divider style="dotted" top="20" bottom="20"]

Dominique Pasquier, sociologist, research director at CNRS and author of L’Internet des familles modestes. Enquête dans la France rurale (The Internet of Low-Income Families. A Survey in Rural France) Paris, Presses des Mines, 2018.

Dominique Pasquier, sociologist, research director at CNRS, member of the Interdisciplinary Institute for Innovation (i3), Télécom ParisTech, Institut Mines-Télécom.

The original version of this article was published in French on The Conversation France.



What is the Industrial Internet of Things (IIoT)?

Industry and civil society do not share the same expectations when it comes to connected objects. The Internet of Things (IoT) must therefore adapt to meet industrial demands. These specific adaptations have led to the emergence of a new field: IIoT, or the Industrial Internet of Things. Nicolas Montavont, a researcher at IMT Atlantique, describes the industrial stakes that justify the specific nature of the IIoT and the challenges currently facing the scientific community.

 

What does the IIoT look like in specific terms?

Nicolas Montavont: One of the easiest examples to understand is the monitoring of production lines. Sensors ensure that a product is manufactured under good conditions, by controlling what travels down the conveyor belt and by measuring the temperature, humidity or luminosity of the work environment. Actuators can then respond to the data received, for example by reconfiguring a production line based on the environment or context, allowing a machine to perform a different task.
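The sense-then-act pattern described here can be reduced to a small control loop. The sketch below is a minimal illustration, not code from any real deployment: the quantity names, acceptable ranges and in-memory readings are all invented, where a real IIoT system would receive readings over a network and drive physical actuators.

```python
# Minimal sketch of the monitoring loop described above: sensor readings
# are checked against acceptable ranges, and an actuator callback fires
# when any reading leaves its range. All names and thresholds here are
# invented for illustration only.

ACCEPTABLE_RANGES = {
    "temperature_c": (15.0, 30.0),
    "humidity_pct": (30.0, 60.0),
    "luminosity_lux": (200.0, 1000.0),
}

def check_readings(readings):
    """Return the list of quantities that are out of their range."""
    out_of_range = []
    for name, value in readings.items():
        low, high = ACCEPTABLE_RANGES[name]
        if not (low <= value <= high):
            out_of_range.append(name)
    return out_of_range

def control_step(readings, actuator):
    """One loop iteration: inspect the readings, act if needed."""
    problems = check_readings(readings)
    if problems:
        actuator(problems)  # e.g. slow the conveyor, raise an alert
    return problems

# Example: humidity drifts above its range, triggering the actuator.
alerts = []
problems = control_step(
    {"temperature_c": 22.0, "humidity_pct": 72.5, "luminosity_lux": 450.0},
    actuator=alerts.append,
)
print(problems)  # -> ['humidity_pct']
```

The industrial difficulty discussed in the rest of the interview is not this logic itself but getting the readings to such a loop reliably and with low latency over a network.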

How does the IIoT benefit companies?

NM: There are benefits in every area: production times, line performance, cost reduction, etc. One major benefit is increased flexibility thanks to a more autonomous system. Production lines can operate and adapt with fewer human interventions. Staff can therefore transition from a role of handling and management to supervision and control. This change especially benefits small businesses. Today, production is very focused on large volumes. Increased flexibility and autonomy let companies find more cost-effective ways of manufacturing small quantities.

What justifies referring to IIoT as a separate field, distinct from the mainstream IoT?

NM: Mainstream IoT technologies are not designed to meet industry requirements. In general, consumer IoT applications have modest performance requirements: communicating objects send non-critical data packets without strict time constraints. The opposite is true in industry, which requires object networks that deliver critical data with the lowest possible latency. Specific IoT standards must therefore be developed for the industrial sector, hence the name IIoT. Companies also do not want to be locked into proprietary standards, and so they are pushing for the Internet to become the network that supports their architectures.

Why do companies have more performance constraints for their networks of communicating objects?

NM: One scenario that clearly illustrates industrial constraints is the one we selected for the SCHEIF project, in the context of the German-French Academy for the Industry of the Future (GFA). We initiated a collaboration with the Technische Universität München (TUM) on the quality of the network and data in an industrial environment. We started with a scenario featuring a set of robots that move autonomously through a work environment. They can accomplish specific tasks and can also detect and adapt to environmental changes. For example, if a person walks through the area, they must not be hit by the robots. This scenario includes a major safety aspect, which demands an efficient network, low latency, good-quality data communications and an accurate assessment of the state of the environment.

What are the scientific challenges of this type of example?

NM: First of all, locating robots indoors in real time presents a challenge. Technologies exist but are not yet mature and do not offer sufficient performance. Secondly, we need to make sure the robots exchange monitoring data in an appropriate manner, by prioritizing the information. The central question is: who needs to send what, and when? We are working on how to schedule the communications and how to represent the robots’ knowledge of their environment. We also face energy consumption constraints, both for the hardware and for the network. Finally, there is a significant cybersecurity aspect, which has become a major focus area for the scientific community.
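The question of “who needs to send what, and when” can be illustrated with a toy priority-based scheduler. The traffic classes and priority values here are assumptions made for illustration; a real IIoT network would map them onto standardized traffic classes and deterministic time slots:

```python
import heapq

# Hypothetical priority levels: safety messages preempt control,
# which preempts routine telemetry.
PRIORITY = {"safety": 0, "control": 1, "telemetry": 2}

def schedule(messages):
    """Order pending messages so critical data is sent first,
    breaking ties by submission order (FIFO within a class)."""
    queue = [(PRIORITY[kind], seq, payload)
             for seq, (kind, payload) in enumerate(messages)]
    heapq.heapify(queue)
    return [heapq.heappop(queue)[2] for _ in range(len(queue))]

pending = [("telemetry", "temp=21C"),
           ("safety", "human detected in zone 2"),
           ("control", "robot-4 reroute")]
print(schedule(pending))
# the safety message is emitted first, then control, then telemetry
```

This only captures ordering at a single node; the research challenge the interview points to is doing this across an entire network, under latency and energy budgets.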

digital simulation

What is digital simulation?

Digital simulation has become an almost mandatory step in developing new products. But what does “simulating behavior” or “modeling an event” really mean? Marius Preda, a computer science researcher at Télécom SudParis, explains what’s hiding behind these common industry expressions.

 

What is digital simulation used for?

Marius Preda: Its main goal is to reduce prototyping costs for manufacturers. Instead of testing a product with real prototypes, which are expensive, companies use fully digital twins of these prototypes. These virtual twins take the form of a 3D model that has all the same attributes as the real product (colors, dimensions, visual appearance) and, most importantly, into which a great quantity of metadata is injected, such as the physical properties of the materials. This makes it possible to create a simulation that is very close to reality. The obvious advantage is that if the product isn’t right, the metadata can simply be changed, or the attributes of the digital twin can be modified directly. With a real prototype, it would have to be entirely remade.
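The idea of a digital twin as geometry plus injected metadata can be sketched in a few lines. The class name, fields and material values below are illustrative assumptions, not an actual simulation API:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """A toy digital twin: shared attributes of the real product,
    plus injected physical metadata driving the simulation."""
    name: str
    dimensions_mm: tuple   # simplified stand-in for a full 3D mesh
    color: str
    material_props: dict = field(default_factory=dict)

    def update_metadata(self, **props):
        # Changing the twin is a dictionary update,
        # not a new physical prototype.
        self.material_props.update(props)

# Aluminum door panel (illustrative values)
door = DigitalTwin("car door", (1200, 800, 45), "silver",
                   {"youngs_modulus_GPa": 69, "density_kg_m3": 2700})

# Try steel instead: only the metadata changes
door.update_metadata(youngs_modulus_GPa=200, density_kg_m3=7850)
```

This is the economic point of the interview in miniature: switching materials is a metadata edit on the twin, where a physical prototype would have to be rebuilt.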

What can be simulated?

MP: The main focus is on production. Companies run simulations in order to accurately measure all the parameters of a part and obtain production specifications; this accounts for a high percentage of digital simulation use. After that, there are significant concerns about aging. The physical laws that determine how materials wear are well known, so companies inject them into digital models in order to simulate how a part will wear as it is used. One of the newer applications is predictive maintenance: simulations can be used to predict breakage or faults in order to determine the optimal moment at which a part should be repaired or replaced. All of that relates to products, but there are also simulations of whole factories to study their operations, and even simulations of the human body.

Read on I’MTech:  A digital twin of the aorta to help prevent aneurysm rupture

How is a digital simulation created?

MP: The first step is defining the goal of the simulation. Taking a car, for example, if the goal is to study how the car body deforms during an impact, the modeling will be different from if the goal were to analyze visual and sound comfort inside the passenger compartment. So modeling is carried out based on what the aim is: automobile manufacturers don’t create a 3D model with the idea that they’ll be able to use it for all simulations. The 3D form may be the same, but what’s important are the physical properties that will be included within the model. For crash test simulations, properties related to the way materials deform are injected into the model in the form of equations that govern their behavior. For sound comfort, the laws of reflectivity and sound propagation are included.

What form do simulations take?

MP: Virtual reality is often presented as something new, but manufacturers have been using it for years for simulations! In the past, they would create 3D environments called “caves,” which were rooms in which different parts of a car – to continue with our automobile example – were projected on the walls. Today, virtual reality headsets make it possible to save space and put more people in the same virtual environment. But beyond this highly visual form of simulation, what industry professionals are really interested in is the model and the results behind it. What matters isn’t really seeing how a car deforms in an accident, but knowing by how many centimeters the engine block penetrates into the passenger compartment. And sometimes, there isn’t even a visual: the simulation takes the form of a curve on a graph showing how material deformation depends on the speed of the car.

What sectors use digital simulations the most?  

MP: I talk about the automobile industry a lot since it’s one of the first to have used digital simulations. Architects were also among the first to use 3D to visualize models. And factories and relatively complex industrial facilities rely on simulation too. Among other things, it allows them to analyze the piping systems behind the walls. It’s a way to access information more easily than with plans. On the other hand, there are sectors, such as construction and civil engineering, where simulation is under-utilized and plans still play a central role.

What are some major ways digital simulation could evolve in the near future?

MP: In my opinion, interaction between humans and 3D models represents a big challenge. New devices like virtual reality glasses are being used, but the way people interact with the model remains unnatural. Yes, from within a virtual space, users can change how rooms are arranged with a wave of the hand. But if they want to change the physical parameters behind a material’s behavior, they still have to use a computer to introduce raw data in coded form. It would be a major advance to be able to directly change these metadata from within the virtual environment.

 


Special issue: Electronic textiles

[dropcap]E[/dropcap]lectronic textiles is a multidisciplinary field which is not limited to the characterization and development of novel materials and devices. The field also targets technologies related to the interconnection of electronic functionalities leading to smart networks and to the development of hybrid approaches integrating flexible devices with traditional solid‐state electronics.

Furthermore, unlike other emerging technologies, electronic textiles are, in part, based on one of humankind’s oldest technologies. Well‐established sectors, such as the textile and fashion industries, thus play a central role in material and process development, and not just in “end‐game” commercialization. This unique connection to industry makes electronic textiles an exciting and dynamic research field where academia and the private sector work hand in hand to advance all aspects of the technology.

[Extract from the Editorial]

 

This special issue on electronic textiles was edited by Esma Ismailova (researcher at the Centre Microélectronique de Provence of Mines Saint-Étienne), Tobias Cramer, and Daniel T. Simon, the organizers of the Symposium “Electronic textiles” (E‐MRS Spring 2017 meeting).
It was planned together with Wiley and the European Materials Research Society (E‐MRS).

[divider style=”normal” top=”20″ bottom=”20″]

ADVANCED MATERIALS TECHNOLOGIES
Volume 3, Issue 10
Special issue: Electronic textiles
Esma Ismailova, Tobias Cramer, Daniel T. Simon (ed.)
Wiley, October 2018