Yaneck Gottesman

Precision measurement and characterization

Yaneck Gottesman, a metrologist and specialist in the analysis and characterization of components, contributes to the development of the Optics and Photonics laboratory at Télécom SudParis. The lab is equipped with innovative high-performance electronic instruments for measurement, with multiple applications covering the fields of healthcare, telecoms and security.

 

Why does a component stop working? What happened in its transition from ‘working’ to ‘broken’? What unforeseen physics are at work? This is what Yaneck Gottesman tries to understand; he is interested in “seeking out the flaws in our understanding of everyday objects”. Metrology, the science of measurement, demands precision and adaptability and is at the heart of the professor’s research at Télécom SudParis. One example is an experiment in which the optical properties of a mirror produced results that were difficult to explain. It took a year of questioning the framework of interpretation to show that the object was affected by vibrations that were almost undetectable because their amplitude was considerably less than a micron. The benefit of this experiment was that it enabled the team to develop their expertise and establish a protocol for characterizing the dynamic properties of the objects studied: “when I carry out an analysis I always have to ask myself what it is I am really measuring and whether I need to reconsider the model. It is a question of being able to dissociate the object being measured from the instrument used to measure it.” Here, it was necessary to break free of a supposedly static context in order to carry out a full dynamic analysis.

‘Homemade’ instruments to meet specific requirements

This ability to step outside the conventional framework is a fundamental quality that enables the development of such specific tools, and it is a strength of the Optics and Photonics laboratory. Such tools include the OLCI (Optical Low Coherence Interferometry) and the OFDI (Optical Frequency Domain Interferometry). These two instruments have been specially developed for measurement and analysis with micrometric resolution over distances of up to 200 m (depending on the instrument used). Interferometry is a method that uses two signals produced by the same optical source: one serves as a reference while the other probes the object to be analyzed. Data is obtained from the superposition of these two signals, which have propagated under different conditions.
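To make the principle concrete, here is a minimal numerical sketch of a low-coherence measurement (in Python; an illustrative model only, not the laboratory’s actual instrument), assuming a simple Michelson-type geometry: a broadband source is split between the reference arm and the measurement arm, and interference fringes appear only where the two optical path lengths match, which localizes a reflector with micrometric precision.

```python
# Minimal sketch of low-coherence interferometry (OLCI-style). Illustrative only.
import numpy as np

lambda0 = 1.31e-6        # center wavelength of the broadband source (m)
dlambda = 100e-9         # source bandwidth (m); a broad spectrum means a short coherence length
k0 = 2 * np.pi / lambda0

# Gaussian source spectrum sampled in wavenumber k
k = np.linspace(0.9 * k0, 1.1 * k0, 2000)
dk = (2 * np.pi / lambda0**2) * dlambda / (2 * np.sqrt(2 * np.log(2)))  # spectral std dev in k
spectrum = np.exp(-((k - k0) ** 2) / (2 * dk**2))

z_reflector = 50e-6                       # position of a reflector in the measurement arm (m)
z_scan = np.linspace(0, 100e-6, 4000)     # scanned reference-arm positions (m)

# Detected intensity: sum over the spectrum of two-beam interference terms
# (the factor 2 accounts for the round trip of the light in each arm).
path_mismatch = 2 * (z_scan[:, None] - z_reflector)
intensity = np.sum(spectrum * (1 + np.cos(k * path_mismatch)), axis=1)

# The fringe envelope peaks where the two paths match: the reflector is located
# to within the source's coherence length (a few microns here).
print(f"Estimated reflector position: {z_scan[np.argmax(intensity)] * 1e6:.2f} µm")
```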

In the case of research work carried out by various French laboratories on next-generation optical fibers, for example, the tools on the market were not suitable because the results they recorded were ambiguous to interpret. This difficulty led Yaneck Gottesman and his colleagues to work on controlling the emission properties of the source, obtaining a flexible interferometric architecture and mastering the full signal-processing chain. This approach was a determining factor in demonstrating an unusual property of the fibers studied.

 

The prototype of the reflectometry bench developed in the frequency domain: this system is used, among other things, for the spectral, spatial and modal characterization of optoelectronic circuits and components

Quality and variety of the data recorded

What really adds value to the work carried out on the laboratory’s instruments, however, “is the quality and variety of information recorded simultaneously when an object is measured”. The first major and difficult challenge concerns the absolute precision an instrument can reach. Here, it is ensured by a systematic approach combining benchmarks, optical referencing methods and electronic circuits used to digitally compensate for optical fluctuations specific to the environment and the instrument. The second challenge concerns the diversity of the information recorded. The proposed solution consists of developing instruments capable of simultaneously recording all the vectorial quantities of the collected light, such as intensity, temporal and spatial phase, and polarization. Thanks to this ‘all in one’ approach, each instrument becomes a spectrum analyzer as well as an ellipsometer, a telemeter or a Doppler scanner. Moreover, this diversity provides detailed information on the electromagnetic field that is otherwise unobtainable.
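As an illustration of what recording the vectorial quantities of the light can mean in practice, here is a minimal sketch (illustrative Python, not the laboratory’s code): a collected field is represented as a complex Jones vector, from which intensity, phase and the polarization state (via the Stokes parameters) are all derived from the same measurement.

```python
# Minimal sketch: deriving intensity, phase and polarization from a complex field.
# The field values below are purely hypothetical.
import numpy as np

# Jones vector of the collected light: complex x and y components
Ex = 1.0 * np.exp(1j * 0.0)
Ey = 0.5 * np.exp(1j * np.pi / 2)   # quarter-wave phase shift -> elliptical polarization

intensity = abs(Ex) ** 2 + abs(Ey) ** 2
phase_x, phase_y = np.angle(Ex), np.angle(Ey)

# Stokes parameters describe the polarization state (one common sign convention)
S0 = abs(Ex) ** 2 + abs(Ey) ** 2
S1 = abs(Ex) ** 2 - abs(Ey) ** 2
S2 = 2 * np.real(Ex * np.conj(Ey))
S3 = -2 * np.imag(Ex * np.conj(Ey))

print(f"Intensity: {intensity:.2f}, phases: ({phase_x:.2f}, {phase_y:.2f}) rad")
print(f"Stokes vector: ({S0:.2f}, {S1:.2f}, {S2:.2f}, {S3:.2f})")
```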

The approaches that have been developed, some of which have been patented by Télécom SudParis, provide extremely powerful observation tools with multiple, specialized uses depending on the objects examined, such as fiber-optic instruments for optoelectronic components, or free-space instruments for OCT (optical coherence tomography) imaging. The range of potential applications is wide. In the healthcare sector, biosensors are very promising (see insert), as is cell imaging. When biologists start to use these instruments, a large number of medical applications will emerge. Other fields are also concerned, such as telecommunications, whose components would benefit from extremely precise diagnostics. Security will also benefit from unfalsifiable biometric sensors: a property that is a direct product of the variety of measurements carried out simultaneously by these unique devices.

[box type=”shadow” align=”” class=”” width=””]The OLCI, patented and serving healthcare
Radical transformations are expected in the field of healthcare thanks to innovative measurement instruments. The OLCI, and its use with an OCT – two devices that have both been patented by Institut Mines-Télécom – may allow the early diagnosis of certain illnesses. Imagine a drop of blood on a miniature optical surface made up of specialized zones in which millions of bio-photonic sensors are placed and which react with the molecules present in the blood. These chemical reactions modify the optical nature of the surface, which is analyzed in detail by the OLCI. The results are correlated and interpreted by a physician in order to provide a very precise diagnosis.[/box]

An unrivaled instrumental platform and level of expertise

The laboratory has an instrument base named the VCIS (Versatile Coherence Interferometry Setup), which can be combined and structured according to needs and applications. It relies on high-performance tools, which Yaneck Gottesman hopes will be used by laboratories, industry, universities, manufacturers and researchers: “whether they are private or public, the platform is open for them to come and analyze or evaluate objects of interest in detail, test the performance of their components or develop new, more specialized instruments using the existing modular base.”

Besides the economic benefits of this partnership, there is a desire to create a place of open exchange which will allow progress to be made in multiple fields at the same time. Attracting users and being surrounded by them, in order to understand their needs and benefit from their expertise, is definitely the best way to remain in contact with the domain in question and anticipate the future.

 

Optics as a central theme

Yaneck Gottesman is an alumnus of the École Centrale de Marseille (formerly ESIM), which he left in 1997 and where, in his final year, he also took a postgraduate advanced diploma in optics with the École Nationale Supérieure de Physique de Marseille. He then joined the CNET in Bagneux (French National Centre for Studies in Telecommunications, now Orange Labs), where he prepared and, in 2001, defended a thesis on optical reflectometry for component analysis. He then spent a post-doctoral year at the Laboratory for Photonics and Nanostructures (LPN) at the CNRS in Marcoussis, focused on non-linear optics. In 2002 he joined the Electronics and Physics department (EPH) at Télécom SudParis (INT at the time), where he specialized in precision measurements and the physics of optoelectronic components. He has been accredited to lead research since February 2014.

Editor: Umaps

Embodied intelligence: a bio-inspired approach to robotics

Using nature as inspiration is certainly the oldest scientific approach, and it still has much to reveal. Bio-inspired robotics, a research topic at Mines Nantes, uses this process. It does not aim simply to mimic, but to understand the tricks nature has found to solve problems. Researcher Frédéric Boyer’s work is driven by such meticulous observation. He and his team develop various robots that do not need computer calculations to achieve autonomy, and which explore their environment thanks to the characteristics of their bodies and to new senses such as the electrical sense.

 

Locomotion: when morphology does away with calculation

Underwater robots for deep mining operations, humanoid robots for aiding elderly or dependent people, and drones the size of an insect for passing unnoticed all share the common task of having to reproduce complex human or animal behavior such as vision or locomotion, and to act fully autonomously even before displaying behavior that could be considered intelligent, such as decision-making. Research in artificial intelligence has long provided answers to these questions by following two major lines of thinking: the symbolic paradigm represented by expert systems, based on rules and logic, and the subsymbolic paradigm represented by the neural network approach. However, observations of nature have revealed that the brain is not always absolutely necessary. For example, even when dead, a fish continues to extract energy from a series of vortices in order to move forward: a form of ‘passive swimming’ that has been modeled at Mines Nantes. Making robots perform complex behavior does not therefore always require computer calculations.

There are a large number of physicists in the robotics community, as Frédéric Boyer, himself a researcher and professor in robotics, reminds us, and they do not especially look at the brain. “We try instead to reduce top-level cognitive problems to lowest-level solutions, those that are as closely bound as possible to the body.” This has the advantage of freeing the brain of tasks that it does not need to process, since the body is adapted to perception and action. “Bio-inspired robotics explores a new paradigm for unlocking autonomy by reconsidering intelligence as an attribute emerging from the interactions between the machine-animal’s body and its environment; it is embodied intelligence, morphological computation.” In 2004 the team in which Frédéric Boyer works at IRCCyN (Institut de recherche en communication et cybernétique de Nantes) created an underwater machine that can move around like an eel in marine environments that are not easily accessible.

The researcher looks at locomotion problems in highly restricted environments, for example snakes in a tree or worms in a pipe. The resulting applications, currently few and far between, may concern maintenance in pipelines. Even more interesting is a gibbon-robot, an assembly of articulated arms ending in magnets, which swings majestically like a pendulum until it releases one of its grips, targeting a higher one, and thus climbs up a wall. The objective here is to propel the machine by sourcing part of the necessary energy from its surroundings, with the movement produced making the most of gravity which is conceived as a resource provided by the environment.

 

The electrical sense is studied in a pool in which a probe is able to move in three dimensions in a maze.

The bio-inspired process in three stages

Questions have long been asked about the flight of insects, which is theoretically impossible if we take the flight of birds or planes as a reference. Through close observation, it was eventually understood that the wing’s torsion as it beats supplies the additional lift required for flight. “Without this mechanism that uses morphology, it could never work,” the roboticist explained. This has paved the way for the design of insect-sized drones, previously impossible.

“A technical barrier? Think bio-inspiration,” continues the researcher, giving the example of small, autonomous underwater robots dedicated to navigation in very awkward environments in rough waters. “Nature constitutes a wonderful library of inspiration,” he adds. The bio-inspired process occurs ‘naturally’ in three stages. First, looking at nature and observing living matter with the aid of biologists; this observation phase can be carried out for its own sake, with no set objective, because in nature there is still a lot to learn and to record. Next, functions are extracted from these observations. “We don’t copy, we understand. This phase is based on mathematics. It should not be confused with biomimicry,” the researcher stresses. It must be remembered that nature is not always optimized: it remains fairly redundant, and precise copying could be a source of inefficiency. On the other hand, “nature has found simple tricks, and the mathematization of living matter allows these good ideas to be extracted”. The final step in the process entails implementing these tricks in technological devices that do not belong to nature. Such implementations are a bonus for the biologists who participated at the beginning, and these multi-disciplinary exchanges often allow them to understand their own field of research better.

As Frédéric Boyer explains, systematic exploration of nature within this approach, with no preconceived search for an application, should be encouraged. Indeed, it quickly becomes clear that “looking at an animal or plant, even the most humble animal that you might squash underfoot, can occupy the life of several researchers”, as the nature lover enthusiastically explains. The subject goes as far as the creation of new senses, such as the electrical sense possessed by certain fish: they emit electrical fields which are deformed by nearby obstacles, and the fish detect these deformations thanks to sensors on their skin. This observation was used to enable the robot named Angels, developed in a European project coordinated by Frédéric Boyer, to swim in the dark, its electrical sense giving it “a sort of immaterial body”. Although new and still little-explored, this field of research is proving of interest to industry players. The researcher and his team work with Areva on swimming in contaminated mud and snake-robots in pipes, and with CEA on remotely operated commands using electro-haptic feedback. They are also partners in a new H2020 European project called subCULTron (Submarine cultures perform long-term robotic exploration of unconventional environmental niches), whose aim is to create a swarm of cooperative underwater robots with an electrical sense for monitoring the canals in Venice.

The next step: getting out into the air. The human body contrasts strongly with air in terms of electrical fields. Research on the robot-eel and the electrical sense may thus open up new and fruitful approaches to cooperation between robots and human beings. Frédéric Boyer concludes with an invitation to his fellow researchers who are aware of the importance of cross-disciplinary work: “bio-inspiration provides a systematic way of ‘shaking up’ your research”.

 

A Professor at Mines Nantes, Frédéric Boyer is passionate about biology. After completing a thesis in robotics at Paris Diderot University, he began by doing “mathematical calculations all day long in geometrical mechanics for high-distortion structures such as cables and soft robots”. He was then awarded his Accreditation to Lead Research and entered the field of bio-inspired robotics. He won the Michel Monpetit prize from the Académie des Sciences in 2006 for his work in dynamics. More recently, he and his team received the La Recherche 2014 prize in the technology category for their work on the electrical sense. These results motivate Frédéric Boyer to do research that is even more applied, “combining my love of nature with my work”. His team collaborates with the other laboratories at Mines Nantes: Subatech (a joint research unit in nuclear technology) for sensors, and DSEE (Département Systèmes Énergétiques et Environnement) for fluid mechanics, as well as with several European laboratories.

Editor: Nereÿs

Marine pollution as seen by ultrafast cameras

Ultrafast cameras unveil processes that are invisible to the naked eye. At Mines Alès, Pierre Slangen, a specialist in applied optics, uses them to build advanced technological devices and thus to understand how gases and liquids disperse during environmental disasters.

 

 

Certain physical phenomena occur on such short time scales that they remain practically invisible without the use of sophisticated cameras and state-of-the-art laser devices. At Mines Alès, Pierre Slangen has developed a research theme in applied optics and new technologies for visualizing pollutants in aquatic environments. As early as 1993, before earning his State Doctorate in Belgium, he characterized materials through the use of holograms, measuring movements of just 0.5 µm. He continued his work in this field in 1995 when he joined Mines Alès. In 2003 he collaborated, as part of the Clara project, with Cedre, Ifremer, Météo France and Inéris on a new IT system designed to help decision-making when dealing with chemical substance spills at sea. All that remained was to observe the development of such spills in the water in detail. There is increasing research into viewing the invisible, from traces of nanoparticles to certain gases and chemical products, thanks to the possibilities offered today by ultrafast cameras. “They enable us to discover a significant number of extreme processes where optics, although it does not always enable us to see, at least enables us to distinguish,” Pierre Slangen explains.

 

Evaluating and anticipating industrial risks

Stakeholders in the oil and gas industry have been expressing their need for modelling tools for offshore leaks for several years. The accident at the Deepwater Horizon rig in 2010 highlighted the importance of detailed knowledge of underwater phenomena and their consequences at the surface. In 2011 the METANE (Modeling undErwater gas/oil blowouT And lNg lEak) collaborative project received a double accreditation from the French Pôle Mer Bretagne Atlantique and Pôle Mer Méditerranée, as well as funding from the FUI (French Unique Interministerial Fund), to develop such a decision-making tool for industrial hazards linked to underwater leaks of oil, natural gas or liquefied natural gas (LNG) at sea. This 3-year project has received total financial support of €330k. METANE is also a grouping of major public and private-sector actors, each of which contributes its own expertise: Alyotech, Cedre, Nymphea Environnement, Mines Alès and GDF SUEZ.
Ultimately, the tool developed will enable prevention and disaster-management plans to be defined, focused on the risk of accidents involving underwater hydrocarbon leaks. In order to calibrate and validate the numerical model, the project required laboratory trials both at Mines Alès and in situ at Cedre in Brest. These trials allowed the trajectory, dissolution and rising speed of gas bubbles or oil droplets in the water column to be studied using optics. The project is focused on safety and the environment and aims to provide an understanding of the risks for the personnel and equipment of offshore installations.

Ultrafast imaging

Now essential in the fight against marine pollution for detecting minute drops of pollutants or the ejection of gases with highly complex dynamics, these high-frequency cameras offer a better understanding of the way different fluids mix with each other. When a chemical tanker sinks, the products it was transporting can indeed behave differently within the water column. These non-linear fluid dynamics, some of which unfold over just a few milliseconds, must therefore be observed in detail in laboratory experiments. Thanks to these cutting-edge cameras it is now possible to ‘stretch out’ time. The frame rate can reach a million frames per second: a far cry from consumer cameras, which rarely exceed 50 frames per second! The highest frequencies allow us to see how matter structures itself in space.
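As a back-of-the-envelope illustration of this ‘stretching out’ of time, using the frame rates quoted above (the numbers below are purely illustrative):

```python
# Back-of-the-envelope illustration of how ultrafast imaging "stretches out" time,
# based on the frame rates quoted above (illustrative values only).
capture_fps = 1_000_000    # acquisition rate of an ultrafast camera (frames per second)
playback_fps = 50          # typical playback rate of a consumer display

slowdown = capture_fps / playback_fps            # 20,000x slow motion
event_duration_s = 1e-3                          # a 1 ms phenomenon, e.g. a gas ejection
video_duration_s = event_duration_s * slowdown   # 20 s of footage when replayed

print(f"Slow-down factor: {slowdown:,.0f}x")
print(f"A {event_duration_s * 1e3:.0f} ms event becomes {video_duration_s:.0f} s of video.")
```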

The ultrafast cameras used by Pierre Slangen and his team, which sometimes work at rates far beyond anything we can perceive, allow micro-processes to be filmed and then viewed in fine detail. Slowing down time makes it possible to see what happens in each time interval, from microseconds to seconds, which is essential for understanding the dynamics of how chemical products spread when a ship sinks or when a gas contained in a pipe explodes. For Laurent Aprin and Frédéric Heymes, the team’s researchers in fluid dynamics, the results obtained in the field of optics allow a better understanding of how a chemical product diffuses as it rises through the extreme temperature and pressure conditions found when a ship sinks in deep water. This leads to better predictions of the amount of product to be handled by clean-up operations at the surface and in the air surrounding the site where diffusion is to be controlled (explosiveness, toxicity, etc.).

 

Illumination using lasers

Such high-speed imaging requires suitable illumination, so Pierre Slangen’s team builds complete devices in which the light is supplied by superluminescent LEDs or by pulsed lasers that emit light beams in the form of flashes. These devices vary according to the problem: an oil droplet mixing with water cannot be compared to the change in shape of a piece of laminate sheet metal for a vehicle, the artificial limb of a Paralympic athlete such as Dominique André at the Sydney Games in 2000, or the device set up for the Zéline Zonzon dance company, who wanted to show that dancers perceive and displace the air around them. “I used to be a developer of specific techniques, today I am a builder. I always tell my students that if you do not master the contents of a box, you will not master the box,” Pierre Slangen explains.

The work does not stop here, however. Not satisfied with building the best devices, Pierre Slangen wants to qualify them. He and his team mathematically quantify these methods from a metrological perspective and evaluate the deterioration of information linked to measurement uncertainties such as noise inherent to the coherence of the laser source and distortions that are intrinsic to the cameras and their lenses.

The advantage of these optical techniques for field measurements, especially in diffuse light, is that they are well suited to non-destructive testing and to measuring 3D displacement fields for solids and fluids, thanks to their high sensitivity and excellent spatial resolution. Mechanical trials based on techniques such as image correlation and the measurement of interference in diffuse light produce kinematic fields that enable the behavior of the materials studied to be identified.

Pierre Slangen’s next challenge is to observe larger fields (1 m²) with a resolution of 1 µm. This would require fast sensors of tens of millions of pixels… which don’t exist yet. The idea is to emulate the astrophysicists in Chile who, with the European Southern Observatory’s Very Large Telescope, are testing out multiplexing: stitching together small images with excellent resolution. The understanding of micro-phenomena is thus joining that of the infinitely large.

 

Pierre Slangen entered the field of visualization techniques during his Master’s in Optoelectronics. At the time he was keenly interested in the creation of holograms: three-dimensional images that do not require special glasses. After a State Doctorate thesis in 1995 at the University of Liège in Belgium, he is now a Research Professor at Mines Alès and has been accredited to lead research since 2013. He joined the Institut des Sciences des Risques team at Mines Alès in 2010. He is currently leading contractual work on the imaging-based analysis of fragmented liquid jets, loss of containment caused by very high-velocity impacts on reservoirs, and the study of atmospheric transfer as well as oil or liquefied natural gas (LNG) leaks at sea. He shares his knowledge through his courses on sensors, applied optics and holography, as well as by participating in television programs.

Editor: Umaps

Biomechanics serving healthcare

Stéphane Avril, a researcher at Mines Saint-Étienne, describes himself as a “biomechanician” but would like to become a “mechanobiologist”: a switch from studying the mechanical properties of the body to decoding its biological mechanisms using engineering tools. Focused in particular on the analysis of the behavior of normal and pathological vessels, his work should have significant consequences for the treatment of various vascular diseases, and has already led to several industrial partnerships.

 

“Research at Mines Saint-Étienne was originally organized around three main activities: mechanics and materials, manufacturing and process engineering, and company performance. After adjustments and reorientation within the manufacturing industry in France, the idea emerged of setting up an ‘Engineering and health’ research center, which would develop the School’s three main activities but orient them towards the fields of biology and medicine,” explained Stéphane Avril.

The researcher was hired in 2008 to develop biomechanics in health, a discipline that analyzes the mechanical behavior (movements, deformations) of tissues and organs (blood circulation, joints, etc.). He has been managing the ‘Engineering and health’ center at Mines Saint-Étienne, made up of sixty or so people, since 2010.

 

Two main themes: vessels and compression

The work of the biomechanics team led by Stéphane Avril, composed of twenty or so engineers and researchers, has a “double backbone”, he explains. “The first subject, more biologically oriented, aims at predicting the development of certain cardiovascular diseases, such as aneurysms (dilatations) of the aorta, through studies of the resistance of the vascular wall.” The second focuses on treatment using medical textiles in the general sense, principally compression stockings but also knee braces, lumbar belts, etc. “This second field was developed at the request of certain industrial players. The region is the largest in Europe for manufacturing these textiles,” the researcher added. “The approach we have adopted entails working on an application, such as a piece of software, in order to see if it can resolve a medical problem. This translational research, driven by its practical benefits, is one of the school’s specialties.”

 

Predicting the effects of elastic compression on a subject’s leg

A team recognized for its research in the treatment of aortic aneurysms

Since 2008 the biomechanics team has been working with vascular surgeons at Saint-Étienne who implant stent grafts (prostheses placed inside the diseased vessel) in patients with aortic aneurysms at risk of rupture, with the aim of protecting the aneurysm sac from the blood flow. This operation rebuilds a solid aortic ‘wall’.

Stéphane Avril has started an important program that has received grants from the French National Research Agency (ANR). The program’s objective is to better adapt stent grafts to the characteristics of the aneurysm through mechanical calculations and the use of industrial software. “These questions, which may seem like applied research, have raised fundamental issues that have been acknowledged by the international community,” the researcher indicated. In particular, the team is interested in enzymes (metalloproteases) and their participation in the weakening and then dilation of the artery wall.

In December 2014 Pierre Badel, a researcher in the laboratory, was awarded a Starting Grant by the prestigious European Research Council (ERC) for his work on the prevention of aneurysm rupture as part of the AArteMIS (Aneurysmal Arterial Mechanics: Into the Structure) project. At the beginning of 2015 Stéphane Avril was also honored by the ERC with a Consolidator Grant for the BIOLOCHANICS project. Through this project, the team of researchers aims to develop a new approach to the treatment of aortic aneurysms. Mines Saint-Étienne will thus receive 3.5 million euros over 5 years for research into aneurysm rupture. In addition, within 5 years Stéphane Avril’s team hopes to identify the signs of arterial instability with the support of companies specializing in magnetic resonance imaging (MRI). Over time the research should lead to work on medication and regenerative cell treatments, in connection with industry.

All this work takes us from classical biomechanics (analysis of movements and deformations) to mechanobiology, which aims to predict changes in the microstructure of an organ, in this case vessels, taking into account the mechanical stresses acting on it. We could even speak of cellular mechanobiology, inasmuch as the researchers at Mines Saint-Étienne attempt to better understand the influence of mechanical changes on cellular function itself (protein construction, chemical reactions, etc.).

 

Atheromatous plaques in the carotid artery, coronary dilation complications: a better understanding

After starting at Mines Saint-Étienne, Stéphane Avril received funding from the ANR (French National Research Agency) to use MRI to identify what may cause the atherosclerotic plaques formed in patients’ carotid arteries to rupture.

Recently, the team’s biomechanics specialists looked at complications of coronary angioplasty, a procedure which consists of dilating the coronary arteries of the heart, narrowed by atheroma, in order to improve blood flow. One of these complications, called a dissection, occurs when part of the dilated coronary artery wall tears lengthways. Using simulations, Stéphane Avril and his researchers have identified the pathological processes at work and have shown that some of these occurrences may be factors in secondary coronary obstruction.

A leader in medical compression

Within a few years Stéphane Avril’s team has become a European leader in soft tissue biomechanics, “one of the cutting-edge subjects at Mines Saint-Étienne”. The researchers are especially well known for their work in the field of medical mechanical compression. Some of this research aims to better understand venous compression devices associated with the wearing of compression stockings, and is carried out in collaboration with the companies Sigvaris and Thuasne, which specialize in the manufacture of these fabrics. A recent ultrasound study by a young PhD student, in partnership with Sigvaris, has shown that elastic compression exerts a type of pressure on soft tissue that tends to reduce stagnant blood in venous microcirculation. This would explain the positive effect of compression on superficial varicose veins.

Other work, using magnetic resonance images, suggests that this compression has an effect on deep veins in the leg more through contraction of the leg muscles than by the passive transmission of pressure to the vein wall itself.

 

A good example of the importance of biomedical engineering

It is clear that Stéphane Avril’s work is an excellent illustration of the multiple possibilities made available by the development of biomedical engineering techniques, including a better understanding of physiological and pathological processes and an improvement of treatments.
Biomechanics and mechanobiology are necessarily multidisciplinary fields because they are situated at the interface of engineering, the health profession and industrial development, and should offer even more contributions in the years to come.

 

Stéphane Avril, cutting-edge research dedicated to health

Aged 38, he joined Mines Saint-Étienne in 2008 as a professor and researcher. After a degree in mathematics and studies in engineering followed by a PhD at Mines Saint-Étienne, he chose a career as a researcher in engineering sciences, applied first to the field of materials and then to health. During his PhD thesis he applied new photomechanics technology to the analysis of material properties. Then, in 2003, he worked in Châlons-en-Champagne in the mechanics and manufacturing laboratory directed by Fabrice Pierron at Arts et Métiers ParisTech, where he developed new mathematical tools for exploiting photomechanical data. In 2006 a year spent in the laboratory directed by Jon Huntley at Loughborough University in Great Britain enabled him to better understand the advantages of MRI data for analyzing living tissue. Between May and August 2014 Stéphane Avril completed his training with a sabbatical in the laboratory directed by Jay Humphrey at Yale University in the USA, in order to start working on mechanobiology. In January 2015 he received the prestigious European research grant from the ERC.

Editor: Umaps, Corinne Tutin

Stéphan Clémençon, Automated Learning

Automated learning from data

Big Data is not only a technological issue, but also a societal one. Aware of the value of data, Institut Mines-Télécom has made it a major field of research, because a new approach is needed to store, share and exploit data optimally. Machine Learning is one such approach. It is being researched by Stéphan Clémençon, Professor at Télécom ParisTech.

 

Several zettabytes of data are generated each year, a zettabyte being a thousand billion billion bytes, and while this supply of data enables the creation of new services, it also considerably modifies our needs and requirements. Yesterday’s tools are outdated and new ways of putting this abundance of data to good use must be found. Machine Learning aims to do this. The discipline combines mathematics and computer science in order to create algorithms for processing big data, with a large number of industrial applications.

Télécom ParisTech recruited Stéphan Clémençon, a mathematician who specializes in modeling and statistics, to help develop Machine Learning. As he explains, “when you start to deal with very large quantities of data, you enter the realm of probability”. This field has long been neglected, especially by engineering schools where mathematics was taught in a very deterministic way, but students are becoming more and more interested in it, which is fortunate, for “questions of large scale raise difficult problems requiring lots of creativity!” New methods are required, and the main difference from the old ones is that the latter were based on traditional statistics and relied on predetermined models of the data. They were developed principally in the 1930s, when the challenges were different, computers had very limited calculating capacity and data was expensive to produce.

 

Finding the hidden meaning in big data

Nowadays there are sensors everywhere and data is collected automatically, with no pre-defined use but with the notion that it contains valuable information. The idea is to examine the data keenly and make the best use of it. Machine Learning’s objective is to design algorithms suited to dealing with big data. Enabling machines to learn automatically is an idea born from the fact that the data is too large for each stage of its processing to be realistically carried out by an expert, as well as from a desire to see the emergence of innovative services and of learning with no a priori assumptions.

The question is: how can a machine learn to decide by itself? How can we compress, represent and predict information from data selected to provide examples? This is the challenge of Machine Learning, which is fuelled by probabilistic modelling, based on optimization and on a theory of learning which guarantees sound results. The major problem is designing algorithms with good potential for generalization. A criterion that sticks too closely to the examples may lead to overfitting, meaning the production of models that suit the given examples perfectly but which cannot be generalized. On the other hand, a criterion that is not specific enough does not have sufficient predictive capacity. In Machine Learning, the right amount of complexity must therefore be deduced automatically from the data.
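As a minimal illustration of how this can be done (a sketch in Python with scikit-learn, not the chair’s actual code), polynomial models of increasing degree can be compared by cross-validation: the degree retained is the one that generalizes best to held-out examples, rather than the one that fits the training examples most closely.

```python
# Minimal sketch: choosing model complexity by cross-validation rather than training fit.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=200)   # noisy example data

scores = {}
for degree in range(1, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # Held-out performance estimates generalization: a very high degree fits the
    # training points almost perfectly but scores poorly here (overfitting).
    scores[degree] = cross_val_score(model, X, y, cv=5, scoring="r2").mean()

best = max(scores, key=scores.get)
print(f"Selected polynomial degree: {best} (mean CV R^2 = {scores[best]:.3f})")
```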

 

A chair for Machine Learning

The chair in “Machine Learning for Big Data” was created at the end of 2013 and has fifteen professors, all from Télécom ParisTech. Its aim is to inform people about Machine Learning, illustrate the ubiquity of math and carry out a research program with four private partners who are providing a financial contribution of two million euros over five years, as well as real and concrete challenges:

  • Criteo, a world leader in advertising targeting, aims to offer each internet user the link that they are most likely to click on according to their browsing history. How can the enormous space of the Web be explored?
  • The Safran group manufactures 70% of the world’s civil and military plane engines. How can anomalies be detected in real time and the replacement of a part suggested before failure occurs?
  • PSA Peugeot Citroën hopes to connect data with its uses. How can construction costs be reduced and the commercial offering optimized, with models that meet market expectations?
  • A French banking group is launching an all-digital bank. How can client accounts be monitored in real time? How can its use be made simpler and the right financial products be offered?

To find out more

 

Designing new services from data

“The multiple applications of Machine Learning are a driving force for research,” says Stéphan Clémençon, giving a number of examples showing the variety of fields in which big data is collected and used: “automated facial recognition in biometrics, risk management in finance, analysis of social networks in viral marketing, improving the relevance of results produced by search engines, security in intelligent buildings or in transport, surveillance of infrastructures and predictive maintenance through on-board systems, etc.”

In Machine Learning, potential applications are found first, and then the mathematics that allows them to be understood and clearly defined, significantly improving the process. A “professional” knowledge of these applications is therefore necessary. It was with this in mind that the chair in Machine Learning for Big Data at Télécom ParisTech (see insert) was created with Criteo, PSA Peugeot Citroën, the Safran group and a major French bank. The idea is for industry and academics to work together to produce effective products based on the partners’ projects, notably providing increased knowledge of the state of the art for some and a keener understanding of the challenges of application for others.

Big Data refers at the same time to a specific infrastructure and to a list of unresolved problems. Stéphan Clémençon regrets that in France, “we have missed out on the equipment stage” but fortunately, he adds, “we do have a large number of innovative SMEs headed by well-educated students, especially in applied mathematics.” Data engineering is multi-disciplinary by definition, and the trump card of a school like Télécom ParisTech, which teaches in a variety of fields, is being able to offer specialized programs, in particular since, as Stéphan Clémençon underlines, “Machine Learning relates to key business challenges and there are lots of potential applications in this field.”

 

Stimulating Machine Learning research and teaching

Stéphan Clémençon joined Télécom ParisTech in 2007 in order to develop teaching and research in Machine Learning, automated learning from data. He is in charge of the STA (Statistics and Applications) group and is head of the Advanced Master’s program entitled “Management and Analysis of Big Data”. He set up the “Data Scientist” Specialized Studies Certificate awarded by the school through continuing education, for engineers wanting to increase their skills in Machine Learning techniques. Stéphan Clémençon also teaches at ENS Cachan and at the University of Paris Diderot, and is an Associate Professor at the École des Ponts ParisTech and ENSAE ParisTech.

 

 

Christian Person, textiles, connected people

Textiles and connected people

The quantified self is one of the fields of research at Institut Mines-Télécom, requiring the miniaturization of sensors, the optimization of their energy consumption and sometimes their incorporation into the fabric of clothing. Christian Person, a researcher at Télécom Bretagne, has developed a research activity in this direction, focusing on harvesting ambient energy and placing antennas as close to the body as possible. He carries out his research as part of the Smart Sensing™ consortium, which develops cutting-edge technical fabrics designed for the demanding uses of communicating clothing; its first product is the d-shirt, a “digital t-shirt” for sports players.

 

Smart Sensing, a multidisciplinary consortium working on a simple idea

Multiple talents are required to develop a device that measures sportsmen’s physiological parameters, such as body temperature and heart rate, as well as their physical parameters such as speed, acceleration and geolocation. Yet this is what Jean-Luc Errant, founder of the company Cityzen Sciences, has done. He wanted a device that could be worn without being noticed and function in extreme situations such as in high mountains or at sea. During his investigation he met scientists, engineers and athletes, and their opinions gave birth to a seemingly simple idea: rather than a mobile phone, a garment can in fact provide the ideal way of monitoring physical condition throughout the day. All that remained was to incorporate sensors and energy sources. Cityzen Sciences was set up in 2008, and two years of academic research followed in order to create a state-of-the-art product. The project attracted interest and in 2010 received significant support from BPI France. A consortium was set up consisting of the Payen group, specialists in elastic threads and fabrics for sporting and technical purposes; Éolane, the French leader in industrial services for professional electronics; the Cyclelab group, bicycle specialists who were to act as distributors; and Télécom Bretagne.

“The multidisciplinary spirit behind the consortium is also present on the academic side,” points out Christian Person, who has adopted Isaac Newton’s maxim: “Men build too many walls and not enough bridges”. No fewer than six research departments at Télécom Bretagne are uniting their expertise for the first time for tomorrow’s communicating garment. Christian Person is working on techniques for integrating and reducing the size of antennas and related elements, on the evaluation of the interaction between waves and people, and on the design of intelligent sensors. Since “everything must be optimized”, his co-workers are researching algorithms for measuring electrocardiographic data, the detection of variations in the signals received, software interfaces (the data is collected on a dedicated platform) and IPv6 protocols for connected objects.

 

Harvesting ambient energy

By wearing a d-shirt a cyclist will be able to monitor his cardiac data via electrodes located close to his chest. The information will be transmitted through the threads of the fabric, “threads of approximately 25 microns, containing both insulation fibres and conductors”, to an electronic card situated at the top of his back that enables the transfer of data collected at the end of the ride to a related terminal.

Amongst the technical challenges of the d-shirt – miniaturization, integration of conductive threads into the fabric, connected textile sensors, energy management and data processing – Christian Person is researching antenna components and energy harvesting. For reasons of cost, size or weight, battery-powered sensors alone cannot provide the necessary operating life. Temperature gradients, mechanical vibrations, light and radio-frequency waves all constitute potential sources of energy in the surrounding environment. “At the moment,” the researcher continues, “radio waves are our source of energy, with the aim of using the omnipresent electromagnetic spectrum.” Other sources are possible, however, “such as harvesting energy through a piezoelectric micro-generator, using the expansion of the fabric when we inhale or the movements of the bicycle”. Since not all energy sources are equally reliable, piezoelectric components among them, “current research focuses on cumulatively harnessing multiple energy sources”. In this respect, thermal energy seems very promising given the temperature difference between the human body and its surroundings, especially in a sporting context.

 

Monitoring as close to the body as possible

Christian Person’s interest in sensors located on or in the body stems from his long-term research on the interactions between waves emitted by phones and the human body. Starting from probes originally developed for Antennessa (now Satimo), a spin-off company incubated at Télécom Bretagne, as part of the Comobio project that coordinated studies in this field, the researcher, now a Cofrac (French Accreditation Committee) expert for the certification of telephones, is currently looking into the advantages of worn sensors and the associated waves. “One idea is to place antennas on the body in order to spatially identify body parts precisely by the level of radio signals emitted and received.” This is the aim of the BoWI (Body World Interaction) project launched on 1 October 2012 for a duration of four years within the Breton Labex CominLabs. The researcher is also a senior member of the WHIST laboratory, a joint laboratory between Institut Mines-Télécom and Orange Labs created in 2009 and dedicated to communicating people and the interaction between waves and people. “Body networks remain very peripheral (sensors on the body), but we are starting to use methods that interact more and more with the body, with non-intrusive and non-invasive sensing systems, as well as targeting flagship applications in the field of real-time health monitoring,” he explains. Moreover, by analyzing the interaction between sensors and their relative locations, highly original fields of application are created, “like music or piloting drones, for example,” the researcher says enthusiastically.

For the moment, the d-shirt will be sold at an initial launch price targeting users accustomed to top-of-the-range products. Professional sports teams are also partners of the Smart Sensing consortium and are interested in the idea. “The trainer can see his players’ physical condition live, and make changes at the right moment”, Christian Person explains. In time, amateur sports players will also use this technology, in particular since the connected garment will upload information onto social networks, enabling interaction with others. Since each person will want their garment to be compatible with connected objects sold by different companies, the data must be processed independently of these objects and their origin. This is the purpose of Cityzen Data, a company incubated at Télécom Bretagne. Last but not least, a design school rounds off the list of partners looking to maximize visibility of the project, accelerate innovation transfer and enable large-scale use of the products developed. “Connected people are more and more instrumented” the researcher concludes, and there is no doubt that Smart Sensing’s multidisciplinary approach, dealing at the same time with production, distribution of instrumented objects and the processing of collected data, should guarantee the success of such technological innovation.

 

[box type=”shadow” align=”” class=”” width=””]

Professor and Deputy Scientific Director at Télécom Bretagne, Christian Person considers that he has “been very fortunate” in the research projects he has led. This research professor, “who initially intended to become a secondary-school teacher”, leads a team of 25 staff at the CNRS Lab-STICC laboratory and has supervised more than 30 theses, some of which have led to the creation of companies such as Elliptika, which specializes in the design of microwave products for space applications. He has also filed twenty or so patents, in particular in submarine communication: “given that sea water is very similar to the human body in terms of physical properties, it was used for risk-free studies on underwater communication”. A Senior Member of IEEE, he has co-authored several articles in international journals, and directs Commission K (Electromagnetics in Biology and Medicine) of the French section of URSI (International Union of Radio Science). He serves on organizing committees of international scientific conferences in the field of microwaves.[/box]

 

 

 

 

Towards a smart electricity network

For several years, Smart Grids have been the hot topic in the energy field. As current networks become less and less suited to new energy issues, energy management needs to be thoroughly rethought and smarter electricity networks introduced. These networks of the future should be able to adapt production to consumption as precisely as possible, taking into account new parameters and constraints such as renewable energy or electric vehicles. Institut Mines-Télécom, with expertise in energy through the Mines schools and in digital technology through the Télécom schools, is involved in various Smart Grid engineering projects.

 

If the amount of research on Smart Grids is increasing significantly, it is because there is a sense of urgency. “The driving force behind it all is energy transition. France must divide its CO2 emissions by four by 2050,” explains Marc Girod-Genet, a researcher at Télécom SudParis. The difficulty is that the solutions proposed for this transition are not always compatible with the state of current electricity grids. Most renewable energy sources are, for example, highly fluctuating and dependent on weather conditions. Moreover, some end users are becoming prosumers (i.e. both producers and consumers) by installing a miniature wind turbine or solar panels with associated small energy-storage infrastructures, a dual role that has not been sufficiently accounted for by the network.

Another problem is the future popularization of electric vehicles, with heavy consumption at any hour of the day. “A few years ago it was a lot simpler to manage electricity grids,” Marc Girod-Genet points out. “If we needed more energy, with standard power plants it was generally enough to increase the output. Today, production and consumption fluctuate much more.” The expected energy transition is therefore impossible without Smart Grids capable of matching these new demands.

 

Changes at all levels

Smart Grids can be seen as classical electricity grids to which three new elements are added: a telecommunications network to relay information about the state of the grid, a large-scale distributed information management system, and an energy services platform.

As the first links in the chain, consumers’ houses will very soon be fitted with smart meters. “The objective within Europe is to equip 80% of households with this type of meter by 2020,” says Marc Girod-Genet. These new tools will allow local management decisions to be made according, in particular, to consumer habits, available equipment and levels of consumption. “Télécom Bretagne research teams are currently working on all these mechanisms for measurement, communication and local preprocessing of information (Advanced Metering Infrastructure – AMI), in partnership with Itron and Texas Instruments in a joint research center for developing such smart meters,” explains Marc Girod-Genet. Combining these meters with a telecommunications network will then allow information to be received on a larger scale in the various control centers that will collect this sea of data. These control centers will link the information up with other types of data, for example wind speed measurements for forecasting wind turbine production. “This data is particularly large in quantity (Big Data) and diverse in nature and scale. One of the challenges is therefore to find ways of managing this heterogeneity in order to facilitate decision-making on actions aimed at improving energy efficiency. Big Data analysis is one of our specialties at Télécom SudParis.” Finally, this new network architecture, provided with an extended energy-related knowledge base, will also allow new services to be offered to each consumer, such as dynamic pricing, dynamic energy provider selection and smart monitoring of electricity consumption. Institut Mines-Télécom is involved in such societal changes in energy consumption.

 

Micro-grids and electric vehicles

Researchers at Institut Mines-Télécom have been working for several years on different aspects of these Smart Grids, and more generally on what is known as Smart Energy. For Marc Girod-Genet, this involvement is logical: “The Mines schools have always been strong in the field of pure energy, in particular production management, energy transport and storage, renewable energy management, fuel cells and energy transformation. Télécom research teams are contributing expert knowledge on communications networks and their management; information management systems and data modeling/processing issues (Big Data included); smart metering and the Advanced Metering Infrastructure (AMI); and service architectures. We are therefore perfectly qualified to work on Smart Grids.” Such expertise has led to the involvement of the Institut Mines-Télécom laboratories in various large-scale projects. One of them, Nice Grid (with the participation of Mines ParisTech), was launched in 2011 and allowed a Smart Grid, or “micro-grid”, to be tested out at local level within a district of Nice (France’s fifth-largest city). With smart meters, energy monitoring and consumer involvement, every aspect of smart grids was included in this project.

Institut Mines-Télécom is also looking into issues less directly linked to smart grids, such as electric vehicles. Completed in 2013, the VELCRI project[1] (with the participation of Télécom SudParis) had three main research priorities: optimizing the distribution of charge amongst vehicles, allowing secure communication between the charging station, cars and the electricity network, and finally using car batteries as a means of energy micro-storage. “Thanks to internal partnerships between Mines and Télécom researchers, we have lots of different studies on future energy networks,” concludes the researcher. The Internet of Things and machine-to-machine communication (M2M), cloud architectures, environmental impact, societal issues, infrastructure and data security, etc.: these are all lines of research within Institut Mines-Télécom aimed at answering the requirements of the energy transition objectives as well as possible.

[box type=”shadow” align=”” class=”” width=””]

SEAS, for a European smart grid

Studies on Smart Grids are not limited to the local or national scale. With globalization, Europe is becoming increasingly focused on energy management. At the end of 2013, a major European project named SEAS (Smart Energy Aware Systems) was launched, involving collaborators from Spain, Finland, Portugal, Turkey and France. The aim is as simple as it is ambitious: achieve interoperability of different energy systems, leading to overarching energy management. Since management systems vary greatly from one country to another, the challenge will be to standardize generic mechanisms, data formats, protocols and hardware for representing, processing and exchanging energy information between energy systems, automation systems, ICT-based digital services and all related stakeholders. In this respect, Armines, Télécom Bretagne, Télécom ParisTech and Télécom SudParis researchers are contributing significantly to this 3-year project, focusing particularly on the open semantic energy data models used, the energy information exchange platform and the management of electric vehicles. According to the partners involved, the concrete applications of this research should be seen at the end of the project, scheduled for 2017.[/box]

[author title=”” image=”https://imtech-test.imt.fr/wp-content/uploads/2014/06/Photo_Marc_Girod_Genet_recadré.jpg”]In 1994, Marc Girod-Genet graduated from EPITA. His engineering diploma in hand, he headed to the USA, where he obtained a Master of Science at the Stevens Institute of Technology. “Born in Zaire, I’ve always been attracted to going abroad,” says the researcher. Yet it was in France that he accepted his first job, at CNET (now Orange Labs), while also writing a thesis. In 2000 Marc Girod-Genet earned his PhD in Information Sciences at the University of Versailles Saint-Quentin-en-Yvelines before joining Télécom SudParis the same year as an EU research project manager. He concentrated in particular on mobile networks, already endeavoring to introduce more intelligence into them.

He became an associate professor in 2005, which allowed him to teach alongside his research. “I like this perspective of knowledge transfer,” he admits. “The field of new information and communication technology is evolving very quickly, and our teaching must therefore be constantly adapted.” With two decades already under his belt studying all types of networks, as well as multiple awards, Marc Girod-Genet has a considerable amount of knowledge to transfer…[/author]

[1] VELCRI: Electric Vehicle with Integrated Quick Charging

From expertise in telecommunications networks to the performance of electricity grids

From networks to everyday objects, the internet has radically changed our environment. From the main arteries to the smallest vessels, it is embedded in such a large number of the most banal objects that it puts a strain on the energy bill. Yet now communicating objects can exchange information to optimize their electricity consumption. After several years of research on the IPv6 protocol, Laurent Toutain and Alexander Pelov, researchers at Télécom Bretagne, are adapting this protocol to suit objects with a small energy supply and to the smart grids that are being built. Their work is part of a series of Institut Mines-Télécom projects on energy transition, focusing on the evolution, performance and compatibility of the energy networks of the future.

 

From the web to the Internet of things: 20 years of protocol development

Over the past few years, the advent of smart transport and the Internet of Things has exposed the limits of the classical model of the internet. Mobility, the creation of spontaneous networks, energy constraints and security must all be taken into account. The number of devices eligible for an Internet address has exceeded the capacity of IPv4, the network’s fundamental protocol. With IPv6, a version offering some 667 million billion possible IP addresses per mm² of the Earth’s surface, each component or sensor of an object can now have its own address and be queried. But IP was not designed for sensors located in the middle of nowhere, with finite resources (processor, battery and memory) and low-speed connections. For such “LoWPANs”, Low-Power Wireless Personal Area Networks, an adaptation of IPv6 has been created, 6LoWPAN, along with an associated application protocol, CoAP (Constrained Application Protocol), which enables seamless communication between the traditional internet and the Internet of Things.
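A quick sanity check of that “per mm²” figure (assuming an Earth surface area of roughly 510 million km²; simple arithmetic for illustration, not drawn from the article’s sources):

```python
# Back-of-envelope check of the IPv6 address-density figure quoted above.
total_addresses = 2 ** 128            # size of the IPv6 address space
earth_surface_mm2 = 510e6 * 1e12      # ~510 million km², converted to mm²

per_mm2 = total_addresses / earth_surface_mm2
print(f"{per_mm2:.3g} addresses per mm^2")   # ~6.67e17, i.e. ~667 million billion
```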

“CoAP is a new way of structuring networks,” explains Laurent Toutain. “The interaction between existing networks and communicating objects can be established in two ways: either by improved integration of IP protocols, making the network more uniform, or by marginalization of IP within the network and a diversification of the protocols used to access things.” Confidentiality and security will be fundamental to the success of either of these architectures. The researcher and his team also use mathematical models and game theory, applying them to the fields of smart transport and energy management.
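To make the first approach more concrete, here is a minimal client-side sketch of querying a connected object over CoAP, using the open-source aiocoap Python library; the IPv6 address (taken from the documentation range) and the resource path are hypothetical placeholders, not part of the researchers’ work.

```python
# Hypothetical example: reading a temperature resource from a 6LoWPAN device over CoAP.
import asyncio
from aiocoap import Context, Message, GET

async def main():
    # Create a CoAP client context (handles the UDP transport underneath).
    ctx = await Context.create_client_context()
    # The address and path below are placeholders for illustration only.
    request = Message(code=GET, uri="coap://[2001:db8::1]/sensors/temperature")
    response = await ctx.request(request).response
    print(response.code, response.payload.decode(errors="replace"))

asyncio.run(main())
```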

Transmitting data in local energy loops

Over the past few years, several regions of France that produce considerably less electricity than they consume have endeavored to mobilize their populations towards concerted, energy-efficient behavior. Alexander Pelov observes that “this is the case of the poorly supplied Provence-Alpes-Côte d’Azur region, which is becoming the leader in smart grids”, that is, in a vision of an electricity network whose links are optimized to improve its overall energy efficiency. Brittany and its partners have also been working for several years on controlling electricity demand, developing energy production from renewable sources and securing the electricity supply. In 2012 the region issued an initial call for projects on the “local energy loop”.

One of the objectives of electricity suppliers today is to be able to exchange data over the electricity network itself, “a network that was never designed to transport it”, emphasizes Laurent Toutain. They will use a low-speed, 250 kb/s configuration similar to a LoWPAN, with the same constraints as the Internet of Things. Laurent Toutain’s team has built a simulator to model the behavior of such networks precisely. This simulator allows routing algorithms to be redefined and new application behavior to be studied. “We try to adapt to the existing infrastructure: we must use it so we can adapt to all forms of traffic”, and also improve the network’s performance to broaden its uses. This is a major challenge, because the electricity network must, for example, communicate with vehicles and negotiate priority for an ambulance, as well as supplying energy and transferring it from one place to another. “Without prior knowledge of telecoms networks, none of that is possible”, explains the researcher.
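As a rough, back-of-the-envelope illustration of how constrained such a link is (the payload size below is a made-up figure, not one from the project), a single metering frame at 250 kb/s occupies the channel for only a few milliseconds, before any retransmissions or protocol overhead:

```python
# Illustrative arithmetic only: transmission time of one frame on a 250 kb/s power-line link.
link_rate_bps = 250_000      # 250 kb/s, as quoted above
payload_bytes = 128          # hypothetical metering frame size

tx_time_ms = payload_bytes * 8 / link_rate_bps * 1000
print(f"{tx_time_ms:.1f} ms per frame")   # ~4.1 ms, excluding protocol overhead and retries
```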

[box type=”shadow” align=”” class=”” width=””]

A smart grid skills center on the Rennes campus of Télécom Bretagne

The fruit of a partnership with the Itron group, specialists in the development of metering solutions, and with Texas Instruments, experts in the field of semiconductors, this research center for power-line communication technology, inaugurated in November 2013, creates innovative solutions for electricity suppliers (technical upgrading of networks, smart metering, etc.) and serves the French smart grids industry with the expertise of its researchers and engineers.[/box]

Giving consumers a more active role in consumption

While better energy management can be achieved by the supplier, consumers must also play their part. Rennes is a pioneering city in thinking on the Digital City, a smart, sustainable and creative city built on the openness of public data, and has in this context sent out a call for projects concerned with energy and transport policies. Currently developing the ÉcoCité ViaSilva district, Rennes is encouraging inhabitants to restrict their energy usage and has committed to an Open Energy Data program.

Based on the observation that “we cannot duplicate the existing infrastructure in order to transmit data”, the team of researchers based in Rennes is working on systems that give people a more active role in their consumption. It has been observed that simply showing users their consumption levels encourages them to adopt better habits, leading to savings of between 5 and 10%. “The idea is to make it fun, to imagine the ‘Foursquare’ of energy,” explain Laurent Toutain and Alexander Pelov, referring to the location-based mobile application whose most active users win badges. Another aspect is the visual representation of user behavior, which the team is working on with the École Européenne Supérieure d’Art de Bretagne in digital fabrication laboratories (FabLabs) in Brittany. “Ultimately,” the researchers continue with a smile, “it’s like doing quantified self at home.” This concept of “self-quantification” ties in with the notion of the “consum’actor” studied by sociologists, which is proving significant at this time of energy transition.

[box type=”shadow” align=”” class=”” width=””]

Research fostering spin-offs

“It’s extremely rewarding to work on a societal issue like energy,” enthuses Alexander Pelov. Numerous collaborations with start-ups like Cityzen Data, companies like Deltadore, Kerlink and Médria, and the FabLabs bear witness to this commitment. The start-up Homadeus, currently in the Télécom Bretagne incubator, offers both “open energy data” hardware and the interfaces (web and mobile) to drive it.[/box]

 

Laurent Toutain and Alexander Pelov are both researchers in the Networks, Security and Multimedia Services department of Télécom Bretagne. A reputed expert in IP networks, and in particular in service quality, metrology, routing protocols and IPv6, Laurent is currently looking at new architectures and services for domestic networks with a focus on industry and technology rather than research. After studies in Bulgaria and a thesis at the University of Strasbourg in 2009, Alexander joined Télécom Bretagne in 2010 to work on energy efficiency in wireless networks and the use of smart grids in the context of smart metering and electric vehicles.

Written by: Nereÿs


Social Media: understanding digital uses

The digital revolution, from which emerged social networks and big data, has led to far-reaching changes in a large number of economic sectors. Businesses need help to develop new strategies suited to these transformations, and it is with this in mind that Christine Balagué, researcher at Télécom École de Management and Vice-President of the National Digital Council, created the “Social media: economic and social value creation” chair within the Institut Mines-Télécom.

 

The digital explosion is characterized by mass uses, data and trends. We are seeing a transfer of audiences from the traditional media to the internet and, within the internet, from websites to social networks, which now attract more than 1.6 billion people in the world,[1] including at least 32 million in France.[2] New technological developments, tablets, smartphones and other touchscreen interfaces allow information to be shared in real time. The consequences for businesses are numerous. Communication, customer relations, innovation, market surveys and the marketing-related professions have been totally transformed. Social networks, which give everybody a chance to have their say, have become unavoidable, but how should they be used?

 

Using social media: yes! But how?

Christine Balagué’s work has been focusing on this matter for several years, and it was with the aim of thoroughly understanding the uses of digital technology in order to offer appropriate strategies to businesses that in 2011 she created the Chair called “Social media: economic and social value creation”. It is an ambitious project: modeling the behaviors of social media users, identifying emerging uses, studying the way social media have transformed customer relations management and designing tools adapted to big data.* There are 15 or so researchers in various disciplines working on these four lines of research in the four telecoms schools of the Institut Mines-Télécom.

Most of the research on the theme of “quantitative modeling” of behavior is carried out at Télécom Ecole de Management. The aim is to understand how information spreads through networks. The emphasis is placed on quantitative aspects in this approach. The target is to be able to explain and predict the number of comments, likes, shares, tweets, re-tweets, and posts: in a nutshell, all the activity generated on social networks. The key for this model is the choice of explanatory variables. Some of them are structure-related variables such as page rank, number of hits and degree centrality. However, graph theory alone does not explain the spread of information within the networks as there are other variables related to the character of the individual or the content of what is being shared. Communities also develop within social networks and need to be identified. One branch of research focuses on their detection and the analysis of what users talk about. For example, in collaboration with researchers at Télécom Bretagne and as part of the Open Food System[3] project which focuses on the influence of digital technology on food preparation, the study of conversations has identified communities not necessarily linked to a given platform.
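As a purely illustrative aside (this is not the Chair’s actual model), the kind of structural variables mentioned above, such as degree centrality and PageRank, can be computed in a few lines with the open-source networkx library on a toy social graph:

```python
# Illustrative sketch: structural variables on a small built-in social graph.
import networkx as nx

G = nx.karate_club_graph()                 # classic toy social network (34 nodes)
degree = nx.degree_centrality(G)           # one structure-related explanatory variable
pagerank = nx.pagerank(G, alpha=0.85)      # another structural variable

# Show the three most "central" users according to PageRank.
top = sorted(pagerank, key=pagerank.get, reverse=True)[:3]
for node in top:
    print(node, round(degree[node], 3), round(pagerank[node], 3))
```

In a real diffusion model, as the text above notes, these graph metrics would be combined with variables related to the individual and to the content being shared.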

The research requires the development of crawling* tools and text mining* software adapted to social networks. These crawling tools have been designed at Télécom École de Management, and a thesis at Télécom SudParis is being carried out on large-scale crawling. The number of fans or followers is not a very meaningful indicator. Metrics on audience, algorithms for the measurement of influencers and the right variables explaining internet users’ behavior are assets with genuine value-added compared to existing tools. In particular, they allow companies to be shown where to concentrate their efforts and what to pay attention to.

Three main lines of research are being carried out on the topic of “uses” by sociologists at Télécom ParisTech: geolocation, cultural diversity and multi-network. Displaying spatial location is one of the emerging behavioral phenomena that sociologists are interested in. Moreover, uses differ according to the area of the world and many companies are very interested in such cultural diversities.

 

The metamorphosis of customer relations

The issue of “managing customer relations through social media” is being explored at Télécom École de Management. The subject has been revolutionized by consumers’ ability to express themselves directly, in real time and at any moment. We are in the era of user empowerment, in which expressions of dissatisfaction have to be managed; hence the numerous Twitter feeds dedicated to real-time customer satisfaction management. It really works: research shows that a client who expressed dissatisfaction and whose problem was quickly resolved is then five times more loyal than the average client. New roles, such as that of community manager*, are emerging in companies to meet these needs, and a model for measuring this role and its impact on customer satisfaction has been developed.

Lastly, the research at Télécom École de Management on the subject of Big Data revolves around two projects in particular: Sodatech and Inovagora.[4] Sodatech is a monitoring and CRM 2.0* project based on the processing of both purchasing data and social data, leading to innovations in the field of recommendations. It is built around the idea that today, an individual’s behavior is influenced much more by what their peers say on a social network or community platform than by information published by a brand, company or institution. The objective of Sodatech is to qualify active social media users through innovative criteria and to detect the dynamics of influence, while Inovagora aims to create crowd-innovation* platforms.

 

The advantage of the multidisciplinary approach

All of this research meets businesses’ operational needs. The work of the Chair has led to the elaboration of new algorithms and key metrics for understanding and measuring internet users’ behavior on social networks. Today, the work is focused mainly on big data analysis of social networks, for which the scope of research is still very open. Against this backdrop, the Chair has a major asset: the ability to capitalize on the diversity of skills within the Institut Mines-Télécom graduate schools in order to broaden horizons and create interplay between complementary visions. The different aspects brought to the table by marketing, IT, sociology, math, physics and psychology allow varied advances to be used in connection with each other. Long live the multidisciplinary approach!

Renewing itself to keep moving forward

Launched in 2011 by Christine Balagué, the “Social media: economic and social value creation” Chair is structured around a three-year research program. The initial budget of one million euros, agreed with the first three partners, Danone, Les Pages Jaunes and La Poste, comes to an end in late 2014. A new phase of funding will extend the collaboration with these historical partners or bring new ones on board. It will focus on the Social Internet of Things (SIoT): social networks comprising both individuals and connected objects.

Christine Balagué, digital technologies in her genes

“How can digital technology create value for businesses?” This is the main thread of the work conducted by Christine Balagué, research professor at Télécom Ecole de Management, ESSEC graduate (1987), ENSAE graduate (1993), Dauphine University graduate (2000) and accredited to supervise research (2013). Her doctoral thesis in Management Sciences, which she defended in 2005 at HEC, is about modeling internet users’ behavior using Petri nets. Christine is Vice-President of the French National Digital Technology Council, a group of experts regularly called upon by the government to consider all matters relating to the impact of digital technology on society and the economy, and to issue recommendations accordingly.

 

* Glossary

  • Big data: extremely large datasets needing to be processed in real time.
  • Community manager: person who manages communities on social networks.
  • Crawling: technique for extracting data from websites.
  • Text mining: extraction of knowledge from texts with the help of algorithms.
  • Crowd innovation: innovation in products and services via digital platforms designed to collect internet users’ opinions and ideas.
  • CRM 2.0: customer relationship management via social media.

[1] Source Institut eMarketer, November 2013
[2] Source Institut Médiamétrie, June 2013
[3] Project selected within the framework of the call for “Structuring Projects for poles of competitiveness” in the French Future Investments Program
[4] Projects selected within the framework of the “Big data” call for projects in the French Future Investments Program.


Computational sciences: when math makes all the difference

What is the link between thought-controlled computers, being able to turn invisible, using virtual worlds to test hypotheses in archeology, and research to better understand and treat epilepsy? Each of these fields of research benefits from a new scientific approach, known as computational science, which is becoming a discipline in its own right.

Computational physics is an area of research at the Institut Mines-Télécom, led more specifically by Francesco Andriulli, an associate professor at Télécom Bretagne. He is part of the generation of scientists who enjoy the benefit of advanced technology and deal with masses of data, and who must master not only their particular field of research, but also math and engineering.

 

The use of computational science in physics

Prompted by Jim Gray, a database researcher and winner of the 1998 Turing Award, the scientific community has noted a major paradigm shift in recent years amid the sea of data surrounding us. Three main paradigms have driven science since its beginnings, complementing rather than replacing each other. For centuries, science was based on an empirical methodology: the observation and study of observable phenomena only. Later, scientific practice became more theoretical, using models, abstractions and generalizations. It was in this way that Kepler’s laws, Newton’s laws of motion, Maxwell’s equations of electromagnetism and the kinetic theory of gases emerged. Lastly, the invention of computers in the middle of the 20th century added programming to researchers’ toolkits, allowing them to model complex phenomena.

Now science has evolved once more. Scientists almost never look directly into their instruments (telescopes, microscopes, etc.) but instead simultaneously examine the data recorded by those instruments and the data created by simulations. These datasets are becoming ever larger. For Jim Gray, the exploration of data has become the fourth scientific paradigm.

Computational physics is part of this heritage. Francesco Andriulli explains that “it is a multidisciplinary field, at once theoretical and part of applied math and physics, advanced engineering and high-performance computing.” Computational electromagnetism comes down to solving Maxwell’s equations, a much more predictive theory than mechanics, in complex spaces and environments. Even though these equations are 150 years old, the field is thriving: “We find new effects and new elements, particularly in the quantum field and in the science of complex materials.”
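For reference, these are the equations in question, stated here in their standard macroscopic differential form; computational electromagnetism consists in solving them numerically in complex geometries and materials:

```latex
\begin{aligned}
\nabla \cdot \mathbf{D} &= \rho, &
\nabla \cdot \mathbf{B} &= 0, \\
\nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}, &
\nabla \times \mathbf{H} &= \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}.
\end{aligned}
```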

In addition to contributing to scientific advances, computational physics has also had a significant impact on cutting-edge engineering and its applications. Throughout his career, Francesco Andriulli has collaborated with many organizations, research centers and companies in Europe and the USA. The electromagnetic and numerical techniques he has developed have been applied to several industrial problems and techniques, such as the design and electromagnetic characterization of circuits, metamaterials, wideband antennas, the characterization of the electromagnetic physics of the brain, brain-machine interfaces and brain imaging.

 

Modern brain-machine interfaces

256 sensors precisely measure the points of origin of electrical activity in the brain

For anyone wishing to understand the electromagnetic interactions taking place within the brain, it is a highly complex environment. “There are several major sources of complexity,” explains the scientist, for whom this topic is one of the applications of his research field. “The physical properties are not the same throughout; the movement of electric charge follows preferential routes along the neurons, not in all directions; the forces involved are very weak and, above all, it is difficult to reproduce the results of an experiment because it is impossible to obtain the same brain state twice.” The brain can be studied with the aid of functional magnetic resonance imaging, a technique that allows the brain tissue to be examined but which is cumbersome, costly and difficult to implement. Another technique, electroencephalography, involves measuring the brain’s electrical activity with the aid of sensors placed on the scalp: “We have the most powerful machine available today, with 256 high-density sensors,” adds the researcher. But measuring isn’t enough: the location of each measurement must also be known, which is why the patient is surrounded by a dome that precisely locates each sensor and allows a deformable model of the patient’s head to be computed.

A mathematical and physical theoretical model, brain-machine sensors and interfaces that allow every kind of interpretation possible, and new algorithms for processing the data collected… computational electromagnetism proves to be useful at every step of the journey, because it resolves the complexity presented at each of these levels.

 

Coupled with the latest technologies, math plays a highly practical role

“We take the measurement on the surface of the brain volume, and math allows us to move from surface to volume,” summarizes Francesco Andriulli. Algorithms are then the key to accelerating the calculations. “Here, math is not just useful, it’s essential.” Thanks to these algorithms and to supercomputers, calculations have gone from taking four years to just two hours.
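A minimal sketch of what that surface-to-volume step can look like in its most generic, textbook form: given a lead-field matrix mapping candidate brain sources to the 256 scalp sensors, a regularized (minimum-norm) inverse recovers source amplitudes from the measurements. This is not the team’s actual algorithm, and the matrices below are random stand-ins:

```python
# Generic minimum-norm (Tikhonov-regularized) EEG source estimate, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 256, 5000
L = rng.standard_normal((n_sensors, n_sources))   # lead-field (forward) matrix: sources -> sensors
y = rng.standard_normal(n_sensors)                # scalp measurements (stand-in data)

lam = 1e-2                                        # regularization weight
# Minimum-norm estimate: x = L^T (L L^T + lam * I)^{-1} y
x = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
print(x.shape)                                    # (5000,) estimated source amplitudes
```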

Today, all this is possible for two main reasons. Firstly, there was a major advance in mathematics at the end of the 1990s: the Fast Multipole Method was developed, which helps solve the n-body problem, among other complex problems, within a reasonable timeframe. The second reason is that the technology available today allows such problems to be tackled. Applied to the brain, this technology increases the predictive power for electromagnetic behavior and allows very precise imaging.
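To give an order of magnitude (these are the generic complexity results, not project-specific benchmarks), the gain comes from replacing a quadratic-cost evaluation of all pairwise interactions with the near-linear cost of the Fast Multipole Method, which is what turns multi-year computations into a matter of hours:

```latex
\underbrace{\mathcal{O}(N^2)}_{\text{direct pairwise evaluation}}
\;\longrightarrow\;
\underbrace{\mathcal{O}(N\log N)\ \text{or}\ \mathcal{O}(N)}_{\text{Fast Multipole Method}}
```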

The algorithms’ success also depends on the equipment they are applied to. The two fields are not separate. “It’s definitely transdisciplinary, not multidisciplinary”, explains the researcher enthusiastically, “transdisciplinary work is intrinsically multidisciplinary: it’s like swimming in multiple swimming pools at the same time.” This transdisciplinary character is given concrete form in several fundamental and applied research projects initiated by Francesco Andriulli at regional, national and European levels. He was awarded a grant from the ANR JCJC program (French National Research Agency, Young Researchers Program) for advances in electroencephalography imaging techniques. Even though he has had access to the world’s seventh most powerful computer, he also works with more standard equipment in his laboratory TeleComPhysLab for Brain Research Applications at Télécom Bretagne, to see what is achievable on this scale. “Our expertise will make all the difference” he confidently concludes.

 

Now in his thirties, Francesco Andriulli is at a point of acceleration in his career, where he can dedicate himself to his research while also being actively involved in the scientific community and communicating his passion. A senior IEEE member, in July 2013 he was named associate editor of the prestigious scientific journal IEEE Antennas and Wireless Propagation Letters, and later of IEEE Transactions on Antennas and Propagation. “I have wanted to be a researcher since high school, when I discovered that knowledge gave you the skills to achieve things later on,” he recalls. Hence his keen interest in math, which is “essential, because even if we develop applications that seem nice and obvious later on, we must never forget that they originated from mathematics,” paired with an engineer’s mentality, in order to “be able to do things.”

A graduate of the Polytechnic of Turin and the University of Michigan, Ann Arbor, the young researcher arrived at Télécom Bretagne in 2010. An accredited research supervisor, he heads up a team of PhD students and post-docs. For the European “Marie Curie Ambassador” program he is currently leading a project to popularize science in several high schools in Brest.