From expertise in telecommunications networks to the performance of electricity grids

From networks to everyday objects, the internet has radically changed our environment. From the main arteries to the smallest vessels, it is embedded in so many of the most ordinary objects that it puts a strain on the energy bill. Yet communicating objects can now exchange information to optimize their electricity consumption. After several years of research on the IPv6 protocol, Laurent Toutain and Alexander Pelov, researchers at Télécom Bretagne, are adapting this protocol both to objects with limited energy supplies and to the smart grids now being built. Their work is part of a series of Institut Mines-Télécom projects on the energy transition, focusing on the evolution, performance and compatibility of the energy networks of the future.

 

From the web to the Internet of things: 20 years of protocol development

Over the past few years, the advent of smart transport and the Internet of Things has exposed the limits of the classical internet model. Mobility, the creation of spontaneous networks, energy constraints and security must all be taken into account. The number of devices eligible for an Internet address has exceeded the capacity of IPv4, the network’s fundamental protocol. With IPv6, a version offering 667 million billion possible IP addresses per mm² of the Earth’s surface, each component or sensor of an object can now have its own address and be queried. But IP was not designed for sensors located in the middle of nowhere, with finite resources (processor, battery, memory) and low-speed connections. For such low-power wireless personal area networks (LoWPANs), an adaptation of IPv6 has been created, 6LoWPAN, along with an associated query protocol, CoAP (Constrained Application Protocol), which enables seamless communication between the traditional internet and the Internet of Things.
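To give a sense of how lightweight CoAP is for constrained devices, here is a hypothetical sketch (not the researchers’ code) that builds the fixed 4-byte header and Uri-Path options of a confirmable CoAP GET request, following the RFC 7252 framing; the message ID and resource path are illustrative:

```python
import struct

def coap_get(message_id: int, path: str) -> bytes:
    """Build a minimal confirmable CoAP GET request (RFC 7252 framing)."""
    ver, mtype, tkl = 1, 0, 0            # version 1, CON message type, empty token
    code = 0x01                          # 0.01 = GET
    header = struct.pack("!BBH", (ver << 6) | (mtype << 4) | tkl, code, message_id)
    options, prev = b"", 0
    for segment in path.strip("/").split("/"):
        delta = 11 - prev                # Uri-Path is option number 11
        options += bytes([(delta << 4) | len(segment)]) + segment.encode()
        prev = 11
    return header + options

request = coap_get(0x1234, "sensors/temperature")  # 24 bytes in total
```

A real deployment would use a CoAP library (and the extended option encoding for path segments longer than 12 bytes); the point is that a complete request fits in a couple of dozen bytes, well suited to low-speed links.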

“CoAP is a new way of structuring networks,” explains Laurent Toutain, “the interaction between existing networks and communicating objects can be established in two ways: either by improved integration of IP protocols making the network more uniform, or by marginalization of IP within the network and a diversification of the protocols for access to things”. Confidentiality and security aspects will be fundamental to the success of either one of these architectures. The researcher and his team also use mathematical models and game theory, applying them to the fields of smart transport and energy management.

Transmitting data in local energy loops

Over the past few years, several regions in France that produce considerably less electricity than they consume have endeavored to mobilize their populations towards concerted, energy-efficient behavior. Alexander Pelov observes that “this is the case of the poorly supplied Provence-Alpes-Côte d’Azur region, which is becoming the leader in smart grids”, that is, in a vision of an electricity network with optimized links that improve its overall energy efficiency. Brittany and its partners have also been working for several years on controlling electricity demand, developing renewable energy production and securing the electricity supply. In 2012 the region sent out an initial call for projects on the “local energy loop”.

One of the objectives of electricity suppliers today is to be able to exchange data through the electricity network, “a network that was never designed to transport them”, emphasizes Laurent Toutain. It will use a low-speed (250 kb/s) configuration similar to 6LoWPAN, with the same constraints as the Internet of Things. Laurent Toutain’s team has built a simulator to precisely model the behavior of such networks. This simulator allows routing algorithms to be redefined and new application behavior to be studied. “We try to adapt to the existing infrastructure: we must use it so we can adapt to all forms of traffic”, and also improve the network’s performance to increase its uses. This is a major challenge because the electricity network must, for example, communicate with vehicles and negotiate priority if there is an ambulance, as well as supplying energy and transferring it from one place to another. “Without prior knowledge of telecoms networks, none of that is possible”, explains the researcher.

[box type=”shadow” align=”” class=”” width=””]

A smart grid skills center on the Rennes campus of Télécom Bretagne

The fruit of a partnership with the Itron group, specialists in the development of metering solutions, and with Texas Instruments, experts in the field of semiconductors, this research centre for power-line communication technology, inaugurated in November 2013, creates innovative solutions for electricity suppliers (technical upgrading of networks, smart metering, etc.) and serves the French smart grids industry with the expertise of its researchers and engineers. Find out more[/box]

Giving consumers a more active role in consumption

While better energy management can be achieved by the supplier, consumers must also play their part. Rennes is a pioneering city in thinking on the Digital City, a smart, sustainable and creative city built on the openness of public data, and has in this context sent out a call for projects concerned with energy and transport policies. Currently developing the ÉcoCité ViaSilva district, Rennes is encouraging inhabitants to restrict their energy usage and has committed to an Open Energy Data program.

Based on the observation that “we cannot double the existing infrastructure in order to transmit data”, the team of researchers based in Rennes is working on systems that give people a more active role in their consumption. It has been observed that simply showing users their consumption levels encourages them to adopt better habits and leads to savings of between 5 and 10%. “The idea is to make it fun, to imagine the ‘foursquare’ of energy”, explain Laurent Toutain and Alexander Pelov, referring to the localization mobile application whose most active users win badges. Another aspect is the visual representation of user behavior, which the team is working on with the École Européenne Supérieure d’Art de Bretagne, in digital laboratories (FabLabs) in Brittany. “Ultimately”, the researchers continue with a smile, “it’s like doing quantified-self at home”. This concept of “self-quantification” relates to the notion of the “consum’actor” studied by sociologists, which is proving significant at this time of energy transition.

[box type=”shadow” align=”” class=”” width=””]

Research fostering spin-offs

“It’s extremely rewarding to work on a societal issue like energy”, enthuses Alexander Pelov. Numerous collaborations with start-ups like Cityzen Data, companies like Deltadore, Kerlink and Médria, and the FabLabs bear witness to this passion. The start-up Homadeus, currently in the Télécom Bretagne incubator, offers both “open energy data” hardware and the (web and mobile) interfaces to drive it.[/box]

 

Laurent Toutain and Alexander Pelov are both researchers in the Networks, Security and Multimedia Services department of Télécom Bretagne. A reputed expert in IP networks, in particular in quality of service, metrology, routing protocols and IPv6, Laurent is currently looking at new architectures and services for domestic networks, with a focus on industry and technology rather than research. After studying in Bulgaria and completing a thesis at the University of Strasbourg in 2009, Alexander joined Télécom Bretagne in 2010 to work on energy efficiency in wireless networks and the use of smart grids in the context of smart metering and electric vehicles.

Written by: Nereÿs


Social Media: understanding digital uses

The digital revolution, from which emerged social networks and big data, has led to far-reaching changes in a large number of economic sectors. Businesses need help to develop new strategies suited to these transformations, and it is with this in mind that Christine Balagué, researcher at Télécom École de Management and Vice-President of the National Digital Council, created the “Social media: economic and social value creation” chair within the Institut Mines-Télécom.

 

The digital explosion is characterized by mass uses, data and trends. We are seeing a transfer of audiences from the traditional media to the internet and, within the internet, from websites to social networks, which now attract more than 1.6 billion people in the world,[1] including at least 32 million in France.[2] New technological developments, tablets, smartphones and other touchscreen interfaces allow information to be shared in real time. The consequences for businesses are numerous. Communication, customer relations, innovation, market surveys and the marketing-related professions have been totally transformed. Social networks, which give everybody a chance to have their say, have become unavoidable, but how should they be used?

 

Using social media: yes! But how?

Christine Balagué’s work has been focusing on this matter for several years, and it was with the aim of thoroughly understanding the uses of digital technology in order to offer appropriate strategies to businesses that in 2011 she created the Chair called “Social media: economic and social value creation”. It is an ambitious project: modeling the behaviors of social media users, identifying emerging uses, studying the way social media have transformed customer relations management and designing tools adapted to big data.* There are 15 or so researchers in various disciplines working on these four lines of research in the four telecoms schools of the Institut Mines-Télécom.

Most of the research on the theme of “quantitative modeling” of behavior is carried out at Télécom Ecole de Management. The aim is to understand how information spreads through networks. The emphasis is placed on quantitative aspects in this approach. The target is to be able to explain and predict the number of comments, likes, shares, tweets, re-tweets, and posts: in a nutshell, all the activity generated on social networks. The key for this model is the choice of explanatory variables. Some of them are structure-related variables such as page rank, number of hits and degree centrality. However, graph theory alone does not explain the spread of information within the networks as there are other variables related to the character of the individual or the content of what is being shared. Communities also develop within social networks and need to be identified. One branch of research focuses on their detection and the analysis of what users talk about. For example, in collaboration with researchers at Télécom Bretagne and as part of the Open Food System[3] project which focuses on the influence of digital technology on food preparation, the study of conversations has identified communities not necessarily linked to a given platform.
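The structural variables mentioned above come straight from the share graph itself. As a toy illustration (hypothetical data, not the Chair’s actual tools), degree centrality, one of the cited explanatory variables, can be computed from a list of reshare links:

```python
from collections import defaultdict

# Hypothetical reshare graph: an edge means one user reshared the other's post
edges = [("alice", "bob"), ("alice", "carol"),
         ("bob", "dave"), ("carol", "dave"), ("alice", "dave")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

n = len(degree)
# Degree centrality: fraction of the other users a node is directly linked to
centrality = {user: d / (n - 1) for user, d in degree.items()}
```

Here alice and dave score 1.0 (linked to everyone else), making them the first candidates when deciding where to concentrate influence efforts; real models then combine such structural variables with individual and content-related ones, as the paragraph above notes.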

The research requires the development of crawling* tools and text mining* software adapted to social networks. These crawling tools have been designed at Télécom École de Management, and a thesis at Télécom SudParis is being carried out on large-scale crawling. The number of fans or followers is not a very meaningful indicator. Metrics on audience, algorithms for the measurement of influencers and the right variables explaining internet users’ behavior are assets with genuine value-added compared to existing tools. In particular, they allow companies to be shown where to concentrate their efforts and what to pay attention to.

Three main lines of research are being carried out on the topic of “uses” by sociologists at Télécom ParisTech: geolocation, cultural diversity and multi-network. Displaying spatial location is one of the emerging behavioral phenomena that sociologists are interested in. Moreover, uses differ according to the area of the world and many companies are very interested in such cultural diversities.

 

The metamorphosis of customer relations

The issue of “managing customer relations through social media” is being explored at Télécom École de Management. The subject has been revolutionized by consumers’ ability to express themselves directly, in real time and at any moment. We are in the era of user empowerment, where the expression of dissatisfaction has to be managed. Hence the numerous Twitter feeds dedicated to real-time customer satisfaction management. It really works: research shows that a client who expressed dissatisfaction and whose problem was quickly resolved is then five times more loyal than the average client. New jobs, such as that of community manager*, meet these new needs for companies. A measurement model for this role and its impact on customer satisfaction has been developed.

Lastly, the research at Télécom École de Management on the subject of Big Data revolves around two projects in particular: Sodatech and Inovagora.[4] Sodatech is a monitoring and CRM 2.0* project based on the processing of both purchasing data and social data, leading to innovations in the field of recommendations. It is built around the idea that today, an individual’s behavior is influenced much more by what their peers say on a social network or community platform than by information published by a brand, company or institution. The objective of Sodatech is to qualify active social media users through innovative criteria and to detect the dynamics of influence, while Inovagora aims to create crowd-innovation* platforms.

 

The advantage of the multidisciplinary approach

All of this research meets businesses’ operational needs. The work of the Chair has led to the elaboration of new algorithms and key metrics for understanding and measuring internet users’ behavior on social networks. Today, the work is focused mainly on big data analysis of social networks, for which the scope of research is still very open. Against this backdrop, the Chair has a major asset: the ability to capitalize on the diversity of skills within the Institut Mines-Télécom graduate schools in order to broaden horizons and create interplay between complementary visions. The different aspects brought to the table by marketing, IT, sociology, math, physics and psychology allow varied advances to be used in connection with each other. Long live the multidisciplinary approach!

Renewal to carry on better
Launched in 2011 by Christine Balagué, the “Social networks: economic and social value creation” Chair is structured around a three-year research program. Its initial budget of one million euros, agreed with the first three partners, Danone, Les Pages Jaunes and La Poste, comes to an end in late 2014. A new funding phase will extend the collaboration with the historic partners or bring new ones on board. It will focus on the Social Internet of Things (SIoT): social networks comprising both individuals and connected objects. Learn more

Christine Balagué, digital technologies in her genes

“How can digital technology create value for businesses?” This is the main thread of the work conducted by Christine Balagué, research professor at Télécom Ecole de Management, ESSEC graduate (1987), ENSAE graduate (1993), Dauphine University graduate (2000) and accredited to supervise research (2013). Her doctoral thesis in Management Sciences, which she defended in 2005 at HEC, is about modeling internet users’ behavior using Petri nets. Christine is Vice-President of the French National Digital Technology Council, a group of experts regularly called upon by the government to consider all matters relating to the impact of digital technology on society and the economy, and to issue recommendations accordingly.

 

* Glossary

  • Big data: extremely large datasets that need to be processed in real time.
  • Community manager: person who manages communities on social networks.
  • Crawling: technique for extracting data from websites.
  • Text mining: extraction of knowledge from texts with the help of algorithms.
  • Crowd innovation: innovation of products and services via digital platforms designed to collect internet users’ opinions and ideas.
  • CRM 2.0: customer relationship management via social media.

[1] Source Institut eMarketer, November 2013
[2] Source Institut Médiamétrie, June 2013
[3] Project selected within the framework of the call for “Structuring Projects for poles of competitiveness” in the French Future Investments Program
[4] Projects selected within the framework of the “Big data” call for projects in the French Future Investments Program.


Computational sciences: when math makes all the difference

What is the link between thought-controlled computers, being able to turn invisible, using virtual worlds to test hypotheses in archeology, and research to better understand and treat epilepsy? Each of these fields of research benefits from a new scientific approach, known as computational science, which is becoming a new discipline in its own right.

Computational physics is an area of research at the Institut Mines-Télécom, led more specifically by Francesco Andriulli, an associate professor at Télécom Bretagne. He is part of the generation of scientists who enjoy the benefit of advanced technology and deal with masses of data, and who must master not only their particular field of research, but also math and engineering.

 

The use of computational science in physics

Prompted by Jim Gray, a database researcher and winner of the 1998 Turing Award, the scientific community has noted a major paradigm shift in recent years amid the sea of data surrounding us. Three main paradigms have driven science since its beginnings, complementing rather than replacing one another. For centuries, science was based on an empirical methodology of observation and the study of directly observable phenomena. Later, scientific practice became more theoretical, using models, abstractions and generalizations. It was in this way that Kepler’s laws, Newton’s laws of motion, Maxwell’s equations in electromagnetism and the kinetic theory of gases emerged. Lastly, the invention of computers in the middle of the 20th century added programming to researchers’ list of tools, allowing them to model complex phenomena.

Now science has evolved once more. Scientists almost never look directly into their instruments (telescopes, microscopes, etc.) but instead simultaneously examine data recorded by such instruments and data created by simulations. These data are growing ever larger. For Jim Gray, the exploration of data has become the fourth scientific paradigm.

Computational physics is part of this heritage. Francesco Andriulli explains that “it is a multidisciplinary field, at once theoretical and part of applied math and physics, advanced engineering and high-performance computing.” Computational electromagnetism comes down to solving Maxwell’s equations, a much more predictive theory than mechanics, in complex spaces and environments. Even though these equations are 150 years old, the field is thriving: “We find new effects and new elements, particularly in the quantum field and in the science of complex materials.”

In addition to contributing to scientific advances, computational physics has also had a significant impact on cutting-edge engineering and its applications. Throughout his career, Francesco Andriulli has collaborated with many organisations, research centers and companies in Europe and the USA. The electromagnetic and digital techniques that he has developed have been applied to several problems and techniques in industry such as the design and electromagnetic characterization of circuits, metamaterials, wideband antennas, the characterization of electromagnetic physics of the brain, brain-machine interfaces and brain imaging.

 

Modern brain-machine interfaces

256 sensors precisely measure the points of origin of electrical activity in the brain

For anyone wishing to understand the electromagnetic interactions taking place within the brain, it is a highly complex environment. “There are several major areas of complexity,” explains the scientist, for whom this topic is one of the applications of his research field. “The physical properties are not the same throughout; the movement of electric charge follows a preferential route between the neuron cells, not in all directions; the forces involved are very weak and, above all, it is difficult to reproduce the results of an experiment because it is impossible to obtain the same brain state twice.” The brain can be studied with the aid of functional magnetic resonance imaging, a technique that allows brain tissue to be examined but which is cumbersome, costly and difficult to implement. Another technique, electroencephalography, involves measuring the brain’s electrical activity with the aid of sensors placed on the scalp: “We have the most powerful machine available today, with 256 high-density sensors”, adds the researcher. But measuring isn’t enough: the location of each measurement must be known, which is why the patient is surrounded by a dome that precisely locates each sensor and allows a deformable model of the patient’s head to be calculated.

A mathematical and physical theoretical model, brain-machine sensors and interfaces that allow every kind of interpretation possible, and new algorithms for processing the data collected… computational electromagnetism proves to be useful at every step of the journey, because it resolves the complexity presented at each of these levels.

 

Coupled with the latest technologies, math plays a highly practical role

“We take the measurement on the surface of the brain volume, and math allows us to move from surface to volume”, summarizes Francesco Andriulli. Algorithms are then the key to accelerating the calculations. “Here, math is not just useful, it’s essential.” Thanks to these algorithms and supercomputers, calculations have gone from taking four years to just two hours.

Today, all this is possible for two main reasons. Firstly, there was a major advance in mathematics at the end of the 1990s: the Fast Multipole Method was developed, among other things, to help resolve the n-body problem within a reasonable timeframe, as well as other complex problems. The second reason is that the technology available today allows such problems to be tackled. Concerning the brain, such technology increases predictive power for electromagnetic behavior and allows the elaboration of very precise imaging.
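To see what such algorithms buy, consider the baseline they replace: evaluating all pairwise interactions directly costs O(n²) operations, which the Fast Multipole Method reduces to roughly O(n) by approximating the effect of distant groups of sources. A minimal sketch of the direct sum (illustrative only, not the team’s code):

```python
import math

def direct_potentials(charges, positions):
    """Naive O(n^2) evaluation of the 1/r potential each source feels
    from all the others; the FMM approximates these same sums in ~O(n)."""
    n = len(charges)
    phi = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i != j:
                r = math.dist(positions[i], positions[j])
                phi[i] += charges[j] / r
    return phi

# Two unit charges 2 units apart: each feels a potential of 1/2
phi = direct_potentials([1.0, 1.0], [(0.0, 0.0), (2.0, 0.0)])
```

With millions of sources, the quadratic loop above is exactly what becomes intractable, which is why the fast-algorithm advance of the late 1990s mattered so much.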

The algorithms’ success also depends on the equipment they are applied to. The two fields are not separate. “It’s definitely transdisciplinary, not multidisciplinary”, explains the researcher enthusiastically, “transdisciplinary work is intrinsically multidisciplinary: it’s like swimming in multiple swimming pools at the same time.” This transdisciplinary character is given concrete form in several fundamental and applied research projects initiated by Francesco Andriulli at regional, national and European levels. He was awarded a grant from the ANR JCJC program (French National Research Agency, Young Researchers Program) for advances in electroencephalography imaging techniques. Even though he has had access to the world’s seventh most powerful computer, he also works with more standard equipment in his laboratory TeleComPhysLab for Brain Research Applications at Télécom Bretagne, to see what is achievable on this scale. “Our expertise will make all the difference” he confidently concludes.

 

Now in his thirties, Francesco Andriulli is at a point in his career where he can dedicate himself to his research whilst at the same time being actively involved in the scientific community and communicating his passion. A senior IEEE member, in July 2013 he was named associate editor of the prestigious scientific journal IEEE Antennas and Wireless Propagation Letters, and later of IEEE Transactions on Antennas and Propagation. “I have wanted to be a researcher since high school, when I discovered that knowledge gave you skills for achievements later on”, he recalls. Hence his keen interest in math, which is “essential, because even if we develop applications that seem nice and obvious later, we must never forget that they originated from mathematics”, paired with an engineer’s mentality, in order to “be able to do things”.

A graduate of the Polytechnic of Turin and the University of Michigan, Ann Arbor, the young researcher arrived at Télécom Bretagne in 2010. An accredited research supervisor, he heads up a team of PhD students and post-docs. For the European “Marie Curie Ambassador” program he is currently leading a project to popularize science in several high schools in Brest.

 

 


Confidential Communications and Quantum Physics

Quantum physics opens up a variety of possibilities for radical new applications such as quantum cryptography, a discipline to which Romain Alléaume, researcher at the Institut Mines-Télécom, devotes his energy. Member of the Quantum Information team at Télécom ParisTech, the researcher uses specifically quantum properties of light to design and create systems allowing confidential data to be remotely transmitted. His most recent work addresses the security of these quantum devices as well as their compatibility with ‘traditional’ telecommunications networks.

 

Curiosity and amazement are what attract many people to the world of quantum physics. Romain Alléaume was likewise intrigued by these counterintuitive laws of nature, among them the principle of quantum superposition: “a quantum system may be in several different states at the same time”; the observer effect: “what is equally surprising is that the act of measuring a quantum object to find out what state it is in changes that state”; as well as quantum entanglement, another special quantum feature: “it refers to correlations between quantum particles that prevent them from being described separately”. Researchers have taken advantage of these confusing properties and in the 1980s started to come up with a new way of processing data. But it has not been plain sailing, as the researcher explains: “one fundamental difficulty of designing a quantum computer stems from the fact that quantum data units, called ‘qubits’, must be able to interact with each other and be modified very quickly while remaining incredibly well protected from the outside environment.” But as paradoxical as it may seem, the quantum computer is not just a fanciful idea, since it actually exists experimentally, the largest to date having 14 qubits.

 

The benefits of the multidisciplinary environment at the Institut Mines-Télécom

Quantum information involves a number of disciplines including physics, computer science and information theory. It requires the most advanced physics technology, such as that of cold atoms, superconductors and optics, and has mobilized numerous cutting-edge research teams. It was at Télécom ParisTech that Romain Alléaume began his research career, setting up the “Quantum Information” team that has now grown in size, with four permanent researchers and around 15 members. “The Institut Mines-Télécom has played a critical role in establishing the team, notably through financial support and the allocation of lab space.” He recognizes that there are significant benefits to working at the Institut Mines-Télécom: “there’s the multidisciplinary aspect (coding, discrete mathematics, physics, optics), a rich environment and an opening onto the market.” Such an environment has been made even more fertile since Romain Alléaume decided to set up his company, encouraged and advised by the school.

In 2008 he launched his industrial venture and co-founded SeQureNet, a startup company designing quantum cryptography systems. Better known by its acronym QKD (Quantum Key Distribution), quantum cryptography, usually using a fiber optic link, allows confidential information to be transmitted with significantly improved security compared to that of standard cryptography. The data is encoded into light, in the polarization of the photons for example. SeQureNet uses a technology called ‘continuous variable quantum cryptography’, the fruit of work at the Institut d’optique Graduate School and at Thales, and which is today developed at the Institut Mines-Télécom. This technology offers several important advantages: a detector system that provides efficient protection from interference and good compatibility with current infrastructures. Based simply on standard telecoms components, it does not require the use of specialised fibers.

[box type=”shadow” align=”” class=”” width=””]

SeQureNet, a company born in the laboratory

Founded in 2008, SeQureNet is an offshoot of the ‘quantum data’ team at the Institut Mines-Télécom which develops and markets innovative technology offering enhanced security for communication networks. Since 2012 SeQureNet has been marketing Cygnus, a QKD system that only uses standard components; with this system the company took part in the experimental demonstration that broke the world distance record for this technology: 80 km compared to 25 km previously. On 14th April 2013 this work was published online by the research journal Nature Photonics in an article entitled ‘Experimental demonstration of long-distance continuous-variable quantum key distribution’ by P. Jouguet, S. Kunz-Jacques, A. Leverrier, E. Diamanti and P. Grangier. Cygnus has already attracted the interest of the NICT (Japanese National Institute of Information and Communications Technology) in Japan, which has bought it. Two types of application are envisioned: academic and industrial R&D (deployment, networks and security) and the protection of infrastructures (defense and telecom operators), with the main advantage being QKD’s capacity to assure secure data exchanges over the long term. Learn more[/box]

 

Long-term security

The idea underpinning QKD is to use the observer effect to turn an apparent weakness into an asset. Anyone trying to ‘read’ the photon messenger would disturb its quantum state and introduce errors, making it impossible for a spy to let correspondents share and use a key that they believe to be secure but which has in fact been intercepted. Another fundamental competitive edge is QKD’s long-term security, which standard cryptography cannot guarantee. For example, the security of RSA, the groundbreaking public-key cryptography algorithm invented by Rivest, Shamir and Adleman[1], relies upon a conjecture: the difficulty of factoring large numbers. But as Romain Alléaume suggests, “Nothing stops you recording everything today in the hope of cracking the code tomorrow!” With QKD, on the other hand, a key that is secure today will still be secure in 20 years’ time. QKD’s intrinsic security has prompted the researcher to try to integrate it into networks, a challenge that he has been tackling since 2004 with the European project SECOQC, and which in 2008 led to the opening of the first European QKD network.
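The factoring conjecture is easy to illustrate at toy scale. In the sketch below (illustrative numbers; real RSA moduli use primes hundreds of digits long, far beyond trial division), an eavesdropper who manages to factor the public modulus recovers the private key and decrypts the intercepted ciphertext:

```python
# Toy RSA: the whole scheme stands on n being too large to factor
p, q = 61, 53
n, e = p * q, 17                      # public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent (Python 3.8+)

m = 42                                # the secret message
c = pow(m, e, n)                      # intercepted ciphertext

# An attacker factors n by trial division (instant at this scale)...
f = next(k for k in range(2, n) if n % k == 0)
p2, q2 = f, n // f
# ...rebuilds the private exponent, and reads the message
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
recovered = pow(c, d2, n)
```

QKD sidesteps this threat entirely: its security rests on physics rather than on the assumed hardness of a computation.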

Today the team is working on improving QKD’s performance as well as on its integration into optical network infrastructures, through collaboration between Télécom ParisTech and SeQureNet. But these are not the only objectives: the systems’ practical security is also an important issue, particularly for industrialization. “In the cryptography field you expect to be challenged, especially when you have to be compared to already existing systems which are subjected to very extensive certification tests.” It has been found that the physical systems of QKD may also contain flaws, and that they should therefore undergo preventative testing. Such a reversal has not escaped Romain Alléaume: “this calls into question exactly what QKD was so proud of: the universality and infallibility of its security. Part of my current work focuses on this point: looking at potential attacks and developing counter-measures in order to guarantee QKD’s security.” For this young discipline it is perhaps the price to be paid in the shift to maturity. There is, however, a new approach based upon the notion of entanglement that guarantees quantum systems’ security in spite of the imperfections of the hardware used. Entangled photon sources open up considerable research prospects not only in cryptography, but also for manufacturing ‘quantum repeaters’: an opportunity for the researcher and his team to conduct exciting new research!

 

Combining research, innovation and industry

Romain Alléaume entered the École normale supérieure in Paris in 1998. In his final year, already interested in quantum cryptography, he began a thesis at ENS Cachan on single photon sources. Attracted by a career combining research and industry, he joined the Corps des Télécom in 2001 and took classes at Télécom ParisTech. In 2004 he defended his thesis, and since then has been a professor and researcher at Télécom ParisTech. His current research within the Institut Mines-Télécom addresses quantum cryptography, quantum networks and information theory. Additionally, he is the scientific advisor of SeQureNet, a start-up business of which he is one of the co-founders.

 

[1] RSA encryption (named after the initials of its three inventors Rivest, Shamir and Adleman) is an asymmetric cryptography algorithm first described in 1977, frequently used in electronic commerce and, more generally, to exchange confidential data over the internet.

 

Augmented Reality and Surgery: helping surgeons achieve ever-greater accuracy

Questions of health, well-being and personal independence are central preoccupations of all modern societies. Age-related illnesses, and our ageing populations, pose a number of social and economic challenges. Over the last two decades, information sciences and technologies have helped us rise to these challenges. Concepts such as distance health care and online treatment are now accepted as a part of everyday life. Among these new technologies, successive innovations in the field of medical imaging have been of particular significance. The Institut Mines-Télécom is a leading centre of research in this sector. Based at Télécom Bretagne, where she works with a multi-disciplinary team bringing together clinical scientists and researchers specialising in information processing, Chafiaâ Hamitouche has had a hand in numerous discoveries which have helped pave the way for the use of augmented reality technology in surgery.

 

Chafiaâ Hamitouche, a researcher at Télécom Bretagne, has long been interested in the potential of modelling technologies, starting from morphology modelling. “Form and function are interdependent. It’s thanks to the movement of the foetus inside the mother’s womb that joints are formed,” Dr. Hamitouche explains. Her work seeks to find a new representation space for both the morphology (form) and movement (function) of joints.

This research gives us a clearer understanding of certain joint and bone disorders, such as sprains and arthritis, the treatment of which generally requires physiotherapy or surgery.

The rise of Navigated Surgery

In 2001, Chafiaâ Hamitouche first began to take a particular interest in Computer Assisted Orthopaedic Surgery. As she points out, “this is a sector where the stakes are high: every year over 150,000 prostheses are implanted in France alone.” But decisions regarding the implant operation are still generally made on the basis of 2D X-rays, with no quantifiable control method. There is therefore always a risk of errors that could cause the operation to fail.

The aim of assisted surgery is to optimise the surgeon’s movements, ensuring their accuracy. This requires a change in the layout of the operating room – introducing a 3D localizer and specially-adapted surgical instruments – and the use of guiding tools, some of which were developed at LaTIM[1]. These techniques allow surgeons to make their interventions less invasive, and sometimes shorter. They can also help avoid or limit X-ray exposure for patients and medical staff.

And, as Dr. Hamitouche explains, “with 3D modelling and advanced imaging technology the surgeon can get a clear look at the bone structure of the patient before the operation, select the surgical instruments and plan the surgery accordingly.” During the operation, real-time fusion of multimodal information (data from various sensors) is performed. This fusion of pre-operative and intra-operative data allows the surgeon to see things which would otherwise be invisible or inaccessible due to the limited size of the incision or the presence of other tissues. Data fusion can also help predict the consequences of a gesture (intervention) before it is made.

Mathematical modelling of biological forms, based on the morphogenesis process

Personalised treatment and surgery

Dr. Hamitouche continues to explore and expand the possibilities of guided surgery, and since 2010 has been working on radiofrequency ablation for bone tumours. “Augmented reality technology is the future of surgery,” she declares. Dr. Hamitouche and her team are currently working on an ambitious EU-backed project (ITEA2 programme), developing the next generation of interventional environments. The project is led by one of the world’s leading medical imaging specialists, Philips Healthcare, and brings together 27 partners from 4 different countries, including 6 French SMEs. Dr. Hamitouche coordinates the French arm of the consortium, which focuses on boosting the productivity and efficiency of surgical treatments and reducing patient risk and discomfort by helping doctors make the transition from invasive surgery towards minimally-invasive, image-guided techniques.

The operating room of tomorrow will be interactive and highly flexible, allowing professionals to tailor their procedures to the specific requirements of the patient. The techniques pioneered by Dr. Hamitouche and her team allow surgeons to plan their operations in great detail before they begin, and also to be more accurate and efficient in their use of radiation therapy to treat tumours.

[box type=”shadow” align=”” class=”” width=””]

OSTESYS and IMASCAP, two companies born in our laboratories

High tibial osteotomy is a surgical technique for treating arthritis in the knees. It is a cheap, non-invasive and relatively quick surgical procedure performed on around 7,000 patients every year. But to achieve satisfactory long term results, the operation needs to be extremely accurate. OSTESYS was founded in March 2009, a spin-off from a project developed by Chafiaâ Hamitouche’s research team and focusing on osteotomy of the knee. The company produces intelligent medical equipment integrating augmented reality technology, with innovative surgical instruments and implants which allow surgeons to achieve maximum accuracy using minimally-invasive techniques.

IMASCAP is an innovative start-up launched in 2009 as part of an ANR project conducted in partnership with LaTIM and Nice University Hospital, specialising in Image Guided Shoulder Surgery. Total Shoulder Arthroplasty poses a number of problems: the soft-tissue environment, in which the muscles are of the utmost importance, and the difficulty of visualising the structure of the shoulder. The new technology provides a 3D reconstruction showing the different structures in the shoulder, which can be reliably connected to key points identified by palpation of the target area. This process makes the operation possible even in highly complex cases. The procedure is performed on around 25,000 patients every year in France alone. Find out more [/box]

From fundamental research to clinical application

Augmented reality technologies of this nature are now widely used in clinical contexts. These new solutions have been developed with substantial input from surgeons, helping to make their interventions more precise, cut costs and above all to protect patients’ well-being. This is certainly true of the work undertaken by the team since 2006 on the development of a new generation of predictive autonomous implants. These prostheses incorporate electronic components and adjustable mechanical elements allowing for post-operative functional modifications.

The team’s researchers are currently working on four key technological challenges: achieving autonomous power by generating energy within the prosthesis, analysing ligament imbalance, installing a telemetry system, and actuation. Further opportunities to launch new start-ups and create new jobs? Watch this space…

[author title=”” image=” https://imtech-test.imt.fr/wp-content/uploads/2017/10/Portrait_Chafiaa_réduit.jpg”]Chafiaâ Hamitouche could have studied medicine, but ultimately decided that her real passion was mathematics. She obtained her engineering degree in Algeria in 1987, winning an international doctoral scholarship in the process. More interested in signal processing than electronics, Dr. Hamitouche embarked upon a research career and was soon drawn to the medical sector. In 1988 she developed an interest in three-dimensional medical imaging, and in 1991 submitted her doctoral thesis, prepared at the Image and Signal Processing Laboratory of the University of Rennes 1 and focusing on a very promising subject: the analysis of three-dimensional medical images, applied to the extraction of anatomical structures.

Having always lived close to the sea, she chose to continue her work in Brest, attracted by both the quality of research and the quality of life in Brittany. In 2005 she received her accreditation to supervise research in the field of geometric and kinematic modelling in 3D medical imaging. She is now Professor of Image and Signal Processing at Télécom Bretagne, and deputy Director of LaTIM.

Dr. Hamitouche is involved with a number of projects in the field of augmented reality and its surgical applications, and has coordinated various national and European projects as well as supervising more than 12 doctoral theses. She is listed as co-inventor of 5 national and international patents and software innovations.

Nonetheless, Dr. Hamitouche is always keen to underline the importance of teamwork. When she first arrived at Télécom Bretagne, Christian Roux, now the school’s Scientific Director, was already working on bone and joint structures. Éric Stindel, an orthopaedic surgeon and the current Director of LaTIM, played a crucial role in the development of numerous innovations in orthopaedic surgery, innovations to which Dr. Hamitouche contributed. “I’ve also been lucky enough to work with some great doctoral researchers,” she hastens to add, describing their passion for Computer Assisted surgery which, in some cases, has resulted in business start-ups.[/author]

[1] LaTIM is the Laboratoire de Traitement de l’Information Médicale (Laboratory of Medical Information Processing), a combined research unit affiliated to INSERM (UMR1101) and including researchers from Brest University Hospital, the Université de Bretagne Occidentale and Télécom Bretagne.


The security challenges of open IT systems

In today’s society, IT security systems are seriously put to the test: cloud data storage, the omnipresence of microchips, social networking… Not only does constant communication via networks expose cryptographic systems to multitudes of threatening connections, but security mechanisms must also now function on devices with weak computing capacity, such as mobile phones. The older algorithms are under strain, and increasingly vulnerable to identity fraud and other violations of privacy. The Institut Mines-Télécom, and specifically the Télécom SudParis R3S (Réseaux, systèmes, services et sécurité – Networks, systems, services and security) team led by Maryline Laurent, is facing up to the challenge of making these systems secure. The work of this team, conducted in collaboration with other research laboratories and industrial partners, focuses on developing new architectures and advanced cryptographic technology.

 

One of the most significant contemporary security problems is data storage in clouds: “Nowadays, a system fitted with a single entry point is relatively easy to secure,” explains Maryline Laurent, “but when a cloud is distributed across any hardware featuring disk storage resources, the protection of confidential data against potential criminals becomes a real problem.” This is an issue for private users just as much as for professionals, given that routers (Internet provider boxes) connect our domestic devices to a worldwide system.

The R3S team is working in partnership with telephone service providers such as Orange to come up with solutions capable of improving the security of content. Their goal: to ensure that content remains confidential and anonymous. Maryline Laurent explains: “Each user on a network is provided with a unique identifier in order to be recognised. We adopted a two-stage approach, the first stage enabling the encryption of data using a symmetric key and the second stage enabling the symmetric key to be secured within the cloud using the ID-based method; the symmetric key, which enables encryption and decryption, is generated by the users based on their identifier.” Consequently, not only would hackers be unable to read the data if stolen, but they would not even be able to tell who the data belongs to.

Furthermore, the standard ID-based method now appears to be insufficient for open systems such as cloud computing. More often than not it is not just a single user, but entire groups of users who share and access the same data. A further refinement has therefore been implemented: “To increase security, we decided that our ID-based encryption would generate its key not just from the unique user identifier, but also from the data itself.” This provides a unique cloud, where each piece of data has its own unique identifier providing a summary of its content. To find data in such a system, you need to know what you are looking for.
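The two-stage idea and its content-derived refinement can be sketched in a few lines. This is a deliberately simplified stand-in, not the R3S team’s actual scheme: real identity-based encryption uses pairing-based cryptography, whereas here hash-derived masking merely plays its role, and the user identifier `alice@example.org` is purely hypothetical.

```python
import hashlib
import os

def content_id(data: bytes) -> str:
    """Unique identifier summarising the data's content (stage-two input)."""
    return hashlib.sha256(data).hexdigest()

def _keystream(seed: bytes, length: int) -> bytes:
    """Toy keystream by repeated hashing (illustrative only, not secure)."""
    out, block = b"", seed
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def wrap_key(sym_key: bytes, user_id: str, data_id: str) -> bytes:
    # XOR masking stands in for ID-based key protection; since XOR is its
    # own inverse, the same function also unwraps the key.
    mask = _keystream(f"{user_id}|{data_id}".encode(), len(sym_key))
    return bytes(a ^ b for a, b in zip(sym_key, mask))

def encrypt(data: bytes, user_id: str):
    sym_key = os.urandom(32)                      # stage 1: random symmetric key
    cipher = bytes(a ^ b for a, b in zip(data, _keystream(sym_key, len(data))))
    data_id = content_id(data)                    # stage 2: bind key to ID + data
    return cipher, wrap_key(sym_key, user_id, data_id), data_id

def decrypt(cipher: bytes, wrapped: bytes, user_id: str, data_id: str) -> bytes:
    sym_key = wrap_key(wrapped, user_id, data_id)
    return bytes(a ^ b for a, b in zip(cipher, _keystream(sym_key, len(cipher))))
```

A stolen ciphertext then reveals neither the content nor its owner: without both the user identifier and the content-derived identifier, the symmetric key cannot be unwrapped, which mirrors the article’s observation that in such a system you must already know what you are looking for.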

 

The challenge of securing a passive system

However, this method requires a high level of processing power on the part of the user, while today’s systems are increasingly portable and scaled down. “A further challenge,” says Maryline Laurent, “is being able to quickly authenticate devices such as smartphones, which are limited to carrying out simple processing operations.” Whether by use of a password or some other means, this authentication should provide evidence that the network user is who they say they are.

Maryline Laurent has carried out research on an extreme case of lack of power, relating to RFID (Radio Frequency Identification) chips, which are tiny and yet play an important role in identifying remote devices. The security drawbacks of RFID chips explain why Europe is so reluctant to use them. A new European regulation should also enable general regulations on data protection to be updated. In the United States, industrial groups have already established uses for RFID chips, which are, for example, now replacing labels and barcodes at Walmart. Incorporated within everyday objects, they could even allow a handbag to be scanned instantly to see if it contains the cigarette lighter you’re looking for, or a stolen ring. However, this quickly poses a threat to privacy: without strong authentication, anyone could potentially search your bag without permission! Success in blocking such access could be the key to opening the European market to RFID chips.

Resolving the issue of lack of power required an innovative solution. “We took the NTRU (N-th degree truncated polynomial ring) method, a highly promising public key method, and adapted it. It is now possible to divide up the cryptographic processes: the entire workload is given to the server and the RFID chip only has a few binary operations to perform.”

Technically, what has been developed is a lightweight system of two-way authentication, in which the processing principle consists of converting the NTRU into a binary polynomial and proposing a new method of generating and multiplying polynomials. “We can now carry out multiplication using simple shift operations.”
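The idea of multiplication by shifts can be sketched in a few lines. This is a generic illustration of binary-polynomial arithmetic, not the team’s patented scheme: a polynomial with coefficients mod 2 is stored as a bit mask, and multiplication reduces to shifting and XOR-ing copies of one operand, operations cheap enough for a passive chip.

```python
def poly_mul_bin(a: int, b: int) -> int:
    """Multiply two binary polynomials (bit i = coefficient of x^i, mod 2)
    using only shifts and XORs."""
    result = 0
    shift = 0
    while b:
        if b & 1:                 # term x^shift is present in b
            result ^= a << shift  # add (XOR) a shifted copy of a
        b >>= 1
        shift += 1
    return result

# (x + 1)(x + 1) = x^2 + 1 over GF(2): 0b11 * 0b11 = 0b101
assert poly_mul_bin(0b11, 0b11) == 0b101
```

Because addition mod 2 is XOR, no carries propagate, so the whole loop needs nothing more than a shifter and an XOR gate in hardware, which is the point of pushing the heavy cryptographic workload to the server and leaving the chip only such binary operations.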

RFID chips will also benefit from strong authentication. “Our RFID chips project is highly advanced and has enabled us to register two patents. And if this encryption functions on passive chips, it will of course function on any machine.”

 

Taking back control of identity

These two developments allow data to be secured, but there is another element to take into account in order for users to put their trust into the systems: the security of data flows. Maryline Laurent explains: “The classic example is Facebook; if you return to Facebook a year after you have unsubscribed, you will find that all the data in your profile has been retained. This does not comply with the right to erasure of data outlined by European regulations. In social networks as they are today, users are losing control of the information they transmit and produce, and are unaware of the location of their data, of whether it has been duplicated and who has access to it.”

Maryline Laurent’s team are working on this issue alongside the W3C (World Wide Web Consortium), the organisation which works to standardize web technologies. The concept works by testing solutions which enable the user to manage access and distribution of their data on a social network, using a resource named MyProfile. Amongst other things, the user is able to physically control their data, as if it appeared on a home computer. It is not the data of the user which is transmitted, but solely its location on the hard disk. This is made possible using semantic web technology, meaning the network is required to connect to the disk in order to access the data. Using this approach, if users wish to erase their data there is no way for anyone to gain access to it.

Unfortunately, it will be difficult to get social networks to adopt these privacy-respecting technologies, which are clearly a disadvantage to them. The solution is therefore to establish new rival networks. The general public is becoming increasingly aware of the risks and will, in theory, eventually migrate towards networks which will respect their privacy, giving them greater control over their personal data.

 

Maintaining security

Despite all these measures, is it possible for data to ever be truly secure? We are increasingly entrusting our secrets to a growing number of people. Ultimately, vulnerability does not occur as a result of the system but of the user: “Users do not really understand the consequences of decisions they make in relation to data and system security,” explains Maryline Laurent. “Our role as researchers is to come up with solutions to protect users from the risks connected with new technologies, to guide their decision making, and to give them the confidence to know that their efforts are not in vain.”

 

The Personal Information Values and Policies Chair

With a PhD in Information Science, Maryline Laurent now specialises in IPv4 and IPv6 (Internet Protocol) network security. As a professor at Télécom SudParis, for several years she has led research into the issues of security and data protection in cloud computing, constrained systems, social networks and identity management. As a result of this research, she now co-holds the multidisciplinary Institut Mines-Télécom Chair launched in 2013, entitled Personal Information Values and Policies.

The aim of the Chair is to promote discussion around the legal, ethical, economic and technical regulation of personal information and online identities. The team’s key strength is its multi-disciplinary approach: Maryline Laurent is responsible for the technological side of things, but she works alongside a legal practitioner, a philosopher and an economist! Their boundary-breaking work is yielding new solutions to one of modern society’s most delicate problems.

 


An ocean of environmental data to be processed

If nowadays we can detect icebergs along the route of the Vendée Globe, track and identify ships causing pollution from space, and define ecological corridors and reserves for the protection of animal species, it is largely thanks to stations which receive and process satellite images in real time, and the progress made in the last twenty years in the field of signal processing.

The creation of resources for environmental surveillance, sustainable management of marine resources and maritime safety is a key research priority of the Institut Mines-Télécom, and a field to which René Garello – a signal processing researcher at Télécom Bretagne, Director of GIS BreTel, and the recently elected president of the IEEE[1] OES (Oceanic Engineering Society) international academic association – has made major contributions.

 

Mapping, climate change, the battle against pollution, oceanography… a number of organizations working in these fields can be found in the city of Brest, with many of them collaborating at the Pôle Mer Bretagne competitiveness cluster, working together for a safe and sustainable maritime world. An expert at Pôle Mer and representative of the Brittany region for NEREUS (the Network of European Regions Using Space Technologies), René Garello is a key figure in this research community, covering both remote sensing and marine science.

 

Remote sensing and environmental observation

René Garello joined Télécom Bretagne in 1984, after three years at the NOAA (National Oceanic and Atmospheric Administration), the American agency responsible for the study of the ocean and the atmosphere. His thesis on radars for observation of the ocean surface led him to explore further practical applications, bringing valuable specialist skills to the agency.

Upon returning to France he continued to plough his own unique furrow, finding practical applications for scientific breakthroughs. While pursuing his research at Télécom Bretagne, the young researcher was also setting up European projects with German, British and Italian partners on marine surface observation and remote sensing. The first links were made with IFREMER (Institut Français de Recherche pour l’Exploitation de la Mer – French Research Institute for Exploitation of the Sea), which was entrusted by the IEEE OES with the organization of its flagship conference, OCEANS, in 1994. In 2005, René Garello chaired the second edition of the event in Brest.

The strength of his work lies in its two-dimensional approach to methods which were previously only used for one-dimensional signal processing. “The starting point was signal processing, something which immediately interested me during my graduate studies, when I developed a particular interest in mathematics and its applications,” explains René Garello. Despite often relating to image-type data, this particular case was a question of two-dimensional signal processing as opposed to image processing.

Remote sensing is now carried out in ocean, coastal and inshore areas as well as both rural and urban areas for environmental purposes. VIGISAT (see text box) is a ground station for direct reception and analysis of satellite images, located in Brest and operated by CLS[2]. It provides ocean observation and surveillance services to a variety of organizations, including those responsible for government action at sea, defense, offshore construction, researchers and environmental associations…

 

[box type=”shadow” align=”” class=”” width=””]


The station for direct reception in Brest

VIGISAT, a station for direct reception and analysis of satellite images operated in Brest.

This cutting-edge facility was created by a Télécom Bretagne start-up called Boost Technologies, set up in 2001 by Professor Vincent Kerbaol, in which René Garello was a founding partner. A technology transfer partnership (for data processing software) was set up between Télécom Bretagne and Boost. The goal of this venture was to create a satellite radar receiving ground station in Brest, but such an infrastructure raised complex management issues (cost, maintenance, contracts with the European Space Agency) and the VIGISAT platform eventually came to fruition at the end of 2009, once Boost received backing from CLS. VIGISAT receives, processes, analyzes and distributes high-resolution radar images from European and Canadian satellites, 24 hours a day and in real time. Learn more [/box]

 

A constant interplay between theory and practice

René Garello’s main interest is still finding new applications for signal processing in the maritime context. “The marine surface is of great interest to a signal processing specialist, and it’s a useful subject to study: the sea provides statistics that are non-Gaussian, non-stationary and non-linear. None of the models given to students are still applicable. For example, the noise model is not additive but multiplicative; there is nothing like this in the textbooks,” explains the professor.

The marine surface presents a fine-scale texture known as “surface roughness”. Waves, swell, eddies, wind and boats are visible through variations in this roughness. The difficulty in processing these images arises because the sea is in motion, which produces a Doppler effect (the shift in the frequency of a wave between emission and reception, when the distance between these two points changes over time). The speeds involved – wave movement, currents, boats – must be taken into account before interpretation with statistical signal processing tools, in order to produce accurate images.
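As a back-of-the-envelope sketch (illustrative figures, not taken from the team’s work), the two-way Doppler shift a radar sees from a surface element moving at radial velocity v is f_d = 2v/λ, since the path to the target and back both contract or stretch:

```python
def radar_doppler_shift(radial_velocity_m_s: float, wavelength_m: float) -> float:
    """Two-way Doppler shift f_d = 2*v/lambda for a monostatic radar."""
    return 2.0 * radial_velocity_m_s / wavelength_m

# A C-band radar (wavelength ~5.6 cm) observing a 1 m/s surface current:
shift_hz = radar_doppler_shift(1.0, 0.056)   # ~35.7 Hz
```

Even a modest surface current thus shifts the return by tens of hertz, which is why these velocities must be estimated and compensated before the imagery can be interpreted.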

“The scientific advances we have made have been linked to the fact that we were looking for spatially distributed information, and wanted to work towards an extremely high-resolution,” explains René Garello. “We managed to get around the problem of non-stationary data using small, linear pieces.” But, bypassing the standard scientific approach, René did not just try out a few signal processing methods to see which one would work the most effectively. He began by looking closer at the physical nature of the phenomenon, and called in some specialist assistance.

“We rely on our knowledge of the geo-physical data, and of the instrument, the radar, which gives us a model which takes into consideration the physical nature of the phenomenon and sensor interaction.” This back and forth relationship between upstream research in signal processing and downstream research in marine science has led to progress in both fields.

One recent example arose from a simple question: how can we detect internal waves from space? Internal waves are a marine phenomenon: remaining beneath the surface, they cause modifications to the surface roughness through the vertical displacement of a mass of water. These waves exert an influence on the water column: their energy can affect moored structures and they act as barriers to underwater communications. A European project (MORSE) driven by René Garello focused on analyzing radar images from periods of known internal wave activity, in order to quantify the phenomenon: “We measured the wavelength and height of the internal wave on the radar image, and with the in situ measurements we were able to validate the model, which enabled us to view the phenomenon in three dimensions.”

This desire to continuously compare research with the field of application is central to the scientist’s activities within the IEEE. With the issues of climate change and energy (oil, deep-sea exploration, offshore wind power) becoming ever more important, René Garello is steering the association towards further research on renewable marine energy, one of the key themes of the Pôle Mer Bretagne cluster and the IEED (Institut d’Excellence dans le domaine des Énergies Décarbonées – Institute of Excellence in the field of Decarbonated Energies) based in Brest.

 

Born in 1953 in the South of France, René Garello has spent his whole career in academia. Following a Master’s degree in Telecommunications in Toulon in 1977, he completed his Master of Advanced Studies and PhD at the INP Grenoble (Institut Polytechnique de Grenoble – Grenoble Polytechnic Institute of Technology). After joining Télécom Bretagne in 1984, he was appointed professor in 1988 and earned the Habilitation to Supervise Research in 1994. He was elected Fellow of the IEEE for his scientific research in 2006.

René Garello explains his commitment to the IEEE professional association: “The IEEE pushes us to prove that what we are doing is useful for humankind: it has a political commitment too, working with policy-makers, at a global level. Each researcher networks with contacts in their own country, so we have representative groups with an interest in maritime activities in most countries worldwide.” René was elected president of the IEEE OES (Oceanic Engineering Society) at the end of 2012, a responsibility which is rarely given to a non-American.

The researcher now divides his time between the IEEE and the research he is carrying out within the TOMS (Traitements, Observation et Méthodes Statistiques – Processing, Observation and Statistical Methods) team at the Brittany-based Lab-STICC. Now a multi-institution team (UBO, UBS, Télécom Bretagne, ENIB, ENSTA Bretagne), it has been rated A+ by the Agence d’Évaluation de la Recherche et de l’Enseignement Supérieur (AERES – Agency for the Assessment of Research and Higher Education) and brings together 20 professors and 25 PhD students.

 

[1] The Institute of Electrical and Electronics Engineers is a large organization which brings together over 400,000 members from more than 40 academic associations. The IEEE organizes 850 conferences each year, with 2 million documents, almost 150 journals and over 1,300 projects and standards under development. Now present in 160 countries, the institute is primarily led and managed by American nationals.

[2] A subsidiary of the CNES (Centre National d’Études Spatiales – National Centre for Space Studies) and IFREMER, CLS operates and processes data from over 80 instruments embedded in some forty satellites.