
4 ERC Consolidator Grants for IMT

The European Research Council has announced the results of its 2016 Call for Consolidator Grants. Out of the 314 researchers to receive grants throughout Europe (across all disciplines), four come from IMT schools.

 

10% of French grants

These four grants represent nearly 10% of all grants obtained in France: 43 project leaders from French institutions were selected, placing France in 3rd position behind the United Kingdom (58 projects) and Germany (48 projects).

For Christian Roux, the Executive Vice President for Research and Innovation at IMT, “this is a real recognition of the academic excellence of our researchers at the European level. Our targeted research model, which performs well in the partnership-based research carried out with our two very active Carnot institutes, will also benefit from the ERC’s support for more fundamental work driving major scientific breakthroughs.”

Consolidator Grants reward experienced researchers with €2 million to fund a project over five years, providing them with substantial support.

 

[one_half][box]Francesco Andriulli, Télécom Bretagne
After Claude Berrou in 2012, Francesco Andriulli is the second IMT Atlantique researcher to be honored by Europe as part of the ERC program. He will receive a €2 million grant over five years, enabling him to develop his work in the field of computational electromagnetics. Find out more +
[/box][/one_half]

[one_half_last][box]Yanlei Diao, Télécom ParisTech
Yanlei Diao, a world-class scientist recruited jointly by École Polytechnique, the Inria Saclay – Île-de-France Centre and Télécom ParisTech, has been honored for the scientific excellence of her project and for her innovative vision of the “acceleration and optimization of analytical computing for big data”. [/box][/one_half_last]

[one_half][box]

Petros Elia, Eurecom
Petros Elia, a professor of Telecommunications at Eurecom, has been awarded an ERC Consolidator Grant for his DUALITY project (Theoretical Foundations of Memory Micro-Insertions in Wireless Communications).
Find out more +

[/box][/one_half]

[one_half_last][box]

Roisin Owens, Mines Saint-Étienne
This marks the third time that Roisin Owens, a Mines Saint-Étienne researcher specializing in bioelectronics, has been rewarded by the ERC for the quality of her projects. She received a Starting Grant in 2011, followed by a Proof of Concept Grant in 2014.
Find out more +
[/box][/one_half_last]


Networks and New Services: A Complete Story

This book shines a spotlight on software-centric networks and their emerging service environments. The authors examine the road ahead for connectivity, for both humans and ‘things’, considering the rapid changes that have shaken the industry.

The book analyses the major catalytic shifts that are shaping the communications world: softwarization, virtualization and cloud computing. It guides the reader through a maze of possible architectural choices, driven by discriminating and sometimes conflicting business considerations. The new ICT capabilities lead us toward smarter environments and an ecosystem in which applications are supported by shared networking facilities, rather than networks each supporting their own applications. Growing user awareness is a key driver of the softwarization process.

Softwarization disrupts the current status quo for equipment, development, networks, operations and business. It radically changes the value chain and the stakeholders involved. The dilemma is between a ‘slow burn’ traditional step-by-step approach and a bold transformation of the whole infrastructure and business models. This book is essential reading for those seeking to build user-centric communication networks that support independent, agile and smart applications. See more

 

About the authors

 

Roberto Minerva
Roberto Minerva has a Master’s degree in Computer Science from Bari University, Italy, and a Ph.D. in Computer Science and Telecommunications from Télécom SudParis, France. He is the head of the Innovative Architectures group within the Future Centre in the Strategy Department of Telecom Italia. His job is to create advanced scenarios derived from the application of emerging ICT technologies with innovative business models, especially in the areas of IoT, distributed computing, programmable networks and personal data. He is currently involved in Telecom Italia activities related to big data, architectures for IoT, and ICT technologies for leveraging cultural heritage.

Noël Crespi
Noël Crespi holds Master’s degrees from the Universities of Orsay and Canterbury, a Diplôme d’ingénieur from Télécom ParisTech, and a Ph.D. and Habilitation from Paris VI University. He joined Institut Mines-Télécom in 2002 and is currently Professor and MSc Programme Director, leading the Service Architecture Laboratory. He coordinates the standardisation activities of Institut Mines-Télécom at ETSI, 3GPP and ITU-T. He is also an adjunct professor at KAIST (Korea) and serves on the four-person Scientific Advisory Board of FTW (Austria). His current research interests are in service architectures, communication services, social networks and the Internet of Things/Services. He is the author or co-author of 250 articles and contributions to standardisation. See more

 

Networks and New Services: A Complete Story
Roberto Minerva, Noël Crespi
Springer, 2017
Series “Internet of Things”
186 pages
€100.21 (hardcover) – €79.72 (eBook)

Buy this book


Accurate quantification of uncertainty: an AXA Chair at Eurecom

AXA Chairs reward only a few scientists every year. With his chair on New Computational Approaches to Risk Modeling, Maurizio Filippone, a researcher at Eurecom, joins a community of prestigious researchers such as Jean Tirole, the French professor who won the Nobel Prize in Economics.

 

Maurizio, you’ve just been awarded an AXA Chair. Could you explain what it is about and why your project was selected?

AXA Chairs are funded by the AXA Research Fund, which supports fundamental research to advance our understanding of risk. Started in 2008, the fund supports about 50 new projects annually, of which four to eight are chairs. They are individual fellowships, and the one I received will support my research activities for the next seven years. My project is entitled “New Computational Approaches to Risk Modeling”. The AXA Chair selection process is not based on the project alone. For this type of grant, several criteria are important: timeliness, vision, the credibility of both the proposal and the candidate (track record, collaborations, etc.), the institution, and the fit within the institution’s strategy. For example, the fact that the research area of this topic is in line with Eurecom’s long-term strategy in data science played a major role in the selection of my project. This grant definitely represents a major achievement in my career.

 

What is your project about exactly?

My project deals with one simple question: How do you go from data to decisions? Today, we can access so much data generated by so many sensors, but we are facing difficulties in using these data in a sensible way. Machine learning is the main technique that helps make sense of data, and I will use and develop novel techniques in this domain throughout this project. Quantification of risk and decision-making require accurate quantification of uncertainty, which is a major challenge in many areas of science involving complex phenomena, such as finance and the environmental and medical sciences. In order to accurately quantify the level of uncertainty, we employ the flexible and accurate tools offered by probabilistic nonparametric statistical models. But today’s diversity and abundance of data make these models difficult to use. The goal of my project is to propose new ways to better manage the interface between computational and statistical models, which in turn will help obtain accurate confidence in predictions based on observed data.
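
To make the idea of a probabilistic nonparametric model concrete, here is a minimal Gaussian process regression sketch in Python. Gaussian processes are one standard example of such models; the interview does not specify which models the project will use, so this is purely illustrative, on toy data:

```python
# Minimal Gaussian process regression: predictions come with a predictive
# variance, the kind of calibrated uncertainty risk modeling needs.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=20)                 # observed inputs
y = np.sin(X) + 0.1 * rng.standard_normal(20)   # noisy observations

noise = 0.1 ** 2
K = rbf_kernel(X, X) + noise * np.eye(len(X))
L = np.linalg.cholesky(K)                       # exact inference: O(n^3) cost
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xs = np.linspace(-3, 3, 100)                    # test grid
Ks = rbf_kernel(X, Xs)
mean = Ks.T @ alpha                             # predictive mean
v = np.linalg.solve(L, Ks)
var = rbf_kernel(Xs, Xs).diagonal() - np.sum(v ** 2, axis=0)  # predictive variance
```

The predictive variance grows away from the observed data, which is exactly the honest “I don’t know” signal that decision-making under risk requires.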

 

How will you be able to do that? With what kind of advanced computing techniques?

The idea behind the project is that it is possible to carry out exact quantification of uncertainty relying exclusively on approximate, and therefore cheaper, computations. Using nonparametric models is difficult and generally computationally intractable due to the complexity of the systems and the amount of data. Although computers are more and more powerful, exact computations remain serial, too long, too expensive and sometimes almost impossible to carry out. The way approximate computations will be designed in this project will reduce computing time by orders of magnitude! The exploitation of parallel and distributed computing on large-scale computing facilities, an area of deep expertise at Eurecom, will be key to achieving this. We will thus be able to develop new computational models that make accurate quantification of uncertainty possible.
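
As an illustration of trading exactness for cheaper computation, the sketch below replaces the exact O(n³) Gaussian process inference above with random Fourier features (Rahimi & Recht, 2007). This is a generic, well-known approximation, not necessarily the one the project will develop:

```python
# Random Fourier features: approximate the RBF kernel with m cosine
# features, so training cost drops from O(n^3) to O(n m^2).
import numpy as np

rng = np.random.default_rng(1)
n, m, lengthscale, noise = 5000, 200, 1.0, 0.1 ** 2
X = rng.uniform(-3, 3, size=n)
y = np.sin(X) + 0.1 * rng.standard_normal(n)

W = rng.standard_normal(m) / lengthscale        # spectral frequencies of the RBF kernel
b = rng.uniform(0, 2 * np.pi, size=m)           # random phases
Phi = np.sqrt(2.0 / m) * np.cos(X[:, None] * W[None, :] + b)   # n x m features

# Bayesian linear regression in feature space (unit Gaussian prior on weights)
A = Phi.T @ Phi + noise * np.eye(m)
w_mean = np.linalg.solve(A, Phi.T @ y)

Xs = np.linspace(-3, 3, 100)
Phis = np.sqrt(2.0 / m) * np.cos(Xs[:, None] * W[None, :] + b)
mean = Phis @ w_mean                            # approximate predictive mean
cov_w = noise * np.linalg.inv(A)                # posterior covariance of weights
var = np.sum((Phis @ cov_w) * Phis, axis=1) + noise  # approximate predictive variance
```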

 

What are the practical applications?

Part of the focus of the project will be on life and environmental applications that require quantification of risk. We will therefore mostly use life sciences data (e.g., neuroimaging and genomics) and environmental data for our models. I am confident that this project will help tackle the explosion of large-scale and diverse data in the life and environmental sciences. This is already a huge challenge today, and it will be even more difficult to deal with in the future. In the mid-term, we will develop practical and scalable algorithms that learn from data and accurately quantify the uncertainty of their predictions. In the long term, we will be able to improve on current approaches to risk estimation: they will be timelier and more accurate. These approaches can have major implications for the development of medical treatment strategies or environmental policies, for example. Is some seismic activity going to trigger a tsunami for which it is worth warning the population, or not? Is a person showing signs of a systemic disease, like Parkinson’s, actually going to develop the disease or not? I hope the results of our project will make it easier to answer these questions.

 

Do you have any partnerships in this project?

Of course! I will initiate some new collaborations and continue collaborating with several prestigious institutions worldwide to make this project a success: Columbia University in New York, Oxford, Cambridge, UCL and Glasgow in the UK, the Donders Institute of Neuroscience in the Netherlands, the University of New South Wales in Australia, as well as Inria in France. The funding from the AXA Research Fund will help create a research team at Eurecom, comprising myself, two PhD students and a postdoc. I would like the team to bring together a blend of expertise, since novelty requires an interdisciplinary approach: computing, statistics, mathematics and physics, plus some expertise in the life and environmental sciences.

 

What are the main challenges you will be facing in this project?

Attracting talent is one of the main challenges! I’ve been lucky so far, but it is generally difficult. This project is extremely ambitious; it is a high-risk, high-gain project, so there are some difficult technical challenges to face, all of them related to the cutting-edge tools, techniques and strategies we will be using and developing. We will find ourselves in the usual situation when working on something new and visionary: getting stuck in blind alleys, for instance, or having to abandon promising ideas that turn out not to work. But that is why it has been funded for seven years! Despite these difficulties, I am confident this project will be a success and that we will make a huge impact.

 


The French National Library is combining sociology and big data to learn about its Gallica users

As a repository of French culture, the Bibliothèque Nationale de France (BnF, the French National Library) has always sought to know and understand its users. This is no easy task, especially when it comes to studying the individuals who use Gallica, its digital library. To learn more about them, without limiting itself to interviewing sample individuals, the BnF has joined forces with Télécom ParisTech, taking advantage of its multidisciplinary expertise. To meet this challenge, the scientists are working with IMT’s TeraLab platform to collect and process big data.

[divider style=”normal” top=”20″ bottom=”20″]

 

[dropcap]O[/dropcap]ften seen as a driving force for technological innovation, could big data also represent an epistemological revolution? The use of big data in experimental sciences is nothing new; it has already proven its worth. But the humanities have not been left behind. In April 2016, the Bibliothèque Nationale de France (BnF) leveraged its longstanding partnership with Télécom ParisTech (see box below) to carry out research on the users of Gallica — its free, online library of digital documents. The methodology used is based in part on the analysis of large quantities of data collected when users visit the website.

Every time a user visits the website, the BnF server records a log of all the actions carried out by the individual on Gallica. This information includes the pages opened on the website, time spent on the site, links clicked on each page, documents downloaded, etc. These logs, which are anonymized in compliance with the regulations established by the CNIL (the French Data Protection Authority), therefore provide a complete map of the user’s journey, from arriving at Gallica to leaving the website.
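
To give a rough idea of how such logs become analyzable journeys, the sketch below groups anonymized events into sessions using a standard inactivity threshold. The field names and the 30-minute cutoff are illustrative assumptions; the article does not describe the BnF’s actual log schema or processing pipeline:

```python
# Sessionization sketch: group anonymized (visitor, timestamp, action)
# events into ordered per-visitor sessions, splitting on long gaps.
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # common web-analytics heuristic

def sessionize(events):
    """Return a list of (visitor_id, [events]) sessions."""
    by_visitor = defaultdict(list)
    for visitor_id, ts, action in events:
        by_visitor[visitor_id].append((ts, action))
    sessions = []
    for visitor_id, evts in by_visitor.items():
        evts.sort()                      # chronological order
        current = [evts[0]]
        for prev, nxt in zip(evts, evts[1:]):
            if nxt[0] - prev[0] > SESSION_GAP:
                sessions.append((visitor_id, current))
                current = []
            current.append(nxt)
        sessions.append((visitor_id, current))
    return sessions

events = [
    ("anon42", datetime(2016, 10, 1, 9, 0), "view:document-A"),
    ("anon42", datetime(2016, 10, 1, 9, 5), "download:pdf"),
    ("anon42", datetime(2016, 10, 1, 14, 0), "view:document-B"),
]
for visitor, session in sessionize(events):
    print(visitor, [action for _, action in session])  # two separate sessions
```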

With 14 million visits per year, this information represents a large volume of data to process, especially since it must be correlated with the records of the 4 million documents available for consultation on the site — type of document, creation date, author, etc. — records that also provide valuable information for understanding users and their interest in documents. Carrying out sociological fieldwork alone, by interviewing larger or smaller samples of users, is not enough to capture the great diversity and complexity of today’s online user journeys.

Researchers at Télécom ParisTech therefore took a multidisciplinary approach. Sociologist Valérie Beaudouin teamed up with François Roueff to establish a dialogue between the sociological analysis of uses through field research, on the one hand, and data mining and modeling on the other. “Adding this big data component allows us to use the information contained in the logs and records to determine the typical behavioral profiles of Gallica users,” explains Valérie Beaudouin. The data is collected and processed on IMT’s TeraLab platform, which provides researchers with a turnkey working environment that can be tailored to their needs and offers more advanced features than commercially available data processing tools.

Also read on I’MTech: TeraLab and La Poste have teamed up to fight package fraud

What are the different profiles of Gallica users?

François Roueff and his team were tasked with using the available information to develop unsupervised learning algorithms capable of identifying categories of behavior within this large volume of data. After six months of work, the first results appeared. The initial finding was that only 10 to 15% of Gallica users’ browsing activity involves consulting several digital documents; the remaining 85 to 90% are occasional visits for a specific document.

“We observed some very interesting things about those 10 to 15% of Gallica sessions,” says François Roueff. “If we analyze the sessions in terms of the variety of types of documents consulted (monographs, press, photographs, etc.), eight out of ten categories use only a single type,” he says. This reflects a tropism on the part of users toward a certain form of media. When it comes to consulting documents, there is generally little variation in the ways Gallica users obtain information: some search for information about a given topic solely by consulting photographs, while others consult only press articles.
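
The article does not name the unsupervised algorithm the team used, but the following sketch illustrates what identifying such categories can look like: sessions represented by their mix of document types, grouped with k-means. The toy data and the choice of k-means are assumptions for illustration:

```python
# Cluster sessions by the proportion of each document type consulted.
import numpy as np
from sklearn.cluster import KMeans

# Each row: share of a session spent on [monographs, press, photographs]
sessions = np.array([
    [1.0, 0.0, 0.0],   # single-type session (monographs only)
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],   # press only
    [0.0, 0.0, 1.0],   # photographs only
    [0.4, 0.3, 0.3],   # rarer mixed-type session
])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(sessions)
print(labels)  # sessions grouped by their dominant document type
```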

According to Valérie Beaudouin, the central focus of this research lies in understanding such behavior. “Using these results, we develop hypotheses, which must then be confirmed by comparing them with other survey methodologies,” she explains. Data analysis is therefore supplemented by an online questionnaire to be filled out by Gallica users, field surveys among users, and even by equipping certain users with video cameras to monitor their activity in front of their screens.

[tie_full_img]BnF campaign poster, October 2016, with the slogan “Êtes-vous déjà entré à l’intérieur d’une encyclopédie ?” (“Have you ever been inside an encyclopedia?”)[/tie_full_img]

Photo from a poster for the Bibliothèque Nationale de France (BnF), October 2016. For the institution, making culture available to the public is a crucial mission, and that means digital resources must be made available in a way that reflects users’ needs.

 

“Field studies have allowed us to understand, for example, that certain groups of Gallica users prefer downloading documents so they can read them offline, while others would rather consult them online to benefit from the high-quality zoom feature,” she says. The Télécom ParisTech team also noticed that, in order to find a document on the digital library website, some users prefer to use Google and include the word “Gallica” in their search, instead of using the website’s internal search engine.

Confirming the hypotheses also means working closely with the teams at the BnF, who provide knowledge about the institution and the technical tools available to users. Philippe Chevallier, project manager for the cultural institution’s Strategy and Research delegation, attests to the value of this dialogue with the researchers: “Through our discussions with Valérie Beaudouin, we learned how to take advantage of the information collected by community managers about individuals who are active on social media, as well as user feedback received by email.”

Analyzing user communities: a crucial challenge for institutions

The project has provided BnF with insight into how existing resources can be used to analyze users. This is another source of satisfaction for Philippe Chevallier, who is committed to the success of the project. “This project is the proof that knowledge about user communities can be a research challenge,” he says with excitement. “It’s too important an issue for an institution like ours, so we need to dedicate time to studying it and leverage real scientific expertise,” he adds.

And when it comes to Gallica, the mission is even more crucial. It is impossible to see Gallica users, whereas the predominant profile of users of BnF’s physical locations can be observed. “A wide range of tools are now available for companies and institutions to easily collect information about online uses or opinions: e-reputation tools, web analytics tools etc. Some of these tools are useful, but they offer limited possibilities for controlling their methods and, consequently, their results. Our responsibility is to provide the library with meaningful, valuable information about its users and to do so, we need to collaborate with the research community,” says Philippe Chevallier.

In order to obtain the precise information it is seeking, the project will continue until 2017. The findings will offer insights into how the cultural institution can improve its services. “We have a public service mission to make knowledge available to as many people as possible,” says Philippe Chevallier. In light of the researchers’ observations, the key question will be how to optimize Gallica. Who should take priority? The minority of users who spend the most time on the website, or the overwhelming majority who only use it sporadically? Users from the academic community — researchers, professors, students — or the “general public”?

The BnF will have to take a stance on these questions. In the meantime, the multidisciplinary team at Télécom ParisTech will continue its work to describe Gallica users. In particular, it will seek to fine-tune the categorization of sessions by enriching them with a semantic analysis of the records of the 4 million digital documents. This will make it possible to determine, within the large volume of data collected, which topics the sessions relate to. The task poses modeling problems that require particular attention, since the content of the records is intrinsically inhomogeneous: it varies greatly depending on the type of document and the conditions of digitization.
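
As an illustration of what a semantic analysis of heterogeneous catalogue records might involve, the sketch below extracts topic mixtures from a few toy records with latent Dirichlet allocation. The method and the records are assumptions for illustration, not the team’s actual approach:

```python
# Toy topic model over catalogue records: each record gets a topic
# mixture, which could then be aggregated over the documents consulted
# in a session to tag that session with themes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

records = [
    "photographie guerre 1914 1918 front soldats",
    "presse quotidien journal politique 1890",
    "monographie encyclopedie sciences histoire",
    "photographie portrait atelier 1900",
]
X = CountVectorizer().fit_transform(records)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topics = lda.transform(X)   # one topic-mixture row per record
print(doc_topics.round(2))
```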

 

[divider style=”normal” top=”20″ bottom=”20″]

Online users: a focus for the BnF for 15 years

The first study carried out by the BnF to describe its online user community dates back to 2002, five years after the launch of its digital library, in the form of a research project that already combined several approaches (online questionnaires, log analysis, etc.). In the years that followed, digital users became an increasingly important focus for the institution. In 2011, a survey of 3,800 Gallica users was carried out by a consulting firm. Realizing that studying users would require more in-depth research, the BnF turned to Télécom ParisTech in 2013 with the objective of assessing the different possible approaches to a sociological analysis of digital uses. At the same time, the BnF launched its first big data research project, measuring Gallica’s position on the French internet for World War I research. In 2016, the sociology-of-online-uses component and the big data experiments were brought together in the project aiming to understand the uses and users of Gallica.

[divider style=”normal” top=”20″ bottom=”20″]

 


The autonomous car: safety hinging on a 25cm margin

Does an autonomous or semi-autonomous car really know where it is located on a map? How accurately can it position itself on the road? For the scientists working on the European H2020 HIGHTS project, intelligent transportation systems must know their position to within a quarter of a meter. Jérôme Härri, a researcher in communication systems at Eurecom — a partner school in this project — explains how current positioning technology must be adapted to achieve this level of precision, and why this involves a different approach from the one used by manufacturers such as Tesla or Google.

 

You are seeking solutions for tracking vehicles’ location within a margin of 25 centimeters. Why this margin?

Jérôme Härri: It is the car’s average margin for drifting to the right or left without leaving its traffic lane. This distance appears both in the scientific literature and in requests from industrial partners seeking to develop intelligent transportation. You could say it is the value at which autonomous driving becomes possible while ensuring the required safety for vehicles and individuals: greater precision is even better; with less precision, things get complicated.

 

Are we currently far from this spatial resolution? With what level of precision do the GPS units in most of our vehicles locate us on the road?

JH: A basic GPS can locate us with an accuracy of 2 to 10 meters, and the new Galileo system promises an accuracy of 4 meters. But this is only possible with sufficient access to satellites, in open or rural areas. In urban settings, tall buildings make satellites less accessible, and reaching an accuracy of under 5 meters is rare. The margin of error is then reduced by projection, so the user only rarely experiences such a large positioning error. But this does not work for an autonomous car. Improvements to GPS do exist, such as differential GPS, which can position us with an accuracy of one meter or even less. Real-time kinematic (RTK) technology, used for cartography in the mountains, is even more efficient. Yet it is expensive, and it also has its limits in the city. RTK is gaining ground in digital-city applications, but we have not yet reached that point.

 

And yet Google and Tesla are already building their autonomous or semi-autonomous cars. How are these cars being positioned?

JH: Current autonomous cars use a very precise positioning system on maps, accurate down to the traffic lane, which combines GPS and 4G. However, this system is slow. It is therefore used for navigation, so that the car knows what it must do to reach its destination, but not for detecting danger. For that, the cars use radar, lidars (in other words, lasers) or cameras. But this system has its limits: the sensors can only see around 50 meters ahead. Yet on the highway, cars travel at 30 or even 40 meters per second. This gives the autonomous car one second to stop, slow down or adapt in the event of a problem, which is not enough. And the system is not infallible. For example, the Tesla accident last May was caused by the danger-detection camera confusing the light color of a truck with that of the sky.

 

What approaches are you taking in the HIGHTS project to improve geolocation and its reliability?

JH: We want to know, within a 25-centimeter margin, where a vehicle is located on the road, not just in relation to another car. To do this, we use cooperation between vehicles to triangulate and reduce the effect of a weak GPS signal. We consider that every nearby vehicle can serve as an anchor for the triangulation. For example, an autonomous car may have a weak GPS signal while three surrounding cars have a better one. We can then improve the car’s absolute positioning by triangulating its position relative to those three nearby vehicles. This requires communication technologies for exchanging GPS positions — Bluetooth, ZigBee, Wi-Fi, etc. — and technologies such as cameras and radar to improve the positioning relative to surrounding vehicles.
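
To give a flavor of the cooperative idea, here is a minimal sketch that refines a coarse GPS fix by least-squares trilateration against three neighboring vehicles with known positions. The geometry, the 10 cm ranging noise and the Gauss-Newton solver are illustrative assumptions, not the HIGHTS algorithms:

```python
# Cooperative positioning sketch: refine a coarse GPS fix using ranges
# to three neighbor vehicles acting as anchors (2-D, Gauss-Newton).
import numpy as np

anchors = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 25.0]])  # neighbor positions (m)
true_pos = np.array([12.0, 9.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)          # measured ranges
ranges += np.random.default_rng(0).normal(0, 0.1, 3)         # 10 cm ranging noise

pos = np.array([15.0, 5.0])            # coarse GPS fix as starting point
for _ in range(10):
    diffs = pos - anchors
    dists = np.linalg.norm(diffs, axis=1)
    J = diffs / dists[:, None]         # Jacobian of each range w.r.t. position
    r = dists - ranges                 # range residuals
    pos -= np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton step
print(pos)
```

In this favorable toy geometry, the residual error is on the order of the ranging noise, i.e. comfortably within the 25-centimeter target.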

 

And what if the car is isolated, without any other cars nearby?

JH: When there are not enough cars nearby, we also pursue an implicit approach. Using roadside sensors at strategic locations, it is possible to locate the car precisely on the map. For example, if I know the distance between my vehicle and a billboard or traffic light, and the angles between these landmarks and the road, I can combine this with the GPS positions of the billboard and traffic light. Since they do not move, they make very strong positioning anchors. We thus combine the relative approach with the absolute positions of objects along the road. Yet this situation does not occur very frequently; in most cases, what enables us to improve accuracy is cooperation with other vehicles.

 

So, does the HIGHTS project emphasize the combination of different existing technologies rather than seeking to find new ones?

JH: Yes, with the aim of validating their effectiveness. At the same time, we are working on developing LTE telecommunication networks for transmitting information from vehicle to vehicle — what we refer to as LTE-V2X. In doing so, we are seeking to increase the reliability of the communications. Wi-Fi is not necessarily the most robust technology. On a computer, when the Wi-Fi is not working, we can still watch a movie. But for cars, the alternative V2X technology ensures communications if the Wi-Fi connection fails, whether by accident or due to a cyber-attack. Furthermore, these networks make it possible to use pedestrians’ smartphones to help avoid collisions. With the LTE networks, HIGHTS is testing the reliability of the device-to-device LTE approach for inter-vehicle communication. Our work is situated upstream of standardization. The experience gained in this project enables us to work beyond the current standards and to develop them together with organizations such as ETSI-3GPP, ETSI-ITS and the IETF.

 

Does your cooperative approach stand a chance of succeeding against the individualistic approach used by Tesla and Google, who seek to remain sovereign regarding their vehicles and solutions? 

JH: The two approaches are not incompatible; it is a cultural issue. The Americans (Google, Tesla) think “autonomous car” in the strictest sense, without any outside help. Europeans, on the other hand, think “autonomous car” in the broader sense, without the driver’s assistance, and are therefore more likely to use a cooperative approach in order to reduce costs and improve the interoperability of future autonomous cars. We have been working on the collaborative aspect for several years now, including research on integrating cars into the Internet of Things, carried out with the CEA and BMW, both of which are partners in the HIGHTS project. So we have some very practical and promising lines of research on our side. Moreover, the U.S. Department of Transportation has issued a directive requiring vehicles to carry a cooperative unit beginning in 2019. Google and Tesla can therefore continue to ignore this technology, but since it will be present in vehicles and freely available to them, there is a good chance they will use it.

 

[box type=”shadow” align=”” class=”” width=””]

HIGHTS: moving towards a demonstration platform

Launched in 2015, the three-year HIGHTS project answers the H2020 research program’s call on the theme of smart, green and integrated transportation. It brings together 14 academic and industrial partners[1] from five countries, including companies that work closely with major automakers such as BMW. Its final objective is to establish a demonstration platform for vehicle positioning solutions, from the physical infrastructure to the software.

[1] Germany: Jacobs University Bremen, Deutsches Zentrum für Luft- und Raumfahrt (DLR), Robert Bosch, Zigpos, Objective Software, Ibeo Automotive Systems, Innotec21.
France: Eurecom, CEA, BeSpoon.
Sweden: Chalmers University of Technology.
Luxembourg: FBConsulting.
The Netherlands: PSConsultancy, TASS International Mobility Center.

[/box]