
How the SEAS project is redefining the energy market

The current energy transition has brought with it new energy production and consumption modes. Coordinated by Engie, the European SEAS project aims to foster these changes to create a more responsible energy market. SEAS is seeking to invent the future of energy usage by facilitating the integration of new economic stakeholders to redistribute energy, as well as increasing the energy management options offered to individuals. These ideas have been made possible through the contributions of researchers from several IMT graduate schools (IMT Atlantique, Mines Saint-Étienne, Télécom ParisTech and Télécom SudParis). Among these contributions, two innovations are supported by IMT Atlantique and Mines Saint-Étienne.

 

An increasing number of people are installing their own energy production tools, such as solar panels. This breaks with the traditional “producer-distributor-consumer” energy model. Redefining the stakeholders in the energy chain, as noted by Guillaume Habault, IMT Atlantique computer science researcher, is at the heart of the issue addressed by the Smart Energy Aware Systems (SEAS) project. The project was completed in December, after three years of research as part of the European ITEA program. It brought together 34 partners from 7 countries, including IMT in France. On 11 May, the SEAS project won the ITEA Award of Excellence, in acknowledgement of the high quality of its results.

The project is especially promising as it does not only involve individuals wanting to produce their own energy using solar panels. New installations such as wind turbines provide new sources of energy on a local scale. However, this creates complications for stakeholders in the chain such as network operators: the production from these installations is erratic, as it depends on the seasons and the weather. Yet it is important to be able to foresee energy production in the very short term in order to ensure that every consumer is supplied. Overestimating the production of a wind farm or a neighborhood equipped with solar panels means taking the risk of not having enough energy to cope with a shortfall in production, and ultimately causing power cuts for residents. “Conversely, underestimating production means having to store or dispatch the surplus energy elsewhere. Poor planning can create problems in the network, and even reduce the lifespan of some equipment,” the researcher warns.

 

An architecture for smart energy grid management

Among the outcomes of the SEAS project is a communication architecture capable of gathering all information from different production and consumption modes locally, almost in real time. “The ideal goal is to be able to inform the network in 1-hour segments: with this length of time, we can avoid getting information about user consumption that is excessively precise, while anticipating cases of over- or under-consumption,” explains Guillaume Habault, the creator of the architecture.
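
To make the idea of 1-hour segments concrete, here is a minimal sketch (in Python) of how timestamped production and consumption readings could be rolled up into hourly balances. It is only an illustration of the principle; the function and field names are ours, not part of the SEAS architecture.

```python
from collections import defaultdict
from datetime import datetime

def hourly_balance(readings):
    """Aggregate readings into 1-hour segments.

    readings: iterable of (timestamp, produced_kwh, consumed_kwh) tuples.
    Returns a dict mapping the start of each hour to the local net
    balance (production minus consumption) for that segment.
    """
    segments = defaultdict(float)
    for ts, produced, consumed in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        segments[hour] += produced - consumed
    return dict(segments)

# Example: a positive balance suggests a surplus to store or redistribute,
# a negative one an expected shortfall to cover from the grid.
sample = [
    (datetime(2017, 5, 11, 12, 10), 0.8, 0.3),
    (datetime(2017, 5, 11, 12, 40), 0.7, 0.5),
    (datetime(2017, 5, 11, 13, 5), 0.2, 0.9),
]
print(hourly_balance(sample))
```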

For individuals, SEAS may take the form of an electric device that can transmit information about their consumption and production to their electricity provider. “This type of data will allow people to optimize their power bills,” the researcher explains. “By having perfect knowledge of the local energy production and demand at a given moment, residents will be able to tell if they should store the energy they produce, or redistribute it on the network. With the help of the network, they may also decide what time would be the most economical to recharge their electric car, according to electricity prices, for instance.”

 

 

These data on the current state of a sub-network point to the emergence of new stakeholders, known as “flexibility operators”. Why are such intermediaries needed? First of all, because optimizing your consumption by adapting the way you use each appliance in the house requires specific equipment, and takes time. While it is easy to predict that energy will be more expensive at times of peak demand, such as in the evenings, it is more difficult to anticipate the price of electricity according to how strongly the wind is blowing at a wind farm located several dozen kilometers away. It is safe to say that with suitable equipment, some individuals will be inclined to delegate their energy consumption optimization to third-party companies.

The perspectives of intelligent energy management offered by SEAS go beyond the context of the individual. If the inhabitants of a house are away on holiday, couldn’t the energy produced by their solar panels be used to supply the neighborhood, thus taking pressure off a power plant located a hundred kilometers away? Another example: refrigerators operate periodically; they don’t cool constantly, but rather at intervals. In a neighborhood, or a city, it would therefore be possible to intelligently shift the startup time of a group of these appliances to outside peak hours, so that an already heavily-used network can concentrate on supplying the heaters people switch on when they return home from work.

Companies are particularly keen to get these types of services. Load management allows them to temporarily switch off machines that are not essential to their service in exchange for payment from those in charge of this load management. The SEAS architecture incorporates communication security in order to ensure trust between stakeholders. In particular, personal data are decentralized: each party owns their own data and can decide not only to allow a flexibility operator to have access to them, but can also determine their granularity and level of use. “An individual will have no trouble accepting that their refrigerator cools at different times from usual, but not that their television gets cut off while they are watching it,” says Guillaume Habault. “And companies will want to have even more control over whether machines are switched off or on.”

 

Objects that speak the same language

In order to achieve such efficient management of electricity grids, the SEAS project turned to the semantic web expertise of Mines Saint-Étienne. “The semantic web is a set of principles and formalisms that are intended to allow machines to exchange knowledge on the web,” explains Maxime Lefrançois, head researcher in developing the knowledge model for the SEAS project. This knowledge model is the pivotal language that allows objects to be interoperable in the context of energy network management.

“Up to now, each manufacturer had their own way of describing the world, and the machines made by each company evolved in their own worlds. With SEAS, we used the principles and formalisms of the semantic web to provide machines with a vocabulary allowing them to ‘talk energy’, to use open data that exists elsewhere on the web, or to use innovative optimization algorithms on the web,” says the researcher. In other words, SEAS proposes a common language enabling each entity to interpret a given message in the same way. Concretely, this involves giving each object a URL, which can be consulted in order to obtain information about it, in particular to find out what it can do and how to communicate with it. Maxime Lefrançois adds, “We also contributed to the principles and formalisms of the semantic web with a series of projects aimed at making them more accessible to companies and machine designers, so that they could adapt their existing machines and web services to the SEAS model at a lower cost.”
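
As an illustration of this principle, the sketch below uses the Python rdflib library to describe an appliance identified by a URL, using a shared vocabulary that any machine can query. The namespaces and property names are illustrative placeholders, not terms from the published SEAS knowledge model.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/energy#")   # hypothetical shared vocabulary
HOME = Namespace("http://example.org/home/")   # hypothetical device URLs

g = Graph()
fridge = HOME["fridge-42"]
g.add((fridge, RDF.type, EX.Appliance))
g.add((fridge, EX.consumesPower, Literal(120, datatype=XSD.integer)))  # watts
g.add((fridge, EX.canShiftStart, Literal(True)))

# Any party that knows the vocabulary can query the description,
# e.g. to find appliances whose start-up can be shifted off-peak.
for appliance in g.subjects(EX.canShiftStart, Literal(True)):
    print(appliance)

# The same description can be exchanged as Turtle, JSON-LD, etc.
print(g.serialize(format="turtle"))
```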

Returning to a previous example, using this extension of the web makes it possible to adapt two refrigerators of different brands so that they can communicate, agree on the way they operate, and avoid creating a consumption peak by starting up at the same time. In terms of services, this will allow flexibility operators to create solutions without being limited by the languages specific to each brand. As for manufacturers, it is an opportunity for them to offer household energy management solutions that go beyond simple appliances.

Thanks to the semantic web, communication between machines can be more easily automated, improving the energy management service proposed to the customer. “All these projects point to a large-scale deployment,” says Maxime Lefrançois. Different levels of management can thus be envisioned. Firstly, at the household level, to coordinate appliances. Next, at the neighborhood level, to redistribute the energy produced by each individual according to their neighbors’ needs. Finally, on a regional or even national scale, to coordinate load management for overall consumption, relieving networks during extremely cold spells, for example. The SEAS project could therefore change things on many levels, offering new modes of more responsible energy consumption.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 


 


SEAS wins an “ITEA Award of Excellence for Innovation and Business impact”

Coordinated by Engie, with IMT as one of its main academic partners, SEAS won an award of excellence on May 11 at the Digital Innovation Forum 2017 in Amsterdam. This award recognizes the relevance of the innovation in terms of its impact on industry.



Smart cities: “it is only through multidisciplinary research that we can rise to these challenges”

The smart city is becoming an increasingly tangible reality for citizens in urban areas, with the efforts made to improve mobility and energy management being obvious examples. But are more efficient transport and optimized energy consumption sufficient to define a smart city? A member of the jury for the international Le Monde Smart Cities prizes, which will be awarded in Singapore on June 2, Deputy President of IMT Francis Jutand explained to us why smart cities must be considered in a general and systemic way.

 

Is it possible to reduce smart cities to cities with lower energy consumption?

Francis Jutand: Definitely not. The impact of digital technology on cities goes far beyond energy-saving issues, even if this is an important aspect of it. Of course, it allows smart technology to be used in energy monitoring for buildings and vehicles, but digital technology also plays an important role in managing mobility and interactions. For example, it eliminates the need for physical transport by allowing for telecommuting, coworking and exchanges of information in general. It could even allow for a more adaptive organization of mobility, although there is a long way still to go in this matter.

 

What do you mean by more adaptive organization?

FJ: One of the problems affecting cities is congestion linked to peaks in traffic. Managing congestion is a tricky systemic challenge which has to combine a number of solutions, such as organization of work, staggered management of office opening hours, proaction and dynamic reaction. There is a whole organizational infrastructure to be established, to which digital technology can contribute.

 

Besides the digitization of services, will smart cities also be a source of apprehension for citizens?

FJ: Digital technology allows us to provide new functionalities. Everyone experiences digital technology and its services and perceives a certain number of obvious advantages. Digital technology also concerns future problems to be resolved. In the case of digital cities, one of the most interesting ones is anticipating their growing complexity. Infrastructures are being digitized and can be interfaced. Humans are benefitting from increased capacities for interaction, while autonomous entities are being developed (such as autonomous cars) that incorporate intelligent elements with a high capacity for interaction with infrastructures. There therefore needs to be efficient management of exchanges between agents, humans and infrastructures.

 

Is digital technology the only field that must be addressed when considering the city of the future?

FJ: Smart and sustainable cities (I always add the word “sustainable”, because it is vital) must be considered from several perspectives. In terms of research, the subjects concerned are digital technology and big data, of course, but also supply chains, air quality, social and economic impacts, etc. It is only through multidisciplinary research that we can truly rise to these challenges. This is what we try to do at Institut Mines-Télécom, with schools that are very active in their area and involved in local projects linked to smart cities. In addition to their strength in research, they are an important lever for innovation for designing products and services linked to smart and sustainable cities, particularly by fostering entrepreneurship among their students.

 

If digital technology is not the only subject of reflection for cities of the future, why does it seem to be an ever-present topic of discussion?

FJ: In the current period, the technologies that increase our capacities are digital technologies. They lead to the most innovation. They are used not only for automation, but also for developing interactions and providing algorithmic intelligence and autonomy in different products and services. Interaction implies connection. I would add that it is also necessary to secure transactions, both in terms of the reliability of operations and the prevention of malicious actions. Today, digital technology is a driving force as well as a guide, but the unique thing about it is that it comes out in waves. It is therefore necessary to combine short- and long-term views of its impact and work on creativity and innovation. This is why openness and accessibility of data are important points.

 

Is a smart city necessarily one in which all data is open?

FJ: The debate on this matter is too often caricatured and reduced to the question of “should data be open or not?”. In reality, the debate plays out on a different level. Data is not static, and needs vary. There is a cost to supplying raw data. An extreme position in favor of complete openness would very quickly become financially impossible, and it would be difficult to produce the new data we need. Besides this, there is the issue of data enrichment: we must be able to encourage common-good approaches in which any citizen can work on the data, as well as commercial approaches for developing new services. The balance is hard to find, and will probably depend on the characteristics of each city.

 

You mentioned the cost of digital technology and development, and its energy impact. If local governments can’t bear the entire cost, how can we guarantee homogeneous development within a city or between cities?

FJ: First of all, it’s true that there are sometimes concerns about the idea that digital technology itself consumes a lot of energy. We must remember that, for the moment, the proportion of a city’s overall energy consumption accounted for by digital technology is very small compared with buildings and transport. Secondly, given that local governments can’t bear the full cost, it is not inconceivable that private-sector initiatives will generate differences within a city or between cities. It is extremely difficult to plan the homogenization of cities; nor is it desirable, because they are living, and therefore evolving, entities.

The most likely outcome is that sustainable smart cities will develop district by district, with purely private offerings that will naturally be selective because they target solvent markets, but which will also leave room for equally welcome civic initiatives. The whole process will be regulated by local government. But this is something we are used to: it is typically the case with fiber optic broadband and its roll-out. In any case, it is essential to make public policies clear. If we don’t make them clear, people may react by adopting a defensive, precautionary position and refusing the development of smart cities. For now, this is not the case, and many cities such as Lyon, Rennes, Bordeaux, Nice, Montpellier, Grenoble, Paris and Nantes are determinedly tackling the problem.

 

Could the rise of connected cities lead to the development of new networks between megacities?

FJ: Megacities are increasingly powerful economic entities all over the world. A general expansion of the economic power of cities is also taking place. There are elements of an economic impetus which could lead to shared forms of mutualization or innovation that go much further than previous twinning projects, or even competition. It is therefore likely that economic competition between nations will move toward competition between megacities and the areas that support them.

 



What is space telecommunication? A look at the ISS case

Laurent Franck is a space telecommunications researcher at IMT Atlantique. These communication systems are what enable us to exchange information with far-away objects (satellites, probes…). These systems also enable us to communicate with the International Space Station (ISS). This is a special and unusual case compared to the better-known example of satellite television. The researcher explains how these exchanges between Earth and outer space take place.

 

Since leaving for the ISS in November 2016, Thomas Pesquet has continued to delight the world with his photos of our Earth as seen from the sky. It is a beautiful way to demystify life in space and make this profession, one that fascinates both young and old, more accessible. We have thus been able to see that the members of Expedition 51 aboard the ISS are far from lost in space. On the contrary, Thomas Pesquet was able to cheer on the France national rugby union team on a television screen and communicate live with children from different French schools (most recently on February 23, in the Gard department). And you too can follow this ISS adventure live whenever you want. But how is this possible? To shed some light on this issue, we met with Laurent Franck, a researcher in space telecommunications at IMT Atlantique.

 

What is the ISS and what is its purpose?

Laurent Franck: The ISS is a manned international space station. It accommodates international teams from the United States, Russia, Japan, Europe and Canada. It is a scientific base that enables scientific and technological experiments to be carried out in the space environment. The ISS is situated approximately 400 kilometers above the Earth’s surface. But it is not stationary in the sky: at this altitude, the laws of physics make an orbiting object go around the Earth faster than the Earth itself rotates. It therefore follows a circular orbit around our planet at a speed of 28,000 kilometers per hour, enabling it to orbit the Earth in 93 minutes.
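
The figures quoted here can be checked with the standard formula for a circular orbit, T = 2π√(a³/μ). The short Python sketch below is an illustration we added, not part of the interview, using textbook values for the Earth's radius and gravitational parameter.

```python
# Back-of-the-envelope check of the altitude, period and speed quoted above.
import math

MU_EARTH = 398_600.0   # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_371.0      # mean Earth radius, km
altitude = 400.0       # ISS altitude, km

a = R_EARTH + altitude                                # orbital radius, km
period = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)     # seconds
speed = 2 * math.pi * a / period                      # km/s

print(f"Orbital period: {period / 60:.0f} minutes")   # ~92-93 minutes
print(f"Orbital speed:  {speed * 3600:,.0f} km/h")    # ~27,600 km/h, i.e. about 28,000
```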

 

How can we communicate with the ISS?

LF: Not by wire, that’s for sure! We can communicate directly, meaning between a specific point on Earth and the space station. To do this, the station must be visible above us. We can get around this constraint by going through an intermediary. One or several satellites situated at a higher altitude can then be used as relays. The radio wave goes from the Earth to the relay satellite, and then to the space station, or vice versa. It is all quite an exercise in geometry. There are approximately ten American relay satellites in orbit. They are called TDRS (Tracking and Data Relay Satellite). Europe has a similar system called EDRS (European Data Relay System).

 

Why are these satellites located at a higher altitude than that of the space station?

LF: Let’s take a simple analogy. I take a flashlight and shine it on the ground. I can see a ring of light on the ground. If I raise the flashlight higher off the ground, this circle gets bigger. This spot of light represents the communication coverage between the ground and the object in the air. The ISS is close to the Earth’s surface, so it only covers a small part of the Earth, and this coverage is moving. Conversely, if I take a geostationary satellite at an altitude of 36,000 kilometers, the coverage is greater and corresponds to a fixed area on the Earth. Not only are few satellites required in order to cover the Earth’s surface, but the ISS can also communicate continuously, via the geostationary satellite, with a ground station that is also located within this area of coverage. Thanks to this system, only three or four ground stations are required to communicate permanently with the ISS.
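
The flashlight analogy can be put into rough numbers: for a satellite at altitude h above a spherical Earth of radius R, the visible patch extends out to a central angle of arccos(R/(R+h)). The sketch below is an illustration we added; it ignores minimum elevation constraints, so the figures are upper bounds.

```python
# How large a patch of Earth can a satellite "see" from a given altitude?
import math

R_EARTH = 6_371.0  # km

def ground_coverage_radius_km(altitude_km):
    """Great-circle radius of the area visible from a satellite at this altitude."""
    central_angle = math.acos(R_EARTH / (R_EARTH + altitude_km))  # radians
    return R_EARTH * central_angle

print(f"ISS (400 km):    ~{ground_coverage_radius_km(400):,.0f} km radius")     # ~2,200 km
print(f"GEO (36,000 km): ~{ground_coverage_radius_km(36_000):,.0f} km radius")  # ~9,000 km
```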

 

Is live communication with the ISS truly live?

LF: There is a slight time lag, for two reasons. First, there is the time the signal takes to physically travel from point A to point B. This time is governed by the speed of light. It takes about 125 milliseconds to reach a geostationary satellite (whether for television or for relay satellites). We then must add the distance between the satellite and the ISS. This results in a travel time that is incompressible, since it is physical, of a little over a quarter of a second, or half a second to travel there and back. This first time lag is easily observable when we watch the news on television: the studio asks a question and the reporter on the ground seems to wait before answering, due to the time needed to receive the question via satellite and send the reply!
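
A rough calculation, added here for illustration, shows where this quarter of a second comes from. The distances used are nominal nadir values; real slant ranges between the ground station, the relay and the ISS are somewhat longer, which pushes the one-way delay to a little over a quarter of a second, as the researcher notes.

```python
# Propagation delay over a ground -> geostationary relay -> ISS path.
SPEED_OF_LIGHT_KM_S = 299_792.458

ground_to_geo_km = 36_000        # ground station up to the geostationary relay
geo_to_iss_km = 36_000 - 400     # relay back down to the ISS, order of magnitude

one_way_s = (ground_to_geo_km + geo_to_iss_km) / SPEED_OF_LIGHT_KM_S
print(f"One-way propagation: ~{one_way_s:.2f} s")      # around a quarter of a second
print(f"Round trip:          ~{2 * one_way_s:.2f} s")  # roughly half a second
```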

Secondly, there is a processing time, since the information travels through telecommunications equipment. This equipment cannot process the information at the speed of light. Sometimes the information is stored temporarily to accommodate the processor speed. It’s like when I have to wait in line at a counter. There’s the time the employee at the counter takes to do their job, plus the wait time due to all the people in line in front of me. This time can quickly add up.

We can exchange any kind of information with the ISS. Voice and image, of course, as well as telemetry data. This is the information a spacecraft sends to the earth to communicate its state of health. Included in this information is the station’s position, the data from the experiments carried out on board, etc.

 

What are the main difficulties that space telecommunications systems experience?

LF: The major difficulty is linked to the fact that we must communicate with objects that are very far away and have limited electrical transmission power. We record these constraints in a link budget. This involves several phenomena. The first is that the farther away we communicate, the more energy is lost. With distance, the energy is dispersed like a spray. The second phenomenon involved in this budget is that the quality of communication depends on the amount of energy received at the destination. We ask: out of one million bits that are transmitted, how many arrive in error at the destination? Finally, the last point is the data rate that is possible for the communication. This also depends on the amount of energy invested in the communication. We often adjust the data rate to obtain a certain level of quality. It all depends on the amount of energy available for transmission. This is limited aboard the ISS, since it is powered by solar panels and sometimes travels in the Earth’s shadow. The relay satellites have the same constraints.
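
To give a feel for what goes into such a link budget, here is a simplified sketch using the standard free-space path loss formula. The transmit power, frequency and antenna gains are arbitrary illustrative values, not actual ISS or TDRS parameters.

```python
# Simplified link budget: transmit power plus antenna gains minus path loss.
import math

def free_space_path_loss_db(distance_km, frequency_ghz):
    """Standard free-space path loss: 20*log10(d_km) + 20*log10(f_GHz) + 92.45 dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_ghz) + 92.45

def received_power_dbw(tx_power_dbw, tx_gain_dbi, rx_gain_dbi, distance_km, frequency_ghz):
    """Received power at the destination, ignoring atmospheric and pointing losses."""
    return tx_power_dbw + tx_gain_dbi + rx_gain_dbi - free_space_path_loss_db(
        distance_km, frequency_ghz)

# Illustrative only: a 10 W (10 dBW) transmitter over a 36,000 km geostationary
# link at 14 GHz, with 40 dBi and 30 dBi antennas.
print(f"Path loss:      {free_space_path_loss_db(36_000, 14):.1f} dB")
print(f"Received power: {received_power_dbw(10, 40, 30, 36_000, 14):.1f} dBW")
```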

 

Is there a risk of interference when the information is travelling through space?

LF: Yes and no, because radio frequency telecommunications are highly regulated. The right to transmit is linked to a maximum frequency and power. It is also regulated in space: we cannot “spill over” into another nearby antenna. For space communications, there are tables that define the maximum amount of energy that we can send outside of the main direction of communication. Below this maximum level, the energy that is sent to a nearby antenna is of course interference, but it will not prevent it from functioning properly.

 

What are the other applications of communications satellites?

LF: They are used for Internet access, telephony, video telephony, the Internet of Things… But what is interesting is what they are not used for: GPS navigation and weather observations, for example. In fact, space missions are traditionally divided into four components: the telecommunications we are discussing here, navigation/positioning, observation, and deep-space exploration like the Voyager probes. Finally, what is fascinating is that in a field as specialized as space, there is an almost infinite number of even more specialized branches.

 


OpenAirInterface: An open platform for establishing the 5G system of the future

In this article, we continue our exploration of the Télécom & Société numérique Carnot institute technological platforms. OpenAirInterface is the platform created by EURECOM to support mobile telecommunication systems like 4G and 5G. Its goal: to develop network access solutions, for both the radio access network and the core network. Its service is based on a software suite developed as open source.

 

The OpenAirInterface platform offers a 4G system built on a set of software programs. These programs can each be tested and modified individually by the user companies, independently of the other programs. The goal is to establish the new features of what will become the 5G network. To find out more, we talked with Christian Bonnet, a communications systems researcher at EURECOM.

 

What is OpenAirInterface?

Christian Bonnet: This name encompasses two aspects. The first is the implementation of the software that makes up a 4G-5G system. This involves software components that run in a mobile terminal, those that implement the radio transmissions, and those that are in the core network.

The second part of OpenAirInterface is an “endowment fund” created by EURECOM at the end of 2014, which is aimed at leading an open and global software Alliance (OSA – OpenAirInterface Software Alliance).

 

How does this software suite work?

CB: The aim is to implement the software components required for a complete 4G system. This involves the modem of a mobile terminal, the software for radio relay stations, as well as the software for the specific routers used for a network core. Therefore, we deal with all of the processes involved in the radio layer (modulation, coding, etc.) of communication protocols. It runs on the Intel x86 processors that are found in PCs and computer clusters. This means that it is compatible with Cloud developments. To install it, you must have a radio card connected to the PC, which serves as the terminal, and a second PC, which serves as a relay station.

Next, depending on what we need to do, we can take only a part of the software implementation. For example, we can use commercial mobile terminals and attach to a network composed of an OpenAirInterface relay and a commercial network core. Any combination is possible. We have therefore established a complete network chain for 4G, which can move towards the 5G network using all of these software programs.

 


 

Who contributes to OpenAirInterface?

CB: Since the Alliance was established, we have had several types of contributors. The primary contributor, to date, has been EURECOM, because its teams are those that developed the initial versions of all the software programs. These teams include research professors, post-doctoral students, and PhD students who can contribute to this platform that provides participants with an experimental environment for their research. In addition, through the software Alliance, we have acquired new kinds of contributors: industrial stakeholders and research laboratories located throughout the world. We have expanded our base, and this openness enables us to receive contributions from both the academic and industrial worlds. (Editor’s note: Orange, TCL and Ercom are strategic OpenAirInterface partners, but the Alliance also includes many associate members, such as Université Pierre et Marie Curie (UPMC), IRT Bcom, INRIA and, of course, IMT. The full list is available here.)

 

What does the Carnot Label represent for your activities?

CB: The Carnot Label was significant in our relationship with the Beijing University of Posts and Telecommunications in China (BUPT), a university specializing in telecommunications. The BUPT asked us to provide a quality label reference that would allow us to demonstrate the recognition of our expertise. The Carnot Label was presented and recognized by the foreign university. This label demonstrates the commitment of OpenAirInterface developments to the industrial world, while also representing a seal of quality that is recognized far beyond the borders of France and Europe.

 

Why do companies and industrial stakeholders contact OpenAirInterface?

CB: To develop innovation projects, industrial stakeholders need advances in scientific research. They come to see us because they are aware of our academic excellence and they also know that we speak the same language. It’s in our DNA! Since its very beginning, EURECOM has embodied the confluence of industry and research; we speak both languages. We have developed our own platforms, we have been confronted with the same issues that industrial stakeholders face on a daily basis. We are therefore positioned as a natural intermediary between these two worlds. We listen attentively to the innovation projects they present.

 

You chose to develop your software suite as open source, why?

CB: It is a well-known model that is beginning to spread. It facilitates access to knowledge and contributions. This software is covered by open source licenses that protect contributors and enable wider dissemination. This acts as a driving force and an accelerator of development and testing, since each software component must be tested. If you multiply the introduction of this software throughout the world, everyone will be able to use it more easily. This enables a greater number of software tests, and therefore increases the amount of feedback from users for improving the existing versions. Therefore, the entire community benefits. This is a very important point, because even in industry, many components are starting to be developed using this model.

 

In addition to this approach, what makes OpenAirInterface unique?

CB: OpenAirInterface has brought innovation to open source software licensing. Many types of open source licenses exist. It is a vast realm, and the industrial world is bound to large patent portfolios. The context is as follows: on the one hand, there are our partners who have industrial structures that rely on revenue from patents and, on the other hand, there is a community who wants free access to software for development purposes. How can this apparent contradiction be resolved?

We have introduced a specific license to protect the software for non-commercial operations (everything related to research, innovation and tests), as for classic open source software. For commercial operations, we have established a patent declaration system. This means that if industrial stakeholders implement their own patented components, they need only indicate this and, for commercial operations, people will therefore contact the rights holders to negotiate. These conditions are known as FRAND (fair, reasonable and non-discriminatory) terms, and reflect the practices industrial players in the field follow with standardization organizations such as 3GPP. In any case, this procedure has been well accepted. This explains why Orange and Nokia (formerly Alcatel-Lucent Bell Labs), convinced by the benefits of this type of software license, are featured among the Alliance’s strategic partners.

 

What is the next development phase for OpenAirInterface?

CB: Several areas of development exist. The projects that are proposed as part of the European H2020 program, for which we are awaiting the results, will allow us to achieve scientific advances and will benefit the software range. The Alliance has also defined major areas for development through joint projects led by both an industrial partner and an academic partner. This type of structure enables us to bring people together from around the world. They volunteer to participate in one of the steps towards achieving 5G.

 

 


The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006. 

Having first received the Carnot label in 2006, the Télécom & Société Numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, EURECOM, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.


 


Strategic Management of Innovation Networks

Suitable for a one- or two-semester course for undergraduate and graduate students, this interdisciplinary textbook explains the diverse aspects of innovation and social networks, which occupy a central place in business and policy agendas. Its unified approach presents networks as nested systems that can span organisations, industries, regions, and markets, giving students a holistic perspective and reducing the amount of effort required to learn the theoretical framework for each layer. With engaging real-world examples, the text also provides a practical guide on how to manage networks to increase innovation and improve performance. Topics covered include forming teams to foster creativity, selecting partners and leveraging partnerships for learning, managing organisational change, and sponsoring technologies in communities. Students will learn the metrics used in social network analysis and how they are interpreted and applied. Suggested reading lists and online resources offer opportunities for further review and practice.

 

Author

Müge Özman is Professor of Management at Télécom École de Management-IMT, Paris. She has participated in major projects funded by the European Union, and worked with a variety of both small and large companies on innovation-related projects.

 

Strategic Management of Innovation Networks
Müge Özman
Cambridge University Press, 2017
360 pages
29.99 £ (paperback)
30.00 USD (eBook)

Order online


Remote sensing explained: from agriculture to major disasters

There are numerous applications for remote sensing, from precision agriculture to helping with the deployment of aid in major disasters. Grégoire Mercier, a researcher at IMT Atlantique, explains the key notions of this remote observation method using examples from his research.

 

Detecting invisible objects from space… this is now possible through remote sensing. This remote observation method is currently used in monitoring air quality in urban areas, monitoring ecosystems, detecting illegal fishing, etc. Its applications are as numerous as the territories that can be covered. Its aim: to spatialize information that would otherwise only be available as localized measurements on the ground.

Over the past few years, Brittany has become a key player in certain remote sensing topics in France and Europe through GIS Bretel and the Vigisat observation platform. “We are going through an interesting period for remote sensing, because we are carrying out missions with an operational purpose,” explains Grégoire Mercier, a researcher at IMT Atlantique who specializes in remote sensing image processing. “We can respond to issues concerning civil society and not just show that we can observe clouds using images,” he adds. With the help of our expert, we will provide an overview of the key elements of remote sensing, from methods to applications.

 

Retracing the history of signals in remote sensing

Grégoire Mercier defines remote sensing as “any remote observation for which electromagnetic radiation is used to study the interaction between waves and matter. Depending on the result, we obtain an understanding of the object which has interacted with the wave”.

Photons are a key element in both spatial and airborne remote sensing. Thousands of them travel through space at the speed of light until they reach the Earth’s atmosphere. At this point, things become more complicated due to the atmosphere itself, clouds and aerosols. The atmosphere is full of obstacles which may prevent photons from reaching the Earth’s surface. For example, when a photon comes into contact with a particle or water droplet, it is partially reflected and/or absorbed and sends new waves out in random directions. If it successfully reaches the ground, what happens next depends on where it lands. Vegetation, oceans, lakes or buildings… the reflected radiation will differ according to the object struck.

Every element has its own spectral signature, which later enables it to be identified on the remote sensing images obtained using an on-board sensor on a satellite, aircraft or drone.

 

Spectral response and remote observations

Every object has a unique signature. “When we observe chlorophyll, lots of things appear in green; we see absorption of red and, a step further on, we observe a very specific response in the near-infrared region,” explains Grégoire Mercier. Observation of these spectral responses indicates that the remotely observed zone is of a vegetal nature. However, these observations are adjusted according to the moisture and the presence of pathosystems (for example, a bacterial disease in the plant). The latter modify the plant’s spectral “ID card”. This is how researchers detect hydric stress or infections before the effects become visible to the naked eye. The process is particularly useful in precision agriculture.
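
The contrast between red absorption and the strong near-infrared response is commonly summarized in a standard vegetation index, the NDVI. The sketch below is a generic illustration of that index, not the specific processing used in the research described here, and the reflectance values are made up.

```python
# NDVI = (NIR - red) / (NIR + red); healthy vegetation scores close to 1.
def ndvi(nir_reflectance, red_reflectance):
    """Normalized Difference Vegetation Index from two band reflectances."""
    return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

# Made-up reflectance values: a healthy canopy absorbs red strongly and
# reflects near-infrared; stressed vegetation shows a weaker contrast.
print(f"Healthy vegetation:  {ndvi(0.50, 0.05):.2f}")   # ~0.82
print(f"Stressed vegetation: {ndvi(0.30, 0.12):.2f}")   # ~0.43
print(f"Bare soil:           {ndvi(0.25, 0.20):.2f}")   # ~0.11
```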

Airborne remote sensing provides information on practices and the evolution of landscapes. “At IMT Atlantique we worked in collaboration with the COSTEL laboratory on the characterization of wetland areas in agriculture. The aim was to create a tool for operational purposes. We were able to prove that the use of hedges helped prevent surface run-off and therefore the pollution of water courses.”

 

Active/passive remote sensing and wavelengths

There are two types of remote sensing depending on the type of sensor used. When we use the sun’s radiation for observation, we talk of passive remote sensing. In these cases, the sensors used are referred to as “optic”. The wavelengths in question (typically between 400 and 2,500 nanometers) allow lenses to be used. “The electromagnetic wave interacts with the molecular energy level on a nanometric scale, which enables us to observe the constituents directly,” explains Grégoire Mercier. This is how the composition of the Earth’s atmosphere can be observed, for example.

But observations are not purely limited to the visible field of the electromagnetic spectrum. The aim is to go beyond the human visual system with observations in the thermal infrared range (up to 5 µm in wavelength) and of microwaves (centimetric or decimetric wavelengths). “When we use wavelengths that are no longer nanometric, but centimetric, the wave/matter interaction with these electromagnetic waves is completely different,” explains Grégoire Mercier.

These interactions are characteristic of radar observations. This time, it is a question of active remote sensing because a wave is emitted toward the surface by the sensor before it receives the response. “For these wavelengths (from 1 centimeter to 1 meter), everything happens as though we were blind and touching the surface with a hand the size of the wavelength. If the surface is flat, we won’t see anything because we won’t feel anything. The texture of an element provides information.” In other words, radar observation of the sea’s surface reveals ripples corresponding to capillary waves. If we look at a lake, on the other hand, nothing can be seen. This helps scientists identify what they are observing.

 

Image processing and applications for large-scale disasters

Grégoire Mercier has made improving sensing methods part of his daily work. “My research is based on operational methods that aim to detect changes with a high level of reliability,” explains Grégoire Mercier. More particularly, the researcher addresses the analysis of image time series. He has collaborated with the CNES on the creation of tools used during major disasters.

Initiated in 2000 by the CNES and ESA, the international charter on “Space and Major Disasters” currently unites 16 space agencies from all over the world. The charter can be activated following a major natural or industrial disaster, depending on the level of severity. “There is always one space agency on watch. When the charter is activated, it must do everything possible to update the map of the affected area,” explains Grégoire Mercier. To obtain this post-disaster map, the space agency requisitions any available satellite, which it uses to map the zone before the deployment of civil protection teams. The objective must generally be achieved within three hours.

“Rapid mapping does not allow you to choose the most suitable sensor or the best perspective. The observation then has to be compared to the one corresponding to the previous situation, which can be found in databases. The problem is that the images will probably not have been produced by the same sensor and will not have the same spatial resolution, so the idea is to implement tools that will facilitate comparison of the two images and the management of heterogeneous data. That’s where we come in,” Grégoire Mercier continues.

Also see the video on I’MTech: Communicating in emergencies and natural disasters

Having exceeded the initial aims of the SPOT satellite (Satellite for Observation of the Earth), remote sensing has sufficiently proven its worth for it to become a vital tool for the observation of territories. The task is now to establish operational image processing methodologies, as proposed by the ANR PHOENIX project.

 


Remote sensing for monitoring the natural evolution of a landscape: the ANR PHOENIX project

Grégoire Mercier is involved in the ANR PHOENIX project. Launched in 2015, the project notably aims to establish reliable remote sensing methodologies which will be used in characterizing the natural evolution of landscapes. In this way, it will be possible to analyze large-scale structures such as alpine glaciers and the Amazonian rainforest at different periods to determine the impact of various types of changes on their evolution. The use of satellite data for monitoring the environment will allow analysis of its current state and forecasting of its future state. Find out more


 


TeraLab, a big data platform with a European vision

TeraLab, an IMT platform aimed at accelerating big data projects by uniting researchers and companies, has held the “Silver i-Space” quality label since December 1st, 2016. This label, awarded by the Big Data Value Association, is a guarantee of the quality of the services the platform provides, both at the technical and legal levels. The label testifies to TeraLab’s relevance in the European big data innovation ecosystem and the platform’s ability to offer cutting-edge solutions. Anne-Sophie Taillandier, the platform’s director, tells us about the reasons for this success and TeraLab’s future projects.

 

What does the “Silver i-Space” label, awarded by the Big Data Value Association (BDVA) on December 1st, mean for you? 

Anne-Sophie Taillandier: This is an important award, because it is a Europe-wide reference. The BDVA is an authoritative body, because it ensures the smooth organization of the public-private partnership on big data established by the European Commission. This label therefore has an impact on our ability to work at the continental level. DG Connect, the branch of the Commission in charge of rolling out a digital single market in Europe, pays particular attention to this: for example, it prefers that H2020 research projects use pre-existing platforms. Therefore, this label provides better visibility within the international innovation ecosystem.

 

In addition to TeraLab, three other platforms have been awarded this label. Does this put you in competition with the other platforms at the European level?

AST: The i-Spaces are complementary, not competitive. With TeraLab, we insist on creating a neutral zone: it is a breath of fresh air in the life cycle of a project, so that people—researchers and companies— can test things peacefully. The complementarity that exists between the platforms enables us to combine our energies. For example, one of the other recipients of the Silver i-Space label in December was SDIL, which is based in Germany and is more focused on industrial solutions. The people who contact the SDIL have already made an industrial choice. The stakeholders who contact TeraLab have not yet made a choice, and want to explore the potential value of their data.

 

How do you explain this recognition by the BDVA?

AST: TeraLab has always sought to be cutting-edge. We offer a platform that is equipped with the latest tools. Since the fourth quarter of 2016, for example, the platform has been equipped with GPUs: processors initially designed for graphics computing, but which are now also used for deep learning applications. The computing optimization they provide makes it possible to speed up the training of algorithms by a factor of fifteen. We can therefore make much more powerful machines available to researchers working in the area of deep learning. Until now, the infrastructure did not allow for this. Generally speaking, if we feel that a project needs specific equipment, we look at whether we can introduce it, and whether it makes sense to do so. This constant updating is also one of the tasks set as part of the Investments for the Future program (PIA) by the Directorate-General for Enterprise.

 

Is it primarily the platform’s technical excellence that has been recognized?

AST: That’s not all: TeraLab provides an infrastructure, but also an ecosystem and a set of services. We assist our partners so that they can make their data available. We also have quality requirements regarding legal aspects. For example, we are working towards obtaining authorization from the Shared Healthcare Information Systems Agency (ASIP) to allow us to store personal health data. From a technical perspective, we have all we need to store this data and work with it. But we need to meet the legal and administrative requirements in order to do this, such as meeting the ISO 27001 standard. We must therefore provide guarantees of the way we receive data from companies, establish contracts, etc.

 

Have the upstream aspects, prior to data processing, also been the focus of innovations?

AST: Yes, because we must constantly ensure consent regarding the data. Our close relationship with IMT researchers is an asset in accomplishing this. Therefore, we have projects on data watermarking, cryptographic segmentation, and the blockchain. Data analysis and extracting value from this data can only be carried out once the entire upstream process is completed. The choice of tools for addressing privacy issues is therefore essential, and we must constantly ensure that we remain cutting-edge in terms of security aspects as well.

 

How are these quality criteria then reflected in the business aspect?

AST: Today, TeraLab is involved in projects that are related to many different themes. We talked about the blockchain, but I could also mention the industry of the future, energy, tourism, health care, insurance, open source issues, interoperability… And, more importantly, since 2016, TeraLab has achieved financial equilibrium: the revenue from projects offsets the operating costs. This is far from trivial, since it means that the platform is sustainable. TeraLab will therefore continue to exist after 2018, the completion date for the PIA that initiated the platform.

 

What are TeraLab’s main objectives for the coming year?

AST: First of all, to strengthen this equilibrium, and continue to ensure TeraLab’s sustainability. To accomplish this, we are currently working on big data projects at the European level to respond to calls for proposals from the European Commission via the H2020 program. Then, once we have obtained the authorization from the ASIP for storing personal health data, we plan to launch health research projects, especially in the area of “P4” medicine: personalized, preventive, predictive, and participatory.

 


Cybersecurity: new challenges to face up to

Computer hacking and sensitive data theft are increasing. Companies are now highly connected and must adapt their security policy to strengthen the protection of their information assets. Hervé Debar, a researcher at Télécom SudParis and expert in cybersecurity, studies the different types of cyber attacks in order to optimize their detection.

 

The era when IT students developed viruses and then enjoyed their “success” based on the number of computers infected across the world is long gone… Today, the lure of money is the primary motivation for hackers and internet fraudsters, and their targets include companies in the French CAC 40 and Operators of Vital Importance (OIV), i.e. those operating in sectors of vital national importance (transport, energy, telecoms, etc.). SMEs and SMIs are also affected by the rise in cyber attacks. They work as subcontractors for client institutions and are therefore targeted by hackers who want to obtain confidential files (R&D programs, lists of suppliers, etc.) or directly infect the main client’s Information System (IS).

 

Highly connected companies

Two factors explain this evolution. The first is the increase in digital data managed and exchanged by companies. Information is at the core of their business activities. Secondly, organizations have become highly connected, with fixed and mobile devices, peripheral devices (printers, cameras, etc.) connected to networks, sensors in production lines, automated systems and the Internet of Things (IoT). “The result is that the economic value of connection is greater than the economic value of protection. We must therefore do the best we can to isolate what has to be isolated but still allow companies to benefit from a certain level of connectivity,” Hervé Debar stresses.

In terms of computer safety, companies have to manage different problems including “the acceptability of security by users. If the recommended solutions are too complicated, they won’t use them and will find a way to get around them. Security must support use, not hinder it”, the expert from Télécom SudParis explains.

 

Complex regulations

To face up to this challenge, companies must be able to manage two major limitations. The first concerns the technical aspect. Services such as BYOD (Bring Your Own Device) or corporate Appstores (portals allowing employees to download selected applications to improve productivity) are being rolled out at a quicker rate than security measures.

The second limitation concerns the human dimension. Regulations on digital technology are very complicated, especially due to the Military Programming Law and obligations imposed by the CNIL (French National Commission on Data Protection and Civil Liberties), and will become even more so in May 2018 with the introduction of the new European regulation on data protection. All companies will have to report personal data breaches, in contrast to the 1978 law currently in force, which only concerns providers of communications services.

These legal constraints require companies to bring in experts who are able to administer and roll out IT security measures on a daily basis and to inform staff members through training.

 


Attacks by computer zombies

DDoS (Distributed Denial of Service) attacks are a company’s worst nightmare. They use a network of thousands of computers or connected devices (often compromised) to interrupt the targeted service or services. There was a major attack of this kind in France between 18 and 21 September 2016, when the servers of OVH, a French web host, were inundated with millions of requests, peaking at as much as one terabit per second. A botnet of around 150,000 IP cameras infected by cyber criminals was behind the attack. Hervé Debar has been studying this type of attack for a number of years along with 6cure – a Normandy-based start-up specializing in traffic cleansing – and in the framework of the NECOMA research project (Nippon-European Cyberdefense-Oriented Multilayer threat Analysis), one of the six FP7 projects financed by the European Commission under the Europe and Japan coordinated call. His team’s work consists of studying the possibilities offered by the mechanisms and functions of the network itself in detecting large-scale DDoS attacks which could saturate a single local defense system. The idea is to identify the attacking flow according to its provenance or technical characteristics in order to differentiate it from the legitimate flow, with the aim of restraining bad traffic to leave more room for “good” traffic.
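
As a toy illustration of the idea of separating attacking flows from legitimate ones, the sketch below throttles sources purely on their request rate. Real traffic-cleansing systems, such as those studied in NECOMA, combine many more characteristics (provenance, protocol behavior, signatures); the threshold and names here are ours, not taken from that project.

```python
# Split one second of observed traffic into allowed and throttled sources
# based on a simple per-source request-rate threshold.
from collections import Counter

REQUESTS_PER_SECOND_LIMIT = 100   # arbitrary threshold for this example

def split_traffic(requests):
    """requests: iterable of (source_ip, timestamp_seconds) tuples observed
    over one second. Returns (allowed_sources, throttled_sources)."""
    per_source = Counter(ip for ip, _ in requests)
    throttled = {ip for ip, count in per_source.items()
                 if count > REQUESTS_PER_SECOND_LIMIT}
    allowed = set(per_source) - throttled
    return allowed, throttled

# Example: one compromised, camera-like source flooding among normal clients.
sample = [("203.0.113.7", t) for t in range(500)] + [("198.51.100.2", 0), ("198.51.100.3", 1)]
allowed, throttled = split_traffic(sample)
print("allowed:", sorted(allowed))
print("throttled:", sorted(throttled))
```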

 

Detecting cyber attacks

It is crucial to detect these attacks as early as possible in order to combat them. Identifying cyber attacks is one of IMT’s principal research topics. “We mainly work on the detection of and protection against distributed denial of service attacks (see insert) and those which more closely target environments on the network side”, explains Hervé Debar. The process has seen several developments since the first work carried out at the start of the 1980s by the American military. It was initially limited to rolling out a few probes. The few alerts raised were dealt with “manually”. Then the number of sensors increased and alerts became more frequent. To manage them efficiently, companies implemented SIEMs (Security Information and Event Management).

“Today, we need to automate part of the reaction so that operators can concentrate on the attacks that are more difficult to deal with. We are heading toward more autonomous and more reactive systems which can protect themselves against attacks. But it remains a complicated matter. Artificial intelligence (my PhD subject) is one possible way to improve our response to a certain number of threats,” explains Hervé Debar. The other option is training staff members to react better to attacks. We speak of a “cyber range”, which allows realistic simulation of cyber attacks and evaluation of cyberdefense tactics. These centers are designed to help operators, OIVs in particular, make the right decisions in response to an attack’s impact.

The professionalization of cybercriminals and the increase in vulnerability, which concerns both IT networks and industry, require greater awareness among all actors, both public and private.

 


A very involved practitioner

Hervé Debar is not a theorist, but a keenly involved expert with 25 years of experience in the world of cybersecurity and R&D. His work includes more than 100 scientific articles on the subject, the coordination of three European projects (WOMBAT, NECOMA and PANOPTESEC) and participation in a number of French and European programs on the subject. His long career in the private sector partially accounts for this considerable activity. An engineer by training, he obtained his PhD before joining Dassault AT (Automation and Telecoms) to work on European research projects. He then left for Switzerland to join IBM where he developed the IDMEF standard and a product called Tivoli Risk Manager, one of the first security information and event management products on the market. Upon returning to France, he joined Orange Labs in Caen and became an emeritus expert and leader of research activities on security. He then brought his considerable experience in the private sector to Télécom SudParis, where he has been in charge of the “Networks and Security” department since 2009.

 

FEMTO Engineering: a new component of the TSN Carnot institute

In July 2016, the Ministry for Education and Research renewed the Télécom & Société numérique (TSN) Carnot institute accreditation and, at the same time, the institute presented a new component: the FEMTO Engineering Center for Technological Development. We interviewed Laurent Larger, Director of the FEMTO-ST laboratory, who told us more about the identity of this new component of the TSN Carnot institute and about its domains of expertise, which may be of interest to companies for their partnership-based research projects.

 

What is the difference between FEMTO-ST and FEMTO Engineering?

Laurent Larger: FEMTO Engineering was created in 2013 by FEMTO-ST and is associated with the FC’Innov foundation (Franche-Comté Innovation). FEMTO Engineering and the FEMTO-ST laboratory are strategically close: researchers in the two structures share the same building, their research overlaps, and the experts at FEMTO Engineering can draw on the advice of those in the laboratory. FEMTO Engineering is devoted to promoting and transferring the technology developed at FEMTO-ST. The advantage of having a structure like FEMTO Engineering is that it constitutes a more attractive partner for industry and SMEs.

 

What are FEMTO Engineering’s fields of work?

LL: In principle, all the research topics in which FEMTO-ST has expertise. FEMTO-ST has seven main departments: automation and micro-robotics; IT; energy; applied mechanics; micro-nano-science and systems; optics; time and frequency analysis (metrology of time, high-end radiofrequency and microwave sources). In practice, we have identified a certain number of topics which meet the requirements of a Carnot institute in terms of promotion and technology transfer or development.


Laurent Larger, Director of the FEMTO-ST institute

Let’s take energy, for example. FEMTO-ST has internationally recognized expertise in the field of fuel cell systems, and FEMTO Engineering rolls out this expertise in the context of renewable energy using hydrogen. We are also currently working with a vehicle manufacturer on Stirling engines using heat-recovery techniques. FEMTO Engineering is involved in the biomedical field as well: work is under way on biochips for identifying living species, which allows us, for example, to propose new solutions for proteomic analysis on gold and silica chips using SPR (surface plasmon resonance).

 

What are FEMTO Engineering’s strengths?

LL: FEMTO Engineering’s strengths stem from FEMTO-ST’s technological expertise in high-tech components and systems. As examples, let’s take two technologies based on FEMTO-ST’s proprietary expertise.

Femtosecond laser machining is an activity based on expertise in shaping a femtosecond laser beam. It allows nano-machining with precise control in both time and space. In other words, it can drill nanoscale holes in materials (glass, silicon, diamond, etc.) at unprecedented speeds and with extremely high aspect ratios. The method can be used for ultra-fast cutting or for making nanoscale holes to structure thin films. Last but not least, it can be used for surface texturing, which changes the adhesion properties of a surface to make it hydrophilic or hydrophobic, for example.

Another example of FEMTO Engineering’s strengths is its expertise in the field of time and frequency metrology. We have designed a cryogenic sapphire oscillator at the international state of the art, which has set new records for frequency stability. These oscillators provide reference frequencies for calibrating observation systems and have already been used by ESA (the European Space Agency). This expertise also has applications in the field of radiofrequencies and microwaves.
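For readers wondering what “stability” means here: oscillator frequency stability is conventionally quantified with the Allan deviation. The sketch below is a minimal, non-overlapping estimator given for illustration only; it is not FEMTO-ST’s metrology code, and real analyses typically use overlapping estimators with proper confidence intervals.

```python
import math

def allan_deviation(y, m):
    """y: fractional frequency samples taken at a fixed base interval tau0.
    m: averaging factor, so the deviation is evaluated at tau = m * tau0."""
    # Average the samples in consecutive, non-overlapping blocks of length m.
    blocks = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
    if len(blocks) < 2:
        raise ValueError("not enough data for this averaging factor")
    # Allan variance: half the mean squared difference of adjacent block averages.
    diffs = [(blocks[i + 1] - blocks[i]) ** 2 for i in range(len(blocks) - 1)]
    return math.sqrt(sum(diffs) / (2 * (len(blocks) - 1)))
```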

More broadly speaking, our very close links with FEMTO-ST enable FEMTO Engineering to rely on a vast range of experimental knowledge and high-tech facilities organized around platforms, notably MIMENTO, our micro- and nano-manufacturing facility (a member of the French CNRS RENATECH network).

 


Manufacturing a resin mask. The mask will be used to create patterns on substrate materials, for example for making metallic electrodes or for etching. Photo credits: CC Ludovic Godard – UFC

 

What was the reason for wanting to join the TSN Carnot institute, what are your expectations and how will you benefit from it?

LL: We were already familiar with the Carnot institute environment, having belonged to it between 2006 and 2011, and we wanted to return in order to benefit from new opportunities for promotion and transfer. In this respect, the deciding factor with the TSN Carnot institute was its solidity: of all the Carnot institutes, it is very well positioned in terms of level of activity. This standing appealed to us, and we plan on contributing to it!

In terms of expertise, it is the Carnot institute most complementary to our technology offering. Thanks to this complementarity, we can open up new opportunities in terms of promotion and industrial relations and partnerships. Another important thing to mention is our location: we are on the border with Switzerland, with which we have a great deal of international interaction, and in 2015 this generated more than 40% of our turnover from abroad. The trend strengthened in 2016, when that turnover doubled.

 

Will FEMTO Engineering evolve after joining the TSN Carnot institute?

LL: For the moment, the workforce at FEMTO Engineering stands at 14 people, compared with 800 in the laboratory. This ratio provides significant room for growth, particularly in relation to the pool of expertise within FEMTO-ST. We benefit from this interaction with the laboratory, which offers us plenty of other topics that could be rolled out at FEMTO Engineering. The people with this expertise are mainly former PhD students and engineers from the laboratory. The concept is to offer members of industry the best possible interface to facilitate their innovation projects. Joining the TSN Carnot institute is an opportunity for us to grow and develop new activities with the business world.

 

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

 

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and FEMTO Engineering.[/box]