
Researchers, the cyber-ramparts of critical infrastructures

Cyber protection for nuclear power stations or banks cannot be approached in the same way as for other structures. Frédéric Cuppens is a researcher at IMT Atlantique and leader of the Chair on Cybersecurity of Critical Infrastructures. He explains his work in protecting operators whose smooth operation is vital for the country. His Chair was officially inaugurated on 11 January 2017, strengthening state-of-the-art research on cyberdefense.

 

The IMT chair you lead addresses the cybersecurity of critical infrastructures. What type of infrastructures are considered to be critical?

Frédéric Cuppens: Infrastructures which allow the country to operate correctly. If they are attacked, the resulting failure could place the population at risk or seriously disrupt services that are essential to citizens. The operators of these infrastructures work in a wide variety of domains, a diversity reflected in our Chair's industrial partners: energy generation and distribution (EDF), telecommunications (Orange and Nokia) and defense (Airbus Defence and Space). There are also sectors which are perhaps less obvious at first but just as important, such as banking and logistics, and here we are working with Société Générale, BNP Paribas and La Poste[1].

Also read on I’MTech: The Cybersecurity of Critical Infrastructures Chair welcomes new partner Société Générale

 

The Chair on Cybersecurity of Critical Infrastructures is relatively recent, but these operators did not wait until then to protect their IT systems. Why are they turning to researchers now?

FC: The difference now is that more and more industrial controllers and sensors in these infrastructures are connected to the internet. This increases the vulnerability of IT systems and the severity of the potential consequences. In the past, an attack on these systems could crash internal services or slow production down slightly; now there is a danger of major failures that could put human lives at risk.

 

How could that happen in concrete terms?

FC: Because these controllers are connected to the internet, an attacker could quite conceivably take control of a robot holding an item and tell it to drop it on someone. The risk is even greater if the controller handles explosive chemical substances. Another example is an attack on a supervision system: the intruder could see everything taking place and send false information. Combine the two and the danger is very great: an attacker could take control of whatever they want while blinding the supervision system, making it impossible for staff to react.

 

How do you explain the vulnerability of these systems?

FC: Systems in traditional infrastructures such as these were computerized a long time ago. At the time, they were isolated from an IT point of view. Since they were never designed to be connected to the internet, their security now has to be brought up to date. Even today, cameras or controllers can carry vulnerabilities, because their primary function is to film or handle objects, not to resist every possible kind of attack. This is why our role is first and foremost to detect and understand the vulnerabilities of these devices according to their use cases.

 

Are use cases important even for the security of a single system?

FC: Of course, and measuring the impact of an attack according to an IT system's environment is at the core of our second research focus. We develop suitable metrics to identify the direct and potential consequences of an attack, and these metrics will obviously take different values depending on whether the attacked controller is on a military ship or in a nuclear power station.

 

In this case, can your work with each partner be reproduced for protecting other similar infrastructures, or is it specific to each case?

FC: There are only a limited number of controller manufacturers for critical applications: there must be four or five major suppliers in the world. The use cases do of course affect the impact of an intrusion, but the vulnerabilities remain the same, so part of what we do can be reproduced. On the other hand, we have to be specific when it comes to measuring impact. Research funding bodies take the same line: projects under the French Investments for the Future program and the European H2020 program strongly encourage us to work on specific use cases. That said, we still sometimes address more general topics that are not tied to a particular use case.

 

The new topics that the Chair plans to address include a project called Cybercop 3D for visualizing attacks in 3D. It seems a rather unusual concept at first sight.

FC: The idea is to improve monitoring tools, which currently look like a spreadsheet with colored lines to help visualize data on the state of the IT system. We could use 3D technology to let computer engineers view, in real time, a model of the places where intrusions are taking place, and to make correlations between events more visible. It would also give a better understanding of attack scenarios, which are currently presented as 2D tree views and quickly become unreadable; 3D could improve their readability.
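
To see why flat tree views become hard to read, here is a minimal Python sketch of an attack scenario stored as a nested tree. The scenario steps are invented for illustration, and this is in no way the Chair's Cybercop 3D tooling.

```python
# Illustrative only: an attack scenario modeled as a nested tree of steps,
# printed as the kind of flat 2D listing that becomes hard to read once
# scenarios grow.
attack_scenario = {
    "Take control of an industrial controller": [
        {"Gain network access": [
            {"Phish an operator workstation": []},
            {"Exploit an exposed remote-maintenance port": []},
        ]},
        {"Blind the supervision system": [
            {"Send falsified sensor readings": []},
        ]},
    ]
}

def print_tree(node: dict, depth: int = 0) -> None:
    """Print each attack step indented under its parent step."""
    for step, children in node.items():
        print("  " * depth + "- " + step)
        for child in children:
            print_tree(child, depth + 1)

print_tree(attack_scenario)
```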

 

The issue at hand is therefore to improve the human experience in responding to attacks. How important is this human factor?

FC: It is vital. As it happens, we are planning to launch a research topic on this subject by appointing a researcher specializing in qualitative psychology. This will be a cross-cutting topic, but it will above all complement our third focus, which develops decision-support tools to give the best possible advice to the people in charge of rolling out countermeasures in the event of an attack. The aim is to see whether, from a psychological point of view, the solution proposed to humans by the decision-support tool will be interpreted correctly. This is important because, in this environment, staff are used to managing accidental failures and do not necessarily react by thinking it is a cyber attack. We therefore have to make sure that when the decision-support tool proposes something, it is understood correctly. This is all the more important given that the operators of critical systems do not follow a logic of full automation: it is still humans who control what happens.

 

[1] In addition to the operators of critical infrastructures mentioned above, the Chair's partners also include Amossys, a company specializing in cybersecurity expertise, as well as institutional partners: Région Bretagne, the FEDER (European Regional Development Fund), Fondation Télécom and IMT's schools IMT Atlantique, Télécom ParisTech and Télécom SudParis.

 

 

 


Personal data: How the GDPR is changing the game in Europe

The new European regulation on personal data will become officially applicable in May 2018. The regulation, which complements and strengthens a European directive from 1995, guarantees unprecedented rights for citizens, including the right to be forgotten, the right to data portability, and the right to be informed of security failures in the event of a breach involving personal data… But for these measures to be effective, companies in the data sector will have to play their part. They have little time to comply with this new legislation, which for most companies will require major organizational changes; failure to make these changes will expose them to the risk of heavy sanctions.

 

With very little media coverage, the European Union adopted the new General Data Protection Regulation (GDPR) on April 27, 2016. Yet this massive piece of legislation, featuring 99 articles, includes plenty of issues that should arouse the interest of European citizens. Because, starting on May 25, 2018, when the regulation becomes officially applicable in the Member States, users of digital services will acquire new rights: the right to be forgotten, in the form of a right to be dereferenced, an increased consideration of their consent to use or not use their personal data, increased transparency on the use of this data… And the two-year period, from the moment the regulation was adopted to the time of its application, is intended to enable companies to adapt to these new constraints.

However, despite this deferment period, Claire Levallois-Barth, coordinator of the IMT chair Values and policies of personal information (VPIP) assures us that “two years is a very short period”. The legal researcher bases this observation on the work she has carried out among the companies she interviewed. Like many stakeholders in the world of digital technology, they find themselves facing new concepts introduced by the GDPR. Starting in 2018, for example, they must ensure their customers’ right to data portability. Practically speaking, each user of a digital service will have the option of taking his or her personal data to a competitor, and vice versa.


Claire Levallois-Barth, coordinator of the Values and Policies of Personal Information (VPIP) chair

Two years does not seem very long for establishing structures that will enable customers to exercise this right to data portability. Because, although the regulation intends to ensure this possibility, it does not set concrete procedures for accomplishing this: “therefore, it is first necessary to understand what is meant, in practical terms, by a company ensuring its customers’ right to data portability, and then define the changes that must be made, not only in technical terms, but also in organizational terms, including the revision of current procedures and even the creation of new procedures,” explains Claire Levallois-Barth.

The “privacy by design” concept, which is at the very heart of the GDPR and symbolizes this new way of thinking about personal data protection in Europe, is just as constraining for organizations. It requires all of the principles that govern the use of personal data (purpose, proportionality, duration of data storage, transparency…) to be integrated in advance, beginning at the design phase of a product or service. Furthermore, the regulation is now based on the principle of accountability, which implies that the company itself must be able to prove that it respects the legislation by keeping up-to-date proof of its compliance. The design phases of products and services, as well as production and usage procedures, must therefore be revised in order to establish internal governance procedures for personal data. According to Claire Levallois-Barth, “for the most conscientious companies, the first components of this new governance were presented to the executive committee before the summer of 2016.”

 

Being informed before being ready

While some companies are in a race against time, others are facing problems that are harder to overcome. During the VPIP Chair Day held last November 25th, dedicated to the Internet of things, Yann Padova, the Commissioner specializing in personal data protection at the French Energy Regulatory Commission (CRE), warned that “certain companies do not yet know how to implement the new GDPR regulations.” Not all companies have access to the skills required for targeting the organizational levers that must be established.

For example, the GDPR mentions the requirement, in certain cases, for a company that collects or processes users’ data, to name a Data Protection Officer (DPO). This expert will have the role of advising the data controller—in other words, the company—to ensure that it respects the new European regulation. But depending on the organization of major groups, some SMEs will only play a subcontracting role in data processing: must they also be prepared to name a DPO? The companies are therefore faced with the necessity of quickly responding to many questions, and clear-cut answers do not always exist. And another reality is even more problematic: some companies are not at all informed of the contents of the GDPR.


Yann Padova, CRE Commissioner

Yann Padova points out that before they can be ready, companies must be aware of the challenges. Yet he recognizes that he “does not see many government actions in France that explain the coming regulations.” Joining him to discuss this subject on November 25, lawyer Denise Lebeau-Marianna—in charge of personal data protection matters at the law firm of Baker & McKenzie—confirmed this lack of information, and not only in France. She cited a study on companies’ readiness for the GDPR that was carried out by Dimensional Research and published in September 2016. Out of 821 IT engineers and company directors in the data sector, 31% had heard about the GDPR, but were not familiar with its contents, and 18% had never heard of it.

 

Without sufficient preparation, companies will face risks… and sanctions

For Claire Levallois-Barth, it seems obvious that with all of these limits, not all companies will comply with all aspects of the GDPR by 2018. So, what will happen then? “The GDPR encourages companies to implement protection measures that correspond to the risk level their personal data processing activities present. It is therefore up to companies to quantify and assess this risk. They then must eliminate, or at least reduce the risks in some areas, bearing in mind that the number of data processing operations is in the tens or even hundreds for some companies,” she explains. What will these areas be? That depends on each company, what it offers its users and its ability to adapt within two years.

And if these companies are not able to comply with the regulations in time, they will be subject to potential sanctions. One of the key points of the GDPR is an increase in fines for digital technology stakeholders that do not comply with their obligations, especially regarding user rights. In France, the CNIL could previously impose a maximum penalty of €150,000, before the Law for a Digital Republic increased this amount to €3 million. But the GDPR, a European regulation with direct application, will supersede this part of French law in May 2018, allowing penalties of up to €20 million or 4% of a company's total annual worldwide turnover, whichever is higher.
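
As a simple arithmetic illustration of the new ceiling (the higher of €20 million or 4% of worldwide turnover), and certainly not legal advice:

```python
def gdpr_max_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine: the higher of EUR 20 million or 4% of
    total annual worldwide turnover (illustrative calculation only)."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

# A company with EUR 2 billion in turnover risks up to EUR 80 million,
# far above the EUR 3 million ceiling previously set by French law.
print(gdpr_max_fine(2_000_000_000))
```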

The new European Data Protection Board, which will take over from the current Article 29 Working Party (G29), will be in charge of overseeing the consistent application of the regulation. This body, which brings together the European Union's national data protection authorities (the counterparts of France's CNIL), has just published its first three guidelines on points of the regulation that require clarification, including portability and the DPO. This should remove some of the areas of uncertainty surrounding the GDPR, the biggest of which remains the question of the regulation's real, long-term effectiveness.

Because, although in theory the regulation proposed by the EU is aimed at better protecting users’ personal data in our digital environment, and at simplifying administrative procedures, many points still seem unclear. “Until the regulation has come into effect and the European Commission has published the implementing acts presenting the regulation, it will be very difficult to tell whether the protection for citizens will truly be reinforced,” Claire Levallois-Barth concludes.

 

 

Marius Preda


Télécom SudParis | #Augmented reality #Multimedia #3D



Scalinx: Electronics, from one world to another

The product of research carried out by its founder, Hussein Fakhoury, at the Télécom ParisTech laboratories (part of the Télécom & Société numérique Carnot institute), Scalinx is destined to shine as a French gem in the field of electronics. By developing a new generation of analog-to-digital converters, this startup is attracting the attention of stakeholders in strategic fields such as the defense and space sectors. These components are found in all electronic systems that interface analog and digital functions, whose performance depends on the quality of the converters they use.

 

“We live in an analog world, whereas machines exist in a digital world,” Hussein Fakhoury explains. According to this entrepreneur, founder of the startup Scalinx, all electronic systems must therefore feature a component that can transform analog quantities into digital values. “This converter plays a vital role in enabling computers to process information from the real world,” he insists. Why is this? It makes it possible to transform a value that is continually changing over time, like an electrical voltage, into digital data that can be processed by computer systems. And designing this interface is precisely what Hussein Fakhoury's startup specializes in.
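
As a generic illustration of what an analog-to-digital converter does (a textbook model, not Scalinx's proprietary architecture), the sketch below samples a continuously varying voltage and maps each sample to one of a finite number of digital codes:

```python
import math

def quantize(voltage: float, full_scale: float = 1.0, bits: int = 12) -> int:
    """Map an analog voltage in [-full_scale, +full_scale] to one of 2**bits
    integer codes, as any analog-to-digital converter must (generic model)."""
    levels = 2 ** bits
    clipped = max(-full_scale, min(full_scale, voltage))
    return round((clipped + full_scale) / (2 * full_scale) * (levels - 1))

# Sample a 1 kHz sine wave at 100 kHz and digitize each sample to 12 bits.
fs, f, amplitude = 100_000, 1_000, 0.8
samples = [quantize(amplitude * math.sin(2 * math.pi * f * n / fs)) for n in range(100)]
print(samples[:10])
```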

Scalinx develops next-generation analog-to-digital converters. Based on a different architectural approach from that used by its competitors, the components it has developed offer many advantages for applications that require fast digitization. “By using a new electronic design for the structure, we provide a much more compact solution that consumes less energy,” the startup founder explains. However, he points out that the Scalinx interfaces “are not intended to replace the historical architectures in every circumstance, since these historical structures are essential for certain applications.”

Hussein Fakhoury, the founder of Scalinx

These new converters are intended for specific markets in which performance and the efficient use of space are of utmost importance. This is the case in the space electronics, defense and medical imaging sectors. In this last field, a prime example is ultrasound. While today's ultrasound technology shows a fetus in a woman's womb in two dimensions, medical imaging is increasingly moving towards 3D visualization. To transition from 2D to 3D, however, probes must use more converters. With traditional architectures, the heat dissipation would become too great: it would not only damage the probe, but could also inconvenience the patient.

And the obstacles are not only technical; they are also strategic. The quality of an electronic system depends on this analog/digital interface, so it is of utmost importance for high-end systems. Currently, however, “the global leaders for high-performance components in this field are American,” Hussein Fakhoury observes. Trade regulations, as well as issues of sovereignty and confidentiality of use, can therefore be a constraint for European stakeholders in critical areas like the defense sector.

 

A spin-off from Télécom ParisTech set to conquer Europe

Scalinx therefore wants to become a reference in France and Europe for converters intended for applications that cannot sacrifice energy consumption for the sake of performance. For now, the field appears to be open. “Few companies want to take on this strategic market,” the founder explains. The startup's ambition seems to be taking shape: it benefited from two consecutive years of support from Bpifrance as a winner of the national i-Lab business start-up contest in 2015 and 2016, and it also received an honor loan from the Fondation Télécom in 2016.

Scalinx’s level of cutting-edge technology in the key area of analog-digital interfaces can be attributed to the fact that its development took place in an environment conducive to state-of-the-art innovation. Hussein Fakhoury is a former Télécom ParisTech researcher (part of the Télécom & Société numérique Carnot institute), and his company is a spin-off that has been carefully nurtured to maturity. “Already in 2004, when I was working for Philips, I thought the subject of converters was promising, and I began my research work in 2008 to improve my technical knowledge of the subject,” he explains.

Then, between 2008 and the creation of Scalinx in 2015, several partnerships were established with industrial stakeholders, which resulted in the next generation of components that the startup is now developing. NXP (the former Philips branch specializing in semiconductors), France Télécom (now Orange) and Thalès collaborated with the Télécom ParisTech laboratory to develop the technology that is today being used by Scalinx.

With this wealth of expertise, the company is now seeking to develop its business and acquire new customers. Its business model is based on a “design house” model, as Hussein Fakhoury explains: “The customers come to see us with detailed specifications or with a concept, and we produce a turnkey integrated circuit that matches the technical specifications we established together.” This is a concept the founder of Scalinx hopes to further capitalize on as he pursues his ambition of European conquest, an objective he plans to meet over the course of the next five years.

 

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies. The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.[/box]


How the SEAS project is redefining the energy market

The current energy transition has brought with it new modes of energy production and consumption. Coordinated by Engie, the European SEAS project aims to foster these changes in order to create a more responsible energy market. SEAS seeks to invent the future of energy usage by facilitating the integration of new economic stakeholders to redistribute energy, and by increasing the energy management options offered to individuals. These ideas have been made possible through the contributions of researchers from several IMT graduate schools (IMT Atlantique, Mines Saint-Étienne, Télécom ParisTech and Télécom SudParis). Among these contributions, two innovations are supported by IMT Atlantique and Mines Saint-Étienne.

 

“An increasing number of people are installing their own energy production tools, such as solar panels. This breaks with the traditional producer-distributor-consumer energy model.” Redefining the stakeholders in the energy chain, as noted by Guillaume Habault, a computer science researcher at IMT Atlantique, is at the heart of the issue addressed by the Smart Energy Aware Systems (SEAS) project. The project was completed in December, after three years of research as part of the European ITEA program. It brought together 34 partners from 7 countries, including IMT in France. On 11 May, the SEAS project won the ITEA Award of Excellence in acknowledgement of the high quality of its results.

The project is especially promising because it does not only involve individuals wanting to produce their own energy using solar panels. New installations such as wind turbines provide new sources of energy on a local scale. However, this creates complications for stakeholders in the chain, such as network operators: production from these sources is erratic, since it depends on the seasons and the weather. Yet it is important to be able to forecast production in the very short term to ensure that every consumer is supplied. Overestimating the production of a wind farm or a neighborhood equipped with solar panels means risking not having enough energy to cope with a shortfall, and ultimately causing power cuts for residents. “Conversely, underestimating production means having to store or dispatch the surplus energy elsewhere. Poor planning can create problems in the network, and even reduce the lifespan of some equipment,” the researcher warns.

 

An architecture for smart energy grid management

Among the outcomes of the SEAS project is a communication architecture capable of gathering all information from the different production and consumption modes locally, almost in real time. “The ideal goal is to be able to inform the network in 1-hour segments: with this length of time, we avoid collecting excessively precise information about user consumption, while still anticipating cases of over- or under-consumption,” explains Guillaume Habault, the creator of the architecture.

For individuals, SEAS may take the form of an electric device that can transmit information about their consumption and production to their electricity provider. “This type of data will allow people to optimize their power bills,” the researcher explains. “By having perfect knowledge of the local energy production and demand at a given moment, residents will be able to tell if they should store the energy they produce, or redistribute it on the network. With the help of the network, they may also decide what time would be the most economical to recharge their electric car, according to electricity prices, for instance.”
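
As a hedged sketch of the kind of decision described above (with hypothetical hourly prices, not part of the SEAS software), here is how a household device might pick the cheapest window to recharge an electric car:

```python
# Illustrative only: pick the cheapest contiguous window to charge an
# electric car, given hypothetical hourly electricity prices (EUR/kWh).
hourly_prices = [0.18, 0.17, 0.12, 0.10, 0.09, 0.09, 0.11, 0.16,
                 0.21, 0.20, 0.19, 0.18, 0.17, 0.17, 0.18, 0.19,
                 0.22, 0.25, 0.27, 0.26, 0.23, 0.20, 0.19, 0.18]
charge_hours = 4  # hours needed for a full recharge (assumed)

best_start = min(range(24 - charge_hours + 1),
                 key=lambda h: sum(hourly_prices[h:h + charge_hours]))
avg_price = sum(hourly_prices[best_start:best_start + charge_hours]) / charge_hours
print(f"Charge from {best_start}:00 to {best_start + charge_hours}:00 "
      f"(average {avg_price:.3f} EUR/kWh)")
```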

 

 

These data on the current state of a sub-network point to the emergence of new stakeholders, known as “flexibility operators”. Optimizing your consumption by adapting the way you use each appliance in the house requires specific equipment and takes time. While it is easy to predict that energy will be more expensive at times of peak demand, such as in the evenings, it is harder to anticipate the price of electricity according to how strongly the wind is blowing in a wind farm located several dozen kilometers away. It is safe to say that, with suitable equipment, some individuals will be inclined to delegate the optimization of their energy consumption to third-party companies.

The prospects for intelligent energy management offered by SEAS go beyond the individual. If the inhabitants of a house are away on holiday, couldn't the energy produced by their solar panels be used to supply the neighborhood, taking pressure off a power plant located a hundred kilometers away? Another example: refrigerators do not cool constantly, but at intervals. In a neighborhood or a city, it would therefore be possible to intelligently shift the start-up times of a group of these appliances outside peak hours, so that an already heavily-used network can concentrate on the heaters people switch on when they return home from work.
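
A minimal sketch of the staggering idea (purely illustrative scheduling, not the SEAS implementation): spread the periodic start-ups of a group of refrigerators evenly over their cooling cycle so they do not all draw power at the same moment.

```python
def staggered_offsets(n_appliances: int, cycle_minutes: int = 60) -> list[int]:
    """Assign each appliance a different start offset within its cooling cycle,
    so start-ups are spread evenly instead of coinciding during peak hours
    (purely illustrative, not the SEAS implementation)."""
    return [round(i * cycle_minutes / n_appliances) for i in range(n_appliances)]

# Twelve refrigerators with a 60-minute cycle: one compressor starts every 5 minutes.
print(staggered_offsets(12))  # [0, 5, 10, 15, ..., 55]
```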

Companies are particularly keen on these types of services. Load shedding allows them to temporarily switch off machines that are not essential to their activity, in exchange for payment from the party in charge of this load management. The SEAS architecture incorporates communication security in order to ensure trust between stakeholders. In particular, personal data are decentralized: each party owns their own data and can decide not only whether to allow a flexibility operator to access them, but also their granularity and level of use. “An individual will have no trouble accepting that their refrigerator cools at different times from usual, but not that their television gets cut off while they are watching it,” says Guillaume Habault. “And companies will want to have even more control over whether machines are switched off or on.”

 

Objects that speak the same language

In order to achieve such efficient management of electricity grids, the SEAS project turned to the semantic web expertise of Mines Saint-Étienne. “The semantic web is a set of principles and formalisms intended to allow machines to exchange knowledge on the web,” explains Maxime Lefrançois, who led the development of the knowledge model for the SEAS project. This knowledge model is the pivotal language that makes objects interoperable in the context of energy network management.

“Up to now, each manufacturer had their own way of describing the world, and the machines made by each company evolved in their own separate worlds. With SEAS, we used the principles and formalisms of the semantic web to provide machines with a vocabulary allowing them to ‘talk energy’, to use open data that exists elsewhere on the web, or to use innovative optimization algorithms on the web,” says the researcher. In other words, SEAS proposes a common language enabling each entity to interpret a given message in the same way. Concretely, this involves giving each object a URL, which can be consulted in order to obtain information about it, in particular to find out what it can do and how to communicate with it. Maxime Lefrançois adds: “We also contributed to the principles and formalisms of the semantic web themselves, with a series of projects aimed at making them more accessible to companies and machine designers, so that they can adapt their existing machines and web services to the SEAS model at a lower cost.”
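
The sketch below is only a rough illustration of the principle, using the rdflib Python library and a made-up vocabulary (the real SEAS knowledge model defines its own classes and properties): each device gets a URL and is described with shared properties that any other machine understanding the vocabulary can look up and interpret.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

# Hypothetical vocabulary for the example; the actual SEAS ontology
# defines its own terms.
ENERGY = Namespace("http://example.org/energy#")

g = Graph()
fridge = URIRef("http://example.org/devices/fridge-42")  # the device's URL
g.add((fridge, RDF.type, ENERGY.Appliance))
g.add((fridge, ENERGY.ratedPowerWatts, Literal(150, datatype=XSD.integer)))
g.add((fridge, ENERGY.canShiftStartTime, Literal(True)))

# Any machine that shares this vocabulary can read and interpret the description.
print(g.serialize(format="turtle"))
```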

Returning to a previous example, using this extension of the web makes it possible to adapt two refrigerators of different brands so that they can communicate, agree on the way they operate, and avoid creating a consumption peak by starting up at the same time. In terms of services, this will allow flexibility operators to create solutions without being limited by the languages specific to each brand. As for manufacturers, it is an opportunity for them to offer household energy management solutions that go beyond simple appliances.

Thanks to the semantic web, communication between machines can be more easily automated, improving the energy management service proposed to the customer. “All these projects point to a large-scale deployment,” says Maxime Lefrançois. Different levels of management can thus be envisioned. Firstly, for households, for coordinating appliances. Next, for neighborhoods, redistributing the energy produced by each individual according to their neighbors’ needs. Finally on a regional or even national scale, for coordinating load management for overall consumption, relieving networks in cases of extremely cold temperatures, for example. The SEAS project could therefore change things on many levels, offering new modes of more responsible energy consumption.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 


 

[divider style=”normal” top=”20″ bottom=”20″]

SEAS wins an “ITEA Award of Excellence for Innovation and Business impact”

Coordinated by Engie, and with one of its main academic partners being IMT, SEAS won an award of excellence on May 11 at the Digital Innovation Forum 2017 in Amsterdam. This award recognizes the relevance of the innovation in terms of its impact on the industry.

[divider style=”normal” top=”20″ bottom=”20″]


Smart cities: “It is only through multidisciplinary research that we can rise to these challenges”

The smart city is becoming an increasingly tangible reality for citizens in urban areas, with the efforts made to improve mobility and energy management being obvious examples. But are more efficient transport and optimized energy consumption enough to define a smart city? A member of the jury for the international Le Monde Smart Cities awards, to be presented in Singapore on June 2, Francis Jutand, Deputy President of IMT, explained to us why smart cities must be considered in a comprehensive and systemic way.

 

Is it possible to reduce smart cities to cities with lower energy consumption?

Francis Jutand: Definitely not. The impact of digital technology on cities goes far beyond energy-saving issues, even if this is an important aspect. Of course, it allows smart technology to be used to monitor the energy of buildings and vehicles, but digital technology also plays an important role in managing mobility and interactions. For example, it eliminates the need for physical transport by allowing for telecommuting, coworking and exchanges of information in general. It could even allow for a more adaptive organization of mobility, although there is still a long way to go in this area.

 

What do you mean by more adaptive organization?

FJ: One of the problems affecting cities is congestion linked to peaks in traffic. Managing congestion is a tricky systemic challenge which has to combine a number of solutions, such as organization of work, staggered management of office opening hours, proaction and dynamic reaction. There is a whole organizational infrastructure to be established, to which digital technology can contribute.

 

Besides the digitization of services, will smart cities also be a source of apprehension for citizens?

FJ: Digital technology allows us to provide new functionalities. Everyone experiences digital technology and its services and perceives a certain number of obvious advantages. Digital technology is also about future problems to be resolved. In the case of digital cities, one of the most interesting is anticipating their growing complexity. Infrastructures are being digitized and can be interfaced. At the same time, humans are benefiting from increased capacities for interaction, while autonomous entities such as self-driving cars are being developed, incorporating intelligent elements that also have a high capacity for interaction with infrastructures. There therefore needs to be efficient management of the exchanges between agents, humans and infrastructures.

 

Is digital technology the only field that must be addressed when considering the city of the future?

FJ: Smart and sustainable cities — I always add the word “sustainable”, because it is vital — must be considered from several perspectives. In terms of research, the subjects concerned are digital technology and big data, of course, but also supply chains, air quality, social and economic impacts etc. It is only through multidisciplinary research that we can truly rise to these challenges. This is what we try to do at Institut Mines-Télécom, with schools that are very active in their area and involved in local projects linked to smart cities. In addition to their strength in research, they are an important lever for innovation for designing products and services linked to smart and sustainable cities, and more particularly by fostering entrepreneurship through their students.

 

If digital technology is not the only subject of reflection for cities of the future, why does it seem to be an ever-present topic of discussion?

FJ: At the present time, the technologies that increase our capacities are digital technologies. They lead to the most innovation. They are used not only for automation, but also for developing interactions and providing algorithmic intelligence and autonomy in different products and services. Interaction implies connection. I would add that it is also necessary to secure transactions, both in terms of the reliability of operations and the prevention of malicious actions. Today, digital technology is a driving force as well as a guide, but the unique thing about it is that it comes in waves. It is therefore necessary to combine short- and long-term views of its impact and to work on creativity and innovation. This is why the openness and accessibility of data are important points.

 

Is a smart city necessarily one in which all data is open?

FJ: The debate on this matter is too often reduced to a caricature, simplified to the question “should data be open or not?”. In reality, the debate plays out on a different level. Data is not static, and needs vary. There is a cost to supplying raw data: an extreme position in favor of complete openness would very quickly become financially untenable, and it would be difficult to produce the new data we need. Beyond this, there is the issue of data enrichment: we must be able to encourage approaches that treat data as a common good, in which any citizen can work on the data, as well as commercial approaches for developing new services. The balance is hard to find, and will probably depend on the characteristics of each city.

 

You mentioned the cost of digital technology and development, and its energy impact. If local governments can’t bear the entire cost, how can we guarantee homogeneous development within a city or between cities?

 FJ: First of all, it’s true that there are sometimes concerns about the idea that digital technology itself consumes a lot of energy. We must remember that, for the moment, the proportion of a city’s overall energy consumption accounted for by digital technology is very small compared with buildings and transport. Secondly, given that local governments can’t bear the full cost, it is not inconceivable that private-sector-based initiatives will foster and generate differences in the city or between cities. It is extremely difficult to plan the homogenization of cities, nor is it desirable because they are living, and therefore evolving, entities.

The most likely outcome is that sustainable smart cities will develop district by district, with purely private offerings that will inevitably be selective because they target solvent markets, but which will also leave room for equally welcome civic initiatives. The whole process will be regulated by local government. But this is something we are used to: it is typically the case with fiber optic broadband and its roll-out. In any case, it is essential to make public policies clear. If we don't, people may react by adopting a defensive, precautionary position and refusing the development of smart cities. For now, this is not the case, and many cities such as Lyon, Rennes, Bordeaux, Nice, Montpellier, Grenoble, Paris and Nantes are determinedly tackling the problem.

 

Could the rise of connected cities lead to the development of new networks between megacities?

FJ: Megacities are increasingly powerful economic entities all over the world. A general expansion of the economic power of cities is also taking place. There are elements of an economic impetus which could lead to shared forms of mutualization or innovation that go much further than previous twinning projects, or even competition. It is therefore likely that economic competition between nations will move toward competition between megacities and the areas that support them.

 


Véronique Riffault

IMT Nord Europe | #Atmosphere #Environment #Pollution



What is space telecommunication? A look at the ISS case

Laurent Franck is a space telecommunications researcher at IMT Atlantique. These communication systems are what enable us to exchange information with far-away objects (satellites, probes…). These systems also enable us to communicate with the International Space Station (ISS). This is a special and unusual case compared to the better-known example of satellite television. The researcher explains how these exchanges between Earth and outer space take place.

 

Since his departure in November 2016, Thomas Pesquet has continued to delight the world with his photos of our Earth as seen from the sky. It’s a beautiful way to demystify life in space and make this profession—one that fascinates both young and old—more accessible. We were therefore able to see that the members of Expedition 51 aboard the ISS are far from lost in space. On the contrary, Thomas Pesquet was able to cheer on the France national rugby union team on a television screen and communicate live with children from different French schools (most recently on February 23, in the Gard department). And you too can follow this ISS adventure live whenever you want. But how is this possible? To shed some light on this issue, we met with Laurent Franck, a researcher in space telecommunications at IMT Atlantique.

 

What is the ISS and what is its purpose?

Laurent Franck: The ISS is a manned international space station. It hosts international teams from the United States, Russia, Japan, Europe and Canada. It is a scientific base that enables scientific and technological experiments to be carried out in the space environment. The ISS orbits approximately 400 kilometers above the Earth's surface. But it is not stationary in the sky: at this altitude, the laws of orbital motion make an object circle the planet much faster than the Earth rotates. It therefore follows a circular orbit around our planet at a speed of 28,000 kilometers per hour, completing one revolution of the Earth every 93 minutes.
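
The figures quoted (400 km altitude, 28,000 km/h, one orbit in about 93 minutes) follow directly from the formulas for a circular orbit; here is a quick back-of-the-envelope check in Python:

```python
import math

# Back-of-the-envelope check of the ISS figures quoted above.
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000         # mean Earth radius, m
altitude = 400_000          # ISS altitude, m

r = R_EARTH + altitude
speed = math.sqrt(MU_EARTH / r)                       # circular orbital speed, m/s
period = 2 * math.pi * math.sqrt(r ** 3 / MU_EARTH)   # orbital period, s

print(f"{speed * 3.6:,.0f} km/h, {period / 60:.0f} min per orbit")
# About 27,600 km/h and roughly 92-93 minutes, matching the figures above.
```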

 

How can we communicate with the ISS?

LF: Not by wire, that’s for sure! We can communicate directly, meaning between a specific point on Earth and the space station. To do this, it must be visible above us. We can get around this constraint by going through an intermediary. One or several satellites that are situated at a higher elevation can then be used as relays. The radio wave goes from the Earth to the relay satellite, and then to the space station, or vice versa. It is all quite an exercise in geometry. There are approximately ten American relay satellites in orbit. They are called TDRS (Tracking and Data Relay Satellite). Europe has a similar system called EDRS (European Data Relay System).

 

Why are these satellites located at a higher altitude than that of the space station?

LF: Let's take a simple analogy. I take a flashlight and shine it on the ground: I can see a circle of light. If I raise the flashlight higher off the ground, this circle gets bigger. This spot of light represents the communication coverage between the ground and the object in the air. The ISS is close to the Earth's surface, so it only covers a small part of the Earth, and this coverage is constantly moving. Conversely, if I take a geostationary satellite at an altitude of 36,000 kilometers, the coverage is much greater and corresponds to a fixed area on the Earth. Not only are few satellites required to cover the Earth's surface, but the ISS can also communicate continuously, via the geostationary satellite, with a ground station located within this area of coverage. Thanks to this system, only three or four ground stations are required to stay in permanent contact with the ISS.
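
To put rough numbers on the flashlight analogy, here is a simple geometric estimate (horizon-to-horizon visibility, ignoring the minimum elevation angle a real ground station requires) of how the footprint radius grows with altitude:

```python
import math

# Simple geometric estimate of the visibility footprint radius on the ground
# for a spacecraft at a given altitude (horizon-to-horizon, ignoring the
# minimum elevation angle a real ground station would require).
R_EARTH_KM = 6371

def footprint_radius_km(altitude_km: float) -> float:
    half_angle = math.acos(R_EARTH_KM / (R_EARTH_KM + altitude_km))
    return R_EARTH_KM * half_angle  # arc length along the ground

for name, alt in [("ISS (400 km)", 400), ("Geostationary (36,000 km)", 36_000)]:
    print(f"{name}: footprint radius of about {footprint_radius_km(alt):,.0f} km")
```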

 

Is live communication with the ISS truly live?

LF: There is a slight time lag, for two reasons. First, there is the time the signal takes to physically travel from point A to point B, which is set by the speed of light. It takes about 125 milliseconds to reach a geostationary satellite (whether a television satellite or a relay). We then have to add the distance between the satellite and the ISS. This results in an incompressible, purely physical travel time of a little over a quarter of a second, or half a second for the round trip. This first time lag is easily observable when we watch the news on television: the studio asks a question and the reporter in the field seems to wait before answering, because of the time needed to receive the question via satellite and send the reply back!
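
The quarter-second figure is essentially distance divided by the speed of light; here is a minimal check, assuming simplified straight-line distances via a geostationary relay:

```python
# Rough propagation-delay check (simplified straight-line distances).
C_KM_PER_S = 299_792          # speed of light, km/s
ground_to_geo_km = 36_000     # ground station up to the geostationary relay
geo_to_iss_km = 36_000 - 400  # relay back down to the ISS (assumed aligned)

one_way_s = (ground_to_geo_km + geo_to_iss_km) / C_KM_PER_S
print(f"one way: {one_way_s:.2f} s, round trip: {2 * one_way_s:.2f} s")
# About a quarter of a second one way and half a second for the round trip,
# before adding any processing delays.
```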

Secondly, there is a processing time, since the information travels through telecommunications equipment. This equipment cannot process the information at the speed of light. Sometimes the information is stored temporarily to accommodate the processor speed. It’s like when I have to wait in line at a counter. There’s the time the employee at the counter takes to do their job, plus the wait time due to all the people in line in front of me. This time can quickly add up.

We can exchange any kind of information with the ISS. Voice and image, of course, as well as telemetry data. This is the information a spacecraft sends to the earth to communicate its state of health. Included in this information is the station’s position, the data from the experiments carried out on board, etc.

 

What are the main difficulties space telecommunications systems face?

LF: The major difficulty is that we must communicate with objects that are very far away and have limited electrical transmission power. We capture these constraints in a link budget, which involves several phenomena. The first is that the farther away we communicate, the more energy is lost: with distance, the energy is dispersed like a spray. The second phenomenon in this budget is that the quality of communication depends on the amount of energy received at the destination. We ask: out of one million bits transmitted, how many arrive corrupted at the destination? Finally, the last point is the data rate that the communication can support, which also depends on the amount of energy invested in it. We often adjust the data rate to obtain a certain level of quality. It all depends on the amount of energy available for transmission, which is limited aboard the ISS, since it is powered by solar panels and sometimes travels in the Earth's shadow. The relay satellites face the same constraints.
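
The "energy dispersed like a spray" corresponds to the free-space path loss term of a link budget. Here is a hedged sketch (the 14 GHz carrier frequency is an assumption for the example, and a real budget would also include antenna gains, pointing and atmospheric losses):

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_ghz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45.
    This is the 'energy dispersed like a spray' term of a link budget."""
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_ghz) + 92.45

# Illustrative values: an assumed 14 GHz link over two distances.
for d in (400, 36_000):
    print(f"{d:>6} km: {free_space_path_loss_db(d, 14):.1f} dB of path loss")
```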

 

Is there a risk of interference when the information is travelling through space?

LF: Yes and no, because radio frequency telecommunications are highly regulated. The right to transmit is linked to a maximum frequency and power. It is also regulated in space: we cannot “spill over” into another nearby antenna. For space communications, there are tables that define the maximum amount of energy that we can send outside of the main direction of communication. Below this maximum level, the energy that is sent to a nearby antenna is of course interference, but it will not prevent it from functioning properly.

 

What are the other applications of communications satellites?

LF: They are used for Internet access, telephony, video telephony, the Internet of things… But what is interesting is what they are not used for: GPS navigation and weather observations, for example. In fact, space missions are traditionally divided into four components: the telecommunications we are discussing here, navigation/positioning, observation, and deep-space exploration like the Voyager probes. Finally, what is fascinating is that with a field as specialized as that of space, there is an almost infinite amount of even more specialized derivations.

 


OpenAirInterface: An open platform for establishing the 5G system of the future

In this article, we continue our exploration of the Télécom & Société numérique Carnot institute technological platforms. OpenAirInterface is the platform created by EURECOM to support mobile telecommunication systems like 4G and 5G. Its goal: to develop access solutions for networks, radio and core networks. Its service is based on a software suite developed using open source.

 

The OpenAirInterface platform offers a 4G system built on a set of software programs. These programs can each be tested and modified individually by the user companies, independently of the other programs. The goal is to establish the new features of what will become the 5G network. To find out more, we talked with Christian Bonnet, a communications systems researcher at EURECOM.

 

What is OpenAirInterface?

Christian Bonnet: This name encompasses two aspects. The first is the software implementation that makes up a 4G-5G system. This involves software components that run in a mobile terminal, those that implement the radio transmissions, and those that run in the core network.

The second part of OpenAirInterface is an “endowment fund” created by EURECOM at the end of 2014, which is aimed at leading an open and global software Alliance (OSA – OpenAirInterface Software Alliance).

 

How does this software suite work?

CB: The aim is to implement the software components required for a complete 4G system. This involves the modem of a mobile terminal, the software for radio relay stations, as well as the software for the specific routers used for a network core. Therefore, we deal with all of the processes involved in the radio layer (modulation, coding, etc.) of communication protocols. It runs on the Intel x86 processors that are found in PCs and computer clusters. This means that it is compatible with Cloud developments. To install it, you must have a radio card connected to the PC, which serves as the terminal, and a second PC, which serves as a relay station.

Next, depending on what we need to do, we can take only a part of the software implementation. For example, we can use commercial mobile terminals and attach to a network composed of an OpenAirInterface relay and a commercial network core. Any combination is possible. We have therefore established a complete network chain for 4G, which can move towards the 5G network using all of these software programs.

 


 

Who contributes to OpenAirInterface?

CB: Since the Alliance was established, we have had several types of contributors. The primary contributor, to date, has been EURECOM, because its teams are those that developed the initial versions of all the software programs. These teams include research professors, post-doctoral students, and PhD students who can contribute to this platform that provides participants with an experimental environment for their research. In addition, through the software Alliance, we have acquired new kinds of contributors: industrial stakeholders and research laboratories located throughout the world. We have expanded our base, and this openness enables us to receive contributions from both the academic and industrial worlds. (Editor’s note: Orange, TCL and Ercom are strategic OpenAirInterface partners, but the Alliance also includes many associate members, such as Université Pierre et Marie Curie (UPMC), IRT Bcom, INRIA and, of course, IMT. The full list is available here.)

 

What does the Carnot Label represent for your activities?

CB: The Carnot Label was significant in our relationship with the Beijing University of Posts and Telecommunications in China (BUPT), a university specializing in telecommunications. The BUPT asked us to provide a quality label reference that would allow us to demonstrate the recognition of our expertise. The Carnot Label was presented and recognized by the foreign university. This label demonstrates the commitment of OpenAirInterface developments to the industrial world, while also representing a seal of quality that is recognized far beyond the borders of France and Europe.

 

Why do companies and industrial stakeholders contact OpenAirInterface?

CB: To develop innovation projects, industrial stakeholders need advances in scientific research. They come to see us because they are aware of our academic excellence and they also know that we speak the same language. It’s in our DNA! Since its very beginning, EURECOM has embodied the confluence of industry and research; we speak both languages. We have developed our own platforms, we have been confronted with the same issues that industrial stakeholders face on a daily basis. We are therefore positioned as a natural intermediary between these two worlds. We listen attentively to the innovation projects they present.

 

You chose to develop your software suite as open source, why?

CB: It is a well-known model that is beginning to spread. It facilitates access to knowledge and contributions. This software is covered by open source licenses that protect contributors and enable wider dissemination. This acts as a driving force and an accelerator of development and testing, since each software component must be tested. If you multiply the introduction of this software throughout the world, everyone will be able to use it more easily. This enables a greater number of software tests, and therefore increases the amount of feedback from users for improving the existing versions. Therefore, the entire community benefits. This is a very important point, because even in industry, many components are starting to be developed using this model.

 

In addition to this approach, what makes OpenAirInterface unique?

CB: OpenAirInterface has brought innovation to open source software licensing. Many types of open source licenses exist. It is a vast realm, and the industrial world is bound to large patent portfolios. The context is as follows: on the one hand, there are our partners who have industrial structures that rely on revenue from patents and, on the other hand, there is a community who wants free access to software for development purposes. How can this apparent contradiction be resolved?

We have introduced a specific license to protect the software for non-commercial use (everything related to research, innovation and testing), as for classic open source software. For commercial use, we have established a patent declaration system. This means that if industrial stakeholders implement their own patented components, they need only declare this, and for commercial use people will then contact the rights holders to negotiate. These conditions are known as FRAND (fair, reasonable and non-discriminatory) terms, and reflect the practices industrial players in the field follow with standardization organizations such as 3GPP. In any case, this procedure has been well accepted. This explains why Orange and Nokia (formerly Alcatel-Lucent Bell Labs), convinced by the benefits of this type of software license, are among the Alliance's strategic partners.

 

What is the next development phase for OpenAirInterface?

CB: Several areas of development exist. The projects that are proposed as part of the European H2020 program, for which we are awaiting the results, will allow us to achieve scientific advances and will benefit the software range. The Alliance has also defined major areas for development through joint projects led by both an industrial partner and an academic partner. This type of structure enables us to bring people together from around the world. They volunteer to participate in one of the steps towards achieving 5G.

 

 

[divider style=”normal” top=”20″ bottom=”20″]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006. 

Having first received the Carnot label in 2006, the Télécom & Société Numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, EURECOM, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.

[divider style=”normal” top=”20″ bottom=”20″]

 


Strategic Management of Innovation Networks

Suitable for a one- or two-semester course for undergraduate and graduate students, this interdisciplinary textbook explains the diverse aspects of innovation and social networks, which occupy a central place in business and policy agendas. Its unified approach presents networks as nested systems that can span organisations, industries, regions, and markets, giving students a holistic perspective and reducing the amount of effort required to learn the theoretical framework for each layer. With engaging real-world examples, the text also provides a practical guide on how to manage networks to increase innovation and improve performance. Topics covered include forming teams to foster creativity, selecting partners and leveraging partnerships for learning, managing organisational change, and sponsoring technologies in communities. Students will learn the metrics used in social network analysis and how they are interpreted and applied. Suggested reading lists and online resources offer opportunities for further review and practice.

 

Author

Müge Özman is Professor of Management at Télécom École de Management-IMT, Paris. She has participated in major projects funded by the European Union, and has worked with a variety of both small and large companies on innovation-related projects.

 

Strategic Management of Innovation Networks
Müge Özman
Cambridge University Press, 2017
360 pages
£29.99 (paperback)
$30.00 (eBook)
