
RAMSES: Time keeper for embedded systems

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot Institute (TSN), to which Télécom ParisTech belongs.

[divider style=”normal” top=”20″ bottom=”20″]

Embedded computing systems are sometimes responsible for performing “critical” functions. In the transport industry, for example, they help prevent collisions between vehicles. To help design these important systems, Étienne Borde, a researcher at Télécom ParisTech specializing in embedded systems, developed RAMSES. This platform gives developers the tools they need to streamline the design process for these systems. Its potential for applications in industry, transport and robotics has been recognized by the Télécom & Société Numérique Carnot Institute, which has made it part of its technology platform offering.

 

What is the purpose of the RAMSES platform?

Étienne Borde: RAMSES is a platform that helps design critical real-time embedded systems. This technical term refers to embedded systems that have a significant time component: if a computer operation takes longer than planned, a critical system failure could occur. In terms of the software, time is managed by a real-time operating system. RAMSES automates the configuration of this system while ensuring the system’s time requirements are met.

What sectors could this type of system configuration support be used for?

EB: The transport sector is a prime candidate. We have a case study for the railway sector that shows what the platform could contribute in this field: RAMSES is used to estimate the worst-case transmission time of data in a train’s control system. The most critical messages are those that ensure the train does not collide with another train. For safety reasons, the calculations are carried out by three computing units at the rear of the train and three at the front. What RAMSES offers is better control of latency and better management of the flow of transmission operations.

How does RAMSES help improve the configuration of critical real-time embedded systems?

EB: RAMSES is a compiler for the AADL language, which is used to describe computer architectures. The basic principle of AADL is to define categories of software or hardware components that correspond to the physical objects computer scientists and electronic engineers use every day. One example is the processor category: AADL can describe a computer’s processing unit through parameters such as its frequency. RAMSES helps assemble these different categories to represent the system at different levels of abstraction, which explains how the platform got its name: Refinement of AADL Models for Synthesis of Embedded Systems.

How does a compiler like RAMSES benefit professionals?

EB: Professionals currently develop their systems manually using the programming language of their choice, or generate the code from a model. They can assess data transmission times on the final product, but with poor traceability back to the initial model: if a command takes longer than expected, it is difficult for the developers to isolate the step causing the problem. RAMSES generates intermediate representations as it progresses, analyzing the time associated with each task to ensure no significant deviation occurs. As soon as the accumulated mechanisms diverge significantly from the set time constraints, RAMSES alerts the professional. The platform can indicate which steps are causing the problem and help correct the AADL code.
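To make the idea of accumulated worst-case times more concrete, here is a minimal sketch in Python rather than in AADL (RAMSES itself works on AADL models, so this is only an illustration of the principle): the task names, worst-case execution times and the 20 ms deadline are hypothetical.

```python
# Hedged illustration (not RAMSES code): summing worst-case times along a task
# chain and reporting the first step that breaks the end-to-end budget.

from dataclasses import dataclass

@dataclass
class Step:
    name: str       # name of the intermediate task or communication step
    wcet_ms: float  # assumed worst-case execution/transmission time

def check_latency(chain, deadline_ms):
    """Accumulate worst-case times and point at the step that exceeds the budget."""
    elapsed = 0.0
    for step in chain:
        elapsed += step.wcet_ms
        status = "OK" if elapsed <= deadline_ms else "DEADLINE EXCEEDED"
        print(f"{step.name:<20} +{step.wcet_ms:5.1f} ms  total {elapsed:6.1f} ms  {status}")
        if elapsed > deadline_ms:
            print(f"-> step '{step.name}' pushes the chain past the {deadline_ms} ms budget")
            return
    print(f"End-to-end worst case {elapsed} ms is within the {deadline_ms} ms budget")

# Hypothetical control chain: sensor sampling, voting across redundant units,
# on-board network transfer, braking command.
chain = [Step("sensor_acquisition", 5.0),
         Step("triple_vote", 2.0),
         Step("train_bus_transfer", 12.0),
         Step("brake_command", 4.0)]
check_latency(chain, deadline_ms=20.0)
```

Running the sketch prints the cumulative worst case after each step and flags the step that breaks the budget, mirroring the way the platform points the developer back to the offending step in the model.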

Does this mean RAMSES is primarily a decision-support tool?

EB: Decision support is one part of what we do. Designing critical real-time embedded systems is a very complex task. Developers do not have all the information about the system’s behavior in advance. RAMSES does not eliminate all of the uncertainties, but it does reduce them. The tool makes it possible to reflect on the uncertainties to decide on possible solutions. The alternative to this type of tool is to make decisions without enough analysis. But RAMSES is not used for decision support only. The platform can also be used to improve systems’ resilience, for example.

How can optimizing the configuration impact the system’s resilience?

EB: Recent work by the community of researchers in systems security looks at mixed criticality. The goal is to use multi-core architectures to deploy critical and less critical functions on the same computing unit. If the time constraints for the critical functions are exceeded, the non-critical functions are degraded. The computing resources made available by this process are then used for critical functions, thereby ensuring their resilience.

Is this a subject you are working on?

EB: Our team has conducted work to ensure that critical tasks will always have enough resources, come what may. At the same time, we are working on guaranteeing a minimum availability of resources for less critical functions, so that they are not degraded too often. For a train, for example, this prevents the train from stopping unexpectedly due to a reduction in computing resources. In this type of context, RAMSES assesses the availability of the functions based on their degree of criticality, while ensuring enough resources remain available for the most critical functions.
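A rough sketch of this mixed-criticality idea, with purely hypothetical task names and budgets (this is not the team’s scheduling algorithm, only an illustration of degrading non-critical work when critical tasks overrun):

```python
# Hedged sketch: critical tasks are always served first; non-critical tasks are
# degraded (skipped) in a window only when the remaining CPU budget is exhausted.
# All numbers are hypothetical.

def schedule_window(cpu_budget_ms, critical_demand_ms, noncritical_tasks):
    """Return (tasks run, tasks degraded) for one scheduling window.

    critical_demand_ms: measured demand of the critical tasks (may exceed plan).
    noncritical_tasks: list of (name, cost_ms) pairs, in priority order.
    """
    remaining = cpu_budget_ms - critical_demand_ms  # critical work served first
    ran, degraded = [], []
    for name, cost in noncritical_tasks:
        if cost <= remaining:
            remaining -= cost
            ran.append(name)
        else:
            degraded.append(name)  # degraded this window; should not happen too often
    return ran, degraded

# Nominal window: critical tasks stay within plan, everything runs.
print(schedule_window(20, critical_demand_ms=12,
                      noncritical_tasks=[("logging", 3), ("diagnostics", 4)]))
# Overrun window: critical tasks take longer, diagnostics is degraded.
print(schedule_window(20, critical_demand_ms=16,
                      noncritical_tasks=[("logging", 3), ("diagnostics", 4)]))
```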

Which industrial sectors could benefit from the solutions RAMSES offers?

EB: The main area of application is the transport sector, which frequently uses critical real-time embedded systems. We have partnerships with Thales, Dassault and SAFRAN in avionics, Alstom for the railway sector and Renault for the automotive sector. The field of robotics could be another significant area of application. The systems in this sector have a critical aspect, especially in the context of large machines that could present a hazard to those nearby, should a failure occur. This sector could offer good use cases.

 

[divider style=”normal” top=”20″ bottom=”20″]

A guarantee of excellence in partnership-based research since 2006

The Télécom & Société Numérique Carnot Institute (TSN) has been partnering with companies since 2006 to research developments in digital innovations. With over 1,700 researchers and 50 technology platforms, it offers cutting-edge research aimed at meeting the complex technological challenges posed by the digital, energy and industrial transitions currently underway in the French manufacturing industry. It focuses on the following topics: industry of the future, connected objects and networks, sustainable cities, transport, health and safety.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.

[divider style=”normal” top=”20″ bottom=”20″]


TTool offers security, safety and performance for embedded systems

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot Institute (TSN), to which Télécom ParisTech belongs.

[divider style=”normal” top=”20″ bottom=”20″]

By combining three areas for assisting the design of embedded systems, the TTool platform is seen as a safety net for industry. The platform, which originated in the laboratories of Télécom ParisTech, has now become one of the technology platforms Carnot TSN offers to companies. TTool helps them design embedded systems that are safer, protected from cyberattacks and more efficient. Ludovic Apvrille, the Télécom ParisTech researcher behind this platform, explains the tool’s strengths and how it works.

 

What does the TTool platform you developed do?

Ludovic Apvrille: It is a design tool for embedded systems, used to create models and run tests. On the one hand, it takes operational safety into account, in other words, it ensures the systems do not pose any risks to humans or have serious economic impacts. On the other hand, it offers a cybersecurity aspect by preventing certain attacks. This platform’s strength is that it offers both these aspects, whereas industrial tools only offer one or the other.

How does TTool ensure a system operates safely?

LA: Embedded system designers want to know whether a specific error could occur. These specific errors can be avoided by voluntarily degrading the system’s operation level when there is a high likelihood of the error occurring. This makes the system less efficient, but at least it does not endanger the user. It allows safety features to be integrated directly into the system itself. TTool helps implement these features. The developers create models of their system, enter the aspects they want to verify, and then just press a button. TTool does the rest: the platform analyzes the system and the potential error, then reports whether safety can be ensured.

Is this different from the cybersecurity aspect?

LA: Yes, the cybersecurity part is different. Embedded systems are vulnerable to different types of attacks, especially those that are used to read or inject data into the system’s communication links. We use TTool to study the system’s vulnerabilities to this type of attack: TTool can detect this type of attack and then automatically add software and hardware components to counter them. These components can ensure the data’s confidentiality, prevent a hacker from posing as a user, or indicate whether data has been altered by a hacker.

In specific terms, how can manufacturers use a platform like TTool?

LA: A good example is the joint Nokia Bell Labs and Télécom ParisTech laboratory inaugurated this year on June 25. TTool is one of the tools Nokia uses to respond to problems. In this case—due to the platform’s ability to quickly analyze a system’s model—it provides the company with information on the expected performance. We are specifically interested in the latency related to processing data. Nokia works with 5G encoders/decoders, components which encode and decode data packets sent by fifth generation mobile technology. TTool indicates how long it takes for a data sample to be encoded or decoded based on the choice of architecture.

How does TTool work?

LA: TTool is based on three modeling environments. The first is Diplodocus, which partitions the functions between software and hardware. In the Nokia example, signal processing is carried out in part by the base stations. If there are major changes in the processing functions, the base stations can no longer provide this feature. The equipment providers therefore want to move some of the processing to the cloud. Since this is a costly operation, they want to minimize the number of calculations, so the split between functions performed in hardware and in software must be optimized. The second environment is AVATAR, which focuses on creating the embedded software: modeling, testing, then code generation. Finally, there is the SysML-Sec environment, which helps the developer add security functions to the embedded system throughout the development cycle.

Which sectors benefit most from this platform?

LA: It’s fairly diverse. TTool is currently being used in the European H2020 project AQUAS for two case studies on critical systems, in which security must be studied together with operational safety and performance. One is with Siemens, to add security to industrial motors. The other concerns railway systems with automated doors, like those of line 1 of the Paris metro. The objective is to detect whether cyberattacks could affect the time it takes for the doors to open or close. TTool can therefore be adapted to a wide range of embedded systems.

What are TTool’s main advantages?

LA: TTool offers a safety net during the development phase for embedded systems and it helps on the path to certification. It is therefore a platform that can increase trust in a product, beginning at the design phase. This does not mean the embedded systems can skip the testing phase they are required to undergo, but there is less chance of finding errors at that stage. It should also be noted that TTool is free and open source software, meaning that anyone can use the tools and modify the code to adapt it to their needs. The research team behind TTool steps in when there is an issue it can work on to help companies improve their capabilities.

 

[divider style=”normal” top=”20″ bottom=”20″]

A guarantee of excellence in partnership-based research since 2006

The Télécom & Société Numérique Carnot Institute (TSN) has been partnering with companies since 2006 to research developments in digital innovations. With over 1,700 researchers and 50 technology platforms, it offers cutting-edge research aimed at meeting the complex technological challenges posed by the digital, energy and industrial transitions currently underway in the French manufacturing industry. It focuses on the following topics: industry of the future, connected objects and networks, sustainable cities, transport, health and safety.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.

[divider style=”normal” top=”20″ bottom=”20″]


BioMICA platform: at the cutting edge of medical imaging

Among the research platforms at Télécom SudParis, BioMICA has developed bio-imaging applications that have already been approved by the medical field. AirWays, its 3D representation software, received funding from the Télécom & Société Numérique Carnot Institute.

[divider style=”dotted” top=”20″ bottom=”20″]

The original version of this article (in French) was published on the Télécom SudParis website

[divider style=”dotted” top=”20″ bottom=”20″]

 

One of the recommendations included in the March 2017 France AI Strategy report was to put artificial intelligence to work to improve medical diagnosis. The BioMICA research platform (which stands for Bio-Medical Imaging & Clinical Applications) has made this goal its mission.

“We aim to develop tools that can be used in the clinical setting,” says Catalin Fetita, professor at Télécom SudParis and director of the bio-medical imaging platform. “Our applied research focuses on computer-aided diagnosis involving medical and biological imaging,” he explains. As a specialist in image analysis and processing, Catalin Fetita brings the platform true expertise in the area of medical imaging, particularly lung imaging.

AirWays, or another way of seeing lungs

AirWays is “image marker” software (like biomarkers in biology). Based on a sequence of lung images taken by a scanner, it extracts as much information as possible for clinicians to assist them in their diagnosis by offering a range of different visualization and classification options. “The quantitative aspect is very important; we do not only want to offer better visual quality,” Catalin Fetita explains. “We offer the possibility of obtaining improved measurements of morphological differences in several areas of the respiratory system at different moments in time. This helps clinicians decide which treatment to choose.” In terms of quantified results, the software can detect 95% of stenosis cases, a narrowing of the bronchial tubes that affects respiratory capacity.
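As a purely illustrative sketch of this kind of quantitative image marker (not the actual AirWays algorithm), a stenosis can be expressed as the relative narrowing of the local bronchial diameter with respect to a reference diameter; the measurements and the 50% threshold below are hypothetical.

```python
# Hedged sketch (not AirWays code): flag positions along an airway where the
# local diameter falls below a chosen fraction of a reference diameter.

def stenosis_degree(local_diameter_mm, reference_diameter_mm):
    """Degree of narrowing: 0.0 means no narrowing, 1.0 means full occlusion."""
    return 1.0 - local_diameter_mm / reference_diameter_mm

def flag_stenoses(measurements, reference_diameter_mm, threshold=0.5):
    """Return (position, degree) pairs where narrowing exceeds the threshold."""
    flagged = []
    for position_mm, diameter_mm in measurements:
        degree = stenosis_degree(diameter_mm, reference_diameter_mm)
        if degree >= threshold:
            flagged.append((position_mm, round(degree, 2)))
    return flagged

# Diameters sampled along a bronchus (position in mm, diameter in mm), hypothetical.
exam = [(0, 8.1), (5, 7.9), (10, 3.6), (15, 7.8)]
print(flag_stenoses(exam, reference_diameter_mm=8.0))  # [(10, 0.55)]: 55% narrowing at 10 mm
```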

AirWays software uses a graphic grid representation of bronchial tube surfaces after analyzing clinical images and then generates 3D images to view them both “inside and outside” (above, a view of the local bronchial diameter using color coding).

This technique allows doctors to plan more effectively for endoscopies and operations that were previously performed by sight.

“For now, we have limited ourselves to the diagnosis-analysis aspect, but I would also like to develop a predictive aspect,” says the researcher. This perspective is what motivated Carnot TSN to help finance AirWays in December 2017. “This new budget will help us improve and optimize the software’s interface and increase its computing power to make it a true black box for automatic and synthetic processing,” explains Catalin Fetita, who also hopes to work towards commercializing the software.

A platform for medicine of the future

In addition to its many computer workstations for developing its medical software, the BioMICA platform features two laboratories for biological experimentation. One of the laboratories has a containment level of L1 (any biological agent that is non-pathogenic for humans) and the other is L2 (possible pathogen with low risk). Both will help advance the clinical studies in cellular bio-imaging.

In addition, Catalin Fetita and his team are preparing a virtual reality viewing station to provide a different perspective of the lung tissue analyzed by AirWays. “Our platform works thanks to research partnerships and technological transfers,” he explains, “but we can also use it to provide services for clinical studies.”

 


OMNI: transferring social sciences and humanities to the digital society

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot Institute (TSN), to which IMT Atlantique belongs.

[divider style=”normal” top=”20″ bottom=”20″]

Technology transfer also exists in social sciences and the humanities! The OMNI platform in Brittany proves this by placing its research activities at the service of organizations. Attached to the Breton scientific interest group called M@rsouin (which IMT Atlantique manages), it brings together researchers and professionals to study the impact of digital technology on society. The relevance of the structure’s approach has earned it a place within the “technology platform” offering of the Télécom & Société Numérique Carnot Institute (see insert at the end of the article). Nicolas Jullien, a researcher in Digital Economy at IMT Atlantique and Manager of OMNI, tells us more about the way in which organizations and researchers collaborate on topics at the interface between digital technology and society.

 

What is the role of the OMNI platform?

Nicolas Jullien: Structurally, OMNI is attached to the scientific interest group called M@rsouin, made up of four universities and graduate schools in Brittany and, recently, three universities in Pays de la Loire*. For the past 15 years, this network has served the regional objective of having a research and study system on ICT, the internet and, more generally, what is today referred to as digital technology. OMNI is the research network’s tool for proposing studies on the impact of digital technology on society. The platform brings together practitioners and researchers and analyzes major questions that public or private organizations may have. It then sets up programs to collect and evaluate information to answer these questions. Depending on the needs, we can carry out questionnaire surveys, which are quantitative studies, or interview surveys, which are more qualitative. We also guarantee the confidentiality of responses, which is obviously important in the context of the GDPR. It is first and foremost a guarantee of neutrality between the party that wishes to collect information and the respondents.

So is OMNI a platform for making connections and structuring research?

NJ: Yes. In fact, OMNI has existed for as long as M@rsouin, and corresponds to the part just before the research phase itself. If an organization has questions about digital technology and its impact and wants to work with the researchers at M@rsouin to collect and analyze information to provide answers, it goes through OMNI. We help establish the problem and express or even identify needs. We then investigate whether there is a real interest for research on the issue. If this is the case, we mobilize researchers at M@rsouin to define the questions and the most suitable protocol for the collection of information, and we carry out the collection and analysis.

What scientific skills can you count on?

NJ: M@rsouin has more than 200 researchers in social sciences and humanities. Topics of study range from e-government to e-education, by way of social inclusion, employment, consumption, economic models, and the operation of organizations and work. The disciplines are highly varied and allow us to have a very comprehensive approach to the impact of digital technology on an organization, population or territory… We have researchers in education sciences, ergonomics, cognitive or social psychology, political science and, of course, economists and sociologists. But we also have disciplines which the general public would perhaps expect less, yet which are equally important in the study of digital technology and its impacts. These include geography, urban planning, management sciences and legal expertise, which has been closely involved since wide-scale awareness of the importance of personal data developed.

The connection between digital technology and geography may seem surprising. What is a geographer’s contribution, for example, to the question of digital technology?

NJ: One of the questions raised by digital technology is that of access to online resources. Geographers are specifically interested in the relationship between people and their resources and territory. Incorporating geography allows us to study the link between territory and the consumption of digital resources, or even to more radically question the pertinence of physical territory in studies on internet influence. It is also a discipline that allows us to examine certain factors favoring innovation. Can we innovate everywhere in France? What influence does an urban or rural territory have on innovation? These are questions asked in particular by chambers of commerce and industry, regional authorities or organizations such as FrenchTech.

Why do these organizations come to see you? What are they looking for in a partnership with a scientific interest group?

NJ: I would say that these partners are looking for a new perspective. They want new questions or a specialist point of view or expert assessment in complex areas. By working with researchers, they are forced to define their problem clearly and not necessarily seek answers straight away. We are able to give them the breathing space they need. But we can only do so if our researchers can make proposals and be involved in partners’ problems. We propose services, but are not a consultancy agency: our goal remains to offer the added value of research.

Can you give an example of a partnership?

NJ: In 2010 we began a partnership with SystemGIE, a company which acts as an intermediary between large businesses and small suppliers. It manages the insertion of these suppliers in the purchasing or production procedures of large clients. It is a fairly tricky positioning: it is necessary to understand the strategy of suppliers and large companies and the tools and processes to put in place… We supported SystemGIE in the definition of its atypical economic model. It is a matter of applied research because we try to understand where the value lies and the role of digital technology in structuring these operators. This is an example of a partnership with a company. But our biggest partner remains the Brittany Regional Council. We have just finished a survey with it on craftspeople. The following questions were asked: how do craftspeople use digital technology? How does their online presence affect their activity?

How does the Carnot label help OMNI?

NJ: First and foremost it is a recognition of our expertise and relevance for organizations. It also provides better visibility at a national institutional level, allowing us to further partnerships with public organizations across France, as well as providing better visibility among private actors. This will allow us to develop new, nationwide partnerships with companies on the subject of the digitization of society and the industry of the future.

* The members of M@rsouin are: Université de Bretagne Occidentale, Université de Rennes 1, Université de Rennes 2, Université de Bretagne Sud, IMT Atlantique, ENSAI, ESPE de Bretagne, Sciences Po Rennes, Université d’Angers, Université du Mans, Université de Nantes.

 

[divider style=”normal” top=”20″ bottom=”20″]

A guarantee of excellence in partnership-based research since 2006

The Télécom & Société Numérique Carnot Institute (TSN) has been partnering with companies since 2006 to research developments in digital innovations. With over 1,700 researchers and 50 technology platforms, it offers cutting-edge research aimed at meeting the complex technological challenges posed by the digital, energy and industrial transitions currently underway in the French manufacturing industry. It focuses on the following topics: industry of the future, connected objects and networks, sustainable cities, transport, health and safety.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.

[divider style=”normal” top=”20″ bottom=”20″]


H2sys: hydrogen in the energy mix

I’MTech is dedicating a series of articles to success stories from research partnerships supported by the Télécom & Société Numérique Carnot institute (TSN), to which IMT and Femto Engineering belong.

[divider style=”normal” top=”20″ bottom=”20″]

H2sys is helping make hydrogen an energy of the future. This spin-off company from the FCLAB and Femto-ST laboratories in Franche-Comté offers efficient solutions for integrating hydrogen fuel cells. Examples of these applications include generators and low-carbon urban mobility. And while the company was officially launched only six months ago, its history is closely tied to the pioneers of hydrogen technology from Franche-Comté.

 

1999, the turn of the century. Political will was focused on the new millennium, and energy was already a major industrial issue. The end of the 1990s marked the beginning of escalating oil prices after over a decade of price stability. In France, the share of investment in nuclear energy was waning. The quest for other forms of energy production had begun, a search for alternatives worthy of the 2000s. This economic and political context encouraged the town of Belfort and the local authorities of the surrounding region to invest in hydrogen. Thus the FCLAB research federation was founded, bringing together the laboratories working on this theme. Almost two decades later, Franche-Comté has become a major hub for the discipline. FCLAB is the first national applied research community to work on hydrogen energy and the integration of fuel cell systems. It also integrates a social sciences and humanities research approach, which looks at how our societies adopt new hydrogen technologies. The federation brings together six laboratories, including FEMTO-ST, and operates under the aegis of ten organizations, including the CNRS.

It was from this hotbed of scientific activity that H2sys was born. Described by Daniel Hissel, one of its founders, as “a human adventure”, the young company’s history is intertwined with that of the Franche-Comté region. First, because it was created by scientists from FCLAB: Daniel Hissel is himself a professor at the University of Franche-Comté and leads a team of researchers at Femto-ST, both of which are partners of the federation. Secondly, because the idea at the heart of the H2sys project grew out of regional activity in the field of hydrogen energy. “As a team, we began our first discussions on the industrial potential of hydrogen fuel cell systems as early as 2004-2005,” Daniel Hissel recalls. The FCLAB teams were already working on integrating these fuel cells into energy production systems. However, the technology was not yet sufficiently mature, and the fundamental work did not yet target large-scale applications.

Ten more years would be needed for the uses to develop and for the hydrogen fuel cell market to truly take shape. In 2013, Daniel Hissel and his colleagues watched intently as the market emerged. “All that time we had spent working to integrate the fuel cell technology provided us with the necessary objectivity and allowed us to develop a vision of the future technical and economic issues,” he explains. The group of scientists realized that it was the right time to start their business. They created their project the same year. They quickly received support from the Franche-Comté region, followed by the Technology Transfer Accelerator (SATT) in the Grand Est region and the Télécom & Société Numérique Carnot institute. In 2017, the project officially became the company H2sys.

Hydrogen vs. Diesel?

The spin-off now offers services for integrating hydrogen fuel cells based on its customers’ needs. It focuses primarily on generators ranging from 1 to 20 kW. “Our goal is to provide electricity to isolated sites to meet needs on a human scale,” says Daniel Hissel. The applications range from generating electric power for concerts or festivals to supporting rescue teams responding to road accidents or fires. The solutions developed by H2sys integrate expertise from FCLAB and Femto-ST, whose research involves work in system diagnosis and prognosis aimed at understanding and anticipating failures, lifespan analysis, predictive maintenance and artificial intelligence for controlling devices.

Given their uses, H2sys systems are in direct competition with traditional generators which run on combustion engines—specifically diesel. However, while the power ranges are similar, the comparison ends there, according to Daniel Hissel, since the hydrogen fuel cell technology offers considerable intrinsic benefits. “The fuel cell is powered by oxygen and hydrogen, and only emits energy in the form of electricity and hot water,” he explains. The lack of pollutant emissions and exhaust gas means that these generators can be used inside as well as outside. “This is a significant benefit when indoor facilities need to be quickly installed, which is what firefighters sometimes must do following a fire,” says the co-founder of the company.

Another argument is how unpleasant it is to work near a diesel generator. Anyone who has witnessed one in use understands just how much noise and pollutant emissions the engine generates. Hydrogen generators, on the other hand, are silent and emit only water. Their maintenance is also easier and less frequent: “Within the system, the gases react through an electrolyte membrane, which makes the technology much more robust than an engine with moving parts,” Daniel Hissel explains. All of these benefits make hydrogen fuel cells an attractive solution.

In addition to generators, H2sys also works on range extenders. “This is a niche market for us because we do not yet have the capacity to integrate the technology into most vehicles,” the researcher explains. However, the positioning of the company does illustrate the existing demand for solutions that integrate hydrogen fuel cells. Daniel Hissel sees even more ambitious prospects. While the electric efficiency of these fuel cells is much better than that of diesel engines (55% versus 35%), the hot water they produce can also be recovered for various purposes. Many different options are being considered, including a water supply network for isolated sites, or household consumption in micro-cogeneration units providing electricity and heating.

But finding new uses through intelligent integration is not the only challenge facing H2sys. As a spin-off company from research laboratories, it must continue to drive innovation in the field. “With FCLAB, we were the first to work on diagnosing hydrogen fuel cell systems in the 2000s,” says Daniel Hissel. “Today, we are preparing the next move.” Their sights are now set on developing better methods for assessing the systems’ performance to improve quality assurance. By contributing to making the technology safer, H2sys is heavily involved in developing fuel cells. And the technology’s maturation since the early 2000s is now producing results: hydrogen is attracting the attention of manufacturers for the large-scale storage of renewable energies. Will this technology therefore truly be that of the new millennium, as foreseen by the pioneers of the Franche-Comté region in the late 90s? Without going that far, one thing is certain: it has earned its place in the energy mix of the future.

 

[box type=”shadow” align=”” class=”” width=””]

A guarantee of excellence
in partnership-based research since 2006

 

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Institut Mines-Télécom Business School, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]


Seamless vacations thanks to a research lab

For four years now, researchers from EURECOM and the startup Data-Moove have worked together to radically improve the tourist experience in various regions. With help from technological innovations from laboratories, they have succeeded in aggregating the information available on the web and social networks to create a local and comprehensive picture of what a geographical area has to offer.

 

Finding a restaurant, concert hall or hotel when traveling abroad can turn into quite an ordeal. Every restaurant and event has a Facebook page and website, yet few sites gather all a destination’s activities into one spot. For tourists, this means spending time on social networks, time they would rather spend enjoying their vacation. Data-Moove’s challenge was therefore significant: the French startup’s mission was to offer a solution to this problem by creating a comprehensive overview of a region’s tourism offering. On March 2nd, the young company inaugurated an interactive board in the Saint-Barthélemy airport in the West Indies. Travelers arriving on the island can now see an overview of the activities available to them in the area and can create an itinerary for their stay. This interactive board is complemented by a mobile application offered by the island’s Tourism Board which is free for the end user.

This service responds to tourism offices’ growing demand for digital technology to help promote their regions. To meet this need Data-Moove worked together with EURECOM research teams, which are part of the Télécom & Société Numérique Carnot Institute. Their partnership started in 2015. At that time, Raphaël Troncy, a researcher in data science at EURECOM, and his team were involved in the European project 3cixty led by EIT Digital. “We were working to automate the collection of tourism and cultural information,” the scientist recalls. “We wanted a platform that would bring together all the information about accommodation, places of interest, and seasonal, sports and cultural activities…” In short, offering comprehensive and local information. The project was launched a year earlier and already provided a fully developed technical solution. All that was missing was a commercial partner. Data-Moove, which had just been founded, met this need throughout the entire project, which ended in 2016.

Searching social media

During the three-year 3cixty project, the EURECOM researchers needed to solve the problem posed by the heterogeneity of the information sources. TripAdvisor and Facebook do not use the same language and information about a restaurant is not always available in the same format. They therefore needed to represent this stream of data collected from social networks by using semantic graphs: word clouds were linked together based on how they were related. People, places, dates and actions were described in a standardized way and then processed to provide the user with streamlined information, regardless of the source.

“Because we aggregate information from many sources, there is a good chance the same information will be presented twice in the data stream,” says Raphaël Troncy. This brings us to the second technological challenge: solving the problem of duplicates involved measuring the similarity between references to places, dates and names of events. “We therefore developed a learning algorithm to automatically carry out this work of studying the similarities,” the researcher explains. Another learning model was established to automatically predict the category of an event that comes with little description. This makes it possible to directly present information as being related to sports, theater or music, for example.
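The deduplication step can be pictured with a small sketch like the one below, under assumed field names and an assumed threshold (it is not the 3cixty or City Moove implementation): two event records collected from different sources are compared on date, title and venue, and flagged as probable duplicates when their weighted similarity is high enough.

```python
# Hedged sketch: detecting likely duplicate events coming from different sources
# by combining simple string-similarity measures with a same-day check.
# Field names, weights and the threshold are hypothetical.

from difflib import SequenceMatcher
from datetime import date

def text_similarity(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_duplicates(event_a, event_b, threshold=0.75):
    """Flag two event records as probable duplicates."""
    same_day = event_a["date"] == event_b["date"]
    title_sim = text_similarity(event_a["title"], event_b["title"])
    venue_sim = text_similarity(event_a["venue"], event_b["venue"])
    return same_day and (0.7 * title_sim + 0.3 * venue_sim) >= threshold

a = {"title": "Jazz Night at the Harbour", "venue": "Gustavia Harbour",
     "date": date(2018, 3, 2)}
b = {"title": "Jazz night - harbour", "venue": "Gustavia harbour, St Barth",
     "date": date(2018, 3, 2)}
print(likely_duplicates(a, b))  # True: same day, very similar title and venue
```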

A tourism application for discovering all the tourist attractions Saint-Barthélemy has to offer.

Data-Moove implemented the technical solutions developed during the 3cixty project in its first product: City Moove, based on an application like the one used by Saint-Barthélemy. “Our technology for aggregating flows of information can also be connected to a preexisting application,” explains Frédéric Bossard, co-founder of Data-Moove. The goal is to avoid having an excessive number of digital tools for a region. The company also prefers to work with tourist offices to improve the tools they already use. “The problem many regions have is that they often have too many applications, each for a specific area,” he explains.

Tourism of the future, brick by brick

The two partners decided to capitalize on this success by taking the use of digital technology in tourism a step further. In 2017, they began partnering with the European PasTime project—also supported by EIT Digital—which is intended to make suggestions for activities when people are traveling. “The idea is to ask end users when they will arrive in a city and then directly propose an itinerary,” Raphaël Troncy explains. Once again, they carried out machine learning research on large volumes of data. They developed standard profiles based on interactions with users on social media. “The real challenge is to develop a package, in other words, connect interests with tastes in food and preferences for events,” the researcher explains. Here they were able to build on City Moove, to take the technology to a new level by adding a customized aspect.

And a third level is underway. Since February 2018, EURECOM and Data-Moove have been working on a new product: a smart conversational assistant to answer questions about a region’s tourist attractions. Their work, entitled MinoTour, is being carried out in the context of the European H2020 project Data Pitch. The chatbot they are developing will also learn from users’ searches and provide answers based on the aggregated data flow from City Moove. “There is a logic to our products,” says Frédéric Bossard: “we build brick by brick, from the database to the chatbot, developing solutions that are best adapted to the geographical areas.”

After Saint-Barthélemy, Data-Moove will test its solutions in Saint-Tropez, Madeira, and on a wider scale in the Provence-Alpes-Côte d’Azur region. These are areas with significant tourism activity, which will allow the partners to continue improving their products to better meet the needs of both the regions and the tourists.

[divider style=”normal” top=”20″ bottom=”20″]

The advantage of a partnership with Eurecom: “An operational perspective”

Frédéric Bossard, co-founder of Data-Moove

One of the objectives of the Télécom & Société Numérique Carnot Institute is to professionalize relations between companies and researchers. Frédéric Bossard, co-founder of Data-Moove, can testify to this: “It is nice to work with EURECOM because the researchers truly have an operational perspective, which is rare among academic partners. They quickly understand our constraints and what we want to accomplish. The quality of these discussions convinced us to enter a partnership rather than a simple collaboration. Today, EURECOM is a partner of Data-Moove, since the school has taken shares in the company. By making their laboratories and knowledge available to us, they allow us to take the development of our products to whole new levels.”

[divider style=”normal” top=”20″ bottom=”20″]

 

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]


Researching technological disruptions to prepare for the future

Fundamental and applied research are often simplistically portrayed as opposites. The Carnot program, run by the Ministry of Higher Education, Research and Innovation and by the National Research Agency (ANR), challenges this vision. Though its primary objective is to develop virtuous partnerships between public research institutions and companies in order to stimulate technology transfer, it achieves this goal by requiring the institutions it funds to make significant contributions to fundamental research. Far from being paradoxical, this forward-looking strategy focuses on preparing companies for the future, beyond their immediate technological competitiveness. The Télécom & Société Numérique Carnot Institute (Carnot TSN, of which IMT is a member) fulfills this mission through its Futur & Ruptures (Future and Disruptions) program. Its director, Christian Picory-Donné, answers our questions to explain the scientific challenges of this type of program.

 

The Carnot program invites Carnot Institutes to carry out “scientific resourcing.” Can you explain what that is?

Christian Picory-Donné: It means preliminary research with the aim of preparing for the future of industry. What we have learned from the Fraunhofer Institutes in Germany, and from other major technological research institutes, is that it is difficult for these organizations to be trailblazers. Their resources are so focused on satisfying today’s market demands that they tend to fail to prepare for the future. The Carnot program strives to address this shortcoming by providing funding well ahead of current industrial problems.

How does the Télécom & Société Numérique Carnot Institute (TSN) achieve this resourcing?

CP: The Futur & Ruptures program funds the equivalent of 60 to 80 years of PhD theses, post-doctoral projects and sabbaticals every year. Funding for this support comes in part from the Fondation Mines-Télécom and in part from TSN Carnot. For TSN Carnot this initiative represents virtually all of its resourcing activity, or approximately 60% of the annual contribution it receives from the Carnot program. To give some concrete figures, this year TSN Carnot received a contribution of €4 million, of which €2.4 million was allocated to the Futur & Ruptures program to fund PhD and post-doctoral work at TSN Carnot member institutions.


Christian Picory-Donné, Director of TSN Carnot

The Carnot program’s objective is to develop partnership-based research. How does this upstream scientific positioning influence innovation?

CP: One example that comes to mind is the joint SePeMed laboratory between IMT Atlantique and the MEDECOM company, which was launched in 2014. It focuses on problems related to managing and securing medical databases. It all started with work by Gouenou Coatrieux, a researcher at IMT Atlantique, who obtained Carnot funding for his PhD and post-doctoral students through the Futur & Ruptures program. Thanks to the results achieved, he submitted a proposal for a Labcom (joint laboratory) project to the National Research Agency, whose funding enabled a strategic partnership with MEDECOM through the creation of this joint laboratory. There are many other success stories such as this one, and more to come, since a great number of theses are funded even if the benefits do not always materialize as quickly or as directly.

The Futur & Ruptures program is ten years old this year. What is your view of the program? 

CP: A very positive view. First of all, the Carnot program is a virtuous program, since it rewards research conducted by public laboratories in proportion to its effectiveness and the extent to which it fulfills its commitments to expanding partnership-based research. It is also a tool to support IMT’s strategy. Originally focused on research to support economic development, the Carnot program also supports IMT’s development plan. The same is true for the tools supported by the Foundation, so much so that a significant leverage effect can be observed—funding, development of research resources, research results—in keeping with the institute’s strategic priorities.

This reality of partnership-based research contrasts with the stereotype of research activity as a service provided for a company.   

CP: Of course. We consider it to be targeted research since it is attentive to companies’ problems, but that does not mean there is a subordinate relationship. Resourcing may be seen from different perspectives. First of all, there is this noble image of the researcher toiling away in his ivory tower to achieve his vision. This is not our standpoint, because Carnot resourcing is more connected to the vision we may have about long-term industrial needs. It is a very forward-looking position. We know, for example, that data protection and cryptography are a major issue—simply by being attentive to current industry concerns—so we have researchers working on quantum cryptography. But no company has come to ask us to fund such specialized research for use in immediate applications. That would be too risky for companies: industrial needs for quantum cryptography will not become a reality until a number of years from now. This positioning allows us to carry out fundamental research, while looking ahead to the technological obstacles that companies will face in the future.

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering. Learn more [/box]


Watermarking: a step closer to secure health data

In the near future, data watermarking could be the best traceability technique in the healthcare domain. It involves hiding information in medical images with the aim of reinforcing data security for patients and healthcare professionals. After being developed for nearly ten years in the laboratories of IMT Atlantique and Medecom, watermarking has now reached a level of maturity that allows its integration into professional products. Yet it still must be approved by standardization bodies.

 

Are you sure that is really your body on the latest X-ray from your medical exam? The question may seem absurd, yet it is crucial that you, your doctor and the radiologist can all answer this question with a resounding “yes”. To ensure this level of certainty, healthcare professionals must rely on the latest technological advances. This is a matter of ensuring the right patient gets the right diagnosis—no one wants an X-ray of their lungs to be switched with one from a chain-smoker!

To ensure an X-ray is correctly associated with the patient and to return a lost X-ray to its rightful owner, the name printed on the X-ray film is not sufficient. An ill-intentioned individual or an administrative error could cause the unfortunate exchange of two patients’ images. Medecom and researchers from IMT Atlantique, part of the Télécom & Société Numérique Carnot Institute, have been working on a more secure system based on watermarking. The two entities have been collaborating on this technology for over ten years, and four years ago they inaugurated SePEMeD, a joint laboratory focused on this area, with support from the French National Research Agency (ANR). Since then, the maturity and viability of the watermark technology have become increasingly convincing.

A Secret Message

“The watermark draws on the principle of steganography, the art of hidden writing, which is almost as old as cryptography,” explains Gouenou Coatrieux, a researcher in imagery and information processing at IMT Atlantique. “In the case of X-rays, we change some pixels in the image to hide a message and leave an invisible mark,” he continues (see box below). The value of the watermark is that the protection is independent of the storage format. The X-ray can therefore be exchanged between departments and hospitals, each with its own system for processing X-ray images, without affecting the watermark, which will continue to carry the information related to the patient.

[box type=”info” align=”” class=”” width=””]

The watermark: a message hidden in the pixels

The secret to watermarking X-rays is in the pixels, which can be encoded in 8, 16 or 32 bits. If a pixel is encoded in 8 bits, this means its color is indicated by a series of 8 bits—a bit is a 0 or 1 in the binary code. There are 256 possible 8-bit combinations: 00000000, 00000001, etc. There are therefore 256 possible colors for a pixel encoded in 8 bits, or 256 different shades of gray for a pixel in a black and white image.

Watermarking an image involves modifying certain pixels by changing one of their bits, which alters the color, or shade of gray. To prevent this from being noticeable on the X-ray, the bit containing the least amount of information, the one located at the end of the 8-bit sequence, is modified. The shades encoded by 00110101 and 00110100 differ by a single step and look almost identical, whereas 00000000 and 10110110 are very different: the higher the position of the bit that is changed, the larger the change in the resulting shade.

The changed bits in the pixels form a message, which could be a patient’s name or the doctors’ authorization to access the X-ray. To discover which bits bear this message, the X-ray recipient must have the watermark key associated with the medical image. This ensures the secrecy of the message.[/box]
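As a toy illustration of the least-significant-bit principle described in the box (not the patented Medecom/IMT Atlantique method, which additionally relies on a watermark key to choose the carrier pixels and can operate on encrypted data), the sketch below hides and recovers a few message bits in a list of 8-bit grey levels.

```python
# Hedged illustration of least-significant-bit watermarking on a toy "image":
# the message bits are written into the last bit of the first pixels, which
# changes each grey level by at most one step out of 256.

def embed(pixels, message_bits):
    """Return a copy of the 8-bit grey levels with message_bits hidden in the LSBs."""
    marked = list(pixels)
    for i, bit in enumerate(message_bits):
        marked[i] = (marked[i] & 0b11111110) | bit  # clear the LSB, then set it to the bit
    return marked

def extract(pixels, n_bits):
    """Read back the n_bits hidden in the least significant bits."""
    return [p & 1 for p in pixels[:n_bits]]

pixels = [53, 52, 200, 199, 128, 127, 64, 63]   # toy 8-pixel "image"
message = [1, 0, 1, 1, 0, 1, 0, 0]              # e.g. part of a patient identifier
marked = embed(pixels, message)
print(marked)                          # each value differs from the original by at most 1
print(extract(marked, 8) == message)   # True: the hidden message is recovered
```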

In addition to traceability, watermarking has other advantages. First, it can help detect insurance fraud. If an X-ray is tampered with by an ill-intentioned individual, for example to fake a disease, the secretly watermarked pixels will also be modified, revealing the attempted fraud. Next, the watermark can be added to data that is already encrypted using a method that has been patented by Medecom and IMT Atlantique. It is therefore possible to ensure traceability while maintaining the confidentiality of medical information the image contains. This also makes it possible to write information about certain doctors’ access authorizations directly on the encrypted data.

Moving towards standardization?

While this watermark technology is now mature, it still must pass the test of standardization procedures in order to be implemented in software and the information systems of healthcare professionals. “Our goal now is to show that altering the image with the watermark does not have any effect on the quality of the image and the doctors’ diagnostic capacity,” says Michel Cozic, R&D director at Medecom. The SePEMeD team is therefore working to conduct qualitative studies on watermarked data with physicians.

At the same time, they must convince certain healthcare professionals of the value of watermarking.  The protection of personal data, and medical data in particular, is not always viewed the same way throughout the healthcare world. “In the hospital environment, professionals tend to believe that the environment is necessarily secure, which is not always the case,” Michel Cozic explains. In France, and in Europe in general, attitudes about data security are changing. The new General Data Protection Regulation (GDPR) established by the European Commission is proof of this. However, it will be some time before the entire medical community systematically takes data protection into account.

Ten years of research… and ten more to come?

Since there is still a long way to go before healthcare professionals begin using watermarks, the SePEMeD story is not over yet. Founded in 2014 to solidify the collaboration between IMT Atlantique and Medecom, which has lasted over ten years, SePEMeD was originally intended to run only three years. However, following the success of the research which led to promising applications, this first joint laboratory accredited by the ANR on data security will continue its work until at least 2020. Beyond data traceability, SePEMeD is also seeking to improve the security of remotely processed encrypted images in cloud storage.

“We update our focus areas based on our results,” Gouenou Coatrieux notes, explaining why the SePEMeD laboratory has been extended. Michel Cozic agrees: “We are currently focusing our research on issues related to browsers’ access to data, and the integration of watermarking modules in existing products used by professionals.” The compatibility of algorithms with healthcare institutions’ computer configurations and systems will be a major issue in the adoption of this technology. Last but not least: ease of use. “No one wants to have to enter passwords in the software,” observes Medecom’s R&D Director. “We must therefore succeed in integrating watermarking as a security solution that is straightforward for doctors.”

[divider style=”normal” top=”20″ bottom=”20″]

The benefit of collaborating with IMT Atlantique: “The human aspect”

Michel Cozic

One of the Télécom & Société Numérique Carnot Institute’s objectives is to professionalize the relationships between companies and researchers. Michel Cozic, Medecom’s R&D Director, shares his experience: “There is also a human aspect to these collaborations. Our exchanges with IMT Atlantique go very smoothly; we understand each other. On both sides we accept each other’s differences and constraints, and we compromise. We come from two different environments and this means we must have discussions. There must be an atmosphere of trust, a good relationship and a common understanding of the objectives. This is what we have been able to accomplish through the SePEMeD laboratory.”

[divider style=”normal” top=”20″ bottom=”20″]

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies.

The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.

[/box]


A new laser machining technique for industry

FEMTO-Engineering, part of the Télécom & Société numérique Carnot institute, offers manufacturers a new cutting and drilling technique for transparent materials. By using a femtosecond laser, its experts reach unrivalled levels of precision when machining ultra-hard materials. Jean-Pierre Goedgebuer, director of FC’Innov (FEMTO-Engineering), explains how the technique works.

What is high aspect ratio precision machining and what is it used for?

Jean-Pierre Goedgebuer: Precision machining is used for cutting, drilling and engraving materials. It allows various designs to be inscribed onto materials such as glass, steel or stainless steel, and it is a very widespread method in industry. Precision machining refers to positioning and shaping at an extremely small scale, in the range of 2 microns (10⁻⁶ meters). The term “aspect ratio” refers, for example, to drilling: it is the ratio between the depth of a hole and its diameter. An aspect ratio of 100 therefore corresponds to a hole whose diameter is 100 times smaller than its depth.
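
In other words (the numbers below are chosen purely for illustration):

\[
\text{aspect ratio} \;=\; \frac{\text{depth}}{\text{diameter}},
\qquad
\text{e.g.}\quad \frac{200\ \mu\text{m}}{2\ \mu\text{m}} \;=\; 100 .
\]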

Cutting or drilling requires destroying material locally, in a controlled way. To achieve this, we supply energy with a laser, which generates heat when it comes into contact with the material.

 

What is femtosecond machining?

JPG: The term femtosecond [1] refers to the duration of the laser pulses, which last a few tens or hundreds of femtoseconds. The length of the pulse determines the length of the interaction between the light and the material. The shorter it is, the fewer thermal exchanges there are with the material and therefore, in principle, the less the surrounding material is damaged.

In laser machining, we use short pulses (femtoseconds, 10⁻¹⁵ of a second) or longer pulses (nanoseconds, 10⁻⁹ of a second). The choice depends on the application. For machining with no thermal effect, that is, where the material is not affected by the heat produced by the pulse, we tend to use femtosecond pulses, which offer a good compromise between material removal and temperature rise. These techniques are associated with light propagation models that allow us to simulate how a material’s properties affect the propagation of the light passing through it.
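
One back-of-the-envelope way to see why the pulse duration matters (typical orders of magnitude, not figures from the interview): the distance over which heat diffuses during a pulse scales as the square root of the pulse duration, taking a thermal diffusivity of about \(D \approx 10^{-5}\ \text{m}^2/\text{s}\), typical of metals:

\[
\ell_{\text{heat}} \sim \sqrt{D\,\tau}
\quad\Rightarrow\quad
\begin{cases}
\tau = 100\ \text{fs}: & \ell \sim \sqrt{10^{-5} \times 10^{-13}} \approx 1\ \text{nm},\\[4pt]
\tau = 10\ \text{ns}: & \ell \sim \sqrt{10^{-5} \times 10^{-8}} \approx 0.3\ \mu\text{m}.
\end{cases}
\]

With femtosecond pulses, the heat-affected zone is thus hundreds of times smaller than with nanosecond pulses.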

 

The femtosecond machining technique generally uses Gaussian beams. The defining characteristic of your process is that it uses Bessel beams. What is the difference?

JPG: Gaussian laser beams are beams in which the energy is distributed in a Gaussian way. At high energy levels, they produce non-linear effects as they propagate through materials. In particular, they produce self-focusing effects, which make their diameter non-constant and distort their propagation. These effects can be detrimental to the machining quality of certain special kinds of glass.

In contrast, Bessel laser beams, such as those we use in our machining technique, allow us to avoid these non-linear effects. They therefore maintain a constant diameter over a well-defined length. They act as very fine “laser needles”, measuring just a few hundred nanometers in diameter (a nanometer is on the order of a few atomic diameters). Inside these “laser needles” the energy is very highly concentrated. This generates an extremely localized plasma within the material, which ablates it. Furthermore, we can control the length of these “laser needles” very precisely. We use them for very deep cutting or drilling (with aspect ratios of up to 2,000), producing a precise, clean result with no thermal effects.
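
For reference, the textbook intensity profiles behind this comparison (standard forms, not specific to FEMTO-Engineering’s setup): an ideal Bessel beam has a transverse profile that does not depend on the propagation distance, whereas a Gaussian beam spreads as it propagates:

\[
I_{\text{Bessel}}(r) \propto J_0^2(k_r r),
\qquad
I_{\text{Gauss}}(r,z) = I_0 \left(\frac{w_0}{w(z)}\right)^{2} e^{-2r^2/w(z)^2},
\quad
w(z) = w_0\sqrt{1+\left(\frac{z}{z_R}\right)^{2}},
\]

where \(J_0\) is the zeroth-order Bessel function, \(k_r\) the radial wave number, \(w_0\) the beam waist and \(z_R\) the Rayleigh length. The constant-diameter “laser needle” corresponds to the region over which the Bessel profile is maintained.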

To implement this new technology, we start from a conventional femtosecond laser. What led to several patents being filed by the Institut FEMTO-ST was the method for transforming Gaussian beams into Bessel beams.

 

What is the point of this new technology?

JPG: There are two main benefits. Since the “laser needles” concentrate a high density of energy, it is possible to drill very hard materials that would pose a problem for traditional laser machining techniques. And thanks to the technique’s athermal nature, the material keeps its physicochemical properties intact; it does not change.

This machining method is used for transparent materials. Industrial demand is high, as many products require the machining of harder transparent materials. This is the case, for example, with smartphones, whose screens need to be made from special kinds of very durable, scratch-resistant glass. This is a big market and a major focus for many laser manufacturers, particularly in Europe, the US and, of course, Asia. There are several other applications as well, including in the biomedical field.

 

What’s next for this technique?

JPG: Our mission at FEMTO Engineering is to build on and transfer the research coming out of the Institut FEMTO-ST. In this context, we have partnerships with manufacturers with whom we are exploring how this new technology could meet their needs for very specific materials, where traditional femtosecond machining does not give satisfactory results. We are currently working on cutting new materials for smartphones, as well as polymers for medical use.

The fundamental research carried out by the Institut FEMTO-ST continues to focus in particular on better understanding light-matter interaction mechanisms and plasma formation. This research was recently recognized by the ERC (European Research Council), which funds exploratory projects that encourage scientific discovery. The aim is to fully master the physics of Bessel beam propagation, which has received little scientific study until now.

[1] A femtosecond corresponds to one millionth of a billionth of a second (10⁻¹⁵ s). It is roughly the period of one oscillation of a visible light wave. A femtosecond is to a second what a second is to about 30 million years.



VIGISAT: monitoring and protection of the environment by satellite

Following on from our series on the platforms provided by the Télécom & Société numérique Carnot institute, we now turn to VIGISAT, based near Brest. This collaborative hub is also a project dedicated to high-resolution satellite monitoring of the oceans and continents.

 

On 12 July, scientists in Wales observed a drifting iceberg four times the size of London. The imposing block of ice broke away from Antarctica, is currently drifting in the Weddell Sea, and has started to crack. This close monitoring of icebergs was made possible by satellite images.

Although it was not directly behind this particular observation, the Breton observation station VIGISAT is heavily involved in maritime surveillance. It also gathers information that is useful for protecting marine and terrestrial environments. René Garello, a researcher at IMT Atlantique, presents the main issues.

 

What is VIGISAT?

René Garello: VIGISAT is a reception center for satellite data (radar sensors only) operated by CLS (Collecte Localisation Satellites) [1]. The station benefits from the expertise of the Groupement d’Intérêt Scientifique Bretagne Télédétection (BreTel) community, made up of nine academic members and partners from the socio-economic world. Its objective is to demonstrate the value of easily accessible data for developing methods for observing the planet. It serves both the research community (through its academic partners) and “end users” from the business world.

VIGISAT is also a project within the Breton CPER (Contrat de Plan État-Région) framework, which has been renewed to run until 2020. The station/project concept was named a platform by the Institut Carnot Télécom & Société Numérique at the end of 2014.

 

The VIGISAT station

 

What data does VIGISAT collect and how does it process this?

RG: The VIGISAT station receives data from satellites carrying Synthetic Aperture Radars (better known as SARs). This microwave sensor makes it possible to obtain very high-resolution imaging of the Earth’s surface. The data received by the station come from the Canadian satellite RADARSAT-2 and, in particular, from the new series of European Sentinel satellites. These are sun-synchronous satellites [NB: the satellite always passes over a given point at the same local solar time], which orbit at an altitude of 800 km and circle the Earth in about 100 minutes.
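
As a consistency check on those figures (standard two-body formula, values rounded), the period of a circular orbit at 800 km altitude is

\[
T = 2\pi\sqrt{\frac{a^{3}}{\mu}}
= 2\pi\sqrt{\frac{(6371 + 800)^{3}\ \text{km}^{3}}{3.986\times 10^{5}\ \text{km}^{3}/\text{s}^{2}}}
\approx 6.0\times 10^{3}\ \text{s} \approx 100\ \text{minutes},
\]

which matches the roughly 100-minute orbit mentioned above.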

We receive the raw information collected by the satellites; in other words, the data come in the form of unprocessed bit streams. The data are then transmitted by optical fiber to the processing center, which is also located on the site. “Radar images” are then constructed from the raw information and the radar’s known parameters. The final data, although in image form, require expert interpretation. In simple terms, the emitted radar wave is sensitive to the properties of the observed surfaces: each type of land cover (vegetation, bare surfaces, urban landscapes, etc.) returns its own characteristic signal. The information retrieved also depends on the instrument’s intrinsic parameters, such as the wavelength or the polarization.
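
As a purely illustrative sketch of the very last step of that chain, the snippet below turns focused complex radar samples into a backscatter image in decibels. The function name backscatter_db and the flat calibration constant are invented for the example; the real VIGISAT processing chain (SAR focusing, geocoding, radiometric calibration from the sensor’s own parameters) is far more involved.

```python
# Toy illustration: convert focused single-look complex (SLC) SAR samples
# into a backscatter image in decibels. Not the operational processing chain.
import numpy as np

def backscatter_db(slc, calibration_constant=1.0):
    """Convert complex SAR samples to backscatter intensity in dB."""
    intensity = np.abs(slc) ** 2 / calibration_constant   # |amplitude|^2, scaled
    return 10.0 * np.log10(np.maximum(intensity, 1e-12))  # avoid log(0)

# Fake 4x4 tile of complex samples standing in for a focused SAR product
rng = np.random.default_rng(0)
tile = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
print(backscatter_db(tile).round(1))
```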

 

What scientific issues are addressed using VIGISAT data?

RG: CLS and researchers from the members of GIS BreTel are working on diverse and complementary issues. At IMT Atlantique and Rennes 1 University, we mainly focus on the methodological aspects. For example, we have built up twenty years of expertise in the statistical processing of images, which allows us, in particular, to identify areas of interest in terrestrial images or surface types on the ocean. More recently, we have been confronted with the sheer volume of the data we collect. We have therefore put machine learning, data mining and other algorithms in place in order to fully exploit these databases.
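
As an illustration of the kind of unsupervised classification mentioned here, the sketch below clusters the pixels of a synthetic backscatter image into a few classes that an analyst could then label as water, vegetation or urban areas. The synthetic data, the choice of three classes and the use of k-means are assumptions made for the example, not a description of BreTel’s actual algorithms.

```python
# Sketch of unsupervised surface-type classification on a synthetic SAR
# intensity image: pixels are grouped into classes an analyst could label.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Synthetic "backscatter" image in dB: three surface types with different means
image_db = np.concatenate([
    rng.normal(-20, 2, size=(100, 100)),   # e.g. calm water (low backscatter)
    rng.normal(-10, 2, size=(100, 100)),   # e.g. vegetation
    rng.normal(-2, 2, size=(100, 100)),    # e.g. urban areas (strong echoes)
], axis=0)

pixels = image_db.reshape(-1, 1)                       # one feature per pixel
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
class_map = labels.reshape(image_db.shape)             # thematic map to inspect
print(np.bincount(labels))                             # number of pixels per class
```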

Other GIS institutions, such as Ifremer or IUEM [2], are working on marine and coastal topics, in collaboration with us. For example, research has been carried out on estuary and delta areas, such as the Danube. The aim is to quantify the effect of flooding and its persistence over time.

Finally, continental themes such as urban planning, land use, agronomy and ecology are mainly studied by Rennes 2 University and Agrocampus. In the case of urban planning, satellite observations allow us to locate and map the urban green fabric. This allows us to estimate, for example, the allergenic potential of public spaces. It should be noted that much of this work, which began as research, has led to the creation of viable start-ups [3].

What projects has VIGISAT led?

RG: Since 2010, VIGISAT’s privileged access to data has allowed it to support various other research projects. Indeed, it has created a lasting dynamic within the scientific community around land-use planning, surveillance and the controlled exploitation of territories. Among the projects currently underway is, for example, CleanSeaNet, which focuses on the detection and monitoring of marine pollution. KALIDEOS-Bretagne looks at changes in land cover, land use and landscapes along a town-countryside gradient. SESAME deals with the management and exploitation of satellite data for maritime surveillance purposes.

 

Who is benefitting from the data analyzed by VIGISAT?

RG: Several targets were identified while preparing the CPER 2015-2020 support request. One of these objectives is to generate activity around the use of satellite data by Breton businesses. This includes developing new public services based on satellite imaging, in order to foster downstream services as part of a strategy of developing regional subsidiaries.

One sector that clearly benefits from the data and their processing is the highly reactive socio-economic world (start-ups, SMEs, etc.) built on the uses we discussed earlier. On a larger scale, protection and surveillance services are also addressed through action coordinated between the developers and suppliers of a service, such as the GIS, and the authorities at regional, national and European level. For example, BreTel has been a member of NEREUS (Network of European Regions Using Space technologies) since 2009. This allows the region to hold a strong position as a center of expertise in maritime surveillance (including the detection and monitoring of oil pollution) and in the analysis of ecological corridors in the context of biodiversity.

[1] CLS is an affiliate of CNES, ARDIAN and Ifremer. It is an international company that has specialized in Earth observation and surveillance solutions since 1986.
[2] European Institute for Marine Studies
[3] Some examples of these start-ups include: e-ODYN, Oceandatalab, Hytech Imaging, Kermap, Exwews, and Unseenlab.

[box type=”info” align=”” class=”” width=””]

On VIGISAT:

The idea for VIGISAT began in 2001 with the start-up BOOST Technologies, a spin-off from IMT Atlantique (formerly Télécom Bretagne). From 2005, proposals were made to various partners, including the Bretagne Region and the Brest Metropolis, to develop an infrastructure like VIGISAT on the campus near Brest. Following BOOST Technologies’ merger with CLS in 2008, the project flourished with the creation of GIS BreTel in 2009. In the same year, the VIGISAT project was also successfully presented to the CPER. BreTel then expanded its roadmap by adding the “research” strand, along with the “training”, “innovation” and “promotion/dissemination” aspects. GIS BreTel is currently focusing on the “activity creation” and “new public services” strands, which are in tune with the philosophy of the Carnot platforms.

BreTel also has a presence at the European level. The GIS and its members have been granted the title of “Copernicus Academy”. Thanks to this, they receive support from specialists in the European Copernicus program for all their education needs. From the end of 2017, BreTel and its partners will be participating in an ESA Business Incubation Centre (ESA BIC) covering five regions in northern France (Brittany, Pays de la Loire, Île-de-France, Hauts-de-France and Grand-Est), headed by the Brittany region.[/box]

[box type=”shadow” align=”” class=”” width=””]

The TSN Carnot institute, a guarantee of excellence in partnership-based research since 2006

 

Having first received the Carnot label in 2006, the Télécom & Société numérique Carnot institute is the first national “Information and Communication Science and Technology” Carnot institute. Home to over 2,000 researchers, it is focused on the technical, economic and social implications of the digital transition. In 2016, the Carnot label was renewed for the second consecutive time, demonstrating the quality of the innovations produced through the collaborations between researchers and companies. The institute encompasses Télécom ParisTech, IMT Atlantique, Télécom SudParis, Télécom École de Management, Eurecom, Télécom Physique Strasbourg and Télécom Saint-Étienne, École Polytechnique (Lix and CMAP laboratories), Strate École de Design and Femto Engineering.[/box]
