
Simplicity theory: teaching relevance to artificial intelligences

Simplicity theory is founded on humans’ sensitivity to variations in complexity: an event that turns out to be abnormally simple suddenly becomes interesting. This concept, developed by Jean-Louis Dessalles of Télécom ParisTech, challenges Shannon’s purely probabilistic way of characterizing information. With this new approach, he can account for phenomena that probabilities alone cannot explain, such as creativity, decision-making, coincidences, or “if only I had…” thoughts. And all of these concepts could someday be applied to artificial intelligences.

 

How can we pinpoint the factors that determine human interest? Jean-Louis Dessalles, a researcher at Télécom ParisTech, addresses this seemingly philosophical question from the perspective of information theory. To mark the centenary of the birth of Claude Shannon, the founder of information theory, his successors will present their recent research at the Institut Henri Poincaré from October 26 to 28. On this occasion, Jean-Louis Dessalles will defend his premise: the solution to an apparently complex problem can be found in simplicity.

Founded on the work of Claude Shannon

Claude Shannon defined information based on three principles: coding, surprise, and entropy. Entropy measures and eliminates redundancy, an idea later generalized by Kolmogorov complexity, which defines the complexity of an object as the size of its shortest description. For example, suppose a message is transmitted that is identical to one previously communicated. Its shortest description is simply “a copy of the previous message”. Its complexity therefore collapses, and with it the quantity of information it carries.
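
Kolmogorov complexity is uncomputable in general, but off-the-shelf compressors give a usable upper bound on description size. A minimal Python sketch, using zlib purely as an illustrative proxy (the choice of compressor is our assumption, not part of the theory):

```python
import zlib

def description_bits(data: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity:
    size of the zlib-compressed representation, in bits."""
    return 8 * len(zlib.compress(data, 9))

message = b"The measurements for sector 7 are unchanged since Tuesday. "
repeated = message * 20  # the same message transmitted twenty times

print(description_bits(message))   # cost of the message itself
print(description_bits(repeated))  # far less than 20x as much: the copies
                                   # are redundant, so complexity collapses
```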

In his research, Jean-Louis Dessalles builds on Shannon’s premises, but in his opinion the probabilistic approach is not enough. By way of illustration, the researcher uses the example of the lottery. “Imagine the results of a lottery draw are 1, 2, 3, 4, 5, 6. From a probability perspective, there is nothing astonishing about this combination, because it is exactly as probable as any other. However, humans find it sensational, and see it as far more astonishing than a series of numbers in no particular order.” Yet Shannon held that what is improbable is interesting. For Jean-Louis Dessalles, this presents a problem: probabilities are unable to represent human interest, or the kind of information a human being will actually take into account.

The simplicity theory

Jean-Louis Dessalles proposes a new cognitive approach that he calls simplicity theory. It focuses not on the minimal description of information as such, but on discrepancies in complexity: the difference between what the observer expects and what he or she actually observes. This is how he redefines Shannon’s concept of surprise. For a human observer, what is expected corresponds to a causal, generative process. In the lottery example, the observer expects a set of six numbers with no relation to one another. If the result is “1, 2, 3, 4, 5, 6”, however, the observer recognizes a logical sequence, one with very low Kolmogorov complexity. Between the expected combination and the observed one, the description of the draw has been drastically simplified: six arbitrary numbers collapse into one easily grasped pattern. The expected complexity of the six lottery numbers is much higher than the complexity of the draw actually obtained. An event is therefore perceived as surprising when it is simpler to describe than to produce.
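
In Dessalles’ formal presentations of the theory, unexpectedness is the gap between the complexity of generating a situation and the complexity of describing it, U = C_w − C. A toy calculation for the lottery example (the simplified two-option description language is our assumption, made purely for illustration):

```python
import math

BALLS, PICKS = 49, 6

# Generation complexity: producing one specific draw among all the
# equiprobable outcomes costs ~log2(number of outcomes) bits,
# the same for every draw.
C_w = math.log2(math.comb(BALLS, PICKS))          # ~23.7 bits

# Description complexity in a toy description language: "the first
# six integers" can be named very cheaply, whereas a patternless
# draw can only be described by listing all six numbers.
C_ordered = math.log2(BALLS)                      # name the pattern: ~5.6 bits
C_random = C_w                                    # no shortcut: spell it out

print(f"U(1,2,3,4,5,6) = {C_w - C_ordered:.1f} bits of unexpectedness")
print(f"U(random draw) = {C_w - C_random:.1f} bits: unremarkable")
```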

Simplicity theory was originally developed to account for what people find interesting in language. The concept of relevance is central here: it covers everything in a piece of information that is worthy of interest, something humans detect very easily. Human interest has two components: the unique aspect of a situation and the emotion linked to the information. Applied in this way, simplicity theory can help detect the relevance of news in the media, the subjectivity surrounding an event, or the associated emotional intensity. This emotional reaction depends on a spatio-temporal factor. “An accident that occurs two blocks away from your house will have a greater impact than one that occurs four blocks away, or fifteen. The proximity of a place shows up as a reduction in the number of bits needed to describe it: the closer it is, the simpler it is,” the researcher explains. And Jean-Louis Dessalles has plenty of examples to illustrate simplicity, especially since each scenario can be studied in retrospect to better identify what is or is not relevant. This is the very strength of the theory: it characterizes precisely what departs from the norm, which is usually left unstudied.
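
The quoted intuition about proximity can be made concrete: at a fixed spatial resolution, singling out one place among all the places within a given distance costs roughly the logarithm of their number in bits, so nearer locations are cheaper to describe. A toy sketch (the uniform-density assumption is ours, purely for illustration):

```python
import math

def location_bits(distance_km: float, sites_per_km2: float = 1000.0) -> float:
    """Bits needed to single out one site among all sites within
    distance_km of the observer (uniform density assumed)."""
    candidates = max(1.0, sites_per_km2 * math.pi * distance_km ** 2)
    return math.log2(candidates)

for d in (0.2, 1.0, 5.0):
    print(f"{d:4.1f} km -> {location_bits(d):4.1f} bits")
# The closer the event, the fewer bits its location needs:
# simpler to describe, hence more relevant.
```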

Read the blog post Artificial Intelligence: learning driven by childlike curiosity

Research for artificial intelligence

The ultimate goal of Jean-Louis Dessalles’ theory is to enable a machine to determine what is relevant without being told ahead of time. “Current AIs fail in this area. Providing them with this ability would enable them to determine when they must compress information,” Jean-Louis Dessalles explains. Today’s artificial intelligences rely on a statistical notion of relevance, often synonymous with importance, which is far removed from relevance as perceived by humans. “AIs based on the principle of statistical machine learning are unable to identify the distinctive aspect of an event that creates its interest, because the learning process eliminates everything that is out of the ordinary,” the researcher explains. Simplicity theory, on the other hand, can characterize any event, to the point that the theory currently seems limitless. The researcher recommends that relevance be learned the way children learn it naturally. And beyond the notion of interest, the theory encompasses the computability of creativity, regret, and decision-making: all concepts that will matter for future artificial intelligence programs.

Read the blog post What is Machine Learning?

Claude Shannon, code wizard

To celebrate the centenary of Claude Shannon’s birth, the Institut Henri Poincaré is organizing a conference dedicated to information theory from October 26 to 28. The conference will explore the mathematician’s legacy in current research and the areas taking shape in the field he created. Institut Mines-Télécom, a partner of the event along with CNRS, UPMC and Labex Carmin, will participate through presentations from four of its researchers: Olivier Rioul, Vincent Gripon, Jean-Louis Dessalles and Gérard Battail.

To find out more about Shannon’s life and work, CNRS has created a website that recounts his journey.


City4age, the elderly-friendly H2020 project

In the framework of the European research program H2020, Institut Mines-Télécom is taking part in the City4age project, which aims to provide a smart city model adapted to the elderly. Through non-intrusive technologies, the goal is to improve their quality of life and to facilitate the work of health services. Mounir Mokhtari, researcher and director of IPAL[1], contributes to the project in the test city of Singapore. Below is an interview the researcher gave to LePetitJournal.com, a French-language news site for French nationals abroad.

 

Mounir Mokhtari, Director of IPAL

 

LePetitJournal.com : What are the research areas and stakes involved in “City4age”?

Mounir Mokhtari : Today, in Europe as in Singapore, the population is ageing and the number of dependent elderly persons is rising sharply, even as the skilled labour force available to look after them has decreased significantly. The usual response is institutionalisation. Our objective is to maintain the autonomy of these people at home and in the city, and to improve their quality of life and that of their caregivers (family, friends, etc.), by integrating non-intrusive, easy-to-use technologies into daily life.

It involves developing technological systems that motivate elderly persons in frail health to stay active, reinforce social ties and prevent risks. The objective is to install non-intrusive sensors, information systems and communication devices in ordinary homes, and to create simple user interfaces on everyday objects such as smartphones, TV screens and tablets, to assist dependent people in their daily living.

 

LPJ : What are the principal challenges in the research?

MM : The first challenge is to identify the normal behavior of the person, to know his or her habits, so as to detect changes that may signal a decline in cognitive or motor skills. This involves collecting the extensive information available through connected objects and lifestyle habits, which we use to define a “user profile”.

Then the data obtained is interpreted and a service is provided to the person. Our objective is not to monitor people but to identify their precise areas of interest (leisure, shopping, exercise) and to encourage them to keep up those activities, so as to avoid an isolation that could degrade their quality of life or even their health.
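
As a rough illustration of this kind of processing (our own toy sketch, not the actual City4age software), one can keep a rolling baseline of a daily activity indicator, such as the number of outings, and flag a sustained drop relative to the person’s own habits:

```python
from statistics import mean, stdev

def sustained_drop(daily_outings: list, window: int = 28,
                   threshold: float = 2.0) -> bool:
    """Toy change detector: compare the last week with the person's
    own rolling baseline and flag a sustained drop in activity."""
    baseline = daily_outings[-(window + 7):-7]  # the 4 weeks before last
    recent = daily_outings[-7:]                 # the last week
    mu = mean(baseline)
    sigma = stdev(baseline) or 1.0              # avoid dividing by zero
    return (mean(recent) - mu) / sigma < -threshold

history = [2, 3, 2, 2, 3, 2, 2] * 4 + [0, 1, 0, 0, 1, 0, 0]
print(sustained_drop(history))  # True -> gently prompt the person and friends
```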

For this, we use decision-support and machine-learning tools, along with the Semantic Web. It’s the same principle, if you like, that Google uses to suggest appropriate search results (graph theory), with an additional difficulty in our case related to the human factor. It is all about getting machines, which interpret things logically, to make sense of subjective behavioural data. But that is also where the interest of this project lies, besides the strong societal issue. We work with doctors, psychologists, ergonomists, physiotherapists, occupational therapists, social science specialists, etc.

 

LPJ : Can you give us a few simple examples of such an implementation?

MM : To help maintain social ties and activity levels, take the example of an elderly person who is in the habit of going to his or her Community Centre and taking meals at the hawker centre. If the system detects that this person has cut down on outings, it will generate a prompt encouraging him or her to get out of the home again, for example: “Your friends are at the hawker centre and are about to eat; you should join them.” The system can simultaneously notify the friends on their mobiles that the person has not been out for a long time and suggest that they pay a visit, for example.

Concerning the elderly who suffer from cognitive impairment, we focus on the key functions affected, which are simple daily activities: sleeping, hygiene, eating, and the risk of falls. For example, we install motion sensors in rooms to detect possible falls. We equip beds with optical-fibre sensors to monitor the person’s breathing and heart rate, to spot potential sleep problems, apnea or cardiac risks without disturbing the person’s rest.

 

LPJ : An application in Singapore?

MM : Our research is highly applied, with a strong industry focus and a view to quick deployment to the end-user. The solutions developed in the laboratory are validated first in a show flat, then in clinical tests. At the moment, we are carrying out tests at the Khoo Teck Puat hospital to validate our non-intrusive sleep-monitoring solutions.

Six pilot sites were chosen to validate the deployment of City4age in situ, including Singapore, which is testing the maintenance of social ties and activity levels among the elderly via the Community Centres in HDB neighbourhoods. The target group is around 20 people aged 70 and above, frail or suffering from mild cognitive impairment, who are integrated in a community, most often through a Senior Activity Centre. The test also involves the volunteers who help these elderly persons in their community.

 

LPJ : What is your background in Singapore?

MM : My research has concentrated mainly on technologies that can assist dependent people. I came to Singapore for the first time in 2004 for the International Conference on Smart Homes and Health Telematics (ICOST), which I organised.

I then discovered a scientific ecosystem I had not been aware of (at the time, attention was focused on the USA and a few European cities). I was pleasantly surprised by the dynamism, the infrastructure already in place and the new facilities being built at a frantic pace, and above all by a country that is very active in research on new technologies.

I have kept up exchanges with Singapore ever since, and finally decided to join the IPAL laboratory, to which I have been seconded by Institut Mines-Télécom since 2009. I took over the direction of IPAL in 2015 to develop this research.

 

LPJ : What is your view of the MERLION programme?

MM : The PHC MERLION is very relevant and attractive for the creation of new teams. French diplomacy and MERLION provided undeniable leverage in launching projects and consolidating collaborations with our partners.

The programme provides a framework that creates opportunities, encourages exchanges between researchers and participation in international conferences, and contributes to the emergence of new collaborations.

Without the MERLION programme, for example, we would not have been able to create the SINFRA symposium (Singapore-French Symposium) in 2009, which has become a biennial event for the IPAL laboratory. In addition, the theme of “Inclusive Smart Cities and Digital Health” was introduced at IPAL thanks to a MERLION project headed by Dr. Dong Jin Song, who is today the co-director of IPAL for NUS.

Beyond its diplomatic and financial support, the Embassy also participates in IPAL’s activities by making one of its staff members available part-time, integrated into the project team at IPAL.

 

LPJ : Do you have any upcoming collaborations?

MM : We are planning a new collaboration between IPAL and the University of Bordeaux – which specialises in social sciences – for a behavioural study to help us in our current research.  We are thinking of applying for a new MERLION project in order to kickstart this new collaboration.  It is true that the Social Sciences aspect, despite its importance in the well-being of the elderly and their entourage, is not very well-developed in the laboratory. This PHC MERLION proposal may well have the same leverage as the previous one.

Beyond the European project City4Age, IPAL has just signed a research collaboration agreement with PSA Peugeot-Citroën on mobility in the city and well-being, with a focus on the management of chronic diseases such as diabetes and respiratory illnesses. There is also an ongoing NRF (National Research Foundation) project with NUS (National University of Singapore), led by Dr. Nizar Quarti, a member of IPAL, on mobile and visual robotics.

Interview by Cécile Brosolo (www.lepetitjournal.com/singapour) and translation by Institut Français de Singapour, Ambassade de France à Singapour.

[1] IPAL: Image & Pervasive Access Lab, a CNRS international joint research unit (UMI) based in Singapore.

Claude Shannon, a legacy transcending digital technology

Claude Shannon, a major scientist of the second half of the 20th century, marked his era with his theory of communication. His work triggered a digital metamorphosis that today affects all levels of our societies. To celebrate what would have been Shannon’s 100th birthday this year, the Institut Henri Poincaré will pay tribute to the scientist with a conference from October 26 to 28. At this event, Olivier Rioul, a researcher at Télécom ParisTech, will provide insight into this pioneer of the communication field and present part of the legacy he left behind.

 

 

Claude Elwood Shannon. The name is not well known to the general public. And yet, if the digital revolution the world is experiencing today had a father, it would doubtless be this man, born in 1916 in a small town in Michigan. His life, which ended in 2001, received little media coverage. Unlike Alan Turing, he has had no Hollywood blockbuster dedicated to him. Nor has his identity been mythicized in artistic circles, as Einstein’s was. “Shannon led an ordinary life, and perhaps that is why nobody talks about him,” observes Olivier Rioul, researcher in digital communications at Télécom ParisTech.

Though his life was not particularly extraordinary, Claude Shannon’s work, on the other hand, was thrilling in many ways. In 1948, he published an article entitled A Mathematical Theory of Communication. “Its publication came as a revolution in the scientific world,” explains Olivier Rioul. In this article, Claude Shannon introduced the bit as the unit of information. He also outlined, for the first time, a schematic diagram of a communication channel, covering all the active parts involved in transmitting a signal from its source to its destination.


First schematic diagram of a communication system, published by Claude Shannon in 1948. He explained that a channel could be “a coaxial cable, a band of radio frequencies, a beam of light, etc.”

 

Shannon and his magic formula

Yet in addition to his channel diagram, it was above all a formula published in the same 1948 paper that went on to mark the scientific community: C = W·log₂(1 + SNR). With this mathematical expression, Shannon defined the maximum capacity of a transmission channel, in other words, the quantity of information that can be transmitted reliably over it. It shows that this capacity depends solely on the channel’s bandwidth and on the ratio of the transmitted signal’s power to the noise in the channel. It follows that every channel has a throughput limit, below which the message can be carried from transmitter to receiver without being altered.
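
In this formula, W is the channel bandwidth in hertz and SNR the signal-to-noise power ratio (a linear ratio, not decibels). A quick numeric sketch (the parameter values are purely illustrative):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Capacity of a noisy channel in bits per second:
    C = W * log2(1 + SNR), with the SNR given here in decibels."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# e.g. a 3.1 kHz telephone-grade channel with a 30 dB signal-to-noise ratio
print(f"{shannon_capacity(3100, 30):,.0f} bit/s")  # ~30,900 bit/s
```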

“Shannon’s strength lies in having succeeded in obtaining this result in a theoretical way,” insists Olivier Rioul. “Shannon did not provide the solution required to reach this limit, but showed that it exists, for all channels.” It would not be until 43 years later, with the work of Berrou, Glavieux and Thitimajshima in 1991, that Shannon’s limit would be nearly reached for the first time, thanks to the development of turbo codes.

Olivier Rioul believes the story behind this formula is out of the ordinary, and has been the subject of many historical approximations. “And the time was ripe. In 1948 – the year in which Claude Shannon made his results public – seven other scientists published similar formulas,” he explains, based on research carried out with José Carlos Magossi on the history of this formula.

However, the results obtained by Shannon’s peers were sometimes inaccurate, and sometimes inspired by Shannon’s prior work and therefore not very original. All of them, moreover, belonged to the same environment, were in contact with each other or attended the same conferences. All except Jacques Laplume, a French engineer who obtained a correct formula similar to Shannon’s at almost the same time. What he lacked, and what kept him from leaving his mark on history, was everything else that Shannon’s theory contributed.

Read the blog post What are turbo codes?

A pioneer in communications, but that’s not all…

While his work represents the beginnings of modern digital communications, Claude Shannon also left behind a much greater legacy. In 1954, behavioral psychologist Paul Fitts published the law that now bears his name, used to model human movement. In his scientific article, he explicitly cited Shannon’s theorem, referring to the channel capacity formula. “Today we use Fitts’ law to study human-computer interactions,” explains Olivier Rioul, who worked with a PhD student on reconciling this law with Shannon’s theory.
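
Fitts’ law predicts the time needed to point at a target of width W at distance D; its modern, Shannon-inspired formulation reuses the shape of the capacity formula. A small sketch (the constants a and b below are arbitrary illustrative values; in practice they are fitted to experimental data):

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' law, Shannon formulation: MT = a + b * log2(D/W + 1).
    a and b are device- and user-dependent constants (illustrative here)."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty                     # in seconds

# pointing at a 20-pixel-wide button 300 pixels away
print(f"{fitts_time(300, 20):.2f} s")  # 0.70 s
```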

The scope of Shannon’s work therefore far exceeds the realms of information and communication theory. As a lover of games, he developed one of the first machines for playing chess. He was also one of the pioneers of artificial intelligence and machine learning, with his demonstration in 1950 of a robotic mouse that could find its way through a labyrinth and remember the optimal route.

Although Shannon’s life was not necessarily extraordinary in the literal sense, he was undeniably an extraordinary man. As for his lack of fame – which the centenary celebration of his birth seeks to remedy – he himself had said, referring to his information theory, “In the beginning I didn’t think it would have an enormous impact. I enjoyed working on these problems, just like I enjoyed working on lots of other problems, without any ulterior motives for money or fame. And I think that a lot of other scientists have this same approach, they work because of their love of the game.”

 



Bioplastics: “still a long road to higher performance”

As the environmental transition demands, the materials of the future must be “greener”. Bioplastics in particular have become a major focus of attention, and are often presented as the solution to the pollution caused by the plastics we use every day, which can take hundreds of years to decompose. Patricia Krawczak, a researcher at Mines Douai, studies these new polymers. Yet she issues a warning: our expectations must remain reasonable, because it will take time for bioplastics to become efficient and profitable… and not all of them are biodegradable.

 

Plastic materials are inextricably linked to our everyday lives. They are essential, yet often seen as a scourge of modern times. Their negative impact on the environment is regularly denounced: millions of tons[1] of polymer waste are dumped in the oceans each year, harming marine biodiversity. Producing these plastics also requires hydrocarbons, and hence the use of fossil fuels. In response, the scientific community is seeking alternative solutions: “agro-based” or “bio-sourced” plastics made from natural materials of plant or animal origin, also referred to as bioplastics. At Mines Douai, this new “green” form of plastics processing is one of the key research areas of the TPCIM department directed by Patricia Krawczak.

“The current challenge is to develop bio-sourced polymers with higher added value, to set them apart from the widely distributed plastics known as commodity plastics, such as polyolefins. The goal is to compete with the technical or performance plastics derived from traditional petrochemistry, such as polyamides and polycarbonates,” the materials researcher explains. The major polymer families Patricia Krawczak mentions are used in key sectors such as transportation (automotive, aeronautics, etc.), which are large-scale consumers of plastics. Yet entering these markets proves difficult, owing to their demanding specifications.

Herein lies one of bioplastics’ greatest challenges: proving, if not their superiority, at least their equal performance compared with conventional polymers under strict operating conditions. This is far from always the case. “For the time being, industrial-scale bio-sourced products are primarily used in the low value-added packaging sector, such as supermarket bags,” the scientist explains. The properties of most of these bioplastics are not yet suited to producing vehicle components, such as under-the-hood automotive parts, which must withstand high temperatures and constant or repeated mechanical stress over time.

This is why much work remains to be done before certain attractive properties can be achieved, and why the excitement about bioplastics needs tempering. Patricia Krawczak is very clear on this point: “We cannot yet compete with one hundred years of research in the field of petrochemical plastics processing. The road to high performance is still long for bio-sourced plastics.”

The “conventional” plastics industry has indeed succeeded in developing a wide range of materials able to meet the thermo-mechanical and physico-chemical demands of specific uses and comply with strict application specifications. That range is much larger than what the few bioplastics currently in production can offer. Nor do these bioplastics always have attractive psychosensory properties (smell, color, transparency). A cloudy or yellowish appearance can rule out certain applications, such as food packaging or touchscreens, and the foul-smelling compounds generated during processing or use can be off-putting.

However, this does not mean that bioplastics will be forever confined to low value-added markets. Hopes of quickly replacing all plastics derived from petroleum fractions should simply be tempered for the time being. A few bioplastics already offer very good properties, or even new functions, and are winning over plastics processors and purchasers. This is the case for a bio-sourced polymer developed by Mitsubishi and marketed under the name Durabio. Its impact resistance is comparable to that of conventional polycarbonate, and it combines a high degree of transparency with excellent optical properties (resistance to UV yellowing) and surface properties (hardness, scratch and abrasion resistance) that surpass its petroleum-based counterparts and justify its price.

 

Bioplastics need to keep up with the pace!

One of the major hurdles to overcome, in addition to meeting application specifications, is the potential extra cost of using bioplastics. Bio-sourced polymers’ access to downstream markets is in fact subject to an inescapable condition: to remain competitive, manufacturers of plastic parts cannot consider investing in new production methods or substantially modifying their existing machinery. “It is therefore crucial to ensure that bioplastics can be integrated into current production lines, with technical performance, production costs and speeds that are compatible with market constraints,” Patricia Krawczak points out. Yet this is not an easy task. Why? Because certain bio-sourced polymers are sensitive to thermal or thermomechanical degradation during the forming stages of manufactured products.

 


To bring bioplastics to maturity, researchers must make them compatible with current processes.

 

It is therefore up to bioplastics to adapt to the plastics processing procedures used to manufacture industrial parts, not the other way around. For the scientists, this means modifying the plastics’ behavior in liquid form, specifically by adding chemical additives. “A common example is starch, which cannot be processed in its raw state using conventional extrusion methods. It must be plasticized by adding water or polyols, with the aim of lowering the temperature at which it becomes liquid,” the researcher explains. Another approach being explored is mixing bio-sourced polymers to obtain a blend tailored to the specific characteristics required.

Once the appropriate formula has been developed, the work is not yet over. The possible migration of the various additives, or the potential changes to the morphology of the blends during the processing stage must also be controlled, to ensure optimal functional properties. In short, developing bioplastics requires a great deal of optimization.

 

Bio-sourced does not necessarily mean biodegradable

Once bioplastics are perfectly adapted to current plastics processing procedures, and have become efficient and competitive, it is important to keep the initial goal in mind: reducing the environmental impact. However, green plastics processing is all too often wrongly equated with developing biodegradable plastics. Patricia Krawczak reminds us that green polymers do not necessarily have this feature: “On the contrary, many applications in transportation (cars, airplanes) and construction require durable materials that can be used in the long term without any form of deterioration.”

Since not all bioplastics are biodegradable, they must be recovered and recycled. And there is no guarantee we will be able to put them in our recycling bins at home. In France, these recycling bins currently only accept a limited number of specific plastics: polyethylene terephthalate, polyethylene and polypropylene. It may not be possible to recycle the new biopolymers using the same facilities. Studies must now be carried out to determine whether or not these biopolymers can be integrated into existing recycling facilities without any disruption, or to determine if new facilities will need to be created.

The problem is that biopolymers represent only 0.5% of the total volume of plastics produced and consumed on the global market (an estimated 2% by 2020). “Establishing a recycling program generally requires a sufficient volume of waste, so that a sustainable economy can be built on the collection, sorting and reutilization procedures. At present, however, the amount of bioplastic waste is too small and too diverse,” Patricia Krawczak warns. Still, initiatives are being developed to recycle small volumes of waste. This is one of the subjects addressed by the Circular Economy & Innovation Chair (ECOCIRNOV) led by Mines Douai.

 

A promising future for green plastics?

Research aimed at removing the remaining obstacles is advancing, and the future looks promising for green plastics processing, as it is driven by application sectors with strong potential. In addition to transportation, the biomedical field is interested in biocompatible materials for creating controlled release systems for active ingredients. Patricia Krawczak’s team has worked on this subject in conjunction with a French research group on biomaterials from Nord Pas-de-Calais (Fédération Biomatériaux et Dispositifs Médicaux Fonctionnalisés du Nord Pas-de-Calais). The development of electroactive bio-sourced polymers suitable for 3D printing – the focus of research led by Jérémie Soulestin in one of Patricia Krawczak’s research groups – could also benefit the market for connected objects.

Finally, it is important to remember that polymers, along with fibers, constitute one of the two essential components required for producing composite materials. Chung-Hae Park, also a member of Patricia Krawczak’s team, is already working on the development of flax-based composites. He recently completed the proof of concept for the high-speed manufacturing of parts, with a cycle time of two minutes, close to automotive speeds (one part per minute). Success in offering biopolymers with suitable properties, reinforced with plant fibers, could therefore constitute another step towards developing completely bio-sourced structural composites. This class of materials could potentially have numerous high-performance applications.

 

[1] The United Nations Environment Program published a report in 2016 indicating that between 4.8 and 12.7 million tonnes of plastic were dumped in the world’s seas.


Sea Tech Week: Key issues of a connected ocean

The sea is becoming increasingly connected, with the development of new real-time transmission sensors. The aggregated data is being used to improve our understanding of the role oceans play in climate issues, but several challenges must be considered: the development of autonomous sensors and the pooling of research on a global level. This was the subject of Sea Tech Week, which took place in Brest from October 10 to 14, bringing together international experts from different disciplines relating to the sea.

 

From renewable marine energy and natural resources to tourism, the sea has many uses: we swim in it, travel on it and exploit all it has to offer. By 2030, the ocean economy (or blue economy) is expected to create great wealth and many jobs. In the meantime, Télécom Bretagne is combining expertise in oceanography with information and communication technology to further research in this field.

A global topic

Although the subject was left out of climate conferences until 2015, the ocean constantly interacts with the environments around it and lies at the very heart of climate change. It is currently the largest carbon sink in existence, and the CO2 it absorbs acidifies the water, with irreparable consequences for marine fauna and flora.

In this context, the need for measurements keeps growing. The aim is to obtain an overview of the ocean’s effects on the environment and vice versa. Obtaining this global view requires studying parameters of every kind: surface temperature, salinity, pressure, biological and chemical variables, and the influence of human activities such as maritime traffic. In a presentation delivered on the first morning of Sea Tech Week, René Garello, a researcher at Télécom Bretagne, explained the challenges involved in integrating all this new data.

A connected ocean for shared research

The study of the marine world is not immune to recent trends: it must become connected. The goal is to use technological resources, in particular suitable coding schemes, to allow large volumes of data to be transmitted. This involves adapting connected-object technology to the marine environment.

One challenge in the connected-ocean field is developing sophisticated, more efficient sensors to improve in-situ observation techniques. René Garello calls them smart sensors. Whether they measure surface currents or acoustic phenomena, these sensors must be able to transmit data quickly, operate autonomously, and communicate with each other.

However, communication is necessary for more than just the sensors. Scientific communities also have their part to play. “On the one hand, we make measurements, and on the other we make models. The question is whether or not what is carried out in a given context is pooled with other measurements carried out elsewhere, allowing it to be integrated to serve the same purpose,” explains René Garello.

Another challenge is therefore to prevent the fragmentation of research which would benefit from being correlated. The goal is to pool both data and stakeholders by bringing together chemical oceanographers and physical oceanographers, modelers and experimenters, with the ultimate aim of better orchestrating global research.

A parallel concern: Big Data

“Currently, only 2% of the data is used. We are forced to subsample the data, which means we are less efficient,” observes René Garello. The need to collect as much material as possible is counterbalanced by the human capacity to analyze it in its entirety. In addition, the data must be stored and processed in different ways. According to René Garello, future research must proceed with restraint: “Big Data leads to a paradox, because the goal of the research is to decrease data size so users receive a maximum amount of information in a minimum amount of space.” Smart sensors can strike a balance between data compression and Big Data by filtering at the input, rather than collecting everything, so that work can be carried out on a human scale.
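
One simple instance of such input filtering (our illustrative sketch, not a system described at Sea Tech Week) is a deadband filter: the sensor transmits a reading only when it differs from the last transmitted value by more than a set tolerance, discarding redundant samples at the source:

```python
def deadband_filter(samples, tolerance=0.05):
    """Keep a sample only when it moves more than `tolerance` away
    from the last transmitted value (toy smart-sensor filter)."""
    transmitted, last = [], None
    for s in samples:
        if last is None or abs(s - last) > tolerance:
            transmitted.append(s)
            last = s
    return transmitted

sea_surface_temps = [14.02, 14.03, 14.02, 14.11, 14.12, 14.45, 14.44, 14.46]
kept = deadband_filter(sea_surface_temps)
print(f"kept {len(kept)}/{len(sea_surface_temps)}: {kept}")
# kept 3/8: [14.02, 14.11, 14.45]
```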

Towards standardization procedures

Few standards currently exist in the marine sphere. The question of data integrity, and of how faithfully the data represents reality, is the last major issue. Satellite sensors are already properly codified, since their measurements are carried out under stable conditions, unlike in-situ sensors, which can be carried away by drifting objects and buoys. In this context of mobile resources, a sample can only be proven reliable through prior calibration of the measurement. Research can help improve this notion of standards.

However, basic research alone is not sufficient. The future also requires links to be forged between science, technology and industry. In a report published in April 2016, the OECD foresees the growth of many ocean-related industries (transport, fishing, marine biotechnology, etc.). How will current research help this blue economy take shape? From the local context in Brest to European research programs such as AtlantOS, these issues clearly belong to the same picture: everything is interconnected.

 

Sea Tech Week 2016: a week dedicated to marine science and technology

Every two years in Brest, workshops and trade shows are organized around sea-related disciplines. The week is run by the Brest metropolitan area with support from several research and corporate partners. In 2014, over 1,000 participants came to Brittany for the event, 40% of them international visitors. In 2016, the event focused on digital technology, in connection with the French Tech label, and addressed topics including observation, robotics, modeling and sensors, via 18 conferences led by experts from around the world.