Botfuel, Chatbots

Botfuel: chatbots to revolutionize human-machine interfaces

Are chatbots the future of interactions between humans and machines? These virtual conversation agents have a growing presence on messaging applications, offering us services in tourism, entertainment, gastronomy and much more. However, not all chatbots are equal. Botfuel, a start-up incubated at ParisTech Entrepreneurs, offers its services to businesses wanting to develop top-of-the-range chatbots.

They help us to order food, book a trip or discover cultural events and bars. Chatbots are virtual intermediaries providing us with access to many services, and are becoming ever more present on messaging applications such as Messenger, WhatsApp and Telegram. In 2016, Yan Georget and Javier Gonzalez decided to enter the conversational agent market, founding their start-up, Botfuel, incubated for a year at ParisTech Entrepreneurs. They were not looking, however, to develop simple, low-end chatbots. “Many key players target the mass market, with more potential users and easy-to-produce chatbots”, explains Yan Georget, “this is not our approach”.

Botfuel is aimed at businesses that want to provide their customers with a high quality user experience. The fledgling business offers companies state-of-the-art technological building blocks to help develop these intelligent conversational agents. It therefore differs from the process typically adopted by businesses for designing chatbots. Ordinarily, chatbots operate on a decision-tree basis, which is established in advance. Developers create potential scenarios based on the questions the chatbot will ask the user, trying to predict all possible responses. However, the human mind is unpredictable. There will inevitably be occasions where the user gives an unforeseen response or pushes the conversational agent to its limits. Chatbots that rely on decision-trees soon tend to struggle, often leaving customers disappointed.

“We take a different approach, based on machine learning algorithms”, explains Yan Georget. Every sentence provided by users is analyzed to understand its meaning and identify the user’s possible intentions. To achieve this, Botfuel works with businesses on their databases, trying to understand the primary motivations of web users. For example, the start-up has collaborated with BlaBlaCar, helping them fine-tune their chatbots so that customers can find a carshare more easily. Thanks to this design approach, the chatbot knows to attribute the same meaning to the phrases “I want to go to Nantes”, “I would like to get to Nantes by car” and “looking for a carshare to Nantes”, something that is nearly impossible for traditional chatbots, which dismiss alternative phrasings whenever the conversation strays from the expected scenario.
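
To give a concrete, if greatly simplified, picture of this kind of intent detection, here is a minimal sketch in Python: a small text classifier learns to map several phrasings onto the same intent. The training phrases, the intent labels and the use of scikit-learn are assumptions made for the example; the article does not describe Botfuel’s actual models.

```python
# Minimal intent-classification sketch: map paraphrases to the same intent.
# Illustrative only -- the phrases and labels below are invented for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_phrases = [
    "I want to go to Nantes",
    "I would like to get to Nantes by car",
    "looking for a carshare to Nantes",
    "cancel my booking",
    "I no longer want my seat",
]
intents = ["search_ride", "search_ride", "search_ride", "cancel", "cancel"]

# Bag-of-words features (unigrams and bigrams) feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(training_phrases, intents)

print(model.predict(["is anyone driving to Nantes tomorrow"]))  # expected: ['search_ride']
```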

Botfuel also uses information extraction algorithms to precisely identify dates, places and desired services, regardless of the order in which they appear in the conversation with the chatbot. Understanding the user is clearly Botfuel’s principal motivation. Building on this, Yan Georget explains the choices they made regarding language correction. “People make typos when they are writing. We opted for word-by-word correction, using an algorithm based on the frequency of each word in the language and its resemblance to the mistyped word entered by the user. We only correct a user’s mistake if we are sure we know which word they intended to type.” This approach differs from that of other chatbots, which base correction on the overall meaning of the phrase. That second approach sometimes introduces errors by attributing the wrong intention to the user. Even though it may correct more mistakes, Botfuel prefers to prioritize the absence of misunderstanding, even if it means the chatbot has to ask the user to reformulate their question.
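
The word-by-word correction principle Yan Georget describes can be sketched roughly as follows: candidate words one edit away from the typo are ranked by their frequency in the language, and a correction is only applied when one candidate clearly dominates. The word counts and the confidence rule below are invented for illustration and are not Botfuel’s implementation.

```python
# Word-by-word spelling correction sketch based on word frequency and edit
# distance, correcting only when the best candidate clearly dominates.
from collections import Counter

WORD_COUNTS = Counter({"carshare": 120, "nantes": 300, "car": 500, "care": 200})
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    """All strings one edit (delete, transpose, replace, insert) away."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in ALPHABET]
    inserts = [a + c + b for a, b in splits for c in ALPHABET]
    return set(deletes + transposes + replaces + inserts)

def correct(word, min_ratio=3.0):
    """Return a correction only if one known candidate clearly dominates."""
    if word in WORD_COUNTS:
        return word                      # already a known word: leave it alone
    candidates = [w for w in edits1(word) if w in WORD_COUNTS]
    if not candidates:
        return word                      # nothing plausible: ask to reformulate
    ranked = sorted(candidates, key=WORD_COUNTS.get, reverse=True)
    if len(ranked) == 1 or WORD_COUNTS[ranked[0]] >= min_ratio * WORD_COUNTS[ranked[1]]:
        return ranked[0]
    return word                          # ambiguous: do not risk a wrong guess

print(correct("nantse"))   # -> "nantes" (single plausible candidate)
print(correct("carr"))     # -> "carr"  ("car" vs "care" is too ambiguous)
```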

 

Chatbots, the future of online services

In addition to BlaBlaCar, many other businesses are interested in Botfuel’s chatbots. French insurance provider April and French bank Société Générale now form part of their clientele. One of the main reasons these new interfaces are attracting so much interest is, according to Yan Georget, that “conversational interfaces are very powerful”. “When using an online purchasing service, you have only to type in what you’re looking for and you’ve practically already made the purchase.” The alternative consists in navigating a vendor’s website menu and narrowing down the desired product with search filters… several clicks and several wasted minutes, compared with simply typing “I’m looking for Fred Vargas’s latest novel” into the chatbot interface.

For businesses, chatbots also represent a great opportunity to learn more about their customers. Botfuel provides an analysis system allowing businesses to better understand the links between different customer demands. By talking to a chatbot, customers provide far more information than they would by simply browsing the site: they describe their interests in more detail and give clearer explanations for any dissatisfaction. These are all elements that can be very valuable to businesses, helping them improve their service for the benefit of the customer.

These new perspectives revealed by chatbots are promising, but Yan Georget would also like to moderate expectations and alleviate certain fears surrounding the service: “The aim of chatbots is not to replace humans in the customer experience. Their purpose is to use conversation as an interface with machines, in place of the interactions that we currently have. When a computer operating system changes, habitual users have to readapt to this new interface. With chatbots, the conversational interface doesn’t change, conversation is flexible. The only thing that changes is the additional features that a chatbot can gain with each update.” As for the “intelligent” nature of chatbots, the co-founder remains cautious. With a doctorate in artificial intelligence, he is aware of the ethical limits and risks associated with an unbridled boom in this field of technology. The focus is therefore on the issue of unsupervised learning in chatbots, which gives them more autonomy when dealing with customers. For Yan Georget, the example of Tay, Microsoft’s chatbot which was made to become racist on Twitter by its users, is very significant. “The development of chatbots should be supervised by humans. Auto-learning is too dangerous, and it is not the kind of risk to which we are willing to expose businesses or end users.”

 

[box type=”info” align=”” class=”” width=””]

Chatbots: an educational tool?

On 29th June at Télécom ParisTech, Botfuel and ParisTech Entrepreneurs came together for a meet-up dedicated to chatbots and education. In this sector, intelligent conversational agents represent a potential asset for personalized education programs. From revising for your exams with Roland Garros via the Messenger chatbot “ReviserAvecRG”, to finding your path with the specialist careers-advice bot “Hello Charly”, aimed at young people between the ages of 14 and 24, or even practicing writing in a foreign language, chatbots undeniably offer a real range of tools.

The meet-up provided an opportunity to share and present experiences and concrete examples from specialist businesses. This event comes as part of the launch of the incubator’s “Tech Meet-ups” program, meetings focusing on technology and the future. The event is therefore the first in a series that will be continued in October with the next “Tech Meet-up”, which this time will be dedicated to the blockchain.[/box]

 

Silense, Marius Preda, Télécom SudParis

Will we soon be able to control machines with simple gestures?

The “Silense” European project, launched in May 2017, is aimed at redefining the way we interact with machines. By using ultrasound technology similar to sonar, the researchers and industrialists participating in this collaboration have chosen to focus on 3D motion sensing technology. This technology could allow us to control our smartphone or house with simple gestures, without any physical contact with a tactile surface.

 

Lower the volume on your TV from your couch just by lowering your hand. Close the blinds in your bedroom by simply squeezing your fingers together. Show your car’s GPS the right direction to take by lifting your thumb. It may sound like scenes from a science fiction movie, yet these scenarios are among the real-life objectives of the European H2020 project “Silense”, which stands for (Ultra)Sound Interfaces and Low Energy iNtegrated SEnsors. For a three-year period, this project brings together 42 academic and industrial partners from eight countries across the continent. This consortium—which is particularly large, even for an H2020 project—will work from 2017 to 2020 to develop new human-machine interfaces based on ultrasound.

“What we want to do is replace tactile commands with commands the user can make from a distance, by moving their hands, arms or body,” explains Marius Preda, a researcher at Télécom SudParis, one of the project’s partners. To accomplish this, the scientists will develop technology similar to sonar. An audio source emits an inaudible sound that fills the air. When the sound wave hits an obstacle, it bounces back and returns to the source. Receivers placed at the same level as the transmitter record the wave travel times and determine the distance between the source and the obstacle. A 3D map of the environment can therefore be created. “It’s the same principle as an ultrasound scan,” the researcher explains.
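
The underlying calculation is straightforward: since the speed of sound in air is known, half of the echo’s round-trip travel time gives the distance to the obstacle. Here is a minimal illustration in Python (the figures are not taken from the Silense hardware):

```python
# Sonar-style ranging sketch: an ultrasound pulse travels to an obstacle and
# back; halving the round-trip travel time gives the distance.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance to the obstacle, from the echo's round-trip travel time."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

# An echo received 2.9 ms after emission corresponds to an obstacle ~0.5 m away.
print(f"{distance_from_echo(2.9e-3):.2f} m")  # -> 0.50 m
```

With several receivers placed at different positions, comparing the distances measured by each one is what allows the 3D map of the environment to be reconstructed.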

In the case of the Silense project, the source will be made up of several transmitters, and there will be many more receivers than in a sonar. The goal is to improve the perception of obstacles, and thus the resolution of the 3D image produced. This should make it possible to detect smaller variations in shape, and therefore more complex gestures than is currently possible. “Today we can see whether a hand is open or closed, but we cannot distinguish one raised finger from two raised fingers held together”, Marius Preda explains.

Télécom SudParis is leading the project’s software aspect. Its researchers’ mission is to develop image processing algorithms to recognize the gestures users make. Using deep learning with neural networks, the scientists want to create a dictionary of clearly distinct gestures. These will need to be recognizable by the ultrasound sensors regardless of the position of the hand or arm relative to the sensor.
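
As a purely illustrative sketch of what such a model could look like, here is a small 3D convolutional network written with PyTorch that maps a voxelized acoustic map to one of a handful of gesture classes. The architecture, the input size and the list of gestures are assumptions; the article does not detail the network used at Télécom SudParis.

```python
# Sketch of a small 3D convolutional network classifying gestures from
# voxelized acoustic maps. Shapes and the number of classes are invented.
import torch
import torch.nn as nn

N_GESTURES = 5  # e.g. open hand, fist, thumb up, two fingers, swipe (assumed)

model = nn.Sequential(
    nn.Conv3d(1, 8, kernel_size=3, padding=1),   # (1,16,16,16) -> (8,16,16,16)
    nn.ReLU(),
    nn.MaxPool3d(2),                             # -> (8,8,8,8)
    nn.Conv3d(8, 16, kernel_size=3, padding=1),  # -> (16,8,8,8)
    nn.ReLU(),
    nn.MaxPool3d(2),                             # -> (16,4,4,4)
    nn.Flatten(),
    nn.Linear(16 * 4 * 4 * 4, N_GESTURES),       # one logit per gesture class
)

dummy_batch = torch.randn(2, 1, 16, 16, 16)      # two fake acoustic volumes
print(model(dummy_batch).shape)                  # -> torch.Size([2, 5])
```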

This is no easy task: the first step is to identify sufficiently distinct gestures, ones the algorithms cannot confuse. The next steps involve reducing noise to improve the detection of shapes, sometimes in a way specific to the type of use—a sensor in the wall of a house will not have the same shortcomings as one in a car door. Finally, the researchers will also have to take the uniqueness of each user into account: two different people will not make a given sign in the same way or at the same speed.

“Our primary challenge is to develop software that can detect the beginning and end of a movement for any user,” explains Marius Preda, while emphasizing how difficult this task is, given the fluid nature of human gestures: “We do not announce when we are going to start or end a gesture. We must therefore succeed in perfectly segmenting the user’s actions into a chain of gestures.”
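
A naive way to picture the segmentation problem is to threshold the frame-to-frame motion energy of the stream of 3D maps and mark where it rises and falls. The sketch below is a simplistic stand-in for the problem Marius Preda describes, not the project’s algorithm.

```python
# Naive gesture segmentation sketch: flag the start and end of a gesture by
# thresholding frame-to-frame motion energy in a stream of 3D maps.
import numpy as np

def segment_gestures(frames: np.ndarray, threshold: float = 0.5):
    """frames: array of shape (T, ...) with one 3D map per time step.
    Returns (start, end) index pairs where motion energy exceeds the threshold."""
    energy = np.array([np.abs(frames[t + 1] - frames[t]).mean()
                       for t in range(len(frames) - 1)])
    active = energy > threshold
    segments, start = [], None
    for t, is_active in enumerate(active):
        if is_active and start is None:
            start = t
        elif not is_active and start is not None:
            segments.append((start, t))
            start = None
    if start is not None:
        segments.append((start, len(active)))
    return segments

# Synthetic stream: still, then a burst of motion, then still again.
rng = np.random.default_rng(0)
stream = np.concatenate([np.zeros((10, 16, 16)),
                         rng.normal(0, 2, (5, 16, 16)),
                         np.zeros((10, 16, 16))])
print(segment_gestures(stream))  # -> [(9, 15)]
```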

 

Moving towards the human-machine interaction of tomorrow

To meet this challenge, researchers at Télécom SudParis are working very closely with the partners in charge of the hardware aspect. Over the course of the project’s three-year period, the consortium hopes to develop new, smaller generations of sensors. This would make it possible to increase the number of transmitters and receivers on a given surface area, therefore improving the image resolution. This innovation, combined with new image processing algorithms, should significantly increase the catalogue of shapes recognized by ultrasound.

The Silense project is being followed very closely by car and connected object manufacturers. A human-machine interface based on ultrasound offers several advantages. Compared to the current standard interface—touch—it improves vehicle safety by reducing the attention required to press a button or touchscreen. In the case of smartphones or smart houses, it promises greater convenience for consumers.

The ultrasound interface that is proposed here must also be compared with its main competitor: interaction through visual recognition—Kinect cameras, for example. According to Marius Preda, the use of ultrasound removes the lighting problems encountered with video in situations of overexposure (bright light in a car, for example) or underexposure (inside a house at night). In addition, the shape segmentation, for example for hands, is easier using 3D acoustic imaging. “If your hand is the same color as the wall behind you, it will be difficult for the camera to recognize your gesture,” the researcher explains.

Silense therefore has high hopes of creating a new way to interact with machines in our daily lives. By the end of the project, the consortium hopes to establish three demonstrators: one for a smart house, one integrated into a car, and one in a screen like that of a smartphone. If these first proof-of-concept studies prove conclusive, don’t be surprised to see drivers making big gestures in their cars someday!

 

data science, pierre tandeo

Environmental data: data science takes on the challenge

From July 3 to July 7, a series of conferences was held on data science and the environment. The event, organized by IMT Atlantique, built bridges between two communities that have so far collaborated little in Europe. Environmental data could benefit from new processing methods, helping to explain what physics alone has so far been unable to.

 

Some marine and atmospheric phenomena lack physical explanations, despite the observations that have been made. Could these explanations be found through a new method of analysis? The collaboration between data science and the environment is currently underdeveloped in Europe. Yet data scientists offer tools and methodologies that could be helpful in processing environmental data. With the goal of establishing a connection between these scientific communities, IMT Atlantique created a special conference series: “Data science & Environment”, bringing together researchers from around the world. This event is associated with a summer school in order to raise awareness of these mixed approaches among future researchers. Both events were initiated by Pierre Tandeo, a researcher already convinced that this collaboration will bear fruit. Specialized in mathematics applied to oceans and meteorology, he presents the issues related to this collaboration.

 

What is data science?

Pierre Tandeo: Data science is built on the analysis of data using mathematical and statistical tools. It is often confused with big data. Yet data science involves a “professional” aspect, meaning that it uses a scientific approach to extract relevant, physics-related information on a specific subject. Big data, on the other hand, is not necessarily aimed at addressing questions related to physics.

It is often said that data scientists wear three hats, since they must master mathematical tools, IT tools, and the data of a given field. It is not easy to combine these three areas of expertise, which is why we organized this conference. The goal is to bring the applied mathematics community together with the physics community working on environmental data, in order to merge their skills and move towards an environmental data science.

 

What kinds of environmental data can data scientists process?

PT: The conference focuses on the study of oceans, the atmosphere and climate. Within these areas, there are three main types of data: satellite observations, in situ measurements at sea or in the atmosphere, and simulations from computer models. These simulations are intended to describe the phenomena using physical equations.

Today, this data is becoming increasingly easy to access. It includes large volumes of information that have not yet been used, due to the processing challenges presented by these large sets of data. Manipulating the data sets is a complex undertaking, and special IT and statistical tools must be used to process them.

 

What can data science contribute to environmental research and vice versa?

PT: Major environmental questions remain open, and our physical understanding of them is still insufficient. What this means is that we are not able to convert what is observed into equations. The question is, can we try to understand these environmental phenomena using data, since the connections are undoubtedly hidden within it? To reveal these connections, suitable mathematical tools must be built.

Also, when we check the weather, for example, we don’t trust the forecasts that are made beyond one week’s time, because the system is complex. It’s called “chaotic.” The difficulty in forecasting environmental data lies in the fact that many interactions can take place between the variables that physics cannot even explain. This complexity requires a revision of the applied mathematical techniques that are commonly used. The environment forces us to rethink the way data is processed. This makes it an ideal field for data science, since it is hard to master, thus providing a challenge for mathematicians.
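
The classic Lorenz-63 system, a toy model of atmospheric convection, gives a minimal illustration of this chaotic sensitivity: two nearly identical initial states end up completely different after a short time. The snippet below is purely illustrative and is not taken from the conference material.

```python
# Tiny illustration of chaotic sensitivity: two almost identical initial
# states of the Lorenz-63 system diverge completely after a short time.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # perturbation of one hundred-millionth
for _ in range(3000):                # integrate both trajectories forward
    a, b = lorenz_step(a), lorenz_step(b)

print(np.linalg.norm(a - b))         # the two trajectories are now far apart
```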

 

Can you give us an example of an environmental issue that has benefited from a mathematical approach?

PT: Some statistical approaches have proven successful. Forecasting the coupled ocean-atmosphere phenomenon called ENSO (with its two opposite phases, El Niño and La Niña) is a good example. The two ENSO phases appear irregularly (every 2 to 7 years) and have extremely significant human, economic and ecological impacts [they particularly affect North and South America]. Physicists therefore try to predict six months in advance whether we will experience a normal year, an El Niño year (unusually warm) or a La Niña year (unusually cold). ENSO predictions from statistical models have often proven better than those provided by physical models. These statistical forecasts are based on learning from historical data, which is constantly growing, particularly since the advent of satellites.
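
As a rough illustration of what a purely statistical forecast looks like, the sketch below trains a simple lag regression to predict an index six months ahead from its recent history. The data is synthetic; real statistical ENSO models are trained on observed indices such as Niño 3.4 and are considerably more sophisticated.

```python
# Statistical forecast sketch: predict an index six months ahead from its
# last twelve months, using synthetic data as a stand-in for an ENSO index.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
months = np.arange(600)
# Slow irregular oscillation plus noise, standing in for the observed index.
index = np.sin(2 * np.pi * months / 48) + 0.3 * rng.normal(size=months.size)

LAGS, HORIZON = 12, 6   # use the last 12 months to predict 6 months ahead
X = np.array([index[t - LAGS:t] for t in range(LAGS, len(index) - HORIZON)])
y = index[LAGS + HORIZON:]

model = LinearRegression().fit(X[:-100], y[:-100])   # train on the early record
print("R^2 on held-out years:", round(model.score(X[-100:], y[-100:]), 2))
```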

This conference also provided an opportunity to identify other environmental challenges that remain unresolved, for which data science could provide a solution. It is a vast and rapidly growing field.

Also read on I’MTech:
Ocean remote sensing: solving the puzzle of missing data

 

What topics will be discussed at the conferences?

PT:  The first half focuses on the applications of data science for the climate, atmosphere, and oceans. Yet we have observed that applied mathematical methods are more widespread among the atmosphere and climate community. I think oceanographers have things to learn from what is being done elsewhere. That is also why the event is being held in Brest, one of the major European oceanographic centers.

The other sessions are devoted to mathematical methodologies, and aim to show how high-dimensional problems—those involving very large volumes of information—can be processed, and how relevant information can be extracted. Data assimilation is also addressed: how can physical forecast models be combined with satellite data? The last focus is on analog methods, which use learning techniques based on historical observations and project them onto current or future data.
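
The analog idea can be sketched in a few lines: find the historical states closest to the current one and average what followed them. This is only an illustration of the principle, not the methods presented at the conference.

```python
# Analog forecast sketch: find the k historical states most similar to the
# current one and average their successors. Data is a synthetic toy archive.
import numpy as np

def analog_forecast(history, current_state, horizon=1, k=5):
    """history: (T, d) archive of past states; forecast `horizon` steps ahead
    by averaging the successors of the k nearest historical analogs."""
    candidates = history[: len(history) - horizon]            # need a successor
    dists = np.linalg.norm(candidates - current_state, axis=1)
    nearest = np.argsort(dists)[:k]
    return history[nearest + horizon].mean(axis=0)

# Toy archive: a noisy circular trajectory in 2D.
t = np.linspace(0, 20 * np.pi, 2000)
noise = 0.02 * np.random.default_rng(1).normal(size=(2000, 2))
archive = np.stack([np.cos(t), np.sin(t)], axis=1) + noise

print(analog_forecast(archive, np.array([1.0, 0.0]), horizon=10))
```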

 

What are the anticipated outcomes of these sessions?

PT: In the short term, the goal is to start conversations. I would like to see two researchers from both communities finding common ground, because they both have something to gain. In the medium term, the goal is to make this an ongoing event. Ideally, we would like to repeat the event in other locations in France, or in Europe, and open it up to other types of environmental data over the next two years. Finally, the long-term goal would be to initiate projects involving international collaboration. Along with several colleagues, we are currently working to establish a French-American project on the applications of applied mathematics for climate. The creation of international mixed research units in these areas would mark a true culmination.

 

Cahier de veille, Fondation Mines-Télécom, confiance numérique, trust

Intelligence booklet: the new balances of trust, between algorithms and social contract

The 9th Fondation Mines-Télécom booklet addresses the major cross-cutting issue of trust in the digital age. The booklet was put together over the course of a series of events held in partnership with NUMA Paris, with support from the Foundation’s major corporate partners. It examines the changing concepts of trust, while highlighting existing research in this area from IMT schools.

 

Cybersecurity, blockchain, digital identities… In 27 pages, the booklet offers perspective on these topics that are so widely covered by the media, examining the foundation of our interactions and practices: trust, a concept that is currently undergoing tremendous changes.

The first section examines the concept of trust and provides a multidimensional definition. The booklet differentiates the term “confidence”, a form of trust related to the social context we live in, from “trust” that is decided at the individual level. This second form of trust is becoming predominant in the digital economy: trust tends to be reduced to a calculation of risk, to the detriment of our capacity to learn how to trust.

In the second section, the booklet examines the transformation of trust in the digital age. It presents the blockchain by introducing the related concepts of protocol, consensus and proof. In addition to describing different types of use in the areas of health, personal data and private life, it provides economic insight and raises the question: how can we make trust a new common good?

The booklet ends with a third section focused on the human factor. Exploring issues of governance, trust transitivity, networks of trust… New balances are being created and the relationship between social consensus and algorithmic consensus is constantly evolving.

 

This booklet, written by Aymeric Poulain-Maubant, an independent expert, benefited from contributions from research professors at IMT schools: Claire Levallois-Barth (Télécom ParisTech), Patrick Waelbroeck (Télécom ParisTech), Maryline Laurent (Télécom SudParis), Armen Khatchatourov (Télécom École de Management) and Bruno Salgues (Mines Saint-Étienne). The Foundation’s corporate partners, specifically Accenture, Orange, La Poste Group, and La Caisse des Dépôts, also lent their expertise to this project.

[box type=”shadow” align=”” class=”” width=””]

Download here the booklet (in French)
The new balances of trust, between algorithms and social contract

Confiance numérique, Cahier de veille, Fondation Mines-Télécom

[/box]

[box type=”shadow” align=”” class=”” width=””]

To mark the release of the Fondation Mines-Télécom booklet, I’MTech is publishing a series devoted to the link between technology and trust.

[/box]

Trust in the digital age

How does the concept of trust play out in the context of new technologies? At a time when blockchain technology is experiencing phenomenal success in the corporate sector, it has become crucial to examine the mechanisms involved in building trust. Behind the machines’ apparently infallible precision, there are humans, with all their complexity and subjectivity. What risks does this create? And, more importantly, what role can trust play in mitigating these risks?

Finding an answer to this question sometimes requires examining the past, and studying the current technological breakthroughs in the light of those our society has experienced before. Because, ultimately, what is trust? Isn’t it simply, like any other value, a permanent social construction that must be examined to gain insights on our behavior?

In this special series, I’MTech addresses these issues by looking at current research in philosophy, sociology and economics. It is being published to coincide with the release of the new Fondation Mines-Télécom booklet on the new balances of trust.

 

[one_half]

[/one_half][one_half_last]

[/one_half_last]

Celtic-Plus Awards, Eureka

Three IMT projects receive Celtic-Plus Awards

Three projects involving IMT schools were featured among the winners at the 2017 Celtic-Plus Awards. The Celtic-Plus program is committed to promoting innovation and research in the areas of telecommunications and information technology. The program is overseen by the European initiative Eureka, which seeks to strengthen the competitiveness of industries as a whole.

 

[box type=”shadow” align=”” class=”” width=””]

SASER (Safe and Secure European Routing):
Celtic-Plus Innovation Award

The SASER research program brings together operators, equipment manufacturers and research institutes from France, Germany and Finland. Its goal is to develop new concepts for strengthening the security of data transport networks in Europe. To achieve this, the SASER project is working on new architectures, in particular networks that integrate, or are distributed via, the latest advances in cloud computing and virtualization. Télécom ParisTech, IMT Atlantique and Télécom SudParis are partners in this project led by the Nokia group.

[/box]

[box type=”shadow” align=”” class=”” width=””]

NOTTS (Next Generation Over-The-Top Multimedia Services):
Excellence Award for Services and Applications

NOTTS seeks to resolve the new problems created by over-the-top multimedia services. These services, such as Netflix and Spotify, are not controlled by the operators and put a strain on the internet network. The project proposes to study the technical problems facing operators and to seek solutions for creating new business models agreeable to all parties involved. It brings together public and private partners from six countries: Spain, Portugal, Finland, Sweden, Poland, and France, where Télécom SudParis is based.

[/box]

[box type=”shadow” align=”” class=”” width=””]

H2B2VS (HEVC Hybrid Broadcast Broadband Video Services):
Excellence Award for Multimedia

New video formats such as ultra-HD and 3D are testing the limits of both broadcast networks and broadband networks, each of which has limited bandwidth. The H2B2VS project aims to resolve this bandwidth problem by combining the two: the broadcast network transmits the main information, while the broadband network transmits additional information. H2B2VS brings together industrial partners and public research institutes in France, Spain, Turkey, Finland and Switzerland. Télécom ParisTech is part of this consortium.

[/box]

Two IMT projects also received awards at the 2016 Celtic-Plus Awards.

Jean-Luc Dugelay, Biométrie, smartphone, biométrics, iris recognition

Iris recognition: towards a biometric system for smartphones

Smartphones provide a wide range of opportunities for biometrics. Jean-Luc Dugelay and Chiara Galdi, researchers at Eurecom, are working on a simple, rapid iris recognition algorithm for mobile phones, which could be used as an authentication system for operations such as bank transactions.

 

Last name, first name, e-mail address, social media, photographs — your smartphone is a complete summary of your personal information. In the near future, this extremely personal device could even take on the role of a digital passport. A number of biometric systems are being explored to secure access to these devices. Facial, fingerprint and iris recognition have the advantage of being recognized by the authorities, making them the most popular options, including in research. Jean-Luc Dugelay is a researcher specialized in image processing at Eurecom. He is working with Chiara Galdi to develop an algorithm designed especially for iris recognition on smartphones. The initial results of the study were published in May 2017 in Pattern Recognition Letters. Their objective? To develop an instant, easy-to-use system for mobile devices.

 

The eye: three components for differentiating between individuals

Biometric iris recognition generally uses infrared light, which allows for greater visibility of the characteristics which differentiate one eye from another. “To create a system for the general public, we have to consider the type of technology people have. We have therefore adopted a technique using visible light so as to ensure compatibility with mobile phones,” explains Jean-Luc Dugelay.

 

oeil, spot, biométrie

Examples of color spots

 

The result is the FIRE (Fast Iris REcognition) algorithm, which is based on an evaluation of three parameters of the eye: color, texture, and spots. In everyday life, eye color is approximated by generic shades like blue, green or brown. In FIRE, it is defined by a colorimetric composition diagram. Eye texture corresponds to the ridges and ligaments that form the patterns of the iris. Finally, spots are the small dots of color within the iris. Together, these three parameters are what make the eyes of one individual unique from all others.

 

FIRE methodology and validation

When images of irises from databases were used to test the FIRE algorithm, variations in lighting conditions between photographs created difficulties. To remove these variations in brightness, the researchers applied a technique to standardize the colors. “The best-known color space is red-green-blue (RGB), but other systems exist, such as LAB. This is a space where color is expressed according to the lightness ‘L’ and two chromatic components, A and B. We focus on these last two components rather than the overall definition of color, which allows us to exclude lighting conditions,” explains Jean-Luc Dugelay.
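
In code, this pre-processing amounts to converting the image to the LAB space and keeping only the two chromatic channels. The sketch below uses scikit-image and is only indicative of the principle; FIRE’s exact pipeline may differ. For a neutral gray patch, the A and B values barely change when the image is brightened, which is precisely the property being exploited.

```python
# Convert an image from RGB to LAB and keep only the chromatic channels A and B,
# discarding lightness L to reduce the influence of illumination (illustrative).
import numpy as np
from skimage.color import rgb2lab

def chromatic_channels(rgb_image_uint8: np.ndarray) -> np.ndarray:
    """Return an (H, W, 2) array holding the A and B channels of the image."""
    lab = rgb2lab(rgb_image_uint8.astype(np.float64) / 255.0)
    return lab[..., 1:]                     # drop channel 0 (lightness L)

# Fake neutral 4x4 patch: its A/B values stay near zero however bright it is.
dark = np.full((4, 4, 3), 60, dtype=np.uint8)
bright = np.full((4, 4, 3), 180, dtype=np.uint8)
print(np.abs(chromatic_channels(dark) - chromatic_channels(bright)).max())  # ~0
```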

An algorithm then carries out an in-depth analysis of each of the three parameters of the eye. In order to compare two irises, each parameter is studied twice: once on the eye being tested, and once on the reference eye. Distance calculations are then established to represent the degree of similarity between the two irises. These three calculations result in scores which are then merged together by a single algorithm. However, the three parameters do not have the same degree of reliability in terms of distinguishing between two irises. Texture is a defining element, while color is a less discriminating characteristic. This is why, in merging the scores to produce the final calculation, each parameter is weighted according to how effective it is in comparison to the others.
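
The fusion step can be pictured as a weighted sum of the three distances, with texture weighted most heavily. The weights and the decision threshold below are invented for the example; FIRE derives its actual weighting from the measured performance of each parameter.

```python
# Weighted score fusion sketch: combine the color, texture and spot distances
# into one matching score. Weights and threshold are assumed values.
def fuse_scores(color_dist: float, texture_dist: float, spot_dist: float) -> float:
    """Smaller fused score = more likely the two irises belong to the same person."""
    weights = {"color": 0.2, "texture": 0.5, "spots": 0.3}   # texture dominates
    return (weights["color"] * color_dist
            + weights["texture"] * texture_dist
            + weights["spots"] * spot_dist)

MATCH_THRESHOLD = 0.35   # assumed decision threshold

score = fuse_scores(color_dist=0.30, texture_dist=0.20, spot_dist=0.25)
print(score, "-> same person" if score < MATCH_THRESHOLD else "-> different person")
```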

 

Authentication, identification, security and protocol

This algorithm can be used in two possible configurations, which determine its execution time. In the case of authentication, it is used to compare the captured iris with that of the person who owns the phone. This procedure could be used to unlock a smartphone or confirm bank transactions, and the algorithm gives a result in one second. When used for identification purposes, however, the question is no longer whether the iris is your own, but whose it is. The algorithm could therefore be used for identity verification purposes. This is the basis for the H2020 PROTECT project in which Eurecom is taking part: individuals would no longer be required to get out of their vehicles when crossing a border, for example, since they could identify themselves at the checkpoint from their mobile phones.
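
The difference between the two configurations can be summed up in a few lines of code: authentication is a one-to-one comparison against the phone owner’s template, while identification is a one-to-many search in a gallery. The scores and the threshold are, again, assumed values.

```python
# Authentication (1:1) versus identification (1:N), using fused matching scores
# where smaller means more similar. Numbers are invented for illustration.
import numpy as np

def authenticate(probe_score: float, threshold: float = 0.35) -> bool:
    """1:1 check against the phone owner's enrolled template."""
    return probe_score < threshold

def identify(scores_to_gallery: np.ndarray) -> int:
    """1:N search: return the index of the closest enrolled identity."""
    return int(np.argmin(scores_to_gallery))

print(authenticate(0.23))                      # -> True (unlock the phone)
print(identify(np.array([0.61, 0.23, 0.48])))  # -> 1 (second enrolled identity)
```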

Although FIRE has successfully demonstrated that the iris recognition technique can be adapted for visible light and mobile devices, protocol issues must still be studied before making this system available to the general public. “Even if the algorithm never made mistakes in visible light, it would also have to be proven to be reliable in terms of performance and strong enough to withstand attacks. This use of biometrics also raises questions about privacy: what information is transmitted, to whom, who could store it etc.,” adds Jean-Luc Dugelay.

Several prospects are possible for the future. First of all, the security of the system could be increased. “Each sensor associated with a camera has a specific noise which distinguishes it from all other sensors. It’s like digital ballistics. The system could then verify that it is indeed your iris and, in addition, verify that it is your mobile phone based on the noise in the image. This would make the protocol more difficult to pirate,” explains Jean-Luc Dugelay. Other solutions may emerge in the future, but in the meantime, the algorithmic portion developed by the Eurecom team is well on its way to becoming operational.

 

réseau de chaleur, heating networks

Improving heating network performance through modeling

At the IMT “Energy in the Digital Revolution” conference held on April 28, Bruno Lacarrière, an energetics researcher with IMT Atlantique, presented modeling approaches for improving the management of heating networks. Combined with digital technology, these approaches support heating distribution networks in the transition towards smart management solutions.

The building sector accounts for 40% of European energy consumption. As a result, renovating this energy-intensive sector is an important measure in the law on the energy transition for green growth. This law aims to improve energy efficiency and reduce greenhouse gas emissions. In this context, heating networks currently account for approximately 10% of the heat distributed in Europe. These systems deliver heating and domestic hot water from all energy sources, although today the majority are fossil-fueled. “The heating networks use old technology that, for the most part, is not managed in an optimal manner. Just like smart grids for electrical networks, they must benefit from new technology to ensure better environmental and energy management,” explains Bruno Lacarrière, a researcher with IMT Atlantique.

 

Understanding the structure of an urban heating network

A heating network is made up of a set of pipes that run through the city, connecting energy sources to buildings. Its purpose is to transport heat over long distances while limiting losses. A given network may have several sources (or production units): a waste incineration plant, a gas or biomass heating plant, surplus industrial heat, or a geothermal plant. These units are connected by pipelines carrying heat in the form of liquid water (or occasionally steam) to substations, which then redistribute the heat to the different buildings.

These seemingly simple networks are in fact becoming increasingly complex. As cities transform, new energy sources and new consumers are being added. “We now have configurations that are more or less centralized, and at times intermittent. What is the best way to manage the overall system? The complexity of these networks is similar to that of electrical networks, on a smaller scale. This is why we are looking at whether a ‘smart’ approach could be used for heating networks,” explains Bruno Lacarrière.

 

Modeling and simulation for the a priori and a posteriori assessment of heating networks

To deal with this issue, researchers are working on modeling both the heating networks and the demand. To develop a reliable model, a minimum amount of information is required. For demand, the buildings’ consumption history can be used to anticipate future needs. However, this data is not always available. The researchers can also develop physical models based on a minimal knowledge of the buildings’ characteristics, yet some information remains out of reach. “We do not have access to all the buildings’ technical information. We also lack information on how the inhabitants occupy the buildings,” Bruno Lacarrière points out. “We rely on simplified approaches, based on hypotheses or external references.”
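
To give an idea of what such a simplified demand model can look like, the sketch below regresses daily heat demand on heating degree-days computed from the outdoor temperature. The data and coefficients are synthetic; the models used by the researchers are naturally more detailed.

```python
# Degree-day-style demand model sketch: regress daily heat demand on heating
# degree-days derived from outdoor temperature. All figures are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
outdoor_temp = rng.uniform(-5, 18, size=365)                 # daily mean, deg C
BASE_TEMP = 17.0                                             # heating threshold
degree_days = np.maximum(BASE_TEMP - outdoor_temp, 0.0)
demand_mwh = 0.8 * degree_days + 2.0 + rng.normal(0, 0.5, 365)  # "measured"

model = LinearRegression().fit(degree_days.reshape(-1, 1), demand_mwh)
cold_day = np.array([[BASE_TEMP - (-3.0)]])                  # a -3 deg C day
print("predicted demand:", round(float(model.predict(cold_day)[0]), 1), "MWh")
```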

For the heating networks themselves, researchers assess the best use of the heat sources (fossil, renewable, intermittent, storage, excess heat…), based on a priori knowledge of the production units connected to the network. The network as a whole must provide the heat distribution service, and do so in a cost-effective and environmentally friendly manner. “Our models allow us to simulate the entire system, taking into account the constraints and characteristics of the sources and the distribution. The demand then becomes a constraint.”
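
The dispatch question can be illustrated with a toy optimization: meet a given demand at minimum cost from several sources with different costs and capacities. The figures below are invented, and the linear program is only a sketch of the principle.

```python
# Toy heat dispatch: meet demand at minimum cost from several sources, each
# with its own cost and capacity. Costs, capacities and demand are invented.
from scipy.optimize import linprog

costs = [10.0, 35.0, 60.0]               # EUR/MWh: incinerator, biomass, gas boiler
capacities = [(0, 8), (0, 6), (0, 10)]   # MW available from each source
demand = 12.0                            # MW to deliver this hour

result = linprog(c=costs,
                 A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],   # production = demand
                 bounds=capacities)

print(result.x)   # -> [8. 4. 0.]: cheapest sources first, gas boiler unused
```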

The models that are developed are then used in various applications. The demand simulation is used to measure the direct and indirect impacts climate change will have on a neighborhood. It makes it possible to assess the heat networks in a mild climate scenario and with high performance buildings. The heat network models are used to improve the management and operation strategies for the existing networks. Together, both types of models help determine the potential effectiveness of deploying information and communication technology for smart heating networks.

 

The move towards smart heating networks

Heating networks are entering their fourth generation. This means that they are operating at lower temperatures. “We are therefore looking at the idea of networks with different temperature levels, while examining how this would affect the overall operation,” the researcher adds.

In addition to the modelling approach, the use of information and communication technology allows for an increase in the networks’ efficiency, as was the case for electricity (smart monitoring, smart control). “We are assessing this potential based on the technology’s capacity to better meet demand at the right cost,” Bruno Lacarrière explains.

Deploying this technology in the substations, and the information provided by the simulation tools, go hand in hand with the prospect of deploying more decentralized production or storage units, turning consumers into consumer-producers [1], and even connecting to networks of other energy forms (e.g. electricity networks), thus reinforcing the concept of smart networks and the need for related research.

 

[1] The consumers become energy consumers and/or producers in an intermittent manner. This is due to the deployment of decentralized production systems (e.g. solar panels).

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 

Also read on I’MTech

Energy transition, Digital technology transition

Digital technology and energy: inseparable transitions

[dropcap]W[/dropcap]hat if one transition was inextricably linked with another? Faced with environmental challenges, population growth and the emergence of new uses, a transformation is underway in the energy sector. Renewable resources are playing a larger role in the energy mix, advances in housing have helped reduce heat loss, and modes of transportation are changing their models to limit the use of fossil fuels. But beyond these major metamorphoses, the energy transition in progress is intertwined with that of digital technology. Electrical grids, like heat networks, are becoming “smart.” Modeling is now seen as indispensable from the earliest stages of the design or renovation of buildings.

The line dividing these two transitions is indeed so fine that it is becoming difficult to determine to which category belong the changes taking place in the world of connected devices and telecommunications. For mobile phone operators, power supply management for mobile networks is a crucial issue. The proportion of renewable energy must be increased, but this leads to a lower quality of service. How can the right balance be determined?  And telephones themselves pose challenges for improving energy autonomy, in terms of both hardware and software.

This interconnection illustrates the complexity of the changes taking shape in contemporary societies. In this report we seek to present issues situated at the interface between energy and digital technology. Through research carried out in the laboratories of IMT graduate schools, we outline some of the major challenges currently facing civil society, economic stakeholders and public policymakers.

For consumers, the forces at play in the energy sector may appear complex. Often reduced to a sort of technological optimism that takes no account of scientific reality, they are shaped by significant political and economic issues. The first part of this report helps reframe the debate by providing an overview of the energy transition through an interview with Bernard Bourges, a researcher who specializes in this subject. The European SEAS project is then presented as a concrete example of the transformations underway, offering a look at the reality behind the promises of smart grids.

[one_half]

[/one_half][one_half_last]

[/one_half_last]

The second part of the report focuses on heat networks, which, like electric networks, can also be improved with the help of algorithms. Heat networks represent 9% of the heat distributed in Europe and can therefore act as a catalyst for reducing energy costs in buildings. Bruno Lacarrière’s research illustrates the importance of digital modeling in the optimization of these networks (article to come). And because reducing heat loss is also important at the level of individual buildings, we take a closer look at Sanda Lefteriu’s research on how to improve the energy performance of homes.

[one_half]

[/one_half][one_half_last]

[/one_half_last]

The report concludes with a third section dedicated to the telecommunications sector. An overview of Loutfi Nuaymi’s work highlights the role played by operators in optimizing the energy efficiency of their networks and demonstrates how important algorithms are becoming for them. We also examine how electric consumption can be regulated by reducing the demand for calculations in our mobile phones, with a look at research by Maurice Gagnaire. Finally, since connected devices require ever-more powerful batteries, the last article explores a new generation of lithium batteries, and the high hopes for the technologies being developed in Thierry Djenizian’s laboratory.

[one_half]

[/one_half][one_half_last]

[/one_half_last]

 

[divider style=”normal” top=”20″ bottom=”20″]

To further explore this topic:

To learn more about how the digital and energy transitions are intertwined, we suggest these related articles from the I’MTech archives:

5G will also consume less energy

In Nantes the smart city becomes a reality with mySMARTLife

Data centers: taking up the energy challenge

The bitcoin and blockchain: energy hogs

[divider style=”normal” top=”20″ bottom=”20″]

energy performance, performance énergétique, sanda lefteriu

Energy performance of buildings: modeling for better efficiency

Sanda Lefteriu, a researcher at IMT Lille-Douai, is working on developing predictive and control models designed for buildings with the aim of improving energy management. A look at the work presented on April 28 at the IMT “Energy in the digital revolution” symposium.

Good things come to those who wait. Seven years after the Grenelle 2 law, a decree published on May 10 requires buildings in the private and public service sectors (see insert) to improve their energy performance. The text sets a requirement to reduce consumption by 25% by 2020 and by 40% by 2030.[1] To do so, reliable, easy-to-use models must be established to predict the energy behavior of buildings in near-real time. This is the goal of research conducted by Balsam Ajib, a PhD student supervised by Sanda Lefteriu and Stéphane Lecoeuche of IMT Lille-Douai, and by Antoine Caucheteux of Cerema.

 

A new approach for modeling thermal phenomena

State-of-the-art experimental approaches for evaluating the energy performance of buildings rely on models with what are referred to as “linear” structures. This means that the input variables of the model (weather, radiation, heating power, etc.) are linked to the output of this same model (the temperature of a room, for example) only through a linear equation. However, a number of phenomena occurring within a room, and therefore within the system, can temporarily disrupt its thermal equilibrium. For example, a large number of people inside a building will cause the temperature to rise. The same is true when the sun shines on a building whose shutters are open.

Based on this observation, the researchers propose using what is called a “commutation” model, which takes into account discrete events that occur at a given moment and influence the continuous behavior of the system being studied (the change in temperature). “For a building, events like opening or closing windows or doors are commutations (0 or 1) which disrupt the dynamics of the system. But we can separate these actions from the linear behavior in order to identify their impact more clearly,” explains the researcher. To do so, she has developed several models, each of which corresponds to a specific situation. “We estimate one model per configuration: for example, a situation in which the door and windows are closed and the heating is set at 20°C corresponds to one model. If we change the temperature to 22°C, we identify another, and so on,” adds Sanda Lefteriu.
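
A toy version of such a commutation model is easy to write down: one discrete-time linear model per configuration, with a 0/1 event (here, a window being opened) deciding which model is active at each time step. The parameter values are invented for illustration and are not those identified by the researchers.

```python
# Switched ("commutation") thermal model sketch: the room follows one linear
# model per configuration, and a 0/1 event selects which model is active.
import numpy as np

# One linear model per configuration: T[k+1] = a*T[k] + b*T_out[k] + c*P_heat[k]
MODELS = {
    "window_closed": dict(a=0.95, b=0.05, c=0.001),
    "window_open":   dict(a=0.80, b=0.20, c=0.001),
}

def simulate(temp0, t_out, p_heat, window_open):
    """Simulate indoor temperature; `window_open` is a 0/1 signal per time step."""
    temps = [temp0]
    for k in range(len(t_out)):
        m = MODELS["window_open"] if window_open[k] else MODELS["window_closed"]
        temps.append(m["a"] * temps[-1] + m["b"] * t_out[k] + m["c"] * p_heat[k])
    return np.array(temps)

steps = 60                                   # one hour, one-minute steps
t_out = np.full(steps, 5.0)                  # 5 deg C outside
p_heat = np.full(steps, 1000.0)              # 1 kW of heating
window = np.zeros(steps); window[30:40] = 1  # window opened for 10 minutes
print(simulate(20.0, t_out, p_heat, window)[[0, 30, 40, 60]].round(1))
```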

 

Objective: use these models for all types of buildings

To create these scenarios, researchers use real data collected inside buildings following measurement programs. Sensors were placed on the campus of IMT Lille-Douai and in passive houses which are part of the INCAS platform in Chambéry. These uninhabited residences offer a completely controlled site for experimenting since all the parameters related to the building (structure, materials) are known. These rare infrastructures make it possible to set up physical models, meaning models built according to the specific characteristics of the infrastructures being studied. “This information is rarely available so that’s why we are now working on mathematical modeling which is easier to implement,” explains Sanda Lefteriu.

“We’re only at the feasibility phase, but these models could be used to estimate heating power, and therefore the energy performance of buildings, in real time,” adds the researcher. Applications will be put in place in social housing as part of the ShINE European project, in which IMT Lille-Douai is taking part. The goal of this project is to reduce carbon emissions from housing.

These tools will be used for existing buildings. Once the models are operational, control algorithms installed on robots will be placed in the infrastructures. Finally, another series of tools will be used to link physical conditions with observations in order to focus new research. “We still have to identify which physical parameters change when we observe a new dynamic,” says Sanda Lefteriu. These models remain to be built, just like the buildings which they will directly serve.

 

[1] Buildings currently represent 40-45% of energy spending in France across all sectors. Find out more about key energy figures in France.

 

This article is part of our dossier Digital technology and energy: inseparable transitions!

 

[box type=”shadow” align=”” class=”” width=””]

Energy performance of buildings:

The energy performance of a building includes its energy consumption and its impact in terms of greenhouse gas emissions. Consideration is given to the hot water supply system, heating, lighting and ventilation. Other building characteristics to be assessed include insulation, location and orientation. An energy performance certificate is a standardized way to measure how much energy is actually consumed or estimated to be consumed according to standard use of the infrastructure. [/box]