
BeePMN, bees, beekeeper

BeePMN: Monitoring bees to take better care of them

At the crossroads between Industry 4.0 and the Internet of Things, the BeePMN research project aims to help amateur and professional beekeepers. It will feature an intuitive smartphone app that combines business processes with real-time measurements of apiaries. 

When a colony of bees becomes too crowded for its hive, the queen stops laying eggs and the worker bees leave in search of a place to start a new colony. The hive splits into two groups: those that follow the queen to explore new horizons, and those that stay and choose a new queen to take over the leadership of the colony. As exciting as this new adventure is for the bees, for the beekeeper who maintains the hive this new beginning brings complications. In particular, the loss of part of the colony leads to a decrease in honey production. Worse still, the loss of bees can also be caused by something more serious, such as the emergence of a virus or an invasion that threatens the health of the colony.

Beekeepers therefore monitor these events in the life of their bees very closely, but keeping track of hives on a daily basis is a major challenge, and above all a question of time. The BeePMN project, at the crossroads between Industry 4.0 processes and the Internet of Things, wants to give beekeepers eyes in the back of their heads so they can monitor the health of their hives in real time. BeePMN combines a non-invasive sensor system, which provides real-time data, with an intuitive and easy-to-use application that provides decision-making support.

This project was launched as part of the Hubert Curien Partnerships, which support scientific and technological exchanges between countries, with apiary sites installed both in France, near Alès, and in Lebanon, in partnership with the beekeeping cooperative Atelier du Miel. It is supported by a collaboration between a team led by Grégory Zacharewicz, Nicolas Daclin and François Trousset at IMT Mines Alès, a team led by Charles Yaacoub and Adib Akl at the Holy Spirit University of Kaslik in Lebanon, and the company ConnectHive. This company, which specializes in engineering applied to the beekeeping industry, was founded by François Pfister, a retired IMT Mines Alès researcher and beekeeping enthusiast.

BeePMN has several goals: to monitor the health of the hives, to increase honey production, and to facilitate the sharing of knowledge between amateurs and professionals. 

“I actually work on business process problems in industry,” says Grégory Zacharewicz, a researcher at IMT Mines Alès on the project. “But the synergy with these different partners has directed us more towards the craft sector, and specifically beekeeping,” with the aim of providing tools to accelerate their tasks or reminders about certain activities. “I often compare BeePMN to a GPS: it is of course possible to drive without it, but it’s a tool that guides the driver to optimize his choices,” he explains. 

Making better decisions 

The sites in both France and Lebanon are equipped with connected sensors, non-invasive for the bee colonies, which gather real-time data on their health as well as on humidity, temperature and weight. To measure weight, the researchers have developed ‘nomad’ scales, which are less expensive than the usual fixed models. This data is then recorded in an application to help guide beekeepers in their daily choices. Though professionals are used to making these kinds of decisions, they may not necessarily have all the information at hand, nor the time to monitor all their apiaries.

The data observed by the sensors is paired with other environmental information such as the current season, weather conditions, and the flowering period. This allows for precise information on each hive and its environment, and improves the relevance of possible actions and choices. 

“If, for example, we observe a sudden 60% weight loss in a hive, there is no other option than to harvest it,” says Charbel Kady, a PhD student at IMT Mines Alès who is also working on the BeePMN project. On the other hand, if the weight loss happens gradually over the course of the week, that might be the result of lots of other factors, like a virus attacking the colony, or part of the colony moving elsewhere. That is the whole point of combining this essential data, like weight, with environmental variables, to provide more certainty on the cause of an event. “It’s about making sense of the information to identify the cause,” notes Charbel Kady. 
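To make this reasoning concrete, here is a minimal sketch of how such a rule could be expressed in software. It is not taken from the BeePMN application: the field names, the observation windows and every threshold except the 60% figure quoted above are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class HiveReading:
    """One aggregated reading for a hive (field names are hypothetical)."""
    weight_drop_pct: float   # weight lost over the observation window, in %
    window_hours: float      # length of the observation window, in hours
    flowering: bool          # is a honey flow expected right now?

def interpret(reading: HiveReading) -> str:
    """Toy decision rule combining weight data with environmental context."""
    # A sudden, large drop calls for immediate action on the hive.
    if reading.weight_drop_pct >= 60 and reading.window_hours <= 24:
        return "ALERT: sudden weight loss - inspect and harvest the hive"
    # A gradual drop is ambiguous: disease, robbing, or part of the colony leaving.
    if reading.weight_drop_pct >= 20 and reading.window_hours >= 7 * 24:
        if not reading.flowering:
            return "WARNING: gradual loss outside the honey flow - check colony health"
        return "INFO: gradual loss during the honey flow - keep monitoring"
    return "OK: no action needed"

print(interpret(HiveReading(weight_drop_pct=65, window_hours=12, flowering=True)))
```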

The researchers would also like to add vegetation maps to the environmental information. This is an important aspect, especially with regard to honey plants, but this information is difficult to find for certain regions and complex to integrate into an application. The project also aims to move towards prevention: a PhD student, Marianne El Kassis, joined the BeePMN team to work on simulations and to integrate them into the application, in order to anticipate potential risks.

Learn through play 

The two researchers stressed that one of the points of the application is for beekeepers to help each other. “Beekeepers can share information with each other, and the interesting model of one colleague can be copied and integrated into the everyday life of another,” says Charbel Kady. The application centralizes the data for a set of apiaries, and beekeepers can share their results with each other or make them available to beginners. That is the core of the second part of the project: a ‘serious game’ offering a simplified and fun version of the tool to less experienced amateur beekeepers.

Professionals are accustomed to repeating a certain set of actions, so it is possible to formalize them with digital tools in the form of business processes to guide amateurs in their activities. “We organized several meetings with beekeepers to define these business rules and to integrate them into the application, and when the sensors receive the information, it triggers certain actions or alerts, for example taking care of the honey harvest, or needing to add wax to the hive,” explains Grégory Zacharewicz. 

“There is a strong aspect of knowledge and skill transfer. We can imagine it like a sort of companionship to pass on the experience acquired,” says the researcher. The GPS analogy applies here too: “It makes available a whole range of past choices from professionals and other users, so that when you encounter a particular situation, it suggests the best response based on what other users have decided in the past,” he adds. The app works along the same lines, offering the possibility to capitalize on professionals’ knowledge of business processes in order to educate yourself and learn while being guided at the same time.

The BeePMN project is based on beekeeping activities, but as the researchers point out, the concept itself can be applied to various fields. “We can think of a lot of human and industrial activities where this project could be replicated to support decision-making processes and make them stronger,” explains Grégory Zacharewicz.

Tiphaine Claveau

cryptography, random numbers

Cryptography: what are random numbers for?

Hervé Debar, Télécom SudParis – Institut Mines-Télécom and Olivier Levillain, Télécom SudParis – Institut Mines-Télécom

The original purpose of cryptography is to allow two parties (traditionally referred to as Alice and Bob) to exchange messages without another party (traditionally known as Eve) being able to read them. Alice and Bob will therefore agree on a method to exchange each message, M, in an encrypted form, C. Eve can observe the medium through which the encrypted message (or ciphertext) C is sent, but she cannot retrieve the information exchanged without knowing the necessary secret information, called the key.

This is a very old exercise, since we speak, for example, of the ‘Julius Caesar Cipher’. However, it has become very important in recent years, due to the increasing need to exchange information. Cryptography has therefore become an essential part of our everyday lives. Besides the exchange of messages, cryptographic mechanisms are used in many everyday objects to identify and authenticate users and their transactions. We find these mechanisms in phones, for example, to encrypt and authenticate communication between the telephone and radio antennas, or in car keys, and bank cards.

The internet has also popularized the ‘padlock’ displayed in browsers to indicate that the communication between the browser and the server is protected by cryptographic mechanisms. To function correctly, these mechanisms require the use of random numbers, whose quality (or, more precisely, unpredictability) contributes to the security of the protocols.

Cryptographic algorithms

To transform a message M into an encrypted message C, by means of an algorithm A, keys are used. In so-called symmetric algorithms, we speak of secret keys (Ks), which are shared and kept secret by Alice and Bob. In asymmetric algorithms, there are public (KPu) and private (KPr) key pairs. For each user, KPu is known to all, whereas KPr must be kept safe by its owner. Algorithm A is also public, which means that the secrecy of communication relies solely on the secrecy of the keys (secret or private).

Sometimes, the message M being transmitted is not important in itself, and the purpose of encrypting said message M is only to verify that the correspondent can decrypt it. This proof of possession of Ks or KPr can be used in some authentication schemes. In this case, it is important never to use the same message M more than once, since this would allow Eve to find out information pertaining to the keys. Therefore, it is necessary to generate a random message NA, which will change each time that Alice and Bob want to communicate.
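As an illustration of this kind of proof of possession, here is a hedged sketch of a challenge-response exchange built on a shared secret key Ks and a fresh random message NA. The use of an HMAC is only one possible way to implement the ‘encryption’ of NA, chosen here for brevity; it is not presented as the scheme the authors have in mind.

```python
import hashlib
import hmac
import secrets

Ks = secrets.token_bytes(32)   # secret key shared by Alice and Bob

# Alice challenges Bob with a fresh random message NA, never reused...
NA = secrets.token_bytes(16)

# ...and Bob proves he knows Ks by returning a keyed hash of NA.
response = hmac.new(Ks, NA, hashlib.sha256).digest()

# Alice recomputes the expected answer and compares in constant time.
expected = hmac.new(Ks, NA, hashlib.sha256).digest()
print("Bob authenticated:", hmac.compare_digest(expected, response))
```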

The best known and probably most widely used example of this mechanism is the Diffie-Hellman algorithm. This algorithm allows a browser (Alice) and a website (Bob) to obtain an identical secret key K, different for each connection, by having exchanged their respective KPu beforehand. This process is performed, for example, when connecting to a retail website. It allows the browser and the website to exchange encrypted messages with a key that is destroyed at the end of each session. This means that there is no need to store it (which aids both ease of use and security, since there is less chance of losing the key). It also means that not much traffic is encrypted with the same key, which makes cryptanalysis attacks more difficult than if the same key were always used.
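To show the principle, here is a toy Diffie-Hellman exchange with deliberately small public parameters; real deployments use 2048-bit groups or elliptic curves, and this sketch leaves out the authentication of the exchanged values.

```python
import secrets

# Public parameters, known to everyone including Eve:
# a small prime modulus p and a base g (toy values for illustration only).
p = 0xFFFFFFFB
g = 5

a = secrets.randbelow(p - 2) + 1   # Alice's private random value
b = secrets.randbelow(p - 2) + 1   # Bob's private random value

A = pow(g, a, p)   # Alice sends A to Bob
B = pow(g, b, p)   # Bob sends B to Alice

# Both sides compute the same session key without ever transmitting it.
key_alice = pow(B, a, p)
key_bob = pow(A, b, p)
assert key_alice == key_bob
print("shared key:", hex(key_alice))
```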

Generating random numbers

To ensure that Eve is unable to obtain the secret key, it is very important that she cannot guess the message NA. In practice, this message is often a large random number used in the calculations required by the chosen algorithm.

Initially, random number generation was used mainly for simulation work. To obtain relevant results, it is important not to run the same simulation repeatedly with identical parameters, but to repeat it hundreds or even thousands of times with different parameters. The aim is to generate numbers that respect certain statistical properties, and that do not allow the sequence of numbers to be distinguished from a sequence that would be obtained by rolling dice, for example.

To generate a random number NA that can be used in these simulations, so-called pseudo-random generators are normally used, which repeatedly apply a transformation algorithm to an initial value, known as the ‘seed’. These pseudo-random generators aim to produce a sequence of numbers that resembles a random sequence, according to these statistical criteria. However, using the same seed twice will result in obtaining the same sequence twice.
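This reproducibility is easy to observe with any standard library generator; the sketch below uses Python's default Mersenne Twister, which is fine for simulation but, as discussed next, unsuitable on its own for cryptography.

```python
import random

# Two pseudo-random generators seeded with the same value...
g1 = random.Random(42)
g2 = random.Random(42)

# ...produce exactly the same "random" sequence.
seq1 = [g1.randint(0, 9) for _ in range(5)]
seq2 = [g2.randint(0, 9) for _ in range(5)]
print(seq1)
print(seq2)
assert seq1 == seq2   # same seed, same sequence
```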

The pseudo-random generator algorithm is usually public. If attackers are able to guess the seed, they will be able to generate the random sequence and thus obtain the random numbers used by the cryptographic algorithms. In the specific case of cryptography, the attacker does not necessarily even need to know the exact value of the seed: narrowing it down to a small set of possible values is enough to quickly compute all the possible keys and crack the encryption.

In the 2000s, programmers used seeds that could be easily guessed, based on the time for example, making systems vulnerable. Since then, to prevent the seed (or a set of possible values for the seed) from being guessed, operating systems rely on a mixture of physical measurements from the system (e.g. processor temperature, bus activity, etc.). These physical elements are practically impossible for an attacker to observe and vary frequently, and therefore provide a good seed source for pseudo-random generators.
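In practice, application code should not pick its own seed at all but draw cryptographic randomness directly from the operating system's pool. A minimal sketch in Python:

```python
import os
import secrets

# Raw bytes straight from the operating system's entropy pool,
# which mixes hard-to-observe physical measurements.
seed_material = os.urandom(32)

# The secrets module wraps the same source and is the recommended
# way to generate keys, tokens and nonces in application code.
session_key = secrets.token_bytes(32)

print(seed_material.hex())
print(session_key.hex())
```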

What about vulnerabilities?

Although the field is now well understood, random number generators are still sometimes subject to vulnerabilities. For example, between 2017 and 2021, cybersecurity researchers found 53 such vulnerabilities (CWE-338). This represents only a small number of software flaws (less than 1 in 1000). Several of these flaws, however, are of a high or critical level, meaning they can be used quite easily by attackers and are widespread.

A prime example dates from 2010, with Sony’s error in the PS3 software signature system. In this case, the reuse of the same random value for two different signatures allowed attackers to recover the manufacturer’s private key: it then became possible to install any software on the console, including pirated software and malware.
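The underlying algebra is simple. In DSA-style signatures, each signature has the form s = k^-1 * (h + r*d) mod n, where h is the message hash, d the private key and k the per-signature random value. If the same k (and therefore the same r) is used for two messages, the two equations can be solved for d. The sketch below illustrates this with small stand-in numbers rather than real elliptic-curve parameters.

```python
# Toy illustration of why reusing the per-signature random value k is fatal.
# DSA/ECDSA signing: s = k^-1 * (h + r*d) mod n, with private key d.
n = 0xFFFFFFFB        # small prime standing in for the group order
d = 123456789         # "private" key that the attacker will recover
k = 987654321         # random value mistakenly used for both signatures
r = pow(7, k, n)      # stand-in for the r component (identical in both signatures)

def sign(h):
    return (pow(k, -1, n) * (h + r * d)) % n

h1, h2 = 1111, 2222   # hashes of two different messages
s1, s2 = sign(h1), sign(h2)

# The attacker only needs the two signatures and the two hashes:
k_rec = ((h1 - h2) * pow((s1 - s2) % n, -1, n)) % n
d_rec = ((s1 * k_rec - h1) * pow(r, -1, n)) % n
assert d_rec == d
print("recovered private key:", d_rec)
```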

Between 2017 and 2021, flaws also affected physical components, such as Intel Xeon processors, Broadcom chips used for communications and Qualcomm Snapdragon processors embedded in mobile phones. These flaws affect the quality of random number generation. For example, CVE-2018-5871 and CVE-2018-11290 relate to a seed generator whose period is too short, i.e. one that quickly repeats the same sequence of seeds. These flaws have been fixed and only affect certain functions of the hardware, which limits the risk.

The quality of random number generation is therefore a security issue. Operating systems running on recent processors (less than 10 years old) have hardware-based random number generation mechanisms. This generally ensures good-quality random numbers and thus the proper functioning of cryptographic algorithms, even if occasional vulnerabilities may arise. The difficulty is more pronounced for connected objects, whose hardware capacities do not allow the implementation of random generators as powerful as those available on computers and smartphones, and which often prove to be more vulnerable.

Hervé Debar, Director of Research and Doctoral Training, Deputy Director, Télécom SudParis – Institut Mines-Télécom and Olivier Levillain, Assistant Professor, Télécom SudParis – Institut Mines-Télécom

This article has been republished from The Conversation under a Creative Commons license. Read the original article.

optical networks

The virtualization of optical networks to support… 5G

Mobile networks are not entirely wireless. They also rely on a network of optical fibers, which connect antennas to the core network, among other things. With the arrival of 5G, optical networks must be able to keep up with the ramping up of the rest of the mobile network to ensure the promised quality of service. Two IMT Atlantique researchers are working on this issue, by making optical networks smarter and more flexible.  

In discussions of issues surrounding 5G, it is common to hear about the installation of a large number of antennas or the need for compatible devices. But we often overlook a crucial aspect of mobile networks: the fiber-optic infrastructure on which they rely. Like previous generations, 5G relies in most cases on a wired connection, including in the “last mile”: optical fiber connects the antennas to the core network equipment, which is itself linked to most of the connected machines around the world. It can also connect various devices within the same antenna site.

In reality, 5G is even more dependent on this infrastructure than previous generations, since the next-generation technology comes with new requirements linked to new uses, such as the Internet of Things (IoT). An application such as the autonomous car, for example, requires high availability, perfect reliability, very low latency, etc. All of these constraints weigh on the overall architecture, which includes the optical fiber. If optical networks cannot adapt to new demands within the last mile, the promises of 5G will be jeopardized, and new services (industry 4.0, connected cities, telesurgery, etc.) simply will not be able to be provided in a reliable, secure way.

Facilitating network management through better interoperability

Today, optical networks are usually over-provisioned in relation to current average throughput needs. They are designed to absorb 4G peak loads and are neither optimized nor able to adapt intelligently to fluctuating demand. The new reality created by 5G therefore represents both a threat to the infrastructure, in terms of its ability to respond to new challenges, and an opportunity to rethink its management.

Isabel Amigo and Luiz Anet Neto, telecommunications researchers at IMT Atlantique, are working with a team of researchers and PhD students to conduct research in this area. Their goal is to make optical networks smarter, more flexible and more independent from the proprietary systems imposed by vendors. A growing number of operators are moving in this direction. “At Orange, it used to be common to meet specialists in configuration syntaxes and equipment management for just one or two vendors,” explains Luiz Anet Neto, who worked for the French group for five years. “Now, teams are starting to set up a “translation layer” that turns the various configurations, which are specific to each vendor, into a common language that is more straightforward and abstract.”

This “translation layer”, on which he is working with other researchers, is called SDN, for Software-Defined Networking. This model is already used in the wireless part of the network and involves offloading certain functions from network equipment. Traditionally, this equipment handles many tasks: data processing (receiving packets and forwarding them to their destination) as well as a number of control tasks (routing protocols, transmission interfaces, etc.). With SDN, the equipment is relieved of these control tasks, which are centralized within an “orchestrator” entity that can control several devices at once.

Read more on I’MTech: What is SDN?

There are many benefits to this approach. It provides an overview of the network, making it easier to manage, while making it possible to control all of the equipment, regardless of its vendor, without having to know any proprietary language. “To understand the benefit of SDN, we can use an analogy between a personal computer and the SDN paradigm,” says Isabel Amigo. “Today, it would be unthinkable to have a computer that would only run applications that use a specific language. So, machines have an additional layer – the operating system – that is in charge of “translating” the various languages, as well as managing resources, memory, disks etc. SDN therefore aims to act like an operating system, but for the network.” Similarly, the goal is to be able to install applications that are able to work on any equipment, regardless of the hardware vendor. These applications could, for example, distribute the load based on demand.
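To give a rough idea of what this “operating system for the network” means in practice, here is a hedged sketch in which vendor-specific drivers translate one abstract instruction into each vendor's own configuration dialect. All class names, method names and command syntaxes are invented for illustration and are not taken from any real SDN controller or vendor CLI.

```python
from abc import ABC, abstractmethod

class Driver(ABC):
    """Vendor-specific 'translation layer' (all names here are invented)."""
    @abstractmethod
    def apply(self, port: str, bandwidth_gbps: int) -> str:
        ...

class VendorADriver(Driver):
    def apply(self, port, bandwidth_gbps):
        return f"set interface {port} speed {bandwidth_gbps}g"

class VendorBDriver(Driver):
    def apply(self, port, bandwidth_gbps):
        return f"conf t ; int {port} ; bandwidth {bandwidth_gbps}000000"

class Orchestrator:
    """Central controller: one abstract intent, many vendor dialects."""
    def __init__(self, devices):
        self.devices = devices   # maps device name -> vendor driver

    def set_bandwidth(self, port, bandwidth_gbps):
        for name, driver in self.devices.items():
            print(name, "->", driver.apply(port, bandwidth_gbps))

orchestrator = Orchestrator({"olt-1": VendorADriver(), "olt-2": VendorBDriver()})
orchestrator.set_bandwidth("0/1", 10)
```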

Breaking our dependence on hardware vendors

SDN often goes hand in hand with another concept, inspired by virtualization in data centers: NFV (Network Functions Virtualization). Its principle is to be able to execute any network functionality (not just control functions) on generic servers via software applications. "Usually, dedicated equipment is required for these functions," says the IMT Atlantique researcher. "For example, if you want to have a firewall, you need to buy a specific device from a vendor. With NFV, this is no longer necessary: you can implement the function on any server via an application."

Read more on I’MTech: What is NFV?
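As a toy illustration of the NFV idea, a network function can be expressed as ordinary software running on a generic server rather than a dedicated appliance. The sketch below shows a firewall rule as a plain function; the policy and packet format are illustrative assumptions, not how production virtual network functions are packaged.

```python
# A "virtual network function": a firewall expressed as ordinary software
# that could run on any generic server instead of a dedicated appliance.
BLOCKED_PORTS = {23, 445}   # illustrative policy

def firewall(packet: dict) -> bool:
    """Return True if the packet may pass, False if it is dropped."""
    return packet["dst_port"] not in BLOCKED_PORTS

print(firewall({"src": "10.0.0.5", "dst_port": 443}))   # True: allowed
print(firewall({"src": "10.0.0.5", "dst_port": 23}))    # False: dropped
```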

As with SDN, the arrival of virtualization in optical networks promotes better interoperability. This makes it harder for vendors to require the use of their proprietary systems linked to their equipment. The market is also changing, by making more room for software developers. “But there is still a long way to go,” says Luiz Anet Neto. “Software providers can also try to make their customers dependent on their products, through closed systems. So operators have to remain vigilant and offer an increasing level of interoperability.”

Operators are working with the academic world precisely for this purpose. They would fully benefit from standardization, which would simplify the management of their optical networks. Laboratory tests carried out by IMT Atlantique in partnership with Orange provide them with technical information and areas to explore ahead of discussions with vendors and standardization bodies.

Sights are already set on 6G

For the research teams, there are many areas for development. First of all, the scientists are seeking to further demonstrate the value of their research, through testing focusing on a specific 5G service (up to now, the experiments have not applied to a specific application). Their aim is to establish recommendations for optical link dimensioning to connect mobile network equipment.

The goal is then to move towards smart optimization of optical networks. To provide an example of how findings by IMT Atlantique researchers may be applied, it is currently possible to add a “probe” that can determine if a path is overloaded and shift certain services to another link if necessary. The idea would then be to develop more in-depth mathematical modeling of the phenomena encountered, in order to automate incident resolution using artificial intelligence algorithms.

And it is already time for researchers to look toward the future of technology. “Mobile networks are upgraded at a dizzying pace; new generations come out every ten years,” says Luiz Anet Neto. “So we already have to be thinking about how to meet future requirements for 6G!”

Bastien Contreras

IoT, Internet of Things

A standardized protocol to respond to the challenges of the IoT

The arrival of 5G has put the Internet of Things back in the spotlight, with the promise of an influx of connected objects in both the professional and private spheres. However, before witnessing the projected revolution, several obstacles remain. This is precisely what researchers at IMT Atlantique are working on, and they have already achieved results of global significance.

The Internet of Things (IoT) refers to the interconnection of various physical devices via the Internet for the purpose of sharing data. Sometimes referred to as the “Web 3.0”, this field is set to develop rapidly in the coming years, thanks to the arrival of new networks, such as 5G, and the proliferation of connected objects. Its applications are infinite: monitoring of health data, the connected home, autonomous cars, real-time and predictive maintenance on industrial devices, and more.

Although it is booming, the IoT still faces major challenges. “We need to respond to three main constraints: energy efficiency, interoperability and security,” explains Laurent Toutain, a researcher at IMT Atlantique. But there is one problem: these three aspects can be difficult to combine.

The three pillars of the IoT

First, energy is a key issue for the IoT. For most connected objects, the autonomy of a smartphone is not sufficient. In the future, a household may have several dozen such devices. If they each need to be recharged every two or three days, the user will have to devote several hours to this task. And what about factories that could be equipped with thousands of connected objects? In some cases, these are only of value if they have a long battery life. For example, a sensor could be used to monitor the presence of a fire extinguisher at its location and send an alert if it does not detect one. If you have to recharge its battery regularly, such an installation is no longer useful.

For a connected object, communication features account for the largest share of energy consumption. Thus, the development of IoT has been made possible by the implementation of networks, such as LoRa or Sigfox, allowing data to be sent while consuming little energy.

The second issue is interoperability, i.e. the ability of a product to work with other objects and systems, both current and future. Today, many manufacturers still rely on proprietary universes, which necessarily limits the functionalities offered by the IoT. Take the example of a user who has bought connected light bulbs from two different brands. They will not be able to control them via a single application.

Finally, the notion of security remains paramount within any connected system. This observation is all the more valid in the IoT, especially with applications involving the exchange of sensitive data, such as in the health sector. There are indeed many risks. An ill-intentioned user could intercept data during transmission, or send false information to connected objects, thus inducing wrong instructions, with potentially disastrous consequences.

Read more on I’MTech: The IoT needs dedicated security – now

On the Internet, methods are already in place to limit these threats. The most common is end-to-end data encryption. Its purpose is to make information unreadable while it is being transported, since the content can only be deciphered by the sender and receiver of the message.

Three contradictory requirements?

Unfortunately, each of the three characteristics can influence the others. For example, by multiplying the number of possible interlocutors, interoperability raises more security issues. But it also affects energy consumption. “Today, the Internet is a model of interoperability,” explains Laurent Toutain. “For this, it is necessary to send a large amount of information each time, with a high degree of redundancy. It offers remarkable flexibility, but it also takes up a lot of space.” This is only a minor disadvantage for a broadband network, but not for the IoT, which is constrained in its energy consumption.

Similarly, if you want to have a secure system, there are two main possibilities. The first is to close it off from the rest of the ecosystem, in order to reduce risks, which radically limits interoperability.

The second is to implement security measures, such as end-to-end encryption, which results in more data being sent, and therefore increased energy consumption.

Reducing the amount of data sent, without compromising security

For about seven years, Laurent Toutain and his teams have been working to reconcile these different constraints in the context of the IoT. “The idea is to build on what makes the current Internet so successful and adapt it to constrained environments,” says the researcher. “We are therefore taking up the principles of the encryption methods and protocols used today, such as HTTP, while taking into account the specific requirements of the IoT.”

The research team has developed a compression mechanism named SCHC (Static Context Header Compression, pronounced “chic”). It aims to improve the efficiency of encryption solutions and provide interoperability in low-power networks.

For this purpose, SCHC works on the headers of the usual Internet protocols (IP, UDP and CoAP), which contain various details: source address, destination address, location of the data to be read, etc. The particularity of this method is that it takes advantage of a specific feature of the IoT: a simple connected object, such as a sensor, has far fewer functions than a smartphone, so it is possible to anticipate the type of data it will send. “We can thus free ourselves from the redundancy of classic exchanges on the web,” says Laurent Toutain. “We then lose flexibility, which could be inconvenient for standard Internet use, but not for a sensor, which is limited in its applications.”
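A rough sketch of the idea behind static context compression follows. The field names and rule layout are simplified for illustration and do not follow the SCHC specification in detail: both ends share a context of rules describing the header fields they expect, so only a small rule identifier (plus any unpredictable residue, omitted here) needs to travel over the air.

```python
# Both the sensor and the network share this static context in advance.
# Rule 1 says: for this device, the IP/UDP/CoAP header fields always take
# these values, so none of them needs to be transmitted.
CONTEXT = {
    1: {
        "ip_src": "2001:db8::1",
        "ip_dst": "2001:db8::cafe",
        "udp_src": 5683,
        "udp_dst": 5683,
        "coap_path": "/temperature",
    }
}

def compress(headers: dict) -> bytes:
    """Replace a full header by the matching rule ID (here a single byte)."""
    for rule_id, fields in CONTEXT.items():
        if headers == fields:
            return bytes([rule_id])
    raise ValueError("no matching rule: send headers uncompressed")

def decompress(packet: bytes) -> dict:
    """Rebuild the full header from the shared context."""
    return CONTEXT[packet[0]]

compressed = compress({
    "ip_src": "2001:db8::1", "ip_dst": "2001:db8::cafe",
    "udp_src": 5683, "udp_dst": 5683, "coap_path": "/temperature",
})
print(len(compressed), "byte on the air instead of ~70-80 bytes of headers")
assert decompress(compressed)["coap_path"] == "/temperature"
```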

In this way, the team at IMT Atlantique has achieved significant results. It has managed to reduce the size of the headers traditionally sent, weighing 70-80 bytes, to only 2 bytes, and to 10 bytes in their encrypted version. “A quantity that is perfectly acceptable for a connected object and compatible with network architectures that consume very little energy,” concludes the researcher.

A protocol approved by the IETF

But what about that precious interoperability? With this objective, the authors of the study approached the IETF (Internet Engineering Task Force), the international organization for Internet standards. The collaboration has paid off, as SCHC has been approved by the organization and now serves as the global standard for compression. This recognition is essential, but it is only a first step towards effective interoperability. How can we now make sure that manufacturers really integrate the protocol into their connected objects? To this end, Laurent Toutain partnered with Alexander Pelov, also a researcher at IMT Atlantique, to found the start-up Acklio. The company works directly with manufacturers and offers them solutions to integrate SCHC into their products. It thus intends to accelerate the adoption of the protocol, an effort supported in particular by €2 million in funds raised at the end of 2019.

Read more on I’MTech: Acklio: linking connected objects to the Internet

Nevertheless, manufacturers still need to be convinced that using a standard is also in their interest. To this end, Acklio also aims to position SCHC among the protocols used within 5G. To achieve this, it will have to prove itself with the 3GPP (3rd Generation Partnership Project), which brings together the world’s leading telecommunications standards bodies. “A much more constraining process than that of the IETF,” warns Laurent Toutain, however.

Bastien Contreras