
When the internet goes down

Hervé Debar, Télécom SudParis – Institut Mines-Télécom, Université Paris-Saclay

[divider style=”normal” top=”20″ bottom=”20″]

“[dropcap]A[/dropcap] third of the internet is under attack. Millions of network addresses were subjected to distributed denial-of-service (DDoS) attacks over a two-year period,” reports Warren Froelich on the UC San Diego News Center website. A DDoS is a type of denial-of-service (DoS) attack in which the attacker carries out the attack using many sources distributed throughout the network.

But is the journalist justified in his alarmist reaction? Yes and no. If one third of the internet were under attack, then one in every three smartphones wouldn’t work, and one in every three computers would be offline. When we look around, we can see that this is obviously not the case, and if we now rely so heavily on our phones and Wikipedia, it is because we have come to view the internet as a network that functions well.

Still, the DDoS phenomenon is real. Recent attacks testify to this, such as the Mirai botnet’s attack on the French web host OVH, and the attack by the same botnet on Dyn (DynDNS), the American DNS provider. The websites of these providers’ customers were unavailable for several hours.

What the article really looks at is the appearance of IP addresses in the traces of DDoS attacks. Over a period of two years, the authors found the addresses of two million different victims, among the six million servers listed on the web.

Traffic jams on the information superhighway

Units of data, called packets, circulate on the internet. When all of these packets want to go to the same place or take the same path, congestion occurs, just like the traffic jams that occur at the end of a workday.

It should be noted that in most cases it is very difficult, almost impossible, to differentiate between normal traffic and denial-of-service attack traffic. Traffic generated by “flash crowd” and “slashdot effect” phenomena is identical to the traffic witnessed during this type of attack.

However, this analogy only goes so far: packets are often organized in flows, and network congestion can lead to packets being destroyed or new packets being created, producing even more congestion. It is therefore much harder to remedy a denial-of-service attack on the network than a traffic jam on the road.


Diagram of a denial-of-service attack. Everaldo Coelho and YellowIcon

 

This type of attack saturates the network link that connects the server to the internet. The attacker does this by sending a large number of packets to the targeted server. These packets can be sent directly if the attacker controls a large number of machines (a botnet).

Attackers also exploit the amplification mechanisms built into certain network protocols, such as the Domain Name System (DNS) and the Network Time Protocol (NTP) used for clock synchronization. These protocols are asymmetrical: the requests are small, but the responses can be huge.

In this type of attack, the attacker contacts the DNS or NTP amplifiers while pretending to be the server under attack. That server then receives large numbers of unsolicited replies. Therefore, even with limited connectivity, the attacker can generate a significant volume of traffic and saturate the victim’s network link.
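
To get a sense of the orders of magnitude involved, here is a minimal back-of-the-envelope sketch in Python. The uplink capacity and the amplification factor are illustrative assumptions, not figures from the article or the study.

```python
def amplified_traffic_mbps(attacker_uplink_mbps: float, amplification_factor: float) -> float:
    """Traffic (Mbit/s) arriving at the victim when every request byte sent by
    the attacker triggers `amplification_factor` response bytes from the
    abused DNS or NTP servers (the victim's address being spoofed)."""
    return attacker_uplink_mbps * amplification_factor

# Illustrative values only: a 100 Mbit/s uplink and a 50x response/request ratio.
uplink_mbps = 100
factor = 50
print(f"{amplified_traffic_mbps(uplink_mbps, factor):.0f} Mbit/s directed at the victim")
# -> 5000 Mbit/s, i.e. 5 Gbit/s, enough to saturate most servers' network links
```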

There are also “services” that offer the possibility of buying denial-of-service attacks of varying intensity and duration, as shown in an investigation Brian Krebs carried out after his own site was attacked.

What are the consequences?

For internet users, the main consequence is that the website they want to visit is unavailable.

For the victim of the attack, the main consequence is a loss of income, which can take several forms. For a commercial website, for example, this loss is due to a lack of orders during that period. For other websites, it can result from losing advertising revenue. This type of attack can also allow an attacker to display ads in place of another party’s and capture the revenue generated by displaying them.

There have been a few, rare institutional attacks. The most documented example is the attack against Estonia in 2007, which was attributed to the Russian government, although this has been impossible to prove.

Direct financial gain for the attacker is rare, however, and is mainly linked to ransom demands made in exchange for ending the attack.

Is it serious?

The impact an attack has on a service depends on how popular the service is. Users therefore experience a low-level attack as a nuisance if they need to use the service in question.

Only certain large-scale occurrences, the most recent being the Mirai botnet, have impacts that are perceived by a much larger audience.

Many servers and services are located in private environments, and therefore are not accessible from the outside. Enterprise servers, for example, are rarely affected by this kind of attack. The key factor for vulnerability therefore lies in the outsourcing of IT services, which can create a dependence on the network.

Finally, an attack with a very high impact would, first of all, be detected immediately (and therefore often blocked within a few hours), and would ultimately be limited by its own activity (since the attacker’s communications would also be blocked), as shown by the old example of the SQL Slammer worm.

Ultimately, the study shows that the phenomenon of saturation-based denial-of-service attacks has been recurrent over the past two years. This news is significant enough to demonstrate that the phenomenon must be addressed. Yet it is not a new occurrence.

Other phenomena, such as routing manipulation, have the same consequences for users, like when Pakistan Telecom hijacked YouTube addresses.

Good IT hygiene

Unfortunately, there is no surefire form of protection against these attacks. In the end, it comes down to an issue of cost of service and the amount of resources made available for legitimate users.

The “big” service providers have so many resources that it is difficult for an attacker to catch them off guard.

Still, this is not the end of the internet, far from it. However, this phenomenon is one that should be limited. For users, good IT hygiene practices should be followed to limit the risks of their computer being compromised, and hence used to participate in this type of attack.

It is also important to review what type of protection outsourced service suppliers have established, to ensure they have sufficient capacity and means of protection.

[divider style=”normal” top=”20″ bottom=”20″]

Hervé Debar, Head of Department Networks and Telecommunications services, Télécom SudParis – Institut Mines-Télécom, Université Paris-Saclay

The original version of this article (in French) was published on The Conversation.

 


Cyrating: a trusted third-party for cybersecurity assessment

Cyrating, a startup incubating at ParisTech Entrepreneurs, offers organizations a service for assessing their performance and efficiency in cybersecurity. By positioning itself as a trusted third-party, it is meeting companies’ need for an objective analysis of their cyber risk. The service also allows companies to assess their position relative to competitors.

 

In the cybersecurity sector, Cyrating intends to play a role that organizations often ask for but that, until now, has never been filled: that of a trusted third-party. The startup, which has been incubating at ParisTech Entrepreneurs since last September, offers to assess the cybersecurity performance of public and private companies. The rating they receive allows them to position themselves relative to their competitors, as well as define areas for improvement and determine the cybersecurity level of their subsidiaries and suppliers.

Regardless of the type of company, the startup bases its assessment on the same criteria. This results in objective ratings that are not dependent on the organization’s size or structure. “For example, we look at the level of protection for domain names, company websites, email services…” explains François Gratiolet, co-founder of Cyrating. He calls these criteria “facts” and they are supplemented by an analysis of “events” such as a data breach or the hosting of malware on the internal server.

Cyrating processes a set of observable data with the aim of uncovering these facts and events related to the organization’s cybersecurity. They are then measured against best practices in order to obtain a rating. Based on assessment algorithms, metrics and ratings are automatically calculated by category. The organizations evaluated by Cyrating therefore obtain a clear view of their efficiency in a variety of cybersecurity issues, in addition to the overall rating. This enables them to identify the measures they must immediately implement to improve their protection and optimize their allocation of financial and human resources.

Unlike auditing and consulting firms, Cyrating’s service does not require any intervention in the organizations’ departments or offices. There is no need to install any software or equipment. Furthermore, the service is based on a subscription system. The rating is ongoing throughout the entire subscription period. Therefore, as they track the changes in their rating, organizations can immediately observe the impact of their actions.

The startup is the first of its kind in Europe, and few startups are offering this type of service on a global level. “It’s a business that is booming in the United States,” says François Gratiolet. This early entry into the European market is a serious advantage for Cyrating, whose business relies on a powerful platform that can be scaled up: the longer the company has been assessing organizations, the more attractive its rating system becomes. The startup officially launched its business in Lille in January 2018, at the International Cybersecurity Forum (FIC)—the largest European trade show in the sector. Over the course of the startup’s development and the creation of its use cases—still very recent, since the startup is only a few months old—it has already assessed hundreds of companies. “A year from now we expect to have rated over 50,000 organizations,” the co-founder predicts.

The first businesses to be won over by Cyrating’s services were large and intermediate-sized companies. “They see the opportunity to measure the performance of their suppliers and subsidiaries, and optimize their audit cycles,” François Gratiolet explains. But insurance providers could also be interested in this service, as well as agencies that want to purchase data blocks for statistical purposes. By positioning itself as a trusted third-party, the startup could quickly become a key player in cybersecurity in France and Europe.


The brain: seeing between the fibers of white matter

The principle behind diffusion imaging and tractography is to explore how water diffuses through our brain in order to study the structure of neurons. Doctors can use this method to improve their understanding of brain disease. Pietro Gori, a researcher in image processing at Télécom ParisTech, has just launched a project called Neural Meta Tracts, funded by the Emergence program at DigiCosme. It aims to improve the modelling, visualization and manipulation of the large amounts of data produced by tractography. This may considerably improve the analysis of white matter in the brain, and in doing so, allow doctors to more easily pinpoint the morphological differences between healthy and sick patients.

 

What is the goal of the Neural Meta Tracts project?

Pietro Gori: The project stems from my past experience. I have worked in diffusion imaging, which is a non-invasive form of brain imaging, and tractography. This technique allows you to explore the architecture of the brain’s white matter, which is made up of bundles of several million neuron axons. Tractography allows us to represent these bundles in the form of curves in a 3D model of the brain. It is a very rich method which provides a great deal of information, but this information is difficult to visualize and make use of in digital calculations. Our goal with Neural Meta Tracts is to facilitate and accelerate the manipulation of these data.

Who can benefit from this type of improvement to tractography?  

PG: By making visualization easier, we are helping clinicians to interpret imaging results. This may help them to diagnose brain diseases more easily. Neurosurgeons can also gain from tractography in planning operations. If they are removing a tumor, they want to be sure that they do not cut fibers in the critical areas of the brain. The more precise the image is, the better prepared they can be. As for improvements to data manipulation and calculation, neurologists and radiologists doing research on the brain are highly interested. As they are dealing with large amounts of data, it can take time to compare sets of tractographies, for example when studying the impact of a particular structure on a particular disease.

Could this help us to understand certain diseases?

PG: Yes. In psychiatry and neurology, medical researchers want to compare healthy people with sick people. This enables them to study differences which may either be the consequence or the cause of the disease. In the case of Alzheimer’s, certain parts of the brain are atrophied. Improving mathematical modeling and visualization of tractography data can therefore help medical researchers to detect these anatomical changes in the brain. During my thesis, I also worked on Tourette syndrome. Through my work, we were able to highlight anatomical differences between healthy and sick subjects.

How do you improve the visualization and manipulation of tractography data?

PG: I am working with Jean-Marc Thiery and other lecturers and researchers at Télécom ParisTech and the École Polytechnique on applying differential geometry techniques. We analyze the geometry of bundles of neuron axons, and we try to approximate them as closely as possible without losing information. We are working on algorithms which will be able to rapidly compare two sets of tractography data. When we have similar sets of data, we try to aggregate them, again trying not to lose information. It is important to realize that if you have a database of a cohort of one thousand patients, it can take days of calculation using very powerful computers to compare their tractographies in order to find averages or main variations.
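
As a rough illustration of what comparing two fibers can involve, the sketch below computes a symmetrized mean closest-point distance between two streamlines stored as arrays of 3D points. This is a generic distance commonly used for curves, shown here only as an example; it is not the specific algorithm being developed in Neural Meta Tracts.

```python
import numpy as np

def mean_closest_point_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetrized mean closest-point distance between two streamlines.

    `a` and `b` are (n, 3) and (m, 3) arrays of points sampled along two
    fibers: for each point of one curve, take the distance to the nearest
    point of the other curve, then average over both directions.
    """
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # (n, m) pairwise distances
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Toy example: two nearly parallel fibers sampled with 50 points each
t = np.linspace(0.0, 1.0, 50)
fiber1 = np.stack([t, np.sin(t), np.zeros_like(t)], axis=1)
fiber2 = fiber1 + np.array([0.0, 0.0, 0.1])     # same curve, shifted along z
print(mean_closest_point_distance(fiber1, fiber2))  # ~0.1
```

Scaling this kind of pairwise comparison to the millions of streamlines produced for each subject is precisely where the aggregation and approximation work described above comes in.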

Who are you collaborating with on this project to obtain the tractography data and study the needs of practitioners?

PG: We use a high-quality, freely accessible database of healthy individuals called the Human Connectome Project. We also collaborate with clinicians at the Pitié-Salpêtrière, Sainte-Anne and Kremlin-Bicêtre hospitals in the Paris region. These are radiologists, neurologists and neurosurgeons. They provide their experience of the issues with which they are faced. We are initially focusing on three applications: Tourette syndrome, multiple sclerosis, and surgery on patients with tumors.



What are fine particles?

During peak pollution events, everyone is talking about them. Fine particles are often accused of being toxic. Unfortunately, they do not only come out during episodes of high pollution. Véronique Riffault, a researcher in atmospheric sciences at IMT Lille Douai, revisits the basics of fine particles to better understand what they are all about.

 

What does a fine particle look like?

Véronique Riffault: They are often described as spherical in shape, partly because scientists speak of diameter to describe their size. In reality, they come in a variety of shapes. When they are solid, they can indeed sometimes be spherical, but also cubic, or even made up of aggregates of smaller particles of different shapes. Some small fibers are also fine particles. This is the case with asbestos and nanotubes. Fine particles may also be liquid or semi-liquid. This happens when their chemical nature gives them a soluble character; they then dissolve when they meet droplets of water in the atmosphere.

How are they created?

VR: The sources of fine particles are highly varied, and depend on the location and the season. They may be generated directly by human processes, which are generally linked to combustion activities. This is true of residential heating using wood burning, road traffic, industry, etc. There are also natural sources: sea salt in the oceans or mineral dust in deserts, but these particles are usually bigger. Indirectly, they are also created by condensation of gases or by oxidation when atmospheric reactions make volatile organic compounds heavier. These “secondary” emissions are highly dependent on environmental conditions such as sunshine, temperature, etc.

Why do we hear about different sizes, and where does the term “PM” come from?

VR: Depending on their size, fine particles have different levels of toxicity. The smaller they are, the deeper they penetrate the respiratory system. Above 2.5 microns [1 micron = 1 thousandth of a millimeter], the particles are stopped quite effectively by the nose and throat. Below this, they go into the lungs. The finest particles even get into the pulmonary alveoli and into the bloodstream. In order to categorize them, and to establish the resulting regulations, we distinguish fine particles by specific names: PM10, PM2.5, etc. The figure refers to the upper size limit in microns, and “PM” stands for Particulate Matter.
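
As a small illustration of this naming convention (and of the fact that the classes are nested, so a PM2.5 particle is also counted as PM10), here is a minimal sketch:

```python
def pm_classes(diameter_um: float) -> list:
    """Return the particulate-matter classes a particle belongs to,
    given its aerodynamic diameter in microns (classes are nested)."""
    upper_limits = {"PM10": 10.0, "PM2.5": 2.5}
    return [name for name, limit in upper_limits.items() if diameter_um <= limit]

print(pm_classes(8.0))   # ['PM10']            -> largely stopped by the nose and throat
print(pm_classes(2.0))   # ['PM10', 'PM2.5']   -> reaches the lungs; the finest particles
                         #                        can even reach the alveoli
```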

How can we protect ourselves from fine particles?

VR: One option is to wear a mask, but its effectiveness depends greatly on the way in which it is worn. When badly positioned, masks are useless. A mask can also give the wearer a sense of security during peak pollution events. The risk is that they feel protected and carry on doing sport, for example. This leads them to hyperventilate, which increases their exposure to fine particles. The simplest measure would be to not produce fine particles in the first place. Measures to reduce traffic can be effective, provided that more than a small fraction of vehicles are taken off the road. Authorities can take measures to restrict agricultural spreading: fertilizer produces ammonia, which combines with nitrogen oxides to create ammonium nitrates, which are fine particles. People also need to be made aware that they should not burn green waste, such as dead leaves and branches, in their gardens, but should instead take it to recycling centers, and that they should reduce their use of wood-fired heating during peak pollution events.

Also read on I’MTech: Particulate matter pollution peaks: detection and prevention

Are fine particles dangerous outside of peak pollution events?

VR: Even outside of peak pollution events, there are more particles than there should be. The only European regulation on a daily basis is for PM10 particles. For PM2.5 particles, the limit is annual: less than 20 micrograms per cubic meter on average. This poses two problems. The World Health Organization (WHO) recommends a threshold of 10 micrograms per cubic meter. This amount is regularly exceeded at several sites in France. The only thing helping us is that we are lucky to have an oceanic climate which brings rain. Precipitation removes the particles from the atmosphere. On average over a year, we remain below the limit, but on a daily basis we could be breathing in dangerous amounts of fine particles.



Bitcoin: the economic issues at stake

Patrick Waelbroeck, Institut Mines-Télécom (IMT)

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]C[/dropcap]ryptocurrencies like Bitcoin only have value if all the participants in the monetary system view them as currency. A cryptocurrency must therefore be rare, in the sense that it must not be easily copied (a problem equivalent to counterfeit banknotes for traditional currencies).

This is a requirement that is met by the Bitcoin network, which ensures no double-spending occurs. In addition to the value linked to the acceptance of the currency, Bitcoin owes its value to a variety of economic mechanisms linked to the analysis of Bitcoin supply and demand.

Bitcoin supply

The issuance of currency in the primary market

The creation of Bitcoins is determined by the mining process. Each block that is mined generates Bitcoins. Their design stipulates that the amount per mined block be divided by 2 every 210,000 blocks, to obtain a total amount of Bitcoins in circulation (excluding those that are lost) of 21 million. This monetary rule is monitored by the Bitcoin Foundation consortium, as we will discuss later in this article. The monetary rule can therefore be modified to respond to fluctuating market conditions, which can result in a hard fork.
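
The 21-million cap follows directly from this halving rule. Assuming the initial reward of 50 Bitcoins per block (a widely known figure that the article itself does not state), the geometric series of rewards can be summed in a few lines:

```python
def total_bitcoin_supply(initial_reward=50.0, blocks_per_halving=210_000):
    """Sum the block rewards over successive halving periods until the
    reward becomes negligible; the total converges to 21 million."""
    total, reward = 0.0, initial_reward
    while reward > 1e-9:                  # stop once the reward is effectively zero
        total += reward * blocks_per_halving
        reward /= 2                       # reward halves every 210,000 blocks
    return total

print(f"{total_bitcoin_supply():,.0f} BTC")   # ≈ 21,000,000
```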

Electricity is the main component (over 90% according to current estimates) of a mining farm’s total costs. Böhme et al. (2015) assessed the Bitcoin network’s consumption at over 173 megawatts of electricity on a continuous basis. This represented approximately 20% of a nuclear power plant’s production and amounted to 178 million dollars per year (based on residential electricity prices in the United States). This amount may seem high, but Pierre Noizat considers that it is no more than the annual electricity cost of the global network of ATMs (automatic teller machines), estimated at 400 megawatts. Once we factor in the costs involved in manufacturing and putting currency and bank cards into circulation, we see that the Bitcoin network’s electricity cost is not as high as it seems.
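
The 178-million-dollar figure can be checked with simple arithmetic: 173 MW drawn continuously for a year, multiplied by a residential electricity price of roughly $0.12 per kWh (the exact price used by Böhme et al. is not given in the article, so this value is an assumption):

```python
# Back-of-the-envelope check of the Bitcoin network's annual electricity bill.
power_mw = 173                        # continuous draw reported by Böhme et al. (2015)
hours_per_year = 24 * 365             # 8,760 hours
price_per_kwh = 0.12                  # assumed US residential price, $/kWh

energy_kwh = power_mw * 1_000 * hours_per_year       # MW -> kW, times hours
annual_cost_musd = energy_kwh * price_per_kwh / 1e6
print(f"≈ {annual_cost_musd:.0f} million dollars per year")   # ≈ 182, close to the $178M quoted
```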

However, this cost may significantly increase as the network continues to develop, due to a negative externality inherent in mining: each miner who invests in new hardware increases his or her marginal revenue, but at the same time increases the overall mining cost, since the difficulty increases with the number of miners and their computation capacity (hash power).

The quest for bitcoin. xlowmiller/VisualHunt

Therefore, for the Bitcoin network, the difficulty of the cryptographic problem that must be solved and validated by the proof-of-work consensus increases along with the network’s overall hash power. There is therefore a risk of over-investment in mining capacity, since individual miners do not consider the negative effect on the entire network.

It is important to note that increasing the mining difficulty reduces mining incentives and increases the verification time, thereby reducing the efficiency of the blockchain itself. This mechanism brings to mind the tragedy of the commons, in which shared resources (here, hash power) are depleted and end up maintained by only a handful of farms and pools, thereby nullifying the very principle of the public blockchain, which is decentralized.

There is therefore a risk that mining capacities will become greatly concentrated in the hands of a small group of players, thus invalidating the very principle of the blockchain. This trend is already visible today.

In the end, the supply of Bitcoins, and therefore the monetary creation on the primary market, depend on the cost of electricity and the difficulty associated with the mining process, as well as on the governance rules setting the reward in Bitcoins generated by a mined block.

Bitcoin’s value on the secondary market

Bitcoins can also be bought and sold on exchange platforms. In this case, Bitcoin’s value behaves like that of a financial investment, with financial players anticipating the prospect of gains and the factors that could cause Bitcoin to appreciate.

Bitcoin demand

The demand for cryptocurrency depends on several user concerns that are addressed below, starting with the positive factors and ending with the risks.

Financial privacy

Bitcoin accepted here. jurvetson on Visual Hunt

Governments are increasingly limiting the use of cash to demonstrate their efforts to counter money-laundering and the development of black markets. Cash is the only means of payment that is 100% anonymous. Bitcoin and other cryptocurrencies come in second, since the pseudonymous system used by Bitcoin effectively conceals the identity of the individuals making the transactions. Furthermore, other cryptocurrencies, such as Zcash, go a step further, masking all the metadata linked to a transaction.

Why do people want to use an anonymous payment method? For several reasons.

First of all, this type of payment method prevents users from leaving any traces that could be used for monitoring purposes by the government, employers, and certain companies (especially banks and insurance companies). Companies and banks use price discrimination practices that can sometimes work against consumers. Payment traces can also lead companies to push customers toward new commercial offers and to engage in targeted advertising, which some see as a nuisance.

Secondly, paying with an anonymous payment method limits “sousveillance” (or inverse surveillance) by close friends and family, for example when a payment is made using a joint account.

Thirdly, making payment under a pseudonym makes it possible to maintain business confidentiality.

Fourthly, just like privacy protection in general, anonymity in certain transactions (for example for healthcare products or hospital visits) helps build trust in society, and therefore has economic value. By enabling pseudonymity, Bitcoin thus brings added value in these various instances.

Bitcoin works in times of crisis, thus avoiding capital controls

Bitcoin emerged right after the financial crisis of 2008. This period demonstrated the power of governments and central banks to control cash withdrawals and outstanding capital stock. There are very few means available for avoiding these two institutional constraints. Bitcoin is one such means. Even if cash withdrawals are prohibited, Bitcoin owners can still pay using their private key.

Bitcoin imposes discipline on governments

Bitcoin (and the same is true for other cryptocurrencies) can be considered a monetary alternative that is not controlled by a central bank. Some economists, like F. Hayek, see these alternative currencies that compete with the official currency as a means of imposing discipline on governments that might be tempted to use inflation to finance their debt. If that were to happen, consumers and investors would no longer use the official currency, and would instead purchase the alternative currency, creating deflationary pressure on the official currency.

Security-related network externalities

The level of security increases with the number of network nodes, since each node increases the computation power required to create a breach in the blockchain’s security (through a 51% attack, double-spending, or a denial of service – DoS). Furthermore, a DoS attack is especially hard to stage, since it is so difficult to determine who the target should be. Positive network externalities therefore exist: Bitcoin’s value increases with the number of nodes participating in the network.

Indirect network externalities related to payment method

Bitcoin is a payment method, just like cash, debit cards and Visa/Mastercard/American Express cards. Bitcoin can therefore be understood using the multi-sided market theory, which models situations where two groups of economic players benefit from positive crossed externalities. The consumer who chooses a payment method for a purchase is happy when it is accepted by the merchant. In the same way, merchants are eager to accept a payment method that customers possess. Consequently, the dynamics of multi-sided markets result in virtuous cycles that can experience a slow inception phase, followed by a very fast deployment phase. If Bitcoin were to experience this type of phase, its value would enter a period of acceleration.

A Bitcoin bubble? duncan on Visual Hunt, CC BY-NC

The risks

Among the factors that reduce the demand for Bitcoins, the most prominent are the risks related to rules and regulations. On the one hand, a State could order that the capital gains generated from buying and selling Bitcoins be declared. On the other hand, Bitcoins can be used in regulated sectors (like the insurance and bank sectors) and their use could therefore be regulated as well. Finally, there is always the risk of losing the data on the hard drive where the private key is stored, resulting in the loss of the associated Bitcoins, or a State could force access to private keys for security reasons.

However, the greatest risk involves the governance of the Bitcoin network.

In the event of a disagreement on how the communication protocol should develop, there is a risk that the network could split into several networks (hard fork) with currencies that would be incompatible with each other. The most important issue involves the choice of the consensus rule for validating new blocks. A consensus must be reached on this consensus, which the technology itself appears unable to provide.

Conclusion

Bitcoin’s economic value depends on many positive economic factors that could propel the cryptocurrency into a period of sustained growth, which would justify the current surge in its prices on the exchange markets. However, the risks related to the network’s governance must not be overlooked, since trust in this new currency depends on it.

Patrick Waelbroeck, Professor of Economics at Télécom ParisTech, Institut Mines-Télécom (IMT)

The original version of this article (in French) was published on The Conversation.



A third ERC grant in 3 years at EURECOM

Getting a grant from the European Research Council is not an easy task but this is what Davide Balzarotti, Professor in the Security Department, has just accomplished. He is the third EURECOM professor to obtain an ERC grant in the past 3 years.

 


Davide, you just got an ERC Consolidator grant, one of the most prestigious research grants in Europe. What is your feeling today?

Everybody knows it is one of the most selective grants in Europe, so I’m obviously very proud of that. It is definitely a major step in my career. It is an important recognition for the efforts I have made to get this grant and for the relevance of the project I presented. Plus, I was told there are only 329 researchers across Europe – and 38 researchers in France – who got this grant this year, so I am particularly honoured to be one of them. I am also very happy for EURECOM since it has been awarded one ERC grant every year for the past 3 years… Considering there are only 24 professors, it is a real success!

 

Will this grant change your day-to-day life as a researcher at EURECOM?

I am sure it will! In different ways, even. First, I won’t have to worry about getting money for the next few years. The Consolidator grant is a five-year grant that represents €2 million. This grant is not only generous, it also offers recognition and visibility. In fact, the two other ERC grantees at EURECOM – David Gesbert & Petros Elia – explained to me that I will certainly be more solicited by the research community. It will also give me a lot of independence and creative freedom to conduct the project for which I got this grant: BITCRUMBS – Towards a Reliable and Automated Analysis of Compromised Systems. I will dedicate 70% of my time to the project, but I can manage it the way I want depending on the people I will work with. I actually need to hire a team of seven researchers – five Ph.D. students and two post-docs – and one engineer. On top of that, I will be involved in the EURECOM ERC committee that helps scientists benefit from the experience of those who have already received such grants. This committee actually helped me a lot in writing my proposal, so I look forward to helping my colleagues in return.

 

BITCRUMBS seems to be a ground-breaking project in the computer security area. Could you explain its main objective?

BITCRUMBS is actually a brand new way of addressing computer security issues. And this ERC grant will help me pursue very ambitious research objectives with this project, which covers a wide range of digital security areas. I hope our results will change the way digital security will be managed in the future. The main objective of BITCRUMBS is to rethink what we call the “incident response” (IR). It is clear that research on prevention and detection helps make devices more secure, but since a 100% secure system does not exist, improving IR can be very useful too. Incident response addresses the aftermath of a digital security breach that, if not handled properly, can lead to a data breach or a system collapse. We all know the risk of security breaches is now higher than ever. Attackers frequently break into corporate networks, government services and even critical infrastructures. Almost half of computers worldwide are infected by malware. A voting machine can be altered to rig the results of an election, a connected car can be hacked to drive off a cliff, or a security camera can be controlled over the Internet to spy on our houses and our families. The problem is that we do not have the tools to analyze these attacks and understand their causes! This has to change.

With BITCRUMBS, I want to give investigators the possibility to quickly verify the state of compromised systems and help citizens trust the result of computer forensic investigations. In the future, I believe we should design digital systems the way we design airplanes – secure against crashes but also equipped with black boxes to collect all the data required to support an incident investigation.

 

What is your strategy to reach this objective?

I want to propose a more scientific and comprehensive methodology to analyse compromised systems. This should be done in three steps. The first part of the project will focus on measuring the effectiveness and accuracy of the techniques currently used to analyse compromised systems, and on assessing the reliability of their data sources. This will help increase the theoretical and scientific foundations of IR techniques. The second part of the project will focus on the design and implementation of new automated analysis techniques able to cope with advanced threats and the analysis of IoT devices. These techniques will have to be robust, scalable and generic – capable of working on different classes of devices. Of course, results given by these new techniques will need to be reliable and based on a solid theoretical foundation. The last step will introduce a new forensics analysis by design methodology. My goal is to provide a set of guidelines for the design of future systems and software – to help developers provide the required information to support the analysis of compromised systems.

 

What about the scientific and technological impacts?

I hope research conducted in BITCRUMBS will have a long-lasting impact – not only scientific – on the area of incident response and on the way we analyze compromised systems. First, BITCRUMBS will bring a scientific foundation to IR, based on repeatable experiments and precise measurements of the reliability of data and techniques used in current investigations. It will also have a practical impact since it will produce open source tools and improve existing software that is regularly used by companies and law enforcement to deal with computer attacks. Last but not least, BITCRUMBS will have an impact on our society. Improving the IR process will increase the trust that citizens have in the results of digital investigations. In order to clearly show the impact of BITCRUMBS in different fields and scenarios, we will address our objectives using real case studies borrowed from traditional computer software and embedded systems.

 

What are the main challenges you will be facing in BITCRUMBS?

Like any very broad project, BITCRUMBS’ success depends on a lot of factors. From a scientific point of view, it mainly depends on the combination of very different research skills, including memory forensics, embedded systems security, malware and binary analysis, distributed systems, and operating system design and defenses. I have considerable experience in each of these research areas, but in order to reduce the risks, I have already secured key collaborations with leading universities and security companies so I can find research partners from different areas to work with. The other potential risk is the possible failure to develop some of the techniques I have envisioned. It is actually a very common risk in research projects that introduce novel solutions. For this reason, for each disruptive approach I would like to develop, I have also thought of less risky techniques for which I have experience, and I have already conducted some investigation to evaluate the feasibility of a few ideas. But above all, one of the main challenges will be to find motivated postdocs in digital security willing to work in Europe. Most PhD students go to the US for their postdoc or are hired by security companies offering good conditions and interesting opportunities. I hope BITCRUMBS’ challenges and potential results can attract some of them.

[divider style=”dotted” top=”15″ bottom=”15″]

The original version of this article was published on the EURECOM website.

[divider style=”dotted” top=”15″ bottom=”15″]



DessIA: Engineering of the Future with Artificial Intelligence

What is the best architecture for the gearbox of a hybrid car? If an engineer had to answer that question, he would consider a handful of possibilities based on what already exists on the market. But the startup DessIA takes a whole different approach. Its artificial intelligence algorithms enable it to consider billions of different architectures to find the optimum configuration. The software developed by the young company digitally builds all the possible structures using the necessary components. The performance and the feasibility of the architectures built this way are then assessed, and the design space is intelligently explored to reduce the number of architectures that are physically tested. The automated, smart sorting keeps only the best architectures. In addition to the possibility of analyzing considerably more models than a human could, DessIA’s advantage is that the layouts created with its components are radically different from what already exists. “When we present our approaches to manufacturers, many of them say this is exactly the way they want to work, but they have no idea where to start,” say Pierre-Emmanuel Dumouchel and Steven Masfaraud, co-founders of the startup incubated at ParisTech Entrepreneurs.
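
The generate-assess-filter loop described above can be pictured with the toy sketch below. The component names, the feasibility rule and the cost function are all invented for illustration; DessIA’s actual algorithms and models are, of course, far more elaborate.

```python
from itertools import permutations

# Hypothetical component catalogue and placeholder costs (illustration only).
COMPONENTS = ["clutch", "planetary_gear", "spur_gear", "differential"]
COST = {"clutch": 3.0, "planetary_gear": 5.0, "spur_gear": 2.0, "differential": 4.0}

def feasible(architecture):
    """Toy feasibility rule: the differential, if present, must come last."""
    return "differential" not in architecture or architecture[-1] == "differential"

def score(architecture):
    """Toy objective: fewer and cheaper components are better (lower score wins)."""
    return sum(COST[c] for c in architecture) + 0.5 * len(architecture)

# Exhaustively build every ordering of 2 to 4 components, discard infeasible
# candidates, then keep only the best few for detailed evaluation.
candidates = [
    arch
    for n in range(2, len(COMPONENTS) + 1)
    for arch in permutations(COMPONENTS, n)
    if feasible(arch)
]
for arch in sorted(candidates, key=score)[:5]:
    print(f"{score(arch):5.1f}  {' -> '.join(arch)}")
```

With four components the search space is tiny; with real catalogues and free parameters it explodes combinatorially, which is why the intelligent exploration and automated sorting mentioned above are essential.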

For now, DessIA specializes in subjects related to the transmission of mechanical power. It can work both on gearboxes for cars and on systems for transferring energy between a helicopter’s turbines and blades. The field itself is vast, and reflects the experience of its two founders, former employees of PSA. The issues can even include the mechatronic systems of complex electrically motorized mechanisms. The startup’s applications are limited to this subject because the algorithms’ work must be guided by a thorough knowledge of the sector. Still, the two founders are not ruling out the possibility of someday moving towards providing assistance in the design of electrical or hydraulic systems. But not until a few years from now.

By remaining focused on mechanical systems, the young company has opened up many opportunities. DessIA’s objective is to go beyond the mere optimization of architectures. Once the best structure has been determined, the ideal solution would be a very simple way of obtaining a 2D industrial drawing, or even the 3D CAD model to integrate directly into computer-aided design software. The two founders intend to achieve this outcome by the end of 2018. If they succeed, they could redefine how mechanical systems are designed at the industrial level, from the initial design phases to the drawing of the part.

 

[divider style=”normal” top=”20″ bottom=”20″]

Pierre-Emmanuel Dumouchel worked at PSA for 10 years. After he supervised Steven Masfaraud’s thesis for three years, the two decided to partner to create DessIA. They aim to simplify the design process for engineers through a breakthrough approach based on artificial intelligence.

[divider style=”normal” top=”20″ bottom=”20″]


Advancing scientific, technical and industrial literacy, a challenge for society – An interview with Guillaume Desbrosse

What is the role of a center for scientific literacy and what sort of work does its director do? What are the goals of the AMCSTI (Association of Museums and Centers for the Advancement of Scientific, Technical and Industrial Culture)? Guillaume Desbrosse, the Director of CCSTI La Rotonde Mines Saint-Étienne, who has recently been appointed president of the AMCSTI, explains the challenges and goals of organizations dedicated to scientific, technical and industrial literacy.

 

What are the goals of a center for scientific culture such as La Rotonde, and as the director of the center, what does your work involve?

La Rotonde’s goal is to share scientific, technical and industrial knowledge with the widest possible audience through different media, in order to respond to a challenge facing society: creating conditions for bringing people into contact with science. La Rotonde is a venue for an annual program of cultural events for the general public including exhibitions, theater, debates, workshops and conferences. At the same time, we develop activities “outside our walls” which allow us to reach out to populations who may not consider visiting a center for scientific culture: events, science education projects for schools, projects carried out with the city of Saint-Étienne’s cultural partners and many more. We are also a center for producing scientific literacy resources: exhibitions, children’s books, comic books, films etc. We use all forms of media to talk about science!

At La Rotonde I lead a team of nine people and we all work together to create and carry out projects. I’m in charge of coordinating the team’s work, responding to calls for projects, creating partnerships and collaborations with other organizations and developing strategies to expand the reach of La Rotonde and Mines St-Étienne. We’ve developed many different projects including Mondo Minot, an exhibition intended for very young visitors which gives children aged 2-7 the opportunity to discover science. It has been brought to life three times with a different version each time and has been presented in a number of venues in France. It is currently on display at Quai des Savoirs in Toulouse. We have also developed the Escapad project, an immersive children’s book which is associated with a tablet through an innovative, patented device. Esca’Pad takes 6-9-year-olds on a journey to explore the island of Madagascar, where we have partnerships with several research libraries and cultural sites. The project is produced through collaboration between La Rotonde, Avant-Goût Studio, and Universcience, which groups together La Cité des Sciences and Le Palais de la Découverte.

Read more on I’MTech: At La Rotonde, scientific mediation is based on experiments

How does the AMCSTI work?

It is an association which groups together 194 professional organization members, including very large organizations such as the Cité des Sciences et de l’Industrie or the Museum of Natural History as well as smaller associations throughout France.

Its purpose is to bring together different players around projects and themes so that professionals can meet one another, engage in dialogue and increase their expertise. It also works on strategic positioning on a national level and interactions with different ministries: it is necessary to establish contacts with the world of culture, education, tourism, innovation, industry etc. Scientific literacy leads to a cross-disciplinary approach. The AMCSTI fosters dialogue between the professionals who work in the field and carry out projects and the political representatives or local authorities who support these scientific literacy projects.

 

Can you tell us about your current or upcoming projects at AMSCTI?

Soon we will be launching new projects and establishing a strategic plan for the next three years. We’re working on developing themes: last year’s proposed area of focus was “Culture, beliefs and sciences.” The goal is to help citizens learn to think critically in order to face rising obscurantism and to steer the debate in the right direction by showing the differences between beliefs and knowledge. We are thinking about subjects involving innovation, to show how our association can be a source of collaboration in this field. We are closely linked with the world of business, industry and startups. In the coming months, we will also launch themes dedicated to healthcare. A working group is now studying vaccines, which is a controversial subject. Our fellow citizens’ growing resistance to science can be explained by a lack of dialogue or popularization programs and the pervasiveness of fake news and a certain vagueness in discourse. Our job is therefore to bring together the different players involved in the association to work on specific actions and themes. We strive to provide innovative, creative, high-level popularization programs for the general public in order to help people become more curious, learn to think critically and form their own opinions about the world of today and tomorrow.

[box type=”shadow” align=”” class=”” width=””]

AMCSTI is the national network of scientific, technical and industrial culture (CSTI) professionals. Museums, research organizations, zoos, universities, foundations, etc.: a wide range of players is represented within the association. AMCSTI supports its members and brings them together to foster the advancement and recognition of scientific, technical and industrial culture.

Find out more about AMCSTI

[/box]


Between France and Germany, studying the metamorphosis of industry beyond borders

The “Industry Without Borders” project, launched on November 15, 2017 as part of the German-French Academy for the Industry of the Future, brings together French and German scientists. The project involves sociologists, economists, management and strategy researchers, and aims to study the transformation of contemporary industries. Madeleine Besson, the project leader at IMT, provides a more detailed look at the advantages of this collaboration with the Technical University of Munich (Technische Universität München, TUM).

 

This project is part of the German-French Academy for the Industry of the Future. Could you remind us what this academy hopes to achieve?

Madeleine Besson: On October 27, 2015, under the partnership between the French Alliance for Industry of the Future and the German platform Industrie 4.0, IMT and the Technical University of Munich (TUM) created the German-French Academy for the Industry of the Future. This academy involves new training content on digital transformation, summer schools and, to a large extent, joint research projects. This content is centered both on very technical subjects, such as hybrid blockchain architectures for the industry of the future, and on the human and organizational aspects of digitalization.

In this context, what is the goal of the “Industry without borders” project?

MB: The industry of the future is defined by networks and links between a variety of heterogeneous elements. Production technology, organizations, innovative activities and products, to name just a few, will be closely linked with one another. As a result, data, information and knowledge will be exchanged among all these elements. Some of these exchanges will occur between people, or will at least be controlled by them. But in other cases, they will occur between cyber-physical systems without any human intervention or direct control. Organizational borders will no longer limit the exchange and flow of the resulting information. This is precisely why the “Industry Without Borders” project is trying to understand how digital transformation in industry is changing organizational borders.

What areas of expertise do IMT and TUM provide in the study of organizational transformation?

MB: I coordinate the French team composed of IMT researchers over three sites: Télécom École de Management (TEM) in the Paris region, Mines Saint-Étienne and IMT Atlantique in Brittany. We have brought together experts in management and marketing at TEM, as well as strategy at Mines Saint-Étienne and economy at IMT Atlantique. The researchers are all highly dedicated to understanding the impact of digitalization in their respective fields.

As for TUM, and in particular the MCTS (technology and society research center at the Technological University of Munich), Uli Meyer’s team is specialized in the field of sociology. This covers sociology of work, sociology of innovation, and sociology of sciences and techniques.

What do you gain from collaborating with a technical university in Germany?

MB: Whether for the teams at TEM or the researchers at Mines Saint-Étienne and IMT Atlantique, we have a tradition of applying management research to technology. This is also true of the researchers at MCTS. This collaboration allows us to study an important subject with a strong team of people dedicated to it. We can combine complementary methods and explore the issue from both sides of the Rhine.

From a methodological point of view, the MCTS is specialized in company ethnography methods, and will provide us with training in this area. In exchange, we have expertise in econometrics which we can share with them. In order to create a true Franco-German research unit, we are currently looking for established companies on both sides of the Rhine to participate in the research. Again, collaboration between the teams is invaluable here.

What scientific production is anticipated from the project?

MB: There are different sorts of outcomes anticipated from this project. Of course, we expect to produce initial results on redefining the borders of organizations undergoing digitalization. We hope to share these results at conferences and in scientific journals. A complementary objective is to use the elements we collect to create teaching materials. Finally, our objective with TUM in this first research phase is to prepare an ambitious Franco-German project, for which we will seek funding from French and German research agencies. To increase the credibility of this future plan, one result in itself will be the team’s ability to develop a joint methodology and coordinate data collection in industrial organizations based in the two countries.

[box type=”note” align=”” class=”” width=””]

A word from Judith Igelsböck,
researcher at the Technical University of Munich (TUM)

Here at the MCTS, we are privileged to be able to discover different epistemic cultures and create new areas of knowledge with the researchers at IMT. They have vast experience in the study of industry and innovation. With their expertise, we will be able to enrich our exploration into the way industrial boundaries are changing, disappearing and being rebuilt to adapt to the requirements of open innovation, new digital technologies, and new ways of organizing and distributing work. This will provide a unique working environment for interdisciplinary and trans-national study of today’s industries.

[/box]

 

 


Composite Materials: the race to keep going faster

In the world of materials, composites are currently undergoing a transformation that is just as significant as the plastics boom of the 1960s. To integrate these materials on a massive scale into high-volume production markets—automotive, aeronautics, rail, etc.—manufacturers must further reduce the time it takes to mold the parts. This complex goal is attainable by redesigning the materials’ composition and architecture, adapting the manufacturing processes used, and relying on new digital simulation tools.

 

“Fifteen years ago, we were happy when we could produce a complex part made of structural composites in five minutes,” recalls Patricia Krawczak, a materials researcher at IMT Lille Douai. In 2017, the requirements are more demanding. Many sectors are interested in the mechanical properties that composites have to offer, including the transportation industry. These new materials—which are more resistant, lighter, and more durable—pave the way for breakthrough products, and offer new economic opportunities. It is therefore necessary to be able to integrate them massively into markets with high production volumes, such as the automotive industry. According to the Industry of the Future Alliance, the greatest industrial challenges currently facing composite materials are the reduction of molding cycle and manufacturing times, and the development of “high-speed” processes. For automotive parts, the target is set at around one minute.

Also read on I’MTech: What is a composite material?

At the request of manufacturers, scientists are working to meet this challenge. To do so, the plastic resins used as matrices to impregnate fiber reinforcements have already been modified for some time now. Researchers have switched from thermosetting resins—which harden when heated—to thermoplastic resins, which melt with heat and harden as the part returns to room temperature. “Cure times for thermosetting resins are fairly long, even though the suppliers of these matrices are making progress and working to reduce them,” explains Patricia Krawczak. Thermoplastic resins, by contrast, can be shaped quickly, for example by using a hot-pressing process, stamping, or injection technology on a fiber preform, without the need for any further curing. “In this respect, these resins can help reduce the cycle time,” the researcher explains.

 

A comprehensive “materials/process/products” approach

However, modifying the nature of the resin isn’t all that is required to reduce manufacturing time. In order for parts to feature high mechanical performance properties, defects, such as air bubbles trapped in the material, must be avoided at all costs. It is therefore necessary to ensure that the thermoplastic resin completely penetrates between the reinforcement fibers once it has become liquid through increased temperature. Yet these resins are known for being more viscous than thermosetting resins. Scientists must reduce the viscosity of these resins to gain a competitive edge and take advantage of the thermoplastics’ capacity to be shaped quickly. “We are working with chemists to develop polymers that retain the same properties once they become solid, while being more fluid at the processing temperature,” Patricia Krawczak explains.

Researchers in materials science can also adapt the architecture of the reinforcements. Within composite materials are fibers that form strands composed of filaments. They can be assembled–woven, braided or sewn–in different ways. “The way the reinforcement is formed affects the properties of the final composite material, as well as the permeability of the fiber preform, which therefore influences how the resin flows into the reinforcement,” Patricia Krawczak notes. In order to find the best structure, with the right balance between mechanical properties and sufficient permeability, the researcher’s team is also working with fiber reinforcement designers.

Another available means of action lies in the manufacturing process itself, particularly by adapting it to the specific characteristics of the materials that make up the composite itself: the polymer matrix and the reinforcement fiber. “Our team has developed fast hybrid processes that integrate several steps into a single molding operation, for example by combining resin transfer molding and compression molding, or thermoforming/stamping of local composite inserts and overmolding,” Patricia Krawczak explains. In addition to the shortened manufacturing cycle, this mix of processes significantly reduces the number of basic components that must be assembled to produce a complex part. This is a significant benefit for the plastic parts manufacturers that partner with IMT Lille Douai.

 

Optimization platform for new composite processes (POPCOM). Photo: IMT Lille Douai.

 

Digital technology helps identify optimal materials-process pairings

In practice, composite materials offer a very impressive range of “matrix polymer / fiber reinforcement / manufacturing process” combinations, which has been further increased by recent innovations from producers—chemists and textile manufacturers—and processors. To speed up the design of industrial parts, researchers develop virtual engineering chains. Using a technological platform equipped with prototype tools and demonstrators that represent industrial manufacturing processes, they analyze, identify and model impregnation mechanisms. They thereby complement and improve the numerical simulation tools used for manufacturing.

“For example, a few years ago we worked with a highly reactive resin that had a gel time of one minute,” says Patricia Krawczak. “But at the time, the available simulation software did not take into account the spatial and temporal variations in the viscosity of these very fast-polymerizing resins. We therefore had to update the digital tool by incorporating a specific model combining thermokinetic reactions and flow.” It was then possible to properly simulate the impregnation of different types of reinforcements with this resin on geometrically complex automotive body parts. By conducting numerical tests, researchers were able to explore many more combinations, work faster and identify the best configurations to optimize manufacturing technologies. They can therefore successfully reduce cycle times while still maintaining the level of quality and performance.
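
A minimal sketch of the kind of coupling mentioned here, with cure kinetics feeding a viscosity law, is given below. It uses a generic nth-order Arrhenius cure model and a Castro-Macosko-type viscosity law with made-up parameter values; it is only intended to show why the viscosity of a fast-reacting resin changes strongly over time and must be tracked inside the flow simulation, and it is not the model actually implemented by the IMT Lille Douai team.

```python
import math

# Illustrative parameters only (not measured values for any real resin).
A, E_A = 6.0e5, 60_000.0       # cure kinetics: pre-exponential factor (1/s), activation energy (J/mol)
N_ORDER = 1.5                  # reaction order
MU0, E_MU = 1.0e-4, 30_000.0   # viscosity law: reference viscosity (Pa.s), activation energy (J/mol)
ALPHA_GEL = 0.6                # degree of cure at gelation
R = 8.314                      # gas constant (J/mol/K)

def cure_rate(alpha, T):
    """nth-order Arrhenius cure kinetics, d(alpha)/dt."""
    return A * math.exp(-E_A / (R * T)) * (1.0 - alpha) ** N_ORDER

def viscosity(alpha, T):
    """Castro-Macosko-type viscosity: diverges as the degree of cure nears gelation."""
    if alpha >= ALPHA_GEL:
        return float("inf")
    return MU0 * math.exp(E_MU / (R * T)) * (ALPHA_GEL / (ALPHA_GEL - alpha)) ** 2

# Explicit Euler integration of the cure at a constant mold temperature.
T_mold, dt = 420.0, 0.05       # K, s
alpha, t, step = 0.0, 0.0, 0
while alpha < ALPHA_GEL and t < 120.0:
    if step % 200 == 0:        # report every 10 s
        print(f"t = {t:5.1f} s   alpha = {alpha:.2f}   viscosity = {viscosity(alpha, T_mold):.3f} Pa.s")
    alpha += cure_rate(alpha, T_mold) * dt
    t += dt
    step += 1
print(f"alpha = {alpha:.2f} reached after about {t:.0f} s")
```

In a full simulation, this kind of locally computed viscosity is fed, at every time step and every point of the mold, into the flow solver that computes how the resin impregnates the fiber preform.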

By studying the materials virtually, as well as the processes used, scientists can dare to explore methods that seem counter-intuitive. This was the case during a European project in which IMT Lille Douai partnered with manufacturers to reduce the molding cycle time for the floor structure of a motor vehicle. “By having our models integrate the way fiber fabrics are distorted during the draping procedure and the consequences this has on the local flow of resin, we were able to simulate a process. We proposed inlet points for the sequential injection of resin, distribution channels and vents at areas on the part that were not the most logical choices for manufacturers,” Patricia Krawczak recalls. But the numerical model had accurately predicted that the resin would impregnate the reinforcement faster and in a more homogeneous manner using this strategy. This result was then confirmed through full-scale experimental validations.

Today, the researchers continue to pursue this scientific approach—supported by industrial collaborations—to explore the potential of new materials and innovative processes. The digital tools are adapted to accurately simulate new technological alternatives and respond to the growing demand for natural fibers in composite materials. Due to their porous nature, they absorb part of the resin and swell. This phenomenon must therefore be included in the simulations, in hopes that this will lead to the discovery of new, more efficient scenarios. Cycle times are no longer improved by several minutes, like they were fifteen years ago; now they are improved by tens of seconds. This gain is still a considerable one in industries that can potentially produce thousands of composite parts each day.

Find out more about natural fiber-based high-performance composites: