
Big data and personal information: a scientific revolution?

This article was originally published (in French) on the website for IMT’s Values and Policies of Personal Information Chair.

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]O[/dropcap]n 15 November 2019, Valérie Charolles was the keynote speaker at a symposium organized by the University of Insurance on the theme “Data: a (r)evolution for insurance?” For this sector, which has long worked with data but whose horizons big data has begun to change, this opening keynote sought to provide a philosophical perspective on current developments in information processing.

Does the advent of big data mark a break with the previous ways of handling information, in particular personal information? Does it represent a true scientific revolution? This question has been discussed in scientific, philosophical and intellectual debates ever since Chris Anderson’s thought-provoking article for the American publication Wired in 2008. In the article, he proclaimed “the end of theory,” which has been made “obsolete” by the “data deluge,” and concluded with this intentionally provocative statement: “It’s time to ask: What can science learn from Google?”

Just over a decade after its publication, at a time when what we now know as “big data,” combined with “deep learning,” is used on a massive scale, the journal Le Débat chose to explore the topic by devoting a special report in its November-December 2019 issue to “the consequences of big data for science.” It called on philosophers from a variety of backgrounds (Daniel Andler, Emeritus Professor in the philosophy of science at Université Paris-Sorbonne; Valérie Charolles, philosophy researcher at Institut Mines-Télécom Business School; Jean-Gabriel Ganascia, professor at Université Paris-Sorbonne and Chairman of the CNRS Ethics Committee) as well as a physicist (Marc Mézard, who is also the director of ENS), asking them to assess Chris Anderson’s thesis. Engineer and philosopher Jean-Pierre Dupuy had shared his thoughts on the subject in May, in the journal Esprit.

Big data and scientific models

The authors of these articles acknowledge the contributions of big data processing on a scientific level (although Jean-Pierre Dupuy and Jean-Gabriel Ganascia express a certain skepticism in this regard). This sort of processing makes it possible to develop scientific models that are more open and which, through successive aggregations of layers of correlated information, may give rise to new forms of connections and links. Although machine learning by what are referred to as deep networks has existed as an idea for over 70 years, its implementation is still relatively recent: it has been made possible by the large amount of information now collected and the computing power of today’s computers. This represents a paradigm shift in computer science. Deep learning clearly provides scientists with a powerful tool, but, unlike Chris Anderson, none of the above authors see it as a way to replace scientific models developed from theories and hypotheses.

There are many reasons for this. Since they predict the future based on the past, machine learning models are not made for extreme situations and can make mistakes or produce false correlations. In 2009, the journal Nature featured an article on Google Flu Trends, which, by combining search engine query data, was able to predict the peak of the flu epidemic two weeks before the national public health agency. But in 2011, Google’s algorithm performed less well than the agency’s model that relied on human expertise and collected data. The relationships revealed by the algorithms represent correlations rather than causalities, and the phenomena revealed must still be explained using a scientific approach. Furthermore, the algorithms themselves work with the hypotheses built into them by their developers, and other algorithms, applied to the same data set, would produce different results.
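To illustrate the risk of false correlations, here is a minimal Python sketch (not from the article): two time series generated entirely independently of each other will often display a sizeable correlation coefficient, a “pattern” with no causal content.

```python
# Illustrative sketch: two independent random walks frequently show a
# strong Pearson correlation even though neither causes the other.
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1000

# Two time series generated independently of each other
series_a = np.cumsum(rng.normal(size=n))
series_b = np.cumsum(rng.normal(size=n))

# Pearson correlation between the two unrelated series
r = np.corrcoef(series_a, series_b)[0, 1]
print(f"Correlation between two independent random walks: r = {r:.2f}")
# |r| is often large: a 'relationship' with no causal content, which is
# why correlations found in big data still require scientific explanation.
```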

Algorithmic processing of personal data

In any case, even if it does not represent a paradigm shift, the use of big data attests to a new, more inductive scientific style, where data plays an increasingly important role (we often hear the term “data-driven” science). Yet ready-to-be-analyzed raw data does not exist. Daniel Andler elaborates extensively on this point, which is also evoked by the other authors. The information with which computers are provided must be verified and annotated in order to become data that can be used by algorithms in a meaningful way. And these algorithms do not work by themselves, without any human intervention.

When personal data is involved, this point is especially important, as underscored by Valérie Charolles. To begin with, the limitations cited above in terms of the results provided by the algorithms also clearly apply to personal data processing. Furthermore, individuals cannot be reduced to the information they can provide about themselves using digital tools, even if a considerable amount of information is provided. What’s more, the quantity of information does not presuppose its quality or relevance, as evidenced by Amazon’s hiring algorithm that systematically discriminated against women simply because they were underrepresented in the training data. As Marc Mézard concludes, “we must therefore be vigilant and act now to impose a regulatory framework and essential ethical considerations.”

[divider style=”dotted” top=”20″ bottom=”20″]

Valérie Charolles, philosophy researcher at Institut Mines-Télécom Business School, member of IMT’s Values and Policies of Personal Information Chair, and associate researcher at the Interdisciplinary Institute of Contemporary Anthropology (EHESS/CNRS)

 


Crisis management: how to prepare local territories

The catastrophic wildfires in the Gard department last summer highlighted the fact that local territories must be prepared to handle natural disasters. Although certain reflexes never vary, each disaster is unique and requires a tailored response. Sophie Sauvagnargues, a researcher at IMT Mines Alès who specializes in local management of natural crises, is taking part in organizing exercises to help prepare local authorities for these situations.

 

Each crisis is unique. There is no single correct method for the local territories concerned to learn, since each response must be suited to a particular situation. So, how can they prepare themselves for the unexpected? “There are essential skills and competencies to develop, in addition to organizational and management methods to ensure preparedness and response,” says Sophie Sauvagnargues. A researcher at IMT Mines Alès in natural crisis management, she is working with other researchers to develop “realistic, educational crisis exercises that are immersive and adaptive, so that local territories are better prepared to respond.”

As such, the researchers are working with local authorities (municipalities and inter-municipal bodies), industry players and institutional authorities to help them prepare for natural crises and learn how to manage them. Examples of such crises include forest fires or floods, such as those that struck the southern half of France in late November 2019. Some regions are at greater risk than others, and some accumulate several types of risks. Other regions anticipate the development of new risks resulting from the expected consequences of climate change. Mountain regions will prepare for torrential flooding due to rising water levels, while Reunion Island will anticipate the consequences of being hit by a cyclone.

The Alès researchers studied this latter example through the ANR SPICy project, in partnership with BRGM, Météo France-Indian Ocean, BRLIngénierie and LACy. The project, which was completed in 2017, is still being followed up with new crisis anticipation exercises in Reunion — the most recent of which date from November 2019. The researchers reach out to municipalities to work with them on developing appropriate tools for managing a cyclonic crisis. These tools are then used in a simulated situation.

Realistic exercises

“It’s often the early stages of a crisis – mobilization and anticipation – that justify these exercises to the greatest extent,” says Sophie Sauvagnargues. The researcher and her colleagues carry out preliminary work to provide a detailed analysis of the present situation at the municipal level, then develop a tailored, fictional crisis-management scenario for each municipality and provide on-site support to play out this scenario. “The preparatory work needed to develop these exercises is extremely important,” she adds. “For these local crisis centers in Reunion Island, it went on for several months.” The scenario must reflect the context and reality, and be tailored to the organization in question.

One of the first steps is to gain a clear understanding of the context surrounding the requested exercise. This requires an in-depth understanding of the municipalities’ crisis organization and the position of the inter-municipal body with jurisdiction over this area. The learning objectives must also be determined in advance – if they have not been specified by the participants – as well as the phenomenon that would initiate the crisis. This involves analyzing what has worked and what hasn’t in previous crises to identify areas for improvement. “We design the scenarios with disruptive events calling for specific competencies and skills,” explains Sophie Sauvagnargues.

A great deal of documentation is required to ensure the credibility of these disruptive elements and the scenario as a whole. “For the ANR SPICy project in Reunion Island, we had to reconstruct an increasing cyclone warning level, at the meteorological and hydrological level,” says Sophie Sauvagnargues. “We worked with Météo France for the plausibility of cyclone activity, but the entire crisis had to be designed to reflect the geographic, meteorological and institutional situation.”

Observing in order to correct

The exercises recreate a plausible, absorbing environment to immerse participants in a real-life situation for an hour and a half. Sophie Sauvagnargues and her team separate the participants and the organizers into two different rooms. It’s up to the participants to play out the sequences, by determining what actions and decisions should be taken to manage and anticipate the arrival of the cyclone and best prepare for it.

“We stay in communication with them to support and guide them throughout the scenario,” explains Sophie Sauvagnargues. “That means being able to adapt to questions we hadn’t expected and keeping an interactive attitude while guiding them toward the intended learning goals.”

The researchers also observe group dynamics, how decision-making is organized and how the group collaborates. “We aren’t there to judge their decisions, but to assess a group and how it works,” she asserts. “That’s essential so that we may then discuss the exercise in terms of strengths and weaknesses, difficulties encountered, and their feelings in relation to the original objectives. Our goal is to take an objective look at how the exercise went.”

New crisis management tools

“As part of the SPICy project, we’ve developed Graduated Response Plans for the municipalities,” says Sophie Sauvagnargues. In addition to the Local Response Plan that at-risk municipalities are required to implement, the Graduated Response Plans divide the phases of crisis management into several scenarios. These scenarios are calculated according to the risks and evolve with the intensity of the crisis.

A plan has been established to suggest responses for each scenario: road closures during floods, the areas likely to be impacted, or immediate actions required, such as which neighborhoods to warn first.
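As an illustration only – the article does not describe the plans’ actual format – such a Graduated Response Plan can be pictured as a simple mapping from alert levels to pre-planned actions. All levels, triggers and actions below are invented:

```python
# Hypothetical sketch of a Graduated Response Plan as a data structure.
# Every level, trigger and action here is invented for illustration.
GRADUATED_RESPONSE_PLAN = {
    "pre-alert": {
        "trigger": "river level above 2 m at upstream gauge",
        "actions": ["activate municipal crisis unit",
                    "check road-closure equipment"],
    },
    "alert": {
        "trigger": "river level above 3 m or heavy-rain warning",
        "actions": ["close flood-prone roads",
                    "warn the most exposed neighborhoods first"],
    },
    "crisis": {
        "trigger": "overflow observed",
        "actions": ["evacuate mapped impact areas",
                    "open emergency shelter"],
    },
}

def actions_for(level):
    """Return the pre-planned actions for a given alert level."""
    return GRADUATED_RESPONSE_PLAN[level]["actions"]

print(actions_for("alert"))
```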

“This Graduated Response Plan provides those involved in crisis management with a set of actions ready to go, so they won’t forget anything, and gives them more time to focus on the specific characteristics of the crisis,” adds Sophie Sauvagnargues. This is one example of a tool that can be developed through working with municipalities. In addition to training local authorities, crisis management exercises also provide an opportunity to develop tools to meet specific local needs.

 

Tiphaine Claveau for I’MTech


Data portability: Europe supports research players in this field

The right to data portability, introduced by the GDPR, allows individuals to obtain and reuse their personal data across different services. Launched in November 2019 for a period of three years, the European DAPSI project promotes advanced research on data portability by supporting researchers and tech SMEs and start-ups working in this field. The IMT Starter incubator is one of the project partners. IMT Starter business manager Augustin Radu explains the aim of DAPSI below.

 

What was the context for developing the DAPSI project?

Augustin Radu: Since the entry into force of the GDPR (General Data Protection Regulation) in 2018, all citizens have had the right to obtain, store and reuse personal data for their own purposes. The right to portability gives people more control over their personal data. It also creates new development and innovation opportunities by facilitating personal data sharing in a secure manner, under the control of the person involved.

What is the overall goal of the project?

AR: The Data Portability and Services Incubator (DAPSI) will empower internet innovators to develop human-centric technology solutions, meaning web technologies that can boost citizens’ control over data (privacy by design), trust in the internet and web decentralization, etc.

The goal is to develop new solutions in the field of data portability. The DAPSI project aims to allow citizens to transmit all the data stored by a service provider directly to another service provider, responding to the challenge of personal data portability on the internet, as provided for by the GDPR.
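For a concrete picture: the GDPR requires portable data to be provided in a “structured, commonly used and machine-readable format.” A hypothetical JSON export from a fictional service might look like the sketch below (all fields and service names are invented):

```python
# Hypothetical sketch of a machine-readable personal-data export.
# Field names and services are invented; only the principle (structured,
# machine-readable, transferable to another provider) comes from the GDPR.
import json

export = {
    "subject": {"user_id": "u-12345", "email": "alice@example.org"},
    "provider": "service-a.example.org",
    "exported_at": "2020-03-01T10:00:00Z",
    "data": {
        "profile": {"display_name": "Alice", "language": "fr"},
        "activity": [
            {"timestamp": "2020-02-28T09:12:00Z", "event": "login"},
        ],
    },
}

# Serialized this way, the data can be handed to the user or transmitted
# directly to another service provider.
print(json.dumps(export, indent=2))
```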

How will you achieve this goal?

AR: DAPSI will support up to 50 teams as part of a ten-month incubation program during which experts from various fields will provide an effective work methodology, access to cutting-edge infrastructure, training in business and data sectors, coaching, mentoring, visibility, as well as investment and a strong community. In addition, each DAPSI team will receive up to €150K in equity-free funding, which represents a total of €5.6 M through the three open calls.

How is IMT Starter contributing to the project?

AR: IMT Starter, in partnership with Cap Digital, will be in charge of this ten-month incubation program. In concrete terms, the selected projects will have access to online training sessions and one-to-one coaching sessions.

Who are your partners in this project?

AR: IMT Starter is working on a project led by Zabala (Spain), along with four other European partners: F6S (United Kingdom), Engineering (Italy), Fraunhofer (Germany) and Cap Digital (France).

What are the expected benefits of DAPSI?

AR: This initiative aims to develop a more human-centric internet based on the values of openness, cross-border cooperation, decentralization and privacy protection. The primary objective is to allow users to regain control in order to increase trust in the internet. This should lead to more transparent services with more intelligence, greater engagement and increased user participation, therefore fostering social innovation.

What are some important steps for the project?

AR: The first call was launched at the end of February. Anyone with an innovative project in the portability field may submit an application.

Learn more about DAPSI

Interview by Véronique Charlet for I’MTech


Taking on quantum computers

What if quantum computers, with their high computing power, were already available? What would happen? How would quantum computing transform communications and the way they are encrypted? Romain Alléaume, a researcher at Télécom Paris, talks to us about his research for the future of cryptography.

 

A hypothetical quantum computer, with its high computing power, would be a sword of Damocles hanging over current cryptography. It would be powerful enough to decrypt a great number of our secure communications, in particular as they are implemented on the internet. “It poses a threat in terms of protecting secrets,” says Romain Alléaume, a researcher at Télécom Paris in quantum information and cryptography, who quickly adds that “such a computer does not yet exist.”

Read more on I’MTech: What is a quantum computer?

But a hypothetical threat to the foundations of digital security must not be taken lightly. It would seem wise to start thinking today about cryptography techniques to respond to this threat. The time required to develop, test and verify new algorithms must be taken into consideration when it comes to updating these techniques. “Furthermore, some secrets, in the diplomatic world, for example, need to be protected for long periods of time,” explains the researcher. We must act now in order to be able to counter threats that could materialize ten or twenty years from now.

The American National Institute of Standards and Technology (NIST) launched a competition as part of an international call published in 2017. Its aim was to identify new cryptographic algorithms, called post-quantum algorithms, to replace those known to be vulnerable to a quantum computer, such as the RSA algorithm, whose security rests on the difficulty of factoring large numbers.
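As a reminder of what is at stake, here is the classic textbook RSA example with deliberately tiny numbers. It shows that anyone who can factor the public modulus recovers the private key, which is precisely the task a quantum computer running Shor’s algorithm would make feasible:

```python
# Toy illustration (insecure, tiny numbers) of why RSA depends on factoring.
# Real keys use primes hundreds of digits long; these are textbook values.
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120, computable only if you can factor n
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent: 2753 (Python 3.8+)

message = 65
cipher = pow(message, e, n)          # anyone can encrypt with (n, e)
assert pow(cipher, d, n) == message  # only the holder of d can decrypt

# An attacker who factors n recovers phi, hence d -- exactly what
# Shor's algorithm on a quantum computer would make practical.
```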

Between quantum and post-quantum

There are two quite different ways to consider implementing cryptography that remains safe even in the event of an attack by a quantum computer: post-quantum and quantum. The first relies on mathematical algorithms and computational hypotheses. It follows the same principle as the traditional cryptography implemented today, but uses mathematical problems that researchers have good reason to believe are difficult even for a quantum computer.

Quantum cryptography security, on the other hand, is not dependent on the computing power of the attacker: it relies on physical principles. The quantum key distribution system (QKD) makes it possible to exchange secrets by encoding information in the properties of light, such as the polarization or phase of single photons.
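The article does not name a specific protocol, but the canonical QKD scheme is BB84. The sketch below simulates only its classical key-sifting logic; real implementations encode the bits in the polarization or phase of single photons:

```python
# Minimal classical simulation of BB84-style key sifting (BB84 is the
# canonical QKD protocol; the article describes QKD generally).
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

n = 32
alice_bits  = random_bits(n)   # the raw key material
alice_bases = random_bits(n)   # 0 = rectilinear, 1 = diagonal
bob_bases   = random_bits(n)   # Bob measures in randomly chosen bases

# When Bob's basis matches Alice's, he reads her bit correctly; otherwise
# the outcome is random and the position is discarded during sifting.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
print(f"kept {len(sifted_key)}/{n} bits after sifting")
# An eavesdropper measuring in the wrong basis disturbs the photons,
# which Alice and Bob detect as an elevated error rate on a test sample.
```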

“QKD won’t replace traditional cryptography,” explains Romain Alléaume, “their use cases, as well as the limitations of their use, are very different in nature.  Let’s imagine that the attack is a car accident and cryptography is our safety system. We can think of traditional cryptography as the seatbelt, and quantum cryptography as the airbag. The latter is an additional safety feature for the critical functions that are not ensured by traditional cryptography.”

“The quality of the distributed secret with QKD provides a very high level of security which is not necessary for all communications,” adds the researcher, “but which can be crucial for increasing the security of critical functions.”

QKD also requires an optical communication infrastructure — typically fiber optic — and for now physical constraints limit its deployment. Optical link attenuation and noise significantly limit the portion of optical networks where it is feasible to deploy the technology. As of now, quantum communications are limited to ranges of 100 to 200 km on special fibers.
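The range limit can be estimated with a back-of-the-envelope calculation: standard telecom fiber attenuates roughly 0.2 dB/km at 1550 nm, and a single photon survives a link with probability 10^(-loss_dB/10). A short sketch using this typical figure:

```python
# Back-of-the-envelope sketch of why fiber attenuation limits QKD range.
ALPHA_DB_PER_KM = 0.2  # typical telecom-fiber loss; actual values vary

for distance_km in (50, 100, 200, 400):
    loss_db = ALPHA_DB_PER_KM * distance_km
    survival = 10 ** (-loss_db / 10)  # single-photon survival probability
    print(f"{distance_km:>4} km: {loss_db:5.1f} dB loss, "
          f"photon survival ~ {survival:.1e}")
# At 200 km only ~1 photon in 10,000 arrives; at 400 km, ~1 in 100 million.
# Unlike classical signals, single photons cannot be amplified en route.
```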

One of the challenges is to enable the deployment of QKD on shared infrastructures, and co-integrate it with telecom equipment as much as possible. This is the topic of the CiViQ project, one of the projects currently being carried out at Télécom Paris. “The ultimate goal,” says the researcher, “would be to share the network so that it can cover both traditional and quantum cryptography.”

Towards hybrid cryptography

The preferred approach is therefore to work with a well thought-out combination of computational cryptography – which will become post-quantum in the near future – and quantum cryptography. By redefining the border between the two, this approach should make it possible to deploy quantum cryptography in a wider range of cases.

Romain Alléaume and his team are working on the Quantum Computational Timelock (QCT), which relies on traditional cryptography assumptions and quantum cryptography technologies. It is both computational, to distribute an ephemeral secret, and quantum to encode information in a large quantum state, meaning with a great number of modes. “We’ve shown that with this hybrid hypothesis,  we can increase performance significantly, in terms of throughput and distance.”

The information exchanged is therefore locked for a short period of time, say one day. An important point is that this technique, if not broken on the first day, will subsequently ensure long-term security. “The attacker won’t be able to learn anything about the information distributed,” says Romain Alléaume, “regardless of his level of intelligence or computing power. As long as the model is verified and the protocols are built properly, we’ll have a perfect guarantee in the future.”

He reminds us that at present, “the challenge is to develop less expensive, safer techniques and to develop a real industrial system for quantum cryptography.” As part of the Quantum Communication Infrastructure (QCI) initiative led by the European Commission, the research team is studying ways to deploy quantum communication infrastructures at the industrial level. The OPENQKD project, in which Romain Alléaume and his team are taking part, is a groundbreaking project that will contribute to this European initiative by developing industry standards for public encryption keys.

[box type=”info” align=”” class=”” width=””]

The OPENQKD project

The OPENQKD project brings together multidisciplinary teams of scientists and professionals from 13 European countries to reinforce Europe’s position in quantum communications. On the French side, project partners include Orange, Thales Alenia Space, Thales Six GTS, Nokia Bell Labs, Institut Mines-Télécom, CNRS and iXblue.

[/box]


I like this, I don’t like that

We like the soft feel of a cat’s fur, but we are far less fond of slimy mud. Why is this? We are told that everyone’s tastes are different. But that does not keep scientists from trying to find answers. Jenny Faucheu, a researcher at Mines Saint-Étienne, has studied tactile perception, a highly original subject.

 

“We’re used to defining what is beautiful or pleasant from a visual perspective,” explains Jenny Faucheu, a materials engineering researcher[1] at Mines Saint-Étienne, “but it is less common to think about these questions in terms of other senses, such as touch.” A research group named GDR TACT (GDR 2033) was formed on 1 January 2018 to study the complexity of tactile perception across a wide range of disciplines.

A better understanding of the sense of touch would provide a wide range of benefits: on the one hand, medical examinations focused on the possible causes of tactile deficiency, or applications focused on rehabilitation; on the other, efforts to develop and design products geared to tactile interfaces and simulators, or innovation in e-commerce. To make this possible, insight must be gained into a sense that is often considered of secondary importance, after sight or hearing.

Who likes what

“Our goal is not to define a human standard and a perfect material,” says Jenny Faucheu. “We’re trying to create links between different aspects through tactile perceptions.” The researchers therefore take the material’s texture and substance into account by using sensors to detect finger vibrations on a surface. They also study cerebral activity through electroencephalograms and collect behavioral information using psycho-sensory questionnaires. “For example, we wondered if it bothered people that they couldn’t feel the fabric when buying an item such as a sweater online,” explains Jenny Faucheu.

Other questionnaires focused on the surfaces themselves. In a multiple-choice format, participants had to sort surfaces into four categories ranging from “I like it a lot” to “I really don’t like it.” Another focused instead on how surprising or familiar participants found a surface to be. They were then asked to sort the materials based on their similarities, and finally to name them and assign labels describing their characteristics. “These mechanisms allow us to create correspondences,” explains Jenny Faucheu, “so we can then say, if this person likes sample 22, they should like sample 45.”

Still, some materials command a consensus. People generally like smooth materials and like rough materials much less. And a surprise can be either positive or negative. But the cultural impact must be kept in mind. “This study was carried out with French participants,” she explains. “It is highly likely that the results would be different for a different culture.”

Liking without touching

To carry out this study, the research team made special surfaces. The various textures were created by distributing small cylindrical dots of varying heights, diameters and spacing on a polyurethane surface. Certain samples were perceived as smooth, rough, vibrating or sticky.

“When we rub a sample, vibrations are generated and travel through our finger to the mechanoreceptors,” explains Jenny Faucheu. The sensors used by the research team make it possible to study the vibrations received by these sensory receptors in our fingers. The lowest frequencies correspond to rough materials, which are generally disliked. And, conversely, the higher the frequencies are, the more the materials are perceived as smooth and tend to be well-liked. But it would also appear that a high amplitude intensifies the feeling of roughness.
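As an illustration of the kind of signal analysis such sensors enable – the parameters below are invented, not taken from the study – a Fourier transform extracts the dominant frequency and amplitude of a recorded vibration:

```python
# Illustrative sketch: extracting the dominant frequency of a synthetic
# finger-vibration signal with an FFT. Sampling rate and frequencies
# are hypothetical, not values from the study.
import numpy as np

fs = 5000                        # sampling rate in Hz (hypothetical)
t = np.arange(0, 1.0, 1 / fs)
# Synthetic signal: a strong 40 Hz "rough" component plus a weak 250 Hz one
signal = 1.0 * np.sin(2 * np.pi * 40 * t) + 0.3 * np.sin(2 * np.pi * 250 * t)

spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(t)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)

dominant = freqs[np.argmax(spectrum)]
print(f"dominant vibration frequency: {dominant:.0f} Hz")
# Per the study's findings, lower dominant frequencies correspond to
# surfaces perceived as rough, higher ones to surfaces perceived as smooth.
```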

All fifty or so samples look alike from a visual perspective, with their whitish color, and the naked eye cannot make out the small bumps on some of them. “We also decided to perform the study in a room that was dark enough to limit the influence of sight,” says the Mines Saint-Étienne researcher. Visual information is omnipresent and can therefore interfere with our perception. But hearing can too, although in a more discreet manner.

“We wanted to know whether tactile perceptions would be modified by associating more or less enjoyable sounds,” adds Jenny Faucheu. The same experiments are therefore repeated but with soundproof headphones. When the tester runs his finger along the surface, various sounds are sent to the headphones. It would therefore seem to be conceivable to reverse the perception of a surface by adjusting the ambient sound. “That said, we’re talking about an unpleasant feeling, not a painful one,” she says.

The loss of tactile sensation may appear with age or following an accident. Tactile stimulation exercises can be performed to help slow down this loss or regain this sensation. It is possible that an unpleasant feel could slow rehabilitation by requiring additional effort. Therefore, relying on the sense of hearing to transform the perception of a surface could facilitate this process. “It’s the principle of gamification,” says Jenny Faucheu. “The more fun and enjoyable the process, the more engaged the patient is and the more effective the rehabilitation.” An idea that calls for fundamental research on the sense of touch and new protocols for analysis and understanding.

The same principle applies to tactile interfaces. On a tablet or smartphone, there is usually tactile feedback to emphasize a selection: a tool that improves interaction with the object. “Tactile simulators try to reproduce realistic renderings of surfaces,” adds Jenny Faucheu. Projects such as StimTact aim to develop an augmented tactile screen that gives people the impression that they are touching the material displayed. We could therefore imagine buying a sweater on an online shopping website and feeling the real texture of the fabric, right in front of our computer.

[1] Jenny Faucheu is a researcher at the Georges Friedel laboratory, a joint research unit between CNRS and Mines Saint-Étienne.

Tiphaine Claveau for I’MTech

 


From solidarism to national solidarity: lessons of the epidemic

In this time of crisis, solidarity has been widely called for in response to the risk posed by the epidemic. In this article, Sophie Bretesché explores the historic origins of solidarity as a societal value. A sociologist at IMT Atlantique, she specializes in issues of risk and memory in relation to organizational change. In light of past epidemics and their impacts on social organization, she provides insight into the relationship between solidarity and the challenges we face in terms of social organization.

 

[dropcap]T[/dropcap]he health crisis caused by COVID-19 has led our society to consider collective risk in a new light. Since the outbreak of the virus and its global spread, our vulnerability to it has underscored the fragility of our bodies and, even more so, the organic nature of our societies, made up of individuals who interact with one another. This epidemic, viewed as unprecedented since the Spanish flu of 1918, raises serious questions about the economic development models we have constructed since the Second World War, based on interdependence between major economies. The epidemic has therefore pushed people to rethink their vision of the society in which they live. It reveals that the society we live in is not simply an artificial construction: it is a whole, which we inhabit and are part of, and which has proven to be permeated by fluids, bacteria, interactions and tensions.

Three components of risk management have come under question: scientific knowledge, the vulnerability level of our society, and our capacity to develop forms of resilience. The crisis has therefore led us to rethink the way society regulates its interdependencies. In response to the spread of the contagion, the notions of liberty and equality played a decisive role at the beginning of the crisis. The expected counterpart of the lockdown – a de facto deprivation of liberty – was equal access to care for all citizens. In this perspective, it is the vulnerability to the risk that is managed collectively.

In recent days, another notion has been widely called for in the health, economic and social spheres: national solidarity. It has been the common watchword in the management of the crisis. This notion has been used in the past, in particular in relation to Pasteur’s groundbreaking discoveries that revolutionized medicine. Solidarism was inspired by the great microbial discoveries, and the medical discoveries resulting from the fight against epidemics had a concrete effect on the way societies were managed. In light of the COVID-19 epidemic, it is significant that the notion of solidarity has been used as the structuring principle for the regulation methods to come.

Solidarist theory at the crossroads of medical discoveries and the economic crisis

In the 1890s, solidarist theory was just starting to gain attention, but the notion of solidarity had already been used in biology for half a century. The theory grew out of an intellectual fervor for this notion, a consequence of the major economic crisis that hit France from 1873 to 1895. At the end of the 19th century, liberalism, based on the glorification of the market and suspicion of any State regulation, was increasingly considered unacceptable, as it increased social inequality and intensified the much-discussed “social question.” Paternalism, along with managing behavior through religion and philanthropy, was no longer considered a credible response to the ills of industrialization and the development of global capitalism. It was against this backdrop that Léon Bourgeois put forward a new social doctrine in the 1890s, with the principle of solidarity as its cornerstone: moral principles and scientific methods had to align to go beyond simple charity and create collectively organized “solidarity” (he did not invent the term, but gave it its noble meaning).

Pasteur’s discoveries would lead to mass vaccination and raise important social questions.

Léon Bourgeois readily acknowledged that Pasteur’s scientific research on microbial contagion was at the origin of his thinking about the interdependence between men and generations. As he saw it, rich and poor were equally exposed to biological and social ills, and the suffering endured by some would inevitably affect the lives of others. “The isolated individual does not exist,” Bourgeois tirelessly repeated, in response to the liberal dogma of the precedence of the individual over social organization – which liberals saw as a coercive power, any extension of which would erode individual liberties. Bourgeois and the solidarists, on the other hand, asserted that the individual is born into society and may only thrive through the intellectual and material resources made available by society. Interdependent and united, men are indebted to one another, as well as to the generations that came before them and those to follow.

From biological solidarity to social solidarity

Bourgeois’s vision was based on scientific knowledge about the facts of social organization. Drawing on the natural sciences and emerging sociology, it showed the close solidarity that unites the components of an organization, whether social or biological. This vision, supported by the findings of the natural sciences, illustrated how the liberal idea of laissez-faire was counterproductive, due precisely to the interdependence of individuals. The solidarity proven in the field of science led to the implementation of a new social contract based on debt, one that would account for the interdependence and reciprocal duty linking the different members of a society.

Social Security would grow out of this insightful intuition – social ill was ultimately turned into public good. Thus, solidarism grew out of the idea of a “social debt,” which would gradually come to encompass the right to education, a basic set of goods needed to survive, and insurance against the main risks of life for everyone. A “social duty” was assigned to everyone. The solidarism proposed by Bourgeois established, alongside liberty, the effective solidarity of the country in response to the perils of life.

From debt to social redistribution

This philosophy upholding the fundamentally social nature of all individual existence goes hand in hand with a process that Jacques Donzelot called “the invention of the social.” For Bourgeois, there was no purely individual property: all activity and property had, in part, a social origin, and as such, the taxes and social security contributions collected by public authorities on income and assets were the rightful compensation for the services offered by society.

This conception provided the basis for the reforms championed by Bourgeois, which would result in the introduction of progressive rates for estate tax in 1901 as well as the creation of a progressive income tax in 1914. The debate over estate taxation, begun in 1893-1894, represented a key moment in the development of solidarism. In the Revue de métaphysique et de morale (Review of Metaphysics and Morality) the philosopher Alphonse Darlu set out the principle of solidarity between generations, which would provide the basis for the legitimacy of estate tax for over a century.

COVID-19: when the epidemic reveals the role of professional communities

A look back at the epidemics of the 19th century therefore reminds us how such phenomena have drastically changed conceptions of social relationships and political action. And the current COVID-19 crisis has revealed a number of dimensions that are intrinsic to society.

The first relates to the organization of certain professional communities, which have for years been built upon continuous adaptation to complex, difficult situations with regard to the resources at their disposal. This notion of a “professional community” has now taken on particular significance, even if in recent years it has been damaged by technocratic, bureaucratic and managerial reforms. The management of the crisis has illustrated, for example, how the medical community has shown unwavering commitment to its public-interest mission, as well as remarkable effectiveness as a community. Accounts from nurses, nursing assistants and physicians illustrate what has risen to the surface in the face of adversity: caring for others, self-sacrifice, the hard work and dedication of the community, and the unavoidable risk-taking that comes with their jobs.

If national solidarity is now expressed, among other ways, by applauding from balconies, we must not be too quick to reduce what healthcare workers are doing to a kind of inexhaustible dedication. The medical community is proving its extraordinary capacity to reorganize itself continually and to form strong, effective collectives, while maintaining what forms the very essence of care: the relationship with the patient. And it is performing its duties after years of hospital rationalization, weakening of collectives, and declining recognition of these professions. The solidarity required by the epidemic calls not only for greater appreciation of healthcare professionals, but for their participation in hospital management practices.

The teaching community, another community that has been undervalued in recent years, has also shown its ability to resist and to teach during the lockdown. The teaching profession has often been given short shrift when it comes to the essence of education – the relationship – which is now highlighted precisely by what is missing: the physical act of knowledge transfer. Once more, despite attempts to develop distance learning, the power of in-person group learning situations cannot be overemphasized, especially in order to reduce and correct social inequalities.

The epidemic and the key role of unskilled workers

Lastly, during the lockdown, other, more invisible professions have proven to be especially exposed to contagion, even as they perform activities that are essential to the country. Cashiers, garbage collectors, farm workers, truck drivers and delivery people carry out activities that are largely unrecognized and undervalued, yet essential: these jobs contribute to production while physically exposing workers to the risk of contagion.

The situation reveals the great inequalities between those who can work from home and those who are exposed to the virus. Ultimately, this inequality of circumstances and salary will require a rethinking of the forms of interprofessional solidarity and value chains for tomorrow. And while various forms of social injustice were at the centre of the yellow vests movement, the epidemic has magnified the essential nature of service jobs. In French culture, based on a logic of honor, it is considered good form to regard service work with distance, or even condescension, even though it reveals the highly social nature of our activities.

Read more on I’MTech: The current “mini-collapse” requires a democratic response

If the medical discoveries of the 19th century brought to light the interdependent nature of the human beings who make up society, the current COVID-19 crisis has reminded our societies of the fundamentally organic and physical nature of our social and professional activities. Moreover, the active engagement of certain professions reminds us, as in the 19th century, of the social debt we owe to the workers who are most exposed to the virus.

From a distant view to a society based on solidarity

In commentary about the epidemic, many have cited two works of French literature for insight into what we are experiencing. Giono, in Le Hussard sur le toit (The Horseman on the Roof), shows the distant, cold viewpoint of a soldier whose attitude is unchanged by cholera; the novel highlights selfishness, hate, fear and passivity in relation to the illness. Camus’s La Peste (The Plague), on the other hand, reveals the fraternity and solidarity displayed in particular by health workers.

If we choose to develop a new kind of solidarity, the regulation of risks will require our society to increase its ability to recognize service jobs and to rethink professional value in light of social contribution. For the strength of professional communities and of organizations’ methods of resilience will determine not only our resistance to the epidemic crisis, but also our ability to create a more equal society based on greater solidarity.

To learn more about Sophie Bretesché’s research work:


The current “mini-collapse” requires a democratic response

Fabrice Flipo, Institut Mines-Télécom Business School

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]C[/dropcap]ovid-19, an Anthropocene disease? This is how professor Philippe Sansonetti sees the crisis we are experiencing. As a reminder, this concept, proposed by Paul Crutzen and Eugene Stoermer in 2000, refers to the present geological time interval, in which “conditions and processes on Earth are profoundly affected by human impact.” This period has not yet been officially recognized, as its geological relevance is still greatly contested.

The phenomena associated with the Anthropocene include land erosion, the disruption of major biogeochemical cycles, and the consequences of these environmental changes: global warming, rising sea levels, ocean acidification, the proliferation of non-biodegradable waste, etc.

The prospect of the dramatic consequences of these changes has led to the idea of a collapse, and the current pandemic can already be seen as a sort of mini-collapse. What can these notions teach us about the current situation, especially in terms of the links that must be developed with democracy?

Covid-19 and collapsology

From an ecological viewpoint, Covid-19 is a population among others which, like any other, exists within populations with which it interacts in a constantly evolving, non-deterministic manner.

The hypothesis of a complete or partial collapse of what Habermas calls the “sub-systems of rational action in relation to a purpose” (such as transport, agriculture and industry) has long been anticipated, whether as a result of an epidemic or of a number of other causes.

Recently, there has been renewed interest in this type of scenario under the name of “collapsology.” Before the geographer Jared Diamond wrote his famous Collapse (2006), he published Guns, Germs and Steel, in which he described the global effect of viruses, which decimated the inhabitants of the New World to a much greater extent than wars did.

However, it is not the emergence of a virus, which is difficult to foresee, that has concerned ecology specialists for years, but rather the condition of soil, biodiversity, toxic pollutants, matter and energy, water and climate change. At the core of all these issues, the risk of collapse threatens us.

Precautionary principle

But what is a collapse? Yves Cochet, Pablo Servigne and Raphael Stevens describe a “process which leads to a situation in which the basic needs of a majority of the population are no longer met,” for example, going to school or seeing friends.

This, in part, is what has arisen in the current situation, which those interested in the Anthropocene can treat as a sort of testing ground from which a number of lessons can be learned, and which can be linked to other situations of collapse of varying severity, such as the fires in Australia, mad cow disease, Seveso, Bhopal, or the collapse of local ecosystems. These “dress rehearsals” can also give rise to legislative or conceptual tools developed to avoid, manage or recover from such situations.

Efforts to prevent such catastrophes led to the precautionary principle adopted at the Rio Summit of 1992: “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a pretext for postponing effective measures to prevent environmental degradation.”

Precaution, which has generated a wealth of literature, differs from prudence in that it is applied to major risks, defined by Law no. 87-565 of 22 July 1987: accidents, disasters and catastrophes, justifying emergency plans, including the famous Orsec plan.

Major risks

The occurrence of a major risk represents a collapse of varying severity.

This type of threat justifies the use of precaution, as it presents very specific properties and differs from usual risks that insurance can anticipate.

The basic point of reference for insurance is the probability of the occurrence of a catastrophe and its cost, which implies being able to rely on a high number of occurrences – car accidents, for example. The fact that a risk is repeated shows that it is part of the everyday life of society, of its normal activity: for example, an insufficient number of hospital beds.
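A small simulation makes the contrast concrete: with a frequent, independent risk, the average loss per policyholder converges to its expected value (the law of large numbers), which is what makes a premium computable. All figures below are invented:

```python
# Sketch of why insurers need many occurrences: frequent, independent
# risks average out, so a premium can be computed. Numbers are invented.
import random

random.seed(1)
p_accident, cost = 0.05, 10_000   # 5% yearly probability, fixed claim cost
expected_loss = p_accident * cost # 500 per policyholder per year

for n_policyholders in (100, 10_000, 1_000_000):
    losses = sum(cost for _ in range(n_policyholders)
                 if random.random() < p_accident)
    print(f"{n_policyholders:>9} policies: "
          f"average loss {losses / n_policyholders:7.1f} "
          f"(expected {expected_loss})")
# A one-off major risk offers no such statistics: there is no frequency
# to average over, which is why it is 'incalculable'.
```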

A major risk is just the opposite: it is incalculable and irreversible, and it takes society out of a state of normalcy.

It therefore leads to an exceptional state, in the sense that there is a brutal disruption of normalcy. The nuclear industry is a prime example: power plants almost never explode, so we have no real statistical basis for anticipating such an event.

But when they do explode, it is “serious” since the sub-systems of rational action must build their response from scratch.

Resilience

To address ecological issues, thousands of scientists rely on the concept of “resilience.” This word, whose definition is highly disputed, draws on ecological sciences, economics, sociology and psychology, and was popularized by C.S. Holling and Gunderson.

It refers to “the capacity of a socio-ecological system to absorb or withstand disruptions and other stress factors, such that the system remains in the same regime. It describes the degree to which the system is capable of self-organization, learning and adaptation.”

The theorists of the Resilience Alliance, which brings together thousands of scientists, provide guidelines for action using diagrams: “adaptive” management, which identifies uncertainties and seeks to understand and change the system.

When a major risk materializes, the unknown is part of the equation. Any attempt to “say what’s what” in a hierarchical fashion, or by relying on a small number of experts, must be dismissed: the crucial goal is to foster dialogue, since it is the entire society that is the learning system.

The healthcare system does not by itself represent “the organization”. In this sort of situation, “any pre-established discourse will be seen as false” for a simple reason: everyone sees the situation as unprecedented.

Any attempt to fall back on a plan (for example, that of the WHO) will reflect dogmatism, and blindness, rather than competence and an ability to learn.

The threatened group must instead manage to develop a shared, evolving understanding, working not with certainties, but with questions (the right questions, the ones people are asking, and that have a real impact); not with results, but with processes (that everyone may take part in, contribute to, and therefore play an active role in their situation).

Take, for example, the question of determining where the virus is. Processes must be put in place in order to answer it – testing, expanding the “scientific council” to include the political sciences, or creating a forum that includes shopkeepers, military personnel, farmers, citizens selected at random, etc. – instead of asking a handful of physicians to single-handedly determine the best practice for the entire country.

Management of the crisis in South Korea and Taiwan

South Korea and Taiwan’s responses to COVID-19 are considered “atypical” by the current government. But in reality, these countries have made resilient choices. They understood that they were facing not only a health crisis, but a political one as well, implying a “need to develop the response in close collaboration with civil society.”

Early on, Taiwan equipped its entire population with masks, tests and a high-quality public digital space, allowing everyone to manage the crisis in a democratic, creative and evolving way. This also included non-governmental organizations responsible for dealing with fake news.

After a chaotic beginning, South Korea was quick to implement similar measures, in particular wide-scale testing, which allows individuals to manage their effect on the environment themselves: expertise is decentralized as much as possible. These two countries had no reason to be concerned about the capacity of their healthcare systems, since they were not strained.

South Korea and Taiwan are therefore setting an example, while China is trying to suggest that it is doing so. Europe, meanwhile, has been mocked and criticized. The geopolitics of tomorrow are also at stake.

The force of democracy

The current crisis has taught us a lesson: democracy is the best way to respond to the major crises humanity will face in coming years, due to the reckless yet predictable effects of its activities on the planet.

But dictatorship and authoritarian management will never be far off, set to offer their (bad) services, whether in the form of a leader and savior, at war against a “Chinese virus”, or a Platonic government, claiming to rely on Science.

There is a temptation to create a state of exception, as described by Carl Schmitt: temporarily giving an individual, leader or scientist, full power to save the Republic – an attitude that will instead worsen the crisis.

Let us act with discernment: let us not deny the crises or shut ourselves off from unsettling news and weak signals, but actively seek out what we do not know and where the absence of an answer poses a problem for us, and let us put in place democratic, flexible, evolving structures. Trusting one another, collectively, is the best weapon we have.

[divider style=”dotted” top=”20″ bottom=”20″]

Fabrice Flipo, Professor of social and political philosophy, epistemology and the history of science and technology, Institut Mines-Télécom Business School

This article has been republished from The Conversation under a Creative Commons license. Read the original article.

 


How to prevent internet congestion during the lockdown

Hervé Debar, Télécom SudParis – Institut Mines-Télécom ; Gaël Thomas, Télécom SudParis – Institut Mines-Télécom ; Gregory Blanc, Télécom SudParis – Institut Mines-Télécom and Olivier Levillain, Télécom SudParis – Institut Mines-Télécom

[divider style=”normal” top=”20″ bottom=”20″]

[dropcap]T[/dropcap]he current health crisis has led to a rise in the use of digital services. Telework, along with school closures and the implementation of distance learning solutions (CNED, MOOCs, and online learning platforms such as Moodle, for example), will put additional strain on these infrastructures, since all of these activities are carried out over the network. This raises concerns about overloads during the lockdown period. Across the internet, however, DNS server loads have not shown a massive increase in traffic, demonstrating that internet use remains under control.

The internet is a network that is designed to handle the load. However, telework and distance learning will create an unprecedented load. Simple measures must therefore be taken to limit network load and make better use of the internet. Of course, these rules can be adapted depending on the tools you have at your disposal.

How do telecommunications networks work?

The internet network functions by sending packets between machines that are connected to it. An often-used analogy is that of the information highway. In this analogy, the information exchanged between machines of all kinds (computers, telephones and personal assistants, to name just a few) is divided into packets (small and large vehicles). Each packet travels through the network between a source and a destination. All current networks operate according to this principle: internet, wireless (wifi) and mobile (3G, 4G) networks etc.

The network must provide two important properties: reliability and communication speed.

Reliability ensures accurate communication between the source and the destination, meaning that information from the source is transmitted accurately to the destination. Should there be transmission errors, they are detected and the data is retransmitted. If there are too many errors, communication is interrupted. An example of this type of communication is email. The recipient must receive exactly what the sender has sent. Long packets are preferred for this type of communication in order to minimize communication errors and maximize the quantity of data transmitted.

Communication speed makes real-time communication possible. As such, the packets must all travel across the network as quickly as possible, and their crossing time must be roughly constant. This is true for voice networks (3G, 4G) and television. Should a packet be lost, its absence may be imperceptible. This applies to videos or sound, for example, since our brain compensates for the loss. In this case, it is better to lose a packet from time to time – this leads to communication of lower quality, but it remains usable in most cases.
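For readers who want to see the two styles side by side, here is a minimal sketch using Python’s standard socket API; the hosts and ports are placeholders:

```python
# Minimal sketch of the two transport styles described above.
import socket

# TCP (SOCK_STREAM): a session with delivery guarantees -- lost packets
# are retransmitted transparently, as for email or the web.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(("example.org", 80))
tcp.sendall(b"HEAD / HTTP/1.0\r\nHost: example.org\r\n\r\n")
print(tcp.recv(200))
tcp.close()

# UDP (SOCK_DGRAM): datagrams sent with no session and no retransmission;
# a lost packet is simply lost, which suits voice or video.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"hello", ("192.0.2.1", 9999))  # fire and forget
udp.close()
```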

Congestion problems

The network has a large overall capacity, but it is limited for each of its components. When there is very high demand, certain components can become congested (routers, links – primarily fiber today – and servers). In such cases, the two properties (reliability and speed) can break down.

For communications that require reliability (web, email), the network uses the TCP protocol (TCP from the expression “TCP/IP”). This protocol introduces a session mechanism, which is implemented to ensure reliability. When a packet is detected as lost by its source, it is retransmitted until the destination indicates that it has arrived. This retransmission of packets exacerbates network congestion, and what was a temporary slowdown turns into a bottleneck. To put it simply, the more congested the network, the more the sessions resend packets. Such congestion is a well-known phenomenon during the ‘internet rush hour’ after work.
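A toy simulation, with invented numbers, of the feedback loop just described: once offered traffic exceeds capacity, retransmissions feed the very backlog that caused them.

```python
# Toy model: dropped packets are resent the next step, so a 10% overload
# turns a temporary slowdown into an ever-growing bottleneck.
capacity = 100          # packets the link can carry per time step
new_traffic = 110       # fresh packets offered each step (10% over capacity)
backlog = 0             # packets awaiting retransmission

for step in range(1, 6):
    offered = new_traffic + backlog
    delivered = min(offered, capacity)
    backlog = offered - delivered   # dropped packets are resent next step
    print(f"step {step}: offered {offered:4d}, backlog {backlog:4d}")
# The backlog grows without bound: each retransmission adds to the
# congestion that caused the loss in the first place.
```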

If the source considers that a communication has been subject to too many errors, it will close the “session.” When this occurs, a great quantity of data may be lost, since the source and the destination no longer know much about each other’s current state. The congestion therefore causes a waste of capacity, even once it is over.

For communications that require speed (video, voice), the network instead uses the UDP protocol. Unfortunately, routers are often configured to reject this kind of traffic in the event of a temporary overload. This makes it possible to prioritize traffic using sessions (TCP: email, web). Losing a few packets in a video or voice communication is not a problem, but losing a significant amount can greatly affect the quality of the communication. Since the source and destination exchange only limited information about problems encountered, they may have the impression that they are communicating when this is not actually the case.

The following proposals aim to limit network load and congestion, in order to avoid a situation in which packets start to get lost. The user may be explicitly informed of this packet loss, but not always; it may instead show up as delays or a deterioration in communication quality.

What sort of communications should be prioritized in the professional sphere?

Professional use must prioritize connection time for exchanging emails or synchronizing files. But the majority of work should be carried out without being connected to the network, since for a great number of activities, there is no need to be connected.

The most crucial and probably most frequently-used tool is email. The main consequence of the network load may be the time it takes to send and transmit messages. The following best practices will allow you to send shorter, less bulky messages, and therefore make email use more fluid:

– Choose thick clients (Outlook, Thunderbird for example) rather than web-based clients (Outlook Web Access, Zimbra, Gmail for example) since using email in a browser increases data exchange. Moreover, using a thick client means that you do not always have to be connected to the network to send and receive emails.

– When responding to email, delete non-essential content, including attachments and signatures.

– Delete or simplify signatures, especially those that include icons and social media images.

– Send shorter messages than usual, giving preference to plain text.

– Do not add attachments or images that are not essential, and give preference to exchanging attachments via shared disks or other services.

When it comes to file sharing, VPNs (“Virtual Private Networks”) and cloud computing are the two main solutions. Corporate VPNs will likely be the preferred way to connect to company systems. As noted above, they should only be activated when needed, or at most periodically; long sessions should be avoided, as they may contribute to network congestion.

Most shared disks can also be synchronized locally in order to work remotely. Synchronization is periodic and makes it possible to work offline, for example on office documents.

Keeping in touch with friends and family without overloading the network

Social media will undoubtedly be under great strain. Guidelines similar to those for email should be followed: photos, videos, animated GIFs and other fun but bulky content should only be sent sparingly.

Certain messages may be rejected by the network. Unless the matter is urgent, you should wait for the load to ease before trying again.

Advertising represents a significant portion of web content and congests the network without benefiting the user. Most browsers can incorporate extensions (such as Privacy Badger) to remove such content automatically, and some browsers, such as Brave, offer this feature natively. In general, the use of these tools does not affect important websites such as government sites.

Television and on-demand video services also place great strain on the network. For television, it is preferable to use the digital terrestrial network (TNT in France) rather than ISP set-top boxes, which use the Internet. The use of video-on-demand services should be limited, especially during the day, so as to give priority to educational and work applications. A number of video services have already limited their streaming quality, which significantly reduces bandwidth consumption.
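The savings can be estimated with a quick back-of-the-envelope calculation, assuming ballpark rates of about 5 Mbit/s for a high-definition stream and 1.5 Mbit/s for a reduced-quality stream (illustrative figures, not the actual settings of any particular service):

```python
# Ballpark streaming rates (assumed for illustration, not actual
# figures from any particular service).
HD_RATE_MBPS = 5.0   # typical high-definition stream
SD_RATE_MBPS = 1.5   # typical reduced-quality stream

def gigabytes_per_hour(rate_mbps: float) -> float:
    # Mbit/s -> GB/h: seconds per hour / bits per byte / MB per GB
    return rate_mbps * 3600 / 8 / 1000

print(f"HD: ~{gigabytes_per_hour(HD_RATE_MBPS):.2f} GB per hour")  # ~2.25
print(f"SD: ~{gigabytes_per_hour(SD_RATE_MBPS):.2f} GB per hour")  # ~0.68
```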

Cybercrime and security

The current crisis will unfortunately be exploited as an attack vector. Messages about the coronavirus must be handled with caution: read them carefully, be wary of any links that do not lead to government websites, and do not open attachments. The Hoaxbuster website and the Décodeurs tool from the newspaper Le Monde can be used to check whether a piece of information is reliable.
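As a purely illustrative reflex – not a substitute for real phishing protection – a crude check of whether a link actually points to a French government domain might look like this hypothetical helper:

```python
from urllib.parse import urlparse

# Hypothetical helper, for illustration only: checks whether a link's
# domain belongs to the French government. Real phishing detection is
# far more involved; this merely encodes the reflex of reading the
# domain carefully before clicking.
def is_french_government_site(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    return host.endswith(".gouv.fr") or host in ("gouvernement.fr",
                                                 "www.gouvernement.fr")

print(is_french_government_site("https://www.impots.gouv.fr/"))             # True
print(is_french_government_site("https://gouv.fr.covid-help.example.com"))  # False
```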

At a time when online meeting systems are being used extensively, attention must also be paid to personal data protection.

ARCEP (the French telecoms regulator) provides guidelines for making the best use of the network. To protect yourself from attacks, the IT security rules issued by ANSSI (the French cybersecurity agency) are more important than ever at a time when cybercrime may flourish.

[divider style=”dotted” top=”20″ bottom=”20″]

Hervé Debar, Head of the Telecommunications, Networks and Services Department, Télécom SudParis – Institut Mines-Télécom; Gaël Thomas, Professor, Télécom SudParis – Institut Mines-Télécom; Gregory Blanc, Associate Research Professor in Cybersecurity, Télécom SudParis – Institut Mines-Télécom; and Olivier Levillain, Associate Research Professor, Télécom SudParis – Institut Mines-Télécom

This article has been republished from The Conversation under a Creative Commons license. Read the original article (in French).

coronacheck, queries

CoronaCheck: separating fact from fiction in the Covid-19 epidemic

Rumors about the origins of the Covid-19 epidemic and news about miracle cures are rampant. And some leaders have taken the liberty of putting forth questionable figures. To combat such misinformation, Paolo Papotti and his team at EURECOM have developed an algorithmic tool for the general public, which can determine the accuracy of the figures. In addition to its potential for informing the public about the epidemic, this work illustrates the challenges and current limitations of automated fact-checking tools.

 

The world is in the midst of an unprecedented health crisis, which has unfortunately been accompanied by an onslaught of incorrect or misleading information. Described by the World Health Organization (WHO) as an ‘infodemic’, such ‘fake news’ – which is not a new problem – has been spreading over social media and through public figures. We can see the effects this may have on the overall perception of the epidemic. One notable example is public figures such as the president of the United States using incorrect figures to underestimate the impact of the virus and justify continuing the country’s economic activity.

“As IT researchers in data processing and information quality, we can contribute by providing an algorithmic tool to help with fact-checking,” says Paolo Papotti, a researcher at EURECOM. Working with PhD student Mohammed Saeed, and with support from Master’s student Youssef Doubli, he developed a tool that can check this information, following research previously carried out with Professor Immanuel Trummer from Cornell University.

Originally intended for the energy industry – where data is constantly changing and must be painstakingly verified – this tool, which is called CoronaCheck and is now available in French, was adapted in early March to meet current needs.

This fact-checking work is a job in its own right for many journalists: they must use reliable sources to check whether the information heard in various places is correct. And if it turns out to be a rumor, journalists must find sources and explanations to set the record straight. “Our tool does not seek to replace journalists’ investigative work,” explains Paolo Papotti, “but a certain amount of this information can be checked by an algorithm. Our goal is therefore to help social media moderators and journalists manage the wealth of information that is constantly springing up online.”

With millions of messages exchanged every day on networks like Twitter and Facebook, it is impossible for humans to accomplish such a task alone. Before information can be checked, at-risk claims must first be identified, and an algorithm can be used by these networks to analyze large amounts of data simultaneously and target misinformation. This is the aim of the research program funded by Google to combat misinformation online, which includes the CoronaCheck project. The goal is to provide the general public with a tool to verify figures relating to the epidemic.

A statistical tool

CoronaCheck is a statistical tool that compares quantitative claims against its data. The site works a bit like a search engine: the user enters a query – a claim such as “there are more coronavirus cases in Italy than in France” – and CoronaCheck says whether it is true or false. It is a tool that reasons with statistics: it can handle logical statements using terms such as “less than” or “constant”, but it will not understand queries such as “Donald Trump has coronavirus.”

“We think that it’s important for users to be able to understand CoronaCheck’s response,” adds Paolo Papotti. To go back to the previous example, the software will not only respond as to whether the statement is true or false, but will also provide details in its response. It will specify the number of cases in each country and the date for which these data are correct. “If the date is not specified, it will take the most recent results by default, meaning for the month of March,” says the researcher.
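The principle behind such a response can be pictured with a small sketch – an illustration only, not EURECOM’s actual implementation, and the case counts below are invented for the example:

```python
# Illustration of the principle only -- not EURECOM's actual code.
# The case counts below are invented for the example.
cases = {
    ("Italy", "2020-03"): 100_000,
    ("France", "2020-03"): 40_000,
}

def check_more_cases(country_a: str, country_b: str, month: str = "2020-03"):
    """Check a claim of the form 'A has more cases than B'.

    Returns the verdict together with the underlying figures, so the
    user can see *why* the claim is judged true or false."""
    a = cases[(country_a, month)]
    b = cases[(country_b, month)]
    return a > b, {country_a: a, country_b: b, "month": month}

verdict, evidence = check_more_cases("Italy", "France")
print(verdict, evidence)  # True {'Italy': 100000, 'France': 40000, 'month': '2020-03'}
```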

This means that it is essential to update the data regularly. “Every day, we enter the new data compiled by Johns Hopkins University,” he says. The university itself collects data from several official sources, such as the WHO and the European Centre for Disease Prevention and Control.

“We know that this tool isn’t perfect,” says Paolo Papotti. The system relies on machine learning, so the model must be trained. “We know that it is not exhaustive and that a user may enter a word that is unknown to the model.” User feedback is therefore essential in order to improve the system. Comments are analyzed to incorporate questions or wording of statements that have not been taken into account. Users must also follow CoronaCheck’s instructions and speak a language the system understands.

Ambiguity of language

Language can be a significant barrier for an automatic verification tool, since natural language is ambiguous. The term “death rate” is a perfect example. For the general public it refers to the mortality rate, meaning the number of deaths in relation to a population over a given period of time. However, “death rate” can also mean the case fatality rate, meaning the number of deaths in relation to the total number of cases of the disease. The results will therefore differ greatly depending on the meaning given to the term.
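A small worked example, with invented figures, makes the gap between the two readings concrete:

```python
# Invented figures for a hypothetical country, for illustration only.
population      = 10_000_000
confirmed_cases = 50_000
deaths          = 2_500

# Reading 1 -- "mortality rate": deaths relative to the whole population.
mortality_rate = deaths / population            # 0.00025, i.e. 0.025 %

# Reading 2 -- "case fatality rate": deaths relative to confirmed cases.
case_fatality_rate = deaths / confirmed_cases   # 0.05, i.e. 5 %

print(f"mortality rate:     {mortality_rate:.3%}")      # 0.025%
print(f"case fatality rate: {case_fatality_rate:.1%}")  # 5.0%
```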

Such differences in interpretation are always possible in human language, but must be avoided in this verification work. “So the system has to be able to provide two responses, one for each interpretation of death rate,” explains Paolo Papotti. The same approach would also work in cases where a lack of rigor in the wording may lead to an interpretation problem.

If the user enters the query “there are more cases in Italy than in the United States,” it may be true for February but false for April. “Ideally, we would have to evolve towards a system that gives different responses, more nuanced than true or false,” says Paolo Papotti. “This is the direction we’re working in, in order to solve this interpretation problem and go further than a purely statistical tool,” he adds.

The team is working on another system, which could respond to queries that cannot be determined with statistics, such as “Donald Trump has coronavirus.” This requires developing a different algorithm, and the goal would be to combine the two systems. “We will then have to figure out how to assign each query to one system or the other, and combine it all in a single interface that is accessible and easy to use.”

Tiphaine Claveau for I’MTech

 

Being Human with Algorithms: Marc-Oliver Pahl meets Sir Tony Hoare

Marc-Oliver Pahl is a researcher in cybersecurity at IMT Atlantique. In 2018, he launched “Being Human with Algorithms”, a series of video interviews between technicians and non-technicians on the topic of digital transformation. Through open discussions and dialogues, he shows how digital technologies are perceived and how they affect humans as citizens, consumers, workers…

In this episode, Marc-Oliver meets Sir Charles Antony Richard Hoare, winner of the Turing Award in 1980, and creator of quicksort, of formal logic for program verification, and of the CSP language.