OISPG, Pierre Simay, IMT

OISPG: Promoting open innovation in Europe

On January 1st, 2017, Pierre Simay was appointed as the new OISPG Rapporteur. This group of experts from the European Commission supports and promotes open innovation practices, particularly in the context of the Horizon 2020 program.

 

“Today’s companies can no longer innovate alone. They exist in innovation ecosystems in which the collaborative model is prioritized,” explains Pierre Simay, Coordinator for International Relations at IMT. Open innovation is a way of opening research and innovation strategy to external contributions through collaboration with third parties.

The Horizon 2020 framework program pools all the European Union funding for research and innovation. The program has a budget of nearly €80 billion over a 7-year period (2014-2020). Each year, calls for proposals are published to finance research and innovation projects, both individual and collaborative. The European Commission services in charge of Horizon 2020 have established external advisory groups to assist them in preparing these calls. Since 2010, IMT has been actively involved in the expert group on open innovation, the OISPG (Open Innovation Strategy and Policy Group). Pierre Simay, the recently appointed OISPG Rapporteur, presents the group and the role IMT plays within it.

 

What is the OISPG?

Pierre Simay: OISPG is an expert group of DG CONNECT, the European Commission’s Directorate-General for Communications Networks, Content and Technology. Open innovation has grown over the past few years, with the appearance of more collaborative and open models. These models are based, for example, on user participation in research projects and the development of living labs in Europe (the EnoLL network). I should also mention the new research and innovation ecosystems that have emerged around platforms and infrastructures. This is the case for the European “Fiware” initiative which, by making royalty-free software building blocks available to developers and SMEs, seeks to facilitate the creation and roll-out of the internet applications of the future in what are referred to as vertical markets (healthcare, energy, transportation, etc.).

Open innovation covers several concepts and practices – joint laboratories, collaborative projects, crowdsourcing, user innovation, entrepreneurship, hackathons, technological innovation platforms, Fablabs – which are still relatively new and require increasingly cross-sectoral collaboration. Take farms of the future, for example: precision agriculture requires cooperation between farms and companies in the ICT sector (robotics, drones, satellite imagery, sensors, big data…) to deploy and integrate agronomic information systems. OISPG was created in response to these kinds of challenges.

Our mission focuses on two main areas. The first is to advise the major decision-makers of the European Commission on open innovation matters. The second is to encourage major private and public European stakeholders to adopt open innovation, particularly through the broad dissemination of the practical examples and best practices featured in the OISPG reports and publications. To accomplish this mission, OISPG is organized around a panel of 20 European experts from industry (Intel, Atos Origin, CGI, Nokia, Mastercard…), academia (Amsterdam University of Applied Sciences, ESADE, IMT…), and the institutional sector (DG CONNECT, the European Committee of the Regions, EnoLL, ELIG…).

 

What does your role within this group involve?

PS: My role is to promote the group’s work and maintain links with the European Commission experts who consult us on current issues related to the Horizon 2020 program and who seek an external perspective on open innovation and its practices. Examples include the policy being established in the area of digital innovation hubs, and reflections on blockchain technology and the collaborative issues it raises. OISPG must also propose initiatives to improve the definition of collaborative approaches and the assessment criteria used by the Commission in financing Horizon 2020 projects. In Europe, we still suffer from cumbersome and rigid administrative procedures, which are not always compatible with the nature of innovation and its current demands: speed and flexibility.

My role also includes supporting DG CONNECT in organizing its annual conference on open innovation (OI 2.0). This year, it will be held on June 13 and 14 in Cluj-Napoca, Romania. During the conference, political decision-makers, professionals, theorists and practitioners will be able to exchange ideas and work together on the role and impacts of open innovation.

 

What issues/opportunities exist for IMT as a member of this group?

PS: IMT is actively involved in open innovation, with major programs such as those of the Fondation Télécom (the FIRST program), our Carnot institutes and our experimentation platforms (for example, TeraLab for big data). Our participation in OISPG positions us at the heart of European collaborative innovation issues and enables us to meet political decision-makers and numerous European research and innovation stakeholders to create partnerships and projects. It also allows us to promote our expertise internationally.

 

green it, epoc, cloud computing, data center

Data centers: Taking up the energy challenge

Increasingly present throughout the world, data centers consume significant amounts of energy. Researchers at IMT Atlantique have conducted a study that combines renewable energy and electricity grids to power these infrastructures. To reduce energy consumption, scientists are also looking at the network and the anticipation and organization of tasks on the various servers.

 

Accessing this website via a search engine emits four times more CO2 than arriving here by typing the website address or using a bookmark, according to ADEME. [1] While its lack of physical presence makes it easy to forget that digital technology emits greenhouse gases, it is important to remember that data centers emit as much carbon dioxide as civil aviation (2% of global emissions) due to their electricity consumption. This observation is far from trivial, considering the significant increase in the number of data centers around the world. Furthermore, with the arrival of cloud computing, data centers are consuming increasing amounts of energy, leading to ecological and economic problems.

With each new data center that opens, reducing the environmental impact is therefore a key consideration. With the aim of reducing energy consumption, the EPOC collaborative project (Energy Proportional and Opportunistic Computing systems) combines alternative energies and the EDF grid to power single-site infrastructures containing around fifty servers. This purely theoretical project, launched in 2013 and financed by the Labex CominLabs, brings together researchers in computer science and telecommunications from IMT Atlantique. For this Green IT project, the researchers’ skills in computer optimization, virtualization, and software adaptation are combined with expertise in very high-speed optical networks. Let’s take a look at the theory at the heart of this project, while awaiting its implementation.

 

Technical optimization

In the EPOC theory, data centers are connected to the EDF grid while also being powered by renewable energy. This renewable energy provides electricity in a discontinuous manner, unlike the electricity provided by the traditional network. For example, in the case of a photovoltaic panel, there is a surge in electricity when the sun is at its peak, whereas there is no production at night. “Currently, the most expensive aspect of a renewable energy electric system is the batteries used to store the energy that is not consumed. We would like to do away with storage and try to consume the electricity that is produced directly,” explains Jean-Marc Menaud, a researcher specialized in Green IT at IMT Atlantique and the EPOC coordinator. The challenge is therefore to make the best use of this intermittent supply while powering the data centers in a consistent manner over time.

To achieve these goals, the researchers are working in particular on network communication. They have opted for an installation that is entirely connected via fiber optics, which reduces energy consumption during transfers between servers. Transmitting information via fiber optics, in other words via light signals, consumes less electricity than an ADSL connection, which relies on copper cables and equipment that consume electricity constantly.

 

Organizing tasks according to the electrical load

Once they are running and connected to a network, data centers host two types of applications. Some, like search engines, require a constant online connection. Others must simply be completed before a deadline. The researchers can therefore coordinate the applications being run based on the energy received. When green energy is available, they can organize the tasks as they like. Jean-Marc Menaud gives us an example: “At the end of every month, the accounting services must create the pay stubs in PDF format. These files must be available by the 30th of each month, but can be produced any time before this date. They can therefore be created when a high level of green electricity is available.”
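The pay-stub example lends itself to a simple illustration. The sketch below is not code from the EPOC project; the job names, durations and solar forecast are invented. It greedily places deadline-bound batch jobs in the hours with the most forecast renewable production:

```python
# Illustrative sketch: schedule deadline-bound batch jobs into the sunniest
# hours that still let each job finish on time. One server, one job per hour.

def schedule_batch_jobs(jobs, solar_forecast):
    """jobs: list of (name, duration_h, deadline_h).
    solar_forecast: forecast production (kW) per hour.
    Returns {name: sorted list of scheduled hours}."""
    free = set(range(len(solar_forecast)))
    plan = {}
    # Handle the tightest deadlines first.
    for name, duration, deadline in sorted(jobs, key=lambda j: j[2]):
        # Among free hours before the deadline, prefer the sunniest ones.
        candidates = sorted((h for h in free if h < deadline),
                            key=lambda h: solar_forecast[h], reverse=True)
        if len(candidates) < duration:
            raise ValueError(f"cannot finish {name} before hour {deadline}")
        chosen = candidates[:duration]
        free -= set(chosen)
        plan[name] = sorted(chosen)
    return plan

forecast = [0, 0, 1, 3, 5, 6, 5, 3, 1, 0]   # kW over hours 0..9
jobs = [("payroll_pdfs", 2, 10), ("log_archive", 1, 6)]
print(schedule_batch_jobs(jobs, forecast))
# {'log_archive': [5], 'payroll_pdfs': [4, 6]}
```

Both jobs end up in the midday hours of peak production, which is exactly the behavior the EPOC researchers describe for deferrable tasks.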

In addition to the number of tasks in progress, it is also possible to adjust the applications that are always online. The goal of a data center is to constantly ensure a certain quality of service, defined in an SLA (Service Level Agreement). This is a question of software elasticity, meaning the ability to adapt an application’s operation according to the available energy. Take, for example, a website used to calculate a route from one point to another. Its SLA consists of providing a route within a given time frame. If the available electricity is low, the website will only meet this basic requirement. If, however, the share of green electricity is high, it can provide alternative routes, improving the service provided.

 

Regulating the workload

Reducing energy consumption also involves reducing the number of servers that are running. Before a server is turned off, the applications running on it must be transferred to another server. To do this, the researchers rely on virtual machines: to relieve the workload on a server, they can either suspend a calculation or migrate it to another server as a virtual machine. Scheduling tasks on the servers in this way is a complex problem. It is, above all, a question of placement and distribution.

Jean-Marc Menaud explains: “This principle of placement is similar to the principle involved in filling a backpack. Imagine you are leaving for a trek with a 60-liter backpack. You can choose from a wide variety of food items to take with you. Each food item has a calorific value, a volume and a weight. Your goal is to gather a maximum amount of calories within the limitations of a backpack with a set volume, while minimizing the final weight. The solution is easy when there are only 5 food items. But if you have 10,000 items, the problem becomes more complex, because it is impossible to test all the possibilities. Here we have a similar situation. A server is a backpack that can contain a certain quantity of virtual machines. We must maximize the service provided (the calories) and minimize the energy (the weight).”
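The backpack analogy is the classic 0/1 knapsack problem, which dynamic programming solves exactly for moderate sizes. The sketch below is a textbook version with invented VM sizes and service values, not code from EPOC:

```python
# 0/1 knapsack by dynamic programming: pick virtual machines (items) so the
# service they provide (value) is maximized without exceeding a server's
# capacity. VM names, sizes and values are invented for the example.

def knapsack(items, capacity):
    """items: list of (name, size, value). Returns (best_value, chosen names)."""
    # best[c] = (value, names) achievable with capacity c
    best = [(0, [])] * (capacity + 1)
    for name, size, value in items:
        # Iterate capacities downward so each item is used at most once.
        for c in range(capacity, size - 1, -1):
            cand_value = best[c - size][0] + value
            if cand_value > best[c][0]:
                best[c] = (cand_value, best[c - size][1] + [name])
    return best[capacity]

vms = [("web", 4, 40), ("db", 3, 50), ("batch", 2, 15), ("cache", 5, 45)]
value, placed = knapsack(vms, 8)
print(value, placed)  # 95 ['db', 'cache']
```

With thousands of virtual machines and many servers the exact approach blows up, which is why, as the quote suggests, practical placement relies on heuristics rather than exhaustive search.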

 

SeDuCe: a full-scale test data center

The last area the EPOC researchers are studying is anticipation. Predicting an application’s energy needs, combined with the electricity production forecast, is the key to responsible consumption. This is an aspect that will be directly tested in 2017, with the implementation of the CPER [2] SeDuCe (Sustainable Data Center) project, the logical follow-up to three years of theory. It will enable a test single-site data center to be established, powered by photovoltaic panels. This complete infrastructure will enable the practical analysis of the theories being developed through EPOC. “This type of site is rare in France. They are only beginning to emerge at the industrial level. With photovoltaic panels that are increasingly affordable, we will be able to test the hypotheses of this advanced research directly. The site should be operational by the summer of 2017,” Jean-Marc Menaud concludes.

 

[1] ADEME: The French Environment and Energy Management Agency (Agence De l’Environnement et de la Maîtrise de l’Energie)

[2] CPER: French State-Regional Plan Contract (Contrat Plan Etat-Région)

 

Piezoelectric, Cédric Samuel, IMT Lille Douai

Connected devices enter the piezoelectric generation

Piezoelectric polymers may be key in the manufacturing of future generations of connected devices. Thanks to their capacity to generate electricity under mechanical stress, they could greatly reduce the energy consumption of intelligent sensors and make them autonomous. But in order to achieve this, researchers must be able to make these polymers compatible with classic production techniques in the plastics processing industry. All the possibilities are being considered, from 3D printing to bio-based materials.

 

Certain materials are able to generate an electric charge on their surface when deformed. This property, called piezoelectricity, has proven to be of particular interest in the field of connected devices. “Wearables”, intelligent devices worn on the body such as exercise trackers, are regularly subjected to torsion, flexion and compression. By using piezoelectric materials in their design, there will be less need for batteries, reducing charging frequency and consequently increasing autonomy. Piezoelectricity, although limited to ceramics since its discovery at the end of the 19th century, is gaining ground in the polymer sector thanks to growing demand for flexible or transparent connected devices.

Nevertheless, these new plastics “will not compete with ceramics in their fields of application,” warns Cédric Samuel, materials researcher at IMT Lille Douai (formerly Mines Douai and Télécom Lille). The coefficients that quantify the electricity produced by the piezoelectric effect are more than 15 times smaller for polymers than for ceramics: “30 picocoulombs per newton for the most efficient polymers, compared with 500 for ceramics,” the scientist explains. But connected devices do not need high coefficients, since they only consume a small amount of energy. On the other hand, they require materials that are inexpensive to manufacture, a specification that piezoelectric polymers would meet if researchers could make them compatible with classic production techniques in the plastics processing industry.
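For a sense of scale, the coefficients quoted above can be plugged into the simplest form of the direct piezoelectric effect, Q = d × F. This is a deliberately simplified model that ignores geometry, electrode area and the mode of deformation:

```python
# Back-of-the-envelope use of the coefficients quoted in the article:
# in the direct piezoelectric effect, charge generated is roughly Q = d * F
# (longitudinal effect only; geometry and electrodes are ignored).

D33_POLYMER = 30e-12   # C/N, the most efficient piezoelectric polymers
D33_CERAMIC = 500e-12  # C/N, typical piezoelectric ceramics

def charge_coulombs(d, force_newtons):
    """Charge generated for a given applied force."""
    return d * force_newtons

# A 10 N press (roughly the weight of a 1 kg mass):
print(charge_coulombs(D33_POLYMER, 10))  # ~0.3 nC for the polymer
print(charge_coulombs(D33_CERAMIC, 10))  # ~5 nC for the ceramic
```

The nanocoulomb-scale charges explain why these materials suit low-power sensors rather than bulk power generation.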

The researchers’ challenge – and it is a considerable one – lies in the processing and shaping of such materials. PVDF, which is currently the most efficient piezoelectric polymer, is far from easy to process. “Only a single type of PVDF crystal — the beta form — has piezoelectric properties,” Cédric Samuel explains. To obtain this form, PVDF must be deformed by more than 200% by stretching, at temperatures between 90 and 100°C. “This requires numerous processing and post-processing stages, which complicates the process and increases production cost” the researcher continued. Alternative options must be found in order to obtain a large-scale and inexpensive processing and shaping solution for piezoelectric PVDF.

PVDF crystals, a piezoelectric polymer with high potential

Researchers are exploring various other possibilities. Working with the University of Mons (Belgium) through a jointly supervised PhD thesis, IMT Lille Douai is concentrating in particular on polymer blends combining PVDF with another plastic: PMMA. This provides two advantages. Not only is PMMA less expensive than PVDF, but the combination allows a piezoelectric form of PVDF to be obtained directly through extrusion. Scientists thereby skip several stages of processing. “The downside is that it leads to a lower piezoelectric coefficient,” Cédric Samuel points out, before adding, “but then again, applications for piezoelectric polymers do not necessarily need huge coefficients.”

 

Piezoelectric polymers through 3D printing

Although polymer blends are an option worth studying to improve processing of piezoelectric PVDF, they are not the only possible solution. Through the Piezofab project, which involves the two Carnot institutes of the IMT (M.I.N.E.S Carnot institute and Télécom & Société numérique Carnot institute) alongside IMT Atlantique (formerly Mines Nantes and Télécom Bretagne) and IMT Lille Douai, researchers are aiming to create sensors and electric generators from piezoelectric polymers through 3D printing. “We seriously believe we can succeed, because we have sufficient background on polymer-based additive manufacturing thanks notably to the expertise of Jérémie Soulestin on the subject,” declares Cédric Samuel confidently.

Researchers at IMT Lille Douai will endeavor to test the feasibility of the process. To do so, they will work on a modified form of PVDF supplied by their partner PiezoTech, a company which is part of the Arkema chemicals group. This PVDF has the specificity of crystallizing directly in the piezoelectric form when manufactured using 3D printing. Although the cost of the modified polymer is greater than that of its standard form, the manufacturing process could significantly reduce the quantities used.

This inter-Carnot project will lead researchers to study the relevance of piezoelectric polymers for connected devices. IMT Atlantique’s task will be to incorporate piezoelectric polymers into radio transmitters and characterize their properties during use. “One of their greatest strengths is the integration of systems for specific applications, such as monitoring individual exercise” the researcher explained, referring to work carried out by Christian Person.

 

Piezoelectric materials can also be bio-based!

In the two previously-mentioned options currently being studied by Cédric Samuel and his colleagues, the common factor is PVDF. However, “PVDF is an engineering polymer, which remains expensive compared to commodity polymers,” he underlines. “Ideally, we would like to be able to use the commodity polymers of the plastics processing industry, preferably bio-based.” To achieve this, IMT Lille Douai is directing a cross-border European project called Bioharv, which brings together academic partners in France and Belgium. The Universities of Mons, Lille and Valenciennes, as well as Centexbel, a scientific center specialized in the textiles industry, are working alongside the graduate school.

 

Making prototypes using piezoelectric textile fibers.

 

The researchers are most interested in two bio-based polymers, or bioplastics: polyamide 11 and polylactic acid (PLA). The first has proven piezoelectric properties, albeit much weaker than those of PVDF. For the second, the question is whether it can in fact generate electric charges at all. “Certain scientific articles lead us to suppose that polylactic acid is a promising option, but there has not yet been a clear demonstration of its piezoelectricity,” Cédric Samuel explains. To provide one, the scientists must obtain PLA in its semi-crystalline form. “It’s a stumbling block, as PLA is currently not easy to crystallize,” the researcher continues.

The Bioharv project is organized in several stages, gradually developing increasingly effective generations of piezoelectric polymers. It reflects a dual regional research dynamic focusing on both new textiles and the use of natural resources for designing the materials of tomorrow. The stakes are high because the petrochemical industry will not always be able to meet an increasing demand for polymers. Since PLA is produced using agricultural resources, connected devices in the future may be able to be made using corn or potatoes, rather than oil.

 

Radioactivity, SMART, IMT Atlantique

SMART: radioactive expertise at the service of the environment

Nuclear power stations are not the only sources of radioactivity in France. Hospitals and the construction industry, for example, are also notable sources. Given the potential dangers to health and the environment, such sites have to be tested permanently. SMART, a team of experts in radioactivity from SUBATECH, is a necessary third-party actor in this domain. The team brings objectivity and quality to field measurements and laboratory analyses. It applies the regulations on radioactivity and thereby contributes to monitoring the environment.

 

SMART. No, this isn’t an article about the latest model of the famous subcompact car, but the Service for the Measurement and Analysis of Radioactivity and Trace elements. This team of experts (technicians and engineers), which is part of SUBATECH,[1] is not restricted by the usual laboratory and research frameworks. This freedom gives it a broad scope for responding to very specific questions. “We can answer any question that concerns the measurement of radioactivity,” explains Isabelle Deniau, senior technical engineer at SMART. So, if you want to determine the true nature of a diamond, or find out the origin of a wooden floor, SMART’s doors are wide open and its advice is readily available.

But the range of offerings proposed by these radioactivity experts goes further than answering such unusual questions. Any organization using radioactive sources, such as Engie, AREVA, the CEA, hospitals or the construction industry, can benefit from SMART’s services. These direct or indirect actors in the industry of nuclear energy are subject to strict regulations in terms of tests and waste in order to protect citizens’ health and the environment.

 

Radioactivity tests for monitoring the environment

When we talk of radioactivity, the image of a nuclear power plant often comes to mind. This is hardly surprising in France, where nuclear power is the leading source of electricity, accounting for 77% of all power generation (according to the Electricity Transmission Network – RTE). Nuclear power stations are tested regularly because of the risks they present. The SMART laboratory works directly in the field to analyze nuclear waste and carry out impact measurements, notably for the ASN (the French Nuclear Safety Authority). “Broadly speaking, the laboratory contributes to monitoring the environment in France,” explains Isabelle Deniau.

“We conduct analyses of grass, moss, leaves… anything that may absorb radioactive compounds through exchange with the atmosphere. We also take water samples,” the engineer adds. These samples provide information on potential pollution from radioactive isotopes released into the environment as liquid or gaseous effluents. Specialists call these samples bioindicators: biological indicators that reveal the presence or absence of contamination.

Samples are never taken at random on the ground. Analyses are conducted upstream and downstream of nuclear power plants and then compared to evaluate the impact of the effluent releases. The choice of samples depends on the radioactive isotopes being searched for. The experts must take account of the accumulation capacity of each bioindicator. “Since we entered the era of nuclear power it has become difficult to differentiate between natural radioactive isotopes and those connected with human activity, because some of these compounds are already present in the environment”, Isabelle Deniau explains.

Up-to-date knowledge of the behavior of radioactive elements in the areas studied is vital, since the compounds differ depending on the environment in which they have accumulated. Cesium, for example, is a typical compound of nuclear activity which is deposited among sediments in watercourses. It can be transferred to algae, then consumed by fish, eventually accumulating in their muscles. “In these cases, it is no longer the physical half-life which interests us, but the biological half-life.”
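The two notions combine into what radiation protection calls the effective half-life, 1/T_eff = 1/T_phys + 1/T_bio. The numerical values below are illustrative orders of magnitude, not figures from the SMART team:

```python
# The physical and biological half-lives mentioned above combine into an
# effective half-life: 1/T_eff = 1/T_phys + 1/T_bio.

def effective_half_life(t_physical, t_biological):
    """Both inputs in the same unit; result in that unit."""
    return (t_physical * t_biological) / (t_physical + t_biological)

# Cesium-137: physical half-life ~30 years; assume a biological half-life
# of ~100 days in an organism (an illustrative order of magnitude).
t_phys_days = 30 * 365.25
t_bio_days = 100.0
t_eff = effective_half_life(t_phys_days, t_bio_days)
print(round(t_eff, 1))  # ≈ 99.1 days
```

When the biological half-life is much shorter than the physical one, as here, elimination by the organism dominates: the effective half-life is close to the biological value, which is why it matters more to the analysts.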

 

Secure and monitored demolition

These measurements are effective for nuclear power stations in operation, but such sites have a limited life-span. They were designed to last between 25 and 40 years, although this period has been extended in recent years thanks to advances in knowledge and techniques that ensure the safety of these infrastructures. But a time will come when these power stations will have to be demolished, and here too the measurements are taken by the SMART team. “Regarding demolition, our aim might be to determine the level of radioactivity of demolition waste which will later be sent to ANDRA (the French National Agency for Radioactive Waste Management), or to check that no radioactivity remains on the site,” says Isabelle Deniau.

The rehabilitation approach for each of these potentially dangerous sites is taken very seriously at every stage of the process. Before demolition, the experts evaluate the radioactivity of the waste which will be produced. After the demolition, they measure the level of residual radioactivity. “We can also act as third-party experts on demolition sites such as Fort de Vaujours. This constitutes an additional control”.

 

Reputed and award-winning practices

“We are accredited for sampling and analyses by COFRAC, an organization that accredits laboratories and certifies their measurements. This provides a guarantee that the results follow a standardized procedure and that the sampling and measurement techniques have been validated. It means that if we repeat a measurement, our result will always be the same. We undergo regular audits for this accreditation,” explains Isabelle Deniau.

Sampling for radioactivity analyses is governed by strict measurement constraints. In the case of water samples, for example, scientists must take samples from the middle of the watercourse and not from zones where the water is stagnant. Certain precautions must also be taken, such as rinsing the vials with sample water, or filling them right to the brim to prevent gas exchange. These precautions are all necessary for reliable and representative results.

“COFRAC certification is reassuring for our clients. It certifies the reliability and quality of our work”, Isabelle Deniau adds. It is important to have an independent actor like SMART. The fact that it is not associated with the organizations producing radioactivity means it can carry out safety measurements without bias. The service adheres to strict regulations that guarantee objective results, and therefore plays an essential role on a societal level.

 

[1] SUBATECH is a research laboratory co-operated by IMT Atlantique, the Institut National de Physique Nucléaire et de Physique des Particules (IN2P3) of CNRS, and the Université de Nantes.

Entrepreneur, Thomas Houy, Télécom ParisTech

What is the truth behind the myth of the entrepreneur?

A much more complex and less glamorous truth lies behind the storytelling of the visionary entrepreneur, the inspired genius with the revolutionary idea of the century. Thomas Houy, researcher at Télécom ParisTech, breaks down the stereotypes surrounding this false image of the exceptional strategist. Following his presentation during the IMT symposium on businesses, the economy and society in November 2016, he revisits the reality of an entrepreneur who is above all closely attentive to demand and market data.

 

Even after his death, Steve Jobs embodies the figure of the entrepreneur. Often presented — or represented — as an innovative genius, he symbolizes the power of the visionary, the providential figure able to provide solutions to the problems of a world that only he can understand. Far from being unique to the former CEO of Apple, this rose-tinted perspective seems to apply to all successful entrepreneurs, who supposedly build long-term strategies on ideas that are as sudden as they are disruptive. The only problem with this belief is that it is wrong.

Thomas Houy, researcher and head of the Student Entrepreneurship in a Digital World Chair at Télécom ParisTech, studies the characteristics of contemporary entrepreneurship, a driver of the digital transition. He believes that multiple clichés need to be broken down, first and foremost the belief that it is a question of being visionary, that a person almost needs a gift of foresight to be a successful entrepreneur. While he does not consider that success comes down to chance, he simply points out that “entrepreneurship is almost completely obscure. Nobody can really predict its success. In the face of so many uncertainties, the most practical solution is to iterate as much as possible with the market.”

This ability to iterate has improved thanks to digital technology. 3D printing, for example, allows a product to be prototyped very quickly and tested among users, before returning just as quickly to the prototype phase if the solution is not suitable. Even better, data collected from users of online services provides a gauge of the success or failure of a platform. In short, “it is now much easier to test a market and understand users,” Thomas Houy observes.

It is this capacity for rapid feedback between consumers and enterprises that explains success — which is built on the basis of repeated failures — rather than a long-term strategy or plan. This is one of the advantages of start-ups over big groups in digital technology. “Big businesses function with business plans and projections over several years; they are less able to let themselves be guided by a sort of serendipity, in which the organization accepts the fact that it does not know where it will end up,” the researcher explains. Start-ups, on the other hand, are able to do this thanks to their short history and close relationship with their first clients.

 


In May 2015, Bloomberg illustrated an article about Elon Musk and his SpaceX business using this visual from The Red Dress. A symbol of the mythologization of an entrepreneur raised to a position of pioneer in the conquest of Mars. Credits: Bloomberg / The Red Dress.

 

“Start-ups focus their efforts on finding what is known as the ‘product market fit’, the stage at which the developed solution meets the demand of the market. Start-ups can even use tricks such as building fake products to test the market’s appetite for their solution,” Thomas Houy explains. He uses the example of Dropbox to illustrate this point. This start-up, which offers a system for storing and sharing files via the cloud, found its users before it even had a marketable product. “The founder of Dropbox created a fake video over the space of a weekend to present his product, without mentioning that it didn’t exist yet,” the researcher told us. “He was immediately inundated with emails. He had found his market without even having a product.”

This fast roll-out of solutions in the new digital economy makes business plans obsolete. The acceleration of the rate of digital transformation also prevents any form of prediction over the medium or long term. When Tinder entered the dating applications market, all the existing actors were wiped off the board within the space of a few months, nullifying any 5-year projections. “Business models have replaced business plans in the digital economy today,” Thomas Houy adds, “because nobody can predict what will happen in 6 months’ time.”

 

Being the first is not enough to be successful

Innovations can break through quickly, but this is no reason to believe that an innovation is an idea that strikes like lightning in the mind of an exceptional individual. In fact, “all ideas already exist,” the researcher believes. “If an idea doesn’t already exist, it’s probably because it’s a bad one. This is also what investors say,” he continues. Twitter was by no means the first micro-blogging website; it arrived ten years after the concept was first invented. Having the idea first is not a predictor of success. What counts is the quality of execution. Twitter’s strength lies in the constant attention it pays to users’ perceptions and the way it adapts to how they use it.

Does that mean a good entrepreneur is a good copier, able to draw inspiration from others’ successes to develop a better solution? “People who copy successes often copy the wrong things,” Thomas Houy warns. The creator of a social network may be tempted to copy Facebook and will try to isolate what characterizes it, such as its agile methodology based on frequent updates. “The problem is that among all the people who have used agile methodologies, thousands have got it wrong, so that is not what is behind Facebook’s success,” the researcher explained. “Good entrepreneurs take inspiration from others’ mistakes just as much as from their successes,” he added.

It is partly this need to take a step back that explains why good businesses in a given field are not necessarily set up by those already in its inner circles. “Statistically speaking, entrepreneurs with knowledge of the field are more successful, but this is not the case among disruptive enterprises,” the researcher observed. This paradox stems from the fact that once the mentality of a sector or market has been adopted, it is difficult to break away and change perspective. He illustrated this with an example he knows well: teaching. “When teaching staff create MOOCs, they generally do so from an academic point of view,” he explained, “but we are much less effective at holding the attention of an audience behind their screen than someone from the audio-visual or media sectors who is familiar with attention economics.”

The reality of entrepreneurship is in fact often a far cry from the widespread vision of it. The storytelling surrounding the concept does not help paint an accurate portrait of the entrepreneur. Unfortunately, “it is a lot simpler and more attractive to describe an entrepreneur as having the gift of ubiquity than to say that they are a humble person who does not make forecasts based on the strength of their ideas but instead constantly looks to data for guidance,” comments Thomas Houy.

 

[box type=”shadow” align=”” class=”” width=””]

Transmitting good entrepreneurial practices to students

One common belief about the qualities of an effective entrepreneur is that you have to scale up an innovation quickly. Thomas Houy is regularly confronted with this misconception in the projects he leads with his students through the Student Entrepreneurship in a Digital World Chair, financed by LVMH. “My students often want to solve the problem of boredom on public transport,” he explained. “Several times, they have come up with the idea of an ephemeral social network allowing travelers to contact each other on the bus or metro. They all have the same thought: ‘it’s a really good idea because 10 million people use the metro every year in France.’ Except that there’s no such thing as ‘the people on the metro’; there are as many different needs as there are individuals. In the morning, a student going home from a night out may be sitting on the metro next to an executive rehearsing a project presentation in his head. Trying to scale up quickly means trying to satisfy everybody, and trying to satisfy everybody means satisfying nobody. Good entrepreneurs target niches, learn to satisfy their niche perfectly, and only scale up by addressing new categories one at a time.”[/box]