Recovering uranium without digging: in situ leaching

Given the growing economic value of underground resources and the environmental problems caused by disused mines, research into alternative ways of extracting raw materials is expanding rapidly. One solution being studied is in situ leaching for recovering uranium. During the Natural Resources and Environment conference that took place November 5-6, 2014 at Institut Mines-Télécom, Vincent Lagneau, a researcher at the Mines ParisTech Research Center for Geosciences, presented the results obtained by the “Reactive Hydrodynamics” team in the field of predictive modeling.


In Situ Leaching (ISL) is a process for dissolving easily leachable metals, such as copper and uranium, directly in the deposit. Using a series of injection and production wells, an acid solution called a lixiviant is injected into the subsoil, then pumped back out around ten meters further on. “To carry out an in situ leaching operation, a porous, permeable and ideally confined environment is required,” explains Vincent Lagneau, a researcher at Mines ParisTech. “The lixiviant solution must be able to circulate while avoiding leaks, which represent both an investment loss and an environmental risk.” At the production wellhead, all that remains is to separate the target minerals from the waste.

This alternative solution is well suited to the uranium deposits currently being developed: deep, extended deposits, 10 to 20 kilometers long, with low-grade uranium, that cannot be exploited through open-cast or underground mining. “It has been so successful that industrialists have decided to invest: 40% of the world’s uranium is produced using in situ leaching,” primarily in Kazakhstan and Australia.


Optimizing the technique and assessing environmental impacts

However, this technology, developed in the early 1960s, remains empirical. The researchers are currently working on optimizing it. “This involves rationalizing it. If we understand the processes, we can find the right levers to reduce operating costs, increase the quantity of recovered uranium, and improve the retrieval speed.” Currently, it takes three years to recover 90% of the uranium from a deposit.

The research also involves an environmental aspect: “When we end the operation after a three-year period, there is still acid everywhere. We can use our tools to try to understand what happens to the site afterwards.” They must determine how long it will take for the site to return to its initial state.


Research combining hydrogeology and geochemistry

“We can’t look and see what’s happening 400 meters down; we don’t know where the uranium is, nor how it behaves.” To gain an understanding of these processes, the “Reactive Hydrodynamics” team develops models combining hydrogeology and geochemistry. The chemical reactions that take place between the injection well and the production well are directly linked to the transport of the acid solution and the dissolved elements. The reactions also interfere with each other in space and time, due to the flow of water.

Once the processes have been identified, the researchers convert them into equations: the dissolution of uranium, the consumption of acid, pressure differences between the wells, the water velocity in the environment… The equations, integrated into the HYTEC code developed by Vincent Lagneau and his team, provide numerical results, such as the concentration of uranium and the quantity of acid consumed, which can be compared with the operator’s observations at the wellhead. If the model is correct, it can then be used to understand what happens in the injection well, or between the two wells, and to test other operating scenarios. Half of the researchers’ work involves developing models, and the other half involves applying them. “This balance is very important to us, because each part inspires progress in the other. By applying the models, we can identify the needs, improve our code, and therefore carry out better studies, or studies we could not carry out before.”
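
To make the coupling concrete, here is a minimal sketch of the kind of calculation such models perform: one-dimensional transport of an acid front that dissolves uranium along its path. It is purely illustrative – the velocity, rate constant and grid are assumed values, and this is in no way the HYTEC code itself.

```python
import numpy as np

# Minimal 1D sketch of coupled transport and chemistry between an
# injection well (x = 0) and a production well (x = L). Real codes such
# as HYTEC handle 2D/3D flow and full geochemistry; every number below
# is an illustrative assumption.

L, n_cells = 50.0, 200        # well spacing (m) and grid resolution
dx = L / n_cells
v = 0.5                       # water velocity (m/day), assumed
k = 0.05                      # first-order dissolution rate (1/day), assumed
dt = 0.5 * dx / v             # time step respecting the CFL condition

acid = np.zeros(n_cells)      # lixiviant concentration (normalized)
uranium = np.ones(n_cells)    # solid uranium remaining (normalized)
dissolved = np.zeros(n_cells) # uranium in solution (normalized)

for _ in range(int(3 * 365 / dt)):               # simulate ~3 years
    acid[0] = 1.0                                # constant injection at the inlet
    # upwind advection of the two aqueous species
    acid[1:] -= v * dt / dx * (acid[1:] - acid[:-1])
    dissolved[1:] -= v * dt / dx * (dissolved[1:] - dissolved[:-1])
    # chemistry: acid dissolves solid uranium where both are present
    reacted = k * dt * acid * uranium
    uranium -= reacted
    dissolved += reacted
    acid -= reacted                              # acid is consumed by the reaction

print(f"Uranium recovered after 3 years: {1 - uranium.mean():.0%}")
print(f"Dissolved uranium reaching the production well: {dissolved[-1]:.3f}")
```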


A predictive model used successfully

The AREVA in situ leaching operation site in Kazakhstan (Source: Vincent Lagneau).

This work has been successful: Vincent Lagneau’s team, in partnership with AREVA, has developed a predictive model that was put into operational use two years ago in Kazakhstan. “Today, if someone gives me a site that has not yet been operated, I run my model and can tell what the production curve will be like and what the environmental impacts will be. It is really a key result for us.”

Eventually, the tool developed by Vincent Lagneau and his research team will make it possible to choose the best location and injection solution for optimizing a site’s operation, and to assess its impacts in advance. “We are now applying the model to prospective sites in Mongolia: we assess the circulation of the fluids in the environment up to the zones in which they could rise to the surface (wells, faults), and the changes in their quality as they travel through the system (reduction of the acidity and residual uranium fixation).”


Recycling concrete and sediment to create new materials

How can we meet the needs for construction materials in an environmentally responsible manner? Recycling could be the solution, but it is not yet easy to create high-performance, eco-friendly materials from waste. At the conference on Natural Resources and Environment, which took place on November 5-6, 2014 at Institut Mines-Télécom, Vincent Thiéry, a researcher at Mines Douai, presented two aspects of the research developed in the Civil and Environmental Engineering department on designing the concrete of the future.


It is increasingly difficult to find high-quality raw materials for producing materials like concrete. At the same time, we are generating more and more waste that we do not know how to dispose of – industrial by-products, concrete from building renovations or demolitions, and sediments that clog ports and canals. Vincent Thiéry and his colleagues are therefore working to design alternative materials, such as concretes and cements, using these new raw materials whose potential remains unexplored: “On the one hand, we could generate less waste, and on the other hand, we wouldn’t have to use as many natural resources.”


Recycling sediments and old concrete

“Our mission is to create high-performance materials that are environmentally friendly. This can be accomplished through recycling, by incorporating a certain amount of waste into these materials.” The sediments are retrieved by dredging ports and canals every 10 to 20 years, which produces large quantities of material. The construction materials sector has many uses for the recovered sediments: “We will try to integrate them into road construction, prefabricated concrete (cast concrete blocks or concrete for street furniture), landscaped mounds, embankments, and artificial aggregates for reinforcing beaches.” Another possibility is recycled concrete aggregate: 300 million tonnes of construction and demolition waste are generated each year, which can be reused as aggregate for producing new concrete, even though this type of aggregate requires much more water than natural aggregate. Better yet, “certain industrial by-products and waste – typically those generated by the steel industry – are resources used in the cement industry: the integration of these materials can produce attractive results, such as resistance to seawater and good mechanical strength.”
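
A rough sketch of the water issue mentioned above: the cement paste attached to recycled aggregate absorbs far more water than natural aggregate, so the batching water must be corrected to keep the effective water-to-cement ratio constant. The absorption values below are typical orders of magnitude assumed for illustration, not measurements.

```python
# Hedged sketch of the water correction for aggregate absorption.
# All figures are assumed, illustrative values.

cement = 350.0       # kg of cement per m^3 of concrete (assumed mix)
eff_water = 175.0    # kg of water available for hydration (w/c = 0.50)
aggregate = 1800.0   # kg of aggregate per m^3 (assumed)

for label, absorption in [("natural aggregate", 0.01),
                          ("recycled aggregate", 0.055)]:
    # add the water the aggregate will soak up on top of the effective water
    batch_water = eff_water + absorption * aggregate
    print(f"{label}: batch {batch_water:.0f} kg of water "
          f"(effective w/c kept at {eff_water / cement:.2f})")
```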


Recycled concrete aggregate (diameter: approximately 1 cm). Black components: a natural aggregate. Gray components: cement paste. These two components behave differently when they are integrated into new concrete; they must be correctly characterized and quantified.

Recycling challenges: sustainable formulations

“Scientific and technical challenges arise in relation to durability.” The dredged sediments contain heavy metals and organic pollutants, which must not be released into the environment: the stabilization of the recycled materials must therefore be ensured. “Concrete that is poorly made can develop certain pathologies – which appear in the form of swelling, flaking, or crumbling – and can no longer be used for its intended purpose.” A very well-known pathology – the alkali-silica reaction – has been one of the Civil and Environmental Engineering (GCE) department’s areas of expertise for around fifteen years. Not all types of aggregate cause this pathology, but for some of them, their specific mineralogy must undergo extensive analysis to ensure they will not react. “Eventually, we will no longer have a choice; we will have to find ways of using them anyway. Fortunately, several types of industrial waste make it possible to recycle aggregates that could otherwise generate pathologies.”

An example of microscopy applied to cementitious materials: a thin section of concrete. The colored components are the aggregates; the black part is the cement paste. The use of traditional optical microscopy (thin petrographic sections) makes it possible to compare the different components, thus facilitating their identification.

Prior research is required in order to find the right concrete formulation for the intended use. “An extremely precise characterization of the recycled aggregate will be required before it can be integrated into new concrete.” Vincent Thiéry works on characterizing the materials, both natural and recycled. In other words, he uses microscopic observation to interpret the arrangement and properties of the minerals that compose them. “We know that in certain aggregates, we will need to look for a specific type of mineral, in a specific form, to determine if there is a risk of the concrete developing a pathology.”

The laboratory has also developed an experimental method for measuring the percentage of cement in recycled concrete, based on dissolving the hardened cement paste in salicylic acid. The lab also features a mechanical characterization center and a digital simulation center.
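
The principle is a simple mass balance: the hardened paste dissolves in the salicylic-acid treatment while the aggregate does not, so the mass lost during the attack gives the paste content. A minimal sketch with made-up figures, not laboratory data:

```python
# Selective-dissolution mass balance (illustrative values only).
sample_mass = 20.00   # g, crushed recycled-concrete sample (assumed)
residue_mass = 14.60  # g, insoluble residue after the acid attack (assumed)

paste_fraction = (sample_mass - residue_mass) / sample_mass
print(f"Cement-paste content: {paste_fraction:.1%}")  # -> 27.0%
```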


Partnerships to move from fundamental research to applied research

The EcoSed (Sediments in a Circular Economy) Industrial Chair, launched in April 2014 by Mines Douai, will carry out partnership-based research over a 5-year period on the management of dredging sediments (from ports and canals). The tonnages are significant – around 50 million metric tonnes per year in France. “It involves more fundamental aspects, like sediment characterization, before moving towards extremely practical and applied aspects”: developing materials, and improving knowledge of sediment-based concretes and of their behavior in road geotechnics. It is one of the Civil and Environmental Engineering (GCE) department’s flagship projects. The department is also participating in the national Recybéton project, which studies the use of materials from concrete recycling sites through experiments in the laboratory and at experimental construction sites.


Geosciences and the environment: the challenges of geological storage

Geological storage is a field of expertise offered by the Institut Mines-Télécom schools’ research centers. During the Natural Resources and Environment conference held on November 5-6, 2014 at Institut Mines-Télécom, Vincent Lagneau, Assistant Director of the Mines ParisTech Research Center for Geosciences, spoke at a plenary session on “Underground storage and recycling”. He presented the benefits and key R&D issues relating to storage in geological environments.


Gas, liquids and waste can be stored beneath our feet, in man-made cavities and in the voids of natural geological formations. In France, we have been storing natural gas underground for 60 years. “The underground environment offers two attractive assets for storage: space and duration.” The volume available underground allows enormous quantities of CO2, for example, to be stored. And the required storage times are long: between 1,000 and 10,000 years for CO2, and between 100,000 and 1 million years for radioactive waste. “We do not have any example of human constructions that last for such a long period of time. However, we have examples of geological structures that are much older than these artefacts.”


Understanding the impacts and creating reliable predictive models

“Because we must use and disturb the underground environment for these storage purposes, we are faced with the difficulty of proving that these storage facilities are durable and do not impact the rest of the environment over the very long term.” However, while many studies have been carried out, they are limited in duration: the longest range from 10 to 20 years, and they are very rare. “Major scientific and technical challenges need to be overcome in order to understand the mechanisms,” and thus to extrapolate how the storage sites will evolve in the very distant future. Multidisciplinary teams are striving to understand the processes taking place in the storage areas: radioactive waste at Mines ParisTech, Mines Douai and Mines Nantes, and acid gases such as CO2 at Mines Saint-Étienne. They then establish predictive mathematical models that are used to determine the impacts, extrapolate the data observed in situ, and optimize the storage operations.


Challenges of storing renewable energies

Recent developments in the storage field are radically changing the current conception of underground storage. As renewable energies develop, society will face the problem of matching electricity supply and demand: a wind turbine produces when there is wind, not when we need electricity. The underground environment can also provide interesting opportunities for addressing this energy storage problem: “We can store energy by transforming electricity into natural gas, which can then be burned in a classic thermal power plant. The Mines schools are also working on storing energy via compressed air.”

In this scenario, storage no longer means holding large quantities over a very long period of time. The cavity is filled and emptied according to the rate of energy production and consumption. As a result, “the storage area will be used very frequently.” This is an altogether different matter, forcing researchers to think about storage differently. “During the cycles we are considering, we will need to inject large quantities during the limited hours of production, which leads to mechanical problems.” This is why specialists from all the earth sciences – chemistry, geology, hydrogeology and geomechanics – have come together in the schools’ research centers to understand the impacts of these new storage methods and model their development over the long term.


Automated learning from data

Big Data is not only a technological issue, but also a societal one. Aware of the value of data, Institut Mines-Télécom has made it a major field of research, because new approaches are needed to store, share and make optimal use of data. Machine Learning is one such approach. It is being researched by Stéphan Clémençon, Professor at Télécom ParisTech.


Several zettabytes of data are generated each year – the equivalent of several thousand billion billion bytes – and while this supply of data enables the creation of new services, it also considerably modifies our needs and requirements. Yesterday’s tools are outdated, and new ways of putting this abundance of data to good use must be found. This is the aim of Machine Learning, a discipline that combines mathematics and computer science to create algorithms for processing big data, with a large number of industrial applications.

Télécom ParisTech recruited Stéphan Clémençon, a mathematician specializing in modeling and statistics, to help develop Machine Learning. As he explains, “when you start to deal with very large quantities of data, you enter the realm of probability”. This field was long neglected, especially by engineering schools where mathematics was taught in a very deterministic way, but students are becoming more and more interested in it, which is fortunate, for “questions of large scale raise difficult problems requiring lots of creativity!” New methods are required; the old ones, based on traditional statistics, relied on predetermined models of the data. They were developed principally in the 1930s, when the challenges were different, computers had a very limited capacity for calculation, and data was expensive to produce.


Finding the hidden meaning in big data

Nowadays there are sensors everywhere and data is collected automatically, with no pre-defined use but with the notion that it contains valuable information. The idea is to examine the data keenly and make the best use of it. Machine Learning’s objective is to design algorithms suited to dealing with big data. Enabling machines to learn automatically was an idea born from the fact that the data is too voluminous for each stage of its processing to realistically be carried out by an expert, as well as from a desire to see innovative services emerge and learning take place with no a priori assumptions.

The question is: how can a machine learn to decide by itself? How can we compress, represent and predict information from data selected to provide examples? This is the challenge of Machine Learning, which is fuelled by probabilistic modelling and relies on optimization and a theory of learning that guarantees sound results. The major problem is designing algorithms with good potential for generalization. Criteria that are too strict may lead to overfitting – producing models that suit the given examples perfectly but cannot be generalized. On the other hand, criteria that are not specific enough lack predictive capacity. In Machine Learning, the right amount of complexity must be deduced automatically from the data.
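
A toy sketch of this trade-off, letting cross-validation choose the model complexity (here, a polynomial degree) from the data; the dataset and the candidate degrees are invented for illustration and do not come from the chair’s work:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 60)  # noisy signal
X = x.reshape(-1, 1)

for degree in (1, 3, 9, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    score = cross_val_score(model, X, y, cv=5).mean()  # held-out R^2
    print(f"degree {degree:2d}: cross-validated R^2 = {score:.2f}")

# Low degrees underfit and very high degrees overfit: the cross-validated
# score peaks at an intermediate complexity, which is the one to keep.
```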


A chair for Machine Learning

The chair in “Machine Learning for Big Data” was created at the end of 2013 and has fifteen professors, all from Télécom ParisTech. Its aim is to inform people about Machine Learning, illustrate the ubiquity of math and carry out a research program with four private partners who are providing a financial contribution of two million euros over five years, as well as real and concrete challenges:

  • Criteo, a world leader in advertising targeting, aims to offer each internet user the link that they are most likely to click on according to their browsing history. How can the enormous space of the Web be explored?
  • The Safran group manufactures 70% of the world’s civil and military plane engines. How can anomalies be detected in real time and the replacement of a part be suggested before failure occurs?
  • PSA Peugeot Citroën hopes to connect data with its uses. How can construction costs be reduced and the commercial offering optimized, with models that meet market expectations?
  • A French banking group is launching an all-digital bank. How can client accounts be monitored in real time? How can its use be made simpler and the right financial products be offered?


Designing new services from data

“The multiple applications of Machine Learning are a driving force for research,” says Stéphan Clémençon, giving a number of examples showing the variety of fields in which big data is collected and used: “automated facial recognition in biometrics, risk management in finance, analysis of social networks in viral marketing, improving the relevance of results produced by search engines, security in intelligent buildings or in transport, surveillance of infrastructures and predictive maintenance through on-board systems, etc.”

In Machine Learning, the potential applications are found first, and then the math that allows them to be understood and clearly defined, significantly improving the process. A “professional” knowledge of these applications is therefore necessary. It was in view of this that Télécom ParisTech’s chair in Machine Learning for Big Data (see insert) was created with Criteo, PSA Peugeot Citroën, the Safran group and a major French bank. The idea is for industry and academics to work together to produce effective products based on the partners’ projects, notably providing increased knowledge of the state of the art for some and a keener understanding of the challenges of application for others.

Big Data refers both to a specific infrastructure and to a list of unresolved problems. Stéphan Clémençon regrets that in France, “we have missed out on the equipment stage”, but fortunately, he adds, “we do have a large number of innovative SMEs headed by well-educated students, especially in applied mathematics.” Data engineering is multi-disciplinary by definition, and the trump card of a school like Télécom ParisTech, which teaches in a variety of fields, is being able to offer specialized programs, in particular since, as Stéphan Clémençon underlines, “Machine Learning relates to key business challenges and there are lots of potential applications in this field.”


Stimulating Machine Learning research and teaching

Stéphan Clémençon joined Télécom ParisTech in 2007 in order to develop teaching and research in Machine Learning, automated learning from data. He is in charge of the STA (Statistics and Applications) group and is head of the Advanced Master’s program titled “Management and Analysis of Big Data”. He set up the “Data Scientist” Specialized Studies Certificate awarded by the school for continuous learning, for engineers wanting to increase their skills in Machine Learning techniques. Stéphan also teaches at ENS Cachan and at the University of Paris Diderot, and is an Associate Professor at the Ecole des Ponts ParisTech and ENSAE ParisTech.


Textiles and connected people

The quantified self is one of the fields of research at Institut Mines-Télécom, requiring the miniaturization of sensors, the optimization of their energy consumption and, sometimes, their incorporation into the fabric of clothing. Christian Person, a researcher at Télécom Bretagne, has developed a research activity in this direction, focusing on harvesting ambient energy and placing antennas as close to the body as possible. He carries out his research as part of the Smart Sensing™ consortium, whose first product, the d-shirt, a “digital t-shirt” for sports players, showcases cutting-edge technical fabrics designed for the intense uses of communicating clothing.


Smart Sensing, a multidisciplinary consortium working on a simple idea

Multiple talents are required to develop a device that measures athletes’ physiological parameters, such as body temperature and heart rate, as well as their physical parameters, such as speed, acceleration and geolocation. Yet this is what Jean-Luc Errant, founder of the company Cityzen Sciences, has done. He wanted a device that could be worn without being noticed and function in extreme situations, such as in high mountains or at sea. During his investigation he met scientists, engineers and athletes, and their opinions gave birth to a seemingly simple idea: rather than a mobile phone, a garment can in fact provide the ideal way of monitoring physical condition throughout the day. All that remained was to incorporate sensors and energy sources. Cityzen Sciences was set up in 2008, and two years of academic research followed in order to create a state-of-the-art product. The project attracted interest and in 2010 received significant support from BPI France; a consortium was set up consisting of the Payen group, specialists in elastic threads and fabrics for sporting and technical purposes, Éolane, the leader in France for industrial services in professional electronics, the Cyclelab group, bicycle specialists who were to act as distributors, and Télécom Bretagne.

“The multidisciplinary spirit behind the consortium is also present on the academic side” points out Christian Person, who has adopted Isaac Newton’s maxim: “Men build too many walls and not enough bridges”. No less than six research departments at Télécom Bretagne are uniting their expertise for the first time for tomorrow’s communicating garment. Christian Person is working on techniques for integrating and reducing the size of antennas and related elements as well as the evaluation of wave-people interaction and designing intelligent sensors. Since “everything must be optimized”, his co-workers are researching algorithms for measuring electrocardiographic data, the detection of variations in the signals received, software interfaces (the data is collected on a dedicated platform) and IPv6 protocols for connected objects.


Harvesting ambient energy

By wearing a d-shirt a cyclist will be able to monitor his cardiac data via electrodes located close to his chest. The information will be transmitted through the threads of the fabric, “threads of approximately 25 microns, containing both insulation fibres and conductors”, to an electronic card situated at the top of his back that enables the transfer of data collected at the end of the ride to a related terminal.

Amongst the technical challenges of the d-shirt – miniaturization, integration of conductive threads into the fabric, connected textile sensors, energy management and data processing – Christian Person is researching antenna components and energy harvesting. For reasons relating to cost, size or weight, battery-powered sensors alone cannot provide the necessary battery life. Temperature gradients, mechanical vibrations, light and radio-frequency waves all constitute potential sources of energy in the surrounding environment. “At the moment”, the researcher continues, “radio waves are our source of energy, with the aim of using the omnipresent electromagnetic spectrum”. Other sources are possible, however, “such as harvesting energy through a piezoelectric micro-generator, using the dilatation of the fabric when we inhale or the movements of the bicycle”. Since energy sources are not all equally reliable, such as the piezoelectric components, “current research focuses on cumulatively harnessing multiple energy sources”. In this regard, thermal energy seems very promising, given the temperature differences between the human body and its surroundings, especially in a sporting context.
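
To give an idea of the orders of magnitude involved in ambient RF harvesting, here is a back-of-envelope estimate based on the Friis transmission equation; every figure (source power, distance, antenna gain, rectifier efficiency) is an assumption chosen for illustration, not a measurement from the project:

```python
import math

f = 900e6          # GSM-band frequency (Hz), assumed ambient source
c = 3e8            # speed of light (m/s)
wavelength = c / f

p_tx = 1.0         # effective radiated power of the source (W), assumed
d = 50.0           # distance to the source (m), assumed
g_rx = 1.5         # gain of a small wearable antenna (linear), assumed
eta = 0.5          # RF-to-DC rectifier efficiency, assumed

# Friis equation: received power falls with the square of (distance / wavelength)
p_rx = p_tx * g_rx * (wavelength / (4 * math.pi * d)) ** 2
p_dc = eta * p_rx

print(f"Harvested DC power: {p_dc * 1e6:.2f} microwatts")
# Microwatt-level budgets are why research combines several ambient
# sources (RF, thermal, piezoelectric) rather than relying on one alone.
```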


Monitoring as close to the body as possible

Christian Person’s interest in sensors located on or in the body comes from his long-term research on the interactions between the waves emitted by phones and the human body. From probes originally developed for Antennessa (now Satimo), a spin-off company incubated at Télécom Bretagne, as part of the Comobio project that coordinated studies in this field, the researcher, who is now a Cofrac (French Accreditation Committee) expert for the certification of telephones, is currently looking into the advantages of worn sensors and their associated waves. “One idea is to place antennas on the body in order to spatially identify body parts precisely by the level of radio signals emitted and received”. This is the aim of the BoWI (Body World Interaction) project, launched on 1 October 2012 for a duration of four years within the Breton Labex CominLabs. The researcher is also a senior member of the WHIST laboratory, a joint laboratory between Institut Mines-Télécom and Orange Labs created in 2009, dedicated to communicating people and the interaction between waves and people. “Body-area networks are still very peripheral (sensors worn on the body), but we are starting to use methods that interact more and more with the body, with non-intrusive and non-invasive sensing systems, as well as targeting flagship applications in the field of real-time health monitoring”, he explains. Moreover, by analyzing the interaction between sensors and their relative locations, highly original fields of application emerge, “like music or piloting drones, for example”, the researcher says enthusiastically.

For the moment, the d-shirt will be sold at an initial launch price targeting users accustomed to top-of-the-range products. Professional sports teams are also partners of the Smart Sensing consortium and are interested in the idea. “The trainer can see his players’ physical condition live, and make changes at the right moment”, Christian Person explains. In time, amateur sports players will also use this technology, in particular since the connected garment will upload information onto social networks, enabling interaction with others. Since each person will want their garment to be compatible with connected objects sold by different companies, the data must be processed independently of these objects and their origin. This is the purpose of Cityzen Data, a company incubated at Télécom Bretagne. Last but not least, a design school rounds off the list of partners looking to maximize visibility of the project, accelerate innovation transfer and enable large-scale use of the products developed. “Connected people are more and more instrumented” the researcher concludes, and there is no doubt that Smart Sensing’s multidisciplinary approach, dealing at the same time with production, distribution of instrumented objects and the processing of collected data, should guarantee the success of such technological innovation.



Professor and Deputy Scientific Director at Télécom Bretagne, Christian Person considers that he has “been very fortunate” in the research projects he has led. This research professor, “who initially intended to become a secondary-school teacher”, leads a team of 25 staff at the CNRS Lab-STICC laboratory and has supervised more than 30 theses, some of which have led to the creation of companies such as Elliptika, a company specialized in the design of microwave products for space applications. He has also filed twenty or so patents, in particular in underwater communication: “given that sea water is very similar to the human body in terms of physical properties, it was used for risk-free studies on underwater communication”. A Senior Member of IEEE, he has co-authored several articles in international journals, and directs the K commission (Electromagnetics in Biology and Medicine) of the French section of URSI (International Union of Radio Science). He also sits on organizing committees for international science conferences in the field of microwaves.


Towards a smart electricity network

For several years, Smart Grids have been the hot topic in the energy field. As current networks become less and less adapted to new energy issues, we need to thoroughly rethink energy management and introduce smarter electricity networks. These networks of the future should be able to adapt production to consumption as precisely as possible, taking into account new parameters and constraints such as renewable energy and electric vehicles. Institut Mines-Télécom, with expertise in the field of energy through the Mines schools and knowledge of digital technology through the Télécom schools, is involved in various Smart Grid engineering projects.


If the amount of research on Smart Grids is increasing significantly, it’s because there is a sense of urgency. “The driving force behind it all is the energy transition. France must cut its CO2 emissions by a factor of four by 2050”, explains Marc Girod-Genet, a researcher at Télécom SudParis. The difficulty is that the solutions proposed for this transition are not always compatible with the state of current electricity grids. Most renewable energy sources are, for example, highly fluctuating and dependent on weather conditions. Moreover, some end users are becoming prosumers (i.e. both producers and consumers) by installing a miniature wind turbine or solar panels with associated small energy storage infrastructures, a dual role that the network has not sufficiently accounted for.

Another problem is the future popularization of electric vehicles, which consume heavily at any hour of the day. “A few years ago it was a lot simpler to manage electricity grids,” Marc Girod-Genet points out. “If we needed more energy, with standard power plants, it was generally enough to increase the output. Today, production and consumption phases fluctuate much more.” The expected energy transition is therefore impossible without Smart Grids capable of matching these new demands.


Changes at all levels

Smart Grids can be seen as classical electricity grids to which three new layers are added: a telecommunications network to relay information across the grid, a large-scale distributed information management system, and an energy services platform.

As the primary links in the chain, consumers’ houses will very soon be fitted with smart meters. “The objective within Europe is to equip 80% of households with this type of meter by 2020”, says Marc Girod-Genet. These new tools will allow local management decisions to be made according to consumer habits, available equipment and levels of consumption, in particular. “Télécom Bretagne research teams are currently working on all these mechanisms for measurement, communication and local preprocessing of information (Advanced Metering Infrastructure – AMI), in partnership with Itron and Texas Instruments in a joint research center for developing such smart meters”, explains Marc Girod-Genet. Combining these meters with a telecommunications network will allow information to be received on a larger scale in the various control centers that will collect this sea of data. These control centers will link the information up with other types of data, for example wind speed measurements for forecasting wind turbine production. “This data is particularly large in quantity (Big Data) and diverse in nature and scale. One of the challenges is therefore to find ways of managing this heterogeneity in order to facilitate decision-making on actions aiming to improve energy efficiency. Big Data analysis is one of our specialties at Télécom SudParis.” Finally, this new network architecture, provided with an extended energy-related knowledge base, will also allow the provision of new services for each consumer, such as dynamic pricing, dynamic energy provider selection, and smart monitoring of electricity consumption. Institut Mines-Télécom is involved in these societal changes in energy consumption.


Micro-grids in electric vehicles

Researchers at Institut Mines-Télécom have been working for several years on different aspects of these Smart Grids, and more generally on what is known as Smart Energy. For Marc Girod-Genet, this involvement is logical: “The Mines schools have always been strong in the field of pure energy, particularly production management, energy transport and storage, renewable energy management, fuel cells and energy transformation. Télécom research teams are contributing expert knowledge on communications networks and their management; information management systems and data modeling/processing issues (Big Data included); smart metering and Advanced Metering Infrastructure (AMI); and service architectures. We are therefore perfectly qualified to work on Smart Grids.” Such expertise has led to the involvement of the Institut Mines-Télécom laboratories in various large-scale projects. One of them, Nice Grid (with the participation of Mines ParisTech), was launched in 2011 and allowed a Smart Grid, or “micro-grid”, to be tested at local level within a district of Nice, France’s fifth-largest city. With smart meters, energy monitoring and consumer involvement, every aspect of smart grids was included in this project.

Institut Mines-Télécom is also looking into issues less directly linked to smart grids, like electric vehicles. Completed in 2013, the VELCRI project[1] (with the participation of Télécom SudParis) had three main research priorities: optimizing the distribution of charge amongst vehicles, allowing secure communication between the charging station, the cars and the electricity network, and using car batteries as a means of energy micro-storage. “Thanks to internal partnerships between Mines and Télécom researchers, we have lots of different studies on future energy networks”, concludes the researcher. The Internet of Things and machine-to-machine communication (M2M), cloud architecture, environmental impact, societal issues, infrastructure and data security, etc.: these are all lines of research within Institut Mines-Télécom for responding optimally to the objectives of the energy transition.


SEAS, for a European smart grid

Studies on Smart Grids are not limited to the local or national scale. With globalization, Europe is becoming increasingly focused on energy management. At the end of 2013, a major European project named SEAS (Smart Energy Aware Systems) was launched, involving collaborators from Spain, Finland, Portugal, Turkey and France. The aim is as simple as it is ambitious: achieve interoperability between different energy systems, leading to overarching energy management. Since management systems vary greatly from one country to another, the challenge will be to standardize generic mechanisms, data formats, protocols and materials for representing, processing and exchanging energy information between energy systems, automation systems, ICT-based digital services and all related stakeholders. In this respect, Armines, Télécom Bretagne, Télécom ParisTech and Télécom SudParis researchers are contributing significantly to this 3-year project, focusing particularly on the open semantic energy data models used, the energy information exchange platform and the management of electric vehicles. According to the partners involved, the concrete applications of this research should be seen at the end of the project, scheduled for 2017.

In 1994, Marc Girod-Genet left EPITA. His engineering diploma in hand, he headed to the USA, where he obtained a Master of Science at the Stevens Institute of Technology. “Born in Zaire, I’ve always been attracted to going abroad”, says the researcher. Yet it was in France that he accepted his first job, at CNET (now Orange Labs), while also writing a thesis. In 2000, Marc Girod-Genet earned his PhD in Information Sciences at the University of Versailles Saint-Quentin-en-Yvelines before joining Télécom SudParis in the same year as an EU research project manager. He concentrated in particular on mobile networks, already endeavoring to introduce more intelligence into them.

He became an associate professor in 2005, which allowed him to teach alongside his research. “I like this perspective of knowledge transfer,” he admits. “The field of new information and communication technology is evolving very quickly, and our teaching must therefore be constantly adapted.” With two decades already under his belt studying all types of networks, as well as multiple awards, Marc Girod-Genet has a considerable amount of knowledge to transfer…

[1] VELCRI: Electric Vehicle with Integrated Quick Charging

From expertise in telecommunications networks to the performance of electricity grids

From networks to everyday objects, the internet has radically changed our environment. From the main arteries to the smallest vessels, it is embedded in such a large number of the most banal objects that it puts a strain on the energy bill. Yet now communicating objects can exchange information to optimize their electricity consumption. After several years of research on the IPv6 protocol, Laurent Toutain and Alexander Pelov, researchers at Télécom Bretagne, are adapting this protocol to suit objects with a small energy supply and to the smart grids that are being built. Their work is part of a series of Institut Mines-Télécom projects on energy transition, focusing on the evolution, performance and compatibility of the energy networks of the future.


From the web to the Internet of things: 20 years of protocol development

Over the past few years, the advent of smart transport and the Internet of Things has exposed the limits of the classical model of the internet. Mobility, the creation of spontaneous networks, energy constraints and security must all be taken into account. The number of devices eligible for an Internet address has exceeded the capacities of IP, the network’s fundamental protocol. With IPv6, a version offering 667 million billion possible IP addresses per mm² of the Earth’s surface, each component or sensor of an object can now have its own address and be consulted. But IP was not designed for sensors located in the middle of nowhere, with limited resources such as processor, battery and memory, and with low-speed connections. For such “LoWPANs”, Low-Power Wireless Personal Area Networks, a new version of IPv6 has been created, 6LoWPAN, with an associated consultation protocol, CoAP (Constrained Application Protocol), which enables continuous communication between the traditional internet and the Internet of Things.
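
A quick sanity check of the address-density figure quoted above, spreading IPv6’s 128-bit address space over the Earth’s surface (about 5.1 × 10¹⁴ m², a standard approximation):

```python
ipv6_addresses = 2 ** 128                  # size of the IPv6 address space
earth_surface_mm2 = 5.1e14 * 1e6           # Earth's surface, m^2 -> mm^2

per_mm2 = ipv6_addresses / earth_surface_mm2
print(f"{per_mm2:.3g} addresses per mm^2") # ~6.67e+17, i.e. ~667 million billion
```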

“CoAP is a new way of structuring networks,” explains Laurent Toutain. “The interaction between existing networks and communicating objects can be established in two ways: either by improved integration of IP protocols, making the network more uniform, or by marginalization of IP within the network and a diversification of the protocols for access to things”. Confidentiality and security aspects will be fundamental to the success of either architecture. The researcher and his team also use mathematical models and game theory, applying them to the fields of smart transport and energy management.

Transmitting data in local energy loops

Over the past few years, several regions in France that produce considerably less electricity than they consume have endeavored to mobilize their populations towards concerted, energy-efficient behavior. Alexander Pelov observes that “this is the case of the poorly supplied Provence-Alpes-Côte d’Azur region, which is becoming the leader in smart grids”, meaning a vision of an electricity network with optimized links to improve its overall energy efficiency. Brittany and its partners have also been working for several years on controlling the demand for electricity, developing the production of energy from renewable sources and securing the electricity supply. In 2012, the region sent out an initial call for projects on the “local energy loop”.

One of the objectives of electricity suppliers today is to be able to exchange data through the electricity network, “a network that was never designed to transport it”, emphasizes Laurent Toutain. It will use a low-speed 250 kb/s configuration similar to LoWPAN, with the same constraints as the Internet of Things. Laurent Toutain’s team has built a simulator to precisely model the behavior of such networks. This simulator allows routing algorithms to be redefined and new applicative behavior to be studied. “We try to adapt to the existing infrastructure: we must use it so we can adapt to all forms of traffic”, and also improve the network’s performance to increase its uses. This is a major challenge, because the electricity network must, for example, communicate with vehicles and negotiate if there is a priority ambulance, as well as supplying energy and transferring it from one place to another. “Without prior knowledge of telecoms networks, none of that is possible”, explains the researcher.


A smart grid skills center on the Rennes campus of Télécom Bretagne

The fruit of a partnership with the Itron group, specialists in the development of metering solutions, and with Texas Instruments, experts in the field of semiconductors, this research center for power line communication technology, inaugurated in November 2013, creates innovative solutions for electricity suppliers (technical upgrading of networks, smart metering, etc.) and serves the French smart grids industry with the expertise of its researchers and engineers.

Giving consumers a more active role in consumption

While better energy management can be achieved by the supplier, consumers must also play their part. Rennes is a pioneering city in thinking on the Digital City, a smart, sustainable and creative city built on the openness of public data, and has in this context sent out a call for projects concerned with energy and transport policies. Currently developing the ÉcoCité ViaSilva district, Rennes is encouraging inhabitants to restrict their energy usage and has committed to an Open Energy Data program.

Based on the observation that “we cannot double the existing infrastructure in order to transmit data”, the team of researchers based in Rennes is working on systems that give people a more active role in their consumption. It has been observed that simply showing users their consumption levels encourages them to adopt better habits, leading to savings of between 5 and 10%. “The idea is to make it fun, to imagine the ‘foursquare’ of energy”, explain Laurent Toutain and Alexander Pelov, referring to the location-based mobile application whose most active users win badges. Another aspect is the visual representation of user behavior, which the team is working on with the École Européenne Supérieure d’Art de Bretagne, in digital laboratories (FabLabs) in Brittany. “Ultimately”, the researchers continue with a smile, “it’s like doing quantified self at home”. This famous concept of “self-quantification” ties in with the notion of the “consum’actor” studied by sociologists, which is proving significant at this time of energy transition.


Research fostering spin-offs

“It’s extremely rewarding to work on a societal issue like energy”, enthuses Alexander Pelov. Numerous collaborations with start-ups like Cityzen Data, companies like Deltadore, Kerlink and Médria, and the FabLabs bear witness to this passion. The start-up Homadeus, currently in the Télécom Bretagne incubator, offers both “open energy data” hardware and the interfaces (web and mobile) to drive it.


Laurent Toutain and Alexander Pelov are both researchers in the Networks, Security and Multimedia Services department of Télécom Bretagne. A reputed expert in IP networks, in particular in service quality, metrology, routing protocols and IPv6, Laurent is currently looking at new architectures and services for domestic networks, with a focus on industry and technology rather than research. After studies in Bulgaria and a thesis at the University of Strasbourg in 2009, Alexander joined Télécom Bretagne in 2010 to work on energy efficiency in wireless networks and the use of smart grids in the context of smart metering and electric vehicles.

Written by: Nereÿs


Social Media: understanding digital uses

The digital revolution, from which emerged social networks and big data, has led to far-reaching changes in a large number of economic sectors. Businesses need help to develop new strategies suited to these transformations, and it is with this in mind that Christine Balagué, researcher at Télécom École de Management and Vice-President of the National Digital Council, created the “Social media: economic and social value creation” chair within the Institut Mines-Télécom.

 

The digital explosion is characterized by mass uses, data and trends. We are seeing a transfer of audiences from the traditional media to the internet and, within the internet, from websites to social networks, which now attract more than 1.6 billion people worldwide,[1] including at least 32 million in France.[2] New technological developments – tablets, smartphones and other touchscreen interfaces – allow information to be shared in real time. The consequences for businesses are numerous. Communication, customer relations, innovation, market surveys and the marketing professions have been totally transformed. Social networks, which give everybody a chance to have their say, have become unavoidable, but how should they be used?


Using social media: yes! But how?

Christine Balagué’s work has been focusing on this matter for several years, and it was with the aim of thoroughly understanding the uses of digital technology in order to offer appropriate strategies to businesses that in 2011 she created the Chair called “Social media: economic and social value creation”. It is an ambitious project: modeling the behaviors of social media users, identifying emerging uses, studying the way social media have transformed customer relations management and designing tools adapted to big data.* There are 15 or so researchers in various disciplines working on these four lines of research in the four telecoms schools of the Institut Mines-Télécom.

Most of the research on the theme of “quantitative modeling” of behavior is carried out at Télécom Ecole de Management. The aim is to understand how information spreads through networks. The emphasis is placed on quantitative aspects in this approach. The target is to be able to explain and predict the number of comments, likes, shares, tweets, re-tweets, and posts: in a nutshell, all the activity generated on social networks. The key for this model is the choice of explanatory variables. Some of them are structure-related variables such as page rank, number of hits and degree centrality. However, graph theory alone does not explain the spread of information within the networks as there are other variables related to the character of the individual or the content of what is being shared. Communities also develop within social networks and need to be identified. One branch of research focuses on their detection and the analysis of what users talk about. For example, in collaboration with researchers at Télécom Bretagne and as part of the Open Food System[3] project which focuses on the influence of digital technology on food preparation, the study of conversations has identified communities not necessarily linked to a given platform.
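
As a toy illustration of the structural variables mentioned above (PageRank-style scores, degree centrality) and of community detection, here is a sketch on a small, invented sharing graph; the chair’s actual models run on large-scale crawled social-network data:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Invented example graph: two tight groups joined by a single bridge.
g = nx.Graph()
g.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),  # group 1
    ("dave", "erin"), ("dave", "frank"), ("erin", "frank"),  # group 2
    ("carol", "dave"),                                       # bridge
])

print(nx.degree_centrality(g))  # who is structurally central
print(nx.pagerank(g))           # influence weighted by neighbors' influence
communities = greedy_modularity_communities(g)
print([sorted(c) for c in communities])  # detected conversation groups
```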

The research requires the development of crawling* tools and text mining* software adapted to social networks. These crawling tools have been designed at Télécom École de Management, and a thesis at Télécom SudParis is being carried out on large-scale crawling. The number of fans or followers is not a very meaningful indicator. Metrics on audience, algorithms for the measurement of influencers and the right variables explaining internet users’ behavior are assets with genuine value-added compared to existing tools. In particular, they allow companies to be shown where to concentrate their efforts and what to pay attention to.

Three main lines of research are being carried out on the topic of “uses” by sociologists at Télécom ParisTech: geolocation, cultural diversity and multi-network. Displaying spatial location is one of the emerging behavioral phenomena that sociologists are interested in. Moreover, uses differ according to the area of the world and many companies are very interested in such cultural diversities.


The metamorphosis of customer relations

The issue of “managing customer relations through social media” is being explored at Télécom École de Management. The subject has been revolutionized by the ability of consumers to express themselves directly, in real time and at any moment. We are in the era of user empowerment, where the expression of dissatisfaction has to be managed. Hence the numerous Twitter feeds dedicated to real-time customer satisfaction management. It really works: research shows that a client who expressed dissatisfaction and whose problem was quickly resolved is then five times more loyal than the average client. New jobs, such as that of community manager*, meet these new needs for companies. A measurement model for this role and its impact on customer satisfaction has been developed.

Lastly, the research at Télécom École de Management on the subject of Big Data revolves around two projects in particular: Sodatech and Innovagora.[4] Sodatech is a monitoring and CRM 2.0* project based on the processing of both purchasing data and social data, leading to innovations in the field of recommendations. It is built around the idea that today, an individual’s behavior is influenced much more by what their peers say on a social network or community platform than by information published by a brand, company or institution. The objective of Sodatech is to qualify active social media users through innovative criteria and to detect the dynamics of influence, while Innovagora aims to create crowd-innovation* platforms.


The advantage of the multidisciplinary approach

All of this research meets businesses’ operational needs. The work of the Chair has led to the development of new algorithms and key metrics for understanding and measuring internet users’ behavior on social networks. Today, the work focuses mainly on big data analysis of social networks, for which the scope of research is still very open. Against this backdrop, the Chair has a major asset: the ability to capitalize on the diversity of skills within the Institut Mines-Télécom graduate schools in order to broaden horizons and create interplay between complementary visions. The different aspects brought to the table by marketing, IT, sociology, math, physics and psychology allow varied advances to be used in connection with each other. Long live the multidisciplinary approach!

Renewing itself to keep moving forward
Launched in 2011 by Christine Balagué, the “Social media: economic and social value creation” Chair is structured around a three-year research program. The initial budget of one million euros, agreed with the first three partners, Danone, Les Pages Jaunes and La Poste, comes to an end at the close of 2014. A new funding phase will extend the collaboration with the original partners or bring new ones on board. It will focus on the Social Internet of Things (SIoT): social networks comprising both individuals and connected objects.

Christine Balagué, digital technologies in her genes

“How can digital technology create value for businesses?” This is the main thread of the work conducted by Christine Balagué, research professor at Télécom Ecole de Management, ESSEC graduate (1987), ENSAE graduate (1993), Dauphine University graduate (2000) and accredited to supervise research (2013). Her doctoral thesis in Management Sciences, which she defended in 2005 at HEC, is about modeling internet users’ behavior using Petri nets. Christine is Vice-President of the French National Digital Technology Council, a group of experts regularly called upon by the government to consider all matters relating to the impact of digital technology on society and the economy, and to issue recommendations accordingly.


* Glossary

  • Big data: extremely large datasets that must be processed in real time.
  • Community manager: person who manages communities on social networks.
  • Crawling: technique for extracting data from websites.
  • Text mining: extraction of knowledge from texts with the help of algorithms.
  • Crowd innovation: innovation in products and services via digital platforms designed to collect internet users’ opinions and ideas.
  • CRM 2.0: customer relationship management via social media.

[1] Source: Institut eMarketer, November 2013
[2] Source: Institut Médiamétrie, June 2013
[3] Project selected within the framework of the call for “Structuring Projects for poles of competitiveness” in the French Future Investments Program.
[4] Projects selected within the framework of the “Big Data” call for projects in the French Future Investments Program.


Computational sciences: when math makes all the difference

What is the link between thought-controlled computers, being able to turn invisible, using virtual worlds to test hypotheses in archeology, and research to better understand and treat epilepsy? Each of these fields of research benefits from a new scientific approach, known as computational science, which is becoming a discipline in its own right.

Computational physics is an area of research at the Institut Mines-Télécom, led in particular by Francesco Andriulli, an associate professor at Télécom Bretagne. He belongs to a generation of scientists who benefit from advanced technology, deal with masses of data, and must master not only their own field of research but also mathematics and engineering.

 

The use of computational science in physics

Prompted by Jim Gray, a database researcher and winner of the 1998 Turing Award, the scientific community has recognized a major paradigm shift in recent years amid the sea of data surrounding us. Three main paradigms have driven science since its beginnings, complementing rather than replacing each other. For centuries, science was based on an empirical methodology: the observation and study of directly observable phenomena. Later, scientific practice became more theoretical, using models, abstractions and generalizations. It was in this way that Kepler’s laws, Newton’s laws of motion, Maxwell’s equations of electromagnetism and the kinetic theory of gases emerged. Lastly, the invention of computers in the middle of the 20th century added programming to researchers’ toolkit, allowing them to model complex phenomena.

Now science has evolved once more. Scientists rarely look directly into their instruments (telescopes, microscopes, etc.); instead, they simultaneously examine data recorded by those instruments and data created by simulations. These data are growing ever larger. For Jim Gray, this exploration of data had become the fourth scientific paradigm.

Computational physics is part of this heritage. Francesco Andriulli explains: “It is a multidisciplinary field, theoretical and at the same time part of applied mathematics and physics, advanced engineering and high-performance computing.” Computational electromagnetism comes down to solving Maxwell’s equations, a theory far more predictive than mechanics, in complex spaces and environments. Even though these equations are 150 years old, the field is thriving: “We find new effects and new elements, particularly in the quantum field and in the science of complex materials.”
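To give a concrete sense of what solving Maxwell’s equations numerically can look like, here is a minimal one-dimensional FDTD (finite-difference time-domain) sketch in normalized units. It is a textbook scheme shown for illustration, not the techniques developed by the researcher’s team.

```python
# Textbook 1D FDTD (Yee) update for Maxwell's equations in vacuum,
# in normalized units with a Courant number of 1. A Gaussian pulse is
# injected at the center of the grid and propagates outward.
import math

N, STEPS = 200, 100        # grid cells, time steps
ez = [0.0] * N             # electric field E_z on the grid
hy = [0.0] * N             # magnetic field H_y on the staggered grid
for t in range(STEPS):
    for i in range(N - 1):         # update H from the curl of E
        hy[i] += ez[i + 1] - ez[i]
    for i in range(1, N):          # update E from the curl of H
        ez[i] += hy[i] - hy[i - 1]
    ez[N // 2] += math.exp(-((t - 40) ** 2) / 100.0)  # soft source
print("peak |Ez| on the grid:", max(abs(v) for v in ez))
```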

In addition to contributing to scientific advances, computational physics has also had a significant impact on cutting-edge engineering and its applications. Throughout his career, Francesco Andriulli has collaborated with many organizations, research centers and companies in Europe and the USA. The electromagnetic and numerical techniques he has developed have been applied to a range of industrial problems, such as the design and electromagnetic characterization of circuits, metamaterials and wideband antennas, as well as the electromagnetic characterization of brain physics, brain-machine interfaces and brain imaging.

 

Modern brain-machine interfaces

256 sensors precisely measure the points of origin of electrical activity in the brain

For anyone wishing to understand the electromagnetic interactions taking place within it, the brain is a highly complex environment. “There are several major areas of complexity,” explains the scientist, for whom this topic is one application of his research field. “The physical properties are not the same throughout; the electric charge moves along preferential routes between neurons, not in all directions; the forces involved are very weak; and, above all, it is difficult to reproduce the results of an experiment, because it is impossible to obtain the same brain state twice.”

The brain can be studied with functional magnetic resonance imaging, a technique that allows brain tissue to be examined but is cumbersome, costly and difficult to implement. Another technique, electroencephalography, measures the brain’s electrical activity with sensors placed on the scalp: “We have the most powerful machine available today, with 256 high-density sensors,” adds the researcher. But measuring isn’t enough; the location of each measurement must also be known, which is why the patient is surrounded by a dome that precisely locates each sensor and allows a deformable model of the patient’s head to be computed.
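To illustrate why exact sensor positions matter so much, here is a toy forward model: the potential a single current dipole would produce at each of 256 hypothetical sensor positions, using the crude infinite-homogeneous-conductor approximation rather than a realistic head model. All geometry and values are invented for the sketch.

```python
# Toy EEG forward model: scalp potential of a current dipole in an
# infinite homogeneous conductor, V(r) = p.(r - r0) / (4*pi*sigma*|r - r0|^3).
# Real head models are far more detailed; this only shows how the
# predicted reading depends on each sensor's exact position.
import numpy as np

def dipole_potential(sensors, r0, p, sigma=0.33):
    """sensors: (n, 3) positions [m]; r0: dipole location; p: moment [A*m]."""
    d = sensors - r0                       # vectors from dipole to sensors
    dist = np.linalg.norm(d, axis=1)
    return (d @ p) / (4 * np.pi * sigma * dist**3)

# hypothetical geometry: 256 sensors on a 10 cm sphere, dipole near the top
rng = np.random.default_rng(0)
sensors = rng.normal(size=(256, 3))
sensors *= 0.10 / np.linalg.norm(sensors, axis=1, keepdims=True)
v = dipole_potential(sensors, r0=np.array([0.0, 0.0, 0.08]),
                     p=np.array([0.0, 0.0, 1e-8]))
print("strongest reading at sensor", int(np.argmax(np.abs(v))))
```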

A mathematical and physical theoretical model, brain-machine sensors and interfaces allowing every possible kind of interpretation, and new algorithms for processing the collected data: computational electromagnetism proves useful at every step of the journey, because it resolves the complexity presented at each of these levels.

 

Coupled with the latest technologies, math plays a highly practical role

“We take the measurement at the surface, and math allows us to move from the surface to the brain volume,” summarizes Francesco Andriulli. Algorithms are then the key to accelerating the calculations. “Here, math is not just useful, it’s essential.” Thanks to these algorithms and to supercomputers, calculations that once took four years now take just two hours.
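This surface-to-volume step can be read as a linear inverse problem. The sketch below uses an invented random stand-in for the real lead-field matrix and shows one standard approach, a Tikhonov-regularized minimum-norm estimate; it is not necessarily the algorithm developed by the team.

```python
# Sketch of the surface-to-volume step as a regularized linear inverse
# problem: given scalp measurements v = L j + noise, estimate sources j
# with a minimum-norm (Tikhonov) solution. L is a random stand-in for a
# real lead-field matrix; only the algorithmic idea is illustrated.
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 256, 5000
L = rng.normal(size=(n_sensors, n_sources))   # stand-in lead field
j_true = np.zeros(n_sources)
j_true[rng.choice(n_sources, 3)] = 1.0        # three active sources
v = L @ j_true + 0.01 * rng.normal(size=n_sensors)

# minimum-norm estimate: j_hat = L^T (L L^T + lambda I)^-1 v
lam = 1.0
j_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), v)
print("largest-magnitude source estimates:", np.argsort(-np.abs(j_hat))[:3])
```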

Today, all this is possible for two main reasons. First, there was a major advance in mathematics at the end of the 1990s: the Fast Multipole Method was developed, among other things to help solve the n-body problem, and other complex problems, within a reasonable timeframe. Second, the technology available today makes such problems tractable. Applied to the brain, this combination increases the predictive power for electromagnetic behavior and allows very precise imaging to be developed.
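For scale, the naive alternative that the Fast Multipole Method improves on is the direct pairwise sum below, whose cost grows as O(N²). FMM-type algorithms approximate far-away groups of sources with multipole expansions, bringing the cost down to roughly O(N) or O(N log N). Only the naive reference computation is shown here.

```python
# The scaling problem the Fast Multipole Method addresses, in miniature:
# a direct pairwise potential evaluation costs O(N^2) operations.
import numpy as np

def direct_potentials(pos, q):
    """O(N^2) sum of q_j / |r_i - r_j| over all pairs i != j."""
    n = len(q)
    phi = np.zeros(n)
    for i in range(n):
        d = np.linalg.norm(pos - pos[i], axis=1)
        d[i] = np.inf                      # skip the self-interaction
        phi[i] = np.sum(q / d)
    return phi

rng = np.random.default_rng(2)
pos, q = rng.normal(size=(2000, 3)), rng.normal(size=2000)
print("first potentials:", direct_potentials(pos, q)[:3])
```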

The algorithms’ success also depends on the equipment they run on; the two fields are not separate. “It’s definitely transdisciplinary, not multidisciplinary,” the researcher explains enthusiastically. “Transdisciplinary work is intrinsically multidisciplinary: it’s like swimming in several swimming pools at the same time.” This transdisciplinary character takes concrete form in several fundamental and applied research projects initiated by Francesco Andriulli at regional, national and European levels. He was awarded a grant from the ANR JCJC program (French National Research Agency, Young Researchers program) for advances in electroencephalography imaging techniques. Even though he has had access to the world’s seventh most powerful computer, he also works with more standard equipment in his laboratory, TeleComPhysLab for Brain Research Applications at Télécom Bretagne, to see what is achievable at that scale. “Our expertise will make all the difference,” he confidently concludes.

 

Now in his thirties, Francesco Andriulli is at a point of acceleration in his career, where he can dedicate himself to his research while being actively involved in the scientific community and communicating his passion. A senior IEEE member, in July 2013 he was named associate editor of the scientific journal IEEE Antennas and Wireless Propagation Letters, and later of IEEE Transactions on Antennas and Propagation. “I have wanted to be a researcher since high school, when I discovered that knowledge gave you the skills to achieve things later on,” he recalls. Hence his keen interest in math, which is “essential, because even if the applications we develop later seem nice and obvious, we must never forget that they originated from mathematics,” paired with an engineer’s mentality, in order to “be able to do things.”

A graduate of the Polytechnic of Turin and the University of Michigan, Ann Arbor, the young researcher arrived at Télécom Bretagne in 2010. An accredited research supervisor, he heads a team of PhD students and post-docs. Under the European “Marie Curie Ambassador” program, he is currently leading a project to popularize science in several high schools in Brest.