Paris Innovation Review – Combining chemistry and hydrodynamics is a trendy idea, but only a few teams in the world have had the intuition to fully develop it. To start with: how did this project even begin?
Vincent Lagneau – It all started in the 1970s, with the development of quantitative hydrogeology and flow codes that calculated the flowpath of water into the soil and the transport of dissolved elements.
We were quickly able to integrate some chemistry into these models, to represent, for example, how transported elements are delayed by chemical reactions. But we merely took into account an approximate effect of the chemical reactions, by applying a delay coefficient. This coefficient was derived from chemical equations but did not capture the full complexity of the interactions involved.
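The delay-coefficient idea can be illustrated with the classic linear-sorption retardation factor. This is a minimal sketch under that textbook assumption, not the actual equations of the codes discussed here: a sorbing solute front moves at the water velocity divided by a retardation factor R.

```python
# Textbook linear-sorption retardation (an illustrative assumption, not
# the interviewees' code): R = 1 + (rho_b / theta) * Kd, and a sorbing
# solute front travels at v / R instead of the water velocity v.

def retardation_factor(bulk_density, porosity, kd):
    """Retardation factor for linear sorption (dimensionless)."""
    return 1.0 + (bulk_density / porosity) * kd

def retarded_velocity(water_velocity, bulk_density, porosity, kd):
    """Apparent velocity of a sorbing solute front."""
    return water_velocity / retardation_factor(bulk_density, porosity, kd)

# Example: water moves at 1 m/day; a sorbing element lags behind.
R = retardation_factor(bulk_density=1.6, porosity=0.4, kd=0.5)
print(R)                                        # 3.0
print(retarded_velocity(1.0, 1.6, 0.4, 0.5))    # ~0.333 m/day
```

A single coefficient like R is exactly the kind of shortcut described above: it delays the front correctly but says nothing about which minerals dissolve or precipitate along the way.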
A first breakthrough occurred in the 1990s, with the publication of a groundbreaking paper on reactive transport, on the one hand, and the need to model the long-term behavior of radioactive waste storage, on the other. Worldwide, these two drivers are at the heart of research on reactive transport, which combines hydrogeology with complex chemistry. I should mention Jan van der Lee, a former PhD student at École des Mines, who eventually formed his own research team, “Hydrodynamics and Reactions.” The first model was designed as early as 1996 and, by 1999, it was advanced enough to be shown to prospective partners. That is when Laurent and I joined the team, after working on the project – as a doctoral student, in my case; as a post-doc, in Laurent's.
Until the early 2000s, we struggled to defend the idea that reactive transport added value compared with earlier models of the evolution of nuclear waste storage. By 2005, our model had become a reference tool, at least in the academic world. But it is not alone: today, half a dozen reactive transport modeling tools serve as benchmarks in this field.
Laurent De Windt – There was also a significant change between the 1990s and today: we moved from the study of environmental impact to the question of the sustainability of materials, with the emergence of the subject of “geomaterials.” In this context, the model became meaningful, because codes from the 1970s to the 1990s were simply unable to address this issue. The notion of coupling different fields was of crucial importance. It combines not only two disciplines, hydrogeology and chemistry, but also environmental and materials science. As soon as you think over the long term – hundreds of thousands of years in the case of radioactive waste! – you simply cannot overlook the way the environment damages materials, degrading both the steel and the clay used to contain waste products. Conversely, the natural environment is also changed. These complex interactions can only be described by models that combine all the processes: corrosion of packages, release of radionuclides, migration into the rocks…
Vincent Lagneau – The first applications mainly dealt with interfaces. For example, the contact of concrete with clay triggers certain reactions. If clay is treated as a boundary condition, only the impact on concrete is studied; conversely, if concrete is the boundary condition, only the impact on clay is studied. With these new tools, both phenomena are analyzed simultaneously, taking their cross-interactions into account.
Could you give us an idea of how nuclear waste is stored in an underground environment?
Laurent De Windt – For medium-level waste, imagine galleries of concrete, approximately the size of metro tunnels, dug into a layer of argillites at a depth of over 1,500 feet. Another type of clay, bentonite, is used to seal the entrances of the galleries. These materials are not in equilibrium with each other, and they will react. Concrete chemistry is aggressive toward clays, and vice versa – but over very long time scales. This crucial fact motivated our entire work: demonstrating the stability of a system over 100,000 years. A time scale that dwarfs the duration of an experiment, which generally lasts three years, or even of a thesis, at most a decade. We needed to extrapolate. This is precisely what our tools were designed for: to use data collected over a short period of time to identify all the processes involved, before calculating what will occur on a different time scale.
However, extrapolating is not only about making projections or extending a curve. It is also about calculating interactions which, while relatively simple when taken separately, build up over time into a complex system whose representation requires many calculations.
Let’s go back to the industrial and social context of your project, because this long-term vision is not only an exciting challenge for researchers, it is also a social issue and a political framework. Was the National Agency for the Management of Radioactive Waste (Andra), the public industrial and commercial body responsible for the management of radioactive waste in France, a stakeholder in your work?
Vincent Lagneau – Not directly. The project was supported from the outset by the Radioprotection and Nuclear Safety Institute (IRSN). Hence, we refrained from working with Andra, because it seemed to us that the judge (i.e., the IRSN) should not have access to the same tools as the operator it is supposed to control. Andra knows our work, of course, and even considered at some point using Hytec. But they applied the same principle as we did and decided to use other codes. Having two independent sources of expertise also improves the overall quality of the control... even if, of course, expertise cannot be reduced to a piece of software!
But I would like to pick up on your question and mention the project stakeholders, because this is a crucial aspect. One of Jan van der Lee's great ideas, from the beginning, was to make the project sustainable. For a very simple reason: most scientific codes do not survive the departure of the doctoral student who developed them. To prevent this from happening, in January 2001, Jan set up a consortium, the “Pôle géochimie transport” (Transport Geochemistry Cluster), which brought together two divisions of the French Atomic Energy Commission (CEA), the IRSN and EDF. Progressively, the idea extended to other players. Today, Areva, Bel V (the Belgian equivalent of the IRSN) and the cement manufacturer Lafarge stand alongside the founding members. Total, Schlumberger and Ineris (French National Institute for Industrial Environment and Risks) also supported us for a few years. And lastly, there are also a few academic partners who do not contribute financially but take part in generating ideas.
Thus, the Hytec code, which was created over 20 years ago, is now robust enough to be used by researchers, institutes or partners dealing with industrial issues.
Hytec started as a highly specialized product, but over time its uses have expanded. Was this generic dimension part of the initial ambition?
Vincent Lagneau – It’s simpler than that. This expansion occurred naturally and stems in part from the fact that the structure of the code is not as specialized as its first applications. In a nutshell, Hytec doesn’t contain radioactive waste: it is made of chemistry and flow equations.
Soon after 2000, we realized that what we had learned about radioactive waste could be applied to other situations. Our model for concrete-clay interaction in a radioactive waste repository could also be applied to analyze the degradation of concrete in a dam, for instance.
The model was also applied to issues of soil pollution, corrosion in the cooling circuits of nuclear reactors, but also to emerging issues such as the storage of CO2 or hydrogen sulfide (H2S). It should be noted in passing that these different problems operate on very different scales, from microns to kilometers. This gives an idea of the flexibility of the model. In essence, the basic equations which formalize chemical reactions remain the same.
What computer architecture did you use?
For simple calculations, the code (written in C++) can run on a desktop computer. But when applied at full capacity and on a large scale, it requires significant computing power. To manage this change in the scale of the calculations, we opted for parallel computation. Distributed computing is well suited to our model, since chemical reactions in one part of space are independent of those in another: hydrodynamics is what transports reagents between different areas of space. Hydrodynamics is solved globally, over the whole domain, but chemistry is distributed: one processor handles certain areas of space, another processor handles others, and so on.
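The global-transport / local-chemistry split described above can be sketched with a toy operator-splitting loop. This is an illustrative scheme under simplifying assumptions (1D advection, a made-up first-order reaction), not Hytec's actual solver; its point is that the chemistry step touches each cell independently, which is why it parallelizes so naturally.

```python
# Toy operator splitting (illustrative only): each time step, a global
# transport step moves solute between cells, then a chemistry step is
# applied to every cell independently of its neighbors.

def transport_step(conc, inflow):
    """Global step: pure 1D advection, shifting solute one cell downstream."""
    return [inflow] + conc[:-1]

def chemistry_step(c):
    """Local step: toy first-order reaction consuming 10% of the solute."""
    return 0.9 * c

def simulate(n_cells, n_steps, inflow):
    conc = [0.0] * n_cells
    for _ in range(n_steps):
        conc = transport_step(conc, inflow)
        # Each cell is independent here, so this map() could be swapped
        # for a parallel map (e.g. multiprocessing.Pool.map) without
        # changing the result.
        conc = list(map(chemistry_step, conc))
    return conc

print(simulate(n_cells=4, n_steps=4, inflow=1.0))
```

Only the transport step needs a global view of the domain; the chemistry, usually the expensive part in a real reactive transport code, is embarrassingly parallel.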
Lastly, Hytec relies on important chemical databases. The parameters necessary to process a problem can be very numerous, but in practice most of them are set by physical (geometry), geological (e.g. permeability) or basic thermodynamic data. Therefore, the number of calibration parameters remains limited, which guarantees the robustness and predictive power of the simulations.
We should also mention that the code was developed internally. But for visualizing the results, we use standard external solutions.
This brings us to the issue of skills. Without even taking into account engineering skills such as computer programming, at least two major scientific fields are involved. What skills are used within the team?
Vincent Lagneau – I have a background in applied mathematics with a training in hydrogeology. Laurent is a fundamental chemist who has turned to geochemistry. We also have colleagues who are geochemists and numerical analysts. The team as a whole – but also each member individually – stands at the crossroads of at least two or three disciplines.
Laurent De Windt – Numerical simulation has established itself, alongside experiment and theory, as one of the main paths in geoscience. In itself, this is a novelty, and the reason why our model is so innovative. It is no coincidence that the founding article of this branch dates back to 1989, when computers started to become really powerful. Between the time when Jan van der Lee wrote his thesis and today, the computing power available to researchers has increased significantly. Thanks to Moore’s law, we can describe much more precisely the phenomena that we model.
Some of the laws we use have been known for over a century: for example, Darcy’s law on the permeability of porous media. But computing capabilities did not allow us to combine all these parameters, because thorough modeling requires hundreds of millions, even billions, of calculations.
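Darcy's law itself is simple; here it is in its most basic 1D form, as a textbook illustration rather than anything from Hytec: the specific discharge is proportional to the hydraulic head gradient.

```python
# Darcy's law in 1D (textbook form): q = -K * dh/dx, where K is the
# hydraulic conductivity (m/s) and dh/dx the head gradient.

def darcy_flux(conductivity, head_upstream, head_downstream, length):
    """Specific discharge (m/s) through a 1D porous column."""
    gradient = (head_downstream - head_upstream) / length
    return -conductivity * gradient

# A sand column: K = 1e-4 m/s, a 1 m head drop over a 10 m column.
print(darcy_flux(1e-4, head_upstream=11.0, head_downstream=10.0, length=10.0))
# ~1e-05 m/s, flowing from high head toward low head
```

The complexity the interview describes comes not from any single law like this one, but from evaluating it, together with the chemistry, in every cell of a large domain at every time step.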
The additional complexity is due, in particular, to the objects described. For approximately a decade, we were describing an interface between two materials. But we quickly moved to more complex objects, while remaining within the geometries of industrial objects – typically, galleries. Around 2007, we began to think differently, and this led us to a second breakthrough, after the fundamental breakthrough of combining phenomena.
This second breakthrough consisted in reasoning no longer on homogeneous objects, but on the geological environment, by including its spatial variability and inherent uncertainty. The medium has intrinsic properties: porosity, distribution of minerals in the rock... About ten years ago, we started to include these aspects of the spatial variability of the geological environment in our simulations. This further increased computing requirements.
When the model is applied to a uranium mine, for example, the spatial variability of uranium content requires significant calculation capacities – even if only a few variables are taken into account. Based on the variables, the model must produce tens or even hundreds of representations, which are then compared with reality.
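That multi-realization workflow can be sketched as follows. This is an assumed workflow with a stand-in grade model (independent Gaussian draws, where a real study would use a proper geostatistical simulation) and a deliberately trivial forward model:

```python
# Sketch of a multi-realization study (illustrative assumptions only):
# generate many equiprobable uranium-grade fields, run a forward model
# on each, and keep the spread of outcomes for comparison with data.
import random

def random_grade_field(n_cells, mean, spread, rng):
    """One equiprobable realization of uranium grade per cell (clipped at 0)."""
    return [max(0.0, rng.gauss(mean, spread)) for _ in range(n_cells)]

def recovered_uranium(grades, recovery=0.7):
    """Toy forward model: a fixed fraction of in-place uranium is recovered."""
    return recovery * sum(grades)

rng = random.Random(42)  # fixed seed for reproducibility
outcomes = [recovered_uranium(random_grade_field(100, mean=1.0, spread=0.3, rng=rng))
            for _ in range(50)]
print(min(outcomes), max(outcomes))
```

In a real study, each realization would be a full reactive transport run, which is why tens or hundreds of representations translate into heavy computing requirements.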
Since we are talking about use in an industrial context, a question arises. Your software has been developed by and for researchers. Can it also be used by engineers?
Vincent Lagneau – This is an important point. As soon as the tool leaves the lab to be used elsewhere, the quality and simplicity of the interface become crucial factors. We have always had this concern, but it translated into practical terms only a few years ago.
Thanks to our consortium structure, we began disseminating the code among engineers who, on the one hand, are not necessarily trained as hydrogeologists or geochemists, and on the other, will use the tool only part of the time.
This is all the more critical as Hytec is starting to be used as a production tool. We have been working with Areva for about ten years on modeling reactive transport in uranium mines. Uranium extraction is largely performed (and will increasingly be, because it is less expensive) by in situ recovery, i.e. by injecting a solution (e.g. sulfuric acid) to dissolve the uranium before pumping it out.
Even today, managing mines is a highly empirical process that relies on the know-how of engineers. This is exactly the kind of situation in which our codes can be used: reactive transport in a heterogeneous environment where operators work largely in the dark, seeing only what they inject and what they recover. Our tools simulate the medium and the chemistry of the reactions precisely, based on a few variables such as the porosity and permeability of the medium, the distribution of uranium (provided by a geostatistical model) and the injection rate, as well as on databases built into Hytec. All in all, two or three calibration parameters are enough to obtain a fairly precise and predictive model, which gives operators a much clearer representation than their previous empirical knowledge and opens the way to significant productivity gains.
Realizing this “proof of concept” in 2014 took us several months. The next step, in which we are engaged and supported by the National Agency for Research (ANR) through an industrial chair project, involves connecting Hytec to the needs of the mine. Rather than specifying physical parameters (permeability, porosity), we need to specify what the miner sees: well geometry, distribution of uranium, production curve, input and output flow rates...
We have added a layer that collects data from the mine and produces the curves we need. These curves do not result from a statistical calculation, as with conventional mining tools, but from completely deterministic equations.
I would add, in conclusion, that by changing the model, we don’t simply adapt it to the parameters of the mining industry. We also enrich it with new specific questions, such as the effects of fluid density on flow – a question that proved very relevant and led us to improve our model. Interacting with the industry has the virtue of broadening our view, which, in turn, helps us develop cutting-edge research.