
2

Materials Design

FIRST-DAY MORNING SESSION

Workshop planning committee member Stefano Curtarolo, Duke University, chaired the first day’s morning session, which had four speakers: Jakoah Brgoch, University of Houston; Wendy Mao, Stanford University; Nir Goldman, Lawrence Livermore National Laboratory; and Eric Homer, Brigham Young University. A brief discussion period followed each presentation.

Finding Thermally Robust Superhard Materials with Machine Learning

Superhard materials have a wide variety of uses, from drill bits to artificial joints, with annual sales expected to reach around $14 billion by 2024. As Jakoah Brgoch, University of Houston, told the workshop, the coming transition to clean energy will make it even more important to find superhard materials for use in extracting the minerals needed for clean energy technologies. Thus, Brgoch’s group has been looking for new superhard materials that can be produced efficiently and used under demanding conditions, such as high temperatures. In his presentation, he described the approach his group is using in that search, which combines computation, machine learning, and materials synthesis to develop new high-hardness materials.

Brgoch began by offering some background. He thinks of high-hardness materials as falling into two separate classes of compounds. The first are the “classic high-hardness materials” such as diamond, cubic boron nitride (c-BN), cubic BC2N, and boron suboxide (B6O). These materials have tremendous physical properties—in particular, they are very hard—but their synthesis is challenging. They require extreme pressures and temperatures—on the order of 15 gigapascals and 2,100°C—to produce. They are made industrially, but the process is costly.

The second class of materials, he continued, contains transition metal atoms along with atoms such as boron, carbon, or nitrogen from the periodic table’s main group. Examples include rhenium diboride (ReB2) and tungsten tetraboride (WB4). These materials tend to be not quite as hard as those in the first group, but they are much easier to manufacture, requiring only 1,500–2,000°C and normal pressure.

The hardness of the two groups arises in two different ways, Brgoch noted. Diamond, cubic boron nitride, and cubic BC2N are mechanically strong because their chemical bonds are arrayed in an ideal way, with the sp3 hybridized bonds coming from each atom symmetrically arranged, pointing toward the corners of a regular tetrahedron centered on that atom. This creates an extremely rigid crystalline structure. As a result, the Vickers hardness (the standard measure of hardness used in this field) of these materials is well above 40 gigapascals, which is considered the threshold for superhardness.

The mechanism behind the hardness of the materials in the second group is different. The transition metals by themselves are resistant to pressure on an atom-by-atom basis because they are rich in valence electrons, which resist a push, but the atoms in a pure metal can move around in response to pressure, thus making the pure metal not particularly hard. Combining the transition metals with boron or carbon or nitrogen leads to a much more rigid structure, Brgoch explained, as those atoms “sort of act as a scaffold that pins everything together.” The resulting Vickers hardness is generally 40 gigapascals or less, depending on the compound.

However, he continued, there is some controversy over exactly how hard these materials are because their measured hardness depends on the applied load. “If you use a very small applied load, then you get a much higher hardness than if you go to a high applied load,” Brgoch said. By contrast, diamond, cubic boron nitride, and BC2N have a Vickers hardness well above 40 gigapascals no matter what load is put on them.

A second issue with the use of the second group of materials for superhard applications, he explained, is that many of the transition metals are very scarce, so it would not be feasible to make commercial quantities of a superhard material that relied on one of the rarer transition metal elements.

Thus, Brgoch continued, his laboratory is looking for superhard materials that avoid all of these issues. They need to be in the second group for ease of manufacture, but they should not rely on a particularly rare transition metal, and they should have a Vickers hardness that is higher than usual for this group so that even under an applied load they will still be superhard.


How could such superhard materials be found? Brgoch said that his group’s first approach in such a project is to use computational modeling and molecular dynamics to design a compound with the desired properties. However, in this case it would be extremely difficult to model the movements of the molecules in these compounds accurately enough to predict the hardness of a given material.

One simple approach to predicting Vickers hardness, Brgoch said, would be to combine the values of bulk modulus and shear modulus, both of which are loosely correlated to a material’s hardness, to create an estimate of Vickers hardness. This would be convenient, as the bulk modulus and shear modulus of materials can be calculated relatively easily with density-functional theory, a computational method based on quantum mechanics. However, when his group tried this approach, they found that while they could get figures that were in the ballpark of Vickers hardness, they were not accurate enough for the group’s purposes.
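The sort of moduli-based shortcut Brgoch described can be illustrated with one widely used empirical relation, the Chen model, which maps bulk and shear moduli to an estimated Vickers hardness. The sketch below is a minimal illustration assuming that relation; the moduli values are illustrative placeholders, not numbers from the talk.

```python
# A minimal sketch of estimating Vickers hardness from elastic moduli,
# using the Chen model: Hv = 2 * (k^2 * G)^0.585 - 3, with k = G/B.
# The moduli below are illustrative, not data from the presentation.

def vickers_from_moduli(bulk_gpa: float, shear_gpa: float) -> float:
    """Estimate Vickers hardness (GPa) from bulk and shear moduli (GPa)."""
    k = shear_gpa / bulk_gpa          # Pugh's modulus ratio
    return 2.0 * (k**2 * shear_gpa) ** 0.585 - 3.0

# Example: moduli in the range reported for hard transition-metal borides.
print(vickers_from_moduli(bulk_gpa=300.0, shear_gpa=270.0))  # ~44 GPa
```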

He stated that his group decided on an alternative approach. They would use machine learning to work with the vast body of data available about the many different materials of this type in order to create a predictive algorithm that would estimate the Vickers hardness of thousands of different materials of the sort that they were interested in and thus identify candidates that were likely to be superhard. The search would also allow them to specify other parameters of interest, such as how well a material stood up to high heat.

The first step was to create a training data set covering as many materials of the right type as they could find for which Vickers hardness had been measured. “We went to the literature and we mined it,” Brgoch said. “We pulled all of the Vickers hardness that we could find. We pulled it out of graphs, we pulled it out of papers, we pulled it from the Japanese NIMS (National Institute for Materials Science) Materials Database—everything that we could get our hands on.” They ended up with Vickers hardness data on 1,063 different materials, supplemented with hardness measurements that they had collected in their own laboratory. The materials in the training set included 57 different elements, among them a large percentage of the transition metals and many of the main group elements as well. The list did not include any noble gases or alkali metals, he said, which made sense because compounds with these elements are not the sort that are of interest as structural materials.

One problem with the materials they found was that almost all of them had a Vickers hardness below 40 gigapascals. “From our analysis,” he said, “there is something like only 15 percent of the compounds that are actually considered to be superhard.” So, for the purpose of predicting which materials are superhard, the training set was relatively thin. Furthermore, most of the data on Vickers hardness was taken at low applied loads, so there would be relatively few data for predicting Vickers hardness at higher applied loads.


Having come up with the set of materials with a known Vickers hardness, the next step for Brgoch’s group was to decide on a group of variables—a “feature set”—that would be used to describe each material in a way that could be coded into a computer and used for machine learning. They settled on a wide variety of variables covering the composition, structure, and physical properties of the materials and their component elements. These variables included atomic number and atomic weight, atomic radius and covalent radius, metallic valence, the number of electrons in the various electron shells, melting point and boiling point, density, thermal conductivity, and many more. Each research group decides on its own set of variables, he said, and the process is a mixture of science and art.
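To make the idea of a feature set concrete, the sketch below shows one common way to turn a composition into a fixed-length vector: composition-weighted statistics of tabulated elemental properties. It is a minimal illustration, not Brgoch’s actual feature set, and the tiny property table is a placeholder for the dozens of tabulated properties a real feature set would draw on.

```python
import numpy as np

# Illustrative stub of elemental properties; a real feature set would
# include atomic number, radii, electron counts, melting point, and more.
ELEMENT_PROPS = {
    # symbol: (atomic number, covalent radius in pm, melting point in K)
    "W":  (74, 162, 3695.0),
    "Re": (75, 151, 3459.0),
    "B":  (5,  84,  2349.0),
}

def featurize(composition: dict[str, float]) -> np.ndarray:
    """Turn {element: fraction} into composition-weighted statistics."""
    fracs = np.array(list(composition.values()))
    fracs = fracs / fracs.sum()                      # normalize fractions
    props = np.array([ELEMENT_PROPS[el] for el in composition])
    mean = fracs @ props                             # weighted mean of each property
    spread = props.max(axis=0) - props.min(axis=0)   # range of each property
    return np.concatenate([mean, spread])

print(featurize({"Re": 1, "B": 2}))  # feature vector for ReB2
```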

The next step, Brgoch continued, was to settle on an algorithm that would find relationships between the set of variables and the Vickers hardness—one that, if successful, would make it possible to accurately predict the Vickers hardness of materials for which the hardness had not been measured, based only on the other properties of the material. The team tried a variety of machine learning algorithms, such as random forest, gradient boosting, and XGBoost; the XGBoost model performed best, with an R2 correlation of 0.97 between the predicted and measured values of Vickers hardness, indicating very strong performance (Figure 2-1).
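A minimal sketch of this model-selection step is shown below, assuming the open-source xgboost and scikit-learn packages; the random arrays are stand-ins for the real feature vectors and measured hardness values, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from xgboost import XGBRegressor

# Stand-ins for the ~1,000-material training set described in the talk.
rng = np.random.default_rng(0)
X = rng.random((1063, 40))            # one feature vector per material
y = rng.random(1063) * 60.0           # measured Vickers hardness, GPa

# Fit a gradient-boosted tree ensemble and score R^2 on held-out data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBRegressor(n_estimators=500, max_depth=6, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("held-out R^2:", r2_score(y_te, model.predict(X_te)))
```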

The final step was to validate the model, which the team did both by looking at statistics and testing its predictions on materials that were not in the training set.

FIGURE 2-1 Predicting Vickers hardness with XGBoost, a gradient-boosted decision tree.
SOURCES: Jakoah Brgoch, University of Houston, presentation to the workshop, October 5, 2022. Image from Zhang et al., 2021, “Finding the Next Superhard Material Through Ensemble Learning,” Advanced Materials 33:2005112, © 2020 Wiley-VCH GmbH.

What they found, Brgoch said, was that not only did the model accurately predict the Vickers hardness of new materials, but it also predicted how the hardness changed as a function of applied load. “This is one of the first examples of being able to do that,” he highlighted.

With the model validated, the team then used it to screen other materials, looking for those that were superhard but had not yet been identified as such. “We took Pearson’s Crystal Data set and limited it to the 57 elements,” he explained. “We had 66,440 compounds—10,000 binaries, 36,000 ternaries, 20,000 quaternary compositions—that on a desktop we could predict in about 30 seconds. Once it’s trained, this is a very quick process.”

Once the predictions had been made, the team sorted them, looking not only for superhard materials—that is, those with a predicted Vickers hardness of at least 40 gigapascals—but also for materials whose hardness was predicted to change little as an increasing load was applied. Some of the materials they identified with this process were derivatives of cubic boron nitride, which they were not interested in because of the high pressures and temperatures needed to synthesize them. He emphasized that they focused instead on materials made with transition metals and main group elements and looked in particular for those that would not be super-expensive to synthesize—that is, all of the component elements should be reasonably abundant.
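A sketch of what such a screening filter might look like follows; predict_hv, the load values, and the 5-gigapascal softening tolerance are all assumptions for illustration, not criteria reported in the talk.

```python
import numpy as np

# Keep candidates predicted to be superhard at both low and high applied
# loads. `predict_hv(X, load)` stands in for the trained model evaluated
# with load as an input; X stands in for the ~66,000 screened compositions.
def screen(X, predict_hv, low_load=0.49, high_load=4.9, threshold=40.0):
    hv_low = predict_hv(X, low_load)      # predicted hardness at low load (GPa)
    hv_high = predict_hv(X, high_load)    # predicted hardness at high load (GPa)
    # Require superhardness at high load and little load-induced softening.
    keep = (hv_high >= threshold) & (hv_low - hv_high <= 5.0)
    return np.flatnonzero(keep)           # indices of surviving candidates
```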

“So as a proof of principle,” Brgoch said, “we picked this yttrium borosilicate,” whose chemical formula was YB41.2Si1.42. A student on the team synthesized the material as a pure-phase ingot using arc melting. It turned out to have “an unbelievably complex crystal structure,” Brgoch noted, with a unit cell around 1,000 angstroms across and a lot of disorder.

That complexity, he commented, would have made it impossible to calculate the material’s Vickers hardness with direct model calculations, but when the team measured the material’s hardness, they found that the measurement agreed almost perfectly with what the machine learning algorithm had predicted. It was not a superhard material—its maximum hardness was around 35 gigapascals—but as a proof of principle it showed that the algorithm could be trusted to predict Vickers hardness from basic chemical, structural, and physical information about a material.

With that success under their belts, the members of Brgoch’s team applied the same process to look for data on Vickers hardness taken at various temperatures. There was much less information on this—the team was able to find only about 500 datapoints in the literature, and almost all of the materials they found fell below the 40-gigapascal threshold for superhardness. Furthermore, almost all of the data came from measurements performed at low loads and relatively low temperatures as well.


At the same time, the group expanded its model to include more complex descriptions of the crystalline structures of the various materials in the database, using what is called a smooth overlap of atomic positions (SOAP) descriptor. Brgoch explained that this is a way to describe a crystal structure and its interatomic connections more completely than just giving it a space group number or a density.
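As an illustration, the open-source DScribe library provides one implementation of the SOAP descriptor (not necessarily the code Brgoch’s group used); the sketch below assumes DScribe 2.x parameter names, and the structure and parameter values are illustrative.

```python
from ase.build import bulk
from dscribe.descriptors import SOAP

# Build a SOAP feature vector for a simple stand-in crystal.
structure = bulk("W", "bcc", a=3.165)

soap = SOAP(
    species=["W", "B"],   # elements the descriptor should support
    r_cut=5.0,            # neighborhood radius in angstroms
    n_max=8,              # number of radial basis functions
    l_max=6,              # maximum spherical harmonics degree
    average="inner",      # average over sites -> one vector per structure
)
features = soap.create(structure)
print(features.shape)     # fixed-length vector describing the structure
```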

Next, the team applied their machine learning algorithm to create a model that predicts temperature-dependent Vickers hardness. They found that the model’s results were not as clean as the earlier results—which made sense because it was based on far fewer data—but it was still able to capture the basic trend of how Vickers hardness varied with temperature, with the hardness dropping as the temperature increased. Brgoch said that when they validated the model by comparing its predictions to data on a couple of industrial materials—B4C and ReB2—for which the literature contained detailed information on Vickers hardness as a function of temperature, they found that their model captured the experimentally measured behavior of the materials quite nicely.

In closing, Brgoch said that his team’s work shows that “development of new high-hardness materials is still possible in compounds containing transition metals and main group elements” and that “machine learning methods are definitely promising for screening a huge number of compounds.” Furthermore, by including additional parameters such as load and temperature in the model, it becomes possible to capture more complex phenomena, such as how hardness varies with temperature. “That’s really what the power of our work was,” he said, “and we can think now about including other parameters—for example, radiation exposure—if we want to start thinking about really extreme environments.”

Currently, he noted, work in the area is limited to screening large numbers of compounds that are already known. The challenge for the future will be to develop generative algorithms with the ability to design new materials with desired properties instead of simply searching through lists of already-known substances.

In the discussion period following his presentation, Brgoch addressed the issue of whether his machine learning model could be made extrapolative in addition to being interpolative. “This is a big question,” he answered, “I think it depends on how much you trust your model and want to believe your model. I tend not to. We tend to only interpolate. I think that you’re asking a lot. You can take it a little bit; you can push the bounds a little bit. But to go through and now predict the hardness of lithium fluoride where I have absolutely none of this information, maybe we should try this and see what actually comes out. I bet it’s probably nonsense, the reason being that the model has to have the information. You can’t just take and extrapolate sort of freely far beyond the bounds. And so, I’d love to say that you can, I just think that I can’t in good conscience do this.”

Brgoch also said that the scarcity of experimental materials data is one of his biggest challenges. “Things will get better as we curate and pull out more and more and more experimental data, but data are expensive,” he acknowledged, “and so figuring out ways to address this is going to be an interesting challenge.”

New Materials at High Pressure

In contrast to Brgoch, who is looking for high-hardness materials that can be produced at normal pressures, Wendy Mao, Stanford University, spoke about materials that can be produced at high pressures. The reason for creating materials under pressure is that pressure can induce profound changes, producing materials with strikingly different properties from those of the starting material. The classic example is carbon, she noted, which at normal temperature and pressure typically appears in the form of graphite—opaque, black, soft, and a semi-metal; exposing it to the right combination of high pressure and temperature changes it to diamond, which is transparent, superhard, and an electrical insulator.

Diamond is also a great example of a material that is suitable for a variety of extreme environments, Mao noted, because once it has been formed, diamond will maintain its structure across a wide range of pressures and temperatures.

More generally, she continued, pressure can produce new phases and forms in materials, it can induce and tune desirable properties, and it can also be used to provide important insights into materials’ behavior, which can in turn be useful in aiding the design and discovery of materials with exceptional properties that might be suitable for extreme environments.

The first step that Mao’s team takes in using high pressures to create new materials is to “study systems of interest over a wide range of pressures and temperatures using a suite of in situ and ex situ probes.” A typical way to do this, she explained, is to use diamond anvil cells, in which a sample is held between the flat faces of two diamonds, with a gasket preventing the sample from escaping as pressure is applied. Since the samples inside the diamond anvil cells are very small—typically less than 0.001 mm3—the cells can generate very high pressures on the sample with a relatively modest application of force. The temperature of the sample can be controlled using cryostats to go to low temperatures and laser and resistive heating techniques to go to very high temperatures. And because diamonds are transparent to a wide range of electromagnetic radiation, it is possible to observe the sample within a cell with various imaging techniques, extracting various types of information from the sample.
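The leverage a small diamond culet provides can be seen with simple arithmetic; the force and culet diameter below are assumed, representative values rather than numbers from Mao’s talk.

```python
import math

# Back-of-the-envelope illustration of why small culets give huge pressures.
force_n = 500.0                       # modest force from tightening the cell (~50 kgf)
culet_diameter_m = 100e-6             # 100-micron diamond culet, an assumed value
area_m2 = math.pi * (culet_diameter_m / 2) ** 2

pressure_gpa = force_n / area_m2 / 1e9
print(f"{pressure_gpa:.0f} GPa")      # ~64 GPa from a modest mechanical load
```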

She said that high pressures and temperatures can also be created using dynamic compression, with shockwaves generated in a material using gas guns, lasers, or magnetic fields that produce a finite jump in the sample’s pressure, temperature, and density.


Diamond anvil cells can generate pressures of greater than 500 gigapascals, Mao said, while the dynamic techniques can be used to create pressures of 1,000 gigapascals or more in a high-power laser facility.

A wide variety of techniques are used to characterize the samples at high pressures and temperatures, including X-ray scattering, neutron scattering, Raman spectroscopy, optical microscopy, electron scattering, resistivity measurements, and many more. Taken individually, Mao said, the observations do not provide a full picture of the sample, but it is possible to integrate the different observations to piece together a relatively complete understanding. The observations can also be complemented by theory and simulations, she added, which can be very powerful for making predictions and for helping to interpret experimental results.

Mao then offered several examples of the sorts of observations her team has made of high-pressure samples. One example was radial X-ray diffraction, where an X-ray beam is aimed at the sample perpendicular to the compression axis, which makes it possible to measure properties such as shear strength, elasticity, and texture development. High-power optical lasers can be coupled with an ultra-fast, ultra-bright X-ray source to take time-resolved snapshots as a shockwave passes through a sample. And nanoscale transmission X-ray microscopy can be used to take high-resolution images that can be combined for three-dimensional tomography with a resolution on the order of tens of nanometers.

Having provided this background, Mao then described some of the work being done to use high pressures to produce new materials. She focused on a number of different types of carbon materials.

Mao said that glassy carbon is an amorphous form of carbon in which the bonds are sp2 type—that is, the bonds that are present in the two-dimensional carbon sheets that make up graphene. (Graphene is simply a single sheet of graphite, so it is one carbon atom thick, but extends out in the other two dimensions.) Seen at an atomic level, glassy carbon looks like a collection of small, twisting pieces of these two-dimensional sheets, much like someone had crumpled up randomly shaped pieces of chicken wire and pushed them all together in a mass. Glassy carbon combines some of the properties of glass and ceramics with some of the properties of graphite.

When Mao’s group took a 10-micron sphere of glassy carbon and subjected it to high pressures, they observed some intriguing transformations. As the pressure increases up to 40 gigapascals, the sphere is progressively squashed, becoming denser and denser, and that increased density remains even when the pressure is removed. Measurements indicated that after being exposed to 40 gigapascals, the density of the glassy carbon was permanently increased by 30 percent.

Furthermore, after reaching about 40 gigapascals, all of the carbon–carbon bonds in the material have changed from sp2 to sp3, the type of carbon–carbon bonds found in diamond. At the same time, the material has become superhard and able to maintain a 70-gigapascal pressure difference across the sample. However, X-ray diffraction shows that the sample has not crystallized, and it is still black.

If the sample is laser heated at a high pressure—for example, she said, around 50 gigapascals—it becomes transparent, more like a diamond. It is no longer glassy carbon, but nor is it crystalline diamond, as one would get by exposing graphite to high pressure and temperature. Instead, it is “amorphous diamond,” a new material with properties that lie between diamond and glassy carbon.

Next, Mao described her team’s research on “diamondoids,” which she described as “cage-like ultra-stable molecules that form carbon–carbon frameworks” that can be superimposed on a diamond lattice. “Think of them as cages of diamond. They’re all terminated by hydrogens.” The simplest diamondoid is adamantane, which is one cage unit. Diamantane has two fused cage units, triamantane has three, and so forth.

These diamondoids are great precursors for making diamond, she said. When her team compressed and heated a series of diamondoids, they found that the lower-order diamondoids transform to diamond at pressures as low as 12 gigapascals and temperatures as low as 625°C. This temperature is surprisingly low, especially considering that there is no catalyst involved, she said.

The final example Mao offered was of a new method for creating graphene nanoribbons. These can be thought of as strips of graphene, and because they have a tunable band gap, they have some very useful properties. However, it is difficult to make these graphene nanoribbons, especially long, thin ones with smooth edges, so a faculty member in the Stanford chemistry department suggested that it might be easier to create nanoribbons by first synthesizing carbon nanotubes and then flattening the tubes in a diamond anvil cell. Carbon nanotubes, which consist of a single layer (or sometimes multiple layers) of graphene rolled up on itself to form a long, flexible straw, are much simpler to create than graphene nanoribbons.

Mao’s team tried it, and it worked. Heating and compressing single-wall and double-wall carbon nanotubes in a diamond anvil cell, they produced two-layer and four-layer graphene nanoribbons that were long and thin with atomically smooth edges.

A key feature of some of the materials in these examples, she noted, is that they maintain their structure after the pressure and temperature are returned to normal. That is, they are metastable—they have lower-energy configurations that they would “prefer” to be in, but they cannot transition to those lower-energy configurations without some outside help, such as by being heated up to a high temperature. Without this property, it does no good to create an interesting material at high temperature and pressure because it will change into a different material when the conditions are returned to normal.


Along those lines, Mao first described some recent work with cesium lead iodide where tuning certain factors made it possible to form nanocrystals of a particular perovskite-structured phase that were metastable and did not return to a non-perovskite phase when the pressure and temperature returned to normal. She thinks there are still a lot of tuning parameters that can be explored to try to maximize metastability in materials that may be formed at high pressure.

“But what if you have a material that has very limited metastability,” she asked, “and it readily back transforms to the low-pressure phase? Or what if you have a material that has great properties at high pressure, but once you release pressure, again you lose those properties?” How can such a phase be preserved under ambient conditions?

Nature offers one answer in the form of diamond inclusions. When a diamond is formed in Earth’s mantle, it sometimes has tiny pockets of another material that has also been formed at high temperature and pressure. Then when the diamond comes to Earth’s surface, the diamond maintains the pressure inside the inclusion, allowing the material in that inclusion to maintain the phase it was in when the diamond was still in the mantle, even though the material would relax out of that phase if it were outside the diamond. “The message here is that diamond is a great envelope, a great container for preserving pressure,” Mao said.

So, her team has done something similar with argon in diamond. They loaded a diamond anvil with glassy carbon and argon, which diffused into the nanopores of the glassy carbon. When this was all exposed to pressures of greater than 20 gigapascals, the glassy carbon was transformed into nanodiamond, which held high-pressure argon in its inclusions (Figure 2-2). When the sample was brought back down to ambient conditions, the nanodiamond maintained its structure and kept the enclosed argon at pressures over 20 gigapascals. “The argon crystals were still at high pressure,” she said, “even though you took the sample out of the diamond cell.”

One benefit of this technique, she highlighted, is that the high-pressure argon encased in the nanodiamond can be characterized by various probes, such as transmission electron microscopy, that only work in vacuum or ambient conditions. This will open many doors, making it possible to create new phases of materials at high pressure and then examine them with standard techniques even if those new phases are not stable at ambient conditions. Pressure, she concluded, greatly expands the playing field in which new phases and novel phenomena can be discovered.

FIGURE 2-2 Synthetic diamond inclusion process.
SOURCES: Wendy Mao, Stanford University, presentation to the workshop, October 5, 2022. Images from Zeng et al., 2022, “Preservation of High-Pressure Volatiles in Nanostructured Diamond Capsules,” Nature 608:513–517, https://doi.org/10.1038/s41586-022-04955-z, © 2020 Springer Nature, reproduced with permission from SNCSC.

In the discussion following her presentation, Mao responded to a question by saying that a diamond anvil cell can be used not only to apply pressure to a sample but also to apply rotational shear, which is “another tuning parameter.” In response to another question, she said that at very high pressures, such as above 100 gigapascals, the properties of materials tend to change profoundly—their magnetism, electron activity, and so on. She also noted that it is very difficult to measure the temperatures of samples at very high pressures; pyrometry can be applied, but it is hard. Last, Mao spoke to the issue of preserving metastable phases created at high pressures when the conditions are returned to normal. There are various approaches, including changing the microstructure. “In Science,” she said, “we had some model nanocrystal systems where they did inter-particle sintering and found that’s able to further maximize the kinetic barrier, and they found out those phases didn’t go back, and now also they add some ligands.”

Machine Learning Tools for Extending Quantum Simulations of Reacting Materials

Nir Goldman, Lawrence Livermore National Laboratory, spoke about using machine learning to fill in a gap between what quantum simulations can provide and what experimentalists would like to know in terms of synthesizing materials. In particular, he spoke about a specific model, the Chebyshev Interaction Model for Efficient Simulation (ChIMES), as a way of taking quantum simulations and extending them to much larger scales.


As context, he discussed the difficulty of creating new materials with desired properties. Finding the right starting materials and the right reaction conditions is a huge combinatorics problem with many different variables that can be changed. Furthermore, the development of additive manufacturing has added another layer of complexity because one needs to consider not only the chemistry of the materials and how they interact but also their grain size and the geometry of the printed material. “All of these things, in particular under dynamic compression, can lead you to different chemical paths,” he said. “So if you have a desired outcome and you want to do materials design and additive manufacturing, you have this gigantic problem to deal with.”

The traditional approach to solving such a problem would be trial and error: start with a molecule, synthesize it, create some sort of device with it, then measure it, and repeat over and over again until a suitable path is found to the desired material. But today, he explained, what experimentalists and synthesists are looking for is something that is “plug and play”—that is, something that they can download that will help guide them and severely reduce the number of permutations they have to explore. Ultimately, the holy grail would be an inverse-design process that would take a list of desired features and provide instructions for how to create a material with those properties.

That is what the people who work with machine learning tools eventually hope to provide, Goldman said, but the current situation is far from that. “The problem is that what people like me generally deliver is much more cumbersome to build,” he noted. “We don’t have very many tools that are kind of plug and play, meaning you can just download it and use it. Instead, we tend to have, particularly for new materials, a fairly burdensome way of doing things that’s highly effective but time consuming.”

The starting point for determining how to create a particular material is generally ab initio molecular dynamics—that is, calculating how atoms and molecules interact from first principles. These calculations are very computationally intensive and hard to generalize, Goldman said, but they have the advantage that they can be very accurate.

The particular quantum mechanical approach used in this calculation is called density functional theory (DFT), which Goldman called the “gold standard in condensed matter quantum mechanical calculations.” At each time-step in the calculations, a potential energy surface is calculated, and the electrons are treated quantum mechanically, while ions are generally propagated with classical equations. The result is a description of the chemistry and physics of molecules that has very few a priori inputs.

In general, DFT methods are limited to very small spatial and timescales—typically tens of picoseconds and single nanometers. In these restricted regimes, the calculations are very accurate. However, they require significant amounts of computing time, and if one wants to carry out calculations over a large array of possibilities, it becomes quite costly very quickly. The exploration of even simple materials is quite expensive, Goldman explained, which is an obstacle to creating data sets of materials that other researchers can use. As a result, there is a gap in the time and length scales between what the DFT calculations can practically provide and the experimental scales that researchers are interested in, which are much larger and longer.

Machine learning has great potential to bridge that gap in time and length scale between DFT or other quantum calculations and experiments, Goldman said. In the remainder of his talk, he described two types of machine learning approaches that his group has worked on. One is a classical molecular dynamics approach that involves no quantum mechanical calculations, while the other involves a mixture of quantum mechanical approaches and empirical functions. In both cases, the goal is to provide users with something that is relatively easy to learn and use.

In the first case, his group has worked to make molecular dynamics models easier to create by using linearly parameterized functions to speed up the optimization process. “We have very expensive DFT or quantum calculations,” he explained, “and we want to map them onto something very cheap.” What makes the quantum calculations so difficult to perform is the underlying potential, which is generally highly nonlinear. To make the computations easier, Goldman’s approach is to approximate the potential with linear combinations of Chebyshev polynomials, transforming a nonlinear least-squares problem into a linear least-squares problem.

In particular, the group’s model, ChIMES, is a way to decompose a many-body problem into pieces that are easier to work with. The many-body problem, in which there are multiple particles all interacting with one another, is a famously difficult problem to solve because the jointly interacting particles create a complex array of forces and potentials. ChIMES breaks down the many-body problem into a collection of simpler interactions: first all the pairs of particles creating two-body interactions, then all the triples of particles creating three-body interactions, and so on. A Chebyshev polynomial is constructed for each interaction, so the total interaction among all the particles can be expressed as a linear combination of Chebyshev polynomials. The quantum calculations can then be performed with the linear combination of Chebyshev polynomials in place of the complex, nonlinear many-body potential. And solving a ChIMES model is a linear optimization problem, which can be carried out much more quickly than the original nonlinear problem. In practice, Goldman said, the group generally uses only Chebyshev polynomials representing the two-body, three-body, and four-body interactions because the problem starts to get very complex beyond that point, but leaving out the fifth-order and higher interactions does not hurt the accuracy of the model too much.
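The linearization at the heart of this approach can be illustrated with a two-body toy problem: expand a pair potential in Chebyshev polynomials so that fitting reduces to linear least squares. The Morse-like “reference” energies below are a stand-in for real DFT training data, and this sketch omits the three- and four-body terms a real ChIMES fit would include.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Stand-in reference data: pair distances and Morse-like energies playing
# the role of expensive DFT results.
r = np.linspace(1.0, 5.0, 200)                      # pair distances (angstroms)
e_ref = (1 - np.exp(-1.5 * (r - 2.0)))**2 - 1.0     # stand-in reference energies

# Map distances onto [-1, 1], where Chebyshev polynomials are defined.
x = 2 * (r - r.min()) / (r.max() - r.min()) - 1

# Design matrix of T_0(x) ... T_12(x); the fit is now linear in the
# expansion coefficients, so ordinary linear least squares solves it.
A = C.chebvander(x, deg=12)
coeffs, *_ = np.linalg.lstsq(A, e_ref, rcond=None)

print(np.max(np.abs(A @ coeffs - e_ref)))           # worst-case fit error
```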

To use a ChIMES model, Goldman explained, one starts with a DFT molecular dynamics model to generate some training data—say, on a solid material at various pressures and temperatures. “That’s where most of your time will be spent,” he said. Then the ChIMES model is trained through carrying out many trial-and-error calculations to optimally reproduce the force fields in the training data. The advantage of using ChIMES is that the optimization step is usually extremely fast.

As an example, he described work done by Becky Lindsey, a postdoctoral researcher in his group, on analyzing what happens to carbon monoxide (CO) at very high temperature and pressure. In particular, she modeled the reactivity and phase separation in liquid CO at 6,500 K and 26 gigapascals. This is a question of interest to those who study what happens in planetary interiors, for instance. Carbon-rich materials are known to condense into nanometer-scale carbon clusters (i.e., soot particles) over nanosecond timescales, but DFT cannot model the process because its timescale is too long for those calculations, and experimentalists studying the growth mechanisms, kinetics, and structures involved in the process have had difficulties interpreting their experimental results, so the ChIMES approach promised to offer some new insights.

After fitting her model to DFT data, Lindsey scaled it up to over 1 million atoms and a timescale approaching 1 ns. Such scaling is possible because the model is linear, Goldman noted. When Lindsey ran simulations using her model, she got a clear picture of what was happening with carbon atoms at this temperature and pressure. “They form droplets on the order of a few nanometers across, with kind of a layer of oxygen atoms on the outside,” Goldman said. Lindsey was also able to calculate some nucleation and growth-related parameters, and one of the things she found was that to really understand the process she had to go to simulations with around 10,000 atoms or more—far beyond what DFT can handle. She also found that it took around 100 picoseconds for the carbon droplets to fully nucleate, which is something that could be tested experimentally.

And, indeed, when others in Goldman’s laboratory carried out experiments on CO at these temperatures and pressures, they found evidence that carbon droplets were forming at approximately the size that Lindsey’s work had predicted and on approximately the same timescale. This is an indication that ChIMES can provide the “handshake” that connects experiments with computationally intensive quantum calculations, Goldman concluded.

Switching then to the second machine learning approach that his team has developed, Goldman described how the group has used semi-empirical quantum mechanics to get closer to something that is plug and play—that is, much more ready for download and use. In particular, they use data from existing high-accuracy quantum chemical databases to improve semi-empirical quantum mechanical models via machine learning.

As an example of such databases, he mentioned the ANI databases created by Adrian Roitberg, Olexandr Isayev, and Sergei Tretiak, which have millions of data points created by DFT (Smith et al. 2020). The efforts of those three researchers were originally focused on training neural network models of potentials, so there was no quantum mechanics component in the original work with those data. This provided results that were reasonably accurate at a low cost. By contrast, a coupled-cluster approach has the highest accuracy, but it is very expensive. Using machine learning approaches with semi-empirical quantum mechanics makes it possible to approach coupled-cluster accuracy while keeping computational costs down, Goldman said.

The semi-empirical method that Goldman’s group uses is called the density functional tight binding (DFTB) method. It is derived directly from density functional theory and can be thought of as approximate density functional theory, but the quantum mechanical parts—which represent the band structure and Coulomb forces—are “pre-tabulated” via approximate quantum mechanics, while the short-range repulsive energy is represented by an empirically fit function. In this form, the remaining empirical part of the model is much easier for machine learning to determine.

This idea of improving semi-empirical quantum mechanics with machine learning is not new, Goldman said, and there have been various approaches, including several that use neural networks, but those methods tend to be very data hungry, and they also tend to involve nonlinear functions, which can make it difficult to train the neural networks. So, Goldman’s group chose to use ChIMES as their machine learning approach to working with semi-empirical quantum mechanics.

Testing with data from the ANI data set showed that Goldman’s DFTB/ChIMES approach could make accurate models with a relatively small amount of training data. In another test, the DFTB/ChIMES method was able to give the correct stability order for coumarin molecular crystal polymorphs, which is a grand challenge in crystal structure prediction.

In summary, Goldman said that most users in this field would like a “plug and play” system where they can download code and use it with minimal effort. ChIMES is not plug and play, but it can reduce the burden of creating new reactive molecular dynamics models through rapid parameterization, which represents progress. His team is working to make high-accuracy, plug-and-play versions of DFTB/ChIMES, including plans to extend the model to molecular condensed phases in the future. Last, training DFTB/ChIMES can be less burdensome as well, as small data sets can be used along with rapid screening.

In the discussion period following the presentation, Goldman responded to a question about the cost of ChIMES by saying that it tends to be about as expensive as the Spectral Neighbor Analysis Potential (SNAP) model, though not as cheap as the Embedded Atom Model (EAM), “where you can do multi-billion-atom simulations.” Responding to another question, he said it might be possible to embed physical laws or principles into ChIMES. “It’s something we have been experimenting with a bit more recently,” he said, mentioning computing the four-body-interaction term with density functional theory and inserting that term, for instance.


The Impact of Well-Curated Data Sets

In the session’s final presentation, Eric Homer, Brigham Young University, spoke about the importance of well-curated data sets and the effect that they can have on a field, using his own field, the study of the structure and properties of grain boundaries, as an example. In particular, he focused on the effects of one data set developed by David Olmsted, Stephen Foiles, and Elizabeth A. Holm (Olmsted et al. 2009a,b). The data set contained information on 388 computed grain boundaries in four face-centered cubic metals: nickel, aluminum, copper, and gold. Even though the data set was not that large, Homer said that it enabled researchers to learn a great deal about grain boundaries, and it moved the entire field forward in various ways. Thus, it serves as an exemplar of how well-curated data sets can help researchers understand their field better and answer questions that they could not otherwise address.

In the first part of his presentation, Homer spoke about how the Olmsted data set affected work on grain boundary energy. Then he turned to the data set’s effects on work related to grain boundary migration and wrapped up his talk with a look to the future.

Studying grain boundaries is crucial to understanding and shaping the properties of materials, he emphasized. Most solid materials are composed, at the microscopic level, of individual crystalline grains pressed up against each other. The macroscopic properties of the material are dependent not only on the properties of the individual microscopic crystals (grains) but also on what happens at the boundaries between these grains. The contribution of the grain boundaries to the properties of the material is the more difficult issue to study because the crystalline structures of the individual grains are relatively simple, while grain boundaries are quite complex and variable, depending on such things as the size and shape of the grains and how the crystalline structures of two adjacent grains are aligned relative to each other.

The Olmsted data set was constructed by computing the atomic arrangements of 388 different cases in which two grains were next to each other at differing relative orientations. “At the macroscopic level, a grain boundary is characterized by its misorientation and boundary plane orientation,” Homer explained, “but each of these grain boundaries also has microscopic degrees of freedom”—that is, how the individual atoms at the boundary between the two grains are organized. The Olmsted group would choose a macroscopic arrangement of two grains and then carry out a minimum-energy analysis to determine how the atoms at the grain boundary would be arranged. They did this for each of 388 macroscopic arrangements, covering a large variety of orientations, to create the data set.
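The misorientation piece of this macroscopic characterization can be computed with a few lines of linear algebra; the sketch below uses random orientations as stand-ins and, for simplicity, ignores the crystal symmetry operators a real analysis would minimize over, as well as the boundary-plane normal that supplies the remaining two of the five macroscopic degrees of freedom.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Random stand-ins for the crystallographic orientations of two grains.
g1 = R.random(random_state=1)         # orientation of grain 1
g2 = R.random(random_state=2)         # orientation of grain 2

# The misorientation is the rotation carrying grain 1's frame to grain 2's.
misorientation = g2 * g1.inv()
angle_deg = np.degrees(misorientation.magnitude())
print(f"misorientation angle: {angle_deg:.1f} degrees")
```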

One of the first things that was done with this data set after its publication, Homer said, was that it was compared with the results of experiments. The calculated values of the boundary energy for different grain alignments were shown to agree well with experimentally measured values; in another study, the calculated boundary energy values were found to be inversely proportional to the frequency of occurrence for the particular grain alignment, which is what was expected. This showed that the predictions of grain boundary energy from the simulations could be trusted to validate or inform experimental results.

The Olmsted database was also used to create a function that would predict grain boundary energy over the full five-dimensional (5D) space of possible alignments between two grains (Figure 2-3, left). This had not been possible before the publication of the database, Homer said, and it was a huge advance to be able to predict grain boundary energy for any alignment between two grains. That also made it possible to study structure–property relationships, examining how the grain boundary energy and other properties changed as the misorientation angle between the two grains varies; the resulting research showed that the properties varied smoothly.

Another important insight gained from the data set was that the grain boundary energies for nickel, copper, aluminum, and gold were almost all linearly correlated and that only two fitting parameters were needed to specify the relationship among the grain boundary energies of the four metals (Figure 2-3, right). This insight—that face-centered cubic (FCC) materials such as nickel, copper, and aluminum have very similar grain-boundary behaviors—may have been expected previously, Homer said, but now there is evidence that it is true, which is important.

The data set also made it possible to look at the atomic structures of the grain boundaries—the microscopic structures composed of individual atoms—and see how they related to various grain boundary properties as well. Since the atomic structure of a boundary can be quite complex and can differ from the structure of other boundaries significantly, Homer’s group used machine learning to look for patterns. “We’re able to predict grain boundary energy, grain boundary mobility, and whether grain boundaries are shear coupled or not using this machine learning,” he said. “And this is coming from when we only have 388 datapoints in this data set. That’s actually pretty impressive that we can predict with such a small data set.”

The machine learning also did such things as find structures that correlated with certain behaviors—for example, a particular atomic arrangement in a boundary being correlated with certain migration behaviors, that is, how the grain boundary would shift over time as the adjacent grains grew or shrank.

FIGURE 2-3 Grain boundary energy over the five-dimensional space of possible alignments.
SOURCES: Eric Homer, Brigham Young University, presentation to the workshop, October 5, 2022. Reprinted from V.V. Bulatov, B.W. Reed, and M. Kumar, 2014, “Grain Boundary Energy Function for FCC Metals,” Acta Materialia 65:161–175; Copyright 2022, with permission from Elsevier.

“So, all of these insights that I think are quite significant—experimental validation of grain boundary energies, understanding their structure, being able to interpolate energy over the 5D space, commonalities in FCC materials, and then finally getting these structure–property relationships from both an atomic and crystallographic point of view—come from the single data set that David Olmsted created,” Homer concluded.

Switching gears, he noted that the insights he had been discussing were obtained directly from the Olmsted database, but other insights were gained because the database inspired new work. For example, most of the boundaries in the data set exhibited thermally activated boundary migration or mobility, which is what one would expect, but a higher-than-expected number showed anti-thermal boundary migration, in which the degree of mobility was inversely related to the temperature—that is, there was more boundary migration at lower temperatures. He noted that this behavior is not covered in textbook treatments of grain boundary migration, and everything one would expect suggests that migration should be thermally activated.

Homer’s group looked more closely at what was going on in the cases of anti-thermal boundary migration. They found that it is a real phenomenon—in some cases, for instance, grain coarsening is faster at cryogenic temperatures than at room temperature—and they uncovered the explanation for why such anti-thermal (or non-Arrhenius, as it is more accurately termed) boundary migration takes place. It turns out that the model that had been used to explain boundary migration was accurate, but people had been making incorrect assumptions in applying it.
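The contrast can be made concrete by comparing the standard Arrhenius mobility form, M(T) = M0 exp(-Q/kBT), against a mobility that falls as temperature rises; all parameter values below are invented for illustration, with only the opposite temperature trends intended to match the behavior Homer described.

```python
import numpy as np

kB = 8.617e-5                                   # Boltzmann constant, eV/K
T = np.array([100.0, 300.0, 900.0])             # temperatures, K

# Thermally activated (Arrhenius) mobility rises with temperature...
arrhenius = 1.0 * np.exp(-0.3 / (kB * T))       # illustrative M0 and Q
# ...while an anti-thermal boundary becomes more mobile as T drops.
anti_thermal = 1.0e-3 * (300.0 / T)             # illustrative trend only

for t, m1, m2 in zip(T, arrhenius, anti_thermal):
    print(f"{t:5.0f} K  Arrhenius {m1:.2e}  anti-thermal {m2:.2e}")
```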

This exemplifies another benefit of the Olmsted database, Homer said: “It has reinvigorated the field in terms of studying grain boundary migration.” More generally, he said, “It has moved our field forward in important ways. It has led to the development of new methods. It has led to improved understanding of various phenomena. It has led to a better understanding of the sorts of conditions that are necessary for the success of various experiments. And it has been used directly in the study of many phenomena.”

Looking to the future, Homer noted that there are other databases available to work with as well. Indeed, seeing how valuable Olmsted’s database was, even with only 388 grain boundaries, his team decided to create one of their own. It is a computed data set of 7,304 unique aluminum grain boundaries, where the five-dimensional grain-boundary space has been comprehensively studied, providing information on grain boundaries of all possible alignments, whereas Olmsted’s set was very sparse in various parts of that 5D space. And not only does the new database include the 7,304 stable, minimum-energy grain boundaries, but it also includes more than 43 million meta-stable configurations, where the atoms in a grain boundary are not in the lowest-energy configuration but are in a configuration that may appear in nature. “If you want to do machine learning, this could be an incredible training set,” he said, adding that the database will be available to the community starting in January 2023.

Homer closed with a list of important questions for the community to answer as its members try to figure out how best to do data-driven research: How do we promote the creation of useful data sets that can be shared and examined by all? What size data sets provide the maximum return on investment? How do we promote standardization of methods or best practices? How can we ensure high-quality data when it is produced in large quantities that make it difficult to verify every result? How do we promote and share codes that follow best practices in coding to enable repeatability and avoid the reinvention of software? And, last, how do we maintain shared codes when there is little incentive to do so?

In response to a question in the discussion session following his presentation, Homer elaborated on the issue of how to incentivize people to maintain and improve codes. It is important to align incentives, he said. “Having a paper is what our sponsors care about. Having a code is not what they care about, or they care about that less.… If I don’t get credit for doing something and maintaining something that is benefiting the community, I have no reason to continue to do that. So, my answer is aligning incentives, but I don’t know what that looks like.”

FIRST-DAY AFTERNOON SESSION

Horacio Espinosa, Northwestern University, chaired the first day’s afternoon session, which had three speakers: Christopher Weinberger, Colorado State University; Penghui Cao, University of California, Irvine; and Shyue Ping Ong, University of California, San Diego. A brief discussion period followed each presentation.

Ultra-High-Temperature Ceramic Phases and Compositional Complexity

In the afternoon’s first session, Christopher Weinberger, Colorado State University, spoke about the compositional complexity involved in finding ultra-high-temperature ceramic phases. The context, he explained, is the search for materials that can protect a space shuttle as it reenters Earth’s atmosphere or a hypersonic vehicle as it flies through the lower atmosphere. Such materials would cover the tips and leading edges of these vehicles as well as propulsion system components, and they would need to have high melting temperatures—preferably above 3,000°C—as well as mechanical strength, oxidation resistance, good thermal conductivity, and thermal shock resistance.

The materials currently used for such purposes are reinforced carbon–carbon composites. Going beyond that, however, will require a new class of materials, and some of the most promising are borides, carbides, nitrides, and oxides of early transition metals such as zirconium diboride (ZrB2), hafnium diboride (HfB2), hafnium carbide (HfC), hafnium nitride (HfN), and tantalum carbide (TaC). There are many examples of such materials with melting temperatures above 2,000°C; however, requiring a melting temperature of 3,000°C or greater reduces the potential pool from around 300 options to only about 15.

One way to get more candidate materials is to increase the compositional complexity by having more elements in the given material. For example, high-entropy alloys are often defined as having five or more principal elements, each accounting for 5–35 percent of the total composition. Moving from the more traditional materials to these high-entropy alloys provides a lot more options to someone looking for materials with a particular set of properties, such as the ability to withstand ultra-high temperatures (i.e., above 3,000°C). And it does not have to be just high-entropy alloys, Weinberger said—high-entropy ceramics are another possibility.

Researchers interested in exploring high-entropy materials in a search for ultra-high-temperature materials with other desirable properties will face several challenges, Weinberger said. One is the complexity of the different phases in which a material can appear, depending on the material’s precise composition, the temperature and pressure it is exposed to, and the details of the material’s construction. To illustrate, he showed a couple of phase diagrams of carbon–tantalum compounds; as the phase diagrams illustrated, the compounds would take on a dozen or more different crystalline structures depending on the percentages of carbon and tantalum and on the temperature.

In short, there is significant complexity in the different phases that can form—and that is with just two elements. The complexity of the phases of a high-entropy material would be far greater. “That’s certainly important to think about if you’re thinking about a design space and how to select materials,” he commented.

Several approaches exist for exploring the properties of these materials, Weinberger said. One well-established approach, for instance, is the calculation of phase diagrams (CALPHAD) method, which allows one to determine the boundaries between phases if the Gibbs free energies of the phases are known; a weakness of this approach is that it works only for known phases and cannot predict the existence of others. Another approach is DFT, which allows one to search for and explore different crystalline structures but is very computation-intensive and thus expensive for complex calculations. Then there are a variety of methods that can be used to try to understand order–disorder phase transitions—that is, the point at which the atoms in a material organize into a crystalline structure.
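As a concrete illustration of the CALPHAD idea, the sketch below is a toy, not a real assessment: the enthalpy and entropy values are invented, and real CALPHAD databases use temperature-dependent Gibbs energy polynomials. It finds the boundary between two hypothetical phases as the temperature at which their Gibbs energies cross.

```python
# Toy two-phase example with constant enthalpy (J/mol) and entropy (J/mol/K);
# real CALPHAD assessments use temperature-dependent Gibbs energy functions.
H_alpha, S_alpha = -50_000.0, 30.0   # invented values
H_beta,  S_beta  = -40_000.0, 38.0   # invented values

def gibbs(H, S, T):
    return H - T * S

# The stable phase at each temperature is the one with the lower Gibbs energy;
# the phase boundary sits where the two curves cross.
T_cross = (H_beta - H_alpha) / (S_beta - S_alpha)
print(f"alpha/beta boundary at T = {T_cross:.0f} K")
for T in (800.0, 1600.0):
    stable = "alpha" if gibbs(H_alpha, S_alpha, T) < gibbs(H_beta, S_beta, T) else "beta"
    print(f"T = {T:6.0f} K: stable phase = {stable}")
```

As the code notes, CALPHAD can only interpolate among phases whose Gibbs energy functions have been assessed, which is exactly the weakness Weinberger identified.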

Extending these approaches to compositionally complex materials will be challenging, Weinberger said. Researchers will start to generate enormous amounts of data, which will need to be complemented with experiments. All of this will demand careful consideration of the best approaches, whether CALPHAD, DFT, or something else.

He continued that this will be just the beginning because those approaches will only provide information about the stability of different phases. Beyond that, researchers will be interested in certain properties of the materials they are considering—properties such as high melting temperature, high hardness, high fracture toughness, oxidation resistance, thermal conductivity, and suitable diffusion and creep. He emphasized that “we have to know how to engineer them or select them if we’re going to start mixing a lot of elements.”

There are various techniques available to tackle these issues. For instance, Qi-Jun Hong and Axel van de Walle reported using molecular dynamics calculations to identify a material composed of hafnium, nitrogen, and carbon that is predicted to have a higher melting point than any experimentally tested material (Hong and van de Walle 2015). The researchers did it by developing a DFT-based molecular dynamics method that predicts melting temperatures, validating it against several experimental measurements, and then using the model to suggest a new material with a higher melting temperature than had ever been observed.

In general, Weinberger said, scientists understand many of the factors that lead a material to have a high melting temperature, such as a high enthalpy of fusion and high entropy. But understanding and predicting melting points in compositionally complex materials is a much bigger problem, and DFT calculations could be impractically expensive. “So how do we explore that space?” he asked. “How do we start predicting melting temperatures?”

Switching to the question of how to predict hardness in materials, Weinberger referred to Jakoah Brgoch’s presentation earlier in the day on using machine learning to find thermally robust superhard materials and then offered some additional thoughts. A common approach to predicting material hardness, he said, is to use elastic constants, but there are cases where hardness does not behave as the elastic constants would suggest. For instance, when the amount of carbon in a transition metal carbide, such as niobium carbide, decreases, the normal assumption would be that breaking covalent bonds causes the elastic constants to go down and the hardness to decrease. But the opposite happens—the hardness increases—and so there are still many questions about what causes hardness in materials.
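One widely used elastic-constant hardness model of the kind Weinberger alluded to is the empirical relation of Chen and co-workers, which estimates Vickers hardness from the shear and bulk moduli; the sketch below applies it to rough literature moduli for diamond. It is shown only to illustrate the approach, and, as Weinberger noted, real materials can defy such predictions.

```python
def vickers_hardness(G_gpa: float, B_gpa: float) -> float:
    """Empirical hardness estimate of Chen and co-workers (2011):
    Hv = 2 * (k^2 * G)^0.585 - 3, with k = G / B (all in GPa)."""
    k = G_gpa / B_gpa                    # Pugh's modulus ratio
    return 2.0 * (k ** 2 * G_gpa) ** 0.585 - 3.0

# Rough literature elastic moduli for diamond (G ~ 535 GPa, B ~ 443 GPa):
print(f"estimated Hv(diamond) ~ {vickers_hardness(535.0, 443.0):.0f} GPa")
```

The estimate for diamond comes out near 95 gigapascals, in the right range; the substoichiometric carbides Weinberger described are exactly the cases where this kind of relation breaks down.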

Thus, Weinberger said, better predictors of hardness are needed, and he suggested that it may be useful to look at both bonding and microstructure for help with predicting hardness.

On the topic of fracture toughness, which is particularly important for ceramics, Weinberger said that one interesting observation to come out of studying tantalum carbide is that a phase called the zeta phase is closely associated with toughness, with a material’s fracture toughness increasing as the volume fraction of the zeta phase goes up. How might one use insights from that behavior when designing for fracture toughness in high-entropy carbides? “There is opportunity there to not only understand the toughening behaviors, but also potentially design it and enable it in other systems,” he said.


Because the materials Weinberger is interested in will be used at high temperatures, he must also think about creep and diffusion. Creep, which generally occurs at high temperatures, is the gradual deformation of a solid material under sustained mechanical stress; diffusion, also mainly a high-temperature phenomenon, is the movement of atoms from a region of high concentration to a region of lower concentration. One problem in understanding these two phenomena, he said, is that there are relatively few data on them. He observed that the community takes its understanding of creep for granted, but there are many holes to fill and critical data are missing.

And that, Weinberger said, is a theme that holds for all the properties of interest for high-entropy, ultra-high-temperature materials. This means that there is a great deal of opportunity for measuring melting temperatures, hardness, toughness, creep, diffusion, and other properties in these materials, and there is also a need for innovation in how the tests are done, particularly as it relates to doing tests over wider ranges of chemistries. The goal is to “build up the information that we need to make better models, to understand these material behaviors better.” Given the computational complexity of these materials, that information will be crucial in getting to the next generation of this class of materials.

In the discussion period following his presentation, Weinberger spoke about the need to understand the liquid phases of materials at high temperatures and pressures. In terms of phase stability, he said, the liquid phase is one of the least understood areas, so it is important to figure out better ways to understand, characterize, and model liquid phases. Furthermore, he added, conventional wisdom says that adding more elements will lower the melting temperature, so it is important to understand exactly what the impact is and whether one is actually lowering the temperature and perhaps making the material worse. On the topic of filling in data gaps, Weinberger said that there are several reasons why needed data are not being collected. In the case of diffusion rates, for instance, he said that researchers know how to measure the rates, but he does not see anyone doing it anymore. It is not an area that is particularly well funded, and people who do such work do not generally get much professional credit. The issue of creep is an even bigger problem, and he concluded, “We do not have good mesoscale models of creep. It’s a very tough problem, and we haven’t as a community invested in that enough.… But as a modeler, that’s a great place to put effort so that we can start to get back the experiments on creep.”

Fundamental Mechanisms Under Extreme Environments and the Role of Machine Learning

Penghui Cao, University of California, Irvine, spoke about the role that machine learning can play in studying fundamental mechanisms in extreme environments. Specifically, he discussed the impact that short-range order has on diffusion and the motion of dislocations in multi-principal-element alloys under high stresses. Multi-principal-element alloys, which are generally defined as having four or more base elements, are similar to the high-entropy alloys that Weinberger, the previous speaker, had discussed, although high-entropy alloys are generally defined as having five or more principal elements.

As background, Cao mentioned two 2004 publications that jump-started research into high-entropy alloys, one by Cantor et al. and the other by Yeh et al. Among other things, Cantor’s group created the alloy FeCrMnNiCo and explored its structure and properties, as well as those of several other alloys with up to 20 components (Cantor et al. 2004). The Yeh group independently did something similar and suggested that the high-entropy approach had great potential: “Preliminary results demonstrate examples of the alloys with simple crystal structures, nanostructures, and promising mechanical properties. This approach may be opening a new era in materials science and engineering” (Yeh et al. 2004, p. 299).

The underlying hypothesis of this work, he noted, was that increasing the number of elements in an alloy would increase the configurational entropy, so that in making alloys with a larger number of elements, the solidification would lead to the formation of random solid solutions rather than of intermetallic compounds (Yeh et al. 2004).

Explaining why random solid solutions would be expected to form in these multi-principal-element materials, Cao said that when an alloy is formed at temperatures close to the melting points of the component elements, the Gibbs free energy—the mixing enthalpy minus the product of temperature and configurational entropy—is dominated by the entropy term, resulting in a random solid solution in which the component elements are randomly distributed, rather than ordered as in an intermetallic compound. However, he added, actual materials are processed, homogenized, annealed, and applied at temperatures well below the melting point. In this regime, mixing enthalpy and solute–solute interactions predominate, and short-, medium-, and long-range chemical ordering in the material is inevitable. This leads to the possibility of controlling the structure of an alloy to a certain degree by choosing compositions and processing conditions that lead to certain types of ordering.
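The competition Cao described can be made concrete with a short calculation: the ideal configurational entropy of mixing is −R Σ x_i ln x_i, and the sketch below (with an assumed, invented mixing enthalpy) shows the entropy term dominating the Gibbs energy of mixing near the melting point but not at a typical processing or use temperature.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def config_entropy(x):
    """Ideal configurational entropy of mixing, -R * sum(x_i * ln x_i)."""
    x = np.asarray(x, dtype=float)
    return -R * np.sum(x * np.log(x))

dS = config_entropy([0.2] * 5)   # equimolar five-element alloy: R*ln(5) ~ 13.4 J/mol/K
dH = 10_000.0                    # assumed (unfavorable) mixing enthalpy, J/mol

for T in (1800.0, 600.0):        # near melting vs. a typical use temperature
    dG = dH - T * dS
    regime = "entropy dominates: random solution" if dG < 0 else "enthalpy favors ordering"
    print(f"T = {T:6.0f} K: dG_mix = {dG:9.0f} J/mol  ({regime})")
```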

As an example, Cao pointed to a recent paper in Nature by Ding et al. (2019). The group reported several different types of order they found appearing in various alloys, including the one originally studied by Cantor and colleagues and another in which the manganese is replaced by palladium (Pd). Among other things, Ding and colleagues reported that iron atoms tend to aggregate, while nickel atoms prefer to form linear arrays. These are examples of the sort of short-range order that appears in high-entropy alloys under the right processing conditions.

Suggested Citation:"2 Materials Design." National Academies of Sciences, Engineering, and Medicine. 2023. Frontiers in Data Analytics and Monitoring Tools for Extreme Materials: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/26983.
×

With that introduction, Cao launched into the main body of his talk, which was devoted to three topics: how short-range order can affect the diffusion of vacancies (i.e., places where the crystalline structure would be expected to have an atom but does not) through a high-entropy alloy; how short-range order can affect the movement of dislocations (defects in the crystalline structure where the arrangement of the atoms changes abruptly) through the alloy; and how to tackle the large compositional space of multi-principal-element alloys.

Turning first to the diffusion of vacancies, Cao noted that in a pure element, a vacancy is equally likely to move in any direction since the material is homogeneous and all directions look the same to the vacancy. By contrast, in an alloy with two or more elements, the diffusion barrier will be different in different directions, depending on which types of atoms lie next to the vacancy, so the likelihood of a vacancy moving in a particular direction will depend on the arrangement of nearby atoms; the diffusion is no longer equally likely in all directions (Figure 2-4).

Image
FIGURE 2-4 Diffusion and atomic jump processes.
SOURCES: Penghui Cao, University of California, Irvine, presentation to the workshop, October 5, 2022. Left and middle images from Cao Research Group, University of California, Irvine, https://cao.eng.uci.edu. Right images from M. Jin, P. Cao, and M.P. Short, 2018, “Thermodynamic Mixing Energy and Heterogeneous Diffusion Uncover the Mechanisms of Radiation Damage Reduction in Single-Phase Ni-Fe Alloys,” Acta Materialia 147:16–23; Copyright 2018, with permission from Elsevier.

Then Cao described studies using kinetic Monte Carlo modeling that calculated the diffusion correlation factor f for different materials under different conditions. For pure niobium, f, which is a measure of how random the diffusion is, was calculated to be 0.73, in good agreement with the textbook value of 0.72. But it was the values of f for a multi-principal-element alloy, MoNbTa, where things got interesting. The calculated values of f for that alloy at 3,000 K, 800 K, and 400 K were 0.57, 0.034, and 0.004, respectively. That is, the value of f varied by two orders of magnitude over that range: while the diffusivity at 3,000 K was similar to that of pure niobium, the diffusion at 400 K was far lower, meaning that diffusion at low temperatures was far less random than at 3,000 K.
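The correlation factor f compares the mean-squared net displacement after n jumps with that of an uncorrelated walk of the same length. The toy sketch below is not the kinetic Monte Carlo model Cao described (it has no real energy barriers); it simply shows how jump reversals, a crude stand-in for short-range-order trapping, drive f well below 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlation_factor(n_jumps=200, n_walkers=1000, reversal_prob=0.0):
    """Estimate f = <R^2> / (n * a^2) for a walker on a 2D square lattice (a = 1).

    reversal_prob = 0 gives an uncorrelated random walk, for which f -> 1.
    A nonzero reversal_prob makes each jump undo the previous one with that
    probability, a crude stand-in for a vacancy being trapped by short-range
    order; the correlated back-and-forth motion drives f toward 0.
    """
    steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
    total_r2 = 0.0
    for _ in range(n_walkers):
        pos = np.zeros(2)
        prev = None
        for _ in range(n_jumps):
            if prev is not None and rng.random() < reversal_prob:
                step = -prev                       # bounce back toward the trap
            else:
                step = steps[rng.integers(4)]      # unbiased jump to a neighbor
            pos = pos + step
            prev = step
        total_r2 += pos @ pos
    return total_r2 / (n_walkers * n_jumps)

print(f"uncorrelated walk: f ~ {correlation_factor():.2f}")                    # near 1
print(f"strongly trapped:  f ~ {correlation_factor(reversal_prob=0.8):.2f}")   # << 1
```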

Short-range order in the multi-principal-element alloy plays a role in this diffusion, Cao said. In a short-range-ordered system, such as a high-entropy alloy, the diffusion path is shortened and the diffusion becomes more localized. What happens is that the short-range order, through its effects on the energy barriers for jumping from one position in the crystalline lattice to an adjacent one, can “trap” a vacancy, reducing how quickly it diffuses through the material.

Cao then turned to a discussion of dislocations and how their movement can be affected by a material’s short-range order. Dislocations are a key factor in a material’s tensile strength, as dislocations that can move in response to an applied force allow a material to flex without breaking. Thus, one can strengthen a material by decreasing the number of dislocations or introducing defects that block the movement of the dislocations, with the addition of blocking mechanisms generally being the more practical approach.

The energy needed to move a dislocation from one location in a crystalline lattice to the next (per unit length of the dislocation) is known as the “Peierls energy”; the lower the Peierls energy, the easier it is for a dislocation to move, and the value of the energy changes from place to place within the lattice, depending on the arrangement of atoms. Cao described results from a recent publication on how short-range order within a multi-principal-element material affects the energy landscape of the lattice and, thus, how easily dislocations can move at different points within it (Wang et al. 2022). One finding, for instance, was that short-range order increases the energy barriers to dislocation movement, meaning that greater force must be applied to the material for a dislocation to move. More generally, Cao and his colleagues found that short-range order affected the strength, microstructure evolution, and local strain of the multi-principal-element materials they studied.
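A common textbook idealization, offered here only to make the concept concrete (it is not the model of Wang et al. 2022, and all numbers are assumed), treats the Peierls landscape as a sinusoid; the stress needed to move the dislocation then follows from the landscape's maximum slope.

```python
import numpy as np

# Idealized sinusoidal Peierls energy per unit length of dislocation line:
#   W(x) = (W_p / 2) * (1 - cos(2*pi*x / a)),
# where a is the lattice period along the glide direction. The dislocation
# advances when the applied stress tau satisfies tau * b >= max dW/dx,
# with b the Burgers vector magnitude. All numbers below are assumed.
a   = 2.86e-10   # lattice period, m
b   = 2.48e-10   # Burgers vector, m
W_p = 3.2e-11    # Peierls barrier per unit line length, J/m (~0.05 eV per b of line)

x = np.linspace(0.0, a, 4001)
W = 0.5 * W_p * (1.0 - np.cos(2.0 * np.pi * x / a))
tau_p = np.max(np.gradient(W, x)) / b        # Peierls stress estimate, Pa
print(f"Peierls stress ~ {tau_p / 1e9:.2f} GPa")   # analytic value: pi*W_p/(a*b)
# In a multi-principal-element alloy the landscape is not one clean sinusoid:
# short-range order locally raises W_p, so larger stresses are needed there.
```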

The lesson here, Cao said, is that by manipulating such variables as the short-range order in a multi-principal-element material, it may be possible to control the deformation mechanisms in that material and thus shape the overall properties of the system.

And that conclusion led Cao to his final topic: learning how to choose, from the large compositional space of multi-principal-element materials, those that have the diffusion properties and dislocation mobility one desires. Cao focused on one key question: Is it possible to predict such properties as short-range order and diffusion from the composition of a material?

To answer that question in a relatively simple system, Cao’s team chose three elements—tantalum, niobium, and molybdenum—and 45 compositions with varying percentages of those three elements (Ta30Nb30Mo40, for instance), and then trained a machine learning model on diffusivity data from those 45 compositions. They showed that the machine learning model, a convolutional neural network, was able to successfully predict diffusivity for other tantalum–niobium–molybdenum compositions that were not part of the training set (Fan et al. 2022).
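That workflow might be sketched as follows; note that this is a toy with synthetic data and a generic regressor standing in for the convolutional neural network of Fan et al. (2022), meant only to illustrate the train-on-45-compositions, predict-unseen-compositions idea.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# 45 random (Ta, Nb, Mo) compositions, fractions summing to 1.
X = rng.dirichlet(np.ones(3), size=45)

# Invented smooth function standing in for computed log-diffusivity labels.
y = -8.0 - 3.0 * X[:, 2] + 2.0 * X[:, 0] * X[:, 1] + 0.05 * rng.standard_normal(45)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("held-out R^2:", round(model.score(X_te, y_te), 3))
print("prediction for Ta30Nb30Mo40:", model.predict([[0.30, 0.30, 0.40]])[0])
```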

Ultimately, he said, the goal is to develop models that can predict various properties of multi-principal-element alloys for any combination of elements. “Can we predict diffusivity, or can we predict the timescale required to form a different degree of short-range order, or can we predict the maximum degree of short-range order?” he asked, acknowledging that he does not yet have the answers.

Cao closed with a list of questions and challenges:

  • In an analogy with the time–temperature–transformation diagram, can one construct a time–temperature–short-range order diagram and predict the timescale associated with the formation of short-range order at different times and temperatures of processing?
  • Short-range order and local chemical ordering offer a new dimension for tuning the behaviors of structural materials; will it be possible to use experimental tools to manipulate local chemical order over a large enough range to study these materials effectively?
  • Given that the diffusion correlation factor f can vary by a few orders of magnitude in high-entropy alloys, how can one compute or predict diffusivity at low and intermediate temperatures (i.e., below about half the melting temperature)?
  • Is it possible to build and train a repository of diffusivity data and models for high-entropy alloys that anyone could use and that would allow the rapid computation of diffusion barriers and temperature-dependent diffusivity?

Designing Extreme Materials at Scale with Machine Learning

Shyue Ping Ong, University of California, San Diego, spoke of the promise and challenges of using machine learning to design extreme materials. In particular, he focused on multi-principal-element alloys and ceramics—that is, alloys and ceramics that have many principal elements in their composition. Many of these materials are known to have very useful properties at high temperatures.

Machine learning is valuable in the design of multi-principal-element materials in two basic ways, Ong said. First, it helps researchers deal with the structural complexity of these materials. In general, multi-principal-element materials can exhibit chemical short-range order, which can affect the materials’ properties profoundly, but such order is complicated to deal with; machine learning makes it feasible. Multi-principal-element materials also have various microstructural features, such as grain boundaries, that can play a significant role in determining their properties; again, machine learning is useful in finding the correlations between these microstructural features and the properties of the materials.

He mentioned that a second way in which machine learning is valuable in working with multi-principal-element materials is in accessing the materials’ compositional complexity. As the number of elements in a material increases, the number of possible compositions and structures explodes—the so-called “combinatorial explosion”—creating a huge space of potential compositions that must be considered when looking for materials with optimal properties for extreme conditions. Machine learning makes it possible to deal with this huge number of potential compositions in a reasonable amount of time.

In describing the first way in which machine learning is valuable—in dealing with the structural complexity of multi-principal-element materials—Ong spoke about how machine learning compares with other computational approaches to analyzing materials, such as first-principles methods, analyses with empirical potentials, and finite element and continuum models. There is always a cost–complexity trade-off to consider with such approaches, he noted, as doing computations on increasingly complex systems leads to rapidly rising costs. A critical challenge in computational materials science is bridging the 10⁻¹⁰ to 10⁻⁶ meter and 10⁻¹² to 10⁻⁶ second scales in a way that retains transferability (i.e., the ability to move from material to material) and accuracy and that is scalable.

One approach to doing this is the use of machine-learning interatomic potentials (ML-IAPs), Ong said. There are different machine learning approaches to calculating these potentials, including neural network potentials, Gaussian approximation potentials, and spectral neighbor analysis potentials.

ML-IAPs are valuable, he said, not just because they are accurate but because the construction of these potentials is systematic and automatable. This makes it possible to shorten the time needed to calculate interatomic potentials and to analyze much more complicated systems. A recent analysis by his research group concluded that ML-IAPs performed much better than classical IAPs in predicting energies and forces as well as in predicting such properties as elastic constants and phonon dispersion curves (Zuo et al. 2020).
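The core fitting step behind linear ML-IAPs such as the spectral neighbor analysis potential can be sketched in a few lines: total energies are modeled as linear functions of structure descriptors, and the coefficients are found by least squares. The descriptors and energies below are random placeholders, not real DFT data.

```python
import numpy as np

rng = np.random.default_rng(2)

n_structures, n_descriptors = 200, 10
D = rng.normal(size=(n_structures, n_descriptors))    # per-structure descriptor sums
beta_true = rng.normal(size=n_descriptors)            # "hidden" coefficients
E_ref = D @ beta_true + 0.01 * rng.normal(size=n_structures)  # noisy reference energies

beta_fit, *_ = np.linalg.lstsq(D, E_ref, rcond=None)  # the actual fitting step
rmse = np.sqrt(np.mean((D @ beta_fit - E_ref) ** 2))
print(f"energy RMSE on training data: {rmse:.4f} (arbitrary units)")
# Real potentials also fit forces and stresses and use physically motivated
# descriptors (e.g., bispectrum components) rather than random features.
```

The systematic, automatable nature Ong described is visible here: once descriptors are chosen, fitting reduces to a standard regression that can be rerun whenever new reference data arrive.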

As an example, Ong spoke about some findings in a niobium–tantalum–molybdenum–tungsten multi-principal-element material. Short-range order in that material, he explained, leads the niobium to segregate at the grain boundaries, for instance, and causes a decrease in the overall von Mises strain. Short-range order also enhances the mobility of edge dislocations but impedes the motion of screw dislocations. Furthermore, the short-range-order effect is attenuated at higher temperatures.


One important application of ML-IAPs, Ong said, is to help bridge the modeling gap between quantum-mechanical calculations (density functional theory) and continuum-scale models by enabling large-scale atomistic simulations. Describing work on molybdenum–niobium–titanium and tantalum–niobium–titanium alloys, he listed several findings at this in-between scale. Lower temperatures lead to higher short-range order, for instance. Non-equimolar compositions (i.e., those in which the percentages of the elements are not all the same) have higher short-range order than equimolar compositions. And greater amounts of molybdenum in the alloys he studied resulted in higher short-range order.

Switching to the second way in which machine learning is valuable in dealing with multi-principal-element materials (in accessing compositional complexity), Ong spoke about accelerating materials discovery with machine learning. In particular, he described a new machine learning paradigm called graph networks that supports both relational reasoning and combinatorial generalization (Chen et al. 2019) and a particular graph network model called MEGNet (for Materials Graph Network). MEGNet models trained on approximately 60,000 crystals from the Materials Project were shown to significantly outperform previous machine learning models in predicting formation energies, band gaps, and the elastic moduli of crystals (Chen et al. 2019).
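The data structure underlying such models can be sketched simply: atoms become graph nodes, and near-neighbor pairs become edges. The toy below (invented coordinates, an assumed cutoff, periodic images ignored) builds that graph; it is not the MEGNet implementation.

```python
import numpy as np

# Toy 4-atom "cell" with invented coordinates; periodic images are ignored.
positions = np.array([[0.0, 0.0, 0.0],
                      [1.8, 0.0, 0.0],
                      [0.0, 1.8, 0.0],
                      [1.8, 1.8, 1.8]])   # angstroms
species = ["Nb", "Mo", "Ta", "W"]
cutoff = 3.0                              # angstroms, assumed bonding cutoff

edges = []
for i in range(len(positions)):
    for j in range(i + 1, len(positions)):
        d = float(np.linalg.norm(positions[i] - positions[j]))
        if d < cutoff:
            edges.append((i, j, d))       # edge feature: interatomic distance

print("nodes:", list(enumerate(species)))
for i, j, d in edges:
    print(f"edge {species[i]}-{species[j]}: {d:.2f} A")
# A graph network then iteratively updates node (atom), edge (bond), and global
# (state) features via message passing to predict properties such as formation energy.
```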

Ong closed with a list of open questions:

  • What are some other areas where data science and machine learning can play a transformative role in the design of extreme materials?
  • How can we address the issues of data heterogeneity and collection (especially for high-quality experimental data)?
  • What further data and software infrastructure investments are needed to push the frontiers in data-driven extreme materials design?

FIRST-DAY PANEL

The last activity of the first day was a panel discussion with three participants who each gave a 15-minute presentation and then answered questions as a group. The three panel members were Aaron Stebner, Georgia Institute of Technology (Georgia Tech); Douglas E. Wolfe, Pennsylvania State University; and Elizabeth J. Opila, University of Virginia.

Benchmarking Machine Learning Modeling Approaches

Aaron Stebner, Georgia Tech, discussed benchmarks for machine learning modeling approaches to materials and manufacturing research and development. His talk was derived, he said, from his work as a researcher using data informatics and machine learning to solve problems involving metals and additive manufacturing and also from his work as deputy editor for the Journal of Additive Manufacturing, where the staff has wrestled with the issue of how to “benchmark and have a quality standard for machine learning publications and research and make sure that we maintain a constant clarity within the journal.”

As context, he noted the difference between machine learning and traditional science. In normal science or engineering, one takes inputs, puts them into a computer program based on a scientific understanding of the problem, and gets outputs. In machine learning, by contrast, one supplies the inputs and outputs, and the computer determines a program or functional form that connects the inputs with the outputs. The computer is learning the parameters of a statistical model, comparing different forms of that model and finding the optimal one. “I just want to make sure we all remember machine learning is not learning things about physics that we don’t already know,” he said. Instead, the computer is using statistics to compare possible functions and find which best fits the data. In doing so it is important to avoid the dual pitfalls of underfitting and overfitting, choosing either too few or too many parameters to capture the phenomena of interest.
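A classic demonstration of that dual pitfall, using synthetic data, is to fit the same noisy samples with polynomials of increasing degree and compare training error against error on held-out points:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(x.size)   # noisy training data
x_test = np.linspace(0.02, 0.98, 25)
y_test = np.sin(2 * np.pi * x_test)                             # noise-free truth

for degree in (1, 4, 12):   # too simple, reasonable, too flexible
    coeffs = np.polyfit(x, y, degree)   # may warn about conditioning at degree 12
    train_rmse = np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))
    test_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    print(f"degree {degree:2d}: train RMSE = {train_rmse:.3f}, test RMSE = {test_rmse:.3f}")
# Degree 1 underfits (both errors large); degree 12 fits the noise, so its
# training error keeps dropping while the held-out error grows.
```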

Much of what he would be talking about, Stebner said, was motivated by the recent materials science decadal survey (NASEM 2019). A key message of that publication, he said, is that the typical development cycle—discovery, development, property optimization, systems design and integration, certification, manufacturing, and deployment—needs to be modified so that the different steps are connected and parts of the process can be carried out in parallel. “These can no longer be separate iterative cycles,” he said. “We have to be able to look at all of those degrees of freedom in that process at the same time.”

With that background, Stebner described a nine-point rubric that the editors at the Journal of Additive Manufacturing use to evaluate whether a machine learning submission meets the journal’s quality standards and that they ask all of their authors and reviewers to keep in mind. He also teaches the rubric in a machine learning course for engineers at Georgia Tech as a way of teaching people to walk through machine learning problems.

  1. Delineate the use of multiple (statistical) models. Different models have different strengths and weaknesses, and which is best in a particular situation will depend on a researcher’s goals. Which properties are most important to optimize, for instance?
  2. Provide a clear definition of the process model formulation. In this step, the data are characterized (are they discrete or continuous, for instance), the outputs are described, and the type of analysis is specified. It is also important to acknowledge uncontrolled factors, as these are sources of noise as well as things that can be improved in future versions of the statistical model, if the data are available.
  3. Provide a clear definition of the data set as it is used for statistical modeling. “Anybody who has worked on machine learning knows you don’t use the data like your computer or your instruments spit it out,” Stebner explained. “You have to transform that data and extract the information that you’re going to use for machine learning. Sometimes we have to clean it. Oftentimes we have to construct features.” It is in this upstream work that scientists and engineers input their domain knowledge and the physics of the problem.
  4. Provide evidence that the data set supports the statistical model. “If we are trying to fit something and the outputs aren’t statistically distributed, we’re in trouble,” he said, “because the fundamental assumption of machine learning is that the data can be modeled by probability distributions. So, if there isn’t a distribution in our outputs and if there are no correlations between the outputs and inputs, we’re in trouble.” In particular, he said, not all problems are well posed for machine learning. Machine learning is not a magic bullet—it is statistical modeling in high dimensions.
  5. Provide evidence that machine learning is necessary. Ten years ago, Stebner said, machine learning was fun and exciting to apply to materials science problems, and many papers were published showing how well it worked in various situations. But, he added, “I think we’re near the point where we need to say, okay, what am I getting from using machine learning that I couldn’t get other ways?” The fundamental premise underlying statistical modeling has not changed in 100 years, he added—that is, the best statistical model is the one with the fewest statistically sufficient parameters that can describe the process—and sometimes machine learning is not needed to identify those parameters.
  6. Document how the basic statistical properties of the data set motivate the choice and selection of the statistical or machine learning model. The optimal algorithm will depend on the questions being asked and the data sets being worked with.
  7. Document and discuss the parameter estimation (training) of the model. How were the parameters determined? What optimization scheme was used, and why? What test–train split strategy was used, and why? Did the researchers just use the black-box answer, or did they actually do uncertainty quantification and look at variations of the hyperparameters based on what they knew about the data?
  8. Completely assess the statistical model’s prediction performance. It is not enough, for instance, to simply report the root-mean-squared error of the predictions; this pertains only to bias. It is also important to report on the variance. Are the residuals well behaved? These questions are related to the issue of whether the model is overfit. (A minimal sketch of this kind of assessment follows the list.)
  9. Document the scientific or engineering value gained from using the model. It is important to document not just that the model worked, but what it allowed researchers to do that they could not otherwise have done. “It doesn’t always have to be rocket science,” Stebner said, “but it should at least compare versus the previous state of the art—whether there was a machine learning model before or whether you’re improving on somebody’s previous machine learning work.” It is also important to document the “edge” cases. Where does the model fail? Knowing that will help future researchers build on the work.
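A minimal sketch of the kind of assessment items 7 and 8 call for, on synthetic data with an ordinary least-squares model, might look like the following; real analyses would also examine residual structure graphically and vary the hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 4))
w_true = np.array([1.5, -2.0, 0.7, 0.0])
y = X @ w_true + 0.3 * rng.standard_normal(120)   # synthetic observations

split = 90                                        # simple train/test split
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

w_fit, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)   # parameter estimation
residuals = y_te - X_te @ w_fit                       # held-out residuals

print(f"test RMSE        : {np.sqrt(np.mean(residuals ** 2)):.3f}")   # bias
print(f"residual mean    : {np.mean(residuals):+.3f}")                # should be ~0
print(f"residual std dev : {np.std(residuals):.3f}")                  # variance
# Well-behaved residuals are roughly zero-mean with no systematic structure;
# a low train error paired with large, structured residuals suggests overfitting.
```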

Stebner closed with a look to the future. Georgia Tech has recently received a generous grant to build an artificial intelligence materials and manufacturing testbed, whose purpose is to examine what it takes to run autonomous experiments and curate the data from them automatically, so that the process of learning about materials and manufacturing can be sped up significantly. One of the major questions facing the development of this facility is how to automate everything upstream of the machine learning model training. “There’s a full hour talk I have on what it takes to automate all of these, from the data management and curation, getting data into the form you need, and identifying which features are best for the statistical modeling,” he said.

One of the biggest open challenges is how to do machine learning modeling across multiple kinds of data, from micrographs to time series data and scalar properties such as strength or stiffness. He noted that today’s machine learning tools cannot simply be fed such a mix of data files and sort it all out on their own, which is a big challenge that this community can work on.

In the discussion period following the panel presentations, Wolfe addressed the issue of educating a new generation of scientists and engineers who can explore the types of issues being addressed at the workshop. There are already several programs reaching all the way down to middle school that are trying to get students interested in science, technology, engineering, and mathematics (STEM) subjects, he said. One issue, he noted, is that so many students are “getting snatched up by the Googles of the world”—companies doing seemingly more glamorous work. “How can we entice them to stay in the materials world and to the government side of things?” he asked.

Advanced Materials and Manufacturing Techniques for Various Applications

In the second panel presentation Douglas E. Wolfe, Pennsylvania State University, offered a survey of various challenges related to using advanced materials in various applications, particularly those in extreme environments.


The holy grail of materials design, he said, is to be able to bridge the gaps between the length scales. Different approaches provide different sorts of information about materials at different length scales, and a fully realized approach to materials design would be able to integrate all of these, including quantum theory, thermodynamic modeling, kinetic modeling, microscale modeling, macroscale finite-element modeling, mechanical design and virtual design iterations, and trade space modeling for design optimization.

The key to such integration, he said, will be to bring together the experts from the various areas, the experimentalists and the modelers, to work together. He believes that combining people through more interdisciplinary research and discussion to identify the problems will put the community in a better position to advance the technology. Bringing together expertise from different areas will make it possible to focus on what is practical and to find the best solutions that can actually be implemented. “We can’t always design for the best model, for the best manufacturing process, or the best intrinsic material,” he said. “So, you may only get 90 percent of the best intrinsic material, but you can manufacture it.” The entire community will be most successful at finding materials to meet needs for extreme environments by working together and having dialogues about what can and cannot be done.

With that introduction, Wolfe launched into an overview of the manufacturing technologies in use today. He began by talking about coatings, noting that this is his area of expertise. Coatings have various benefits for use in extreme environments. It may be possible, for instance, to use a less expensive alloy with a better coating and still achieve the same performance.

There are more than 100 different coating processes in use, and they all have their advantages and disadvantages. For example, he said, one could deposit a titanium nitride coating via 10 different processes and get eight different results because the structure of the coating will differ according to which process is used. This makes it crucial to study structure–property relationships in order to gain insight into the performance of coatings laid down by different processes.

Similarly, he continued, the type of process used in additive manufacturing will affect the structure and properties of the resulting material. “We’ve got binder jet printing, directed energy deposition, e-beam deposition, laser deposition,” he said. “All these are critical parameters in … the microstructure of the material.” So, in order to be able to understand and predict the properties of items created by additive manufacturing, it is not enough to study the properties of the materials being used in isolation; one must examine the details of the process itself. The factors that should be considered include flowability of the material, the particle size distribution, laser–particle interactions, cooling rate, and heat transfer. All of these can affect the microstructure and the properties of the final product.


Sintering is another area in which the details of the particular process—pressureless sintering, liquid-phase sintering, microwave sintering, hot isostatic pressing, field-assisted sintering technology, and so on—can affect the properties of the finished product, he added.

Moving on to ultrahard coatings, Wolfe offered a brief overview of the current state of the field. The best-performing coatings now are the ultrahard materials, diamond and cubic or wurtzite boron nitride, with hardness values above 80 gigapascals, and the superhard transition metal carbides, nitrides, and borides, with values above 40 gigapascals.

Speaking of the challenges related to ultrahard materials, he spoke first of being able to synthesize new ultrahard materials and of dealing with the impurities that lead to grain-boundary melting and affect a material’s mechanical properties. How can those be eliminated or controlled? A second challenge is quantifying a material’s hardness across various conditions and length scales. For example, it can be expensive to do high-temperature mechanical testing. And, he said, as superhard materials approach or exceed the hardness of diamond, the question arises of how to measure the hardness, given that traditional hardness measurements use diamond indenters. He expanded on that topic in response to a question in the discussion period that followed the panel presentations. “Can we actually measure the hardness of a material that is harder than diamond?” he asked. “I’m currently stumped with regard to that, and I’ve talked about that with some of my grad students.… We have given it some thought with regards to if we do make a hard material, do we then make a diamond indenter … do we coat it? Those are the questions we have to understand.”

In the next part of his presentation, Wolfe discussed plasmonics, a field concerned with electron oscillations in metallic nanostructures and nanoparticles (Barbillon 2019). Ultra-high-temperature transition metal high-entropy coatings, he said, are not only superhard and refractory but also demonstrate special high-temperature plasmonic capabilities. These materials are of interest because plasmonics opens up new opportunities for high-temperature sensing, telecommunication, and thermal management technologies for next-generation aerospace vehicles. Of particular interest is the fact that models of these materials have been experimentally tested and validated in HfTa4C5, with the plasmonic nature of these materials successfully predicted (Calzolari et al. 2022). “This is a game-changer when we talk about high temperature plasmonics,” he said. There are many different applications that could be developed with these materials’ combination of plasmonic activity, high hardness, and high thermal stability, but there will be many challenges related to their manufacturing.

Wolfe then spoke briefly about ceramic matrix composites made of carbon fibers in a silicon-carbide matrix. These materials have useful properties, such as high thermal stability, strength, toughness, and resistance to thermal shock and creep, but there are several manufacturing challenges. “We don’t have enough folks associated with it,” he said.

Yet another promising area, he pointed out, is the additive manufacturing of high-temperature refractory metal alloys. These materials could have many valuable uses, such as in hypersonic vehicles and plasma-facing components in the fusion generators now under development.

Wolfe closed with a comment about the importance of workforce development. The ability to deal with emerging challenges in this area will depend in large part on educating and training a sufficient number of people, and he emphasized his hope that the community realizes it must educate the next generation of engineers and scientists.

Thermochemical Stability of Materials in Extreme Environments

Elizabeth J. Opila, University of Virginia, the final panelist, spoke about the thermochemical stability of materials in extreme environments, with a focus on degradation mechanisms. The context for the talk was the testing of materials for use in extreme environments, such as protective materials on a space shuttle, and what can be learned from how those materials fail. Her focus was on materials for two specific applications that must work in very high-temperature environments: hot-section components in aero turbine engines and hypersonic protection systems.

Experience shows that the lifetime of materials varies inversely with their operating temperature—the hotter the operating temperature, the shorter the lifetime. The materials used on hypersonic vehicles may last for only a dozen or so hours at the most extreme temperatures they experience, while industrial gas turbines may last for 10,000 hours. Opila noted that researchers would like hypersonic vehicles to last for 10,000 hours as well, or at least to find a way to significantly increase the lifetime of materials at extreme temperatures.

But to do that, she said, it is not enough to consider only how materials behave at high temperatures; the environments they are exposed to are extreme not just in temperature but also in chemical reactivity. It is not just heat but chemical reactions that cause materials in these environments to degrade, and to develop materials that last longer, it is necessary to understand the degradation mechanisms at play. Thus, Opila’s team is working to create extreme environments in the laboratory—extreme not just in heat but also in reactivity—in order to test the performance of materials and study why they degrade, with the goal of designing hardier materials.

The process that she uses in her laboratory consists of several steps (Figure 2-5). First, a material is exposed to a high-temperature, high-reactivity environment. Then the team observes what happens to the material in order to understand the reaction thermodynamics. They also work to understand the kinetics of the various reactions. Once they understand those two things, she explained, they can begin thinking about rate-control mechanisms, predicting lifetime, and understanding the degradation mechanisms. The final step is to use this understanding to develop new materials and then iterate, putting the new materials through the same process.

Image
FIGURE 2-5 Research approach.
SOURCE: Elizabeth J. Opila, University of Virginia, presentation to the workshop, October 5, 2022.

To illustrate, she showed a diagram of a benchtop testing environment designed to simulate the conditions in a jet engine: it gets very hot, there are reactive gases and combustion, water vapor is present, and the gas velocity is very high. The test sample is placed at a point where it is exposed to all of these factors. After the test is done, the sample is examined with a focus on its microstructure to understand the degradation mechanism. Finally, the understanding gained from the test is used to predict the material’s lifetime.

Testing materials for use in hypersonic vehicles requires a different approach because the temperatures get much higher and cannot be achieved with the usual laboratory furnace. Instead, she said, the team uses resistive heating of the sample in a vacuum chamber where the oxidizing gases can be controlled. The temperatures in the chamber reach as high as 1,800°C.


Currently, the group is not able to expose the samples to dissociated species, something that happens to materials on hypersonic vehicles, so Opila’s team is working to incorporate that capability into their system. The goal is to create micro-plasmas inside the chamber and expose the sample to the dissociated oxygen in the micro-plasma. This will allow the team to separate the effects of the ultra-high temperatures from those of the dissociated oxygen in the degradation of these materials, she said, moving a step closer to the actual hypersonic environment.

Other researchers, she noted, have been studying the mechanical properties of materials at ultra-high temperatures, which provides different information from Opila’s studies of their thermochemical behavior.

In closing, she listed several key research needs, challenges, and questions. One research need is a better understanding of the extreme environments in which these materials operate; in some cases, she noted, researchers do not know exactly what is going on in a combustion environment or in hypersonic flight. More thermodynamic data are needed on these materials, she said, noting that researchers at another workshop had discussed how to measure thermodynamic properties in the 2,000–3,000°C range. Getting more kinetic data will also be important, she said, and at this point there are fewer kinetic data than thermodynamic data. Among the types of kinetic data needed are data on reaction kinetics, diffusivities, and microstructural evolution.

Switching to research challenges, she first mentioned getting a better understanding of the extreme environments in which these materials must operate. In accumulating thermodynamic data, a big challenge is the increasing complexity of the materials under consideration. Another challenge is getting accurate high-temperature data, as it is very difficult to measure temperatures accurately in this range. The lack of kinetic data is also a challenge, especially for complex materials.

Opila ended with a set of research questions:

  • Do we have sufficient understanding of extreme environments?
  • What test facilities are needed to increase our understanding of materials in extreme environments?
  • Given the proliferation of thermodynamic databases, how can we assess data and unify databases? How can we gain open access to government-funded data in private databases?
  • What are the barriers for increased experimental and computational determination of kinetic data?
  • Is there a need for a large program to accelerate our understanding of microstructural evolution?


DAY 1 RECAP

Alisdair Davey, Daniel K. Inouye Solar Telescope Data Center, provided a recap of the highlights of the first day. He began by offering short takeaways from each of the presenters:

  • Jakoah Brgoch: Machine learning requires multi-dimensional data. How does one identify fundamental material features that affect hardness?
  • Wendy Mao: One can take advantage of high-strength phases that form under high pressure by using a technique such as encapsulation to preserve them at ambient conditions. Could this offer a path to the design of new materials?
  • Nir Goldman: The ML-IAP technique can be used to bridge gaps between DFT simulations and experiments.
  • Eric Homer: The Olmsted data set is the gift that keeps on giving. Well-curated data sets are extremely important. Curation may be more important than the size of the data set itself. How do we standardize data storage and determine data quality?
  • Christopher Weinberger: Properties are the key, but there are data gaps and questions about how to bridge them, especially for properties (like creep) that are not well suited to high-throughput experiments.
  • Penghui Cao: It is important in training machine learning models to identify the material parameters, such as short-range order, that can affect material strength; these parameters can be identified using modeling and experiments.
  • Shyue Ping Ong: Machine learning is making it possible to bridge from the quantum to the continuum scale and is facilitating the exploration of millions of possible materials. He described the Materials Project, which has offered major benefits to the materials community.
  • Aaron Stebner: He described a nine-point rubric for success in using machine learning for engineering. It is important to note where subject matter expert input is needed; machine learning is not going to make that knowledge and experience redundant. He also spoke about using robotics to build a foundation for high-throughput experiments.
  • Douglas Wolfe: Integration is vital. He spoke about tools, scales, and design and manufacturing. There are many challenges.
  • Elizabeth Opila: Do we have sufficient understanding of materials in extreme environments, and what test facilities do we need to increase our understanding? How do we come up with high-throughput experiments suitable for providing data for these efforts?

Davey noted that machine learning came up regularly in the presentations. It is an increasingly common tool for bridging the gap between simulations and experiments and between length scales. The presenters made it clear that validating machine learning models is vital, as are standards for machine learning such as the nine-point rubric that Stebner detailed. Data heterogeneity is an impediment to doing machine learning easily; that is common to nearly every scientific field, he said, and in many cases the first part of a machine learning effort is conditioning the data. Access to well-curated data sets is the key to using machine learning in materials science, and the key feature of a data set may not be its size (both large and small data sets have their applications) but rather its quality and how well it is curated. Finally, Davey said, it was very clear from the presentations that the ability to do extrapolative machine learning will be very valuable to the field.
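
Davey's point about data conditioning can be illustrated with a short, hedged sketch. The column names, units, and values below are invented; the sketch shows the kind of harmonization a heterogeneous materials data set typically needs before model training: unit conversion, deduplication, and explicit flagging of missing values.

```python
import pandas as pd

# Hypothetical heterogeneous records: the same property reported in
# mixed units, with a duplicate entry and a missing value.
records = pd.DataFrame({
    "composition": ["WB4", "ReB2", "WB4", "B6O", "c-BN"],
    "hardness":    [43.0, 4.8e10, 43.0, 45.0, None],  # mixed GPa and Pa
    "unit":        ["GPa", "Pa", "GPa", "GPa", "GPa"],
    "source":      ["lab_A", "lab_B", "lab_A", "lit", "lit"],
})

# 1. Harmonize units: convert pascal entries to gigapascals.
mask_pa = records["unit"] == "Pa"
records.loc[mask_pa, "hardness"] /= 1e9
records["unit"] = "GPa"

# 2. Drop exact duplicate measurements of the same composition.
records = records.drop_duplicates(subset=["composition", "hardness"])

# 3. Flag missing values explicitly rather than silently imputing them.
records["missing_hardness"] = records["hardness"].isna()

print(records)
```

Flagging rather than imputing missing values preserves provenance, which matters when, as Davey noted, the quality and curation of a data set count for more than its size.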

Then Davey identified some common issues that had been raised by multiple presenters and discussed in the panels. It is important to have the right mix of people, so workforce development is vital. It is hard to find people who can both do the computational work and understand the physics, and it is also difficult to get people to do the tedious work of filling data gaps. The health of the field will depend on getting the importance of this work recognized and funded. How could that be done? Should it be called out in the decadal survey? Last, he said, making databases more available, and ensuring that they are well curated, should help the field.

