3 Materials Characterization
Pages 42-73

From page 42...
... Thermo-Mechanical Testing and Characterization in Extreme Environments. Gregory B. Thompson, University of Alabama, began by speaking about the importance of characterizing materials in extreme environments, such as at extreme temperatures, under high strains, or under irradiation.
From page 43...
... The first is that it uses small sample sizes, which is advantageous for many of the materials designed for extreme environments, as they can be very expensive to produce. The second is that the technique has the potential to study a reasonably broad range of materials.
From page 44...
... [Figure: electrostatic levitation apparatus. Three pairs of orthogonal electrodes stabilize the sample sphere via electrostatic levitation; a 200 W YAG laser beam heats and rotates the sample at many thousands of revolutions per second (the absorbed photons apply a small net torque).]
From page 45...
... It is also possible to test materials at extreme temperatures and under extreme mechanical conditions at the same time, Thompson said, describing as an example a tool developed at Sandia National Laboratories. The tool is based on a transmission electron microscope (TEM)
From page 46...
... In summary, he said that in situ electron microscopy offers unique capabilities for probing phase stability, deformation mechanisms, and other properties, but developing measurement methods and specimen preparation is challenging. And, looking to the future, he noted that extreme environments offer a rich realm of experimental opportunities in instrumentation development, measurement science, and, ultimately, materials development.
From page 47...
... . Noting that previous workshop speakers had discussed a lack of temperature standards in extreme environments, Rasmussen said that there are temperature standards, but people often do not use them.
From page 48...
... Researchers have produced conditions of nearly 6,000°C and up to 500 gigapascals in these cells, although uncertainty in the measurements is increased at these more extreme conditions. The key features of this technique are that the measurement of the sample conditions is mainly indirect, the time the sample is at experimental conditions is somewhat long, the measurement of the property is via a relative technique, and the uncertainty of the state variables is pretty high.
From page 49...
... There are few measurement techniques or facilities for generating reference-quality data on materials in extreme conditions. There is also a lack of innovation in methods to measure these properties.
From page 50...
... Tabletop Hypervelocity Launcher for Studying Extreme Materials. Following Rasmussen, Dana D. Dlott, University of Illinois at Urbana-Champaign, described how he studies materials under extreme conditions that he creates by using a pulsed laser to launch "little bullets" into the material.
From page 51...
... The resulting temperature in this situation will be 2,200 K or 1,925°C. "Water is a low-density compressible material," he commented, "and if you shoot this at plastic or if you shoot it at metal, you're not going to get that temperature." The key feature here is that the shockwave created by shooting a flyer plate into a target creates extreme conditions as the shock goes through the sample.
From page 52...
... They are most interested in what they call hot spots. "When you put a shockwave into one of these materials," he explained, "there are some regions where the energy of the shockwave gets concentrated and causes the explosive to ignite, and those are called hot spots." Work at national laboratories has shown that when a shockwave travels through a plastic-bonded explosive, the hot spots develop behind the shockwave and then coalesce, leading to a detonation.
From page 53...
... What they have found by examining the detonations is that when a shockwave hits, it squeezes the voids in the materials, igniting hot spots. The result is an ensemble of hot spots with different sizes and temperatures.
From page 54...
... All of these strategies require understanding materials' properties at the micro and macro levels in order to design optimum materials for different uses. With this introduction, she said that the focus of her presentation would be the task of finding complex metal alloys as well as modifications to mature materials systems that would increase sustainability.
From page 55...
... The question Leonard addressed in her research was whether she could link dislocation interactions within the precipitates of the different phases to the tensile behavior of the entire alloy. Because the kappa phases have varying sizes, answering the question requires studying the material at different scales, and she used SEM, TEM, and micro-computed tomography.
From page 56...
... The SEM shows where there is strain localization, but it does not reveal the dislocation interactions with the specific precipitates, for which TEM is needed. Specifically, she is looking for hot spots, which would be the places where a crack would start when the material is put under stress.
From page 57...
... "So 3,000°C is completely brand new to me." Still, she explained, she believed that her work in determining the structures of materials from observational data such as X-ray absorption spectra could be applied in the area of extreme materials. The focus of Chan's group is on the inversion of characterization data to get structures, using theoretical modeling as guidance and constraints -- in other words, using information from such instruments as electron microscopes, atomic probes, and synchrotrons to determine the atomic structure of the material under study.
From page 58...
... The bottom line, Chan said, is "Using computer vision deep learning, we can process electron microscopy data and extract information from it, sometimes thinking about the physics of the problem a little bit." As a second example, Chan talked about using data from X-ray absorption near-edge spectroscopy (XANES) to determine structural and electronic descriptors, specifically the coordination numbers, of materials.
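To make the mapping concrete, here is a minimal, hypothetical sketch of the kind of supervised model that could relate spectra to coordination numbers. The random placeholder data, array shapes, and the choice of a gradient-boosted regressor are illustrative assumptions only and do not represent Chan's actual pipeline or data.

```python
# Hypothetical sketch: learn a mapping from XANES spectra to coordination number.
# Assumes a training set of simulated spectra (n_samples x n_energies) with known
# coordination numbers; the data here are random placeholders for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

n_samples, n_energies = 2000, 100
spectra = rng.random((n_samples, n_energies))            # absorption vs. energy grid
coord_numbers = rng.uniform(4.0, 12.0, size=n_samples)   # "known" coordination numbers

X_train, X_test, y_train, y_test = train_test_split(
    spectra, coord_numbers, test_size=0.2, random_state=0
)

model = GradientBoostingRegressor()
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("MAE on held-out spectra:", mean_absolute_error(y_test, pred))
```

In practice the training spectra would come from simulations of candidate structures rather than random numbers, and the learned descriptors would feed back into structure determination.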
From page 59...
... It works by taking whatever characterization data are available, generating possible atomic structures with machine learning, then simulating the experimental signals from those candidate atomic structures and looking for a match. DFT is used to constrain the solutions to those that are physically reasonable.
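The generate-simulate-match workflow described above can be summarized in a short sketch. In the toy version below, propose_structures, simulate_signal, and is_physically_reasonable are hypothetical stand-ins for the machine learning generator, the forward simulation of the experimental signal, and the DFT-based constraint, respectively; none of them reflect a specific published code.

```python
# Toy sketch of the generate-simulate-match loop; all helper functions are placeholders.
import numpy as np

def propose_structures(measured_signal, n_candidates=100):
    """Stand-in for an ML generator; a real one would condition on the measurement."""
    rng = np.random.default_rng(0)
    return [rng.random((32, 3)) for _ in range(n_candidates)]  # toy atomic coordinates

def simulate_signal(structure):
    """Stand-in for a forward simulation of the experimental signal (e.g., a spectrum)."""
    return np.histogram(structure.ravel(), bins=50, range=(0.0, 1.0))[0].astype(float)

def is_physically_reasonable(structure):
    """Stand-in for a DFT- or energy-based filter on unphysical candidates."""
    return True  # in practice, reject candidates with unreasonable energies or geometries

def invert(measured_signal):
    best, best_err = None, np.inf
    for candidate in propose_structures(measured_signal):
        if not is_physically_reasonable(candidate):
            continue
        err = np.linalg.norm(simulate_signal(candidate) - measured_signal)
        if err < best_err:
            best, best_err = candidate, err
    return best, best_err

# Example usage with a toy "measured" signal on the same 50-bin grid.
measured = np.ones(50)
structure, mismatch = invert(measured)
print("best mismatch:", mismatch)
```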
From page 60...
... The first four of these challenges relate to the ultra-high-temperature regime, that is, above around 2,000°C, while for the last two, even data at lower temperatures are important. After a brief overview of the state of the art in in situ characterization of materials in extreme environments, Misture raised the question of what researchers should do with their data once they have collected information from various sources about a material at extreme conditions.
From page 61...
... ; and access to new and improved beamlines for in situ studies under extreme environments (such as the beamlines that will be available after the planned upgrade of the Advanced Photon Source)
From page 62...
... The need to deal with all the data he was generating and to extract information from it led him to develop various machine learning approaches. He explained that a lot of what the group has focused on is how to make machine learning models and machine learning tools usable and practical for the applications that they have.
From page 63...
... A clear lesson from the work, he said, is "as you include more and more physics in the model, you're going to get much better results." Agar's last subject was the use of machine learning tools to perform real-time data reduction as a way of dealing with the vast amounts of data being generated by modern physics experiments. As an example of why data reduction is needed, he pointed to the Large Hadron Collider, which collects data at a rate that is on the order of petabytes per second.
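As an illustration of what real-time reduction can look like, the following is a minimal sketch that compresses detector frames on the fly with incremental principal component analysis, so that only a small latent vector per frame needs to be stored. The frame size, batch structure, and choice of method are assumptions made for illustration and are not Agar's actual tool.

```python
# Minimal sketch of streaming data reduction: keep a few components per frame instead
# of the full pixel array. Frame shape and stream source are illustrative placeholders.
import numpy as np
from sklearn.decomposition import IncrementalPCA

frame_pixels = 128 * 128      # placeholder detector size
n_components = 16             # store 16 numbers per frame instead of 16,384

reducer = IncrementalPCA(n_components=n_components)
rng = np.random.default_rng(0)

def frame_stream(n_batches=10, batch_size=64):
    """Stand-in for frames arriving from the instrument's acquisition system."""
    for _ in range(n_batches):
        yield rng.random((batch_size, frame_pixels))

reduced_batches = []
for batch in frame_stream():
    reducer.partial_fit(batch)                        # update the model as data arrive
    reduced_batches.append(reducer.transform(batch))  # archive only the reduced form

print("stored shape per batch:", reduced_batches[-1].shape)  # (64, 16) vs. (64, 16384)
```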
From page 64...
... In the discussion period following the presentation, Agar responded to a question about how easy it is to integrate the electron microscope with the data reduction tool. That is perhaps the biggest pain point, he said, because instruments generally have proprietary data acquisition systems, and it can be difficult to tap into the data streams.
From page 65...
... Cherukara, Argonne National Laboratory, began by explaining that his group develops the computational tools, algorithms, and machine learning methods that the users of the Advanced Photon Source (APS) use when they analyze the data.
From page 66...
... By using a neural network that has been trained to generate the desired images from the data, researchers can not only speed up the production of the images but also produce them with significantly fewer data. In one case that Cherukara described, the AI method was able, with only 4 percent of the data, to produce images that had nearly the same fidelity as images made with 100 percent of the data.
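A toy sketch of the underlying idea, training a model to reconstruct full images from a 4 percent subsample of the measurements, is given below. The random stand-in images and the small scikit-learn network are illustrative assumptions, not the APS workflow or its models.

```python
# Toy sketch: learn a mapping from a sparse 4 percent pixel subsample back to the full
# image. Real work uses physics-specific training data and far larger networks.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_images, side = 500, 32
full = rng.random((n_images, side * side))               # stand-in "ground truth" images

keep = rng.choice(side * side, size=int(0.04 * side * side), replace=False)
sparse = full[:, keep]                                   # the 4 percent that is "measured"

net = MLPRegressor(hidden_layer_sizes=(256,), max_iter=300, random_state=0)
net.fit(sparse[:400], full[:400])                        # learn the sparse-to-full mapping

recon = net.predict(sparse[400:])
print("mean squared reconstruction error:", np.mean((recon - full[400:]) ** 2))
```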
From page 67...
... For instance, at lower temperatures there was sink-limited behavior, but as the temperature increased there was a recombination regime and then finally a regime with Arrhenius diffusion behavior. She said that this was a totally nonintuitive type of behavior, but it is something that they can now extract rigorously.
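For context, Arrhenius diffusion behavior refers to the standard exponential dependence of a diffusivity or rate on temperature; the textbook form below is given only for reference and is not drawn from the presentation.

    D(T) = D_0 \exp(-E_a / k_B T),   equivalently   ln D = ln D_0 - E_a / (k_B T),

where D_0 is a prefactor, E_a is the activation energy, and k_B is Boltzmann's constant, so that ln D varies linearly with 1/T in the Arrhenius regime.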
From page 68...
... However, the use of scanning nanobeam diffraction may improve the segmentation issues and provide localized information. The second case study that Krogstad described involved tracking radiation-induced point defect accumulation using nanobeam diffraction techniques.
From page 69...
... What we really need are in situ probes and in situ characterization to understand radiation damage, he concluded. There are various techniques that have been used for in situ studies of radiation damage, including Rutherford backscattering spectrometry, X-ray diffraction, X-ray photoelectron spectroscopy, and Raman spectroscopy, but the most commonly used technique is in situ TEM.
From page 70...
... This means, Uberuaga said, that no ex situ experiment could measure this change in conductivity. "So, we're sensitive again to fast-moving defects that you can only see when you're actually irradiating the material, again emphasizing the need for in situ probes." Switching gears, he then spoke about experiments involving irradiation coupled with other extreme environments.
From page 71...
... Indeed, "the more probes that we throw at our material, the more we're going to learn." In recent work looking at defects in chromium oxide, they used TEM, Raman spectroscopy, electrochemi cal impedance spectroscopy, and PAS. And as more probes are used to examine complex responses, he said, artificial intelligence and machine learning will be able to help synthesize all those data.
From page 72...
... Cherukara commented that according to a memo from the White House, the expectation is that by 2026 data from all federally funded research will have to be made available by the investigators, and Uberuaga said that at Los Alamos National Laboratory researchers "need to write a data management plan with pretty much every proposal now." Unfortunately, Los Alamos National Laboratory does not provide a resource to help its scientists manage their data or make it available, he said, "so I feel like we're all scrambling to figure out a way to do that on our own, without the right expertise or resources to be able to do that in the way that it should be done."

Joshua Agar, Drexel University, said that there are various ways that scientists are rewarded for supplying their data to others or putting them in a database, but there are also challenges related to data storage and sharing. "I was lucky recently and got awarded enough money to buy about one and a half petabytes of data storage," he said, "but it is really challenging, and I think we have to push on our institutions to support it as part of our overhead costs." Right now, he said, if he generates two terabytes of data, he has no good way to send those data to another institution.
From page 73...
... There are 70 different beamlines, and each of them does different things, he said. "There are beamlines where you can do high T and high P

