Materials at Extreme Conditions
Neural network potentials and GNN coarse-grain models to predict shock response, hotspot formation, and detonation in energetic materials.
School of Materials Engineering · Purdue University
Predictive, physics-based understanding of materials that enables rational design, from the atomic scale to engineering applications.

We develop and apply atomistic and molecular simulation methods, machine learning models, and open data infrastructure to understand and design materials, from energetic materials and metals to polymers and beyond, for energy, defense, and sustainability, bridging quantum mechanics, statistical mechanics, and data science.
Active learning and high-throughput DFT to discover high-performance alloys and map the compositional landscape of 2D MXene precursors.

Reactive MD of condensed-phase chemistry — from carbon fiber stabilization to thermoset curing — coupled with GNN models for reaction rates.

Sim2L workflows and queryable nanoHUB databases making simulation data reusable — 10× faster active learning.
The Strachan Group is a team of graduate students, postdoctoral researchers, and undergraduate students at Purdue's School of Materials Engineering. We are united by a passion for understanding materials through computation and simulation.
Meet the team → About Ale Strachan →

Molecular modeling of thermally activated chemistry in condensed phases is essential to understanding polymerization, depolymerization, and other processing steps of molecular materials. Current methods typically combine molecular dynamics (MD) simulations, which describe short-time relaxation, with stochastic descriptions of predetermined chemical reactions. Possible reactions are often selected on the basis of geometric criteria, such as the capture distance between reactive atoms. Although these simulations have provided valuable insight, the approximations used to determine possible reactions often lead to significant molecular strain and unrealistic structures. We show that the local molecular environment surrounding the reactive site plays a crucial role in determining the resulting molecular strain energy and, in turn, the associated reaction rates. We develop a graph neural network capable of predicting the strain energy associated with a cyclization reaction from the prereaction local molecular environment surrounding the reactive site. The model is trained on a large dataset of condensed-phase reactions during the activation of polyacrylonitrile (PAN) obtained from MD simulations, and it can be used to adjust relative reaction rates in condensed systems and to advance our understanding of thermally activated chemical processes in complex materials.
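One natural way a strain-energy prediction like this can adjust kinetics is as an additive correction to the Arrhenius activation barrier. Below is a minimal sketch, assuming eV units, with the trained GNN stood in for by a placeholder function; `predicted_strain_energy`, `E_a`, and `nu0` are illustrative names, not the group's actual code:

```python
import math

KB = 8.617333262e-5  # Boltzmann constant in eV/K

def predicted_strain_energy(local_environment) -> float:
    """Placeholder for the trained GNN: maps the prereaction local
    molecular environment around a reactive site to a strain energy (eV).
    A fixed value is returned here purely for illustration."""
    return 0.15

def reaction_rate(E_a: float, T: float, local_environment=None,
                  nu0: float = 1e13) -> float:
    """Arrhenius rate with a strain-energy correction to the barrier.
    A positive strain energy raises the effective barrier."""
    dE = predicted_strain_energy(local_environment) if local_environment else 0.0
    return nu0 * math.exp(-(E_a + dE) / (KB * T))

# Relative rate of a strained vs. unstrained cyclization at 600 K
k_strained = reaction_rate(1.0, 600.0, local_environment="site A")
k_ideal = reaction_rate(1.0, 600.0)
print(k_strained / k_ideal)
```

Because the correction sits in the exponent, even a few tenths of an eV of environment-induced strain suppresses a reaction channel by orders of magnitude relative to an unstrained site.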
Machine learning interatomic potentials (ML-IAPs) are emerging as transformative tools in materials modeling, promising quantum-level accuracy at a fraction of the computational cost. However, their ability to generalize beyond equilibrium configurations and to reliably capture defect- and temperature-driven behavior remains underexplored. Here, we develop and benchmark two state-of-the-art ML-IAPs, the Spectral Neighbor Analysis Potential (SNAP) and Allegro, on a comprehensive dataset for monolayer MoSe2. Using density functional theory (DFT) as the reference, we evaluate their performance in capturing stress-strain behavior, phase transition energetics, defect evolution, edge stability, and fracture toughness. Allegro, a deep equivariant neural network potential, surpasses both SNAP and the classical Tersoff potential in accuracy, efficiency, and transferability. Importantly, both ML potentials accurately reproduce experimental fracture measurements and ab initio predictions of inversion domain formation — phenomena well beyond their training sets. Our findings establish ML-IAPs as viable replacements for traditional force fields in the study of non-equilibrium mechanical phenomena, enabling large-scale, high-fidelity simulations in 2D materials and beyond. This work provides a broadly applicable framework for data-driven modeling of structural and functional transformations under extreme conditions.
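Whichever potential is used, extracting stress-strain behavior follows the same recipe: apply a strain, evaluate the energy, and differentiate with respect to strain. A minimal sketch, with the ML-IAP replaced by a toy harmonic energy model (`toy_energy` and the modulus value are illustrative stand-ins, not the actual SNAP or Allegro potentials):

```python
def toy_energy(strain: float, volume: float = 1.0,
               modulus: float = 100.0) -> float:
    """Harmonic stand-in for a potential's energy evaluation:
    E = 0.5 * V * C * eps^2 (units arbitrary for this sketch)."""
    return 0.5 * volume * modulus * strain ** 2

def stress(strain: float, h: float = 1e-6, volume: float = 1.0) -> float:
    """Central finite-difference stress: sigma = (1/V) dE/d(eps).
    With a real calculator, toy_energy would be the (relaxed) cell
    energy at the given uniaxial strain."""
    return (toy_energy(strain + h, volume)
            - toy_energy(strain - h, volume)) / (2 * h * volume)

# Stress-strain curve on a grid of uniaxial strains
for eps_pct in range(0, 6):
    eps = eps_pct / 100
    print(f"eps={eps:.2f}  sigma={stress(eps):.3f}")
```

For the harmonic toy model the curve is linear, sigma = C·eps; with a real ML-IAP the deviation from this line at large strains is exactly the nonlinear and fracture behavior being benchmarked.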
Nanoscale precipitates have been shown to significantly affect the transformation behavior of shape memory alloys (SMAs). In this work, the effects of Heusler precipitates (Ni2TiAl) on the Ni50Ti24.5Hf22.5Al3 SMA were systematically investigated. Density functional theory (DFT) calculations showed that coherent Heusler precipitates can mechanically stabilize the austenite phase, implying a decrease in transformation temperatures. This trend was confirmed experimentally: transformation temperatures were depressed with increasing heat-treatment time as nanoscale coherent precipitates nucleated and grew, followed by an abrupt increase in transformation temperatures when the aging time exceeded 200 hours at 700 °C. We attribute this increase to the loss of coherency between the Heusler precipitates and the matrix, which weakened their mechanical coupling. Detailed microstructural characterization by high-resolution transmission electron microscopy (TEM) and energy-dispersive X-ray spectroscopy (EDS) is provided to support these findings.
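The connection between mechanical stabilization of austenite and depressed transformation temperatures can be made quantitative to first order: the temperature shift scales as the stabilization energy divided by the transformation entropy. A hedged sketch with illustrative numbers (neither value is taken from this work):

```python
def transformation_temperature_shift(dE_stab: float, dS: float) -> float:
    """First-order estimate of the depression in martensitic
    transformation temperature when precipitates stabilize austenite
    by dE_stab (J/mol): dT = dE_stab / dS, with dS the transformation
    entropy (J/mol/K). Input values below are illustrative only."""
    return dE_stab / dS

# e.g. 150 J/mol of extra austenite stability, dS ~ 5 J/(mol K)
print(transformation_temperature_shift(150.0, 5.0))
```

In this picture, losing coherency removes part of dE_stab, so the transformation temperature rebounds, consistent with the abrupt increase seen after long aging.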
Machine learning has become a central technique for modeling in science and engineering, either complementing physics-based models or serving as surrogates for them. Significant effort has recently been devoted to models capable of predicting field quantities, but the limitations of current state-of-the-art models in describing complex physics are not well understood. We characterize the ability of generative diffusion models and generative adversarial networks (GANs) to describe the Ising model. We find that diffusion models trained on equilibrium configurations, obtained using Metropolis Monte Carlo over a range of temperatures around the critical temperature, can capture average thermodynamic variables across the phase transformation and extrapolate to higher and lower temperatures. The models also capture the overall trends of physical properties associated with fluctuations (specific heat and susceptibility), except at non-ergodic low temperatures, as well as the non-trivial scale-free correlations at the critical temperature, albeit with some difference in the critical exponent compared with Monte Carlo simulations. GANs perform more poorly on thermodynamic properties and are susceptible to mode collapse without careful training. This investigation highlights the potential and limitations of generative models in capturing the complex phenomena associated with certain physical systems.
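The training data described here comes from the standard Metropolis sampler for the 2D Ising model. A minimal sketch of that sampler (lattice size, temperature, and sweep count are illustrative, not the paper's settings):

```python
import math
import random

def metropolis_ising(L=16, T=2.5, sweeps=200, seed=0):
    """Generate an equilibrated 2D Ising configuration (J = kB = 1)
    with the Metropolis algorithm, the sampler used to build the
    diffusion model's training set."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Energy change of flipping spin (i, j), periodic boundaries
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn
        # Accept downhill moves always, uphill with Boltzmann probability
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] *= -1
    return spins

config = metropolis_ising()
m = abs(sum(sum(row) for row in config)) / (16 * 16)
print(f"|magnetization| per spin at T=2.5: {m:.3f}")
```

Sweeping the temperature T through the critical point (Tc ≈ 2.269 for the 2D square lattice) and saving configurations yields exactly the kind of dataset, ordered below Tc, disordered above, that the generative models are trained on.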