Using CycleGANs to Generate Realistic STEM Images for Machine Learning
The rise of automation and machine learning (ML) in electron microscopy has the potential to revolutionize materials research by enabling the autonomous collection and processing of vast amounts of atomic-resolution data. However, a major challenge is developing ML models that can reliably and rapidly generalize to large data sets acquired under varying experimental conditions. To overcome this challenge, we develop a cycle generative adversarial network (CycleGAN) that introduces a novel reciprocal space discriminator to augment simulated data with realistic, complex spatial frequency information learned from experimental data. This enables the CycleGAN to generate images that are nearly indistinguishable from real experimental data, while also providing labels for further ML applications. We demonstrate the effectiveness of this approach by training a fully convolutional network (FCN) to identify single atom defects in a large data set of 4.5 million atoms, which we collected using automated acquisition in an aberration-corrected scanning transmission electron microscope (STEM). Our approach yields highly adaptable FCNs that can adjust to dynamically changing experimental variables, such as lens aberrations, noise, and local contamination, with minimal manual intervention. This represents a significant step towards building fully autonomous approaches for harnessing microscopy big data.
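To make the idea of a reciprocal space discriminator concrete, the sketch below shows one way such a component could be wired up: a standard PatchGAN-style CNN critic that operates on the log-magnitude of the 2D Fourier transform of each image rather than on the image itself, so that real and generated STEM images are compared by their spatial-frequency statistics. This is a minimal illustration under assumed design choices, not the authors' implementation; the class name, layer sizes, and preprocessing (fftshift, log1p scaling) are all hypothetical.

```python
import torch
import torch.nn as nn


class ReciprocalSpaceDiscriminator(nn.Module):
    """Hypothetical discriminator that judges images in reciprocal (Fourier)
    space, making it sensitive to spatial-frequency content rather than
    real-space pixel values."""

    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
            nn.InstanceNorm2d(128),
            nn.LeakyReLU(0.2, inplace=True),
            # Patch-wise real/fake logits rather than a single scalar
            nn.Conv2d(128, 1, kernel_size=4, stride=1, padding=1),
        )

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        # 2D FFT of each image; keep the centered log-magnitude spectrum
        spectrum = torch.fft.fftshift(torch.fft.fft2(img), dim=(-2, -1))
        log_mag = torch.log1p(spectrum.abs())
        return self.net(log_mag)


if __name__ == "__main__":
    d_fft = ReciprocalSpaceDiscriminator()
    batch = torch.rand(4, 1, 256, 256)  # stand-in for simulated STEM images
    scores = d_fft(batch)
    print(scores.shape)  # map of patch-wise real/fake logits
```

In a CycleGAN training loop, a discriminator of this form would be used alongside (or in place of) the usual real-space discriminator, encouraging the generator to reproduce the noise and aberration signatures that appear in the frequency content of experimental images.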