Autoencoding Galaxy Spectra I: Architecture
Authors: Peter Melchior, Yan Liang, ChangHoon Hahn, Andy Goulding
Abstract: We introduce the neural network architecture SPENDER as a core differentiable building block for analyzing, representing, and creating galaxy spectra. It combines a convolutional encoder, which pays attention to up to 256 spectral features and compresses them into a low-dimensional latent space, with a decoder that generates a restframe representation, whose spectral range and resolution exceed those of the observing instrument. The decoder is followed by explicit redshift, resampling, and convolution transformations to match the observations. The architecture takes galaxy spectra at arbitrary redshifts and is robust to glitches like residuals of the skyline subtraction, so that spectra from a large survey can be ingested directly without additional preprocessing. We demonstrate the performance of SPENDER by training on the entire spectroscopic galaxy sample of SDSS-II; show its ability to create highly accurate reconstructions with substantially reduced noise; perform deconvolution and oversampling for a super-resolution model that resolves the [OII] doublet; introduce a novel method to interpret attention weights as proxies for important spectral features; and infer the main degrees of freedom represented in the latent space. We conclude with a discussion of future improvements and applications.
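To make the data flow described in the abstract concrete, the sketch below shows one way such an encoder-decoder could be wired up in PyTorch: a convolutional encoder with attention-weighted pooling into a small latent vector, an MLP decoder that emits a restframe model on a wider, finer wavelength grid, and an explicit redshift-and-resampling step back onto the instrument pixels. This is a minimal illustration, not the authors' SPENDER implementation: all layer sizes, wavelength grids, and class/method names are assumptions, and the instrumental line-spread-function convolution mentioned in the abstract is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpectrumAutoencoderSketch(nn.Module):
    """Illustrative encoder-decoder for galaxy spectra (not the paper's code)."""

    def __init__(self, n_obs=3800, n_rest=7000, n_latent=6, n_channels=256):
        super().__init__()
        # Convolutional encoder front end
        self.conv = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=5, padding=2), nn.PReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, n_channels, kernel_size=11, padding=5), nn.PReLU(),
            nn.MaxPool1d(2),
        )
        # Attention weights over wavelength for up to n_channels feature channels
        self.attention = nn.Conv1d(n_channels, n_channels, kernel_size=1)
        self.to_latent = nn.Linear(n_channels, n_latent)
        # Decoder: latent vector -> restframe model on a wider, finer grid
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, 256), nn.PReLU(),
            nn.Linear(256, 1024), nn.PReLU(),
            nn.Linear(1024, n_rest),
        )
        # Illustrative wavelength grids (Angstrom); real values depend on the survey
        self.register_buffer("wave_rest", torch.linspace(2500.0, 10000.0, n_rest))
        self.register_buffer("wave_obs", torch.linspace(3800.0, 9200.0, n_obs))

    def encode(self, spectrum):
        h = self.conv(spectrum.unsqueeze(1))        # (B, C, L)
        a = F.softmax(self.attention(h), dim=-1)    # attention along wavelength
        pooled = (a * h).sum(dim=-1)                # attention-weighted pooling -> (B, C)
        return self.to_latent(pooled)               # (B, n_latent)

    def decode(self, z):
        return self.decoder(z)                      # restframe model, (B, n_rest)

    def to_observed(self, rest_flux, redshift):
        # Explicit redshift + resampling: lambda_obs = (1 + z) * lambda_rest,
        # then linear interpolation onto the instrument pixels (scalar z here).
        # The instrumental line-spread-function convolution is omitted.
        wave_shifted = self.wave_rest * (1.0 + redshift)
        idx = torch.searchsorted(wave_shifted, self.wave_obs)
        idx = idx.clamp(1, len(wave_shifted) - 1)
        x0, x1 = wave_shifted[idx - 1], wave_shifted[idx]
        y0, y1 = rest_flux[..., idx - 1], rest_flux[..., idx]
        w = (self.wave_obs - x0) / (x1 - x0)
        return y0 + w * (y1 - y0)                   # (B, n_obs)

    def forward(self, spectrum, redshift):
        return self.to_observed(self.decode(self.encode(spectrum)), redshift)

# Usage: reconstruct a batch of observed-frame spectra at a given redshift
model = SpectrumAutoencoderSketch()
spec = torch.randn(8, 3800)            # batch of 8 observed spectra
recon = model(spec, redshift=0.1)      # reconstruction in the observed frame
print(recon.shape)                     # torch.Size([8, 3800])
```

The key design point the abstract emphasizes is that the decoder works in the restframe on a grid that is wider and finer than the instrument's, while redshifting and resampling are explicit, non-learned transformations applied afterwards; this is what allows a single model to ingest spectra at arbitrary redshifts and to act as a super-resolution model.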