ProductAE: Towards Training Larger Channel Codes based on Neural Product Codes

10/09/2021
by   Mohammad Vahid Jamali, et al.

There have been significant research activities in recent years to automate the design of channel encoders and decoders via deep learning. Due to the dimensionality challenge in channel coding, it is prohibitively complex to design and train relatively large neural channel codes via deep learning techniques. Consequently, most results in the literature are limited to relatively short codes having fewer than 100 information bits. In this paper, we construct ProductAEs, a computationally efficient family of deep-learning-driven (encoder, decoder) pairs, that aim at enabling the training of relatively large channel codes (both encoders and decoders) with a manageable training complexity. We build upon ideas from classical product codes and propose constructing large neural codes from smaller code components. More specifically, instead of directly training the encoder and decoder for a large neural code of dimension k and blocklength n, we provide a framework that requires only training neural encoders and decoders for the code parameters (k_1,n_1) and (k_2,n_2) such that k_1 k_2 = k and n_1 n_2 = n. Our training results show significant gains, over all ranges of signal-to-noise ratio (SNR), for a code of parameters (100,225) and a moderate-length code of parameters (196,441), over polar codes under successive cancellation (SC) decoding. Moreover, our results demonstrate meaningful gains over Turbo Autoencoder (TurboAE) and state-of-the-art classical codes. This is the first work to design product autoencoders and a pioneering work on training large channel codes.
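To illustrate the classical product-code construction the paper builds on, the sketch below encodes a k = k_1 k_2 message by arranging it in a k_1 × k_2 grid, encoding rows with one component code and columns with the other. Single-parity-check codes stand in for the trained neural component encoders; the function names and the choice of component code are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def spc_encode(bits):
    """Single-parity-check encoder: append one parity bit (k -> k+1).
    A stand-in for a trained neural component encoder."""
    return np.concatenate([bits, [bits.sum() % 2]])

def product_encode(msg, k1, k2, row_enc, col_enc):
    """Classical product-code construction: arrange the k = k1*k2
    message bits in a k1 x k2 grid, encode each row with one
    component code, then each column of the result with the other."""
    grid = np.asarray(msg).reshape(k1, k2)
    rows = np.array([row_enc(r) for r in grid])      # shape (k1, n2)
    cols = np.array([col_enc(c) for c in rows.T]).T  # shape (n1, n2)
    return cols

# Example: k = 2*2 = 4 message bits -> n = 3*3 = 9 coded bits
msg = [1, 0, 1, 1]
codeword = product_encode(msg, 2, 2, spc_encode, spc_encode)
print(codeword.shape)  # (3, 3)
```

Because the component encoders here are systematic, the original 2×2 message grid survives in the top-left corner of the 3×3 codeword; ProductAE replaces these hand-designed components with jointly trained neural (encoder, decoder) pairs, so only the small (k_i, n_i) components are trained rather than the full (k, n) code.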
