Reed-Solomon Codes in Neural Networks

Published: Updated:

Today I came up with another weird combination: how can I use Reed-Solomon error correction on a directed acyclic graph? In other words, the graph is somewhat dynamic and some routes can be altered (edges removed), and we need to update other weights in order to restore the missing path.

For some reason I just like Reed-Solomon codes because of their name; they're the first thing that pops into my mind, but any other fault-tolerance methodology should work too.

So first we should keep redundancy - different paths can carry the same information.

Wait, how can the path become information itself?

Encoding using neurons instead of discrete gates (over Galois fields) - decoder in complex numbers - Reed–Solomon_codes_for_coders#RS_encoding - full Python code - popular Python library unireedsolomon
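Before any neuron-based version, the "over Galois fields" part boils down to byte arithmetic: addition is XOR, and multiplication is carry-less with reduction. A minimal sketch, assuming the reducing polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11d), which is common in RS implementations (the function name is mine, not from any linked library):

```python
def gf256_mul(a, b, poly=0x11d):
    """Multiply two elements of GF(2^8) by 'Russian peasant' multiplication.

    Addition in GF(2^8) is plain XOR; here we accumulate shifted copies of `a`
    and reduce modulo the degree-8 polynomial whenever the result overflows
    8 bits.
    """
    r = 0
    while b:
        if b & 1:          # if the lowest bit of b is set, add (XOR) a
            r ^= a
        a <<= 1            # multiply a by x
        if a & 0x100:      # degree reached 8: reduce modulo the field polynomial
            a ^= poly
        b >>= 1
    return r
```

For example, `gf256_mul(2, 128)` is x * x^7 = x^8, which reduces to x^4 + x^3 + x^2 + 1 = 29.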

Berlekamp-Massey algorithm for Binary codes
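For the binary case mentioned above, Berlekamp-Massey reduces to GF(2) arithmetic: it finds the length of the shortest LFSR that reproduces a bit sequence. A sketch of the textbook algorithm (my own helper, returning only the LFSR length):

```python
def berlekamp_massey(bits):
    """Length of the shortest LFSR over GF(2) that generates `bits`."""
    n = len(bits)
    c = [0] * (n + 1); c[0] = 1   # current connection polynomial
    b = [0] * (n + 1); b[0] = 1   # connection polynomial before the last update
    L, m = 0, -1                  # current LFSR length, position of last update
    for i in range(n):
        # discrepancy between bits[i] and the current LFSR's prediction
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            shift = i - m
            # c(x) += x^shift * b(x)  (over GF(2), addition is XOR)
            for j in range(n + 1 - shift):
                c[j + shift] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L
```

An alternating sequence 0,1,0,1,... satisfies s[i] = s[i-2], so the algorithm reports an LFSR of length 2.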

RS codes - general outline, chapters from a real book

Short introduction to modern approach (nice formulas)

New improvements

Lagrange interpolation (from dots to polynomial)

Lagrange interpolating polynomial in Wolfram

Lagrange interpolating polynomial through (-1,8), (0,5), (1,12), (2,12), (3,15): (9x^4)/8 - (61x^3)/12 + (31x^2)/8 + (85x)/12 + 5
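The same Wolfram result can be reproduced exactly in Python with rational arithmetic; a sketch with a helper of my own (not from the linked pages), returning coefficients with the constant term first:

```python
from fractions import Fraction

def lagrange_poly(points):
    """Coefficients (constant term first) of the unique polynomial of degree
    < n passing through n points, computed exactly with fractions."""
    n = len(points)
    total = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        basis = [Fraction(1)]              # running product of (x - xj) factors
        denom = Fraction(1)
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            # multiply the basis polynomial by (x - xj)
            nxt = [Fraction(0)] * (len(basis) + 1)
            for k, c in enumerate(basis):
                nxt[k + 1] += c            # c * x
                nxt[k] -= c * xj           # c * (-xj)
            basis = nxt
            denom *= xi - xj
        scale = Fraction(yi) / denom       # y_i / prod(x_i - x_j)
        for k, c in enumerate(basis):
            total[k] += scale * c
    return total

# recovers [5, 85/12, 31/8, -61/12, 9/8], i.e. the polynomial above
coeffs = lagrange_poly([(-1, 8), (0, 5), (1, 12), (2, 12), (3, 15)])
```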

Evaluate in Wolfram

evaluate f(x) = 8 + 5x + 12x^2 + 12x^3 + 15x^4:
f(-1): a - b + c - d + e = 18
f(0): a = 8
f(1): a + b + c + d + e = 52
f(2): a + 2b + 4c + 8d + 16e = 402
f(3): a + 3b + 9c + 27d + 81e = 1670
(solve for the coefficients by Gaussian elimination)
(15x^7 + 12x^6 + 12x^5 + 5x^4 + 8x^3)/(x^3 - x)
evaluate 15x^7 + 12x^6 + 12x^5 + 5x^4 + 8x^3 - 35x - 17x^2 at 0
evaluate 15x^7 + 13x^6 + 12x^5 + 5x^4 + 8x^3 - 35x - 17x^2 at 0
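The Gaussian-elimination step above can be sketched in Python: the five evaluations form a Vandermonde system, and solving it exactly recovers the message coefficients 8, 5, 12, 12, 15 (the `solve` helper is my own, not a library routine):

```python
from fractions import Fraction

def solve(matrix, rhs):
    """Solve A x = b exactly by Gauss-Jordan elimination with pivoting."""
    n = len(rhs)
    a = [[Fraction(v) for v in row] + [Fraction(b)]
         for row, b in zip(matrix, rhs)]
    for col in range(n):
        # find a row with a nonzero pivot and swap it into place
        piv = next(r for r in range(col, n) if a[r][col] != 0)
        a[col], a[piv] = a[piv], a[col]
        # eliminate this column from every other row
        for r in range(n):
            if r != col and a[r][col] != 0:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][n] / a[i][i] for i in range(n)]

# Vandermonde system from evaluating f at x = -1, 0, 1, 2, 3
xs = [-1, 0, 1, 2, 3]
ys = [18, 8, 52, 402, 1670]
A = [[x ** k for k in range(5)] for x in xs]
coeffs = solve(A, ys)   # recovers [8, 5, 12, 12, 15]
```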

  • Reed-Solomon codes are widely used in deep-space communication, compact disc audio systems, and frequency-hopped systems. However, the VLSI implementation of these codes is still very complex, and encoding/decoding by using these chips is very time consuming. Neural network implementation of these codes has resulted in reduced complexity, enhanced error correction capability, fast processing, and improved signal-to-noise ratio. The proposed scheme requires less bandwidth, utilizes soft decision decoding, and exploits the redundancy in the English language.

  • Deep learning recently shows outstanding potential in channel decoding optimization, but its effect on the decoding of Reed-Solomon (RS) codes has yet to be explored. In this paper, we propose a RS decoder based on deep learning for the first time, and pave a new way to improve the existing RS decoding algorithms. We exploit a deep neural network (DNN) to estimate the error numbers of the received codewords, and according to the estimation results, a novel decoder is designed, which can adjust the most suitable decoding method to each received codeword automatically. Experiments show that for (7, 3), (15, 9) and (63, 55) RS codes, the average computational complexity of our decoder can be reduced by 68.96 %, 62.38 %, 50.61 % respectively compared with the HDD-LCC algorithm. - and maybe more examples just for decoding

Implementation with discrete gates

Decoding theory first, then encoding

Add scheduler and debug it

Area Optimized Syndrome Calculation for Reed-Solomon Decoder - syndrome formulas and diagrams with gates (low quality, I do not trust it) - about Linear Feedback Shift Registers
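Since the syndrome hardware is built from Linear Feedback Shift Registers (chains of D flip-flops with XOR feedback), here is a tiny software model of a Fibonacci LFSR; when the tap positions correspond to a primitive polynomial, an n-bit register cycles through all 2^n - 1 nonzero states (the function and its tap convention are my own sketch):

```python
def lfsr_period(taps, nbits, seed=1):
    """Period of a Fibonacci LFSR over GF(2).

    `taps` are 1-indexed stage positions; the feedback bit is the XOR of the
    tapped stages, shifted into the low end of the register each step.
    """
    mask = (1 << nbits) - 1
    state = seed
    period = 0
    while True:
        # feedback bit = XOR of the tapped stages
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & mask   # shift and insert the feedback bit
        period += 1
        if state == seed:                    # returned to the start: full cycle
            return period
```

With taps (4, 3) the 4-bit register is maximal-length, so the period is 2^4 - 1 = 15; taps (3, 2) give a maximal 3-bit register with period 7.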

D = D flip-flop

Article to understand

Play with digital gates:
