I have a 128-bit string, and my supervisor has asked me to represent those 128 bits as a polynomial. This is a scan of the paper he was writing on:
His idea is that, by eliminating the 0 bits and keeping only the terms with coefficient 1, we will be able to perform the subsequent operations (most of which are XORs between the bits/polynomials) much faster than if we operated on all 128 bits.
I understand the requirement, and I can do it on paper and in the application, but my approach won't achieve his goal, which is improving performance. He said there are already libraries that do this, but unfortunately I couldn't find any. The only thing I found was a Polynomial class that evaluates polynomials, which is not what I want.
So does anyone know how I can implement this in a way that improves performance? Any code, snippets, or articles are very much appreciated.
The application is written in Java, if that makes any difference.
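In case it helps clarify what I mean: here is a minimal sketch of the sparse idea, not taken from any library. A polynomial over GF(2) is stored as the set of exponents whose coefficient is 1, so XOR (which is addition in GF(2)) becomes the symmetric difference of two exponent sets. The class and method names here are my own invention for illustration.

```java
import java.util.TreeSet;

// Sketch: a polynomial over GF(2), stored sparsely as the set of
// exponents whose coefficient is 1. For the bit string 1011 the
// polynomial is x^3 + x + 1, stored as {3, 1, 0}.
public class SparseGf2Poly {
    private final TreeSet<Integer> exponents = new TreeSet<>();

    // Build from a 128-bit value given as two longs (high and low words).
    public static SparseGf2Poly fromBits(long high, long low) {
        SparseGf2Poly p = new SparseGf2Poly();
        for (int i = 0; i < 64; i++) {
            if ((low >>> i & 1L) == 1L) p.exponents.add(i);
            if ((high >>> i & 1L) == 1L) p.exponents.add(i + 64);
        }
        return p;
    }

    // XOR = addition in GF(2): keep exponents present in exactly one operand
    // (equal terms cancel because 1 + 1 = 0 mod 2).
    public SparseGf2Poly xor(SparseGf2Poly other) {
        SparseGf2Poly result = new SparseGf2Poly();
        result.exponents.addAll(this.exponents);
        for (int e : other.exponents) {
            if (!result.exponents.remove(e)) result.exponents.add(e);
        }
        return result;
    }

    @Override
    public String toString() {
        if (exponents.isEmpty()) return "0";
        StringBuilder sb = new StringBuilder();
        for (int e : exponents.descendingSet()) {
            if (sb.length() > 0) sb.append(" + ");
            sb.append(e == 0 ? "1" : "x^" + e);
        }
        return sb.toString();
    }
}
```

My worry, and why I doubt my own version helps: for only 128 bits, XOR-ing two pairs of `long`s (or two `java.util.BitSet`s) is already a couple of machine instructions, so a set-based sparse representation like the above may well be slower, not faster.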
Thanks,
Mota
Update:
My supervisor says that this C library will do the task, but I couldn't figure out how it works or how it would accomplish this.