WebGL (OpenGL) lets you create shader programs that run your own code on the GPU. Normally this is used to calculate the position and color of whatever gets displayed. To do this, you feed the GPU data and then run your own code against that data. As you can imagine, the ability to pass in arbitrary data and run arbitrary instructions opens the door to general-purpose computation (within limits).
You are also not bound to render directly to the screen (the back buffer); you can render to a texture of your choice instead. One limitation (in the base spec) is that you can only output a 4×8-bit RGBA value for each pixel. But how you calculate those 32 bits, and how you interpret them afterwards, is entirely up to you.
As an example, suppose you want to run many sqrt calculations. First you create a texture with each "pixel" holding an initial value (a float packed into RGBA); then you instruct the GPU to compute a sqrt for every pixel and write out the results. The result is garbage as a screen color, but if you write an unpack function that converts the RGBA back to a float, you have your GPU-based sqrt calculator!
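To make the pack/unpack idea concrete, here is a CPU-side sketch of what those two helpers could look like (the function names are mine, not part of any API): a JS number is stored as a 32-bit float in 4 bytes, which is exactly one RGBA "pixel", and then recovered on the way back.

```javascript
// Hypothetical helpers mirroring the pack/unpack step around a GPU pass.
// One float <-> one RGBA pixel (4 bytes).
function packFloatToRGBA(value) {
  const buf = new ArrayBuffer(4);
  new DataView(buf).setFloat32(0, value);
  return new Uint8Array(buf); // bytes interpreted as [r, g, b, a]
}

function unpackRGBAToFloat(rgba) {
  const bytes = new Uint8Array(rgba); // copy into a fresh buffer
  return new DataView(bytes.buffer).getFloat32(0);
}

// Pretend the GPU already did the sqrt; we just unpack its output.
const rgba = packFloatToRGBA(Math.sqrt(16));
console.log(unpackRGBAToFloat(rgba)); // 4
```

In a real setup the packing would happen when you upload the input texture, the sqrt would run in a fragment shader, and the unpacking would happen on the bytes you read back.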
Also, a texture is nothing more than an array of values packed in a certain format. Try canvas.getContext("2d").getImageData(0, 0, 10, 10) for example. The WebGL equivalent is gl.readPixels().
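Both calls hand you that flat array layout: width * height * 4 bytes, one RGBA quadruple per pixel, row by row. This runnable sketch (the pixelAt helper and the sample values are made up for illustration) shows how to index into such an array:

```javascript
// The flat RGBA layout returned by getImageData() / gl.readPixels():
// width * height * 4 bytes, rows laid out one after another.
const width = 10, height = 10;
const data = new Uint8Array(width * height * 4); // all zeros, like a blank texture

// Pixel (x, y) starts at byte offset (y * width + x) * 4.
function pixelAt(data, width, x, y) {
  const i = (y * width + x) * 4;
  return { r: data[i], g: data[i + 1], b: data[i + 2], a: data[i + 3] };
}

data.set([255, 0, 0, 255], (3 * width + 2) * 4); // write opaque red at (2, 3)
console.log(pixelAt(data, width, 2, 3)); // { r: 255, g: 0, b: 0, a: 255 }
```

The only practical difference is the container type: getImageData() gives you a Uint8ClampedArray, while gl.readPixels() fills a typed array you supply.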
No full WebGL example from me however; the WebGL API is very verbose and it takes a lot of boilerplate to get even something simple done.