I am working on a web server for my job, and there are portions of our code, written by our chief engineers, that I don't understand and am currently trying to decipher. Below is a similar but much simpler version of what happens in our code base. Could anyone give me a detailed, step-by-step explanation of what it is doing?
package main

import (
    "encoding/binary"
    "fmt"
)

func main() {
    // No need to explain anything within this block.
    b := []byte{2, 3, 5, 7, 11, 13}
    for _, e := range b {
        fmt.Printf("%d ", e)
    }
    fmt.Printf("\n")

    length := binary.LittleEndian.Uint32(b) // <<< Why this results in 117768962 is the question.
    fmt.Printf("customLen=%d\n", int(length))
}
Comments:

uint32 is only 32 bits, so the 11 and 13 are ignored? – Scintilla
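A minimal sketch, assuming the usual little-endian reading (b[0] is the least significant byte), that rebuilds the value by hand from the first four bytes only; the manual variable is just for illustration:

package main

import (
    "encoding/binary"
    "fmt"
)

func main() {
    b := []byte{2, 3, 5, 7, 11, 13}

    // Little-endian place values: b[0]*1 + b[1]*256 + b[2]*65536 + b[3]*16777216
    manual := uint32(b[0]) + uint32(b[1])*256 + uint32(b[2])*65536 + uint32(b[3])*16777216
    // = 2 + 768 + 327680 + 117440512 = 117768962

    fmt.Println(manual)                        // 117768962
    fmt.Println(binary.LittleEndian.Uint32(b)) // 117768962
}

The 11 and 13 never take part in the calculation; only b[0] through b[3] are consumed.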
The LittleEndian.Uint32 function is only really 1 line of code: golang.org/src/encoding/binary/binary.go#L62 – Scintilla
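A rough paraphrase of that one-liner (littleEndianUint32 below is a stand-in name for illustration; the exact source behind the link may vary between Go versions):

package main

import "fmt"

// Roughly what the linked stdlib line does: shift each of the first
// four bytes into place, least significant byte first.
func littleEndianUint32(b []byte) uint32 {
    _ = b[3] // panics early if b holds fewer than 4 bytes
    return uint32(b[0]) | uint32(b[1])<<8 | uint32(b[2])<<16 | uint32(b[3])<<24
}

func main() {
    fmt.Println(littleEndianUint32([]byte{2, 3, 5, 7, 11, 13})) // 117768962
}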
It reads the first 4 bytes (the size of a uint32) as a uint32. All of the stdlib code is available to view, and it's even linked from the documentation, if you're curious to know what any stdlib function does step by step. – Jourdain
0x07050302 is the same as 117768962; the former is just a different representation or encoding of the latter. – Scintilla
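Printing the same value in both bases makes that visible; in hexadecimal the four consumed bytes reappear, most significant first:

package main

import (
    "encoding/binary"
    "fmt"
)

func main() {
    b := []byte{2, 3, 5, 7, 11, 13}
    length := binary.LittleEndian.Uint32(b)

    fmt.Printf("%d\n", length)     // 117768962
    fmt.Printf("0x%08x\n", length) // 0x07050302
}

Read byte by byte from the right, 02 03 05 07 are exactly b[0] through b[3] of the slice.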