I have worked a lot with byte streams, so I understand the concept - some byte positions carry more numerical weight than others.
When I read the Wikipedia definitions, though, I get very confused.
When I see the word "big-endian" (BIG + END), what I expect is that the last/ending byte is the biggest. No-brainer, right?
Example for those who don't know about endianness at all: the bytes 0x01 0x00, stored at index 0 and index 1, equal 1 if the last byte is the most significant, and 256 if the first byte is the most significant. (Each byte has 2^8 = 256 possible values, so the higher-order byte carries a factor of 256.)
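
Here's a quick Python sketch of what I mean (the byte values and variable names are just for illustration):

```python
data = bytes([0x01, 0x00])  # index 0 holds 0x01, index 1 holds 0x00

# If the LAST byte is the most significant, it carries the factor of 256:
last_byte_significant = data[1] * 256 + data[0]   # 0x00 * 256 + 0x01 = 1

# If the FIRST byte is the most significant, it carries the factor of 256:
first_byte_significant = data[0] * 256 + data[1]  # 0x01 * 256 + 0x00 = 256

print(last_byte_significant, first_byte_significant)  # 1 256
```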
Obviously, if the last byte is the most significant, I would call that "big-endian".
But no, reading Wikipedia, the definitions are the exact opposite: when the last byte is the most significant it's called little-endian, and when the first byte is the most significant it's called big-endian.
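
You can check those names against Python's built-in int.from_bytes, which uses the same terminology (again, just a quick sketch):

```python
data = bytes([0x01, 0x00])

# "big" = big-endian: the FIRST byte is the most significant
print(int.from_bytes(data, byteorder="big"))     # 256

# "little" = little-endian: the LAST byte is the most significant
print(int.from_bytes(data, byteorder="little"))  # 1
```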
What gives?