Size of each character in ASCII (StreamWriter) takes 1 byte, whether it's a number or a character. Similarly, what will be the size of each character and integer in binary (BinaryWriter)? Can someone explain in brief?
Let's start with the difference between StreamWriter and BinaryWriter.

StreamWriter is for writing a textual representation to a stream. It converts anything that is written (via a Write* method) into a string, converts that string to bytes via its encoding, and writes those bytes to the underlying stream.

BinaryWriter is for writing raw "primitive" data types to a stream. For numeric types it takes the in-memory representation, does some work to normalize it (e.g. to handle differences in endianness), and then writes the bytes to the underlying stream. Note that it also takes an encoding in its constructor; this is used only for converting char and string values to bytes. The default encoding is UTF-8.
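As a minimal sketch of the distinction (the MemoryStream setup and class name here are illustrative assumptions, not part of the original answer):

    using System;
    using System.IO;
    using System.Text;

    class WriterKinds
    {
        static void Main()
        {
            // StreamWriter: every Write* call is converted to a string,
            // then encoded to bytes (ASCII here).
            var textStream = new MemoryStream();
            using (var writer = new StreamWriter(textStream, Encoding.ASCII))
            {
                writer.Write('A');   // "A"   -> 1 byte
                writer.Write(255);   // "255" -> 3 bytes
            }
            Console.WriteLine(textStream.ToArray().Length);   // 4

            // BinaryWriter: numeric types are written in their raw form;
            // the constructor encoding (UTF-8 by default) is used only
            // for char and string.
            var binaryStream = new MemoryStream();
            using (var writer = new BinaryWriter(binaryStream))
            {
                writer.Write('A');   // 1 byte (UTF-8)
                writer.Write(255);   // 4 bytes (raw int)
            }
            Console.WriteLine(binaryStream.ToArray().Length); // 5
        }
    }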
Size of each character in ASCII (StreamWriter) takes 1 byte, whether it's a number or a character.
This statement is somewhat confusing to me, so let me clarify. The int 1 will be converted to the string "1", which encodes in ASCII as 49, which is indeed one byte; but 10000 will be converted to the string "10000", which encodes in ASCII as 49 48 48 48 48, so that's 5 bytes. Using BinaryWriter, both would occupy 4 bytes (the size of an int).
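Here is a small sketch that verifies those byte counts (the class and variable names are arbitrary):

    using System;
    using System.IO;
    using System.Text;

    class AsciiVsBinary
    {
        static void Main()
        {
            // "1" encodes to one ASCII byte, "10000" to five.
            Console.WriteLine(string.Join(" ", Encoding.ASCII.GetBytes("1")));
            // 49
            Console.WriteLine(string.Join(" ", Encoding.ASCII.GetBytes("10000")));
            // 49 48 48 48 48

            // With BinaryWriter, both values take sizeof(int) == 4 bytes.
            var stream = new MemoryStream();
            using (var writer = new BinaryWriter(stream))
            {
                writer.Write(1);
                writer.Write(10000);
            }
            Console.WriteLine(stream.ToArray().Length);  // 8 (two 4-byte ints)
        }
    }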
Similarly, what will be the size of each character and integer in binary (BinaryWriter)? Can someone explain in brief?
The size of a char depends on the encoding used, for both BinaryWriter and StreamWriter. The sizes of numeric types like int, long, and double are the sizes of the underlying types: 4, 8, and 8 bytes respectively. The amount of data written is documented on each Write overload of BinaryWriter. Strings are treated differently from char[] in BinaryWriter: a string has its length prefixed before the encoded bytes are written.
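A sketch of the length prefix in action; the leading 05 below is the 7-bit-encoded length that the BinaryWriter.Write(String) documentation describes:

    using System;
    using System.IO;
    using System.Text;

    class LengthPrefix
    {
        static void Main()
        {
            var stringStream = new MemoryStream();
            using (var writer = new BinaryWriter(stringStream, Encoding.UTF8))
            {
                writer.Write("hello");               // length prefix + encoded bytes
            }
            Console.WriteLine(BitConverter.ToString(stringStream.ToArray()));
            // 05-68-65-6C-6C-6F

            var charStream = new MemoryStream();
            using (var writer = new BinaryWriter(charStream, Encoding.UTF8))
            {
                writer.Write("hello".ToCharArray()); // no prefix, just the bytes
            }
            Console.WriteLine(BitConverter.ToString(charStream.ToArray()));
            // 68-65-6C-6C-6F
        }
    }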
sizeof(char) == 2, so a single character is 2 bytes long; however, the size of the stream depends on the encoding (ASCII, UTF-8, etc.) and on the characters themselves: aaa (3 English letters a) can be shorter than ааа (3 Russian letters а). – Hermitage
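For instance, a quick sketch with Encoding.GetByteCount shows the difference the comment describes:

    using System;
    using System.Text;

    class EncodedLengths
    {
        static void Main()
        {
            Console.WriteLine(Encoding.UTF8.GetByteCount("aaa"));  // 3: Latin letters take 1 byte each in UTF-8
            Console.WriteLine(Encoding.UTF8.GetByteCount("ааа"));  // 6: Cyrillic letters take 2 bytes each in UTF-8
        }
    }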
char in C# is UTF-16, which means each character takes 2 bytes. The size of an integral type depends on the type itself: it can be a single byte (Byte), a Short (16-bit), an int (32-bit), or a long (64-bit). – Poulson
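These sizes can be confirmed with sizeof, which works on the built-in types in a safe context (a minimal sketch):

    using System;

    class TypeSizes
    {
        static void Main()
        {
            Console.WriteLine(sizeof(byte));   // 1
            Console.WriteLine(sizeof(short));  // 2
            Console.WriteLine(sizeof(char));   // 2 (one UTF-16 code unit)
            Console.WriteLine(sizeof(int));    // 4
            Console.WriteLine(sizeof(long));   // 8
        }
    }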
When people use BinaryWriter, they use it for badly considered reasons. What is it that you need to do here? Would a binary serializer be a better choice? (There are several to choose from.) – Colonnade
It's documented on each Write method - not really sure what other information you need - i.e. BinaryWriter.Write(Int32): "Writes a four-byte signed integer to the current stream and advances the stream position by four bytes."... – Melise