Why is my ASCII char to int conversion failing?

According to the chart here:

http://www.idautomation.com/barcode-faq/code-128/

This character:

Ë

equates to the value 103.

Yet this code:

string barcode = textBoxRawCode128.Text.Trim(); 
. . .
int runningTotal = ConvertToASCIIInt(barcode[0]);
. . .

private int ConvertToASCIIInt(char valToConvertToASCII)
{
    const int ASCII_ADJUSTMENT_VAL = 32;
    return valToConvertToASCII - ASCII_ADJUSTMENT_VAL;
}

...when the value in the textbox (and thus of barcode) is "ËTry another string.", so that barcode[0] is 'Ë', returns a value of 171 instead of 103...???

And according to this chart: http://www.adams1.com/128table.html, the character corresponding to 103 is ‡, but when I set barcode to "‡Try another string.", the returned value is 8193...??? Curiouser and curiouser...
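Both surprising results drop straight out of the Unicode code points that .NET actually stores: a C# `char` is a UTF-16 code unit, not a position in a barcode font chart. A minimal sketch, using the two literal characters from the question:

```csharp
using System;

class CodepointDemo
{
    static void Main()
    {
        // 'Ë' is U+00CB = 203, so subtracting 32 gives 171, not 103.
        Console.WriteLine((int)'Ë');       // 203
        Console.WriteLine((int)'Ë' - 32);  // 171

        // '‡' (double dagger) is U+2021 = 8225, so 8225 - 32 = 8193.
        Console.WriteLine((int)'‡');       // 8225
        Console.WriteLine((int)'‡' - 32);  // 8193
    }
}
```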

Note: A related/preliminary post is "Is this code for calculating Code128 barcode check digits correct?"

Sartin answered 11/9, 2013 at 18:51 Comment(5)
You quote the USS Code-128 character-set value for Ë (your link). But .NET does not use the USS Code-128 character set; it uses Unicode. In Unicode the code point is hexadecimal 00CB; see the same table or the official Unicode chart. You might need a tool that converts between Unicode and the USS Code-128 character set. – Kong
IOW, this is like a Russian doll, or a vicious vortex. Odd/no wonder that somebody hasn't already solved this challenge with a method like GetCheckDigit(BarcodeType typ, string barcodeVal), where "typ" in this case would be BarcodeType.Code128 or so. "Odd" because it can't be that rare that it needs to be calculated, and "no wonder" because it's such a pain in the donkey. – Sartin
The page you linked also has a .NET Font Encoder. – Kong
Is there a reason I would want a .NET Font Encoder? – Sartin
Good question. When I wrote that comment I was under the impression that the DLL from that link also provided .NET methods for converting between ASCII and this Code 128 format. But I haven't tried it, and maybe my impression was incorrect. – Kong

Keep in mind that to find the correct number for a Code 128 symbol you have to subtract 32; therefore, to get the character value for the Code 128 symbol 103, you have to add 32, giving you 135, which is not 7-bit ASCII. Adams has it right, but since it's 'high ASCII' you get into the mess of code pages. So, depending on the language of the PC, and on whether the applications that touch the string use DBCS, Unicode, or 8-bit ASCII, you may see different characters, because the international standards differ. The character given by Russ Adams can be found if you bring up the Small Fonts typeface in the Windows Character Map application: look at the character at 0x87, which is 135 in decimal.

The IDAutomation people are using their own algorithm to arrive at a barcode character based on the letters you feed it, so if they say they need a 'Ë' to get a 103, then that's what they need. 'Ë' doesn't equate to a 103, it's just what gets their software to cough up a 103. This is all to work around the conversion from a numeric 7-bit standard to the vendor's method for producing bars.
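To illustrate that font-specific mapping (this is an assumption based on IDAutomation's published Code 128 font character set, not something stated in this thread): their fonts typically display symbol values 0-94 at code point value + 32 and values 95-106 at value + 100, which is exactly why value 103 shows up as 'Ë' (code point 203):

```csharp
using System;

class FontMapSketch
{
    // Hypothetical mapping for an IDAutomation-style Code 128 font:
    // symbol values 0-94 render at (value + 32),
    // symbol values 95-106 render at (value + 100).
    static char SymbolToFontChar(int symbolValue)
    {
        if (symbolValue < 0 || symbolValue > 106)
            throw new ArgumentOutOfRangeException(nameof(symbolValue));
        return (char)(symbolValue < 95 ? symbolValue + 32 : symbolValue + 100);
    }

    static void Main()
    {
        Console.WriteLine(SymbolToFontChar(103));       // Ë (Start A)
        Console.WriteLine((int)SymbolToFontChar(103));  // 203
    }
}
```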

Sadly, different symbologies do not use the same algorithms for encoding data or deriving checksums, so each barcode type has to have its own software.
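For Code 128 specifically, the check digit is a weighted sum mod 103: the start-code value plus each data symbol's value times its 1-based position. A minimal sketch, assuming the data is plain printable ASCII encoded entirely in Code B (symbol value = character code - 32, Start B = 104):

```csharp
using System;

class Code128CheckDigit
{
    // Standard Code 128 mod-103 checksum, assuming the whole string
    // is printable ASCII encoded in Code B (no code-set switching).
    static int CheckDigit(string data)
    {
        const int startB = 104;   // Start Code B symbol value
        int total = startB;       // the start code carries weight 1
        for (int i = 0; i < data.Length; i++)
            total += (data[i] - 32) * (i + 1);  // symbol value * position
        return total % 103;
    }

    static void Main()
    {
        // Well-known worked example: "Wikipedia" in Code B has check digit 88.
        Console.WriteLine(CheckDigit("Wikipedia"));  // 88
    }
}
```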

Duct answered 11/9, 2013 at 23:25 Comment(1)
Wholly purple cow, what a mess! – Sartin

Actually, the character you have shown has the value Ë = 203; lowercase 'g' is the character with value 103.

Hence 203 - 32 = 171.

Reference

Chat answered 11/9, 2013 at 19:07 Comment(4)
Yeah, I tested "g" too; that gives me 71, since 103 - 32 == 71. The result I get from my check-digit calculation is "gTry another string." with an unprintable (obscure, not obscene) character appended. – Sartin
Odd that "g" would be a start char; after all, what if somebody wanted to encode the value "go west young man" into a Code 128 barcode? The parser would consider the "g" a start character and take the value to be "o west young man"... wouldn't it? – Sartin
Not sure what you're trying to achieve. I usually convert a char to an int like this: int value = (int)ch. That should be enough. Am I missing something? – Chat
The difference between ASCII and the barcode value is 32, hence the subtraction. In THIS ascii chart ;) see that the Ë character is 211, which is outside the barcode's 128 values, which would make it unprintable. – Virgina

© 2022 - 2024 — McMap. All rights reserved.