In .NET 4.5 this cipher worked perfectly on both 32-bit and 64-bit architectures. Switching the project to .NET 4.6 breaks the cipher completely in 64-bit, while in 32-bit only an odd patch works around the issue.
In my method "DecodeSkill", SkillLevel is the only value that breaks on .NET 4.6. The variables used here are read from a network stream and arrive encoded.
DecodeSkill (on .NET 4.5 this always returns the proper decoded value for SkillLevel)
private void DecodeSkill()
{
    SkillId = (ushort) (ExchangeShortBits((SkillId ^ ObjectId ^ 0x915d), 13) + 0x14be);
    SkillLevel = ((ushort) ((byte)SkillLevel ^ 0x21));
    TargetObjectId = (ExchangeLongBits(TargetObjectId, 13) ^ ObjectId ^ 0x5f2d2463) + 0x8b90b51a;
    PositionX = (ushort) (ExchangeShortBits((PositionX ^ ObjectId ^ 0x2ed6), 15) + 0xdd12);
    PositionY = (ushort) (ExchangeShortBits((PositionY ^ ObjectId ^ 0xb99b), 11) + 0x76de);
}
ExchangeShortBits
private static uint ExchangeShortBits(uint data, int bits)
{
    // 16-bit rotate right by 'bits'
    data &= 0xffff;
    return (data >> bits | data << (16 - bits)) & 65535;
}
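Note: ExchangeShortBits is a plain 16-bit rotate right, so the decode-side rotate by 13 cancels the encode-side rotate by 3 (13 + 3 = 16). A quick check using the method above (the value is just an example):

uint value = 0x2121;                           // arbitrary 16-bit test value
uint enc = ExchangeShortBits(value, 3);        // encode-side rotate
uint dec = ExchangeShortBits(enc, 13);         // decode-side rotate
Console.WriteLine(dec == value);               // True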
DecodeSkill (patched for .NET 4.6 32-bit; notice the "var patch = SkillLevel" assignment)
private void DecodeSkill()
{
    SkillId = (ushort) (ExchangeShortBits((SkillId ^ ObjectId ^ 0x915d), 13) + 0x14be);
    var patch = SkillLevel = ((ushort) ((byte)SkillLevel ^ 0x21));
    TargetObjectId = (ExchangeLongBits(TargetObjectId, 13) ^ ObjectId ^ 0x5f2d2463) + 0x8b90b51a;
    PositionX = (ushort) (ExchangeShortBits((PositionX ^ ObjectId ^ 0x2ed6), 15) + 0xdd12);
    PositionY = (ushort) (ExchangeShortBits((PositionY ^ ObjectId ^ 0xb99b), 11) + 0x76de);
}
Assigning the result to an extra local (var patch) in addition to SkillLevel makes SkillLevel come out correct, but only in 32-bit. Remove the patch and the value is always incorrect. In 64-bit the value is always incorrect, even with the patch.
I've tried applying MethodImplOptions.NoOptimization and MethodImplOptions.NoInlining to the decode method, thinking it would make a difference.
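Applied roughly like this (just a sketch of the attribute placement):

using System.Runtime.CompilerServices;

[MethodImpl(MethodImplOptions.NoInlining | MethodImplOptions.NoOptimization)]
private void DecodeSkill()
{
    // ... same body as above ...
}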
Any ideas as to what could cause this?
Edit: I was asked to give an example of input, good output, and bad output. This is from an actual usage scenario: the values were sent from the client and were properly decoded by the server using the "patch" on .NET 4.6.
Input:
ObjectId = 1000001
TargetObjectId = 2778236265
PositionX = 32409
PositionY = 16267
SkillId = 28399
SkillLevel = 8481
Good Output:
TargetObjectId = 0
PositionX = 302
PositionY = 278
SkillId = 1115
SkillLevel = 0
Bad Output:
TargetObjectId = 0
PositionX = 302
PositionY = 278
SkillId = 1115
SkillLevel = 34545
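Working through just the SkillLevel line by hand with this input shows why 0 is the expected value:

ushort skillLevel = 8481;                              // 0x2121, from the input above
ushort decoded = (ushort) ((byte) skillLevel ^ 0x21);  // (byte) 0x2121 = 0x21; 0x21 ^ 0x21 = 0
Console.WriteLine(decoded);                            // prints 0, matching the good output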
Edit #2:
I should have included this part from the start; it's definitely an important piece of the problem.
EncodeSkill (Timestamp is Environment.TickCount)
private void EncodeSkill()
{
    SkillId = (ushort) (ExchangeShortBits(ObjectId - 0x14be, 3) ^ ObjectId ^ 0x915d);
    SkillLevel = (ushort) ((SkillLevel + 0x100*(Timestamp%0x100)) ^ 0x3721);
    Arg1 = MathUtils.BitFold32(SkillId, SkillLevel);
    TargetObjectId = ExchangeLongBits(((TargetObjectId - 0x8b90b51a) ^ ObjectId ^ 0x5f2d2463u), 19);
    PositionX = (ushort) (ExchangeShortBits((uint) PositionX - 0xdd12, 1) ^ ObjectId ^ 0x2ed6);
    PositionY = (ushort) (ExchangeShortBits((uint) PositionY - 0x76de, 5) ^ ObjectId ^ 0xb99b);
}
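Note the asymmetry for SkillLevel: the encode adds a timestamp byte into the high byte and XORs with 0x3721, while the decode keeps only the low byte and XORs with 0x21 (the low byte of 0x3721), so the timestamp byte is simply thrown away. A rough round-trip sketch, assuming the raw skill level fits in one byte:

ushort original = 0x09;                                                     // example raw skill level (< 0x100)
int timestamp = Environment.TickCount;
ushort encoded = (ushort) ((original + 0x100*(timestamp%0x100)) ^ 0x3721);  // encode-side line
ushort decoded = (ushort) ((byte) encoded ^ 0x21);                          // decode-side line
Console.WriteLine(decoded == original);                                     // True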
BitFold32
public static int BitFold32(int lower16, int higher16)
{
    // packs two 16-bit halves into one 32-bit value
    return (lower16) | (higher16 << 16);
}
ExchangeLongBits
private static uint ExchangeLongBits(uint data, int bits)
{
    // 32-bit rotate right by 'bits'
    return data >> bits | data << (32 - bits);
}
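Like the 16-bit version, this is a plain 32-bit rotate right; the decode-side rotate by 13 cancels the encode-side rotate by 19 (13 + 19 = 32):

uint target = 2778236265;                      // TargetObjectId from the input above
uint enc = ExchangeLongBits(target, 19);       // encode-side rotate
uint dec = ExchangeLongBits(enc, 13);          // decode-side rotate
Console.WriteLine(dec == target);              // True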
SkillID and EncodeSkill references SkillId. Is this pseudocode? – Genevieve