In C#, I can cast things to 8-bit signed ints like so:
(sbyte)arg1;
When arg1 = 2, the cast returns 2 as well. However, casting 128 returns -128, since the value wraps around; more specifically, casting 251 returns -5.
What's the best way to emulate this behavior?
Edit: Found a duplicate question: Typecasting in Python
s8 = (i + 2**7) % 2**8 - 2**7  # convert to signed 8-bit
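Wrapped in a small helper (the name `to_int8` is my own), the expression reproduces the C# `(sbyte)` behavior described above: adding 128 shifts the range, `% 256` wraps it, and subtracting 128 shifts it back into [-128, 127].

```python
def to_int8(i):
    """Emulate C#'s (sbyte) cast: wrap an integer into [-128, 127]."""
    return (i + 2**7) % 2**8 - 2**7

print(to_int8(2))    # 2
print(to_int8(128))  # -128
print(to_int8(251))  # -5
```

Because Python's `%` always returns a non-negative result for a positive modulus, this works for negative inputs too (e.g. `to_int8(-129)` gives 127).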