12-27-2023, 01:37 PM
In this case, it doesn't appear to be CONST itself that's off, but something more internal, which we're just seeing via CONST here.
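For example, here's roughly what the symptom looks like from the user's side -- a minimal sketch (the constant name FOO is just a placeholder I'm using here):
Code:
Const FOO = &HFFFF
Print Hex$(FOO) ' presumably prints FFFFFFFFFFFFFFFF rather than FFFF, per the corruption shown below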
For hex values, CONST doesn't try to interpret the best value for us when calculating the number; it just assigns it to an _UNSIGNED _INTEGER64 -- which should hold the largest hex values that we push into it, with no problems.
The problem is, at that internal step, the value is now getting corrupted:
Code:
t~&& = &HFFFF ' assign hex FFFF to an _UNSIGNED _INTEGER64
Print Hex$(t~&&) ' we'd expect FFFF back
Quote:
FFFFFFFFFFFFFFFF
Those aren't the same two values at all! It looks like &HFFFF is being read as a signed 16-bit -1 first and then sign-extended on its way into the unsigned 64-bit variable.
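Until the internal handling gets fixed, one possible workaround -- assuming QB64 accepts its type suffixes on &H literals, which is my understanding but worth verifying -- is to tag the literal itself as unsigned 64-bit so no sign extension happens in the first place:
Code:
t~&& = &HFFFF~&& ' the ~&& suffix on the literal forces an unsigned 64-bit read
Print Hex$(t~&&) ' should now print FFFF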