Posts: 3,965 | Threads: 176 | Joined: Apr 2022 | Reputation: 219
Look, I know I am not going to change computer science with this stuff; I'm just saying that if you want consistency between number bases, this is the way I would go. You don't need two sets of rules, one for base 10 and another for everything that isn't base 10.
Sorry, this isn't even about QB64pe versions... we'd have to go back in time and waste one decimal place for the sign.
b = b + ...
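A minimal sketch of the two readings being debated (my own illustration, not code from this thread; it assumes QB64-PE's & and ~& literal suffixes and the _UNSIGNED LONG type):
Dim signedL As Long, unsignedL As _Unsigned Long
signedL = &HFFFFFFFF&     ' & suffix: literal typed as signed LONG, all 32 bits set reads as -1
unsignedL = &HFFFFFFFF~&  ' ~& suffix: the same bits typed as _UNSIGNED LONG read as 4294967295
Print signedL, unsignedL  ' same hex digits, two different decimal values
The whole "one set of rules" argument is about which of those two readings an unsuffixed &H literal should get.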
Posts: 128 | Threads: 12 | Joined: Apr 2022 | Reputation: 14
To be honest, I've never been really interested in whether &h represents a positive or a negative decimal number. That decimal representation is hardly ever relevant to your program.
Whenever I used &h in my code, it was for better readability of that specific value.
Mostly with things like:
Color &hFF008080
SendCommand Chr$(&hFA)
If StatusBits And &h08 Then
I normally expect a programming language to type-cast automatically to the needed format.
For example, if the Color function were defined as something like 'void Color (uint32_t fcolor, uint32_t bcolor)', it would not matter what &hFFFFFFFF represents in decimal, because it would be cast to an unsigned long anyway.
Same would go for Chr$(uint8_t)
etc.
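That "the parameter type decides" idea can be sketched directly in QB64-PE. SetTint below is a made-up SUB of my own, not anything from the library; the point is only that once the parameter is declared _UNSIGNED LONG, the caller never needs to care what &hFF008080 would be as a signed decimal:
SetTint &HFF008080~&           ' ~& suffix keeps the literal unsigned on the way in
Print Asc(Chr$(&HFA))          ' 250 - Chr$ only cares about the byte value
If 12 And &H08 Then Print "bit 3 set"
End

Sub SetTint (tint As _Unsigned Long)
    Print Hex$(tint)           ' FF008080, whatever its decimal reading was at the call site
End Sub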
Not sure if this helps with a way forward?
45y and 2M lines of MBASIC>BASICA>QBASIC>QBX>QB64 experience