01-07-2024, 07:24 PM
To be honest, I've never been really interested in whether a &h value represents a positive or negative decimal number. The decimal reading is hardly ever relevant for your program, as the sketch below the examples illustrates.
Whenever I used &h in my code, it was for better readability of that specific value. Mostly with things like:
Color &hFF008080
SendCommand Chr$(&hFA)
If StatusBits And &h08 Then
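To make that concrete, here is a minimal sketch in C (C rather than BASIC, since the signatures discussed further down are C-style; the variable names are mine). The same 32 bits only change their decimal reading depending on signedness, and a mask test never cares:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t bits = 0xFF008080u;     /* the color value from above */
    printf("%u\n", bits);            /* 4278222976 when read as unsigned */
    printf("%d\n", (int32_t)bits);   /* -16744320 when read as signed */

    /* a mask test like "StatusBits And &h08" only looks at the bits,
       so the decimal reading is irrelevant either way */
    uint32_t StatusBits = 0x0C;
    if (StatusBits & 0x08)
        printf("bit 3 is set\n");
    return 0;
}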
I normally expect a programming language to automatically cast to the needed type.
For example, if the Color function were defined as something like 'void Color(uint32_t fcolor, uint32_t bcolor)', it would not matter what &hFFFFFFFF represents decimally, because it would be cast to unsigned long anyway.
The same would go for Chr$(uint8_t), and so on.
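A minimal C sketch of that idea (the function bodies are invented; only the Color signature follows the one quoted above, and SendCommand is a made-up analogue of Chr$ output):

#include <stdio.h>
#include <stdint.h>

/* hypothetical stand-ins, made up here just to show the
   parameter type doing the conversion work */
void Color(uint32_t fcolor, uint32_t bcolor) {
    printf("fg=%08X bg=%08X\n", fcolor, bcolor);
}

void SendCommand(uint8_t cmd) {
    printf("cmd=%02X\n", cmd);       /* like Chr$(&hFA): one byte */
}

int main(void) {
    Color(0xFF008080, 0xFF000000);           /* "positive" reading */
    Color((uint32_t)-16744320, 0xFF000000);  /* "negative" reading, same bits */
    SendCommand(0xFA);
    return 0;
}

Either call to Color receives the identical bit pattern FF008080; the caller's signed-vs-unsigned view of the literal never reaches the function.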
Not sure if this helps toward a way forward?
45y and 2M lines of MBASIC>BASICA>QBASIC>QBX>QB64 experience