11-17-2025, 02:55 AM
I don't think you're really seeing a bug here. You're getting the mouse coordinates properly; you're just doing the math wrong when converting them to a row.
For example, let's say the font is 8 pixels tall.
From Y0 to Y7, that's row 1.
From Y8 to Y15, that's row 2.
Your math is simply:
row = INT(_MouseY / _FontHeight)
So when the mouse is at pixel 0, it's going to report being in row INT(0 / 8), or 0.
Now when it's at pixel 8, it's going to report being in row INT(8 / 8), or 1.
Mouse starts at base 0, but row counting starts on the first row.
You basically need to offset the math there to adjust for that +1 base difference.
(Note that in SCREEN 0, this offset is *NOT* needed as screen 0 gives us the row value automatically so it begins with a base value of 1, whereas mouse begins with a base value of 0.)
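Here's a minimal sketch of the corrected math in a graphics screen. The screen size and font here are just example assumptions; the key line is the integer divide plus the +1 base offset:

```
' Minimal sketch: map _MOUSEY (base 0) to a text row (base 1).
SCREEN _NEWIMAGE(640, 480, 32)
DIM row AS LONG
DO
    WHILE _MOUSEINPUT: WEND ' drain the mouse buffer
    row = _MOUSEY \ _FONTHEIGHT + 1 ' integer divide, then +1 for the base difference
    LOCATE 1, 1: PRINT "Mouse is over text row:"; row
    _LIMIT 30
LOOP UNTIL _KEYHIT = 27 ' Esc to quit
```

With an 8-pixel font, Y0 through Y7 all give 0 \ 8 + 1 = 1, and Y8 through Y15 give 1 + 1 = 2, which matches the rows as we count them.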