I don't expect this is currently possible, but figured it can't hurt to ask...
Is this supported in QB64PE or doable via some API hack?
I'm thinking of applications like
- input for multiplayer games, e.g. a pong game where each player owns a zone of the screen, such as the top/bottom/left/right edges or the top-left/top-right/bottom-left/bottom-right quadrants. If we detect a screen press in a given player's zone, we use it to reposition that player's paddle (see the sketch after this list).
- input for musical instruments, e.g. a simple synth with an onscreen keyboard, plus slider controls to bend notes and change various sound parameters, buttons to switch waveforms, etc., all in real time and simultaneously. (And later, once QB64PE gets multi-channel audio, polyphony.)
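
Even without true multitouch, the quadrant idea is easy to prototype with QB64PE's existing mouse functions, since (at least on Windows) a single touchscreen press is normally promoted to a left-click. Here's a rough single-touch sketch of the zone mapping I mean; it assumes your touchscreen delivers taps as mouse events, and it only demonstrates the zone logic, not multiple simultaneous contacts:

```
SCREEN _NEWIMAGE(800, 600, 32)

DO
    DO WHILE _MOUSEINPUT ' drain pending mouse/touch events
        IF _MOUSEBUTTON(1) THEN ' left button = (assumed) touchscreen tap
            ' map the press to one of four quadrants
            IF _MOUSEX < _WIDTH \ 2 THEN
                IF _MOUSEY < _HEIGHT \ 2 THEN quadrant = 1 ELSE quadrant = 3
            ELSE
                IF _MOUSEY < _HEIGHT \ 2 THEN quadrant = 2 ELSE quadrant = 4
            END IF
            LOCATE 1, 1: PRINT "Press in quadrant:"; quadrant
        END IF
    LOOP
    _LIMIT 60 ' don't burn a whole core polling
LOOP UNTIL _KEYHIT = 27 ' Esc to quit
```

The missing piece is exactly the multitouch part: _MOUSEX/_MOUSEY can only report one contact point, so two players pressing at once would fight over the same coordinates.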
I've never done any kind of programming for multitouch input, but multitouch screens are common enough nowadays that I would think there's pretty standard OS-level support for them?
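
For reference, here's the kind of Windows-side API hack I imagine, pieced together from the Win32 docs (untested, and only half the job): RegisterTouchWindow in user32 asks Windows to send raw WM_TOUCH messages to the program window instead of promoting touches to mouse clicks. Actually reading those messages would still need a window-procedure hook (SetWindowLongPtr plus a callback), which I don't think plain BASIC can supply, so something would have to be done on the C side:

```
' Speculative sketch: register the QB64PE window for raw touch input.
' _WINDOWHANDLE gives the native window handle (Windows builds only).
DECLARE DYNAMIC LIBRARY "user32"
    FUNCTION RegisterTouchWindow& (BYVAL hWnd AS _OFFSET, BYVAL ulFlags AS _UNSIGNED LONG)
END DECLARE

IF RegisterTouchWindow(_WINDOWHANDLE, 0) THEN
    PRINT "Registered for WM_TOUCH messages (now what?)"
ELSE
    PRINT "Registration failed (no touch digitizer, or pre-Windows 7?)"
END IF
```

If someone who knows the QB64PE internals can say whether hooking the message loop like that is feasible (or already planned), I'd love to hear it.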