Context: I'm trying to create a fader-like widget that can have multiple instances in the same view, each of which can be controlled simultaneously by different fingers.
I want to use Qt's gesture recognition system, but I also need some functionality above and beyond the standard Qt::PanGesture. To this end, I've subclassed both QGesture and QGestureRecognizer. In FooGestureRecognizer::recognize(...), I'm currently intercepting both QMouseEvents and QTouchEvents (for the time being, at least).
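For reference, the recognizer is shaped roughly like the sketch below. This is an illustrative outline only (the `position` member and the specific `Result` values returned at each stage are assumptions, not my real implementation):

```cpp
#include <QGesture>
#include <QGestureRecognizer>
#include <QMouseEvent>
#include <QTouchEvent>

class FooGesture : public QGesture
{
public:
    explicit FooGesture(QObject *parent = nullptr) : QGesture(parent) {}
    QPointF position; // hypothetical per-gesture state
};

class FooGestureRecognizer : public QGestureRecognizer
{
public:
    QGesture *create(QObject *target) override
    {
        return new FooGesture;
    }

    Result recognize(QGesture *gesture, QObject *watched, QEvent *event) override
    {
        FooGesture *g = static_cast<FooGesture *>(gesture);
        switch (event->type()) {
        // Both mouse and touch variants are intercepted here, as described above.
        case QEvent::TouchBegin:
        case QEvent::MouseButtonPress:
            return MayBeGesture;
        case QEvent::TouchUpdate:
        case QEvent::MouseMove:
            // ...update g->position from the event...
            return TriggerGesture;
        case QEvent::TouchEnd:
        case QEvent::MouseButtonRelease:
            return FinishGesture;
        default:
            return Ignore;
        }
    }
};
```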
On Windows I only receive QMouseEvents; I handle them and everything works as expected (though with a physical mouse I obviously don't have to deal with the multitouch problem). The events I receive, in order:
- QEvent::MouseButtonPress
- A string of QEvent::MouseMoves
- QEvent::MouseButtonRelease
On Android, I receive a strange mix of QMouseEvents and QTouchEvents, in order:
- QEvent::TouchBegin
- QEvent::MouseButtonPress
- QEvent::MouseMove (with no actual change in position)
- Another QEvent::MouseButtonPress (not sure why I need another one)
- My actual string of QEvent::MouseMoves, as expected
- QEvent::MouseButtonRelease
The global attribute Qt::AA_SynthesizeMouseForUnhandledTouchEvents is true by default. Turning it off changes the events I receive to:
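For completeness, this is how I'm turning the attribute off (a minimal sketch assuming a standard QApplication-based main; setting it before constructing the application object is a precaution, not necessarily a requirement):

```cpp
#include <QApplication>

int main(int argc, char *argv[])
{
    // Disable synthesis of mouse events from unhandled touch events.
    // This attribute is on by default.
    QCoreApplication::setAttribute(Qt::AA_SynthesizeMouseForUnhandledTouchEvents, false);

    QApplication app(argc, argv);
    // ...set up widgets, register the gesture recognizer, etc...
    return app.exec();
}
```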
- QEvent::TouchBegin
...nothing else.
Here's a precursor question then: What can I do inside QGestureRecognizer::recognize() to tell Qt that I'm handling the QEvent::TouchBegin, and that it doesn't need to synthesize a QEvent::MouseButtonPress for me? event->accept() doesn't appear to make any difference.
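One thing I've considered (an assumption on my part, not a verified fix) is that the synthesis decision may key off whether the target *widget* accepts the touch event, rather than anything the recognizer does, so I'd accept it in the widget's own event() override:

```cpp
#include <QEvent>
#include <QWidget>

class FaderWidget : public QWidget
{
public:
    explicit FaderWidget(QWidget *parent = nullptr) : QWidget(parent)
    {
        // Opt in to receiving raw touch events at all.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e) override
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd:
            // Mark the touch as handled, hopefully suppressing the
            // synthesized mouse events.
            e->accept();
            return true;
        default:
            return QWidget::event(e);
        }
    }
};
```

I don't know whether this interacts sensibly with the gesture machinery, which is partly why I'm asking.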
The actual question: If (as it appears) Qt is synthesizing QMouseEvents from QTouchEvents, why do I see QEvent::MouseMove and QEvent::MouseButtonRelease but never QEvent::TouchUpdate or QEvent::TouchEnd?
Code is available, but in the interests of conciseness I've not included it here. Please ask if needed.