If I set Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;, will a MouseEvent.CLICK still be dispatched when the user taps, or will ONLY a TouchEvent.TOUCH_TAP event be dispatched?
(on a multitouch-supported device)
Actually, mouse events are dispatched in this case for the first point of contact. That's how UI elements not suited for touch input continue to work on touch devices.
At the very least, MOUSE_DOWN and MOUSE_UP are dispatched right after TOUCH_BEGIN and TOUCH_END, which can be quite annoying at times.
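For illustration, here is a minimal sketch (assuming it runs somewhere with a stage reference, e.g. a document class constructor) showing both event families firing for the primary touch point when inputMode is TOUCH_POINT:
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;
import flash.events.MouseEvent;

Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;

stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void {
    trace("TOUCH_BEGIN, primary point:", e.isPrimaryTouchPoint);
});
stage.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
    // Fires right after TOUCH_BEGIN, but only for the first point of contact.
    trace("MOUSE_DOWN (synthesized from the primary touch)");
});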
Finally found the answer to this; sorry Stack Overflow, wasn't trying to spam!
MultitouchInputMode.TOUCH_POINT: Use this mode if you are interested only in touch events and no mouse or gesture events. You can use this mode to synthesize your own gestures if you want to support gestures that are not supported by the runtime, or if you need to support both multitouch and gestures. (http://www.adobe.com/devnet/flash/articles/multitouch_gestures.html)
In case anyone else finds the native touch implementation falls short there is also the following which may be worth looking into:
Gestouch: NUI gestures detection framework for mouse, touch and multitouch AS3 development.
Gestouch is an ActionScript library/framework that helps you deal with single- and multitouch gestures for building a better NUI (Natural User Interface).
https://github.com/fljot/Gestouch
Related
I ported some code to MonoGame and ran into a problem: a hold gesture isn't recognized until I make a move, so in order to get a hold gesture I need to press with my finger/mouse, wait for some time, and then move the finger/mouse a little bit. This issue reproduces on both the device and the emulator. I don't have this problem when using the XNA library on the same device.
The code is simple:
while (TouchPanel.IsGestureAvailable)
{
    GestureSample originalGesture = TouchPanel.ReadGesture();
    ...
Is there any common solution other than emulating a hold gesture by processing TouchLocation?
Why not just check TouchLocation.State for Pressed? If it's pressed for more than one draw cycle, then perhaps assume it's a hold gesture.
That's what I do in my MonoGame code: check TouchLocation.State for TouchLocationState.Pressed and TouchLocationState.Released.
Creating an application that requires BOTH gesture (swipe) support as well as simple touch events. I understand that one limitation of the built-in touch support in ActionScript is that you must choose either gesture OR touch events as input.
So I was wondering: can you easily simulate gesture events using the TouchEvent.TOUCH_BEGIN + TouchEvent.TOUCH_END events? Are they essentially the same thing as using gesture events?
I believe you'll be able to simulate the gestures appropriately by using the touch events. Each time a finger goes down, a temporary ID is assigned to it, so you can easily tell whether it is the first or second finger down. In terms of them being the same: not exactly, since the GestureEvents appear to rely on the mobile OS to report gestures rather than raw touches, so any calculation of deltas (or whatever else) is already handled by the OS instead of by you (with the overhead of the VM). http://help.adobe.com/en_US/as3/dev/WS1ca064e08d7aa93023c59dfc1257b16a3d6-7ffd.html
Try building gesture events out of touch events. There are lots of properties that can easily be converted / combined.
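As a rough sketch (not from the original answers), here is one way to synthesize a horizontal swipe from TOUCH_BEGIN/TOUCH_END; the 50-pixel threshold is an arbitrary assumption you would tune for your screen:
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;
import flash.utils.Dictionary;

Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;

var swipeStartX:Dictionary = new Dictionary(); // touchPointID -> stageX at TOUCH_BEGIN

stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void {
    swipeStartX[e.touchPointID] = e.stageX;
});

stage.addEventListener(TouchEvent.TOUCH_END, function(e:TouchEvent):void {
    var dx:Number = e.stageX - swipeStartX[e.touchPointID];
    delete swipeStartX[e.touchPointID];
    if (dx > 50) trace("swipe right");       // threshold is an assumption
    else if (dx < -50) trace("swipe left");
});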
I'm using a multi-touch screen to build an interactive presentation with Adobe AIR, and four people are going to use the screen at once.
I've done some testing with MouseEvent (which works fine with one user), and I think that replacing that event with my own event that can handle multiple users is the way to go, or did I miss something here?
There's some work involved in creating that event, so I'd love some input. Thanks.
You need to set the inputMode to use multi-touch, and then you'll want to listen for TouchEvents instead of MouseEvents (I believe that for a single point, at least, it will still dispatch the MouseEvent; not sure if this is true for multiple touches, though).
http://help.adobe.com/en_US/flex/mobileapps/WSe11993ea1bd776e5-13e27e4812a431dbafc-8000.html
flash.ui.Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
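From there, something along these lines (a sketch, assuming the code runs where a stage reference is available) lets each user's contacts be tracked independently by touchPointID:
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;
import flash.geom.Point;
import flash.utils.Dictionary;

Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;

var activePoints:Dictionary = new Dictionary(); // touchPointID -> latest position

stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void {
    activePoints[e.touchPointID] = new Point(e.stageX, e.stageY);
});
stage.addEventListener(TouchEvent.TOUCH_MOVE, function(e:TouchEvent):void {
    activePoints[e.touchPointID] = new Point(e.stageX, e.stageY);
    // ...update whatever object this particular user is dragging...
});
stage.addEventListener(TouchEvent.TOUCH_END, function(e:TouchEvent):void {
    delete activePoints[e.touchPointID];
});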
If mouse events are used instead of touch events on touch-enabled devices, does that limit "touch" input to one touch at a time?
If a mouse down event is currently in progress, will a following mouse down event simply not register, or will it cancel the previous one?
How are mouse events, historically used as single control pointers on desktop systems, handled on touch-enabled devices capable of several simultaneous touch points?
Event classes have a clone() function typically used to fire multiple events, so I'm assuming MouseEvent is not limited. However, my goal is actually to limit my application to one touch at a time (exclusive touch), and I'm not sure whether this will be handled automatically by the use of mouse events.
Mouse events are handled in the same manner on both single-touch and multi-touch devices. If you only want single-touch, use the MouseEvent events; if you want multi-touch, use the TouchEvent events. You can use the Multitouch.supportsTouchEvents property to determine touch support.
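A minimal sketch of that decision (assuming a stage reference): MouseEvent effectively gives you one pointer at a time, while TouchEvent reports every contact.
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;
import flash.events.MouseEvent;

if (Multitouch.supportsTouchEvents) {
    // Multi-touch path: every contact arrives with its own touchPointID.
    Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
    stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void {
        trace("touch point", e.touchPointID, "down");
    });
} else {
    // Single-pointer path: mouse events give you one pointer at a time.
    stage.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
        trace("single pointer down");
    });
}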
I have designed a gaming kiosk app in AS3.
I am using it on a Sony VAIO L PC (like HP's TouchSmarts) running Windows 7.
The app doesn't need any multi-touch gestures (only single-touch clicks and drags), so I am using mouse events.
Everything is fine (including mouse click and move events), except that a single touch to the screen (with no move) doesn't fire a mouse down; it is fired only after a small move of the finger.
Outside the app, on my desktop, I can see that the small Windows 7 cursor jumps immediately to where a finger is placed, meaning this issue isn't a hardware or Windows problem, but rather a matter of how the Flash app internally receives "translated" touch-to-mouse events from the OS.
For example, in the Windows Solitaire game, a simple touch to the screen immediately highlights the touched card.
In my app, a button will change to the down state only if I touch it and also move my finger slightly (click events - down and up - are triggered fine).
Shouldn't the MOUSE_DOWN event trigger exactly the way a TOUCH_BEGIN would in the new TouchEvent class?
Any ideas?
I encountered the same problem.
Setting the Multitouch.inputMode property to MultitouchInputMode.TOUCH_POINT (the default value is MultitouchInputMode.GESTURE) appears to make the MOUSE_DOWN event dispatch when the user touches the screen, rather than when they touch and move or touch and release.
If the cursor moves when they touch, then I assume the OS is just registering this as a MOUSE_MOVE and not a MOUSE_DOWN. Since it's a touchscreen, you could just treat a MOUSE_MOVE as a click, since the user probably isn't actually dragging their finger around generating real MOUSE_MOVE events.
Well, if they do actually drag their finger around for some interactions, then you could assume that a MOUSE_MOVE which suddenly places the cursor on a button (with no prior MOUSE_MOVE, i.e. no dragging) is a MOUSE_DOWN.
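A hedged sketch of that fallback heuristic (myButton and the 500 ms idle window are hypothetical placeholders; the inputMode fix below is the cleaner solution):
import flash.events.MouseEvent;
import flash.utils.getTimer;

var lastMoveTime:int = -1;

stage.addEventListener(MouseEvent.MOUSE_MOVE, function(e:MouseEvent):void {
    var now:int = getTimer();
    var cameFromNowhere:Boolean = (lastMoveTime < 0 || now - lastMoveTime > 500);
    lastMoveTime = now;
    // A move that "jumps" onto the button with no recent prior moves is most
    // likely a finger being placed down, so treat it as a press.
    // myButton is a placeholder for your own button display object.
    if (cameFromNowhere && myButton.hitTestPoint(e.stageX, e.stageY, true)) {
        trace("treat as MOUSE_DOWN on myButton");
    }
});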
Just bought a new touchscreen and encountered the problem again.
So the solution is to set Multitouch.inputMode to MultitouchInputMode.TOUCH_POINT by writing anywhere in your code:
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
Notice that it does not work when testing with Ctrl+Enter in the Flash editor (at least in CC 2015). So, for example, you need to open the .SWF separately in Flash Player.
EDIT: But it does work in Debug mode! (Ctrl+Shift+Enter)