Polymer Gestures tracking not working on touch devices - polymer

I have created a Polymer application that listens for the trackstart, track and trackend events. It uses these events to allow an SVG element to be dragged around. The events work correctly on desktop; however, on my Galaxy Nexus and Nexus 10 they are not fired.
I have looked at the source code of the polymer-gestures project and it seems the events are implemented on top of the touchstart etc. events in touch.js.
I'm using version 0.3.1 of platform.js, which I assume bundles polymer-gestures 0.3.1.
How can I get the trackstart, track and trackend events (hold would be nice too) working on my touchscreen devices?

It turns out the touch-action: none CSS value is what is needed, to make sure the browser leaves the touches to be handled with JavaScript instead of consuming them for scrolling.
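A minimal sketch of the fix (the id "drag-me" is a made-up name for illustration; the attribute form is included as well because some versions of the polyfill read a touch-action attribute rather than the CSS property):

```html
<!-- touch-action: none tells the browser not to use these touches for
     scrolling/zooming, so trackstart/track/trackend reach the page's
     JavaScript instead. -->
<svg id="drag-me" touch-action="none" style="touch-action: none">
  <!-- draggable content -->
</svg>
```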

Related

Forge Viewer select element by tapping screen not working correctly on Surface Pro using IE 11 (via cordova)

Using the Surface Pro touch screen to select an element in the viewer works sometimes; other times it seems to translate into a rotate/zoom action. In that case the viewer rotates/moves and the element is NOT selected.
When logging the events there are plenty of mouse down/up events along with mouse moves when it doesn't work. When select does work, a single click event occurs.
Double click seems to work ok.
Zoom/rotate etc using standard tools works ok.
Using the keyboard cover touch pad that you can get for the Surface pro to move and click works as expected and the element is selected.
Running the same application on a GETAC Windows 10 ruggedised tablet, selecting an element works correctly, so the issue seems specific to the Surface Pro.
Unable to change browsers, as Cordova apps use IE11 on Windows and that is currently fixed.
The only solution I can think of for the moment is to remove the standard navigation tools completely (somehow) and recreate a select-mode tool that would ignore any mouse moves and use the button-down event to select the element.
Any suggestions on how to fix this?
Tech Details:
Windows 10 Pro,
Surface Pro,
Browser: IE11,
Viewer version 2.11,
Other: WINJS81 cordova application
Thanks for any help
We've had problems with touch events on the Surface Pro in the past. It sounds like the edges of the touch screen are overly sensitive and are triggering extra touch points.
Does the problem happen if you are holding the device up, gripping with one hand, and using your other hand to touch/select a 3D object?
Could you try doing a selection again, but this time make sure your other hand is not holding the edge of the screen? (Perhaps place the device on the surface of a desk, so you are not holding it up.)
Found a fix for this issue. In viewer3D, in the base toolcontroller, there is the line
var kClickThreshold = 2;
This value is used further down in the code to determine whether a singleClick has occurred, by comparing the X/Y coordinates of the down and up events.
var deltaX = _downX - event.canvasX;
var deltaY = _downY - event.canvasY;
_downX = -1;
_downY = -1;
if( Math.abs(deltaX) <= kClickThreshold && Math.abs(deltaY) <= kClickThreshold )
_this.handleSingleClick( event );
If the movement is above this threshold, singleClick is not triggered; if it is below, it is.
Testing showed that increasing the value to around 5-7 made the selection of elements work consistently. (There is still a minor rotate or zoom at the same time as the select occurs, but I assume that would be another part of the viewer that would need adjusting.)
Unfortunately it does require editing the viewer code, but that is easy enough. I added code to overwrite the standard value if an external variable existed.
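A sketch of that idea, using the variable names from the viewer snippet above (the external override variable `CLICK_THRESHOLD_OVERRIDE` is a made-up name for illustration, not part of the viewer API):

```javascript
// Default threshold from the viewer source, but allow an external
// override so the value can be tuned without re-editing viewer3D.
var kClickThreshold = (typeof window !== 'undefined' && window.CLICK_THRESHOLD_OVERRIDE)
    ? window.CLICK_THRESHOLD_OVERRIDE
    : 2;

// True when the pointer moved little enough between the down and up
// events to count as a single click rather than a drag.
function isSingleClick(downX, downY, upX, upY, threshold) {
    var deltaX = downX - upX;
    var deltaY = downY - upY;
    return Math.abs(deltaX) <= threshold && Math.abs(deltaY) <= threshold;
}
```

With the default of 2, a touch that drifts 4 px registers as a drag; raising the threshold to 5-7 lets the same slightly-wobbly touch count as a click.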
It would be nice for future viewer development if more of these kinds of properties were exposed, so that directly editing the code is not required.
Still, it is good to have the source code to be able to debug at this level.
At a guess, the Surface Pro 4 must have a more sensitive touch system, or it could just be related to IE11 as well.

NPAPI plugin refresh issue in Google Chrome

Or ... "When to call pluginWindowMac::InvalidateWindow()"?
Apologies. It's hard to be brief here, it's a fairly specific test case.
I'm testing an NPAPI plugin (now adapted to FireBreath) under Firefox, Safari, Chrome and IE.
I'm testing the plugin under both Windows and OS X.
It's a video player using OpenGL for all rendering. So on OS X I'm using a CAOpenGLLayer
derived class, and supporting the Invalidating Core Animation model, which is used under
both Firefox and Chrome.
The plugin is running well in all cases and on both platforms, except for Chrome (v26.0.1410.65) on OS X, where I'm seeing a weird 'refresh' issue under both OS X 10.6
and 10.7.
It's as if Chrome is double-buffering the composited layer somewhere. It's always one
'frame' behind the one drawn via OpenGL. I've proved this by numbering each draw call
and drawing that number into the OpenGL render as well as logging it via NSLog. What's
on screen is always one behind the logged draw callback (I'm not rendering video at
this point - it's paused).
By using a specific keyboard event to trigger an extra call to pluginWindowMac::InvalidateWindow(),
I can force an external invalidate that will bring what's on screen into sync with
what was last rendered, but it will lose sync again on the next draw call (triggered
by the layer setNeedsDisplay).
I'm presently calling pluginWindowMac::InvalidateWindow() at the end of each draw
callback. In other words at the end of the layer 'drawInCGLContext()' call. This would
seem to be the only logical place to do it. But I'm beginning to suspect that under
Chrome this just isn't working? Clearly the call works - as proved by the keyboard
triggered InvalidateWindow, but maybe it doesn't work from inside drawInCGLContext?
The plugin is running fine under Firefox (the other browser that uses Invalidating
Core Animation). But maybe that simply doesn't rely on a working pluginWindowMac::InvalidateWindow()?
Has anyone else tripped over this issue with Chrome? Any workarounds discovered? About
all I can think of right now is some kind of timer driven event to trigger an extra
Invalidate.

AS3 GestureEvent vs TouchEvent.TOUCH_BEGIN +TouchEvent.TOUCH_END

Creating an application that requires BOTH gesture (swipe) support as well as simple touch events. I understand that one limitation of the built-in touch support in ActionScript is that you must choose either Gesture OR Touch events as input.
So I was wondering: can you easily simulate gesture events using the TouchEvent.TOUCH_BEGIN and TouchEvent.TOUCH_END events? Are they essentially the same thing as using Gesture events?
I believe you'll be able to simulate the gestures appropriately by using the touch events. Each time a finger goes down, a temporary ID is assigned to it, so you can easily tell whether this is the first or second finger down. They are not exactly the same, though: the GestureEvents seem to depend on the mobile OS to report gestures rather than raw touches, so any calculation of deltas (or whatever else) is handled by the OS already instead of by you (with the overhead of the VM). http://help.adobe.com/en_US/as3/dev/WS1ca064e08d7aa93023c59dfc1257b16a3d6-7ffd.html
Try building gesture events out of touch events. There are lots of properties that can easily be converted/combined.
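As a sketch of that delta-based approach, here is a minimal swipe classifier built from the begin/end coordinates of a single touch. It is shown in JavaScript since the arithmetic is language-neutral and ports directly to AS3; the minimum-distance threshold is an arbitrary assumption you would tune for your app:

```javascript
// Classify a completed touch from its TOUCH_BEGIN and TOUCH_END
// coordinates. Returns 'left', 'right', 'up', or 'down' for a swipe,
// or null when the movement is too small (i.e. a plain tap).
function classifySwipe(beginX, beginY, endX, endY, minDistance) {
    var dx = endX - beginX;
    var dy = endY - beginY;
    if (Math.abs(dx) < minDistance && Math.abs(dy) < minDistance) {
        return null; // movement too small on both axes: treat as a tap
    }
    // Dominant axis decides the swipe direction.
    if (Math.abs(dx) >= Math.abs(dy)) {
        return dx > 0 ? 'right' : 'left';
    }
    return dy > 0 ? 'down' : 'up';
}
```

In AS3 you would record the coordinates in a TOUCH_BEGIN handler (keyed by touchPointID so multiple fingers don't interfere) and call the classifier from the matching TOUCH_END handler.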

ActionScript / AIR - MouseEvent Limitations on Touch Enabled Devices?

If mouse events are used instead of touch events on touch enabled devices, does that limit "touch" input to one touch at a time?
If a mouse down event is currently in progress, will a following mouse down event simply not register or cancel the previous?
How are mouse events, historically used as single control pointers on desktop systems, handled on touch enabled devices capable of several simultaneous touch points?
Event classes have a clone() function typically used to fire multiple events, so I'm assuming MouseEvent is not limited. However, my goal is to actually limit my application to one touch at a time (exclusive touch), but I'm not sure if this will be automatically handled with the use of mouse events.
Mouse events are handled the same way on both single-touch and multi-touch devices. If you only want single-touch, use the MouseEvent events; if you want multi-touch, use the TouchEvent events. You can use the Multitouch.supportsTouchEvents property to determine touch support.

Adobe AIR: touch screen doesn't trigger mouse down event correctly

I have designed a gaming kiosk app in AS3.
I am using it on a Sony VAIO L PC (like HP's TouchSmarts) running Windows 7.
The app doesn't need any multi-touch gestures (only single-touch clicks and drags), so I am using mouse events.
Everything is fine (including mouse click and move events) except that a single touch to the screen (with no move) doesn't fire a mouse down; it is fired only after a small move of the finger.
Outside the app, on my desktop, I see that the small Windows 7 cursor jumps immediately to wherever a finger is placed, meaning this issue isn't a hardware or Windows problem but rather about how the Flash app internally receives "translated" touch-to-mouse events from the OS.
For example, in the Windows Solitaire game, a simple touch to the screen immediately highlights the touched card.
In my app, a button will change to the down state only if I touch it and also move my finger slightly (click events - down and up - are triggered fine).
Shouldn't the MOUSE_DOWN event trigger exactly like a TOUCH_BEGIN would in the new TouchEvent class?
Any ideas?
I encountered the same problem.
Setting the Multitouch.inputMode property to MultitouchInputMode.TOUCH_POINT (the default value is MultitouchInputMode.GESTURE) appears to make the MOUSE_DOWN event dispatch when the user touches the screen and not when they touch and move or touch and release.
If the cursor moves when they touch, then I assume the OS is just registering this as a MOUSE_MOVE and not a MOUSE_DOWN. Since it's a touchscreen, you could just consider MOUSE_MOVE a click since the user probably isn't actually dragging their finger around creating a real MOUSE_MOVE event.
Well, if they are actually dragging their finger around for stuff then you could assume a MOUSE_MOVE that suddenly places the cursor on a button (with no prior MOUSE_MOVE i.e. dragging), it's a MOUSE_DOWN.
Just bought a new touchscreen and encountered the problem again.
So the solution is to set Multitouch.inputMode to MultitouchInputMode.TOUCH_POINT by writing anywhere in your code:
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
Note that it does not work when testing with Ctrl+Enter in the Flash editor (at least in CC 2015), so you need to open the .SWF separately in Flash Player, for example.
EDIT: But it does work in Debug mode! (Ctrl+Shift+Enter)