We are experiencing an issue with Google Maps that mainly affects touch screens. We have been trying to resolve it, but so far without success.
We read in an article that the Google Maps API v3 does not support touch events. Is this true or false?
UPDATE
This issue was tracked in https://issuetracker.google.com/issues/35824421 and was solved in version 3.27 of the Google Maps JavaScript API in December 2016.
In my experience, the mousedown, mouseup, dragstart, dragend events work fine in place of touchstart, touchmove, touchend.
google.maps.event.addListener(myMap, "mousedown", function(event){...});
I'm pretty sure that gesture events are not going to be supported, since those are used for pinch-zoom functionality.
If you need gestures, you'd have to build your own recognizer by tracking mousedown events, storing them in an array, and then tracking positions to determine angles, distances, etc. A rough sketch of that idea follows.
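Here is a minimal sketch of one reading of that idea (myMap is assumed to be an existing google.maps.Map; the math is illustrative only):

var points = [];

google.maps.event.addListener(myMap, "mousedown", function (event) {
  // event.latLng holds the map position where the press occurred
  points.push(event.latLng);
  if (points.length >= 2) {
    var a = points[points.length - 2];
    var b = points[points.length - 1];
    var dLat = b.lat() - a.lat();
    var dLng = b.lng() - a.lng();
    // Angle and distance between the last two tracked points
    var angle = Math.atan2(dLat, dLng) * 180 / Math.PI; // degrees
    var distance = Math.sqrt(dLat * dLat + dLng * dLng); // rough, in degrees
    console.log("angle:", angle, "distance:", distance);
  }
});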
They aren't currently supported. See here for an interactive map that demonstrates the currently available events:
https://developers.google.com/maps/documentation/javascript/events#EventsOverview
This page also states that:
For a complete list of events, consult the Google Maps JavaScript API Reference.
Touch-related events are absent from that page, so they aren't supported.
Commenting on the accepted answer: the events that are supported are not strictly equivalent (a touch is clearly semantically different from a mouse click), and in my experience results can vary. For example, in some cases a touch results in an onclick event firing on Google Maps, and in other cases it results in a mouseover event firing. Some fall-through handling may therefore be required to reliably handle this type of occurrence when 'borrowing' these events to detect touch.
Here's a good article on handling touch with a listener:
https://medium.com/@david.gilbertson/the-only-way-to-detect-touch-with-javascript-7791a3346685
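As I understand it, the core idea of that article is to assume no touch support until a real touchstart actually fires, then flip a flag that your other handlers can branch on (the flag name here is illustrative):

window.addEventListener("touchstart", function onFirstTouch() {
  // From now on, click/mouseover handlers can fall through to
  // touch-specific behaviour by checking this flag.
  window.userIsTouching = true;
  window.removeEventListener("touchstart", onFirstTouch, false);
}, false);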
Using mousedown, mouseup with e.domEvent.pointerType == "touch" makes it possible to detect the event type:
// The same listener works on both the map and a marker,
// and for "mouseup" as well as "mousedown".
marker.addListener("mousedown", (e) => {
  if (e.domEvent.pointerType == "touch") console.log("touch");
  else console.log("mouse/pen");
});
Note: e.domEvent.pointerType doesn't work for mousemove (which, incidentally, is only available on the map and not on markers), so I would recommend using document.addEventListener() for mousemove/touchmove and flagging the current mouse/touch down/up state of the map/marker to determine the mousemove event type, as sketched below.
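A minimal sketch of that flagging approach (marker is assumed to be an existing google.maps.Marker; names are illustrative):

let isDown = false;          // current down/up state on the marker
let lastPointerType = null;  // pointer type seen on the last down/up event

marker.addListener("mousedown", (e) => {
  isDown = true;
  lastPointerType = e.domEvent.pointerType;
});
marker.addListener("mouseup", (e) => {
  isDown = false;
  lastPointerType = e.domEvent.pointerType;
});

// Moves are observed at the document level, because e.domEvent.pointerType
// is unavailable for the map's mousemove and markers expose no move event.
document.addEventListener("mousemove", () => {
  if (isDown && lastPointerType !== "touch") console.log("mouse/pen move");
});
document.addEventListener("touchmove", () => {
  if (isDown) console.log("touch move");
});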
Related
When selecting many elements in Autodesk's online viewer, I sometimes double-click accidentally and the model zooms to that element. I then have to zoom and pan back to the view I was comfortable with, which is very annoying. Is there a way to disable this functionality in the online viewer?
I found similar questions for the desktop apps (https://forums.autodesk.com/t5/autocad-forum/turn-off-wheel-double-click-zoom-extents/td-p/53144) but not the online viewer.
There is a fairly straightforward workaround for that: create a custom tool and absorb the double-click event. For a start, see the article Creating "Tools" for the View & Data API.
To absorb an event, implement the corresponding handler and return true:
this.handleDoubleClick = function(event, button) {
// ... do your stuff if needed ...
// event handled
return true;
};
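For completeness, here is a minimal sketch of wiring such a tool up (the tool name is illustrative, the method names follow the viewer's ToolInterface as I understand it, and a viewer instance is assumed to exist; treat this as a starting point rather than a drop-in solution):

function AbsorbDoubleClickTool() {
  this.getNames = function () { return ["absorb-dblclick-tool"]; };
  this.getName = function () { return "absorb-dblclick-tool"; };
  this.activate = function (name) {};
  this.deactivate = function (name) {};
  this.handleDoubleClick = function (event, button) {
    // Returning true absorbs the event, so the default
    // zoom-to-element behaviour never runs.
    return true;
  };
}

viewer.toolController.registerTool(new AbsorbDoubleClickTool());
viewer.toolController.activateTool("absorb-dblclick-tool");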
I am using the Map control in Windows Phone 8.
I need to implement a page where user can select his location using the map control.
I am trying to detect when the app is first manipulated by the user.
Some background info:
I saw that when the control is shown, it automatically centers the world map, and a CenterChanged event is raised.
I am not able to understand how ManipulationStarted, ManipulationDelta and ManipulationCompleted work.
The first time I drag, ManipulationStarted is not called, only ManipulationCompleted.
I could treat the first manipulation by the user as the second time CenterChanged fires, but that is a hack based on a guess, and I am not happy relying on it without a proper understanding of how these events work.
The Map control intercepts and handles Manipulation events, and as such you don't get all of them. Remember, once routed events are marked as e.Handled = true, they no longer bubble up.
Depending on your scenario, WP8 exposes the UseOptimizedManipulationRouting property, which might prove useful. Setting UseOptimizedManipulationRouting = false causes Map, Pivot, and other controls not to swallow events for nested controls.
If that doesn't help, have a look at the following Nokia Wiki article, where the author ran into the same problem as you did and used Touch.FrameReported to work around it: http://www.developer.nokia.com/Community/Wiki/Real-time_rotation_of_the_Windows_Phone_8_Map_Control
I have events working fine in Chrome and IE10 for Google Maps (API v3) and RichMarkers. The problem is that the same code borks on Firefox 19 with event undefined. So, this code works on Chrome and IE10...
google.maps.event.addListener( marker, 'mouseover', function(event) {
console.log(event);
});
But not on Firefox. Interestingly, attaching a CLICK event to the map object does work as you'd expect: the event object is visible within the called function in all browsers. So, does anyone have any idea how to fix this? I really need to pass the event object onwards, as I have functions that use it for positioning and so on.
Normally, I'd get around this using jQuery to attach the events, but this is not an option here.
Cheers
CT
There is no mouseover event: http://google-maps-utility-library-v3.googlecode.com/svn/trunk/richmarker/docs/reference.html
Apply the mouseover event to the content of the marker instead (the content must be a DOM node).
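For example (a sketch; map is assumed to be an existing google.maps.Map and the RichMarker library to be loaded):

// Build the marker content as a DOM node so it can receive DOM events.
var content = document.createElement("div");
content.innerHTML = "My marker";

var marker = new RichMarker({
  map: map,
  position: new google.maps.LatLng(-33.87, 151.21),
  content: content
});

// Attach mouseover to the content node instead of the marker itself.
content.addEventListener("mouseover", function (event) {
  console.log(event); // the event object is available in all browsers
});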
If I set Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;, will a MouseEvent.CLICK still be dispatched when the user taps, or will ONLY a TouchEvent.TOUCH_TAP event be dispatched?
(on a multitouch supported device)
Actually, mouse events are dispatched in this case for the first point of contact. That's how UI elements not suited for touch input continue to work on touch devices.
At least MOUSE_DOWN and MOUSE_UP are dispatched right after TOUCH_BEGIN and TOUCH_END which is very annoying sometimes.
Finally found the answer to this. Sorry Stack Overflow, I wasn't trying to spam!
MultitouchInputMode.TOUCH_POINT: Use this mode if you are interested only in touch events and no mouse or gesture events. You can use this mode to synthesize your own gestures if you want to support gestures that are not supported by the runtime, or if you need to support both multitouch and gestures. (http://www.adobe.com/devnet/flash/articles/multitouch_gestures.html)
In case anyone else finds the native touch implementation falls short there is also the following which may be worth looking into:
Gestouch: NUI gestures detection framework for mouse, touch and multitouch AS3 development.
Gestouch is an ActionScript library/framework that helps you deal with single- and multi-touch gestures for building better NUIs (Natural User Interfaces).
https://github.com/fljot/Gestouch
If mouse events are used instead of touch events on touch enabled devices, does that limit "touch" input to one touch at a time?
If a mouse down event is currently in progress, will a subsequent mouse down event simply not register, or will it cancel the previous one?
How are mouse events, historically used as single control pointers on desktop systems, handled on touch enabled devices capable of several simultaneous touch points?
Event classes have a clone() function typically used to fire multiple events, so I'm assuming MouseEvent is not limited. However, my goal is to actually limit my application to one touch at a time (exclusive touch), but I'm not sure if this will be automatically handled with the use of mouse events.
Mouse events are handled in the same manner on both single-touch and multi-touch devices. If you only want single-touch, use the MouseEvent events; if you want multi-touch, use the TouchEvent events. You can use the Multitouch.supportsTouchEvents property to determine touch support.