I ported some code to MonoGame and ran into a problem: a hold gesture doesn't fire until I make a move. To get a hold gesture I have to press the finger/mouse, wait for some time, and then move the finger/mouse a little bit. The issue reproduces on both the device and the emulator, and I don't have this problem when using the XNA library on the same device.
The code is simple:
while (TouchPanel.IsGestureAvailable)
{
GestureSample originalGesture = TouchPanel.ReadGesture();
...
Is there any common solution, other than emulating a hold gesture by processing TouchLocation data myself?
Why not just check TouchLocation.State for Pressed? If it stays pressed for more than one update cycle, you can reasonably assume it's a hold gesture.
That's what I do in my MonoGame code: check TouchLocation.State for TouchLocationState.Pressed and TouchLocationState.Released.
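To make the suggestion concrete, here is a sketch of that frame-by-frame hold detection. The real code would use MonoGame's TouchLocation in C#; this is a logic-only TypeScript sketch, and the 1-second hold time and 10-pixel move tolerance are assumptions, not MonoGame defaults.

```typescript
type Point = { x: number; y: number };

class HoldDetector {
  private start: Point | null = null;
  private elapsed = 0;
  private fired = false;

  // Call once per update with the current touch position (null = released)
  // and the frame time in seconds. Returns true on the frame the hold fires.
  update(touch: Point | null, dt: number): boolean {
    if (touch === null) {            // finger lifted: reset everything
      this.start = null;
      this.elapsed = 0;
      this.fired = false;
      return false;
    }
    if (this.start === null) {       // new press: remember where it began
      this.start = touch;
      this.elapsed = 0;
      this.fired = false;
      return false;
    }
    const dx = touch.x - this.start.x;
    const dy = touch.y - this.start.y;
    if (Math.hypot(dx, dy) > 10) {   // moved too far: not a hold,
      this.start = touch;            // treat it as a fresh press
      this.elapsed = 0;
      return false;
    }
    this.elapsed += dt;
    if (!this.fired && this.elapsed >= 1.0) {
      this.fired = true;             // report the hold exactly once
      return true;
    }
    return false;
  }
}
```

The point is that no movement is required: the timer runs while the touch simply stays put, which is exactly the behaviour the original gesture system isn't delivering.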
I have just started using Starling (a framework built on Stage3D) and I am working on a simple ping-pong game.
My problem is that the moment my mouse leaves the stage area everything just stops, and it resumes working when the mouse enters the area again. I guess this is some feature of the framework, but how can I control it? Is there some kind of event being fired, or some way to disable this behaviour?
If it's relevant, I am using the 'TouchEvent.TOUCH' event and the 'moved' phase. I'm happy to provide any other details if required.
Thanks. :-)
OK, I finally got the solution from the official Starling forums.
When my mouse left the stage, my 'touch' became null, hence the whole 'pausing' of the game code. I don't know why I didn't get an error of some kind.
Still, if someone is facing something similar: check that your values are not null.
As you may know, in "real" games you need to check whether a key is pressed at this moment. Right now I'm using events, and I remember whether the last event was a key-up or a key-down to know if a button is held down.
BUT I'm experiencing some lag (not network lag) in the game when several buttons are held down: new key presses are recognized a little too late. From DirectInput in DirectX and from LWJGL I know how nice and smooth game input can be.
a) So I ask: is there a way to poll keys directly in ActionScript 3? I don't think any extra package would be useful, since it would just do what I do right now.
b) If not: why wouldn't Adobe add a direct input feature to Flash, given how heavily it is used for games?
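AS3 has no direct key-polling API, so the usual workaround is exactly the event-driven bookkeeping described above, but kept in a table of all currently-pressed key codes that the game loop polls once per frame. Here is a logic-only TypeScript sketch of that table; in AS3 you would wire onKeyDown/onKeyUp to stage KeyboardEvent.KEY_DOWN/KEY_UP listeners.

```typescript
class KeyState {
  private down = new Set<number>();

  // Wire these to the KEY_DOWN / KEY_UP listeners. Auto-repeat fires
  // KEY_DOWN repeatedly for a held key, but adding to a Set is idempotent.
  onKeyDown(keyCode: number): void { this.down.add(keyCode); }
  onKeyUp(keyCode: number): void { this.down.delete(keyCode); }

  // Poll this in the frame loop instead of reacting to individual events,
  // so several simultaneously held keys are all seen on every frame.
  isDown(keyCode: number): boolean { return this.down.has(keyCode); }
}
```

Because the frame loop reads the whole table each tick, a newly pressed key is acted on on the very next frame regardless of how many other keys are already held, which avoids the "last event wins" lag.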
My intention is to have different gesture-based actions occur depending on whether the user swipes the screen with one or two fingers. I'm a little new to touch app development and I'm not seeing the necessary information in either Manipulation events or the Toolkit GestureService. I'm fine with using more low-level touch logic or manually tracking the number of touch contacts if necessary, I just need a little guidance on how to differentiate single from double-touch in a gesture.
This may help you resolve your problem: there is a property for reading the second touch point of a gesture.
GestureSample sample = TouchPanel.ReadGesture();
Vector2 first = sample.Position;
Vector2 second = sample.Position2;
The Position2 property returns the second touch position of the gesture.
I'm creating an application that requires BOTH gesture (swipe) support and simple touch events. I understand that one limitation of the built-in touch support in ActionScript is that you must choose either gesture OR touch events as the input mode.
So I was wondering: can you easily simulate gesture events using the TouchEvent.TOUCH_BEGIN and TouchEvent.TOUCH_END events? Are they essentially the same thing as using gesture events?
I believe you'll be able to simulate the gestures appropriately using the touch events. Each time a finger goes down, a temporary ID is assigned to it, so you can easily tell whether it is the first or second finger down. They are not exactly the same, though: GestureEvents are reported as gestures by the mobile OS itself, so any calculation of deltas (or whatever else) is handled by the OS rather than by your own code (with the overhead of the VM). http://help.adobe.com/en_US/as3/dev/WS1ca064e08d7aa93023c59dfc1257b16a3d6-7ffd.html
Try building gesture events out of touch events; there are lots of properties that can easily be converted or combined.
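As a concrete example of rebuilding a gesture from touch events: record the point in TOUCH_BEGIN (keyed by touchPointID), then classify the movement in TOUCH_END. This is a TypeScript sketch of just the classification math; the 50-pixel threshold is an assumption you would tune for screen density.

```typescript
type Dir = "left" | "right" | "up" | "down" | "none";

// Turn a begin point (x0, y0) and an end point (x1, y1) into a swipe
// direction, the way a built-in swipe gesture event would report it.
function swipeDirection(x0: number, y0: number, x1: number, y1: number,
                        minDistance = 50): Dir {
  const dx = x1 - x0;
  const dy = y1 - y0;
  if (Math.max(Math.abs(dx), Math.abs(dy)) < minDistance)
    return "none";                     // too short: a tap, not a swipe
  if (Math.abs(dx) >= Math.abs(dy))    // dominant axis wins
    return dx > 0 ? "right" : "left";
  return dy > 0 ? "down" : "up";       // screen y grows downward
}
```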
I'm trying to make a Google Maps-style interface for a design project. I've got the drag/drop and zoom functions working, but I also want it to react to gestures on a trackpad (MacBook). I assumed listening to the event.delta of a MouseEvent would do the trick, but somehow it's not working. What's wrong with my code?
stage.addEventListener(MouseEvent.MOUSE_WHEEL, onMouseWheelEvent);
function onMouseWheelEvent(event:MouseEvent):void {
tafelOrigineel_mc.y += event.delta;
}
I have loaded the Flash MouseEvents earlier in the document, so that shouldn't be the problem. Once I get this working, I'd like to use it on the x-axis too. Is that possible with the MOUSE_WHEEL event listener?
Thanks in advance.
This is a long-standing problem with Flash Player on Mac OS: the MOUSE_WHEEL event is not dispatched there. There are some workarounds that use JavaScript to detect wheel use (over the entire Flash content); if that isn't an issue for you, try one of those.
There is a list in this blog post:
http://www.impossibilities.com/v4/2009/03/06/flash-mousewheel-implementations-for-mac-os-x/
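The workarounds listed there all follow the same shape: capture the browser's wheel event over the SWF in JavaScript, reduce the raw delta to a small signed step, and forward it into Flash (typically via an ExternalInterface-registered callback). Browsers report wheel deltas on different scales, so the normalization step matters; here is a logic-only TypeScript sketch of it (the callback name is a made-up example).

```typescript
// Collapse the browser's raw wheel delta (multiples of 120 in old
// IE/WebKit, small line counts in Gecko) to a single -1 / 0 / +1 step,
// so the Flash side always moves by one consistent unit.
function normalizeWheel(rawDelta: number): -1 | 0 | 1 {
  if (rawDelta === 0) return 0;
  return rawDelta > 0 ? 1 : -1;
}

// On the AS3 side you would register a callback (hypothetical name)
// and apply the step just like the MOUSE_WHEEL handler does:
//   ExternalInterface.addCallback("onJsWheel", function(step:int):void {
//     tafelOrigineel_mc.y += step;
//   });
```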