I am working on a game with LibGDX, and my stage is set as input processor:
Gdx.input.setInputProcessor(stage);
Everything works fine, but now I want to act on swipes (left and right). I've seen some samples that suggest implementing a GestureListener and setting it as the input processor, but if I do that, my stage can no longer be the input processor as well. So how do I get both touch and swipe events?
When you want to have more than one InputProcessor, you have to use an InputMultiplexer which chains multiple InputProcessors. For example:
InputMultiplexer multiplexer = new InputMultiplexer();
multiplexer.addProcessor(stage);
// A GestureListener is not an InputProcessor on its own, so wrap it in a GestureDetector.
multiplexer.addProcessor(new GestureDetector(myGestureListener));
Gdx.input.setInputProcessor(multiplexer);
This way libGDX will deliver events to both the stage and your gesture listener, in the order the processors were added; the first processor that reports an event as handled stops it from reaching the ones after it.
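If all you need is left/right swipes, myGestureListener itself can be as simple as this (a rough sketch; onSwipeLeft/onSwipeRight and the velocity check are placeholders, not from the question):
import com.badlogic.gdx.input.GestureDetector.GestureAdapter;

GestureAdapter myGestureListener = new GestureAdapter() {
    @Override
    public boolean fling(float velocityX, float velocityY, int button) {
        // Treat a mostly-horizontal fling as a swipe.
        if (Math.abs(velocityX) > Math.abs(velocityY)) {
            if (velocityX > 0) {
                onSwipeRight(); // placeholder for your own handling
            } else {
                onSwipeLeft();  // placeholder for your own handling
            }
            return true;  // consume the event so it stops here
        }
        return false;     // not a horizontal swipe; let other processors see it
    }
};
Because the detector sits on the multiplexer rather than being set as the sole input processor, the stage keeps receiving its touch events as before.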
The application I am writing is not a game, but does require many of the features one would use in a game... displaying a 2D scene, moving the camera to pan and zoom, rotating or otherwise animating objects within the scene. But the display of the scene will be controlled via numerous regular windows controls.
The best comparison I can think of right now is a level editor. The majority of the user interface is a standard window with panes that contain different controls. The scene is contained in another child window. When the user makes adjustments such as camera location, the scene responds accordingly.
So far, everything I've seen about cocos is geared around a single window. Is it possible to embed a scene into a child window as I've described?
You can add your custom layer to your main scene using the Director::getInstance()->getRunningScene()->addChild(...) function.
I'm developing a Flash game, and I would love to implement a rain effect. Here's my progress on rain so far: http://www.squ4re.eu/Rain.html
The code is pretty simple; every raindrop is an object, and when it hits the ground it places itself back at the top of the screen and adds a splash animation.
But the problem is clicking something BEHIND the rain. Let's say I have some selectable units on the battleground. In most cases a random raindrop intercepts the click meant for an object behind it. So here's my question: is it possible in Flash to make an object "transparent" to mouse clicks, so I can click an object behind it? Or is there another way to solve this problem?
Thank you in advance.
As @putvande mentioned, you could set mouseEnabled on every interactive object that should be disabled for mouse interaction. You could also put the rain into its own rain layer and disable that layer for mouse interaction:
myRainLayer.mouseEnabled = false;  // the layer itself ignores the mouse
myRainLayer.mouseChildren = false; // and so do all of its children (the raindrops)
mouseChildren - determines whether or not the children of the object are mouse, or user input device, enabled. If an object is enabled, a user can interact with it by using a mouse or user input device. The default is true.
Also consider using display objects that don't inherit from InteractiveObject, like Bitmap, Shape, and Video.
I'm working on a game made with libGDX that needs some GUI above my game screen, something like FrameLayout in Android.
I have GameScreen where everything is happening.
What I want now is to add a "pause" button, highscore information etc.
I've tried to combine a Stage object with regular sprite drawing.
But I had some problems with handling input: how do I tell whether the user clicked the pause button in the stage or clicked the game area (where I should add some bullets)?
You should be able to use a Stage to manage your UI. To get input working correctly, you'll need to add an InputMultiplexer so that the Stage and then your current input scheme will both receive the input events.
To set it up, you'll do something like this:
InputMultiplexer multiplexer = new InputMultiplexer();
multiplexer.addProcessor(stage);                    // the UI gets first crack at each event
multiplexer.addProcessor(gameScreenInputProcessor); // the game sees whatever the UI doesn't handle
Gdx.input.setInputProcessor(multiplexer);
(Code sample based on code from https://code.google.com/p/libgdx/wiki/InputEvent)
Note that the order is important (I'm guessing you'll want the stage to get events first, to see whether the UI is being touched or not). Also, the boolean return value from input event handlers is more important with a multiplexer, as "handled" events will not be propagated further by the multiplexer. UI events inside the Stage have their own "handled" flag (mostly it does the right thing, but there are some subtle differences).
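For example, the game-side processor might look something like this (just a sketch; the class name and the bullet call are placeholders, not from the question):
import com.badlogic.gdx.InputAdapter;

public class GameScreenInputProcessor extends InputAdapter {
    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        // This only runs when the Stage did not handle the touch,
        // so it must be a click on the game area.
        // spawnBulletAt(screenX, screenY); // placeholder for your game logic
        return true; // mark the event as handled so the multiplexer stops here
    }
}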
One alternative to the InputMultiplexer would be to create a "GameScreenActor" (a new subclass of Actor) that contains your current game screen that you plug into the global Stage. You'd have to move your input processing to the scene2d approach, though. This probably isn't the right choice for you, but it is a viable one.
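If you did go that route, a bare-bones sketch of such an actor (again, the names are placeholders, and you would still need to size and position it) could look like:
import com.badlogic.gdx.graphics.g2d.Batch;
import com.badlogic.gdx.scenes.scene2d.Actor;
import com.badlogic.gdx.scenes.scene2d.InputEvent;
import com.badlogic.gdx.scenes.scene2d.utils.ClickListener;

public class GameScreenActor extends Actor {
    public GameScreenActor() {
        // Remember to call setBounds(...) on this actor so it can be hit-tested.
        addListener(new ClickListener() {
            @Override
            public void clicked(InputEvent event, float x, float y) {
                // x and y are local to the actor; spawn a bullet here, for example
            }
        });
    }

    @Override
    public void draw(Batch batch, float parentAlpha) {
        // Draw the game world using the Stage's batch.
    }
}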
My intention is to have different gesture-based actions occur depending on whether the user swipes the screen with one or two fingers. I'm a little new to touch app development and I'm not seeing the necessary information in either Manipulation events or the Toolkit GestureService. I'm fine with using more low-level touch logic or manually tracking the number of touch contacts if necessary, I just need a little guidance on how to differentiate single from double-touch in a gesture.
This may help you resolve your problem. There is a property that gives you the second touch point of a gesture read from the TouchPanel:
GestureSample sample = TouchPanel.ReadGesture(); // check TouchPanel.IsGestureAvailable before calling this
Vector2 first = sample.Position;   // first touch point
Vector2 second = sample.Position2; // second touch point (used by two-finger gestures such as Pinch)
The Position2 property returns the second touch position of the gesture.
I am trying to make it possible to display and interact with Java Swing components on top of a Java3D canvas. I am displaying the components by painting a transparent JPanel to a buffered image, and then painting that buffer over the canvas using J3DGraphics2D.
What I can't figure out is how to forward mouse events to the Swing components in the JPanel.
I want all keyboard and mouse events on the Canvas3D to be dispatched to the JPanel, and then fall back through to the Canvas3D if they aren't captured by any Swing components (e.g. the mouse isn't over any of them).
I tried calling Container.dispatchEvent(AWTEvent), but it doesn't successfully dispatch the events to the proper components, even when, for example, the mouse cursor is right over a button in the Container.
Does anyone know a way to do this? It should be possible.
At long last, I figured it out! It's already been done: use JCanvas3D and a JLayeredPane. This is the opposite of rendering the Swing components in postRender(): JCanvas3D renders into an offscreen buffer and then paints to the screen with AWT, creating a lightweight canvas that interacts properly with components in the JLayeredPane, even if they are transparent.
One thing to watch out for: JCanvas3D redirects all input to its offscreen Canvas3D, but at first my Orbiter didn't work the way it had with a heavyweight Canvas3D. All you have to do is add mouse and key listeners to the JCanvas3D, because AWT won't even deliver those events if no listeners are registered for them.
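For anyone trying the same thing, here is a rough, untested sketch of the layered setup (the sizes and the button are placeholders, and depending on your Java 3D version you may need to adjust when the offscreen Canvas3D becomes available):
import java.awt.Dimension;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLayeredPane;
import com.sun.j3d.exp.swing.JCanvas3D;
import com.sun.j3d.utils.universe.SimpleUniverse;

public class LayeredSceneDemo {
    public static void main(String[] args) {
        // Lightweight 3D canvas; it needs a non-zero size before its
        // offscreen Canvas3D exists.
        JCanvas3D canvas = new JCanvas3D();
        canvas.setSize(new Dimension(800, 600));
        canvas.setBounds(0, 0, 800, 600);

        // Build the scene against the heavyweight canvas it wraps internally.
        SimpleUniverse universe = new SimpleUniverse(canvas.getOffscreenCanvas3D());
        universe.getViewingPlatform().setNominalViewingTransform();
        // universe.addBranchGraph(yourSceneBranch); // add your own content here

        // Any Swing component layered above the canvas stays fully interactive.
        JButton overlay = new JButton("Sits on top of the 3D view");
        overlay.setBounds(20, 20, 220, 30);

        JLayeredPane layers = new JLayeredPane();
        layers.add(canvas, JLayeredPane.DEFAULT_LAYER);   // 3D view at the bottom
        layers.add(overlay, JLayeredPane.PALETTE_LAYER);  // UI above it

        JFrame frame = new JFrame("JCanvas3D + JLayeredPane");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setContentPane(layers);
        frame.setSize(800, 600);
        frame.setVisible(true);
    }
}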