Libgdx, viewport in actor - libgdx

Is it possible to put a viewport inside a custom actor, so that the viewport's width and height are the actor's width and height, and rendering starts from the actor's position rather than from (0,0)?

Stage is a 2D scene graph. It has a hierarchical structure, which means that Actors added to a Group are already rendered relative to that Group. Moving the Group will also move all the children inside it. Via clipBegin() and clipEnd() you can also "cut off" everything that's not inside the actor, which is roughly what a Viewport does as well when it sets the glViewport.
So you probably won't need an extra Viewport for whatever you are trying to do. If you still think you need one, you can create an ActorViewport extends Viewport which gets an Actor field. You would have to override the apply(boolean) method and synchronize the worldWidth, worldHeight, screenX, screenY, screenWidth and screenHeight variables to match the Actor. Remember that you will have to update the viewport every time the actor changes, which in the worst case is every frame.
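If clipping is all you need, a minimal sketch of that approach could look like this (ClippedGroup is a made-up name; it assumes the group's bounds have been set, and setTransform(false) keeps children in parent coordinates so the clip rectangle lines up):
import com.badlogic.gdx.graphics.g2d.Batch;
import com.badlogic.gdx.scenes.scene2d.Group;

public class ClippedGroup extends Group {
    public ClippedGroup() {
        setTransform(false); // draw children in parent coordinates so the scissor matches our bounds
    }

    @Override
    public void draw(Batch batch, float parentAlpha) {
        batch.flush(); // flush anything drawn before the scissor changes
        if (clipBegin()) { // clips to this actor's bounds; returns false if the clip area is empty
            super.draw(batch, parentAlpha);
            batch.flush(); // flush the clipped draws before restoring the previous scissor
            clipEnd();
        }
    }
}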

Thanks to noone, the solution was very simple:
import com.badlogic.gdx.graphics.Camera;
import com.badlogic.gdx.scenes.scene2d.Actor;
import com.badlogic.gdx.utils.viewport.Viewport;

public class ActorViewport extends Viewport
{
    private Actor m_actor;

    public ActorViewport(Actor actor, int worldWidth, int worldHeight, Camera camera)
    {
        m_actor = actor;
        setWorldSize(worldWidth, worldHeight);
        setCamera(camera);
    }

    @Override
    public void update(int screenWidth, int screenHeight, boolean centerCamera)
    {
        setScreenPosition((int) m_actor.getX(), (int) m_actor.getY());
        setScreenSize((int) m_actor.getWidth(), (int) m_actor.getHeight());
        apply(centerCamera); // apply the new screen bounds to the camera
    }
}

Related

Confused about InputListener - libgdx

I have:
- a main Test class creating a Stage, adding an Actor to the stage, and setting the InputProcessor to the stage;
- an extended Group class with several Actors added. In the constructor of the Group I have added an InputListener.
The InputListener is never fired. Can someone tell me why not, and how to fix it?
public class Test extends ApplicationAdapter {
    private Stage stage;
    private SpecialScene specialScene;

    @Override
    public void create() {
        stage = new Stage(new ScreenViewport());
        specialScene = new SpecialScene();
        stage.addActor(specialScene);
        Gdx.input.setInputProcessor(stage);
    }
}
public class SpecialScene extends com.badlogic.gdx.scenes.scene2d.Group {
    public SpecialScene() {
        // <add some actors ...>
        addListener(specialListener);
    }

    private static InputListener specialListener = new InputListener() {
        @Override
        public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) {
            return true; // or false
        }

        @Override
        public void touchUp(InputEvent event, float x, float y, int pointer, int button) {
            super.touchUp(event, x, y, pointer, button);
        }

        @Override
        public void enter(InputEvent event, float x, float y, int pointer, Actor fromActor) {
            super.enter(event, x, y, pointer, fromActor);
        }
    };
}
* UPDATE *
I found the problem. The listener did not find any region for my actors:
I have to set the region explicitly with setBounds().
My problem is solved, but I am still confused. Why do I have to set the bounds myself? I am sure I will forget this for some Actor in the future, because it seems illogical to me. Is this the way it has to be, or am I misunderstanding the concept?
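For reference, a minimal sketch of the fix (the 400x300 size is made up; the point is that hit detection needs a non-zero region):
public SpecialScene() {
    // <add some actors ...>
    setBounds(0, 0, 400, 300); // without this, the group has a 0x0 hit area and never receives input
    addListener(specialListener);
}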
Group is a skeleton class that can be used to develop your own functionality, so it does not presume anything, even the way its child actors contribute to its bounds. (For example, you might have some actors that you don't want to contribute because they are a visual flourish, like particles.) You can extend Group to create your own base class to fit your needs.
So why doesn't LibGDX already include a class like that? In LibGDX, Stage is used primarily for the UI system. Although it was designed to be extensible to all kinds of purposes, it only includes a framework for you to do that, unless you are using the fully baked UI implementation that is based on it. That UI implementation does include a subclass of Group called WidgetGroup, which does what you'd expect with the bounds.
IIRC, the author of Stage wrote a blog post a few years ago on libgdx.com discussing how he tried using Stage for the gameplay of a simple game jam game, and basically concluded that it caused his game to be more convoluted, or at least more time-consuming to code.
I have personally used it for a turn-based game jam game, and it was good for that. I used the Actions system to have nice animated transitions of the game pieces. But I think it would make a real-time game more convoluted than creating your own organization structure that is tailored to your particular game. If you are creating a more complicated game, you might check out the Ashley plugin for LibGDX.
In either case, you definitely should use it for GUI stuff because that is all fully implemented and a huge time-saver.

libgdx - ClickListener not affected by actor scale?

Here's the constructor for a card class I'm making. When I create a card that is scaled and click on it, only clicks within the 1.0x scaled area are actually registered; i.e. if I pass in 1.5 for the scale, clicks near the borders don't work. Why not? I've scaled the actor itself.
public Card(float x, float y, float scale)
{
    this.setPosition(x, y);

    faceSprite = new Sprite(MyResources.getInstance().cardTextureRegion);
    faceSprite.setPosition(x, y);
    faceSprite.setScale(scale);

    borderSprite = new Sprite(MyResources.getInstance().cardBorderTextureRegion);
    borderSprite.setPosition(x, y);
    borderSprite.setScale(scale);

    // Set boundaries for ourselves (the actor). Note that we have to match the scale of the sprites.
    setSize(borderSprite.getWidth(), borderSprite.getHeight());
    setScale(scale);

    // Add ClickListener
    final Card thisCard = this;
    addListener(new ClickListener() {
        @Override
        public void clicked(InputEvent event, float x, float y) {
            ((MyStage) thisCard.getStage()).cardClicked(thisCard);
        }
    });
}
The basic Actor class might be missing some functionality, including proper scale handling. If all you want is to display some images (judging by the Sprite objects you use), I'd suggest using the existing classes rather than making custom actors - especially since you might have trouble rendering Sprites with exact Actor parameters if you use a lot of custom actions.
Image allows you to display "sprites" - although somewhat simplified, it should be enough. To store the images, you can use a Table (more flexible) or a Stack (will work with multiple images of the same size).
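For example, a minimal Stack-based sketch (reusing the question's MyResources singleton and assuming both regions are the same size):
// Stack and Image are from com.badlogic.gdx.scenes.scene2d.ui
Stack cardStack = new Stack();
cardStack.add(new Image(MyResources.getInstance().cardTextureRegion));
cardStack.add(new Image(MyResources.getInstance().cardBorderTextureRegion));
cardStack.setPosition(x, y);
cardStack.setSize(cardStack.getPrefWidth() * scale, cardStack.getPrefHeight() * scale);
stage.addActor(cardStack);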
If you really want to stick with the custom actor approach, try this instead of changing the scale:
setSize(borderSprite.getWidth() * scale, borderSprite.getHeight() * scale);
As long as you don't change the Card scale at runtime (you only set it up during creation), this should just work.
You can use the Image class to display images:
Texture texture = new Texture("test.jpg");
Image image = new Image(texture);
// you can use setSize() or setBounds() as per your requirement
image.setScale(scale);
image.addListener(new ClickListener() {
    @Override
    public void clicked(InputEvent event, float x, float y) {
        ((MyStage) event.getTarget().getStage()).cardClicked(event.getTarget());
    }
});

Post overriding the paint method of the components in java

In Java AWT or Swing, when you want to change the painting of some component you usually have to override the method paint(Graphics g) (in AWT) or paintComponent(Graphics g) (in Swing).
This is usually (maybe always - I'm not sure) done when you are creating the component, for example:
JPanel jPanel = new JPanel() {
    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2d = (Graphics2D) g;
        // ... my implementation of paint, some transformations, rotation, etc.
    }
};
Imagine that you have a container of components, which could for example consist of some JLabels, some JTextFields, and some images, all of which will be put on one component.
By container I mean you have some list or map with IDs, or some similar structure, holding all the components you will put on one JFrame.
The question is whether I can change the painting method for all of the components in this list after creation, at the moment when all of them already exist. For example, I want to apply the rotation action (rotate), which is defined in Graphics2D, to all of them.
So basically what I want is to go through the list of components I have and say:
"All of you (components) which are in the list will be rotated by some angle." Is that possible? If yes, how?
Edit:
This is my solution, which does not work correctly:
graphicalDisplayPanel = new JPanel() {
    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2d = (Graphics2D) g;
        g2d.rotate(Math.PI, anchorx, anchory);
    }

    @Override
    public void paintChildren(Graphics g) {
        super.paintChildren(g);
        Graphics2D g2d2 = (Graphics2D) g;
        g2d2.rotate(Math.PI, anchorx, anchory);
    }
};
JFrame jFrame = new JFrame();
// ... setting dimension, position, visibility etc. for the JFrame; it works correctly non-rotated
jFrame.setContentPane(graphicalDisplayPanel);
I have not tested this, but it seems like it would work. A JComponent's paint() method calls:
paintComponent(co);
paintBorder(co);
paintChildren(co);
where co is a Graphics object. In theory you can create an image, retrieve its graphics object, and pass that into paintChildren(). You will have to call paintComponent() and paintBorder() yourself if you do this. Then just rotate the image and draw it into your component. You may have to crop the image or resize your component accordingly for this to work. It might look something like this:
BufferedImage myImage;

@Override
public void paint(Graphics g) {
    myImage = new BufferedImage(getWidth(), getHeight(), BufferedImage.TRANSLUCENT);
    // using a transparent BufferedImage might not be efficient in your case
    Graphics myGraphics = myImage.getGraphics();
    super.paintComponent(g);
    super.paintBorder(g);
    super.paintChildren(myGraphics); // children are painted into the off-screen image
    // rotation code here
    // ...
    // draw children onto your component
    g.drawImage(myImage, 0, 0, getWidth(), getHeight(), null);
}
I hope I didn't make any mistakes, please let me know if this works.
So basically what I want is to go through the list of components I have and say: "All of you (components) which are in the list will be rotated by some angle."
If you want to rotate the panel, and therefore all the components on the panel, as a single unit, then you need to do the custom painting in the paintComponent() method.
If you want to rotate, for example, individual images that each have a different angle of rotation, then you can again do this in the paintComponent(...) method and change the angle for each component.
Or, in this second case, you can use the Rotated Icon class. There the Icon is just added to a JLabel; you can then change the degrees of rotation and repaint the label, so there is no custom painting (except in the Icon itself).
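For the first case, a minimal sketch of the idea (RotatedPanel is a made-up name, and it overrides paint() rather than paintComponent() so that the children are rotated too; note this transforms rendering only - mouse coordinates are not rotated):
import java.awt.Graphics;
import java.awt.Graphics2D;
import javax.swing.JPanel;

public class RotatedPanel extends JPanel {
    private double angle = Math.PI / 8; // example angle, in radians

    @Override
    public void paint(Graphics g) {
        Graphics2D g2d = (Graphics2D) g;
        // rotate around the panel's center before the normal paint pipeline runs
        g2d.rotate(angle, getWidth() / 2.0, getHeight() / 2.0);
        super.paint(g2d); // paints the component, border and children with the rotation applied
    }

    public void setAngle(double angle) {
        this.angle = angle;
        repaint();
    }
}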

LibGdx View Port, resolutions, sprite size

Hello, I'm new to libgdx and I need some help. If my DesktopLauncher resolution is 480x320, the sprite takes 80% of the screen, but at 1280x720 the sprite is small. I need to make it look the same at all resolutions; how do I do this? It may be easy for you, but not for me. I'm using libgdx 1.3.1.
This is my libgdx code:
SpriteBatch batch;
Texture img;
Sprite mysprite;
private Stage stage;
StretchViewport myviewport = new StretchViewport(480, 320);

@Override
public void create()
{
    batch = new SpriteBatch();
    img = new Texture("badlogic.jpg");
    mysprite = new Sprite(img);
    stage = new Stage(new StretchViewport(480, 320));
}

public void resize(int width, int height)
{
    // use true here to center the camera
    // that's what you probably want in case of a UI
    stage.setViewport(myviewport);
    stage.getCamera().position.set(640/2, 480/2, 0);
}

@Override
public void render()
{
    Gdx.gl.glClearColor(1, 0, 0, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.begin();
    mysprite.draw(batch);
    batch.end();
}
You are already using the right thing: Viewport.
There are different Viewport classes, and some of them support working with a virtual screen size, which is what you are looking for.
What is a virtual screen size? It is the screen size your code works with, which is then scaled up to match the real resolution.
Basically, you can work in your own units, and they are automatically scaled up to pixels.
I guess in your case there are 2 possible Viewport types:
- StretchViewport supports virtual screen sizes and scales them up to match the real screen size and the real aspect ratio. If the real aspect ratio does not match the virtual one, the Sprites will be stretched, which can look strange.
- FitViewport is the same as the StretchViewport, but it keeps the aspect ratio. If the real aspect ratio does not match the virtual one, black borders will appear.
How to use it:
First you need to create it:
myViewport = new StretchViewport(VIRTUAL_WIDTH, VIRTUAL_HEIGHT);
Then set the Stage's Viewport:
stage = new Stage(myViewport);
In the resize method you need to update your Viewport:
myViewport.update(width, height);
That's all.
The stage now uses the Viewport and its camera to render. You don't need to touch the camera, unless you need to move it around.
So your errors are:
stage = new Stage(new StretchViewport(480, 320));
This creates a new StretchViewport that you never store or use.
stage.setViewport(myviewport);
You only need to set the viewport once, when you create the stage.
And you never call update(width, height) on the Viewport.
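Putting it all together, a minimal sketch of the corrected setup (keeping the question's field names):
private Stage stage;
private StretchViewport myviewport;

@Override
public void create() {
    myviewport = new StretchViewport(480, 320);
    stage = new Stage(myviewport); // the stage keeps this viewport; no second one is created
    // create sprites / add actors here
}

@Override
public void resize(int width, int height) {
    myviewport.update(width, height, true); // true centers the camera, which is what you want for a UI
}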

Java libgdx 1.20 version texture bug

For 2 days I've been trying to fix a bug with flickering and distorted textures in my game. I searched the internet and tried a few solutions, like using scene2d, but it didn't work. What should I do?
This screenshot shows the problem: as the character moves, one eye is sometimes bigger than the other.
Edit:
I still get the problem with the distorted eye when I use sprite.setPosition((int) sprite.getX(), (int) sprite.getY()); every time before I render my character.
And when I use the custom viewport from the answer, I see nothing in the game window. What am I doing wrong?
package com.mygdx.redHoodie;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Screen;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.StretchViewport;

public class GameScreen implements Screen {
    public static final int GAME_WIDTH = 800;
    public static final int GAME_HEIGHT = 480;

    SpriteBatch batch;
    Background background;
    public Hoodie hoodie;
    public PixelMultipleViewport viewport;
    OrthographicCamera camera;
    public int gameMode; // 0 normal game, 1 level up, 2 end game

    public GameScreen() {
        camera = new OrthographicCamera(GAME_WIDTH, GAME_HEIGHT);
        viewport = new PixelMultipleViewport(GAME_WIDTH, GAME_HEIGHT, camera);
        camera.setToOrtho(false, GAME_WIDTH, GAME_HEIGHT);
        batch = new SpriteBatch();
        // displayed object classes
        background = new Background(this);
        hoodie = new Hoodie(this);
        startNewGame();
    }

    @Override
    public void render(float delta) {
        Gdx.gl.glClearColor(1, 1, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.setProjectionMatrix(camera.projection);
        batch.setTransformMatrix(camera.view);
        camera.update();
        //batch.setProjectionMatrix(camera.combined);
        this.update(delta);
        this.batch.begin();
        toRender(delta);
        this.batch.end();
    }

    public void update(float delta) {
        hoodie.update(delta);
    }

    public void toRender(float delta) {
        background.render();
        hoodie.render();
    }

    public void startNewGame() {
    }

    public void startNewLevel() {
    }

    @Override
    public void resize(int width, int height) {
        viewport.update(width, height, false);
    }

    @Override
    public void show() {
    }

    @Override
    public void hide() {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }

    @Override
    public void dispose() {
    }
}
When loading your texture, use linear filtering and mip-mapping. The default filter is Nearest/Nearest, which will cause the issue you're seeing.
Texture myTexture = new Texture("textureFilename", true); //must enable mip-mapping in constructor
myTexture.setFilter(TextureFilter.MipMapLinearNearest, TextureFilter.Linear);
EDIT:
I realize now, looking at your screenshot, that you are doing pixelated graphics in a larger window. In order to do this, yes, you need to keep the Nearest/Nearest filtering, instead of what I suggested.
To avoid having some of the pixels vary in size, you must round off character movement and camera movement to the nearest world unit. When your character is partway between pixels, the size of the sprite pixels varies because they don't line up with the screen pixels.
You have your world scaled so one unit equals one of your large pixels. So whenever you draw anything, you need to first round its position to the nearest integer in the x and the y, as well as the camera position. So after you move the camera or the sprites, you must do something like this:
sprite.setPosition((int) sprite.getX(), (int) sprite.getY());
As far as your Viewport goes, if you don't want any black bars, you will probably need a custom Viewport class that tries to match your desired resolution as closely as possible and then extends it outwards to avoid distortion. ExtendViewport does something similar, but the difference with pixelated graphics is that you need the screen resolution to be an integer multiple of the world resolution, so the edges of pixels look crisp rather than fuzzy.
I think this will do what you want. It takes your desired screen resolution and shrinks it to fit, so that the size of each of your world pixels in screen pixels is an integer. Then it extends the view beyond your desired resolution to avoid distortion and black bars. This class assumes that all screen dimensions are always a multiple of 4; I think that's true. If you want to get fancy, you could use OpenGL scissoring to round down the viewport size to the nearest multiple of 4, to be safe. You would have at most 2 pixels of black bar, which I don't think would be noticeable.
public class PixelMultipleViewport extends Viewport {
    private int minWorldWidth, minWorldHeight;

    public PixelMultipleViewport(int minWorldWidth, int minWorldHeight, Camera camera) {
        this.minWorldHeight = minWorldHeight;
        this.minWorldWidth = minWorldWidth;
        this.camera = camera;
    }

    @Override
    public void update(int screenWidth, int screenHeight, boolean centerCamera) {
        viewportWidth = screenWidth;
        viewportHeight = screenHeight;
        int maxHorizontalMultiple = screenWidth / minWorldWidth;
        int maxVerticalMultiple = screenHeight / minWorldHeight;
        // guard against a zero pixel size when the window is smaller than the minimum world size
        int pixelSize = Math.max(1, Math.min(maxHorizontalMultiple, maxVerticalMultiple));
        worldWidth = (float) screenWidth / (float) pixelSize;
        worldHeight = (float) screenHeight / (float) pixelSize;
        super.update(screenWidth, screenHeight, centerCamera);
    }
}
Here's a different option I just came across. This is a way to draw your scene at the scale you like without black bars, at any resolution.
The visual quality will be slightly worse than in my other answer (where you draw at an integer multiple of your desired scene scale), but significantly better than using straight nearest filtering like in your screenshot.
The basic idea is to draw everything to a small FrameBuffer at the scale you want, and then draw the FrameBuffer's color texture to the screen using an upscaling shader that (unlike linear filtering) interpolates pixel colors only along the edges of sprite pixels.
The explanation is here. I have not ported this to Libgdx or tested it. And I'm not sure how well this shader would run on mobile. It involves running four dependent texture look-ups per screen fragment.
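For reference, the FrameBuffer half of that idea might look roughly like this in libgdx (a sketch only: the custom upscaling shader is omitted, and the 400x240 virtual size is an arbitrary example):
// Created once, e.g. in create(); call fbo.dispose() when done
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGB888, 400, 240, false);

// In render(): draw the scene into the small buffer
fbo.begin();
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(camera.combined);
batch.begin();
// ... draw the scene at the small virtual resolution ...
batch.end();
fbo.end();

// Then draw the buffer's color texture scaled up to the real screen
TextureRegion region = new TextureRegion(fbo.getColorBufferTexture());
region.flip(false, true); // FrameBuffer textures come out upside down
batch.getProjectionMatrix().setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.begin();
batch.draw(region, 0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.end();
The linked approach would additionally bind its custom upscaling shader on the batch before that final draw, instead of relying on the texture filter.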
I know this topic is old, but since my search led me here, more people may follow in the future. I was having the same pixel-tearing issue, but only on my iOS 9 iPhone 4S; it was rendering fine on my Android 9 Pixel 2. I tried many things (especially rounding to full pixels), but even an unzoomed fullscreen orthographic camera suffered from the artefacts.
Forcing my texture to be POT (power of two) fixed the issue!