Trying to add a VBO in Slick2D - lwjgl

I'm trying to add a VBO in Slick2D. All I can find on the web is how to initialize a VBO in a 3D context. Does anyone know how to do it in 2D?
My current test (drawing 4 squares in the Slick context) renders as in the screenshot I posted (coordinates added in black; image originally hosted on canardpc.com).
Below is my initialization code (in the init method of my GameState):
// set up OpenGL
GL11.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_COLOR_ARRAY);
GL11.glMaterial(GL11.GL_FRONT, GL11.GL_SPECULAR, floatBuffer(1.0f, 1.0f, 1.0f, 1.0f));
GL11.glMaterialf(GL11.GL_FRONT, GL11.GL_SHININESS, 25.0f);
// set up the camera
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();
// create our vertex buffer objects
IntBuffer buffer = BufferUtils.createIntBuffer(1);
GL15.glGenBuffers(buffer);
int vertex_buffer_id = buffer.get(0);
FloatBuffer vertex_buffer_data = BufferUtils.createFloatBuffer(vertex_data_array.length);
vertex_buffer_data.put(vertex_data_array);
vertex_buffer_data.rewind();
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertex_buffer_id);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertex_buffer_data, GL15.GL_STATIC_DRAW);
And the rendering code (in the render method of the game state):
g.setDrawMode(Graphics.MODE_ALPHA_BLEND);
// save the current matrix
GL11.glPushMatrix();
// render the quads
GL11.glVertexPointer(3, GL11.GL_FLOAT, 28, 0); // 3 position floats per vertex, 28-byte stride (7 floats)
GL11.glColorPointer(4, GL11.GL_FLOAT, 28, 12); // 4 color floats, starting after the 12-byte position
GL11.glDrawArrays(GL11.GL_QUADS, 0, vertex_data_array.length / 7); // 7 floats per vertex
// restore the matrix to pre-transformation values
GL11.glPopMatrix();
I think something is wrong, because all the other rendering (text and sprites) disappears, and the coordinates no longer match the window size.
Edit: I tried something like GL11.glOrtho(0, 800, 600, 0, -1, 1); with strange results.
Thanks

I resolved the issue by adding GL11.glOrtho(0, 800, 600, 0, -1, 1); and disabling the client states again (glDisableClientState) after drawing.
But I will eventually move to the libgdx framework, which does this natively.
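For anyone landing here, below is a minimal sketch of what that fix looks like in the render method. It assumes an 800x600 window and the 7-floats-per-vertex layout from the question (vertex_buffer_id and vertex_data_array are the variables created in init); treat it as an illustration, not Slick-sanctioned practice:
// switch to a 2D pixel-space projection before drawing the VBO
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glPushMatrix();
GL11.glLoadIdentity();
GL11.glOrtho(0, 800, 600, 0, -1, 1); // top-left origin, matching Slick's coordinates
GL11.glMatrixMode(GL11.GL_MODELVIEW);
// enable the client states only for this draw
GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_COLOR_ARRAY);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertex_buffer_id);
GL11.glVertexPointer(3, GL11.GL_FLOAT, 28, 0);
GL11.glColorPointer(4, GL11.GL_FLOAT, 28, 12);
GL11.glDrawArrays(GL11.GL_QUADS, 0, vertex_data_array.length / 7);
// restore state so Slick's own text and sprite rendering keeps working
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
GL11.glDisableClientState(GL11.GL_COLOR_ARRAY);
GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glPopMatrix();
GL11.glMatrixMode(GL11.GL_MODELVIEW);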


Convert OpenGL HQX shader to LibGDX

I was getting into shaders for LibGDX and noticed there are some attributes that are only used in LibGDX.
The standard vertex and fragment shaders from https://github.com/libgdx/libgdx/wiki/Shaders work perfectly and get applied to my SpriteBatch.
When I try to use an HQX shader like https://github.com/SupSuper/OpenXcom/blob/master/bin/common/Shaders/HQ2x.OpenGL.shader I get a lot of errors.
Probably because I need to send some LibGDX-dependent variables to the shader, but I can't find out which those should be.
I'd like to use these shaders on desktops with large screens so the game keeps looking great on those screens.
I used this code to load the shader:
try {
    shaderProgram = new ShaderProgram(Gdx.files.internal("vertex.glsl").readString(), Gdx.files.internal("fragment.glsl").readString());
    shaderProgram.pedantic = false;
    System.out.println("Shader Log:");
    System.out.println(shaderProgram.getLog());
} catch (Exception ex) { }
The Shader Log outputs:
No errors.
Thanks in advance.
This is a post-processing shader, so your flow should go like this:
Draw your scene to an FBO at pixel-perfect resolution using SpriteBatch's default shader.
Draw the FBO's texture to the screen's frame buffer using the upscaling shader. You can do this with SpriteBatch if you modify the shader to match the attributes and uniforms that SpriteBatch uses. (You could alternatively create a simple mesh with the attribute names the shader expects, but SpriteBatch is probably easiest.)
First of all, we are not using a typical shader with SpriteBatch so you need to call ShaderProgram.pedantic = false; somewhere before loading anything.
Now you need a FrameBuffer at the right size. It should be sized for your sprites to be pixel perfect (one pixel of texture scales to one pixel of world). Something like this:
public void resize (int width, int height){
    float ratio = (float)width / (float)height;
    int gameWidth = (int)(GAME_HEIGHT * ratio); // world width that preserves the window's aspect ratio
    boolean needNewFrameBuffer = false;
    if (frameBuffer != null && (frameBuffer.getWidth() != gameWidth || frameBuffer.getHeight() != GAME_HEIGHT)){
        frameBuffer.dispose();
        needNewFrameBuffer = true;
    }
    if (frameBuffer == null || needNewFrameBuffer)
        frameBuffer = new FrameBuffer(Format.RGBA8888, gameWidth, GAME_HEIGHT);
    camera.viewportWidth = gameWidth;
    camera.viewportHeight = GAME_HEIGHT;
    camera.update();
}
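For context, this resize method assumes fields roughly like the following (the names and the example height are mine, not from the original answer):
private FrameBuffer frameBuffer;
private OrthographicCamera camera;
private static final int GAME_HEIGHT = 240; // your game's fixed world height in pixels; 240 is just an example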
Then you can draw to the frame buffer as if it's your screen. And after that, you draw the frame buffer's texture to the screen.
public void render (){
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    frameBuffer.begin();
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.setProjectionMatrix(camera.combined);
    batch.setShader(null); // use default shader
    batch.begin();
    // draw your game
    batch.end();
    frameBuffer.end();
    batch.setShader(upscaleShader);
    batch.begin();
    upscaleShader.setUniformf("rubyTextureSize", frameBuffer.getWidth(), frameBuffer.getHeight()); // this is the uniform in your shader. I assume it wants the scene size in pixels
    batch.draw(frameBuffer.getColorBufferTexture(), -1, 1, 2, -2); // full-screen quad for no projection matrix, with Y flipped as needed for frame buffer textures
    batch.end();
}
There are also some changes you need to make to your shader so it will work with OpenGL ES, and because SpriteBatch is wired for specific attribute and uniform names:
At the top of your vertex shader, add this to define your vertex attributes and varyings (which your linked shader doesn't need because it's relying on built-in variables that aren't available in GL ES):
attribute vec4 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord[5];
Then in the vertex shader, change the gl_Position line to
gl_Position = a_position; //since we can't rely on built-in variables
and replace all occurrences of gl_TexCoord with v_texCoord for the same reason.
In the fragment shader, to be compatible with OpenGL ES, you need to declare precision. You also need to declare the same varying, so add this to the top:
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord[5];
As with the vertex shader, replace all occurrences of gl_TexCoord with v_texCoord. And also replace all occurrences of rubyTexture with u_texture, which is the texture name that SpriteBatch uses.
I think that's everything. I didn't actually test this and I'm going off of memory, but hopefully it gets you close.
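For completeness, loading the converted shader in LibGDX would then look something like this (file names borrowed from the question; a sketch, not tested):
ShaderProgram.pedantic = false; // static flag; set it before creating any shader
upscaleShader = new ShaderProgram(
        Gdx.files.internal("vertex.glsl").readString(),
        Gdx.files.internal("fragment.glsl").readString());
if (!upscaleShader.isCompiled()) {
    Gdx.app.error("Shader", upscaleShader.getLog()); // fail loudly instead of swallowing compile errors
}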

Cocos2dx: Sprite3D won't render to texture

I use RenderTexture to render a layer with all its nodes to a texture, then apply an OpenGL shader to that texture to create post-process effects. It all works fine except with Sprite3D and Billboard nodes. This has been asked on their forums a few times without any response. I wonder if anyone has got this to work.
Here is an example:
Layer* gameLayer = Layer::create();
this->addChild(gameLayer, 0);
auto dir = Director::getInstance()->getWinSize();
Camera *camera = Camera::createPerspective(60, (GLfloat)dir.width / dir.height, 1, 1000);
camera->setPosition3D(Vec3(0, 100, 100));
camera->lookAt(Vec3(0, 0, 0), Vec3(0, 1, 0));
gameLayer->addChild(camera); //add camera to the scene
// You'll get a NULL camera inside BillBoard::calculateBillbaordTransform() function
// if you call visit()
/*auto billboard = BillBoard::create("cocos2d-x.png", BillBoard::Mode::VIEW_POINT_ORIENTED);
billboard->setPosition(Vec2(VisibleRect::center().x, VisibleRect::center().y));
gameLayer->addChild(billboard, 100);*/
// This one won't render into the texture
Sprite3D* sprite3D = Sprite3D::create("blend_test/character_3_animations_test.c3b");
sprite3D->setScale(5.0f); //sets the object scale in float
sprite3D->setRotation3D(Vec3(0.0f, 0.0f, 0.0f));
//sprite3D->setPosition3D(Vec3(VisibleRect::center().x, VisibleRect::center().y, 0.0f)); //sets sprite position
sprite3D->setPosition(Vec2(VisibleRect::center().x, VisibleRect::center().y));
gameLayer->addChild(sprite3D, 1); //adds sprite to scene, z-index: 1
// This one works just fine and appears black and white as expected
// in the resulting texture
Sprite* sprite2D = Sprite::create("cocos2d-x.png");
sprite2D->setPosition(Vec2(VisibleRect::center().x, VisibleRect::center().y));
gameLayer->addChild(sprite2D);
// Black and white OpenGL shader
GLProgram* glProgram = GLProgram::createWithFilenames("shaders/gray.vert", "shaders/gray.frag");
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_POSITION, GLProgram::VERTEX_ATTRIB_POSITION);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_COLOR, GLProgram::VERTEX_ATTRIB_COLOR);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD, GLProgram::VERTEX_ATTRIB_TEX_COORD);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD1, GLProgram::VERTEX_ATTRIB_TEX_COORD1);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD2, GLProgram::VERTEX_ATTRIB_TEX_COORD2);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD3, GLProgram::VERTEX_ATTRIB_TEX_COORD3);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_NORMAL, GLProgram::VERTEX_ATTRIB_NORMAL);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_BLEND_WEIGHT, GLProgram::VERTEX_ATTRIB_BLEND_WEIGHT);
glProgram->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_BLEND_INDEX, GLProgram::VERTEX_ATTRIB_BLEND_INDEX);
glProgram->link();
glProgram->updateUniforms();
RenderTexture* renderTexture = RenderTexture::create(VisibleRect::width(), VisibleRect::height());
renderTexture->retain();
Sprite* ppSprite = Sprite::createWithTexture(renderTexture->getSprite()->getTexture());
ppSprite->setTextureRect(Rect(0, 0, ppSprite->getTexture()->getContentSize().width,
ppSprite->getTexture()->getContentSize().height));
ppSprite->setAnchorPoint(Point::ZERO);
ppSprite->setPosition(Point::ZERO);
ppSprite->setFlippedY(true);
ppSprite->setGLProgram(glProgram);
this->addChild(ppSprite, 100);
renderTexture->beginWithClear(0.0f, 0.0f, 0.0f, 0.0f);
auto renderer = _director->getRenderer();
auto& parentTransform = _director->getMatrix(MATRIX_STACK_TYPE::MATRIX_STACK_MODELVIEW);
gameLayer->visit(renderer, parentTransform, true);
//gameLayer->visit();
renderTexture->end();
ppSprite->setTexture(renderTexture->getSprite()->getTexture());
Cocos2d-x v3.11.1 (current as of this post) and below don't properly support RenderTextures with Sprite3D because of a clear depth buffer bug.
There is a GitHub issue on the bug. But a workaround now exists:
...
sprite3D->setForce2DQueue(true); // puts your Sprite3D on same render queue as the RenderTexture. More info below.
...
auto rt = RenderTexture::create(1280, 720, Texture2D::PixelFormat::RGBA8888, GL_DEPTH24_STENCIL8); // By default a depth buffer isn't created
rt->setKeepMatrix(true); // required
...
...
rt->beginWithClear(0, 0, 0, 0, 1); // required, clears the depth buffer
Also, changes need to be made to RenderTexture.cpp. This fixes the clear depth buffer bug in Cocos2d-x.
void RenderTexture::onClear()
{
    // save clear color
    GLfloat oldClearColor[4] = {0.0f};
    GLfloat oldDepthClearValue = 0.0f;
    GLint oldStencilClearValue = 0;
    GLboolean oldDepthWrite = GL_FALSE;
    // backup and set
    if (_clearFlags & GL_COLOR_BUFFER_BIT)
    {
        glGetFloatv(GL_COLOR_CLEAR_VALUE, oldClearColor);
        glClearColor(_clearColor.r, _clearColor.g, _clearColor.b, _clearColor.a);
    }
    if (_clearFlags & GL_DEPTH_BUFFER_BIT)
    {
        glGetFloatv(GL_DEPTH_CLEAR_VALUE, &oldDepthClearValue);
        glClearDepth(_clearDepth);
        glGetBooleanv(GL_DEPTH_WRITEMASK, &oldDepthWrite);
        glDepthMask(true);
    }
    if (_clearFlags & GL_STENCIL_BUFFER_BIT)
    {
        glGetIntegerv(GL_STENCIL_CLEAR_VALUE, &oldStencilClearValue);
        glClearStencil(_clearStencil);
    }
    // clear
    glClear(_clearFlags);
    // restore
    if (_clearFlags & GL_COLOR_BUFFER_BIT)
    {
        glClearColor(oldClearColor[0], oldClearColor[1], oldClearColor[2], oldClearColor[3]);
    }
    if (_clearFlags & GL_DEPTH_BUFFER_BIT)
    {
        glClearDepth(oldDepthClearValue);
        glDepthMask(oldDepthWrite);
    }
    if (_clearFlags & GL_STENCIL_BUFFER_BIT)
    {
        glClearStencil(oldStencilClearValue);
    }
}
See the issue for more details. I also made an example gist of the workaround.
I'm not sure about billboards, but this workaround might fix it too.
Info on Cocos2d-x render queues:
The Sprite3D needs to be on the same render queue as the RenderTexture. Cocos2d-x (as of v3.7 or so) now has 5 render queues:
Global Z Order < 0
3D Opaque
3D Transparent
Global Z Order == 0 (default for 2D)
Global Z Order > 0
You can put the Sprite3D and the RenderTexture on the last queue with setGlobalZOrder(1) or just put the Sprite3D in the 2D queue with sprite3D->setForce2DQueue(true).
Unlike cocos2d's RenderTexture, the following worked fine for me for 3D screen capture (or anything else, I imagine)!
Sprite * CcGlobal::getScreenAsSprite(void) {
    Size screenSize = Director::getInstance()->getWinSize();
    int width = screenSize.width;
    int height = screenSize.height;
    std::shared_ptr<GLubyte> buffer(new GLubyte[width * height * 4], [](GLubyte* p) { CC_SAFE_DELETE_ARRAY(p); });
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer.get());
    Image* image = new (std::nothrow) Image;
    image->initWithRawData(buffer.get(), width * height * 4, width, height, 8);
    Texture2D *texture = new (std::nothrow) Texture2D();
    texture->initWithImage(image);
    texture->autorelease(); // hand ownership to the autorelease pool; the sprite frame retains it, so this avoids a leak
    SpriteFrame *spriteFrame = SpriteFrame::createWithTexture(texture, Rect(Vec2(0, 0), screenSize));
    Sprite *sprite = Sprite::createWithSpriteFrame(spriteFrame);
    sprite->setFlippedY(true);
    delete image;
    return sprite;
}

LibGDX FrameBuffer

I'm trying to make a game where you build a spaceship from parts, and fly it around and such.
I would like to create the ship from a series of components (from a TextureAtlas, for instance), draw the ship from its component textures, and then save the drawn ship as one large Texture (so I don't have to draw 50 component textures, just one ship texture). What would be the best way to go about this?
I've been trying to do so using a FrameBuffer. I've been able to draw the components to a texture and draw the texture to the screen, but no matter what I try, the texture ends up with a solid background the size of the frame buffer. It's like the clear command can't clear with transparency. Here's the drawing code I have at the moment. The ship is just a Sprite to which I save the FrameBuffer texture.
public void render(){
    if (ship == null){
        int screenwidth = Gdx.graphics.getWidth();
        int screenheight = Gdx.graphics.getHeight();
        SpriteBatch fb = new SpriteBatch();
        FrameBuffer fbo = new FrameBuffer(Format.RGB888, screenwidth, screenheight, false);
        fbo.begin();
        fb.enableBlending();
        Gdx.gl.glBlendFuncSeparate(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA, GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);
        Gdx.gl.glClearColor(1, 0, 1, 0);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        fb.begin();
        atlas.createSprite("0").draw(fb);
        fb.end();
        fbo.end();
        ship = new Sprite(fbo.getColorBufferTexture());
        ship.setPosition(0, -screenheight);
    }
    Gdx.gl.glClearColor(1, 0, 0, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.begin();
    batch.enableBlending();
    batch.setBlendFunction(GL20.GL_ONE, GL20.GL_ZERO);
    ship.draw(batch);
    batch.end();
}
The problem here lies in this line:
FrameBuffer fbo = new FrameBuffer(Format.RGB888, screenwidth, screenheight, false);
specifically with Format.RGB888. This line says that your FrameBuffer should be Red (8 bits) followed by Green (8 bits) followed by Blue (8 bits). Notice, however, that this format doesn't have any bits for alpha (transparency). To get transparency out of your frame buffer, you probably want to use Format.RGBA8888 instead, which includes an additional 8 bits for alpha.
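In other words, the fix is a one-line change (a sketch against the variables in the question; everything else can stay as it is):
FrameBuffer fbo = new FrameBuffer(Format.RGBA8888, screenwidth, screenheight, false); // RGBA so the clear color's alpha of 0 is actually stored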
Hope this helps.

Pixel shader with SharpDX and DirectX toolkit outputting pure red color

I am creating a Windows Phone 8 app and I'm working with the camera. When I don't use any shader, my C# code works perfectly:
void photoDevice_PreviewFrameAvailable(ICameraCaptureDevice sender, object args)
{
    sender.GetPreviewBufferArgb(captureData);
    previewTexture.SetData<int>(captureData);
}
...
spriteBatch.Begin();
spriteBatch.Draw(previewTexture, new Vector2(backBufferXCenter, backBufferYCenter), null, Color.White, (float)Math.PI / 2.0f,
new Vector2(textureXCenter, textureYCenter), new Vector2(xScale, yScale), SpriteEffects.None, 0.0f);
spriteBatch.End();
I am getting the camera input in real time. However, I'm now trying to use a pixel shader (just to pass the input through):
Texture2D MyTexture : register(t0);
sampler textureSampler = sampler_state {
    Texture = (MyTexture);
    Filter = MIN_MAG_MIP_LINEAR;
};
...
float4 pixelShader(float4 color : COLOR0,
                   float2 texCoord : TEXCOORD0) : SV_Target0
{
    float4 textureColor = tex2D(textureSampler, texCoord);
    return textureColor;
}
The shader runs fine (I assign it at the beginning of the sprite batch) with no exceptions, but all I'm getting is red: the whole output is pure red. What could be the reason? I am new to shaders and I'm trying to understand how they work, especially with samplers. Thank you.
If I'm not wrong, you need to get the pixel data in BGRA format, not RGBA. Could you check whether that works for you?
You can check this article: Creating a Lens application that uses HLSL effects for filters.
Regards,
Pieter Voloshyn

WebGL: drawArrays: attribs not setup correctly

Here's my vertex shader:
attribute vec3 aVertexPosition;
attribute vec4 aVertexColor;
attribute float type;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
varying vec4 vColor;
void main(void) {
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
    vColor = aVertexColor;
    if (type > 0.0) {
        // type-specific logic will go here
    } else {
    }
}
What I want to do is pretty simple: just capture a float value named type and use it for some logic.
The problem is, when I try to use it in JavaScript, these errors appear:
shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "type");
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);
WebGL: INVALID_OPERATION: drawArrays: attribs not setup correctly main.js:253
WebGL: INVALID_OPERATION: drawArrays: attribs not setup correctly main.js:267
WebGL: INVALID_OPERATION: drawElements: attribs not setup correctly
The output of getAttribLocation is meaningful; all of the locations are equal to or greater than 0.
================= UPDATE ===================
Here's my whole project code:
https://gist.github.com/royguo/5873503
Explanation:
index.html: the shader scripts are here.
main.js: starts the WebGL application and draws the scene.
shaders.js: loads shaders and binds attributes.
buffers.js: initializes the vertex and color buffers.
utils.js: commonly used utils.
Here is a link to a gist with the files I updated to get the type attribute working.
If you search for //ADDED CODE you should be able to view every change I had to make to get it working.
In addition to enabling the objectTypeAttribute you have to create an array buffer for each object you are drawing:
triangleObjectTypeBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, triangleObjectTypeBuffer);
objectTypes = [
    1.0, 1.0, 0.0
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(objectTypes), gl.STATIC_DRAW);
triangleObjectTypeBuffer.itemSize = 1;
triangleObjectTypeBuffer.numItems = 3;
And bind that array buffer for each object before you draw the object:
gl.bindBuffer(gl.ARRAY_BUFFER, triangleObjectTypeBuffer);
gl.vertexAttribPointer(shaderProgram.objectTypeAttribute, triangleObjectTypeBuffer.itemSize, gl.FLOAT, false, 0, 0);
You probably already tried this and accidentally went wrong somewhere along the way.