Pixel shader with SharpDX and DirectX toolkit outputting pure red color - windows-phone-8

I am creating a Windows Phone 8 app and I'm working with the camera. When I don't use any shader, my C# code works perfectly:
void photoDevice_PreviewFrameAvailable(ICameraCaptureDevice sender, object args)
{
    sender.GetPreviewBufferArgb(captureData);
    previewTexture.SetData<int>(captureData);
}
...
spriteBatch.Begin();
spriteBatch.Draw(previewTexture, new Vector2(backBufferXCenter, backBufferYCenter), null, Color.White, (float)Math.PI / 2.0f,
    new Vector2(textureXCenter, textureYCenter), new Vector2(xScale, yScale), SpriteEffects.None, 0.0f);
spriteBatch.End();
I am getting camera input in real time. However, when I try to use a pixel shader (just to pass the input through):
Texture2D MyTexture : register(t0);
sampler textureSampler = sampler_state {
    Texture = (MyTexture);
    Filter = MIN_MAG_MIP_LINEAR;
};
...
float4 pixelShader(float4 color : COLOR0,
                   float2 texCoord : TEXCOORD0) : SV_Target0
{
    float4 textureColor = tex2D(textureSampler, texCoord);
    return textureColor;
}
The shader runs fine (I assign it at the beginning of the sprite batch) with no exceptions, but all I'm getting is red: the whole output is pure red. What could be the reason? I am new to shaders and I'm trying to understand how they work, especially with samplers. Thank you.

If I'm not wrong, you need to get the pixel data in BGRA format, not RGBA. Could you check if it works for you?
You can check this article.
Creating a Lens application that uses HLSL effects for filters
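If the channels are indeed swapped, a quick way to confirm is to swap red and blue yourself before uploading the texture (or to swizzle in the shader, e.g. returning textureColor.bgra). Below is a minimal sketch of the CPU-side swap; it is written in Java only for illustration, and the same bit manipulation applies to the C# captureData buffer:
// Hypothetical helper (not from the original post): swaps the R and B
// channels of packed 32-bit ARGB pixels in place, turning ARGB into ABGR.
static void swapRedBlue(int[] pixels) {
    for (int i = 0; i < pixels.length; i++) {
        int p = pixels[i];
        pixels[i] = (p & 0xFF00FF00)          // keep alpha and green
                  | ((p & 0x00FF0000) >>> 16) // move red into the blue slot
                  | ((p & 0x000000FF) << 16); // move blue into the red slot
    }
}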
Regards,
Pieter Voloshyn

Related

Convert OpenGL HQX shader to LibGDX

I was getting into shaders for LibGDX and noticed there are some attributes that are only used in LibGDX.
The standard vertex and fragment shaders from https://github.com/libgdx/libgdx/wiki/Shaders work perfectly and get applied to my SpriteBatch.
When I try to use an HQX shader like https://github.com/SupSuper/OpenXcom/blob/master/bin/common/Shaders/HQ2x.OpenGL.shader I get a lot of errors.
Probably because I need to send some LibGDX-dependent variables to the shader, but I can't figure out which ones those should be.
I'd like to use these shaders on desktops with large screens so the game keeps looking great on those screens.
I used this code to load the shader:
try {
    shaderProgram = new ShaderProgram(Gdx.files.internal("vertex.glsl").readString(), Gdx.files.internal("fragment.glsl").readString());
    shaderProgram.pedantic = false;
    System.out.println("Shader Log:");
    System.out.println(shaderProgram.getLog());
} catch (Exception ex) {
    ex.printStackTrace(); // don't swallow shader loading errors silently
}
The Shader Log outputs:
No errors.
Thanks in advance.
This is a post processing shader, so your flow should go like this:
Draw your scene to a FBO at pixel perfect resolution using SpriteBatch's default shader.
Draw the FBO's texture to the screen's frame buffer using the upscaling shader. You can do this with SpriteBatch if you modify the shader to match the attributes and uniforms that SpriteBatch uses. (You could alternatively create a simple mesh with the attribute names that the shader expects, but SpriteBatch is probably easiest.)
First of all, we are not using a typical shader with SpriteBatch so you need to call ShaderProgram.pedantic = false; somewhere before loading anything.
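For example, a minimal sketch of that setup (note the flag is static, so it affects every ShaderProgram created afterwards, and isCompiled() is worth checking explicitly):
ShaderProgram.pedantic = false; // don't throw when the shader ignores some of SpriteBatch's attributes/uniforms
ShaderProgram upscaleShader = new ShaderProgram(
        Gdx.files.internal("vertex.glsl").readString(),
        Gdx.files.internal("fragment.glsl").readString());
if (!upscaleShader.isCompiled()) {
    Gdx.app.error("Shader", upscaleShader.getLog()); // surface compile errors instead of failing silently
}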
Now you need a FrameBuffer at the right size. It should be sized for your sprites to be pixel perfect (one pixel of texture scales to one pixel of world). Something like this:
public void resize (int width, int height) {
    float ratio = (float) width / (float) height;
    int gameWidth = (int) (GAME_HEIGHT * ratio); // match the window's aspect ratio at the fixed game height
    boolean needNewFrameBuffer = false;
    if (frameBuffer != null && (frameBuffer.getWidth() != gameWidth || frameBuffer.getHeight() != GAME_HEIGHT)) {
        frameBuffer.dispose();
        needNewFrameBuffer = true;
    }
    if (frameBuffer == null || needNewFrameBuffer)
        frameBuffer = new FrameBuffer(Format.RGBA8888, gameWidth, GAME_HEIGHT, false); // no depth buffer needed for 2D
    camera.viewportWidth = gameWidth;
    camera.viewportHeight = GAME_HEIGHT;
    camera.update();
}
Then you can draw to the frame buffer as if it's your screen. And after that, you draw the frame buffer's texture to the screen.
public void render () {
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

    frameBuffer.begin();
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.setProjectionMatrix(camera.combined);
    batch.setShader(null); // use default shader
    batch.begin();
    // draw your game
    batch.end();
    frameBuffer.end();

    batch.setShader(upscaleShader);
    batch.begin();
    upscaleShader.setUniformf("rubyTextureSize", frameBuffer.getWidth(), frameBuffer.getHeight()); // this is the uniform in your shader. I assume it's wanting the scene size in pixels
    batch.draw(frameBuffer.getColorBufferTexture(), -1, 1, 2, -2); // full screen quad for no projection matrix, with Y flipped as needed for frame buffer textures
    batch.end();
}
There are also some changes you need to make to your shader so it will work with OpenGL ES, and because SpriteBatch is wired for specific attribute and uniform names:
At the top of your vertex shader, add this to define your vertex attributes and varyings (which your linked shader doesn't need because it's relying on built-in variables that aren't available in GL ES):
attribute vec4 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord[5];
Then in the vertex shader, change the gl_Position line to
gl_Position = a_position; //since we can't rely on built-in variables
and replace all occurrences of gl_TexCoord with v_texCoord for the same reason.
In the fragment shader, to be compatible with OpenGL ES, you need to declare precision. You also need to declare the same varying, so add this to the top:
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoord[5];
As with the vertex shader, replace all occurrences of gl_TexCoord with v_texCoord. And also replace all occurrences of rubyTexture with u_texture, which is the texture name that SpriteBatch uses.
I think that's everything. I didn't actually test this and I'm going off of memory, but hopefully it gets you close.

Setting a custom shader messes up the scale and position of a Sprite

I am using a custom shader on cocos2d-x 3.1, trying to achieve a special effect on a sprite used as a background (it has to cover the whole screen).
If I do not use the shader, the sprite is perfectly scaled and positioned, but when I do use it, it shows up much smaller and in the bottom-left corner.
Here is how I load the sprite:
this->background_image = Sprite::create(image_name->GetText());
// Add background shader
if (this->background_image)
{
    const GLchar *shaderSource = (const GLchar *) CCString::createWithContentsOfFile("OverlayShader.fsh")->getCString();
    GLProgram *p = new GLProgram();
    p->initWithByteArrays(ccPositionTextureA8Color_vert, shaderSource);
    p->link();
    p->updateUniforms();
    this->background_image->setGLProgram(p);
}
// Classroom will be stretched to cover all of the game screen
Size bgImgSize = this->background_image->getContentSize();
Size windowSize = Director::getInstance()->getWinSize();
float xScaleFactor = windowSize.width / bgImgSize.width;
float yScaleFactor = (windowSize.height - MARGIN_SPACE + 10) / bgImgSize.height;
this->background_image->setScale(xScaleFactor, yScaleFactor);
this->background_image->setPosition(Vec2(windowSize.width / 2.0f, windowSize.height / 2.0f + ((MARGIN_SPACE - 10) / 2.0)));
this->background_image->retain();
And this is the shader I'm trying to use (a simple one; once this works I'll change it to a Photoshop-overlay style one):
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;

void main()
{
    vec4 v_orColor = v_fragmentColor * texture2D(CC_Texture0, v_texCoord);
    float gray = dot(v_orColor.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(gray, gray, gray, v_orColor.a);
}
My question is: what am I doing wrong? The first thing that comes to mind is that the attribute pointers used in the vertex shader are not correct, but I am using the default vertex shader.
I found the solution in another post, so I'll just quote it and link to it:
Found the solution. The vert shader should not use the MVP matrix, so I loaded ccPositionTextureColor_noMVP_vert instead of ccPositionTextureA8Color_vert.
Weird y-position offset using custom frag shader (Cocos2d-x)

libgdx open gles 2.0 stencil alpha masking

I'm looking for a solution to implement alpha masking with the stencil buffer in libgdx with OpenGL ES 2.0.
I have managed to implement simple alpha masking with the stencil buffer and shaders, where a fragment gets discarded if its alpha channel is greater than some specified value. That works fine.
The problem is when I want to use a gradient image mask or a feathered PNG mask: I don't get what I wanted (I get a "filled" rectangle mask with no alpha channel); instead I want a smooth fade-out mask.
I know the problem is that the stencil buffer holds only 0s and 1s, but I want to write other values to the stencil, representing the actual alpha value of the fragment that passed the fragment shader, and then use that value from the stencil to somehow do some blending.
I hope I've explained what I want to get, if it's possible at all.
I've recently started playing with OpenGL ES, so I still have some misunderstandings.
My question is: how do I set up a stencil buffer to store values other than 0s and 1s, and how do I use those values later for alpha masking?
Thanks in advance.
This is currently my stencil setup:
Gdx.gl.glClearColor(1, 1, 1, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_STENCIL_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
// setup drawing to stencil buffer
Gdx.gl20.glEnable(GL20.GL_STENCIL_TEST);
Gdx.gl20.glStencilFunc(GL20.GL_ALWAYS, 0x1, 0xffffffff);
Gdx.gl20.glStencilOp(GL20.GL_REPLACE, GL20.GL_REPLACE, GL20.GL_REPLACE);
Gdx.gl20.glColorMask(false, false, false, false);
Gdx.gl20.glDepthMask(false);
spriteBatch.setShader(shaderStencilMask);
spriteBatch.begin();
// push to the batch
spriteBatch.draw(Assets.instance.actor1, Gdx.graphics.getWidth() / 2, Gdx.graphics.getHeight() / 2, Assets.instance.actor1.getRegionWidth(), Assets.instance.actor1.getRegionHeight());
spriteBatch.end();
// fix stencil buffer, enable color buffer
Gdx.gl20.glColorMask(true, true, true, true);
Gdx.gl20.glDepthMask(true);
Gdx.gl20.glStencilOp(GL20.GL_KEEP, GL20.GL_KEEP, GL20.GL_KEEP);
// draw where pattern has NOT been drawn
Gdx.gl20.glStencilFunc(GL20.GL_EQUAL, 0x1, 0xff);
decalBatch.add(decal);
decalBatch.flush();
Gdx.gl20.glDisable(GL20.GL_STENCIL_TEST);
decalBatch.add(decal2);
decalBatch.flush();
The only ways I can think of doing this are with a FrameBuffer.
Option 1
Draw your scene's background (the stuff that will not be masked) to a FrameBuffer. Then draw your entire scene without masks to the screen. Then draw your mask decals to the screen using the FrameBuffer's color attachment. The downside to this method is that in OpenGL ES 2.0 on Android, a FrameBuffer can have RGBA4444, not RGBA8888, so there will be visible seams along the edges of the masks where the color bit depth changes.
Option 2
Draw your mask decals as opaque black and white to your FrameBuffer. Then draw your background to the screen. When you draw anything that can be masked, draw it with multi-texturing, multiplying by the FrameBuffer's color texture. The potential downside is that absolutely anything that can be masked must be drawn multi-textured with a custom shader. But if you're just using decals, then this isn't really any more complicated than Option 1.
The following is untested...might require a bit of debugging.
In both options, I would subclass CameraGroupStrategy to be used with the DecalBatch when drawing the mask decals, and override beforeGroups to also set the second texture.
public class MaskingGroupStrategy extends CameraGroupStrategy {
    private Texture fboTexture;

    // call this before using the DecalBatch for drawing mask decals
    public void setFBOTexture(Texture fboTexture){
        this.fboTexture = fboTexture;
    }

    @Override
    public void beforeGroups () {
        super.beforeGroups();
        fboTexture.bind(1);
        shader.setUniformi("u_fboTexture", 1);
        shader.setUniformf("u_screenDimensions", Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
    }
}
And in your shader, you can get the FBO texture color like this:
vec4 fboColor = texture2D(u_fboTexture, gl_FragCoord.xy/u_screenDimensions.xy);
Then for option 1:
gl_FragColor = vec4(fboColor.rgb, 1.0-texture2D(u_texture, v_texCoords).a);
or for option 2:
gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
gl_FragColor.a *= fboColor.r;
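For completeness, here is a minimal sketch of how the FrameBuffer feeding this strategy could be set up. Like the code above it is untested, and the names maskFbo and maskStrategy are illustrative, not from an existing API:
private FrameBuffer maskFbo;

private void renderMasksToFbo () {
    if (maskFbo == null) {
        maskFbo = new FrameBuffer(Format.RGBA8888, Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), false);
    }
    maskFbo.begin();                 // redirect drawing into the FBO
    Gdx.gl.glClearColor(0, 0, 0, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    // ... draw the mask decals here (opaque black and white for Option 2) ...
    maskFbo.end();                   // back to the screen's frame buffer
    maskStrategy.setFBOTexture(maskFbo.getColorBufferTexture());
}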

Try to add VBO in Slick2D

I'm trying to add a VBO in Slick2D. All I can find on the web is how to initialize a VBO in a 3D context. Does anyone know how to do it in 2D?
My current test (drawing 4 squares in the Slick context) produces this (I added the coords in black):
(image: source: canardpc.com)
Below is my init code (in the init method of my GameState):
// set up OpenGL
GL11.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_COLOR_ARRAY);
GL11.glMaterial(GL11.GL_FRONT, GL11.GL_SPECULAR, floatBuffer(1.0f, 1.0f, 1.0f, 1.0f));
GL11.glMaterialf(GL11.GL_FRONT, GL11.GL_SHININESS, 25.0f);
// set up the camera
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();
// create our vertex buffer objects
IntBuffer buffer = BufferUtils.createIntBuffer(1);
GL15.glGenBuffers(buffer);
int vertex_buffer_id = buffer.get(0);
FloatBuffer vertex_buffer_data = BufferUtils.createFloatBuffer(vertex_data_array.length);
vertex_buffer_data.put(vertex_data_array);
vertex_buffer_data.rewind();
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertex_buffer_id);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertex_buffer_data, GL15.GL_STATIC_DRAW);
And the render code (in the render method of the game state):
g.setDrawMode(Graphics.MODE_ALPHA_BLEND) ;
// perform rotation transformations
GL11.glPushMatrix();
// render the cube
GL11.glVertexPointer(3, GL11.GL_FLOAT, 28, 0);
GL11.glColorPointer(4, GL11.GL_FLOAT, 28, 12);
GL11.glDrawArrays(GL11.GL_QUADS, 0, vertex_data_array.length / 7);
// restore the matrix to pre-transformation values
GL11.glPopMatrix();
I think something is wrong, because all the other rendering (text and sprites) disappears and the coordinates no longer match the window size.
Edit: I tried something like GL11.glOrtho(0, 800, 600, 0, -1, 1); with strange results.
Thanks
I resolved the issue by adding GL11.glOrtho(0, 800, 600, 0, -1, 1); and disabling the client states (glDisableClientState) after drawing.
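For reference, a sketch of what that fix looks like around the render code from the question (assuming an 800x600 window; y is flipped so the origin matches Slick2D's top-left convention):
// set a 2D orthographic projection before drawing the VBO
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, 800, 600, 0, -1, 1); // top-left origin, like Slick2D
GL11.glMatrixMode(GL11.GL_MODELVIEW);

GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_COLOR_ARRAY);

GL11.glVertexPointer(3, GL11.GL_FLOAT, 28, 0);
GL11.glColorPointer(4, GL11.GL_FLOAT, 28, 12);
GL11.glDrawArrays(GL11.GL_QUADS, 0, vertex_data_array.length / 7);

// disable the client states again so Slick2D's own rendering (text, sprites) is unaffected
GL11.glDisableClientState(GL11.GL_COLOR_ARRAY);
GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);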
But I will eventually move to the libgdx framework, which does that natively.

Limiting custom BlendMode made with PixelBender - how to merge images

I need to mix two images: one photo and a placeholder. The idea is that we see the placeholder except where the placeholder has a particular color; in that case the user sees the photo. Something like chroma key.
For this purpose I wrote a Pixel Bender shader that acts as a BlendMode: if the background is in the right color, output the pixel from the photo, otherwise output the pixel from the placeholder.
<languageVersion : 1.0;>
kernel Crossfade
<   namespace : "mynamesp";
    vendor : "Artbits snc";
    version : 1;
    description : "description ... "; >
{
    input image4 placeHolder;
    input image4 myImage;
    output pixel4 dst;

    const float3 SPECIAL_COLOR = float3(159.0, 160.0, 161.0);

    void evaluatePixel()
    {
        float4 imgPixel = sample(myImage, outCoord());
        float4 placeHolderPixel = sample(placeHolder, outCoord());
        dst = placeHolderPixel;
        if (placeHolderPixel.r == (SPECIAL_COLOR.r / 255.0) && placeHolderPixel.g == (SPECIAL_COLOR.g / 255.0) && placeHolderPixel.b == (SPECIAL_COLOR.b / 255.0)) {
            dst = imgPixel;
        }
    }
}
Everything works fine except for the fact that I have multiple placeholders, one over the other, and my shader doesn't check the color of its own placeholder but the color of everything under the photo.
Is there a way to force a BlendMode to consider only one layer, or a specific background color?
Is there a smarter way to obtain the same result?
Thanks for your help! I know this is quite a long and complex question, especially with my English. :-)
I assume that you are using this BlendMode on generic DisplayObjects and not directly on BitmapData. In this case Pixel Bender can only work with the data that gets passed in from the drawing API, so you have to make sure that only those layers are used. The way to do it is to add only the placeholder object and the image object to one holder Sprite.