In my vertex shader I would like to modify the attribute vec2 a_position variable that is shared with the fragment shader. Through this modification I should get the image into a cylindrical projection.
This is what I'm doing in my shaders:
<!-- vertex shader -->
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
uniform vec2 u_resolution;
uniform mat3 u_matrix;
varying vec2 v_texCoord;
void main() {
// modifying START
float angle = atan(a_position.y, a_position.x);
float r = sqrt(a_position.x*a_position.x + a_position.y*a_position.y);
a_position.x = r*cos(angle);
a_position.y = r*sin(angle);
// modifying STOP
gl_Position = vec4(u_matrix * vec3(a_position, 1), 1);
v_texCoord = a_position;
}
</script>
<!-- fragment shader -->
<script id="2d-fragment-shader" type="x-shader/x-fragment">
precision mediump float;
// our texture
uniform sampler2D u_image;
// the texCoords passed in from the vertex shader.
varying vec2 v_texCoord;
void main() {
gl_FragColor = texture2D(u_image, v_texCoord);
}
</script>
But I'm getting this error:
compiling shader '[object WebGLShader]':ERROR: 0:12: 'assign' : l-value required "a_position" (can't modify an attribute)
ERROR: 0:13: 'assign' : l-value required "a_position" (can't modify an attribute)
Do you have any idea how to fix that?
Just use another variable
attribute vec2 a_position;
uniform vec2 u_resolution;
uniform mat3 u_matrix;
varying vec2 v_texCoord;
void main() {
// modifying START
float angle = atan(a_position.y, a_position.x);
float r = sqrt(a_position.x*a_position.x + a_position.y*a_position.y);
vec3 p = vec3(
r*cos(angle),
r*sin(angle),
1);
// modifying STOP
gl_Position = vec4(u_matrix * p, 1);
v_texCoord = a_position;
}
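A side note on the math: as written, the conversion is a pure round trip, so for an actual cylindrical warp the angle or radius has to be changed before converting back. A quick CPU-side check of the round trip in plain JavaScript (the helper name is made up for illustration):

```javascript
// CPU-side check of the polar round-trip used in the shader.
// Converting (x, y) to (angle, r) and straight back reproduces the
// original point, so by itself this is a no-op; a cylindrical warp
// has to change `angle` or `r` before converting back.
function polarRoundTrip(x, y) {
  const angle = Math.atan2(y, x);          // same as GLSL atan(y, x)
  const r = Math.sqrt(x * x + y * y);
  return [r * Math.cos(angle), r * Math.sin(angle)];
}

const [nx, ny] = polarRoundTrip(3, 4);
console.log(nx, ny); // ~3, ~4 (up to floating point)
```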
Related
I have a new class MyActor that extends Actor, with a shader applied to it.
However, the shader unexpectedly fills up the transparent background.
The draw() method inside MyActor is as follows:
@Override
public void draw(Batch batch, float parentAlpha) {
if(shaderProgram!=null)
{
batch.setShader(shaderProgram);
}
if(!drawParentAtBack)super.draw(batch, parentAlpha); // by default is false
Color c = getColor(); // used to apply tint color effect
batch.setColor(c.r, c.g, c.b, c.a * parentAlpha);
if ( isVisible() )
{
if(displayFrame !=null) // a textureregion
{
batch.draw(displayFrame,
getX(),getY(),
getOriginX(),getOriginY(),
getWidth(),getHeight(),
getScaleX(),getScaleY(),
getRotation());
}
}
if(drawParentAtBack)super.draw(batch, parentAlpha);
if(shaderProgram!=null)
{
batch.setShader(null);
}
}
public void setShader(String vs, String fs){
vertexShaderCode = Gdx.files.internal("shaders/" + vs + ".vs").readString();
fragmentShaderCode = Gdx.files.internal("shaders/" + fs + ".fs").readString();
shaderProgram = new ShaderProgram(vertexShaderCode, fragmentShaderCode);
if (!shaderProgram.isCompiled())
{
d( "Shader compile error: " + shaderProgram.getLog() );
}
}
My definition goes like this:
MyActor myActor1, myActor2;
..... // setting up myActor1 & myActor2
myActor1.setShader("default","greyScale");
myActor2.setShader("default","greyScale");
My simple fragment shader code, from this tutorial:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;
void main() {
vec3 color = texture2D(u_texture, v_texCoords).rgb;
float gray = (color.r + color.g + color.b) / 3.0;
vec3 grayscale = vec3(gray);
gl_FragColor = vec4(grayscale, 1.0);
}
My expected result is gray shapes with a transparent background, BUT it turns out like this:
Sample shape image without the shader:
Any help please?
In general, transparency is achieved by alpha blending: the alpha channel of the fragment controls the transparency and has to be set.
In the fragment shader the alpha channel of the texture is omitted:
gl_FragColor = vec4(grayscale, 1.0);
Set the alpha channel of the texture (u_texture) to the alpha channel of the output (gl_FragColor.a):
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;
void main() {
// read RGB color channels and alpha channel
vec4 color = texture2D(u_texture, v_texCoords);
float gray = (color.r + color.g + color.b) / 3.0;
vec3 grayscale = vec3(gray);
// write gray scale and alpha channel
gl_FragColor = vec4(grayscale.rgb, color.a);
}
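The per-texel math of the corrected shader can be sketched on the CPU to see why carrying the texture's alpha through matters (plain JavaScript; the `toGrayscale` helper and the RGBA array standing in for a texel are illustrative, not part of the shader API):

```javascript
// Sketch of the corrected fragment math on a single RGBA texel
// (channel values in 0..1, as in the shader). The gray value averages
// the color channels; the texel's own alpha is passed through unchanged
// instead of being forced to 1.0.
function toGrayscale([r, g, b, a]) {
  const gray = (r + g + b) / 3.0;
  return [gray, gray, gray, a];   // alpha preserved
}

console.log(toGrayscale([0.9, 0.3, 0.3, 0.0])); // a fully transparent texel stays transparent
```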
I'm looking for a minimal scanline shader to use with libGDX, preferably with the option to alter the intensity of the effect.
There's a libGDX example here (missing the vert and frag files):
Shaders in libgdx have no effect [Desktop]
However, this requires the use of a FrameBuffer. Is there a more elegant solution where I can just drop the vert and frag files into my shaders folder, then set it up in my code like this:
private String vertexShader;
private String fragmentShader;
private ShaderProgram shaderProgram;
@Override
public void create()
{
vertexShader = Gdx.files.internal("shaders/vertex.glsl").readString();
fragmentShader = Gdx.files.internal("shaders/fragment.glsl").readString();
shaderProgram = new ShaderProgram(vertexShader, fragmentShader);
spriteBatch.setShader(shaderProgram);
}
My game is targeted at low-end Android phones. I currently get a reasonably stable 60fps, and would like to keep this performance.
edit 1:
Following Tenfour04's snippet, my vertex file currently looks like this:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
uniform float u_screenHeight;
varying vec4 v_color;
varying vec2 v_texCoords;
void main()
{
v_color = a_color;
v_texCoords = a_texCoord0;
gl_Position = u_projTrans * a_position;
}
edit 2:
A new, possibly simpler method, but the rendering loses transparency:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_sampler2D;
uniform mat4 u_projTrans;
void main(void)
{
vec2 p = vec2(floor(gl_FragCoord.x), floor(gl_FragCoord.y));
if (mod(p.y, 2.0)==0.0)
gl_FragColor = vec4(texture2D(u_sampler2D,v_texCoords).xyz ,1.0);
else
gl_FragColor = vec4(0.0,0.0,0.0 ,1.0);
}
Here's a shader I think will work. I didn't test it, so you might have to debug it. You can customize the line count and intensity constants to get the look you want. This is very simply based on a sine curve, and it only causes darkening. There are more elaborate effects you could achieve by lightening and darkening the color using the sine wave. You could also truncate the sine function with step functions to possibly increase its realism.
//Vertex shader (same as SpriteBatch's default)
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
varying vec2 v_texCoords;
varying vec4 v_color;
uniform mat4 u_projTrans;
void main()
{
v_texCoords = a_texCoord0;
v_color = a_color;
v_color.a = v_color.a * (255.0/254.0);
gl_Position = u_projTrans * a_position;
}
//Fragment shader
#ifdef GL_ES
precision mediump float;
#endif
const float LINE_COUNT = 90.0;
const float FREQ = LINE_COUNT * 2.0 * 3.14159;
const float INTENSITY = 0.4;
varying vec2 v_texCoords;
varying vec4 v_color;
uniform sampler2D u_texture;
uniform float u_screenHeight;
void main()
{
vec4 texture = texture2D(u_texture, v_texCoords);
float screenV = gl_FragCoord.y / u_screenHeight;
float scanLine = 1.0 - INTENSITY * (sin(FREQ * screenV) * 0.5 + 0.5);
//Possibly cheaper methods, in increasing realism / performance hit
//float scanLine = 1.0 - INTENSITY * mod(screenV * LINE_COUNT, 1.0);
//float scanLine = 1.0 - INTENSITY * step(0.5, mod(screenV * LINE_COUNT, 1.0));
//float scanLine = 1.0 - INTENSITY * abs(mod(screenV * LINE_COUNT, 1.0) - 0.5);
gl_FragColor = v_color * vec4(texture.rgb * scanLine, texture.a);
}
It uses screen height as a parameter, so you have to set the screen height when the screen is resized:
public void resize (int width, int height){
//...
shader.begin();
shader.setUniformf("u_screenHeight", height);
shader.end();
}
If you use one of the mod() scan line calculations above that I commented out instead of the sin() based one, you can further optimize by changing this to:
shader.setUniformf("u_screenHeight", LINE_COUNT / (float)height);
and changing the shader's screenV line's / to a *, and then removing * LINE_COUNT from the scanLine calculation. This would save an operation, and also I think * is slightly faster than /. (If you do this, you might consider renaming the u_screenHeight variable to something that makes sense.)
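The darkening factors above are plain math and can be compared outside the shader. A small JavaScript sketch of the sine-based factor and the first mod()-based variant, using the same constants as the shader:

```javascript
// Scan-line darkening factors from the shader, evaluated in JS.
// screenV is the fragment's vertical position, normalized to 0..1.
const LINE_COUNT = 90.0;
const FREQ = LINE_COUNT * 2.0 * Math.PI;
const INTENSITY = 0.4;

function scanLineSin(screenV) {
  return 1.0 - INTENSITY * (Math.sin(FREQ * screenV) * 0.5 + 0.5);
}
// Cheaper variant from the commented-out line: a sawtooth via mod().
function scanLineMod(screenV) {
  return 1.0 - INTENSITY * ((screenV * LINE_COUNT) % 1.0);
}

// Both factors stay within [1 - INTENSITY, 1], i.e. darkening only.
console.log(scanLineSin(0.25), scanLineMod(0.25));
```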
You can keep your method and still have transparency.
This is the main() in my fragment shader:
gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
if (mod(floor(gl_FragCoord.y),4.0) == 0.0)
gl_FragColor.rgb *= 0.5;
With this I have transparent textures showing cheap scanlines. Note that I'm using 4.0 as divisor, just personal taste.
In any case, the key is to only modify gl_FragColor.rgb and leave gl_FragColor.a (the alpha channel) alone.
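The row selection in that snippet can be checked outside the shader. A tiny JavaScript sketch of which screen rows get darkened, with the same divisor of 4.0 (the helper name is illustrative):

```javascript
// Which screen rows the `mod(floor(gl_FragCoord.y), 4.0) == 0.0` test hits:
// every fourth row (0, 4, 8, ...) gets its RGB halved, alpha untouched.
function rowFactor(fragY) {
  return Math.floor(fragY) % 4 === 0 ? 0.5 : 1.0;
}

console.log([0, 1, 2, 3, 4, 5].map(rowFactor)); // [0.5, 1, 1, 1, 0.5, 1]
```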
I was working on a quick and dirty scanline effect too and arrived at this question after searching for gl_FragCoord.y and mod. The floor(gl_FragCoord.y) here solved my problem.
I have the fragment shader below. It changes the video to grey when I press a certain button. It works, but there is one problem: all the sprites move down a few pixels after the shader is set. It seems the whole screen is offset downward by a few pixels.
#ifdef GL_ES
precision mediump float;
precision mediump int;
#else
#define highp
#endif
uniform sampler2D u_texture;
varying vec4 v_color;
varying vec2 v_texCoord;
const vec3 grayChange = vec3(0.299*0.4, 0.587*0.4, 0.114*0.4);
void main() {
vec4 texColor = texture2D(u_texture, v_texCoord);
vec3 grayCol = vec3(dot(texColor.rgb, grayChange));
gl_FragColor = vec4(grayCol.r, grayCol.g, grayCol.b, texColor.a * v_color.a);
}
Here's the vertex shader:
uniform mat4 U_WORLD_VIEW;
attribute vec4 a_position;
attribute vec2 a_texCoord0;
attribute vec4 a_color;
varying vec4 v_color;
varying vec2 v_texCoord;
void main() {
gl_Position = U_WORLD_VIEW * vec4(a_position.xy,0,1);
v_texCoord = a_texCoord0;
v_color = a_color;
}
Here are my vertex and fragment shaders:
<script id="shader-fs" type="x-shader/x-fragment">
precision mediump float;
uniform sampler2D uSampler;
varying vec4 vColor;
varying vec2 vTextureCoord;
void main(void) {
gl_FragColor = vColor;
// gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
}
</script>
<script id="shader-vs" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec4 aVertexColor;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
varying vec4 vColor;
varying vec2 vTextureCoord;
void main(void) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
vColor = aVertexColor;
// vTextureCoord = aTextureCoord;
}
</script>
And here's my shader initializer:
function initShaders() {
var fragmentShader = getShader(gl, "shader-fs");
var vertexShader = getShader(gl, "shader-vs");
shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertexShader);
gl.attachShader(shaderProgram, fragmentShader);
gl.linkProgram(shaderProgram);
if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
alert("Could not initialise shaders");
}
gl.useProgram(shaderProgram);
shaderProgram.vertexPositionAttribute = gl.getAttribLocation(shaderProgram, "aVertexPosition");
gl.enableVertexAttribArray(shaderProgram.vertexPositionAttribute);
shaderProgram.vertexColorAttribute = gl.getAttribLocation(shaderProgram, "aVertexColor");
gl.enableVertexAttribArray(shaderProgram.vertexColorAttribute);
shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);
shaderProgram.pMatrixUniform = gl.getUniformLocation(shaderProgram, "uPMatrix");
shaderProgram.mvMatrixUniform = gl.getUniformLocation(shaderProgram, "uMVMatrix");
shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");
}
The error comes from this line:
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);
>> enablevertexattribarray index out of range
How do I deal with it?
That's simply because you do not use aTextureCoord in your vertex program, so the GLSL compiler optimizes it away by removing the attribute. You really should check the result of gl.getAttribLocation() for errors, and enable only the attributes that are present in your program. Issuing a warning in case an attribute is missing would be sufficient; I know of no way to distinguish shader-authoring errors from optimizations by the compiler.
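One way to follow that advice is a small helper that only enables attributes the linker actually kept. This is a sketch, not part of the WebGL API; `gl` is your WebGL context and the attribute names are the ones from the code above:

```javascript
// Enable a vertex attribute only if it survived linking.
// gl.getAttribLocation() returns -1 for attributes the compiler
// optimized away (or for misspelled names).
function enableAttribIfPresent(gl, program, name) {
  const loc = gl.getAttribLocation(program, name);
  if (loc === -1) {
    console.warn("attribute not found (unused or misspelled): " + name);
    return -1;
  }
  gl.enableVertexAttribArray(loc);
  return loc;
}

// Usage with the program above:
// shaderProgram.textureCoordAttribute =
//     enableAttribIfPresent(gl, shaderProgram, "aTextureCoord");
```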
I am using this Vertex Shader on Stage3d:
<script id="per-fragment-lighting-vs2" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
void main(void) {
vPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uPMatrix * vPosition;
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
</script>
With this result:
If I edit the shader like this:
<script id="per-fragment-lighting-vs2" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
void main(void) {
gl_Position = uMVMatrix * vec4(aVertexPosition, 1.0);
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
</script>
and premultiply the perspective in ActionScript, I get a correct UV display. Although it has a different perspective, it looks OK.
Any suggestions?
Most likely you have reversed the matrix multiplication sequence in your first shader implementation, so you should first multiply by the perspective matrix, and then by the model-view matrix.
void main(void) {
vPosition = uPMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uMVMatrix * vPosition;
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
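The order matters here because matrix multiplication is not commutative. A quick JavaScript check with a translation and a uniform scale standing in for the MV and P matrices (row-major layout; `mul4` is a made-up helper for this sketch):

```javascript
// Minimal 4x4 row-major matrix multiply, just to show M1*M2 != M2*M1.
function mul4(a, b) {
  const out = new Array(16).fill(0);
  for (let r = 0; r < 4; r++)
    for (let c = 0; c < 4; c++)
      for (let k = 0; k < 4; k++)
        out[r * 4 + c] += a[r * 4 + k] * b[k * 4 + c];
  return out;
}

// A translation by (1, 0, 0) and a scale by 2 stand in for MV and P.
const T = [1,0,0,1, 0,1,0,0, 0,0,1,0, 0,0,0,1];
const S = [2,0,0,0, 0,2,0,0, 0,0,2,0, 0,0,0,1];

// S*T translates by 2 on x; T*S translates by 1: order changes the result.
console.log(mul4(S, T)[3], mul4(T, S)[3]); // 2 1
```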