I am using this vertex shader on Stage3D:
<script id="per-fragment-lighting-vs2" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
void main(void) {
vPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uPMatrix * vPosition;
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
</script>
With this result:
If I edit the shader like this:
<script id="per-fragment-lighting-vs2" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
uniform mat3 uNMatrix;
varying vec2 vTextureCoord;
varying vec3 vTransformedNormal;
varying vec4 vPosition;
void main(void) {
gl_Position = uMVMatrix * vec4(aVertexPosition, 1.0);
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
</script>
and premultiply the perspective matrix in ActionScript instead, I get a correct UV display. It has a different perspective, but it looks OK.
Any suggestions?
Most likely you have reversed the matrix multiplication order in your first shader implementation: you should multiply by the perspective matrix first and then by the model-view matrix.
void main(void) {
vPosition = uPMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uMVMatrix * vPosition;
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}
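Equivalently, since the question already premultiplies the matrices in ActionScript for the second version, you can upload a single combined matrix and apply one transform in the shader. A minimal sketch, where uMVPMatrix is a hypothetical uniform holding the premultiplied product:
// uMVPMatrix is a hypothetical uniform containing the perspective and
// model-view matrices already multiplied together in ActionScript.
uniform mat4 uMVPMatrix;
void main(void) {
// vPosition still needs the separate model-view transform, since the
// fragment shader uses it for per-fragment lighting.
vPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
gl_Position = uMVPMatrix * vec4(aVertexPosition, 1.0);
vTextureCoord = aTextureCoord;
vTransformedNormal = uNMatrix * aVertexNormal;
}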
I have a new class MyActor that extends Actor and applies a shader to it. However, the shader unexpectedly fills in the transparent background.
The draw() method inside MyActor is as follows:
@Override
public void draw(Batch batch, float parentAlpha) {
    if (shaderProgram != null) {
        batch.setShader(shaderProgram);
    }
    if (!drawParentAtBack) super.draw(batch, parentAlpha); // by default is false
    Color c = getColor(); // used to apply tint color effect
    batch.setColor(c.r, c.g, c.b, c.a * parentAlpha);
    if (isVisible()) {
        if (displayFrame != null) { // a TextureRegion
            batch.draw(displayFrame,
                    getX(), getY(),
                    getOriginX(), getOriginY(),
                    getWidth(), getHeight(),
                    getScaleX(), getScaleY(),
                    getRotation());
        }
    }
    if (drawParentAtBack) super.draw(batch, parentAlpha);
    if (shaderProgram != null) {
        batch.setShader(null);
    }
}
public void setShader(String vs, String fs) {
    vertexShaderCode = Gdx.files.internal("shaders/" + vs + ".vs").readString();
    fragmentShaderCode = Gdx.files.internal("shaders/" + fs + ".fs").readString();
    shaderProgram = new ShaderProgram(vertexShaderCode, fragmentShaderCode);
    if (!shaderProgram.isCompiled()) {
        d("Shader compile error: " + shaderProgram.getLog());
    }
}
My definition goes like this:
MyActor myActor1, myActor2;
..... // setting up myActor1 & myActor2
myActor1.setShader("default","greyScale");
myActor2.setShader("default","greyScale");
My simple fragment shader code, from this tutorial:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;
void main() {
vec3 color = texture2D(u_texture, v_texCoords).rgb;
float gray = (color.r + color.g + color.b) / 3.0;
vec3 grayscale = vec3(gray);
gl_FragColor = vec4(grayscale, 1.0);
}
My vertex shader ("default.vs") is the unmodified one from the same tutorial.
My expected result is gray-colored shapes with a transparent background, but it turns out like this:
Sample shape image without shader:
Any help please.
In general, transparency is achieved by alpha blending. The alpha channel of the fragment controls the transparency and has to be set.
In the fragment shader the alpha channel of the texture is discarded:
gl_FragColor = vec4(grayscale, 1.0);
Assign the alpha channel of the texture (u_texture) to the alpha channel of the output (gl_FragColor.a):
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;
void main() {
// read RGB color channels and alpha channel
vec4 color = texture2D(u_texture, v_texCoords);
float gray = (color.r + color.g + color.b) / 3.0;
vec3 grayscale = vec3(gray);
// write gray scale and alpha channel
gl_FragColor = vec4(grayscale.rgb, color.a);
}
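Note also that the fragment shader declares v_color but never uses it, so the tint and parentAlpha that MyActor.draw() sets via batch.setColor(...) have no effect. If you want them applied too, a minimal sketch of the adjusted output line (same varyings as above):
// also apply the batch tint/alpha (v_color) set by batch.setColor(...)
gl_FragColor = vec4(grayscale.rgb, color.a) * v_color;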
I am trying to create ocean waves in libGDX following this paper:
http://www-evasion.imag.fr/Publications/2002/HNC02/wavesSCA.pdf.
Everything is working out, but when I try to shade my triangle mesh using reflective shading, the shading is not smooth and looks very choppy. I am just passing vertex normals to the mesh, not surface normals, but I don't really know how to pass surface normals. Here is my screenshot.
Below are my shaders:
[vertexShader]
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projTrans;
uniform mat4 u_viewTrans;
varying vec3 pos_eye;
varying vec3 nor_eye;
void main() {
pos_eye = vec3 (u_viewTrans * u_worldTrans * vec4 (a_position, 1.0));
nor_eye = vec3 (u_viewTrans * u_worldTrans * vec4 (a_normal, 0.0));
gl_Position = u_projTrans * u_worldTrans * vec4(a_position, 1.0);
}
[FragmentShader]
#ifdef GL_ES
precision mediump float;
#endif
varying vec3 pos_eye;
varying vec3 nor_eye;
uniform samplerCube u_environmentCubemap;
uniform mat4 u_viewTrans;
uniform mat4 u_inverseViewTrans;
void main () {
vec3 incident_eye = normalize (pos_eye);
vec3 normal = normalize (nor_eye);
vec3 reflected = reflect (incident_eye, normal);
reflected = vec3 (u_inverseViewTrans * vec4 (reflected, 0.0));
gl_FragColor = vec4(textureCube(u_environmentCubemap, reflected).rgb, 1.0);
}
I have spent a lot of time trying to make the shading smooth, and I have also tried using a framebuffer, but I am not able to achieve the desired result. If someone could help me with smooth shading, it would be really helpful.
Thanks
In my vertex shader I would like to modify the attribute vec2 a_position variable that is shared with the fragment shader. With this modification I should get the image into a cylindrical projection.
This is what I'm doing in my shaders:
<!-- vertex shader -->
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
uniform vec2 u_resolution;
uniform mat3 u_matrix;
varying vec2 v_texCoord;
void main() {
// modifying START
float angle = atan(a_position.y, a_position.x);
float r = sqrt(a_position.x*a_position.x + a_position.y*a_position.y);
a_position.x = r*cos(angle);
a_position.y = r*sin(angle);
// modifying STOP
gl_Position = vec4(u_matrix * vec3(a_position, 1), 1);
v_texCoord = a_position;
}
</script>
<!-- fragment shader -->
<script id="2d-fragment-shader" type="x-shader/x-fragment">
precision mediump float;
// our texture
uniform sampler2D u_image;
// the texCoords passed in from the vertex shader.
varying vec2 v_texCoord;
void main() {
gl_FragColor = texture2D(u_image, v_texCoord);
}
</script>
But I'm getting this error:
compiling shader '[object WebGLShader]':ERROR: 0:12: 'assign' : l-value required "a_position" (can't modify an attribute)
ERROR: 0:13: 'assign' : l-value required "a_position" (can't modify an attribute)
Do you have any idea how to fix this?
Just use another variable
attribute vec2 a_position;
uniform vec2 u_resolution;
uniform mat3 u_matrix;
varying vec2 v_texCoord;
void main() {
// modifying START
float angle = atan(a_position.y, a_position.x);
float r = sqrt(a_position.x*a_position.x + a_position.y*a_position.y);
vec3 p = vec3(
r*cos(angle),
r*sin(angle),
1);
// modifying STOP
gl_Position = vec4(u_matrix * p, 1);
v_texCoord = a_position;
}
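Attributes are read-only in GLSL ES, which is exactly what the compiler error is complaining about, so any writable copy works. An equivalent sketch that copies the attribute into a local variable and modifies that instead:
void main() {
// copy the read-only attribute into a local variable we can modify
vec2 pos = a_position;
float angle = atan(pos.y, pos.x);
float r = length(pos); // same as sqrt(x*x + y*y)
pos = vec2(r * cos(angle), r * sin(angle));
gl_Position = vec4(u_matrix * vec3(pos, 1.0), 1.0);
v_texCoord = a_position;
}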
I have the fragment shader below. It changes the video to grey when I press a certain button. It works, but there is one problem: all the sprites move down a few pixels after the shader is set. It seems the whole screen is offset downward by a few pixels.
#ifdef GL_ES
precision mediump float;
precision mediump int;
#else
#define highp;
#endif
uniform sampler2D u_texture;
varying vec4 v_color;
varying vec2 v_texCoord;
const vec3 grayChange = vec3(0.299*0.4, 0.587*0.4, 0.114*0.4);
void main() {
vec4 texColor = texture2D(u_texture, v_texCoord);
vec3 grayCol = vec3(dot(texColor.rgb, grayChange));
gl_FragColor = vec4(grayCol.r, grayCol.g, grayCol.b, texColor.a * v_color.a);
}
Here's the vertex shader:
uniform mat4 U_WORLD_VIEW;
attribute vec4 a_position;
attribute vec2 a_texCoord0;
attribute vec4 a_color;
varying vec4 v_color;
varying vec2 v_texCoord;
void main() {
gl_Position = U_WORLD_VIEW * vec4(a_position.xy,0,1);
v_texCoord = a_texCoord0;
v_color = a_color;
}
Here are my vertex and fragment shaders:
<script id="shader-fs" type="x-shader/x-fragment">
precision mediump float;
uniform sampler2D uSampler;
varying vec4 vColor;
varying vec2 vTextureCoord;
void main(void) {
gl_FragColor = vColor;
// gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
}
</script>
<script id="shader-vs" type="x-shader/x-vertex">
attribute vec3 aVertexPosition;
attribute vec4 aVertexColor;
attribute vec2 aTextureCoord;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
varying vec4 vColor;
varying vec2 vTextureCoord;
void main(void) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
vColor = aVertexColor;
// vTextureCoord = aTextureCoord;
}
</script>
And here's my shader initializer:
function initShaders() {
var fragmentShader = getShader(gl, "shader-fs");
var vertexShader = getShader(gl, "shader-vs");
shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertexShader);
gl.attachShader(shaderProgram, fragmentShader);
gl.linkProgram(shaderProgram);
if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
alert("Could not initialise shaders");
}
gl.useProgram(shaderProgram);
shaderProgram.vertexPositionAttribute = gl.getAttribLocation(shaderProgram, "aVertexPosition");
gl.enableVertexAttribArray(shaderProgram.vertexPositionAttribute);
shaderProgram.vertexColorAttribute = gl.getAttribLocation(shaderProgram, "aVertexColor");
gl.enableVertexAttribArray(shaderProgram.vertexColorAttribute);
shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);
shaderProgram.pMatrixUniform = gl.getUniformLocation(shaderProgram, "uPMatrix");
shaderProgram.mvMatrixUniform = gl.getUniformLocation(shaderProgram, "uMVMatrix");
shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");
}
The error comes from this line:
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);
>> enablevertexattribarray index out of range
How do I deal with it?
That's simply because you do not use aTextureCoord in your vertex program, so the GLSL compiler optimizes it away by removing it. You really should check the result of gl.getAttribLocation() for errors and enable only the attributes that are actually present in your program. Issuing a warning when an attribute is missing is sufficient; I know of no way to distinguish shader-authoring errors from optimizations by the compiler.
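If you do want the texture coordinates later, the other direction also works: keep the attribute referenced so the compiler cannot optimize it away. A minimal sketch of the relevant vertex shader lines, with the commented-out assignment from the question restored:
void main(void) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
vColor = aVertexColor;
vTextureCoord = aTextureCoord; // now referenced, so the attribute stays active
}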