I'm new to the forum, so... Hello everyone!
First of all, I would like to congratulate and thank Egon for this wonderful library!
Now to the point: I have a problem with a custom shader.
I am developing a simple application with a dice simulation, in which the user can (obviously) throw dice.
I wanted to use a bump mapping shader for the dice, so I took the bump shader from the wiki and converted it to GLSL for OpenGL ES 2.0 (substituting the "gl_Something" built-ins with the uniforms and attributes provided by JPCT-AE).
I was developing on a Tegra 3 device and on the emulator, and I achieved the same result on both, so I was happy.
When I tried the app on an Adreno 320 powered device, I noticed that the dice were much whiter, and when I tried it on a Galaxy Note II (Mali GPU) the dice were almost black (except for some bright spots)!
After a lot of tests, I wrote a much simpler shader that isolates the problem.
The shader is:
-------------------------------VERTEX:
varying vec3 tmpVec;

uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;

attribute vec4 position;

void main(void)
{
    gl_Position = modelViewProjectionMatrix * position;
    tmpVec = vec3(modelViewMatrix * position);
}
-------------------------------FRAGMENT:
precision mediump float;

varying vec3 tmpVec;

void main(void)
{
    gl_FragColor = vec4(normalize(tmpVec), 1.0);
}
In this shader I take the vertex position, transform it into camera space and pass it to the fragment shader, where I normalize it and write it out as the fragment color.
If I use the raw position for the color, the result is the same on both the One X and the Mi 2, but if I use the position transformed into camera space, I get different results (as the attachment to the post shows).
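One thing I noticed while debugging (just a guess on my part, not a confirmed cause): gl_FragColor components are clamped to [0, 1], so the negative components of the normalized camera-space vector get cut off, and different GPUs might handle the intermediate math at different precisions. A common trick when visualizing vectors is to remap them into the visible range first, like this variant of my fragment shader:

```glsl
precision mediump float;

varying vec3 tmpVec;

void main(void)
{
    // Remap each component from [-1, 1] to [0, 1] so that negative
    // values show up as dark colors instead of clamping to black
    gl_FragColor = vec4(normalize(tmpVec) * 0.5 + 0.5, 1.0);
}
```

With this remapping the visual differences between the devices should at least become easier to interpret, even if it does not explain them.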
My question is: how is it possible?
Could the same code and the same shader lead to such different results? The shader is very simple, and I cannot see the error.
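Could it be a precision issue? As far as I understand, mediump is only required to cover roughly half-float precision, and GPU vendors implement it differently (Mali reportedly uses 16-bit floats for mediump, while other GPUs may compute at higher precision). If that is the cause, requesting highp in the fragment shader where the hardware supports it might change the result, for example with the standard guard macro:

```glsl
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;   // use high precision where the GPU supports it
#else
precision mediump float; // fall back to the mandatory minimum
#endif

varying vec3 tmpVec;

void main(void)
{
    gl_FragColor = vec4(normalize(tmpVec), 1.0);
}
```

I have not verified this on all three devices yet, so it is only a suspicion.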
Am I missing something?
Thank you in advance!
Christian
[attachment deleted by admin]