Author Topic: Shadows: different behaviour between gfx cards, culling problem  (Read 4298 times)

Offline Klaudiusz

  • int
  • **
  • Posts: 75
    • View Profile
Shadows: different behaviour between gfx cards, culling problem
« on: October 25, 2007, 08:11:54 pm »
Hi Egon, I hope you're not tired of my problems yet? :)

I've noticed different behaviour between the two graphics cards I use.

I have Intel card:
Code:
Driver is: idisw2km/2.50.4136.2000 on Intel / Intel 915G
FBO not supported or disabled!
OpenGL renderer initialized (using 4 texture stages)

and also ATI:
Code:
Driver is: ati2dvag/6.14.10.6715 on ATI Technologies Inc. / Radeon X300/X550/X1050 Series x86/MMX/3DNow!/SSE2
FBO supported and used!
OpenGL renderer initialized (using 4 texture stages)

My problem is that I need to disable or enable culling (depending on the graphics card) if I want to get the desired shadow effect:


1. Intel, culling enabled:

It's good. Shadow is visible on objects.

2. Intel, culling disabled:

Looks very bad; this is not what I want.

3. ATI, culling enabled:

No shadow on the objects!

4. ATI, culling disabled:

Looks nice! This is it!


On Intel everything is nice when culling is enabled, but on ATI I need to disable culling, which lowers performance... I don't think it's good that the behaviour is so different.
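Until the behaviour is unified, one stopgap is to branch on the GPU at runtime. A minimal sketch (not from the thread; the vendor-substring check is an assumption and should be verified against real driver strings):

```java
public class CullingChoice {

    // Sketch: decide whether to disable culling based on the GL vendor string.
    // The "ati" substring match is an assumption; check it on real drivers.
    static boolean shouldDisableCulling(String glVendor) {
        return glVendor != null && glVendor.toLowerCase().contains("ati");
    }

    public static void main(String[] args) {
        System.out.println(shouldDisableCulling("ATI Technologies Inc.")); // true
        System.out.println(shouldDisableCulling("Intel"));                 // false
    }
}
```

Once the LWJGL context exists, the vendor string can be read via `GL11.glGetString(GL11.GL_VENDOR)` and the result fed into `setCulling(...)` on the objects.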




I used this code:

Code:
import java.awt.*;

import com.threed.jpct.*;
import com.threed.jpct.util.*;

public class TestShadow2 {

    private FrameBuffer fb = null;
    private World world = null;
    private Object3D plane = null;
    private Object3D cube = null;
    private Object3D sphere = null;
    private Projector projector = null;
    private ShadowHelper sh = null;
    private Light sun = null;

    public TestShadow2() {
        Config.glColorDepth = 24;
        Config.glFullscreen = false;
        Config.farPlane = 1000;
        Config.glShadowZBias = 0.5f;
        Config.glTrilinear = false;
    }

    private void initStuff() throws Exception {
        fb = new FrameBuffer(800, 600, FrameBuffer.SAMPLINGMODE_NORMAL);
        world = new World();
        fb.enableRenderer(IRenderer.RENDERER_OPENGL, IRenderer.MODE_OPENGL);
        fb.disableRenderer(IRenderer.RENDERER_SOFTWARE);

        plane = Primitives.getPlane(20, 30);
        plane.rotateX((float) Math.PI / 2f);

        cube = Primitives.getCube(25);
        cube.setAdditionalColor(Color.RED);
        cube.translate(0, -30, 0);
        cube.setCulling(Object3D.CULLING_DISABLED);

        sphere = Primitives.getSphere(17);
        sphere.translate(40, -20, -50);
        sphere.setAdditionalColor(new Color(0, 0, 50));
        sphere.setCulling(Object3D.CULLING_DISABLED);

        world.addObject(sphere);
        world.addObject(plane);
        world.addObject(cube);

        projector = new Projector();
        sh = new ShadowHelper(world, fb, projector, 1024);

        // All three objects cast shadows; all three receive them.
        sh.addCaster(cube);
        sh.addCaster(sphere);
        sh.addReceiver(cube);
        sh.addReceiver(plane);
        sh.addReceiver(sphere);
        sh.setAmbientLight(new Color(0, 0, 0));
        sh.setFiltering(false);

        world.setAmbientLight(90, 90, 90);
        world.buildAllObjects();

        sun = new Light(world);
        sun.setIntensity(50, 50, 50);
    }

    private void doIt() throws Exception {
        Camera cam = world.getCamera();
        cam.moveCamera(Camera.CAMERA_MOVEOUT, 150);
        cam.moveCamera(Camera.CAMERA_MOVEUP, 100);
        cam.lookAt(plane.getTransformedCenter());

        projector.setFOV(0.5f);
        projector.setYFOV(0.5f);

        SimpleVector pos = cube.getTransformedCenter();

        projector.setPosition(pos);
        projector.moveCamera(Camera.CAMERA_MOVEUP, 200);
        projector.lookAt(pos);
        SimpleVector offset = new SimpleVector(1, 0, -1).normalize();
        projector.moveCamera(offset, 215);
        offset.rotateY(0.7f);

        while (!org.lwjgl.opengl.Display.isCloseRequested()) {
            // Keep the projector (the light) aimed at the cube.
            projector.lookAt(cube.getTransformedCenter());
            // offset.rotateY(0.007f);
            projector.setPosition(pos);
            projector.moveCamera(new SimpleVector(0, -1, 0), 200);
            projector.moveCamera(offset, 215);
            sun.setPosition(projector.getPosition());

            sh.updateShadowMap();

            fb.clear();
            sh.drawScene();

            fb.update();
            fb.displayGLOnly();

            Thread.sleep(10);
        }
        fb.disableRenderer(IRenderer.RENDERER_OPENGL);
        fb.dispose();
        System.exit(0);
    }

    public static void main(String[] args) throws Exception {
        TestShadow2 cd = new TestShadow2();
        cd.initStuff();
        cd.doIt();
    }
}

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 11777
    • View Profile
    • http://www.jpct.net
Re: Shadows: different behaviour between gfx cards, culling problem
« Reply #1 on: October 26, 2007, 11:11:04 am »
3) is how it is supposed to look... maybe I should elaborate a little more on how the shadows are created: When creating a shadow map (done when you call updateShadowMap() on the ShadowHelper), the scene (to be precise, only the shadow casters) is rendered from the light source's point of view into a depth texture, i.e. not the color but only the depth information is rendered. This texture is then projected into the scene (just like projective texturing), and a special texture mode is used to compare the depth stored in the depth texture with the depth of the current on-screen position as viewed from the light source (calculated via a matrix operation). If the depth in the texture is lower than the one calculated "on the fly", the point is in shadow; otherwise, it isn't.
This depth-compare operation isn't exact. It has rounding errors and precision issues. How much, and on which surfaces, depends on the hardware... every chip does this differently. This may result in what you get in 1): a surface that shouldn't be shadowed but is a shadow caster gets partially shadowed and shows some ugly precision artifacts. To compensate for this, the shadow map isn't rendered with default culling but with inverted culling, so that the back faces, not the front faces, act as shadow casters. As long as the object is closed, this is not a problem. In addition, an offset (adjustable in Config) is used to push the shadow away from the back faces. The result is 3), i.e. an object may shadow itself, but usually not in a way that a surface receives its own shadow. If lighting is chosen wisely, this is not a problem, because vertex lighting won't light these surfaces anyway. The dust demo is a good example of this: the lamp casts a shadow but doesn't receive its own.
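The depth comparison described above can be sketched as plain Java (illustrative only, not jPCT's actual internals; the bias parameter stands in for the Config offset):

```java
public class ShadowCompareSketch {

    // Illustrative sketch of the shadow-map depth test described above.
    // storedDepth:   depth rendered into the shadow map from the light
    // fragmentDepth: depth of the current point as seen from the light
    // bias:          offset (analogous to Config.glShadowZBias) that hides
    //                rounding/precision errors in the comparison
    static boolean inShadow(float storedDepth, float fragmentDepth, float bias) {
        // If the map stores a (biased) smaller depth, something closer to the
        // light covers this point, so the point is in shadow.
        return storedDepth + bias < fragmentDepth;
    }

    public static void main(String[] args) {
        // A blocker at depth 10 shadows a point at depth 20 (bias 0.5).
        System.out.println(inShadow(10f, 20f, 0.5f)); // true
        // The nearest surface itself is not shadowed.
        System.out.println(inShadow(20f, 20f, 0.5f)); // false
    }
}
```

A too-small bias on imprecise hardware is exactly what makes a caster's own surface flicker into shadow, as in picture 1).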
On to your example pics:

1) is bad, because the surfaces show accuracy problems. The offset isn't large enough for the precision of the Intel chip.
2) is even worse, because now (caused by disabling the culling) the front faces (as seen from the light) are covered with artifacts... bad thing!
3) is how it is supposed to look. It may not be what you expected, but it is what I had in mind when coding it... :)
4) looks fine, but actually isn't what I had in mind.

However, I understand your urge to make a surface receive its own shadow, but due to the precision problems, this isn't that easy. On Monday, I'll add an option to the helper to not use inverted culling when rendering the shadow map. Combined with a larger offset, this may lead to something like 4) on both ATI and Intel (in this particular scene... it may look bad in others, but it's better to have the option than not).

Hope this explains the problem a little better.
« Last Edit: October 26, 2007, 11:13:58 am by EgonOlsen »

Offline Klaudiusz

  • int
  • **
  • Posts: 75
    • View Profile
Re: Shadows: different behaviour between gfx cards, culling problem
« Reply #2 on: October 26, 2007, 12:39:45 pm »
Yes, in my scene all objects are casters and receivers, so this is a real problem for me; I know the difference between cards is a big problem. Besides, self-shadowing is pretty. :) Thank you very much for the explanations.

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 11777
    • View Profile
    • http://www.jpct.net
Re: Shadows: different behaviour between gfx cards, culling problem
« Reply #3 on: October 30, 2007, 05:50:30 pm »
I've added the mentioned option to the helper (setCullingMode(<boolean>)) in this version: http://www.jpct.net/download/beta/jpctapi116pre2.zip. You may combine it with a larger offset for Config.glShadowZBias and see if that helps.
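A usage sketch of that option, as a fragment only (assumes world, fb and projector exist as in the test case above; the bias value 2.5f is a guess to tune per card, and which boolean value maps to "no inverted culling" should be checked against the docs):

```java
// Fragment: combine the new culling option with a larger depth bias.
Config.glShadowZBias = 2.5f; // guessed value, larger than the 0.5f used earlier
ShadowHelper sh = new ShadowHelper(world, fb, projector, 1024);
sh.setCullingMode(false);    // assumed: false = don't invert culling for the shadow map
```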