Recent Posts

91
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by AeroShark333 on November 24, 2021, 02:10:10 pm »
This is the code of setDepth():

Code:
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.
Hmmm, okay...

Another idea: Split scene and overlay into two different World instances and render them one after the other with a z-buffer clear in between...
This does work without any issues.
I thought the RenderHook solution should have worked as well, but apparently not..?
I did make sure it got called.

Anyway, I guess this will do for now.
Thank you for the support and help :D
92
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by EgonOlsen on November 22, 2021, 09:51:56 am »
This is the code of setDepth():

Code:
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.
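The clamp is easy to check in isolation. A standalone sketch (plain Java, not jPCT; clampDepth is a hypothetical helper that mirrors the posted method, with the near plane passed as a parameter instead of read from Config):

```java
// Standalone sketch of the clamp inside setDepth(); clampDepth is a
// hypothetical helper, nearPlane replaces the Config.nearPlane field.
public class DepthClampDemo {
    public static float clampDepth(float depth, float nearPlane) {
        if (depth < nearPlane) {
            depth = nearPlane + 1f;
        }
        return depth;
    }

    public static void main(String[] args) {
        // Any value below the near plane collapses to nearPlane + 1:
        System.out.println(clampDepth(Float.NEGATIVE_INFINITY, 1f)); // 2.0
        System.out.println(clampDepth(-3E33f, 1f));                  // 2.0
        // Values at or above the near plane pass through unchanged:
        System.out.println(clampDepth(5f, 1f));                      // 5.0
    }
}
```

So -infinity and -3E33f should end up as exactly the same depth, which makes the observed difference between them all the more puzzling.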

I'm puzzled why none of my suggestions have any effect. At least the sorting should have helped. Maybe the issue is something other than what I think it is...

Another idea: Split scene and overlay into two different World instances and render them one after the other with a z-buffer clear in between...
93
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by EgonOlsen on November 22, 2021, 09:51:20 am »
This is the code of setDepth():

Code:
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.

I'm puzzled why none of my suggestions have any effect. At least the sorting should have helped.

Another idea: Split scene and overlay into two different World instances and render them one after the other with a z-buffer clear in between...
94
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by AeroShark333 on November 20, 2021, 02:41:26 pm »
If you set Config.glIgnoreNearPlane, the value set for the near plane won't be used. The default (1) will be used instead. If you combine that with your overlay and the fact that you've set the near plane to 0 (which is usually too close, btw), you will end up with a near plane of 1 for the scene and a depth of (0+1) for the overlay as well. I'm surprised that it shows up at all.
Hmmm okay...

Try to set Config.glIgnoreNearPlane to false and see if that changes anything. And, as said, 0 as the near plane is too close. Try some higher value instead.
I set Config.glIgnoreNearPlane to false and I varied the near plane value between 0.5f and 3f... Still no luck...
I thought it'd work for values higher than 1f, since the default value is 1f according to the documentation... but apparently it still doesn't work..?

That said, if the overlay is transparent it won't write into the depth buffer, which can lead to objects drawn later being painted over it. If you set it to opaque, the depth buffer will handle the overdraw and in turn avoid the problem. You can try to change the sort order of the overlay to make it render after everything else:

  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • call setSortOffset(-xxxx) with xxxx being a large value on the Object3D of the overlay

That should, in theory, change the overlay's sorting so that it will be rendered after everything else.
I tried this using:
Code:
overlay.getObject3D().setSortOffset(-3E33f);
but still no luck. This is one of the first things I tried too when trying to solve this issue... positive values seem to affect it, negative values don't..?

This can be replicated with the test case that I provided before with minor changes:
- adding the setSortOffset
- removing any Config changes
- keeping clearZBufferOnly()
But none of these seem to help..?

Something that does seem to 'fix it' (not entirely sure if it completely fixes it but it seems so visually) is:
overlay.setDepth(Float.NEGATIVE_INFINITY);
But I wouldn't be able to explain how or why if you say the depth should be capped by nearPlane..?
Now values such as 0f, 1f and 2f seem to work too for the setDepth method..?
I wonder if the depth is initialized properly to begin with... which I assume to be 2f by default, if I understood your last post properly?
Around 3f, it will start clipping through again which makes me think that the depth is already too large by default..? (probably >3f or something by default..?)

EDIT: I believe I found something that might cause the flickering effect with setDepth(...):
When the Object3D's are close, the floating point precision is probably high enough to order the Overlay on top of it still.
But when the camera is further away, the floating point precision might not be high enough (basically causing the overlay and Object3D to have equal z-depths) and giving priority to the Object3Ds sometimes rather than the Overlay (causing flickering).
But I find that weird since the Object3D's are further away..?
What I could do then, is to increase the nearPlane depth... but then the clipping will start happening again.
Let's say I use setDepth(0f) on the overlays:
Bringing Config.nearPlane to 0 will remove the clipping but introduce flickering. Increasing Config.nearPlane will increase clipping but remove flickering... Seems like some trade-off thing to me.
Config.nearPlane = 1E-6f; has too much flickering
Config.nearPlane = 1E-1f; has too much clipping
Config.nearPlane = 1E-3f; seems to work best so far...
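The precision hypothesis can be illustrated in plain Java (not jPCT; the magnitudes are illustrative only, not the actual depth values in the scene): a fixed offset that separates two depths near the camera can fall below the representable step size further away, at which point the two depths collapse to the same float and the winner becomes arbitrary.

```java
public class FloatPrecisionDemo {
    public static void main(String[] args) {
        // Near the camera, a small depth offset survives rounding:
        System.out.println(1f + 1e-3f > 1f);       // true
        // Far away, the same offset is below the representable step,
        // so both depths round to the same value (z-fighting territory):
        System.out.println(1e6f + 1e-3f > 1e6f);   // false
        // Math.ulp gives the step size around a value:
        System.out.println(Math.ulp(1f));          // ~1.19e-7
        System.out.println(Math.ulp(1e6f));        // 0.0625
    }
}
```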
95
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by EgonOlsen on November 20, 2021, 12:34:56 pm »
If you set Config.glIgnoreNearPlane, the value set for the near plane won't be used. The default (1) will be used instead. If you combine that with your overlay and the fact that you've set the near plane to 0 (which is usually too close, btw), you will end up with a near plane of 1 for the scene and a depth of (0+1) for the overlay as well. I'm surprised that it shows up at all.

Try to set Config.glIgnoreNearPlane to false and see if that changes anything. And, as said, 0 as the near plane is too close. Try some higher value instead.

That said, if the overlay is transparent it won't write into the depth buffer, which can lead to objects drawn later being painted over it. If you set it to opaque, the depth buffer will handle the overdraw and in turn avoid the problem. You can try to change the sort order of the overlay to make it render after everything else:

  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • call setSortOffset(-xxxx) with xxxx being a large value on the Object3D of the overlay

That should, in theory, change the overlay's sorting so that it will be rendered after everything else.
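The depth-write point can be shown with a toy single-pixel simulation in plain Java (not jPCT; DepthWriteDemo and its draw method are hypothetical stand-ins for the rasterizer's depth test, not anything in the engine):

```java
// Toy single-pixel depth buffer: transparent fragments blend into the
// color but skip the depth write, so later fragments can still pass
// the depth test and paint over them.
public class DepthWriteDemo {
    public float depth = Float.MAX_VALUE; // cleared z-buffer (far plane)
    public String color = "background";

    public void draw(String name, float z, boolean writesDepth) {
        if (z <= depth) {      // depth test passes
            color = name;      // fragment wins the pixel
            if (writesDepth) {
                depth = z;     // opaque fragments update the z-buffer
            }
        }
    }

    public static void main(String[] args) {
        DepthWriteDemo px = new DepthWriteDemo();
        px.draw("overlay", 1f, false); // transparent: no depth write
        px.draw("object", 5f, true);   // farther, but still passes the test
        System.out.println(px.color);  // object — overdraws the overlay

        DepthWriteDemo px2 = new DepthWriteDemo();
        px2.draw("overlay", 1f, true); // opaque: depth write happens
        px2.draw("object", 5f, true);  // 5 > 1: fails the depth test
        System.out.println(px2.color); // overlay — survives
    }
}
```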
96
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by AeroShark333 on November 19, 2021, 06:37:11 pm »
Yes, I did make sure that the RenderHook implementation gets called, I double checked with a "System.out.println("ok");" which flooded the logcat.
In my scene, the overlay is transparent itself, but most of the objects in the scene are not... So I don't think that is the issue...
But maybe the transparency of the overlay itself causes issues...
I use Config.nearPlane = 0f in my scene and I also use Config.glIgnoreNearPlane = true... I'm not sure how, or if, the nearPlane value is used if it's ignored..?

A test case... (I hope this is complete enough..?)
So... as I said before, I believe the culprit might be that the Overlay is transparent... Without transparency, there's no such issue with the clipping.
Code:
    final GLSurfaceView.Renderer testRenderer = new GLSurfaceView.Renderer() {
            private FrameBuffer frameBuffer;
            private World world;
            private Overlay overlay;
            private Object3D sphere;
            private Camera camera;

            @Override
            public void onSurfaceCreated(GL10 gl10, EGLConfig eglConfig) {

            }

            @Override
            public void onSurfaceChanged(GL10 gl10, int w, int h) {
                Config.nearPlane = 0.0f;
                Config.glIgnoreNearPlane = true;

                frameBuffer = new FrameBuffer(w, h);
                world = new World();
                camera = world.getCamera();

                final Texture sphereTexture = new Texture(2, 2, new RGBColor(75, 0, 0));
                final Texture overlayTexture = new Texture(2, 2, new RGBColor(0, 0, 75));

                final TextureManager textureManager = TextureManager.getInstance();
                textureManager.addTexture("tex0", sphereTexture);
                textureManager.addTexture("tex1", overlayTexture);

                sphere = Primitives.getSphere(1.0f);
                sphere.setTexture("tex0");
                sphere.strip();
                sphere.build();
                world.addObject(sphere);

                overlay = new Overlay(world, 0, 0, frameBuffer.getWidth(), frameBuffer.getHeight() / 2, "tex1");
                // I believe this line below is the culprit; without it, things seem fine...
                overlay.setTransparency(255);
                // Gone..?
                //overlay.setDepth(0f);
                // Fixed..?
                //overlay.setDepth(Float.NEGATIVE_INFINITY);
                // Not fixed..?
                //overlay.setDepth(-3E33f);
                overlay.getObject3D().setRenderHook(new IRenderHook() {
                    @Override
                    public void beforeRendering(int i) {
                        frameBuffer.clearZBufferOnly();
                        // check that it gets called
                        overlay.setRotation(((float)Math.random())*0.01f);
                    }

                    @Override
                    public void afterRendering(int i) {

                    }

                    @Override
                    public void setCurrentObject3D(Object3D object3D) {

                    }

                    @Override
                    public void setCurrentShader(GLSLShader glslShader) {

                    }

                    @Override
                    public void setTransparency(float v) {

                    }

                    @Override
                    public void onDispose() {

                    }

                    @Override
                    public boolean repeatRendering() {
                        return false;
                    }
                });
            }

            @Override
            public void onDrawFrame(GL10 gl10) {
                frameBuffer.clear();
                final float maxDistance = 7.5f;
                camera.setPosition(0f, 0f, -(0.5f * ((float) Math.abs(Math.cos(world.getFrameCounter() * 0.03f))) + 0.5f) * maxDistance);
                world.renderScene(frameBuffer);
                world.draw(frameBuffer);
                frameBuffer.display();
            }
        };

        // Attach to some GLSurfaceView here...
        viewer.setRenderer(testRenderer);

Thanks for the tips, I'll keep that in mind!  ;D
97
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by EgonOlsen on November 19, 2021, 08:17:16 am »
I tried this but it didn't seem to make a difference unfortunately... Any other suggestions?
I believe Overlay#setDepth(-9E30f) sort of works best for now but still not perfectly

That's a bit strange. If the overlay is being rendered last (which should be the case) but you are clearing the depth buffer before rendering it, it has to paint over the entire scene no matter what. Have you checked that your implementation actually gets called? The overlay is transparent, I assume? Is there any chance that the object interfering with it is as well?

About the depth thing... that value can't do anything useful, to be honest. A negative value would move the overlay behind the clipping plane, which makes no sense mathematically. Therefore, the method checks if the value given to it is smaller than the value in Config.nearPlane and, if it is, auto-adjusts it to Config.nearPlane+1. So you can go smaller than Config.nearPlane+1, but you can't go smaller than the actual value in Config.nearPlane. Have you tried playing around with Config.nearPlane itself?

If nothing helps, can you create a test case that shows the problem?

About the enabling/disabling, did you mean Object3D#setVisibility(boolean), World.removeObject(Object3D) or something else? (I assume these are basically the same anyway...)
Does this need to be actively done for every frame (or camera update)? ??? I thought jPCT took care of this automatically based on what's visible inside the camera's view...
Depends on your implementation, I guess. Yes, jPCT takes care of objects that aren't visible, but it's still faster to disable them in the first place if you are sure that they aren't visible in the current scene. For that, setVisibility() is usually the better option than adding/removing them.

In Naroth, it's a bit different. The maps are stored as simple ASCII maps. At load time, I parse the maps and convert the ASCII data into matching beans that contain information about their position, orientation and appearance (a wall, a door, a chest...). When the player moves, I iterate through these objects to see which ones are potentially visible (the check is rather rough, but good enough). For all that are visible, I assign an actual Object3D to them that gets positioned in the world so that it matches the data in the bean. So basically, I have a pool of Object3D instances which dynamically form the scene. If visualized, this would look like a scene out of a sci-fi movie where the world builds itself as you walk along while it collapses behind you.
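The pooling idea described above can be sketched in plain Java (VisibilityPool and visibleCells are hypothetical stand-ins, not jPCT or Naroth code; a 1-D map stands in for the real ASCII maps):

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of the pool: a fixed number of reusable slots (Object3D
// instances in the real engine) are assigned to whichever map cells are
// within view range, recomputed each time the player moves.
public class VisibilityPool {
    private final int poolSize;

    public VisibilityPool(int poolSize) {
        this.poolSize = poolSize;
    }

    // Rough visibility check on a 1-D map: cells within 'radius' of the
    // player each get one pooled instance, up to the pool's capacity.
    public List<Integer> visibleCells(int playerX, int radius, int mapLength) {
        List<Integer> cells = new ArrayList<>();
        for (int x = Math.max(0, playerX - radius);
             x <= Math.min(mapLength - 1, playerX + radius) && cells.size() < poolSize;
             x++) {
            cells.add(x); // a pooled Object3D would be positioned on this cell
        }
        return cells;
    }
}
```

Moving the player just calls visibleCells again; cells that drop out of range implicitly release their slot back to the pool, which is the "world collapsing behind you" effect.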
98
Hi, and thanks!

You're great, both of you, thanks a lot, I really appreciate your help :)
Makes sense, I'll try this.

99
There is one, just not in the way in which you expect it to be. What you have to do is to grab the information you need from the various methods in the PolygonManager, create a TextureInfo instance from these and assign it to the polygon by using setPolygonTexture(). This code from the wiki has an example of that in its setTexture() method, where I'm tiling a terrain texture: https://www.jpct.net/wiki/index.php?title=Texture_splatting_on_a_terrain
Ah... yes, I forgot TextureInfo holds UV information... My bad
100
Support / Re: Are merged Object3D able to have shared submeshes?
« Last post by AeroShark333 on November 18, 2021, 03:08:15 pm »
  • create the overlay as usual
  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • attach an IRenderHook implementation to that object, that...
  • ...implements IRenderHook.beforeRendering(<int>) in a way that FrameBuffer.clearZBufferOnly() is called in it.
  • ...and then check what this does...
I tried this but it didn't seem to make a difference unfortunately... Any other suggestions?
I believe Overlay#setDepth(-9E30f) sort of works best for now but still not perfectly

About the merging: It's the usual trade-off, I guess. Merging them requires time at startup (not sure if that matters here) and consumes more memory, because it limits the ability to share meshes. On the other hand, it uses fewer draw calls, which should make it render faster. Apart from that, the basic idea sounds fine to me. I would have done it the same way (maybe that doesn't say much, but anyway...). In fact, I did something very similar with the dungeons in Naroth. They aren't a solid mesh but consist of building blocks for walls, ceilings and floors that are constantly enabled, disabled and moved around when the player moves. Of course, that's slower than a single mesh, but it's actually fast enough, requires much less memory and is more flexible.
About the enabling/disabling, did you mean Object3D#setVisibility(boolean), World.removeObject(Object3D) or something else? (I assume these are basically the same anyway...)
Does this need to be actively done for every frame (or camera update)? ??? I thought jPCT took care of this automatically based on what's visible inside the camera's view...

Merging at startup would be possible, but the scenery would be more dull in that sense... I'm not sure if merging can be recommended during render time, but it would make the scenery better, I guess...
But I believe merging might still be better, since I have the feeling that currently too many draw calls are being made, lowering the FPS as a result. I guess I'll split it up into a mergeable and a non-mergeable part per segment or something.