Author Topic: Are merged Object3D able to have shared submeshes?  (Read 2994 times)

Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Are merged Object3D able to have shared submeshes?
« on: November 16, 2021, 10:06:27 am »
Hello,

Let's say I have 300 dynamic Object3D's and 300 static Object3D's.
The 300 static ones share the same mesh; the dynamic ones do not.
However, in the world scene these objects will always be rendered together (one dynamic Object3D with one static Object3D). Each Object3D has its own position and rotation, but every dynamic Object3D is 'coupled' with a static Object3D.

Would it be better to have 600 Object3D's with a child/parent relationship? Or is it possible to merge them into 300 Object3D's, which could be cheaper to render (because the submesh could be shared..?). I wonder which is best in terms of memory usage and performance, or whether there is another, better way to implement this..? (For reference, the Object3D's have between 12 and 60 triangles each.)
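For reference, a rough sketch of the two options I have in mind (buildStaticPart() and buildDynamicPart() are just placeholders for my own code):
Code: [Select]
Object3D staticPart = buildStaticPart();   // placeholder; shares its mesh with the other 299
Object3D dynamicPart = buildDynamicPart(); // placeholder; unique mesh

// Option A: keep them separate and couple them via parent/child (600 objects,
// the static parts keep sharing one mesh)
dynamicPart.addChild(staticPart);
world.addObject(staticPart);
world.addObject(dynamicPart);

// Option B: merge each pair into a single Object3D (300 objects, one draw call per pair,
// but presumably no shared mesh anymore?)
Object3D pair = Object3D.mergeObjects(staticPart, dynamicPart);
pair.build();
world.addObject(pair);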

The documentation says that merged Object3D's aren't compressed by default, but I'm not sure how, or whether, that holds if one of the two Object3D's shares its mesh.

Cheers.

Edit: I'm also having issues with my multiple Overlays. When I use Overlay#setDepth(), the Overlay seems to vanish..? I can solve the sorting problem between multiple Overlays by using Object3D#setSortOffset on their Object3Ds. However, it is still possible for my other world Object3D's to appear in front of the Overlay (even without the sorting offset). How can I make sure the Overlay is always on top?
Edit2: Never mind... apparently setDepth allows negative values (although the documentation says it should always be positive..?). Maybe having Config.nearPlane = 0f and Config.glIgnoreNearPlane = true ruins compatibility or something. setDepth with negative values does seem to work and also fixes my issue...
Edit3: Sometimes the other Object3Ds in the world still seem to clip through the overlay... (it appears as flickering; it's barely noticeable, but still weird...). Well, whatever... I give up for now.
« Last Edit: November 17, 2021, 12:07:09 am by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #1 on: November 17, 2021, 12:45:24 pm »
The overlay is part of the scene, because it's actually a 3D object which has a real depth and reads from/writes into the depth buffer. But overlays are rendered after everything else, so it actually shouldn't happen that objects clip through them; it might be an accuracy issue. Depth buffers, especially on mobile, aren't very accurate at times. It might help to move the clipping plane a little further into the scene. Do you have a screenshot of how this looks?

About the objects: I'm not sure if I got you correctly here. You can't merge two objects when one is static and one is dynamic. The result will either be 100% static or 100% dynamic, depending on your settings for the merged object. In any case, you can't animate the dynamic "part" of it independently of the other objects that share the same mesh.

Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Re: Are merged Object3D able to have shared submeshes?
« Reply #2 on: November 17, 2021, 02:38:39 pm »
Here are some images for explanation: https://imgur.com/a/mt0FB2K
I only started on this project last week or so... So it doesn't look that good yet...
Although these screenshots are taken from an emulator, the same problem is present on an actual phone.
The red circles show the clipping problem. The cyan/blue circles explain what I meant with static/dynamic.

The overlays in the images use the default depth.
Using Overlay#setDepth(-9E30f) seems to help a lot, but the objects still clip through very occasionally (causing a slight flickering effect).

Basically, the road is randomly generated (and so are the mountains and guardrails).
However, everything is made up of small segments (so there are road segments, guardrail segments, mountain segments, etc.).
The segments in the back get moved to the front as the car moves (so the road is infinitely long).
The segmented implementation allows rotations (curved roads) without having to modify the meshes.
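Roughly, the recycling looks like this (a simplified sketch; isBehindCamera(), frontOffset and nextCurveAngle are placeholders for my actual logic, not jPCT API):
Code: [Select]
for (Object3D segment : roadSegments) {
    if (isBehindCamera(segment, camera)) { // placeholder check
        segment.clearTranslation();
        segment.clearRotation();
        segment.translate(frontOffset);    // SimpleVector: move it back to the front of the road
        segment.rotateY(nextCurveAngle);   // curve the road without touching the mesh
    }
}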

The road segments and the guardrail segments all share the same mesh.
The mountain segments do not share the same mesh (so these are dynamically created...).
I didn't mean dynamic as in for animation purposes.

What I basically wanted is to merge the mountain segment and the guardrail segments together into a 'terrain segment'.
So you could see the terrain segment's mesh as consisting of two parts: a (shareable) mesh part and a (variable/non-shareable) mesh part,
where the shareable part is the guardrail and the non-shareable part is the mountain.
I'm not even sure if this idea would have any performance benefits...

I wonder if there are better ways to implement this whole idea anyway
Currently, I have 150 road segments, 150 mountain segments, 150 dirt segments and (maximally) (150*2+50*2) guardrail segments (left and right guardrail and left and right guardrail pole respectively).
On my phone rendering this scene gives about 30 to 60 FPS depending on the direction of where the camera is pointing.
400+ Object3D's seem like a lot anyway... But I'm not sure if it's more expensive to have them merged or to give them parent-child relationships.
My issue is basically: creating a lot of Object3D's with child-parent relationships (which avoids merging) vs. creating a lot of Object3D's and merging them to reduce the number of Object3D's (which avoids having a lot of Object3D's).
« Last Edit: November 17, 2021, 03:10:08 pm by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #3 on: November 18, 2021, 07:38:23 am »
Regarding the objects clipping through, I'm not entirely sure how to fight this. I see why it happens, but I've never seen it happen myself. There's a kind of hack that you might want to try, to see if it helps:

  • create the overlay as usual
  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • attach an IRenderHook implementation to that object that...
  • ...implements IRenderHook.beforeRendering(int) in such a way that FrameBuffer.clearZBufferOnly() is called in it.
  • ...and then check what this does...

About the merging: it's the usual trade-off, I guess. Merging them takes time at startup (not sure if that matters here) and consumes more memory, because it limits the ability to share meshes. On the other hand, it uses fewer draw calls, which should make it render faster. Apart from that, the basic idea sounds fine to me. I would have done it the same way (maybe that doesn't say much, but anyway...). In fact, I did something very similar with the dungeons in Naroth. They aren't a solid mesh but consist of building blocks for walls, ceilings and floors that are constantly enabled, disabled and moved around as the player moves. Of course, that's slower than a single mesh, but it's fast enough in practice, requires much less memory and is more flexible.


Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Re: Are merged Object3D able to have shared submeshes?
« Reply #4 on: November 18, 2021, 03:08:15 pm »
  • create the overlay as usual
  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • attach an IRenderHook implementation to that object that...
  • ...implements IRenderHook.beforeRendering(int) in such a way that FrameBuffer.clearZBufferOnly() is called in it.
  • ...and then check what this does...
I tried this but it didn't seem to make a difference unfortunately... Any other suggestions?
I believe Overlay#setDepth(-9E30f) sort of works best for now but still not perfectly

About the merging: it's the usual trade-off, I guess. Merging them takes time at startup (not sure if that matters here) and consumes more memory, because it limits the ability to share meshes. On the other hand, it uses fewer draw calls, which should make it render faster. Apart from that, the basic idea sounds fine to me. I would have done it the same way (maybe that doesn't say much, but anyway...). In fact, I did something very similar with the dungeons in Naroth. They aren't a solid mesh but consist of building blocks for walls, ceilings and floors that are constantly enabled, disabled and moved around as the player moves. Of course, that's slower than a single mesh, but it's fast enough in practice, requires much less memory and is more flexible.
About the enabling/disabling, did you mean Object3D#setVisibility(boolean), World.removeObject(Object3D) or something else? (I assume these are basically the same anyway...)
Does this need to be actively done for every frame (or camera update)? ??? I thought jPCT took care of this automatically based on what's visible inside the camera's view...

Merging at startup would be possible, but the scenery would be duller that way... I'm not sure if merging during render time is recommended, but it would make the scenery better, I guess...
Still, I believe merging might be better, since I have the feeling that too many draw calls are currently being made, lowering the FPS as a result. I guess I'll split each segment into a mergeable and a non-mergeable part or something.
« Last Edit: November 18, 2021, 03:32:56 pm by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #5 on: November 19, 2021, 08:17:16 am »
I tried this but it didn't seem to make a difference unfortunately... Any other suggestions?
I believe Overlay#setDepth(-9E30f) sort of works best for now but still not perfectly

That's a bit strange. If the overlay is being rendered last (which should be the case) but you are clearing the depth buffer before rendering it, it has to paint over the entire scene no matter what. Have you checked that your implementation actually gets called? The overlay is transparent, I assume? Is there any chance that the object interfering with it is transparent as well?

About the depth thing... that value can't do anything useful, to be honest. A negative value would move the overlay behind the clipping plane, which makes no sense mathematically. Therefore, the method checks if the value given to it is smaller than the value in Config.nearPlane and, if it is, auto-adjusts it to Config.nearPlane + 1. So you can go smaller than Config.nearPlane + 1, but you can't go smaller than the actual value in Config.nearPlane. Have you tried playing around with Config.nearPlane itself?
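For example (assuming the default Config.nearPlane of 1):
Code: [Select]
// With Config.nearPlane = 1 (the default):
overlay.setDepth(-9E30f); // smaller than nearPlane -> auto-adjusted to 1 + 1 = 2
overlay.setDepth(1.5f);   // not smaller than nearPlane -> stays at 1.5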

If nothing helps, can you create a test case that shows the problem?

About the enabling/disabling, did you mean Object3D#setVisibility(boolean), World.removeObject(Object3D) or something else? (I assume these are basically the same anyway...)
Does this need to be actively done for every frame (or camera update)? ??? I thought jPCT took care of this automatically based on what's visible inside the camera's view...
Depends on your implementation, I guess. Yes, jPCT takes care of objects that aren't visible, but it's still faster to disable them in the first place if you are sure that they aren't visible in the current scene. For that, setVisibility() is usually the better option than adding/removing them. In Naroth, it's a bit different. The maps are stored as simple ASCII maps. At load time, I parse the maps and convert the ASCII data into matching beans that contain information about their position, orientation and appearance (a wall, a door, a chest...). When the player moves, I iterate through these beans to see which ones are potentially visible (the check is rather rough, but good enough). For each one that is visible, I assign an actual Object3D to it that gets positioned in the world so that it matches the data in the bean. So basically, I have a pool of Object3D instances which dynamically form the scene. If visualized, this would look like a scene out of a sci-fi movie where the world builds itself as you walk along while it collapses behind you.
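A very rough sketch of that pooling idea (SegmentBean, roughlyVisible() and the fields on the bean are made up for illustration; it's not the actual Naroth code):
Code: [Select]
// Pool of pre-built blocks, added to the world once and disabled by default.
private final List<Object3D> pool = new ArrayList<Object3D>();

private void updateScene(List<SegmentBean> beans, Camera cam) {
    int used = 0;
    for (SegmentBean bean : beans) {
        if (!roughlyVisible(bean, cam)) {  // rough visibility check, placeholder
            continue;
        }
        Object3D block = pool.get(used++); // grab a pooled instance
        block.clearTranslation();
        block.clearRotation();
        block.translate(bean.position);    // SimpleVector stored in the bean
        block.rotateY(bean.rotationY);
        block.setVisibility(true);
    }
    for (int i = used; i < pool.size(); i++) { // hide the unused rest
        pool.get(i).setVisibility(false);
    }
}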

Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Re: Are merged Object3D able to have shared submeshes?
« Reply #6 on: November 19, 2021, 06:37:11 pm »
Yes, I did make sure that the IRenderHook implementation gets called; I double-checked with a System.out.println("ok"); which flooded the logcat.
In my scene, the overlay itself is transparent, but most of the objects in the scene are not... So I don't think that is the issue...
But maybe the transparency of the overlay itself causes issues...
I use Config.nearPlane = 0f in my scene and also Config.glIgnoreNearPlane = true... I'm not sure how, or whether, the nearPlane value is used if it's ignored..?

A test case... (I hope this is complete enough..?)
So... as I said before, I believe the culprit might be that the Overlay is transparent... Without transparency, there's no such issue with the clipping.
Code: [Select]
    final GLSurfaceView.Renderer testRenderer = new GLSurfaceView.Renderer() {
            private FrameBuffer frameBuffer;
            private World world;
            private Overlay overlay;
            private Object3D sphere;
            private Camera camera;

            @Override
            public void onSurfaceCreated(GL10 gl10, EGLConfig eglConfig) {

            }

            @Override
            public void onSurfaceChanged(GL10 gl10, int w, int h) {
                Config.nearPlane = 0.0f;
                Config.glIgnoreNearPlane = true;

                frameBuffer = new FrameBuffer(w, h);
                world = new World();
                camera = world.getCamera();

                final Texture sphereTexture = new Texture(2, 2, new RGBColor(75, 0, 0));
                final Texture overlayTexture = new Texture(2, 2, new RGBColor(0, 0, 75));

                final TextureManager textureManager = TextureManager.getInstance();
                textureManager.addTexture("tex0", sphereTexture);
                textureManager.addTexture("tex1", overlayTexture);

                sphere = Primitives.getSphere(1.0f);
                sphere.setTexture("tex0");
                sphere.strip();
                sphere.build();
                world.addObject(sphere);

                overlay = new Overlay(world, 0, 0, frameBuffer.getWidth(), frameBuffer.getHeight() / 2, "tex1");
                // I believe this line below is the culprit; without it, things seem fine...
                overlay.setTransparency(255);
                // Gone..?
                //overlay.setDepth(0f);
                // Fixed..?
                //overlay.setDepth(Float.NEGATIVE_INFINITY);
                // Not fixed..?
                //overlay.setDepth(-3E33f);
                overlay.getObject3D().setRenderHook(new IRenderHook() {
                    @Override
                    public void beforeRendering(int i) {
                        frameBuffer.clearZBufferOnly();
                        // check that it gets called
                        overlay.setRotation(((float)Math.random())*0.01f);
                    }

                    @Override
                    public void afterRendering(int i) {

                    }

                    @Override
                    public void setCurrentObject3D(Object3D object3D) {

                    }

                    @Override
                    public void setCurrentShader(GLSLShader glslShader) {

                    }

                    @Override
                    public void setTransparency(float v) {

                    }

                    @Override
                    public void onDispose() {

                    }

                    @Override
                    public boolean repeatRendering() {
                        return false;
                    }
                });
            }

            @Override
            public void onDrawFrame(GL10 gl10) {
                frameBuffer.clear();
                final float maxDistance = 7.5f;
                camera.setPosition(0f, 0f, -(0.5f * ((float) Math.abs(Math.cos(world.getFrameCounter() * 0.03f))) + 0.5f) * maxDistance);
                world.renderScene(frameBuffer);
                world.draw(frameBuffer);
                frameBuffer.display();
            }
        };

        // Attach to some GLSurfaceView here...
        viewer.setRenderer(testRenderer);

Thanks for the tips, I'll keep that in mind!  ;D
« Last Edit: November 19, 2021, 06:56:12 pm by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #7 on: November 20, 2021, 12:34:56 pm »
If you set Config.glIgnoreNearPlane, the value set for the near plane won't be used. The default (1) will be used instead. If you combine that with your overlay and the fact that you've set the near plane to 0 (which usually is too close btw), you will end up with a near plane of 1 for the scene and a depth of (0+1) for the overlay as well. I'm surprised that it shows up at all.

Try to set Config.glIgnoreNearPlane to false and see if that changes anything. And, as said, 0 as the near plane is too close. Try some higher value instead.

That said, if the overlay is transparent, it won't write into the depth buffer, which can lead to objects drawn later being painted over it. If you set it to opaque, the depth buffer will handle the overdraw and in turn avoid the problem. You can also try to change the sort order of the overlay to make it render after everything else:

  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • call setSortOffset(-xxxx) on the Object3D of the overlay, with xxxx being a large value

That should, in theory, change the overlay's sorting so that it will be rendered after everything else.
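In code, something like this (assuming the Overlay follows Object3D's convention that a negative transparency value means opaque; the offset is just "some large value"):
Code: [Select]
// Either make the overlay opaque so it writes into the depth buffer...
overlay.setTransparency(-1);
// ...or, if it has to stay transparent, push it to the end of the sorting:
overlay.getObject3D().setSortOffset(-100000f);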

Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Re: Are merged Object3D able to have shared submeshes?
« Reply #8 on: November 20, 2021, 02:41:26 pm »
If you set Config.glIgnoreNearPlane, the value set for the near plane won't be used. The default (1) will be used instead. If you combine that with your overlay and the fact that you've set the near plane to 0 (which usually is too close btw), you will end up with a near plane of 1 for the scene and a depth of (0+1) for the overlay as well. I'm surprised that it shows up at all.
Hmmm okay...

Try to set Config.glIgnoreNearPlane to false and see if that changes anything. And, as said, 0 as the near plane is too close. Try some higher value instead.
I set Config.glIgnoreNearPlane to false and I varied the near plane value between 0.5f and 3f... Still no luck...
I thought it'd work for values higher than 1f, since the default value is 1f according to the documentation... but apparently it still doesn't work..?

That said, if the overlay is transparent, it won't write into the depth buffer, which can lead to objects drawn later being painted over it. If you set it to opaque, the depth buffer will handle the overdraw and in turn avoid the problem. You can also try to change the sort order of the overlay to make it render after everything else:

  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • call setSortOffset(-xxxx) on the Object3D of the overlay, with xxxx being a large value

That should, in theory, change the overlay's sorting so that it will be rendered after everything else.
I tried this using:
Code: [Select]
overlay.getObject3D().setSortOffset(-3E33f);
But still no luck. This was one of the first things I tried when trying to solve this issue... Positive values seem to affect it, negative values don't..?

This can be replicated with the test case that I provided before with minor changes:
- adding the setSortOffset
- removing any Config changes
- keeping clearZBufferOnly()
But none of these seem to help..?

Something that does seem to 'fix it' (I'm not entirely sure whether it fixes it completely, but visually it seems to) is:
overlay.setDepth(Float.NEGATIVE_INFINITY);
But I wouldn't be able to explain how or why, if you say the depth should be capped by nearPlane..?
Now values such as 0f, 1f and 2f seem to work for the setDepth method too..?
I wonder if the depth is initialized properly to begin with... which I assume to be 2f by default, if I understood your last post correctly?
Around 3f, it starts clipping through again, which makes me think that the default depth is already too large..? (probably >3f or something by default..?)

EDIT: I believe I found something that might cause the flickering effect with setDepth(...):
When the Object3D's are close, the floating-point precision is probably high enough to still sort the Overlay on top of them.
But when the camera is further away, the floating-point precision might not be high enough (basically causing the overlay and the Object3Ds to end up with equal z-depths), sometimes giving priority to the Object3Ds rather than the Overlay (causing flickering).
But I find that weird, since the Object3D's are further away..?
What I could do then is increase the nearPlane depth... but then the clipping will start happening again.
Let's say I use setDepth(0f) on the overlays:
Bringing Config.nearPlane to 0 removes the clipping but introduces flickering. Increasing Config.nearPlane increases the clipping but removes the flickering... Seems like a trade-off to me.
Config.nearPlane = 1E-6f; has too much flickering
Config.nearPlane = 1E-1f; has too much clipping
Config.nearPlane = 1E-3f; seems to work best so far...
« Last Edit: November 20, 2021, 03:37:28 pm by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #10 on: November 22, 2021, 09:51:56 am »
This is the code of setDepth():

Code: [Select]
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.

I'm puzzled as to why none of my suggestions have any effect. At least the sorting should have helped. Maybe the issue is something other than what I think it is...

Another idea: split the scene and the overlay into two different World instances and render them one after the other, with a z-buffer clear in between...
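Roughly like this (a minimal sketch of the render loop; sceneWorld and overlayWorld are assumed to be set up already, with the Overlay created on overlayWorld):
Code: [Select]
frameBuffer.clear();

sceneWorld.renderScene(frameBuffer);
sceneWorld.draw(frameBuffer);

// clear only the depth buffer so the overlay can't be rejected by the scene's depth values
frameBuffer.clearZBufferOnly();

overlayWorld.renderScene(frameBuffer);
overlayWorld.draw(frameBuffer);

frameBuffer.display();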
« Last Edit: November 25, 2021, 10:02:50 am by EgonOlsen »

Offline AeroShark333

  • float
  • ****
  • Posts: 319
    • View Profile
Re: Are merged Object3D able to have shared submeshes?
« Reply #11 on: November 24, 2021, 02:10:10 pm »
This is the code of setDepth():

Code: [Select]
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.
Hmmm, okay...

Another idea: split the scene and the overlay into two different World instances and render them one after the other, with a z-buffer clear in between...
This does work without any issues.
I thought the renderhook solution should have worked as well, but apparently not..?
I did make sure it got called.

Anyway, I guess this will do for now.
Thank you for the support and help :D
« Last Edit: November 24, 2021, 02:12:28 pm by AeroShark333 »

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
Re: Are merged Object3D able to have shared submeshes?
« Reply #12 on: November 25, 2021, 10:04:16 am »
I thought the renderhook solution should have worked as well, but apparently not..?
It should have, especially in combination with the sort offset. But I guess something interfered with the sorting in some way, so that the overlay wasn't rendered at the end as I thought it would be. That can happen; the sorting is just a heuristic anyway.