Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - EgonOlsen

Pages: 1 ... 3 4 [5] 6 7 ... 822
61
Support / Re: ShadowHelper and special Textures
« on: December 06, 2021, 08:57:20 pm »
Are these two latter methods overwriting the shadow map? What is going on here?

Yes, that's the problem here. Shadow mapping is a two-pass operation. First, the scene is rendered as seen from the light source, and the depth of each (caster-)pixel is stored in the depth map. Then the scene is rendered normally, except that the shadow map additionally gets projected into the scene and a check is done that compares the distance from each pixel to the light source with the value stored in the depth map. If it's larger, the pixel is shadowed. Otherwise, it's not. For that to work, the shadow map has to be part of each object receiving the shadow.

In your case, you are removing the shadow map from the object with that code, hence no shadows. If you call your method before setting up the shadow helper, it should actually work.
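The two-pass comparison described above can be sketched in plain Java. This is only an illustration of the principle, not jPCT code; all names here are made up:

```java
public class ShadowTest {
    // Pass 1 stores, for each shadow-map texel, the depth of the closest
    // caster as seen from the light. Pass 2 compares each pixel's actual
    // distance to the light against that stored depth.
    static boolean isShadowed(float distanceToLight, float storedDepth, float bias) {
        // A small bias avoids self-shadowing ("shadow acne") caused by
        // the limited precision of the depth map.
        return distanceToLight > storedDepth + bias;
    }

    public static void main(String[] args) {
        // A pixel farther from the light than the stored caster depth is shadowed.
        System.out.println(isShadowed(10.0f, 6.0f, 0.05f)); // true
        // The caster itself (distance == stored depth) stays lit thanks to the bias.
        System.out.println(isShadowed(6.0f, 6.0f, 0.05f));  // false
    }
}
```

This also shows why the map must be attached to every receiver: the comparison happens per pixel of the receiving object.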

62
Support / Re: Lighting on merged Object3Ds
« on: December 03, 2021, 07:14:31 am »
The merging does nothing to the normals nor does an overlap, at least not on itself. Are these meshes sharing coordinates? You might want to try to call build() on the objects before merging them and not afterwards to see if that helps.

63
Yes, that's because the object gets textured based on its vertices' positions in space, not on its previous texture coordinates. Assuming that the sphere itself has proper coordinates, you should get them via https://www.jpct.net/jpct-ae/doc/com/threed/jpct/PolygonManager.html#getTextureUV-int-int- and then do something like

Code:
TextureInfo ti = new TextureInfo(tid, u0, v0, u1, v1, u2, v2);

instead of

Code:
TextureInfo ti = new TextureInfo(tid, v0.x / dx, v0.z / dz, v1.x / dx, v1.z / dz, v2.x / dx, v2.z / dz);

For splatting, just multiply these coordinates by some value and assign them to another layer.
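The scaling for the splat layer is just a per-component multiply of the base UVs. A minimal plain-Java sketch (the real work would go through PolygonManager/TextureInfo as described; the names here are invented):

```java
import java.util.Arrays;

public class SplatUv {
    // Scale base UVs so the detail texture tiles 'tiling' times
    // across the same polygon area.
    static float[] tile(float[] uv, float tiling) {
        float[] out = new float[uv.length];
        for (int i = 0; i < uv.length; i++) {
            out[i] = uv[i] * tiling;
        }
        return out;
    }

    public static void main(String[] args) {
        // u0, v0, u1, v1, u2, v2 of one triangle
        float[] base = {0.0f, 0.0f, 0.5f, 0.0f, 0.5f, 0.5f};
        float[] detail = tile(base, 8.0f); // detail layer repeats 8x
        System.out.println(Arrays.toString(detail)); // [0.0, 0.0, 4.0, 0.0, 4.0, 4.0]
    }
}
```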


64
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 25, 2021, 10:04:16 am »
I thought the renderhook solution should have worked as well, but apparently not..?
It should have, especially in combination with the sort offset. But I guess something interfered with the sorting in some way, so that the overlay didn't render at the end as I thought it would. That might happen; the sorting is just a heuristic anyway.

65
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 22, 2021, 09:51:56 am »
This is the code of setDepth():

Code:
public void setDepth(float depth) {
    if (depth < Config.nearPlane) {
        depth = Config.nearPlane + 1f;
    }
    this.depth = depth;
}

As you can see, setting it to -infinity has no effect other than setting the depth to Config.nearPlane + 1f.

I'm puzzled why none of my suggestions have any effect. At least the sorting should have helped. Maybe the issue is something other than I think it is...

Another idea: Split scene and overlay in two different world instances and render them one after the other with a z-buffer clear in between...
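The effect of that z-buffer clear between the two passes can be illustrated with a toy depth buffer (plain Java, no jPCT types):

```java
import java.util.Arrays;

public class ZClearDemo {
    static float[] depth = new float[4]; // one depth value per "pixel"
    static int[] color = new int[4];

    static void clearZ() { Arrays.fill(depth, Float.MAX_VALUE); }

    // Draw a pixel only if it is closer than what is already there.
    static void draw(int x, float z, int c) {
        if (z < depth[x]) { depth[x] = z; color[x] = c; }
    }

    public static void main(String[] args) {
        clearZ();
        draw(0, 1.0f, 1);             // scene geometry close to the camera
        draw(0, 5.0f, 2);             // overlay behind it fails the depth test
        System.out.println(color[0]); // 1: overlay gets clipped by the scene

        clearZ();                     // clear the z-buffer between the passes...
        draw(0, 5.0f, 2);             // ...and now the overlay always wins
        System.out.println(color[0]); // 2
    }
}
```

Rendering two separate World instances works the same way: the second world's depth test starts fresh, so its content is painted on top regardless of depth.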

67
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 20, 2021, 12:34:56 pm »
If you set Config.glIgnoreNearPlane, the value set for the near plane won't be used. The default (1) will be used instead. If you combine that with your overlay and the fact that you've set the near plane to 0 (which is usually too close, btw), you will end up with a near plane of 1 for the scene and a depth of (0+1) for the overlay as well. I'm surprised that it shows up at all.

Try to set Config.glIgnoreNearPlane to false and see if that changes anything. And, as said, 0 as the near plane is too close. Try some higher value instead.
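The interplay described above can be condensed into a few lines. This is illustrative Java that mirrors the described behavior, not jPCT source:

```java
public class NearPlaneDemo {
    // With glIgnoreNearPlane set, the configured value is ignored
    // and the default of 1 is used instead.
    static float effectiveNearPlane(float configured, boolean glIgnoreNearPlane) {
        return glIgnoreNearPlane ? 1.0f : configured;
    }

    // Mirrors the setDepth() clamp quoted earlier in this thread.
    static float clampDepth(float depth, float nearPlane) {
        return depth < nearPlane ? nearPlane + 1.0f : depth;
    }

    public static void main(String[] args) {
        // Scene: the configured near plane of 0 is ignored, 1 is used.
        System.out.println(effectiveNearPlane(0.0f, true));  // 1.0
        // Overlay: setDepth(-9E30f) with Config.nearPlane == 0 ends up at 0 + 1.
        System.out.println(clampDepth(-9E30f, 0.0f));        // 1.0
    }
}
```

Both end up at a depth of 1, which is why the overlay and the scene geometry fight over the same depth range.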

That said, if the overlay is transparent, it won't write into the depth buffer, which can lead to objects drawn later being painted over it. If you set it to opaque, the depth buffer will handle the overdraw and in turn avoid the problem. Alternatively, you can try to change the sort order of the overlay to make it render after everything else:

  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • call setZSortOffset(-xxxx) with xxxx being a large value on the Object3D of the overlay

That should, in theory, change the overlay's sorting so that it will be rendered after everything else.
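What setZSortOffset() effectively does can be sketched with a toy sorter (the names are invented and jPCT's real sorting is more involved than this):

```java
import java.util.ArrayList;
import java.util.List;

public class SortOffsetDemo {
    static class Obj {
        final String name; final float depth; final float zSortOffset;
        Obj(String n, float d, float o) { name = n; depth = d; zSortOffset = o; }
    }

    public static void main(String[] args) {
        List<Obj> objs = new ArrayList<>();
        objs.add(new Obj("wall", 10.0f, 0.0f));
        // A large negative offset moves the overlay to the "closest" end of
        // the queue, so it ends up drawn after everything else.
        objs.add(new Obj("overlay", 2.0f, -100000.0f));

        // Transparent objects are drawn back-to-front: larger effective depth first.
        objs.sort((a, b) -> Float.compare(
            b.depth + b.zSortOffset, a.depth + a.zSortOffset));

        System.out.println(objs.get(objs.size() - 1).name); // overlay drawn last
    }
}
```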

68
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 19, 2021, 08:17:16 am »
I tried this but it didn't seem to make a difference unfortunately... Any other suggestions?
I believe Overlay#setDepth(-9E30f) sort of works best for now but still not perfectly

That's a bit strange. If the overlay is being rendered last (which should be the case) and you are clearing the depth buffer before rendering it, it has to paint over the entire scene no matter what. Have you checked that your implementation actually gets called? The overlay is transparent, I assume? Is there any chance that the object interfering with it is as well?

About the depth thing... that value can't do anything useful, to be honest. A negative value would move the overlay behind the clipping plane, which makes no sense mathematically. Therefore, the method checks if the value given to it is smaller than the value in Config.nearPlane and, if it is, auto-adjusts it to Config.nearPlane + 1. So you can go smaller than Config.nearPlane + 1, but you can't go smaller than the actual value in Config.nearPlane. Have you tried playing around with Config.nearPlane itself?

If nothing helps, can you create a test case that shows the problem?

About the enabling/disabling, did you mean Object3D#setVisibility(boolean), World.removeObject(Object3D) or something else? (I assume these are basically the same anyway...)
Does this need to be actively done for every frame (or camera update)? ??? I thought jPCT took care of this automatically based on what's visible inside the camera's view...
Depends on your implementation, I guess. Yes, jPCT takes care of objects that aren't visible, but it's still faster to disable them in the first place if you are sure that they aren't visible in the current scene. For that, setVisibility() is usually the better option than adding/removing them.

In Naroth, it's a bit different. The maps are stored as simple ASCII maps. At load time, I parse the maps and convert the ASCII data into matching beans that contain information about their position, orientation and appearance (a wall, a door, a chest...). When the player moves, I iterate through these objects to see which ones are potentially visible (the check is rather rough, but good enough). For all that are visible, I assign an actual Object3D to them that gets positioned in the world so that it matches the data in the bean. So basically, I have a pool of Object3D instances which dynamically form the scene. If visualized, this would look like a scene out of a sci-fi movie where the world builds itself as you walk along while it collapses behind you.
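The pooling scheme described for Naroth can be sketched roughly like this (all names are invented; the actual Object3D handling is replaced by a counter):

```java
import java.util.List;

public class BlockPool {
    // A bean parsed from the ASCII map: position plus appearance, no mesh.
    static class Bean {
        final int x, y; final char kind;
        Bean(int x, int y, char kind) { this.x = x; this.y = y; this.kind = kind; }
    }

    // Very rough visibility check: everything within a few tiles of the player.
    static boolean visible(Bean b, int px, int py, int range) {
        return Math.abs(b.x - px) <= range && Math.abs(b.y - py) <= range;
    }

    public static void main(String[] args) {
        List<Bean> map = List.of(
            new Bean(1, 1, '#'),   // a nearby wall
            new Bean(9, 9, 'D'));  // a distant door
        int assigned = 0;
        for (Bean b : map) {
            if (visible(b, 0, 0, 3)) {
                assigned++; // here an Object3D from the pool would be
                            // positioned to match the bean's data
            }
        }
        System.out.println(assigned); // only the nearby wall gets an Object3D
    }
}
```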

69
Since I assume you don't want to do the second option... Here's what I'd still suggest:
I think you should be able to retrieve UV coordinates from an Object3D: https://www.jpct.net/jpct-ae/doc/com/threed/jpct/PolygonManager.html#getTextureUV-int-int-
I am missing a setTextureUV() method here though...
There is one, just not in the form you expect. What you have to do is grab the information you need from the various methods in the PolygonManager, create a TextureInfo instance from it and assign it to the polygon by using setPolygonTexture(). This code from the wiki has an example of that in the setTexture() method, where I'm tiling a terrain texture: https://www.jpct.net/wiki/index.php?title=Texture_splatting_on_a_terrain

Quote
In the end, I believe https://www.jpct.net/jpct-ae/doc/com/threed/jpct/Object3D.html#setTextureMatrix-com.threed.jpct.Matrix- might be your best bet for scaling/offsetting UV coordinates
That should work as well, I think...
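Scaling/offsetting UVs via a texture matrix boils down to a linear transform of each (u, v) pair. Here's a plain-Java sketch of the math only, not the jPCT Matrix API:

```java
public class UvMatrixDemo {
    // Apply a scale (the diagonal of the matrix) plus a translation
    // to a single UV pair.
    static float[] transform(float u, float v,
                             float scaleU, float scaleV,
                             float offsetU, float offsetV) {
        return new float[] { u * scaleU + offsetU, v * scaleV + offsetV };
    }

    public static void main(String[] args) {
        // Tile the texture 4x in both directions and shift u slightly.
        float[] uv = transform(0.5f, 0.25f, 4.0f, 4.0f, 0.1f, 0.0f);
        System.out.println(uv[0] + ", " + uv[1]); // 2.1, 1.0
    }
}
```

The advantage over rewriting UVs per polygon is that the matrix applies to the whole object at render time without touching the mesh data.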

70
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 18, 2021, 07:38:23 am »
Regarding the objects clipping through, I'm not entirely sure how to fight this. I see why it happens but I've never seen this happen to me. There's a kind of hack that you might want to try to see if that does help:

  • create the overlay as usual
  • use Overlay.getObject3D() to get the internal Object3D from the overlay
  • attach an IRenderHook implementation to that object, that...
  • ...implements IRenderHook.beforeRendering(<int>) in a way that FrameBuffer.clearZBufferOnly() is called in it.
  • ...and then check what this does...

About the merging: It's the usual trade-off, I guess. Merging them requires time at startup (not sure if that matters here) and consumes more memory, because it limits the ability to share meshes. On the other hand, it uses fewer draw calls, which should make it render faster. Apart from that, the basic idea sounds fine to me. I would have done it the same way (maybe that doesn't say much, but anyway...). In fact, I did something very similar with the dungeons in Naroth. They aren't a solid mesh but consist of building blocks for walls, ceilings and floors that are constantly enabled, disabled and moved around as the player moves. Of course, that's slower than a single mesh, but it's actually fast enough, requires much less memory and is more flexible.


71
Support / Re: Are merged Object3D able to have shared submeshes?
« on: November 17, 2021, 12:45:24 pm »
The overlay is part of the scene, because it's actually a 3D object which has a real depth and reads/writes from/into the depth buffer. But it is rendered after everything else. It actually shouldn't happen that objects clip through it, but it might be an accuracy issue. Depth buffers, especially on mobile, aren't very accurate at times. It might help to move the near clipping plane into the scene a little more. Do you have a screenshot of how this looks?

About the objects: I'm not sure if I got you correctly here. You can't merge two objects when one is static and one is dynamic. The result will either be 100% static or 100% dynamic, depending on your setting for the merged object. In any case, you can't animate the dynamic "part" of it independently from the other objects that share the same mesh.

72
If I remember correctly, I got black textures on some devices when using either non-square (but still POT) or NPOT textures. I believe the only way I could force to make it work was to first convert it/stretch it to a POT square texture... But yeah, memory... So I believe I didn't try to bother to go into it that much anymore
Yes, that's an issue indeed. It's a driver/hardware limitation, not jPCT-AE's fault. It shouldn't be a problem on any modern hardware though.
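If you do need to stretch a texture to a POT square for such devices, the target size is easy to compute (self-contained sketch):

```java
public class PotDemo {
    // Smallest power of two that is >= n (for n >= 1).
    static int nextPowerOfTwo(int n) {
        int p = 1;
        while (p < n) {
            p <<= 1;
        }
        return p;
    }

    public static void main(String[] args) {
        // A 300x200 NPOT texture would be stretched to a 512x512 square:
        int side = Math.max(nextPowerOfTwo(300), nextPowerOfTwo(200));
        System.out.println(side); // 512
    }
}
```

The memory cost mentioned above comes straight from this: 512x512 holds more than four times the pixels of the 300x200 original.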

73
Support / Re: Diffuse vs. Specular light color
« on: November 06, 2021, 06:41:33 pm »
I don't think shaders would mind additionalColor to be negative valued
That depends on the graphics chip. Some don't mind and just clamp values below 0 and above 1, some render everything dark, and some go crazy. One could do the clamping in the shader and/or in the code (for the fixed-function pipeline, which still exists), but that would decrease performance (although it shouldn't be that much of an issue). It would also break compatibility with desktop jPCT, because on that one, RGBColor extends AWTColor (I don't remember the reason anymore, but I'm sure that there was one... ;) ) and that doesn't support it.
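The clamping in question is trivial in itself; here's what it would look like in Java (illustration only, not jPCT code):

```java
public class ColorClampDemo {
    // Clamp a color channel into the [0, 1] range that the fixed-function
    // pipeline and an AWT-backed color class expect.
    static float clamp01(float c) {
        return Math.max(0.0f, Math.min(1.0f, c));
    }

    public static void main(String[] args) {
        System.out.println(clamp01(-0.25f)); // 0.0 (negative "darkening" is lost)
        System.out.println(clamp01(1.5f));   // 1.0
        System.out.println(clamp01(0.5f));   // 0.5
    }
}
```

The cost isn't the clamp itself but having to apply it per color, per light, per frame in code that was otherwise a straight pass-through.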

74
I believe jPCT has support for mipmapping but I've found it a little buggy for myself... (Or I'm just not sure how to make use of it properly)
It's actually a hardware thing. If it doesn't work, it's a driver issue or, on very old devices, an issue with the graphics chip. But other than that, it should work just fine... ???

75
Support / Re: Diffuse vs. Specular light color
« on: November 03, 2021, 03:38:31 pm »
Maybe that's the best approach after all. Just take a fitting default shader and modify it to do what you want. After all, you might want to do that anyway, because the specular implementation actually isn't that great IIRC and is more of an afterthought. I didn't really expect it to be used by anybody, to be honest... ;)

The changes that you suggested would partly break compatibility with desktop jPCT as well.
