Messages - EgonOlsen

Support / Re: jPCT-AE vs. jPCT (and the future of jPCT-AE)
« on: January 25, 2022, 08:45:32 am »
Quote
Hmmm, okay... Is there a possibility to still add some more control for lighting?
I believe there are variables/attributes passed to the shader that are more or less unused or unchangeable?
At least, I haven't found a way to change the specularColor, for instance.

There isn't, because specular reflection is just a variant/addition to normal lighting. It's based on the same light sources. If you need something different, you can always pass your own uniforms and/or attributes to a custom shader and do whatever you want with them.
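For example, something along these lines (just a sketch from memory; obj is your Object3D, vertexSrc/fragmentSrc are your own GLSL sources and the uniform name is a placeholder):

Code:
// Pass a custom specular color to your own shader via a uniform (sketch).
GLSLShader shader = new GLSLShader(vertexSrc, fragmentSrc);
shader.setUniform("mySpecularColor", new float[]{1f, 0.9f, 0.7f}); // read it in your own shader code
obj.setShader(shader);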

Quote
What about meshes that are well outside of the fogging parameters (too far away)? These still seem to be 'visible'...
Would it be better to clip the farPlane then, or to manually set these 'out of range' Object3Ds' visibility to false?
Everything behind the far clipping plane won't be rendered. If the fog is much nearer than the far clipping plane is, it might be a good idea to adjust the far plane.
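Roughly like this (a sketch assuming the usual Config/World fog API; the distances are just example values):

Code:
world.setFogging(World.FOGGING_ENABLED);
world.setFogParameters(800f, 0, 0, 0); // fog is fully opaque (black here) at around 800 units (example values)
Config.farPlane = 900f;                // clip everything not far behind that point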

Quote
What about the last stacktrace? That one is from Java (non-native), but it's kinda out of my scope, I guess...
It's a check for frame buffer completeness...no idea why it shouldn't be complete, though. Might be a driver issue or something with the context. My gut feeling is that a lot of these errors happen on context changes and are actually hidden from the user; they don't happen during normal operation, but then the user...I don't know...rotates the phone while switching it off or something...putting it into some state from which it can't really recover properly. But that's just a guess, based on the fact that personally, I've never seen such an exception on my own devices.

Code:
private void checkFrameBufferObject() {
    // Query the completeness state of the currently bound frame buffer
    int state = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
    switch (state) {
        case GLES20.GL_FRAMEBUFFER_COMPLETE:
            break;
        case GLES20.GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT:
            Logger.log("FrameBuffer: " + fbo + " has caused a GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT exception", Logger.ERROR);
            break;
        case GLES20.GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT:
            Logger.log("FrameBuffer: " + fbo + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT exception", Logger.ERROR);
            break;
        case GLES20.GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS:
            Logger.log("FrameBuffer: " + fbo + ", has caused a GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS exception", Logger.ERROR);
            break;
        default:
            Logger.log("Unexpected reply from glCheckFramebufferStatus: " + state, Logger.ERROR);
            break;
    }
    if (state != GLES20.GL_FRAMEBUFFER_COMPLETE) {
        // Invalidate the frame buffer object's handle if it isn't complete
        fbo = -1;
    }
}

Support / Re: jPCT-AE vs. jPCT (and the future of jPCT-AE)
« on: January 18, 2022, 05:38:32 pm »
That's quite a lot to answer... ;) I'll do my very best...

On a general note: the basic "design" of jPCT is almost 23 years old now. That alone explains a lot, and I'll refer to it as an excuse/explanation later on. This will become clearer once I manage to reply to the "jPCT origins" thread...

Anyway, here we go...

Quote
1. Why can a Light object only be assigned to one World? As far as I know, you can only add a Light to a World using the constructor of Light.
That's because I was stupid/lazy when doing the light implementation in the early days of jPCT. I simply coded the lights to be part of the world in a way that binds them to it, because of...reasons...I guess. I didn't change that when I released jPCT to the public after doing some cleanup work. However, I then added the Light class to hide this to a degree. See the documentation of the Light class for desktop jPCT, which also tries to explain this. When converting jPCT to jPCT-AE, I kept this behaviour for compatibility reasons, but I tried to hide the actual implementation completely by relying solely on the Light class and removing all light-related methods from World...except for removeAllLights(). I kept that one because I had to give the user at least some way to remove lights from a World instance, even though they were never attached to it directly. Yes, this all sucks...but that's the way it is now and it will stay this way.
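For reference, the intended usage is just this (sketch):

Code:
// The Light attaches itself to the World in its constructor...
Light sun = new Light(world);
sun.setPosition(new SimpleVector(0f, -200f, 0f));
sun.setIntensity(250f, 250f, 250f);
// ...so there's no per-light removal via World. Either disable the light itself:
sun.disable();
// ...or drop all lights attached to this world in one go:
world.removeAllLights();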

Quote
2. Are there any beneficial multi-threading capabilities that jPCT-AE has or could have?
No, there aren't. OpenGL is a state machine; it can't handle multiple threads. You could do the processing in multiple threads, of course, but in almost all cases it's not worth it for what you do with an engine like jPCT. The desktop version supports multiple threads for the software renderer (which really helps a lot) and for the processing (via the WorldProcessor), which isn't worth it except in some rare and almost artificial cases. Also, jPCT isn't thread-safe and jPCT-AE is even less thread-safe. The reason for the latter is, again, history. When I started jPCT-AE in 2008, Android used the Dalvik VM on single-core processors running at around 800 MHz. The VM had no JIT, a maximum of 24 MB of heap and only a stop-the-world garbage collector, which was very slow as well. That meant that garbage collection took somewhere between 80 and 500 ms once triggered, which in turn meant that the usual game loop using jPCT-AE wasn't allowed to produce any garbage at all, or the application would have suffered greatly from stuttering. To achieve that, I added some static "caches", if you want to call them that: variables and instance pools that could be reused instead of being created/destroyed on each method call. But this, of course, limits the ability to multi-thread even more.
If you abstract your game logic/entities from jPCT's objects and couple jPCT loosely to render your entities, then you have more options to multi-thread things on your own. I did this for the grass rendering in Naroth...turns out it wasn't worth it, so I disabled it ;)
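Something like this is what I mean by decoupling (a rough sketch; the Entity class and its fields are made up):

Code:
// The game-logic thread only writes plain data...
class Entity {
    volatile float x, y, z; // updated by the game-logic thread
    Object3D view;          // only ever touched on the GL thread
}

// ...and the GL thread copies the results over once per frame, e.g. in onDrawFrame():
void syncEntities(java.util.List<Entity> entities) {
    for (Entity e : entities) {
        e.view.clearTranslation();
        e.view.translate(e.x, e.y, e.z);
    }
}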

Quote
3. Will ReflectionHelper be ported to jPCT-AE?
Not very likely. OpenGL ES doesn't support some functions (clipping plane related, IIRC) that this implementation uses in desktop jPCT. You would have to do it in a shader yourself, but that would make the default shaders overly complicated, so I didn't do that. It's just a helper class anyway; it's not part of the core.

Quote
4. Will BloomGLProcessor be ported to jPCT-AE?
Again, no. This thing is more of a hack anyway. It requires huge amounts of fillrate, which still isn't there on mobile.

Quote
5. Why can a VertexController only be assigned to a single mesh?
Because the GenericVertexController implementation stores some mesh-related data on initialization, i.e. when it gets assigned to a mesh.
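The usual pattern looks like this (a sketch from memory; the displacement is just a toy example):

Code:
GenericVertexController wobble = new GenericVertexController() {
    @Override
    public void apply() {
        // Source/destination arrays belong to the one mesh this controller was assigned to
        SimpleVector[] src = getSourceMesh();
        SimpleVector[] dst = getDestinationMesh();
        for (int i = 0; i < src.length; i++) {
            dst[i].set(src[i]);
            dst[i].y += 2f * (float) Math.sin(src[i].x * 0.1f); // toy displacement
        }
    }
};
Mesh mesh = obj.getMesh();
mesh.setVertexController(wobble, IVertexController.PRESERVE_SOURCE_MESH);
mesh.applyVertexController();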

Quote
6. Why does jPCT-AE use a static TextureManager instance?
Again, it's a design decision that I made over 20 years ago. However, I think it's still fine (unlike the lights...). I see it as a pool for textures that each class can easily access without the need to pass instances around. I fail to see the benefit of having multiple texture pools in an app.
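In practice, the pool is used like this (sketch; context is your Android Context and the asset name is a placeholder):

Code:
TextureManager tm = TextureManager.getInstance();
if (!tm.containsTexture("rock")) {
    try {
        tm.addTexture("rock", new Texture(context.getAssets().open("rock.png")));
    } catch (java.io.IOException e) {
        Logger.log("Couldn't load texture: " + e.getMessage(), Logger.ERROR);
    }
}
obj.setTexture("rock"); // any Object3D can reference it by name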

Quote
7. Why does jPCT-AE still attempt to draw non-visible polygons?
Because it has to. It culls away objects against all six frustum planes, but you can't cull single polygons that are part of a single mesh. OpenGL is most efficient when rendering large batches of polygons, not single ones; the per-polygon culling happens later on the GPU. That said, desktop jPCT culls single polygons when using the software renderer or the hybrid hardware rendering mode (i.e. the one without compiled objects), but neither is feasible on mobile. I didn't port Portals because they require some preconditions that usually aren't met in most scenes, and leaving them out greatly simplified the port. Nobody ever used them anyway, except me in one single demo application. Portals were all the rage when I started with jPCT, but given their limitations and the fact that we went from software to hardware rendering, their significance faded.

Quote
8. Are there any other plans for the future of jPCT-AE?
I'll continue to support it, fix problems and maybe add some things if people need them, but I haven't got a big plan for it apart from that. The default lighting model will stay as it is. If you need more complex lighting, write your own shaders. Naroth, for example, uses a lot of custom shaders for texture splatting, foliage, phong shading, dynamic recoloring of textures, parallax mapping in dungeons etc. I can't possibly cover all these things in a way that would still work with the given lighting model.
I once looked into Vulkan, laughed and never looked at it again. It's completely low level. Don't get me wrong, I like my low-level stuff as much as the next guy; after all, I'm still coding in assembly language for old 8-bit computers. But to support Vulkan, one would have to implement a new library of functions that maps what jPCT-AE needs to Vulkan calls, and I really don't see the point of that. Draw call performance isn't an issue for an app running in a VM on Android.
Regarding open source: I usually open source all of my stuff (https://github.com/EgonOlsen71/), except for jPCT. Mainly because, on the one hand, I don't feel comfortable releasing it without a major rewrite of some parts (like the lights... ;) ), but on the other hand I can't be bothered to do that.

Quote
9. Any idea what causes these crashes..?
No. I see similar crashes every now and then in my own developer console as well, but I don't know what causes them. I've never experienced anything like this in all my testing on all of my devices...not once. I checked some code paths in the engine at least a dozen times without finding any issue, so I doubt that it's that. Might be driver problems, or problems when an app returns from sleep, or when a context change goes wrong...I really don't know, I'm just guessing wildly here.




Support / Re: Lighting on merged Object3Ds
« on: January 09, 2022, 02:43:44 pm »
Good to hear. This flaw has been in there since the very first version of jPCT-AE. Strange that nobody ever noticed it before. I guess it has been mitigated by the fact that it only affects some devices and that you have to use Object3Ds that use more than one texture combination, which isn't a common use case when you are loading your objects from a file.

Support / Re: Lighting on merged Object3Ds
« on: January 06, 2022, 03:07:06 pm »
Happy new year to you as well.

I had a look and I think that I've identified the issue. Can you please give this jar a try: https://jpct.de/download/beta/jpct_ae.jar ?

Support / Re: Lighting on merged Object3Ds
« on: January 06, 2022, 09:03:34 am »
Strange that this happens in the emulator as well in your case. It doesn't for me.

Anyhow...

Quote
According to my IDE (Android Studio in debugging mode), the shininess value is sometimes 0.0 instead of 1.0.
I want to stress the "sometimes" here... it seems that for the merged Object3D, it is sometimes 0.0 and sometimes 1.0...?
I assume it is for different parts of the merged Object3D that it is sometimes 0.0 and sometimes 1.0.

That's what I noticed as well (but by doing some checks in the shader on the device instead of using a debugger). In your case, it seems to be 1 for the red part and 0 for the green (or vice versa, something like that). And I also think that multiplying by an undefined value might cause these problems. However, if I set it to 0 or 1 manually, the problem goes away (at least for me), which is not what I would expect.

But maybe you are right. Maybe I'm setting it wrongly. That would explain why it only happens when using multiple textures, because that causes the object to be split into several parts, and it looks as if I'm setting it correctly for one part but not for the other. Although I'm completely in the dark as to why that should happen...I'll look into it some more...

Support / Re: Lighting on merged Object3Ds
« on: January 04, 2022, 08:48:28 pm »
Ok, it's clearly some kind of shader bug (...edit: or maybe not...see post below...). Not in the shader code itself, but in the shader compiler of some devices. Or in the GPU...I don't know...

Here's what it does (that's not the exact same code that's used by the engine, but it has the same issue):

Code:
vec3 specPart = specularColors * pow(max(0.0, dot(normalize(-vertexPos.xyz), reflect(-surface2Light, normalEye))), shininess);
vertexColor += vec4((diffuseColors * angle + specPart)*(1.0/(1.0+length(lightPositions - vertexPos.xyz)*attenuation)), 1);

As you can see, it calculates a vertex's color. For this, it also takes the specular color into account. If there is none (i.e. specular on the object is off, which is the default), shininess is 1 and specularColors is vec3(0.0, 0.0, 0.0), i.e. zeroed out. A zero vector multiplied by some value (in this case the result of the pow() calculation) should be...a zero vector. Because, as we all know, 0 times something is still 0. Turns out that some devices beg to differ.

We can modify the code to make this clearer:

Code:
vec3 specPart = vec3(0.0, 0.0, 0.0) * pow(max(0.0, dot(normalize(-vertexPos.xyz), reflect(-surface2Light, normalEye))), shininess);
vertexColor += vec4((diffuseColors * angle + specPart)*(1.0/(1.0+length(lightPositions - vertexPos.xyz) * attenuation)), 1);

Here, specularColors is gone and has been replaced by a static zero vector. Clearly, specPart has to be a zero vector as well in all cases. But it's not. shininess, on the other hand, is set to 1. However, it too seems to fluctuate in value while the shader is executing, which is actually impossible; it's supposed to stay fixed while the shader executes for a given mesh, hence the name "uniform". Or maybe it doesn't fluctuate, but evaluating it in the shader goes wrong for the same reason that 0 times something ends up being something other than 0.

Anyway, if we do this:

Code:
vec3 specPart = vec3(0.0, 0.0, 0.0) * pow(max(0.0, dot(normalize(-vertexPos.xyz), reflect(-surface2Light, normalEye))), 1.0);
vertexColor += vec4((diffuseColors * angle + specPart)*(1.0/(1.0+length(lightPositions - vertexPos.xyz)*attenuation)), 1);

i.e. if we replace shininess with a fixed value (it doesn't matter which one; 0, 1, 10...they all work and give the same results), all of a sudden 0 times something is 0 again.

However, we can't do that, because we need both uniforms in case somebody is actually using specular lighting. I also tried to split the term into smaller parts, which has helped in similar situations in the past, but to no avail.

The fact that the behaviour changes when using a single texture is a red herring IMHO. It also changes/works if you assign the red texture to the ball instead of the white one.

I'm not sure what to do here. Are you by any chance already using your own shaders? In that case, we could just cut the specular part (if it's not used, that is) and call it a day. If not, it might be worth considering doing so.

To be sure: which device are you using that has this issue? As said, I tried it on an S7 and it works fine. It also works fine in the emulator. It fails on the LG G4.






Support / Re: Lighting on merged Object3Ds
« on: January 04, 2022, 05:53:44 pm »
I made myself a test case from your example. On a Samsung S7, there's no issue. The lighting is consistent whether the merged object is visible or the unmerged ones are. On an old LG G4, I can see the issue, though. I'll have a look...

Support / Re: Lighting on merged Object3Ds
« on: December 30, 2021, 04:41:12 pm »
I'm not sure what's going on then, but then again, I'm not sure what the actual issue really is. The videos don't really make it clear to me. Can you create something that is more, well, obvious?

Support / Re: Lighting on merged Object3Ds
« on: December 17, 2021, 05:29:42 pm »
Have you tried to create larger textures? Maybe it's a rendering glitch caused by the small size and mip-mapping somehow?

Support / Re: Lighting on merged Object3Ds
« on: December 13, 2021, 06:04:51 pm »
Honestly, I'm not sure what I'm supposed to see on these images!? The last one (the one with the animation) doesn't work for me on the desktop. I can view it on my phone though, but all I'm seeing are two bouncing planes...I'm not sure what to look out for, they look fine (albeit highly compressed) to me.

Anyway, the lighting calculation done by the default shaders doesn't depend on the number of texture stages. It just multiplies the calculated light color/intensity with the pixel after all texture stages have been blended together. If one texture is very dark or the blending mode is some unusual one, the results might not be what one expects, but I don't think that this is the case here.

Looking at your code, you:

  • create the plane/object
  • compile it (not needed if you merge them anyway)
  • strip it (not sure if it's a good idea to do this before calling build(), but if you merge later, it's not needed anyway)
  • build it. This can be done here, but it actually doesn't have to be.

I suggest doing this instead (see the sketch after the list):

  • create the plane
  • merge
  • build()
  • compile (if needed, actually I don't see why it should)
  • strip()

I don't think that this will change anything, but it's actually the way it's intended to be done, and maybe your way has some effect on something that I don't see ATM.
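In code, that order would look roughly like this (a sketch; sizes, translation and texture name are placeholders):

Code:
Object3D planeA = Primitives.getPlane(4, 10f);
Object3D planeB = Primitives.getPlane(4, 10f);
planeB.translate(0f, 0f, 20f);
Object3D merged = Object3D.mergeObjects(planeA, planeB); // merge first...
merged.setTexture("someTexture");
merged.build();  // ...then build the merged result (normals, bounding box etc.)
merged.strip();  // free data that isn't needed anymore
world.addObject(merged);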

Support / Re: Lighting on merged Object3Ds
« on: December 11, 2021, 07:40:22 pm »
Actually, if the normals have already been calculated (or loaded from file with Config.useNormalsFromOBJ=true) before merging, they won't be recalculated after the merge when you call build() on the merged object. I'm not sure if your issue is related to merging at all, though. I rather think it's caused by normal vectors that aren't calculated in a way that fits your scene. Does the file from which these objects come include normals? In that case, you could try the config switch mentioned above.
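I.e. something like this (a sketch; objStream/mtlStream are placeholders for your OBJ/MTL input):

Code:
Config.useNormalsFromOBJ = true; // set this before loading
Object3D[] parts = Loader.loadOBJ(objStream, mtlStream, 1f);
Object3D merged = Object3D.mergeAll(parts);
merged.build(); // the normals from the file should be kept instead of being recalculated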

Support / Re: jPCT(-AE) origins
« on: December 10, 2021, 03:35:20 pm »
I'll post a lengthy reply once I find the time...

Support / Re: Lighting on merged Object3Ds
« on: December 10, 2021, 03:34:44 pm »
Each object will be lit by itself, no matter how they overlap with each other. Regarding the lighting, they are completely independent. Maybe it's just caused by the way in which the normals are created? A vertex normal is calculated by taking the face normals of the adjacent polygons and averaging them. If you, for example, stack boxes onto one another, the normals on the lower corners will face something like 45° down while the ones on the upper corners will face 45° up. That will cause uneven lighting on something like a wall that consists of these boxes.
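Just to illustrate the averaging (not actual jPCT API; adjacentFaceNormals is a placeholder):

Code:
SimpleVector vertexNormal = new SimpleVector();
for (SimpleVector faceNormal : adjacentFaceNormals) {
    vertexNormal.add(faceNormal); // sum the face normals of all adjacent polygons...
}
vertexNormal = vertexNormal.normalize(); // ...and normalize the result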


Support / Re: ShadowHelper and special Textures
« on: December 07, 2021, 10:43:46 am »
Yes, the shadow is an additional texture, as mentioned above. I wasn't aware that you were adding two layers yourself, so that explains it.

Support / Re: ShadowHelper and special Textures
« on: December 07, 2021, 08:37:02 am »
Can you please set the Logger to LL_DEBUG and post what gets printed out when adding the sphere as a receiver?
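I.e. something like this right before the call in question (sketch; shadowHelper and sphere are whatever you already have):

Code:
Logger.setLogLevel(Logger.LL_DEBUG); // make the engine log what it does with the texture layers
shadowHelper.addReceiver(sphere);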
