Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - christian

Pages: [1]
Support / Re: Live Wallpaper and textures
« on: June 18, 2013, 10:19:14 am »
Quote
You are actually not supposed to deal with textures in different contexts by adding them twice. The engine keeps one physical copy in main memory, but it will upload the data to the GPU for any context that tries to access it. The problem with wallpapers (and/or this wallpaper framework that everybody is using) is actually that multiple contexts exist (and are being rendered into) at the same time, and THAT isn't a good idea. You have to synchronize this somehow so that it doesn't happen, but I can't tell you exactly how. I would love to add a wallpaper example to the wiki, but my personal experience with wallpapers is exactly zero. All that I know, I have taken from threads here in the forum. Maybe you can PM someone who seems to have managed to get it to work.
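The "synchronize this somehow" advice can be sketched with a single process-wide lock shared by every renderer instance, so that only one context is ever rendered into at a time. This is a standalone plain-Java sketch; the class and the drawFrame() name are my own (echoing Android's GLSurfaceView.Renderer.onDrawFrame), none of it is jPCT-AE API:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.locks.ReentrantLock;

// Standalone sketch (not jPCT-AE API): one process-wide lock shared by
// all renderer "instances" (app, wallpaper preview, wallpaper) so that
// only one GL context is ever rendered into at a time.
public class RenderLockDemo {
    static final ReentrantLock GL_LOCK = new ReentrantLock();
    static final AtomicInteger active = new AtomicInteger(0);

    // Each instance would call this from its own onDrawFrame().
    static void drawFrame() {
        GL_LOCK.lock();
        try {
            if (active.incrementAndGet() != 1) {
                throw new IllegalStateException("two contexts rendering at once");
            }
            // ... render the frame into this instance's context here ...
            active.decrementAndGet();
        } finally {
            GL_LOCK.unlock();
        }
    }

    public static void main(String[] args) {
        Runnable loop = () -> { for (int i = 0; i < 10000; i++) drawFrame(); };
        Thread app = new Thread(loop);
        Thread wallpaper = new Thread(loop);
        app.start();
        wallpaper.start();
        try {
            app.join();
            wallpaper.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        System.out.println("no concurrent renders detected");
    }
}
```

On Android the lock would be acquired around the draw call of each GLSurfaceView/wallpaper engine; the sketch only demonstrates the mutual-exclusion idea.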

Thank you, I will write directly to someone... :)

Support / Live Wallpaper and textures
« on: June 17, 2013, 10:52:41 am »
I am trying to add a live wallpaper to my dice rolling application (DnDice, check it out:

The problem I am experiencing is with textures: if more than one instance of the renderer/jpct-ae is running (i.e. the one inside the dice app, the live wallpaper preview, or the live wallpaper itself; let's call them "instances"), the application crashes, telling me that I cannot instantiate a texture with the same name more than once. Maybe this is only a symptom of a bigger problem, I don't know...

I tried adding:
Code:
textureManager.addTexture("texture_name", texture);
and the app does not crash anymore, but I cannot see the background texture (the textures on the dice are still there; is this because the dice also have shaders and the background texture does not?).

Searching the forum, I found that textures are bound to a specific context in graphics memory, but inside jpct-ae they are treated as unique; is this correct?
How can I tell jpct-ae to use the same texture with more than one "instance"?
Or, how can I tell it to let every "instance" use its own textures and instantiate one "texture_name" texture per "instance"?
Generally speaking, what lines of code do I have to write to let more than one "instance" of an app run without errors?
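One commonly suggested pattern (an assumption on my part, not an official jPCT-AE recipe) is to register each texture name only once and skip the add when the name already exists. The TextureRegistry class below is a hypothetical stand-in I wrote to illustrate the idea; jPCT-AE's real TextureManager exposes a similar containsTexture(String) check that could be used before addTexture():

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for jPCT-AE's TextureManager, written only to
// illustrate the "register each texture name once" idea.
public class TextureRegistry {
    private final Map<String, Object> textures = new HashMap<>();

    public boolean containsTexture(String name) {
        return textures.containsKey(name);
    }

    public void addTexture(String name, Object texture) {
        if (textures.containsKey(name)) {
            // This models the crash described above: same name added twice.
            throw new IllegalStateException("Texture '" + name + "' already added");
        }
        textures.put(name, texture);
    }

    // Guarded add: safe to call from every "instance" of the renderer.
    public void addTextureOnce(String name, Object texture) {
        if (!containsTexture(name)) {
            addTexture(name, texture);
        }
    }
}
```

Calling addTextureOnce from each instance's setup code means whichever instance starts first registers the texture and the others reuse it.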

A lot of people are having problems with live wallpapers; maybe it would be useful to add a sample project/tutorial with a simple live wallpaper (such as a textured cube) to help everyone use this great engine :)
I saw that a lot of live wallpapers use jpct-ae, so the solution is out there; please let me know, you will save a lot of headaches ;D :o

Thank you for your help!

Support / Re: Different shader behaviour on different GPUs
« on: March 14, 2013, 05:11:37 pm »
Well, I tried to scale down the world by 10, and now the behaviour on Tegra3 and on Adreno320 is similar, so I guess that the problem was related to number precision...
ASAP I'll try again on the Mali, I'll let you know  :)

Now it also works on Mali (Galaxy Note II), so the problem was definitely the scene size.
As a rule of thumb, you should always keep the scene's bounding box under about a hundred units per side...
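The rule of thumb above comes down to floating-point resolution: a mediump float in a GLSL ES fragment shader is only guaranteed about 10 mantissa bits, so the bigger the coordinates, the coarser the representable steps. A small plain-Java sketch of that quantization (the mediump() helper is my own approximation of the spec's minimum precision, not anything from jPCT-AE):

```java
// Demonstrates how the absolute resolution of a low-precision float
// shrinks as coordinates grow. mediump() quantizes to a 10-bit
// mantissa, the minimum a GLSL ES fragment shader must provide for
// mediump floats (my own approximation, for illustration only).
public class PrecisionDemo {
    static float mediump(float x) {
        if (x == 0f) return 0f;
        int exp = Math.getExponent(x);
        float step = (float) Math.pow(2.0, exp - 10); // representable step size
        return Math.round(x / step) * step;
    }

    public static void main(String[] args) {
        // Near the origin a 0.1-unit offset survives quantization...
        System.out.println(mediump(1.0f) != mediump(1.1f));     // true
        // ...but around 500 units the step size is already 0.25 units,
        // so the same 0.1-unit offset is rounded away entirely.
        System.out.println(mediump(500.0f) == mediump(500.1f)); // true
    }
}
```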

PS: Egon, thank you for putting DnDice inside Projects page! :)

Projects / Re: Dice Roller App: DnDice!
« on: March 13, 2013, 11:35:46 am »
Thank you for the suggestion. When I have the time I'll do it, and I'll also try to implement some fake shadows :D

Projects / Dice Roller App: DnDice!
« on: March 12, 2013, 09:18:31 pm »
Hello everyone!
This is my first Android app.
I used JPCT-AE for the graphics and JBullet for physics simulation.

If you want to try it, go to

To Egon: you can put this game on the projects page if you like :)
Any comment is welcome! :)

[attachment deleted by admin]

Support / Re: Different shader behaviour on different GPUs
« on: March 05, 2013, 10:48:52 am »
Well, I tried to scale down the world by 10, and now the behaviour on Tegra3 and on Adreno320 is similar, so I guess that the problem was related to number precision...
ASAP I'll try again on the Mali, I'll let you know  :)

Support / Re: Different shader behaviour on different GPUs
« on: March 05, 2013, 09:16:18 am »
Thank you for the answer!

Quote
Have you tried to use highp in the fragment shader? I'm just guessing here... I actually don't think that it should make a difference, but you never know.
I tried, but the result seems to be the same...

Could it be the world dimensions? The camera is 500 units away from the wood plane, and the dice's bounding boxes are 75 units big... Could that lead to precision problems in the shader?

Support / Different shader behaviour on different GPUs
« on: March 04, 2013, 08:52:32 pm »
I'm new to the forum, so... Hello everyone!
First of all, I would like to congratulate and thank Egon for this wonderful library!

Now to the point: I have a problem with a custom shader.
I am developing a simple application with a dice simulation, in which the user can (obviously) throw dice.
I wanted to use a bump mapping shader for the dice, so I took the bump shader from the wiki and converted it to GLSL for OpenGL ES 2 (substituting the "gl_Something" built-ins with the uniforms and attributes provided by JPCT-AE).

I was developing on a Tegra3 device and on the emulator, and I achieved the same result on both, so I was happy :D

When I tried the app on an Adreno320 powered device, I noticed that the dice were much whiter, and when I tried it on a Galaxy Note II (Mali GPU) the dice were almost black (except for some bright spots)!!!

After a lot of tests, I wrote a much simpler shader that isolates the problem.
The shader is:

Code:
// Vertex shader
uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
attribute vec4 position;
varying vec3 tmpVec;

void main(void) {
    gl_Position = modelViewProjectionMatrix * position;
    tmpVec = vec3(modelViewMatrix * position);
}

// Fragment shader
precision mediump float;
varying vec3 tmpVec;

void main(void) {
    gl_FragColor = vec4(normalize(tmpVec), 1.0);
}

In this shader I take the vertex position, convert it to camera space, and pass it to the fragment shader, where I normalize it and use it as the fragment colour.
If I use the raw position for the colour, the result is the same on both the One X and the Mi2, but if I use the position converted to camera space, I obtain different results (as the attachment to the post shows).
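Since the fragment colour here is just normalize() of the camera-space position, the expected output can be reproduced on the CPU; this plain-Java sketch (my own code with hypothetical input values, not jPCT-AE API) computes the colour for one vertex:

```java
// CPU-side replica of the fragment shader's colour computation,
// gl_FragColor = vec4(normalize(tmpVec), 1.0), for one hypothetical
// camera-space vertex position.
public class ShaderCheck {
    static float[] normalize(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len, y / len, z / len };
    }

    public static void main(String[] args) {
        // A vertex 500 units in front of the camera.
        float[] rgb = normalize(0.0f, 0.0f, -500.0f);
        // The result depends only on direction, not distance, so every
        // GPU should produce the same colour regardless of scene scale.
        System.out.printf("rgb = (%.3f, %.3f, %.3f)%n", rgb[0], rgb[1], rgb[2]);
    }
}
```

One possible explanation for the per-GPU differences (speculation on my part, but consistent with the scale-down fix in the later posts): the GLSL ES spec only guarantees mediump floats a range of about ±2^14 ≈ 16384, so the x*x + y*y + z*z inside normalize() can overflow for camera-space coordinates in the hundreds (500² = 250000) on GPUs that implement mediump as 16-bit floats, while GPUs with wider mediump formats are unaffected.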

My question is: how is this possible?
Can the same code and the same shader really produce such different results? The shader is very simple, and I cannot see an error.
Am I missing something?

Thank you in advance! :)


[attachment deleted by admin]
