
Show posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - christian

#1
Support / Live Wallpaper and textures
June 17, 2013, 10:52:41 AM
Hello!
I am trying to add a live wallpaper to my dice rolling application (DnDice, check it out: https://play.google.com/store/apps/details?id=com.christian.bar.dndice).

The problem I am experiencing is with textures: if more than one instance of the renderer/jpct-ae is running (i.e. the one inside the dice app, the live wallpaper preview or the live wallpaper itself; let's call them "instances"), the application crashes, telling me that I cannot add a texture with the same name more than once. Maybe this is only a symptom of a bigger problem, I don't know...

I tried to guard the call with:

if (!textureManager.containsTexture("texture_name")) {
    textureManager.addTexture("texture_name", texture);
}

and the app does not crash anymore, but now I cannot see the background texture (the textures on the dice are still there; is this because the dice have shaders while the background texture does not?).
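I also thought about replacing the texture instead of skipping it when the name is already taken, so that the data gets uploaded again for whichever GL context is currently active. Something like this (only a sketch; the name and stream are placeholders, and I am not sure replaceTexture is the intended way):

import java.io.InputStream;

import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;

public class TextureHelper {

    // Sketch: (re)register the texture so the data belongs to the GL context
    // that is currently active. Name and stream are placeholders.
    public static void uploadTexture(String name, InputStream imageStream) {
        TextureManager tm = TextureManager.getInstance();
        Texture texture = new Texture(imageStream);
        if (tm.containsTexture(name)) {
            // Replace instead of skip, so the current "instance" gets fresh data.
            tm.replaceTexture(name, texture);
        } else {
            tm.addTexture(name, texture);
        }
    }
}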

Searching the forum, I found that textures are bound to a specific context in graphics memory, but inside jpct-ae they are treated as unique. Is this correct?
How can I tell jpct-ae to use the same texture with more than one "instance"?
Or, how can I make every "instance" use its own textures, instantiating one "texture_name" texture per "instance"?
Generally speaking, what code do I have to write to let more than one "instance" of an app run without errors?
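To make the second question concrete, this is the kind of thing I mean by letting every "instance" use its own textures (only a sketch, all the names are made up): prefix each texture name with an identifier of the running "instance", so the app, the preview and the wallpaper never register the same name.

import java.io.InputStream;

import com.threed.jpct.Object3D;
import com.threed.jpct.Texture;
import com.threed.jpct.TextureManager;

public class InstanceTextures {

    private final String instanceId; // e.g. "app", "wallpaper", "preview"

    public InstanceTextures(String instanceId) {
        this.instanceId = instanceId;
    }

    // Registers the texture under a per-instance name and assigns it to the object.
    public void setTexture(Object3D obj, String baseName, InputStream imageStream) {
        String name = instanceId + "/" + baseName; // e.g. "wallpaper/background"
        TextureManager tm = TextureManager.getInstance();
        if (!tm.containsTexture(name)) {
            tm.addTexture(name, new Texture(imageStream));
        }
        obj.setTexture(name);
    }
}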

A lot of people are having problems with live wallpapers; maybe it would be useful to add a sample project/tutorial with a simple live wallpaper (such as a textured cube) to help everyone use this great engine :)
I saw that a lot of live wallpapers use jpct-ae, so the solution is out there. Please let me know, you will save a lot of headaches ;D :o

Thank you for your help!
Christian
#2
Projects / Dice Roller App: DnDice!
March 12, 2013, 09:18:31 PM
Hello everyone!
This is my first Android app.
I used JPCT-AE for the graphics and JBullet for physics simulation.

If you want to try it, go to https://play.google.com/store/apps/details?id=com.christian.bar.dndice

To Egon: You can put this game on the projects page if you like :)
Any comment is welcome! :)
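In case anyone is curious how the two libraries talk to each other, the basic idea is to copy each rigid body's world transform onto the Object3D that renders it after every physics step. Something along these lines (a simplified sketch, not my exact code; axis-convention flips between the two coordinate systems are left out):

import javax.vecmath.Matrix3f;

import com.bulletphysics.dynamics.RigidBody;
import com.bulletphysics.linearmath.Transform;
import com.threed.jpct.Matrix;
import com.threed.jpct.Object3D;

public class PhysicsSync {

    private final Transform tmp = new Transform();

    // Copy the rigid body's world transform onto the object that renders it.
    // Bullet works with column vectors and jPCT with row vectors, so the
    // rotation part is transposed here.
    public void sync(RigidBody body, Object3D model) {
        body.getWorldTransform(tmp);
        Matrix3f b = tmp.basis;

        Matrix rotation = new Matrix();
        rotation.set(0, 0, b.m00); rotation.set(0, 1, b.m10); rotation.set(0, 2, b.m20);
        rotation.set(1, 0, b.m01); rotation.set(1, 1, b.m11); rotation.set(1, 2, b.m21);
        rotation.set(2, 0, b.m02); rotation.set(2, 1, b.m12); rotation.set(2, 2, b.m22);
        model.setRotationMatrix(rotation);

        model.clearTranslation();
        model.translate(tmp.origin.x, tmp.origin.y, tmp.origin.z);
    }
}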


[attachment deleted by admin]
#3
I'm new to the forum, so... Hello everyone!
First of all, I would like to congratulate and thank Egon for this wonderful library!

Now to the point: I have a problem with a custom shader.
I am developing a simple application with a dice simulation, in which the user can (obviously) throw dice.
I wanted to use a bump mapping shader for the dice, so I took the bump shader from the wiki and converted it to GLSL for OpenGL ES 2.0 (substituting the built-in "gl_Something" variables with the uniforms and attributes provided by JPCT-AE).
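Just for context, the shader is attached on the Java side more or less like this (a trimmed sketch, not my exact code; I load the two sources with Loader.loadTextFile and rely on jPCT-AE filling the declared uniforms/attributes such as modelViewMatrix, modelViewProjectionMatrix and position automatically):

import java.io.InputStream;

import com.threed.jpct.GLSLShader;
import com.threed.jpct.Loader;
import com.threed.jpct.Object3D;

public class DiceShader {

    // Build the shader from the two GLSL sources and attach it to the dice mesh.
    public static void apply(Object3D dice, InputStream vertexSrc, InputStream fragmentSrc) {
        String vertex = Loader.loadTextFile(vertexSrc);
        String fragment = Loader.loadTextFile(fragmentSrc);
        dice.setShader(new GLSLShader(vertex, fragment));
    }
}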

I was developing on a Tegra 3 device and on the emulator, and I achieved the same result on both, so I was happy :D

When I tried the app on an Adreno 320 powered device, I noticed that the dice were much whiter, and when I tried it on a Galaxy Note II (Mali GPU) the dice were almost black (except for some bright spots)!!!

After a lot of tests, I wrote a much simpler shader that isolates the problem.
The shader is:


-------------------------------VERTEX:
varying vec3 tmpVec;
uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
attribute vec4 position;

void main(void)
{
    gl_Position = modelViewProjectionMatrix * position;
    tmpVec = vec3(modelViewMatrix * position);
}

-------------------------------FRAGMENT:
precision mediump float;
varying vec3 tmpVec;

void main(void)
{
    gl_FragColor = vec4(normalize(tmpVec), 1.0);
}


In this shader I take the vertex position, convert it to camera space and pass it to the fragment shader, where I normalize it and use it as the fragment color.
If I use the raw position for the color, the result is the same on both the One X and the Mi2, but if I use the position converted to camera space, I get different results (as the attachment to this post shows).

My question is: how is this possible?
Can the same code and the same shader lead to such different results? The shader is very simple, and I cannot see the error.
Am I missing something?

Thank you in advance! :)

Christian

[attachment deleted by admin]