Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - ronald

Pages: [1]
1
Support / Re: OpenGL matrix to Camera
« on: April 07, 2011, 09:00:27 am »
For the model texture/vertices depth problem: it seems to be fixed (not 100% sure) by adding "glEnable(GL_DEPTH_TEST);" in the native code before calling the jPCT render code. But I still need to test more to confirm, since it is difficult to rotate the 3D model into view while I am experimenting with the matrix.

Currently I am testing the matrix returned by an inverse function, which "supposedly" gives the eye view (camera) of the model if I pass in the modelview matrix, but without much success. So are you saying I should set those values to "0" (except the last 1.0) in the last row?

Is the jPCT matrix row-major or column-major? I am not sure the 4x4 OpenGL matrix returned by the Qualcomm library is ready to use as-is.

Here is the Qualcomm library utility function for computing an inverse:

QCAR::Matrix44F SampleMath::Matrix44FInverse(QCAR::Matrix44F& m)
{
    QCAR::Matrix44F r;

    float det = 1.0f / Matrix44FDeterminate(m);

    r.data[0]   = m.data[6]*m.data[11]*m.data[13] - m.data[7]*m.data[10]*m.data[13]
                + m.data[7]*m.data[9]*m.data[14] - m.data[5]*m.data[11]*m.data[14]
                - m.data[6]*m.data[9]*m.data[15] + m.data[5]*m.data[10]*m.data[15];

    r.data[4]   = m.data[3]*m.data[10]*m.data[13] - m.data[2]*m.data[11]*m.data[13]
                - m.data[3]*m.data[9]*m.data[14] + m.data[1]*m.data[11]*m.data[14]
                + m.data[2]*m.data[9]*m.data[15] - m.data[1]*m.data[10]*m.data[15];

    r.data[8]   = m.data[2]*m.data[7]*m.data[13] - m.data[3]*m.data[6]*m.data[13]
                + m.data[3]*m.data[5]*m.data[14] - m.data[1]*m.data[7]*m.data[14]
                - m.data[2]*m.data[5]*m.data[15] + m.data[1]*m.data[6]*m.data[15];

    r.data[12]  = m.data[3]*m.data[6]*m.data[9] - m.data[2]*m.data[7]*m.data[9]
                - m.data[3]*m.data[5]*m.data[10] + m.data[1]*m.data[7]*m.data[10]
                + m.data[2]*m.data[5]*m.data[11] - m.data[1]*m.data[6]*m.data[11];

    r.data[1]   = m.data[7]*m.data[10]*m.data[12] - m.data[6]*m.data[11]*m.data[12]
                - m.data[7]*m.data[8]*m.data[14] + m.data[4]*m.data[11]*m.data[14]
                + m.data[6]*m.data[8]*m.data[15] - m.data[4]*m.data[10]*m.data[15];

    r.data[5]   = m.data[2]*m.data[11]*m.data[12] - m.data[3]*m.data[10]*m.data[12]
                + m.data[3]*m.data[8]*m.data[14] - m.data[0]*m.data[11]*m.data[14]
                - m.data[2]*m.data[8]*m.data[15] + m.data[0]*m.data[10]*m.data[15];

    r.data[9]   = m.data[3]*m.data[6]*m.data[12] - m.data[2]*m.data[7]*m.data[12]
                - m.data[3]*m.data[4]*m.data[14] + m.data[0]*m.data[7]*m.data[14]
                + m.data[2]*m.data[4]*m.data[15] - m.data[0]*m.data[6]*m.data[15];

    r.data[13]  = m.data[2]*m.data[7]*m.data[8] - m.data[3]*m.data[6]*m.data[8]
                + m.data[3]*m.data[4]*m.data[10] - m.data[0]*m.data[7]*m.data[10]
                - m.data[2]*m.data[4]*m.data[11] + m.data[0]*m.data[6]*m.data[11];

    r.data[2]   = m.data[5]*m.data[11]*m.data[12] - m.data[7]*m.data[9]*m.data[12]
                + m.data[7]*m.data[8]*m.data[13] - m.data[4]*m.data[11]*m.data[13]
                - m.data[5]*m.data[8]*m.data[15] + m.data[4]*m.data[9]*m.data[15];

    r.data[6]   = m.data[3]*m.data[9]*m.data[12] - m.data[1]*m.data[11]*m.data[12]
                - m.data[3]*m.data[8]*m.data[13] + m.data[0]*m.data[11]*m.data[13]
                + m.data[1]*m.data[8]*m.data[15] - m.data[0]*m.data[9]*m.data[15];

    r.data[10]  = m.data[1]*m.data[7]*m.data[12] - m.data[3]*m.data[5]*m.data[12]
                + m.data[3]*m.data[4]*m.data[13] - m.data[0]*m.data[7]*m.data[13]
                - m.data[1]*m.data[4]*m.data[15] + m.data[0]*m.data[5]*m.data[15];

    r.data[14]  = m.data[3]*m.data[5]*m.data[8] - m.data[1]*m.data[7]*m.data[8]
                - m.data[3]*m.data[4]*m.data[9] + m.data[0]*m.data[7]*m.data[9]
                + m.data[1]*m.data[4]*m.data[11] - m.data[0]*m.data[5]*m.data[11];

    r.data[3]   = m.data[6]*m.data[9]*m.data[12] - m.data[5]*m.data[10]*m.data[12]
                - m.data[6]*m.data[8]*m.data[13] + m.data[4]*m.data[10]*m.data[13]
                + m.data[5]*m.data[8]*m.data[14] - m.data[4]*m.data[9]*m.data[14];

    r.data[7]  = m.data[1]*m.data[10]*m.data[12] - m.data[2]*m.data[9]*m.data[12]
                + m.data[2]*m.data[8]*m.data[13] - m.data[0]*m.data[10]*m.data[13]
                - m.data[1]*m.data[8]*m.data[14] + m.data[0]*m.data[9]*m.data[14];

    r.data[11]  = m.data[2]*m.data[5]*m.data[12] - m.data[1]*m.data[6]*m.data[12]
                - m.data[2]*m.data[4]*m.data[13] + m.data[0]*m.data[6]*m.data[13]
                + m.data[1]*m.data[4]*m.data[14] - m.data[0]*m.data[5]*m.data[14];

    r.data[15]  = m.data[1]*m.data[6]*m.data[8] - m.data[2]*m.data[5]*m.data[8]
                + m.data[2]*m.data[4]*m.data[9] - m.data[0]*m.data[6]*m.data[9]
                - m.data[1]*m.data[4]*m.data[10] + m.data[0]*m.data[5]*m.data[10];

    // scale every element by the reciprocal determinant (note the [i] index)
    for (int i = 0; i < 16; i++)
        r.data[i] *= det;

    return r;
}
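Side note for anyone reusing code like the above: a quick way to sanity-check a hand-rolled cofactor inverse is to multiply the result back with the original matrix and compare it to the identity. The sketch below is plain Java (no jPCT or QCAR types, just a flat `float[16]` like `Matrix44F.data`), with a generic Gauss-Jordan inverse standing in for the SDK function; the class and method names are my own, not any library API:

```java
// Sanity check for a 4x4 inverse: A * inv(A) should be ~identity.
public class InverseCheck {

    // Generic Gauss-Jordan inverse (with partial pivoting) of a 4x4
    // matrix stored as a flat float[16]. Assumes m is non-singular.
    static float[] invert(float[] m) {
        int n = 4;
        double[][] a = new double[n][2 * n];
        for (int r = 0; r < n; r++) {
            for (int c = 0; c < n; c++) a[r][c] = m[r * n + c];
            a[r][n + r] = 1.0;                 // augment with identity
        }
        for (int col = 0; col < n; col++) {
            int pivot = col;                   // pick the largest pivot
            for (int r = col + 1; r < n; r++)
                if (Math.abs(a[r][col]) > Math.abs(a[pivot][col])) pivot = r;
            double[] tmp = a[col]; a[col] = a[pivot]; a[pivot] = tmp;
            double p = a[col][col];
            for (int c = 0; c < 2 * n; c++) a[col][c] /= p;
            for (int r = 0; r < n; r++) {
                if (r == col) continue;
                double f = a[r][col];
                for (int c = 0; c < 2 * n; c++) a[r][c] -= f * a[col][c];
            }
        }
        float[] out = new float[16];
        for (int r = 0; r < n; r++)
            for (int c = 0; c < n; c++) out[r * n + c] = (float) a[r][n + c];
        return out;
    }

    // Product of two flat 4x4 matrices.
    static float[] multiply(float[] a, float[] b) {
        float[] out = new float[16];
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++)
                for (int k = 0; k < 4; k++)
                    out[r * 4 + c] += a[r * 4 + k] * b[k * 4 + c];
        return out;
    }

    // True if m equals the identity to within eps.
    static boolean isIdentity(float[] m, float eps) {
        for (int i = 0; i < 16; i++) {
            float expected = (i % 5 == 0) ? 1f : 0f;  // diagonal at 0,5,10,15
            if (Math.abs(m[i] - expected) > eps) return false;
        }
        return true;
    }
}
```

The check is insensitive to whether the flat array is read as row-major or column-major, since inverting the transpose and multiplying in the transposed interpretation still yields the identity.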

2
Support / Re: OpenGL matrix to Camera
« on: April 05, 2011, 07:16:12 am »
I added this line:
world.getCamera().rotateX((float)Math.PI);

and the model is now standing. I apply the matrix I got from the AR engine directly to the camera, but the movement is not right: when I move the phone to the right, the model moves to the right too! I found this OpenGL article:
http://www.opengl.org/resources/faq/technical/viewing.htm

Does that mean I need to invert that matrix before calling camera.setBack(matrix)? I tried matrix.invert(), but it doesn't seem to help either.
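For what it's worth, the relationship that FAQ describes can be checked with a few lines of plain Java. If the modelview transform is eye = R*world + t, then the camera's world-space position is -Rᵀt (and its orientation Rᵀ), which is exactly what inverting the modelview gives for a rigid transform. A minimal sketch, with my own names (not jPCT API):

```java
// Recover the camera's world-space position from a rigid modelview
// transform (eye = R * world + t). Plain Java; no jPCT types involved.
public class CameraPose {

    // r is a 3x3 rotation (row-major float[9]), t a translation float[3].
    // The camera sits where R*c + t = 0, i.e. c = -R^T * t.
    static float[] cameraPosition(float[] r, float[] t) {
        float[] c = new float[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                c[i] -= r[j * 3 + i] * t[j];   // row i of R^T is column i of R
        return c;
    }

    // Apply eye = R*world + t, to verify a point's eye-space image.
    static float[] transform(float[] r, float[] t, float[] p) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = t[i];
            for (int j = 0; j < 3; j++) out[i] += r[i * 3 + j] * p[j];
        }
        return out;
    }
}
```

Feeding the recovered position back through the modelview should land exactly on the eye-space origin, which is a handy test for whether a given matrix really is a modelview (camera pose inverse) rather than a camera pose.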

Another big problem I see is related to the texture and the model in general (vertices). When I rotate the Ninja model with my finger, the "depth" of the parts is not correct: say the sword is supposed to be hidden behind the body after rotation, but instead it stays on top; same with other body parts like the hands. If I disable the AR camera feed, it goes back to normal, so I wonder what the AR engine has done to screw that up... or whether there is something I can reset before asking jPCT to render the model.

Thanks for taking the time to reply to my questions.


3
Support / Re: OpenGL matrix to Camera
« on: April 04, 2011, 04:11:11 am »
Actually, instead of using "fillDump" I guess I should use "setDump". With that, I can see the model moving relatively, kind of, but the whole model is upside down. The relative movement is probably not correct, although at least it moves relative to my phone and the tracked target. Also it cannot handle the "depth" part: if I pull my phone away, the model doesn't zoom out. What could be wrong?

4
Support / Re: OpenGL matrix to Camera
« on: April 04, 2011, 12:39:32 am »
I try to use the modelview matrix returned by the AR engine from JNI, which is a float[4*4] matrix, like this:

Matrix mat = new Matrix();
mat.fillDump(mvmat);
world.getCamera().setBack(mat);

I don't see the model (I am using the Ninja sample) when it first starts; then I drag with my finger a bit to rotate the model into view. Then I move the phone around to see if the model moves (the matrix changes). I expected that even if the matrix is wrong, the model (or the camera) should move as well, but nothing happens. I also tried "mat.invert()", but it doesn't seem to help.

Is this the right way to apply a matrix to a camera?

I was also thinking about just making the OpenGL calls to set the projection and modelview matrices before calling the jPCT "world" functions, but without much success so far:

world.renderScene(frameBuffer);
world.draw(frameBuffer);

What do you mean by "confuse the engine if you use the objects' matrices"? How do I make sure the jPCT object matrices are not being used at all? But it is probably not a good idea to try to override the engine's matrices, since if I want to make a game I will need to animate the model and move it around using the engine's function calls.

More info on what the projection and modelview matrices (4*4) look like is below. Could you let me know, if you have a chance, which jPCT function to use to convert the modelview matrix and how to apply them correctly? Thanks a lot!!
__ pmat:
2.2,0.0,0.0,0.0,
0.0,-3.3,0.0,0.0,
0.0,-0.00625,1.002002,1.0,
0.0,0.0,-4.004004,0.0,
__ mvmat:
0.88026357,-0.11042596,0.46145666,0.0,
-0.05797603,-0.99028623,-0.12638049,0.0,
0.47092986,0.08449472,-0.8781149,0.0,
37.16981,-27.505302,541.533,1.0,
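For reference, the mvmat dump above has the translation (37.16981, -27.505302, 541.533) in the fourth printed group, which is what an OpenGL column-major float[16] looks like when printed four floats at a time: the translation lives at indices 12-14. A row-major consumer expects it in the fourth column instead, so a plain transpose of the flat array is the usual conversion. A generic sketch (not a specific jPCT call):

```java
// Transpose a flat 4x4 float[16]. This converts between OpenGL
// column-major order and row-major order; the operation is its own
// inverse, so the same call works in both directions.
public class MatOrder {
    static float[] transpose(float[] m) {
        float[] out = new float[16];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                out[row * 4 + col] = m[col * 4 + row];
        return out;
    }
}
```

After transposing a column-major modelview, the translation moves from indices 12-14 to indices 3, 7, and 11, which is one quick way to tell which convention a mystery matrix is using.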

5
Support / Re: OpenGL matrix to Camera
« on: April 03, 2011, 05:54:26 pm »
Those matrices are created by an AR engine, so I guess they are derived from the "target image" tracked by the camera and the phone orientation (I am using jPCT for Android). I have no access to the logic that creates those matrices. So do you think it is still possible to apply that modelview matrix to the jPCT camera? If not, is there any backdoor way to apply this matrix directly in the low-level code? The AR engine pretty much acts as the camera already in my case; all I need from jPCT is mainly the ability to load and render models. Although I don't prefer this hacky way and would still like to somehow make use of the jPCT camera object, of course.

Thanks a lot for replying!

6
Support / OpenGL matrix to Camera
« on: April 03, 2011, 11:16:02 am »
Hi,

I have 2 OpenGL matrices, projection and modelview (data is a float[4*4]), originally used like this:
glMatrixMode(GL_PROJECTION);
glLoadMatrixf(projectionMatrix.data);

glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(modelViewMatrix.data);

Now I get those matrices out of the C/C++ code and want to apply them to the jPCT Camera:

world.getCamera()

There are a couple of functions available. How should I apply those 2 matrices to the camera correctly?
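One piece that a scene-graph camera usually needs instead of the raw projection matrix is the field of view. For a standard OpenGL perspective matrix, the [1][1] entry is cot(fovY/2), so the vertical FOV can be recovered directly; with the pmat posted later in this thread (diagonal 2.2 and -3.3, the negative sign presumably from a mirrored video background), this comes out to roughly 33.7 degrees. A generic derivation, not a jPCT API:

```java
// Recover the vertical field of view (in radians) from a standard
// OpenGL perspective projection matrix stored as a flat float[16].
// p[5] is cot(fovY / 2); the sign is ignored in case the projection
// was mirrored for a flipped video background.
public class FovFromProjection {
    static double verticalFov(float[] p) {
        return 2.0 * Math.atan(1.0 / Math.abs(p[5]));
    }
}
```

The horizontal FOV follows the same way from p[0] (here 2.2), so the two together also give the projection's aspect ratio.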

thanks.
ron


