Author Topic: OpenGL matrix to Camera  (Read 6207 times)

Offline ronald

  • byte
  • Posts: 6
OpenGL matrix to Camera
« on: April 03, 2011, 11:16:02 am »
Hi,

I have 2 OpenGL matrices for projection and modelview (data is a float[4*4]), originally used like this:
glMatrixMode(GL_PROJECTION);
glLoadMatrixf(projectionMatrix.data);

glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(modelViewMatrix.data);

Now I get those matrices out of the C/C++ code and want to apply them to the jPCT Camera:

world.getCamera()

There are a couple of functions available. So how should I apply those 2 matrices to the camera correctly?

thanks.
ron



Offline EgonOlsen

  • Administrator
  • quad
  • Posts: 12295
    • http://www.jpct.net
Re: OpenGL matrix to Camera
« Reply #1 on: April 03, 2011, 04:24:01 pm »
There is no way to set the projection matrix directly. It's created at runtime from the current camera settings. The modelview matrix has to be converted into jPCT's format. There is a method that does the opposite conversion... I'm not sure right now whether it can be used for the conversion from GL too, but it might. However, it depends on how the C++ code creates this matrix. You might have to split it into the camera part and the object part (if that's even possible). IMHO, it would be better to convert the logic that leads to these matrices rather than the matrices themselves, because otherwise you're somewhat bypassing the engine.
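
A minimal sketch of the jPCT side such a conversion would target, assuming mvmat is the float[16] modelview delivered by the native code (a hypothetical name) and world is the jPCT World; whether that dump needs transposing or inverting first depends on the points above:

import com.threed.jpct.Camera;
import com.threed.jpct.Matrix;

// mvmat: float[16] modelview dump from the native/AR side (assumed name).
Matrix back = new Matrix();
back.setDump(mvmat);              // load the raw dump into a jPCT matrix
Camera cam = world.getCamera();
cam.setBack(back);                // use it as the camera's view ("back") matrix
// There is no setter for the projection matrix; only camera settings such as
// cam.setFOV(...) influence the projection jPCT builds at render time.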

Offline ronald

  • byte
  • Posts: 6
Re: OpenGL matrix to Camera
« Reply #2 on: April 03, 2011, 05:54:26 pm »
Those matrices are created by an AR engine, so I guess they are based on the "target image" tracked by the camera and the phone orientation (I am using jPCT for Android). I have no access to the logic that creates those matrices. So do you think it is still possible to apply that modelview matrix to the jPCT camera? If not, is there any backdoor way to apply this matrix directly in the low-level code? The AR engine pretty much acts like the camera already in my case, and all I really need from jPCT is the ability to load and render models... although I'd prefer not to go that hacky route and would still like to make use of the jPCT camera object, of course.

Thanks a lot for replying!

Offline EgonOlsen

  • Administrator
  • quad
  • Posts: 12295
    • http://www.jpct.net
Re: OpenGL matrix to Camera
« Reply #3 on: April 03, 2011, 09:53:07 pm »
You can still use this matrix and set it, but it will confuse the engine if you use the objects' matrices too. As long as you don't use them, you might get away with it. You still have to convert the modelview matrix into jPCT's format and extract the translation from it. As said, I'm not sure right now how to convert it and whether the method that does the opposite fits here... and I can't try it at the moment (no access to a computer, just a smartphone).

Offline ronald

  • byte
  • Posts: 6
Re: OpenGL matrix to Camera
« Reply #4 on: April 04, 2011, 12:39:32 am »
I tried to use the modelview matrix returned by the AR engine via JNI, which is a float[4*4], like this:

Matrix mat = new Matrix();
mat.fillDump(mvmat);
world.getCamera().setBack(mat);

I don't see the model (I am using the Ninja sample) when it first starts; then I drag with my finger a bit to rotate the model into view. Then I move the phone around to see if the model moves (the matrix changes). I expected that, even if the matrix were wrong, the model (or the camera) should still move, but nothing happens. I also tried mat.invert(), but that doesn't seem to help...

Is this the right way to apply a matrix to a camera?

I was also thinking about just making the OpenGL calls to set the projection and modelview matrices before calling the jPCT "world" functions, but without much success so far:

world.renderScene(frameBuffer);
world.draw(frameBuffer);

What do you mean by "confuse the engine if you use the objects' matrices"? How do I make sure the jPCT objects' matrices are not being used at all? But it's probably not a good idea, I guess, to try to override the engine's matrix, since if I want to make a game I will need to animate the model and move it around using the engine's function calls...

Below is more info on what the projection and modelview matrices (4*4) look like. Could you let me know, when you get a chance, which jPCT function to use to convert the modelview matrix and how to apply them correctly? Thanks a lot!!
__ pmat:
2.2,0.0,0.0,0.0,
0.0,-3.3,0.0,0.0,
0.0,-0.00625,1.002002,1.0,
0.0,0.0,-4.004004,0.0,
__ mvmat:
0.88026357,-0.11042596,0.46145666,0.0,
-0.05797603,-0.99028623,-0.12638049,0.0,
0.47092986,0.08449472,-0.8781149,0.0,
37.16981,-27.505302,541.533,1.0,

Offline ronald

  • byte
  • Posts: 6
Re: OpenGL matrix to Camera
« Reply #5 on: April 04, 2011, 04:11:11 am »
Actually, instead of "fillDump" I should use "setDump", I guess. If I use that, I can sort of see the model moving, but the whole model is upside down. The relative movement is probably not quite correct, although at least it moves relative to my phone and the tracked target. Also, it can't handle the "depth" part, meaning if I pull my phone away, the model doesn't zoom out... What could be wrong??

Offline EgonOlsen

  • Administrator
  • quad
  • Posts: 12295
    • http://www.jpct.net
Re: OpenGL matrix to Camera
« Reply #6 on: April 04, 2011, 11:37:20 am »
OpenGL and jPCT differ in their coordinate systems. Try to apply a rotation around x of 90°. That should actually fix the problem. About the zoom... it looks as if the native code does this by tweaking the projection matrix. In jPCT, you can do this by setting the FOV. I'm not sure how to derive the FOV from the projection matrix, but it should be possible to find the answer on Google.
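
For what it's worth, a hedged sketch of deriving the view angle from a standard GL perspective projection matrix (using the pmat posted above): for such a matrix, element 0 is cot(horizontalFov/2) and element 5 is cot(verticalFov/2). How that angle maps onto jPCT's Camera.setFOV() value is an assumption here (taken as the viewing-plane width at distance 1) and should be checked against the Camera docs:

// pmat: column-major float[16] projection matrix from the AR engine (assumed name).
double horizFov = 2.0 * Math.atan(1.0 / Math.abs(pmat[0])); // radians
double vertFov  = 2.0 * Math.atan(1.0 / Math.abs(pmat[5])); // radians

// Assumption: jPCT's FOV value equals 2 * tan(angle / 2), i.e. the width of the
// viewing plane at distance 1. Adjust if that doesn't match your jPCT version.
float jpctFov = (float) (2.0 * Math.tan(horizFov / 2.0));
world.getCamera().setFOV(jpctFov);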

Offline ronald

  • byte
  • Posts: 6
Re: OpenGL matrix to Camera
« Reply #7 on: April 05, 2011, 07:16:12 am »
I added this line:
world.getCamera().rotateX((float)Math.PI);

and the model is now standing upright. I apply the matrix I got from the AR engine directly to the camera, but the movement is not right... when I move the phone to the right, the model moves to the right too! I found this OpenGL article:
http://www.opengl.org/resources/faq/technical/viewing.htm

Does that mean I need to invert that matrix before calling camera.setBack(matrix)? I tried matrix.invert(), but that doesn't seem to help either.

Another big problem I see is related to the texture and the model in general (vertices). When I rotate the Ninja model with my finger, the "depth" of the parts is not correct: say the sword is supposed to be hidden behind the body after the rotation, but instead it stays on top, and the same happens with other body parts like the hands. I tried disabling the AR camera feed and it goes back to normal, so I wonder what the AR engine has done to screw that up... or whether there is something I can reset before asking jPCT to render the model.

Thanks for taking the time to reply to my questions.


Offline EgonOlsen

  • Administrator
  • quad
  • Posts: 12295
    • http://www.jpct.net
Re: OpenGL matrix to Camera
« Reply #8 on: April 06, 2011, 09:29:27 pm »
The wrong order might be caused by the matrix not being a pure rotation matrix. As mentioned, it contains the translation part too (the last row: 37.16981, -27.505302, 541.533, 1.0). You have to extract and remove this part, apply the same rotation to it as you did to the matrix itself, and then apply the result as a separate translation to the camera. Can you post a screenshot of the result? Maybe the culling is inverted by the "enemy" matrix.
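
A minimal sketch of that idea, assuming mvmat is the float[16] from the AR engine and the 180° flip around x used above; the exact signs and whether to set the result as the camera position or apply it as a relative translation may need experimenting:

import com.threed.jpct.Camera;
import com.threed.jpct.Matrix;
import com.threed.jpct.SimpleVector;

// Strip the translation part (indices 12..14 of the dump) so only rotation remains.
float[] rotOnly = mvmat.clone();
rotOnly[12] = 0f;
rotOnly[13] = 0f;
rotOnly[14] = 0f;

Matrix back = new Matrix();
back.setDump(rotOnly);
Camera cam = world.getCamera();
cam.setBack(back);
cam.rotateX((float) Math.PI);   // coordinate-system flip, as in the earlier reply

// Rotate the extracted translation the same way (180° around x flips y and z)
// and apply it separately as the camera position.
SimpleVector trans = new SimpleVector(mvmat[12], -mvmat[13], -mvmat[14]);
cam.setPosition(trans);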

Offline ronald

  • byte
  • Posts: 6
Re: OpenGL matrix to Camera
« Reply #9 on: April 07, 2011, 09:00:27 am »
For the model texture/vertex depth problem, it seems to be fixed (not 100% sure) by adding "glEnable(GL_DEPTH_TEST);" in the native code before calling the jPCT render code... but I still need to test more to confirm that, since it is very difficult to rotate the 3D model into view while I am experimenting with the matrix.

Currently I am testing the matrix returned by an inverse function, which "supposedly" gives the eye view (camera) of the model if I pass in the modelview matrix, but without much success. So are you saying I should set those values in the last row to "0" (except the last 1.0)?

Are jPCT matrices row-major or column-major? I am not sure whether the 4x4 OpenGL matrix returned by Qualcomm is ready to use...

Here is the Qualcomm library util function for computing an inverse:

QCAR::Matrix44F SampleMath::Matrix44FInverse(QCAR::Matrix44F& m)
{
    QCAR::Matrix44F r;

    float det = 1.0f / Matrix44FDeterminate(m);

    r.data[0]   = m.data[6]*m.data[11]*m.data[13] - m.data[7]*m.data[10]*m.data[13]
                + m.data[7]*m.data[9]*m.data[14] - m.data[5]*m.data[11]*m.data[14]
                - m.data[6]*m.data[9]*m.data[15] + m.data[5]*m.data[10]*m.data[15];

    r.data[4]   = m.data[3]*m.data[10]*m.data[13] - m.data[2]*m.data[11]*m.data[13]
                - m.data[3]*m.data[9]*m.data[14] + m.data[1]*m.data[11]*m.data[14]
                + m.data[2]*m.data[9]*m.data[15] - m.data[1]*m.data[10]*m.data[15];

    r.data[8]   = m.data[2]*m.data[7]*m.data[13] - m.data[3]*m.data[6]*m.data[13]
                + m.data[3]*m.data[5]*m.data[14] - m.data[1]*m.data[7]*m.data[14]
                - m.data[2]*m.data[5]*m.data[15] + m.data[1]*m.data[6]*m.data[15];

    r.data[12]  = m.data[3]*m.data[6]*m.data[9] - m.data[2]*m.data[7]*m.data[9]
                - m.data[3]*m.data[5]*m.data[10] + m.data[1]*m.data[7]*m.data[10]
                + m.data[2]*m.data[5]*m.data[11] - m.data[1]*m.data[6]*m.data[11];

    r.data[1]   = m.data[7]*m.data[10]*m.data[12] - m.data[6]*m.data[11]*m.data[12]
                - m.data[7]*m.data[8]*m.data[14] + m.data[4]*m.data[11]*m.data[14]
                + m.data[6]*m.data[8]*m.data[15] - m.data[4]*m.data[10]*m.data[15];

    r.data[5]   = m.data[2]*m.data[11]*m.data[12] - m.data[3]*m.data[10]*m.data[12]
                + m.data[3]*m.data[8]*m.data[14] - m.data[0]*m.data[11]*m.data[14]
                - m.data[2]*m.data[8]*m.data[15] + m.data[0]*m.data[10]*m.data[15];

    r.data[9]   = m.data[3]*m.data[6]*m.data[12] - m.data[2]*m.data[7]*m.data[12]
                - m.data[3]*m.data[4]*m.data[14] + m.data[0]*m.data[7]*m.data[14]
                + m.data[2]*m.data[4]*m.data[15] - m.data[0]*m.data[6]*m.data[15];

    r.data[13]  = m.data[2]*m.data[7]*m.data[8] - m.data[3]*m.data[6]*m.data[8]
                + m.data[3]*m.data[4]*m.data[10] - m.data[0]*m.data[7]*m.data[10]
                - m.data[2]*m.data[4]*m.data[11] + m.data[0]*m.data[6]*m.data[11];

    r.data[2]   = m.data[5]*m.data[11]*m.data[12] - m.data[7]*m.data[9]*m.data[12]
                + m.data[7]*m.data[8]*m.data[13] - m.data[4]*m.data[11]*m.data[13]
                - m.data[5]*m.data[8]*m.data[15] + m.data[4]*m.data[9]*m.data[15];

    r.data[6]   = m.data[3]*m.data[9]*m.data[12] - m.data[1]*m.data[11]*m.data[12]
                - m.data[3]*m.data[8]*m.data[13] + m.data[0]*m.data[11]*m.data[13]
                + m.data[1]*m.data[8]*m.data[15] - m.data[0]*m.data[9]*m.data[15];

    r.data[10]  = m.data[1]*m.data[7]*m.data[12] - m.data[3]*m.data[5]*m.data[12]
                + m.data[3]*m.data[4]*m.data[13] - m.data[0]*m.data[7]*m.data[13]
                - m.data[1]*m.data[4]*m.data[15] + m.data[0]*m.data[5]*m.data[15];

    r.data[14]  = m.data[3]*m.data[5]*m.data[8] - m.data[1]*m.data[7]*m.data[8]
                - m.data[3]*m.data[4]*m.data[9] + m.data[0]*m.data[7]*m.data[9]
                + m.data[1]*m.data[4]*m.data[11] - m.data[0]*m.data[5]*m.data[11];

    r.data[3]   = m.data[6]*m.data[9]*m.data[12] - m.data[5]*m.data[10]*m.data[12]
                - m.data[6]*m.data[8]*m.data[13] + m.data[4]*m.data[10]*m.data[13]
                + m.data[5]*m.data[8]*m.data[14] - m.data[4]*m.data[9]*m.data[14];

    r.data[7]  = m.data[1]*m.data[10]*m.data[12] - m.data[2]*m.data[9]*m.data[12]
                + m.data[2]*m.data[8]*m.data[13] - m.data[0]*m.data[10]*m.data[13]
                - m.data[1]*m.data[8]*m.data[14] + m.data[0]*m.data[9]*m.data[14];

    r.data[11]  = m.data[2]*m.data[5]*m.data[12] - m.data[1]*m.data[6]*m.data[12]
                - m.data[2]*m.data[4]*m.data[13] + m.data[0]*m.data[6]*m.data[13]
                + m.data[1]*m.data[4]*m.data[14] - m.data[0]*m.data[5]*m.data[14];

    r.data[15]  = m.data[1]*m.data[6]*m.data[8] - m.data[2]*m.data[5]*m.data[8]
                + m.data[2]*m.data[4]*m.data[9] - m.data[0]*m.data[6]*m.data[9]
                - m.data[1]*m.data[4]*m.data[10] + m.data[0]*m.data[5]*m.data[10];

    for (int i = 0; i < 16; i++)
        r.data[i] *= det;
   
    return r;
}

Offline EgonOlsen

  • Administrator
  • quad
  • Posts: 12295
    • http://www.jpct.net
Re: OpenGL matrix to Camera
« Reply #10 on: April 07, 2011, 09:42:55 am »
Quote
...glEnable(GL_DEPTH_TEST);...
This only makes sense if the AR engine fiddles around with the GL render pipeline, because jPCT actually enables the depth test by default. Unless the AR part disables it, there should be no need for this call... but well, obviously it does. Combining the two still doesn't look like a good idea to me...

Like the jPCT docs state:
Quote
All matrices in jPCT are row major.
But if you are using the fillDump() method, this conversion should actually happen "by accident" anyway.
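
A hedged illustration of why that can work out: a column-major GL float[16] read as a row-major dump is effectively transposed, and for the pure-rotation 3x3 part the transpose equals the inverse. If an explicit row-major copy is ever needed, it's a straightforward swap (mvmat again being the GL array from the native side):

// Explicit column-major -> row-major conversion of a 4x4 GL matrix dump.
float[] rowMajor = new float[16];
for (int row = 0; row < 4; row++) {
    for (int col = 0; col < 4; col++) {
        rowMajor[row * 4 + col] = mvmat[col * 4 + row];
    }
}
// The upper-left 3x3 of this transpose is the inverse of the original rotation,
// which is why loading the raw GL dump can look like an inversion "by accident".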