Author Topic: How to access the vertex buffer, normal buffer and tex coord buffer from Bones?  (Read 16339 times)

Offline PsychoTodd

  • byte
  • *
  • Posts: 5
    • View Profile
Hello everyone,

Currently I would like to use Bones' functions to load animated files. It works nicely, so thank you for providing this tool for Android.

However, after I animate the model to a certain pose based on time (AnimatedGroup::animate(index, animation)), I need to use OpenGL ES to draw the mesh myself in my own rendering context. I may not be able to call World::renderScene and draw, since I have already set up the OpenGL ES rendering in my program.

So my question is: how can I access the animated vertices, texture coordinates and normals (if there are any), so I can pass them to GLES20.glVertexAttribPointer and draw myself? Is there any example I can look into?

Thank you so much!

Offline EgonOlsen

  • Administrator
  • quad
  • *****
  • Posts: 12295
    • View Profile
    • http://www.jpct.net
You can't; being able to do that would actually defeat the purpose of the engine....but why would you? What kind of program is it that prevents you from calling renderScene() and such?

Offline raft

  • Moderator
  • quad
  • *****
  • Posts: 1993
    • View Profile
    • http://www.aptalkarga.com
Normally, you can access the vertices and normals of an Object3D by attaching an IVertexController to its Mesh. But since Animated3D already attaches its own IVertexController, setting another one will break the animation.

So the simplest solution would be to alter the code of Animated3D and expose its destMesh and destNormals fields. You can use them after the animation is applied (if autoApplyAnimation is true the animation is applied automatically, otherwise you should manually call Animated3D.applyAnimation()).

Texture coordinates are another story..
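
As a rough illustration, something like this could bridge the exposed data to GLES20. Note that the getter names getDestMesh()/getDestNormals() and the SimpleVector[] type of those fields are assumptions made for the sketch, not existing Bones API:

Code: [Select]
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import com.threed.jpct.SimpleVector;

import android.opengl.GLES20;

public class BonesGlBridge {

    /** Flattens a SimpleVector[] into a direct FloatBuffer usable by GLES20. */
    public static FloatBuffer toFloatBuffer(SimpleVector[] vectors) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(vectors.length * 3 * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        for (SimpleVector v : vectors) {
            buffer.put(v.x).put(v.y).put(v.z);
        }
        buffer.position(0);
        return buffer;
    }

    /** Binds position and normal data to the given shader attribute locations. */
    public static void bindVertexData(FloatBuffer positions, FloatBuffer normals,
                                      int aPositionLoc, int aNormalLoc) {
        GLES20.glEnableVertexAttribArray(aPositionLoc);
        GLES20.glVertexAttribPointer(aPositionLoc, 3, GLES20.GL_FLOAT, false, 0, positions);
        GLES20.glEnableVertexAttribArray(aNormalLoc);
        GLES20.glVertexAttribPointer(aNormalLoc, 3, GLES20.GL_FLOAT, false, 0, normals);
    }
}

After animate(...) and applyAnimation() you would rebuild the buffers each frame from the exposed arrays; this only shows the data flow, it is not an optimized renderer.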

Offline PsychoTodd

  • byte
  • *
  • Posts: 5
    • View Profile
Hello, I know it defeats the purpose of the engine. But I want to use only the Bones animation component because I need to render the mesh in an augmented reality engine. The engine draws the live camera feed first and then draws the 3D object on top of it. Do you think renderScene() can handle that? BTW, is jPCT for Android based on OpenGL ES 1.0 now? The engine I work with uses OpenGL ES 2.0, so I am not sure whether renderScene() will work in this case.

Thank you for your reply to my questions.

Quote from: EgonOlsen
You can't; being able to do that would actually defeat the purpose of the engine....but why would you? What kind of program is it that prevents you from calling renderScene() and such?

Offline PsychoTodd

  • byte
  • *
  • Posts: 5
    • View Profile
Hello raft

I am working on an augmented reality project for Android and I want to include some animation in it. That is the reason I found Bones for jPCT. I feel that loading models and animations with Bones is very good functionality.

But when I draw it, I hope there is still a way to access the information I need to call OpenGL ES functions and draw the model myself. Have you encountered this question before, where someone wants to draw the animated model outside of the jPCT engine?

I will try the way you suggested, but as you said, if the texture coordinates cannot be provided easily, how can I apply a texture to the animated model?

Thank you!


Quote from: raft
Normally, you can access the vertices and normals of an Object3D by attaching an IVertexController to its Mesh. But since Animated3D already attaches its own IVertexController, setting another one will break the animation.

So the simplest solution would be to alter the code of Animated3D and expose its destMesh and destNormals fields. You can use them after the animation is applied (if autoApplyAnimation is true the animation is applied automatically, otherwise you should manually call Animated3D.applyAnimation()).

Texture coordinates are another story..

Offline PsychoTodd

  • byte
  • *
  • Posts: 5
    • View Profile
Hello guys

Thanks to raft's suggestion, I am able to extract the data I need to render one mesh. As for the texture UVs and vertex indices, since they don't change dynamically, Bones keeps them in MeshData as uvs and indices. As the experts noted above, exposing this information is not elegant and violates the design of Bones as a component of jPCT, but it is a workable approach if you want to give this nice lightweight animation system a try.
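
For anyone following along, here is a rough sketch of how the static part (UVs and indices) could be uploaded once and reused every frame. The uvs/indices names come from MeshData as mentioned above; their exact types (a flat float[] of u,v pairs and an int[]) and the attribute location parameter are assumptions for the example:

Code: [Select]
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import android.opengl.GLES20;

public class StaticMeshBuffers {

    public final FloatBuffer uvBuffer;
    public final ShortBuffer indexBuffer;
    public final int indexCount;

    public StaticMeshBuffers(float[] uvs, int[] indices) {
        // UVs: two floats per vertex, never change with the animation.
        uvBuffer = ByteBuffer.allocateDirect(uvs.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        uvBuffer.put(uvs).position(0);

        // Indices: converted to shorts for GL_UNSIGNED_SHORT drawing.
        indexBuffer = ByteBuffer.allocateDirect(indices.length * 2)
                .order(ByteOrder.nativeOrder()).asShortBuffer();
        for (int index : indices) {
            indexBuffer.put((short) index);
        }
        indexBuffer.position(0);
        indexCount = indices.length;
    }

    /** Draws the mesh, assuming positions/normals were already bound elsewhere. */
    public void draw(int aTexCoordLoc) {
        GLES20.glEnableVertexAttribArray(aTexCoordLoc);
        GLES20.glVertexAttribPointer(aTexCoordLoc, 2, GLES20.GL_FLOAT, false, 0, uvBuffer);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, indexCount,
                GLES20.GL_UNSIGNED_SHORT, indexBuffer);
    }
}

Since these arrays don't change with the animation, the buffers can be built once at load time, while the position/normal buffers are refilled per frame from the exposed destMesh/destNormals data.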

I will continue to try out whether I can animate the character. I'll keep this thread updated....

A quick question: to move and rotate the animated character as a whole object, I may need to handle the OpenGL rotation and translation myself, since I believe the Animated3D rotation and translation only work if the model is drawn through the World instance, right?
« Last Edit: August 27, 2015, 06:35:32 am by PsychoTodd »

Offline raft

  • Moderator
  • quad
  • *****
  • Posts: 1993
    • View Profile
    • http://www.aptalkarga.com
Quote from: PsychoTodd
But I want to use only the Bones animation component because I need to render the mesh in an augmented reality engine. The engine draws the live camera feed first and then draws the 3D object on top of it. Do you think renderScene() can handle that?
I think any two or more libraries can render to the same OpenGL context. Actually, at my former job I worked on an augmented reality project and managed to render the camera view and the jPCT world into the same OpenGL context. You can find more information in this thread.

Quote from: PsychoTodd
A quick question: to move and rotate the animated character as a whole object, I may need to handle the OpenGL rotation and translation myself, since I believe the Animated3D rotation and translation only work if the model is drawn through the World instance, right?
Correct. Normally the suggested way to translate/rotate Animated3D's is to apply such transformations to AnimatedGroup.getRoot(), but that information is only used by jPCT at render time.
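
So if you render outside of jPCT, you need to build and upload an equivalent model matrix yourself. A minimal sketch using Android's Matrix helper; the uModelMatrix uniform name and the simple translate-plus-yaw transform are only an example, not anything jPCT or Bones provides:

Code: [Select]
import android.opengl.GLES20;
import android.opengl.Matrix;

public class ManualTransform {

    /**
     * Builds a model matrix by hand and uploads it to a hypothetical
     * "uModelMatrix" uniform of the given shader program, since a transformation
     * set on AnimatedGroup.getRoot() is only consumed by jPCT's own renderer.
     */
    public static void uploadModelMatrix(int program, float x, float y, float z,
                                         float yawDegrees) {
        float[] modelMatrix = new float[16];
        Matrix.setIdentityM(modelMatrix, 0);
        Matrix.translateM(modelMatrix, 0, x, y, z);              // character position
        Matrix.rotateM(modelMatrix, 0, yawDegrees, 0f, 1f, 0f);  // rotation around the up axis

        int location = GLES20.glGetUniformLocation(program, "uModelMatrix");
        GLES20.glUniformMatrix4fv(location, 1, false, modelMatrix, 0);
    }
}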

For texture coordinates, you can expose and use the MeshData of Animated3D.

But my overall suggestion would be to use jPCT for rendering instead of such workarounds and hacks.