Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - PsychoTodd

Pages: [1]
Bones / [resolved] How to use bones to load models without animation.
« on: November 19, 2015, 11:43:41 pm »
Hi guys,

I think it is good to share some of my experience here regarding using the animation library to load and display a static model. In certain cases, you may want to use the same Bones library to handle both static objects and animated characters. However, when you convert a static model to Collada format and try to use the Bones script to convert the mesh, it throws an exception complaining that there are no skins: "ColladaStorage contains no skins". It is not that difficult to update your static model in Maya to create a new dae file which the script will accept.

Prerequisite: if you are not familiar with Maya rigging, have a look at lessons 3 and 4 from this YouTube link:

The idea is to add a very simple skeleton to all the meshes in your original scene and do a certain minimum of steps so that the Bones script is happy. All of these steps are for Maya 2016.

1. Make sure your menu set is Rigging, then choose the Skeleton menu -> Create Joints.
2. Create one joint at the root of your meshes (the base joint) and another one at the top (joint 2), similar to what is shown in the YouTube video.
3. Select one of the meshes in your scene, then Shift + select the base joint of the skeleton and press the 'p' key to parent them. Repeat this step for every mesh in the scene with the base joint.
4. Now in the tree list view (the Outliner), you should see that joint 2 and all the meshes are branches at the same level under joint 1 (the base joint).
5. Select joint 1 and all the meshes, then go to the Skin menu and choose Bind Skin.
6. Finally, select joint 1 (which will select everything, since joint 1 is the root), then export the selection through the OpenCOLLADA plugin for Maya.

This dae file can now be converted by the Bones script ardorCollada2Bones.bat (with some warnings), and you should get a bones file that can be loaded in the engine. However, since there is no animation inside, do not try to animate it.

Hope it helps.

Hello guys

Thanks to raft's suggestion, I am able to extract the data which I need to render one mesh. As for the texture UVs and vertex indices, since they don't change dynamically, Bones puts them in the MeshData as uvs and indices. As the experts suggested above, exposing this information is not elegant and violates the design of Bones as a component of jPCT, but there is a way to work around it if you want to give this nice, lightweight animation system a try.

I will continue to try out whether I can animate the character. Will keep updating...

A quick question: to move or rotate the animated character as a whole object, I may need to handle the OpenGL rotation and translation myself, since I believe the Animated3D-related rotation and translation only work if the model is drawn with the World instance, right?
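In case it helps anyone in the same situation, here is a minimal standalone sketch (plain Java, no jPCT; class and method names are made up for illustration) of the kind of model matrix you end up building yourself when the object is not transformed through a World instance:

```java
// Hypothetical sketch: composing a translation * Y-rotation model matrix
// by hand, as you would when bypassing jPCT's World transform pipeline.
public class ModelMatrix {

    // Returns a column-major 4x4 matrix = translation(tx,ty,tz) * rotationY(angleRad),
    // the layout OpenGL (ES) expects.
    static float[] translateRotateY(float tx, float ty, float tz, float angleRad) {
        float c = (float) Math.cos(angleRad);
        float s = (float) Math.sin(angleRad);
        return new float[] {
             c, 0, -s, 0,   // column 0
             0, 1,  0, 0,   // column 1
             s, 0,  c, 0,   // column 2
            tx, ty, tz, 1   // column 3 (translation)
        };
    }

    // Transforms the point (x, y, z, 1) by a column-major matrix m.
    static float[] transform(float[] m, float x, float y, float z) {
        return new float[] {
            m[0]*x + m[4]*y + m[8]*z  + m[12],
            m[1]*x + m[5]*y + m[9]*z  + m[13],
            m[2]*x + m[6]*y + m[10]*z + m[14]
        };
    }

    public static void main(String[] args) {
        // Rotate 90 degrees about Y, then translate by (1, 0, 0):
        // (1,0,0) rotates to (0,0,-1), then moves to (1,0,-1).
        float[] m = translateRotateY(1f, 0f, 0f, (float) (Math.PI / 2));
        float[] p = transform(m, 1f, 0f, 0f);
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]); // prints 1.000 0.000 -1.000
    }
}
```

The resulting array can then be uploaded to a shader uniform (e.g. with GLES20.glUniformMatrix4fv) the same way the AR engine uploads its other matrices.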

Hello raft

I am working on another augmented reality project for Android and I want to include some animation in it. That is how I found Bones for jPCT. I feel that loading models and animations with Bones works very well.

But when I draw it, I hope there is still a way to access the information I need so I can call OpenGL ES functions to draw the model myself. Have you encountered this question before, from someone who wants to draw the animated model outside of the jPCT engine?

I will try the way you suggested, but as you said, if the texture coordinates cannot be provided easily, then how can I apply a texture to the animated model?

Thank you!

normally, you can access the vertices and normals of an Object3D by attaching an IVertexController to its Mesh. but as Animated3D already attaches its own IVertexController, if you set another, you will break the animation.

so, the simplest solution will be to alter the code of Animated3D and expose the destMesh and destNormals fields. you can use them after the animation is applied (if autoApplyAnimation is true, the animation is applied automatically; otherwise you should manually call Animated3D.applyAnimation())

texture coordinates is another story..
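For background, the deformation that applyAnimation performs on those dest arrays is conceptually standard linear blend skinning: each output vertex is a weighted sum of the bind-pose vertex transformed by its influencing joints. A standalone sketch (plain Java, not Bones' actual code; all names here are made up):

```java
// Hypothetical sketch of linear blend skinning -- the standard technique a
// skeletal animation system applies before exposing the deformed vertices.
public class SkinningSketch {

    // Skins one vertex: sum over influencing joints of
    // weight * (jointMatrix * bindPoseVertex).
    // Each joint matrix is 3x4, row-major (rotation in columns 0-2,
    // translation in column 3).
    static float[] skinVertex(float[] v, float[][] jointMatrices,
                              int[] joints, float[] weights) {
        float[] out = new float[3];
        for (int i = 0; i < joints.length; i++) {
            float[] m = jointMatrices[joints[i]];
            float w = weights[i];
            out[0] += w * (m[0]*v[0] + m[1]*v[1] + m[2] *v[2] + m[3]);
            out[1] += w * (m[4]*v[0] + m[5]*v[1] + m[6] *v[2] + m[7]);
            out[2] += w * (m[8]*v[0] + m[9]*v[1] + m[10]*v[2] + m[11]);
        }
        return out;
    }

    public static void main(String[] args) {
        // Joint 0: identity. Joint 1: translate +2 on x.
        float[][] joints = {
            {1,0,0,0,  0,1,0,0,  0,0,1,0},
            {1,0,0,2,  0,1,0,0,  0,0,1,0}
        };
        // A vertex influenced half by each joint moves +1 on x.
        float[] p = skinVertex(new float[]{1, 1, 1}, joints,
                               new int[]{0, 1}, new float[]{0.5f, 0.5f});
        System.out.printf("%.1f %.1f %.1f%n", p[0], p[1], p[2]); // prints 2.0 1.0 1.0
    }
}
```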

Hello, I know it is defeating the purpose of the engine. But, for example, I want to use only the Bones animation component, because I need to render the mesh in an augmented reality engine. The engine draws the live camera feed first and then draws the 3D object on top of it. Do you think renderScene() can handle that? BTW, is jPCT for Android based on OpenGL ES 1.0 now? The engine I work with uses OpenGL ES 2.0, so I am not sure if renderScene() will work in this case.

Thank you for your reply to my questions.

You can't. That's actually defeating the purpose of the engine, if you were able to do that... but why would you? What kind of program is it that prevents you from calling renderScene() and such?

Hello everyone,

currently I would like to use Bones' functions to load animated files. It works nicely, so thank you for providing this tool for Android.

However, after I animate the model based on time to a certain pose (AnimatedGroup.animate(index, animation)), I need to use OpenGL ES to draw the mesh myself in my own rendering context. I may not be able to call World.renderScene() and draw, since I have already set up the OpenGL ES rendering in my program.

So my question is: how can I access the animated vertices, texture coordinates, and normals (if there are any) so I can pass them to GLES20.glVertexAttribPointer and draw the mesh myself? Is there an example I can look into?
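For the upload side, here is a minimal sketch of the usual java.nio plumbing (plain Java, no jPCT; it assumes you have already obtained the deformed positions as a float[] somehow), since GLES20.glVertexAttribPointer's buffer overload wants a direct, native-order buffer:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class VertexUpload {

    // Packs a float[] of (possibly re-skinned) positions into a direct,
    // native-order FloatBuffer -- the form expected by e.g.
    // GLES20.glVertexAttribPointer(loc, 3, GL_FLOAT, false, 0, buffer).
    static FloatBuffer toDirectBuffer(float[] vertices) {
        FloatBuffer fb = ByteBuffer.allocateDirect(vertices.length * 4) // 4 bytes per float
                                   .order(ByteOrder.nativeOrder())
                                   .asFloatBuffer();
        fb.put(vertices);
        fb.position(0); // rewind so GL reads from the start
        return fb;
    }

    public static void main(String[] args) {
        float[] animated = {0f, 1f, 2f, 3f, 4f, 5f}; // two xyz vertices
        FloatBuffer fb = toDirectBuffer(animated);
        System.out.println(fb.capacity() + " " + fb.get(4)); // prints: 6 4.0
    }
}
```

Since the skinned positions change every frame, the buffer contents have to be refreshed (put + rewind) each time the pose changes; allocating the direct buffer once and reusing it avoids per-frame garbage.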

Thank you so much!
