Number of vertices between Blender and Bones not the same

Started by kkl, November 09, 2014, 06:42:01 AM


kkl

Hi raft,

I'm exporting bones from Blender using blender2ogre and I noticed the number of vertices is different. My object in Blender has 574 vertices, but in jPCT getUniqueVertexCount() reports 600. Do you have any clue what's happening? I need to keep track of a few selected vertices for manipulation (for hardware-skinning object instancing), but it seems they got messed up after export.

raft

no, not really. it seems to be caused by the exporter. maybe there is a setting or something?

kkl

I checked the printout from Blender during export, and it really does export 600 vertices! I tried different settings, like unchecking "optimize array" and "reorganize vertex buffer", but it still exports the additional vertices. That's a real problem, because we can't keep track of the vertices anymore, and some of the vertices seem to be shared as well. How would you recommend keeping track of vertices after exporting? Does the same thing happen with the 3DS MAX Ogre export plugin?

raft

i never needed such a thing. but even if the vertex count matched, how can you be sure of the ordering of the vertices? that may be totally different.

what is your final purpose? maybe there is another way

kkl

Yeah, you're right. We might have a vertex ordering issue too. Initially I tested by giving each vertex an integer ID according to its order in Blender and passing it to the shader. The ordering looks pretty close, but a few vertices (~2-5) are messed up. The rest look OK.

My goal is to instance each object in a mesh for hardware skinning. Say a 3D model has 2 meshes, each at a different location. By passing an ID to each vertex, I can move/separate them in the shader. The reason for this is to reduce the draw calls for many objects, and thereby reduce power consumption. Currently I'm drawing 20 objects with 350 vertices each and 3 texture layers, and it drains >15% of the battery in just 5 minutes. That's too much for a live wallpaper.
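The per-vertex ID scheme described above can be sketched in plain Java. This is an illustration only (the names and data layout are hypothetical, not Bones/jPCT API): each vertex of the combined buffer is tagged with the index of the sub-object it came from, so a vertex shader can apply a per-object transform.

```java
import java.util.Arrays;

public class VertexObjectIds {

    // vertexCounts[i] = number of vertices of sub-object i (hypothetical data).
    // Returns one float per vertex, holding the ID of its sub-object; this
    // array would be uploaded as an extra vertex attribute.
    static float[] buildObjectIds(int[] vertexCounts) {
        int total = 0;
        for (int c : vertexCounts) total += c;
        float[] ids = new float[total];
        int cursor = 0;
        for (int obj = 0; obj < vertexCounts.length; obj++) {
            // every vertex of sub-object `obj` gets the same ID
            Arrays.fill(ids, cursor, cursor + vertexCounts[obj], (float) obj);
            cursor += vertexCounts[obj];
        }
        return ids;
    }

    public static void main(String[] args) {
        // two sub-objects with 3 and 2 vertices
        System.out.println(Arrays.toString(buildObjectIds(new int[]{3, 2})));
        // prints [0.0, 0.0, 0.0, 1.0, 1.0]
    }
}
```

In the shader, the ID can then index into a per-object uniform (e.g. a translation array) to move each sub-object independently within a single draw call.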


raft

you first load the model into memory via Bones' loader, then upload it to the GPU, right? why not use the index information while it's in memory?

kkl

Hi raft,

Yes, I'm using the Bones loader. How do we know which vertex is which using the index information?

raft

following your scenario, you can separate the objects into different meshes and load them with Bones. you can then later upload them as a single mesh to the GPU. that way you know where each sub-object starts and ends.
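The bookkeeping raft describes can be sketched as a simple prefix sum (illustrative code, not Bones API): given the per-object vertex counts known from the loader, compute where each sub-object's vertices start in the combined buffer.

```java
import java.util.Arrays;

public class SubObjectOffsets {

    // vertexCounts[i] = vertex count of sub-object i (hypothetical data).
    // Returns the index of each sub-object's first vertex in the merged
    // buffer; sub-object i occupies [offsets[i], offsets[i] + vertexCounts[i]).
    static int[] startOffsets(int[] vertexCounts) {
        int[] offsets = new int[vertexCounts.length];
        int sum = 0;
        for (int i = 0; i < vertexCounts.length; i++) {
            offsets[i] = sum;        // first vertex of sub-object i
            sum += vertexCounts[i];  // running total of vertices so far
        }
        return offsets;
    }

    public static void main(String[] args) {
        int[] offs = startOffsets(new int[]{600, 600, 350});
        System.out.println(Arrays.toString(offs)); // prints [0, 600, 1200]
    }
}
```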

kkl

It's actually a good idea. I never thought of that. By merging multiple objects and manually pointing them to the SkinClipSequence, it's working like a charm now. But there's a world-space problem where the objects' positions are incorrect. I think it can be fixed and should be trivial. Thanks a lot for your help, raft ;)

raft

if you export the objects together, there would be no need to manually point to the SkinClipSequence. they will be imported as multiple objects in the same AnimatedGroup and share the same SkinClipSequence.

kkl

Exporting the objects all together? But to combine the meshes, the only way is mergeObject(), right? The output of mergeObject() is an Object3D, so we then create a new Animated3D pointing to the existing SkinClipSequence. CMIIW, I might have missed a better way to merge the meshes from all the objects.

raft

actually i wasn't thinking that you were merging objects. i thought you would upload the objects to the GPU by 'manually' appending their vertices.

so you merge the objects and then create a new 'merged' SkinData and create an Animated3D out of that?

kkl

Quote: "so you merge the objects and then create a new 'merged' SkinData and create an Animated3D out of that?"
Yes, except the SkinData is not merged; I just use one object's SkinData and pass it to Animated3D(Object3D object, SkinData skin, SkeletonPose currentPose), then point it to the SkinClipSequence. It's kinda tricky when animating the skin though, because I need to re-create the skin palette matrices for the 20 objects manually and tell the shader to use the correct palette for each sub-mesh. Or is there a better way to do it?

I couldn't upload the mesh to the GPU manually. It would take a lot of work and would be redundant with the jPCT framework. I'm just trying to reuse the framework as much as we can.

raft

if you don't merge SkinData, software animation will be broken, i.e. the 2nd, 3rd, etc. sub-objects won't be animated. but if you don't use software animation, it's not a problem.

the 20 palettes you are talking about are the equivalents of SkeletonPose.palette, right? why are you duplicating it 20 times? looking at the Bones code, it is shared among all objects in the same AnimatedGroup. by duplicating it, you are updating the palette 20 times instead of once.

kkl

Quote: "if you don't merge SkinData, software animation will be broken"
Yes, you're right. In this case I'm using hardware skinning, so I guess it should be okay.

Quote: "the 20 palettes you are talking about are the equivalents of SkeletonPose.palette, right? why are you duplicating it 20 times?"
Yes, I'm referring to SkeletonPose.palette. Is the palette really shared among all objects? I'm not sure I can use a shared palette, otherwise all sub-meshes would animate at the same frame. The 20 objects each have to be animated at their own frame. Then each palette is uploaded to the GPU to be applied to the vertices in the shader.
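The per-object palette upload described above can be sketched as follows (an illustrative layout, not Bones API; the uniform name and joint count are assumptions): since each object animates at its own frame, each has its own matrix palette, and all palettes are flattened into one array bound as a single mat4[] uniform. The shader then selects matrices with `palette[int(objectId) * JOINTS_PER_OBJECT + jointIndex]`, using the per-vertex object ID discussed earlier.

```java
public class PaletteUpload {

    static final int MAT4_FLOATS = 16; // a 4x4 matrix as 16 floats, column-major

    // palettes[obj][joint] is one joint matrix of object `obj` (hypothetical
    // data, e.g. copied out of each object's skeleton pose). Returns a single
    // flat array suitable for one glUniformMatrix4fv-style upload.
    static float[] flattenPalettes(float[][][] palettes) {
        int objects = palettes.length;
        int joints = palettes[0].length;
        float[] flat = new float[objects * joints * MAT4_FLOATS];
        int cursor = 0;
        for (float[][] palette : palettes)     // one palette per object
            for (float[] matrix : palette)     // one matrix per joint
                for (float f : matrix)
                    flat[cursor++] = f;
        return flat;
    }
}
```

This keeps the 20 objects in one draw call at the cost of a larger uniform array; the GPU's uniform limits cap how many objects times joints fit in a single upload.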