def getVertBoneMapping(skel, skeletonMesh):
    from animation import VertexBoneWeights

    vertBoneMapping = {}  # Format: { boneName: [(vertIdx, weight), ...], ... }

    if not hasattr(skeletonMesh, "boneShape"):
        raise RuntimeError(
            "Specified mesh object %s is not a skeleton mesh. Make sure it is created using meshFromSkeleton()" % skeletonMesh)

    shapeType = skeletonMesh.boneShape
    if shapeType == 'axis':
        global _axismesh_
        if _axismesh_ is None:
            import geometry3d
            _axismesh_ = geometry3d.AxisMesh(scale=0.5)
        nVertsPerBone = _axismesh_.getVertexCount()
    else:
        nVertsPerBone = len(SHAPE_VECTORS[shapeType])

    offset = 0
    # We assume that the skeleton mesh has its bones in breadth-first order
    for bone in skel.getBones():
        verts = range(offset, offset + nVertsPerBone)
        weights = np.repeat(1, nVertsPerBone)
        # Materialize the pairs: zip() returns a lazy iterator in Python 3
        vertBoneMapping[bone.name] = list(zip(verts, weights))
        offset += nVertsPerBone

    return VertexBoneWeights(vertBoneMapping)
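The mapping built above is uniform: each bone claims a contiguous block of `nVertsPerBone` vertex indices, all with weight 1. A minimal stand-alone sketch of that loop (the function name and the use of plain Python lists instead of `VertexBoneWeights` are illustrative assumptions):

```python
def uniform_vert_bone_mapping(bone_names, n_verts_per_bone):
    """Assign each bone a contiguous block of vertices, all fully weighted."""
    mapping = {}
    offset = 0
    for name in bone_names:
        verts = range(offset, offset + n_verts_per_bone)
        # Materialize the (vertIdx, weight) pairs; zip() would be lazy in Python 3.
        mapping[name] = [(v, 1.0) for v in verts]
        offset += n_verts_per_bone
    return mapping
```

For two bones with two vertices each, `"root"` maps to vertices 0-1 and `"spine"` to vertices 2-3, mirroring how the skeleton mesh lays out one small shape per bone.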
def fromFile(self, filepath, mesh=None):
    """Load skeleton from a json rig file."""
    import json
    from collections import OrderedDict
    import getpath
    import os

    self._clear()
    with open(filepath, 'r') as f:
        skelData = json.load(f, object_pairs_hook=OrderedDict)

    self.name = skelData.get("name", self.name)
    self.version = int(skelData.get("version", 1))
    self.copyright = skelData.get("copyright", "")
    self.description = skelData.get("description", "")
    self.plane_map_strategy = int(skelData.get("plane_map_strategy", 3))

    for joint_name, v_idxs in skelData.get("joints", dict()).items():
        if isinstance(v_idxs, list) and len(v_idxs) > 0:
            self.joint_pos_idxs[joint_name] = v_idxs

    self.planes = skelData.get("planes", dict())

    # Order bones breadth-first (parents before children)
    breadthfirst_bones = []
    prev_len = -1  # anti-deadlock: stop when a pass adds no bones
    while len(breadthfirst_bones) != len(skelData["bones"]) and prev_len != len(breadthfirst_bones):
        prev_len = len(breadthfirst_bones)
        for bone_name, bone_defs in skelData["bones"].items():
            if bone_name not in breadthfirst_bones:
                if not bone_defs.get("parent", None):
                    breadthfirst_bones.append(bone_name)
                elif bone_defs["parent"] in breadthfirst_bones:
                    breadthfirst_bones.append(bone_name)
    if len(breadthfirst_bones) != len(skelData["bones"]):
        missing = [bname for bname in skelData["bones"].keys() if bname not in breadthfirst_bones]
        log.warning("Some bones defined in file %s could not be added to skeleton %s, because they have an invalid parent bone (%s)",
                    filepath, self.name, ', '.join(missing))

    for bone_name in breadthfirst_bones:
        bone_defs = skelData["bones"][bone_name]
        self.addBone(bone_name, bone_defs.get("parent", None), bone_defs["head"], bone_defs["tail"],
                     bone_defs.get("rotation_plane", 0), bone_defs.get("reference", None),
                     bone_defs.get("weights_reference", None))

    self.build()

    if "weights_file" in skelData and skelData["weights_file"]:
        weights_file = skelData["weights_file"]
        weights_file = getpath.thoroughFindFile(weights_file, os.path.dirname(getpath.canonicalPath(filepath)), True)
        self.vertexWeights = VertexBoneWeights.fromFile(weights_file, mesh.getVertexCount() if mesh else None,
                                                        rootBone=self.roots[0].name)
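The parent-before-child ordering pass above can be reduced to a small stand-alone function. This is a hypothetical sketch (the function name is an assumption) that keeps the same anti-deadlock behavior: each pass appends bones whose parent is already placed, and the loop terminates once a pass adds nothing, leaving orphaned bones out:

```python
def order_breadth_first(bones):
    """Order bone names so every parent precedes its children.

    `bones` maps bone name -> definition dict with an optional "parent" key.
    Bones whose parent chain never reaches a root are silently left out.
    """
    ordered = []
    prev_len = -1  # anti-deadlock: stop when a full pass adds no bones
    while len(ordered) != len(bones) and prev_len != len(ordered):
        prev_len = len(ordered)
        for name, defs in bones.items():
            if name in ordered:
                continue
            parent = defs.get("parent")
            if not parent or parent in ordered:
                ordered.append(name)
    return ordered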
def fromFile(self, filepath, mesh=None):
    """Load skeleton from a json rig file."""
    import json
    from collections import OrderedDict
    import getpath
    import os

    self._clear()
    with open(filepath, 'r') as f:
        skelData = json.load(f, object_pairs_hook=OrderedDict)

    self.name = skelData.get("name", self.name)
    self.version = int(skelData.get("version", 1))
    self.copyright = skelData.get("copyright", "")
    self.description = skelData.get("description", "")

    for joint_name, v_idxs in skelData.get("joints", dict()).items():
        if isinstance(v_idxs, list) and len(v_idxs) > 0:
            self.joint_pos_idxs[joint_name] = v_idxs

    # Order bones breadth-first (parents before children)
    breadthfirst_bones = []
    prev_len = -1  # anti-deadlock: stop when a pass adds no bones
    while len(breadthfirst_bones) != len(skelData["bones"]) and prev_len != len(breadthfirst_bones):
        prev_len = len(breadthfirst_bones)
        for bone_name, bone_defs in skelData["bones"].items():
            if bone_name not in breadthfirst_bones:
                if not bone_defs.get("parent", None):
                    breadthfirst_bones.append(bone_name)
                elif bone_defs["parent"] in breadthfirst_bones:
                    breadthfirst_bones.append(bone_name)
    if len(breadthfirst_bones) != len(skelData["bones"]):
        missing = [bname for bname in skelData["bones"].keys() if bname not in breadthfirst_bones]
        log.warning("Some bones defined in file %s could not be added to skeleton %s, because they have an invalid parent bone (%s)",
                    filepath, self.name, ', '.join(missing))

    for bone_name in breadthfirst_bones:
        bone_defs = skelData["bones"][bone_name]
        self.addBone(bone_name, bone_defs.get("parent", None), bone_defs["head"], bone_defs["tail"],
                     bone_defs["roll"], bone_defs.get("reference", None),
                     bone_defs.get("weights_reference", None))

    self.build()

    if "weights_file" in skelData and skelData["weights_file"]:
        weights_file = skelData["weights_file"]
        weights_file = getpath.thoroughFindFile(weights_file, os.path.dirname(getpath.canonicalPath(filepath)), True)
        self.vertexWeights = VertexBoneWeights.fromFile(weights_file, mesh.getVertexCount() if mesh else None,
                                                        rootBone=self.roots[0].name)
def loadBinaryProxy(path, human, type):
    log.debug("Loading binary proxy %s.", path)

    npzfile = np.load(path)
    proxyType = type

    proxy = Proxy(path, proxyType, human)

    proxy.name = npzfile['name'].tostring()
    proxy.uuid = npzfile['uuid'].tostring()
    proxy.basemesh = npzfile['basemesh'].tostring()

    if 'description' in npzfile:
        proxy.description = npzfile['description'].tostring()
    if 'version' in npzfile:
        proxy.version = int(npzfile['version'])
    if 'lic_str' in npzfile and 'lic_idx' in npzfile:
        proxy.license.fromNumpyString(npzfile['lic_str'], npzfile['lic_idx'])

    proxy.tags = set(_unpackStringList(npzfile['tags_str'], npzfile['tags_idx']))

    if 'z_depth' in npzfile:
        proxy.z_depth = int(npzfile['z_depth'])
    if 'max_pole' in npzfile:
        proxy.max_pole = int(npzfile['max_pole'])

    if 'special_pose_str' in npzfile:
        special_poses = _unpackStringList(npzfile['special_pose_str'], npzfile['special_pose_idx'])
        for idx in range(0, len(special_poses), 2):
            proxy.special_pose[special_poses[idx]] = special_poses[idx+1]

    num_refverts = int(npzfile['num_refverts'])

    if num_refverts > 1:  # 3 or 4 refverts per proxy vertex
        proxy.ref_vIdxs = npzfile['ref_vIdxs']
        proxy.weights = npzfile['weights']
        if 'offsets' in npzfile:
            proxy.offsets = npzfile['offsets']
        elif proxy.new_fitting:
            proxy.offsets = None
        else:
            proxy.offsets = np.zeros((proxy.ref_vIdxs.shape[0], 3), dtype=np.float32)
    else:  # 1 refvert per proxy vertex
        num_refs = npzfile['ref_vIdxs'].shape[0]
        proxy.ref_vIdxs = np.zeros((num_refs, 3), dtype=np.uint32)
        proxy.ref_vIdxs[:, 0] = npzfile['ref_vIdxs']
        proxy.offsets = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights[:, 0] = npzfile['weights']

    if "deleteVerts" in npzfile:
        proxy.deleteVerts = npzfile['deleteVerts']

    # Reconstruct reverse vertex (and weights) mapping
    proxy._reloadReverseMapping()

    if proxy.new_fitting:
        # Create alias
        proxy.deltas = proxy.weights

    # TODO we could skip this for new-style proxies
    proxy.tmatrix.fromNumpyStruct(npzfile)

    proxy.uvLayers = {}
    for uvIdx, uvName in enumerate(_unpackStringList(npzfile['uvLayers_str'], npzfile['uvLayers_idx'])):
        proxy.uvLayers[uvIdx] = uvName

    proxy.material = material.Material(proxy.name)
    if 'material_file' in npzfile:
        proxy._material_file = npzfile['material_file'].tostring()
    if proxy.material_file:
        proxy.material.fromFile(proxy.material_file)

    proxy._obj_file = npzfile['obj_file'].tostring()

    if 'vertexBoneWeights_file' in npzfile:
        proxy._vertexBoneWeights_file = npzfile['vertexBoneWeights_file'].tostring()
        if proxy.vertexBoneWeights_file:
            from animation import VertexBoneWeights
            proxy.vertexBoneWeights = VertexBoneWeights.fromFile(proxy.vertexBoneWeights_file)

    if proxy.z_depth == -1:
        log.warning('Proxy file %s does not specify a Z depth. Using 50.', path)
        proxy.z_depth = 50

    return proxy
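The loader reads string lists stored as paired arrays (`tags_str`/`tags_idx`, `uvLayers_str`/`uvLayers_idx`): one packed character buffer plus an index array. The exact layout used by `_unpackStringList` is not shown here; the sketch below assumes the index array holds the start offset of each string, which is one common way to pack variable-length strings into fixed NumPy arrays:

```python
def unpack_string_list(packed, offsets):
    """Split a concatenated string back into pieces.

    Assumption (hypothetical layout): `offsets[i]` is the start index of
    string i in `packed`; each string runs to the next offset (or the end).
    """
    if len(offsets) == 0:
        return []
    strings = []
    last = offsets[0]
    for start in offsets[1:]:
        strings.append(packed[last:start])
        last = start
    strings.append(packed[last:])  # final string runs to the end of the buffer
    return strings
```

Packing this way avoids storing one NumPy object per string: an `.npz` archive only holds arrays, so variable-length text must be flattened into a buffer plus indices.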
def loadTextProxy(human, filepath, type="Clothes"):
    from codecs import open
    try:
        fp = open(filepath, "rU", encoding="utf-8")
    except IOError:
        log.error("*** Cannot open %s", filepath)
        return None

    folder = os.path.realpath(os.path.expanduser(os.path.dirname(filepath)))
    proxy = Proxy(filepath, type, human)
    refVerts = []

    status = 0
    vnum = 0
    for line in fp:
        words = line.split()

        if len(words) == 0:
            # Empty line; keep the current parsing status
            continue

        if words[0].startswith('#'):
            # Comment; try interpreting its attributes as license info
            proxy.license.updateFromComment(line)
            continue

        key = words[0]

        if key == 'name':
            proxy.name = " ".join(words[1:])
        elif key == 'uuid':
            proxy.uuid = " ".join(words[1:])
        elif key == 'description':
            proxy.description = " ".join(words[1:])
        elif key in ['author', 'license', 'homepage']:
            proxy.license.updateFromComment(words)
        elif key == 'tag':
            proxy.tags.append(" ".join(words[1:]).lower())
        elif key == 'version':
            proxy.version = int(words[1])
        elif key == 'z_depth':
            proxy.z_depth = int(words[1])
        elif key == 'max_pole':
            proxy.max_pole = int(words[1])
        elif key == 'special_pose':
            proxy.special_pose[words[1]] = words[2]
        elif key == 'verts':
            status = doRefVerts
        elif key == 'weights':
            status = doWeights
            if proxy.weights is None:
                proxy.weights = {}
            weights = []
            proxy.weights[words[1]] = weights
        elif key == "delete_verts":
            status = doDeleteVerts
        elif key == 'obj_file':
            proxy._obj_file = _getFileName(folder, words[1], ".obj")
        elif key == 'material':
            matFile = _getFileName(folder, words[1], ".mhmat")
            proxy._material_file = matFile
            proxy.material.fromFile(proxy.material_file)
        elif key == 'vertexboneweights_file':
            from animation import VertexBoneWeights
            proxy._vertexBoneWeights_file = _getFileName(folder, words[1], ".jsonw")
            proxy.vertexBoneWeights = VertexBoneWeights.fromFile(proxy.vertexBoneWeights_file)
        elif key == 'backface_culling':
            # TODO remove in future
            log.warning('Deprecated parameter "backface_culling" used in proxy file. Set property backfaceCull in material instead.')
        elif key == 'transparent':
            # TODO remove in future
            log.warning('Deprecated parameter "transparent" used in proxy file. Set property in material file instead.')
        elif key == 'uvLayer':
            # TODO is this still used?
            if len(words) > 2:
                layer = int(words[1])
                uvFile = words[2]
            else:
                layer = 0
                uvFile = words[1]
            # Delayed load, only store the path here
            proxy.uvLayers[layer] = _getFileName(folder, uvFile, ".mhuv")
        elif key == 'x_scale':
            proxy.tmatrix.getScaleData(words, 0)
        elif key == 'y_scale':
            proxy.tmatrix.getScaleData(words, 1)
        elif key == 'z_scale':
            proxy.tmatrix.getScaleData(words, 2)
        elif key == 'shear_x':
            proxy.tmatrix.getShearData(words, 0, None)
        elif key == 'shear_y':
            proxy.tmatrix.getShearData(words, 1, None)
        elif key == 'shear_z':
            proxy.tmatrix.getShearData(words, 2, None)
        elif key == 'l_shear_x':
            proxy.tmatrix.getShearData(words, 0, 'Left')
        elif key == 'l_shear_y':
            proxy.tmatrix.getShearData(words, 1, 'Left')
        elif key == 'l_shear_z':
            proxy.tmatrix.getShearData(words, 2, 'Left')
        elif key == 'r_shear_x':
            proxy.tmatrix.getShearData(words, 0, 'Right')
        elif key == 'r_shear_y':
            proxy.tmatrix.getShearData(words, 1, 'Right')
        elif key == 'r_shear_z':
            proxy.tmatrix.getShearData(words, 2, 'Right')
        elif key == 'basemesh':
            proxy.basemesh = words[1]
        elif key in ['shapekey', 'subsurf', 'shrinkwrap', 'solidify', 'objfile_layer',
                     'uvtex_layer', 'use_projection', 'mask_uv_layer',
                     'texture_uv_layer', 'delete', 'vertexgroup_file']:
            log.warning('Deprecated parameter "%s" used in proxy file. Please remove.', key)
        elif status == doRefVerts:
            refVert = ProxyRefVert(human)
            refVerts.append(refVert)
            if proxy.new_fitting:
                refVert.fromQuad(words, vnum, proxy.vertWeights)
            elif len(words) == 1:
                refVert.fromSingle(words, vnum, proxy.vertWeights)
            else:
                refVert.fromTriple(words, vnum, proxy.vertWeights)
            vnum += 1
        elif status == doWeights:
            v = int(words[0])
            w = float(words[1])
            weights.append((v, w))
        elif status == doDeleteVerts:
            sequence = False
            for v in words:
                if v == "-":
                    sequence = True
                else:
                    v1 = int(v)
                    if sequence:
                        for vn in range(v0, v1 + 1):
                            proxy.deleteVerts[vn] = True
                        sequence = False
                    else:
                        proxy.deleteVerts[v1] = True
                    v0 = v1
        else:
            log.warning('Unknown keyword %s found in proxy file %s', key, filepath)

    if proxy.z_depth == -1:
        log.warning('Proxy file %s does not specify a Z depth. Using 50.', filepath)
        proxy.z_depth = 50

    proxy._finalize(refVerts)

    return proxy
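The `delete_verts` branch above parses a compact token grammar: bare integers mark single vertices, and a `-` between two integers marks an inclusive range (e.g. `0 2 - 5 7` deletes vertices 0, 2 through 5, and 7). A stand-alone sketch of that parser (function name assumed; it also guards against a leading `-`, which the original leaves undefined):

```python
def parse_delete_verts(tokens):
    """Parse delete_verts tokens: ints, with "-" joining an inclusive range."""
    deleted = set()
    v0 = None        # last integer seen; start of a pending range
    sequence = False  # True after a "-" token
    for tok in tokens:
        if tok == "-":
            sequence = True
        else:
            v1 = int(tok)
            if sequence and v0 is not None:
                # "-" between v0 and v1: delete the inclusive range
                deleted.update(range(v0, v1 + 1))
                sequence = False
            else:
                deleted.add(v1)
            v0 = v1
    return deleted
```

Returning a set of indices (rather than flagging into a preallocated boolean array, as the loader does) keeps the sketch self-contained; converting back is a one-liner with `numpy.zeros` and fancy indexing.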
def loadBinaryProxy(path, human, type):
    log.debug("Loading binary proxy %s.", path)

    npzfile = np.load(path)
    proxyType = type

    proxy = Proxy(path, proxyType, human)

    proxy.name = npzfile['name'].tostring()
    proxy.uuid = npzfile['uuid'].tostring()
    proxy.basemesh = npzfile['basemesh'].tostring()

    if 'description' in npzfile:
        proxy.description = npzfile['description'].tostring()
    if 'version' in npzfile:
        proxy.version = int(npzfile['version'])
    if 'lic_str' in npzfile and 'lic_idx' in npzfile:
        proxy.license.fromNumpyString(npzfile['lic_str'], npzfile['lic_idx'])

    proxy.tags = set(_unpackStringList(npzfile['tags_str'], npzfile['tags_idx']))

    if 'z_depth' in npzfile:
        proxy.z_depth = int(npzfile['z_depth'])
    if 'max_pole' in npzfile:
        proxy.max_pole = int(npzfile['max_pole'])

    num_refverts = int(npzfile['num_refverts'])

    if num_refverts == 3:
        proxy.ref_vIdxs = npzfile['ref_vIdxs']
        proxy.offsets = npzfile['offsets']
        proxy.weights = npzfile['weights']
    else:
        num_refs = npzfile['ref_vIdxs'].shape[0]
        proxy.ref_vIdxs = np.zeros((num_refs, 3), dtype=np.uint32)
        proxy.ref_vIdxs[:, 0] = npzfile['ref_vIdxs']
        proxy.offsets = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights[:, 0] = npzfile['weights']

    if "deleteVerts" in npzfile:
        proxy.deleteVerts = npzfile['deleteVerts']

    # Reconstruct reverse vertex (and weights) mapping
    proxy._reloadReverseMapping()

    proxy.tmatrix = TMatrix()

    proxy.uvLayers = {}
    for uvIdx, uvName in enumerate(_unpackStringList(npzfile['uvLayers_str'], npzfile['uvLayers_idx'])):
        proxy.uvLayers[uvIdx] = uvName

    proxy.material = material.Material(proxy.name)
    if 'material_file' in npzfile:
        proxy._material_file = npzfile['material_file'].tostring()
    if proxy.material_file:
        proxy.material.fromFile(proxy.material_file)

    proxy._obj_file = npzfile['obj_file'].tostring()

    if 'vertexBoneWeights_file' in npzfile:
        proxy._vertexBoneWeights_file = npzfile['vertexBoneWeights_file'].tostring()
        if proxy.vertexBoneWeights_file:
            from animation import VertexBoneWeights
            proxy.vertexBoneWeights = VertexBoneWeights.fromFile(proxy.vertexBoneWeights_file)

    if proxy.z_depth == -1:
        log.warning('Proxy file %s does not specify a Z depth. Using 50.', path)
        proxy.z_depth = 50

    return proxy
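The single-refvert fallback above widens a flat index array and a flat weight array into `(n, 3)` matrices, so downstream code can treat every proxy vertex uniformly as referencing three base-mesh vertices (the extra two columns carry index 0 with weight 0). A stand-alone sketch of that widening (function name assumed):

```python
import numpy as np

def widen_single_refverts(ref_idxs, ref_weights):
    """Expand flat (n,) refvert index/weight arrays to (n, 3) columns.

    Column 0 holds the real reference; columns 1-2 stay zero so the
    three-refvert code path produces identical results.
    """
    n = ref_idxs.shape[0]
    vIdxs = np.zeros((n, 3), dtype=np.uint32)
    vIdxs[:, 0] = ref_idxs
    weights = np.zeros((n, 3), dtype=np.float32)
    weights[:, 0] = ref_weights
    return vIdxs, weights
```

Because the padding weights are zero, a weighted sum over the three columns degenerates to the single-vertex lookup, which is why no branching is needed later.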
def loadTextProxy(human, filepath, type="Clothes"):
    from codecs import open
    try:
        fp = open(filepath, "rU", encoding="utf-8")
    except IOError:
        log.error("*** Cannot open %s", filepath)
        return None

    folder = os.path.realpath(os.path.expanduser(os.path.dirname(filepath)))
    proxy = Proxy(filepath, type, human)
    refVerts = []

    status = 0
    vnum = 0
    for line in fp:
        words = line.split()

        if len(words) == 0:
            # Empty line; keep the current parsing status
            continue

        if words[0].startswith('#'):
            # Comment; try interpreting its attributes as license info
            proxy.license.updateFromComment(line)
            continue

        key = words[0]

        if key == 'name':
            proxy.name = " ".join(words[1:])
        elif key == 'uuid':
            proxy.uuid = " ".join(words[1:])
        elif key == 'description':
            proxy.description = " ".join(words[1:])
        elif key in ['author', 'license', 'homepage']:
            proxy.license.addStatement(words)
        elif key == 'tag':
            proxy.tags.append(" ".join(words[1:]).lower())
        elif key == 'version':
            proxy.version = int(words[1])
        elif key == 'z_depth':
            proxy.z_depth = int(words[1])
        elif key == 'max_pole':
            proxy.max_pole = int(words[1])
        elif key == 'verts':
            status = doRefVerts
        elif key == 'weights':
            status = doWeights
            if proxy.weights is None:
                proxy.weights = {}
            weights = []
            proxy.weights[words[1]] = weights
        elif key == "delete_verts":
            status = doDeleteVerts
        elif key == 'obj_file':
            proxy._obj_file = _getFileName(folder, words[1], ".obj")
        elif key == 'material':
            matFile = _getFileName(folder, words[1], ".mhmat")
            proxy._material_file = matFile
            proxy.material.fromFile(proxy.material_file)
        elif key == 'vertexboneweights_file':
            from animation import VertexBoneWeights
            proxy._vertexBoneWeights_file = _getFileName(folder, words[1], ".jsonw")
            proxy.vertexBoneWeights = VertexBoneWeights.fromFile(proxy.vertexBoneWeights_file)
        elif key == 'backface_culling':
            # TODO remove in future
            log.warning('Deprecated parameter "backface_culling" used in proxy file. Set property backfaceCull in material instead.')
        elif key == 'transparent':
            # TODO remove in future
            log.warning('Deprecated parameter "transparent" used in proxy file. Set property in material file instead.')
        elif key == 'uvLayer':
            # TODO is this still used?
            if len(words) > 2:
                layer = int(words[1])
                uvFile = words[2]
            else:
                layer = 0
                uvFile = words[1]
            # Delayed load, only store the path here
            proxy.uvLayers[layer] = _getFileName(folder, uvFile, ".mhuv")
        elif key == 'x_scale':
            proxy.tmatrix.getScaleData(words, 0)
        elif key == 'y_scale':
            proxy.tmatrix.getScaleData(words, 1)
        elif key == 'z_scale':
            proxy.tmatrix.getScaleData(words, 2)
        elif key == 'shear_x':
            proxy.tmatrix.getShearData(words, 0, None)
        elif key == 'shear_y':
            proxy.tmatrix.getShearData(words, 1, None)
        elif key == 'shear_z':
            proxy.tmatrix.getShearData(words, 2, None)
        elif key == 'l_shear_x':
            proxy.tmatrix.getShearData(words, 0, 'Left')
        elif key == 'l_shear_y':
            proxy.tmatrix.getShearData(words, 1, 'Left')
        elif key == 'l_shear_z':
            proxy.tmatrix.getShearData(words, 2, 'Left')
        elif key == 'r_shear_x':
            proxy.tmatrix.getShearData(words, 0, 'Right')
        elif key == 'r_shear_y':
            proxy.tmatrix.getShearData(words, 1, 'Right')
        elif key == 'r_shear_z':
            proxy.tmatrix.getShearData(words, 2, 'Right')
        elif key == 'uniform_scale':
            proxy.uniformScale = True
            if len(words) > 1:
                proxy.scaleCorrect = float(words[1])
            proxy.uniformizeScale()
        elif key == 'basemesh':
            proxy.basemesh = words[1]
        elif key in ['shapekey', 'subsurf', 'shrinkwrap', 'solidify', 'objfile_layer',
                     'uvtex_layer', 'use_projection', 'mask_uv_layer',
                     'texture_uv_layer', 'delete', 'vertexgroup_file']:
            log.warning('Deprecated parameter "%s" used in proxy file. Please remove.', key)
        elif status == doRefVerts:
            refVert = ProxyRefVert(human)
            refVerts.append(refVert)
            if len(words) == 1:
                refVert.fromSingle(words, vnum, proxy.vertWeights)
            else:
                refVert.fromTriple(words, vnum, proxy.vertWeights)
            vnum += 1
        elif status == doWeights:
            v = int(words[0])
            w = float(words[1])
            weights.append((v, w))
        elif status == doDeleteVerts:
            sequence = False
            for v in words:
                if v == "-":
                    sequence = True
                else:
                    v1 = int(v)
                    if sequence:
                        for vn in range(v0, v1 + 1):
                            proxy.deleteVerts[vn] = True
                        sequence = False
                    else:
                        proxy.deleteVerts[v1] = True
                    v0 = v1
        else:
            log.warning('Unknown keyword %s found in proxy file %s', key, filepath)

    if proxy.z_depth == -1:
        log.warning('Proxy file %s does not specify a Z depth. Using 50.', filepath)
        proxy.z_depth = 50

    proxy._finalize(refVerts)

    return proxy
def loadBinaryProxy(path, human, type):
    log.debug("Loading binary proxy %s.", path)

    npzfile = np.load(path)
    proxyType = type

    proxy = Proxy(path, proxyType, human)

    proxy.name = npzfile['name'].tostring()
    proxy.uuid = npzfile['uuid'].tostring()
    proxy.basemesh = npzfile['basemesh'].tostring()

    if 'description' in npzfile:
        proxy.description = npzfile['description'].tostring()
    if 'version' in npzfile:
        proxy.version = int(npzfile['version'])
    if 'lic_str' in npzfile and 'lic_idx' in npzfile:
        proxy.license.fromNumpyString(npzfile['lic_str'], npzfile['lic_idx'])

    proxy.tags = set(_unpackStringList(npzfile['tags_str'], npzfile['tags_idx']))

    if 'z_depth' in npzfile:
        proxy.z_depth = int(npzfile['z_depth'])
    if 'max_pole' in npzfile:
        proxy.max_pole = int(npzfile['max_pole'])

    num_refverts = int(npzfile['num_refverts'])

    if num_refverts == 3:
        proxy.ref_vIdxs = npzfile['ref_vIdxs']
        proxy.offsets = npzfile['offsets']
        proxy.weights = npzfile['weights']
    else:
        num_refs = npzfile['ref_vIdxs'].shape[0]
        proxy.ref_vIdxs = np.zeros((num_refs, 3), dtype=np.uint32)
        proxy.ref_vIdxs[:, 0] = npzfile['ref_vIdxs']
        proxy.offsets = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights = np.zeros((num_refs, 3), dtype=np.float32)
        proxy.weights[:, 0] = npzfile['weights']

    if "deleteVerts" in npzfile:
        proxy.deleteVerts = npzfile['deleteVerts']

    # Reconstruct reverse vertex (and weights) mapping
    proxy._reloadReverseMapping()

    proxy.tmatrix = TMatrix()

    proxy.uvLayers = {}
    for uvIdx, uvName in enumerate(_unpackStringList(npzfile['uvLayers_str'], npzfile['uvLayers_idx'])):
        proxy.uvLayers[uvIdx] = uvName

    proxy.material = material.Material(proxy.name)
    if 'material_file' in npzfile:
        proxy._material_file = npzfile['material_file'].tostring()
    if proxy.material_file:
        proxy.material.fromFile(proxy.material_file)

    proxy._obj_file = npzfile['obj_file'].tostring()

    if 'vertexBoneWeights_file' in npzfile:
        proxy._vertexBoneWeights_file = npzfile['vertexBoneWeights_file'].tostring()
        if proxy.vertexBoneWeights_file:
            from animation import VertexBoneWeights
            proxy.vertexBoneWeights = VertexBoneWeights.fromFile(proxy.vertexBoneWeights_file)

    if proxy.z_depth == -1:
        log.warning('Proxy file %s does not specify a Z depth. Using 50.', path)
        proxy.z_depth = 50

    return proxy
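The binary loaders above call `.tostring()` on character arrays pulled from the `.npz` archive. In Python 3 that returns `bytes`, and NumPy has deprecated `tostring()` in favour of `tobytes()`, so callers that expect `str` need an explicit decode. A hypothetical helper sketching that conversion (the name and the trailing-NUL stripping are assumptions, not part of the original code):

```python
import numpy as np

def npz_str(arr):
    """Decode a NumPy character/byte array from an .npz field into str.

    Uses tobytes() (the non-deprecated spelling of tostring()) and strips
    any NUL padding that fixed-width storage may have added.
    """
    data = arr.tobytes()
    return data.decode("utf-8").rstrip("\x00")
```

With such a helper, lines like `proxy.name = npzfile['name'].tostring()` would become `proxy.name = npz_str(npzfile['name'])` and yield `str` on both Python 2 and 3.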