diff --git a/docs/list.js b/docs/list.js index fca71268ae9e4eba16a6fe107240dfcf6c08a285..7be9db7e388088835c0f58871582cc9a76887ea8 100644 --- a/docs/list.js +++ b/docs/list.js @@ -10,6 +10,7 @@ var list = { "How to run things locally": "manual/introduction/How-to-run-things-locally", "Drawing Lines": "manual/introduction/Drawing-lines", "Creating Text": "manual/introduction/Creating-text", + "Loading 3D Models": "manual/introduction/Loading-3D-models", "Migration Guide": "manual/introduction/Migration-guide", "Code Style Guide": "manual/introduction/Code-style-guide", "FAQ": "manual/introduction/FAQ", diff --git a/docs/manual/introduction/Animation-system.html b/docs/manual/introduction/Animation-system.html index ac70163075ffabdb19194ec0bcc57be091f0fcbb..19a665df5dce62dad3681accf8493a9e03079739 100644 --- a/docs/manual/introduction/Animation-system.html +++ b/docs/manual/introduction/Animation-system.html @@ -35,11 +35,11 @@

If you have successfully imported an animated 3D object (it doesn't matter if it has - bones or morph targets or both) - for example exporting it from Blender with the - [link:https://github.com/mrdoob/three.js/tree/master/utils/exporters/blender/addons/io_three Blender exporter] and - loading it into a three.js scene using [page:JSONLoader] -, one of the geometry's - properties of the loaded mesh should be an array named "animations", containing the - [page:AnimationClip AnimationClips] for this model (see a list of possible loaders below).

+ bones or morph targets or both) — for example, exporting it from Blender with the + [link:https://github.com/KhronosGroup/glTF-Blender-Exporter glTF Blender exporter] and + loading it into a three.js scene using [page:GLTFLoader] — one of the response fields + should be an array named "animations", containing the [page:AnimationClip AnimationClips] + for this model (see a list of possible loaders below).
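Once loaded, the clips in that "animations" array are played through an [page:AnimationMixer]. A minimal sketch, assuming a GLTFLoader response shaped like { scene, animations }; the mixer is passed in as a parameter so the snippet stays self-contained (in a real page you would create it with new THREE.AnimationMixer( gltf.scene )), and the helper name is our own:

```javascript
// Hypothetical helper: play the first AnimationClip from a loader response.
// 'gltf.animations' matches the GLTFLoader response; 'mixer' is expected to
// behave like THREE.AnimationMixer (clipAction returns a playable action).
function playFirstClip( gltf, mixer ) {
	var clips = gltf.animations || [];
	if ( clips.length === 0 ) return null; // nothing to play
	var action = mixer.clipAction( clips[ 0 ] ); // AnimationAction for the first clip
	action.play();
	return action;
}
```

In a render loop you would then advance the mixer each frame with mixer.update( deltaSeconds ).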

Each *AnimationClip* usually holds the data for a certain activity of the object. If the mesh is a character, for example, there may be one AnimationClip for a walkcycle, a second diff --git a/docs/manual/introduction/FAQ.html b/docs/manual/introduction/FAQ.html index 8753f54d136b2881b42e7c88d58c29e43f64ae54..944e8086054fa4a5979b49e3a6ad79b4874f59a2 100644 --- a/docs/manual/introduction/FAQ.html +++ b/docs/manual/introduction/FAQ.html @@ -16,7 +16,7 @@ The recommended format for importing and exporting assets is glTF (GL Transmission Format). Because glTF is focused on runtime asset delivery, it is compact to transmit and fast to load.

- three.js provides loaders for many other popular formats like FBX, Collada or OBJ as well. Nevertheless, you should always try to establish a glTF based workflow in your projects first. + three.js provides loaders for many other popular formats like FBX, Collada or OBJ as well. Nevertheless, you should always try to establish a glTF based workflow in your projects first. For more information, see [link:#manual/introduction/Loading-3D-models loading 3D models].

diff --git a/docs/manual/introduction/Loading-3D-models.html b/docs/manual/introduction/Loading-3D-models.html new file mode 100644 index 0000000000000000000000000000000000000000..4cf4d39afdfb3aa8c4efda3f590824d0614883ac --- /dev/null +++ b/docs/manual/introduction/Loading-3D-models.html @@ -0,0 +1,132 @@ + + + + + + + + + + + + +

[name]

+
+ +

+ 3D models are available in hundreds of file formats, each with different + purposes, assorted features, and varying complexity. Although + + three.js provides many loaders, choosing the right format and + workflow will save time and frustration later on. Some formats are + difficult to work with, inefficient for realtime experiences, or simply not + fully supported at this time. +

+ +

+ This guide provides a workflow recommended for most users, and suggestions + for what to try if things don't go as expected. +

+ +

Before we start

+ +

+ If you're new to running a local server, begin with + [link:#manual/introduction/How-to-run-things-locally how to run things locally] + first. Many common errors viewing 3D models can be avoided by hosting files + correctly. +

+ +

Recommended workflow

+ +

+ Where possible, we recommend using glTF (GL Transmission Format). Both + .GLB and .GLTF versions of the format are + well supported. Because glTF is focused on runtime asset delivery, it is + compact to transmit and fast to load. Features include meshes, materials, + textures, skins, skeletons, morph targets, animations, lights, and + cameras. +
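As a concrete sketch of that workflow, the snippet below loads a .glb with [page:GLTFLoader] and adds it to a scene. It assumes three.js and the GLTFLoader script are included on the page; 'models/example.glb' is a hypothetical path, and the loader is taken as a parameter here so the wiring stays self-contained:

```javascript
// Minimal glTF loading sketch. GLTFLoader.load takes the URL plus
// onLoad, onProgress, and onError callbacks, in that order.
function loadModel( loader, scene, url ) {
	loader.load(
		url,
		function ( gltf ) {
			scene.add( gltf.scene ); // add the loaded scene graph
		},
		undefined, // optional onProgress callback
		function ( error ) {
			console.error( error ); // always log failures (see Troubleshooting)
		}
	);
}

// In a page: loadModel( new THREE.GLTFLoader(), scene, 'models/example.glb' );
```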

+ +

+ Public-domain glTF files are available on sites like + + Sketchfab, and many tools include glTF export: +

+ + + +

+ If your preferred tools do not support glTF, consider requesting glTF + export from the authors, or posting on + the glTF roadmap thread. +

+ +

+ When glTF is not an option, popular formats such as FBX, OBJ, or COLLADA + are also available and regularly maintained. +
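For illustration only, the mapping below names the real three.js loader classes for those formats; the helper function itself is hypothetical, and each loader script must still be included on the page separately:

```javascript
// Hypothetical helper: name the three.js loader class for a model URL.
// GLTFLoader, FBXLoader, OBJLoader, and ColladaLoader all ship in the
// three.js examples folder as separate scripts.
function loaderNameFor( url ) {
	var byExtension = {
		glb: 'GLTFLoader', gltf: 'GLTFLoader',
		fbx: 'FBXLoader',
		obj: 'OBJLoader',
		dae: 'ColladaLoader' // COLLADA
	};
	return byExtension[ url.split( '.' ).pop().toLowerCase() ] || null;
}
```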

+ +

Troubleshooting

+ +

+ You've spent hours modeling an artisanal masterpiece, you load it into + the webpage, and — oh no! 😭 It's distorted, miscolored, or missing entirely. + Start with these troubleshooting steps: +

+ +
    +
+ 1. Check the JavaScript console for errors, and make sure you've used an
+ onError callback when calling .load() to log the result.
+ 2. View the model in another application. For glTF, drag-and-drop viewers
+ are available for three.js and babylon.js. If the model appears correctly
+ in one or more applications, file a bug against three.js. If the model
+ cannot be shown in any application, we strongly encourage filing a bug
+ with the application used to create the model.
+ 3. Try scaling the model up or down by a factor of 1000. Many models are
+ scaled differently, and large models may not appear if the camera is
+ inside the model.
+ 4. Look for failed texture requests in the network tab, like
+ C:\\Path\To\Model\texture.jpg. Use paths relative to your model instead,
+ such as images/texture.jpg — this may require editing the model file in a
+ text editor.
+ +
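The texture-path step above can be sketched in code: when the model is a JSON-based .gltf, absolute texture URIs can be rewritten to bare filenames before hosting. The images[].uri field is part of the glTF 2.0 format, but the helper name is hypothetical; note it reduces every URI to a filename, so adjust it if your textures live in a subfolder:

```javascript
// Hypothetical helper: strip absolute directories from texture URIs in a
// parsed .gltf (JSON) document, in place.
function makeTexturePathsRelative( gltf ) {
	( gltf.images || [] ).forEach( function ( image ) {
		if ( image.uri ) {
			// e.g. 'C:\Path\To\Model\texture.jpg' becomes 'texture.jpg'
			image.uri = image.uri.split( /[\\/]/ ).pop();
		}
	} );
	return gltf;
}
```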

Asking for help

+ +

+ If you've gone through the troubleshooting process above and your model + still isn't working, the right approach to asking for help will get you to + a solution faster. Whenever possible, include your model (or a simpler + model with the same problem) in any formats you have available. Include + enough information for someone else to reproduce the issue quickly — + ideally, a live demo. +

+ +

+ TODO: Do we recommend model-related questions go to GitHub, Stack Overflow, + or the Discourse forum? +

+ + + \ No newline at end of file diff --git a/utils/exporters/blender/.gitignore b/utils/exporters/blender/.gitignore deleted file mode 100644 index c1c6da1922d327bdeb30d154a06ee998cfccc0db..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/.gitignore +++ /dev/null @@ -1,3 +0,0 @@ -tests/review -__pycache__/ -tmp/ diff --git a/utils/exporters/blender/README.md b/utils/exporters/blender/README.md index 5860d90c9432579655675cb564981eca229bb157..7eab0b90eb6ba90ba82c08e0d3477807f4e77fd8 100644 --- a/utils/exporters/blender/README.md +++ b/utils/exporters/blender/README.md @@ -1,68 +1,3 @@ # Three.js Blender Export -Exports Three.js' ASCII JSON format. - -## IMPORTANT - -The exporter (r69 and earlier) has been completely replaced. Please ensure you have removed the io_three_mesh addon from your Blender addons directory before installing the current addon (io_three). - -## Installation - - -Recommended Blender version **>= 2.73.0** - -Copy the io_three folder to the scripts/addons folder. If it doesn't exist, create it. The full path is OS-dependent (see below). - -Once that is done, you need to activate the plugin. Open Blender preferences, look for -Addons, search for `three`, enable the checkbox next to the `Import-Export: Three.js Format` entry. - -Goto Usage. 
- -### Windows - -Should look like this: - - C:\Program Files\Blender Foundation\Blender\2.7X\scripts\addons - -OR (for 2.6) - - C:\Users\USERNAME\AppData\Roaming\Blender Foundation\Blender\2.6X\scripts\addons - -### OSX - -In your user's library for user installed Blender addons: - - /Users/(myuser)/Library/Application Support/Blender/2.7X/scripts/addons - -OR (for 2.79) - - /Applications/Blender/blender.app/Contents/Resources/2.79/scripts/addons - -### Linux - -By default, this should look like: - - /home/USERNAME/.config/blender/2.6X/scripts/addons - -For Ubuntu users who installed Blender 2.68 via apt-get, this is the location: - - /usr/lib/blender/scripts/addons - -For Ubuntu users who installed Blender 2.7x via apt-get, this is the location: - - /usr/share/blender/scripts/addons - - -## Usage - -Activate the Import-Export addon under "User Preferences" > "Addons" and then use the regular Export menu within Blender, select `Three.js (json)`. - - -## Enabling msgpack - -To enable msgpack compression copy the msgpack to scripts/modules. - - -## Importer - -Currently there is no import functionality available. +> **NOTICE:** The Blender exporter for the Three.js JSON format has been removed, to focus on better support for other workflows. For recommended alternatives, see [Loading 3D Models](https://threejs.org/docs/#manual/introduction/loading-3d-models). The Three.js JSON format is still fully supported for use with [Object3D.toJSON](https://threejs.org/docs/#api/core/Object3D.toJSON), the [Editor](https://threejs.org/editor/), [THREE.ObjectLoader](https://threejs.org/docs/#api/loaders/ObjectLoader), [THREE.JSONLoader](https://threejs.org/docs/#api/loaders/JSONLoader), and [converters](https://github.com/mrdoob/three.js/tree/dev/utils/converters). 
diff --git a/utils/exporters/blender/addons/io_three/__init__.py b/utils/exporters/blender/addons/io_three/__init__.py deleted file mode 100644 index b26539c9c655be1ef374625729560ecc6f2fcc12..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/__init__.py +++ /dev/null @@ -1,1059 +0,0 @@ -# ##### BEGIN GPL LICENSE BLOCK ##### -# -# This program is free software; you can redistribute it and/or -# modify it under the terms of the GNU General Public License -# as published by the Free Software Foundation; either version 2 -# of the License, or (at your option) any later version. -# -# This program is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -# GNU General Public License for more details. -# -# You should have received a copy of the GNU General Public License -# along with this program; if not, write to the Free Software Foundation, -# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA. -# -# ##### END GPL LICENSE BLOCK ##### - -import os -import json -import logging - -import bpy -from bpy_extras.io_utils import ExportHelper -from bpy.props import ( - EnumProperty, - BoolProperty, - FloatProperty, - IntProperty, - StringProperty -) - -from . 
import constants - -logging.basicConfig( - format='%(levelname)s:THREE:%(message)s', - level=logging.DEBUG) - -bl_info = { - 'name': "Three.js Format", - 'author': "repsac, mrdoob, yomotsu, mpk, jpweeks, rkusa, tschw, jackcaron, bhouston", - 'version': (1, 5, 0), - 'blender': (2, 74, 0), - 'location': "File > Export", - 'description': "Export Three.js formatted JSON files.", - 'warning': "Importer not included.", - 'wiki_url': "https://github.com/mrdoob/three.js/tree/"\ - "master/utils/exporters/blender", - 'tracker_url': "https://github.com/mrdoob/three.js/issues", - 'category': 'Import-Export' -} - - -def _geometry_types(): - """The valid geometry types that are supported by Three.js - - :return: list of tuples - - """ - keys = (constants.GLOBAL, - constants.GEOMETRY, - constants.BUFFER_GEOMETRY) - types = [] - for key in keys: - types.append((key, key.title(), key)) - - return types - -bpy.types.Mesh.THREE_geometry_type = EnumProperty( - name="Geometry type", - description="Geometry type", - items=_geometry_types(), - default=constants.GLOBAL) - -class ThreeMesh(bpy.types.Panel): - """Creates custom properties on a mesh node""" - - bl_label = 'THREE' - bl_space_type = 'PROPERTIES' - bl_region_type = 'WINDOW' - bl_context = 'data' - - def draw(self, context): - """ - - :param context: - - """ - row = self.layout.row() - if context.mesh: - row.prop(context.mesh, - 'THREE_geometry_type', - text="Type") - -def _blending_types(index): - """Supported blending types for Three.js - - :param index: - :type index: int - :returns: tuple if types (str, str, str) - - """ - types = (constants.BLENDING_TYPES.NONE, - constants.BLENDING_TYPES.NORMAL, - constants.BLENDING_TYPES.ADDITIVE, - constants.BLENDING_TYPES.SUBTRACTIVE, - constants.BLENDING_TYPES.MULTIPLY, - constants.BLENDING_TYPES.CUSTOM) - return (types[index], types[index], types[index]) - -bpy.types.Material.THREE_blending_type = EnumProperty( - name="Blending type", - description="Blending type", - 
items=[_blending_types(x) for x in range(6)], - default=constants.BLENDING_TYPES.NORMAL) - -bpy.types.Material.THREE_depth_write = BoolProperty(default=True) -bpy.types.Material.THREE_depth_test = BoolProperty(default=True) -bpy.types.Material.THREE_double_sided = BoolProperty(default=False) - -class ThreeMaterial(bpy.types.Panel): - """Adds custom properties to the Materials of an object""" - - bl_label = 'THREE' - bl_space_type = 'PROPERTIES' - bl_region_type = 'WINDOW' - bl_context = 'material' - - def draw(self, context): - """ - - :param context: - - """ - layout = self.layout - mat = context.material - - if mat is not None: - row = layout.row() - row.label(text="Selected material: %s" % mat.name) - - row = layout.row() - row.prop(mat, 'THREE_blending_type', - text="Blending type") - - row = layout.row() - row.prop(mat, 'THREE_depth_write', - text="Enable depth writing") - - row = layout.row() - row.prop(mat, 'THREE_depth_test', - text="Enable depth testing") - - row = layout.row() - row.prop(mat, 'THREE_double_sided', - text="Double-sided") - -def _mag_filters(index): - """Three.js mag filters - - :param index: - :type index: int - :returns: tuple with the filter values - - """ - types = (constants.LINEAR_FILTERS.LINEAR, - constants.NEAREST_FILTERS.NEAREST) - return (types[index], types[index], types[index]) - -bpy.types.Texture.THREE_mag_filter = EnumProperty( - name="Mag Filter", - items=[_mag_filters(x) for x in range(2)], - default=constants.LINEAR_FILTERS.LINEAR) - -def _min_filters(index): - """Three.js min filters - - :param index: - :type index: int - :returns: tuple with the filter values - - """ - types = (constants.LINEAR_FILTERS.LINEAR, - constants.LINEAR_FILTERS.MIP_MAP_NEAREST, - constants.LINEAR_FILTERS.MIP_MAP_LINEAR, - constants.NEAREST_FILTERS.NEAREST, - constants.NEAREST_FILTERS.MIP_MAP_NEAREST, - constants.NEAREST_FILTERS.MIP_MAP_LINEAR) - return (types[index], types[index], types[index]) - -bpy.types.Texture.THREE_min_filter = 
EnumProperty( - name="Min Filter", - items=[_min_filters(x) for x in range(6)], - default=constants.LINEAR_FILTERS.MIP_MAP_LINEAR) - -def _mapping(index): - """Three.js texture mappings types - - :param index: - :type index: int - :returns: tuple with the mapping values - - """ - types = (constants.MAPPING_TYPES.UV, - constants.MAPPING_TYPES.CUBE_REFLECTION, - constants.MAPPING_TYPES.CUBE_REFRACTION, - constants.MAPPING_TYPES.SPHERICAL_REFLECTION) - return (types[index], types[index], types[index]) - -bpy.types.Texture.THREE_mapping = EnumProperty( - name="Mapping", - items=[_mapping(x) for x in range(4)], - default=constants.MAPPING_TYPES.UV) - -class ThreeTexture(bpy.types.Panel): - """Adds custom properties to a texture""" - bl_label = 'THREE' - bl_space_type = 'PROPERTIES' - bl_region_type = 'WINDOW' - bl_context = 'texture' - - #@TODO: possible to make cycles compatible? - def draw(self, context): - """ - - :param context: - - """ - layout = self.layout - tex = context.texture - - if tex is not None: - row = layout.row() - row.prop(tex, 'THREE_mapping', text="Mapping") - - row = layout.row() - row.prop(tex, 'THREE_mag_filter', text="Mag Filter") - - row = layout.row() - row.prop(tex, 'THREE_min_filter', text="Min Filter") - -bpy.types.Object.THREE_export = bpy.props.BoolProperty(default=True) - -class ThreeObject(bpy.types.Panel): - """Adds custom properties to an object""" - bl_label = 'THREE' - bl_space_type = 'PROPERTIES' - bl_region_type = 'WINDOW' - bl_context = 'object' - - def draw(self, context): - """ - - :param context: - - """ - layout = self.layout - obj = context.object - - row = layout.row() - row.prop(obj, 'THREE_export', text='Export') - -class ThreeExportSettings(bpy.types.Operator): - """Save the current export settings (gets saved in .blend)""" - bl_label = "Save Settings" - bl_idname = "scene.three_export_settings_set" - - def execute(self, context): - cycles = context.scene.cycles - cycles.use_samples_final = True - - 
context.scene[constants.EXPORT_SETTINGS_KEY] = set_settings(context.active_operator.properties) - - self.report({"INFO"}, "Three Export Settings Saved") - - return {"FINISHED"} - -def restore_export_settings(properties, settings): - """Restore the settings - - :param properties: - - """ - - ## Geometry { - properties.option_vertices = settings.get( - constants.VERTICES, - constants.EXPORT_OPTIONS[constants.VERTICES]) - - properties.option_faces = settings.get( - constants.FACES, - constants.EXPORT_OPTIONS[constants.FACES]) - properties.option_normals = settings.get( - constants.NORMALS, - constants.EXPORT_OPTIONS[constants.NORMALS]) - - properties.option_skinning = settings.get( - constants.SKINNING, - constants.EXPORT_OPTIONS[constants.SKINNING]) - - properties.option_bones = settings.get( - constants.BONES, - constants.EXPORT_OPTIONS[constants.BONES]) - - properties.option_influences = settings.get( - constants.INFLUENCES_PER_VERTEX, - constants.EXPORT_OPTIONS[constants.INFLUENCES_PER_VERTEX]) - - properties.option_apply_modifiers = settings.get( - constants.APPLY_MODIFIERS, - constants.EXPORT_OPTIONS[constants.APPLY_MODIFIERS]) - - properties.option_extra_vgroups = settings.get( - constants.EXTRA_VGROUPS, - constants.EXPORT_OPTIONS[constants.EXTRA_VGROUPS]) - - properties.option_geometry_type = settings.get( - constants.GEOMETRY_TYPE, - constants.EXPORT_OPTIONS[constants.GEOMETRY_TYPE]) - - properties.option_index_type = settings.get( - constants.INDEX_TYPE, - constants.EXPORT_OPTIONS[constants.INDEX_TYPE]) - ## } - - ## Materials { - properties.option_materials = settings.get( - constants.MATERIALS, - constants.EXPORT_OPTIONS[constants.MATERIALS]) - - properties.option_uv_coords = settings.get( - constants.UVS, - constants.EXPORT_OPTIONS[constants.UVS]) - - properties.option_face_materials = settings.get( - constants.FACE_MATERIALS, - constants.EXPORT_OPTIONS[constants.FACE_MATERIALS]) - - properties.option_maps = settings.get( - constants.MAPS, - 
constants.EXPORT_OPTIONS[constants.MAPS]) - - properties.option_colors = settings.get( - constants.COLORS, - constants.EXPORT_OPTIONS[constants.COLORS]) - - properties.option_mix_colors = settings.get( - constants.MIX_COLORS, - constants.EXPORT_OPTIONS[constants.MIX_COLORS]) - ## } - - ## Settings { - properties.option_scale = settings.get( - constants.SCALE, - constants.EXPORT_OPTIONS[constants.SCALE]) - - properties.option_round_off = settings.get( - constants.ENABLE_PRECISION, - constants.EXPORT_OPTIONS[constants.ENABLE_PRECISION]) - - properties.option_round_value = settings.get( - constants.PRECISION, - constants.EXPORT_OPTIONS[constants.PRECISION]) - - properties.option_custom_properties = settings.get( - constants.CUSTOM_PROPERTIES, - constants.EXPORT_OPTIONS[constants.CUSTOM_PROPERTIES]) - - properties.option_logging = settings.get( - constants.LOGGING, - constants.EXPORT_OPTIONS[constants.LOGGING]) - - properties.option_compression = settings.get( - constants.COMPRESSION, - constants.NONE) - - properties.option_indent = settings.get( - constants.INDENT, - constants.EXPORT_OPTIONS[constants.INDENT]) - - properties.option_export_textures = settings.get( - constants.EXPORT_TEXTURES, - constants.EXPORT_OPTIONS[constants.EXPORT_TEXTURES]) - - properties.option_embed_textures = settings.get( - constants.EMBED_TEXTURES, - constants.EXPORT_OPTIONS[constants.EMBED_TEXTURES]) - - properties.option_texture_folder = settings.get( - constants.TEXTURE_FOLDER, - constants.EXPORT_OPTIONS[constants.TEXTURE_FOLDER]) - - properties.option_embed_animation = settings.get( - constants.EMBED_ANIMATION, - constants.EXPORT_OPTIONS[constants.EMBED_ANIMATION]) - ## } - - ## Scene { - properties.option_export_scene = settings.get( - constants.SCENE, - constants.EXPORT_OPTIONS[constants.SCENE]) - - #properties.option_embed_geometry = settings.get( - # constants.EMBED_GEOMETRY, - # constants.EXPORT_OPTIONS[constants.EMBED_GEOMETRY]) - - properties.option_lights = settings.get( - 
constants.LIGHTS, - constants.EXPORT_OPTIONS[constants.LIGHTS]) - - properties.option_cameras = settings.get( - constants.CAMERAS, - constants.EXPORT_OPTIONS[constants.CAMERAS]) - - properties.option_hierarchy = settings.get( - constants.HIERARCHY, - constants.EXPORT_OPTIONS[constants.HIERARCHY]) - ## } - - ## Animation { - properties.option_animation_morph = settings.get( - constants.MORPH_TARGETS, - constants.EXPORT_OPTIONS[constants.MORPH_TARGETS]) - - properties.option_blend_shape = settings.get( - constants.BLEND_SHAPES, - constants.EXPORT_OPTIONS[constants.BLEND_SHAPES]) - - properties.option_animation_skeletal = settings.get( - constants.ANIMATION, - constants.EXPORT_OPTIONS[constants.ANIMATION]) - - properties.option_keyframes = settings.get( - constants.KEYFRAMES, - constants.EXPORT_OPTIONS[constants.KEYFRAMES]) - - properties.option_bake_keyframes = settings.get( - constants.BAKE_KEYFRAMES, - constants.EXPORT_OPTIONS[constants.BAKE_KEYFRAMES]) - - properties.option_frame_step = settings.get( - constants.FRAME_STEP, - constants.EXPORT_OPTIONS[constants.FRAME_STEP]) - - properties.option_frame_index_as_time = settings.get( - constants.FRAME_INDEX_AS_TIME, - constants.EXPORT_OPTIONS[constants.FRAME_INDEX_AS_TIME]) - ## } - -def set_settings(properties): - """Set the export settings to the correct keys. 
- - :param properties: - :returns: settings - :rtype: dict - - """ - settings = { - constants.VERTICES: properties.option_vertices, - constants.FACES: properties.option_faces, - constants.NORMALS: properties.option_normals, - constants.SKINNING: properties.option_skinning, - constants.BONES: properties.option_bones, - constants.EXTRA_VGROUPS: properties.option_extra_vgroups, - constants.APPLY_MODIFIERS: properties.option_apply_modifiers, - constants.GEOMETRY_TYPE: properties.option_geometry_type, - constants.INDEX_TYPE: properties.option_index_type, - - constants.MATERIALS: properties.option_materials, - constants.UVS: properties.option_uv_coords, - constants.FACE_MATERIALS: properties.option_face_materials, - constants.MAPS: properties.option_maps, - constants.COLORS: properties.option_colors, - constants.MIX_COLORS: properties.option_mix_colors, - - constants.SCALE: properties.option_scale, - constants.ENABLE_PRECISION: properties.option_round_off, - constants.PRECISION: properties.option_round_value, - constants.CUSTOM_PROPERTIES: properties.option_custom_properties, - constants.LOGGING: properties.option_logging, - constants.COMPRESSION: properties.option_compression, - constants.INDENT: properties.option_indent, - constants.EXPORT_TEXTURES: properties.option_export_textures, - constants.EMBED_TEXTURES: properties.option_embed_textures, - constants.TEXTURE_FOLDER: properties.option_texture_folder, - - constants.SCENE: properties.option_export_scene, - #constants.EMBED_GEOMETRY: properties.option_embed_geometry, - constants.EMBED_ANIMATION: properties.option_embed_animation, - constants.LIGHTS: properties.option_lights, - constants.CAMERAS: properties.option_cameras, - constants.HIERARCHY: properties.option_hierarchy, - - constants.MORPH_TARGETS: properties.option_animation_morph, - constants.BLEND_SHAPES: properties.option_blend_shape, - constants.ANIMATION: properties.option_animation_skeletal, - constants.KEYFRAMES: properties.option_keyframes, - 
constants.BAKE_KEYFRAMES: properties.option_bake_keyframes, - constants.FRAME_STEP: properties.option_frame_step, - constants.FRAME_INDEX_AS_TIME: properties.option_frame_index_as_time, - constants.INFLUENCES_PER_VERTEX: properties.option_influences - } - - return settings - - -def compression_types(): - """Supported compression formats - - :rtype: tuple - - """ - types = [(constants.NONE, constants.NONE, constants.NONE)] - - try: - import msgpack - types.append((constants.MSGPACK, constants.MSGPACK, - constants.MSGPACK)) - except ImportError: - pass - - return types - - -def animation_options(): - """The supported skeletal animation types - - :returns: list of tuples - - """ - anim = [ - (constants.OFF, constants.OFF.title(), constants.OFF), - (constants.POSE, constants.POSE.title(), constants.POSE), - (constants.REST, constants.REST.title(), constants.REST) - ] - - return anim - -def resolve_conflicts(self, context): - if(not self.option_export_textures): - self.option_embed_textures = False; - -class ExportThree(bpy.types.Operator, ExportHelper): - """Class that handles the export properties""" - - bl_idname = 'export.three' - bl_label = 'Export THREE' - bl_options = {'PRESET'} - - filename_ext = constants.EXTENSION - - option_vertices = BoolProperty( - name="Vertices", - description="Export vertices", - default=constants.EXPORT_OPTIONS[constants.VERTICES]) - - option_faces = BoolProperty( - name="Faces", - description="Export faces (Geometry only)", - default=constants.EXPORT_OPTIONS[constants.FACES]) - - option_normals = BoolProperty( - name="Normals", - description="Export normals", - default=constants.EXPORT_OPTIONS[constants.NORMALS]) - - option_colors = BoolProperty( - name="Vertex Colors", - description="Export vertex colors", - default=constants.EXPORT_OPTIONS[constants.COLORS]) - - option_mix_colors = BoolProperty( - name="Mix Colors", - description="Mix material and vertex colors", - default=constants.EXPORT_OPTIONS[constants.MIX_COLORS]) - - 
option_uv_coords = BoolProperty( - name="UVs", - description="Export texture coordinates", - default=constants.EXPORT_OPTIONS[constants.UVS]) - - option_materials = BoolProperty( - name="Materials", - description="Export materials", - default=constants.EXPORT_OPTIONS[constants.MATERIALS]) - - option_face_materials = BoolProperty( - name="Face Materials", - description="Face mapping materials (Geometry only)", - default=constants.EXPORT_OPTIONS[constants.FACE_MATERIALS]) - - option_maps = BoolProperty( - name="Textures", - description="Include texture maps", - default=constants.EXPORT_OPTIONS[constants.MAPS]) - - option_skinning = BoolProperty( - name="Skinning", - description="Export skin data", - default=constants.EXPORT_OPTIONS[constants.SKINNING]) - - option_bones = BoolProperty( - name="Bones", - description="Export bones", - default=constants.EXPORT_OPTIONS[constants.BONES]) - - option_extra_vgroups = StringProperty( - name="Extra Vertex Groups", - description="Non-skinning vertex groups to export (comma-separated, w/ star wildcard, BufferGeometry only).", - default=constants.EXPORT_OPTIONS[constants.EXTRA_VGROUPS]) - - option_apply_modifiers = BoolProperty( - name="Apply Modifiers", - description="Apply Modifiers to mesh objects", - default=constants.EXPORT_OPTIONS[constants.APPLY_MODIFIERS] - ) - - index_buffer_types = [ - (constants.NONE,) * 3, - (constants.UINT_16,) * 3, - (constants.UINT_32,) * 3] - - option_index_type = EnumProperty( - name="Index Buffer", - description="Index buffer type that will be used for BufferGeometry objects.", - items=index_buffer_types, - default=constants.EXPORT_OPTIONS[constants.INDEX_TYPE]) - - option_scale = FloatProperty( - name="Scale", - description="Scale vertices", - min=0.01, - max=1000.0, - soft_min=0.01, - soft_max=1000.0, - default=constants.EXPORT_OPTIONS[constants.SCALE]) - - option_round_off = BoolProperty( - name="Enable Precision", - description="Round off floating point values", - 
-        default=constants.EXPORT_OPTIONS[constants.ENABLE_PRECISION])
-
-    option_round_value = IntProperty(
-        name="",
-        min=0,
-        max=16,
-        description="Floating point precision",
-        default=constants.EXPORT_OPTIONS[constants.PRECISION])
-
-    option_custom_properties = BoolProperty(
-        name="Custom Properties",
-        description="Export custom properties as userData",
-        default=False)
-
-    logging_types = [
-        (constants.DISABLED, constants.DISABLED, constants.DISABLED),
-        (constants.DEBUG, constants.DEBUG, constants.DEBUG),
-        (constants.INFO, constants.INFO, constants.INFO),
-        (constants.WARNING, constants.WARNING, constants.WARNING),
-        (constants.ERROR, constants.ERROR, constants.ERROR),
-        (constants.CRITICAL, constants.CRITICAL, constants.CRITICAL)]
-
-    option_logging = EnumProperty(
-        name="",
-        description="Logging verbosity level",
-        items=logging_types,
-        default=constants.DISABLED)
-
-    option_geometry_type = EnumProperty(
-        name="Type",
-        description="Geometry type",
-        items=_geometry_types()[1:],
-        default=constants.EXPORT_OPTIONS[constants.GEOMETRY_TYPE])
-
-    option_export_scene = BoolProperty(
-        name="Scene",
-        description="Export scene",
-        default=constants.EXPORT_OPTIONS[constants.SCENE])
-
-    #@TODO: removing this option since the ObjectLoader doesn't have
-    #       support for handling external geometry data
-    #option_embed_geometry = BoolProperty(
-    #    name="Embed geometry",
-    #    description="Embed geometry",
-    #    default=constants.EXPORT_OPTIONS[constants.EMBED_GEOMETRY])
-
-    option_embed_animation = BoolProperty(
-        name="Embed animation",
-        description="Embed animation data with the geometry data",
-        default=constants.EXPORT_OPTIONS[constants.EMBED_ANIMATION])
-
-    option_export_textures = BoolProperty(
-        name="Export textures",
-        description="Export textures",
-        default=constants.EXPORT_OPTIONS[constants.EXPORT_TEXTURES],
-        update=resolve_conflicts)
-
-    option_embed_textures = BoolProperty(
-        name="Embed textures",
-        description="Embed base64 textures in .json",
-        default=constants.EXPORT_OPTIONS[constants.EMBED_TEXTURES])
-
-    option_texture_folder = StringProperty(
-        name="Texture folder",
-        description="add this folder to textures path",
-        default=constants.EXPORT_OPTIONS[constants.TEXTURE_FOLDER])
-
-    option_lights = BoolProperty(
-        name="Lights",
-        description="Export default scene lights",
-        default=False)
-
-    option_cameras = BoolProperty(
-        name="Cameras",
-        description="Export default scene cameras",
-        default=False)
-
-    option_hierarchy = BoolProperty(
-        name="Hierarchy",
-        description="Export object hierarchy",
-        default=False)
-
-    option_animation_morph = BoolProperty(
-        name="Morph animation",
-        description="Export animation (morphs)",
-        default=constants.EXPORT_OPTIONS[constants.MORPH_TARGETS])
-
-    option_blend_shape = BoolProperty(
-        name="Blend Shape animation",
-        description="Export Blend Shapes",
-        default=constants.EXPORT_OPTIONS[constants.BLEND_SHAPES])
-
-    option_animation_skeletal = EnumProperty(
-        name="",
-        description="Export animation (skeletal)",
-        items=animation_options(),
-        default=constants.OFF)
-
-    option_keyframes = BoolProperty(
-        name="Keyframe animation",
-        description="Export animation (keyframes)",
-        default=constants.EXPORT_OPTIONS[constants.KEYFRAMES])
-
-    option_bake_keyframes = BoolProperty(
-        name="Bake keyframe animation",
-        description="Bake keyframe animation each frame step",
-        default=constants.EXPORT_OPTIONS[constants.BAKE_KEYFRAMES])
-
-    option_frame_index_as_time = BoolProperty(
-        name="Frame index as time",
-        description="Use (original) frame index as frame time",
-        default=constants.EXPORT_OPTIONS[constants.FRAME_INDEX_AS_TIME])
-
-    option_frame_step = IntProperty(
-        name="Frame step",
-        description="Animation frame step",
-        min=1,
-        max=1000,
-        soft_min=1,
-        soft_max=1000,
-        default=1)
-
-    option_indent = BoolProperty(
-        name="Indent JSON",
-        description="Disable this to reduce the file size",
-        default=constants.EXPORT_OPTIONS[constants.INDENT])
-
-    option_compression = EnumProperty(
-        name="",
-        description="Compression options",
-        items=compression_types(),
-        default=constants.NONE)
-
-    option_influences = IntProperty(
-        name="Influences",
-        description="Maximum number of bone influences",
-        min=1,
-        max=4,
-        default=2)
-
-    def invoke(self, context, event):
-
-        settings = context.scene.get(constants.EXPORT_SETTINGS_KEY)
-        if settings:
-            try:
-                restore_export_settings(self.properties, settings)
-            except AttributeError as e:
-                logging.error("Loading export settings failed:")
-                logging.exception(e)
-                logging.debug("Removed corrupted settings")
-
-                del context.scene[constants.EXPORT_SETTINGS_KEY]
-
-        return ExportHelper.invoke(self, context, event)
-
-    @classmethod
-    def poll(cls, context):
-        """
-
-        :param context:
-
-        """
-        return context.active_object is not None
-
-    def execute(self, context):
-        """
-
-        :param context:
-
-        """
-
-        if not self.properties.filepath:
-            raise Exception("filename not set")
-
-        settings = set_settings(self.properties)
-        settings['addon_version'] = bl_info['version']
-
-        filepath = self.filepath
-        if settings[constants.COMPRESSION] == constants.MSGPACK:
-            filepath = "%s%s" % (filepath[:-4], constants.PACK)
-
-        from io_three import exporter
-        if settings[constants.SCENE]:
-            exporter.export_scene(filepath, settings)
-        else:
-            exporter.export_geometry(filepath, settings)
-
-        return {'FINISHED'}
-
-    def draw(self, context):
-        """
-
-        :param context:
-
-        """
-
-        using_geometry = self.option_geometry_type == constants.GEOMETRY
-
-        layout = self.layout
-
-        ## Scene {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="SCENE", icon="SCENE_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_export_scene')
-        row.prop(self.properties, 'option_materials')
-
-        #row = box.row()
-        #row.prop(self.properties, 'option_embed_geometry')
-
-        row = box.row()
-        row.prop(self.properties, 'option_lights')
-        row.prop(self.properties, 'option_cameras')
-
-        row = box.row()
-        row.prop(self.properties, 'option_hierarchy')
-        ## }
-
-        layout.separator()
-
-        ## Geometry {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="GEOMETRY", icon="MESH_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_geometry_type')
-
-        row = box.row()
-        row.prop(self.properties, 'option_index_type')
-
-        row = box.row()
-        row.prop(self.properties, 'option_vertices')
-        col = row.column()
-        col.prop(self.properties, 'option_faces')
-        col.enabled = using_geometry
-
-        row = box.row()
-        row.prop(self.properties, 'option_normals')
-        row.prop(self.properties, 'option_uv_coords')
-
-        row = box.row()
-        row.prop(self.properties, 'option_apply_modifiers')
-
-        row = box.row()
-        row.prop(self.properties, 'option_extra_vgroups')
-        row.enabled = not using_geometry
-        ## }
-
-        layout.separator()
-
-        ## Materials {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-        row.label(text="MATERIAL", icon="MATERIAL_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_colors')
-        row.prop(self.properties, 'option_mix_colors')
-
-        row = box.row()
-        row.prop(self.properties, 'option_face_materials')
-        row.enabled = using_geometry
-        ## }
-
-        layout.separator()
-
-        ## Textures {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="TEXTURE", icon="TEXTURE_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_maps')
-        row.prop(self.properties, 'option_export_textures')
-
-        row = box.row()
-        row.prop(self.properties, 'option_embed_textures')
-        row.enabled = self.properties.option_export_textures
-
-        row = box.row()
-        row.prop(self.properties, 'option_texture_folder')
-        ## }
-
-        layout.separator()
-
-        ## Armature {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="ARMATURE", icon="ARMATURE_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_bones')
-        row.prop(self.properties, 'option_skinning')
-        ## }
-
-        layout.separator()
-
-        ## Animation {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="ANIMATION", icon="POSE_DATA")
-
-        row = box.row()
-        row.prop(self.properties, 'option_animation_morph')
-        row.prop(self.properties, 'option_blend_shape')
-
-        row = box.row()
-        row.label(text="Skeletal animations:")
-        row.prop(self.properties, 'option_animation_skeletal')
-
-        row = box.row()
-        row.prop(self.properties, 'option_keyframes')
-
-        row = box.row()
-        row.prop(self.properties, 'option_bake_keyframes')
-
-        row = box.row()
-        row.prop(self.properties, 'option_influences')
-
-        row = box.row()
-        row.prop(self.properties, 'option_frame_step')
-
-        row = box.row()
-        row.prop(self.properties, 'option_frame_index_as_time')
-
-        row = box.row()
-        row.prop(self.properties, 'option_embed_animation')
-        ## }
-
-        layout.separator()
-
-        ## Settings {
-        box = layout.box()
-        column = box.column(True)
-        row = column.row(True)
-        row.alignment = 'CENTER'
-
-        row.label(text="SETTINGS", icon="SETTINGS")
-
-        row = box.row()
-        row.prop(self.properties, 'option_scale')
-
-        row = box.row()
-        row.prop(self.properties, 'option_round_off')
-        row.prop(self.properties, 'option_round_value')
-
-        row = box.row()
-        row.prop(self.properties, 'option_custom_properties')
-
-        row = box.row()
-        row.prop(self.properties, 'option_indent')
-
-        row = box.row()
-        row.label(text="Logging verbosity:")
-        row.prop(self.properties, 'option_logging')
-
-        row = box.row()
-        row.label(text="File compression format:")
-        row.prop(self.properties, 'option_compression')
-        ## }
-
-        ## Operators {
-        has_settings = context.scene.get(constants.EXPORT_SETTINGS_KEY, False)
-        row = layout.row()
-        row.operator(
-            ThreeExportSettings.bl_idname,
-            ThreeExportSettings.bl_label,
-            icon="%s" % "PINNED" if has_settings else "UNPINNED")
-        ## }
-
-
-def menu_func_export(self, context):
-    """
-
-    :param self:
-    :param context:
-
-    """
-    default_path = bpy.data.filepath.replace('.blend', constants.EXTENSION)
-    text = "Three.js (%s)" % constants.EXTENSION
-    operator = self.layout.operator(ExportThree.bl_idname, text=text)
-    operator.filepath = default_path
-
-
-def register():
-    """Registers the addon (Blender boilerplate)"""
-    bpy.utils.register_module(__name__)
-    bpy.types.INFO_MT_file_export.append(menu_func_export)
-
-
-def unregister():
-    """Unregisters the addon (Blender boilerplate)"""
-    bpy.utils.unregister_module(__name__)
-    bpy.types.INFO_MT_file_export.remove(menu_func_export)
-
-
-if __name__ == '__main__':
-    register()
diff --git a/utils/exporters/blender/addons/io_three/constants.py b/utils/exporters/blender/addons/io_three/constants.py
deleted file mode 100644
index 7dc1f6fb480153d171fe9f6b20c3df9509c4db25..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/constants.py
+++ /dev/null
@@ -1,398 +0,0 @@
-'''
-All constant data used in the package should be defined here.
-'''
-
-from collections import OrderedDict as BASE_DICT
-
-BLENDING_TYPES = type('Blending', (), {
-    'NONE': 'NoBlending',
-    'NORMAL': 'NormalBlending',
-    'ADDITIVE': 'AdditiveBlending',
-    'SUBTRACTIVE': 'SubtractiveBlending',
-    'MULTIPLY': 'MultiplyBlending',
-    'CUSTOM': 'CustomBlending'
-})
-
-BLENDING_CONSTANTS = type('BlendingConstant', (), {
-    'NoBlending':0,
-    'NormalBlending':1,
-    'AdditiveBlending':2,
-    'SubtractiveBlending':3,
-    'MultiplyBlending':4,
-    'CustomBlending':5
-})
-
-NEAREST_FILTERS = type('NearestFilters', (), {
-    'NEAREST': 'NearestFilter',
-    'MIP_MAP_NEAREST': 'NearestMipMapNearestFilter',
-    'MIP_MAP_LINEAR': 'NearestMipMapLinearFilter'
-})
-
-LINEAR_FILTERS = type('LinearFilters', (), {
-    'LINEAR': 'LinearFilter',
-    'MIP_MAP_NEAREST': 'LinearMipMapNearestFilter',
-    'MIP_MAP_LINEAR': 'LinearMipMapLinearFilter'
-})
-
-MAPPING_TYPES = type('Mapping', (), {
-    'UV': 'UVMapping',
-    'CUBE_REFLECTION': 'CubeReflectionMapping',
-    'CUBE_REFRACTION': 'CubeRefractionMapping',
-    'SPHERICAL_REFLECTION': 'SphericalReflectionMapping'
-})
-
-NUMERIC = {
-    'UVMapping': 300,
-    'CubeReflectionMapping': 301,
-    'CubeRefractionMapping': 302,
-    'EquirectangularReflectionMapping': 303,
-    'EquirectangularRefractionMapping': 304,
-    'SphericalReflectionMapping': 305,
-
-    'RepeatWrapping': 1000,
-    'repeat': 1000,
-    'ClampToEdgeWrapping': 1001,
-    'MirroredRepeatWrapping': 1002,
-
-    'NearestFilter': 1003,
-    'NearestMipMapNearestFilter': 1004,
-    'NearestMipMapLinearFilter': 1005,
-    'LinearFilter': 1006,
-    'LinearMipMapNearestFilter': 1007,
-    'LinearMipMapLinearFilter': 1008
-}
-JSON = 'json'
-EXTENSION = '.%s' % JSON
-INDENT = 'indent'
-
-
-MATERIALS = 'materials'
-SCENE = 'scene'
-VERTICES = 'vertices'
-FACES = 'faces'
-NORMALS = 'normals'
-BONES = 'bones'
-UVS = 'uvs'
-APPLY_MODIFIERS = 'applyModifiers'
-COLORS = 'colors'
-MIX_COLORS = 'mixColors'
-EXTRA_VGROUPS = 'extraVertexGroups'
-INDEX = 'index'
-DRAW_CALLS = 'drawcalls'
-DC_START = 'start'
-DC_COUNT = 'count'
-DC_INDEX = 'index'
-
-GROUPS = 'groups'
-
-SCALE = 'scale'
-COMPRESSION = 'compression'
-MAPS = 'maps'
-FRAME_STEP = 'frameStep'
-FRAME_INDEX_AS_TIME = 'frameIndexAsTime'
-ANIMATION = 'animations'
-CLIPS="clips"
-KEYFRAMES = 'tracks'
-BAKE_KEYFRAMES = 'bake_tracks'
-MORPH_TARGETS = 'morphTargets'
-MORPH_TARGETS_ANIM = 'morphTargetsAnimation'
-BLEND_SHAPES = 'blendShapes'
-POSE = 'pose'
-REST = 'rest'
-SKIN_INDICES = 'skinIndices'
-SKIN_WEIGHTS = 'skinWeights'
-LOGGING = 'logging'
-CAMERAS = 'cameras'
-LIGHTS = 'lights'
-HIERARCHY = 'hierarchy'
-FACE_MATERIALS = 'faceMaterials'
-SKINNING = 'skinning'
-EXPORT_TEXTURES = 'exportTextures'
-EMBED_TEXTURES = 'embedTextures'
-TEXTURE_FOLDER = 'textureFolder'
-ENABLE_PRECISION = 'enablePrecision'
-PRECISION = 'precision'
-DEFAULT_PRECISION = 6
-CUSTOM_PROPERTIES = 'customProperties'
-EMBED_GEOMETRY = 'embedGeometry'
-EMBED_ANIMATION = 'embedAnimation'
-OFF = 'off'
-
-GLOBAL = 'global'
-BUFFER_GEOMETRY = 'BufferGeometry'
-GEOMETRY = 'geometry'
-GEOMETRY_TYPE = 'geometryType'
-INDEX_TYPE = 'indexType'
-
-CRITICAL = 'critical'
-ERROR = 'error'
-WARNING = 'warning'
-INFO = 'info'
-DEBUG = 'debug'
-DISABLED = 'disabled'
-
-NONE = 'None'
-MSGPACK = 'msgpack'
-
-PACK = 'pack'
-
-FLOAT_32 = 'Float32Array'
-UINT_16 = 'Uint16Array'
-UINT_32 = 'Uint32Array'
-
-INFLUENCES_PER_VERTEX = 'influencesPerVertex'
-
-EXPORT_OPTIONS = {
-    FACES: True,
-    VERTICES: True,
-    NORMALS: True,
-    UVS: True,
-    APPLY_MODIFIERS: True,
-    COLORS: False,
-    EXTRA_VGROUPS: '',
-    INDEX_TYPE: UINT_16,
-    MATERIALS: False,
-    FACE_MATERIALS: False,
-    SCALE: 1,
-    FRAME_STEP: 1,
-    FRAME_INDEX_AS_TIME: False,
-    SCENE: False,
-    MIX_COLORS: False,
-    COMPRESSION: None,
-    MAPS: False,
-    ANIMATION: OFF,
-    KEYFRAMES: False,
-    BAKE_KEYFRAMES: False,
-    BONES: False,
-    SKINNING: False,
-    MORPH_TARGETS: False,
-    BLEND_SHAPES: False,
-    CAMERAS: False,
-    LIGHTS: False,
-    HIERARCHY: False,
-    EXPORT_TEXTURES: True,
-    EMBED_TEXTURES: False,
-    TEXTURE_FOLDER: '',
-    LOGGING: DEBUG,
-    ENABLE_PRECISION: False,
-    PRECISION: DEFAULT_PRECISION,
-    CUSTOM_PROPERTIES: False,
-    EMBED_GEOMETRY: True,
-    EMBED_ANIMATION: True,
-    GEOMETRY_TYPE: BUFFER_GEOMETRY,
-    INFLUENCES_PER_VERTEX: 2,
-    INDENT: True
-}
-
-
-FORMAT_VERSION = 4.4
-VERSION = 'version'
-THREE = 'io_three'
-GENERATOR = 'generator'
-SOURCE_FILE = 'sourceFile'
-VALID_DATA_TYPES = (str, int, float, bool, list, tuple, dict)
-
-JSON = 'json'
-GZIP = 'gzip'
-
-EXTENSIONS = {
-    JSON: '.json',
-    MSGPACK: '.pack',
-    GZIP: '.gz'
-}
-
-METADATA = 'metadata'
-GEOMETRIES = 'geometries'
-IMAGES = 'images'
-TEXTURE = 'texture'
-TEXTURES = 'textures'
-
-USER_DATA = 'userData'
-DATA = 'data'
-TYPE = 'type'
-
-MATERIAL = 'material'
-OBJECT = 'object'
-PERSPECTIVE_CAMERA = 'PerspectiveCamera'
-ORTHOGRAPHIC_CAMERA = 'OrthographicCamera'
-AMBIENT_LIGHT = 'AmbientLight'
-DIRECTIONAL_LIGHT = 'DirectionalLight'
-POINT_LIGHT = 'PointLight'
-SPOT_LIGHT = 'SpotLight'
-# TODO (abelnation): confirm this is correct area light string for exporter
-RECT_AREA_LIGHT = 'RectAreaLight'
-HEMISPHERE_LIGHT = 'HemisphereLight'
-# TODO: RectAreaLight support
-MESH = 'Mesh'
-EMPTY = 'Empty'
-SPRITE = 'Sprite'
-
-DEFAULT_METADATA = {
-    VERSION: FORMAT_VERSION,
-    TYPE: OBJECT.title(),
-    GENERATOR: THREE
-}
-
-UUID = 'uuid'
-
-MATRIX = 'matrix'
-POSITION = 'position'
-QUATERNION = 'quaternion'
-ROTATION = 'rotation'
-SCALE = 'scale'
-
-UV = 'uv'
-UV2 = 'uv2'
-ATTRIBUTES = 'attributes'
-NORMAL = 'normal'
-ITEM_SIZE = 'itemSize'
-ARRAY = 'array'
-
-VISIBLE = 'visible'
-CAST_SHADOW = 'castShadow'
-RECEIVE_SHADOW = 'receiveShadow'
-QUAD = 'quad'
-
-MASK = {
-    QUAD: 0,
-    MATERIALS: 1,
-    UVS: 3,
-    NORMALS: 5,
-    COLORS: 7
-}
-
-
-CHILDREN = 'children'
-
-URL = 'url'
-WRAP = 'wrap'
-REPEAT = 'repeat'
-WRAPPING = type('Wrapping', (), {
-    'REPEAT': 'repeat',
-    'CLAMP': 'clamp',
-    'MIRROR': 'mirror'
-})
-ANISOTROPY = 'anisotropy'
-MAG_FILTER = 'magFilter'
-MIN_FILTER = 'minFilter'
-MAPPING = 'mapping'
-
-IMAGE = 'image'
-
-NAME = 'name'
-PARENT = 'parent'
-LENGTH = 'length'
-FPS = 'fps'
-HIERARCHY = 'hierarchy'
-POS = 'pos'
-ROTQ = 'rotq'
-ROT = 'rot'
-SCL = 'scl'
-TIME = 'time'
-KEYS = 'keys'
-
-COLOR = 'color'
-EMISSIVE = 'emissive'
-SPECULAR = 'specular'
-SPECULAR_COEF = 'specularCoef'
-SHININESS = 'shininess'
-SIDE = 'side'
-OPACITY = 'opacity'
-TRANSPARENT = 'transparent'
-WIREFRAME = 'wireframe'
-BLENDING = 'blending'
-VERTEX_COLORS = 'vertexColors'
-DEPTH_WRITE = 'depthWrite'
-DEPTH_TEST = 'depthTest'
-
-MAP = 'map'
-SPECULAR_MAP = 'specularMap'
-LIGHT_MAP = 'lightMap'
-BUMP_MAP = 'bumpMap'
-BUMP_SCALE = 'bumpScale'
-NORMAL_MAP = 'normalMap'
-NORMAL_SCALE = 'normalScale'
-
-#@TODO ENV_MAP, REFLECTIVITY, REFRACTION_RATIO, COMBINE
-
-MAP_DIFFUSE = 'mapDiffuse'
-MAP_DIFFUSE_REPEAT = 'mapDiffuseRepeat'
-MAP_DIFFUSE_WRAP = 'mapDiffuseWrap'
-MAP_DIFFUSE_ANISOTROPY = 'mapDiffuseAnisotropy'
-
-MAP_SPECULAR = 'mapSpecular'
-MAP_SPECULAR_REPEAT = 'mapSpecularRepeat'
-MAP_SPECULAR_WRAP = 'mapSpecularWrap'
-MAP_SPECULAR_ANISOTROPY = 'mapSpecularAnisotropy'
-
-MAP_LIGHT = 'mapLight'
-MAP_LIGHT_REPEAT = 'mapLightRepeat'
-MAP_LIGHT_WRAP = 'mapLightWrap'
-MAP_LIGHT_ANISOTROPY = 'mapLightAnisotropy'
-
-MAP_NORMAL = 'mapNormal'
-MAP_NORMAL_FACTOR = 'mapNormalFactor'
-MAP_NORMAL_REPEAT = 'mapNormalRepeat'
-MAP_NORMAL_WRAP = 'mapNormalWrap'
-MAP_NORMAL_ANISOTROPY = 'mapNormalAnisotropy'
-
-MAP_BUMP = 'mapBump'
-MAP_BUMP_REPEAT = 'mapBumpRepeat'
-MAP_BUMP_WRAP = 'mapBumpWrap'
-MAP_BUMP_ANISOTROPY = 'mapBumpAnisotropy'
-MAP_BUMP_SCALE = 'mapBumpScale'
-
-NORMAL_BLENDING = 0
-
-VERTEX_COLORS_ON = 2
-VERTEX_COLORS_OFF = 0
-
-SIDE_DOUBLE = 2
-
-THREE_BASIC = 'MeshBasicMaterial'
-THREE_LAMBERT = 'MeshLambertMaterial'
-THREE_PHONG = 'MeshPhongMaterial'
-
-INTENSITY = 'intensity'
-DISTANCE = 'distance'
-ANGLE = 'angle'
-DECAY = 'decayExponent'
-
-FOV = 'fov'
-ASPECT = 'aspect'
-NEAR = 'near'
-FAR = 'far'
-
-LEFT = 'left'
-RIGHT = 'right'
-TOP = 'top'
-BOTTOM = 'bottom'
-
-SHADING = 'shading'
-COLOR_DIFFUSE = 'colorDiffuse'
-COLOR_EMISSIVE = 'colorEmissive'
-COLOR_SPECULAR = 'colorSpecular'
-DBG_NAME = 'DbgName'
-DBG_COLOR = 'DbgColor'
-DBG_INDEX = 'DbgIndex'
-EMIT = 'emit'
-
-PHONG = 'phong'
-LAMBERT = 'lambert'
-BASIC = 'basic'
-
-NORMAL_BLENDING = 'NormalBlending'
-
-DBG_COLORS = (0xeeeeee, 0xee0000, 0x00ee00, 0x0000ee,
-              0xeeee00, 0x00eeee, 0xee00ee)
-
-DOUBLE_SIDED = 'doubleSided'
-
-EXPORT_SETTINGS_KEY = 'threeExportSettings'
-
-# flips vectors
-
-XZ_Y = "XZ_Y"
-X_ZY = "X_ZY"
-XYZ = "XYZ"
-_XY_Z = "_XY_Z"
diff --git a/utils/exporters/blender/addons/io_three/dialogs.py b/utils/exporters/blender/addons/io_three/dialogs.py
deleted file mode 100644
index 01baf18eeb2f29755fed09ec528566424370e50c..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/dialogs.py
+++ /dev/null
@@ -1,112 +0,0 @@
-from bpy import context
-
-CONTEXT = {
-    0: {
-        'title': "Error Message",
-        'icon': 'CANCEL'
-    },
-    1: {
-        'title': "Warning Message",
-        'icon': 'ERROR' # I prefer this icon for warnings
-    },
-    2: {
-        'title': "Message",
-        'icon': 'NONE'
-    },
-    3: {
-        'title': "Question",
-        'icon': 'QUESTION'
-    }
-}
-
-
-def error(message, title="", wrap=40):
-    """Creates an error dialog.
-
-    :param message: text of the message body
-    :param title: text to append to the title
-                  (Default value = "")
-    :param wrap: line width (Default value = 40)
-
-    """
-    _draw(message, title, wrap, 0)
-
-
-def warning(message, title="", wrap=40):
-    """Creates an error dialog.
-
-    :param message: text of the message body
-    :param title: text to append to the title
-                  (Default value = "")
-    :param wrap: line width (Default value = 40)
-
-    """
-    _draw(message, title, wrap, 1)
-
-
-
-def info(message, title="", wrap=40):
-    """Creates an error dialog.
-
-    :param message: text of the message body
-    :param title: text to append to the title
-                  (Default value = "")
-    :param wrap: line width (Default value = 40)
-
-    """
-    _draw(message, title, wrap, 2)
-
-
-
-def question(message, title="", wrap=40):
-    """Creates an error dialog.
-
-    :param message: text of the message body
-    :param title: text to append to the title
-                  (Default value = "")
-    :param wrap: line width (Default value = 40)
-
-    """
-    _draw(message, title, wrap, 3)
-
-
-
-# Great idea borrowed from
-# http://community.cgcookie.com/t/code-snippet-easy-error-messages/203
-def _draw(message, title, wrap, key):
-    """
-
-    :type message: str
-    :type title: str
-    :type wrap: int
-    :type key: int
-
-    """
-    lines = []
-    if wrap > 0:
-        while len(message) > wrap:
-            i = message.rfind(' ', 0, wrap)
-            if i == -1:
-                lines += [message[:wrap]]
-                message = message[wrap:]
-            else:
-                lines += [message[:i]]
-                message = message[i+1:]
-    if message:
-        lines += [message]
-
-    def draw(self, *args):
-        """
-
-        :param self:
-        :param *args:
-
-        """
-        for line in lines:
-            self.layout.label(line)
-
-    title = "%s: %s" % (title, CONTEXT[key]['title'])
-    icon = CONTEXT[key]['icon']
-
-    context.window_manager.popup_menu(
-        draw, title=title.strip(), icon=icon)
diff --git a/utils/exporters/blender/addons/io_three/exceptions.py b/utils/exporters/blender/addons/io_three/exceptions.py
deleted file mode 100644
index bf4cf7ee8d67b8444fcdd6c7632c197ccf5a62fe..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exceptions.py
+++ /dev/null
@@ -1,9 +0,0 @@
-class ThreeError(Exception): pass
-class UnimplementedFeatureError(ThreeError): pass
-class ThreeValueError(ThreeError): pass
-class UnsupportedObjectType(ThreeError): pass
-class GeometryError(ThreeError): pass
-class MaterialError(ThreeError): pass
-class SelectionError(ThreeError): pass
-class NGonError(ThreeError): pass
-class BufferGeometryError(ThreeError): pass
diff --git a/utils/exporters/blender/addons/io_three/exporter/__init__.py b/utils/exporters/blender/addons/io_three/exporter/__init__.py
deleted file mode 100644
index 3ca23d63b70e7863fc84652f735eac767919c678..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/__init__.py
+++ /dev/null
@@ -1,96 +0,0 @@
-import os
-import sys
-import traceback
-from .. import constants, logger, exceptions, dialogs
-from . import scene, geometry, api, base_classes
-
-
-def _error_handler(func):
-
-    def inner(filepath, options, *args, **kwargs):
-        level = options.get(constants.LOGGING, constants.DISABLED)
-        version = options.get('addon_version')
-        if level != constants.DISABLED:
-            logger.init('io_three.export.log', level=level)
-        if version is not None:
-            logger.debug("Addon Version %s", version)
-        api.init()
-
-        try:
-            func(filepath, options, *args, **kwargs)
-        except:
-            info = sys.exc_info()
-            trace = traceback.format_exception(
-                info[0], info[1], info[2].tb_next)
-            trace = ''.join(trace)
-            logger.error(trace)
-
-            print('Error recorded to %s' % logger.LOG_FILE)
-
-            raise
-        else:
-            print('Log: %s' % logger.LOG_FILE)
-
-    return inner
-
-
-@_error_handler
-def export_scene(filepath, options):
-    selected = []
-
-    # during scene exports unselect everything. this is needed for
-    # applying modifiers, if it is necessary
-    # record the selected nodes so that selection is restored later
-    for obj in api.selected_objects():
-        api.object.unselect(obj)
-        selected.append(obj)
-    active = api.active_object()
-
-    try:
-        scene_ = scene.Scene(filepath, options=options)
-        scene_.parse()
-        scene_.write()
-    except:
-        _restore_selection(selected, active)
-        raise
-
-    _restore_selection(selected, active)
-
-
-@_error_handler
-def export_geometry(filepath, options, node=None):
-    msg = ""
-    exception = None
-    if node is None:
-        node = api.active_object()
-        if node is None:
-            msg = "Nothing selected"
-            logger.error(msg)
-            exception = exceptions.SelectionError
-        if node.type != 'MESH':
-            msg = "%s is not a valid mesh object" % node.name
-            logger.error(msg)
-            exception = exceptions.GeometryError
-
-    if exception is not None:
-        if api.batch_mode():
-            raise exception(msg)
-        else:
-            dialogs.error(msg)
-        return
-
-    mesh = api.object.mesh(node, options)
-    parent = base_classes.BaseScene(filepath, options)
-    geo = geometry.Geometry(mesh, parent)
-    geo.parse()
-    geo.write()
-
-    if not options.get(constants.EMBED_ANIMATION, True):
-        geo.write_animation(os.path.dirname(filepath))
-
-
-def _restore_selection(objects, active):
-    for obj in objects:
-        api.object.select(obj)
-
-    api.set_active_object(active)
diff --git a/utils/exporters/blender/addons/io_three/exporter/_json.py b/utils/exporters/blender/addons/io_three/exporter/_json.py
deleted file mode 100644
index 50031c323d07a87d75ef210aa58888ee51378f2a..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/_json.py
+++ /dev/null
@@ -1,208 +0,0 @@
-import json
-from .. import constants
-
-ROUND = constants.DEFAULT_PRECISION
-
-## THREE override function
-def _json_floatstr(o):
-    if ROUND is not None:
-        o = round(o, ROUND)
-
-    return '%g' % o
-
-
-def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,
-        _key_separator, _item_separator, _sort_keys, _skipkeys, _one_shot,
-        ## HACK: hand-optimized bytecode; turn globals into locals
-        ValueError=ValueError,
-        dict=dict,
-        float=float,
-        id=id,
-        int=int,
-        isinstance=isinstance,
-        list=list,
-        str=str,
-        tuple=tuple,
-    ):
-    '''
-    Overwrite json.encoder for Python 2.7 and above to not
-    assign each index of a list or tuple to its own row as
-    this is completely asinine behaviour
-    '''
-
-    ## @THREE
-    # Override the function
-    _floatstr = _json_floatstr
-
-    if _indent is not None and not isinstance(_indent, str):
-        _indent = ' ' * _indent
-
-    def _iterencode_list(lst, _current_indent_level):
-        if not lst:
-            yield '[]'
-            return
-        if markers is not None:
-            markerid = id(lst)
-            if markerid in markers:
-                raise ValueError("Circular reference detected")
-            markers[markerid] = lst
-        buf = '['
-        ## @THREEJS
-        # - block the moronic functionality that puts each
-        #   index on its own line causing insane row counts
-        #if _indent is not None:
-        #    _current_indent_level += 1
-        #    newline_indent = '\n' + _indent * _current_indent_level
-        #    separator = _item_separator + newline_indent
-        #    buf += newline_indent
-        #else:
-        newline_indent = None
-        separator = _item_separator
-        first = True
-        for value in lst:
-            if first:
-                first = False
-            else:
-                buf = separator
-            if isinstance(value, str):
-                yield buf + _encoder(value)
-            elif value is None:
-                yield buf + 'null'
-            elif value is True:
-                yield buf + 'true'
-            elif value is False:
-                yield buf + 'false'
-            elif isinstance(value, int):
-                yield buf + str(value)
-            elif isinstance(value, float):
-                yield buf + _floatstr(value)
-            else:
-                yield buf
-                if isinstance(value, (list, tuple)):
-                    chunks = _iterencode_list(value, _current_indent_level)
-                elif isinstance(value, dict):
-                    chunks = _iterencode_dict(value, _current_indent_level)
-                else:
-                    chunks = _iterencode(value, _current_indent_level)
-                for chunk in chunks:
-                    yield chunk
-        if newline_indent is not None:
-            _current_indent_level -= 1
-            yield '\n' + _indent * _current_indent_level
-        yield ']'
-        if markers is not None:
-            del markers[markerid]
-
-    def _iterencode_dict(dct, _current_indent_level):
-        if not dct:
-            yield '{}'
-            return
-        if markers is not None:
-            markerid = id(dct)
-            if markerid in markers:
-                raise ValueError("Circular reference detected")
-            markers[markerid] = dct
-        yield '{'
-        if _indent is not None:
-            _current_indent_level += 1
-            newline_indent = '\n' + _indent * _current_indent_level
-            item_separator = _item_separator + newline_indent
-            yield newline_indent
-        else:
-            newline_indent = None
-            item_separator = _item_separator
-        first = True
-        if _sort_keys:
-            items = sorted(dct.items(), key=lambda kv: kv[0])
-        else:
-            items = dct.items()
-        for key, value in items:
-            if isinstance(key, str):
-                pass
-            # JavaScript is weakly typed for these, so it makes sense to
-            # also allow them. Many encoders seem to do something like this.
-            elif isinstance(key, float):
-                key = _floatstr(key)
-            elif key is True:
-                key = 'true'
-            elif key is False:
-                key = 'false'
-            elif key is None:
-                key = 'null'
-            elif isinstance(key, int):
-                key = str(key)
-            elif _skipkeys:
-                continue
-            else:
-                raise TypeError("key " + repr(key) + " is not a string")
-            if first:
-                first = False
-            else:
-                yield item_separator
-            yield _encoder(key)
-            yield _key_separator
-            if isinstance(value, str):
-                yield _encoder(value)
-            elif value is None:
-                yield 'null'
-            elif value is True:
-                yield 'true'
-            elif value is False:
-                yield 'false'
-            elif isinstance(value, int):
-                yield str(value)
-            elif isinstance(value, float):
-                yield _floatstr(value)
-            else:
-                if isinstance(value, (list, tuple)):
-                    chunks = _iterencode_list(value, _current_indent_level)
-                elif isinstance(value, dict):
-                    chunks = _iterencode_dict(value, _current_indent_level)
-                else:
-                    chunks = _iterencode(value, _current_indent_level)
-                for chunk in chunks:
-                    yield chunk
-        if newline_indent is not None:
-            _current_indent_level -= 1
-            yield '\n' + _indent * _current_indent_level
-        yield '}'
-        if markers is not None:
-            del markers[markerid]
-
-    def _iterencode(o, _current_indent_level):
-        if isinstance(o, str):
-            yield _encoder(o)
-        elif o is None:
-            yield 'null'
-        elif o is True:
-            yield 'true'
-        elif o is False:
-            yield 'false'
-        elif isinstance(o, int):
-            yield str(o)
-        elif isinstance(o, float):
-            yield _floatstr(o)
-        elif isinstance(o, (list, tuple)):
-            for chunk in _iterencode_list(o, _current_indent_level):
-                yield chunk
-        elif isinstance(o, dict):
-            for chunk in _iterencode_dict(o, _current_indent_level):
-                yield chunk
-        else:
-            if markers is not None:
-                markerid = id(o)
-                if markerid in markers:
-                    raise ValueError("Circular reference detected")
-                markers[markerid] = o
-            o = _default(o)
-            for chunk in _iterencode(o, _current_indent_level):
-                yield chunk
-            if markers is not None:
-                del markers[markerid]
-    return _iterencode
-
-
-# override the encoder
-json.encoder._make_iterencode = _make_iterencode
-
-
diff --git a/utils/exporters/blender/addons/io_three/exporter/api/__init__.py b/utils/exporters/blender/addons/io_three/exporter/api/__init__.py
deleted file mode 100644
index 36f940343cb7cdd9831735f49929ad2d6f1b9ee6..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/api/__init__.py
+++ /dev/null
@@ -1,77 +0,0 @@
-import os
-import bpy
-from . import object as object_, mesh, material, camera, light
-from .. import logger
-
-
-def active_object():
-    """
-
-    :return: The actively selected object
-
-    """
-    return bpy.context.scene.objects.active
-
-
-def batch_mode():
-    """
-
-    :return: Whether or not the session is interactive
-    :rtype: bool
-
-    """
-    return bpy.context.area is None
-
-
-def data(node):
-    """
-
-    :param node: name of an object node
-    :returns: the data block of the node
-
-    """
-    try:
-        return bpy.data.objects[node].data
-    except KeyError:
-        pass
-
-
-def init():
-    """Initializing the api module. Required first step before
-    initializing the actual export process.
-    """
-    logger.debug("Initializing API")
-    object_.clear_mesh_map()
-
-
-def selected_objects(valid_types=None):
-    """Selected objects.
-
-    :param valid_types: Filter for valid types (Default value = None)
-
-    """
-    logger.debug("api.selected_objects(%s)", valid_types)
-    for node in bpy.context.selected_objects:
-        if valid_types is None:
-            yield node.name
-        elif valid_types is not None and node.type in valid_types:
-            yield node.name
-
-
-def set_active_object(obj):
-    """Set the object as active in the scene
-
-    :param obj:
-
-    """
-    logger.debug("api.set_active_object(%s)", obj)
-    bpy.context.scene.objects.active = obj
-
-
-def scene_name():
-    """
-
-    :return: name of the current scene
-
-    """
-    return os.path.basename(bpy.data.filepath)
diff --git a/utils/exporters/blender/addons/io_three/exporter/api/animation.py b/utils/exporters/blender/addons/io_three/exporter/api/animation.py
deleted file mode 100644
index 7dd906767dfe9cfcbbdeb3573bc6d62287cd593c..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/api/animation.py
+++ /dev/null
@@ -1,683 +0,0 @@
-"""
-Module for handling the parsing of skeletal animation data.
-updated on 2/07/2016: bones scaling support (@uthor verteAzur verteAzur@multivers3d.fr)
-"""
-
-import math
-import mathutils
-from bpy import data, context, ops
-from .. import constants, logger
-
-def pose_animation(armature, options):
-    """Query armature animation using pose bones
-
-    :param armature:
-    :param options:
-    :returns: list dictionaries containing animationdata
-    :rtype: [{}, {}, ...]
-
-    """
-    logger.debug("animation.pose_animation(%s)", armature)
-    func = _parse_pose_action
-    return _parse_action(func, armature, options)
-
-
-def rest_animation(armature, options):
-    """Query armature animation (REST position)
-
-    :param armature:
-    :param options:
-    :returns: list dictionaries containing animationdata
-    :rtype: [{}, {}, ...]
-
-    """
-    logger.debug("animation.rest_animation(%s)", armature)
-    func = _parse_rest_action
-    return _parse_action(func, armature, options)
-
-
-def _parse_action(func, armature, options):
-    """
-
-    :param func:
-    :param armature:
-    :param options:
-
-    """
-    animations = []
-    logger.info("Parsing %d actions", len(data.actions))
-    for action in data.actions:
-        logger.info("Parsing action %s", action.name)
-        animation = func(action, armature, options)
-        animations.append(animation)
-    return animations
-
-
-def _parse_rest_action(action, armature, options):
-    """
-
-    :param action:
-    :param armature:
-    :param options:
-
-    """
-    end_frame = action.frame_range[1]
-    start_frame = action.frame_range[0]
-    frame_length = end_frame - start_frame
-    rot = armature.matrix_world.decompose()[1]
-    rotation_matrix = rot.to_matrix()
-    hierarchy = []
-    parent_index = -1
-    frame_step = options.get(constants.FRAME_STEP, 1)
-    fps = context.scene.render.fps
-
-    start = int(start_frame)
-    end = int(end_frame / frame_step) + 1
-
-    for bone in armature.data.bones:
-        # I believe this was meant to skip control bones, may
-        # not be useful. needs more testing
-        if bone.use_deform is False:
-            logger.info("Skipping animation data for bone %s", bone.name)
-            continue
-
-        logger.info("Parsing animation data for bone %s", bone.name)
-
-        keys = []
-        for frame in range(start, end):
-            computed_frame = frame * frame_step
-            pos, pchange = _position(bone, computed_frame,
-                                     action, armature.matrix_world)
-            rot, rchange = _rotation(bone, computed_frame,
-                                     action, rotation_matrix)
-            rot = _normalize_quaternion(rot)
-
-            sca, schange = _scale(bone, computed_frame,
-                                  action, armature.matrix_world)
-
-            pos_x, pos_y, pos_z = pos.x, pos.z, -pos.y
-            rot_x, rot_y, rot_z, rot_w = rot.x, rot.z, -rot.y, rot.w
-            sca_x, sca_y, sca_z = sca.x, sca.z, sca.y
-
-            if frame == start_frame:
-
-                time = (frame * frame_step - start_frame) / fps
-
-                keyframe = {
-                    constants.TIME: time,
-                    constants.POS: [pos_x, pos_y, pos_z],
-                    constants.ROT: [rot_x, rot_y, rot_z, rot_w],
-                    constants.SCL: [sca_x, sca_y, sca_z]
-                }
-                keys.append(keyframe)
-
-            # END-FRAME: needs pos, rot and scl attributes
-            # with animation length (required frame)
-
-            elif frame == end_frame / frame_step:
-
-                time = frame_length / fps
-                keyframe = {
-                    constants.TIME: time,
-                    constants.POS: [pos_x, pos_y, pos_z],
-                    constants.ROT: [rot_x, rot_y, rot_z, rot_w],
-                    constants.SCL: [sca_x, sca_y, sca_z]
-                }
-                keys.append(keyframe)
-
-            # MIDDLE-FRAME: needs only one of the attributes,
-            # can be an empty frame (optional frame)
-
-            elif pchange is True or rchange is True or schange is True:
-
-                time = (frame * frame_step - start_frame) / fps
-
-                if pchange is True and rchange is True:
-                    keyframe = {
-                        constants.TIME: time,
-                        constants.POS: [pos_x, pos_y, pos_z],
-                        constants.ROT: [rot_x, rot_y, rot_z, rot_w],
-                        constants.SCL: [sca_x, sca_y, sca_z]
-                    }
-                elif pchange is True:
-                    keyframe = {
-                        constants.TIME: time,
-                        constants.POS: [pos_x, pos_y, pos_z]
-                    }
-                elif rchange is True:
-                    keyframe = {
-                        constants.TIME: time,
-                        constants.ROT: [rot_x, rot_y, rot_z, rot_w]
-                    }
-                elif schange is True:
-                    keyframe = {
-                        constants.TIME: time,
-                        constants.SCL: [sca_x, sca_y, sca_z]
-                    }
-
-                keys.append(keyframe)
-
-        hierarchy.append({
-            constants.KEYS: keys,
-            constants.PARENT: parent_index
-        })
-        parent_index += 1
-
-    animation = {
-        constants.HIERARCHY: hierarchy,
-        constants.LENGTH: frame_length / fps,
-        constants.FPS: fps,
-        constants.NAME: action.name
-    }
-
-    return animation
-
-
-def _parse_pose_action(action, armature, options):
-    """
-
-    :param action:
-    :param armature:
-    :param options:
-
-    """
-    try:
-        current_context = context.area.type
-    except AttributeError:
-        for window in context.window_manager.windows:
-            screen = window.screen
-            for area in screen.areas:
-                if area.type != 'VIEW_3D':
-                    continue
-
-                override = {
-                    'window': window,
-                    'screen': screen,
-                    'area': area
-                }
-                ops.screen.screen_full_area(override)
-                break
-        current_context = context.area.type
-
-    context.scene.objects.active = armature
-    context.area.type = 'DOPESHEET_EDITOR'
-    context.space_data.mode = 'ACTION'
-    context.area.spaces.active.action = action
-
-    armature_matrix = armature.matrix_world
-    fps = context.scene.render.fps
-
-    end_frame = action.frame_range[1]
-    start_frame = action.frame_range[0]
-    frame_length = end_frame - start_frame
-
-    frame_step = options.get(constants.FRAME_STEP, 1)
-    used_frames = int(frame_length / frame_step) + 1
-
-    keys = []
-    channels_location = []
-    channels_rotation = []
-    channels_scale = []
-
-    for pose_bone in armature.pose.bones:
-        logger.info("Processing channels for %s",
-                    pose_bone.bone.name)
-        keys.append([])
-        channels_location.append(
-            _find_channels(action,
-                           pose_bone.bone,
-                           'location'))
-        channels_rotation.append(
-            _find_channels(action,
-                           pose_bone.bone,
-                           'rotation_quaternion'))
-        channels_rotation.append(
-            _find_channels(action,
-                           pose_bone.bone,
-                           'rotation_euler'))
-        channels_scale.append(
-            _find_channels(action,
-                           pose_bone.bone,
-                           'scale'))
-
-    frame_step = options[constants.FRAME_STEP]
-    frame_index_as_time = options[constants.FRAME_INDEX_AS_TIME]
-    for frame_index in range(0, used_frames):
-        if frame_index == used_frames - 1:
-            frame = end_frame
-        else:
-            frame = start_frame + frame_index * frame_step
-
-        logger.info("Processing frame %d", frame)
-
-        time = frame - start_frame
-        if frame_index_as_time is False:
-            time = time / fps
-
-        context.scene.frame_set(frame)
-
-        bone_index = 0
-
-        def has_keyframe_at(channels, frame):
-            """
-
-            :param channels:
-            :param frame:
-
-            """
-            def find_keyframe_at(channel, frame):
-                """
-
-                :param channel:
-                :param frame:
-
-                """
-                for keyframe in channel.keyframe_points:
-                    if keyframe.co[0] == frame:
-                        return keyframe
-                return None
-
-            for channel in channels:
-                if not find_keyframe_at(channel, frame) is None:
-                    return True
-            return False
-
-        for pose_bone in armature.pose.bones:
-
-            logger.info("Processing bone %s", pose_bone.bone.name)
-            if pose_bone.parent is None:
-                bone_matrix = armature_matrix * pose_bone.matrix
-            else:
-                parent_matrix = armature_matrix * pose_bone.parent.matrix
-                bone_matrix = armature_matrix * pose_bone.matrix
-                bone_matrix = parent_matrix.inverted() * bone_matrix
-
-            pos, rot, scl = bone_matrix.decompose()
-            rot = _normalize_quaternion(rot)
-
-            pchange = True or has_keyframe_at(
-                channels_location[bone_index], frame)
-            rchange = True or has_keyframe_at(
-                channels_rotation[bone_index], frame)
-            schange = True or has_keyframe_at(
-                channels_scale[bone_index], frame)
-
-            pos = (pos.x, pos.z, -pos.y)
-            rot = (rot.x, rot.z, -rot.y, rot.w)
-            scl = (scl.x, scl.z, scl.y)
-
-            keyframe = {constants.TIME: time}
-            if frame == start_frame or frame == end_frame:
-                keyframe.update({
-                    constants.POS: pos,
-                    constants.ROT: rot,
-                    constants.SCL: scl
-                })
-            elif any([pchange, rchange, schange]):
-                if pchange is True:
-                    keyframe[constants.POS] = pos
-                if rchange is True:
-                    keyframe[constants.ROT] = rot
-                if schange is True:
-                    keyframe[constants.SCL] = scl
-
-            if len(keyframe.keys()) > 1:
-                logger.info("Recording keyframe data
for %s %s", - pose_bone.bone.name, str(keyframe)) - keys[bone_index].append(keyframe) - else: - logger.info("No anim data to record for %s", - pose_bone.bone.name) - - bone_index += 1 - - hierarchy = [] - bone_index = 0 - for pose_bone in armature.pose.bones: - hierarchy.append({ - constants.PARENT: bone_index - 1, - constants.KEYS: keys[bone_index] - }) - bone_index += 1 - - if frame_index_as_time is False: - frame_length = frame_length / fps - - context.scene.frame_set(start_frame) - context.area.type = current_context - - animation = { - constants.HIERARCHY: hierarchy, - constants.LENGTH: frame_length, - constants.FPS: fps, - constants.NAME: action.name - } - - return animation - - -def _find_channels(action, bone, channel_type): - """ - - :param action: - :param bone: - :param channel_type: - - """ - result = [] - - if len(action.groups): - - group_index = -1 - for index, group in enumerate(action.groups): - if group.name == bone.name: - group_index = index - # @TODO: break? - - if group_index > -1: - for channel in action.groups[group_index].channels: - if channel_type in channel.data_path: - result.append(channel) - - else: - bone_label = '"%s"' % bone.name - for channel in action.fcurves: - data_path = [bone_label in channel.data_path, - channel_type in channel.data_path] - if all(data_path): - result.append(channel) - - return result - - -def _position(bone, frame, action, armature_matrix): - """ - - :param bone: - :param frame: - :param action: - :param armature_matrix: - - """ - - position = mathutils.Vector((0, 0, 0)) - change = False - - ngroups = len(action.groups) - - if ngroups > 0: - - index = 0 - - for i in range(ngroups): - if action.groups[i].name == bone.name: - index = i - - for channel in action.groups[index].channels: - if "location" in channel.data_path: - has_changed = _handle_position_channel( - channel, frame, position) - change = change or has_changed - - else: - - bone_label = '"%s"' % bone.name - - for channel in action.fcurves: - 
data_path = channel.data_path - if bone_label in data_path and "location" in data_path: - has_changed = _handle_position_channel( - channel, frame, position) - change = change or has_changed - - position = position * bone.matrix_local.inverted() - - if bone.parent is None: - - position.x += bone.head.x - position.y += bone.head.y - position.z += bone.head.z - - else: - - parent = bone.parent - - parent_matrix = parent.matrix_local.inverted() - diff = parent.tail_local - parent.head_local - - position.x += (bone.head * parent_matrix).x + diff.x - position.y += (bone.head * parent_matrix).y + diff.y - position.z += (bone.head * parent_matrix).z + diff.z - - return armature_matrix*position, change - - -def _rotation(bone, frame, action, armature_matrix): - """ - - :param bone: - :param frame: - :param action: - :param armature_matrix: - - """ - - # TODO: calculate rotation also from rotation_euler channels - - rotation = mathutils.Vector((0, 0, 0, 1)) - - change = False - - ngroups = len(action.groups) - - # animation grouped by bones - - if ngroups > 0: - - index = -1 - - for i in range(ngroups): - if action.groups[i].name == bone.name: - index = i - - if index > -1: - for channel in action.groups[index].channels: - if "quaternion" in channel.data_path: - has_changed = _handle_rotation_channel( - channel, frame, rotation) - change = change or has_changed - - # animation in raw fcurves - - else: - - bone_label = '"%s"' % bone.name - - for channel in action.fcurves: - data_path = channel.data_path - if bone_label in data_path and "quaternion" in data_path: - has_changed = _handle_rotation_channel( - channel, frame, rotation) - change = change or has_changed - - rot3 = rotation.to_3d() - rotation.xyz = rot3 * bone.matrix_local.inverted() - rotation.xyz = armature_matrix * rotation.xyz - - return rotation, change - -def _scale(bone, frame, action, armature_matrix): - """ - - :param bone: - :param frame: - :param action: - :param armature_matrix: - - """ - scale = 
mathutils.Vector((1.0, 1.0, 1.0)) - - change = False - - ngroups = len(action.groups) - - # animation grouped by bones - - if ngroups > 0: - - index = -1 - - for i in range(ngroups): - if action.groups[i].name == bone.name: - - print(action.groups[i].name) - - index = i - - if index > -1: - for channel in action.groups[index].channels: - - if "scale" in channel.data_path: - has_changed = _handle_scale_channel( - channel, frame, scale) - change = change or has_changed - - # animation in raw fcurves - - else: - - bone_label = '"%s"' % bone.name - - for channel in action.fcurves: - data_path = channel.data_path - if bone_label in data_path and "scale" in data_path: - has_changed = _handle_scale_channel( - channel, frame, scale) - change = change or has_changed - - - #scale.xyz = armature_matrix * scale.xyz - - return scale, change - - -def _handle_rotation_channel(channel, frame, rotation): - """ - - :param channel: - :param frame: - :param rotation: - - """ - - change = False - - if channel.array_index in [0, 1, 2, 3]: - - for keyframe in channel.keyframe_points: - if keyframe.co[0] == frame: - change = True - - value = channel.evaluate(frame) - - if channel.array_index == 1: - rotation.x = value - - elif channel.array_index == 2: - rotation.y = value - - elif channel.array_index == 3: - rotation.z = value - - elif channel.array_index == 0: - rotation.w = value - - return change - - -def _handle_position_channel(channel, frame, position): - """ - - :param channel: - :param frame: - :param position: - - """ - - change = False - - if channel.array_index in [0, 1, 2]: - for keyframe in channel.keyframe_points: - if keyframe.co[0] == frame: - change = True - - value = channel.evaluate(frame) - - if channel.array_index == 0: - position.x = value - - if channel.array_index == 1: - position.y = value - - if channel.array_index == 2: - position.z = value - - return change - -def _handle_scale_channel(channel, frame, scale): - """ - - :param channel: - :param frame: - :param 
scale: - - """ - change = False - - if channel.array_index in [0, 1, 2]: - for keyframe in channel.keyframe_points: - if keyframe.co[0] == frame: - change = True - - value = channel.evaluate(frame) - - if channel.array_index == 0: - scale.x = value - - if channel.array_index == 1: - scale.y = value - - if channel.array_index == 2: - scale.z = value - - return change - - -def _quaternion_length(quat): - """Calculate the length of a quaternion - - :param quat: Blender quaternion object - :rtype: float - - """ - return math.sqrt(quat.x * quat.x + quat.y * quat.y + - quat.z * quat.z + quat.w * quat.w) - - -def _normalize_quaternion(quat): - """Normalize a quaternion - - :param quat: Blender quaternion object - :returns: generic quaternion enum object with normalized values - :rtype: object - - """ - enum = type('Enum', (), {'x': 0, 'y': 0, 'z': 0, 'w': 1}) - length = _quaternion_length(quat) - if length != 0: - length = 1 / length - enum.x = quat.x * length - enum.y = quat.y * length - enum.z = quat.z * length - enum.w = quat.w * length - return enum diff --git a/utils/exporters/blender/addons/io_three/exporter/api/camera.py b/utils/exporters/blender/addons/io_three/exporter/api/camera.py deleted file mode 100644 index 06b12135a1e6270daafd03550081ed8fc3a1d5ef..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/api/camera.py +++ /dev/null @@ -1,130 +0,0 @@ -import math -from bpy import data, types, context -from ..
import logger - - -def _camera(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - if isinstance(name, types.Camera): - camera = name - else: - camera = data.cameras[name] - - return func(camera, *args, **kwargs) - - return inner - - -@_camera -def aspect(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.aspect(%s)", camera) - render = context.scene.render - return render.resolution_x/render.resolution_y - - -@_camera -def bottom(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.bottom(%s)", camera) - return -(camera.angle_y * camera.ortho_scale) - - -@_camera -def far(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.far(%s)", camera) - return camera.clip_end - - -@_camera -def fov(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.fov(%s)", camera) - fov_in_radians = camera.angle - aspect_ratio = aspect(camera) - if aspect_ratio > 1: - fov_in_radians = 2 * math.atan(math.tan(fov_in_radians / 2) / aspect_ratio) - return math.degrees(fov_in_radians) - - -@_camera -def left(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.left(%s)", camera) - return -(camera.angle_x * camera.ortho_scale) - - -@_camera -def near(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.near(%s)", camera) - return camera.clip_start - - -@_camera -def right(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.right(%s)", camera) - return camera.angle_x * camera.ortho_scale - - -@_camera -def top(camera): - """ - - :param camera: - :rtype: float - - """ - logger.debug("camera.top(%s)", camera) - return camera.angle_y * camera.ortho_scale diff --git a/utils/exporters/blender/addons/io_three/exporter/api/constants.py b/utils/exporters/blender/addons/io_three/exporter/api/constants.py deleted 
file mode 100644 index 76a4a46b2044c9062959888359051986b559ea80..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/api/constants.py +++ /dev/null @@ -1,29 +0,0 @@ -MESH = 'MESH' -LAMP = 'LAMP' -EMPTY = 'EMPTY' -ARMATURE = 'ARMATURE' - -SPOT = 'SPOT' -POINT = 'POINT' -SUN = 'SUN' -HEMI = 'HEMI' -AREA = 'AREA' - -NO_SHADOW = 'NOSHADOW' - -CAMERA = 'CAMERA' -PERSP = 'PERSP' -ORTHO = 'ORTHO' - -RENDER = 'RENDER' - -ZYX = 'ZYX' - -MULTIPLY = 'MULTIPLY' - -WIRE = 'WIRE' -IMAGE = 'IMAGE' - -MAG_FILTER = 'LinearFilter' -MIN_FILTER = 'LinearMipMapLinearFilter' -MAPPING = 'UVMapping' diff --git a/utils/exporters/blender/addons/io_three/exporter/api/image.py b/utils/exporters/blender/addons/io_three/exporter/api/image.py deleted file mode 100644 index c0581173fe7394a6d86c7d24bb7142950cd07e09..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/api/image.py +++ /dev/null @@ -1,52 +0,0 @@ -import os -from bpy import data, types -from .. 
import logger - - -def _image(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - if isinstance(name, types.Image): - mesh = name - else: - mesh = data.images[name] - - return func(mesh, *args, **kwargs) - - return inner - - -def file_name(image): - """ - - :param image: - :rtype: str - - """ - logger.debug("image.file_name(%s)", image) - return os.path.basename(file_path(image)) - - -@_image -def file_path(image): - """ - - :param image: - :rtype: str - - """ - logger.debug("image.file_path(%s)", image) - return os.path.normpath(image.filepath_from_user()) diff --git a/utils/exporters/blender/addons/io_three/exporter/api/light.py b/utils/exporters/blender/addons/io_three/exporter/api/light.py deleted file mode 100644 index 25a8cc48da1aecd12425c40034030ba715ddd4b1..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/api/light.py +++ /dev/null @@ -1,97 +0,0 @@ -from bpy import data, types -from .. 
import utilities, logger - - -def _lamp(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - if isinstance(name, types.Lamp): - lamp = name - else: - lamp = data.lamps[name] - - return func(lamp, *args, **kwargs) - - return inner - - -@_lamp -def angle(lamp): - """ - - :param lamp: - :rtype: float - - """ - logger.debug("light.angle(%s)", lamp) - return lamp.spot_size - - -@_lamp -def color(lamp): - """ - - :param lamp: - :rtype: int - - """ - logger.debug("light.color(%s)", lamp) - colour = (lamp.color.r, lamp.color.g, lamp.color.b) - return utilities.rgb2int(colour) - - -@_lamp -def distance(lamp): - """ - - :param lamp: - :rtype: float - - """ - logger.debug("light.distance(%s)", lamp) - return lamp.distance - - -@_lamp -def intensity(lamp): - """ - - :param lamp: - :rtype: float - - """ - logger.debug("light.intensity(%s)", lamp) - return round(lamp.energy, 2) - -# mapping enum values to decay exponent -__FALLOFF_TO_EXP = { - 'CONSTANT': 0, - 'INVERSE_LINEAR': 1, - 'INVERSE_SQUARE': 2, - 'CUSTOM_CURVE': 0, - 'LINEAR_QUADRATIC_WEIGHTED': 2 -} - -@_lamp -def falloff(lamp): - """ - - :param lamp: - :rtype: float - - """ - logger.debug("light.falloff(%s)", lamp) - return __FALLOFF_TO_EXP[lamp.falloff_type] diff --git a/utils/exporters/blender/addons/io_three/exporter/api/material.py b/utils/exporters/blender/addons/io_three/exporter/api/material.py deleted file mode 100644 index 609b14bb733f9ef3c687387e3daf8e56ee6c2d25..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/api/material.py +++ /dev/null @@ -1,405 +0,0 @@ -from bpy import data, types -from .. 
import constants, logger -from .constants import MULTIPLY, WIRE, IMAGE - - -def _material(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - material = None - if isinstance(name, types.Material): - material = name - elif name: - material = data.materials[name] - - return func(material, *args, **kwargs) if material else None - - return inner - - -@_material -def blending(material): - """ - - :param material: - :return: THREE_blending_type value - - """ - logger.debug("material.blending(%s)", material) - try: - blend = material.THREE_blending_type - except AttributeError: - logger.debug("No THREE_blending_type attribute found") - blend = constants.NORMAL_BLENDING - - blend = getattr( constants.BLENDING_CONSTANTS , blend) #manthrax: Translate the blending type name, to the three.js constant value. - return blend - - -@_material -def bump_map(material): - """ - - :param material: - :return: texture node for bump - - """ - logger.debug("material.bump_map(%s)", material) - for texture in _valid_textures(material): - if texture.use_map_normal and not \ - texture.texture.use_normal_map: - return texture.texture - - -@_material -def bump_scale(material): - """ - - :param material: - :rtype: float - - """ - logger.debug("material.bump_scale(%s)", material) - for texture in _valid_textures(material): - if texture.use_map_normal: - return texture.normal_factor - - -@_material -def depth_test(material): - """ - - :param material: - :return: THREE_depth_test value - :rtype: bool - - """ - logger.debug("material.depth_test(%s)", material) - try: - test = material.THREE_depth_test - except AttributeError: - logger.debug("No THREE_depth_test attribute found") - test = True - return test - - -@_material -def depth_write(material): - """ - - :param material: - :return: THREE_depth_write value - :rtype: bool - - """ - logger.debug("material.depth_write(%s)", material) - try: - write = 
material.THREE_depth_write - except AttributeError: - logger.debug("No THREE_depth_write attribute found") - write = True - return write - - -@_material -def double_sided(material): - """ - - :param material: - :return: THREE_double_sided value - :rtype: bool - - """ - logger.debug("material.double_sided(%s)", material) - try: - write = material.THREE_double_sided - except AttributeError: - logger.debug("No THREE_double_sided attribute found") - write = False - return write - - -@_material -def diffuse_color(material): - """ - - :param material: - :return: rgb value - :rtype: tuple - - """ - logger.debug("material.diffuse_color(%s)", material) - return (material.diffuse_intensity * material.diffuse_color[0], - material.diffuse_intensity * material.diffuse_color[1], - material.diffuse_intensity * material.diffuse_color[2]) - - -@_material -def diffuse_map(material): - """ - - :param material: - :return: texture node for map - - """ - logger.debug("material.diffuse_map(%s)", material) - for texture in _valid_textures(material): - if texture.use_map_color_diffuse and not \ - texture.blend_type == MULTIPLY: - return texture.texture - - -@_material -def emissive_color(material): - """ - - :param material: - :return: rgb value - :rtype: tuple - - """ - logger.debug("material.emissive_color(%s)", material) - diffuse = diffuse_color(material) - return (material.emit * diffuse[0], - material.emit * diffuse[1], - material.emit * diffuse[2]) - - -@_material -def light_map(material): - """ - - :param material: - :return: texture node for light maps - - """ - logger.debug("material.light_map(%s)", material) - for texture in _valid_textures(material, strict_use=False): - if texture.use_map_color_diffuse and \ - texture.blend_type == MULTIPLY: - return texture.texture - - -@_material -def normal_scale(material): - """ - - :param material: - :rtype: float - - """ - logger.debug("material.normal_scale(%s)", material) - for texture in _valid_textures(material): - if 
texture.use_map_normal: - return (texture.normal_factor, texture.normal_factor) - - -@_material -def normal_map(material): - """ - - :param material: - :return: texture node for normals - - """ - logger.debug("material.normal_map(%s)", material) - for texture in _valid_textures(material): - if texture.use_map_normal and \ - texture.texture.use_normal_map: - return texture.texture - - -@_material -def opacity(material): - """ - - :param material: - :rtype: float - - """ - logger.debug("material.opacity(%s)", material) - return round(material.alpha, 2) - - -@_material -def shading(material): - """ - - :param material: - :return: shading type (phong or lambert) - - """ - logger.debug("material.shading(%s)", material) - dispatch = { - True: constants.PHONG, - False: constants.LAMBERT - } - - if material.use_shadeless: - return constants.BASIC - - return dispatch[material.specular_intensity > 0.0] - - -@_material -def specular_coef(material): - """ - - :param material: - :rtype: float - - """ - logger.debug("material.specular_coef(%s)", material) - return material.specular_hardness - - -@_material -def specular_color(material): - """ - - :param material: - :return: rgb value - :rtype: tuple - - """ - logger.debug("material.specular_color(%s)", material) - return (material.specular_intensity * material.specular_color[0], - material.specular_intensity * material.specular_color[1], - material.specular_intensity * material.specular_color[2]) - - -@_material -def specular_map(material): - """ - - :param material: - :return: texture node for specular - - """ - logger.debug("material.specular_map(%s)", material) - for texture in _valid_textures(material): - if texture.use_map_specular: - return texture.texture - - -@_material -def transparent(material): - """ - - :param material: - :rtype: bool - - """ - logger.debug("material.transparent(%s)", material) - return material.use_transparency - - -@_material -def type(material): - """ - - :param material: - :return: THREE 
compatible shader type - - """ - logger.debug("material.type(%s)", material) - if material.diffuse_shader != 'LAMBERT': - material_type = constants.BASIC - elif material.specular_intensity > 0: - material_type = constants.PHONG - else: - material_type = constants.LAMBERT - - return material_type - - -@_material -def use_vertex_colors(material): - """ - - :param material: - :rtype: bool - - """ - logger.debug("material.use_vertex_colors(%s)", material) - return material.use_vertex_color_paint - - -def used_materials(): - """ - - :return: list of materials that are in use - :rtype: generator - - """ - logger.debug("material.used_materials()") - for material in data.materials: - if material.users > 0: - yield material.name - - -@_material -def visible(material): - """ - - :param material: - :return: THREE_visible value - :rtype: bool - - """ - logger.debug("material.visible(%s)", material) - try: - vis = material.THREE_visible - except AttributeError: - logger.debug("No THREE_visible attribute found") - vis = True - - return vis - - -@_material -def wireframe(material): - """ - - :param material: - :rtype: bool - - """ - logger.debug("material.wireframe(%s)", material) - return material.type == WIRE - - -def _valid_textures(material, strict_use=True): - """ - - :param material: - :rtype: generator - - """ - for texture in material.texture_slots: - if not texture: - continue - if strict_use: - in_use = texture.use - else: - in_use = True - if not in_use: - continue - if not texture.texture or texture.texture.type != IMAGE: - logger.warning("Unable to export non-image texture %s", texture) - continue - logger.debug("Valid texture found %s", texture) - yield texture diff --git a/utils/exporters/blender/addons/io_three/exporter/api/mesh.py b/utils/exporters/blender/addons/io_three/exporter/api/mesh.py deleted file mode 100644 index 3402a233fe048c2a3d2fbe8095fbd6a1581dcc21..0000000000000000000000000000000000000000 --- 
a/utils/exporters/blender/addons/io_three/exporter/api/mesh.py +++ /dev/null @@ -1,1390 +0,0 @@ -""" -Blender API for querying mesh data. Animation data is also -handled here since Three.js associates the animation (skeletal, -morph targets) with the geometry nodes. -""" - -import operator -import re -from bpy import data, types, context -from . import material, texture, animation -from . import object as object_ -from .. import constants, utilities, logger, exceptions - - -# flips vectors - -XZ_Y = constants.XZ_Y -X_ZY = constants.X_ZY -XYZ = constants.XYZ -_XY_Z = constants._XY_Z - - -def flip_axes (a, dir=XYZ): - """ - - :function to swap vectors: - - """ - - if dir == XZ_Y: - a = (a[0], a[2], -a[1]) - elif dir == X_ZY: - a = (a[0], -a[2], a[1]) - elif dir == _XY_Z: - a = (-a[0], -a[1], a[2]) - - return (a[0], a[1], a[2]) - - -def _mesh(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - if isinstance(name, types.Mesh): - mesh = name - else: - mesh = data.meshes[name] - - return func(mesh, *args, **kwargs) - - return inner - - -@_mesh -def skeletal_animation(mesh, options): - """ - - :param mesh: - :param options: - :rtype: [] - - """ - logger.debug("mesh.animation(%s, %s)", mesh, options) - armature = _armature(mesh) - - if not armature: - logger.warning("No armature found (%s)", mesh) - return [] - - anim_type = options.get(constants.ANIMATION) -# pose_position = armature.data.pose_position - dispatch = { - constants.POSE: animation.pose_animation, - constants.REST: animation.rest_animation - } - - func = dispatch[anim_type] -# armature.data.pose_position = anim_type.upper() - animations = func(armature, options) -# armature.data.pose_position = pose_position - - return animations - -@_mesh -def bones(mesh, options): - """ - - :param mesh: - :param options: - :rtype: [], {} - - """ - logger.debug("mesh.bones(%s)", mesh) - armature = _armature(mesh) - - if not 
armature: - return [], {} - - anim_type = options.get(constants.ANIMATION) -# pose_position = armature.data.pose_position - - if anim_type == constants.OFF: - logger.info("Animation type not set, defaulting " - "to using REST position for the armature.") - func = _rest_bones -# armature.data.pose_position = "REST" - else: - dispatch = { - constants.REST: _rest_bones, - constants.POSE: _pose_bones - } - logger.info("Using %s for the armature", anim_type) - func = dispatch[anim_type] -# armature.data.pose_position = anim_type.upper() - - bones_, bone_map = func(armature) -# armature.data.pose_position = pose_position - - return (bones_, bone_map) - - -@_mesh -def buffer_color(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - colors_ = [] - - try: - vertex_colors_ = mesh.vertex_colors[0] # only supports one set - except IndexError: - return [] # no colors found - for color_data in vertex_colors_.data: - colors_.extend(tuple(color_data.color)) - - return colors_ - - -@_mesh -def buffer_normal(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - normals_ = [] - - for face in mesh.tessfaces: - vert_count = len(face.vertices) - if vert_count is not 3: - msg = "Non-triangulated face detected" - raise exceptions.BufferGeometryError(msg) - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal, XZ_Y) if face.use_smooth else flip_axes(face.normal, XZ_Y) - normals_.extend(vector) - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal, _XY_Z) if face.use_smooth else flip_axes(face.normal, _XY_Z) - normals_.extend(vector) - - # using JSON Loader with skinned mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - 
for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - normals_.extend(vector) - - # using JSON Loader with static mesh - else: - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - normals_.extend(vector) - - return normals_ - - -@_mesh -def buffer_face_material(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - face_material = [] - logger.info("Retrieving face materials.") - - for face in mesh.tessfaces: - #logger.info("face:%d,%d",face.index,face.material_index) - face_material.append(face.material_index) - - return face_material - - -@_mesh -def buffer_position(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - position = [] - - for face in mesh.tessfaces: - vert_count = len(face.vertices) - if vert_count is not 3: - msg = "Non-triangulated face detected" - raise exceptions.BufferGeometryError(msg) - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex_index in face.vertices: - vertex = mesh.vertices[vertex_index] - vector = flip_axes(vertex.co, XZ_Y) - position.extend(vector) - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex_index in face.vertices: - vertex = mesh.vertices[vertex_index] - vector = flip_axes(vertex.co, _XY_Z) - position.extend(vector) - - # using JSON Loader with skinned mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for vertex_index in face.vertices: - vertex = mesh.vertices[vertex_index] - vector = flip_axes(vertex.co) - position.extend(vector) - - # using JSON Loader with static mesh - else: - - for vertex_index in face.vertices: - vertex = mesh.vertices[vertex_index] - vector = flip_axes(vertex.co) - position.extend(vector) - - return position 
- - -@_mesh -def buffer_uv(mesh, layer=0): - """ - - :param mesh: - :param layer: (Default value = 0) - :rtype: [] - - """ - uvs_ = [] - if len(mesh.uv_layers) <= layer: - return uvs_ - - for uv_data in mesh.uv_layers[layer].data: - uv_tuple = (uv_data.uv[0], uv_data.uv[1]) - uvs_.extend(uv_tuple) - - return uvs_ - - -@_mesh -def extra_vertex_groups(mesh, patterns_string): - """ - Returns (name,index) tuples for the extra (non-skinning) vertex groups - matching the given patterns. - The patterns are comma-separated where the star character can be used - as a wildcard character sequence. - - :param mesh: - :param patterns_string: - :rtype: [] - - """ - logger.debug("mesh._extra_vertex_groups(%s)", mesh) - pattern_re = None - extra_vgroups = [] - if not patterns_string.strip(): - return extra_vgroups - armature = _armature(mesh) - obj = object_.objects_using_mesh(mesh)[0] - for vgroup_index, vgroup in enumerate(obj.vertex_groups): - # Skip bone weights: - vgroup_name = vgroup.name - if armature: - is_bone_weight = False - for bone in armature.pose.bones: - if bone.name == vgroup_name: - is_bone_weight = True - break - if is_bone_weight: - continue - - if pattern_re is None: - # Translate user-friendly patterns to a regular expression: - # Join the whitespace-stripped, initially comma-separated - # entries to alternatives. Escape all characters except - # the star and replace that one with '.*?'. - pattern_re = '^(?:' + '|'.join( - map(lambda entry: - '.*?'.join(map(re.escape, entry.strip().split('*'))), - patterns_string.split(','))) + ')$' - - if not re.match(pattern_re, vgroup_name): - continue - - extra_vgroups.append((vgroup_name, vgroup_index)) - return extra_vgroups - - -@_mesh -def vertex_group_data(mesh, index): - """ - Return vertex group data for each vertex. Vertices not in the group - get a zero value. 
- - :param mesh: - :param index: - - """ - group_data = [] - for vertex in mesh.vertices: - weight = None - for group in vertex.groups: - if group.group == index: - weight = group.weight - group_data.append(weight or 0.0) - return group_data - - -@_mesh -def buffer_vertex_group_data(mesh, index): - """ - Return vertex group data for each deindexed vertex. Vertices not in the - group get a zero value. - - :param mesh: - :param index: - - """ - group_data = [] - for face in mesh.tessfaces: - for vertex_index in face.vertices: - vertex = mesh.vertices[vertex_index] - weight = None - for group in vertex.groups: - if group.group == index: - weight = group.weight - group_data.append(weight or 0.0) - return group_data - - -@_mesh -def faces(mesh, options, material_list=None): - """ - - :param mesh: - :param options: - :param material_list: (Default value = None) - - """ - logger.debug("mesh.faces(%s, %s, materials=%s)", - mesh, options, materials) - - material_list = material_list or [] - vertex_uv = len(mesh.uv_textures) > 0 - has_colors = len(mesh.vertex_colors) > 0 - logger.info("Has UVs = %s", vertex_uv) - logger.info("Has vertex colours = %s", has_colors) - - opt_colours = options[constants.COLORS] and has_colors - opt_uvs = options[constants.UVS] and vertex_uv - opt_materials = options.get(constants.FACE_MATERIALS) - opt_normals = options[constants.NORMALS] - logger.debug("Vertex colours enabled = %s", opt_colours) - logger.debug("UVS enabled = %s", opt_uvs) - logger.debug("Materials enabled = %s", opt_materials) - logger.debug("Normals enabled = %s", opt_normals) - - uv_indices = _uvs(mesh)[1] if opt_uvs else None - vertex_normals = _normals(mesh, options) if opt_normals else None - vertex_colours = vertex_colors(mesh) if opt_colours else None - - faces_data = [] - - colour_indices = {} - if vertex_colours: - logger.debug("Indexing colours") - for index, colour in enumerate(vertex_colours): - colour_indices[str(colour)] = index - - normal_indices = {} - - if 
vertex_normals: - logger.debug("Indexing normals") - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for index, normal in enumerate(vertex_normals): - normal = flip_axes(normal, XYZ) - normal_indices[str(normal)] = index - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for index, normal in enumerate(vertex_normals): - normal = flip_axes(normal, XYZ) - normal_indices[str(normal)] = index - - # using JSON Loader with skinned mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for index, normal in enumerate(vertex_normals): - normal = flip_axes(normal) - normal_indices[str(normal)] = index - - # using JSON Loader with static mesh - else: - - for index, normal in enumerate(vertex_normals): - normal = flip_axes(normal) - normal_indices[str(normal)] = index - - logger.info("Parsing %d faces", len(mesh.tessfaces)) - for face in mesh.tessfaces: - vert_count = len(face.vertices) - - if vert_count not in (3, 4): - logger.error("%d vertices for face %d detected", - vert_count, - face.index) - raise exceptions.NGonError("ngons are not supported") - - mat_index = face.material_index is not None and opt_materials - mask = { - constants.QUAD: vert_count is 4, - constants.MATERIALS: mat_index, - constants.UVS: False, - constants.NORMALS: False, - constants.COLORS: False - } - - face_data = [] - - face_data.extend([v for v in face.vertices]) - - if mask[constants.MATERIALS]: - for mat_index, mat in enumerate(material_list): - if mat[constants.DBG_INDEX] == face.material_index: - face_data.append(mat_index) - break - else: - logger.warning("Could not map the material index " - "for face %d" % face.index) - face_data.append(0) # default to index zero if there's a bad material - - if uv_indices: - for index, uv_layer in enumerate(uv_indices): - layer = mesh.tessface_uv_textures[index] - - for uv_data in layer.data[face.index].uv: - 
uv_tuple = (uv_data[0], uv_data[1]) - uv_index = uv_layer[str(uv_tuple)] - face_data.append(uv_index) - mask[constants.UVS] = True - - if vertex_normals: - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in face.vertices: - normal = mesh.vertices[vertex].normal - normal = flip_axes(normal, XZ_Y) if face.use_smooth else flip_axes(face.normal, XZ_Y) - face_data.append(normal_indices[str(normal)]) - mask[constants.NORMALS] = True - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex in face.vertices: - normal = mesh.vertices[vertex].normal - normal = flip_axes(normal, _XY_Z) if face.use_smooth else flip_axes(face.normal, _XY_Z) - face_data.append(normal_indices[str(normal)]) - mask[constants.NORMALS] = True - - # using JSON Loader with skinned mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in face.vertices: - normal = mesh.vertices[vertex].normal - normal = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - face_data.append(normal_indices[str(normal)]) - mask[constants.NORMALS] = True - - # using JSON Loader with static mesh - else: - - for vertex in face.vertices: - normal = mesh.vertices[vertex].normal - normal = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - face_data.append(normal_indices[str(normal)]) - mask[constants.NORMALS] = True - - - if vertex_colours: - colours = mesh.tessface_vertex_colors.active.data[face.index] - - for each in (colours.color1, colours.color2, colours.color3): - each = utilities.rgb2int(each) - face_data.append(colour_indices[str(each)]) - mask[constants.COLORS] = True - - if mask[constants.QUAD]: - colour = utilities.rgb2int(colours.color4) - face_data.append(colour_indices[str(colour)]) - - face_data.insert(0, utilities.bit_mask(mask)) - faces_data.extend(face_data) - - return faces_data - - -@_mesh -def 
morph_targets(mesh, options): - """ - - :param mesh: - :param options: - - """ - logger.debug("mesh.morph_targets(%s, %s)", mesh, options) - obj = object_.objects_using_mesh(mesh)[0] - original_frame = context.scene.frame_current - frame_step = options.get(constants.FRAME_STEP, 1) - scene_frames = range(context.scene.frame_start, - context.scene.frame_end+1, - frame_step) - - morphs = [] - - for frame in scene_frames: - logger.info("Processing data at frame %d", frame) - context.scene.frame_set(frame, 0.0) - morphs.append([]) - vertices_ = object_.extract_mesh(obj, options).vertices[:] - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in vertices_: - morphs[-1].extend(flip_axes(vertex.co, XZ_Y)) - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex in vertices_: - morphs[-1].extend(flip_axes(vertex.co, _XY_Z)) - - # using JSON Loader with skinned mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in vertices_: - morphs[-1].extend(flip_axes(vertex.co)) - - # using JSON Loader with static mesh - else: - - for vertex in vertices_: - morphs[-1].extend(flip_axes(vertex.co)) - - context.scene.frame_set(original_frame, 0.0) - morphs_detected = False - for index, each in enumerate(morphs): - if index is 0: - continue - morphs_detected = morphs[index-1] != each - if morphs_detected: - logger.info("Valid morph target data detected") - break - else: - logger.info("No valid morph data detected") - return [] - - manifest = [] - for index, morph in enumerate(morphs): - manifest.append({ - constants.NAME: 'animation_%06d' % index, - constants.VERTICES: morph - }) - - return manifest - -@_mesh -def blend_shapes(mesh, options): - """ - - :param mesh: - :param options: - - """ - logger.debug("mesh.blend_shapes(%s, %s)", mesh, options) - manifest = [] - if mesh.shape_keys: - logger.info("mesh.blend_shapes -- 
there's shape keys") - key_blocks = mesh.shape_keys.key_blocks - for key in key_blocks.keys()[1:]: # skip "Basis" - logger.info("mesh.blend_shapes -- key %s", key) - morph = [] - for d in key_blocks[key].data: - co = d.co - morph.extend([co.x, co.y, co.z]) - manifest.append({ - constants.NAME: key, - constants.VERTICES: morph - }) - else: - logger.debug("No valid blend_shapes detected") - return manifest - -@_mesh -def animated_blend_shapes(mesh, name, options): - """ - - :param mesh: - :param options: - - """ - - # let filter the name to only keep the node's name - # the two cases are '%sGeometry' and '%sGeometry.%d', and we want %s - name = re.search("^(.*)Geometry(\..*)?$", name).group(1) - - logger.debug("mesh.animated_blend_shapes(%s, %s)", mesh, options) - tracks = [] - shp = mesh.shape_keys - animCurves = shp.animation_data - if animCurves: - animCurves = animCurves.action.fcurves - - for key in shp.key_blocks.keys()[1:]: # skip "Basis" - key_name = name+".morphTargetInfluences["+key+"]" - found_animation = False - data_path = 'key_blocks["'+key+'"].value' - values = [] - if animCurves: - for fcurve in animCurves: - if fcurve.data_path == data_path: - for xx in fcurve.keyframe_points: - values.append({ "time": xx.co.x, "value": xx.co.y }) - found_animation = True - break # no need to continue - - if found_animation: - tracks.append({ - constants.NAME: key_name, - constants.TYPE: "number", - constants.KEYS: values - }); - - return tracks - -@_mesh -def materials(mesh, options): - """ - - :param mesh: - :param options: - - """ - logger.debug("mesh.materials(%s, %s)", mesh, options) - - # sanity check - if not mesh.materials: - return [] - - indices = [] - - #manthrax: Disable the following logic that attempts to find only the used materials on this mesh - #for face in mesh.tessfaces: - # if face.material_index not in indices: - # indices.append(face.material_index) - # instead, export all materials on this object... 
they are probably there for a good reason, even if they aren't referenced by the geometry at present... - for index in range(len( mesh.materials )): - indices.append(index) - - - material_sets = [(mesh.materials[index], index) for index in indices] - materials_ = [] - - maps = options.get(constants.MAPS) - - mix = options.get(constants.MIX_COLORS) - use_colors = options.get(constants.COLORS) - logger.info("Colour mix is set to %s", mix) - logger.info("Vertex colours set to %s", use_colors) - - for mat, index in material_sets: - if mat == None: # undefined material for a specific index is skipped - continue - - try: - dbg_color = constants.DBG_COLORS[index] - except IndexError: - dbg_color = constants.DBG_COLORS[0] - - logger.info("Compiling attributes for %s", mat.name) - attributes = { - constants.COLOR_EMISSIVE: material.emissive_color(mat), - constants.SHADING: material.shading(mat), - constants.OPACITY: material.opacity(mat), - constants.TRANSPARENT: material.transparent(mat), - constants.VISIBLE: material.visible(mat), - constants.WIREFRAME: material.wireframe(mat), - constants.BLENDING: material.blending(mat), - constants.DEPTH_TEST: material.depth_test(mat), - constants.DEPTH_WRITE: material.depth_write(mat), - constants.DOUBLE_SIDED: material.double_sided(mat), - constants.DBG_NAME: mat.name, - constants.DBG_COLOR: dbg_color, - constants.DBG_INDEX: index - } - - if use_colors: - colors = material.use_vertex_colors(mat) - attributes[constants.VERTEX_COLORS] = colors - - if (use_colors and mix) or (not use_colors): - colors = material.diffuse_color(mat) - attributes[constants.COLOR_DIFFUSE] = colors - - if attributes[constants.SHADING] == constants.PHONG: - logger.info("Adding specular attributes") - attributes.update({ - constants.SPECULAR_COEF: material.specular_coef(mat), - constants.COLOR_SPECULAR: material.specular_color(mat) - }) - - if mesh.show_double_sided: - logger.info("Double sided is on") - attributes[constants.DOUBLE_SIDED] = True - - 
materials_.append(attributes) - - if not maps: - continue - - diffuse = _diffuse_map(mat) - if diffuse: - logger.info("Diffuse map found") - attributes.update(diffuse) - - light = _light_map(mat) - if light: - logger.info("Light map found") - attributes.update(light) - - specular = _specular_map(mat) - if specular: - logger.info("Specular map found") - attributes.update(specular) - - if attributes[constants.SHADING] == constants.PHONG: - normal = _normal_map(mat) - if normal: - logger.info("Normal map found") - attributes.update(normal) - - bump = _bump_map(mat) - if bump: - logger.info("Bump map found") - attributes.update(bump) - - return materials_ - - -@_mesh -def normals(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - logger.debug("mesh.normals(%s)", mesh) - normal_vectors = [] - - for vector in _normals(mesh, options): - normal_vectors.extend(vector) - - return normal_vectors - - -@_mesh -def skin_weights(mesh, bone_map, influences, anim_type): - """ - - :param mesh: - :param bone_map: - :param influences: - :param anim_type - - """ - logger.debug("mesh.skin_weights(%s)", mesh) - return _skinning_data(mesh, bone_map, influences, anim_type, 1) - - -@_mesh -def skin_indices(mesh, bone_map, influences, anim_type): - """ - - :param mesh: - :param bone_map: - :param influences: - :param anim_type - - """ - logger.debug("mesh.skin_indices(%s)", mesh) - return _skinning_data(mesh, bone_map, influences, anim_type, 0) - - -@_mesh -def texture_registration(mesh): - """ - - :param mesh: - - """ - logger.debug("mesh.texture_registration(%s)", mesh) - materials_ = mesh.materials or [] - registration = {} - - funcs = ( - (constants.MAP_DIFFUSE, material.diffuse_map), - (constants.SPECULAR_MAP, material.specular_map), - (constants.LIGHT_MAP, material.light_map), - (constants.BUMP_MAP, material.bump_map), - (constants.NORMAL_MAP, material.normal_map) - ) - - def _registration(file_path, file_name): - """ - - :param file_path: - :param file_name: - - """ - 
return { - 'file_path': file_path, - 'file_name': file_name, - 'maps': [] - } - - logger.info("found %d materials", len(materials_)) - for mat in materials_: - for (key, func) in funcs: - tex = func(mat) - if tex is None: - continue - - logger.info("%s has texture %s", key, tex.name) - file_path = texture.file_path(tex) - file_name = texture.file_name(tex) - - reg = registration.setdefault( - utilities.hash(file_path), - _registration(file_path, file_name)) - - reg["maps"].append(key) - - return registration - - -@_mesh -def uvs(mesh): - """ - - :param mesh: - :rtype: [] - - """ - logger.debug("mesh.uvs(%s)", mesh) - uvs_ = [] - for layer in _uvs(mesh)[0]: - uvs_.append([]) - logger.info("Parsing UV layer %d", len(uvs_)) - for pair in layer: - uvs_[-1].extend(pair) - return uvs_ - - -@_mesh -def vertex_colors(mesh): - """ - - :param mesh: - - """ - logger.debug("mesh.vertex_colors(%s)", mesh) - vertex_colours = [] - - try: - vertex_colour = mesh.tessface_vertex_colors.active.data - except AttributeError: - logger.info("No vertex colours found") - return - - for face in mesh.tessfaces: - - colours = (vertex_colour[face.index].color1, - vertex_colour[face.index].color2, - vertex_colour[face.index].color3, - vertex_colour[face.index].color4) - - for colour in colours: - colour = utilities.rgb2int((colour.r, colour.g, colour.b)) - - if colour not in vertex_colours: - vertex_colours.append(colour) - - return vertex_colours - - -@_mesh -def vertices(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - logger.debug("mesh.vertices(%s)", mesh) - vertices_ = [] - - # using Object Loader with skinned mesh - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in mesh.vertices: - vertices_.extend(flip_axes(vertex.co, XZ_Y)) - - # using Object Loader with static mesh - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex in mesh.vertices: - vertices_.extend(flip_axes(vertex.co, _XY_Z)) - - # using JSON Loader with skinned 
mesh - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for vertex in mesh.vertices: - vertices_.extend(flip_axes(vertex.co)) - - # using JSON Loader with static mesh - else: - - for vertex in mesh.vertices: - vertices_.extend(flip_axes(vertex.co)) - - return vertices_ - - -def _normal_map(mat): - """ - - :param mat: - - """ - tex = material.normal_map(mat) - if tex is None: - return - - logger.info("Found normal texture map %s", tex.name) - - normal = { - constants.MAP_NORMAL: - texture.file_name(tex), - constants.MAP_NORMAL_FACTOR: - material.normal_scale(mat), - constants.MAP_NORMAL_ANISOTROPY: - texture.anisotropy(tex), - constants.MAP_NORMAL_WRAP: texture.wrap(tex), - constants.MAP_NORMAL_REPEAT: texture.repeat(tex) - } - - return normal - - -def _bump_map(mat): - """ - - :param mat: - - """ - tex = material.bump_map(mat) - if tex is None: - return - - logger.info("Found bump texture map %s", tex.name) - - bump = { - constants.MAP_BUMP: - texture.file_name(tex), - constants.MAP_BUMP_ANISOTROPY: - texture.anisotropy(tex), - constants.MAP_BUMP_WRAP: texture.wrap(tex), - constants.MAP_BUMP_REPEAT: texture.repeat(tex), - constants.MAP_BUMP_SCALE: - material.bump_scale(mat), - } - - return bump - - -def _specular_map(mat): - """ - - :param mat: - - """ - tex = material.specular_map(mat) - if tex is None: - return - - logger.info("Found specular texture map %s", tex.name) - - specular = { - constants.MAP_SPECULAR: - texture.file_name(tex), - constants.MAP_SPECULAR_ANISOTROPY: - texture.anisotropy(tex), - constants.MAP_SPECULAR_WRAP: texture.wrap(tex), - constants.MAP_SPECULAR_REPEAT: texture.repeat(tex) - } - - return specular - - -def _light_map(mat): - """ - - :param mat: - - """ - tex = material.light_map(mat) - if tex is None: - return - - logger.info("Found light texture map %s", tex.name) - - light = { - constants.MAP_LIGHT: - texture.file_name(tex), - constants.MAP_LIGHT_ANISOTROPY: - texture.anisotropy(tex), - constants.MAP_LIGHT_WRAP: 
texture.wrap(tex), - constants.MAP_LIGHT_REPEAT: texture.repeat(tex) - } - - return light - - -def _diffuse_map(mat): - """ - - :param mat: - - """ - tex = material.diffuse_map(mat) - if tex is None: - return - - logger.info("Found diffuse texture map %s", tex.name) - - diffuse = { - constants.MAP_DIFFUSE: - texture.file_name(tex), - constants.MAP_DIFFUSE_ANISOTROPY: - texture.anisotropy(tex), - constants.MAP_DIFFUSE_WRAP: texture.wrap(tex), - constants.MAP_DIFFUSE_REPEAT: texture.repeat(tex) - } - - return diffuse - - -def _normals(mesh, options): - """ - - :param mesh: - :rtype: [] - - """ - vectors = [] - - vectors_ = {} - for face in mesh.tessfaces: - - if options.get(constants.SCENE, True) and _armature(mesh): - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal, XZ_Y) if face.use_smooth else flip_axes(face.normal, XZ_Y) - - str_vec = str(vector) - try: - vectors_[str_vec] - except KeyError: - vectors.append(vector) - vectors_[str_vec] = True - - elif options.get(constants.SCENE, True) and not _armature(mesh): - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal,_XY_Z) if face.use_smooth else flip_axes(face.normal,_XY_Z) - - str_vec = str(vector) - try: - vectors_[str_vec] - except KeyError: - vectors.append(vector) - vectors_[str_vec] = True - - elif not options.get(constants.SCENE, True) and _armature(mesh): - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - - str_vec = str(vector) - try: - vectors_[str_vec] - except KeyError: - vectors.append(vector) - vectors_[str_vec] = True - - else: - - for vertex_index in face.vertices: - normal = mesh.vertices[vertex_index].normal - vector = flip_axes(normal) if face.use_smooth else flip_axes(face.normal) - - str_vec = str(vector) - try: - vectors_[str_vec] - except KeyError: - 
vectors.append(vector) - vectors_[str_vec] = True - - return vectors - - -def _uvs(mesh): - """ - - :param mesh: - :rtype: [[], ...], [{}, ...] - - """ - uv_layers = [] - uv_indices = [] - - for layer in mesh.uv_layers: - uv_layers.append([]) - uv_indices.append({}) - index = 0 - - for uv_data in layer.data: - uv_tuple = (uv_data.uv[0], uv_data.uv[1]) - uv_key = str(uv_tuple) - - try: - uv_indices[-1][uv_key] - except KeyError: - uv_indices[-1][uv_key] = index - uv_layers[-1].append(uv_tuple) - index += 1 - - return uv_layers, uv_indices - - -def _armature(mesh): - """ - - :param mesh: - - """ - obj = object_.objects_using_mesh(mesh)[0] - armature = obj.find_armature() - - #manthrax: Remove logging spam. This was spamming on every vertex... - #if armature: - # logger.info("Found armature %s for %s", armature.name, obj.name) - #else: - # logger.info("Found no armature for %s", obj.name) - return armature - - -def _skinning_data(mesh, bone_map, influences, anim_type, array_index): - """ - - :param mesh: - :param bone_map: - :param influences: - :param array_index: - :param anim_type - - """ - armature = _armature(mesh) - manifest = [] - if not armature: - return manifest - - # armature bones here based on type - if anim_type == constants.OFF or anim_type == constants.REST: - armature_bones = armature.data.bones - else: - # POSE mode - armature_bones = armature.pose.bones - - obj = object_.objects_using_mesh(mesh)[0] - logger.debug("Skinned object found %s", obj.name) - - for vertex in mesh.vertices: - bone_array = [] - for group in vertex.groups: - bone_array.append((group.group, group.weight)) - - bone_array.sort(key=operator.itemgetter(1), reverse=True) - - for index in range(influences): - if index >= len(bone_array): - manifest.append(0) - continue - name = obj.vertex_groups[bone_array[index][0]].name - for bone_index, bone in enumerate(armature_bones): - if bone.name != name: - continue - if array_index is 0: - entry = bone_map.get(bone_index, -1) - else: - 
entry = bone_array[index][1] - - manifest.append(entry) - break - else: - manifest.append(0) - - return manifest - - -def _pose_bones(armature): - """ - - :param armature: - :rtype: [], {} - - """ - bones_ = [] - bone_map = {} - bone_count = 0 - - armature_matrix = armature.matrix_world - for bone_count, pose_bone in enumerate(armature.pose.bones): - armature_bone = pose_bone.bone - bone_index = None - - if armature_bone.parent is None: - bone_matrix = armature_matrix * armature_bone.matrix_local - bone_index = -1 - else: - parent_bone = armature_bone.parent - parent_matrix = armature_matrix * parent_bone.matrix_local - bone_matrix = armature_matrix * armature_bone.matrix_local - bone_matrix = parent_matrix.inverted() * bone_matrix - bone_index = index = 0 - - for pose_parent in armature.pose.bones: - armature_parent = pose_parent.bone.name - if armature_parent == parent_bone.name: - bone_index = index - index += 1 - - bone_map[bone_count] = bone_count - - pos, rot, scl = bone_matrix.decompose() - bones_.append({ - constants.PARENT: bone_index, - constants.NAME: armature_bone.name, - - constants.POS: (pos.x, pos.z, -pos.y), - constants.ROTQ: (rot.x, rot.z, -rot.y, rot.w), - constants.SCL: (scl.x, scl.z, scl.y) - }) - - return bones_, bone_map - - -def _rest_bones(armature): - """ - - :param armature: - :rtype: [], {} - - """ - bones_ = [] - bone_map = {} - bone_count = 0 - bone_index_rel = 0 - - for bone in armature.data.bones: - logger.info("Parsing bone %s", bone.name) - - if not bone.use_deform: - logger.debug("Ignoring bone %s at: %d", - bone.name, bone_index_rel) - continue - - if bone.parent is None: - bone_pos = bone.head_local - logger.debug("Root bone:%s",str(bone_pos)) - bone_index = -1 - else: - bone_pos = bone.head_local - bone.parent.head_local - logger.debug("Child bone:%s",str(bone_pos)) - - bone_index = 0 - index = 0 - for parent in armature.data.bones: - if parent.name == bone.parent.name: - bone_index = bone_map.get(index) - index += 1 - - 
        bone_world_pos = armature.matrix_world * bone_pos
-
-        x_axis = bone_world_pos.x
-        y_axis = bone_world_pos.z
-        z_axis = -bone_world_pos.y
-
-        logger.debug("Bone pos:%s",str(bone_world_pos))
-
-        logger.debug("Adding bone %s at: %s, %s",
-                     bone.name, bone_index, bone_index_rel)
-        bone_map[bone_count] = bone_index_rel
-        bone_index_rel += 1
-        # @TODO: the rotq probably should not have these
-        # hard coded values
-        bones_.append({
-            constants.PARENT: bone_index,
-            constants.NAME: bone.name,
-            constants.POS: (x_axis, y_axis, z_axis),
-            constants.ROTQ: (0, 0, 0, 1)
-        })
-
-        bone_count += 1
-
-    return (bones_, bone_map)
diff --git a/utils/exporters/blender/addons/io_three/exporter/api/object.py b/utils/exporters/blender/addons/io_three/exporter/api/object.py
deleted file mode 100644
index 23b84dfef0a02c9abe2fb1be8a41be0bc8d69124..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/api/object.py
+++ /dev/null
@@ -1,747 +0,0 @@
-import math
-import mathutils
-import bpy
-from bpy import data, context, types
-from bpy_extras.io_utils import axis_conversion
-from .. import constants, logger, utilities, exceptions
-from .constants import (
-    MESH,
-    EMPTY,
-    ARMATURE,
-    LAMP,
-    AREA,
-    SPOT,
-    SUN,
-    POINT,
-    HEMI,
-    CAMERA,
-    PERSP,
-    ORTHO,
-    RENDER,
-    NO_SHADOW,
-    ZYX
-)
-# TODO: RectAreaLight support
-
-
-# Blender doesn't seem to have a good way to link a mesh back to the
-# objects that are instancing it, or it is bloody obvious and I haven't
-# discovered yet. This manifest serves as a way for me to map a mesh
-# node to the object nodes that are using it.
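The manifest described in the comment above is just a reverse index from a mesh datablock to the objects instancing it, built lazily as meshes are encountered. Stripped of `bpy`, the pattern is a plain `dict.setdefault`; this sketch uses strings in place of Blender datablocks, so the names are purely illustrative:

```python
# Minimal sketch of the mesh -> objects manifest (the _MESH_MAP idea),
# using plain strings instead of bpy datablocks.
_MESH_MAP = {}

def register(mesh_name, obj_name):
    # setdefault creates the manifest entry the first time a mesh is seen
    _MESH_MAP.setdefault(mesh_name, []).append(obj_name)

def objects_using_mesh(mesh_name):
    # reverse lookup: which objects instance this mesh
    return _MESH_MAP.get(mesh_name, [])

register("Cube", "Cube.001")
register("Cube", "Cube.002")
print(objects_using_mesh("Cube"))  # -> ['Cube.001', 'Cube.002']
```

The removed exporter cleared this map on every export run (`clear_mesh_map`), since stale entries from a previous scene would otherwise leak between exports.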
-_MESH_MAP = {} - - -def _object(func): - """ - - :param func: - - """ - - def inner(arg, *args, **kwargs): - """ - - :param arg: - :param *args: - :param **kwargs: - - """ - - if isinstance(arg, types.Object): - obj = arg - else: - obj = data.objects[arg] - - return func(obj, *args, **kwargs) - - return inner - - -def clear_mesh_map(): - """Clears the mesh map, required on initialization""" - _MESH_MAP.clear() - - -def assemblies(valid_types, options): - """ - - :param valid_types: - :param options: - - """ - logger.debug('object.assemblies(%s)', valid_types) - for obj in data.objects: - - # rigged assets are parented under armature nodes - if obj.parent and obj.parent.type != ARMATURE: - continue - if obj.parent and obj.parent.type == ARMATURE: - logger.info('Has armature parent %s', obj.name) - if _valid_node(obj, valid_types, options): - yield obj.name - - -@_object -def cast_shadow(obj): - """ - - :param obj: - - """ - logger.debug('object.cast_shadow(%s)', obj) - if obj.type == LAMP: - if obj.data.type in (SPOT, SUN): - ret = obj.data.shadow_method != NO_SHADOW - else: - logger.info('%s is a lamp but this lamp type does not '\ - 'have supported shadows in ThreeJS', obj.name) - ret = None - return ret - elif obj.type == MESH: - mats = material(obj) - if mats: - for m in mats: - if data.materials[m].use_cast_shadows: - return True - return False - - -@_object -def children(obj, valid_types): - """ - - :param obj: - :param valid_types: - - """ - logger.debug('object.children(%s, %s)', obj, valid_types) - for child in obj.children: - if child.type in valid_types and child.THREE_export: - yield child.name - - -@_object -def material(obj): - """ - - :param obj: - - """ - logger.debug('object.material(%s)', obj) - - try: - matName = obj.material_slots[0].name # manthrax: Make this throw an error on an empty material array, resulting in non-material - return [o.name for o in obj.material_slots] - except IndexError: - pass - -def extract_time(fcurves, start_index): - 
    time = []
-    for xx in fcurves[start_index].keyframe_points:
-        time.append(xx.co.x)
-    return time
-
-def merge_sorted_lists(l1, l2):
-    sorted_list = []
-    l1 = l1[:]
-    l2 = l2[:]
-    while (l1 and l2):
-        h1 = l1[0]
-        h2 = l2[0]
-        if h1 == h2:
-            sorted_list.append(h1)
-            l1.pop(0)
-            l2.pop(0)
-        elif h1 < h2:
-            l1.pop(0)
-            sorted_list.append(h1)
-        else:
-            l2.pop(0)
-            sorted_list.append(h2)
-    # Add the remaining of the lists
-    sorted_list.extend(l1 if l1 else l2)
-    return sorted_list
-
-def appendVec3(track, time, vec3):
-    track.append({ "time": time, "value": [ vec3.x, vec3.y, vec3.z ] })
-
-def appendQuat(track, time, quat):
-    track.append({ "time": time, "value": [ quat.x, quat.y, quat.z, quat.w ] })
-
-# trackable transform fields ( , )
-TRACKABLE_FIELDS = {
-    "location": ( ".position", 3, "vector3" ),
-    "scale": ( ".scale", 3, "vector3" ),
-    "rotation_euler": ( ".rotation", 3, "vector3" ),
-    "rotation_quaternion": ( ".quaternion", 4, "quaternion" )
-}
-EXPORTED_TRACKABLE_FIELDS = [ "location", "scale", "rotation_quaternion" ]
-
-@_object
-def animated_xform(obj, options):
-    if obj.animation_data is None:
-        return []
-    fcurves = obj.animation_data
-    if not fcurves:
-        return []
-    if fcurves.action is None:
-        return []
-    fcurves = fcurves.action.fcurves
-
-    objName = obj.name
-
-    tracks = []
-    i = 0
-    nb_curves = len(fcurves)
-
-    # extract unique frames
-    times = None
-    while i < nb_curves:
-        field_info = TRACKABLE_FIELDS.get(fcurves[i].data_path)
-        if field_info:
-            newTimes = extract_time(fcurves, i)
-            times = merge_sorted_lists(times, newTimes) if times else newTimes # merge list
-            i += field_info[1]
-        else:
-            i += 1
-
-    # init tracks
-    track_loc = []
-    for fld in EXPORTED_TRACKABLE_FIELDS:
-        field_info = TRACKABLE_FIELDS[fld]
-        track = []
-        track_loc.append(track)
-        tracks.append({
-            constants.NAME: objName+field_info[0],
-            constants.TYPE: field_info[2],
-            constants.KEYS: track
-        })
-
-    # track arrays
-    track_sca = track_loc[1]
-    track_qua = track_loc[2]
-    track_loc =
track_loc[0] - use_inverted = options.get(constants.HIERARCHY, False) and obj.parent - - if times == None: - logger.info("In animated xform: Unable to extract trackable fields from %s", objName) - return tracks - - # for each frame - inverted_fallback = mathutils.Matrix() if use_inverted else None - convert_matrix = AXIS_CONVERSION # matrix to convert the exported matrix - original_frame = context.scene.frame_current - - if options.get(constants.BAKE_KEYFRAMES): - frame_step = options.get(constants.FRAME_STEP, 1) - logger.info("Baking keyframes, frame_step=%d", frame_step) - times = range(context.scene.frame_start, context.scene.frame_end+1, frame_step) - - for time in times: - context.scene.frame_set(time, 0.0) - if use_inverted: # need to use the inverted, parent matrix might have chance - convert_matrix = obj.parent.matrix_world.inverted(inverted_fallback) - wm = convert_matrix * obj.matrix_world - appendVec3(track_loc, time, wm.to_translation()) - appendVec3(track_sca, time, wm.to_scale() ) - appendQuat(track_qua, time, wm.to_quaternion() ) - context.scene.frame_set(original_frame, 0.0) # restore to original frame - - # TODO: remove duplicated key frames - return tracks - -@_object -def custom_properties(obj): - """ - - :param obj: - - """ - logger.debug('object.custom_properties(%s)', obj) - # Grab any properties except those marked private (by underscore - # prefix) or those with types that would be rejected by the JSON - # serializer object model. 
- return {K: obj[K] for K in obj.keys() if K[:1] != '_' and isinstance(obj[K], constants.VALID_DATA_TYPES)} # 'Empty' Blender objects do not use obj.data.items() for custom properties, using obj.keys() - -@_object -def mesh(obj, options): - """ - - :param obj: - :param options: - - """ - logger.debug('object.mesh(%s, %s)', obj, options) - if obj.type != MESH: - return - - for mesh_, objects in _MESH_MAP.items(): - if obj in objects: - return mesh_ - else: - logger.debug('Could not map object, updating manifest') - mesh_ = extract_mesh(obj, options) - if len(mesh_.tessfaces) is not 0: - manifest = _MESH_MAP.setdefault(mesh_.name, []) - manifest.append(obj) - mesh_name = mesh_.name - else: - # possibly just being used as a controller - logger.info('Object %s has no faces', obj.name) - mesh_name = None - - return mesh_name - - -@_object -def name(obj): - """ - - :param obj: - - """ - return obj.name - - -@_object -def node_type(obj): - """ - - :param obj: - - """ - logger.debug('object.node_type(%s)', obj) - # standard transformation nodes are inferred - if obj.type == MESH: - return constants.MESH.title() - elif obj.type == EMPTY: - return constants.OBJECT.title() - - # TODO: RectAreaLight support - dispatch = { - LAMP: { - POINT: constants.POINT_LIGHT, - SUN: constants.DIRECTIONAL_LIGHT, - SPOT: constants.SPOT_LIGHT, - AREA: constants.RECT_AREA_LIGHT, - HEMI: constants.HEMISPHERE_LIGHT - }, - CAMERA: { - PERSP: constants.PERSPECTIVE_CAMERA, - ORTHO: constants.ORTHOGRAPHIC_CAMERA - } - } - try: - return dispatch[obj.type][obj.data.type] - except AttributeError: - msg = 'Invalid type: %s' % obj.type - raise exceptions.UnsupportedObjectType(msg) - - -def nodes(valid_types, options): - """ - - :param valid_types: - :param options: - - """ - for obj in data.objects: - if _valid_node(obj, valid_types, options): - yield obj.name - -@_object -def position(obj, options): - """ - - :param obj: - :param options: - - """ - logger.debug('object.position(%s)', obj) - vector = 
matrix(obj, options).to_translation() - return (vector.x, vector.y, vector.z) - - -@_object -def receive_shadow(obj): - """ - - :param obj: - - """ - if obj.type == MESH: - mats = material(obj) - if mats: - for m in mats: - if data.materials[m].use_shadows: - return True - return False - -AXIS_CONVERSION = axis_conversion(to_forward='Z', to_up='Y').to_4x4() - -@_object -def matrix(obj, options): - """ - - :param obj: - :param options: - - """ - logger.debug('object.matrix(%s)', obj) - if options.get(constants.HIERARCHY, False) and obj.parent: - parent_inverted = obj.parent.matrix_world.inverted(mathutils.Matrix()) - return parent_inverted * obj.matrix_world - else: - return AXIS_CONVERSION * obj.matrix_world - - -@_object -def rotation(obj, options): - """ - - :param obj: - :param options: - - """ - logger.debug('object.rotation(%s)', obj) - vector = matrix(obj, options).to_euler(ZYX) - return (vector.x, vector.y, vector.z) - - -@_object -def scale(obj, options): - """ - - :param obj: - :param options: - - """ - logger.debug('object.scale(%s)', obj) - vector = matrix(obj, options).to_scale() - return (vector.x, vector.y, vector.z) - - -@_object -def select(obj): - """ - - :param obj: - - """ - obj.select = True - - -@_object -def unselect(obj): - """ - - :param obj: - - """ - obj.select = False - - -@_object -def visible(obj): - """ - - :param obj: - - """ - logger.debug('object.visible(%s)', obj) - return obj.is_visible(context.scene) - - -def extract_mesh(obj, options, recalculate=False): - """ - - :param obj: - :param options: - :param recalculate: (Default value = False) - - """ - logger.debug('object.extract_mesh(%s, %s)', obj, options) - bpy.context.scene.objects.active = obj - hidden_state = obj.hide - obj.hide = False - - apply_modifiers = options.get(constants.APPLY_MODIFIERS, True) - if apply_modifiers: - bpy.ops.object.mode_set(mode='OBJECT') - mesh_node = obj.to_mesh(context.scene, apply_modifiers, RENDER) - - # transfer the geometry type to the 
extracted mesh
-    mesh_node.THREE_geometry_type = obj.data.THREE_geometry_type
-
-    # now determine whether or not to export using the geometry type
-    # set globally from the exporter's options or to use the local
-    # override on the mesh node itself
-    opt_buffer = options.get(constants.GEOMETRY_TYPE)
-    opt_buffer = opt_buffer == constants.BUFFER_GEOMETRY
-    prop_buffer = mesh_node.THREE_geometry_type == constants.BUFFER_GEOMETRY
-
-    # if doing buffer geometry it is imperative to triangulate the mesh
-    if opt_buffer or prop_buffer:
-        original_mesh = obj.data
-        obj.data = mesh_node
-        logger.debug('swapped %s for %s',
-                     original_mesh.name,
-                     mesh_node.name)
-
-        bpy.ops.object.mode_set(mode='OBJECT')
-        obj.select = True
-        bpy.context.scene.objects.active = obj
-        logger.info('Applying triangulation to %s', obj.data.name)
-        bpy.ops.object.modifier_add(type='TRIANGULATE')
-        bpy.ops.object.modifier_apply(apply_as='DATA',
-                                      modifier='Triangulate')
-        obj.data = original_mesh
-        obj.select = False
-
-    # split sharp edges
-    original_mesh = obj.data
-    obj.data = mesh_node
-    obj.select = True
-
-    logger.info("Applying EDGE_SPLIT modifier....")
-    bpy.ops.object.modifier_add(type='EDGE_SPLIT')
-    bpy.context.object.modifiers['EdgeSplit'].use_edge_angle = False
-    bpy.context.object.modifiers['EdgeSplit'].use_edge_sharp = True
-    bpy.ops.object.modifier_apply(apply_as='DATA', modifier='EdgeSplit')
-
-    obj.hide = hidden_state
-    obj.select = False
-    obj.data = original_mesh
-
-    # recalculate the normals to face outwards; this is usually
-    # best after applying modifiers, especially for something
-    # like the mirror modifier
-    if recalculate:
-        logger.info('Recalculating normals')
-        original_mesh = obj.data
-        obj.data = mesh_node
-
-        bpy.context.scene.objects.active = obj
-        bpy.ops.object.mode_set(mode='EDIT')
-        bpy.ops.mesh.select_all(action='SELECT')
-        bpy.ops.mesh.normals_make_consistent()
-        bpy.ops.object.editmode_toggle()
-
-        obj.data = original_mesh
-
-    if not options.get(constants.SCENE):
xrot = mathutils.Matrix.Rotation(-math.pi/2, 4, 'X')
-        mesh_node.transform(xrot * obj.matrix_world)
-
-    # blend shapes
-    if options.get(constants.BLEND_SHAPES) and not options.get(constants.MORPH_TARGETS):
-        original_mesh = obj.data
-        if original_mesh.shape_keys:
-            logger.info('Using blend shapes')
-            obj.data = mesh_node  # swap to be able to add the shape keys
-            shp = original_mesh.shape_keys
-
-            animCurves = shp.animation_data
-            if animCurves:
-                animCurves = animCurves.action.fcurves
-
-            src_kbs = shp.key_blocks
-            for key in src_kbs.keys():
-                logger.info("-- Parsing key %s", key)
-                obj.shape_key_add(name=key, from_mix=False)
-                src_kb = src_kbs[key].data
-                if key == 'Basis':
-                    dst_kb = mesh_node.vertices
-                else:
-                    dst_kb = mesh_node.shape_keys.key_blocks[key].data
-                for idx in range(len(src_kb)):
-                    dst_kb[idx].co = src_kb[idx].co
-
-                if animCurves:
-                    data_path = 'key_blocks["'+key+'"].value'
-                    for fcurve in animCurves:
-                        if fcurve.data_path == data_path:
-                            dst_kb = mesh_node.shape_keys.key_blocks[key]
-                            for xx in fcurve.keyframe_points:
-                                dst_kb.value = xx.co.y
-                                dst_kb.keyframe_insert("value", frame=xx.co.x)
-                            break  # no need to continue the loop
-            obj.data = original_mesh
-
-    # now generate a unique name
-    index = 0
-    while True:
-        if index == 0:
-            mesh_name = '%sGeometry' % obj.data.name
-        else:
-            mesh_name = '%sGeometry.%d' % (obj.data.name, index)
-        try:
-            data.meshes[mesh_name]
-            index += 1
-        except KeyError:
-            break
-    mesh_node.name = mesh_name
-
-    mesh_node.update(calc_tessface=True)
-    mesh_node.calc_normals()
-    mesh_node.calc_tessface()
-    scale_ = options.get(constants.SCALE, 1)
-    mesh_node.transform(mathutils.Matrix.Scale(scale_, 4))
-
-    return mesh_node
-
-
-def objects_using_mesh(mesh_node):
-    """
-
-    :param mesh_node:
-    :return: list of object names
-
-    """
-    #manthrax: remove spam
-    #logger.debug('object.objects_using_mesh(%s)', mesh_node)
-    for mesh_name, objects in _MESH_MAP.items():
-        if mesh_name == mesh_node.name:
-            return objects
else:
-        logger.warning('Could not find mesh mapping')
-
-
-def prep_meshes(options):
-    """Prep the mesh nodes. Preparation includes identifying:
-        - nodes that are on visible layers
-        - nodes that have export disabled
-        - nodes that have modifiers that need to be applied
-
-    :param options:
-
-    """
-    logger.debug('object.prep_meshes(%s)', options)
-    mapping = {}
-
-    visible_layers = _visible_scene_layers()
-
-    for obj in data.objects:
-        if obj.type != MESH:
-            continue
-
-        # this is ideal for skipping controller or proxy nodes
-        # that may apply to a Blender scene but not a three.js scene
-        if not _on_visible_layer(obj, visible_layers):
-            logger.info('%s is not on a visible layer', obj.name)
-            continue
-
-        # if someone really insists on a visible node not being exportable
-        if not obj.THREE_export:
-            logger.info('%s export is disabled', obj.name)
-            continue
-
-        # need to apply modifiers before moving on, and before
-        # handling instancing. it is possible for two or more objects
-        # to instance the same mesh but not all use the same modifiers.
-        # this logic identifies the object with modifiers and extracts
-        # the mesh, making the mesh unique to this particular object
-        if len(obj.modifiers):
-            logger.info('%s has modifiers', obj.name)
-            mesh_node = extract_mesh(obj, options, recalculate=True)
-            _MESH_MAP[mesh_node.name] = [obj]
-            continue
-
-        logger.info('adding mesh %s.%s to prep',
-                    obj.name, obj.data.name)
-        manifest = mapping.setdefault(obj.data.name, [])
-        manifest.append(obj)
-
-    # now associate the extracted mesh node with all the objects
-    # that are instancing it
-    for objects in mapping.values():
-        mesh_node = extract_mesh(objects[0], options)
-        _MESH_MAP[mesh_node.name] = objects
-
-
-def extracted_meshes():
-    """
-
-    :return: names of extracted mesh nodes
-
-    """
-    logger.debug('object.extracted_meshes()')
-    return [key for key in _MESH_MAP.keys()]
-
-
-def _on_visible_layer(obj, visible_layers):
-    """
-
-    :param obj:
-    :param visible_layers:
-
-    """
-    is_visible = 
False
-    for index, layer in enumerate(obj.layers):
-        if layer and index in visible_layers:
-            is_visible = True
-            break
-
-    if not is_visible:
-        logger.info('%s is on a hidden layer', obj.name)
-
-    return is_visible
-
-
-def _visible_scene_layers():
-    """
-
-    :return: list of visible layer indices
-
-    """
-    visible_layers = []
-    for index, layer in enumerate(context.scene.layers):
-        if layer:
-            visible_layers.append(index)
-    return visible_layers
-
-
-def _valid_node(obj, valid_types, options):
-    """
-
-    :param obj:
-    :param valid_types:
-    :param options:
-
-    """
-    if obj.type not in valid_types:
-        return False
-
-    # skip objects that are not on visible layers
-    visible_layers = _visible_scene_layers()
-    if not _on_visible_layer(obj, visible_layers):
-        return False
-
-    try:
-        export = obj.THREE_export
-    except AttributeError:
-        export = True
-    if not export:
-        return False
-
-    mesh_node = mesh(obj, options)
-    is_mesh = obj.type == MESH
-
-    # skip objects whose mesh could not be resolved
-    if is_mesh and not mesh_node:
-        return False
-
-    # secondary test; if a mesh node was resolved but no
-    # faces are detected then bow out
-    if is_mesh:
-        mesh_node = data.meshes[mesh_node]
-        if len(mesh_node.tessfaces) == 0:
-            return False
-
-    # if we get this far assume that the mesh is valid
-    return True
-
-
diff --git a/utils/exporters/blender/addons/io_three/exporter/api/texture.py b/utils/exporters/blender/addons/io_three/exporter/api/texture.py
deleted file mode 100644
index bebdbcb8c93191cca482b758148eb73c671559b3..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/addons/io_three/exporter/api/texture.py
+++ /dev/null
@@ -1,187 +0,0 @@
-from bpy import data, types
-from .. import constants, logger
-from .constants import IMAGE, MAG_FILTER, MIN_FILTER, MAPPING
-from . 
import image - - -def _texture(func): - """ - - :param func: - - """ - - def inner(name, *args, **kwargs): - """ - - :param name: - :param *args: - :param **kwargs: - - """ - - if isinstance(name, types.Texture): - texture = name - else: - texture = data.textures[name] - - return func(texture, *args, **kwargs) - - return inner - - -@_texture -def anisotropy(texture): - """ - - :param texture: - :return: filter_size value - - """ - logger.debug("texture.file_path(%s)", texture) - return texture.filter_size - - -@_texture -def file_name(texture): - """ - - :param texture: - :return: file name - - """ - logger.debug("texture.file_name(%s)", texture) - if texture.image: - return image.file_name(texture.image) - - -@_texture -def file_path(texture): - """ - - :param texture: - :return: file path - - """ - logger.debug("texture.file_path(%s)", texture) - if texture.image: - return image.file_path(texture.image) - - -@_texture -def image_node(texture): - """ - - :param texture: - :return: texture's image node - - """ - logger.debug("texture.image_node(%s)", texture) - return texture.image - - -@_texture -def mag_filter(texture): - """ - - :param texture: - :return: THREE_mag_filter value - - """ - logger.debug("texture.mag_filter(%s)", texture) - try: - val = texture.THREE_mag_filter - except AttributeError: - logger.debug("No THREE_mag_filter attribute found") - val = MAG_FILTER - - return val - - -@_texture -def mapping(texture): - """ - - :param texture: - :return: THREE_mapping value - - """ - logger.debug("texture.mapping(%s)", texture) - try: - val = texture.THREE_mapping - except AttributeError: - logger.debug("No THREE_mapping attribute found") - val = MAPPING - - return val - - -@_texture -def min_filter(texture): - """ - - :param texture: - :return: THREE_min_filter value - - """ - logger.debug("texture.min_filter(%s)", texture) - try: - val = texture.THREE_min_filter - except AttributeError: - logger.debug("No THREE_min_filter attribute found") - val = 
MIN_FILTER - - return val - - -@_texture -def repeat(texture): - """The repeat parameters of the texture node - - :param texture: - :returns: repeat_x, and repeat_y values - - """ - logger.debug("texture.repeat(%s)", texture) - return (texture.repeat_x, texture.repeat_y) - - -@_texture -def wrap(texture): - """The wrapping parameters of the texture node - - :param texture: - :returns: tuple of THREE compatible wrapping values - - """ - logger.debug("texture.wrap(%s)", texture) - - if(texture.extension == "REPEAT"): - wrapping = { - True: constants.WRAPPING.MIRROR, - False: constants.WRAPPING.REPEAT - } - return (wrapping[texture.use_mirror_x], - wrapping[texture.use_mirror_y]) - - # provide closest available three.js behavior. - # other possible values: "CLIP", "EXTEND", "CLIP_CUBE", "CHECKER", - # best match CLAMP behavior - else: - return (constants.WRAPPING.CLAMP, constants.WRAPPING.CLAMP); - - -def textures(): - """ - - :return: list of texture node names that are IMAGE - - """ - logger.debug("texture.textures()") - for mat in data.materials: - if mat.users == 0: - continue - for slot in mat.texture_slots: - if (slot and slot.use and - slot.texture and slot.texture.type == IMAGE): - yield slot.texture.name diff --git a/utils/exporters/blender/addons/io_three/exporter/base_classes.py b/utils/exporters/blender/addons/io_three/exporter/base_classes.py deleted file mode 100644 index cb95442a5b7eb27e6952ac21dc87fbf7d2794b5a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/base_classes.py +++ /dev/null @@ -1,149 +0,0 @@ -from . import utilities -from .. 
import constants, exceptions - - -class BaseClass(constants.BASE_DICT): - """Base class which inherits from a base dictionary object.""" - _defaults = {} - - def __init__(self, parent=None, type=None): - constants.BASE_DICT.__init__(self) - - self._type = type - - self._parent = parent - - constants.BASE_DICT.update(self, self._defaults.copy()) - BaseClass._defaults = {} - - def __setitem__(self, key, value): - if not isinstance(value, constants.VALID_DATA_TYPES): - msg = "Value is an invalid data type: %s" % type(value) - raise exceptions.ThreeValueError(msg) - constants.BASE_DICT.__setitem__(self, key, value) - - @property - def count(self): - """ - - :return: number of keys - :rtype: int - - """ - return len(self.keys()) - - @property - def parent(self): - """ - - :return: parent object - - """ - return self._parent - - @property - def type(self): - """ - - :return: the type (if applicable) - - """ - return self._type - - def copy(self): - """Copies the items to a standard dictionary object. 
- - :rtype: dict - - """ - data = {} - - def _dict_copy(old, new): - """Recursive function for processing all values - - :param old: - :param new: - - """ - for key, value in old.items(): - if isinstance(value, (str, list)): - new[key] = value[:] - elif isinstance(value, tuple): - new[key] = value+tuple() - elif isinstance(value, dict): - new[key] = {} - _dict_copy(value, new[key]) - else: - new[key] = value - - _dict_copy(self, data) - - return data - - -class BaseNode(BaseClass): - """Base class for all nodes for the current platform.""" - def __init__(self, node, parent, type): - BaseClass.__init__(self, parent=parent, type=type) - self._node = node - if node is None: - self[constants.UUID] = utilities.id() - else: - self[constants.NAME] = node - self[constants.UUID] = utilities.id() - - if isinstance(parent, BaseScene): - scene = parent - elif parent is not None: - scene = parent.scene - else: - scene = None - - self._scene = scene - - @property - def node(self): - """ - - :return: name of the node - - """ - return self._node - - @property - def scene(self): - """ - - :return: returns the scene point - - """ - - return self._scene - - @property - def options(self): - """ - - :return: export options - :retype: dict - - """ - return self.scene.options - - -class BaseScene(BaseClass): - """Base class that scenes inherit from.""" - def __init__(self, filepath, options): - BaseClass.__init__(self, type=constants.SCENE) - - self._filepath = filepath - - self._options = options.copy() - - @property - def filepath(self): - return self._filepath - - @property - def options(self): - return self._options diff --git a/utils/exporters/blender/addons/io_three/exporter/geometry.py b/utils/exporters/blender/addons/io_three/exporter/geometry.py deleted file mode 100644 index 47b0804e8e86473240c9102543a5ee2a8d215408..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/geometry.py +++ /dev/null @@ -1,639 +0,0 @@ -""" -Module for 
creating Three.js geometry JSON nodes. -""" - -import os -from .. import constants, logger -from . import base_classes, io, api - - -FORMAT_VERSION = 3 - - -class Geometry(base_classes.BaseNode): - """Class that wraps a single mesh/geometry node.""" - def __init__(self, node, parent=None): - logger.debug("Geometry().__init__(%s)", node) - -# @TODO: maybe better to have `three` constants for -# strings that are specific to `three` properties - geo_type = constants.GEOMETRY.title() - if parent.options.get(constants.GEOMETRY_TYPE): - opt_type = parent.options[constants.GEOMETRY_TYPE] - if opt_type == constants.BUFFER_GEOMETRY: - geo_type = constants.BUFFER_GEOMETRY - elif opt_type != constants.GEOMETRY: - logger.error("Unknown geometry type %s", opt_type) - - logger.info("Setting %s to '%s'", node, geo_type) - - self._defaults[constants.TYPE] = geo_type - base_classes.BaseNode.__init__(self, node, - parent=parent, - type=geo_type) - - @property - def animation_filename(self): - """Calculate the file name for the animation file - - :return: base name for the file - """ - compression = self.options.get(constants.COMPRESSION) - if compression in (None, constants.NONE): - ext = constants.JSON - elif compression == constants.MSGPACK: - ext = constants.PACK - - key = '' - for key in (constants.MORPH_TARGETS, constants.ANIMATION, constants.CLIPS): - if key in self.keys(): - break - else: - logger.info("%s has no animation data", self.node) - return - - return '%s.%s.%s' % (self.node, key, ext) - - @property - def face_count(self): - """Parse the bit masks of the `faces` array. 
- - :rtype: int - - """ - try: - faces = self[constants.FACES] - except KeyError: - logger.debug("No parsed faces found") - return 0 - - length = len(faces) - offset = 0 - - def bitset(bit, mask): - """ - - :type bit: int - :type mask: int - - """ - return bit & (1 << mask) - - face_count = 0 - - masks = (constants.MASK[constants.UVS], - constants.MASK[constants.NORMALS], - constants.MASK[constants.COLORS]) - - while offset < length: - bit = faces[offset] - offset += 1 - face_count += 1 - - is_quad = bitset(bit, constants.MASK[constants.QUAD]) - vector = 4 if is_quad else 3 - offset += vector - - if bitset(bit, constants.MASK[constants.MATERIALS]): - offset += 1 - - for mask in masks: - if bitset(bit, mask): - offset += vector - - return face_count - - @property - def metadata(self): - """Metadata for the current node. - - :rtype: dict - - """ - metadata = { - constants.GENERATOR: constants.THREE, - constants.VERSION: FORMAT_VERSION - } - - if self[constants.TYPE] == constants.GEOMETRY.title(): - self._geometry_metadata(metadata) - else: - self._buffer_geometry_metadata(metadata) - - return metadata - - def copy(self, scene=True): - """Copy the geometry definitions to a standard dictionary. 
- - :param scene: toggle for scene formatting - (Default value = True) - :type scene: bool - :rtype: dict - - """ - logger.debug("Geometry().copy(scene=%s)", scene) - dispatch = { - True: self._scene_format, - False: self._geometry_format - } - data = dispatch[scene]() - - try: - data[constants.MATERIALS] = self[constants.MATERIALS].copy() - except KeyError: - logger.debug("No materials to copy") - - return data - - def copy_textures(self, texture_folder=''): - """Copy the textures to the destination directory.""" - logger.debug("Geometry().copy_textures()") - if self.options.get(constants.EXPORT_TEXTURES) and not self.options.get(constants.EMBED_TEXTURES): - texture_registration = self.register_textures() - if texture_registration: - logger.info("%s has registered textures", self.node) - dirname = os.path.dirname(os.path.abspath(self.scene.filepath)) - full_path = os.path.join(dirname, texture_folder) - io.copy_registered_textures( - full_path, texture_registration) - - def parse(self): - """Parse the current node""" - logger.debug("Geometry().parse()") - if self[constants.TYPE] == constants.GEOMETRY.title(): - logger.info("Parsing Geometry format") - self._parse_geometry() - else: - logger.info("Parsing BufferGeometry format") - self._parse_buffer_geometry() - - def register_textures(self): - """Obtain a texture registration object. - - :rtype: dict - - """ - logger.debug("Geometry().register_textures()") - return api.mesh.texture_registration(self.node) - - def write(self, filepath=None): - """Write the geometry definitions to disk. Uses the - destination path of the scene. 
- - :param filepath: optional output file path - (Default value = None) - :type filepath: str - - """ - logger.debug("Geometry().write(filepath=%s)", filepath) - - filepath = filepath or self.scene.filepath - - io.dump(filepath, self.copy(scene=False), - options=self.scene.options) - - if self.options.get(constants.MAPS): - logger.info("Copying textures for %s", self.node) - self.copy_textures() - - def write_animation(self, filepath): - """Write the animation definitions to a separate file - on disk. This helps optimize the geometry file size. - - :param filepath: destination path - :type filepath: str - - """ - logger.debug("Geometry().write_animation(%s)", filepath) - - for key in (constants.MORPH_TARGETS, constants.ANIMATION, constants.CLIPS): - try: - data = self[key] - break - except KeyError: - pass - else: - logger.info("%s has no animation data", self.node) - return - - filepath = os.path.join(filepath, self.animation_filename) - if filepath: - logger.info("Dumping animation data to %s", filepath) - io.dump(filepath, data, options=self.scene.options) - return filepath - else: - logger.warning("Could not determine a filepath for " - "animation data. 
Nothing written to disk.") - - def _component_data(self): - """Query the component data only - - :rtype: dict - - """ - logger.debug("Geometry()._component_data()") - - if self[constants.TYPE] != constants.GEOMETRY.title(): - data = {} - index = self.get(constants.INDEX) - if index is not None: - data[constants.INDEX] = index - data[constants.ATTRIBUTES] = self.get(constants.ATTRIBUTES) - data[constants.GROUPS] = self.get(constants.GROUPS) - return {constants.DATA: data} - - components = [constants.VERTICES, constants.FACES, - constants.UVS, constants.COLORS, - constants.NORMALS, constants.BONES, - constants.SKIN_WEIGHTS, - constants.SKIN_INDICES, - constants.INFLUENCES_PER_VERTEX, - constants.INDEX] - - data = {} - anim_components = [constants.MORPH_TARGETS, constants.ANIMATION, constants.MORPH_TARGETS_ANIM, constants.CLIPS] - if self.options.get(constants.EMBED_ANIMATION): - components.extend(anim_components) - else: - for component in anim_components: - try: - self[component] - except KeyError: - pass - else: - data[component] = os.path.basename( - self.animation_filename) - break - else: - logger.info("No animation data found for %s", self.node) - - option_extra_vgroups = self.options.get(constants.EXTRA_VGROUPS) - - for name, index in api.mesh.extra_vertex_groups(self.node, - option_extra_vgroups): - components.append(name) - - for component in components: - try: - data[component] = self[component] - except KeyError: - logger.debug("Component %s not found", component) - - return data - - def _geometry_format(self): - """Three.Geometry formatted definitions - - :rtype: dict - - """ - data = { - constants.METADATA: { - constants.TYPE: self[constants.TYPE] - } - } - data[constants.METADATA].update(self.metadata) - data.update(self._component_data()) - - draw_calls = self.get(constants.DRAW_CALLS) - if draw_calls is not None: - data[constants.DRAW_CALLS] = draw_calls - - return data - - def _buffer_geometry_metadata(self, metadata): - """Three.BufferGeometry 
metadata - - :rtype: dict - - """ - for key, value in self[constants.ATTRIBUTES].items(): - size = value[constants.ITEM_SIZE] - array = value[constants.ARRAY] - metadata[key] = len(array)/size - - def _geometry_metadata(self, metadata): - """Three.Geometry metadata - - :rtype: dict - - """ - skip = (constants.TYPE, constants.FACES, constants.UUID, - constants.ANIMATION, constants.SKIN_INDICES, - constants.SKIN_WEIGHTS, constants.NAME, - constants.INFLUENCES_PER_VERTEX) - vectors = (constants.VERTICES, constants.NORMALS) - - for key in self.keys(): - if key in vectors: - try: - metadata[key] = int(len(self[key])/3) - except KeyError: - pass - continue - - if key in skip: - continue - - metadata[key] = len(self[key]) - - faces = self.face_count - if faces > 0: - metadata[constants.FACES] = faces - - def _scene_format(self): - """Format the output for Scene compatibility - - :rtype: dict - - """ - data = { - constants.NAME: self[constants.NAME], - constants.UUID: self[constants.UUID], - constants.TYPE: self[constants.TYPE] - } - - if self[constants.TYPE] == constants.GEOMETRY.title(): - data[constants.DATA] = self._component_data() - else: - data.update(self._component_data()) - draw_calls = self.get(constants.DRAW_CALLS) - if draw_calls is not None: - data[constants.DRAW_CALLS] = draw_calls - - return data - - def _parse_buffer_geometry(self): - """Parse the geometry to Three.BufferGeometry specs""" - self[constants.ATTRIBUTES] = {} - - options_vertices = self.options.get(constants.VERTICES) - option_normals = self.options.get(constants.NORMALS) - option_uvs = self.options.get(constants.UVS) - option_colors = self.options.get(constants.COLORS) - option_extra_vgroups = self.options.get(constants.EXTRA_VGROUPS) - option_index_type = self.options.get(constants.INDEX_TYPE) - - pos_tuple = (constants.POSITION, options_vertices, - lambda m: api.mesh.buffer_position(m, self.options), 3) - uvs_tuple = (constants.UV, option_uvs, - api.mesh.buffer_uv, 2) - uvs2_tuple = 
(constants.UV2, option_uvs, - lambda m: api.mesh.buffer_uv(m, layer=1), 2) - normals_tuple = (constants.NORMAL, option_normals, - lambda m: api.mesh.buffer_normal(m, self.options), 3) - colors_tuple = (constants.COLOR, option_colors, - lambda m: api.mesh.buffer_color(m, self.options), 3) - dispatch = (pos_tuple, uvs_tuple, uvs2_tuple, normals_tuple, colors_tuple) - - for key, option, func, size in dispatch: - - if not option: - continue - - array = func(self.node) or [] - if not array: - logger.warning("No array could be made for %s", key) - continue - - self[constants.ATTRIBUTES][key] = { - constants.ITEM_SIZE: size, - constants.TYPE: constants.FLOAT_32, - constants.ARRAY: array - } - - for name, index in api.mesh.extra_vertex_groups(self.node, - option_extra_vgroups): - - logger.info("Exporting extra vertex group %s", name) - - array = api.mesh.buffer_vertex_group_data(self.node, index) - if not array: - logger.warning("No array could be made for %s", name) - continue - - self[constants.ATTRIBUTES][name] = { - constants.ITEM_SIZE: 1, - constants.TYPE: constants.FLOAT_32, - constants.ARRAY: array - } - - if option_index_type != constants.NONE: - - assert(not (self.get(constants.INDEX) or - self.get(constants.DRAW_CALLS))) - - indices_per_face = 3 - index_threshold = 0xffff - indices_per_face - if option_index_type == constants.UINT_32: - index_threshold = 0x7fffffff - indices_per_face - - attrib_data_in, attrib_data_out, attrib_keys = [], [], [] - - i = 0 - for key, entry in self[constants.ATTRIBUTES].items(): - - item_size = entry[constants.ITEM_SIZE] - - attrib_keys.append(key) - attrib_data_in.append((entry[constants.ARRAY], item_size)) - attrib_data_out.append(([], i, i + item_size)) - i += item_size - - index_data, draw_calls = [], [] - indexed, flush_req, base_vertex = {}, False, 0 - - assert(len(attrib_data_in) > 0) - array, item_size = attrib_data_in[0] - i, n = 0, len(array) / item_size - - - while i < n: - - vertex_data = () - for array, item_size in 
attrib_data_in: - vertex_data += tuple( - array[i * item_size:(i + 1) * item_size]) - - vertex_index = indexed.get(vertex_data) - - if vertex_index is None: - - vertex_index = len(indexed) - flush_req = vertex_index >= index_threshold - - indexed[vertex_data] = vertex_index - for array, i_from, i_to in attrib_data_out: - array.extend(vertex_data[i_from:i_to]) - - index_data.append(vertex_index) - - i += 1 - if i == n: - flush_req = len(draw_calls) > 0 - assert(i % indices_per_face == 0) - - if flush_req and i % indices_per_face == 0: - start, count = 0, len(index_data) - if draw_calls: - prev = draw_calls[-1] - start = (prev[constants.DC_START] + - prev[constants.DC_COUNT]) - count -= start - draw_calls.append({ - constants.DC_START: start, - constants.DC_COUNT: count, - constants.DC_INDEX: base_vertex - }) - base_vertex += len(indexed) - indexed.clear() - flush_req = False - - - #manthrax: Adding group support for multiple materials - #index_threshold = indices_per_face*100 - face_materials = api.mesh.buffer_face_material(self.node,self.options) - logger.info("Face material list length:%d",len(face_materials)) - logger.info("Drawcall parameters count:%s item_size=%s",n,item_size) - assert((len(face_materials)*3)==n) - #Re-index the index buffer by material - used_material_indexes = {} - #Get lists of faces indices per material - for idx, mat_index in enumerate(face_materials): - if used_material_indexes.get(mat_index) is None: - used_material_indexes[mat_index] = [idx] - else: - used_material_indexes[mat_index].append(idx) - - logger.info("# Faces by material:%s",str(used_material_indexes)) - - #manthrax: build new index list from lists of faces by material, and build the draw groups at the same time... 
- groups = [] - new_index = [] - print("Mat index:",str(used_material_indexes)) - - for mat_index in used_material_indexes: - face_array=used_material_indexes[mat_index] - print("Mat index:",str(mat_index),str(face_array)) - - print( dir(self.node) ) - - group = { - 'start': len(new_index), - 'count': len(face_array)*3, - 'materialIndex': mat_index - } - groups.append(group) - - for fi in range(len(face_array)): - prim_index = face_array[fi] - prim_index = prim_index * 3 - new_index.extend([index_data[prim_index],index_data[prim_index+1],index_data[prim_index+2]]) - - if len(groups) > 0: - index_data = new_index - self[constants.GROUPS]=groups - #else: - # self[constants.GROUPS]=[{ - # 'start':0, - # 'count':n, - # 'materialIndex':0 - #}] - #manthrax: End group support - - - - for i, key in enumerate(attrib_keys): - array = attrib_data_out[i][0] - self[constants.ATTRIBUTES][key][constants.ARRAY] = array - - self[constants.INDEX] = { - constants.ITEM_SIZE: 1, - constants.TYPE: option_index_type, - constants.ARRAY: index_data - } - if (draw_calls): - logger.info("draw_calls = %s", repr(draw_calls)) - self[constants.DRAW_CALLS] = draw_calls - - def _parse_geometry(self): - """Parse the geometry to Three.Geometry specs""" - if self.options.get(constants.VERTICES): - logger.info("Parsing %s", constants.VERTICES) - self[constants.VERTICES] = api.mesh.vertices(self.node, self.options) or [] - - if self.options.get(constants.NORMALS): - logger.info("Parsing %s", constants.NORMALS) - self[constants.NORMALS] = api.mesh.normals(self.node, self.options) or [] - - if self.options.get(constants.COLORS): - logger.info("Parsing %s", constants.COLORS) - self[constants.COLORS] = api.mesh.vertex_colors( - self.node) or [] - - if self.options.get(constants.FACE_MATERIALS): - logger.info("Parsing %s", constants.FACE_MATERIALS) - self[constants.MATERIALS] = api.mesh.materials( - self.node, self.options) or [] - - if self.options.get(constants.UVS): - logger.info("Parsing %s", 
constants.UVS) - self[constants.UVS] = api.mesh.uvs(self.node) or [] - - if self.options.get(constants.FACES): - logger.info("Parsing %s", constants.FACES) - material_list = self.get(constants.MATERIALS) - self[constants.FACES] = api.mesh.faces( - self.node, self.options, material_list=material_list) or [] - - no_anim = (None, False, constants.OFF) - if self.options.get(constants.ANIMATION) not in no_anim: - logger.info("Parsing %s", constants.ANIMATION) - self[constants.ANIMATION] = api.mesh.skeletal_animation( - self.node, self.options) or [] - -# @TODO: considering making bones data implied when -# querying skinning data - - bone_map = {} - if self.options.get(constants.BONES): - logger.info("Parsing %s", constants.BONES) - bones, bone_map = api.mesh.bones(self.node, self.options) - self[constants.BONES] = bones - - if self.options.get(constants.SKINNING): - logger.info("Parsing %s", constants.SKINNING) - influences = self.options.get( - constants.INFLUENCES_PER_VERTEX, 2) - anim_type = self.options.get(constants.ANIMATION) - - self[constants.INFLUENCES_PER_VERTEX] = influences - self[constants.SKIN_INDICES] = api.mesh.skin_indices( - self.node, bone_map, influences, anim_type) or [] - self[constants.SKIN_WEIGHTS] = api.mesh.skin_weights( - self.node, bone_map, influences, anim_type) or [] - - if self.options.get(constants.BLEND_SHAPES): - logger.info("Parsing %s", constants.BLEND_SHAPES) - mt = api.mesh.blend_shapes(self.node, self.options) or [] - self[constants.MORPH_TARGETS] = mt - if len(mt) > 0 and self._scene: # there's blend shapes, let check for animation - tracks = api.mesh.animated_blend_shapes(self.node, self[constants.NAME], self.options) or [] - merge = self._scene[constants.ANIMATION][0][constants.KEYFRAMES] - for track in tracks: - merge.append(track) - elif self.options.get(constants.MORPH_TARGETS): - logger.info("Parsing %s", constants.MORPH_TARGETS) - self[constants.MORPH_TARGETS] = api.mesh.morph_targets( - self.node, self.options) or [] - - 
# In the moment there is no way to add extra data to a Geomtry in - # Three.js. In case there is some day, here is the code: - # - # option_extra_vgroups = self.options.get(constants.EXTRA_VGROUPS) - # - # for name, index in api.mesh.extra_vertex_groups(self.node, - # option_extra_vgroups): - # - # logger.info("Exporting extra vertex group %s", name) - # self[name] = api.mesh.vertex_group_data(self.node, index) diff --git a/utils/exporters/blender/addons/io_three/exporter/image.py b/utils/exporters/blender/addons/io_three/exporter/image.py deleted file mode 100644 index 5f9608b8d564ffa0ec00b2a7dffcd84df9ba6965..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/image.py +++ /dev/null @@ -1,57 +0,0 @@ -import os -import base64 -from .. import constants, logger -from . import base_classes, io, api - - -class Image(base_classes.BaseNode): - """Class the wraps an image node. This is the node that - represent that actual file on disk. - """ - def __init__(self, node, parent): - logger.debug("Image().__init__(%s)", node) - base_classes.BaseNode.__init__(self, node, parent, constants.IMAGE) - - if(self.scene.options.get(constants.EMBED_TEXTURES, False)): - texturefile = open(api.image.file_path(self.node),"rb") - extension = os.path.splitext(api.image.file_path(self.node))[1][1:].strip().lower() - if(extension == 'jpg') : - extension = 'jpeg' - self[constants.URL] = "data:image/" + extension + ";base64," + base64.b64encode(texturefile.read()).decode("utf-8") - texturefile.close(); - else: - texture_folder = self.scene.options.get(constants.TEXTURE_FOLDER, "") - self[constants.URL] = os.path.join(texture_folder, api.image.file_name(self.node)) - - - @property - def destination(self): - """ - - :return: full destination path (when copied) - - """ - dirname = os.path.dirname(self.scene.filepath) - return os.path.join(dirname, self[constants.URL]) - - @property - def filepath(self): - """ - - :return: source file path - - """ - 
return api.image.file_path(self.node) - - def copy_texture(self, func=io.copy): - """Copy the texture. - self.filepath > self.destination - - :param func: Optional function override (Default value = io.copy) - arguments are (, ) - :return: path the texture was copied to - - """ - logger.debug("Image().copy_texture()") - func(self.filepath, self.destination) - return self.destination diff --git a/utils/exporters/blender/addons/io_three/exporter/io.py b/utils/exporters/blender/addons/io_three/exporter/io.py deleted file mode 100644 index 053daf42cb5ff53fa47cb0b41ee767aeb2e00f76..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/io.py +++ /dev/null @@ -1,106 +0,0 @@ -import os -import shutil -from .. import constants, logger -from . import _json - - -def copy_registered_textures(dest, registration): - """Copy the registered textures to the destination (root) path - - :param dest: destination directory - :param registration: registered textures - :type dest: str - :type registration: dict - - """ - logger.debug("io.copy_registered_textures(%s, %s)", dest, registration) - os.makedirs(dest, exist_ok=True) - for value in registration.values(): - copy(value['file_path'], dest) - - -def copy(src, dst): - """Copy a file to a destination - - :param src: source file - :param dst: destination file/path - - """ - logger.debug("io.copy(%s, %s)" % (src, dst)) - if os.path.isdir(dst): - file_name = os.path.basename(src) - dst = os.path.join(dst, file_name) - - if src != dst: - shutil.copy(src, dst) - - -def dump(filepath, data, options=None): - """Dump the output to disk (JSON, msgpack, etc) - - :param filepath: output file path - :param data: serializable data to write to disk - :param options: (Default value = None) - :type options: dict - - """ - options = options or {} - logger.debug("io.dump(%s, data, options=%s)", filepath, options) - - compress = options.get(constants.COMPRESSION, constants.NONE) - if compress == 
constants.MSGPACK: - try: - import msgpack - except ImportError: - logger.error("msgpack module not found") - raise - - logger.info("Dumping to msgpack") - func = lambda x, y: msgpack.dump(x, y) - mode = 'wb' - else: - round_off = options.get(constants.ENABLE_PRECISION) - if round_off: - _json.ROUND = options[constants.PRECISION] - else: - _json.ROUND = None - - indent = options.get(constants.INDENT, True) - indent = 4 if indent else None - compact_separators = (',', ':') - logger.info("Dumping to JSON") - func = lambda x, y: _json.json.dump(x, y, indent=indent, separators=compact_separators) - mode = 'w' - - logger.info("Writing to %s", filepath) - with open(filepath, mode=mode) as stream: - func(data, stream) - - -def load(filepath, options): - """Load the contents of the file path with the correct parser - - :param filepath: input file path - :param options: - :type options: dict - - """ - logger.debug("io.load(%s, %s)", filepath, options) - compress = options.get(constants.COMPRESSION, constants.NONE) - if compress == constants.MSGPACK: - try: - import msgpack - except ImportError: - logger.error("msgpack module not found") - raise - module = msgpack - mode = 'rb' - else: - logger.info("Loading JSON") - module = _json.json - mode = 'r' - - with open(filepath, mode=mode) as stream: - data = module.load(stream) - - return data diff --git a/utils/exporters/blender/addons/io_three/exporter/material.py b/utils/exporters/blender/addons/io_three/exporter/material.py deleted file mode 100644 index 746652d7067ce68153bc04c9cbf305e78610e7f4..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/material.py +++ /dev/null @@ -1,101 +0,0 @@ -from .. import constants, logger -from . 
import base_classes, utilities, api - - -class Material(base_classes.BaseNode): - """Class that wraps material nodes""" - def __init__(self, node, parent): - logger.debug("Material().__init__(%s)", node) - base_classes.BaseNode.__init__(self, node, parent, - constants.MATERIAL) - - self._common_attributes() - if self[constants.TYPE] == constants.THREE_PHONG: - self._phong_attributes() - - textures = self.parent.options.get(constants.MAPS) - if textures: - self._update_maps() - skinning = self.parent.options.get(constants.SKINNING) - if skinning: - self[constants.SKINNING] = True - - def _common_attributes(self): - """Parse the common material attributes""" - logger.debug('Material()._common_attributes()') - dispatch = { - constants.PHONG: constants.THREE_PHONG, - constants.LAMBERT: constants.THREE_LAMBERT, - constants.BASIC: constants.THREE_BASIC - } - shader_type = api.material.type(self.node) - self[constants.TYPE] = dispatch[shader_type] - - diffuse = api.material.diffuse_color(self.node) - self[constants.COLOR] = utilities.rgb2int(diffuse) - - if self[constants.TYPE] != constants.THREE_BASIC: - emissive = api.material.emissive_color(self.node) - self[constants.EMISSIVE] = utilities.rgb2int(emissive) - - vertex_color = api.material.use_vertex_colors(self.node) - if vertex_color: - self[constants.VERTEX_COLORS] = constants.VERTEX_COLORS_ON - else: - self[constants.VERTEX_COLORS] = constants.VERTEX_COLORS_OFF - - self[constants.BLENDING] = api.material.blending(self.node) - - if api.material.transparent(self.node): - self[constants.TRANSPARENT] = True - self[constants.OPACITY] = api.material.opacity(self.node) - - if api.material.double_sided(self.node): - self[constants.SIDE] = constants.SIDE_DOUBLE - - self[constants.DEPTH_TEST] = api.material.depth_test(self.node) - - self[constants.DEPTH_WRITE] = api.material.depth_write(self.node) - - def _phong_attributes(self): - """Parse phong specific attributes""" - logger.debug("Material()._phong_attributes()") - 
specular = api.material.specular_color(self.node) - self[constants.SPECULAR] = utilities.rgb2int(specular) - self[constants.SHININESS] = api.material.specular_coef(self.node) - - def _update_maps(self): - """Parses maps/textures and updates the textures array - with any new nodes found. - """ - logger.debug("Material()._update_maps()") - - mapping = ( - (api.material.diffuse_map, constants.MAP), - (api.material.specular_map, constants.SPECULAR_MAP), - (api.material.light_map, constants.LIGHT_MAP) - ) - - for func, key in mapping: - map_node = func(self.node) - if map_node: - logger.info('Found map node %s for %s', map_node, key) - tex_inst = self.scene.texture(map_node.name) - self[key] = tex_inst[constants.UUID] - - if self[constants.TYPE] == constants.THREE_PHONG: - mapping = ( - (api.material.bump_map, constants.BUMP_MAP, - constants.BUMP_SCALE, api.material.bump_scale), - (api.material.normal_map, constants.NORMAL_MAP, - constants.NORMAL_SCALE, api.material.normal_scale) - ) - - for func, map_key, scale_key, scale_func in mapping: - map_node = func(self.node) - if not map_node: - continue - logger.info("Found map node %s for %s", map_node, map_key) - tex_inst = self.scene.texture(map_node.name) - self[map_key] = tex_inst[constants.UUID] - self[scale_key] = scale_func(self.node) diff --git a/utils/exporters/blender/addons/io_three/exporter/object.py b/utils/exporters/blender/addons/io_three/exporter/object.py deleted file mode 100644 index 5e8b3f0e65ad12c70d2850712aebc10d5545d283..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/object.py +++ /dev/null @@ -1,167 +0,0 @@ -from .. import constants, logger -from . 
import base_classes, api - - -class Object(base_classes.BaseNode): - """Class that wraps an object node""" - def __init__(self, node, parent=None, type=None): - logger.debug("Object().__init__(%s)", node) - base_classes.BaseNode.__init__(self, node, parent=parent, type=type) - - if self.node: - self._node_setup() - else: - self._root_setup() - - @property - def data(self): - """ - - :return: returns the data block of the node - - """ - return api.data(self.node) - - - def _init_camera(self): - """Initialize camera attributes""" - logger.debug("Object()._init_camera()") - self[constants.FAR] = api.camera.far(self.data) - self[constants.NEAR] = api.camera.near(self.data) - - if self[constants.TYPE] == constants.PERSPECTIVE_CAMERA: - self[constants.ASPECT] = api.camera.aspect(self.data) - self[constants.FOV] = api.camera.fov(self.data) - elif self[constants.TYPE] == constants.ORTHOGRAPHIC_CAMERA: - self[constants.LEFT] = api.camera.left(self.data) - self[constants.RIGHT] = api.camera.right(self.data) - self[constants.TOP] = api.camera.top(self.data) - self[constants.BOTTOM] = api.camera.bottom(self.data) - - #@TODO: need more light attributes. Some may have to come from - # custom blender attributes. - def _init_light(self): - """Initialize light attributes""" - logger.debug("Object()._init_light()") - self[constants.COLOR] = api.light.color(self.data) - self[constants.INTENSITY] = api.light.intensity(self.data) - - # Commented out because Blender's distance is not a cutoff value. 
- #if self[constants.TYPE] != constants.DIRECTIONAL_LIGHT: - # self[constants.DISTANCE] = api.light.distance(self.data) - self[constants.DISTANCE] = 0; - - lightType = self[constants.TYPE] - - # TODO (abelnation): handle Area lights - if lightType == constants.SPOT_LIGHT: - self[constants.ANGLE] = api.light.angle(self.data) - self[constants.DECAY] = api.light.falloff(self.data) - elif lightType == constants.POINT_LIGHT: - self[constants.DECAY] = api.light.falloff(self.data) - - def _init_mesh(self): - """Initialize mesh attributes""" - logger.debug("Object()._init_mesh()") - mesh = api.object.mesh(self.node, self.options) - node = self.scene.geometry(mesh) - if node: - self[constants.GEOMETRY] = node[constants.UUID] - else: - msg = "Could not find Geometry() node for %s" - logger.error(msg, self.node) - - def _node_setup(self): - """Parse common node attributes of all objects""" - logger.debug("Object()._node_setup()") - self[constants.NAME] = api.object.name(self.node) - - transform = api.object.matrix(self.node, self.options) - matrix = [] - for col in range(0, 4): - for row in range(0, 4): - matrix.append(transform[row][col]) - - self[constants.MATRIX] = matrix - - self[constants.VISIBLE] = api.object.visible(self.node) - - self[constants.TYPE] = api.object.node_type(self.node) - - if self.options.get(constants.MATERIALS): - logger.info("Parsing materials for %s", self.node) - - - material_names = api.object.material(self.node) #manthrax: changes for multimaterial start here - if material_names: - - logger.info("Got material names for this object:%s",str(material_names)); - - materialArray = [self.scene.material(objname)[constants.UUID] for objname in material_names] - if len(materialArray) == 0: # If no materials.. 
dont export a material entry - materialArray = None - elif len(materialArray) == 1: # If only one material, export material UUID singly, not as array - materialArray = materialArray[0] - # else export array of material uuids - self[constants.MATERIAL] = materialArray - - logger.info("Materials:%s",str(self[constants.MATERIAL])); - else: - logger.info("%s has no materials", self.node) #manthrax: end multimaterial - - # TODO (abelnation): handle Area lights - casts_shadow = (constants.MESH, - constants.DIRECTIONAL_LIGHT, - constants.SPOT_LIGHT) - - if self[constants.TYPE] in casts_shadow: - logger.info("Querying shadow casting for %s", self.node) - self[constants.CAST_SHADOW] = \ - api.object.cast_shadow(self.node) - - if self[constants.TYPE] == constants.MESH: - logger.info("Querying shadow receive for %s", self.node) - self[constants.RECEIVE_SHADOW] = \ - api.object.receive_shadow(self.node) - - camera = (constants.PERSPECTIVE_CAMERA, - constants.ORTHOGRAPHIC_CAMERA) - - # TODO (abelnation): handle Area lights - lights = (constants.AMBIENT_LIGHT, - constants.DIRECTIONAL_LIGHT, - constants.POINT_LIGHT, - constants.SPOT_LIGHT, constants.HEMISPHERE_LIGHT) - - if self[constants.TYPE] == constants.MESH: - self._init_mesh() - elif self[constants.TYPE] in camera: - self._init_camera() - elif self[constants.TYPE] in lights: - self._init_light() - - no_anim = (None, False, constants.OFF) - if self.options.get(constants.KEYFRAMES) not in no_anim: - logger.info("Export Transform Animation for %s", self.node) - if self._scene: - # only when exporting scene - tracks = api.object.animated_xform(self.node, self.options) - merge = self._scene[constants.ANIMATION][0][constants.KEYFRAMES] - for track in tracks: - merge.append(track) - - if self.options.get(constants.HIERARCHY, False): - for child in api.object.children(self.node, self.scene.valid_types): - if not self.get(constants.CHILDREN): - self[constants.CHILDREN] = [Object(child, parent=self)] - else: - 
self[constants.CHILDREN].append(Object(child, parent=self)) - - if self.options.get(constants.CUSTOM_PROPERTIES, False): - self[constants.USER_DATA] = api.object.custom_properties(self.node) - - def _root_setup(self): - """Applies to a root/scene object""" - logger.debug("Object()._root_setup()") - self[constants.MATRIX] = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, - 1, 0, 0, 0, 0, 1] diff --git a/utils/exporters/blender/addons/io_three/exporter/scene.py b/utils/exporters/blender/addons/io_three/exporter/scene.py deleted file mode 100644 index 9f20a22d62f2352ddeb10f989d5ee18587c6a26a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/scene.py +++ /dev/null @@ -1,259 +0,0 @@ -import os -from .. import constants, logger -from . import ( - base_classes, - texture, - material, - geometry, - object as object_, - utilities, - io, - api -) -from bpy import context - -class Scene(base_classes.BaseScene): - """Class that handles the contruction of a Three scene""" - - def __init__(self, filepath, options=None): - logger.debug("Scene().__init__(%s, %s)", filepath, options) - self._defaults = { - constants.METADATA: constants.DEFAULT_METADATA.copy(), - constants.GEOMETRIES: [], - constants.MATERIALS: [], - constants.IMAGES: [], - constants.TEXTURES: [], - constants.ANIMATION: [] - } - base_classes.BaseScene.__init__(self, filepath, options or {}) - - source_file = api.scene_name() - if source_file: - self[constants.METADATA][constants.SOURCE_FILE] = source_file - self.__init_animation() - - def __init_animation(self): - self[constants.ANIMATION].append({ - constants.NAME: "default", - constants.FPS : context.scene.render.fps, - constants.KEYFRAMES: [] - }); - pass - - @property - def valid_types(self): - """ - - :return: list of valid node types - - """ - valid_types = [api.constants.MESH] - - if self.options.get(constants.HIERARCHY, False): - valid_types.append(api.constants.EMPTY) - - if self.options.get(constants.CAMERAS): - 
logger.info("Adding cameras to valid object types") - valid_types.append(api.constants.CAMERA) - - if self.options.get(constants.LIGHTS): - logger.info("Adding lights to valid object types") - valid_types.append(api.constants.LAMP) - - return valid_types - - def geometry(self, value): - """Find a geometry node that matches either a name - or uuid value. - - :param value: name or uuid - :type value: str - - """ - logger.debug("Scene().geometry(%s)", value) - return _find_node(value, self[constants.GEOMETRIES]) - - def image(self, value): - """Find a image node that matches either a name - or uuid value. - - :param value: name or uuid - :type value: str - - """ - logger.debug("Scene().image%s)", value) - return _find_node(value, self[constants.IMAGES]) - - def material(self, value): - """Find a material node that matches either a name - or uuid value. - - :param value: name or uuid - :type value: str - - """ - logger.debug("Scene().material(%s)", value) - return _find_node(value, self[constants.MATERIALS]) - - def parse(self): - """Execute the parsing of the scene""" - logger.debug("Scene().parse()") - if self.options.get(constants.MAPS): - self._parse_textures() - - if self.options.get(constants.MATERIALS): - self._parse_materials() - - self._parse_geometries() - self._parse_objects() - - def texture(self, value): - """Find a texture node that matches either a name - or uuid value. 
- - :param value: name or uuid - :type value: str - - """ - logger.debug("Scene().texture(%s)", value) - return _find_node(value, self[constants.TEXTURES]) - - def write(self): - """Write the parsed scene to disk.""" - logger.debug("Scene().write()") - data = {} - - embed_anim = self.options.get(constants.EMBED_ANIMATION, True) - embed = self.options.get(constants.EMBED_GEOMETRY, True) - - compression = self.options.get(constants.COMPRESSION) - extension = constants.EXTENSIONS.get( - compression, - constants.EXTENSIONS[constants.JSON]) - - export_dir = os.path.dirname(self.filepath) - for key, value in self.items(): - - if key == constants.GEOMETRIES: - geometries = [] - for geom in value: - - if not embed_anim: - geom.write_animation(export_dir) - - geom_data = geom.copy() - if not embed: - geom_data.pop(constants.DATA) - - url = 'geometry.%s%s' % (geom.node, extension) - geometry_file = os.path.join(export_dir, url) - - geom.write(filepath=geometry_file) - geom_data[constants.URL] = os.path.basename(url) - - geometries.append(geom_data) - - data[key] = geometries - elif isinstance(value, list): - data[key] = [] - for each in value: - data[key].append(each.copy()) - elif isinstance(value, dict): - data[key] = value.copy() - - io.dump(self.filepath, data, options=self.options) - - if self.options.get(constants.EXPORT_TEXTURES) and not self.options.get(constants.EMBED_TEXTURES): - texture_folder = self.options.get(constants.TEXTURE_FOLDER) - for geo in self[constants.GEOMETRIES]: - logger.info("Copying textures from %s", geo.node) - geo.copy_textures(texture_folder) - - def _parse_geometries(self): - """Locate all geometry nodes and parse them""" - logger.debug("Scene()._parse_geometries()") - - # this is an important step. 
please refer to the doc string - # on the function for more information - api.object.prep_meshes(self.options) - geometries = [] - - # now iterate over all the extracted mesh nodes and parse each one - for mesh in api.object.extracted_meshes(): - logger.info("Parsing geometry %s", mesh) - geo = geometry.Geometry(mesh, self) - geo.parse() - geometries.append(geo) - - logger.info("Added %d geometry nodes", len(geometries)) - self[constants.GEOMETRIES] = geometries - - def _parse_materials(self): - """Locate all non-orphaned materials and parse them""" - logger.debug("Scene()._parse_materials()") - materials = [] - - for material_name in api.material.used_materials(): - logger.info("Parsing material %s", material_name) - materials.append(material.Material(material_name, parent=self)) - - logger.info("Added %d material nodes", len(materials)) - self[constants.MATERIALS] = materials - - def _parse_objects(self): - """Locate all valid objects in the scene and parse them""" - logger.debug("Scene()._parse_objects()") - try: - scene_name = self[constants.METADATA][constants.SOURCE_FILE] - except KeyError: - scene_name = constants.SCENE - self[constants.OBJECT] = object_.Object(None, parent=self) - self[constants.OBJECT][constants.TYPE] = constants.SCENE.title() - self[constants.UUID] = utilities.id() - - objects = [] - if self.options.get(constants.HIERARCHY, False): - nodes = api.object.assemblies(self.valid_types, self.options) - else: - nodes = api.object.nodes(self.valid_types, self.options) - - for node in nodes: - logger.info("Parsing object %s", node) - obj = object_.Object(node, parent=self[constants.OBJECT]) - objects.append(obj) - - logger.info("Added %d object nodes", len(objects)) - self[constants.OBJECT][constants.CHILDREN] = objects - - def _parse_textures(self): - """Locate all non-orphaned textures and parse them""" - logger.debug("Scene()._parse_textures()") - textures = [] - - for texture_name in api.texture.textures(): - logger.info("Parsing texture %s", 
texture_name) - tex_inst = texture.Texture(texture_name, self) - textures.append(tex_inst) - - logger.info("Added %d texture nodes", len(textures)) - self[constants.TEXTURES] = textures - - -def _find_node(value, manifest): - """Find a node that matches either a name - or uuid value. - - :param value: name or uuid - :param manifest: manifest of nodes to search - :type value: str - :type manifest: list - - """ - for index in manifest: - uuid = index.get(constants.UUID) == value - name = index.node == value - if uuid or name: - return index - else: - logger.debug("No matching node for %s", value) - diff --git a/utils/exporters/blender/addons/io_three/exporter/texture.py b/utils/exporters/blender/addons/io_three/exporter/texture.py deleted file mode 100644 index 240f9429746a54ec7b87425002a7fa39ac52237f..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/texture.py +++ /dev/null @@ -1,42 +0,0 @@ -from .. import constants, logger -from . import base_classes, image, api - - -class Texture(base_classes.BaseNode): - """Class that wraps a texture node""" - def __init__(self, node, parent): - logger.debug("Texture().__init__(%s)", node) - base_classes.BaseNode.__init__(self, node, parent, constants.TEXTURE) - - num = constants.NUMERIC - - img_inst = self.scene.image(api.texture.file_name(self.node)) - - if not img_inst: - image_node = api.texture.image_node(self.node) - img_inst = image.Image(image_node.name, self.scene) - self.scene[constants.IMAGES].append(img_inst) - - - self[constants.IMAGE] = img_inst[constants.UUID] - - wrap = api.texture.wrap(self.node) - self[constants.WRAP] = (num[wrap[0]], num[wrap[1]]) - - if constants.WRAPPING.REPEAT in wrap: - self[constants.REPEAT] = api.texture.repeat(self.node) - - self[constants.ANISOTROPY] = api.texture.anisotropy(self.node) - self[constants.MAG_FILTER] = num[api.texture.mag_filter(self.node)] - self[constants.MIN_FILTER] = num[api.texture.min_filter(self.node)] - 
self[constants.MAPPING] = num[api.texture.mapping(self.node)] - - @property - def image(self): - """ - - :return: the image object of the current texture - :rtype: image.Image - - """ - return self.scene.image(self[constants.IMAGE]) diff --git a/utils/exporters/blender/addons/io_three/exporter/utilities.py b/utils/exporters/blender/addons/io_three/exporter/utilities.py deleted file mode 100644 index d6d2346160c63e0208a8a3c1821c74534b18834f..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/exporter/utilities.py +++ /dev/null @@ -1,60 +0,0 @@ -import uuid -import hashlib - -from .. import constants - - -ROUND = constants.DEFAULT_PRECISION - - -def bit_mask(flags): - """Generate a bit mask. - - :type flags: dict - :return: int - - """ - bit = 0 - true = lambda x, y: (x | (1 << y)) - false = lambda x, y: (x & (~(1 << y))) - - for mask, position in constants.MASK.items(): - func = true if flags.get(mask) else false - bit = func(bit, position) - - return bit - - -def hash(value): - """Generate a hash from a given value - - :param value: - :rtype: str - - """ - hash_ = hashlib.md5() - hash_.update(repr(value).encode('utf8')) - return hash_.hexdigest() - - -def id(): - """Generate a random UUID - - :rtype: str - - """ - return str(uuid.uuid4()).upper() - - -def rgb2int(rgb): - """Convert a given rgb value to an integer - - :type rgb: list|tuple - :rtype: int - - """ - is_tuple = isinstance(rgb, tuple) - rgb = list(rgb) if is_tuple else rgb - - colour = (int(rgb[0]*255) << 16) + (int(rgb[1]*255) << 8) + int(rgb[2]*255) - return colour diff --git a/utils/exporters/blender/addons/io_three/logger.py b/utils/exporters/blender/addons/io_three/logger.py deleted file mode 100644 index c0ce36410c7d9ab721bb88b7cbdfba771f7a93f4..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/addons/io_three/logger.py +++ /dev/null @@ -1,83 +0,0 @@ -import os -import logging -import tempfile - -from . 
import constants - -LOG_FILE = None -LOGGER = None - -LEVELS = { - constants.DEBUG: logging.DEBUG, - constants.INFO: logging.INFO, - constants.WARNING: logging.WARNING, - constants.ERROR: logging.ERROR, - constants.CRITICAL: logging.CRITICAL -} - - -def init(filename, level=constants.DEBUG): - """Initialize the logger. - - :param filename: base name of the log file - :param level: logging level (Default value = DEBUG) - - """ - global LOG_FILE - LOG_FILE = os.path.join(tempfile.gettempdir(), filename) - with open(LOG_FILE, 'w'): - pass - - global LOGGER - LOGGER = logging.getLogger('Three.Export') - LOGGER.setLevel(LEVELS[level]) - - if not LOGGER.handlers: - stream = logging.StreamHandler() - stream.setLevel(LEVELS[level]) - - format_ = '%(asctime)s - %(name)s - %(levelname)s: %(message)s' - formatter = logging.Formatter(format_) - - stream.setFormatter(formatter) - - file_handler = logging.FileHandler(LOG_FILE) - file_handler.setLevel(LEVELS[level]) - file_handler.setFormatter(formatter) - - LOGGER.addHandler(stream) - LOGGER.addHandler(file_handler) - - -def _logger(func): - - def inner(*args): - if LOGGER is not None: - func(*args) - - return inner - - -@_logger -def info(*args): - LOGGER.info(*args) - - -@_logger -def debug(*args): - LOGGER.debug(*args) - - -@_logger -def warning(*args): - LOGGER.warning(*args) - - -@_logger -def error(*args): - LOGGER.error(*args) - - -@_logger -def critical(*args): - LOGGER.critical(*args) diff --git a/utils/exporters/blender/modules/README.md b/utils/exporters/blender/modules/README.md deleted file mode 100644 index 782b949b12f4688eadc4c1c0a8312100f81f96bc..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/README.md +++ /dev/null @@ -1,2 +0,0 @@ -## mspgack -https://github.com/msgpack/msgpack-python diff --git a/utils/exporters/blender/modules/msgpack/__init__.py b/utils/exporters/blender/modules/msgpack/__init__.py deleted file mode 100644 index 
6c5ae53273c747e86985dd3969fe9c963a98d3a8..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/__init__.py +++ /dev/null @@ -1,54 +0,0 @@ -# coding: utf-8 -from msgpack._version import version -from msgpack.exceptions import * - -from collections import namedtuple - - -class ExtType(namedtuple('ExtType', 'code data')): - """ExtType represents ext type in msgpack.""" - def __new__(cls, code, data): - if not isinstance(code, int): - raise TypeError("code must be int") - if not isinstance(data, bytes): - raise TypeError("data must be bytes") - if not 0 <= code <= 127: - raise ValueError("code must be 0~127") - return super(ExtType, cls).__new__(cls, code, data) - - -import os -if os.environ.get('MSGPACK_PUREPYTHON'): - from msgpack.fallback import Packer, unpack, unpackb, Unpacker -else: - try: - from msgpack._packer import Packer - from msgpack._unpacker import unpack, unpackb, Unpacker - except ImportError: - from msgpack.fallback import Packer, unpack, unpackb, Unpacker - - -def pack(o, stream, **kwargs): - """ - Pack object `o` and write it to `stream` - - See :class:`Packer` for options. - """ - packer = Packer(**kwargs) - stream.write(packer.pack(o)) - - -def packb(o, **kwargs): - """ - Pack object `o` and return packed bytes - - See :class:`Packer` for options. - """ - return Packer(**kwargs).pack(o) - -# alias for compatibility to simplejson/marshal/pickle. 
-load = unpack -loads = unpackb - -dump = pack -dumps = packb diff --git a/utils/exporters/blender/modules/msgpack/_packer.pyx b/utils/exporters/blender/modules/msgpack/_packer.pyx deleted file mode 100644 index 82e4a63ddfdab9b3897c4f3d64ca9bdbee2e560b..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/_packer.pyx +++ /dev/null @@ -1,295 +0,0 @@ -# coding: utf-8 -#cython: embedsignature=True - -from cpython cimport * -from libc.stdlib cimport * -from libc.string cimport * -from libc.limits cimport * -from libc.stdint cimport int8_t - -from msgpack.exceptions import PackValueError -from msgpack import ExtType - - -cdef extern from "pack.h": - struct msgpack_packer: - char* buf - size_t length - size_t buf_size - bint use_bin_type - - int msgpack_pack_int(msgpack_packer* pk, int d) - int msgpack_pack_nil(msgpack_packer* pk) - int msgpack_pack_true(msgpack_packer* pk) - int msgpack_pack_false(msgpack_packer* pk) - int msgpack_pack_long(msgpack_packer* pk, long d) - int msgpack_pack_long_long(msgpack_packer* pk, long long d) - int msgpack_pack_unsigned_long_long(msgpack_packer* pk, unsigned long long d) - int msgpack_pack_float(msgpack_packer* pk, float d) - int msgpack_pack_double(msgpack_packer* pk, double d) - int msgpack_pack_array(msgpack_packer* pk, size_t l) - int msgpack_pack_map(msgpack_packer* pk, size_t l) - int msgpack_pack_raw(msgpack_packer* pk, size_t l) - int msgpack_pack_bin(msgpack_packer* pk, size_t l) - int msgpack_pack_raw_body(msgpack_packer* pk, char* body, size_t l) - int msgpack_pack_ext(msgpack_packer* pk, int8_t typecode, size_t l) - -cdef int DEFAULT_RECURSE_LIMIT=511 - - -cdef class Packer(object): - """ - MessagePack Packer - - usage:: - - packer = Packer() - astream.write(packer.pack(a)) - astream.write(packer.pack(b)) - - Packer's constructor has some keyword arguments: - - :param callable default: - Convert user type to builtin type that Packer supports. - See also simplejson's document. 
- :param str encoding: - Convert unicode to bytes with this encoding. (default: 'utf-8') - :param str unicode_errors: - Error handler for encoding unicode. (default: 'strict') - :param bool use_single_float: - Use single precision float type for float. (default: False) - :param bool autoreset: - Reset buffer after each pack and return it's content as `bytes`. (default: True). - If set this to false, use `bytes()` to get content and `.reset()` to clear buffer. - :param bool use_bin_type: - Use bin type introduced in msgpack spec 2.0 for bytes. - It also enable str8 type for unicode. - """ - cdef msgpack_packer pk - cdef object _default - cdef object _bencoding - cdef object _berrors - cdef char *encoding - cdef char *unicode_errors - cdef bool use_float - cdef bint autoreset - - def __cinit__(self): - cdef int buf_size = 1024*1024 - self.pk.buf = malloc(buf_size); - if self.pk.buf == NULL: - raise MemoryError("Unable to allocate internal buffer.") - self.pk.buf_size = buf_size - self.pk.length = 0 - - def __init__(self, default=None, encoding='utf-8', unicode_errors='strict', - use_single_float=False, bint autoreset=1, bint use_bin_type=0): - """ - """ - self.use_float = use_single_float - self.autoreset = autoreset - self.pk.use_bin_type = use_bin_type - if default is not None: - if not PyCallable_Check(default): - raise TypeError("default must be a callable.") - self._default = default - if encoding is None: - self.encoding = NULL - self.unicode_errors = NULL - else: - if isinstance(encoding, unicode): - self._bencoding = encoding.encode('ascii') - else: - self._bencoding = encoding - self.encoding = PyBytes_AsString(self._bencoding) - if isinstance(unicode_errors, unicode): - self._berrors = unicode_errors.encode('ascii') - else: - self._berrors = unicode_errors - self.unicode_errors = PyBytes_AsString(self._berrors) - - def __dealloc__(self): - free(self.pk.buf); - - cdef int _pack(self, object o, int nest_limit=DEFAULT_RECURSE_LIMIT) except -1: - cdef long long 
llval - cdef unsigned long long ullval - cdef long longval - cdef float fval - cdef double dval - cdef char* rawval - cdef int ret - cdef dict d - cdef size_t L - cdef int default_used = 0 - - if nest_limit < 0: - raise PackValueError("recursion limit exceeded.") - - while True: - if o is None: - ret = msgpack_pack_nil(&self.pk) - elif isinstance(o, bool): - if o: - ret = msgpack_pack_true(&self.pk) - else: - ret = msgpack_pack_false(&self.pk) - elif PyLong_Check(o): - # PyInt_Check(long) is True for Python 3. - # Sow we should test long before int. - if o > 0: - ullval = o - ret = msgpack_pack_unsigned_long_long(&self.pk, ullval) - else: - llval = o - ret = msgpack_pack_long_long(&self.pk, llval) - elif PyInt_Check(o): - longval = o - ret = msgpack_pack_long(&self.pk, longval) - elif PyFloat_Check(o): - if self.use_float: - fval = o - ret = msgpack_pack_float(&self.pk, fval) - else: - dval = o - ret = msgpack_pack_double(&self.pk, dval) - elif PyBytes_Check(o): - L = len(o) - if L > (2**32)-1: - raise ValueError("bytes is too large") - rawval = o - ret = msgpack_pack_bin(&self.pk, L) - if ret == 0: - ret = msgpack_pack_raw_body(&self.pk, rawval, L) - elif PyUnicode_Check(o): - if not self.encoding: - raise TypeError("Can't encode unicode string: no encoding is specified") - o = PyUnicode_AsEncodedString(o, self.encoding, self.unicode_errors) - L = len(o) - if L > (2**32)-1: - raise ValueError("dict is too large") - rawval = o - ret = msgpack_pack_raw(&self.pk, len(o)) - if ret == 0: - ret = msgpack_pack_raw_body(&self.pk, rawval, len(o)) - elif PyDict_CheckExact(o): - d = o - L = len(d) - if L > (2**32)-1: - raise ValueError("dict is too large") - ret = msgpack_pack_map(&self.pk, L) - if ret == 0: - for k, v in d.iteritems(): - ret = self._pack(k, nest_limit-1) - if ret != 0: break - ret = self._pack(v, nest_limit-1) - if ret != 0: break - elif PyDict_Check(o): - L = len(o) - if L > (2**32)-1: - raise ValueError("dict is too large") - ret = 
msgpack_pack_map(&self.pk, L) - if ret == 0: - for k, v in o.items(): - ret = self._pack(k, nest_limit-1) - if ret != 0: break - ret = self._pack(v, nest_limit-1) - if ret != 0: break - elif isinstance(o, ExtType): - # This should be before Tuple because ExtType is namedtuple. - longval = o.code - rawval = o.data - L = len(o.data) - if L > (2**32)-1: - raise ValueError("EXT data is too large") - ret = msgpack_pack_ext(&self.pk, longval, L) - ret = msgpack_pack_raw_body(&self.pk, rawval, L) - elif PyTuple_Check(o) or PyList_Check(o): - L = len(o) - if L > (2**32)-1: - raise ValueError("list is too large") - ret = msgpack_pack_array(&self.pk, L) - if ret == 0: - for v in o: - ret = self._pack(v, nest_limit-1) - if ret != 0: break - elif not default_used and self._default: - o = self._default(o) - default_used = 1 - continue - else: - raise TypeError("can't serialize %r" % (o,)) - return ret - - cpdef pack(self, object obj): - cdef int ret - ret = self._pack(obj, DEFAULT_RECURSE_LIMIT) - if ret == -1: - raise MemoryError - elif ret: # should not happen. 
- raise TypeError - if self.autoreset: - buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) - self.pk.length = 0 - return buf - - def pack_ext_type(self, typecode, data): - msgpack_pack_ext(&self.pk, typecode, len(data)) - msgpack_pack_raw_body(&self.pk, data, len(data)) - - def pack_array_header(self, size_t size): - if size > (2**32-1): - raise ValueError - cdef int ret = msgpack_pack_array(&self.pk, size) - if ret == -1: - raise MemoryError - elif ret: # should not happen - raise TypeError - if self.autoreset: - buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) - self.pk.length = 0 - return buf - - def pack_map_header(self, size_t size): - if size > (2**32-1): - raise ValueError - cdef int ret = msgpack_pack_map(&self.pk, size) - if ret == -1: - raise MemoryError - elif ret: # should not happen - raise TypeError - if self.autoreset: - buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) - self.pk.length = 0 - return buf - - def pack_map_pairs(self, object pairs): - """ - Pack *pairs* as msgpack map type. - - *pairs* should sequence of pair. - (`len(pairs)` and `for k, v in pairs:` should be supported.) 
- """ - cdef int ret = msgpack_pack_map(&self.pk, len(pairs)) - if ret == 0: - for k, v in pairs: - ret = self._pack(k) - if ret != 0: break - ret = self._pack(v) - if ret != 0: break - if ret == -1: - raise MemoryError - elif ret: # should not happen - raise TypeError - if self.autoreset: - buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) - self.pk.length = 0 - return buf - - def reset(self): - """Clear internal buffer.""" - self.pk.length = 0 - - def bytes(self): - """Return buffer content.""" - return PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) diff --git a/utils/exporters/blender/modules/msgpack/_unpacker.pyx b/utils/exporters/blender/modules/msgpack/_unpacker.pyx deleted file mode 100644 index 16de40fba5fb0be34bab66c1e17a6084d4e9b5d3..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/_unpacker.pyx +++ /dev/null @@ -1,426 +0,0 @@ -# coding: utf-8 -#cython: embedsignature=True - -from cpython cimport * -cdef extern from "Python.h": - ctypedef struct PyObject - cdef int PyObject_AsReadBuffer(object o, const void** buff, Py_ssize_t* buf_len) except -1 - -from libc.stdlib cimport * -from libc.string cimport * -from libc.limits cimport * - -from msgpack.exceptions import ( - BufferFull, - OutOfData, - UnpackValueError, - ExtraData, - ) -from msgpack import ExtType - - -cdef extern from "unpack.h": - ctypedef struct msgpack_user: - bint use_list - PyObject* object_hook - bint has_pairs_hook # call object_hook with k-v pairs - PyObject* list_hook - PyObject* ext_hook - char *encoding - char *unicode_errors - - ctypedef struct unpack_context: - msgpack_user user - PyObject* obj - size_t count - - ctypedef int (*execute_fn)(unpack_context* ctx, const char* data, - size_t len, size_t* off) except? 
-1 - execute_fn unpack_construct - execute_fn unpack_skip - execute_fn read_array_header - execute_fn read_map_header - void unpack_init(unpack_context* ctx) - object unpack_data(unpack_context* ctx) - -cdef inline init_ctx(unpack_context *ctx, - object object_hook, object object_pairs_hook, - object list_hook, object ext_hook, - bint use_list, char* encoding, char* unicode_errors): - unpack_init(ctx) - ctx.user.use_list = use_list - ctx.user.object_hook = ctx.user.list_hook = NULL - - if object_hook is not None and object_pairs_hook is not None: - raise TypeError("object_pairs_hook and object_hook are mutually exclusive.") - - if object_hook is not None: - if not PyCallable_Check(object_hook): - raise TypeError("object_hook must be a callable.") - ctx.user.object_hook = object_hook - - if object_pairs_hook is None: - ctx.user.has_pairs_hook = False - else: - if not PyCallable_Check(object_pairs_hook): - raise TypeError("object_pairs_hook must be a callable.") - ctx.user.object_hook = object_pairs_hook - ctx.user.has_pairs_hook = True - - if list_hook is not None: - if not PyCallable_Check(list_hook): - raise TypeError("list_hook must be a callable.") - ctx.user.list_hook = list_hook - - if ext_hook is not None: - if not PyCallable_Check(ext_hook): - raise TypeError("ext_hook must be a callable.") - ctx.user.ext_hook = ext_hook - - ctx.user.encoding = encoding - ctx.user.unicode_errors = unicode_errors - -def default_read_extended_type(typecode, data): - raise NotImplementedError("Cannot decode extended type with typecode=%d" % typecode) - -def unpackb(object packed, object object_hook=None, object list_hook=None, - bint use_list=1, encoding=None, unicode_errors="strict", - object_pairs_hook=None, ext_hook=ExtType): - """ - Unpack packed_bytes to object. Returns an unpacked object. - - Raises `ValueError` when `packed` contains extra bytes. - - See :class:`Unpacker` for options. 
- """ - cdef unpack_context ctx - cdef size_t off = 0 - cdef int ret - - cdef char* buf - cdef Py_ssize_t buf_len - cdef char* cenc = NULL - cdef char* cerr = NULL - - PyObject_AsReadBuffer(packed, &buf, &buf_len) - - if encoding is not None: - if isinstance(encoding, unicode): - encoding = encoding.encode('ascii') - cenc = PyBytes_AsString(encoding) - - if unicode_errors is not None: - if isinstance(unicode_errors, unicode): - unicode_errors = unicode_errors.encode('ascii') - cerr = PyBytes_AsString(unicode_errors) - - init_ctx(&ctx, object_hook, object_pairs_hook, list_hook, ext_hook, - use_list, cenc, cerr) - ret = unpack_construct(&ctx, buf, buf_len, &off) - if ret == 1: - obj = unpack_data(&ctx) - if off < buf_len: - raise ExtraData(obj, PyBytes_FromStringAndSize(buf+off, buf_len-off)) - return obj - else: - raise UnpackValueError("Unpack failed: error = %d" % (ret,)) - - -def unpack(object stream, object object_hook=None, object list_hook=None, - bint use_list=1, encoding=None, unicode_errors="strict", - object_pairs_hook=None, - ): - """ - Unpack an object from `stream`. - - Raises `ValueError` when `stream` has extra bytes. - - See :class:`Unpacker` for options. - """ - return unpackb(stream.read(), use_list=use_list, - object_hook=object_hook, object_pairs_hook=object_pairs_hook, list_hook=list_hook, - encoding=encoding, unicode_errors=unicode_errors, - ) - - -cdef class Unpacker(object): - """ - Streaming unpacker. - - arguments: - - :param file_like: - File-like object having `.read(n)` method. - If specified, unpacker reads serialized data from it and :meth:`feed()` is not usable. - - :param int read_size: - Used as `file_like.read(read_size)`. (default: `min(1024**2, max_buffer_size)`) - - :param bool use_list: - If true, unpack msgpack array to Python list. - Otherwise, unpack to Python tuple. (default: True) - - :param callable object_hook: - When specified, it should be callable. - Unpacker calls it with a dict argument after unpacking msgpack map. 
- (See also simplejson) - - :param callable object_pairs_hook: - When specified, it should be callable. - Unpacker calls it with a list of key-value pairs after unpacking msgpack map. - (See also simplejson) - - :param str encoding: - Encoding used for decoding msgpack raw. - If it is None (default), msgpack raw is deserialized to Python bytes. - - :param str unicode_errors: - Used for decoding msgpack raw with *encoding*. - (default: `'strict'`) - - :param int max_buffer_size: - Limits size of data waiting unpacked. 0 means system's INT_MAX (default). - Raises `BufferFull` exception when it is insufficient. - You shoud set this parameter when unpacking data from untrasted source. - - example of streaming deserialize from file-like object:: - - unpacker = Unpacker(file_like) - for o in unpacker: - process(o) - - example of streaming deserialize from socket:: - - unpacker = Unpacker() - while True: - buf = sock.recv(1024**2) - if not buf: - break - unpacker.feed(buf) - for o in unpacker: - process(o) - """ - cdef unpack_context ctx - cdef char* buf - cdef size_t buf_size, buf_head, buf_tail - cdef object file_like - cdef object file_like_read - cdef Py_ssize_t read_size - # To maintain refcnt. 
- cdef object object_hook, object_pairs_hook, list_hook, ext_hook - cdef object encoding, unicode_errors - cdef size_t max_buffer_size - - def __cinit__(self): - self.buf = NULL - - def __dealloc__(self): - free(self.buf) - self.buf = NULL - - def __init__(self, file_like=None, Py_ssize_t read_size=0, bint use_list=1, - object object_hook=None, object object_pairs_hook=None, object list_hook=None, - str encoding=None, str unicode_errors='strict', int max_buffer_size=0, - object ext_hook=ExtType): - cdef char *cenc=NULL, - cdef char *cerr=NULL - - self.object_hook = object_hook - self.object_pairs_hook = object_pairs_hook - self.list_hook = list_hook - self.ext_hook = ext_hook - - self.file_like = file_like - if file_like: - self.file_like_read = file_like.read - if not PyCallable_Check(self.file_like_read): - raise TypeError("`file_like.read` must be a callable.") - if not max_buffer_size: - max_buffer_size = INT_MAX - if read_size > max_buffer_size: - raise ValueError("read_size should be less or equal to max_buffer_size") - if not read_size: - read_size = min(max_buffer_size, 1024**2) - self.max_buffer_size = max_buffer_size - self.read_size = read_size - self.buf = malloc(read_size) - if self.buf == NULL: - raise MemoryError("Unable to allocate internal buffer.") - self.buf_size = read_size - self.buf_head = 0 - self.buf_tail = 0 - - if encoding is not None: - if isinstance(encoding, unicode): - self.encoding = encoding.encode('ascii') - else: - self.encoding = encoding - cenc = PyBytes_AsString(self.encoding) - - if unicode_errors is not None: - if isinstance(unicode_errors, unicode): - self.unicode_errors = unicode_errors.encode('ascii') - else: - self.unicode_errors = unicode_errors - cerr = PyBytes_AsString(self.unicode_errors) - - init_ctx(&self.ctx, object_hook, object_pairs_hook, list_hook, - ext_hook, use_list, cenc, cerr) - - def feed(self, object next_bytes): - """Append `next_bytes` to internal buffer.""" - cdef Py_buffer pybuff - if self.file_like is 
not None: - raise AssertionError( - "unpacker.feed() is not be able to use with `file_like`.") - PyObject_GetBuffer(next_bytes, &pybuff, PyBUF_SIMPLE) - try: - self.append_buffer(pybuff.buf, pybuff.len) - finally: - PyBuffer_Release(&pybuff) - - cdef append_buffer(self, void* _buf, Py_ssize_t _buf_len): - cdef: - char* buf = self.buf - char* new_buf - size_t head = self.buf_head - size_t tail = self.buf_tail - size_t buf_size = self.buf_size - size_t new_size - - if tail + _buf_len > buf_size: - if ((tail - head) + _buf_len) <= buf_size: - # move to front. - memmove(buf, buf + head, tail - head) - tail -= head - head = 0 - else: - # expand buffer. - new_size = (tail-head) + _buf_len - if new_size > self.max_buffer_size: - raise BufferFull - new_size = min(new_size*2, self.max_buffer_size) - new_buf = malloc(new_size) - if new_buf == NULL: - # self.buf still holds old buffer and will be freed during - # obj destruction - raise MemoryError("Unable to enlarge internal buffer.") - memcpy(new_buf, buf + head, tail - head) - free(buf) - - buf = new_buf - buf_size = new_size - tail -= head - head = 0 - - memcpy(buf + tail, (_buf), _buf_len) - self.buf = buf - self.buf_head = head - self.buf_size = buf_size - self.buf_tail = tail + _buf_len - - cdef read_from_file(self): - next_bytes = self.file_like_read( - min(self.read_size, - self.max_buffer_size - (self.buf_tail - self.buf_head) - )) - if next_bytes: - self.append_buffer(PyBytes_AsString(next_bytes), PyBytes_Size(next_bytes)) - else: - self.file_like = None - - cdef object _unpack(self, execute_fn execute, object write_bytes, bint iter=0): - cdef int ret - cdef object obj - cdef size_t prev_head - - if self.buf_head >= self.buf_tail and self.file_like is not None: - self.read_from_file() - - while 1: - prev_head = self.buf_head - if prev_head >= self.buf_tail: - if iter: - raise StopIteration("No more data to unpack.") - else: - raise OutOfData("No more data to unpack.") - - ret = execute(&self.ctx, self.buf, 
self.buf_tail, &self.buf_head) - if write_bytes is not None: - write_bytes(PyBytes_FromStringAndSize(self.buf + prev_head, self.buf_head - prev_head)) - - if ret == 1: - obj = unpack_data(&self.ctx) - unpack_init(&self.ctx) - return obj - elif ret == 0: - if self.file_like is not None: - self.read_from_file() - continue - if iter: - raise StopIteration("No more data to unpack.") - else: - raise OutOfData("No more data to unpack.") - else: - raise ValueError("Unpack failed: error = %d" % (ret,)) - - def read_bytes(self, Py_ssize_t nbytes): - """read a specified number of raw bytes from the stream""" - cdef size_t nread - nread = min(self.buf_tail - self.buf_head, nbytes) - ret = PyBytes_FromStringAndSize(self.buf + self.buf_head, nread) - self.buf_head += nread - if len(ret) < nbytes and self.file_like is not None: - ret += self.file_like.read(nbytes - len(ret)) - return ret - - def unpack(self, object write_bytes=None): - """ - unpack one object - - If write_bytes is not None, it will be called with parts of the raw - message as it is unpacked. - - Raises `OutOfData` when there are no more bytes to unpack. - """ - return self._unpack(unpack_construct, write_bytes) - - def skip(self, object write_bytes=None): - """ - read and ignore one object, returning None - - If write_bytes is not None, it will be called with parts of the raw - message as it is unpacked. - - Raises `OutOfData` when there are no more bytes to unpack. - """ - return self._unpack(unpack_skip, write_bytes) - - def read_array_header(self, object write_bytes=None): - """assuming the next object is an array, return its size n, such that - the next n unpack() calls will iterate over its contents. - - Raises `OutOfData` when there are no more bytes to unpack. 
- """ - return self._unpack(read_array_header, write_bytes) - - def read_map_header(self, object write_bytes=None): - """assuming the next object is a map, return its size n, such that the - next n * 2 unpack() calls will iterate over its key-value pairs. - - Raises `OutOfData` when there are no more bytes to unpack. - """ - return self._unpack(read_map_header, write_bytes) - - def __iter__(self): - return self - - def __next__(self): - return self._unpack(unpack_construct, None, 1) - - # for debug. - #def _buf(self): - # return PyString_FromStringAndSize(self.buf, self.buf_tail) - - #def _off(self): - # return self.buf_head diff --git a/utils/exporters/blender/modules/msgpack/_version.py b/utils/exporters/blender/modules/msgpack/_version.py deleted file mode 100644 index dddfe496017440f105c0bcac6154748b9305ed6f..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/_version.py +++ /dev/null @@ -1 +0,0 @@ -version = (0, 4, 2) diff --git a/utils/exporters/blender/modules/msgpack/exceptions.py b/utils/exporters/blender/modules/msgpack/exceptions.py deleted file mode 100644 index f7678f135bd26286434ef1b1160363807e10bc12..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/exceptions.py +++ /dev/null @@ -1,29 +0,0 @@ -class UnpackException(Exception): - pass - - -class BufferFull(UnpackException): - pass - - -class OutOfData(UnpackException): - pass - - -class UnpackValueError(UnpackException, ValueError): - pass - - -class ExtraData(ValueError): - def __init__(self, unpacked, extra): - self.unpacked = unpacked - self.extra = extra - - def __str__(self): - return "unpack(b) received extra data." 
- -class PackException(Exception): - pass - -class PackValueError(PackException, ValueError): - pass diff --git a/utils/exporters/blender/modules/msgpack/fallback.py b/utils/exporters/blender/modules/msgpack/fallback.py deleted file mode 100644 index 49323e634978576eecbc48e966f71a4939a5da93..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/fallback.py +++ /dev/null @@ -1,714 +0,0 @@ -"""Fallback pure Python implementation of msgpack""" - -import sys -import array -import struct - -if sys.version_info[0] == 3: - PY3 = True - int_types = int - Unicode = str - xrange = range - def dict_iteritems(d): - return d.items() -else: - PY3 = False - int_types = (int, long) - Unicode = unicode - def dict_iteritems(d): - return d.iteritems() - - -if hasattr(sys, 'pypy_version_info'): - # cStringIO is slow on PyPy, StringIO is faster. However: PyPy's own - # StringBuilder is fastest. - from __pypy__ import newlist_hint - from __pypy__.builders import StringBuilder - USING_STRINGBUILDER = True - class StringIO(object): - def __init__(self, s=b''): - if s: - self.builder = StringBuilder(len(s)) - self.builder.append(s) - else: - self.builder = StringBuilder() - def write(self, s): - self.builder.append(s) - def getvalue(self): - return self.builder.build() -else: - USING_STRINGBUILDER = False - from io import BytesIO as StringIO - newlist_hint = lambda size: [] - -from msgpack.exceptions import ( - BufferFull, - OutOfData, - UnpackValueError, - PackValueError, - ExtraData) - -from msgpack import ExtType - - -EX_SKIP = 0 -EX_CONSTRUCT = 1 -EX_READ_ARRAY_HEADER = 2 -EX_READ_MAP_HEADER = 3 - -TYPE_IMMEDIATE = 0 -TYPE_ARRAY = 1 -TYPE_MAP = 2 -TYPE_RAW = 3 -TYPE_BIN = 4 -TYPE_EXT = 5 - -DEFAULT_RECURSE_LIMIT = 511 - - -def unpack(stream, **kwargs): - """ - Unpack an object from `stream`. - - Raises `ExtraData` when `packed` contains extra bytes. - See :class:`Unpacker` for options. 
- """ - unpacker = Unpacker(stream, **kwargs) - ret = unpacker._fb_unpack() - if unpacker._fb_got_extradata(): - raise ExtraData(ret, unpacker._fb_get_extradata()) - return ret - - -def unpackb(packed, **kwargs): - """ - Unpack an object from `packed`. - - Raises `ExtraData` when `packed` contains extra bytes. - See :class:`Unpacker` for options. - """ - unpacker = Unpacker(None, **kwargs) - unpacker.feed(packed) - try: - ret = unpacker._fb_unpack() - except OutOfData: - raise UnpackValueError("Data is not enough.") - if unpacker._fb_got_extradata(): - raise ExtraData(ret, unpacker._fb_get_extradata()) - return ret - - -class Unpacker(object): - """ - Streaming unpacker. - - `file_like` is a file-like object having a `.read(n)` method. - When `Unpacker` is initialized with a `file_like`, `.feed()` is not - usable. - - `read_size` is used for `file_like.read(read_size)`. - - If `use_list` is True (default), msgpack lists are deserialized to Python - lists. Otherwise they are deserialized to tuples. - - `object_hook` is the same as in simplejson. If it is not None, it should - be callable and Unpacker calls it with a dict argument after deserializing - a map. - - `object_pairs_hook` is the same as in simplejson. If it is not None, it - should be callable and Unpacker calls it with a list of key-value pairs - after deserializing a map. - - `ext_hook` is callback for ext (User defined) type. It called with two - arguments: (code, bytes). default: `msgpack.ExtType` - - `encoding` is the encoding used for decoding msgpack bytes. If it is - None (default), msgpack bytes are deserialized to Python bytes. - - `unicode_errors` is used for decoding bytes. - - `max_buffer_size` limits the buffer size. 0 means INT_MAX (default). - - Raises `BufferFull` exception when it is unsufficient. - - You should set this parameter when unpacking data from an untrustred source. 
- - example of streaming deserialization from file-like object:: - - unpacker = Unpacker(file_like) - for o in unpacker: - do_something(o) - - example of streaming deserialization from socket:: - - unpacker = Unpacker() - while 1: - buf = sock.recv(1024*2) - if not buf: - break - unpacker.feed(buf) - for o in unpacker: - do_something(o) - """ - - def __init__(self, file_like=None, read_size=0, use_list=True, - object_hook=None, object_pairs_hook=None, list_hook=None, - encoding=None, unicode_errors='strict', max_buffer_size=0, - ext_hook=ExtType): - if file_like is None: - self._fb_feeding = True - else: - if not callable(file_like.read): - raise TypeError("`file_like.read` must be callable") - self.file_like = file_like - self._fb_feeding = False - self._fb_buffers = [] - self._fb_buf_o = 0 - self._fb_buf_i = 0 - self._fb_buf_n = 0 - self._max_buffer_size = max_buffer_size or 2**31-1 - if read_size > self._max_buffer_size: - raise ValueError("read_size must be smaller than max_buffer_size") - self._read_size = read_size or min(self._max_buffer_size, 2048) - self._encoding = encoding - self._unicode_errors = unicode_errors - self._use_list = use_list - self._list_hook = list_hook - self._object_hook = object_hook - self._object_pairs_hook = object_pairs_hook - self._ext_hook = ext_hook - - if list_hook is not None and not callable(list_hook): - raise TypeError('`list_hook` is not callable') - if object_hook is not None and not callable(object_hook): - raise TypeError('`object_hook` is not callable') - if object_pairs_hook is not None and not callable(object_pairs_hook): - raise TypeError('`object_pairs_hook` is not callable') - if object_hook is not None and object_pairs_hook is not None: - raise TypeError("object_pairs_hook and object_hook are mutually " - "exclusive") - if not callable(ext_hook): - raise TypeError("`ext_hook` is not callable") - - def feed(self, next_bytes): - if isinstance(next_bytes, array.array): - next_bytes = next_bytes.tostring() - elif 
isinstance(next_bytes, bytearray): - next_bytes = bytes(next_bytes) - assert self._fb_feeding - if self._fb_buf_n + len(next_bytes) > self._max_buffer_size: - raise BufferFull - self._fb_buf_n += len(next_bytes) - self._fb_buffers.append(next_bytes) - - def _fb_consume(self): - self._fb_buffers = self._fb_buffers[self._fb_buf_i:] - if self._fb_buffers: - self._fb_buffers[0] = self._fb_buffers[0][self._fb_buf_o:] - self._fb_buf_o = 0 - self._fb_buf_i = 0 - self._fb_buf_n = sum(map(len, self._fb_buffers)) - - def _fb_got_extradata(self): - if self._fb_buf_i != len(self._fb_buffers): - return True - if self._fb_feeding: - return False - if not self.file_like: - return False - if self.file_like.read(1): - return True - return False - - def __iter__(self): - return self - - def read_bytes(self, n): - return self._fb_read(n) - - def _fb_rollback(self): - self._fb_buf_i = 0 - self._fb_buf_o = 0 - - def _fb_get_extradata(self): - bufs = self._fb_buffers[self._fb_buf_i:] - if bufs: - bufs[0] = bufs[0][self._fb_buf_o:] - return b''.join(bufs) - - def _fb_read(self, n, write_bytes=None): - buffs = self._fb_buffers - if (write_bytes is None and self._fb_buf_i < len(buffs) and - self._fb_buf_o + n < len(buffs[self._fb_buf_i])): - self._fb_buf_o += n - return buffs[self._fb_buf_i][self._fb_buf_o - n:self._fb_buf_o] - - ret = b'' - while len(ret) != n: - if self._fb_buf_i == len(buffs): - if self._fb_feeding: - break - tmp = self.file_like.read(self._read_size) - if not tmp: - break - buffs.append(tmp) - continue - sliced = n - len(ret) - ret += buffs[self._fb_buf_i][self._fb_buf_o:self._fb_buf_o + sliced] - self._fb_buf_o += sliced - if self._fb_buf_o >= len(buffs[self._fb_buf_i]): - self._fb_buf_o = 0 - self._fb_buf_i += 1 - if len(ret) != n: - self._fb_rollback() - raise OutOfData - if write_bytes is not None: - write_bytes(ret) - return ret - - def _read_header(self, execute=EX_CONSTRUCT, write_bytes=None): - typ = TYPE_IMMEDIATE - n = 0 - obj = None - c = self._fb_read(1, 
write_bytes) - b = ord(c) - if b & 0b10000000 == 0: - obj = b - elif b & 0b11100000 == 0b11100000: - obj = struct.unpack("b", c)[0] - elif b & 0b11100000 == 0b10100000: - n = b & 0b00011111 - obj = self._fb_read(n, write_bytes) - typ = TYPE_RAW - elif b & 0b11110000 == 0b10010000: - n = b & 0b00001111 - typ = TYPE_ARRAY - elif b & 0b11110000 == 0b10000000: - n = b & 0b00001111 - typ = TYPE_MAP - elif b == 0xc0: - obj = None - elif b == 0xc2: - obj = False - elif b == 0xc3: - obj = True - elif b == 0xc4: - typ = TYPE_BIN - n = struct.unpack("B", self._fb_read(1, write_bytes))[0] - obj = self._fb_read(n, write_bytes) - elif b == 0xc5: - typ = TYPE_BIN - n = struct.unpack(">H", self._fb_read(2, write_bytes))[0] - obj = self._fb_read(n, write_bytes) - elif b == 0xc6: - typ = TYPE_BIN - n = struct.unpack(">I", self._fb_read(4, write_bytes))[0] - obj = self._fb_read(n, write_bytes) - elif b == 0xc7: # ext 8 - typ = TYPE_EXT - L, n = struct.unpack('Bb', self._fb_read(2, write_bytes)) - obj = self._fb_read(L, write_bytes) - elif b == 0xc8: # ext 16 - typ = TYPE_EXT - L, n = struct.unpack('>Hb', self._fb_read(3, write_bytes)) - obj = self._fb_read(L, write_bytes) - elif b == 0xc9: # ext 32 - typ = TYPE_EXT - L, n = struct.unpack('>Ib', self._fb_read(5, write_bytes)) - obj = self._fb_read(L, write_bytes) - elif b == 0xca: - obj = struct.unpack(">f", self._fb_read(4, write_bytes))[0] - elif b == 0xcb: - obj = struct.unpack(">d", self._fb_read(8, write_bytes))[0] - elif b == 0xcc: - obj = struct.unpack("B", self._fb_read(1, write_bytes))[0] - elif b == 0xcd: - obj = struct.unpack(">H", self._fb_read(2, write_bytes))[0] - elif b == 0xce: - obj = struct.unpack(">I", self._fb_read(4, write_bytes))[0] - elif b == 0xcf: - obj = struct.unpack(">Q", self._fb_read(8, write_bytes))[0] - elif b == 0xd0: - obj = struct.unpack("b", self._fb_read(1, write_bytes))[0] - elif b == 0xd1: - obj = struct.unpack(">h", self._fb_read(2, write_bytes))[0] - elif b == 0xd2: - obj = struct.unpack(">i", 
self._fb_read(4, write_bytes))[0]
-        elif b == 0xd3:
-            obj = struct.unpack(">q", self._fb_read(8, write_bytes))[0]
-        elif b == 0xd4:  # fixext 1
-            typ = TYPE_EXT
-            n, obj = struct.unpack('b1s', self._fb_read(2, write_bytes))
-        elif b == 0xd5:  # fixext 2
-            typ = TYPE_EXT
-            n, obj = struct.unpack('b2s', self._fb_read(3, write_bytes))
-        elif b == 0xd6:  # fixext 4
-            typ = TYPE_EXT
-            n, obj = struct.unpack('b4s', self._fb_read(5, write_bytes))
-        elif b == 0xd7:  # fixext 8
-            typ = TYPE_EXT
-            n, obj = struct.unpack('b8s', self._fb_read(9, write_bytes))
-        elif b == 0xd8:  # fixext 16
-            typ = TYPE_EXT
-            n, obj = struct.unpack('b16s', self._fb_read(17, write_bytes))
-        elif b == 0xd9:
-            typ = TYPE_RAW
-            n = struct.unpack("B", self._fb_read(1, write_bytes))[0]
-            obj = self._fb_read(n, write_bytes)
-        elif b == 0xda:
-            typ = TYPE_RAW
-            n = struct.unpack(">H", self._fb_read(2, write_bytes))[0]
-            obj = self._fb_read(n, write_bytes)
-        elif b == 0xdb:
-            typ = TYPE_RAW
-            n = struct.unpack(">I", self._fb_read(4, write_bytes))[0]
-            obj = self._fb_read(n, write_bytes)
-        elif b == 0xdc:
-            n = struct.unpack(">H", self._fb_read(2, write_bytes))[0]
-            typ = TYPE_ARRAY
-        elif b == 0xdd:
-            n = struct.unpack(">I", self._fb_read(4, write_bytes))[0]
-            typ = TYPE_ARRAY
-        elif b == 0xde:
-            n = struct.unpack(">H", self._fb_read(2, write_bytes))[0]
-            typ = TYPE_MAP
-        elif b == 0xdf:
-            n = struct.unpack(">I", self._fb_read(4, write_bytes))[0]
-            typ = TYPE_MAP
-        else:
-            raise UnpackValueError("Unknown header: 0x%x" % b)
-        return typ, n, obj
-
-    def _fb_unpack(self, execute=EX_CONSTRUCT, write_bytes=None):
-        typ, n, obj = self._read_header(execute, write_bytes)
-
-        if execute == EX_READ_ARRAY_HEADER:
-            if typ != TYPE_ARRAY:
-                raise UnpackValueError("Expected array")
-            return n
-        if execute == EX_READ_MAP_HEADER:
-            if typ != TYPE_MAP:
-                raise UnpackValueError("Expected map")
-            return n
-        # TODO should we eliminate the recursion?
-        if typ == TYPE_ARRAY:
-            if execute == EX_SKIP:
-                for i in xrange(n):
-                    # TODO check whether we need to call `list_hook`
-                    self._fb_unpack(EX_SKIP, write_bytes)
-                return
-            ret = newlist_hint(n)
-            for i in xrange(n):
-                ret.append(self._fb_unpack(EX_CONSTRUCT, write_bytes))
-            if self._list_hook is not None:
-                ret = self._list_hook(ret)
-            # TODO is the interaction between `list_hook` and `use_list` ok?
-            return ret if self._use_list else tuple(ret)
-        if typ == TYPE_MAP:
-            if execute == EX_SKIP:
-                for i in xrange(n):
-                    # TODO check whether we need to call hooks
-                    self._fb_unpack(EX_SKIP, write_bytes)
-                    self._fb_unpack(EX_SKIP, write_bytes)
-                return
-            if self._object_pairs_hook is not None:
-                ret = self._object_pairs_hook(
-                    (self._fb_unpack(EX_CONSTRUCT, write_bytes),
-                     self._fb_unpack(EX_CONSTRUCT, write_bytes))
-                    for _ in xrange(n))
-            else:
-                ret = {}
-                for _ in xrange(n):
-                    key = self._fb_unpack(EX_CONSTRUCT, write_bytes)
-                    ret[key] = self._fb_unpack(EX_CONSTRUCT, write_bytes)
-                if self._object_hook is not None:
-                    ret = self._object_hook(ret)
-            return ret
-        if execute == EX_SKIP:
-            return
-        if typ == TYPE_RAW:
-            if self._encoding is not None:
-                obj = obj.decode(self._encoding, self._unicode_errors)
-            return obj
-        if typ == TYPE_EXT:
-            return self._ext_hook(n, obj)
-        if typ == TYPE_BIN:
-            return obj
-        assert typ == TYPE_IMMEDIATE
-        return obj
-
-    def next(self):
-        try:
-            ret = self._fb_unpack(EX_CONSTRUCT, None)
-            self._fb_consume()
-            return ret
-        except OutOfData:
-            raise StopIteration
-    __next__ = next
-
-    def skip(self, write_bytes=None):
-        self._fb_unpack(EX_SKIP, write_bytes)
-        self._fb_consume()
-
-    def unpack(self, write_bytes=None):
-        ret = self._fb_unpack(EX_CONSTRUCT, write_bytes)
-        self._fb_consume()
-        return ret
-
-    def read_array_header(self, write_bytes=None):
-        ret = self._fb_unpack(EX_READ_ARRAY_HEADER, write_bytes)
-        self._fb_consume()
-        return ret
-
-    def read_map_header(self, write_bytes=None):
-        ret = self._fb_unpack(EX_READ_MAP_HEADER, write_bytes)
-        self._fb_consume()
-        return ret
-
-
-class Packer(object):
-    """
-    MessagePack Packer
-
-    usage:
-
-        packer = Packer()
-        astream.write(packer.pack(a))
-        astream.write(packer.pack(b))
-
-    Packer's constructor has some keyword arguments:
-
-    :param callable default:
-        Convert user type to builtin type that Packer supports.
-        See also simplejson's document.
-    :param str encoding:
-        Convert unicode to bytes with this encoding. (default: 'utf-8')
-    :param str unicode_errors:
-        Error handler for encoding unicode. (default: 'strict')
-    :param bool use_single_float:
-        Use single precision float type for float. (default: False)
-    :param bool autoreset:
-        Reset buffer after each pack and return it's content as `bytes`. (default: True).
-        If set this to false, use `bytes()` to get content and `.reset()` to clear buffer.
-    :param bool use_bin_type:
-        Use bin type introduced in msgpack spec 2.0 for bytes.
-        It also enable str8 type for unicode.
-    """
-    def __init__(self, default=None, encoding='utf-8', unicode_errors='strict',
-                 use_single_float=False, autoreset=True, use_bin_type=False):
-        self._use_float = use_single_float
-        self._autoreset = autoreset
-        self._use_bin_type = use_bin_type
-        self._encoding = encoding
-        self._unicode_errors = unicode_errors
-        self._buffer = StringIO()
-        if default is not None:
-            if not callable(default):
-                raise TypeError("default must be callable")
-        self._default = default
-
-    def _pack(self, obj, nest_limit=DEFAULT_RECURSE_LIMIT, isinstance=isinstance):
-        default_used = False
-        while True:
-            if nest_limit < 0:
-                raise PackValueError("recursion limit exceeded")
-            if obj is None:
-                return self._buffer.write(b"\xc0")
-            if isinstance(obj, bool):
-                if obj:
-                    return self._buffer.write(b"\xc3")
-                return self._buffer.write(b"\xc2")
-            if isinstance(obj, int_types):
-                if 0 <= obj < 0x80:
-                    return self._buffer.write(struct.pack("B", obj))
-                if -0x20 <= obj < 0:
-                    return self._buffer.write(struct.pack("b", obj))
-                if 0x80 <= obj <= 0xff:
-                    return self._buffer.write(struct.pack("BB", 0xcc, obj))
-                if -0x80 <= obj < 0:
-                    return self._buffer.write(struct.pack(">Bb", 0xd0, obj))
-                if 0xff < obj <= 0xffff:
-                    return self._buffer.write(struct.pack(">BH", 0xcd, obj))
-                if -0x8000 <= obj < -0x80:
-                    return self._buffer.write(struct.pack(">Bh", 0xd1, obj))
-                if 0xffff < obj <= 0xffffffff:
-                    return self._buffer.write(struct.pack(">BI", 0xce, obj))
-                if -0x80000000 <= obj < -0x8000:
-                    return self._buffer.write(struct.pack(">Bi", 0xd2, obj))
-                if 0xffffffff < obj <= 0xffffffffffffffff:
-                    return self._buffer.write(struct.pack(">BQ", 0xcf, obj))
-                if -0x8000000000000000 <= obj < -0x80000000:
-                    return self._buffer.write(struct.pack(">Bq", 0xd3, obj))
-                raise PackValueError("Integer value out of range")
-            if self._use_bin_type and isinstance(obj, bytes):
-                n = len(obj)
-                if n <= 0xff:
-                    self._buffer.write(struct.pack('>BB', 0xc4, n))
-                elif n <= 0xffff:
-                    self._buffer.write(struct.pack(">BH", 0xc5, n))
-                elif n <= 0xffffffff:
-                    self._buffer.write(struct.pack(">BI", 0xc6, n))
-                else:
-                    raise PackValueError("Bytes is too large")
-                return self._buffer.write(obj)
-            if isinstance(obj, (Unicode, bytes)):
-                if isinstance(obj, Unicode):
-                    if self._encoding is None:
-                        raise TypeError(
-                            "Can't encode unicode string: "
-                            "no encoding is specified")
-                    obj = obj.encode(self._encoding, self._unicode_errors)
-                n = len(obj)
-                if n <= 0x1f:
-                    self._buffer.write(struct.pack('B', 0xa0 + n))
-                elif self._use_bin_type and n <= 0xff:
-                    self._buffer.write(struct.pack('>BB', 0xd9, n))
-                elif n <= 0xffff:
-                    self._buffer.write(struct.pack(">BH", 0xda, n))
-                elif n <= 0xffffffff:
-                    self._buffer.write(struct.pack(">BI", 0xdb, n))
-                else:
-                    raise PackValueError("String is too large")
-                return self._buffer.write(obj)
-            if isinstance(obj, float):
-                if self._use_float:
-                    return self._buffer.write(struct.pack(">Bf", 0xca, obj))
-                return self._buffer.write(struct.pack(">Bd", 0xcb, obj))
-            if isinstance(obj, ExtType):
-                code = obj.code
-                data = obj.data
-                assert isinstance(code, int)
-                assert isinstance(data, bytes)
-                L = len(data)
-                if L == 1:
-                    self._buffer.write(b'\xd4')
-                elif L == 2:
-                    self._buffer.write(b'\xd5')
-                elif L == 4:
-                    self._buffer.write(b'\xd6')
-                elif L == 8:
-                    self._buffer.write(b'\xd7')
-                elif L == 16:
-                    self._buffer.write(b'\xd8')
-                elif L <= 0xff:
-                    self._buffer.write(struct.pack(">BB", 0xc7, L))
-                elif L <= 0xffff:
-                    self._buffer.write(struct.pack(">BH", 0xc8, L))
-                else:
-                    self._buffer.write(struct.pack(">BI", 0xc9, L))
-                self._buffer.write(struct.pack("b", code))
-                self._buffer.write(data)
-                return
-            if isinstance(obj, (list, tuple)):
-                n = len(obj)
-                self._fb_pack_array_header(n)
-                for i in xrange(n):
-                    self._pack(obj[i], nest_limit - 1)
-                return
-            if isinstance(obj, dict):
-                return self._fb_pack_map_pairs(len(obj), dict_iteritems(obj),
-                                               nest_limit - 1)
-            if not default_used and self._default is not None:
-                obj = self._default(obj)
-                default_used = 1
-                continue
-            raise TypeError("Cannot serialize %r" % obj)
-
-    def pack(self, obj):
-        self._pack(obj)
-        ret = self._buffer.getvalue()
-        if self._autoreset:
-            self._buffer = StringIO()
-        elif USING_STRINGBUILDER:
-            self._buffer = StringIO(ret)
-        return ret
-
-    def pack_map_pairs(self, pairs):
-        self._fb_pack_map_pairs(len(pairs), pairs)
-        ret = self._buffer.getvalue()
-        if self._autoreset:
-            self._buffer = StringIO()
-        elif USING_STRINGBUILDER:
-            self._buffer = StringIO(ret)
-        return ret
-
-    def pack_array_header(self, n):
-        if n >= 2**32:
-            raise ValueError
-        self._fb_pack_array_header(n)
-        ret = self._buffer.getvalue()
-        if self._autoreset:
-            self._buffer = StringIO()
-        elif USING_STRINGBUILDER:
-            self._buffer = StringIO(ret)
-        return ret
-
-    def pack_map_header(self, n):
-        if n >= 2**32:
-            raise ValueError
-        self._fb_pack_map_header(n)
-        ret = self._buffer.getvalue()
-        if self._autoreset:
-            self._buffer = StringIO()
-        elif USING_STRINGBUILDER:
-            self._buffer = StringIO(ret)
-        return ret
-
-    def pack_ext_type(self, typecode, data):
-        if not isinstance(typecode, int):
-            raise TypeError("typecode must have int type.")
-        if not 0 <= typecode <= 127:
-            raise ValueError("typecode should be 0-127")
-        if not isinstance(data, bytes):
-            raise TypeError("data must have bytes type")
-        L = len(data)
-        if L > 0xffffffff:
-            raise ValueError("Too large data")
-        if L == 1:
-            self._buffer.write(b'\xd4')
-        elif L == 2:
-            self._buffer.write(b'\xd5')
-        elif L == 4:
-            self._buffer.write(b'\xd6')
-        elif L == 8:
-            self._buffer.write(b'\xd7')
-        elif L == 16:
-            self._buffer.write(b'\xd8')
-        elif L <= 0xff:
-            self._buffer.write(b'\xc7' + struct.pack('B', L))
-        elif L <= 0xffff:
-            self._buffer.write(b'\xc8' + struct.pack('>H', L))
-        else:
-            self._buffer.write(b'\xc9' + struct.pack('>I', L))
-        self._buffer.write(struct.pack('B', typecode))
-        self._buffer.write(data)
-
-    def _fb_pack_array_header(self, n):
-        if n <= 0x0f:
-            return self._buffer.write(struct.pack('B', 0x90 + n))
-        if n <= 0xffff:
-            return self._buffer.write(struct.pack(">BH", 0xdc, n))
-        if n <= 0xffffffff:
-            return self._buffer.write(struct.pack(">BI", 0xdd, n))
-        raise PackValueError("Array is too large")
-
-    def _fb_pack_map_header(self, n):
-        if n <= 0x0f:
-            return self._buffer.write(struct.pack('B', 0x80 + n))
-        if n <= 0xffff:
-            return self._buffer.write(struct.pack(">BH", 0xde, n))
-        if n <= 0xffffffff:
-            return self._buffer.write(struct.pack(">BI", 0xdf, n))
-        raise PackValueError("Dict is too large")
-
-    def _fb_pack_map_pairs(self, n, pairs, nest_limit=DEFAULT_RECURSE_LIMIT):
-        self._fb_pack_map_header(n)
-        for (k, v) in pairs:
-            self._pack(k, nest_limit - 1)
-            self._pack(v, nest_limit - 1)
-
-    def bytes(self):
-        return self._buffer.getvalue()
-
-    def reset(self):
-        self._buffer = StringIO()
diff --git a/utils/exporters/blender/modules/msgpack/pack.h b/utils/exporters/blender/modules/msgpack/pack.h
deleted file mode 100644
index a71c87b154a2fac208b7047e87884bcf58bc65e2..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/modules/msgpack/pack.h
+++ /dev/null
@@ -1,103 +0,0 @@
-/*
- * MessagePack for Python packing routine
- *
- * Copyright (C) 2009 Naoki INADA
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-#include <stddef.h>
-#include <stdlib.h>
-#include "sysdep.h"
-#include <limits.h>
-#include <string.h>
-
-#ifdef __cplusplus
-extern "C" {
-#endif
-
-#ifdef _MSC_VER
-#define inline __inline
-#endif
-
-typedef struct msgpack_packer {
-    char *buf;
-    size_t length;
-    size_t buf_size;
-    bool use_bin_type;
-} msgpack_packer;
-
-typedef struct Packer Packer;
-
-static inline int msgpack_pack_int(msgpack_packer* pk, int d);
-static inline int msgpack_pack_long(msgpack_packer* pk, long d);
-static inline int msgpack_pack_long_long(msgpack_packer* pk, long long d);
-static inline int msgpack_pack_unsigned_short(msgpack_packer* pk, unsigned short d);
-static inline int msgpack_pack_unsigned_int(msgpack_packer* pk, unsigned int d);
-static inline int msgpack_pack_unsigned_long(msgpack_packer* pk, unsigned long d);
-//static inline int msgpack_pack_unsigned_long_long(msgpack_packer* pk, unsigned long long d);
-
-static inline int msgpack_pack_uint8(msgpack_packer* pk, uint8_t d);
-static inline int msgpack_pack_uint16(msgpack_packer* pk, uint16_t d);
-static inline int msgpack_pack_uint32(msgpack_packer* pk, uint32_t d);
-static inline int msgpack_pack_uint64(msgpack_packer* pk, uint64_t d);
-static inline int msgpack_pack_int8(msgpack_packer* pk, int8_t d);
-static inline int msgpack_pack_int16(msgpack_packer* pk, int16_t d);
-static inline int msgpack_pack_int32(msgpack_packer* pk, int32_t d);
-static inline int msgpack_pack_int64(msgpack_packer* pk, int64_t d);
-
-static inline int msgpack_pack_float(msgpack_packer* pk, float d);
-static inline int msgpack_pack_double(msgpack_packer* pk, double d);
-
-static inline int msgpack_pack_nil(msgpack_packer* pk);
-static inline int msgpack_pack_true(msgpack_packer* pk);
-static inline int msgpack_pack_false(msgpack_packer* pk);
-
-static inline int msgpack_pack_array(msgpack_packer* pk, unsigned int n);
-
-static inline int msgpack_pack_map(msgpack_packer* pk, unsigned int n);
-
-static inline int msgpack_pack_raw(msgpack_packer* pk, size_t l);
-static inline int msgpack_pack_bin(msgpack_packer* pk, size_t l);
-static inline int msgpack_pack_raw_body(msgpack_packer* pk, const void* b, size_t l);
-
-static inline int msgpack_pack_ext(msgpack_packer* pk, int8_t typecode, size_t l);
-
-static inline int msgpack_pack_write(msgpack_packer* pk, const char *data, size_t l)
-{
-    char* buf = pk->buf;
-    size_t bs = pk->buf_size;
-    size_t len = pk->length;
-
-    if (len + l > bs) {
-        bs = (len + l) * 2;
-        buf = (char*)realloc(buf, bs);
-        if (!buf) return -1;
-    }
-    memcpy(buf + len, data, l);
-    len += l;
-
-    pk->buf = buf;
-    pk->buf_size = bs;
-    pk->length = len;
-    return 0;
-}
-
-#define msgpack_pack_append_buffer(user, buf, len) \
-    return msgpack_pack_write(user, (const char*)buf, len)
-
-#include "pack_template.h"
-
-#ifdef __cplusplus
-}
-#endif
diff --git a/utils/exporters/blender/modules/msgpack/pack_template.h b/utils/exporters/blender/modules/msgpack/pack_template.h
deleted file mode 100644
index 2879bbdbaecc6d54feeb16af4d6f70688bc4c48c..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/modules/msgpack/pack_template.h
+++ /dev/null
@@ -1,785 +0,0 @@
-/*
- * MessagePack packing routine template
- *
- * Copyright (C) 2008-2010 FURUHASHI Sadayuki
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-#if defined(__LITTLE_ENDIAN__)
-#define TAKE8_8(d)  ((uint8_t*)&d)[0]
-#define TAKE8_16(d) ((uint8_t*)&d)[0]
-#define TAKE8_32(d) ((uint8_t*)&d)[0]
-#define TAKE8_64(d) ((uint8_t*)&d)[0]
-#elif defined(__BIG_ENDIAN__)
-#define TAKE8_8(d)  ((uint8_t*)&d)[0]
-#define TAKE8_16(d) ((uint8_t*)&d)[1]
-#define TAKE8_32(d) ((uint8_t*)&d)[3]
-#define TAKE8_64(d) ((uint8_t*)&d)[7]
-#endif
-
-#ifndef msgpack_pack_append_buffer
-#error msgpack_pack_append_buffer callback is not defined
-#endif
-
-
-/*
- * Integer
- */
-
-#define msgpack_pack_real_uint8(x, d) \
-do { \
-    if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1); \
-    } else { \
-        /* unsigned 8 */ \
-        unsigned char buf[2] = {0xcc, TAKE8_8(d)}; \
-        msgpack_pack_append_buffer(x, buf, 2); \
-    } \
-} while(0)
-
-#define msgpack_pack_real_uint16(x, d) \
-do { \
-    if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_16(d), 1); \
-    } else if(d < (1<<8)) { \
-        /* unsigned 8 */ \
-        unsigned char buf[2] = {0xcc, TAKE8_16(d)}; \
-        msgpack_pack_append_buffer(x, buf, 2); \
-    } else { \
-        /* unsigned 16 */ \
-        unsigned char buf[3]; \
-        buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-        msgpack_pack_append_buffer(x, buf, 3); \
-    } \
-} while(0)
-
-#define msgpack_pack_real_uint32(x, d) \
-do { \
-    if(d < (1<<8)) { \
-        if(d < (1<<7)) { \
-            /* fixnum */ \
-            msgpack_pack_append_buffer(x, &TAKE8_32(d), 1); \
-        } else { \
-            /* unsigned 8 */ \
-            unsigned char buf[2] = {0xcc, TAKE8_32(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } \
-    } else { \
-        if(d < (1<<16)) { \
-            /* unsigned 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } else { \
-            /* unsigned 32 */ \
-            unsigned char buf[5]; \
-            buf[0] = 0xce; _msgpack_store32(&buf[1], (uint32_t)d); \
-            msgpack_pack_append_buffer(x, buf, 5); \
-        } \
-    } \
-} while(0)
-
-#define msgpack_pack_real_uint64(x, d) \
-do { \
-    if(d < (1ULL<<8)) { \
-        if(d < (1ULL<<7)) { \
-            /* fixnum */ \
-            msgpack_pack_append_buffer(x, &TAKE8_64(d), 1); \
-        } else { \
-            /* unsigned 8 */ \
-            unsigned char buf[2] = {0xcc, TAKE8_64(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } \
-    } else { \
-        if(d < (1ULL<<16)) { \
-            /* unsigned 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } else if(d < (1ULL<<32)) { \
-            /* unsigned 32 */ \
-            unsigned char buf[5]; \
-            buf[0] = 0xce; _msgpack_store32(&buf[1], (uint32_t)d); \
-            msgpack_pack_append_buffer(x, buf, 5); \
-        } else { \
-            /* unsigned 64 */ \
-            unsigned char buf[9]; \
-            buf[0] = 0xcf; _msgpack_store64(&buf[1], d); \
-            msgpack_pack_append_buffer(x, buf, 9); \
-        } \
-    } \
-} while(0)
-
-#define msgpack_pack_real_int8(x, d) \
-do { \
-    if(d < -(1<<5)) { \
-        /* signed 8 */ \
-        unsigned char buf[2] = {0xd0, TAKE8_8(d)}; \
-        msgpack_pack_append_buffer(x, buf, 2); \
-    } else { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1); \
-    } \
-} while(0)
-
-#define msgpack_pack_real_int16(x, d) \
-do { \
-    if(d < -(1<<5)) { \
-        if(d < -(1<<7)) { \
-            /* signed 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xd1; _msgpack_store16(&buf[1], (int16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } else { \
-            /* signed 8 */ \
-            unsigned char buf[2] = {0xd0, TAKE8_16(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } \
-    } else if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_16(d), 1); \
-    } else { \
-        if(d < (1<<8)) { \
-            /* unsigned 8 */ \
-            unsigned char buf[2] = {0xcc, TAKE8_16(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } else { \
-            /* unsigned 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } \
-    } \
-} while(0)
-
-#define msgpack_pack_real_int32(x, d) \
-do { \
-    if(d < -(1<<5)) { \
-        if(d < -(1<<15)) { \
-            /* signed 32 */ \
-            unsigned char buf[5]; \
-            buf[0] = 0xd2; _msgpack_store32(&buf[1], (int32_t)d); \
-            msgpack_pack_append_buffer(x, buf, 5); \
-        } else if(d < -(1<<7)) { \
-            /* signed 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xd1; _msgpack_store16(&buf[1], (int16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } else { \
-            /* signed 8 */ \
-            unsigned char buf[2] = {0xd0, TAKE8_32(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } \
-    } else if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_32(d), 1); \
-    } else { \
-        if(d < (1<<8)) { \
-            /* unsigned 8 */ \
-            unsigned char buf[2] = {0xcc, TAKE8_32(d)}; \
-            msgpack_pack_append_buffer(x, buf, 2); \
-        } else if(d < (1<<16)) { \
-            /* unsigned 16 */ \
-            unsigned char buf[3]; \
-            buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-            msgpack_pack_append_buffer(x, buf, 3); \
-        } else { \
-            /* unsigned 32 */ \
-            unsigned char buf[5]; \
-            buf[0] = 0xce; _msgpack_store32(&buf[1], (uint32_t)d); \
-            msgpack_pack_append_buffer(x, buf, 5); \
-        } \
-    } \
-} while(0)
-
-#define msgpack_pack_real_int64(x, d) \
-do { \
-    if(d < -(1LL<<5)) { \
-        if(d < -(1LL<<15)) { \
-            if(d < -(1LL<<31)) { \
-                /* signed 64 */ \
-                unsigned char buf[9]; \
-                buf[0] = 0xd3; _msgpack_store64(&buf[1], d); \
-                msgpack_pack_append_buffer(x, buf, 9); \
-            } else { \
-                /* signed 32 */ \
-                unsigned char buf[5]; \
-                buf[0] = 0xd2; _msgpack_store32(&buf[1], (int32_t)d); \
-                msgpack_pack_append_buffer(x, buf, 5); \
-            } \
-        } else { \
-            if(d < -(1<<7)) { \
-                /* signed 16 */ \
-                unsigned char buf[3]; \
-                buf[0] = 0xd1; _msgpack_store16(&buf[1], (int16_t)d); \
-                msgpack_pack_append_buffer(x, buf, 3); \
-            } else { \
-                /* signed 8 */ \
-                unsigned char buf[2] = {0xd0, TAKE8_64(d)}; \
-                msgpack_pack_append_buffer(x, buf, 2); \
-            } \
-        } \
-    } else if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_64(d), 1); \
-    } else { \
-        if(d < (1LL<<16)) { \
-            if(d < (1<<8)) { \
-                /* unsigned 8 */ \
-                unsigned char buf[2] = {0xcc, TAKE8_64(d)}; \
-                msgpack_pack_append_buffer(x, buf, 2); \
-            } else { \
-                /* unsigned 16 */ \
-                unsigned char buf[3]; \
-                buf[0] = 0xcd; _msgpack_store16(&buf[1], (uint16_t)d); \
-                msgpack_pack_append_buffer(x, buf, 3); \
-            } \
-        } else { \
-            if(d < (1LL<<32)) { \
-                /* unsigned 32 */ \
-                unsigned char buf[5]; \
-                buf[0] = 0xce; _msgpack_store32(&buf[1], (uint32_t)d); \
-                msgpack_pack_append_buffer(x, buf, 5); \
-            } else { \
-                /* unsigned 64 */ \
-                unsigned char buf[9]; \
-                buf[0] = 0xcf; _msgpack_store64(&buf[1], d); \
-                msgpack_pack_append_buffer(x, buf, 9); \
-            } \
-        } \
-    } \
-} while(0)
-
-
-static inline int msgpack_pack_uint8(msgpack_packer* x, uint8_t d)
-{
-    msgpack_pack_real_uint8(x, d);
-}
-
-static inline int msgpack_pack_uint16(msgpack_packer* x, uint16_t d)
-{
-    msgpack_pack_real_uint16(x, d);
-}
-
-static inline int msgpack_pack_uint32(msgpack_packer* x, uint32_t d)
-{
-    msgpack_pack_real_uint32(x, d);
-}
-
-static inline int msgpack_pack_uint64(msgpack_packer* x, uint64_t d)
-{
-    msgpack_pack_real_uint64(x, d);
-}
-
-static inline int msgpack_pack_int8(msgpack_packer* x, int8_t d)
-{
-    msgpack_pack_real_int8(x, d);
-}
-
-static inline int msgpack_pack_int16(msgpack_packer* x, int16_t d)
-{
-    msgpack_pack_real_int16(x, d);
-}
-
-static inline int msgpack_pack_int32(msgpack_packer* x, int32_t d)
-{
-    msgpack_pack_real_int32(x, d);
-}
-
-static inline int msgpack_pack_int64(msgpack_packer* x, int64_t d)
-{
-    msgpack_pack_real_int64(x, d);
-}
-
-
-//#ifdef msgpack_pack_inline_func_cint
-
-static inline int msgpack_pack_short(msgpack_packer* x, short d)
-{
-#if defined(SIZEOF_SHORT)
-#if SIZEOF_SHORT == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_SHORT == 4
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#elif defined(SHRT_MAX)
-#if SHRT_MAX == 0x7fff
-    msgpack_pack_real_int16(x, d);
-#elif SHRT_MAX == 0x7fffffff
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#else
-if(sizeof(short) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(short) == 4) {
-    msgpack_pack_real_int32(x, d);
-} else {
-    msgpack_pack_real_int64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_int(msgpack_packer* x, int d)
-{
-#if defined(SIZEOF_INT)
-#if SIZEOF_INT == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_INT == 4
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#elif defined(INT_MAX)
-#if INT_MAX == 0x7fff
-    msgpack_pack_real_int16(x, d);
-#elif INT_MAX == 0x7fffffff
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#else
-if(sizeof(int) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(int) == 4) {
-    msgpack_pack_real_int32(x, d);
-} else {
-    msgpack_pack_real_int64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_long(msgpack_packer* x, long d)
-{
-#if defined(SIZEOF_LONG)
-#if SIZEOF_LONG == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_LONG == 4
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#elif defined(LONG_MAX)
-#if LONG_MAX == 0x7fffL
-    msgpack_pack_real_int16(x, d);
-#elif LONG_MAX == 0x7fffffffL
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#else
-if(sizeof(long) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(long) == 4) {
-    msgpack_pack_real_int32(x, d);
-} else {
-    msgpack_pack_real_int64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_long_long(msgpack_packer* x, long long d)
-{
-#if defined(SIZEOF_LONG_LONG)
-#if SIZEOF_LONG_LONG == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_LONG_LONG == 4
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#elif defined(LLONG_MAX)
-#if LLONG_MAX == 0x7fffL
-    msgpack_pack_real_int16(x, d);
-#elif LLONG_MAX == 0x7fffffffL
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#else
-if(sizeof(long long) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(long long) == 4) {
-    msgpack_pack_real_int32(x, d);
-} else {
-    msgpack_pack_real_int64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_short(msgpack_packer* x, unsigned short d)
-{
-#if defined(SIZEOF_SHORT)
-#if SIZEOF_SHORT == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_SHORT == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(USHRT_MAX)
-#if USHRT_MAX == 0xffffU
-    msgpack_pack_real_uint16(x, d);
-#elif USHRT_MAX == 0xffffffffU
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned short) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned short) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_int(msgpack_packer* x, unsigned int d)
-{
-#if defined(SIZEOF_INT)
-#if SIZEOF_INT == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_INT == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(UINT_MAX)
-#if UINT_MAX == 0xffffU
-    msgpack_pack_real_uint16(x, d);
-#elif UINT_MAX == 0xffffffffU
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned int) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned int) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_long(msgpack_packer* x, unsigned long d)
-{
-#if defined(SIZEOF_LONG)
-#if SIZEOF_LONG == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_LONG == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(ULONG_MAX)
-#if ULONG_MAX == 0xffffUL
-    msgpack_pack_real_uint16(x, d);
-#elif ULONG_MAX == 0xffffffffUL
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned long) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned long) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_long_long(msgpack_packer* x, unsigned long long d)
-{
-#if defined(SIZEOF_LONG_LONG)
-#if SIZEOF_LONG_LONG == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_LONG_LONG == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(ULLONG_MAX)
-#if ULLONG_MAX == 0xffffUL
-    msgpack_pack_real_uint16(x, d);
-#elif ULLONG_MAX == 0xffffffffUL
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned long long) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned long long) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-//#undef msgpack_pack_inline_func_cint
-//#endif
-
-
-
-/*
- * Float
- */
-
-static inline int msgpack_pack_float(msgpack_packer* x, float d)
-{
-    union { float f; uint32_t i; } mem;
-    mem.f = d;
-    unsigned char buf[5];
-    buf[0] = 0xca; _msgpack_store32(&buf[1], mem.i);
-    msgpack_pack_append_buffer(x, buf, 5);
-}
-
-static inline int msgpack_pack_double(msgpack_packer* x, double d)
-{
-    union { double f; uint64_t i; } mem;
-    mem.f = d;
-    unsigned char buf[9];
-    buf[0] = 0xcb;
-#if defined(__arm__) && !(__ARM_EABI__) // arm-oabi
-    // https://github.com/msgpack/msgpack-perl/pull/1
-    mem.i = (mem.i & 0xFFFFFFFFUL) << 32UL | (mem.i >> 32UL);
-#endif
-    _msgpack_store64(&buf[1], mem.i);
-    msgpack_pack_append_buffer(x, buf, 9);
-}
-
-
-/*
- * Nil
- */
-
-static inline int msgpack_pack_nil(msgpack_packer* x)
-{
-    static const unsigned char d = 0xc0;
-    msgpack_pack_append_buffer(x, &d, 1);
-}
-
-
-/*
- * Boolean
- */
-
-static inline int msgpack_pack_true(msgpack_packer* x)
-{
-    static const unsigned char d = 0xc3;
-    msgpack_pack_append_buffer(x, &d, 1);
-}
-
-static inline int msgpack_pack_false(msgpack_packer* x)
-{
-    static const unsigned char d = 0xc2;
-    msgpack_pack_append_buffer(x, &d, 1);
-}
-
-
-/*
- * Array
- */
-
-static inline int msgpack_pack_array(msgpack_packer* x, unsigned int n)
-{
-    if(n < 16) {
-        unsigned char d = 0x90 | n;
-        msgpack_pack_append_buffer(x, &d, 1);
-    } else if(n < 65536) {
-        unsigned char buf[3];
-        buf[0] = 0xdc; _msgpack_store16(&buf[1], (uint16_t)n);
-        msgpack_pack_append_buffer(x, buf, 3);
-    } else {
-        unsigned char buf[5];
-        buf[0] = 0xdd; _msgpack_store32(&buf[1], (uint32_t)n);
-        msgpack_pack_append_buffer(x, buf, 5);
-    }
-}
-
-
-/*
- * Map
- */
-
-static inline int msgpack_pack_map(msgpack_packer* x, unsigned int n)
-{
-    if(n < 16) {
-        unsigned char d = 0x80 | n;
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1);
-    } else if(n < 65536) {
-        unsigned char buf[3];
-        buf[0] = 0xde; _msgpack_store16(&buf[1], (uint16_t)n);
-        msgpack_pack_append_buffer(x, buf, 3);
-    } else {
-        unsigned char buf[5];
-        buf[0] = 0xdf; _msgpack_store32(&buf[1], (uint32_t)n);
-        msgpack_pack_append_buffer(x, buf, 5);
-    }
-}
-
-
-/*
- * Raw
- */
-
-static inline int msgpack_pack_raw(msgpack_packer* x, size_t l)
-{
-    if (l < 32) {
-        unsigned char d = 0xa0 | (uint8_t)l;
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1);
-    } else if (x->use_bin_type && l < 256) {  // str8 is new format introduced with bin.
-        unsigned char buf[2] = {0xd9, (uint8_t)l};
-        msgpack_pack_append_buffer(x, buf, 2);
-    } else if (l < 65536) {
-        unsigned char buf[3];
-        buf[0] = 0xda; _msgpack_store16(&buf[1], (uint16_t)l);
-        msgpack_pack_append_buffer(x, buf, 3);
-    } else {
-        unsigned char buf[5];
-        buf[0] = 0xdb; _msgpack_store32(&buf[1], (uint32_t)l);
-        msgpack_pack_append_buffer(x, buf, 5);
-    }
-}
-
-/*
- * bin
- */
-static inline int msgpack_pack_bin(msgpack_packer *x, size_t l)
-{
-    if (!x->use_bin_type) {
-        return msgpack_pack_raw(x, l);
-    }
-    if (l < 256) {
-        unsigned char buf[2] = {0xc4, (unsigned char)l};
-        msgpack_pack_append_buffer(x, buf, 2);
-    } else if (l < 65536) {
-        unsigned char buf[3] = {0xc5};
-        _msgpack_store16(&buf[1], (uint16_t)l);
-        msgpack_pack_append_buffer(x, buf, 3);
-    } else {
-        unsigned char buf[5] = {0xc6};
-        _msgpack_store32(&buf[1], (uint32_t)l);
-        msgpack_pack_append_buffer(x, buf, 5);
-    }
-}
-
-static inline int msgpack_pack_raw_body(msgpack_packer* x, const void* b, size_t l)
-{
-    if (l > 0) msgpack_pack_append_buffer(x, (const unsigned char*)b, l);
-    return 0;
-}
-
-/*
- * Ext
- */
-static inline int msgpack_pack_ext(msgpack_packer* x, int8_t typecode, size_t l)
-{
-    if (l == 1) {
-        unsigned char buf[2];
-        buf[0] = 0xd4;
-        buf[1] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 2);
-    }
-    else if(l == 2) {
-        unsigned char buf[2];
-        buf[0] = 0xd5;
-        buf[1] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 2);
-    }
-    else if(l == 4) {
-        unsigned char buf[2];
-        buf[0] = 0xd6;
-        buf[1] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 2);
-    }
-    else if(l == 8) {
-        unsigned char buf[2];
-        buf[0] = 0xd7;
-        buf[1] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 2);
-    }
-    else if(l == 16) {
-        unsigned char buf[2];
-        buf[0] = 0xd8;
-        buf[1] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 2);
-    }
-    else if(l < 256) {
-        unsigned char buf[3];
-        buf[0] = 0xc7;
-        buf[1] = l;
-        buf[2] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 3);
-    } else if(l < 65536) {
-        unsigned char buf[4];
-        buf[0] = 0xc8;
-        _msgpack_store16(&buf[1], (uint16_t)l);
-        buf[3] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 4);
-    } else {
-        unsigned char buf[6];
-        buf[0] = 0xc9;
-        _msgpack_store32(&buf[1], (uint32_t)l);
-        buf[5] = (unsigned char)typecode;
-        msgpack_pack_append_buffer(x, buf, 6);
-    }
-
-}
-
-
-
-#undef msgpack_pack_append_buffer
-
-#undef TAKE8_8
-#undef TAKE8_16
-#undef TAKE8_32
-#undef TAKE8_64
-
-#undef msgpack_pack_real_uint8
-#undef msgpack_pack_real_uint16
-#undef msgpack_pack_real_uint32
-#undef msgpack_pack_real_uint64
-#undef msgpack_pack_real_int8
-#undef msgpack_pack_real_int16
-#undef msgpack_pack_real_int32
-#undef msgpack_pack_real_int64
diff --git a/utils/exporters/blender/modules/msgpack/sysdep.h b/utils/exporters/blender/modules/msgpack/sysdep.h
deleted file mode 100644
index ed9c1bc0b803101ee7806e5660aa7c2fb1470e80..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/modules/msgpack/sysdep.h
+++ /dev/null
@@ -1,194 +0,0 @@
-/*
- * MessagePack system dependencies
- *
- * Copyright (C) 2008-2010 FURUHASHI Sadayuki
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-#ifndef MSGPACK_SYSDEP_H__
-#define MSGPACK_SYSDEP_H__
-
-#include <stddef.h>
-#include <stdio.h>
-#if defined(_MSC_VER) && _MSC_VER < 1600
-typedef __int8 int8_t;
-typedef unsigned __int8 uint8_t;
-typedef __int16 int16_t;
-typedef unsigned __int16 uint16_t;
-typedef __int32 int32_t;
-typedef unsigned __int32 uint32_t;
-typedef __int64 int64_t;
-typedef unsigned __int64 uint64_t;
-#elif defined(_MSC_VER)  // && _MSC_VER >= 1600
-#include <stdint.h>
-#else
-#include <stdint.h>
-#include <stdbool.h>
-#endif
-
-#ifdef _WIN32
-#define _msgpack_atomic_counter_header <windows.h>
-typedef long _msgpack_atomic_counter_t;
-#define _msgpack_sync_decr_and_fetch(ptr) InterlockedDecrement(ptr)
-#define _msgpack_sync_incr_and_fetch(ptr) InterlockedIncrement(ptr)
-#elif defined(__GNUC__) && ((__GNUC__*10 + __GNUC_MINOR__) < 41)
-#define _msgpack_atomic_counter_header "gcc_atomic.h"
-#else
-typedef unsigned int _msgpack_atomic_counter_t;
-#define _msgpack_sync_decr_and_fetch(ptr) __sync_sub_and_fetch(ptr, 1)
-#define _msgpack_sync_incr_and_fetch(ptr) __sync_add_and_fetch(ptr, 1)
-#endif
-
-#ifdef _WIN32
-
-#ifdef __cplusplus
-/* numeric_limits<T>::min,max */
-#ifdef max
-#undef max
-#endif
-#ifdef min
-#undef min
-#endif
-#endif
-
-#else
-#include <arpa/inet.h>  /* __BYTE_ORDER */
-#endif
-
-#if !defined(__LITTLE_ENDIAN__) && !defined(__BIG_ENDIAN__)
-#if __BYTE_ORDER == __LITTLE_ENDIAN
-#define __LITTLE_ENDIAN__
-#elif __BYTE_ORDER == __BIG_ENDIAN
-#define __BIG_ENDIAN__
-#elif _WIN32
-#define __LITTLE_ENDIAN__
-#endif
-#endif
-
-
-#ifdef __LITTLE_ENDIAN__
-
-#ifdef _WIN32
-#  if defined(ntohs)
-#    define _msgpack_be16(x) ntohs(x)
-#  elif defined(_byteswap_ushort) || (defined(_MSC_VER) && _MSC_VER >= 1400)
-#    define _msgpack_be16(x) ((uint16_t)_byteswap_ushort((unsigned short)x))
-#  else
-#    define _msgpack_be16(x) ( \
-        ((((uint16_t)x) <<  8) ) | \
-        ((((uint16_t)x) >>  8) ) )
-#  endif
-#else
-#  define _msgpack_be16(x) ntohs(x)
-#endif
-
-#ifdef _WIN32
-#  if defined(ntohl)
-#    define _msgpack_be32(x) ntohl(x)
-#  elif defined(_byteswap_ulong) || (defined(_MSC_VER) && _MSC_VER >= 1400)
-#    define _msgpack_be32(x) ((uint32_t)_byteswap_ulong((unsigned long)x))
-#  else
-#    define _msgpack_be32(x) \
-        ( ((((uint32_t)x) << 24)               ) | \
-          ((((uint32_t)x) <<  8) & 0x00ff0000U ) | \
-          ((((uint32_t)x) >>  8) & 0x0000ff00U ) | \
-          ((((uint32_t)x) >> 24)               ) )
-#  endif
-#else
-#  define _msgpack_be32(x) ntohl(x)
-#endif
-
-#if defined(_byteswap_uint64) || (defined(_MSC_VER) && _MSC_VER >= 1400)
-#  define _msgpack_be64(x) (_byteswap_uint64(x))
-#elif defined(bswap_64)
-#  define _msgpack_be64(x) bswap_64(x)
-#elif defined(__DARWIN_OSSwapInt64)
-#  define _msgpack_be64(x) __DARWIN_OSSwapInt64(x)
-#else
-#define _msgpack_be64(x) \
-    ( ((((uint64_t)x) << 56)                         ) | \
-      ((((uint64_t)x) << 40) & 0x00ff000000000000ULL ) | \
-      ((((uint64_t)x) << 24) & 0x0000ff0000000000ULL ) | \
-      ((((uint64_t)x) <<  8) & 0x000000ff00000000ULL ) | \
-      ((((uint64_t)x) >>  8) & 0x00000000ff000000ULL ) | \
-      ((((uint64_t)x) >> 24) & 0x0000000000ff0000ULL ) | \
-      ((((uint64_t)x) >> 40) & 0x000000000000ff00ULL ) | \
-      ((((uint64_t)x) >> 56)                         ) )
-#endif
-
-#define _msgpack_load16(cast, from) ((cast)( \
-    (((uint16_t)((uint8_t*)(from))[0]) << 8) | \
-    (((uint16_t)((uint8_t*)(from))[1])     ) ))
-
-#define _msgpack_load32(cast, from) ((cast)( \
-    (((uint32_t)((uint8_t*)(from))[0]) << 24) | \
-    (((uint32_t)((uint8_t*)(from))[1]) << 16) | \
-    (((uint32_t)((uint8_t*)(from))[2]) <<  8) | \
-    (((uint32_t)((uint8_t*)(from))[3])      ) ))
-
-#define _msgpack_load64(cast, from) ((cast)( \
-    (((uint64_t)((uint8_t*)(from))[0]) << 56) | \
-    (((uint64_t)((uint8_t*)(from))[1]) << 48) | \
-    (((uint64_t)((uint8_t*)(from))[2]) << 40) | \
-    (((uint64_t)((uint8_t*)(from))[3]) << 32) | \
-    (((uint64_t)((uint8_t*)(from))[4]) << 24) | \
-    (((uint64_t)((uint8_t*)(from))[5]) << 16) | \
-    (((uint64_t)((uint8_t*)(from))[6]) <<  8) | \
-    (((uint64_t)((uint8_t*)(from))[7])      ) ))
-
-#else
-
-#define _msgpack_be16(x) (x)
-#define _msgpack_be32(x) (x)
-#define _msgpack_be64(x) (x)
-
-#define _msgpack_load16(cast, from) ((cast)( \
-    (((uint16_t)((uint8_t*)from)[0]) << 8) | \
-    (((uint16_t)((uint8_t*)from)[1])     ) ))
-
-#define _msgpack_load32(cast, from) ((cast)( \
-    (((uint32_t)((uint8_t*)from)[0]) << 24) | \
-    (((uint32_t)((uint8_t*)from)[1]) << 16) | \
-    (((uint32_t)((uint8_t*)from)[2]) <<  8) | \
-    (((uint32_t)((uint8_t*)from)[3])      ) ))
-
-#define _msgpack_load64(cast, from) ((cast)( \
-    (((uint64_t)((uint8_t*)from)[0]) << 56) | \
-    (((uint64_t)((uint8_t*)from)[1]) << 48) | \
-    (((uint64_t)((uint8_t*)from)[2]) << 40) | \
-    (((uint64_t)((uint8_t*)from)[3]) << 32) | \
-    (((uint64_t)((uint8_t*)from)[4]) << 24) | \
-    (((uint64_t)((uint8_t*)from)[5]) << 16) | \
-    (((uint64_t)((uint8_t*)from)[6]) <<  8) | \
-    (((uint64_t)((uint8_t*)from)[7])      ) ))
-#endif
-
-
-#define _msgpack_store16(to, num) \
-    do { uint16_t val = _msgpack_be16(num); memcpy(to, &val, 2); } while(0)
-#define _msgpack_store32(to, num) \
-    do { uint32_t val = _msgpack_be32(num); memcpy(to, &val, 4); } while(0)
-#define _msgpack_store64(to, num) \
-    do { uint64_t val = _msgpack_be64(num); memcpy(to, &val, 8); } while(0)
-
-/*
-#define _msgpack_load16(cast, from) \
-    ({ cast val; memcpy(&val, (char*)from, 2); _msgpack_be16(val); })
-#define _msgpack_load32(cast, from) \
-    ({ cast val; memcpy(&val, (char*)from, 4); _msgpack_be32(val); })
-#define _msgpack_load64(cast, from) \
-    ({ cast val; memcpy(&val, (char*)from, 8); _msgpack_be64(val); })
-*/
-
-
-#endif /* msgpack/sysdep.h */
diff --git a/utils/exporters/blender/modules/msgpack/unpack.h b/utils/exporters/blender/modules/msgpack/unpack.h
deleted file mode 100644
index 27e3b6269e1cb9747eaabdb0372fba0b20714700..0000000000000000000000000000000000000000
--- a/utils/exporters/blender/modules/msgpack/unpack.h
+++ /dev/null
@@ -1,263 +0,0 @@
-/*
- * MessagePack for Python unpacking routine
- *
- * Copyright (C) 2009 Naoki INADA
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance
with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -#define MSGPACK_EMBED_STACK_SIZE (1024) -#include "unpack_define.h" - -typedef struct unpack_user { - int use_list; - PyObject *object_hook; - bool has_pairs_hook; - PyObject *list_hook; - PyObject *ext_hook; - const char *encoding; - const char *unicode_errors; -} unpack_user; - -typedef PyObject* msgpack_unpack_object; -struct unpack_context; -typedef struct unpack_context unpack_context; -typedef int (*execute_fn)(unpack_context *ctx, const char* data, size_t len, size_t* off); - -static inline msgpack_unpack_object unpack_callback_root(unpack_user* u) -{ - return NULL; -} - -static inline int unpack_callback_uint16(unpack_user* u, uint16_t d, msgpack_unpack_object* o) -{ - PyObject *p = PyInt_FromLong((long)d); - if (!p) - return -1; - *o = p; - return 0; -} -static inline int unpack_callback_uint8(unpack_user* u, uint8_t d, msgpack_unpack_object* o) -{ - return unpack_callback_uint16(u, d, o); -} - - -static inline int unpack_callback_uint32(unpack_user* u, uint32_t d, msgpack_unpack_object* o) -{ - PyObject *p; -#if UINT32_MAX > LONG_MAX - if (d > LONG_MAX) { - p = PyLong_FromUnsignedLong((unsigned long)d); - } else -#endif - { - p = PyInt_FromLong((long)d); - } - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int unpack_callback_uint64(unpack_user* u, uint64_t d, msgpack_unpack_object* o) -{ - PyObject *p; - if (d > LONG_MAX) { - p = PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG)d); - } else { - p = PyInt_FromLong((long)d); - } - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int 
unpack_callback_int32(unpack_user* u, int32_t d, msgpack_unpack_object* o) -{ - PyObject *p = PyInt_FromLong(d); - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int unpack_callback_int16(unpack_user* u, int16_t d, msgpack_unpack_object* o) -{ - return unpack_callback_int32(u, d, o); -} - -static inline int unpack_callback_int8(unpack_user* u, int8_t d, msgpack_unpack_object* o) -{ - return unpack_callback_int32(u, d, o); -} - -static inline int unpack_callback_int64(unpack_user* u, int64_t d, msgpack_unpack_object* o) -{ - PyObject *p; - if (d > LONG_MAX || d < LONG_MIN) { - p = PyLong_FromLongLong((unsigned PY_LONG_LONG)d); - } else { - p = PyInt_FromLong((long)d); - } - *o = p; - return 0; -} - -static inline int unpack_callback_double(unpack_user* u, double d, msgpack_unpack_object* o) -{ - PyObject *p = PyFloat_FromDouble(d); - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int unpack_callback_float(unpack_user* u, float d, msgpack_unpack_object* o) -{ - return unpack_callback_double(u, d, o); -} - -static inline int unpack_callback_nil(unpack_user* u, msgpack_unpack_object* o) -{ Py_INCREF(Py_None); *o = Py_None; return 0; } - -static inline int unpack_callback_true(unpack_user* u, msgpack_unpack_object* o) -{ Py_INCREF(Py_True); *o = Py_True; return 0; } - -static inline int unpack_callback_false(unpack_user* u, msgpack_unpack_object* o) -{ Py_INCREF(Py_False); *o = Py_False; return 0; } - -static inline int unpack_callback_array(unpack_user* u, unsigned int n, msgpack_unpack_object* o) -{ - PyObject *p = u->use_list ? 
PyList_New(n) : PyTuple_New(n); - - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int unpack_callback_array_item(unpack_user* u, unsigned int current, msgpack_unpack_object* c, msgpack_unpack_object o) -{ - if (u->use_list) - PyList_SET_ITEM(*c, current, o); - else - PyTuple_SET_ITEM(*c, current, o); - return 0; -} - -static inline int unpack_callback_array_end(unpack_user* u, msgpack_unpack_object* c) -{ - if (u->list_hook) { - PyObject *new_c = PyObject_CallFunctionObjArgs(u->list_hook, *c, NULL); - if (!new_c) - return -1; - Py_DECREF(*c); - *c = new_c; - } - return 0; -} - -static inline int unpack_callback_map(unpack_user* u, unsigned int n, msgpack_unpack_object* o) -{ - PyObject *p; - if (u->has_pairs_hook) { - p = PyList_New(n); // Or use tuple? - } - else { - p = PyDict_New(); - } - if (!p) - return -1; - *o = p; - return 0; -} - -static inline int unpack_callback_map_item(unpack_user* u, unsigned int current, msgpack_unpack_object* c, msgpack_unpack_object k, msgpack_unpack_object v) -{ - if (u->has_pairs_hook) { - msgpack_unpack_object item = PyTuple_Pack(2, k, v); - if (!item) - return -1; - Py_DECREF(k); - Py_DECREF(v); - PyList_SET_ITEM(*c, current, item); - return 0; - } - else if (PyDict_SetItem(*c, k, v) == 0) { - Py_DECREF(k); - Py_DECREF(v); - return 0; - } - return -1; -} - -static inline int unpack_callback_map_end(unpack_user* u, msgpack_unpack_object* c) -{ - if (u->object_hook) { - PyObject *new_c = PyObject_CallFunctionObjArgs(u->object_hook, *c, NULL); - if (!new_c) - return -1; - - Py_DECREF(*c); - *c = new_c; - } - return 0; -} - -static inline int unpack_callback_raw(unpack_user* u, const char* b, const char* p, unsigned int l, msgpack_unpack_object* o) -{ - PyObject *py; - if(u->encoding) { - py = PyUnicode_Decode(p, l, u->encoding, u->unicode_errors); - } else { - py = PyBytes_FromStringAndSize(p, l); - } - if (!py) - return -1; - *o = py; - return 0; -} - -static inline int unpack_callback_bin(unpack_user* u, const 
char* b, const char* p, unsigned int l, msgpack_unpack_object* o) -{ - PyObject *py = PyBytes_FromStringAndSize(p, l); - if (!py) - return -1; - *o = py; - return 0; -} - -static inline int unpack_callback_ext(unpack_user* u, const char* base, const char* pos, - unsigned int lenght, msgpack_unpack_object* o) -{ - PyObject *py; - int8_t typecode = (int8_t)*pos++; - if (!u->ext_hook) { - PyErr_SetString(PyExc_AssertionError, "u->ext_hook cannot be NULL"); - return -1; - } - // length also includes the typecode, so the actual data is lenght-1 -#if PY_MAJOR_VERSION == 2 - py = PyObject_CallFunction(u->ext_hook, "(is#)", typecode, pos, lenght-1); -#else - py = PyObject_CallFunction(u->ext_hook, "(iy#)", typecode, pos, lenght-1); -#endif - if (!py) - return -1; - *o = py; - return 0; -} - -#include "unpack_template.h" diff --git a/utils/exporters/blender/modules/msgpack/unpack_define.h b/utils/exporters/blender/modules/msgpack/unpack_define.h deleted file mode 100644 index 0dd708d17c3d4d47bf388a9a6f3e19394244789d..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/unpack_define.h +++ /dev/null @@ -1,95 +0,0 @@ -/* - * MessagePack unpacking routine template - * - * Copyright (C) 2008-2010 FURUHASHI Sadayuki - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ -#ifndef MSGPACK_UNPACK_DEFINE_H__ -#define MSGPACK_UNPACK_DEFINE_H__ - -#include "msgpack/sysdep.h" -#include -#include -#include -#include - -#ifdef __cplusplus -extern "C" { -#endif - - -#ifndef MSGPACK_EMBED_STACK_SIZE -#define MSGPACK_EMBED_STACK_SIZE 32 -#endif - - -// CS is first byte & 0x1f -typedef enum { - CS_HEADER = 0x00, // nil - - //CS_ = 0x01, - //CS_ = 0x02, // false - //CS_ = 0x03, // true - - CS_BIN_8 = 0x04, - CS_BIN_16 = 0x05, - CS_BIN_32 = 0x06, - - CS_EXT_8 = 0x07, - CS_EXT_16 = 0x08, - CS_EXT_32 = 0x09, - - CS_FLOAT = 0x0a, - CS_DOUBLE = 0x0b, - CS_UINT_8 = 0x0c, - CS_UINT_16 = 0x0d, - CS_UINT_32 = 0x0e, - CS_UINT_64 = 0x0f, - CS_INT_8 = 0x10, - CS_INT_16 = 0x11, - CS_INT_32 = 0x12, - CS_INT_64 = 0x13, - - //CS_FIXEXT1 = 0x14, - //CS_FIXEXT2 = 0x15, - //CS_FIXEXT4 = 0x16, - //CS_FIXEXT8 = 0x17, - //CS_FIXEXT16 = 0x18, - - CS_RAW_8 = 0x19, - CS_RAW_16 = 0x1a, - CS_RAW_32 = 0x1b, - CS_ARRAY_16 = 0x1c, - CS_ARRAY_32 = 0x1d, - CS_MAP_16 = 0x1e, - CS_MAP_32 = 0x1f, - - ACS_RAW_VALUE, - ACS_BIN_VALUE, - ACS_EXT_VALUE, -} msgpack_unpack_state; - - -typedef enum { - CT_ARRAY_ITEM, - CT_MAP_KEY, - CT_MAP_VALUE, -} msgpack_container_type; - - -#ifdef __cplusplus -} -#endif - -#endif /* msgpack/unpack_define.h */ diff --git a/utils/exporters/blender/modules/msgpack/unpack_template.h b/utils/exporters/blender/modules/msgpack/unpack_template.h deleted file mode 100644 index d34eceda6ab6948e29424422d6897026240c7ea6..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/modules/msgpack/unpack_template.h +++ /dev/null @@ -1,475 +0,0 @@ -/* - * MessagePack unpacking routine template - * - * Copyright (C) 2008-2010 FURUHASHI Sadayuki - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -#ifndef USE_CASE_RANGE -#if !defined(_MSC_VER) -#define USE_CASE_RANGE -#endif -#endif - -typedef struct unpack_stack { - PyObject* obj; - size_t size; - size_t count; - unsigned int ct; - PyObject* map_key; -} unpack_stack; - -struct unpack_context { - unpack_user user; - unsigned int cs; - unsigned int trail; - unsigned int top; - /* - unpack_stack* stack; - unsigned int stack_size; - unpack_stack embed_stack[MSGPACK_EMBED_STACK_SIZE]; - */ - unpack_stack stack[MSGPACK_EMBED_STACK_SIZE]; -}; - - -static inline void unpack_init(unpack_context* ctx) -{ - ctx->cs = CS_HEADER; - ctx->trail = 0; - ctx->top = 0; - /* - ctx->stack = ctx->embed_stack; - ctx->stack_size = MSGPACK_EMBED_STACK_SIZE; - */ - ctx->stack[0].obj = unpack_callback_root(&ctx->user); -} - -/* -static inline void unpack_destroy(unpack_context* ctx) -{ - if(ctx->stack_size != MSGPACK_EMBED_STACK_SIZE) { - free(ctx->stack); - } -} -*/ - -static inline PyObject* unpack_data(unpack_context* ctx) -{ - return (ctx)->stack[0].obj; -} - - -template -static inline int unpack_execute(unpack_context* ctx, const char* data, size_t len, size_t* off) -{ - assert(len >= *off); - - const unsigned char* p = (unsigned char*)data + *off; - const unsigned char* const pe = (unsigned char*)data + len; - const void* n = NULL; - - unsigned int trail = ctx->trail; - unsigned int cs = ctx->cs; - unsigned int top = ctx->top; - unpack_stack* stack = ctx->stack; - /* - unsigned int stack_size = ctx->stack_size; - */ - unpack_user* user = &ctx->user; - - PyObject* obj; - unpack_stack* c = NULL; - - int 
ret; - -#define construct_cb(name) \ - construct && unpack_callback ## name - -#define push_simple_value(func) \ - if(construct_cb(func)(user, &obj) < 0) { goto _failed; } \ - goto _push -#define push_fixed_value(func, arg) \ - if(construct_cb(func)(user, arg, &obj) < 0) { goto _failed; } \ - goto _push -#define push_variable_value(func, base, pos, len) \ - if(construct_cb(func)(user, \ - (const char*)base, (const char*)pos, len, &obj) < 0) { goto _failed; } \ - goto _push - -#define again_fixed_trail(_cs, trail_len) \ - trail = trail_len; \ - cs = _cs; \ - goto _fixed_trail_again -#define again_fixed_trail_if_zero(_cs, trail_len, ifzero) \ - trail = trail_len; \ - if(trail == 0) { goto ifzero; } \ - cs = _cs; \ - goto _fixed_trail_again - -#define start_container(func, count_, ct_) \ - if(top >= MSGPACK_EMBED_STACK_SIZE) { goto _failed; } /* FIXME */ \ - if(construct_cb(func)(user, count_, &stack[top].obj) < 0) { goto _failed; } \ - if((count_) == 0) { obj = stack[top].obj; \ - if (construct_cb(func##_end)(user, &obj) < 0) { goto _failed; } \ - goto _push; } \ - stack[top].ct = ct_; \ - stack[top].size = count_; \ - stack[top].count = 0; \ - ++top; \ - /*printf("container %d count %d stack %d\n",stack[top].obj,count_,top);*/ \ - /*printf("stack push %d\n", top);*/ \ - /* FIXME \ - if(top >= stack_size) { \ - if(stack_size == MSGPACK_EMBED_STACK_SIZE) { \ - size_t csize = sizeof(unpack_stack) * MSGPACK_EMBED_STACK_SIZE; \ - size_t nsize = csize * 2; \ - unpack_stack* tmp = (unpack_stack*)malloc(nsize); \ - if(tmp == NULL) { goto _failed; } \ - memcpy(tmp, ctx->stack, csize); \ - ctx->stack = stack = tmp; \ - ctx->stack_size = stack_size = MSGPACK_EMBED_STACK_SIZE * 2; \ - } else { \ - size_t nsize = sizeof(unpack_stack) * ctx->stack_size * 2; \ - unpack_stack* tmp = (unpack_stack*)realloc(ctx->stack, nsize); \ - if(tmp == NULL) { goto _failed; } \ - ctx->stack = stack = tmp; \ - ctx->stack_size = stack_size = stack_size * 2; \ - } \ - } \ - */ \ - goto 
_header_again - -#define NEXT_CS(p) ((unsigned int)*p & 0x1f) - -#ifdef USE_CASE_RANGE -#define SWITCH_RANGE_BEGIN switch(*p) { -#define SWITCH_RANGE(FROM, TO) case FROM ... TO: -#define SWITCH_RANGE_DEFAULT default: -#define SWITCH_RANGE_END } -#else -#define SWITCH_RANGE_BEGIN { if(0) { -#define SWITCH_RANGE(FROM, TO) } else if(FROM <= *p && *p <= TO) { -#define SWITCH_RANGE_DEFAULT } else { -#define SWITCH_RANGE_END } } -#endif - - if(p == pe) { goto _out; } - do { - switch(cs) { - case CS_HEADER: - SWITCH_RANGE_BEGIN - SWITCH_RANGE(0x00, 0x7f) // Positive Fixnum - push_fixed_value(_uint8, *(uint8_t*)p); - SWITCH_RANGE(0xe0, 0xff) // Negative Fixnum - push_fixed_value(_int8, *(int8_t*)p); - SWITCH_RANGE(0xc0, 0xdf) // Variable - switch(*p) { - case 0xc0: // nil - push_simple_value(_nil); - //case 0xc1: // never used - case 0xc2: // false - push_simple_value(_false); - case 0xc3: // true - push_simple_value(_true); - case 0xc4: // bin 8 - again_fixed_trail(NEXT_CS(p), 1); - case 0xc5: // bin 16 - again_fixed_trail(NEXT_CS(p), 2); - case 0xc6: // bin 32 - again_fixed_trail(NEXT_CS(p), 4); - case 0xc7: // ext 8 - again_fixed_trail(NEXT_CS(p), 1); - case 0xc8: // ext 16 - again_fixed_trail(NEXT_CS(p), 2); - case 0xc9: // ext 32 - again_fixed_trail(NEXT_CS(p), 4); - case 0xca: // float - case 0xcb: // double - case 0xcc: // unsigned int 8 - case 0xcd: // unsigned int 16 - case 0xce: // unsigned int 32 - case 0xcf: // unsigned int 64 - case 0xd0: // signed int 8 - case 0xd1: // signed int 16 - case 0xd2: // signed int 32 - case 0xd3: // signed int 64 - again_fixed_trail(NEXT_CS(p), 1 << (((unsigned int)*p) & 0x03)); - case 0xd4: // fixext 1 - case 0xd5: // fixext 2 - case 0xd6: // fixext 4 - case 0xd7: // fixext 8 - again_fixed_trail_if_zero(ACS_EXT_VALUE, - (1 << (((unsigned int)*p) & 0x03))+1, - _ext_zero); - case 0xd8: // fixext 16 - again_fixed_trail_if_zero(ACS_EXT_VALUE, 16+1, _ext_zero); - case 0xd9: // str 8 - again_fixed_trail(NEXT_CS(p), 1); - case 0xda: // 
raw 16 - case 0xdb: // raw 32 - case 0xdc: // array 16 - case 0xdd: // array 32 - case 0xde: // map 16 - case 0xdf: // map 32 - again_fixed_trail(NEXT_CS(p), 2 << (((unsigned int)*p) & 0x01)); - default: - goto _failed; - } - SWITCH_RANGE(0xa0, 0xbf) // FixRaw - again_fixed_trail_if_zero(ACS_RAW_VALUE, ((unsigned int)*p & 0x1f), _raw_zero); - SWITCH_RANGE(0x90, 0x9f) // FixArray - start_container(_array, ((unsigned int)*p) & 0x0f, CT_ARRAY_ITEM); - SWITCH_RANGE(0x80, 0x8f) // FixMap - start_container(_map, ((unsigned int)*p) & 0x0f, CT_MAP_KEY); - - SWITCH_RANGE_DEFAULT - goto _failed; - SWITCH_RANGE_END - // end CS_HEADER - - - _fixed_trail_again: - ++p; - - default: - if((size_t)(pe - p) < trail) { goto _out; } - n = p; p += trail - 1; - switch(cs) { - case CS_EXT_8: - again_fixed_trail_if_zero(ACS_EXT_VALUE, *(uint8_t*)n+1, _ext_zero); - case CS_EXT_16: - again_fixed_trail_if_zero(ACS_EXT_VALUE, - _msgpack_load16(uint16_t,n)+1, - _ext_zero); - case CS_EXT_32: - again_fixed_trail_if_zero(ACS_EXT_VALUE, - _msgpack_load32(uint32_t,n)+1, - _ext_zero); - case CS_FLOAT: { - union { uint32_t i; float f; } mem; - mem.i = _msgpack_load32(uint32_t,n); - push_fixed_value(_float, mem.f); } - case CS_DOUBLE: { - union { uint64_t i; double f; } mem; - mem.i = _msgpack_load64(uint64_t,n); -#if defined(__arm__) && !(__ARM_EABI__) // arm-oabi - // https://github.com/msgpack/msgpack-perl/pull/1 - mem.i = (mem.i & 0xFFFFFFFFUL) << 32UL | (mem.i >> 32UL); -#endif - push_fixed_value(_double, mem.f); } - case CS_UINT_8: - push_fixed_value(_uint8, *(uint8_t*)n); - case CS_UINT_16: - push_fixed_value(_uint16, _msgpack_load16(uint16_t,n)); - case CS_UINT_32: - push_fixed_value(_uint32, _msgpack_load32(uint32_t,n)); - case CS_UINT_64: - push_fixed_value(_uint64, _msgpack_load64(uint64_t,n)); - - case CS_INT_8: - push_fixed_value(_int8, *(int8_t*)n); - case CS_INT_16: - push_fixed_value(_int16, _msgpack_load16(int16_t,n)); - case CS_INT_32: - push_fixed_value(_int32, 
_msgpack_load32(int32_t,n)); - case CS_INT_64: - push_fixed_value(_int64, _msgpack_load64(int64_t,n)); - - case CS_BIN_8: - again_fixed_trail_if_zero(ACS_BIN_VALUE, *(uint8_t*)n, _bin_zero); - case CS_BIN_16: - again_fixed_trail_if_zero(ACS_BIN_VALUE, _msgpack_load16(uint16_t,n), _bin_zero); - case CS_BIN_32: - again_fixed_trail_if_zero(ACS_BIN_VALUE, _msgpack_load32(uint32_t,n), _bin_zero); - case ACS_BIN_VALUE: - _bin_zero: - push_variable_value(_bin, data, n, trail); - - case CS_RAW_8: - again_fixed_trail_if_zero(ACS_RAW_VALUE, *(uint8_t*)n, _raw_zero); - case CS_RAW_16: - again_fixed_trail_if_zero(ACS_RAW_VALUE, _msgpack_load16(uint16_t,n), _raw_zero); - case CS_RAW_32: - again_fixed_trail_if_zero(ACS_RAW_VALUE, _msgpack_load32(uint32_t,n), _raw_zero); - case ACS_RAW_VALUE: - _raw_zero: - push_variable_value(_raw, data, n, trail); - - case ACS_EXT_VALUE: - _ext_zero: - push_variable_value(_ext, data, n, trail); - - case CS_ARRAY_16: - start_container(_array, _msgpack_load16(uint16_t,n), CT_ARRAY_ITEM); - case CS_ARRAY_32: - /* FIXME security guard */ - start_container(_array, _msgpack_load32(uint32_t,n), CT_ARRAY_ITEM); - - case CS_MAP_16: - start_container(_map, _msgpack_load16(uint16_t,n), CT_MAP_KEY); - case CS_MAP_32: - /* FIXME security guard */ - start_container(_map, _msgpack_load32(uint32_t,n), CT_MAP_KEY); - - default: - goto _failed; - } - } - -_push: - if(top == 0) { goto _finish; } - c = &stack[top-1]; - switch(c->ct) { - case CT_ARRAY_ITEM: - if(construct_cb(_array_item)(user, c->count, &c->obj, obj) < 0) { goto _failed; } - if(++c->count == c->size) { - obj = c->obj; - if (construct_cb(_array_end)(user, &obj) < 0) { goto _failed; } - --top; - /*printf("stack pop %d\n", top);*/ - goto _push; - } - goto _header_again; - case CT_MAP_KEY: - c->map_key = obj; - c->ct = CT_MAP_VALUE; - goto _header_again; - case CT_MAP_VALUE: - if(construct_cb(_map_item)(user, c->count, &c->obj, c->map_key, obj) < 0) { goto _failed; } - if(++c->count == c->size) { - obj 
= c->obj; - if (construct_cb(_map_end)(user, &obj) < 0) { goto _failed; } - --top; - /*printf("stack pop %d\n", top);*/ - goto _push; - } - c->ct = CT_MAP_KEY; - goto _header_again; - - default: - goto _failed; - } - -_header_again: - cs = CS_HEADER; - ++p; - } while(p != pe); - goto _out; - - -_finish: - if (!construct) - unpack_callback_nil(user, &obj); - stack[0].obj = obj; - ++p; - ret = 1; - /*printf("-- finish --\n"); */ - goto _end; - -_failed: - /*printf("** FAILED **\n"); */ - ret = -1; - goto _end; - -_out: - ret = 0; - goto _end; - -_end: - ctx->cs = cs; - ctx->trail = trail; - ctx->top = top; - *off = p - (const unsigned char*)data; - - return ret; -#undef construct_cb -} - -#undef SWITCH_RANGE_BEGIN -#undef SWITCH_RANGE -#undef SWITCH_RANGE_DEFAULT -#undef SWITCH_RANGE_END -#undef push_simple_value -#undef push_fixed_value -#undef push_variable_value -#undef again_fixed_trail -#undef again_fixed_trail_if_zero -#undef start_container - -template -static inline int unpack_container_header(unpack_context* ctx, const char* data, size_t len, size_t* off) -{ - assert(len >= *off); - uint32_t size; - const unsigned char *const p = (unsigned char*)data + *off; - -#define inc_offset(inc) \ - if (len - *off < inc) \ - return 0; \ - *off += inc; - - switch (*p) { - case var_offset: - inc_offset(3); - size = _msgpack_load16(uint16_t, p + 1); - break; - case var_offset + 1: - inc_offset(5); - size = _msgpack_load32(uint32_t, p + 1); - break; -#ifdef USE_CASE_RANGE - case fixed_offset + 0x0 ... 
fixed_offset + 0xf: -#else - case fixed_offset + 0x0: - case fixed_offset + 0x1: - case fixed_offset + 0x2: - case fixed_offset + 0x3: - case fixed_offset + 0x4: - case fixed_offset + 0x5: - case fixed_offset + 0x6: - case fixed_offset + 0x7: - case fixed_offset + 0x8: - case fixed_offset + 0x9: - case fixed_offset + 0xa: - case fixed_offset + 0xb: - case fixed_offset + 0xc: - case fixed_offset + 0xd: - case fixed_offset + 0xe: - case fixed_offset + 0xf: -#endif - ++*off; - size = ((unsigned int)*p) & 0x0f; - break; - default: - PyErr_SetString(PyExc_ValueError, "Unexpected type header on stream"); - return -1; - } - unpack_callback_uint32(&ctx->user, size, &ctx->stack[0].obj); - return 1; -} - -#undef SWITCH_RANGE_BEGIN -#undef SWITCH_RANGE -#undef SWITCH_RANGE_DEFAULT -#undef SWITCH_RANGE_END - -static const execute_fn unpack_construct = &unpack_execute; -static const execute_fn unpack_skip = &unpack_execute; -static const execute_fn read_array_header = &unpack_container_header<0x90, 0xdc>; -static const execute_fn read_map_header = &unpack_container_header<0x80, 0xde>; - -#undef NEXT_CS - -/* vim: set ts=4 sw=4 sts=4 expandtab */ diff --git a/utils/exporters/blender/tests/README.md b/utils/exporters/blender/tests/README.md deleted file mode 100644 index c34f6495727d1d56dfb84ec6958ec0efdff37afe..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/README.md +++ /dev/null @@ -1,19 +0,0 @@ -# Running tests -In order to use the test scripts you must have your shell setup to execute Blender from the command line using `$ blender`. This either done by setting up your own wrapper scripts or by symlinking /usr/bin/blender directly to $BLENDER_ROOT/blender. - -## OS X -Make sure your do not point to blender.app as it will not pass the arguments corrently. It is required to execute on $BLENDER_ROOT/blender.app/Contents/MacOS/blender in order for the tests to function correctly. 
- -# Testing -Each test script focuses on a specific context and feature of the exporter. - -## Context -Context determines whether an entire scene is being exported or a single mesh node. - -## Features -Features should be tested separately (whenever possible), example: animations should be tested separately from bump maps. - -## Review -When a test is executed a new root directory, if it doesn't already exist, is created at three.js/utils/exporters/blender/tests/review. Inside will contain subdirectories of each test (named the same as the script but with the `test_` prefix removed. The test directory will contain the exported JSON file(s), index.html, and textures (if textures are being tested). The index.html is already setup to source the required libraries and load the JSON file. There is nothing else that a user should need to do in order to test their export. - -The review directory has been added to the .gitignore and will not be included when committing changes. diff --git a/utils/exporters/blender/tests/blend/anim.blend b/utils/exporters/blender/tests/blend/anim.blend deleted file mode 100644 index c8e5095735d90a05661cee7c65854cd554ce9a3a..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/anim.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/cubeA.blend b/utils/exporters/blender/tests/blend/cubeA.blend deleted file mode 100644 index 903c85de9d9a6face1a4da800eda9be13eb21e7d..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/cubeA.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/cubeB.blend b/utils/exporters/blender/tests/blend/cubeB.blend deleted file mode 100644 index 24008b2f48f68f254b9fd51cbc5d08a298e35daa..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/cubeB.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/cubeC.blend 
b/utils/exporters/blender/tests/blend/cubeC.blend deleted file mode 100644 index 7840c6bd212e42f101ab7d658339562137d57c0e..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/cubeC.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/light_setup.blend b/utils/exporters/blender/tests/blend/light_setup.blend deleted file mode 100644 index df9263388b90503965996148a601311e6bdc4390..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/light_setup.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/lightmap.blend b/utils/exporters/blender/tests/blend/lightmap.blend deleted file mode 100644 index d674fae81d0d352c1c29a7dd6db2b639d3c30b4a..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/lightmap.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/persp_camera.blend b/utils/exporters/blender/tests/blend/persp_camera.blend deleted file mode 100644 index eef1dd384016fafc4338e034243f1fd5a604f441..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/persp_camera.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/planeA.blend b/utils/exporters/blender/tests/blend/planeA.blend deleted file mode 100644 index 3693f9426d29123c88e54db11689d6452ec9472d..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/planeA.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/planeB.blend b/utils/exporters/blender/tests/blend/planeB.blend deleted file mode 100644 index 1f07ca5652b52a3767f366d215b24d82238db0fb..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/planeB.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_area_light.blend b/utils/exporters/blender/tests/blend/scene_area_light.blend deleted file mode 
100644 index 42c510bf97f1fac0f049e7afa85dd8af1b82f418..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_area_light.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_children.blend b/utils/exporters/blender/tests/blend/scene_children.blend deleted file mode 100644 index 5d857d8b0db436642f0873f38a4005d362253e73..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_children.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_directional_light.blend b/utils/exporters/blender/tests/blend/scene_directional_light.blend deleted file mode 100644 index 14dac8a0a2c88e5a6d01855259a095654bbf7d50..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_directional_light.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_hemi_light.blend b/utils/exporters/blender/tests/blend/scene_hemi_light.blend deleted file mode 100644 index 4dc8b4cc70102db72fc53c5fee86a68d68096e4e..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_hemi_light.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_instancing.blend b/utils/exporters/blender/tests/blend/scene_instancing.blend deleted file mode 100644 index 02af34b6c78bafb242bc6eb4d4d03652a8161dc7..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_instancing.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_maps.blend b/utils/exporters/blender/tests/blend/scene_maps.blend deleted file mode 100644 index 303427909767653a4f994a2acf503098cccc47e4..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_maps.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_orthographic_camera.blend 
b/utils/exporters/blender/tests/blend/scene_orthographic_camera.blend deleted file mode 100644 index 4791a0547601acf301ed6a0eaaee074139b1a4bf..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_orthographic_camera.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_perspective_camera.blend b/utils/exporters/blender/tests/blend/scene_perspective_camera.blend deleted file mode 100644 index 97e4fedb143a223ed2eb3009eb7fe2bfa97033ed..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_perspective_camera.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_point_light.blend b/utils/exporters/blender/tests/blend/scene_point_light.blend deleted file mode 100644 index 3bdf1e490223106774a938bf025428f80932c1ff..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_point_light.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/scene_spot_light.blend b/utils/exporters/blender/tests/blend/scene_spot_light.blend deleted file mode 100644 index c961989c6e406ca4a984c6b586698cc809a8d6fc..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/scene_spot_light.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/textures/cloud.png b/utils/exporters/blender/tests/blend/textures/cloud.png deleted file mode 100644 index 0e762b24a8467540b5e609cb1e2b56f0c6bc752f..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/textures/cloud.png and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/textures/lightmap.png b/utils/exporters/blender/tests/blend/textures/lightmap.png deleted file mode 100644 index 86a51ec246ef1a9d8f27c104b254f9d727475c7a..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/textures/lightmap.png and 
/dev/null differ diff --git a/utils/exporters/blender/tests/blend/textures/normal.png b/utils/exporters/blender/tests/blend/textures/normal.png deleted file mode 100644 index 6621f45767568f11bb6574c82db6e99d2f355954..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/textures/normal.png and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/textures/uv_grid.jpg b/utils/exporters/blender/tests/blend/textures/uv_grid.jpg deleted file mode 100644 index 1864b8c4b9c2607783efe46097a172927dc0fc18..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/textures/uv_grid.jpg and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/three_point.blend b/utils/exporters/blender/tests/blend/three_point.blend deleted file mode 100644 index d0021d8efd35b7603e629d9db6c5721d81e1d463..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/three_point.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/blend/torusA.blend b/utils/exporters/blender/tests/blend/torusA.blend deleted file mode 100644 index 26753b83442b2637add5723d58ff0bf1aaed99ab..0000000000000000000000000000000000000000 Binary files a/utils/exporters/blender/tests/blend/torusA.blend and /dev/null differ diff --git a/utils/exporters/blender/tests/scripts/css/style.css b/utils/exporters/blender/tests/scripts/css/style.css deleted file mode 100644 index 51e5603b0739d265abd24856d673b5d2bcdb8f6a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/css/style.css +++ /dev/null @@ -1,13 +0,0 @@ -body { - margin: 0px; - padding: 0px; - overflow: hidden; -} - -#viewport { - position: absolute; - width: 100%; - height: 100%; - background: #1b1c1e; - background-image: linear-gradient(#7d8fa3, #1b1c1e); -} diff --git a/utils/exporters/blender/tests/scripts/exporter.py b/utils/exporters/blender/tests/scripts/exporter.py deleted file mode 100644 
index 11c448a1b5ef7b8ef79d0b73b7d774517a53adad..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/exporter.py +++ /dev/null @@ -1,39 +0,0 @@ -import os -import argparse -import sys -import io_three -from io_three.exporter import constants - - -try: - separator = sys.argv.index('--') -except IndexError: - print('ERROR: no parameters specified') - sys.exit(1) - - -def parse_args(): - parser = argparse.ArgumentParser() - parser.add_argument('filepath') - for key, value in constants.EXPORT_OPTIONS.items(): - if not isinstance(value, bool): - kwargs = {'type': type(value), 'default': value} - else: - kwargs = {'action':'store_true'} - parser.add_argument('--%s' % key, **kwargs) - - return vars(parser.parse_args(sys.argv[separator+1:])) - - -def main(): - args = parse_args() - args[constants.ENABLE_PRECISION] = True - args[constants.INDENT] = True - if args[constants.SCENE]: - io_three.exporter.export_scene(args['filepath'], args) - else: - io_three.exporter.export_geometry(args['filepath'], args) - - -if __name__ == '__main__': - main() diff --git a/utils/exporters/blender/tests/scripts/js/review.js b/utils/exporters/blender/tests/scripts/js/review.js deleted file mode 100644 index 694a676240c3ca9db885024399f32e9adde849dd..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/js/review.js +++ /dev/null @@ -1,268 +0,0 @@ -var scene, renderer, camera, container, animation ,mixer; -var hasMorph = false; -var prevTime = Date.now(); -var clock = new THREE.Clock(); - -function render() { - - renderer.render( scene, camera ); - - if ( hasMorph ) { - - var time = Date.now(); - - animation.update( time - prevTime ); - - prevTime = time; - - } -} - -function animate() { - - requestAnimationFrame( animate ); - - if ( mixer !== null ) { - - var delta = clock.getDelta(); - mixer.update(delta); - - } - - render(); - -} - -function onWindowResize() { - - camera.aspect = container.offsetWidth / container.offsetHeight; - 
camera.updateProjectionMatrix(); - - renderer.setSize( container.offsetWidth, container.offsetHeight ); - - render(); - -} - -function setupScene( result, data ) { - - scene = new THREE.Scene(); - scene.add( new THREE.GridHelper( 10, 8 ) ); - -} - -function setupLights() { - - var directionalLight = new THREE.DirectionalLight( 0xb8b8b8 ); - directionalLight.position.set(1, 1, 1).normalize(); - directionalLight.intensity = 1.0; - scene.add( directionalLight ); - - directionalLight = new THREE.DirectionalLight( 0xb8b8b8 ); - directionalLight.position.set(-1, 0.6, 0.5).normalize(); - directionalLight.intensity = 0.5; - scene.add(directionalLight); - - directionalLight = new THREE.DirectionalLight(); - directionalLight.position.set(-0.3, 0.6, -0.8).normalize( 0xb8b8b8 ); - directionalLight.intensity = 0.45; - scene.add(directionalLight); - -} - -function loadObject( data ) { - - var loader = new THREE.ObjectLoader(); - scene = loader.parse( data ); - - var hasLights = false; - - // TODO: RectAreaLight support - var lights = ['AmbientLight', 'DirectionalLight', - 'PointLight', 'SpotLight', 'RectAreaLight', 'HemisphereLight']; - - var cameras = ['OrthographicCamera', 'PerspectiveCamera']; - - for ( var i = 0; i < scene.children.length; i ++ ) { - - var lightIndex = lights.indexOf( scene.children[ i ].type ); - - if ( lightIndex > -1 ) { - - hasLights = true; - continue; - - } - - var cameraIndex = cameras.indexOf( scene.children[ i ].type ); - - if ( cameraIndex > -1 ) { - - camera = scene.children[ i ]; - var container = document.getElementById( 'viewport' ); - - orbit = new THREE.OrbitControls( camera, container ); - orbit.addEventListener( 'change', render ); - - var aspect = container.offsetWidth / container.offsetHeight; - camera.aspect = aspect; - camera.updateProjectionMatrix(); - - } - - } - - if ( ! 
( hasLights ) ) setupLights(); - - scene.add( new THREE.GridHelper( 10, 2.5 ) ); - - render(); - -} - -function loadGeometry( data, url ) { - - var loader = new THREE.JSONLoader(); - var texturePath = THREE.LoaderUtils.extractUrlBase( url ); - data = loader.parse( data, texturePath ); - - if ( data.materials === undefined ) { - - console.log('using default material'); - data.materials = [new THREE.MeshLambertMaterial( { color: 0xb8b8b8 } )]; - - } - - var mesh; - - if ( data.geometry.animations !== undefined && data.geometry.animations.length > 0 ) { - - console.log( 'loading animation' ); - data.materials[ 0 ].skinning = true; - mesh = new THREE.SkinnedMesh( data.geometry, data.materials, false ); - - mixer = new THREE.AnimationMixer( mesh ); - animation = mixer.clipAction( mesh.geometry.animations[ 0 ] ); - - } else { - - mesh = new THREE.Mesh( data.geometry, data.materials ); - - if ( data.geometry.morphTargets.length > 0 ) { - - console.log( 'loading morph targets' ); - data.materials[ 0 ].morphTargets = true; - - mixer = new THREE.AnimationMixer( mesh ); - animation = mixer.clipAction( mesh.geometry.animations[ 0 ] ); - hasMorph = true; - - } - - } - - setupScene(); - setupLights(); - scene.add( mesh ); - - if ( animation != null ) { - - console.log( 'playing animation' ); - animation.play(); - animate(); - - } else { - - render(); - - } -} - -function loadBufferGeometry( data ) { - - var loader = new THREE.BufferGeometryLoader(); - - var bufferGeometry = loader.parse( data ); - - var material = new THREE.MeshLambertMaterial( { color: 0xb8b8b8 } ); - var mesh = new THREE.Mesh( bufferGeometry, material ); - setupScene(); - setupLights(); - scene.add( mesh ); - - render(); - -} - -function loadData( data, url ) { - - if ( data.metadata.type === 'Geometry' ) { - - loadGeometry( data, url ); - - } else if ( data.metadata.type === 'Object' ) { - - loadObject( data ); - - } else if ( data.metadata.type === 'BufferGeometry' ) { - - loadBufferGeometry( data ); - - } 
else { - - console.warn( 'can not determine type' ); - - } - -} - -function init( url ) { - - container = document.createElement( 'div' ); - container.id = 'viewport'; - document.body.appendChild( container ); - - renderer = new THREE.WebGLRenderer( { antialias: true, alpha: true } ); - renderer.setSize( container.offsetWidth, container.offsetHeight ); - renderer.setClearColor( 0x000000, 0 ); - container.appendChild( renderer.domElement ); - renderer.gammaInput = true; - renderer.gammaOutput = true; - - var aspect = container.offsetWidth / container.offsetHeight; - camera = new THREE.PerspectiveCamera( 50, aspect, 0.01, 50 ); - orbit = new THREE.OrbitControls( camera, container ); - orbit.addEventListener( 'change', render ); - camera.position.z = 5; - camera.position.x = 5; - camera.position.y = 5; - var target = new THREE.Vector3( 0, 1, 0 ); - camera.lookAt( target ); - orbit.target = target; - camera.updateProjectionMatrix(); - - window.addEventListener( 'resize', onWindowResize, false ); - - var xhr = new XMLHttpRequest(); - xhr.onreadystatechange = function ( x ) { - - if ( xhr.readyState === xhr.DONE ) { - - if ( xhr.status === 200 || xhr.status === 0 ) { - - loadData( JSON.parse( xhr.responseText ), url ); - - } else { - - console.error( 'could not load json ' + xhr.status ); - - } - - } - - }; - xhr.open( 'GET', url, true ); - xhr.withCredentials = false; - xhr.send( null ); - -} diff --git a/utils/exporters/blender/tests/scripts/review.py b/utils/exporters/blender/tests/scripts/review.py deleted file mode 100644 index f78ab131fa1d65451eb566e64534dfa986ee21b5..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/review.py +++ /dev/null @@ -1,127 +0,0 @@ -import os -import json -import stat -import shutil -import argparse - - -os.chdir(os.path.dirname(os.path.realpath(__file__))) -os.chdir('..') -review = os.path.join(os.getcwd(), 'review') - -MASK = stat.S_IRWXU|stat.S_IRGRP|stat.S_IXGRP|stat.S_IROTH|stat.S_IXOTH - -HTML = 
''' - - - %(title)s - - - - - - - - - - -''' - -def parse_args(): - parser = argparse.ArgumentParser() - parser.add_argument('json') - parser.add_argument('-t', '--tag', required=True) - return vars(parser.parse_args()) - - -def copy_for_review(tmp_json, tag): - tag_dir = os.path.join(review, tag) - if not os.path.exists(tag_dir): - print('making %s' % tag_dir) - os.makedirs(tag_dir) - dst_json = os.path.join(tag_dir, '%s.json' % tag) - print('moving %s > %s' % (tmp_json, dst_json)) - shutil.move(tmp_json, dst_json) - create_template(tag_dir, os.path.basename(dst_json)) - - print('looking for maps...') - with open(dst_json) as stream: - data = json.load(stream) - - textures = [] - materials = data.get('materials') - if data['metadata']['type'] == 'Geometry' and materials: - textures.extend(_parse_geometry_materials(materials)) - - images = data.get('images') - if data['metadata']['type'] == 'Object' and images: - for each in images: - textures.append(each['url']) - - textures = list(set(textures)) - print('found %d maps' % len(textures)) - dir_tmp = os.path.dirname(tmp_json) - for texture in textures: - texture = os.path.join(dir_tmp, texture) - dst = os.path.join(tag_dir, os.path.basename(texture)) - shutil.move(texture, dst) - print('moving %s > %s' % (texture, dst)) - - if data['metadata']['type'] == 'Object': - print('looking for non-embedded geometry') - for geometry in data['geometries']: - url = geometry.get('url') - if not url: continue - src = os.path.join(dir_tmp, url) - dst = os.path.join(tag_dir, url) - print('moving %s > %s' % (src, dst)) - shutil.move(src, dst) - elif data['metadata']['type'] == 'Geometry': - print('looking for external animation files') - for key in ('animation', 'morphTargets'): - try: - value = data[key] - except KeyError: - continue - - if not isinstance(value, str): - continue - - src = os.path.join(dir_tmp, value) - dst = os.path.join(tag_dir, value) - print('moving %s > %s' % (src, dst)) - shutil.move(src, dst) - - -def 
_parse_geometry_materials(materials): - maps = ('mapDiffuse', 'mapSpecular', 'mapBump', - 'mapLight', 'mapNormal') - textures = [] - for material in materials: - for key in material.keys(): - if key in maps: - textures.append(material[key]) - return textures - - -def create_template(tag_dir, filename): - html = HTML % { - 'title': filename[:-5].title(), - 'filename': filename - } - - html_path = os.path.join(tag_dir, 'index.html') - with open(html_path, 'w') as stream: - stream.write(html) - os.chmod(html_path, MASK) - - -def main(): - args = parse_args() - copy_for_review(args['json'], args['tag']) - - -if __name__ == '__main__': - main() diff --git a/utils/exporters/blender/tests/scripts/setup_test_env.bash b/utils/exporters/blender/tests/scripts/setup_test_env.bash deleted file mode 100755 index ebc5aa5d9aa076185ad81c086d6b5885f3f5cddb..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/setup_test_env.bash +++ /dev/null @@ -1,29 +0,0 @@ -#!/bin/bash - -# you must have blender setup to run from the command line -command -v blender >/dev/null 2>&1 || { echo >&2 "Blender is not accessible from the command line. Aborting."; exit 1; } - -export JSON=`python -c "import tempfile;print(tempfile.mktemp(prefix='$TAG.', suffix='.json'))"` - -export BLENDER_USER_SCRIPTS=$(cd "$DIR/../../"; pwd) - -# set the root for blend files -export BLEND=$(cd "$DIR/../blend"; pwd) - -# set the python script to exec in batch -export PYSCRIPT="$DIR/exporter.py" - -function makereview() { - if [ ! 
-f "$JSON" ]; then - echo "no json, export error suspected" - exit 1 - fi - python3 "$DIR/review.py" $JSON $@ -} - -function tagname() { - tag=`basename $0` - tag=${tag#test_} - tag=${tag%%.*} - echo $tag -} diff --git a/utils/exporters/blender/tests/scripts/test_buffer_geometry.bash b/utils/exporters/blender/tests/scripts/test_buffer_geometry.bash deleted file mode 100755 index 00cc36ab592b9ce01013823bc45d7acbe512f48a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_buffer_geometry.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --normals --geometryType BufferGeometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry.bash b/utils/exporters/blender/tests/scripts/test_geometry.bash deleted file mode 100755 index 601505e34e7d16ea4071afa39b76c2d98560f173..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_animation.bash b/utils/exporters/blender/tests/scripts/test_geometry_animation.bash deleted file mode 100755 index 787e1d7fd5e2eeba881c34ecf5cd607da7bf2b1a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_animation.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/anim.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces 
--animation rest --bones --skinning \ - --embedAnimation --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_bump_spec_maps.bash b/utils/exporters/blender/tests/scripts/test_geometry_bump_spec_maps.bash deleted file mode 100755 index a3276654ee294d855c5e9999051c0908af1bac05..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_bump_spec_maps.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/planeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --uvs --maps --exportTextures --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_diffuse_map.bash b/utils/exporters/blender/tests/scripts/test_geometry_diffuse_map.bash deleted file mode 100755 index 601c257215a8ad25fb23ff58b8b7e09cd227c509..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_diffuse_map.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --uvs --maps --exportTextures --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_influences.bash b/utils/exporters/blender/tests/scripts/test_geometry_influences.bash deleted file mode 100755 index ccbbd63891c8a144a9429e2d8f997e715206c04a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_influences.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/anim.blend --python 
$PYSCRIPT -- \ - $JSON --vertices --faces --animation --bones --skinning \ - --embedAnimation --influencesPerVertex 4 --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_lambert_material.bash b/utils/exporters/blender/tests/scripts/test_geometry_lambert_material.bash deleted file mode 100755 index d731351dfd8ec0a8e38253de2c73728e36f7b04a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_lambert_material.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_light_map.bash b/utils/exporters/blender/tests/scripts/test_geometry_light_map.bash deleted file mode 100755 index e649b4f853de22912c6eaec45f5f301c10df1bb8..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_light_map.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/lightmap.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --uvs --maps --exportTextures --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_mix_colors.bash b/utils/exporters/blender/tests/scripts/test_geometry_mix_colors.bash deleted file mode 100755 index 7964fd9e41a09f846f4cb4a4406195cfa156bb59..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_mix_colors.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender 
--background $BLEND/cubeB.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --colors --faceMaterials --mixColors --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_morph_targets.bash b/utils/exporters/blender/tests/scripts/test_geometry_morph_targets.bash deleted file mode 100755 index e4c16d8655629611b4986ec63ddcb4ced2275df1..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_morph_targets.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/anim.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --morphTargets --embedAnimation --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_normal_map.bash b/utils/exporters/blender/tests/scripts/test_geometry_normal_map.bash deleted file mode 100755 index 14544b7b9d482c54fd7711ddfb6e9b83dc541a04..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_normal_map.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/planeB.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --uvs --maps --normals \ - --exportTextures --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_normals.bash b/utils/exporters/blender/tests/scripts/test_geometry_normals.bash deleted file mode 100755 index 31fd740161329f0edd5d523cd727933cd0ae84aa..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_normals.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - 
-blender --background $BLEND/torusA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --normals --indent --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_phong_material.bash b/utils/exporters/blender/tests/scripts/test_geometry_phong_material.bash deleted file mode 100755 index fe039ff564ecd835788905a5bcf669dc3ed7df5b..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_phong_material.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/torusA.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --normals --faceMaterials --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_vertex_colors.bash b/utils/exporters/blender/tests/scripts/test_geometry_vertex_colors.bash deleted file mode 100755 index 406f262368b73beb73141dfb94dd0af5b30f2fea..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_vertex_colors.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeB.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --colors --faceMaterials --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_geometry_wireframe.bash b/utils/exporters/blender/tests/scripts/test_geometry_wireframe.bash deleted file mode 100755 index 143eb42191880131f5ae3355b3dd610a2da0a737..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_geometry_wireframe.bash +++ /dev/null @@ -1,8 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background 
$BLEND/cubeC.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --faceMaterials --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_area_light.bash b/utils/exporters/blender/tests/scripts/test_scene_area_light.bash deleted file mode 100755 index 4ee4c18e269656db0eed6e2c6b142b69ead3577a..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_area_light.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_area_light.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --lights --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry.bash b/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry.bash deleted file mode 100755 index 52a3d4765d07be578261e58dad2ba3eb0533d881..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --normals --geometryType BufferGeometry \ - --scene --materials --embedGeometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry_noembed.bash b/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry_noembed.bash deleted file mode 100755 index b1c7001209ccc8e287195838173e570c8c3f4713..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_buffer_geometry_noembed.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source 
"$DIR/setup_test_env.bash" - -blender --background $BLEND/cubeA.blend --python $PYSCRIPT -- \ - $JSON --vertices --normals --geometryType BufferGeometry \ - --scene --materials -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_children.bash b/utils/exporters/blender/tests/scripts/test_scene_children.bash deleted file mode 100755 index 1cc418e86c7c49bd1ea5da6e500d86da0307b395..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_children.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_children.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --cameras --materials --embedGeometry --lights --cameras --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_directional_light.bash b/utils/exporters/blender/tests/scripts/test_scene_directional_light.bash deleted file mode 100755 index 39db21d629bdaab72a7d4ae77aa36d94db981a75..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_directional_light.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_directional_light.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --lights --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_hemi_light.bash b/utils/exporters/blender/tests/scripts/test_scene_hemi_light.bash deleted file mode 100755 index b36df30a0d2829fa8dc3639e49d714a7eba5a9ad..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_hemi_light.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( 
dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_hemi_light.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --lights --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_instancing.bash b/utils/exporters/blender/tests/scripts/test_scene_instancing.bash deleted file mode 100755 index c9cb90e7dbfcc266d6fd7b27aab318068b6a34d1..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_instancing.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_instancing.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --scene --materials --enablePrecision \ - --precision 4 --embedGeometry --indent --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_maps.bash b/utils/exporters/blender/tests/scripts/test_scene_maps.bash deleted file mode 100755 index 3d3f358b129a285364a5534df9e7f705ebed5b38..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_maps.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_maps.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --scene --materials --maps \ - --uvs --embedGeometry --exportTextures --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_no_embed.bash b/utils/exporters/blender/tests/scripts/test_scene_no_embed.bash deleted file mode 100755 index c9185c972b54386ca42ede4484d345ffec8e5725..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_no_embed.bash +++ 
/dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_instancing.blend --python $PYSCRIPT -- \ - $JSON --vertices --faces --scene --materials --enablePrecision \ - --precision 4 --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_orthographic.bash b/utils/exporters/blender/tests/scripts/test_scene_orthographic.bash deleted file mode 100755 index ad02c958a9e527afce8b80f4d83642ffde2c39b9..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_orthographic.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_orthographic_camera.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --cameras --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_perspective.bash b/utils/exporters/blender/tests/scripts/test_scene_perspective.bash deleted file mode 100755 index 519787ed944201c997f9c48ce8a2c3a07da2ba86..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_perspective.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_perspective_camera.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --cameras --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_point_light.bash b/utils/exporters/blender/tests/scripts/test_scene_point_light.bash deleted file mode 100755 index 5d1021ae2b01655829b56d3119f592c8ac45ab55..0000000000000000000000000000000000000000 --- 
a/utils/exporters/blender/tests/scripts/test_scene_point_light.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_point_light.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --lights --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname) diff --git a/utils/exporters/blender/tests/scripts/test_scene_spot_light.bash b/utils/exporters/blender/tests/scripts/test_scene_spot_light.bash deleted file mode 100755 index 953a40b844afccbb78cc39855986476174ffc4e3..0000000000000000000000000000000000000000 --- a/utils/exporters/blender/tests/scripts/test_scene_spot_light.bash +++ /dev/null @@ -1,9 +0,0 @@ -#!/bin/bash - -DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" -source "$DIR/setup_test_env.bash" - -blender --background $BLEND/scene_spot_light.blend \ - --python $PYSCRIPT -- $JSON --vertices --faces --scene \ - --lights --materials --embedGeometry --geometryType Geometry -makereview $@ --tag $(tagname)