
3D model scale between Blender and Wikitude 3D Encoder

The Wikitude SDK and 3D Encoder do not take units into account, nor do they apply any unit conversions. The size of a 3D model is therefore determined directly by the coordinate values of the model's vertices.

In the Wikitude 3D Encoder, one square of the base grid corresponds to one unit; the grid is 40 x 40 units.

You have several options for changing the size of your 3D model:
1. scale it within your 3D modeling app (Blender, Maya, AutoCAD, ...)
2. scale it within Wikitude Studio
3. scale it via the Wikitude JavaScript API (Model class)

The size of a 3D model
- augmented in a geo-based AR scene is explained here
- augmented over a target image (this also applies to extended tracking) is relative to the real-world size of the target image. The height of the target image represents one unit in augmentation space.

Hello again,

I'd like to know whether there is any relationship between the size of 3D objects in my Blender environment (usually measured in meters) and the encoded object in the 3D Encoder you provide. I've seen this post, but I'm not sure I understand it. What units should I use so that the encoded objects are displayed correctly? Should I create my 3D models in meters, in centimeters, or without units?


What I want to test is having a 3D model of an object that is 1 m in height: when I create it at a relative location 5 meters south of my position, I should see the object as if it were 1 meter tall. As I gather from this, scaling objects correctly is really important when augmenting the real world.


Thanks in advance