Position for sphere Geometry by Three.js - autodesk

I am adding a custom sphere in the Forge Viewer as below:
var sphere_minpt =
new THREE.Mesh(
new THREE.SphereGeometry(niceRadius, 20),
material_green)
How do I set the position for this custom sphere geometry?
Please suggest some solutions.

In three.js when you want to change the position of a 3d object you do this:
object.position.set ( x, y, z );
If you want to change an individual direction you do this:
object.position.x = number
However, I'm assuming that since you're using the Forge Viewer it may be a little different.
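For the Forge Viewer specifically, the usual pattern is to add the custom mesh to an overlay scene and then position it. Here is a minimal sketch; the overlay name 'custom-spheres' and the position values are placeholders, and niceRadius / material_green are assumed to be defined as in the question:
// Create an overlay scene and add the positioned sphere to it
var sphere_minpt = new THREE.Mesh(
    new THREE.SphereGeometry(niceRadius, 20),
    material_green);
sphere_minpt.position.set(10, 20, 30); // world coordinates of the sphere center (placeholders)
viewer.impl.createOverlayScene('custom-spheres');
viewer.impl.addOverlay('custom-spheres', sphere_minpt);
viewer.impl.invalidate(true); // trigger a re-render so the overlay becomes visible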

Related

How to move the camera in a forge viewer to face the true north direction, using the euler angles

I understand that the Forge Viewer uses three.js extensively. I have a couple of questions:
I want to point my Forge Viewer camera to the north direction (true north) and further synchronise the rotation based on the north values.
Also, is it possible to set the bounds?
I'm trying to synchronise the Forge Viewer based on a set of Euler angles (pitch, yaw and roll) that I have at hand.
I'm using the forge viewer version 7.
Fareed also asked this via email, so I'm copying & pasting my replies here.
Not sure which source model format you used, so I'll suppose it's Revit (RVT).
In the Revit model metadata, two attributes can help calculate the rotation to true north.
metadata['world north vector']['XYZ']: The project north vector of the Revit view.
metadata['custom values']['angleToTrueNorth']: The angle from project north to true north of the Revit view.
// Calculate project north angle
const projectNorthVector = new THREE.Vector3().fromArray( model.getData().metadata['world north vector']['XYZ'] );
const autoCam = viewer.autocam;
const frontDirection = autoCam.sceneFrontDirection.clone(); //!<<< viewer world north
const upVector = autoCam.sceneUpDirection.clone();
let crossVector = new THREE.Vector3();
crossVector.crossVectors( frontDirection, projectNorthVector );
const projectNorthAngle = projectNorthVector.angleTo( frontDirection ) * ( crossVector.dot( upVector ) < 0 ? -1 : 1 );
// Calculate true north angle
let trueNorthAngle = model.getData().metadata['custom values']['angleToTrueNorth'] * (Math.PI / 180);
// Final rotation angle from viewer world north to true north
const finalRotationAngle = projectNorthAngle + trueNorthAngle;
// and then rotate your vector by Z.
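For example, to rotate a direction vector from viewer world north to true north, you can apply that final angle about the scene up axis (assuming the up axis is your Z). A minimal sketch, reusing frontDirection and upVector from above:
// Rotate the viewer's world-north direction by the computed angle around the up axis
const rotatedNorth = frontDirection.clone().applyAxisAngle( upVector, finalRotationAngle );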
Sorry, I'm not familiar with the Euler angles (pitch, yaw, and roll), but from the three.js documentation I can see that it uses intrinsic Tait-Bryan angles.
Three.js uses intrinsic Tait-Bryan angles. This means that rotations are performed with respect to the local coordinate system. That is, for order 'XYZ', the rotation is first around the local-X axis (which is the same as the world-X axis), then around local-Y (which may now be different from the world Y-axis), then local-Z (which may be different from the world Z-axis).
So you can probably try to get that in either of the ways below:
Use the three.js API to get Euler Tait-Bryan angles from the quaternion:
const quaternion = viewer.getCamera().quaternion.clone();
const rotation = new THREE.Euler().setFromQuaternion( quaternion, 'XYZ' );
Or get it from the camera's rotation
const { rotation } = viewer.getCamera();
const eulerOrder = rotation.order;
Or refer to the Navisworks approach: https://adndevblog.typepad.com/aec/2019/07/get-roll-value-of-edit-current-viewpoint.html
viewer.navigation.setCameraUpVector( new THREE.Vector3(0,1,0), true );
const quaternion = viewer.getCamera().quaternion.clone();
let { x, y, z, w } = quaternion;
let roll = Math.atan2(2*y*w - 2*x*z, 1 - 2*y*y - 2*z*z);
let pitch = Math.atan2(2*x*w - 2*y*z, 1 - 2*x*x - 2*z*z);
let yaw = Math.asin(2*x*y + 2*z*w);
To set Euler angles on the camera, here is an approach, but I think you will need to change the Euler order if it's not XYZ.
const euler = new THREE.Euler(..., ..., ..., 'XYZ');
viewer.getCamera().quaternion.setFromEuler(euler);
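A rough usage sketch of the line above; the angle values are placeholders in radians, and the final invalidate call is an assumption about how to force the viewer to redraw:
// Build an Euler from your pitch/yaw/roll (radians) and push it onto the camera
const euler = new THREE.Euler( 0.1, 0.2, 0.3, 'XYZ' ); // placeholder angles
viewer.getCamera().quaternion.setFromEuler( euler );
viewer.impl.invalidate( true ); // assumption: ask the viewer to re-render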

GLTF file not well positioned by Cesium

I want to display a hurricane (a big isosurface object) in Cesium. For this I converted an OBJ file with longitude, latitude, altitude columns for each vertex of the isosurface representing the hurricane into a new OBJ file reprojected into the ECEF (Earth-Centered) frame. So the final OBJ file now contains X,Y,Z for each vertex instead of longitude, latitude, altitude. After a final conversion with obj2gltf, I try to display the glTF "hurricane" file in Cesium.JS using the code below:
console.log('loading hurricane.gltf');
var mymodel = viewer.scene.primitives.add(Cesium.Model.fromGltf({
url : 'data/hurricane.gltf',
modelMatrix : Cesium.Matrix4.IDENTITY,
asynchronous: false
}));
I can see my hurricane on the earth, but not at the right position. I suspect a problem with the matrix; the IDENTITY matrix does not seem to be the right one. I could try to make a new matrix but I can't find enough information about the axis orientation used by Cesium.
I verified the X,Y,Z ECEF coordinates; they are correct. Has anyone already met this problem?
If your glTF model origin is at the center of the hurricane, you can place it using a Cesium Entity, something like this:
// Longitude degrees, Latitude degrees, height in meters
var position = Cesium.Cartesian3.fromDegrees(-123.0744619, 44.0503706, height);
var heading = Cesium.Math.toRadians(0);
var pitch = 0;
var roll = 0;
var hpr = new Cesium.HeadingPitchRoll(heading, pitch, roll);
var orientation = Cesium.Transforms.headingPitchRollQuaternion(position, hpr);
var entity = viewer.entities.add({
name : 'Hurricane',
position : position,
orientation : orientation,
model : {
uri : 'data/hurricane.gltf'
}
});
viewer.trackedEntity = entity;
There are more complete working demos of this on Sandcastle.
But, if your hurricane is visible on the surface of the Earth using the identity matrix, that likely means that the origin of that model is nowhere near the center of the hurricane. You may need to edit the glTF file, to make sure that the model is centered on its own origin, and does not have some fixed Earth location pre-baked into the model's internal transformations.
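If you prefer to keep Model.fromGltf instead of an Entity, you can also build a modelMatrix from a geographic position rather than using IDENTITY; a minimal sketch, assuming the glTF is centered on its own origin (the longitude/latitude/height values are placeholders):
// Place the model by giving it an east-north-up frame at the hurricane's location
var origin = Cesium.Cartesian3.fromDegrees(-75.0, 25.0, 10000.0); // placeholder position
var modelMatrix = Cesium.Transforms.eastNorthUpToFixedFrame(origin);
var mymodel = viewer.scene.primitives.add(Cesium.Model.fromGltf({
    url : 'data/hurricane.gltf',
    modelMatrix : modelMatrix
}));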

Projection drift when rendering in WebGL over Google Map

I am trying to implement a WebGL-based rendering on Google Map (api3) as I want to render a massive amount of dynamic geometries.
Basically, I create a google.maps.OverlayView with a WebGL canvas attached to it and add it to the map.
However, I encountered some problems with the mapping of the projection. Basically, I extracted the "fromLatLngToPoint" function from the Google Maps API as follows:
// Extracted from the minified Maps code: m is Math, this.j is the projection's
// pixel origin, and this.B / this.F are the x / y scale factors of the Web
// Mercator projection; oe() clamps a value, re() converts degrees to radians.
function fromLatLngToPoint(a){
var c={x:0,y:0},
d=this.j;
c.x=d.x+a.lng*this.B;
var e=oe(m.sin(re(a.lat)),-(1-1E-15),1-1E-15);
c.y=d.y+.5*m.log((1+e)/(1-e))*-this.F;
return c
}
// clamp a to the range [b, c]
function oe(a,b,c){null!=b&&(a=m.max(a,b));null!=c&&(a=m.min(a,c));return a}
// degrees to radians
function re(a){return m.PI/180*a}
Then I implemented it in my vertex shader based on the documentation in Google Map Coordinates.
Basically, I have an event listener that sends the updated projection constants, the viewport bounds, and the zoom level to my shader.
Then my shader will calculate the new screen coordinates based on these inputs.
highp float e, x, y, offsetY, offsetX;
// projection transformation for target points
e = sin(p.y* PI/180.0);
y = prj_y + 0.5 * log((1.0+e)/(1.0-e))*(-F);
x = prj_x + p.x*B;
// projection transformation for offset (bounds)
e = sin(bound_y*PI/180.0);
offsetY = prj_y + 0.5 * log((1.0+e)/(1.0-e))*(-F);
offsetX = prj_x + bound_x*B;
// calculate actual pixel coord wrt zoom/numTiles
x = (x* numTiles - offsetX* numTiles);
y = (y* numTiles - offsetY* numTiles);
gl_PointSize = 5.0;
gl_Position = projectionMatrix * modelViewMatrix * vec4(x,y,0.0,1.0);
However, as shown in the screenshot below, there seem to be some errors: the rendered geometries are distorted. (I used the Google Maps polygon API to render some of the geometries as a comparison.)
Screenshot Here
I am totally at a loss, what might be the reason for this distortion?
I suspect that the single precision in the shader is giving rise to the error, so I am wondering if there is any workaround.
It is hard to debug this piece of code and diagnose the cause of the issue. I would suggest using the CanvasLayer library, which hides the concrete details of computing the coordinates at which to draw the polygon; that way you can focus on your app code and functionality. The performance of rendering the projected image should also be better.
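If you do want to keep the custom shader, a common workaround for single-precision drift is to do the large-magnitude Mercator math on the CPU in double precision and only pass small offsets (relative to the viewport origin) to the GPU. A rough sketch; latLngToWorld, points, boundLat, boundLng and numTiles are illustrative names, not part of the original code:
// Compute Web Mercator world coordinates in JS (double precision), subtract the
// viewport origin, and upload only the small relative offsets as vertex data.
function latLngToWorld(lat, lng) {
    var sinY = Math.min(Math.max(Math.sin(lat * Math.PI / 180), -(1 - 1e-15)), 1 - 1e-15);
    return {
        x: 128 + lng * (256 / 360),
        y: 128 + 0.5 * Math.log((1 + sinY) / (1 - sinY)) * -(256 / (2 * Math.PI))
    };
}
var originWorld = latLngToWorld(boundLat, boundLng); // viewport origin (the bound_x / bound_y point)
var positions = new Float32Array(points.length * 2);
points.forEach(function (p, i) {
    var w = latLngToWorld(p.lat, p.lng);
    positions[2 * i]     = (w.x - originWorld.x) * numTiles; // small values survive 32-bit floats
    positions[2 * i + 1] = (w.y - originWorld.y) * numTiles;
});
// The shader then only needs: gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 0.0, 1.0);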

Rotating a texture around x, y, z axis and Using it to draw a polygon

I'm trying to draw an arbitrary polygon with a transformed texture using the Graphics API.
Here's what I'm trying to do in 3 steps:
First, I have a texture (as a BitmapData).
Second, transform the texture: tile it and rotate it around the x, y or z axis (the y-axis for now).
Third, draw a polygon using the transformed texture.
I could rotate it around z-axis with the code below:
var gr:Graphics = sp.graphics;
gr.clear();
var mat:Matrix = new Matrix();
mat.scale( 0.5, 0.5 );
mat.rotate( angle );
gr.beginBitmapFill( bd, mat, true, true );
gr.moveTo( points[0].x, points[0].y );
for ( var lp1:int = 1; lp1 < points.length; lp1++ )
gr.lineTo( points[lp1].x, points[lp1].y );
gr.lineTo( points[0].x, points[0].y );
gr.endFill();
But I couldn't rotate the texture around the x or y axis, as that requires some sort of projection, I guess.
I thought about drawing a rotated Bitmap object onto a BitmapData and using it as a texture:
var bmp:Bitmap = new Bitmap( bd );
bmp.rotationY = angle;
var transformedBd:BitmapData = new BitmapData( 256, 256, true, 0 );
transformedBd.draw( bmp );
… and call gr.beginBitmapFill() with the transformedBd …
But with this code, the texture won't be tiled.
I also looked at the drawTriangles() method but AFAIK it only lets me draw a rotated polygon, not a polygon with a rotated texture.
If anyone has insights on this issue, please share.
Any help will be greatly appreciated!
Perhaps you can:
1 - put your 2D texture inside a Sprite or other container
2 - 3D-transform that container, for example by using
myContainer.rotationX = 20;
myContainer.rotationY = 200;
3 - then you create a new BitmapData()
4 - and you DRAW the entire myContainer into the BitmapData:
myBitmapData.draw(myContainer, myMatrix, myColorTransform, blendMode, myRectangle, smooth);
5 - and finally you delete the original 2D texture and myContainer.
Voila, you now have a 3D-transformed texture inside a single BitmapData.

WebGL Three.js : Texture alignment on a geometry face

I would like to write text on each face of an IcosahedronGeometry.
I'm able to generate the textures and apply them to all the faces:
for ( var i = 0; i < geometry.faces.length; i ++ ) {
geometry.faces[i].materialIndex = i;
materials.push( new THREE.MeshBasicMaterial( { overdraw: true, map: getTexture(i), wireframe: true, wireframeLinewidth: 1} ) );
}
// 3D element
element = new THREE.Mesh( geometry, new THREE.MeshFaceMaterial(materials) );
However, each texture is overwriting the others, and I can't align them correctly.
http://jsfiddle.net/jzbf7/
Any idea?
You need to understand how the UVs are set up for IcosahedronGeometry -- they are very similar to the UVs for SphereGeometry, in which a map of the world will cover the entire sphere.
This is very different from the UVs for CubeGeometry, where the texture maps to each face.
Experiment with the updated fiddle to see for yourself: http://jsfiddle.net/jzbf7/2/
(If the sphere renders too dark, render it again -- the colors are random.)
Also, there is a bug in the IcosahedronGeometry UV map. This can be seen at the "seam".
three.js r.56
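If the goal is one texture per face (CubeGeometry-style mapping), one workaround in that era of three.js is to overwrite the UVs so that every triangular face spans the whole texture. A rough sketch using the legacy Geometry API of r.5x; the corner UVs are chosen arbitrarily:
// Make every face of the icosahedron map the full [0,1] texture range
for ( var i = 0; i < geometry.faceVertexUvs[0].length; i ++ ) {
    geometry.faceVertexUvs[0][i] = [
        new THREE.Vector2( 0, 0 ),
        new THREE.Vector2( 1, 0 ),
        new THREE.Vector2( 0.5, 1 )
    ];
}
geometry.uvsNeedUpdate = true;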