How can I convert Spark AR screen coordinates (pixels) to local world coordinates (meters)? For example, I want to position an object at the left edge of the screen. My problem is that objects use local coordinates, based on a central pivot point and measured in meters from the center of the screen, whereas the device reports the screen size in pixels.
I don't know if there is a conversion ratio, or how Spark AR measures this on different devices.
You need to look at the SceneModule. The plane should be a child of the Focal Distance object.
const TouchGestures = require('TouchGestures');
const Scene = require('Scene');
const R = require('Reactive');

const plane = Scene.root.find('plane0');

TouchGestures.onTap().subscribe((gesture) => {
    // Unproject the tap location (screen pixels) onto the focal plane;
    // the result is in meters, in the focal plane's coordinate space
    const focalPosition = Scene.unprojectToFocalPlane(R.point2d(gesture.location.x, gesture.location.y));
    // The unprojected x axis is mirrored relative to the plane's local space
    plane.transform.x = focalPosition.x.neg();
    plane.transform.y = focalPosition.y;
    plane.transform.z = 0;
});
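To pin an object to the left edge specifically, you can unproject a point on the left border of the screen instead of a tap location. A minimal sketch, assuming the CameraInfo module's previewSize signal gives the screen size in pixels (matching the tap coordinate space):

const CameraInfo = require('CameraInfo');

// Unproject the point at x = 0 (left edge), vertically centered
const leftEdge = Scene.unprojectToFocalPlane(
    R.point2d(0, CameraInfo.previewSize.y.div(2))
);
plane.transform.x = leftEdge.x.neg();
plane.transform.y = leftEdge.y;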
I am looking to turn 2 lat/lon positions into an x and y distance of the canvas, then apply the distance formula to it.
Right now I have:
const leftPoint = new LatLon(center.lat, center.lon).destinationPoint(semiMajorAxis, 270);
const rightPoint = new LatLon(center.lat, center.lon).destinationPoint(semiMajorAxis, 90);
const leftXY = Cartographic.toCartesian(Cartographic.fromDegrees(leftPoint.lon, leftPoint.lat));
const rightXY = Cartographic.toCartesian(Cartographic.fromDegrees(rightPoint.lon, rightPoint.lat));
const diameter = distanceFormula(leftXY.x, leftXY.y, rightXY.x, rightXY.y);
But the result of diameter is 18,000, even though both points are on my screen!
Cesium's Cartographic.toCartesian function converts a Cartographic (lon/lat/alt) coordinate to a full 3D Cartesian position. Imagine X, Y, Z axes with zero at the center of the Earth itself, and the Earth's surface approximately 6.3 million meters away in any direction.
If you're looking for 2D canvas / screen coordinates, you must follow this call with another function, Cesium.SceneTransforms.wgs84ToWindowCoordinates. That function converts the 3D WGS84 (Cartesian3) Earth position into a 2D (Cartesian2) screen position. There's a demo of wgs84ToWindowCoordinates being used in the Sandcastle Star Burst Example around line 287.
Also, it looks like you've rolled your own LatLon class (not shown above) that appears to duplicate functions from Cesium's Cartographic class. You might be able to make the code a little cleaner by using Cartographic directly instead of a homebrew class there. Likewise, you don't need to roll your own distanceFormula on the last line: once you have 2D Cartesian2 window coordinates, call Cesium.Cartesian2.distance to get the distance.
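Putting that together, here is a minimal sketch (reusing leftPoint and rightPoint from the question, and assuming a standard viewer is in scope):

// Convert lon/lat to 3D Earth-centered (Cartesian3) positions
const leftCartesian = Cesium.Cartesian3.fromDegrees(leftPoint.lon, leftPoint.lat);
const rightCartesian = Cesium.Cartesian3.fromDegrees(rightPoint.lon, rightPoint.lat);

// Project the 3D positions to 2D window (pixel) coordinates;
// note this returns undefined for positions behind the globe
const leftWindow = Cesium.SceneTransforms.wgs84ToWindowCoordinates(viewer.scene, leftCartesian);
const rightWindow = Cesium.SceneTransforms.wgs84ToWindowCoordinates(viewer.scene, rightCartesian);

// On-screen distance in pixels
const diameter = Cesium.Cartesian2.distance(leftWindow, rightWindow);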
I can't understand what you mean by 'x and y distance of the canvas'.
Generally, to calculate the distance between two points in CesiumJS, follow the steps below.
1. Define the two points
// Define x, y coordinates and convert degrees to radians
const longitudeRadian_1 = Cesium.Math.toRadians(longitudeDegree_1)
const latitudeRadian_1 = Cesium.Math.toRadians(latitudeDegree_1)
const longitudeRadian_2 = Cesium.Math.toRadians(longitudeDegree_2)
const latitudeRadian_2 = Cesium.Math.toRadians(latitudeDegree_2)
// Get cartographic from radians
const Carto_Point_1 = new Cesium.Cartographic(longitudeRadian_1, latitudeRadian_1)
const Carto_Point_2 = new Cesium.Cartographic(longitudeRadian_2, latitudeRadian_2)
// Get cartesian from cartographic
const Cartesian_Point_1 = Cesium.Cartographic.toCartesian(Carto_Point_1)
const Cartesian_Point_2 = Cesium.Cartographic.toCartesian(Carto_Point_2)
2. Calculate the distance between the two points
const distance = Cesium.Cartesian3.distance(Cartesian_Point_1, Cartesian_Point_2)
console.log(distance)
I hope this helps.
I understand that Forge Viewer uses three.js extensively, and I have a couple of questions.
I want to point my Forge Viewer camera to the north direction (true north) and further synchronise the rotation based on the north values.
Also, is it possible to set the bounds?
I'm trying to synchronise the Forge Viewer based on a set of Euler angles (pitch, yaw and roll) I have at hand.
I'm using Forge Viewer version 7.
Fareed also asked this via email, so I'm copying & pasting my replies here.
Not sure which source model format you used, so let's suppose it's Revit (RVT).
In the Revit model metadata, two attributes can help calculate the rotation from viewer north to true north:
metadata['world north vector']['XYZ']: the project north vector of the Revit view.
metadata['custom values']['angleToTrueNorth']: the angle from project north to true north of the Revit view.
// Calculate the project north angle
const metadata = model.getData().metadata;
const projectNorthVector = new THREE.Vector3().fromArray( metadata['world north vector']['XYZ'] );

const autoCam = viewer.autocam;
const frontDirection = autoCam.sceneFrontDirection.clone(); //!<<< viewer world north
const upVector = autoCam.sceneUpDirection.clone();

// Signed angle between viewer world north and project north
const crossVector = new THREE.Vector3();
crossVector.crossVectors( frontDirection, projectNorthVector );
const projectNorthAngle = projectNorthVector.angleTo( frontDirection ) * ( crossVector.dot( upVector ) < 0 ? -1 : 1 );

// Calculate the true north angle in radians
const trueNorthAngle = metadata['custom values']['angleToTrueNorth'] * ( Math.PI / 180 );

// Final rotation angle from viewer world north to true north
const finalRotationAngle = projectNorthAngle + trueNorthAngle;
// ... and then rotate your vector around the Z axis by this angle.
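For example, a sketch of that last step, assuming the model's up axis is Z and using three.js to apply the rotation (the sign may need flipping depending on your convention):

// Rotate the viewer's world-north direction around Z by the final angle
const zAxis = new THREE.Vector3( 0, 0, 1 );
const trueNorthDirection = frontDirection.clone().applyAxisAngle( zAxis, finalRotationAngle );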
Sorry, I'm not familiar with Euler angles (pitch, yaw, and roll), but from the three.js documentation I can see it uses intrinsic Tait-Bryan angles:
Three.js uses intrinsic Tait-Bryan angles. This means that rotations are performed with respect to the local coordinate system. That is, for order 'XYZ', the rotation is first around the local-X axis (which is the same as the world-X axis), then around local-Y (which may now be different from the world Y-axis), then local-Z (which may be different from the world Z-axis).
So you can probably get what you need in one of the ways below:
Use the three.js API to get the Euler Tait-Bryan angles from the camera quaternion:
const quaternion = viewer.getCamera().quaternion.clone();
const rotation = new THREE.Euler().setFromQuaternion( quaternion, 'XYZ' );
Or get it from the camera's rotation
const { rotation } = viewer.getCamera();
const eulerOrder = rotation.order;
Or refer to the Navisworks approach: https://adndevblog.typepad.com/aec/2019/07/get-roll-value-of-edit-current-viewpoint.html
viewer.navigation.setCameraUpVector( new THREE.Vector3(0,1,0), true );
const quaternion = viewer.getCamera().quaternion.clone();
let { x, y, z, w } = quaternion;
let roll = Math.atan2(2*y*w - 2*x*z, 1 - 2*y*y - 2*z*z);
let pitch = Math.atan2(2*x*w - 2*y*z, 1 - 2*x*x - 2*z*z);
let yaw = Math.asin(2*x*y + 2*z*w);
To set Euler angles on the camera, here is an approach, but I think you will need to change the Euler order if it's not XYZ:
const euler = new THREE.Euler(..., ..., ..., 'XYZ');
viewer.getCamera().quaternion.setFromEuler(euler);
How can I draw a gizmo in a CesiumJS application, given a position, an orientation, and possibly a scale?
By gizmo I mean a right-handed reference frame of three (x, y, z) axis vectors, ideally depicted in (R, G, B) colors.
I wish I could depict the orientation of any object (e.g. a glTF) by placing such a reference frame at the position of the object's origin (e.g. using its longitude, latitude and elevation) and following its orientation as defined by its heading, pitch and roll values. The three angles must be applied in their original order (heading first, pitch second, roll third), starting from the LTP-ENU convention (x = 0 = east, y = 0 = north, z = 0 = upward).
The inspector is not an option.
You can use DebugModelMatrixPrimitive.
Here's a Sandcastle example.
Sample code:
const viewer = new Cesium.Viewer("cesiumContainer");

// An example object whose orientation we want to visualize
const position = Cesium.Cartesian3.fromDegrees(-107.0, 40.0, 300000.0);
const redSphere = viewer.entities.add({
  name: "Red sphere with black outline",
  position: position,
  ellipsoid: {
    radii: new Cesium.Cartesian3(300000.0, 300000.0, 300000.0),
    material: Cesium.Color.RED.withAlpha(0.5),
    outline: true,
    outlineColor: Cesium.Color.BLACK,
  },
});

// Build a local reference frame at the position from heading/pitch/roll
const heading = Cesium.Math.toRadians(10);
const pitch = Cesium.Math.toRadians(50);
const roll = Cesium.Math.toRadians(0);
const hpr = new Cesium.HeadingPitchRoll(heading, pitch, roll);
const frame = Cesium.Transforms.headingPitchRollToFixedFrame(position, hpr);

// Draw the frame's axes (x = red, y = green, z = blue)
viewer.scene.primitives.add(new Cesium.DebugModelMatrixPrimitive({
  modelMatrix: frame,
  length: 800000,
  width: 3.0
}));

viewer.zoomTo(viewer.entities);
I want to display a hurricane (a big isosurface object) in Cesium. For this I converted an OBJ file containing longitude, latitude, altitude columns for each vertex of the isosurface representing the hurricane into a new OBJ file reprojected to the ECEF (Earth-Centered, Earth-Fixed) frame. So the final OBJ file now contains X, Y, Z for each vertex instead of longitude, latitude, altitude. After a final conversion by obj2gltf, I try to display the glTF "hurricane" file in CesiumJS using the code below:
console.log('loading hurricane.gltf');
var mymodel = viewer.scene.primitives.add(Cesium.Model.fromGltf({
    url : 'data/hurricane.gltf',
    modelMatrix : Cesium.Matrix4.IDENTITY,
    asynchronous : false
}));
I can see my hurricane on the Earth, but not at the right position. I suspect a matrix problem; IDENTITY seems not to be the right one. I could try to build a new matrix, but I can't find enough information about the axis orientations used by Cesium.
I verified the X, Y, Z ECEF coordinates, and they are correct. Has anyone run into this problem before?
If your glTF model origin is at the center of the hurricane, you can place it using a Cesium Entity, something like this:
// Longitude degrees, latitude degrees, height in meters
var position = Cesium.Cartesian3.fromDegrees(-123.0744619, 44.0503706, height);
var heading = Cesium.Math.toRadians(0);
var pitch = 0;
var roll = 0;
var hpr = new Cesium.HeadingPitchRoll(heading, pitch, roll);
var orientation = Cesium.Transforms.headingPitchRollQuaternion(position, hpr);
var entity = viewer.entities.add({
    name : 'Hurricane',
    position : position,
    orientation : orientation,
    model : {
        uri : 'data/hurricane.gltf'
    }
});
viewer.trackedEntity = entity;
There are more complete working demos of this on Sandcastle.
But if your hurricane is visible on the surface of the Earth using the identity matrix, that likely means the origin of that model is nowhere near the center of the hurricane. You may need to edit the glTF file to make sure the model is centered on its own origin and does not have a fixed Earth location pre-baked into its internal transformations.
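If you do keep the primitive API instead, one option is to compute a real model matrix rather than passing IDENTITY. A minimal sketch, assuming the glTF is centered on its own origin (the coordinates here are placeholders):

// Place the model's local origin at a given lon/lat/height,
// with its axes aligned east-north-up at that point
var position = Cesium.Cartesian3.fromDegrees(-80.0, 25.0, 10000.0);
var modelMatrix = Cesium.Transforms.eastNorthUpToFixedFrame(position);
var mymodel = viewer.scene.primitives.add(Cesium.Model.fromGltf({
    url : 'data/hurricane.gltf',
    modelMatrix : modelMatrix
}));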
I am working with Google Earth Engine and I am trying to extract/filter (clip) pixels in a band using another image (band). I calculated NDVI and created a threshold that rendered an image with NDVI > 0.3, but I want to extract the corresponding pixels in the visible and NIR bands.
Here is a code snippet.
var s2 = ee.ImageCollection('COPERNICUS/S2');
var s2_filtered = s2.filterDate('2017-01-01', '2017-12-31')
    .filterBounds(geometry); // custom geometry

var calcNDVI = function(x) {
  var ndvi = x.normalizedDifference(["B5", "B4"]).rename("ndvi");
  return x.addBands(ndvi);
};

var ndviCollection = s2_filtered.map(calcNDVI);
var mosaic = ndviCollection.max(); // per-pixel max composite (assumed)
var maxNDVI = mosaic.select("ndvi");
var threshold = maxNDVI.gt(0.3);
I am at the point where I want to clip the corresponding pixels in the "B", "G", "R" and "NIR" bands using the threshold variable (image). Obviously, I'm stuck here. Please let me know if there is a way to filter/clip pixels of one band using another band within GEE. The task is similar to using the clipper in QGIS, which is the option I am left with if this doesn't work.
Thanks for your help!
The variable threshold is a mask, so you have to mask out pixels in the mosaic using the threshold mask, right? If that is the case, simply update the mask of the image:
var masked = maxNDVI.updateMask(threshold)
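Then, to recover the corresponding reflectance bands rather than the NDVI itself, apply the same mask to a band selection. A minimal sketch; the band names assume Sentinel-2 (B2/B3/B4 visible, B8 NIR), so adjust them for your sensor:

// Keep only the pixels where NDVI > 0.3 in the visible and NIR bands
var maskedBands = mosaic.select(['B2', 'B3', 'B4', 'B8']).updateMask(threshold);
Map.addLayer(maskedBands, {bands: ['B4', 'B3', 'B2'], min: 0, max: 3000}, 'NDVI > 0.3');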