I have a flat game-world map that I've tiled with gdal2tiles.py (-p raster, obviously, as it has no geolocation info). Displaying it as a TMS layer works just fine.
I also have vector data I want to overlay. Through trial and error, I've found a configuration that almost works:
var scales = [0.692, 1.384, 2.768, 5.536, 11.072];
var options = {
    controls: [],
    maxExtent: mapBounds,
    scales: scales,
    units: 'm'
};
But I wonder why the values are so strange and whether they are really correct. So basically the question is: what IS the actual scale of an OpenLayers map? How long is a linestring from (0,0) to (1,0), and how long is one from (0,0) to (0,1)? What is the unit? Obviously it's not metres, and the documentation doesn't explain it anywhere either.
In case it matters, my vector data is stored in a PostGIS database. It is generated, not from any real-world source.
Scale is the ratio between a distance on the screen and the corresponding distance on the ground.
You can see the current map scale denominator displayed by the scale control, or query it with a call like this:
// assuming "map" is your OpenLayers.Map instance
map.getScale();
If your vector data is stored in a PostGIS geometry column, it should have a proper SRID. That SRID must match your map's SRID, or the data must be transformed into the same SRID.
Check your map projection after it has finished rendering with a call like this:
// again, assuming "map" is your OpenLayers.Map instance
map.getProjection();
Your vector data's SRID must match the value reported above.
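If they don't match, you can also reproject the geometries on the client before adding them to the layer. A minimal sketch, assuming the vector data arrives as EPSG:4326 (other SRIDs need Proj4js definitions) and that "feature" and "vectorLayer" stand in for your own objects:
// Sketch: reproject a feature's geometry from the data SRID to the map projection.
var dataProj = new OpenLayers.Projection('EPSG:4326');  // use whatever ST_SRID() reports
var mapProj = map.getProjectionObject();
feature.geometry.transform(dataProj, mapProj);
vectorLayer.addFeatures([feature]);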
Are you perhaps confusing those scale values with resolutions?
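For a non-georeferenced raster like yours, it is often easier to skip scales entirely and configure resolutions (map units per screen pixel) directly. A rough sketch, assuming the usual power-of-two pyramid that gdal2tiles produces; the actual values should be derived from your image size and tile count:
// Sketch: one resolution entry per zoom level, in map units per pixel.
var options = {
    controls: [],
    maxExtent: mapBounds,
    resolutions: [16, 8, 4, 2, 1],  // placeholder values for illustration
    units: 'm'
};
var map = new OpenLayers.Map('map', options);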
In our application, we draw rooms by reading information from an IFC file and then generate custom objects which are added to the model builder. For each vertex, we subtract the globalOffset so that the rooms align nicely with the model. This works perfectly for most models we have. However, for one model the globalOffset is huge, and thus the custom objects are drawn far away from the model.
The vertices we read from the IFC file are located in a reasonable space around {0, 0, 0}.
My question now is: How is the globalOffset calculated? What properties of the IFC file are taken into account?
As already stated, the other models work fine when we subtract the globalOffset from each vertex. Here is an example:
Thanks in advance for any form of help!
EDIT:
For everyone interested in the origin of the global offset in the IFC file: search for "ifcsite"; there should be a reference to a local placement, and this may contain a rather big translation (at least it did in my case).
By default, the global offset is the mid-point of the model's bounding box, like this:
// "world bounding box" comes from the model metadata
var bboxMeta = model.getData().metadata["world bounding box"];
var min = new THREE.Vector3(bboxMeta.minXYZ[0], bboxMeta.minXYZ[1], bboxMeta.minXYZ[2]);
var max = new THREE.Vector3(bboxMeta.maxXYZ[0], bboxMeta.maxXYZ[1], bboxMeta.maxXYZ[2]);
var bbox = new THREE.Box3(min, max);
var globalOffset = bbox.center();
It's used to avoid floating point precision issues for models that are far away from the viewer's origin. By default, Forge Viewer will use this offset to move the whole model to the viewer's origin.
You can also get the global offset with the following line of code, which returns the same value:
let globalOffset = viewer.model.getData().globalOffset;
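So, for custom geometry whose vertices are given in the model's original (un-offset) coordinates, subtracting this offset should place it correctly. A rough sketch, where "worldPos" and "customMesh" are placeholders for one of your room vertices and your generated object:
// Sketch: move a point from the model's original coordinate space into viewer space
// by subtracting the model's globalOffset.
const offset = viewer.model.getData().globalOffset;
const viewerPos = new THREE.Vector3(worldPos.x, worldPos.y, worldPos.z)
    .sub(new THREE.Vector3(offset.x, offset.y, offset.z));
customMesh.position.copy(viewerPos);
If the huge offset itself is the problem, another option I believe the viewer supports is passing a globalOffset override (e.g. { x: 0, y: 0, z: 0 }) in the model load options, so that the model and your custom objects share the same frame, at the cost of possible precision issues far from the origin.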
This is related to my previous question. I'm posting a new question to try and explain the situation better.
I am placing marker objects on a model using data taken from drone surveys. I have access to high accuracy GPS data and also omega/phi/kappa rotation data.
I am trying to use the Autodesk.Geolocation extension to convert the lon/lat/alt data to viewer space.
All models were originally created in Revit.
When I use the Geolocation extension, it seems like the refPointLMV and GlobalOffset are not correctly being taken into account.
Here's an example:
As you can see, the selected point [0] on the model is nowhere near the real GPS coords. Also, the refPointLMV has huge values.
Something similar happens when I take some lon/lat/alt data from a drone photo. The drone GPS data will be close to the model positionLL84, e.g. (4.586577106, 51.626037158, 49.095). However, when I do Geolocation.lonLatToLMV(4.586577106, 51.626037158, 49.095) I get a result way off screen.
We've had a support query open with Autodesk about this for over two months now, but haven't had much success there. They said the engineering team is too busy to work on this and recommended we try to fix the error on our side. Support ref LMV-5261.
I have been able to bring the result of Geolocation.lonLatToLMV into viewer space with the following code:
const gpsPosition = new THREE.Vector3(
    longitude,
    latitude,
    altitude,
);
const position = viewer
    .getExtension('Autodesk.Geolocation')
    .lonLatToLmv(gpsPosition);
const data = viewer.model.getData();
const globalOffset = data.globalOffset;
const refPointTransform = data.refPointTransform;
// applying the transform
position.add(globalOffset);
position.applyMatrix4(refPointTransform);
// HACK: after applying the above transforms, a final
// rotation of -45 degrees is required to move the points into position.
// Once this has been done, the locations match up exactly to the photos.
// Seems like a weird hack, but I've tested with over 20 drone photos, and they all match up.
const quaternion = new THREE.Quaternion().setFromEuler(
    new THREE.Euler(0, 0, -Math.PI / 4),
);
position.applyQuaternion(quaternion);
The problem here is that we are testing with a single model and this is clearly not a robust solution that we can expect to work with all future models and drone data we throw at it.
How long is it likely to take for the engineering team to fix this? Or are they likely to fix this at all?
Sorry for the delay due to the Chinese New Year. After checking with our engineering team, the current solution is to do the following:
Move the Project Base Point to N0 E0, but keep the angle to True North
Copy the LAT LONG values to the Survey Point
Afterwards, the result of the geo conversion should be as expected.
Here are the snapshots of the above setting and the result
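After re-publishing the model with those settings, one way to sanity-check the conversion from the browser console might be the following sketch, using the drone coordinates from the question as test input:
// Sketch: the converted point should now land inside (or near) the model's bounding box.
const geoExt = viewer.getExtension('Autodesk.Geolocation');
const lmvPos = geoExt.lonLatToLmv(new THREE.Vector3(4.586577106, 51.626037158, 49.095));
console.log('LMV position:', lmvPos);
console.log('Model bounding box:', viewer.model.getBoundingBox());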
The viewer's measurement tool has a calibration tool. It requires that the user select two points in the viewer and define the distance between them in the proper units.
My plan is that I will have the points defined in my model at a fixed distance. I will not need user input for this. How do I add the distance, unit, and size so as to programmatically set the calibration?
Edit: The workaround.
I need the default units to be meters, and 1 meter on the model should correctly measure as 1 meter with the measurement tool.
For the time being, what I did is this:
I manually calibrate the model to meters using the calibrate tool, picking two known points in the model.
Then I used this to get the scale factor -
var measureExtension = NOP_VIEWER.getExtension('Autodesk.Measure');
var factor = measureExtension.getCalibrationFactor();
(I used the above code lines in the developer console of the browser while interacting with the viewer simultaneously.)
which gave me this value factor = 0.039369.
I am adding this scale factor in my code once the model is loaded again.
measureExtension.calibrateByScale('m', 0.039369)
This seems to solve the issue for the models that I have with me.
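For completeness, this is roughly how I wire it up once the model has loaded (my own sketch, assuming "viewer" is the viewer instance; the hard-coded factor is still the weak point):
// Sketch: re-apply the previously measured calibration factor once geometry is ready.
// The factor (0.039369) was obtained manually as described above.
viewer.addEventListener(Autodesk.Viewing.GEOMETRY_LOADED_EVENT, function () {
    viewer.loadExtension('Autodesk.Measure').then(function (measureExt) {
        measureExt.calibrateByScale('m', 0.039369);
    });
});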
I know this will break once I have some different model with different default units. Please let me know if someone has a better solution.
I'm taking a quick guess by looking at the viewer3D.js source:
var measureExt = viewer.getExtension('Autodesk.Measure')
// pick from available values:
// 'decimal-ft'
// 'ft-and-fractional-in'
// 'ft-and-decimal-in'
// 'decimal-in'
// 'fractional-in'
// 'm'
// 'cm'
// 'mm'
// 'm-and-cm'
measureExt.calibrate('decimal-in', 10)
I have run the cyclone case from the OpenFOAM tutorials and want to view it using the built-in paraFoam viewer, which is based on ParaView 5.4.0.
The simulation has a number of particles in the diameter range [2e-5, 1e-4], and I would like to scale the rendered size of the particles with the diameter array provided with the results.
To do this I select the Point Gaussian representation for the Lagrangian fields (kinematiccloud), open the advanced properties, and select 'Scale by data array', after which the diameter array is chosen by default (although it's not possible to change it to another field, which I suspect is a bug). But then all the particles disappear from the view, as can be seen in the following screenshot:
My guess is that I need to choose proper values for the Gaussian radius and for the scale transfer function, but there is no documentation on what they should be set to. I have tried trial and error, but I cannot find any settings that bring the particles back and render them at different sizes.
Can someone enlighten me on how to set the Gaussian radius and scale transfer function properly?
The Point Gaussian representation has just been improved, and its configuration is now automatic. You may want to try the latest release of ParaView.
More info here:
https://blog.kitware.com/major-improvements-on-the-point-gaussian-representation/
I am in the process of converting OSM data into an open source Minecraft port (written in JavaScript: voxel.js). The JavaScript rendition is written such that each voxel (arbitrarily defined as a cubic meter) is created in relation to a single point of origin (x, y, z) = (0, 0, 0).
As an example, if one wanted to create a cubic chunk of voxels, one would simply generate voxels in relation to the origin (0,0,0): [(0,0,0), (1,0,0), (0,1,0), ...].
My question is this: I've exported OSM data, and the standard XML output (.osm) plots nodes in latitude and longitude. My initial thought is that I can create a map by calculating the distance of each node from an arbitrary point of origin (0,0,0) = (37.77559, -122.41392) using the Haversine formula, converting the distance to meters, finding the bearing, and plotting each node in relation to (0,0,0).
I've noticed, however, that there are a number of other export formats available: (.osm.pbf, .osm2pgsql, .imposm). I'm assuming they plot nodes in a similar fashion (lat, lng), but some of them have the ability to import directly into a database (e.g. PostgreSQL).
I've heard of people using PG add-ons like PostGIS, but (as this is my first dive into GIS) I'm unfamiliar with their capabilities and whether something like PostGIS would help me in plotting OSM data into a 2D voxel grid.
Are there functions within add-ons like PostGIS that would enable me to dynamically calculate the distance between two Lat/Lng points, and plot them in an x,y fashion?
I guess, fundamentally, my question is: if I create a script that plots OSM data into an x,y grid would I be reinventing the wheel, or is there a more efficient way to do this?
You need to transform from spherical coordinates (lat/lon, using WGS84) to Cartesian coordinates, such as Google's spherical Mercator.
In pseudo code:
transform(double lat, double lon) {
    double wgs84radius = 6378137;
    double shift = PI * wgs84radius;
    double x = lon * shift / 180;
    double y = log(tan((90 + lat) * PI / 360)) / (PI / 180);
    y = y * shift / 180;
    return {x, y};
}
This is the simplest way. Keep in mind that lat/lon are angles, while x and y are distances from the origin (0, 0).
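Since the target project is JavaScript, a direct translation of the above, plus the final step down to whole-metre voxel offsets from the arbitrary origin mentioned in the question, might look like this (a sketch; lonLatToMercator and lonLatToVoxel are just illustrative names):
// Sketch: WGS84 lon/lat in degrees -> spherical Mercator metres (EPSG:900913 / 3857)
function lonLatToMercator(lon, lat) {
    var radius = 6378137;
    var shift = Math.PI * radius;
    var x = lon * shift / 180;
    var y = Math.log(Math.tan((90 + lat) * Math.PI / 360)) / (Math.PI / 180);
    y = y * shift / 180;
    return { x: x, y: y };
}

// Sketch: metres relative to the chosen origin, rounded to 1 m voxels
var origin = lonLatToMercator(-122.41392, 37.77559);
function lonLatToVoxel(lon, lat) {
    var p = lonLatToMercator(lon, lat);
    return {
        x: Math.round(p.x - origin.x),
        y: Math.round(p.y - origin.y)
    };
}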
The OSM data is by default in the WGS84 (EPSG:4326) projection, which is based on an ellipsoidal Earth and measures latitude and longitude in degrees.
Most map tiles are generated in the EPSG:900913 "Google" spherical Mercator projection. This projection is based on a spherical Earth, and coordinates are measured in metres from the origin.
It really seems like the 900913 projection will fit quite nicely with your requirements.
Here is some code for converting between the two.
You might like to consider using osm2pgsql. During the import process all of the OSM map data is converted to the 900913 projection. What you are left with is all of the nodes, lines and polygons of the OSM map data in an easy-to-access Postgres database.
I was initially intimidated by this process but it is really quite straightforward and will give you lots of flexibility when it comes to using the OSM data.
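If you do go the PostGIS route, the distance and x/y questions from the post map directly onto built-in functions (ST_DWithin, ST_Transform, ST_X/ST_Y). A sketch of querying them from Node.js, assuming the node-postgres package and the osm2pgsql default table and column names (planet_osm_point, geometry column "way" in 900913; newer osm2pgsql versions use 3857):
// Sketch: find nodes within a radius of a lon/lat point and return projected metres.
const { Client } = require('pg');

async function nearbyNodes(originLon, originLat, radiusMetres) {
    const client = new Client();  // connection settings taken from environment variables
    await client.connect();
    const result = await client.query(
        `SELECT name,
                ST_X(way) AS x,   -- metres east of the Mercator origin
                ST_Y(way) AS y    -- metres north of the Mercator origin
           FROM planet_osm_point
          WHERE ST_DWithin(
                  way,
                  ST_Transform(ST_SetSRID(ST_MakePoint($1, $2), 4326), 900913),
                  $3)`,
        [originLon, originLat, radiusMetres]
    );
    await client.end();
    return result.rows;
}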