I'm developing a Java application that converts a point from EPSG:4326 to EPSG:2972.
Here is my code:
//Grab a transform between the two Coordinate Reference Systems
CoordinateReferenceSystem sourceCRS = CRS.decode("EPSG:4326");
CoordinateReferenceSystem targetCRS = CRS.decode("EPSG:2972");
MathTransform mathTransform = CRS.findMathTransform(sourceCRS, targetCRS, true);
//Point to convert (latitude, longitude)
DirectPosition2D srcDirectPosition2D = new DirectPosition2D(sourceCRS, 4.4665424, -52.4648442);
DirectPosition2D destDirectPosition2D = new DirectPosition2D();
//Transformation
mathTransform.transform(srcDirectPosition2D, destDirectPosition2D);
//Projected point
Point projectedPoint = new Point(destDirectPosition2D.getX(),
    destDirectPosition2D.getY(), null);
The result is X: 337473.6430296206, Y: 493858.9919024287, but it is wrong.
If I use a website such as http://cs2cs.mygeodata.eu/ and perform the same transformation, the correct result is 337470.842698; 493860.962631.
The result of my code corresponds to a transformation between EPSG:4326 and EPSG:32622, but I don't understand why.
What is wrong in my code?
Thanks a lot
Regards
Nicolas
The Proj.4 Text for EPSG:2972 is
+proj=utm +zone=22 +ellps=GRS80 +towgs84=2,2,-2,0,0,0,0 +units=m +no_defs
In GeoTools, EPSG database v8.6 shows the WKT for EPSG:2972 as:
2972=PROJCS["RGFG95 / UTM zone 22N", GEOGCS["RGFG95", DATUM["Reseau Geodesique Francais Guyane 1995", SPHEROID["GRS 1980", 6378137.0, 298.257222101, AUTHORITY["EPSG","7019"]], TOWGS84[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], AUTHORITY["EPSG","6624"]], PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]], UNIT["degree", 0.017453292519943295], AXIS["Geodetic longitude", EAST], AXIS["Geodetic latitude", NORTH], AUTHORITY["EPSG","4624"]], PROJECTION["Transverse_Mercator", AUTHORITY["EPSG","9807"]], PARAMETER["central_meridian", -51.0], PARAMETER["latitude_of_origin", 0.0], PARAMETER["scale_factor", 0.9996], PARAMETER["false_easting", 500000.0], PARAMETER["false_northing", 0.0], UNIT["m", 1.0], AXIS["Easting", EAST], AXIS["Northing", NORTH], AUTHORITY["EPSG","2972"]]
Note that the TOWGS84 parameters differ: the Proj.4 string uses +towgs84=2,2,-2,0,0,0,0 while the WKT uses all zeros. This would explain the difference you are seeing. I don't know which is correct, but I have a feeling the EPSG WKT is wrong.
With the all-zero shift, EPSG:2972 behaves like EPSG:32622. The more technical explanation is that the spheroid of the EPSG:2972 datum (GRS80) is made to behave as if it were the spheroid of the EPSG:32622 datum (WGS84).
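For a sense of scale, the two results differ by only a couple of metres, which is the same order of magnitude as the ±2 m +towgs84 shift. A quick arithmetic check in plain JavaScript, just comparing the numbers quoted above (not a proof that either value is correct):

```javascript
// Difference between the GeoTools result and the cs2cs result, in metres.
const geotools = { x: 337473.6430296206, y: 493858.9919024287 };
const cs2cs = { x: 337470.842698, y: 493860.962631 };

const dx = geotools.x - cs2cs.x; // about +2.8 m
const dy = geotools.y - cs2cs.y; // about -2.0 m
console.log(dx.toFixed(3), dy.toFixed(3));
```

A datum shift of a few metres applied (or not applied) before the projection produces exactly this kind of metre-level disagreement in the projected coordinates.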
Good day. I am trying to convert a raster to points using Google Earth Engine. My raster has one band (clusters) and it has been clipped to my ROI. I am aware of the reduceToVectors function on Google Earth Engine, but as I understand, this function creates areas with the same adjacent value, whereas what I want is to create as many points as there are pixels.
So far, I have tried different versions of:
var vectors = image.reduceToVectors({
  reducer: null,
  geometry: treat,
  scale: 30,
  crs: image.projection().getInfo().crs,
  geometryType: 'centroid',
  labelProperty: 'null',
  eightConnected: false,
  maxPixels: 1e15
});
Thanks a lot for your help.
ee.Image.sample returns a point for every pixel.
var vectors = image.sample({
  region: treat,
  geometries: true, // if you want points
});
If you do not specify a scale and crs, it will sample every pixel of the input image at its native resolution. If you do, it will sample at the given scale instead.
Demonstration script:
var region = ee.Geometry.Polygon(
[[[-110.00683426856995, 40.00274575078824],
[-110.00683426856995, 39.99948706365032],
[-109.99576210975647, 39.99948706365032],
[-109.99576210975647, 40.00274575078824]]], null, false);
var image = ee.Image('CGIAR/SRTM90_V4');
Map.setCenter(-110, 40, 16);
Map.addLayer(image, {min: 1000, max: 2000}, 'SRTM');
var vectors = image.sample({
  region: region,
  geometries: true,
});
print(vectors);
Map.addLayer(ee.FeatureCollection([region]).style({"color": "white"}));
Map.addLayer(vectors);
https://code.earthengine.google.com/625a710d6d315bad1c2438c73bde843b
We are developing a standardized report for our activities. The last graph I need is to display the geographic area of the activities (there are close to 100 locations).
The output for these reports is a PDF at letter or A4 size.
The report is a Matplotlib figure, where:
fig = plt.figure(figsize=(8.5, 11))
rect0 = 0, .7, .18, .3
rect1 = .3, .7, .18, .3
rect2 = .8, .29, .2, .7
rect3 = 0, 0, .8, .4
ax1 = fig.add_axes(rect0)
ax2 = fig.add_axes(rect1)
ax3 = fig.add_axes(rect2)
ax4 = fig.add_axes(rect3)
The contents and layout for axes 1-3 are settled and work great. However ax4 is where the map contents would be displayed (ideally).
I was hoping to do something like this:
map1 = Basemap(llcrnrlon=6.819087, llcrnrlat=46.368452, urcrnrlon=6.963978,
               urcrnrlat=46.482906, resolution='h', projection='tmerc',
               lon_0=6.88, lat_0=46.42, ax=ax4)
map1.readshapefile('a valid shape file that works') #<----- this is the sticking point
map1.draw(insert locator coordinates)
plt.savefig(report to be inserted to document)
plt.show()
However, I have not been successful in obtaining a shapefile that works, whether from OpenStreetMap or other GIS sources.
Nor have I identified the correct process to transform the data from OpenStreetMap.
Nor have I identified the process to extract that information from the OSM XML document or the converted GeoJSON document.
Ideally I would like to grab the bounding box information from openstreetmaps and generate the map directly.
What is the process to get a shapefile that works with the .readshapefile() call?
Or, alternatively, how do I get the defined map into a Matplotlib axes?
It might be easiest to use the cartopy.io.img_tiles module, which will automatically pull OSM tiles for use with cartopy. Using the pre-rendered tiles avoids the trouble of handling and styling individual shapefiles/XML.
See the cartopy docs on using these tiles within cartopy.
I have two GeoJSON polygon objects. They are too large to post here. I am using Turf.js to compute the union of these polygons and plot it on the map, but it is not working properly.
I think a few points in the middle are slightly different. So is there any way to ignore these points in the middle in the Turf.js union?
See the images below for a better understanding.
Polygon 1 :
Polygon 2 :
Merged polygon for the code below:
polygons = {
  "type": "FeatureCollection",
  "features": [poly1, poly2]
};
The main union result:
union = turf.union(poly1, poly2);
I want to ignore the points in the middle of the boundary. I know the points on the shared boundary of the two polygons may not match exactly, but can I treat points that are nearly identical as equal, so the leftover points in the middle are removed?
Or is there an alternative way to union polygons that tolerates small coordinate differences and removes these middle points?
You can try running the resulting polygon through turf.buffer(result, 0, 'kilometers') (turf-buffer docs). If your result is invalid geojson then using a buffer of 0 should cleanse the geometry (remove the points/lines in the middle).
It is hard to say what will work for sure without seeing the actual GeoJSON of the result. Is there any way you can upload it to pastebin or something?
Update - Turf buffer did not work in this case. The only solution that I could get to work was doing this on the result of turf.union(p1, p2).
result.geometry.coordinates = [result.geometry.coordinates[0]]
You want to be careful with this solution, as it removes everything from the polygon other than the external ring.
To understand why/how this works, you will want to make sure you understand how the coordinates for geojson polygons work. From the geojson.org geojson polygon specification
For type "Polygon", the "coordinates" member must be an array of LinearRing coordinate arrays. For Polygons with multiple rings, the first must be the external ring and any others must be internal rings or holes.
The external ring is essentially the outline of your polygon. Any internal rings are usually represented as holes. In your case, the internal rings were actually lines.
When looking at the coordinates for a GeoJSON polygon, you will notice that all rings are contained within an outer array. Here is an example of a GeoJSON polygon with only a single (external) ring:
{"type": "Feature", "properties": {}, "geometry": {"type": "Polygon", "coordinates": [ [ [0, 0], [4, 0], [4, 4], [0, 4], [0, 0] ] ]}}
Notice that the first and last coordinate of a ring must always be the same. That ensures we get a closed shape (i.e. a polygon).
Now here is an example with an external ring and an internal ring:
{"type": "Feature", "properties": {}, "geometry": {"type": "Polygon", "coordinates": [ [ [0, 0], [4, 0], [4, 4], [0, 4], [0, 0] ], [ [1, 1], [2, 1], [2, 2], [1, 2], [1, 1] ] ]}}
If we apply the suggested solution to this example, we get the first example back, because we keep only the first element of the coordinates array, which is always the external ring. Any subsequent elements represent internal rings (which is what your lines were, even though they are technically not valid internal rings):
{"type": "Feature", "properties": {}, "geometry": {"type": "Polygon", "coordinates": [ [ [0, 0], [4, 0], [4, 4], [0, 4], [0, 0] ] ]}}
As you can see, this removes all internal rings from the polygon. That is why you must be careful with how you use it: if you ever have valid internal rings (holes), they will be removed as well.
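A minimal plain-JavaScript sketch of this cleanup on a made-up union result (the geometry here is invented for illustration):

```javascript
// Hypothetical union result: an external ring plus a degenerate "inner ring"
// left over from a near-coincident shared border.
const result = {
  type: "Feature",
  properties: {},
  geometry: {
    type: "Polygon",
    coordinates: [
      [[0, 0], [4, 0], [4, 4], [0, 4], [0, 0]], // external ring
      [[2, 0], [2, 4], [2, 0]]                  // leftover line, not a real hole
    ]
  }
};

// Keep only the external ring, dropping every internal ring.
result.geometry.coordinates = [result.geometry.coordinates[0]];

console.log(result.geometry.coordinates.length); // 1
```

Again, this is only safe when you know the polygon has no legitimate holes.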
I think the reason this happens is that your polygons (p1 and p2) share a border.
I faced the same problem: after buffering by a small positive amount and then by the same negative amount, the lines disappeared. But this left the polygon with more points than the original, so I used this workaround:
inner = [YOUR FEATURE COLLECTION]
var areas = []
for (var i = 0; i < inner.geometry.coordinates.length; i++) {
  let item = inner.geometry.coordinates[i]
  if (item.length > 10) areas.push(item)
}
inner = turf.polygon(areas)
As you can see, I am removing the "non-complex" rings (assuming that a ring with fewer than 10 points is not a real area).
This happens because the coordinates of the two polygons are not 100% identical, creating a small gap when you merge them.
When faced with this problem, I had to use Turf's distance method to check every vertex of the polygons, and if there was a small difference between two vertices, I made them equal.
The implementation may vary with the map library you are using, but it should go something like this:
layers.forEach(layer => {
  layers.forEach(innerLayer => {
    if (layer === innerLayer) return;
    // Here you would check whether the vertices are close to each other, using distance.
    // If they are close, make them equal and update the layer.
  });
});
Only after making the vertices of the polygons identical would you merge them with the union method.
Since the implementation is pretty project-specific, I won't waste both our time with actual code, but I believe that with the insights above you should be good to go.
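Still, the snapping idea itself can be sketched in a few lines of plain JavaScript. This uses planar distance for brevity (real map code would use a geodesic distance such as turf.distance), and all names here are invented for illustration:

```javascript
// Snap each vertex of ring b onto the nearest vertex of ring a when the two
// are within eps of each other; other vertices are left untouched.
function snapRing(a, b, eps) {
  return b.map(pt => {
    for (const ref of a) {
      if (Math.hypot(pt[0] - ref[0], pt[1] - ref[1]) <= eps) {
        return [ref[0], ref[1]];
      }
    }
    return pt;
  });
}

const ringA = [[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]];
// Almost the same shared border, off by tiny rounding noise.
const ringB = [[1.0000001, 0], [2, 0], [2, 1], [1, 1.0000002], [1.0000001, 0]];

const snapped = snapRing(ringA, ringB, 1e-6);
console.log(snapped[0]); // [1, 0]
```

Only after this normalisation would you hand the rings to turf.union.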
I am using the Google Maps API method isLocationOnEdge:
var isLocationNear = google.maps.geometry.poly.isLocationOnEdge(latlng, new google.maps.Polyline({
  path: google.maps.geometry.encoding.decodePath(result.routes[0].overview_polyline)
}), .00001);
I don't understand how the tolerance is related to kilometres.
isLocationOnEdge(point:LatLng, poly:Polygon|Polyline, tolerance?:number)
Let's say I want to detect whether a user is within 100 m of any polyline drawn on the map. How can I do this?
From one of the comments in this post:
The tolerance is based on the decimal-place accuracy desired, in terms of latitude and longitude.
For example, if (33.00276, -96.6824) is on the polyline and the tolerance is 0.00001, then changing the point to (33.00278, -96.6824) means the point will no longer be on the polyline.
So you can probably use 0.001 as the tolerance value if you want to detect a location within about 100 m of the polyline.
For example, if your location is (1.001, 12) and one of the points on the polyline is (1, 12), the distance between your location and the polyline will be about 111.319 metres. The tolerance between (1.001, 12) and (1, 12) is 0.001, so isLocationOnEdge() will return true.
If your location is (1.002, 12), the distance to (1, 12) will be about 222.638 metres. The tolerance between them is 0.002, so if you use 0.001 as the tolerance value for isLocationOnEdge(), it will return false.
You can see the sample code from this JSFiddle: https://jsfiddle.net/j7cco3b0/1/
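The degree-to-metre rule of thumb above can be sanity-checked with a standalone haversine calculation (this uses the mean Earth radius, so the number comes out slightly below the 111.319 m quoted above, which is based on the equatorial radius):

```javascript
// Great-circle distance between two lat/lng points, in metres.
function haversineMeters(lat1, lon1, lat2, lon2) {
  const R = 6371000; // mean Earth radius in metres (an approximation)
  const toRad = d => d * Math.PI / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a = Math.sin(dLat / 2) ** 2 +
            Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// A latitude difference of 0.001 degrees is roughly 111 metres.
console.log(haversineMeters(1, 12, 1.001, 12)); // ~111.2
```

So a tolerance of 0.001 corresponds to roughly 100 m only along latitude; along longitude, a degree shrinks with cos(latitude), which is why a metre-based check is more precise.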
You can also create a custom function to validate in metres for better precision.
var isLocationOnEdge = function (location, polyline, toleranceInMeters) {
  // Note: this checks the distance to each vertex of the path,
  // not the distance to the segments between them.
  for (var leg of polyline.getPath().getArray()) {
    if (google.maps.geometry.spherical.computeDistanceBetween(location, leg) <= toleranceInMeters) {
      return true;
    }
  }
  return false;
};
The package for isLocationOnEdge that I found is given below. It worked for me.
com.google.maps.android.PolyUtil.isLocationOnEdge()
I would like to know why AS3's colorTransform only transforms the border of a shape.
A similar question has been posted; however, I do not think such a massive workaround is needed.
I have something like:
var sh:Shape = new Shape();
sh.graphics.lineStyle(4, 0x000000);
sh.graphics.beginFill(0xFFFF00);
sh.graphics.drawRect(0, 0, 200, 200);
sh.graphics.endFill();
addChild(sh);
Yes, I know we can use with(sh.graphics) here. However, if I apply a color transform like:
sh.transform.colorTransform = new ColorTransform(1, 1, 1, 1, red_offset, green_ofs, b_off, 0);
only the border of the shape is transformed.
I've tried redrawing the object every frame with a different fill, but that kills performance; about ten 3D planes were enough to bring it down.
I can only guess that beginFill() does not use the pen set by lineStyle(), and that may be causing the problem. Still, I would really like to know the cause, as I need my uber-super-semi3d-spinner to change colors while spinning, not just its borders! :)
Thanks in advance!
I don't know why ColorTransform affects only the line color (it seems to be just a design decision), but ColorMatrixFilter will transform the entire shape (tested). Don't be afraid of it; it's quite simple. The first four columns of the matrix are multipliers (1.0 is 100%) and the fifth column is added to the result.
import flash.display.Shape;
import flash.filters.ColorMatrixFilter;

var sht:Shape = new Shape();
sht.graphics.lineStyle(4, 0x7F7FFF);
sht.graphics.beginFill(0xFFFFFF);
sht.graphics.drawRect(0, 0, 200, 200);
sht.graphics.endFill();
sht.x = 300;
sht.y = 100;
sht.filters = [ new ColorMatrixFilter(
[ 0.5, 0.0, 0.0, 0.0, 0.0,
0.0, 1.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.7, 0.0, 0.0,
0.0, 0.0, 0.0, 1.0, 0.0
])];
addChild(sht);
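For intuition, the per-pixel arithmetic such a filter performs can be sketched in a few lines (plain JavaScript rather than AS3, with a made-up helper name):

```javascript
// Apply a 4x5 color matrix (flattened, row-major) to an RGBA pixel.
// Output channel i is: m[i*5]*r + m[i*5+1]*g + m[i*5+2]*b + m[i*5+3]*a + m[i*5+4]
function applyColorMatrix(m, [r, g, b, a]) {
  const row = i => m[i * 5] * r + m[i * 5 + 1] * g + m[i * 5 + 2] * b +
                   m[i * 5 + 3] * a + m[i * 5 + 4];
  return [row(0), row(1), row(2), row(3)];
}

// The matrix from the answer: halve red, keep green, scale blue by 0.7.
const m = [0.5, 0.0, 0.0, 0.0, 0.0,
           0.0, 1.0, 0.0, 0.0, 0.0,
           0.0, 0.0, 0.7, 0.0, 0.0,
           0.0, 0.0, 0.0, 1.0, 0.0];

// A white pixel keeps full green and alpha but has red and blue reduced.
console.log(applyColorMatrix(m, [255, 255, 255, 255]));
```

Because the filter is applied to the rendered pixels, it recolors fill and stroke alike, which is why it works where ColorTransform does not.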