Reformatting CSV to WKT

I need help reformatting a CSV file of polygons into a format readable by QGIS.
The data I downloaded has a bunch of seemingly unnecessary text before the coordinates of the polygons.
The coordinates are formatted like this:
{"geodesic":false,"type":"Polygon","coordinates":[[[-124.26718718727625,49.10353039748446],[-124.26664819810578,49.1037998920697],[-124.26718718727625,49.1037998920697],[-124.26718718727625,49.10353039748446]]]}
and I need them to be formatted like this:
MULTIPOLYGON [[[-124.26718718727625,49.10353039748446],[-124.26664819810578,49.1037998920697],[-124.26718718727625,49.1037998920697],[-124.26718718727625,49.10353039748446]]]

Let's say you have all your coordinates in a text file, line by line. Make a copy of your file, open it in Notepad++, and go through the following steps (a scripted alternative follows the example output below):
Ctrl+H
Find what: ({"geodesic":false,"type":"Polygon","coordinates":)(.+)(\}$)
Replace with: MULTIPOLYGON \2
Search mode: Regular expression
Click on Replace All or Alt+A
Done ...
MULTIPOLYGON [[[-124.26718718727625,49.10353039748446],...,...,[-124.26718718727625,49.10353039748446]]]
MULTIPOLYGON [[[-124.26718718727625,49.10353039748446],...,...,[-124.26718718727625,49.10353039748446]]]
MULTIPOLYGON [[[-124.26718718727625,49.10353039748446],...,...,[-124.26718718727625,49.10353039748446]]]
MULTIPOLYGON [[[-124.26718718727625,49.10353039748446],...,...,[-124.26718718727625,49.10353039748446]]]
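If this is something you need to repeat, here is a minimal Python sketch of the same replacement, assuming the GeoJSON-style strings sit one per line in a file named polygons.txt and the result should go to polygons_wkt.txt (both file names are hypothetical):
import re

# Keep the coordinate array and swap the prefix, same as the Notepad++ pattern.
pattern = re.compile(r'\{"geodesic":false,"type":"Polygon","coordinates":(.+)\}$')

with open("polygons.txt") as src, open("polygons_wkt.txt", "w") as dst:
    for line in src:
        dst.write(pattern.sub(r"MULTIPOLYGON \1", line.rstrip("\n")) + "\n")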

Related

Convert JSON to shapefile

I have a JSON file that contains a lot of data with polygons, lines, and points, but I can't manage to export the data to shapefile. Can someone help me figure out how to get there? The data is here:
https://www.sia.aviation-civile.gouv.fr/produits-numeriques-en-libre-disposition/donnees-zones-geographiques-uas.html
I'd be grateful if someone could help me solve this problem.
You can't export all of it to one shapefile, as it is not possible to mix geometry types in a shapefile, so you will need three shapefiles (points, lines, polygons).
I would make use of ogr2ogr, the Swiss army knife of vector formats, with something like:
ogr2ogr -nlt POINT -skipfailures points.shp geojsonfile.json
ogr2ogr -nlt LINESTRING -skipfailures linestrings.shp geojsonfile.json
ogr2ogr -nlt POLYGON -skipfailures polygons.shp geojsonfile.json
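If you prefer to do the split in Python, a rough equivalent with geopandas (assuming the library is installed and the GeoJSON loads into a single GeoDataFrame) would be:
import geopandas as gpd

gdf = gpd.read_file("geojsonfile.json")

# One shapefile per geometry family, mirroring the three ogr2ogr calls above.
for name, types in [("points", ["Point", "MultiPoint"]),
                    ("linestrings", ["LineString", "MultiLineString"]),
                    ("polygons", ["Polygon", "MultiPolygon"])]:
    subset = gdf[gdf.geom_type.isin(types)]
    if not subset.empty:
        subset.to_file(f"{name}.shp")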

Adding comma to coordinates in TSV file

I have a large TSV file that I'm viewing through MS Excel. I have a column showing arrays of coordinates. Unfortunately, the database from which the coords were taken did not include a comma separator between lat and long. The coords are formatted thus:
[[4.47917 51.9225],[-3.179559057 55.951719876],[-3.17055 55.9721],[-3.297777777 55.625],[-3.355388888 55.752611111]]
Whereas I need them formatted like:
[[4.47917, 51.9225],[-3.179559057, 55.951719876],[-3.17055, 55.9721],[-3.297777777, 55.625],[-3.355388888, 55.752611111]]
Is there a quick way that I can add these commas? (The TSV has too many values for me to correct them manually.)
I'm very much finding my way with this stuff, so any help would be much appreciated.
Cheers
Yes, lucky you 😊 Press Ctrl+H to open Replace, then replace each space with a comma followed by a space (Find what: a single space; Replace with: ", ").
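If the file is too large to edit comfortably, a small Python sketch can do the same thing, assuming the data lives in a file named coords.tsv (hypothetical name) and that bare spaces between numbers occur only inside the coordinate pairs:
import re

# Insert ", " only where a space separates two numbers, so tabs and any
# other whitespace in the TSV are left untouched.
pair_gap = re.compile(r"(?<=\d) (?=-?\d)")

with open("coords.tsv") as src, open("coords_fixed.tsv", "w") as dst:
    for line in src:
        dst.write(pair_gap.sub(", ", line))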

Importing a CSV file into Google Maps

I built a piece of software that generates trails for my own use.
I would like to test the software, so I created a CSV file that contains the longitude and latitude of the trail points.
What is the format of a CSV file that can be imported into Google Maps?
The documentation isn't very specific about CSV files, so I just tried a bunch of formats.
Option 1 is to have separate latitude and longitude columns. You will be able to specify columns in the upload wizard.
lon,lat,title
-20.0390625,53.27835301753182,something
-17.841796875,53.27835301753182,something
Option 2 is to have a single coordinate column with the coordinates separated by a space. You will be able to choose the order of the coordinate pair in the upload wizard.
lonlat,title
-20.0390625 53.27835301753182,something
-17.841796875 53.27835301753182,something
You'll also need one column that acts as the description for your points; it is, again, selectable in the wizard.
There seems to be no way to import CSVs as line geometries and no way to convert points to lines later on. Well-known text (WKT) in the coordinate column fails to import.
The separator needs to be a comma (,); semicolons (;), spaces, and tabs don't work.
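To generate a file in the Option 1 layout from your own trail points, a minimal Python sketch with the standard csv module (file name and points are made up) could look like:
import csv

# Hypothetical points; the column names are arbitrary because the upload
# wizard lets you pick which columns hold the coordinates and the title.
points = [
    (-20.0390625, 53.27835301753182, "something"),
    (-17.841796875, 53.27835301753182, "something"),
]

with open("trail_points.csv", "w", newline="") as f:
    writer = csv.writer(f)  # comma-separated, which is what Google Maps expects
    writer.writerow(["lon", "lat", "title"])
    writer.writerows(points)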

Formatting JSON properties

I have a JSON file with thousands of lines, which I need to have in another format.
My current data looks like this:
[{"Date":"2012-05-11","Value":19.5},
{"Date":"2012-05-15","Value":19.5},
{"Date":"2012-05-16","Value":18},
{"Date":"2012-05-17","Value":17.75},
...]
And it should look like this:
[Date.UTC(2012,5,11),19.5],
[Date.UTC(2012,5,15),19.5],
[Date.UTC(2012,5,16),18],
[Date.UTC(2012,5,17),17.75],
....]
Is there a tool or a quick way to do this? Changing thousands of lines manually would take too much time.
Thanks in advance.
Programmatically (say, in PHP) you could treat each line of your JSON file as a string and use preg_replace(), but since it's a one-time requirement, simply load your JSON file into a text editor and do the following (a scripted alternative is sketched after the steps):
Delete the leading and trailing square brackets.
Find all {"Date":" replace with [Date.UTC(
Find all ","Value": replace with ),
Find all }, replace with ],
Save
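If you'd rather not rely on text replacement, a small Python sketch can parse the JSON and write the target lines directly, assuming the input is a single JSON array in a file named values.json (hypothetical name); it mirrors the requested output, including the unpadded month and day, with no 0-based month adjustment:
import json

with open("values.json") as f:
    records = json.load(f)

with open("values.txt", "w") as out:
    for rec in records:
        year, month, day = (int(part) for part in rec["Date"].split("-"))
        # Mirror the requested layout; add outer array brackets if needed.
        out.write(f"[Date.UTC({year},{month},{day}),{rec['Value']}],\n")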

Creating shapefiles from SQL Server using ogr2ogr

I am trying to run the following command in a command window. The command executes, but it gives me no values in the .shp file. The table has GeographyCollections and Polygons stored in a field of type Geography. I have tried many variations for the Geography type in the SQL statement (binary, text, etc.) but no luck. The output .dbf file has data, so the connection to the database works, but the .shp and .shx files have no data and are only 17 KB and 11 KB in size, respectively.
Any suggestions?
ogr2ogr -f "ESRI Shapefile" -overwrite c:\temp -nln Zip_States -sql "SELECT [ID2],[STATEFP10],[ZCTA5CE10],GEOMETRY::STGeomFromWKB([Geography].STAsBinary(),4326).STAsText() AS [Geography] FROM [GeoSpatial].[dbo].[us_State_Illinois_2010]" ODBC:dbo/GeoSpatial#PPDULCL708504
ESRI Shapefiles can contain only a single type of geometry (Point, LineString, Polygon, etc.).
Your description suggests that your query returns multiple types of geometry, so restrict that first (with a WHERE clause on STGeometryType() = 'POLYGON', for example).
Secondly, you're currently returning the spatial field as a text string using STAsText(), but you're not telling OGR that it's a spatial field, so it's probably just treating the WKT as a regular text column and adding it as an attribute to the .dbf file.
To tell OGR which column contains your spatial information, you can add the Tables parameter to the connection string. However, there's no reason to do all the casting from WKT/WKB if you're using SQL Server 2008: ogr2ogr will load SQL Server's native binary format fine.
Are you actually using SQL Server 2008, or Denali? The serialisation format changed in Denali, and ogr2ogr can't read the new format, so in that case it's safer (but slower) to convert to WKB first.
The following works for me to dump a table of polygons from SQL Server to Shapefile:
ogr2ogr -f "ESRI Shapefile" -overwrite c:\temp -nln Zip_States -sql "SELECT ID, geom26986.STAsBinary() FROM [Spatial].[dbo].[OUTLINE25K_POLY]" "MSSQL:server=.\DENALICTP3;database=Spatial;trusted_connection=yes;Tables=dbo.OUTLINE25K_POLY(geom26986)"
Try the following command:
ogr2ogr shapeFileName.shp -overwrite -sql "select top 10 * from schema.table" "MSSQL:Server=serverIP;Database=dbname;Uid=userid;trusted_connection=no;Pwd=password" -s_srs EPSG:4326 -t_srs EPSG:4326