EDIT/UPDATE: I never figured out the error message, but it was a new day, so I tried again, and this post helped: How to add properties to topojson file?
The way to marry them turned out to be adding a second NAME:
topojson -o output.json --id-property=NAME,NAME -p -e counties.csv CountiesTopo.json
I have a nice TopoJSON file that shows all the counties in Georgia. It loads and displays fine. I converted it from a .shp with shpescape.com.
BUT! I want to blend it with some external properties: the graduation rate for each county, as stored in a .csv. So I'm trying topojson at the command line.
GaCountiesTopo.json has a field called NAME; the data's capitalized.
counties.csv has a field called NAME; the data's capitalized.
I tried this:
topojson \
-o output.json \
-e counties.csv \
-- id-property=NAME \
-p \
-- CountiesTopo.json
And got this:
fs.js:427
return binding.open(pathModule._makeLong(path), stringToFlags(flags), mode);
^
Error: ENOENT, no such file or directory 'NAME'
at Object.fs.openSync (fs.js:427:18)
at Object.fs.readFileSync (fs.js:284:15)
at inputJson (/usr/local/lib/node_modules/topojson/bin/topojson:218:30)
at pop (/usr/local/lib/node_modules/topojson/node_modules/queue-async/queue.js:28:14)
at Object.q.defer (/usr/local/lib/node_modules/topojson/node_modules/queue-async/queue.js:59:11)
at /usr/local/lib/node_modules/topojson/bin/topojson:164:5
at Array.forEach (native)
at Object.<anonymous> (/usr/local/lib/node_modules/topojson/bin/topojson:163:8)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
No such file or directory 'NAME', eh? Is this a syntax error on my part? Or is it possible that the field name 'NAME' for some reason isn't matching up one-to-one? Or maybe somewhere there's a lonely county in the .json that doesn't have a counterpart in the .csv, like maybe I'm trying to match Screven with Scerven? 159 counties!
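One quick way to smoke out a lonely county is to diff the join keys from both sides. The sketch below uses tiny stand-in name lists so it is self-contained; with the real files you would extract the NAME column from counties.csv and the NAME properties from the TopoJSON instead:

```shell
# comm -3 prints lines unique to either (sorted) input -- any output means
# at least one name has no partner on the other side
printf '%s\n' APPLING SCREVEN > /tmp/csv-names.txt
printf '%s\n' APPLING SCERVEN > /tmp/topo-names.txt
mismatches=$(comm -3 <(sort /tmp/csv-names.txt) <(sort /tmp/topo-names.txt))
echo "$mismatches"
```

Here APPLING matches on both sides and is suppressed, while the Screven/Scerven pair shows up as two unmatched lines.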
Hmm… maybe my input, CountiesTopo.json, has gotten screwed up? So let's try this, just to see what happens:
topojson \
-o output.json \
-p \
-- CountiesTopo.json
Well I plugged output.json back into my d3 code and it gave me a stripped-down map: just the outline of the state of Georgia: no counties!
So hmm … to explore topojson more, why not try converting my original .shp?
topojson \
-o output.json \
-p \
-- input.shp
Gives me:
Trace: { [Error: ENOENT, open 'input.dbf'] errno: 34, code: 'ENOENT', path: 'input.dbf' }
at output (/usr/local/lib/node_modules/topojson/bin/topojson:232:29)
at notify (/usr/local/lib/node_modules/topojson/node_modules/queue-async/queue.js:49:26)
at EventEmitter.<anonymous> (/usr/local/lib/node_modules/topojson/node_modules/queue-async/queue.js:39:11)
at EventEmitter.emit (events.js:95:17)
at EventEmitter.ended (/usr/local/lib/node_modules/topojson/node_modules/shapefile/index.js:32:38)
at EventEmitter.emit (events.js:95:17)
at ReadStream.error (/usr/local/lib/node_modules/topojson/node_modules/shapefile/file.js:68:13)
at ReadStream.EventEmitter.emit (events.js:95:17)
at fs.js:1500:12
at Object.oncomplete (fs.js:107:15)
Something wrong with my input, eh? FWIW, it's a U.S. Census TIGER shapefile.
What do you think? How can I marry my .csv into my .json?
Thanks!
Related
To my eyes the following JSON looks valid.
{
"DescribeDBLogFiles": [
{
"LogFileName": "error/postgresql.log.2022-09-14-00",
"LastWritten": 1663199972348,
"Size": 3032193
}
]
}
A) But jq, json_pp, and Python's json.tool module all deem it invalid:
# jq 1.6
> echo "$logfiles" | jq
parse error: Invalid numeric literal at line 1, column 2
# json_pp 4.02
> echo "$logfiles" | json_pp
malformed JSON string, neither array, object, number, string or atom,
at character offset 0 (before "\x{1b}[?1h\x{1b}=\r{...") at /usr/bin/json_pp line 51
> python3 -m json.tool <<< "$logfiles"
Expecting value: line 1 column 1 (char 0)
B) On the other hand, if the above JSON is copied and pasted into an online validator, both validators (1 and 2) deem it valid.
As hinted by json_pp's error above, hexdump <<< "$logfiles" indeed shows additional, surrounding characters. Here's the prefix: 5b1b 313f 1b68 0d3d 1b7b ...., where 7b is { (note that hexdump's default output groups bytes into little-endian 16-bit words, so the stream actually begins with the bytes 1b 5b, i.e. ESC [).
The JSON is output to a logfiles variable by this command:
logfiles=$(aws rds describe-db-log-files \
--db-instance-identifier somedb \
--filename-contains 2022-09-14)
# where `aws` is
alias aws='docker run --rm -it -v ~/.aws:/root/.aws amazon/aws-cli:2.7.31'
> bash --version
GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu)
I have perused this GitHub issue, yet I can't figure out the cause. I suspect that the double quotes get mangled somehow when using echo; some reported that printf "worked" for them.
The docker run --rm -it -v command used to produce the JSON added additional unprintable characters to the start of the output, which makes the resulting $logfiles invalid JSON.
The -t option allocates a pseudo-TTY and -i keeps STDIN open (interactive). With a TTY attached, programs assume they are talking to a terminal and may emit ANSI escape codes: to clear the screen, set the terminal up for interactive use, or make the output more visually appealing by colorizing portions of the data.
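The simplest fix is to drop -t from the docker alias, since with no TTY allocated there are no escape codes. If that isn't an option, the sequences can be stripped before parsing. A minimal sketch with sed (assuming GNU sed), simulating the escape-prefixed output seen in the hexdump:

```shell
# Simulated output: the ESC[?1h ESC= prefix from the question, then the JSON
raw=$'\x1b[?1h\x1b={"ok": true}'
# Strip CSI sequences (ESC [ ... letter) and the lone ESC= sequence
clean=$(printf '%s' "$raw" | sed $'s/\x1b\[[0-9;?]*[a-zA-Z]//g; s/\x1b=//g')
echo "$clean"   # → {"ok": true}
```

The cleaned string then parses fine with jq, json_pp, or python3 -m json.tool.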
Could you please suggest what I am doing wrong? I cannot change the delimiter of the output file using the es2csv CLI tool.
es2csv -q '*' -i test_index -o test.csv -f id name -d /t
Actually this issue has been reported here: https://github.com/taraslayshchuk/es2csv/issues/51
If you don't want to wait for the fix to be released, you can change line 212 of es2csv.py like this and it will work:
csv_writer = csv.DictWriter(output_file, fieldnames=self.csv_headers, delimiter=unicode(self.opts.delimiter))
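Separately, note that /t in the original command is a forward slash followed by a t, not a tab character. In bash, ANSI-C quoting produces a real tab; whether es2csv then accepts it still depends on the delimiter fix above, so the es2csv line is left as an illustrative comment:

```shell
# $'\t' expands to a literal tab (byte 0x09), unlike the '/t' in the question
delim=$'\t'
printf '%s' "$delim" | od -An -tx1    # shows 09
# es2csv -q '*' -i test_index -o test.csv -f id name -d "$delim"
```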
Is there any way to avoid the first eval in the code below? I tried to play with ${LASTPIPE[0]} and similar things, but it was way too complicated for me. For instance, when I tried to inject code with a semicolon ($MYSQLCOMMAND; if ...), it broke my output to the pipe. I spent more than 8 hours wandering Stack Overflow, only to give up and write one more SQL command without a pipe. I don't want to publish badly written code.
MYSQLQUERY="select * from $TABLENAME"
MYSQLTOPTIONS="--defaults-extra-file=$EXTRA -h $HOSTNAME -D $DBNAME -N -e"
MYSQLCOMMAND='mysql $MYSQLTOPTIONS "$MYSQLQUERY"'
# Exit immediately if something wrong with MySQL command
if eval "$MYSQLCOMMAND > /dev/null" ; then : ; else \
printf "Script returned non-zero exit code: %s\n" "$?" ; exit $?; fi
# Count rows for valid JSON output
ROWS=0
eval $MYSQLCOMMAND | \
while read ; \
do
((ROWS++))
done
(The rest of the code generates JSON by calling the same eval ... while read pattern, and the output is verified by https://jsonlint.com/.)
Also, I'd like to hear any comments you have on the code, since I'm not an experienced bash coder.
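One way to drop eval entirely (a sketch, not your exact setup): keep the options in a bash array so each word stays a separate argument, wrap the invocation in a function, and read its output from process substitution so the ROWS counter is not lost in a pipeline subshell. A printf stands in for the mysql call here so the sketch is runnable; the real invocation is shown in the comment:

```shell
#!/usr/bin/env bash
MYSQLQUERY="select * from mytable"
MYSQLOPTS=(-h localhost -D mydb -N -e)   # array: no eval needed to expand safely

run_query() {
  # stand-in for: mysql "${MYSQLOPTS[@]}" "$MYSQLQUERY"
  printf '%s\n' row1 row2 row3
}

# Capture the exit code immediately; note that in the original, printf runs
# before `exit $?`, so $? is printf's status (0) by the time exit sees it
run_query > /dev/null
rc=$?
if [ "$rc" -ne 0 ]; then
  printf 'Script returned non-zero exit code: %s\n' "$rc"
  exit "$rc"
fi

# Process substitution keeps the while loop in the current shell,
# so ROWS survives after the loop
ROWS=0
while IFS= read -r _; do
  ((ROWS+=1))
done < <(run_query)
echo "$ROWS"   # → 3
```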
After installing topojson with sudo npm install -g topojson, I am unable to convert either a shapefile (.shp) or a GeoJSON file to TopoJSON.
Alexanders-MacBook-Pro:topojson alexander$ geo2topo Parcel11_projected.geojson > Parcel11_topo.json
buffer.js:495
throw new Error('"toString()" failed');
^
Error: "toString()" failed
at Buffer.toString (buffer.js:495:11)
at Object.parse (native)
at ReadStream.<anonymous> (/usr/local/lib/node_modules/topojson/bin/geo2topo:107:46)
at emitNone (events.js:91:20)
at ReadStream.emit (events.js:185:7)
at endReadableNT (_stream_readable.js:974:12)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickCallback (internal/process/next_tick.js:98:9)
Alexanders-MacBook-Pro:topojson alexander$ topojson -q 1e4 -o out.json --Parcel11Cert.shp
-bash: topojson: command not found
It seems the code executes in the first case, but the GeoJSON is too large to convert. In the second case, the command is not recognized.
I am using the following command-line reference.
Any idea what could be going wrong here?
Already answered in Large geoJSON to TopoJSON. Summary: the file is too large to be processed by the JS engine; the error message is unfortunate.
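A hedged sketch of a possible workaround, assuming topojson 2.x is installed (which would also explain topojson: command not found, since 2.x split the old monolithic CLI into geo2topo, toposimplify, topoquantize, and friends, with quantization like -q 1e4 now handled by topoquantize). Converting via newline-delimited GeoJSON, one feature per line, may avoid building one giant JSON string; the filenames come from the question, and shp2json ships with the companion shapefile package:

```shell
# Stream the shapefile to newline-delimited GeoJSON, then let geo2topo
# read it with -n so features are processed line by line
shp2json -n Parcel11Cert.shp > Parcel11.ndjson
geo2topo -n parcels=Parcel11.ndjson > Parcel11_topo.json
```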
I have a 2 MB JSON file that is all in one line, and now I get an error using jq:
$ jq .<nodes.json
parse error: Invalid literal at line 1, column 377140
How do I debug this on the console? To look at the mentioned column, I tried this:
head -c 377139 nodes.json|tail -c 1000
But I cannot find any error, such as a wrong t, there, so it seems this is not the correct way to reach that position in the file.
How can I debug such a one-liner?
Cut the file into more lines with
cat nodes.json|cut -f 1- -d} --output-delimiter=$'}\n'>/tmp/a.json
and analyse /tmp/a.json with jq; then you get an error with a line number:
parse error: Invalid literal at line 5995, column 47
Use less -N /tmp/a.json to find that line.
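Alternatively, you can inspect the reported column directly without reshaping the file: cut -c slices a character window around it. A tiny broken stand-in file is used here so the sketch is self-contained; with the real file you would substitute nodes.json and the column jq reported:

```shell
# Show ~10 characters of context on each side of a given column
printf '{"a": 1, "b": tru}' > /tmp/sample.json   # "tru" is the invalid literal
col=15                                           # position of the bad literal
start=$(( col > 10 ? col - 10 : 1 ))
ctx=$(cut -c "$start"-$((col + 10)) /tmp/sample.json)
echo "$ctx"   # → : 1, "b": tru}
```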
I see you are on a shell prompt, so you could try perl, because your operating system presumably has it pre-installed.
cat nodes.json | json_xs -f json -t json-pretty
This tells the json_xs command line program to parse the file and prettify it.
If you don't have json_xs installed, you could try json_pp (pp is for pure-perl).
If you have neither, you can install the JSON::XS perl module with this command:
sudo cpanm JSON::XS
[sudo] password for knb:
--> Working on JSON::XS
Fetching http://www.cpan.org/authors/id/M/ML/MLEHMANN/JSON-XS-3.01.tar.gz ... OK
Configuring JSON-XS-3.01 ... OK
Building and testing JSON-XS-3.01 ... OK
Successfully installed JSON-XS-3.01 (upgraded from 2.34)
1 distribution installed
This installs JSON::XS and a few helper scripts, among them json_xs and json_pp.
Then you can run this simple one-liner:
cat dat.json | json_xs -f json -t json-pretty
After misplacing a parenthesis to force a nesting error somewhere in the valid JSON file dat.json, I got this:
cat dat.json | json_xs -f json -t json-pretty
'"' expected, at character offset 1331 (before "{"A_DESC":"density i...") at /usr/local/bin/json_xs line 181, <STDIN> line 1.
Maybe this is more informative than the jq output.