split a single file into multiple files by key name - json

I have a huge JSON file whose keys all start with a "/". I want to create multiple JSON files based on the key names.
/upgrade-coordinator/api/v1/upgrade/eula/acceptance
/upgrade-coordinator/api/v1/upgrade/history
/upgrade-coordinator/api/v1/upgrade/nodes
/upgrade-coordinator/api/v1/upgrade/nodes-summary
/upgrade-coordinator/api/v1/upgrade/status-summary
/upgrade-coordinator/api/v1/upgrade/summary
/upgrade-coordinator/api/v1/upgrade/upgrade-unit-groups
/upgrade-coordinator/api/v1/upgrade/upgrade-unit-groups-status
Following a note I found on some site, I came across:
for f in `cat input.json | jq -r 'keys[]'` ; do
cat input.json | jq ".$f" > $f.json
done
or, if you insist on the more Bash-like syntax some seem to prefer:
for f in $(jq -r 'keys[]') ; do
jq ".[\"$f\"]" < input.json > "$f.json"
done < input.json
When I tried the above, I get the error:
-bash: -/nsxapi/api/v1/vpn/l2vpn/sessions/summary.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/eula/acceptance.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/history.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/nodes.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/nodes-summary.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/status-summary.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/summary.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-unit-groups.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-unit-groups-status.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-unit-groups/aggregate-info.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-units.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-units-stats.json: No such file or directory
-bash: -/upgrade-coordinator/api/v1/upgrade/upgrade-units/aggregate-info.json: No such file or directory
I think it's probably because it's trying to create aggregate-info.json under the path -/upgrade-coordinator/api/v1/upgrade/upgrade-units/ -- if this is correct, how do I strip the leading "/" from every key? Can I get some help, please?

It means, to take the first error message as an example, that the directory -/nsxapi/api/v1/vpn/l2vpn/sessions does not exist. You could verify this by running
ls -d -- -/nsxapi/api/v1/vpn/l2vpn/sessions
in the working directory where your script runs.
Note that redirecting output to a path does not automatically create the intermediate directories.
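To address the follow-up question directly, here is a minimal, self-contained sketch. It assumes the input is one flat JSON object; the two sample keys are copied from the question so the sketch can run anywhere. It strips the leading "/" from each key and creates the intermediate directories before redirecting:

```shell
# Sample input.json mimicking the keys above (the real file is assumed
# to be one flat JSON object whose keys start with "/").
cat > input.json <<'EOF'
{
  "/upgrade-coordinator/api/v1/upgrade/history": {"status": "ok"},
  "/upgrade-coordinator/api/v1/upgrade/summary": {"status": "ok"}
}
EOF

# Read keys one per line so spaces in keys do not break the loop.
jq -r 'keys[]' input.json | while IFS= read -r key; do
  out="${key#/}.json"            # "/a/b/c" -> "a/b/c.json"
  mkdir -p "$(dirname "$out")"   # create a/b/ -- redirection will not
  jq --arg k "$key" '.[$k]' input.json > "$out"
done
```

Passing the key with --arg and looking it up as .[$k] also avoids the quoting problems that ".$f" has with keys containing "/".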

Related

(OpenGrok) How can I use the '--symlink' option in OpenGrok?

I'm not sure how to use the --symlink option in OpenGrok, so I'm asking.
OpenGrok's source root folder is '/opengrok/src'.
In this folder, I created a symbolic link with the following command.
ln -s /home/A/workspace/tmp tmp
And I did indexing with the following command.
java -Djava.util.logging.config.file=/opengrok/etc/logging.properties -jar /opengrok/dist/lib/opengrok.jar -c /usr/local/bin/ctags -s /opengrok/src -d /opengrok/data -P -S -W /opengrok/etc/configuration.xml --symlink /opengrok/src/tmp -U http://localhost:8080/source
When I connect to localhost/source, the tmp file is displayed, but when I click it, the files in tmp are not displayed and the following error message is displayed.
Error: File not found!
The requested resource is not available.
Resource lacks history info. Was remote SCM side up when indexing occurred? Cleanup history cache dir(or just the .gz for the file or db record) and rerun indexer making sure remote side will respond during indexing.
How can I access and view the files in tmp using OpenGrok?

macOS/Windows - How to extract a specific .json file from multiple zips and rename it after the zip it was extracted from

I am dealing with Cuckoo Sandbox exported data that has a report.json file inside each zip file.
e.g. 123456.zip - each zip has the file at zipfile/reports/report.json
I have multiple zip files in a folder, and I want each extracted report to be named zipfilename.json. I have tried many ways but failed; here's the code I am trying:
#! /bin/bash
mkdir -p "DESTDIR"
for i in *.zip ; do
unzip "$i" $i/reports/report.json -d "DESTDIR"
mv "DESTDIR/reports/report.json" "DESTDIR/$(basename "$i" .zip)_THEFILE"
done
All I get is this output showing that the file does not exist:
(base) s#Sais-MBP Downloads % sh script.sh
Archive: 1959098.zip
caution: filename not matched: 1959098.zip/reports/report.json
mv: rename DESTDIR/THEFILE to DESTDIR/1959098_THEFILE: No such file or directory
Archive: 1959100.zip
caution: filename not matched: 1959100.zip/reports/report.json
mv: rename DESTDIR/THEFILE to DESTDIR/1959100_THEFILE: No such file or directory
Any help is greatly appreciated as I cannot make any progress for the past few days.
Okay, I took help from a friend and he gave me the answer, as I had done the whole script wrong:
#! /bin/bash
#
# save this file as test.sh
#
mkdir -p "DESTDIR"
for ZIPFILE in *.zip ; do
NAME="${ZIPFILE%.*}"
mkdir -p "DESTDIR/$NAME"
unzip -j "$ZIPFILE" reports/report.json -d "DESTDIR/$NAME/"
mv "DESTDIR/$NAME/report.json" "DESTDIR/$NAME.json"
done

Broke my terminal while installing MongoDB

I was working on installing MongoDB, following some steps I found online. Somehow I broke my terminal, because every time I open it I get:
-bash: uname: command not found
-bash: ps: command not found
-bash: dirname: command not found
-bash: dirname: command not found
-bash: dirname: command not found
-bash: dirname: command not found
-bash: brew: command not found
I also tried to go back into Vim to fix a path, and I get:
bash: vim: command not found
Any way to reset the terminal to fix this?
Run this to get a sane PATH again:
export PATH=$(/usr/bin/getconf PATH)
If they exist, you may want to source these files:
cd
. ./.profile
. ./.bashrc
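As a quick sketch of what the export above relies on: getconf PATH prints the system's POSIX default search path, so assigning it to PATH restores access to the standard tools even when PATH has been clobbered. Calling getconf by its absolute path means it works even while PATH is broken:

```shell
# Simulate a clobbered PATH, then restore it from the system default.
PATH="/nonexistent"
command -v uname || echo "uname not found, as expected"

PATH=$(/usr/bin/getconf PATH)    # e.g. /bin:/usr/bin on most systems
export PATH
command -v uname                 # resolves again
```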

How do I access the data in a bucket using gsutil

I am getting this error:
C:\Users\goura\AppData\Local\Google\Cloud SDK>gsutil cp -r gs://299792458bucket/X
CommandException: Wrong number of arguments for "cp" command.
You probably need to give it a destination to copy to. Try:
gsutil cp -r gs://299792458bucket/X .
(be sure you're in a directory that doesn't have a lot of other files in it)

Trying to create a $now folder and copy .sql files into the created folders

I have a shell script that I am trying to run every few days that copies .sql database files and moves them into a designated folder appended with /$now/. The script otherwise executes fine, but I am getting cp: cannot create regular file '/path/to/dir/$now/': No such file or directory.
I know these folders exist because it is showing up when I 'ls -ltr' the directory.
All of my permissions are executable and writable. This has been puzzling me for about a week now and I just can't put my finger on it.
Here is my code:
#!/bin/bash
BACKUP_DIR="/backups/mysql/"
FILE_DIR=/dbase/files/
now=$(date +"%Y_%m_%d")
# setting the input field separator to newline
IFS=$'\n'
# find db backups and loop over
for file in $(find ${FILE_DIR} -maxdepth 1 -name "*.sql" -type f -exec basename {} \;); do
# create backup directory:
mkdir -p "${BACKUP_DIR}${file%.sql}/${now}"
# copy file over
cp "${FILE_DIR}${file}" "${BACUP_DIR}${file%.sql}/${now}/"
done
Thanks in advance!
Update:
Error I am getting:
+ mkdir -pv /backups/mysql/health/2014_12_04
+ cp /dbase/files/health.sql health/2014_12_04/
cp: cannot create regular file 'health/2014_12_04/': No such file or directory
This is happening for all 9 directories already created
The error was a typo. I was missing the 'K' in $BACKUP_DIR on the cp line. Here is the correct code:
#!/bin/bash
BACKUP_DIR="/backups/mysql/"
FILE_DIR=/dbase/files/
now=$(date +"%Y_%m_%d")
# setting the input field separator to newline
IFS=$'\n'
# find db backups and loop over
for file in $(find ${FILE_DIR} -maxdepth 1 -name "*.sql" -type f -exec basename {} \;); do
# create backup directory:
mkdir -p "${BACKUP_DIR}${file%.sql}/${now}"
# copy file over
cp "${FILE_DIR}${file}" "${BACKUP_DIR}${file%.sql}/${now}/"
done
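For comparison, a sketch of the same loop that globs instead of parsing find output, so the IFS tweak and the basename subprocess per file are unnecessary. The relative paths and the sample .sql file are hypothetical stand-ins so the sketch is self-contained:

```shell
# Hypothetical relative stand-ins for /backups/mysql/ and /dbase/files/
BACKUP_DIR="backups/mysql/"
FILE_DIR="dbase/files/"
now=$(date +"%Y_%m_%d")

# Sample input so the sketch can actually run
mkdir -p "$FILE_DIR"
printf -- '-- demo dump --\n' > "${FILE_DIR}health.sql"

for path in "${FILE_DIR}"*.sql; do
  [ -e "$path" ] || continue              # no matches: skip the literal glob
  file=${path##*/}                        # basename without a subprocess
  mkdir -p "${BACKUP_DIR}${file%.sql}/${now}"
  cp "$path" "${BACKUP_DIR}${file%.sql}/${now}/"
done
```

A glob never splits on whitespace, so filenames with spaces are safe without touching IFS, and a typo like $BACUP_DIR would still only bite once, in the one place the variable is expanded.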