Easiest way to convert pcap to JSON

I have a bunch of pcap files, created with tcpdump. I would like to store these in a database for easier querying, indexing, etc. I thought MongoDB might be a good choice, because storing a packet the way Wireshark/TShark presents it, as a JSON document, seems natural.
It should be possible to create PDML files with tshark, parse those, and insert them into MongoDB, but I am curious whether someone knows of an existing or other solution.

On the command line (Linux, Windows or MacOS), you can use tshark.
e.g.
tshark -r input.pcap -T json >output.json
or with a filter:
tshark -2 -R "your filter" -r input.pcap -T json >output.json
Considering you mentioned a set of pcap files, you can also pre-merge them into a single pcap with mergecap and then export that in one go, if preferred:
mergecap -w output.pcap input1.pcap input2.pcap ...
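To sketch the rest of the pipeline from the question: tshark -T json emits a JSON array in which each packet's dissection sits under "_source" -> "layers", and those layer objects can be stored in MongoDB directly. A minimal Python sketch follows; the packet sample and its field names are made up for illustration (real keys depend on the protocols tshark dissects), and the pymongo call is only indicated in a comment.

```python
import json

# Hypothetical excerpt of `tshark -r input.pcap -T json` output.
tshark_json = """
[
  {
    "_index": "packets-2024-01-01",
    "_source": {
      "layers": {
        "frame": {"frame.number": "1", "frame.len": "74"},
        "ip": {"ip.src": "10.0.0.1", "ip.dst": "10.0.0.2"}
      }
    }
  }
]
"""

def to_documents(raw):
    """Strip tshark's per-packet wrapper, keeping the dissected layers."""
    return [pkt["_source"]["layers"] for pkt in json.loads(raw)]

docs = to_documents(tshark_json)
# Each doc can now be inserted into MongoDB, e.g. with pymongo:
#   MongoClient().pcaps.packets.insert_many(docs)
print(docs[0]["ip"]["ip.src"])
```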

Wireshark has a feature to export its capture files to JSON.
File->Export Packet Dissections->As JSON

You could use pcaphar. More info about HAR here.

Related

Export RTP statistics from wireshark/tshark into XML or CSV

Basically I would like to export Wireshark's RTP stream analysis into CSV or XML format so I can read it back in for some tests. I can do the following using tshark on the command line:
tshark -r rtp.pcap -q -z rtp,streams
Is there a way to specify an output file and its format? If there's a way to do this through Wireshark directly, that's welcome too.
Note: what I need to store is the overall statistics of all the streams, not the detailed per-stream ones.
You can save the output to a text file using the redirect operator, i.e. > output.txt. This is very basic and difficult to parse, but unfortunately there does not seem to be any way to control the format of this output. The -T -E -e combination outputs details from each packet, and the -w option outputs a raw capture file.
Wireshark
Go to Telephony -> RTP -> Show All Streams
You can copy the values to the clipboard in CSV.
See also Wireshark Wiki
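Since the redirected text is the only machine-readable artifact, one option is to post-process it into CSV. The sketch below splits the table rows on runs of whitespace; note that the sample report and its column names are invented, and the real layout of tshark's rtp,streams summary varies between versions, so the parsing rule is an assumption to adapt.

```python
import csv
import io
import re

# Hypothetical `tshark -q -z rtp,streams` output; treat the exact
# columns and spacing as an assumption about your tshark version.
report = """\
========================= RTP Streams ========================
Src IP addr  Port  Dest IP addr  Port  SSRC        Payload  Pkts  Lost
10.0.0.1     5004  10.0.0.2      5004  0xDEADBEEF  g711U     100  0
10.0.0.2     5004  10.0.0.1      5004  0xCAFEF00D  g711U      98  2
==============================================================
"""

def table_to_csv(text):
    """Convert a whitespace-aligned summary table to CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("="):
            continue  # skip the banner/separator lines
        # columns are separated by two or more spaces
        writer.writerow(re.split(r"\s{2,}", line))
    return out.getvalue()

print(table_to_csv(report))
```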

Yarn parsing job logs stored in hdfs

Is there any parser I can use to parse the JSON present in YARN job logs (.jhist files) stored in HDFS, to extract information from them?
The second line in the .jhist file is the Avro schema for the other JSON records in the file, meaning that you can create Avro data out of the .jhist file.
For this you could use avro-tools-1.7.7.jar
# schema is the second line
sed -n '2p;3q' file.jhist > schema.avsc
# removing the first two lines
sed '1,2d' file.jhist > pfile.jhist
# finally converting to avro data
java -jar avro-tools-1.7.7.jar fromjson pfile.jhist --schema-file schema.avsc > file.avro
You now have Avro data, which you can, for example, import into a Hive table and query.
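The same split the sed commands perform can be sketched in a few lines of Python, which is handy if the files are being read out of HDFS programmatically. The sample file content below is made up; only the line positions (schema on line 2, records from line 3) come from the answer above.

```python
# Mirrors the sed commands: line 2 of a .jhist file is the Avro schema,
# and everything after line 2 is newline-delimited JSON event records.
# This sample content is invented for illustration.
jhist = """Avro-Json
{"type": "record", "name": "Event", "fields": [{"name": "type", "type": "string"}]}
{"type": "JOB_SUBMITTED"}
{"type": "JOB_FINISHED"}
"""

lines = jhist.splitlines()
schema = lines[1]       # sed -n '2p;3q' file.jhist > schema.avsc
records = lines[2:]     # sed '1,2d' file.jhist > pfile.jhist

# schema would be written to schema.avsc and records to pfile.jhist, then:
#   java -jar avro-tools-1.7.7.jar fromjson pfile.jhist --schema-file schema.avsc
print(schema)
print(len(records))
```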
You can check out Rumen, a parsing tool from the Apache ecosystem.
Alternatively, in the web UI, go to the job history and look for the job whose .jhist file you want to read. Click the Counters link on the left; you will then see an API that gives you all the parameters and values (CPU time in milliseconds, etc.), read from the .jhist file itself.

How to export JMeter results to JSON?

We run load tests with JMeter and would like to export result data (throughput, latency, requests per second etc.) to JSON, either a file or STDOUT. How can we do that?
JMeter can save the results in a CSV format with header.
(Do not forget to select Save Field Names - it is OFF by default)
Then you can use this tool to convert the CSV to JSON:
http://www.convertcsv.com/csv-to-json.htm
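If you would rather not depend on a web converter, the CSV-to-JSON step is a few lines with Python's standard library. The sample below uses a subset of JMeter's result columns (written when Save Field Names is on); treat the exact column list as an assumption about your JMeter configuration.

```python
import csv
import io
import json

# Sample JMeter CSV results with the "Save Field Names" header row;
# the columns shown here are a subset of what JMeter actually writes.
jtl = """\
timeStamp,elapsed,label,responseCode,success
1408000000000,120,Home Page,200,true
1408000000150,95,Login,200,true
"""

# DictReader uses the header row as keys, so each sample becomes a dict
rows = list(csv.DictReader(io.StringIO(jtl)))
print(json.dumps(rows, indent=2))
```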
EDIT
JMeter stores the result in XML or CSV format. XML is the default (with a .jtl extension), but it is generally recommended to save results in CSV format.
If you want to convert XML to JSON
http://www.utilities-online.info/xmltojson/#.U9O2ifldVBk
If you are planning to use CSV, you can save the result in CSV format automatically. When running your test via the command line, to save the result as CSV for a specific test:
"%JMETER_HOME%\bin\jmeter.bat" -n -t %TESTNAME% -p %PROPERTY_FILE_PATH% -l %RESULT_FILE_PATH% -j %LOG_FILE_PATH% -Djmeter.save.saveservice.output_format=csv
Or
You can update jmeter.properties in the bin folder to enable the below property (for any test you run):
jmeter.save.saveservice.output_format=csv
Hope it is clear!
There is no OOTB solution for this, but you could take inspiration from this patch:
https://issues.apache.org/bugzilla/show_bug.cgi?id=53668

Upload Data in MapQuest DMv2 through CSV using Data Manager API call

I need to upload data to MapQuest DMv2 through a CSV file. After going through the documentation, I found the following syntax for uploading data:
http://www.mapquestapi.com/datamanager/v2/upload-data?key=[APPLICATION_KEY]&inFormat=json&json={"clientId": "[CLIENT_ID]","password": "[REGISTRY_PASSWORD]","tableName": "mqap.[CLIENT_ID]_[TABLENAME]","append":true,"rows":[[{"name":"[NAME]","value":"[VALUE]"},...],...]}
This is fair enough if I want to put individual rows in rows[], but there is no mention of the procedure to follow to upload data through a CSV file. It is clearly mentioned that "CSV, KML, and zipped Shapefile uploads are supported". How can I achieve this through the Data Manager API service?
Use a multipart post to upload the csv instead of the rows. You can see it working here.
I used the cURL program to accomplish that. Here is an example of a curl.exe command line. You can call it from a batch file or, in my case, from a C# program.
curl.exe -F clientId=XXXXX -F password=XXXXX -F tableName=mqap.XXXXX_xxxxx -F append=false --referer http://www.mapquest.com -F "file=@C:\\file.csv" "http://www.mapquestapi.com/datamanager/v2/upload-data?key=KEY&ambiguities=ignore"
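For reference, here is a sketch of the multipart/form-data body that such a curl command posts, built with only the Python standard library. The field names mirror the -F flags above; the CSV content and the URL in the comment are placeholders, and the body is built but not sent.

```python
import uuid

def build_multipart(fields, file_name, file_bytes):
    """Assemble a multipart/form-data body; returns (boundary, body)."""
    boundary = uuid.uuid4().hex
    parts = []
    for name, value in fields.items():
        parts.append(
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
            f"{value}\r\n"
        )
    # the file part carries the CSV payload, like curl's -F "file=@..."
    parts.append(
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="file"; filename="{file_name}"\r\n'
        f"Content-Type: text/csv\r\n\r\n"
    )
    body = ("".join(parts).encode()
            + file_bytes
            + f"\r\n--{boundary}--\r\n".encode())
    return boundary, body

boundary, body = build_multipart(
    {"clientId": "XXXXX", "password": "XXXXX",
     "tableName": "mqap.XXXXX_xxxxx", "append": "false"},
    "file.csv",
    b"name,lat,lng\nHQ,39.74,-104.99\n",  # placeholder CSV rows
)
# To send, pass `body` to urllib.request.Request with header:
#   Content-Type: multipart/form-data; boundary=<boundary>
print(len(body))
```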

Convert JSON data to BSON on the command line

I'm on an Ubuntu system, and I'm trying to write a testing framework that has to (among other things) compare the output of a mongodump command. This command generates a bunch of BSON files, which I can compare. However, for human readability, I'd like to convert these to nicely formatted JSON instead, which I can do using the provided bsondump command. The issue is that this appears to be a one-way conversion.
While I can work around this if I absolutely need to, it would be a lot easier if there were a way to convert back from JSON to BSON on the command line. Does anyone know of a command-line tool to do this? Google seems to have come up dry.
I haven't used them, but bsontools can convert from JSON, XML, or CSV.
As @WiredPrarie points out, the conversion from BSON to JSON is lossy, and it makes no sense to want to go back the other way. Workarounds include using mongoimport instead of mongorestore, or just using the original BSON. See the comments for more details. (Adding this answer mainly so I can close the question.)
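To make the format itself less opaque: BSON is a simple length-prefixed binary framing of JSON-like documents, and the sketch below hand-encodes a one-field string document exactly as the BSON specification lays it out (int32 total length, element type 0x02, null-terminated key, int32 value length, null-terminated value, trailing 0x00). This is only an illustration of the wire format; real conversions should go through a driver or tool.

```python
import struct

def bson_string_doc(key, value):
    """Encode {key: value}, a single UTF-8 string field, per the BSON spec."""
    k = key.encode() + b"\x00"            # cstring key
    v = value.encode() + b"\x00"          # string value, null-terminated
    element = b"\x02" + k + struct.pack("<i", len(v)) + v  # 0x02 = string
    payload = element + b"\x00"           # document terminator
    return struct.pack("<i", len(payload) + 4) + payload   # total length

doc = bson_string_doc("hello", "world")
print(doc.hex())
```

The 22-byte result matches the {"hello": "world"} example given in the BSON specification.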
You can try beesn; it converts data both ways. For your variant, JSON -> BSON, use the -x switch.
Example:
$ beesn -x -i test-data/01.json -o my.bson
Disclaimer: I am an author of this tool.