Confused about Kibana import - json

I would like to know how I can import data using Kibana. Actually, it's confusing for me. I have tried to load a JSON file using Kibana, but it is not importing it.
Second, if I want to work with a WARC file, do I need to convert it into a JSON file and then import it, or is there some other solution I should work on?
Hope to hear a reply.

You can import your JSON file directly into Elasticsearch and later on create an index pattern from the Kibana application. You can run the following command:
curl -XPUT localhost:9200/_bulk --data-binary @file.json
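Note that the _bulk endpoint expects newline-delimited JSON, with an action line before each document, so file.json would need to look roughly like this (the index name myindex, the type, and the fields are placeholders, not from the original answer):
{"index":{"_index":"myindex","_type":"doc"}}
{"field1":"value1"}
{"index":{"_index":"myindex","_type":"doc"}}
{"field1":"value2"}
If your file is a single plain JSON document instead, index it directly as shown in the last answer below.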
About the second question, I think you need to convert the WARC file to JSON first.

Related

Export commands in MongoDB

I want to export all the queries in the Mongo shell. I want to get a file like with SQL (a file with tables, inserts, queries...) but with MongoDB.
I'm looking for a command to export that data. I saw a command to export a collection to a .json file, but that's not what I have in mind.
Does anyone know anything?
You have two options: one, export as JSON; two, export as BSON.
For JSON you should use the mongoexport and mongoimport binaries. For BSON, Mongo provides different binaries: mongodump and mongorestore.
I'm sharing the MongoDB documentation; I hope it can help you.
JSON Export
https://docs.mongodb.com/v4.2/reference/program/mongoexport/#connect-to-a-mongodb-instance
BSON Export
https://docs.mongodb.com/v4.2/reference/program/mongodump/#connect-to-a-mongodb-instance
Notice that BSON is not human-readable content; if you want to check what the exported file contains, you should try exporting as JSON.
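As a minimal sketch (the database name, collection name, and paths are placeholders):
mongoexport --db=mydb --collection=mycoll --out=mycoll.json
mongodump --db=mydb --out=/backup/dump
The first command writes one collection as JSON; the second dumps a whole database as BSON, which you would restore with mongorestore.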

How to import data from json file to mongodb atlas collection

I wanted to import data into my collection in MongoDB Atlas, and I was following the documentation: https://docs.mongodb.com/compass/beta/import-export/ but there is no "ADD DATA" option, and I don't know if I'm using some other version or doing something else wrong.
I need to import a whole file which is a JSON array.
The docs you referenced are for a future version of Compass. If you want to import from EJSON at the command line you can use mongoimport.
Here's the simplest syntax, but there are many variations possible.
mongoimport --db=users --collection=contacts --file=contacts.json
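Since your file is a whole JSON array rather than one document per line, you will likely also need the --jsonArray flag:
mongoimport --db=users --collection=contacts --jsonArray --file=contacts.json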

Formatting S3 BucketSize Logs for Logstash

I am trying to find a solution to show the S3 BucketSize information in Kibana using the ELK stack. I need the S3 BucketSize information stored in an S3 bucket and then to read the files through Logstash. I hope this would allow viewing historical data. Since I am new to ELK, I am pretty unsure how to get this to work.
I have tried the things below, but they did not go well...
Using the ncdu-s3 tool I have the S3 BucketSize details in JSON format. I am trying to put that into Logstash and it is throwing the error below.
The JSON file format is
[1,0,{"timestamp":1469370986,"progver":"0.1","progname":"ncdu-s3"},
[{"name":"s3:\/\/BucketName\/FolderName1"},
[{"name":"FolderName1"},
{"dsize":107738,"name":"File1.rar"},
{"dsize":532480,"name":"File2.rar"},
[{"name":"FolderName2"},
{"dsize":108890,"name":"File3.rar"}]]]
I use the command below:
curl -XPOST 'http://localhost:9200/test/test/1' -d @/home/ubuntu/filename.json
ERROR:
{"error":"MapperParsingException[Malformed content, must start with an object]","status":400}
I think I need to reformat the JSON file in order for this to work... Can anyone suggest a good way to go?
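The error happens because Elasticsearch expects each indexed document to be a single JSON object, while the ncdu-s3 export is one big nested array. One possible approach, purely as a sketch (jq and the index/type names here are assumptions, not from the original thread): flatten the tree with jq so that each file entry becomes its own document, then index the entries one by one.
jq -c '.[3] | recurse(.[]?) | objects | select(.dsize != null)' /home/ubuntu/filename.json |
while read -r doc; do
  curl -XPOST 'http://localhost:9200/test/test' -d "$doc"
done
Here .[3] skips the ncdu header elements, recurse(.[]?) walks the directory tree, and the select keeps only entries that have a dsize, i.e. files.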

how to import json data in neo4j

I have JSON data and I want to import it into Neo4j.
There is an export data option in Neo4j, but how do I import JSON data into Neo4j?
This is the link to the jsfiddle: http://jsfiddle.net/harmeetsingh090/mkdm4t44/
Please help if someone knows.
You can use jq to manipulate your data into CSV format and then use the LOAD CSV command.
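As a sketch (the id and name fields are hypothetical, since the actual shape of the data is only in the jsfiddle):
jq -r '["id","name"], (.[] | [.id, .name]) | @csv' data.json > data.csv
The resulting data.csv can then be consumed from Cypher with LOAD CSV WITH HEADERS FROM 'file:///data.csv' AS row.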
Neo4j doesn't have a native way of doing this, but there is a plugin for Neo4j called APOC that provides apoc.load.json. You can load data by doing the following:
CALL apoc.load.json("file:///<path_to_file>/example.json") YIELD value as document
UNWIND document.root AS root
MERGE (e:ExampleNode {id: root.id})
...
You can find more information on the plugin here: https://neo4j-contrib.github.io/neo4j-apoc-procedures/. I've recently used this and found it to be quite intuitive.

how to load .json file in elasticsearch

I have a .json file and I want to load it into Elasticsearch for filtering.
A simple way to get started is to use curl. If you have a JSON file with a single document, you could index it like this:
$ curl -XPOST "http://localhost:9200/example/default" -d @file.json
example is the index name and default is the doc type.
For more information, take a look at the docs:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-index_.html