Importing a JSON file to Kibana via the UI

I've just moved from Kibana 5.2.1 to 5.2.2 (a bug of ours was fixed).
Before migrating, I exported all queries/searches to a JSON file in order to upload them to the new Kibana version.
First, I updated the ES version and, to make sure everything worked, reopened Kibana 5.2.1 and imported the JSON file. All good :)
Afterwards, I updated to Kibana 5.2.2.
When I opened it, all the searches, visualizations, and dashboards were there. Is this the proper and straightforward way to carry my data over when updating versions?
Or should I instead do it like in this question?
Thanks

OK, I got it, and it's quite simple :)
When you create queries/visualizations in Kibana, it saves them to the .kibana index in ES (the default index name, set in Kibana's config file). So when the updated Kibana version reads from the same ES instance, the data appears in the UI.
If you wish to save them under a different index, change it in the config file, as sketched below.
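For reference, this is the relevant setting in kibana.yml (a sketch showing the 5.x default; you only need to change it if you want saved objects stored under a different index):
# kibana.yml - the index where Kibana stores saved searches, visualizations and dashboards
kibana.index: ".kibana"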
For more reading see here


Filling Firestore document with data programmatically

I already have some lists of data stored on my computer, and I want to upload them to my Firestore database programmatically, without having to enter them one by one.
I have already seen some articles, but none of them really worked for me.
Note that I want to import initial data that is not going to change over time, and the answer below solves that perfectly.
I have about 100K documents to import, so a programmatic upload was crucial.
For a Node.js web-app project you can follow these steps:
1) Visit https://console.firebase.google.com/project/YOUR_PROJECT_ID/settings/serviceaccounts/adminsdk and download the .json file which authorizes your Firebase Admin SDK.
2) Upgrade your Firestore to the Blaze plan (pay as you go), as the free version doesn't support importing data. You can manage/limit your spending here.
3) Paste the initialization code shown on the page from step 1 into your .js file and use admin to import data like this:
// admin was initialized with the service-account snippet from step 1
admin
  .firestore()
  .collection("NAME_OF_YOUR_ORGANIZATION")
  .add({ key: 'value', key2: 'value2' }) // creates a document with an auto-generated ID
If you want to edit an existing document, you can do it like this:
admin
  .firestore()
  .collection("NAME_OF_YOUR_ORGANIZATION")
  .doc("UID_OF_EXISTING_DOCUMENT")
  .set({ key: 'value', key2: 'value2' }) // overwrites the document; pass { merge: true } as a second argument to update only these fields
I suggest you check out the Admin SDK documentation.
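Since the question mentions about 100K documents, here is a minimal bulk-import sketch building on the same setup (data.json, importAll, and the flat-array shape are assumptions; the 500-writes-per-batch cap is Firestore's documented limit):
// Minimal bulk-import sketch; assumes admin is initialized as above and
// that ./data.json (hypothetical file) holds an array of plain objects
const records = require('./data.json');
const db = admin.firestore();

async function importAll() {
  // Firestore caps a WriteBatch at 500 operations, so chunk the array
  for (let i = 0; i < records.length; i += 500) {
    const batch = db.batch();
    records.slice(i, i + 500).forEach((record) => {
      // .doc() with no argument generates an auto-ID document reference
      batch.set(db.collection('NAME_OF_YOUR_ORGANIZATION').doc(), record);
    });
    await batch.commit(); // one network round trip per 500 documents
  }
}

importAll().catch(console.error);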

Open local JSON file for examination

I was wondering if it's possible to open a local JSON file so I can just check its structure? I didn't want to upload the file to an online JSON format-checker site and was hoping I could just use Paw to do that.
I don't seem to be able to do this with a local file unless I run it through a local server (e.g. using MAMP), unless I missed something...?
Thanks.
You could copy the content into the text body and then switch to the JSON body; this will let you view it with the nice structure. Sorry, there is currently no way to directly import a file; you need to copy/paste the content.
Take a look at the jsonlint npm module. It supports JSON schema validation and pretty-printing.
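If all you need is a structure check on a local file, plain Node can also do it (a minimal sketch; the file path is a placeholder):
// Parse and pretty-print a local JSON file; JSON.parse throws an error
// pointing at the offending position if the file is malformed
const fs = require('fs');
const data = JSON.parse(fs.readFileSync('path/to/file.json', 'utf8'));
console.log(JSON.stringify(data, null, 2));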

How can I have Solr auto-update an index from a JSON file

I want to have Solr watch a JSON file and automatically update an index from it. Is this doable, and if so, what is the best way?
No, Solr doesn't have any mechanism for watching a file for changes. You can, however, work around this (depending on your OS) with a small program that watches the file or directory for changes and then submits the JSON document to Solr.
See How to execute a command whenever a file changes on Superuser.
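As a rough illustration of that workaround with Node (assumptions: Node 18+ for the built-in fetch, Solr at localhost:8983, a collection named mycollection, and a data.json in the working directory):
const fs = require('fs');

// Re-submit data.json to Solr whenever it changes; note that fs.watch can
// fire more than once for a single save on some platforms
fs.watch('data.json', async () => {
  const body = fs.readFileSync('data.json', 'utf8');
  // /update/json/docs accepts plain JSON documents; commit=true makes the
  // update visible immediately (fine for occasional changes)
  const res = await fetch(
    'http://localhost:8983/solr/mycollection/update/json/docs?commit=true',
    { method: 'POST', headers: { 'Content-Type': 'application/json' }, body }
  );
  console.log('Solr responded:', res.status);
});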

Data import from MySQL to Solr?

Sorry, I am new to Solr, but I am stuck on this.
First, what I have done: I used a Tomcat server to install my Solr.
What I want to do: I want to import MySQL data into Solr.
I have been searching for hours but could not find a proper solution. I have seen many questions; the nearest to mine was this Question, but it was no help because it has a different error.
This is my command:
http://localhost:8080/solr/db/dataimport?command=full-import
This is my error:
Error: HTTP Status 404 - /solr/db/dataimport
type: Status report
message: /solr/db/dataimport
description: The requested resource is not available.
Sorry, I am new to Solr; any advice will be very helpful.
That URL is from the example that ships with Solr. If you just found it on the internet, see the README.txt file in the example/example-DIH directory of the distribution.
If you are trying to configure your own collection, you need to replace the db part of the URL with your collection name, configure the DataImportHandler (libraries and endpoint) in solrconfig.xml, and configure the import definition to use your own database.
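For orientation, the solrconfig.xml wiring usually looks something like this (a sketch; the lib path and the db-data-config.xml file name are assumptions that must match your installation):
<!-- load the DataImportHandler jars shipped in dist/ -->
<lib dir="${solr.install.dir}/dist/" regex="solr-dataimporthandler-.*\.jar" />

<!-- expose the /dataimport endpoint used by the full-import command -->
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <!-- the file that defines your MySQL datasource, queries and field mappings -->
    <str name="config">db-data-config.xml</str>
  </lst>
</requestHandler>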

Migrating from Lighthouse to Jira - Problems Importing Data

I am trying to find the best way to import all of our Lighthouse data (which I exported as JSON) into JIRA, which wants a CSV file.
I have a main folder containing many subdirectories, JSON files, and attachments. The total size is around 50MB. JIRA allows importing CSV data, so I was thinking of converting the JSON data to CSV, but all the converters I have seen online will only handle a single file rather than recursively parsing an entire folder structure and producing the CSV equivalent, which could then be imported into JIRA.
Does anybody have any experience of doing this, or any recommendations?
Thanks, Jon
The JIRA CSV importer assumes a denormalized view of each issue, with all the fields available in one line per issue. I think the quickest way would be to write a small Python script to read the JSON and emit the minimum CSV. That should get you issues and comments. Keep track of which Lighthouse ID corresponds to each new issue key. Then write another script to add things like attachments using the JIRA SOAP API. For JIRA 5.0 the REST API is a better choice.
We just went through a Lighthouse to JIRA migration and ran into this. The best thing to do in your script is to start at the top-level export directory and loop through each ticket.json file. You can then build a master CSV or JSON file containing all the tickets to import into JIRA.
In Ruby (which is what we used), it would look something like this:
require "json"

Dir.glob("path/to/lighthouse_export/tickets/*/ticket.json") do |ticket|
  data = JSON.parse(File.read(ticket))
  # access ticket data and add it to a CSV
end