Grafana: import a simple JSON file as a data source

I have a bunch of data in a JSON file.
I would like to use that as (static) data source in Grafana, but I don't know how to do that.
I have installed Grafana (in a Docker container) and have added the Simple JSON plugin. But to my understanding, that takes a URL as input... not a JSON file :( How can I do that?
I've had a look at the Fake JSON data source example. I see it implements a web server that answers some typical requests like /search or /query, but I don't understand how to adapt that. I am pretty new to Grafana, as you can see...
This is what my json looks like:
{"eventid": "cowrie.direct-tcpip.request", "timestamp": "2019-01-15T10:03:24.604331Z", "session": "f3f60d4e", "src_port": 0, "message": "direct-tcp connection request to xxxx:443 from ::1:0", "system": "SSHService ssh-connection on HoneyPotSSHTransport,874,xxxxxx", "isError": 0, "src_ip": "xxxxxxxx", "dst_port": 443, "dst_ip": "xxxx", "sensor": "90a9ea4c9756"}
Thanks for your help.

Persistent error trying to change a Google Cloud Build trigger with the REST API

I'm trying to change a trigger using the REST API, specifically https://cloud.google.com/build/docs/api/reference/rest/v1/projects.triggers/patch. Note that I'm able to use curl and list all the triggers. I also tried to download the trigger as JSON using https://cloud.google.com/build/docs/api/reference/rest/v1/projects.triggers/get (which works perfectly), but when I try to upload the same file the error is always:
{
  "error": {
    "code": 400,
    "message": "exactly 1 build config required, got: 0",
    "status": "INVALID_ARGUMENT"
  }
}
If I try to upload invalid JSON, it correctly reports a JSON parsing error, so it is definitely parsing the payload as JSON.
So I tried the same experiment using the "Try it!" button on the Google page, which opens the Google APIs Explorer. Same result. The interface warned me that some fields are output-only, so I also tried removing those fields, but I got the same error.
The file I'm trying to upload is the following (some strings changed to remove the company name):
{
  "description": "Push to any branch",
  "github": {
    "push": {
      "branch": ".*"
    },
    "owner": "company",
    "name": "repo-utils"
  },
  "tags": [
    "github-default-push-trigger"
  ],
  "name": "default-push-trigger-127"
}
I think I found the issue. The Google API seems to require either build or filename to be passed in order to specify how to build. On the other hand, the web interface offers an Autodetect option for the build, which looks for either cloudbuild.yaml or a Dockerfile in the root directory. If you select Autodetect in the web interface, the JSON configuration will contain neither build nor filename, so when you try to import that configuration back it fails.
I tried passing filename as an empty string; the web interface then shows cloudbuild.yaml (which is present), but the execution of the trigger fails.
So it seems there's no way to create an Autodetect-mode trigger using the API.
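For reference, this is roughly how the workaround could look when patching the trigger with an explicit filename instead of Autodetect. This is only a hedged sketch in Python; requests, the token handling and the project/trigger identifiers are assumptions, not something I have verified:

# hedged sketch (not tested): patch the trigger with an explicit filename
# instead of Autodetect; the access token handling and the project/trigger
# identifiers are placeholders
import json
import requests

PROJECT_ID = "my-project"      # placeholder
TRIGGER_ID = "trigger-id"      # placeholder, the trigger's id as listed by the API
ACCESS_TOKEN = "..."           # e.g. from `gcloud auth print-access-token`

with open("trigger.json") as f:
    trigger = json.load(f)

# the API wants exactly one build config, so point it at the file explicitly
trigger["filename"] = "cloudbuild.yaml"

resp = requests.patch(
    "https://cloudbuild.googleapis.com/v1/projects/%s/triggers/%s" % (PROJECT_ID, TRIGGER_ID),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    json=trigger,
)
print(resp.status_code, resp.text)

With the filename set, the payload contains exactly one build config, which is what the error message is asking for.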

Derivatives API not returning Properties.db file

We have seen a few times that the properties.db file is not coming with the converted model when using the Derivative APIs.
I also tried the extract node module, and the zip file does not contain properties.db either.
Is there any change in the API for retrieving that file?
regards,
Afshin
Unfortunately, there is no specific API dedicated to retrieving the property DB file.
You can get its path by looking up the manifest after the model has been completely extracted/translated, and download it from there using the GET :urn/manifest/:derivativeurn endpoint:
{
  "guid": "...",
  "type": "resource",
  "urn": "urn:adsk.viewing:fs.file:<urn>/output/properties.db",
  "role": "Autodesk.CloudPlatform.PropertyDatabase",
  "mime": "application/autodesk-db",
  "status": "success"
},
In case it's not generated let us know the URN by dropping a line to forge.help at autodesk.com so we can look into it.
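As a rough illustration only (not an official sample), fetching the file could look like this in Python; the token handling, the recursive walk over the manifest and the URL encoding of the derivative URN are assumptions you may need to adjust:

# rough illustration: walk the manifest, find the property database derivative
# and download it; the token, the base64 model URN and the output file name
# are placeholders you would need to fill in
import requests
from urllib.parse import quote

BASE = "https://developer.api.autodesk.com/modelderivative/v2/designdata"
TOKEN = "..."      # access token with data:read scope (placeholder)
MODEL_URN = "..."  # base64-encoded design URN (placeholder)
headers = {"Authorization": "Bearer " + TOKEN}

manifest = requests.get(BASE + "/" + MODEL_URN + "/manifest", headers=headers).json()

def find_property_db(node):
    # look for the Autodesk.CloudPlatform.PropertyDatabase entry anywhere in the tree
    if node.get("role") == "Autodesk.CloudPlatform.PropertyDatabase":
        return node.get("urn")
    for child in node.get("derivatives", []) + node.get("children", []):
        found = find_property_db(child)
        if found:
            return found
    return None

derivative_urn = find_property_db(manifest)
if derivative_urn:
    # the derivative URN is URL-encoded before being appended to the manifest endpoint
    url = BASE + "/" + MODEL_URN + "/manifest/" + quote(derivative_urn, safe="")
    with open("properties.db", "wb") as f:
        f.write(requests.get(url, headers=headers).content)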

Unable to import swagger JSON or YAML into Postman

Problem
Unable to convert Swagger 2.0 into a format that is accepted by Postman's import functionality
Generated via /swagger.json|yaml
Swagger endpoint exposed via Dropwizard (Jetty) using Swagger
swagger-core: 1.5.17
swagger-jaxrs: 1.5.17
swagger-jersey2-jaxrs: 1.5.17
swagger-models: 1.5.17
Attempts
Tried manually importing the JSON or YAML versions via the import screen
import file
import from link
paste raw text
Tried converting to different formats using: api-spec-converter and swagger2-postman-generator
Result
Error on import: Must contain an info object
Question
Has anyone managed to get around this issue and allow the import?
In Swagger 2.0 the info field is mandatory. Just add the following to your YAML root:
info:
  title: 'EmptyTitle'
  description: 'EmptyDescription'
  version: 0.1.0
Or like this if you have it in JSON format (in the root too):
"info": {
"title": "EmptyTitle",
"description": "EmptyDescription",
"version": "0.1.0"
}
Hope it helps!
Have you tried converting to Postman v2?
The swagger2-postman-generator you tried converts Swagger v2 to Postman v1. This one converts Swagger v2 to Postman v2: https://www.npmjs.com/package/swagger2-postman2-converter as used in this tutorial.
My setup: Spring Boot-generated swagger-ui, which gives me the raw OpenAPI documentation.
In my case
"info": {
"title": "EmptyTitle"
}
was already available in the JSON that Spring Boot's OpenAPI generated for me, but it was missing the other two fields that @BBerastegui mentions in his answer:
"description": "EmptyDescription",
"version": "0.1.0"
I added them, so that the result looks like this, which works:
"info": {
"title": "EmptyTitle",
"description": "EmptyDescription",
"version": "0.1.0"
}

Handling Extra JSON Tags sent from Google App Engine API

I am trying to figure out the best way to handle extra JSON returned on a 200 Success from Google App Engine.
Here is a sample of the JSON being returned from GAE
{
  "username": "yo",
  "password": "yo",
  "email": "yo",
  "id": "5654313976201216",
  "kind": "photoswap#userItem",
  "etag": "\"XUjxKcsckN9zXROpZ4Yj2GJxcXg/L6Zg-XPyGcr_RGBFQBwHhIYcdBQ\""
}
This does not work with the User model that I have developed, and it was causing issues with the Gson JSON mapping. The fields "kind" and "etag" are not part of the response message that I created, and I was wondering if there is a way to stop them from being sent back by the server.
The API I have developed is written in Python.
Right now I have simply included those two values in my User object model, but I would prefer not to have to.
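The only generic workaround I can think of for now is to drop the extra keys before the payload reaches the model, roughly like this (illustrative Python only; the list of expected fields is an assumption, and the same filtering idea would apply on the client before handing the JSON to Gson):

# illustration only: filter the payload down to the fields the model expects
# before mapping; EXPECTED_FIELDS is an assumption about my User model
import json

EXPECTED_FIELDS = {"username", "password", "email", "id"}

response_body = '{"username": "yo", "password": "yo", "email": "yo", "id": "5654313976201216", "kind": "photoswap#userItem", "etag": "abc"}'
payload = json.loads(response_body)

# drop the keys the model does not know about ("kind" and "etag")
cleaned = {k: v for k, v in payload.items() if k in EXPECTED_FIELDS}
print(cleaned)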

How to replicate a foreign remote database into a local database? (CouchDB / MongoDB)

I would like to extend the data model of a remote database that is available via a web service interface. Data can be requested via HTTP GET and is delivered as JSON (example request). Other formats are supported as well.
// URL of the example request.
http://data.wien.gv.at/daten/wfs?service=WFS&request=GetFeature&version=1.1.0&typeName=ogdwien:BAUMOGD&srsName=EPSG:4326&outputFormat=json&maxfeatures=5
First object of the JSON answer.
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"id": "BAUMOGD.3390628",
"geometry": {
"type": "Point",
"coordinates": [
16.352910973544105,
48.143425569989326
]
},
"geometry_name": "SHAPE",
"properties": {
"BAUMNUMMER": "1022 ",
"GEBIET": "Strassen",
"STRASSE": "Jochen-Rindt-Strasse",
"ART": "Gleditsia triacanthos (Lederhülsenbaum)",
"PFLANZJAHR": 1995,
"STAMMUMFANG": 94,
"KRONENDURCHMESSER": 9,
"BAUMHOEHE": 11
}
},
...
My idea is to extend the data model (e.g. add a text field) on my own server and therefore mirror the database somehow. I stumbled upon CouchDB and its document-based architecture, which feels suitable for handling the aforementioned JSON objects. Now I am asking for advice on how to replicate the foreign database, both initially and on a regular basis.
Do you think CouchDB is a good choice? I also thought about MongoDB. If possible, I would like to avoid building a full Rails backend to set up the replication. What do you recommend?
If the remote database is static (the data doesn't change), then it could work. You just have to find a way to iterate over all records. Once you have figured that out, the rest is as simple as pie: 1) query the data; 2) store the response in a local database; 3) modify it as you see fit.
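For instance, a rough sketch of that loop with MongoDB (assuming pymongo, requests and a local mongod; the database/collection names and the extra my_notes field are placeholders, not part of the original data):

# rough sketch: fetch the WFS features and mirror them locally in MongoDB;
# a local mongod, the database/collection names and the extra "my_notes"
# field are all assumptions
import requests
from pymongo import MongoClient

WFS_URL = ("http://data.wien.gv.at/daten/wfs?service=WFS&request=GetFeature"
           "&version=1.1.0&typeName=ogdwien:BAUMOGD&srsName=EPSG:4326"
           "&outputFormat=json")

features = requests.get(WFS_URL).json()["features"]

collection = MongoClient("mongodb://localhost:27017")["wien"]["baumkataster"]
for feature in features:
    # add the extra local field the remote model does not have
    feature.setdefault("my_notes", "")
    # upsert on the remote id so a re-run refreshes instead of duplicating
    collection.replace_one({"id": feature["id"]}, feature, upsert=True)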
If the remote data changes, you'll run into trouble going this way (you'll have to re-sync in the same fashion every once in a while). What I'd do instead is create a local database with only the new fields and a reference to the original piece of data. That is, when you request data from the remote service, you also check whether you have something in the local DB and merge the two before processing the final result.