pgAgent: post JSON to a PostgreSQL database

I am looking to update JSON data in a PostgreSQL database whenever the source service is updated, or after 28 days. Is it possible to use pgAgent for this?
I need to bring the FAA GIS JSON from https://services6.arcgis.com/ssFJjBXIUyZDrSYZ/arcgis/rest/services/Digital_Obstacle_File/FeatureServer/0 into PostgreSQL, but it needs to stay up to date, so the code should run every 28 days or look for a change in the JSON.
I am trying to run a job that pulls the JSON into the database.
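pgAgent can schedule either a SQL step or a batch step; one common pattern is a batch step that runs a small fetch-and-store script every 28 days. Below is a minimal sketch of such a script, querying the layer through its standard ArcGIS /query endpoint and storing the response in a jsonb column. The connection string, table name, and column names are assumptions for illustration, not part of the question:

    import json

    import psycopg2
    import requests

    # The FeatureServer layer from the question; ArcGIS layers are queried
    # through their /query endpoint.
    URL = ("https://services6.arcgis.com/ssFJjBXIUyZDrSYZ/arcgis/rest/services/"
           "Digital_Obstacle_File/FeatureServer/0/query")

    # Ask for every feature, returned as JSON.
    payload = requests.get(
        URL,
        params={"where": "1=1", "outFields": "*", "f": "json"},
        timeout=120,
    ).json()

    # Assumed connection string and staging table (fetched_at timestamptz, doc jsonb).
    conn = psycopg2.connect("dbname=gis user=etl")
    with conn, conn.cursor() as cur:
        cur.execute("TRUNCATE faa_obstacles_raw")  # replace the previous snapshot
        cur.execute(
            "INSERT INTO faa_obstacles_raw (fetched_at, doc) VALUES (now(), %s)",
            [json.dumps(payload)],
        )
    conn.close()

A pgAgent job with a single batch step calling this script, scheduled every 28 days, covers the time-based half; detecting a change in the JSON would mean comparing the fetched document (or a hash of it) against the stored copy before overwriting.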

Related

WPForms: JSON to SQL table output

I am using the WPForms Pro version, which stores entries in a database. I found out that WPForms stores the data as JSON in the fields column of the wp_entries table.
I need to further process this data into reports, and I would love for this data to be available as a standard SQL table.
Does anyone know how to convert this JSON into SQL output using SQL code? I have tried the JSON_TABLE function but I'm a bit unfamiliar with its logic.
The backend is MySQL / MariaDB.
My JSON data looks something like this:
{"5":{"name":"Locate cost","value":"0.90","id":5,"type":"number"},"3":{"name":"Ticker","value":"IDAI","id":3,"type":"text"}}
Thanks!
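Not an authoritative answer, but a sketch of how JSON_TABLE can flatten this shape. The outer object is keyed by field id, so the row path '$.*' walks each entry without knowing the keys, and each entry then exposes id/name/value/type through explicit paths. This assumes MySQL 8.0+ (where the '$.*' row path is accepted; MariaDB 10.6+ also ships JSON_TABLE, though its path support may differ), a wp_entries primary key named entry_id, and the mysql-connector-python driver; credentials are made up:

    import mysql.connector

    # Assumed credentials and database name.
    db = mysql.connector.connect(
        host="localhost", user="wp", password="secret", database="wordpress"
    )
    cursor = db.cursor()

    # One output row per form field: '$.*' iterates the values of the outer
    # object, so the numeric keys ("5", "3", ...) don't need to be known.
    cursor.execute("""
        SELECT e.entry_id, jt.id, jt.name, jt.value, jt.type
        FROM wp_entries AS e,
        JSON_TABLE(
            e.fields, '$.*'
            COLUMNS (
                id    INT          PATH '$.id',
                name  VARCHAR(64)  PATH '$.name',
                value VARCHAR(255) PATH '$.value',
                type  VARCHAR(16)  PATH '$.type'
            )
        ) AS jt
    """)
    for row in cursor.fetchall():
        print(row)  # e.g. (1, 3, 'Ticker', 'IDAI', 'text')

Wrapping that SELECT in a CREATE VIEW (or an INSERT ... SELECT into a reporting table) would make it behave like the standard SQL table you are after.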

Saving and retrieving datetime to a MySQL datetime column stores the previous day's date, using TypeORM, NestJS | Node.js

I am trying to store pickup_time in a datetime-type MariaDB column.
My stack: NestJS (Node.js), MariaDB.
It saves the date one day earlier; here's an example:
Insertion:
"pickup_time": "2021-06-07 01:00:00.000",
Getting this:
"2021-06-06T20:00:00.000Z"
When I add only a date, like "2021-06-07", it works fine and inserts the date: "2021-06-07T00:00:00.000Z".
But I want to be able to store the time as well.
I am sure this has been solved previously, but I could not find the solution. Any help is much appreciated.
I have a datetime column; when I save the data, it is saved correctly: 2021-03-18 00:00:00.
When I retrieve it using TypeORM, I get: 2021-03-17T19:00:00.000Z.
It returns a day less than the one stored. I know this has to do with the timezone; I have tried adding a timezone to the ORM config, but it's not working. Can I just get the data back exactly as it is stored in the DB? How can I do this?
It turns out the data was saved just fine. Note that both examples come back shifted by exactly five hours, i.e. the driver is re-rendering the stored local time as UTC for a UTC+5 client. When retrieving, we have to map it back to our local string:
pickup_time: new Date(pickup_time).toLocaleString()

Load JSON files in Google Cloud Storage into a BigQuery table

I am trying to do this with the client library, using Python.
The problem I am facing is that the TIMESTAMP values in the JSON files are in Unix epoch format, and according to the documentation BigQuery can't detect that.
So I wonder what to do. I thought about changing the JSON format manually before I load it into the BigQuery table, or maybe looking for an automatic conversion on the BigQuery side.
I wandered across the internet and could not find anything useful yet.
Thanks in advance for any support.
You have two solutions:
Either you update the format before the BigQuery integration
Or you update the format after the BigQuery integration
Before
"Before" means updating your JSON manually or by script, or having it updated by the process that loads the JSON into BigQuery (such as Dataflow). I personally don't like this; file handling is never fun or efficient.
After
In this case, you let BigQuery load your JSON file into a temporary table, with the Unix timestamp typed as a number or a string. Then you run a query against this temporary table, convert the field into the correct timestamp format, and insert the data into the final table.
This way is smoother and easier (a simple SQL query to write). However, it implies the cost of reading all the loaded data (in order to then write it).
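A minimal sketch of the "after" approach with the google-cloud-bigquery Python client. The bucket path, dataset and table names, and the event_ts/payload fields are all assumptions for illustration; TIMESTAMP_SECONDS does the epoch-to-timestamp conversion (use TIMESTAMP_MILLIS if your epochs are in milliseconds):

    from google.cloud import bigquery

    client = bigquery.Client()

    # 1) Load the raw JSON into a staging table, keeping the epoch value numeric.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        schema=[
            bigquery.SchemaField("event_ts", "INT64"),  # Unix epoch seconds
            bigquery.SchemaField("payload", "STRING"),
        ],
        write_disposition="WRITE_TRUNCATE",
    )
    client.load_table_from_uri(
        "gs://my-bucket/data/*.json",            # assumed GCS path
        "my_project.my_dataset.staging_events",  # assumed staging table
        job_config=job_config,
    ).result()

    # 2) Convert the epoch column to a real TIMESTAMP while copying to the
    #    final table (assumed to already exist).
    client.query(
        """
        INSERT INTO `my_project.my_dataset.events` (event_time, payload)
        SELECT TIMESTAMP_SECONDS(event_ts), payload
        FROM `my_project.my_dataset.staging_events`
        """
    ).result()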

Can data from an SQL query be inserted into Elasticsearch?

I'm new to Elasticsearch. I have learnt how to issue different queries and get search results, with the understanding that each document is stored in JSON format. Is it possible to insert records that were obtained from an SQL query on a relational database? If it is possible, how is it done? By converting each record into JSON format?
You need to build an index in Elasticsearch similar to the way you've got your tables in the RDBMS. This can be done in a lot of ways, and it really depends on what data you would need to access via Elasticsearch; you shouldn't just dump your complete RDBMS data into ES.
If you search around you may find bulk data importers/synchronisers/rivers (now deprecated) for your RDBMS to ES; some of these can run in the background and keep the indexes in ES up to date with your RDBMS.
You can also write your own code which updates ES whenever any data is changed in your RDBMS. Look into the Elasticsearch client API for your platform: https://www.elastic.co/guide/en/elasticsearch/client/index.html
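To make the "convert each record to JSON" idea concrete, here is a minimal sketch using the official elasticsearch Python client and mysql-connector-python; the table, columns, index name, and credentials are made up:

    import mysql.connector
    from elasticsearch import Elasticsearch, helpers

    # Assumed MySQL source and local Elasticsearch node.
    db = mysql.connector.connect(
        host="localhost", user="app", password="secret", database="shop"
    )
    es = Elasticsearch("http://localhost:9200")

    # dictionary=True returns each row as a dict, which is already the
    # JSON shape Elasticsearch expects for a document.
    cursor = db.cursor(dictionary=True)
    cursor.execute("SELECT id, name, price FROM products")

    # One bulk action per row; the table's primary key doubles as the
    # document _id, so re-running the sync overwrites instead of duplicating.
    actions = (
        {"_index": "products", "_id": row["id"], "_source": row}
        for row in cursor
    )
    helpers.bulk(es, actions)

Schedule this script, or trigger it from your application's write path, and the index stays in step with the relational source.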

Hadoop MongoDB connector: read data but output as a MySQL table

Is it possible to read MongoDB data with the Hadoop connector but save the output as a MySQL data table? I want to read some data from a MongoDB collection with Hadoop, process it with Hadoop, and output it NOT back into MongoDB but into MySQL.
I have used it to fetch data from MongoDB as input and store the result at a different MongoDB address. For that you need to specify something like:
MongoConfigUtil.setInputURI(discussConf,"mongodb://ipaddress1/Database.Collection");
MongoConfigUtil.setOutputURI(discussConf,"mongodb://ipaddress2/Database.Collection");
For MongoDB to MySQL, my suggestion is: you can write normal Java code to insert whatever data you need into MySQL. That code can live in your reduce or map function; see the sketch below.
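The answer suggests plain Java (e.g. JDBC with a PreparedStatement) inside the reducer; for consistency with the other examples here, this language-neutral sketch shows the same insert logic in Python with mysql-connector-python. The table, columns, and the shape of the processed records are assumptions standing in for whatever your reduce step emits:

    import mysql.connector

    # Stand-in for the records your reduce/map step produces.
    processed = [
        {"user_id": 1, "post_count": 42},
        {"user_id": 2, "post_count": 7},
    ]

    db = mysql.connector.connect(
        host="localhost", user="etl", password="secret", database="analytics"
    )
    cursor = db.cursor()

    # Parameterised inserts: the same calls you would make via JDBC
    # from inside the reducer.
    cursor.executemany(
        "INSERT INTO user_stats (user_id, post_count) VALUES (%s, %s)",
        [(r["user_id"], r["post_count"]) for r in processed],
    )
    db.commit()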