How to import JSON data to BigQuery

I am currently working on a project where I use an API to get JSON data. I have got to the point where I can request the data in the browser, but I want to take the next step and load the data produced by that URL into BigQuery. Is there a way to do this from the terminal?
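One common approach, assuming you have the Google Cloud SDK installed, is the bq load command. It expects newline-delimited JSON (one object per line), so if the API returns a JSON array you would first need to convert it. The URL, dataset, and table names below are placeholders:

# Save the API response to a file (hypothetical URL), then load it.
curl -s "https://api.example.com/data" > data.json
# --autodetect asks BigQuery to infer the table schema from the file.
bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect mydataset.mytable ./data.json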

Related

What would be the best way to write JSON in a locally stored file in React Native

I'm currently working on a new design for a mobile app (frontend only). In the project I need to use state to change the CSS of a button. If the button has changed, the next time the app is refreshed the state should be as you left it.
That is why I have a locally stored JSON file that is structured the same way as the app's current database. Reading it is no issue, but I can't get the writing to work.
How I read the JSON:
const jsonData = require('../data.json')

function GetBaseState(id) {
  console.log(jsonData.bases[id].state)
}
How would I go about changing that state in the JSON file?
For both reading and writing JSON files, you can use react-native-fs. For example, you can read your JSON file with the readFile method and write it to local storage with the writeFile method.
react-native-fs has good documentation that you can read for more information and usage.
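A minimal sketch of that approach, assuming the JSON file has been copied into the app's document directory first (files bundled with require are read-only, so you cannot write back to ../data.json directly):

import RNFS from 'react-native-fs'

// Hypothetical location of the writable copy of data.json
const path = RNFS.DocumentDirectoryPath + '/data.json'

async function setBaseState(id, newState) {
  // Read the current file, update the one field, write it back
  const contents = await RNFS.readFile(path, 'utf8')
  const data = JSON.parse(contents)
  data.bases[id].state = newState
  await RNFS.writeFile(path, JSON.stringify(data, null, 2), 'utf8')
}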

Import JSON without caching the file (Next.js or Node.js)

I built an API with Next.js.
I use a JSON file as a data source and import it as a module. But if the content of the JSON file changes, the API still serves the content from when I started the server.
Is there a way to avoid caching the JSON when using import?
I need the JSON content, including any updates to the file, without restarting my API.
If your server returns the JSON files with a specific file extension like .json, you could try to turn off caching for those file types:
Here is an example for nginx servers
Here is an example for Apache servers
Another possibility is to load the JSON via JavaScript, adding a random parameter to the query params of the URL:
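A minimal client-side sketch of that idea (the /api/data endpoint is hypothetical); the Date.now() value makes every URL unique, so a cached response cannot be reused:

async function loadData() {
  // The changing query param defeats browser and intermediate caches
  const res = await fetch('/api/data?nocache=' + Date.now())
  return res.json()
}

On the server side, note that the root cause with import is Node's module cache: an imported JSON file is read once and kept in memory. Reading the file with fs on each request avoids that entirely; a sketch, assuming a pages-router Next.js API route and a data.json in the project root:

// pages/api/data.js (hypothetical route)
import { promises as fs } from 'fs'
import path from 'path'

export default async function handler(req, res) {
  // Re-read data.json on every request instead of importing it,
  // so edits to the file show up without a server restart.
  const file = path.join(process.cwd(), 'data.json')
  const contents = await fs.readFile(file, 'utf8')
  res.status(200).json(JSON.parse(contents))
}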

Converting nested JSON response to a CSV table

I am testing APIs on a locally run service using Postman. I am getting a large nested response as JSON, but I am required to produce a CSV file with the data in tables, i.e., keys as column names and the values as separate rows in the tables.
An offline solution is needed, for privacy and confidentiality reasons.
First I came across this solution: postman saveResponseToFile template
It is a Postman template that sends the requests to a Node server running locally, which in turn saves the JSON to a CSV.
But all this does is save each line into a separate cell horizontally across the CSV, which is not much use for readability.
Desired Output
The table I need is something like what this online tool does:
ConvertCSV
It takes the keys as column names and creates new rows for each piece of nested data.
Is there an OSS script that can accomplish this? Sorry if this is a duplicate; I haven't found a proper one after trying more scripts similar to the first.
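One possible offline approach, sketched in plain Node.js with no dependencies; response.json and out.csv are placeholder file names, and the script assumes the response is an array of records:

const fs = require('fs')

// Recursively flatten nested objects into dot-separated column names,
// e.g. { user: { name: 'a' } } -> { 'user.name': 'a' }
function flatten(obj, prefix = '', out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const col = prefix ? `${prefix}.${key}` : key
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      flatten(value, col, out)
    } else {
      out[col] = Array.isArray(value) ? JSON.stringify(value) : value
    }
  }
  return out
}

const records = JSON.parse(fs.readFileSync('response.json', 'utf8')).map(r => flatten(r))
// Union of all keys becomes the header row; missing values stay empty
const columns = [...new Set(records.flatMap(r => Object.keys(r)))]
const escape = v => `"${String(v ?? '').replace(/"/g, '""')}"`
const lines = [columns.join(','), ...records.map(r => columns.map(c => escape(r[c])).join(','))]
fs.writeFileSync('out.csv', lines.join('\n'))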

How do I use Pentaho spoon to push data from MySQL database to facebook webpage

1) I have already made a transformation mapping for getting data from a specific MySQL table (Table Input) and converting it to a text file output.
2) I have also created a Facebook developer account and page and am trying to figure out how the Facebook API works to push data from MySQL to Facebook.
3) I would appreciate it if a transformation mapping could be provided. Also, I would not like to use XML; I would like to use JSON instead.
The MySQL table has already been converted to a CSV file, but I am not sure how to post the CSV file to Facebook, or whether there is a way to connect the MySQL table to Facebook directly. Please share your ideas or transformation mapping. Thanks.
I would assume you are familiar with the Facebook Development API and actions like post, get, and so on.
Pentaho has a step called "REST Client".
You will have an API URL to post the data that you want from MySQL; the step supports several methods: GET, PUT, POST, and DELETE.
Also set the application format to JSON (XML, JSON, etc. are available).
As a point of reference, I have used the REST Client step with the GET method to read data from Facebook as a workaround.
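For orientation, what the REST Client step ends up sending is an ordinary Graph API request. A rough Node.js sketch of the equivalent call, assuming a page ID and page access token from your Facebook developer account (both placeholders here, and the version path in the URL may differ for your app):

// Post one row of MySQL data to a Facebook page feed via the Graph API.
const PAGE_ID = 'YOUR_PAGE_ID'
const PAGE_ACCESS_TOKEN = 'YOUR_PAGE_ACCESS_TOKEN'

async function postRow(message) {
  const res = await fetch(`https://graph.facebook.com/v19.0/${PAGE_ID}/feed`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message, access_token: PAGE_ACCESS_TOKEN }),
  })
  return res.json()
}

postRow('Hello from MySQL via Pentaho').then(console.log)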

How to save JSON response and result in an Excel sheet

I want to save each JSON response next to the respective input (in the same Excel file that I have used as the input file for data-driven testing), in each input row, as shown below. Can anyone please help me with this? I am using SoapUI Pro. I am new to Groovy scripting; can it be done using some test steps?
employeeName    output json    Result
abcd                           PASS
Sounds like you need a data sink. SoapUI Pro comes with one of these built in, which can be configured to write to Excel, CSV, or databases.
If you're using the same Excel file as both your data source and data sink, you may need to do some testing to see whether there's any contention.