So basically I will be getting a feed of a few huge JSON files. I want to convert them into SQL and store them in a MySQL database.
The catch here is that later on I will need to get the data back out of the database and convert it into JSON objects.
Something like https://sqlizer.io/#/, which converts JSON to SQL, but the other way around as well.
So I was wondering if there are any NodeJS modules/libraries that have this type of capability.
Thank you.
I don't see what the problem is. With most SQL libraries in Node, when you run SQL queries you get JSON back, or at least data that you can convert to JSON with JSON.stringify.
Let's say we are doing it with knex and Postgres:
const db = require('knex')({
client: 'pg',
connection: {
host : '127.0.0.1',
user : 'your_database_user',
password : 'your_database_password',
database : 'myapp_test'
}
});
/*
Let's assume the content of the json files is this way:
[
{
"name": "foo",
"last_name": "bar"
},
{
"name": "foo",
"last_name": "bar"
}
]
Schema of table should be (
name TEXT NOT NULL,
last_name TEXT NOT NULL
)
*/
// these are the files
const myFilesToRead = ['./file1.json', './file2.json'];
// we are looping through myFilesToRead
// reading the files
// running queries for each object
Promise.all(
myFilesToRead.map((file) => {
// yes you can require json files :)
const fContent = require(file);
return Promise.all(fContent.map((obj) => {
// knex is a query builder, it will convert the code below to an SQL statement
return db('table_name')
.insert(obj)
.returning('*')
.then((result) => {
console.log('inserted', result);
return result;
});
}));
})
)
.then((result) => {
// let's now get these objects back
return db('table_name')
.select('*');
})
.then((result) => {
// that's it
console.log(JSON.stringify(result));
});
If you want to read about knex, here is the doc:
http://knexjs.org/
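To make the round-trip claim above concrete, here is a tiny standalone sketch (no database or knex involved; the rows are hard-coded sample data): the rows a query hands back are plain JavaScript objects, so JSON.stringify on a result set reproduces a JSON document equivalent to the one you loaded.

```javascript
// Simulated query result: most Node SQL clients hand rows back
// as an array of plain objects keyed by column name.
const rows = [
  { name: 'foo', last_name: 'bar' },
  { name: 'foo', last_name: 'bar' }
];

// Serialising the rows yields valid JSON...
const json = JSON.stringify(rows);

// ...and parsing it recovers the same structure that was inserted.
const roundTripped = JSON.parse(json);
console.log(json);
```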
Related
So I have a database that I can query using ExpressJS and NodeJS. The database is a MySQL db, and the data within looks like this:
id: 1
username: 'someUsername'
email: 'randomEmail@email.email'
I want to somehow put the data from the database into a JSON list, then map over it and show it to the user. Another option to achieve this, I reasoned, would be to populate the state of the app: create a class component, add mapStateToProps, assign the data returned from the queries to the state, and then use the data from the state in the React app itself. I am not so sure whether that would be effective.
This is a minimal code example for a component that fetches data from your backend on load and displays it using .map, without using Redux (mapStateToProps):
const DisplayData = () => {
const [ data, setData ] = useState([]);
const fetchData = async () => {
const results = await fetch(url).then(res => res.json());
setData(results) // set the fetched results, not the stale state
}
useEffect(() => {
fetchData()
},[])
return ( <div>
{ data.map(item => <p>{item.name}</p>) }
<pre>
{ JSON.stringify(data, null, 4) }
</pre>
</div> )
}
Well, the data returned from the SQL query is itself an array of objects.
Your answer lies in simply iterating over the returned data and assigning it to whatever object you like.
let queryResults = returnedQueryData // data returned from SQL query
let jsonarray = {}
for (const row of queryResults) {
jsonarray[row.id] = {
id: row['id'],
username: row['username'],
email: row['email']
}
}
To access data from the JSON array use
Object.keys(jsonarray).forEach(e => {
// Here e is the key of an element in jsonarray
})
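The same row-to-object transformation can also be written in a single pass with Array.prototype.reduce; here is a standalone sketch using made-up sample rows:

```javascript
// Sample rows in the shape a MySQL client would return them.
const returnedQueryData = [
  { id: 1, username: 'someUsername', email: 'randomEmail@email.email' },
  { id: 2, username: 'otherUser', email: 'other@email.email' }
];

// Build an object keyed by id in a single pass over the rows.
const byId = returnedQueryData.reduce((acc, row) => {
  acc[row.id] = { id: row.id, username: row.username, email: row.email };
  return acc;
}, {});

console.log(byId[1].username); // -> someUsername
```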
Basically, I am setting up a web server via Node.js and Express (I am a beginner at this) to retrieve data by reading a JSON file.
For example, this is my data.json file:
[{
"color": "black",
"category": "hue",
"type": "primary"
},
{
"color": "red",
"category": "hue",
"type": "primary"
}
]
I am trying to retrieve all of the colors by implementing this code for it to display on localhost:
router.get('/colors', function (req, res) {
fs.readFile(__dirname + '/data.json', 'utf8', function (err, data) {
data = JSON.parse(data);
res.json(data); //this displays all of the contents of data.json
})
});
router.get('/colors:name', function (req, res) {
fs.readFile(__dirname + '/data.json', 'utf8', function (err, data) {
data = JSON.parse(data);
for (var i = 0; i < data.length; i++) {
res.json(data[i][1]); //trying to display the values of color
}
})
});
How do I go about doing this?
What you are trying to do is actually pretty simple once you break it into smaller problems. Here is one way to break it down:
Load your JSON data into memory for use by your API.
Define an API route which extracts only the colours from your JSON data and sends them to the client as a JSON.
var data = [];
try {
data = JSON.parse(fs.readFileSync('/path/to/json'));
} catch (e) {
// Handle JSON parse error or file not exists error etc
data = [{
"color": "black",
"category": "hue",
"type": "primary"
},
{
"color": "red",
"category": "hue",
"type": "primary"
}
]
}
router.get('/colors', function (req, res, next) {
var colors = data.map(function (item) {
return item.color
}); // This will look like: ["black","red"]
res.json(colors); // Send your array as a JSON array to the client calling this API
})
Some improvements in this method:
The file is read only once synchronously when the application is started and the data is cached in memory for future use.
Using Array.prototype.map to extract an array of colors from the object.
Note:
You can structure the array of colors however you like and send it down as a JSON in that structure.
Examples:
var colors = data.map(function(item){return {color:item.color};}); // [{"color":"black"},{"color":"red"}]
var colors = {colors: data.map(function(item){return item.color;})} // { "colors" : ["black" ,"red"] }
Some gotchas in your code:
You are using res.json inside a for loop, which is incorrect: the response should only be sent once. Ideally you would build the JS object in the structure you need by iterating over your data, then send the completed object once with res.json (which, I'm guessing, internally JSON.stringifys the object and sends it as the response after setting the correct headers).
Reading files is an expensive operation. If you can afford to read the file once and cache the data in memory, it will be efficient (provided your data is not prohibitively large; in that case, using files to store the info might be inefficient to begin with).
In Express, you can do it this way:
router.get('/colors/:name', (req, res) => {
const key = req.params.name
const content = fs.readFileSync(__dirname + '/data.json', 'utf8')
const data = JSON.parse(content)
const values = data.reduce((values, value) => {
values.push(value[key])
return values
}, [])
// values => ['black', 'red']
res.send(values)
});
and then curl http://localhost/colors/color will get you ['black', 'red'].
What you're looking to do is:
res.json(data[i]['color']);
If you don't really want to use the keys in the json you may want to use the Object.values function.
...
data = JSON.parse(data)
var values = []
for (var i = 0; i < data.length; i++) {
values.push(Object.values(data[i])[0]) // 0 - color, 1 - category, 2 - type
}
res.json(values) // ["black","red"]
...
You should never use fs.readFileSync in production. Any sync function will block the event loop until its execution is complete, delaying everything that comes afterwards (use with caution if deemed necessary). A few days back I had the worst experience myself and learned that the hard way.
In express you can define a route with param or query and use that to map the contents inside fs.readFile callback function.
/**
* get color by name
*
* @param {String} name name of the color
* @return {Array} array of the color data matching param
*/
router.get('/colors/:name', (req, res) => {
const color = req.params.name
const filename = __dirname + '/data.json';
fs.readFile(filename, 'utf8', (err, data) => {
if(err){
return res.send([]); // handle any error returned by readFile function here
}
try{
data = JSON.parse(data); // parse the JSON string to array
let filtered = []; // initialise empty array
if(data.length > 0){ // we got an ARRAY of objects, right? make your check here for the array or else any map, filter, reduce, forEach function will break the app
filtered = data.filter((obj) => {
return obj.color === color; // return the object if the condition is true
});
}
return res.send(filtered); // send the response
}
catch(e){
return res.send([]); // handle any error returned from JSON.parse function here
}
});
});
To summarise, use the asynchronous fs.readFile function so that the event loop is not clogged up. Inside the callback, parse the content and then return the response. The return is really important, or else you might end up getting Error: Can't set headers after they are sent.
DISCLAIMER This code above is untested but should work. This is just to demonstrate the idea.
I think you can't access JSON values without a key. You can use a for...in loop, e.g. for (var name in object) {}; read up on for...in, it may help you.
I'm working on a Vue frontend/Express backend application and am having an issue with numbers in a JSON response 'arriving' at the frontend as null.
server:
db.boards.getUserBoards(user.user_id).then((boards) => {
let userBoards = [];
if (boards) userBoards = boards;
user.boards = userBoards;
const token = jwt.sign(JSON.stringify(user), config.secret);
res.json({ message: 'Token granted', token, user });
Where boards is an array such as:
"boards": [
{
"board_id": 1,
"name": "test"
}
]
This works fine in Postman (that's where the above was directly copied from), but the response in my client application has a null value instead. It seems to only affect the numerical values within the boards array, as the user object has a numerical id field that makes it across without issue.
client:
Axios.post(`${api}/api/stuff`, credentials)
.then(({data}) => { console.log(data) })
What am I missing?!
The json2json library converts one JSON format to another using a template variable.
var template = {
"path": ".",
"as": {
"skus": {
"path": "students,student",
"choose": ["name", "subject"],
"format": function(node, value, key) {
return { value : value };
},
"as": {
"StudentName": "name",
"StudentSubject": "subject",
}
}
}
}
transformedJson = new json2json.ObjectTemplate( template ).transform( oldJson );
I want to save this template variable in the database and later use it to transform JSON by querying the database. How can this be done?
Judging from your tag, you would like to insert this data into a MySQL database. You just need a client. In that case, you could use a package like mysql to do so. They provide a very basic example in their documentation (which I've very quickly and slightly adapted to your question):
var mysql = require('mysql');
var connection = mysql.createConnection({
host : 'localhost',
user : 'me',
password : 'secret',
database : 'my_db'
});
connection.connect();
connection.query('INSERT INTO your_table SET ?', [transformedJson], function(err, res) {
if (err) throw err;
console.log(res);
});
connection.end();
This is, of course, assuming that you have a MySQL database, a MySQL server, etc. Generally speaking, it's not a great idea to insert JSON into a MySQL database in the sense of just dumping a big object into a single field. However, MySQL 5.7.8 does support a native JSON data type.
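One caveat worth noting if you store the template from the question as JSON: JSON.stringify silently drops function-valued properties, so a template containing a format callback will not survive a round trip through a database column as-is. A small demonstration:

```javascript
// A cut-down template with a function-valued property,
// mirroring the "format" callback in the question.
const template = {
  path: '.',
  choose: ['name', 'subject'],
  format: function (node, value, key) { return { value: value }; }
};

// Function properties are silently omitted during serialisation...
const stored = JSON.stringify(template);

// ...so after the round trip only plain data survives.
const restored = JSON.parse(stored);
console.log('format' in restored); // -> false
```

If the template must include code, you would have to store the function body separately (e.g. as a string) and reattach it after loading, which has obvious security implications.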
I'm using express, body-parser and mongoose to build a RESTful web service with Node.js. I'm getting JSON data in the body of a POST request; that function looks like this:
router.route('/match')
// create a match (accessed at POST http://localhost:3000/api/match)
.post(function(req, res) {
if (req._body == true && req.is('application/json') == 'application/json' ) {
var match = new Match(); // create a new instance of the match model
match.name = req.body.name; // set the match name and so on...
match.host = req.body.host;
match.clients = req.body.clients;
match.status = req.body.status;
// save the match and check for errors
match.save(function(err) {
if (err) {
//res.send(err);
res.json({ status: 'ERROR' });
} else {
res.json({ status: 'OK', Match_ID: match._id });
}
});
} else {
res.json({ status: 'ERROR', msg: 'not application/json type'});
}
});
The model Im using for storing a match in the database looks like this:
// app/models/match.js
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var MatchSchema = new Schema({
name: String,
host: String,
clients: { type: [String]},
date: { type: Date, default: Date.now },
status: { type: String, default: 'started' }
});
module.exports = mongoose.model('Match', MatchSchema);
But how do I validate that the JSON data in the body of the POST request has the key/value fields I want? For clarification, I don't want to insert data in the database that is incomplete. If I skip a key/value pair in the JSON data I get a missing field in the database, and when I try to read the req.body.MISSING_FIELD parameter in my code I get undefined. All fields except date in the model are required.
I'm using JSON strings like this to add matches to the database:
{"name": "SOCCER", "host": "HOST_ID1", "clients": ["CLIENT_ID1", "CLIENT_ID2"], "status": "started"}
I use a very simple function that takes an array of keys, then loops through it and ensures that req.body[key] is not a falsy value. It is trivial to modify it to accommodate only undefined values however.
In app.js
app.use( (req, res, next ) => {
req.require = ( keys = [] ) => {
keys.forEach( key => {
// NOTE: This will throw an error for ALL falsy values.
// if this is not the desired outcome, use
// if( typeof req.body[key] === "undefined" )
if( !req.body[key] )
throw new Error( "Missing required fields" );
})
}
next() // without this, requests never reach the route handlers
})
in your route handler
try{
// Immediately throws an error if the provided keys are not in req.body
req.require( [ "requiredKey1", "requiredKey2" ] );
// Other code, typically async/await for simplicity
} catch( e ){
//Handle errors, or
next( e ); // Use the error-handling middleware defined in app.js
}
This only checks that the body contains the specified keys. It won't validate the data sent in any meaningful way. This is fine for my use case, because if the data is totally borked I'll just handle the error in the catch block and throw an HTTP error code back at the client (consider sending a meaningful payload as well).
If you want to validate the data in a more sophisticated way (like, say, ensuring that an email is in the correct format), you might want to look into validation middleware like https://express-validator.github.io/docs/
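For what it's worth, the key-presence check can also be factored into a plain function so it can be unit-tested independently of Express; findMissingKeys below is a made-up helper name, not part of any library:

```javascript
// Return the list of keys that are absent (or falsy) in body.
function findMissingKeys(body, keys = []) {
  return keys.filter(key => !body[key]);
}

// Inside a route handler you might then do something like:
// const missing = findMissingKeys(req.body, ['name', 'host', 'clients', 'status']);
// if (missing.length) return res.status(400).json({ status: 'ERROR', missing });

console.log(findMissingKeys({ name: 'SOCCER', host: 'HOST_ID1' }, ['name', 'host', 'status']));
// -> [ 'status' ]
```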