Node.js MySQL REST API - queue HTTP requests - persist data

What is the best architectural design for the following scenario?
I want to build a CRUD microservice using Node, Express and MySQL. The CREATE portion is quite complex due to a large piece of JSON with many relational properties on each HTTP POST request. The request.body looks something like this:
{
  key1: string,      <-- saved as foreign_key
  key2: {...},
  key3: int,
  key4: [            <-- saved as n:m with corresponding table
    {...},
    {...},
  ],
  ...
  ...
  keyXYZ: ...,
  key46: int,        <-- saved as foreign_key
  key47: string
}
The module that performs all the query operations looks like this:
persistData = async (data, dbConnection) => {
  const idSomething1 = await fetchOrCreateSomething1(data.key1).catch(err => console.log(err));
  const idSomething2 = await fetchOrCreateSomething2(data.key46).catch(err => console.log(err));
  const idSomething3 = await fetchOrCreateSomething3(data.keyXYZ).catch(err => console.log(err));
  const idManyThings = await fetchOrCreateManyThings(idSomething1, idSomething2, idSomething3, data.moreStuff...).catch(err => console.log(err));
}
All fetchOrCreateSomethingX = async () => {} functions are async so that the main function persistData can wait for a newly created or retrieved record id.
This is wrapped inside an exported constructor function:
function DataHandler(data, res) {
  db.getConnection()
    .then((dbConnection) => {
      persistData(data, dbConnection)
        .then(() => {
          dbConnection.release();
        });
    });
}
module.exports = DataHandler;
The endpoint does the following:
const createFunc = (req, res) => {
  new DataHandler(req.body, res);
};

app.post("/create", createFunc);
I know that the last part in particular does not work, because the new DataHandler object is overwritten as soon as the endpoint gets hit again. If the persisting process hasn't finished before the endpoint is hit again, the data from the first request is lost. I also know that Express won't be able to send back responses this way, which isn't ideal. If each new DataHandler were stored in its own variable or const, then at least both processes would run. But the main problem is that the data gets shuffled, because the persistData() calls run in parallel and are not encapsulated from each other.
I can't find any example or best practice for how to design this well. Any hint or resource would be great!
Is a queuing system like the kue library the way to go?
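For what it's worth, one alternative to a job queue is to keep each request fully self-contained: the handler awaits its own persistData() call on its own pooled connection, and only then responds. A minimal sketch, assuming db.getConnection() returns a Promise as in the snippets above (the response payload is made up):

const createFunc = async (req, res, next) => {
  let dbConnection;
  try {
    // each request works on its own req.body and its own connection,
    // so concurrent requests cannot shuffle each other's data
    dbConnection = await db.getConnection();
    await persistData(req.body, dbConnection);
    res.status(201).json({ ok: true }); // hypothetical response shape
  } catch (err) {
    next(err);
  } finally {
    if (dbConnection) dbConnection.release();
  }
};

app.post("/create", createFunc);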

Related

Is there a way to get a nested JSON object using react native?

My JSON file can be found using this link. The object "features" has a nested object called "properties", and I want to access the data from that object. I've tried to use the useEffect() hook from React and implemented that in the code below. I tried to get the "properties" sub-object with the following code: data.features.properties, but that returns undefined. What have I implemented wrong, or what logic is incorrect?
useEffect(() => {
  fetch('https://www.vaccinespotter.org/api/v0/states/' + stateAbb + '.json')
    .then((response) => response.json())
    .then((json) => {
      setData(json);
    })
    .catch((error) => console.error(error))
    .finally(() => setLoading(false));
}, [stateAbb]);
stateAbb is the state abbreviation for the state that the user selects in a text input on a different screen. propData seems to store the "features" object as I have used the alert() function and typeof() to determine that propData is an object.
I've tried JSON.parse() and implemented some other Stack Overflow answers, such as this and this. The effect still remains the same: data.features works as an object, but data.features.properties returns undefined.
Any help would be appreciated!
Thanks!
React doesn't allow the useEffect callback itself to be async/await, so you can create a new function like this:
useEffect(() => {
  fetchData();
}, [stateAbb]); // re-run when the selected state changes

const fetchData = async () => {
  try {
    const response = await fetch('https://www.vaccinespotter.org/api/v0/states/' + stateAbb + '.json');
    const json = await response.json();
    console.log(json); // your data is here!
  } catch (err) {
    console.log(err);
  }
};
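Regarding the undefined lookup itself: assuming the endpoint returns a GeoJSON-style FeatureCollection, features is an array (typeof still reports 'object' for arrays, which matches what you saw), so properties lives on each element of the array rather than on the array itself:

// hypothetical access, assuming data.features is a GeoJSON feature array
const firstProps = data.features[0].properties;          // properties of the first feature
const allProps = data.features.map((f) => f.properties); // or collect all of them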

Read file with fs module and keep the original file

I'm actually working on a NodeJS API that sends mail.
index.handlebars is a template that I want to use every time I need to send an email.
So I use the Node file system module (fs) to readFileSync() and then replace() the needed data before sending the email to the user.
Here is an example:
const readMe = fs.readFileSync('./mails/index.handlebars', 'utf8', (error, data) => {
  if (error) {
    console.log(error);
  } else {
    data = data.toString('utf8').replace('{%CONFIRMATION%}', "SELLER AS VALIDATE YOUR ORDER");
    return data;
  }
});
console.log(readMe);
First, sometimes replace() is not working for me and nothing happens. I don't know why.
But when it works, my goal is to not overwrite index.handlebars. What I mean by that is: replace() all the stuff and then send it, BUT keep index.handlebars as it was before the replace().
Is it possible?
Thanks a lot.
The fs module provides fs.readFile (reads a file asynchronously, takes a callback) and fs.readFileSync (reads synchronously, does not take a callback).
You are currently mixing up the two signatures by trying to do a synchronous read with a callback.
To use readFileSync (synchronous), you should:
// synchronous without callback
const data = fs.readFileSync('./mails/index.handlebars', { encoding: 'utf8', flag: 'r' })
const replaced = data.replace('{%CONFIRMATION%}', "SELLER AS VALIDATE YOUR ORDER")
console.log(replaced);
For readFile (asynchronous), you use the callback
// asynchronous with callback
fs.readFile('./mails/index.handlebars', 'utf8', (error, data) => {
  if (error) {
    console.log(error);
  } else {
    data = data.replace('{%CONFIRMATION%}', "SELLER AS VALIDATE YOUR ORDER");
    // perform necessary operations on data
    console.log(data);
  }
});
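As for keeping the original file: neither readFile nor readFileSync modifies anything on disk, and replace() returns a new string rather than changing the file, so index.handlebars stays exactly as it was. It would only change if you explicitly wrote back to the same path. If you want to keep the substituted output, write it somewhere else (the output path below is made up):

// the template on disk stays untouched; only this new file is written
fs.writeFileSync('./mails/confirmation.html', replaced);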

How to insert multiple JSON records stored in an array using MySQL and Knex.js (Node.js)?

I have code to store id_company, id_variablepoint, and answer in the table "answer" using Knex.js (Node.js) and MySQL:
router.post("/insert_answer", async (req, res, next) => {
  const id_company = req.body.id_company;
  const id_variablepoint = req.body.id_variablepoint;
  const answer = req.body.answer;
  try {
    const tambah = await knex("answer").insert([{ id_company: id_company, id_variablepoint: id_variablepoint, answer: answer }]);
    res.json({
      "data": tambah
    });
  } catch (e) {
    const error = new Error("ERROR: " + e);
    next(error);
  }
});
and I have a problem storing a request that looks like this, because the records come in an array:
[
  {
    "id_company": 2,
    "id_variablepoint": 57,
    "answer": "choose"
  },
  {
    "id_company": 2,
    "id_variablepoint": 49,
    "answer": "choose"
  }
]
My database structure looks like this (screenshot: db structure).
Please help me, thank you.
According to the Knex docs, insert() supports arrays, so you don't need to worry in your scenario; just pass the request body into insert:
const tambah = await knex("answer").insert(req.body);
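In context, the whole route then reduces to something like this (a sketch that assumes the request body is always the array shape shown above; real code would validate that first):

router.post("/insert_answer", async (req, res, next) => {
  try {
    // req.body: [{ id_company, id_variablepoint, answer }, ...]
    const tambah = await knex("answer").insert(req.body);
    res.json({ "data": tambah });
  } catch (e) {
    next(new Error("ERROR: " + e));
  }
});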

One response after a few async functions

I have a web page with a form where a user can edit personal info, education, work history, etc.
The user can add more than one degree, for example: BS, MS, PhD. And a few job positions as well.
When the user pushes the 'save' button I send all this data to my server in one request. On the server I have an endpoint to handle it:
app.post(config.version + '/profile', (req, res, next) => {});
There I do a few MySQL queries to insert/update/delete data. I use the mysql package from npm for that.
new Promise((resolve, reject) => {
  const userQuery = `INSERT INTO user ...;`;
  const degreesQuery = 'INSERT INTO degree ...;';
  const positionsQuery = 'UPDATE position SET ...;';
  this.connection.query(userQuery, err => {});
  this.connection.query(degreesQuery, err => {});
  this.connection.query(positionsQuery, err => {});
  resolve({});
})
In the end I do resolve({}), but I want to select the updated profile and send it back (because in the MySQL tables for degrees I add ids that help me avoid inserting duplicate data). So, my question is: how do I call resolve({}) only when all my async this.connection.query calls have finished?
My suggestion is to run all the queries in a Promise.all().
Example:
const queries = [
  `INSERT INTO user ...;`,
  'INSERT INTO degree ...;',
  'UPDATE position SET ...;'
];

Promise.all(queries.map((query) => {
  return new Promise((resolve, reject) => {
    this.connection.query(query, err => {
      return err ? reject(err) : resolve();
    });
  });
}))
.then(() => {
  // continue
  // get your updated data here and send it as the response
})
If your db library has Promise support, you can write it this way:
Promise.all(queries.map((query) => {
  return this.connection.query(query);
}))
.then(() => {
  // continue
  // get your updated data here and send it as the response
})
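The mysql package from npm is callback-based, so one way to get that Promise support is Node's util.promisify. A sketch, reusing connection and queries from above (the follow-up SELECT, userId, res and next are assumptions borrowed from your route handler):

const { promisify } = require('util');

// bind so that query keeps the connection as its `this`
const query = promisify(this.connection.query).bind(this.connection);

Promise.all(queries.map((q) => query(q)))
  .then(() => query('SELECT * FROM user WHERE id = ?', [userId])) // hypothetical follow-up select
  .then((rows) => res.json(rows))
  .catch(next);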

GraphQL: fulfill query from JSON file source

I've just started messing about with GraphQL, and I'd like a resolver that uses a JSON file on disk as the data source. What I've got so far causes GraphQL to return null.
How do I do this and why doesn't the approach below work?
var schema = buildSchema(`
  type Experiment {
    id: String
    trainData: String
    goldData: String
    gitCommit: String
    employee: String
    datetime: String
  }

  type Query {
    # Metadata for an individual experiment
    experiment: Experiment
  }

  schema {
    query: Query
  }
`);

var root = {
  experiment: () => {
    fs.readFile('./data/experimentExample.json', 'utf8', function(err, data) {
      if (err) throw err;
      console.log(data);
      return JSON.parse(data);
    });
  }
};
const app = express();
app.use('/graphql', graphqlHTTP({
  rootValue: root,
  schema: schema,
  graphiql: true
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');
The callback function you're passing to readFile runs asynchronously, which means returning a value from it doesn't do anything -- the function the readFile call is inside is done executing and has returned a value (null) by the time your callback is done.
As a rule of thumb, when dealing with GraphQL, you should stay away from callbacks -- your resolvers should always return a value or a Promise that will eventually resolve to a value.
Luckily, fs has a synchronous method for reading files, so you can just do:
const root = {
  experiment: () => {
    const file = fs.readFileSync('./data/experimentExample.json', 'utf8');
    return JSON.parse(file);
  }
};

// or even cleaner:
const root = {
  experiment: () => JSON.parse(fs.readFileSync('./data/experimentExample.json', 'utf8'))
};
As an additional example, here's how you would do that with a Promise:
// using Node 8's new promisify for our example
const readFileAsync = require('util').promisify(fs.readFile);

const root = {
  experiment: () => readFileAsync('./data/experimentExample.json', { encoding: 'utf8' })
    .then(data => JSON.parse(data))
};

// Or with async/await:
const root = {
  experiment: async () => JSON.parse(await readFileAsync('./data/experimentExample.json', { encoding: 'utf8' }))
};
Of course there's no need to promisify readFile since you already have a synchronous method available, but this gives you an idea of how to work with Promises, which GraphQL is happy to work with.
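As a side note, newer Node versions (10 and up) ship a promise-based fs API of their own, so promisify isn't needed there at all. A small sketch under that assumption:

// fs.promises is built in on Node 10+
const { readFile } = require('fs').promises;

const root = {
  experiment: async () => JSON.parse(await readFile('./data/experimentExample.json', 'utf8'))
};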