Best way to fetch data from many tables in node.js - mysql

If I have a view in my MVC project that contains data from many tables in the database, what is the best way to fetch them all without getting into the nested tree of doom below?
Model1.findAll().then(model1Data => {
  Model2.findAll().then(model2Data => {
    Model3.findAll().then(model3Data => {
      Modeln.findAll().then(modelnData => {
        res.render('view', {
          model1Data: model1Data,
          model2Data: model2Data,
          model3Data: model3Data,
          modelnData: modelnData
        });
      })
    })
  })
})
Note: the above queries have no where clauses, joins, or any other conditions

Here you can use one of two approaches: Promise.all() or async/await.
Promise.all():
const promises = [
  Model1.findAll(),
  Model2.findAll(),
  Model3.findAll(),
  Modeln.findAll()
];
Promise.all(promises).then(([model1Data, model2Data, model3Data, modelnData]) => {
  // Promise.all resolves to an array of results in input order,
  // so destructure it back into named variables for the view
  res.render('view', { model1Data, model2Data, model3Data, modelnData });
});
Async/await:
// inside an async route handler
let model1Data = await Model1.findAll();
let model2Data = await Model2.findAll();
let model3Data = await Model3.findAll();
let modelnData = await Modeln.findAll();
res.render('view', {
  model1Data: model1Data,
  model2Data: model2Data,
  model3Data: model3Data,
  modelnData: modelnData
});
NOTE:
I would suggest using Promise.all() if the queries are not dependent
on each other, as it starts all of them at once instead of waiting
for the previous one to complete, as sequential async/await does.
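If you want named variables with the concurrency of Promise.all(), the two approaches combine naturally. A minimal sketch, assuming the same Sequelize-style models as above, inside an async handler:
const [model1Data, model2Data, model3Data, modelnData] = await Promise.all([
  Model1.findAll(),
  Model2.findAll(),
  Model3.findAll(),
  Modeln.findAll()
]);
res.render('view', { model1Data, model2Data, model3Data, modelnData });
All the queries start immediately; await simply suspends until the slowest one resolves.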

Related

How do I populate a JSON array with the data from my MySQL database?

So I have a database that I can query using ExpressJS and NodeJS. The database is MySQL, and the data within looks like this:
id: 1
username: 'someUsername'
email: 'randomEmail@email.email'
I want to somehow put the data from within the database into a JSON list, then map over it and show it to the user. Another option to achieve this, I reasoned, would be to populate the state of the app. I've thought of creating a class component, adding mapStateToProps, and assigning the data returned from the queries to the state, then using the data from the state in the React app itself. I am not so sure that would be effective.
This is a minimal code example for a component that fetches data from your backend on load and displays it using .map, without using Redux (mapStateToProps):
import React, { useState, useEffect } from 'react';

const DisplayData = () => {
  const [data, setData] = useState([]);

  const fetchData = async () => {
    // `url` is whatever backend endpoint returns your rows as JSON
    const results = await fetch(url).then(res => res.json());
    setData(results); // store the fetched rows, not the stale `data` state
  };

  useEffect(() => {
    fetchData();
  }, []);

  return (
    <div>
      {data.map(item => <p key={item.id}>{item.username}</p>)}
      <pre>
        {JSON.stringify(data, null, 4)}
      </pre>
    </div>
  );
};
Well, the data returned from the SQL query is itself an array of objects, so your answer lies in simply iterating over the returned data and assigning each row to whatever object you like.
let queryResults = returnedQueryData; // data returned from the SQL query
let jsonarray = {};
for (const row of queryResults) { // for...of yields the row objects; for...in would yield indexes
  jsonarray[row.id] = {
    id: row['id'],
    username: row['username'],
    email: row['email']
  };
}
To access the data in the resulting object, use
Object.values(jsonarray).forEach(e => {
  // Here e is one of the row objects stored above
});
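On the Express side, a minimal sketch of a route that runs the query and returns the rows as JSON, assuming the mysql2 package and a hypothetical users table (adjust the connection settings and table name to your setup):
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'mydb' });

app.get('/api/users', async (req, res) => {
  // rows come back as an array of plain objects: [{ id, username, email }, ...]
  const [rows] = await pool.query('SELECT id, username, email FROM users');
  res.json(rows);
});

app.listen(8081);
The React component above can then fetch('/api/users') and map over the JSON array directly.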

fetching data with react hook returns undefined on nested obj properties

I'm trying to display data that has been fetched, but I cannot seem to display nested object properties in React. Any ideas? If I log the data, I first get undefined, then the correct data.
My guess is that I need to wait for the data to be loaded and then display it, but it does work for the title, which is not in a nested object.
function SingleBeneficiary({ match }) {
  const [data, setData] = useState({ data: [] });
  const id = match.params.id;

  useEffect(() => {
    async function fetchData() {
      const response = await fetch(`http://localhost:8081/v1/beneficiary/${id}`);
      const jsonData = await response.json();
      setData(jsonData);
    }
    fetchData();
  }, []);

  return (
    {data.title} // works
    {data.address.careOf} // doesn't work
The data:
{
  "title": "myTitle",
  "address": {
    "careOf": "my adress"
  }
}
Can you try it like this?
I set the initial data to null, and in the return I check whether it is null.
If address can be null, an additional null check is required.
function SingleBeneficiary({ match }) {
  const [data, setData] = useState(null);
  const id = match.params.id;

  useEffect(() => {
    async function fetchData() {
      const response = await fetch(`http://localhost:8081/v1/beneficiary/${id}`);
      const jsonData = await response.json();
      setData(jsonData);
    }
    fetchData();
  }, []);

  return (
    <div>
      {data && (
        <div>
          <p>{data.title}</p>
          <p>{data.address.careOf}</p>
        </div>
      )}
    </div>
  );
}
You should check whether address exists before using careOf, because on the first render data will be undefined; only on the second render, after the API call, will it have the data.
{data.address && data.address.careOf}
For anyone who is having a similar issue (i.e. fetching data via an API where the first run shows the data as undefined, but after a manual refresh it works fine), here is a quick and sketchy addition you might consider, alongside 1. the "Inline If with Logical && Operator" method and 2. using useState to check whether the API loading is over. With those three, mine worked.
Try fetching the desired data in the previous page of your app; in this case, add the following lines in any page you'll see before "SingleBeneficiary".
const response = await fetch(`http://localhost:8081/v1/beneficiary/${id}`);
const jsonData = await response.json()
Maybe it has to do with the npm cache, but I'm not really sure what's going on.
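As a sketch of the loading-flag idea mentioned above (the loading state name is my own, not from the original post; useState/useEffect imports omitted as in the other snippets):
function SingleBeneficiary({ match }) {
  const [data, setData] = useState(null);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    async function fetchData() {
      const response = await fetch(`http://localhost:8081/v1/beneficiary/${match.params.id}`);
      setData(await response.json());
      setLoading(false); // flip the flag only after the data is in state
    }
    fetchData();
  }, []);

  if (loading) return <p>Loading...</p>; // nothing below runs until the fetch finishes
  return <p>{data.address.careOf}</p>;
}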
Replace
return (
  {data.title}
  {data.address.careOf}
)
with
return (
  {data?.title}
  {data?.address?.careOf}
)

Knex Transaction - Using Await - Not executing 2nd SQL Statement

I am using knex 0.19.4 on Node.js 10.x. I have 2 SQL statements - an insert and an update - which have to happen as a transaction.
// Using var so that the code below has access to these variables
var sqlStageInsert = kx('stage').insert({
  officeid: 'OFF000',
  taskid: 'T002',
});
var sqlTaskPcUpdate = kx('task')
  .update({ pc: 100 })
  .where('task.taskno', taskno)
  .limit(1);
1st try - the 2nd SQL statement (sqlTaskPcUpdate) is not getting executed
const sqlUpdateInsert = kx.transaction(function (trx) {
  sqlStageInsert.transacting(trx)
    .then(function () {
      console.log(sqlTaskPcUpdate.toString()); // This outputs the correct SQL
      return sqlTaskPcUpdate.transacting(trx);
    })
    .then(trx.commit)
    .catch(trx.rollback);
});
await sqlUpdateInsert;
2nd try - getting the error "Transaction query already complete". This is based on Commit/rollback a knex transaction using async/await.
await kx.transaction(async (trx) => {
  try {
    await sqlStageInsert.transacting(trx);
    await sqlTaskPcUpdate.transacting(trx);
    trx.commit();
  } catch (error) {
    trx.rollback();
    throw error;
  }
});
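Worth noting: in recent knex versions, when the transaction handler is an async function, knex commits automatically once the returned promise resolves, so an explicit trx.commit() can collide with that auto-commit and surface as "Transaction query already complete". A minimal sketch (not from the original post) that builds the queries inside the handler and lets knex manage commit/rollback, using the same table and column names as above:
await kx.transaction(async (trx) => {
  // Build the queries against trx so they run inside this transaction
  await trx('stage').insert({ officeid: 'OFF000', taskid: 'T002' });
  await trx('task').update({ pc: 100 }).where('task.taskno', taskno).limit(1);
  // No explicit trx.commit(): knex commits when this handler resolves
  // and rolls back automatically if it throws
});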
I would suggest inserting the data into the stage table first, then retrieving a common value that belongs to both tables to use in the where clause when updating the task table (assuming both tables contain a common column holding the same data).
Please note that per the knexjs.org site, knex.transaction() can use a returning statement on PostgreSQL when inserting/updating more than one table to maintain consistency; MySQL, however, does not support returning, which is why I'm passing values along in the code below.
FYR, http://knexjs.org/#Transactions
Please refer to the code snippet below:
db.transaction(trx => {
  trx.insert({
    officeid: 'OFF000',
    taskid: 'T002'
  })
  .into('stage')
  .then(() => {
    // look up the taskno of the row just staged
    return trx('stage').where({ taskid: 'T002' }).then(resp => resp[0].taskno);
  })
  .then(stageTaskno => {
    return trx('task')
      .update({ pc: 100 })
      .where({ taskno: stageTaskno }) // match on the task table's taskno column
      .limit(1);
  })
  .then(trx.commit)
  .catch(trx.rollback);
});
Hope this is helpful, cheers!

Is it possible to perform an action with `context` on the init of the app?

I'm simply looking for something like this
app.on('init', async context => {
...
})
Basically I just need to make two calls to the GitHub API, but I'm not sure there is a way to do it without using the API client inside the Context object.
I ended up using probot-scheduler
const createScheduler = require('probot-scheduler')

module.exports = app => {
  createScheduler(app, {
    delay: false
  })
  app.on('schedule.repository', context => {
    // this is called on startup and can access context
  })
}
I tried probot-scheduler but it didn't exist - perhaps removed in an update?
In any case, I managed to do it after lots of digging by using the actual app object - its .auth() method returns a promise resolving to the GitHubAPI interface:
https://probot.github.io/api/latest/classes/application.html#auth
module.exports = app => {
  const router = app.route() // Probot exposes an express router
  router.get('/hello-world', async (req, res) => {
    const github = await app.auth();
    const result = await github.repos.listForOrg({ 'org': 'org name' });
    console.log(result);
  })
}
.auth() takes the ID of the installation if you wish to access private data. If called with no arguments, the client can only retrieve public data.
You can get the installation ID by calling .auth() without parameters, and then listInstallations():
const github = await app.auth();
const result = await github.apps.listInstallations();
console.log(result);
You get back an array including the IDs, which you can then pass to .auth().
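Putting the two steps together, a minimal sketch of an init-time task (my own composition of the calls above, not from the original answer):
module.exports = app => {
  app.auth().then(async (appClient) => {
    // list every installation of the app, then authenticate as each one
    const { data: installations } = await appClient.apps.listInstallations();
    for (const installation of installations) {
      const installationClient = await app.auth(installation.id);
      // installationClient can now read that installation's private data
    }
  });
};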

How to do a very large query on sails-mongo?

I'm using Sails 0.11.2 with the latest sails-mongo adapter.
I have a very large database (gigabytes of data) of mainly timestamps and values, and I make queries on it using the blueprint API.
If I query using localhost:1337/datatable?limit=100000000000, Node.js hangs on 0.12 with a lot of CPU usage, and crashes on v4. It crashes in the toJSON function.
I've found out that I need to make multiple queries on my API, but I don't know how to proceed.
How can I make multiple queries that don't "explode" my server?
Update:
On the newer version 0.12.3 with the latest Waterline and sails-mongo, the queries go much more smoothly. The crashes in the cloud happened because I didn't have enough RAM to handle Sails.js and MongoDB on the same T2.micro instance.
I've moved the MongoDB server to an M3.medium instance, and now the server doesn't crash anymore, but it freezes. I'm using skip/limit and it works nicely for Sails.js, but for MongoDB it is a great waste of resources!
MongoDB runs the query internally with limit = skip + limit, then walks the cursor forward to the desired data before returning it. When you paginate a lot, you issue many of these internal queries, and each one grows larger than the last.
As this article explains, the way to get around the waste of resources in MongoDB is to avoid using skip and cleverly use _id as part of your query.
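To make the contrast concrete, a minimal sketch of the two paging styles (collection and variable names are hypothetical):
// skip-based paging: the server still scans over all the skipped documents
db.collection('readings').find({}).sort({'_id': 1}).skip(100000).limit(100);
// _id-based paging: the index seeks straight past the last _id already seen
db.collection('readings').find({'_id': {'$gt': lastSeenId}}).sort({'_id': 1}).limit(100);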
I did not use sails-mongo, but I did implement the idea above using the mongo driver in Node.js:
/**
* Motivation:
* Wanted to put together some code that used:
* - BlueBird (promises)
* - MongoDB NodeJS Driver
* - and paging that did not rely on skip()
*
* References:
* Based on articles such as:
* https://scalegrid.io/blog/fast-paging-with-mongodb/
* and GitHub public code searches such as:
* https://github.com/search?utf8=%E2%9C%93&q=bluebird+MongoClient+_id+find+limit+gt+language%3Ajavascript+&type=Code&ref=searchresults
* which yielded sample code hits such as:
* https://github.com/HabitRPG/habitrpg/blob/28f2e9c356d7053884107d90d04e28dde75fa81b/migrations/api_v3/coupons.js#L71
*/
var Promise = require('bluebird'); // jshint ignore:line
var _ = require('lodash');
var MongoClient = require('mongodb').MongoClient;
var dbHandleForShutDowns;
// option a: great for debugging
var logger = require('tracer').console();
// option b: general purpose use
//var logger = console;
//...
var getPage = function getPage(db, collectionName, query, projection, pageSize, processPage) {
  //console.log('DEBUG', 'filter:', JSON.stringify(query,null,2));
  projection = projection || {};
  projection['_id'] = true; // always include _id so paging can key off it
  return db
    .collection(collectionName)
    .find(query)
    .project(projection)
    .sort({'_id':1}).limit(pageSize)
    .toArray() // cursor methods return promises: http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html#toArray
    .then(function processPagedResults(documents) {
      if (!documents || documents.length < 1) {
        // stop - no data left to traverse
        return Promise.resolve();
      }
      else {
        if (documents.length < pageSize) {
          // stop - last page
          return processPage(documents);
        }
        else {
          return processPage(documents) // process the results of the current page
            .then(function getNextPage(){ // then go get the next page
              var last_id = documents[documents.length-1]['_id'];
              query['_id'] = {'$gt' : last_id};
              return getPage(db, collectionName, query, projection, pageSize, processPage);
            });
        }
      }
    });
};
//...
return MongoClient
  .connect(params.dbUrl, {
    promiseLibrary: Promise
  })
  .then(function(db) {
    dbHandleForShutDowns = db;
    return getPage(db, collectionName, {}, {}, 5, function processPage(pagedDocs){console.log('do something with', pagedDocs);})
      .finally(db.close.bind(db));
  })
  .catch(function(err) {
    console.error("ERROR", err);
    dbHandleForShutDowns.close();
  });
The following two sections show how the code manipulates _id and makes it part of the query:
.sort({'_id':1}).limit(pageSize)
// [...]
var last_id = documents[documents.length-1]['_id'];
query['_id'] = {'$gt' : last_id};
Overall code flow:
Let getPage() handle the work; you can set the pageSize and query to your liking:
return getPage(db, collectionName, {}, {}, 5, function processPage(pagedDocs){console.log('do something with', pagedDocs);})
Method signature:
var getPage = function getPage(db, collectionName, query, projection, pageSize, processPage) {
Process pagedResults as soon as they become available:
return processPage(documents) // process the results of the current page
Move on to the next page:
return getPage(db, collectionName, query, projection, pageSize, processPage);
The code will stop when there is no more data left:
// stop - no data left to traverse
return Promise.resolve();
Or it will stop when working on the last page of data:
// stop - last page
return processPage(documents);
I hope this offers some inspiration, even if it's not an exact solution for your needs.
1. Run the aggregate:
// lodash and the mongodb ObjectID are assumed available; required here for completeness
const _ = require('lodash')
const ObjectID = require('mongodb').ObjectID
const SailsMongoQuery = require('sails-mongo/lib/query/index.js')
const SailsMongoMatchMongoId = require('sails-mongo/lib/utils.js').matchMongoId

let ids = [] // filled below, consumed by step 2
let fn = model.find(query).paginate(paginate)
const criteria = fn._criteria
const queryLib = new SailsMongoQuery(criteria, {})
const queryOptions = _.omit(queryLib.criteria, 'where')
const where = queryLib.criteria.where || {}
const queryWhere = Object.keys(where).reduce((acc, key) => {
  const val = where[key]
  acc[key] = SailsMongoMatchMongoId(val) ? new ObjectID(val) : val
  return acc
}, {})
const aggregate = [
  { $match: queryWhere }
].concat(Object.keys(queryOptions).map(key => ({ [`$${key}`]: queryOptions[key] })))
// console.log('aggregate --->', JSON.stringify(aggregate, null, 2))
model.native((err, collection) => {
  if (err) return callback(err)
  collection.aggregate(aggregate, { allowDiskUse: true }).toArray(function (err, docs) {
    if (err) return callback(err)
    const pk = primaryKey === 'id' ? '_id' : primaryKey
    ids = docs.reduce((acc, doc) => [...acc, doc[pk]], [])
    callback()
  })
})
2. Run a Sails find by the collected ids:
query = Object.assign({}, query, { [primaryKey]: ids }) // check the primary key in your Sails model
fn = model.find(query) // .populate or another method
fn.exec((err, results) => { console.log('result ->>>>', err, results) })