I have a pretty heavyweight query on the server that results in a new page render, and I'd like to pass some of the results of the query along to the client (as a JavaScript array of objects). This is basically so I don't have to make a separate JSON request later for the same content (which is mostly static). The data will be useful eventually, but not initially, so I didn't put it directly into the document.
app.get('/expensiveCall', function(req, res) {
  // do expensive call
  var data = veryExpensiveFunction();
  res.render('expensiveCall.jade', {
    locals: {
      data: data
    }
  });
});
data is an array of objects, and only some of them are used initially. I'd like to pass over either the entirety of data or some subset (depending on the situation). My Jade looks like normal Jade, but I would like to include something like
<script type="text/javascript">
var data = #{data};
</script>
but this doesn't work (it's an array of objects).
You can't inline a JS object like that, but you can JSON.stringify it first. Note the unescaped !{} interpolation below; the escaped #{} would HTML-encode the quotes in the JSON:
<script type="text/javascript">
var data = !{JSON.stringify(data)};
</script>
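For illustration, if data were [{ "id": 1 }] (a made-up value), the rendered HTML would come out as valid inline JavaScript:
<script type="text/javascript">
var data = [{"id":1}];
</script>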
Shopify stores load their apps within an anonymous async function, as seen below. I build themes, and the biggest performance issue is the number of apps loaded on a page (10+ all the time).
I want to build a small extension that counts the number of strings in the urls variable below. Super hacky options happily accepted.
<script>
(function() {
  function asyncLoad() {
    var urls = [];
  };
  if (window.attachEvent) {
    window.attachEvent('onload', asyncLoad);
  } else {
    window.addEventListener('load', asyncLoad, false);
  }
})();
</script>
Arrays have a length. Since your variable urls is an array, you can probably get away with urls.length as a count of the number of elements in it.
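If your code runs inside that closure (for example, you edit the theme snippet itself), urls.length is all you need. From outside, urls is trapped in the loader's closure, so here is a hacky sketch that instead counts the script tags the loader injects after load (the zero-millisecond delay is an assumption; tune it to your page):
window.addEventListener('load', function() {
  // give the async loader a moment to inject its script tags
  setTimeout(function() {
    var count = document.querySelectorAll('script[src]').length;
    console.log('script tags with a src: ' + count);
  }, 0);
});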
I am new to the Office 365 Word JavaScript API. I am trying to send a JSON object to a dialog from the parent using the Dialog API, but I couldn't find a good solution for that. I have found it is possible to send a JSON object from the dialog to the parent using the code snippet below.
Office.context.ui.messageParent
Can someone give me a good solution with a code snippet to solve this problem?
You can try something like this.
In the parent web page (the actual add-in) JavaScript code:
Office.context.ui.displayDialogAsync(url, options, function(result) {
  var dialog = result.value;
  dialog.addEventHandler(Office.EventType.DialogMessageReceived, function(args) {
    dialog.close();
    var json = JSON.parse(args.message);
    // do whatever you need to do...
  });
});
NOTE: for the sake of simplicity I omitted error checks for the case where the callback function receives an error result. You should take care of that as well.
The web page that is opened at url will have a function for pushing the JSON object back to the parent after serializing it to a string:
var asString = JSON.stringify(myObj);
Office.context.ui.messageParent(asString);
Of course the webpage opened in the dialog window must also reference Office.js.
Here is the documentation link for this so-called Dialog API: https://dev.office.com/reference/add-ins/shared/officeui
Edit:
The original question is about sending data from the parent to the dialog.
If you need to send info to the page opened via the Dialog API, I suggest you append query parameters to url. You can stringify your JSON object and pass it that way, though this is not very clean.
Standardized way to serialize JSON to query string?
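A minimal sketch of the query-parameter approach (the dialog URL and the payload are made up for illustration; URLSearchParams may need a polyfill in older IE-based Office hosts):
// Parent side: encode the object into the dialog URL
var payload = { user: 'abc' }; // hypothetical data
var dialogUrl = 'https://myaddin.example/dialog.html' +
    '?data=' + encodeURIComponent(JSON.stringify(payload));
Office.context.ui.displayDialogAsync(dialogUrl, { width: 30, height: 40 }, callback);

// Dialog side: read it back out of location.search
var params = new URLSearchParams(window.location.search);
var received = JSON.parse(params.get('data'));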
You can send JSON data or an object back to your parent easily.
This code snippet should be in your child (dialog) page's JS file.
(function () {
  "use strict";
  // The Office initialize function must be run each time a new page is loaded
  Office.initialize = function (reason) {
    $(document).ready(function () {
      $('#btnLogin').click(submit);
    });
  };
  function submit() {
    // Get and create the data object.
    var email = $('#txtEmail').val();
    var password = $('#txtPassword').val();
    var data = {
      email: email,
      password: password
    };
    // Create the JSON and send it to the parent.
    var json = JSON.stringify(data);
    Office.context.ui.messageParent(json);
  }
})();
See here: https://dev.office.com/docs/add-ins/develop/dialog-api-in-office-add-ins
Find section "Passing information to the dialog box".
Two primary ways:
Add query parameters to the URL
Store the information somewhere that is accessible to both the host window and dialog box, e.g. local storage
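A minimal sketch of the second option (the 'dialogData' key is made up; note that the host page and the dialog must be served from the same origin for them to share storage):
// Host window, before opening the dialog
localStorage.setItem('dialogData', JSON.stringify({ user: 'abc' }));

// Dialog box, after it loads
var data = JSON.parse(localStorage.getItem('dialogData'));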
I have created a webpage with Node JS, Express JS, Mongoose and D3 JS.
The webpage contains 3 pull-down menus: Department, Employee, Week.
The usage of the webpage is as follows:
When 'Department' is selected, the 'Employee' menu is filtered to show only employees from the selected 'Department'. The same goes for 'Week' after 'Employee' is selected.
After the 3 menus are selected and 'PLOT' button is clicked, a line chart (using d3.js) will be plotted to show the employee working hours for the month.
MongoDB JSON
{
  dep: '1',
  emp: 'Mr A',
  week: 1,
  hrs: [
    {1,8},
    {2,10},
    ...
  ]
}
Here are the relevant snippets of my code:
routes.js
// Connect the required database and collection
var dataAll = require('./models/dataModel');
module.exports = function(app) {
  app.get('/api/data', function(req, res) {
    dataAll.find({}, {}, function(err, dataRes) {
      res.json(dataRes);
    });
  });
  app.get('*', function(req, res) {
    res.sendfile('./index.html');
  });
};
index.html
... // More code
<div id="menuSelect1"></div>
<div id="menuSelect2"></div>
<div id="menuSelect3"></div>
...
<script src="./display.js" type='text/javascript'></script>
... // More code
display.js
// Menu (Department, Employee, Week) information is gathered here
queue()
  .defer(d3.json, "/api/data")
  .await(createPlot);
function createPlot(error, plotData) {
  var myData = plotData;
  var depData = d3.nest()
    .key(function(d) { return d.dep; })
    .rollup(function(v) { return v.length; })
    .entries(myData);
  var selectField1 = d3.select('#menuSelect1')
    .append("select")
    .on("change", menu1Change)
    .selectAll("option")
    .data(depData)
    .enter()
    .append("option")
    .attr("value", function(d) { return d.key; })
    .text(function(d) { return d.key; });
  function menu1Change() {
    // Filter the next menu with the option chosen in this menu
    ... // More code
    var selectedVal = this.options[this.selectedIndex].value;
    var empData = dataSet.filter(function(d) { return d.emp === selectedVal; });
    ... // More code
  }
  ... // More code
}
Problem:
Functionally, it is working as expected. The problem is that as the database grows, the page becomes very slow to load (minutes). I believe it is due to the route that retrieves all the data (.find({}, {})), but I thought I needed it because I use it in 'display.js' to filter my menu options.
Is there a better way to do this that resolves the performance issue?
It is rarely necessary to send all the data to the client. In fact, I haven't seen an API with a single endpoint that returns the entire database to everyone.
It's hard to give you any specific solution without knowing what your data looks like, how large it is, how fast it grows, etc. The performance issues may be related to querying the database, to large data transfer, or to the browser parsing a large JSON payload.
In any case, you shouldn't send your entire database to the client with no limits. Usually this is implemented with a number of records to skip and a maximum number of records to return.
Some frameworks, like LoopBack, do it for you; see:
https://docs.strongloop.com/display/public/LB/Skip+filter
https://docs.strongloop.com/display/public/LB/Limit+filter
If you're using Express then you'll have to implement the limits yourself.
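A minimal sketch of such limits in the existing /api/data route (the default of 50 and the cap of 500 are arbitrary assumptions):
app.get('/api/data', function(req, res) {
  // read paging parameters from the query string, with assumed defaults
  var skip = parseInt(req.query.skip, 10) || 0;
  var limit = Math.min(parseInt(req.query.limit, 10) || 50, 500); // cap at 500
  dataAll.find({})
    .skip(skip)
    .limit(limit)
    .exec(function(err, dataRes) {
      if (err) return res.status(500).json({ error: 'query failed' });
      res.json(dataRes);
    });
});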
To test the bottleneck, you can run the Mongo shell and try the .find({},{}) query from there to see how long it takes. You can see the transfer size and time in the browser's developer tools. This may help you narrow down the place that needs the most attention, but returning the entire database, no matter how large it is, is already a good place to start.
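For example, a rough timing check in the mongo shell (the collection name datas is an assumption; substitute your own):
// time a full fetch in the mongo shell
var t0 = new Date();
var count = db.datas.find({}, {}).toArray().length;
print(count + ' docs in ' + (new Date() - t0) + ' ms');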
I am trying to render some JSON with Handlebars on my website. I don't get any errors, but I also don't get any content. I've built my own REST endpoint to return a JSON response, and I think my problem might be there somewhere, but you can see the response in the code.
http://codepen.io/anon/pen/Czdxh
$(document).ready(function() {
  var raw_template = $('#post-template').html();
  // Compile that into a Handlebars template
  var template = Handlebars.compile(raw_template);
  // Retrieve the placeholder where the posts will be displayed
  var placeHolder = $("#all-posts");
  // Fetch all blog post data from the server in JSON
  $.getJSON("https://instapi-motleydev.rhcloud.com/liked", function(data) {
    $.each(data, function(index, element) {
      // Generate the HTML for each post
      var html = template(element);
      // Render the posts into the page
      placeHolder.append(html);
    });
  });
});
Thanks for any help!
The problem was I was getting an array response from the server and needed to adapt my template to use the {{#each this}} syntax. I also switched my getJSON to a simple get, looped over the response that way, and tossed the $.each handler.
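For reference, a minimal sketch of such a template (the title and body field names are assumptions about the response shape):
<script id="post-template" type="text/x-handlebars-template">
  {{#each this}}
    <div class="post">
      <h2>{{title}}</h2>
      <p>{{body}}</p>
    </div>
  {{/each}}
</script>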
I have to obtain a JSON that is embedded inside a script tag in a certain page... so I can't use regular scraping techniques, like cheerio.
The easy way out: write the file (download the page) to the server, then read it using string manipulation to extract the JSON objects (there are several), work on them, and happily save them to my db.
The thing is that I'm too new to Node.js and can't get the code to work. I think I'm trying to read the file before it is fully written, and when I read it too early I get [Object Object]...
Here's what I have so far...
var http = require('http');
var fs = require('fs');
var request = require('request');
var localFile = 'tmp/scraped_site_.html';
var url = "siteToBeScraped.com/?searchTerm=foobar"
// writing
var file = fs.createWriteStream(localFile);
var request = http.get(url, function(response) {
response.pipe(file);
});
//reading
var readedInfo = fs.readFileSync(localFile, function (err, content) {
callback(url, localFile);
console.log("READING: " + localFile);
console.log(err);
});
So first of all, I think you should understand what went wrong.
The http request operation is asynchronous. This means that the callback code in http.get() will run sometime in the future, but fs.readFileSync, due to its synchronous nature, will execute and complete even before the http request has actually been sent to the background thread that will execute it, since they are both invoked in what is commonly known as the same tick. Also, fs.readFileSync returns a value and does not use a callback.
Even if you replace fs.readFileSync with fs.readFile, the code still might not work properly, since the readFile operation might execute before the http response is fully read from the socket and written to disk.
I strongly suggest reading this Stack Overflow question and/or Understanding the node.js event loop.
The correct place to invoke the file read is when the response stream has finished writing to the file, which would look something like this:
var request = http.get(url, function(response) {
  response.pipe(file);
  file.once('finish', function() {
    fs.readFile(localFile, 'utf8', function(err, data) { // pick the encoding you need
      // do something with the data if there is no error
    });
  });
});
Of course this is a very raw and not recommended way to write asynchronous code, but that is another discussion altogether.
Having said that, if you download a file, write it to disk, and then read it all back into memory for manipulation, you might as well forgo the file part and just read the response into a string right away. Your code would then look something like this (it can be implemented in several ways):
var request = http.get(url, function(response) {
  var data = '';
  function read() {
    var chunk;
    while ((chunk = response.read()) !== null) {
      data += chunk;
    }
  }
  response.on('readable', read);
  response.on('end', function() {
    console.log('[%s]', data);
  });
});
What you really should do, IMO, is create a transform stream that strips away the data you need from the response while not consuming too much memory, yielding this more elegant-looking code:
var request = http.get(url, function(response) {
  response.pipe(yourTransformStream).pipe(file);
});
Implementing this transform stream, however, might prove slightly more complex. So if you're a node beginner and you don't plan on downloading big files or lots of small files, then maybe loading the whole thing into memory and doing string manipulation on it is simpler.
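A minimal sketch of that string manipulation, assuming the page embeds the object as var data = {...}; inside a script tag and that the embedded literal is valid JSON (both markers below are assumptions about the page's markup):
// extract a JSON object embedded as "var data = {...};" in the HTML
function extractJson(html) {
  var startMarker = 'var data = ';
  var endMarker = '};';
  var start = html.indexOf(startMarker);
  if (start === -1) return null;
  start += startMarker.length;
  var end = html.indexOf(endMarker, start);
  if (end === -1) return null;
  return JSON.parse(html.slice(start, end + 1)); // keep the closing brace
}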
For further information about transformation streams:
node.js stream api
this wonderful guide by substack
this post from strongloop
Lastly, see if you can use any of the million node.js crawlers already out there :-) Take a look at these search results on npm.
According to the http module documentation, 'get' does not return the response body.
This is modified from the request example on the same page.
What you need to do is process the response within the callback (function) passed into http.request, so it can be called when it is ready (async).
var http = require('http')
var fs = require('fs')
var localFile = 'tmp/scraped_site_.html'
var file = fs.createWriteStream(localFile)
var req = http.request('http://www.google.com.au', function(res) {
  res.pipe(file)
  res.on('end', function() {
    file.end()
    fs.readFile(localFile, function(err, buf) {
      console.log(buf.toString())
    })
  })
})
req.on('error', function(e) {
  console.log('problem with request: ' + e.message)
})
req.end();
EDIT
I updated the example to read the file after it is created. This works by having a callback on the end event of the response, which closes the pipe; the file can then be reopened for reading. Alternatively, you can use
res.on('data', function(chunk){...})
to process the data as it arrives, without putting it into a temporary file.
My impression is that you're deserializing a JS object from JSON by reading it from a stream that's downloading a file containing HTML. This is doable yet hard. It's difficult to know when your search expression has been found, because if you parse the chunks as they come in, you never know whether you received only partial context, and you could miss what you're looking for because it was split into two or more parts which were never analyzed as a whole.
You could try something like this:
var req = http.request('u/r/l', function(res) {
  res.on('data', function(data) {
    // parse data as it comes in
  });
});
req.end();
This allows you to read the data as it comes in. You can save it to disk or a db, or even parse it, if you accumulate the contents of the script tags into a single string and then parse the objects out of that.