I've recently started pushing my locally tested Node, MongoDB, and AngularJS sites to live environments hosted on DigitalOcean.
I'm seeing inconsistent behavior with AJAX/HTTP calls. On my local machine, I can make an HTTP request, update an AngularJS variable, and have it populate the HTML on the frontend. All works great! Testing this on my server with the same environment setup, the only time the variable loads new data is when I refresh the page.
For example (not my actual code):
Nodejs - app.js:
app.get('/getlist', requiredAuthentication, function(req, res) {
    list.find({'username': req.session.user.username}, function(err, list) {
        res.send(list);
    });
});
Angularjs - angular_app.js:
$scope.onClick = function (points, evt) {
    $http.get('/getlist').then(function(response) {
        $rootScope.list = response.data; // the payload is on response.data, not the response object itself
    });
};
Jade - home:
li(ng-repeat="row in list")
So like I said, this works perfectly on my local machine, but on my server I must refresh the page to load new data; it's as though the variable gets cached on the server.
Any ideas would help.
Thanks!
------- UPDATE - testing v0.1 --------
So after some intensive testing, here is what I've found, but still no fix.
If I add new data via an HTTP POST and go look in my Mongo DB, I see the new data. Then when I click the ng-click to retrieve the new data via HTTP, it doesn't return the new data and is stuck on the old.
If I leave the page open for 10 minutes and then click the button, it retrieves the new data. This is such a schlep.
Sounds like caching, but why does it work perfectly on my local machine?
Looking at the console > Network > Status, the code is 304; doesn't this mean nothing changed?
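A 304 means the server answered a conditional request with "not modified", so the browser reused its cached copy. If Express is in play (the app.get routing suggests it), one way to rule caching out is to disable ETags and mark the route as non-cacheable. A sketch, not the actual code:
app.set('etag', false); // stop generating ETags, the usual trigger for 304 responses

app.get('/getlist', requiredAuthentication, function (req, res) {
    res.set('Cache-Control', 'no-store'); // tell browsers and proxies never to cache this
    list.find({ 'username': req.session.user.username }, function (err, list) {
        res.json(list);
    });
});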
------- UPDATE - testing v0.2 --------
I've now tested the returned data with a log in the console, and I did the GET with jQuery's $.ajax. I'm getting the same issue/behaviour; it's stuck on the same collection of data, so my conclusion must be that Node.js is causing the issue.
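Incidentally, jQuery can rule out client-side caching as well: setting cache: false appends a timestamp query parameter so each GET is unique. A sketch for the test:
$.ajax({
    url: '/getlist',
    cache: false, // appends "_=<timestamp>" so a cached response can't be reused
    success: function (data) {
        console.log(data);
    }
});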
------- UPDATE - testing v0.3 --------
Okay, so I've completely stopped Mongo and switched everything to MySQL using node-mysql. Once again, on my local machine it works like a machine, and on my actual server it's laggy with reading new data.
I used Sequel Pro to access MySQL and started adding new entries to a table.
Opening my web URL in the browser immediately showed the new entries. But after that, adding or deleting entries only took effect after 10 minutes or so.
So my conclusion is that Node.js is caching heavily. Does anyone know more about this? Am I really the only one ever to experience this?
Try res.json to return data from Node:
app.get('/getlist', requiredAuthentication, function(req, res) {
    list.find({'username': req.session.user.username}, function(err, list) {
        res.json(list);
    });
});
My conclusion to this issue was that port 80 was somehow caching the content of the page and would only load new data with a page refresh.
I upgraded Node and used the latest Express, and I'm now running my web app on a custom port. All is working now.
I'm currently playing around with the Couchbase Sync Gateway and have built a demo app.
What is the intended behavior if a user logs in with the same username on a different device (which has an empty database), or if he deletes the local database?
I'm expecting that all the data from the server should get synced back to the clients.
Is this correct?
My problem is that if I delete the database or log in from a different device, nothing gets synced.
OK, I figured it out, and it's exactly how I thought it would be.
If I log in from a different device, I get all the data synced automatically.
My problem was the missing sync function. I thought it would use a default and route all documents to the public channel automatically.
I'm now using the following simple sync function:
"sync": `function (doc, oldDoc) {
channel('!');
access('demo#example.com', '*');
}`
This simply routes all documents to the public channel and grants my demo user access to it.
I don't think this should be used in production, but it's a good starting point for playing around.
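For production you would typically route documents to per-user channels instead of sending everything to the public channel. A rough sketch (the owner field is my own assumption, not part of the demo):
"sync": `function (doc, oldDoc) {
    requireUser(doc.owner);                 // only the owner may write this document
    channel('user-' + doc.owner);           // route it to the owner's private channel
    access(doc.owner, 'user-' + doc.owner); // grant the owner read access to that channel
}`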
Now everything is working fine.
Edit: I've now found the missing info:
https://docs.couchbase.com/sync-gateway/current/configuration-properties.html#databases-this_db-sync
If you don't supply a sync function, Sync Gateway uses the following default sync function
...
The channels property is an array of strings that contains the names of the channels to which the document belongs. If you do not include a channels property in a document, the document does not appear in any channels.
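So under the default sync function, a document has to name its own channels. A minimal illustrative document (field values made up):
{
    "type": "message",
    "text": "hello",
    "channels": ["public"]
}
Without the channels property, this document would not appear in any channel.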
I have read nearly 20 other posts about this particular error, but most seem to be issues with code calling Response.Close or similar, which is not our case. I understand that this error typically means the user browsed away from the web page or cancelled the request midway, but in our case we get the error without cancelling a request. I can observe the error after just a few seconds; the download simply fails in the browser (both Chrome and IE, so it's not browser specific).
We have a Web API controller that serves a file download.
[HttpGet]
public HttpResponseMessage Download()
{
    // Enumerates a directory and returns a read-only FileStream of the download
    var stream = dataProvider.GetServerVersionAssemblyStream(configuration.DownloadDirectory, configuration.ServerVersion);
    if (stream == null)
    {
        return new HttpResponseMessage(HttpStatusCode.NotFound);
    }

    var response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(stream)
    };
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
    response.Content.Headers.ContentDisposition.FileName = $"{configuration.ServerVersion}.exe";
    response.Content.Headers.ContentType = new MediaTypeHeaderValue(MediaTypeNames.Application.Octet);
    response.Content.Headers.ContentLength = stream.Length;
    return response;
}
Is there something incorrect we are doing in our Download method, or is there something we need to tweak in IIS?
This happens sporadically. I can't observe a pattern; it works sometimes and other times it fails repeatedly.
The file download is about 150MB
The download is initiated from a hyperlink on our web page, there is no special calling code
The download is over HTTPS (HTTP is disabled)
The Web Api is hosted on Azure
It doesn't appear to be timing out; it can happen after just a second or two, so it's not hitting the default 30-second timeout values.
I also noticed I can't seem to initiate multiple file downloads from the server at once, which is concerning. This needs to serve 150+ businesses with multiple simultaneous downloads, so I'm worried there is something we need to tweak in IIS or the Web API.
I was able to finally fix our problem. For us it turned out to be a combination of two things: 1) we had several memory leaks and CPU intensive code in our Web Api that was impacting concurrent downloads, and 2) we ultimately resolved the issue by changing MinBytesPerSecond (see: https://blogs.msdn.microsoft.com/benjaminperkins/2013/02/01/its-not-iis/) to a lower value, or 0 to disable. We have not had an issue since.
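For reference, minBytesPerSecond lives under webLimits in IIS's applicationHost.config. A sketch of the change described above (0 disables the minimum-throughput check entirely, so slow clients are not disconnected mid-download):
<system.applicationHost>
    <webLimits minBytesPerSecond="0" />
</system.applicationHost>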
I am trying to create a Node.js app to automatically update a webpage every few seconds with new data from a MySQL database. I have followed the information on this site: http://www.gianlucaguarini.com/blog/push-notification-server-streaming-on-a-mysql-database/
The code on this site does indeed work, but upon further testing it keeps running the "handler" function, and therefore executing the readFile function, for each row of the database processed.
I am in the process of learning Node.js, but cannot understand why the handler function keeps getting called. I would only like it to get called once per connection. Constantly reading the index.html file like this seems very inefficient.
The reason I know the handler function keeps getting called is that I placed a console.log("Hello"); statement in the handler function, and it keeps outputting that line to the console.
Do you provide the image URLs that the client.html is looking for? Here's what I think is happening:
The client connects to your server via Socket.IO and retrieves the user information (user_name, user_description, and user_img). The client then immediately tries to load an image using the user_img URL. The author's server code, however, doesn't appear to support serving these pictures; instead it just returns the same client.html file for every request. This is why it appears to be calling handler over and over again - it's trying to load a picture for every user.
I would recommend using the express module in node to serve static files instead of trying to do it by hand. Your code would look something like this:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + "/public")); // serve files in ./public (client.html, user photos)
That essentially says to serve any static files they request from the public folder. In that folder you will put client.html as well as the user photos.
I am about to develop an application where employees go out to service machines at customer premises. They need to fill in a service card using a tablet or other mobile device.
In case there is no Internet connection, I am thinking about using HTML5 offline storage, mainly IndexedDB, to store the service card (web form) data locally and sync it at the office, where Internet exists. The sync would be with a MySQL database.
So the question: is it possible to sync IndexedDB with MySQL? I have never worked with IndexedDB; I am only doing research and saw it has potential.
Web SQL is deprecated; otherwise, it could have been the closer solution.
Any other alternatives in case the above is difficult or outside the standard?
Your opinions are highly appreciated.
Thanks.
This is definitely doable. I am only just starting to learn IndexedDB these last couple of days, but this is how I would see it working (a rough sketch follows the list):
Website knows it's in offline mode somehow
Clicking submit on the form saves the data into IndexedDB
Later, when the laptop (or whatever) is back online or on the intranet and can talk to the main server, it sends all the IndexedDB rows to the server via an AJAX call, to be stored in MySQL
IndexedDB is cleared
Repeat
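A minimal sketch of that flow, assuming a hypothetical /api/sync endpoint whose server side inserts the rows into MySQL (names and endpoint are illustrative only):
// Open (or create) the local queue database
function openDb() {
    return new Promise(function (resolve, reject) {
        var req = indexedDB.open('serviceCards', 1);
        req.onupgradeneeded = function () {
            req.result.createObjectStore('pendingCards', { autoIncrement: true });
        };
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
}

// Step 2: save a submitted form while offline
async function saveCard(card) {
    var db = await openDb();
    db.transaction('pendingCards', 'readwrite').objectStore('pendingCards').add(card);
}

// Steps 3-4: push queued rows to the server, then clear the local queue
async function syncPending() {
    var db = await openDb();
    var cards = await new Promise(function (resolve, reject) {
        var req = db.transaction('pendingCards').objectStore('pendingCards').getAll();
        req.onsuccess = function () { resolve(req.result); };
        req.onerror = function () { reject(req.error); };
    });
    if (cards.length === 0) return;
    var res = await fetch('/api/sync', { // hypothetical endpoint that writes to MySQL
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(cards)
    });
    if (res.ok) {
        // open a fresh transaction; the read transaction auto-committed during the fetch
        db.transaction('pendingCards', 'readwrite').objectStore('pendingCards').clear();
    }
}

window.addEventListener('online', syncPending); // fire the sync when connectivity returns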
A little bit late, but I hope it helps.
This is possible, though I'm not sure it is the best choice. I can tell you that I am building a web app where I have a MySQL database and the app must work offline and keep track of the data. I tried using IndexedDB and it was very confusing for me, so I implemented DexieJs, a minimalistic and straightforward API to communicate with IndexedDB in an easy way.
Now the app works online; if the Internet goes down, it works offline until it gets Internet back, and then it uploads the data to the MySQL database. One of the solutions I read about for saving the data was to store the JSON object in a TEXT field by passing it to JSON.stringify(), and once you need the data back, JSON.parse() it.
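As a rough illustration of the DexieJs approach described above (the table name, synced flag, and /api/sync endpoint are my own inventions):
var db = new Dexie('serviceCards');
db.version(1).stores({ cards: '++id, synced' }); // auto-increment key, indexed 'synced' flag

// While offline: stringify the form data and queue it locally
async function saveCard(formData) {
    await db.cards.add({ payload: JSON.stringify(formData), synced: 0 });
}

// When back online: push unsynced rows to the server, which inserts them into MySQL
async function syncCards() {
    var pending = await db.cards.where('synced').equals(0).toArray();
    if (pending.length === 0) return;
    var res = await fetch('/api/sync', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(pending)
    });
    if (res.ok) {
        await db.cards.where('synced').equals(0).modify({ synced: 1 }); // mark as synced
    }
}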
These resources were my motivation for building the app that way (also, we couldn't change the database):
IndexedDB Tutorial
Sync IndexedDB with MySQL
Connect node to mysql
[Update for 2021]
For anyone reading this, I can recommend checking out AceBase.
AceBase is a realtime database that enables easy storage and synchronization between browser and server databases. It uses IndexedDB in the browser, and its own binary db format or SQL Server / SQLite storage on the server side. MySQL storage is also on the roadmap. Offline edits are synced upon reconnecting and clients are notified of remote database changes in realtime through a websocket (FAST!).
On top of this, AceBase has a unique feature called "live data proxies" that allows all changes to in-memory objects to be persisted and synced to local and server databases, so you can forget about database coding altogether and program as if you're only using local objects, no matter whether you're online or offline.
The following example shows how to create a local IndexedDB database in the browser, how to connect to a remote database server that syncs with the local database, and how to create a live data proxy that eliminates further database coding altogether.
const { AceBaseClient } = require('acebase-client');
const { AceBase } = require('acebase');
// Create local database with IndexedDB storage:
const cacheDb = AceBase.WithIndexedDB('mydb-local');
// Connect to server database, use local db for offline storage:
const db = new AceBaseClient({ dbname: 'mydb', host: 'db.myproject.com', port: 443, https: true, cache: { db: cacheDb } });
// Wait for remote database to be connected, or ready to use when offline:
db.ready(async () => {
// Create live data proxy for a chat:
const emptyChat = { title: 'New chat', messages: {} };
const proxy = await db.ref('chats/chatid1').proxy(emptyChat); // Use emptyChat if chat node doesn't exist
// Get object reference containing live data:
const chat = proxy.value;
// Update chat's properties to save to local database,
// sync to server AND all other clients monitoring this chat in realtime:
chat.title = `Changing the title`;
chat.messages.push({
from: 'ewout',
sent: new Date(),
text: `Sending a message that is stored in the database and synced automatically was never this easy! ` +
`This message might have been sent while we were offline. Who knows!`
});
// To monitor realtime changes to the chat:
chat.onChanged((val, prev, isRemoteChange, context) => {
if (val.title !== prev.title) {
console.log(`Chat title changed to ${val.title} by ${isRemoteChange ? 'someone else' : 'us'}`);
}
});
});
For more examples and documentation, see AceBase realtime database engine at npmjs.com
Since about an hour ago, I can't retrieve file content via the downloadUrl attribute.
Each time I try, the API answers with a 401 (unauthorized) error.
Here's the code used: https://gist.github.com/arnaudbreton/5409015
Credentials are stored in the GAE datastore and successfully retrieved/refreshed.
The first call, to the file endpoint, works, but not the second call to download the content.
It was working this morning.
I have tried different things so far:
- Revoked the client secret (found as a solution in another thread)
- Created a new client to test
- Disconnected my app from Drive and accepted it again
Nothing seems to solve my issue.
Thanks for your help.
A fix/rollback is in progress, should be back to normal soon.
You can use the alternateLink and webContentLink properties of each file instead.
I got stuck on the same issue a day back, using downloadUrl to get the content, but got it working with webContentLink.
var request = gapi.client.drive.files.list();
request.execute(function (resp) {
    // files.list returns the files in resp.items; each file object carries the links
    resp.items.forEach(function (file) {
        console.log(file.alternateLink);  // opens the file in the Drive UI
        console.log(file.webContentLink); // direct download link for the file content
    });
});