AngularJS, $resource and application cache

I'm writing an application using Angular and Spring. It has to be able to work offline. After doing some research, I found that the application cache is the best way to go, since I need to cache all the .css and .js files. The problem is that I can't get the data returned by Spring and requested through the $resource object to be cached. When I turn off the server, the static assets are cached, but I get a "GET error" in Chrome's console about the .json resource it can't retrieve.
angular.module('MonService', ['ngResource']).
    factory('Projet', function($resource) {
        return $resource('json/accueil');
    });
I've tried something such as saving the response manually in a .json file, then caching this file as well and using it as the source for the $resource, but that seems long and complicated...
Or using localStorage, something like:
var cache, AmettreDansCache;
var donne = {};
cache = window.localStorage.getItem('projets');
if (!cache) {
    // $resource returns an empty placeholder that is filled in
    // asynchronously, so wait for $promise before serialising it.
    AmettreDansCache = $resource('json/accueil').get();
    AmettreDansCache.$promise.then(function (data) {
        window.localStorage.setItem('projets', JSON.stringify(data));
    });
    return AmettreDansCache;
} else {
    return angular.extend(donne, JSON.parse(cache));
}
I don't think this is the right approach; anyway, what's the way to do it using the application cache only?

There is a module built on top of $resource that does caching, for both reading and writing data. You should check it out. It will keep a copy of your data in the client's browser, and when a save fails (due to being offline) it will save the data locally and keep retrying the save.
https://github.com/goodeggs/angular-cached-resource
There is also a small article about the module:
http://bites.goodeggs.com/open_source/angular-cached-resource/
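For context, a minimal sketch of how it could slot into the factory from the question, based on the module's README (the 'ngCachedResource' module name, the $cachedResource factory, and the 'projets' cache key are taken from that README, so double-check them there):
// Drop-in replacement for the $resource factory from the question;
// $cachedResource mirrors $resource but takes a cache key first.
angular.module('MonService', ['ngCachedResource']).
    factory('Projet', function($cachedResource) {
        // Responses are mirrored into localStorage under the 'projets'
        // key, so reads keep working while the server is unreachable,
        // and writes are queued and retried once back online.
        return $cachedResource('projets', 'json/accueil');
    });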

Related

Why do the weather samples in FetchData seem to get cached for the sample Blazor app?

The Blazor app in Visual Studio uses an Http.GetFromJsonAsync call to get the data for the weather forecasts from a JSON file in wwwroot.
When I change the data in the file, I still see the same data in the table.
When I copy the file and change the code to use the new filename, I get the changed results.
Is there some caching happening with wwwroot files? I've tried a hard refresh; that doesn't make a difference, but changing browsers does. I know that Blazor caches the framework files, but is this happening to all of wwwroot, and how do I change this behaviour?
Thanks in advance.
The FetchData sample page (from the new blazorwasm template) retrieves its data when the component initializes:
protected override async Task OnInitializedAsync()
{
    forecasts = await Http.GetFromJsonAsync<WeatherForecast[]>("sample-data/weather.json");
}
When you navigate away from this page and come back, OnInitializedAsync runs again and a new request is made.
But because this is a GET request, the browser can deliver the answer from its cache.
There are some ways to avoid caching on Blazor GET requests; learn about them here: Bypass HTTP browser cache when using HttpClient in Blazor WebAssembly
Also, you can use the simple trick of adding a random string to the query string:
protected override async Task OnInitializedAsync()
{
    var randomid = Guid.NewGuid().ToString();
    var url_get = $"sample-data/weather.json?{randomid}";
    forecasts = await Http.GetFromJsonAsync<WeatherForecast[]>(url_get);
}
In short, it seems to get cached because a GET request can be cached by the browser, and it is the browser that retrieves the data.

Interrupted downloads when downloading a file from Web Api (remote host closed error 0x800704CD)

I have read nearly 20 other posts about this particular error, but most seem to be issues with the code calling Response.Close or similar, which is not our case. I understand that this particular error typically means that a user browsed away from the web page or cancelled the request midway, but in our case we are getting this error without cancelling a request. I can observe the error after just a few seconds; the download simply fails in the browser (both Chrome and IE, so it's not browser-specific).
We have a web api controller that serves a file download.
[HttpGet]
public HttpResponseMessage Download()
{
    // Enumerates a directory and returns a read-only FileStream of the download
    var stream = dataProvider.GetServerVersionAssemblyStream(configuration.DownloadDirectory, configuration.ServerVersion);
    if (stream == null)
    {
        return new HttpResponseMessage(HttpStatusCode.NotFound);
    }

    var response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(stream)
    };
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
    response.Content.Headers.ContentDisposition.FileName = $"{configuration.ServerVersion}.exe";
    response.Content.Headers.ContentType = new MediaTypeHeaderValue(MediaTypeNames.Application.Octet);
    response.Content.Headers.ContentLength = stream.Length;
    return response;
}
Is there something incorrect we are doing in our Download method, or is there something we need to tweak in IIS?
This happens sporadically; I can't observe a pattern. It works sometimes, and other times it fails repeatedly.
The file download is about 150MB
The download is initiated from a hyperlink on our web page, there is no special calling code
The download is over HTTPS (HTTP is disabled)
The Web Api is hosted on Azure
It doesn't appear to be timing out; it can happen after just a second or two, so it's not hitting the default 30-second timeout values.
I also noticed I can't seem to initiate multiple file downloads from the server at once, which is concerning. This needs to serve 150+ businesses and multiple simultaneous downloads, so I'm worried there is something we need to tweak in IIS or the Web API.
I was finally able to fix our problem. For us it turned out to be a combination of two things: 1) we had several memory leaks and CPU-intensive code in our Web API that were impacting concurrent downloads, and 2) we ultimately resolved the issue by changing MinBytesPerSecond (see: https://blogs.msdn.microsoft.com/benjaminperkins/2013/02/01/its-not-iis/) to a lower value, or 0 to disable it. We have not had an issue since.

All data is gone on page reload. Is there any way to avoid that?

I have developed a dashboard application in AngularJS which has search functionality and multiple views with lots of different data coming from that search.
It's a single-page application, so I'm facing an issue where, on page reload, all the data is gone and the view is blank.
Is there any way to solve this issue, or any approach that would help me create a single-page application that keeps the same data across page reloads?
Any suggestion will be appreciated.
(Screenshots: the search page, the data after a search, and the data after a reload.)
You can use sessionStorage to set and get the data.
Step 1:
Create a factory service that will save and return the saved session data based on a key.
app.factory('storageService', ['$rootScope', function($rootScope) {
    return {
        get: function(key) {
            return sessionStorage.getItem(key);
        },
        save: function(key, data) {
            sessionStorage.setItem(key, data);
        }
    };
}]);
Step 2:
Inject the storageService dependency into the controller to set and get the data from session storage.
app.controller('myCtrl', ['storageService', function(storageService) {
    // Save session data via storageService on a successful response from the $http service.
    storageService.save('key', 'value');

    // Get saved session data from storageService on page reload.
    var sessionData = storageService.get('key');
}]);
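One caveat: sessionStorage stores only strings, so if the value is an object you will want to serialise it on the way in and parse it on the way out. A small variation of the same factory, assuming your data is JSON-serialisable:
app.factory('storageService', [function() {
    return {
        get: function(key) {
            // Parse back into an object; null when nothing is stored yet.
            var raw = sessionStorage.getItem(key);
            return raw ? JSON.parse(raw) : null;
        },
        save: function(key, data) {
            // Stringify so objects and arrays survive the round trip.
            sessionStorage.setItem(key, JSON.stringify(data));
        }
    };
}]);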
You need to save the data before changing the state of the application / moving to another page or route.
So, save that data using one of:
Angular services (save the data, change route, come back, check the data in the service and reassign your variables from it).
Local storage (same flow, but reading the data back from local storage).
$rootScope (set the data on $rootScope and, after coming back, check the variables and reassign).
Your problem is not saving the data, but saving the state in your URL. As it is, you only save the "step" but no other information, so when you reload the page (refresh, close and re-open the browser, open a bookmark, share the link...) you don't have all the information needed to restore the full page.
Add the relevant bits to the URL (as you have a single-page app, probably in the fragment identifier). Use that information to load the data, rather than relying on other mechanisms to pass data between the "pages". A sketch of this follows.
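As an illustration, a minimal sketch of that idea with AngularJS's $location service; the q parameter and the loadResults helper are hypothetical stand-ins for your own search wiring:
app.controller('searchCtrl', ['$scope', '$location', function($scope, $location) {
    // On load, restore state from the URL, e.g. #/search?q=foo after a reload.
    var initialQuery = $location.search().q;
    if (initialQuery) {
        $scope.query = initialQuery;
        loadResults(initialQuery); // hypothetical helper that re-runs the search
    }

    $scope.onSearch = function(query) {
        // Record the state in the URL so reloads and bookmarks can restore it.
        $location.search('q', query);
        loadResults(query);
    };
}]);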
Get the data again on page refresh, with something like:
$rootScope.$on('$stateChangeStart', function() { // ui-router's state-change event
    // here make a call to the source and try to get the data
    $http({ method: 'GET', url: '/your/endpoint' }); // placeholder endpoint
});
Use localStorage, or temporarily save the data for your users to a database on the server and get it back with an API call (localStorage has a limit of about 10 MB).
You can have your code try to retrieve the localStorage values first, if they exist, as sketched below.
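A minimal sketch of that cache-first pattern; the 'searchResults' key and the fetchFromServer function are placeholders for your own storage key and API call:
function getSearchResults() {
    // Serve cached results when we have them; hit the server otherwise.
    var cached = window.localStorage.getItem('searchResults');
    if (cached) {
        return Promise.resolve(JSON.parse(cached));
    }
    return fetchFromServer().then(function(results) { // placeholder API call
        window.localStorage.setItem('searchResults', JSON.stringify(results));
        return results;
    });
}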

Node.js on server: Ajax/HTTP calls not reading new data from MongoDB

I've recently started pushing my locally tested Node, Mongo, AngularJS sites to live environments hosted on DigitalOcean.
I'm having inconsistency with Ajax/HTTP calls. On my local machine, I am able to make an HTTP request and update an AngularJS variable, and this in turn populates the HTML on the frontend. All works great! Now, testing this on my server with the same environment setup, the only time the variable loads new data is when I refresh the page.
For example (not my actual code):
Nodejs - app.js:
app.get('/getlist', requiredAuthentication, function(req, res) {
    list.find({'username': req.session.user.username}, function(err, list) {
        res.send(list);
    });
});
Angularjs - angular_app.js:
$scope.onClick = function(points, evt) {
    $http.get('/getlist').then(function(response) {
        // $http resolves with a response object; the payload is on .data
        $rootScope.list = response.data;
    });
};
Jade - home:
li(ng-repeat="row in list")
So like I said, this works perfectly on my local machine, but on my server I must refresh the page to load new data; it's as though my variable gets cached on the server.
Any idea would help.
Thanks!
------- UPDATE - testing v0.1 --------
So after some intensive testing, here is what I've found, but still no fix.
If I add new data via an HTTP POST and then go look in my Mongo DB, I see the new data. But when I click the ng-click to retrieve the new data via HTTP GET, it doesn't return the new data and is stuck on the old.
If I leave the page open for 10 minutes and then click the button, it retrieves the new data; this is such a schlep.
Sounds like cache, but why does it work perfectly on my local machine?
Looking at the console > Network > Status, the code is 304; doesn't that mean nothing changed?
------- UPDATE - testing v0.2 --------
I've now tested the returned data with a log in the console, and I did the GET with jQuery Ajax and got the same issue/behaviour; it's stuck on the same collection of data, so my conclusion must be that Node.js is causing the issue.
------- UPDATE - testing v0.3 --------
Okay, so I've completely stopped Mongo and switched everything to MySQL using node-mysql. Once again, on my local machine it works like a machine, and on my actual server it's laggy at reading new data.
I used Sequel Pro to access MySQL and started adding new entries to a table.
Opening my web URL in the browser immediately showed the new entries. But after that, adding or deleting entries only showed an effect after 10 minutes or so.
So my conclusion is that Node.js is caching aggressively. Does anyone know more about this? Am I really the only one ever to experience this?
Try res.json to return the data from Node:
app.get('/getlist', requiredAuthentication, function(req, res) {
    list.find({'username': req.session.user.username}, function(err, list) {
        res.json(list);
    });
});
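Given the 304 responses mentioned in the question, it may also be worth ruling out Express's conditional-request handling; a minimal sketch assuming Express 4 and the same route as above:
// Stop Express generating ETags, so it no longer answers
// conditional GETs with 304 Not Modified.
app.disable('etag');

app.get('/getlist', requiredAuthentication, function(req, res) {
    // Also tell the browser not to cache this response at all.
    res.set('Cache-Control', 'no-store');
    list.find({'username': req.session.user.username}, function(err, list) {
        res.json(list);
    });
});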
My conclusion on this issue was that port 80 was somehow caching the content of the page and would only load new data with a page refresh.
I upgraded Node, used the latest Express, and am now running my web app on a custom port; all is working now.

Can't retrieve file content via download URL

Since about an hour ago, I can't retrieve file content via the downloadUrl attribute.
Each time I try to get it, the API answers with a 401 (unauthorized) error.
Here's the code used: https://gist.github.com/arnaudbreton/5409015
Credentials are stored in the GAE datastore and successfully retrieved/refreshed.
The first call, to the file endpoint, works, but not the second call to download the content.
It was working this morning.
I tried different things so far:
- Revoked the client secret (found as a solution in another thread)
- Created a new client to test
- Disconnected my app from Drive and accepted it again
Nothing seems to solve my issue.
Thanks for your help.
A fix/rollback is in progress; things should be back to normal soon.
You can use alternateLink or webContentLink instead.
I got stuck on the same issue a day back, using downloadUrl to get the content, but got it working with webContentLink.
var request = gapi.client.drive.files.list();
request.execute(function (resp) {
    // files.list returns the matching files in resp.items
    resp.items.forEach(function (file) {
        console.log(file.alternateLink);  // opens the file in the Drive UI
        console.log(file.webContentLink); // direct-download link
    });
});
});