I'm trying to get Dojo to show JSON data that comes from a remote web service. I need to be clear though - the web server hosting the HTML/Dojo page I access isn't the same server as the one running the web service that returns the JSON data - the web service server just can't serve HTML pages reliably (don't ask!!).
As a test I moved the page onto the same web server as the web service, and the code below works. As soon as I move it back, so that the HTML/Dojo is served from Apache (http://myhost.nodomain:82, say) while the web service sending the JSON is the store target (target: "http://myhost.nodomain:8181"), it stops working.
I've used Firefox to watch the network traffic and I see the web service being called OK; the JSON data is returned too and looks correct (I know it is, from the previous test), but the fields are no longer set. I've tried this with DataGrid and with the plain page below, with the same effect.
Am I tripping up over something obvious?
Thanks
require([
    "dojo/store/JsonRest",
    "dojo/store/Memory",
    "dojo/store/Cache",
    "dojox/grid/DataGrid",
    "dojo/data/ObjectStore",
    "dojo/dom",
    "dojo/domReady!"
], function(JsonRest, Memory, Cache, DataGrid, ObjectStore, dom) {
    // REST store pointing at the remote web service
    var myStore = new JsonRest({
        target: "http://localhost:8181/ws/job/definition/",
        idProperty: "JOB_NAME"
    });
    myStore.query("JOB00001").then(function(results) {
        var theJobDef = results[0];
        dom.byId("JOB_NAME").innerHTML = theJobDef.JOB_NAME;
        dom.byId("SCHEDULED_DAYS").innerHTML = theJobDef.SCHEDULED_DAYS;
    });
});
It's true what Frans said about the cross-domain restriction, but Dojo provides dojo/request/iframe to work around the problem:
require(["dojo/request/iframe"], function(iframe){
iframe("something.xml", {
handleAs: "json"
}).then(function(xmldoc){
// Do something with the XML document
}, function(err){
// Handle the error condition
});
// Progress events are not supported using the iframe provider
});
You can simply use this; the returned data can then be inserted into a store and from there into the grid.
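As a rough sketch of that last step (assuming the service returns an array of job objects keyed by JOB_NAME, and that the page has a placeholder node with id "gridNode" - both assumptions for illustration, not from the original post):
require([
    "dojo/request/iframe",
    "dojo/store/Memory",
    "dojo/data/ObjectStore",
    "dojox/grid/DataGrid",
    "dojo/domReady!"
], function(iframe, Memory, ObjectStore, DataGrid){
    iframe("http://myhost.nodomain:8181/ws/job/definition/", {
        handleAs: "json"
    }).then(function(data){
        // Wrap the fetched rows in an in-memory store the grid can read
        var store = new Memory({ data: data, idProperty: "JOB_NAME" });
        var grid = new DataGrid({
            store: new ObjectStore({ objectStore: store }),
            structure: [
                { name: "Job", field: "JOB_NAME" },
                { name: "Scheduled days", field: "SCHEDULED_DAYS" }
            ]
        }, "gridNode"); // hypothetical placeholder <div id="gridNode">
        grid.startup();
    });
});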
Are you familiar with the Same-Origin Policy?
http://en.wikipedia.org/wiki/Same-origin_policy
Basically it restricts a page from making AJAX requests to domains other than the one the HTML page was loaded from. Common solutions to overcome this are CORS and JSON-P. However, remember that these restrictions exist for security reasons.
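To give a feel for the CORS route, here is a minimal sketch of what the service side might add, assuming the service were a Node.js/Express app (the framework, route, and payload are assumptions for illustration, not from the original posts):
var express = require("express");
var app = express();

// Opt in to CORS: allow the page served from port 82 to call this service.
// The browser checks this response header before exposing the data to the page.
app.use(function(req, res, next){
    res.setHeader("Access-Control-Allow-Origin", "http://myhost.nodomain:82");
    next();
});

app.get("/ws/job/definition/:id", function(req, res){
    // Placeholder payload shaped like the fields used in the question
    res.json([{ JOB_NAME: req.params.id, SCHEDULED_DAYS: "MON-FRI" }]);
});

app.listen(8181);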
I have a Node.js server that takes JSON from three websites and sends it to be displayed on my website (as JSON). The JSON on the websites I'm taking from is constantly updated, every 10 seconds. How can I make my Node.js server update constantly so that it always has the most up-to-date data?
I'm assuming this isn't possible without refreshing the page, but it would be optimal if the page weren't refreshed.
If this is impossible to do with Node.js and there is a different method of accomplishing it, I would be extremely appreciative if you told me.
Code:
var request = require('request-promise');

router.get("/", function(req, res){
    var data1;
    var data2;
    var data3;
    request("website1.json").then(function(body){
        data1 = JSON.parse(body);
        return request("website2.json");
    }).then(function(body){
        data2 = JSON.parse(body);
        return request("website3.json");
    }).then(function(body){
        data3 = JSON.parse(body);
        res.render("app.ejs", {data1: data1, data2: data2, data3: data3});
    });
});
Here are some general guidelines:
Examine the APIs you have available from your external service. Find out if there is anything it offers that lets you make a webSocket connection or some other continuous TCP connection so you can get realtime (or close to realtime) notifications when things change. If so, have your server use that.
If there is no realtime notification API from the external server, then you are just going to have to poll it every xx seconds. To decide how often you should poll it, you need to consider: a) How often you really need new data in your web pages (for example, maybe data that is current within 5 minutes is OK), b) What the terms of service and rate limiting are for the 3rd party service (e.g. how often will they let you poll it) and c) how much your server can afford to poll it (from a server load point of view).
Once you figure out how often you're going to poll the external service, then you build yourself a recurring polling mechanism. The simplest way would be using setInterval() that is set for your polling interval time. I have a raspberry pi node.js server that uses a setInterval() to repeatedly check several temperature sensors. That mechanism works fine as long as you pick an appropriate interval time for your situation.
Then, for communicating new information back to a connected web page, the best way to get near "real time" updates from the server is for the web page to make a webSocket or socket.io connection to your server. This is a continuously connected socket over which messages can be sent in either direction. Using this mechanism, the client makes a socket.io connection to your server; the server receives that connection and the connection stays open for the lifetime of that web page. Then, any time your server has new data that needs to be sent to that web page, it can just send a message over that socket.io connection. The web page receives the message and updates its contents accordingly based on the data in the message. No page refresh is needed.
Here's an outline of the server code:
var request = require('request-promise');

// start up socket.io listener, attached to your existing http server
var io = require('socket.io')(server);

// recurring interval to poll several external web sites
setInterval(function () {
    var results = {};
    request("website1.json").then(function (body) {
        results.data1 = JSON.parse(body);
        return request("website2.json");
    }).then(function (body) {
        results.data2 = JSON.parse(body);
        return request("website3.json");
    }).then(function (body) {
        results.data3 = JSON.parse(body);
        // decide if anything has actually changed in the external service data
        // and whether anything needs to be sent to connected clients
        io.emit("newData", results);
    }).catch(function (err) {
        // decide what to do if an external service causes an error
    });
}, 10000);
The client code would then be generally like this:
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io();
    socket.on("newData", function(data) {
        // process new data here and update the web page
    });
</script>
I have a RESTful Web API that is running properly - I can test it with Fiddler: I see calls going through and I see responses coming back.
I am developing a tablet application that needs to use the Web API in order to fetch data or make updates in the repository.
My calls do not return, and there is not a single trace in Fiddler to show that they even reach the server.
The first call I need to make is to log in. The URI would be this:
http://localhost:53060/api/user
This call would normally return some information about the user (such as group membership, level of authorization and so on). The Web API uses Windows Authentication, so the repository is able to resolve all these fields based on the credentials passed in. As I said, in Fiddler I see the three calls made to the URI as the authentication is negotiated between the caller and the server. The third call returns with a JSON object that contains all information generated from the repository as expected.
Now, moving to my client I have the following:
// HttpClient that forwards the current Windows credentials to the Web API
var webApiClient = new HttpClient(new HttpClientHandler()
{
    UseDefaultCredentials = true
})
{
    BaseAddress = new Uri("http://localhost:53060/")
};
webApiClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

HttpResponseMessage response = await webApiClient.GetAsync("api/user");
var userLoginInfo = await response.Content.ReadAsAsync<UserLoginInformation>();
My call to "GetAsync" never returns and, like I said, I see no trace of it in Fiddler.
Any idea of what I'm doing wrong?
Changing the URL where the Web API was exposed seems to have fixed the problem. Thanks to @Nkosi for the suggestion.
For anyone stumbling onto this question and wondering how to change the URL of the Web API: there are two ways. If the simulator is running on the same machine as the Web API, the change has to be made in the "applicationhost.config" file for IIS Express. You can locate this file by right-clicking the IIS Express icon in the notification area (bottom right corner) and selecting "Show All Websites"; highlight the desired Web API and it will show where the application host configuration file is located. In there, locate the following section:
<bindings>
    <binding protocol="http" bindingInformation="*:53060:localhost" />
</bindings>
and replace the "localhost" name with the IP address of the machine where the Web API is running.
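For example, if the hosting machine's LAN address were 192.168.1.20 (a placeholder address for illustration), the binding would become:
<bindings>
    <!-- 192.168.1.20 is a placeholder; use the actual IP of the machine hosting the Web API -->
    <binding protocol="http" bindingInformation="*:53060:192.168.1.20" />
</bindings>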
However, this approach will not work once you start testing your tablet app with a real device: IIS Express must be coerced into exposing the Web API to the outside world. I found an excellent Node.js package that can help with that, called iisexpress-proxy.
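From memory, its basic usage looks something like the following (check the package's README for the exact syntax before relying on it):
npm install -g iisexpress-proxy
iisexpress-proxy 53060 to 3000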
I have an AngularJS-based web application, already deployed to users, with some functionality that I need to hide. I've added the code to hide it and verified the controls are hidden when appropriate, but there are still users who have the old version of the file and can perform the undesired activities. Is there a way I can force, from the server, the view file to refresh on the client? (A tester was able to clear their cache, but that's a burden for the users in the field.)
Thanks!
Scott
One way to handle this would be to version the files. For example, the following line in your index.html
<script src="abc.js" />
could be rewritten as
<script src="abc.js?v1" />
v1 is the current file version and should be changed on each deployment of your application in which abc.js has changed.
Since index.html (the initial page) is always fetched from the server, updates to abc.js will now be reflected on all your clients.
In a large application this would need to be automated; you could use Grunt for that. You can refer to the following answer on StackOverflow for automating this:
https://stackoverflow.com/a/20446748/802651
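If you'd rather not commit to a particular plugin, the idea can also be sketched as a tiny build script (the file names and the regex are hypothetical; it simply stamps a fresh version string onto the script reference on each build):
// build-version.js - naive cache-busting sketch (assumed file layout)
var fs = require('fs');

var version = Date.now(); // or derive a content hash for precise invalidation
var html = fs.readFileSync('index.html', 'utf8');

// Rewrite any abc.js?v... reference to carry the new version string
html = html.replace(/abc\.js(\?v[^"']*)?/g, 'abc.js?v' + version);

fs.writeFileSync('index.html', html);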
UPDATE
HTML views/templates are cached using $templateCache in AngularJS. Basically, when you request a template for the first time, the browser fetches it from the server and puts it in the template cache; any subsequent requests for the same template are served from the template cache.
If you do not want these to be cached, you can listen for the $routeChangeStart event inside an app.run block and remove the specific template:
app.run(function($rootScope, $templateCache) {
    $rootScope.$on('$routeChangeStart', function(event, next, current) {
        if (typeof(current) !== 'undefined') {
            $templateCache.remove(current.templateUrl);
        }
    });
});
Reference: http://opensourcesoftwareandme.blogspot.in/2014/02/safely-prevent-template-caching-in-angularjs.html
Is there any known, established alternative for populating a new Angular scope with data read from outside the page?
I am working on a demo that should ship as a standalone HTML page which reads its data from the same location as the HTML file, on client machines without any web server.
This is because the HTML is generated on the fly from a PDF.
Do you have any idea?
In my working code below I need to change $http.get('data.json'.. to avoid Chrome's restriction (on Firefox my sample works fine).
<script>
var isisApp = angular.module('isisApp', []);
isisApp.controller('ISISListCtrl', function($scope, $http) {
    $http.get('data.json').success(function(data) {
        $scope.IsisDocument = data;
        // etc.
    });
});
</script>
and this is the error I get from Chrome:
XMLHttpRequest cannot load file:///C:/temp/data.json. Cross origin requests are only supported for HTTP. angular.js:8081
Error: A network error occurred.
Thanks in advance
Fabio
If you want to test your code while developing, you have two options:
Use a local web server. You could use the Node.js platform with Express (see the sketch below).
Start Chrome from the terminal with the --allow-file-access-from-files option.
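For the first option, a minimal static server sketch might look like this (assuming index.html and data.json sit in the same directory as the script; the port is arbitrary):
// server.js - serve the demo folder over HTTP so $http.get('data.json') works
var express = require('express');
var app = express();

app.use(express.static(__dirname));

app.listen(8080, function() {
    console.log('Serving on http://localhost:8080');
});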
Is it possible to intercept a 404 error without using a web server (i.e. browsing an HTML file on the filesystem)?
I tried some JavaScript, using a hidden iframe that preloads the destination page, checks the result, and then triggers a custom error or redirects to the correct page.
This works fine, but it is not good for performance.
A 404 error is an HTTP status response. So unless you are trying to retrieve this file using an HTTP request/response, you can't have a genuine 404 error. You can only mimic one in something like the way you suggest. Any "standard" way of handling a 404 error is dependent on your flavour of web server anyway...
404 is an HTTP response code, and as such it is only delivered through the HTTP protocol by servers that speak it. The file:// scheme isn't a real protocol response as such; it's a hack built into clients (like browsers) to enable local file support, and it's up to browsers/clients themselves whether they expose any response codes from their file:// implementation. In theory they could report them in the DOM, for example, but those would be response codes they expose to themselves, and as such this is rarely implemented. Most don't, and there is no standard way to get at it. You could look into browser extensions (for Firefox, say) and see whether they support it, but that is highly nonstandard and will likely break if you put it on the web.
Why don't you want to use the server?
I don't believe that it's possible to handle a 404 error client-side, because a 404 error is server-side.
Whenever you load a webpage, you make a request to the server. Thus, when you ask for a file that's not there, it's the server that handles the error. Regular HTML/CSS/JavaScript only come into the picture when the server sends back a response to tell you that it can't find the file.
Steve
Because I was looking for this today: you can now do this without a server by using a Service Worker to cache the custom 404 page, and then serve it when a fetch request's status is 404. Following the instructions in the Google cache lab, the worker file looks as follows:
const filesToCache = [
    '/',
    '404.html'
];
const staticCacheName = 'pages-cache-v1';

self.addEventListener('install', event => {
    console.log('Attempting to install service worker and cache static assets');
    event.waitUntil(
        caches.open(staticCacheName).then(cache => {
            return cache.addAll(filesToCache);
        })
    );
});

self.addEventListener('fetch', event => {
    console.log('Fetch event for ', event.request.url);
    event.respondWith(
        caches.match(event.request).then(response => {
            if (response) {
                console.log('Found ', event.request.url, ' in cache');
                return response;
            }
            console.log('Network request for ', event.request.url);
            return fetch(event.request).then(response => {
                console.log('response.status:', response.status);
                // fetch request returned 404, serve the cached custom 404 page
                if (response.status === 404) {
                    return caches.match('404.html');
                }
                return response;
            });
        })
    );
});
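For completeness, the page still has to register the worker once; a minimal sketch, assuming the worker above is saved as sw.js at the site root (the file name is an assumption):
// In the page (e.g. index.html): register the service worker
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js').then(registration => {
        console.log('Service worker registered with scope:', registration.scope);
    }).catch(err => {
        console.error('Service worker registration failed:', err);
    });
}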