Server-Sent Events Do Not Maintain Open Connection
I’ve been working with some example code I found on the web to get basic server-sent events working on our server, but I’ve stumbled across some strange behavior when testing across servers. Below is the function being called in my controller:
public function hi() {
    // Note: header() must not contain newlines; the blank line that
    // separates SSE messages belongs in the body, not the header.
    header("Content-Type: text/event-stream");
    $counter = rand(1, 10);
    while (1) {
        // Every two seconds, send a "ping" event.
        echo "event: ping\n";
        $curDate = date(DATE_ISO8601);
        echo 'data: {"time": "' . $curDate . '"}';
        echo "\n\n";
        ob_flush();
        flush();
        sleep(2);
    }
}
My View:
<script>
    var evtSource = new EventSource("picks/hi");
    evtSource.onmessage = function(e) {
        console.log(e.data);
    };
    evtSource.onerror = function(e) {
        console.log("EventSource failed.");
    };
    evtSource.onopen = function(e) {
        console.log("Connection open");
    };
    evtSource.addEventListener("ping", function(e) {
        var newElement = document.createElement("li");
        var obj = JSON.parse(e.data);
        console.log(obj.time);
    }, false);
</script>
Now here’s the bizarre part. This code works as it should on my HostGator account. It opens a connection, maintains it, and spits out the time every two seconds. If you view the console in Chrome, you will see the following output:
Connection open
2014-04-02T15:10:24-0400
{"time": "2014-04-02T15:10:26-0400"}
{"time": "2014-04-02T15:10:28-0400"}
{"time": "2014-04-02T15:10:30-0400"}
{"time": "2014-04-02T15:10:32-0400"}
{"time": "2014-04-02T15:10:34-0400"}
{"time": "2014-04-02T15:10:36-0400"}
{"time": "2014-04-02T15:10:38-0400"}
Unfortunately, this same block of code does not work on our server through Amazon Web Services. When I use the console to compare what is going on between the two, I can see that the code on the HostGator server does indeed open and maintain a persistent connection. When I do the same on the AWS instance, I see that the GET request for data is stuck in a pending status. I believe this is an Apache issue, but I cannot find any information. What do I need to do on the server side for server-sent events to function properly?
A few things to check: are your server- and client-side scripts of the same origin? I know this can cause issues in some browsers, but admittedly it's been a while since I played around with SSEs, so I don't know if this still holds true.
You may also want to add header('Cache-Control: no-cache'); under your content-type declaration, just to be certain.
You may have to contact AWS support to see what their thoughts are.
Also, and I know this sounds stupid, but completely clear all of your browser info after changing your code (cookies, cache, etc...).
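One more diagnostic that may help: request the stream outside the browser entirely, e.g. with curl -N http://your-server/picks/hi (the -N flag disables curl's output buffering; the hostname is a placeholder). If events arrive there every two seconds but not in the browser, a buffering layer between PHP and the client is the likely culprit.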
If your server is running behind nginx, then you need to add
proxy_buffering off;
to your nginx configuration. For Apache, look into this link:
Prevent output buffering with PHP and Apache
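For context, a minimal sketch of where that directive might go, assuming a typical reverse-proxy location block (the upstream address is illustrative):

location / {
    proxy_pass http://127.0.0.1:8080;
    # Turn off response buffering so each SSE chunk reaches the client immediately.
    proxy_buffering off;
}

nginx also honors a per-response X-Accel-Buffering: no header, so the PHP script can opt out of buffering for just the event-stream endpoint instead of changing the server config.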
Related
Recently a post was featured on Hacker News about websites abusing WebSockets to find open ports on the client's machine.
The post does not go into any details, so I decided to give it a try.
I opened a web server on port 8080 and tried running this script in Chrome's console:
function test(port) {
    try {
        var start = performance.now();
        var socket = new WebSocket('ws://localhost:' + port);
        socket.onerror = function (event) {
            console.log('error', performance.now() - start, event);
        };
        socket.addEventListener('close', function (event) {
            console.log('close', performance.now() - start, event);
        });
        socket.addEventListener('open', function (event) {
            console.log('open', performance.now() - start, event);
            socket.send('Hello Server!');
        });
        socket.addEventListener('message', function (event) {
            console.log('message ', performance.now() - start, event);
        });
    } catch (ex) {
        console.log(ex);
    }
}
Indeed, Chrome logs a different error message (ERR_CONNECTION_REFUSED) when I try to connect to a port that is not open:
test(8081)
VM1886:3 WebSocket connection to 'ws://127.0.0.1:8081/' failed: Error in connection establishment: net::ERR_CONNECTION_REFUSED
And when I try to connect to a port that is open but not listening for WebSockets (Unexpected response code: 200):
test(8080)
WebSocket connection to 'ws://127.0.0.1:8080/' failed: Error during WebSocket handshake: Unexpected response code: 200
But I can't find any way to access and read these errors in JavaScript.
Control flow never reaches the catch clause catch(ex) { console.log(ex) }, and the event objects that Chrome passes to socket.onerror do not seem to differ whether the port is open or not.
Timing attacks also don't seem to help, at least in Chrome. The delta between the new WebSocket() call and the onerror event seems to increase after calling test(...) a few times.
So is there actually a way for a web page to determine if a port is open on my computer?
The presentation slides linked below show this was well known in 2016, and the lack of a timing difference in your tests suggests mitigations may have been applied upstream.
https://datatracker.ietf.org/meeting/96/materials/slides-96-saag-1/
It might only work on Windows:
https://blog.avast.com/why-is-ebay-port-scanning-my-computer-avast
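For what it's worth, here is a minimal sketch of the timing measurement described in the question (the ports are illustrative; as the slides suggest, browser mitigations may make the two distributions indistinguishable):

function timePort(port) {
    return new Promise(function (resolve) {
        var start = performance.now();
        var socket = new WebSocket('ws://127.0.0.1:' + port);
        // Resolve on either outcome; the page only sees a generic
        // failure, so elapsed time is the only potential signal.
        socket.onopen = function () {
            socket.close();
            resolve(performance.now() - start);
        };
        socket.onerror = function () {
            resolve(performance.now() - start);
        };
    });
}

// Compare a port known to be open against one known to be closed.
timePort(8080).then(function (ms) { console.log('8080:', ms.toFixed(1), 'ms'); });
timePort(8081).then(function (ms) { console.log('8081:', ms.toFixed(1), 'ms'); });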
With Chrome DevTools, I can see the number of requests on a page, but there seems to be no way to measure the number of connections.
Is it possible in Chrome DevTools? If not, what tools can I use instead?
You can enable the Connection ID column in the Network panel, which shows a unique identifier for each connection. You can sort the column to see how many requests there were for a particular connection instance, but there's no built-in way to count the connections or filter the results.
However, this data can be exported into a JSON-formatted file known as a HAR (HTTP Archive). You can do this by right-clicking on the panel and selecting 'Save as HAR with Content'.
You can extract the data from the JSON and filter and aggregate it however you like. I have created a simple example script that loads the HAR from the local file system, parses the data, and filters the content so that it shows how many unique Connection IDs appeared in the session.
function loadFile(event) {
    var file = event.target.files[0];
    if (file) {
        var reader = new FileReader();
        reader.onload = function(e) {
            var contents = e.target.result;
            var data = JSON.parse(contents);
            getUniqueConnectionCount(data);
        };
        reader.readAsText(file);
    } else {
        alert('Failed to load file.');
    }
}

function getUniqueConnectionCount(data) {
    var entries = data.log.entries;
    var uniqueConnectionIds = entries.map(function(item) {
        return item['connection'];
    }).filter(function(x, i, a) {
        // Keep the first occurrence of each ID, skipping entries
        // (such as cached responses) that have no connection field.
        return x !== undefined && a.indexOf(x) === i;
    });
    console.log('There were ', uniqueConnectionIds.length, ' unique connections found', uniqueConnectionIds);
}

document.getElementById('files').addEventListener('change', loadFile, false);

<div>
    <input type='file' id='files' name='files' />
</div>
Note: Make sure 'Preserve Log' is unchecked to avoid seeing data from previous sessions. This is just a quick example for your use case, but I might look into extending it to be more generic.
It depends on what connections you are interested in. You can use Chrome DevTools the way Gideon Pyzer pointed out to see HTTP connections. But if you are interested in TCP or some other protocol, you can use Wireshark (a free and open-source packet analyzer) to capture those connections.
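For example, the display filter tcp.flags.syn == 1 and tcp.flags.ack == 0 shows only the initial SYN packets, so each match corresponds to one TCP connection attempt.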
Then there is "Capture Network Log" in Chrome. Type "chrome://net-export/" in the address field, set the options you want, and press the "Start Logging to Disk" button - it will save your browser's network activity to a JSON file.
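If you want to count connections from that file programmatically, here is a rough sketch (run with Node.js; it assumes the netlog schema exposes a constants.logSourceType table and an events array, and that the file name is the default one - both may vary between Chrome versions):

var fs = require('fs');
var data = JSON.parse(fs.readFileSync('chrome-net-export-log.json', 'utf8'));

// Each TCP socket appears as an event source of type SOCKET.
var socketType = data.constants.logSourceType.SOCKET;
var socketIds = new Set();
data.events.forEach(function (e) {
    if (e.source && e.source.type === socketType) {
        socketIds.add(e.source.id);
    }
});
console.log('Unique sockets:', socketIds.size);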
I am creating a digital signage player that uses Chrome as its display engine. We need to be able to muddle along without too much interruption if the network goes down.
Chrome works fine caching images, and I've set the "Expires" header to be a month after access. I can take the player computer offline and have the app run for days with no problem. If I reboot the machine the right way (Start -> Shut Down), caching still works as expected.
The issue is that when Chrome exits abnormally - either a crash or power loss - Chrome ignores the cache on reboot and refuses to load images. This happens even if I cut power five minutes after it loads the page, so the content is not expiring.
My guess is that Chrome ignores the cache after an abnormal exit to prevent a corrupted cache from continually crashing the browser. However, this behavior is not what I need.
Does anyone know of a command line arg or flag I can set to keep this from happening?
Thanks for your help.
I tried everything I could think of to make Chrome not invalidate the local cache on system failure, and came up empty. A few other people have asked the same question, and I didn't see an answer.
Here's what I did that made this work, and if someone else is having the same problem, it might be the workaround that you need.
I added a service worker that caches images. The code below isn't perfect yet, but it should be a starting place for someone... (FYI, I learned this 5 minutes ago, so if someone wants to give me a pointer or two on how to make it more elegant, I'm all ears.)
We cache anything that has a response type of "cors", so we cache only images coming from the remote server. Note that your images must be loaded over https for this to work.
Taken (mostly) from: https://developers.google.com/web/fundamentals/getting-started/primers/service-workers
var CACHE_NAME = 'shine_cache';
var urlsToCache = [
    '/'
];

self.addEventListener('install', function(event) {
    // Perform install steps
    event.waitUntil(
        caches.open(CACHE_NAME)
            .then(function(cache) {
                console.log('Opened cache');
                return cache.addAll(urlsToCache);
            })
    );
});

self.addEventListener('fetch', function(event) {
    //console.log('Handling fetch event for', event.request);
    if (event.request.method == 'POST') {
        //console.log("Skipping POST");
        event.respondWith(fetch(event.request));
        return;
    }
    // The Accept header can be absent, so guard against null.
    if ((event.request.headers.get('Accept') || '').indexOf('image') !== -1) {
        event.respondWith(
            caches.match(event.request)
                .then(function(response) {
                    // Cache hit - return response
                    if (response) {
                        console.log("Returning from cache.", event.request);
                        return response;
                    }
                    // IMPORTANT: Clone the request. A request is a stream and
                    // can only be consumed once. Since we are consuming this
                    // once by the cache and once by the browser for fetch, we
                    // need to clone the request.
                    var fetchRequest = event.request.clone();
                    return fetch(fetchRequest).then(
                        function(response) {
                            console.log("Have a response.", response);
                            // Check if we received a valid response
                            if (!response || response.status !== 200 || response.type !== 'cors') {
                                return response;
                            }
                            // IMPORTANT: Clone the response. A response is a stream
                            // and because we want the browser to consume the response
                            // as well as the cache consuming the response, we need
                            // to clone it so we have two streams.
                            var responseToCache = response.clone();
                            caches.open(CACHE_NAME)
                                .then(function(cache) {
                                    console.log("Caching response", event.request);
                                    cache.put(event.request, responseToCache);
                                });
                            return response;
                        }
                    );
                })
        );
    }
});
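One detail the snippet above leaves out: the worker has to be registered from the page before it can intercept any fetches. A minimal registration sketch, assuming the worker code is saved as /sw.js at the site root:

if ('serviceWorker' in navigator) {
    // Registering at the root scope lets the worker handle image
    // requests for the entire origin.
    navigator.serviceWorker.register('/sw.js').then(function (reg) {
        console.log('Service worker registered, scope:', reg.scope);
    }).catch(function (err) {
        console.log('Service worker registration failed:', err);
    });
}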
I'm trying to use STOMP with Apache ActiveMQ, as I was hoping WebSockets would give me better performance than the typical org.activemq.Amq Ajax connection.
Anyway, my activemq config file has the proper entry:
<transportConnector name="ws" uri="ws://0.0.0.0:61614?maximumConnections=1000&amp;wireFormat.maxFrameSize=104857600"/>
And I'm connecting to it via the following means:
function amqWebSocketConn() {
    var url = "ws://my.ip.address:61614/stomp";
    var client = Stomp.client(url);
    var connect_callback = function() {
        alert('connected to stomp');
        // Define the message handler before subscribing, so it is not
        // undefined at the moment subscribe() is called.
        var callback = function(message) {
            if (message.body) {
                alert("got message with body " + message.body);
            } else {
                alert("got empty message");
            }
        };
        client.subscribe("topic://MY.TOPIC", callback);
    };
    client.connect("", "", connect_callback);
}
When I first open up the web browser and navigate to http://localhost:8161/admin/connections.jsp, it shows the following:
Name Remote Address Active Slow
ID:mymachine-58770-1406129136930-4:9 StompSocket_657224557 true false
Shortly thereafter, it removes itself. Is there something else I need, such as a heartbeat, to keep the connection alive?
Using

var amq = org.activemq.Amq;
amq.init({
    uri : '/myDomain/amq',
    timeout : 50,
    clientId : (new Date()).getTime().toString()
});

kept the connection up for the TCP Ajax connection.
I have faced a similar problem and solved it using this:
client.heartbeat.incoming = 0;
client.heartbeat.outgoing = 0;
You have to add these two lines before connect.
Even after this, I have seen disconnections after 5-10 minutes if there are no incoming messages. To solve that, you have to implement the disconnect callback of the connect method.
client.connect('', '', connect_callback, function(frame) {
    // Connection lost
    console.log(frame);
    // Reconnect and subscribe again from here
});
This is successfully working in my application.
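Putting the pieces together, a rough sketch of a client that disables heartbeats and re-establishes the subscription on failure (the 5-second retry delay is an arbitrary choice):

function connectStomp() {
    var client = Stomp.client("ws://my.ip.address:61614/stomp");
    client.heartbeat.incoming = 0;
    client.heartbeat.outgoing = 0;
    client.connect("", "", function () {
        client.subscribe("topic://MY.TOPIC", function (message) {
            console.log("got message", message.body);
        });
    }, function (frame) {
        // Connection lost: log the frame, then reconnect and resubscribe.
        console.log(frame);
        setTimeout(connectStomp, 5000);
    });
}
connectStomp();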
I can't get my Yahoo! Application Platform app to run; I keep getting denied access, even though their policy file accepts requests from any domain.
OK: Policy file accepted: http://social.yahooapis.com/crossdomain.xml
Error: Request for resource at http://social.yahooapis.com/v1/user/<user id>/profile?oauth_signature_method=HMAC-SHA1&lang=en-US&oauth_consumer_key=<key>&oauth_token=<long ass token>&oauth_version=1.0&format=json&oauth_nonce=<blah blah>&oauth_timestamp=1262846353&region=US&oauth_signature=<foo bar> by requestor from http://<my domain>/YOSSimple.swf is denied due to lack of policy file permissions.
The URL works, by the way; I edited some stuff out since it contains my keys and such.
Links to the stuff I'm trying to do
http://developer.yahoo.com/flash/yos/
http://developer.yahoo.com/flash/yos/examples/simple/YOSSimple.fla
YOSSimple actually creates the URL properly, since if I type it into my browser I'm prompted to download a file that contains information about my profile.
But Flash just won't open it.
I'm guessing that it's not loading the policy file automatically. You should try using
Security.loadPolicyFile("http://social.yahooapis.com/crossdomain.xml");
Do you have a web proxy installed with which you can monitor which files exactly are loaded? My favorite is Charles, but there are also free Firefox plugins like HttpFox.
EDIT:
I think I know what's going wrong. It's going wrong the other way around: the SWF from Yahoo is trying to access your SWF, but doesn't have the correct permissions. Would you try
Security.allowDomain( 'http://social.yahooapis.com/' );
http://www.ieinspector.com/httpanalyzer/
Use HTTP Analyzer to see what's happening.
Also check that you're not mismatching http://www. with http://, because Flash treats them as different domains.
Also, are you running the code locally on your machine? It could be your local security settings.
A simple WebProxy will fix this:
<?php
// PHP Proxy
// Loads a XML from any location. Used with Flash/Flex apps to bypass security restrictions
// usage: proxy.php?url=http://mysite.com/myxml.xml
$session = curl_init($_GET['url']); // Open the Curl session
curl_setopt($session, CURLOPT_HEADER, false); // Don't return HTTP headers
curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // Do return the contents of the call
$xml = curl_exec($session); // Make the call
header("Content-Type: text/xml"); // Set the content type appropriately
echo $xml; // Spit out the xml
curl_close($session); // And close the session
?>
Modify the web proxy example above to support multiple options as follows:
$sOptions = "";
foreach ($_GET as $sIndex => $sValue) {
    if ($sIndex == 'url') {
        $url = $sValue;
    } else {
        if (strlen($sIndex) > 0) {
            $sOptions .= "&" . $sIndex;
        }
        if (strlen($sValue) > 0) {
            $sOptions .= "=" . $sValue;
        }
    }
}
// Note: this assumes the url parameter already contains a query string
// (e.g. ...?a=b), since every extra option is appended with "&".
$url .= $sOptions;
$session = curl_init($url); // Open the Curl session
curl_setopt($session, CURLOPT_HEADER, false); // Don't return HTTP headers
curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // Do return the contents of the call
$xml = curl_exec($session); // Make the call
header("Content-Type: text/xml"); // Set the content type appropriately
echo $xml; // Spit out the xml
curl_close($session); // And close the session