YSlow is complaining about cookie-free components. I understand what the problem is, but is there any solution other than using a subdomain? Using a subdomain can create SSL issues, especially in combination with a CDN and plugin updates.
Is there no alternative, such as a script that says "don't send cookies for these components"?
Thanks.
When a cookie is set for a domain, the browser sends it with every request to that domain for as long as the cookie has not expired. So no, there is no way to say "don't send cookies for this request" other than using a different domain, which you already mentioned.
Depending on your website, instead of cookies you could use:
query parameters: GET my_file.php?my_session_id=abcd123456
custom headers (for AJAX requests only): you can set an HTTP header such as X-MY-SESSION-ID: abcd123456 (see the sketch after this list)
POST body variables
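For example, the custom-header option could look roughly like this on the client (a sketch; the X-MY-SESSION-ID header name and the /api/cart endpoint are made up):

    // Hypothetical sketch: send the session id in a custom header on an AJAX
    // request instead of relying on a cookie. The server would read the
    // X-MY-SESSION-ID header instead of the Cookie header.
    const sessionId = 'abcd123456';

    fetch('/api/cart', {
        method: 'GET',
        headers: { 'X-MY-SESSION-ID': sessionId }
    })
        .then(response => response.json())
        .then(data => console.log('cart contents:', data))
        .catch(err => console.error('request failed:', err));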
You asked for alternatives, so here are some, but they are hard to implement if all you want is to avoid sending cookies when loading a static asset.
So my suggestion is to look at how many extra bytes you're actually sending with each request and try to reduce that to a minimum.
In my opinion, only one cookie is really needed, for the user's session id, so that's about 50-60 bytes. You may also choose to use Google Analytics, which adds about a couple of hundred bytes. Everything beyond that is bloat. But don't sweat it if YSlow shames you for 300 extra bytes; it won't make a huge difference.
I'd like to log in to a RESTful back-end server written in Laravel 5, with a single-page front-end application leveraging Polymer custom elements.
In this system, the persistence (CRUD) layer lives on the server, so authentication should be done on the server in response to the client's API requests. When a request is valid, the server returns a User object in JSON format, including the user's role, for access control on the client.
My question is: how can I keep the session alive even when the user refreshes the front-end page? Thanks.
This is an issue beyond Polymer, or even just single page apps. The question is how you keep session information in a browser. With SPAs it is a bit easier, since you can keep authentication tokens in memory, but traditional Web apps have had this issue since the beginning.
You have two things you need to do:
Tokens: You need a user token that indicates that this user is authenticated. You want it to be something that cannot be guessed, or else someone can spoof it. So the token had better not be "jimsmith" but something more reliable. You have two choices. Either you can have a randomly generated token which the server stores, so that when it is presented on future requests the server can validate it; this is how most session managers in app servers work, such as node.js sessions or Jetty sessions. The alternative is to do something cryptographic, so that the server only needs to validate the token mathematically rather than check a store to see if it is valid. I did that for node in http://github.com/deitch/cansecurity but there are various options for it.
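For the first option, a minimal sketch (assuming Node.js; the in-memory Map stands in for whatever session store you actually use) might look like:

    // Generate an unguessable random token and remember it server-side.
    // A real app would keep sessions in Redis or a database, not in memory.
    const crypto = require('crypto');

    const sessions = new Map(); // token -> session data

    function createSession(user) {
        const token = crypto.randomBytes(32).toString('hex'); // 256 bits of randomness
        sessions.set(token, { user: user, createdAt: Date.now() });
        return token; // hand this back to the client
    }

    function validateSession(token) {
        return sessions.get(token) || null; // null means "not authenticated"
    }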
Storage: You need some way to store the tokens client-side that does not depend on JS memory, since you expect to reload the page.
There are several ways to do client-side storage. The most common by far is cookies. Since the browser stores them without you having to try too hard, and presents them whenever you access the domain the cookie is registered for, they are pretty easy to work with. Many client-side and server-side auth libraries are built around them.
An alternative is HTML5 local storage. Depending on your target browsers and their support for it, you can consider using it.
There are also ways you can play with URL parameters, but then you run the risk of losing the token when someone switches pages. It can work, but I tend to avoid it.
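As a rough illustration of the localStorage option, something like this would let the token survive a page reload (the 'authToken' key and the /api/me endpoint are just placeholders, not any particular library's API):

    // Persist the token outside JS memory so a refresh doesn't log the user out.
    function saveToken(token) {
        localStorage.setItem('authToken', token);
    }

    function loadToken() {
        return localStorage.getItem('authToken'); // null if nothing was stored
    }

    // On page load, try to restore the session from a previously stored token.
    const token = loadToken();
    if (token) {
        fetch('/api/me', { headers: { 'Authorization': 'Bearer ' + token } })
            .then(res => (res.ok ? res.json() : Promise.reject(res.status)))
            .then(user => console.log('session restored for', user))
            .catch(() => localStorage.removeItem('authToken')); // token no longer valid
    }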
I have not seen any components that handle cookies directly, but it shouldn't be too hard to build one.
Here is the gist with the cookie management code I used for a recent app. Feel free to wrap it into a Web component for cookie management... as long as you share alike!
https://gist.github.com/deitch/dea1a3a752d54dc0d00a
UPDATE:
component.kitchen has a storage component here http://component.kitchen/components/TylerGarlick/core-resource-storage
The simplest way, if you use PHP, is to keep the user in a PHP session (as in a normal non-SPA application).
PHP stores the user info on the server and automatically generates a cookie that the browser sends with every request. With a single server and no load balancing, the session data is local and very fast.
Is there a reliable way to stop all browsers from caching an image locally?
This is not (just) about the freshness of the image, but rather a concern about sensitive images being stored on the local drive.
Adding a random URL parameter to the image URL, as suggested in similar questions, does not help, because that just ensures the next request is not served from cache (at least that is my understanding). What I really need is for the image never to be saved locally, or at least not to be accessible outside the browser session if it is saved.
You need to send appropriate cache-control headers when serving the response to the image request. See this post for information on standard ways to do this in several programming languages:
How to control web page caching, across all browsers?
There is an alternate, and possibly more foolproof yet more complex, approach: populate base64-encoded image data directly into the img src attribute. As far as I know this is not subject to caching, since there is no separate HTTP request made to retrieve the image. Of course you still need to make sure the page itself is not cached, which gets back to the initial problem of serving appropriate headers for the primary HTML request.
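As a rough sketch of both ideas together (plain Node.js, no framework; the file name and port are arbitrary), the server could send cache-defeating headers and inline the image as a data URI so the browser never issues a separate image request:

    const http = require('http');
    const fs = require('fs');

    http.createServer(function (req, res) {
        // Re-read the image on every request and embed it as base64.
        const imageBase64 = fs.readFileSync('secret.png').toString('base64');

        res.writeHead(200, {
            'Content-Type': 'text/html',
            // The usual belt-and-braces trio for defeating caches.
            'Cache-Control': 'no-store, no-cache, must-revalidate, max-age=0',
            'Pragma': 'no-cache',
            'Expires': '0'
        });
        res.end('<img src="data:image/png;base64,' + imageBase64 + '">');
    }).listen(8080);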
I have a shopping cart script on our site that is set up to be secure (HTTPS with an SSL certificate). I have links in the script leading to other parts of my site that are not secure (WordPress blog, etc.).
On the secure site, if I have links that are not secure (http), it triggers a message in the browser alerting the user to unsecured links. If I make the outgoing links relative, then when the user clicks one and leaves the script, they stay in secure mode (which we don't want for the other parts of our site).
Years ago, I remember having this issue. I think I got around it by using an HTTP redirect for every outgoing link on the secure site: I would have https://www.example.com/outgoinglink1a redirect to http://www.example.com/outgoinglink1b. That way, I could put https://www.example.com/outgoinglink1a on the secure site, and when it was clicked, it would lead to http://www.example.com/outgoinglink1b.
These days, how do I have links on the secure site that lead to other, non-secure parts of the site without triggering an SSL warning for users while they are on the secure part of the site? Is using some type of 301 redirect in .htaccess better? Is there another preferred or easier method (than HTTP redirects) for accomplishing this?
Thank you for any guidance.
You can use HTTPS-to-HTTP redirects to the unsecured site to avoid browser warnings.
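One way to set such a redirect up today, sketched with Express (the paths mirror the example URLs in the question; everything else is an assumption about your stack):

    const express = require('express');
    const app = express();

    // The link on the secure pages points at the HTTPS URL; this endpoint
    // bounces the visitor to the plain-HTTP destination with a permanent redirect.
    app.get('/outgoinglink1a', function (req, res) {
        res.redirect(301, 'http://www.example.com/outgoinglink1b');
    });

    app.listen(3000); // behind whatever terminates TLS for the secure site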
But for multiple reasons, safety being one of them, I would really advise against using both http and https for the same domain, even if a lot of big sites still do it. You would either have to use different cookies for the secure and the normal site, or the one cookie you use for your shopping cart couldn't have the secure flag, in which case you really don't need HTTPS in my opinion. Also, you would never be able to implement HSTS.
You've already gone to the lengths of buying a certificate and setting up an HTTPS server, so why not secure the whole site?
Update to answer your question in the comment:
That is of course a deal-breaker if you rely on those and their hosts haven't implemented HTTPS yet (which they probably will sooner or later, or they'll be out of business).
Depending on what they actually do, you could perhaps proxy the requests to those scripts and serve them from your HTTPS-enabled server. But I would really consider that a last resort.
The slow part is mostly just the handshake. If you enable session resumption, there shouldn't be enough overhead to noticeably slow down your site. Make sure your TLS session cache is big enough and that the ticket lifetime is ample.
Of course, your mileage may vary. So make sure you test your https site before going online.
I've heard such horror stories as well, but I think most of the time they're due to faulty, or at least sub-standard, implementations. Make sure you redirect every single http request to https with a 301 status and you should be fine. And for some months now, enabling HTTPS has actually helped with your Google ranking.
To link to an external site (a different FQDN) you don't have to implement any trickery to avoid browser warnings - that's just linking to a different site and has nothing to do with mixed-content policies.
I am developing a Chrome App that needs to work off-line, i.e., I want it to use the App Cache, rather than sending a request to the server, as much as possible. However, should it be necessary to hit the server, then I would like to include a few bytes of data (so the server can collect and analyse some statistics about, for example, how many previous requests were served by the cache alone).
I cannot send the data bytes in the URL query string, because that defeats the cache (it is a well-known technique for ensuring the cache is bypassed - exactly what I do NOT want to do).
I can't use a POST request, as that will also defeat the cache.
I have thought about including the data in a header, but I am unsure how, and unsure whether it will do the trick, as everything I have found about this idea recommends against it. I do not want to fall into the trap of relying on something completely undocumented in case it stops working in the future (and I am not sure what would work today).
Can I include data in a GET request that will have absolutely no effect on the App Cache (in Chrome in particular), but is available to the server if and when the request makes it that far?
See comment above; this question isn't really about Chrome Apps. It's more about webapps that use App Cache.
It might be helpful to decouple normal web requests from the reporting feature. You are trying to piggyback the stats on the normal requests, but there doesn't seem to be a need to do so. Moreover, for the occasional client that caches perfectly, you'll never hear from it at all, because the reporting never occurs; so even in the best case the design would be flaky.
Instead, you could increment a counter every time you make a request that might be cacheable, and keep it somewhere persistent like localStorage. Then periodically report it to a non-cached destination, as in the sketch below. You can then subtract the traffic you see at the web server (which represents uncached requests) from the reported total (which represents all requests, cached or not); the difference is cached requests.
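A sketch of that idea (the /stats endpoint, the 'requestCount' key and the five-minute interval are arbitrary choices):

    // Call this every time the app makes a request that might be served from cache.
    function countCacheableRequest() {
        const n = parseInt(localStorage.getItem('requestCount') || '0', 10) + 1;
        localStorage.setItem('requestCount', String(n));
    }

    // Periodically report the running total to an endpoint that is never cached.
    function reportStats() {
        const total = parseInt(localStorage.getItem('requestCount') || '0', 10);
        fetch('/stats', {
            method: 'POST', // POST so the report itself bypasses any cache
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ totalRequests: total })
        }).catch(function () { /* offline right now; try again next interval */ });
    }

    setInterval(reportStats, 5 * 60 * 1000);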
It is impossible to identify a user or request as unique since duping is trivial.
However, there are a handful of methods that, combined, can hamper cheating attempts and give a user quasi-unique status.
I know of the following:
IP Address - store the IP address of each visitor in a database of some sort
Can be faked
Multiple computers/users can have the same address
Users with dynamic IP addresses (some ISPs issue them)
Cookie tracking - store a cookie per visitor. Visitors that don't have one are considered "unique"
Can be faked
Cookies can be blocked or cleared via browser
Are there more ways to track non-authorized (non-login, non-authentication) website visitors?
There are actually many ways you can detect a "unique" user; many of these methods are used by our marketing friends. It gets even easier when the user has plugins enabled such as Java, Flash, etc.
Currently my favorite demonstration of cookie-based tracking is evercookie (http://samy.pl/evercookie/). It creates a "permanent" cookie via multiple storage mechanisms that the average user is not able to flush (a toy sketch of the idea follows the list below). Specifically, it uses:
Standard HTTP Cookies
Local Shared Objects (Flash Cookies)
Silverlight Isolated Storage
Storing cookies in RGB values of auto-generated, force-cached PNGs, using the HTML5 Canvas tag to read the pixels (cookies) back out
Storing cookies in Web History
Storing cookies in HTTP ETags
Storing cookies in Web cache
window.name caching
Internet Explorer userData storage
HTML5 Session Storage
HTML5 Local Storage
HTML5 Global Storage
HTML5 Database Storage via SQLite
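This is not evercookie itself, just a toy sketch of its core idea: write the same id to several stores and rebuild it from whichever one survives clearing (the 'uid' key is an arbitrary name):

    function writeEverywhere(id) {
        document.cookie = 'uid=' + id + '; max-age=31536000; path=/';
        localStorage.setItem('uid', id);
        sessionStorage.setItem('uid', id);
        window.name = id; // survives navigation within the same tab
    }

    function readAnywhere() {
        const fromCookie = (document.cookie.match(/(?:^|; )uid=([^;]*)/) || [])[1];
        return fromCookie || localStorage.getItem('uid') ||
               sessionStorage.getItem('uid') || window.name || null;
    }

    // If any single store still has the id, repopulate all the others from it.
    const existingId = readAnywhere();
    if (existingId) writeEverywhere(existingId);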
I can't remember the URL, but there is also a site which tells you how "anonymous" you are based on everything it can gather from your web browser: what plugins you have loaded, what versions, what language, your screen size, and so on. You can then leverage the plugins I mentioned earlier (Flash, Java, ...) to find out even more about the user. I'll edit this post when I find the page which showed you "how unique you are" (or maybe somebody here knows it). Actually, it looks as if every user is, in a way, unique!
--EDIT--
Found the page I was talking about: Panopticlick - "How Unique and trackable is your browser".
It collects stuff like User Agent, HTTP_ACCEPT headers, Browser Plugins, Time Zone, Screen Size and Depth, System Fonts (via Java?), Cookies...
My result: Your browser fingerprint appears to be unique among the 1,221,154 tested so far.
Panopticlick has quite a refined method for identifying unique users via fingerprinting. Apart from the IP address and user agent, it uses things like the time zone, screen resolution, fonts installed on the system, plugins installed in the browser, and so on, so it comes up with a very distinct ID for each user without storing anything on their computers. False negatives (two different users producing the exact same fingerprint) are very rare.
A problem with that approach is that it can yield some false positives, i.e. it considers the same user to be a new one if they've installed a new font, for example. Whether this is OK or not depends on your application, I suppose.
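A much-reduced sketch of that kind of fingerprinting in the browser (only a few of the signals, combined with a SHA-256 digest via the Web Crypto API; real fingerprinting uses far more inputs):

    async function fingerprint() {
        const signals = [
            navigator.userAgent,
            screen.width + 'x' + screen.height + 'x' + screen.colorDepth,
            new Date().getTimezoneOffset(), // time zone, in minutes from UTC
            navigator.language
        ].join('|');

        // Hash the combined signals into one quasi-unique id.
        // Note: crypto.subtle requires a secure (HTTPS) context.
        const bytes = new TextEncoder().encode(signals);
        const digest = await crypto.subtle.digest('SHA-256', bytes);
        return Array.from(new Uint8Array(digest))
            .map(b => b.toString(16).padStart(2, '0'))
            .join('');
    }

    fingerprint().then(id => console.log('visitor fingerprint:', id));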
Yes, it's impossible to tell anonymous visitors apart with 100% certainty. The best that you can do is to gather the information that you have, and try to tell as many visitors apart as you can.
There is one more piece of information that you can use:
The browser's user-agent string
It's not unique, but in combination with the other information it increases the resolution.
If you need to tell the visitors apart with 100% certainty, then you need to make them log in.
There is no sure-fire way to achieve this, in my view. Of your options, cookies are the most likely to yield a reasonably realistic number. NAT and proxy servers can mask the IP addresses of large numbers of users, and dynamic IP address allocation will confuse the results for a lot of others.
Have you considered using, e.g., Google Analytics or something similar? They do unique visitor tracking as part of their service, and they probably have a lot more money to throw at finding heuristic solutions to this problem than you or I. Just a thought!