I have an ASP.NET MVC project under development in VS 2013, running on localhost. A cookie is sent with the following header:
Set-Cookie: Tx=88440; expires=Sun, 03 Apr 2016 12:49:28 GMT; domain=localhost; path=/
Chrome receives this OK. But when I refresh the page in Chrome, the cookie Tx is not sent back to the server. Is there anything wrong with the cookie header? Does a path of / mean it will apply to all paths beneath /?
Chrome refuses cookies that explicitly set domain=localhost. Design decision apparently. (And yes, path=/ means the cookie is sent for every path on the site.)
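If that is the cause, the usual workaround is to drop the domain attribute entirely while developing on localhost, so the cookie becomes a host-only cookie that Chrome will accept and send back:

Set-Cookie: Tx=88440; expires=Sun, 03 Apr 2016 12:49:28 GMT; path=/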
I have a web application and two computers, both with Win7 and Chrome v56.0.2924.76.
The app needs several JS files, and when I request those resources, both machines get the same response headers:
Cache-Control:max-age=7200, must-revalidate
Connection:Keep-Alive
Content-Encoding:gzip
Content-Language:en-US
Content-Type:application/javascript
Date:Mon, 06 Mar 2017 09:40:11 GMT
Expires:Mon, 06 Mar 2017 11:40:11 GMT
Keep-Alive:timeout=5, max=496
Last-Modified:Wed, 22 Feb 2017 18:47:28 GMT
Transfer-Encoding:chunked
X-Powered-By:Servlet/3.0
On one desktop everything works fine and all the JS files come from the cache; on the other desktop most are fine, but some are still fetched from the remote server.
The difference: on the non-caching desktop the requests carry full request headers (Accept, Accept-Encoding, Cookie, etc.), while on the caching desktop the request headers panel only says
Provisional headers are shown
Since one of them works, the remote server must be fine and the problem is on the Chrome side, yet both run the same Chrome version. I want to know what else I can check.
Here is the root cause: the "not working" machine did not have the correct certificate installed, so its HTTPS connection was not secure and there was a "Not secure" warning before the URL. Chrome apparently does not cache content served over a connection it considers insecure, so those files were never cached. After I installed the correct certificate, it works fine now.
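If you want to confirm the certificate situation from the command line, curl prints the verification result during the TLS handshake (the URL is a placeholder for your app's address):

curl -v https://your-app.example.com/ -o /dev/null

An untrusted chain shows up as an "SSL certificate problem" error; adding -k makes curl ignore it, much like clicking through Chrome's warning.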
I'm setting the Cache-Control and Expires HTTP headers to allow caching of static resources. This works as expected on Chrome and Firefox. However, IE11 and Safari make a fresh request for the static resources every time. Here are the response headers:
Accept-Ranges:bytes
Cache-Control:max-age=31535999
Content-Length:186824
Content-Type:application/x-font-woff
Date:Thu, 21 Apr 2016 09:54:15 GMT
ETag:W/"186824-1461231024000"
Expires:Fri, 21 Apr 2017 09:54:15 GMT
Last-Modified:Thu, 21 Apr 2016 09:30:24 GMT
Server:Apache-Coyote/1.1
Do I need to set any special headers for IE and Safari? I'm using org.apache.catalina.filters.ExpiresFilter to set the response headers.
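For context, the filter is configured in web.xml roughly like this (a simplified sketch; the real type list and durations in my app are longer):

<filter>
    <filter-name>ExpiresFilter</filter-name>
    <filter-class>org.apache.catalina.filters.ExpiresFilter</filter-class>
    <!-- cache woff fonts for one year from the time of access -->
    <init-param>
        <param-name>ExpiresByType application/x-font-woff</param-name>
        <param-value>access plus 1 year</param-value>
    </init-param>
</filter>
<filter-mapping>
    <filter-name>ExpiresFilter</filter-name>
    <url-pattern>/*</url-pattern>
    <dispatcher>REQUEST</dispatcher>
</filter-mapping>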
Turns out this was because of the "Always refresh from server" option, which is turned on by default when the IE developer tools are open.
I'm developing a web site and I have set up the following structure in the production environment:
NGINX (listening on 443, proxy_pass) -> Varnish (listening on 80) -> NGINX (listening on 8080)
This lets me serve cached content via HTTPS.
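The front NGINX only terminates TLS and hands everything to Varnish; a minimal sketch of that server block (certificate paths and host names are placeholders):

server {
    listen 443 ssl;
    server_name www.example.com;
    ssl_certificate     /etc/nginx/ssl/example.crt;
    ssl_certificate_key /etc/nginx/ssl/example.key;

    location / {
        # pass decrypted traffic to Varnish on port 80
        proxy_pass http://127.0.0.1:80;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}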
The problem is that I have noticed that Google Chrome does not locally cache content served via SSL (Firefox does). When I point at port 80, both Firefox and Chrome cache the content, but Chrome does not when I try 443.
These are the headers I get for a test image:
Served by HTTP
Accept-Ranges:bytes
Age:28
Cache-Control:max-age=31536000
Content-Length:190667
Content-Type:image/jpeg
Date:Fri, 18 Dec 2015 12:06:49 GMT
ETag:"5673f678-2e8cb"
Expires:Sat, 17 Dec 2016 12:06:49 GMT
Last-Modified:Fri, 18 Dec 2015 12:05:12 GMT
Served by HTTPS
Accept-Ranges:bytes
Age:74
Cache-Control:max-age=31536000
Connection:keep-alive
Content-Length:190667
Content-Type:image/jpeg
Date:Fri, 18 Dec 2015 12:19:34 GMT
ETag:"5673f678-2e8cb"
Expires:Sat, 17 Dec 2016 12:06:49 GMT
Last-Modified:Fri, 18 Dec 2015 12:05:12 GMT
Does anyone know if this could be a config problem with my servers, or is it just Chrome's normal way of dealing with SSL connections?
Thanks
I finally found the problem.
I removed the Vary: Cookie header from the response headers, and now Chrome is locally caching the content served via HTTPS.
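If you can't change the backend that adds the header, the TLS-terminating NGINX can strip it instead; proxy_hide_header removes the named header from the proxied response before it reaches the browser. Note this drops the whole Vary header, including Vary: Accept-Encoding if present:

location / {
    proxy_pass http://127.0.0.1:80;
    # don't forward the backend's Vary header to clients
    proxy_hide_header Vary;
}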
Here is a very simple example to illustrate my question, using jQuery from a CDN to modify the page:
<html>
<body>
<p>Hello Dean!</p>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script>$("p").html("Hello, Gabe!")</script>
</body>
</html>
When you load this page with an internet connection, the page displays "Hello, Gabe!". When I then turn off the internet connection and reload, the page displays "Hello Dean!" with an error, because jQuery is not available.
My understanding is that CDNs send a long Cache-Control and Expires in the response headers, which I take to mean that the browser caches the file locally.
$ curl -s -D - https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js | head
HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Fri, 17 Apr 2015 16:30:33 GMT
Content-Type: application/javascript
Transfer-Encoding: chunked
Connection: keep-alive
Last-Modified: Thu, 18 Dec 2014 17:00:38 GMT
Expires: Wed, 06 Apr 2016 16:30:33 GMT
Cache-Control: public, max-age=30672000
But this does not appear to be happening. Can someone please explain what is going on? Also, how can I get the browser to use the cached copy of jQuery?
This question came up because we want to use CDNs to serve external libraries, but also want to be able to develop the page offline, like on an airplane.
I get similar behavior in both Chrome and Firefox.
This has nothing to do with the CDN. When the browser encounters a script tag, it requests the file from the server, whether it's hosted on a CDN or on your own server. If the browser has previously loaded it from the same address, the server tells it whether the file should be reloaded or not (by sending a 304 HTTP status code).
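You can watch that 304 mechanism with curl by replaying the Last-Modified value from the response above as If-Modified-Since; a server that honors conditional requests answers 304 with no body:

curl -s -o /dev/null -w "%{http_code}\n" \
  -H "If-Modified-Since: Thu, 18 Dec 2014 17:00:38 GMT" \
  https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js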
What you are probably looking for is to cache your application for offline use. That's possible with an HTML5 cache manifest file: you create a manifest listing all the files that need to be cached for explicit offline use, as sketched below.
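For the page above, the manifest could be as small as this (offline.appcache is an arbitrary name; it must be served with the text/cache-manifest MIME type, and the page references it with <html manifest="offline.appcache">):

CACHE MANIFEST
# v1 - resources that must be available offline
https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js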
Since the previous answer recommended using a cache manifest file, I just wanted to make people aware that this feature is being dropped from the web standards.
Info available from Mozilla:
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
It's recommended to use Service workers instead of the cache manifest.
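A minimal service worker for the jQuery example might look like this (sw.js is an assumed file name; service workers require HTTPS or localhost, and caching the cross-origin URL works here because cdnjs sends CORS headers):

// sw.js: cache the CDN script at install time,
// then answer fetches from the cache, falling back to the network.
self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open('offline-v1').then(function (cache) {
      return cache.addAll([
        'https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js'
      ]);
    })
  );
});

self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});

Register it from the page with navigator.serviceWorker.register('/sw.js').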
We have an ASP.NET website (https://website.example.com) which loads external libraries (CSS and JavaScript) from a different sub-domain (https://library.example.com).
The resources we load via the library project are only CSS files and JavaScript plugins, which themselves don't make any requests (via AJAX).
Testing the website in normal environments, everything works fine.
However, opening it in an Internet Explorer 8 browser returns an error:
internet explorer has modified this page to prevent cross site scripting
Could the fact that we are referencing external resources cause the error?
If so, what would be the solution to fix this problem?
After all, the vast majority of websites pull references from external domains (CDN servers, for example).
Here's one way: configure the X-XSS-Protection header on your server. Sending X-XSS-Protection: 0 tells IE to disable its XSS filter for your site. (Note that the capture below, taken from google.com, does the opposite: X-XSS-Protection: 1; mode=block enables the filter and tells IE to block the page when an attack is detected.)
It looks something like this:
GET / HTTP/1.1
HTTP/1.1 200 OK
Date: Wed, 01 Feb 2012 03:42:24 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
Set-Cookie: PREF=ID=6ddbc0a0342e7e63:FF=0:TM=1328067744:LM=1328067744:S=4d4farvCGl5Ww0C3; expires=Fri, 31-Jan-2014 03:42:24 GMT; path=/; domain=.google.com
Set-Cookie: NID=56=PgRwCKa8EltKnHS5clbFuhwyWsd3cPXiV1-iXzgyKsiy5RKXEKbg89gWWpjzYZjLPWTKrCWhOUhdInOlYU56LOb2W7XpC7uBnKAjMbxQSBw1UIprzw2BFK5dnaY7PRji; expires=Thu, 02-Aug-2012 03:42:24 GMT; path=/; domain=.google.com; HttpOnly
P3P: CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657 for more info."
Server: gws
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
Transfer-Encoding: chunked
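If the server is IIS/ASP.NET, as in the question, one way to emit the header is through custom headers in web.config; a minimal sketch (value 0 disables the filter, per the fix described above):

<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- 0 switches the IE XSS filter off for this site's responses -->
        <add name="X-XSS-Protection" value="0" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>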
Please read here for more details