Chrome - multiple requests - google-chrome

Whenever I send a GET request to my web app using Chrome, my Apache access log usually (not always; I can't reproduce it reliably, and it's not the favicon) shows two identical requests being sent to the server, although only one is shown in the Chrome dev tools. I deactivated all extensions and it's still happening.
Is this https://news.ycombinator.com/item?id=1872177 true? Is it a Chrome feature, or should I dig deeper within my app to find the bug?

I think it's even worse than that. My experience here (developing with Google App Engine) is that Chrome makes all kinds of extra requests.
This is possibly due to the option that is in the Settings, checked by default:
Predict network actions to improve page load performance
Here is a really weird example: my website's page runs a notification check every 15 seconds (done in JavaScript). Even after closing all tabs related to my website, I see requests coming from my IP: some random pages, but also the notification-check request. To me that means Chrome has a page of my website running in the background and is even evaluating its JavaScript.
When I request a page, I pretty much always get another request for one of the links on that page. It also requests the resources of those extra pages (.css, .js, .png files). Lots of requests going on.
I have seen the same behavior with the development server that runs locally.
Happens also from another computer / network.
Doesn't happen with Firefox.
Also, see What to do with chrome sending extra requests?

Related

Loading resources via URL is slow in Chrome - ONLY via https, ONLY from one specific website

I have this 1 MB file for testing purposes (though the issue applies to any file hosted at lptoronto.com): https://lptoronto.com/sandbox/test.png
It always loads instantly in Internet Explorer (both via http and https).
In Google Chrome (currently Version 63.0.3239.132 (Official Build) (64-bit) on Windows 10) the situation is as follows.
When fetched via http, this file loads instantly without an issue.
When fetched via https, the same file takes about 40 (!) seconds to load (with rare and irregular exceptions, when it occasionally loads fast via https too).
Chrome's network monitor shows that for all of those 40 seconds the image is being slowly but steadily downloaded at low speed, i.e. it is not stuck waiting for a server response or anything like that.
Here's the screencast showing IE and Chrome side-by-side loading the same image:
https://www.youtube.com/watch?v=M4cUuhG1YuM
From time to time the issue disappears for a few minutes or an hour, but then it reappears, without me doing anything on my side.
The same behavior is observed on at least one other computer - my colleague's (different ISP, different location).
Needless to say, I'm testing in a clean environment: cache cleared, extensions, firewall and antivirus disabled, connection verified and measured, etc.
No Chrome issue whatsoever with any other website, be it http or https.
The hosting provider has so far been unable to troubleshoot on their end, but they're still trying to help (it takes some time). They tried disabling mod_deflate, re-installing SSL, and disabling caching rules, but to no avail.
The same issue was observed once about 2 months ago. That time I asked the hosting provider to disable SSL completely, just to be able to work on my website content. When they re-enabled SSL in less than a day, the issue was gone; but now it has reappeared, and there is no clue as to what is going on.
The bottom line:
the issue appears only in Chrome, only with this site, only over https
changing only the browser solves the issue
changing only the protocol (https to http) solves the issue
I honestly tried to google anything similar, but failed.
I would appreciate it if you tried the link above in an incognito Chrome window and reported the load/refresh time; and of course any ideas are more than welcome.

Phishing Detected! warning in Chrome

I have encountered the "Phishing Detected" warning in the Chrome browser on my dev site. Interestingly, I don't encounter the same warning in Firefox or Safari, even though, as far as I can tell, they use the same phishing database (although Safari's preferences say "google safe browsing service is unavailable"). I also don't encounter the warning on the same page of the production sites.
It first popped up on a new account-verification page I created, which among other things asked users to confirm their PayPal account with the GetVerifiedStatus API. This requires only a name and email.
I have also encountered the warning on a configuration page which asks for the PayPal email address to which the user wishes to receive payments.
Neither page requests a password or any other data that would be considered a secret.
As you might gather, I have zeroed in on a potential false positive on the PayPal portion of the content, as if perhaps I were phishing for PayPal information beyond the payer's email address. There has been no malicious code injection or anything of the sort. Even when I've removed all content from the page, the warning is still present.
I reported the first incorrect detection to Google, and intend to do the same for the second incident, however what I really want to clear up is:
What content can lead to this warning?
How can I avoid it in the future?
How can I get some info from the "authorities" on which urls are blocked? (Webmaster Tools is not showing warnings for the dev site)
How can I flush my local cache of "bad sites" in case I want to re-test?
Clearly having a massive red alert presented to a user on a production site would be disastrous, and there is a (perhaps deliberate) lack of information about how this safe browsing service actually works.
I have been developing a website for a banking software developer and ran into the phishing warning as well. Unlike you, I had no PayPal associations in my code, and not even any data collection besides a simple contact form. Here are some things I managed to figure out while resolving my false-positive warnings.
1) The warning in Chrome (red gradient background) comes from a detection method built into the Chrome browser itself; it does not need to check any blacklist to show the warning. In fact, Google themselves claim this is one of the ways they discover new potentially harmful sites. When your site is actually on the blacklists, you get a different red warning screen, with diagonal lines in the background. This explains why you only see the warning in Chrome.
2) What actually triggers this warning is, unsurprisingly, kept somewhat hidden. I could not find anything to help me debug the content of my site. You have pretty much done this already, but for anybody else in need of help: I had to isolate parts of my site to see what was triggering the warnings. Due to the nature of the site I was working on, it turned out to be a combination of words and phrases in the content itself (e.g. Banking Solutions, Online Banking, Mobile Banking). Alone they did not trigger anything, but when loaded together, Chrome would do its thing. So I'm not sure what your triggers are, or even what the list of possible triggers is. Sorry...
3) I found that simply quitting Chrome completely and restarting it resets the "cache" of whether it has previously flagged a page. I closed Chrome hundreds of times while getting to the bottom of my warnings.
That's all I have; I hope it helps.
Update: My staging area was accessed via an IP address. Once I moved the site to a domain instead, all the warnings in Chrome stopped.
I experienced the same today while creating an SSL test report for my web server customers. The page contained simply something like this:
"Compare the SSL results of our server to the results of a well-known bank and its Internet banking service". I just wanted to show that the banking site had grade B whereas ours had grade A-.
I had two images from SSL Labs (one with the results for my server, the other with the results of the bank). No input fields, no links to any other site, and definitely no wording about the name of the bank.
One h1, two h2 titles and two paragraphs plus two images.
I moved the HTML to the page and opened it in my Chrome browser. The web server log told me that a Google service had loaded the page 20 seconds after my first preview. Nobody else had seen it by then. The phishing-site warning reached me (the webmaster) in less than an hour.
So it seems to me that the damn browser makes the decision and reports to Google, which then automatically checks and blocks the site. So the site is reported to Google by Google tools, the trial is run by Google, and the sentence is handed down by Google. Very, very nice indeed.

Chrome basic auth and digest auth issue

I'm having some issues with Chrome canceling some HTTP requests and I'm suspecting cached authentication data to be the cause. Let me first write down some important factors about the application I'm writing.
I was using Basic Authentication scheme for some time to guard several services and resources in my web app.
In the meantime I was using/testing the app heavily using Chrome with my main Google Account fully synced. Most frequently I was using my name - "lukasz" - as the username in Basic Auth.
Recently I have switched my application to use Digest Authentication.
Now, some of the HTTP requests I'm making fail with status=failed for no apparent reason. It only happens when I'm using the user "lukasz"; if I enter some other unique username, there is no problem.
I looked everywhere in the backend and frontend and I couldn't locate the issue to be in our code. I can easily reproduce this with user "lukasz" each time. So I reverted my code to Basic Auth (while not touching the rest of app) and the problem was gone.
That led me to think there was something wrong with cached passwords. So I cleared the cache in Chrome, but that didn't help. After several hours of analyzing the issue, I decided to make sure I was running a fresh instance of Chrome, so I reinstalled it (deleting the disk data along the way). TADAAA! The problem was gone and I couldn't reproduce it anymore.
Then I synchronized my Google Account with this newly installed Chrome, and after a short while the requests to my app started failing again!! So I took a deeper look (cleaning the profile data from disk and redoing all the steps), and indeed it looks like the problem starts as soon as my account is synced with the cloud!
Yes, I know it sounds dodgy. It sounds ridiculous. It sounds stupid. But I am almost sure that those two problems are somehow related (failing requests and account sync).
My idea is this: Chrome somehow remembered that I was using "lukasz/my-pass" with Basic Auth for certain services. After I switched to Digest Auth the same combination of credentials (lukasz/my-pass) is now acting funny. Perhaps under the hood Chrome still thinks that this is Basic Auth and cancels requests when it learns otherwise?
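For context, the credentials a browser caches for the two schemes are not interchangeable: Basic auth just sends base64 of "user:password", while a Digest response must hash the server's nonce, so a cached Basic credential replayed against a Digest endpoint cannot succeed. A minimal sketch of the Basic side (the username and password here are just the ones mentioned above, used as sample values):

```python
import base64

def basic_authorization(user: str, password: str) -> str:
    """Build the Authorization header a browser sends for Basic auth:
    base64 of "user:password" - no nonce, no hashing, so it is useless
    against a Digest challenge."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

header = basic_authorization("lukasz", "my-pass")
```

If Chrome were replaying a stored header like this against the new Digest-protected endpoint, the mismatch would be invisible in the app code itself, which fits the symptoms described.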
UPDATE:
I did some low-level debugging with chrome://net-internals/ and it appears that the problem occurs while reading a cache entry. This seems to confirm my initial assumption.
I did some investigation and found this article. Apparently, always adding a "Last-Modified" header to my HTTP response has solved the issue in Chrome (I'm still having some problems in FF, but that's off topic).
However, it still doesn't solve my issue entirely. Why were the requests failing in the first place?
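For reference, here is a minimal sketch (not the asker's actual backend; the helper name and mtime source are mine) of what "always adding a Last-Modified header" can look like, using Python's standard library to produce a correctly formatted HTTP date:

```python
from email.utils import formatdate

def add_last_modified(headers: dict, mtime: float) -> dict:
    """Return a copy of the response headers with an RFC 7231 HTTP-date
    Last-Modified field, giving the browser a stable cache validator."""
    out = dict(headers)
    out["Last-Modified"] = formatdate(mtime, usegmt=True)
    return out

# Hypothetical response for a resource last changed at the Unix epoch
headers = add_last_modified({"Content-Type": "text/html"}, 0)
print(headers["Last-Modified"])  # Thu, 01 Jan 1970 00:00:00 GMT
```

With a validator present, Chrome can issue a conditional If-Modified-Since request instead of mishandling the cache entry.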
You could try incognito mode and see what happens. It may give you some hints without your having to clear the cache or reinstall Chrome.
Also take a look at How to clear basic authentication details in chrome

Safari Chrome Randomly Timeout, Firefox Never

Thanks in advance.
Safari and Chrome are randomly timing out / freezing / hanging. This happens at random on random pages. I can click refresh about five times and then the pages load quickly. If I let a page keep trying to load, I get a "server is not responding" error. The pages always load very fast in Firefox. I typically see the problem after I am logged in to the backend, which involves sessions/cookies. Cookies are enabled in both browsers. IE also works, but is slower than Firefox. I am not sure what in the code is causing the problem.
Safari 6.0/
Chrome 21.0
Firefox 15.0
on Mac 10.8.1
I have also had problems with Chrome on Windows XP. A few times I lost my session and had to log in again on IE 6.
Try adding an EXPIRES header:
Web pages are becoming increasingly complex with more scripts, style
sheets, images, and Flash on them. A first-time visit to a page may
require several HTTP requests to load all the components. By using
Expires headers these components become cacheable, which avoids
unnecessary HTTP requests on subsequent page views. Expires headers
are most often associated with images, but they can and should be used
on all page components including scripts, style sheets, and Flash.
and configuring Entity Tags:
Entity tags (ETags) are a mechanism web servers and the browser use to
determine whether a component in the browser's cache matches one on
the origin server. Since ETags are typically constructed using
attributes that make them unique to a specific server hosting a site,
the tags will not match when a browser gets the original component
from one server and later tries to validate that component on a
different server.
I got the information above using YSlow on Chrome.
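To make the two YSlow suggestions above concrete, here is a small Python sketch (the function name and one-year horizon are my own choices, not from YSlow) that emits a far-future Expires header and a content-based ETag; hashing the body means the tag stays identical across servers, unlike Apache's default inode-based tags:

```python
import hashlib
from email.utils import formatdate

ONE_YEAR = 365 * 24 * 3600  # seconds

def cache_headers(body: bytes, now: float) -> dict:
    """Far-future Expires plus a content-based ETag.

    Because the tag depends only on the bytes served, two servers
    hosting the same file produce the same tag, so conditional
    requests validate correctly behind a load balancer."""
    return {
        "Expires": formatdate(now + ONE_YEAR, usegmt=True),
        "ETag": '"%s"' % hashlib.sha1(body).hexdigest(),
    }

h1 = cache_headers(b"<html>...</html>", now=0)
h2 = cache_headers(b"<html>...</html>", now=0)  # "another server", same content
assert h1["ETag"] == h2["ETag"]                 # tags match across servers
```

In practice you would configure this in the web server (e.g. Apache's mod_expires) rather than in application code, but the headers produced are the same.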

Application cache only when users want it?

Some browsers do not ask the user whether they want to use the application cache; they simply download the whole thing on the first visit (e.g. the browser on Android). That can cause trouble when the application cache is many MB and the client is on a mobile network - that is expensive! And is it even possible to stop the cache from being downloaded? Pressing the menu button on e.g. Android will not close the browser; it keeps running in the background.
Is it a good idea to only add the manifest based on a cookie that is set when the client pushes a "hey, I want to offline-cache this site" button? Will that cause any new challenges?
I've tested this, and it seems to work. It needs more testing though. I can provide a link to the site in about two weeks if anyone is interested.
One possible solution, as you hinted at yourself, would be to ask users whether they want to cache the app offline, and if so, only then redirect them to a page that links to the manifest file (which is subsequently downloaded by the browser). You cannot stop a download midway - unless, of course, your network disconnects midway.
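A sketch of the cookie-gated manifest idea in Python (the cookie name, its value, and the manifest path are made up for illustration): the manifest attribute is only emitted once the user has opted in, so first-time visitors never trigger the big download.

```python
def render_page(cookies: dict) -> str:
    """Attach the HTML5 cache manifest only for users who opted in
    via a hypothetical "offline" cookie."""
    opted_in = cookies.get("offline") == "1"
    manifest = ' manifest="offline.appcache"' if opted_in else ""
    return (
        '<!DOCTYPE html>\n'
        f'<html{manifest}>\n'
        '<head><title>My app</title></head>\n'
        '<body>...</body>\n'
        '</html>'
    )

assert 'manifest=' not in render_page({})            # first visit: no download
assert 'manifest=' in render_page({"offline": "1"})  # after opt-in
```

One caveat to test for: once a browser has cached the page under the manifest, removing the attribute later is not enough by itself; the manifest file must also be updated or return 404 for the browser to drop the cache.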