How to block all unnecessary incoming HTTP requests? - google-chrome

I have a hosted website, and loading the page takes time because of unnecessary HTTP requests. In the console, I can see many other websites exchanging data with my website. It is as if my website is sending GET requests to those other websites automatically, even though there is no piece of code requesting them.
I don't know what meetlookup.com is for, but I am still seeing it in the console.
Also, it is not just that: YouTube plays video in the background in every tab I open in Google Chrome. How can I block all those incoming requests?

Related

How can a website stream a video without a request appearing in the network tab of the developer tools?

Normally, whenever I try to stream a video, I can see the request when it comes through. In this case, though, you can see the requests for a couple of images and GIFs shown beforehand as an advert, but when the video itself plays there is no request visible. What is going on? This is true both with Tamper Data and with the Google Chrome developer tools.

Dynamic img (or video) tags don't load resources at all, HTTP requests are "pending"

I'm having some problems with the resources I try to load on my web application, using img or video HTML tags.
I use AngularJS for my app and dynamically set the src attribute of the img tags, using the ng-src="{{ src }}" directive.
There are not that many images and resources to load (the model I'm currently testing loads about 30 images or videos), but it seems the browsers (tested on Chrome and Firefox, on Windows and OS X) don't handle that many parallel requests correctly: the requests appear as "pending" in Chrome and show no status at all in Firefox.
At the moment I took these screenshots, there were no other requests running. I have this problem all the time, but the number of successful requests and pending requests changes almost every time I refresh the page.
The resource URIs are of course correct, and I can open the pictures in another tab with no problem.
In case it is relevant: even though it doesn't seem like the requests are ever sent by the browser, the resources are loaded from Azure blob storage, and its CORS options are correctly set.
The problem happens both when I test my app locally (with localhost as hostname) and when it is hosted on a web server.
Do you have any idea why this problem is happening and how I can fix it?
Thanks!
I finally found what caused the problem!
I had several videos loading via a video tag with the preload="metadata" attribute. That caused many connections to open, and even after the preloading finished, it seemed to prevent the requests for the other resources from being sent. I think it is a browser bug; it should not be the desired behavior.
With preload="none" on the video tags, I don't have any problem like that.
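For illustration, a minimal sketch of that fix in plain DOM code (rather than the original AngularJS template; the function and names here are placeholders):

    // Sketch: create video elements with preload="none" so the browser does
    // not open a connection per video just to fetch metadata up front.
    function appendVideo(container: HTMLElement, src: string): HTMLVideoElement {
      const video = document.createElement("video");
      video.preload = "none"; // was "metadata", which tied up the parallel connections
      video.src = src;
      video.controls = true;
      container.appendChild(video);
      return video;
    }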

Automatic login to website only works when session is made

I'm making an offline webpage that automatically logs into an online website.
The website uses SSL (https), and to log in it uses a form (POST variables).
The problem I'm encountering is the following:
The site only accepts my offline form when I open the online login page first.
This is because the website uses (server-side) sessions, which are created when the first page is opened. (The purpose of the session is to detect a time-out.)
When I first open the online website and then run my offline page everything works fine.
So I need to make my offline webpage open the online website before posting the form automatically.
I tried this with an iframe, but that doesn't work in Internet Explorer, as it is an https website. (It does work in Chrome, Firefox, ...)
I was wondering if Ajax could send an https page request before posting my form, but I guess not, as it is https.
Does anybody know a method to send an https page request just like the browser does, but without showing its output? Afterward I can automatically submit my form.
Thanks in advance!
Internet Explorer treats iframes from other domains as third-party content and uses a separate set of security policies for them. The security zone settings are also in effect between file:/// ("local machine"/"offline") webpages and "internet"/"online" webpages. Cookies are usually blocked for third-party content (depending on your settings), which means that the unique session key set in the cookies won't be saved. Without this key, the site you are trying to log in to will "forget" your session/login.
Cross-domain AJAX requests are also affected by security zones and cross-domain policies, but the settings may differ between IE versions.
There are ways around the limitations using P3P policies, if you control the target web page; the question "Cookie blocked/not saved in IFRAME in Internet Explorer" shows how. But if you do own the web page, it would be better to enable or implement your own "remember me" feature.
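If you do control the target page, sending the P3P policy is just one extra response header. A minimal sketch, assuming an Express server (the compact-policy value is a commonly cited example, not a statement about your real privacy practices):

    import express from "express";

    const app = express();

    // Send a P3P compact policy with every response so that IE accepts the
    // session cookie even in a third-party (iframe) context. The CP value
    // below is only an illustrative example.
    app.use((_req, res, next) => {
      res.setHeader("P3P", 'CP="CAO PSA OUR"');
      next();
    });

    app.listen(3000);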
In your case, depending on whether you are the only one using your offline autologin webpage, allowing third-party cookies in Internet Explorer may help. See Options > Privacy > Allow all cookies (or a similar setting). Note that this will also allow others (mostly ad companies) to track you all over the internet.
If the purpose for your autologin page is testing, rather than actually using the browser as a human being, perhaps you can automate both logging in and testing?
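For example, the login flow can be scripted end to end with a browser-automation tool. A minimal sketch using Playwright (my choice, not mentioned above; the URL and selectors are placeholders):

    import { chromium } from "playwright";

    (async () => {
      const browser = await chromium.launch();
      const page = await browser.newPage();

      // Visiting the login page first creates the server-side session,
      // exactly as a human-driven browser would.
      await page.goto("https://example.com/login");

      // Placeholder selectors: adjust to the real form fields.
      await page.fill("#username", "myUser");
      await page.fill("#password", "myPassword");
      await page.click("button[type=submit]");

      // ...run whatever logged-in checks you need here...

      await browser.close();
    })();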

HTTP header to detect a preload request by Google Chrome

Google Chrome 17 introduced a new feature which preloads a webpage to improve rendering speed upon actually making the request (hitting enter in the omnibar).
A few questions:
Is there an HTTP header to detect such a request on the server side, and if one exists, what is the proper response to prevent such preloading (to avoid unintended requests which might have unwanted side effects)?
Does Google Chrome check the robots.txt before making preload requests?
Is there a robots.txt setting which targets only this specific behaviour? (I suppose/hope Disallow already works.)
Is there a meta tag to inform Google Chrome to never preload again on the current domain?
When Firefox pre-fetches content (at the behest of the referrer page’s markup), it sends the following header with the request: X-moz: prefetch
Safari does similarly, using X-Purpose: preview. According to this ticket, Chrome does, too.
For pre-rendering, Chrome does not send any header whatsoever to the client. Instead, one must use the Page Visibility API in JS.
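A minimal client-side sketch of that approach, deferring side effects until the page is actually visible (names are illustrative):

    // Sketch: run side effects (analytics, counters, logins) only when the
    // page is actually visible, so a Chrome prerender does not trigger them.
    function whenVisible(callback: () => void): void {
      if (document.visibilityState === "visible") {
        callback();
        return;
      }
      const handler = (): void => {
        if (document.visibilityState === "visible") {
          document.removeEventListener("visibilitychange", handler);
          callback();
        }
      };
      document.addEventListener("visibilitychange", handler);
    }

    whenVisible(() => {
      // e.g. record the page view only for a real visit
    });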
Chrome stopped sending the X-Purpose header in 2011, and the developers stated that they won't fix it there: https://code.google.com/p/chromium/issues/detail?id=86175.
They re-introduced sending Purpose: prefetch headers with all NoState Prefetch requests in 2018, as stated by the last comment on the same issue: https://bugs.chromium.org/p/chromium/issues/detail?id=86175#c65
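Putting those headers together, a server-side check could look like the sketch below (Express assumed; the header list is best-effort, since browsers have changed this behaviour over the years):

    import express from "express";

    const app = express();

    // Best-effort detection of prefetch/prerender requests based on the
    // headers discussed above; absence of a header means "probably a real
    // navigation".
    function isPrefetch(req: express.Request): boolean {
      return (
        req.header("Purpose") === "prefetch" ||  // Chrome NoState Prefetch (2018+)
        req.header("X-Purpose") === "preview" || // Safari (and pre-2011 Chrome)
        req.header("X-moz") === "prefetch"       // Firefox link prefetching
      );
    }

    app.get("/expensive-action", (req, res) => {
      if (isPrefetch(req)) {
        res.status(204).end(); // refuse side effects for speculative requests
        return;
      }
      res.send("Action performed");
    });

    app.listen(3000);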

Hosting Facebook iframes on pages on Cloudfront

I've switched my Facebook page to pull in an iframe, following Facebook's recent announcement that they support iframes in pages. Since you need to host the iframe page outside of Facebook, I figured it would be nice to host the files (an HTML page, a CSS stylesheet, and a JPG image) on CloudFront. Unfortunately, despite setting the permissions on the CloudFront files to 744, the iframe page loads correctly in a browser, but when it is called from Facebook I get an error message.
When I host the same files on my Media Temple server, the iframe on the actual Facebook page also loads correctly.
Is there a reason why Facebook and CloudFront don't play together? I haven't been able to find one so far.
Unfortunately, Facebook loads the iframe page using an HTTP POST, not an HTTP GET, and that is not compatible with S3, since Amazon's REST interface properly reserves POST for uploading/modifying content.
Best,
David Bullock
I ran into this problem today and it caused some head-scratching. As David Bullock points out, the problem is that Facebook loads the HTML page via a POST request, but S3 (and thus, by extension, CloudFront) won't serve resources in response to one (it returns 405 Method Not Allowed).
You can host your CSS, scripts and images on S3 / CloudFront, but the initial HTML page has to be on some other server. If you're concerned about load or latency from across the globe then you could try implementing a tiny redirector that forwards the Facebook POST request to the CloudFront-cached location (you'll have to return 303 See Other to do this redirection so the browser makes a GET request instead: see RFC 2616).
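Such a redirector can be very small. A sketch, assuming Express (the CloudFront URL and route are placeholders):

    import express from "express";

    const app = express();

    // Facebook POSTs to this endpoint; answering with 303 See Other makes
    // the browser re-request the CloudFront-hosted page with a GET.
    app.post("/facebook-tab", (_req, res) => {
      res.redirect(303, "https://dxxxxxxxxxxxx.cloudfront.net/tab.html"); // placeholder
    });

    app.listen(3000);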
There is a shockingly easy workaround for the fact that AWS rejects POST requests and the fact that Facebook requires page tabs to be hosted via HTTPS: just redirect the request through https://bit.ly/….
Yes, it's really that easy.
Host the HTML page wherever you like. HTTP is fine.
Create a bit.ly-shortened URL to the page.
Use it (substituting https:// for http://) as the "Secure Page Tab URL" as you create your Facebook Page Tab.
Activate the tab using this highly-undocumented dialog URL: https://www.facebook.com/dialog/pagetab?app_id=app_id&redirect_uri=bitly_url
Boom: done.
OK, it can be done, but you need to host the images on CloudFront and the rest of the content on S3. Amazon provides a set of clear instructions on how to do this. Issue solved.
Use CloudFront to trap the 405 error and serve your desired index page as the "Custom Error Response" page.
Updated:
This was downvoted; however, the following is going to help lots of Facebook developers. The final issue that we had with hosting a Facebook application on S3 was with CORS. We fixed the 405s by using a "Custom Error Response"; see this link for details:
http://blog.celingest.com/en/2014/10/02/tutorial-using-cors-with-cloudfront-and-s3/
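For reference, a sketch of the 405 "Custom Error Response" trick as infrastructure code, assuming the AWS CDK v2 (the bucket, IDs and paths are placeholders; the CORS part is covered by the linked tutorial):

    import * as cdk from "aws-cdk-lib";
    import * as cloudfront from "aws-cdk-lib/aws-cloudfront";
    import * as origins from "aws-cdk-lib/aws-cloudfront-origins";
    import * as s3 from "aws-cdk-lib/aws-s3";

    const app = new cdk.App();
    const stack = new cdk.Stack(app, "FacebookTabStack");

    // Bucket holding the page tab's HTML, CSS and images.
    const bucket = new s3.Bucket(stack, "TabBucket");

    new cloudfront.Distribution(stack, "TabDistribution", {
      defaultBehavior: { origin: new origins.S3Origin(bucket) },
      errorResponses: [
        {
          // S3 answers Facebook's POST with 405 Method Not Allowed;
          // serve the tab's index page instead so the iframe still renders.
          httpStatus: 405,
          responseHttpStatus: 200,
          responsePagePath: "/index.html",
        },
      ],
    });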