Getting 429 from IPFS backend - ipfs

I am relatively new to dapps and IPFS. I was trying to fetch an IPFS asset and I am getting 429 responses for my requests. This is somehow related to my IP, as I am able to fetch successfully from my friend's IP; I guess my IP has somehow been rate-limited.
Is there any way I can get whitelisted for IPFS? Does it get solved if I host my own IPFS node?
<html>
<head>
<title>429 Too Many Requests</title>
</head>
<body>
<center>
<h1>429 Too Many Requests</h1>
</center>
<hr>
<center>openresty</center>
</body>
</html>
<!-- a padding to disable MSIE and Chrome friendly error page -->
The response headers also don't return anything useful:
access-control-allow-headers: X-Requested-With, Range, Content-Range, X-Chunked-Output, X-Stream-Output
access-control-allow-methods: GET, POST, OPTIONS
access-control-allow-origin: *
access-control-expose-headers: Content-Range, X-Chunked-Output, X-Stream-Output
content-length: 568
content-type: text/html
date: Mon, 02 Aug 2021 05:51:56 GMT
server: openresty
strict-transport-security: max-age=31536000; includeSubDomains; preload
x-ipfs-pop: gateway-bank3-sg1
Please help!

If you're hitting a gateway throttle, it sounds like you're making requests from one centralised location (as you mentioned, mostly from your own IP). If possible, decentralise that access: have the clients your users are using (for a website, that'd be the web page itself) talk to the gateway directly, so the HTTP requests come from their PCs rather than all from one server.
However, you're also on the right track when you ask:
Does it get solved if I host my own IPFS node?
Yes, it does: at that point you're actually using IPFS instead of just an HTTP gateway, and your node will pull the data from several other nodes rather than from that one gateway node.
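If you do run your own node (for example with ipfs daemon), it also exposes a local gateway, so existing HTTP fetch code can simply point at it instead of the public gateway. A minimal PHP sketch, assuming the default local gateway port 8080 and a placeholder CID:
<?php
// Fetch an asset through a local IPFS node's gateway instead of a
// public one. Assumes `ipfs daemon` is running and listening on the
// default local gateway port 8080; $cid is a placeholder for a real CID.
$cid = 'QmYourAssetCidHere';
$url = 'http://127.0.0.1:8080/ipfs/' . $cid;

$data = file_get_contents($url);
if ($data === false) {
    die('Fetch failed - is the IPFS daemon running?');
}
file_put_contents('asset.bin', $data);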

Related

Website works fine with HTTP but not with HTTPS #laravel

I am trying to figure out why my subdomain website works fine over HTTP but not over HTTPS.
Is there an SSL certificate problem?
You have a bunch of scripts that are linked with http. Browsers won't load these on a secure page. You need to change the links to https:
<script src="http://bloodsuvidha.vampy.in/js/bootstrap.min.js" type="text/javascript"></script>
This is also true with stylesheets:
<link rel="stylesheet" type="text/css" href="http://bloodsuvidha.vampy.in/css/bootstrap.min.css">
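For example, the same tags with secure URLs (assuming those assets are also reachable over HTTPS):
<script src="https://bloodsuvidha.vampy.in/js/bootstrap.min.js" type="text/javascript"></script>
<link rel="stylesheet" type="text/css" href="https://bloodsuvidha.vampy.in/css/bootstrap.min.css">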
The answers given so far are correct, but none solved my problem.
The main problem was that my application was behind CloudFlare.
Laravel detects whether a request is secure by checking a server variable:
$_SERVER['REQUEST_SCHEME']
which remains "http" even when the request is HTTPS, because CloudFlare terminates the TLS connection.
CloudFlare sets a different header for this:
$_SERVER['HTTP_X_FORWARDED_PROTO']
which is what must be checked to determine whether a request is secure.
Following this article, and making some changes to it, I successfully managed to generate HTTPS URLs without changing any previous application code.
Credit: Cybersupernova (Stack Overflow user).
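In Laravel itself this is typically wired in by trusting the proxy, but the underlying check is simple. A plain-PHP sketch of the idea, assuming CloudFlare's standard X-Forwarded-Proto header:
<?php
// Behind CloudFlare, REQUEST_SCHEME stays "http" because the proxy
// terminates TLS, so check the X-Forwarded-Proto header it sets instead.
function isSecureRequest(): bool
{
    if (isset($_SERVER['HTTP_X_FORWARDED_PROTO'])) {
        return strtolower($_SERVER['HTTP_X_FORWARDED_PROTO']) === 'https';
    }
    return ($_SERVER['REQUEST_SCHEME'] ?? 'http') === 'https';
}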

Prevent Browser File Caching

I need to disable caching for single files in all browsers.
I have a website that generates small video clips. There is a preview stage where the results can be watched.
An mp4 called preview.mp4 is displayed. When the user goes back, edits the video, and wants to preview it again, the old preview.mp4 is displayed even though the file on the server has changed.
How can I prevent this video file from being cached? Or what other ways are there to fix it?
Note: it's a single-page application, so I don't reload any HTML files, only PHP content. Hence the headers I set are not useful in this scenario:
<meta http-equiv="X-UA-Compatible" content="IE=Edge"/>
<meta http-equiv="cache-control" content="no-store" />
Thanks.
It's not the web page itself you should be concerned about, but the mp4 file, which is downloaded and cached separately.
Ensure the response headers of the mp4 file prevent browser caching.
Cache-Control: no-cache
Hence the headers I set, are not useful in this scenario:
<meta http-equiv="cache-control" content="no-store" />
The problem here is that those are not headers.
They are HTML elements (part of the HTML document, which is the body of the HTTP response) that attempt to be equivalent to HTTP headers … and fail.
You need to set real HTTP headers, and you need to send them with the video file (rather than the HTML document).
The specifics of how you do that will depend on the HTTP server you use.
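For example, since the question mentions the content is served through PHP, one option is to stream the clip through a PHP script that attaches the real header. A sketch, where serve-preview.php and the file location are assumptions:
<?php
// serve-preview.php - a sketch; the clip's path is an assumption.
// Streaming the file through PHP lets us attach real HTTP headers.
$path = __DIR__ . '/preview.mp4';
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($path));
header('Cache-Control: no-cache'); // force revalidation on every request
readfile($path);
With no-cache the browser may still store the file, but it must revalidate with the server before reusing it, so an updated clip is picked up.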
Further reading:
Caching in the HTTP specification
Caching Tutorial for Web Authors and Webmasters
Caching Guide in the Apache HTTPD manual
How to Modify the Cache-Control HTTP Header When You Use IIS
Try adding a cache key:
preview.mp4?cachekey=randNum
where randNum can be a timestamp, or you can use a random number generator to produce it.
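For instance, if the PHP side renders the player, it can append the file's modification time, so the URL changes exactly when the clip changes. A sketch, reusing preview.mp4 from the question:
<?php
// Cache-busting: the query string changes whenever the file changes,
// so the browser treats the updated clip as a brand-new resource.
$src = 'preview.mp4?cachekey=' . filemtime(__DIR__ . '/preview.mp4');
echo '<video src="' . htmlspecialchars($src) . '" controls></video>';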

CSP upgrade-insecure-requests not upgrading iframe requests

I have an iframe with an insecure source, http://example1.com, inside my site (https://example.com), and I use the CSP policy <meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests"> to get rid of the Mixed Content error. This works fine, as the insecure request gets upgraded to https://example1.com. But example1.com redirects the request to another domain that is insecure again, http://example2.com, and that is blocked by the CSP as mixed content loading in a secure site. I need to fix this and tried a few ways of whitelisting both domains so they would not be blocked, using <meta http-equiv="Content-Security-Policy" content="child-src 'self' http://example1.com http://example2.com">.
I also read the CSP specification to learn more, but couldn't find a solution. Has anyone faced the same issue and found a fix?

HTTP header "expires" does not render the page from cache

I'm not really familiar with all the meta tags that let you manage the cache client-side, so I tried to build a simple example with the HTTP header "expires".
With the following code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<meta http-equiv="expires" content="mon, 18 Jul 2016 1:00:00 GMT" />
<title>MY TITLE</title>
</head>
<body>
MY BODY
</body>
</html>
When I load the page for the first time (cache cleared beforehand), the page is saved in the cache. But when I update my body to "MY BODY 2" and reload the page, the page displays "MY BODY 2". The browser was supposed to take the page from the cache (with "MY BODY"), since the expiry date is in July 2016, no?
Thank you for helping me shed some light on this problem.
It depends on how you reload the page.
You basically have three options:
Browse to another page and then back. This should use the cache.
Press F5 or the reload button. This is an explicit reload, so the browser will check with the server whether there's a new version - even if the page is cached - and download it if there is.
Force reload (Ctrl+F5 in some browsers). This says ignore the cache and download from scratch (even if the cached version appears to be the same as what server will send you).
I suspect you've done option 2 and didn't realise this would check with the server, assuming it would use the cache if still valid. The reason it actually checks with the server is that a reload is often done when the user suspects the content has changed, or wants to re-download it (e.g. if the page didn't render properly).
It should also be noted that meta headers in HTML are not as good as HTTP response headers set by the server, for various reasons including browser support.
Finally, it's worth opening developer tools (F12 in Chrome, for example) and checking the Network tab to see what's going on. If you do, make sure "Disable cache" isn't ticked while the tools are open (many developers keep it ticked, since they don't want to use a cache while developing).
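If you do control the server side (PHP, for example), sending the real headers instead of the meta tag looks like this sketch (the one-week lifetime is an arbitrary example value):
<?php
// Send real HTTP caching headers before any output. gmdate() produces
// the RFC 1123 date format that the Expires header requires.
$lifetime = 7 * 24 * 60 * 60; // one week, an arbitrary choice
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
header('Cache-Control: max-age=' . $lifetime);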

Specifying Etag or Expiry date in HTML

I have a pure HTML website and do not have access to the IIS server; it's a basic site. Whenever I check the site's performance in testing tools (Pingdom Tools, GTmetrix, Google Insights, etc.),
they always say "Leverage browser caching", and this adversely affects my performance score.
I did a lot of research on setting expiry dates for the CSS, JS, images, HTML, etc., but everything shows how to do it in IIS. I am using pure HTML with no Apache and no IIS; it's a basic Windows hosting provider.
Can anyone tell me the steps to set expiry headers for these resources from the HTML itself?
If the pages (of whatever type/extension) are static (not dynamic like PHP, ASP, etc.), the caching mechanism should be pretty automatic. The web server is supposed to add Last-Modified or ETag headers for you, and the browser (or "user agent") is supposed to understand these.
You can check whether these headers are present with a tool such as Fiddler2 (on Windows).
If they are not present, you would have to use an HTTP-equivalent META tag, like this:
<meta http-equiv="last-modified" content="Sun, 27 Jan 2012 11:52:12 GMT" />
Use meta tags to set HTTP headers in the HTML:
<meta http-equiv="foo" content="bar" />
http://en.wikipedia.org/wiki/Meta_element#HTTP_message_headers