I have a website staged. Although everything seems to be displaying properly, opening developer tools shows I am getting two 404 errors (see pictures). I want to fix this error but I don't have much information. Is there some way to find out which specific element is causing the error? How should I go about debugging this?
Look in the developer tools for a tab called Network.
From there, simply check the HTTP status code for each of your resources.
This is a capture of the Firefox debugger; it works very similarly in other browsers' dev tools.
Get into the developer tools and look for the Network tab.
Here you can find all the resource and API calls, along with their HTTP status, headers, URL, request, response, etc.
Just check the Network tab for the specific file that is failing.
Check the URL, HTTP code, and response, and you can trace what the issue is and solve it.
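If you'd rather script the check than click through the Network tab, here is a minimal stdlib-only Python sketch (the staging URL is a placeholder) that fetches the page, collects every src/href attribute, and reports resources answering 404:

from html.parser import HTMLParser
from urllib.error import HTTPError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class ResourceCollector(HTMLParser):
    """Collect every src/href attribute seen in the page."""
    def __init__(self):
        super().__init__()
        self.urls = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                self.urls.add(value)

page = "https://staging.example.com/"  # placeholder URL
html = urlopen(page).read().decode("utf-8", errors="replace")
collector = ResourceCollector()
collector.feed(html)

for url in sorted(collector.urls):
    full = urljoin(page, url)
    if not full.startswith(("http://", "https://")):
        continue  # skip mailto:, javascript:, etc.
    try:
        status = urlopen(Request(full, method="HEAD")).status
    except HTTPError as err:
        status = err.code
    if status == 404:
        print("404:", full)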
Let me know what exactly the issue is so that I can help better.
Here are some of the tools you can check to fix 404 pages:
Raventools
Dead Link Checker
Google Analytics
W3C Link Checker
Built-in browser diagnostics are also helpful.
I'm unable to understand the meaning of this error which I found in my console in the Edge browser:
Error with Permissions-Policy header: Origin trial controlled feature not enabled: 'interest-cohort'.
Your guidance is much appreciated.
I can't see any errors in my HTML file, and I don't use Google analytics.
This has to do with FLoC, or Federated Learning of Cohorts, an old experiment that Google ran to try to do away with third-party cookies. Some code in your website is blocking FLoC from running. Unblocking it to get rid of the warning is fairly straightforward, but you have to decide whether you want to unblock FLoC or leave it as is. You will only see this issue in Chromium-based browsers. What type of website are you running? Drupal? WordPress? Which versions?
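For context, the warning is usually triggered by a response header (often added by the site itself, a plugin, or a CDN) that opts the site out of FLoC, along these lines:

Permissions-Policy: interest-cohort=()

Once you find where that header is set in your stack, you can decide whether to keep it or drop it.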
So I have been testing a website using Selenium, specifically a page with a credit card form embedded in an iframe. I want to access the content of said iframe but due to CORS I get the error message:
Uncaught DOMException: Blocked a frame with origin "<url>" from accessing a cross-origin frame.
I made a quick Google search and realized you can bypass CORS using the --disable-web-security flag, so my code now looks like the following:
import os
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--disable-web-security")  # ask Chrome to skip CORS checks
self.driver = webdriver.Chrome(os.getenv("CHROME_DRIVER"), options=options)
Surprisingly, the CORS exception keeps popping up, and I'm currently stuck on how to go on from here. I really do have to access the iframe's content; there isn't a workaround for this.
Since I was confused as to why this didn't work, I replicated the problem with another website, in this case Amazon, which functions similarly (credit card form embedded in an iframe). I ran the code with web security enabled and I get the same CORS error, as expected. But then I disable web security, exactly as mentioned before, and it works! I can now access the iframe.
I also downgraded from the current most stable version of Chrome (88) to an older one (86), and again nothing changed. I'm using Ubuntu 20.04.
So now I'm wondering - why isn't the flag working for the first scenario I mentioned? Is there a chance the first website is forcing the browser's web security or something related? I'm not an expert in web development so any input on this would be valuable.
Turns out, all I needed was to add --disable-site-isolation-trials and a non-empty --user-data-dir along with --disable-web-security, as stated here.
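Putting those flags together with the snippet from the question, the setup might look like this sketch (the profile path is only an example; the flag just needs a non-empty value):

import os
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--disable-web-security")
options.add_argument("--disable-site-isolation-trials")
options.add_argument("--user-data-dir=/tmp/chrome-profile")  # example path; must not be empty
driver = webdriver.Chrome(os.getenv("CHROME_DRIVER"), options=options)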
For me, adding these flags to my Chrome launcher fixed my CORS issues:
--disable-web-security --user-data-dir=~/chromeTemp
Please note, this only turns the warnings off; it is definitely not a CORS solution.
We have a react front-end that is communicating with an ASP Core API.
Sometimes we detect there is something wrong with the front-end, including service workers, local cache, and that kind of stuff, so we want to tell the client to clean it up.
I've implemented the Clear-Site-Data (dev-moz) (w3c) response header with the values "cache", "cookies", "storage", and "executionContexts".
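On the wire, that corresponds to a response header like this (per the spec, each directive is quoted and the list is comma-separated):

Clear-Site-Data: "cache", "cookies", "storage", "executionContexts"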
When testing this out in Firefox, it works, and in the console I'm seeing:
Clear-Site-Data header found. Unknown value “"executionContexts"”. SignIn
Clear-Site-Data header forced the clean up of “cache” data. SignIn
Clear-Site-Data header forced the clean up of “cookies” data. SignIn
Clear-Site-Data header forced the clean up of “storage” data.
When doing the same in Chrome, it's not working, and I'm seeing the message:
The request's credentials mode prohibits modifying cookies and other local data.
I'm trying to figure out what to do to fix it, but there are barely any references: just 7 results, mostly from browser integration test logs.
All the documentation says that this should be implemented and working in Chrome... Any idea what's the problem?
Try manually reloading the page after the Clear-Site-Data response has been received (so that the local data / cache is cleared and the header no longer contains Clear-Site-Data).
Neither Firefox nor Chrome appears to support executionContexts, which tells the browser to reload the original response.
If the header contains executionContexts, the browser should ignore that value (as you see in the Firefox console). However, you can try the wildcard mapping (*), which will also cover any future directives:
Response.AppendHeader("Clear-Site-Data", "\"*\"");
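which yields the response header:

Clear-Site-Data: "*"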
Google also reuses parts of its Chrome source code in the open-source Chromium project. If you take a look at the Chromium source code (https://doss-gitlab.eidos.ic.i.u-tokyo.ac.jp/sneeze/chromium/blob/9b22da4739ec7bf54fb8e730662e2ab7996532e0/content/browser/browsing_data/clear_site_data_handler.cc, line 308), it implements the same exception: The request's credentials mode prohibits modifying cookies. A LOAD_DO_NOT_SAVE_COOKIES flag is somehow being sent. The console error may be caused by an additional response header, or there is a small chance it's a bug in Chrome.
Right now, my advice would be do not implement the Clear-Site-Data header at this time.
Despite the spec being widely available for some years, vendor support is still hit-and-miss and is now actually going in reverse.
As per the W3C GitHub for this, there are a number of open issues regarding executionContexts. The wildcard ('*') mentioned by Greg in their answer is not supported by Chrome, and Mozilla is about to remove the cache value as well.
All this points to a flawed standard which is likely to be removed at some point in the future.
I'm getting this Chrome error page when trying to POST and then GET a simple form.
The problem is that the Developer Console shows nothing about this and I cannot find the source of the problem by myself.
Is there any option for looking at this in more detail? I would like to view the piece of code triggering the error so I can fix it...
The simple way to bypass this error during development is to send a header to the browser.
Set the header before sending any data to the browser.
In PHP you can send this header to bypass the error (send header reference):
header('X-XSS-Protection:0');
In ASP.NET you can send this header (send header reference):
HttpContext.Response.AddHeader("X-XSS-Protection","0");
or
HttpContext.Current.Response.AddHeader("X-XSS-Protection","0");
In Node.js, send the header like this (send header reference):
res.writeHead(200, { 'X-XSS-Protection': '0' });
// or with Express
res.set('X-XSS-Protection', '0');
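If your backend happens to be Python, the same idea (sketched here with Flask; the route and app are hypothetical) would be:

from flask import Flask, make_response

app = Flask(__name__)

@app.route("/form")  # hypothetical route
def form():
    resp = make_response("...")
    resp.headers["X-XSS-Protection"] = "0"  # tell the browser to skip the XSS auditor
    return resp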
Chrome v58 might or might not fix your issue... It really depends on what you're actually POSTing. For example, if you're trying to POST some raw HTML/XML data within an input/select/textarea element, your request might still be blocked by the auditor.
In the past few days I hit this issue in two different scenarios: a WYSIWYG client-side editor and an interactive upload form featuring some kind of content preview. I managed to fix them both by base64-encoding the raw HTML before POSTing it, then decoding it on the receiving PHP page. This will most likely fix the issue and, most importantly, increase the developer's awareness of the data coming in from POST requests, hopefully pushing them into adopting effective encoding/decoding strategies and strengthening their web application against XSS-type attacks.
To base64-encode your content on the client side you can either use the native btoa() function, which is supported by most browsers nowadays, or a third-party alternative such as a jQuery plugin (I ended up using this, which worked ok).
To base64-decode the POST data you can then use PHP's base64_decode(str) function, ASP.NET's Convert.FromBase64String(str) or anything else (depending on your server-side scenario).
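As a quick illustration of the round-trip (shown in Python; btoa() on the client and base64_decode() in PHP behave the same way):

import base64

raw_html = "<b>user content</b>"  # hypothetical payload the auditor would otherwise block
encoded = base64.b64encode(raw_html.encode("utf-8")).decode("ascii")  # client side (btoa)
decoded = base64.b64decode(encoded).decode("utf-8")                   # server side (base64_decode)
assert decoded == raw_html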
For further info, check out this blog post that I wrote on the topic.
In this case, being a first-time contributor at the Creative forums (some kind of vBulletin construct) and reduced to posting a PM to the moderators before getting forum access, it was easy to recognize the nature of the issue from the more popular answers above.
The command was
http://forums.creative.com/private.php?do=insertpm&pmid=
And as described above the actual data was "raw HTML/XML data within an input/select/textarea element".
The general requirement for handling such a bug (or feature) at the user end is some kind of quick tweak or workaround. This post discusses the options of clearing the cache, resetting Chrome settings, creating a new user, or retrying the operation with a new beta release.
It was also suggested to launch a new instance with the following:
google-chrome-stable --disable-xss-auditor
The launch actually worked in this W10 1703 Chrome 061 edition after this modified version:
chrome --disable-xss-auditor
However, on logging back in to the site and attempting the post again, the same error was generated. Perhaps the syntax wants refining or something else is awry.
It then seemed reasonable to launch Edge and repost from there, which turned out to be no problem at all.
This may help in some circumstances. Modify the Apache httpd.conf file (mod_headers must be enabled) and add:
Header set X-XSS-Protection 0
It may have been fixed in Version 58.0.3029.110 (64-bit).
I've noticed that if there is an apostrophe ' in the text Chrome will block it.
When I updated the href from javascript:void(0) to # on the page making the POST request, it worked.
For example:
<a href="javascript:void(0)">login</a>
Change it to:
<a href="#">login</a>
I solved the problem!
In my case, when I submitted the form, I sent HTML to the action, and the model had a property that accepted the HTML via [AllowHtml].
The solution consists of removing this [AllowHtml] attribute, and everything goes OK!
Obviously I no longer send the HTML to the action, because in my case I do not need it.
It is a Chrome bug. The only remedy is to use Firefox until they fix it. The XSS auditor trashing a page that has worked fine for 20 years seems to be a symptom, not a cause.
I've got a client that sees the "Page can not be displayed" (nothing else) whenever they perform a certain action in their website. I don't get the error, ever. I've tried IE, FF, Chrome, and I do not see the error. The client sees the error on IE.
The error occurs when they press a form submit button that has only hidden fields.
I'm thinking this could be some kind of anti-malware / antivirus issue. Has anyone ever dealt with this?
In IE, go to the "Advanced" section of "Internet Options" and uncheck "Show friendly HTTP errors". This should give you the real error.
Is this an IE message? Ask them to switch off "friendly error messages" (or whatever the option is called in the English version) somewhere deep in IE's options. This will make IE display the error message your server is sending instead of its own unhelpful message.
Also, I've heard that IE can be forced to show server-provided error messages if the page is long/large enough (the commonly cited threshold is around 512 bytes), so you might want to pad your error pages. This information is old enough that it might only have affected older versions of IE. I usually get to the root of problems by eliminating the friendly error messages.
Note: I'm neither running IE nor Windows, and can therefore only operate from memory regarding the names of IE6's config options...
Update: corrected the suggestion about providing longer error messages... Perhaps somebody with access to IE can confirm whether longer error pages still force IE to display the original error page instead of the user-friendly (sic) one.
It would be useful to figure out which error code is returned. Is it 404 (Not Found) or 403 (Forbidden)? There are a few more, but in any case, it would help you figure out the cause of the problem.
If your client is running IE, ask him to disable friendly error messages in the advanced options.
Check their "hosts file". The location of this file is different for XP and vista
in XP I believe it's C:\windows\hosts or C:\windows\system32\hosts
Look for any suspicious domains. Generally speaking, there should only be a couple of definitions (besides comments) in the file, defining localhost and other local IP mappings. If there's anything else, make sure it's supposed to be there.
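For reference, a clean hosts file usually contains little more than:

127.0.0.1       localhost

Anything beyond entries like that deserves a closer look.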
Otherwise, maybe the site's just having issues? Also, AFAIK, FF never displays "Page cannot be displayed", so are you sure this is the case in all browsers?
You can try using ieHTTPHeaders to see what is going on behind the scenes.
Do you have any events applied to your submit button? Are you doing a custom submit button that is a hyperlink with an href like "javascript:void(0)" and an event attached that submits the form?
Although this is a 2008 thread, maybe someone is still using Windows XP in VirtualBox in 2018 like me.
The issue I met in 2018 was:
1. Pinging 8.8.8.8 gets correct responses.
2. HTTP sites work fine, but HTTPS sites do not.
3. I cannot connect to any site over HTTPS, so I cannot download Chrome or Firefox.
My solution was to enable TLS 1.0 for secure connections (under Internet Options > Advanced > Security).
After that, everything was fine.