net::ERR_INSECURE_RESPONSE when loading geojson file - google-chrome

I'm looking for some help figuring out why this issue occurs and how to fix it. I'm trying to create an interactive map (replicating this). I have a .geojson file "capecchi.github.io/projects/AirQuality/static_temp.geojson" with all the data, but opening williamcapecchi.com/AirQuality/AQmap_static.html causes the error "net::ERR_INSECURE_RESPONSE" (when using Chrome; it seems to work fine in IE). I've read other answers related to this error, but being new to development, I'm not sure what is causing it in my case. After reading this thread I checked the Security tab in the developer tools, but instead of finding an explanation of the certificate problem, I only saw this:
Any advice would be greatly appreciated!
Bill
*edit: apparently I can't add images to questions yet, so I'll explain: when I check the Security tab it simply tells me that "This page is not secure." On the left, I see "Main Origin" as http://williamcapecchi.com and "Non-Secure Origins" as http://d3js.org. Clicking on either of these tells me "Your connection with this origin is not secure."

For anyone else struggling with this, my solution was to use http:// addresses instead of https://. Then everything worked fine!
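To see the asker's fix in code: a tiny helper (the name is illustrative, not from the question) that rewrites a resource URL onto the page's own scheme, so an http:// page never requests https:// resources from a host whose certificate Chrome rejects:

```javascript
// Keep every resource URL on the same scheme as the page itself, so an
// http page never pulls https resources (whose certificate may be invalid)
// and an https page never triggers mixed-content blocking.
function sameSchemeUrl(resourceUrl, pageProtocol) {
  const u = new URL(resourceUrl);
  u.protocol = pageProtocol; // e.g. 'http:' or 'https:' (window.location.protocol)
  return u.toString();
}
```

The more robust long-term fix is the reverse direction: serve the page itself over HTTPS and load every resource over HTTPS from hosts with valid certificates.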

Related

Chrome ERR_EMPTY_RESPONSE

I have a problem with Chrome, which doesn't display some of my static files (CSS and JS mainly). It says that an empty response has been sent by the server,
but when I try to curl the same URL, it works.
I don't really know what's wrong here. I tried disabling the cache, clearing every type of cache, incognito mode, disabling Chrome extensions... none of those worked.
thanks
update
If I try to access the same URL with a random parameter in the query string (let's say ?id=1), it works, but only once! When I refresh the page, I get the same error, and so on...
I finally found the problem:
I had something wrong in my .htaccess file. It used mod_deflate, which wasn't enabled in my Apache configuration, and somehow this prevented my local website from working properly. I just enabled it in /etc/apache/http.conf and that was it.
If you get the same problem, I would recommend commenting out all your .htaccess lines and then uncommenting them one by one to see which one causes the problem.
Hope this helps someone else.
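For reference, the missing piece in a case like this is usually the LoadModule line for the module in the main Apache config. The path and module location below are illustrative and vary by distribution:

```apache
# In the main Apache config (e.g. /etc/apache/http.conf on this setup):
# mod_deflate must be loaded, or any .htaccess directive that uses it
# (e.g. AddOutputFilterByType DEFLATE text/css application/javascript)
# will misbehave.
LoadModule deflate_module modules/mod_deflate.so
```

After changing the config, run `apachectl configtest` before restarting to catch syntax errors.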

Unable to create a Collection in Firestore - request blocked:mixed-content

I've been trying to create a new collection at https://console.cloud.google.com/firestore/data?project={PROJECT}; however, when I do, nothing happens. Opening the dev tools and looking at the Network tab shows the requests are not going through (see screenshot). I'm using Chrome 83.0.4103.97 on OSX 10.15.4 (19E287). I have tried going to Site Settings and changing unsafe content to Allow; however, that did not make a difference.
Anybody got any idea how I could make it work?
@Alice and @Frank resolved it; however, I think this is worth writing up as an answer.
In such a situation, the first thing to check is whether an incognito window behaves the same way. There might be some extension that is blocking the requests.
In this case it was the "Disconnect" extension doing the blocking.
I hope it will help someone in the future!

Chrome: ERR_BLOCKED_BY_XSS_AUDITOR details

I'm getting this Chrome flag when trying to POST and then GET a simple form.
The problem is that the developer console shows nothing about this, and I cannot find the source of the problem by myself.
Is there any option for looking at this in more detail?
Viewing the piece of code triggering the error would help in fixing it...
A simple way to bypass this error during development is to send a header to the browser.
Send the header before any data is written to the response.
In PHP you can send this header to bypass the error:
header('X-XSS-Protection: 0');
In ASP.NET you can send it with either of:
HttpContext.Response.AddHeader("X-XSS-Protection", "0");
or
HttpContext.Current.Response.AddHeader("X-XSS-Protection", "0");
In Node.js:
res.writeHead(200, { 'X-XSS-Protection': '0' });
// or with Express
res.set('X-XSS-Protection', '0');
Chrome v58 might or might not fix your issue... it really depends on what you're actually POSTing. For example, if you're trying to POST some raw HTML/XML data within an input/select/textarea element, your request might still be blocked by the auditor.
In the past few days I hit this issue in two different scenarios: a WYSIWYG client-side editor and an interactive upload form featuring some kind of content preview. I managed to fix them both by base64-encoding the raw HTML before POSTing it, then decoding it on the receiving PHP page. This will most likely fix the issue and, most importantly, increase the developer's awareness of the data coming in from POST requests, hopefully pushing them toward adopting effective encoding/decoding strategies and hardening their web application against XSS-type attacks.
To base64-encode your content on the client side you can either use the native btoa() function, which is supported by most browsers nowadays, or a third-party alternative such as a jQuery plugin (I ended up using this, which worked ok).
To base64-decode the POST data you can then use PHP's base64_decode(str) function, ASP.NET's Convert.FromBase64String(str) or anything else (depending on your server-side scenario).
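As a concrete sketch of that round trip, shown here in Node.js so both directions fit in one snippet (in the browser you would use btoa() instead, and note that btoa() alone cannot handle characters outside Latin-1; the helper names are illustrative):

```javascript
// Encode raw HTML before POSTing so the XSS auditor never sees markup
// in the request body; the server decodes it back before processing.
function encodeForPost(html) {
  return Buffer.from(html, 'utf8').toString('base64');
}

function decodeFromPost(b64) {
  return Buffer.from(b64, 'base64').toString('utf8');
}
```

On the PHP side, the equivalent of decodeFromPost would be base64_decode() applied to the posted field.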
For further info, check out this blog post that I wrote on the topic.
In this case, as a first-time contributor at the Creative forums (some kind of vBulletin construct), reduced to sending a PM to the moderators before getting forum access, it was easy to recognize the nature of the issue from the more popular answers above.
The command was
http://forums.creative.com/private.php?do=insertpm&pmid=
And as described above the actual data was "raw HTML/XML data within an input/select/textarea element".
The general requirement for handling such a bug (or feature) at the user end is some kind of quick fix or tweak. This post discusses the options of clearing the cache, resetting Chrome settings, creating a new user, or retrying the operation with a new beta release.
It was also suggested to launch a new instance with the following:
google-chrome-stable --disable-xss-auditor
The launch actually worked on this W10 1703 Chrome 61 edition after this modified version:
chrome --disable-xss-auditor
However, on logging back in to the site and attempting the post again, the same error was generated. Perhaps the syntax wants refining, or something else is awry.
It then seemed reasonable to launch Edge and repost from there, which turned out to be no problem at all.
This may help in some circumstances: modify the Apache httpd.conf file (with mod_headers enabled) and add
Header set X-XSS-Protection 0
It may have been fixed in Version 58.0.3029.110 (64-bit).
I've noticed that if there is an apostrophe ' in the text, Chrome will block it.
When I updated the href from javascript:void(0) to # on the page making the POST request, it worked.
For example:
<a href="javascript:void(0)">login</a>
Change to:
<a href="#">login</a>
I solved the problem!
In my case, on submit I sent HTML to the action, and the model had a property that accepted HTML via [AllowHtml].
The solution consisted of removing this [AllowHtml] attribute, and everything went OK!
Obviously I no longer send the HTML to the action, because in my case I do not need it.
It is a Chrome bug. The only remedy is to use Firefox until they fix it. The XSS auditor trashing a page that has worked fine for 20 years seems to be a symptom, not a cause.

'Script error' fix does not work for Chrome

The fix for the cryptic JS Script error (resulting from errors in a cross-origin script) is well documented. I've implemented the solution and it now works for Firefox, but it does not for Chrome. Has anyone else encountered this problem or know what might be going wrong? I did look through this post, where they identify it as a bug that was fixed back in 2013, so I'm not sure what gives.
I ran into this issue, which might not be your issue, but in case someone else comes here with the same thing:
Adding the CORS headers to the HTTP responses for the linked script is not enough. I overlooked this, but adding crossorigin="anonymous" (or crossorigin="use-credentials"; see here on MDN) to the <script> tag is also necessary.
It makes sense: in order to allow sharing of source-code info between two domains, one would want permissions to be given on both sides. The page requesting the external script and the source of the script both have to be okay with showing debug info.
It's already in the "The Fix" link in the original post, but I lost some time to it, so thought it was worth reiterating.
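To make the two required pieces concrete, here is a hedged sketch (the hostnames are placeholders): the page opts in with the crossorigin attribute, and the script's host must answer with a matching CORS header.

```html
<!-- Page side: request the script in CORS mode so errors carry full detail -->
<script src="https://cdn.example.com/app.js" crossorigin="anonymous"></script>

<!-- Server side: cdn.example.com must respond with a header such as -->
<!-- Access-Control-Allow-Origin: * -->
```

With only one of the two in place, Chrome still reports the opaque "Script error." message.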
I ran into the same issue, and I don't know if this helps you (probably not since it's been over a year), but for any future readers: send your files over HTTPS, not just HTTP. It seems that Chrome is stricter than Firefox and won't allow this 'fix' to work over regular HTTP.
(And if you're using webpack, be sure to have devtool: "source-map" set, as opposed to eval-source-map.)

This webpage has a redirect loop (ERR_TOO_MANY_REDIRECTS)

We have a site that is not working in Google Chrome v44. It works well in IE and Firefox. All of a sudden, after updating the Chrome browser to v44, we have been unable to log in to the system and just receive this error.
We're trying to figure out why this is happening. We have 2 instances of our system on our server. Our live site is the one that is not working in Chrome v44, while the other, our demo site, is fine. The only difference between these sites is that the live one has SSL. So our first impression is that there's a problem between Chrome v44 and our site's certificate.
I think Chrome can't establish secure connection with the site.
Has anyone experienced this issue?
Please help. Thanks.
This is due to a bug in Chrome v44, which incorrectly sends an HTTPS: 1 header with every request, even over plain HTTP (PHP surfaces it as $_SERVER['HTTP_HTTPS']); applications that check that value believe the connection is already secure and redirect accordingly, producing the loop. It has been quite widely reported:
http://www.zdnet.com/article/brand-new-chrome-44-release-added-a-bug/
https://ma.ttias.be/chrome-44-sending-https-header-by-mistake-breaking-web-applications-everywhere/
In order to stop this, in PHP, I added the following to the very top of my index.php file:
<?php
if (!isset($_SERVER['HTTPS'])) {
    $_SERVER['HTTP_HTTPS'] = 0; // neutralize the bogus header Chrome 44 sends
}
?>
ensuring there is no whitespace between the closing ?> and the next line of output, so nothing is sent to the browser before the headers.
I've recently had the Chrome redirect loop on Gmail.
Possibly significantly, I was doing some work involving changing my system time, and it hasn't worked since. This guide helped to do that.
There is an available workaround, which is to use Gmail in incognito mode; that does still work, although it requires you to log in each time.
In that case I would say this is an internal problem with your organization's setup. I would speak with your sysadmin or IT staff. But just to be sure, use your phone carrier's internet or a nearby cafe, basically something off your network, to check whether you can reproduce the error.
The issue with my MVC solution was that I had recently updated all the NuGet packages in the solution. After the update I forgot to update the
<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
<dependentAssembly>
section with the new DLL bindings installed by the update. Because of a connection-string issue on my hosting server, I was not overwriting the current .config file. Once I did update the assemblyBinding section in the .config file, the issue was gone.
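For context, such a binding-redirect section in Web.config looks roughly like this; the assembly name and version numbers below are purely illustrative, not taken from the answer:

```xml
<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
  <dependentAssembly>
    <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
    <!-- redirect every older version to the one actually installed -->
    <bindingRedirect oldVersion="0.0.0.0-13.0.0.0" newVersion="13.0.0.0" />
  </dependentAssembly>
</assemblyBinding>
```

If the redirects in the deployed .config don't match the DLL versions on disk, requests can fail in ways that surface as browser-side errors like this one.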
There might be many reasons for the redirect loop. If you are confident your setup is done properly, without any errors, then it might be an issue with your browser. You can try the following:
Deleting cache and cookies
Correcting your system time (if it is not set to automatic)
Resetting the browser
You should be able to fix this problem by clearing the cookies in your browser:
Open your Chrome browser.
Type "chrome://settings/clearBrowserData" in the address bar and press Enter.
Make sure you are clearing items from the beginning of time, then select Cookies and other site data and click the Clear browsing data button.
If that doesn't help, this tutorial found via a Google search may be useful: https://windows10freeapps.com/fix-err_too_many_redirects-error-google-chrome-browser