I get a weird certificate issue when I try to set up OpenShift/OKD behind the Cloudflare DNS proxy. My setup needs to work like this:
Client --> Cloudflare DNS Proxy --> pfSense HAProxy TCP load balancer --> OpenShift/OKD
This works perfectly fine for any of my own services and routes in OpenShift that use TLS mode edge.
However, I want to expose the default web console on a subdomain. The web console is deployed with TLS mode reencrypt; this is done by an operator. TLS mode reencrypt does not seem to work. I have tried to deploy another service with TLS mode reencrypt - it does not work either!
I am getting error messages in different browsers, e.g. SSL_ERROR_NO_CYPHER_OVERLAP in Firefox.
What I have tried so far:
a) Change the TLS mode to edge for the web console => does not work, since the operator immediately changes it back.
b) Use different certificates between Cloudflare and the origin servers => does not change the behavior; I also doubt this is the issue, since TLS mode edge does work, so the connection itself is fine.
I have no other ideas for debugging or fixing this problem.
What are some ways to handle this and make my setup work?
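One way to narrow this down is to check which certificate and cipher each hop actually presents for the console's hostname, since TLS routes are matched by SNI and a failed match can surface in the browser as SSL_ERROR_NO_CYPHER_OVERLAP. Here is a minimal probe sketch using Node's built-in tls module; the IP and hostname are placeholders for your setup:
// probe-tls.ts - inspect the certificate and cipher offered for a given SNI name
import * as tls from "node:tls";

const socket = tls.connect(
  {
    host: "203.0.113.10", // placeholder: the HAProxy (or origin) address
    port: 443,
    servername: "console.apps.example.com", // placeholder: the console hostname, sent as SNI
    rejectUnauthorized: false, // inspect only - do not fail on self-signed certs
  },
  () => {
    const cert = socket.getPeerCertificate();
    console.log("subject:", cert.subject);
    console.log("issuer:", cert.issuer);
    console.log("cipher:", socket.getCipher());
    socket.end();
  }
);

socket.on("error", (err) => console.error("handshake failed:", err.message));
Running this once against the HAProxy address and once against the router directly should show where the wrong certificate (or a failed handshake) first appears.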
I'm a front-end developer working on an application where the login response puts a session cookie on the client. Later requests are then authorized, since the user is "logged in".
Starting from Chrome 80
All cookies without a SameSite attribute will be treated as if they had SameSite=Lax specified. In other words, they will be restricted to first-party only (server and client on the same domain).
If you need third-party cookies (server and client on different domains), then they must be marked with SameSite=None.
Restricted to first-party by default
Set-Cookie: cname=cvalue; SameSite=Lax
Allowed in third-party contexts
Set-Cookie: cname=cvalue; SameSite=None; Secure
For my application, I want the default behavior. My client and server run on the same domain in production, but in development I'm working from localhost (a different domain).
Up until now, Chrome had a special flag under chrome://flags: SameSite by default cookies. I could disable this flag on my development machine and the login passed. In production, I didn't need the flag because I wanted the default behavior.
Starting from Chrome 91
The SameSite by default cookies flag was removed. This means that from this version on, I can't log in to my app without deploying it to production.
Does anybody know how I can get the session cookie while working from localhost, but still keep the security of SameSite=Lax? If possible with client-only changes, but if needed also with server changes.
Chrome DevTools - SameSite error message
Chrome 80 flags menu - these flags were removed in Chrome 91
Update
I tried to solve this by making the server use SameSite=None (development only).
This causes a different error: Connection isn't secure. This is because when using SameSite=None you are required to also add the Secure attribute and, of course, use an HTTPS connection.
A secure connection has its own problems, like having to pay for a certificate in development.
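For reference, here is roughly what that server-side attempt looks like, sketched as a hypothetical Express login handler (res.cookie is Express's cookie API; the names and port are illustrative):
// Hypothetical Express login endpoint switching SameSite by environment.
import express from "express";

const app = express();
const isProd = process.env.NODE_ENV === "production";

app.post("/login", (_req, res) => {
  res.cookie("session", "opaque-token", {
    httpOnly: true,
    sameSite: isProd ? "lax" : "none", // "none" is the development-only attempt
    secure: true, // SameSite=None requires Secure, which in turn requires HTTPS -
                  // this is exactly what triggers the problem above in plain-HTTP dev
  });
  res.sendStatus(204);
});

app.listen(3000);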
Workaround: Downgrade Chrome
This is not a solution! It's just a temporary workaround for anybody who, like me, got their work halted by this update.
Uninstall Chrome
Go to "Add or remove programs" and uninstall Chrome. Note that user data like cookies and saved browser passwords may be lost.
Download Chrome v90 from slimjet.com, or from any other site, then install it.
Prevent Chrome from auto-updating, per this StackOverflow solution: open C:\Program Files (x86)\Google\Update
and rename the file GoogleUpdate.exe to GoogleUpdate2.exe.
This will cause Chrome not to find the update package.
Update the flags - open Chrome and go to chrome://flags,
search for #same-site-by-default-cookies, and Disable the flag.
I have found a way to fix it, and I'm sharing it with everyone :-)
This description appears in the DevTools Issues section:
Specify SameSite=None and Secure if the cookie should be sent in
cross-site requests. This enables third-party use.
In Developer Tools, go to the Application tab, and on the left side open Cookies:
For the cookie that you want to share with other domains, tick the Secure
checkbox and set SameSite to None. Reload the local site tab and the
cookie will now be sent to the other domain as well.
I hope this brightens your day
As of Chrome v107 (Nov 2022)
I had a similar issue, spent a few hours digging, and what I found is that the only solution for Chrome is to make your front-end connection secure, i.e. HTTPS (using a proxy, for instance): Link
An alternative solution is to use Firefox and set about:config > network.cookie.sameSite.noneRequiresSecure=false. This allows SameSite=None; Secure=false.
In our case, we are able to also run our server locally on a different port and point our client app to that localhost address for development purposes.
For example, I have the client app running on localhost:1234 and sending requests to a local copy of the server running on localhost:5678. This ensures that cookies are set successfully since the client and server are now "SameSite".
Admittedly, this is perhaps more of a workaround than a solution, but I hope it helps in the short term.
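A minimal sketch of that setup, assuming an Express server with the cors middleware (the ports follow the example above):
// Hypothetical local API on :5678 serving a client app running on :1234.
// SameSite compares registrable domains and ignores ports, so these two
// are different *origins* (CORS still applies) but the same *site*
// (a Lax cookie is sent normally).
import express from "express";
import cors from "cors";

const app = express();
app.use(cors({ origin: "http://localhost:1234", credentials: true }));

app.post("/login", (_req, res) => {
  res.cookie("session", "dev-token", { httpOnly: true, sameSite: "lax" });
  res.sendStatus(204);
});

app.listen(5678);

// Client side: fetch("http://localhost:5678/login", { method: "POST", credentials: "include" });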
If you want to perform "unsafe" CORS requests (i.e. POST/PUT/DELETE requests), you will need to modify Tomcat's conf/context.xml file to set sameSiteCookies to "none" instead of "lax".
...
<!-- default samesite cookies configuration, for CORS set sameSiteCookies to "none" and configure bundle for HTTPS -->
<CookieProcessor sameSiteCookies="none" />
...
You can set the SameSite attribute manually to "None" and tick "Secure" inside DevTools for development.
That way you don't have to modify your production environment (the cookies stay SameSite=Lax).
I can't intercept requests made by Chrome version 73.0.3683.86 to my localhost site.
The local site is running on IIS at http://127.0.0.3:80.
The Burp proxy listener is the default one, on 127.0.0.1:8080.
The interception rules are the default ones as well.
In my LAN settings, "Bypass proxy server for local addresses" is not enabled
When interception is turned ON and I reload the page in Chrome, no request is "caught" by Burp; my local site loads, and only the external requests are intercepted, such as loading external scripts from a CDN.
Also, under "Proxy" > "HTTP History" there are only requests to external sites; the requests to http://127.0.0.3:80 are not recorded.
When I reload the same page in Internet Explorer 11, the initial GET request is intercepted by Burp, as expected. "Proxy" > "HTTP History" also shows all the requests to the local site http://127.0.0.3:80.
What is the problem with Chrome? Thanks!
Found the solution late yesterday. I am using the Chrome extension ProxySwitchy, but it doesn't matter whether you use that or the system proxy configuration; the solution works the same way.
You can solve this problem by adding an entry to the /etc/hosts file like below:
127.0.0.1 localhost
127.0.0.1 somehostname
Now Burp will intercept requests to somehostname. Browse to http://somehostname instead of the raw loopback address, so Chrome's implicit proxy-bypass rules for local addresses no longer match.
Which version of Chrome are you using?
Have you tried using the FoxyProxy Chrome extension?
As a workaround, you could modify the hosts file on your machine.
I experienced the same issue when I upgraded from Opera 58.0 to 60.0. I think this is Chrome-related, because I've also experienced it in all other Chrome-based browsers. Opera 58 uses Chrome 71.0.3578.98; Opera 60 uses Chrome 73.0.3683.103. Something definitely changed in Chrome between these versions to cause this problem.
You have to subtract the implicit bypass rules defined in Chrome (https://chromium.googlesource.com/chromium/src/+/master/net/docs/proxy.md#Implicit-bypass-rules)
Requests to certain hosts will not be sent through a proxy, and will instead be sent directly.
We call these the implicit bypass rules. The implicit bypass rules match URLs whose host portion is either a localhost name or a link-local IP literal. Essentially it matches:
localhost
*.localhost
[::1]
127.0.0.1/8
169.254/16
[FE80::]/10
https://chromium.googlesource.com/chromium/src/+/master/net/docs/proxy.md#Bypass-rule_Subtract-implicit-rules
Whereas regular bypass rules instruct the browser about URLs that
should not use the proxy, Subtract Implicit Rules has the opposite
effect and tells the browser to instead use the proxy.
In order to be able to proxy through the loopback interface, you have to add the entry
<-loopback>
to the list of hosts for which you don't want to use a proxy. It is a bit confusing, indeed.
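For example, assuming Burp is listening on 127.0.0.1:8080, the same rule can be applied with Chrome's standard command-line switches:
chrome --proxy-server="127.0.0.1:8080" --proxy-bypass-list="<-loopback>"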
Make sure you haven't enabled the SOCKS proxy option. It happened to me too, and I found the solution when I disabled the SOCKS proxy option. Just make sure it's disabled!
I'm debugging a local site.
I'm getting the following message in Chrome.
Your connection is not private
Attackers might be trying to steal your information from t.buyamerica.com (for example, passwords, messages, or credit cards). Learn more
NET::ERR_CERT_COMMON_NAME_INVALID
This is not new, and normally I just click ADVANCED and Proceed...
but lately it gets stuck in a loop and displays the error message again.
This is a local site, therefore the key pair is indeed invalid, but is there a way to bypass this issue without installing proper HTTPS for all my local (Vagrant-based) servers?
NOTE:
The current bypass for me is to use the same domain as the original site: the local site is www.somesite.com, while the actual site is somesite.com.
I solved this issue as follows:
In System Preferences -> Network -> WiFi -> Advanced -> Proxies, I saw that Secure Http Proxy (HTTPS) was checked and the value for the proxy was localhost:8888.
I unchecked Secure Http Proxy (HTTPS), and that seems to have solved the issue.
NOTE: this is a Mac-specific issue, apparently caused by a system upgrade (my current version is 10.10.5 (14F2511) Yosemite, MacBook Air (13-inch, Mid 2012)).
I never set up a proxy server or ran any proxy on localhost:8888.
Change your local domain to something like http://yourdomain.test.
Don't forget the 'http'. And if you're using .dev, change it to .test: the .dev TLD is on the HSTS preload list, so Chrome forces HTTPS for it, whereas .test is reserved for local use.
Is there any reason why a file would load over HTTP but not over HTTPS?
I am curious because I just enabled SSL on a subdomain and it does not seem to be working properly. I can see the green lock, but if I load the site over HTTPS, I see no files.
For example, I have a file at
http://site.exmpl.org/file.html
but when I go to
https://site.exmpl.org/file.html
it does not load.
I have SSL enabled, because I can see the green lock. I am also using Cloudflare, if that helps.
I assume that you may have your SSL mode configured to "Full" in the Cloudflare Crypto section, but lack an SSL certificate installed on your subdomain.
If not:
You may not have SNI or a dedicated IP set up for your website. In that case your Apache server is likely not serving your certificate, but the certificate of whoever first set up SSL on that server. This is a common problem in shared web hosting environments. You can contact your provider and ask for help getting SNI properly configured, or acquire a dedicated IP from your provider.
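One way to check which certificate the origin actually presents for the subdomain is an SNI-aware handshake test, for instance with openssl (the IP below is a placeholder for your origin server, so the test bypasses Cloudflare):
openssl s_client -connect 203.0.113.10:443 -servername site.exmpl.org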
In the Cloudflare dashboard, under the SSL/TLS section, go to Edge Certificates instead of Overview.
Under Edge Certificates there is an option called "Always Use HTTPS", which explains: "Redirect all requests with scheme "http" to "https". This applies to all http requests to the zone." Just turn it on, and after a while you are good to go.
I'm using Advanced REST Client to test an external API which requires me to specify
Connection: Keep-Alive. The connection fails (NO RESPONSE), and inspecting the Chrome console I noticed Refused to set unsafe header "Connection", followed by net::ERR_INSECURE_RESPONSE.
Are there any Chrome settings that allow me to override this? BTW, the API works when I use external tools like Apigee. I've tried a Chrome CORS extension (Allow-Control-Allow-Origin), but was still unsuccessful.
The issue is that Chrome is refusing to load a resource that has an invalid or expired SSL certificate. Even if you could get it to bypass that, it would be a bad idea, as it would make man-in-the-middle attacks on your application easier.
My suggestion would be (if you trust the server, or if it's running locally) to import that certificate into your store so it's trusted in your development environment. If the cert is expired and it's hosted locally, look at the documentation on how to change the certificate or add a self-signed one (which you would then also add to your trusted certificates).
How to add a self-signed cert to your store:
For Mac
For Windows
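For example, assuming the certificate is saved as cert.pem, the stock OS tools can import it into the trusted root store:
certutil -addstore -f ROOT cert.pem (Windows, from an elevated prompt)
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain cert.pem (Mac)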
You'll have to restart Chrome for it to see the certs in the store after doing this.
Again, be sure you trust these certs' origin, as they'll be treated as if a legitimate CA had issued them.