I have an iframe and, understandably, I can't edit the elements inside it.
Is there a workaround for this, like using some sort of proxy? I would need it to work for any website and have things like sessions and cookie data persist (for logins etc.).
Is it at all possible?
Nup, and you can't use CORS because it is only supported by XHR.
Same Origin Policy is going to stop you, and rightfully so.
If you could proxy the site through your own domain, protocol and port, it'd work, but that is often quite difficult.
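To illustrate the idea, here is a rough sketch of such a same-origin proxy in Python using Flask and requests (both are my assumptions, not anything the question prescribes). Serving the remote site from your own origin is what satisfies the Same Origin Policy, and relaying cookies in both directions is what keeps sessions and logins working:

    # Hypothetical sketch: proxy a remote site through your own origin.
    import requests
    from flask import Flask, Response, request

    app = Flask(__name__)
    TARGET = "https://example.com"  # placeholder for the site to mirror

    @app.route("/", defaults={"path": ""}, methods=["GET", "POST"])
    @app.route("/<path:path>", methods=["GET", "POST"])
    def proxy(path):
        upstream = requests.request(
            method=request.method,
            url=f"{TARGET}/{path}",
            params=request.args,
            data=request.get_data(),
            cookies=request.cookies,  # forward the browser's cookies upstream
            allow_redirects=False,
        )
        resp = Response(
            upstream.content,
            status=upstream.status_code,
            content_type=upstream.headers.get("Content-Type"),
        )
        # Relay Set-Cookie back so logins persist. Note: requests folds
        # multiple Set-Cookie headers into one, so a production proxy
        # needs more careful header handling than this.
        if "Set-Cookie" in upstream.headers:
            resp.headers["Set-Cookie"] = upstream.headers["Set-Cookie"]
        return resp

Even then, absolute links and scripts inside the fetched pages still point at the original domain, so a general-purpose proxy also has to rewrite the HTML, which is a big part of why this is "often quite difficult".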
I added an HPKP header to my site, but it is not honored by Chrome or Safari. I tested it manually by setting a proxy, and by going to chrome://net-internals/#hsts and looking for my domain, which was not found. The HPKP header seems correct, and I also tested it using an HPKP toolset, so I know it is valid.
I am thinking I might be doing something weird with my flow. I have a web app, which is served from myapp.example.com. On login, the app redirects the user to authserver.example.com/begin to initiate an OpenID Connect Authorization Code flow. The HPKP header is returned only from authserver.example.com/begin, and I think this might be the issue. On the other hand, I have includeSubDomains in the HPKP header, so I think this is not the issue.
This is the HPKP header (line breaks added for readability):
public-key-pins:max-age=864000;includeSubDomains; \
pin-sha256="bcppaSjDk7AM8C/13vyGOR+EJHDYzv9/liatMm4fLdE="; \
pin-sha256="cJjqBxF88mhfexjIArmQxvZFqWQa45p40n05C6X/rNI="; \
report-uri="https://reporturl.example"
Thanks!
I added HPKP header to my site, but it is not honored by Chrome or Safari... I tested it manually by setting a proxy...
RFC 7469, Public Key Pinning Extension for HTTP, kind of sneaks that past you. The IETF published it with overrides, so an attacker can break a known good pinset. It's mentioned once in the standard by the name "override", but the details are not provided. The IETF also failed to discuss it in a security considerations section.
More to the point, the proxy you set engaged the override. It does not matter if it's the wrong proxy, a proxy certificate installed by a mobile device OEM, or a proxy controlled by an attacker who tricked a user into installing it. The web security model and the standard allow it. They embrace interception and consider it a valid use case.
Something else they did was make reporting of the broken pinset a MUST NOT or SHOULD NOT. That means the user agent is complicit in the cover-up, too. This is not discussed in a security considerations section, either. They really don't want folks to know their supposedly secure connection is being intercepted.
Your best bet to avoid it is to move outside the web security model. Don't use browser-based apps when security is a concern. Use a hybrid app and perform the pinning yourself. Your hybrid app can host a WebView control or view while still getting access to the channel to verify its parameters. Also see OWASP's Certificate and Public Key Pinning.
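As one illustration, a do-it-yourself pin check in Python might look like this sketch (the cryptography package, the host name, and the raw-socket approach are my assumptions; the pins are the ones from the question):

    # Hypothetical sketch: verify a server's SPKI pin yourself instead of
    # relying on the browser's overridable HPKP machinery.
    import base64
    import hashlib
    import socket
    import ssl

    from cryptography import x509
    from cryptography.hazmat.primitives import serialization

    HOST = "authserver.example.com"  # placeholder host
    EXPECTED_PINS = {  # the pins from the question, for illustration
        "bcppaSjDk7AM8C/13vyGOR+EJHDYzv9/liatMm4fLdE=",
        "cJjqBxF88mhfexjIArmQxvZFqWQa45p40n05C6X/rNI=",
    }

    def spki_pin(der_cert: bytes) -> str:
        """Base64 SHA-256 digest of the certificate's SubjectPublicKeyInfo."""
        cert = x509.load_der_x509_certificate(der_cert)
        spki = cert.public_key().public_bytes(
            encoding=serialization.Encoding.DER,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        )
        return base64.b64encode(hashlib.sha256(spki).digest()).decode()

    ctx = ssl.create_default_context()
    with socket.create_connection((HOST, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            leaf = tls.getpeercert(binary_form=True)  # DER-encoded leaf cert

    if spki_pin(leaf) not in EXPECTED_PINS:
        raise ssl.SSLError("pin mismatch: connection may be intercepted")

An intercepting proxy with a locally installed CA can still pass ordinary chain validation, but it cannot present a leaf whose public key hashes to one of your pins.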
Also see Comments on draft-ietf-websec-key-pinning on the IETF mailing list. One of the suggestions in the comments was to change the title to "Public Key Pinning Extension for HTTP with Overrides" to highlight the feature. Not surprisingly, that's not something they want. They are trying to do it surreptitiously, without user knowledge.
Here's the relevant text from RFC 7469:
2.7. Interactions with Preloaded Pin Lists
UAs MAY choose to implement additional sources of pinning
information, such as through built-in lists of pinning information.
Such UAs should allow users to override such additional sources,
including disabling them from consideration.
The effective policy for a Known Pinned Host that has both built-in
Pins and Pins from previously observed PKP header response fields is
implementation-defined.
Locally installed CAs (like the ones used by the proxies you say are running) override any HPKP checks.
This is necessary so as not to completely break the internet, given how prevalent they are: anti-virus software and proxies used in large corporations basically MITM HTTPS traffic through a locally issued certificate, as otherwise they could not read the traffic.
Some argue that locally installing a CA requires access to your machine, and at that point it's game over anyway. But to me this still massively reduces the protection HPKP offers, and that, coupled with the high risks of using HPKP, means I am really not a fan of it.
My site uses HTTP authentication. I've learned that it isn't very secure, that it causes a lot of problems for many browsers, and that not all browsers may support it, so I want to use an alternative that is secure and more widely supported. What are some alternatives?
Is it possible to lock all directories using an HTML login page?
My site uses HTTP authentication and I've learned it isn't very secure
That's false... unless you're referring to something like basic auth over an insecure channel. In that case, anything over the insecure channel has potential issues. (Even if you did some client-side encryption hackery, you still have the problem that the remote host is not verified without the TLS or SSL layer.)
Basic auth is fine in some cases, and not for others. It depends on what you're trying to do.
it causes a lot of problems for many browsers, and not all browsers may support it
Completely false. I've never seen a browser that didn't support basic auth and digest auth.
what are some alternatives?
This isn't possible to answer without a better understanding of your requirements. Two-factor auth with a DNA sample and a brainwave scan might be more secure but chances are that's not what you're looking for. Besides, you can't forget about the rest of your system and you've told us nothing about that.
Is it possible to lock all directories using an HTML login page?
Yes. How you do this depends on what you're running server-side, but yes it's completely possible and often done.
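As one illustration among many, here is a minimal sketch in Python using Flask (both the stack and the check_credentials helper are my assumptions), gating every path behind a session-cookie login:

    # Hypothetical sketch: protect all routes behind an HTML login form.
    from flask import Flask, redirect, render_template_string, request, session

    app = Flask(__name__)
    app.secret_key = "change-me"  # required for signed session cookies

    LOGIN_FORM = """
    <form method="post" action="/login">
      <input name="username">
      <input name="password" type="password">
      <button>Log in</button>
    </form>
    """

    def check_credentials(username, password):
        # Placeholder; check against your real user store (hashed!).
        return username == "admin" and password == "secret"

    @app.before_request
    def require_login():
        # Gate every path except the login page itself.
        if request.path != "/login" and not session.get("user"):
            return redirect("/login")

    @app.route("/login", methods=["GET", "POST"])
    def login():
        if request.method == "POST" and check_credentials(
            request.form.get("username", ""), request.form.get("password", "")
        ):
            session["user"] = request.form["username"]
            return redirect("/")
        return render_template_string(LOGIN_FORM)

    @app.route("/")
    def home():
        return f"Hello, {session['user']}"

The same pattern (one gate in front of every request, state in a session cookie) exists in every server-side stack. And like basic auth, a form login is only as secure as the channel, so serve it over TLS.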
I have a shopping cart script on our site that is set up to be secure (HTTPS with an SSL certificate). I have links in the script leading to other parts of my site that are not secure (WordPress blog, etc.).
In the secure site, if I have links that are not secure (http), it triggers a message in the user's browser alerting them to unsecured links. If I put the outgoing links in the script as relative links, then when the user clicks one and leaves the script, they stay in secure mode (which we don't want for other parts of our site).
Years ago, I remember having this issue. I think I got around it by using an HTTP redirect for every outgoing link in the secure site: I would have https://www.example.com/outgoinglink1a redirect to http://www.example.com/outgoinglink1b. This way, I could put https://www.example.com/outgoinglink1a in the secure site, and when it was clicked, it would lead to http://www.example.com/outgoinglink1b.
In modern times, how do I have links in the secure site that lead to other parts of the site that aren't secure, without triggering an SSL error message for users while they are in the secure part of the site? Is using some type of 301 redirect in .htaccess better? Is there another preferred or easier method (than HTTP redirects) for accomplishing this?
Thank you for any guidance.
You can use HTTPS-to-HTTP redirects to the unsecured parts of the site to avoid browser warnings.
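In an Apache .htaccess, that might look like this sketch (mod_rewrite assumed; the /blog/ path and the domain are placeholders):

    RewriteEngine On
    # Send the non-cart parts of the site back to plain HTTP.
    RewriteCond %{HTTPS} on
    RewriteRule ^blog/(.*)$ http://www.example.com/blog/$1 [R=301,L]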
But for multiple reasons, safety being one of them, I would really advise against using HTTP and HTTPS for the same domain, even if a lot of big sites still do it. You would either have to use different cookies for the secure and the normal site, or the one cookie you use for your shopping cart couldn't have the secure flag, in which case you really don't need HTTPS in my opinion. Also, you will never be able to implement HSTS.
You've already gone to the lengths of buying a certificate and setting up an HTTPS server, so why not secure the whole site?
Update to answer your question in the comment:
That is of course a deal-breaker if you rely on those scripts and their hosts haven't implemented HTTPS yet (which they probably will sooner or later, or they are going to be out of business).
Depending on what they actually do, you could maybe proxy the requests to those scripts and serve them from your HTTPS-enabled server. But I would really consider this a last resort.
The slowdown is mostly just the handshake. If you enable session resumption, there shouldn't be enough overhead to actually slow down your site. Make sure your TLS session cache is big enough and that the ticket lifetime is ample.
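With Apache's mod_ssl, for instance, that tuning might look like the following sketch (server-level config, not .htaccess; the cache path and sizes are illustrative):

    # Shared-memory session cache so resumed handshakes skip the key exchange.
    SSLSessionCache        shmcb:/run/httpd/ssl_scache(512000)
    SSLSessionCacheTimeout 300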
Of course, your mileage may vary. So make sure you test your https site before going online.
I've heard such horror stories as well, but I think most of the time they come down to a faulty, or at least sub-standard, implementation. Make sure you redirect EVERY single HTTP request to HTTPS with a 301 status and you should be fine. And as of some months ago, enabling HTTPS should actually help your Google ranking.
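That blanket redirect is commonly written in .htaccess like this sketch (again assuming mod_rewrite):

    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]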
To link to an external site (a different FQDN) you don't have to implement any trickery to avoid browser warnings: that's just linking to a different site and has nothing to do with mixed content policies.
I am looking for ways to browse sites that are blocked by proxy filters at my location.
One solution I came up with was to build a page that would take a URL as input and display that site in an iframe. I would thus have a window to browse through, on a page that the proxy allows. I was going to host this on my personal web site and use it to access restricted content; that way I'd have access to blogs and forums where there is a wealth of information that is blocked by a backwards, blanket restriction list.
How can I make a web page like this? Would it be simple HTML and JavaScript, or do I need .NET?
What you aim to do has to be done server-side. When you put a page in an iframe, your web browser loads it, and will do so just as if you went directly to the URL.
There is no way around this via client-side code, such as JavaScript.
If you truly want to reinvent the wheel, pick a language and look into whatever functions it has for downloading files. There's no need to do this, though, when there are plenty of web-based proxy services, such as http://www.hidemyass.com.
Even if you loaded it in an iframe, the request for the page in the iframe will still go through the proxy and so you will still be blocked.
You'd have to do something like open a socket to the site through your web host, then download the content and redisplay it. That's assuming your host isn't also blocked. Also, you'll lose the benefits of cookies and sessions this way (i.e., you won't be able to stay logged into things unless the session ID is in the query string).
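That download-and-redisplay approach might look like the following Python sketch (the /view?url= endpoint name is made up, and a real implementation would use an HTML parser rather than a regex):

    # Hypothetical sketch: fetch a page server-side and rewrite its links
    # so navigation stays inside the proxy page.
    import re
    import urllib.parse
    import urllib.request

    def fetch_via_host(url: str) -> str:
        with urllib.request.urlopen(url) as resp:  # the host's request, not the browser's
            html = resp.read().decode("utf-8", errors="replace")

        def reroute(match: re.Match) -> str:
            absolute = urllib.parse.urljoin(url, match.group(2))
            return f'{match.group(1)}="/view?url={urllib.parse.quote(absolute, safe="")}"'

        # Crude rewrite of href/src attributes so clicks come back to us.
        return re.sub(r'(href|src)="([^"]*)"', reroute, html)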
The fastest and simplest solution would be to create a free LogMeIn account at www.logmein.com, then set up your host computer at home, log in from work, and browse freely. I do this myself at work so no one can see my personal browsing history when I don't want them to. This of course only works if logmein.com is not blocked at your work. Good luck!
It depends upon the filter's complexity. If you have your own website that you can reach through the proxy, or if your computer can run as a web server, you could try accessing via a proxy script such as CGIProxy. There are online services that do this too. However, some proxy filters can detect these methods as well, and you'd still be out of luck. No JavaScript or HTML tricks can overcome the proxy filter.
I have a page that is just a non interactive display for a shop window.
Obviously, I don't link to it, and I'd also like to avoid people stumbling across it (via Google etc.).
It will always be powered by Chrome.
I have thought of...
Checking User Agent for Chrome
Ensuring the resolution is 1920 x 1080 (not that useful, as it is a client-side check)
Banning under robots.txt to keep Google out of it
Do you have any more suggestions?
Should I not really worry about it?
Not that I would EVER recommend what I'm about to suggest, but how about filtering by IP address? Since your provider's IP is rarely going to change, you can use JavaScript to kick out or deny requests from IP addresses other than yours, maybe with a clean redirect to http://www.google.com or something silly like that. Although I would still suggest locking it down with a login and password and just having it write a never-expiring cookie. That's still not a great idea, but a shy bit better than the road you're trucking down right now.
You could always limit the connections by IP address (if you know it ahead of time and it's reliable):
Apache's access control
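With Apache 2.4, for example, that can be as small as this sketch (the directory path and address are placeholders):

    <Directory "/var/www/shop-window">
        Require ip 203.0.113.7
    </Directory>

Unlike client-side JavaScript checks, this runs on the server, so the visitor can't bypass it.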
If it is just for a shop window, do you even need access to a web page?
You can host the file locally.
Personally, I wouldn't worry about it; if no one is linking to it externally, it is unlikely ever to be found by search engines.