Looking for a script or program to convert all HTTP links to HTTPS - html

I am trying to find a script or program to convert my html website links from http to https.
I have looked through hundreds of search results and web articles, and I used the WordPress SSL plugin, but it missed numerous pages with http links.
Below is one of the thousands of links I need to convert:
http://www.robert-b-ritter-jr.com/2015/11/30/blog-121-we-dont-need-the-required-minimum-distributions-rmds
I am looking for a way to do this quickly instead of one at a time.

The HTTPS Everywhere extension will automatically rewrite insecure HTTP requests to HTTPS. Keep in mind that not all websites offer a secure, encrypted connection.
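If the pages are static HTML files, a bulk search-and-replace script is another option. Below is a minimal sketch in Python; it assumes the site lives in a local directory of .html files and that every domain in the list actually serves the same content over HTTPS (the directory and domain list are placeholders to adjust):

# Rewrite http:// links to https:// in every .html file under the site root.
# Only the listed domains are touched, so links to hosts without HTTPS stay as they are.
import re
from pathlib import Path

SITE_ROOT = Path(".")  # placeholder: path to your local copy of the site
DOMAINS = ["www.robert-b-ritter-jr.com"]  # placeholder: domains known to support HTTPS

pattern = re.compile(r"http://(" + "|".join(map(re.escape, DOMAINS)) + r")")

for page in SITE_ROOT.rglob("*.html"):
    text = page.read_text(encoding="utf-8")
    updated = pattern.sub(r"https://\1", text)
    if updated != text:
        page.write_text(updated, encoding="utf-8")
        print(f"updated {page}")

Run it against a backup copy first and spot-check a few pages before uploading.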

Related

Problems with the whitelist of an addon

I am developing an add-on for Gmail in which I need to make calls to an API hosted on a server. The problem is that I can only add URLs to the add-on's whitelist if they are HTTPS, and the request I need to make has the following format:
HTTP://example/:8080/example/oslc/am/cf/resource/
Is there any way to add HTTP queries to the whitelist, or some other way to make them?

Should rel-canonical also include the protocol (http/https)?

I'm migrating my website from http to https (although it will still support access via http).
Currently all of my pages have accurate rel-canonical tags set in the HTML, but obviously they all point to the canonical http:// URL.
Should I now update those to https:// too, or is it OK to leave them as http?
I'm wondering whether Google will penalise me, or start detecting duplicate content, if I start mixing them.
Yes, Google sees http and https as different sites, so you should update them.
A redirect on the server might be sufficient in the short term, but personally I would look to update the pages as soon as you can.
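For static pages, the canonical tags can be updated in bulk as well. A rough sketch in Python, assuming the tags appear as <link rel="canonical" href="http://..."> with the attributes in that order, and that the https versions of those URLs already resolve:

# Switch canonical URLs from http:// to https:// in static HTML pages.
import re
from pathlib import Path

canonical = re.compile(r'(<link\s+rel="canonical"\s+href=")http://', re.IGNORECASE)

for page in Path(".").rglob("*.html"):  # "." is a placeholder for the site root
    text = page.read_text(encoding="utf-8")
    updated = canonical.sub(r"\1https://", text)
    if updated != text:
        page.write_text(updated, encoding="utf-8")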

Disable https to http redirect

My site is using HTTPS only.
I allow BBCodes for showing images. Users place images like "https://imagehoster.net/img.png", but the image hoster uses a redirect so the browser loads it via HTTP as "http://imagehoster.net/img.png". This makes the browser show annoying mixed content warnings. Is there a way to prevent this?
Short answer: no.
Long answer:
They have no real web server listening for SSL.
In fact, there is only a firewall/proxy which sends an HTTP redirect (Location header) to the browser.
You can't intercept that request, and even if you could, where would you redirect to?
They don't provide an SSL server because the encryption takes too many resources, or because it causes too much traffic, since proxies can't cache it.
An idea to solve that problem:
Detect those links, download them, and store a copy on your server.
Replace the link. Maybe you only need to store a preview; if the user clicks on it, redirect to the original link in a new browser window.
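A rough sketch of that idea in Python, assuming the BBCode resolves to direct image URLs and that the cached copies are served from your own HTTPS origin (the requests library, the cache directory, and the example.com base URL are assumptions to adapt):

# Fetch a remote image once and return a same-origin HTTPS URL for it.
import hashlib
from pathlib import Path
import requests  # third-party; pip install requests

LOCAL_DIR = Path("static/imgcache")          # directory your web server already serves
LOCAL_BASE = "https://example.com/imgcache"  # placeholder: your own HTTPS origin

def mirror_image(url: str) -> str:
    LOCAL_DIR.mkdir(parents=True, exist_ok=True)
    # Hash the source URL for a stable filename, keeping the original extension.
    name = hashlib.sha256(url.encode()).hexdigest() + Path(url).suffix
    target = LOCAL_DIR / name
    if not target.exists():
        resp = requests.get(url, timeout=10)  # follows the hoster's redirect server-side
        resp.raise_for_status()
        target.write_bytes(resp.content)
    return f"{LOCAL_BASE}/{name}"

# Example: mirror_image("https://imagehoster.net/img.png") -> "https://example.com/imgcache/<hash>.png"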

Link to http or https

While adding a hyperlink to another site that has SSL, the site's documentation sometimes says to link to the http:// URL instead of the https:// one (e.g. the Play Store, which uses SSL but tells you to link to http rather than https). Both work normally, but would there be a reason to link to http:// instead of https://?
Maybe they don't want the extra encryption slowing the site down, as SSL may decrease performance somewhat.
If users are downloading large, public files, there may be a system burden in encrypting these each time.
Some browsers may not support SSL.
You will probably want the home page accessible via HTTP, so that users don't have to remember to type https to get to it.
Only a specific portion of the site may need secure HTTP (HTTPS), not the whole site.
The site may be indexed mainly under http on search engines.

Request permission to perform Cross-Origin XMLHttpRequest

I am working on a project where I need to make cross-origin requests, but there does not appear to be any way to allow this in a pure web page.
Chrome extensions can simply request permission for the domains they would like to make requests to, as in the following example.
"permissions": [
"http://www.google.com/",
"https://www.google.com/"
]
http://developer.chrome.com/extensions/xhr.html
I found https://developers.google.com/chrome/apps/docs/no_crx, which seemed closer to what I was looking for, but the only permissions allowed are "geolocation", "notifications", and "unlimitedStorage".
There is the HTTP header Access-Control-Allow-Origin, which could be set on the domains I would like to make requests to, but they are not under my control, so that is not practical.
Similarly, Content-Security-Policy: connect-src https://www.google.com; is primarily used to further restrict access rather than to open it up.
http://www.html5rocks.com/en/tutorials/security/content-security-policy/
I understand the security concerns, but as a quick search will show, people get around this by making a proxy server. Wouldn't it make sense to allow the equivalent request to be made, meaning a request without the user's session/cookie information (like incognito mode)? Or some mechanism by which the page can request permission in the same manner as an extension? It seems somewhat backwards to require things like this to be done in a browser-specific manner.
Just like the Web Speech API (or getUserMedia) requests access to use the microphone.
Any thoughts or perhaps something I missed?
EDIT: I posted this elsewhere and got:
If you are making requests from domains that are under your control, there are other options (like JSONP) that you can use to access data from another domain. Or, you can load an iframe and use postMessage() to interact with the contents - there are lots of tools that also enforce that the domain you're trying to communicate with is willing to share that data.
Me:
JSONP looks like a solution for data sources that provide JSON, but I am not sure it will solve my overall problem. I am trying to create a tool that will pull in data from a variety of sources, both to display a result and to interpret the information to perform an action. One query might be a Google search, which JSONP or the other official methods should allow for, but that does not work for scraping data from other web pages. None of the requests being made require user session information, so a proxy would work, but it would add latency and maintenance costs.
The postMessage() interface would require the pages being requested to implement listeners, right?
So far the "best" solution still seems to be a companion extension that runs in a privileged environment, makes the requests, and communicates the results back to the page. The tool does a variety of other things that work within a web page, so I would rather keep the primary environment as the web page, with the option to run the extension.
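For reference, the proxy workaround mentioned above usually amounts to a small endpoint on your own origin that fetches the remote page server-side (without the user's cookies) and returns it with an Access-Control-Allow-Origin header. A bare-bones sketch using only the Python standard library; the port and the wide-open allow-origin are assumptions you would tighten in practice:

# Minimal same-origin proxy: GET /?url=<target> fetches the target server-side
# and returns it with a permissive CORS header so the page's XHR can read it.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from urllib.request import urlopen

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        target = query.get("url", [None])[0]
        if not target:
            self.send_error(400, "missing url parameter")
            return
        with urlopen(target, timeout=10) as upstream:  # no user cookies are forwarded
            body = upstream.read()
        self.send_response(200)
        self.send_header("Access-Control-Allow-Origin", "*")  # tighten to your own origin
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8000), ProxyHandler).serve_forever()

The page would then request something like http://localhost:8000/?url=https://example.org/page and read the response directly, at the cost of the latency and maintenance noted above.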